NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2000-01-01
The use of the Principal Component Analysis technique for the analysis of geophysical time series has been questioned, in particular for its tendency to extract components that mix several physical phenomena even when the signal is just their linear sum. We demonstrate with a data simulation experiment that Independent Component Analysis, a recently developed technique, is able to solve this problem. This new technique requires the statistical independence of components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. Furthermore, ICA does not require additional a priori information such as the localization constraint used in Rotational Techniques.
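The mixing tendency described above can be illustrated in a few lines. The sketch below is a hypothetical simulation (not the authors' experiment; NumPy assumed): two independent, non-Gaussian sources are summed linearly, and each principal component of the mixture is shown to correlate substantially with both sources.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two statistically independent, non-Gaussian sources: stand-ins for
# distinct physical phenomena (signals are illustrative, not geophysical).
t = np.linspace(0, 1, 2000)
s1 = np.sign(np.sin(2 * np.pi * 5 * t))   # square wave
s2 = rng.uniform(-1, 1, t.size)           # uniform noise
S = np.c_[s1, s2]

# Observations are just a linear sum of the sources.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = S @ A.T

# PCA: eigenvectors of the sample covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
_, vecs = np.linalg.eigh(cov)
pcs = Xc @ vecs                           # principal components

# Each PC correlates strongly with BOTH sources: PCA mixes the phenomena.
corr = np.corrcoef(np.c_[pcs, S].T)[:2, 2:]
print(np.round(np.abs(corr), 2))
```

Decorrelation alone cannot undo the mixing because any rotation of uncorrelated components is again uncorrelated; the higher-order statistics used by ICA break that ambiguity.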
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques specialized to address three specific hot section components is presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress, and strain histories throughout a complete flight mission.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has recently been used for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. Unlike other Rotation Techniques (RT), this rotation uses no localization criterion; only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears able to resolve the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
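The "ICA as a rotation of the PCA solution" view can be sketched directly in two dimensions: after PCA whitening, ICA reduces to a single rotation angle, chosen here by maximizing summed absolute excess kurtosis as the non-Gaussianity measure (an assumption for illustration; the paper's simulation details are not reproduced).

```python
import numpy as np

rng = np.random.default_rng(1)

# Independent non-Gaussian sources and a linear mixture (illustrative).
n = 5000
S = np.c_[np.sign(rng.standard_normal(n)),   # binary source
          rng.uniform(-1, 1, n)]             # uniform source
X = S @ np.array([[1.0, 0.5], [0.3, 1.0]]).T

# Step 1: PCA whitening (second-order statistics only).
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
vals, vecs = np.linalg.eigh(cov)
Z = Xc @ vecs / np.sqrt(vals)                # whitened components

# Step 2: ICA is now an orthogonal rotation of the whitened data.
# Search the angle that maximizes non-Gaussianity (a higher-order
# statistic); no localization criterion is involved.
def kurt(y):
    return np.mean(y**4, axis=0) / np.mean(y**2, axis=0)**2 - 3.0

angles = np.linspace(0, np.pi / 2, 500)
best = max(angles, key=lambda a: np.abs(kurt(Z @ np.array(
    [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]))).sum())
R = np.array([[np.cos(best), -np.sin(best)],
              [np.sin(best), np.cos(best)]])
Y = Z @ R                                    # ICA estimate of the sources

# Each recovered component now matches one source almost exclusively.
corr = np.abs(np.corrcoef(np.c_[Y, S].T)[:2, 2:])
print(np.round(corr, 2))
```

The PCA step fixes the whitening; the independence criterion then selects one particular rotation, which is exactly the sense in which ICA generalizes decorrelation.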
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Le; Timbie, Peter T.; Bunn, Emory F.
In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H i Expectation–Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
Simplified Phased-Mission System Analysis for Systems with Independent Component Repairs
NASA Technical Reports Server (NTRS)
Somani, Arun K.
1996-01-01
Accurate analysis of system reliability requires accounting for all major variations in the system's operation. Most reliability analyses assume that the system configuration, success criteria, and component behavior remain the same. However, multiple phases are natural. We present a new, computationally efficient technique for the analysis of phased-mission systems in which the operational states of a system can be described by combinations of component states (such as fault trees or assertions). Moreover, individual components may be repaired, if failed, as part of system operation, but repairs are independent of the system state. For repairable systems, Markov analysis techniques are used, but they suffer from state-space explosion, which limits the size of system that can be analyzed and is computationally expensive. We avoid the state-space explosion. Phase algebra is used to account for the effects of variable configurations, repairs, and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the technique by means of several examples and present numerical results to show the effects of phases and repairs on system reliability/availability.
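To make the phased-mission idea concrete, here is a deliberately simplified sketch (not the paper's phase-algebra method): components fail exponentially with no repair, each phase has a series success criterion, and an exact product formula is cross-checked by Monte Carlo. All numbers are invented.

```python
import math
import random

random.seed(42)

# Illustrative phased mission: three phases with different durations and
# different series success criteria (which components must be up).
phases = [              # (duration in hours, components required up)
    (2.0, {"A", "B"}),
    (5.0, {"A", "C"}),
    (1.0, {"B", "C"}),
]
rates = {"A": 0.01, "B": 0.02, "C": 0.005}   # failures per hour (assumed)

# Exact result for this simplified, non-repairable case: each component
# must survive until the END of the last phase requiring it (a failure
# during an "unneeded" phase still persists into later phases).
ends, t = [], 0.0
for dur, _ in phases:
    t += dur
    ends.append(t)
last_need = {c: max(e for (dur, req), e in zip(phases, ends) if c in req)
             for c in rates}
r_exact = math.prod(math.exp(-rates[c] * last_need[c]) for c in rates)

# Monte Carlo cross-check: draw one failure time per component and walk
# the phases, failing the mission if a required component is down.
def one_mission():
    fail = {c: random.expovariate(lam) for c, lam in rates.items()}
    t = 0.0
    for dur, req in phases:
        t += dur
        if any(fail[c] < t for c in req):
            return False
    return True

n = 200_000
r_mc = sum(one_mission() for _ in range(n)) / n
print(f"exact {r_exact:.4f}  monte carlo {r_mc:.4f}")
```

The paper's contribution is handling the much harder case of this calculation with independent component repairs and general phase criteria without building the full Markov state space.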
NASA Technical Reports Server (NTRS)
Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.
2015-01-01
The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.
NASA Astrophysics Data System (ADS)
Ojima, Nobutoshi; Fujiwara, Izumi; Inoue, Yayoi; Tsumura, Norimichi; Nakaguchi, Toshiya; Iwata, Kayoko
2011-03-01
Uneven distribution of skin color is one of the biggest concerns about facial skin appearance. Recently, several techniques to analyze skin color have been introduced that separate skin color information into chromophore components, such as melanin and hemoglobin. However, there are few reports on quantitative analysis of the unevenness of skin color that consider the type of chromophore, clusters of different sizes, and the concentration of each chromophore. We propose a new image analysis and simulation method based on chromophore analysis and spatial frequency analysis. This method is mainly composed of three techniques: independent component analysis (ICA) to extract hemoglobin and melanin chromophores from a single skin color image; an image pyramid technique that decomposes each chromophore into multi-resolution images, which can be used for identifying different sizes of clusters or spatial frequencies; and analysis of the histogram obtained from each multi-resolution image to extract unevenness parameters. As an application of the method, we also introduce an image processing technique to change the unevenness of the melanin component. As a result, the method showed high capability to analyze the unevenness of each skin chromophore: 1) vague unevenness on skin could be discriminated from noticeable pigmentation such as freckles or acne; 2) by analyzing the unevenness parameters obtained from each multi-resolution image for Japanese women, age-related changes were observed in the parameters of middle spatial frequency; 3) an image processing system modulating the parameters was proposed to change the unevenness of skin images along the axis of the obtained age-related change in real time.
CHEMICAL ANALYSIS METHODS FOR ATMOSPHERIC AEROSOL COMPONENTS
This chapter surveys the analytical techniques used to determine the concentrations of aerosol mass and its chemical components. The techniques surveyed include mass, major ions (sulfate, nitrate, ammonium), organic carbon, elemental carbon, and trace elements. As reported in...
Factor Analysis and Counseling Research
ERIC Educational Resources Information Center
Weiss, David J.
1970-01-01
Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…
NASA Astrophysics Data System (ADS)
Saetchnikov, Vladimir A.; Tcherniavskaia, Elina A.; Saetchnikov, Anton V.; Schweiger, Gustav; Ostendorf, Andreas
2014-05-01
Experimental data are presented on the detection and identification of a variety of biochemical agents, such as proteins, microelements, and antibiotics of different generations, in both single- and multi-component solutions over a wide range of concentrations, analyzed via the light-scattering parameters of a whispering-gallery-mode optical resonance based sensor. Multiplexing over parameters and components has been realized using a developed fluidic sensor cell, with dielectric microspheres fixed in an adhesive layer, and data processing. Biochemical component identification has been performed by the developed network analysis techniques. The approach is demonstrated to be applicable both to single-agent and to multi-component biochemical analysis. A novel technique based on optical resonance in microring structures, plasmon resonance, and identification tools has been developed. To improve the sensitivity of the microring structures, the microspheres fixed by adhesive were first treated with a gold nanoparticle solution. Another technique used thin gold films deposited on the substrate below the adhesive. Both biomolecule and nanoparticle injections caused considerable changes in the optical resonance spectra. Plasmonic gold layers of optimized thickness also improve the parameters of the optical resonance spectra. Biochemical component identification has also been performed by the developed network analysis techniques for both single- and multi-component solutions. Thus, the advantages of plasmon-enhanced optical microcavity resonance with multiparameter identification tools are used to develop a new platform for an ultra-sensitive, label-free biomedical sensor.
Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.
2010-01-01
The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
An Analysis of Nondestructive Evaluation Techniques for Polymer Matrix Composite Sandwich Materials
NASA Technical Reports Server (NTRS)
Cosgriff, Laura M.; Roberts, Gary D.; Binienda, Wieslaw K.; Zheng, Diahua; Averbeck, Timothy; Roth, Donald J.; Jeanneau, Philippe
2006-01-01
Structural sandwich materials composed of triaxially braided polymer matrix composite face sheets sandwiching a foam core are being utilized for applications including aerospace components and recreational equipment. Since full-scale components are being made from these sandwich materials, it is necessary to develop proper inspection practices for their manufacture and in-field use. Specifically, nondestructive evaluation (NDE) techniques need to be investigated for the analysis of components made from these materials. Hockey blades made from sandwich materials and a flat sandwich sample were examined with multiple NDE techniques, including thermographic, radiographic, and shearographic methods, to investigate damage induced in the blades and flat panel components. Hockey blades used during actual play and a flat polymer matrix composite sandwich sample with damage inserted into the foam core were investigated with each technique. NDE images from the samples are presented and discussed. Structural elements within each blade were observed with radiographic imaging. Damaged regions and some structural elements of the hockey blades were identified with thermographic imaging. Structural elements, damaged regions, and other material variations were detected in the hockey blades with shearography. Each technique's advantages and disadvantages were considered in making recommendations for the inspection of components made from these types of materials.
NASA Astrophysics Data System (ADS)
Avitabile, Peter; O'Callahan, John
2009-01-01
Generally, response analysis of systems containing discrete nonlinear connection elements, such as typical mounting connections, requires the physical finite element system matrices to be used in a direct integration algorithm to compute the nonlinear response solution. Due to the large size of these physical matrices, forced nonlinear response analysis requires significant computational resources. Usually, the individual components of the system are analyzed and tested as separate components, and their individual behavior may be essentially linear when compared to the total assembled system. However, joining these linear subsystems using highly nonlinear connection elements causes the entire system to become nonlinear. It would be advantageous if these linear modal subsystems could be utilized in the forced nonlinear response analysis, since much effort has usually been expended in fine tuning and adjusting the analytical models to reflect the tested subsystem configuration. Several more efficient techniques have been developed to address this class of problem. Three of these techniques, the equivalent reduced model technique (ERMT), the modal modification response technique (MMRT), and the component element method (CEM), are presented in this paper and compared to traditional methods.
Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)
NASA Astrophysics Data System (ADS)
De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.
1993-01-01
The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra of which neither the number of components nor the peak shape of the component spectra is known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.
Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A
2012-03-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.
Using dynamic mode decomposition for real-time background/foreground separation in video
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutz, Jose Nathan; Grosek, Jacob; Brunton, Steven
The technique of dynamic mode decomposition (DMD) is disclosed herein for the purpose of robustly separating video frames into background (low-rank) and foreground (sparse) components in real-time. Foreground/background separation is achieved at the computational cost of just one singular value decomposition (SVD) and one linear equation solve, thus producing results orders of magnitude faster than robust principal component analysis (RPCA). Additional techniques, including techniques for analyzing the video for multi-resolution time-scale components, and techniques for reusing computations to allow processing of streaming video in real time, are also described herein.
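A compact sketch of the disclosed pipeline is given below on a synthetic "video" (the sizes, truncation rank, and data are arbitrary choices for illustration, not values from the source; NumPy assumed): one SVD plus a small eigenproblem yields DMD modes, and the mode whose eigenvalue sits near 1 (zero temporal frequency) is the low-rank background.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "video": a static background plus one bright moving pixel
# per frame (the sparse foreground).
h, w, T = 20, 20, 30
bg = rng.uniform(0.2, 0.4, (h, w))
frames = np.empty((h * w, T))
for t in range(T):
    f = bg.copy()
    f[t % h, (2 * t) % w] += 1.0
    frames[:, t] = f.ravel()

# Exact DMD: one SVD plus a small eigenproblem.
X1, X2 = frames[:, :-1], frames[:, 1:]
U, s, Vh = np.linalg.svd(X1, full_matrices=False)
r = 5                                        # truncation rank (assumed)
U, s, Vh = U[:, :r], s[:r], Vh[:r]
Atilde = (U.conj().T @ X2 @ Vh.conj().T) / s
eigvals, W = np.linalg.eig(Atilde)
Phi = (X2 @ Vh.conj().T / s) @ W             # DMD modes

# Background = mode with eigenvalue ~1 (it does not evolve in time).
bg_idx = int(np.argmin(np.abs(eigvals - 1.0)))
amps = np.linalg.lstsq(Phi, frames[:, 0], rcond=None)[0]
background = np.real(Phi[:, bg_idx] * amps[bg_idx])
foreground = frames - background[:, None]    # sparse residual

corr = np.corrcoef(background, bg.ravel())[0, 1]
print(f"|lambda_bg - 1| = {abs(eigvals[bg_idx] - 1):.3f}, corr = {corr:.3f}")
```

Because the separation costs only one SVD and one linear solve, it avoids the iterative optimization that makes RPCA orders of magnitude slower on streaming video.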
Computer program uses Monte Carlo techniques for statistical system performance analysis
NASA Technical Reports Server (NTRS)
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
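The original program is not reproduced here; the sketch below illustrates the same idea on a hypothetical "unit" (a resistive voltage divider with invented tolerances): random sampling of the full component statistics propagates to an unbiased estimate of the overall system performance distribution.

```python
import random
import statistics

random.seed(1)

# Hypothetical unit: a voltage divider whose two components carry
# independent manufacturing tolerances (all values invented).
def system_output(r1, r2, vin=10.0):
    return vin * r2 / (r1 + r2)

samples = []
for _ in range(50_000):
    r1 = random.gauss(1000.0, 10.0)   # nominal 1 kOhm, 1% sigma
    r2 = random.gauss(2000.0, 20.0)   # nominal 2 kOhm, 1% sigma
    samples.append(system_output(r1, r2))

# Simulated random sampling yields the performance distribution directly,
# with no linearization of the component-to-system relationship.
mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
print(f"output: {mean:.3f} V +/- {sd:.3f} V")
```
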
A guide to understanding meta-analysis.
Israel, Heidi; Richter, Randy R
2011-07-01
With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis, where indicated, represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of the components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Common components such as forest plot interpretation and software that may be used, special cases for meta-analysis such as subgroup analysis, individual patient data, and meta-regression, and a discussion of criticisms are included.
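The core arithmetic the commentary explains can be shown in a minimal fixed-effect sketch with made-up study data: studies are pooled by inverse-variance weighting, and Cochran's Q with the I-squared statistic quantifies heterogeneity.

```python
import math

# Invented per-study effect sizes and variances (illustrative only).
effects   = [0.10, 0.45, 0.18, 0.60, 0.40]
variances = [0.01, 0.02, 0.01, 0.02, 0.03]

# Fixed-effect model: inverse-variance weights.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))

# Heterogeneity: Cochran's Q and the I^2 statistic.
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
df = len(effects) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"pooled effect {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f}), "
      f"Q = {q:.2f}, I^2 = {i2:.0f}%")
```

A large I-squared, as here, is the usual signal to prefer a random-effects model over the fixed-effect pooling shown in this sketch.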
Arthropod surveillance programs: Basic components, strategies, and analysis
USDA-ARS?s Scientific Manuscript database
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...
Chan, H L; Lin, J L; Huang, H H; Wu, C P
1997-09-01
A new technique for interference-term suppression in the Wigner-Ville distribution (WVD) is proposed for signals with a 1/f spectral shape. The spectral characteristic of the signal is altered by f^alpha filtering before time-frequency analysis and compensated after analysis. Using the proposed technique with the smoothed pseudo Wigner-Ville distribution, excellent suppression of interference components can be achieved.
NASA Astrophysics Data System (ADS)
Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi
2018-02-01
In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. The ICA based channel equalization after both single-mode fiber and few-mode fiber transmission for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats are investigated, respectively. The performance comparisons with conventional channel equalization techniques are discussed.
The combined use of order tracking techniques for enhanced Fourier analysis of order components
NASA Astrophysics Data System (ADS)
Wang, K. S.; Heyns, P. S.
2011-04-01
Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
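The computed order tracking half of the combination can be sketched on a synthetic run-up (the Vold-Kalman stage and the paper's rotor model are not reproduced; NumPy assumed): resampling from uniform time steps to uniform shaft-angle steps turns the Fourier spectrum into an order spectrum, collapsing the speed-smeared lines.

```python
import numpy as np

# Synthetic run-up: the shaft accelerates from 5 to 25 Hz while vibrating
# at orders 1 and 3 of its rotation.
fs = 2000.0
t = np.arange(0, 10.0, 1 / fs)
angle = 2 * np.pi * (5.0 * t + t**2)      # shaft angle; speed = 5 + 2t Hz
x = np.sin(angle) + 0.5 * np.sin(3 * angle)

# Computed order tracking: resample from uniform TIME to uniform ANGLE.
revs = angle / (2 * np.pi)
spr = 64                                   # samples per revolution
rev_grid = np.arange(0, np.floor(revs[-1]), 1 / spr)
x_ang = np.interp(rev_grid, revs, x)

# An FFT in the angle domain gives a spectrum in shaft ORDERS: the sweep
# that smears a time-domain spectrum collapses into sharp lines at 1 and 3.
spec = 2 * np.abs(np.fft.rfft(x_ang)) / len(x_ang)
orders = np.fft.rfftfreq(len(x_ang), d=1 / spr)
top2 = np.sort(orders[np.argsort(spec)[-2:]])
print(top2)                                # strongest order lines
```

A plain FFT of `x` over the same record would spread each order across its whole 5-25 Hz (and 15-75 Hz) sweep range, which is the smearing that order tracking removes before Fourier analysis.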
K-Fold Crossvalidation in Canonical Analysis.
ERIC Educational Resources Information Center
Liang, Kun-Hsia; And Others
1995-01-01
A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
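The contamination effect can be sketched on null data (a generic whitening-plus-SVD implementation of canonical correlation, not the article's procedure; all sizes invented): the in-sample first canonical correlation is inflated by sample-specific variance, while its K-fold cross-validated counterpart falls back toward zero.

```python
import numpy as np

rng = np.random.default_rng(7)

# Null data: two random, unrelated variable sets.
n, p, q, K = 200, 5, 5, 5
X = rng.standard_normal((n, p))
Y = rng.standard_normal((n, q))

def cca_weights(X, Y):
    """First pair of canonical weights via whitening plus an SVD."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Rx = np.linalg.cholesky(Xc.T @ Xc / len(Xc))
    Ry = np.linalg.cholesky(Yc.T @ Yc / len(Yc))
    C = np.linalg.solve(Rx, Xc.T @ Yc / len(Xc)) @ np.linalg.inv(Ry).T
    U, s, Vh = np.linalg.svd(C)
    return np.linalg.solve(Rx.T, U[:, 0]), np.linalg.solve(Ry.T, Vh[0]), s[0]

in_sample = cca_weights(X, Y)[2]

# K-fold crossvalidation: fit weights on K-1 folds, correlate the
# projected variates on the held-out fold.
cv = []
for k in range(K):
    test = np.arange(n) % K == k
    a, b, _ = cca_weights(X[~test], Y[~test])
    cv.append(np.corrcoef(X[test] @ a, Y[test] @ b)[0, 1])
cv_mean = float(np.mean(cv))
print(f"in-sample r1 = {in_sample:.2f}, cross-validated r1 = {cv_mean:+.2f}")
```

Despite the variables being unrelated by construction, the in-sample canonical correlation is clearly positive; the cross-validated estimate removes that sample-specific component.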
Laser-induced breakdown spectroscopy is a reliable method for urinary stone analysis
Mutlu, Nazım; Çiftçi, Seyfettin; Gülecen, Turgay; Öztoprak, Belgin Genç; Demir, Arif
2016-01-01
Objective We compared laser-induced breakdown spectroscopy (LIBS) with the traditionally used and recommended X-ray diffraction technique (XRD) for urinary stone analysis. Material and methods In total, 65 patients with urinary calculi were enrolled in this prospective study. Stones were obtained after surgical or extracorporeal shockwave lithotripsy procedures. All stones were divided into two equal pieces. One sample was analyzed by XRD and the other by LIBS. The results were compared by the kappa (κ) and Spearman’s correlation coefficient (rho) tests. Results Using LIBS, 95 components were identified from 65 stones, while XRD identified 88 components. LIBS identified 40 stones with a single pure component, 20 stones with two different components, and 5 stones with three components. XRD demonstrated 42 stones with a single component, 22 stones with two different components, and only 1 stone with three different components. There was a strong relationship in the detection of stone types between LIBS and XRD for stones components (Spearman rho, 0.866; p<0.001). There was excellent agreement between the two techniques among 38 patients with pure stones (κ index, 0.910; Spearman rho, 0.916; p<0.001). Conclusion Our study indicates that LIBS is a valid and reliable technique for determining urinary stone composition. Moreover, it is a simple, low-cost, and nondestructive technique. LIBS can be safely used in routine daily practice if our results are supported by studies with larger numbers of patients. PMID:27011877
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis
Rochon, Kateryn; Duehl, Adrian J.; Anderson, John F.; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C.; Obenauer, Peter J.; Campbell, James F.; Lysyk, Tim J.; Allan, Sandra A.
2015-01-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthropod monitoring technology, techniques, and analysis” presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles. PMID:26543242
Analysis of the principal component algorithm in phase-shifting interferometry.
Vargas, J; Quiroga, J Antonio; Belenguer, T
2011-06-15
We recently presented a new asynchronous demodulation method for phase-sampling interferometry. The method is based on the principal component analysis (PCA) technique. In that earlier work, the PCA method was derived heuristically. In this work, we present an in-depth analysis of the PCA demodulation method.
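As a rough illustration of the idea (not the authors' exact derivation), PCA demodulation can be sketched in a few lines: stack the phase-shifted interferograms, subtract the temporal mean, and take the first two principal components as approximate quadrature signals. The fringe pattern and random phase steps below are synthetic, invented for the demonstration.

```python
import numpy as np

# Sketch of PCA-based phase demodulation on synthetic interferograms.
rng = np.random.default_rng(0)
ny, nx, n_frames = 64, 64, 8
yy, xx = np.mgrid[0:ny, 0:nx]
true_phase = 0.1 * (xx + yy)                      # synthetic phase map
shifts = rng.uniform(0, 2 * np.pi, n_frames)      # unknown (asynchronous) steps
frames = np.stack([1 + np.cos(true_phase + d) for d in shifts], axis=-1)

data = frames.reshape(-1, n_frames)
data = data - data.mean(axis=1, keepdims=True)    # remove background term
u, s, vt = np.linalg.svd(data, full_matrices=False)
pc1 = (u[:, 0] * s[0]).reshape(ny, nx)            # ~ quadrature signal 1
pc2 = (u[:, 1] * s[1]).reshape(ny, nx)            # ~ quadrature signal 2
phase = np.arctan2(pc2, pc1)   # demodulated phase (sign/offset ambiguous)
```

The recovered phase carries the usual PCA ambiguities (global sign and offset), which is why the paper's in-depth analysis of the method is of interest.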
NASA Technical Reports Server (NTRS)
Parada, N. D. J.; Novo, E. M. L. M.
1983-01-01
Two sets of MSS/LANDSAT data with solar elevation ranging from 22 deg to 41 deg were used on the Image-100 System to implement the Eliason et al. technique for extracting the topographic modulation component. An unsupervised cluster analysis was used to obtain an average brightness image for each channel. Analysis of the enhanced images shows that the technique for extracting the topographic modulation component is better suited to MSS data obtained under high sun elevation angles. Low sun elevation increases the variance of each cluster, so that the average brightness does not represent its albedo properties. The topographic modulation component applied at low sun elevation angles degrades rather than enhances topographic information. Better results were produced for channels 4 and 5 than for channels 6 and 7.
Rapid analysis of controlled substances using desorption electrospray ionization mass spectrometry.
Rodriguez-Cruz, Sandra E
2006-01-01
The recently developed technique of desorption electrospray ionization (DESI) has been applied to the rapid analysis of controlled substances. Experiments have been performed using a commercial ThermoFinnigan LCQ Advantage MAX ion-trap mass spectrometer with limited modifications. Results from the ambient sampling of licit and illicit tablets demonstrate the ability of the DESI technique to detect the main active ingredient(s) or controlled substance(s), even in the presence of other higher-concentration components. Full-scan mass spectrometry data provide preliminary identification by molecular weight determination, while rapid analysis using the tandem mass spectrometry (MS/MS) mode provides fragmentation data which, when compared to the laboratory-generated ESI-MS/MS spectral library, provide structural information and final identification of the active ingredient(s). The consecutive analysis of tablets containing different active components indicates there is no cross-contamination or interference from tablet to tablet, demonstrating the reliability of the DESI technique for rapid sampling (one tablet/min or better). Active ingredients have been detected for tablets in which the active component represents less than 1% of the total tablet weight, demonstrating the sensitivity of the technique. The real-time sampling of cannabis plant material is also presented.
Non-destructive evaluation techniques, high temperature ceramic component parts for gas turbines
NASA Technical Reports Server (NTRS)
Reiter, H.; Hirsekorn, S.; Lottermoser, J.; Goebbels, K.
1984-01-01
This report describes studies of various tests undertaken on material without destroying it. Tests included: microradiographic techniques, vibration analysis, high-frequency ultrasonic tests with evaluation of defects and structure through analysis of ultrasonic scattering data, microwave tests, and acoustic emission analysis.
Screening of polar components of petroleum products by electrospray ionization mass spectrometry
Rostad, Colleen E.
2005-01-01
The polar components of fuels may enable differentiation between fuel types or commercial fuel sources. Screening for these components in the hydrocarbon product is difficult due to their very low concentrations in such a complex matrix. Various commercial fuels from several sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at very low concentrations in commercial hydrocarbon products. This analysis was then applied to hydrocarbon samples collected from the subsurface with a different extent of biodegradation or weathering. Although the alkane and isoprenoid portion had begun to biodegrade or weather, the polar components had changed little over time. Because these polar compounds are unique in different fuels, this screening technique can provide source information on hydrocarbons released into the environment.
Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S
2017-06-01
Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
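On a common dense grid, the FPCA step can be approximated by an ordinary SVD of the sampled curves. The sketch below uses synthetic curves and deliberately omits the basis-function approximation and regularization that the proposed approach employs.

```python
import numpy as np

# FPCA approximation: PCA of curves sampled on a shared grid.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 200)
n = 50
scores = rng.normal(size=(n, 2))
curves = (scores[:, [0]] * np.sin(2 * np.pi * t)          # dominant mode
          + 0.3 * scores[:, [1]] * np.cos(4 * np.pi * t)  # weaker mode
          + 0.01 * rng.normal(size=(n, t.size)))          # observation noise
centered = curves - curves.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
fpc1 = vt[0]                                  # first functional component
var_explained = s[0] ** 2 / np.sum(s ** 2)    # share of variance in FPC1
```

With smooth, densely sampled curves the leading right singular vector approximates the first functional principal component (up to sign).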
Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz
2010-08-06
Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multimode factor analysis, the Parallel factor analysis and the Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed as are the substantive multimode component models obtained.
NASA Technical Reports Server (NTRS)
Dickson, B.; Cronkhite, J.; Bielefeld, S.; Killian, L.; Hayden, R.
1996-01-01
The objective of this study was to evaluate two techniques, Flight Condition Recognition (FCR) and Flight Load Synthesis (FLS), for usage monitoring and assess the potential benefits of extending the retirement intervals of life-limited components, thus reducing the operator's maintenance and replacement costs. Both techniques involve indirect determination of loads using measured flight parameters and subsequent fatigue analysis to calculate the life expended on the life-limited components. To assess the potential benefit of usage monitoring, the two usage techniques were compared to current methods of component retirement. In addition, comparisons were made with direct load measurements to assess the accuracy of the two techniques.
Restoration of recto-verso colour documents using correlated component analysis
NASA Astrophysics Data System (ADS)
Tonazzini, Anna; Bedini, Luigi
2013-12-01
In this article, we consider the problem of removing see-through interference from pairs of recto-verso documents acquired either in grayscale or RGB modality. The see-through effect is a typical degradation of historical and archival documents or manuscripts, caused by transparency or seeping of ink from the reverse side of the page. We formulate the problem as one of separating two individual texts, overlapped in the recto and verso maps of the colour channels through a linear convolutional mixing operator, where the mixing coefficients are unknown, while the blur kernels are assumed known a priori or estimated off-line. We exploit statistical techniques of blind source separation to estimate both the unknown model parameters and the ideal, uncorrupted images of the two document sides. We show that recently proposed correlated component analysis techniques improve on the already satisfactory performance of independent component analysis techniques and colour decorrelation, even when the two texts are appreciably correlated.
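A minimal blind-source-separation sketch (generic one-unit FastICA with a tanh nonlinearity, not the correlated component analysis of the article) shows the two-step structure such problems share: decorrelation (whitening) leaves an unknown rotation, which a higher-order statistical criterion then resolves. The mixing matrix and sources below are synthetic.

```python
import numpy as np

# Blind source separation sketch: whitening + one-unit FastICA (tanh).
rng = np.random.default_rng(6)
n = 20000
S = rng.uniform(-1, 1, size=(2, n))        # independent sub-Gaussian sources
A = np.array([[0.8, 0.4], [0.3, 0.9]])     # unknown mixing matrix
X = A @ S                                  # observed mixtures

Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = np.diag(d ** -0.5) @ E.T @ Xc          # whitened (decorrelated) data

w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(200):                       # fixed-point iteration
    wx = w @ Z
    w = (Z * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
    w /= np.linalg.norm(w)
s_hat = w @ Z                              # one recovered source (up to sign)
corr = max(abs(np.corrcoef(s_hat, S[0])[0, 1]),
           abs(np.corrcoef(s_hat, S[1])[0, 1]))
```

Decorrelation alone cannot fix the rotation; the tanh-based update supplies the extra, higher-order constraint that pins down one source.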
Integrative sparse principal component analysis of gene expression data.
Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge
2017-12-01
In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis, other multidataset techniques, and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
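A generic SPCA heuristic, soft-thresholded power iteration, conveys how a sparsity penalty drives the loadings of uninformative genes to zero. This is an illustration only, not the group or contrasted penalties of iSPCA, and the data are synthetic.

```python
import numpy as np

def sparse_loading(X, lam, n_iter=200):
    """One sparse loading via soft-thresholded power iteration
    (a generic SPCA heuristic, not the iSPCA algorithm)."""
    S = X.T @ X / len(X)
    v = np.linalg.eigh(S)[1][:, -1]            # start at ordinary PC1
    for _ in range(n_iter):
        u = S @ v
        u = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # soft threshold
        nrm = np.linalg.norm(u)
        if nrm == 0:
            break
        v = u / nrm
    return v

rng = np.random.default_rng(3)
z = rng.normal(size=(40, 1))                   # latent factor, 40 samples
X = 0.3 * rng.normal(size=(40, 10))            # 10 "genes", mostly noise
X[:, :3] += z                                  # only genes 0-2 carry signal
X -= X.mean(axis=0)
v = sparse_loading(X, lam=0.5)
# loadings on the 7 pure-noise genes are driven exactly to zero
```

Ordinary PCA would spread small nonzero loadings over all ten genes; the thresholding step is what yields an interpretable, sparse loading vector.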
Developing techniques for cause-responsibility analysis of occupational accidents.
Jabbari, Mousa; Ghorbani, Roghayeh
2016-11-01
The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and role of the groups involved in work-related accidents. The study develops an occupational accident causes tree, an occupational accident responsibility tree, and an occupational accident component-responsibility analysis worksheet; based on these, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accident tree analysis (OATA) and occupational accident components analysis (OACA), used in parallel to determine the responsible groups and their responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining a detailed list of tasks, responsibilities, and their rates. They are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties. Copyright © 2016 Elsevier Ltd. All rights reserved.
Advanced Fingerprint Analysis Project Fingerprint Constituents
DOE Office of Scientific and Technical Information (OSTI.GOV)
GM Mong; CE Petersen; TRW Clauss
The work described in this report was focused on generating fundamental data on fingerprint components which will be used to develop advanced forensic techniques to enhance fluorescent detection and visualization of latent fingerprints. Chemical components of sweat gland secretions are well documented in the medical literature, and many chemical techniques are available to develop latent prints, but there have been no systematic forensic studies of fingerprint sweat components or of the chemical and physical changes these substances undergo over time.
Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center
NASA Technical Reports Server (NTRS)
Reinath, Michael S.
1997-01-01
Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device (CCD) detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
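The quoted numbers follow the usual 1/sqrt(N) averaging rule for random (precision) uncertainty, assuming statistically independent images:

```python
import math

# Precision uncertainty of an N-image average, assuming independent images.
sigma_single = 5.0                                  # m/s, single-image value
sigma_mean = {n: sigma_single / math.sqrt(n) for n in (1, 25, 100)}
# sigma_mean[100] is 0.5 m/s, the value quoted for a 100-image average
```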
Information extraction from multivariate images
NASA Technical Reports Server (NTRS)
Park, S. K.; Kegley, K. A.; Schiess, J. R.
1986-01-01
An overview of several multivariate image processing techniques is presented, with emphasis on techniques based upon the principal component transformation (PCT). A multiimage associates with each pixel location a multivariate pixel value that has been scaled and quantized into a gray-level vector; bivariate statistics describe the extent to which two component images are correlated. The PCT decorrelates a multiimage, which can reduce its dimensionality and reveal intercomponent dependencies when some off-diagonal covariance elements are not small; for display purposes, the principal component images must be postprocessed into multiimage format. The principal component analysis of a multiimage is a statistical analysis based upon the PCT whose primary application is to determine the intrinsic component dimensionality of the multiimage. Computational considerations are also discussed.
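A small numerical sketch of the PCT on a synthetic three-band multiimage with strongly correlated bands: diagonalizing the band covariance concentrates nearly all of the variance in the first principal component image.

```python
import numpy as np

# Principal component transformation of a synthetic 3-band multiimage.
rng = np.random.default_rng(4)
base = rng.normal(size=(32, 32))                     # shared scene content
bands = np.stack([base + 0.1 * rng.normal(size=(32, 32))
                  for _ in range(3)], axis=-1)       # 3 correlated bands
pixels = bands.reshape(-1, 3)                        # gray-level vectors
cov = np.cov(pixels, rowvar=False)                   # band covariance matrix
evals, evecs = np.linalg.eigh(cov)                   # ascending eigenvalues
pcs = (pixels - pixels.mean(axis=0)) @ evecs[:, ::-1]  # PC images
frac = evals[-1] / evals.sum()                       # variance kept by PC1
```

That `frac` is close to 1 here is the dimensionality-reduction point: one component image carries nearly all of the multiimage's information.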
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
SCADA alarms processing for wind turbine component failure detection
NASA Astrophysics Data System (ADS)
Gonzalez, E.; Reder, M.; Melero, J. J.
2016-09-01
Wind turbine failure and downtime can often compromise the profitability of a wind farm due to their high impact on operation and maintenance (O&M) costs. Early detection of failures can facilitate the changeover from corrective maintenance towards a predictive approach. This paper presents a cost-effective methodology that combines various alarm analysis techniques, using data from the Supervisory Control and Data Acquisition (SCADA) system, in order to detect component failures. The approach categorises the alarms according to a reviewed taxonomy, turning overwhelming data into valuable information to assess component status. Then, different alarm analysis techniques are applied for two purposes: evaluating the capability of the SCADA alarm system to detect failures, and investigating the relation between faults in some components and subsequent failures in others. Various case studies are presented and discussed. The study highlights the relationship between faulty behaviour in different components, and between failures and adverse environmental conditions.
The Intercultural Component in Textbooks for Teaching a Service Technical Writing Course
ERIC Educational Resources Information Center
Matveeva, Natalia
2007-01-01
This research article investigates new developments in the representation of the intercultural component in textbooks for a service technical writing course. Through textual analysis, using quantitative and qualitative techniques, I report discourse analysis of 15 technical writing textbooks published during 1993-2006. The theoretical and…
Study of advanced techniques for determining the long-term performance of components
NASA Technical Reports Server (NTRS)
1972-01-01
A study was conducted of techniques having the capability of determining the performance and reliability of components for spacecraft liquid propulsion applications on long-term missions. The study utilized two major approaches: improvement of existing technology and evolution of new technology. The criteria established and methods evolved are applicable to valve components. Primary emphasis was placed on the oxygen difluoride and diborane propellant combination. The investigation included analysis, fabrication, and tests of experimental equipment to provide data and performance criteria.
Probabilistic/Fracture-Mechanics Model For Service Life
NASA Technical Reports Server (NTRS)
Watkins, T., Jr.; Annis, C. G., Jr.
1991-01-01
Computer program makes probabilistic estimates of the lifetime of an engine and its components. Developed to fill the need for a more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of the levels of risk created by engineering decisions in designing a system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clegg, Samuel M; Barefield, James E; Wiens, Roger C
2008-01-01
Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
Portable XRF and principal component analysis for bill characterization in forensic science.
Appoloni, C R; Melquiades, F L
2014-02-01
Several modern techniques have been applied to prevent the counterfeiting of money bills. The objective of this study was to demonstrate the potential of the portable X-ray fluorescence (PXRF) technique and the multivariate analysis method of Principal Component Analysis (PCA) for the classification of bills for use in forensic science. Bills of the Dollar, Euro, and Real (Brazilian currency) were measured directly at different colored regions, without any previous preparation. Spectra interpretation allowed the identification of Ca, Ti, Fe, Cu, Sr, Y, Zr and Pb. PCA analysis separated the bills into three groups, with subgroups among the Brazilian currency. In conclusion, the samples were classified according to their origin, identifying the elements responsible for differentiation and basic pigment composition. PXRF allied to multivariate discriminant methods is a promising technique for rapid and non-destructive identification of false bills in forensic science. Copyright © 2013 Elsevier Ltd. All rights reserved.
Fourier Analysis and the Rhythm of Conversation.
ERIC Educational Resources Information Center
Dabbs, James M., Jr.
Fourier analysis, a common technique in engineering, breaks down a complex wave form into its simple sine wave components. Communication researchers have recently suggested that this technique may provide an index of the rhythm of conversation, since vocalizing and pausing produce a complex wave form pattern of alternation between two speakers. To…
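The idea can be sketched with a synthetic talk/pause alternation: a two-speaker square wave decomposes into a dominant sine component at the alternation frequency plus odd harmonics, so the spectral peak directly indexes the conversational rhythm. The sampling rate and alternation frequency below are invented for the demonstration.

```python
import numpy as np

# Fourier decomposition of a two-speaker alternation pattern.
fs = 100                                       # samples per second
t = np.arange(0, 10, 1 / fs)
f0 = 0.5                                       # one full A/B cycle every 2 s
signal = np.sign(np.sin(2 * np.pi * f0 * t))   # A talks (+1) / B talks (-1)
spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]      # dominant rhythm frequency, Hz
```

Real conversational data would of course be less regular, but the dominant spectral peak would still summarize the alternation rate.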
Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques
NASA Technical Reports Server (NTRS)
McDonald, G.; Storrie-Lombardi, M.; Nealson, K.
1999-01-01
The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
ERIC Educational Resources Information Center
Misra, Anjali; Schloss, Patrick J.
1989-01-01
The critical analysis of 23 studies using respondent techniques for the reduction of excessive emotional reactions in school children focuses on research design, dependent variables, independent variables, component analysis, and demonstrations of generalization and maintenance. Results indicate widespread methodological flaws that limit the…
NASA Technical Reports Server (NTRS)
Sopher, R.; Hallock, D. W.
1985-01-01
A time history analysis for rotorcraft dynamics based on dynamical substructures, and nonstructural mathematical and aerodynamic components is described. The analysis is applied to predict helicopter ground resonance and response to rotor damage. Other applications illustrate the stability and steady vibratory response of stopped and gimballed rotors, representative of new technology. Desirable attributes expected from modern codes are realized, although the analysis does not employ a complete set of techniques identified for advanced software. The analysis is able to handle a comprehensive set of steady state and stability problems with a small library of components.
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz
2018-05-05
A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand and to investigate this methodology to determine the effect of specific physical therapy techniques in balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups (P < 0.05). The proposed methodology allows Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.
Computer-Aided Design of Low-Noise Microwave Circuits
NASA Astrophysics Data System (ADS)
Wedge, Scott William
1991-02-01
Devoid of most natural and manmade noise, microwave frequencies have detection sensitivities limited by internally generated receiver noise. Low-noise amplifiers are therefore critical components in radio astronomical antennas, communications links, radar systems, and even home satellite dishes. A general technique to accurately predict the noise performance of microwave circuits has been lacking. Current noise analysis methods have been limited to specific circuit topologies or neglect correlation, a strong effect in microwave devices. Presented here are generalized methods, developed for computer-aided design implementation, for the analysis of linear noisy microwave circuits comprised of arbitrarily interconnected components. Included are descriptions of efficient algorithms for the simultaneous analysis of noisy and deterministic circuit parameters based on a wave variable approach. The methods are therefore particularly suited to microwave and millimeter-wave circuits. Noise contributions from lossy passive components and active components with electronic noise are considered. Also presented is a new technique for the measurement of device noise characteristics that offers several advantages over current measurement methods.
NASA Technical Reports Server (NTRS)
Duong, T. A.
2004-01-01
In this paper, we present a new, simple, and hardware-optimized sequential learning technique for adaptive Principal Component Analysis (PCA), which will help optimize hardware implementation in VLSI and overcome the difficulties of traditional gradient descent in learning convergence and hardware implementation.
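Sequential (sample-by-sample) PCA learning is classically illustrated by Oja's rule, shown below on synthetic 2-D data. This is a textbook stand-in for the general idea, not the paper's optimized VLSI architecture.

```python
import numpy as np

# Oja's rule: an online (sequential) PCA learning update.
rng = np.random.default_rng(1)
C = np.array([[3.0, 1.0], [1.0, 1.0]])         # data covariance
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)
w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01                                     # learning rate
for x in X:
    y = w @ x                                  # neuron output
    w += eta * y * (x - y * w)                 # Oja's update keeps ||w|| ~ 1
lead = np.linalg.eigh(C)[1][:, -1]             # true leading eigenvector
cos = abs(w @ lead) / np.linalg.norm(w)        # alignment with true PC1
```

Each update touches only one sample and a handful of multiply-accumulate operations, which is what makes this family of rules attractive for hardware implementation.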
A further component analysis for illicit drugs mixtures with THz-TDS
NASA Astrophysics Data System (ADS)
Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui
2009-07-01
A new method for the quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. The traditional method requires the fingerprints of all the pure chemical components. In practice, as only the objective components in a mixture and their absorption features are known, it is necessary and important to present a more practical technique for detection and identification. Our new method for the quantitative inspection of mixtures of illicit drugs is developed using the derivative spectrum. In this method, the ratio of objective components in a mixture can be obtained on the assumption that all objective components in the mixture and their absorption features are known, while the unknown components are not needed. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for our experiment. The experimental result verified the effectiveness of the method, suggesting that it could be an effective method for the quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world applications of quantitative illicit drug analysis, and it could be an effective method in the field of security and pharmaceuticals inspection.
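A toy version of the derivative-spectrum idea: with the absorption features of the known components in hand, the mixing ratio follows from least squares on the first-derivative spectra. The Gaussian lines below are hypothetical stand-ins for real THz fingerprints, and this sketch does not reproduce the paper's handling of unknown components.

```python
import numpy as np

# Least-squares unmixing on first-derivative spectra.
freq = np.linspace(0.2, 2.6, 400)                       # THz axis
drug = np.exp(-((freq - 1.2) / 0.08) ** 2)              # hypothetical drug line
adulterant = np.exp(-((freq - 2.0) / 0.15) ** 2)        # hypothetical adulterant
mixture = 0.3 * drug + 0.7 * adulterant                 # known-ratio mixture

D = np.column_stack([np.gradient(drug, freq),
                     np.gradient(adulterant, freq)])
coeffs, *_ = np.linalg.lstsq(D, np.gradient(mixture, freq), rcond=None)
# coeffs recovers the 0.3 / 0.7 component ratio
```

Working with derivatives suppresses slowly varying baseline contributions, which is one reason derivative spectra are attractive for mixtures containing uncharacterized components.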
Lü, Gui-Cai; Zhao, Wei-Hong; Wang, Jiang-Tao
2011-01-01
The identification techniques for 10 species of red tide algae often found in the coastal areas of China were developed by combining the three-dimensional fluorescence spectra of fluorescent dissolved organic matter (FDOM) from the cultured red tide algae with principal component analysis. Based on the results of the principal component analysis, the first principal component loading spectrum of the three-dimensional fluorescence spectrum was chosen as the identification characteristic spectrum for red tide algae, and the phytoplankton fluorescence characteristic spectrum band was established. The 10 algae species were then tested using Bayesian discriminant analysis, with a correct identification rate of more than 92% for Pyrrophyta at the species level and more than 75% for Bacillariophyta at the genus level, within which the correct identification rates exceeded 90% for Phaeodactylum and Chaetoceros. The results showed that the identification techniques for the 10 species of red tide algae, based on the three-dimensional fluorescence spectra of FDOM from the cultured red tide algae and principal component analysis, work well.
The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason L. Wright; Milos Manic
2010-05-01
This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
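The comparison described above can be illustrated on toy data. The paper's exact "sorted covariance" procedure is not specified in the abstract, so this sketch assumes it means keeping the features with the largest variance (largest diagonal entries of the covariance matrix); the data and target dimensionality are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy feature matrix: 100 samples x 6 features; features 0 and 1 carry
# most of the variance, mimicking informative dimensions.
X = rng.normal(size=(100, 6)) * np.array([5.0, 3.0, 1.0, 0.5, 0.2, 0.1])
Xc = X - X.mean(axis=0)

k = 2  # target dimensionality

# (1) Sorted-covariance selection (assumed meaning): keep the k features
# with the largest variance.
var_order = np.argsort(np.diag(np.cov(Xc.T)))[::-1]
X_sel = Xc[:, var_order[:k]]

# (2) Principal component analysis: project onto the top-k eigenvectors
# of the covariance matrix.
eigval, eigvec = np.linalg.eigh(np.cov(Xc.T))
X_pca = Xc @ eigvec[:, ::-1][:, :k]

# Fraction of total variance retained by each reduction.
total = Xc.var(axis=0).sum()
frac_sel = X_sel.var(axis=0).sum() / total
frac_pca = X_pca.var(axis=0).sum() / total
print(round(frac_sel, 3), round(frac_pca, 3))  # PCA is never worse
```

By construction, the PCA projection retains at least as much variance as any selection of k raw features, which is the kind of trade-off the classifier comparison in the paper quantifies.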
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
Abstracts indexed in this issue include: an investigation of the measurement of frequency-band-average loss factors of structural components for use in the statistical energy analysis method; and, to further understand the practical application of statistical energy analysis, a study of a two-section plate-like frame structure (key words: finite element technique, statistical energy analysis, experimental techniques, framed structures, computer programs).
Chromatographic and electrophoretic approaches in ink analysis.
Zlotnick, J A; Smith, F P
1999-10-15
Inks are manufactured from a wide variety of substances that exhibit very different chemical behaviors. Inks designed for use in different writing instruments or printing methods have quite dissimilar components. Since the 1950s, chromatographic and electrophoretic methods have played important roles in the analysis of inks, where compositional information may have bearing on the investigation of counterfeiting, fraud, forgery, and other crimes. Techniques such as paper chromatography and electrophoresis, thin-layer chromatography, high-performance liquid chromatography, gas chromatography, gel electrophoresis, and the relatively new technique of capillary electrophoresis have all been explored as possible avenues for the separation of components of inks. This paper reviews the components of different types of inks and the applications of the above separation methods to their analysis.
A comparison of autonomous techniques for multispectral image analysis and classification
NASA Astrophysics Data System (ADS)
Valdiviezo-N., Juan C.; Urcid, Gonzalo; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso
2012-10-01
Multispectral imaging has given rise to important applications related to the classification and identification of objects in a scene. Because multispectral instruments can be used to estimate the reflectance of materials in the scene, these techniques constitute fundamental tools for materials analysis and quality control. In recent years, a variety of algorithms have been developed to work with multispectral data, with the main purpose of correctly classifying the objects in a scene. The present study briefly reviews some classical techniques, as well as a novel one, that have been used for such purposes. The use of principal component analysis and K-means clustering as important classification algorithms is discussed. Moreover, a recent method based on the min-W and max-M lattice auto-associative memories, originally proposed for endmember determination in hyperspectral imagery, is introduced as a classification method. Besides discussing their mathematical foundations, we emphasize their main characteristics and the results achieved for two exemplar images composed of objects similar in appearance but spectrally different. The classification results show that the first components computed from principal component analysis can be used to highlight areas with different spectral characteristics. In addition, the use of lattice auto-associative memories provides good results for materials classification even in cases where some similarities appear in the spectral responses of the materials.
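A minimal sketch of the PCA-plus-K-means pipeline discussed above, applied to a synthetic two-material "image" (the spectra, noise level, and crude initialization are assumptions of this sketch, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "multispectral image": 200 pixels x 8 bands, two materials
# with distinct mean spectra.
spec1 = np.linspace(0.2, 0.9, 8)
spec2 = np.linspace(0.9, 0.1, 8)
pixels = np.vstack([spec1 + 0.02 * rng.normal(size=(100, 8)),
                    spec2 + 0.02 * rng.normal(size=(100, 8))])

# PCA: project pixels onto the two leading principal components.
Xc = pixels - pixels.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Minimal k-means (k = 2) on the PCA scores.
centers = scores[[0, -1]]  # crude init: one pixel from each material
for _ in range(20):
    labels = np.argmin(((scores[:, None] - centers[None]) ** 2).sum(-1),
                       axis=1)
    centers = np.array([scores[labels == j].mean(axis=0) for j in range(2)])

# Each cluster should recover one material.
print(labels[:100].mean(), labels[100:].mean())
```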
Moon, Young-Wan; Kim, Hyun-Jung; Ahn, Hyeong-Sik; Park, Chan-Deok; Lee, Dae-Hee
2016-09-01
This meta-analysis was designed to compare the accuracy of soft tissue balancing and femoral component rotation, as well as change in joint line position, between the measured resection and gap balancing techniques in primary total knee arthroplasty. Studies were included in the meta-analysis if they compared soft tissue balancing and/or radiologic outcomes in patients who underwent total knee arthroplasty with the gap balancing and measured resection techniques. Comparisons included differences in flexion/extension, medial/lateral flexion, and medial/lateral extension gaps, as well as femoral component rotation and change in joint line position. Finally, 8 studies identified via electronic (MEDLINE, EMBASE, and the Cochrane Library) and manual searches were included. All 8 studies showed a low risk of selection bias and provided detailed demographic data. There was some inherent heterogeneity due to uncontrolled bias, because all included studies were observational comparison studies. The pooled mean difference in gap differences between the gap balancing and measured resection techniques was not significant (-0.09 mm, 95% confidence interval [CI]: -0.40 to +0.21 mm; P = 0.55), except that the medial/lateral extension gap difference was 0.58 mm greater for measured resection than for gap balancing (95% CI: -1.01 to -0.15 mm; P = 0.008). Conversely, the pooled mean differences in femoral component external rotation (0.77°, 95% CI: 0.18° to 1.35°; P = 0.01) and joint line change (1.17 mm, 95% CI: 0.82 to 1.52 mm; P < 0.001) were significantly greater for the gap balancing technique than for the measured resection technique. The gap balancing and measured resection techniques showed similar soft tissue balancing, except for the medial/lateral extension gap difference. However, the femoral component was more externally rotated and the joint line more elevated with gap balancing than with measured resection.
These differences were minimal (around 1 mm or 1°) and therefore may have little effect on the biomechanics of the knee joint. This suggests that the gap balancing and measured resection techniques are not mutually exclusive.
Energy resolution improvement of CdTe detectors by using the principal component analysis technique
NASA Astrophysics Data System (ADS)
Alharbi, T.
2018-02-01
In this paper, we report on the application of the Principal Component Analysis (PCA) technique for the improvement of the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of the charge-trapping effect reflected in the shape of each detector pulse, thereby correcting for charge trapping. The details of the method are described and the results obtained with a CdTe detector are shown. We have achieved an energy resolution of 1.8% (FWHM) at 662 keV with full detection efficiency from a 1 mm thick CdTe detector that gives an energy resolution of 4.5% (FWHM) with the standard pulse-processing method.
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis and life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed, and the proposed life-prediction theories are evaluated.
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigenimages of observed motions are used for classification. Using real data, we demonstrate that the PCA-based technique provides a performance improvement over conventional feature extraction methods.
An Inquiry-Based Project Focused on the X-Ray Powder Diffraction Analysis of Common Household Solids
ERIC Educational Resources Information Center
Hulien, Molly L.; Lekse, Jonathan W.; Rosmus, Kimberly A.; Devlin, Kasey P.; Glenn, Jennifer R.; Wisneski, Stephen D.; Wildfong, Peter; Lake, Charles H.; MacNeil, Joseph H.; Aitken, Jennifer A.
2015-01-01
While X-ray powder diffraction (XRPD) is a fundamental analytical technique used by solid-state laboratories across a breadth of disciplines, it is still underrepresented in most undergraduate curricula. In this work, we incorporate XRPD analysis into an inquiry-based project that requires students to identify the crystalline component(s) of…
An RFI Detection Algorithm for Microwave Radiometers Using Sparse Component Analysis
NASA Technical Reports Server (NTRS)
Mohammed-Tano, Priscilla N.; Korde-Patel, Asmita; Gholian, Armen; Piepmeier, Jeffrey R.; Schoenwald, Adam; Bradley, Damon
2017-01-01
Radio Frequency Interference (RFI) is a threat to passive microwave measurements and if undetected, can corrupt science retrievals. The sparse component analysis (SCA) for blind source separation has been investigated to detect RFI in microwave radiometer data. Various techniques using SCA have been simulated to determine detection performance with continuous wave (CW) RFI.
Mapping brain activity in gradient-echo functional MRI using principal component analysis
NASA Astrophysics Data System (ADS)
Khosla, Deepak; Singh, Manbir; Don, Manuel
1997-05-01
The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. The technique is well suited to EPI image sequences, where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique in which a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis (CSF) technique. As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential for detecting true stimulus-correlated changes in the presence of other interfering signals.
Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan
2018-04-23
Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and they are highly dependent on the comprehensive analysis of the chemical components of the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of the analysis of medicinal plants. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications in the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated techniques for the analysis of medicinal plants, including but not limited to one-dimensional and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, are reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the leading platform for further research aimed at understanding both their empirical therapeutic efficacy and their quality control. Copyright © 2018 John Wiley & Sons, Ltd.
Recent trends in atomic fluorescence spectrometry towards miniaturized instrumentation: a review.
Zou, Zhirong; Deng, Yujia; Hu, Jing; Jiang, Xiaoming; Hou, Xiandeng
2018-08-17
Atomic fluorescence spectrometry (AFS), as one of the common atomic spectrometric techniques with high sensitivity, simple instrumentation, and low acquisition and running cost, has been widely used in various fields for trace elemental analysis, notably the determination of hydride-forming elements by hydride generation atomic fluorescence spectrometry (HG-AFS). In recent years, the soaring demand for field analysis has significantly promoted the miniaturization of analytical atomic spectrometers, or at least of instrumental components. Various techniques have also been developed to approach the goal of portable/miniaturized AFS instrumentation for field analysis. In this review, potentially portable/miniaturized AFS techniques, primarily involving advanced instrumental components and whole instrumentation, with references since 2000, are summarized and discussed. The discussion mainly covers five aspects: radiation source, atomizer, detector, sample introduction, and the miniaturized atomic fluorescence spectrometer/system. Copyright © 2018 Elsevier B.V. All rights reserved.
Using foreground/background analysis to determine leaf and canopy chemistry
NASA Technical Reports Server (NTRS)
Pinzon, J. E.; Ustin, S. L.; Hart, Q. J.; Jacquemoud, S.; Smith, M. O.
1995-01-01
Spectral Mixture Analysis (SMA) has become a well-established procedure for analyzing imaging spectrometry data; however, the technique is relatively insensitive to minor sources of spectral variation (e.g., discriminating stressed from unstressed vegetation, or variations in canopy chemistry). Other statistical approaches have been tried, e.g., stepwise multiple linear regression (SMLR) analysis, to predict canopy chemistry. Grossman et al. reported that SMLR is sensitive to measurement error and that predictions of minor chemical components are not independent of patterns observed in more dominant spectral components like water. Further, they observed that the relationships were strongly dependent on the mode of expressing reflectance (R, -log R) and on whether chemistry was expressed on a weight (g/g) or area (g/sq m) basis. Thus, alternative multivariate techniques need to be examined. Smith et al. reported a revised SMA, termed Foreground/Background Analysis (FBA), that permits directing the analysis along any axis of variance by identifying mutually orthonormal vectors through the n-dimensional spectral volume. Here, we report an application of the FBA technique for the detection of canopy chemistry using a modified form of the analysis.
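For context, the classical SMA step that FBA refines can be sketched as an ordinary least-squares unmixing of a pixel against known endmember spectra. The endmembers and mixing fractions below are invented for illustration; this is plain SMA, not the FBA procedure itself.

```python
import numpy as np

# Two hypothetical endmember spectra (e.g. vegetation and soil) on a
# 10-band grid; values are illustrative reflectances.
bands = np.arange(10)
veg = 0.1 + 0.05 * bands
soil = 0.6 - 0.03 * bands
E = np.column_stack([veg, soil])

# A mixed pixel: 40% vegetation, 60% soil, with small noise.
rng = np.random.default_rng(2)
pixel = 0.4 * veg + 0.6 * soil + 0.001 * rng.normal(size=10)

# Unconstrained least-squares unmixing, renormalised to sum to one.
f, *_ = np.linalg.lstsq(E, pixel, rcond=None)
fractions = f / f.sum()
print(np.round(fractions, 2))  # close to the true 0.4 / 0.6 split
```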
Time-dependent inertia analysis of vehicle mechanisms
NASA Astrophysics Data System (ADS)
Salmon, James Lee
Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are derived and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to a dynamic analysis of an automobile liftgate latch and an outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid-body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
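The integration scheme named above can be sketched directly. A minimal fourth-order Runge-Kutta stepper is applied here to a simple undamped oscillator standing in for the latch/handle equation of motion (the actual system in the dissertation is nonlinear and forced by measured accelerations):

```python
import numpy as np

# RK4 integration of a second-order equation of motion rewritten as a
# first-order system y = [x, v].
def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

omega = 2.0  # natural frequency (assumed)
f = lambda t, y: np.array([y[1], -omega ** 2 * y[0]])

# Integrate x'' = -omega^2 x with x(0) = 1, v(0) = 0 up to t = 1;
# the exact solution is cos(omega * t).
y, h = np.array([1.0, 0.0]), 0.01
for i in range(int(1.0 / h)):
    y = rk4_step(f, i * h, y, h)
print(y[0], np.cos(omega * 1.0))  # RK4 tracks the exact solution closely
```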
Biostatistics Series Module 10: Brief Overview of Multivariate Methods.
Hazra, Avijit; Gogtay, Nithya
2017-01-01
Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, that make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count type of data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It tries to examine whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied to psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract from a larger number of metric variables, a smaller number of composite factors or components, which are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. 
The calculation intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with wider availability, and increasing sophistication of statistical software and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
NASA Astrophysics Data System (ADS)
Biswal, Milan; Mishra, Srikanta
2018-05-01
The limited information on the origin and nature of stimulus frequency otoacoustic emissions (SFOAEs) necessitates a thorough reexamination of SFOAE analysis procedures, which will lead to a better understanding of how SFOAEs are generated. The SFOAE response waveform in the time domain can be interpreted as a summation of amplitude-modulated and frequency-modulated component waveforms, and the efficiency of a technique in segregating these components is critical for describing the nature of SFOAEs. Recent advancements in robust time-frequency analysis algorithms have staked claims to more accurate extraction of these components from composite signals buried in noise, but their potential has not been fully explored for SFOAE analysis. Because these techniques differ in the information they preserve, the choice of analysis method may affect scientific conclusions. This paper attempts to bridge this gap in the literature by evaluating the performance of three linear time-frequency analysis algorithms, the short-time Fourier transform (STFT), the continuous wavelet transform (CWT), and the S-transform (ST), and two nonlinear algorithms, the Hilbert-Huang transform (HHT) and the synchrosqueezed wavelet transform (SWT). We revisit the extraction of constituent components and the estimation of their magnitude and delay by carefully evaluating the impact of variations in analysis parameters. The performance of HHT and SWT, from the perspective of time-frequency filtering and delay estimation, was found to be relatively less efficient for analyzing SFOAEs. The intrinsic mode functions of HHT do not completely characterize the reflection components, and hence IMF-based filtering alone is not recommended for segregating the principal emission from multiple reflection components. We found STFT, CWT, and ST to be suitable for cancelling multiple internal reflection components with only marginal alteration of the SFOAE.
NASA Technical Reports Server (NTRS)
1984-01-01
Nonlinear structural analysis techniques for engine structures and components are addressed. The finite element method and boundary element method are discussed in terms of stress and structural analyses of shells, plates, and laminates.
2014-01-01
Current musculoskeletal imaging techniques usually target the macro-morphology of articular cartilage or use histological analysis. These techniques are able to reveal advanced osteoarthritic changes in articular cartilage but fail to give detailed information to distinguish early osteoarthritis from healthy cartilage, which necessitates high-resolution imaging techniques that measure cells and the extracellular matrix within the multilayer structure of articular cartilage. This review provides a comprehensive exploration of the cellular components and extracellular matrix of articular cartilage, as well as high-resolution imaging techniques, including magnetic resonance imaging, electron microscopy, confocal laser scanning microscopy, second harmonic generation microscopy, and laser scanning confocal arthroscopy, for measuring the multilayer ultra-structure of articular cartilage. This review also provides an overview of micro-structural analysis of the main components of normal and osteoarthritic cartilage and discusses the potential and challenges associated with developing non-invasive high-resolution imaging techniques for both research and the clinical diagnosis of early to late osteoarthritis. PMID:24946278
2013-09-01
[Report documentation page and list-of-figures residue; recoverable details: organization report number ARL-TR-6576; Figure 11, estimated angle-of-attack component history for projectile no. 2; Figure 12, comparison of angle-of-attack component estimates for projectile no. 2.]
Independent component analysis for automatic note extraction from musical trills
NASA Astrophysics Data System (ADS)
Brown, Judith C.; Smaragdis, Paris
2004-05-01
The method of principal component analysis, which is based on second-order statistics (or linear independence), has long been used for redundancy reduction of audio data. The more recent technique of independent component analysis, enforcing much stricter statistical criteria based on higher-order statistical independence, is introduced and shown to be far superior in separating independent musical sources. This theory has been applied to piano trills and a database of trill rates was assembled from experiments with a computer-driven piano, recordings of a professional pianist, and commercially available compact disks. The method of independent component analysis has thus been shown to be an outstanding, effective means of automatically extracting interesting musical information from a sea of redundant data.
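A compact sketch of the second-order (PCA/whitening) step followed by a FastICA-style higher-order separation, applied to two synthetic independent sources rather than piano recordings (the mixing matrix, tanh nonlinearity, and iteration count are assumptions of this sketch):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two statistically independent, non-Gaussian sources standing in for
# two simultaneous notes.
n = 5000
s1 = np.sign(np.sin(np.linspace(0, 40 * np.pi, n)))  # square wave
s2 = rng.uniform(-1, 1, n)                            # uniform noise
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.5, 1.0]])                # mixing matrix
X = A @ S

# Whitening: the PCA / second-order (decorrelation) step.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E / np.sqrt(d)) @ E.T @ X

# FastICA with a tanh nonlinearity and symmetric decorrelation:
# the higher-order step that enforces statistical independence.
W = rng.normal(size=(2, 2))
for _ in range(200):
    g = np.tanh(W @ Z)
    gp = 1 - g ** 2
    W = (g @ Z.T) / n - gp.mean(axis=1, keepdims=True) * W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt  # re-orthogonalise
recovered = W @ Z

# Each recovered row should correlate strongly with one original source
# (up to sign and permutation).
c = np.abs(np.corrcoef(np.vstack([recovered, S]))[:2, 2:])
print(np.round(c, 2))
```

Whitening alone only decorrelates the mixtures; the higher-order iteration is what actually separates them, which mirrors the abstract's PCA-versus-ICA comparison.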
Ozone data and mission sampling analysis
NASA Technical Reports Server (NTRS)
Robbins, J. L.
1980-01-01
A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
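The empirical-orthogonal-function decomposition mentioned above is, in practice, an SVD of the centered space-time data matrix. A sketch on a synthetic field built from two known spatial patterns (the field and its patterns are invented for illustration, not ozone data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic space-time field: 50 time steps x 30 grid points, built from
# two spatial patterns so the EOF decomposition is easy to check.
x = np.linspace(0, np.pi, 30)
pat1, pat2 = np.sin(x), np.sin(2 * x)
t = np.arange(50)
field = (np.outer(np.cos(0.3 * t), pat1)
         + 0.4 * np.outer(np.sin(0.5 * t), pat2)
         + 0.01 * rng.normal(size=(50, 30)))

# EOFs are the right singular vectors of the time-centered data matrix;
# squared singular values give the variance explained by each mode.
anom = field - field.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()
print(np.round(explained[:3], 3))  # the first two modes dominate
```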
BUILDING ENVELOPE OPTIMIZATION USING EMERGY ANALYSIS
Energy analysis is an integral component of sustainable building practices. Energy analysis coupled with optimization techniques may offer solutions for greater energy efficiency over the lifetime of the building. However, all such computations employ the energy used for operation...
Using Machine Learning Techniques in the Analysis of Oceanographic Data
NASA Astrophysics Data System (ADS)
Falcinelli, K. E.; Abuomar, S.
2017-12-01
Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
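One of the unsupervised techniques named above, fuzzy c-means, can be sketched in a few lines. The two-dimensional "features" below are synthetic stand-ins for ADCP-derived quantities, and c = 2 clusters with fuzzifier m = 2 are assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy stand-in for ADCP current-profile features: two clusters in 2-D.
X = np.vstack([rng.normal([0, 0], 0.3, size=(50, 2)),
               rng.normal([3, 3], 0.3, size=(50, 2))])

# Minimal fuzzy c-means (c = 2, fuzzifier m = 2).
c, m = 2, 2.0
U = rng.random((len(X), c))
U /= U.sum(axis=1, keepdims=True)  # fuzzy memberships sum to 1
for _ in range(100):
    Um = U ** m
    # Cluster centers: membership-weighted means of the data.
    centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
    # Squared distances of every point to every center.
    d2 = ((X[:, None] - centers[None]) ** 2).sum(-1) + 1e-12
    # Membership update for m = 2: proportional to 1/d^2, row-normalised.
    U = (1.0 / d2) / (1.0 / d2).sum(axis=1, keepdims=True)
print(np.round(centers, 1))  # centers land near the two cluster means
```

Unlike hard k-means, each profile keeps a graded membership in every cluster, which is often more faithful to gradual oceanographic transitions.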
DATMAN: A reliability data analysis program using Bayesian updating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, M.; Feltus, M.A.
1996-12-31
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing the tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
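The Bayesian updating that DATMAN automates can be illustrated with a conjugate prior. A common reliability choice (assumed here for illustration, not necessarily DATMAN's distribution menu) is a Gamma prior on a component's failure rate, updated with observed failures over operating time:

```python
# Conjugate Bayesian update for a component failure rate.
# Prior: lambda ~ Gamma(alpha, beta), where beta acts as a
# "prior exposure" in operating hours.  Numbers are illustrative.
alpha, beta = 2.0, 10000.0  # prior mean rate = 2e-4 failures/hour

# New evidence: k failures observed over T operating hours.
k, T = 3, 20000.0

# Conjugacy: the posterior is Gamma(alpha + k, beta + T).
alpha_post, beta_post = alpha + k, beta + T
prior_mean = alpha / beta
post_mean = alpha_post / beta_post
print(prior_mean, post_mean)  # estimate shifts toward the observed rate
```

The posterior mean lands between the prior mean and the raw observed rate k/T, which is exactly the "update reliability data as failures accumulate" behavior the abstract describes.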
Methods for spectral image analysis by exploiting spatial simplicity
Keenan, Michael R.
2010-05-25
Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
Methods for spectral image analysis by exploiting spatial simplicity
Keenan, Michael R.
2010-11-23
Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
Dynamic analysis for shuttle design verification
NASA Technical Reports Server (NTRS)
Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.
1972-01-01
Two approaches that are used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure in order to use computer programs for dynamic structural analysis. The second method utilizes modal-coupling techniques of experimental verification made by vibrating only spacecraft components and by deducing modes and frequencies of the complete vehicle from results obtained in the component tests.
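The modes and frequencies sought by both approaches come from the generalized eigenproblem K·phi = omega²·M·phi. A sketch on a two-degree-of-freedom spring-mass chain (illustrative values, not shuttle data):

```python
import numpy as np

# 2-DOF spring-mass chain: the discrete analogue of the finite-element
# eigenproblem  K phi = omega^2 M phi.
m, k = 1.0, 100.0
M = np.diag([m, m])
K = np.array([[2 * k, -k],
              [-k, 2 * k]])

# With a diagonal mass matrix, M^{-1} K stays symmetric, so eigh applies;
# for a general M one would use a generalized solver such as
# scipy.linalg.eigh(K, M).
eigvals, eigvecs = np.linalg.eigh(np.linalg.inv(M) @ K)
freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
print(np.round(freqs_hz, 3))  # the two natural frequencies in Hz
```

Modal coupling, the second approach in the abstract, assembles the full-vehicle modes from eigenvectors like these computed for individual components.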
NASA Technical Reports Server (NTRS)
Dickson, B.; Cronkhite, J.; Bielefeld, S.; Killian, L.; Hayden, R.
1996-01-01
The objective of this study was to evaluate two techniques, Flight Condition Recognition (FCR) and Flight Load Synthesis (FLS), for usage monitoring and to assess the potential benefits of extending the retirement intervals of life-limited components, thus reducing the operator's maintenance and replacement costs. Both techniques involve indirect determination of loads using measured flight parameters and subsequent fatigue analysis to calculate the life expended on the life-limited components. To assess the potential benefit of usage monitoring, the two usage techniques were compared to current methods of component retirement. In addition, comparisons were made with direct load measurements to assess the accuracy of the two techniques. The data used for the evaluation of the usage monitoring techniques were collected under an independent HUMS flight trial program, using a commercially available HUMS and data recording system. The usage data collected from the HUMS trial aircraft were analyzed off-line using PC-based software that included the FCR and FLS techniques. In the future, if the techniques prove feasible, usage monitoring would be incorporated into the onboard HUMS.
NASA Astrophysics Data System (ADS)
Joseph, Abhilash J.; Kumar, Binay
2018-03-01
The conventionally reported value of remanent polarization (Pr) contains contributions from non-remanent components which are not usable for memory device applications. This report presents techniques which extract the true-remanent (intrinsic) component of polarization after eliminating the non-remanent component in ferroelectric ceramics. For this, the "remanent hysteresis task" and the "positive-up-negative-down" technique were performed, which utilize the switchable properties of polarization to nullify the contributions from the non-remanent (non-switchable) components. The report also addresses the time-dependent leakage behavior of the ceramics, focusing on the resistive leakage (a time-dependent parameter) present in the ceramics. The techniques presented here are especially useful for polycrystalline ceramics, where leakage current leads to an erroneous estimation of Pr.
Independent Component Analysis of Textures
NASA Technical Reports Server (NTRS)
Manduchi, Roberto; Portilla, Javier
2000-01-01
A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.
Respiratory protective device design using control system techniques
NASA Technical Reports Server (NTRS)
Burgess, W. A.; Yankovich, D.
1972-01-01
The feasibility of a control system analysis approach to provide a design base for respiratory protective devices (RPDs) is considered. A system design approach requires that all functions and components of the system be mathematically identified in a model of the RPD. The mathematical descriptions represent the operation of the components as closely as possible. The individual component descriptions are then combined to describe the complete RPD. Finally, analysis of the mathematical model by control system theory is used to derive compensating component values that force the system to operate in a stable and predictable manner.
EVALUATION OF ACID DEPOSITION MODELS USING PRINCIPAL COMPONENT SPACES
An analytical technique involving principal components analysis is proposed for use in the evaluation of acid deposition models. Relationships among model predictions are compared to those among measured data, rather than the more common one-to-one comparison of predictions to mea...
Shape component analysis: structure-preserving dimension reduction on biological shape spaces.
Lee, Hao-Chih; Liao, Tao; Zhang, Yongjie Jessica; Yang, Ge
2016-03-01
Quantitative shape analysis is required by a wide range of biological studies across diverse scales, ranging from molecules to cells and organisms. In particular, high-throughput and systems-level studies of biological structures and functions have started to produce large volumes of complex high-dimensional shape data. Analysis and understanding of high-dimensional biological shape data require dimension-reduction techniques. We have developed a technique for non-linear dimension reduction of 2D and 3D biological shape representations on their Riemannian spaces. A key feature of this technique is that it preserves distances between different shapes in an embedded low-dimensional shape space. We demonstrate an application of this technique by combining it with non-linear mean-shift clustering on the Riemannian spaces for unsupervised clustering of shapes of cellular organelles and proteins. Source code and data for reproducing results of this article are freely available at https://github.com/ccdlcmu/shape_component_analysis_Matlab. The implementation was made in MATLAB and supported on MS Windows, Linux and Mac OS.
Efficient computational nonlinear dynamic analysis using modal modification response technique
NASA Astrophysics Data System (ADS)
Marinone, Timothy; Avitabile, Peter; Foley, Jason; Wolfson, Janet
2012-08-01
Structural systems often contain nonlinear characteristics, and these nonlinear systems require significant computational resources for solution of the equations of motion. Much of the model, however, is linear; the nonlinearity results from discrete local elements connecting different components together. Using a component mode synthesis approach, a nonlinear model can be developed by interconnecting these linear components with highly nonlinear connection elements. The approach presented in this paper, the Modal Modification Response Technique (MMRT), is a very efficient technique created to address this specific class of nonlinear problem. By utilizing a Structural Dynamics Modification (SDM) approach in conjunction with mode superposition, a significantly smaller set of matrices is required for use in the direct integration of the equations of motion. The approach is compared to traditional analytical approaches to make evident the usefulness of the technique for a variety of test cases.
Component Pin Recognition Using Algorithms Based on Machine Learning
NASA Astrophysics Data System (ADS)
Xiao, Yang; Hu, Hong; Liu, Ze; Xu, Jiangchang
2018-04-01
The purpose of machine vision for a plug-in machine is to improve the machine's stability and accuracy, and recognition of the component pin is an important part of the vision system. This paper focuses on component pin recognition using three different techniques. The first technique involves traditional image processing using the core algorithm for binary large object (BLOB) analysis. The second technique uses the histogram of oriented gradients (HOG) to experimentally compare the effect of the support vector machine (SVM) and the adaptive boosting (AdaBoost) meta-algorithm as classifiers. The third technique is a deep learning method known as a convolutional neural network (CNN), which identifies the pin by comparing a sample against its training data. The main purpose of the research presented in this paper is to increase the knowledge of learning methods used in the plug-in machine industry in order to achieve better results.
Maximally reliable spatial filtering of steady state visual evoked potentials.
Dmochowski, Jacek P; Greaves, Alex S; Norcia, Anthony M
2015-04-01
Due to their high signal-to-noise ratio (SNR) and robustness to artifacts, steady state visual evoked potentials (SSVEPs) are a popular technique for studying neural processing in the human visual system. SSVEPs are conventionally analyzed at individual electrodes or linear combinations of electrodes which maximize some variant of the SNR. Here we exploit the fundamental assumption of evoked responses--reproducibility across trials--to develop a technique that extracts a small number of high SNR, maximally reliable SSVEP components. This novel spatial filtering method operates on an array of Fourier coefficients and projects the data into a low-dimensional space in which the trial-to-trial spectral covariance is maximized. When applied to two sample data sets, the resulting technique recovers physiologically plausible components (i.e., the recovered topographies match the lead fields of the underlying sources) while drastically reducing the dimensionality of the data (i.e., more than 90% of the trial-to-trial reliability is captured in the first four components). Moreover, the proposed technique achieves a higher SNR than that of the single-best electrode or the Principal Components. We provide a freely-available MATLAB implementation of the proposed technique, herein termed "Reliable Components Analysis".
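The reliability-maximizing projection can be illustrated with a small simulation. The synthetic "SSVEP" data, the three-channel topography, and the whitening-based generalized eigensolver below are assumptions for the sketch, not the authors' released MATLAB implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def reliable_components(trials):
    """Spatial filters that maximize trial-to-trial covariance relative
    to pooled within-trial power. trials: (n_trials, n_channels, n_samples)."""
    n = len(trials)
    Rxx = sum(X @ X.T for X in trials)                   # within-trial power
    Rxy = sum(trials[i] @ trials[j].T
              for i in range(n) for j in range(n) if i != j)
    Rxy = (Rxy + Rxy.T) / 2                              # cross-trial covariance
    # Generalized eigenproblem Rxy w = lam Rxx w, solved by whitening Rxx.
    d, U = np.linalg.eigh(Rxx)
    Wh = U / np.sqrt(d)                                  # whitening transform
    lam, V = np.linalg.eigh(Wh.T @ Rxy @ Wh)
    return Wh @ V[:, ::-1], lam[::-1]                    # descending reliability

# Synthetic steady-state response: one reproducible sinusoidal source
# with a fixed 3-channel topography, plus independent per-trial noise.
source = np.sin(2 * np.pi * np.arange(200) / 20)
topo = np.array([1.0, 0.5, -0.3])
trials = np.stack([np.outer(topo, source)
                   + 0.1 * rng.standard_normal((3, 200)) for _ in range(10)])
filters, reliability = reliable_components(trials)
y = filters[:, 0] @ trials[0]          # most reliable component, trial 1
```

Because the sinusoid repeats across trials while the noise does not, the top filter recovers the reproducible source despite the mixing.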
Principal Component Analysis for Enhancement of Infrared Spectra Monitoring
NASA Astrophysics Data System (ADS)
Haney, Ricky Lance
The issue of air quality within the aircraft cabin is receiving increasing attention from both pilot and flight attendant unions. This is due to exposure events caused by poor air quality that in some cases may have contained toxic oil components due to bleed air that flows from outside the aircraft and then through the engines into the aircraft cabin. Significant short and long-term medical issues for aircraft crew have been attributed to exposure. The need for air quality monitoring is especially evident in the fact that currently within an aircraft there are no sensors to monitor the air quality and potentially harmful gas levels (detect-to-warn sensors), much less systems to monitor and purify the air (detect-to-treat sensors) within the aircraft cabin. The specific purpose of this research is to utilize a mathematical technique called principal component analysis (PCA) in conjunction with principal component regression (PCR) and proportionality constant calculations (PCC) to simplify complex, multi-component infrared (IR) spectra data sets into a reduced data set used for determination of the concentrations of the individual components. Use of PCA can significantly simplify data analysis as well as improve the ability to determine concentrations of individual target species in gas mixtures where significant band overlap occurs in the IR spectrum region. Application of this analytical numerical technique to IR spectrum analysis is important in improving performance of commercial sensors that airlines and aircraft manufacturers could potentially use in an aircraft cabin environment for multi-gas component monitoring. The approach of this research is two-fold, consisting of a PCA application to compare simulation and experimental results with the corresponding PCR and PCC to determine quantitatively the component concentrations within a mixture. 
The experimental data sets consist of both two and three component systems that could potentially be present as air contaminants in an aircraft cabin. In addition, experimental data sets are analyzed for a hydrogen peroxide (H2O2) aqueous solution mixture to determine H2O2 concentrations at various levels that could be produced during use of a vapor phase hydrogen peroxide (VPHP) decontamination system. After the PCA application to two and three component systems, the analysis technique is further expanded to include the monitoring of potential bleed air contaminants from engine oil combustion. Simulation data sets created from database spectra were utilized to predict gas components and concentrations in unknown engine oil samples at high temperatures as well as time-evolved gases from the heating of engine oils.
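The PCA-plus-regression (PCR) step described above can be sketched on synthetic overlapping bands. The Gaussian band shapes, noise level, and two-component system are mock assumptions for illustration, not the experimental spectra:

```python
import numpy as np

rng = np.random.default_rng(0)
wn = np.linspace(0.0, 1.0, 120)                 # mock wavenumber axis

def band(center, width=0.08):
    """Gaussian absorption band (illustrative line shape)."""
    return np.exp(-((wn - center) / width) ** 2)

# Two components with strongly overlapping IR bands.
pure = np.stack([band(0.45), band(0.55)])
C_cal = rng.random((30, 2))                     # calibration concentrations
A_cal = C_cal @ pure + 0.01 * rng.standard_normal((30, 120))

# PCA of the calibration spectra via SVD, keeping two components.
mean = A_cal.mean(axis=0)
U, s, Vt = np.linalg.svd(A_cal - mean, full_matrices=False)
scores = (A_cal - mean) @ Vt[:2].T

# PCR: regress the known concentrations on the PCA scores.
design = np.column_stack([scores, np.ones(len(scores))])
B, *_ = np.linalg.lstsq(design, C_cal, rcond=None)

# Predict concentrations of an unseen mixture from its spectrum alone.
c_true = np.array([0.3, 0.7])
t_new = (c_true @ pure - mean) @ Vt[:2].T
c_pred = np.concatenate([t_new, [1.0]]) @ B
```

Even though the two bands overlap heavily, the two leading principal components span the mixture subspace, so the regression recovers the concentrations.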
Novel Framework for Reduced Order Modeling of Aero-engine Components
NASA Astrophysics Data System (ADS)
Safi, Ali
The present study focuses on the popular dynamic reduction methods used in the design of complex assemblies (millions of degrees of freedom) where numerous iterations are involved to achieve the final design. Aerospace manufacturers such as Rolls-Royce and Pratt & Whitney are actively seeking techniques that reduce computational time while maintaining accuracy of the models. This involves modal analysis of components with complex geometries to determine the dynamic behavior due to non-linearity and complicated loading conditions. In such a case the sub-structuring and dynamic reduction techniques prove to be an efficient tool to reduce design cycle time. The components whose designs are finalized can be dynamically reduced to mass and stiffness matrices at the boundary nodes in the assembly. These matrices conserve the dynamics of the component in the assembly, and thus avoid repeated calculations during the analysis runs for design modification of other components. This thesis presents a novel framework in terms of modeling and meshing of any complex structure, in this case an aero-engine casing. In this study the effect of meshing techniques on the run time is highlighted. The modal analysis is carried out using an extremely fine mesh to ensure all minor details in the structure are captured correctly in the Finite Element (FE) model. This is used as the reference model, to compare against the results of the reduced model. The study also shows the conditions/criteria under which dynamic reduction can be implemented effectively, proving the accuracy of the Craig-Bampton (C.B.) method and the limitations of Static Condensation. The study highlights the longer runtime needed to produce the reduced matrices of components compared to the overall runtime of the complete unreduced model, although once the components are reduced, the assembly run time is significantly shorter.
Hence the decision to use Component Mode Synthesis (CMS) is to be taken judiciously considering the number of iterations that may be required during the design cycle.
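The static condensation discussed above can be illustrated on a small spring-mass chain; the 4-DOF model and the choice of boundary DOFs are assumptions for the sketch. Guyan reduction is exact for static loads applied at the retained DOFs, which the example verifies (its error on dynamic modes is what motivates Craig-Bampton):

```python
import numpy as np

# 4-DOF spring-mass chain (both ends grounded); retain DOFs {0, 3} as
# boundary "master" DOFs and condense out the interior "slave" DOFs.
k = 1000.0
K = k * np.array([[ 2.0, -1.0,  0.0,  0.0],
                  [-1.0,  2.0, -1.0,  0.0],
                  [ 0.0, -1.0,  2.0, -1.0],
                  [ 0.0,  0.0, -1.0,  2.0]])
M = np.eye(4)
m_idx, s_idx = [0, 3], [1, 2]

Kss = K[np.ix_(s_idx, s_idx)]
Ksm = K[np.ix_(s_idx, m_idx)]
# Guyan transformation: slaves follow the static deflection of the masters.
T = np.zeros((4, 2))
T[m_idx, [0, 1]] = 1.0
T[s_idx] = -np.linalg.solve(Kss, Ksm)
K_red = T.T @ K @ T          # reduced (2x2) stiffness at the boundary DOFs
M_red = T.T @ M @ T          # reduced mass (only approximate for dynamics)

# Static condensation is exact for static loads at the master DOFs.
f_m = np.array([1.0, 0.0])
f_full = np.zeros(4)
f_full[m_idx] = f_m
u_full = np.linalg.solve(K, f_full)
u_red = np.linalg.solve(K_red, f_m)
```

Craig-Bampton augments this transformation with fixed-interface modes, which is why it retains accuracy for dynamic loading where plain static condensation degrades.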
Planning for Cost Effectiveness.
ERIC Educational Resources Information Center
Schlaebitz, William D.
1984-01-01
A heat pump life-cycle cost analysis is used to explain the technique. Items suggested for the life-cycle analysis approach include lighting, longer-life batteries, site maintenance, and retaining experts to inspect specific building components. (MLF)
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
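The cluster-analysis step can be sketched with a plain two-cluster k-means on a mock evaluation matrix. The strategies, criteria, and scores are invented for illustration, and the deterministic farthest-pair seeding is a simplification of the clustering used in the paper:

```python
import numpy as np

def kmeans2(X, n_iter=20):
    """Plain 2-cluster k-means on the rows of X, seeded deterministically
    with the two mutually farthest rows."""
    d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    i, j = np.unravel_index(np.argmax(d2), d2.shape)
    centers = X[[i, j]]
    for _ in range(n_iter):
        # Assign each row to the nearest center, then move the centers.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == c].mean(axis=0) for c in (0, 1)])
    return labels

# Mock evaluation matrix: 6 control strategies scored on 3 criteria
# (e.g. effluent quality, energy, cost) -- invented numbers.
M = np.array([[0.90, 0.20, 0.30], [0.85, 0.25, 0.35], [0.80, 0.30, 0.30],
              [0.20, 0.90, 0.80], [0.25, 0.85, 0.75], [0.30, 0.80, 0.85]])
labels = kmeans2(M)
```

The resulting labels group the first three strategies together and the last three together, the "natural groups of control strategies with similar behaviour" the abstract refers to.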
Study of advanced techniques for determining the long term performance of components
NASA Technical Reports Server (NTRS)
1973-01-01
The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long-term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physics of degradation.
The Performance of A Sampled Data Delay Lock Loop Implemented with a Kalman Loop Filter.
1980-01-01
...technique for analysis is computer simulation. Other techniques include state variable techniques and z-transform methods. Since the Kalman filter is linear... [Figure 2: Block diagram of the sampled data delay lock loop (SDDLL). Figure 3: Sampled error voltage (Es) as a function of ...] ...from a sum of two components. The first component is the previous filtered estimate advanced one step forward by the state transition matrix.
Underdetermined blind separation of three-way fluorescence spectra of PAHs in water
NASA Astrophysics Data System (ADS)
Yang, Ruifang; Zhao, Nanjing; Xiao, Xue; Zhu, Wei; Chen, Yunan; Yin, Gaofang; Liu, Jianguo; Liu, Wenqing
2018-06-01
In this work, an underdetermined blind decomposition method is developed to recognize individual components from the three-way fluorescence spectra of their mixtures by using sparse component analysis (SCA). The mixing matrix is estimated from the mixtures using a fuzzy data clustering algorithm together with the scatters corresponding to local energy maximum values in the time-frequency domain, and the spectra of the object components are recovered by a pseudo-inverse technique. As an example, using this method the spectra of three and four pure components can be blindly extracted from two samples of their mixtures, with similarities between the resolved and reference spectra all above 0.80. This work opens a new and effective path toward monitoring PAHs in water by the three-way fluorescence spectroscopy technique.
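The sparse-component idea, estimating mixing directions from clustered scatters and then recovering sources by projection, can be sketched for the idealized case of sources with disjoint activity. The greedy direction grouping below is a stand-in for the fuzzy clustering used in the paper, and the sources and mixing matrix are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Three sparse sources with disjoint activity (each data "scatter"
# belongs to a single source), observed through only two channels.
S = np.zeros((3, 300))
for k in range(3):
    S[k, 100 * k:100 * (k + 1)] = rng.random(100) + 0.5
A = np.array([[0.9, 0.5, 0.1],
              [0.2, 0.6, 0.95]])        # unknown 2x3 mixing matrix
X = A @ S                               # underdetermined mixtures

# Estimate the mixing directions: normalized data points lie on the
# lines spanned by the mixing columns, so group them greedily.
pts = X / np.linalg.norm(X, axis=0)
dirs = []
for p in pts.T:
    if all(abs(p @ d) < 0.99 for d in dirs):
        dirs.append(p)
A_est = np.stack(dirs, axis=1)          # estimated (unit-norm) columns

# Recover the sources: assign each sample to its closest estimated
# column and project onto it (the per-sample pseudo-inverse step).
labels = np.argmax(A_est.T @ pts, axis=0)
S_est = np.zeros((A_est.shape[1], X.shape[1]))
for t, lab in enumerate(labels):
    S_est[lab, t] = A_est[:, lab] @ X[:, t]
```

With only two mixtures of three sources the problem is underdetermined, which is why the sparsity assumption (one active source per sample) is what makes recovery possible at all.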
Using Interactive Graphics to Teach Multivariate Data Analysis to Psychology Students
ERIC Educational Resources Information Center
Valero-Mora, Pedro M.; Ledesma, Ruben D.
2011-01-01
This paper discusses the use of interactive graphics to teach multivariate data analysis to Psychology students. Three techniques are explored through separate activities: parallel coordinates/boxplots; principal components/exploratory factor analysis; and cluster analysis. With interactive graphics, students may perform important parts of the…
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; LaBel, Kenneth; Kim, Hak
2014-01-01
An informative session on SRAM FPGA basics. We present a framework for fault injection techniques applied to Xilinx Field Programmable Gate Arrays (FPGAs), introduce an overlooked time component which shows that fault injection is impractical as a stand-alone characterization tool for most real designs, and demonstrate procedures that benefit from fault injection error analysis.
The gap technique does not rotate the femur parallel to the epicondylar axis.
Matziolis, Georg; Boenicke, Hinrich; Pfiel, Sascha; Wassilew, Georgi; Perka, Carsten
2011-02-01
In the analysis of painful total knee replacements, the surgical epicondylar axis (SEA) has become established as a standard in the diagnosis of femoral component rotation. It remains unclear whether the gap technique widely used to determine femoral rotation, when applied correctly, results in a rotation parallel to the SEA. In this prospective study, 69 patients (69 joints) were included who received a navigated bicondylar surface replacement due to primary arthritis of the knee joint. In 67 cases in which a perfect soft-tissue balancing of the extension gap (<1° asymmetry) was achieved, the flexion gap and the rotation of the femoral component necessary for its symmetry were determined and documented. The femoral component was implanted additionally taking into account the posterior condylar axis and Whiteside's line. Postoperatively, the rotation of the femoral component relative to the SEA was determined and this was used to calculate the angle between a femur implanted according to the gap technique and the SEA. If the gap technique had been used consistently, it would have resulted in a deviation of the femoral components by -0.6° ± 2.9° (range: -7.4° to 5.9°) from the SEA. The absolute deviation would have been 2.4° ± 1.8°, with a range between 0.2° and 7.4°. Even if the extension gap is perfectly balanced, the gap technique does not lead to a parallel alignment of the femoral component to the SEA. Since the clinical results of this technique are equivalent to those of the femur-first technique in the literature, an evaluation of this deviation as a malalignment must be considered critically.
Thermal Modeling of the Mars Reconnaissance Orbiter's Solar Panel and Instruments during Aerobraking
NASA Technical Reports Server (NTRS)
Dec, John A.; Gasbarre, Joseph F.; Amundsen, Ruth M.
2007-01-01
The Mars Reconnaissance Orbiter (MRO) launched on August 12, 2005 and started aerobraking at Mars in March 2006. During the spacecraft's design phase, thermal models of the solar panels and instruments were developed to determine which components would be the most limiting thermally during aerobraking. Having determined the most limiting components, thermal limits in terms of heat rate were established. Advanced thermal modeling techniques were developed utilizing Thermal Desktop and Patran Thermal. Heat transfer coefficients were calculated using a Direct Simulation Monte Carlo technique. Analysis established that the solar panels were the most limiting components during the aerobraking phase of the mission.
Preparation and analysis of a two-components breath figure at the nanoscale
NASA Astrophysics Data System (ADS)
Kofman, R.; Allione, M.; Celestini, F.; Barkay, Z.; Lereah, Y.
2008-12-01
Solid/liquid two-component Ga-Pb structures in isolated nanometer-sized particles have been produced and studied by electron microscopy. Production is based on the breath figure technique, and we investigate the way the two components are distributed. We clearly identify two growth regimes associated with the two different ways a Pb atom incorporates into a Ga nanodrop. Using TEM and SEM, the shape and microstructure of the nanoparticles are studied, and the results obtained are in good agreement with the proposed model. The experimental technique used appears to be appropriate for producing Pb nanocrystals in liquid Ga nano-containers.
Resolving the percentage of component terrains within single resolution elements
NASA Technical Reports Server (NTRS)
Marsh, S. E.; Switzer, P.; Kowalik, W. S.; Lyon, R. J. P.
1980-01-01
An approximate maximum likelihood technique employing a widely available discriminant analysis program is discussed that has been developed for resolving the percentage of component terrains within single resolution elements. The method uses all four channels of Landsat data simultaneously and does not require prior knowledge of the percentage of components in mixed pixels. It was tested in five cases that were chosen to represent mixtures of outcrop, soil and vegetation which would typically be encountered in geologic studies with Landsat data. For all five cases, the method proved to be superior to single band weighted average and linear regression techniques and permitted an estimate of the total area occupied by component terrains to within plus or minus 6% of the true area covered. Its major drawback is a consistent overestimation of the pixel component percent of the darker materials (vegetation) and an underestimation of the pixel component percent of the brighter materials (sand).
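A minimal version of the mixed-pixel inversion can be sketched as least-squares unmixing with a sum-to-one constraint. The four-band endmember signatures are invented, and this is the standard linear-mixing estimator rather than the discriminant-analysis-based approximate maximum likelihood technique of the paper:

```python
import numpy as np

# Invented endmember signatures in four Landsat-like bands
# (columns: outcrop, soil, vegetation) -- not real reflectances.
E = np.array([[0.50, 0.30, 0.05],
              [0.55, 0.35, 0.08],
              [0.60, 0.32, 0.04],
              [0.58, 0.40, 0.45]])

def unmix(x, E, w=1e3):
    """Least-squares terrain fractions for one pixel, with the
    sum-to-one constraint enforced by a heavily weighted extra row."""
    A = np.vstack([E, w * np.ones((1, E.shape[1]))])
    b = np.append(x, w)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

f_true = np.array([0.2, 0.3, 0.5])   # 20% outcrop, 30% soil, 50% vegetation
x = E @ f_true                       # observed four-band pixel vector
f_est = unmix(x, E)
```

The weighted extra row is a simple soft way to impose the constraint; a Lagrange-multiplier or quadratic-programming formulation would enforce it exactly and could add the non-negativity bounds as well.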
NASA Astrophysics Data System (ADS)
Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi
2018-04-01
Speckle noise has always been a particularly tricky problem in improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Currently, effective speckle de-noising techniques are extremely scarce and should be further developed. In this study, a speckle noise reduction technique is proposed based on independent component analysis (ICA). Since normally few changes happen in the shape of the laser pulse itself, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. In order to achieve the self-adaptability of the algorithm, the local Mean Square Error (MSE) has been defined as an appropriate criterion for evaluating the iteration results. The obtained experimental results demonstrate that the self-adaptive pulse-matching ICA (PM-ICA) method can effectively decrease the speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves a 4 dB greater improvement of signal-to-noise ratio (SNR) than a traditional homomorphic wavelet method.
CO Component Estimation Based on the Independent Component Analysis
NASA Astrophysics Data System (ADS)
Ichiki, Kiyotomo; Kaji, Ryohei; Yamamoto, Hiroaki; Takeuchi, Tsutomu T.; Fukui, Yasuo
2014-01-01
Fast Independent Component Analysis (FastICA) is a component separation algorithm based on the levels of non-Gaussianity. Here we apply FastICA to the component separation problem of the microwave background, including carbon monoxide (CO) line emissions that are found to contaminate the PLANCK High Frequency Instrument (HFI) data. Specifically, we prepare 100 GHz, 143 GHz, and 217 GHz mock microwave sky maps, which include galactic thermal dust, NANTEN CO line, and the cosmic microwave background (CMB) emissions, and then estimate the independent components based on the kurtosis. We find that FastICA can successfully estimate the CO component as the first independent component in our deflection algorithm because its distribution has the largest degree of non-Gaussianity among the components. Thus, FastICA can be a promising technique to extract CO-like components without prior assumptions about their distributions and frequency dependences.
CO component estimation based on the independent component analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ichiki, Kiyotomo; Kaji, Ryohei; Yamamoto, Hiroaki
2014-01-01
Fast Independent Component Analysis (FastICA) is a component separation algorithm based on the levels of non-Gaussianity. Here we apply FastICA to the component separation problem of the microwave background, including carbon monoxide (CO) line emissions that are found to contaminate the PLANCK High Frequency Instrument (HFI) data. Specifically, we prepare 100 GHz, 143 GHz, and 217 GHz mock microwave sky maps, which include galactic thermal dust, NANTEN CO line, and the cosmic microwave background (CMB) emissions, and then estimate the independent components based on the kurtosis. We find that FastICA can successfully estimate the CO component as the first independent component in our deflection algorithm because its distribution has the largest degree of non-Gaussianity among the components. Thus, FastICA can be a promising technique to extract CO-like components without prior assumptions about their distributions and frequency dependences.
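The kurtosis-based FastICA estimation can be sketched with the classic fixed-point iteration on a two-signal mixture. The heavy-tailed "CO-like" component and the mixing matrix are mock stand-ins for the multi-frequency sky maps:

```python
import numpy as np

rng = np.random.default_rng(7)

def fastica_kurtosis(X, n_comp, n_iter=200):
    """One-unit kurtosis FastICA with deflation.
    X: (n_signals, n_samples) with zero-mean rows."""
    d, U = np.linalg.eigh(np.cov(X))
    Z = (U / np.sqrt(d)).T @ X                            # whitened data
    W = np.zeros((n_comp, Z.shape[0]))
    for k in range(n_comp):
        w = rng.standard_normal(Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            w = (Z * (w @ Z) ** 3).mean(axis=1) - 3 * w   # fixed-point step
            w -= W[:k].T @ (W[:k] @ w)                    # deflate found directions
            w /= np.linalg.norm(w)
        W[k] = w
    return W @ Z

# Mock two-signal mixture: a heavy-tailed "CO-like" component (high
# kurtosis) plus a Gaussian background, with an invented mixing matrix.
n = 5000
S = np.stack([rng.laplace(size=n), rng.standard_normal(n)])
A = np.array([[0.8, 0.6], [0.3, 0.9]])
X = A @ S
X -= X.mean(axis=1, keepdims=True)
est = fastica_kurtosis(X, 2)
```

As in the abstract, the most non-Gaussian (highest-kurtosis) component is the one the deflation scheme locks onto first; the Gaussian background is then fixed by orthogonality in the whitened space.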
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, A.; Duffy, S. F.; Gyekenyesi, J. P.
1992-01-01
An updated version of the integrated design program C/CARES (composite ceramic analysis and reliability evaluation of structures) was developed for the reliability evaluation of CMC laminated shell components. The algorithm is now split in two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The new interface program from the finite-element code MARC also includes the option of using hybrid laminates and allows for variations in temperature fields throughout the component.
Visual enhancement of images of natural resources: Applications in geology
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Neto, G.; Araujo, E. O.; Mascarenhas, N. D. A.; Desouza, R. C. M.
1980-01-01
The principal components technique, applied to multispectral scanner (MSS) LANDSAT data processing, yields optimum dimensionality reduction. A powerful tool for MSS image enhancement, the method provides a maximum impression of terrain ruggedness; this makes the technique well suited for geological analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucia, M.; Kaita, R.; Majeski, R.
The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.
Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.
Mammone, Nadia; Morabito, Francesco Carlo
2008-09-01
Artifacts are disturbances that may occur during signal acquisition and may affect their processing. The aim of this paper is to propose a technique for automatically detecting artifacts from the electroencephalographic (EEG) recordings. In particular, a technique based on both Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them is presented. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals against the average 68.7% of the previous technique on the studied available database. Moreover, Renyi's entropy is shown to be able to detect muscle and very low frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve an efficient rejection of the artifacts while minimizing the information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG in order to ensure a very efficient isolation of the artifactual activity from any signals deriving from other brain tasks.
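The Renyi-entropy criterion can be sketched directly: spiky artifactual components concentrate their amplitude distribution into a few histogram bins and therefore have low order-2 Renyi entropy, while broadband EEG-like activity spreads across many bins. The simulated "blink" and "EEG" signals below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def renyi_entropy(x, order=2, bins=64):
    """Order-alpha Renyi entropy of a signal's amplitude distribution,
    estimated from a normalized histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)
    return np.log(np.sum(p ** order)) / (1 - order)

rng = np.random.default_rng(5)
n = 10000
eeg_like = rng.standard_normal(n)          # broadband background activity
blink_like = np.zeros(n)                   # sparse, spiky artifact
blink_like[rng.integers(0, n, 40)] = 8.0
h_eeg = renyi_entropy(eeg_like)
h_blink = renyi_entropy(blink_like)
```

In an automatic pipeline, each independent component's entropy would be compared against the distribution over all components, and outlying (unusually low-entropy) components flagged as artifactual.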
Vairavan, Srinivasan; Eswaran, Hari; Preissl, Hubert; Wilson, James D; Haddad, Naim; Lowery, Curtis L; Govindan, Rathinaswamy B
2010-01-01
The fetal magnetoencephalogram (fMEG) is measured in the presence of large interference from the maternal and fetal magnetocardiograms (mMCG and fMCG). These cardiac interferences can be attenuated by the orthogonal projection (OP) technique applied to the corresponding spatial vectors. However, the OP technique redistributes the fMEG signal among the channels and also leaves some cardiac residuals (partially attenuated mMCG and fMCG) due to loss of stationarity in the signal. In this paper, we propose a novel way to extract and localize the neonatal and fetal spontaneous brain activity by using the independent component analysis (ICA) technique. In this approach, we perform ICA on a small subset of sensors for a 1-min duration. The independent components obtained are further investigated for the presence of discontinuous patterns as identified by Hilbert phase analysis, which are used as the decision criteria for localizing the spontaneous brain activity. In order to locate the region of highest spontaneous brain activity content, this analysis is performed on sensor subsets traversed across the entire sensor space. The region of the spontaneous brain activity identified by the proposed approach correlated well with the neonatal and fetal head location. In addition, the burst duration and the inter-burst interval computed for the identified discontinuous brain patterns are in agreement with the reported values.
Radioactive nondestructive test method
NASA Technical Reports Server (NTRS)
Obrien, J. R.; Pullen, K. E.
1971-01-01
Various radioisotope techniques were used as diagnostic tools for determining the performance of spacecraft propulsion feed system elements. Applications were studied in four tasks. The first two required experimental testing involving the propellant liquid oxygen difluoride (OF2): the neutron activation analysis of dissolved or suspended metals, and the use of radioactive tracers to evaluate the probability of constrictions in passive components (orifices and filters) becoming clogged by matter dissolved or suspended in the OF2. The other tasks were an appraisal of the applicability of radioisotope techniques to problems arising from the exposure of components to liquid/gas combinations, and an assessment of the applicability of the techniques to other propellants.
DART-MS: A New Analytical Technique for Forensic Paint Analysis.
Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice
2018-06-05
Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback of py-GCMS, aside from its destructive nature, is that the technique is relatively time-intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as its rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provide analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of these results, DART-TOFMS may provide an additional tool for the forensic paint examiner.
[Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].
Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia
2008-07-01
Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirements of rapid and on-line analysis because of the complex processing and experience they demand. In recent years, the near-infrared spectroscopy technique has been used for rapid determination of active components, on-line quality control, identification of counterfeits, and discrimination of the geographical origins of herbal medicines, owing to its advantages of simple pretreatment, high efficiency, and the convenience of solid diffuse reflectance spectroscopy and fiber optics. In the present paper, the principles and methods of the near-infrared spectroscopy technique are introduced concisely. In particular, the applications of this technique in quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.
Hilbert-Huang transform analysis of dynamic and earthquake motion recordings
Zhang, R.R.; Ma, S.; Safak, E.; Hartzell, S.
2003-01-01
This study examines the rationale of the Hilbert-Huang transform (HHT) for analyzing dynamic and earthquake motion recordings in studies of seismology and engineering. In particular, this paper first provides the fundamentals of the HHT method, which consists of empirical mode decomposition (EMD) and Hilbert spectral analysis. It then uses the HHT to analyze recordings of hypothetical and real wave motion, the results of which are compared with the results obtained by the Fourier data processing technique. The analysis of the two recordings indicates that the HHT method is able to extract some motion characteristics useful in studies of seismology and engineering that might not be exposed effectively and efficiently by the Fourier technique. Specifically, the study indicates that the decomposed components in the EMD of the HHT, namely the intrinsic mode function (IMF) components, contain observable, physical information inherent to the original data. It also shows that the grouped IMF components, namely the EMD-based low- and high-frequency components, can faithfully capture low-frequency pulse-like as well as high-frequency wave signals. Finally, the study illustrates that the HHT-based Hilbert spectra are able to reveal the temporal-frequency energy distribution of motion recordings precisely and clearly.
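The second stage of the HHT described above can be sketched: once EMD has isolated a mono-component IMF, the Hilbert transform yields its instantaneous amplitude and frequency. A numpy-only sketch on a synthetic 5 Hz tone standing in for one IMF (sampling rate, duration, and frequency are illustrative assumptions):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0                        # sampling rate in Hz (illustrative)
t = np.arange(0, 2.0, 1.0 / fs)
imf = np.sin(2 * np.pi * 5.0 * t)  # mono-component signal standing in for one IMF
z = analytic_signal(imf)
amplitude = np.abs(z)                            # instantaneous amplitude
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2.0 * np.pi)  # instantaneous frequency in Hz
print(np.median(inst_freq))
```

Plotting `amplitude**2` against `t` and `inst_freq` gives the kind of time-frequency energy picture the Hilbert spectrum formalizes.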
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.; Wade, Raymond P.; Izadnegahdar, Alain
2011-01-01
The Component-Level Electronic-Assembly Repair (CLEAR) project at the NASA Glenn Research Center is aimed at developing technologies that will enable space-flight crews to perform in situ component-level repair of electronics on Moon and Mars outposts, where there is no existing infrastructure for logistics spares. These technologies must provide effective repair capabilities yet meet the payload and operational constraints of space facilities. Effective repair depends on a diagnostic capability that is versatile but easy to use by crew members that have limited training in electronics. CLEAR studied two techniques that involve extensive precharacterization of "known good" circuits to produce graphical signatures that provide an easy-to-use comparison method to quickly identify faulty components. Analog Signature Analysis (ASA) allows relatively rapid diagnostics of complex electronics by technicians with limited experience. Because of frequency limits and the growing dependence on broadband technologies, ASA must be augmented with other capabilities. To meet this challenge while preserving ease of use, CLEAR proposed an alternative called Complex Signature Analysis (CSA). Tests of ASA and CSA were used to compare capabilities and to determine if the techniques provided an overlapping or complementary capability. The results showed that the methods are complementary.
Guided Wave Propagation Study on Laminated Composites by Frequency-Wavenumber Technique
NASA Technical Reports Server (NTRS)
Tian, Zhenhua; Yu, Lingyu; Leckey, Cara A. C.
2014-01-01
Toward the goal of delamination detection and quantification in laminated composites, this paper examines guided wave propagation and wave interaction with delamination damage in laminated carbon fiber reinforced polymer (CFRP) composites using frequency-wavenumber (f-k) analysis. A three-dimensional elastodynamic finite integration technique (EFIT) is used to acquire simulated time-space wavefields for a CFRP composite. The time-space wavefields show trapped waves in the delamination region. To unveil the wave propagation physics, the time-space wavefields are further analyzed using two-dimensional (2D) Fourier transforms (FT). In the analysis results, new f-k components are observed when the incident guided waves interact with the delamination damage. These new f-k components in the simulations are experimentally verified through data obtained from scanning laser Doppler vibrometer (SLDV) tests. By filtering the new f-k components, delamination damage is detected and quantified.
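The f-k analysis step — a 2D Fourier transform carrying the time axis to frequency and the scan axis to wavenumber — can be sketched on a synthetic single-mode wavefield. Grid sizes, sampling, and the mode's (f0, k0) below are illustrative assumptions, not the paper's EFIT settings:

```python
import numpy as np

nt, nx = 256, 128          # time samples and scan positions (illustrative)
dt, dx = 1e-6, 1e-3        # 1 MHz sampling, 1 mm scan pitch (illustrative)
f0 = 20.0 / (nt * dt)      # place the mode exactly on a frequency bin (Hz)
k0 = 10.0 / (nx * dx)      # and on a wavenumber bin (cycles per metre)
t = np.arange(nt) * dt
x = np.arange(nx) * dx
# synthetic single-mode wavefield u(t, x) propagating in +x
u = np.cos(2 * np.pi * (f0 * t[:, None] - k0 * x[None, :]))

U = np.fft.fft2(u)         # time -> frequency, space -> wavenumber
i, j = np.unravel_index(np.argmax(np.abs(U)), U.shape)
freqs = np.fft.fftfreq(nt, dt)
wavenums = np.fft.fftfreq(nx, dx)
print(abs(freqs[i]), abs(wavenums[j]))   # recovers the mode's (f0, k0)
```

New f-k components from damage interaction would appear as additional peaks in `abs(U)`; masking them and inverting the transform is the filtering step the abstract mentions.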
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen
2016-04-01
Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and more recently independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes, respectively, that represent the maximum variance of observations. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i).
(iii) Dominant non-stationary patterns are recognized as independent complex patterns that can be used to represent the amplitude and phase propagation in space and time. We present results of CICA on simulated and real cases, e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. References: Forootan (2014), Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm; Forootan and Kusche (2012), Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86(7), 477-497, doi:10.1007/s00190-011-0532-5.
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
Psychometric Measurement Models and Artificial Neural Networks
ERIC Educational Resources Information Center
Sese, Albert; Palmer, Alfonso L.; Montano, Juan J.
2004-01-01
The study of measurement models in psychometrics by means of dimensionality reduction techniques such as Principal Components Analysis (PCA) is a very common practice. In recent times, an upsurge of interest in the study of artificial neural networks apt to computing a principal component extraction has been observed. Despite this interest, the…
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
Analyzing coastal environments by means of functional data analysis
NASA Astrophysics Data System (ADS)
Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.
2017-07-01
Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow-marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension-reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful to describe variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with Cluster Analysis (CA). The first method, the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each density function by its mean, sorting, skewness, and kurtosis. The second applied a centered log-ratio (clr) transformation to the original data; PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, to be a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silva, F. S.
Functionally graded components exhibit spatial variations in mechanical properties, in contrast with, and as an alternative to, purely homogeneous components. A large class of graded materials, however, is in fact mostly homogeneous, with property variations (chemical or mechanical) restricted to a specific area or layer produced, for example, by applying a coating or by introducing sub-surface residual stresses. However, it is also possible to obtain graded materials with a smooth transition of mechanical properties along the entire component, for example in a 40 mm component. This is possible, for example, by using the centrifugal casting technique or the incremental melting and solidification technique. In this paper we study fully metallic functionally graded components with a smooth gradient, focusing on fatigue crack propagation. Fatigue propagation will be assessed in the direction parallel to the gradation (in different homogeneous layers of the functionally graded component) to assess what fatigue crack propagation would be in the direction perpendicular to the gradation. Fatigue crack growth rate (standard mode I fatigue crack growth) will be correlated to the mode I stress intensity factor range. Other mechanical properties of the different layers of the component (Young's modulus) will also be considered in this analysis. The effect of residual stresses along the component gradation on crack propagation will also be taken into account. A qualitative analysis of the effects of some important features present in functionally graded materials will be made based on the obtained results.
Gauglitz, Günter; Wimmer, Benedikt; Melzer, Tanja; Huhn, Carolin
2018-01-01
Since its introduction in 1974, the herbicide glyphosate has experienced a tremendous increase in use, with about one million tons used annually today. This review focuses on sensors and electromigration separation techniques as alternatives to chromatographic methods for the analysis of glyphosate and its metabolite aminomethylphosphonic acid. Even with the large number of studies published, glyphosate analysis remains challenging. With its polar and, depending on pH, even ionic functional groups, and lacking a chromophore, it is difficult to analyze with chromatographic techniques; its analysis is mostly achieved after derivatization. Its purification from food and environmental samples inevitably results in coextraction of ionic matrix components, with a further impact on analysis and also on derivatization reactions. Its ability to form chelates with metal cations is another obstacle to precise quantification. Lastly, the low limits of detection required by legislation have to be met. These challenges preclude glyphosate from being analyzed together with many other pesticides in common multiresidue (chromatographic) methods. For better monitoring of glyphosate in environmental and food samples, further fast and robust methods are required. In this review, analytical methods are summarized and discussed from the perspective of biosensors and various formats of electromigration separation techniques, including modes such as capillary electrophoresis and micellar electrokinetic chromatography, combined with various detection techniques. These methods are critically discussed with regard to matrix tolerance, limits of detection reached, and selectivity.
Detection of counterfeit electronic components through ambient mass spectrometry and chemometrics.
Pfeuffer, Kevin P; Caldwell, Jack; Shelley, Jake T; Ray, Steven J; Hieftje, Gary M
2014-09-21
In the last several years, illicit electronic components have been discovered in the inventories of several distributors and even installed in commercial and military products. Illicit or counterfeit electronic components include a broad category of devices that can range from the correct unit with a more recent date code to lower-specification or non-working systems with altered names, manufacturers and date codes. Current methodologies for identification of counterfeit electronics rely on visual microscopy by expert users and, while effective, are very time-consuming. Here, a plasma-based ambient desorption/ionization source, the flowing atmospheric pressure afterglow (FAPA) is used to generate a mass-spectral fingerprint from the surface of a variety of discrete electronic integrated circuits (ICs). Chemometric methods, specifically principal component analysis (PCA) and the bootstrapped error-adjusted single-sample technique (BEAST), are used successfully to differentiate between genuine and counterfeit ICs. In addition, chemical and physical surface-removal techniques are explored and suggest which surface-altering techniques were utilized by counterfeiters.
The Importance of Engine External's Health
NASA Technical Reports Server (NTRS)
Stoner, Barry L.
2006-01-01
Engine external components include all the fluid-carrying, electricity-carrying, and support devices that are needed to operate the propulsion system. These components are varied and include pumps, valves, actuators, solenoids, sensors, switches, heat exchangers, electrical generators, electrical harnesses, tubes, ducts, clamps, and brackets. The failure of any component to perform its intended function will result in a maintenance action, a dispatch delay, or an in-flight engine shutdown. The life of each component, in addition to its basic functional design, is closely tied to its thermal and dynamic environment. Therefore, to reach a mature design life, the component's thermal and dynamic environment must be understood and controlled, which can only be accomplished by attention to design analysis and testing. The purpose of this paper is to review analysis and test techniques for achieving good component health.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2016-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
Underdetermined blind separation of three-way fluorescence spectra of PAHs in water.
Yang, Ruifang; Zhao, Nanjing; Xiao, Xue; Zhu, Wei; Chen, Yunan; Yin, Gaofang; Liu, Jianguo; Liu, Wenqing
2018-06-15
In this work, an underdetermined blind decomposition method is developed to recognize individual components from the three-way fluorescence spectra of their mixtures using sparse component analysis (SCA). The mixing matrix is estimated from the mixtures using a fuzzy data clustering algorithm together with the scatters corresponding to local energy maxima in the time-frequency domain, and the spectra of the object components are recovered by the pseudo-inverse technique. As an example, using this method the spectra of three and four pure components can be blindly extracted from two samples of their mixtures, with similarities between resolved and reference spectra all above 0.80. This work opens a new and effective path to monitoring PAHs in water by the three-way fluorescence spectroscopy technique.
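The recovery step described above can be sketched: once the mixing matrix has been estimated (by clustering, in the paper), a minimum-norm source estimate follows from the Moore-Penrose pseudo-inverse. A numpy sketch with synthetic sparse sources and an assumed-known 2x3 mixing matrix (all values illustrative, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
# three sparse sources (mostly zero) observed through only two mixtures:
# an underdetermined problem (2 mixtures < 3 sources), tractable thanks to sparsity
S = rng.normal(size=(3, n)) * (rng.random((3, n)) < 0.05)
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.8]])   # mixing matrix, here assumed already estimated
X = A @ S                          # the observed mixtures

S_hat = np.linalg.pinv(A) @ X      # minimum-norm recovery via the pseudo-inverse
print(np.corrcoef(S[0], S_hat[0])[0, 1])
```

The pseudo-inverse reproduces the mixtures exactly and recovers each sparse source up to the unavoidable projection error of the underdetermined setting.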
Long-life mission reliability for outer planet atmospheric entry probes
NASA Technical Reports Server (NTRS)
Mccall, M. T.; Rouch, L.; Maycock, J. N.
1976-01-01
The results of a literature analysis on the effects of prolonged exposure to deep space environment on the properties of outer planet atmospheric entry probe components are presented. Materials considered included elastomers and plastics, pyrotechnic devices, thermal control components, metal springs and electronic components. The rates of degradation of each component were determined and extrapolation techniques were used to predict the effects of exposure for up to eight years to deep space. Pyrotechnic devices were aged under accelerated conditions to an equivalent of eight years in space and functionally tested. Results of the literature analysis of the selected components and testing of the devices indicated that no severe degradation should be expected during an eight year space mission.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Sheppard, P S; Stevenson, J M; Graham, R B
2016-05-01
The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopt a similar lifting technique. In addition, as load increased, male and female participants changed their lifting technique in a similar manner.
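The waveform-PCA step with a 90% trace criterion can be sketched on synthetic data: build trial waveforms, diagonalize their covariance, retain PCs until the cumulative eigenvalue sum reaches 90% of the trace, and reconstruct from a single component. The latent shapes, trial counts, and noise level below are illustrative, not the study's kinematics:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 100)
# toy joint-angle "waveforms": 40 trials built from three latent shapes plus noise
basis = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t ** 2])
scores = rng.normal(size=(40, 3)) * np.array([3.0, 2.0, 1.0])
waveforms = scores @ basis + 0.05 * rng.normal(size=(40, 100))

Xc = waveforms - waveforms.mean(axis=0)          # centre the trials
evals, evecs = np.linalg.eigh(Xc.T @ Xc / (len(Xc) - 1))
evals, evecs = evals[::-1], evecs[:, ::-1]       # sort eigenpairs descending
cum = np.cumsum(evals) / evals.sum()
k = int(np.searchsorted(cum, 0.90)) + 1          # 90% trace criterion
pc_scores = Xc @ evecs[:, :k]                    # per-trial PC scores for the ANOVA
recon1 = np.outer(pc_scores[:, 0], evecs[:, 0])  # single-component reconstruction (PC1)
print(k, cum[k - 1])
```

The retained `pc_scores` are the per-trial values a repeated-measures ANOVA would then test for sex and load effects.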
Covariate selection with iterative principal component analysis for predicting physical
USDA-ARS?s Scientific Manuscript database
Local and regional soil data can be improved by coupling new digital soil mapping techniques with high resolution remote sensing products to quantify both spatial and absolute variation of soil properties. The objective of this research was to advance data-driven digital soil mapping techniques for ...
Wear studies made of slip rings and gas bearing components
NASA Technical Reports Server (NTRS)
Furr, A. K.
1967-01-01
Neutron activation analysis techniques were employed for the study of the wear and performance characteristics of slip ring and rotor assemblies and of the problems arising from environmental conditions with special reference to surface contamination. Results showed that the techniques could be successfully applied to measurement of wear parameters.
Muse, Thomas O; Zwischenberger, Brittany A; Miller, M Troy; Borman, Daniel A; Davenport, Daniel L; Roth, J Scott
2018-03-01
Complex ventral hernias remain a challenge for general surgeons despite advances in minimally invasive surgical techniques. This study compares outcomes following Rives-Stoppa (RS) repair, components separation technique with mesh (CST-M) or without mesh (CST), and endoscopic components separation technique (ECST). A retrospective review of patients undergoing open ventral hernia repair between 2006 and 2011 was performed. Analysis included patient demographics, surgical site occurrences, hernia recurrence, hospital readmission, and mortality. The search was limited to open repairs, specifically the RS, CST-M, CST, and ECST with mesh techniques. A total of 362 patients underwent repair with RS (66), CST-M (126), CST (117), or ECST (53). The groups were demographically similar. ECST was more frequently used for patients with a history of two or more recurrences (P < 0.001). The RS method had the lowest rate of recurrence (9.1%) compared with CST and CST-M, at 28% and 25% recurrence, respectively (P = 0.011). The RS recurrence rate was not significantly different from that of ECST (15%). There were no significant differences between groups for surgical site occurrences (P = 0.305), hospital readmission (P = 0.288), or death (P = 0.197). When components separation is necessary for complex ventral hernia repair, ECST is a viable option without added morbidity or mortality.
Consistent Principal Component Modes from Molecular Dynamics Simulations of Proteins.
Cossio-Pérez, Rodrigo; Palma, Juliana; Pierdominici-Sottile, Gustavo
2017-04-24
Principal component analysis is a technique widely used for studying the movements of proteins using data collected from molecular dynamics simulations. In spite of its extensive use, the technique has a serious drawback: equivalent simulations do not afford the same PC-modes. In this article, we show that concatenating equivalent trajectories and calculating the PC-modes from the concatenated one significantly enhances the reproducibility of the results. Moreover, the consistency of the modes can be systematically improved by adding more individual trajectories to the concatenated one.
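The concatenation procedure described above can be sketched: pool the frames of the equivalent trajectories and diagonalize the covariance of the pooled trajectory. A numpy sketch on synthetic 10-D "trajectories" (the sampled distribution and its dominant direction are illustrative stand-ins for real molecular dynamics data):

```python
import numpy as np

rng = np.random.default_rng(3)

def pc_modes(traj):
    """PC modes: eigenvectors of the covariance of a (frames x coordinates) trajectory."""
    c = traj - traj.mean(axis=0)
    w, v = np.linalg.eigh(c.T @ c / (len(c) - 1))
    return v[:, ::-1], w[::-1]          # modes and eigenvalues, descending

# three "equivalent" trajectories sampling the same 10-D distribution, with one
# dominant direction standing in for a slow collective motion (all synthetic)
dominant = np.zeros(10)
dominant[0] = 1.0
trajs = [rng.normal(size=(500, 10)) + 4.0 * rng.normal(size=(500, 1)) * dominant
         for _ in range(3)]

modes, evals = pc_modes(np.concatenate(trajs, axis=0))  # modes from the concatenated run
print(abs(modes[0, 0]))   # alignment of the first mode with the dominant direction
```

Adding further equivalent trajectories to the concatenation enlarges the pooled sample, which is the mechanism behind the systematic improvement in mode consistency the abstract reports.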
Radar cross section fundamentals for the aircraft designer
NASA Technical Reports Server (NTRS)
Stadmore, H. A.
1979-01-01
Various aspects of radar cross-section (RCS) techniques are summarized, with emphasis placed on fundamental electromagnetic phenomena, such as plane and spherical wave formulations, and the definition of RCS is given in the far-field sense. The basic relationship between electronic countermeasures and a signature level is discussed in terms of the detectability range of a target vehicle. Fundamental radar-signature analysis techniques, such as the physical-optics and geometrical-optics approximations, are presented along with examples in terms of aircraft components. Methods of analysis based on the geometrical theory of diffraction are considered and various wave-propagation phenomena are related to local vehicle geometry. Typical vehicle components are also discussed, together with their contribution to total vehicle RCS and their individual signature sensitivities.
Vilardaga, Roger; Heffner, Jaimee L.; Mercer, Laina D.; Bricker, Jonathan
2014-01-01
No studies to date have examined the effect of counselor techniques on smoking cessation over the course of treatment. To address this gap, we examined the degree to which the use of specific Acceptance and Commitment Therapy (ACT) counseling techniques in a given session predicted smoking cessation reported at the next session. The data came from the ACT arm of a randomized controlled trial of a telephone-delivered smoking cessation intervention. Trained raters coded 139 counseling sessions across 44 participants. The openness, awareness and activation components of the ACT model were rated for each telephone counseling session. Multilevel logistic regression models were used to estimate the predictive relationship between each component during any given telephone session and smoking cessation at the following telephone session. For every 1-unit increase in counselors’ use of openness and awareness techniques there were 42% and 52% decreases in the odds of smoking at the next counseling session, respectively. However, there was no significant predictive relationship between counselors’ use of activation techniques and smoking cessation. Overall, results highlight the theoretical and clinical value of examining therapists’ techniques as predictors of outcome during the course of treatment. PMID:25156397
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradonjic, Milan; Hagberg, Aric; Hengartner, Nick
We analyze component evolution in general random intersection graphs (RIGs) and give conditions for the existence and uniqueness of the giant component. Our techniques generalize the existing methods for the analysis of component evolution in RIGs: we analyze the survival and extinction properties of a dependent, inhomogeneous Galton-Watson branching process on general RIGs. Our analysis relies on bounding the branching processes and inherits fundamental concepts from the study of component evolution in Erdos-Renyi graphs. The main challenge comes from the underlying structure of RIGs, where the number of offspring follows a binomial distribution with a different number of nodes and a different rate at each step of the evolution. RIGs can be interpreted as a model for large, randomly formed non-metric data sets. Beyond the mathematical analysis of component evolution provided in this work, we regard RIGs as an important random structure that has already found applications in social networks, epidemic networks, blog readership, and wireless sensor networks.
A new multicriteria risk mapping approach based on a multiattribute frontier concept
Denys Yemshanov; Frank H. Koch; Yakov Ben-Haim; Marla Downing; Frank Sapio; Marty Siltanen
2013-01-01
Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that...
NASA Technical Reports Server (NTRS)
Freund, Friedemann
1991-01-01
Substantial progress has been made towards a better understanding of the dissolution of common gas/fluid phase components, notably H2O and CO2, in minerals. It has been shown that the dissolution mechanisms are significantly more complex than currently believed. By judiciously combining various solid state analytical techniques, convincing evidence was obtained that traces of dissolved gas/fluid phase components undergo, at least in part, a redox conversion by which they split into reduced H2 and reduced C on one hand and oxidized oxygen, O(-), on the other. Analysis for H2 and C, as well as for any organic molecules which may form during the process of co-segregation, is still impeded by the omnipresent danger of extraneous contamination. However, the presence of O(-), an unusual oxidized form of oxygen, has been proven beyond a reasonable doubt. The presence of O(-) testifies to the fact that a redox reaction must have taken place in the solid state involving the dissolved traces of gas/fluid phase components. Detailed information on the techniques used and the results obtained is given.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational methods, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate its viability as an effective engineering tool for analysis and optimization.
Concurrent white matter bundles and grey matter networks using independent component analysis.
O'Muircheartaigh, Jonathan; Jbabdi, Saad
2018-04-15
Developments in non-invasive diffusion MRI tractography techniques have permitted the investigation of both the anatomy of white matter pathways connecting grey matter regions and their structural integrity. In parallel, there has been an expansion in automated techniques aimed at parcellating grey matter into distinct regions based on functional imaging. Here we apply independent component analysis to whole-brain tractography data to automatically extract brain networks based on their associated white matter pathways. This method decomposes the tractography data into components that consist of paired grey matter 'nodes' and white matter 'edges', and automatically separates major white matter bundles, including known cortico-cortical and cortico-subcortical tracts. We show how this framework can be used to investigate individual variations in brain networks (in terms of both nodes and edges) as well as their associations with individual differences in behaviour and anatomy. Finally, we investigate correspondences between tractography-based brain components and several canonical resting-state networks derived from functional MRI. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
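The decomposition step the abstract describes can be sketched with a generic independent component analysis on synthetic mixtures. The signals, mixing matrix, and dimensions below are invented for illustration and are not the paper's tractography data; scikit-learn's FastICA stands in for whatever ICA implementation the authors used.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two independent sources mixed linearly (illustrative stand-ins only).
t = np.linspace(0, 1, 500)
s1 = np.sin(2 * np.pi * 5 * t)            # periodic source
s2 = np.sign(np.sin(2 * np.pi * 3 * t))   # square-wave source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.4, 1.0]])    # mixing matrix
X = S @ A.T                               # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)              # estimated independent components
```

Because ICA enforces statistical independence rather than mere decorrelation, each recovered column should align (up to sign, scale, and order) with one original source, which is the behaviour exploited when separating white matter "edges" into distinct bundles.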
Building Change Detection from LIDAR Point Cloud Data Based on Connected Component Analysis
NASA Astrophysics Data System (ADS)
Awrangjeb, M.; Fraser, C. S.; Lu, G.
2015-08-01
Building data are one of the important data types in a topographic database. Building change detection after a period of time is necessary for many applications, such as identification of informal settlements. Based on the detected changes, the database has to be updated to ensure its usefulness. This paper proposes an improved building detection technique, which is a prerequisite for many building change detection techniques. The improved technique examines the gap between neighbouring buildings in the building mask in order to avoid undersegmentation errors. Then, a new building change detection technique from LIDAR point cloud data is proposed. Buildings which are totally new or demolished are directly added to the change detection output. However, for demolished or extended building parts, a connected component analysis algorithm is applied, and for each connected component its area, width and height are estimated in order to ascertain whether it can be considered a demolished or new building part. Finally, a graphical user interface (GUI) has been developed to update detected changes to the existing building map. Experimental results show that the improved building detection technique offers not only higher performance in terms of completeness and correctness, but also a lower number of undersegmentation errors compared to its original counterpart. The proposed change detection technique produces no omission errors and thus can be exploited for enhanced automated building information updating within a topographic database. Using the developed GUI, the user can quickly examine each suggested change and indicate his/her decision with a minimum number of mouse clicks.
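The connected-component-plus-area-filter step can be sketched as follows. The toy mask and the minimum-area threshold are invented illustrations, not the paper's LIDAR data or rules, and `scipy.ndimage` stands in for whatever labelling implementation the authors used.

```python
import numpy as np
from scipy import ndimage

# Toy binary "change mask" standing in for a LIDAR-derived building mask.
mask = np.zeros((12, 12), dtype=bool)
mask[1:4, 1:5] = True      # plausible demolished/new building part (area 12)
mask[8:10, 8:9] = True     # tiny region, likely noise (area 2)

labels, n_components = ndimage.label(mask)   # 4-connected labelling
areas = ndimage.sum(mask, labels, index=range(1, n_components + 1))

MIN_AREA = 5                                 # assumed minimum-area rule
kept = [lab for lab, area in zip(range(1, n_components + 1), areas)
        if area >= MIN_AREA]
```

A real pipeline would compute width and height per component as well (e.g. from each labelled region's bounding box) before deciding whether it is a genuine building part.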
Blind source separation of ex-vivo aorta tissue multispectral images
Galeano, July; Perez, Sandra; Montoya, Yonatan; Botina, Deivid; Garzón, Johnson
2015-01-01
Blind Source Separation (BSS) methods aim to decompose a given signal into its main components or source signals. These techniques have been widely used in the literature for the analysis of biomedical images, in order to extract the main components of an organ or tissue under study. The analysis of skin images for the extraction of melanin and hemoglobin is an example of the use of BSS. This paper presents a proof of concept of the use of source separation on ex-vivo aorta tissue multispectral images. The images are acquired with an interference filter-based imaging system and processed by means of two algorithms: Independent Component Analysis and Non-negative Matrix Factorization. In both cases, it is possible to obtain maps that quantify the concentration of the main chromophores present in aortic tissue. The algorithms also allow estimation of the spectral absorbance of the main tissue components. Those spectral signatures were compared against theoretical ones using correlation coefficients, which report values close to 0.9, a good indicator of the method's performance. The correlation coefficients also lead to the identification of the concentration maps according to the evaluated chromophore. The results suggest that multi/hyper-spectral systems, together with image processing techniques, are a potential tool for the analysis of cardiovascular tissue. PMID:26137366
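The non-negative matrix factorization half of this approach can be sketched on synthetic data. The endmember spectra, abundances, and dimensions below are all invented, and scikit-learn's NMF stands in for the authors' implementation; the point is only that the factorization returns nonnegative "concentration" and "spectrum" factors.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic multispectral data: 2 chromophore-like endmembers with known
# nonnegative spectra, mixed with nonnegative per-pixel abundances.
rng = np.random.default_rng(1)
n_pixels, n_bands = 100, 16
spectra = np.abs(rng.normal(size=(2, n_bands)))   # endmember spectra
abund = rng.random((n_pixels, 2))                  # pixel abundances
X = abund @ spectra                                # observed pixel spectra

model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(X)     # estimated concentration maps (per pixel)
H = model.components_          # estimated endmember spectra
```

In the paper's setting, the rows of `H` would be correlated against theoretical chromophore absorbance curves to label each concentration map.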
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.
1992-01-01
An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.
Principal component analysis of the cytokine and chemokine response to human traumatic brain injury.
Helmy, Adel; Antoniades, Chrystalina A; Guilfoyle, Mathew R; Carpenter, Keri L H; Hutchinson, Peter J
2012-01-01
There is a growing realisation that neuro-inflammation plays a fundamental role in the pathology of Traumatic Brain Injury (TBI). This has led to the search for biomarkers that reflect these underlying inflammatory processes, using techniques such as cerebral microdialysis. The interpretation of such biomarker data has been limited by the statistical methods used. When analysing data of this sort, the multiple putative interactions between mediators need to be considered, as well as the timing of production and the high degree of statistical co-variance in the levels of these mediators. Here we present a cytokine and chemokine dataset from human brain following traumatic brain injury, and use principal component analysis and partial least squares discriminant analysis to demonstrate the pattern of production following TBI, distinct phases of the humoral inflammatory response, and the differing patterns of response in brain and in peripheral blood. This technique has the added advantage of making no assumptions about the Relative Recovery (RR) of microdialysis-derived parameters. Taken together, these techniques can be used in complex microdialysis datasets to summarise the data succinctly and generate hypotheses for future study.
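The PCA step for a highly co-varying mediator panel can be illustrated generically. The 50×8 "cytokine" matrix below is simulated from two latent inflammatory programmes purely for illustration; the real dataset, mediators, and loadings differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Simulated mediator panel: 50 samples x 8 cytokines whose covariance is
# driven by two latent "programmes" (all values invented).
rng = np.random.default_rng(2)
latent = rng.normal(size=(50, 2))
loadings = rng.normal(size=(2, 8))
X = latent @ loadings + 0.1 * rng.normal(size=(50, 8))

Z = StandardScaler().fit_transform(X)      # put mediators on comparable scales
pca = PCA()
scores = pca.fit_transform(Z)              # per-sample component scores
evr = pca.explained_variance_ratio_        # variance captured per component
```

With strong co-variance, a few components capture most of the variance, which is what lets PCA summarise a multi-mediator time course succinctly before hypothesis generation.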
Development progress of the Materials Analysis and Particle Probe
NASA Astrophysics Data System (ADS)
Lucia, M.; Kaita, R.; Majeski, R.; Bedoya, F.; Allain, J. P.; Boyle, D. P.; Schmitt, J. C.; Onge, D. A. St.
2014-11-01
The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.
Ahmed, Mohammed M; Otto, Thomas J; Moed, Berton R
2016-04-22
Limited-incision total hip arthroplasty (THA) preserves the hip abductors, posterior capsule, and external rotators, potentially diminishing dislocation risk. However, potential complications also exist, such as component malposition. Specific implants have been manufactured that enhance compatibility with this technique while preserving metaphyseal bone; however, little data exist documenting early complications and component position. The purpose was to evaluate primary THA using a curved, bone-sparing stem inserted through the anterior approach with respect to component alignment and early complications. In a retrospective analysis of 108 cases, the surgical technique was outlined and the occurrence of intraoperative fractures, postoperative dislocations, infection, and limb length inequality was determined. Femoral stem and acetabular cup alignment was quantified using the initial postoperative radiographs. Patient follow-up averaged 12.9 (range 2 to 36) months. There were eight (7.4 %) complications, requiring revision surgery in three (2.8 %) patients, with three (2.8 %) infections and three (2.8 %) dislocations. Intraoperative complications included one calcar fracture above the lesser trochanter. Leg length inequality >5 mm was present in three (2.8 %) patients. Radiographic analysis showed that femoral neutral alignment was achieved in 95 hips (88.0 %). All femoral stems demonstrated satisfactory fit and fill and no evidence of subsidence, osteolysis, or loosening. An average abduction angle of 44.8° (± 5.3) and an average cup anteversion of 16.2° (± 4.2) were also noted. Although the technique with this implant and approach is promising, it does not appear to offer important advantages over standard techniques. However, the findings merit further, long-term study.
USDA-ARS?s Scientific Manuscript database
Total Body Nitrogen (TBN) can be used to estimate Total Body Protein (TBP), an important body composition component at the molecular level. A system using the associated particle technique in conjunction with prompt gamma neutron activation analysis has been developed for the measurement of TBN in ...
High frequency oscillations evoked by peripheral magnetic stimulation.
Biller, S; Simon, L; Fiedler, P; Strohmeier, D; Haueisen, J
2011-01-01
The analysis of somatosensory evoked potentials (SEP) and/or fields (SEF) is a well-established and important tool for investigating the functioning of the peripheral and central human nervous system. A standard technique to evoke SEPs/SEFs is stimulation of the median nerve using a bipolar electrical stimulus. We aim at an alternative stimulation technique that enables stimulation of deep nerve structures while reducing patient stress and error susceptibility. In the current study, we apply a commercial transcranial magnetic stimulation system for peripheral magnetic stimulation of the median nerve. We compare the results of simultaneously recorded EEG signals to prove the applicability of our technique for evoking SEPs, including low frequency components (LFC) as well as high frequency oscillations (HFO). To this end, we compare the amplitude, latency, and time-frequency characteristics of the SEPs of 14 healthy volunteers after electric and magnetic stimulation. Both low frequency components and high frequency oscillations were detected. The HFOs were superimposed onto the primary cortical response N20. Statistical analysis revealed significantly lower amplitudes and increased latencies for LFC and HFO components after magnetic stimulation. The differences indicate the inability of magnetic stimulation to elicit supramaximal responses. A psycho-perceptual evaluation showed that magnetic stimulation was less unpleasant for 12 out of the 14 volunteers. In conclusion, we showed that LFC and HFO components related to median nerve stimulation can be evoked by peripheral magnetic stimulation.
NASA Astrophysics Data System (ADS)
Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.
Fusion of optical and synthetic aperture radar data has been attempted in the present study for mapping of various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area were enhanced using a Fast Fourier Transformation (FFT) based filtering approach and also using the Frost filtering technique. Both enhanced SAR images were then separately fused with a histogram-equalized IRS-1C LISS III image using the Principal Component Analysis (PCA) technique. Later, the Feature-oriented Principal Components Selection (FPCS) technique was applied to generate False Color Composite (FCC) images, from which corresponding geological maps were prepared. Finally, GIS techniques were successfully used for change detection analysis of the lithological interpretation between the published geological map and the fusion-based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, a few areas could be identified which need attention for further detailed ground-based geological studies.
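One common reading of PCA-based optical/SAR fusion is component substitution: transform the multispectral bands, replace the first principal component with the (statistically matched) SAR image, and invert the transform. The sketch below uses random arrays in place of the co-registered LISS III bands and filtered SAR image, and is an assumed interpretation of the fusion step, not the authors' exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

# Random stand-ins for co-registered imagery (illustrative only).
rng = np.random.default_rng(3)
h, w = 64, 64
optical = rng.random((h * w, 3))          # 3 multispectral bands, flattened
sar = rng.random(h * w)                   # enhanced SAR intensity

pca = PCA(n_components=3)
pcs = pca.fit_transform(optical)

# Match SAR mean/std to PC1, substitute it, then invert the transform.
sar_m = (sar - sar.mean()) / sar.std() * pcs[:, 0].std() + pcs[:, 0].mean()
pcs[:, 0] = sar_m
fused = pca.inverse_transform(pcs).reshape(h, w, 3)
```

The fused product keeps the spectral character of the optical bands (carried by the remaining components) while injecting the SAR texture through PC1.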
Determination of authenticity of brand perfume using electronic nose prototypes
NASA Astrophysics Data System (ADS)
Gebicki, Jacek; Szulczynski, Bartosz; Kaminski, Marian
2015-12-01
The paper presents the practical application of an electronic nose technique for fast and efficient discrimination between authentic and fake perfume samples. Two self-built electronic nose prototypes equipped with a set of semiconductor sensors were employed for that purpose. Additionally, 10 volunteers took part in the sensory analysis. The following perfumes and their fake counterparts were analysed: Dior—Fahrenheit, Eisenberg—J’ose, YSL—La nuit de L’homme, 7 Loewe and Spice Bomb. The investigations were carried out using the headspace of the aqueous solutions. Data analysis utilized multidimensional techniques: principal component analysis (PCA), linear discriminant analysis (LDA), and k-nearest neighbour (k-NN). The results obtained confirmed the legitimacy of the electronic nose technique as an alternative to sensory analysis for determining the authenticity of perfume.
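The PCA-then-classifier workflow can be sketched generically. The sensor responses below are simulated (sensor count, class means, and spread are all invented), so this only illustrates the shape of the analysis, not the prototypes' actual data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Simulated semiconductor-sensor array responses: authentic vs fake samples
# differ by a shifted mean response (all values invented).
rng = np.random.default_rng(4)
authentic = rng.normal(0.0, 0.2, size=(30, 6))
fake = rng.normal(0.8, 0.2, size=(30, 6))
X = np.vstack([authentic, fake])
y = np.array([0] * 30 + [1] * 30)

# Project onto 2 principal components, then classify with k-NN (k=3).
clf = make_pipeline(PCA(n_components=2), KNeighborsClassifier(n_neighbors=3))
clf.fit(X, y)
acc = clf.score(X, y)
```

In practice one would report cross-validated accuracy rather than training accuracy, and LDA could be swapped in for the k-NN stage.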
NASA Astrophysics Data System (ADS)
Gulliver, Eric A.
The objective of this thesis is to identify and develop techniques providing direct comparison between simulated and real packed particle mixture microstructures containing submicron-sized particles. This entailed devising techniques for simulating powder mixtures, producing real mixtures with known powder characteristics, sectioning real mixtures, interrogating mixture cross-sections, evaluating and quantifying the mixture interrogation process, and comparing interrogation results between mixtures. A drop and roll-type particle-packing model was used to generate simulations of random mixtures. The simulated mixtures were then evaluated to establish that they were not segregated and were free from gross defects. A powder processing protocol was established to provide real mixtures for direct comparison and for use in evaluating the simulation. The powder processing protocol was designed to minimize differences between measured particle size distributions and the particle size distributions in the mixture. A sectioning technique was developed that was capable of producing distortion-free cross-sections of fine scale particulate mixtures. Tessellation analysis was used to interrogate mixture cross-sections, and statistical quality control charts were used to evaluate different types of tessellation analysis and to establish the importance of differences between simulated and real mixtures. The particle-packing program generated crescent shaped pores below large particles but otherwise realistic looking mixture microstructures. Focused ion beam milling was the only technique capable of sectioning particle compacts in a manner suitable for stereological analysis. Johnson-Mehl and Voronoi tessellation of the same cross-sections produced tessellation tiles with different tile-area populations.
Control chart analysis showed that Johnson-Mehl tessellation measurements are superior to Voronoi tessellation measurements for detecting variations in mixture microstructure, such as altered particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from the real mixtures matched the simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent-quality, high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding have caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly on the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
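A bare-bones functional PCA of yearly hydrographs can be sketched as an SVD of centered daily curves. The curves below are synthetic (a spring-melt-like peak with varying amplitude plus noise); a real analysis would first smooth onto a basis, for example with the `fda` or `scikit-fda` packages, which are assumptions here, not tools named in the abstract.

```python
import numpy as np

# Synthetic yearly hydrographs: 30 years x 365 days (all values invented).
rng = np.random.default_rng(6)
days = np.linspace(0, 1, 365)
base = np.exp(-((days - 0.3) ** 2) / 0.02)        # spring-peak-like shape
curves = np.array([a * base + 0.05 * rng.normal(size=365)
                   for a in rng.uniform(0.5, 2.0, size=30)])

# Functional PCA via SVD of the mean-centered curve matrix.
mean_curve = curves.mean(axis=0)
U, s, Vt = np.linalg.svd(curves - mean_curve, full_matrices=False)
scores = U[:, 0] * s[0]                            # year-by-year fPC1 scores
var_frac = s[0] ** 2 / (s ** 2).sum()              # variance explained by fPC1
```

Years with extreme fPC scores flag unusual hydrographs, which is how functional principal components can characterize extreme streamflow events.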
NASA Astrophysics Data System (ADS)
Pacholski, Michaeleen L.
2004-06-01
Principal component analysis (PCA) has been successfully applied to time-of-flight secondary ion mass spectrometry (TOF-SIMS) spectra, images, and depth profiles. Although SIMS spectral data sets can be small (in comparison to datasets typically discussed in the literature from other analytical techniques such as gas or liquid chromatography), each spectrum has thousands of ions, resulting in what can be a difficult comparison of samples. Analysis of industrially derived samples means the identity of most surface species is unknown a priori, and samples must be analyzed rapidly to satisfy customer demands. PCA enables rapid assessment of spectral differences (or lack thereof) between samples and identification of chemically different areas on sample surfaces for images. Depth profile analysis helps define interfaces and identify low-level components in the system.
In situ X-ray diffraction analysis of (CFx)n batteries: signal extraction by multivariate analysis
Rodriguez, Mark A.; Keenan, Michael R.; Nagasubramanian, Ganesan
2007-11-10
In this study, the (CFx)n cathode reaction during discharge has been investigated using in situ X-ray diffraction (XRD). Mathematical treatment of the in situ XRD data set was performed using multivariate curve resolution with alternating least squares (MCR–ALS), a technique of multivariate analysis. MCR–ALS analysis successfully separated the relatively weak XRD signal intensity due to the chemical reaction from the other inert cell component signals. The resulting dynamic reaction component revealed the loss of (CFx)n cathode signal together with the simultaneous appearance of LiF by-product intensity. Careful examination of the XRD data set revealed an additional dynamic component which may be associated with the formation of an intermediate compound during the discharge process.
Techniques for forced response involving discrete nonlinearities. I - Theory. II - Applications
NASA Astrophysics Data System (ADS)
Avitabile, Peter; Callahan, John O.
Several new techniques developed for the forced response analysis of systems containing discrete nonlinear connection elements are presented and compared to the traditional methods. In particular, the techniques examined are the Equivalent Reduced Model Technique (ERMT), Modal Modification Response Technique (MMRT), and Component Element Method (CEM). The general theory of the techniques is presented, and applications are discussed with particular reference to the beam nonlinear system model using ERMT, MMRT, and CEM; frame nonlinear response using the three techniques; and comparison of the results obtained by using the ERMT, MMRT, and CEM models.
Applications Of Binary Image Analysis Techniques
NASA Astrophysics Data System (ADS)
Tropf, H.; Enderle, E.; Kammerer, H. P.
1983-10-01
After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Consequently, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of the working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard, some different techniques, and to highlight patterns in the data. The trend estimation techniques used are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
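Kendall's rank correlation as a trend test can be sketched as follows. The monthly problem-report counts are invented for illustration (including the zero-heavy tail the abstract warns about), not MSFC data; `scipy.stats.kendalltau` handles the ties that such counts produce.

```python
import numpy as np
from scipy.stats import kendalltau

# Illustrative problem-report counts per period (values invented),
# showing a decreasing trend with ties and frequent zeros.
periods = np.arange(12)
reports = np.array([9, 7, 8, 5, 4, 4, 2, 3, 0, 1, 0, 0])

tau, p = kendalltau(periods, reports)   # tau < 0 indicates a downward trend
```

Because it is rank-based, this test makes no assumption about the functional form of the trend, unlike the regression fits (exponential, power, reciprocal, straight line) it complements.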
Advanced analysis technique for the evaluation of linear alternators and linear motors
NASA Technical Reports Server (NTRS)
Holliday, Jeffrey C.
1995-01-01
A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.
The use of artificial intelligence techniques to improve the multiple payload integration process
NASA Technical Reports Server (NTRS)
Cutts, Dannie E.; Widgren, Brian K.
1992-01-01
A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.
Safety Guided Design of Crew Return Vehicle in Concept Design Phase Using STAMP/STPA
NASA Astrophysics Data System (ADS)
Nakao, H.; Katahira, M.; Miyamoto, Y.; Leveson, N.
2012-01-01
In the concept development and design phase of a new space system, such as a crew vehicle, designers tend to focus on how to implement new technology. Designers also consider the difficulty of using the new technology and trade off several system design candidates, then choose an optimal design from the candidates. Safety should be a key aspect driving optimal concept design. However, in past concept design activities, safety analysis such as FTA has not been used to drive the design, because such analysis techniques focus on component failure, and component failure cannot be considered in the concept design phase. The solution to these problems is to apply a new hazard analysis technique called STAMP/STPA. STAMP/STPA defines safety as a control problem rather than a failure problem and identifies hazardous scenarios and their causes. Defining control flow is essential in the concept design phase; therefore, STAMP/STPA can be a useful tool to assess the safety of system candidates and to form part of the rationale for choosing a design as the baseline of the system. In this paper, we explain our case study of safety-guided concept design using STPA, the new hazard analysis technique, and a model-based specification technique on a Crew Return Vehicle design, and evaluate the benefits of using STAMP/STPA in the concept development phase.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mian, Muhammad Umer, E-mail: umermian@gmail.com; Khir, M. H. Md.; Tang, T. B.
Pre-fabrication behavioural and performance analysis with computer-aided design (CAD) tools is a common and cost-effective practice. In light of this, we present a simulation methodology for a dual-mass oscillator based 3 Degree of Freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components which are counterparts of their respective mechanical components, used to design and fabricate the 3-DoF MEMS gyroscope. The complete equivalent circuit model design, mathematical modeling, and simulation are presented in this paper. Behaviors of the equivalent lumped models derived for the proposed device design are simulated in MEMSPRO T-SPICE software. Simulations are carried out with the design specifications following the design rules of the MetalMUMPS fabrication process. Drive mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz respectively, which are close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be a time-efficient modeling technique for the analysis of complex MEMS devices like 3-DoF gyroscopes. The technique proves to be an alternative approach to the complex and time-consuming coupled-field Finite Element Analysis (FEA) previously used.
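As a minimal sketch of the lumped-parameter idea behind such equivalent-circuit models, the natural frequency of each mass-spring element follows directly from its stiffness and mass; the values below are hypothetical placeholders, not the fabricated device's parameters:

```python
import math

def resonant_freq_hz(k, m):
    """Undamped natural frequency of a lumped mass-spring element: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(k / m) / (2.0 * math.pi)

# Hypothetical lumped parameters (illustrative only): stiffness in N/m, mass in kg.
k_drive = 100.0   # N/m
m_drive = 1.0e-3  # kg
f = resonant_freq_hz(k_drive, m_drive)
print(round(f, 1))
```

In an equivalent-circuit simulation the same relation appears as the resonance of an LC pair, with mass mapped to inductance and compliance to capacitance.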
POD/MAC-Based Modal Basis Selection for a Reduced Order Nonlinear Response Analysis
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Przekop, Adam
2007-01-01
A feasibility study was conducted to explore the applicability of a POD/MAC basis selection technique to a nonlinear structural response analysis. For the case studied, the application of the POD/MAC technique resulted in a substantial improvement of the reduced order simulation when compared to a classic approach utilizing only low frequency modes present in the excitation bandwidth. Further studies aim to expand the application of the presented technique to more complex structures, including non-planar and two-dimensional configurations. For non-planar structures the separation of different displacement components may not be necessary or desirable.
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.
1980-11-01
The optical design and analysis of the LASL carbon dioxide laser fusion systems required techniques quite different from the methods currently used in conventional optical design problems. The necessity for this is explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. This method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and other nonlinear effects that occur in these types of systems. The various programs used for analysis are briefly discussed.
A Meta-Analytic Review of Components Associated with Parent Training Program Effectiveness
ERIC Educational Resources Information Center
Kaminski, Jennifer Wyatt; Valle, Linda Anne; Filene, Jill H.; Boyle, Cynthia L.
2008-01-01
This component analysis used meta-analytic techniques to synthesize the results of 77 published evaluations of parent training programs (i.e., programs that included the active acquisition of parenting skills) to enhance behavior and adjustment in children aged 0-7. Characteristics of program content and delivery method were used to predict effect…
Hooper, R.P.; Peters, N.E.
1989-01-01
A principal-components analysis was performed on the major solutes in wet deposition collected from 194 stations in the United States and its territories. Approximately 90% of the components derived could be interpreted as falling into one of three categories - acid, salt, or an agricultural/soil association. The total mass, or the mass of any one solute, was apportioned among these components by multiple linear regression techniques. The use of multisolute components for determining trends or spatial distribution represents a substantial improvement over single-solute analysis in that these components are more directly related to the sources of the deposition. The geographic patterns displayed by the components in this analysis indicate a far more important role for acid deposition in the Southeast and intermountain regions of the United States than would be indicated by maps of sulfate or nitrate deposition alone. In the Northeast and Midwest, the acid component is not declining at most stations, as would be expected from trends in sulfate deposition, but is holding constant or increasing. This is due, in part, to a decline in the agriculture/soil factor throughout this region, which would help to neutralize the acidity.
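A minimal sketch of the workflow described above (principal components of a solute matrix, then apportioning total mass among the components by multiple linear regression), using synthetic data with two hypothetical sources rather than the 194-station dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "wet deposition" data: 200 samples x 4 solutes, driven by two
# hypothetical sources (an acid-like and a salt-like association) plus noise.
acid = rng.gamma(2.0, 1.0, 200)
salt = rng.gamma(2.0, 1.0, 200)
X = np.column_stack([
    2.0 * acid + 0.1 * salt,   # sulfate-like solute
    1.5 * acid,                # nitrate-like solute
    0.1 * acid + 2.0 * salt,   # chloride-like solute
    1.8 * salt,                # sodium-like solute
]) + rng.normal(0, 0.05, (200, 4))

# Principal components of the standardized solute matrix.
Z = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T          # sample scores on the first two components

# Apportion total solute mass among the components by multiple linear regression.
total_mass = X.sum(axis=1)
A = np.column_stack([np.ones(len(scores)), scores])
coef, *_ = np.linalg.lstsq(A, total_mass, rcond=None)
r2 = 1 - np.sum((total_mass - A @ coef) ** 2) / np.sum((total_mass - total_mass.mean()) ** 2)
print(round(r2, 3))
```

With two underlying sources, two components reproduce nearly all the variation in total mass, which is the sense in which multisolute components are "more directly related to the sources of the deposition."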
Fiołka, Marta J; Grzywnowicz, Krzysztof; Mendyk, Ewaryst; Zagaja, Mirosław; Szewczyk, Rafał; Rawski, Michał; Keller, Radosław; Rzymowska, Jolanta; Wydrych, Jerzy
2015-12-01
In this paper, an antimycobacterial component of extracellular metabolites of a gut bacterium Raoultella ornithinolytica from D. veneta earthworms was isolated and its antimycobacterial action was tested using Mycobacterium smegmatis. After incubation with the complex obtained, formation of pores and furrows in cell walls was observed using microscopic techniques. The cells lost their shape, stuck together and formed clusters. Surface-enhanced Raman spectroscopy analysis showed that, after incubation, the complex was attached to the cell walls of the Mycobacterium. Analyses of the component performed with Fourier transform infrared spectroscopy demonstrated high similarity to a bacteriocin nisin, but energy dispersive X-ray spectroscopy analysis revealed differences in the elemental composition of this antimicrobial peptide. The component with antimycobacterial activity was identified using mass spectrometry techniques as a glycolipid-peptide complex. As it exhibits no cytotoxicity on normal human fibroblasts, the glycolipid-peptide complex appears to be a promising compound for investigations of its activity against pathogenic mycobacteria. © 2015 APMIS. Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.
2013-01-01
We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in the 310.5-340 nm range demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research.
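The one-step retrieval described above can be sketched as a joint linear fit of background principal components and an SO2 Jacobian. Everything below (spectra, Jacobian shape, units) is a synthetic stand-in, not the operational algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n_wl, n_train = 80, 300

# Hypothetical background radiance spectra (no SO2): a few smooth modes + noise.
wl = np.linspace(0.0, 1.0, n_wl)
modes = np.column_stack([np.ones(n_wl), wl, np.sin(2 * np.pi * wl)])
bg = modes @ rng.normal(0, 1, (3, n_train)) + rng.normal(0, 0.01, (n_wl, n_train))

# Principal components of the SO2-free training radiances.
mean_bg = bg.mean(axis=1, keepdims=True)
U, _, _ = np.linalg.svd(bg - mean_bg, full_matrices=False)
pcs = U[:, :3]

# Assumed SO2 Jacobian (dR/dVCD): a narrow absorption feature.
jac = -np.exp(-((wl - 0.4) ** 2) / 0.002)

# Simulated measurement with a true vertical column density of 2.5 (arbitrary units).
true_vcd = 2.5
meas = mean_bg[:, 0] + pcs @ np.array([0.5, -0.2, 0.1]) + true_vcd * jac

# One-step retrieval: regress the measurement on [PCs | Jacobian];
# the Jacobian coefficient is the retrieved column density.
A = np.column_stack([pcs, jac])
coef, *_ = np.linalg.lstsq(A, meas - mean_bg[:, 0], rcond=None)
vcd = coef[-1]
print(round(vcd, 2))
```

Because the principal components absorb the smooth background variability, the Jacobian coefficient isolates the absorber's column in a single least-squares step.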
Structure of the Nucleon and its Excitations
NASA Astrophysics Data System (ADS)
Kamleh, Waseem; Leinweber, Derek; Liu, Zhan-wei; Stokes, Finn; Thomas, Anthony; Thomas, Samuel; Wu, Jia-jun
2018-03-01
The structure of the ground state nucleon and its finite-volume excitations are examined from three different perspectives. Using new techniques to extract the relativistic components of the nucleon wave function, the node structure of both the upper and lower components of the nucleon wave function are illustrated. A non-trivial role for gluonic components is manifest. In the second approach, the parity-expanded variational analysis (PEVA) technique is utilised to isolate states at finite momenta, enabling a novel examination of the electric and magnetic form factors of nucleon excitations. Here the magnetic form factors of low-lying odd-parity nucleons are particularly interesting. Finally, the structure of the nucleon spectrum is examined in a Hamiltonian effective field theory analysis incorporating recent lattice-QCD determinations of low-lying two-particle scattering-state energies in the finite volume. The Roper resonance of Nature is observed to originate from multi-particle coupled-channel interactions while the first radial excitation of the nucleon sits much higher at approximately 1.9 GeV.
NASA Technical Reports Server (NTRS)
Brown, Andrew M.
2014-01-01
Numerical and analytical methods were developed to determine damage accumulation in specific engine components when speed variation is included. The Dither Life Ratio (DLR) is shown to be well over a factor of 2 for a specific example. The steady-state assumption is shown to be accurate for most turbopump cases, allowing rapid calculation of DLR. If hot-fire speed data are unknown, a Monte Carlo method was developed that uses speed statistics for similar engines. Application of these techniques allows the analyst to reduce both uncertainty and excess conservatism. High values of DLR could allow a previously unacceptable part to pass HCF criteria without redesign. Given the benefit and ease of implementation, it is recommended that any finite-life turbomachine component analysis adopt these techniques. Probability values were calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new Excel macros were written to calculate the combined load for any specific probability level. Closed-form curve fits were generated for the widely used 3-sigma and 2-sigma probability levels. For the design of lightweight aerospace components, obtaining an accurate, reproducible, statistically meaningful answer is critical.
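The random-plus-harmonic load combination can be sketched with a small Monte Carlo experiment; the amplitudes below are hypothetical, and the combined load at the 2-sigma probability level is read from the empirical quantile:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Hypothetical loads: a zero-mean Gaussian random load (sigma = 1) plus a
# harmonic load of amplitude A sampled at a uniformly random phase.
sigma, A = 1.0, 1.5
random_load = rng.normal(0.0, sigma, n)
harmonic_load = A * np.sin(rng.uniform(0.0, 2.0 * np.pi, n))
combined = random_load + harmonic_load

# Combined load not exceeded with 2-sigma (97.725%) probability.
p2sigma = np.quantile(combined, 0.97725)
print(round(p2sigma, 2))
```

The probabilistically combined value falls below the naive "sum of peaks" (2*sigma + A), which is the excess conservatism the techniques above are meant to remove.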
Studying Ultradisperse Diamond Structure within Explosively Synthesized Samples via X-Ray Techniques
NASA Astrophysics Data System (ADS)
Sharkov, M. D.; Boiko, M. E.; Ivashevskaya, S. N.; Belyakova, N. S.
2013-08-01
XRD (X-Ray Diffraction) and SAXS (Small-Angle X-Ray Scattering) data have been measured for a pair of samples produced with the help of explosives. XRD peaks show that both samples contain crystalline diamond components as well as graphite ones. Based on SAXS analysis, the possible presence of grains with radii up to 30-50 nm within all the samples has been shown. Structure components with fractal dimension between 1 and 2 have been detected in the sample, in agreement with the assumption that the diamond grain coatings resemble onion shells. In order to broaden the rocking-curve analysis, the standard SAXS treatment technique has been complemented by a Fourier filtering procedure. For sample #1, rocking curve components corresponding to individual interplanar distances with magnitudes from 5 nm up to 15 nm have been separated. A hypothesis relating these values to the distances between concentric onion-like shells of diamond grains has been formulated.
Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C
2013-01-01
The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. 
Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.
Chapman, Peter J; Vogt, Frank; Dutta, Pampa; Datskos, Panos G; Devault, Gerald L; Sepaniak, Michael J
2007-01-01
The very simple coupling of a standard, packed-column gas chromatograph with a microcantilever array (MCA) is demonstrated for enhanced selectivity and potential analyte identification in the analysis of volatile organic compounds (VOCs). The cantilevers in MCAs are differentially coated on one side with responsive phases (RPs) and produce bending responses of the cantilevers due to analyte-induced surface stresses. Generally, individual components are difficult to elucidate when introduced to MCA systems as mixtures, although pattern recognition techniques are helpful in identifying single components, binary mixtures, or composite responses of distinct mixtures (e.g., fragrances). In the present work, simple test VOC mixtures composed of acetone, ethanol, and trichloroethylene (TCE) in pentane and methanol and acetonitrile in pentane are first separated using a standard gas chromatograph and then introduced into a MCA flow cell. Significant amounts of response diversity to the analytes in the mixtures are demonstrated across the RP-coated cantilevers of the array. Principal component analysis is used to demonstrate that only three components of a four-component VOC mixture could be identified without mixture separation. Calibration studies are performed, demonstrating a good linear response over 2 orders of magnitude for each component in the primary study mixture. Studies of operational parameters including column temperature, column flow rate, and array cell temperature are conducted. Reproducibility studies of VOC peak areas and peak heights are also carried out showing RSDs of less than 4 and 3%, respectively, for intra-assay studies. Of practical significance is the facile manner by which the hyphenation of a mature separation technique and the burgeoning sensing approach is accomplished, and the potential to use pattern recognition techniques with MCAs as a new type of detector for chromatography with analyte-identifying capabilities.
A managerial accounting analysis of hospital costs.
Frank, W G
1976-01-01
Variance analysis, an accounting technique, is applied to an eight-component model of hospital costs to determine the contribution each component makes to cost increases. The method is illustrated by application to data on total costs from 1950 to 1973 for all U.S. nongovernmental not-for-profit short-term general hospitals. The costs of a single hospital are analyzed and compared to the group costs. The potential uses and limitations of the method as a planning and research tool are discussed. PMID:965233
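The variance-analysis idea, splitting the change in each cost component into price and volume contributions, can be sketched as follows; the figures are illustrative, not the hospital data from the study:

```python
# Minimal sketch of accounting variance analysis (illustrative numbers, not the
# paper's data): the change in a cost component is split into a price variance
# and a volume variance.
def variance_analysis(p0, q0, p1, q1):
    """Split the cost change p1*q1 - p0*q0 into price and volume variances."""
    price_var = (p1 - p0) * q1        # change in unit price, at the new volume
    volume_var = (q1 - q0) * p0       # change in volume, at the old price
    return price_var, volume_var

# Hypothetical component: labor cost per patient-day, and patient-days.
p0, q0 = 20.0, 1000.0   # base year
p1, q1 = 25.0, 1100.0   # current year
price_var, volume_var = variance_analysis(p0, q0, p1, q1)
total_change = p1 * q1 - p0 * q0
print(price_var, volume_var, total_change)
```

The two variances sum exactly to the total cost change, so repeating this for each of the model's components apportions the overall increase among them.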
2007-10-01
Regional Morphology Empirical Analysis Package (RMAP): Orthogonal Function Analysis, Background and Examples
BLIND EXTRACTION OF AN EXOPLANETARY SPECTRUM THROUGH INDEPENDENT COMPONENT ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldmann, I. P.; Tinetti, G.; Hollis, M. D. J.
2013-03-20
Blind-source separation techniques are used to extract the transmission spectrum of the hot-Jupiter HD189733b recorded by the Hubble/NICMOS instrument. Such a 'blind' analysis of the data is based on the concept of independent component analysis. The detrending of Hubble/NICMOS data using the sole assumption that non-Gaussian systematic noise is statistically independent from the desired light-curve signals is presented. By not assuming any prior or auxiliary information but the data themselves, it is shown that spectroscopic errors only about 10%-30% larger than parametric methods can be obtained for 11 spectral bins with bin sizes of ~0.09 μm. This represents a reasonable trade-off between a higher degree of objectivity for the non-parametric methods and smaller standard errors for the parametric de-trending. Results are discussed in light of previous analyses published in the literature. The fact that three very different analysis techniques yield comparable spectra is a strong indication of the stability of these results.
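A minimal illustration of blind source separation in the spirit described above, using a whitening step plus a kurtosis-maximizing rotation as a simple stand-in for a full ICA algorithm (the two synthetic sources are hypothetical, not NICMOS data):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000

# Two independent non-Gaussian sources: a binary (sub-Gaussian) signal and a
# heavy-tailed (super-Gaussian) one.
s1 = np.sign(rng.normal(size=n))
s2 = rng.laplace(size=n)
S = np.vstack([s1, s2])

# Observed mixtures through an unknown mixing matrix.
Amix = np.array([[1.0, 0.6], [0.4, 1.0]])
X = Amix @ S

# Whiten the mixtures, then scan rotation angles for the maximally
# non-Gaussian (extreme-kurtosis) direction.
Xc = X - X.mean(1, keepdims=True)
evals, evecs = np.linalg.eigh(Xc @ Xc.T / n)
Z = np.diag(evals ** -0.5) @ evecs.T @ Xc   # whitened data

def kurt(y):
    return np.mean(y ** 4) - 3.0

best = max(np.linspace(0, np.pi / 2, 500),
           key=lambda a: abs(kurt(np.cos(a) * Z[0] + np.sin(a) * Z[1])))
y1 = np.cos(best) * Z[0] + np.sin(best) * Z[1]
y2 = -np.sin(best) * Z[0] + np.cos(best) * Z[1]

# Each recovered component should correlate strongly with one true source.
c = np.abs(np.corrcoef(np.vstack([y1, y2, S]))[:2, 2:])
print(np.round(c, 2))
```

Decorrelation (whitening) alone leaves an arbitrary rotation undetermined; it is the non-Gaussianity criterion, a proxy for statistical independence, that fixes the rotation and separates the sources.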
Valero, E; Sanz, J; Martínez-Castro, I
2001-06-01
Direct thermal desorption (DTD) has been used as a technique for extracting volatile components of cheese as a preliminary step to their gas chromatographic (GC) analysis. In this study, it is applied to different cheese varieties: Camembert, blue, Chaumes, and La Serena. Volatiles are also extracted using other techniques such as simultaneous distillation-extraction and dynamic headspace. Separation and identification of the cheese components are carried out by GC-mass spectrometry. Approximately 100 compounds are detected in the examined cheeses. The described results show that DTD is fast, simple, and easy to automate; requires only a small amount of sample (approximately 50 mg); and affords quantitative information about the main groups of compounds present in cheeses.
McCarty, James; Parrinello, Michele
2017-11-28
In this paper, we combine two powerful computational techniques, well-tempered metadynamics and time-lagged independent component analysis. The aim is to develop a new tool for studying rare events and exploring complex free energy landscapes. Metadynamics is a well-established and widely used enhanced sampling method whose efficiency depends on an appropriate choice of collective variables. Often the initial choice is not optimal, leading to slow convergence. However, by analyzing the dynamics generated in one such run with a time-lagged independent component analysis and the techniques recently developed in the area of conformational dynamics, we obtain much more efficient collective variables that are also better capable of illuminating the physics of the system. We demonstrate the power of this approach in two paradigmatic examples.
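The time-lagged independent component analysis step can be sketched as a generalized eigenproblem on the instantaneous and time-lagged covariance matrices; the two-state toy trajectory below is an assumption for illustration, not the paper's systems:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000

# Synthetic trajectory: a slow two-state hopping coordinate plus a fast
# autoregressive coordinate, linearly mixed into the observed variables.
slow = np.zeros(n)
state = 1.0
for t in range(1, n):
    if rng.random() < 0.001:            # rare hop -> slow process
        state = -state
    slow[t] = state + 0.1 * rng.normal()
fast = np.zeros(n)
for t in range(1, n):
    fast[t] = 0.5 * fast[t - 1] + rng.normal()
X = np.column_stack([slow + 0.5 * fast, 0.5 * slow - fast])

# TICA: solve C(tau) v = lambda C(0) v for the slowest linear components.
tau = 50
Xc = X - X.mean(0)
C0 = Xc.T @ Xc / len(Xc)
Ct = Xc[:-tau].T @ Xc[tau:] / (len(Xc) - tau)
Ct = 0.5 * (Ct + Ct.T)                  # symmetrize the lagged covariance

# Reduce to an ordinary symmetric eigenproblem via a Cholesky whitening of C0.
L = np.linalg.cholesky(C0)
Linv = np.linalg.inv(L)
evals, W = np.linalg.eigh(Linv @ Ct @ Linv.T)
tics = Xc @ (Linv.T @ W)
slowest = tics[:, -1]                   # largest lagged autocorrelation = slowest mode

# The slowest TICA component should track the hidden slow coordinate.
corr = abs(np.corrcoef(slowest, slow)[0, 1])
print(round(corr, 2))
```

The eigenvalues are lag-tau autocorrelations, so the top eigenvector picks out the slowest linear combination of the inputs, which is why such components make good collective variables for enhanced sampling.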
Wood lens design philosophy based on a binary additive manufacturing technique
NASA Astrophysics Data System (ADS)
Marasco, Peter L.; Bailey, Christopher
2016-04-01
Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Pietro, E.; Visca, E.; Orsini, A.
1995-12-31
The design of plasma-facing components for ITER, as for any of the envisaged next-step machines, relies heavily on the use of brazed junctions to couple armour materials to the heat sink and cooling tubes. Moreover, the typical number of brazed components and the envisaged effects of local overheating due to failure in a single brazed junction stress the importance of having a set of NDE techniques developed that can ensure the flawless quality of the joint. The qualification and application of two NDE techniques (ultrasonic and thermographic analysis) for inspection of CFC-to-metal joints is described, with particular regard to the annular geometry typical of macroblock/monoblock solutions for divertor high-heat-flux components. The results of the eddy current inspection are not reported. The development has been focused specifically on the joint between carbon-fiber composite and TZM molybdenum alloy; techniques for the production of reference defect samples have been devised and a set of reference defect samples produced. The comparative results of the NDE inspections are reported and discussed, also on the basis of the destructive examination of the samples. The nature and size of relevant and detectable defects are discussed together with hints for a possible NDE strategy for divertor high-heat-flux components.
Moon, Young-Wan; Kim, Hyun-Jung; Ahn, Hyeong-Sik; Park, Chan-Deok; Lee, Dae-Hee
2016-01-01
Background: This meta-analysis was designed to compare the accuracy of soft tissue balancing and femoral component rotation as well as change in joint line positions, between the measured resection and gap balancing techniques in primary total knee arthroplasty. Methods: Studies were included in the meta-analysis if they compared soft tissue balancing and/or radiologic outcomes in patients who underwent total knee arthroplasty with the gap balancing and measured resection techniques. Comparisons included differences in flexion/extension, medial/lateral flexion, and medial/lateral extension gaps (LEGs), femoral component rotation, and change in joint line positions. Finally, 8 studies identified via electronic (MEDLINE, EMBASE, and the Cochrane Library) and manual searches were included. All 8 studies showed a low risk of selection bias and provided detailed demographic data. There was some inherent heterogeneity due to uncontrolled bias, because all included studies were observational comparison studies. Results: The pooled mean difference in gap differences between the gap balancing and measured resection techniques did not differ significantly (−0.09 mm, 95% confidence interval [CI]: −0.40 to +0.21 mm; P = 0.55), except that the medial/LEG difference was 0.58 mm greater for measured resection than gap balancing (95% CI: −1.01 to −0.15 mm; P = 0.008). Conversely, the pooled mean difference in femoral component external rotation (0.77°, 95% CI: 0.18° to 1.35°; P = 0.01) and joint line change (1.17 mm, 95% CI: 0.82 to 1.52 mm; P < 0.001) were significantly greater for the gap balancing than the measured resection technique. Conclusion: The gap balancing and measured resection techniques showed similar soft tissue balancing, except for medial/LEG difference. However, the femoral component was more externally rotated and the joint line was more elevated with gap balancing than measured resection.
These differences were minimal (around 1 mm or 1°) and therefore may have little effect on the biomechanics of the knee joint. This suggests that the gap balancing and measured resection techniques are not mutually exclusive. PMID:27684862
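The pooled mean differences reported above come from standard inverse-variance meta-analytic pooling, which can be sketched as follows (the per-study numbers are hypothetical, not those of the 8 included studies):

```python
import math

# Fixed-effect inverse-variance pooling of per-study mean differences.
def pool(mean_diffs, std_errs):
    w = [1.0 / se ** 2 for se in std_errs]               # inverse-variance weights
    pooled = sum(wi * d for wi, d in zip(w, mean_diffs)) / sum(w)
    pooled_se = math.sqrt(1.0 / sum(w))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, ci

# Hypothetical femoral-rotation differences (degrees) from three studies.
pooled, (lo, hi) = pool([0.9, 0.6, 0.8], [0.3, 0.4, 0.25])
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```

A confidence interval that excludes zero, as for femoral rotation and joint line change in the study, indicates a statistically significant pooled difference.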
Agreement in polar motion measurements during the MERIT campaign
NASA Astrophysics Data System (ADS)
Djurovic, D.; Techy, C.; Paquet, P.
From the original polar motion (PM) measurements performed during the MERIT Campaign, the Chandler and the annual components are removed. The analysis of the residuals shows a high level of significant correlation between the various techniques, mainly for phenomena ranging from 30 days to a few months. For periods smaller than one month the series are not correlated, except for the X component deduced from laser and Doppler techniques, which remains significant at the 99 percent level. These results support the case for a new Earth rotation service open to different sources of data.
Bonetti, Jennifer; Quarino, Lawrence
2014-05-01
This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications. © 2014 American Academy of Forensic Sciences.
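A rough sketch of the multivariate workflow above (PCA for dimension reduction, then a discriminant-style classification with hold-one-out cross-validation); a nearest-centroid classifier stands in for CDA here, and the soil data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "soil" measurements for three hypothetical sites: three particle-size
# fractions, pH, and loss on ignition, with site-specific means.
means = np.array([[60, 25, 15, 5.5, 8.0],
                  [40, 35, 25, 6.5, 12.0],
                  [20, 45, 35, 7.5, 20.0]], dtype=float)
X = np.vstack([m + rng.normal(0, [2, 2, 2, 0.2, 1.0], (10, 5)) for m in means])
y = np.repeat([0, 1, 2], 10)

# PCA on the standardized measurements for dimensionality reduction.
Z = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T

# Hold-one-out cross-validation with a nearest-centroid rule in PC space.
errors = 0
for i in range(len(scores)):
    mask = np.arange(len(scores)) != i
    cents = [scores[mask & (y == c)].mean(0) for c in range(3)]
    pred = int(np.argmin([np.linalg.norm(scores[i] - ct) for ct in cents]))
    errors += pred != y[i]
print(errors)
```

When the between-site differences dominate the within-site scatter, as in the study's low reported error rates, the hold-one-out misclassification count stays at or near zero.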
NASA Astrophysics Data System (ADS)
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2014-12-01
A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is the Principal Component Analysis (PCA), which allows one to reduce the dimensionality of the data space while keeping most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the decorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing that the components be independent. The Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series imposing a smaller number of constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series.
First, we use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise), and study the ability of the algorithm to recover the original (known) sources of deformation. Secondly, we apply vbICA to different tectonically active scenarios, such as earthquakes in central and northern Italy, as well as the study of slow slip events in Cascadia.
Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.
2009-01-01
Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575
Similarities between principal components of protein dynamics and random diffusion
NASA Astrophysics Data System (ADS)
Hess, Berk
2000-12-01
Principal component analysis, also called essential dynamics, is a powerful tool for finding global, correlated motions in atomic simulations of macromolecules. It has become an established technique for analyzing molecular dynamics simulations of proteins. The first few principal components of simulations of large proteins often resemble cosines. We derive the principal components for high-dimensional random diffusion, which are almost perfect cosines. This resemblance between protein simulations and noise implies that for many proteins the time scales of current simulations are too short to obtain convergence of collective motions.
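This result can be reproduced in a few lines: generate high-dimensional free diffusion, take its PCA, and compare the projection onto the first principal component with a half-period cosine (all sizes here are illustrative):

```python
# The principal components of pure high-dimensional random diffusion are
# almost perfect cosines — the resemblance noted in the abstract.
import numpy as np

rng = np.random.default_rng(42)
n_steps, n_dim = 2000, 50
walk = np.cumsum(rng.normal(size=(n_steps, n_dim)), axis=0)  # free diffusion

X = walk - walk.mean(axis=0)                  # center, then PCA via SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = X @ Vt[0]                               # projection on first PC

# Half-period cosine expected for the first principal component
cosine = np.cos(np.pi * (np.arange(n_steps) + 0.5) / n_steps)
cc = abs(np.corrcoef(pc1, cosine)[0, 1])
print(f"cosine content of PC1: {cc:.3f}")     # close to 1 for pure diffusion
```

If a protein simulation's leading principal components look this cosine-like, the collective motions may simply not be converged, which is the paper's warning.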
Karain, Wael I
2017-11-28
Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques, such as principal component analysis, are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that detects transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from the complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132-ns molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The bootstrap technique based on recurrence quantification analysis is able to detect transitions between different dynamical states of a protein over different time scales. It is not limited to linear dynamics regimes and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for time windows large enough to ensure good statistical quality of the recurrence complexity measures used to detect the transitions.
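A minimal sketch of the recurrence-quantification step (with an illustrative threshold choice, and without the bootstrap machinery) shows how these complexity measures separate structured from unstructured dynamics:

```python
# Recurrence matrix plus two standard RQA measures: recurrence rate (RR) and
# determinism (DET, the fraction of recurrent points on diagonal lines).
import numpy as np

def rqa(x, eps, lmin=2):
    R = np.abs(x[:, None] - x[None, :]) < eps      # recurrence matrix
    n = len(x)
    rec = R.sum() - n                              # exclude the main diagonal
    if rec == 0:
        return 0.0, 0.0
    diag_pts = 0
    for k in range(1, n):                          # scan each off-main diagonal
        run = 0
        for v in np.append(np.diagonal(R, k), False):
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_pts += run                # points on diagonal lines
                run = 0
    rr = rec / (n * n - n)
    det = 2 * diag_pts / rec                       # matrix is symmetric
    return rr, det

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 400)
periodic = np.sin(t)
noise = rng.normal(size=400)

rr_p, det_p = rqa(periodic, eps=0.1)
rr_n, det_n = rqa(noise, eps=0.1 * noise.std())
# A periodic signal shows far more deterministic structure than white noise.
```

In the paper's scheme, a statistically significant shift of such measures across bootstrap-resampled windows would flag a transition.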
NASA Astrophysics Data System (ADS)
Moreau, O.; Libbrecht, C.; Lee, D.-W.; Surdej, J.
2005-06-01
Using an optimal image subtraction technique, we have derived the V and R light curves of the four lensed QSO components of Q2237+0305 from the monitoring CCD frames obtained by the GLITP collaboration with the 2.6 m NOT telescope in 1999/2000 (Alcalde et al. 2002). We give here a detailed account of the data reduction and analysis and of the error estimates. In agreement with Woźniak et al. (2000a,b), the good photometric accuracy derived for the GLITP data makes it possible to interpret the light curve of component A as due to a microlensing event taking place in the deflecting galaxy. This interpretation is strengthened by the colour dependence of the early rise of the light curve of component A, as it probably corresponds to a caustic crossing by the QSO source.
NASA Technical Reports Server (NTRS)
Knuth, Kevin H.; Shah, Ankoor S.; Truccolo, Wilson; Ding, Ming-Zhou; Bressler, Steven L.; Schroeder, Charles E.
2003-01-01
Electric potentials and magnetic fields generated by ensembles of synchronously active neurons in response to external stimuli provide information essential to understanding the processes underlying cognitive and sensorimotor activity. Interpreting recordings of these potentials and fields is difficult because each detector records signals simultaneously generated by various regions throughout the brain. We introduce the differentially Variable Component Analysis (dVCA) algorithm, which relies on trial-to-trial variability in response amplitude and latency to identify multiple components. Using simulations, we evaluate the importance of response variability to component identification, the robustness of dVCA to noise, and its ability to characterize single-trial data. Finally, we evaluate the technique using visually evoked field potentials recorded at incremental depths across the layers of cortical area V1 in an awake, behaving macaque monkey.
Parallel line analysis: multifunctional software for the biomedical sciences
NASA Technical Reports Server (NTRS)
Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.
1990-01-01
An easy-to-use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
The application of low-rank and sparse decomposition method in the field of climatology
NASA Astrophysics Data System (ADS)
Gupta, Nitika; Bhaskaran, Prasad K.
2018-04-01
The present study reports a low-rank and sparse decomposition method that separates the mean and the variability of a climate data field. Until now, the application of this technique was limited to areas such as image processing, web data ranking, and bioinformatics data analysis. In climate science, this method exactly separates the original data into a set of low-rank and sparse components, wherein the low-rank component depicts the linearly correlated part of the dataset (the expected or mean behavior) and the sparse component represents the variation or perturbation of the dataset from its mean behavior. The study attempts to verify the efficacy of the proposed technique in the field of climatology with two real-world examples. The first applies the technique to maximum wind-speed (MWS) data for the Indian Ocean (IO) region. The study brings to light a decadal reversal pattern in the MWS for the North Indian Ocean (NIO) during the months of June, July, and August (JJA). The second example deals with sea surface temperature (SST) data for the Bay of Bengal region, which exhibits a distinct pattern in the sparse component. The study highlights the usefulness of the proposed technique for the interpretation and visualization of climate data.
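The decomposition itself can be sketched as Principal Component Pursuit solved with the standard inexact augmented Lagrange multiplier scheme; this is a textbook sketch of low-rank + sparse separation, not the authors' exact algorithm or parameters:

```python
# Low-rank + sparse decomposition M ≈ L + S (Robust PCA / Principal Component
# Pursuit) via the inexact ALM iteration: SVD soft-thresholding for L,
# entrywise soft-thresholding for S.
import numpy as np

def rpca_ialm(M, lam=None, tol=1e-7, max_iter=500):
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    norm_two = np.linalg.norm(M, 2)
    Y = M / max(norm_two, np.abs(M).max() / lam)   # dual variable init
    mu, rho = 1.25 / norm_two, 1.6
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # low-rank update: singular value thresholding
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # sparse update: entrywise soft thresholding
        T = M - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        Z = M - L - S
        Y += mu * Z
        mu = min(mu * rho, 1e7)
        if np.linalg.norm(Z) <= tol * np.linalg.norm(M):
            break
    return L, S

rng = np.random.default_rng(0)
L0 = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 60))  # rank-2 "mean behavior"
S0 = np.zeros((60, 60))
mask = rng.random((60, 60)) < 0.05                        # 5% sparse "perturbations"
S0[mask] = rng.choice([-5.0, 5.0], size=mask.sum())

L, S = rpca_ialm(L0 + S0)
rel_err = np.linalg.norm(L - L0) / np.linalg.norm(L0)
```

Applied to a climate field arranged as a (time × grid point) matrix, L would carry the expected behavior and S the anomalies, as described above.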
NASA Astrophysics Data System (ADS)
Su, Rongguo; Chen, Xiaona; Wu, Zhenzhen; Yao, Peng; Shi, Xiaoyong
2015-07-01
The feasibility of using fluorescence excitation-emission matrices (EEMs) along with parallel factor analysis (PARAFAC) and the nonnegative least squares (NNLS) method for the differentiation of phytoplankton taxonomic groups was investigated. Forty-one phytoplankton species belonging to 28 genera of five divisions were studied. First, the PARAFAC model was applied to the EEMs, and 15 fluorescence components were generated. Second, the 15 fluorescence components were found to have strong discriminating capability based on Bayesian discriminant analysis (BDA). Third, all spectra of the fluorescence component compositions for the 41 phytoplankton species were spectrographically sorted into 61 reference spectra using hierarchical cluster analysis (HCA), and the reference spectra were used to establish a database. Finally, the phytoplankton taxonomic groups were differentiated against the reference spectra database using the NNLS method. The five phytoplankton groups were differentiated with correct discrimination ratios (CDRs) of 100% for single-species samples at the division level. The CDRs for the mixtures were above 91% for the dominant phytoplankton species and above 73% for the subdominant phytoplankton species. Sixteen of the 85 field samples collected from the Changjiang River estuary were analyzed by both HPLC-CHEMTAX and the fluorometric technique developed here. The results of both methods reveal that Bacillariophyta was the dominant algal group in these 16 samples and that the subdominant algal groups comprised Dinophyta, Chlorophyta and Cryptophyta. The differentiation results from the fluorometric technique were in good agreement with those from HPLC-CHEMTAX. The results indicate that the fluorometric technique can differentiate algal taxonomic groups accurately at the division level.
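The unmixing step can be sketched with SciPy's non-negative least squares solver; the reference spectra below are random stand-ins for the PARAFAC-derived components, not real fluorescence data:

```python
# NNLS unmixing of a measured spectrum against a reference-spectra database:
# the non-negative weights estimate each group's contribution, and the largest
# weight assigns the dominant group.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
refs = rng.random((50, 3))              # 50 "wavelengths" x 3 reference spectra
true_w = np.array([0.7, 0.0, 0.3])      # mixture: group 0 dominant, group 2 minor
mix = refs @ true_w + 0.005 * rng.normal(size=50)

w, resid = nnls(refs, mix)              # non-negative abundance estimates
dominant = int(np.argmax(w))            # classify by the largest weight
```

The non-negativity constraint is what makes the weights interpretable as abundances, mirroring the CDR evaluation described above.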
Principal Component Analysis in the Spectral Analysis of the Dynamic Laser Speckle Patterns
NASA Astrophysics Data System (ADS)
Ribeiro, K. M.; Braga, R. A., Jr.; Horgan, G. W.; Ferreira, D. D.; Safadi, T.
2014-02-01
Dynamic laser speckle is an optical interference phenomenon observed when a changing surface is illuminated with coherent light; when the changes are produced by biological material, the resulting dynamic speckle pattern is known as biospeckle. Usually, these patterns of optical interference evolving in time are analyzed by graphical or numerical methods; analysis in the frequency domain has also been an option, but it involves large computational requirements, which demands new approaches to filter the images in time. Principal component analysis (PCA) works with the statistical decorrelation of data and can be used for data filtering. In this context, the present work evaluated the PCA technique for filtering biospeckle image data in time, aiming to reduce computation time and improve the robustness of the filtering. Sixty-four biospeckle images of a maize seed observed over time were used. The images were arranged in a data matrix and statistically decorrelated by the PCA technique, and the reconstructed signals were analyzed using the routine graphical and numerical biospeckle methods. Results showed the potential of the PCA tool for filtering dynamic laser speckle data, with the identification of principal components related to the biological phenomena and with the advantage of fast computational processing.
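A minimal sketch of such PCA filtering, assuming the frames are stacked into a (time × pixels) matrix (sizes and signals below are illustrative, not the paper's data):

```python
# PCA-based temporal filtering of an image stack: reconstruct every frame from
# the first few principal components only, discarding the noise-dominated rest.
import numpy as np

def pca_filter(frames, n_keep):
    T = frames.shape[0]
    X = frames.reshape(T, -1).astype(float)        # (time x pixels) matrix
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    filtered = (U[:, :n_keep] * s[:n_keep]) @ Vt[:n_keep] + mean
    return filtered.reshape(frames.shape)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64)
maps = rng.random((2, 32, 32))                     # two spatial activity maps
clean = (np.sin(2 * np.pi * t)[:, None, None] * maps[0]
         + t[:, None, None] * maps[1])             # 64-frame low-rank "stack"
noisy = clean + 0.2 * rng.normal(size=clean.shape)

filtered = pca_filter(noisy, n_keep=2)
err_noisy = np.linalg.norm(noisy - clean)
err_filt = np.linalg.norm(filtered - clean)        # smaller after filtering
```

The SVD on a 64-frame stack is cheap, which is the computational advantage the abstract claims over frequency-domain filtering.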
Rinaldi, Maurizio; Gindro, Roberto; Barbeni, Massimo; Allegrone, Gianna
2009-01-01
Orange (Citrus sinensis L.) juice comprises a complex mixture of volatile components that are difficult to identify and quantify. Classification and discrimination of varieties on the basis of volatile composition could help to guarantee the quality of a juice and to detect possible adulteration of the product. The aims were to provide information on the amounts of volatile constituents in fresh-squeezed juices from four orange cultivars and to establish suitable discrimination rules to differentiate orange juices using new chemometric approaches. Fresh juices of four orange cultivars were analysed by headspace solid-phase microextraction (HS-SPME) coupled with GC-MS. Principal component analysis, linear discriminant analysis and heuristic methods such as neural networks allowed clustering of the data from HS-SPME analysis, while genetic algorithms addressed the problem of data reduction. To check the quality of the results, the chemometric techniques were also evaluated on a test sample. Thirty volatile compounds were identified by HS-SPME and GC-MS analyses and their relative amounts calculated. Differences in the composition of orange juice volatile components were observed. The chosen orange cultivars could be discriminated using neural networks, genetic relocation algorithms and linear discriminant analysis. Genetic algorithms applied to the data were also able to detect the most significant compounds. SPME is a useful technique to investigate orange juice volatile composition, and a flexible chemometric approach is able to correctly separate the juices.
Multimodal Neuroimaging: Basic Concepts and Classification of Neuropsychiatric Diseases.
Tulay, Emine Elif; Metin, Barış; Tarhan, Nevzat; Arıkan, Mehmet Kemal
2018-06-01
Neuroimaging techniques are widely used in neuroscience to visualize neural activity, to improve our understanding of brain mechanisms, and to identify biomarkers, especially for psychiatric diseases; however, each neuroimaging technique has several limitations. These limitations led to the development of multimodal neuroimaging (MN), which combines data obtained from multiple neuroimaging techniques, such as electroencephalography and functional magnetic resonance imaging, and yields more detailed information about brain dynamics. There are several types of MN, including visual inspection, data integration, and data fusion. This literature review aimed to provide a brief summary and basic information about MN techniques (data fusion approaches in particular) and classification approaches. Data fusion approaches are generally categorized as asymmetric and symmetric. The present review focused exclusively on studies based on symmetric data fusion methods (data-driven methods), such as independent component analysis and principal component analysis. Machine learning techniques have recently been introduced for use in identifying diseases and biomarkers of disease. The machine learning technique most widely used by neuroscientists is classification, especially support vector machine classification. Several studies differentiated patients with psychiatric diseases from healthy controls using combined datasets. The common conclusion among these studies is that the prediction of diseases improves when combining data via MN techniques; however, there remain a few challenges associated with MN, such as sample size. Perhaps in the future N-way fusion can be used to combine multiple neuroimaging techniques or nonimaging predictors (eg, cognitive ability) to overcome the limitations of MN.
An FT-raman study of softwood, hardwood, and chemically modified black spruce MWLS
Umesh P. Agarwal; James D. McSweeny; Sally A. Ralph
1999-01-01
Raman spectroscopy is being increasingly used to carry out in situ analysis of wood and other lignocellulosics. To obtain useful information from the spectra, the vibrational bands need to be assigned in terms of contributions from various chemical components and component sub-structures. In addition, so that the technique can be better applied as an analytical...
Algorithm based on the short-term Rényi entropy and IF estimation for noisy EEG signals analysis.
Lerga, Jonatan; Saulig, Nicoletta; Mozetič, Vladimir
2017-01-01
Stochastic electroencephalogram (EEG) signals are known to be nonstationary and often multicomponential. Detecting and extracting their components may help clinicians to localize brain neurological dysfunction in patients with motor control disorders, since movement-related cortical activities are reflected in spectral EEG changes. A new algorithm for detecting EEG signal components from the signal's time-frequency distribution (TFD) is proposed in this paper. The algorithm utilizes a modification of the Rényi entropy-based technique for estimating the number of components, called short-term Rényi entropy (STRE), upgraded by an iterative algorithm that was shown to enhance existing approaches. Combined with instantaneous frequency (IF) estimation, the proposed method was applied to the analysis of limb-movement EEG signals in both noise-free and noisy environments and was shown to be an efficient technique, providing a spectral description of brain activity at each electrode location up to moderate additive noise levels. Furthermore, the obtained information concerning the number of EEG signal components and their IFs shows potential to enhance the diagnostics and treatment of neurological disorders in patients with motor control illnesses.
NASA Astrophysics Data System (ADS)
Norinder, Ulf
1990-12-01
An experimental-design-based 3-D QSAR analysis using a combination of principal component and PLS analysis is presented and applied to human corticosteroid-binding globulin complexes. The predictive capability of the resulting model is good. The technique can also be used as guidance when selecting new compounds for investigation.
Estimation of surface curvature from full-field shape data using principal component analysis
NASA Astrophysics Data System (ADS)
Sharma, Sameer; Vinuchakravarthy, S.; Subramanian, S. J.
2017-01-01
Three-dimensional digital image correlation (3D-DIC) is a popular image-based experimental technique for estimating surface shape, displacements and strains of deforming objects. In this technique, a calibrated stereo rig is used to obtain and stereo-match pairs of images of the object of interest from which the shapes of the imaged surface are then computed using the calibration parameters of the rig. Displacements are obtained by performing an additional temporal correlation of the shapes obtained at various stages of deformation and strains by smoothing and numerically differentiating the displacement data. Since strains are of primary importance in solid mechanics, significant efforts have been put into computation of strains from the measured displacement fields; however, much less attention has been paid to date to computation of curvature from the measured 3D surfaces. In this work, we address this gap by proposing a new method of computing curvature from full-field shape measurements using principal component analysis (PCA) along the lines of a similar work recently proposed to measure strains (Grama and Subramanian 2014 Exp. Mech. 54 913-33). PCA is a multivariate analysis tool that is widely used to reveal relationships between a large number of variables, reduce dimensionality and achieve significant denoising. This technique is applied here to identify dominant principal components in the shape fields measured by 3D-DIC and these principal components are then differentiated systematically to obtain the first and second fundamental forms used in the curvature calculation. The proposed method is first verified using synthetically generated noisy surfaces and then validated experimentally on some real world objects with known ground-truth curvatures.
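The final step, combining derivatives of the surface into curvature via the fundamental forms, can be sketched on an analytic, noise-free height field; on real DIC shape data this would follow the PCA denoising the paper proposes (the grid and radius below are illustrative):

```python
# Gaussian curvature of a measured height field z = f(x, y) from numerical
# derivatives, using the graph-surface formula built from the first and second
# fundamental forms:  K = (f_xx f_yy - f_xy^2) / (1 + f_x^2 + f_y^2)^2
import numpy as np

h = 0.01
x = np.arange(-0.5, 0.5 + h, h)
X, Y = np.meshgrid(x, x, indexing="ij")
R = 2.0
Z = np.sqrt(R**2 - X**2 - Y**2)         # spherical cap: true K = 1/R^2 everywhere

fx, fy = np.gradient(Z, h, h)           # first derivatives
fxx, fxy = np.gradient(fx, h, h)        # second derivatives
_, fyy = np.gradient(fy, h, h)

K = (fxx * fyy - fxy**2) / (1 + fx**2 + fy**2) ** 2

c = len(x) // 2
print(K[c, c])                          # ≈ 1/R^2 = 0.25 at the apex
```

With noisy measured shapes, differentiating raw data this way amplifies noise badly, which is precisely why the paper differentiates smooth principal-component reconstructions instead.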
Practical issues of hyperspectral imaging analysis of solid dosage forms.
Amigo, José Manuel
2010-09-01
Hyperspectral imaging techniques have widely demonstrated their usefulness in different areas of interest in pharmaceutical research during the last decade. In particular, middle infrared, near infrared, and Raman methods have gained special relevance. This rapid increase has been promoted by the capability of hyperspectral techniques to provide robust and reliable chemical and spatial information on the distribution of components in pharmaceutical solid dosage forms. Furthermore, the valuable combination of hyperspectral imaging devices with adequate data processing techniques offers the perfect landscape for developing new methods for scanning and analyzing surfaces. Nevertheless, the instrumentation and subsequent data analysis are not exempt from issues that must be thoughtfully considered. This paper describes and discusses the main advantages and drawbacks of the measurements and data analysis of hyperspectral imaging techniques in the development of solid dosage forms.
Approaches to the Analysis of School Costs, an Introduction.
ERIC Educational Resources Information Center
Payzant, Thomas
A review and general discussion of quantitative and qualitative techniques for the analysis of economic problems outside of education is presented to help educators discover new tools for planning, allocating, and evaluating educational resources. The pamphlet covers some major components of cost accounting, cost effectiveness, cost-benefit…
NASA Astrophysics Data System (ADS)
Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho
2013-10-01
Cost-effective and time-efficient analytical techniques are required to screen large food lots in accordance with their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red pepper. Black pepper gave intermediate results (700-5000 PCs), while paprika showed low sensitivity (negative results) upon irradiation. The DEFT/APC technique also gave clear screening results through changes in microbial profiles, with the best results found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination of volatile profiles upon irradiation through principal component analysis. These methods show potential for application in the screening analysis of irradiated foods.
Three-dimensional analysis of magnetometer array data
NASA Technical Reports Server (NTRS)
Richmond, A. D.; Baumjohann, W.
1984-01-01
A technique is developed for mapping magnetic variation fields in three dimensions using data from an array of magnetometers, based on the theory of optimal linear estimation. The technique is applied to data from the Scandinavian Magnetometer Array. Estimates of the spatial power spectra for the internal and external magnetic variations are derived, which in turn provide estimates of the spatial autocorrelation functions of the three magnetic variation components. Statistical errors involved in mapping the external and internal fields are quantified and displayed over the mapping region. Examples of field mapping and of separation into external and internal components are presented. A comparison between the three-dimensional field separation and a two-dimensional separation from a single chain of stations shows that significant differences can arise in the inferred internal component.
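The estimation step can be sketched as Gauss-Markov (optimal linear) interpolation under an assumed spatial covariance; the squared-exponential covariance and all parameters below are illustrative assumptions for the demo, not those of the paper:

```python
# Optimal linear estimation of a field from a scattered "array": the estimate
# at each target point is a covariance-weighted combination of observations,
# and the same algebra yields the mapping error variance.
import numpy as np

def optimal_map(obs_xy, obs_val, tgt_xy, length=1.0, noise=1e-4):
    def cov(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)       # assumed covariance model
    C = cov(obs_xy, obs_xy) + noise * np.eye(len(obs_xy))
    c = cov(tgt_xy, obs_xy)
    est = c @ np.linalg.solve(C, obs_val)          # optimal weights applied
    # error variance: prior variance minus variance explained by the data
    var = 1.0 + noise - np.einsum("ij,ji->i", c, np.linalg.solve(C, c.T))
    return est, var

rng = np.random.default_rng(0)
obs_xy = rng.uniform(-2, 2, size=(40, 2))          # hypothetical station layout
field = lambda p: np.sin(p[:, 0]) + np.cos(p[:, 1])  # hypothetical smooth field
obs_val = field(obs_xy)

tgt = np.array([[0.3, -0.4], [10.0, 10.0]])        # one interior, one far point
est, var = optimal_map(obs_xy, obs_val, tgt)
# var grows away from the array, quantifying where the map is trustworthy.
```

The per-point error variance is the quantity the paper displays over the mapping region to show where the Scandinavian array constrains the field.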
Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition
NASA Astrophysics Data System (ADS)
Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso
2005-04-01
Human movement analysis is generally performed with marker-based systems, which allow the trajectories of markers placed on specific points of the human body to be reconstructed with high accuracy. Marker-based systems, however, have some drawbacks that can be overcome by video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre decomposition, and a principal component analysis (PCA) technique is used to circumscribe the region of interest. Results obtained on both synthetic and experimental tests show a significant reduction in computational cost with no significant loss of tracking accuracy.
Jia, Zhixin; Wu, Caisheng; Jin, Hongtao; Zhang, Jinlan
2014-11-15
Saussurea involucrata is a rare traditional Chinese medicine (TCM) that displays anti-fatigue, anti-inflammatory and anti-tumor effects. In this paper, the different chemical components of Saussurea involucrata were characterized and identified over a wide dynamic range by high-performance liquid chromatography coupled with high-resolution hybrid mass spectrometry (HPLC/HRMS/MS(n)) and the mass spectral trees similarity filter (MTSF) technique. The aerial parts of Saussurea involucrata were extracted with 75% ethanol. The partial extract was separated on a chromatography column to concentrate the low-concentration compounds. Mass data were acquired using full-scan mass analysis (resolving power 50,000) with data-dependent incorporation of dynamic exclusion analysis. The identified compounds were used as templates to construct a database of mass spectral trees. Data for the unknown compounds were matched against those templates and matching candidate structures were obtained. The detected compounds were characterized based on matching to candidate structures by the MTSF technique and were further identified by their accurate masses, multiple-stage analyses and fragmentation patterns, and through comparison with literature data. A total of 38 compounds were identified, including 19 flavones, 11 phenylpropanoids and 8 sphingolipids. Among them, 7 flavonoids, 8 phenylpropanoids and 8 sphingolipids were identified for the first time in Saussurea involucrata. HPLC/HRMS/MS(n) combined with MTSF was successfully used to discover and identify the chemical compounds in Saussurea involucrata. The results indicate that this combined technique is extremely useful for the rapid detection and identification of the chemical components in TCMs.
Calculating the mounting parameters for Taylor Spatial Frame correction using computed tomography.
Kucukkaya, Metin; Karakoyun, Ozgur; Armagan, Raffi; Kuzgun, Unal
2011-07-01
The Taylor Spatial Frame uses a computer program-based six-axis deformity analysis. However, there is often a residual deformity after the initial correction, especially in deformities with a rotational component. This problem can be resolved by recalculating the parameters and inputting all new deformity and mounting parameters. However, this may necessitate repeated x-rays and delay treatment. We believe that error in the mounting parameters is the main reason for most residual deformities. To prevent these problems, we describe a new calculation technique for determining the mounting parameters that uses computed tomography. This technique is especially advantageous for deformities with a rotational component. Using this technique, exact calculation of the mounting parameters is possible and the residual deformity and number of repeated x-rays can be minimized. This new technique is an alternative method to accurately calculating the mounting parameters.
Kong, Jessica; Giridharagopal, Rajiv; Harrison, Jeffrey S; Ginger, David S
2018-05-31
Correlating nanoscale chemical specificity with operational physics is a long-standing goal of functional scanning probe microscopy (SPM). We employ a data analytic approach combining multiple microscopy modes, using compositional information in infrared vibrational excitation maps acquired via photoinduced force microscopy (PiFM) with electrical information from conductive atomic force microscopy. We study a model polymer blend comprising insulating poly(methyl methacrylate) (PMMA) and semiconducting poly(3-hexylthiophene) (P3HT). We show that PiFM spectra are different from FTIR spectra, but can still be used to identify local composition. We use principal component analysis to extract statistically significant principal components and principal component regression to predict local current and identify local polymer composition. In doing so, we observe evidence of semiconducting P3HT within PMMA aggregates. These methods are generalizable to correlated SPM data and provide a meaningful technique for extracting complex compositional information that are impossible to measure from any one technique.
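Principal component regression of a local property on spectra can be sketched with scikit-learn; the spectra, compositions, and response below are synthetic stand-ins for the PiFM and conductive-AFM data described above:

```python
# Principal component regression (PCR): compress spectra with PCA, then fit a
# linear model on the scores to predict a local property (here, "current").
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n, p = 200, 120                                    # samples x "wavenumbers"
basis = rng.normal(size=(2, p))                    # two component spectra
conc = rng.random((n, 2))                          # local composition weights
spectra = conc @ basis + 0.05 * rng.normal(size=(n, p))
current = 2.0 * conc[:, 0] - 0.5 * conc[:, 1]      # hypothetical response

pcr = make_pipeline(PCA(n_components=2), LinearRegression())
pcr.fit(spectra[:150], current[:150])
r2 = pcr.score(spectra[150:], current[150:])       # held-out R^2
```

Regressing on a few statistically significant principal components rather than all spectral channels is what makes the correlation between composition and conduction maps tractable.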
Estimating the vibration level of an L-shaped beam using power flow techniques
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.; Mccollum, M.; Rassineux, J. L.; Gilbert, T.
1986-01-01
The response of one component of an L-shaped beam, with point force excitation on the other component, is estimated using the power flow method. The transmitted power from the source component to the receiver component is expressed in terms of the transfer and input mobilities at the excitation point and the joint. The response is estimated both in narrow frequency bands, using the exact geometry of the beams, and as a frequency averaged response using infinite beam models. The results using this power flow technique are compared to the results obtained using finite element analysis (FEA) of the L-shaped beam for the low frequency response and to results obtained using statistical energy analysis (SEA) for the high frequencies. The agreement between the FEA results and the power flow method results at low frequencies is very good. SEA results are in terms of frequency averaged levels and these are in perfect agreement with the results obtained using the infinite beam models in the power flow method. The narrow frequency band results from the power flow method also converge to the SEA results at high frequencies. The advantage of the power flow method is that detail of the response can be retained while reducing computation time, which will allow the narrow frequency band analysis of the response to be extended to higher frequencies.
Qin, Yan; Pang, Yingming; Cheng, Zhihong
2016-11-01
The needle trap device (NTD) technique is a new microextraction method for sampling and preconcentration of volatile organic compounds (VOCs). Previous NTD studies predominantly focused on analysis of environmental volatile compounds in the gaseous and liquid phases. Little work has been done on its potential application to biological samples, and no work has been reported on analysis of bioactive compounds in essential oils from herbal medicines. The main purpose of the present study is to develop an NTD sampling method for profiling VOCs in biological samples, using herbal medicines as a case study. A combined method of NTD sample preparation and gas chromatography-mass spectrometry was developed for qualitative analysis of VOCs in Viola tianschanica. A 22-gauge stainless steel, triple-bed needle packed with Tenax, Carbopack X and Carboxen 1000 sorbents was used for analysis of VOCs in the herb. Furthermore, different parameters affecting the extraction efficiency and capacity were studied. The peak capacity obtained by NTDs was 104, more efficient than those of static headspace (46) and hydrodistillation (93). The NTD method shows potential to trap a wide range of VOCs, including both lower and higher volatile components, whereas static headspace and hydrodistillation detect only lower volatile components, and semi-volatile and higher volatile components, respectively. The developed NTD sample preparation method is a more rapid, simpler, more convenient, and more sensitive extraction/desorption technique for analysis of VOCs in herbal medicines than conventional methods such as static headspace and hydrodistillation.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2002-06-01
Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale, have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.
Principal component greenness transformation in multitemporal agricultural Landsat data
NASA Technical Reports Server (NTRS)
Abotteen, R. A.
1978-01-01
A data compression technique for multitemporal Landsat imagery which extracts phenological growth pattern information for agricultural crops is described. The principal component greenness transformation was applied to multitemporal agricultural Landsat data for information retrieval. The transformation was favorable for applications in agricultural Landsat data analysis because of its physical interpretability and its relation to the phenological growth of crops. It was also found that the first and second greenness eigenvector components define a temporal small-grain trajectory and nonsmall-grain trajectory, respectively.
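The principal component transformation underlying the greenness analysis can be sketched on simulated multitemporal data; the two toy temporal profiles below stand in for small-grain and non-small-grain phenology and are assumptions, not LACIE data.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.array([0.0, 1.0, 2.0, 3.0])            # four acquisition dates
small_grain = np.exp(-(t - 1.0)**2)           # toy early/mid-season greenness peak
other_crop = np.exp(-(t - 2.5)**2)            # toy late-season greenness peak
X = np.vstack([small_grain + 0.05 * rng.standard_normal((100, 4)),
               other_crop + 0.05 * rng.standard_normal((100, 4))])

# Principal component transformation: eigenvectors of the band covariance.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]               # sort by descending variance
pcs = Xc @ evecs[:, order]                    # pixel scores on the PC axes
```

On data like these, the leading eigenvector components trace out distinct temporal trajectories for the two crop classes, which is the behavior the abstract attributes to the first and second greenness components.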
Improving the geological interpretation of magnetic and gravity satellite anomalies
NASA Technical Reports Server (NTRS)
Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.
1987-01-01
Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
Principal components analysis of Jupiter VIMS spectra
Bellucci, G.; Formisano, V.; D'Aversa, E.; Brown, R.H.; Baines, K.H.; Bibring, J.-P.; Buratti, B.J.; Capaccioni, F.; Cerroni, P.; Clark, R.N.; Coradini, A.; Cruikshank, D.P.; Drossart, P.; Jaumann, R.; Langevin, Y.; Matson, D.L.; McCord, T.B.; Mennella, V.; Nelson, R.M.; Nicholson, P.D.; Sicardy, B.; Sotin, Christophe; Chamberlain, M.C.; Hansen, G.; Hibbits, K.; Showalter, M.; Filacchione, G.
2004-01-01
During the Cassini-Jupiter flyby in December 2000, the Visual-Infrared Mapping Spectrometer (VIMS) instrument took several image cubes of Jupiter at different phase angles and distances. We have analysed the spectral images acquired by the VIMS visual channel by means of a principal component analysis (PCA) technique. The original data set consists of 96 spectral images in the 0.35-1.05 μm wavelength range. The products of the analysis are new PC bands, which contain all the spectral variance of the original data. These new components have been used to produce a map of Jupiter made of seven coherent spectral classes. The map confirms previously published work done on the Great Red Spot by using NIMS data. Some other new findings, presently under investigation, are presented. © 2004 Published by Elsevier Ltd on behalf of COSPAR.
Analysis techniques for residual acceleration data
NASA Technical Reports Server (NTRS)
Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.
1990-01-01
Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides information beyond that available from a time history alone and increases the effectiveness of post-flight analysis of low-gravity experiments.
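The two manipulations described, Fourier analysis of a data window and rotation into experiment coordinates, can be sketched in a few lines of numpy; the sample rate, synthetic signal, and rotation angle are all illustrative.

```python
import numpy as np

fs = 100.0                           # Hz, assumed sample rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
# Synthetic residual acceleration: a 2 Hz dominant component plus noise.
a = 1e-4 * np.sin(2 * np.pi * 2.0 * t) + 1e-6 * rng.standard_normal(t.size)

# Fourier analysis of the window: locate the dominant frequency component.
spec = np.abs(np.fft.rfft(a * np.hanning(a.size)))
freqs = np.fft.rfftfreq(a.size, 1 / fs)
f_dom = freqs[spec.argmax()]

# Transform a 3-axis sample into experiment coordinates with a rotation matrix.
theta = np.deg2rad(30.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
a_body = np.array([1e-4, 0.0, 2e-5])
a_exp = R @ a_body
```

The rotation preserves the acceleration magnitude, so statistics such as RMS level are unchanged while the per-axis components are re-expressed in the experiment's orientation.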
Phased-mission system analysis using Boolean algebraic methods
NASA Technical Reports Server (NTRS)
Somani, Arun K.; Trivedi, Kishor S.
1993-01-01
Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state space explosion that commonly plagues Markov chain-based analysis. A phase algebra to account for the effects of variable configurations and success criteria from phase to phase was developed. Our technique yields exact (as opposed to approximate) results. The use of our technique is demonstrated by means of an example, and numerical results are presented to show the effects of mission phases on system reliability.
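The flavor of exact phased-mission analysis can be illustrated on a hand-built two-phase mission (phase 1 requires A AND B, phase 2 requires A OR B, with exponential component lifetimes); this toy example is not the paper's algorithm, and it cross-checks the closed-form result against Monte Carlo simulation.

```python
import numpy as np

lam = {"A": 0.01, "B": 0.02}       # per-hour failure rates (illustrative)
t1, t2 = 10.0, 20.0                # phase durations in hours

def surv(c, t):
    # Exponential survival function for component c.
    return np.exp(-lam[c] * t)

# Phase 1 requires A AND B; phase 2 requires A OR B (checked at phase ends).
# Exact mission reliability: both components alive at t1, minus the case
# where both then fail during phase 2.
p_both_t1 = surv("A", t1) * surv("B", t1)
p_both_fail_phase2 = ((surv("A", t1) - surv("A", t1 + t2))
                      * (surv("B", t1) - surv("B", t1 + t2)))
R_mission = p_both_t1 - p_both_fail_phase2

# Monte Carlo cross-check of the same mission.
rng = np.random.default_rng(2)
n = 200_000
ta = rng.exponential(1 / lam["A"], n)
tb = rng.exponential(1 / lam["B"], n)
ok = (ta > t1) & (tb > t1) & ((ta > t1 + t2) | (tb > t1 + t2))
R_mc = ok.mean()
```

The key point the abstract makes is visible even here: the phase-2 success criterion must be evaluated conditional on the configuration that survived phase 1, which is what a phase algebra formalizes for arbitrary fault trees.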
A multiple technique approach to the analysis of urinary calculi.
Rodgers, A L; Nassimbeni, L R; Mulder, K J
1982-01-01
Ten urinary calculi were qualitatively and quantitatively analysed using X-ray diffraction, infra-red spectroscopy, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features that often go undetected owing to limitations of the particular analytical procedure being used were identified, and a detailed picture of each stone's composition and structure was obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism was detected. In addition, the usefulness of each technique in the analysis of urinary stones was assessed and the multiple technique approach was evaluated as a whole.
ERIC Educational Resources Information Center
Silverman, Mitchell
Reported are the first-phase activities of a longitudinal project designed to evaluate the effectiveness of the Guided Group Interaction (GGI) technique as a meaningful approach in the field of corrections. The main findings relate to the establishment of reliability for the main components of the Revised Behavior Scores System developed to assess the…
Cebi, Nur; Yilmaz, Mustafa Tahsin; Sagdic, Osman
2017-08-15
Sibutramine may be illicitly included in herbal slimming foods and supplements marketed as "100% natural" to enhance weight loss. Considering public health and legal regulations, there is an urgent need for effective, rapid and reliable techniques to detect sibutramine in dietetic herbal foods, teas and dietary supplements. This research comprehensively explored, for the first time, detection of sibutramine in green tea, green coffee and mixed herbal tea using the ATR-FTIR spectroscopic technique combined with chemometrics. Hierarchical cluster analysis and principal component analysis (PCA) were employed in the spectral range 2746-2656 cm(-1) for classification and discrimination using Euclidean distance and Ward's algorithm. Unadulterated and adulterated samples were classified and discriminated with respect to their sibutramine contents with perfect accuracy, without any false prediction. The results suggest that the presence of the active substance could be determined at levels in the range of 0.375-12 mg in a total of 1.75 g of green tea, green coffee and mixed herbal tea by using the ATR-FTIR technique combined with chemometrics. Copyright © 2017 Elsevier Ltd. All rights reserved.
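The chemometric discrimination step can be sketched on synthetic spectra in the 2746-2656 cm(-1) window; the band shapes and the nearest-centroid rule in PC space are assumptions standing in for the paper's HCA/PCA workflow.

```python
import numpy as np

rng = np.random.default_rng(3)
wn = np.linspace(2746, 2656, 50)              # cm^-1, window used in the study
base = np.exp(-((wn - 2700) / 30)**2)         # toy tea-matrix band (assumed)
sibu = np.exp(-((wn - 2720) / 5)**2)          # toy sibutramine band (assumed)

pure = base + 0.02 * rng.standard_normal((20, 50))
adulterated = base + 0.5 * sibu + 0.02 * rng.standard_normal((20, 50))
X = np.vstack([pure, adulterated])

# PCA scores within the selected spectral window.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Discriminate by Euclidean distance to the class centroids in PC score space.
c_pure = scores[:20].mean(axis=0)
c_adult = scores[20:].mean(axis=0)
pred = np.array([np.linalg.norm(s - c_adult) < np.linalg.norm(s - c_pure)
                 for s in scores])
accuracy = (pred == np.repeat([False, True], 20)).mean()
```

When the adulterant band dominates the within-class noise, as assumed here, the first principal component aligns with the sibutramine signature and the two groups separate cleanly, mirroring the perfect discrimination reported in the abstract.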
Woodrow, Graham
2007-06-01
Complex abnormalities of body composition occur in peritoneal dialysis (PD). These abnormalities reflect changes in hydration, nutrition, and body fat, and they are of major clinical significance. Clinical assessment of these body compartments is insensitive and inaccurate. Frequently, simultaneous changes of hydration, wasting, and body fat content can occur, confounding clinical assessment of each component. Body composition can be described by models of varying complexity that use one or more measurement techniques. "Gold standard" methods provide accurate and precise data, but are not practical for routine clinical use. Dual energy X-ray absorptiometry allows for measurement of regional as well as whole-body composition, which can provide further information of clinical relevance. Simpler techniques such as anthropometry and bioelectrical impedance analysis are suited to routine use in clinic or at the bedside, but may be less accurate. Body composition methodology sometimes makes assumptions regarding relationships between components, particularly in regard to hydration, which may be invalid in pathologic states. Uncritical application of these methods to the PD patient may result in erroneous interpretation of results. Understanding the foundations and limitations of body composition techniques allows for optimal application in clinical practice.
Multi-component separation and analysis of bat echolocation calls.
DiCecco, John; Gaudette, Jason E; Simmons, James A
2013-01-01
The vast majority of animal vocalizations contain multiple frequency modulated (FM) components with varying amounts of non-linear modulation and harmonic instability. This is especially true of biosonar sounds where precise time-frequency templates are essential for neural information processing of echoes. Understanding the dynamic waveform design by bats and other echolocating animals may help to improve the efficacy of man-made sonar through biomimetic design. Bats are known to adapt their call structure based on the echolocation task, proximity to nearby objects, and density of acoustic clutter. To interpret the significance of these changes, a method was developed for component separation and analysis of biosonar waveforms. Techniques for imaging in the time-frequency plane are typically limited due to the uncertainty principle and interference cross terms. This problem is addressed by extending the use of the fractional Fourier transform to isolate each non-linear component for separate analysis. Once separated, empirical mode decomposition can be used to further examine each component. The Hilbert transform may then successfully extract detailed time-frequency information from each isolated component. This multi-component analysis method is applied to the sonar signals of four species of bats recorded in-flight by radiotelemetry along with a comparison of other common time-frequency representations.
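The Hilbert-transform step for extracting detailed time-frequency information from one isolated FM component can be sketched as follows; the chirp parameters are illustrative, not bat-call data.

```python
import numpy as np

fs = 10_000.0                         # Hz, assumed sample rate
t = np.arange(0, 0.05, 1 / fs)        # 500 samples
f0, f1 = 2000.0, 1000.0               # toy downward FM sweep (not a real call)
phase = 2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / t[-1] * t**2)
x = np.cos(phase)

# Analytic signal via the frequency-domain Hilbert construction (even length).
X = np.fft.fft(x)
h = np.zeros(x.size)
h[0] = 1.0
h[1:x.size // 2] = 2.0
h[x.size // 2] = 1.0
z = np.fft.ifft(X * h)

# Instantaneous frequency from the unwrapped analytic phase.
inst_f = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
```

Because the component is isolated before this step, the instantaneous-frequency estimate tracks the sweep without the cross-term interference that plagues quadratic time-frequency representations of multi-component signals.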
Recent Developments in Microsystems Fabricated by the Liga-Technique
NASA Technical Reports Server (NTRS)
Schulz, J.; Bade, K.; El-Kholi, A.; Hein, H.; Mohr, J.
1995-01-01
As an example of microsystems fabricated by the LIGA-technique (x-ray lithography, electroplating and molding), three systems are described and characterized: a triaxial acceleration sensor system, a micro-optical switch, and a microsystem for the analysis of pollutants. The fabrication technologies are reviewed with respect to the key components of the three systems: an acceleration sensor, an electrostatic actuator, and a spectrometer made by the LIGA-technique. A micro-pump and micro-valve, made by using micromachined tools for molding, and optical fiber imaging are made possible by combining LIGA and anisotropic etching of silicon in a batch process. These examples show that the combination of technologies and components is the key to complex microsystems. The design of such microsystems will be facilitated if standardized interfaces are available.
NASA Astrophysics Data System (ADS)
Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao
2018-04-01
In this paper, a statistical forecast model using a time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components, namely, the interannual component with periods less than 8 years, the interdecadal component with periods from 8 to 30 years, and the component with periods longer than 30 years. Then, predictors are selected for the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variations of FPR. The hindcast of FPR for the 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
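A toy version of the decompose-then-regress scheme: split a synthetic series into fast and slow components with a running mean, fit a separate linear regression to each, and sum the component forecasts. The lagged-component predictors here are placeholders for the physical predictors selected in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1951, 2001)
# Toy rainfall series: a 4-yr interannual cycle, a 20-yr interdecadal swing, noise.
rain = (2.0 * np.sin(2 * np.pi * years / 4.0)
        + 1.0 * np.sin(2 * np.pi * years / 20.0)
        + 0.3 * rng.standard_normal(years.size))

# Crude time-scale split with an 8-year running mean (edge-padded).
kernel = np.ones(8) / 8.0
slow = np.convolve(np.pad(rain, 4, mode="edge"), kernel, mode="same")[4:-4]
fast = rain - slow                    # interannual (< 8 yr) component

def fit_predict(y):
    # One-predictor linear regression (lagged component standing in for the
    # paper's physical predictors); predicts the next value of the component.
    X = np.column_stack([np.ones(y.size - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return beta[0] + beta[1] * y[-1]

forecast = fit_predict(fast) + fit_predict(slow)
```

Fitting each time scale separately lets the regression coefficients differ between the fast and slow dynamics, which is the advantage the abstract claims over a single undecomposed regression.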
Classification of fMRI resting-state maps using machine learning techniques: A comparative study
NASA Astrophysics Data System (ADS)
Gallos, Ioannis; Siettos, Constantinos
2017-11-01
We compare the efficiency of Principal Component Analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and Diffusion Maps) for classifying brain maps between groups of schizophrenia patients and healthy controls from fMRI scans acquired during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial Independent Component Analysis (ICA) to reduce (a) noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP and Diffusion Maps to find an embedded low-dimensional space. Finally, support vector machines (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
Component pattern analysis of chemicals using multispectral THz imaging system
NASA Astrophysics Data System (ADS)
Kawase, Kodo; Ogawa, Yuichi; Watanabe, Yuki
2004-04-01
We have developed a novel basic technology for terahertz (THz) imaging, which allows detection and identification of chemicals by introducing component spatial pattern analysis. The spatial distributions of the chemicals were obtained from terahertz multispectral transillumination images, using absorption spectra previously measured with a widely tunable THz-wave parametric oscillator. Further, we have applied this technique to the detection and identification of illicit drugs concealed in envelopes. The samples we used were methamphetamine and MDMA, two of the most widely consumed illegal drugs in Japan, and aspirin as a reference.
Racetrack resonator as a loss measurement platform for photonic components.
Jones, Adam M; DeRose, Christopher T; Lentine, Anthony L; Starbuck, Andrew; Pomerene, Andrew T S; Norwood, Robert A
2015-11-02
This work represents the first complete analysis of the use of a racetrack resonator to measure the insertion loss of efficient, compact photonic components. Beginning with an in-depth analysis of potential error sources and a discussion of the calibration procedure, the technique is used to estimate the insertion loss of waveguide width tapers of varying geometry with a resulting 95% confidence interval of 0.007 dB. The work concludes with a performance comparison of the analyzed tapers with results presented for four taper profiles and three taper lengths.
Characterization of exopolymers of aquatic bacteria by pyrolysis-mass spectrometry
NASA Technical Reports Server (NTRS)
Ford, T.; Sacco, E.; Black, J.; Kelley, T.; Goodacre, R.; Berkeley, R. C.; Mitchell, R.
1991-01-01
Exopolymers from a diverse collection of marine and freshwater bacteria were characterized by pyrolysis-mass spectrometry (Py-MS). Py-MS provides spectra of pyrolysis fragments that are characteristic of the original material. Analysis of the spectra by multivariate statistical techniques (principal component and canonical variate analysis) separated these exopolymers into distinct groups. Py-MS clearly distinguished characteristic fragments, which may be derived from components responsible for functional differences between polymers. The importance of these distinctions and the relevance of pyrolysis information to exopolysaccharide function in aquatic bacteria is discussed.
Isomorphisms between Petri nets and dataflow graphs
NASA Technical Reports Server (NTRS)
Kavi, Krishna M.; Buckles, Billy P.; Bhat, U. Narayan
1987-01-01
Dataflow graphs are a generalized model of computation. Uninterpreted dataflow graphs with nondeterminism resolved via probabilities are shown to be isomorphic to a class of Petri nets known as free choice nets. Petri net analysis methods are readily available in the literature and this result makes those methods accessible to dataflow research. Nevertheless, combinatorial explosion can render Petri net analysis inoperative. Using a previously known technique for decomposing free choice nets into smaller components, it is demonstrated that, in principle, it is possible to determine aspects of the overall behavior from the particular behavior of components.
Using Movies to Analyse Gene Circuit Dynamics in Single Cells
Locke, James CW; Elowitz, Michael B
2010-01-01
Many bacterial systems rely on dynamic genetic circuits to control critical processes. A major goal of systems biology is to understand these behaviours in terms of individual genes and their interactions. However, traditional techniques based on population averages wash out critical dynamics that are either unsynchronized between cells or driven by fluctuations, or ‘noise,’ in cellular components. Recently, the combination of time-lapse microscopy, quantitative image analysis, and fluorescent protein reporters has enabled direct observation of multiple cellular components over time in individual cells. In conjunction with mathematical modelling, these techniques are now providing powerful insights into genetic circuit behaviour in diverse microbial systems. PMID:19369953
Lamberti, Alfredo; Chiesura, Gabriele; Luyckx, Geert; Degrieck, Joris; Kaufmann, Markus; Vanlanduit, Steve
2015-01-01
The measurement of the internal deformations occurring in real-life composite components is a very challenging task, especially for those components that are rather difficult to access. Optical fiber sensors can overcome such a problem, since they can be embedded in the composite materials and serve as in situ sensors. In this article, embedded optical fiber Bragg grating (FBG) sensors are used to analyze the vibration characteristics of two real-life composite components. The first component is a carbon fiber-reinforced polymer automotive control arm; the second is a glass fiber-reinforced polymer aeronautic hinge arm. The modal parameters of both components were estimated by processing the FBG signals with two interrogation techniques: the maximum detection and fast phase correlation algorithms were employed for the demodulation of the FBG signals; the Peak-Picking and PolyMax techniques were instead used for the parameter estimation. To validate the FBG outcomes, reference measurements were performed by means of a laser Doppler vibrometer. The analysis of the results showed that the FBG sensing capabilities were enhanced when the recently-introduced fast phase correlation algorithm was combined with the state-of-the-art PolyMax estimator curve fitting method. In this case, the FBGs provided the most accurate results, i.e., it was possible to fully characterize the vibration behavior of both composite components. When using more traditional interrogation algorithms (maximum detection) and modal parameter estimation techniques (Peak-Picking), some of the modes were not successfully identified. PMID:26516854
NASA Astrophysics Data System (ADS)
Schelkanova, Irina; Toronov, Vladislav
2011-07-01
Although near infrared spectroscopy (NIRS) is now widely used both in emerging clinical techniques and in cognitive neuroscience, the development of the apparatuses and signal processing methods for these applications is still a hot research topic. The main unresolved problem in functional NIRS is the separation of functional signals from the contaminations by systemic and local physiological fluctuations. This problem was approached by using various signal processing methods, including blind signal separation techniques. In particular, principal component analysis (PCA) and independent component analysis (ICA) were applied to the data acquired at the same wavelength and at multiple sites on the human or animal heads during functional activation. These signal processing procedures resulted in a number of principal or independent components that could be attributed to functional activity but their physiological meaning remained unknown. On the other hand, the best physiological specificity is provided by broadband NIRS. Also, a comparison with functional magnetic resonance imaging (fMRI) allows determining the spatial origin of fNIRS signals. In this study we applied PCA and ICA to broadband NIRS data to distill the components correlating with the breath hold activation paradigm and compared them with the simultaneously acquired fMRI signals. Breath holding was used because it generates blood carbon dioxide (CO2) which increases the blood-oxygen-level-dependent (BOLD) signal as CO2 acts as a cerebral vasodilator. Vasodilation causes increased cerebral blood flow which washes deoxyhaemoglobin out of the cerebral capillary bed thus increasing both the cerebral blood volume and oxygenation. Although the original signals were quite diverse, we found very few different components which corresponded to fMRI signals at different locations in the brain and to different physiological chromophores.
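A minimal sketch of distilling a paradigm-correlated component with PCA from synthetic multi-channel data; the paradigm timing, channel count, and signal model are all assumptions, not the study's acquisition parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0.0, 300.0)                      # s, 1 Hz sampling (assumed)
paradigm = ((t % 60) < 20).astype(float)       # 20 s breath holds every 60 s

# Toy broadband NIRS: 16 wavelength channels mixing a paradigm-driven
# response, a systemic oscillation, and noise (all illustrative).
functional = np.convolve(paradigm, np.ones(10) / 10.0, mode="same")
systemic = np.sin(2 * np.pi * 0.1 * t)         # ~0.1 Hz systemic fluctuation
W = rng.uniform(0.5, 1.5, (16, 2))
X = (W[:, [0]] * functional + W[:, [1]] * systemic
     + 0.05 * rng.standard_normal((16, t.size)))

# PCA across channels; keep the component most correlated with the paradigm.
Xc = X - X.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
corrs = [abs(np.corrcoef(v, paradigm)[0, 1]) for v in Vt[:4]]
best = int(np.argmax(corrs))
```

Correlating each extracted component with the activation paradigm, as sketched here, is one simple way to decide which components carry functional signal and which reflect systemic physiology.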
Vergara, Victor M; Ulloa, Alvaro; Calhoun, Vince D; Boutte, David; Chen, Jiayu; Liu, Jingyu
2014-09-01
Multi-modal data analysis techniques, such as the Parallel Independent Component Analysis (pICA), are essential in neuroscience, medical imaging and genetic studies. The pICA algorithm allows the simultaneous decomposition of up to two data modalities achieving better performance than separate ICA decompositions and enabling the discovery of links between modalities. However, advances in data acquisition techniques facilitate the collection of more than two data modalities from each subject. Examples of commonly measured modalities include genetic information, structural magnetic resonance imaging (MRI) and functional MRI. In order to take full advantage of the available data, this work extends the pICA approach to incorporate three modalities in one comprehensive analysis. Simulations demonstrate the three-way pICA performance in identifying pairwise links between modalities and estimating independent components which more closely resemble the true sources than components found by pICA or separate ICA analyses. In addition, the three-way pICA algorithm is applied to real experimental data obtained from a study investigating genetic effects on alcohol dependence. Considered data modalities include functional MRI (contrast images during an alcohol exposure paradigm), gray matter concentration images from structural MRI and genetic single nucleotide polymorphisms (SNP). The three-way pICA approach identified links between a SNP component (pointing to brain function and mental disorder associated genes, including BDNF, GRIN2B and NRG1), a functional component related to increased activation in the precuneus area, and a gray matter component comprising part of the default mode network and the caudate. Although such findings need further verification, the simulation and in-vivo results validate the three-way pICA algorithm presented here as a useful tool in biomedical data fusion applications. Copyright © 2014 Elsevier Inc. All rights reserved.
CARES/Life Software for Designing More Reliable Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.
1997-01-01
Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion, and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
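Probabilistic design of brittle components typically rests on two-parameter Weibull (weakest-link) statistics; the sketch below shows the size effect that makes a larger component fail more often at the same stress. The parameter values are illustrative, and this is a generic Weibull sketch, not the CARES/Life implementation.

```python
import numpy as np

# Two-parameter Weibull strength model (weakest-link scaling), toy numbers.
m = 10.0          # Weibull modulus (describes scatter in strength)
sigma0 = 400.0    # MPa, characteristic strength of the test specimens
V0 = 1.0          # mm^3, effective volume of the test specimens

def failure_probability(sigma, V):
    # Probability of fracture for a uniformly stressed volume V at stress sigma.
    return 1.0 - np.exp(-(V / V0) * (sigma / sigma0)**m)

# A larger component is weaker at the same stress (the brittle size effect):
pf_specimen = failure_probability(300.0, 1.0)
pf_component = failure_probability(300.0, 10.0)
```

In practice the stress field from a finite element analysis is nonuniform, so the exponent becomes an integral of the local stress over the component volume, but the size effect shown here is the core of the reliability prediction.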
Choo, Yuen May; Ng, Mei Han; Ma, Ah Ngan; Chuah, Cheng Hock; Hashim, Mohd Ali
2005-04-01
The application of supercritical fluid chromatography (SFC) coupled with a UV variable-wavelength detector to isolate the minor components (carotenes, vitamin E, sterols, and squalene) in crude palm oil (CPO) and the residual oil from palm-pressed fiber is reported. SFC is a good technique for the isolation and analysis of these compounds from the sources mentioned. The carotenes, vitamin E, sterols, and squalene were isolated in less than 20 min. The individual vitamin E isomers present in palm oil were also isolated into their respective components, alpha-tocopherol, alpha-tocotrienol, gamma-tocopherol, gamma-tocotrienol, and delta-tocotrienol. Calibration of all the minor components of palm as well as the individual components of palm vitamin E was carried out and was found to be comparable to those analyzed by other established analytical methods.
NASA Astrophysics Data System (ADS)
Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław
Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Also discussed are methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.
NASA Technical Reports Server (NTRS)
Noor, A. K.; Andersen, C. M.; Tanner, J. A.
1984-01-01
An effective computational strategy is presented for the large-rotation, nonlinear axisymmetric analysis of shells of revolution. The three key elements of the computational strategy are: (1) use of mixed finite-element models with discontinuous stress resultants at the element interfaces; (2) substantial reduction in the total number of degrees of freedom through the use of a multiple-parameter reduction technique; and (3) reduction in the size of the analysis model through the decomposition of asymmetric loads into symmetric and antisymmetric components coupled with the use of the multiple-parameter reduction technique. The potential of the proposed computational strategy is discussed. Numerical results are presented to demonstrate the high accuracy of the mixed models developed and to show the potential of using the proposed computational strategy for the analysis of tires.
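The decomposition of an asymmetric load into symmetric and antisymmetric components about theta = 0 can be written in a few lines; the load profile below is illustrative.

```python
import numpy as np

theta = np.linspace(-np.pi, np.pi, 181)     # symmetric grid about theta = 0
# Illustrative asymmetric circumferential load on a shell of revolution.
f = 1.0 + 0.5 * np.cos(theta) + 0.3 * np.sin(2 * theta)

# Split into components symmetric/antisymmetric about theta = 0:
# f_sym(theta) = (f(theta) + f(-theta)) / 2, f_anti = (f(theta) - f(-theta)) / 2.
f_rev = f[::-1]                              # f(-theta) on this symmetric grid
f_sym = 0.5 * (f + f_rev)
f_anti = 0.5 * (f - f_rev)
```

Each half can then be analyzed on a reduced model with the appropriate symmetry boundary conditions, which is how the decomposition shrinks the analysis model in the strategy described.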
Color enhancement of landsat agricultural imagery: JPL LACIE image processing support task
NASA Technical Reports Server (NTRS)
Madura, D. P.; Soha, J. M.; Green, W. B.; Wherry, D. B.; Lewis, S. D.
1978-01-01
Color enhancement techniques were applied to LACIE LANDSAT segments to determine whether such enhancement can assist analysts in crop identification. The procedure increases the color range by removing correlation between components: first a principal component transformation is performed, followed by contrast enhancement to equalize component variances, and finally an inverse transformation to restore familiar color relationships. Filtering was applied to lower-order components to reduce color speckle in the enhanced products. The use of single-acquisition and multiple-acquisition statistics to control the enhancement was compared, and the effects of normalization were investigated. Evaluation is left to LACIE personnel.
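The forward-transform, variance-equalization, inverse-transform sequence described above is the classic decorrelation stretch. A minimal sketch (not the JPL code; the function name and target standard deviation are illustrative) is:

```python
import numpy as np

def decorrelation_stretch(img, target_sigma=50.0):
    """Decorrelation stretch of a multiband image (H x W x bands):
    principal component transform, per-component variance equalization,
    then the inverse transform back to the original band space."""
    h, w, nb = img.shape
    X = img.reshape(-1, nb).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)                       # PCA rotation
    pcs = Xc @ evecs
    pcs *= target_sigma / np.sqrt(np.maximum(evals, 1e-12))  # equalize variances
    out = pcs @ evecs.T + mean                               # inverse transform
    return out.reshape(h, w, nb)
```

After the stretch the band-space covariance is isotropic (target_sigma squared times the identity), which is what spreads the image over the full color range.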
A cyclostationary multi-domain analysis of fluid instability in Kaplan turbines
NASA Astrophysics Data System (ADS)
Pennacchi, P.; Borghesani, P.; Chatterton, S.
2015-08-01
Hydraulic instabilities represent a critical problem for Francis and Kaplan turbines, reducing their useful life through increased fatigue of components and cavitation phenomena. Whereas an exhaustive literature on computational fluid-dynamic models of hydraulic instability is available, the possibility of applying diagnostic techniques based on vibration measurements has not been investigated sufficiently, partly because hydro turbine units are seldom equipped with the appropriate sensors. The aim of this study is to fill this knowledge gap and to fully exploit, for this purpose, the potential of combining cyclostationary analysis tools, able to describe complex dynamics such as those of fluid-structure interactions, with order tracking procedures, which allow domain transformations and consequently the separation of synchronous and non-synchronous components. The paper focuses on experimental data obtained on a full-scale Kaplan turbine unit operating in a real power plant, tackling the issues of adapting such diagnostic tools to the analysis of hydraulic instabilities and proposing techniques and methodologies for a highly automated condition monitoring system.
Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study
NASA Astrophysics Data System (ADS)
Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh
2018-03-01
Lack of problem-solving skills and of cooperation between support groups are two obstacles frequently faced on actual production lines. Inadequate detailed analysis and inappropriate problem-solving techniques can cause recurring issues that affect organizational performance. This study uses a well-structured six sigma DMAIC approach, combined with other problem-solving tools, to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate contributing to scrap and rework. A detailed analysis is conducted in the analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented; the results show that the rejection rate due to stripping defects decreased tremendously and that the process capability index improved from 0.75 to 1.67. These results prove that the six sigma approach used to tackle the quality problem is substantially effective.
Development of neural network techniques for finger-vein pattern classification
NASA Astrophysics Data System (ADS)
Wu, Jian-Da; Liu, Chiung-Tsiung; Tsai, Yi-Jang; Liu, Jun-Ching; Chang, Ya-Wen
2010-02-01
A personal identification system using finger-vein patterns and neural network techniques is proposed in the present study. In the proposed system, the finger-vein patterns are captured by a device that transmits near-infrared light through the finger and records the patterns for signal analysis and classification. The biometric verification system combines feature extraction using principal component analysis with pattern classification using both a back-propagation network and an adaptive neuro-fuzzy inference system. Finger-vein features are first extracted by the principal component analysis method to reduce the computational burden and remove noise residing in the discarded dimensions. The features are then used in pattern classification and identification. To verify the effectiveness of the proposed adaptive neuro-fuzzy inference system in pattern classification, it is compared with the back-propagation network. The experimental results indicate that the proposed system using the adaptive neuro-fuzzy inference system outperforms the back-propagation network for personal identification using finger-vein patterns.
Chemometric Data Analysis for Deconvolution of Overlapped Ion Mobility Profiles
NASA Astrophysics Data System (ADS)
Zekavat, Behrooz; Solouki, Touradj
2012-11-01
We present the details of a data analysis approach for deconvolution of ion mobility (IM) overlapped or unresolved species. This approach takes advantage of ion fragmentation variations as a function of IM arrival time. The data analysis involves the use of an in-house developed preprocessing platform for conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab-compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM-overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results for evaluating the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)) and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of IM deconvolution.
Exploring patterns enriched in a dataset with contrastive principal component analysis.
Abid, Abubakar; Zhang, Martin J; Bagaria, Vivek K; Zou, James
2018-05-30
Visualization and exploration of high-dimensional data is a ubiquitous challenge across disciplines. Widely used techniques such as principal component analysis (PCA) aim to identify dominant trends in one dataset. However, in many settings we have datasets collected under different conditions, e.g., a treatment and a control experiment, and we are interested in visualizing and exploring patterns that are specific to one dataset. This paper proposes a method, contrastive principal component analysis (cPCA), which identifies low-dimensional structures that are enriched in a dataset relative to comparison data. In a wide variety of experiments, we demonstrate that cPCA with a background dataset enables us to visualize dataset-specific patterns missed by PCA and other standard methods. We further provide a geometric interpretation of cPCA and strong mathematical guarantees. An implementation of cPCA is publicly available, and can be used for exploratory data analysis in many applications where PCA is currently used.
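The core of cPCA can be sketched in a few lines: instead of the top eigenvectors of the target covariance alone, it takes the top eigenvectors of the difference between target and background covariances. This is an illustrative sketch, not the authors' released implementation, and the contrast parameter alpha is an assumption (the paper explores a range of values):

```python
import numpy as np

def cpca_directions(target, background, alpha=1.0, n_components=2):
    """Contrastive PCA sketch: directions with high variance in `target`
    but low variance in `background` (rows = samples, cols = features)."""
    ct = np.cov(target, rowvar=False)
    cb = np.cov(background, rowvar=False)
    # Symmetric eigendecomposition of the contrastive covariance
    evals, evecs = np.linalg.eigh(ct - alpha * cb)
    order = np.argsort(evals)[::-1]          # largest contrastive variance first
    return evecs[:, order[:n_components]]
```

With alpha = 0 this reduces to ordinary PCA of the target; increasing alpha progressively suppresses directions that merely reflect background variation.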
Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chang Jae; Han, Seung; Yun, Jae Hee
2015-07-01
Safety limits are required to maintain the integrity of the physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached at which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the analytical limit requirement is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond design basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the analytical limit requirement alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that completeness of the timing test is difficult to demonstrate; the analysis technique has the demerit of yielding extreme times that are not actually possible.
Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating individual response times to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or on single components, together covering the total instrument channel, shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and the final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique.
Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saito, M.; Suzuki, S.; Kimura, M.
Quantitative X-ray structural analysis coupled with anomalous X-ray scattering has been used to characterize the atomic-scale structure of rust formed on steel surfaces. Samples were prepared from rust layers formed on the surfaces of two commercial steels. X-ray scattered intensity profiles of the two samples showed that the rusts consisted mainly of two types of ferric oxyhydroxide, α-FeOOH and γ-FeOOH. The amounts of these rust components and the realistic atomic arrangements within them were estimated by fitting both the ordinary and the environmental interference functions with a model structure calculated using the reverse Monte Carlo simulation technique. The two rust components were found to have a network structure formed by FeO6 octahedral units, with the network itself deviating from the ideal case. The present results also suggest that structural analysis using anomalous X-ray scattering and the reverse Monte Carlo technique is very successful in determining the atomic-scale structure of rusts formed on steel surfaces.
Kesharaju, Manasa; Nagarajah, Romesh
2015-09-01
The motivation for this research stems from the need for a non-destructive testing method capable of detecting and locating defects and microstructural variations within armour ceramic components before issuing them to the soldiers who rely on them for their survival. The development of an automated classification system based on ultrasonic inspection would make it possible to check each ceramic component and immediately alert the operator to the presence of defects. In many classification problems the choice of features, or dimensionality reduction, is significant and simultaneously very difficult, as substantial computational effort is required to evaluate possible feature subsets. In this research, a combination of artificial neural networks and genetic algorithms is used to optimize the feature subset used in the classification of various defects in reaction-sintered silicon carbide ceramic components. Initially, wavelet-based feature extraction is implemented on the region of interest. An artificial neural network classifier is employed to evaluate the performance of these features, and genetic algorithm based feature selection is performed. Principal Component Analysis, a popular technique for feature selection, is compared with the genetic algorithm based technique in terms of classification accuracy and selection of the optimal number of features. The experimental results confirm that the features identified by Principal Component Analysis lead to better performance, with a classification rate of 96% compared with 94% for the genetic algorithm. Copyright © 2015 Elsevier B.V. All rights reserved.
NIR monitoring of in-service wood structures
Michela Zanetti; Timothy G. Rials; Douglas Rammer
2005-01-01
Near infrared spectroscopy (NIRS) was used to study a set of Southern Yellow Pine boards exposed to natural weathering for different periods of exposure time. This non-destructive spectroscopic technique is a very powerful tool to predict the weathering of wood when used in combination with multivariate analysis (Principal Component Analysis, PCA, and Projection to...
USDA-ARS?s Scientific Manuscript database
A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...
A reduced order, test verified component mode synthesis approach for system modeling applications
NASA Astrophysics Data System (ADS)
Butland, Adam; Avitabile, Peter
2010-05-01
Component mode synthesis (CMS) is a very common approach for the generation of large system models. In general, CMS techniques fall into two categories: those utilizing a combination of constraint modes and fixed-interface normal modes, and those based on a combination of free-interface normal modes and residual flexibility terms. The major limitation of methods utilizing constraint modes and fixed-interface normal modes is the difficulty of obtaining the required information from testing; as a result, constraint-mode-based techniques are used primarily with numerical models. An alternate approach is proposed which utilizes frequency and shape information acquired from modal testing to update reduced order finite element models using exact analytical model improvement techniques. The connection degrees of freedom are then rigidly constrained in the test-verified, reduced order model to provide the boundary conditions necessary for constraint modes and fixed-interface normal modes. The CMS approach is then used with this test-verified, reduced order model to generate the system model for further analysis. A laboratory structure, with both numerical and simulated experimental components describing the system, is used to show the application of the technique and to validate the proposed approach. Actual test data are then used in the proposed approach. Because measurement contaminants are always present in any test, the measured data are further processed to remove contaminants before use. The final case, using improved data with the reduced order, test-verified components, is shown to produce very acceptable results with the Craig-Bampton component mode synthesis approach. The strengths and weaknesses of the technique are discussed.
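A minimal sketch of the fixed-interface (Craig-Bampton) reduction referenced above may help fix ideas. This is the textbook algorithm, not the authors' code; the matrix partitioning and mode count are illustrative. The stiffness and mass matrices are split into interior (i) and boundary (b) degrees of freedom, constraint modes are the static interior response to unit boundary motion, and a truncated set of fixed-interface normal modes completes the basis:

```python
import numpy as np

def craig_bampton(K, M, boundary, n_modes):
    """Craig-Bampton reduction of symmetric (n x n) K and M.
    `boundary` lists the connection DOFs retained physically."""
    n = K.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)
    Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
    # Constraint modes: static interior response to unit boundary displacement
    psi = -np.linalg.solve(Kii, Kib)
    # Fixed-interface normal modes: eigenmodes with the boundary clamped,
    # solved via a Cholesky factor of Mii (generalized -> standard problem)
    Mii = M[np.ix_(i, i)]
    L = np.linalg.cholesky(Mii)
    Linv = np.linalg.inv(L)
    evals, vecs = np.linalg.eigh(Linv @ Kii @ Linv.T)
    phi = (Linv.T @ vecs)[:, :n_modes]       # lowest modes first
    # Transformation T: physical DOFs <- [boundary, modal] coordinates
    T = np.zeros((n, len(b) + n_modes))
    T[b, :len(b)] = np.eye(len(b))
    T[np.ix_(i, np.arange(len(b)))] = psi
    T[np.ix_(i, len(b) + np.arange(n_modes))] = phi
    return T.T @ K @ T, T.T @ M @ T, T
```

Because the boundary DOFs stay physical, reduced component models can be assembled into a system model by matching boundary coordinates, which is exactly what makes the method attractive for system synthesis.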
Poonkuzhali, K; Rajeswari, V; Saravanakumar, T; Viswanathamurthi, P; Park, Seung-Moon; Govarthanan, M; Sathishkumar, P; Palvannan, T
2014-05-15
Treating effluent discharges to protect the environment from non-biodegradable metal contaminants using plant extracts is an efficient technique. The reduction of hexavalent chromium by the abundantly available weed Aerva lanata L. was investigated using a batch equilibrium technique. The variables studied were Cr(VI) concentration, Aerva lanata L. dose, contact time, pH, temperature and agitation speed. Cyclic voltammetry and ICP-MS analysis confirmed the reduction of Cr(VI) to Cr(III). Electrochemical analysis proved that the chromium was not degraded; only its valency was changed. ICP-MS analysis showed that 100 ng/L of hexavalent chromium was reduced to 97.01 ng/L of trivalent chromium. These results suggest that components present in Aerva lanata L. are responsible for the reduction of Cr(VI) to Cr(III). The prime components ferulic acid, kaempferol and β-carboline present in Aerva lanata L. may be responsible for the reduction of Cr(VI), as evident from LC-MS analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Tolstikov, Vladimir V.
Analysis of the metabolome, covering all detectable components in a sample rather than each individual metabolite at a time, can be accomplished by metabolic analysis. Targeted and/or nontargeted approaches are applied as needed for particular experiments. Monitoring hundreds or more metabolites at a time requires high-throughput, high-end techniques that enable screening for relative changes in, rather than absolute concentrations of, compounds within a wide dynamic range. Most analytical techniques useful for these purposes use GC or HPLC/UPLC separation modules coupled to a fast and accurate mass spectrometer. GC separations require chemical modification (derivatization) before analysis and work efficiently for small molecules. HPLC separations are better suited to the analysis of labile and nonvolatile polar and nonpolar compounds in their native form. Direct infusion and NMR-based techniques are mostly used for fingerprinting and snap phenotyping, where applicable. The discovery and validation of metabolic biomarkers are exciting and promising opportunities offered by metabolic analysis applied to biological and biomedical experiments. We have demonstrated that the GC-TOF-MS, HPLC/UPLC-RP-MS and HILIC-LC-MS techniques used for metabolic analysis offer sufficient metabolome mapping, providing researchers with confident data for subsequent multivariate analysis and data mining.
Indirect measurements of hydrogen: The deficit method for a many-component system
NASA Astrophysics Data System (ADS)
Levine, Timothy E.; Yu, Ning; Kodali, Padma; Walter, Kevin C.; Nastasi, Michael; Tesmer, Joseph R.; Maggiore, Carl J.; Mayer, James W.
We have developed a simple technique for determining hydrogen atomic fraction from the ion backscattering spectrometry (IBS) signals of the remaining species. This technique uses the surface heights of various IBS signals in the form of a linear matrix equation. We apply this technique to in situ analysis of ion-beam-induced densification of sol-gel zirconia thin films, where hydrogen is the most volatile species during irradiation. Attendant errors are discussed with an emphasis on stopping powers and Bragg's rule.
U-Pb ages of zircon rims: A new analytical method using the air-abrasion technique
Aleinikoff, J.N.; Winegarden, D.L.; Walter, M.
1990-01-01
We present a new technique for directly dating, by conventional methods, the rims of zircons. Several circumstances, such as a xenocrystic or inherited component in igneous zircon and metamorphic overgrowths on igneous cores, can result in grains with physically distinct age components. Pneumatic abrasion has previously been shown by Krogh to remove overgrowths and damaged areas of zircon, leaving the more resistant and isotopically less disturbed parts available for analysis. A new abrader design, capable of very gently grinding only the tips and interfacial edges of even needle-like grains, permits easy collection of the abraded material for dating. Five examples demonstrate the utility of this "dust-collecting" technique, including two studies that compare conventional, ion microprobe, and abrader data. Common Pb may be strongly concentrated in the outermost zones of many zircons, and this Pb is not easily removed by leaching (even in weak HF). Thus, the benefit of removing only the outermost zones (and avoiding mixing of age components) is somewhat compromised by the much higher common Pb contents, which result in less precise age determinations. A very brief abrasion to remove the high common Pb zones prior to collecting material for dating is therefore recommended. © 1990.
Raine, Dan; Langley, Philip; Murray, Alan; Dunuwille, Asunga; Bourke, John P
2004-09-01
The aims of this study were to evaluate (1) principal component analysis as a technique for extracting the atrial signal waveform from the standard 12-lead ECG and (2) its ability to distinguish changes in atrial fibrillation (AF) frequency parameters over time and in response to pharmacologic manipulation using drugs with different effects on atrial electrophysiology. Twenty patients with persistent AF were studied. Continuous 12-lead Holter ECGs were first recorded for 60 minutes in the drug-free state. The mean and variability of the atrial waveform frequency were measured using an automated computer technique, which extracted the atrial signal by principal component analysis and identified the main frequency component using Fourier analysis. Patients were then allotted sequentially to receive 1 of 4 drugs intravenously (amiodarone, flecainide, sotalol, or metoprolol), and the changes induced in the mean and variability of the atrial waveform frequency were measured. The mean and variability of the atrial waveform frequency did not differ within patients between the two 30-minute sections of the drug-free state. As hypothesized, significant changes in the mean and variability of the atrial waveform frequency were detected after manipulation with amiodarone (mean: 5.77 vs 4.86 Hz; variability: 0.55 vs 0.31 Hz), flecainide (mean: 5.33 vs 4.72 Hz; variability: 0.71 vs 0.31 Hz), and sotalol (mean: 5.94 vs 4.90 Hz; variability: 0.73 vs 0.40 Hz), but not with metoprolol (mean: 5.41 vs 5.17 Hz; variability: 0.81 vs 0.82 Hz). A technique for continuously analyzing the atrial frequency characteristics of AF from the surface ECG has thus been developed and validated.
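The extract-then-Fourier step can be sketched as follows. This is a toy illustration of the general idea, not the study's validated pipeline: the synthetic leads, sampling rate, and 4-9 Hz search band are assumptions, and a real system must also handle QRS-T cancellation:

```python
import numpy as np

def dominant_atrial_frequency(leads, fs, band=(4.0, 9.0)):
    """Take the first principal component of multi-lead data (rows = samples,
    cols = leads) and return its dominant frequency within `band` (Hz)."""
    X = leads - leads.mean(axis=0)
    # First principal component via SVD of the centered data matrix
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    pc1 = X @ vt[0]
    spec = np.abs(np.fft.rfft(pc1))
    freqs = np.fft.rfftfreq(len(pc1), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(spec[mask])]
```

Tracking this estimate over successive windows gives the mean and variability of the atrial waveform frequency that the study compares across drugs.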
Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems
NASA Astrophysics Data System (ADS)
Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn
The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
ERIC Educational Resources Information Center
Van Atta, Robert E.; Van Atta, R. Lewis
1980-01-01
Provides a gas chromatography experiment that exercises the quantitative technique of standard addition to the analysis for a minor component, methyl salicylate, in a commercial product, "wintergreen rubbing alcohol." (CS)
Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method
NASA Astrophysics Data System (ADS)
De Waal, Sybrand A.
1996-07-01
A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959 summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.
NASA Astrophysics Data System (ADS)
Chen, Feier; Tian, Kang; Ding, Xiaoxu; Miao, Yuqi; Lu, Chunxia
2016-11-01
Analysis of freight rate volatility characteristics has attracted more attention since 2008 because of the credit crunch and the slowdown in marine transportation. The multifractal detrended fluctuation analysis (MF-DFA) technique is employed to analyze the time series of the Baltic Dry Bulk Freight Rate Index and the market trend for two bulk ship sizes, Capesize and Panamax, over the period March 1, 1999 to February 26, 2015. In this paper, the degree of multifractality at different fluctuation sizes is calculated. In addition, a multifractal detrending moving average (MF-DMA) counting technique is developed to quantify the components of the multifractal spectrum while taking the finite-size effect into consideration. Numerical results show that both the Capesize and Panamax freight rate index time series are of a multifractal nature. The origin of the multifractality in the bulk freight rate market series is found to be mostly nonlinear correlation.
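The MF-DFA fluctuation function at the heart of such an analysis can be sketched compactly. This is the standard algorithm (forward segments only, for brevity), not the authors' code; the segment sizes, detrending order, and q value below are illustrative:

```python
import numpy as np

def mfdfa_fq(x, scales, q=2.0, order=1):
    """Multifractal DFA: fluctuation function F_q(s) of series x for each
    segment size s in `scales`, with polynomial detrending of given order."""
    profile = np.cumsum(x - np.mean(x))           # integrated profile
    fq = []
    for s in scales:
        n_seg = len(profile) // s
        f2 = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            fit = np.polyval(np.polyfit(t, seg, order), t)
            f2.append(np.mean((seg - fit) ** 2))  # variance of detrended segment
        f2 = np.asarray(f2)
        if q == 0:
            fq.append(np.exp(0.5 * np.mean(np.log(f2))))
        else:
            fq.append(np.mean(f2 ** (q / 2.0)) ** (1.0 / q))
    return np.asarray(fq)
```

The generalized Hurst exponent h(q) is the log-log slope of F_q(s) versus s; a spread of h(q) across q values is the signature of multifractality, while for monofractal white noise h(2) stays near 0.5.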
NASA Astrophysics Data System (ADS)
Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan
2017-09-01
Automatic diagnosis of human diseases is mostly achieved through decision support systems, whose performance depends mainly on the selection of the most relevant features. This becomes harder when the dataset contains missing values for some features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology that takes the results of medical tests as input, extracts a reduced-dimensional feature subset, and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using PPCA, which extracts the projection vectors contributing the highest covariance; these projection vectors are used to reduce the feature dimension, and their selection is done through Parallel Analysis (PA). The reduced-dimension feature subset is provided to a radial basis function (RBF) kernel based Support Vector Machine (SVM), which classifies subjects into two categories, Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity on three UCI datasets: Cleveland, Switzerland and Hungarian. The statistical results achieved with the proposed technique are compared with existing research, showing its impact. The proposed technique achieved accuracies of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
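The reduce-then-classify stage can be sketched in a few lines. Plain PCA is used here as a stand-in for PPCA (which additionally handles missing values via an EM fit), the SVM optimization itself is omitted, and the gamma value is illustrative:

```python
import numpy as np

def pca_project(X, n_components):
    """Reduce feature dimension: project rows of X onto the top principal axes."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:n_components].T

def rbf_kernel(A, B, gamma=0.5):
    """RBF (Gaussian) kernel matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2),
    the similarity measure used by an RBF-kernel SVM."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * np.maximum(d2, 0.0))
```

In a full pipeline, the projected training features and an SVM solver built on this kernel would produce the HP/NS decision; only the feature reduction and kernel computation are shown here.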
Dipeptide Sequence Determination: Analyzing Phenylthiohydantoin Amino Acids by HPLC
NASA Astrophysics Data System (ADS)
Barton, Janice S.; Tang, Chung-Fei; Reed, Steven S.
2000-02-01
Amino acid composition and sequence determination, important techniques for characterizing peptides and proteins, are essential for predicting conformation and studying sequence alignment. This experiment presents improved, fundamental methods of sequence analysis for an upper-division biochemistry laboratory. Working in pairs, students use the Edman reagent to prepare phenylthiohydantoin derivatives of amino acids for determination of the sequence of an unknown dipeptide. With a single HPLC technique, students identify both the N-terminal amino acid and the composition of the dipeptide. This method yields good precision of retention times and allows use of a broad range of amino acids as components of the dipeptide. Students learn fundamental principles and techniques of sequence analysis and HPLC.
Quantitative Hydrocarbon Surface Analysis
NASA Technical Reports Server (NTRS)
Douglas, Vonnie M.
2000-01-01
The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.
NASA Technical Reports Server (NTRS)
Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.
1981-01-01
An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.
Analysis of local delaminations caused by angle ply matrix cracks
NASA Technical Reports Server (NTRS)
Salpekar, Satish A.; O'Brien, T. Kevin; Shivakumar, K. N.
1993-01-01
Two different families of graphite/epoxy laminates with similar layups but different stacking sequences, (0/θ/−θ)s and (−θ/θ/0)s, were analyzed using three-dimensional finite element analysis for θ = 15 and 30 degrees. Delaminations were modeled in the −θ/θ interface, bounded by a matrix crack and the stress-free edge. The total strain energy release rate, G, along the delamination front was computed using three different techniques: the virtual crack closure technique (VCCT), the equivalent domain integral (EDI) technique, and a global energy balance technique. The opening fracture mode component of the strain energy release rate, GI, along the delamination front was also computed for various delamination lengths using VCCT. The effect of residual thermal and moisture stresses on G was evaluated.
Computational Fatigue Life Analysis of Carbon Fiber Laminate
NASA Astrophysics Data System (ADS)
Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.
2018-02-01
In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for most of their components. Replacing components that are subjected to static or impact loads is less challenging than replacing components that are subjected to dynamic loading. Replacing components made of composite materials demands many stages of parametric study. One such parametric study is the fatigue analysis of the composite material. This paper focuses on fatigue life analysis of the composite material by using computational techniques. A composite plate with a hole at the center is considered for the study. The analysis is carried out on the (0°/90°/90°/90°/90°)s laminate sequence and the (45°/−45°)2s laminate sequence by using a computer script. The life cycles for the two lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show better fatigue life than angle-ply laminates.
Weight estimation techniques for composite airplanes in general aviation industry
NASA Technical Reports Server (NTRS)
Paramasivam, T.; Horn, W. J.; Ritter, J.
1986-01-01
Currently available weight estimation methods for general aviation airplanes were investigated. New equations with explicit material properties were developed for the weight estimation of aircraft components such as wing, fuselage and empennage. Regression analysis was applied to the basic equations for a data base of twelve airplanes to determine the coefficients. The resulting equations can be used to predict the component weights of either metallic or composite airplanes.
Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background
NASA Astrophysics Data System (ADS)
McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.
2017-12-01
Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high-energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space where the string-induced CMB component has statistical properties distinct from the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ∼ 5 × 10−7 for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.
Demirci, Oguz; Clark, Vincent P; Calhoun, Vince D
2008-02-15
Schizophrenia is diagnosed largely on the basis of behavioral symptoms. Currently, no quantitative, biologically based diagnostic technique has been developed to identify patients with schizophrenia. Classification of individuals into schizophrenia patient and healthy control groups based on quantitative, biologically based data is of great interest to support and refine psychiatric diagnoses. We applied a novel projection pursuit technique to various components obtained with independent component analysis (ICA) of 70 subjects' fMRI activation maps obtained during an auditory oddball task. The validity of the technique was tested with a leave-one-out method, and the detection performance varied between 80% and 90%. The findings suggest that the proposed data reduction algorithm is effective in classifying individuals into schizophrenia and healthy control groups and may eventually prove useful as a diagnostic tool.
Beekman, Alice; Shan, Daxian; Ali, Alana; Dai, Weiguo; Ward-Smith, Stephen; Goldenberg, Merrill
2005-04-01
This study evaluated the effect of the imaginary component of the refractive index on laser diffraction particle size data for pharmaceutical samples. Excipient particles 1-5 microm in diameter (irregular morphology) were measured by laser diffraction. Optical parameters were obtained and verified based on comparison of calculated vs. actual particle volume fraction. Inappropriate imaginary components of the refractive index can lead to inaccurate results, including false peaks in the size distribution. For laser diffraction measurements, obtaining appropriate or "effective" imaginary components of the refractive index was not always straightforward. When the recommended criteria such as the concentration match and the fit of the scattering data gave similar results for very different calculated size distributions, a supplemental technique, microscopy with image analysis, was used to decide between the alternatives. Use of effective optical parameters produced a good match between laser diffraction data and microscopy/image analysis data. The imaginary component of the refractive index can have a major impact on particle size results calculated from laser diffraction data. When performed properly, laser diffraction and microscopy with image analysis can yield comparable results.
Tipping point analysis of atmospheric oxygen concentration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livina, V. N.; Forbes, A. B.; Vaz Martins, T. M.
2015-03-15
We apply tipping point analysis to nine observational oxygen concentration records around the globe, analyse their dynamics and perform projections under possible future scenarios that lead to oxygen deficiency in the atmosphere. The analysis is based on a statistical physics framework with stochastic modelling, in which we represent the observed data as a composition of deterministic and stochastic components estimated from the observed data using Bayesian and wavelet techniques.
Ground-based Observations of Large Solar Flares Precursors
NASA Astrophysics Data System (ADS)
Sheyner, Olga; Smirnova, Anna; Snegirev, Sergey
2010-05-01
An important problem in solar-terrestrial physics is the regular forecasting of solar activity phenomena, which negatively influence human health, operating safety, communications, radar systems and more. The possibility of developing a short-term forecasting technique for geoeffective solar flares is presented in this study. The technique is based on the growth of pulsations of the horizontal component of the geomagnetic field before solar proton flares. Long-period (30-60 minute) pulsations of the H-component of the geomagnetic field are detected for events of different intensity on March 22, 1991, November 4, 2001, and November 17, 2001 using wavelet analysis. The amplitudes of fluctuations of the horizontal component of the geomagnetic field with 30-60 minute periods grow at most of the tested stations during 0.5-3.5 days before the solar flares. The particularities of the spectral components are studied for stations situated at different latitudes. Assumptions about the origin of these precursor fluctuations are made.
Imaging of polysaccharides in the tomato cell wall with Raman microspectroscopy
2014-01-01
Background: The primary cell wall of fruits and vegetables is a structure mainly composed of polysaccharides (pectins, hemicelluloses, cellulose). The polysaccharides are assembled into a network and linked together. It is thought that the proportions of these components in the plant cell wall have an important influence on the mechanical properties of fruits and vegetables. Results: In this study the Raman microspectroscopy technique was introduced for visualizing the distribution of polysaccharides in the fruit cell wall. The methodology of sample preparation, measurement with the Raman microscope and multivariate image analysis is discussed. Single-band imaging (for preliminary analysis) and multivariate image analysis methods (principal component analysis and multivariate curve resolution) were used for the identification and localization of the components in the primary cell wall. Conclusions: Raman microspectroscopy supported by multivariate image analysis methods is useful in distinguishing cellulose and pectins in the tomato cell wall, and shows how localization of biopolymers is possible with minimally prepared samples. PMID:24917885
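The multivariate curve resolution step can be illustrated with a minimal alternating-least-squares sketch (commonly called MCR-ALS). The two Gaussian "spectra" and linear concentration profiles below are invented stand-ins for real Raman data, and the pure-pixel initialization is an assumption for this toy case:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic "Raman spectra" (Gaussian bands) and linear concentration maps
wn = np.linspace(0, 1, 200)
spec = np.vstack([np.exp(-((wn - 0.3) / 0.05) ** 2),
                  np.exp(-((wn - 0.7) / 0.05) ** 2)])      # (2, 200) pure spectra
pos = np.linspace(0, 1, 150)
conc = np.vstack([pos, 1 - pos]).T                          # (150, 2) pixel concentrations
D = conc @ spec + 0.01 * rng.normal(size=(150, 200))        # measured data matrix

# MCR-ALS: alternately solve for concentrations and spectra,
# enforcing non-negativity by clipping (a minimal variant of the method).
# Initialize the spectra from near-pure pixels (first and last positions).
S = np.clip(np.vstack([D[-1], D[0]]), 0, None)              # (2, 200)
for _ in range(100):
    C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)  # (150, 2)
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)        # (2, 200)

# Recovered spectra should match the true ones up to scale/permutation
S_norm = S / np.linalg.norm(S, axis=1, keepdims=True)
true_norm = spec / np.linalg.norm(spec, axis=1, keepdims=True)
match = np.abs(S_norm @ true_norm.T)
print(match.max(axis=1))  # cosine similarity of each recovered spectrum
```

Real applications add constraints (unimodality, closure) and use PCA to choose the number of components; both are omitted here.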
NASA Astrophysics Data System (ADS)
Seo, Jihye; An, Yuri; Lee, Jungsul; Choi, Chulhee
2015-03-01
Indocyanine green (ICG), a near-infrared fluorophore, has been used in visualization of vascular structure and non-invasive diagnosis of vascular disease. Although many imaging techniques have been developed, there are still limitations in the diagnosis of vascular diseases. We have recently developed a minimally invasive diagnostic system based on ICG fluorescence imaging for sensitive detection of vascular insufficiency. In this study, we used principal component analysis (PCA) to examine the ICG spatiotemporal profile and to obtain pathophysiological information from ICG dynamics. Here we demonstrated that the principal components of ICG dynamics in both feet showed significant differences between normal controls and diabetic patients with vascular complications. We extracted the PCA time courses of the first three components and found a distinct pattern in diabetic patients. We propose that PCA of ICG dynamics provides better classification performance than fluorescence intensity analysis. We anticipate that specific features of spatiotemporal ICG dynamics can be useful in the diagnosis of various vascular diseases.
NASA Astrophysics Data System (ADS)
Zhang, Qiong; Peng, Cong; Lu, Yiming; Wang, Hao; Zhu, Kaiguang
2018-04-01
A novel technique is developed to level airborne geophysical data using principal component analysis based on flight-line differences. In this paper, the flight-line difference is introduced to enhance the features of the levelling error in airborne electromagnetic (AEM) data and to improve the correlation between pseudo tie lines. We therefore apply levelling to the flight-line difference data instead of directly to the original AEM data. Pseudo tie lines are selected distributed across the profile direction, avoiding anomalous regions. Since the levelling errors of the selected pseudo tie lines are highly correlated, principal component analysis is applied to extract the local levelling errors by low-order principal component reconstruction. The levelling errors of the original AEM data are then obtained by inverse differencing after spatial interpolation. This levelling method requires neither flying tie lines nor designing a levelling fitting function. Its effectiveness is demonstrated by levelling results for survey data, compared with results from tie-line levelling and flight-line correlation levelling.
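The pipeline described (difference adjacent lines, reconstruct the levelling error from the low-order principal component, then invert the difference) can be sketched on simulated data. The constant per-line offset used below is a deliberate simplification of real levelling error, and the survey geometry is invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n_lines, n_samples = 20, 300

# Smooth geology shared by all flight lines, plus small per-sample noise
x = np.linspace(0, 4 * np.pi, n_samples)
geology = np.sin(x) + 0.3 * np.cos(3 * x)
signal = np.tile(geology, (n_lines, 1)) + 0.02 * rng.normal(size=(n_lines, n_samples))

# Levelling error: a constant offset per flight line
offsets = rng.normal(scale=0.5, size=n_lines)
data = signal + offsets[:, None]

# 1) Flight-line difference suppresses the shared geology and
#    enhances the levelling error
diff = np.diff(data, axis=0)                 # (n_lines-1, n_samples)

# 2) Low-order principal component reconstruction of the difference:
#    the offset differences are constant along each line, i.e. rank-1
U, s, Vt = np.linalg.svd(diff, full_matrices=False)
diff_err = s[0] * np.outer(U[:, 0], Vt[0])   # keep the first component only

# 3) Inverse difference (cumulative sum) recovers the offsets up to a constant
est = np.concatenate([[0.0], np.cumsum(diff_err.mean(axis=1))])
levelled = data - est[:, None]
levelled -= levelled.mean() - signal.mean()  # fix the free constant (known only in simulation)

print(np.max(np.abs((levelled - signal).mean(axis=1))))  # small per-line residual
```

The pseudo-tie-line selection and spatial interpolation steps of the actual method are not reproduced here.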
Rostad, C.E.
2006-01-01
Polar components in fuels may enable differentiation between fuel types or commercial fuel sources. A range of commercial fuels from numerous sources were analyzed by flow injection analysis/electrospray ionization/mass spectrometry without extensive sample preparation, separation, or chromatography. This technique enabled screening for unique polar components at parts per million levels in commercial hydrocarbon products, including a range of products from a variety of commercial sources and locations. Because these polar compounds are unique in different fuels, their presence may provide source information on hydrocarbons released into the environment. This analysis was then applied to mixtures of various products, as might be found in accidental releases into the environment. Copyright © Taylor & Francis Group, LLC.
Ahmadi, Mehdi; Shahlaei, Mohsen
2015-01-01
P2X7 antagonist activity for a set of 49 molecules of the P2X7 receptor antagonists, derivatives of purine, was modeled with the aid of chemometric and artificial intelligence techniques. The activity of these compounds was estimated by means of a combination of principal component analysis (PCA), as a well-known data reduction method, a genetic algorithm (GA), as a variable selection technique, and an artificial neural network (ANN), as a non-linear modeling method. First, a linear regression combined with PCA (principal component regression) was applied to model the structure-activity relationships, and afterwards a combination of the PCA and ANN algorithms was employed to accurately predict the biological activity of the P2X7 antagonists. PCA preserves as much as possible of the information contained in the original data set. The seven PCs most important to the studied activity were selected as the inputs of the ANN by an efficient variable selection method, the GA. The best computational neural network model was a fully-connected, feed-forward model with a 7-7-1 architecture. The developed ANN model was fully evaluated by different validation techniques, including internal and external validation and the chemical applicability domain. All validations showed that the constructed quantitative structure-activity relationship model is robust and satisfactory.
Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C
2018-06-01
The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity. Copyright © 2018 Elsevier Ltd. All rights reserved.
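As a simplified illustration of autoencoder-based dimensionality reduction, the sketch below trains a linear autoencoder by plain gradient descent (the paper's networks are more elaborate and non-linear) on invented synthetic factor data, and compares its reconstruction error with a PCA of the same latent dimension:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "EEM-like" data: 500 samples, 40 channels, 3 latent factors
latent = rng.normal(size=(500, 3))
mix = rng.normal(size=(3, 40))
X = latent @ mix + 0.1 * rng.normal(size=(500, 40))
X -= X.mean(axis=0)

# Linear autoencoder: encoder W1 (40->3), decoder W2 (3->40),
# trained by gradient descent on mean squared reconstruction error
W1 = 0.01 * rng.normal(size=(40, 3))
W2 = 0.01 * rng.normal(size=(3, 40))
lr = 2e-3
for _ in range(3000):
    Z = X @ W1                       # encode to the latent representation
    R = Z @ W2 - X                   # reconstruction residual
    gW2 = Z.T @ R / len(X)           # gradient w.r.t. decoder
    gW1 = X.T @ (R @ W2.T) / len(X)  # gradient w.r.t. encoder
    W1 -= lr * gW1
    W2 -= lr * gW2

ae_err = np.mean((X - (X @ W1) @ W2) ** 2)

# PCA baseline with the same latent dimension
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pca_err = np.mean((X - X @ Vt[:3].T @ Vt[:3]) ** 2)

print(ae_err, pca_err)  # both near the 0.1**2 noise floor
```

A linear autoencoder converges to the same subspace as PCA; the regularization benefit reported in the paper comes from non-linear activations and training constraints not shown here.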
Discriminant analysis of resting-state functional connectivity patterns on the Grassmann manifold
NASA Astrophysics Data System (ADS)
Fan, Yong; Liu, Yong; Jiang, Tianzi; Liu, Zhening; Hao, Yihui; Liu, Haihong
2010-03-01
The functional networks, extracted from fMRI images using independent component analysis, have been demonstrated informative for distinguishing brain states of cognitive functions and neurological diseases. In this paper, we propose a novel algorithm for discriminant analysis of functional networks encoded by spatial independent components. The functional networks of each individual are used as bases for a linear subspace, referred to as a functional connectivity pattern, which facilitates a comprehensive characterization of temporal signals of fMRI data. The functional connectivity patterns of different individuals are analyzed on the Grassmann manifold by adopting a principal angle based subspace distance. In conjunction with a support vector machine classifier, a forward component selection technique is proposed to select independent components for constructing the most discriminative functional connectivity pattern. The discriminant analysis method has been applied to an fMRI based schizophrenia study with 31 schizophrenia patients and 31 healthy individuals. The experimental results demonstrate that the proposed method not only achieves a promising classification performance for distinguishing schizophrenia patients from healthy controls, but also identifies discriminative functional networks that are informative for schizophrenia diagnosis.
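The principal-angle subspace distance underlying the Grassmann-manifold analysis can be computed in a few lines: orthonormalize the two bases, then take the arccosine of the singular values of their product. A small numpy sketch with hand-built subspaces standing in for the independent-component bases:

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles between the column spaces of A and B.

    Orthonormalize each basis with QR; the singular values of
    Qa.T @ Qb are the cosines of the principal angles.
    """
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    cos = np.clip(np.linalg.svd(Qa.T @ Qb, compute_uv=False), -1.0, 1.0)
    return np.arccos(cos)

def grassmann_distance(A, B):
    """Subspace distance on the Grassmann manifold: norm of the angle vector."""
    return np.linalg.norm(principal_angles(A, B))

# Two 2-D subspaces of R^4 sharing exactly one direction
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0], [0.0, 0.0]])
B = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
angles = principal_angles(A, B)
print(angles)  # one angle 0 (shared direction), one angle pi/2
```

In the paper's setting the columns of A and B would be spatial independent-component maps of two subjects, and the distance would feed a kernel classifier.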
González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio
2015-03-01
A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
Separation techniques: Chromatography
Coskun, Ozlem
2016-01-01
Chromatography is an important biophysical technique that enables the separation, identification, and purification of the components of a mixture for qualitative and quantitative analysis. Proteins can be purified based on characteristics such as size and shape, total charge, hydrophobic groups present on the surface, and binding capacity with the stationary phase. Four separation techniques based on molecular characteristics and interaction type use mechanisms of ion exchange, surface adsorption, partition, and size exclusion. Other chromatography techniques are based on the stationary bed, including column, thin layer, and paper chromatography. Column chromatography is one of the most common methods of protein purification. PMID:28058406
A first application of independent component analysis to extracting structure from stock returns.
Back, A D; Weigend, A S
1997-08-01
This paper explores the application of a signal processing technique known as independent component analysis (ICA) or blind source separation to multivariate financial time series such as a portfolio of stocks. The key idea of ICA is to linearly map the observed multivariate time series into a new space of statistically independent components (ICs). We apply ICA to three years of daily returns of the 28 largest Japanese stocks and compare the results with those obtained using principal component analysis. The results indicate that the estimated ICs fall into two categories, (i) infrequent large shocks (responsible for the major changes in the stock prices), and (ii) frequent smaller fluctuations (contributing little to the overall level of the stocks). We show that the overall stock price can be reconstructed surprisingly well by using a small number of thresholded weighted ICs. In contrast, when using shocks derived from principal components instead of independent components, the reconstructed price is less similar to the original one. ICA is shown to be a potentially powerful method of analyzing and understanding driving mechanisms in financial time series. The application to portfolio optimization is described in Chin and Weigend (1998).
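The core ICA step can be sketched with a compact, from-scratch FastICA (symmetric updates, tanh contrast) — a standard textbook variant, not the authors' exact implementation — applied to two synthetic mixed sources that loosely mimic the paper's "large shocks" and "small fluctuations":

```python
import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Symmetric FastICA with a tanh contrast.

    X: (n_signals, n_samples). Returns the unmixing matrix W and the
    whitened data Z, so that W @ Z are the independent components.
    """
    X = X - X.mean(axis=1, keepdims=True)
    # Whitening: decorrelate and scale to unit variance
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n = Z.shape[0]
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        Gp = 1 - G ** 2
        W_new = G @ Z.T / Z.shape[1] - np.diag(Gp.mean(axis=1)) @ W
        # Symmetric decorrelation: W <- (W W^T)^(-1/2) W
        u, _, vt = np.linalg.svd(W_new)
        W = u @ vt
    return W, Z

# Two independent non-Gaussian "return-like" sources, linearly mixed
rng = np.random.default_rng(1)
s1 = rng.laplace(size=5000)          # heavy-tailed shocks
s2 = rng.uniform(-1, 1, size=5000)   # bounded small fluctuations
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.5], [0.3, 1.0]])
X = A @ S

W, Z = fast_ica(X)
Y = W @ Z  # estimated independent components

# Each estimated IC should match one true source up to sign and scale
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.max(axis=1))  # both close to 1
```

PCA applied to the same X would instead return decorrelated but still mixed directions, which is the contrast the paper draws.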
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Huaying, E-mail: zhaoh3@mail.nih.gov; Schuck, Peter, E-mail: zhaoh3@mail.nih.gov
2015-01-01
Global multi-method analysis for protein interactions (GMMA) can increase the precision and complexity of binding studies for the determination of the stoichiometry, affinity and cooperativity of multi-site interactions. The principles and recent developments of biophysical solution methods implemented for GMMA in the software SEDPHAT are reviewed, their complementarity in GMMA is described and a new GMMA simulation tool set in SEDPHAT is presented. Reversible macromolecular interactions are ubiquitous in signal transduction pathways, often forming dynamic multi-protein complexes with three or more components. Multivalent binding and cooperativity in these complexes are often key motifs of their biological mechanisms. Traditional solution biophysical techniques for characterizing the binding and cooperativity are very limited in the number of states that can be resolved. A global multi-method analysis (GMMA) approach has recently been introduced that can leverage the strengths and the different observables of different techniques to improve the accuracy of the resulting binding parameters and to facilitate the study of multi-component systems and multi-site interactions. Here, GMMA is described in the software SEDPHAT for the analysis of data from isothermal titration calorimetry, surface plasmon resonance or other biosensing, analytical ultracentrifugation, fluorescence anisotropy and various other spectroscopic and thermodynamic techniques. The basic principles of these techniques are reviewed and recent advances in view of their particular strengths in the context of GMMA are described. Furthermore, a new feature in SEDPHAT is introduced for the simulation of multi-method data. In combination with specific statistical tools for GMMA in SEDPHAT, simulations can be a valuable step in experimental design.
Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis.
Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris
2017-03-09
Mycotoxins are toxic secondary metabolites of fungi that spoil food and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination, including deoxynivalenol and aflatoxin B1, in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC, an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits utilizing, for the first time, mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1-affected peanuts, at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively.
Yang, Jun-Ho; Yoh, Jack J
2018-01-01
A novel technique is reported for separating overlapping latent fingerprints using chemometric approaches that combine laser-induced breakdown spectroscopy (LIBS) and multivariate analysis. The LIBS technique provides real-time analysis and high-frequency scanning capabilities as well as data on the chemical composition of the overlapping latent fingerprints. These spectra offer valuable information for the classification and reconstruction of overlapping latent fingerprints when appropriate statistical multivariate analysis is applied. The current study employs principal component analysis and partial least squares methods for the classification of latent fingerprints from the LIBS spectra. The technique was successfully demonstrated through a classification study of four distinct latent fingerprints using soft independent modeling of class analogy (SIMCA) and partial least squares discriminant analysis (PLS-DA). The method yielded an accuracy of more than 85% and proved sufficiently robust. Furthermore, through laser scanning analysis at a spatial interval of 125 µm, the overlapping fingerprints were reconstructed as separate two-dimensional forms.
Synchrotron IR microspectroscopy for protein structure analysis: Potential and questions
Yu, Peiqiang
2006-01-01
Synchrotron radiation-based Fourier transform infrared microspectroscopy (S-FTIR) has been developed as a rapid, direct, non-destructive bioanalytical technique. This technique takes advantage of the brightness and small effective source size of synchrotron light and is capable of exploring the molecular chemical make-up within microstructures of a biological tissue, without destruction of inherent structures, at ultra-spatial resolutions within cellular dimensions. To date there has been very little application of this advanced technique to the study of pure protein inherent structure at a cellular level in biological tissues. In this review, a novel approach is introduced to show the potential of the newly developed synchrotron-based analytical technology, which can be used to localize relatively "pure" protein in plant tissues and to reveal protein inherent structure and protein molecular chemical make-up within intact tissue at cellular and subcellular levels. Several complex protein IR spectral data analysis techniques (Gaussian and Lorentzian multi-component peak modeling, univariate and multivariate analysis, principal component analysis (PCA), and hierarchical cluster analysis (CLA)) are employed to reveal features of protein inherent structure and to distinguish protein inherent structure differences between varieties/species and treatments in plant tissues. By using a multi-peak modeling procedure, relative estimates (but not exact determinations) of protein secondary structure can be made for comparison purposes. The arguments for and against multi-peak modeling/fitting procedures for the relative estimation of protein structure are discussed. By using PCA and CLA, the plant molecular structures can be qualitatively separated into groups, statistically, even though the spectral assignments are not known. The synchrotron-based technology provides a new approach for protein structure research in biological tissues at ultra-spatial resolutions.
Two biased estimation techniques in linear regression: Application to aircraft
NASA Technical Reports Server (NTRS)
Klein, Vladislav
1988-01-01
Several ways of detecting and assessing collinearity in measured data are discussed. Because data collinearity usually results in poor least squares estimates, two estimation techniques which can limit the damaging effect of collinearity are presented. These two techniques, principal components regression and mixed estimation, belong to a class of biased estimation techniques. Detection and assessment of data collinearity and the two biased estimation techniques are demonstrated in two examples using flight test data from longitudinal maneuvers of an experimental aircraft. The eigensystem analysis and parameter variance decomposition appeared to be promising tools for collinearity evaluation. The biased estimators were far more accurate than the ordinary least squares estimates.
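Principal components regression, the first of the two biased estimators above, can be sketched in a few lines. The following is a minimal illustration with synthetic data standing in for the flight-test measurements (the regressors, noise levels, and coefficient values are assumptions, not values from the paper): two nearly collinear regressors are compressed onto their dominant principal component before regressing, which stabilizes the estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Two nearly collinear regressors (synthetic stand-in for flight-test data)
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)        # almost identical to x1
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 1.0 * x2 + 0.1 * rng.normal(size=n)

# Center, then eigen-decompose the moment matrix X'X
Xc = X - X.mean(axis=0)
yc = y - y.mean()
eigval, eigvec = np.linalg.eigh(Xc.T @ Xc)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Keep only the dominant principal component (drop the near-degenerate one)
k = 1
Z = Xc @ eigvec[:, :k]                      # scores on retained components
gamma = np.linalg.lstsq(Z, yc, rcond=None)[0]
beta_pcr = eigvec[:, :k] @ gamma            # map back to original regressors

beta_ols = np.linalg.lstsq(Xc, yc, rcond=None)[0]
print("PCR coefficients:", beta_pcr)
print("OLS coefficients:", beta_ols)
```

Because the true coefficient vector here lies along the dominant principal direction, the one-component PCR estimate is close to the truth while avoiding the variance inflation that collinearity causes for ordinary least squares.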
Colour image segmentation using unsupervised clustering technique for acute leukemia images
NASA Astrophysics Data System (ADS)
Halim, N. H. Abd; Mashor, M. Y.; Nasir, A. S. Abdul; Mustafa, N.; Hassan, R.
2015-05-01
Colour image segmentation has become popular in computer vision because segmentation is an important step in most medical image analysis tasks. This paper proposes a comparison between the different colour components of the RGB (red, green, blue) and HSI (hue, saturation, intensity) colour models for segmenting acute leukemia images. First, partial contrast stretching is applied to the leukemia images to improve the visibility of the blast cells. Then, an unsupervised moving k-means clustering algorithm is applied to the various colour components of the RGB and HSI colour models to segment the blast cells from the red blood cells and background regions in the leukemia image. The different colour components of the RGB and HSI colour models have been analyzed to identify the colour component that gives the best segmentation performance. The segmented images are then processed using a median filter and a region growing technique to reduce noise and smooth the images. The results show that segmentation using the saturation component of the HSI colour model is the best at segmenting the nuclei of the blast cells in acute leukemia images, compared to the other colour components of the RGB and HSI colour models.
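The clustering step can be illustrated with plain k-means on a one-dimensional saturation channel; this is a simplified stand-in for the paper's moving k-means variant, and the three synthetic intensity populations below (background, red cells, nuclei) are assumptions, not data from the study.

```python
import numpy as np

def kmeans_1d(x, k=3, iters=50):
    # Plain k-means on a 1-D feature; centers initialized at spread quantiles
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels, centers

# Synthetic "saturation" values: background ~0.1, red cells ~0.4, nuclei ~0.8
rng = np.random.default_rng(0)
sat = np.concatenate([rng.normal(0.1, 0.02, 500),
                      rng.normal(0.4, 0.03, 300),
                      rng.normal(0.8, 0.03, 200)])
labels, centers = kmeans_1d(sat, k=3)
print(np.sort(centers))
```

On a real image the same clustering would run on the per-pixel saturation values, with the cluster of highest saturation taken as the nucleus mask before the median-filter and region-growing post-processing.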
Grindlay, Guillermo; Mora, Juan; Gras, Luis; de Loos-Vollebregt, Margaretha T C
2011-04-08
The analysis of wine is of great importance since wine components strongly determine its stability, organoleptic or nutrition characteristics. In addition, wine analysis is also important to prevent fraud and to assess toxicological issues. Among the different analytical techniques described in the literature, atomic spectrometry has been traditionally employed for elemental wine analysis due to its simplicity and good analytical figures of merit. The scope of this review is to summarize the main advantages and drawbacks of various atomic spectrometry techniques for elemental wine analysis. Special attention is paid to interferences (i.e. matrix effects) affecting the analysis as well as the strategies available to mitigate them. Finally, latest studies about wine speciation are briefly discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
Wavelet packets for multi- and hyper-spectral imagery
NASA Astrophysics Data System (ADS)
Benedetto, J. J.; Czaja, W.; Ehler, M.; Flake, C.; Hirn, M.
2010-01-01
State-of-the-art dimension reduction and classification schemes in multi- and hyper-spectral imaging rely primarily on the information contained in the spectral component. To better capture the joint spatial and spectral data distribution, we combine the Wavelet Packet Transform with the linear dimension reduction method of Principal Component Analysis. Each spectral band is decomposed by means of the Wavelet Packet Transform, and we consider a joint entropy across all the spectral bands as a tool to exploit the spatial information. Dimension reduction is then applied to the Wavelet Packet coefficients. We present examples of this technique for hyper-spectral satellite imaging. We also investigate the role of various shrinkage techniques to model non-linearity in our approach.
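The wavelet-then-PCA combination can be sketched as follows. A single Haar decomposition level in plain NumPy stands in for the full Wavelet Packet Transform, and a random cube stands in for real hyperspectral imagery; every band is decomposed and PCA is then applied across bands to the coefficient matrix.

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar transform (approximation + 3 detail sub-bands)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return np.stack([ll, lh, hl, hh])

rng = np.random.default_rng(0)
cube = rng.normal(size=(8, 16, 16))                   # 8 spectral bands, 16x16 pixels
coeffs = np.array([haar2d(b).ravel() for b in cube])  # one coefficient row per band

# PCA on the wavelet coefficients via SVD of the centred matrix
coeffs -= coeffs.mean(axis=0)
U, s, Vt = np.linalg.svd(coeffs, full_matrices=False)
explained = s**2 / np.sum(s**2)                       # variance fraction per component
print(explained[:3])
```

In practice a library such as PyWavelets would supply the full multi-level wavelet packet tree, and the joint-entropy criterion mentioned in the abstract would select which packet nodes to keep before the PCA step.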
Hybrid 3D reconstruction and image-based rendering techniques for reality modeling
NASA Astrophysics Data System (ADS)
Sequeira, Vitor; Wolfart, Erik; Bovisio, Emanuele; Biotti, Ester; Goncalves, Joao G. M.
2000-12-01
This paper presents a component approach that combines in a seamless way the strong features of laser range acquisition with the visual quality of purely photographic approaches. The relevant components of the system are: (i) panoramic images for distant background scenery where parallax is insignificant; (ii) photogrammetry for background buildings; and (iii) highly detailed laser-based models for the primary environment, the structure of building exteriors, and room interiors. These techniques have a wide range of applications in visualization, virtual reality, cost-effective as-built analysis of architectural and industrial environments, building facilities management, real estate, e-commerce, remote inspection of hazardous environments, TV production, and many others.
Lamberti, Alfredo; Chiesura, Gabriele; Luyckx, Geert; Degrieck, Joris; Kaufmann, Markus; Vanlanduit, Steve
2015-10-26
The measurement of the internal deformations occurring in real-life composite components is a very challenging task, especially for those components that are rather difficult to access. Optical fiber sensors can overcome such a problem, since they can be embedded in the composite materials and serve as in situ sensors. In this article, embedded optical fiber Bragg grating (FBG) sensors are used to analyze the vibration characteristics of two real-life composite components. The first component is a carbon fiber-reinforced polymer automotive control arm; the second is a glass fiber-reinforced polymer aeronautic hinge arm. The modal parameters of both components were estimated by processing the FBG signals with two interrogation techniques: the maximum detection and fast phase correlation algorithms were employed for the demodulation of the FBG signals; the Peak-Picking and PolyMax techniques were instead used for the parameter estimation. To validate the FBG outcomes, reference measurements were performed by means of a laser Doppler vibrometer. The analysis of the results showed that the FBG sensing capabilities were enhanced when the recently-introduced fast phase correlation algorithm was combined with the state-of-the-art PolyMax estimator curve fitting method. In this case, the FBGs provided the most accurate results, i.e. it was possible to fully characterize the vibration behavior of both composite components. When using more traditional interrogation algorithms (maximum detection) and modal parameter estimation techniques (Peak-Picking), some of the modes were not successfully identified.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salud, J.; Lopez, D.O.; Barrio, M.
The experimental two-component phase diagram between the orientationally disordered crystals 2-amino-2-methyl-1,3-propanediol (AMP) and 1,1,1-tris(hydroxymethyl)propane (PG) has been established from room temperature to the liquid state using thermal analysis and X-ray powder diffraction techniques. The intermolecular interactions in the orientationally disordered mixed crystals of the mentioned system and other related two-component systems are discussed by analyzing the evolution of the packing coefficient as a function of the composition. A thermodynamic analysis of the presented phase diagram and the redetermined AMP/NPG (2,2-dimethyl-1,3-propanediol) is reported on the basis of the enthalpy-entropy compensation theory.
NASA Astrophysics Data System (ADS)
Streamer, M.; Bohlsen, T.; Ogmen, Y.
2016-06-01
Eclipsing binary stars are especially valuable for studies of stellar evolution. If pulsating components are also present then the stellar interior can be studied using asteroseismology techniques. We present photometric data and the analysis of the delta Scuti pulsations that we have discovered in five eclipsing binary systems. The systems are: LT Herculis, RZ Microscopii, LY Puppis, V632 Scorpii and V638 Scorpii. The dominant pulsation frequencies range between 13 - 29 cycles per day with semi-amplitudes of 4 - 20 millimagnitudes.
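Extracting a dominant pulsation frequency from a light curve can be sketched with a simple periodogram. The block below uses synthetic, evenly sampled photometry with an assumed 21 cycles-per-day signal (real multi-night ground-based data have gaps and would typically call for a Lomb-Scargle analysis instead).

```python
import numpy as np

# Synthetic light curve: 20 mmag pulsation at 21 cycles/day plus noise
rng = np.random.default_rng(0)
n = 1152                                    # 4 days at 5-minute cadence
t_days = np.arange(n) / 288.0               # 288 samples per day
f_true = 21.0                               # pulsation frequency, cycles/day
mag = (0.020 * np.sin(2 * np.pi * f_true * t_days)
       + 0.005 * rng.normal(size=n))

# Amplitude spectrum via the real FFT (even sampling, so a plain FFT suffices)
spec = np.abs(np.fft.rfft(mag - mag.mean()))
freqs = np.fft.rfftfreq(n, d=1 / 288.0)     # frequency axis in cycles/day
f_peak = freqs[np.argmax(spec)]
print(f_peak)
```

The recovered peak frequency falls in the 13-29 cycles-per-day range reported for the delta Scuti components; in a full analysis the dominant frequency would be prewhitened and the search repeated to find secondary pulsation modes.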
Aguilera, Teodoro; Lozano, Jesús; Paredes, José A.; Álvarez, Fernando J.; Suárez, José I.
2012-01-01
The aim of this work is to propose an alternative way of wine classification and prediction based on an electronic nose (e-nose) combined with Independent Component Analysis (ICA) as a dimensionality reduction technique, Partial Least Squares (PLS) to predict sensorial descriptors, and Artificial Neural Networks (ANNs) for classification purposes. A total of 26 wines from different regions, varieties and elaboration processes have been analyzed with an e-nose and tasted by a sensory panel. Successful results have been obtained in most cases for prediction and classification. PMID:22969387
NASA Astrophysics Data System (ADS)
Kistenev, Yury V.; Borisov, Alexey V.; Kuzmin, Dmitry A.; Bulanova, Anna A.
2016-08-01
The technique of exhaled breath sampling is discussed. The procedure of wavelength auto-calibration is proposed and tested. A comparison of the experimental data with the model absorption spectra of 5% CO2 is conducted. The classification results for three study groups, obtained using support vector machine and principal component analysis methods, are presented.
ERIC Educational Resources Information Center
Sobel, Robert M.; Ballantine, David S.; Ryzhov, Victor
2005-01-01
Industrial application of gas chromatography-mass spectrometry (GC-MS) analysis is a powerful technique that could be used to elucidate components of a complex mixture while offering the benefits of high-precision quantitative analysis. The natural wintergreen oil is examined for its phenol concentration to determine the level of refining…
ERIC Educational Resources Information Center
Meulman, Jacqueline J.; Verboon, Peter
1993-01-01
Points of view analysis, as a way to deal with individual differences in multidimensional scaling, was largely supplanted by the weighted Euclidean model. It is argued that the approach deserves new attention, especially as a technique to analyze group differences. A streamlined and integrated process is proposed. (SLD)
Hawthorne L. Beyer; Jeff Jenness; Samuel A. Cushman
2010-01-01
Spatial information systems (SIS) is a term that describes a wide diversity of concepts, techniques, and technologies related to the capture, management, display and analysis of spatial information. It encompasses technologies such as geographic information systems (GIS), global positioning systems (GPS), remote sensing, and relational database management systems (...
Function modeling: improved raster analysis through delayed reading and function raster datasets
John S. Hogland; Nathaniel M. Anderson; J .Greg Jones
2013-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...
Modal Analysis of Space-rocket Equipment Components
NASA Astrophysics Data System (ADS)
Igolkin, A. A.; Safin, A. I.; Prokofiev, A. B.
2018-01-01
To prevent vibration damage, the natural frequencies and mode shapes of rocket and space technology elements should be analyzed. This paper discusses a modal analysis technique using a carrier platform as an example. The modal analysis was performed using mathematical modeling and a laser vibrometer, and the experimental data were refined using Test.Lab software. As a result of the modal analysis, the amplitude-frequency response of the carrier platform was obtained and the elasticity parameters were refined.
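At its core, such a modal analysis solves a generalized eigenvalue problem for the natural frequencies and mode shapes. A minimal sketch, assuming a hypothetical two-degree-of-freedom lumped mass-spring model (the masses and stiffnesses below are illustrative, not the carrier platform's):

```python
import numpy as np

# Hypothetical 2-DOF lumped model: masses m1, m2 chained by springs k1, k2
m1, m2 = 2.0, 1.0          # kg
k1, k2 = 800.0, 200.0      # N/m
M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])

# Generalized eigenproblem K v = w^2 M v  ->  standard eig of M^-1 K
w2, modes = np.linalg.eig(np.linalg.inv(M) @ K)
w2 = np.sort(w2.real)
freqs_hz = np.sqrt(w2) / (2 * np.pi)       # natural frequencies in Hz
print(freqs_hz)
```

A finite element model of a real structure produces the same K v = ω²M v problem, just with thousands of degrees of freedom; the computed frequencies are then compared against the laser-vibrometer measurements to refine the elasticity parameters.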
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard
1991-01-01
Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional, finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold compression, or 'coining', fabrication technique that shows great potential for future TWT development efforts. For this analysis, a three-dimensional, finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high-efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.
NASA Technical Reports Server (NTRS)
Shalkhauser, Kurt A.; Bartos, Karen F.; Fite, E. B.; Sharp, G. R.
1992-01-01
Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional, finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold compression, or 'coining', fabrication technique that shows great potential for future TWT development efforts. For this analysis, a three-dimensional, finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high-efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures, including an expert system, a probabilistic finite element code, a probabilistic boundary element code, and a fast probability integrator. The NESSUS software system is described. The expert system is included to capture and utilize PSAM knowledge and experience: NESSUS/EXPERT is an interactive, menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are provided.
Classification Techniques for Multivariate Data Analysis.
1980-03-28
analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern… the determinantal equation |B − λW| = 0. The solutions λ_i are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non… The Statistical Package for the Social Sciences (SPSS) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor…
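The record above concerns discriminant-style eigenanalysis: the solutions of |B − λW| = 0 are the eigenvalues of W⁻¹B, where B and W are the between-group and within-group scatter matrices. A sketch with synthetic three-group data (all values assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
# Three groups of 2-D observations with shifted means
groups = [rng.normal(loc=m, size=(40, 2)) for m in ([0, 0], [3, 0], [0, 3])]
X = np.vstack(groups)
grand = X.mean(axis=0)

# Within-group (W) and between-group (B) scatter matrices
W = sum((g - g.mean(0)).T @ (g - g.mean(0)) for g in groups)
B = sum(len(g) * np.outer(g.mean(0) - grand, g.mean(0) - grand) for g in groups)

# Solutions of |B - lambda*W| = 0 are the eigenvalues of W^-1 B
lam = np.sort(np.linalg.eigvals(np.linalg.inv(W) @ B).real)[::-1]
print(lam)   # at most g - 1 = 2 non-zero eigenvalues here
```

Each eigenvalue measures the between-to-within scatter along one discriminant direction, which is why large leading eigenvalues indicate well-separated clusters.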
Ship Speed Retrieval From Single Channel TerraSAR-X Data
NASA Astrophysics Data System (ADS)
Soccorsi, Matteo; Lehner, Susanne
2010-04-01
A method to estimate the speed of a moving ship is presented. The technique, introduced in Kirscht (1998), is extended to marine applications and validated on TerraSAR-X High-Resolution (HR) data. The generation of a sequence of single-look SAR images from a single-channel image corresponds to an image time series with reduced resolution. This allows applying change detection techniques on the time series to evaluate the velocity components of the ship in range and azimuth. The evaluation of the displacement vector of a moving target in consecutive images of the sequence allows the estimation of the azimuth velocity component. The range velocity component is estimated by evaluating the variation of the signal amplitude during the sequence. In order to apply the technique to TerraSAR-X Spot Light (SL) data, a further processing step is needed: the phase has to be corrected as presented in Eineder et al. (2009) due to the SL acquisition mode; otherwise the image sequence cannot be generated. The analysis, validated where possible by the Automatic Identification System (AIS), was performed in the framework of the ESA project MARISS.
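The displacement-between-sub-looks step can be sketched with FFT cross-correlation. The block below uses synthetic speckle-like images, and the pixel spacing and sub-look time separation are assumed illustrative values, not TerraSAR-X parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
img1 = rng.normal(size=(64, 64))           # first sub-look (speckle stand-in)
shift = 3                                  # true azimuth shift in pixels
img2 = np.roll(img1, shift, axis=0)        # later sub-look: target has moved

# Shift estimate via circular FFT cross-correlation
F1, F2 = np.fft.fft2(img1), np.fft.fft2(img2)
xcorr = np.fft.ifft2(F1 * np.conj(F2)).real
peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
est_shift = -peak[0] if peak[0] < 32 else 64 - peak[0]   # unwrap circular lag

pixel_spacing_m = 1.0     # assumed azimuth pixel spacing
dt_s = 0.5                # assumed time between sub-looks
v_azimuth = est_shift * pixel_spacing_m / dt_s
print(est_shift, v_azimuth)
```

Dividing the estimated pixel displacement by the sub-look time separation gives the azimuth velocity component; the range component, as the abstract notes, comes instead from the amplitude variation across the sequence.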
NASA Astrophysics Data System (ADS)
Mahmoudishadi, S.; Malian, A.; Hosseinali, F.
2017-09-01
Image processing techniques in the transform domain are employed as analysis tools for enhancing the detection of mineral deposits. Decomposing the image into important components increases the probability of mineral extraction. In this study, the performance of Principal Component Analysis (PCA) and Independent Component Analysis (ICA) has been evaluated for the visible and near-infrared (VNIR) and shortwave infrared (SWIR) subsystems of ASTER data. Ardestan is located in part of the Central Iranian Volcanic Belt, which hosts many well-known porphyry copper deposits. This research investigated the propylitic and argillic alteration zones and the outer mineralogy zone in part of the Ardestan region. The two approaches were applied to discriminate alteration zones from igneous bedrock using the major absorptions of indicator minerals from the alteration and mineralogy zones in the spectral range of the ASTER bands. Specialized PC components (PC2, PC3 and PC6) were used to identify pyrite and the argillic and propylitic zones, distinguishing them from igneous bedrock in an RGB color composite image. According to the eigenvalues, components 2, 3 and 6 account for 4.26%, 0.9% and 0.09% of the total variance of the data for the Ardestan scene, respectively. For the purpose of discriminating the alteration and mineralogy zones of the porphyry copper deposit from bedrock, the ICA independent components IC2, IC3 and IC6 separate these zones more accurately than the noisier corresponding PCA components. The results of the ICA method also conform to the locations of the lithological units of the Ardestan region.
NASA Astrophysics Data System (ADS)
Martel, Anne L.
2004-04-01
In order to extract quantitative information from dynamic contrast-enhanced MR images (DCE-MRI) it is usually necessary to identify an arterial input function. This is not a trivial problem if there are no major vessels present in the field of view. Most existing techniques rely on operator intervention or use various curve parameters to identify suitable pixels, but these are often specific to the anatomical region or the acquisition method used. They also require the signal from several pixels to be averaged in order to improve the signal-to-noise ratio; however, this introduces errors due to partial volume effects. We have described previously how factor analysis can be used to automatically separate arterial and venous components from DCE-MRI studies of the brain, but although that method works well for single-slice images through the brain when the blood-brain barrier is intact, it runs into problems for multi-slice images with more complex dynamics. This paper will describe a factor analysis method that is more robust in such situations and is relatively insensitive to the number of physiological components present in the data set. The technique is very similar to that used to identify spectral end-members from multispectral remote sensing images.
EXAFS: New tool for study of battery and fuel cell materials
NASA Technical Reports Server (NTRS)
Mcbreen, James; Ogrady, William E.; Pandya, Kaumudi I.
1987-01-01
Extended X-ray absorption fine structure (EXAFS) is a powerful technique for probing the local atomic structure of battery and fuel cell materials. The major advantages of EXAFS are that both the probe and the signal are X-rays and the technique is element selective and applicable to all states of matter. This permits in situ studies of electrodes and determination of the structure of single components in composite electrodes, or even complete cells. EXAFS specifically probes short range order and yields coordination numbers, bond distances, and chemical identity of nearest neighbors. Thus, it is ideal for structural studies of ions in solution and the poorly crystallized materials that are often the active materials or catalysts in batteries and fuel cells. Studies on typical battery and fuel cell components are used to describe the technique and the capability of EXAFS as a structural tool in these applications. Typical experimental and data analysis procedures are outlined. The advantages and limitations of the technique are also briefly discussed.
Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight
NASA Technical Reports Server (NTRS)
Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William
2015-01-01
As the durations and distances involved in human exploration missions increase, the logistics associated with the repair and maintenance becomes more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply as well as the speed at which the crew can abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan and how that demand changes as a result of changes to the system architecture enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly more complex systems by gradually expanding the system boundary.
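The binomial technique in the list above can be written down directly: if identical units fail independently over the mission, loss of function occurs when failures exceed the stock of spares. A minimal sketch with illustrative unit counts and failure probabilities (not values from the paper):

```python
from math import comb

def p_loss_of_function(n_units, p_fail, spares):
    """P(more failures than spares) under i.i.d. unit failures over the mission.

    Binomial heuristic: each of n_units identical components fails
    independently with probability p_fail; the function is lost when
    the number of failures exceeds the number of spares carried.
    """
    p_ok = sum(comb(n_units, k) * p_fail**k * (1 - p_fail)**(n_units - k)
               for k in range(spares + 1))
    return 1.0 - p_ok

# Risk versus spares carried, for 10 units each with 5% mission failure probability
for s in range(4):
    print(s, p_loss_of_function(n_units=10, p_fail=0.05, spares=s))
```

The Monte Carlo and semi-Markov methods compared in the paper relax the assumptions baked into this closed form (constant failure rates, no repair-time dynamics), which is exactly the trade between heuristic speed and modeling fidelity the authors examine.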
Genome-wide selection components analysis in a fish with male pregnancy.
Flanagan, Sarah P; Jones, Adam G
2017-04-01
A major goal of evolutionary biology is to identify the genome-level targets of natural and sexual selection. With the advent of next-generation sequencing, whole-genome selection components analysis provides a promising avenue in the search for loci affected by selection in nature. Here, we implement a genome-wide selection components analysis in the sex-role-reversed Gulf pipefish, Syngnathus scovelli. Our approach involves a double-digest restriction-site associated DNA sequencing (ddRAD-seq) technique, applied to adult females, nonpregnant males, pregnant males, and their offspring. An F_ST comparison of allele frequencies among these groups reveals 47 genomic regions putatively experiencing sexual selection, as well as 468 regions showing a signature of differential viability selection between males and females. A complementary likelihood ratio test identifies similar patterns in the data as the F_ST analysis. Sexual selection and viability selection both tend to favor the rare alleles in the population. Ultimately, we conclude that genome-wide selection components analysis can be a useful tool to complement other approaches in the effort to pinpoint genome-level targets of selection in the wild. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.
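For a single biallelic locus, the F_ST comparison at the heart of a selection components analysis reduces to a heterozygosity calculation. A minimal sketch assuming equal group sizes (the allele frequencies below are illustrative, not pipefish data):

```python
def fst(p_group1, p_group2):
    """Wright's F_ST for one biallelic locus from two group allele frequencies.

    F_ST = (H_T - H_S) / H_T, with expected heterozygosity H = 2p(1 - p)
    and equal group sizes assumed.
    """
    p_bar = (p_group1 + p_group2) / 2
    h_t = 2 * p_bar * (1 - p_bar)                      # total heterozygosity
    h_s = (2 * p_group1 * (1 - p_group1)
           + 2 * p_group2 * (1 - p_group2)) / 2        # mean within-group
    return (h_t - h_s) / h_t if h_t > 0 else 0.0

# e.g. allele frequencies in mothers vs. pregnant males at one SNP
print(fst(0.5, 0.5))   # identical frequencies: no differentiation
print(fst(0.2, 0.8))   # strong differentiation
```

In the genome-wide setting this statistic is computed at every SNP between the life-history groups (e.g. females vs. pregnant males), and outlier regions are flagged as candidate targets of selection.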
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nee, K.; Bryan, S.; Levitskaia, T.
The reliability of chemical processes can be greatly improved by implementing inline monitoring systems. Combining multivariate analysis with non-destructive sensors can enhance the process without interfering with the operation. Here, we present hierarchical models using both principal component analysis and partial least squares analysis developed for different chemical components representative of solvent extraction process streams. A training set of 380 samples and an external validation set of 95 samples were prepared, and near-infrared and Raman spectral data, as well as conductivity under variable temperature conditions, were collected. The results from the models indicate that careful selection of the spectral range is important. By compressing the data through Principal Component Analysis (PCA), we lower the rank of the data set to its most dominant features while maintaining the key principal components to be used in the regression analysis. Within the studied data set, the concentrations of five chemical components were modeled: total nitrate (NO3-), total acid (H+), neodymium (Nd3+), sodium (Na+), and ionic strength (I.S.). The best overall model prediction for each of the species studied used a combined data set comprised of complementary techniques including NIR, Raman, and conductivity. Finally, our study shows that chemometric models are powerful but require a significant amount of carefully analyzed data to capture variations in the chemistry.
Nee, K.; Bryan, S.; Levitskaia, T.; ...
2017-12-28
The reliability of chemical processes can be greatly improved by implementing inline monitoring systems. Combining multivariate analysis with non-destructive sensors can enhance the process without interfering with the operation. Here, we present hierarchical models using both principal component analysis and partial least squares analysis developed for different chemical components representative of solvent extraction process streams. A training set of 380 samples and an external validation set of 95 samples were prepared, and near-infrared and Raman spectral data, as well as conductivity under variable temperature conditions, were collected. The results from the models indicate that careful selection of the spectral range is important. By compressing the data through Principal Component Analysis (PCA), we lower the rank of the data set to its most dominant features while maintaining the key principal components to be used in the regression analysis. Within the studied data set, the concentrations of five chemical components were modeled: total nitrate (NO3-), total acid (H+), neodymium (Nd3+), sodium (Na+), and ionic strength (I.S.). The best overall model prediction for each of the species studied used a combined data set comprised of complementary techniques including NIR, Raman, and conductivity. Finally, our study shows that chemometric models are powerful but require a significant amount of carefully analyzed data to capture variations in the chemistry.
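A partial least squares calibration of the kind used here can be sketched with the classical NIPALS algorithm. The block below is a generic PLS1 sketch on synthetic spectra-like data, not the authors' hierarchical model; the sample counts, dimensions, and coefficients are assumptions.

```python
import numpy as np

def pls1(X, y, n_comp):
    """PLS1 regression via NIPALS: latent directions chosen to covary with y."""
    x_mean, y_mean = X.mean(0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xk.T @ yk
        w = w / np.linalg.norm(w)          # weight vector
        t = Xk @ w                         # scores
        p = Xk.T @ t / (t @ t)             # X loadings
        qk = yk @ t / (t @ t)              # y loading (scalar)
        Xk = Xk - np.outer(t, p)           # deflate X and y
        yk = yk - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.inv(P.T @ W) @ q     # coefficients in original X space
    return B, x_mean, y_mean

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))              # 60 "spectra", 20 "wavelengths"
true_b = np.zeros(20); true_b[:3] = [1.0, -0.5, 2.0]
y = X @ true_b + 0.05 * rng.normal(size=60)

B, x_mean, y_mean = pls1(X, y, n_comp=5)
pred = (X - x_mean) @ B + y_mean
print(np.corrcoef(pred, y)[0, 1])
```

Unlike PCA compression, which picks directions of maximum spectral variance regardless of the analyte, PLS picks latent directions that covary with the concentration being predicted, which is why the two are often combined in hierarchical chemometric models like the one described above.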
Fetal source extraction from magnetocardiographic recordings by dependent component analysis
NASA Astrophysics Data System (ADS)
de Araujo, Draulio B.; Kardec Barros, Allan; Estombelo-Montesco, Carlos; Zhao, Hui; Roque da Silva Filho, A. C.; Baffa, Oswaldo; Wakai, Ronald; Ohnishi, Noboru
2005-10-01
Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
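A one-unit FastICA extraction, one of the established baselines the authors compare against, can be sketched as follows. Synthetic independent random sources stand in for the maternal and fetal signals, and the mixing matrix is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Two independent synthetic sources (stand-ins for maternal/fetal signals)
s1 = rng.laplace(size=n)                   # super-Gaussian source
s2 = rng.uniform(-1.73, 1.73, size=n)      # sub-Gaussian source
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])     # unknown sensor mixing
X = A @ S                                  # observed mixtures

# Whitening: rotate/scale mixtures to unit covariance
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit FastICA fixed-point iteration with the tanh nonlinearity
w = np.array([1.0, 0.5]); w /= np.linalg.norm(w)
for _ in range(300):
    wx = w @ Xw
    g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
    w = (Xw * g).mean(axis=1) - g_prime.mean() * w
    w /= np.linalg.norm(w)
est = w @ Xw

# The extracted component should match one source up to sign and scale
corrs = [abs(np.corrcoef(est, s)[0, 1]) for s in (s1, s2)]
print(max(corrs))
```

The authors' time-delay variant replaces this non-Gaussianity criterion with temporal structure from the autocorrelation function, which is what lets it target the quasi-periodic fetal component specifically rather than whichever component happens to be most non-Gaussian.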
Enrollment Projection within a Decision-Making Framework.
ERIC Educational Resources Information Center
Armstrong, David F.; Nunley, Charlene Wenckowski
1981-01-01
Two methods used to predict enrollment at Montgomery College in Maryland are compared and evaluated, and the administrative context in which they are used is considered. The two methods involve time series analysis (curve fitting) and indicator techniques (yield from components). (MSE)
Incorporating principal component analysis into air quality model evaluation
The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Princi...
Survey of aircraft electrical power systems
NASA Technical Reports Server (NTRS)
Lee, C. H.; Brandner, J. J.
1972-01-01
Areas investigated include: (1) load analysis; (2) power distribution, conversion techniques and generation; (3) design criteria and performance capabilities of hydraulic and pneumatic systems; (4) system control and protection methods; (5) component and heat transfer systems cooling; and (6) electrical system reliability.
von Maltzahn, Nadine Freifrau; Holstermann, Jan; Kohorst, Philipp
2016-08-01
The adhesive connection between the titanium base and the zirconia coping of two-part abutments may be responsible for failures. High mechanical stability between the two components is essential for long-term success. The aim of the present in vitro study was to evaluate the influence of different surface modification techniques and resin-based luting agents on the retention forces between the titanium and zirconia components of two-part implant abutments. A total of 120 abutments with a titanium base bonded to a zirconia coping were investigated. Two resin-based luting agents (Panavia F 2.0 and RelyX Unicem) and six surface modifications were used to fix these components, resulting in 12 test groups (n = 10). The surfaces of the test specimens were mechanically pretreated with aluminium oxide blasting in combination with the application of two surface-activating primers (Alloy Primer, Clearfil Ceramic Primer) or a tribological conditioning (Rocatec), respectively. All specimens underwent 10,000 thermal cycles between 5°C and 55°C in a moist environment. A pull-off test was then conducted to determine retention forces between the titanium and zirconia components, and statistical analysis was performed (two-way ANOVA). Finally, fracture surfaces were analyzed by light and scanning electron microscopy. No significant differences were found between Panavia F 2.0 and RelyX Unicem. However, the retention forces were significantly influenced by the surface modification technique used (p < 0.001). For both luting agents, the highest retention forces were found when the adhesion surfaces of both the titanium bases and the zirconia copings were pretreated with aluminium oxide blasting combined with the application of Clearfil Ceramic Primer. Surface modification techniques crucially influence the retention forces between titanium and zirconia components in two-part implant abutments. All adhesion surfaces should be pretreated by sandblasting.
Moreover, a phosphate-based primer serves to enhance long-term retention of the components. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Yamazaki, Takaharu; Futai, Kazuma; Tomita, Tetsuya; Sato, Yoshinobu; Yoshikawa, Hideki; Tamura, Shinichi; Sugamoto, Kazuomi
2011-03-01
To achieve 3D kinematic analysis of total knee arthroplasty (TKA), 2D/3D registration techniques, which use X-ray fluoroscopic images and a computer-aided design (CAD) model of the knee implant, have attracted attention in recent years. These techniques can provide information on the movement of the radiopaque femoral and tibial components, but not on the radiolucent polyethylene insert, because the insert silhouette does not appear clearly on the X-ray image. It has therefore been difficult to obtain the 3D kinematics of the polyethylene insert, particularly a mobile-bearing insert that moves on the tibial component. This study presents a technique for 3D kinematic analysis of the mobile-bearing insert in TKA using X-ray fluoroscopy, assesses its accuracy, and reports clinical applications. To estimate the 3D pose of the mobile-bearing insert, tantalum beads and a CAD model incorporating those beads are utilized, and the 3D pose of the insert model is estimated using a feature-based 2D/3D registration technique. To validate the accuracy of the present technique, experiments including a computer simulation test were performed. The results showed that the pose estimation accuracy was sufficient for analyzing mobile-bearing TKA kinematics (RMS error: about 1.0 mm and 1.0 degree). In the clinical application, seven patients with mobile-bearing TKA were studied and analyzed during deep knee bending motion. The present technique enables a better understanding of mobile-bearing TKA kinematics, and this type of evaluation should be helpful for improving implant design and optimizing TKA surgical techniques.
Slonecker, E.T.; Tilley, J.S.
2004-01-01
The percentage of impervious surface area in a watershed has been widely recognized as a key indicator of terrestrial and aquatic ecosystem condition. Although use of the impervious indicator is widespread, there is currently no consistent or mutually accepted method of computing impervious area, and the approaches of various commonly used techniques vary widely. Further, we do not have reliable information on the components of impervious surfaces, which would be critical in any future planning attempts to remediate problems associated with impervious surface coverage. In cooperation with the USGS Geographic Analysis and Monitoring Program (GAM) and The National Map, and the EPA Landscape Ecology Program, this collaborative research project utilized very high resolution imagery and GIS techniques to map and quantify the individual components of total impervious area in six urban/suburban watersheds in different parts of the United States. These data served as ground reference, or "truth," for the evaluation of four techniques used to compute impervious area. The results reveal some important aspects of the component make-up of impervious cover and the variability of methods commonly used to compile this critical emerging indicator of ecosystem condition. © 2004 by V. H. Winston and Sons, Inc. All rights reserved.
Use of Raman spectroscopy in the analysis of nickel allergy
NASA Astrophysics Data System (ADS)
Alda, Javier; Castillo-Martinez, Claudio; Valdes-Rodriguez, Rodrigo; Hernández-Blanco, Diana; Moncada, Benjamin; González, Francisco J.
2013-06-01
Raman spectra of the skin of subjects with nickel allergy are analyzed and compared to the spectra of healthy subjects to detect possible biochemical differences in the structure of the skin that could help diagnose metal allergies in a noninvasive manner. Results show differences between the two groups of Raman spectra. These spectral differences can be classified using principal component analysis. Based on these findings, a novel computational technique to make a fast evaluation and classification of the Raman spectra of the skin is presented and proposed as a noninvasive technique for the detection of nickel allergy.
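The PCA-based classification step described above can be sketched as follows; the synthetic "spectra" (a single Gaussian band whose position differs between groups) and all parameters are invented stand-ins for real Raman data:

```python
import numpy as np

rng = np.random.default_rng(1)
wavenumbers = np.linspace(400, 1800, 300)

def band(shift):
    # one Gaussian band; its position is shifted in the "allergy" group
    return np.exp(-(((wavenumbers - 1000 - shift) / 30) ** 2))

healthy = np.array([band(0) + 0.05 * rng.standard_normal(300) for _ in range(20)])
allergic = np.array([band(25) + 0.05 * rng.standard_normal(300) for _ in range(20)])
X = np.vstack([healthy, allergic])
labels = np.array([0] * 20 + [1] * 20)

# PCA via SVD of the mean-centred spectra
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T            # first two principal-component scores

# the groups separate in PC space; classify by nearest class centroid
c0 = scores[labels == 0].mean(axis=0)
c1 = scores[labels == 1].mean(axis=0)
pred = (np.linalg.norm(scores - c1, axis=1)
        < np.linalg.norm(scores - c0, axis=1)).astype(int)
accuracy = (pred == labels).mean()
```

The spectral difference loads almost entirely onto the first components, so even a trivial centroid rule separates the two groups in the reduced space.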
The interactive astronomical data analysis facility: application of image enhancement techniques to Comet Halley
NASA Astrophysics Data System (ADS)
Klinglesmith, D. A.
1981-10-01
A PDP 11/40 computer is at the heart of a general-purpose interactive data analysis facility designed to permit easy access to data in both visual imagery and graphic representations. The major components consist of: the 11/40 CPU with 256K bytes of 16-bit memory; two TU10 tape drives; 20 million bytes of disk storage; three user terminals; and the COMTAL image processing display system. The application of image enhancement techniques to two sequences of photographs of Comet Halley taken in Egypt in 1910 provides evidence for eruptions from the comet's nucleus.
Precise measurement of the half-life of the Fermi β decay of ²⁶Alᵐ
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Rebecca J.; Thompson, Maxwell N.; Rassool, Roger P.
2011-08-15
State-of-the-art signal digitization and analysis techniques have been used to measure the half-life of the Fermi β decay of ²⁶Alᵐ. The half-life was determined to be 6347.8 ± 2.5 ms. This new datum contributes to the experimental testing of the conserved-vector-current hypothesis and the required unitarity of the Cabibbo-Kobayashi-Maskawa matrix: two essential components of the standard model. A detailed discussion of the experimental techniques and data analysis and a thorough investigation of the statistical and systematic uncertainties are presented.
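As a toy illustration of how a half-life is extracted from decay data, the sketch below fits simulated Poisson counts with a log-linear least-squares fit; the count levels and time grid are invented, and the actual analysis behind the quoted 2.5 ms uncertainty is far more elaborate:

```python
import numpy as np

rng = np.random.default_rng(2)
true_half_life = 6347.8                 # ms, the value reported above
lam = np.log(2) / true_half_life        # decay constant

t = np.linspace(0.0, 30000.0, 200)      # ms, invented time bins
expected = 1e6 * np.exp(-lam * t)       # invented initial activity
counts = rng.poisson(expected)          # counting statistics

# log-linear least squares: ln N(t) = ln N0 - lambda * t
slope, intercept = np.polyfit(t, np.log(counts), 1)
half_life_est = -np.log(2) / slope
```

With counts this large the simple unweighted fit already recovers the half-life to well within a few milliseconds; real precision measurements weight the fit and treat dead-time and background systematically.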
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1994-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods that traditionally employ CFC-113. The new patent-pending precision cleaning verification technique is for large components of cryogenic fluid systems. These are stainless steel, sand-cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high-velocity, low-volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle, where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision-cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 °C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.
NASA Technical Reports Server (NTRS)
Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.
1995-01-01
NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods that traditionally employ CFC-113. The new patent-pending precision cleaning verification technique is for large components of cryogenic fluid systems. These are stainless steel, sand-cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m². Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high-velocity, low-volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle, where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision-cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 °C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has units of ppm of carbon per mg/ft² of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m².
Cejnar, Pavel; Kuckova, Stepanka; Prochazka, Ales; Karamonova, Ludmila; Svobodova, Barbora
2018-06-15
Explorative statistical analysis of mass spectrometry data is still a time-consuming step. We analyzed critical factors for the application of principal component analysis (PCA) in mass spectrometry, focusing on two whole-spectrum-based normalization techniques and their application in the analysis of registered peak data and, in comparison, in full-spectrum data analysis. We used this approach to identify different metabolic patterns in bacterial cultures of Cronobacter sakazakii, an important foodborne pathogen. Two software utilities were implemented: ms-alone, a Python-based utility for mass spectrometry data preprocessing and peak extraction, and the multiMS-toolbox, an R tool for advanced peak registration and detailed explorative statistical analysis. The bacterial culture of Cronobacter sakazakii was cultivated on Enterobacter sakazakii Isolation Agar, Blood Agar Base and Tryptone Soya Agar for 24 h and 48 h and applied by the smear method on an Autoflex Speed MALDI-TOF mass spectrometer. For the three tested cultivation media, only two different metabolic patterns of Cronobacter sakazakii were identified using PCA applied to data normalized by the two normalization techniques. Matched peak data and subsequent detailed full-spectrum analysis likewise identified only two different metabolic patterns: cultivation on Enterobacter sakazakii Isolation Agar showed significant differences from cultivation on the other two tested media. The metabolic patterns for all tested cultivation media also proved to depend on cultivation time. Both whole-spectrum-based normalization techniques, together with full-spectrum PCA, allow identification of important discriminative factors in experiments with several variable condition factors, avoiding problems with improper identification of peaks or undue emphasis on below-threshold peak data. The amount of processed data remains manageable.
Both implemented software utilities are available free of charge from http://uprt.vscht.cz/ms. Copyright © 2018 John Wiley & Sons, Ltd.
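The whole-spectrum normalization idea can be sketched as follows; the abstract does not name the two techniques, so total-ion-current (TIC) and median normalization are shown here as representative assumptions, applied to invented spectra:

```python
import numpy as np

rng = np.random.default_rng(3)
# invented spectra sharing one peak pattern but acquired at different
# absolute intensity scales (a common acquisition artefact)
pattern = rng.random(500)
spectra = np.array([pattern * s + 0.01 * rng.standard_normal(500)
                    for s in (0.5, 1.0, 2.0)])

# whole-spectrum normalizations: each spectrum is rescaled by a single
# factor computed from the entire spectrum, not from selected peaks
tic = spectra / spectra.sum(axis=1, keepdims=True)       # TIC: rows sum to 1
med = spectra / np.median(spectra, axis=1, keepdims=True)  # median scaling
```

After either normalization the three spectra nearly coincide, so a subsequent PCA responds to genuine compositional differences rather than to acquisition scale.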
Fringe pattern demodulation with a two-frame digital phase-locked loop algorithm.
Gdeisat, Munther A; Burton, David R; Lalor, Michael J
2002-09-10
A novel technique called a two-frame digital phase-locked loop for fringe pattern demodulation is presented. In this scheme, two fringe patterns with different spatial carrier frequencies are grabbed for an object. A digital phase-locked loop algorithm tracks and demodulates the phase difference between both fringe patterns by employing the wrapped phase components of one of the fringe patterns as a reference to demodulate the second fringe pattern. The desired phase information can be extracted from the demodulated phase difference. We tested the algorithm experimentally using real fringe patterns. The technique is shown to be suitable for noncontact measurement of objects with rapid surface variations, and it outperforms the Fourier fringe analysis technique in this aspect. Phase maps produced with this algorithm are noisy in comparison with phase maps generated with the Fourier fringe analysis technique.
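The carrier-removal step underlying both this method and Fourier fringe analysis can be sketched in one dimension; the fringe, carrier frequency and surface phase below are invented, and this is the standard one-sided-FFT demodulation rather than the authors' two-frame DPLL:

```python
import numpy as np

N = 512
x = np.arange(N)
f0 = 16.0 / N                               # spatial carrier frequency
phi = 2.0 * np.sin(2 * np.pi * x / N)       # invented surface phase
fringe = np.cos(2 * np.pi * f0 * x + phi)   # carrier fringe pattern

# analytic signal via a one-sided FFT (the Fourier fringe analysis step)
F = np.fft.fft(fringe)
F[N // 2:] = 0
analytic = 2.0 * np.fft.ifft(F)

# removing the carrier leaves the wrapped phase; unwrapping restores phi
wrapped = np.angle(analytic * np.exp(-2j * np.pi * f0 * x))
recovered = np.unwrap(wrapped)
```

A phase-locked loop performs the same tracking recursively in the spatial domain, which is what lets the two-frame scheme follow rapid surface variations that defeat the global Fourier filter.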
SCGICAR: Spatial concatenation based group ICA with reference for fMRI data analysis.
Shi, Yuhu; Zeng, Weiming; Wang, Nizhuan
2017-09-01
With the rapid development of big data, functional magnetic resonance imaging (fMRI) data analysis of multiple subjects is becoming more and more important. As a blind source separation technique, group independent component analysis (GICA) has been widely applied to multi-subject fMRI data analysis. However, spatially concatenated GICA is rarely used compared with temporally concatenated GICA because of its disadvantages. In this paper, to overcome these issues, and considering that the ability of GICA for fMRI data analysis can be improved by adding a priori information, we propose a novel spatial concatenation based GICA with reference (SCGICAR) method that takes advantage of the a priori information extracted from the group subjects; a multi-objective optimization strategy is then used to implement this method. Finally, the post-processing techniques of principal component analysis and anti-reconstruction are used to obtain the group spatial component and the individual temporal components within the group, respectively. The experimental results show that the proposed SCGICAR method performs better on both single-subject and multi-subject fMRI data analysis than classical methods. It not only detects more accurate spatial and temporal components for each subject of the group, but also obtains a better group component in both the temporal and spatial domains. These results demonstrate that the proposed SCGICAR method has advantages over classical methods and better reflects the commonality of subjects in the group. Copyright © 2017 Elsevier B.V. All rights reserved.
Brown, C. Erwin
1993-01-01
Correlation analysis, in conjunction with principal-component and multiple-regression analyses, was applied to laboratory chemical and petrographic data to assess the usefulness of these techniques in evaluating selected physical and hydraulic properties of carbonate-rock aquifers in central Pennsylvania. Correlation and principal-component analyses were used to establish relations and associations among variables, to determine the dimensions of property variation of samples, and to filter out variables containing similar information. Principal-component and correlation analyses showed that porosity is related to the other measured variables and that permeability is most related to porosity and grain size. Four principal components were found to be significant in explaining the variance of the data. Stepwise multiple-regression analysis was used to see how well the measured variables could predict porosity and (or) permeability for this suite of rocks. The variation in permeability and porosity is not totally predicted by the other variables, but the regression is significant at the 5% level. © 1993.
NASA Astrophysics Data System (ADS)
Fanood, Mohammad M. Rafiee; Ram, N. Bhargava; Lehmann, C. Stefan; Powis, Ivan; Janssen, Maurice H. M.
2015-06-01
Simultaneous, enantiomer-specific identification of chiral molecules in multi-component mixtures is extremely challenging. Many established techniques for single-component analysis fail to provide selectivity in multi-component mixtures and lack sensitivity for dilute samples. Here we show how enantiomers may be differentiated by mass-selected photoelectron circular dichroism using an electron-ion coincidence imaging spectrometer. As proof of concept, vapours containing ~1% of two chiral monoterpene molecules, limonene and camphor, are irradiated by a circularly polarized femtosecond laser, resulting in multiphoton near-threshold ionization with little molecular fragmentation. Large chiral asymmetries (2-4%) are observed in the mass-tagged photoelectron angular distributions. These asymmetries switch sign according to the handedness (R- or S-) of the enantiomer in the mixture and scale with enantiomeric excess of a component. The results demonstrate that mass spectrometric identification of mixtures of chiral molecules and quantitative determination of enantiomeric excess can be achieved in a table-top instrument.
Fanood, Mohammad M Rafiee; Ram, N. Bhargava; Lehmann, C. Stefan; Powis, Ivan; Janssen, Maurice H. M.
2015-01-01
Simultaneous, enantiomer-specific identification of chiral molecules in multi-component mixtures is extremely challenging. Many established techniques for single-component analysis fail to provide selectivity in multi-component mixtures and lack sensitivity for dilute samples. Here we show how enantiomers may be differentiated by mass-selected photoelectron circular dichroism using an electron–ion coincidence imaging spectrometer. As proof of concept, vapours containing ∼1% of two chiral monoterpene molecules, limonene and camphor, are irradiated by a circularly polarized femtosecond laser, resulting in multiphoton near-threshold ionization with little molecular fragmentation. Large chiral asymmetries (2–4%) are observed in the mass-tagged photoelectron angular distributions. These asymmetries switch sign according to the handedness (R- or S-) of the enantiomer in the mixture and scale with enantiomeric excess of a component. The results demonstrate that mass spectrometric identification of mixtures of chiral molecules and quantitative determination of enantiomeric excess can be achieved in a table-top instrument. PMID:26104140
Socaci, Sonia A; Socaciu, Carmen; Tofană, Maria; Raţi, Ioan V; Pintea, Adela
2013-01-01
The health benefits of sea buckthorn (Hippophae rhamnoides L.) are well documented, owing to its rich content of bioactive phytochemicals (pigments, phenolics and vitamins) as well as volatiles responsible for its specific flavour and bacteriostatic action. The volatile compounds are good biomarkers of berry freshness, quality and authenticity. The objective was to develop a fast and efficient GC-MS method, including a minimal sample preparation technique (in-tube extraction, ITEX), for the discrimination of sea buckthorn varieties based on their chromatographic volatile fingerprint. Twelve sea buckthorn varieties (wild and cultivated) were collected from forestry departments and experimental fields, respectively. The extraction of volatile compounds was performed using the ITEX technique, whereas separation and identification were performed using a GC-MS QP-2010. Principal component analysis (PCA) was applied to discriminate the differences in sample composition. Using GC-MS analysis of the headspace of sea buckthorn samples, 46 volatile compounds were separated and 43 identified. The most abundant derivatives were the ethyl esters of 2-methylbutanoic acid, 3-methylbutanoic acid, hexanoic acid, octanoic acid and butanoic acid, as well as 3-methylbutyl 3-methylbutanoate, 3-methylbutyl 2-methylbutanoate and benzoic acid ethyl ester (over 80% of all volatile compounds). Principal component analysis showed that the first two components explain 79% of the data variance, demonstrating good discrimination between samples. A reliable, fast and eco-friendly ITEX/GC-MS method was applied to fingerprint the volatile profile and to discriminate between wild and cultivated sea buckthorn berries originating from the Carpathians, with relevance to food science and technology. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Liang, B.; Iwnicki, S. D.; Zhao, Y.
2013-08-01
The power spectrum is defined as the square of the magnitude of the Fourier transform (FT) of a signal. The advantage of FT analysis is that it allows the decomposition of a signal into individual periodic frequency components and establishes the relative intensity of each component. It is the most commonly used signal processing technique today. If the same principle is applied to detect periodic components in a Fourier spectrum, the process is called cepstrum analysis. Cepstrum analysis is a very useful tool for detecting families of harmonics with uniform spacing, or the families of sidebands commonly found in gearbox, bearing and engine vibration fault spectra. Higher-order spectra (HOS), also known as polyspectra, consist of higher-order moments of spectra, which are able to detect non-linear interactions between frequency components. The most commonly used HOS is the bispectrum, a third-order frequency-domain measure that contains information standard power spectral analysis techniques cannot provide. It is well known that neural networks can represent complex non-linear relationships, which makes them extremely useful for fault identification and classification. This paper presents an application of the power spectrum, cepstrum, bispectrum and neural networks to fault pattern extraction for induction motors, and examines their potential for differentiating between healthy and faulty operation. A series of experiments was performed, and the advantages and disadvantages of the methods are discussed. It was found that a combination of power spectrum, cepstrum and bispectrum analyses plus a neural network could be a very useful tool for condition monitoring and fault diagnosis of induction motors.
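The power-spectrum and cepstrum computations can be sketched as follows; the 50 Hz harmonic family and noise level are invented stand-ins for a real motor vibration signal:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(4)
# invented vibration signal: a family of harmonics of 50 Hz, as produced
# by a periodic fault, buried in broadband noise
sig = sum(np.cos(2 * np.pi * 50 * k * t) for k in range(1, 10))
sig = sig + 0.01 * rng.standard_normal(t.size)

# power spectrum: squared magnitude of the Fourier transform
spec = np.abs(np.fft.rfft(sig)) ** 2

# real cepstrum: inverse transform of the log power spectrum; uniformly
# spaced harmonics collapse into a "rahmonic" peak at quefrency
# 1/(50 Hz) = 0.02 s, i.e. sample index fs/50 = 20
cep = np.fft.irfft(np.log(spec + 1e-12))
rahmonic = cep[int(fs / 50)]
```

The single cepstral peak summarizes the whole harmonic family, which is why the cepstrum is convenient for gearbox and bearing spectra where many equally spaced lines appear at once.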
Maione, Camila; Barbosa, Rommel Melgaço
2018-01-24
Rice is one of the most important staple foods around the world. Authentication of rice is one of the most frequently addressed concerns in the present literature, covering recognition of geographical origin and variety, certification of organic rice, and many other issues. Good results have been achieved by multivariate data analysis and data mining techniques combined with specific parameters for ascertaining authenticity and many other useful characteristics of rice, such as quality and yield. This paper reviews recent research on discrimination and authentication of rice using multivariate data analysis and data mining techniques. We found that data obtained from image processing, molecular and atomic spectroscopy, elemental fingerprinting, genetic markers, molecular content and other sources are promising carriers of information on the geographical origin, variety and other aspects of rice, and are widely used in combination with multivariate data analysis techniques. Principal component analysis and linear discriminant analysis are the preferred methods, but several other classification techniques, such as support vector machines and artificial neural networks, are also frequently present and show high performance for discrimination of rice.
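The linear discriminant step favoured in these studies can be sketched with a two-class Fisher discriminant; the "elemental fingerprints", class means and covariances below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
# invented elemental fingerprints (3 elements) for rice of two origins
origin_a = rng.multivariate_normal([1.0, 2.0, 0.5], 0.02 * np.eye(3), 30)
origin_b = rng.multivariate_normal([1.4, 1.6, 0.7], 0.02 * np.eye(3), 30)

# Fisher linear discriminant: w = Sw^{-1} (mu_a - mu_b)
mu_a, mu_b = origin_a.mean(axis=0), origin_b.mean(axis=0)
Sw = np.cov(origin_a.T) + np.cov(origin_b.T)   # within-class scatter
w = np.linalg.solve(Sw, mu_a - mu_b)
threshold = w @ (mu_a + mu_b) / 2.0

# samples projecting above the midpoint are assigned to origin A
acc_a = (origin_a @ w > threshold).mean()
acc_b = (origin_b @ w <= threshold).mean()
accuracy = (acc_a + acc_b) / 2.0
```

The discriminant direction weights each element by how well it separates the classes relative to their within-class scatter, which is the property that makes LDA attractive for origin discrimination.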
Lapthorn, Cris; Pullen, Frank
2009-01-01
The performance of the direct analysis in real time (DART) technique was evaluated across a range of metastable gas temperatures for a pharmaceutical compound, Voriconazole, in order to investigate the effect of metastable gas temperature on molecular ion intensity and fragmentation. The DART source has been used to analyse a range of analytes from a range of matrices, including drugs in solid tablet form and in preparations, active ingredients in ointments, naturally occurring plant alkaloids, and flavours and fragrances, from thin-layer chromatography (TLC) plates, melting point tubes and biological matrices including hair, urine and blood. The advantages of this technique include rapid analysis time (as little as 5 s), reduced sample preparation requirements, elimination of the mobile phase and the ability to analyse samples not typically amenable to atmospheric pressure ionisation (API) techniques. This technology has therefore been proposed as an everyday tool for identification of components in crude organic reaction mixtures.
Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques
NASA Astrophysics Data System (ADS)
Gulgundi, Mohammad Shahid; Shetty, Amba
2018-03-01
Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify their sources in the western half of Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer drinking-water quality than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to groundwater quality data measured on 14 parameters at 67 sites distributed across the city. Hierarchical cluster analysis grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lesser pollution. Discriminant analysis was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, yielding three factors in each cluster that explain 85.4% and 84% of the total variance, respectively. Varifactors obtained from the principal component analysis showed that groundwater quality variation is mainly explained by dissolution of minerals from rock-water interactions in the aquifer, the effect of anthropogenic activities and ion exchange processes in water.
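The station-grouping step can be sketched with a simple two-cluster k-means, used here as a stand-in for the hierarchical clustering in the study; the standardized water-quality data are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
# invented standardized data: 67 wells x 3 parameters (e.g. Mg, Cl, NO3);
# the last 27 wells simulate a higher pollution loading
clean = rng.normal(0.0, 0.5, size=(40, 3))
polluted = rng.normal(2.0, 0.5, size=(27, 3))
X = np.vstack([clean, polluted])

# two-cluster k-means (Lloyd's algorithm): alternate between assigning
# each well to its nearest centroid and recomputing the centroids
centroids = X[[0, -1]].copy()
for _ in range(20):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([X[labels == j].mean(axis=0) for j in range(2)])
```

With well-separated pollution signatures the partition stabilizes in a few iterations, reproducing the high- and low-pollution groups that the hierarchical method found.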
Singh, Vineeta; Gupta, Atul Kumar; Singh, S. P.; Kumar, Anil
2012-01-01
Cinnamomum tamala Nees & Eberm. is an important traditional medicinal plant, mentioned in ancient literature such as the Ayurveda. Several of its medicinal properties have recently been proved. To characterize diversity in the metabolite profiles of Cinnamomum tamala genotypes, a newly emerging mass spectrometric ionization technique, direct analysis in real time (DART), is very helpful. The DART ion source has been used to analyze an extremely wide range of phytochemicals present in leaves of Cinnamomum tamala. Ten genotypes were assessed for the presence of different phytochemicals. Phytochemical analysis showed the presence of mainly terpenes and phenols, and these constituents vary among the genotypes of Cinnamomum tamala. Principal component analysis has also been employed to analyze the DART data of these Cinnamomum genotypes. The results show that genotypes of Cinnamomum tamala can be differentiated using DART MS data. The active components present in Cinnamomum tamala may contribute significantly to the high antioxidant capacity of its leaves and, in turn, to its reported effects in diabetic patients. PMID:22701361
Electronic Components Subsystems and Equipment: a Compilation
NASA Technical Reports Server (NTRS)
1975-01-01
Developments in electronic components, subsystems, and equipment are summarized. Topics discussed include integrated circuit components and techniques, circuit components and techniques, and cables and connectors.
Life Predicted in a Probabilistic Design Space for Brittle Materials With Transient Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Palfi, Tamas; Reh, Stefan
2005-01-01
Analytical techniques have progressively become more sophisticated, and now we can consider the probabilistic nature of the entire space of random input variables in assessing the lifetime reliability of brittle structures. This was demonstrated with NASA's CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code combined with the commercially available ANSYS/Probabilistic Design System (ANSYS/PDS), a probabilistic analysis tool that is an integral part of the ANSYS finite-element analysis program. ANSYS/PDS allows probabilistic loads, component geometry, and material properties to be considered in the finite-element analysis. CARES/Life predicts the time-dependent probability of failure of brittle material structures under generalized thermomechanical loading, such as that found in a turbine engine hot section. Glenn researchers coupled ANSYS/PDS with CARES/Life to assess the effects of the stochastic variables of component geometry, loading, and material properties on the predicted life of the component under fully transient thermomechanical and cyclic loading.
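The probabilistic-design idea, propagating random inputs to a failure probability, can be sketched with a plain Monte Carlo stress-strength comparison; the distributions and parameters below are invented and far simpler than the CARES/Life plus ANSYS/PDS workflow:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
# invented stochastic inputs: scatter in the applied stress and a
# Weibull strength distribution, as commonly assumed for brittle ceramics
applied = rng.normal(300.0, 20.0, n)       # MPa, applied stress
m, sigma0 = 10.0, 400.0                    # Weibull modulus and scale
strength = sigma0 * rng.weibull(m, n)      # MPa, component strength

# probability of failure: fraction of samples where stress exceeds strength
p_fail = np.mean(applied > strength)
```

Sampling the full joint input space in this way captures tail interactions between load scatter and strength scatter that a deterministic worst-case analysis would miss.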
NASA Astrophysics Data System (ADS)
Crook, Nigel P.; Hoon, Stephen R.; Taylor, Kevin G.; Perry, Chris T.
2002-05-01
This study investigates the application of high sensitivity electron spin resonance (ESR) to environmental magnetism in conjunction with the more conventional techniques of magnetic susceptibility, vibrating sample magnetometry (VSM) and chemical compositional analysis. Using these techniques we have studied carbonate sediment samples from Discovery Bay, Jamaica, which has been impacted to varying degrees by a bauxite loading facility. The carbonate sediment samples contain magnetic minerals in moderate to low concentrations. The ESR spectra for all sites essentially contain three components: first, a six-line spectrum centred around g = 2 resulting from Mn2+ ions within a carbonate matrix; second, a g = 4.3 signal from isolated Fe3+ ions incorporated as impurities within minerals such as gibbsite, kaolinite or quartz; and third, a ferrimagnetic resonance with a maximum at 230 mT resulting from the ferrimagnetic minerals present within the bauxite contamination. Depending upon the location of the sites within the embayment, these signals vary in their relative amplitude in a systematic manner related to the degree of bauxite input. Analysis of the ESR spectral components reveals linear relationships between the amplitude of the Mn2+ and ferrimagnetic signals and total Mn and Fe concentrations. To assist in determining the origin of the ESR signals, coral and bauxite reference samples were employed. Coral representative of the matrix of the sediment was taken remote from the bauxite loading facility, whilst pure bauxite was collected from nearby mining facilities. We find ESR to be a very sensitive technique, particularly appropriate to magnetic analysis of ferri- and para-magnetic components within environmental samples otherwise dominated by diamagnetic (carbonate) minerals. 
When employing typical sample masses of 200 mg, the practical detection limit of ESR to ferri- and para-magnetic minerals within a diamagnetic carbonate matrix is of the order of 1 ppm and 1 ppb, respectively, approximately 10^2 and 10^5 times the sensitivity achievable employing the VSM in our laboratory.
Optical analysis of thermal induced structural distortions
NASA Technical Reports Server (NTRS)
Weinswig, Shepard; Hookman, Robert A.
1991-01-01
The techniques used for the analysis of thermally induced structural distortions of optical components such as scanning mirrors and telescope optics are outlined. Particular attention is given to the methodology used in the thermal and structural analysis of the GOES scan mirror, the optical analysis using Zernike coefficients, and the optical system performance evaluation. It is pointed out that the use of Zernike coefficients allows an accurate, effective, and simple linkage between thermal/mechanical effects and the optical design.
NASA Astrophysics Data System (ADS)
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that make it possible to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelation condition is usually not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. 
This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we introduce the vbICA technique and present its application to synthetic data that simulate a GPS network recording ground deformation in a tectonically active region, with synthetic time series containing interseismic, coseismic, and postseismic deformation, plus seasonal deformation and white and coloured noise. We study the ability of the algorithm to recover the original (known) sources of deformation, and then apply it to a real scenario: the Emilia seismic sequence (2012, northern Italy), an example of a seismic sequence that occurred in a slowly converging tectonic setting, characterized by several local to regional anthropogenic or natural sources of deformation, mainly subsidence due to fluid withdrawal and sediment compaction. We apply both PCA and vbICA to displacement time series recorded by continuous GPS and InSAR (Pezzo et al., EGU2015-8950).
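The PCA-versus-ICA contrast at the heart of this abstract can be shown on a toy blind-source-separation problem. The sketch below uses scikit-learn's FastICA rather than the vbICA method described above, and the two synthetic sources and mixing matrix are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Two known sources: a non-Gaussian square wave and a sinusoid
t = np.linspace(0, 10, 2000)
S = np.c_[np.sign(np.sin(3 * t)), np.sin(7 * t)]
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])        # hypothetical mixing matrix
X = S @ A.T                       # observed "station" signals

# PCA only decorrelates the data; ICA imposes statistical independence
X_pca = PCA(n_components=2).fit_transform(X)
X_ica = FastICA(n_components=2, random_state=0).fit_transform(X)

# ICA recovers each source up to order, sign, and scale
corr = np.corrcoef(X_ica.T, S.T)[:2, 2:]
```

Each column of `corr` should contain one entry close to ±1, i.e. each true source is matched by one independent component, whereas the PCA components remain mixtures of the two.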
Crack Detection with Lamb Wave Wavenumber Analysis
NASA Technical Reports Server (NTRS)
Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu
2013-01-01
In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including the vx, vy and vz components) in the time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) the two-dimensional Fourier transform (2D-FT), which can transform the time-space wavefield into a frequency-wavenumber representation while losing the spatial information; (ii) the short-space 2D-FT, which can obtain the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; and (iii) local wavenumber analysis, which can provide the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement. The application of wavenumber analysis to 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling
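Technique (i) can be illustrated in a few lines: a single propagating mode in a synthetic time-space wavefield maps to a single peak in the frequency-wavenumber plane under a 2-D FFT. The frequency, wavenumber, and sampling values below are arbitrary choices for the sketch, not parameters from the study:

```python
import numpy as np

# Synthetic wavefield v(t, x): one mode at f0 = 100 kHz, k0 = 200 cycles/m
nt, nx = 256, 128
dt, dx = 1e-6, 1e-3               # 1 MHz sampling, 1 mm scan pitch
t = np.arange(nt) * dt
x = np.arange(nx) * dx
f0, k0 = 100e3, 200.0
v = np.sin(2 * np.pi * (f0 * t[:, None] - k0 * x[None, :]))

# 2D-FT: time-space -> frequency-wavenumber (spatial detail is lost)
V = np.fft.fft2(v)
freqs = np.fft.fftfreq(nt, dt)
ks = np.fft.fftfreq(nx, dx)
i, j = np.unravel_index(np.argmax(np.abs(V)), V.shape)
# |freqs[i]| and |ks[j]| land on (f0, k0) to within one FFT bin
```

The short-space variant (ii) simply applies the same transform inside a sliding spatial window, which is what restores the location information lost here.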
Scaling Techniques for Combustion Device Random Vibration Predictions
NASA Technical Reports Server (NTRS)
Kenny, R. J.; Ferebee, R. C.; Duvall, L. D.
2016-01-01
This work compares scaling techniques that can be used to predict combustion device component random vibration levels with excitation due to the internal combustion dynamics. Acceleration and unsteady dynamic pressure data from multiple component test programs are compared and normalized per the two scaling approaches reviewed. The first technique is an existing approach developed by Barrett, and the second is an updated approach new to this work. Results from both techniques are presented, and recommendations about future component random vibration prediction approaches are given.
E-learning platform for automated testing of electronic circuits using signature analysis method
NASA Astrophysics Data System (ADS)
Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel
2016-12-01
The dependability of electronic circuits can be ensured only through testing of circuit modules, which is done by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stage, through production, to repairs during operation. This paper presents the platform developed by the authors for training in testability in electronics in general, and in the use of the signature analysis method in particular. The platform highlights the two approaches in the field, namely the analog and digital signatures of circuits. As part of this e-learning platform, a database of signatures of different electronic components has been developed, intended to highlight different fault-detection techniques and, building on these, self-repairing techniques for systems containing such components. An approach to realizing self-testing circuits based on the MATLAB environment and the signature analysis method is proposed. The paper also analyses the benefits of the signature analysis method and simulates signature analyzer performance based on the use of pseudo-random sequences.
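The core of digital signature analysis is compressing a long test-response bitstream through a linear-feedback shift register and comparing the residue against a known-good signature. A minimal sketch (the tap set corresponds to the polynomial x^16 + x^12 + x^9 + x^7 + 1 commonly cited for classic signature analyzers; the bitstreams are invented):

```python
def signature(bitstream, taps=(16, 12, 9, 7), width=16):
    """Compress a test-response bitstream into a serial-input
    LFSR signature; taps give the feedback polynomial exponents."""
    reg = 0
    for bit in bitstream:
        fb = bit
        for tp in taps:
            fb ^= (reg >> (tp - 1)) & 1   # XOR in the tapped register bits
        reg = ((reg << 1) | fb) & ((1 << width) - 1)
    return reg

good = signature([1, 0, 1, 1, 0, 0, 1, 0] * 16)
faulty = signature([1, 0, 1, 1, 0, 1, 1, 0] * 16)  # one stuck bit per frame
```

A fault-free circuit always reproduces `good`; any single-bit error in the stream yields a different residue, which is what makes the method attractive for automated go/no-go testing.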
Application of copulas to improve covariance estimation for partial least squares.
D'Angelo, Gina M; Weissfeld, Lisa A
2013-02-20
Dimension reduction techniques, such as partial least squares, are useful for computing summary measures and examining relationships in complex settings. Partial least squares requires an estimate of the covariance matrix as a first step in the analysis, making this estimate critical to the results. In addition, the covariance matrix also forms the basis for other techniques in multivariate analysis, such as principal component analysis and independent component analysis. This paper has been motivated by an example from an imaging study in Alzheimer's disease where there is complete separation between Alzheimer's and control subjects for one of the imaging modalities. This separation occurs in one block of variables and does not occur with the second block of variables, resulting in inaccurate estimates of the covariance. We propose the use of a copula to obtain estimates of the covariance in this setting, where one set of variables comes from a mixture distribution. Simulation studies show that the proposed estimator is an improvement over the standard estimators of covariance. We illustrate the methods with the motivating example from a study of Alzheimer's disease. Copyright © 2012 John Wiley & Sons, Ltd.
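The general flavor of copula-based covariance estimation can be sketched with a Gaussian-copula (rank-based) correlation: each variable is mapped to normal scores through its empirical CDF before correlating. This is a generic illustration of the idea, not the specific estimator proposed in the paper, and the bivariate example data are synthetic:

```python
import numpy as np
from scipy import stats

def copula_corr(X):
    """Gaussian-copula correlation: map each column to normal scores
    via its empirical CDF, then take the Pearson correlation."""
    n = X.shape[0]
    Z = np.column_stack([stats.norm.ppf(stats.rankdata(c) / (n + 1))
                         for c in X.T])
    return np.corrcoef(Z, rowvar=False)

rng = np.random.default_rng(0)
x = rng.normal(size=500)
# Second variable: monotone non-linear function of the first, plus noise;
# the plain Pearson estimate is distorted, the rank-based one much less so
X = np.c_[x, np.exp(x) + 0.1 * rng.normal(size=500)]
R = copula_corr(X)
```

Because the transform depends only on ranks, the estimate is insensitive to monotone distortions of the margins, which is exactly the robustness sought when one block of variables follows a mixture distribution.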
Xu, J; Durand, L G; Pibarot, P
2001-03-01
The objective of this paper is to adapt and validate a nonlinear transient chirp signal modeling approach for the analysis and synthesis of overlapping aortic (A2) and pulmonary (P2) components of the second heart sound (S2). The approach is based on the time-frequency representation of multicomponent signals for estimating and reconstructing the instantaneous phase and amplitude functions of each component. To evaluate the accuracy of the approach, a simulated S2 with A2 and P2 components having different overlapping intervals (5-30 ms) was synthesized. The simulation results show that the technique is very effective for extracting the two components, even in the presence of noise (-15 dB). The normalized root-mean-squared error between the original A2 and P2 components and their reconstructed versions varied between 1% and 6%, proportionally to the duration of the overlapping interval, and it increased by less than 2% in the presence of noise. The validated technique was then applied to S2 components recorded in pigs under normal or high pulmonary artery pressures. The results show that this approach can successfully isolate and extract overlapping A2 and P2 components from successive S2 recordings obtained from different heartbeats of the same animal as well as from different animals.
A review of second law techniques applicable to basic thermal science research
NASA Astrophysics Data System (ADS)
Drost, M. Kevin; Zamorski, Joseph R.
1988-11-01
This paper reports the results of a review of second law analysis techniques which can contribute to basic research in the thermal sciences. The review demonstrated that second law analysis has a role in basic thermal science research. Unlike traditional techniques, second law analysis accurately identifies the sources and locations of thermodynamic losses. This allows the development of innovative solutions to thermal science problems by directing research to the key technical issues. Two classes of second law techniques were identified as being particularly useful. First, system and component investigations can provide information on the source and nature of irreversibilities on a macroscopic scale. This information will help to identify new research topics and will support the evaluation of current research efforts. Second, the differential approach can provide information on the causes and the spatial and temporal distribution of local irreversibilities. This information enhances the understanding of fluid mechanics, thermodynamics, and heat and mass transfer, and may suggest innovative methods for reducing irreversibilities.
2014-01-01
Background The chemical composition of aerosols and particle size distributions are the most significant factors affecting air quality. In particular, exposure to finer particles can cause short- and long-term effects on human health. In the present paper, the PM10 (particulate matter with aerodynamic diameter lower than 10 μm), CO, NOx (NO and NO2), benzene and toluene trends monitored in six monitoring stations of Bari province are shown. The data set used was composed of bi-hourly means for all parameters (12 bi-hourly means per day for each parameter) and refers to the period from January 2005 to May 2007. The main aim of the paper is to provide a clear illustration of how large data sets from monitoring stations can give information about the number and nature of the pollutant sources, and mainly to assess the contribution of the traffic source to the PM10 concentration level by using multivariate statistical techniques such as Principal Component Analysis (PCA) and Absolute Principal Component Scores (APCS). Results Comparing the night and day mean concentrations (per day) for each parameter, it has been pointed out that parameters such as CO, benzene and toluene show a different night and day behavior than PM10. This suggests that CO, benzene and toluene concentrations are mainly connected with transport systems, whereas PM10 is mostly influenced by different factors. The statistical techniques identified three recurrent sources, associated with vehicular traffic and particulate transport, covering over 90% of the variance. The contemporaneous analysis of the gases and PM10 has allowed underlining the differences between the sources of these pollutants. Conclusions The analysis of pollutant trends from large data sets and the application of multivariate statistical techniques such as PCA and APCS can give useful information about air quality and pollutant sources. 
This knowledge can provide useful guidance for environmental policies aimed at reaching the WHO recommended levels. PMID:24555534
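The APCS step that follows a PCA in this kind of source-apportionment study can be sketched compactly: scores are shifted by those of a hypothetical zero-concentration sample, and a pollutant is then regressed on the absolute scores to apportion its sources. The mock pollutant matrix and component count below are illustrative assumptions:

```python
import numpy as np

def apcs(X, n_comp):
    """Absolute Principal Component Scores: PCA on standardized data,
    then shift scores by those of a hypothetical zero-concentration sample."""
    mu, sd = X.mean(0), X.std(0)
    Z = (X - mu) / sd
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    V = Vt[:n_comp].T
    z0 = (0.0 - mu) / sd                  # standardized "true zero" sample
    return Z @ V - z0 @ V                 # absolute (source) scores

rng = np.random.default_rng(0)
X = np.abs(rng.normal(10.0, 2.0, size=(500, 6)))  # mock pollutant matrix
A = apcs(X, n_comp=3)

# Regressing one pollutant on the absolute scores apportions its sources
coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(A)), A], X[:, 0], rcond=None)
```

The fitted coefficients times the mean absolute scores give each source's mean contribution to that pollutant, which is how traffic's share of PM10 is quantified in studies of this type.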
Graphic design of pinhole cameras
NASA Technical Reports Server (NTRS)
Edwards, H. B.; Chu, W. P.
1979-01-01
The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
Particle size analysis of amalgam powder and handpiece generated specimens.
Drummond, J L; Hathorn, R M; Cailas, M D; Karuhn, R
2001-07-01
The increasing interest in the elimination of amalgam particles from the dental waste (DW) stream requires efficient devices to remove these particles. The major objective of this project was to perform a comparative evaluation of five basic methods of particle size analysis in terms of each instrument's ability to quantify the size distribution of the various components within the DW stream. The analytical techniques chosen were image analysis via scanning electron microscopy, standard wire mesh sieves, X-ray sedigraphy, laser diffraction, and electrozone analysis. The DW particle stream components were represented by amalgam powders and handpiece/diamond bur generated specimens of enamel, dentin, whole tooth, and condensed amalgam. Each analytical method quantified the examined DW particle stream components. However, X-ray sedigraphy, electrozone, and laser diffraction particle analyses provided similar results for determining particle distributions of DW samples. These three methods were able to more clearly quantify the properties of the examined powder and condensed amalgam samples. Furthermore, these methods indicated that a significant fraction of the DW stream contains particles smaller than 20 μm. The findings of this study indicated that the electrozone method is likely to be the most effective technique for quantifying the particle size distribution in the DW particle stream. This method required a relatively small volume of sample, was not affected by density, shape factors or optical properties, and measured a sufficient number of particles to provide a reliable representation of the particle size distribution curve.
Advanced Treatment Monitoring for Olympic-Level Athletes Using Unsupervised Modeling Techniques
Siedlik, Jacob A.; Bergeron, Charles; Cooper, Michael; Emmons, Russell; Moreau, William; Nabhan, Dustin; Gallagher, Philip; Vardiman, John P.
2016-01-01
Context Analysis of injury and illness data collected at large international competitions provides the US Olympic Committee and the national governing bodies for each sport with information to best prepare for future competitions. Research in which authors have evaluated medical contacts to provide the expected level of medical care and sports medicine services at international competitions is limited. Objective To analyze the medical-contact data for athletes, staff, and coaches who participated in the 2011 Pan American Games in Guadalajara, Mexico, using unsupervised modeling techniques to identify underlying treatment patterns. Design Descriptive epidemiology study. Setting Pan American Games. Patients or Other Participants A total of 618 US athletes (337 males, 281 females) participated in the 2011 Pan American Games. Main Outcome Measure(s) Medical data were recorded from the injury-evaluation and injury-treatment forms used by clinicians assigned to the central US Olympic Committee Sport Medicine Clinic and satellite locations during the operational 17-day period of the 2011 Pan American Games. We used principal components analysis and agglomerative clustering algorithms to identify and define grouped modalities. Lift statistics were calculated for within-cluster subgroups. Results Principal component analyses identified 3 components, accounting for 72.3% of the variability in datasets. Plots of the principal components showed that individual contacts focused on 4 treatment clusters: massage, paired manipulation and mobilization, soft tissue therapy, and general medical. Conclusions Unsupervised modeling techniques were useful for visualizing complex treatment data and provided insights for improved treatment modeling in athletes. Given its ability to detect clinically relevant treatment pairings in large datasets, unsupervised modeling should be considered a feasible option for future analyses of medical-contact data from international competitions. 
PMID:26794628
Dinç, Erdal; Ozdemir, Abdil
2005-01-01
A multivariate chromatographic calibration technique was developed for the quantitative analysis of binary mixtures of enalapril maleate (EA) and hydrochlorothiazide (HCT) in tablets in the presence of losartan potassium (LST). The mathematical algorithm of the multivariate chromatographic calibration technique is based on linear regression equations constructed from the relationship between concentration and peak area at a five-wavelength set. The algorithm of this calibration model, which has a simple mathematical content, is briefly described. This approach is a powerful mathematical tool for optimal chromatographic multivariate calibration and for the elimination of fluctuations coming from instrumental and experimental conditions. The multivariate chromatographic calibration reduces the multivariate linear regression functions to a univariate data set. The validation of the model was carried out by analyzing various synthetic binary mixtures and by using the standard addition technique. The developed calibration technique was applied to the analysis of real pharmaceutical tablets containing EA and HCT. The obtained results were compared with those obtained by a classical HPLC method, and the proposed multivariate chromatographic calibration was observed to give better results than the classical HPLC method.
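The reduction from a five-wavelength calibration to a univariate estimate can be sketched with synthetic peak areas; the concentrations, per-wavelength sensitivities, and noise level below are invented for illustration and do not reproduce the paper's calibration data:

```python
import numpy as np

# Standards of known concentration; peak area recorded at 5 wavelengths
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])           # e.g. ug/mL
k = np.array([1.2, 0.9, 1.5, 0.7, 1.1])               # sensitivity per wavelength
rng = np.random.default_rng(0)
areas = conc[:, None] * k + rng.normal(0, 0.02, (5, 5))

# One linear regression (area = m*conc + b) per wavelength
m, b = np.polyfit(conc, areas, 1)                     # m, b each of shape (5,)

# Reduce the multivariate calibration to a single estimate for an unknown
unknown_areas = 5.0 * k                               # noise-free 5 ug/mL sample
estimate = np.mean((unknown_areas - b) / m)
```

Averaging the five univariate back-predictions damps wavelength-specific fluctuations, which is the stated advantage of the multivariate calibration over a single-wavelength HPLC quantification.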
A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2011-01-01
An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.
Extension of similarity test procedures to cooled engine components with insulating ceramic coatings
NASA Technical Reports Server (NTRS)
Gladden, H. J.
1980-01-01
Material thermal conductivity was analyzed for its effect on the thermal performance of air cooled gas turbine components, both with and without a ceramic thermal-barrier material, tested at reduced temperatures and pressures. The analysis shows that neglecting the material thermal conductivity can contribute significant errors when metal-wall-temperature test data taken on a turbine vane are extrapolated to engine conditions. This error in metal temperature for an uncoated vane is of opposite sign from that for a ceramic-coated vane. A correction technique is developed for both ceramic-coated and uncoated components.
Three-Dimensional Modeling of Aircraft High-Lift Components with Vehicle Sketch Pad
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
Vehicle Sketch Pad (OpenVSP) is a parametric geometry modeler that has been used extensively for conceptual design studies of aircraft, including studies using higher-order analysis. OpenVSP can model flap and slat surfaces using simple shearing of the airfoil coordinates, which is an appropriate level of complexity for lower-order aerodynamic analysis methods. For three-dimensional analysis, however, there is not a built-in method for defining the high-lift components in OpenVSP in a realistic manner, or for controlling their complex motions in a parametric manner that is intuitive to the designer. This paper seeks instead to utilize OpenVSP's existing capabilities, and establish a set of best practices for modeling high-lift components at a level of complexity suitable for higher-order analysis methods. Techniques are described for modeling the flap and slat components as separate three-dimensional surfaces, and for controlling their motion using simple parameters defined in the local hinge-axis frame of reference. To demonstrate the methodology, an OpenVSP model for the Energy-Efficient Transport (EET) AR12 wind-tunnel model has been created, taking advantage of OpenVSP's Advanced Parameter Linking capability to translate the motions of the high-lift components from the hinge-axis coordinate system to a set of transformations in OpenVSP's frame of reference.
Multivariate image analysis of laser-induced photothermal imaging used for detection of caries tooth
NASA Astrophysics Data System (ADS)
El-Sherif, Ashraf F.; Abdel Aziz, Wessam M.; El-Sharkawy, Yasser H.
2010-08-01
Time-resolved photothermal imaging has been investigated to characterize teeth for the purpose of discriminating between normal and carious areas of the hard tissue using a thermal camera. Ultrasonic thermoelastic waves were generated in hard tissue by the absorption of fiber-coupled Q-switched Nd:YAG laser pulses operating at 1064 nm, in conjunction with a laser-induced photothermal technique used to detect the thermal radiation waves for diagnosis of the human tooth. The concepts behind the use of photothermal techniques for off-line detection of carious tooth features were presented by our group in earlier work. This paper illustrates the application of multivariate image analysis (MIA) techniques to detect the presence of caries. MIA is used to rapidly detect the presence and quantity of common caries features as the teeth are scanned by high-resolution color (RGB) thermal cameras. Multivariate principal component analysis is used to decompose the acquired three-channel tooth images into a two-dimensional principal component (PC) space. Masking score point clusters in the score space and highlighting the corresponding pixels in the image space of the two dominant PCs enables isolation of caries defect pixels based on contrast and color information. The technique provides a qualitative result that can be used for early-stage caries detection. The proposed technique can potentially be used on-line or in real time to prescreen for caries through vision-based systems such as a real-time thermal camera. Experimental results on a large number of extracted teeth, as well as a thermal image panorama of the teeth of a human volunteer, are investigated and presented.
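The score-space masking step described above amounts to: reshape the RGB frame to a pixels-by-3 matrix, project onto the dominant PCs, threshold extreme scores, and map the mask back to image coordinates. A sketch on a mock frame (the image, defect patch, and 3-sigma threshold are all illustrative assumptions):

```python
import numpy as np

# Mock RGB thermal frame: uniform background plus one "defect" patch
rng = np.random.default_rng(5)
img = rng.normal(0.3, 0.02, size=(64, 64, 3))
img[20:30, 20:30] += np.array([0.4, 0.2, 0.1])  # patch with distinct color

pixels = img.reshape(-1, 3)
Z = pixels - pixels.mean(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T               # project onto the two dominant PCs

# Mask extreme PC1 scores and map the mask back to image space
mask = (np.abs(scores[:, 0]) > 3 * scores[:, 0].std()).reshape(64, 64)
```

Pixels inside the patch have large-magnitude PC1 scores and survive the threshold, while background pixels do not, isolating the defect by color/contrast alone.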
Bartlett, Yvonne K; Sheeran, Paschal; Hawley, Mark S
2014-01-01
Purpose The purpose of this study was to identify the behaviour change techniques (BCTs) that are associated with greater effectiveness in smoking cessation interventions for people with chronic obstructive pulmonary disease (COPD). Methods A systematic review and meta-analysis was conducted. Web of Knowledge, CINAHL, EMBASE, PsycINFO, and MEDLINE were searched from the earliest date available to December 2012. Data were extracted and weighted average effect sizes calculated; BCTs used were coded according to an existing smoking cessation-specific BCT taxonomy. Results Seventeen randomized controlled trials (RCTs) were identified that involved a total sample of 7446 people with COPD. The sample-weighted mean quit rate for all RCTs was 13.19%, and the overall sample-weighted effect size was d+ = 0.33. Thirty-seven BCTs were each used in at least three interventions. Four techniques were associated with significantly larger effect sizes: Facilitate action planning/develop treatment plan, Prompt self-recording, Advise on methods of weight control, and Advise on/facilitate use of social support. Three new COPD-specific BCTs were identified, and Linking COPD and smoking was found to result in significantly larger effect sizes. Conclusions Smoking cessation interventions aimed at people with COPD appear to benefit from using techniques focussed on forming detailed plans and self-monitoring. Additional RCTs that use standardized reporting of intervention components and BCTs would be valuable to corroborate findings from the present meta-analysis. Statement of contribution What is already known on this subject? Chronic obstructive pulmonary disease (COPD) is responsible for considerable health and economic burden worldwide, and smoking cessation (SC) is the only known treatment that can slow the decline in lung function experienced. 
Previous reviews of smoking cessation interventions for this population have established that a combination of pharmacological support and behavioural counselling is most effective. While pharmacological support has been detailed, and effectiveness ranked, the content of behavioural counselling varies between interventions, and it is not clear what the most effective components are. What does this study add? Detailed description of ‘behavioural counselling’ component of SC interventions for people with COPD. Meta-analysis to identify effective behaviour change techniques tailored for this population. Discussion of these findings in the context of designing tailored SC interventions. PMID:24397814
Visual Exploration of Semantic Relationships in Neural Word Embeddings
Liu, Shusen; Bremer, Peer-Timo; Thiagarajan, Jayaraman J.; ...
2017-08-29
Constructing distributed representations for words through neural language models and using the resulting vector spaces for analysis has become a crucial component of natural language processing (NLP). But, despite their widespread application, little is known about the structure and properties of these spaces. To gain insights into the relationship between words, the NLP community has begun to adapt high-dimensional visualization techniques. Particularly, researchers commonly use t-distributed stochastic neighbor embeddings (t-SNE) and principal component analysis (PCA) to create two-dimensional embeddings for assessing the overall structure and exploring linear relationships (e.g., word analogies), respectively. Unfortunately, these techniques often produce mediocre or even misleading results and cannot address domain-specific visualization challenges that are crucial for understanding semantic relationships in word embeddings. We introduce new embedding techniques for visualizing semantic and syntactic analogies, and the corresponding tests to determine whether the resulting views capture salient structures. Additionally, we introduce two novel views for a comprehensive study of analogy relationships. Finally, we augment t-SNE embeddings to convey uncertainty information in order to allow a reliable interpretation. Combined, the different views address a number of domain-specific tasks difficult to solve with existing tools.
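The PCA and t-SNE baseline views discussed in this abstract are a few lines of scikit-learn. The stand-in "word vectors" below are two synthetic Gaussian clusters rather than real embeddings, and the perplexity value is an arbitrary choice:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

# Stand-in "word vectors": two clusters of 50 points in a 50-D space
rng = np.random.default_rng(6)
W = np.r_[rng.normal(0, 1, (50, 50)), rng.normal(4, 1, (50, 50))]

xy_pca = PCA(n_components=2).fit_transform(W)       # global/linear structure
xy_tsne = TSNE(n_components=2, perplexity=10,
               random_state=0).fit_transform(W)     # local neighborhoods
```

PCA preserves global geometry (and hence linear analogy directions), while t-SNE preserves local neighborhoods at the cost of distorting distances between clusters; the abstract's point is that neither view alone suffices for analogy analysis.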
NASA Technical Reports Server (NTRS)
Rampe, E. B.; Lanza, N. L.
2012-01-01
Orbital near-infrared (NIR) reflectance spectra of the martian surface from the OMEGA and CRISM instruments have identified a variety of phyllosilicates in Noachian terrains. The types of phyllosilicates present on Mars have important implications for the aqueous environments in which they formed, and, thus, for recognizing locales that may have been habitable. Current identifications of phyllosilicates from martian NIR data are based on the positions of spectral absorptions relative to laboratory data of well-characterized samples and from spectral ratios; however, some phyllosilicates can be difficult to distinguish from one another with these methods (e.g., illite vs. muscovite). Here we employ a multivariate statistical technique, principal component analysis (PCA), to differentiate between spectrally similar phyllosilicate minerals. PCA is commonly used in a variety of industries (pharmaceutical, agricultural, viticultural) to discriminate between samples. Previous work using PCA to analyze raw NIR reflectance data from mineral mixtures has shown that this is a viable technique for identifying mineral types, abundances, and particle sizes. Here, we evaluate PCA of second-derivative NIR reflectance data as a method for classifying phyllosilicates and test whether this method can be used to identify phyllosilicates on Mars.
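The second-derivative-plus-PCA idea can be sketched on synthetic spectra. The Gaussian absorption bands below are illustrative stand-ins, not OMEGA/CRISM data.

```python
# Sketch: Savitzky-Golay second derivatives of reflectance spectra,
# then PCA; two "mineral" classes differ only in band center.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
wav = np.linspace(1.0, 2.5, 300)                  # wavelength axis (um)

def spectrum(center):
    """One noisy spectrum with an absorption band at `center`."""
    return (1.0 - 0.3 * np.exp(-((wav - center) / 0.02) ** 2)
            + 0.005 * rng.normal(size=wav.size))

spectra = np.array([spectrum(1.40) for _ in range(20)]
                   + [spectrum(1.43) for _ in range(20)])

deriv2 = savgol_filter(spectra, window_length=11, polyorder=3,
                       deriv=2, axis=1)           # 2nd-derivative spectra
scores = PCA(n_components=2).fit_transform(deriv2)

# the two classes should fall on opposite sides of zero along PC1
print(scores[:20, 0].mean(), scores[20:, 0].mean())
```

Differentiating suppresses broad continuum effects before PCA, so the score plot separates on band position rather than overall albedo.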
Sensors for ceramic components in advanced propulsion systems
NASA Technical Reports Server (NTRS)
Koller, A. C.; Bennethum, W. H.; Burkholder, S. D.; Brackett, R. R.; Harris, J. P.
1995-01-01
This report includes: (1) a survey of the current methods for the measurement of surface temperature of ceramic materials suitable for use as hot section flowpath components in aircraft gas turbine engines; (2) analysis and selection of three sensing techniques with potential to extend surface temperature measurement capability beyond current limits; and (3) design, manufacture, and evaluation of the three selected techniques which include the following: platinum rhodium thin film thermocouple on alumina and mullite substrates; doped silicon carbide thin film thermocouple on silicon carbide, silicon nitride, and aluminum nitride substrates; and long and short wavelength radiation pyrometry on the substrates listed above plus yttria stabilized zirconia. Measurement of surface emittance of these materials at elevated temperature was included as part of this effort.
Nonlinear Acoustic and Ultrasonic NDT of Aeronautical Components
NASA Astrophysics Data System (ADS)
Van Den Abeele, Koen; Katkowski, Tomasz; Mattei, Christophe
2006-05-01
In response to the demand for innovative microdamage inspection systems, with high sensitivity and undoubted accuracy, we are currently investigating the use and robustness of several acoustic and ultrasonic NDT techniques based on Nonlinear Elastic Wave Spectroscopy (NEWS) for the characterization of microdamage in aeronautical components. In this report, we illustrate the results of an amplitude dependent analysis of the resonance behaviour, both in time (signal reverberation) and in frequency (sweep) domain. The technique is applied to intact and damaged samples of Carbon Fiber Reinforced Plastics (CFRP) composites after thermal loading or mechanical fatigue. The method shows a considerable gain in sensitivity and an incontestable interpretation of the results for nonlinear signatures in comparison with the linear characteristics. For highly fatigued samples, slow dynamical effects are observed.
NASA Astrophysics Data System (ADS)
Waugh, Rachael C.; Dulieu-Barton, Janice M.; Quinn, S.
2015-03-01
Thermoelastic stress analysis (TSA) is an established active thermographic approach which uses the thermoelastic effect to correlate the temperature change that occurs as a material is subjected to elastic cyclic loading with the sum of the principal stresses on the surface of the component. Digital image correlation (DIC) tracks features on the surface of a material to establish a displacement field of a component subjected to load, which can then be used to calculate the strain field. The application of both DIC and TSA on a composite plate representative of aircraft secondary structure subject to resonant frequency loading using a portable loading device, i.e. `remote loading', is described. Laboratory-based loading for TSA and DIC is typically imparted using a test machine; in the current work, however, a vibration loading system is used that is able to excite the component of interest at its resonant frequency, enabling TSA and DIC to be carried out. The accuracy of the measurements made under remote loading with both of the optical techniques applied is discussed. The data are compared to extract complementary information from the two techniques. This work forms a step towards a combined strain-based non-destructive evaluation procedure able to identify and quantify the effect of defects more fully, particularly when examining component performance in service applications.
Improving KPCA Online Extraction by Orthonormalization in the Feature Space.
Souza Filho, Joao B O; Diniz, Paulo S R
2018-04-01
Recently, some online kernel principal component analysis (KPCA) techniques based on the generalized Hebbian algorithm (GHA) were proposed for use in large data sets, defining kernel components using concise dictionaries automatically extracted from data. This brief proposes two new online KPCA extraction algorithms, exploiting orthogonalized versions of the GHA rule. In both cases, the orthogonalization of kernel components is achieved by the inclusion of some low-complexity additional steps in the kernel Hebbian algorithm, thus not substantially affecting the computational cost of the algorithm. Results show improved convergence speed and accuracy of components extracted by the proposed methods, as compared with the state-of-the-art online KPCA extraction algorithms.
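For a single linear component the GHA family of updates reduces to Oja's rule, w ← w + η·y·(x − y·w). The sketch below shows that linear special case on toy data with one dominant direction; the kernel and dictionary machinery of the paper is omitted.

```python
# Oja's online PCA rule (the single-component, linear case of the GHA):
# the weight vector converges to the leading eigenvector of the data.
import numpy as np

rng = np.random.default_rng(2)
u = np.array([1.0, 1.0]) / np.sqrt(2)      # true dominant direction
X = rng.normal(size=(2000, 2)) @ np.diag([3.0, 1.0])
R = np.array([[u[0], -u[1]], [u[1], u[0]]])
X = X @ R.T                                # rotate dominant axis onto u

w = rng.normal(size=2)
w /= np.linalg.norm(w)
for x in X:
    y = w @ x
    w += 0.01 * y * (x - y * w)            # Oja update (self-normalizing)

cos = abs(w @ u) / np.linalg.norm(w)
print(cos)                                  # close to 1: w aligned with u
```

The orthogonalized variants in the paper extend this idea to several components at once by deflating or orthonormalizing the weight vectors at each step.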
Comparison of ITRF2014 station coordinate input time series of DORIS, VLBI and GNSS
NASA Astrophysics Data System (ADS)
Tornatore, Vincenza; Tanır Kayıkçı, Emine; Roggero, Marco
2016-12-01
In this paper station coordinate time series from three space geodesy techniques that have contributed to the realization of the International Terrestrial Reference Frame 2014 (ITRF2014) are compared. In particular the height component time series extracted from official combined intra-technique solutions submitted for ITRF2014 by DORIS, VLBI and GNSS Combination Centers have been investigated. The main goal of this study is to assess the level of agreement among these three space geodetic techniques. A novel analytic method, modeling time series as discrete-time Markov processes, is presented and applied to the compared time series. The analysis method has proven to be particularly suited to obtain quasi-cyclostationary residuals which are an important property to carry out a reliable harmonic analysis. We looked for common signatures among the three techniques. Frequencies and amplitudes of the detected signals have been reported along with their percentage of incidence. Our comparison shows that two of the estimated signals, having one-year and 14 days periods, are common to all the techniques. Different hypotheses on the nature of the signal having a period of 14 days are presented. As a final check we have compared the estimated velocities and their standard deviations (STD) for the sites that co-located the VLBI, GNSS and DORIS stations, obtaining a good agreement among the three techniques both in the horizontal (1.0 mm/yr mean STD) and in the vertical (0.7 mm/yr mean STD) component, although some sites show larger STDs, mainly due to lack of data, different data spans or noisy observations.
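The harmonic-analysis step (estimating amplitude and phase at a candidate period such as one year or 14 days) can be sketched as a linear least-squares fit. The daily height series below is synthetic, with a 4 mm annual term.

```python
# Least-squares harmonic fit: recover the amplitude of a known period
# (1 year) from a noisy station-height time series.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(3650, dtype=float)             # 10 years of daily epochs
true_amp, period = 4.0, 365.25               # mm, days
h = true_amp * np.sin(2 * np.pi * t / period + 0.7) + rng.normal(0, 1.0, t.size)

# design matrix: sine and cosine at the test period, plus an offset
A = np.column_stack([np.sin(2 * np.pi * t / period),
                     np.cos(2 * np.pi * t / period),
                     np.ones_like(t)])
coef, *_ = np.linalg.lstsq(A, h, rcond=None)
amp = np.hypot(coef[0], coef[1])
print(amp)                                    # close to 4.0 mm
```

Repeating the fit over a grid of trial periods, and comparing residual variance, is one simple way to detect common signals such as the 14-day term across techniques.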
Jiang, Hua; Peng, Jin; Zhou, Zhi-yuan; Duan, Yu; Chen, Wei; Cai, Bin; Yang, Hao; Zhang, Wei
2010-09-01
Spinal cord injury (SCI) is a complex trauma involving multiple pathological mechanisms, including cytotoxicity, oxidative stress, and immune-endocrine disturbances. This study aimed to establish a plasma metabonomics fingerprinting atlas for SCI using (1)H nuclear magnetic resonance (NMR) based metabonomics methodology and principal component analysis techniques. Nine Sprague-Dawley (SD) male rats were randomly divided into SCI, normal and sham-operation control groups. Plasma samples were collected for (1)H NMR spectroscopy 3 days after operation. The NMR data were analyzed using the principal component analysis technique with Matlab software. Metabonomics analysis was able to distinguish the three groups (SCI, normal control, sham-operation). The fingerprinting atlas indicated that, compared with those without SCI, the SCI group was distinguished along the second principal component, which is made up of fatty acids, myo-inositol, arginine, very low-density lipoprotein (VLDL), low-density lipoprotein (LDL), triglyceride (TG), glucose, and 3-methyl-histamine. The data indicated that SCI results in several significant changes in plasma metabolism early on and that a metabonomics approach based on (1)H NMR spectroscopy can provide a metabolic profile comprising several metabolite classes and allow for relative quantification of such changes. The results also provided support for further development and application of metabonomics technologies for studying SCI and for the utilization of multivariate models for classifying the extent of trauma within an individual.
Meyer, Karin; Kirkpatrick, Mark
2005-01-01
Principal component analysis is a widely used 'dimension reduction' technique, albeit generally at a phenotypic level. It is shown that we can estimate genetic principal components directly through a simple reparameterisation of the usual linear, mixed model. This is applicable to any analysis fitting multiple, correlated genetic effects, whether effects for individual traits or sets of random regression coefficients to model trajectories. Depending on the magnitude of genetic correlation, a subset of the principal components generally suffices to capture the bulk of genetic variation. Corresponding estimates of genetic covariance matrices are more parsimonious, have reduced rank and are smoothed, with the number of parameters required to model the dispersion structure reduced from k(k + 1)/2 to m(2k - m + 1)/2 for k effects and m principal components. Estimation of these parameters, the largest eigenvalues and pertaining eigenvectors of the genetic covariance matrix, via restricted maximum likelihood using derivatives of the likelihood, is described. It is shown that reduced rank estimation can reduce computational requirements of multivariate analyses substantially. An application to the analysis of eight traits recorded via live ultrasound scanning of beef cattle is given. PMID:15588566
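The parameter saving quoted above, and the rank-m approximation it corresponds to, can be checked numerically. The covariance matrix below is a random stand-in for a genetic covariance matrix.

```python
# Parameter counts for full vs reduced-rank covariance structures,
# and a rank-m approximation from the leading eigenpairs.
import numpy as np

def n_full(k):
    return k * (k + 1) // 2                  # full k x k covariance

def n_reduced(k, m):
    return m * (2 * k - m + 1) // 2          # rank-m parameterisation

k, m = 8, 3                                  # e.g. 8 traits, 3 components
print(n_full(k), n_reduced(k, m))            # 36 vs 21 parameters

rng = np.random.default_rng(4)
L = rng.normal(size=(k, k))
G = L @ L.T                                  # a valid covariance matrix
vals, vecs = np.linalg.eigh(G)               # eigenvalues sorted ascending
top = slice(-m, None)
G_m = (vecs[:, top] * vals[top]) @ vecs[:, top].T   # best rank-m approximation
print(np.linalg.matrix_rank(G_m))            # 3
```

For k = 8 traits and m = 3 components this drops the dispersion parameters from 36 to 21, matching the k(k + 1)/2 vs m(2k − m + 1)/2 formulas in the abstract.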
NASA Astrophysics Data System (ADS)
Paganelli, F.; Schubert, G.; Lopes, R. M. C.; Malaska, M.; Le Gall, A. A.; Kirk, R. L.
2016-12-01
The current SAR data coverage of Titan encompasses several areas in which multiple radar passes are present and overlapping, providing additional information to aid the interpretation of geological and structural features. We exploit the different combinations of look direction and variable incidence angle to examine Cassini Synthetic Aperture Radar (SAR) data using the Principal Component Analysis (PCA) technique and high-resolution radiometry. Look direction and variable incidence angle are of particular importance in the analysis of variance in the images, which aids in the perception and identification of surface features, as extensively demonstrated in Earth and planetary examples. The PCA enhancement technique uses projected non-ortho-rectified SAR imagery in order to maintain the inherent differences in scattering and geometric properties due to the different look directions, while enhancing the geometry of surface features. The PC2 component provides a stereo view of areas in which complex surface features and structural patterns can be enhanced and outlined. We focus on several areas of interest, in older and recently acquired flybys, in which evidence of geological and structural features can be enhanced and outlined in the PC1 and PC2 components. Results of this technique provide enhanced geometry and insights into the interpretation of the observed features, allowing a better understanding of the geology and tectonics of Titan.
Automated nystagmus analysis [on-line computer technique for eye data processing]
NASA Technical Reports Server (NTRS)
Oman, C. M.; Allum, J. H. J.; Tole, J. R.; Young, L. R.
1973-01-01
Several methods have recently been used for on-line analysis of nystagmus: A digital computer program has been developed to accept sampled records of eye position, detect fast phase components, and output cumulative slow phase position, continuous slow phase velocity, instantaneous fast phase frequency, and other parameters. The slow phase velocity is obtained by differentiation of the calculated cumulative position rather than the original eye movement record. Also, a prototype analog device has been devised which calculates the velocity of the slow phase component during caloric testing. Examples of clinical and research eye movement records analyzed with these devices are shown.
Multivariate Analysis of Seismic Field Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alam, M. Kathleen
1999-06-01
This report includes the details of the model building procedure and prediction of seismic field data. Principal Components Regression, a multivariate analysis technique, was used to model seismic data collected as two pieces of equipment were cycled on and off. Models built that included only the two pieces of equipment of interest had trouble predicting data containing signals not included in the model. Evidence for poor predictions came from the prediction curves as well as spectral F-ratio plots. Once the extraneous signals were included in the model, predictions improved dramatically. While Principal Components Regression performed well for the present data sets, the present data analysis suggests further work will be needed to develop more robust modeling methods as the data become more complex.
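Principal Components Regression itself is a two-step pipeline: project the predictors onto their leading principal components, then regress the response on the scores. A minimal sketch on synthetic stand-in data (not the report's seismic signals):

```python
# Principal Components Regression (PCR) as a PCA -> linear regression
# pipeline, on data driven by a few latent factors.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
latent = rng.normal(size=(100, 5))                 # 5 underlying "signals"
loadings = rng.normal(size=(5, 30))
X = latent @ loadings + 0.1 * rng.normal(size=(100, 30))  # 30 noisy channels
y = 2.0 * latent[:, 0] - latent[:, 1] + 0.01 * rng.normal(size=100)

pcr = make_pipeline(PCA(n_components=5), LinearRegression())
pcr.fit(X, y)
print(pcr.score(X, y))                             # R^2, close to 1 here
```

The report's finding maps directly onto this structure: if a signal present in the prediction data was never among the modeled components, the regression on scores cannot account for it, and residual diagnostics (e.g. F-ratio plots) flag the mismatch.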
Yang, Yan-Qin; Yin, Hong-Xu; Yuan, Hai-Bo; Jiang, Yong-Wen; Dong, Chun-Wang; Deng, Yu-Liang
2018-01-01
In the present work, a novel infrared-assisted extraction coupled to headspace solid-phase microextraction (IRAE-HS-SPME) followed by gas chromatography-mass spectrometry (GC-MS) was developed for rapid determination of the volatile components in green tea. The extraction parameters such as fiber type, sample amount, infrared power, extraction time, and infrared lamp distance were optimized by orthogonal experimental design. Under optimum conditions, a total of 82 volatile compounds in 21 green tea samples from different geographical origins were identified. Compared with classical water-bath heating, the proposed technique has remarkable advantages of considerably reduced analytical time and high efficiency. In addition, an effective classification of green teas based on their volatile profiles was achieved by partial least squares-discriminant analysis (PLS-DA) and hierarchical clustering analysis (HCA). Furthermore, the application of a dual criterion based on the variable importance in the projection (VIP) values of the PLS-DA models and on the category from one-way univariate analysis (ANOVA) allowed the identification of 12 potential volatile markers, which were considered to make the most important contribution to the discrimination of the samples. The results suggest that the IRAE-HS-SPME/GC-MS technique combined with multivariate analysis offers a valuable tool to assess geographical traceability of different tea varieties. PMID:29494626
PARENT Quick Blind Round-Robin Test Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.
The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques to find flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments: one is called Quick-Blind and the other is called Blind. The Quick-Blind testing and destructive analysis of the test blocks has been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs from the system. The performed analyses allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
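The discovery step that such event logs feed can be sketched as building a directly-follows count, the raw material of most process-mining algorithms. The log entries below are illustrative, not from the HIS described.

```python
# Build a directly-follows graph (DFG) from an event log of
# (case_id, activity) pairs, time-ordered within each case.
from collections import Counter

log = [
    (1, "admit"), (1, "triage"), (1, "treat"), (1, "discharge"),
    (2, "admit"), (2, "triage"), (2, "lab"), (2, "treat"), (2, "discharge"),
]

traces = {}
for case, act in log:
    traces.setdefault(case, []).append(act)     # group events per case

dfg = Counter()
for acts in traces.values():
    for a, b in zip(acts, acts[1:]):            # consecutive activity pairs
        dfg[(a, b)] += 1

print(dfg[("admit", "triage")])                 # 2: both cases follow this edge
```

Edge counts like these expose the dominant flow and its exceptions (here, case 2's extra "lab" step), which is exactly the kind of finding used to redefine system functions.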
Sisco, Edward; Demoranville, Leonard T; Gillen, Greg
2013-09-10
The feasibility of using C60(+) cluster primary ion bombardment secondary ion mass spectrometry (C60(+) SIMS) for the analysis of the chemical composition of fingerprints is evaluated. It was found that C60(+) SIMS could be used to detect and image the spatial localization of a number of sebaceous and eccrine components in fingerprints. These analyses were also found not to be hindered by the use of common latent print powder development techniques. Finally, the ability to monitor the depth distribution of fingerprint constituents was found to be possible - a capability which has not been shown using other chemical imaging techniques. This paper illustrates a number of strengths and potential weaknesses of C60(+) SIMS as an additional or complementary technique for the chemical analysis of fingerprints. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen; Talpe, Matthieu; Shum, C. K.; Schmidt, Michael
2017-12-01
In recent decades, decomposition techniques have enabled increasingly more applications for dimension reduction, as well as extraction of additional information from geophysical time series. Traditionally, the principal component analysis (PCA)/empirical orthogonal function (EOF) method and, more recently, the independent component analysis (ICA) have been applied to extract statistically orthogonal (uncorrelated) and statistically independent modes, respectively, that represent the maximum variance of the time series. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the autocovariance matrix and on diagonalizing higher (than second) order statistical tensors from centered time series, respectively. However, the stationarity assumption in these techniques is not justified for many geophysical and climate variables even after removing cyclic components, e.g., the commonly removed dominant seasonal cycles. In this paper, we present a novel decomposition method, the complex independent component analysis (CICA), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA, where (a) we first define a new complex dataset that contains the observed time series in its real part and their Hilbert-transformed series as its imaginary part, (b) an ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex dataset in (a), and finally (c) the dominant independent complex modes are extracted and used to represent the dominant space and time amplitudes and associated phase propagation patterns. The performance of CICA is examined by analyzing synthetic data constructed from multiple physically meaningful modes in a simulation framework with known truth.
Next, global terrestrial water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) gravimetry mission (2003-2016), and satellite radiometric sea surface temperature (SST) data (1982-2016) over the Atlantic and Pacific Oceans are used with the aim of demonstrating signal separations of the North Atlantic Oscillation (NAO) from the Atlantic Multi-decadal Oscillation (AMO), and the El Niño Southern Oscillation (ENSO) from the Pacific Decadal Oscillation (PDO). CICA results indicate that ENSO-related patterns can be extracted from the Gravity Recovery And Climate Experiment Terrestrial Water Storage (GRACE TWS) with an accuracy of 0.5-1 cm in terms of equivalent water height (EWH). The magnitude of errors in extracting NAO or AMO from SST data using the complex EOF (CEOF) approach reaches up to 50% of the signal itself, while it is reduced to 16% when applying CICA. Larger errors with magnitudes of 100% and 30% of the signal itself are found while separating ENSO from PDO using CEOF and CICA, respectively. We thus conclude that the CICA is more effective than CEOF in separating non-stationary patterns.
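Step (a) of the CICA construction, forming the complex dataset from a real series and its Hilbert transform, can be sketched directly with SciPy. For a pure cosine on an FFT bin the imaginary part of the analytic signal is exactly the matching sine.

```python
# Step (a) of CICA: build the analytic (complex) series whose real part
# is the observed signal and whose imaginary part is its Hilbert transform.
import numpy as np
from scipy.signal import hilbert

n = np.arange(1024)
x = np.cos(2 * np.pi * 5 * n / 1024)       # 5 cycles over the record
z = hilbert(x)                             # complex "CICA-ready" series

print(np.allclose(z.real, x),
      np.allclose(z.imag, np.sin(2 * np.pi * 5 * n / 1024)))
```

The instantaneous amplitude |z| and phase angle(z) of such analytic series are what let CICA represent amplitude modulation and phase propagation, which real-valued PCA/ICA modes cannot capture.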
The banana code-natural blend processing in the olfactory circuitry of Drosophila melanogaster.
Schubert, Marco; Hansson, Bill S; Sachse, Silke
2014-01-01
Odor information is predominantly perceived as complex odor blends. For Drosophila melanogaster one of the most attractive blends is emitted by an over-ripe banana. To analyze how the fly's olfactory system processes natural blends we combined the experimental advantages of gas chromatography and functional imaging (GC-I). In this way, natural banana compounds were presented successively to the fly antenna at concentrations close to those occurring naturally. This technique allowed us to identify the active odor components, use these compounds as stimuli and measure odor-induced Ca(2+) signals in input and output neurons of the Drosophila antennal lobe (AL), the first olfactory neuropil. We demonstrate that mixture interactions of a natural blend are very rare and occur only at the AL output level, resulting in a surprisingly linear blend representation. However, the information regarding single components is strongly modulated by the olfactory circuitry within the AL, leading to a higher similarity between the representation of individual components and the banana blend. This observed modulation might tune the olfactory system in a way to distinctively categorize odor components and improve the detection of suitable food sources. Functional GC-I thus enables analysis of virtually any unknown natural odorant blend and its components in their relative occurring concentrations and allows characterization of neuronal responses of complete neural assemblies. This technique can be seen as a valuable complementary method to classical GC/electrophysiology techniques, and will be a highly useful tool in future investigations of insect-insect and insect-plant chemical interactions.
Mori, Tetsuya; Tsuboi, Yuuri; Ishida, Nobuhiro; Nishikubo, Nobuyuki; Demura, Taku; Kikuchi, Jun
2015-01-01
Lignocellulose, which includes mainly cellulose, hemicellulose, and lignin, is a potential resource for the production of chemicals and for other applications. For effective production of materials derived from biomass, it is important to characterize the metabolites and polymeric components of the biomass. Nuclear magnetic resonance (NMR) spectroscopy has been used to identify biomass components; however, the NMR spectra of metabolites and lignocellulose components are ambiguously assigned in many cases due to overlapping chemical shift peaks. Using our 13C-labeling technique in higher plants such as poplar samples, we demonstrated that overlapping peaks could be resolved by three-dimensional NMR experiments to more accurately assign chemical shifts compared with two-dimensional NMR measurements. Metabolites of the 13C-poplar were measured by high-resolution magic angle spinning NMR spectroscopy, which allows sample analysis without solvent extraction, while lignocellulose components of the 13C-poplar dissolved in dimethylsulfoxide/pyridine solvent were analyzed by solution-state NMR techniques. Using these methods, we were able to unambiguously assign chemical shifts of small and macromolecular components in 13C-poplar samples. Furthermore, using samples of less than 5 mg, we could differentiate between two kinds of genes that were overexpressed in poplar samples, which produced clearly modified plant cell wall components. PMID:26143886
Syazwan, AI; Rafee, B Mohd; Juahir, Hafizan; Azman, AZF; Nizar, AM; Izwyn, Z; Syahidatussyakirah, K; Muhaimin, AA; Yunos, MA Syafiq; Anita, AR; Hanafiah, J Muhamad; Shaharuddin, MS; Ibthisham, A Mohd; Hasmadi, I Mohd; Azhar, MN Mohamad; Azizan, HS; Zulfadhli, I; Othman, J; Rozalini, M; Kamarul, FT
2012-01-01
Purpose To analyze and characterize a multidisciplinary, integrated indoor air quality checklist for evaluating the health risk of building occupants in a nonindustrial workplace setting. Design A cross-sectional study based on a participatory occupational health program conducted by the National Institute of Occupational Safety and Health (Malaysia) and Universiti Putra Malaysia. Method A modified version of the indoor environmental checklist published by the Department of Occupational Health and Safety, based on the literature and discussion with occupational health and safety professionals, was used in the evaluation process. Summated scores were given according to the cluster analysis and principal component analysis in the characterization of risk. Environmetric techniques were used to classify the risk of variables in the checklist. Identification of the possible sources of item pollutants was also evaluated from a semiquantitative approach. Result Hierarchical agglomerative cluster analysis resulted in the grouping of factorial components into three clusters (high complaint, moderate-high complaint, moderate complaint), which were further analyzed by discriminant analysis. From this, 15 major variables that influence indoor air quality were determined. Principal component analysis of each cluster revealed that the main factors influencing the high complaint group were fungal-related problems, chemical indoor dispersion, detergent, renovation, thermal comfort, and location of fresh air intake. The moderate-high complaint group showed significantly high loadings on ventilation, air filters, and smoking-related activities. The moderate complaint group showed high loadings on dampness, odor, and thermal comfort. Conclusion This semiquantitative assessment, which graded risk from low to high based on the intensity of the problem, shows promising and reliable results.
It should be used as an important tool in the preliminary assessment of indoor air quality and as a categorizing method for further IAQ investigations and complaints procedures. PMID:23055779
Satellite ranging data analysis under LAGEOS A. O. No. OSTA 78-2
NASA Technical Reports Server (NTRS)
Shelus, P. J.
1981-01-01
LAGEOS and lunar laser ranging observations are combined to eliminate the shortcomings inherent in each technique, while accentuating the advantages of each. All three components of the Earth's rotation are produced with accuracy and precision which is compatible with observational uncertainties.
Enhancing Electromagnetic Side-Channel Analysis in an Operational Environment
2013-09-01
The phenomenon of compromising power and EM emissions has been known and exploited for decades. Declassified TEMPEST documents reveal vulnerabilities of ... components. One technique to detect potentially compromising emissions is to use a wide-band receiver tuned to a specific frequency. High-end TEMPEST
Five-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Galactic Foreground Emission
NASA Technical Reports Server (NTRS)
Gold, B.; Bennett, C.L.; Larson, D.; Hill, R.S.; Odegard, N.; Weiland, J.L.; Hinshaw, G.; Kogut, A.; Wollack, E.; Page, L.;
2008-01-01
We present a new estimate of foreground emission in the WMAP data, using a Markov chain Monte Carlo (MCMC) method. The new technique delivers maps of each foreground component for a variety of foreground models, error estimates of the uncertainty of each foreground component, and an overall goodness-of-fit measurement. The resulting foreground maps are in broad agreement with those from previous techniques used both within the collaboration and by other authors. We find that for WMAP data, a simple model with power-law synchrotron, free-free, and thermal dust components fits 90% of the sky with a reduced chi-squared (χ²_ν) of 1.14. However, the model does not work well inside the Galactic plane. The addition of either synchrotron steepening or a modified spinning dust model improves the fit. This component may account for up to 14% of the total flux at Ka-band (33 GHz). We find no evidence for foreground contamination of the CMB temperature map in the 85% of the sky used for cosmological analysis.
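The goodness-of-fit measure quoted above, the reduced chi-squared, is the sum of noise-normalized squared residuals divided by the degrees of freedom. The following minimal sketch uses purely synthetic numbers (not WMAP data) to show why a value near 1 indicates a model consistent with the assumed noise level:

```python
import random

def reduced_chi_squared(data, model, sigma, n_params):
    """Reduced chi^2: squared, noise-normalized residuals per degree of freedom."""
    dof = len(data) - n_params
    chi2 = sum(((d - m) / s) ** 2 for d, m, s in zip(data, model, sigma))
    return chi2 / dof

# Synthetic check: residuals drawn at the assumed noise level should give chi^2_nu near 1.
random.seed(0)
sigma = [0.1] * 1000
model = [1.0] * 1000
data = [m + random.gauss(0.0, s) for m, s in zip(model, sigma)]
print(round(reduced_chi_squared(data, model, sigma, n_params=3), 2))
```

Values well above 1 signal a poor fit or underestimated noise, as in the Galactic plane case described above; values well below 1 usually mean overestimated uncertainties.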
Cold Spray Repair of Martensitic Stainless Steel Components
NASA Astrophysics Data System (ADS)
Faccoli, M.; Cornacchia, G.; Maestrini, D.; Marconi, G. P.; Roberti, R.
2014-12-01
The possibility of using cold spray as a repair technique for martensitic stainless steel components was evaluated through laboratory investigations. An austenitic stainless steel feedstock powder was chosen, instead of soft metal powders such as nickel, copper, or aluminum, which are used for repairing components made of light alloys or cast iron. The present study directly compares the microstructure, residual stresses, and micro-hardness of repairs obtained by cold spray and by TIG welding, which is commonly used as a repair technique for large steel components. XRD and optical metallographic analysis of the repairs showed that cold spray offers some advantages, inducing compressive residual stresses in the repair and avoiding alteration of the interface between the repair and the base material. For these reasons, a heat treatment after the cold spray repair is not required to restore the base material properties, whereas a post-weld heat treatment is needed after the welding repair. The cold spray repair also exhibits a higher micro-hardness than the welding repair. In addition, the cavitation erosion resistance of a cold spray coating was investigated through ultrasonic cavitation tests, and the samples' worn surfaces were observed by scanning electron microscopy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Mather, Barry A
A library of load variability classes is created to produce scalable synthetic data sets using historical high-speed raw data. These data are collected from distribution monitoring units connected at the secondary side of a distribution transformer. Because of the irregular patterns and large volume of historical high-speed data sets, the utilization of current load characterization and modeling techniques is challenging. Multi-resolution analysis techniques are applied to extract the necessary components and eliminate the unnecessary components from the historical high-speed raw data to create the library of classes, which are then utilized to create new synthetic load data sets. A validation is performed to ensure that the synthesized data sets contain the same variability characteristics as the training data sets. The synthesized data sets are intended to be utilized in quasi-static time-series studies for distribution system planning on a granular scale, such as detailed PV interconnection studies.
Machine learning of frustrated classical spin models. I. Principal component analysis
NASA Astrophysics Data System (ADS)
Wang, Ce; Zhai, Hui
2017-10-01
This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by the classical Monte Carlo simulation for the XY model on frustrated triangular and union-jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
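The core idea of this approach, feeding raw spin configurations to PCA and reading order off the leading component, can be sketched in a few lines. The sketch below is a simplification: it uses synthetic Ising-like (±1) configurations rather than the paper's XY Monte Carlo data, with the ordered phase mimicked by globally aligned samples.

```python
import numpy as np

def pca(samples):
    """PCA via SVD of mean-centered data; returns explained-variance ratios and components."""
    X = samples - samples.mean(axis=0)
    _, s, vt = np.linalg.svd(X, full_matrices=False)
    var = s ** 2
    return var / var.sum(), vt

rng = np.random.default_rng(1)
n, sites = 500, 64
# Ordered phase stand-in: each sample picks a global sign, spins mostly follow it.
signs = rng.choice([-1.0, 1.0], (n, 1))
ordered = np.sign(rng.normal(signs, 0.5, (n, sites)))
# Disordered phase stand-in: independent random spins.
disordered = rng.choice([-1.0, 1.0], (n, sites))

ratio_ord, _ = pca(ordered)
ratio_dis, _ = pca(disordered)
# The leading component dominates only in the ordered phase, flagging the order parameter.
print(ratio_ord[0], ratio_dis[0])
```

In the ordered set the first explained-variance ratio is large (the leading component tracks magnetization), while in the disordered set the variance is spread roughly evenly across components, which is how temperature scans of this quantity can locate a transition.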
Multivariate Analysis of Solar Spectral Irradiance Measurements
NASA Technical Reports Server (NTRS)
Pilewskie, P.; Rabbette, M.
2001-01-01
Principal component analysis is used to characterize approximately 7000 downwelling solar irradiance spectra retrieved at the Southern Great Plains site during an Atmospheric Radiation Measurement (ARM) shortwave intensive operating period. This analysis technique has proven to be very effective in reducing a large set of variables into a much smaller set of independent variables while retaining the information content. It is used to determine the minimum number of parameters necessary to characterize atmospheric spectral irradiance or the dimensionality of atmospheric variability. It was found that well over 99% of the spectral information was contained in the first six mutually orthogonal linear combinations of the observed variables (flux at various wavelengths). Rotation of the principal components was effective in separating various components by their independent physical influences. The majority of the variability in the downwelling solar irradiance (380-1000 nm) was explained by the following fundamental atmospheric parameters (in order of their importance): cloud scattering, water vapor absorption, molecular scattering, and ozone absorption. In contrast to what has been proposed as a resolution to a clear-sky absorption anomaly, no unexpected gaseous absorption signature was found in any of the significant components.
Lorenz, Kevin S.; Salama, Paul; Dunn, Kenneth W.; Delp, Edward J.
2013-01-01
Digital image analysis is a fundamental component of quantitative microscopy. However, intravital microscopy presents many challenges for digital image analysis. In general, microscopy volumes are inherently anisotropic, suffer from decreasing contrast with tissue depth, lack object edge detail, and characteristically have low signal levels. Intravital microscopy introduces the additional problem of motion artifacts, resulting from respiratory motion and heartbeat from specimens imaged in vivo. This paper describes an image registration technique for use with sequences of intravital microscopy images collected in time-series or in 3D volumes. Our registration method involves both rigid and non-rigid components. The rigid registration component corrects global image translations, while the non-rigid component manipulates a uniform grid of control points defined by B-splines. Each control point is optimized by minimizing a cost function consisting of two parts: a term to define image similarity, and a term to ensure deformation grid smoothness. Experimental results indicate that this approach is promising based on the analysis of several image volumes collected from the kidney, lung, and salivary gland of living rodents. PMID:22092443
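The two-term cost function described above (an image-similarity term plus a deformation-grid smoothness term) can be sketched directly. This is a hedged illustration: the paper does not specify the similarity metric or penalty here, so the sketch assumes sum-of-squared-differences similarity and a squared second-difference smoothness penalty on the control-point displacements.

```python
import numpy as np

def registration_cost(fixed, moving_warped, grid, weight=0.1):
    """Cost = image similarity + weighted grid smoothness.
    Assumptions: SSD for similarity; squared second differences of the
    B-spline control-point displacement grid for smoothness."""
    similarity = np.mean((fixed - moving_warped) ** 2)
    d2x = np.diff(grid, n=2, axis=0)     # curvature along grid rows
    d2y = np.diff(grid, n=2, axis=1)     # curvature along grid columns
    smoothness = (d2x ** 2).sum() + (d2y ** 2).sum()
    return similarity + weight * smoothness

rng = np.random.default_rng(2)
img = rng.random((32, 32))
flat = np.zeros((5, 5, 2))              # identity deformation: zero displacements
bumpy = flat.copy()
bumpy[2, 2] = (3.0, -3.0)               # one control point pulled far out of line
print(registration_cost(img, img, flat))       # perfect match, smooth grid: cost 0.0
print(registration_cost(img, img, bumpy) > 0)  # smoothness term penalizes the kink
```

An optimizer would adjust each control point to trade the two terms off, which is what keeps the non-rigid correction from chasing noise in low-signal intravital volumes.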
NASA Astrophysics Data System (ADS)
Sitohang, Yosep Oktavianus; Darmawan, Gumgum
2017-08-01
This research compares two time series forecasting models for predicting the sales volume of motorcycles in Indonesia. The first forecasting model is the Autoregressive Fractionally Integrated Moving Average (ARFIMA). ARFIMA can handle non-stationary data and outperforms ARIMA in forecasting accuracy on long-memory data, because the fractional difference parameter can capture correlation structure in data with short memory, long memory, or both simultaneously. The second forecasting model is Singular Spectrum Analysis (SSA). The advantage of this technique is that it decomposes time series data into the classic components, i.e. trend, cyclical, seasonal, and noise components, which makes its forecasting accuracy significantly better. Furthermore, SSA is a model-free technique, so it is likely to have a very wide range of application. Selection of the best model is based on the lowest MAPE value. Based on the calculations, the best ARFIMA model is ARFIMA(3, d = 0.63, 0), with a MAPE value of 22.95 percent. SSA with a window length of 53 and 4 groups of reconstructed data results in a MAPE value of 13.57 percent. Based on these results, it is concluded that SSA produces better forecasting accuracy.
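The model-selection criterion used above, mean absolute percentage error, is straightforward to compute. The sketch below uses small hypothetical sales figures (not the paper's Indonesian motorcycle data) to show how the lower-MAPE model is selected:

```python
def mape(actual, forecast):
    """Mean absolute percentage error, in percent; lower is better."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical monthly sales and two competing forecasts (illustrative numbers only).
actual   = [120.0, 135.0, 150.0, 160.0]
arfima_f = [100.0, 150.0, 170.0, 140.0]
ssa_f    = [110.0, 130.0, 155.0, 150.0]

scores = {"ARFIMA": mape(actual, arfima_f), "SSA": mape(actual, ssa_f)}
best = min(scores, key=scores.get)
print(best, round(scores[best], 2))  # prints: SSA 5.41
```

Note that MAPE is undefined when an actual value is zero and penalizes over- and under-forecasts asymmetrically, which is worth keeping in mind when it is the sole selection criterion.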
Strittmatter, Nicole; Düring, Rolf-Alexander; Takáts, Zoltán
2012-09-07
An analysis method for aqueous samples by the direct combination of C18/SCX mixed-mode thin-film microextraction (TFME) and desorption electrospray ionization mass spectrometry (DESI-MS) was developed. Both techniques make the analytical workflow simpler and faster; hence, their combination enables considerably shorter analysis times compared to the traditional liquid chromatography mass spectrometry (LC-MS) approach. The method was characterized using carbamazepine and triclosan as typical examples of pharmaceutical and personal care product (PPCP) components, which draw increasing attention as wastewater-derived environmental contaminants. Both model compounds were successfully detected in real wastewater samples and their concentrations determined using external calibration with isotope-labeled standards. Effects of temperature, agitation, sample volume, and exposure time were investigated in the case of spiked aqueous samples. Results were compared to those of parallel HPLC-MS determinations, and good agreement was found over a concentration range spanning three orders of magnitude. Serious matrix effects were observed in treated wastewater, but lower limits of detection were still found to be in the low ng/L range. Using an Orbitrap mass spectrometer, the technique was found to be ideal for screening purposes and led to the detection of various PPCP components in wastewater treatment plant effluents, including beta-blockers, nonsteroidal anti-inflammatory drugs, and UV filters.
Chemometric techniques in oil classification from oil spill fingerprinting.
Ismail, Azimah; Toriman, Mohd Ekhwan; Juahir, Hafizan; Kassim, Azlina Md; Zain, Sharifuddin Md; Ahmad, Wan Kamaruzaman Wan; Wong, Kok Fah; Retnam, Ananthy; Zali, Munirah Abdul; Mokhtar, Mazlin; Yusri, Mohd Ayub
2016-10-15
Extended use of GC-FID and GC-MS in oil spill fingerprinting and matching is particularly important for classifying oil from spill sources collected from various areas of Peninsular Malaysia and Sabah (East Malaysia). Oil spill fingerprinting from GC-FID and GC-MS coupled with chemometric techniques (discriminant analysis and principal component analysis) is used as a diagnostic tool to classify the types of oil polluting the water. Clustering and discrimination of oil spill compounds in the water from the actual sites of oil spill events divide the samples into four groups, viz. diesel, Heavy Fuel Oil (HFO), Mixture Oil containing Light Fuel Oil (MOLFO) and Waste Oil (WO), according to the similarity of their intrinsic chemical properties. Principal component analysis (PCA) demonstrates that diesel, HFO, MOLFO and WO are types of oil or oil products from complex oil mixtures, with a total variance of 85.34%, and are identified with various anthropogenic activities related to either intentional release or accidental discharge of oil into the environment. Our results show that the use of chemometric techniques is significant in providing independent validation for classifying the types of spilled oil in the investigation of oil spill pollution in Malaysia. This, in consequence, would save cost and time in identifying oil spill sources. Copyright © 2016. Published by Elsevier Ltd.
Bajpai, Vikas; Sharma, Deepty; Kumar, Brijesh; Madhusudanan, K P
2010-12-01
Piper betle Linn. is a traditional plant associated with Asian and Southeast Asian cultures. Its use is also recorded in folk medicines in these regions. Several of its medicinal properties have recently been proven. Phytochemical analysis showed the presence of mainly terpenes and phenols in betel leaves. These constituents vary among the different cultivars of Piper betle. In this paper we have attempted to profile eight locally available betel cultivars using the recently developed mass spectral ionization technique of direct analysis in real time (DART). Principal component analysis has also been employed to analyze the DART MS data of these betel cultivars. The results show that the cultivars of Piper betle can be differentiated using DART MS data. Copyright © 2010 John Wiley & Sons, Ltd.
Neutron beam measurement of industrial polymer materials for composition and bulk integrity
NASA Astrophysics Data System (ADS)
Rogante, M.; Rosta, L.; Heaton, M. E.
2013-10-01
Neutron beam techniques, among other non-destructive diagnostics, play an irreplaceable role in the complete analysis of industrial materials and components, supplying fundamental information. In this paper, nanoscale small-angle neutron scattering analysis and prompt gamma activation analysis for the characterization of industrial polymers are considered. The basic theoretical aspects are briefly introduced and some applications are presented. The investigations of the SU-8 polymer in axial airflow microturbines (i.e. microelectromechanical systems) are presented foremost. Also presented are full and feasibility studies on polyurethanes, composites based on cross-linked polymers reinforced by carbon fibres, and polymer cement concrete. The obtained results have provided a substantial contribution to the improvement of the considered materials, and have confirmed the industrial applicability of the adopted techniques in the analysis of polymers.
Non-linear principal component analysis applied to Lorenz models and to North Atlantic SLP
NASA Astrophysics Data System (ADS)
Russo, A.; Trigo, R. M.
2003-04-01
A non-linear generalisation of Principal Component Analysis (PCA), denoted Non-Linear Principal Component Analysis (NLPCA), is introduced and applied to the analysis of three data sets. Non-Linear Principal Component Analysis allows for the detection and characterisation of low-dimensional non-linear structure in multivariate data sets. This method is implemented using a 5-layer feed-forward neural network introduced originally in the chemical engineering literature (Kramer, 1991). The method is described and details of its implementation are addressed. Non-Linear Principal Component Analysis is first applied to a data set sampled from the Lorenz attractor (1963). It is found that the NLPCA approximations are more representative of the data than are the corresponding PCA approximations. The same methodology was applied to the less known Lorenz attractor (1984). However, the results obtained were not as good as those attained with the famous 'Butterfly' attractor. Further work with this model is underway in order to assess whether NLPCA techniques can be more representative of the data characteristics than are the corresponding PCA approximations. The application of NLPCA to relatively 'simple' dynamical systems, such as those proposed by Lorenz, is well understood. However, the application of NLPCA to a large climatic data set is much more challenging. Here, we have applied NLPCA to the sea level pressure (SLP) field for the entire North Atlantic area, and the results show a slight increase in the explained variance. Finally, directions for future work are presented.
Novel selective TOCSY method enables NMR spectral elucidation of metabolomic mixtures
NASA Astrophysics Data System (ADS)
MacKinnon, Neil; While, Peter T.; Korvink, Jan G.
2016-11-01
Complex mixture analysis is routinely encountered in NMR-based investigations. With the aim of component identification, spectral complexity may be addressed chromatographically or spectroscopically, the latter being favored to reduce sample handling requirements. An attractive experiment is selective total correlation spectroscopy (sel-TOCSY), which is capable of providing tremendous spectral simplification and thereby enhancing assignment capability. Unfortunately, isolating a well resolved resonance is increasingly difficult as the complexity of the mixture increases and the assumption of single spin system excitation is no longer robust. We present TOCSY optimized mixture elucidation (TOOMIXED), a technique capable of performing spectral assignment particularly in the case where the assumption of single spin system excitation is relaxed. Key to the technique is the collection of a series of 1D sel-TOCSY experiments as a function of the isotropic mixing time (τm), resulting in a series of resonance intensities indicative of the underlying molecular structure. By comparing these τm -dependent intensity patterns with a library of pre-determined component spectra, one is able to regain assignment capability. After consideration of the technique's robustness, we tested TOOMIXED first on a model mixture. As a benchmark we were able to assign a molecule with high confidence in the case of selectively exciting an isolated resonance. Assignment confidence was not compromised when performing TOOMIXED on a resonance known to contain multiple overlapping signals, and in the worst case the method suggested a follow-up sel-TOCSY experiment to confirm an ambiguous assignment. TOOMIXED was then demonstrated on two realistic samples (whisky and urine), where under our conditions an approximate limit of detection of 0.6 mM was determined. Taking into account literature reports for the sel-TOCSY limit of detection, the technique should reach on the order of 10 μM sensitivity.
We anticipate this technique will be highly attractive to various analytical fields facing mixture analysis, including metabolomics, foodstuff analysis, pharmaceutical analysis, and forensics.
A computer program for cyclic plasticity and structural fatigue analysis
NASA Technical Reports Server (NTRS)
Kalev, I.
1980-01-01
A computerized tool for the analysis of time independent cyclic plasticity structural response, life to crack initiation prediction, and crack growth rate prediction for metallic materials is described. Three analytical items are combined: the finite element method with its associated numerical techniques for idealization of the structural component, cyclic plasticity models for idealization of the material behavior, and damage accumulation criteria for the fatigue failure.
The Importance of Practice in the Development of Statistics.
1983-01-01
NRC Technical Summary Report #2471, The Importance of Practice in the Development of Statistics. Topics covered include component analysis, bioassay, limits for a ratio, quality control, sampling inspection, non-parametric tests, transformation theory, ARIMA time series models, sequential tests, cumulative sum charts, data analysis plotting techniques, and a resolution of the Bayes-frequentist controversy.
Multivariate Quantitative Chemical Analysis
NASA Technical Reports Server (NTRS)
Kinchen, David G.; Capezza, Mary
1995-01-01
Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
On the use of the hole-drilling technique for residual stress measurements in thin plates
NASA Technical Reports Server (NTRS)
Hampton, R. W.; Nelson, D. V.
1992-01-01
The strain gage blind hole-drilling technique may be used to determine residual stresses at and below the surface of components. In this paper, the hole-drilling analysis methodology for thick plates is reviewed, and experimental data are used to evaluate the methodology and to assess its applicability to thin plates. Data on the effects of gage pattern, surface preparation, hole spacing, hole eccentricity, and stress level are also presented.
NASA Astrophysics Data System (ADS)
Daniel, Amuthachelvi; Prakasarao, Aruna; Ganesan, Singaravelu
2018-02-01
The molecular level changes associated with oncogenesis precede the morphological changes in cells and tissues. Hence molecular level diagnosis would promote early diagnosis of the disease. Raman spectroscopy is capable of providing specific spectral signatures of the various biomolecules present in cells and tissues under various pathological conditions. The aim of this work is to develop a non-linear multi-class statistical methodology for discrimination of normal, neoplastic and malignant cells/tissues. The tissues were classified as normal, pre-malignant and malignant by employing Principal Component Analysis followed by Artificial Neural Network (PC-ANN). The overall accuracy achieved was 99%. Further, to gain insight into the quantitative biochemical composition of the normal, neoplastic and malignant tissues, a linear combination of the major biochemicals was fit to the measured Raman spectra of the tissues by the non-negative least squares technique. This technique confirms the changes in major biomolecules such as lipids, nucleic acids, actin, glycogen and collagen associated with the different pathological conditions. To study the efficacy of this technique in comparison with histopathology, we utilized Principal Component Analysis followed by Linear Discriminant Analysis (PC-LDA) to discriminate well differentiated, moderately differentiated and poorly differentiated squamous cell carcinoma with an accuracy of 94.0%. The results demonstrate that Raman spectroscopy has the potential to complement the established technique of histopathology.
NASA Astrophysics Data System (ADS)
Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho
2015-01-01
Independent Component Analysis (ICA), one of the blind source separation methods, can be applied for extracting unknown source signals only from received signals. This is accomplished by finding statistical independence of signal mixtures and has been successfully applied to myriad fields such as medical science, image processing, and numerous others. Nevertheless, there are inherent problems that have been reported when using this technique: instability and invalid ordering of separated signals, particularly when using a conventional ICA technique in vibratory source signal identification of complex structures. In this study, a simple iterative algorithm of the conventional ICA has been proposed to mitigate these problems. The proposed method to extract more stable source signals having valid order includes an iterative and reordering process of extracted mixing matrix to reconstruct finally converged source signals, referring to the magnitudes of correlation coefficients between the intermediately separated signals and the signals measured on or nearby sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate applicability of the proposed method to real problem of complex structure, an experiment has been carried out for a scaled submarine mockup. The results show that the proposed method could resolve the inherent problems of a conventional ICA technique.
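The reordering step described above, matching each separated signal to the reference measured on or near its source by the magnitude of their correlation coefficient, can be sketched independently of any particular ICA implementation. The sketch below assumes a simple greedy assignment and fakes the "ICA output" as permuted, sign-flipped sources, since ICA recovers components only up to order and sign:

```python
import numpy as np

def reorder_by_reference(separated, references):
    """Reorder separated signals so component i best matches reference i,
    using magnitudes of correlation coefficients (greedy assignment)."""
    k = references.shape[0]
    # corr[i, j] = |correlation(separated_i, reference_j)|
    corr = np.abs(np.corrcoef(np.vstack([separated, references]))[:k, k:])
    order, used = [], set()
    for j in range(k):   # for each reference, pick the best unused component
        best = max((i for i in range(k) if i not in used), key=lambda i: corr[i, j])
        order.append(best)
        used.add(best)
    return separated[order]

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 400)
s1 = np.sin(2 * np.pi * 5 * t)
s2 = np.sign(np.sin(2 * np.pi * 3 * t))
sources = np.vstack([s1, s2])
# Pretend ICA returned the sources in the wrong order and with a flipped sign:
separated = np.vstack([-s2, s1])
refs = sources + 0.1 * rng.normal(size=sources.shape)  # noisy near-source measurements
fixed = reorder_by_reference(separated, refs)
print(np.abs(np.corrcoef(fixed[0], sources[0])[0, 1]) > 0.99)
```

A full implementation in the spirit of the paper would wrap this inside the iterative ICA loop, reconstructing the mixing matrix until the separated signals converge in a stable, valid order.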
Komal
2018-05-01
Power consumption is increasing day by day. To meet the requirement of failure-free power, the planning and implementation of an effective and reliable power management system is essential. The phasor measurement unit (PMU) is one of the key devices in wide area measurement and control systems. The reliable performance of the PMU assures a failure-free power supply for any power system. The purpose of the present study is therefore to analyse the reliability of a PMU used for the controllability and observability of power systems, utilizing available uncertain data. In this paper, a generalized fuzzy lambda-tau (GFLT) technique has been proposed for this purpose. In the GFLT, the uncertain failure and repair rates of system components are fuzzified using fuzzy numbers of different shapes, such as triangular, normal, Cauchy, sharp gamma and trapezoidal. To select a suitable fuzzy number for quantifying data uncertainty, system experts' opinions have been considered. The GFLT technique applies fault trees, the lambda-tau method, data fuzzified using different membership functions, and alpha-cut based fuzzy arithmetic operations to compute some important reliability indices. Furthermore, in this study, ranking of the critical components of the system using the RAM-Index and sensitivity analysis have also been performed. The developed technique may help to improve system performance significantly and can be applied to analyse the fuzzy reliability of other engineering systems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
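The alpha-cut fuzzy arithmetic mentioned above can be illustrated with the simplest case, triangular fuzzy numbers: at membership level alpha, each fuzzy rate becomes an interval, and intervals propagate through the system model by interval arithmetic. The sketch below is a minimal assumption-laden example, not the paper's GFLT: the component names and failure rates are hypothetical, and a plain series system (rates add) stands in for the fault-tree model.

```python
def alpha_cut(tri, alpha):
    """Alpha-cut of a triangular fuzzy number (a, b, c): an interval that
    shrinks from [a, c] at alpha=0 to the single point [b, b] at alpha=1."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def series_failure_rate(components, alpha):
    """Interval arithmetic: the failure rate of a series system is the sum of
    the component rates, so the alpha-cut bounds simply add."""
    lows, highs = zip(*(alpha_cut(tri, alpha) for tri in components))
    return (sum(lows), sum(highs))

# Hypothetical per-hour failure rates as triangular fuzzy numbers (low, modal, high).
pmu_parts = [(1e-6, 2e-6, 4e-6),   # e.g. a GPS receiver module (illustrative)
             (2e-6, 3e-6, 5e-6)]   # e.g. a data-acquisition unit (illustrative)
print(series_failure_rate(pmu_parts, alpha=1.0))  # alpha=1: collapses to the modal sum
print(series_failure_rate(pmu_parts, alpha=0.0))  # alpha=0: widest uncertainty interval
```

Sweeping alpha from 0 to 1 reconstructs the full fuzzy membership function of the system-level index; the GFLT applies the same idea with the lambda-tau expressions and other membership-function shapes.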
LaViolette, Peter S; Daun, Mitchell K; Paulson, Eric S; Schmainda, Kathleen M
2014-02-01
Abnormal brain tumor vasculature has recently been highlighted by a dynamic susceptibility contrast (DSC) MRI processing technique. The technique uses independent component analysis (ICA) to separate arterial and venous perfusion. The overlap of the two, i.e. arterio-venous overlap or AVOL, preferentially occurs in brain tumors and predicts response to anti-angiogenic therapy. The effects of contrast agent leakage on the AVOL biomarker have yet to be established. DSC was acquired during two separate contrast boluses in ten patients undergoing clinical imaging for brain tumor diagnosis. Three components were modeled with ICA, which included the arterial and venous components. The percentage of each component as well as a third component were determined within contrast enhancing tumor and compared. AVOL within enhancing tumor was also compared between doses. The percentage of enhancing tumor classified as not arterial or venous and instead into a third component with contrast agent leakage apparent in the time-series was significantly greater for the first contrast dose compared to the second. The amount of AVOL detected within enhancing tumor was also significantly greater with the second dose compared to the first. Contrast leakage results in large signal variance classified as a separate component by the ICA algorithm. The use of a second dose mitigates the effect and allows measurement of AVOL within enhancement.
Microstructural Analysis of Ti-6Al-4V Components Made by Electron Beam Additive Manufacturing
NASA Astrophysics Data System (ADS)
Coleman, Rashadd L.
Electron Beam Additive Manufacturing (EBAM) is a relatively new additive manufacturing (AM) technology that uses a high-energy electron beam to melt and fuse powders to build full-density parts in a layer-by-layer fashion. EBAM can fabricate metallic components, particularly those of complex shapes, in an efficient and cost-effective manner compared to conventional manufacturing means. EBAM is an enabling technology for rapid manufacturing (RM) of metallic components and thus can efficiently integrate the design and manufacturing of aerospace components. However, EBAM for aerospace-related applications remains limited because the effect of the EBAM process on part characteristics is not fully understood. In this study, various techniques including microhardness, optical microscopy (OM), X-ray diffraction (XRD), scanning electron microscopy (SEM), and electron backscatter diffraction (EBSD) were used to characterize Ti-6Al-4V components processed using EBAM. The results were compared to Ti-6Al-4V components processed using conventional techniques. It is shown that EBAM-built Ti-6Al-4V components have increased hardness, elastic modulus, and yield strength compared to wrought Ti-6Al-4V. Further, horizontally built EBAM Ti-6Al-4V shows increased hardness, elastic modulus, and yield strength compared to vertically built EBAM Ti-6Al-4V, due to preferential growth of the beta phase.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, D.G.; Sorensen, N.R.
1998-02-01
This report presents a nondestructive inspection assessment of eddy current and electrochemical analysis to separate inconel alloys from stainless steel alloys, as well as an evaluation of cleaning techniques to remove a thermal oxide layer on aircraft exhaust components. The results of this assessment are presented in terms of how effectively each technique classifies a known exhaust material. Results indicate that either inspection technique can separate inconel and stainless steel alloys. Based on the experiments conducted, the electrochemical spot test is the optimum for use by airframe and powerplant mechanics. A spot test procedure is proposed for incorporation into the Federal Aviation Administration Advisory Circular 65-9A Airframe & Powerplant Mechanic - General Handbook. 3 refs., 70 figs., 7 tabs.
Surface analysis characterisation of gum binders used in modern watercolour paints
NASA Astrophysics Data System (ADS)
Sano, Naoko; Cumpson, Peter J.
2016-02-01
This study has demonstrated that not only SEM-EDX but also XPS can be an efficient tool for characterising watercolour paint surfaces. We find that surface effects are mediated by water. Once the powdered components in the watercolour come into contact with water, they dramatically transform their chemical structures at the surface and show the presence of pigment components randomly dispersed within the gum layer. Hence the topmost surface of the paint is confirmed as being composed of the gum binder components. This result is difficult to confirm using just one analytical technique (either XPS or SEM-EDX alone). In addition, peak fitting of C 1s XPS spectra suggests that the gum binder in the commercial watercolour paints is probably gum arabic (by comparison with the reference materials). This identification is not conclusive, but the combination of XPS and SEM-EDX reveals the surface structure and the material distribution of the gum binder and the other ingredients of the watercolour paints. XPS combined with SEM-EDX may therefore prove a useful method for studying the surface structure of not only watercolour objects but also other art objects, which may in future aid their conservation.
NASA Astrophysics Data System (ADS)
Kozikowski, Raymond T.; Smith, Sarah E.; Lee, Jennifer A.; Castleman, William L.; Sorg, Brian S.; Hahn, David W.
2012-06-01
Fluorescence spectroscopy has been widely investigated as a technique for identifying pathological tissue; however, unrelated subject-to-subject variations in spectra complicate data analysis and interpretation. We describe and evaluate a new biosensing technique, differential laser-induced perturbation spectroscopy (DLIPS), based on deep ultraviolet (UV) photochemical perturbation in combination with difference spectroscopy. This technique combines sequential fluorescence probing (pre- and post-perturbation) with sub-ablative UV perturbation and difference spectroscopy to provide a new spectral dimension, facilitating two improvements over fluorescence spectroscopy. First, the differential technique eliminates significant variations in absolute fluorescence response within subject populations. Second, UV perturbations alter the extracellular matrix (ECM), directly coupling the DLIPS response to the biological structure. Improved biosensing with DLIPS is demonstrated in vivo in a murine model of chemically induced skin lesion development. Component loading analysis of the data indicates that the DLIPS technique couples to structural proteins in the ECM. Analysis of variance shows that DLIPS has a significant response to emerging pathology as opposed to other population differences. An optimal likelihood ratio classifier for the DLIPS dataset shows that this technique holds promise for improved diagnosis of epithelial pathology. Results further indicate that DLIPS may improve diagnosis of tissue by augmenting fluorescence spectra (i.e. orthogonal sensing).
Preparing data for analysis using Microsoft Excel.
Elliott, Alan C; Hynan, Linda S; Reisch, Joan S; Smith, Janet P
2006-09-01
A critical component essential to good research is the accurate and efficient collection and preparation of data for analysis. Most medical researchers have little or no training in data management, often causing not only excessive time spent cleaning data but also a risk that the data set contains collection or recording errors. The implementation of simple guidelines based on techniques used by professional data management teams will save researchers time and money and result in a data set better suited to answer research questions. Because Microsoft Excel is often used by researchers to collect data, specific techniques that can be implemented in Excel are presented.
NASA Astrophysics Data System (ADS)
Srivastava, Vishal; Mehta, D. S.
2013-02-01
To quantitatively obtain the phase maps of onion cells and human red blood cells (RBCs) from a white-light interferogram, we used a Hilbert-transform colour fringe analysis technique. The red, green, and blue colour components were decomposed from a single white-light interferogram, and refractive index profiles for each colour channel were computed in a completely non-invasive manner for the onion cells and human RBCs. The present technique might be useful for non-invasive determination of refractive index variation within cells and tissues and of the morphological features of a sample, with ease of operation and low cost.
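The core of such fringe analysis is extracting the wrapped phase via the analytic signal. The following is a minimal single-channel sketch (not the authors' code) that recovers a known object phase from a synthetic 1-D interferogram using `scipy.signal.hilbert`; the carrier frequency and phase values are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic 1-D interferogram: linear carrier fringes phase-modulated
# by a known "object" phase that we will try to recover.
x = np.linspace(0.0, 1.0, 1000)
carrier = 2 * np.pi * 50 * x                 # 50 fringes across the field
obj_phase = 3.0 * np.sin(2 * np.pi * x)      # object phase (radians)
fringes = 1.0 + 0.8 * np.cos(carrier + obj_phase)

# The analytic signal of the zero-mean fringe pattern yields the wrapped
# total phase; unwrapping removes the 2*pi jumps.
analytic = hilbert(fringes - fringes.mean())
total_phase = np.unwrap(np.angle(analytic))

# Fit and subtract the linear carrier to leave the (detrended) object phase.
slope, intercept = np.polyfit(x, total_phase, 1)
recovered = total_phase - (slope * x + intercept)
```

Because the carrier is estimated by a linear fit, the recovered map equals the object phase up to its own linear trend; edge samples are less reliable owing to the Hilbert transform's end effects. In the paper's method this step would be applied separately to each of the red, green, and blue channels.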
PCC Framework for Program-Generators
NASA Technical Reports Server (NTRS)
Kong, Soonho; Choi, Wontae; Yi, Kwangkeun
2009-01-01
In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached with the program-generator on the code producer side. The consumer receives the code with a fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.
Three-dimensional human femoral strain analysis using ESPI
NASA Astrophysics Data System (ADS)
Tyrer, J. R.; Heras-Palou, C.; Slater, T.
With age, disease or injury the joints in the human body can wear out or bones may even fail catastrophically. In many cases it is possible to replace joints and bones with artificial components (prostheses). However, prosthetic joints can have a very limited life (often less than 10 years) and require replacement or 'revision'. In order to optimise prosthetic life, it is necessary to improve the design of components and implantation techniques, which is clearly also beneficial to both patients and hospitals.
Torres-González, Ahira; López-Rivera, Paulina; Duarte-Lisci, Georgina; López-Ramírez, Ángel; Correa-Benítez, Adriana; Rivero-Cruz, J Fausto
2016-01-01
A head space solid-phase microextraction method combined with gas chromatography-mass spectrometry was developed and optimised to extract and analyse volatile compounds of Melipona beecheii geopropolis. Seventy-three constituents were identified using this technique in the sample of geopropolis collected. The main compounds detected included β-fenchene (14.53-15.45%), styrene (8.72-9.98%) and benzaldehyde (7.44-7.82%), and the most relevant volatile components present at high levels in the geopropolis were terpenoids (58.17%).
NASA Astrophysics Data System (ADS)
Zharinov, I. O.; Zharinov, O. O.
2017-12-01
The research problem concerns the quantitative analysis of the influence of technological variation in screen color profile parameters on the chromaticity coordinates of the displayed image. Mathematical expressions are proposed that approximate the two-dimensional distribution of chromaticity coordinates of an image displayed on a screen with a three-component color formation principle. The proposed expressions point the way toward correction techniques that improve the reproducibility of the colorimetric characteristics of displays.
Safety considerations in the design and operation of large wind turbines
NASA Technical Reports Server (NTRS)
Reilly, D. H.
1979-01-01
The engineering and safety techniques used to assure the reliable and safe operation of large wind turbine generators are described, using the Mod 2 Wind Turbine System Program as an example. The techniques involve a careful definition of the wind turbine's natural and operating environments, use of proven structural design criteria and analysis techniques, an evaluation of potential failure modes and hazards, and use of a fail-safe and redundant-component engineering philosophy. The role of an effective quality assurance program, tailored to specific hardware criticality, and the checkout and validation program developed to assure system integrity are also described.
NASA Astrophysics Data System (ADS)
Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin
2011-08-01
In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components, for example a mixture of methamphetamine, heroin, and amoxicillin, which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis in this paper. The simulation results agree with previous experimental results, suggesting that the proposed technique has potential applications for terahertz spectral identification of drug mixture components; they also agree with results obtained using other experimental and numerical techniques.
Standard surface-reflectance model and illuminant estimation
NASA Technical Reports Server (NTRS)
Tominaga, Shoji; Wandell, Brian A.
1989-01-01
A vector analysis technique was adopted to test the standard reflectance model. A computational model was developed to determine the components of the observed spectra and an estimate of the illuminant was obtained without using a reference white standard. The accuracy of the standard model is evaluated.
Dunphy, C H; Polski, J M; Evans, H L; Gardner, L J
2001-08-01
Immunophenotyping of bone marrow (BM) specimens with acute myelogenous leukemia (AML) may be performed by flow cytometric (FC) or immunohistochemical (IH) techniques. Some markers (CD34, CD15, and CD117) are available for both techniques. Myeloperoxidase (MPO) analysis may be performed by enzyme cytochemical (EC) or IH techniques. To determine the reliability of these markers and MPO by these techniques, we designed a study to compare the results of analyses of these markers and MPO by FC (CD34, CD15, and CD117), EC (MPO), and IH (CD34, CD15, CD117, and MPO) techniques. Twenty-nine AMLs formed the basis of the study. These AMLs all had been immunophenotyped previously by FC analysis; 27 also had had EC analysis performed. Of the AMLs, 29 had BM core biopsies and 26 had BM clots that could be evaluated. The paraffin blocks of the 29 BM core biopsies and 26 BM clots were stained for CD34, CD117, MPO, and CD15. These results were compared with results by FC analysis (CD34, CD15, and CD117) and EC analysis (MPO). Immunodetection of CD34 expression in AML had a similar sensitivity by FC and IH techniques. Immunodetection of CD15 and CD117 had a higher sensitivity by FC analysis than by IH analysis. Detection of MPO by IH analysis was more sensitive than by EC analysis. There was no correlation of French-American-British (FAB) subtype of AML with CD34 or CD117 expression. Expression of CD15 was associated with AMLs with a monocytic component. Myeloperoxidase reactivity by IH analysis was observed in AMLs originally FAB subtyped as M0. CD34 can be equally detected by FC and IH techniques. CD15 and CD117 are better detected by FC analysis and MPO is better detected by IH analysis.
Kairov, Ulykbek; Cantini, Laura; Greco, Alessandro; Molkenov, Askhat; Czerwinska, Urszula; Barillot, Emmanuel; Zinovyev, Andrei
2017-09-11
Independent Component Analysis (ICA) is a method that models gene expression data as an action of a set of statistically independent hidden factors. The output of ICA depends on a fundamental parameter: the number of components (factors) to compute. The optimal choice of this parameter, related to determining the effective data dimension, remains an open question in the application of blind source separation techniques to transcriptomic data. Here we address the question of optimizing the number of statistically independent components in the analysis of transcriptomic data for reproducibility of the components in multiple runs of ICA (within the same or within varying effective dimensions) and in multiple independent datasets. To this end, we introduce ranking of independent components based on their stability in multiple ICA computation runs and define a distinguished number of components (Most Stable Transcriptome Dimension, MSTD) corresponding to the point of the qualitative change of the stability profile. Based on a large body of data, we demonstrate that a sufficient number of dimensions is required for biological interpretability of the ICA decomposition and that the most stable components with ranks below MSTD have more chances to be reproduced in independent studies compared to the less stable ones. At the same time, we show that a transcriptomics dataset can be reduced to a relatively high number of dimensions without losing the interpretability of ICA, even though higher dimensions give rise to components driven by small gene sets. We suggest a protocol of ICA application to transcriptomics data with a possibility of prioritizing components with respect to their reproducibility that strengthens the biological interpretation. Computing too few components (much less than MSTD) is not optimal for interpretability of the results. The components ranked within MSTD range have more chances to be reproduced in independent studies.
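The stability-ranking idea can be illustrated with a small sketch: run ICA several times with different random initializations and score each reference component by its best absolute cosine match across runs. This is an illustrative approximation of the procedure, assuming scikit-learn's `FastICA` rather than the authors' implementation, and using synthetic super-Gaussian sources in place of transcriptomic data.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic data standing in for expression profiles: 3 statistically
# independent super-Gaussian sources mixed into 20 observed features.
n_samples, n_sources, n_features = 500, 3, 20
S = rng.laplace(size=(n_samples, n_sources))
A = rng.normal(size=(n_sources, n_features))
X = S @ A

def unit_unmixing_rows(X, n_comp, seed):
    """Fit FastICA and return its unmixing directions as unit rows."""
    ica = FastICA(n_components=n_comp, random_state=seed, max_iter=1000)
    ica.fit(X)
    W = ica.components_
    return W / np.linalg.norm(W, axis=1, keepdims=True)

# Stability of each reference component: mean of its best absolute cosine
# match against the components found in repeated runs with new seeds.
ref = unit_unmixing_rows(X, 3, seed=0)
n_runs = 5
stability = np.zeros(3)
for seed in range(1, n_runs + 1):
    W = unit_unmixing_rows(X, 3, seed)
    stability += np.abs(ref @ W.T).max(axis=1)
stability /= n_runs
ranking = np.argsort(stability)[::-1]   # most stable components first
```

With well-separated sources all stabilities approach 1; in the MSTD procedure one would instead look for the point where this profile drops qualitatively as the number of components grows.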
Fractal analysis of scatter imaging signatures to distinguish breast pathologies
NASA Astrophysics Data System (ADS)
Eguizabal, Alma; Laughney, Ashley M.; Krishnaswamy, Venkataramanan; Wells, Wendy A.; Paulsen, Keith D.; Pogue, Brian W.; López-Higuera, José M.; Conde, Olga M.
2013-02-01
Fractal analysis combined with a label-free scattering technique is proposed for describing the pathological architecture of tumors. Clinicians and pathologists are conventionally trained to classify abnormal features such as structural irregularities or high indices of mitosis. The potential of fractal analysis lies in its being a morphometric measure of irregular structures, providing a measure of an object's complexity and self-similarity. As cancer is characterized by disorder and irregularity in tissues, this measure could be related to tumor growth. Fractal analysis has previously been explored in the understanding of the tumor vasculature network. This work addresses the feasibility of applying fractal analysis to the scattering power map (as a physical model) and to the principal components (as a statistical model) provided by a localized reflectance spectroscopic system. Disorder, irregularity, and cell size variation in tissue samples are translated into the magnitudes of the scattering power and principal components, and their fractal dimension is correlated with the pathologist's assessment of the samples. The fractal dimension is computed by applying the box-counting technique. Results show that fractal analysis of ex-vivo fresh tissue samples yields separated ranges of fractal dimension that could help a classifier that combines the fractal results with other morphological features. This contrast trend would help in the discrimination of tissues in the intraoperative context and may serve as a useful adjunct to surgeons.
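The box-counting estimate mentioned above can be sketched in a few lines: cover a binary map with boxes of decreasing size and fit the slope of log N(s) against log(1/s). A minimal illustration (not the authors' pipeline) follows; a filled square should come out close to dimension 2 and a line close to 1.

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D binary array by box counting."""
    img = np.asarray(img, dtype=bool)
    counts = []
    for s in sizes:
        # Trim so the image tiles exactly into s-by-s boxes, then count the
        # boxes that contain at least one foreground pixel.
        h = (img.shape[0] // s) * s
        w = (img.shape[1] // s) * s
        boxes = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    # The dimension estimate is the slope of log N(s) versus log(1/s).
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

filled_square = np.ones((256, 256), dtype=bool)   # a 2-D object
diagonal_line = np.eye(256, dtype=bool)           # a 1-D object
dim_square = box_counting_dimension(filled_square)
dim_line = box_counting_dimension(diagonal_line)
```

In the paper the same estimator would be applied to thresholded scattering-power or principal-component maps rather than to these toy shapes.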
Statistical analysis of fNIRS data: a comprehensive review.
Tak, Sungho; Ye, Jong Chul
2014-01-15
Functional near-infrared spectroscopy (fNIRS) is a non-invasive method to measure brain activities using the changes of optical absorption in the brain through the intact skull. fNIRS has many advantages over other neuroimaging modalities such as positron emission tomography (PET), functional magnetic resonance imaging (fMRI), or magnetoencephalography (MEG), since it can directly measure blood oxygenation level changes related to neural activation with high temporal resolution. However, fNIRS signals are highly corrupted by measurement noises and physiology-based systemic interference. Careful statistical analyses are therefore required to extract neuronal activity-related signals from fNIRS data. In this paper, we provide an extensive review of historical developments of statistical analyses of fNIRS signals, which include motion artifact correction, short source-detector separation correction, principal component analysis (PCA)/independent component analysis (ICA), false discovery rate (FDR), serially-correlated errors, as well as inference techniques such as the standard t-test, F-test, analysis of variance (ANOVA), and the statistical parametric mapping (SPM) framework. In addition, to provide a unified view of various existing inference techniques, we explain a linear mixed effect model with restricted maximum likelihood (ReML) variance estimation, and show that most of the existing inference methods for fNIRS analysis can be derived as special cases. Some of the open issues in statistical analysis are also described.
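Much of the inference machinery reviewed here reduces to a mass-univariate general linear model with a t-test on the task regressor. A minimal sketch on one simulated channel follows (illustrative only; the regressor, drift model, and noise level are assumptions, and real fNIRS analyses must also handle serially-correlated errors).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# One simulated fNIRS channel: block-design task response plus slow drift
# and white measurement noise (amplitudes are illustrative assumptions).
n = 300
task = np.tile(np.r_[np.zeros(15), np.ones(15)], 10)   # off/on boxcar blocks
drift = np.linspace(0.0, 1.0, n)
y = 0.8 * task + 0.5 * drift + rng.normal(0.0, 0.3, n)

# GLM y = X b + e with regressors [task, drift, intercept].
X = np.column_stack([task, drift, np.ones(n)])
b, res, rank, _ = np.linalg.lstsq(X, y, rcond=None)
dof = n - rank
sigma2 = res[0] / dof
cov_b = sigma2 * np.linalg.inv(X.T @ X)

# t-statistic and two-sided p-value for the task regressor.
t_task = b[0] / np.sqrt(cov_b[0, 0])
p_task = 2.0 * stats.t.sf(abs(t_task), dof)
```

The estimated task beta is close to the simulated 0.8 and the p-value is far below any conventional threshold; the SPM-style analyses reviewed above repeat this per channel and then correct for multiple comparisons.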
High Temperature Transparent Furnace Development
NASA Technical Reports Server (NTRS)
Bates, Stephen C.
1997-01-01
This report describes the use of novel techniques for heat containment that could be used to build a high temperature transparent furnace. The primary objective of the work was to experimentally demonstrate transparent furnace operation at 1200 C. Secondary objectives were to understand furnace operation and furnace component specification to enable the design and construction of a low power prototype furnace for delivery to NASA in a follow-up project. The basic approach of the research was to couple high temperature component design with simple concept demonstration experiments that modify a commercially available transparent furnace rated at a lower temperature. A detailed energy balance of the operating transparent furnace was performed, calculating heat losses through the furnace components as a result of conduction, radiation, and convection. The transparent furnace shells and furnace components were redesigned to permit furnace operation at temperatures of at least 1200 C. Techniques were developed that are expected to lead to significantly improved heat containment compared with current transparent furnaces. The design of a thermal profile in a multizone high temperature transparent furnace was also addressed. Experiments were performed to verify the energy balance analysis, to demonstrate some of the major furnace improvement techniques developed, and to demonstrate the overall feasibility of a high temperature transparent furnace. The most important objective of the research was achieved: demonstrating the feasibility of operating a transparent furnace at 1200 C.
Incipient fault detection study for advanced spacecraft systems
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.
1986-01-01
A feasibility study to investigate the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) eight different analysis techniques including signature analysis, high frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition are compared; and (6) small sample statistical analysis is used to compare performance by computation of probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.
NASA Astrophysics Data System (ADS)
Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; McCleary, S. L.
1991-05-01
State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.
Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M
2014-01-01
Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique for identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject “at rest”). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing “signal” (brain activity) can be distinguished from the “noise” components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX (“FMRIB’s ICA-based X-noiseifier”), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets.
The noise components can then be subtracted from (or regressed out of) the original data, to provide automated cleanup. On conventional resting-state fMRI (rfMRI) single-run datasets, FIX achieved about 95% overall accuracy. On high-quality rfMRI data from the Human Connectome Project, FIX achieves over 99% classification accuracy, and as a result is being used in the default rfMRI processing pipeline for generating HCP connectomes. FIX is publicly available as a plugin for FSL. PMID:24389422
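The flavour of FIX-style classification can be conveyed with a toy sketch: compute one of the temporal features named above (the fraction of spectral power at high frequencies) for simulated "signal" and "noise" component time series and feed it to an off-the-shelf classifier. This is not FIX itself, which uses a large feature set and a multi-level classifier; the feature, cutoff, and scikit-learn classifier here are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

def hf_power_fraction(ts, fs, cutoff_hz):
    """FIX-style temporal feature: fraction of spectral power above a cutoff."""
    freqs = np.fft.rfftfreq(ts.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(ts - ts.mean())) ** 2
    return power[freqs > cutoff_hz].sum() / power.sum()

# Toy component time series: "signal" components fluctuate slowly, "noise"
# components are broadband (both are synthetic stand-ins).
n_comp, length, fs = 200, 256, 1.0
t = np.arange(length) / fs
features, labels = [], []
for _ in range(n_comp):
    if rng.random() < 0.5:   # neuronal-like: slow oscillation plus small noise
        ts = np.sin(2 * np.pi * 0.03 * t + rng.uniform(0, 2 * np.pi))
        ts += 0.2 * rng.normal(size=length)
        labels.append(1)
    else:                    # artefact-like: broadband noise
        ts = rng.normal(size=length)
        labels.append(0)
    features.append([hf_power_fraction(ts, fs, cutoff_hz=0.1)])
features, labels = np.array(features), np.array(labels)

# Train on the first 150 components, evaluate on the held-out 50.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(features[:150], labels[:150])
accuracy = clf.score(features[150:], labels[150:])
```

Even this single feature separates the two toy classes almost perfectly; FIX's accuracy on real data comes from combining many such spatial and temporal features.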
NASA Technical Reports Server (NTRS)
Feldman, Sandra C.
1987-01-01
Methods of applying principal component (PC) analysis to high resolution remote sensing imagery were examined. Using Airborne Imaging Spectrometer (AIS) data, PC analysis was found to be useful for removing the effects of albedo and noise and for isolating the significant information on argillic alteration, zeolite, and carbonate minerals. An effective technique for PC analysis used as input the first 16 AIS bands, 7 intermediate bands, and the last 16 AIS bands from the 32 flat field corrected bands between 2048 and 2337 nm. Most of the significant mineralogical information resided in the second PC. PC color composites and density-sliced images provided a good mineralogical separation when applied to an AIS data set. Although computer intensive, the advantage of PC analysis is that it employs algorithms which already exist on most image processing systems.
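The albedo-removal behaviour of PC analysis can be reproduced on simulated band data: when a brightness factor scales all bands and a weaker absorption feature modulates a few, the first PC absorbs brightness and the later PCs carry the spectral information. A hedged sketch follows (synthetic data and scikit-learn's `PCA`, not the AIS processing chain; band positions and widths are assumptions).

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Simulated pixels-by-bands cube: a dominant per-pixel brightness (albedo)
# factor scaling all bands, plus a weaker Gaussian absorption feature of
# varying depth, plus sensor noise.
n_pixels, n_bands = 2000, 32
wavelengths = np.linspace(2048, 2337, n_bands)
albedo = rng.uniform(0.5, 1.5, (n_pixels, 1))
feature_shape = np.exp(-0.5 * ((wavelengths - 2200.0) / 20.0) ** 2)
depth = rng.uniform(0.0, 0.2, (n_pixels, 1))
X = albedo * (1.0 - depth * feature_shape)
X += rng.normal(0.0, 0.01, (n_pixels, n_bands))

pca = PCA(n_components=3)
scores = pca.fit_transform(X)
explained = pca.explained_variance_ratio_
```

On this simulation the first PC tracks albedo almost exactly, so the absorption-feature information is pushed into the later PCs, mirroring the behaviour reported for the AIS data.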
High-Power Microwave Transmission and Mode Conversion Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vernon, Ronald J.
2015-08-14
This is a final technical report for a long term project to develop improved designs and design tools for the microwave hardware and components associated with the DOE Plasma Fusion Program. We have developed basic theory, software, fabrication techniques, and low-power measurement techniques for the design of microwave hardware associated with gyrotrons, microwave mode converters and high-power microwave transmission lines. Specifically, in this report we discuss our work on designing quasi-optical mode converters for single and multiple frequencies, a new method for the analysis of perturbed-wall waveguide mode converters, perturbed-wall launcher design for TE0n mode gyrotrons, quasi-optical traveling-wave resonator design for high-power testing of microwave components, and possible improvements to the HSX microwave transmission line.
Laboratory and airborne techniques for measuring fluorescence of natural surfaces
NASA Technical Reports Server (NTRS)
Stoertz, G. E.; Hemphill, W. R.
1972-01-01
Techniques are described for obtaining fluorescence spectra from samples of natural surfaces that can be used to predict spectral regions in which these surfaces would emit solar-stimulated or laser-stimulated fluorescence detectable by a remote sensor. Scattered or reflected stray light caused large errors in spectrofluorometer analysis of natural sample surfaces. Most spurious light components can be eliminated by recording successive fluorescence spectra for each sample, using identical instrument settings, first with an appropriate glass or gelatin filter on the excitation side of the sample and subsequently with the same filter on the emission side. This technique appears more accurate than any alternative technique for testing the fluorescence of natural surfaces.
Dong, Wenjiang; Hu, Rongsuo; Chu, Zhong; Zhao, Jianping; Tan, Lehe
2017-11-01
This study investigated the effect of different drying techniques, namely, room-temperature drying (RTD), solar drying (SD), heat-pump drying (HPD), hot-air drying (HAD), and freeze drying (FD), on bioactive components, fatty acid composition, and the volatile compound profile of robusta coffee beans. The data showed that FD was an effective method to preserve fat, organic acids, and monounsaturated fatty acids. In contrast, HAD was ideal for retaining polyunsaturated fatty acids and amino acids. Sixty-two volatile compounds were identified in the differently dried coffee beans, representing 90% of the volatile compounds. HPD of the coffee beans produced the largest number of volatiles, whereas FD resulted in the highest volatile content. A principal component analysis demonstrated a close relationship between the HPD, SD, and RTD methods whereas the FD and HAD methods were significantly different. Overall, the results provide a basis for potential application to other similar thermal sensitive materials.
Learning Assumptions for Compositional Verification
NASA Technical Reports Server (NTRS)
Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)
2002-01-01
Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
Advancements in tailored hot stamping simulations: Cooling channel and distortion analyses
NASA Astrophysics Data System (ADS)
Billur, Eren; Wang, Chao; Bloor, Colin; Holecek, Martin; Porzner, Harald; Altan, Taylan
2013-12-01
Hot stamped components have been widely used in the automotive industry in the last decade where ultra-high strength is required. These parts, however, may not provide sufficient toughness to absorb crash energy. Therefore, these components are "tailored" by controlling the microstructure at various locations. Simulation of tailored hot stamped components requires more detailed analysis of microstructural changes. Furthermore, since the part is not uniformly quenched, severe distortion can be observed. CPF, together with ESI, has developed a number of techniques to predict the final properties of a tailored part. This paper discusses recent improvements in modeling distortion and in die design with cooling channels.
Artifacts and noise removal in electrocardiograms using independent component analysis.
Chawla, M P S; Verma, H K; Kumar, Vinod
2008-09-26
Independent component analysis (ICA) is a novel technique capable of separating independent components from complex electrocardiogram (ECG) signals. The purpose of this analysis is to evaluate the effectiveness of ICA in removing artifacts and noise from ECG recordings. ICA is applied to remove artifacts and noise in ECG segments of either an individual CSE database ECG file or all files. The reconstructed ECGs are compared with the original ECG signals. For the four special cases discussed, the R-peak magnitudes of the CSE database ECG waveforms before and after applying ICA are also found. The results show that in most cases the percentage reconstruction error is very small and that there is a significant improvement in signal quality, i.e. signal-to-noise ratio (SNR). All the ECG recordings dealt with showed an improved appearance after the use of ICA. This establishes the efficacy of ICA in eliminating noise and artifacts from electrocardiograms.
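A minimal illustration of ICA-based artifact removal (not the CSE-database analysis itself): mix a spiky ECG-like source with mains interference and baseline drift, unmix with scikit-learn's `FastICA`, keep only the most super-Gaussian (spiky) component, and reconstruct. The sources, mixing matrix, and kurtosis-based selection rule are assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)

# Three synthetic "leads": an ECG-like spike train mixed with 50 Hz mains
# interference and slow baseline drift (all sources are illustrative).
t = np.linspace(0.0, 10.0, 2000)          # 10 s at 200 Hz
ecg = np.zeros_like(t)
ecg[::200] = 1.0                          # crude R-peaks, one per second
mains = 0.5 * np.sin(2 * np.pi * 50 * t)
drift = 0.3 * np.sin(2 * np.pi * 0.2 * t)
S = np.column_stack([ecg, mains, drift])
A = rng.normal(size=(3, 3))
X = S @ A.T                               # observed leads

ica = FastICA(n_components=3, random_state=0, max_iter=1000)
est = ica.fit_transform(X)

# The spiky ECG source is strongly super-Gaussian, so pick the component
# with the largest excess kurtosis, zero out the rest, and reconstruct.
centered = est - est.mean(axis=0)
kurtosis = (centered ** 4).mean(axis=0) / centered.var(axis=0) ** 2 - 3.0
keep = int(np.argmax(kurtosis))
est_clean = np.zeros_like(est)
est_clean[:, keep] = est[:, keep]
X_clean = ica.inverse_transform(est_clean)
```

In practice, artifact components in real recordings are identified by inspection or by heuristics rather than by a single kurtosis rule, but the zero-and-reconstruct step is the same.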
Long, Zhiying; Chen, Kewei; Wu, Xia; Reiman, Eric; Peng, Danling; Yao, Li
2009-02-01
Spatial independent component analysis (sICA) has been widely used to analyze functional magnetic resonance imaging (fMRI) data. The well-accepted implicit assumption is the spatial statistical independence of the intrinsic sources identified by sICA, which makes sICA difficult to apply to data containing interdependent sources and confounding factors. Such interdependency can arise, for instance, in fMRI studies investigating two tasks in a single session. In this study, we introduced a linear projection approach and considered its utilization as a tool to separate task-related components from two-task fMRI data. The robustness and feasibility of the method are substantiated through simulations on computer-generated data and on real resting-state fMRI data. Both simulated and real two-task fMRI experiments demonstrated that sICA combined with the projection method succeeded in separating spatially dependent components and had better detection power than a purely model-based method when estimating the activation induced by each task as well as by both tasks.
NASA Astrophysics Data System (ADS)
Luce, R.; Hildebrandt, P.; Kuhlmann, U.; Liesen, J.
2016-09-01
The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for non-negative matrix factorization which is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed.
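The factorization step can be sketched with a generic non-negative matrix factorization on synthetic two-species kinetic spectra, each species carrying one non-interfering marker band as the prerequisite above requires. This uses scikit-learn's `NMF` as a stand-in; the authors' algorithm additionally recovers the rate constants, which this sketch does not.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)

# Synthetic time-resolved spectra for a monomolecular reaction A -> B.
# Each species has one non-interfering marker band plus a shared band
# (all band positions, widths, and the rate are illustrative).
wn = np.linspace(0.0, 100.0, 400)         # arbitrary wavenumber axis

def gaussian_band(center, width):
    return np.exp(-0.5 * ((wn - center) / width) ** 2)

spectrum_A = gaussian_band(20.0, 3.0) + 0.5 * gaussian_band(50.0, 5.0)
spectrum_B = gaussian_band(80.0, 3.0) + 0.5 * gaussian_band(50.0, 5.0)

rate = 0.3                                # monomolecular rate constant
times = np.linspace(0.0, 10.0, 25)
conc_A = np.exp(-rate * times)
conc_B = 1.0 - conc_A
D = np.outer(conc_A, spectrum_A) + np.outer(conc_B, spectrum_B)
D += np.abs(rng.normal(0.0, 0.01, D.shape))   # keep the data non-negative

# Factorize D into concentration-like profiles C and component spectra S_est.
nmf = NMF(n_components=2, init="nndsvda", max_iter=2000)
C = nmf.fit_transform(D)
S_est = nmf.components_
```

The non-interfering marker bands at either end of the axis are what pin down a physically unambiguous factorization; with fully overlapping spectra the rotational ambiguity of NMF would remain.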
Wang, Gang; Teng, Chaolin; Li, Kuo; Zhang, Zhonglin; Yan, Xiangguo
2016-09-01
Recorded electroencephalography (EEG) signals are usually contaminated by electrooculography (EOG) artifacts. In this paper, by combining independent component analysis (ICA) and multivariate empirical mode decomposition (MEMD), an ICA-based MEMD method is proposed to remove EOG artifacts (EOAs) from multichannel EEG signals. First, the EEG signals were decomposed by the MEMD into multiple multivariate intrinsic mode functions (MIMFs). The EOG-related components were then extracted by reconstructing the MIMFs corresponding to EOAs. After performing ICA on the EOG-related signals, the EOG-linked independent components were identified and rejected. Finally, the clean EEG signals were reconstructed by applying the inverse transforms of ICA and MEMD. Results on simulated and real data suggest that the proposed method can successfully eliminate EOAs from EEG signals and preserve useful EEG information with little loss. Compared with other existing techniques, the proposed method achieved a marked improvement in signal-to-noise ratio and a reduction in mean square error after removing EOAs.
Performance analysis and prediction in triathlon.
Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B
2016-01-01
Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
NASA Astrophysics Data System (ADS)
Cautun, Marius; van de Weygaert, Rien; Jones, Bernard J. T.; Frenk, Carlos S.
2014-07-01
The cosmic web is the largest scale manifestation of the anisotropic gravitational collapse of matter. It represents the transitional stage between linear and non-linear structures and contains easily accessible information about the early phases of structure formation processes. Here we investigate the characteristics and the time evolution of morphological components. Our analysis involves the application of the NEXUS Multiscale Morphology Filter technique, predominantly its NEXUS+ version, to high-resolution, large-volume cosmological simulations. We quantify the cosmic web components in terms of their mass and volume content, their density distribution and halo populations. We employ new analysis techniques to determine the spatial extent of filaments and sheets, such as their total length and local width. This analysis identifies clusters and filaments as the most prominent components of the web. By contrast, voids and sheets take up most of the volume but correspond to underdense environments and are devoid of group-sized and more massive haloes. At early times the cosmos is dominated by tenuous filaments and sheets, which, during subsequent evolution, merge together, such that the present-day web is dominated by fewer, but much more massive, structures. The analysis of the mass transport between environments clearly shows how matter flows from voids into walls, and then via filaments into cluster regions, which form the nodes of the cosmic web. We also study the properties of individual filamentary branches, to find long, almost straight, filaments extending to distances larger than 100 h-1 Mpc. These constitute the bridges between massive clusters, which seem to form along approximately straight lines.
Rivera, Ana Leonor; Toledo-Roy, Juan C.; Ellis, Jason; Angelova, Maia
2017-01-01
Circadian rhythms become less dominant and less regular with chronic degenerative disease, so that accurate assessment of these pathological conditions requires quantifying not only periodic characteristics but also the more irregular aspects of the corresponding time series. Novel data-adaptive techniques, such as singular spectrum analysis (SSA), allow for the decomposition of experimental time series, in a model-free way, into a trend, quasiperiodic components and noise fluctuations. We compared SSA with the traditional techniques of cosinor analysis and intradaily variability using 1-week continuous actigraphy data in young adults with acute insomnia and healthy age-matched controls. The findings suggest a small but significant delay in circadian components in the subjects with acute insomnia, i.e. a larger acrophase, and alterations in the day-to-day variability of acrophase and amplitude. The power of the ultradian components follows a fractal 1/f power law for controls, whereas for those with acute insomnia this power law breaks down because of an increased variability at the 90-min time scale, reminiscent of Kleitman's basic rest-activity (BRAC) cycles. This suggests that for healthy sleepers attention and activity can be sustained at whatever time scale is required by circumstances, whereas for those with acute insomnia this capacity may be impaired and these individuals need to rest or switch activities in order to stay focused. Traditional methods of circadian rhythm analysis are unable to detect the more subtle effects of day-to-day variability and ultradian rhythm fragmentation at the specific 90-min time scale. PMID:28753669
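Basic SSA of the kind referred to above can be sketched in a few lines: embed the series in a trajectory matrix, take its SVD, and reconstruct a chosen group of components by diagonal (Hankel) averaging. The week-long "actigraphy" series below is synthetic (a trend plus a 24-h cycle plus noise), not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 7 * 24, 0.5)                   # one week in 30-min bins
x = 0.01 * t + np.sin(2 * np.pi * t / 24) + 0.2 * rng.standard_normal(t.size)

# Singular spectrum analysis: embed, decompose, group, diagonal-average
L = 48                                           # window length (one day)
K = x.size - L + 1
X = np.column_stack([x[i:i + L] for i in range(K)])  # trajectory matrix
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def reconstruct(idx):
    """Series rebuilt from the selected eigentriples via Hankel averaging."""
    Xr = (U[:, idx] * s[idx]) @ Vt[idx, :]
    out = np.zeros(x.size)
    cnt = np.zeros(x.size)
    for i in range(L):
        for j in range(K):
            out[i + j] += Xr[i, j]
            cnt[i + j] += 1
    return out / cnt

trend_plus_circadian = reconstruct(np.arange(4))  # leading eigentriples
residual = x - trend_plus_circadian               # noise-like remainder
print(residual.std(), x.std())
```

Grouping the leading eigentriples separates the trend and the quasiperiodic circadian component from the noise floor, which is the model-free decomposition the abstract describes.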
NASA Astrophysics Data System (ADS)
Bordovsky, Michal; Catrysse, Peter; Dods, Steven; Freitas, Marcio; Klein, Jackson; Kotacka, Libor; Tzolov, Velko; Uzunov, Ivan M.; Zhang, Jiazong
2004-05-01
We present the state of the art for commercial design and simulation software in the 'front end' of photonic circuit design. One recent advance is to extend the flexibility of the software by using more than one numerical technique on the same optical circuit. There are a number of popular and proven techniques for analysis of photonic devices. Examples of these techniques include the Beam Propagation Method (BPM), the Coupled Mode Theory (CMT), and the Finite Difference Time Domain (FDTD) method. For larger photonic circuits, it may not be practical to analyze the whole circuit by any one of these methods alone, but often some smaller part of the circuit lends itself to at least one of these standard techniques. Later the whole problem can be analyzed on a unified platform. This kind of approach can enable analysis for cases that would otherwise be cumbersome, or even impossible. We demonstrate solutions for more complex structures ranging from the sub-component layout, through the entire device characterization, to the mask layout and its editing. We also present recent advances in the above well established techniques. This includes the analysis of nano-particles, metals, and non-linear materials by FDTD, photonic crystal design and analysis, and improved models for high concentration Er/Yb co-doped glass waveguide amplifiers.
Söhn, Matthias; Alber, Markus; Yan, Di
2007-09-01
The variability of dose-volume histogram (DVH) shapes in a patient population can be quantified using principal component analysis (PCA). We applied this to rectal DVHs of prostate cancer patients and investigated the correlation of the PCA parameters with late bleeding. PCA was applied to the rectal wall DVHs of 262 patients, who had been treated with a four-field box, conformal adaptive radiotherapy technique. The correlated changes in the DVH pattern were revealed as "eigenmodes," which were ordered by their importance to represent data set variability. Each DVH is uniquely characterized by its principal components (PCs). The correlation of the first three PCs and chronic rectal bleeding of Grade 2 or greater was investigated with uni- and multivariate logistic regression analyses. Rectal wall DVHs in four-field conformal RT can primarily be represented by the first two or three PCs, which describe approximately 94% or 96% of the DVH shape variability, respectively. The first eigenmode models the total irradiated rectal volume; thus, PC1 correlates to the mean dose. Mode 2 describes the interpatient differences of the relative rectal volume in the two- or four-field overlap region. Mode 3 reveals correlations of volumes with intermediate doses (approximately 40-45 Gy) and volumes with doses >70 Gy; thus, PC3 is associated with the maximal dose. According to univariate logistic regression analysis, only PC2 correlated significantly with toxicity. However, multivariate logistic regression analysis with the first two or three PCs revealed an increased probability of bleeding for DVHs with more than one large PC. PCA can reveal the correlation structure of DVHs for a patient population as imposed by the treatment technique and provide information about its relationship to toxicity. It proves useful for augmenting normal tissue complication probability modeling approaches.
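The eigenmode decomposition used here is ordinary PCA applied to DVH curves treated as vectors. A minimal sketch with synthetic sigmoid "DVHs" (the dose grid, population parameters, and curve shapes are invented, not the study's rectal-wall data) shows how a handful of PCs can capture most of the shape variability:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical cumulative DVHs: sigmoids with per-patient midpoint and slope
dose = np.linspace(0.0, 80.0, 81)               # Gy
d50 = rng.normal(45.0, 5.0, size=100)           # per-patient midpoint dose
slope = rng.normal(0.15, 0.03, size=100)
dvh = 100.0 / (1.0 + np.exp(slope[:, None] * (dose[None, :] - d50[:, None])))

# PCA via SVD of the mean-centred matrix; rows = patients, columns = dose bins
centred = dvh - dvh.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()             # variance per eigenmode
scores = centred @ Vt.T                         # per-patient PCs
print(explained[:3].sum())                      # bulk of the shape variability
```

Because the synthetic curves vary in only two underlying parameters, a few eigenmodes suffice, analogous to the 94-96% figure reported for the first two or three PCs of the real DVHs.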
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.
1981-01-01
A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
Identification and Analysis of Bioactive Components of Fruit and Vegetable Products
ERIC Educational Resources Information Center
Mann, Francis M.
2015-01-01
Many small-molecule antioxidants found in whole fruits and vegetables are analyzed and identified in this laboratory module for upper-division biochemistry courses. During this experiment, students develop their knowledge of the bioactivity of fruit and vegetable products while learning techniques to identify vitamins and nutritionally derived…
An incremental economic analysis of establishing early successional habitat for biodiversity
Slayton W. Hazard-Daniel; Patrick Hiesl; Susan C. Loeb; Thomas J. Straka
2017-01-01
Early successional habitat (ESH) is an important component of natural landscapes and is crucial to maintaining biodiversity. ESH also impacts endangered species. The extent of forest disturbances resulting in ESH has been diminishing, and foresters have developed timber management regimes using standard silvicultural techniques that...
From Periodic Properties to a Periodic Table Arrangement
ERIC Educational Resources Information Center
Besalú, Emili
2013-01-01
A periodic table is constructed from the consideration of periodic properties and the application of the principal components analysis technique. This procedure is useful for objects classification and data reduction and has been used in the field of chemistry for many applications, such as lanthanides, molecules, or conformers classification.…
Testing a Conceptual Change Model Framework for Visual Data
ERIC Educational Resources Information Center
Finson, Kevin D.; Pedersen, Jon E.
2015-01-01
An emergent data analysis technique was employed to test the veracity of a conceptual framework constructed around visual data use and instruction in science classrooms. The framework incorporated all five key components Vosniadou (2007a, 2007b) described as existing in a learner's schema: framework theory, presuppositions, conceptual domains,…
Ion beam analysis studies of desert varnish
NASA Astrophysics Data System (ADS)
Duerden, P.; Cohen, D. D.; Dragovich, D.; Clayton, E.
1986-04-01
The components of "desert varnish" cover on rock surfaces from western New South Wales have been investigated by a number of ion beam techniques. The elemental data obtained so far show significant changes in composition of F, Na, K, Ti, Ba and Mn when varnish is present on the rock surface.
2002-07-01
study concentrates on other available techniques to elucidate the association of the inorganic colloids and the bacterial components in order to further...the treated bacteria. This is consistent with the strong attraction expected between the silvered cells (with a large Hamaker constant) and their
An Expert System for On-Site Instructional Advice.
ERIC Educational Resources Information Center
Martindale, Elizabeth S.; Hofmeister, Alan M.
1988-01-01
Describes Written Language Consultant, an expert system designed to help teachers teach special education students how to write business letters. Three main components of the system are described, including entry of students' test scores; analysis of teachers' uses of classroom time and management techniques; and suggestions for improving test…
Agreement in Polar Motion Measurements During the MERIT Campaign
NASA Astrophysics Data System (ADS)
Pâquet, P.; Djurovic, D.; Techy, C.
From the original polar motion (PM) measurements performed during the MERIT campaign, the Chandler and the annual components are removed. The analysis of the residuals shows a high level of significant correlation between the various techniques mainly for phenomena ranging from 30 days to a few months.
Component Composition for Embedded Systems Using Semantic Aspect-Oriented Programming
2004-10-01
real-time systems for the defense community. Our research focused on Real-Time Java implementation and analysis techniques. Real-Time Java is important for the defense community because it holds out the promise of enabling developers to apply COTS Java technology to specialized military embedded systems. It also promises to allow the defense community to utilize a large Java-literate workforce for building defense systems. Our research has delivered several techniques that may make Real-Time Java a better platform for developing embedded
An improved infrared technique for sorting pecans
NASA Astrophysics Data System (ADS)
Graeve, Thorsten; Dereniak, Eustace L.; Lamonica, John A., Jr.
1991-10-01
This paper presents the results of a study of pecan spectral reflectances. It describes an experiment for measuring the contrast between several components of raw pecan product to be sorted. An analysis of the experimental data reveals high contrast ratios in the infrared spectrum, suggesting a potential improvement in sorting efficiency when separating pecan meat from shells. It is believed that this technique has the potential to dramatically improve the efficiency of current sorting machinery, and to reduce the cost of processing pecans for the consumer market.
Microwave techniques for measuring complex permittivity and permeability of materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guillon, P.
1995-08-01
Different materials are of fundamental importance to the aerospace, microwave, electronics and communications industries, including, for example, microwave absorbing materials, antenna lenses and radomes, substrates for MMICs, and microwave components and antennas. Basic measurements of the complex permittivity and permeability of such homogeneous solid materials in the microwave spectral region are described, including hardware, instrumentation and analysis. Elevated-temperature measurements and measurement intercomparisons are also presented, with a discussion of the strengths and weaknesses of each technique.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carla J. Miller
This report summarizes a literature review based on previous work performed at the Idaho National Laboratory studying the Three Mile Island Unit 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique for performing nuclear fuel accountancy measurements.
Analysis of heavy metal sources in soil using kriging interpolation on principal components.
Ha, Hoehun; Olson, James R; Bian, Ling; Rogerson, Peter A
2014-05-06
Anniston, Alabama has a long history of operation of foundries and other heavy industry. We assessed the extent of heavy metal contamination in soils by determining the concentrations of 11 heavy metals (Pb, As, Cd, Cr, Co, Cu, Mn, Hg, Ni, V, and Zn) based on 2046 soil samples collected from 595 industrial and residential sites. Principal Component Analysis (PCA) was adopted to characterize the distribution of heavy metals in soil in this region. In addition, a geostatistical technique (kriging) was used to create regional distribution maps for the interpolation of nonpoint sources of heavy metal contamination using geographical information system (GIS) techniques. There were significant differences found between sampling zones in the concentrations of heavy metals, with the exception of the levels of Ni. Three main components explaining the heavy metal variability in soils were identified. The results suggest that Pb, Cd, Cu, and Zn were associated with anthropogenic activities, such as the operations of some foundries and major railroads, which released these heavy metals, whereas the presence of Co, Mn, and V were controlled by natural sources, such as soil texture, pedogenesis, and soil hydrology. In general terms, the soil levels of heavy metals analyzed in this study were higher than those reported in previous studies in other industrial and residential communities.
NASA Technical Reports Server (NTRS)
Grissom, D. S.; Schneider, W. C.
1971-01-01
The determination of a base line (minimum weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.
Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy
NASA Astrophysics Data System (ADS)
Jiang, Dejun; Zhao, Shusen; Shen, Jingling
2008-03-01
A method was proposed to quantitatively analyze mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals, benzophenone, anthraquinone, pyridoxine hydrochloride and L-ascorbic acid, in the experiment. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in close agreement with the actual content, suggesting that this could be an effective method for quantitative identification of illicit drugs.
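The linear regression step can be sketched as ordinary least squares against known reference spectra. Everything below is synthetic (Gaussian absorption bands and mass fractions invented for illustration), not the paper's measured THz spectra:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical reference absorption spectra of two pure components
freq = np.linspace(0.2, 2.5, 120)              # THz
refs = np.vstack([np.exp(-(freq - f0) ** 2 / 0.02) for f0 in (0.8, 1.6)])

# Mixture spectrum as a mass-weighted sum of the references plus noise
w_true = np.array([0.3, 0.7])
mix = w_true @ refs + 0.005 * rng.standard_normal(freq.size)

# Solve refs.T @ w ~= mix for the component weights, then normalise
w_hat, *_ = np.linalg.lstsq(refs.T, mix, rcond=None)
w_hat /= w_hat.sum()
print(w_hat)                                    # close to the true fractions
```

The recovery is reliable here because each reference has a distinct absorption feature, matching the paper's assumption that all components and their features are known in advance.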
NASA Technical Reports Server (NTRS)
Schoenwald, Adam; Mohammed, Priscilla; Bradley, Damon; Piepmeier, Jeffrey; Wong, Englin; Gholian, Armen
2016-01-01
Radio-frequency interference (RFI) has negatively impacted scientific measurements across a wide variety of passive remote sensing satellites. This has been observed in the L-band radiometers SMOS, Aquarius and, more recently, SMAP [1, 2]. RFI has also been observed at higher frequencies such as K band [3]. Improvements in technology have allowed wider-bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements [4]. This work explores the use of ICA (Independent Component Analysis) as a blind source separation technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
RFI Detection and Mitigation using Independent Component Analysis as a Pre-Processor
NASA Technical Reports Server (NTRS)
Schoenwald, Adam J.; Gholian, Armen; Bradley, Damon C.; Wong, Mark; Mohammed, Priscilla N.; Piepmeier, Jeffrey R.
2016-01-01
Radio-frequency interference (RFI) has negatively impacted scientific measurements of passive remote sensing satellites. This has been observed in the L-band radiometers Soil Moisture and Ocean Salinity (SMOS), Aquarius and more recently, Soil Moisture Active Passive (SMAP). RFI has also been observed at higher frequencies such as K band. Improvements in technology have allowed wider bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements. This work explores the use of Independent Component Analysis (ICA) as a blind source separation (BSS) technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
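A self-contained sketch of ICA as a blind source separation pre-processor is given below, with a symmetric FastICA iteration written out in numpy. The two-channel mixture (a thermal-noise-like signal plus a square-wave "RFI" tone) and the mixing matrix are invented for illustration; this is not SMAP data, and the kurtosis detector itself is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000

# Hypothetical scene: thermal-noise-like signal plus a square-wave RFI tone
s_noise = rng.standard_normal(n)
s_rfi = np.sign(np.sin(2 * np.pi * 0.01 * np.arange(n)))
S = np.vstack([s_noise, s_rfi])
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # unknown channel mixing
X = A @ S

# Whiten the observations
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E / np.sqrt(d)) @ E.T @ X

# Symmetric FastICA with the tanh nonlinearity
W, _ = np.linalg.qr(rng.standard_normal((2, 2)))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = (G @ Z.T) / n - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)             # symmetric decorrelation
    W = U @ Vt
Y = W @ Z                                       # estimated sources

# Each recovered component should match one true source up to sign and order
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.max(axis=1))
```

After separation, the RFI-bearing component can be flagged or excised before the kurtosis detection stage; the thermal-noise channel is recovered as the orthogonal complement in the whitened space.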
Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution
NASA Technical Reports Server (NTRS)
Zoladz, T. F.; Jones, J. H.; Jong, J.
1992-01-01
A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications which more often than not involve multicomponent signals.
Colniță, Alia; Dina, Nicoleta Elena; Leopold, Nicolae; Vodnar, Dan Cristian; Bogdan, Diana; Porav, Sebastian Alin; David, Leontin
2017-09-01
Raman scattering and its particular effect, surface-enhanced Raman scattering (SERS), are whole-organism fingerprinting spectroscopic techniques that are gaining increasing popularity in bacterial detection. In this work, two relevant Gram-positive bacteria species, Lactobacillus casei (L. casei) and Listeria monocytogenes (L. monocytogenes), were characterized based on their Raman and SERS spectral fingerprints. The SERS spectra were used to identify the biochemical structures of the bacterial cell wall. Two synthesis methods of the SERS-active nanomaterials were used and the recorded spectra were analyzed. L. casei and L. monocytogenes were successfully discriminated by applying Principal Component Analysis (PCA) to their specific spectral data.
FTIR gas chromatographic analysis of perfumes
NASA Astrophysics Data System (ADS)
Diederich, H.; Stout, Phillip J.; Hill, Stephen L.; Krishnan, K.
1992-03-01
Perfumes, natural or synthetic, are complex mixtures consisting of numerous components. Gas chromatography (GC) and gas chromatography-mass spectrometry (GC-MS) techniques have been extensively utilized for the analysis of perfumes and essential oils. A limited number of perfume samples have also been analyzed by gas chromatographic FT-IR (GC-FTIR) techniques. Most of the latter studies have been performed using conventional light-pipe (LP) based GC-FTIR systems. In recent years, cold-trapping (in a matrix or neat) GC-FTIR systems have become available; these are capable of sub-nanogram sensitivities. In this paper, comparison data between the LP and the neat cold-trapping GC-FTIR systems are presented. The neat cold-trapping interface is known as the Tracer. The results of GC-FTIR analysis of some commercial perfumes are also presented. For the comparison of the LP and Tracer GC-FTIR systems, a reference (synthetic) mixture containing 16 major and numerous minor constituents was used. The components of the mixture are compounds commonly encountered in commercial perfumes. The GC-FTIR spectra of the reference mixture were obtained under identical chromatographic conditions from an LP and a Tracer system. A comparison of the two sets of data thus generated does indeed show the enhanced sensitivity of the Tracer system. The comparison also shows that some of the major components detected by the Tracer system were absent from the LP data. Closer examination reveals that these compounds undergo thermal decomposition on contact with the hot gold surface that is part of the LP system. GC-FTIR data were obtained for three commercial perfume samples. The major components of these samples could easily be identified by spectral search against a digitized spectral library created using Tracer data from the reference mixture.
NASA Astrophysics Data System (ADS)
Duarte, Janaina; Pacheco, Marcos T. T.; Silveira, Landulfo, Jr.; Machado, Rosangela Z.; Martins, Rodrigo A. L.; Zangaro, Renato A.; Villaverde, Antonio G. J. B.
2001-05-01
Near-infrared (NIR) Raman spectroscopy has been studied in recent years for many biomedical applications and is a powerful tool for the analysis of biological materials. Toxoplasmosis is an important zoonosis in public health, cats being principally responsible for the transmission of the disease in Brazil. The objective of this work is to investigate a new method of diagnosis of this disease. NIR Raman spectroscopy was used to detect anti-Toxoplasma gondii antibodies in blood sera from domestic cats, without sample preparation. In all, six blood serum samples were used for this study. A previous serological test was done by indirect enzyme-linked immunosorbent assay (ELISA) to permit a comparative study between the two techniques; it showed that three serum samples were positive and the other three were negative for toxoplasmosis. Raman spectra were taken for all the samples and analyzed using principal components analysis (PCA). A diagnosis parameter was defined from the analysis of the second and third principal components of the Raman spectra. It was found that this parameter can detect the infection level of the animal. The results indicate that NIR Raman spectroscopy, associated with PCA, can be a promising technique for serological analysis, such as for toxoplasmosis, allowing a fast and sensitive method of diagnosis.
NASA Technical Reports Server (NTRS)
Tesch, W. A.; Moszee, R. H.; Steenken, W. G.
1976-01-01
NASA-developed stability and frequency-response analysis techniques were applied to a dynamic blade row compression component stability model to provide a more economic approach to surge line and frequency response determination than that provided by time-dependent methods. This blade row model was linearized and the Jacobian matrix was formed. The clean-inlet-flow stability characteristics of the compressors of two J85-13 engines were predicted by applying the alternate Routh-Hurwitz stability criterion to the Jacobian matrix. The predicted surge line agreed with the clean-inlet-flow surge line predicted by the time-dependent method to a high degree except for one engine at 94% corrected speed. No satisfactory explanation of this discrepancy was found. The frequency response of the linearized system was determined by evaluating its Laplace transfer function. The results of the linearized-frequency-response analysis agree with the time-dependent results when the time-dependent inlet total-pressure and exit-flow function amplitude boundary conditions are less than 1 percent and 3 percent, respectively. The stability analysis technique was extended to a two-sector parallel compressor model with and without interstage crossflow, and predictions were carried out for total-pressure distortion extents of 180 deg, 90 deg, 60 deg, and 30 deg.
Mantini, D; Franciotti, R; Romani, G L; Pizzella, V
2008-03-01
The major limitation for the acquisition of high-quality magnetoencephalography (MEG) recordings is the presence of disturbances of physiological and technical origin: eye movements, cardiac signals, muscular contractions, and environmental noise are serious problems for MEG signal analysis. In recent years, multi-channel MEG systems have undergone rapid technological developments in terms of noise reduction, and many processing methods have been proposed for artifact rejection. Independent component analysis (ICA) has already been shown to be an effective and generally applicable technique for concurrently removing artifacts and noise from MEG recordings. However, no standardized automated system based on ICA has become available so far, because of the intrinsic difficulty of reliably categorizing the source signals obtained with this technique. In this work, approximate entropy (ApEn), a measure of data regularity, is successfully used for the classification of the signals produced by ICA, allowing for automated artifact rejection. The proposed method has been tested using MEG data sets collected during somatosensory, auditory and visual stimulation. It was demonstrated to be effective in attenuating both biological artifacts and environmental noise, in order to reconstruct clear signals that can be used for improving brain source localizations.
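Approximate entropy itself is simple to compute. Below is a minimal numpy sketch (with toy signals, not MEG independent components) showing that a regular signal scores lower than an irregular one, which is the property used here to categorize ICA outputs:

```python
import numpy as np

def apen(x, m=2, r_frac=0.2):
    """Approximate entropy (Pincus): lower values mean a more regular series."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()                        # tolerance scaled to the data
    def phi(mm):
        # Embed the series as overlapping length-mm template vectors
        emb = np.column_stack([x[i:x.size - mm + i + 1] for i in range(mm)])
        # Chebyshev distance between all template pairs
        dist = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        return np.log((dist <= r).mean(axis=1)).mean()
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(6)
regular = np.sin(np.linspace(0.0, 40.0 * np.pi, 1000))   # periodic, regular
irregular = rng.standard_normal(1000)                    # noise-like
print(apen(regular), apen(irregular))
```

In the paper's pipeline, an ApEn score like this is computed per independent component so that artifact-like components can be rejected automatically rather than by visual inspection.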
Support vector machine and principal component analysis for microarray data classification
NASA Astrophysics Data System (ADS)
Astuti, Widi; Adiwijaya
2018-03-01
Cancer is a leading cause of death worldwide, although a significant proportion of cases can be cured if detected early. In recent decades, microarray technology has come to play an important role in the diagnosis of cancer. By using data mining techniques, microarray data classification can be performed to improve the accuracy of cancer diagnosis compared to traditional techniques. Microarray data are characterized by small sample sizes but very high dimensionality, which poses a challenge for researchers seeking classification methods with high performance in both accuracy and running time. This research proposed the use of Principal Component Analysis (PCA) as a dimension reduction method along with a Support Vector Machine (SVM), optimized via kernel functions, as a classifier for microarray data classification. The proposed scheme was applied to seven data sets using 5-fold cross-validation, and evaluation and analysis were then conducted in terms of both accuracy and running time. The results showed that the scheme obtained 100% accuracy for the ovarian and lung cancer data when linear and cubic kernel functions were used. In terms of running time, PCA greatly reduced the running time for every data set.
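The dimension-reduction step can be sketched with plain numpy on synthetic "microarray" data (gene counts, class shift, and sample sizes are invented). A nearest-centroid rule stands in for the kernel SVM, which in practice a library such as scikit-learn would supply:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical microarray data: few samples, many genes, two classes
n_per, n_genes = 20, 2000
shift = np.zeros(n_genes)
shift[:100] = 2.5                               # informative genes
X = np.vstack([rng.standard_normal((n_per, n_genes)),
               rng.standard_normal((n_per, n_genes)) + shift])
y = np.repeat([0, 1], n_per)

# PCA via SVD of the centred matrix; keep the first five components
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
X_red = (X - mu) @ Vt[:5].T                     # 2000 features -> 5 scores

# Nearest-centroid classifier in the reduced space (SVM stand-in)
c0 = X_red[y == 0].mean(axis=0)
c1 = X_red[y == 1].mean(axis=0)
pred = (np.linalg.norm(X_red - c1, axis=1)
        < np.linalg.norm(X_red - c0, axis=1)).astype(int)
print((pred == y).mean())                       # training accuracy
```

Collapsing thousands of genes to a few PC scores is what drives the running-time reduction the abstract reports; the classifier then operates in the low-dimensional score space.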
Differentiation of tea varieties using UV-Vis spectra and pattern recognition techniques
NASA Astrophysics Data System (ADS)
Palacios-Morillo, Ana; Alcázar, Ángela; de Pablos, Fernando; Jurado, José Marcos
2013-02-01
Tea, one of the most widely consumed beverages in the world, is of great importance to the economies of a number of countries. Several methods have been developed to classify tea varieties or origins based on pattern recognition techniques applied to chemical data such as metal profiles, amino acids, catechins and volatile compounds. Some of these analytical methods are too tedious and expensive for routine work. The use of UV-Vis spectral data, which are strongly influenced by chemical composition, as discriminant variables can be an alternative to these methods. UV-Vis spectra of methanol-water extracts of tea were obtained in the interval 250-800 nm, and the absorbances were used as input variables. Principal component analysis was used to reduce the number of variables, and several pattern recognition methods, such as linear discriminant analysis, support vector machines and artificial neural networks, were applied to differentiate the most common tea varieties. A successful classification model was built by combining principal component analysis with a multilayer perceptron artificial neural network, allowing the differentiation of tea varieties. This rapid and simple methodology can be applied to classification problems in the food industry, saving economic resources.
Soler, C; García-Molina, A; Contell, J; Silvestre, M A; Sancho, M
2015-07-01
Evaluation of sperm morphology is a fundamental component of semen analysis, but its real significance has been obscured by a plethora of techniques that involve fixation and staining procedures that induce artefacts. Here we describe Trumorph®, a new method for sperm morphology assessment that is based upon examination of wet preparations of living spermatozoa immobilized by a short 60°C shock using negative phase contrast microscopy. We have observed samples from five animals of the following species: bull, boar, goat and rabbit. In every case, all the components of the sperm head and tail were perfectly defined, including the acrosome and midpiece (in all its length, including cytoplasmic droplets). A range of morphological forms was observed, similar to those found by conventional fixed and stained preparations, but other forms were found, distinguishable only by the optics used. The ease of preparation makes it a robust method applicable for analysis of living unmodified spermatozoa in a range of situations. Subsequent studies on well-characterized samples are required to describe the morphology of potentially fertilizing spermatozoa. Copyright © 2015 Elsevier B.V. All rights reserved.
Correlation between grade of pearlite spheroidization and laser induced spectra
NASA Astrophysics Data System (ADS)
Yao, Shunchun; Dong, Meirong; Lu, Jidong; Li, Jun; Dong, Xuan
2013-12-01
Laser-induced breakdown spectroscopy (LIBS), traditionally used as a spectrochemical analytical technique, was employed to analyze the grade of pearlite spheroidization. Three 12Cr1MoV steel specimens with different grades of pearlite spheroidization were ablated by a pulsed laser at 266 nm to produce plasma. To determine the optimal temporal condition and plasma parameters for correlating the grade of pearlite spheroidization with the laser-induced spectra, a set of spectra acquired at different delays was analyzed by the principal component analysis method. The relationship between plasma temperature, intensity ratios of ionic to atomic lines, and grade of pearlite spheroidization was then studied. The results show that the laser-induced spectra of different grades of pearlite spheroidization can be readily identified by principal component analysis in the range of 271.941-289.672 nm with a 1000 ns delay time. A good agreement was also found between the Fe ionic-to-atomic line ratios and the tensile strength, whereas there was no obvious difference in plasma temperature. Therefore, LIBS may be applied not only as a spectrochemical analytical technique but also as a new way to estimate the grade of pearlite spheroidization.
Alizadeh-Pasdar, Nooshin; Nakai, Shuryo; Li-Chan, Eunice C Y
2002-10-09
Raman spectroscopy was used to elucidate structural changes of beta-lactoglobulin (BLG), whey protein isolate (WPI), and bovine serum albumin (BSA), at 15% concentration, as a function of pH (5.0, 7.0, and 9.0), heating (80 degrees C, 30 min), and the presence of 0.24% kappa-carrageenan. Three data-processing techniques were used to help identify significant changes in the Raman spectral data. Analysis of variance showed that of 12 characteristics examined in the Raman spectra, only a few were significantly affected by pH, heating, kappa-carrageenan, and their interactions: amide I (1658 cm(-1)) for WPI and BLG, alpha-helix for BLG and BSA, beta-sheet for BSA, CH stretching (2880 cm(-1)) for BLG and BSA, and CH stretching (2930 cm(-1)) for BSA. Principal component analysis reduced the dimensionality of the characteristics. Using principal component similarity analysis, heating and its interaction with kappa-carrageenan were identified as the most influential factors on the overall structure of the whey proteins.
Parallel group independent component analysis for massive fMRI data sets.
Chen, Shaojie; Huang, Lei; Qiu, Huitong; Nebel, Mary Beth; Mostofsky, Stewart H; Pekar, James J; Lindquist, Martin A; Eloyan, Ani; Caffo, Brian S
2017-01-01
Independent component analysis (ICA) is widely used in the field of functional neuroimaging to decompose data into spatio-temporal patterns of co-activation. In particular, ICA has found wide usage in the analysis of resting state fMRI (rs-fMRI) data. Recently, a number of large-scale data sets have become publicly available that consist of rs-fMRI scans from thousands of subjects. As a result, efficient ICA algorithms that scale well to the increased number of subjects are required. To address this problem, we propose a two-stage likelihood-based algorithm for performing group ICA, which we denote Parallel Group Independent Component Analysis (PGICA). By utilizing the sequential nature of the algorithm and parallel computing techniques, we are able to efficiently analyze data sets from large numbers of subjects. We illustrate the efficacy of PGICA, which has been implemented in R and is freely available through the Comprehensive R Archive Network, through simulation studies and application to rs-fMRI data from two large multi-subject data sets, consisting of 301 and 779 subjects respectively.
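A minimal sketch of the two-stage idea (per-subject PCA reduction, then ICA on the concatenated reduced data) is shown below using scikit-learn on toy data. This illustrates temporal-concatenation group ICA in general, not the PGICA likelihood algorithm or its R implementation; all sizes are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(1)

# Toy stand-in for rs-fMRI: each subject is a (time x voxel) matrix
# sharing two non-Gaussian spatial sources, with subject-specific
# time courses plus noise.
n_sub, n_time, n_vox, n_src = 5, 60, 500, 2
sources = rng.laplace(size=(n_src, n_vox))          # shared spatial maps
subjects = [rng.standard_normal((n_time, n_src)) @ sources
            + 0.05 * rng.standard_normal((n_time, n_vox))
            for _ in range(n_sub)]

# Stage 1: per-subject PCA reduction (the step PGICA parallelizes);
# components_ holds each subject's reduced (n_src x n_vox) data.
reduced = [PCA(n_components=n_src).fit(X).components_ for X in subjects]

# Stage 2: concatenate the reduced data and run spatial ICA.
stacked = np.vstack(reduced)                        # (n_sub*n_src, n_vox)
maps = FastICA(n_components=n_src, random_state=0).fit_transform(stacked.T).T
```

Because stage 1 touches each subject independently, it can be distributed across workers, which is what makes the approach scale to thousands of scans.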
Classification of breast tissue in mammograms using efficient coding.
Costa, Daniel D; Campos, Lúcio F; Barros, Allan K
2011-06-24
Female breast cancer is the major cause of cancer death in western countries. Efforts in computer vision have been made to improve diagnostic accuracy by radiologists. Some methods for lesion diagnosis in mammogram images have been developed based on principal component analysis, which has been used for efficient coding of signals, and on 2D Gabor wavelets, used in computer vision applications and in modeling biological vision. In this work, we present a methodology that uses efficient coding along with linear discriminant analysis (LDA) to distinguish mass from non-mass in 5090 regions of interest from mammograms. The best success rates reached with Gabor wavelets and principal component analysis were 85.28% and 87.28%, respectively; in comparison, the efficient-coding model presented here reached up to 90.07%. Altogether, the results demonstrate that independent component analysis successfully performed the efficient coding needed to discriminate mass from non-mass tissue. In addition, we observed that LDA with ICA bases showed high predictive performance for some datasets, providing significant support for a more detailed clinical investigation.
Varietal discrimination of hop pellets by near and mid infrared spectroscopy.
Machado, Julio C; Faria, Miguel A; Ferreira, Isabel M P L V O; Páscoa, Ricardo N M J; Lopes, João A
2018-04-01
Hops are one of the most important ingredients in beer production, and several varieties are commercialized. It is therefore important to find a rapid, low-cost, eco-friendly technique to distinguish and discriminate hop varieties. This paper describes the development of a method based on vibrational spectroscopy, namely near- and mid-infrared spectroscopy, for the discrimination of 33 commercial hop varieties. A total of 165 samples (five for each hop variety) were analysed by both techniques. Principal component analysis, hierarchical cluster analysis and partial least squares discriminant analysis were the chemometric tools used to discriminate the hop varieties. After optimizing the spectral regions and pre-processing methods, correct discrimination rates of 94.2% and 96.6% of the hop varieties were obtained for near- and mid-infrared spectroscopy, respectively. The results demonstrate the suitability of these vibrational spectroscopy techniques for discriminating different hop varieties and, consequently, their potential as an authenticity tool. Compared with the reference procedures normally used for hop variety discrimination, these techniques are quicker, cost-effective, non-destructive and eco-friendly. Copyright © 2017 Elsevier B.V. All rights reserved.
Failure Diagnosis for the Holdup Tank System via ISFA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Huijuan; Bragg-Sitton, Shannon; Smidts, Carol
This paper discusses the use of the integrated system failure analysis (ISFA) technique for fault diagnosis of the holdup tank system. ISFA is a simulation-based, qualitative and integrated approach used to study fault propagation in systems containing both hardware and software subsystems. The holdup tank system consists of a tank containing a fluid whose level is controlled by an inlet valve and an outlet valve. We introduce the component and functional models of the system, quantify the main parameters and simulate possible failure-propagation paths based on the fault propagation approach, ISFA. The results show that most component failures in the holdup tank system can be identified clearly and that ISFA is viable as a technique for fault diagnosis. Since ISFA is a qualitative technique that can be used in the very early stages of system design, this case study provides indications that it can be used early to study design aspects that relate to robustness and fault tolerance.
A diagnostic analysis of the VVP single-doppler retrieval technique
NASA Technical Reports Server (NTRS)
Boccippio, Dennis J.
1995-01-01
A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
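The diagnosis that narrow azimuthal sectors make the basis functions of the fit collinear can be illustrated with standard regression diagnostics. The condition number and variance inflation factors below are generic tools, and the two-column cos/sin design matrix is a simplified stand-in for the full VVP basis:

```python
import numpy as np

def collinearity_diagnostics(X):
    """Condition number and variance inflation factors (VIFs) of a
    design matrix, after standardizing its columns.

    Large values flag the near-collinear basis functions that make a
    VVP-style multivariate regression numerically unstable.
    """
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    cond = np.linalg.cond(Xc)
    # VIF_j = 1 / (1 - R_j^2): diagonal of the inverse correlation matrix.
    vif = np.diag(np.linalg.inv(np.corrcoef(Xc, rowvar=False)))
    return cond, vif

# Two cos/sin basis functions decorrelate over a full azimuthal sweep
# but become nearly collinear over a narrow 20-degree sector.
az_wide = np.deg2rad(np.linspace(0.0, 360.0, 200))
az_narrow = np.deg2rad(np.linspace(0.0, 20.0, 200))
X_wide = np.column_stack([np.cos(az_wide), np.sin(az_wide)])
X_narrow = np.column_stack([np.cos(az_narrow), np.sin(az_narrow)])
```

On this toy design the narrow sector sharply inflates both diagnostics, mirroring the paper's conclusion that VVP cannot be applied to small azimuthal sectors of data.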
Neutron spectrometry for UF6 enrichment verification in storage cylinders
Mengesha, Wondwosen; Kiff, Scott D.
2015-01-29
Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used in safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders, based on an unattended, passive measurement approach. MCNP5- and Geant4-simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). PCA is a well-established technique with a wide area of application, including feature analysis, outlier detection, and gamma-ray spectral analysis. The results demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results also show that difficulties associated with the UF6 filling profile, observed in other unattended passive neutron measurements, can possibly be overcome using the presented approach.
The banana code—natural blend processing in the olfactory circuitry of Drosophila melanogaster
Schubert, Marco; Hansson, Bill S.; Sachse, Silke
2014-01-01
Odor information is predominantly perceived as complex odor blends. For Drosophila melanogaster, one of the most attractive blends is emitted by an over-ripe banana. To analyze how the fly's olfactory system processes natural blends, we combined the experimental advantages of gas chromatography and functional imaging (GC-I). In this way, natural banana compounds were presented successively to the fly antenna at close to naturally occurring concentrations. This technique allowed us to identify the active odor components, use these compounds as stimuli, and measure odor-induced Ca2+ signals in input and output neurons of the Drosophila antennal lobe (AL), the first olfactory neuropil. We demonstrate that mixture interactions of a natural blend are very rare and occur only at the AL output level, resulting in a surprisingly linear blend representation. However, the information regarding single components is strongly modulated by the olfactory circuitry within the AL, leading to a higher similarity between the representations of individual components and the banana blend. This observed modulation might tune the olfactory system to distinctively categorize odor components and improve the detection of suitable food sources. Functional GC-I thus enables analysis of virtually any unknown natural odorant blend and its components at their relative occurring concentrations, and allows characterization of neuronal responses of complete neural assemblies. This technique can be seen as a valuable complementary method to classical GC/electrophysiology techniques and will be a highly useful tool in future investigations of insect-insect and insect-plant chemical interactions. PMID:24600405
Spectral decomposition of AVIRIS data
NASA Technical Reports Server (NTRS)
Gaddis, Lisa; Soderblom, Laurence; Kieffer, Hugh; Becker, Kris; Torson, Jim; Mullins, Kevin
1993-01-01
A set of techniques is presented that uses only information contained within a raw Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) scene to estimate and remove additive components such as multiple scattering and instrument dark current. Multiplicative components (instrument gain, topographic modulation of brightness, atmospheric transmission) can then be normalized, permitting enhancement, extraction, and identification of relative reflectance information related to surface composition and mineralogy. The technique for deriving additive-component spectra from a raw AVIRIS scene is an adaptation of the 'regression intersection method' of Crippen. This method uses two surface units that are spatially extensive and located in rugged terrain. For a given wavelength pair, subtraction of the derived additive component from individual band values will remove topography in both regions in a band/band ratio image. Normalization of all spectra in the scene to the average scene spectrum then results in cancellation of multiplicative components and production of a relative-reflectance scene. The resulting AVIRIS product contains relative-reflectance features, due to mineral absorption, that depart from the average spectrum. These features are commonly extremely weak and difficult to recognize, but they can be enhanced by using two simple 3-D image-processing tools. The validity of these techniques is demonstrated by comparisons between relative-reflectance AVIRIS spectra and those derived by using JPL standard calibrations. The AVIRIS data used in this analysis were acquired over the Kelso Dunes area (34 deg 55' N, 115 deg 43' W) of the eastern Mojave Desert, CA (in 1987) and the Upheaval Dome area (38 deg 27' N, 109 deg 55' W) of Canyonlands National Park, UT (in 1991).
The application of digital techniques to the analysis of metallurgical experiments
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1977-01-01
The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.
Bayesian analysis of anisotropic cosmologies: Bianchi VIIh and WMAP
NASA Astrophysics Data System (ADS)
McEwen, J. D.; Josset, T.; Feeney, S. M.; Peiris, H. V.; Lasenby, A. N.
2013-12-01
We perform a definitive analysis of Bianchi VIIh cosmologies with Wilkinson Microwave Anisotropy Probe (WMAP) observations of the cosmic microwave background (CMB) temperature anisotropies. Bayesian analysis techniques are developed to study anisotropic cosmologies using full-sky and partial-sky masked CMB temperature data. We apply these techniques to analyse the full-sky internal linear combination (ILC) map and a partial-sky masked W-band map of WMAP 9 yr observations. In addition to the physically motivated Bianchi VIIh model, we examine phenomenological models considered in previous studies, in which the Bianchi VIIh parameters are decoupled from the standard cosmological parameters. In the two phenomenological models considered, Bayes factors of 1.7 and 1.1 units of log-evidence favouring a Bianchi component are found in full-sky ILC data. The corresponding best-fitting Bianchi maps recovered are similar for both phenomenological models and are very close to those found in previous studies using earlier WMAP data releases. However, no evidence for a phenomenological Bianchi component is found in the partial-sky W-band data. In the physical Bianchi VIIh model, we find no evidence for a Bianchi component: WMAP data thus do not favour Bianchi VIIh cosmologies over the standard Λ cold dark matter (ΛCDM) cosmology. It is not possible to discount Bianchi VIIh cosmologies in favour of ΛCDM completely, but we are able to constrain the vorticity of physical Bianchi VIIh cosmologies at (ω/H)0 < 8.6 × 10-10 with 95 per cent confidence.
NASA Astrophysics Data System (ADS)
Molina-Aguilera, A.; Mancilla, F. D. L.; Julià, J.; Morales, J.
2017-12-01
Joint inversion of P-receiver functions and wave dispersion data implicitly assumes an isotropic, radially stratified earth. The conventional approach inverts stacked radial-component receiver functions from different back-azimuths to obtain a laterally homogeneous single-velocity model. However, in the presence of strong lateral heterogeneities such as anisotropic layers and/or dipping interfaces, receiver functions are considerably perturbed, and both the radial and transverse components exhibit back-azimuthal dependences. Harmonic analysis methods exploit these azimuthal periodicities to separate the effects of the isotropic flat-layered structure from those caused by lateral heterogeneities. We implement a harmonic analysis method based on the radial and transverse receiver function components and carry out a synthetic study to illuminate the capabilities of the method in isolating the isotropic flat-layered part of the receiver functions and in constraining the geometry and strength of lateral heterogeneities. The back-azimuth-independent P receiver functions are then jointly inverted with phase and group dispersion curves using a linearized inversion procedure. We apply this approach to densely spaced seismic profiles (about 2 km inter-station distance, see figure) located in the central Betics (western Mediterranean region), a region that has experienced complex geodynamic processes and exhibits strong variations in Moho topography. The technique presented here is robust and can be applied systematically to construct a 3-D model of the crust and uppermost mantle across large networks.
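The core of such a harmonic analysis — fitting a constant plus cos/sin terms in back-azimuth at every time sample, and keeping the constant term as the baz-independent part — reduces to a least-squares problem. This is a generic sketch of that step, not the authors' implementation:

```python
import numpy as np

def harmonic_decomposition(rf, baz_deg, order=2):
    """Least-squares harmonic fit of receiver functions over back-azimuth.

    rf:      (n_events, n_samples) array of radial receiver functions.
    baz_deg: back-azimuth of each event in degrees.
    Returns a (2*order + 1, n_samples) coefficient array; row 0 is the
    baz-independent term attributed to the isotropic flat-layered structure.
    """
    baz = np.deg2rad(np.asarray(baz_deg, dtype=float))
    cols = [np.ones_like(baz)]
    for k in range(1, order + 1):
        cols += [np.cos(k * baz), np.sin(k * baz)]
    G = np.column_stack(cols)                     # design matrix
    coef, *_ = np.linalg.lstsq(G, rf, rcond=None)
    return coef
```

On a synthetic section built as a constant pulse plus a cos(baz)-modulated pulse, row 0 recovers the constant part; that is the part that would then be jointly inverted with the dispersion curves.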
Joining X-Ray to Lensing: An Accurate Combined Analysis of MACS J0416.1-2403
NASA Astrophysics Data System (ADS)
Bonamigo, M.; Grillo, C.; Ettori, S.; Caminha, G. B.; Rosati, P.; Mercurio, A.; Annunziatella, M.; Balestra, I.; Lombardi, M.
2017-06-01
We present a novel approach for a combined analysis of X-ray and gravitational lensing data and apply this technique to the merging galaxy cluster MACS J0416.1-2403. The method exploits the information on the intracluster gas distribution that comes from a fit of the X-ray surface brightness and then includes the hot gas as a fixed mass component in the strong-lensing analysis. With our new technique, we can separate the collisional from the collisionless diffuse mass components, thus obtaining a more accurate reconstruction of the dark matter distribution in the core of a cluster. We introduce an analytical description of the X-ray emission coming from a set of dual pseudo-isothermal elliptical mass distributions, which can be used directly in most lensing software packages. By combining Chandra observations with Hubble Frontier Fields imaging and Multi Unit Spectroscopic Explorer spectroscopy in MACS J0416.1-2403, we measure a projected gas-to-total mass fraction of approximately 10% at 350 kpc from the cluster center. Compared to the results of a more traditional cluster mass model (diffuse halos plus member galaxies), we find a significant difference in the cumulative projected mass profile of the dark matter component, and that the dark matter over total mass fraction is almost constant out to more than 350 kpc. In the coming era of large surveys, these results show the need for multiprobe analyses in detailed dark matter studies of galaxy clusters.
Artifact suppression and analysis of brain activities with electroencephalography signals.
Rashed-Al-Mahfuz, Md; Islam, Md Rabiul; Hirose, Keikichi; Molla, Md Khademul Islam
2013-06-05
A brain-computer interface is a communication system that connects the brain with a computer (or other device) without depending on the normal output pathways of the brain (i.e., peripheral nerves and muscles). The electro-oculogram is a dominant artifact that has a significant negative influence on further analysis of real electroencephalography data. This paper presents a data-adaptive technique for artifact suppression and brain wave extraction from electroencephalography signals to detect regional brain activities. An empirical mode decomposition-based adaptive thresholding approach was employed to suppress the electro-oculogram artifact. Fractional Gaussian noise was used to determine the threshold level, derived from the analysis data without any training. The purified electroencephalography signal was composed of the brain waves, also called rhythmic components, which represent brain activities. The rhythmic components were extracted from each electroencephalography channel using an adaptive Wiener filter at the original scale. The regional brain activities were mapped on the basis of the spatial distribution of the rhythmic components, and the results showed that different regions of the brain are activated in response to different stimuli. This research analyzed the activity of a single rhythmic component, alpha, with respect to different motor imaginations. The experimental results showed that the proposed method is very efficient in artifact suppression and in identifying individual motor imagery based on the activity of the alpha component.
Levis, Denise M; Westbrook, Kyresa
2013-01-01
Many health organizations and practitioners in the United States promote preconception health (PCH) to consumers. However, summaries and evaluations of PCH promotional activities are limited. We conducted a content analysis of PCH health education materials collected from local-, state-, national-, and federal-level partners by using an existing database of partners, outreach to maternal and child health organizations, and a snowball sampling technique. Thirty-two materials were included for analysis, based on inclusion/exclusion criteria. A codebook guided coding of the materials' characteristics (type, authorship, language, cost), use of marketing and behavioral strategies to reach the target population (target audience, message framing, call to action), and inclusion of PCH subject matter (clinical-behavioral components). Self-assessments of PCH behaviors were the most common material type in the sample (28%). Most materials broadly targeted women, and there was a near-equal distribution of targeting by pregnancy planning status (planners and nonplanners). "Practicing PCH benefits the baby's health" was the most common message frame. Materials contained a wide range of clinical-behavioral components. Strategic targeting of consumer subgroups is an important but overlooked strategy. More research is needed on PCH components, in terms of packaging and increasing motivation, to guide the use and placement of clinical-behavioral components within promotional materials.
NASA Astrophysics Data System (ADS)
Lipovsky, B.; Funning, G. J.
2009-12-01
We compare several techniques for the analysis of geodetic time series, with the ultimate aim of characterizing the physical processes represented therein. We compare three methods: Principal Component Analysis (PCA), Non-Linear PCA (NLPCA), and Rotated PCA (RPCA). We evaluate each method by its ability to isolate signals that may be any combination of low amplitude (near noise level), temporally transient, unaccompanied by seismic emissions, and small in scale with respect to the spatial domain. PCA is a powerful tool for extracting structure from large datasets, traditionally realized either through the solution of an eigenvalue problem or through iterative methods. PCA is a transformation of the coordinate system of the data such that the new "principal" axes retain maximal variance and minimal reconstruction error (Pearson, 1901; Hotelling, 1933). RPCA applies an orthogonal transformation to the principal axes determined by PCA. In the analysis of meteorological data sets, RPCA has been seen to overcome domain-shape dependencies, correct for sampling errors, and determine principal axes that more closely represent physical processes (e.g., Richman, 1986). NLPCA generalizes PCA by replacing principal axes with principal curves (e.g., Hsieh, 2004); we achieve NLPCA through an auto-associative feed-forward neural network (Scholz, 2005). We show the geophysical relevance of these techniques by applying each to a synthetic data set. Results are compared by inverting the principal axes to determine deformation source parameters. Temporal variability in the source parameters estimated by each method is also compared.
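The orthogonal rotation step that distinguishes RPCA from plain PCA is most commonly the varimax criterion; a compact NumPy version of the classic iteration (Kaiser, 1958) is sketched below as one standard choice of rotation, not necessarily the variant used in this study:

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Varimax rotation of a loading matrix, a classic RPCA choice.

    Finds an orthogonal rotation maximizing the variance of the squared
    loadings, so each rotated axis concentrates on few variables and is
    easier to read as a distinct physical process (Kaiser, 1958).
    """
    L = np.asarray(loadings, dtype=float)
    n, k = L.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        # SVD step of the standard varimax fixed-point update.
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / n))
        R = u @ vt
        if s.sum() - var_old < tol:
            break
        var_old = s.sum()
    return L @ R
```

Because the rotation is orthogonal, total variance is preserved; only the axes along which it is expressed change, which is the sense in which RPCA re-expresses, rather than replaces, the PCA solution.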
Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L
2010-04-16
The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the differing properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation-of-state development for complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far, the most widely used analytical technique with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oil streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids. Published by Elsevier B.V.
Tensorial extensions of independent component analysis for multisubject FMRI analysis.
Beckmann, C F; Smith, S M
2005-03-01
We discuss model-free analysis of multisubject or multisession FMRI data by extending the single-session probabilistic independent component analysis model (PICA; Beckmann and Smith, 2004. IEEE Trans. on Medical Imaging, 23 (2) 137-152) to higher dimensions. This results in a three-way decomposition that represents the different signals and artefacts present in the data in terms of their temporal, spatial, and subject-dependent variations. The technique is derived from and compared with parallel factor analysis (PARAFAC; Harshman and Lundy, 1984. In Research methods for multimode data analysis, chapter 5, pages 122-215. Praeger, New York). Using simulated data as well as data from multisession and multisubject FMRI studies we demonstrate that the tensor PICA approach is able to efficiently and accurately extract signals of interest in the spatial, temporal, and subject/session domain. The final decompositions improve upon PARAFAC results in terms of greater accuracy, reduced interference between the different estimated sources (reduced cross-talk), robustness (against deviations of the data from modeling assumptions and against overfitting), and computational speed. On real FMRI 'activation' data, the tensor PICA approach is able to extract plausible activation maps, time courses, and session/subject modes as well as provide a rich description of additional processes of interest such as image artefacts or secondary activation patterns. The resulting data decomposition gives simple and useful representations of multisubject/multisession FMRI data that can aid the interpretation and optimization of group FMRI studies beyond what can be achieved using model-based analysis techniques.