Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
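As a rough illustration of the retention rule discussed in this abstract, the sketch below implements classical parallel analysis for covariance-based PCA in Python: observed eigenvalues are compared against a quantile of eigenvalues from repeatedly simulated uncorrelated normal data of the same size. This is a minimal sketch of the generic method, not the authors' code; the data, the 95th-percentile threshold, and the null model with matched per-variable variances are illustrative assumptions, and the Tracy-Widom test itself (which requires the Tracy-Widom distribution) is not shown.

```python
import numpy as np

def parallel_analysis(X, n_sim=1000, quantile=95, seed=0):
    """Horn's parallel analysis for covariance-based PCA.

    Retains components whose sample eigenvalues exceed the chosen quantile
    of eigenvalues from simulated uncorrelated normal data with the same
    number of rows and columns (and matched per-variable variances).
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs_eig = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]  # descending

    sim_eig = np.empty((n_sim, p))
    stds = X.std(axis=0, ddof=1)
    for b in range(n_sim):
        # Null model: independent normal variables with the same variances.
        Z = rng.standard_normal((n, p)) * stds
        sim_eig[b] = np.linalg.eigvalsh(np.cov(Z, rowvar=False))[::-1]

    threshold = np.percentile(sim_eig, quantile, axis=0)
    n_retained = int(np.sum(obs_eig > threshold))
    return n_retained, obs_eig, threshold

# Illustrative data: 200 observations, 10 variables, driven by 2 latent scores.
rng = np.random.default_rng(1)
scores = rng.standard_normal((200, 2))
loadings = rng.standard_normal((2, 10))
X = scores @ loadings + 0.5 * rng.standard_normal((200, 10))

k, eigs, thr = parallel_analysis(X)
print("components retained:", k)
```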
NASA Technical Reports Server (NTRS)
Goldstein, Arthur W; Alpert, Sumner; Beede, William; Kovach, Karl
1949-01-01
In order to understand the operation and the interaction of jet-engine components during engine operation and to determine how component characteristics may be used to compute engine performance, a method to analyze and to estimate performance of such engines was devised and applied to the study of the characteristics of a research turbojet engine built for this investigation. An attempt was made to correlate turbine performance obtained from engine experiments with that obtained by the simpler procedure of separately calibrating the turbine with cold air as a driving fluid in order to investigate the applicability of component calibration. The system of analysis was also applied to prediction of the engine and component performance with assumed modifications of the burner and bearing characteristics, to prediction of component and engine operation during engine acceleration, and to estimates of the performance of the engine and the components when the exhaust gas was used to drive a power turbine.
Why Does Behavioral Instruction Work? A Component Analysis of Performance and Motivational Outcomes.
ERIC Educational Resources Information Center
Omelich, Carol L.; Covington, Martin V.
Two fundamental components of behavioral instruction were investigated: the repeated testing feature and absolute performance standards. The component analysis was conducted by offering an undergraduate psychology course simultaneously along two dimensions: grading systems and number of study/test cycles. The 425 college student subjects were…
Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.
2010-01-01
The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
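The following sketch mirrors, in miniature, the kind of comparison described above between holistic PCA and ICA representations: features are extracted from vectorized "images" and fed to a simple classifier. It is only a toy illustration on synthetic data, not the FACS image sequences of the paper; the nearest-neighbour classifier, 20 components, and noise level are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for vectorized face-image differences: two "action" classes.
rng = np.random.RandomState(0)
n, d = 400, 256
templates = rng.randn(2, d)
y = rng.randint(0, 2, n)
X = templates[y] + 5.0 * rng.randn(n, d)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, extractor in [("PCA", PCA(n_components=20, random_state=0)),
                        ("ICA", FastICA(n_components=20, random_state=0,
                                        max_iter=1000))]:
    F_tr = extractor.fit_transform(X_tr)   # holistic features from training images
    F_te = extractor.transform(X_te)
    clf = KNeighborsClassifier(n_neighbors=1).fit(F_tr, y_tr)
    print(name, "accuracy:", clf.score(F_te, y_te))
```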
78 FR 8150 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-05
... three components: the ``Design and Implementation Study,'' the ``Performance Analysis Study,'' and the...- Component Evaluation--Data Collection Related to the Performance Analysis Study and the Impact and the In-depth Implementation Study. OMB No.: 0970-0398 Description: The Office of Data Analysis, Research, and...
A Component Analysis of the Impact of Evaluative and Objective Feedback on Performance
ERIC Educational Resources Information Center
Johnson, Douglas A.
2013-01-01
Despite the frequency with which performance feedback interventions are used in organizational behavior management, component analyses of such feedback are rare. It has been suggested that evaluation of performance and objective details about performance are two necessary components for performance feedback. The present study was designed to help…
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
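The steps named in this abstract (standardization, ranking by coefficient of variation, UPGMA hierarchical clustering with Euclidean or correlation distance, and PCA) can be reproduced generically with scipy and scikit-learn, as in the sketch below. This is not the CLUSFAVOR Windows program itself; the expression matrix, number of clusters, and distance metric are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
expr = rng.lognormal(mean=1.0, sigma=0.6, size=(500, 12))  # genes x arrays (invented)

# Standardize each gene profile (zero mean, unit variance).
z = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)

# Rank genes by coefficient of variation of the raw profiles.
cv = expr.std(axis=1) / expr.mean(axis=1)
order = np.argsort(cv)[::-1]

# UPGMA (average-linkage) hierarchical clustering on correlation distance.
dist = pdist(z, metric="correlation")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=8, criterion="maxclust")

# Principal-component analysis of the standardized profiles.
pca = PCA(n_components=3).fit(z)
print("cluster sizes:", np.bincount(clusters)[1:])
print("variance explained by first 3 PCs:", pca.explained_variance_ratio_)
```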
NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC: NASA Design and Analysis of Rotorcraft. Appendix 3; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tilt-rotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne R.
2009-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single main-rotor and tail-rotor helicopter; tandem helicopter; coaxial helicopter; and tiltrotors. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC - NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2015-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
Independent component analysis decomposition of hospital emergency department throughput measures
NASA Astrophysics Data System (ADS)
He, Qiang; Chu, Henry
2016-05-01
We present a method adapted from medical sensor data analysis, viz. independent component analysis of electroencephalography data, to health system analysis. Timely and effective care in a hospital emergency department is measured by throughput measures such as the median times patients spent before they were admitted as inpatients, before they were sent home, and before they were seen by a healthcare professional. We consider a set of five such measures collected at 3,086 hospitals distributed across the U.S. One model of the performance of an emergency department is that these correlated throughput measures are linear combinations of some underlying sources. The independent component analysis decomposition of the data set can thus be viewed as transforming a set of performance measures collected at a site to a collection of outputs of spatial filters applied to the whole multi-measure data. We compare the independent component sources with the output of the conventional principal component analysis to show that the independent components are more suitable for understanding the data sets through visualizations.
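A minimal sketch of this kind of decomposition is shown below: a hospitals-by-measures matrix is decomposed with FastICA into a small number of sources and compared with PCA. The data here are synthetic stand-ins (two invented latent sources), not the actual throughput measures, and the choice of two components is an assumption.

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(0)
n_hospitals, n_measures = 3086, 5

# Synthetic stand-in: throughput measures as linear mixtures of two latent sources.
sources = np.column_stack([rng.laplace(size=n_hospitals),        # e.g. staffing pressure
                           rng.exponential(size=n_hospitals)])   # e.g. patient acuity
mixing = rng.normal(size=(2, n_measures))
X = sources @ mixing + 0.3 * rng.normal(size=(n_hospitals, n_measures))

ica = FastICA(n_components=2, random_state=0, max_iter=1000)
S_ica = ica.fit_transform(X)          # estimated independent sources per hospital
pca = PCA(n_components=2)
S_pca = pca.fit_transform(X)          # orthogonal components for comparison

print("ICA mixing matrix (measures x sources):\n", ica.mixing_)
print("PCA variance ratios:", pca.explained_variance_ratio_)
```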
A case study in nonconformance and performance trend analysis
NASA Technical Reports Server (NTRS)
Maloy, Joseph E.; Newton, Coy P.
1990-01-01
As part of NASA's effort to develop an agency-wide approach to trend analysis, a pilot nonconformance and performance trending analysis study was conducted on the Space Shuttle auxiliary power unit (APU). The purpose of the study was to (1) demonstrate that nonconformance analysis can be used to identify repeating failures of a specific item (and the associated failure modes and causes) and (2) determine whether performance parameters could be analyzed and monitored to provide an indication of component or system degradation prior to failure. The nonconformance analysis of the APU did identify repeating component failures, which possibly could be reduced if key performance parameters were monitored and analyzed. The performance-trending analysis verified that the characteristics of hardware parameters can be effective in detecting degradation of hardware performance prior to failure.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Performance analysis and prediction in triathlon.
Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B
2016-01-01
Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
ERIC Educational Resources Information Center
Wilson, Mark V.; Wilson, Erin
2017-01-01
In this work we describe an authentic performance project for Instrumental Analysis in which students designed, built, and tested spectrophotometers made from simple components. The project addressed basic course content such as instrument design principles, UV-vis spectroscopy, and spectroscopic instrument components as well as skills such as…
Computing Lives And Reliabilities Of Turboprop Transmissions
NASA Technical Reports Server (NTRS)
Coy, J. J.; Savage, M.; Radil, K. C.; Lewicki, D. G.
1991-01-01
Computer program PSHFT calculates lifetimes of a variety of aircraft transmissions. Consists of main program, series of subroutines applying to specific configurations, generic subroutines for analysis of properties of components, subroutines for analysis of system, and common block. Main program selects routines used in analysis and causes them to operate in desired sequence. Series of configuration-specific subroutines put in configuration data, perform force and life analyses for components (with help of generic component-property-analysis subroutines), fill property array, call up system-analysis routines, and finally print out results of analysis for system and components. Written in FORTRAN 77(IV).
Yoo, Minjae; Shin, Jimin; Kim, Hyunmin; Kim, Jihye; Kang, Jaewoo; Tan, Aik Choon
2018-04-04
Traditional Chinese Medicine (TCM) has been practiced over thousands of years in China and other Asian countries for treating various symptoms and diseases. However, the underlying molecular mechanisms of TCM are poorly understood, partly due to the "multi-component, multi-target" nature of TCM. To uncover the molecular mechanisms of TCM, we perform comprehensive gene expression analysis using connectivity map. We interrogated gene expression signatures obtained from 102 TCM components using the next generation Connectivity Map (CMap) resource. We performed systematic data mining and analysis on the mechanism of action (MoA) of these TCM components based on the CMap results. We clustered the 102 TCM components into four groups based on their MoAs using the next generation CMap resource. We performed gene set enrichment analysis on these components to provide additional support for explaining these molecular mechanisms. We also provided literature evidence to validate the MoAs identified through this bioinformatics analysis. Finally, we developed the Traditional Chinese Medicine Drug Repurposing Hub (TCM Hub) - a connectivity map resource to facilitate the elucidation of TCM MoA for drug repurposing research. TCMHub is freely available in http://tanlab.ucdenver.edu/TCMHub. Molecular mechanisms of TCM could be uncovered by using gene expression signatures and connectivity map. Through this analysis, we identified that many of the TCM components possess diverse MoAs; this may explain the applications of TCM in treating various symptoms and diseases. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Hyperspectral functional imaging of the human brain
NASA Astrophysics Data System (ADS)
Toronov, Vladislav; Schelkanova, Irina
2013-03-01
We performed the independent component analysis of the hyperspectral functional near-infrared data acquired on humans during exercise and rest. We found that the hyperspectral functional data acquired on the human brain requires only two physiologically meaningful components to cover more than 50% of the temporal variance in hundreds of wavelengths. The analysis of the spectra of independent components showed that these components could be interpreted as results of changes in the cerebral blood volume and blood flow. Also, we found significant contributions of water and cytochrome c oxidase to the changes associated with the independent components. Another remarkable effect of ICA was its good performance in terms of the filtering of the data noise.
The Distressed Brain: A Group Blind Source Separation Analysis on Tinnitus
De Ridder, Dirk; Vanneste, Sven; Congedo, Marco
2011-01-01
Background: Tinnitus, the perception of a sound without an external sound source, can lead to variable amounts of distress. Methodology: In a group of tinnitus patients with variable amounts of tinnitus related distress, as measured by the Tinnitus Questionnaire (TQ), electroencephalography (EEG) is performed, evaluating the patients' resting state electrical brain activity. This resting state electrical activity is compared with a control group and between patients with low (N = 30) and high distress (N = 25). The groups are homogeneous for tinnitus type, tinnitus duration or tinnitus laterality. A group blind source separation (BSS) analysis is performed using a large normative sample (N = 84), generating seven normative components to which high and low tinnitus patients are compared. A correlation analysis of the obtained normative components' relative power and distress is performed. Furthermore, the functional connectivity as reflected by lagged phase synchronization is analyzed between the brain areas defined by the components. Finally, a group BSS analysis on the Tinnitus group as a whole is performed. Conclusions: Tinnitus can be characterized by at least four BSS components, two of which are posterior cingulate based, one based on the subgenual anterior cingulate and one based on the parahippocampus. Only the subgenual component correlates with distress. When performed on a normative sample, group BSS reveals that distress is characterized by two anterior cingulate based components. Spectral analysis of these components demonstrates that distress in tinnitus is related to alpha and beta changes in a network consisting of the subgenual anterior cingulate cortex extending to the pregenual and dorsal anterior cingulate cortex as well as the ventromedial prefrontal cortex/orbitofrontal cortex, insula, and parahippocampus. This network overlaps partially with brain areas implicated in distress in patients suffering from pain, functional somatic syndromes and posttraumatic stress disorder, and might therefore represent a specific distress network. PMID:21998628
Critical Factors Explaining the Leadership Performance of High-Performing Principals
ERIC Educational Resources Information Center
Hutton, Disraeli M.
2018-01-01
The study explored critical factors that explain leadership performance of high-performing principals and examined the relationship between these factors based on the ratings of school constituents in the public school system. The principal component analysis with the use of Varimax Rotation revealed that four components explain 51.1% of the…
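For readers unfamiliar with the technique named in this abstract, the sketch below runs a principal component analysis on standardized item ratings and applies a varimax rotation to the retained loadings. It is a generic illustration, not the study's analysis: the survey data are invented, four retained components and the iterative SVD form of varimax are assumptions.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of a p x k loading matrix (standard iterative SVD form)."""
    p, k = loadings.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0))))
        R = u @ vt
        d = s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
        d_old = d
    return loadings @ R

rng = np.random.default_rng(0)
ratings = rng.normal(size=(300, 20))              # invented constituent ratings
Z = (ratings - ratings.mean(0)) / ratings.std(0)  # standardize the items

# PCA via eigen-decomposition of the correlation matrix; keep four components.
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
idx = np.argsort(eigval)[::-1][:4]
loadings = eigvec[:, idx] * np.sqrt(eigval[idx])

rotated = varimax(loadings)
print("variance explained by 4 components: %.1f%%" % (100 * eigval[idx].sum() / 20))
print("rotated loadings shape:", rotated.shape)
```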
Performance deterioration based on existing (historical) data; JT9D jet engine diagnostics program
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1978-01-01
The results of the collection and analysis of historical data pertaining to the deterioration of JT9D engine performance are presented. The results of analyses of prerepair and postrepair engine test stand performance data from a number of airlines to establish the individual as well as average losses in engine performance with respect to service use are included. Analysis of the changes in mechanical condition of parts, obtained by inspection of used gas-path parts of varying age, allowed preliminary assessments of component performance deterioration levels and identification of the causative factors. These component performance estimates, refined by data from special engine back-to-back testing related to module performance restoration, permitted the development of preliminary models of engine component/module performance deterioration with respect to usage. The preliminary assessment of the causes of module performance deterioration and the trends with usage are explained, along with the role each module plays in overall engine performance deterioration. Preliminary recommendations with respect to operating and maintenance practices which could be adopted to control the level of performance deterioration are presented. The needs for additional component sensitivity testing as well as outstanding issues are discussed.
Constrained independent component analysis approach to nonobtrusive pulse rate measurements
NASA Astrophysics Data System (ADS)
Tsouri, Gill R.; Kyal, Survi; Dianat, Sohail; Mestha, Lalit K.
2012-07-01
Nonobtrusive pulse rate measurement using a webcam is considered. We demonstrate how state-of-the-art algorithms based on independent component analysis suffer from a sorting problem which hinders their performance, and propose a novel algorithm based on constrained independent component analysis to improve performance. We present how the proposed algorithm extracts a photoplethysmography signal and resolves the sorting problem. In addition, we perform a comparative study between the proposed algorithm and state-of-the-art algorithms over 45 video streams using a finger probe oximeter for reference measurements. The proposed algorithm provides improved accuracy: the root mean square error is decreased from 20.6 and 9.5 beats per minute (bpm) for existing algorithms to 3.5 bpm for the proposed algorithm. An error of 3.5 bpm is within the inaccuracy expected from the reference measurements. This implies that the proposed algorithm provided performance of equal accuracy to the finger probe oximeter.
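The sketch below shows the plain-ICA baseline this abstract contrasts with, including the "sorting" step the authors identify as the weak point: after separation one must still choose which component carries the pulse, here by the strongest spectral peak in a plausible heart-rate band. It is not the constrained ICA of the paper; the RGB traces, 30 fps frame rate, and 0.75-4 Hz band are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA

fs = 30.0                                   # assumed webcam frame rate
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic R, G, B traces: a 72-bpm pulse plus motion and sensor noise.
pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)   # 1.2 Hz = 72 beats per minute
motion = 2.0 * np.sin(2 * np.pi * 0.3 * t)
rgb = np.column_stack([a * pulse + b * motion + 0.3 * rng.standard_normal(t.size)
                       for a, b in [(0.3, 1.0), (0.8, 0.9), (0.2, 1.1)]])

sources = FastICA(n_components=3, random_state=0).fit_transform(rgb)

# "Sorting": pick the component with the strongest peak in the 0.75-4 Hz band.
best_bpm, best_power = None, -np.inf
for s in sources.T:
    f, pxx = welch(s, fs=fs, nperseg=512)
    band = (f >= 0.75) & (f <= 4.0)
    peak = pxx[band].max()
    if peak > best_power:
        best_power, best_bpm = peak, 60.0 * f[band][np.argmax(pxx[band])]

print("estimated pulse rate: %.1f bpm" % best_bpm)
```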
Reliability and availability analysis of a 10 kW@20 K helium refrigerator
NASA Astrophysics Data System (ADS)
Li, J.; Xiong, L. Y.; Liu, L. Q.; Wang, H. R.; Wang, B. M.
2017-02-01
A 10 kW@20 K helium refrigerator has been established in the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. To evaluate and improve this refrigerator’s reliability and availability, a reliability and availability analysis is performed. According to the mission profile of this refrigerator, a functional analysis is performed. The failure data of the refrigerator components are collected and failure rate distributions are fitted by software Weibull++ V10.0. A Failure Modes, Effects & Criticality Analysis (FMECA) is performed and the critical components with higher risks are pointed out. Software BlockSim V9.0 is used to calculate the reliability and the availability of this refrigerator. The result indicates that compressors, turbine and vacuum pump are the critical components and the key units of this refrigerator. The mitigation actions with respect to design, testing, maintenance and operation are proposed to decrease those major and medium risks.
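The generic calculations mentioned here (fitting failure-time distributions and combining component reliabilities) can be sketched with scipy in place of Weibull++ and BlockSim, as below. The failure-time samples, mission time, MTBF, and MTTR are invented, and the series-system assumption (the refrigerator works only if every unit works) is an illustrative simplification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Invented failure-time samples (hours) for three critical units.
failures = {
    "compressor":  rng.weibull(1.8, 40) * 9000,
    "turbine":     rng.weibull(1.4, 25) * 12000,
    "vacuum_pump": rng.weibull(2.2, 30) * 7000,
}

mission_time = 4000.0  # hours, illustrative
reliabilities = {}
for name, data in failures.items():
    # Fit a two-parameter Weibull (location fixed at 0) to the failure times.
    shape, loc, scale = stats.weibull_min.fit(data, floc=0)
    reliabilities[name] = stats.weibull_min.sf(mission_time, shape, loc, scale)
    print(f"{name}: shape={shape:.2f}, scale={scale:.0f} h, "
          f"R({mission_time:.0f} h)={reliabilities[name]:.3f}")

# Series system: the refrigerator works only if every unit works.
r_system = np.prod(list(reliabilities.values()))
print("series-system reliability:", round(r_system, 3))

# Steady-state availability of one repairable unit from MTBF and MTTR (assumed).
mtbf, mttr = 9000.0, 48.0   # hours
print("availability:", mtbf / (mtbf + mttr))
```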
Handbook of experiences in the design and installation of solar heating and cooling systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, D.S.; Oberoi, H.S.
1980-07-01
A large array of problems encountered are detailed, including design errors, installation mistakes, cases of inadequate durability of materials and unacceptable reliability of components, and wide variations in the performance and operation of different solar systems. Durability, reliability, and design problems are reviewed for solar collector subsystems, heat transfer fluids, thermal storage, passive solar components, piping/ducting, and reliability/operational problems. The following performance topics are covered: criteria for design and performance analysis, domestic hot water systems, passive space heating systems, active space heating systems, space cooling systems, analysis of systems performance, and performance evaluations. (MHR)
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of the existing system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility or energy loss of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
Monakhova, Yulia B; Godelmann, Rolf; Kuballa, Thomas; Mushtakova, Svetlana P; Rutledge, Douglas N
2015-08-15
Discriminant analysis (DA) methods, such as linear discriminant analysis (LDA) or factorial discriminant analysis (FDA), are well-known chemometric approaches for solving classification problems in chemistry. In most applications, principal components analysis (PCA) is used as the first step to generate orthogonal eigenvectors and the corresponding sample scores are utilized to generate discriminant features for the discrimination. Independent components analysis (ICA) based on the minimization of mutual information can be used as an alternative to PCA as a preprocessing tool for LDA and FDA classification. To illustrate the performance of this ICA/DA methodology, four representative nuclear magnetic resonance (NMR) data sets of wine samples were used. The classification was performed regarding grape variety, year of vintage and geographical origin. The average increase for ICA/DA in comparison with PCA/DA in the percentage of correct classification varied between 6±1% and 8±2%. The maximum increase in classification efficiency of 11±2% was observed for discrimination of the year of vintage (ICA/FDA) and geographical origin (ICA/LDA). The procedure to determine the number of extracted features (PCs, ICs) for the optimum DA models was discussed. The use of independent components (ICs) instead of principal components (PCs) resulted in improved classification performance of DA methods. The ICA/LDA method is preferable to ICA/FDA for recognition tasks based on NMR spectroscopic measurements. Copyright © 2015 Elsevier B.V. All rights reserved.
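A compact way to see the PCA/DA versus ICA/DA comparison described above is to swap the preprocessing step inside a scikit-learn pipeline, as in the sketch below. The "spectra" are synthetic stand-ins rather than the wine NMR data, and the 10 extracted features and 5-fold cross-validation are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(0)

# Synthetic stand-in for NMR spectra of three classes (e.g. grape varieties).
n_per_class, n_points = 60, 300
class_templates = rng.randn(3, n_points)
y = np.repeat([0, 1, 2], n_per_class)
X = class_templates[y] + 6.0 * rng.randn(y.size, n_points)

for name, extractor in [("PCA/LDA", PCA(n_components=10, random_state=0)),
                        ("ICA/LDA", FastICA(n_components=10, random_state=0,
                                            max_iter=1000))]:
    pipe = make_pipeline(extractor, LinearDiscriminantAnalysis())
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```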
NASA Astrophysics Data System (ADS)
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, the proposed approach can be a useful tool to improve the process parameters in the stereolithography process, which is very useful information for machine designers as well as RP machine users.
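A minimal sketch of the grey relational analysis with PCA-derived weights described above is given below. The nine experimental runs and their responses are invented, the larger-the-better normalization for all four responses is an assumption, and using the squared loadings of the first principal component as weights is one common way (not necessarily the authors') to set the weighting values.

```python
import numpy as np

# Invented responses for 9 experimental runs (e.g. an L9 array):
# columns = tensile strength, flexural strength, impact strength, density.
Y = np.array([[52, 80, 3.1, 1.18], [55, 83, 3.4, 1.19], [57, 85, 3.3, 1.20],
              [50, 78, 2.9, 1.17], [58, 88, 3.6, 1.21], [54, 82, 3.2, 1.19],
              [53, 81, 3.0, 1.18], [59, 90, 3.7, 1.22], [56, 86, 3.5, 1.20]])

# 1. Normalize each response, larger-the-better (assumed for all four here).
norm = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0))

# 2. Grey relational coefficients with distinguishing coefficient zeta = 0.5.
zeta = 0.5
delta = 1.0 - norm                       # deviation from the ideal sequence
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# 3. Weight the responses by the squared loadings of the first principal
#    component of the coefficient matrix (one common objective-weighting choice).
corr = np.corrcoef(coeff, rowvar=False)
eigval, eigvec = np.linalg.eigh(corr)
weights = eigvec[:, -1] ** 2             # eigenvector of the largest eigenvalue
weights /= weights.sum()

# 4. Grey relational grade per run; the best run has the highest grade.
grade = coeff @ weights
print("weights:", np.round(weights, 3))
print("best run:", int(np.argmax(grade) + 1), "grades:", np.round(grade, 3))
```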
NASA Technical Reports Server (NTRS)
Rajagopal, K. R.
1992-01-01
The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.
Component Cost Analysis of Large Scale Systems
NASA Technical Reports Server (NTRS)
Skelton, R. E.; Yousuff, A.
1982-01-01
The ideas of cost decomposition are summarized to aid in the determination of the relative cost (or 'price') of each component of a linear dynamic system using quadratic performance criteria. In addition to the insights into system behavior that are afforded by such a component cost analysis (CCA), these CCA ideas naturally lead to a theory for cost-equivalent realizations.
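As a rough illustration of the idea, one common formulation assigns to each state the share of the steady-state quadratic cost it contributes, computed from the Lyapunov-equation state covariance of a stable linear system driven by unit white noise. The sketch below follows that reading; the system matrices are invented and the state-by-state split shown is an assumption about how the decomposition is defined, not the authors' exact theory.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative stable linear system  x' = A x + B w,  cost J = E[x' Q x].
A = np.array([[-1.0, 0.5, 0.0],
              [0.0, -2.0, 1.0],
              [0.0, 0.0, -0.5]])
B = np.array([[1.0], [0.5], [0.2]])
Q = np.diag([1.0, 2.0, 0.5])

# Steady-state covariance X solves A X + X A^T + B B^T = 0 (unit white noise input).
X = solve_continuous_lyapunov(A, -B @ B.T)

# Component costs: V_i = [X Q]_ii ; they sum to the total cost trace(Q X).
component_costs = np.diag(X @ Q)
total_cost = np.trace(Q @ X)

print("component costs:", np.round(component_costs, 4))
print("relative 'price' of each state:", np.round(component_costs / total_cost, 3))
```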
Effects of Gas Turbine Component Performance on Engine and Rotary Wing Vehicle Size and Performance
NASA Technical Reports Server (NTRS)
Snyder, Christopher A.; Thurman, Douglas R.
2010-01-01
In support of the Fundamental Aeronautics Program, Subsonic Rotary Wing Project, further gas turbine engine studies have been performed to quantify the effects of advanced gas turbine technologies on engine weight and fuel efficiency and the subsequent effects on a civilian rotary wing vehicle size and mission fuel. The Large Civil Tiltrotor (LCTR) vehicle and mission and a previous gas turbine engine study will be discussed as a starting point for this effort. Methodology used to assess effects of different compressor and turbine component performance on engine size, weight and fuel efficiency will be presented. A process to relate engine performance to overall LCTR vehicle size and fuel use will also be given. Technology assumptions and levels of performance used in this analysis for the compressor and turbine components will be discussed. Optimum cycles (in terms of power specific fuel consumption) will be determined with subsequent engine weight analysis. The combination of engine weight and specific fuel consumption will be used to estimate their effect on the overall LCTR vehicle size and mission fuel usage. All results will be summarized to help suggest which component performance areas have the most effect on the overall mission.
Model Performance Evaluation and Scenario Analysis ...
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude only, sequence only, and combined magnitude and sequence errors. The performance measures include error analysis, coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components and the reconstruction back to time series provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify if the source of uncertainty in the simulated data is due to the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify if mismatches between observed and simulated data result from magnitude or sequence related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool...
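Two ingredients of the description above can be sketched directly: the Nash-Sutcliffe efficiency and a separation of a time series into a magnitude part (the sorted values) and a sequence part (the order in which they occur). The decomposition shown here is one simple reading of that idea, not necessarily the tool's exact algorithm, and the daily-flow data are invented.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(0)
obs = np.maximum(rng.gamma(2.0, 5.0, 365), 0.1)          # synthetic daily flows
sim = obs * rng.normal(1.0, 0.15, obs.size) + rng.normal(0, 1.0, obs.size)

# Magnitude-only comparison: sort both series (a flow-duration-style view),
# so timing errors are ignored and only the distribution of values matters.
nse_full = nash_sutcliffe(obs, sim)
nse_magnitude = nash_sutcliffe(np.sort(obs), np.sort(sim))

# Sequence-only comparison: correlate the ranks, ignoring magnitudes.
rank_corr = np.corrcoef(np.argsort(np.argsort(obs)),
                        np.argsort(np.argsort(sim)))[0, 1]

print(f"NSE (full)      = {nse_full:.3f}")
print(f"NSE (magnitude) = {nse_magnitude:.3f}")
print(f"rank correlation (sequence) = {rank_corr:.3f}")
```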
Liu, Xiaona; Zhang, Qiao; Wu, Zhisheng; Shi, Xinyuan; Zhao, Na; Qiao, Yanjiang
2015-01-01
Laser-induced breakdown spectroscopy (LIBS) was applied to perform a rapid elemental analysis and provenance study of Blumea balsamifera DC. Principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) were implemented to exploit the multivariate nature of the LIBS data. Scores and loadings of the computed principal components visually illustrated the differences among the spectral data. The PLS-DA algorithm showed good classification performance. The PLS-DA model using complete spectra as input variables had similar discrimination performance to that using selected spectral lines as input variables. The down-selection of spectral lines was specifically focused on the major elements of B. balsamifera samples. Results indicated that LIBS could be used to rapidly analyze elements and to perform provenance studies of B. balsamifera. PMID:25558999
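One common way to run PLS-DA, consistent with the description above, is to regress a one-hot encoding of the class labels on the spectra and assign each sample to the class with the largest predicted value, as in the sketch below. The spectra here are synthetic stand-ins rather than LIBS measurements, and the five latent variables and three classes are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import label_binarize

rng = np.random.RandomState(0)

# Synthetic stand-in for LIBS spectra from three growing regions.
n_per_class, n_channels = 40, 500
templates = rng.rand(3, n_channels)
y = np.repeat([0, 1, 2], n_per_class)
X = templates[y] + 0.2 * rng.randn(y.size, n_channels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# PLS-DA: regress a one-hot encoding of the classes on the spectra,
# then assign each test spectrum to the class with the largest predicted value.
Y_tr = label_binarize(y_tr, classes=[0, 1, 2])
pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
y_pred = pls.predict(X_te).argmax(axis=1)

print("PLS-DA accuracy:", np.mean(y_pred == y_te))
```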
Reverse engineering of wörner type drilling machine structure.
NASA Astrophysics Data System (ADS)
Wibowo, A.; Belly, I.; llhamsyah, R.; Indrawanto; Yuwana, Y.
2018-03-01
A product design needs to be modified based on the conditions of production facilities and existing resource capabilities without reducing the functional aspects of the product itself. This paper describes the reverse engineering process of the main structure of the wörner type drilling machine to obtain a machine structure design that can be produced with limited resources using simple processes. Structural, functional and working-mechanism analyses were performed to understand the function and role of each basic component. The drilling machine was dismantled and each of the basic components was measured to obtain geometry and size data for each component. Geometric models of each structural component and of the machine assembly were built to facilitate the simulation process and machine performance analysis with reference to the ISO standard for drilling machines. A tolerance stackup analysis was also performed to determine the type and value of geometrical and dimensional tolerances, which could affect the ease with which the components can be manufactured and assembled.
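A tolerance stackup of the kind mentioned above can be sketched with a simple worst-case and root-sum-square calculation over a dimension chain, as below. The chain, nominal dimensions, and tolerances are invented for illustration and do not come from the wörner machine drawings.

```python
import math

# Invented dimension chain (mm) along the spindle axis: (name, nominal, +/- tol, direction)
chain = [
    ("base plate",        40.0, 0.10, +1),
    ("column",           250.0, 0.20, +1),
    ("head casting",      80.0, 0.15, +1),
    ("spindle-to-table", 355.0, 0.05, -1),   # closing dimension
]

nominal = sum(sign * d for _, d, _, sign in chain)
worst_case = sum(t for _, _, t, _ in chain)              # all tolerances add up
rss = math.sqrt(sum(t ** 2 for _, _, t, _ in chain))     # statistical (RSS) stack

print(f"nominal gap      = {nominal:.2f} mm")
print(f"worst-case stack = +/- {worst_case:.2f} mm")
print(f"RSS stack        = +/- {rss:.2f} mm")
```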
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes that permit more accurate and efficient three-dimensional analysis of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. This report is presented in two volumes. Volume 1 describes effort performed under Task 4B, Special Finite Element Special Function Models, while Volume 2 concentrates on Task 4C, Advanced Special Functions Models.
Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.
1989-01-01
A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode.
ERIC Educational Resources Information Center
Kronenberger, William G.; Thompson, Robert J., Jr.; Morrow, Catherine
1997-01-01
A principal components analysis of the Family Environment Scale (FES) (R. Moos and B. Moos, 1994) was performed using 113 undergraduates. Research supported 3 broad components encompassing the 10 FES subscales. These results supported previous research and the generalization of the FES to college samples. (SLD)
Desova, A A; Dorofeyuk, A A; Anokhin, A M
2017-01-01
We performed a comparative analysis of the types of spectral density typical of various parameters of pulse signal. The experimental material was obtained during the examination of school age children with various psychosomatic disorders. We also performed a typological analysis of the spectral density functions corresponding to the time series of different parameters of a single oscillation of pulse signals; the results of their comparative analysis are presented. We determined the most significant spectral components for two disorders in children: arterial hypertension and mitral valve prolapse.
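For orientation, the sketch below computes a spectral density estimate of the kind compared in such analyses, using a Welch periodogram of a simulated beat-to-beat pulse-parameter series. The signal, sampling rate, and frequency band are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.signal import welch

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(0)

# Simulated pulse-amplitude series: cardiac component ~1.1 Hz, respiratory ~0.25 Hz.
signal = (np.sin(2 * np.pi * 1.1 * t)
          + 0.4 * np.sin(2 * np.pi * 0.25 * t)
          + 0.3 * rng.standard_normal(t.size))

f, pxx = welch(signal, fs=fs, nperseg=2048)

# Report the dominant spectral components below 5 Hz.
band = f < 5.0
top = np.argsort(pxx[band])[::-1][:3]
for i in top:
    print(f"peak at {f[band][i]:.2f} Hz, power {pxx[band][i]:.3f}")
```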
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
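The core calculation named in this abstract, probabilistically combining a structural response with a structural resistance to obtain component reliability, can be illustrated with a simple Monte Carlo sketch. The distributions below are invented stand-ins, not NESSUS resistance or response models, and the sample size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumed distributions (illustrative only):
# response stress: lognormal; resistance strength: Weibull.
response = rng.lognormal(mean=np.log(300.0), sigma=0.12, size=n)    # MPa
resistance = 430.0 * rng.weibull(15.0, size=n)                      # MPa

# Component fails whenever resistance falls below response.
p_failure = np.mean(resistance < response)
reliability = 1.0 - p_failure
print(f"P(failure) ~ {p_failure:.2e}, reliability ~ {reliability:.6f}")
```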
Study of advanced techniques for determining the long-term performance of components
NASA Technical Reports Server (NTRS)
1972-01-01
A study was conducted of techniques having the capability of determining the performance and reliability of components for spacecraft liquid propulsion applications for long term missions. The study utilized two major approaches: improvement in the existing technology, and the evolution of new technology. The criteria established and methods evolved are applicable to valve components. Primary emphasis was placed on the oxygen difluoride and diborane propellant combination. The investigation included analysis, fabrication, and tests of experimental equipment to provide data and performance criteria.
Cell module and fuel conditioner development
NASA Technical Reports Server (NTRS)
Feret, J. M.
1982-01-01
The efforts performed to develop a phosphoric acid fuel cell (PAFC) stack design having a 10 kW power rating for operation at higher than atmospheric pressure based on the existing Mark II design configuration are described. The work involves: (1) performance of pertinent functional analysis, trade studies and thermodynamic cycle analysis for requirements definition and system operating parameter selection purposes, (2) characterization of fuel cell materials and components, and performance testing and evaluation of the repeating electrode components, (3) establishment of the state-of-the-art manufacturing technology for all fuel cell components at Westinghouse and the fabrication of short stacks of various sizes, and (4) development of a 10 kW PAFC stack design for higher pressure operation utilizing the top down systems engineering approach.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2016-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
Key components of financial-analysis education for clinical nurses.
Lim, Ji Young; Noh, Wonjung
2015-09-01
In this study, we identified key components of financial-analysis education for clinical nurses. We used a literature review, focus group discussions, and a content validity index survey to develop key components of financial-analysis education. First, a wide range of references were reviewed, and 55 financial-analysis education components were gathered. Second, two focus group discussions were performed; the participants were 11 nurses who had worked for more than 3 years in a hospital, and nine components were agreed upon. Third, 12 professionals, including professors, nurse executives, nurse managers, and an accountant, participated in the content validity index survey. Finally, six key components of financial-analysis education were selected. These key components were as follows: understanding the need for financial analysis, introduction to financial analysis, reading and implementing balance sheets, reading and implementing income statements, understanding the concepts of financial ratios, and interpretation and practice of financial ratio analysis. The results of this study will be used to develop an education program to increase financial-management competency among clinical nurses. © 2015 Wiley Publishing Asia Pty Ltd.
NASA Technical Reports Server (NTRS)
Parker, K. C.; Torian, J. G.
1980-01-01
A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.
Independent component analysis algorithm FPGA design to perform real-time blind source separation
NASA Astrophysics Data System (ADS)
Meyer-Baese, Uwe; Odom, Crispin; Botella, Guillermo; Meyer-Baese, Anke
2015-05-01
The conditions that arise in the Cocktail Party Problem prevail across many fields, creating a need for Blind Source Separation (BSS). The need for BSS is now prevalent in fields including array processing, communications, medical and speech signal processing, wireless communication, audio, acoustics, and biomedical engineering. The concept of the cocktail party problem and BSS led to the development of Independent Component Analysis (ICA) algorithms. ICA proves useful for applications needing real-time signal processing. The goal of this research was to perform an extensive study of the ability and efficiency of Independent Component Analysis algorithms to perform blind source separation on mixed signals in software, and of their implementation in hardware with a Field Programmable Gate Array (FPGA). The Algebraic ICA (A-ICA), Fast ICA, and Equivariant Adaptive Separation via Independence (EASI) ICA were examined and compared. The best algorithm, defined as the one requiring the least complexity and fewest resources while effectively separating mixed sources, was the EASI algorithm. The EASI ICA was implemented on hardware with Field Programmable Gate Arrays (FPGA) to perform blind source separation and analyze its performance in real time.
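For orientation only, and not code from the study above, a minimal NumPy sketch of the serial EASI update rule follows; the step size, the tanh nonlinearity, and the synthetic test signals are illustrative assumptions.

import numpy as np

def easi_separate(X, mu=0.001, n_sweeps=5):
    # X: (n_sources, n_samples) array of mixed observations.
    # Returns the unmixing matrix W and the recovered sources Y = W @ X.
    n = X.shape[0]
    W = np.eye(n)
    for _ in range(n_sweeps):
        for x in X.T:                       # serial (sample-by-sample) update
            y = W @ x
            g = np.tanh(y)                  # illustrative nonlinearity
            W -= mu * (np.outer(y, y) - np.eye(n)
                       + np.outer(g, y) - np.outer(y, g)) @ W
    return W, W @ X

# Illustrative use: two synthetic sources mixed by a random matrix.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
S = np.vstack([np.sin(2 * np.pi * 7 * t), np.sign(np.sin(2 * np.pi * 3 * t))])
W, Y = easi_separate(rng.normal(size=(2, 2)) @ S)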
Task analysis exemplified: the process of resolving unfinished business.
Greenberg, L S; Foerster, F S
1996-06-01
The steps of a task-analytic research program designed to identify the in-session performances involved in resolving lingering bad feelings toward a significant other are described. A rational-empirical methodology of repeatedly cycling between rational conjecture and empirical observations is demonstrated as a method of developing an intervention manual and the components of client processes of resolution. A refined model of the change process developed by these procedures is validated by comparing 11 successful and 11 unsuccessful performances. Four performance components (intense expression of feeling, expression of need, shift in representation of the other, and self-validation or understanding of the other) were found to discriminate between resolution and nonresolution performances. These components were measured on 4 process measures: the Structural Analysis of Social Behavior, the Experiencing Scale, the Client's Emotional Arousal Scale, and a need scale.
Enhanced Component Performance Study. Emergency Diesel Generators 1998–2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2014-11-01
This report presents an enhanced performance evaluation of emergency diesel generators (EDGs) at U.S. commercial nuclear power plants. This report evaluates component performance over time using Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES) data from 1998 through 2013 and maintenance unavailability (UA) performance data using Mitigating Systems Performance Index (MSPI) Basis Document data from 2002 through 2013. The objective is to present an analysis of factors that could influence the system and component trends in addition to annual performance trends of failure rates and probabilities. The factors analyzed for the EDG component are the differences in failures between all demands and actual unplanned engineered safety feature (ESF) demands, differences among manufacturers, and differences among EDG ratings. Statistical analyses of these differences are performed, and the results show whether pooling is acceptable across these factors. In addition, engineering analyses were performed with respect to time period and failure mode. The factors analyzed are: sub-component, failure cause, detection method, recovery, manufacturer, and EDG rating.
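As an illustration of one kind of pooling check, and not the statistical procedure documented in the report above, a short Python sketch applies a chi-square test of homogeneity to hypothetical failure counts grouped by manufacturer.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical EDG failure/success counts by manufacturer (not ICES data).
counts = np.array([[4,  996],    # manufacturer A
                   [7, 1493],    # manufacturer B
                   [2,  498]])   # manufacturer C

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# A large p-value suggests the groups are statistically similar,
# so pooling across manufacturers would be reasonable.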
Steinhauser, Marco; Hübner, Ronald
2009-10-01
It has been suggested that performance in the Stroop task is influenced by response conflict as well as task conflict. The present study investigated the idea that both conflict types can be isolated by applying ex-Gaussian distribution analysis which decomposes response time into a Gaussian and an exponential component. Two experiments were conducted in which manual versions of a standard Stroop task (Experiment 1) and a separated Stroop task (Experiment 2) were performed under task-switching conditions. Effects of response congruency and stimulus bivalency were used to measure response conflict and task conflict, respectively. Ex-Gaussian analysis revealed that response conflict was mainly observed in the Gaussian component, whereas task conflict was stronger in the exponential component. Moreover, task conflict in the exponential component was selectively enhanced under task-switching conditions. The results suggest that ex-Gaussian analysis can be used as a tool to isolate different conflict types in the Stroop task. PsycINFO Database Record (c) 2009 APA, all rights reserved.
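A minimal sketch of an ex-Gaussian decomposition of response times, using SciPy's exponnorm distribution rather than the authors' analysis code; the simulated data and parameter values are assumptions.

import numpy as np
from scipy import stats

# Simulated response times (seconds) standing in for Stroop RT data.
rt = stats.exponnorm.rvs(K=1.5, loc=0.55, scale=0.08, size=500, random_state=1)

# Maximum-likelihood fit of the ex-Gaussian (Gaussian convolved with exponential).
K, loc, scale = stats.exponnorm.fit(rt)
mu, sigma = loc, scale        # Gaussian component
tau = K * scale               # exponential component
print(f"mu = {mu:.3f} s, sigma = {sigma:.3f} s, tau = {tau:.3f} s")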
Computer program uses Monte Carlo techniques for statistical system performance analysis
NASA Technical Reports Server (NTRS)
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
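A minimal Monte Carlo tolerance-analysis sketch in Python illustrating the general idea; the component distributions and the system function are assumptions, not the model used in the NASA program.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

gain = rng.normal(10.0, 0.2, n)        # component gain with misalignment scatter
offset = rng.normal(0.0, 0.05, n)      # bias error of a second component
load = rng.uniform(0.9, 1.1, n)        # operating-condition variation

system_output = gain * load + offset   # simple system performance model

print("mean =", system_output.mean())
print("std  =", system_output.std())
print("P(out of spec) =", np.mean(np.abs(system_output - 10.0) > 1.5))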
Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin
2016-11-01
J. Sep. Sci. 2016, 39, 4147-4157 DOI: 10.1002/jssc.201600284 Yinchenhao decoction (YCHD) is a famous Chinese herbal formula recorded in the Shang Han Lun, which was prescribed by Zhongjing Zhang during 150-219 AD. A novel quantitative analysis method was developed, based on ultrahigh performance liquid chromatography coupled with a diode array detector, for the simultaneous determination of 14 main active components in Yinchenhao decoction. Furthermore, the method has been applied for compositional difference analysis of the 14 components in eight normal extraction samples of Yinchenhao decoction, with the aid of hierarchical clustering analysis and similarity analysis. The present research could help hospitals, factories, and laboratories choose the best way to make Yinchenhao decoction with better efficacy. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Gruen, D.M.; Young, C.E.; Pellin, M.J.
1989-12-26
A charged particle spectrometer is described for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode. 12 figs.
Stress analysis of 27% scale model of AH-64 main rotor hub
NASA Technical Reports Server (NTRS)
Hodges, R. V.
1985-01-01
Stress analysis of an AH-64 27% scale model rotor hub was performed. Component loads and stresses were calculated based upon blade root loads and motions. The static and fatigue analysis indicates positive margins of safety in all components checked. Using the format developed here, the hub can be stress checked for future application.
Intelligence, Surveillance, and Reconnaissance Fusion for Coalition Operations
2008-07-01
[Abstract excerpt, truncated in source] ...classification of the targets of interest. The MMI features extracted in this manner have two properties that provide a sound justification for [...] are generalizations of well-known feature extraction methods such as Principal Components Analysis (PCA) and Independent Component Analysis (ICA) [...] augment (without degrading performance) a large class of generic fusion processes. Keywords: ontologies, classifications, feature extraction, feature analysis.
Dong, Fengxia; Mitchell, Paul D; Colquhoun, Jed
2015-01-01
Measuring farm sustainability performance is a crucial component for improving agricultural sustainability. While extensive assessments and indicators exist that reflect the different facets of agricultural sustainability, because of the relatively large number of measures and interactions among them, a composite indicator that integrates and aggregates over all variables is particularly useful. This paper describes and empirically evaluates a method for constructing a composite sustainability indicator that individually scores and ranks farm sustainability performance. The method first uses non-negative polychoric principal component analysis to reduce the number of variables, to remove correlation among variables and to transform categorical variables to continuous variables. Next the method applies common-weight data envelope analysis to these principal components to individually score each farm. The method solves weights endogenously and allows identifying important practices in sustainability evaluation. An empirical application to Wisconsin cranberry farms finds heterogeneity in sustainability practice adoption, implying that some farms could adopt relevant practices to improve the overall sustainability performance of the industry. Copyright © 2014 Elsevier Ltd. All rights reserved.
HPLC-Orbitrap analysis for identification of organic molecules in complex material
NASA Astrophysics Data System (ADS)
Gautier, T.; Schmitz-Afonso, I.; Carrasco, N.; Touboul, D.; Szopa, C.; Buch, A.; Pernot, P.
2015-10-01
We performed High Performance Liquid Chromatography (HPLC) coupled to Orbitrap High Resolution Mass Spectrometry (OHR MS) analysis of Titan's tholins. This analysis allowed us to determine the exact composition and structure of some of the major components of tholins.
Chen, Pei; Jin, Hong-Yu; Sun, Lei; Ma, Shuang-Cheng
2016-09-01
Multi-source analysis of traditional Chinese medicine is key to ensuring its safety and efficacy. Compared with traditional experimental differentiation, chemometric analysis is a simpler strategy to identify traditional Chinese medicines. Multi-component analysis plays an increasingly vital role in the quality control of traditional Chinese medicines. A novel strategy, based on chemometric analysis and quantitative analysis of multiple components, was proposed to easily and effectively control the quality of traditional Chinese medicines such as Chonglou. Ultra high performance liquid chromatography made the analysis more convenient and efficient. Five species of Chonglou were distinguished by chemometric analysis, and nine saponins, including Chonglou saponins I, II, V, VI, VII, D, and H, as well as dioscin and gracillin, were determined in 18 min. The method is feasible and credible, and enables improved quality control of traditional Chinese medicines and natural products. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
1992-01-01
The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.
SSME Post Test Diagnostic System: Systems Section
NASA Technical Reports Server (NTRS)
Bickmore, Timothy
1995-01-01
An assessment of engine and component health is routinely made after each test firing or flight firing of a Space Shuttle Main Engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project - the SSME Post Test Diagnostic System (PTDS) - is to develop a computer program which automates the analysis of test data from the SSME in order to detect and diagnose anomalies. This report primarily covers work on the Systems Section of the PTDS, which automates the analyses performed by the systems/performance group at the Propulsion Branch of NASA Marshall Space Flight Center (MSFC). This group is responsible for assessing the overall health and performance of the engine, and detecting and diagnosing anomalies which involve multiple components (other groups are responsible for analyzing the behavior of specific components). The PTDS utilizes several advanced software technologies to perform its analyses. Raw test data is analyzed using signal processing routines which detect features in the data, such as spikes, shifts, peaks, and drifts. Component analyses are performed by expert systems, which use 'rules-of-thumb' obtained from interviews with the MSFC data analysts to detect and diagnose anomalies. The systems analysis is performed using case-based reasoning. Results of all analyses are stored in a relational database and displayed via an X-window-based graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
Power Take-off System for Marine Renewable Devices, CRADA Number CRD-14-566
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muljadi, Eduard
Ocean Renewable Power Company (ORPC) proposes a project to develop and test innovative second-generation power take-off (PTO) components for the U.S. Department of Energy's 2013 FOA: Marine and Hydrokinetic System Performance Advancement, Topic Area 2 (Project). Innovative PTO components will include new and improved designs for bearings, couplings and a subsea electrical generator. Specific project objectives include the following: (1) Develop components for an advanced PTO suitable for MHK devices; (2) Bench test these components; (3) Assess the component and system performance benefits; (4) Perform a system integration study to integrate these components into an ORPC hydrokinetic turbine. National Renewable Energy Laboratory (NREL) will participate on the ORPC lead team to review design of the generator and will provide guidance on the design. Based on inputs from the project team, NREL will also provide an economic analysis of the impacts of the proposed system performance advancements.
Stress Analysis of B-52B and B-52H Air-Launching Systems Failure-Critical Structural Components
NASA Technical Reports Server (NTRS)
Ko, William L.
2005-01-01
The operational life analysis of any airborne failure-critical structural component requires the stress-load equation, which relates the applied load to the maximum tangential tensile stress at the critical stress point. The failure-critical structural components identified are the B-52B Pegasus pylon adapter shackles, B-52B Pegasus pylon hooks, B-52H airplane pylon hooks, B-52H airplane front fittings, B-52H airplane rear pylon fitting, and the B-52H airplane pylon lower sway brace. Finite-element stress analysis was performed on the said structural components, and the critical stress point was located and the stress-load equation was established for each failure-critical structural component. The ultimate load, yield load, and proof load needed for operational life analysis were established for each failure-critical structural component.
Cell module and fuel conditioner development
NASA Technical Reports Server (NTRS)
Feret, J. M.
1981-01-01
A phosphoric acid fuel cell (PAFC) stack design having a 10 kW power rating for operation at higher than atmospheric pressure based on the existing Mark II design configuration is described. Functional analysis, trade studies and thermodynamic cycle analysis for requirements definition and system operating parameter selection purposes were performed. Fuel cell materials and components, and performance testing and evaluation of the repeating electrode components were characterized. The state-of-the-art manufacturing technology for all fuel cell components and the fabrication of short stacks of various sizes were established. A 10 kW PAFC stack design for higher pressure operation utilizing the top-down systems engineering approach was developed.
2012-03-01
EMPIRICAL ANALYSIS OF OPTICAL ATTENUATOR PERFORMANCE IN QUANTUM KEY DISTRIBUTION SYSTEMS USING A... (AFIT/GCS/ENG/12-01; distribution is unlimited). [Abstract excerpt, truncated in source] ...challenging as the complexity of actual implementation specifics is considered. Two components common to most quantum key distribution...
Zeng, Rui; Fu, Juan; Wu, La-Bin; Huang, Lin-Fang
2013-07-01
To analyze the components of Citrus reticulata and salt-processed C. reticulata by ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-Q-TOF/MS), and to compare the changes in components before and after processing with salt. Principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were adopted to analyze the difference in fingerprint between crude and salt-processed C. reticulata, showing increased content of eriocitrin, limonin, nomilin, and obacunone in salt-processed C. reticulata. Potential chemical markers were identified as limonin, obacunone, and nomilin, which could be used as index components for distinguishing crude and processed C. reticulata.
Predictive Validity of National Basketball Association Draft Combine on Future Performance.
Teramoto, Masaru; Cross, Chad L; Rieger, Randall H; Maak, Travis G; Willick, Stuart E
2018-02-01
Teramoto, M, Cross, CL, Rieger, RH, Maak, TG, and Willick, SE. Predictive validity of national basketball association draft combine on future performance. J Strength Cond Res 32(2): 396-408, 2018-The National Basketball Association (NBA) Draft Combine is an annual event where prospective players are evaluated in terms of their athletic abilities and basketball skills. Data collected at the Combine should help NBA teams select the right players for the upcoming NBA draft; however, its value for predicting future performance of players has not been examined. This study investigated the predictive validity of the NBA Draft Combine on future performance of basketball players. We performed a principal component analysis (PCA) on the 2010-2015 Combine data to reduce correlated variables (N = 234), a correlation analysis on the Combine data and future on-court performance to examine relationships (maximum pairwise N = 217), and a robust principal component regression (PCR) analysis to predict first-year and 3-year on-court performance from the Combine measures (N = 148 and 127, respectively). Three components were identified within the Combine data through PCA (= Combine subscales): length-size, power-quickness, and upper-body strength. As per the correlation analysis, the individual Combine items for anthropometrics, including height without shoes, standing reach, weight, wingspan, and hand length, as well as the Combine subscale of length-size, had positive, medium-to-large-sized correlations (r = 0.313-0.545) with defensive performance quantified by Defensive Box Plus/Minus. The robust PCR analysis showed that the Combine subscale of length-size was a predictor most significantly associated with future on-court performance (p ≤ 0.05), including Win Shares, Box Plus/Minus, and Value Over Replacement Player, followed by upper-body strength. In conclusion, the NBA Draft Combine has value for predicting future performance of players.
Metal-backed versus all-polyethylene tibial components in primary total knee arthroplasty
2011-01-01
Background and purpose The choice of either all-polyethylene (AP) tibial components or metal-backed (MB) tibial components in total knee arthroplasty (TKA) remains controversial. We therefore performed a meta-analysis and systematic review of randomized controlled trials that have evaluated MB and AP tibial components in primary TKA. Methods The search strategy included a computerized literature search (Medline, EMBASE, Scopus, and the Cochrane Central Register of Controlled Trials) and a manual search of major orthopedic journals. A meta-analysis and systematic review of randomized or quasi-randomized trials that compared the performance of tibial components in primary TKA was performed using a fixed or random effects model. We assessed the methodological quality of studies using Detsky quality scale. Results 9 randomized controlled trials (RCTs) published between 2000 and 2009 met the inclusion quality standards for the systematic review. The mean standardized Detsky score was 14 (SD 3). We found that the frequency of radiolucent lines in the MB group was significantly higher than that in the AP group. There were no statistically significant differences between the MB and AP tibial components regarding component positioning, knee score, knee range of motion, quality of life, and postoperative complications. Interpretation Based on evidence obtained from this study, the AP tibial component was comparable with or better than the MB tibial component in TKA. However, high-quality RCTs are required to validate the results. PMID:21895503
Maneshi, Mona; Vahdat, Shahabeddin; Gotman, Jean; Grova, Christophe
2016-01-01
Independent component analysis (ICA) has been widely used to study functional magnetic resonance imaging (fMRI) connectivity. However, the application of ICA in multi-group designs is not straightforward. We have recently developed a new method named “shared and specific independent component analysis” (SSICA) to perform between-group comparisons in the ICA framework. SSICA is sensitive to extract those components which represent a significant difference in functional connectivity between groups or conditions, i.e., components that could be considered “specific” for a group or condition. Here, we investigated the performance of SSICA on realistic simulations, and task fMRI data and compared the results with one of the state-of-the-art group ICA approaches to infer between-group differences. We examined SSICA robustness with respect to the number of allowable extracted specific components and between-group orthogonality assumptions. Furthermore, we proposed a modified formulation of the back-reconstruction method to generate group-level t-statistics maps based on SSICA results. We also evaluated the consistency and specificity of the extracted specific components by SSICA. The results on realistic simulated and real fMRI data showed that SSICA outperforms the regular group ICA approach in terms of reconstruction and classification performance. We demonstrated that SSICA is a powerful data-driven approach to detect patterns of differences in functional connectivity across groups/conditions, particularly in model-free designs such as resting-state fMRI. Our findings in task fMRI show that SSICA confirms results of the general linear model (GLM) analysis and when combined with clustering analysis, it complements GLM findings by providing additional information regarding the reliability and specificity of networks. PMID:27729843
Evaluation of Rankine cycle air conditioning system hardware by computer simulation
NASA Technical Reports Server (NTRS)
Healey, H. M.; Clark, D.
1978-01-01
A computer program for simulating the performance of a variety of solar powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation and air conditioning components.
NASA Technical Reports Server (NTRS)
1993-01-01
The Marshall Space Flight Center is responsible for the development and management of advanced launch vehicle propulsion systems, including the Space Shuttle Main Engine (SSME), which is presently operational, and the Space Transportation Main Engine (STME) under development. The SSME's provide high performance within stringent constraints on size, weight, and reliability. Based on operational experience, continuous design improvement is in progress to enhance system durability and reliability. Specialized data analysis and interpretation is required in support of SSME and advanced propulsion system diagnostic evaluations. Comprehensive evaluation of the dynamic measurements obtained from test and flight operations is necessary to provide timely assessment of the vibrational characteristics indicating the operational status of turbomachinery and other critical engine components. Efficient performance of this effort is critical due to the significant impact of dynamic evaluation results on ground test and launch schedules, and requires direct familiarity with SSME and derivative systems, test data acquisition, and diagnostic software. Detailed analysis and evaluation of dynamic measurements obtained during SSME and advanced system ground test and flight operations was performed including analytical/statistical assessment of component dynamic behavior, and the development and implementation of analytical/statistical models to efficiently define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational condition. In addition, the SSME and J-2 data will be applied to develop vibroacoustic environments for advanced propulsion system components, as required. This study will provide timely assessment of engine component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. This contract will be performed through accomplishment of negotiated task orders.
Fracture mechanics concepts in reliability analysis of monolithic ceramics
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.; Gyekenyesi, John P.
1987-01-01
Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
Distributed optical fiber vibration sensor based on spectrum analysis of Polarization-OTDR system.
Zhang, Ziyi; Bao, Xiaoyi
2008-07-07
A fully distributed optical fiber vibration sensor is demonstrated based on spectrum analysis of a Polarization-OTDR system. Without performing any data averaging, vibration disturbances up to 5 kHz are successfully demonstrated in a 1 km fiber link with 10 m spatial resolution. The FFT is performed at each spatial resolution cell; the relation of the disturbance at each frequency component versus location allows simultaneous detection of multiple events with different or identical frequency components.
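A rough Python sketch of the spectrum-analysis step described above: an FFT is taken along the pulse (time) axis independently for every spatial bin, so a vibration frequency can be mapped to a location. The pulse rate, trace dimensions, and injected 5 kHz event are synthetic assumptions.

import numpy as np

pulse_rate = 20_000                     # probe-pulse repetition rate, Hz (assumed)
n_pulses, n_bins = 4096, 100            # one trace per pulse, one column per 10 m bin
rng = np.random.default_rng(0)
traces = rng.normal(size=(n_pulses, n_bins))
t = np.arange(n_pulses) / pulse_rate
traces[:, 40] += 0.5 * np.sin(2 * np.pi * 5000 * t)   # synthetic 5 kHz event at bin 40

# FFT along the pulse (time) axis, independently for every spatial bin.
spectra = np.abs(np.fft.rfft(traces - traces.mean(axis=0), axis=0))
freqs = np.fft.rfftfreq(n_pulses, d=1 / pulse_rate)

peak = np.unravel_index(spectra[1:].argmax(), spectra[1:].shape)  # skip DC row
print("dominant frequency:", freqs[peak[0] + 1], "Hz at spatial bin", peak[1])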
Dynamic competitive probabilistic principal components analysis.
López-Rubio, Ezequiel; Ortiz-DE-Lazcano-Lobato, Juan Miguel
2009-04-01
We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.
Maximum flow-based resilience analysis: From component to system
Jin, Chong; Li, Ruiying; Kang, Rui
2017-01-01
Resilience, the ability to withstand disruptions and recover quickly, must be considered during system design because any disruption of the system may cause considerable loss, both economic and societal. This work develops analytic maximum flow-based resilience models for series and parallel systems using Zobel's resilience measure. The two analytic models can be used to quantitatively evaluate and compare the resilience of systems with the corresponding performance structures. For systems with identical components, the resilience of the parallel system increases with increasing number of components, while the resilience remains constant in the series system. A Monte Carlo-based simulation method is also provided to verify the correctness of our analytic resilience models and to analyze the resilience of networked systems based on that of components. A road network example is used to illustrate the analysis process, and the resilience comparison among networks with different topologies but the same components indicates that a system with redundant performance is usually more resilient than one without redundant performance. However, not all redundant capacities of components can improve the system resilience; the effectiveness of the capacity redundancy depends on where the redundant capacity is located. PMID:28545135
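A small Python sketch of the network part of such an analysis, using NetworkX maximum flow as the performance measure before and after a disruption; the topology, capacities, and disrupted edge are illustrative assumptions, not the road network from the paper.

import networkx as nx

G = nx.DiGraph()
G.add_edge("s", "a", capacity=5)
G.add_edge("s", "b", capacity=3)
G.add_edge("a", "t", capacity=4)
G.add_edge("b", "t", capacity=4)
G.add_edge("a", "b", capacity=2)

baseline, _ = nx.maximum_flow(G, "s", "t")       # system performance before disruption

G_disrupted = G.copy()
G_disrupted["a"]["t"]["capacity"] = 1            # disruption degrades one component

degraded, _ = nx.maximum_flow(G_disrupted, "s", "t")
print("performance retained:", degraded / baseline)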
Butler, Rebecca A.
2014-01-01
Stroke aphasia is a multidimensional disorder in which patient profiles reflect variation along multiple behavioural continua. We present a novel approach to separating the principal aspects of chronic aphasic performance and isolating their neural bases. Principal components analysis was used to extract core factors underlying performance of 31 participants with chronic stroke aphasia on a large, detailed battery of behavioural assessments. The rotated principal components analysis revealed three key factors, which we labelled as phonology, semantic and executive/cognition on the basis of the common elements in the tests that loaded most strongly on each component. The phonology factor explained the most variance, followed by the semantic factor and then the executive-cognition factor. The use of principal components analysis rendered participants’ scores on these three factors orthogonal and therefore ideal for use as simultaneous continuous predictors in a voxel-based correlational methodology analysis of high resolution structural scans. Phonological processing ability was uniquely related to left posterior perisylvian regions including Heschl’s gyrus, posterior middle and superior temporal gyri and superior temporal sulcus, as well as the white matter underlying the posterior superior temporal gyrus. The semantic factor was uniquely related to left anterior middle temporal gyrus and the underlying temporal stem. The executive-cognition factor was not correlated selectively with the structural integrity of any particular region, as might be expected in light of the widely-distributed and multi-functional nature of the regions that support executive functions. The identified phonological and semantic areas align well with those highlighted by other methodologies such as functional neuroimaging and neurostimulation. The use of principal components analysis allowed us to characterize the neural bases of participants’ behavioural performance more robustly and selectively than the use of raw assessment scores or diagnostic classifications because principal components analysis extracts statistically unique, orthogonal behavioural components of interest. As such, in addition to improving our understanding of lesion–symptom mapping in stroke aphasia, the same approach could be used to clarify brain–behaviour relationships in other neurological disorders. PMID:25348632
Dong, Shuya; He, Jiao; Hou, Huiping; Shuai, Yaping; Wang, Qi; Yang, Wenling; Sun, Zheng; Li, Qing; Bi, Kaishun; Liu, Ran
2017-12-01
A novel, improved, and comprehensive method for quality evaluation and discrimination of Herba Leonuri has been developed and validated based on normal- and reversed-phase chromatographic methods. To identify Herba Leonuri, normal- and reversed-phase high-performance thin-layer chromatography fingerprints were obtained by comparing the colors and Rf values of the bands, and reversed-phase high-performance liquid chromatography fingerprints were obtained by using an Agilent Poroshell 120 SB-C18 within 28 min. By similarity analysis and hierarchical clustering analysis, we show that there are similar chromatographic patterns in Herba Leonuri samples, but significant differences in counterfeits and variants. To quantify the bio-active components of Herba Leonuri, reversed-phase high-performance liquid chromatography was performed to analyze syringate, leonurine, quercetin-3-O-robiniaglycoside, hyperoside, rutin, isoquercitrin, wogonin, and genkwanin simultaneously by the single standard to determine multi-components method with rutin as internal standard. Meanwhile, normal-phase high-performance liquid chromatography was performed by using an Agilent ZORBAX HILIC Plus within 6 min to determine trigonelline and stachydrine using trigonelline as internal standard. Innovatively, among these compounds, the bio-active components quercetin-3-O-robiniaglycoside and trigonelline were first determined in Herba Leonuri. In general, the method integrating multi-chromatographic analyses offered an efficient way for the standardization and identification of Herba Leonuri. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air cooled turbine blades, and air cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.
Kopriva, Ivica; Persin, Antun; Puizina-Ivić, Neira; Mirić, Lina
2010-07-02
This study was designed to demonstrate robust performance of the novel dependent component analysis (DCA)-based approach to demarcation of the basal cell carcinoma (BCC) through unsupervised decomposition of the red-green-blue (RGB) fluorescent image of the BCC. Robustness to intensity fluctuation is due to the scale invariance property of DCA algorithms, which exploit spectral and spatial diversities between the BCC and the surrounding tissue. The filtering-based DCA approach used here represents an extension of independent component analysis (ICA) and is necessary in order to account for the statistical dependence induced by spectral similarity between the BCC and the surrounding tissue. This similarity generates weak edges, which represent a challenge for other segmentation methods as well. By comparative performance analysis with state-of-the-art image segmentation methods such as active contours (level set), K-means clustering, non-negative matrix factorization, ICA and ratio imaging we experimentally demonstrate good performance of DCA-based BCC demarcation in two demanding scenarios where intensity of the fluorescent image has been varied almost two orders of magnitude. Copyright 2010 Elsevier B.V. All rights reserved.
Biomass relations for components of five Minnesota shrubs.
Richard R. Buech; David J. Rugg
1995-01-01
Presents equations for estimating biomass of six components on five species of shrubs common to northeastern Minnesota. Regression analysis is used to compare the performance of three estimators of biomass.
Ludwick, Teralynn; Turyakira, Eleanor; Kyomuhangi, Teddy; Manalili, Kimberly; Robinson, Sheila; Brenner, Jennifer L
2018-02-13
While evidence supports community health worker (CHW) capacity to improve maternal and newborn health in less-resourced countries, key implementation gaps remain. Tools for assessing CHW performance and evidence on what programmatic components affect performance are lacking. This study developed and tested a qualitative evaluative framework and tool to assess CHW team performance in a district program in rural Uganda. A new assessment framework was developed to collect and analyze qualitative evidence based on CHW perspectives on seven program components associated with effectiveness (selection; training; community embeddedness; peer support; supportive supervision; relationship with other healthcare workers; retention and incentive structures). Focus groups were conducted with four high/medium-performing CHW teams and four low-performing CHW teams selected through random, stratified sampling. Content analysis involved organizing focus group transcripts according to the seven program effectiveness components, and assigning scores to each component per focus group. Four components, 'supportive supervision', 'good relationships with other healthcare workers', 'peer support', and 'retention and incentive structures', received the lowest overall scores. Variances in scores between 'high'/'medium'- and 'low'-performing CHW teams were largest for 'supportive supervision' and 'good relationships with other healthcare workers.' Our analysis suggests that in the Bushenyi intervention context, CHW team performance is highly correlated with the quality of supervision and relationships with other healthcare workers. CHWs identified key performance-related issues of absentee supervisors, referral system challenges, and lack of engagement/respect by health workers. Other less-correlated program components warrant further study and may have been impacted by relatively consistent program implementation within our limited study area. Process-oriented measurement tools are needed to better understand CHW performance-related factors and build a supportive environment for CHW program effectiveness and sustainability. Findings from a qualitative, multi-component tool developed and applied in this study suggest that factors related to (1) supportive supervision and (2) relationships with other healthcare workers may be strongly associated with variances in performance outcomes within a program. Careful consideration of supervisory structure and health worker orientation during program implementation is among the strategies proposed to increase CHW performance.
Li, Yong-Wei; Qi, Jin; Wen-Zhang; Zhou, Shui-Ping; Yan-Wu; Yu, Bo-Yang
2014-07-01
Liriope muscari (Decne.) L. H. Bailey is a well-known traditional Chinese medicine used for treating cough and insomnia. There are few reports on the quality evaluation of this herb partly because the major steroid saponins are not readily identified by UV detectors and are not easily isolated due to the existence of many similar isomers. In this study, a qualitative and quantitative method was developed to analyze the major components in L. muscari (Decne.) L. H. Bailey roots. Sixteen components were deduced and identified primarily by the information obtained from ultra high performance liquid chromatography with ion-trap time-of-flight mass spectrometry. The method demonstrated the desired specificity, linearity, stability, precision, and accuracy for simultaneous determination of 15 constituents (13 steroidal glycosides, 25(R)-ruscogenin, and pentylbenzoate) in 26 samples from different origins. The fingerprint was established, and the evaluation was achieved using similarity analysis and principal component analysis of 15 fingerprint peaks from 26 samples by ultra high performance liquid chromatography. The results from similarity analysis were consistent with those of principal component analysis. All results suggest that the established method could be applied effectively to the determination of multi-ingredients and fingerprint analysis of steroid saponins for quality assessment and control of L. muscari (Decne.) L. H. Bailey. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Principal Component Analysis for pulse-shape discrimination of scintillation radiation detectors
NASA Astrophysics Data System (ADS)
Alharbi, T.
2016-01-01
In this paper, we report on the application of Principal Component Analysis (PCA) for pulse-shape discrimination (PSD) of scintillation radiation detectors. The details of the method are described, and the performance of the method is experimentally examined by discriminating between neutrons and gamma-rays with a liquid scintillation detector in a mixed radiation field. The performance of the method is also compared against that of the conventional charge-comparison method, demonstrating its superior performance, particularly in the low light output range. PCA analysis has the important advantage of automatic extraction of the pulse-shape characteristics, which makes the PSD method directly applicable to various scintillation detectors without the need for the adjustment of a PSD parameter.
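A minimal Python sketch of PCA-based pulse-shape discrimination on synthetic waveforms; the pulse shapes, noise level, and normalization are assumptions, not the detector data from the study.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
t = np.arange(200)
gamma = np.exp(-t / 15.0)                                     # fast-decay template
neutron = 0.8 * np.exp(-t / 15.0) + 0.2 * np.exp(-t / 80.0)   # slower tail component

# One digitized pulse per row: 500 gamma-like then 500 neutron-like events.
pulses = np.vstack([g + 0.02 * rng.normal(size=t.size)
                    for g in [gamma] * 500 + [neutron] * 500])
pulses /= pulses.sum(axis=1, keepdims=True)                   # area-normalize each pulse

# The leading principal components capture the tail/peak shape difference,
# so the projection serves as the discrimination parameter.
scores = PCA(n_components=2).fit_transform(pulses)
print(scores[:500, 0].mean(), scores[500:, 0].mean())         # separated clusters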
Analysis of Free Modeling Predictions by RBO Aleph in CASP11
Mabrouk, Mahmoud; Werner, Tim; Schneider, Michael; Putz, Ines; Brock, Oliver
2015-01-01
The CASP experiment is a biannual benchmark for assessing protein structure prediction methods. In CASP11, RBO Aleph ranked as one of the top-performing automated servers in the free modeling category. This category consists of targets for which structural templates are not easily retrievable. We analyze the performance of RBO Aleph and show that its success in CASP was a result of its ab initio structure prediction protocol. A detailed analysis of this protocol demonstrates that two components unique to our method greatly contributed to prediction quality: residue–residue contact prediction by EPC-map and contact–guided conformational space search by model-based search (MBS). Interestingly, our analysis also points to a possible fundamental problem in evaluating the performance of protein structure prediction methods: Improvements in components of the method do not necessarily lead to improvements of the entire method. This points to the fact that these components interact in ways that are poorly understood. This problem, if indeed true, represents a significant obstacle to community-wide progress. PMID:26492194
NDARC-NASA Design and Analysis of Rotorcraft Theoretical Basis and Architecture
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2010-01-01
The theoretical basis and architecture of the conceptual design tool NDARC (NASA Design and Analysis of Rotorcraft) are described. The principal tasks of NDARC are to design (or size) a rotorcraft to satisfy specified design conditions and missions, and then analyze the performance of the aircraft for a set of off-design missions and point operating conditions. The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated. The aircraft attributes are obtained from the sum of the component attributes. NDARC provides a capability to model general rotorcraft configurations, and estimate the performance and attributes of advanced rotor concepts. The software has been implemented with low-fidelity models, typical of the conceptual design environment. Incorporation of higher-fidelity models will be possible, as the architecture of the code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis and optimization.
Relaxation mode analysis of a peptide system: comparison with principal component analysis.
Mitsutake, Ayori; Iijima, Hiromitsu; Takano, Hiroshi
2011-10-28
This article reports the first attempt to apply the relaxation mode analysis method to a simulation of a biomolecular system. In biomolecular systems, the principal component analysis is a well-known method for analyzing the static properties of fluctuations of structures obtained by a simulation and classifying the structures into some groups. On the other hand, the relaxation mode analysis has been used to analyze the dynamic properties of homopolymer systems. In this article, a long Monte Carlo simulation of Met-enkephalin in gas phase has been performed. The results are analyzed by the principal component analysis and relaxation mode analysis methods. We compare the results of both methods and show the effectiveness of the relaxation mode analysis.
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion like other Rotation Techniques (RT); only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution appears able to overcome the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
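A short Python sketch of the idea that ICA applied after PCA whitening acts as a rotation of the leading PCA/EOF solution; the synthetic sources and mixing matrix are assumptions standing in for geophysical time series.

import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)
sources = np.c_[np.sin(2 * np.pi * t), np.sign(np.cos(3 * np.pi * t))]
X = sources @ rng.normal(size=(2, 6))          # six observed series, linear mixture

# Second-order step: PCA/EOF decomposition with whitening.
pcs = PCA(n_components=2, whiten=True).fit_transform(X)

# Higher-order step: ICA, which effectively rotates the whitened PCA solution.
ics = FastICA(n_components=2, random_state=0).fit_transform(X)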
Energy Efficient Engine Low Pressure Subsystem Flow Analysis
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Lynn, Sean R.; Heidegger, Nathan J.; Delaney, Robert A.
1998-01-01
The objective of this project is to provide the capability to analyze the aerodynamic performance of the complete low pressure subsystem (LPS) of the Energy Efficient Engine (EEE). The analyses were performed using three-dimensional Navier-Stokes numerical models employing advanced clustered processor computing platforms. The analysis evaluates the impact of steady aerodynamic interaction effects between the components of the LPS at design and off-design operating conditions. Mechanical coupling is provided by adjusting the rotational speed of common shaft-mounted components until a power balance is achieved. The Navier-Stokes modeling of the complete low pressure subsystem provides critical knowledge of component aero/mechanical interactions that previously were unknown to the designer until after hardware testing.
Energy Efficient Engine Low Pressure Subsystem Aerodynamic Analysis
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Delaney, Robert A.; Lynn, Sean R.; Veres, Joseph P.
1998-01-01
The objective of this study was to demonstrate the capability to analyze the aerodynamic performance of the complete low pressure subsystem (LPS) of the Energy Efficient Engine (EEE). Detailed analyses were performed using three-dimensional Navier-Stokes numerical models employing advanced clustered processor computing platforms. The analysis evaluates the impact of steady aerodynamic interaction effects between the components of the LPS at design and off-design operating conditions. Mechanical coupling is provided by adjusting the rotational speed of common shaft-mounted components until a power balance is achieved. The Navier-Stokes modeling of the complete low pressure subsystem provides critical knowledge of component aero/mechanical interactions that previously were unknown to the designer until after hardware testing.
Open-cycle systems performance analysis programming guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olson, D.A.
1981-12-01
The Open-Cycle OTEC Systems Performance Analysis Program is an algorithm programmed on SERI's CDC Cyber 170/720 computer to predict the performance of a Claude-cycle, open-cycle OTEC plant. The algorithm models the Claude-cycle system as consisting of an evaporator, a turbine, a condenser, deaerators, a condenser gas exhaust, a cold water pipe and cold and warm seawater pumps. Each component is a separate subroutine in the main program. A description is given of how to write Fortran subroutines to fit into the main program for the components of the OTEC plant. An explanation is provided of how to use the algorithm. The main program and existing component subroutines are described. Appropriate common blocks and input and output variables are listed. Preprogrammed thermodynamic property functions for steam, fresh water, and seawater are described.
UMAMI: A Recipe for Generating Meaningful Metrics through Holistic I/O Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lockwood, Glenn K.; Yoo, Wucherl; Byna, Suren
I/O efficiency is essential to productivity in scientific computing, especially as many scientific domains become more data-intensive. Many characterization tools have been used to elucidate specific aspects of parallel I/O performance, but analyzing components of complex I/O subsystems in isolation fails to provide insight into critical questions: how do the I/O components interact, what are reasonable expectations for application performance, and what are the underlying causes of I/O performance problems? To address these questions while capitalizing on existing component-level characterization tools, we propose an approach that combines on-demand, modular synthesis of I/O characterization data into a unified monitoring and metrics interface (UMAMI) to provide a normalized, holistic view of I/O behavior. We evaluate the feasibility of this approach by applying it to a month-long benchmarking study on two distinct large-scale computing platforms. We present three case studies that highlight the importance of analyzing application I/O performance in context with both contemporaneous and historical component metrics, and we provide new insights into the factors affecting I/O performance. By demonstrating the generality of our approach, we lay the groundwork for a production-grade framework for holistic I/O analysis.
NASA Astrophysics Data System (ADS)
Li, Jiangtong; Luo, Yongdao; Dai, Honglin
2018-01-01
Water is the source of life and the essential foundation of all life. With the development of industrialization, water pollution has become more and more frequent, directly affecting human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which partial least squares regression (PLSR) has become the predominant technique; however, in some special cases, PLSR analysis produces considerable errors. In order to solve this problem, the traditional principal component regression (PCR) analysis method was improved in this paper by using the principle of PLSR. The experimental results show that, for some special experimental data sets, the improved PCR analysis method performs better than PLSR. PCR and PLSR are the focus of this paper. Firstly, principal component analysis (PCA) is performed with MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal components, which carry most of the original data information, are extracted using the principle of PLSR. Secondly, linear regression analysis of the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components can be obtained. Finally, the same water spectral data set is calculated by both PLSR and the improved PCR, and the two results are analyzed and compared: the improved PCR and PLSR are similar for most data, but the improved PCR is better than PLSR for data near the detection limit. Both PLSR and the improved PCR can be used in ultraviolet spectral analysis of water, but for data near the detection limit, the improved PCR gives better results than PLSR.
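A minimal Python sketch contrasting principal component regression with PLSR on simulated UV absorbance spectra; it is not the MATLAB/SPSS workflow described above, and the spectra, concentrations, and number of components are assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.linspace(200, 400, 120)
peak = np.exp(-((wavelengths - 254) / 15) ** 2)            # analyte absorption band
conc = rng.uniform(0, 2, size=80)                          # simulated concentrations
spectra = np.outer(conc, peak) + 0.01 * rng.normal(size=(80, wavelengths.size))

# PCR: PCA dimensionality reduction followed by ordinary least squares.
scores = PCA(n_components=3).fit_transform(spectra)
pcr = LinearRegression().fit(scores, conc)

# PLSR: latent components chosen to maximize covariance with the response.
pls = PLSRegression(n_components=3).fit(spectra, conc)

print("PCR  R^2:", pcr.score(scores, conc))
print("PLSR R^2:", pls.score(spectra, conc))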
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
Analysis on Sealing Reliability of Bolted Joint Ball Head Component of Satellite Propulsion System
NASA Astrophysics Data System (ADS)
Guo, Tao; Fan, Yougao; Gao, Feng; Gu, Shixin; Wang, Wei
2018-01-01
The propulsion system is one of the important subsystems of a satellite, and its performance directly affects the service life, attitude control and reliability of the satellite. The paper analyzes the sealing principle of the bolted joint ball head component of a satellite propulsion system and discusses the compatibility of anhydrous hydrazine with the bolted joint ball head component, the influence of the ground environment on the sealing performance of bolted joint ball heads, and material failure caused by the environment, showing that the sealing reliability of the bolted joint ball head component is good and that the influence of the above three aspects on the sealing of the bolted joint ball head component can be ignored.
Physiological and anthropometric determinants of rhythmic gymnastics performance.
Douda, Helen T; Toubekis, Argyris G; Avloniti, Alexandra A; Tokmakidis, Savvas P
2008-03-01
To identify the physiological and anthropometric predictors of rhythmic gymnastics performance, which was defined from the total ranking score of each athlete in a national competition. Thirty-four rhythmic gymnasts were divided into 2 groups, elite (n = 15) and nonelite (n = 19), and they underwent a battery of anthropometric, physical fitness, and physiological measurements. The principal-components analysis extracted 6 components: anthropometric, flexibility, explosive strength, aerobic capacity, body dimensions, and anaerobic metabolism. These were used in a simultaneous multiple-regression procedure to determine which best explain the variance in rhythmic gymnastics performance. Based on the principal-component analysis, the anthropometric component explained 45% of the total variance, flexibility 12.1%, explosive strength 9.2%, aerobic capacity 7.4%, body dimensions 6.8%, and anaerobic metabolism 4.6%. Components of anthropometric (r = .50) and aerobic capacity (r = .49) were significantly correlated with performance (P < .01). When the multiple-regression model-y = 10.708 + (0.0005121 x VO2max) + (0.157 x arm span) + (0.814 x midthigh circumference) - (0.293 x body mass)-was applied to elite gymnasts, 92.5% of the variation was explained by VO2max (58.9%), arm span (12%), midthigh circumference (13.1%), and body mass (8.5%). Selected anthropometric characteristics, aerobic power, flexibility, and explosive strength are important determinants of successful performance. These findings might have practical implications for both training and talent identification in rhythmic gymnastics.
NASA Astrophysics Data System (ADS)
Biswal, Milan; Mishra, Srikanta
2018-05-01
The limited information on the origin and nature of stimulus frequency otoacoustic emissions (SFOAEs) necessitates a thorough reexamination of SFOAE analysis procedures, which will lead to a better understanding of how SFOAEs are generated. The SFOAE response waveform in the time domain can be interpreted as a summation of amplitude-modulated and frequency-modulated component waveforms, and the efficiency of a technique in segregating these components is critical for describing the nature of SFOAEs. Recent advances in robust time-frequency analysis algorithms claim more accurate extraction of such components from composite signals buried in noise, but their potential has not been fully explored for SFOAE analysis, and insensitivity to distinct information, owing to the nature of these techniques, may affect the scientific conclusions. This paper attempts to bridge this gap in the literature by evaluating the performance of three linear time-frequency analysis algorithms, the short-time Fourier transform (STFT), continuous wavelet transform (CWT) and S-transform (ST), and two nonlinear algorithms, the Hilbert-Huang transform (HHT) and synchrosqueezed wavelet transform (SWT). We revisit the extraction of constituent components and the estimation of their magnitude and delay by carefully evaluating the impact of variations in the analysis parameters. The performance of HHT and SWT, from the perspective of time-frequency filtering and delay estimation, was found to be relatively less efficient for analyzing SFOAEs. The intrinsic mode functions of HHT do not completely characterize the reflection components, and hence IMF-based filtering alone is not recommended for segregating the principal emission from multiple reflection components. We found the STFT, CWT, and ST to be suitable for canceling multiple internal reflection components with only marginal alteration of the SFOAE.
Hardesty, Samantha L; Hagopian, Louis P; McIvor, Melissa M; Wagner, Leaora L; Sigurdsson, Sigurdur O; Bowman, Lynn G
2014-09-01
The present study isolated the effects of frequently used staff training intervention components to increase communication between direct care staff and clinicians working on an inpatient behavioral unit. Written "protocol review" quizzes developed by clinicians were designed to assess knowledge about a patient's behavioral protocols. Direct care staff completed these at the beginning of each day and evening shift. Clinicians were required to score and discuss these protocol reviews with direct care staff for at least 75% of shifts over a 2-week period. During baseline, only 21% of clinicians met this requirement. Completing and scoring of protocol reviews did not improve following additional in-service training (M = 15%) or following an intervention aimed at decreasing response effort combined with prompting (M = 28%). After implementing an intervention involving specified performance criterion and performance feedback, 86% of clinicians reached the established goal. Results of a component analysis suggested that the presentation of both the specified performance criterion and supporting contingencies was necessary to maintain acceptable levels of performance. © The Author(s) 2014.
Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin
2016-11-01
We developed a novel quantitative analysis method based on ultra high performance liquid chromatography coupled with diode array detection for the simultaneous determination of the 14 main active components in Yinchenhao decoction. All components were separated on an Agilent SB-C18 column by using a gradient solvent system of acetonitrile/0.1% phosphoric acid solution at a flow rate of 0.4 mL/min for 35 min. Subsequently, linearity, precision, repeatability, and accuracy tests were implemented to validate the method. Furthermore, the method was applied to a compositional difference analysis of the 14 components in eight normal-extraction Yinchenhao decoction samples, accompanied by hierarchical clustering analysis and similarity analysis. The finding that all samples were divided into three groups based on different component contents demonstrated that the extraction methods of decocting, refluxing, and ultrasonication and the extraction solvents of water or ethanol affected component differentiation, which should be related to its clinical applications. The results also indicated that a sample prepared by patients at home using water extraction in a casserole was almost the same as that prepared using a stainless-steel kettle, which is mostly used in pharmaceutical factories. This research will help patients to select the best and most convenient method for preparing Yinchenhao decoction. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Structural aspects of face recognition and the other-race effect.
O'Toole, A J; Deffenbacher, K A; Valentin, D; Abdi, H
1994-03-01
The other-race effect was examined in a series of experiments and simulations that looked at the relationships among observer ratings of typicality, familiarity, attractiveness, memorability, and the performance variables of d' and criterion. Experiment 1 replicated the other-race effect with our Caucasian and Japanese stimuli for both Caucasian and Asian observers. In Experiment 2, we collected ratings from Caucasian observers on the faces used in the recognition task. A Varimax-rotated principal components analysis on the rating and performance data for the Caucasian faces replicated Vokey and Read's (1992) finding that typicality is composed of two orthogonal components, dissociable via their independent relationships to: (1) attractiveness and familiarity ratings and (2) memorability ratings. For Japanese faces, however, we found that typicality was related only to memorability. Where performance measures were concerned, two additional principal components dominated by criterion and by d' emerged for Caucasian faces. For the Japanese faces, however, the performance measures of d' and criterion merged into a single component that represented a second component of typicality, one orthogonal to the memorability-dominated component. A measure of face representation quality extracted from an autoassociative neural network trained with a majority of Caucasian faces and a minority of Japanese faces was incorporated into the principal components analysis. For both Caucasian and Japanese faces, the neural network measure related both to memorability ratings and to human accuracy measures. Combined, the human data and simulation results indicate that the memorability component of typicality may be related to small, local, distinctive features, whereas the attractiveness/familiarity component may be more related to the global, shape-based properties of the face.
Genome-scale analysis of the high-efficient protein secretion system of Aspergillus oryzae.
Liu, Lifang; Feizi, Amir; Österlund, Tobias; Hjort, Carsten; Nielsen, Jens
2014-06-24
The koji mold Aspergillus oryzae is widely used for the production of industrial enzymes due to its particularly high protein secretion capacity and ability to perform post-translational modifications. However, systemic analysis of its secretion system has been lacking, largely because of its poorly annotated proteome. Here we defined a functional protein secretory component list of A. oryzae using a previously reported secretory model of S. cerevisiae as a scaffold. Additional secretory components were obtained by BLAST search against the functional components reported in other closely related fungal species such as Aspergillus nidulans and Aspergillus niger. To evaluate the defined component list, we performed transcriptome analysis on three α-amylase over-producing strains with varying levels of secretion capacity. Specifically, secretory components involved in ER-associated processes (including components involved in the regulation of transport between the ER and Golgi) were significantly up-regulated, many of which had never been identified for A. oryzae before. Furthermore, we defined a complete list of the putative A. oryzae secretome and monitored how it was affected by amylase overproduction. By combining the transcriptome data with the most complete secretory component list and the putative secretome, we improved the systemic understanding of the secretory machinery of A. oryzae in response to high levels of protein secretion. The roles of many newly predicted secretory components were experimentally validated, and the enriched component list provides a better platform for driving more mechanistic studies of the protein secretory pathway in this industrially important fungus.
Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2012-01-01
This paper reports the results of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.
Early Improper Motion Detection in Golf Swings Using Wearable Motion Sensors: The First Approach
Stančin, Sara; Tomažič, Sašo
2013-01-01
This paper presents an analysis of a golf swing to detect improper motion in the early phase of the swing. Led by the desire to achieve a consistent shot outcome, a particular golfer would (in multiple trials) prefer to perform completely identical golf swings. In reality, some deviations from the desired motion are always present due to the comprehensive nature of the swing motion. Swing motion deviations that are not detrimental to performance are acceptable. This analysis is conducted using a golfer's leading arm kinematic data, which are obtained from a golfer wearing a motion sensor composed of gyroscopes and accelerometers. Applying principal component analysis (PCA) to the reference observations of properly performed swings, the PCA components of acceptable swing motion deviations are established. Using these components, the motion deviations in the observations of other swings are examined. Any unacceptable deviations that are detected indicate an improper swing motion. Arbitrarily long observations of an individual player's swing sequences can be included in the analysis. The results obtained for the considered example show an improper swing motion in the early phase of the swing, i.e., the first part of the backswing. An early detection method for improper swing motions conducted on an individual basis provides assistance for performance improvement. PMID:23752563
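A minimal sketch of the kind of PCA-based deviation test described above: fit PCA to reference observations of properly performed swings and flag a new swing whose residual outside the retained subspace exceeds a limit learned from the reference set. The fixed-length kinematic vectors, component count, and three-sigma threshold are assumptions, not the authors' exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

def fit_reference(swings, n_components=3):
    """Fit PCA to reference swings (rows = swings, columns = resampled kinematic samples)."""
    pca = PCA(n_components=n_components).fit(swings)
    recon = pca.inverse_transform(pca.transform(swings))
    residuals = np.linalg.norm(swings - recon, axis=1)
    # acceptable-deviation threshold from the reference residual distribution
    threshold = residuals.mean() + 3 * residuals.std()
    return pca, threshold

def is_improper(pca, threshold, swing):
    """Flag a swing whose deviation from the reference subspace exceeds the threshold."""
    recon = pca.inverse_transform(pca.transform(swing[None, :]))
    return np.linalg.norm(swing - recon) > threshold

# toy usage with synthetic data
rng = np.random.default_rng(1)
reference = rng.normal(size=(20, 300))            # 20 properly performed swings
pca, thr = fit_reference(reference)
print(is_improper(pca, thr, reference[0]))        # deviation within the reference range
print(is_improper(pca, thr, reference[0] + 5.0))  # large deviation, expected to be flagged
```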
An RFI Detection Algorithm for Microwave Radiometers Using Sparse Component Analysis
NASA Technical Reports Server (NTRS)
Mohammed-Tano, Priscilla N.; Korde-Patel, Asmita; Gholian, Armen; Piepmeier, Jeffrey R.; Schoenwald, Adam; Bradley, Damon
2017-01-01
Radio Frequency Interference (RFI) is a threat to passive microwave measurements and if undetected, can corrupt science retrievals. The sparse component analysis (SCA) for blind source separation has been investigated to detect RFI in microwave radiometer data. Various techniques using SCA have been simulated to determine detection performance with continuous wave (CW) RFI.
NASA Astrophysics Data System (ADS)
Kim, Junghoe; Lee, Jong-Hwan
2014-03-01
A functional connectivity (FC) analysis of resting-state functional MRI (rsfMRI) is gaining popularity for clinical applications such as the diagnosis of neuropsychiatric disease. To delineate brain networks from rsfMRI data, non-neuronal components, including head motion and physiological artifacts mainly observed in cerebrospinal fluid (CSF) and white matter (WM), along with a global brain signal, have been regarded as nuisance variables in calculating the FC level. However, it is still unclear how the non-neuronal components affect the performance of diagnosis of neuropsychiatric disease. In this study, a systematic comparison of the classification performance for schizophrenia patients was provided, employing partial correlation coefficients (CCs) as feature elements. Pair-wise partial CCs were calculated between brain regions, in which six combinatorial sets of nuisance variables were considered. The partial CCs were used as candidate feature elements, followed by feature selection based on a statistical significance test between the two groups in the training set. Once a linear support vector machine was trained using the selected features from the training set, the classification performance was evaluated using the features from the test set (i.e., a leave-one-out cross-validation scheme). From the results, the error rate using all non-neuronal components as nuisance variables (12.4%) was significantly lower than the error rates using the remaining combinations of non-neuronal components as nuisance variables (13.8-20.0%). In conclusion, the non-neuronal components substantially degraded the automated diagnosis performance, which supports our hypothesis that the non-neuronal components are crucial in controlling the automated diagnosis performance for neuropsychiatric disease using the fMRI modality.
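The core of the pipeline described above, computing pair-wise partial correlation coefficients after regressing a chosen nuisance set out of the regional time series and feeding the vectorized values to a linear SVM, can be sketched as follows; the region count, nuisance set, and toy data are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import LinearSVC

def partial_corr_features(ts, nuisance):
    """ts: (T, R) regional time series; nuisance: (T, K) regressors (motion, CSF, WM, global)."""
    # regress the nuisance variables out of every regional time series
    beta, *_ = np.linalg.lstsq(nuisance, ts, rcond=None)
    resid = ts - nuisance @ beta
    cc = np.corrcoef(resid, rowvar=False)          # (R, R) correlation of cleaned signals
    iu = np.triu_indices_from(cc, k=1)
    return cc[iu]                                  # vectorized upper triangle as features

rng = np.random.default_rng(0)
T, R, K = 200, 30, 4
subjects = [(partial_corr_features(rng.normal(size=(T, R)),
                                   rng.normal(size=(T, K))), label)
            for label in (0, 1) for _ in range(10)]
X = np.array([f for f, _ in subjects])
y = np.array([l for _, l in subjects])

# linear SVM on connectivity features (toy data, so this only demonstrates the mechanics)
clf = LinearSVC().fit(X, y)
print("training accuracy:", clf.score(X, y))
```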
NASA Technical Reports Server (NTRS)
Goldstein, Arthur W
1947-01-01
The performance of the turbine component of an NACA research jet engine was investigated with cold air. The interaction and the matching of the turbine with the NACA eight-stage compressor were computed with the combination considered as a jet engine. The over-all performance of the engine was then determined. The internal aerodynamics were studied to the extent of investigating the performance of the first stator ring and its influence on the turbine performance. For this ring, the stream-filament method for computing velocity distribution permitted efficient sections to be designed, but the design condition of free-vortex flow with uniform axial velocities was not obtained.
Barbadoro, P; Ensini, A; Leardini, A; d'Amato, M; Feliciangeli, A; Timoncini, A; Amadei, F; Belvedere, C; Giannini, S
2014-12-01
Unicompartmental knee arthroplasty (UKA) has shown a higher rate of revision compared with total knee arthroplasty. The success of UKA depends on prosthesis component alignment, fixation and soft tissue integrity. The tibial cut is the crucial surgical step. The hypothesis of the present study is that tibial component malalignment is correlated with its risk of loosening in UKA. This study was performed in twenty-three patients undergoing primary cemented unicompartmental knee arthroplasties. Translations and rotations of the tibial component and the maximum total point motion (MTPM) were measured using radiostereometric analysis at 3, 6, 12 and 24 months. Standard radiological evaluations were also performed immediately before and after surgery. Varus/valgus and posterior slope of the tibial component and tibial-femoral axes were correlated with radiostereometric micro-motion. A survival analysis was also performed at an average of 5.9 years by contacting patients by phone. Varus alignment of the tibial component was significantly correlated with MTPM, anterior tibial sinking, varus rotation and anterior and medial translations from radiostereometry. The posterior slope of the tibial component was correlated with external rotation. The survival rate at an average of 5.9 years was 89%. The two patients who underwent revision presented a tibial component varus angle of 10° for both. There is correlation between varus orientation of the tibial component and MTPM from radiostereometry in unicompartmental knee arthroplasties. Particularly, a misalignment in varus larger than 5° could lead to risk of loosening the tibial component. Prognostic studies-retrospective study, Level II.
Design and performance evaluation of the imaging payload for a remote sensing satellite
NASA Astrophysics Data System (ADS)
Abolghasemi, Mojtaba; Abbasi-Moghadam, Dariush
2012-11-01
In this paper an analysis method and corresponding analytical tools for the design of the experimental imaging payload (IMPL) of a remote sensing satellite (SINA-1) are presented. We begin with top-level customer system performance requirements and constraints, derive the critical system and component parameters, and then analyze imaging payload performance until a preliminary design that meets the customer requirements is reached. We consider system parameters and components composing the image chain for the imaging payload system, which includes aperture, focal length, field of view, image plane dimensions, pixel dimensions, detection quantum efficiency, and optical filter requirements. The performance analysis is accomplished by calculating the imaging payload's SNR (signal-to-noise ratio) and imaging resolution. The noise components include photon noise due to the signal scene and atmospheric background, cold shield, out-of-band optical filter leakage and electronic noise. System resolution is simulated through cascaded modulation transfer functions (MTFs) and includes effects due to optics, image sampling, and system motion. Calculation results for the SINA-1 satellite are also presented.
Gas engine heat pump cycle analysis. Volume 1: Model description and generic analysis
NASA Astrophysics Data System (ADS)
Fischer, R. D.
1986-10-01
The task has prepared performance and cost information to assist in evaluating the selection of heating, ventilating, and air-conditioning (HVAC) components, values for component design variables, and system configurations and operating strategy. A steady-state computer model for performance simulation of engine-driven and electrically driven heat pumps was prepared and effectively used for parametric and seasonal performance analyses. Parametric analysis showed the effect of variables associated with the design of recuperators, brine coils, the domestic hot water heat exchanger, compressor size, engine efficiency, and insulation on exhaust and brine piping. Seasonal performance data were prepared for residential and commercial units in six cities with system configurations closely related to existing or contemplated hardware of the five GRI engine contractors. Similar data were prepared for an advanced variable-speed electric unit for comparison purposes. The effect of domestic hot water production on operating costs was determined. Four fan-operating strategies and two brine loop configurations were explored.
Nam, Se Jin; Yoo, Jaeheung; Lee, Hye Sun; Kim, Eun-Kyung; Moon, Hee Jung; Yoon, Jung Hyun; Kwak, Jin Young
2016-04-01
To evaluate the diagnostic value of histogram analysis using grayscale sonograms for differentiation of malignant and benign thyroid nodules. From July 2013 through October 2013, 579 nodules in 563 patients who had undergone ultrasound-guided fine-needle aspiration were included. For the grayscale histogram analysis, pixel echogenicity values in regions of interest were measured as 0 to 255 (0, black; 255, white) with in-house software. Five parameters (mean, skewness, kurtosis, standard deviation, and entropy) were obtained for each thyroid nodule. With principal component analysis, an index was derived. Diagnostic performance rates for the 5 histogram parameters and the principal component analysis index were calculated. A total of 563 patients were included in the study (mean age ± SD, 50.3 ± 12.3 years;range, 15-79 years). Of the 579 nodules, 431 were benign, and 148 were malignant. Among the 5 parameters and the principal component analysis index, the standard deviation (75.546 ± 14.153 versus 62.761 ± 16.01; P < .001), kurtosis (3.898 ± 2.652 versus 6.251 ± 9.102; P < .001), entropy (0.16 ± 0.135 versus 0.239 ± 0.185; P < .001), and principal component analysis index (-0.386±0.774 versus 0.134 ± 0.889; P < .001) were significantly different between the malignant and benign nodules. With the calculated cutoff values, the areas under the curve were 0.681 (95% confidence interval, 0.643-0.721) for standard deviation, 0.661 (0.620-0.703) for principal component analysis index, 0.651 (0.607-0.691) for kurtosis, 0.638 (0.596-0.681) for entropy, and 0.606 (0.563-0.647) for skewness. The subjective analysis of grayscale sonograms by radiologists alone showed an area under the curve of 0.861 (0.833-0.888). Grayscale histogram analysis was feasible for differentiating malignant and benign thyroid nodules but did not show better diagnostic performance than subjective analysis performed by radiologists. Further technical advances will be needed to objectify interpretations of thyroid grayscale sonograms. © 2016 by the American Institute of Ultrasound in Medicine.
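The five histogram parameters and the composite principal-component index described above can be computed roughly as in the sketch below; the ROI handling, 0-255 scaling, and single-component index are assumptions based on the abstract rather than the authors' in-house software.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.decomposition import PCA

def histogram_features(roi_pixels):
    """roi_pixels: 1-D array of grayscale values (0-255) inside the nodule ROI."""
    p = np.bincount(roi_pixels.astype(int), minlength=256) / roi_pixels.size
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return np.array([roi_pixels.mean(), skew(roi_pixels),
                     kurtosis(roi_pixels), roi_pixels.std(), entropy])

# toy ROIs for a set of nodules
rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, size=int(rng.integers(500, 2000))) for _ in range(100)]
F = np.vstack([histogram_features(r) for r in rois])

# standardize and project onto the first principal component to obtain a single index per nodule
Fz = (F - F.mean(axis=0)) / F.std(axis=0)
index = PCA(n_components=1).fit_transform(Fz).ravel()
print(index[:5])
```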
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
A multifactor approach to forecasting Romanian gross domestic product (GDP) in the short run.
Armeanu, Daniel; Andrei, Jean Vasile; Lache, Leonard; Panait, Mirela
2017-01-01
The purpose of this paper is to investigate the application of a generalized dynamic factor model (GDFM) based on dynamic principal components analysis to forecasting short-term economic growth in Romania. We have used a generalized principal components approach to estimate a dynamic model based on a dataset comprising 86 economic and non-economic variables that are linked to economic output. The model exploits the dynamic correlations between these variables and uses three common components that account for roughly 72% of the information contained in the original space. We show that it is possible to generate reliable forecasts of quarterly real gross domestic product (GDP) using just the common components while also assessing the contribution of the individual variables to the dynamics of real GDP. In order to assess the relative performance of the GDFM to standard models based on principal components analysis, we have also estimated two Stock-Watson (SW) models that were used to perform the same out-of-sample forecasts as the GDFM. The results indicate significantly better performance of the GDFM compared with the competing SW models, which empirically confirms our expectations that the GDFM produces more accurate forecasts when dealing with large datasets.
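As a rough illustration of factor-based forecasting (a static, Stock-Watson-style diffusion index rather than the GDFM itself, which exploits dynamic correlations), the sketch below extracts principal-component factors from a large standardized panel and regresses next-quarter GDP growth on the current factors; the panel, factor count, and lag structure are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, N = 80, 86                                  # quarters x indicator series
panel = rng.normal(size=(T, N))                # stand-in for standardized indicators
gdp_growth = panel[:, :3] @ np.array([0.5, -0.3, 0.2]) + 0.1 * rng.normal(size=T)

factors = PCA(n_components=3).fit_transform(panel)      # common components of the panel

# one-step-ahead forecast: regress y(t+1) on the factors observed at t
model = LinearRegression().fit(factors[:-1], gdp_growth[1:])
forecast_next = model.predict(factors[-1:])
print("next-quarter GDP growth forecast:", forecast_next[0])
```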
Combination of PCA and LORETA for sources analysis of ERP data: an emotional processing study
NASA Astrophysics Data System (ADS)
Hu, Jin; Tian, Jie; Yang, Lei; Pan, Xiaohong; Liu, Jiangang
2006-03-01
The purpose of this paper is to study the spatiotemporal patterns of neuronal activity in emotional processing by analysis of ERP data. 108 pictures (categorized as positive, negative and neutral) were presented to 24 healthy, right-handed subjects while 128-channel EEG data were recorded. A two-step analysis was applied to the ERP data. First, principal component analysis was performed to obtain significant ERP components. Then LORETA was applied to each component to localize its brain sources. The first six principal components were extracted, each of which showed different spatiotemporal patterns of neuronal activity. The results agree with other emotional-processing studies using fMRI or PET. The combination of PCA and LORETA can be used to analyze spatiotemporal patterns of ERP data in emotional processing.
Fast principal component analysis for stacking seismic data
NASA Astrophysics Data System (ADS)
Wu, Juan; Bai, Min
2018-04-01
Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can help mitigate seismic noise and enhance the principal components to a great extent. Traditional average-based seismic stacking methods cannot obtain optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data that is not sensitive to the noise level. Considering the computational bottleneck of the classic PCA algorithm in processing massive seismic data, we propose an efficient PCA algorithm to make the proposed method readily applicable for industrial applications. Two numerically designed examples and one real seismic data set are used to demonstrate the performance of the presented method.
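One common way to realize PCA-based stacking is to weight each trace by the dominant left singular vector of the gather; the sketch below follows that idea with a truncated randomized SVD standing in for the fast PCA, and is only an assumed reading of the approach, not the authors' algorithm.

```python
import numpy as np
from sklearn.utils.extmath import randomized_svd

def pca_stack(gather):
    """gather: (n_traces, n_samples) NMO-corrected traces; returns a weighted-stack trace."""
    # the dominant left singular vector gives per-trace weights for the coherent component
    U, S, Vt = randomized_svd(gather, n_components=1, random_state=0)
    weights = U[:, 0]
    return weights @ gather / weights.sum()

# toy gather: a common wavelet plus strong random noise on every trace
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
wavelet = np.exp(-((t - 0.5) ** 2) / 0.001) * np.cos(2 * np.pi * 30 * (t - 0.5))
gather = wavelet + 2.0 * rng.normal(size=(40, t.size))

mean_stack = gather.mean(axis=0)
svd_stack = pca_stack(gather)
print("correlation with true wavelet (mean stack):", np.corrcoef(mean_stack, wavelet)[0, 1])
print("correlation with true wavelet (PCA stack): ", np.corrcoef(svd_stack, wavelet)[0, 1])
```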
Independent Component Analysis of Textures
NASA Technical Reports Server (NTRS)
Manduchi, Roberto; Portilla, Javier
2000-01-01
A common method for texture representation is to use the marginal probability densities over the outputs of a set of multi-orientation, multi-scale filters as a description of the texture. We propose a technique, based on Independent Components Analysis, for choosing the set of filters that yield the most informative marginals, meaning that the product over the marginals most closely approximates the joint probability density function of the filter outputs. The algorithm is implemented using a steerable filter space. Experiments involving both texture classification and synthesis show that compared to Principal Components Analysis, ICA provides superior performance for modeling of natural and synthetic textures.
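The intuition that independent components yield sparser (more informative) marginals than principal components can be illustrated on image patches, as a schematic stand-in for the steerable-filter formulation used in the paper; the patch size, component count, and toy texture below are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import kurtosis
from sklearn.decomposition import PCA, FastICA

# toy sparse "texture": smoothed random impulses (stand-in for a natural texture image)
rng = np.random.default_rng(0)
impulses = (rng.random((256, 256)) < 0.02) * rng.normal(size=(256, 256))
texture = gaussian_filter(impulses, sigma=1.5)

# extract overlapping 8x8 patches as observation vectors
patches = np.array([texture[i:i + 8, j:j + 8].ravel()
                    for i in range(0, 248, 4) for j in range(0, 248, 4)])
patches -= patches.mean(axis=0)

pca_codes = PCA(n_components=16).fit_transform(patches)
ica_codes = FastICA(n_components=16, random_state=0, max_iter=1000).fit_transform(patches)

# ICA marginals are typically sparser (more kurtotic) than PCA scores on such data
print("mean kurtosis of PCA marginals:", kurtosis(pca_codes, axis=0).mean())
print("mean kurtosis of ICA marginals:", kurtosis(ica_codes, axis=0).mean())
```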
Do ewes born with a male co-twin have greater longevity with lambing over time?
USDA-ARS?s Scientific Manuscript database
Based on a recent analysis of historical records, ewes born co-twin to a ram had greater lifetime reproductive performance than ewes born co-twin to a ewe. We are interested in determining what component(s) of lifetime reproductive performance may be associated with a ewe’s co-twin sex. As an initi...
NASA Astrophysics Data System (ADS)
Mantini, D.; Hild, K. E., II; Alleva, G.; Comani, S.
2006-02-01
Independent component analysis (ICA) algorithms have been successfully used for signal extraction tasks in the field of biomedical signal processing. We studied the performances of six algorithms (FastICA, CubICA, JADE, Infomax, TDSEP and MRMI-SIG) for fetal magnetocardiography (fMCG). Synthetic datasets were used to check the quality of the separated components against the original traces. Real fMCG recordings were simulated with linear combinations of typical fMCG source signals: maternal and fetal cardiac activity, ambient noise, maternal respiration, sensor spikes and thermal noise. Clusters of different dimensions (19, 36 and 55 sensors) were prepared to represent different MCG systems. Two types of signal-to-interference ratios (SIR) were measured. The first involves averaging over all estimated components and the second is based solely on the fetal trace. The computation time to reach a minimum of 20 dB SIR was measured for all six algorithms. No significant dependency on gestational age or cluster dimension was observed. Infomax performed poorly when a sub-Gaussian source was included; TDSEP and MRMI-SIG were sensitive to additive noise, whereas FastICA, CubICA and JADE showed the best performances. Of all six methods considered, FastICA had the best overall performance in terms of both separation quality and computation times.
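The separation-quality measure used above can be made concrete with a small helper: project the estimated component onto the known source and compare the power of the projection with the residual power. The function below is a generic, scale- and sign-invariant sketch, not necessarily the exact SIR definition used in the study.

```python
import numpy as np

def sir_db(source, estimate):
    """Signal-to-interference ratio in dB between a true source and an ICA estimate."""
    s = source - source.mean()
    e = estimate - estimate.mean()
    # project the estimate onto the source; the remainder is treated as interference
    target = (np.dot(e, s) / np.dot(s, s)) * s
    interference = e - target
    return 10 * np.log10(np.sum(target ** 2) / np.sum(interference ** 2))

rng = np.random.default_rng(0)
t = np.arange(0, 5, 0.001)
fetal = np.sin(2 * np.pi * 2.3 * t)                 # toy fetal cardiac trace
estimate = 0.8 * fetal + 0.05 * rng.normal(size=t.size)
print("SIR:", round(sir_db(fetal, estimate), 1), "dB")
```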
Inversion of gravity gradient tensor data: does it provide better resolution?
NASA Astrophysics Data System (ADS)
Paoletti, V.; Fedi, M.; Italiano, F.; Florio, G.; Ialongo, S.
2016-04-01
The gravity gradient tensor (GGT) has been increasingly used in practical applications, but the advantages and the disadvantages of the analysis of GGT components versus the analysis of the vertical component of the gravity field are still debated. We analyse the performance of joint inversion of GGT components versus separate inversion of the gravity field alone, or of one tensor component. We perform our analysis by inspection of the Picard Plot, a Singular Value Decomposition tool, and analyse both synthetic data and gradiometer measurements carried out at the Vredefort structure, South Africa. We show that the main factors controlling the reliability of the inversion are algebraic ambiguity (the difference between the number of unknowns and the number of available data points) and signal-to-noise ratio. Provided that algebraic ambiguity is kept low and the noise level is small enough so that a sufficient number of SVD components can be included in the regularized solution, we find that: (i) the choice of tensor components involved in the inversion is not crucial to the overall reliability of the reconstructions; (ii) GGT inversion can yield the same resolution as inversion with a denser distribution of gravity data points, but with the advantage of using fewer measurement stations.
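The Picard-plot inspection mentioned above amounts to comparing the singular values of the forward operator with the data coefficients |u_i^T d|: the discrete Picard condition holds while those coefficients decay at least as fast as the singular values. A generic sketch with an assumed ill-conditioned operator (not a gravity kernel) follows.

```python
import numpy as np

def picard_quantities(G, d):
    """Return singular values, |u_i^T d|, and their ratio (the SVD-solution coefficients)."""
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    beta = np.abs(U.T @ d)           # data coefficients in the left singular basis
    return s, beta, beta / s         # a blowing-up ratio signals violation of the Picard condition

rng = np.random.default_rng(0)
G = rng.normal(size=(200, 60)) @ np.diag(np.logspace(0, -6, 60))   # ill-conditioned operator
m_true = rng.normal(size=60)
d = G @ m_true + 1e-4 * rng.normal(size=200)

s, beta, coeff = picard_quantities(G, d)
n_reliable = int(np.sum(beta > 5e-4))   # rough noise-floor cut, for illustration only
print("singular values span:", s[0], "to", s[-1])
print("SVD components usable before noise dominates:", n_reliable)
```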
Analysis and design of a mechanical system to use with the Ronchi and Fizeau tests
NASA Astrophysics Data System (ADS)
Galán-Martínez, Arturo D.; Santiago-Alvarado, Agustín.; González-García, Jorge; Cruz-Martínez, Víctor M.; Cordero-Dávila, Alberto; Granados-Agustin, Fermin S.; Robledo-Sánchez, Calos
2013-11-01
Nowadays, there is a demand for more efficient opto-mechanical mounts which allow for the implementation of robust optical arrays in a quick and simple fashion. That is to say, mounts are needed which facilitate alignment of the optical components in order to perform the desired movements of each component. Optical testing systems available in the market today are costly, heavy and sometimes require multiple kits depending on the dimensions of the optical components. In this paper, we present the design and analysis of a mechanical system with some interchangeable basic mounts which allow for the application of both Ronchi and Fizeau tests for the evaluation of concave reflective surfaces with a diameter of 2 to 10 cm. The mechanical system design is done using the methodology of product design process, while the analysis is performed using the commercial software SolidWorks.
NASA Astrophysics Data System (ADS)
Saetchnikov, Vladimir A.; Tcherniavskaia, Elina A.; Saetchnikov, Anton V.; Schweiger, Gustav; Ostendorf, Andreas
2014-05-01
Experimental data are presented on the detection and identification of a variety of biochemical agents, such as proteins, microelements, and antibiotics of different generations, in both single- and multi-component solutions over a wide range of concentrations, based on the light-scattering parameters of a whispering gallery mode optical resonance sensor. Multiplexing over parameters and components was realized using a purpose-built fluidic sensor cell with dielectric microspheres fixed in an adhesive layer, together with data processing. Biochemical components were identified using the network analysis techniques developed by the authors, and the approach is demonstrated to be applicable to both single-agent and multi-component biochemical analysis. A novel technique based on optical resonance in microring structures, plasmon resonance, and identification tools was also developed. To improve the sensitivity of the microring structures, the adhesive-fixed microspheres were pre-treated with a gold nanoparticle solution; another variant used thin gold films deposited on the substrate below the adhesive. Both biomolecule and nanoparticle injections caused considerable changes in the optical resonance spectra, and plasmonic gold layers of optimized thickness also improved the parameters of the optical resonance spectra. Biochemical component identification was again performed with the network analysis techniques for both single- and multi-component solutions. The advantages of plasmon-enhanced optical microcavity resonance combined with multiparameter identification tools are thus used to develop a new platform for ultra-sensitive, label-free biomedical sensing.
Harmonic component detection: Optimized Spectral Kurtosis for operational modal analysis
NASA Astrophysics Data System (ADS)
Dion, J.-L.; Tawfiq, I.; Chevallier, G.
2012-01-01
This work is a contribution in the field of Operational Modal Analysis to identify the modal parameters of mechanical structures using only measured responses. The study deals with structural responses coupled with harmonic components amplitude and frequency modulated in a short range, a common combination for mechanical systems with engines and other rotating machines in operation. These harmonic components generate misleading data interpreted erroneously by the classical methods used in OMA. The present work attempts to differentiate maxima in spectra stemming from harmonic components and structural modes. The detection method proposed is based on the so-called Optimized Spectral Kurtosis and compared with others definitions of Spectral Kurtosis described in the literature. After a parametric study of the method, a critical study is performed on numerical simulations and then on an experimental structure in operation in order to assess the method's performance.
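A plain, non-optimized spectral kurtosis can be computed directly from an STFT as SK(f) = E{|X(t,f)|^4} / E{|X(t,f)|^2}^2 - 2, which is near 0 for stationary Gaussian (structural) responses and close to -1 for a constant-amplitude harmonic. The sketch below uses SciPy's STFT; the window length and detection threshold are illustrative assumptions and do not reproduce the Optimized Spectral Kurtosis proposed in the paper.

```python
import numpy as np
from scipy.signal import stft

def spectral_kurtosis(x, fs, nperseg=256):
    """SK(f) = E|X|^4 / (E|X|^2)^2 - 2; ~0 for Gaussian responses, ~-1 for pure tones."""
    f, t, X = stft(x, fs=fs, nperseg=nperseg)
    mag2 = np.abs(X) ** 2
    sk = (mag2 ** 2).mean(axis=1) / mag2.mean(axis=1) ** 2 - 2
    return f, sk

fs = 1024
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
# broadband (structural-like) random response plus a strong harmonic component at 48 Hz
x = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 48 * t)

f, sk = spectral_kurtosis(x, fs)
harmonic_bins = f[sk < -0.7]            # bins flagged as harmonic candidates
print("frequencies flagged as harmonic:", harmonic_bins)
```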
A feasibility study on age-related factors of wrist pulse using principal component analysis.
Jang-Han Bae; Young Ju Jeon; Sanghun Lee; Jaeuk U Kim
2016-08-01
Various analysis methods for examining wrist pulse characteristics are needed for accurate pulse diagnosis. In this feasibility study, principal component analysis (PCA) was performed to observe age-related factors of the wrist pulse from various analysis parameters. Forty subjects in their 20s and 40s participated, and their wrist pulse and respiration signals were acquired with a pulse tonometric device. After pre-processing of the signals, twenty analysis parameters that have been regarded as values reflecting pulse characteristics were calculated, and PCA was performed. As a result, we could reduce the complex parameters to a lower dimension, and age-related factors of the wrist pulse were observed by combining new analysis parameters derived from PCA. These results demonstrate that PCA can be a useful tool for analyzing the wrist pulse signal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilkinson, V.K.; Young, J.M.
1995-07-01
The US Army's Project Manager, Advanced Field Artillery System/Future Armored Resupply Vehicle (PM-AFAS/FARV) is sponsoring the development of technologies that can be applied to the resupply vehicle for the Advanced Field Artillery System. The Engineering Technology Division of the Oak Ridge National Laboratory has proposed adding diagnostics/prognostics systems to four components of the Ammunition Transfer Arm of this vehicle, and a cost-benefit analysis was performed on the diagnostics/prognostics to show the potential savings that may be gained by incorporating these systems onto the vehicle. Possible savings could be in the form of reduced downtime, less unexpected or unnecessary maintenance, fewer regular maintenance checks, and/or lower collateral damage or loss. The diagnostics/prognostics systems are used to (1) help determine component problems, (2) determine the condition of the components, and (3) estimate the remaining life of the monitored components. The four components on the arm that are targeted for diagnostics/prognostics are (1) the electromechanical brakes, (2) the linear actuators, (3) the wheel/roller bearings, and (4) the conveyor drive system. These would be monitored using electrical signature analysis, vibration analysis, or a combination of both. Annual failure rates for the four components were obtained along with specifications for vehicle costs, crews, number of missions, etc. Accident scenarios based on component failures were postulated, and event trees for these scenarios were constructed to estimate the annual loss of the resupply vehicle, crew, or arm, or mission aborts. A levelized cost-benefit analysis was then performed to examine the costs of such failures, both with and without some level of failure reduction due to the diagnostics/prognostics systems. Any savings resulting from using diagnostics/prognostics were calculated.
Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grin, A.; Lstiburek, J.
2012-09-01
This report describes the work conducted by the Building Science Corporation (BSC) Building America Research Team's 'Energy Efficient Housing Research Partnerships' project. Based on past experience in the Building America program, they have found that combinations of materials and approaches (in other words, systems) usually provide optimum performance. No single manufacturer typically provides all of the components for an assembly, nor has the specific understanding of all the individual components necessary for optimum performance.
Wave Rotor Research and Technology Development
NASA Technical Reports Server (NTRS)
Welch, Gerard E.
1998-01-01
Wave rotor technology offers the potential to increase the performance of gas turbine engines significantly, within the constraints imposed by current material temperature limits. The wave rotor research at the NASA Lewis Research Center is a three-element effort: 1) Development of design and analysis tools to accurately predict the performance of wave rotor components; 2) Experiments to characterize component performance; 3) System integration studies to evaluate the effect of wave rotor topping on the gas turbine engine system.
Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M
2014-01-01
The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
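One possible reading of a PCA ensemble classifier, given here purely as an assumption since the abstract does not spell out the architecture, is a bagged collection of PCA-plus-LDA pipelines whose votes are averaged; the epoch counts, feature sizes, and synthetic P300-like deflection below are invented.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_epochs, n_features = 600, 14 * 64                # Emotiv-like montage, downsampled epochs
X = rng.normal(size=(n_epochs, n_features))
y = rng.integers(0, 2, size=n_epochs)              # target / non-target labels (toy)
X[y == 1, 300:320] += 0.5                          # crude stand-in for a P300 deflection

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# bagged ensemble of PCA + LDA pipelines, majority vote at prediction time
members = []
for seed in range(15):
    idx = rng.choice(len(X_tr), len(X_tr), replace=True)
    members.append(make_pipeline(PCA(n_components=30),
                                 LinearDiscriminantAnalysis()).fit(X_tr[idx], y_tr[idx]))

votes = np.mean([m.predict(X_te) for m in members], axis=0)
accuracy = np.mean((votes > 0.5).astype(int) == y_te)
print("ensemble test accuracy:", accuracy)
```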
Analysis of free modeling predictions by RBO aleph in CASP11.
Mabrouk, Mahmoud; Werner, Tim; Schneider, Michael; Putz, Ines; Brock, Oliver
2016-09-01
The CASP experiment is a biannual benchmark for assessing protein structure prediction methods. In CASP11, RBO Aleph ranked as one of the top-performing automated servers in the free modeling category. This category consists of targets for which structural templates are not easily retrievable. We analyze the performance of RBO Aleph and show that its success in CASP was a result of its ab initio structure prediction protocol. A detailed analysis of this protocol demonstrates that two components unique to our method greatly contributed to prediction quality: residue-residue contact prediction by EPC-map and contact-guided conformational space search by model-based search (MBS). Interestingly, our analysis also points to a possible fundamental problem in evaluating the performance of protein structure prediction methods: Improvements in components of the method do not necessarily lead to improvements of the entire method. This points to the fact that these components interact in ways that are poorly understood. This problem, if indeed true, represents a significant obstacle to community-wide progress. Proteins 2016; 84(Suppl 1):87-104. © 2015 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1988-01-01
This annual status report presents the results of work performed during the fourth year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes permitting more accurate and efficient 3-D analysis of selected hot section components, i.e., combustor liners, turbine blades and turbine vanes. The computer codes embody a progression of math models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. Volume 1 of this report discusses the special finite element models developed during the fourth year of the contract.
Principal components analysis in clinical studies.
Zhang, Zhongheng; Castelló, Adela
2017-09-01
In multivariate analysis, independent variables are usually correlated to each other which can introduce multicollinearity in the regression models. One approach to solve this problem is to apply principal components analysis (PCA) over these variables. This method uses orthogonal transformation to represent sets of potentially correlated variables with principal components (PC) that are linearly uncorrelated. PCs are ordered so that the first PC has the largest possible variance and only some components are selected to represent the correlated variables. As a result, the dimension of the variable space is reduced. This tutorial illustrates how to perform PCA in R environment, the example is a simulated dataset in which two PCs are responsible for the majority of the variance in the data. Furthermore, the visualization of PCA is highlighted.
ERIC Educational Resources Information Center
Phillips, Sharon A.
2013-01-01
Selecting appropriate performance improvement interventions is a critical component of a comprehensive model of performance improvement. Intervention selection is an interconnected process involving analysis of an organization's environment, definition of the performance problem, and identification of a performance gap and identification of causal…
Periodic component analysis as a spatial filter for SSVEP-based brain-computer interface.
Kiran Kumar, G R; Reddy, M Ramasubba
2018-06-08
Traditional Spatial filters used for steady-state visual evoked potential (SSVEP) extraction such as minimum energy combination (MEC) require the estimation of the background electroencephalogram (EEG) noise components. Even though this leads to improved performance in low signal to noise ratio (SNR) conditions, it makes such algorithms slow compared to the standard detection methods like canonical correlation analysis (CCA) due to the additional computational cost. In this paper, Periodic component analysis (πCA) is presented as an alternative spatial filtering approach to extract the SSVEP component effectively without involving extensive modelling of the noise. The πCA can separate out components corresponding to a given frequency of interest from the background electroencephalogram (EEG) by capturing the temporal information and does not generalize SSVEP based on rigid templates. Data from ten test subjects were used to evaluate the proposed method and the results demonstrate that the periodic component analysis acts as a reliable spatial filter for SSVEP extraction. Statistical tests were performed to validate the results. The experimental results show that πCA provides significant improvement in accuracy compared to standard CCA and MEC in low SNR conditions. The results demonstrate that πCA provides better detection accuracy compared to CCA and on par with that of MEC at a lower computational cost. Hence πCA is a reliable and efficient alternative detection algorithm for SSVEP based brain-computer interface (BCI). Copyright © 2018. Published by Elsevier B.V.
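Periodic component analysis can be written as a generalized eigenvalue problem: find the spatial filter w that minimizes the energy of one-period differences of w^T x(t) relative to the energy of w^T x(t), with the period set by the stimulation frequency. The sketch below follows one standard formulation of πCA on toy SSVEP-like data; the channel count, noise level, and stimulation frequency are assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def periodic_component(X, period):
    """X: (channels, samples); returns the spatial filter extracting the most period-periodic component."""
    X = X - X.mean(axis=1, keepdims=True)
    d = X[:, period:] - X[:, :-period]          # one-period differences
    A = d @ d.T / d.shape[1]                    # covariance of the differences
    C = X @ X.T / X.shape[1]                    # covariance of the data
    vals, vecs = eigh(A, C)                     # generalized eigendecomposition
    w = vecs[:, 0]                              # smallest eigenvalue = most periodic direction
    return w, w @ X

fs, f_stim, n_ch, T = 256, 8.0, 8, 1024
t = np.arange(T) / fs
rng = np.random.default_rng(0)
ssvep = np.sin(2 * np.pi * f_stim * t)
mixing = rng.normal(size=n_ch)
X = np.outer(mixing, ssvep) + 1.5 * rng.normal(size=(n_ch, T))   # SSVEP buried in EEG-like noise

w, component = periodic_component(X, period=int(round(fs / f_stim)))
print("correlation with the SSVEP template:", abs(np.corrcoef(component, ssvep)[0, 1]))
```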
DATMAN: A reliability data analysis program using Bayesian updating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, M.; Feltus, M.A.
1996-12-31
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular of system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. System reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing the tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
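The style of Bayesian updating that DATMAN automates can be illustrated with the conjugate gamma-Poisson model for a component failure rate: a Gamma(alpha, beta) prior combined with n failures observed over exposure time t gives a Gamma(alpha + n, beta + t) posterior. The prior values and data in the sketch below are invented, and DATMAN's actual distribution choices and menus are not reproduced.

```python
import math
from scipy.stats import gamma

# prior belief about the failure rate lambda (failures per 1000 h): Gamma(alpha, rate=beta)
alpha_prior, beta_prior = 2.0, 4.0            # prior mean = 0.5 failures per 1000 h

# new operating experience: n failures observed over an exposure time t (in 1000-h units)
n_failures, exposure = 3, 12.0

# conjugate update: Gamma(alpha + n, beta + t)
alpha_post = alpha_prior + n_failures
beta_post = beta_prior + exposure
posterior = gamma(a=alpha_post, scale=1.0 / beta_post)

print("posterior mean failure rate:", posterior.mean())
print("90% credible interval:", posterior.interval(0.90))
# reliability over a further 1000-h mission, evaluated at the posterior mean rate
print("mission reliability estimate:", math.exp(-posterior.mean() * 1.0))
```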
Olivares, Pedro R; García-Rubio, Javier
2016-01-01
To analyze the associations between different components of fitness and fatness with academic performance, adjusting the analysis by sex, age, socio-economic status, region and school type in a Chilean sample. Data of fitness, fatness and academic performance was obtained from the Chilean System for the Assessment of Educational Quality test for eighth grade in 2011 and includes a sample of 18,746 subjects (49% females). Partial correlations adjusted by confounders were done to explore association between fitness and fatness components, and between the academic scores. Three unadjusted and adjusted linear regression models were done in order to analyze the associations of variables. Fatness has a negative association with academic performance when Body Mass Index (BMI) and Waist to Height Ratio (WHR) are assessed independently. When BMI and WHR are assessed jointly and adjusted by cofounders, WHR is more associated with academic performance than BMI, and only the association of WHR is positive. For fitness components, strength was the variable most associated with the academic performance. Cardiorespiratory capacity was not associated with academic performance if fatness and other fitness components are included in the model. Fitness and fatness are associated with academic performance. WHR and strength are more related with academic performance than BMI and cardiorespiratory capacity.
Repp, B H
1999-03-01
Patterns of expressive dynamics were measured in bars 1-5 of 115 commercially recorded performances of Chopin's Etude in E major, op. 10, No. 3. The grand average pattern (or dynamic profile) was representative of many performances and highly similar to the average dynamic profile of a group of advanced student performances, which suggests a widely shared central norm of expressive dynamics. The individual dynamic profiles were subjected to principal components analysis, which yielded Varimax-rotated components, each representing a different, nonstandard dynamic profile associated with a small subset of performances. Most performances had dynamic patterns resembling a mixture of several components, and no clustering of performances into distinct groups was apparent. Some weak relationships of dynamic profiles with sociocultural variables were found, most notably a tendency of female pianists to exhibit a greater dynamic range in the melody. Within the melody, there were no significant relationships between expressive timing [Repp, J. Acoust. Soc. Am. 104, 1085-1100 (1998)] and expressive dynamics. These two important dimensions seemed to be controlled independently at this local level and thus offer the artist many degrees of freedom in giving a melody expressive shape.
NASA Technical Reports Server (NTRS)
2000-01-01
This test report presents the test data of the EOS AMSU-A Flight Model No. 1 (FM-1) receiver subsystem. The tests are performed per the Acceptance Test Procedure for the AMSU-A Receiver Subsystem, AE-26002/6A. The functional performance tests are conducted either at the component or subsystem level. While the component-level tests are performed over the entire operating temperature range predicted by thermal analysis, the subsystem-level tests are conducted at ambient temperature only.
Estimating optical imaging system performance for space applications
NASA Technical Reports Server (NTRS)
Sinclair, K. F.
1972-01-01
The critical system elements of an optical imaging system are identified and a method for an initial assessment of system performance is presented. A generalized imaging system is defined. A system analysis is considered, followed by a component analysis. An example of the method is given using a film imaging system.
An Integrated Framework for Parameter-based Optimization of Scientific Workflows.
Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel
2009-01-01
Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.
NASA Astrophysics Data System (ADS)
Dafu, Shen; Leihong, Zhang; Dong, Liang; Bei, Li; Yi, Kang
2017-07-01
The purpose of this study is to improve reconstruction precision and to reproduce the color of spectral image surfaces more faithfully. A new spectral reflectance reconstruction algorithm based on an iterative threshold combined with weighted principal component space is presented in this paper, in which the principal components with weighted visual features form the sparse basis. Different numbers of color cards are selected as the training samples, a multispectral image is the testing sample, and the color differences in the reconstructions are compared. The channel response values are obtained with a Mega Vision high-accuracy, multi-channel imaging system. The results show that spectral reconstruction based on weighted principal component space is superior in performance to that based on traditional principal component space. Therefore, the color difference obtained using the compressive-sensing algorithm with weighted principal component analysis is less than that obtained using the algorithm with traditional principal component analysis, and better reconstructed color consistency with human eye vision is achieved.
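A minimal sketch of reconstructing a reflectance spectrum from camera responses via a weighted principal component basis is given below; it covers only the weighted-basis part, not the iterative-threshold step, and the sensitivities, band weighting, and number of basis vectors are placeholder assumptions rather than the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical training data: reflectances of color-card patches and their responses.
R_train = rng.uniform(0, 1, (24, 31))            # 24 patches x 31 wavelength bands
S = rng.uniform(0, 1, (31, 6))                   # toy 6-channel sensitivities
C_train = R_train @ S                            # channel responses

w = np.linspace(0.5, 1.5, 31)                    # assumed visual weighting per band
mean_r = R_train.mean(axis=0)

# Weighted principal-component basis of the training reflectances.
_, _, Vt = np.linalg.svd((R_train - mean_r) * w, full_matrices=False)
basis = Vt[:3] / w                               # 3 basis spectra, weighting removed

# Learn a linear map from camera responses to basis coefficients (least squares).
coef_train = np.linalg.lstsq(basis.T, (R_train - mean_r).T, rcond=None)[0].T
M = np.linalg.lstsq(C_train, coef_train, rcond=None)[0]

# Reconstruct an unseen reflectance from its measured channel response.
c_new = C_train[0]
r_hat = mean_r + (c_new @ M) @ basis
print(np.abs(r_hat - R_train[0]).max())
```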
Fan, Feiyi; Yan, Yuepeng; Tang, Yongzhong; Zhang, Hao
2017-12-01
Monitoring pulse oxygen saturation (SpO2) and heart rate (HR) using a photoplethysmography (PPG) signal contaminated by a motion artifact (MA) remains a difficult problem, especially when the oximeter is not equipped with a 3-axis accelerometer for adaptive noise cancellation. In this paper, we report a pioneering investigation on the impact of altering the frame length of Molgedey and Schuster independent component analysis (ICAMS) on performance, design a multi-classifier fusion strategy for selecting the PPG-correlated signal component, and propose a novel approach to extract SpO2 and HR readings from a PPG signal contaminated by strong MA interference. The algorithm comprises multiple stages, including dual frame length ICAMS, a multi-classifier-based PPG-correlated component selector, line spectral analysis, tree-based HR monitoring, and post-processing. Our approach is evaluated by multi-subject tests. The root mean square error (RMSE) is calculated for each trial. Three statistical metrics are selected as performance evaluation criteria: mean RMSE, median RMSE and the standard deviation (SD) of RMSE. The experimental results demonstrate that a shorter ICAMS analysis window probably results in better performance in SpO2 estimation. Notably, the designed multi-classifier signal component selector achieved satisfactory performance. The subject tests indicate that our algorithm outperforms other baseline methods regarding accuracy under most criteria. The proposed work can contribute to improving the performance of current pulse oximetry and personal wearable monitoring devices. Copyright © 2017 Elsevier Ltd. All rights reserved.
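The sketch below illustrates the general idea of separating a cardiac-related source from a two-channel PPG segment and reading a heart rate from its spectrum; scikit-learn's FastICA and a simple spectral-peak pick stand in for the ICAMS stage and the multi-classifier selector described above, and the signals and sampling rate are made up.

```python
import numpy as np
from sklearn.decomposition import FastICA

fs = 100.0                                    # assumed sampling rate (Hz)
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(2)

# Toy two-channel PPG (red / infrared) mixing a cardiac component (~90 bpm)
# with a slower motion artifact plus noise.
cardiac = np.sin(2 * np.pi * 1.5 * t)
motion = 0.8 * np.sin(2 * np.pi * 0.4 * t + 1.0)
red = cardiac + 0.6 * motion + 0.05 * rng.normal(size=t.size)
ir = 0.7 * cardiac + motion + 0.05 * rng.normal(size=t.size)

# Blind separation of the two channels (stand-in for the ICAMS stage).
sources = FastICA(n_components=2, random_state=0).fit_transform(np.c_[red, ir])

# Crude component selection: pick the source with the strongest spectral peak
# inside a plausible cardiac band (0.8-3 Hz), then read HR from that peak.
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(sources, axis=0)) ** 2
band = (freqs > 0.8) & (freqs < 3.0)
idx = int(np.argmax(power[band].max(axis=0)))
hr_bpm = 60.0 * freqs[band][int(np.argmax(power[band, idx]))]
print(f"selected component {idx}, HR ~ {hr_bpm:.0f} bpm")
```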
Three-Point Flexural Properties of Bonded Reinforcement Elements for Pleasure Craft Decks
NASA Astrophysics Data System (ADS)
Di Bella, G.; Galtieri, G.; Borsellino, C.
2018-02-01
The aim of this work was both to study the performance of pleasure craft reinforced components bonded using a structural adhesive and to compare it with that obtained using over-lamination as the joining system typically employed in shipbuilding. With this aim, two different lots of components were prepared: in the first lot, the reinforcement structures were laminated directly on the investigated composite components; in the second, they were made separately in a mould and then bonded to the composite components. This second method made it possible to evaluate the introduction of a product/process innovation in a field typically resistant to innovation and still tied to craft-based, non-standardized procedures. The results of bending tests, performed in order to evaluate the mechanical behaviour of the reinforced components, confirmed the soundness of this innovative design choice. Finally, a finite element analysis was performed.
Study of advanced techniques for determining the long term performance of components
NASA Technical Reports Server (NTRS)
1973-01-01
The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long-term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physics of degradation.
Jamadar, Sharna D; Egan, Gary F; Calhoun, Vince D; Johnson, Beth; Fielding, Joanne
2016-07-01
Intrinsic brain activity provides the functional framework for the brain's full repertoire of behavioral responses; that is, a common mechanism underlies intrinsic and extrinsic neural activity, with extrinsic activity building upon the underlying baseline intrinsic activity. The generation of a motor movement in response to sensory stimulation is one of the most fundamental functions of the central nervous system. Since saccadic eye movements are among our most stereotyped motor responses, we hypothesized that individual variability in the ability to inhibit a prepotent saccade and make a voluntary antisaccade would be related to individual variability in intrinsic connectivity. Twenty-three individuals completed the antisaccade task and resting-state functional magnetic resonance imaging (fMRI). A multivariate analysis of covariance identified relationships between fMRI oscillations (0.01-0.2 Hz) of resting-state networks determined using high-dimensional independent component analysis and antisaccade performance (latency, error rate). Significant multivariate relationships between antisaccade latency and directional error rate were obtained in independent components across the entire brain. Some of the relationships were obtained in components that overlapped substantially with the task; however, many were obtained in components that showed little overlap with the task. The current results demonstrate that even in the absence of a task, spectral power in regions showing little overlap with task activity predicts an individual's performance on a saccade task.
Deineko, Viktor
2006-01-01
Protein p43, an auxiliary component of the human multisynthetase complex, is the precursor of endothelial monocyte-activating polypeptide II. In this study, a comprehensive sequence analysis of the N-terminus was performed to identify structural domains, motifs, sites of post-translational modification and other functionally important parameters. A spatial structure model of the full-chain protein p43 was obtained.
Creation of a virtual cutaneous tissue bank
NASA Astrophysics Data System (ADS)
LaFramboise, William A.; Shah, Sujal; Hoy, R. W.; Letbetter, D.; Petrosko, P.; Vennare, R.; Johnson, Peter C.
2000-04-01
Cellular and non-cellular constituents of skin contain fundamental morphometric features and structural patterns that correlate with tissue function. High-resolution digital image acquisition is performed using an automated system and proprietary software to assemble adjacent images and create a contiguous, lossless digital representation of individual microscope slide specimens. Serial extraction, evaluation and statistical analysis of cutaneous features is performed using an automated analysis system to derive normal cutaneous parameters comprising essential structural skin components. Automated digital cutaneous analysis allows fast extraction of microanatomic data with accuracy approximating manual measurement. The process provides rapid assessment of features both within individual specimens and across sample populations. The images, component data, and statistical analyses comprise a bioinformatics database to serve as an architectural blueprint for skin tissue engineering and as a diagnostic standard of comparison for pathologic specimens.
Test bed experiments for various telerobotic system characteristics and configurations
NASA Technical Reports Server (NTRS)
Duffie, Neil A.; Wiker, Steven F.; Zik, John J.
1990-01-01
Dexterous manipulation and grasping in telerobotic systems depends on the integration of high-performance sensors, displays, actuators and controls into systems in which careful consideration has been given to human perception and tolerance. Research underway at the Wisconsin Center for Space Automation and Robotics (WCSAR) has the objective of enhancing the performance of these systems and their components, and quantifying the effects of the many electrical, mechanical, control, and human factors that affect their performance. This will lead to a fundamental understanding of performance issues which will in turn allow designers to evaluate sensor, actuator, display, and control technologies with respect to generic measures of dexterous performance. As part of this effort, an experimental test bed was developed which has telerobotic components with exceptionally high fidelity in master/slave operation. A Telerobotic Performance Analysis System has also been developed which allows performance to be determined for various system configurations and electro-mechanical characteristics. Both this performance analysis system and test bed experiments are described.
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.
The Importance of Engine External's Health
NASA Technical Reports Server (NTRS)
Stoner, Barry L.
2006-01-01
Engine external components include all the fluid-carrying, electron-carrying, and support devices that are needed to operate the propulsion system. These components are varied and include: pumps, valves, actuators, solenoids, sensors, switches, heat exchangers, electrical generators, electrical harnesses, tubes, ducts, clamps and brackets. The failure of any component to perform its intended function will result in a maintenance action, a dispatch delay, or an engine in-flight shutdown. The life of each component, in addition to its basic functional design, is closely tied to its thermal and dynamic environment. Therefore, to reach a mature design life, the component's thermal and dynamic environment must be understood and controlled, which can only be accomplished by attention to design analysis and testing. The purpose of this paper is to review analysis and test techniques toward achieving good component health.
Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
2011-01-01
Systems analysis of planetary entry, descent, and landing (EDL) is a multidisciplinary activity by nature. The Systems Analysis for Planetary EDL (SAPE) tool improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and interface for structural sizing.
Time-dependent inertia analysis of vehicle mechanisms
NASA Astrophysics Data System (ADS)
Salmon, James Lee
Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are generated and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to perform a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
Integration Test of the High Voltage Hall Accelerator System Components
NASA Technical Reports Server (NTRS)
Kamhawi, Hani; Haag, Thomas; Huang, Wensheng; Pinero, Luis; Peterson, Todd; Dankanich, John
2013-01-01
NASA Glenn Research Center is developing a 4 kilowatt-class Hall propulsion system for implementation in NASA science missions. NASA science mission performance analysis was completed using the latest high voltage Hall accelerator (HiVHAc) and Aerojet-Rocketdyne's state-of-the-art BPT-4000 Hall thruster performance curves. Mission analysis results indicated that the HiVHAc thruster outperforms the BPT-4000 thruster for all but one of the missions studied. Tests of the HiVHAc system major components were performed. Performance evaluation of the HiVHAc thruster at NASA Glenn's vacuum facility 5 indicated that thruster performance was lower than performance levels attained during tests in vacuum facility 12, due to the lower background pressures attained during vacuum facility 5 tests when compared to vacuum facility 12. Voltage-current characterization of the HiVHAc thruster in vacuum facility 5 showed that the HiVHAc thruster can operate stably for a wide range of anode flow rates for discharge voltages between 250 and 600 volts. A Colorado Power Electronics enhanced brassboard power processing unit was tested in vacuum for 1,500 hours, and the unit demonstrated discharge module efficiency of 96.3% at 3.9 kilowatts and 650 volts. Stand-alone open- and closed-loop tests of a VACCO TRL 6 xenon flow control module were also performed. An integrated test of the HiVHAc thruster, brassboard power processing unit, and xenon flow control module was performed and confirmed the integrated operation of the HiVHAc system major components. Future plans include continuing the maturation of the HiVHAc system major components and the performance of a single-string integration test.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2017-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws and geometric features were inspected using a 2-megavolt linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
Integrated smart panel and support structure response
NASA Astrophysics Data System (ADS)
DeGiorgi, Virginia G.
1998-06-01
The performance of smart structures is a complex interaction between active and passive components. Active components, even when not activated, can have an impact on structural performance and, conversely, structural characteristics of passive components can have a measurable impact on active component performance. The present work is an evaluation of the structural characteristics of an active panel designed for acoustic quieting. The support structure is included in the panel design as evaluated. Finite element methods are used to determine the active panel-support structure response. Two conditions are considered: a hollow unfilled support structure and the same structure filled with a polymer compound. Finite element models were defined so that stiffness values corresponding to the center of individual pistons could be determined. Superelement techniques were used to define mass and stiffness values representative of the combined active and support structure at the center of each piston. Results of interest obtained from the analysis include mode shapes, natural frequencies, and equivalent spring stiffness for use in structural response models to represent the support structure. The effects of plate motion on piston performance cannot be obtained from this analysis; however, mass and stiffness matrices for use in an integrated system model to determine piston head velocities can be obtained from this work.
Personal Computer Transport Analysis Program
NASA Technical Reports Server (NTRS)
DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter
2012-01-01
The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air-scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
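A minimal sketch of the solution-vector idea, components ordered by inlet dependency and updated in that order each time step, is shown below in Python (PCTAP itself is C++); the component names and dependency graph are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical inlet-dependency graph: each component lists the components
# that feed its inlet and must therefore be updated first.
inlet_deps = {
    "tube_A": [],
    "cold_plate": ["tube_A"],
    "tube_B": ["cold_plate"],
    "heat_exchanger": ["tube_B"],
    "scrubber": ["heat_exchanger"],
}
solution_vector = list(TopologicalSorter(inlet_deps).static_order())

def update_outlet(component, dt):
    """Placeholder for a component's outlet-state update."""
    pass

for _ in range(10):                  # time-march the whole system
    for comp in solution_vector:
        update_outlet(comp, dt=0.1)

print(solution_vector)
```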
Component-specific modeling. [jet engine hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.
1992-01-01
Accomplishments are described for a 3 year program to develop methodology for component-specific modeling of aircraft hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models, (2) geometry model generators, (3) remeshing, (4) specialty three-dimensional inelastic structural analysis, (5) computationally efficient solvers, (6) adaptive solution strategies, (7) engine performance parameters/component response variables decomposition and synthesis, (8) integrated software architecture and development, and (9) validation cases for software developed.
NASA Astrophysics Data System (ADS)
Lucia, M.; Kaita, R.; Majeski, R.; Bedoya, F.; Allain, J. P.; Abrams, T.; Bell, R. E.; Boyle, D. P.; Jaworski, M. A.; Schmitt, J. C.
2015-08-01
The Materials Analysis and Particle Probe (MAPP) diagnostic has been implemented on the Lithium Tokamak Experiment (LTX) at PPPL, providing the first in situ X-ray photoelectron spectroscopy (XPS) surface characterization of tokamak plasma facing components (PFCs). MAPP samples were exposed to argon glow discharge conditioning (GDC), lithium evaporations, and hydrogen tokamak discharges inside LTX. Samples were analyzed with XPS, and alterations to surface conditions were correlated against observed LTX plasma performance changes. Argon GDC caused the accumulation of nm-scale metal oxide layers on the PFC surface, which appeared to bury surface carbon and oxygen contamination and thus improve plasma performance. Lithium evaporation led to the rapid formation of a lithium oxide (Li2O) surface; plasma performance was strongly improved for sufficiently thick evaporative coatings. Results indicate that a 5 h argon GDC or a 50 nm evaporative lithium coating will both significantly improve LTX plasma performance.
LeVan, P; Urrestarazu, E; Gotman, J
2006-04-01
To devise an automated system to remove artifacts from ictal scalp EEG, using independent component analysis (ICA). A Bayesian classifier was used to determine the probability that 2s epochs of seizure segments decomposed by ICA represented EEG activity, as opposed to artifact. The classifier was trained using numerous statistical, spectral, and spatial features. The system's performance was then assessed using separate validation data. The classifier identified epochs representing EEG activity in the validation dataset with a sensitivity of 82.4% and a specificity of 83.3%. An ICA component was considered to represent EEG activity if the sum of the probabilities that its epochs represented EEG exceeded a threshold predetermined using the training data. Otherwise, the component represented artifact. Using this threshold on the validation set, the identification of EEG components was performed with a sensitivity of 87.6% and a specificity of 70.2%. Most misclassified components were a mixture of EEG and artifactual activity. The automated system successfully rejected a good proportion of artifactual components extracted by ICA, while preserving almost all EEG components. The misclassification rate was comparable to the variability observed in human classification. Current ICA methods of artifact removal require a tedious visual classification of the components. The proposed system automates this process and removes simultaneously multiple types of artifacts.
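A toy version of epoch-wise classification followed by a summed-probability component decision is sketched below; it uses a Gaussian naive Bayes classifier on random stand-in features, and the threshold rule is an assumption rather than the study's trained values.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)
# Hypothetical feature table: one row per 2-s epoch of an ICA component,
# columns are statistical/spectral/spatial features; labels 1 = EEG, 0 = artifact.
X_train, y_train = rng.normal(size=(400, 6)), rng.integers(0, 2, 400)
X_epochs = rng.normal(size=(30, 6))            # epochs of one component to classify

clf = GaussianNB().fit(X_train, y_train)
p_eeg = clf.predict_proba(X_epochs)[:, 1]      # per-epoch probability of EEG activity

# Component-level decision: keep the component if the summed probability exceeds
# a threshold tuned on training data (the value here is an assumption).
threshold = 0.5 * len(X_epochs)
is_eeg_component = p_eeg.sum() > threshold
print(is_eeg_component)
```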
Zhou, Xiahui; Chen, Xiaocheng; Wu, Xin; Cao, Gang; Zhang, Junjie
2016-04-01
In this study, high-performance liquid chromatography coupled with amaZon SL high-performance ion trap mass spectrometry was used to analyze the target components in white chrysanthemum flowers of Hangzhou. Twenty-one components were detected and identified in both white chrysanthemum flowers of Hangzhou samples by using target compound analysis. Furthermore, seven new compounds in white chrysanthemum flowers of Hangzhou were found and identified by analyzing the fragment ion behavior in the mass spectra. The established method can be expedient for the global quality investigation of complex components in herbal medicines and food. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong
2017-04-01
An analytical approach including fingerprint, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied to the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine with the homology of medicine and food. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation among fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of those components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as main contributors to the total anti-oxidative activity. The main advantage of the proposed approach is that it realizes simultaneous fingerprinting, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for the quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Adams, D. F.; Hartmann, U. G.; Lazarow, L. L.; Maloy, J. O.; Mohler, G. W.
1976-01-01
The design of the vector magnetometer selected for analysis is capable of exceeding the required accuracy of 5 gamma per vector field component. The principal elements that assure this performance level are very low power dissipation triaxial feedback coils surrounding ring core flux-gates and temperature control of the critical components of two-loop feedback electronics. An analysis of the calibration problem points to the need for improved test facilities.
Development and test of advanced composite components. Center Directors discretionary fund program
NASA Technical Reports Server (NTRS)
Faile, G.; Hollis, R.; Ledbetter, F.; Maldonado, J.; Sledd, J.; Stuckey, J.; Waggoner, G.; Engler, E.
1985-01-01
This report describes the design, analysis, fabrication, and test of a complex bathtub fitting. Graphite fibers in an epoxy matrix were utilized in the manufacture of 11 components representing four different design and layup concepts. Design allowables were developed for use in the final stress analysis. Strain gage measurements were taken throughout the static load test, and correlation of test and analysis data was performed, yielding a good understanding of the material behavior and instrumentation requirements for future applications.
NASA Astrophysics Data System (ADS)
Filiatrault, Andre; Sullivan, Timothy
2014-08-01
With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are usually, for the most part, based on past experiences, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments made in the field of seismic design and hinting at the specific considerations required for nonstructural components.
CRAX/Cassandra Reliability Analysis Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, D.
1999-02-10
Over the past few years Sandia National Laboratories has been moving toward an increased dependence on model- or physics-based analyses as a means to assess the impact of long-term storage on the nuclear weapons stockpile. These deterministic models have also been used to evaluate replacements for aging systems, often involving commercial off-the-shelf components (COTS). In addition, the models have been used to assess the performance of replacement components manufactured via unique, small-lot production runs. In either case, the limited amount of available test data dictates that the only logical course of action to characterize the reliability of these components is to specifically consider the uncertainties in material properties, operating environment etc. within the physics-based (deterministic) model. This not only provides the ability to statistically characterize the expected performance of the component or system, but also provides direction regarding the benefits of additional testing on specific components within the system. An effort was therefore initiated to evaluate the capabilities of existing probabilistic methods and, if required, to develop new analysis methods to support the inclusion of uncertainty in the classical design tools used by analysts and design engineers at Sandia. The primary result of this effort is the CMX (Cassandra Exoskeleton) reliability analysis software.
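A minimal sketch of the underlying idea, propagating parameter uncertainty through a deterministic margin model by Monte Carlo sampling to estimate reliability, is shown below; the stress-strength model and its distributions are illustrative assumptions, not Sandia's models or software.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

# Hypothetical physics-based margin model: the component "fails" when applied
# stress exceeds its strength; both vary with material and environment uncertainty.
strength = rng.normal(loc=420.0, scale=25.0, size=N)   # MPa, assumed distribution
stress = rng.normal(loc=330.0, scale=40.0, size=N)     # MPa, assumed distribution

p_fail = (stress > strength).mean()
se = np.sqrt(p_fail * (1 - p_fail) / N)                # sampling standard error
print(f"reliability ~ {1 - p_fail:.4f} (+/- {1.96 * se:.4f})")
```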
A hybrid spatial-spectral denoising method for infrared hyperspectral images using 2DPCA
NASA Astrophysics Data System (ADS)
Huang, Jun; Ma, Yong; Mei, Xiaoguang; Fan, Fan
2016-11-01
The traditional noise reduction methods for 3-D infrared hyperspectral images typically operate independently in either the spatial or spectral domain, and such methods overlook the relationship between the two domains. To address this issue, we propose a hybrid spatial-spectral method in this paper to link both domains. First, principal component analysis and bivariate wavelet shrinkage are performed in the 2-D spatial domain. Second, 2-D principal component analysis transformation is conducted in the 1-D spectral domain to separate the basic components from detail ones. The energy distribution of noise is unaffected by orthogonal transformation; therefore, the signal-to-noise ratio of each component is used as a criterion to determine whether a component should be protected from over-denoising or denoised with certain 1-D denoising methods. This study implements the 1-D wavelet shrinking threshold method based on Stein's unbiased risk estimator, and the quantitative results on publicly available datasets demonstrate that our method can improve denoising performance more effectively than other state-of-the-art methods can.
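The per-component decision logic can be sketched roughly as below: decompose the cube, form a crude per-component SNR, and denoise only the low-SNR components; the noise-floor estimate, threshold, and shrinkage step are simplifications standing in for the paper's SURE-based 1-D wavelet denoising, and the cube is random stand-in data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Toy hyperspectral cube flattened to (pixels x bands); real data would replace this.
cube = rng.normal(size=(64 * 64, 100)) + np.linspace(0, 5, 100)

mean_spec = cube.mean(axis=0)
U, s, Vt = np.linalg.svd(cube - mean_spec, full_matrices=False)
scores = U * s                                   # per-pixel component scores

# Crude per-component SNR proxy: component variance over an assumed noise floor
# estimated from the trailing components (an assumption, not the paper's estimator).
noise_var = np.var(scores[:, -10:])
snr = scores.var(axis=0) / noise_var

denoised_scores = scores.copy()
for k in range(scores.shape[1]):
    if snr[k] < 3.0:                             # assumed threshold
        # Low-SNR component: simple shrinkage in place of 1-D wavelet denoising.
        denoised_scores[:, k] *= snr[k] / (snr[k] + 1.0)

denoised = denoised_scores @ Vt + mean_spec
print(denoised.shape)
```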
NASA Technical Reports Server (NTRS)
Monaghan, Mark W.; Gillespie, Amanda M.
2013-01-01
During the shuttle era NASA utilized a failure reporting system called Problem Reporting and Corrective Action (PRACA); its purpose was to identify and track system non-conformance. Over the years the PRACA system evolved from a relatively nominal way to identify system problems into a very complex tracking and report-generating database. The PRACA system became the primary method to categorize any and all anomalies, from corrosion to catastrophic failure. The systems documented in the PRACA system range from flight hardware to ground or facility support equipment. While the PRACA system is complex, it does possess all the failure modes, times of occurrence, lengths of system delay, parts repaired or replaced, and corrective actions performed. The difficulty lies in mining the data and then using those data to estimate component, Line Replaceable Unit (LRU), and system reliability metrics. In this paper, we identify a methodology to categorize qualitative data from the ground system PRACA database for common ground or facility support equipment. A heuristic developed for review of the PRACA data is then used to determine which reports identify a credible failure. These data are then used to determine inter-arrival times and to estimate a reliability metric for a repairable component or LRU. This analysis is used to determine the failure modes of the equipment, determine the probability of each component failure mode, and support various quantitative techniques for performing repairable system analysis. The result is an effective and concise reliability estimate for components used in manned space flight operations. The advantage is that the components or LRUs are evaluated in the same environment and conditions that occur during the launch process.
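A minimal sketch of the last step, turning screened failure reports into inter-arrival times, a point MTBF estimate, and a simple trend check, is given below; the failure times and observation window are made up.

```python
import numpy as np

# Hypothetical timeline (days) of credible-failure reports for one LRU after
# the heuristic screening of PRACA records described above.
failure_times = np.array([112.0, 260.0, 391.0, 540.0, 655.0, 803.0])
observation_end = 900.0                        # end of the observation window (days)

inter_arrivals = np.diff(np.r_[0.0, failure_times])
mtbf = observation_end / failure_times.size    # point estimate for a repairable unit

# Simple Laplace trend test: u > 0 suggests deteriorating inter-arrival times,
# u < 0 suggests reliability growth.
n = failure_times.size
u = (failure_times.sum() / n - observation_end / 2) / (
    observation_end * np.sqrt(1.0 / (12 * n)))
print(f"mean inter-arrival {inter_arrivals.mean():.0f} d, "
      f"MTBF ~ {mtbf:.0f} d, Laplace u = {u:.2f}")
```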
Dimensionality reduction for the quantitative evaluation of a smartphone-based Timed Up and Go test.
Palmerini, Luca; Mellone, Sabato; Rocchi, Laura; Chiari, Lorenzo
2011-01-01
The Timed Up and Go is a clinical test to assess mobility in the elderly and in Parkinson's disease. Lately, instrumented versions of the test are being considered, in which inertial sensors assess motion. To improve pervasiveness, ease of use, and cost, we consider a smartphone's accelerometer as the measurement system. Several parameters (usually highly correlated) can be computed from the signals recorded during the test. To avoid redundancy and obtain the features that are most sensitive to the locomotor performance, a dimensionality reduction was performed through principal component analysis (PCA). Forty-nine healthy subjects of different ages were tested. PCA was performed to extract new features (principal components) which are non-redundant combinations of the original parameters and account for most of the data variability. They can be useful for exploratory analysis and outlier detection. Then, a reduced set of the original parameters was selected through correlation analysis with the principal components. This set could be recommended for studies based on healthy adults. The proposed procedure could be used as a first-level feature selection in classification studies (e.g., healthy vs. Parkinson's disease, fallers vs. non-fallers) and could allow, in the future, a complete system for movement analysis to be incorporated in a smartphone.
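The two-stage reduction described above (PCA for non-redundant components, then correlation with those components to pick a reduced set of original parameters) can be sketched as follows; the parameter matrix is random stand-in data rather than the study's accelerometer features.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# Hypothetical parameter matrix: 49 subjects x 20 accelerometer-derived TUG parameters.
X = rng.normal(size=(49, 20))

Xz = StandardScaler().fit_transform(X)
pca = PCA(n_components=0.90)            # keep components explaining 90% of variance
scores = pca.fit_transform(Xz)

# For each retained principal component, keep the original parameter most
# correlated with it, yielding a small, non-redundant parameter subset.
selected = []
for k in range(scores.shape[1]):
    corr = [abs(np.corrcoef(Xz[:, j], scores[:, k])[0, 1]) for j in range(Xz.shape[1])]
    selected.append(int(np.argmax(corr)))
print(sorted(set(selected)))
```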
Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge
2016-05-04
Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.
Independent component analysis for onset detection in piano trills
NASA Astrophysics Data System (ADS)
Brown, Judith C.; Todd, Jeremy G.; Smaragdis, Paris
2002-05-01
The detection of onsets in piano music is difficult due to the presence of many simultaneous notes and their long decay times from pedaling. This is even more difficult for trills, where the rapid note changes make it difficult to observe a decrease in amplitude for individual notes in either the temporal waveform or the time-dependent Fourier components. Occasionally one note of the trill has a much lower amplitude than the other, making an unambiguous determination of its onset virtually impossible. We have analyzed a number of trills from CDs of performances by Horowitz, Ashkenazy, and Goode, choosing the same trill and different performances where possible. The Fourier transform was calculated as a function of time, and the magnitude coefficients served as input for a calculation using the method of independent component analysis. In most cases this gave a more definitive determination of the onset times, as can be demonstrated graphically. For comparison, identical calculations were carried out on recordings of MIDI-generated performances on a Yamaha Disklavier piano.
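A rough sketch of this pipeline, short-time Fourier magnitudes fed to ICA and onsets picked from each source's activation envelope, is shown below; the synthetic trill, window sizes, and peak-picking thresholds are assumptions, and scikit-learn's FastICA stands in for the ICA variant used in the study.

```python
import numpy as np
from scipy.signal import stft, find_peaks
from sklearn.decomposition import FastICA

fs = 22050
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(7)
# Toy trill: two alternating tones (the real input would be the recorded audio).
audio = np.where((t * 8).astype(int) % 2 == 0,
                 np.sin(2 * np.pi * 659.3 * t), np.sin(2 * np.pi * 740.0 * t))
audio += 0.01 * rng.normal(size=t.size)

f, frames, Z = stft(audio, fs=fs, nperseg=1024, noverlap=768)
mags = np.abs(Z).T                               # time frames x frequency bins

# ICA on the magnitude spectra; each source's activation over time should follow
# one of the trill notes, making its onsets easier to pick out.
sources = FastICA(n_components=2, random_state=0).fit_transform(mags)
for k in range(sources.shape[1]):
    env = np.abs(sources[:, k])
    onsets, _ = find_peaks(env, height=env.max() * 0.5, distance=5)
    print(f"component {k}: onset frames {onsets[:5]}")
```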
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
[Balanced scorecard for performance measurement of a nursing organization in a Korean hospital].
Hong, Yoonmi; Hwang, Kyung Ja; Kim, Mi Ja; Park, Chang Gi
2008-02-01
The purpose of this study was to develop a balanced scorecard (BSC) for performance measurement of a Korean hospital nursing organization and to evaluate the validity and reliability of the performance measurement indicators. Two hundred fifty-nine nurses in a Korean hospital participated in a survey questionnaire that included 29 performance evaluation indicators developed by the investigators of this study based on Kaplan and Norton's BSC (1992). Cronbach's alpha was used to test the reliability of the BSC. Exploratory and confirmatory factor analysis with a structural equation model (SEM) was applied to assess the construct validity of the BSC. Cronbach's alpha of the 29 items was .948. Factor analysis of the BSC showed 5 principal components (eigenvalue >1.0) which explained 62.7% of the total variance and included a new one, community service. The SEM analysis results showed that the 5 components were significant for the hospital BSC tool. The high degree of reliability and validity of this BSC suggests that it may be used for performance measurement of a Korean hospital nursing organization. Future studies may consider including a balanced number of nurse managers and staff nurses. Further analysis of the relationships among factors is recommended.
RSA prediction of high failure rate for the uncoated Interax TKA confirmed by meta-analysis.
Pijls, Bart G; Nieuwenhuijse, Marc J; Schoones, Jan W; Middeldorp, Saskia; Valstar, Edward R; Nelissen, Rob G H H
2012-04-01
In a previous radiostereometric (RSA) trial the uncoated, uncemented, Interax tibial components showed excessive migration within 2 years compared to HA-coated and cemented tibial components. It was predicted that this type of fixation would have a high failure rate. The purpose of this systematic review and meta-analysis was to investigate whether this RSA prediction was correct. We performed a systematic review and meta-analysis to determine the revision rate for aseptic loosening of the uncoated and cemented Interax tibial components. 3 studies were included, involving 349 Interax total knee arthroplasties (TKAs) for the comparison of uncoated and cemented fixation. There were 30 revisions: 27 uncoated and 3 cemented components. There was a 3-times higher revision rate for the uncoated Interax components than that for cemented Interax components (OR = 3; 95% CI: 1.4-7.2). This meta-analysis confirms the prediction of a previous RSA trial. The uncoated Interax components showed the highest migration and turned out to have the highest revision rate for aseptic loosening. RSA appears to enable efficient detection of an inferior design as early as 2 years postoperatively in a small group of patients.
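For reference, the sketch below computes a crude odds ratio and 95% confidence interval from a single 2x2 table; the counts are hypothetical, and a real meta-analysis such as this one would pool per-study estimates (e.g., inverse-variance or Mantel-Haenszel) rather than use one crude table.

```python
import numpy as np

# Hypothetical 2x2 table for one study (revised / not revised by fixation type).
a, b = 9, 101    # uncoated: revisions, non-revisions (made-up counts)
c, d = 3, 107    # cemented: revisions, non-revisions (made-up counts)

or_ = (a * d) / (b * c)                          # crude odds ratio
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)      # standard error of log(OR)
lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```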
Hooper, R.P.; Peters, N.E.
1989-01-01
A principal-components analysis was performed on the major solutes in wet deposition collected from 194 stations in the United States and its territories. Approximately 90% of the components derived could be interpreted as falling into one of three categories - acid, salt, or an agricultural/soil association. The total mass, or the mass of any one solute, was apportioned among these components by multiple linear regression techniques. The use of multisolute components for determining trends or spatial distribution represents a substantial improvement over single-solute analysis in that these components are more directly related to the sources of the deposition. The geographic patterns displayed by the components in this analysis indicate a far more important role for acid deposition in the Southeast and intermountain regions of the United States than would be indicated by maps of sulfate or nitrate deposition alone. In the Northeast and Midwest, the acid component is not declining at most stations, as would be expected from trends in sulfate deposition, but is holding constant or increasing. This is due, in part, to a decline in the agriculture/soil factor throughout this region, which would help to neutralize the acidity.
Reliability Centred Maintenance (RCM) Analysis of Laser Machine in Filling Lithos at PT X
NASA Astrophysics Data System (ADS)
Suryono, M. A. E.; Rosyidi, C. N.
2018-03-01
PT. X uses automated machines which work for sixteen hours per day. Therefore, the machines should be maintained to preserve their availability. The aim of this research is to determine maintenance tasks according to the causes of component failure using Reliability Centred Maintenance (RCM) and to determine the optimal inspection frequency for the machines in the filling lithos process. In this research, RCM is used as an analysis tool to determine the critical component and to find optimal inspection frequencies that maximize machine reliability. From the analysis, we found that the critical machine in the filling lithos process is the laser machine in Line 2. We then proceeded to determine the causes of the machine's failures. The lastube component has the highest Risk Priority Number (RPN) among other components such as the power supply, lens, chiller, laser siren, encoder, conveyor, and mirror galvo. Most of the components have operational consequences, and the others have hidden-failure consequences and safety consequences. Time-directed life-renewal tasks, failure-finding tasks, and servicing tasks can be used to address these consequences. The results of the data analysis show that inspection of the laser machine must be performed once a month as preventive maintenance to lower downtime.
Classification of breast tissue in mammograms using efficient coding.
Costa, Daniel D; Campos, Lúcio F; Barros, Allan K
2011-06-24
Female breast cancer is the major cause of death by cancer in western countries. Efforts in Computer Vision have been made to improve the diagnostic accuracy achieved by radiologists. Some methods for lesion diagnosis in mammogram images have been developed based on principal component analysis, which has been used for efficient coding of signals, and on 2D Gabor wavelets, which are used in computer vision applications and in modeling biological vision. In this work, we present a methodology that uses efficient coding along with linear discriminant analysis to distinguish between mass and non-mass in 5,090 regions of interest from mammograms. The results show that the best success rates reached with Gabor wavelets and principal component analysis were 85.28% and 87.28%, respectively. In comparison, the model of efficient coding presented here reached up to 90.07%. Altogether, the results demonstrate that independent component analysis successfully performed the efficient coding needed to discriminate mass from non-mass tissues. In addition, we observed that LDA with ICA bases showed high predictive performance for some datasets and thus provides significant support for a more detailed clinical investigation.
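The comparison described above can be sketched as an efficient-coding front end (PCA or ICA) followed by an LDA classifier under cross-validation; the data below are random stand-ins for the mammogram regions of interest, so the printed accuracies only illustrate the usage pattern.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(8)
# Random stand-ins for flattened ROI patches and mass / non-mass labels.
X = rng.normal(size=(400, 256))
y = rng.integers(0, 2, 400)

for name, coder in [("PCA", PCA(n_components=20)),
                    ("ICA", FastICA(n_components=20, random_state=0, max_iter=500))]:
    pipe = make_pipeline(coder, LinearDiscriminantAnalysis())
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name} + LDA cross-validated accuracy: {acc:.3f}")
```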
Sun, Hui; Wang, Huiyu; Zhang, Aihua; Yan, Guangli; Han, Ying; Li, Yuan; Wu, Xiuhong; Meng, Xiangcai; Wang, Xijun
2016-01-01
As herbal medicines have an important position in health care systems worldwide, their assessment and quality control are a major bottleneck. Cortex Phellodendri chinensis (CPC) and Cortex Phellodendri amurensis (CPA) are widely used in China; however, how to distinguish the two species has become an urgent question. In this study, a multivariate analysis approach was applied to the chemical discrimination of CPA and CPC. Principal component analysis showed that the two herbs could be separated clearly. Chemical markers such as berberine, palmatine, phellodendrine, magnoflorine, obacunone, and obaculactone were identified through orthogonal partial least squares discriminant analysis and were tentatively identified by accurate mass measurements from quadrupole time-of-flight mass spectrometry. A total of 29 components can be used as chemical markers for the discrimination of CPA and CPC. Of them, phellodendrine is significantly higher in CPC than in CPA, whereas obacunone and obaculactone are significantly higher in CPA than in CPC. The present study demonstrates that a multivariate-analysis-based chemical approach greatly contributes to the investigation of CPA and CPC, and shows that the identified chemical markers as a whole should be used to discriminate the two herbal medicines; the results also provide chemical information for their quality assessment.
Energy Efficient Engine: Combustor component performance program
NASA Technical Reports Server (NTRS)
Dubiel, D. J.
1986-01-01
The results of the Combustor Component Performance analysis as developed under the Energy Efficient Engine (EEE) program are presented. This study was conducted to demonstrate the aerothermal and environmental goals established for the EEE program and to identify areas where refinements might be made to meet future combustor requirements. In this study, a full annular combustor test rig was used to establish emission levels and combustor performance for comparison with those indicated by the supporting technology program. In addition, a combustor sector test rig was employed to examine differences in emissions and liner temperatures obtained during the full annular performance and supporting technology tests.
Independent EEG Sources Are Dipolar
Delorme, Arnaud; Palmer, Jason; Onton, Julie; Oostenveld, Robert; Makeig, Scott
2012-01-01
Independent component analysis (ICA) and blind source separation (BSS) methods are increasingly used to separate individual brain and non-brain source signals mixed by volume conduction in electroencephalographic (EEG) and other electrophysiological recordings. We compared results of decomposing thirteen 71-channel human scalp EEG datasets by 22 ICA and BSS algorithms, assessing the pairwise mutual information (PMI) in scalp channel pairs, the remaining PMI in component pairs, the overall mutual information reduction (MIR) effected by each decomposition, and decomposition ‘dipolarity’ defined as the number of component scalp maps matching the projection of a single equivalent dipole with less than a given residual variance. The least well-performing algorithm was principal component analysis (PCA); best performing were AMICA and other likelihood/mutual information based ICA methods. Though these and other commonly-used decomposition methods returned many similar components, across 18 ICA/BSS algorithms mean dipolarity varied linearly with both MIR and with PMI remaining between the resulting component time courses, a result compatible with an interpretation of many maximally independent EEG components as being volume-conducted projections of partially-synchronous local cortical field activity within single compact cortical domains. To encourage further method comparisons, the data and software used to prepare the results have been made available (http://sccn.ucsd.edu/wiki/BSSComparison). PMID:22355308
Optical Performance Of The Gemini Carbon Dioxide Laser Fusion System
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.; Hayden, J. J.; Liberman, I.
1980-11-01
The performance of the Gemini two-beam carbon dioxide laser fusion system was recently upgraded by installation of optical components with improved quality in the final amplifier. A theoretical analysis was conducted in conjunction with measurements of the new performance. The analysis and experimental procedures, and the results obtained, are reported and compared. Good agreement was found, within the uncertainties of the analysis and the inaccuracies of the experiments. The focal spot Strehl ratio was between 0.24 and 0.3 for both beams.
The Factor Structure of Some Piagetian Tasks
ERIC Educational Resources Information Center
Lawson, Anton E.; Nordland, Floyd H.
1976-01-01
The hypothesis that conservation tasks are unifactorial was investigated by administering eight different conservation tasks to 96 seventh-grade science students and performing a principal component analysis on the data. Results indicated that conservation tasks may measure up to three different components of cognitive thought. (SL)
Eaton, Jennifer L; Mohr, David C; Hodgson, Michael J; McPhaul, Kathleen M
2018-02-01
To describe development and validation of the work-related well-being (WRWB) index. Principal components analysis was performed using Federal Employee Viewpoint Survey (FEVS) data (N = 392,752) to extract variables representing worker well-being constructs. Confirmatory factor analysis was performed to verify factor structure. To validate the WRWB index, we used multiple regression analysis to examine relationships with burnout associated outcomes. Principal Components Analysis identified three positive psychology constructs: "Work Positivity", "Co-worker Relationships", and "Work Mastery". An 11 item index explaining 63.5% of variance was achieved. The structural equation model provided a very good fit to the data. Higher WRWB scores were positively associated with all three employee experience measures examined in regression models. The new WRWB index shows promise as a valid and widely accessible instrument to assess worker well-being.
Gao, Boyan; Lu, Yingjian; Sheng, Yi; Chen, Pei; Yu, Liangli (Lucy)
2013-01-01
High performance liquid chromatography (HPLC) and flow injection electrospray ionization with ion trap mass spectrometry (FIMS) fingerprints combined with the principal component analysis (PCA) were examined for their potential in differentiating commercial organic and conventional sage samples. The individual components in the sage samples were also characterized with an ultra-performance liquid chromatography with a quadrupole-time of flight mass spectrometer (UPLC Q-TOF MS). The results suggested that both HPLC and FIMS fingerprints combined with PCA could differentiate organic and conventional sage samples effectively. FIMS may serve as a quick test capable of distinguishing organic and conventional sages in 1 min, and could potentially be developed for high-throughput applications; whereas HPLC fingerprints could provide more chemical composition information with a longer analytical time. PMID:23464755
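The fingerprint-plus-PCA idea can be sketched as follows; the simulated chromatographic fingerprints, peak positions, group shift, and sample counts are assumptions standing in for the HPLC/FIMS data, not the published measurements.

```python
# Minimal sketch of fingerprint-based PCA discrimination with simulated
# chromatographic fingerprints; all numbers are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
t = np.linspace(0, 30, 300)                       # retention time grid (min)

def fingerprint(shift):
    """One noisy fingerprint: a few Gaussian peaks, one peak differing between groups."""
    peaks = [(5, 1.0), (12, 0.8 + shift), (18, 0.5), (24, 0.3)]
    y = sum(h * np.exp(-(t - c) ** 2 / 0.5) for c, h in peaks)
    return y + rng.normal(scale=0.02, size=t.size)

organic      = np.array([fingerprint(+0.3) for _ in range(15)])
conventional = np.array([fingerprint(-0.3) for _ in range(15)])
X = np.vstack([organic, conventional])
labels = np.array(["organic"] * 15 + ["conventional"] * 15)

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for lab in ("organic", "conventional"):
    m = scores[labels == lab].mean(axis=0)
    print(f"{lab:>12s}: mean PC1 = {m[0]:+.2f}, mean PC2 = {m[1]:+.2f}")
```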
NASA Astrophysics Data System (ADS)
Bisyri Husin Musawi Maliki, Ahmad; Razali Abdullah, Mohamad; Juahir, Hafizan; Muhamad, Wan Siti Amalina Wan; Afiqah Mohamad Nasir, Nur; Muazu Musa, Rabiu; Musliha Mat-Rasid, Siti; Adnan, Aleesha; Azura Kosni, Norlaila; Abdullah, Farhana; Ain Shahirah Abdullah, Nurul
2018-04-01
The main purpose of this study was to develop an Anthropometric, Growth and Maturity Index (AGaMI) in soccer and to explore how it differentiates soccer players' physical attributes, fitness, motivation and skills. A total of 223 adolescent soccer athletes aged 12 to 18 years were selected as respondents. AGaMI was developed from anthropometric components (biceps, triceps, subscapular, suprailiac, calf circumference and mid-upper arm circumference) together with growth and maturity components assessed on the Tanner scale. Relative performance measures, namely the physical, fitness, motivation and skill attributes of soccer, were measured as dependent variables. Principal Component Analysis (PCA) and Analysis of Variance (ANOVA) were used to achieve the study objective. AGaMI categorized players into three groups: high (5 players), moderate (88 players) and low (91 players). PCA revealed moderate to very strong factor loadings on AGaMI, ranging from 0.69 to 0.90. In the subsequent analysis the AGaMI groups were treated as the independent variable (IV) and the physical, fitness, motivation and skill attributes as dependent variables (DV). ANOVA showed that flexibility, leg power, age, weight, height, sitting height, and short and long passing were the parameters that differed significantly across the AGaMI groups (p<0.05). In summary, body fat mass, growth and maturity are essential components differentiating soccer players' relative performance. The AGaMI model provides coaches and players with information for identifying the biological and physiological demands underlying youth soccer relative performance, and this study highlights the importance of assessing AGaMI when evaluating soccer relative performance.
Cavala, Marijana; Katić, Ratko
2010-12-01
The aim of the study was to define biomotor characteristics that determine playing performance and position in female handball. A battery of 13 variables consisting of somatotype components (3 variables), basic motor abilities (5 variables) and specific motor abilities (5 variables) were applied in a sample of 52 elite female handball players. Differences in biomotor characteristics according to playing performance and position of female handball players were determined by use of the analysis of variance (ANOVA) and discriminative analysis. Study results showed the high-quality female handball players to predominantly differ from the less successful ones in the specific factor of throw strength and basic dash factor, followed by the specific abilities of movement without and with ball, basic coordination/agility and specific ability of ball manipulation, and a more pronounced mesomorphic component. Results also revealed the wing players to be superior in the speed of movement frequency (psychomotor speed), run (explosive strength) and speed of movement with ball as compared with players at other playing positions. Also, endomorphic component was less pronounced in players at the wing and back player positions as compared with goalkeeper and pivot positions, where endomorphic component was considerably more pronounced.
System diagnostics using qualitative analysis and component functional classification
Reifman, J.; Wei, T.Y.C.
1993-11-23
A method for detecting and identifying faulty component candidates during off-normal operations of nuclear power plants involves the qualitative analysis of macroscopic imbalances in the conservation equations of mass, energy and momentum in thermal-hydraulic control volumes associated with one or more plant components and the functional classification of components. The qualitative analysis of mass and energy is performed through the associated equations of state, while imbalances in momentum are obtained by tracking mass flow rates which are incorporated into a first knowledge base. The plant components are functionally classified, according to their type, as sources or sinks of mass, energy and momentum, depending upon which of the three balance equations is most strongly affected by a faulty component which is incorporated into a second knowledge base. Information describing the connections among the components of the system forms a third knowledge base. The method is particularly adapted for use in a diagnostic expert system to detect and identify faulty component candidates in the presence of component failures and is not limited to use in a nuclear power plant, but may be used with virtually any type of thermal-hydraulic operating system. 5 figures.
System diagnostics using qualitative analysis and component functional classification
Reifman, Jaques; Wei, Thomas Y. C.
1993-01-01
A method for detecting and identifying faulty component candidates during off-normal operations of nuclear power plants involves the qualitative analysis of macroscopic imbalances in the conservation equations of mass, energy and momentum in thermal-hydraulic control volumes associated with one or more plant components and the functional classification of components. The qualitative analysis of mass and energy is performed through the associated equations of state, while imbalances in momentum are obtained by tracking mass flow rates which are incorporated into a first knowledge base. The plant components are functionally classified, according to their type, as sources or sinks of mass, energy and momentum, depending upon which of the three balance equations is most strongly affected by a faulty component which is incorporated into a second knowledge base. Information describing the connections among the components of the system forms a third knowledge base. The method is particularly adapted for use in a diagnostic expert system to detect and identify faulty component candidates in the presence of component failures and is not limited to use in a nuclear power plant, but may be used with virtually any type of thermal-hydraulic operating system.
Performance of J-33-A-21 and J-33-A-23 Compressors with and without Water Injection
NASA Technical Reports Server (NTRS)
Beede, William L.
1948-01-01
In an investigation of the J-33-A-21 and the J-33-A-23 compressors with and without water injection, it was discovered that the compressors reacted differently to water injection although they were physically similar. An analysis of the effect of water injection on compressor performance and the consequent effect on matching of the compressor and turbine components in the turbojet engine was made. The analysis of component matching is based on a turbine flow function defined as the product of the equivalent weight flow and the reciprocal of the compressor pressure ratio.
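A hedged worked example of the turbine flow function defined above, the product of the equivalent weight flow and the reciprocal of the compressor pressure ratio; the numerical values are invented for illustration and are not J-33 test data.

```python
# Worked example of the turbine flow function: equivalent (corrected) weight
# flow times the reciprocal of the compressor pressure ratio. Values are
# illustrative assumptions, not measured data.
def turbine_flow_function(w_corrected, pressure_ratio):
    """Equivalent weight flow divided by compressor pressure ratio."""
    return w_corrected / pressure_ratio

dry   = turbine_flow_function(w_corrected=14.7, pressure_ratio=4.2)
wet   = turbine_flow_function(w_corrected=15.3, pressure_ratio=4.5)   # hypothetical water-injection case
print(f"dry: {dry:.3f}  wet: {wet:.3f}  shift: {100 * (wet / dry - 1):+.1f}%")
```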
Mueller, Evelyn A; Bengel, Juergen; Wirtz, Markus A
2013-12-01
This study aimed to develop a self-description assessment instrument to measure work performance in patients with musculoskeletal diseases. In terms of the International Classification of Functioning, Disability and Health (ICF), work performance is defined as the degree of meeting the work demands (activities) at the actual workplace (environment). To account for the fact that work performance depends on the work demands of the job, we strived to develop item banks that allow a flexible use of item subgroups depending on the specific work demands of the patients' jobs. Item development included the collection of work tasks from literature and content validation through expert surveys and patient interviews. The resulting 122 items were answered by 621 patients with musculoskeletal diseases. Exploratory factor analysis to ascertain dimensionality and Rasch analysis (partial credit model) for each of the resulting dimensions were performed. Exploratory factor analysis resulted in four dimensions, and subsequent Rasch analysis led to the following item banks: 'impaired productivity' (15 items), 'impaired cognitive performance' (18), 'impaired coping with stress' (13) and 'impaired physical performance' (low physical workload 20 items, high physical workload 10 items). The item banks exhibited person separation indices (reliability) between 0.89 and 0.96. The assessment of work performance adds the activities component to the more commonly employed participation component of the ICF-model. The four item banks can be adapted to specific jobs where necessary without losing comparability of person measures, as the item banks are based on Rasch analysis.
Incipient fault detection study for advanced spacecraft systems
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.
1986-01-01
A feasibility study to investigate the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) eight different analysis techniques including signature analysis, high frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition are compared; and (6) small sample statistical analysis is used to compare performance by computation of probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.
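One of the listed techniques, high-frequency demodulation, can be sketched as a Hilbert envelope spectrum. The simulated impact train, carrier frequency, and sampling rate below are illustrative assumptions, not the study's fan/pump data.

```python
# Minimal sketch of high-frequency demodulation (Hilbert envelope analysis)
# on a simulated bearing-fault-like signal. Fault rate, carrier frequency and
# sampling rate are illustrative assumptions.
import numpy as np
from scipy.signal import hilbert

fs, dur = 20000, 2.0                     # sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)
fault_rate, carrier = 87.0, 3500.0       # impacts/s exciting a structural resonance

# Repetitive impacts modulate a high-frequency resonance, buried in noise.
impacts = (np.sin(2 * np.pi * fault_rate * t) > 0.995).astype(float)
signal = impacts * np.sin(2 * np.pi * carrier * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)

envelope = np.abs(hilbert(signal))                      # demodulate
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)

band = (freqs > 10) & (freqs < 500)
print(f"dominant envelope-spectrum line: {freqs[band][np.argmax(spectrum[band])]:.1f} Hz "
      f"(expected near the {fault_rate:.0f} Hz fault rate)")
```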
Structural response of SSME turbine blade airfoils
NASA Technical Reports Server (NTRS)
Arya, V. K.; Abdul-Aziz, A.; Thompson, R. L.
1988-01-01
Reusable space propulsion hot gas-path components are required to operate under severe thermal and mechanical loading conditions. These operating conditions produce elevated temperatures and thermal transients which result in significant thermally induced inelastic strains, particularly in the turbopump turbine blades. An inelastic analysis for this component may therefore be necessary. Anisotropic alloys such as MAR M-247 or PWA-1480 are being considered to meet the safety and durability requirements of this component. An anisotropic inelastic structural analysis for an SSME fuel turbopump turbine blade was performed. The thermal loads used resulted from a transient heat transfer analysis of a turbine blade. A comparison of preliminary results from the elastic and inelastic analyses is presented.
Using Interactive Graphics to Teach Multivariate Data Analysis to Psychology Students
ERIC Educational Resources Information Center
Valero-Mora, Pedro M.; Ledesma, Ruben D.
2011-01-01
This paper discusses the use of interactive graphics to teach multivariate data analysis to Psychology students. Three techniques are explored through separate activities: parallel coordinates/boxplots; principal components/exploratory factor analysis; and cluster analysis. With interactive graphics, students may perform important parts of the…
Research on distributed heterogeneous data PCA algorithm based on cloud platform
NASA Astrophysics Data System (ADS)
Zhang, Jin; Huang, Gang
2018-05-01
Principal component analysis (PCA) of heterogeneous data sets can address the limited scalability of centralized data processing. In order to reduce the intermediate data and error components generated for distributed heterogeneous data sets, a principal component analysis algorithm for heterogeneous data sets on a cloud platform is proposed. The algorithm performs the eigenvalue computation using Householder tridiagonalization and QR factorization, and calculates the error component of the heterogeneous database associated with the public key to obtain the intermediate data set and the lost information. Experiments on distributed DBM heterogeneous datasets show that the proposed method is feasible and reliable in terms of execution time and accuracy.
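A rough sketch of PCA over horizontally partitioned data is shown below. It only aggregates per-partition sufficient statistics and delegates the eigensolve to LAPACK's symmetric routine (which internally uses Householder tridiagonalization and QR iteration, the ingredients named above); the cloud/DBM plumbing, error components, and key handling of the proposed algorithm are omitted, so this is an assumption-laden illustration rather than the paper's method.

```python
# Minimal sketch of PCA over horizontally partitioned ("distributed") data:
# each site contributes only its local count, sum and Gram matrix, and the
# coordinator assembles the global covariance and eigendecomposes it.
import numpy as np

rng = np.random.default_rng(3)
partitions = [rng.normal(size=(n, 6)) @ rng.normal(size=(6, 6)) for n in (400, 250, 350)]

# Each partition ships (n_k, sum_k, Gram_k); raw rows never leave the site.
stats = [(X.shape[0], X.sum(axis=0), X.T @ X) for X in partitions]

n_total = sum(n for n, _, _ in stats)
mean = sum(s for _, s, _ in stats) / n_total
scatter = sum(G for _, _, G in stats) - n_total * np.outer(mean, mean)
cov = scatter / (n_total - 1)

eigvals, eigvecs = np.linalg.eigh(cov)               # symmetric solver (Householder + QR inside LAPACK)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Sanity check against PCA on the pooled data.
pooled = np.vstack(partitions)
ref = np.linalg.eigvalsh(np.cov(pooled, rowvar=False))[::-1]
print("distributed eigenvalues:", np.round(eigvals, 3))
print("pooled      eigenvalues:", np.round(ref, 3))
```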
Grizzly Usage and Theory Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, B. W.; Backman, M.; Chakraborty, P.
2016-03-01
Grizzly is a multiphysics simulation code for characterizing the behavior of nuclear power plant (NPP) structures, systems and components (SSCs) subjected to a variety of age-related degradation mechanisms. Grizzly simulates both the progression of aging processes and the capacity of aged components to perform safely. This initial beta release of Grizzly includes capabilities for engineering-scale thermo-mechanical analysis of reactor pressure vessels (RPVs). Grizzly will ultimately include capabilities for a wide range of components and materials. Grizzly is in a state of constant development, and future releases will broaden the capabilities of this code for RPV analysis, as well as expand it to address degradation in other critical NPP components.
Computational analysis of a multistage axial compressor
NASA Astrophysics Data System (ADS)
Mamidoju, Chaithanya
Turbomachines are used extensively in the aerospace, power generation, and oil and gas industries. Efficiency of these machines is often an important factor and has led to a continuous effort to improve designs to achieve better efficiency. The axial flow compressor is a major component of a gas turbine, and the turbine's overall performance depends strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. A methodology is described for steady state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.
NASA Technical Reports Server (NTRS)
1991-01-01
Analytical Design Service Corporation, Ann Arbor, MI, used NASTRAN (a NASA Structural Analysis program that analyzes a design and predicts how parts will perform) in tests of transmissions, engine cooling systems, internal engine parts, and body components. They also use it to design future automobiles. Analytical software can save millions by allowing computer simulated analysis of performance even before prototypes are built.
Heat-Energy Analysis for Solar Receivers
NASA Technical Reports Server (NTRS)
Lansing, F. L.
1982-01-01
Heat-energy analysis program (HEAP) solves general heat-transfer problems, with some specific features that are "custom made" for analyzing solar receivers. Can be utilized not only to predict receiver performance under varying solar flux, ambient temperature and local heat-transfer rates but also to detect locations of hotspots and metallurgical difficulties and to predict performance sensitivity of neighboring component parameters.
Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang
2017-05-01
Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae Semen Carbonisata, are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technology of a colorimeter and sensitive, validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined with a CR-410 colorimeter. Moreover, the fingerprints of the four alkaloids arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for the final data analysis. The results demonstrated significant and characteristic color differences between raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be discriminated based on colorimetry and high-performance liquid chromatography with diode array detection coupled with chemometric methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Kesharaju, Manasa; Nagarajah, Romesh
2015-09-01
The motivation for this research stems from the need for a non-destructive testing method capable of detecting and locating defects and microstructural variations within armour ceramic components before they are issued to the soldiers who rely on them for their survival. The development of an automated ultrasonic-inspection-based classification system would make it possible to check each ceramic component and immediately alert the operator to the presence of defects. In many classification problems the choice of features, or dimensionality reduction, is significant and simultaneously very difficult, as substantial computational effort is required to evaluate possible feature subsets. In this research, a combination of artificial neural networks and genetic algorithms is used to optimize the feature subset used in the classification of various defects in reaction-sintered silicon carbide ceramic components. Initially, wavelet-based feature extraction is performed on the region of interest. An artificial neural network classifier is employed to evaluate the performance of these features, and genetic algorithm based feature selection is then performed. Principal component analysis, a popular technique for feature selection, is compared with the genetic algorithm based technique in terms of classification accuracy and the selection of the optimal number of features. The experimental results confirm that the features identified by principal component analysis lead to better classification performance (96%) than those selected by the genetic algorithm (94%). Copyright © 2015 Elsevier B.V. All rights reserved.
Empirical mode decomposition for analyzing acoustical signals
NASA Technical Reports Server (NTRS)
Huang, Norden E. (Inventor)
2005-01-01
The present invention discloses a computer-implemented signal analysis method based on the Hilbert-Huang Transform (HHT) for analyzing acoustical signals, which are assumed to be nonlinear and nonstationary. Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) are used to obtain the HHT. Essentially, the acoustical signal is decomposed into Intrinsic Mode Function components (IMFs). Once the invention decomposes the acoustic signal into its constituent components, all operations such as analyzing, identifying, and removing unwanted signals can be performed on these components. Upon transforming the IMFs into the Hilbert spectrum, the acoustical signal may be compared with other acoustical signals.
High performance semantic factoring of giga-scale semantic graph databases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
al-Saffar, Sinan; Adolf, Bob; Haglin, David
2010-10-01
As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to bring high performance computational resources to bear on their analysis, interpretation, and visualization, especially with respect to their innate semantic structure. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multithreaded architecture of the Cray XMT platform, conventional clusters, and large data stores. In this paper we describe that architecture, and present the results of our deploying that for the analysis of the Billion Triple dataset with respect to its semantic factors, including basic properties, connected components, namespace interaction, and typed paths.
Samu, Dávid; Campbell, Karen L.; Tsvetanov, Kamen A.; Shafto, Meredith A.; Brayne, Carol; Bullmore, Edward T.; Calder, Andrew C.; Cusack, Rhodri; Dalgleish, Tim; Duncan, John; Henson, Richard N.; Matthews, Fiona E.; Marslen-Wilson, William D.; Rowe, James B.; Cheung, Teresa; Davis, Simon; Geerligs, Linda; Kievit, Rogier; McCarrey, Anna; Mustafa, Abdur; Price, Darren; Taylor, Jason R.; Treder, Matthias; van Belle, Janna; Williams, Nitin; Bates, Lauren; Emery, Tina; Erzinçlioglu, Sharon; Gadie, Andrew; Gerbase, Sofia; Georgieva, Stanimira; Hanley, Claire; Parkin, Beth; Troy, David; Auer, Tibor; Correia, Marta; Gao, Lu; Green, Emma; Henriques, Rafael; Allen, Jodie; Amery, Gillian; Amunts, Liana; Barcroft, Anne; Castle, Amanda; Dias, Cheryl; Dowrick, Jonathan; Fair, Melissa; Fisher, Hayley; Goulding, Anna; Grewal, Adarsh; Hale, Geoff; Hilton, Andrew; Johnson, Frances; Johnston, Patricia; Kavanagh-Williamson, Thea; Kwasniewska, Magdalena; McMinn, Alison; Norman, Kim; Penrose, Jessica; Roby, Fiona; Rowland, Diane; Sargeant, John; Squire, Maggie; Stevens, Beth; Stoddart, Aldabra; Stone, Cheryl; Thompson, Tracy; Yazlik, Ozlem; Barnes, Dan; Dixon, Marie; Hillman, Jaya; Mitchell, Joanne; Villis, Laura; Tyler, Lorraine K.
2017-01-01
Healthy ageing has disparate effects on different cognitive domains. The neural basis of these differences, however, is largely unknown. We investigated this question by using Independent Components Analysis to obtain functional brain components from 98 healthy participants aged 23–87 years from the population-based Cam-CAN cohort. Participants performed two cognitive tasks that show age-related decrease (fluid intelligence and object naming) and a syntactic comprehension task that shows age-related preservation. We report that activation of task-positive neural components predicts inter-individual differences in performance in each task across the adult lifespan. Furthermore, only the two tasks that show performance declines with age show age-related decreases in task-positive activation of neural components and decreasing default mode (DM) suppression. Our results suggest that distributed, multi-component brain responsivity supports cognition across the adult lifespan, and the maintenance of this, along with maintained DM deactivation, characterizes successful ageing and may explain differential ageing trajectories across cognitive domains. PMID:28480894
Samu, Dávid; Campbell, Karen L; Tsvetanov, Kamen A; Shafto, Meredith A; Tyler, Lorraine K
2017-05-08
Healthy ageing has disparate effects on different cognitive domains. The neural basis of these differences, however, is largely unknown. We investigated this question by using Independent Components Analysis to obtain functional brain components from 98 healthy participants aged 23-87 years from the population-based Cam-CAN cohort. Participants performed two cognitive tasks that show age-related decrease (fluid intelligence and object naming) and a syntactic comprehension task that shows age-related preservation. We report that activation of task-positive neural components predicts inter-individual differences in performance in each task across the adult lifespan. Furthermore, only the two tasks that show performance declines with age show age-related decreases in task-positive activation of neural components and decreasing default mode (DM) suppression. Our results suggest that distributed, multi-component brain responsivity supports cognition across the adult lifespan, and the maintenance of this, along with maintained DM deactivation, characterizes successful ageing and may explain differential ageing trajectories across cognitive domains.
NASA Astrophysics Data System (ADS)
Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi
2018-02-01
In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. The ICA based channel equalization after both single-mode fiber and few-mode fiber transmission for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats are investigated, respectively. The performance comparisons with conventional channel equalization techniques are discussed.
In situ X-ray diffraction analysis of (CFx)n batteries: signal extraction by multivariate analysis
Rodriguez, Mark A.; Keenan, Michael R.; Nagasubramanian, Ganesan
2007-11-10
In this study, the (CFx)n cathode reaction during discharge has been investigated using in situ X-ray diffraction (XRD). Mathematical treatment of the in situ XRD data set was performed using multivariate curve resolution with alternating least squares (MCR-ALS), a technique of multivariate analysis. MCR-ALS analysis successfully separated the relatively weak XRD signal intensity due to the chemical reaction from the other inert cell component signals. The resulting dynamic reaction component revealed the loss of (CFx)n cathode signal together with the simultaneous appearance of LiF by-product intensity. Careful examination of the XRD data set revealed an additional dynamic component which may be associated with the formation of an intermediate compound during the discharge process.
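A generic MCR-ALS loop, in the spirit of the analysis described above, might look like the following; the simulated diffraction-like patterns, the component count, and the crude clip-based nonnegativity are illustrative assumptions, not the study's settings.

```python
# Minimal MCR-ALS sketch on simulated diffraction-like patterns: D ~ C @ S.T
# is factored by alternating (crudely) nonnegative least squares.
import numpy as np

rng = np.random.default_rng(4)
two_theta = np.linspace(10, 80, 400)

def peak(center, width):
    return np.exp(-(two_theta - center) ** 2 / (2 * width ** 2))

S_true = np.stack([peak(25, 0.4) + peak(48, 0.3),      # cathode-like phase
                   peak(38, 0.3) + peak(63, 0.4)]).T   # by-product-like phase
C_true = np.stack([np.linspace(1.0, 0.1, 30),          # fades during discharge
                   np.linspace(0.05, 0.9, 30)]).T      # grows during discharge
D = C_true @ S_true.T + rng.normal(scale=0.01, size=(30, 400))

def nnls_cols(A, B):
    """Solve min ||A X - B|| columnwise with X >= 0 (crude clip-after-lstsq)."""
    X, *_ = np.linalg.lstsq(A, B, rcond=None)
    return np.clip(X, 0, None)

C = np.abs(rng.normal(size=(30, 2)))                   # random initial guess
for _ in range(200):
    S = nnls_cols(C, D).T            # update spectra with concentrations fixed
    C = nnls_cols(S, D.T).T          # update concentrations with spectra fixed
    if np.linalg.norm(D - C @ S.T) / np.linalg.norm(D) < 0.02:
        break

print("relative residual:", round(np.linalg.norm(D - C @ S.T) / np.linalg.norm(D), 4))
```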
Implanted component faults and their effects on gas turbine engine performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacLeod, J.D.; Taylor, V.; Laflamme, J.C.G.
Under the sponsorship of the Canadian Department of National Defence, the Engine Laboratory of the National Research Council of Canada (NRCC) has established a program for the evaluation of the effects of component deterioration on gas turbine engine performance. The effort is aimed at investigating the effects of typical in-service faults on the performance characteristics of each individual engine component. The objective of the program is the development of a generalized fault library, which will be used with fault identification techniques in the field to reduce unscheduled maintenance. To evaluate the effects of implanted faults on the performance of a single-spool engine, such as an Allison T56 turboprop engine, a series of faulted parts was installed. For this paper the following faults were analyzed: (a) first-stage turbine nozzle erosion damage; (b) first-stage turbine rotor blade untwist; (c) compressor seal wear; (d) first- and second-stage compressor blade tip clearance increase. This paper describes the project objectives, the experimental installation, and the results of the fault implantation on engine performance. Performance variations in both engine and component characteristics are discussed. As the performance changes were significant, a rigorous measurement uncertainty analysis is included.
Analysis of radiofrequency discharges in plasma
Kumar, Devendra; McGlynn, Sean P.
1992-01-01
Separation of laser optogalvanic signals in plasma into two components: (1) an ionization rate change component, and (2) a photoacoustic mediated component. This separation of components may be performed even when the two components overlap in time, by measuring time-resolved laser optogalvanic signals in an rf discharge plasma as the rf frequency is varied near the electrical resonance peak of the plasma and associated driving/detecting circuits. A novel spectrometer may be constructed to make these measurements. Such a spectrometer would be useful in better understanding and controlling such processes as plasma etching and plasma deposition.
Performance-based maintenance of gas turbines for reliable control of degraded power systems
NASA Astrophysics Data System (ADS)
Mo, Huadong; Sansavini, Giovanni; Xie, Min
2018-03-01
Maintenance actions are necessary for ensuring proper operation of control systems under component degradation. However, current condition-based maintenance (CBM) models based on component health indices are not suitable for degraded control systems. Indeed, failures of control systems are determined only by the controller outputs, and the feedback mechanism compensates for the control performance loss caused by component deterioration. Thus, control systems may still operate normally even if the component health indices exceed failure thresholds. This work investigates a CBM model for control systems and employs the reduced control performance as a direct degradation measure for deciding maintenance activities. The reduced control performance depends on the underlying component degradation, modelled as a Wiener process, and on the feedback mechanism. To this aim, the controller features are quantified by developing a dynamic and stochastic control-block-diagram-based simulation model consisting of the degraded components and the control mechanism. At each inspection, the system receives a maintenance action if the control performance deterioration exceeds its preventive-maintenance or failure thresholds. Inspired by realistic cases, the component degradation model considers a random start time and unit-to-unit variability. The cost analysis of the maintenance model is conducted via Monte Carlo simulation. Optimal maintenance strategies are investigated to minimize the expected maintenance costs, which are a direct consequence of the control performance. The proposed framework is able to design preventive maintenance actions for a gas power plant, ensuring the required load frequency control performance against a sudden load increase. The optimization results identify the trade-off between system downtime and maintenance costs as a function of the preventive maintenance thresholds and the inspection frequency. Finally, the control performance-based maintenance model can reduce maintenance costs as compared to CBM and pre-scheduled maintenance.
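A highly simplified Monte Carlo sketch of this kind of performance-based maintenance optimization is given below; the Wiener-process parameters, thresholds, and cost figures are illustrative assumptions, and the controller dynamics themselves are not modelled.

```python
# Minimal Monte Carlo sketch: degradation follows a Wiener process with a
# random start time and unit-to-unit drift variability; at periodic
# inspections a preventive action is triggered above a PM threshold, a
# costlier corrective action above the failure threshold. All numbers are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)

def expected_cost(pm_threshold, inspect_every, horizon=500.0, n_runs=2000,
                  fail_threshold=1.0, c_inspect=1.0, c_pm=20.0, c_fail=200.0):
    dt = inspect_every
    costs = np.zeros(n_runs)
    for r in range(n_runs):
        start = rng.uniform(0, 100)                  # random degradation start time
        drift = rng.normal(2e-3, 5e-4)               # unit-to-unit variability
        x, t, cost = 0.0, 0.0, 0.0
        while t < horizon:
            t += dt
            if t > start:                            # Wiener increment after onset
                x += drift * dt + 0.02 * np.sqrt(dt) * rng.normal()
            cost += c_inspect
            if x >= fail_threshold:                  # corrective replacement
                cost += c_fail
                x = 0.0
            elif x >= pm_threshold:                  # preventive maintenance
                cost += c_pm
                x = 0.0
        costs[r] = cost
    return costs.mean()

for thr in (0.5, 0.7, 0.9):
    print(f"PM threshold {thr:.1f}: expected cost {expected_cost(thr, inspect_every=20.0):.1f}")
```

Sweeping the preventive-maintenance threshold and the inspection interval in this way exposes the cost trade-off the abstract describes, under the stated toy assumptions.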
Rui, Wen; Chen, Hongyuan; Tan, Yuzhi; Zhong, Yanmei; Feng, Yifan
2010-05-01
A rapid method for the analysis of the main components of the total glycosides of Ranunculus japonicus (TGOR) was developed using ultra-performance liquid chromatography with quadrupole-time-of-flight mass spectrometry (UPLC/Q-TOF-MS). The separation analysis was performed on a Waters Acquity UPLC system and the accurate mass of molecules and their fragment ions were determined by Q-TOF MS. Twenty compounds, including lactone glycosides, flavonoid glycosides and flavonoid aglycones, were identified and tentatively deduced on the basis of their elemental compositions, MS/MS data and relevant literature. The results demonstrated that lactone glycosides and flavonoids were the main constituents of TGOR. Furthermore, an effective and rapid pattern was established allowing for the comprehensive and systematic characterization of the complex samples.
NASA Astrophysics Data System (ADS)
Jian, X. H.; Dong, F. L.; Xu, J.; Li, Z. J.; Jiao, Y.; Cui, Y. Y.
2018-05-01
The feasibility of differentiating tissue components by performing frequency domain analysis of photoacoustic images acquired at different wavelengths was studied in this paper. Firstly, according to the basic theory of photoacoustic imaging, a brief theoretical model for frequency domain analysis of multiwavelength photoacoustic signals was derived. The experimental results showed that the behavior of different targets in the frequency domain is quite different. In particular, the acoustic-spectrum characteristic peaks of different targets are unique: 2.93 MHz, 5.37 MHz, 6.83 MHz, and 8.78 MHz for the PDMS phantom, versus 13.20 MHz, 16.60 MHz, 26.86 MHz, and 29.30 MHz for pork fat. The results indicate that the acoustic spectrum of photoacoustic imaging signals could be utilized for tissue composition characterization.
Caffeine ingestion enhances Wingate performance: a meta-analysis.
Grgic, Jozo
2018-03-01
The positive effects of caffeine ingestion on aerobic performance are well established; however, recent findings suggest that caffeine ingestion might also enhance components of anaerobic performance. A commonly used test of anaerobic performance and power output is the 30-second Wingate test. Several studies have explored the effects of caffeine ingestion on Wingate performance, with equivocal findings. To elucidate this topic, this paper aims to determine the effects of caffeine ingestion on Wingate performance using meta-analytic statistical techniques. Following a search through PubMed/MEDLINE, Scopus, and SportDiscus®, 16 studies were found meeting the inclusion criteria (pooled number of participants = 246). A random-effects meta-analysis of standardized mean differences (SMD) for peak power output and mean power output was performed. Study quality was assessed using a modified version of the PEDro checklist. Results of the meta-analysis indicated a significant difference (p = .005) between the placebo and caffeine trials in mean power output, with SMD values of small magnitude (0.18; 95% confidence interval: 0.05, 0.31; +3%). The meta-analysis performed for peak power output indicated a significant difference (p = .006) between the placebo and caffeine trials (SMD = 0.27; 95% confidence interval: 0.08, 0.47 [moderate magnitude]; +4%). The results from the PEDro checklist indicated that, in general, the studies are of good and excellent methodological quality. This meta-analysis adds to the current body of evidence showing that caffeine ingestion can also enhance components of anaerobic performance. The results presented herein may be helpful for developing more efficient evidence-based recommendations regarding caffeine supplementation.
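A minimal DerSimonian-Laird random-effects pooling of standardized mean differences, the type of analysis described above, can be sketched as follows; the per-study SMDs and variances are invented for illustration and do not correspond to the 16 included studies.

```python
# Minimal DerSimonian-Laird random-effects pooling of standardized mean
# differences. Per-study SMDs and variances are hypothetical.
import numpy as np
from scipy import stats

smd = np.array([0.10, 0.25, 0.05, 0.40, 0.30, 0.15])   # hypothetical study SMDs
var = np.array([0.02, 0.03, 0.01, 0.05, 0.04, 0.02])   # hypothetical SMD variances

w_fixed = 1.0 / var
fixed_mean = np.sum(w_fixed * smd) / w_fixed.sum()
q = np.sum(w_fixed * (smd - fixed_mean) ** 2)           # Cochran's Q
df = len(smd) - 1
c = w_fixed.sum() - np.sum(w_fixed ** 2) / w_fixed.sum()
tau2 = max(0.0, (q - df) / c)                           # between-study variance

w = 1.0 / (var + tau2)
pooled = np.sum(w * smd) / w.sum()
se = np.sqrt(1.0 / w.sum())
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
p = 2 * (1 - stats.norm.cdf(abs(pooled / se)))
print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), p = {p:.3f}")
```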
Code of Federal Regulations, 2014 CFR
2014-01-01
... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...
Code of Federal Regulations, 2013 CFR
2013-01-01
... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...
Code of Federal Regulations, 2012 CFR
2012-01-01
... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...
Code of Federal Regulations, 2011 CFR
2011-01-01
... emergency power to instruments, utility service systems, and operating systems important to safety if there... include: (a) A general description of the structures, systems, components, equipment, and process... of the performance of the structures, systems, and components to identify those that are important to...
ERIC Educational Resources Information Center
Peterson, Christina Hamme
2012-01-01
Counseling work is increasingly conducted in team format. The methods counseling teams use to manage the emotional component of their group life, or their group emotional intelligence, have been proposed as significantly contributing to group member trust, cooperation, and ultimate performance. Item development, exploratory factor analysis, and…
In-line task 57: Component evaluation. [of circuit breakers, panel switches, etc. for space shuttle
NASA Technical Reports Server (NTRS)
Boykin, J. C.
1974-01-01
Design analysis tests were performed on selected power switching components to determine the possible applicability of off-the-shelf hardware to space shuttles. Various characteristics were also evaluated in those devices to determine the most desirable properties for the space shuttle.
NASA Technical Reports Server (NTRS)
Boyd, R. K.; Brumfield, J. O.; Campbell, W. J.
1984-01-01
Three feature extraction methods, canonical analysis (CA), principal component analysis (PCA), and band selection, have been applied to Thematic Mapper Simulator (TMS) data in order to evaluate the relative performance of the methods. The results obtained show that CA is capable of providing a transformation of TMS data which leads to better classification results than provided by all seven bands, by PCA, or by band selection. A second conclusion drawn from the study is that TMS bands 2, 3, 4, and 7 (thermal) are most important for landcover classification.
MSFC Skylab structures and mechanical systems mission evaluation
NASA Technical Reports Server (NTRS)
1974-01-01
A performance analysis for structural and mechanical major hardware systems and components is presented. Development background, testing, modifications, and requirement adjustments are included. Functional narratives are provided for comparison purposes, as are predicted design performance criteria. Each item is evaluated on an individual basis: that is, (1) history (requirements, design, manufacture, and test); (2) in-orbit performance (description and analysis); and (3) conclusions and recommendations regarding future space hardware application. Overall, the structural and mechanical performance of the Skylab hardware was outstanding.
Solar cell array design handbook - The principles and technology of photovoltaic energy conversion
NASA Technical Reports Server (NTRS)
Rauschenbach, H. S.
1980-01-01
Photovoltaic solar cell array design and technology for ground-based and space applications are discussed from the user's point of view. Solar array systems are described, with attention given to array concepts, historical development, applications and performance, and the analysis of array characteristics, circuits, components, performance and reliability is examined. Aspects of solar cell array design considered include the design process, photovoltaic system and detailed array design, and the design of array thermal, radiation shielding and electromagnetic components. Attention is then given to the characteristics and design of the separate components of solar arrays, including the solar cells, optical elements and mechanical elements, and the fabrication, testing, environmental conditions and effects and material properties of arrays and their components are discussed.
Orbital transfer rocket engine technology 7.5K-LB thrust rocket engine preliminary design
NASA Technical Reports Server (NTRS)
Harmon, T. J.; Roschak, E.
1993-01-01
A preliminary design of an advanced LOX/LH2 expander cycle rocket engine producing 7,500 lbf thrust for Orbital Transfer Vehicle missions was completed. Engine system, component, and turbomachinery analyses at both on-design and off-design conditions were completed. The preliminary design analysis results showed that engine requirements and performance goals were met. The computer models are described and model outputs are presented. Engine system assembly layouts, component layouts, and valve and control system analyses are presented. Major design technologies were identified, and remaining issues and concerns were listed.
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul
2002-01-01
This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
Extension of a System Level Tool for Component Level Analysis
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)
2001-01-01
This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, C.H.; Ready, A.B.; Rea, J.
1995-06-01
Versions of the computer program PROATES (PROcess Analysis for Thermal Energy Systems) have been used since 1979 to analyse plant performance improvement proposals relating to existing plant and also to evaluate new plant designs. Several plant modifications have been made to improve performance based on the model predictions, and the predicted performance has been realised in practice. The program was born out of a need to model the overall steady state performance of complex plant to enable proposals to change plant component items or operating strategy to be evaluated. To do this with confidence it is necessary to model the multiple thermodynamic interactions between the plant components. The modelling system is modular in concept, allowing the configuration of individual plant components to represent any particular power plant design. A library exists of physics based modules which have been extensively validated and which provide representations of a wide range of boiler, turbine and CW system components. Changes to model data and construction are achieved via a user-friendly graphical model editing/analysis front-end, with results being presented via the computer screen or hard copy. The paper describes briefly the modelling system but concentrates mainly on the application of the modelling system to assess design re-optimisation, firing with different fuels and the re-powering of an existing plant.
NASA Astrophysics Data System (ADS)
Ji, Yi; Sun, Shanlin; Xie, Hong-Bo
2017-06-01
Discrete wavelet transform (WT) followed by principal component analysis (PCA) has been a powerful approach for the analysis of biomedical signals. Wavelet coefficients at various scales and channels were usually transformed into a one-dimensional array, causing issues such as the curse of dimensionality dilemma and small sample size problem. In addition, lack of time-shift invariance of WT coefficients can be modeled as noise and degrades the classifier performance. In this study, we present a stationary wavelet-based two-directional two-dimensional principal component analysis (SW2D2PCA) method for the efficient and effective extraction of essential feature information from signals. Time-invariant multi-scale matrices are constructed in the first step. The two-directional two-dimensional principal component analysis then operates on the multi-scale matrices to reduce the dimension, rather than vectors in conventional PCA. Results are presented from an experiment to classify eight hand motions using 4-channel electromyographic (EMG) signals recorded in healthy subjects and amputees, which illustrates the efficiency and effectiveness of the proposed method for biomedical signal analysis.
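The multiscale-matrix idea can be sketched roughly as follows; only the column (time) direction of the two-directional 2D PCA is shown, and the PyWavelets-based stationary wavelet transform, the wavelet choice, and the simulated signals are assumptions for illustration rather than the paper's implementation.

```python
# Minimal sketch: each (simulated) single-channel signal is expanded into a
# time-invariant multi-scale matrix via the stationary wavelet transform, and
# 2D PCA operates on those matrices directly instead of flattened vectors.
import numpy as np
import pywt

rng = np.random.default_rng(6)
n_signals, n_samples, level = 40, 256, 3

def multiscale_matrix(x):
    """Rows = detail coefficients at each SWT level plus the final approximation."""
    coeffs = pywt.swt(x, "db4", level=level)            # [(cA3, cD3), (cA2, cD2), (cA1, cD1)]
    rows = [cD for _, cD in coeffs] + [coeffs[0][0]]
    return np.vstack(rows)                              # shape (level + 1, n_samples)

signals = [np.sin(2 * np.pi * rng.uniform(2, 8) * np.linspace(0, 1, n_samples))
           + 0.3 * rng.normal(size=n_samples) for _ in range(n_signals)]
A = np.stack([multiscale_matrix(x) for x in signals])   # (n_signals, level + 1, n_samples)
A_mean = A.mean(axis=0)

# Column-direction covariance (acts on the time axis), as in 2D PCA; the row
# direction would be handled analogously in the two-directional variant.
G = sum((Ai - A_mean).T @ (Ai - A_mean) for Ai in A) / len(A)
eigvals, eigvecs = np.linalg.eigh(G)
proj = eigvecs[:, -8:]                                   # keep the 8 leading column projections
features = np.stack([(Ai - A_mean) @ proj for Ai in A])  # (n_signals, level + 1, 8)
print("feature matrix per signal:", features.shape[1:], "instead of", (level + 1) * n_samples, "values")
```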
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/ Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
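A toy fast-fracture reliability calculation in the spirit of the methodology above (not CARES/Life itself) is sketched below; the element volumes, principal stresses, and Weibull parameters are illustrative assumptions, and only the principle-of-independent-action, volume-flaw case at a single worst-case time step of a transient is shown.

```python
# Toy fast-fracture reliability sketch: element-wise two-parameter Weibull
# volume-flaw model with the principle of independent action (PIA) over
# principal stresses, at the most severe point of a transient load history.
import numpy as np

m, sigma_0 = 10.0, 400.0   # Weibull modulus; characteristic strength (MPa) referenced to 1 mm^3

# A few "finite elements": volume (mm^3) and principal stresses (MPa) at the
# worst time step of the transient (only tensile stresses contribute).
elements = [
    {"volume": 2.0,  "principal": (310.0, 120.0,  15.0)},
    {"volume": 5.0,  "principal": (250.0,  90.0, -40.0)},
    {"volume": 10.0, "principal": (180.0,  60.0,  10.0)},
]

def element_survival(vol, principal):
    """PIA: each tensile principal stress contributes an independent Weibull risk."""
    risk = sum((max(s, 0.0) / sigma_0) ** m for s in principal)
    return np.exp(-vol * risk)

p_survival = np.prod([element_survival(e["volume"], e["principal"]) for e in elements])
print(f"component fast-fracture probability of survival: {p_survival:.4f}")
```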
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
Accomplishments are described for the second-year effort of a 3-year program to develop methodology for component-specific modeling of aircraft engine hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models; (2) geometry model generators; (3) remeshing; (4) specialty 3-D inelastic structural analysis; (5) computationally efficient solvers; (6) adaptive solution strategies; (7) engine performance parameters/component response variables decomposition and synthesis; (8) integrated software architecture and development; and (9) validation cases for the software developed.
NASA Technical Reports Server (NTRS)
Williams, R. E.; Kruger, R.
1980-01-01
Estimation procedures are described for measuring component failure rates, for comparing the failure rates of two different groups of components, and for formulating confidence intervals for testing hypotheses (based on failure rates) that the two groups perform similarly or differently. Appendix A contains an example of an analysis in which these methods are applied to investigate the characteristics of two groups of spacecraft components. The estimation procedures are adaptable to system level testing and to monitoring failure characteristics in orbit.
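The kind of estimates described above can be sketched as follows; the failure counts and operating times are invented, and the chi-square intervals and conditional binomial comparison are standard textbook forms rather than the report's exact procedures.

```python
# Minimal sketch: Poisson/exponential failure-rate point estimates with
# chi-square confidence intervals for two component groups, plus an exact
# conditional test of whether the groups differ. Values are hypothetical.
import numpy as np
from scipy import stats

def rate_ci(failures, hours, alpha=0.05):
    """Point estimate and two-sided chi-square CI for a (time-truncated) failure rate."""
    lam = failures / hours
    lo = stats.chi2.ppf(alpha / 2, 2 * failures) / (2 * hours) if failures > 0 else 0.0
    hi = stats.chi2.ppf(1 - alpha / 2, 2 * (failures + 1)) / (2 * hours)
    return lam, lo, hi

groups = {"group A": (7, 52000.0), "group B": (15, 61000.0)}   # (failures, component-hours)
for name, (f, t) in groups.items():
    lam, lo, hi = rate_ci(f, t)
    print(f"{name}: {lam*1e6:.1f} failures per 1e6 h  (95% CI {lo*1e6:.1f}-{hi*1e6:.1f})")

# Conditional exact comparison: under equal rates, group A's failure count is
# binomial with success probability proportional to its exposure time.
(fa, ta), (fb, tb) = groups["group A"], groups["group B"]
p = stats.binomtest(fa, fa + fb, ta / (ta + tb)).pvalue
print(f"two-sided p-value for 'rates are equal': {p:.3f}")
```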
Sharifi, Mohammad Sharif; Hazell, Stuart Loyd
2012-01-01
The chemical entities of Mastic, Kurdica, Mutica and Cabolica gums from the genus Pistacia have been isolated and characterised by GC-mass spectrometry, high performance liquid chromatography and column chromatography. These chemical entities were screened for antimicrobial activities against nine strains of Helicobacter pylori and some other Gram-negative and Gram-positive bacteria. The most bioactive components were structurally analysed. These components mimic steroid compounds, in particular the known antibiotic fusidic acid. Some of these chemical entities have produced promising data that could lead to the development of a novel class of antimicrobial agents with application in the treatment of infectious disease. Kill kinetics were also performed, and the resulting data were evaluated by Generalized Multiplicative Analysis of Variance (GEMANOVA) for bactericidal and bacteriostatic activities, which can be clinically significant. The isolated components were all bactericidal. PMID:22980113
NASA Astrophysics Data System (ADS)
Pu, Huangsheng; Zhang, Guanglei; He, Wei; Liu, Fei; Guang, Huizhi; Zhang, Yue; Bai, Jing; Luo, Jianwen
2014-09-01
It is a challenging problem to resolve and identify drug (or non-specific fluorophore) distribution throughout the whole body of small animals in vivo. In this article, an algorithm of unmixing multispectral fluorescence tomography (MFT) images based on independent component analysis (ICA) is proposed to solve this problem. ICA is used to unmix the data matrix assembled by the reconstruction results from MFT. Then the independent components (ICs) that represent spatial structures and the corresponding spectrum courses (SCs) which are associated with spectral variations can be obtained. By combining the ICs with SCs, the recovered MFT images can be generated and fluorophore concentration can be calculated. Simulation studies, phantom experiments and animal experiments with different concentration contrasts and spectrum combinations are performed to test the performance of the proposed algorithm. Results demonstrate that the proposed algorithm can not only provide the spatial information of fluorophores, but also recover the actual reconstruction of MFT images.
Comparative analysis on flexibility requirements of typical Cryogenic Transfer lines
NASA Astrophysics Data System (ADS)
Jadon, Mohit; Kumar, Uday; Choukekar, Ketan; Shah, Nitin; Sarkar, Biswanath
2017-04-01
Cryogenic systems and their applications, primarily in large fusion devices, utilize multiple cryogen transfer lines of various sizes and complexities to transfer cryogenic fluids from the plant to the various users/applications. These transfer lines are composed of various critical sections, i.e. tee sections, elbows, flexible components etc. The mechanical sustainability (under failure circumstances) of these transfer lines is a primary requirement for safe operation of the system and its applications. The transfer lines need to be designed for multiple design constraints such as line layout, support locations and space restrictions. The transfer lines are subjected to single loads and multiple load combinations, such as operational loads, seismic loads, and loads due to a leak in the insulation vacuum [1]. Analytical calculations and flexibility analysis using professional software were performed for a typical transfer line without any flexible components, and the results were analysed for functional and mechanical load conditions. The failure modes were identified along the critical sections. The same transfer line was then refitted with flexible components and analysed again for failure modes. The flexible components provide additional flexibility to the transfer line system and make it safe. The results obtained from the analytical calculations were compared with those obtained from the flexibility analysis software. The sizing and selection of the flexible components were optimized, and components were selected to meet the design requirements as per code.
Design and optimization of a self-deploying PV tent array
NASA Astrophysics Data System (ADS)
Colozza, Anthony J.
A study was performed to design a self-deploying tent-shaped PV (photovoltaic) array and optimize the design for maximum specific power. Each structural component of the design was analyzed to determine the size necessary to withstand the various forces it would be subjected to, and from this analysis the component weights were determined. An optimization was performed to determine the array dimensions and blanket geometry that produce the maximum specific power for a given PV blanket. This optimization was performed for both lunar and Martian environmental conditions. The performance specifications for the array at both locations and with various PV blankets were determined.
Smoothing of the bivariate LOD score for non-normal quantitative traits.
Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John
2005-12-30
Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtotic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.
Preserving subject variability in group fMRI analysis: performance evaluation of GICA vs. IVA
Michael, Andrew M.; Anderson, Mathew; Miller, Robyn L.; Adalı, Tülay; Calhoun, Vince D.
2014-01-01
Independent component analysis (ICA) is a widely applied technique to derive functionally connected brain networks from fMRI data. Group ICA (GICA) and Independent Vector Analysis (IVA) are extensions of ICA that enable users to perform group fMRI analyses; however a full comparison of the performance limits of GICA and IVA has not been investigated. Recent interest in resting state fMRI data with potentially higher degree of subject variability makes the evaluation of the above techniques important. In this paper we compare component estimation accuracies of GICA and an improved version of IVA using simulated fMRI datasets. We systematically change the degree of inter-subject spatial variability of components and evaluate estimation accuracy over all spatial maps (SMs) and time courses (TCs) of the decomposition. Our results indicate the following: (1) at low levels of SM variability or when just one SM is varied, both GICA and IVA perform well, (2) at higher levels of SM variability or when more than one SMs are varied, IVA continues to perform well but GICA yields SM estimates that are composites of other SMs with errors in TCs, (3) both GICA and IVA remove spatial correlations of overlapping SMs and introduce artificial correlations in their TCs, (4) if number of SMs is over estimated, IVA continues to perform well but GICA introduces artifacts in the varying and extra SMs with artificial correlations in the TCs of extra components, and (5) in the absence or presence of SMs unique to one subject, GICA produces errors in TCs and IVA estimates are accurate. In summary, our simulation experiments (both simplistic and realistic) and our holistic analyses approach indicate that IVA produces results that are closer to ground truth and thereby better preserves subject variability. The improved version of IVA is now packaged into the GIFT toolbox (http://mialab.mrn.org/software/gift). PMID:25018704
NASA Astrophysics Data System (ADS)
Durigon, Angelica; Lier, Quirijn de Jong van; Metselaar, Klaas
2016-10-01
Measuring plant transpiration at the canopy scale is laborious, and numerical modelling can be used to estimate it at a high time frequency. The model by Jacobs (1994) needs to be reparametrized when it is used to simulate transpiration of water-stressed plants. We compare the importance of model variables affecting simulated transpiration of water-stressed plants. A systematic literature review was performed to recover existing parameterizations to be tested in the model. Data from a field experiment with common bean under full and deficit irrigation were used to correlate estimates to forcing variables by applying principal component analysis. New parameterizations resulted in a moderate reduction of prediction errors and in an increase in model performance. The A-gs model was sensitive to changes in the mesophyll conductance and leaf angle distribution parameterizations, allowing model improvement. Simulated transpiration could be separated into temporal components. Daily, afternoon depression and long-term components for the fully irrigated treatment were more related to atmospheric forcing variables (specific humidity deficit between stomata and air, relative air humidity and canopy temperature). Daily and afternoon depression components for the deficit-irrigated treatment were related to both atmospheric forcing and soil dryness, and the long-term component was related to soil dryness.
ERIC Educational Resources Information Center
Fernandez-Sainz, A.; García-Merino, J. D.; Urionabarrenetxea, S.
2016-01-01
This paper seeks to discover whether the performance of university students has improved in the wake of the changes in higher education introduced by the Bologna Declaration of 1999 and the construction of the European Higher Education Area. A principal component analysis is used to construct a multi-dimensional performance variable called the…
Habeeb, Christine M; Eklund, Robert C; Coffee, Pete
2017-06-01
This study explored person-related sources of variance in athletes' efficacy beliefs and performances when performing in pairs with distinguishable roles differing in partner dependence. College cheerleaders (n = 102) performed their role in repeated performance trials of two low- and two high-difficulty paired-stunt tasks with three different partners. Data were obtained on self-, other-, and collective efficacy beliefs and subjective performances, and objective performance assessments were obtained from digital recordings. Using the social relations model framework, total variance in each belief/assessment was partitioned, for each role, into numerical components of person-related variance relative to the self, the other, and the collective. Variance component by performance role by task-difficulty repeated-measures analyses of variance revealed that the largest person-related variance component differed by athlete role and increased in size in high-difficulty tasks. Results suggest that the extent to which an athlete's performance depends on a partner relates to the extent to which the partner is a source of self-, other-, and collective efficacy beliefs.
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)
2002-01-01
In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory architectures.
Parts and Components Reliability Assessment: A Cost Effective Approach
NASA Technical Reports Server (NTRS)
Lee, Lydia
2009-01-01
System reliability assessment is a methodology which incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA) and Fault Tree Analysis (FTA), to assess risks, perform design tradeoffs, and therefore ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by the United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development, hard failure data are not yet available, or manufacturers are not contractually obliged by their customers to publish the reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success in an efficient manner, at low cost, and on a tight schedule.
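For illustration, a minimal sketch of a handbook-style reliability prediction roll-up: sum the published part failure rates of a series system and convert to MTBF and mission reliability under an exponential (constant failure rate) assumption. The part list and failure rate values below are hypothetical placeholders, not values from any particular standard.

    import math

    # Hypothetical failure rates in failures per million hours (FPMH),
    # the form in which handbook estimates are typically tabulated
    part_failure_rates_fpmh = {
        "microcontroller": 0.12,
        "dc_dc_converter": 0.35,
        "relay": 0.80,
        "connector": 0.05,
    }

    mission_hours = 8760  # one year of continuous operation

    # Series assumption: system failure rate is the sum of part failure rates
    lambda_sys = sum(part_failure_rates_fpmh.values()) / 1e6  # failures per hour
    mtbf_hours = 1.0 / lambda_sys
    reliability = math.exp(-lambda_sys * mission_hours)

    print(f"system MTBF : {mtbf_hours:,.0f} h")
    print(f"R(1 year)   : {reliability:.4f}")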
Conceptual design and analysis of a dynamic scale model of the Space Station Freedom
NASA Technical Reports Server (NTRS)
Davis, D. A.; Gronet, M. J.; Tan, M. K.; Thorne, J.
1994-01-01
This report documents the conceptual design study performed to evaluate design options for a subscale dynamic test model which could be used to investigate the expected on-orbit structural dynamic characteristics of the Space Station Freedom early build configurations. The baseline option was a 'near-replica' model of the SSF SC-7 pre-integrated truss configuration. The approach used to develop conceptual design options involved three sets of studies: evaluation of the full-scale design and analysis databases, conducting scale factor trade studies, and performing design sensitivity studies. The scale factor trade study was conducted to develop a fundamental understanding of the key scaling parameters that drive design, performance and cost of a SSF dynamic scale model. Four scale model options were estimated: 1/4, 1/5, 1/7, and 1/10 scale. Prototype hardware was fabricated to assess producibility issues. Based on the results of the study, a 1/4-scale size is recommended based on the increased model fidelity associated with a larger scale factor. A design sensitivity study was performed to identify critical hardware component properties that drive dynamic performance. A total of 118 component properties were identified which require high-fidelity replication. Lower fidelity dynamic similarity scaling can be used for non-critical components.
Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M
2007-01-01
We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus. PMID:18466597
Probabilistic finite elements for fracture and fatigue analysis
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.
1989-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in the element is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack-tip. Performance and accuracy of the method are demonstrated on a classical mode 1 fatigue problem.
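For illustration, a minimal Monte Carlo sketch of the stated performance function (fatigue life must exceed service life). A closed-form Paris-law crack-growth life with random inputs stands in for the enriched-element PFEM solution, and every parameter value below is a hypothetical placeholder.

    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 100_000

    # Hypothetical random inputs (units: MPa, m, cycles)
    C = rng.lognormal(mean=np.log(3e-12), sigma=0.3, size=n_samples)  # Paris coefficient, (m/cycle)/(MPa*sqrt(m))**m
    m = 3.0                                                           # Paris exponent
    delta_sigma = rng.normal(120.0, 10.0, size=n_samples)             # stress range [MPa]
    a0, af = 1e-3, 10e-3                                              # initial / final crack size [m]
    Y = 1.12                                                          # geometry factor

    # Paris-law cycles to grow a crack from a0 to af (valid for m != 2)
    dK_const = Y * delta_sigma * np.sqrt(np.pi)
    N_f = (af**(1 - m / 2) - a0**(1 - m / 2)) / ((1 - m / 2) * C * dK_const**m)

    service_life = 5e5                     # required service life [cycles]
    g = N_f - service_life                 # performance function: failure when g < 0
    print(f"estimated probability of fatigue failure: {np.mean(g < 0):.4f}")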
Iris recognition based on robust principal component analysis
NASA Astrophysics Data System (ADS)
Karn, Pradeep; He, Xiao Hai; Yang, Shuai; Wu, Xiao Hong
2014-11-01
Iris images acquired under different conditions often suffer from blur, occlusion due to eyelids and eyelashes, specular reflection, and other artifacts. Existing iris recognition systems do not perform well on these types of images. To overcome these problems, we propose an iris recognition method based on robust principal component analysis. The proposed method decomposes all training images into a low-rank matrix and a sparse error matrix, where the low-rank matrix is used for feature extraction. The sparsity concentration index approach is then applied to validate the recognition result. Experimental results using the CASIA V4 and IIT Delhi V1 iris image databases showed that the proposed method achieved competitive performance in both recognition accuracy and computational efficiency.
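For illustration, a minimal sketch of the low-rank plus sparse decomposition step (robust PCA via principal component pursuit, solved with a basic inexact augmented Lagrangian scheme). This is a generic solver, not the authors' implementation; the toy rank-1 data and occlusion noise stand in for a matrix of vectorized training iris images.

    import numpy as np

    def rpca_pcp(D, lam=None, mu=None, tol=1e-7, max_iter=500):
        """Robust PCA via principal component pursuit (inexact ALM):
        decomposes D into a low-rank matrix L and a sparse error matrix S."""
        m, n = D.shape
        if lam is None:
            lam = 1.0 / np.sqrt(max(m, n))
        if mu is None:
            mu = 0.25 * m * n / np.abs(D).sum()
        Y = D / max(np.linalg.norm(D, 2), np.abs(D).max() / lam)   # dual variable init
        L = np.zeros_like(D)
        S = np.zeros_like(D)
        norm_D = np.linalg.norm(D, "fro")
        for _ in range(max_iter):
            # Singular value thresholding step for the low-rank component
            U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
            sig = np.maximum(sig - 1.0 / mu, 0.0)
            L = (U * sig) @ Vt
            # Soft thresholding step for the sparse component
            R = D - L + Y / mu
            S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
            Y = Y + mu * (D - L - S)
            if np.linalg.norm(D - L - S, "fro") / norm_D < tol:
                break
        return L, S

    # Hypothetical usage: rows stand in for vectorized training iris images
    rng = np.random.default_rng(0)
    clean = np.outer(rng.standard_normal(100), rng.standard_normal(400))        # rank-1 structure
    noise = (rng.random((100, 400)) < 0.05) * rng.normal(0, 5, (100, 400))      # sparse occlusions
    L, S = rpca_pcp(clean + noise)
    print("relative low-rank recovery error:", np.linalg.norm(L - clean) / np.linalg.norm(clean))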
Warsito, Warsito; Palungan, Maimunah Hindun; Utomo, Edy Priyo
2017-01-01
Essential oils consist of complex mixtures of components, which can be divided into major and minor components. This study therefore examines the distribution of major and minor components in kaffir lime oil by means of fractional distillation. Fractional distillation and distributional analysis of components within fractions were performed on kaffir lime oil (Citrus hystrix DC.). Fractional distillation was performed using a PiloDist 104-VTU with a column length of 2 m (120 plates); the system pressure was set to 5 and 10 mBar, the reflux ratio was varied at 10/10, 20/10 and 60/10, and the chemical composition analysis was done by GC-MS. The distilled lime oil, obtained from a mix of twigs and leaves, contained 20 compounds, with five main components: β-citronellal (46.40%), L-linalool (13.11%), β-citronellol (11.03%), citronellyl acetate (6.76%) and sabinene (5.91%). The optimum conditions for fractional distillation were obtained at 5 mBar pressure with a reflux ratio of 10/10. β-Citronellal and L-linalool were distributed from fraction 1 to fraction 9, hydrocarbon monoterpene components were present only in fractions 1 to 4, while oxygenated monoterpene components dominated fractions 5 to 9. The highest levels were 84.86% for β-citronellal (fraction 7), 20.13% for L-linalool (fraction 5) and 19.83% for sabinene (fraction 1), while 4-terpineol, β-citronellol and citronellyl acetate reached 7.16%, 12.27% and 5.22%, respectively (fraction 9).
ECOPASS - a multivariate model used as an index of growth performance of poplar clones
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ceulemans, R.; Impens, I.
The model (ECOlogical PASSport) reported was constructed by principal component analysis from a combination of biochemical, anatomical/morphological and ecophysiological gas exchange parameters measured on 5 fast-growing poplar clones. Productivity data were obtained from 10 selected trees in 3 plantations in Belgium and given as m.a.i.(b.a.). The model is shown to be able to reflect not only the genetic origin and the relative effects of the different parameters of the clones, but also their production potential. Multiple regression analysis of the 4 principal components showed a high cumulative correlation (96%) between productivity and the 3 components related to ecophysiological, biochemical and morphological parameters; the ecophysiological component alone correlated 85% with productivity.
Physician performance assessment using a composite quality index.
Liu, Kaibo; Jain, Shabnam; Shi, Jianjun
2013-07-10
Assessing physician performance is important for the purposes of measuring and improving quality of service and reducing healthcare delivery costs. In recent years, physician performance scorecards have been used to provide feedback on individual measures; however, one key challenge is how to develop a composite quality index that combines multiple measures for overall physician performance evaluation. Controversy arises over establishing appropriate weights to combine indicators across multiple dimensions, and it cannot be easily resolved. In this study, we proposed a generic unsupervised learning approach to develop a single composite index for physician performance assessment by using non-negative principal component analysis. We developed a new algorithm named iterative quadratic programming to solve the numerical issue in the non-negative principal component analysis approach. We conducted real case studies to demonstrate the performance of the proposed method. We provided interpretations from both statistical and clinical perspectives to evaluate the developed composite ranking score in practice. In addition, we implemented root cause assessment techniques to explain physician performance for improvement purposes. Copyright © 2012 John Wiley & Sons, Ltd.
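For illustration, a minimal sketch of the idea of non-negative principal component weights for a composite index. The paper's iterative quadratic programming algorithm is not reproduced here; instead a simple projected power iteration enforces non-negative loadings, and the physician score matrix is simulated.

    import numpy as np

    def nonnegative_pc1(X, n_iter=500, tol=1e-9):
        """First principal component with non-negative loadings via projected
        power iteration on the covariance matrix (a simple heuristic, not the
        iterative quadratic programming algorithm of the paper)."""
        Xc = X - X.mean(axis=0)
        C = np.cov(Xc, rowvar=False)
        w = np.ones(C.shape[0]) / np.sqrt(C.shape[0])
        for _ in range(n_iter):
            w_new = np.maximum(C @ w, 0.0)      # enforce non-negativity
            norm = np.linalg.norm(w_new)
            if norm == 0:
                break
            w_new /= norm
            if np.linalg.norm(w_new - w) < tol:
                w = w_new
                break
            w = w_new
        return w

    # Hypothetical usage: rows = physicians, columns = standardized quality measures
    rng = np.random.default_rng(3)
    scores = rng.normal(size=(200, 6)) + rng.normal(size=(200, 1))  # shared quality factor
    weights = nonnegative_pc1(scores)
    composite_index = (scores - scores.mean(0)) @ weights
    print("non-negative weights:", weights.round(3))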
Non-Double-Couple Component Analysis of Induced Microearthquakes in the Val D'Agri Basin (Italy)
NASA Astrophysics Data System (ADS)
Roselli, P.; Improta, L.; Saccorotti, G.
2017-12-01
In recent years it has become accepted that earthquake sources can attain significant Non-Double-Couple (NDC) components. Among the driving factors of deviation from normal double-couple (DC) mechanisms are the opening/closing of fracture networks and the activation of pre-existing faults by pore fluid pressure perturbations. This observation makes the thorough analysis of source mechanisms of key importance for the understanding of withdrawal/injection induced seismicity from geothermal and hydrocarbon reservoirs, as well as of water reservoir induced seismicity. In addition to the DC component, the seismic moment tensor can be decomposed into isotropic (ISO) and compensated linear vector dipole (CLVD) components. In this study we performed a careful analysis of the seismic moment tensor of induced microseismicity recorded in the Val d'Agri (Southern Apennines, Italy), focusing our attention on the NDC component. The Val d'Agri is a Quaternary extensional basin that hosts the largest onshore European oil field and a water reservoir (Pertusillo Lake impoundment) characterized by severe seasonal level oscillations. Our input data-set includes swarm-type induced micro-seismicity recorded between 2005 and 2006 by a high-performance network and accurately localized by a reservoir-scale local earthquake tomography. We analyze two different seismicity clusters: (i) a swarm of 69 earthquakes with 0.3 ≤ ML ≤ 1.8 induced by a wastewater disposal well of the oilfield during the initial daily injection tests (10 days); (ii) 526 earthquakes with -0.2 ≤ ML ≤ 2.7 induced by seasonal volume changes of the artificial lake. We perform the seismic moment tensor inversion by using the HybridMT code. After a very accurate signal-to-noise selection and manual picking of P-pulses, we obtain %DC, %ISO, %CLVD for each event. DC and NDC components are analyzed and compared with the spatio-temporal distribution of seismicity, the local stress field, the injection parameters and the water level in the impoundment. We find significant NDC components and abrupt temporal variations in the %DC and %ISO components that appear linked to the extremely variable parameters of the injection tests into the disposal well.
Evaluation of Parallel Analysis Methods for Determining the Number of Factors
ERIC Educational Resources Information Center
Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.
2010-01-01
Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
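For illustration, a minimal sketch of PA-PCA with both retention criteria mentioned above (mean and 95th percentile of the random-data eigenvalues). The PA-PAF variant would instead compare eigenvalues from principal axis factoring of the reduced correlation matrix, which is not shown; the example data are simulated with two underlying factors.

    import numpy as np

    def _leading_count(observed, criterion):
        """Number of leading eigenvalues exceeding the criterion (stop at first failure)."""
        k = 0
        for o, c in zip(observed, criterion):
            if o > c:
                k += 1
            else:
                break
        return k

    def parallel_analysis_pca(X, n_sim=200, percentile=95, seed=0):
        """PA-PCA: compare observed correlation-matrix eigenvalues against
        eigenvalues of random normal data of the same size."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
        sim = np.empty((n_sim, p))
        for b in range(n_sim):
            R = np.corrcoef(rng.standard_normal((n, p)), rowvar=False)
            sim[b] = np.sort(np.linalg.eigvalsh(R))[::-1]
        return {
            "mean criterion": _leading_count(obs, sim.mean(axis=0)),
            f"{percentile}th percentile criterion": _leading_count(
                obs, np.percentile(sim, percentile, axis=0)),
        }

    # Hypothetical example: 8 variables generated from 2 underlying factors
    rng = np.random.default_rng(1)
    F = rng.standard_normal((300, 2))
    X = F @ rng.standard_normal((2, 8)) + rng.standard_normal((300, 8))
    print(parallel_analysis_pca(X))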
[Does carbonate originate from carbonate-calcium crystal component of the human urinary calculus?].
Yuzawa, Masayuki; Nakano, Kazuhiko; Kumamaru, Takatoshi; Nukui, Akinori; Ikeda, Hitoshi; Suzuki, Kazumi; Kobayashi, Minoru; Sugaya, Yasuhiro; Morita, Tatsuo
2008-09-01
Confirming the components of a urinary calculus provides important information for selecting the appropriate treatment for urolithiasis. At present, component analysis of urinary calculi is generally performed by infrared spectroscopy, which is employed by companies providing laboratory testing services in Japan. Infrared spectroscopy determines the molecular components from the absorption spectra arising from atomic vibrations. It has the drawback that an accurate crystal structure cannot be determined, in contrast to the X-ray diffraction method, which analyzes the crystal constituents based on the diffraction of X-rays by the crystal lattice. The components of urinary calculi that include carbonate are carbonate apatite and calcium carbonate such as calcite. Although the latter is reported to be a very rare component of human urinary calculi, results from infrared spectroscopy often indicate that calcium carbonate is included in a calculus. Infrared spectroscopy can confirm the existence of carbonate but cannot determine whether the carbonate originates from carbonate apatite or calcium carbonate. Thus, it is not clear whether calcium carbonate is actually present as a component of human urinary calculi in Japan. In this study, we examined human calculi containing carbonate by X-ray structural analysis in order to elucidate the origin of the carbonate in human urinary calculi. We examined 17 human calculi which were reported to contain calcium carbonate by infrared spectroscopy performed in the clinical laboratory. Fifteen calculi were obtained from the urinary tract, and two were from the gall bladder. The stones were analyzed by the X-ray powder method after being finely crushed. The reports from the clinical laboratory stated that all urinary calculi consisted of calcium carbonate and calcium phosphate, while the gallstones consisted of calcium carbonate. However, the components of all urinary calculi were revealed to be carbonate apatite by X-ray diffraction. The components of the gallstones were shown to be calcium carbonate (one calcite and the other aragonite) not only by infrared spectroscopy but also by X-ray diffraction. Component analysis of calculi can thus be performed more accurately by adding the X-ray diffraction method to infrared spectroscopy. Calcium carbonate was shown to exist in a gallstone. As for the carbonate in human urinary calculi, the present study showed that it originated not from calcium carbonate but from carbonate apatite.
NASA Astrophysics Data System (ADS)
Shokravi, H.; Bakhary, NH
2017-11-01
Subspace System Identification (SSI) is considered one of the most reliable tools for identification of system parameters. Performance of an SSI scheme is considerably affected by the structure of the associated identification algorithm. The weight matrix is a variable in SSI that is used to reduce the dimensionality of the state-space equation. Generally one of the weight matrices of Principal Component (PC), Unweighted Principal Component (UPC) and Canonical Variate Analysis (CVA) is used in the structure of an SSI algorithm. An increasing number of studies in the field of structural health monitoring are using SSI for damage identification. However, studies that evaluate the performance of the weight matrices, particularly in association with accuracy, noise resistance, and time complexity properties, are very limited. In this study, the accuracy, noise-robustness, and time-efficiency of the weight matrices are compared using different qualitative and quantitative metrics. Three evaluation metrics of pole analysis, fit values and elapsed time are used in the assessment process. A numerical model of a mass-spring-dashpot and operational data are used in this research paper. It is observed that the principal components obtained using the PC algorithm are more robust against noise uncertainty and give more stable results for the pole distribution. Furthermore, higher estimation accuracy is achieved using the UPC algorithm. CVA had the worst performance for pole analysis and time efficiency analysis. The superior performance of the UPC algorithm in elapsed time is attributed to using unit weight matrices. The obtained results demonstrate that the process of reducing dimensionality in CVA and PC does not enhance time efficiency but yields improved modal identification in PC.
Liver DCE-MRI Registration in Manifold Space Based on Robust Principal Component Analysis.
Feng, Qianjin; Zhou, Yujia; Li, Xueli; Mei, Yingjie; Lu, Zhentai; Zhang, Yu; Feng, Yanqiu; Liu, Yaqin; Yang, Wei; Chen, Wufan
2016-09-29
A technical challenge in the registration of dynamic contrast-enhanced magnetic resonance (DCE-MR) imaging in the liver is intensity variations caused by contrast agents. Such variations lead to the failure of the traditional intensity-based registration method. To address this problem, a manifold-based registration framework for liver DCE-MR time series is proposed. We assume that liver DCE-MR time series are located on a low-dimensional manifold and determine intrinsic similarities between frames. Based on the obtained manifold, the large deformation of two dissimilar images can be decomposed into a series of small deformations between adjacent images on the manifold through gradual deformation of each frame to the template image along the geodesic path. Furthermore, manifold construction is important in automating the selection of the template image, which is an approximation of the geodesic mean. Robust principal component analysis is performed to separate motion components from intensity changes induced by contrast agents; the components caused by motion are used to guide registration in eliminating the effect of contrast enhancement. Visual inspection and quantitative assessment are further performed on clinical dataset registration. Experiments show that the proposed method effectively reduces movements while preserving the topology of contrast-enhancing structures and provides improved registration performance.
Three dimensional tracking with misalignment between display and control axes
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Tyler, Mitchell; Kim, Won S.; Stark, Lawrence
1992-01-01
Human operators confronted with misaligned display and control frames of reference performed three dimensional, pursuit tracking in virtual environment and virtual space simulations. Analysis of the components of the tracking errors in the perspective displays presenting virtual space showed that components of the error due to visual motor misalignment may be linearly separated from those associated with the mismatch between display and control coordinate systems. Tracking performance improved with several hours practice despite previous reports that such improvement did not take place.
1985-01-01
components must also perform accurately if control is to be accurate, tests were made to determine if these components were likely to introduce more... efficient. However, it also greatly increases the complexity of the control systems, since room temperature measurements must be made for each zone, with... involving a psychrometer (a dry-bulb and a wet-bulb mercury thermometer) provides only a rough indication. Calibration is time-consuming and only partly
Early warning reporting categories analysis of recall and complaints data.
DOT National Transportation Integrated Search
2001-12-31
This analysis was performed to assist the National Highway Traffic Safety Administration (NHTSA) in identifying components and systems to be included in early warning reporting (EWR) categories that would be based upon historical safety-related recal...
Evolution of optical fibre cabling components at CERN: Performance and technology trends analysis
NASA Astrophysics Data System (ADS)
Shoaie, Mohammad Amin; Meroli, Stefano; Machado, Simao; Ricci, Daniel
2018-05-01
The CERN optical fibre infrastructure has been growing constantly over the past decade due to ever increasing connectivity demands. The provisioning plan and fibre installation of this vast laboratory is performed by the Fibre Optics and Cabling Section of the Engineering Department. In this paper we analyze the procurement data for essential fibre cabling components during a five-year interval to extract the existing trends and anticipate future directions. The analysis predicts a high contribution of the LC connector and an increasing usage of multi-fibre connectors. It is foreseen that single-mode fibres will become the main fibre type for mid- and long-range installations, while air blowing will be the major installation technique. Performance assessment of various connectors shows that the expanded beam ferrule is favored for emerging on-board optical interconnections thanks to its scalable density and stable return loss.
Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz
2010-08-06
Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multimode factor analysis, the Parallel factor analysis and the Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed as are the substantive multimode component models obtained.
High Performance Descriptive Semantic Analysis of Semantic Graph Databases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan
As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
A CAD Approach to Integrating NDE With Finite Element
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Downey, James; Ghosn, Louis J.; Baaklini, George Y.
2004-01-01
Nondestructive evaluation (NDE) is one of several technologies applied at NASA Glenn Research Center to determine atypical deformities, cracks, and other anomalies experienced by structural components. NDE consists of applying high-quality imaging techniques (such as x-ray imaging and computed tomography (CT)) to discover hidden manufactured flaws in a structure. Efforts are in progress to integrate NDE with the finite element (FE) computational method to perform detailed structural analysis of a given component. This report presents the core outlines for an in-house technical procedure that incorporates this combined NDE-FE interrelation. An example is presented to demonstrate the applicability of this analytical procedure. FE analysis of a test specimen is performed, and the resulting von Mises stresses and the stress concentrations near the anomalies are observed, which indicates the fidelity of the procedure. Additional information elaborating on the steps needed to perform such an analysis is clearly presented in the form of mini step-by-step guidelines.
Students' Perceptions of Teaching and Learning Practices: A Principal Component Approach
ERIC Educational Resources Information Center
Mukorera, Sophia; Nyatanga, Phocenah
2017-01-01
Students' attendance and engagement with teaching and learning practices is perceived as a critical element for academic performance. Even with stipulated attendance policies, students still choose not to engage. The study employed a principal component analysis to analyze first- and second-year students' perceptions of the importance of the 12…
Modeling of power electronic systems with EMTP
NASA Technical Reports Server (NTRS)
Tam, Kwa-Sur; Dravid, Narayan V.
1989-01-01
In view of the potential impact of power electronics on power systems, there is a need for a computer modeling/analysis tool to perform simulation studies on power systems with power electronic components, as well as to educate engineering students about such systems. The modeling of the major power electronic components of the NASA Space Station Freedom Electric Power System with the ElectroMagnetic Transients Program (EMTP) is described, and it is demonstrated that EMTP can serve as a very useful tool for teaching, design, analysis, and research in the area of power systems with power electronic components. EMTP modeling of power electronic circuits is described and simulation results are presented.
NASA Technical Reports Server (NTRS)
Gale, R. L.; Nease, A. W.; Nelson, D. J.
1978-01-01
Computer program mathematically describes complete hydraulic systems to study their dynamic performance. Program employs subroutines that simulate components of hydraulic system, which are then controlled by main program. Program is useful to engineers working with detailed performance results of aircraft, spacecraft, or similar hydraulic systems.
Adiabatic diesel engine component development: Reference engine for on-highway applications
NASA Technical Reports Server (NTRS)
Hakim, Nabil S.
1986-01-01
The main objectives were to select an advanced low heat rejection diesel reference engine (ADRE) and to carry out systems analysis and design. The ADRE concept selection consisted of: (1) rated point performance optimization; (2) study of various exhaust energy recovery scenarios; (3) components, systems and engine configuration studies; and (4) life cycle cost estimates of the ADRE economic worth. The resulting ADRE design proposed a reciprocator with many advanced features for the 1995 technology demonstration time frame. These included ceramic air gap insulated hot section structural components, high temperature tribology treatments, nonmechanical (camless) valve actuation systems, and elimination of the cylinder head gasket. ADRE system analysis and design resulted in more definition of the engine systems. These systems include: (1) electro-hydraulic valve actuation, (2) electronic common rail injection system; (3) engine electronic control; (4) power transfer for accessory drives and exhaust energy recovery systems; and (5) truck installation. Tribology and performance assessments were also carried out. Finite element and probability of survival analyses were undertaken for the ceramic low heat rejection component.
NASA Astrophysics Data System (ADS)
Chen, Shuming; Wang, Dengfeng; Liu, Bo
This paper investigates the optimization of the sound package thickness for a passenger automobile. The major performance indexes selected to evaluate the process are the SPL of the exterior noise and the weight of the sound package; the corresponding design parameters are the thickness of the glass wool with aluminum foil for the first layer, the thickness of the glass fiber for the second layer, and the thickness of the PE foam for the third layer. Because the process fundamentally involves multiple performance characteristics, grey relational analysis, which utilizes the grey relational grade as a performance index, is employed to determine the optimal combination of the thickness of the different layers for the designed sound package. Additionally, in order to evaluate the weighting values corresponding to the various performance characteristics, principal component analysis is used to represent their relative importance properly and objectively. The results of the confirmation experiments show that grey relational analysis coupled with principal component analysis can successfully be applied to find the optimal combination of the thickness for each layer of the sound package material. Therefore, the presented method can be an effective tool to reduce vehicle exterior noise and lower the weight of the sound package. In addition, it will also be helpful for other applications in the automotive industry, such as the First Automobile Works in China, Changan Automobile in China, etc.
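For illustration, a minimal sketch of grey relational analysis with weights taken from the first principal component of the normalized responses. The trial matrix (exterior SPL and package weight for a few thickness combinations) is entirely hypothetical, and the distinguishing coefficient is fixed at the conventional 0.5.

    import numpy as np

    # Hypothetical experiment matrix: rows = thickness combinations (trials),
    # columns = responses [exterior SPL in dB(A), package weight in kg]
    responses = np.array([
        [68.2, 4.1],
        [66.5, 4.8],
        [67.1, 4.4],
        [65.9, 5.3],
        [66.8, 4.6],
    ])

    # Both responses are smaller-the-better: normalize to [0, 1]
    norm = (responses.max(axis=0) - responses) / (responses.max(axis=0) - responses.min(axis=0))

    # Grey relational coefficients (distinguishing coefficient zeta = 0.5)
    delta = 1.0 - norm
    zeta = 0.5
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    # Weights from the first principal component of the normalized responses
    eigval, eigvec = np.linalg.eigh(np.cov(norm, rowvar=False))
    w = np.abs(eigvec[:, -1])          # loadings of the largest-eigenvalue component
    w = w / w.sum()

    grade = grc @ w                    # grey relational grade per trial
    print("grey relational grades:", grade.round(3))
    print("best trial:", int(np.argmax(grade)) + 1)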
A system-level view of optimizing high-channel-count wireless biosignal telemetry.
Chandler, Rodney J; Gibson, Sarah; Karkare, Vaibhav; Farshchi, Shahin; Marković, Dejan; Judy, Jack W
2009-01-01
In this paper we perform a system-level analysis of a wireless biosignal telemetry system. We perform an analysis of each major system component (e.g., analog front end, analog-to-digital converter, digital signal processor, and wireless link), in which we consider physical, algorithmic, and design limitations. Since there is a wide range of applications for wireless biosignal telemetry systems, each with its own unique set of requirements for key parameters (e.g., channel count, power dissipation, noise level, number of bits, etc.), our analysis is equally broad. The net result is a set of plots in which the power dissipation of each component, and of the system as a whole, is plotted as a function of the number of channels for different architectural strategies. These results are also compared to existing implementations of complete wireless biosignal telemetry systems.
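For illustration, a minimal sketch of the kind of system-level power roll-up such plots are built from: per-component power models summed as a function of channel count under two architectural strategies. Every coefficient below (per-channel front-end power, ADC sharing, radio energy per bit, sample rate, resolution) is a made-up placeholder, not a value from the paper.

    import numpy as np

    def system_power_uw(n_channels, adc_arch="shared"):
        """Roll up per-component power (microwatts) as a function of channel count.
        All coefficients are hypothetical placeholders, not measured values."""
        afe = 5.0 * n_channels                     # analog front end scales per channel
        if adc_arch == "shared":                   # one multiplexed ADC
            adc = 20.0 + 0.5 * n_channels
        else:                                      # one ADC per channel
            adc = 8.0 * n_channels
        dsp = 2.0 * n_channels                     # on-chip spike detection / compression
        # uplink: 20 kS/s x 10 bit per channel, ~0.1 uW per kb/s of raw data
        radio = 50.0 + 0.1 * (n_channels * 20_000 * 10 * 1e-3)
        return afe + adc + dsp + radio

    for n in (16, 64, 256, 1024):
        print(n, round(system_power_uw(n)), round(system_power_uw(n, "per_channel")))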
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
Gosetti, Fabio; Chiuminatto, Ugo; Mazzucco, Eleonora; Mastroianni, Rita; Marengo, Emilio
2015-01-15
The study investigates the sunlight photodegradation process of carminic acid, a natural red colourant used in beverages. For this purpose, both carminic acid aqueous standard solutions and sixteen different commercial beverages, ten containing carminic acid and six containing E120 dye, were subjected to photoirradiation. The results show different patterns of degradation, not only between the standard solutions and the beverages, but also from beverage to beverage. Due to the different beverage recipes, unpredictable reactions take place between the dye and the other ingredients. To identify the dye degradation products in a very complex scenario, a methodology was used, based on the combined use of principal component analysis with discriminant analysis and ultra-high-performance liquid chromatography coupled with tandem high resolution mass spectrometry. The methodology is unaffected by beverage composition and allows the degradation products of carminic acid dye to be identified for each beverage. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Huang, Jian; Yuen, Pong C.; Chen, Wen-Sheng; Lai, J. H.
2005-05-01
Many face recognition algorithms/systems have been developed in the last decade, and excellent performance has been reported when there is a sufficient number of representative training samples. In many real-life applications such as passport identification, only one well-controlled frontal sample image is available for training. Under this situation, the performance of existing algorithms will degrade dramatically, or they may not even be applicable. We propose a component-based linear discriminant analysis (LDA) method to solve the one training sample problem. The basic idea of the proposed method is to construct local facial feature component bunches by moving each local feature region in four directions. In this way, we not only generate more samples with lower dimension than the original image, but also account for the face detection localization error while training. After that, we propose a subspace LDA method, which is tailor-made for a small number of training samples, for the local feature projection to maximize the discrimination power. Theoretical analysis and experimental results show that our proposed subspace LDA is efficient and overcomes the limitations in existing LDA methods. Finally, we combine the contributions of each local component bunch with a weighted combination scheme to draw the recognition decision. The FERET database is used to evaluate the proposed method, and the results are encouraging.
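For illustration, a minimal sketch of the component-bunch idea: for one local facial feature region, crops shifted in four directions expand the single training sample per person into a small bunch, which is then fed to an LDA classifier. The images and window coordinates are synthetic stand-ins, standard scikit-learn LDA is used in place of the paper's subspace LDA, and no weighted combination across components is shown.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(7)
    n_people, img_h, img_w = 20, 32, 32
    images = rng.random((n_people, img_h, img_w))   # one training image per person

    # Hypothetical local component window (e.g., around an eye region)
    top, left, h, w = 8, 6, 8, 10
    shifts = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]  # original + four directions

    X, y = [], []
    for pid, img in enumerate(images):
        for dy, dx in shifts:                        # "bunch" of shifted crops
            crop = img[top + dy: top + dy + h, left + dx: left + dx + w]
            X.append(crop.ravel())
            y.append(pid)
    X, y = np.array(X), np.array(y)

    clf = LinearDiscriminantAnalysis()
    clf.fit(X, y)

    # A probe crop with a small localization error should still match its person
    probe = images[3][top + 1: top + 1 + h, left: left + w].ravel()
    print("predicted identity:", clf.predict([probe])[0])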
Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.
2009-01-01
Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575
Analysis of radiofrequency discharges in plasma
Kumar, D.; McGlynn, S.P.
1992-08-04
Laser optogalvanic signals in plasma are separated into two components: (1) an ionization rate change component, and (2) a photoacoustic mediated component. This separation of components may be performed even when the two components overlap in time, by measuring time-resolved laser optogalvanic signals in an rf discharge plasma as the rf frequency is varied near the electrical resonance peak of the plasma and associated driving/detecting circuits. A novel spectrometer may be constructed to make these measurements. Such a spectrometer would be useful in better understanding and controlling such processes as plasma etching and plasma deposition. 15 figs.
Optical design and tolerancing of an ophthalmological system
NASA Astrophysics Data System (ADS)
Sieber, Ingo; Martin, Thomas; Yi, Allen; Li, Likai; Rübenach, Olaf
2014-09-01
Tolerance analysis by means of simulation is an essential step in system integration. Tolerance analysis allows for predicting the performance of a system setup of real manufactured parts and for an estimation of the yield with respect to evaluation figures, such as performance requirements, systems specification or cost demands. Currently, optical freeform optics is gaining importance in optical systems design. The performance of freeform optics often strongly depends on the manufacturing accuracy of the surfaces. For this reason, a tolerance analysis with respect to the fabrication accuracy is of crucial importance. The characterization of form tolerances caused by the manufacturing process is based on the definition of straightness, flatness, roundness, and cylindricity. In case of freeform components, however, it is often impossible to define a form deviation by means of this standard classification. Hence, prediction of the impact of manufacturing tolerances on the optical performance is not possible by means of a conventional tolerance analysis. To carry out a tolerance analysis of the optical subsystem, including freeform optics, metrology data of the fabricated surfaces have to be integrated into the optical model. The focus of this article is on design for manufacturability of freeform optics with integrated alignment structures and on tolerance analysis of the optical subsystem based on the measured surface data of manufactured optical freeform components with respect to assembly and manufacturing tolerances. This approach will be reported here using an ophthalmological system as an example.
Analytical Modeling and Performance Prediction of Remanufactured Gearbox Components
NASA Astrophysics Data System (ADS)
Pulikollu, Raja V.; Bolander, Nathan; Vijayakar, Sandeep; Spies, Matthew D.
Gearbox components operate in extreme environments, often leading to premature removal or overhaul. Though worn or damaged, these components still have the ability to function provided the appropriate remanufacturing processes are deployed. Doing so saves a significant amount of resources (time, materials, energy, manpower) otherwise required to produce a replacement part. Unfortunately, current design and analysis approaches require extensive testing and evaluation to validate the effectiveness and safety of a component that has been used in the field and then processed outside of original OEM specification. Testing every possible combination of component, level of potential damage, and repair-processing option would be an expensive and time-consuming feat, thus prohibiting a broad deployment of remanufacturing processes across industry. However, such evaluation and validation can occur through Integrated Computational Materials Engineering (ICME) modeling and simulation. Sentient developed a microstructure-based component life prediction (CLP) tool to quantify and assist the gearbox component remanufacturing process. This was achieved by modeling the design-manufacturing-microstructure-property relationship. The CLP tool assists in the remanufacturing of high-value, high-demand rotorcraft, automotive and wind turbine gears and bearings. This paper summarizes the CLP model development and validation efforts, comparing simulation results with rotorcraft spiral bevel gear physical test data. CLP analyzes gear components and systems for safety, longevity, reliability and cost by predicting (1) new gearbox component performance and optimal time-to-remanufacture, (2) qualification of used gearbox components for the remanufacturing process, and (3) remanufactured component performance.
Appliance of Independent Component Analysis to System Intrusion Analysis
NASA Astrophysics Data System (ADS)
Ishii, Yoshikazu; Takagi, Tarou; Nakai, Kouji
In order to analyze the output of an intrusion detection system and a firewall, we evaluated the applicability of ICA (independent component analysis). We developed a simulator for evaluating intrusion analysis methods. The simulator consists of a network model of an information system, service and vulnerability models of each server, and action models for clients and the intruder. We applied ICA to analyze the audit trail of the simulated information system. We report the evaluation results of ICA for intrusion analysis. In the simulated case, ICA separated two attacks correctly, and related an attack to the abnormalities of normal applications produced under the influence of that attack.
NASA Astrophysics Data System (ADS)
Linke, J.
2006-04-01
The plasma-exposed components in existing and future fusion devices are strongly affected by plasma-material interaction processes. These mechanisms have a strong influence on the plasma performance; in addition, they have a major impact on the lifetime of the plasma facing armour and the joining interface between the plasma facing material (PFM) and the heat sink. Besides physical and chemical sputtering processes, quasi-stationary high heat fluxes during normal operation and intense thermal transients are of serious concern for the engineers who develop reliable wall components. In addition, the material and component degradation due to intense fluxes of energetic neutrons is another critical issue in D-T-burning fusion devices which requires extensive R&D. This paper presents an overview of materials development and joining, the testing of PFMs and components, and the analysis of the neutron irradiation induced degradation.
Cool, Steve; Victor, Jan; De Baets, Thierry
2006-12-01
Fifty unicompartmental knee arthroplasties (UKAs) were performed through a minimally invasive approach and were reviewed with an average follow-up of 3.7 years. This technique leads to reduced access to surgical landmarks. The purpose of this study was to evaluate whether correct component positioning is possible through this less invasive approach. Component positioning, femorotibial alignment and early outcomes were evaluated. We observed perfect tibial component position, but femoral component position was less consistent, especially in the sagittal plane. Femorotibial alignment in the coronal plane was within 2.5 degrees of the desired axis for 80% of the cases. Femoral component position in the sagittal plane was within a 10 degrees range of the ideal for 70% of the cases. The mean IKS Knee Function Score and Knee Score were 89/100 and 91/100 respectively. We observed two polyethylene dislocations, and one revision was performed for progressive patellofemoral arthrosis. According to our data, minimally invasive UKA does not conflict with component positioning although a learning curve needs to be respected, with femoral component positioning as the major obstacle.
NASA Technical Reports Server (NTRS)
Gates, R. M.; Williams, J. E.
1974-01-01
Results are given of analytical studies performed in support of the design, implementation, checkout and use of NASA's dynamic docking test system (DDTS). Included are analyses of simulator components, a list of detailed operational test procedures, a summary of simulator performance, and an analysis and comparison of docking dynamics and loads obtained by test and analysis.
Solar array electrical performance assessment for Space Station Freedom
NASA Technical Reports Server (NTRS)
Smith, Bryan K.; Brisco, Holly
1993-01-01
Electrical power for Space Station Freedom will be generated by large photovoltaic arrays with a beginning of life power requirement of 30.8 kW per array. The solar arrays will operate in a Low Earth Orbit (LEO) over a design life of fifteen years. This paper provides an analysis of the predicted solar array electrical performance over the design life and presents a summary of supporting analysis and test data for the assigned model parameters and performance loss factors. Each model parameter and loss factor is assessed based upon program requirements, component analysis, and test data to date. A description of the LMSC performance model, future test plans, and predicted performance ranges are also given.
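For illustration, a minimal sketch of a multiplicative loss-factor performance model of the kind described: beginning-of-life power multiplied by fixed loss factors and a yearly radiation degradation rate over the fifteen-year design life. The factor names and values are hypothetical placeholders, not the LMSC model parameters.

    # Hypothetical multiplicative loss-factor model for array power over life
    bol_power_kw = 30.8              # beginning-of-life requirement per array
    loss_factors = {
        "wiring_and_diodes": 0.985,
        "uv_darkening": 0.98,
        "micrometeoroid_damage": 0.99,
        "contamination": 0.985,
        "thermal_cycling": 0.99,
    }
    radiation_degradation_per_year = 0.0075   # fractional power loss per year in LEO

    power = bol_power_kw
    for name, factor in loss_factors.items():
        power *= factor                        # apply each fixed loss factor once

    for year in (0, 5, 10, 15):
        p = power * (1.0 - radiation_degradation_per_year) ** year
        print(f"year {year:2d}: {p:5.2f} kW")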
Advanced Self-Calibrating, Self-Repairing Data Acquisition System
NASA Technical Reports Server (NTRS)
Medelius, Pedro J. (Inventor); Eckhoff, Anthony J. (Inventor); Angel, Lucena R. (Inventor); Perotti, Jose M. (Inventor)
2002-01-01
An improved self-calibrating and self-repairing Data Acquisition System (DAS) for use in inaccessible areas, such as onboard spacecraft, capable of autonomously performing required system health checks and failure detection. When required, self-repair is implemented utilizing a "spare parts/tool box" system. The available number of spare components primarily depends upon each component's predicted reliability, which may be determined using Mean Time Between Failures (MTBF) analysis. Failing or degrading components are electronically removed and disabled to reduce power consumption, before being electronically replaced with spare components.
Anomalous neural circuit function in schizophrenia during a virtual Morris water task.
Folley, Bradley S; Astur, Robert; Jagannathan, Kanchana; Calhoun, Vince D; Pearlson, Godfrey D
2010-02-15
Previous studies have reported learning and navigation impairments in schizophrenia patients during virtual reality allocentric learning tasks. The neural bases of these deficits have not been explored using functional MRI despite well-explored anatomic characterization of these paradigms in non-human animals. Our objective was to characterize the differential distributed neural circuits involved in virtual Morris water task performance using independent component analysis (ICA) in schizophrenia patients and controls. Additionally, we present behavioral data in order to derive relationships between brain function and performance, and we have included a general linear model-based analysis in order to exemplify the incremental and differential results afforded by ICA. Thirty-four individuals with schizophrenia and twenty-eight healthy controls underwent fMRI scanning during a block design virtual Morris water task using hidden and visible platform conditions. Independent components analysis was used to deconstruct neural contributions to hidden and visible platform conditions for patients and controls. We also examined performance variables, voxel-based morphometry and hippocampal subparcellation, and regional BOLD signal variation. Independent component analysis identified five neural circuits. Mesial temporal lobe regions, including the hippocampus, were consistently task-related across conditions and groups. Frontal, striatal, and parietal circuits were recruited preferentially during the visible condition for patients, while frontal and temporal lobe regions were more saliently recruited by controls during the hidden platform condition. Gray matter concentrations and BOLD signal in hippocampal subregions were associated with task performance in controls but not patients. Patients exhibited impaired performance on the hidden and visible conditions of the task, related to negative symptom severity. While controls showed coupling between neural circuits, regional neuroanatomy, and behavior, patients activated different task-related neural circuits, not associated with appropriate regional neuroanatomy. GLM analysis elucidated several comparable regions, with the exception of the hippocampus. Inefficient allocentric learning and memory in patients may be related to an inability to recruit appropriate task-dependent neural circuits. Copyright 2009 Elsevier Inc. All rights reserved.
Systems level test and simulation for photonic processing systems
NASA Astrophysics Data System (ADS)
Erteza, I. A.; Stalker, K. T.
1995-08-01
Photonic technology is growing in importance throughout DOD. Programs have been underway in each of the Services to demonstrate the ability of photonics to enhance current electronic performance in several prototype systems, such as the Navy's SLQ-32 radar warning receiver, the Army's multi-role survivable radar and the phased array radar controller for the Airborne Warning and Control System (AWACS) upgrade. Little, though, is known about radiation effects; the component studies do not furnish the information needed to predict overall system performance in a radiation environment. To date, no comprehensive test and analysis program has been conducted to evaluate sensitivity of overall system performance to the radiation environment. The goal of this program is to relate component level effects to system level performance through modeling and testing of a selected optical processing system, and to help direct component testing to items which can directly and adversely affect overall system performance. This report gives a broad overview of the project, highlighting key results.
Grilo, C M
2004-01-01
To examine the factor structure of DSM-IV criteria for obsessive compulsive personality disorder (OCPD) in patients with binge eating disorder (BED). Two hundred and eleven consecutive out-patients with axis I diagnoses of BED were reliably assessed with semi-structured diagnostic interviews. The eight criteria for the OCPD diagnosis were examined with reliability and correlational analyses. Exploratory factor analysis was performed to identify potential components. Cronbach's coefficient alpha for the OCPD criteria was 0.77. Principal components factor analysis with varimax rotation revealed a three-factor solution (rigidity, perfectionism, and miserliness), which accounted for 65% of variance. The DSM-IV criteria for OCPD showed good internal consistency. Exploratory factor analysis, however, revealed three components that may reflect distinct interpersonal, intrapersonal (cognitive), and behavioral features.
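For illustration, a minimal sketch of the two reported computations: Cronbach's alpha over the eight criteria and a rotated factor solution. The binary criterion data are simulated, and scikit-learn's FactorAnalysis with varimax rotation (available in recent scikit-learn versions) stands in for the principal components factoring used in the paper.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    def cronbach_alpha(items):
        """Cronbach's alpha for an items matrix (rows = subjects, cols = items)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_var / total_var)

    # Simulated binary ratings of 8 criteria for 211 patients (3 latent components)
    rng = np.random.default_rng(0)
    latent = rng.standard_normal((211, 3)) @ rng.random((3, 8))
    criteria = (latent + rng.standard_normal((211, 8)) > 0.5).astype(float)

    print("alpha:", round(cronbach_alpha(criteria), 2))

    fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
    fa.fit(criteria)
    print("rotated loadings:\n", np.round(fa.components_.T, 2))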
Liu, Cui-Ting; Zhang, Min; Yan, Ping; Liu, Hai-Chan; Liu, Xing-Yun; Zhan, Ruo-Ting
2016-01-01
Zhengtian pills (ZTPs) are a traditional Chinese medicine (TCM) that has been commonly used to treat headaches. Volatile components of ZTPs extracted by ethyl acetate with an ultrasonic method were analyzed by gas chromatography mass spectrometry (GC-MS). Twenty-two components were identified, accounting for 78.884% of the total components of the volatile oil. The three main components, protocatechuic acid, ferulic acid, and ligustilide, were simultaneously determined using ultra-high performance liquid chromatography coupled with diode array detection (UHPLC-DAD). Baseline separation was achieved on an XB-C18 column with linear gradient elution of methanol-0.2% acetic acid aqueous solution. The UHPLC-DAD method provided good linearity (R² ≥ 0.9992), precision (RSD < 3%), accuracy (100.68-102.69%), and robustness. The UHPLC-DAD/GC-MS method was successfully utilized to analyze the volatile components, protocatechuic acid, ferulic acid, and ligustilide, in 13 batches of ZTPs, and is suitable for discrimination and quality assessment of ZTPs.
Luo, Jinxue; Zhang, Jinsong; Tan, Xiaohui; McDougald, Diane; Zhuang, Guoqiang; Fane, Anthony G; Kjelleberg, Staffan; Cohen, Yehuda; Rice, Scott A
2014-10-01
Biofouling, the combined effect of microorganism and biopolymer accumulation, significantly reduces the process efficiency of membrane bioreactors (MBRs). Here, four biofilm components, alpha-polysaccharides, beta-polysaccharides, proteins and microorganisms, were quantified in MBRs. The biomass of each component was positively correlated with the transmembrane pressure increase in MBRs. Proteins were the most abundant biopolymer in biofilms and showed the fastest rate of increase. The spatial distribution and co-localization analysis of the biofouling components indicated that at least 60% of the extracellular polysaccharide (EPS) components were associated with the microbial cells when the transmembrane pressure (TMP) entered the jump phase, suggesting that the EPS components were either secreted by the biofilm cells or that the deposition of these components facilitated biofilm formation. It is suggested that biofilm formation and the accumulation of EPS are intrinsically coupled, resulting in biofouling and loss of system performance. Therefore, strategies that control biofilm formation on membranes may result in a significant improvement of MBR performance.
NASA Astrophysics Data System (ADS)
Mahmoudishadi, S.; Malian, A.; Hosseinali, F.
2017-09-01
Image processing techniques in the transform domain are employed as analysis tools for enhancing the detection of mineral deposits. Decomposing the image into informative components increases the probability of mineral extraction. In this study, the performance of Principal Component Analysis (PCA) and Independent Component Analysis (ICA) has been evaluated for the visible and near-infrared (VNIR) and shortwave infrared (SWIR) subsystems of ASTER data. Ardestan is located in part of the Central Iranian Volcanic Belt, which hosts many well-known porphyry copper deposits. This research investigated the propylitic and argillic alteration zones and the outer mineralogy zone in part of the Ardestan region. The two approaches were applied to discriminate alteration zones from igneous bedrock using the major absorption features of indicator minerals from the alteration and mineralogy zones within the spectral range of the ASTER bands. Selected principal components (PC2, PC3 and PC6) were used to identify pyrite as well as the argillic and propylitic zones, which stand out from the igneous bedrock in an RGB color composite image. According to the eigenvalues, components 2, 3 and 6 account for 4.26%, 0.9% and 0.09% of the total variance of the data for the Ardestan scene, respectively. For discriminating the alteration and mineralogy zones of the porphyry copper deposit from the bedrock, the corresponding independent components IC2, IC3 and IC6 separate these small fractions of the data variance more accurately than the noisier PCA bands. The results of the ICA method also conform to the locations of the lithological units of the Ardestan region.
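A hedged sketch of the band-decomposition step described above, assuming a synthetic image cube in place of the ASTER VNIR/SWIR bands and scikit-learn's PCA and FastICA; selected components are simply stacked into an RGB composite.

import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(1)
rows, cols, bands = 100, 100, 9             # stand-in for ASTER VNIR+SWIR bands
cube = rng.random((rows, cols, bands))
X = cube.reshape(-1, bands)                  # pixels x bands

pcs = PCA(n_components=bands).fit_transform(X)             # ordered by explained variance
ics = FastICA(n_components=bands, random_state=1).fit_transform(X)

# e.g. components 2, 3 and 6 (1-based) stacked into an RGB composite
rgb = np.stack([pcs[:, i].reshape(rows, cols) for i in (1, 2, 5)], axis=-1)
print(rgb.shape)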
Articular Cartilage of the Human Knee Joint: In Vivo Multicomponent T2 Analysis at 3.0 T
Choi, Kwang Won; Samsonov, Alexey; Spencer, Richard G.; Wilson, John J.; Block, Walter F.; Kijowski, Richard
2015-01-01
Purpose To compare multicomponent T2 parameters of the articular cartilage of the knee joint measured by using multicomponent driven equilibrium single-shot observation of T1 and T2 (mcDESPOT) in asymptomatic volunteers and patients with osteoarthritis. Materials and Methods This prospective study was performed with institutional review board approval and with written informed consent from all subjects. The mcDESPOT sequence was performed in the knee joint of 13 asymptomatic volunteers and 14 patients with osteoarthritis of the knee. Single-component T2 (T2Single), T2 of the fast-relaxing water component (T2F) and of the slow-relaxing water component (T2S), and the fraction of the fast-relaxing water component (FF) of cartilage were measured. Wilcoxon rank-sum tests and multivariate linear regression models were used to compare mcDESPOT parameters between volunteers and patients with osteoarthritis. Receiver operating characteristic analysis was used to assess diagnostic performance with mcDESPOT parameters for distinguishing morphologically normal cartilage from morphologically degenerative cartilage identified at magnetic resonance imaging in eight cartilage subsections of the knee joint. Results Higher cartilage T2Single (P < .001), lower cartilage FF (P < .001), and similar cartilage T2F (P = .079) and T2S (P = .124) values were seen in patients with osteoarthritis compared with those in asymptomatic volunteers. Differences in T2Single and FF remained significant (P < .05) after consideration of age differences between groups of subjects. Diagnostic performance was higher with FF than with T2Single for distinguishing between normal and degenerative cartilage (P < .05), with greater areas under the curve at receiver operating characteristic analysis. Conclusion Patients with osteoarthritis of the knee had significantly higher cartilage T2Single and significantly lower cartilage FF than did asymptomatic volunteers, and receiver operating characteristic analysis results suggested that FF may allow greater diagnostic performance than that with T2Single for distinguishing between normal and degenerative cartilage. © RSNA, 2015 Online supplemental material is available for this article. PMID:26024307
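A minimal sketch of the ROC comparison step, with synthetic values standing in for the per-subsection FF and T2 measurements and the normal/degenerative labels; scikit-learn's roc_auc_score is assumed.

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(13)
degenerative = rng.integers(0, 2, size=100)            # 1 = degenerative subsection (synthetic)
t2_single = 40 + 6 * degenerative + rng.normal(scale=5, size=100)
ff = 0.20 - 0.05 * degenerative + rng.normal(scale=0.03, size=100)

print("AUC T2single:", roc_auc_score(degenerative, t2_single))
print("AUC FF:      ", roc_auc_score(degenerative, -ff))   # lower FF indicates degeneration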
Caprihan, A; Pearlson, G D; Calhoun, V D
2008-08-15
Principal component analysis (PCA) is often used to reduce the dimension of data before applying more sophisticated data analysis methods such as non-linear classification algorithms or independent component analysis. This practice is based on selecting components corresponding to the largest eigenvalues. If the ultimate goal is separation of the data into two groups, then this set of components need not have the most discriminatory power. We measured the distance between two such populations using the Mahalanobis distance and chose the eigenvectors to maximize it, a modified PCA method that we call discriminant PCA (DPCA). DPCA was applied to diffusion tensor-based fractional anisotropy images to distinguish age-matched schizophrenia subjects from healthy controls. The performance of the proposed method was evaluated by the leave-one-out method. We show that, for this fractional anisotropy data set, the classification error with 60 components was close to the minimum error and that the Mahalanobis distance was twice as large with DPCA as with PCA. Finally, by masking the discriminant function with the white matter tracts of the Johns Hopkins University atlas, we identified the left superior longitudinal fasciculus as the tract giving the lowest classification error. In addition, with six optimally chosen tracts the classification error was zero.
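A hedged sketch of the ranking idea behind discriminant PCA: compute ordinary PCA eigenvectors, then order them by a per-component between-group separation score instead of by eigenvalue (synthetic data; this is one simple reading of the method, not the authors' implementation).

import numpy as np

def dpca_ranking(X, y, k):
    """X: samples x features, y: binary labels; return indices of the k
    eigenvectors with the largest between-group separation."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # PCA eigenvectors are the rows of Vt
    scores = Xc @ Vt.T                                   # projections on all components
    a, b = scores[y == 0], scores[y == 1]
    pooled = 0.5 * (a.var(axis=0) + b.var(axis=0)) + 1e-12
    separation = (a.mean(axis=0) - b.mean(axis=0)) ** 2 / pooled   # Mahalanobis-like score per component
    return np.argsort(separation)[::-1][:k]

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 200))        # synthetic stand-in for fractional anisotropy features
y = np.repeat([0, 1], 30)
X[y == 1, 50] += 1.0                  # inject a group difference on one feature
print(dpca_ranking(X, y, k=5))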
Multidisciplinary propulsion simulation using the numerical propulsion system simulator (NPSS)
NASA Technical Reports Server (NTRS)
Claus, Russel W.
1994-01-01
Implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large-scale system tests. The traditional design analysis procedure decomposes the engine into isolated components and focuses attention on a single physical discipline at a time (e.g., fluid or structural dynamics). Consequently, the interactions that naturally occur between components and disciplines can be masked by the limited interactions that occur between the individuals or teams doing the design and must be uncovered during expensive engine testing. This overview will discuss a cooperative effort of NASA, industry, and universities to integrate disciplines, components, and high-performance computing into a Numerical Propulsion System Simulator (NPSS).
Ha, Steven T.K.; Wilkins, Charles L.; Abidi, Sharon L.
1989-01-01
A mixture of closely related Streptomyces fermentation products, antimycin A, is separated, and the components are identified by using reversed-phase high-performance liquid chromatography with directly linked 400-MHz proton nuclear magnetic resonance detection. Analyses of mixtures of three amino acids, alanine, glycine, and valine, are used to determine optimal measurement conditions. Sensitivity increases of as much as a factor of 3 are achieved, at the expense of some loss in chromatographic resolution, by use of an 80-μL NMR cell instead of a smaller 14-μL cell. Analysis of the antimycin A mixture, using the optimal analytical high-performance liquid chromatography/nuclear magnetic resonance conditions, reveals it to consist of at least 10 closely related components.
CARES/Life Software for Designing More Reliable Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.
1997-01-01
Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
Sui, Jing; Adali, Tülay; Pearlson, Godfrey D.; Clark, Vincent P.; Calhoun, Vince D.
2009-01-01
Independent component analysis (ICA) is a promising method that is increasingly used to analyze brain imaging data such as functional magnetic resonance imaging (fMRI), structural MRI, and electroencephalography and has also proved useful for group comparison, e.g., differentiating healthy controls from patients. An advantage of ICA is its ability to identify components that are mixed in an unknown manner. However, ICA is not necessarily robust and optimal in identifying between-group effects, especially in highly noisy situations. Here, we propose a modified ICA framework for multi-group data analysis that incorporates prior information regarding group membership as a constraint into the mixing coefficients. Our approach, called coefficient-constrained ICA (CC-ICA), prioritizes identification of components that show a significant group difference. The performance of CC-ICA is evaluated via synthetic and hybrid data simulations under different hypothesis testing assumptions and signal-to-noise ratios (SNRs). Group analysis is also conducted on real multitask fMRI data. Results show that CC-ICA greatly improves the estimation accuracy of the independent components, especially those that have different patterns for different groups (e.g., patients vs. controls); in addition, it enhances the sensitivity of data extraction to group differences by ranking components by P value or J-divergence more consistently with the ground truth. The proposed algorithm performs quite well for both group-difference detection and multitask fMRI data fusion, which may prove especially important for the identification of relevant disease biomarkers. PMID:19172631
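A hedged sketch of the component-ranking notion described above, assuming synthetic subject-by-voxel data: ordinary FastICA is run and the components are ordered by the group difference in their subject-wise expression; the constrained ICA update itself is not reproduced here.

import numpy as np
from scipy import stats
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n_subj, n_vox = 40, 500
X = rng.normal(size=(n_subj, n_vox))           # subjects x voxels (synthetic)
y = np.repeat([0, 1], n_subj // 2)
X[y == 1, :50] += 0.8                          # group-specific signal

ica = FastICA(n_components=5, random_state=3)
subj_scores = ica.fit_transform(X)             # per-subject expression of each component
p_values = [stats.ttest_ind(subj_scores[y == 0, c], subj_scores[y == 1, c]).pvalue
            for c in range(subj_scores.shape[1])]
print(np.argsort(p_values))                    # components ordered by group difference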
Warsito, Warsito; Palungan, Maimunah Hindun; Utomo, Edy Priyo
2017-01-01
Introduction Essential oil consists of complex components, which can be divided into major and minor components. This study therefore aims to examine the distribution of major and minor components in Kaffir lime oil by using fractional distillation. Fractional distillation and distributional analysis of the components within fractions were performed on kaffir lime oil (Citrus hystrix DC.). Methods Fractional distillation was performed with a PiloDist 104-VTU unit with a column length of 2 m (120 plates); the system pressure was set to 5 and 10 mBar, the reflux ratio was varied over 10/10, 20/10 and 60/10, and the chemical composition analysis was done by GC-MS. The chemical composition of the distilled oil from mixed twigs and leaves comprised 20 compounds, with five main components: β-citronellal (46.40%), L-linalool (13.11%), β-citronellol (11.03%), citronellyl acetate (6.76%) and sabinene (5.91%). Results The optimum conditions for fractional distillation were obtained at 5 mBar pressure with a reflux ratio of 10/10. β-Citronellal and L-linalool were distributed from fraction 1 to fraction 9, hydrocarbon monoterpene components were distributed only in fractions 1 to 4, while oxygenated monoterpene components dominated fractions 5 to 9. Conclusion The highest level of β-citronellal was 84.86% (fraction 7), of L-linalool 20.13% (fraction 5), and of sabinene 19.83% (fraction 1); the levels of 4-terpineol, β-citronellol and citronellyl acetate were 7.16%, 12.27% and 5.22%, respectively (fraction 9). PMID:29187951
Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals
NASA Astrophysics Data System (ADS)
Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam
A power analysis attack is a well-known side-channel attack, but its efficiency is frequently degraded by power components unrelated to the encryption that are included in the signals used for the attack. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts from the measured power signals. Experimental results show that attacks with the preprocessed signals recover correct keys with far fewer signals than conventional power analysis attacks.
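A hedged sketch of the general preprocessing idea, assuming synthetic traces and a hypothetical Hamming-weight power model: the traces are cropped to an assumed encryption-related window before correlating them with the model (the paper's specific extraction method is not reproduced here).

import numpy as np

rng = np.random.default_rng(4)
n_traces, n_samples = 200, 1000
traces = rng.normal(size=(n_traces, n_samples))              # synthetic power traces
hw_model = rng.integers(0, 9, size=n_traces).astype(float)   # hypothetical Hamming weights of an intermediate value
enc_window = slice(400, 600)                                  # assumed encryption-related part of each trace

cropped = traces[:, enc_window]
# correlation of the model with every retained sample point
corr = [np.corrcoef(hw_model, cropped[:, t])[0, 1] for t in range(cropped.shape[1])]
print(max(np.abs(corr)))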
Spectral compression algorithms for the analysis of very large multivariate images
Keenan, Michael R.
2007-10-16
A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
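A minimal sketch of the factored-representation idea, assuming a synthetic pixels-by-channels matrix and a truncated PCA in place of the patented algorithm: analysis and reconstruction are carried out on the small score and loading factors rather than on the full data.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
pixels, channels, k = 4096, 1024, 10
D = rng.random((pixels, channels))              # synthetic multivariate image

pca = PCA(n_components=k)
scores = pca.fit_transform(D)                   # pixels x k  (spatial factors)
loadings = pca.components_                      # k x channels (spectral factors)
D_approx = scores @ loadings + pca.mean_        # reconstruction from k factors
print(D.nbytes / (scores.nbytes + loadings.nbytes))   # rough compression ratio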
Template-directed instrumentation in total knee arthroplasty: cost savings analysis.
Hsu, Andrew R; Gross, Christopher E; Bhatia, Sanjeev; Levine, Brett R
2012-11-01
The use of digital radiography and templating software in total knee arthroplasty (TKA) continues to become more prevalent as the number of procedures performed increases every year. Template-directed instrumentation (TDI) is a novel approach to surgical planning that combines digital templating with limited intraoperative instruments. The purpose of this study was to evaluate the financial implications and radiographic outcomes of using TDI to direct instrumentation during primary TKA. Over a 1-year period, 82 consecutive TKAs using TDI were retrospectively reviewed. Patient demographics and preoperative templated sizes of predicted components were recorded, and OrthoView digital planning software (OrthoView LLC, Jacksonville, Florida) was used to determine the 2 most likely tibial and femoral component sizes for each case. This sizing information was used to direct component vendors to prepare 3 lightweight instrument trays based on these sizes. The sizes of implanted components and the number of total trays required were documented. A cost savings analysis was performed to compare TDI and non-TDI surgical expenses for TKA. In 80 (97%) of 82 cases, the prepared sizes determined by TDI using 3 instrument trays were sufficient. Preoperative templating correctly predicted the tibial and femoral component sizes in 90% and 83% of cases, respectively. The average number of trays used with TDI was 3.0 (range, 3-5 trays) compared with 7.5 (range, 6-9 trays) used in 82 preceding non-TDI TKAs. Based on standard fees to sterilize and package implant trays (approximately $26 based on a survey of 10 orthopedic hospitals performing TKA), approximately $9612 was saved by using TDI over the 1-year study period. Overall, digital templating and TDI were a simple and cost-effective approach when performing primary TKA. Copyright 2012, SLACK Incorporated.
Precision Attitude Determination System (PADS) design and analysis. Two-axis gimbal star tracker
NASA Technical Reports Server (NTRS)
1973-01-01
Development of the Precision Attitude Determination System (PADS) focused chiefly on the two-axis gimballed star tracker and electronics design improved from that of Precision Pointing Control System (PPCS), and application of the improved tracker for PADS at geosynchronous altitude. System design, system analysis, software design, and hardware design activities are reported. The system design encompasses the PADS configuration, system performance characteristics, component design summaries, and interface considerations. The PADS design and performance analysis includes error analysis, performance analysis via attitude determination simulation, and star tracker servo design analysis. The design of the star tracker and electronics are discussed. Sensor electronics schematics are included. A detailed characterization of the application software algorithms and computer requirements is provided.
ERIC Educational Resources Information Center
Soltesz, Fruzsina; Szucs, Denes
2009-01-01
Developmental dyscalculia (DD) still lacks a generally accepted definition. A major problem is that the cognitive component processes contributing to arithmetic performance are still poorly defined. By a reanalysis of our previous event-related brain potential (ERP) data (Soltesz et al., 2007) here our objective was to identify and compare…
Energy efficient engine component development and integration program
NASA Technical Reports Server (NTRS)
1980-01-01
The design of an energy efficient commercial turbofan engine is examined with emphasis on lower fuel consumption and operating costs. Propulsion system performance, emission standards, and noise reduction are also investigated. A detailed design analysis of the engine/aircraft configuration, engine components, and core engine is presented along with an evaluation of the technology and testing involved.
10 CFR 50.48 - Fire protection.
Code of Federal Regulations, 2011 CFR
2011-01-01
... suppression systems; and (iii) The means to limit fire damage to structures, systems, or components important...) Standard 805, “Performance-Based Standard for Fire Protection for Light Water Reactor Electric Generating... pressurized-water reactors (PWRs) is not permitted. (iv) Uncertainty analysis. An uncertainty analysis...
10 CFR 50.48 - Fire protection.
Code of Federal Regulations, 2010 CFR
2010-01-01
... suppression systems; and (iii) The means to limit fire damage to structures, systems, or components important...) Standard 805, “Performance-Based Standard for Fire Protection for Light Water Reactor Electric Generating... pressurized-water reactors (PWRs) is not permitted. (iv) Uncertainty analysis. An uncertainty analysis...
Morin, R.H.
1997-01-01
Returns from drilling in unconsolidated cobble and sand aquifers commonly do not identify lithologic changes that may be meaningful for hydrogeologic investigations. Vertical resolution of saturated, Quaternary, coarse braided-stream deposits is significantly improved by interpreting natural gamma (G), epithermal neutron (N), and electromagnetically induced resistivity (IR) logs obtained from wells at the Capital Station site in Boise, Idaho. Interpretation of these geophysical logs is simplified because these sediments are derived largely from high-gamma-producing source rocks (granitics of the Boise River drainage), contain few clays, and have undergone little diagenesis. Analysis of G, N, and IR data from these deposits with principal components analysis provides an objective means to determine if units can be recognized within the braided-stream deposits. In particular, performing principal components analysis on G, N, and IR data from eight wells at Capital Station (1) allows the variable system dimensionality to be reduced from three to two by selecting the two eigenvectors with the greatest variance as axes for principal component scatterplots, (2) generates principal components with interpretable physical meanings, (3) distinguishes sand from cobble-dominated units, and (4) provides a means to distinguish between cobble-dominated units.
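A minimal sketch of the log analysis described above, assuming synthetic G, N, and IR values: principal components are computed for the three variables and the two highest-variance score axes are retained for the scatterplots that separate sand from cobble-dominated units.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
depths = 500
logs = np.column_stack([rng.normal(size=depths),     # natural gamma (G)
                        rng.normal(size=depths),     # epithermal neutron (N)
                        rng.normal(size=depths)])    # induced resistivity (IR)

pca = PCA(n_components=2)
scores = pca.fit_transform(logs)                     # depth samples x 2 principal component scores
print(pca.explained_variance_ratio_)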
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C. C.
The Computational Electronics and Electromagnetics thrust area at Lawrence Livermore National Laboratory serves as the focal point for engineering R&D activities for developing computer-based design, analysis, and tools for theory. Key representative applications include design of particle accelerator cells and beamline components; engineering analysis and design of high-power components, photonics, and optoelectronics circuit design; EMI susceptibility analysis; and antenna synthesis. The FY-96 technology-base effort focused code development on (1) accelerator design codes; (2) 3-D massively parallel, object-oriented time-domain EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; (5) 3-D spectral-domain CEM tools; and (6) enhancement of laser drilling codes. Joint efforts with the Power Conversion Technologies thrust area include development of antenna systems for compact, high-performance radar, in addition to novel, compact Marx generators. 18 refs., 25 figs., 1 tab.
Pang, Hanqing; Wang, Jun; Tang, Yuping; Xu, Huiqin; Wu, Liang; Jin, Yi; Zhu, Zhenhua; Guo, Sheng; Shi, Xuqin; Huang, Shengliang; Sun, Dazheng; Duan, Jin-Ao
2016-11-01
Xin-Sheng-Hua granule, a representative formula for postpartum hemorrhage, has been used clinically to treat postpartum diseases. Its main bioactive components comprise aromatic acids, phthalides, alkaloids, flavonoids, and gingerols among others. To investigate the changes in main bioactive constituents in its seven single herbs before and after compatibility, a rapid, simple, and sensitive method was developed for comparative analysis of 27 main bioactive components by using ultrahigh-performance liquid chromatography with triple quadrupole electrospray tandem mass spectrometry for the first time. The sufficient separation of 27 target constituents was achieved on a Thermo Scientific Hypersil GOLD column (100 mm × 3 mm, 1.9 μm) within 20 min under the optimized chromatographic conditions. Compared with the theoretical content, the observed content of each analyte showed remarkable differences in Xin-Sheng-Hua granule except thymine, p-coumaric acid, senkyunolide I, senkyunolide H, and ligustilide; the total contents of 27 components increased significantly, and the content variation degrees for the different components were gingerols > flavonoids > aromatic acids > alkaloids > phthalides. The results could provide a good reference for the quality control of Xin-Sheng-Hua granule and might be helpful to interpret the drug interactions based on variation of bioactive components in formulae. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Applying conscientiousness index: a tool to explore medical students' professionalism in Indonesia.
Jaya, Wolter Prakarsa; Rukmini, Elisabeth
2016-07-14
This study aimed to describe lecturers' perspectives concerning suitable Conscientiousness Index (CI) components and implementations, as well as to compare CI scores across year 1-4 student batches. Components were formulated from objective measurements based on interviews with 12 faculty members. The components include: attendance, adherence to rules, evaluative feedback submissions, performance in assignments and clinical skills, assignment submissions, volunteerism, accomplishments, and general misconduct. The scores were collected from year 1-4 pre-clinical medical students (N=144) during the first semester of 2014-2015. Final interviews were conducted with 9 faculty members. Quantitative analysis was performed using the Kruskal-Wallis and Mann-Whitney tests. Qualitative analysis was performed using content analysis. Using the Kruskal-Wallis test, a significant difference was found in the CI scores among all years (p=0.000). Post-hoc analysis using the Mann-Whitney test showed significant differences between all years except years 1 and 4 (p=0.388). Of the 9 lecturers interviewed during the second interviews, 7 endorsed the importance of CI, while 2 doubted its applicability. Due to the unique characteristics of each block, our system had not been able to conduct a balanced CI evaluation, as compared to the original research. We concluded that the implementation of CI would be highly dependent on the faculty members, with their commitment as the main pre-requisite. We hope to involve academic advisors as CI evaluators and improve our student-centered learning for future assessments. Further study is needed to investigate the longitudinal implementation of CI.
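A minimal sketch of the reported nonparametric testing, assuming synthetic CI scores for four year cohorts and SciPy's implementations: a Kruskal-Wallis test across all years followed by pairwise Mann-Whitney post-hoc comparisons.

import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
ci = {year: rng.normal(loc=70 + 3 * year, scale=8, size=36) for year in (1, 2, 3, 4)}  # synthetic cohorts

print(stats.kruskal(*ci.values()))                       # omnibus test across all years
for a, b in itertools.combinations(ci, 2):               # pairwise post-hoc comparisons
    print(a, b, stats.mannwhitneyu(ci[a], ci[b]).pvalue)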
Wang, Jinxu; Tong, Xin; Li, Peibo; Liu, Menghua; Peng, Wei; Cao, Hui; Su, Weiwei
2014-08-08
Shenqi Fuzheng Injection (SFI) is an injectable traditional Chinese herbal formula comprised of two Chinese herbs, Radix codonopsis and Radix astragali, which were commonly used to improve immune functions against chronic diseases in an integrative and holistic way in China and other East Asian countries for thousands of years. This present study was designed to explore the bioactive components on immuno-enhancement effects in SFI using the relevance analysis between chemical fingerprints and biological effects in vivo. According to a four-factor, nine-level uniform design, SFI samples were prepared with different proportions of the four portions separated from SFI via high speed counter current chromatography (HSCCC). SFI samples were assessed with high performance liquid chromatography (HPLC) for 23 identified components. For the immunosuppressed murine experiments, biological effects in vivo were evaluated on spleen index (E1), peripheral white blood cell counts (E2), bone marrow cell counts (E3), splenic lymphocyte proliferation (E4), splenic natural killer cell activity (E5), peritoneal macrophage phagocytosis (E6) and the amount of interleukin-2 (E7). Based on the hypothesis that biological effects in vivo varied with differences in components, multivariate relevance analysis, including gray relational analysis (GRA), multi-linear regression analysis (MLRA) and principal component analysis (PCA), were performed to evaluate the contribution of each identified component. The results indicated that the bioactive components of SFI on immuno-enhancement activities were calycosin-7-O-β-d-glucopyranoside (P9), isomucronulatol-7,2'-di-O-glucoside (P11), biochanin-7-glucoside (P12), 9,10-dimethoxypterocarpan-3-O-xylosylglucoside (P15) and astragaloside IV (P20), which might have positive effects on spleen index (E1), splenic lymphocyte proliferation (E4), splenic natural killer cell activity (E5), peritoneal macrophage phagocytosis (E6) and the amount of interleukin-2 (E7), while 5-hydroxymethyl-furaldehyde (P5) and lobetyolin (P13) might have negative effects on E1, E4, E5, E6 and E7. Finally, the bioactive HPLC fingerprint of SFI based on its bioactive components on immuno-enhancement effects was established for quality control of SFI. In summary, this study provided a perspective to explore the bioactive components in a traditional Chinese herbal formula with a series of HPLC and animal experiments, which would be helpful to improve quality control and inspire further clinical studies of traditional Chinese medicines. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sasmita, Yoga; Darmawan, Gumgum
2017-08-01
This research aims to evaluate the forecasting performance of Fourier Series Analysis (FSA) and Singular Spectrum Analysis (SSA), which are more exploratory and do not require parametric assumptions. The methods are applied to predicting the volume of motorcycle sales in Indonesia from January 2005 to December 2016 (monthly). Both models are suitable for data with seasonal and trend components. Technically, FSA represents the time series as the sum of trend and seasonal components at different frequencies, which are difficult to identify in a time-domain analysis. With a hidden period of 2.918 ≈ 3 and a significant model order of 3, the FSA model is used to predict the testing data. SSA, meanwhile, has two main stages, decomposition and reconstruction. SSA decomposes the time series data into different components, and the reconstruction stage starts by grouping the decomposition results based on the similarity of each component's period in the trajectory matrix. With the optimal window length (L = 53) and grouping effect (r = 4), SSA predicts the testing data. Forecasting accuracy is evaluated using the Mean Absolute Percentage Error (MAPE), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE). The results show that, for the next 12 months, SSA has MAPE = 13.54 percent, MAE = 61,168.43 and RMSE = 75,244.92, while FSA has MAPE = 28.19 percent, MAE = 119,718.43 and RMSE = 142,511.17. Therefore, to predict the volume of motorcycle sales in the next period, the SSA method should be used, as it performs better in terms of accuracy.
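A hedged sketch of the SSA decomposition and reconstruction stages on a toy monthly series (not the sales data): the series is embedded in a trajectory matrix with window length L, decomposed by SVD, truncated to r components, and rebuilt by diagonal averaging; the forecasting step itself is omitted.

import numpy as np

def ssa_reconstruct(x, L, r):
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix, L x K
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]                       # rank-r approximation
    rec = np.zeros(N)                                      # diagonal averaging back to a series
    counts = np.zeros(N)
    for i in range(L):
        for j in range(K):
            rec[i + j] += Xr[i, j]
            counts[i + j] += 1
    return rec / counts

t = np.arange(144)                                         # 12 years of monthly points (toy data)
series = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + np.random.default_rng(7).normal(size=144)
trend_plus_season = ssa_reconstruct(series, L=53, r=4)
print(np.sqrt(np.mean((series - trend_plus_season) ** 2)))  # RMSE of the rank-r reconstruction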
NASA Technical Reports Server (NTRS)
Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.
2018-01-01
Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis system approach. Typically a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon to have observations treated rather differently between the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases, the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies in the overall system. Connected to these is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. Just the realization that the ensemble analysis makes substantially different use of observations as compared to its hybrid counterpart should serve as enough evidence of the implausibility of such an expectation. This presentation assembles numerous pieces of anecdotal evidence to illustrate the fact that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. Simpler than that, this work suggests that hybrid systems can reliably be constructed without the need to employ a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost-maintenance perspective. More generally, single-analysis systems avoid contradictions such as having to use one sub-component to generate performance diagnostics for another, possibly not fully consistent, component.
Oshima, Ryusei; Kotani, Akira; Kuroda, Minpei; Yamamoto, Kazuhiro; Mimaki, Yoshihiro; Hakamata, Hideki
2018-03-01
High-performance liquid chromatography with ultraviolet detection (HPLC-UV) using 20 mM phosphate mobile phase and an octadecylsilyl column (Triart C18, 150 × 3.0 mm i.d., 3 μm) has been developed for the analysis of hydrophilic compounds in the water extract of Schisandrae Fructus samples. The present HPLC-UV method permits the accurate and precise determination of malic, citric, and protocatechuic acids in the Japanese Pharmacopoeia (JP) Schisandrae Fructus, Schisandrae Chinensis Fructus and Schisandrae Sphenantherae Fructus. The JP Schisandrae Fructus studied contains 27.98 mg/g malic, 107.08 mg/g citric, and 0.42 mg/g protocatechuic acids, with a relative standard deviation (RSD) of repeatability of <0.9% (n = 6). The content of malic acids in Schisandrae Chinensis Fructus is approximately ten times that in Schisandrae Sphenantherae Fructus. To examine whether the HPLC-UV method is applicable to the fingerprint-based discrimination of Schisandrae Fructus samples obtained from Chinese markets, principal component analysis (PCA) was performed using the determined contents of organic acids and the ratio of six characteristic unknown peaks derived from hydrophilic components to internal standard peak areas. On the score plots, Schisandrae Chinensis Fructus and Schisandrae Sphenantherae Fructus samples are clearly discriminated. Therefore, the HPLC-UV method for the analysis of hydrophilic components coupled with PCA has been shown to be practical and useful in the quality control of Schisandrae Fructus.
Development of a Relay Performance Web Tool for the Mars Network
NASA Technical Reports Server (NTRS)
Allard, Daniel A.; Edwards, Charles D.
2009-01-01
Modern Mars surface missions rely upon orbiting spacecraft to relay communications to and from Earth systems. An important component of this multi-mission relay process is the collection of relay performance statistics supporting strategic trend analysis and tactical anomaly identification and tracking.
[HPLC fingerprint of flavonoids in Sophora flavescens and determination of five components].
Ma, Hong-Yan; Zhou, Wan-Shan; Chu, Fu-Jiang; Wang, Dong; Liang, Sheng-Wang; Li, Shao
2013-08-01
A simple and reliable high-performance liquid chromatography method with photodiode array detection (HPLC-DAD) was developed to evaluate the quality of the traditional Chinese medicine Sophora flavescens through establishing a chromatographic fingerprint and the simultaneous determination of five flavonoids, including trifolirhizin, maackiain, kushenol I, kurarinone and sophoraflavanone G. The optimal conditions of separation and detection were achieved on an ULTIMATE XB-C18 column (4.6 mm x 250 mm, 5 microm) with a gradient of acetonitrile and water, detected at 295 nm. In the chromatographic fingerprint, 13 peaks were selected as the characteristic peaks to assess the similarities of samples collected from different origins in China according to the Similarity Evaluation for Chromatographic Fingerprint of Traditional Chinese Medicine software (2004AB), and principal component analysis (PCA) was used in the data analysis. There were significant differences in the fingerprint chromatograms between S. flavescens and S. tonkinensis. Principal component analysis showed that kurarinone and sophoraflavanone G were the most important components. In the quantitative analysis, the five components showed good regression (R > 0.999) within the linear ranges, and their recoveries were in the range of 96.3% - 102.3%. This study indicated that the combination of quantitative and chromatographic fingerprint analysis can be readily utilized as a quality control method for S. flavescens and its related traditional Chinese medicinal preparations.
NASA Astrophysics Data System (ADS)
Babanova, Sofia; Artyushkova, Kateryna; Ulyanova, Yevgenia; Singhal, Sameer; Atanassov, Plamen
2014-01-01
Two statistical methods, design of experiments (DOE) and principal component analysis (PCA), are employed to investigate and improve performance of air-breathing gas-diffusional enzymatic electrodes. DOE is utilized as a tool for systematic organization and evaluation of various factors affecting the performance of the composite system. Based on the results from the DOE, an improved cathode is constructed. The current density generated utilizing the improved cathode (755 ± 39 μA cm⁻² at 0.3 V vs. Ag/AgCl) is 2-5 times higher than the highest current density previously achieved. Three major factors contributing to the cathode performance are identified: the amount of enzyme, the volume of phosphate buffer used to immobilize the enzyme, and the thickness of the gas-diffusion layer (GDL). PCA is applied as an independent confirmation tool to support conclusions made by DOE and to visualize the contribution of factors in individual cathode configurations.
COMPADRE: an R and web resource for pathway activity analysis by component decompositions.
Ramos-Rodriguez, Roberto-Rafael; Cuevas-Diaz-Duran, Raquel; Falciani, Francesco; Tamez-Peña, Jose-Gerardo; Trevino, Victor
2012-10-15
The analysis of biological networks has become essential to study functional genomic data. Compadre is a tool to estimate pathway/gene-set activity indexes using sub-matrix decompositions for biological network analyses. The Compadre pipeline also includes one of the direct uses of activity indexes to detect altered gene sets. For this, the gene expression sub-matrix of a gene set is decomposed into components, which are used to test differences between groups of samples. This procedure is performed with and without differentially expressed genes to decrease false calls. During this process, Compadre also performs an over-representation test. Compadre already implements four decomposition methods [principal component analysis (PCA), Isomap, independent component analysis (ICA) and non-negative matrix factorization (NMF)], six statistical tests (t- and f-test, SAM, Kruskal-Wallis, Welch and Brown-Forsythe), several gene sets (KEGG, BioCarta, Reactome, GO and MsigDB) and can be easily expanded. Our simulation results shown in Supplementary Information suggest that Compadre detects more pathways than over-representation tools like David, Babelomics and Webgestalt and fewer false positives than PLAGE. The output is composed of results from decomposition and over-representation analyses, providing a more complete biological picture. Examples provided in Supplementary Information show the utility, versatility and simplicity of Compadre for analyses of biological networks. Compadre is freely available at http://bioinformatica.mty.itesm.mx:8080/compadre. The R package is also available at https://sourceforge.net/p/compadre.
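A hedged sketch of the activity-index idea on synthetic expression data: the sub-matrix of one gene set is decomposed with PCA and its leading component is tested between sample groups (Compadre also offers ICA, Isomap, and NMF decompositions and repeats the test without differentially expressed genes, which is omitted here).

import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
expr = rng.normal(size=(30, 5000))                   # samples x genes (synthetic)
groups = np.repeat([0, 1], 15)
gene_set = rng.choice(5000, size=40, replace=False)  # indices of one hypothetical pathway
expr[np.ix_(np.flatnonzero(groups == 1), gene_set)] += 0.5   # pathway altered in group 1

activity = PCA(n_components=1).fit_transform(expr[:, gene_set]).ravel()  # activity index per sample
print(stats.ttest_ind(activity[groups == 0], activity[groups == 1]))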
NOTE: Entropy-based automated classification of independent components separated from fMCG
NASA Astrophysics Data System (ADS)
Comani, S.; Srinivasan, V.; Alleva, G.; Romani, G. L.
2007-03-01
Fetal magnetocardiography (fMCG) is a noninvasive technique suitable for the prenatal diagnosis of the fetal heart function. Reliable fetal cardiac signals can be reconstructed from multi-channel fMCG recordings by means of independent component analysis (ICA). However, the identification of the separated components is usually accomplished by visual inspection. This paper discusses a novel automated system based on entropy estimators, namely approximate entropy (ApEn) and sample entropy (SampEn), for the classification of independent components (ICs). The system was validated on 40 fMCG datasets of normal fetuses with the gestational age ranging from 22 to 37 weeks. Both ApEn and SampEn were able to measure the stability and predictability of the physiological signals separated with ICA, and the entropy values of the three categories were significantly different at p <0.01. The system performances were compared with those of a method based on the analysis of the time and frequency content of the components. The outcomes of this study showed a superior performance of the entropy-based system, in particular for early gestation, with an overall ICs detection rate of 98.75% and 97.92% for ApEn and SampEn respectively, as against a value of 94.50% obtained with the time-frequency-based system.
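A hedged sketch of a plain sample-entropy (SampEn) estimator of the kind used to score the separated components, with the tolerance r given as a fraction of the signal's standard deviation and toy signals in place of fMCG components.

import numpy as np

def sampen(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])       # all templates of length mm
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)  # Chebyshev distances
        return np.sum(d <= tol) - len(templ)                               # matches, excluding self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(9)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = rng.normal(size=500)
print(sampen(regular), sampen(noisy))    # the regular signal yields the lower entropy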
Lau, Johnny King L; Humphreys, Glyn W; Douis, Hassan; Balani, Alex; Bickerton, Wai-Ling; Rotshtein, Pia
2015-01-01
We report a lesion-symptom mapping analysis of visual speech production deficits in a large group (280) of stroke patients at the sub-acute stage (<120 days post-stroke). Performance on object naming was evaluated alongside three other tests of visual speech production, namely sentence production to a picture, sentence reading and nonword reading. A principal component analysis was performed on all these tests' scores and revealed a 'shared' component that loaded across all the visual speech production tasks and a 'unique' component that isolated object naming from the other three tasks. Regions for the shared component were observed in the left fronto-temporal cortices, fusiform gyrus and bilateral visual cortices. Lesions in these regions linked to both poor object naming and impairment in general visual-speech production. On the other hand, the unique naming component was potentially associated with the bilateral anterior temporal poles, hippocampus and cerebellar areas. This is in line with the models proposing that object naming relies on a left-lateralised language dominant system that interacts with a bilateral anterior temporal network. Neuropsychological deficits in object naming can reflect both the increased demands specific to the task and the more general difficulties in language processing.
NASA Astrophysics Data System (ADS)
Chen, Lei; Liu, Xiang; Lian, Youyun; Cai, Laizhong
2015-09-01
The hypervapotron (HV), an enhanced heat transfer technique, will be used for ITER divertor components in the dome region as well as for the enhanced heat flux first wall panels. W-Cu brazing technology has been developed at SWIP (Southwestern Institute of Physics), and one W/CuCrZr/316LN component of 450 mm × 52 mm × 166 mm with HV cooling channels will be fabricated for high heat flux (HHF) tests. Before that, a relevant analysis was carried out to optimize the structure of the divertor component elements. ANSYS-CFX was used for the CFD analysis and ABAQUS was adopted for the thermal-mechanical calculations. The commercial code FE-SAFE was adopted to compute the fatigue life of the component. The tile size, the thickness of the tungsten tiles and the slit width among the tungsten tiles were optimized, and the HHF performance under ITER loading conditions was simulated. A brand-new tokamak, HL-2M, with an advanced divertor configuration is under construction at SWIP, where ITER-like flat-tile divertor components are adopted. This optimized design is expected to supply valuable data for the HL-2M tokamak. Supported by the National Magnetic Confinement Fusion Science Program of China (Nos. 2011GB110001 and 2011GB110004).
High variable mixture ratio oxygen/hydrogen engine
NASA Technical Reports Server (NTRS)
Erickson, C. M.; Tu, W. H.; Weiss, A. H.
1988-01-01
The ability of an O2/H2 engine to operate over a range of high-propellant mixture ratios was previously shown to be advantageous in single stage to orbit (SSTO) vehicles. The results are presented for the analysis of high-performance engine power cycles operating over propellant mixture ratio ranges of 12 to 6 and 9 to 6. A requirement to throttle up to 60 percent of nominal thrust was superimposed as a typical throttle range to limit vehicle acceleration as propellant is expended. The object of the analysis was to determine areas of concern relative to component and engine operability or potential hazards resulting from the operating requirements and ranges of conditions that derive from the overall engine requirements. The SSTO mission necessitates a high-performance, lightweight engine. Therefore, staged combustion power cycles employing either dual fuel-rich preburners or dual mixed (fuel-rich and oxygen-rich) preburners were examined. Engine mass flow and power balances were made and major component operating ranges were defined. Component size and arrangement were determined through engine layouts for one of the configurations evaluated. Each component is being examined to determine if there are areas of concern with respect to component efficiency, operability, reliability, or hazard. The effects of reducing the maximum chamber pressure were investigated for one of the cycles.
Probabilistic Structural Analysis Theory Development
NASA Technical Reports Server (NTRS)
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
Das, Atanu; Mukhopadhyay, Chaitali
2007-10-28
We have performed molecular dynamics (MD) simulation of the thermal denaturation of one protein and one peptide-ubiquitin and melittin. To identify the correlation in dynamics among various secondary structural fragments and also the individual contribution of different residues towards thermal unfolding, principal component analysis method was applied in order to give a new insight to protein dynamics by analyzing the contribution of coefficients of principal components. The cross-correlation matrix obtained from MD simulation trajectory provided important information regarding the anisotropy of backbone dynamics that leads to unfolding. Unfolding of ubiquitin was found to be a three-state process, while that of melittin, though smaller and mostly helical, is more complicated.
NASA Astrophysics Data System (ADS)
Das, Atanu; Mukhopadhyay, Chaitali
2007-10-01
We have performed molecular dynamics (MD) simulation of the thermal denaturation of one protein and one peptide—ubiquitin and melittin. To identify the correlation in dynamics among various secondary structural fragments and also the individual contribution of different residues towards thermal unfolding, principal component analysis method was applied in order to give a new insight to protein dynamics by analyzing the contribution of coefficients of principal components. The cross-correlation matrix obtained from MD simulation trajectory provided important information regarding the anisotropy of backbone dynamics that leads to unfolding. Unfolding of ubiquitin was found to be a three-state process, while that of melittin, though smaller and mostly helical, is more complicated.
Classification of fMRI resting-state maps using machine learning techniques: A comparative study
NASA Astrophysics Data System (ADS)
Gallos, Ioannis; Siettos, Constantinos
2017-11-01
We compare the efficiency of Principal Component Analysis (PCA) and nonlinear manifold learning algorithms (ISOMAP and Diffusion Maps) for classifying brain maps between groups of schizophrenia patients and healthy controls from fMRI scans acquired during a resting-state experiment. After a standard pre-processing pipeline, we applied spatial Independent Component Analysis (ICA) to reduce (a) noise and (b) the spatial-temporal dimensionality of the fMRI maps. On the cross-correlation matrix of the ICA components, we applied PCA, ISOMAP and Diffusion Maps to find an embedded low-dimensional space. Finally, support vector machines (SVM) and k-NN algorithms were used to evaluate the performance of the algorithms in classifying between the two groups.
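A minimal sketch of the comparison pipeline, assuming synthetic connectivity-style features in place of the ICA cross-correlation matrices: PCA or Isomap embeddings are scored with SVM and k-NN classifiers by cross-validation (a diffusion-maps embedding is not included since it is not part of scikit-learn).

import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
X = rng.normal(size=(60, 190))            # e.g. upper triangle of a 20x20 correlation matrix per subject
y = np.repeat([0, 1], 30)
X[y == 1, :10] += 0.7                     # synthetic group effect

for embed in (PCA(n_components=5), Isomap(n_components=5)):
    for clf in (SVC(kernel="linear"), KNeighborsClassifier(n_neighbors=5)):
        pipe = make_pipeline(embed, clf)
        print(type(embed).__name__, type(clf).__name__,
              cross_val_score(pipe, X, y, cv=5).mean())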
Hologram interferometry in automotive component vibration testing
NASA Astrophysics Data System (ADS)
Brown, Gordon M.; Forbes, Jamie W.; Marchi, Mitchell M.; Wales, Raymond R.
1993-02-01
An ever increasing variety of automotive component vibration testing is being pursued at Ford Motor Company, U.S.A. The driving force for use of hologram interferometry in these tests is the continuing need to design component structures to meet more stringent functional performance criteria. Parameters such as noise and vibration, sound quality, and reliability must be optimized for the lightest weight component possible. Continually increasing customer expectations and regulatory pressures on fuel economy and safety mandate that vehicles be built from highly optimized components. This paper includes applications of holographic interferometry for powertrain support structure tuning, body panel noise reduction, wiper system noise and vibration path analysis, and other vehicle component studies.
SCGICAR: Spatial concatenation based group ICA with reference for fMRI data analysis.
Shi, Yuhu; Zeng, Weiming; Wang, Nizhuan
2017-09-01
With the rapid development of big data, multi-subject functional magnetic resonance imaging (fMRI) data analysis is becoming more and more important. As a blind source separation technique, group independent component analysis (GICA) has been widely applied to multi-subject fMRI data analysis. However, spatially concatenated GICA is rarely used compared with temporally concatenated GICA because of its disadvantages. In this paper, to overcome these issues and considering that the ability of GICA for fMRI data analysis can be improved by adding a priori information, we propose a novel spatial concatenation based GICA with reference (SCGICAR) method that takes advantage of a priori information extracted from the group subjects; a multi-objective optimization strategy is then used to implement the method. Finally, the post-processing steps of principal component analysis and anti-reconstruction are used to obtain the group spatial component and the individual temporal components in the group, respectively. The experimental results show that the proposed SCGICAR method performs better on both single-subject and multi-subject fMRI data analysis than classical methods. It not only detects more accurate spatial and temporal components for each subject of the group, but also obtains a better group component in both the temporal and spatial domains. These results demonstrate that the proposed SCGICAR method has its own advantages in comparison with classical methods and can better reflect the commonness of the subjects in the group. Copyright © 2017 Elsevier B.V. All rights reserved.
Beekman, Alice; Shan, Daxian; Ali, Alana; Dai, Weiguo; Ward-Smith, Stephen; Goldenberg, Merrill
2005-04-01
This study evaluated the effect of the imaginary component of the refractive index on laser diffraction particle size data for pharmaceutical samples. Excipient particles 1-5 microm in diameter (irregular morphology) were measured by laser diffraction. Optical parameters were obtained and verified based on comparison of calculated vs. actual particle volume fraction. Inappropriate imaginary components of the refractive index can lead to inaccurate results, including false peaks in the size distribution. For laser diffraction measurements, obtaining appropriate or "effective" imaginary components of the refractive index was not always straightforward. When the recommended criteria such as the concentration match and the fit of the scattering data gave similar results for very different calculated size distributions, a supplemental technique, microscopy with image analysis, was used to decide between the alternatives. Use of effective optical parameters produced a good match between laser diffraction data and microscopy/image analysis data. The imaginary component of the refractive index can have a major impact on particle size results calculated from laser diffraction data. When performed properly, laser diffraction and microscopy with image analysis can yield comparable results.
NASA Technical Reports Server (NTRS)
Egbert, James Allen
2016-01-01
In support of ground system development for the Space Launch System (SLS), engineers are tasked with building immense engineering models of extreme complexity. The various systems require rigorous analysis of pneumatics, hydraulic, cryogenic, and hypergolic systems. There are certain standards that each of these systems must meet, in the form of pressure vessel system (PVS) certification reports. These reports can be hundreds of pages long, and require many hours to compile. Traditionally, each component is analyzed individually, often utilizing hand calculations in the design process. The objective of this opportunity is to perform these analyses in an integrated fashion with the parametric CADCAE environment. This allows for systems to be analyzed on an assembly level in a semi-automated fashion, which greatly improves accuracy and efficiency. To accomplish this, component specific parameters were stored in the Windchill database to individual Creo Parametric models based on spec control drawings. These parameters were then accessed by using the Prime Analysis within Creo Parametric. MathCAD Prime spreadsheets were created that automatically extracted these parameters, performed calculations, and generated reports. The reports described component compatibility based on local conditions such as pressure, temperature, density, and flow rates. The reports also determined component pairing compatibility, such as properly sizing relief valves with regulators. The reports stored the input conditions that were used to determine compatibility to increase traceability of component selection. The desired workflow for using this tool would begin with a Creo Schematics diagram of a PVS system. This schematic would store local conditions and locations of components. The schematic would then populate an assembly within Creo Parametric, using Windchill database parts. These parts would have their attributes already assigned, and the MathCAD spreadsheets could begin running through database parts to determine which components would be suited for specific locations within the assembly. This eliminates a significant amount of time from the design process, and makes initial analysis assessments more accurate. Each component that would be checked for a location within the assembly would generate a report, showing whether the component was compatible. These reports could be used to generate the PVS report without the need to perform the same analysis multiple times. This process also has the potential to be expanded upon to further automate PVS reports. The integration of software codes or macros could be used to automatically check through hundreds of parts for each location on the schematic. If the software could recognize which type of component would be necessary for each location, it is possible that simply starting the macro could completely choose all the components needed for the schematic, and in turn the system. This would save many hours of work initially selecting components, which could end up saving money. Overall, this process helps to automate initial component selections for PVS systems to fit local design specifications. These selections will automatically generate reports showing how the design criteria are met by the specific component that was chosen. These reports will contribute to easier compilation of the PVS certification reports, which currently take a great amount of time and effort to produce.
Marcos, Ma Shiela Angeli; David, Laura; Peñaflor, Eileen; Ticzon, Victor; Soriano, Maricor
2008-10-01
We introduce an automated benthic counting system for rapid reef assessment that utilizes computer vision on subsurface underwater reef video. Video acquisition was executed by lowering a submersible bullet-type camera from a motor boat while moving across the reef area. A GPS and echo sounder were linked to the video recorder to record bathymetry and location points. Analysis of living and non-living components was implemented through image color and texture feature extraction from the reef video frames and classification via Linear Discriminant Analysis. Compared to common rapid reef assessment protocols, our system can perform fine-scale data acquisition and processing in one day. Reef video was acquired in Ngedarrak Reef, Koror, Republic of Palau. Overall success performance ranges from 60% to 77% for depths of 1 to 3 m. The development of an automated rapid reef classification system is most promising for reef studies that need fast and frequent data acquisition of percent cover of living and nonliving components.
Modal Identification in an Automotive Multi-Component System Using HS 3D-DIC
López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.
2018-01-01
The modal characterization of automotive lighting systems becomes difficult using sensors due to the light weight of the elements that compose the component, as well as the intricate access required to place sensors on them. In experimental modal analysis, high speed 3D digital image correlation (HS 3D-DIC) is attracting attention since it provides full-field contactless measurements of 3D displacements as its main advantage over other techniques. Different methodologies have been published that perform modal identification, i.e., extraction of natural frequencies, damping ratios, and mode shapes, using the full-field information. In this work, experimental modal analysis has been performed on a multi-component automotive lighting system using HS 3D-DIC. Base motion excitation was applied to simulate operating conditions. A recently validated methodology has been employed for modal identification using transmissibility functions, i.e., the transfer functions from base motion tests. Results make it possible to identify local and global behavior of the different elements of injected polymeric and metallic materials. PMID:29401725
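As a rough illustration of the transmissibility-based identification step, the sketch below estimates a transmissibility function between a base-motion signal and a response signal with Welch cross- and auto-spectra (an H1-type estimator). Synthetic signals stand in for the full-field HS 3D-DIC displacements, and the single lightly damped resonance is an assumption for demonstration only.

```python
import numpy as np
from scipy.signal import welch, csd, lfilter, cont2discrete

# Hedged sketch: estimate a transmissibility function T(f) = S_xy / S_xx
# (H1 estimator) between base motion x and a response y measured at one point.
fs = 2000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
x = rng.standard_normal(t.size)                     # broadband base excitation

# toy response: a lightly damped 120 Hz resonance driven by the base motion
wn, zeta = 2 * np.pi * 120, 0.02
bd, ad, _ = cont2discrete(([wn ** 2], [1, 2 * zeta * wn, wn ** 2]), 1 / fs)
y = lfilter(bd.ravel(), ad, x) + 0.01 * rng.standard_normal(t.size)

f, Sxy = csd(x, y, fs=fs, nperseg=4096)             # cross spectrum base -> response
_, Sxx = welch(x, fs=fs, nperseg=4096)              # base auto spectrum
T = Sxy / Sxx                                       # complex transmissibility
print(f"estimated resonance near {f[np.argmax(np.abs(T))]:.1f} Hz")
```

In a full-field setting the same estimate would be repeated for every measured DIC point, and the peak frequencies, peak widths, and operating deflection shapes read off from the resulting transmissibility maps.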
Multicomponent analysis of a digital Trail Making Test.
Fellows, Robert P; Dahmen, Jessamyn; Cook, Diane; Schmitter-Edgecombe, Maureen
2017-01-01
The purpose of the current study was to use a newly developed digital tablet-based variant of the TMT to isolate component cognitive processes underlying TMT performance. Similar to the paper-based trail making test, this digital variant consists of two conditions, Part A and Part B. However, this digital version automatically collects additional data to create component subtest scores to isolate cognitive abilities. Specifically, in addition to the total time to completion and number of errors, the digital Trail Making Test (dTMT) records several unique components including the number of pauses, pause duration, lifts, lift duration, time inside each circle, and time between circles. Participants were community-dwelling older adults who completed a neuropsychological evaluation including measures of processing speed, inhibitory control, visual working memory/sequencing, and set-switching. The abilities underlying TMT performance were assessed through regression analyses of component scores from the dTMT with traditional neuropsychological measures. Results revealed significant correlations between paper and digital variants of Part A (rs = .541, p < .001) and paper and digital versions of Part B (rs = .799, p < .001). Regression analyses with traditional neuropsychological measures revealed that Part A components were best predicted by speeded processing, while inhibitory control and visual/spatial sequencing were predictors of specific components of Part B. Exploratory analyses revealed that specific dTMT-B components were associated with a performance-based medication management task. Taken together, these results elucidate specific cognitive abilities underlying TMT performance, as well as the utility of isolating digital components.
Design and performance analysis of gas sorption compressors
NASA Technical Reports Server (NTRS)
Chan, C. K.
1984-01-01
Compressor kinetics based on gas adsorption and desorption processes by charcoal and on gas absorption and desorption processes by LaNi5 were analyzed using a two-phase model and a three-component model, respectively. The modeling assumed thermal and mechanical equilibria between phases or among the components. The analyses predicted performance well for compressors which have heaters located outside the adsorbent or the absorbent bed. For the rapidly-cycled compressor, where the heater was centrally located, only the transient pressure compared well with the experimental data.
This paper presents an analysis of the CMAQ v4.5 model performance for particulate matter and its chemical components for the simulated year 2001. It is part two of a two-part series of papers that examines the model performance of CMAQ v4.5.
Lajnef, Tarek; Chaibi, Sahbi; Eichenlaub, Jean-Baptiste; Ruby, Perrine M.; Aguera, Pierre-Emmanuel; Samet, Mounir; Kachouri, Abdennaceur; Jerbi, Karim
2015-01-01
A novel framework for joint detection of sleep spindles and K-complex events, two hallmarks of sleep stage S2, is proposed. Sleep electroencephalography (EEG) signals are split into oscillatory (spindles) and transient (K-complex) components. This decomposition is conveniently achieved by applying morphological component analysis (MCA) to a sparse representation of EEG segments obtained by the recently introduced discrete tunable Q-factor wavelet transform (TQWT). Tuning the Q-factor provides a convenient and elegant tool to naturally decompose the signal into an oscillatory and a transient component. The actual detection step relies on thresholding (i) the transient component to reveal K-complexes and (ii) the time-frequency representation of the oscillatory component to identify sleep spindles. Optimal thresholds are derived from ROC-like curves (sensitivity vs. FDR) on training sets and the performance of the method is assessed on test data sets. We assessed the performance of our method using full-night sleep EEG data we collected from 14 participants. In comparison to visual scoring (Expert 1), the proposed method detected spindles with a sensitivity of 83.18% and false discovery rate (FDR) of 39%, while K-complexes were detected with a sensitivity of 81.57% and an FDR of 29.54%. Similar performances were obtained when using a second expert as benchmark. In addition, when the TQWT and MCA steps were excluded from the pipeline the detection sensitivities dropped down to 70% for spindles and to 76.97% for K-complexes, while the FDR rose up to 43.62 and 49.09%, respectively. Finally, we also evaluated the performance of the proposed method on a set of publicly available sleep EEG recordings. Overall, the results we obtained suggest that the TQWT-MCA method may be a valuable alternative to existing spindle and K-complex detection methods. Paths for improvements and further validations with large-scale standard open-access benchmarking data sets are discussed. PMID:26283943
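For readers who want a concrete starting point, the following is a deliberately simplified sketch of the thresholding idea applied to an oscillatory (sigma-band) component; it uses a plain band-pass filter and Hilbert envelope rather than the TQWT-MCA decomposition described in the abstract, and all thresholds and durations are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_spindles(eeg, fs, band=(11.0, 16.0), thresh_sd=3.0, min_dur=0.5):
    """Very simplified spindle detector: band-pass the sigma band, take the
    Hilbert envelope, and keep supra-threshold runs longer than min_dur s.
    This is a sketch, not the TQWT-MCA method described in the abstract."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    sigma = filtfilt(b, a, eeg)
    env = np.abs(hilbert(sigma))
    thr = env.mean() + thresh_sd * env.std()
    above = env > thr
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if (i - start) / fs >= min_dur:
                events.append((start / fs, i / fs))
            start = None
    return events

fs = 200.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(1)
eeg = rng.standard_normal(t.size)
eeg[2000:2300] += 3 * np.sin(2 * np.pi * 13 * t[2000:2300])   # synthetic spindle
print(detect_spindles(eeg, fs))                                # [(~10.0 s, ~11.5 s)]
```

In the published pipeline, the oscillatory component handed to such a threshold comes from the TQWT-MCA split and the thresholds are tuned on ROC-like curves against expert scoring; the sketch only shows the mechanics of the final detection step.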
Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li
2015-09-01
The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M-H](-) . Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
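A minimal sketch of the chemometric step (not the DART-TOF-MS processing itself) is shown below: the rows of the feature table stand in for seed samples, the columns for the intensities of the 20 marker components, and scikit-learn/scipy provide the PCA scores and hierarchical clustering labels.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# Toy fingerprint table: 7 samples x 20 component intensities (random
# stand-ins arranged so that two clusters exist, as in the study).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (4, 20)),       # cluster 1
               rng.normal(3.0, 1.0, (3, 20))])      # cluster 2

Xs = StandardScaler().fit_transform(X)              # autoscale each component
scores = PCA(n_components=2).fit_transform(Xs)      # PCA score-plot coordinates

Z = linkage(Xs, method="ward")                      # hierarchical clustering
labels = fcluster(Z, t=2, criterion="maxclust")     # cut the tree into 2 clusters
print("PCA scores:\n", scores.round(2))
print("HCA cluster labels:", labels)
```

Agreement between the HCA labels and the grouping visible in the PCA score plot is what the abstract refers to when it notes that the two methods yielded similar results.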
Latent effects decision analysis
Cooper, J Arlin [Albuquerque, NM; Werner, Paul W [Albuquerque, NM
2004-08-24
Latent effects on a system are broken down into components ranging from those far removed in time from the system under study (latent) to those which closely effect changes in the system. Each component is provided with weighted inputs either by a user or from outputs of other components. A non-linear mathematical process known as `soft aggregation` is performed on the inputs to each component to provide information relating to the component. This information is combined in decreasing order of latency to the system to provide a quantifiable measure of an attribute of a system (e.g., safety) or to test hypotheses (e.g., for forensic deduction or decisions about various system design options).
Towards Rocket Engine Components with Increased Strength and Robust Operating Characteristics
NASA Technical Reports Server (NTRS)
Marcu, Bogdan; Hadid, Ali; Lin, Pei; Balcazar, Daniel; Rai, Man Mohan; Dorney, Daniel J.
2005-01-01
High-energy rotating machines, powering liquid propellant rocket engines, are subject to various sources of high and low cycle fatigue generated by unsteady flow phenomena. Given the tremendous need for reliability in a sustainable space exploration program, a fundamental change in the design methodology for engine components is required for both launch and space based systems. A design optimization system based on neural networks has been applied and demonstrated in the redesign of the Space Shuttle Main Engine (SSME) Low Pressure Oxidizer Turbo Pump (LPOTP) turbine nozzle. One objective of the redesign effort was to increase airfoil thickness and thus increase its strength while at the same time detuning the vane natural frequency modes from the vortex shedding frequency. The second objective was to reduce the vortex shedding amplitude. The third objective was to maintain this low shedding amplitude even in the presence of large manufacturing tolerances. All of these objectives were achieved without generating any detrimental effects on the downstream flow through the turbine, and without introducing any penalty in performance. The airfoil redesign and preliminary assessment was performed in the Exploration Technology Directorate at NASA ARC. Boeing/Rocketdyne and NASA MSFC independently performed final CFD assessments of the design. Four different CFD codes were used in this process. They include WILDCAT/CORSAIR (NASA), FLUENT (commercial), TIDAL (Boeing Rocketdyne), and a new family (Aardvark/Phantom) of CFD analysis codes developed at NASA MSFC employing LOX fluid properties and a Generalized Equation Set formulation. Extensive aerodynamic performance analysis and stress analysis carried out at Boeing Rocketdyne and NASA MSFC indicate that the redesign objectives have been fully met. The paper presents the results of the assessment analysis and discusses the future potential of robust optimal design for rocket engine components.
NASA Astrophysics Data System (ADS)
Kong, Changduk; Lim, Semyeong
2011-12-01
Recently, health monitoring of the major gas path components of gas turbines has mostly used model-based methods such as Gas Path Analysis (GPA). This method finds quantitative changes in component performance characteristic parameters, such as isentropic efficiency and mass flow parameter, by comparing measured engine performance parameters such as temperatures, pressures, rotational speeds, fuel consumption, etc. with the clean (fault-free) engine performance parameters calculated by the base engine performance model. Currently, expert engine diagnostic systems using artificial intelligence methods such as Neural Networks (NNs), Fuzzy Logic and Genetic Algorithms (GAs) are being studied to improve the model-based method. Among them, NNs are most often used in engine fault diagnostic systems due to their good learning performance, but they suffer from low accuracy and long learning times when a large amount of learning data is required to build the learning database. In addition, a very complex structure is needed to find single-type or multiple-type faults of gas path components effectively. This work inversely builds a base performance model of a turboprop engine, to be used for a high altitude operation UAV, from measured performance data, and proposes a fault diagnostic system using the base engine performance model and artificial intelligence methods such as Fuzzy Logic and Neural Networks. The proposed diagnostic system first isolates the faulted components using Fuzzy Logic, then quantifies the faults of the identified components using an NN trained on a fault learning database, which is obtained from the developed base performance model. In training the NN, the Feed Forward Back Propagation (FFBP) method is used. Finally, it is verified through several test examples that component faults implanted arbitrarily in the engine are well isolated and quantified by the proposed diagnostic system.
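The quantification stage can be sketched as follows, under the assumption that a fault database maps component degradations to deviations of the measured parameters from the clean-engine baseline; the linear influence matrix, the network size, and the percent scale are illustrative stand-ins, not the authors' turboprop model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hedged sketch of the quantification stage: a feed-forward network maps
# deviations of measured parameters (temperatures, pressures, fuel flow, ...)
# from the clean-engine baseline to component degradations in percent. The
# "influence" matrix is a toy stand-in for the fault database generated from
# the base performance model.
rng = np.random.default_rng(0)
n_meas, n_faults = 6, 3
influence = rng.normal(size=(n_faults, n_meas))

faults = rng.uniform(-3.0, 0.0, size=(2000, n_faults))           # % degradation
deltas = faults @ influence + 0.05 * rng.standard_normal((2000, n_meas))

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(deltas, faults)                                           # train on fault database

implanted = np.array([[-2.0, 0.0, -1.0]])                         # implanted fault case
print("estimated degradations (%):", net.predict(implanted @ influence).round(2))
```

In the proposed system, a fuzzy-logic stage would first decide which component group is faulted, so that a smaller, component-specific network (trained on the matching slice of the fault database) performs the quantification.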
Public perceptions of key performance indicators of healthcare in Alberta, Canada.
Northcott, Herbert C; Harvey, Michael D
2012-06-01
To examine the relationship between public perceptions of key performance indicators assessing various aspects of the health-care system. Cross-sequential survey research. Annual telephone surveys of random samples of adult Albertans selected by random digit dialing and stratified according to age, sex and region (n = 4000 for each survey year). The survey questionnaires included single-item measures of key performance indicators to assess public perceptions of availability, accessibility, quality, outcome and satisfaction with healthcare. Cronbach's α and factor analysis were used to assess the relationship between key performance indicators focusing on the health-care system overall and on a recent interaction with the health-care system. The province of Alberta, Canada during the years 1996-2004. Four thousand adults randomly selected each survey year. Survey questions measuring public perceptions of healthcare availability, accessibility, quality, outcome and satisfaction with healthcare. Factor analysis identified two principal components with key performance indicators focusing on the health system overall loading most strongly on the first component and key performance indicators focusing on the most recent health-care encounter loading most strongly on the second component. Assessments of the quality of care most recently received, accessibility of that care and perceived outcome of care tended to be higher than the more general assessments of overall health system quality and accessibility. Assessments of specific health-care encounters and more general assessments of the overall health-care system, while related, nevertheless comprise separate dimensions for health-care evaluation.
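The two analysis steps named in the abstract, Cronbach's alpha and a principal components analysis of the KPI items, can be sketched in a few lines; the simulated ratings below (two groups of items, each driven by its own underlying factor) are purely illustrative and not the Alberta survey data.

```python
import numpy as np
from sklearn.decomposition import PCA

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Toy ratings: 500 respondents, 4 items about the overall system and
# 4 items about the most recent encounter, each set driven by its own factor.
rng = np.random.default_rng(0)
overall = rng.normal(size=(500, 1)) + 0.5 * rng.standard_normal((500, 4))
recent = rng.normal(size=(500, 1)) + 0.5 * rng.standard_normal((500, 4))
kpi = np.hstack([overall, recent])

print("Cronbach's alpha (all 8 items):", round(cronbach_alpha(kpi), 2))
loadings = PCA(n_components=2).fit(kpi).components_
print("loadings of the 8 items on the first two components:\n", loadings.round(2))
```

With data structured this way, the system-level items load mainly on one component and the encounter-level items on the other, which is the two-dimensional pattern the survey analysis reports.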
A Feature Fusion Based Forecasting Model for Financial Time Series
Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie
2014-01-01
Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted by independent component analysis from the historical closing prices and 39 technical variables. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in prediction than two other similar models. PMID:24971455
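A hedged sketch of the pipeline's shape, with random numbers standing in for real closing prices and the 39 technical indicators, might look like this (scikit-learn's FastICA, CCA, and SVR; dimensions and hyperparameters are arbitrary choices, not those of the paper):

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cross_decomposition import CCA
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_days = 400
close = np.cumsum(rng.standard_normal(n_days)) + 100.0            # toy price series
lags = np.column_stack([close[i:n_days - 10 + i] for i in range(10)])  # 10 lagged closes
tech = rng.standard_normal((lags.shape[0], 39))                    # stand-in indicators
target = close[10:]                                                # next-day close

ica_price = FastICA(n_components=5, random_state=0).fit_transform(lags)
ica_tech = FastICA(n_components=5, random_state=0).fit_transform(tech)
fused_p, fused_t = CCA(n_components=3).fit_transform(ica_price, ica_tech)
features = np.hstack([fused_p, fused_t])                           # fused intrinsic features

split = 300
model = SVR(C=10.0).fit(features[:split], target[:split])
pred = model.predict(features[split:])
print("test MSE:", round(float(np.mean((pred - target[split:]) ** 2)), 3))
```

The point of the sketch is the ordering of the stages (ICA on each feature family, CCA fusion, SVR regression); with purely random indicators, as here, the forecast error is of course uninformative.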
Sabir, Aryani; Rafi, Mohamad; Darusman, Latifah K
2017-04-15
HPLC fingerprint analysis combined with chemometrics was developed to discriminate between the red and the white rice bran grown in Indonesia. The major component in rice bran is γ-oryzanol, which consists of 4 main compounds, namely cycloartenol ferulate, cyclobranol ferulate, campesterol ferulate and β-sitosterol ferulate. Separation of these four compounds along with other compounds was performed using a C18 column and a methanol-acetonitrile gradient elution system. Using the resulting intensity variations, principal component and discriminant analysis were performed to discriminate the two samples. Discriminant analysis successfully discriminated the red from the white rice bran, and the predictive ability of the model showed a satisfactory classification for the test samples. The results of this study indicated that the developed method was suitable as a quality control method for rice bran in terms of identification and discrimination of the red and the white rice bran. Copyright © 2016 Elsevier Ltd. All rights reserved.
Anastasiadou, Maria N; Christodoulakis, Manolis; Papathanasiou, Eleftherios S; Papacostas, Savvas S; Mitsis, Georgios D
2017-09-01
This paper proposes supervised and unsupervised algorithms for automatic muscle artifact detection and removal from long-term EEG recordings, which combine canonical correlation analysis (CCA) and wavelets with random forests (RF). The proposed algorithms first perform CCA and continuous wavelet transform of the canonical components to generate a number of features which include component autocorrelation values and wavelet coefficient magnitude values. A subset of the most important features is subsequently selected using RF and labelled observations (supervised case) or synthetic data constructed from the original observations (unsupervised case). The proposed algorithms are evaluated using realistic simulation data as well as 30min epochs of non-invasive EEG recordings obtained from ten patients with epilepsy. We assessed the performance of the proposed algorithms using classification performance and goodness-of-fit values for noisy and noise-free signal windows. In the simulation study, where the ground truth was known, the proposed algorithms yielded almost perfect performance. In the case of experimental data, where expert marking was performed, the results suggest that both the supervised and unsupervised algorithm versions were able to remove artifacts without affecting noise-free channels considerably, outperforming standard CCA, independent component analysis (ICA) and Lagged Auto-Mutual Information Clustering (LAMIC). The proposed algorithms achieved excellent performance for both simulation and experimental data. Importantly, for the first time to our knowledge, we were able to perform entirely unsupervised artifact removal, i.e. without using already marked noisy data segments, achieving performance that is comparable to the supervised case. Overall, the results suggest that the proposed algorithms yield significant future potential for improving EEG signal quality in research or clinical settings without the need for marking by expert neurophysiologists, EMG signal recording and user visual inspection. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
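The sketch below illustrates only the feature-and-classification portion of such a pipeline: canonical components are obtained with the common BSS-CCA trick of correlating the EEG with a one-sample-delayed copy of itself, each component is summarized by its lag-1 autocorrelation and mean continuous-wavelet-transform magnitude, and a random forest labels components. The synthetic data, the toy labels, and the two-feature set are simplified assumptions rather than the authors' exact algorithm.

```python
import numpy as np
import pywt                                        # PyWavelets, for the CWT
from sklearn.cross_decomposition import CCA
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
fs, n_ch, n_s = 200, 8, 4000
brain = np.sin(2 * np.pi * 10 * np.arange(n_s) / fs)          # smooth rhythm
emg = rng.standard_normal(n_s)                                 # broadband muscle noise
X = (np.outer(brain, rng.normal(size=n_ch))
     + np.outer(emg, rng.normal(0, 0.5, n_ch))
     + 0.1 * rng.standard_normal((n_s, n_ch)))                 # toy multichannel EEG

cca = CCA(n_components=n_ch)
comps, _ = cca.fit_transform(X[:-1], X[1:])                    # BSS-CCA canonical components

def features(c):
    acf1 = np.corrcoef(c[:-1], c[1:])[0, 1]                    # lag-1 autocorrelation
    coefs, _ = pywt.cwt(c, scales=np.arange(2, 32), wavelet="morl")
    return [acf1, np.abs(coefs).mean()]

F = np.array([features(comps[:, k]) for k in range(n_ch)])
y = (F[:, 0] < 0.75).astype(int)                               # toy labels: low ACF -> "muscle"
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(F, y)
print("feature importances (ACF1, CWT magnitude):", clf.feature_importances_.round(2))
```

In the published method the random forest is trained either on expert-marked segments (supervised) or on synthetic data built from the recordings themselves (unsupervised), and components flagged as muscle are removed before the EEG is reconstructed.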
40 CFR 60.2035 - How are these new source performance standards structured?
Code of Federal Regulations, 2010 CFR
2010-07-01
These new source performance standards contain the eleven major components listed in paragraphs (a) through (k) of this section. (a) Preconstruction siting analysis. (b) Waste management plan. (c) Operator training and...
Discriminant analysis of resting-state functional connectivity patterns on the Grassmann manifold
NASA Astrophysics Data System (ADS)
Fan, Yong; Liu, Yong; Jiang, Tianzi; Liu, Zhening; Hao, Yihui; Liu, Haihong
2010-03-01
The functional networks, extracted from fMRI images using independent component analysis, have been demonstrated informative for distinguishing brain states of cognitive functions and neurological diseases. In this paper, we propose a novel algorithm for discriminant analysis of functional networks encoded by spatial independent components. The functional networks of each individual are used as bases for a linear subspace, referred to as a functional connectivity pattern, which facilitates a comprehensive characterization of temporal signals of fMRI data. The functional connectivity patterns of different individuals are analyzed on the Grassmann manifold by adopting a principal angle based subspace distance. In conjunction with a support vector machine classifier, a forward component selection technique is proposed to select independent components for constructing the most discriminative functional connectivity pattern. The discriminant analysis method has been applied to an fMRI based schizophrenia study with 31 schizophrenia patients and 31 healthy individuals. The experimental results demonstrate that the proposed method not only achieves a promising classification performance for distinguishing schizophrenia patients from healthy controls, but also identifies discriminative functional networks that are informative for schizophrenia diagnosis.
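A small sketch of the subspace-distance idea: each subject's set of spatial independent components spans a subspace, subjects are compared via principal angles (scipy's subspace_angles), and an SVM is trained on a kernel built from those distances. The random data, the Gaussian-of-distance kernel (which is not guaranteed positive semidefinite), and the component counts are all illustrative assumptions.

```python
import numpy as np
from scipy.linalg import subspace_angles, orth
from sklearn.svm import SVC

# Toy "functional connectivity patterns": two groups of 10 subjects, each
# subject's 5 spatial ICs perturbed from the group's base subspace.
rng = np.random.default_rng(0)
n_vox, n_ic = 500, 5
base = [orth(rng.standard_normal((n_vox, n_ic))) for _ in range(2)]
subspaces, labels = [], []
for group in (0, 1):
    for _ in range(10):
        subspaces.append(orth(base[group] + 0.4 * rng.standard_normal((n_vox, n_ic))))
        labels.append(group)
labels = np.array(labels)

n_sub = len(subspaces)
D = np.zeros((n_sub, n_sub))
for i in range(n_sub):
    for j in range(n_sub):
        theta = subspace_angles(subspaces[i], subspaces[j])   # principal angles (rad)
        D[i, j] = np.linalg.norm(theta)                        # geodesic-type distance

K = np.exp(-(D ** 2) / D[D > 0].mean() ** 2)                   # kernel from distances
clf = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```

A forward selection over which independent components enter each subject's basis, as described in the abstract, would simply repeat this distance-and-classify loop for candidate component subsets and keep the most discriminative one.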
Early Migration Predicts Aseptic Loosening of Cementless Femoral Stems: A Long-term Study.
Streit, Marcus R; Haeussler, Daniel; Bruckner, Thomas; Proctor, Tanja; Innmann, Moritz M; Merle, Christian; Gotterbarm, Tobias; Weiss, Stefan
2016-07-01
Excessive early migration of cemented stems and cups after THA has been associated with poor long-term survival and allows predictable evaluation of implant performance. However, there are few data regarding the relationship between early migration and aseptic loosening of cementless femoral components, and whether early migration might predict late failure has not been evaluated, to our knowledge. Einzel-Bild-Röntgen-Analyse-femoral component analysis (EBRA-FCA) is a validated technique to accurately measure axial femoral stem migration without the need for tantalum markers, can be performed retrospectively, and may be a suitable tool to identify poor performing implants before their widespread use. We asked: (1) Is axial migration within the first 24 months as assessed by EBRA-FCA greater among cementless stems that develop aseptic loosening than those that remain well fixed through the second decade; (2) what is the diagnostic performance of implant migration at 24 months postoperatively to predict later aseptic loosening of these components; and (3) how does long-term stem survivorship compare between groups with high and low early migration? We evaluated early axial stem migration in 158 cementless THAs using EBRA-FCA. The EBRA-FCA measurements were performed during the first week postoperatively (baseline measurement) and at regular followups of 3, 6, and 12 months postoperatively and annually thereafter. The mean duration of followup was 21 years (range, 18-24 years). The stems studied represented 45% (158 of 354) of the cementless THAs performed during that time, and cementless THAs represented 34% (354 of 1038) of the THA practice during that period. No patient enrolled in this study was lost to followup. Multivariate survivorship analysis using Cox's regression model was performed with an endpoint of aseptic loosening of the femoral component. Loosening was defined according to the criteria described by Engh et al. and assessed by two independent observers. Patients with a diagnosis of prosthetic joint infection were excluded. Receiver operating characteristic (ROC) curve analysis was used to evaluate diagnostic performance of axial stem migration 1, 2, 3, and 4 years postoperatively as a predictor of aseptic loosening. Survivorship of hips with high (≥ 2.7 mm) and low (< 2.7 mm) migration was compared using a competing-events analysis. Femoral components that had aseptic loosening develop showed greater mean distal migration at 24 months postoperatively than did components that remained well fixed throughout the surveillance period (4.2 mm ± 3.1 mm vs 0.8 mm ± 0.9 mm; mean difference, 3.4 mm, 95% CI, 2.5-4.4; p ≤ 0.001). Distal migration at 24 months postoperatively was a strong risk factor for aseptic loosening (hazard ratio, 1.98; 95% CI, 1.51-2.57; p < 0.001). The associated overall diagnostic performance of 2-year distal migration for predicting aseptic loosening was good (area under the ROC curve, 0.86; 95% CI, 0.72-1.00; p < 0.001). Sensitivity of early migration measurement was high for the prediction of aseptic loosening during the first decade after surgery but decreased markedly thereafter. Stems with large amounts of early migration (≥ 2.7 mm) had lower 18-year survivorship than did stems with little early migration (29% [95% CI, 0%-62%] versus 95% [95% CI, 90%-100%] p < 0.001). 
Early migration, as measured by EBRA-FCA at 2 years postoperatively, has good diagnostic capabilities for detection of uncemented femoral components at risk for aseptic loosening during the first and early second decades after surgery. However, there was no relationship between early migration patterns and aseptic loosening during the late second and third decades. EBRA-FCA can be used as a research tool to evaluate new cementless stems or in clinical practice to evaluate migration patterns in patients with painful femoral components. Level III, diagnostic study.
Component Analysis of Remanent Magnetization Curves: A Revisit with a New Model Distribution
NASA Astrophysics Data System (ADS)
Zhao, X.; Suganuma, Y.; Fujii, M.
2017-12-01
Geological samples often consist of several magnetic components that have distinct origins. As the magnetic components are often indicative of their underlying geological and environmental processes, it is desirable to identify individual components to extract the associated information. This component analysis can be achieved using the so-called unmixing method, which fits a mixture of certain end-member model distributions to the measured remanent magnetization curve. In earlier studies, the lognormal, skew generalized Gaussian and skewed Gaussian distributions have been used as end-member model distributions, with the fitting performed on the gradient of the remanent magnetization curves. However, gradient curves are sensitive to measurement noise, as differentiation of the measured curve amplifies noise, which can deteriorate the component analysis. Though either smoothing or filtering can be applied to reduce the noise before differentiation, their potential to bias the component analysis has only been vaguely addressed. In this study, we investigated a new model function that can be applied directly to the remanent magnetization curves and therefore avoids differentiation. The new model function provides a more flexible shape than the lognormal distribution, which is a merit for modeling the coercivity distribution of complex magnetic components. We applied the unmixing method to both model and measured data, and compared the results with those obtained using other model distributions to better understand their interchangeability, applicability and limitations. The analyses on model data suggest that unmixing methods are inherently sensitive to noise, especially when the number of components is over two. It is, therefore, recommended to verify the reliability of component analysis by running multiple analyses with synthetic noise. Marine sediments and seafloor rocks are analyzed with the new model distribution. Given the same component number, the new model distribution provides closer fits than the lognormal distribution, as evidenced by reduced residuals. Moreover, the new unmixing protocol is automated so that users are freed from the labor of providing initial guesses for the parameters, which also helps to reduce the subjectivity of the component analysis.
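Because the new model function is not specified in the abstract, the sketch below uses the classical lognormal end-member as a stand-in, but it follows the key point of fitting the acquisition curve directly rather than its noise-amplifying derivative; the field range, component parameters, and noise level are arbitrary.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hedged sketch: fit an IRM acquisition curve directly (no differentiation)
# with a two-component mixture of cumulative lognormal coercivity
# distributions; logB is log10 of the applied field.
def component(logB, amplitude, mean, sigma):
    return amplitude * norm.cdf(logB, loc=mean, scale=sigma)

def two_component_model(logB, a1, m1, s1, a2, m2, s2):
    return component(logB, a1, m1, s1) + component(logB, a2, m2, s2)

rng = np.random.default_rng(0)
logB = np.linspace(0.5, 3.0, 60)                        # ~3 mT to 1 T, in log10(mT)
truth = two_component_model(logB, 1.0, 1.5, 0.25, 0.4, 2.4, 0.20)
measured = truth + 0.01 * rng.standard_normal(logB.size)

p0 = [0.8, 1.4, 0.3, 0.5, 2.3, 0.3]                     # initial guesses
popt, _ = curve_fit(two_component_model, logB, measured, p0=p0)
print("fitted (amplitude, mean, sigma) per component:", popt.round(2))
```

Repeating the fit with several realizations of added synthetic noise, as the abstract recommends, gives a direct check of how stable the recovered component parameters are.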
Zhe, Gao; Ying-Chun, Wang; Yan-Xu, Chang
2016-01-01
Using a high-performance liquid chromatography method coupled with diode array detection and electrospray ionization tandem mass spectrometry (HPLC-DAD-MSn), qualitative and quantitative analyses of flavonoids in the stems, leaves, fruits and seeds, and of anthocyanidins in the fresh fruits of Nitraria tangutorum were performed. A total of 14 flavonoid components were identified from the seeds of N. tangutorum, including three quercetin derivatives, three kaempferol derivatives, and eight isorhamnetin derivatives. A total of 12, 10, and 7 flavonoid components were identified from the leaves, stems, and fruits of N. tangutorum, respectively; all were also present in the seeds. The total content of flavonoids in leaves was the highest, up to 42.43 mg/g dry weight. A total of 12 anthocyanidin components, belonging to five anthocyanidins, were identified from the fresh fruits of N. tangutorum. The total content of anthocyanidins in fresh fruits was up to 45.83 mg/100 g fresh weight, of which the acylated anthocyanidins accounted for 65.7%. The HPLC-DAD-MSn method can be operated easily, rapidly, and accurately, and is feasible for qualitative and quantitative analysis of flavone glycosides in N. tangutorum.
Staffing for Cyberspace Operations: Summary of Analysis
2016-08-01
appropriate total force mix, defined as the choice between military, civilian, and contractor performance of DoD activities, is a key component in this... designated for civilian performance if the requirement is inherently governmental or subject to least-cost government civilian or contractor performance if... activities open to the least costly performance type (government civilian or contractor). To understand the CMF mission requirements, we studied existing
Nonflammable, Nonaqueous, Low Atmospheric Impact, High Performance Cleaning Solvents
NASA Technical Reports Server (NTRS)
Dhooge, P. M.; Glass, S. M.; Nimitz, J. S.
2001-01-01
For many years, chlorofluorocarbon (CFC) and chlorocarbon solvents have played an important part in aerospace operations. These solvents found extensive use as cleaning and analysis (EPA) solvents in precision and critical cleaning. However, CFCs and chlorocarbon solvents have deleterious effects on the ozone layer, are relatively strong greenhouse gases, and some are suspect or known carcinogens. Because of their ozone-depletion potential (ODP), the Montreal Protocol and its amendments, as well as other environmental regulations, have resulted in the phaseout of CFC-113 and 1,1,1-trichloroethane (TCA). Although alternatives have been recommended, they do not perform as well as the original solvents. In addition, some analyses, such as the infrared analysis of extracted hydrocarbons, cannot be performed with the substitute solvents that contain C-H bonds. CFC-113 solvent has been used for many critical aerospace applications. CFC-113, also known as Freon (registered) TF, has been used extensively in NASA's cleaning facilities for precision and critical cleaning, in particular the final rinsing in Class 100 areas, with gas chromatography analysis of rinse residue. While some cleaning can be accomplished by other processes, there are certain critical applications where CFC-113 or a similar solvent is highly cost-effective and ensures safety. Oxygen system components are one example where a solvent compatible with oxygen and capable of removing fluorocarbon grease is needed. Electronic components and precision mechanical components can also be damaged by aggressive cleaning solvents.
Sun, Qian; Chang, Lu; Ren, Yanping; Cao, Liang; Sun, Yingguang; Du, Yingfeng; Shi, Xiaowei; Wang, Qiao; Zhang, Lantong
2012-11-01
A novel method based on high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry was developed for simultaneous determination of the 11 major active components including ten flavonoids and one phenolic acid in Cirsium setosum. Separation was performed on a reversed-phase C(18) column with gradient elution of methanol and 0.1‰ acetic acid (v/v). The identification and quantification of the analytes were achieved on a hybrid quadrupole linear ion trap mass spectrometer. Multiple-reaction monitoring scanning was employed for quantification with switching electrospray ion source polarity between positive and negative modes in a single run. Full validation of the assay was carried out including linearity, precision, accuracy, stability, limits of detection and quantification. The results demonstrated that the method developed was reliable, rapid, and specific. The 25 batches of C. setosum samples from different sources were first determined using the developed method and the total contents of 11 analytes ranged from 1717.460 to 23028.258 μg/g. Among them, the content of linarin was highest, and its mean value was 7340.967 μg/g. Principal component analysis and hierarchical clustering analysis were performed to differentiate and classify the samples, which is helpful for comprehensive evaluation of the quality of C. setosum. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Comprehensive Analysis of Two Downburst-Related Aircraft Accidents
NASA Technical Reports Server (NTRS)
Shen, J.; Parks, E. K.; Bach, R. E.
1996-01-01
Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components, F1 and F2, representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F1 was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.
Nguyen, Phuong H
2007-05-15
Principal component analysis is a powerful method for projecting the multidimensional conformational space of peptides or proteins onto lower dimensional subspaces in which the main conformations are present, making it easier to reveal the structures of molecules from, e.g., molecular dynamics simulation trajectories. However, the identification of all conformational states is still difficult if the subspaces consist of more than two dimensions. This is mainly due to the fact that the principal components are not independent of each other, and states in the subspaces cannot be visualized. In this work, we propose a simple and fast scheme that allows one to obtain all conformational states in the subspaces. The basic idea is that instead of directly identifying the states in the subspace spanned by principal components, we first transform this subspace into another subspace formed by components that are independent of one another. These independent components are obtained from the principal components by employing the independent component analysis method. Because of the independence between components, all states in this new subspace are defined as all possible combinations of the states obtained from each single independent component. This makes the conformational analysis much simpler. We test the performance of the method by analyzing the conformations of the glycine tripeptide and the alanine hexapeptide. The analyses show that our method is simple and quickly reveals all conformational states in the subspaces. The folding pathways between the identified states of the alanine hexapeptide are analyzed and discussed in some detail. 2007 Wiley-Liss, Inc.
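A compact sketch of the proposed scheme, with random two-state data standing in for molecular dynamics features: PCA reduces the dimension, FastICA makes the coordinates independent, each independent component is split into one-dimensional states (a median split stands in for locating histogram minima), and the joint states are their combinations.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Toy trajectory: two conformational basins in a 12-dimensional feature space.
rng = np.random.default_rng(0)
frames = np.vstack([rng.normal(0, 0.3, (500, 12)),
                    rng.normal(1, 0.3, (500, 12))])

pcs = PCA(n_components=3).fit_transform(frames)            # principal subspace
ics = FastICA(n_components=3, random_state=0).fit_transform(pcs)   # independent coordinates

# One-dimensional state assignment per independent component; the median
# split stands in for locating the minima of each 1-D histogram.
per_ic_state = (ics > np.median(ics, axis=0)).astype(int)
joint_state = per_ic_state[:, 0] * 4 + per_ic_state[:, 1] * 2 + per_ic_state[:, 2]
print("populated joint states:", np.unique(joint_state))
```

Because the coordinates are independent, the joint states are simply the Cartesian product of the one-dimensional states, which is the simplification the method exploits.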
Hakimzadeh, Neda; Parastar, Hadi; Fattahi, Mohammad
2014-01-24
In this study, multivariate curve resolution (MCR) and multivariate classification methods are proposed to develop a new chemometric strategy for comprehensive analysis of high-performance liquid chromatography-diode array absorbance detection (HPLC-DAD) fingerprints of sixty Salvia reuterana samples from five different geographical regions. Different chromatographic problems occurred during HPLC-DAD analysis of S. reuterana samples, such as baseline/background contribution and noise, low signal-to-noise ratio (S/N), asymmetric peaks, elution time shifts, and peak overlap are handled using the proposed strategy. In this way, chromatographic fingerprints of sixty samples are properly segmented to ten common chromatographic regions using local rank analysis and then, the corresponding segments are column-wise augmented for subsequent MCR analysis. Extended multivariate curve resolution-alternating least squares (MCR-ALS) is used to obtain pure component profiles in each segment. In general, thirty-one chemical components were resolved using MCR-ALS in sixty S. reuterana samples and the lack of fit (LOF) values of MCR-ALS models were below 10.0% in all cases. Pure spectral profiles are considered for identification of chemical components by comparing their resolved spectra with the standard ones and twenty-four components out of thirty-one components were identified. Additionally, pure elution profiles are used to obtain relative concentrations of chemical components in different samples for multivariate classification analysis by principal component analysis (PCA) and k-nearest neighbors (kNN). Inspection of the PCA score plot (explaining 76.1% of variance accounted for three PCs) showed that S. reuterana samples belong to four clusters. The degree of class separation (DCS) which quantifies the distance separating clusters in relation to the scatter within each cluster is calculated for four clusters and it was in the range of 1.6-5.8. These results are then confirmed by kNN. In addition, according to the PCA loading plot and kNN dendrogram of thirty-one variables, five chemical constituents of luteolin-7-o-glucoside, salvianolic acid D, rosmarinic acid, lithospermic acid and trijuganone A are identified as the most important variables (i.e., chemical markers) for clusters discrimination. Finally, the effect of different chemical markers on samples differentiation is investigated using counter-propagation artificial neural network (CP-ANN) method. It is concluded that the proposed strategy can be successfully applied for comprehensive analysis of chromatographic fingerprints of complex natural samples. Copyright © 2013 Elsevier B.V. All rights reserved.
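The core of an MCR-ALS decomposition can be sketched in a few lines of NumPy: the data matrix is alternately regressed onto the current spectral and elution estimates, non-negativity is imposed by clipping, and the lack of fit is reported as a percentage. This is a bare-bones illustration on a toy two-component chromatogram, not the constrained, column-wise augmented MCR-ALS implementation used in the study.

```python
import numpy as np

def mcr_als(D, n_comp, n_iter=200, seed=0):
    """Minimal MCR-ALS sketch: D (times x wavelengths) ~= C @ S.T with
    non-negativity imposed by clipping after each least-squares step."""
    rng = np.random.default_rng(seed)
    S = np.abs(rng.standard_normal((D.shape[1], n_comp)))               # spectra init
    for _ in range(n_iter):
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)  # elution profiles
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)    # spectra
    lof = 100 * np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)         # lack of fit, %
    return C, S, lof

# Toy two-component chromatogram with overlapping peaks and noise.
t = np.linspace(0, 10, 300)
wl = np.linspace(200, 400, 100)
C_true = np.column_stack([np.exp(-(t - 4) ** 2), np.exp(-(t - 5) ** 2 / 2)])
S_true = np.column_stack([np.exp(-(wl - 260) ** 2 / 300), np.exp(-(wl - 320) ** 2 / 500)])
D = C_true @ S_true.T + 0.01 * np.random.default_rng(1).standard_normal((300, 100))

C, S, lof = mcr_als(D, n_comp=2)
print(f"lack of fit: {lof:.1f}%")
```

In the study, the same decomposition is applied segment by segment to the column-wise augmented fingerprints of all sixty samples, and the resolved elution areas then feed the PCA and kNN classification steps.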
Spatially resolved spectroscopy analysis of the XMM-Newton large program on SN1006
NASA Astrophysics Data System (ADS)
Li, Jiang-Tao; Decourchelle, Anne; Miceli, Marco; Vink, Jacco; Bocchino, Fabrizio
2016-04-01
We perform analysis of the XMM-Newton large program on SN1006 based on our newly developed methods of spatially resolved spectroscopy analysis. We extract spectra from high and low resolution meshes. The former (3596 meshes) are used to roughly decompose the thermal and non-thermal components and characterize the spatial distributions of different parameters, such as temperature, abundances of different elements, ionization age, and electron density of the thermal component, as well as photon index and cutoff frequency of the non-thermal component. On the other hand, the low resolution meshes (583 meshes) focus on the interior region dominated by the thermal emission and have enough counts to well characterize the Si lines. We fit the spectra from the low resolution meshes with different models, in order to decompose the multiple plasma components at different thermal and ionization states and compare their spatial distributions. In this poster, we will present the initial results of this project.
Computer-Aided Design of Low-Noise Microwave Circuits
NASA Astrophysics Data System (ADS)
Wedge, Scott William
1991-02-01
Devoid of most natural and manmade noise, microwave frequencies have detection sensitivities limited by internally generated receiver noise. Low-noise amplifiers are therefore critical components in radio astronomical antennas, communications links, radar systems, and even home satellite dishes. A general technique to accurately predict the noise performance of microwave circuits has been lacking. Current noise analysis methods have been limited to specific circuit topologies or neglect correlation, a strong effect in microwave devices. Presented here are generalized methods, developed for computer-aided design implementation, for the analysis of linear noisy microwave circuits comprised of arbitrarily interconnected components. Included are descriptions of efficient algorithms for the simultaneous analysis of noisy and deterministic circuit parameters based on a wave variable approach. The methods are therefore particularly suited to microwave and millimeter-wave circuits. Noise contributions from lossy passive components and active components with electronic noise are considered. Also presented is a new technique for the measurement of device noise characteristics that offers several advantages over current measurement methods.
NASA Astrophysics Data System (ADS)
Seo, Jihye; An, Yuri; Lee, Jungsul; Choi, Chulhee
2015-03-01
Indocyanine green (ICG), a near-infrared fluorophore, has been used in visualization of vascular structure and non-invasive diagnosis of vascular disease. Although many imaging techniques have been developed, there are still limitations in the diagnosis of vascular diseases. We have recently developed a minimally invasive diagnostic system based on ICG fluorescence imaging for sensitive detection of vascular insufficiency. In this study, we used principal component analysis (PCA) to examine the ICG spatiotemporal profile and to obtain pathophysiological information from ICG dynamics. Here we demonstrated that principal components of ICG dynamics in both feet showed significant differences between normal controls and diabetic patients with vascular complications. We extracted the PCA time courses of the first three components and found a distinct pattern in diabetic patients. We propose that PCA of ICG dynamics reveals better classification performance compared to fluorescence intensity analysis. We anticipate that specific features of spatiotemporal ICG dynamics can be useful in the diagnosis of various vascular diseases.
NASA Astrophysics Data System (ADS)
Wei, Wenjuan; Liu, Jiangang; Dai, Ruwei; Feng, Lu; Li, Ling; Tian, Jie
2014-03-01
Previous behavioral research has shown that individuals process own- and other-race faces differently. One well-known effect is the other-race effect (ORE), which indicates that individuals categorize other-race faces more accurately and faster than own-race faces. Existing functional magnetic resonance imaging (fMRI) studies of the other-race effect have mainly focused on racial prejudice and socio-affective differences towards own- and other-race faces. In the present fMRI study, we adopted a race-categorization task to determine the activation level differences between categorizing own- and other-race faces. Thirty-one Chinese participants, who live in China where Chinese people are the majority and who had no direct contact with Caucasian individuals, were recruited in the present study. We used group independent component analysis (ICA), a method of blind source signal separation that has proven promising for the analysis of fMRI data. We separated the data into 56 components, a number estimated based on one subject using the Minimum Description Length (MDL) criterion. The components were sorted based on the multiple linear regression temporal sorting criterion, and the fitted regression parameters were used in statistical tests to evaluate the task-relatedness of the components. A one-way ANOVA was performed to test the significance of the component time courses across conditions. Our results showed that areas whose coordinates are similar to the right FFA coordinates reported in previous studies were more strongly activated for own-race faces than other-race faces, while the precuneus showed greater activation for other-race faces than own-race faces.
NASA Astrophysics Data System (ADS)
Hart, D. M.; Merchant, B. J.; Abbott, R. E.
2012-12-01
The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle, or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.
Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection
NASA Astrophysics Data System (ADS)
Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa
2018-01-01
A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. The RSRPCA combines advantages of randomized column subspace and robust principal component analysis (RPCA). It assumes that the background has low-rank properties, and the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows could greatly reduce the computational requirements of RSRPCA. Second, the RSRPCA adopts the columnwise RPCA (CWRPCA) to eliminate negative effects of sampled anomaly pixels and that purifies the previous randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (i.e., background component), a noisy matrix (i.e., noise component), and a sparse anomaly matrix (i.e., anomaly component) with only a small proportion of nonzero columns. The algorithm of inexact augmented Lagrange multiplier is utilized to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all the pixels are projected onto the complemental subspace of the purified randomized column subspace of the background and the anomaly pixels in the original HSI data are finally exactly located. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms four comparison methods both in detection performance and in computational time.
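The sketch below reproduces only the first and last ingredients of the approach, omitting the CWRPCA purification of the sampled columns: random pixel columns give a low-rank background basis, and every pixel is scored by the norm of its projection onto the orthogonal complement of that basis. Data sizes, ranks, and anomaly magnitudes are arbitrary stand-ins for a real hyperspectral scene.

```python
import numpy as np

# Hedged sketch: randomized column subspace + complement-projection scoring.
rng = np.random.default_rng(0)
n_bands, n_pixels = 50, 10000
background = rng.standard_normal((n_bands, 5)) @ rng.standard_normal((5, n_pixels))
X = background + 0.1 * rng.standard_normal((n_bands, n_pixels))
anomaly_idx = rng.choice(n_pixels, size=20, replace=False)
X[:, anomaly_idx] += rng.normal(0, 5, size=(n_bands, 20))        # implant anomalies

cols = rng.choice(n_pixels, size=200, replace=False)             # random column sketch
U, s, _ = np.linalg.svd(X[:, cols], full_matrices=False)
Q = U[:, :10]                                                     # low-rank background basis
residual = X - Q @ (Q.T @ X)                                      # complement projection
score = np.linalg.norm(residual, axis=0)

detected = np.argsort(score)[-20:]
print("recovered", np.intersect1d(detected, anomaly_idx).size, "of 20 implanted anomalies")
```

The purpose of the omitted CWRPCA step is precisely to remove any anomaly columns that happen to fall into the random sketch, so that the background basis is not contaminated before the projection step.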
Vergara, Victor M; Ulloa, Alvaro; Calhoun, Vince D; Boutte, David; Chen, Jiayu; Liu, Jingyu
2014-09-01
Multi-modal data analysis techniques, such as the Parallel Independent Component Analysis (pICA), are essential in neuroscience, medical imaging and genetic studies. The pICA algorithm allows the simultaneous decomposition of up to two data modalities achieving better performance than separate ICA decompositions and enabling the discovery of links between modalities. However, advances in data acquisition techniques facilitate the collection of more than two data modalities from each subject. Examples of commonly measured modalities include genetic information, structural magnetic resonance imaging (MRI) and functional MRI. In order to take full advantage of the available data, this work extends the pICA approach to incorporate three modalities in one comprehensive analysis. Simulations demonstrate the three-way pICA performance in identifying pairwise links between modalities and estimating independent components which more closely resemble the true sources than components found by pICA or separate ICA analyses. In addition, the three-way pICA algorithm is applied to real experimental data obtained from a study that investigate genetic effects on alcohol dependence. Considered data modalities include functional MRI (contrast images during alcohol exposure paradigm), gray matter concentration images from structural MRI and genetic single nucleotide polymorphism (SNP). The three-way pICA approach identified links between a SNP component (pointing to brain function and mental disorder associated genes, including BDNF, GRIN2B and NRG1), a functional component related to increased activation in the precuneus area, and a gray matter component comprising part of the default mode network and the caudate. Although such findings need further verification, the simulation and in-vivo results validate the three-way pICA algorithm presented here as a useful tool in biomedical data fusion applications. Copyright © 2014 Elsevier Inc. All rights reserved.
Liu, Wei; Wang, Dongmei; Liu, Jianjun; Li, Dengwu; Yin, Dongxue
2016-01-01
The present study was performed to assess the quality of Potentilla fruticosa L. sampled from distinct regions of China using high performance liquid chromatography (HPLC) fingerprinting coupled with a suite of chemometric methods. For this quantitative analysis, the main active phytochemical compositions and the antioxidant activity in P. fruticosa were also investigated. Considering the high percentages and antioxidant activities of phytochemicals, P. fruticosa samples from Kangding, Sichuan were selected as the most valuable raw materials. Similarity analysis (SA) of HPLC fingerprints, hierarchical cluster analysis (HCA), principal component analysis (PCA), and discriminant analysis (DA) were further employed to provide accurate classification and quality estimates of P. fruticosa. Two principal components (PCs) were collected by PCA. PC1 separated samples from Kangding, Sichuan, capturing 57.64% of the variance, whereas PC2 contributed to further separation, capturing 18.97% of the variance. Two kinds of discriminant functions with a 100% discrimination ratio were constructed. The results strongly supported the conclusion that the eight samples from different regions were clustered into three major groups, corresponding with their morphological classification, for which HPLC analysis confirmed the considerable variation in phytochemical compositions and that P. fruticosa samples from Kangding, Sichuan were of high quality. The results of SA, HCA, PCA, and DA were in agreement and performed well for the quality assessment of P. fruticosa. Consequently, HPLC fingerprinting coupled with chemometric techniques provides a highly flexible and reliable method for the quality evaluation of traditional Chinese medicines. PMID:26890416
Partial Analysis of Insta-Foam
NASA Technical Reports Server (NTRS)
Chou, L. W.
1983-01-01
Insta-Foam, used as a thermal insulator for the non-critical area of the external tank during the prelaunch phase to minimize icing, is a two-component system. Component A has polyisocyanates, blowing agents, and stabilizers; Component B has the polyols, catalysts, blowing agents, stabilizers and fire retardant. The blowing agents are Freon 11 and Freon 12, the stabilizers are silicone surfactants, the catalysts are tertiary amines, and the fire retardant is tri-(beta-chloro-isopropyl) phosphate (PCF). High performance liquid chromatography (HPLC) was used to quantitatively identify the polyols and PCF.
COMCAN: a computer program for common cause analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burdick, G.R.; Marshall, N.H.; Wilson, J.R.
1976-05-01
The computer program, COMCAN, searches the fault tree minimal cut sets for shared susceptibility to various secondary events (common causes) and common links between components. In the case of common causes, a location check may also be performed by COMCAN to determine whether barriers to the common cause exist between components. The program can locate common manufacturers of components having events in the same minimal cut set. A relative ranking scheme for secondary event susceptibility is included in the program.
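The common cause search itself amounts to set intersections over the minimal cut sets, which the following hedged sketch illustrates (component names, susceptibilities, and locations are invented; this is not the COMCAN code):

```python
# Hedged sketch: scan minimal cut sets for a secondary-event susceptibility
# shared by every component in the set, and note whether the components also
# share a location (i.e., whether a barrier to the common cause exists).
susceptibility = {
    "pump_A":  {"fire", "flood"},
    "pump_B":  {"fire"},
    "valve_C": {"fire", "seismic"},
}
location = {"pump_A": "room_1", "pump_B": "room_1", "valve_C": "room_2"}
minimal_cut_sets = [{"pump_A", "pump_B"}, {"pump_A", "valve_C"}]

for cut_set in minimal_cut_sets:
    shared = set.intersection(*(susceptibility[c] for c in cut_set))
    same_room = len({location[c] for c in cut_set}) == 1
    if shared:
        note = "no barrier" if same_room else "separated by location"
        print(sorted(cut_set), "common cause candidates:", sorted(shared), "-", note)
```

A common-manufacturer check, as mentioned in the abstract, works the same way with a manufacturer attribute in place of the susceptibility sets, and a relative ranking can be produced by weighting the shared susceptibilities.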
NASA Astrophysics Data System (ADS)
Azadeh, A.; Foroozan, H.; Ashjari, B.; Motevali Haghighi, S.; Yazdanparast, R.; Saberi, M.; Torki Nejad, M.
2017-10-01
Information systems (ISs) and information technologies (ITs) play a critical role in large, complex gas corporations. Many factors, such as human, organisational and environmental factors, affect ISs in an organisation. Therefore, investigating IS success is considered to be a complex problem. Also, because of the competitive business environment and the high amount of information flow in organisations, new issues like resilient ISs and successful customer relationship management (CRM) have emerged. A resilient IS will provide sustainable delivery of information to internal and external customers. This paper presents an integrated approach to enhance and optimise the performance of each component of a large IS based on CRM and resilience engineering (RE) in a gas company. The enhancement of performance can help ISs to perform business tasks efficiently. The data are collected from standard questionnaires. They are then analysed by data envelopment analysis, selecting the optimal mathematical programming approach. The selected model is validated and verified by the principal component analysis method. Finally, CRM and RE factors are identified as influential factors through sensitivity analysis for this particular case study. To the best of our knowledge, this is the first study of performance assessment and optimisation of a large IS by combined RE and CRM.
Catanuto, Giuseppe; Taher, Wafa; Rocco, Nicola; Catalano, Francesca; Allegra, Dario; Milotta, Filippo Luigi Maria; Stanco, Filippo; Gallo, Giovanni; Nava, Maurizio Bruno
2018-03-20
Breast shape is defined utilizing mainly qualitative assessments (full, flat, ptotic) or estimates, such as volume or distances between reference points, that cannot describe it reliably. We quantitatively describe breast shape with two parameters derived from a statistical methodology termed principal component analysis (PCA). We created a heterogeneous dataset of breast shapes acquired with a commercial infrared 3-dimensional scanner on which PCA was performed. We plotted on a Cartesian plane the two highest values of PCA for each breast (principal components 1 and 2). Testing of the methodology on a preoperative and postoperative surgical case, and test-retest assessment, were performed by two operators. The first two principal components derived from PCA are able to characterize the shape of the breasts included in the dataset. The test-retest demonstrated that different operators are able to obtain very similar PCA values. The system is also able to identify major changes in the preoperative and postoperative stages of a two-stage reconstruction. Even minor changes were correctly detected by the system. This methodology can reliably describe the shape of a breast. An expert operator and a newly trained operator can reach similar results in a test-retest validation. Once developed and after further validation, this methodology could be employed as a good tool for outcome evaluation, auditing, and benchmarking.
Shallwani, Shirin M; Simmonds, Maureen J; Kasymjanova, Goulnar; Spahija, Jadranka
2016-09-01
Our objectives were: (a) to identify predictors of change in health-related quality of life (HRQOL) in patients with advanced non-small cell lung cancer (NSCLC) undergoing chemotherapy; and (b) to characterize symptom status, nutritional status, physical performance and HRQOL in this population and to estimate the extent to which these variables change following two cycles of chemotherapy. A secondary analysis of a longitudinal observational study of 47 patients (24 men and 23 women) with newly diagnosed advanced NSCLC receiving two cycles of first-line chemotherapy was performed. Primary outcomes were changes in HRQOL (physical and mental component summaries (PCS and MCS) of the 36-item Short-Form Health Survey (SF-36)). Predictors in the models included pre-chemotherapy patient-reported symptoms (Schwartz Cancer Fatigue Scale (SCFS) and Lung Cancer Subscale), nutritional screening (Patient-Generated Subjective Global Assessment) and physical performance measures (6-min Walk Test (6MWT), one-minute chair rise test and grip strength). Mean SF-36 PCS score, 6MWT distance and grip strength declined following two cycles of chemotherapy (p<0.05). Multiple linear regression modelling revealed pre-chemotherapy SCFS score and 6MWT distance as the strongest predictors of change in the mental component of HRQOL accounting for 13% and 9% of the variance, respectively. No significant predictors were found for change in the physical component of HRQOL. Pre-chemotherapy 6MWT distance and fatigue severity predicted change in the mental component of HRQOL in patients with advanced NSCLC undergoing chemotherapy, while physical performance declined during treatment. Clinical management of these factors may be useful for HRQOL optimization in this population. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Stability analysis of an autocatalytic protein model
NASA Astrophysics Data System (ADS)
Lee, Julian
2016-05-01
A self-regulatory genetic circuit, where a protein acts as a positive regulator of its own production, is known to be the simplest biological network with a positive feedback loop. Although at least three components—DNA, RNA, and the protein—are required to form such a circuit, stability analysis of the fixed points of this self-regulatory circuit has been performed only after reducing the system to a two-component system, either by assuming a fast equilibration of the DNA component or by removing the RNA component. Here, stability of the fixed points of the three-component positive feedback loop is analyzed by obtaining eigenvalues of the full three-dimensional Hessian matrix. In addition to rigorously identifying the stable fixed points and saddle points, detailed information about the system can be obtained, such as the existence of complex eigenvalues near a fixed point.
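A numerical version of such a fixed-point stability check can be sketched as follows. The three-variable model (DNA activity d, mRNA m, protein p with Hill-type positive feedback) is an assumed illustrative form, not necessarily the model of the paper; stability is read off from the eigenvalues of a numerically estimated Jacobian at a fixed point.

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative three-component positive feedback loop (assumed form):
# d: active DNA fraction, m: mRNA, p: protein acting as its own positive regulator.
def rhs(x, k_on=1.0, k_off=1.0, a=2.0, b=1.5, dm=1.0, dp=1.0, n=2):
    d, m, p = x
    return np.array([
        k_on * p**n / (1.0 + p**n) * (1.0 - d) - k_off * d,  # DNA activation by protein
        a * d - dm * m,                                       # transcription and mRNA decay
        b * m - dp * p,                                       # translation and protein decay
    ])

def jacobian(x, eps=1e-6):
    """Central-difference estimate of the 3x3 Jacobian at state x."""
    J = np.zeros((3, 3))
    for j in range(3):
        dx = np.zeros(3); dx[j] = eps
        J[:, j] = (rhs(x + dx) - rhs(x - dx)) / (2 * eps)
    return J

fixed_point = fsolve(rhs, x0=np.array([0.5, 1.0, 1.0]))
eigvals = np.linalg.eigvals(jacobian(fixed_point))
print("fixed point:", fixed_point)
print("eigenvalues:", eigvals, "-> stable" if np.all(eigvals.real < 0) else "-> unstable/saddle")
```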
Wang, Shu-Ping; Liu, Lei; Wang, Ling-Ling; Jiang, Peng; Zhang, Ji-Quan; Zhang, Wei-Dong; Liu, Run-Hui
2010-06-15
Based on the serum pharmacochemistry technique and high-performance liquid chromatography/diode-array detection (HPLC/DAD) coupled with electrospray tandem mass spectrometry (HPLC/ESI-MS/MS), a method for the screening and analysis of the multiple absorbed bioactive components and metabolites of Jitai tablets (JTT) in orally dosed rat plasma was developed. Plasma was treated by methanol precipitation prior to liquid chromatography, and the separation was carried out on a Symmetry C(18) column with a linear gradient (0.1% formic acid/water/acetonitrile). Mass spectra were acquired in negative and positive ion modes, respectively. As a result, 26 bioactive components originating from JTT and 5 metabolites were tentatively identified in orally dosed rat plasma by comparing their retention times and MS spectra with those of authentic standards and literature data. It is concluded that an effective and reliable analytical method was set up for screening the bioactive components of a Chinese herbal medicine, which provides a meaningful basis for further pharmacology and active-mechanism research on JTT. Copyright (c) 2010 John Wiley & Sons, Ltd.
Racetrack resonator as a loss measurement platform for photonic components.
Jones, Adam M; DeRose, Christopher T; Lentine, Anthony L; Starbuck, Andrew; Pomerene, Andrew T S; Norwood, Robert A
2015-11-02
This work represents the first complete analysis of the use of a racetrack resonator to measure the insertion loss of efficient, compact photonic components. Beginning with an in-depth analysis of potential error sources and a discussion of the calibration procedure, the technique is used to estimate the insertion loss of waveguide width tapers of varying geometry with a resulting 95% confidence interval of 0.007 dB. The work concludes with a performance comparison of the analyzed tapers with results presented for four taper profiles and three taper lengths.
NASA Astrophysics Data System (ADS)
Haram, M.; Wang, T.; Gu, F.; Ball, A. D.
2012-05-01
Motor current signal analysis has been an effective way of monitoring electrical machines themselves for many years. However, little work has been carried out in using this technique for monitoring their downstream equipment because of difficulties in extracting small fault components from the measured current signals. This paper investigates the characteristics of electrical current signals for monitoring faults in a downstream gearbox using a modulation signal bispectrum (MSB), including phase effects in extracting small modulating components from a noisy measurement. An analytical study is first performed to understand the amplitude, frequency and phase characteristics of current signals due to faults. It then explores the performance of MSB analysis in detecting weak modulating components in current signals. An experimental study based on a 10 kW two-stage gearbox, driven by a three-phase induction motor, shows that MSB peaks at different rotational frequencies can be used to quantify the severity of gear tooth breakage and the degree of shaft misalignment. In addition, the type and location of a fault can be recognized based on the frequency at which the change of the MSB peak is largest.
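A minimal sketch of a segment-averaged modulation signal bispectrum estimate is given below, assuming one common definition, B_MS(f_c, f_m) = E[X(f_c+f_m) X(f_c-f_m) X*(f_c) X*(f_c)]; the simulated current-like signal and the carrier/modulation frequencies are purely illustrative.

```python
import numpy as np

fs, n_seg, seg_len = 1000, 64, 500
t = np.arange(n_seg * seg_len) / fs
# Illustrative current-like signal: 50 Hz supply amplitude-modulated at 6 Hz (a "fault" sideband), plus noise.
x = (1.0 + 0.05 * np.cos(2 * np.pi * 6 * t)) * np.cos(2 * np.pi * 50 * t)
x += 0.1 * np.random.default_rng(1).standard_normal(x.size)

def msb(x, seg_len, fc_bin, fm_bin):
    """Segment-averaged modulation signal bispectrum at one (carrier, modulation) bin pair."""
    segs = x[: (len(x) // seg_len) * seg_len].reshape(-1, seg_len)
    X = np.fft.rfft(segs * np.hanning(seg_len), axis=1)
    return np.mean(X[:, fc_bin + fm_bin] * X[:, fc_bin - fm_bin]
                   * np.conj(X[:, fc_bin]) * np.conj(X[:, fc_bin]))

hz_per_bin = fs / seg_len                      # 2 Hz per bin here
fc_bin, fm_bin = round(50 / hz_per_bin), round(6 / hz_per_bin)
print("MSB magnitude at (50 Hz, 6 Hz):", abs(msb(x, seg_len, fc_bin, fm_bin)))
```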
Almeida, Mariana R; Correa, Deleon N; Zacca, Jorge J; Logrado, Lucio Paulo Lima; Poppi, Ronei J
2015-02-20
The aim of this study was to develop a methodology using Raman hyperspectral imaging and chemometric methods for the identification of pre- and post-blast explosive residues on banknote surfaces. The explosives studied were of military, commercial and propellant uses. After acquisition of the hyperspectral image, independent component analysis (ICA) was applied to extract the pure spectra and the distribution of the corresponding image constituents. The performance of the methodology was evaluated by the explained variance and the lack of fit of the models, by comparing the ICA-recovered spectra with the reference spectra using correlation coefficients, and by the presence of rotational ambiguity in the ICA solutions. The methodology was applied to forensic samples to solve an automated teller machine explosion case. Independent component analysis proved to be a suitable curve resolution method, achieving performance equivalent to multivariate curve resolution with alternating least squares (MCR-ALS). At low concentrations, MCR-ALS presents some limitations, as it did not provide the correct solution. The detection limit of the methodology presented in this study was 50 μg cm(-2). Copyright © 2014 Elsevier B.V. All rights reserved.
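A minimal sketch of the ICA step on an unfolded hyperspectral cube might look like the following; the cube dimensions and number of components are assumptions, and real data would replace the random array.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical Raman hyperspectral cube: 64 x 64 pixels, 500 Raman shift channels.
cube = np.random.default_rng(0).random((64, 64, 500))
n_pixels = cube.shape[0] * cube.shape[1]
D = cube.reshape(n_pixels, -1)          # unfold to (pixels x wavenumbers)

# ICA: recover a few "pure" spectra (mixing matrix columns) and their spatial distributions.
ica = FastICA(n_components=4, random_state=0, max_iter=1000)
distributions = ica.fit_transform(D)    # (pixels x components): abundance-like maps
pure_spectra = ica.mixing_.T            # (components x wavenumbers): recovered spectra

# Reconstruction quality, analogous to explained variance / lack of fit.
D_hat = distributions @ ica.mixing_.T + ica.mean_
lack_of_fit = np.linalg.norm(D - D_hat) / np.linalg.norm(D)
print("relative residual (lack of fit):", lack_of_fit)

# Fold a component's scores back into an image to inspect its spatial distribution.
component_map = distributions[:, 0].reshape(64, 64)
```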
Color enhancement of landsat agricultural imagery: JPL LACIE image processing support task
NASA Technical Reports Server (NTRS)
Madura, D. P.; Soha, J. M.; Green, W. B.; Wherry, D. B.; Lewis, S. D.
1978-01-01
Color enhancement techniques were applied to LACIE LANDSAT segments to determine whether such enhancement can assist analysis in crop identification. The procedure involved increasing the color range by removing correlation between components. First, a principal component transformation was performed, followed by contrast enhancement to equalize component variances, followed by an inverse transformation to restore familiar color relationships. Filtering was applied to the lower-order components to reduce color speckle in the enhanced products. The use of single-acquisition and multiple-acquisition statistics to control the enhancement was compared, and the effects of normalization were investigated. Evaluation is left to LACIE personnel.
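The enhancement described is essentially a decorrelation stretch. A compact sketch, with an assumed multiband image array in place of a LANDSAT segment, is:

```python
import numpy as np

def decorrelation_stretch(img, target_sigma=50.0):
    """PCA transform, equalize component variances, then invert back to the original bands."""
    h, w, bands = img.shape
    X = img.reshape(-1, bands).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)            # principal component transformation
    pcs = Xc @ eigvecs
    pcs *= target_sigma / np.sqrt(eigvals)            # contrast stretch: equalize variances
    out = pcs @ eigvecs.T + mean                      # inverse transform restores band space
    return out.reshape(h, w, bands)

# Hypothetical 4-band LANDSAT-like segment (values are placeholders).
segment = np.random.default_rng(0).integers(0, 255, size=(128, 128, 4)).astype(float)
enhanced = decorrelation_stretch(segment)
```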
System principles, mathematical models and methods to ensure high reliability of safety systems
NASA Astrophysics Data System (ADS)
Zaslavskyi, V.
2017-04-01
Modern safety and security systems are composed of a large number of various components designed for detection, localization, tracking, collection, and processing of information from systems for monitoring, telemetry, control, etc. They are required to be highly reliable in order to correctly perform data aggregation, processing and analysis for subsequent decision-making support. During the design and construction phases of the manufacturing of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure highly reliable signal detection, noise isolation, and reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the types of components and various constraints on resources, should be considered. Various types of components perform identical functions; however, they are implemented using diverse principles and approaches and have distinct technical and economic indicators such as cost or power consumption. The systematic use of different component types increases the probability of successful task performance and eliminates common cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models, and algorithms can be used for solving problems of optimal redundancy on the basis of a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
ERIC Educational Resources Information Center
Yavuz, Soner; Morgil, Inci
2006-01-01
In instrumental analysis lessons, advanced instruments and the corresponding experiments are needed. The more experiments performed during the lessons, the greater the learning. For this reason, experiments that do not last long, that can be performed with simpler instruments, and that increase students'…
Modulated Hebb-Oja learning rule--a method for principal subspace analysis.
Jankovic, Marko V; Ogawa, Hidemitsu
2006-03-01
This paper presents an analysis of the recently proposed modulated Hebb-Oja (MHO) method, which performs a linear mapping to a lower-dimensional subspace. The principal component subspace is the subspace that will be analyzed. Compared to some other well-known methods for yielding the principal component subspace (e.g., Oja's Subspace Learning Algorithm), the proposed method has one feature that could be seen as desirable from the biological point of view: the synaptic efficacy learning rule does not need explicit information about the values of the other efficacies to make an individual efficacy modification. Also, the simplicity of the "neural circuits" that perform global computations, and the fact that their number does not depend on the number of input and output neurons, could be seen as good features of the proposed method.
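For orientation, a sketch of the classic Oja subspace learning rule (the reference algorithm mentioned above, not the MHO rule itself) is shown below; the data and learning rate are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative data: 3-dimensional inputs with most variance in the first two directions.
X = rng.normal(size=(5000, 3)) * np.array([3.0, 2.0, 0.3])

def oja_subspace(X, n_components=2, lr=1e-3, epochs=5, seed=0):
    """Oja's subspace learning: W tracks an (approximately orthonormal) basis of the principal subspace."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_components))
    for _ in range(epochs):
        for x in X:
            y = W.T @ x                                        # output of the linear units
            W += lr * (np.outer(x, y) - W @ np.outer(y, y))    # Hebbian term minus decay
    return W

W = oja_subspace(X)
# Columns of W should span roughly the same subspace as the top-2 PCA eigenvectors.
print("W^T W (near-orthonormal):\n", W.T @ W)
```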
DOT National Transportation Integrated Search
1975-05-31
Prediction of wheel displacements and wheel-rail forces is a prerequisite to the evaluation of the curving performance of rail vehicles. This information provides part of the basis for the rational design of wheels and suspension components, for esta...
The Effects of Phonetic Similarity and List Length on Children's Sound Categorization Performance.
ERIC Educational Resources Information Center
Snowling, Margaret J.; And Others
1994-01-01
Examined the phonological analysis and verbal working memory components of the sound categorization task and their relationships to reading skill differences. Children were tested on sound categorization by having them identify odd words in sequences. Sound categorization performance was sensitive to individual differences in speech perception…
Prototype solar heating and combined heating and cooling systems
NASA Technical Reports Server (NTRS)
1977-01-01
System analysis activities were directed toward refining the heating system parameters. Trade studies were performed to support hardware selections for all systems and for the heating only operational test sites in particular. The heating system qualification tests were supported by predicting qualification test component performance prior to conducting the test.
USDA-ARS?s Scientific Manuscript database
Ultra-high performance liquid chromatography (UPLC) and flow injection electrospray ionization with ion trap mass spectrometry (FIMS) fingerprints combined with principal component analysis (PCA) were examined for their potential in differentiating commercial organic and conventional sage samples. The...
Scaled centrifugal compressor, collector and running gear program
NASA Technical Reports Server (NTRS)
Kenehan, J. G.
1983-01-01
The Scaled Centrifugal Compressor, Collector and Running Gear Program was conducted in support of an overall NASA strategy to improve small-compressor performance, durability, and reliability while reducing initial and life-cycle costs. Accordingly, Garrett designed and provided a test rig, gearbox coupling, and facility collector for a new NASA facility, and provided a scaled model of an existing high-performance impeller for evaluating scaling effects on aerodynamic performance and for obtaining other performance data. Test-rig shafting was designed to operate smoothly throughout a speed range of up to 60,000 rpm. Pressurized components were designed to operate at pressures up to 300 psia and at temperatures up to 1000 F. Nonrotating components were designed to provide a margin of safety of 0.05 or greater; rotating components, a margin of safety based on allowable yield and ultimate strengths. Design activities were supported by complete design analysis, and the finished hardware was subjected to check runs to confirm proper operation. The test rig will support a wide range of compressor tests and evaluations.
Convection equation modeling: A non-iterative direct matrix solution algorithm for use with SINDA
NASA Technical Reports Server (NTRS)
Schrage, Dean S.
1993-01-01
The determination of the boundary conditions for a component-level analysis, applying discrete finite element and finite difference modeling techniques, often requires an analysis of complex coupled phenomena that cannot be described algebraically. For example, an analysis of the temperature field of a coldplate surface with an integral fluid loop requires a solution to the parabolic heat equation and also requires the boundary conditions that describe the local fluid temperature. However, the local fluid temperature is described by a convection equation that can only be solved with knowledge of the locally coupled coldplate temperatures. Generally speaking, it is not computationally efficient, and sometimes not even possible, to perform a direct, coupled-phenomenon analysis of the component-level and boundary-condition models within a single analysis code. An alternative is to perform a disjoint analysis, but transmit the necessary information between models during the simulation to provide an indirect coupling. For this approach to be effective, the component-level model retains full detail while the boundary-condition model is simplified to provide a fast, first-order prediction of the phenomenon in question. Specifically, in the present study, the coldplate structure is analyzed with a discrete numerical model (SINDA) while the fluid-loop convection equation is analyzed with a discrete analytical model (direct matrix solution). This indirect coupling allows a satisfactory prediction of the boundary condition while not compromising the overall computational efficiency of the component-level analysis. In the present study, the complete derivation and the direct matrix solution algorithm for the convection equation are presented and discussed. Discretization is analyzed and discussed with regard to solution accuracy, stability, and computation speed. Case studies considering a pulsed and a harmonic inlet disturbance to the fluid loop are analyzed to assist in the discussion of numerical dissipation and accuracy. In addition, the issues of code melding or integration with standard-class solvers such as SINDA are discussed to advise the user of the potential problems to be encountered.
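A minimal illustration of a non-iterative, marching-type direct solution of the fluid-loop convection equation (an assumed first-order upwind discretization, not the SINDA-coupled algorithm of the report) is:

```python
import numpy as np

def fluid_loop_temperatures(T_in, T_wall, m_dot=0.02, cp=4186.0, h=500.0, P=0.01, dx=0.05):
    """March the steady 1-D convection equation along the loop:
    m_dot*cp*dT/dx = h*P*(T_wall - T); first-order upwind gives a direct, non-iterative solve."""
    T = np.empty(len(T_wall) + 1)
    T[0] = T_in
    k = h * P * dx / (m_dot * cp)
    for i, Tw in enumerate(T_wall):
        T[i + 1] = T[i] + k * (Tw - T[i])
    return T

# Hypothetical coldplate wall temperatures (K) supplied by the component-level model.
T_wall = np.linspace(320.0, 300.0, 40)
print(fluid_loop_temperatures(T_in=290.0, T_wall=T_wall)[:5])
```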
DAKOTA Design Analysis Kit for Optimization and Terascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.
2010-02-24
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
Open-cycle OTEC system performance analysis. [Claude cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewandowski, A.A.; Olson, D.A.; Johnson, D.H.
1980-10-01
An algorithm developed to calculate the performance of Claude-cycle ocean thermal energy conversion (OTEC) systems is described. The algorithm treats each component of the system separately and then interfaces them to form a complete system, allowing a component to be changed without changing the rest of the algorithm. Two components that are subject to change are the evaporator and condenser. For this study we developed mathematical models of a channel-flow evaporator and both a horizontal jet and a spray direct contact condenser. The algorithm was then programmed to run on SERI's CDC 7600 computer and used to calculate the effect on performance of deaerating the warm and cold water streams before entering the evaporator and condenser, respectively. This study indicates that there is no advantage to removing air from these streams compared with removing the air from the condenser.
NASA Astrophysics Data System (ADS)
Silva, N.; Esper, A.
2012-01-01
The work presented in this article represents the results of applying RAMS analysis to a critical space control system, at both the system and software levels. The system-level RAMS analysis allowed the assignment of criticalities to the high-level components, which was further refined by a tailored software-level RAMS analysis. The importance of the software-level RAMS analysis in the identification of new failure modes and its impact on the system-level RAMS analysis is discussed. Recommendations for changes in the software architecture have also been proposed in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, noting their importance for space systems safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.
Investigation of performance deterioration of the CF6/JT9D, high-bypass ratio turbofan engines
NASA Technical Reports Server (NTRS)
Ziemianski, J. A.; Mehalic, C. M.
1980-01-01
The aircraft energy efficiency program within NASA is developing the technology required to improve the fuel efficiency of commercial subsonic transport aircraft. One segment of this program includes engine diagnostics, which is directed toward determining the sources and causes of performance deterioration in the Pratt and Whitney Aircraft JT9D and General Electric CF6 high-bypass ratio turbofan engines and developing technology for minimizing the performance losses. Results of engine performance deterioration investigations based on historical data, special engine tests, and specific tests to define the influence of flight loads and component clearances on performance are presented. The results of analysis of several damage mechanisms that contribute to performance deterioration, such as blade tip rubs, airfoil surface roughness and erosion, and thermal distortion, are also included. The significance of these damage mechanisms on component and overall engine performance is discussed.
The CF6 engine performance improvement
NASA Technical Reports Server (NTRS)
Fasching, W. A.
1982-01-01
As part of the NASA-sponsored Engine Component Improvement (ECI) Program, a feasibility analysis of performance improvement and retention concepts for the CF6-6 and CF6-50 engines was conducted, and seven concepts were identified for development and ground testing: new fan, new front mount, high pressure turbine aerodynamic performance improvement, high pressure turbine roundness, high pressure turbine active clearance control, low pressure turbine active clearance control, and short core exhaust nozzle. The development work and ground testing are summarized, and the major test results and an economic analysis for each concept are presented.
NASA Technical Reports Server (NTRS)
1987-01-01
The objective was to design, fabricate and test an integrated cryogenic test article incorporating both fluid and thermal propellant management subsystems. A 2.2 m (87 in) diameter aluminum test tank was outfitted with multilayer insulation, helium purge system, low-conductive tank supports, thermodynamic vent system, liquid acquisition device and immersed outflow pump. Tests and analysis performed on the start basket liquid acquisition device and studies of the liquid retention characteristics of fine mesh screens are discussed.
Precoded spatial multiplexing MIMO system with spatial component interleaver.
Gao, Xiang; Wu, Zhanji
In this paper, the performance of precoded bit-interleaved coded modulation (BICM) spatial multiplexing multiple-input multiple-output (MIMO) system with spatial component interleaver is investigated. For the ideal precoded spatial multiplexing MIMO system with spatial component interleaver based on singular value decomposition (SVD) of the MIMO channel, the average pairwise error probability (PEP) of coded bits is derived. Based on the PEP analysis, the optimum spatial Q-component interleaver design criterion is provided to achieve the minimum error probability. For the limited feedback precoded proposed scheme with linear zero forcing (ZF) receiver, in order to minimize a bound on the average probability of a symbol vector error, a novel effective signal-to-noise ratio (SNR)-based precoding matrix selection criterion and a simplified criterion are proposed. Based on the average mutual information (AMI)-maximization criterion, the optimal constellation rotation angles are investigated. Simulation results indicate that the optimized spatial multiplexing MIMO system with spatial component interleaver can achieve significant performance advantages compared to the conventional spatial multiplexing MIMO system.
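A compact sketch of SVD-based precoding for spatial multiplexing, the ideal-precoding baseline referred to above, follows; the channel realization, constellation, and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
nt, nr = 2, 2
H = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)  # flat Rayleigh MIMO channel

# SVD of the channel: H = U diag(s) V^H. Precoding with V and shaping with U^H
# turns the MIMO channel into parallel subchannels with gains s.
U, s, Vh = np.linalg.svd(H)

# Illustrative QPSK symbols on each spatial stream.
bits = rng.integers(0, 2, size=(2, nt))
x = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)

noise = 0.05 * (rng.normal(size=nr) + 1j * rng.normal(size=nr))
y = H @ (Vh.conj().T @ x) + noise        # transmit the precoded vector V x
r = U.conj().T @ y                        # receive shaping
x_hat = r / s                             # per-stream equalization of the parallel channels
print(np.round(x_hat, 2), "vs transmitted", np.round(x, 2))
```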
SSME HPOTP post-test diagnostic system enhancement project
NASA Technical Reports Server (NTRS)
Bickmore, Timothy W.
1995-01-01
An assessment of engine and component health is routinely made after each test or flight firing of a space shuttle main engine (SSME). Currently, this health assessment is done by teams of engineers who manually review sensor data, performance data, and engine and component operating histories. Based on a review of information from these various sources, an evaluation is made as to the health of each component of the SSME and the preparedness of the engine for another test or flight. The objective of this project is to further develop a computer program which automates the analysis of test data from the SSME high-pressure oxidizer turbopump (HPOTP) in order to detect and diagnose anomalies. This program fits into a larger system, the SSME Post-Test Diagnostic System (PTDS), which will eventually be extended to assess the health and status of most SSME components on the basis of test data analysis. The HPOTP module is an expert system, which uses 'rules of thumb' obtained from interviews with experts from NASA Marshall Space Flight Center (MSFC) to detect and diagnose anomalies. Analyses of the raw test data are first performed using pattern recognition techniques, which result in features such as spikes, shifts, peaks, and drifts being detected and written to a database. The HPOTP module then looks for combinations of these features which are indicative of known anomalies, using the rules gathered from the turbomachinery experts. Results of this analysis are then displayed via a graphical user interface which provides ranked lists of anomalies and observations by engine component, along with supporting data plots for each.
Systems Analysis Initiated for All-Electric Aircraft Propulsion
NASA Technical Reports Server (NTRS)
Kohout, Lisa L.
2003-01-01
A multidisciplinary effort is underway at the NASA Glenn Research Center to develop concepts for revolutionary, nontraditional fuel cell power and propulsion systems for aircraft applications. There is a growing interest in the use of fuel cells as a power source for electric propulsion as well as an auxiliary power unit to substantially reduce or eliminate environmentally harmful emissions. A systems analysis effort was initiated to assess potential concepts in an effort to identify those configurations with the highest payoff potential. Among the technologies under consideration are advanced proton exchange membrane (PEM) and solid oxide fuel cells, alternative fuels and fuel processing, and fuel storage. Prior to this effort, the majority of fuel cell analysis done at Glenn was done for space applications. Because of this, a new suite of models was developed. These models include the hydrogen-air PEM fuel cell; internal reforming solid oxide fuel cell; balance-of-plant components (compressor, humidifier, separator, and heat exchangers); compressed gas, cryogenic, and liquid fuel storage tanks; and gas turbine/generator models for hybrid system applications. Initial mass, volume, and performance estimates of a variety of PEM systems operating on hydrogen and reformate have been completed for a baseline general aviation aircraft. Solid oxide/turbine hybrid systems are being analyzed. In conjunction with the analysis efforts, a joint effort has been initiated with Glenn's Computer Services Division to integrate fuel cell stack and component models with the visualization environment that supports the GRUVE lab, Glenn's virtual reality facility. The objective of this work is to provide an environment to assist engineers in the integration of fuel cell propulsion systems into aircraft and provide a better understanding of the interaction between system components and the resulting effect on the overall design and performance of the aircraft. Initially, three-dimensional computer-aided design (CAD) models of representative PEM fuel cell stack and components were developed and integrated into the virtual reality environment along with an Excel-based model used to calculate fuel cell electrical performance on the basis of cell dimensions (see the figure). CAD models of a representative general aviation aircraft were also developed and added to the environment. With the use of special headgear, users will be able to virtually manipulate the fuel cell's physical characteristics and its placement within the aircraft while receiving information on the resultant fuel cell output power and performance. As the systems analysis effort progresses, we will add more component models to the GRUVE environment to help us more fully understand the effect of various system configurations on the aircraft.
Ibrahim, George M; Morgan, Benjamin R; Macdonald, R Loch
2014-03-01
Predictors of outcome after aneurysmal subarachnoid hemorrhage have been determined previously through hypothesis-driven methods that often exclude putative covariates and require a priori knowledge of potential confounders. Here, we apply a data-driven approach, principal component analysis, to identify baseline patient phenotypes that may predict neurological outcomes. Principal component analysis was performed on 120 subjects enrolled in a prospective randomized trial of clazosentan for the prevention of angiographic vasospasm. Correlation matrices were created using a combination of Pearson, polyserial, and polychoric regressions among 46 variables. Scores of significant components (with eigenvalues>1) were included in multivariate logistic regression models with incidence of severe angiographic vasospasm, delayed ischemic neurological deficit, and long-term outcome as outcomes of interest. Sixteen significant principal components accounting for 74.6% of the variance were identified. A single component dominated by the patients' initial hemodynamic status, World Federation of Neurosurgical Societies score, neurological injury, and initial neutrophil/leukocyte counts was significantly associated with poor outcome. Two additional components were associated with angiographic vasospasm, of which one was also associated with delayed ischemic neurological deficit. The first was dominated by the aneurysm-securing procedure, subarachnoid clot clearance, and intracerebral hemorrhage, whereas the second had high contributions from markers of anemia and albumin levels. Principal component analysis, a data-driven approach, identified patient phenotypes that are associated with worse neurological outcomes. Such data reduction methods may provide a better approximation of unique patient phenotypes and may inform clinical care as well as patient recruitment into clinical trials. http://www.clinicaltrials.gov. Unique identifier: NCT00111085.
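A sketch of the analysis pipeline described (PCA with an eigenvalue-greater-than-one retention rule, with component scores feeding a logistic regression) might read as follows; the arrays are random placeholders for the 46 baseline variables and the dichotomized outcome.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 46))                 # hypothetical baseline variables, 120 subjects
poor_outcome = rng.integers(0, 2, size=120)    # hypothetical dichotomized outcome

# Principal component analysis on standardized variables.
Z = StandardScaler().fit_transform(X)
pca = PCA().fit(Z)

# Kaiser criterion: retain components with eigenvalue > 1.
eigenvalues = pca.explained_variance_
keep = eigenvalues > 1.0
scores = pca.transform(Z)[:, keep]
print(f"retained {keep.sum()} components explaining "
      f"{pca.explained_variance_ratio_[keep].sum():.1%} of the variance")

# Component scores as predictors in a multivariate logistic regression model.
model = LogisticRegression(max_iter=1000).fit(scores, poor_outcome)
```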
Hyper-X Hot Structures Comparison of Thermal Analysis and Flight Data
NASA Technical Reports Server (NTRS)
Amundsen, Ruth M.; Leonard, Charles P.; Bruce, Walter E., III
2004-01-01
The Hyper-X (X-43A) program is a flight experiment to demonstrate scramjet performance and operability under controlled powered free-flight conditions at Mach 7 and 10. The Mach 7 flight was successfully completed on March 27, 2004. Thermocouple instrumentation in the hot structures (nose, horizontal tail, and vertical tail) recorded the flight thermal response of these components. Preflight thermal analysis was performed for design and risk assessment purposes. This paper will present a comparison of the preflight thermal analysis and the recorded flight data.
2015-01-01
Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications. PMID:26267377
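A minimal sketch of the pixel-wise skin classification step (a Random Forest on per-pixel color features) follows; the feature construction and labels below are generic stand-ins, not the actual SKN transform or ground truth.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
# Hypothetical labelled pixels: rows are pixels, columns are three color components.
pixels = rng.random((20000, 3))
is_skin = (pixels[:, 0] > 0.5).astype(int)     # placeholder labels for illustration only

# Simple train/test split by index.
split = 15000
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(pixels[:split], is_skin[:split])

pred = clf.predict(pixels[split:])
print("pixel-wise F-score:", f1_score(is_skin[split:], pred))
```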
NASA Technical Reports Server (NTRS)
Johnsen, R. L.
1979-01-01
The performance sensitivity of a two-shaft automotive gas turbine engine to changes in component performance and cycle operating parameters was examined. Sensitivities were determined for changes in turbomachinery efficiency, compressor inlet temperature, power turbine discharge temperature, regenerator effectiveness, regenerator pressure drop, and several gas flow and heat leaks. Compressor efficiency was found to have the greatest effect on system performance.
Authoritarianism Revisited: Evidence for an Aggression Factor.
ERIC Educational Resources Information Center
Raden, David
1981-01-01
Performed a principal components factor analysis on the responses of 245 undergraduates to a short version of the F Scale and to measures of prejudice, attitude toward welfare, toleration of political deviance, punitiveness toward criminals, and support of the Vietnam War. The analysis produced two factors: authoritarian aggression and attitude toward welfare.…
GOATS Image Projection Component
NASA Technical Reports Server (NTRS)
Haber, Benjamin M.; Green, Joseph J.
2011-01-01
When doing mission analysis and design of an imaging system in orbit around the Earth, answering the fundamental question of imaging performance requires an understanding of the image products that will be produced by the imaging system. GOATS software represents a series of MATLAB functions to provide for geometric image projections. Unique features of the software include function modularity, a standard MATLAB interface, easy-to-understand first-principles-based analysis, and the ability to perform geometric image projections of framing type imaging systems. The software modules are created for maximum analysis utility, and can all be used independently for many varied analysis tasks, or used in conjunction with other orbit analysis tools.
Chromatographic and electrophoretic approaches in ink analysis.
Zlotnick, J A; Smith, F P
1999-10-15
Inks are manufactured from a wide variety of substances that exhibit very different chemical behaviors. Inks designed for use in different writing instruments or printing methods have quite dissimilar components. Since the 1950s chromatographic and electrophoretic methods have played important roles in the analysis of inks, where compositional information may have bearing on the investigation of counterfeiting, fraud, forgery, and other crimes. Techniques such as paper chromatography and electrophoresis, thin-layer chromatography, high-performance liquid chromatography, gas chromatography, gel electrophoresis, and the relatively new technique of capillary electrophoresis have all been explored as possible avenues for the separation of components of inks. This paper reviews the components of different types of inks and applications of the above separation methods are reviewed.
Yamamoto, Shinya; Bamba, Takeshi; Sano, Atsushi; Kodama, Yukako; Imamura, Miho; Obata, Akio; Fukusaki, Eiichiro
2012-08-01
Soy sauces, produced from different ingredients and brewing processes, vary in components and quality. Therefore, it is extremely important to comprehend the relationship between the components and the sensory attributes of soy sauces. The current study sought to perform metabolite profiling in order to devise a method of assessing the attributes of soy sauces. Quantitative descriptive analysis (QDA) data for 24 soy sauce samples were obtained from well-selected sensory panelists. Metabolite profiles, primarily concerning low-molecular-weight hydrophilic components, were based on gas chromatography with time-of-flight mass spectrometry (GC/TOFMS). QDA data for soy sauces were accurately predicted by projection to latent structures (PLS), with the metabolite profiles serving as explanatory variables and the QDA data set serving as the response variable. Moreover, analysis of the correlation between the metabolite profile and QDA data matrices indicated contributing compounds that were highly correlated with the QDA data. In particular, sugars were indicated to be important components of the taste of soy sauces. This new approach, which combines metabolite profiling with QDA, is applicable to the analysis of sensory attributes of food arising from the complex interaction between its components, and is effective for finding important compounds that contribute to those attributes. Copyright © 2012 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
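A minimal sketch of the PLS step, predicting panel QDA scores from metabolite profiles, is shown below; the arrays are random placeholders for the 24 samples.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
metabolites = rng.random((24, 150))     # hypothetical GC/TOFMS profiles: 24 sauces x 150 metabolites
qda_scores = rng.random((24, 10))       # hypothetical QDA data: 10 sensory attributes per sauce

# PLS regression: metabolite profiles as explanatory variables, QDA data as responses.
pls = PLSRegression(n_components=5)
predicted = cross_val_predict(pls, metabolites, qda_scores, cv=6)

# Correlation between predicted and panel scores, attribute by attribute.
for j in range(qda_scores.shape[1]):
    r = np.corrcoef(qda_scores[:, j], predicted[:, j])[0, 1]
    print(f"attribute {j}: r = {r:.2f}")
```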
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; auBuchon, M.
2000-01-01
Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of propulsion systems for aircraft and space vehicles called the Numerical Propulsion System Simulation (NPSS). The NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer, along with the concept of numerical zooming between 0-dimensional and 1-, 2-, and 3-dimensional component engine codes. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Current "state-of-the-art" engine simulations are 0-dimensional in that there is no axial, radial or circumferential resolution within a given component (e.g., a compressor or turbine has no internal station designations). In these 0-dimensional cycle simulations the individual component performance characteristics typically come from a table look-up (map) with adjustments for off-design effects such as variable geometry, Reynolds effects, and clearances. Zooming one or more of the engine components to a higher-order, physics-based analysis means a higher-order code is executed and the results from this analysis are used to adjust the 0-dimensional component performance characteristics within the system simulation. By drawing on the results from more predictive, physics-based higher-order analysis codes, "cycle" simulations are refined to closely model and predict the complex physical processes inherent to engines. As part of the overall development of the NPSS, NASA and industry began the process of defining and implementing an object class structure that enables numerical zooming between the NPSS Version 1 (0-dimensional) simulation and higher-order 1-, 2- and 3-dimensional analysis codes. The NPSS Version 1 preserves the historical cycle engineering practices but also extends these classical practices into the area of numerical zooming for use within a company's design system. What follows here is a description of successfully zooming 1-dimensional (row-by-row) high pressure compressor results back to a NPSS engine 0-dimensional simulation and a discussion of the results, illustrated using an advanced data visualization tool. This type of high-fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the fidelity of the engine system simulation and enable the engine system to be "pre-validated" prior to commitment to engine hardware.
Guo, Yujie; Chen, Xi; Qi, Jin; Yu, Boyang
2016-07-01
A reliable method, combining qualitative analysis by high-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry and quantitative assessment by high-performance liquid chromatography with photodiode array detection, has been developed to simultaneously analyze flavonoids and alkaloids in lotus leaf extracts. In the qualitative analysis, a total of 30 compounds, including 12 flavonoids, 16 alkaloids, and two proanthocyanidins, were identified. The fragmentation behaviors of four types of flavone glycoside and three types of alkaloid are summarized. The mass spectra of four representative components, quercetin 3-O-glucuronide, norcoclaurine, nuciferine, and neferine, are shown to illustrate their fragmentation pathways. Five pairs of isomers were detected and three of them were distinguished by comparing the elution order with reference substances and the mass spectrometry data with reported data. In the quantitative analysis, 30 lotus leaf samples from different regions were analyzed to investigate the proportion of eight representative compounds. Quercetin 3-O-glucuronide was found to be the predominant constituent of lotus leaf extracts. For further discrimination among the samples, hierarchical cluster analysis, and principal component analysis, based on the areas of the eight quantitative peaks, were carried out. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel
Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility tests of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today's power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
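The counter-based scheme can be illustrated with a small sketch: a shared counter hands out the next contingency index to whichever worker becomes free, so faster workers naturally process more cases. The task list and the "analysis" performed are placeholders.

```python
import itertools
import threading

# Hypothetical list of contingency cases (e.g., single-branch outages) to be analyzed.
contingencies = [f"outage-{i}" for i in range(100)]

counter = itertools.count()          # shared task counter
lock = threading.Lock()
results = {}

def worker(name):
    while True:
        with lock:                   # atomically grab the next case index
            i = next(counter)
        if i >= len(contingencies):
            return
        case = contingencies[i]
        results[case] = f"analyzed by {name}"   # stand-in for the actual power-flow solve

threads = [threading.Thread(target=worker, args=(f"w{k}",)) for k in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(len(results), "cases analyzed")
```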
Development of the Functional Flow Block Diagram for the J-2X Rocket Engine System
NASA Technical Reports Server (NTRS)
White, Thomas; Stoller, Sandra L.; Greene, WIlliam D.; Christenson, Rick L.; Bowen, Barry C.
2007-01-01
The J-2X program calls for the upgrade of the Apollo-era Rocketdyne J-2 engine to higher power levels, using new materials and manufacturing techniques, and with more restrictive safety and reliability requirements than prior human-rated engines in NASA history. Such requirements demand a comprehensive systems engineering effort to ensure success. Pratt & Whitney Rocketdyne system engineers performed a functional analysis of the engine to establish the functional architecture. J-2X functions were captured in six major operational blocks. Each block was divided into sub-blocks or states. In each sub-block, functions necessary to perform each state were determined. A functional engine schematic consistent with the fidelity of the system model was defined for this analysis. The blocks, sub-blocks, and functions were sequentially numbered to differentiate the states in which the function were performed and to indicate the sequence of events. The Engine System was functionally partitioned, to provide separate and unique functional operators. Establishing unique functional operators as work output of the System Architecture process is novel in Liquid Propulsion Engine design. Each functional operator was described such that its unique functionality was identified. The decomposed functions were then allocated to the functional operators both of which were the inputs to the subsystem or component performance specifications. PWR also used a novel approach to identify and map the engine functional requirements to customer-specified functions. The final result was a comprehensive Functional Flow Block Diagram (FFBD) for the J-2X Engine System, decomposed to the component level and mapped to all functional requirements. This FFBD greatly facilitates component specification development, providing a well-defined trade space for functional trades at the subsystem and component level. It also provides a framework for function-based failure modes and effects analysis (FMEA), and a rigorous baseline for the functional architecture.
An application of principal component analysis to the clavicle and clavicle fixation devices.
Daruwalla, Zubin J; Courtis, Patrick; Fitzpatrick, Clare; Fitzpatrick, David; Mullett, Hannan
2010-03-26
Principal component analysis (PCA) enables the building of statistical shape models of bones and joints. This has been used in conjunction with computer assisted surgery in the past. However, PCA of the clavicle has not been performed. Using PCA, we present a novel method that examines the major modes of size and three-dimensional shape variation in male and female clavicles and suggests a method of grouping the clavicle into size and shape categories. Twenty-one high-resolution computerized tomography scans of the clavicle were reconstructed and analyzed using a specifically developed statistical software package. After performing statistical shape analysis, PCA was applied to study the factors that account for anatomical variation. The first principal component, representing size, accounted for 70.5 percent of anatomical variation. The addition of a further three principal components accounted for almost 87 percent. Using statistical shape analysis, clavicles in males have a greater lateral depth and are longer, wider and thicker than in females. However, the sternal angle in females is larger than in males. PCA confirmed these differences between genders but also noted that men exhibit greater variance, and classified clavicles into five morphological groups. This unique approach is the first that standardizes a clavicular orientation. It provides information that is useful to both the biomedical engineer and the clinician. Other applications include implant design with regard to modifying current or designing future clavicle fixation devices. Our findings support the need for further development of clavicle fixation devices and the questioning of whether gender-specific devices are necessary.
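A sketch of the shape-PCA step, PCA over flattened landmark coordinates from aligned surface reconstructions, is given below; the landmark array is a random stand-in for the 21 reconstructed clavicles.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical dataset: 21 clavicles, each described by 500 corresponding 3-D landmarks
# (assumed already rigidly aligned, e.g. by Procrustes registration).
landmarks = rng.normal(size=(21, 500, 3))
X = landmarks.reshape(21, -1)                 # flatten each shape to a single vector

pca = PCA(n_components=4).fit(X)
print("cumulative variance explained by first 4 modes:", pca.explained_variance_ratio_.cumsum())

# A new shape can be synthesized by moving along a mode of variation.
mode = 0
shape_plus_2sd = (pca.mean_ + 2 * np.sqrt(pca.explained_variance_[mode]) *
                  pca.components_[mode]).reshape(500, 3)
```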
NASA Technical Reports Server (NTRS)
Packard, Michael H.
2002-01-01
Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (Quantitative Risk Assessment Software (QRAS) etc.). Hypothetical PSA results for a number of structural components along with mitigation factors that would restrict the failure mode from propagating to a Loss of Mission (LOM) failure were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to the overall Mission Success (MS) is also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, inclusion of new sensor detection of faults and other upgrades were evaluated in determining overall turbine engine reliability.
NASA Technical Reports Server (NTRS)
Peoples, J. A.
1975-01-01
The report includes many charts that graphically present the effects of design parameters on performance. Equations and data are given which can assist the designer in selecting among such factors as working medium, horsepower, and engine components.
Faster tissue interface analysis from Raman microscopy images using compressed factorisation
NASA Astrophysics Data System (ADS)
Palmer, Andrew D.; Bannerman, Alistair; Grover, Liam; Styles, Iain B.
2013-06-01
The structure of an artificial ligament was examined using Raman microscopy in combination with novel data analysis. Basis approximation and compressed principal component analysis are shown to provide efficient compression of confocal Raman microscopy images, alongside powerful methods for unsupervised analysis. This scheme accelerates data-mining steps such as principal component analysis, as they can be performed on the compressed data representation, reducing the factorisation time of a single image from five minutes to under a second. Using this workflow, the interface region between a chemically engineered ligament construct and a bone-mimic anchor was examined. Natural ligament contains a striated interface between the bone and tissue that provides improved mechanical load tolerance; a similar interface was found in the ligament construct.
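One generic way to realize such compressed factorisation (not necessarily the authors' basis construction) is to project each spectrum onto a small random basis and run PCA in the reduced space, as sketched below with placeholder data.

```python
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical Raman map: 10,000 pixels x 1,024 wavenumber channels.
spectra = rng.random((10_000, 1024))

# Compress each spectrum to 64 coefficients, then factorise in the compressed space.
proj = GaussianRandomProjection(n_components=64, random_state=0)
compressed = proj.fit_transform(spectra)

pca = PCA(n_components=5).fit(compressed)
scores = pca.transform(compressed)          # per-pixel scores, usable for segmenting the interface
print("variance captured in compressed space:", pca.explained_variance_ratio_.sum())
```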
Compositional analysis and structural elucidation of glycosaminoglycans in chicken eggs
Liu, Zhangguo; Zhang, Fuming; Li, Lingyun; Li, Guoyun; He, Wenqing; Linhardt, Robert J.
2014-01-01
Glycosaminoglycans (GAGs) have numerous applications in the fields of pharmaceuticals, cosmetics, nutraceuticals, and foods. GAGs are also critically important in the developmental biology of all multicellular animals. GAGs were isolated from chicken egg components including yolk, thick egg white, thin egg white, membrane, calcified shell matrix supernatant, and shell matrix deposit. Disaccharide compositional analysis was performed using ultra high-performance liquid chromatography-mass spectrometry. The results of these analyses showed that all four families of GAGs were detected in all egg components. Keratan sulfate was found at high levels in the egg whites (thick and thin) and shell matrix (calcified shell matrix supernatant and deposit). Chondroitin sulfates were much more plentiful in both shell matrix components and the membrane. Hyaluronan was plentiful in both shell matrix components and the membrane, but was present only in trace quantities in the yolk. Heparan sulfate was plentiful in the shell matrix deposit but was present in trace quantities in the egg content components (yolk, thick and thin egg whites). Most of the chondroitin and heparan sulfate disaccharides were present in the GAGs found in chicken eggs, with the exception of the chondroitin and heparan sulfate 2,6-disulfated disaccharides. Both CS and HS in the shell matrix deposit contained the most diverse chondroitin and heparan sulfate disaccharide compositions. Eggs might provide a potential new source of GAGs. PMID:25218438
EEG artifact elimination by extraction of ICA-component features using image processing algorithms.
Radüntz, T; Scouten, J; Hochmuth, O; Meffert, B
2015-03-30
Artifact rejection is a central issue when dealing with electroencephalogram recordings. Although independent component analysis (ICA) separates data into linearly independent components (ICs), the classification of these components as artifact or EEG signal still requires visual inspection by experts. In this paper, we achieve automated artifact elimination using linear discriminant analysis (LDA) for classification of feature vectors extracted from ICA components via image processing algorithms. We compare the performance of this automated classifier to visual classification by experts and identify range filtering as a feature extraction method with great potential for automated IC artifact recognition (accuracy rate 88%). We obtain almost the same level of recognition performance for geometric features and local binary pattern (LBP) features. Compared to existing automated solutions, the proposed method has two main advantages: First, it does not depend on direct recording of artifact signals, which then, e.g., have to be subtracted from the contaminated EEG. Second, it is not limited to a specific number or type of artifact. In summary, the present method is an automatic, reliable, real-time capable and practical tool that reduces the time intensive manual selection of ICs for artifact removal. The results are very promising despite the relatively small channel resolution of 25 electrodes. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
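A minimal sketch, in the spirit of the workflow above, using entirely simulated component maps: a range-filter image feature (local max minus local min) is computed for each component's 2-D map and fed to an LDA classifier labeling components as artifact or brain. The simulated maps and labels are assumptions for illustration, not real EEG-derived ICs.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)

def range_filter_features(scalp_map, size=5, n_bins=16):
    """Histogram of the local range (max minus min) of a component's scalp-map image."""
    local_range = maximum_filter(scalp_map, size) - minimum_filter(scalp_map, size)
    hist, _ = np.histogram(local_range, bins=n_bins, range=(0.0, 1.0), density=True)
    return hist

# Simulate 200 component maps: "brain" maps are smooth, "artifact" maps are spiky.
features, labels = [], []
for _ in range(200):
    is_artifact = rng.random() < 0.5
    base = rng.standard_normal((32, 32))
    scalp = base if is_artifact else gaussian_filter(base, sigma=3)
    scalp = (scalp - scalp.min()) / (np.ptp(scalp) + 1e-12)
    features.append(range_filter_features(scalp))
    labels.append(int(is_artifact))

X, y = np.array(features), np.array(labels)
clf = LinearDiscriminantAnalysis().fit(X[:150], y[:150])
print("held-out accuracy:", round(clf.score(X[150:], y[150:]), 3))
```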
TC-2 post Helios experiment data review. [postflight systems analysis of spacecraft performance
NASA Technical Reports Server (NTRS)
1975-01-01
Data are presented from a systems postflight analysis of the Centaur Launch Vehicle and Helios. Also given is a comparison of data from preflight analyses. Topics examined are: (1) propellant behavior; (2) helium usage; (3) propellant tank pressurization; (4) propellant tank thermodynamics; (5) component heating, thermal control, and thermal protection system; (6) main engine system; (7) H2O2 consumption; (8) boost pump post-MECO performance; and (9) an overview of other systems.
A New Measurement of the Cosmic-Ray Proton and Helium Spectra
NASA Astrophysics Data System (ADS)
Mocchiutti, E.; Ambriola, M.; Bartalucci, S.; Bellotti, R.; Bergström, D.; Boezio, M.; Bonicini, V.; Bravar, U.; Cafagna, F.; Carlson, P.; Casolino, M.; Ciacio, F.; Circella, M.; De Marzo, C. N.; De Pascale, M. P.; Finetti, N.; Francke, T.; Hansen, P.; Hof, M.; Kremer, J.; Menn, W.; Mitchell, J. W.; Mocchiutti, E.; Morselli, A.; Ormes, J. F.; Papini, P.; Piccardi, S.; Picozza, P.; Ricci, M.; Schiavon, P.; Simon, M.; Sparvoli, R.; Spillantini, P.; Stephens, S. A.; Stochaj, S. J.; Streitmatter, R. E.; Suffert, M.; Vacchi, A.; Vannuccini, E.; Zampa, N.; WIZARD/CAPRICE Collaboration
2001-08-01
A new measurement of the primary cosmic ray spectra was performed during the balloon-borne CAPRICE experiment in 1998. This apparatus consists of a magnet spectrometer, with a superconducting magnet and a drift-chamber tracking device, a time-of-flight scintillator system, a silicon-tungsten imaging calorimeter and a gas ring imaging Cherenkov detector. This combination of state-of-the-art detectors provides excellent particle discrimination capabilities, such that detailed investigations of the antiproton, electron/positron, muon and primary components of cosmic rays have been performed. The analysis of the primary proton component is illustrated in this paper.
Warthin tumor--morphological study of the stromal compartment.
Dăguci, Luminiţa; Simionescu, Cristiana; Stepan, A; Munteanu, Cristina; Dăguci, C; Bătăiosu, Marilena
2011-01-01
Warthin tumor is the second most common benign tumor of the parotid gland, after pleomorphic adenoma. Our study was performed on 21 cases with Warthin tumor diagnosed between 2005-2010, which were analyzed clinically, histologically and immunohistochemically, using anti-CD20 and anti-CD45RO antibodies. The analysis of age distribution within the investigated cases indicated that Warthin tumor incidence is increasing in the seventh decade of life, most patients being male (M/F 5/2). Histopathologically, the analysis of the stroma/parenchyma ratio revealed a balanced distribution of the two components in 14 cases; in four cases, the epithelial component was predominant and in three cases, the stromal component was predominant. Immunohistochemical study for the two specific lymphocyte proliferation markers indicated positivity for both the epithelial component and the stroma. The cell layout of CD45RO and CD20cy at the level of the lymphoid stroma showed a pattern similar to that of normal or reactive lymph nodes.
Principal Components Analysis of a JWST NIRSpec Detector Subsystem
NASA Technical Reports Server (NTRS)
Arendt, Richard G.; Fixsen, D. J.; Greenhouse, Matthew A.; Lander, Matthew; Lindler, Don; Loose, Markus; Moseley, S. H.; Mott, D. Brent; Rauscher, Bernard J.; Wen, Yiting;
2013-01-01
We present principal component analysis (PCA) of a flight-representative James Webb Space Telescope Near-Infrared Spectrograph (NIRSpec) Detector Subsystem. Although our results are specific to NIRSpec and its T ~ 40 K SIDECAR ASICs and 5 μm cutoff H2RG detector arrays, the underlying technical approach is more general. We describe how we measured the system's response to small environmental perturbations by modulating a set of bias voltages and temperature. We used this information to compute the system's principal noise components. Together with information from the astronomical scene, we show how the zeroth principal component can be used to calibrate out the effects of small thermal and electrical instabilities to produce cosmetically cleaner images with significantly less correlated noise. Alternatively, if one were designing a new instrument, one could use a similar PCA approach to inform a set of environmental requirements (temperature stability, electrical stability, etc.) that enable the planned instrument to meet performance requirements.
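A hedged sketch of the general calibration idea with simulated frames (not NIRSpec data): the leading principal component of a stack of reference frames captures a shared drift pattern, which is then projected out of a contaminated science frame.

```python
import numpy as np

rng = np.random.default_rng(3)
n_frames, n_pix = 200, 2048

drift_pattern = np.sin(np.linspace(0, 3 * np.pi, n_pix))        # shared spatial pattern
drift_amp = rng.standard_normal(n_frames)                        # per-frame instability
frames = np.outer(drift_amp, drift_pattern) + 0.1 * rng.standard_normal((n_frames, n_pix))

# Leading principal component of the mean-subtracted reference-frame stack.
frames_c = frames - frames.mean(axis=0)
_, _, vt = np.linalg.svd(frames_c, full_matrices=False)
pc0 = vt[0]                                                       # shape (n_pix,)

science = 0.7 * drift_pattern + 0.1 * rng.standard_normal(n_pix)  # contaminated frame
cleaned = science - (science @ pc0) * pc0                          # project out PC0
print("rms before/after:", round(float(science.std()), 3), round(float(cleaned.std()), 3))
```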
Overview of the Systems Special Investigation Group investigation
NASA Technical Reports Server (NTRS)
Mason, James B.; Dursch, Harry; Edelman, Joel
1993-01-01
The Long Duration Exposure Facility (LDEF) carried a remarkable variety of electrical, mechanical, thermal, and optical systems, subsystems, and components. Nineteen of the fifty-seven experiments flown on LDEF contained functional systems that were active on-orbit. Almost all of the other experiments possessed at least a few specific components of interest to the Systems Special Investigation Group (Systems SIG), such as adhesives, seals, fasteners, optical components, and thermal blankets. Almost all top-level functional testing of the active LDEF and experiment systems has been completed. Failure analysis of both LDEF hardware and individual experiments that failed to perform as designed has also been completed. Testing of system components and experimenter hardware of interest to the Systems SIG is ongoing. All available testing and analysis results were collected and integrated by the Systems SIG. An overview of our findings is provided. An LDEF Optical Experiment Database containing information for all 29 optics-related experiments is also discussed.
Schmithorst, Vincent J; Brown, Rhonda Douglas
2004-07-01
The suitability of a previously hypothesized triple-code model of numerical processing, involving analog magnitude, auditory verbal, and visual Arabic codes of representation, was investigated for the complex mathematical task of the mental addition and subtraction of fractions. Functional magnetic resonance imaging (fMRI) data from 15 normal adult subjects were processed using exploratory group Independent Component Analysis (ICA). Separate task-related components were found with activation in bilateral inferior parietal, left perisylvian, and ventral occipitotemporal areas. These results support the hypothesized triple-code model corresponding to the activated regions found in the individual components and indicate that the triple-code model may be a suitable framework for analyzing the neuropsychological bases of the performance of complex mathematical tasks. Copyright 2004 Elsevier Inc.
Park, Gi-Pyo
2014-08-01
This study examined the latent constructs of the Foreign Language Classroom Anxiety Scale (FLCAS) using two different groups of Korean English as a foreign language (EFL) university students. Maximum likelihood exploratory factor analysis with direct oblimin rotation was performed among the first group of 217 participants and produced two meaningful latent components in the FLCAS. The two components of the FLCAS were closely examined among the second group of 244 participants to find the extent to which the two components of the FLCAS fit the data. The model fit indexes showed that the two-factor model in general adequately fit the data. Findings of this study were discussed with the focus on the two components of the FLCAS, followed by future study areas to be undertaken to shed further light on the role of foreign language anxiety in L2 acquisition.
Cline, Michael; Taylor, John E; Flores, Jesus; Bracken, Samuel; McCall, Suzanne; Ceremuga, Thomas E
2008-02-01
The purpose of our study was to investigate the anxiolytic effects of linalool and its potential interaction with the GABAA receptor in Sprague-Dawley rats. Lavender has been used traditionally as an herbal remedy in the treatment of many medical conditions, including anxiety. Linalool is a major component of the essential oil of lavender. Forty-four rats were divided into 4 groups: control, linalool, midazolam (positive control), and flumazenil and linalool. The behavioral and the neurohormonal/physiological components of anxiety were evaluated. The behavioral component was examined by using the elevated plus maze (open arm time/total time) and the neurohormonal/physiological component by measuring serum catecholamine and corticosterone levels. Data analysis was performed using a 2-tailed multivariate analysis of variance and a Scheffé post hoc test. Our data suggest that linalool does not produce anxiolysis by modulation of the GABAA receptor; however, linalool may modulate motor movements and locomotion.
Vibrational Analysis of Engine Components Using Neural-Net Processing and Electronic Holography
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Fite, E. Brian; Mehmed, Oral; Thorp, Scott A.
1997-01-01
The use of computational-model trained artificial neural networks to acquire damage specific information from electronic holograms is discussed. A neural network is trained to transform two time-average holograms into a pattern related to the bending-induced-strain distribution of the vibrating component. The bending distribution is very sensitive to component damage unlike the characteristic fringe pattern or the displacement amplitude distribution. The neural network processor is fast for real-time visualization of damage. The two-hologram limit makes the processor more robust to speckle pattern decorrelation. Undamaged and cracked cantilever plates serve as effective objects for testing the combination of electronic holography and neural-net processing. The requirements are discussed for using finite-element-model trained neural networks for field inspections of engine components. The paper specifically discusses neural-network fringe pattern analysis in the presence of the laser speckle effect and the performances of two limiting cases of the neural-net architecture.
Heart sound segmentation of pediatric auscultations using wavelet analysis.
Castro, Ana; Vinhoza, Tiago T V; Mattos, Sandra S; Coimbra, Miguel T
2013-01-01
Auscultation is widely applied in clinical activity; nonetheless, sound interpretation is dependent on clinician training and experience. Heart sound features such as spatial loudness, relative amplitude, murmurs, and localization of each component may be indicative of pathology. In this study we propose a segmentation algorithm to extract heart sound components (S1 and S2) based on their time and frequency characteristics. This algorithm takes advantage of knowledge of the heart cycle times (systolic and diastolic periods) and of the spectral characteristics of each component, through wavelet analysis. Data collected in a clinical environment and annotated by a clinician were used to assess the algorithm's performance. Heart sound components were correctly identified in 99.5% of the annotated events. S1 and S2 detection rates were 90.9% and 93.3% respectively. The median difference between annotated and detected events was 33.9 ms.
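A simplified sketch on a synthetic phonocardiogram of the general recipe (wavelet band selection, envelope extraction, peak picking to locate candidate S1/S2 events). This is a generic illustration, not the authors' exact segmentation algorithm; the signal, wavelet, and thresholds are assumptions.

```python
import numpy as np
import pywt
from scipy.signal import find_peaks

fs = 2000                                   # Hz, assumed sampling rate
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(4)

# Synthetic heart sounds: short bursts at S1 and S2 times in each ~0.8 s cycle.
pcg = 0.02 * rng.standard_normal(t.size)
for cycle_start in np.arange(0.0, 4.0, 0.8):                     # ~75 bpm
    for onset, freq in [(0.0, 60.0), (0.30, 100.0)]:             # S1 then S2
        idx = (t >= cycle_start + onset) & (t < cycle_start + onset + 0.08)
        pcg[idx] += np.sin(2 * np.pi * freq * t[idx]) * np.hanning(idx.sum())

# Keep mid-frequency detail coefficients (~30-250 Hz here), zero the rest, reconstruct.
coeffs = pywt.wavedec(pcg, "db6", level=6)
kept = [c if 2 <= i <= 4 else np.zeros_like(c) for i, c in enumerate(coeffs)]
band = pywt.waverec(kept, "db6")[: pcg.size]

envelope = np.convolve(band ** 2, np.ones(200) / 200, mode="same")   # smoothed energy
peaks, _ = find_peaks(envelope, distance=int(0.2 * fs), height=0.1 * envelope.max())
print("detected events (s):", np.round(t[peaks], 2))
```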
Verdejo-García, Antonio; Pérez-García, Miguel
2007-03-01
The structure of executive function was examined, and we contrasted the performance of substance-dependent individuals (polysubstance users) and control participants on neuropsychological measures assessing the different executive components obtained. Additionally, we contrasted the performance of polysubstance users with a preference for cocaine vs. heroin and that of controls to explore possible differential effects of the main substance abused on executive impairment. Two groups of participants were recruited: abstinent polysubstance users and controls. Polysubstance users were further subdivided based on their drug of choice (cocaine vs heroin). We administered to all participants a comprehensive protocol of executive measures, including tests of fluency, working memory, reasoning, inhibitory control, flexibility, and decision making. Consistent with previous models, the principal component analysis showed that executive functions are organized into four separate components, three of them previously described: updating, inhibition, and shifting; and a fourth component of decision making. Abstinent polysubstance users had clinically significant impairments on measures assessing these four executive components (with effect sizes ranging from 0.5 to 2.2). Cocaine polysubstance users had more severe impairments than heroin users and controls on measures of inhibition (Stroop) and shifting (go/no go and category test). Greater severity of drug use predicted poorer performance on updating measures. Executive functions can be fractionated into four relatively independent components. Chronic drug use is associated with widespread impairment of these four executive components, with cocaine use inducing more severe deficits on inhibition and shifting. These findings show both common and differential effects of two widely used drugs on different executive components.
Micro-Logistics Analysis for Human Space Exploration
NASA Technical Reports Server (NTRS)
Cirillo, William; Stromgren, Chel; Galan, Ricardo
2008-01-01
Traditionally, logistics analysis for space missions has focused on the delivery of elements and goods to a destination. This type of logistics analysis can be referred to as "macro-logistics". While the delivery of goods is a critical component of mission analysis, it captures only a portion of the constraints that logistics planning may impose on a mission scenario. The other component of logistics analysis concerns the local handling of goods at the destination, including storage, usage, and disposal. This type of logistics analysis, referred to as "micro-logistics", may also be a primary driver in the viability of a human lunar exploration scenario. With the rigorous constraints that will be placed upon a human lunar outpost, it is necessary to accurately evaluate micro-logistics operations in order to develop exploration scenarios that will result in an acceptable level of system performance.
Design sensitivity analysis using EAL. Part 1: Conventional design parameters
NASA Technical Reports Server (NTRS)
Dopker, B.; Choi, Kyung K.; Lee, J.
1986-01-01
A numerical implementation of design sensitivity analysis of built-up structures is presented, using the versatility and convenience of an existing finite element structural analysis code and its database management system. The finite element code used in the implementation presented is the Engineering Analysis Language (EAL), which is based on a hybrid method of analysis. It was shown that design sensitivity computations can be carried out using the database management system of EAL, without writing a separate program and a separate database. Conventional (sizing) design parameters such as cross-sectional area of beams or thickness of plates and plane elastic solid components are considered. Compliance, displacement, and stress functionals are considered as performance criteria. The method presented is being extended to implement shape design sensitivity analysis using a domain method and a design component method.
Study on nondestructive discrimination of genuine and counterfeit wild ginsengs using NIRS
NASA Astrophysics Data System (ADS)
Lu, Q.; Fan, Y.; Peng, Z.; Ding, H.; Gao, H.
2012-07-01
A new approach for the nondestructive discrimination between genuine wild ginsengs and counterfeit ones by near infrared spectroscopy (NIRS) was developed. Both discriminant analysis and a back propagation artificial neural network (BP-ANN) were applied to the model establishment for discrimination. Optimal modeling wavelengths were determined based on the anomalous spectral information of counterfeit samples. Through principal component analysis (PCA) of various wild ginseng samples, genuine and counterfeit, the cumulative percentages of variance of the principal components were obtained, serving as a reference for principal component (PC) factor determination. Discriminant analysis achieved an identification ratio of 88.46%. With the samples' true class values as its outputs, a three-layer BP-ANN model was built, which yielded a higher discrimination accuracy of 100%. The overall results sufficiently demonstrate that NIRS combined with a BP-ANN classification algorithm performs better on ginseng discrimination than discriminant analysis, and can be used as a rapid and nondestructive method for the detection of counterfeit wild ginsengs in the food and pharmaceutical industries.
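A hedged sketch with simulated spectra of the PCA + BP-ANN workflow described above: PCA scores feed a small back-propagation network (here scikit-learn's MLPClassifier as a stand-in) to discriminate genuine from counterfeit samples. The spectra, class offsets, and network size are assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
n_per_class, n_wavelengths = 60, 500
base = np.sin(np.linspace(0, 6, n_wavelengths))            # common spectral baseline

genuine = base + 0.05 * rng.standard_normal((n_per_class, n_wavelengths))
counterfeit = base + 0.15 + 0.05 * rng.standard_normal((n_per_class, n_wavelengths))
X = np.vstack([genuine, counterfeit])
y = np.array([0] * n_per_class + [1] * n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
model = make_pipeline(PCA(n_components=5),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("test accuracy:", round(model.score(X_te, y_te), 3))
```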
DOT National Transportation Integrated Search
1975-05-01
The development of fatigue performance standards for freight car truck components and wheels requires a knowledge of the fluctuating service load environment, and a basis for stating the conservatism of the design with respect to the environment. On ...
Rein, Thomas R; Harvati, Katerina; Harrison, Terry
2015-01-01
Uncovering links between skeletal morphology and locomotor behavior is an essential component of paleobiology because it allows researchers to infer the locomotor repertoire of extinct species based on preserved fossils. In this study, we explored ulnar shape in anthropoid primates using 3D geometric morphometrics to discover novel aspects of shape variation that correspond to observed differences in the relative amount of forelimb suspensory locomotion performed by species. The ultimate goal of this research was to construct an accurate predictive model that can be applied to infer the significance of these behaviors. We studied ulnar shape variation in extant species using principal component analysis. Species mainly clustered into phylogenetic groups along the first two principal components. Upon closer examination, the results showed that the position of species within each major clade corresponded closely with the proportion of forelimb suspensory locomotion that they have been observed to perform in nature. We used principal component regression to construct a predictive model for the proportion of these behaviors that would be expected to occur in the locomotor repertoire of anthropoid primates. We then applied this regression analysis to Pliopithecus vindobonensis, a stem catarrhine from the Miocene of central Europe, and found strong evidence that this species was adapted to perform a proportion of forelimb suspensory locomotion similar to that observed in the extant woolly monkey, Lagothrix lagothricha. Copyright © 2014 Elsevier Ltd. All rights reserved.
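A minimal principal component regression sketch on simulated data: PCA scores of shape coordinates serve as predictors of a behavioural proportion, and the fitted model is then applied to a new specimen. All values and dimensions are illustrative, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
n_species, n_shape_vars = 30, 120                     # e.g. flattened 3-D landmark coordinates

latent = rng.standard_normal(n_species)               # hidden axis driving shape and behaviour
shapes = np.outer(latent, rng.standard_normal(n_shape_vars)) \
         + 0.3 * rng.standard_normal((n_species, n_shape_vars))
suspensory_prop = np.clip(0.5 + 0.2 * latent + 0.05 * rng.standard_normal(n_species), 0, 1)

# Principal component regression: PCA for dimension reduction, then linear regression.
pcr = make_pipeline(PCA(n_components=3), LinearRegression())
pcr.fit(shapes, suspensory_prop)

fossil_shape = shapes.mean(axis=0, keepdims=True)     # stand-in for a fossil specimen
print("predicted proportion of suspensory locomotion:", pcr.predict(fossil_shape).round(2))
```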
Liu, Shuqiang; Tan, Zhibin; Li, Pingting; Gao, Xiaoling; Zeng, Yuaner; Wang, Shuling
2016-03-20
A HepG2 cell biospecific extraction method combined with high-performance liquid chromatography-electrospray ionization-mass spectrometry (HPLC-ESI-MS) analysis was proposed for screening potential antiatherosclerotic active components in Bupeuri radix, a well-known Traditional Chinese Medicine (TCM). The hypothesis is that when cells are incubated together with TCM extracts, the potential bioactive components in the TCM selectively bind to receptors or channels of the HepG2 cells; the eluate containing the components that bound specifically to HepG2 cells is then identified using HPLC-ESI-MS analysis. The potential bioactive components of Bupeuri radix were investigated using the proposed approach. Five compounds among the saikosaponins of Bupeuri radix were detected as components that selectively bound to HepG2 cells; among these, two potentially bioactive compounds, saikosaponin b1 and saikosaponin b2 (SSb2), were identified by comparison with the chromatography of standard samples and analysis of their structural cleavage characteristics by MS. SSb2 was then used to assess the uptake of DiI-labeled high-density lipoprotein (HDL) in HepG2 cells as a measure of antiatherosclerotic activity. The results showed that SSb2, at the indicated concentrations (5, 15, 25, and 40 μM), markedly increased the uptake of dioctadecylindocarbocyanine-labeled HDL (DiI-HDL) in HepG2 cells (vs. the control group, P<0.01). In conclusion, the application of HepG2 biospecific extraction coupled with HPLC-ESI-MS analysis is a rapid, convenient, and reliable method for screening potential bioactive components in TCM, and SSb2 may be a valuable novel drug agent for the treatment of atherosclerosis. Copyright © 2016 Elsevier B.V. All rights reserved.
Atmospheric cloud physics thermal systems analysis
NASA Technical Reports Server (NTRS)
1977-01-01
Engineering analyses performed on the Atmospheric Cloud Physics (ACPL) Science Simulator expansion chamber and associated thermal control/conditioning system are reported. Analyses were made to develop a verified thermal model and to perform parametric thermal investigations to evaluate systems performance characteristics. Thermal network representations of solid components and the complete fluid conditioning system were solved simultaneously using the Systems Improved Numerical Differencing Analyzer (SINDA) computer program.
Geometric subspace methods and time-delay embedding for EEG artifact removal and classification.
Anderson, Charles W; Knight, James N; O'Connor, Tim; Kirby, Michael J; Sokolov, Artem
2006-06-01
Generalized singular-value decomposition is used to separate multichannel electroencephalogram (EEG) into components found by optimizing a signal-to-noise quotient. These components are used to filter out artifacts. Short-time principal components analysis of time-delay embedded EEG is used to represent windowed EEG data to classify EEG according to which mental task is being performed. Examples are presented of the filtering of various artifacts and results are shown of classification of EEG from five mental tasks using committees of decision trees.
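A hedged sketch on synthetic signals of the second part of the recipe above: time-delay embedding of a single channel, short-time PCA of the embedded windows, and a decision-tree classifier on the resulting features. This follows the general approach described, not the authors' exact pipeline; the simulated tasks and window sizes are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
fs, seconds = 128, 60
t = np.arange(fs * seconds) / fs

# Two "mental tasks" simulated as different dominant rhythms plus noise.
task_a = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)   # alpha-like
task_b = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)   # beta-like

def delay_embed(x, dim=16, lag=1):
    """Stack lagged copies of x: each row is a delay vector [x(n), x(n+lag), ...]."""
    n = x.size - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

def windowed_pca_features(x, win=fs):
    embedded = delay_embed(x)
    windows = [embedded[i : i + win] for i in range(0, embedded.shape[0] - win, win)]
    # Short-time PCA: represent each window by its leading component variances.
    return np.array([PCA(n_components=4).fit(w).explained_variance_ for w in windows])

X = np.vstack([windowed_pca_features(task_a), windowed_pca_features(task_b)])
y = np.array([0] * (X.shape[0] // 2) + [1] * (X.shape[0] // 2))
clf = DecisionTreeClassifier(random_state=0).fit(X[::2], y[::2])
print("accuracy on held-out windows:", round(clf.score(X[1::2], y[1::2]), 3))
```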
Real-Time, High-Frequency QRS Electrocardiograph
NASA Technical Reports Server (NTRS)
Schlegel, Todd T.; DePalma, Jude L.; Moradi, Saeed
2003-01-01
An electronic system that performs real-time analysis of the low-amplitude, high-frequency, ordinarily invisible components of the QRS portion of an electrocardiographic signal has been developed. Whereas the signals readily visible on a conventional electrocardiogram (ECG) have amplitudes of the order of a millivolt and are characterized by frequencies <100 Hz, the ordinarily invisible components have amplitudes in the microvolt range and are characterized by frequencies from about 150 to about 250 Hz. Deviations of these high-frequency components from a normal pattern can be indicative of myocardial ischemia or myocardial infarction.
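A minimal sketch of isolating the 150-250 Hz high-frequency QRS band from a sampled ECG with a zero-phase Butterworth band-pass filter. The signal is synthetic, and the filter order and sampling rate are illustrative choices, not the device's actual design.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000                                        # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(8)

# Synthetic ECG-like trace: millivolt-scale low-frequency content plus a small 200 Hz component.
ecg = np.sin(2 * np.pi * 8 * t) + 5e-3 * np.sin(2 * np.pi * 200 * t) \
      + 1e-3 * rng.standard_normal(t.size)

sos = butter(4, [150, 250], btype="bandpass", fs=fs, output="sos")
hf_qrs = sosfiltfilt(sos, ecg)                   # microvolt-scale high-frequency content
print("RMS of high-frequency band:", np.sqrt(np.mean(hf_qrs ** 2)))
```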
Vairavan, Srinivasan; Eswaran, Hari; Preissl, Hubert; Wilson, James D; Haddad, Naim; Lowery, Curtis L; Govindan, Rathinaswamy B
2010-01-01
The fetal magnetoencephalogram (fMEG) is measured in the presence of large interference from maternal and fetal magnetocardiograms (mMCG and fMCG). These cardiac interferences can be attenuated by orthogonal projection (OP) technique of the corresponding spatial vectors. However, the OP technique redistributes the fMEG signal among the channels and also leaves some cardiac residuals (partially attenuated mMCG and fMCG) due to loss of stationarity in the signal. In this paper, we propose a novel way to extract and localize the neonatal and fetal spontaneous brain activity by using independent component analysis (ICA) technique. In this approach, we perform ICA on a small subset of sensors for 1-min duration. The independent components obtained are further investigated for the presence of discontinuous patterns as identified by the Hilbert phase analysis and are used as decision criteria for localizing the spontaneous brain activity. In order to locate the region of highest spontaneous brain activity content, this analysis is performed on the sensor subsets, which are traversed across the entire sensor space. The region of the spontaneous brain activity as identified by the proposed approach correlated well with the neonatal and fetal head location. In addition, the burst duration and the inter-burst interval computed for the identified discontinuous brain patterns are in agreement with the reported values.
Telephone system operations evaluation : before AOS implementation
DOT National Transportation Integrated Search
1999-01-01
This study provides a detailed baseline analysis of telephone system performance before AOS implementation. By the time of the preparation of this report, the phone system component of AOS had not been implemented.
NASA Technical Reports Server (NTRS)
Miller, W. S.
1974-01-01
A structural analysis was performed on the 1/4-watt cryogenic refrigerator. The analysis covered the complete assembly except for the cooling jacket and mounting brackets. Maximum stresses, margin of safety, and natural frequencies were calculated for structurally loaded refrigerator components shown in assembly drawings. The stress analysis indicates that the design is satisfactory for the specified vibration environment, and the proof, burst, and normal operating loads.
NASA Astrophysics Data System (ADS)
Stisen, S.; Demirel, C.; Koch, J.
2017-12-01
Evaluation of performance is an integral part of model development and calibration, and it is of paramount importance when communicating modelling results to stakeholders and the scientific community. The hydrological modelling community has a comprehensive and well-tested toolbox of metrics to assess temporal model performance. In contrast, experience in evaluating spatial performance has not kept pace with the wide availability of spatial observations or with the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study aims at making a contribution towards advancing spatial pattern oriented model evaluation for distributed hydrological models. This is achieved by introducing a novel spatial performance metric which provides robust pattern performance during model calibration. The promoted SPAtial EFficiency (spaef) metric reflects three equally weighted components: correlation, coefficient of variation and histogram overlap. This multi-component approach is necessary in order to adequately compare spatial patterns. spaef, its three components individually, and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are tested in a spatial pattern oriented calibration of a catchment model in Denmark. The calibration is constrained by a remote-sensing-based spatial pattern of evapotranspiration and discharge time series at two stations. Our results stress that stand-alone metrics tend to fail to provide holistic pattern information to the optimizer, which underlines the importance of multi-component metrics. The three spaef components are independent, which allows them to complement each other in a meaningful way. This study promotes the use of bias-insensitive metrics which allow comparing variables that are related but may differ in unit, in order to optimally exploit the spatial observations made available by remote sensing platforms. We see great potential for spaef across environmental disciplines dealing with spatially distributed modelling.
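A sketch of the spaef metric as it is commonly formulated in the literature (pattern correlation, ratio of coefficients of variation, and histogram overlap of z-scored fields combined as a Euclidean distance from the ideal point). Treat this as an illustrative implementation under that assumption, not the authors' reference code; the synthetic evapotranspiration fields are placeholders.

```python
import numpy as np

def spaef(sim, obs, bins=100):
    sim, obs = np.ravel(sim), np.ravel(obs)
    alpha = np.corrcoef(sim, obs)[0, 1]                                  # pattern correlation
    beta = (np.std(sim) / np.mean(sim)) / (np.std(obs) / np.mean(obs))   # ratio of CVs
    # Histogram overlap of z-scored fields (insensitive to bias and units).
    zs, zo = (sim - sim.mean()) / sim.std(), (obs - obs.mean()) / obs.std()
    lo, hi = min(zs.min(), zo.min()), max(zs.max(), zo.max())
    hs, _ = np.histogram(zs, bins=bins, range=(lo, hi))
    ho, _ = np.histogram(zo, bins=bins, range=(lo, hi))
    gamma = np.minimum(hs, ho).sum() / ho.sum()
    return 1.0 - np.sqrt((alpha - 1) ** 2 + (beta - 1) ** 2 + (gamma - 1) ** 2)

rng = np.random.default_rng(9)
obs_et = rng.random((50, 50)) + 1.0                 # synthetic evapotranspiration pattern
sim_et = obs_et + 0.1 * rng.standard_normal((50, 50))
print("spaef:", round(spaef(sim_et, obs_et), 3))
```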
Understanding International GNC Hardware Trends
NASA Technical Reports Server (NTRS)
Greenbaum, Adam; Brady, Tye; Dennehy, Cornelius; Airey, Stephen P.; Roelke, Evan; Judd, Samuel Brady
2015-01-01
An industry-wide survey of guidance, navigation and control (GNC) sensors, namely star trackers, gyros, and sun sensors was undertaken in 2014, in which size, mass, power, and various performance metrics were recorded for each category. A multidimensional analysis was performed, looking at the spectrum of available sensors, with the intent of identifying gaps in the available capability range. Mission types that are not currently well served by the available components were discussed, as well as some missions that would be enabled by filling gaps in the component space. This paper continues that study, with a focus on reaction wheels and magnetometers, as well as with updates to the listings of star trackers, gyros, and sun sensors. Also discussed are a framework for making the database available to the community at large, and the continued maintenance of this database and the analysis of its contents.
Coupled parametric design of flow control and duct shape
NASA Technical Reports Server (NTRS)
Florea, Razvan (Inventor); Bertuccioli, Luca (Inventor)
2009-01-01
A method for designing gas turbine engine components using a coupled parametric analysis of part geometry and flow control is disclosed. Included are the steps of parametrically defining the geometry of the duct wall shape, parametrically defining one or more flow control actuators in the duct wall, measuring a plurality of performance parameters or metrics (e.g., flow characteristics) of the duct and comparing the results of the measurement with desired or target parameters, and selecting the optimal duct geometry and flow control for at least a portion of the duct, the selection process including evaluating the plurality of performance metrics in a pareto analysis. The use of this method in the design of inter-turbine transition ducts, serpentine ducts, inlets, diffusers, and similar components provides a design which reduces pressure losses and flow profile distortions.
Ad hoc Laser networks component technology for modular spacecraft
NASA Astrophysics Data System (ADS)
Huang, Xiujun; Shi, Dele; Ma, Zongfeng; Shen, Jingshi
2016-03-01
Distributed reconfigurable satellites are a new kind of spacecraft system based on a flexible platform of modularization and standardization. Based on an analysis of the spacecraft module data flows, this paper proposes a network component built on an ad hoc laser network architecture, combining a low-speed control network with a high-speed payload network using microwave and laser communication, without a mesh network mode, to improve the flexibility of the network. The ad hoc laser network component technology was developed, and related performance tests and experiments were carried out. The results showed that ad hoc laser network components can meet the demands of future networking between spacecraft modules.
Ad hoc laser networks component technology for modular spacecraft
NASA Astrophysics Data System (ADS)
Huang, Xiujun; Shi, Dele; Shen, Jingshi
2017-10-01
Distributed reconfigurable satellites are a new kind of spacecraft system based on a flexible platform of modularization and standardization. Based on an analysis of the spacecraft module data flows, this paper proposes a network component built on an ad hoc laser network architecture, combining a low-speed control network with a high-speed payload network using microwave and laser communication, without a mesh network mode, to improve the flexibility of the network. The ad hoc laser network component technology was developed, and related performance tests and experiments were carried out. The results showed that ad hoc laser network components can meet the demands of future networking between spacecraft modules.
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA based technique provides performance improvement over the conventional feature extraction methods.
Study of component technologies for fuel cell on-site integrated energy system. Volume 2: Appendices
NASA Technical Reports Server (NTRS)
Lee, W. D.; Mathias, S.
1980-01-01
This data base catalogue was compiled in order to facilitate the analysis of various on-site integrated energy systems with fuel cell power plants. The catalogue is divided into two sections. The first characterizes individual components in terms of their performance profiles as a function of design parameters. The second characterizes total heating and cooling systems in terms of energy output as a function of input and control variables. The integrated fuel cell system diagrams and the computer analysis of the systems are included, as well as the cash flow series for the baseline systems.
Fetal ECG extraction using independent component analysis by Jade approach
NASA Astrophysics Data System (ADS)
Giraldo-Guzmán, Jader; Contreras-Ortiz, Sonia H.; Lasprilla, Gloria Isabel Bautista; Kotas, Marian
2017-11-01
Fetal ECG monitoring is a useful method to assess fetal health and detect abnormal conditions. In this paper we propose an approach to extract the fetal ECG from abdominal and chest signals using independent component analysis based on the joint approximate diagonalization of eigenmatrices (JADE) approach. The JADE approach avoids redundancy, which reduces matrix dimension and computational costs. Signals were filtered with a high-pass filter to eliminate low-frequency noise. Several levels of decomposition were tested until the fetal ECG was recognized in one of the separated source outputs. The proposed method shows fast and good performance.
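A hedged sketch on synthetic signals of the source-separation step: a weak fetal ECG is recovered from mixed abdominal channels with ICA. scikit-learn provides FastICA rather than JADE, so FastICA is used here purely as a stand-in; the crude ECG model, heart rates, and mixing matrix are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

fs = 500
t = np.arange(0, 10.0, 1 / fs)
rng = np.random.default_rng(10)

def ecg_like(rate_hz, width=0.03):
    """Crude train of Gaussian 'R peaks' at the given heart rate."""
    beats = np.arange(0, t[-1], 1.0 / rate_hz)
    return sum(np.exp(-((t - b) ** 2) / (2 * width ** 2)) for b in beats)

maternal = ecg_like(1.2)                          # ~72 bpm
fetal = 0.2 * ecg_like(2.3)                       # ~138 bpm, much weaker
sources = np.vstack([maternal, fetal, 0.05 * rng.standard_normal(t.size)])

mixing = rng.random((4, 3))                       # 4 abdominal/chest channels
observations = (mixing @ sources).T               # shape (n_samples, n_channels)

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(observations)      # one column should resemble the fetal ECG
print("recovered component array shape:", components.shape)
```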
NASA Astrophysics Data System (ADS)
Jiang, Weiping; Ma, Jun; Li, Zhao; Zhou, Xiaohui; Zhou, Boye
2018-05-01
The analysis of the correlations between the noise in different components of GPS stations has positive significance for those trying to obtain more accurate uncertainties of velocity with respect to station motion. Previous research into noise in GPS position time series focused mainly on single component evaluation, which affects the acquisition of precise station positions, the velocity field, and its uncertainty. In this study, before and after removing the common-mode error (CME), we performed one-dimensional linear regression analysis of the noise amplitude vectors in different components of 126 GPS stations in Southern California, using a combination of white noise, flicker noise, and random walk noise. The results show that, on the one hand, there are above-moderate degrees of correlation between the white noise amplitude vectors in all components of the stations before and after removal of the CME, while the correlations between flicker noise amplitude vectors in horizontal and vertical components are enhanced from uncorrelated to moderately correlated by removing the CME. On the other hand, the significance tests show that all of the obtained linear regression equations, which represent a unique function of the noise amplitude in any two components, are of practical value after removing the CME. According to the noise amplitude estimates in two components and the linear regression equations, more accurate noise amplitudes can be acquired in the two components.
Puniya, Bhanwar Lal; Allen, Laura; Hochfelder, Colleen; Majumder, Mahbubul; Helikar, Tomáš
2016-01-01
Dysregulation in signal transduction pathways can lead to a variety of complex disorders, including cancer. Computational approaches such as network analysis are important tools to understand system dynamics as well as to identify critical components that could be further explored as therapeutic targets. Here, we performed perturbation analysis of a large-scale signal transduction model in extracellular environments that stimulate cell death, growth, motility, and quiescence. Each of the model’s components was perturbed under both loss-of-function and gain-of-function mutations. Using 1,300 simulations under both types of perturbations across various extracellular conditions, we identified the most and least influential components based on the magnitude of their influence on the rest of the system. Based on the premise that the most influential components might serve as better drug targets, we characterized them for biological functions, housekeeping genes, essential genes, and druggable proteins. The most influential components under all environmental conditions were enriched with several biological processes. The inositol pathway was found as most influential under inactivating perturbations, whereas the kinase and small lung cancer pathways were identified as the most influential under activating perturbations. The most influential components were enriched with essential genes and druggable proteins. Moreover, known cancer drug targets were also classified in influential components based on the affected components in the network. Additionally, the systemic perturbation analysis of the model revealed a network motif of most influential components which affect each other. Furthermore, our analysis predicted novel combinations of cancer drug targets with various effects on other most influential components. We found that the combinatorial perturbation consisting of PI3K inactivation and overactivation of IP3R1 can lead to increased activity levels of apoptosis-related components and tumor-suppressor genes, suggesting that this combinatorial perturbation may lead to a better target for decreasing cell proliferation and inducing apoptosis. Finally, our approach shows a potential to identify and prioritize therapeutic targets through systemic perturbation analysis of large-scale computational models of signal transduction. Although some components of the presented computational results have been validated against independent gene expression data sets, more laboratory experiments are warranted to more comprehensively validate the presented results. PMID:26904540
Performance-based measures associate with frailty in patients with end-stage liver disease
Lai, Jennifer C.; Volk, Michael L; Strasburg, Debra; Alexander, Neil
2016-01-01
Background Physical frailty, as measured by the Fried Frailty Index, is increasingly recognized as a critical determinant of outcomes in cirrhotics. However, its utility is limited by the inclusion of self-reported components. We aimed to identify performance-based measures associated with frailty in patients with cirrhosis. Methods Cirrhotics ≥50 years underwent: 6-minute walk test (6MWT, cardiopulmonary endurance), chair stands in 30 seconds (muscle endurance), isometric knee extension (lower extremity strength), unipedal stance time (static balance), and maximal step length (dynamic balance/coordination). Linear regression associated each physical performance test with frailty. Principal components exploratory factor analysis evaluated the inter-relatedness of frailty and the 5 physical performance tests. Results Of forty cirrhotics, with a median age of 64 years and a Model for End-stage Liver Disease (MELD) score of 12, 10 (25%) were frail by Fried Frailty Index ≥3. Frail cirrhotics had poorer performance in 6MWT distance (231 vs. 338 meters), 30-second chair stands (7 vs. 10), isometric knee extension (86 vs. 122 Newton meters), and maximal step length (22 vs. 27 inches) [p≤0.02 for each]. Each physical performance test was significantly associated with frailty (p<0.01), even after adjustment for MELD or hepatic encephalopathy. Principal component factor analysis demonstrated substantial, but unique, clustering of each physical performance test to a single factor, frailty. Conclusion Frailty in cirrhosis is a multi-dimensional construct that is distinct from liver dysfunction and incorporates endurance, strength, and balance. Our data provide specific targets for prehabilitation interventions aimed at reducing frailty in cirrhotics in preparation for liver transplantation. PMID:27495749
Performance-Based Measures Associate With Frailty in Patients With End-Stage Liver Disease.
Lai, Jennifer C; Volk, Michael L; Strasburg, Debra; Alexander, Neil
2016-12-01
Physical frailty, as measured by the Fried Frailty Index, is increasingly recognized as a critical determinant of outcomes in patients with cirrhosis. However, its utility is limited by the inclusion of self-reported components. We aimed to identify performance-based measures associated with frailty in patients with cirrhosis. Patients with cirrhosis, aged 50 years or older, underwent: 6-minute walk test (cardiopulmonary endurance), chair stands in 30 seconds (muscle endurance), isometric knee extension (lower extremity strength), unipedal stance time (static balance), and maximal step length (dynamic balance/coordination). Linear regression associated each physical performance test with frailty. Principal components exploratory factor analysis evaluated the interrelatedness of frailty and the 5 physical performance tests. Of 40 patients with cirrhosis, with a median age of 64 years and a Model for End-stage Liver Disease (MELD) score of 12, 10 (25%) were frail by Fried Frailty Index ≥3. Frail patients with cirrhosis had poorer performance in 6-minute walk test distance (231 vs 338 m), 30-second chair stands (7 vs 10), isometric knee extension (86 vs 122 Newton meters), and maximal step length (22 vs 27 in.) (P ≤ 0.02 for each). Each physical performance test was significantly associated with frailty (P < 0.01), even after adjustment for MELD or hepatic encephalopathy. Principal component factor analysis demonstrated substantial, but unique, clustering of each physical performance test to a single factor, frailty. Frailty in cirrhosis is a multidimensional construct that is distinct from liver dysfunction and incorporates endurance, strength, and balance. Our data provide specific targets for prehabilitation interventions aimed at reducing frailty in patients with cirrhosis in preparation for liver transplantation.
Spatially distributed effects of mental exhaustion on resting-state FMRI networks.
Esposito, Fabrizio; Otto, Tobias; Zijlstra, Fred R H; Goebel, Rainer
2014-01-01
Brain activity during rest is spatially coherent over functional connectivity networks called resting-state networks. In resting-state functional magnetic resonance imaging, independent component analysis yields spatially distributed network representations reflecting distinct mental processes, such as intrinsic (default) or extrinsic (executive) attention, and sensory inhibition or excitation. These aspects can be related to different treatments or subjective experiences. Among these, exhaustion is a common psychological state induced by prolonged mental performance. Using repeated functional magnetic resonance imaging sessions and spatial independent component analysis, we explored the effect of several hours of sustained cognitive performances on the resting human brain. Resting-state functional magnetic resonance imaging was performed on the same healthy volunteers in two days, with and without, and before, during and after, an intensive psychological treatment (skill training and sustained practice with a flight simulator). After each scan, subjects rated their level of exhaustion and performed an N-back task to evaluate eventual decrease in cognitive performance. Spatial maps of selected resting-state network components were statistically evaluated across time points to detect possible changes induced by the sustained mental performance. The intensive treatment had a significant effect on exhaustion and effort ratings, but no effects on N-back performances. Significant changes in the most exhausted state were observed in the early visual processing and the anterior default mode networks (enhancement) and in the fronto-parietal executive networks (suppression), suggesting that mental exhaustion is associated with a more idling brain state and that internal attention processes are facilitated to the detriment of more extrinsic processes. The described application may inspire future indicators of the level of fatigue in the neural attention system.
Guided filter and principal component analysis hybrid method for hyperspectral pansharpening
NASA Astrophysics Data System (ADS)
Qu, Jiahui; Li, Yunsong; Dong, Wenqian
2018-01-01
Hyperspectral (HS) pansharpening aims to generate a fused HS image with high spectral and spatial resolution through integrating an HS image with a panchromatic (PAN) image. A guided filter (GF) and principal component analysis (PCA) hybrid HS pansharpening method is proposed. First, the HS image is interpolated and the PCA transformation is performed on the interpolated HS image. The first principal component (PC1) channel concentrates on the spatial information of the HS image. Different from the traditional PCA method, the proposed method sharpens the PAN image and utilizes the GF to obtain the spatial information difference between the HS image and the enhanced PAN image. Then, in order to reduce spectral and spatial distortion, an appropriate tradeoff parameter is defined and the spatial information difference is injected into the PC1 channel through multiplying by this tradeoff parameter. Once the new PC1 channel is obtained, the fused image is finally generated by the inverse PCA transformation. Experiments performed on both synthetic and real datasets show that the proposed method outperforms other several state-of-the-art HS pansharpening methods in both subjective and objective evaluations.
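A hedged sketch with synthetic data of the GF + PCA pansharpening idea described above: PCA of the upsampled HS cube, a simple box-window guided filter applied with PC1 as the guide, and injection of the extracted spatial detail into PC1 scaled by a tradeoff parameter before the inverse transform. The guided-filter variant, parameter values, and synthetic images are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom
from sklearn.decomposition import PCA

def guided_filter(guide, src, radius=4, eps=1e-3):
    """Basic single-channel guided filter built from box-window means."""
    size = 2 * radius + 1
    mean_g, mean_s = uniform_filter(guide, size), uniform_filter(src, size)
    corr_gg, corr_gs = uniform_filter(guide * guide, size), uniform_filter(guide * src, size)
    a = (corr_gs - mean_g * mean_s) / (corr_gg - mean_g ** 2 + eps)
    b = mean_s - a * mean_g
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

rng = np.random.default_rng(11)
pan = rng.random((64, 64))                                    # high-resolution PAN image
hs_lowres = np.array([zoom(pan, 0.25, order=1) * w for w in np.linspace(0.5, 1.5, 20)])
hs_up = np.array([zoom(band, 4, order=1) for band in hs_lowres])   # interpolated HS cube

pixels = hs_up.reshape(20, -1).T                              # (n_pixels, n_bands)
pca = PCA(n_components=20).fit(pixels)
scores = pca.transform(pixels)
pc1 = scores[:, 0].reshape(64, 64)

detail = pan - guided_filter(pc1, pan)                        # spatial detail missing from PC1
tradeoff = 0.8                                                # assumed injection weight
scores[:, 0] = (pc1 + tradeoff * detail).ravel()
fused = pca.inverse_transform(scores).T.reshape(20, 64, 64)   # inverse PCA gives the fused cube
print("fused cube shape:", fused.shape)
```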
Ding, Shujing; Dudley, Ed; Plummer, Sue; Tang, Jiandong; Newton, Russell P; Brenton, A Gareth
2006-01-01
A reversed-phase high-performance liquid chromatography/electrospray ionisation mass spectrometry (RP-HPLC/ESI-MS) method was developed and validated for the simultaneous determination of ten major active components in Ginkgo biloba extract (bilobalide, ginkgolides A, B, C, quercetin, kaempferol, isorhamnetin, rutin hydrate, quercetin-3-beta-D-glucoside and quercitrin hydrate) which have not been previously reported to be quantified in a single analysis. The ten components exhibit baseline separation in 50 min by C18 chromatography using a water/1:1 (v/v) methanol/acetonitrile gradient. Quantitation was performed using negative ESI-MS in selected ion monitoring (SIM) mode. Good reproducibility and recovery were obtained by this method. The sensitivity of both UV and different mass spectrometry modes (full scan, selected ion monitoring (SIM), and selected reaction monitoring (SRM)) were compared and both quantitation with and without internal standard were evaluated. The analysis of Ginkgo biloba commercial products showed remarkable variations in the rutin and quercetin content as well as the terpene lactone contents although all the products satisfy the conventional quality control method. Copyright 2006 John Wiley & Sons, Ltd.
Lou, Qiong; Ye, Xiaolan; Zhou, Yingyi; Li, Hua; Song, Fenyun
2015-06-01
A method incorporating double-wavelength ultra high performance liquid chromatography with quadrupole time-of-flight mass spectrometry was developed for the investigation of the chemical fingerprint of Ganmaoling granule. The chromatographic separations were performed on an ACQUITY UPLC HSS C18 column (2.1 × 50 mm, 1.8 μm) at 30°C using gradient elution with water/formic acid (1%) and acetonitrile at a flow rate of 0.4 mL/min. A total of 11 chemical constituents of Ganmaoling granule were identified from their molecular weight, UV spectra, tandem mass spectrometry data, and retention behavior by comparing the results with those of the reference standards or literature. And 25 peaks were selected as the common peaks for fingerprint analysis to evaluate the similarities among 25 batches of Ganmaoling granule. The results of principal component analysis and orthogonal projection to latent structures discriminant analysis showed that the important chemical markers that could distinguish the different batches were revealed as 4,5-di-O-caffeoylquinic acid, 3,5-di-O-caffeoylquinic acid, and 4-O-caffeoylquinic acid. This is the first report of the ultra high performance liquid chromatography chemical fingerprint and component identification of Ganmaoling granule, which could lay a foundation for further studies of Ganmaoling granule. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Coping with Trial-to-Trial Variability of Event Related Signals: A Bayesian Inference Approach
NASA Technical Reports Server (NTRS)
Ding, Mingzhou; Chen, Youghong; Knuth, Kevin H.; Bressler, Steven L.; Schroeder, Charles E.
2005-01-01
In electro-neurophysiology, single-trial brain responses to a sensory stimulus or a motor act are commonly assumed to result from the linear superposition of a stereotypic event-related signal (e.g. the event-related potential or ERP) that is invariant across trials and some ongoing brain activity often referred to as noise. To extract the signal, one performs an ensemble average of the brain responses over many identical trials to attenuate the noise. To date, this simple signal-plus-noise (SPN) model has been the dominant approach in cognitive neuroscience. Mounting empirical evidence has shown that the assumptions underlying this model may be overly simplistic. More realistic models have been proposed that account for the trial-to-trial variability of the event-related signal as well as the possibility of multiple differentially varying components within a given ERP waveform. The variable-signal-plus-noise (VSPN) model, which has been demonstrated to provide the foundation for separation and characterization of multiple differentially varying components, has the potential to provide a rich source of information for questions related to neural functions that complement the SPN model. Thus, being able to estimate the amplitude and latency of each ERP component on a trial-by-trial basis provides a critical link between the perceived benefits of the VSPN model and its many concrete applications. In this paper we describe a Bayesian approach to deal with this issue and the resulting strategy is referred to as the differentially Variable Component Analysis (dVCA). We compare the performance of dVCA on simulated data with Independent Component Analysis (ICA) and analyze neurobiological recordings from monkeys performing cognitive tasks.
The dependability of medical students' performance ratings as documented on in-training evaluations.
van Barneveld, Christina
2005-03-01
To demonstrate an approach to obtain an unbiased estimate of the dependability of students' performance ratings during training, when the data-collection design includes nesting of student in rater, unbalanced nest sizes, and dependent observations. In 2003, two variance components analyses of in-training evaluation (ITE) report data were conducted using urGENOVA software. In the first analysis, the dependability for the nested and unbalanced data-collection design was calculated. In the second analysis, an approach using multiple generalizability studies was used to obtain an unbiased estimate of the student variance component, resulting in an unbiased estimate of dependability. Results suggested that there is bias in estimates of the dependability of students' performance on ITEs that are attributable to the data-collection design. When the bias was corrected, the results indicated that the dependability of ratings of student performance was almost zero. The combination of the multiple generalizability studies method and the use of specialized software provides an unbiased estimate of the dependability of ratings of student performance on ITE scores for data-collection designs that include nesting of student in rater, unbalanced nest sizes, and dependent observations.
Jackson, Brian A; Faith, Kay Sullivan
2013-02-01
Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments for stockpile delivery and dispensing to provide a view of likely future response performance.
Multifractal Cross Wavelet Analysis
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Gao, Xing-Lu; Zhou, Wei-Xing; Stanley, H. Eugene
Complex systems are composed of mutually interacting components and the output values of these components usually exhibit long-range cross-correlations. Using wavelet analysis, we propose a method of characterizing the joint multifractal nature of these long-range cross correlations, a method we call multifractal cross wavelet analysis (MFXWT). We assess the performance of the MFXWT method by performing extensive numerical experiments on the dual binomial measures with multifractal cross correlations and the bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. For binomial multifractal measures, we find the empirical joint multifractality of MFXWT to be in approximate agreement with the theoretical formula. For bFBMs, MFXWT may provide spurious multifractality because of the wide spanning range of the multifractal spectrum. We also apply the MFXWT method to stock market indices, and in pairs of index returns and volatilities we find an intriguing joint multifractal behavior. The tests on surrogate series also reveal that the cross correlation behavior, particularly the cross correlation with zero lag, is the main origin of cross multifractality.
Clerici, Nicola; Bodini, Antonio; Ferrarini, Alessandro
2004-10-01
In order to achieve improved sustainability, local authorities need to use tools that adequately describe and synthesize environmental information. This article illustrates a methodological approach that organizes a wide suite of environmental indicators into a few aggregated indices, making use of correlation, principal component analysis, and fuzzy sets. Furthermore, a weighting system, which includes stakeholders' priorities and ambitions, is applied. As a case study, the described methodology is applied to the Reggio Emilia Province in Italy, by considering environmental information from 45 municipalities. Principal component analysis is used to condense an initial set of 19 indicators into 6 fundamental dimensions that highlight patterns of environmental conditions at the provincial scale. These dimensions are further aggregated into two indices of environmental performance through fuzzy sets. The simple form of these indices makes them particularly suitable for public communication, as they condense a wide set of heterogeneous indicators. The main outcomes of the analysis and the potential applications of the method are discussed.
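A minimal sketch of the PCA condensation step, assuming scikit-learn; the indicator matrix is a random placeholder standing in for the 45 municipalities by 19 indicators, and the fuzzy aggregation step is not shown.

```python
# Minimal sketch: condensing environmental indicators into a few principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(45, 19))               # placeholder: 45 municipalities x 19 indicators

X_std = StandardScaler().fit_transform(X)   # standardize so PCA works on correlations
pca = PCA(n_components=6).fit(X_std)

scores = pca.transform(X_std)               # municipality scores on the 6 dimensions
print("Variance explained per component:", pca.explained_variance_ratio_.round(2))
print("Cumulative variance explained:", pca.explained_variance_ratio_.cumsum()[-1].round(2))
```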
Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study
NASA Astrophysics Data System (ADS)
Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh
2018-03-01
A lack of problem-solving techniques and of cooperation between support groups are two obstacles frequently faced on actual production lines. Inadequate detailed analysis and inappropriate problem-solving techniques may cause recurring issues that affect organizational performance. This study uses a well-structured Six Sigma DMAIC approach in combination with other problem-solving tools to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate contributing to scrap and rework performance. A detailed analysis is conducted in the analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreased substantially and the process capability index improved from 0.75 to 1.67. These results prove that the Six Sigma approach used to tackle the quality problem is substantially effective.
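For readers unfamiliar with the capability index quoted above, the sketch below shows how Cpk is typically computed before and after an improvement. The specification limits and process statistics are illustrative assumptions, not data from this study.

```python
# Minimal sketch: process capability index (Cpk) before and after improvement.
def cpk(mean, std, lsl, usl):
    """Cpk = min(USL - mean, mean - LSL) / (3 * sigma), accounting for centering."""
    return min(usl - mean, mean - lsl) / (3 * std)

# Hypothetical stripping-dimension data (e.g. strip length in mm)
lsl, usl = 4.0, 6.0
before = dict(mean=5.3, std=0.30)
after  = dict(mean=5.0, std=0.20)

print("Cpk before:", round(cpk(before["mean"], before["std"], lsl, usl), 2))
print("Cpk after: ", round(cpk(after["mean"],  after["std"],  lsl, usl), 2))
```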
Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica
2016-04-19
The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with the multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm⁻¹). Factor scores in 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils being at the vertices, binary mixtures at the edges, and ternary mixtures on the faces of a tetrahedron. To confirm the validity of results, we applied several cross-validation methods. Quantitative analysis was performed by minimization of root-mean-square error of cross-validation values regarding the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R² > 0.99 in all cases). Additionally, experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil.
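A minimal sketch of the RMSECV-based model selection described above, assuming scikit-learn's PLS implementation; the spectra and composition targets are synthetic placeholders, and only the number of latent components is tuned (not the spectral range or derivative order).

```python
# Minimal sketch: choosing the number of PLS components by minimizing RMSECV.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 166, 300
X = rng.normal(size=(n_samples, n_wavenumbers))            # stand-in for FTIR-ATR spectra
true_coef = rng.normal(size=(n_wavenumbers, 3))
Y = X @ true_coef + 0.1 * rng.normal(size=(n_samples, 3))  # oil fractions (3 of 4 are free)

def rmsecv(n_comp):
    pred = cross_val_predict(PLSRegression(n_components=n_comp), X, Y, cv=10)
    return np.sqrt(np.mean((Y - pred) ** 2))

errors = {k: rmsecv(k) for k in range(1, 11)}
best = min(errors, key=errors.get)
print("RMSECV by number of components:", {k: round(v, 3) for k, v in errors.items()})
print("Selected number of components:", best)
```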
Cognitive Task Analysis of Prioritization in Air Traffic Control.
ERIC Educational Resources Information Center
Redding, Richard E.; And Others
A cognitive task analysis was performed to analyze the key cognitive components of the en route air traffic controllers' jobs. The goals were to ascertain expert mental models and decision-making strategies and to identify important differences in controller knowledge, skills, and mental models as a function of expertise. Four groups of…
The Integration of Psycholinguistic and Discourse Processing Theories of Reading Comprehension.
ERIC Educational Resources Information Center
Beebe, Mona J.
To assess the compatibility of miscue analysis and recall analysis as independent elements in a theory of reading comprehension, a study was performed that operationalized each theory and separated its components into measurable units to allow empirical testing. A cueing strategy model was estimated, but the discourse processing model was broken…
Integrable multi-component generalization of a modified short pulse equation
NASA Astrophysics Data System (ADS)
Matsuno, Yoshimasa
2016-11-01
We propose a multi-component generalization of the modified short pulse (SP) equation which was derived recently as a reduction of Feng's two-component SP equation. In particular, we address the two-component system in depth. We obtain the Lax pair, an infinite number of conservation laws and multisoliton solutions for the system, demonstrating its integrability. Subsequently, we show that the two-component system exhibits cusp solitons and breathers, for which a detailed analysis is performed. Specifically, we explore the interaction process of two cusp solitons and derive the formula for the phase shift. While cusp solitons are singular solutions, smooth breather solutions are shown to exist, provided that the parameters characterizing the solutions satisfy certain conditions. Finally, we discuss the relation between the proposed system and existing two-component SP equations.
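As a point of reference (not taken from this abstract), the original scalar short pulse equation, of which the modified and multi-component systems discussed here are generalizations, can be written as follows.

```latex
% Original short pulse equation (Schafer-Wayne); the modified and multi-component
% systems in the paper above generalize this scalar model.
\[
  u_{xt} \;=\; u \;+\; \tfrac{1}{6}\,\bigl(u^{3}\bigr)_{xx}.
\]
```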
EFICAz2: enzyme function inference by a combined approach enhanced by machine learning.
Arakaki, Adrian K; Huang, Ying; Skolnick, Jeffrey
2009-04-13
We previously developed EFICAz, an enzyme function inference approach that combines predictions from non-completely overlapping component methods. Two of the four components in the original EFICAz are based on the detection of functionally discriminating residues (FDRs). FDRs distinguish members of an enzyme family that are homofunctional (classified under the EC number of interest) from those that are heterofunctional (annotated with another EC number or lacking enzymatic activity). Each of the two FDR-based components is associated with one of two specific kinds of enzyme families. EFICAz exhibits high precision performance, except when the maximal test to training sequence identity (MTTSI) is lower than 30%. To improve EFICAz's performance in this regime, we: i) increased the number of predictive components and ii) took advantage of consensual information from the different components to make the final EC number assignment. We have developed two new EFICAz components, analogous to the two FDR-based components, where the discrimination between homofunctional and heterofunctional members is based on the evaluation, via Support Vector Machine models, of all the aligned positions between the query sequence and the multiple sequence alignments associated with the enzyme families. Benchmark results indicate that: i) the new SVM-based components outperform their FDR-based counterparts, and ii) both SVM-based and FDR-based components generate unique predictions. We developed classification tree models to optimally combine the results from the six EFICAz components into a final EC number prediction. The new implementation of our approach, EFICAz2, exhibits a highly improved prediction precision at MTTSI < 30% compared to the original EFICAz, with only a slight decrease in prediction recall. A comparative analysis of enzyme function annotation of the human proteome by EFICAz2 and KEGG shows that: i) when both sources make EC number assignments for the same protein sequence, the assignments tend to be consistent and ii) EFICAz2 generates considerably more unique assignments than KEGG. Performance benchmarks and the comparison with KEGG demonstrate that EFICAz2 is a powerful and precise tool for enzyme function annotation, with multiple applications in genome analysis and metabolic pathway reconstruction. The EFICAz2 web service is available at: http://cssb.biology.gatech.edu/skolnick/webservice/EFICAz2/index.html.
Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S
2017-06-01
Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
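A minimal sketch of the general FPCA idea mentioned above: approximate each curve with a small basis and then apply ordinary PCA to the basis coefficients. This illustrates only the basis-expansion step, not the unified FPCA/FMCCA criterion or the alternating regularized least squares algorithm; the curves, basis choice, and omission of the basis Gram matrix weighting are simplifying assumptions.

```python
# Minimal sketch: functional PCA via basis-function approximation of curves.
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_time, n_basis = 30, 100, 8
t = np.linspace(0, 1, n_time)

# Synthetic smooth curves with subject-specific variation plus noise
curves = (np.sin(2 * np.pi * t) * rng.normal(1.0, 0.3, (n_subjects, 1))
          + np.cos(4 * np.pi * t) * rng.normal(0.0, 0.2, (n_subjects, 1))
          + 0.05 * rng.standard_normal((n_subjects, n_time)))

# Polynomial basis as a simple stand-in for B-splines
B = np.vander(t, n_basis, increasing=True)             # (n_time, n_basis)
coefs, *_ = np.linalg.lstsq(B, curves.T, rcond=None)   # (n_basis, n_subjects)
coefs = coefs.T

# PCA on centred coefficients (ignoring the basis Gram matrix for simplicity)
C = coefs - coefs.mean(axis=0)
_, s, Vt = np.linalg.svd(C, full_matrices=False)
eigenfunctions = Vt @ B.T                               # functional PCs evaluated on t
scores = C @ Vt.T                                       # low-dimensional subject scores
print("Variance explained by first 3 components:", (s**2 / (s**2).sum())[:3].round(2))
```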
Failure modes and effects analysis automation
NASA Technical Reports Server (NTRS)
Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron
1988-01-01
A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single user workstations and, although not designed to replace conventional FMEA, it is expected to decrease by many man years the time required to perform the analysis.
Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background
NASA Astrophysics Data System (ADS)
McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.
2017-12-01
Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high-energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space where the string-induced CMB component has statistical properties distinct from those of the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ∼ 5 × 10⁻⁷ for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.
Comparison of cemented and uncemented fixation in total knee arthroplasty.
Brown, Thomas E; Harper, Benjamin L; Bjorgul, Kristian
2013-05-01
As a result of reading this article, physicians should be able to: 1. understand the rationale behind using uncemented fixation in total knee arthroplasty; 2. discuss the current literature comparing cemented and uncemented total knee arthroplasty; 3. describe the value of radiostereographic analysis in assessing implant stability; and 4. appreciate the limitations in the available literature advocating one mode of fixation in total knee arthroplasty. Total knee arthroplasty performed worldwide uses either cemented, cementless, or hybrid (cementless femur with a cemented tibia) fixation of the components. No recent literature review concerning the outcomes of cemented vs noncemented components has been performed. Noncemented components offer the potential advantage of a biologic interface between the bone and implants, which could demonstrate the greatest advantage in long-term durable fixation in the follow-up of young patients undergoing arthroplasty. Several advances have been made in the backing of the tibial components that have not been available long enough to yield long-term comparative follow-up studies. Short-term radiostereographic analysis studies have yielded differing results. Although long-term, high-quality studies are still needed, material advances in biologic fixation surfaces, such as trabecular metal and hydroxyapatite, may offer promising results for young and active patients undergoing total knee arthroplasty when compared with traditional cemented options. Copyright 2013, SLACK Incorporated.
Yuan, Jinbin; Chen, Yang; Liang, Jian; Wang, Chong-Zhi; Liu, Xiaofei; Yan, Zhihong; Tang, Yi; Li, Jiankang; Yuan, Chun-Su
2016-12-01
Ginseng is one of the most widely used natural medicines in the world. Recent studies have suggested that Panax ginseng has a wide range of beneficial effects on aging, central nervous system disorders, and neurodegenerative diseases. However, knowledge about the specific bioactive components of ginseng is still limited. This work aimed to screen for the bioactive components in Panax ginseng that act against neurodegenerative diseases, using the target cell-based bioactivity screening method. Firstly, component analysis of Panax ginseng extracts was performed by UPLC-QTOF-MS, and a total of 54 compounds in white ginseng were characterized and identified according to the retention behaviors, accurate MW, MS characteristics, parent nucleus, aglycones, side chains, and literature data. Then a target cell-based bioactivity screening method was developed to predict the candidate compounds in ginseng with SH-SY5Y cells. Four ginsenosides, Rg2, Rh1, Ro, and Rd, were observed to be active. The target cell-based bioactivity screening method coupled with the UPLC-QTOF-MS technique has suitable sensitivity and it can be used as a screening tool for low-content bioactive constituents in natural products. Copyright © 2016 Elsevier B.V. All rights reserved.
Initial Design and Construction of a Mobile Regenerative Fuel Cell System
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.; Maloney, Thomas; Hoberecht, Mark (Technical Monitor)
2003-01-01
The design and initial construction of a mobile regenerative power system is described. The main components of the power system consist of a photovoltaic array, a regenerative fuel cell, and an electrolyzer. The system is mounted on a modified landscape trailer and is completely self-contained. An operational analysis is also presented that shows predicted performance for the system at various times of the year. The operational analysis consists of performing an energy balance on the system based on array output and total desired operational time.
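A minimal sketch of the kind of energy balance described above: photovoltaic output over a day is compared against a continuous load, with surplus stored through the electrolyzer and drawn back through the fuel cell. The irradiance profile, array area, efficiencies, and load are illustrative assumptions, not values from this system.

```python
# Minimal sketch: daily energy balance for a PV + regenerative fuel cell system.
import numpy as np

hours = np.arange(24)
# Simple clear-sky-like irradiance profile (kW/m^2), zero at night
irradiance = np.clip(np.sin(np.pi * (hours - 6) / 12), 0, None)

array_area_m2 = 20.0
pv_efficiency = 0.15
load_kw       = 1.0        # continuous load to be supported day and night
rt_efficiency = 0.45       # round-trip electrolyzer + fuel cell efficiency (applied at charge)

pv_kw = irradiance * array_area_m2 * pv_efficiency
surplus = pv_kw - load_kw                     # >0 charges storage, <0 discharges it

stored_kwh = 0.0
for s in surplus:
    if s >= 0:
        stored_kwh += s * rt_efficiency       # store surplus as hydrogen (lossy)
    else:
        stored_kwh += s                       # draw the shortfall from the fuel cell
print(f"End-of-day storage balance: {stored_kwh:.1f} kWh "
      f"({'self-sufficient' if stored_kwh >= 0 else 'deficit'})")
```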
Total knee replacement-cementless tibial fixation with screws: 10-year results.
Ersan, Önder; Öztürk, Alper; Çatma, Mehmet Faruk; Ünlü, Serhan; Akdoğan, Mutlu; Ateş, Yalım
2017-12-01
The aim of this study was to evaluate the long-term clinical and radiological results of cementless total knee replacement. A total of 51 knees of 49 patients (33 female and 16 male; mean age: 61.6 years (range, 29-66 years)) who underwent TKR surgery with a posterior stabilized, hydroxyapatite-coated knee implant were included in this study. All of the tibial components were fixed with screws. The HSS scores were examined preoperatively and at the final follow-up. Radiological assessment was performed with the Knee Society evaluation and scoring system. Kaplan-Meier survival analysis was performed to estimate the survival of the tibial component. The mean HSS scores were 45.8 (range 38-60) and 88.1 (range 61-93), preoperatively and at the final follow-up respectively. Complete radiological assessment was performed for 48 knees. Lucent lines at the tibial component were observed in 4 patients; one of these patients underwent revision surgery due to loosening of the tibial component. The 10-year survival rate of the tibial component was 98%. Cementless total knee replacement has satisfactory long-term clinical results. Primary fixation of the tibial component with screws provides adequate stability, even in elderly patients with good bone quality. Level IV, Therapeutic study. Copyright © 2017 Turkish Association of Orthopaedics and Traumatology. Production and hosting by Elsevier B.V. All rights reserved.
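A minimal sketch of the Kaplan-Meier step described above, assuming the lifelines library; the follow-up durations and revision events are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: Kaplan-Meier survival of a tibial component (revision = event).
from lifelines import KaplanMeierFitter

# Hypothetical follow-up times in years; event=1 means revision for aseptic loosening
durations = [10, 10, 9.5, 10, 8.2, 10, 10, 7.4, 10, 6.8, 10, 10]
events    = [0,  0,  0,   0,  1,   0,  0,  0,   0,  0,   0,  0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="tibial component")
print(kmf.survival_function_)
print("Estimated 10-year survival:", float(kmf.predict(10)))
```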
2011-01-01
Background: The computer-aided identification of specific gait patterns is an important issue in the assessment of Parkinson's disease (PD). In this study, a computer vision-based gait analysis approach is developed to assist the clinical assessments of PD with kernel-based principal component analysis (KPCA). Method: Twelve PD patients and twelve healthy adults with no neurological history or motor disorders within the past six months were recruited and separated according to their "Non-PD", "Drug-On", and "Drug-Off" states. The participants were asked to wear light-colored clothing and perform three walking trials through a corridor decorated with a navy curtain at their natural pace. The participants' gait performance during the steady-state walking period was captured by a digital camera for gait analysis. The collected walking image frames were then transformed into binary silhouettes for noise reduction and compression. Using the developed KPCA-based method, the features within the binary silhouettes can be extracted to quantitatively determine the gait cycle time, stride length, walking velocity, and cadence. Results and Discussion: The KPCA-based method uses a feature-extraction approach, which was verified to be more effective than traditional image area and principal component analysis (PCA) approaches in classifying "Non-PD" controls and "Drug-Off/On" PD patients. Encouragingly, this method has a high accuracy rate, 80.51%, for recognizing different gaits. Quantitative gait parameters are obtained, and the power spectrums of the patients' gaits are analyzed. We show that the slow and irregular actions of PD patients during walking tend to transfer some of the power from the main lobe frequency to a lower frequency band. Our results indicate the feasibility of using gait performance to evaluate the motor function of patients with PD. Conclusion: This KPCA-based method requires only a digital camera and a decorated corridor setup. The ease of use and installation of the current method provides clinicians and researchers with a low-cost solution to monitor the progression of PD and its treatment. In summary, the proposed method provides an alternative way to perform gait analysis for patients with PD. PMID:22074315
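A minimal sketch of the kernel PCA feature-extraction step followed by a simple classifier, assuming scikit-learn; the silhouette matrix, kernel parameters, and classifier are illustrative stand-ins, not the study's pipeline.

```python
# Minimal sketch: kernel PCA features from flattened gait silhouettes + classification.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_frames, n_pixels = 240, 64 * 48
X = rng.random((n_frames, n_pixels))              # placeholder flattened binary silhouettes
y = rng.integers(0, 3, n_frames)                  # Non-PD / Drug-On / Drug-Off labels

kpca = KernelPCA(n_components=10, kernel="rbf", gamma=1e-3)
features = kpca.fit_transform(X)                  # nonlinear low-dimensional gait features

clf = KNeighborsClassifier(n_neighbors=5)
print("Cross-validated accuracy:", cross_val_score(clf, features, y, cv=5).mean())
```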
Han, Shengli; Huang, Jing; Cui, Ronghua; Zhang, Tao
2015-02-01
Carthamus tinctorius, used in traditional Chinese medicine, has many pharmacological effects, such as anticoagulant effects, antioxidant effects, antiaging effects, regulation of gene expression, and antitumor effects. However, there is no report on the antiallergic effects of the components in C. tinctorius. In the present study, we investigated the antiallergic components of C. tinctorius and their mechanism of action. A rat basophilic leukemia 2H3/cell membrane chromatography method coupled online with high-performance liquid chromatography and tandem mass spectrometry was developed to screen antiallergic components from C. tinctorius. The screening results showed that Hydroxysafflor yellow A, from C. tinctorius, was the target component retained on the rat basophilic leukemia 2H3/cell membrane chromatography column. We measured the amount of β-hexosaminidase and histamine released in mast cells and the key markers of degranulation. The release assays showed that Hydroxysafflor yellow A could attenuate the immunoglobulin E induced release of allergic cytokines without affecting cell viability from 1.0 to 50.0 μM. In conclusion, the established rat basophilic leukemia 2H3 cell membrane chromatography method coupled with online high-performance liquid chromatography and tandem mass spectrometry successfully screened and identified Hydroxysafflor yellow A from C. tinctorius as a potential antiallergic component. Pharmacological analysis elucidated that Hydroxysafflor yellow A is an effective natural component for inhibiting immunoglobulin E-antigen-mediated degranulation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Millán, F; Gracia, S; Sánchez-Martín, F M; Angerri, O; Rousaud, F; Villavicencio, H
2011-03-01
To evaluate a new approach to urinary stone analysis according to the combination of the components. A total of 7949 stones were analysed and their main components and combinations of components were classified according to gender and age. Statistical analysis was performed using the chi-square test. Calcium oxalate monohydrate (COM) was the most frequent component in both males (39%) and females (37.4%), followed by calcium oxalate dihydrate (COD) (28%) and uric acid (URI) (14.6%) in males and by phosphate (PHO) (22.2%) and COD (19.6%) in females (p=0.0001). In young people, COD and PHO were the most frequent components in males and females respectively (p=0.0001). In older patients, COM and URI (in that order) were the most frequent components in both genders (p=0.0001). COM is oxalate dependent and is related to diets with a high oxalate content and low water intake. The progressive increase in URI with age is related mainly to overweight and metabolic syndrome. Regarding the combinations of components, the most frequent were COM (26.3%), COD+Apatite (APA) (15.5%), URI (10%) and COM+COD (7.5%) (p=0.0001). This study reports not only the composition of stones but also the main combinations of components according to age and gender. The results prove that stone composition is related to the changes in dietary habits and life-style that occur over a lifetime, and the morphological structure of stones is indicative of the aetiopathogenic mechanisms. Copyright © 2010 AEU. Published by Elsevier Espana. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwin A. Harvego; James E. O'Brien; Michael G. McKellar
2012-11-01
Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
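A minimal sketch of a levelized hydrogen cost calculation in the spirit of an H2A-style lifecycle analysis: discounted lifecycle costs divided by discounted lifecycle production at the target rate of return. All cost inputs below are placeholders chosen only to give a result in a broadly similar range to the figure quoted above; they are not the study's actual inputs.

```python
# Minimal sketch: levelized hydrogen cost from a simple discounted cash flow.
import numpy as np

capital_cost  = 150e6          # $, hypothetical plant capital
annual_om     = 8e6            # $/yr, fixed + variable O&M (hypothetical)
annual_energy = 17e6           # $/yr, electricity + natural gas (hypothetical)
kg_per_day    = 50_000
availability  = 0.90
lifetime_yrs  = 20
discount_rate = 0.10           # target internal rate of return

annual_kg = kg_per_day * 365 * availability
years = np.arange(1, lifetime_yrs + 1)
disc = (1 + discount_rate) ** -years

# Levelized cost = discounted lifecycle cost / discounted lifecycle production
lifecycle_cost = capital_cost + np.sum((annual_om + annual_energy) * disc)
lifecycle_kg   = np.sum(annual_kg * disc)
print(f"Levelized hydrogen cost: ${lifecycle_cost / lifecycle_kg:.2f}/kg")
```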
Solar-powered unmanned aerial vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reinhardt, K.C.; Lamp, T.R.; Geis, J.W.
1996-12-31
An analysis was performed to determine the impact of various power system components and mission requirements on the size of solar-powered high altitude long endurance (HALE)-type aircraft. The HALE unmanned aerial vehicle (UAV) has good potential for use in many military and civil applications. The primary power system components considered in this study were photovoltaic (PV) modules for power generation and regenerative fuel cells for energy storage. The impact of relevant component performance on UAV size and capability was considered, including PV module efficiency and mass, power electronics efficiency, and fuel cell specific energy. Mission parameters such as time of year, flight altitude, flight latitude, and payload mass and power were also varied to determine their impact on UAV size. The aircraft analysis method used determines the required aircraft wing aspect ratio, wing area, and total mass based on maximum endurance or minimum required power calculations. The results indicate that the capacity of the energy storage system employed, fuel cells in this analysis, greatly impacts aircraft size, whereas the impact of PV module efficiency and mass is much less important. It was concluded that an energy storage specific energy (total system) of 250-500 Whr/kg is required to enable most useful missions, and that PV cells with efficiencies greater than approximately 12% are suitable for use.
New preparation method of β″-alumina and application for AMTEC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nishi, Toshiro; Tsuru, Yasuhiko; Yamamoto, Hirokazu
1995-12-31
The Alkali Metal Thermo-Electric Converter (AMTEC) is an energy conversion system that converts heat to electrical energy with high efficiency. The β″-alumina solid electrolyte (BASE) is the most important component in the AMTEC system. In this paper, the relationship among the conduction property, the microstructure, and the amount of each chemical component of BASE is studied. Based on an analysis of the chemical reaction for each component, the authors established a new BASE preparation method rather than using the conventional method. They also report the AMTEC cell performance using this electrolyte tube, on which a Mo or TiC electrode is filmed by the screen printing method. An electrochemical analysis and a heat cycle test of the AMTEC cell are also presented.
Wu, Huey-Min; Lin, Chin-Kai; Yang, Yu-Mao; Kuo, Bor-Chen
2014-11-12
Visual perception is the fundamental skill required for a child to recognize words and to read and write. No visual perception assessment tool based on Chinese characters had been developed for preschool children in Taiwan. The purposes of this study were to develop a computerized visual perception assessment tool for Chinese character structures and to explore the psychometric characteristics of the assessment tool. This study adopted purposive sampling. The study evaluated 551 kindergarten-age children (293 boys, 258 girls) ranging from 46 to 81 months of age. The test instrument used in this study consisted of three subtests and 58 items, including tests of basic strokes, single-component characters, and compound characters. Based on the results of model fit analysis, higher-order item response theory was used to estimate performance in visual perception, basic strokes, single-component characters, and compound characters simultaneously. Analyses of variance were used to detect significant differences between age groups and between gender groups. The difficulty of the items in the visual perception test ranged from -2 to 1. The visual perception ability of 4- to 6-year-old children ranged from -1.66 to 2.19. Gender did not have significant effects on performance. However, there were significant differences among the different age groups. The performance of 6-year-olds was better than that of 5-year-olds, which was better than that of 4-year-olds. This study obtained detailed diagnostic scores by using a higher-order item response theory model to understand the visual perception of basic strokes, single-component characters, and compound characters. Further statistical analysis showed that, for basic strokes and compound characters, girls performed better than boys; there also were differences within each age group. For single-component characters, there was no difference in performance between boys and girls. However, again the performance of 6-year-olds was better than that of 4-year-olds, but there were no statistical differences between the performance of 5-year-olds and 6-year-olds. The tests of basic strokes, single-component characters, and compound characters had good reliability and validity. Therefore, the tool can be applied to diagnose visual perception problems at preschool age. Copyright © 2014 Elsevier Ltd. All rights reserved.
Transient Reliability Analysis Capability Developed for CARES/Life
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
2001-01-01
The CARES/Life software developed at the NASA Glenn Research Center provides a general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. This award-winning software has been widely used by U.S. industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, and graphite) structures in a wide variety of 21st century applications. Present capabilities of the NASA CARES/Life code include probabilistic life prediction of ceramic components subjected to fast fracture, slow crack growth (stress corrosion), and cyclic fatigue failure modes. Currently, this code can compute the time-dependent reliability of ceramic structures subjected to simple time-dependent loading. For example, in slow crack growth failure conditions CARES/Life can handle sustained and linearly increasing time-dependent loads, whereas in cyclic fatigue applications various types of repetitive constant-amplitude loads can be accounted for. However, in real applications, applied loads are rarely that simple but vary with time in more complex ways, such as engine startup, shutdown, and dynamic and vibrational loads. In addition, when a given component is subjected to transient environmental and/or thermal conditions, the material properties also vary with time. A methodology has now been developed to allow the CARES/Life computer code to perform reliability analysis of ceramic components undergoing transient thermal and mechanical loading. This means that CARES/Life will be able to analyze finite element models of ceramic components that simulate dynamic engine operating conditions. The methodology developed is generalized to account for material property variation (in strength distribution and fatigue) as a function of temperature. This allows CARES/Life to analyze components undergoing rapid temperature change, in other words, components undergoing thermal shock. In addition, the capability has been developed to perform reliability analysis for components that undergo proof testing involving transient loads. This methodology was developed for environmentally assisted crack growth (crack growth as a function of time and loading), but it will be extended to account for cyclic fatigue (crack growth as a function of load cycles) as well.
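For context, the fast-fracture statistic underlying this kind of probabilistic ceramic design is commonly a Weibull failure probability. The sketch below shows the basic two-parameter form for a uniformly stressed component; it is not the CARES/Life implementation, and the Weibull modulus and characteristic strength are illustrative values.

```python
# Minimal sketch: two-parameter Weibull failure probability for a ceramic component.
import numpy as np

def failure_probability(stress_mpa, weibull_modulus, char_strength_mpa):
    """P_f = 1 - exp[-(sigma / sigma_0)^m] for a uniform uniaxial stress state."""
    return 1.0 - np.exp(-(stress_mpa / char_strength_mpa) ** weibull_modulus)

m, sigma0 = 10.0, 400.0          # Weibull modulus and characteristic strength (MPa)
for stress in (200, 300, 350):
    print(f"stress = {stress} MPa -> P_f = {failure_probability(stress, m, sigma0):.4f}")
```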
Gaudreault, Nathaly; Mezghani, Neila; Turcot, Katia; Hagemeister, Nicola; Boivin, Karine; de Guise, Jacques A
2011-03-01
Interpreting gait data is challenging due to intersubject variability observed in the gait pattern of both normal and pathological populations. The objective of this study was to investigate the impact of using principal component analysis for grouping knee osteoarthritis (OA) patients' gait data in more homogeneous groups when studying the effect of a physiotherapy treatment. Three-dimensional (3D) knee kinematic and kinetic data were recorded during the gait of 29 participants diagnosed with knee OA before and after they received 12 weeks of physiotherapy treatment. Principal component analysis was applied to extract groups of knee flexion/extension, adduction/abduction and internal/external rotation angle and moment data. The treatment's effect on parameters of interest was assessed using paired t-tests performed before and after grouping the knee kinematic data. Increased quadriceps and hamstring strength was observed following treatment (P<0.05). Except for the knee flexion/extension angle, two different groups (G1 and G2) were extracted from the angle and moment data. When pre- and post-treatment analyses were performed considering the groups, participants exhibiting a G2 knee moment pattern demonstrated a greater first peak flexion moment, lower adduction moment impulse and smaller rotation angle range post-treatment (P<0.05). When pre- and post-treatment comparisons were performed without grouping, the data showed no treatment effect. The results of the present study suggest that the effect of physiotherapy on gait mechanics of knee osteoarthritis patients may be masked or underestimated if kinematic data are not separated into more homogeneous groups when performing pre- and post-treatment comparisons. Copyright © 2010 Elsevier Ltd. All rights reserved.
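A minimal sketch of the grouping-then-testing workflow described above: extract principal component scores from gait waveforms, split participants by their dominant pattern, then run paired pre/post comparisons within each group. The waveform data and the crude split on the first component are illustrative assumptions.

```python
# Minimal sketch: PCA grouping of gait waveforms followed by paired pre/post t-tests.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(11)
n_subjects, n_frames = 29, 101                       # gait cycle normalized to 101 points
pre  = rng.normal(size=(n_subjects, n_frames))       # e.g. knee adduction moment, pre-treatment
post = pre + rng.normal(0.1, 0.2, size=(n_subjects, n_frames))

pca = PCA(n_components=2).fit(pre)
scores = pca.transform(pre)
groups = scores[:, 0] > 0                            # crude split on the first PC score

for g in (False, True):
    idx = groups == g
    t, p = stats.ttest_rel(pre[idx].max(axis=1), post[idx].max(axis=1))
    print(f"Group {int(g) + 1}: n = {idx.sum()}, paired t = {t:.2f}, p = {p:.3f}")
```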
NASA Astrophysics Data System (ADS)
Koch, Julian; Cüneyd Demirel, Mehmet; Stisen, Simon
2018-05-01
The process of model evaluation is not only an integral part of model development and calibration but also of paramount importance when communicating modelling results to the scientific community and stakeholders. The modelling community has a large and well-tested toolbox of metrics to evaluate temporal model performance. In contrast, spatial performance evaluation has not kept pace with the wide availability of spatial observations or with the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study makes a contribution towards advancing spatial-pattern-oriented model calibration by rigorously testing a multiple-component performance metric. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation and histogram overlap. This multiple-component approach is found to be advantageous in order to achieve the complex task of comparing spatial patterns. SPAEF, its three components individually and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are applied in a spatial-pattern-oriented model calibration of a catchment model in Denmark. Results suggest the importance of multiple-component metrics because stand-alone metrics tend to fail to provide holistic pattern information. The three SPAEF components are found to be independent, which allows them to complement each other in a meaningful way. In order to optimally exploit spatial observations made available by remote sensing platforms, this study suggests applying bias-insensitive metrics which further allow for a comparison of variables which are related but may differ in unit. This study applies SPAEF in the hydrological context using the mesoscale Hydrologic Model (mHM; version 5.8), but we see great potential across disciplines related to spatially distributed earth system modelling.
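The sketch below shows one reading of how the three SPAEF components (pattern correlation, ratio of coefficients of variation, and histogram overlap of z-scored fields) can be combined into a single score; it is a hedged interpretation of the published definition and should be verified against the original paper before use, and the observed/simulated fields are synthetic placeholders.

```python
# Minimal sketch of a SPAEF-style spatial pattern metric (hedged reading of the definition).
import numpy as np

def spaef(obs, sim, bins=100):
    obs, sim = obs.ravel(), sim.ravel()
    alpha = np.corrcoef(obs, sim)[0, 1]                           # pattern correlation
    beta = (sim.std() / sim.mean()) / (obs.std() / obs.mean())    # ratio of coefficients of variation
    z_obs = (obs - obs.mean()) / obs.std()                        # z-scoring makes the overlap unit-free
    z_sim = (sim - sim.mean()) / sim.std()
    lo, hi = min(z_obs.min(), z_sim.min()), max(z_obs.max(), z_sim.max())
    h_obs, _ = np.histogram(z_obs, bins=bins, range=(lo, hi))
    h_sim, _ = np.histogram(z_sim, bins=bins, range=(lo, hi))
    gamma = np.minimum(h_obs, h_sim).sum() / h_obs.sum()          # histogram overlap
    return 1.0 - np.sqrt((alpha - 1) ** 2 + (beta - 1) ** 2 + (gamma - 1) ** 2)

rng = np.random.default_rng(0)
observed = rng.gamma(2.0, 1.5, size=(100, 100))                   # e.g. a remotely sensed pattern
simulated = observed + rng.normal(0, 0.5, size=observed.shape)
print("SPAEF =", round(spaef(observed, simulated), 3))
```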
Tipping point analysis of atmospheric oxygen concentration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livina, V. N.; Forbes, A. B.; Vaz Martins, T. M.
2015-03-15
We apply tipping point analysis to nine observational oxygen concentration records around the globe, analyse their dynamics and perform projections under possible future scenarios, leading to oxygen deficiency in the atmosphere. The analysis is based on statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the observed data using Bayesian and wavelet techniques.
Madeo, Andrea; Piras, Paolo; Re, Federica; Gabriele, Stefano; Nardinocchi, Paola; Teresi, Luciano; Torromeo, Concetta; Chialastri, Claudia; Schiariti, Michele; Giura, Geltrude; Evangelista, Antonietta; Dominici, Tania; Varano, Valerio; Zachara, Elisabetta; Puddu, Paolo Emilio
2015-01-01
The assessment of left ventricular shape changes during cardiac revolution may be a new step in clinical cardiology to ease early diagnosis and treatment. To quantify these changes, only point registration was adopted and neither Generalized Procrustes Analysis nor Principal Component Analysis were applied as we did previously to study a group of healthy subjects. Here, we extend to patients affected by hypertrophic cardiomyopathy the original approach and preliminarily include genotype positive/phenotype negative individuals to explore the potential that incumbent pathology might also be detected. Using 3D Speckle Tracking Echocardiography, we recorded left ventricular shape of 48 healthy subjects, 24 patients affected by hypertrophic cardiomyopathy and 3 genotype positive/phenotype negative individuals. We then applied Generalized Procrustes Analysis and Principal Component Analysis and inter-individual differences were cleaned by Parallel Transport performed on the tangent space, along the horizontal geodesic, between the per-subject consensuses and the grand mean. Endocardial and epicardial layers were evaluated separately, different from many echocardiographic applications. Under a common Principal Component Analysis, we then evaluated left ventricle morphological changes (at both layers) explained by first Principal Component scores. Trajectories' shape and orientation were investigated and contrasted. Logistic regression and Receiver Operating Characteristic curves were used to compare these morphometric indicators with traditional 3D Speckle Tracking Echocardiography global parameters. Geometric morphometrics indicators performed better than 3D Speckle Tracking Echocardiography global parameters in recognizing pathology both in systole and diastole. Genotype positive/phenotype negative individuals clustered with patients affected by hypertrophic cardiomyopathy during diastole, suggesting that incumbent pathology may indeed be foreseen by these methods. Left ventricle deformation in patients affected by hypertrophic cardiomyopathy compared to healthy subjects may be assessed by modern shape analysis better than by traditional 3D Speckle Tracking Echocardiography global parameters. Hypertrophic cardiomyopathy pathophysiology was unveiled in a new manner whereby also diastolic phase abnormalities are evident, which are more difficult to investigate by traditional echocardiographic techniques. PMID:25875818
Reliability and Productivity Modeling for the Optimization of Separated Spacecraft Interferometers
NASA Technical Reports Server (NTRS)
Kenny, Sean (Technical Monitor); Wertz, Julie
2002-01-01
As technological systems grow in capability, they also grow in complexity. Due to this complexity, it is no longer possible for a designer to use engineering judgement to identify the components that have the largest impact on system life cycle metrics, such as reliability, productivity, cost, and cost effectiveness. One way of identifying these key components is to build quantitative models and analysis tools that can be used to aid the designer in making high level architecture decisions. Once these key components have been identified, two main approaches to improving a system using these components exist: add redundancy or improve the reliability of the component. In reality, the most effective approach for almost any system will be some combination of these two approaches, in varying orders of magnitude for each component. Therefore, this research tries to answer the question of how to divide funds between adding redundancy and improving the reliability of components so as to most cost-effectively improve the life cycle metrics of a system. While this question is relevant to any complex system, this research focuses on one type of system in particular: Separated Spacecraft Interferometers (SSI). Quantitative models are developed to analyze the key life cycle metrics of different SSI system architectures. Next, tools are developed to compare a given set of architectures in terms of total performance, by coupling different life cycle metrics together into one performance metric. Optimization tools, such as simulated annealing and genetic algorithms, are then used to search the entire design space to find the "optimal" architecture design. Sensitivity analysis tools have been developed to determine how sensitive the results of these analyses are to uncertain user-defined parameters. Finally, several possibilities for future work in this area of research are presented.
Chang, Hing-Chiu; Bilgin, Ali; Bernstein, Adam; Trouard, Theodore P.
2018-01-01
Over the past several years, significant efforts have been made to improve the spatial resolution of diffusion-weighted imaging (DWI), aiming at better detecting subtle lesions and more reliably resolving white-matter fiber tracts. A major concern with high-resolution DWI is the limited signal-to-noise ratio (SNR), which may significantly offset the advantages of high spatial resolution. Although the SNR of DWI data can be improved by denoising in post-processing, existing denoising procedures may potentially reduce the anatomic resolvability of high-resolution imaging data. Additionally, non-Gaussian noise induced signal bias in low-SNR DWI data may not always be corrected with existing denoising approaches. Here we report an improved denoising procedure, termed diffusion-matched principal component analysis (DM-PCA), which comprises 1) identifying a group of (not necessarily neighboring) voxels that demonstrate very similar magnitude signal variation patterns along the diffusion dimension, 2) correcting low-frequency phase variations in complex-valued DWI data, 3) performing PCA along the diffusion dimension for real- and imaginary-components (in two separate channels) of phase-corrected DWI voxels with matched diffusion properties, 4) suppressing the noisy PCA components in real- and imaginary-components, separately, of phase-corrected DWI data, and 5) combining real- and imaginary-components of denoised DWI data. Our data show that the new two-channel (i.e., for real- and imaginary-components) DM-PCA denoising procedure performs reliably without noticeably compromising anatomic resolvability. Non-Gaussian noise induced signal bias could also be reduced with the new denoising method. The DM-PCA based denoising procedure should prove highly valuable for high-resolution DWI studies in research and clinical uses. PMID:29694400
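A minimal, magnitude-only sketch of the core idea above: run PCA along the diffusion dimension over a group of voxels with matched signal behaviour and keep only the components above the noise floor. It is not the full two-channel (complex-valued, phase-corrected) DM-PCA procedure, and the data, group size, and retained component count are illustrative assumptions.

```python
# Minimal sketch: PCA denoising along the diffusion dimension for matched voxels.
import numpy as np

rng = np.random.default_rng(5)
n_voxels, n_dwi = 500, 60
# Voxels sharing a few underlying diffusion signal patterns, plus noise
patterns = rng.random((3, n_dwi))
weights = rng.random((n_voxels, 3))
data = weights @ patterns + 0.05 * rng.standard_normal((n_voxels, n_dwi))

mean = data.mean(axis=0)
U, s, Vt = np.linalg.svd(data - mean, full_matrices=False)

k = 3                                            # keep components above the noise floor
denoised = mean + (U[:, :k] * s[:k]) @ Vt[:k]    # suppress the noisy components
residual_rms = np.sqrt(np.mean((denoised - data) ** 2))
print("Residual RMS (suppressed noise):", round(residual_rms, 4))
```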
Effects of complex aural stimuli on mental performance.
Vij, Mohit; Aghazadeh, Fereydoun; Ray, Thomas G; Hatipkarasulu, Selen
2003-06-01
The objective of this study is to investigate the effect of complex aural stimuli on mental performance. A series of experiments was designed to obtain data for two different analyses. The first analysis is a "Stimulus" versus "No-stimulus" comparison for each of the four dependent variables, i.e. the quantitative ability, reasoning ability, spatial ability and memory of an individual, by comparing the control treatment with the rest of the treatments. The second analysis is a multivariate analysis of variance for component-level main effects and interactions. The two component factors are the tempo of the complex aural stimuli and the sound volume level, each administered at three discrete levels for all four dependent variables. Ten experiments were conducted on eleven subjects. It was found that complex aural stimuli influence the quantitative and spatial aspects of the mind, while reasoning ability was unaffected by the stimuli. Although memory showed a trend to be worse in the presence of complex aural stimuli, the effect was statistically insignificant. Variation in the tempo and sound volume level of an aural stimulus did not significantly affect the mental performance of an individual. The results of these experiments can be effectively used in designing work environments.
Structural reliability methods: Code development status
NASA Astrophysics Data System (ADS)
Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.
1991-05-01
The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
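For context, the quantity that a fast probability integration module approximates efficiently can be illustrated with a brute-force Monte Carlo estimate of a component failure probability from a limit-state function. This is a generic sketch, not the NESSUS/FPI algorithm, and the load and strength distributions are illustrative assumptions.

```python
# Minimal sketch: Monte Carlo estimate of a failure probability from a limit state g(X) < 0.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
strength = rng.normal(500.0, 40.0, n)                         # resistance, e.g. MPa
load = rng.lognormal(mean=np.log(350.0), sigma=0.15, size=n)  # uncertain applied stress

g = strength - load                                           # failure when g < 0
p_f = np.mean(g < 0)
print(f"Estimated probability of failure: {p_f:.2e}")
```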
Visual Computing Environment Workshop
NASA Technical Reports Server (NTRS)
Lawrence, Charles (Compiler)
1998-01-01
The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.
Failure mode analysis to predict product reliability.
NASA Technical Reports Server (NTRS)
Zemanick, P. P.
1972-01-01
The failure mode analysis (FMA) is described as a design tool to predict and improve product reliability. The objectives of the failure mode analysis are presented as they influence component design, configuration selection, the product test program, the quality assurance plan, and engineering analysis priorities. The detailed mechanics of performing a failure mode analysis are discussed, including one suggested format. Some practical difficulties of implementation are indicated, drawn from experience with preparing FMAs on the nuclear rocket engine program.
Modal Analysis of Space-rocket Equipment Components
NASA Astrophysics Data System (ADS)
Igolkin, A. A.; Safin, A. I.; Prokofiev, A. B.
2018-01-01
In order to prevent vibration damage, an analysis of the natural frequencies and mode shapes of elements of rocket and space technology must be performed. This paper discusses a modal analysis technique using the example of a carrier platform. Modal analysis was performed using mathematical modeling and a laser vibrometer. The experimental data were refined using the Test.Lab software. As a result of the modal analysis, the amplitude-frequency response of the carrier platform was obtained and the elasticity parameters were clarified.
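A minimal sketch of the computation behind such a modal analysis: natural frequencies and mode shapes follow from the generalized eigenvalue problem K.phi = w^2 M.phi. The 3-degree-of-freedom spring-mass chain below is an illustrative stand-in for a finite element model of the platform.

```python
# Minimal sketch: natural frequencies and mode shapes from mass and stiffness matrices.
import numpy as np
from scipy.linalg import eigh

k = 1.0e6                                    # spring stiffness, N/m (illustrative)
m = 2.0                                      # lumped mass, kg (illustrative)
K = k * np.array([[ 2, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]], dtype=float)
M = m * np.eye(3)

eigvals, modes = eigh(K, M)                  # generalized eigenproblem, ascending w^2
freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
print("Natural frequencies (Hz):", freqs_hz.round(1))
print("First mode shape:", (modes[:, 0] / np.abs(modes[:, 0]).max()).round(2))
```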
Principal component analysis and the locus of the Fréchet mean in the space of phylogenetic trees.
Nye, Tom M W; Tang, Xiaoxian; Weyenberg, Grady; Yoshida, Ruriko
2017-12-01
Evolutionary relationships are represented by phylogenetic trees, and a phylogenetic analysis of gene sequences typically produces a collection of these trees, one for each gene in the analysis. Analysis of samples of trees is difficult due to the multi-dimensionality of the space of possible trees. In Euclidean spaces, principal component analysis is a popular method of reducing high-dimensional data to a low-dimensional representation that preserves much of the sample's structure. However, the space of all phylogenetic trees on a fixed set of species does not form a Euclidean vector space, and methods adapted to tree space are needed. Previous work introduced the notion of a principal geodesic in this space, analogous to the first principal component. Here we propose a geometric object for tree space similar to the kth principal component in Euclidean space: the locus of the weighted Fréchet mean of k+1 vertex trees when the weights vary over the k-simplex. We establish some basic properties of these objects, in particular showing that they have dimension k, and propose algorithms for projection onto these surfaces and for finding the principal locus associated with a sample of trees. Simulation studies demonstrate that these algorithms perform well, and analyses of two datasets, containing Apicomplexa and African coelacanth genomes respectively, reveal important structure from the second principal components.
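To make the construction concrete, the block below states the standard definition of a weighted Fréchet mean in a metric space, which is the object whose locus is traced out as the weights vary; this is the textbook definition, not a formula quoted from the paper.

```latex
% Weighted Frechet mean of vertex trees x_1, ..., x_{k+1} in tree space (T, d),
% for weights w in the k-simplex; the proposed principal locus is its image as w varies.
\[
  \mu(w) \;=\; \operatorname*{arg\,min}_{x \in T} \; \sum_{i=1}^{k+1} w_i \, d(x, x_i)^2,
  \qquad w_i \ge 0, \quad \sum_{i=1}^{k+1} w_i = 1 .
\]
```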
Bowers, Andrew; Saltuklaroglu, Tim; Harkrider, Ashley; Cuellar, Megan
2013-01-01
Background: Constructivist theories propose that articulatory hypotheses about incoming phonetic targets may function to enhance perception by limiting the possibilities for sensory analysis. To provide evidence for this proposal, it is necessary to map ongoing, high-temporal resolution changes in sensorimotor activity (i.e., the sensorimotor μ rhythm) to accurate speech and non-speech discrimination performance (i.e., correct trials). Methods: Sixteen participants (15 female and 1 male) were asked to passively listen to or actively identify speech and tone-sweeps in a two-alternative forced-choice discrimination task while the electroencephalograph (EEG) was recorded from 32 channels. The stimuli were presented at signal-to-noise ratios (SNRs) in which discrimination accuracy was high (i.e., 80–100%) and low SNRs producing discrimination performance at chance. EEG data were decomposed using independent component analysis and clustered across participants using principal component methods in EEGLAB. Results: ICA revealed left and right sensorimotor µ components for 14/16 and 13/16 participants respectively that were identified on the basis of scalp topography, spectral peaks, and localization to the precentral and postcentral gyri. Time-frequency analysis of left and right lateralized µ component clusters revealed significant (pFDR<.05) suppression in the traditional beta frequency range (13–30 Hz) prior to, during, and following syllable discrimination trials. No significant differences from baseline were found for passive tasks. Tone conditions produced right µ beta suppression following stimulus onset only. For the left µ, significant differences in the magnitude of beta suppression were found for correct speech discrimination trials relative to chance trials following stimulus offset. Conclusions: Findings are consistent with constructivist, internal model theories proposing that early forward motor models generate predictions about likely phonemic units that are then synthesized with incoming sensory cues during active as opposed to passive processing. Future directions and possible translational value for clinical populations in which sensorimotor integration may play a functional role are discussed. PMID:23991030
Andersen, Mikkel R; Winther, Nikkolaj S; Lind, Thomas; Schrøder, Henrik M; Flivik, Gunnar; Petersen, Michael M
2017-07-01
The fixation of uncemented tibia components in total knee arthroplasty may rely on the bone quality of the tibia; however, no previous studies have shown convincing objective proof of this. Component migration is relevant as it has been shown to predict aseptic loosening. We performed 2-year follow-up of 92 patients who underwent total knee arthroplasty surgery with an uncemented tibia component. Bone mineral density (BMD; g/cm²) of the tibia host bone was measured preoperatively using dual energy X-ray absorptiometry. The proximal tibia was divided into 2 regions of interest (ROI) in the part of the tibia bone where the components were implanted. Radiostereometric analysis was performed postoperatively and after 3, 6, 12, and 24 months. The primary outcome was maximum total point motion (MTPM; mm). Regression analysis was performed to evaluate the relation between preoperative BMD and MTPM. We found low preoperative BMD in ROI1 to be significantly related to high MTPM at all follow-ups: after 3 months (R² = 20%, P_BMD = 0.017), 6 months (R² = 29%, P_BMD = 0.003), 12 months (R² = 33%, P_BMD = 0.001), and 24 months (R² = 27%, P_BMD = 0.001). We also found a significant relation for low BMD in ROI2 and high MTPM: 3 months (R² = 19%, P_BMD = 0.042), 6 months (R² = 28%, P_BMD = 0.04), 12 months (R² = 32%, P_BMD = 0.004), and 24 months (R² = 24%, P_BMD = 0.005). Low preoperative BMD in the tibia is related to high MTPM. Thus, high migration of uncemented tibia components is to be expected in patients with poor bone quality. Copyright © 2017 Elsevier Inc. All rights reserved.
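A minimal sketch of the regression step reported above, relating preoperative BMD to MTPM with scipy.stats.linregress; the BMD and MTPM values below are invented placeholders for illustration, not study data.

```python
# Minimal sketch of the reported analysis step: simple linear regression of
# implant migration (MTPM, mm) on preoperative BMD (g/cm^2). Values below are
# fabricated placeholders, not study data.
import numpy as np
from scipy import stats

bmd = np.array([0.62, 0.71, 0.80, 0.88, 0.95, 1.02, 1.10, 1.21])   # g/cm^2
mtpm = np.array([1.9, 1.6, 1.4, 1.2, 1.1, 0.9, 0.8, 0.6])          # mm

result = stats.linregress(bmd, mtpm)
r_squared = result.rvalue ** 2
print(f"slope = {result.slope:.2f} mm per g/cm^2, "
      f"R^2 = {r_squared:.2f}, p = {result.pvalue:.4f}")
# A negative slope corresponds to the paper's direction of effect: lower
# preoperative BMD associated with higher MTPM (greater migration).
```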
Low-Level Analytical Methodology Updates to Support Decontaminant Performance Evaluations
2011-06-01
from EPDM and tire rubber coupon materials that were spiked with a known amount of the chemical agent VX, treated with bleach decontaminant, and...to evaluate the performance of bleach decontaminant on EPDM and tire rubber coupons. Dose-confirmation or Tool samples were collected by delivering...components
• An aging or damaged analytical column
• Dirty detector
• Other factors related to general instrument and/or sample analysis performance
Socaci, Sonia A; Socaciu, Carmen; Tofană, Maria; Raţi, Ioan V; Pintea, Adela
2013-01-01
The health benefits of sea buckthorn (Hippophae rhamnoides L.) are well documented due to its rich content in bioactive phytochemicals (pigments, phenolics and vitamins) as well as volatiles responsible for specific flavours and bacteriostatic action. The volatile compounds are good biomarkers of berry freshness, quality and authenticity. The objective was to develop a fast and efficient GC-MS method, including a minimal sample preparation technique (in-tube extraction, ITEX), for the discrimination of sea buckthorn varieties based on their chromatographic volatile fingerprint. Twelve sea buckthorn varieties (wild and cultivated) were collected from forestry departments and experimental fields, respectively. The extraction of volatile compounds was performed using the ITEX technique, whereas separation and identification were performed using a GC-MS QP-2010. Principal component analysis (PCA) was applied to discriminate among samples on the basis of differences in composition. Using GC-MS analysis, 46 volatile compounds were separated from the headspace of the sea buckthorn samples, of which 43 were identified. The most abundant derivatives were ethyl esters of 2-methylbutanoic acid, 3-methylbutanoic acid, hexanoic acid, octanoic acid and butanoic acid, as well as 3-methylbutyl 3-methylbutanoate, 3-methylbutyl 2-methylbutanoate and benzoic acid ethyl ester (over 80% of all volatile compounds). Principal component analysis showed that the first two components explain 79% of the data variance, demonstrating good discrimination between samples. A reliable, fast and eco-friendly ITEX/GC-MS method was applied to fingerprint the volatile profile and to discriminate between wild and cultivated sea buckthorn berries originating from the Carpathians, with relevance to food science and technology. Copyright © 2013 John Wiley & Sons, Ltd.
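The PCA step can be illustrated with a short sketch: autoscale a samples-by-volatiles peak-area matrix and report the variance captured by the first two components. The matrix dimensions (12 varieties × 46 volatiles) follow the abstract, but the values are random placeholders and the preprocessing is an assumption rather than the authors' exact workflow.

```python
# Minimal sketch (not the authors' exact workflow): PCA on a samples-by-volatiles
# peak-area matrix and inspection of the variance explained by the first two
# components, the quantity the abstract reports as 79%.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
peak_areas = rng.random((12, 46))                 # placeholder GC-MS peak areas

X = StandardScaler().fit_transform(peak_areas)    # autoscale each volatile
pca = PCA(n_components=2)
scores = pca.fit_transform(X)                     # PC1/PC2 coordinates per sample

print("variance explained by PC1 + PC2:",
      f"{pca.explained_variance_ratio_.sum():.1%}")
# Plotting the score coordinates (PC1 vs PC2) and colouring points by variety is
# what would reveal the wild-vs-cultivated separation described in the abstract.
```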
Guide for Oxygen Component Qualification Tests
NASA Technical Reports Server (NTRS)
Bamford, Larry J.; Rucker, Michelle A.; Dobbin, Douglas
1996-01-01
Oxygen is a chemically stable element: it is not shock sensitive, will not decompose, and is not flammable. Nevertheless, oxygen use carries a risk that should never be overlooked, because oxygen is a strong oxidizer that vigorously supports combustion. Safety is therefore of primary concern in oxygen service. To promote safety in oxygen systems, the flammability of the materials used in them should be analyzed. At the NASA White Sands Test Facility (WSTF), we have performed configurational tests of components specifically engineered for oxygen service. These tests follow a detailed WSTF oxygen hazards analysis. The stated objective of the tests was to provide performance test data for customer use as part of a qualification plan for a particular component, in a particular configuration, and under worst-case conditions. In this document, the 'Guide for Oxygen Component Qualification Tests', we outline recommended test systems and the cleaning, handling, and test procedures that address worst-case conditions. The test results apply specifically to manual valves, remotely operated valves, check valves, relief valves, filters, regulators, flexible hoses, and intensifiers; component systems are not covered.
NASA Technical Reports Server (NTRS)
Ling, Lisa
2014-01-01
For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
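As a toy illustration of the kind of coupling SPEAD performs (and emphatically not the SPEAD tool itself), the sketch below propagates a simple ballistic entry through an exponential atmosphere, accumulates stagnation-point heating into a lumped thermal mass, and flags "demise" when an assumed failure temperature is exceeded; every constant is illustrative.

```python
# Toy illustration (not SPEAD): couple a simple ballistic entry trajectory with
# a lumped-mass heating model and flag "demise" when an assumed failure
# temperature is exceeded. All vehicle and material constants are invented.
import math

rho0, Hs = 1.225, 7200.0          # sea-level density (kg/m^3), scale height (m)
m, A, cd = 50.0, 0.3, 1.0         # component mass (kg), area (m^2), drag coefficient
rn = 0.2                          # effective nose radius (m)
cp, T_fail = 900.0, 1700.0        # heat capacity (J/kg/K), assumed failure temperature (K)
g = 9.81
gamma = math.radians(-10.0)       # assumed constant flight-path angle (descending)

h, v, T, t, dt = 120e3, 7500.0, 300.0, 0.0, 0.1   # altitude (m), speed (m/s), temp (K), time, step (s)

while h > 0.0:
    rho = rho0 * math.exp(-h / Hs)
    drag = 0.5 * rho * v**2 * cd * A / m                 # deceleration (m/s^2)
    q_dot = 1.7415e-4 * math.sqrt(rho / rn) * v**3       # Sutton-Graves-type heating (W/m^2)
    T += q_dot * A * 0.1 / (m * cp) * dt                 # crude lumped heat-up (assumed 10% hot area)
    if T >= T_fail:
        print(f"predicted demise at t = {t:.1f} s, altitude = {h/1000:.1f} km")
        break
    v += (-drag - g * math.sin(gamma)) * dt              # drag decelerates, gravity adds speed on descent
    h += v * math.sin(gamma) * dt
    t += dt
else:
    print("component survives to the ground in this toy model")
```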
Intrinsic Resting-State Functional Connectivity in the Human Spinal Cord at 3.0 T.
San Emeterio Nateras, Oscar; Yu, Fang; Muir, Eric R; Bazan, Carlos; Franklin, Crystal G; Li, Wei; Li, Jinqi; Lancaster, Jack L; Duong, Timothy Q
2016-04-01
To apply resting-state functional magnetic resonance (MR) imaging to map functional connectivity of the human spinal cord. Studies were performed in nine self-declared healthy volunteers with informed consent and institutional review board approval. Resting-state functional MR imaging was performed to map functional connectivity of the human cervical spinal cord from C1 to C4 at 1 × 1 × 3-mm resolution with a 3.0-T clinical MR imaging unit. Independent component analysis (ICA) was performed to derive resting-state functional MR imaging z-score maps rendered on two-dimensional and three-dimensional images. Seed-based analysis was performed for cross validation with ICA networks by using Pearson correlation. Reproducibility analysis of resting-state functional MR imaging maps from four repeated trials in a single participant yielded a mean z score of 6 ± 1 (P < .0001). The centroid coordinates across the four trials deviated by 2 in-plane voxels ± 2 mm (standard deviation) and up to one adjacent image section ± 3 mm. ICA of group resting-state functional MR imaging data revealed prominent functional connectivity patterns within the spinal cord gray matter. There were statistically significant (z score > 3, P < .001) bilateral, unilateral, and intersegmental correlations in the ventral horns, dorsal horns, and central spinal cord gray matter. Three-dimensional surface rendering provided visualization of these components along the length of the spinal cord. Seed-based analysis showed that many ICA components exhibited strong and significant (P < .05) correlations, corroborating the ICA results. Resting-state functional MR imaging connectivity networks are qualitatively consistent with known neuroanatomic and functional structures in the spinal cord. Resting-state functional MR imaging of the human cervical spinal cord with a 3.0-T clinical MR imaging unit and standard MR imaging protocols and hardware reveals prominent functional connectivity patterns within the spinal cord gray matter, consistent with known functional and anatomic layouts of the spinal cord.
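The seed-based cross-validation step can be sketched as follows: correlate a chosen seed voxel's resting-state time course with every other voxel's time course and Fisher-transform the resulting map. The data shape, seed location, and threshold below are assumptions, and real spinal-cord data would first be motion-corrected, detrended, and masked.

```python
# Minimal sketch of a seed-based connectivity map (Pearson r per voxel),
# the approach used above to corroborate the ICA components.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_timepoints, n_voxels = 200, 500
bold = rng.standard_normal((n_timepoints, n_voxels))   # placeholder BOLD series

seed_idx = 42                      # hypothetical seed voxel (e.g., in a ventral horn)
seed = bold[:, seed_idx]

r_map = np.array([stats.pearsonr(seed, bold[:, v])[0] for v in range(n_voxels)])
# Fisher z-transform before any group statistics; clip to avoid infinity at the seed itself.
z_map = np.arctanh(np.clip(r_map, -0.999999, 0.999999))

print("voxels with |z| > 0.3:", int(np.sum(np.abs(z_map) > 0.3)))
```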
Cao, Yan; Wang, Shaozhan; Li, Yinghua; Chen, Xiaofei; Chen, Langdong; Wang, Dongyao; Zhu, Zhenyu; Yuan, Yongfang; Lv, Diya
2018-03-09
Cell membrane chromatography (CMC) has been successfully applied to screen bioactive compounds from Chinese herbs for many years, and some offline and online two-dimensional (2D) CMC-high performance liquid chromatography (HPLC) hyphenated systems have been established to perform screening assays. However, the requirement of sample preparation steps for the second-dimension analysis in offline systems and the need for an interface device and technical expertise in the online system limit their extensive use. In the present study, an offline 2D CMC-HPLC analysis combined with the XCMS (various forms of chromatography coupled to mass spectrometry) Online statistical tool for data processing was established. First, our previously reported online 2D screening system was used to analyze three Chinese herbs that were reported to have potential anti-inflammatory effects, and two binding components were identified. By contrast, the proposed offline 2D screening method with XCMS Online analysis was applied, and three more ingredients were discovered in addition to the two compounds revealed by the online system. Then, cross-validation of the three compounds was performed, and they were confirmed to be included in the online data as well, but were not identified there because of their low concentrations and the lack of credible statistical approaches. Finally, pharmacological experiments showed that these five ingredients could inhibit IL-6 release and IL-6 gene expression in LPS-induced RAW cells in a dose-dependent manner. Compared with previous 2D CMC screening systems, this newly developed offline 2D method needs no sample preparation steps for the second-dimension analysis, and it is sensitive, efficient, and convenient. It will be applicable to identifying active components from Chinese herbs and practical for the discovery of lead compounds derived from herbs. Copyright © 2018 Elsevier B.V. All rights reserved.
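A generic illustration of the cross-validation idea (matching offline hits against online features by m/z and retention time within tolerances) is sketched below; this is not XCMS Online itself, and all feature values and tolerances are invented placeholders.

```python
# Generic illustration of cross-validating offline hits against online features:
# a compound is "confirmed" if an online feature matches within assumed m/z and
# retention-time tolerances. All values below are invented placeholders.
offline_hits = [
    {"name": "compound A", "mz": 417.23, "rt": 12.4},
    {"name": "compound B", "mz": 285.08, "rt": 18.9},
    {"name": "compound C", "mz": 609.18, "rt": 23.1},
]
online_features = [
    {"mz": 417.24, "rt": 12.5, "intensity": 3.2e4},
    {"mz": 609.17, "rt": 23.0, "intensity": 1.1e4},
]

MZ_TOL, RT_TOL = 0.01, 0.3   # assumed m/z (Da) and retention-time (min) tolerances

for hit in offline_hits:
    matches = [f for f in online_features
               if abs(f["mz"] - hit["mz"]) <= MZ_TOL
               and abs(f["rt"] - hit["rt"]) <= RT_TOL]
    status = "also present in online data" if matches else "offline only"
    print(f"{hit['name']}: {status}")
```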
The N2-P3 complex of the evoked potential and human performance
NASA Technical Reports Server (NTRS)
Odonnell, Brian F.; Cohen, Ronald A.
1988-01-01
The N2-P3 complex and other endogenous components of the human evoked potential provide a set of tools for the investigation of human perceptual and cognitive processes. These multidimensional measures of central nervous system bioelectrical activity respond to a variety of environmental and internal factors which have been experimentally characterized. Their application to the analysis of human performance in naturalistic task environments is just beginning. Converging evidence suggests that the N2-P3 complex reflects processes of stimulus evaluation, perceptual resource allocation, and decision making that proceed in parallel, rather than in series, with response generation. Utilization of these EP components may provide insights into the central nervous system mechanisms modulating task performance that are unavailable from behavioral measures alone. The sensitivity of the N2-P3 complex to neuropathology, psychopathology, and pharmacological manipulation suggests that these components might provide sensitive markers for the effects of environmental stressors on the human central nervous system.