Signal Frequency Spectra with Audacity®
ERIC Educational Resources Information Center
Gailey, Alycia
2015-01-01
The primary objective of the activity presented here is to allow students to explore the frequency components of various simple signals, with the ultimate goal of teaching them how to remove unwanted noise from a voice signal. Analysis of the frequency components of a signal allows students to design filters that remove unwanted components of a…
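The record is truncated here, but the workflow it describes — inspect a signal's spectrum, then suppress the unwanted bands — can be sketched outside Audacity with a few lines of NumPy. The 440 Hz "voice" tone, the 3 kHz interference, and the 2 kHz cutoff below are all invented for illustration.

```python
import numpy as np

fs = 8000                                      # sample rate (Hz)
t = np.arange(fs) / fs                         # one second of samples
voice = np.sin(2 * np.pi * 440 * t)            # stand-in for the voice signal
noise = 0.5 * np.sin(2 * np.pi * 3000 * t)     # unwanted high-frequency tone
signal = voice + noise

spectrum = np.fft.rfft(signal)                 # frequency components of the mix
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

spectrum[freqs > 2000] = 0                     # crude low-pass: drop components above 2 kHz
cleaned = np.fft.irfft(spectrum, n=len(signal))
```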
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, A.; Duffy, S. F.; Gyekenyesi, J. P.
1992-01-01
An updated version of the integrated design program C/CARES (composite ceramic analysis and reliability evaluation of structures) was developed for the reliability evaluation of CMC laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The new interface program from the finite-element code MARC also includes the option of using hybrid laminates and allows for variations in temperature fields throughout the component.
Jesse, Stephen; Kalinin, Sergei V
2009-02-25
An approach for the analysis of multi-dimensional, spectroscopic-imaging data based on principal component analysis (PCA) is explored. PCA selects and ranks relevant response components based on variance within the data. It is shown that for examples with small relative variations between spectra, the first few PCA components closely coincide with results obtained using model fitting, and this is achieved at rates approximately four orders of magnitude faster. For cases with strong response variations, PCA allows an effective approach to rapidly process, de-noise, and compress data. The prospects for PCA combined with correlation function analysis of component maps as a universal tool for data analysis and representation in microscopy are discussed.
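As a hedged illustration of the variance-based selection and ranking described above, scikit-learn's PCA can be applied to a spectral-image cube flattened to one spectrum per pixel. The array sizes and the choice of five components are assumptions for the sketch, not the authors' settings.

```python
import numpy as np
from sklearn.decomposition import PCA

H, W, C = 64, 64, 200                  # synthetic hyperspectral cube
cube = np.random.rand(H, W, C)

X = cube.reshape(-1, C)                # one spectrum per pixel
pca = PCA(n_components=5)
scores = pca.fit_transform(X)          # per-pixel weights of each component

maps = scores.reshape(H, W, 5)         # spatial map of each ranked component
print(pca.explained_variance_ratio_)   # variance captured, in ranked order
```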
Ofner, Johannes; Kamilli, Katharina A; Eitenberger, Elisabeth; Friedbacher, Gernot; Lendl, Bernhard; Held, Andreas; Lohninger, Hans
2015-09-15
The chemometric analysis of multisensor hyperspectral data allows a comprehensive image-based analysis of precipitated atmospheric particles. Atmospheric particulate matter was precipitated on aluminum foils and analyzed by Raman microspectroscopy and subsequently by electron microscopy and energy-dispersive X-ray spectroscopy. All images were acquired from the same 100 × 100 μm² spot. The two hyperspectral data sets and the high-resolution scanning electron microscope images were fused into a combined multisensor hyperspectral data set. This multisensor data cube was analyzed using principal component analysis, hierarchical cluster analysis, k-means clustering, and vertex component analysis. The detailed chemometric analysis of the multisensor data allowed an extensive chemical interpretation of the precipitated particles, and their structure and composition led to a comprehensive understanding of atmospheric particulate matter.
Hardisty, Frank; Robinson, Anthony C.
2010-01-01
In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
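The GeoViz Toolkit itself is Java-based; purely as a sketch of the idea behind Introspective Observer Coordination, the snippet below uses runtime introspection in Python to discover and reflectively invoke matching setter methods on registered tools. All class and method names are hypothetical.

```python
class Coordinator:
    """Connect tools by discovering matching setters at runtime."""
    def __init__(self):
        self.tools = []

    def register(self, tool):
        self.tools.append(tool)

    def broadcast(self, prop, value):
        # Reflectively invoke set_<prop> on every tool that exposes it.
        for tool in self.tools:
            setter = getattr(tool, f"set_{prop}", None)
            if callable(setter):
                setter(value)

class MapView:
    def set_selection(self, ids):
        print("map highlights", ids)

class ScatterPlot:
    def set_selection(self, ids):
        print("scatterplot highlights", ids)

coord = Coordinator()
coord.register(MapView())
coord.register(ScatterPlot())
coord.broadcast("selection", [3, 7, 9])   # both views respond in step
```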
Multiple Component Event-Related Potential (mcERP) Estimation
NASA Technical Reports Server (NTRS)
Knuth, K. H.; Clanton, S. T.; Shah, A. S.; Truccolo, W. A.; Ding, M.; Bressler, S. L.; Trejo, L. J.; Schroeder, C. E.; Clancy, Daniel (Technical Monitor)
2002-01-01
We show how model-based estimation of the neural sources responsible for transient neuroelectric signals can be improved by the analysis of single-trial data. Previously, we showed that a multiple component event-related potential (mcERP) algorithm can extract the responses of individual sources from recordings of a mixture of multiple, possibly interacting, neural ensembles. McERP also estimated single-trial amplitudes and onset latencies, thus allowing more accurate estimation of ongoing neural activity during an experimental trial. The mcERP algorithm is related to infomax independent component analysis (ICA); however, the underlying signal model is more physiologically realistic in that a component is modeled as a stereotypic waveshape varying both in amplitude and onset latency from trial to trial. The result is a model that reflects quantities of interest to the neuroscientist. Here we demonstrate that the mcERP algorithm provides more accurate results than more traditional methods such as factor analysis and the more recent ICA. Whereas factor analysis assumes the sources are orthogonal and ICA assumes the sources are statistically independent, the mcERP algorithm makes no such assumptions, thus allowing investigators to examine interactions among components by estimating the properties of single-trial responses.
The Natural Helmholtz-Hodge Decomposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatia, H.
nHHD is a C++ library to decompose a flow field into three components exhibiting specific types of behaviors. These components allow more targeted analysis of flow behavior and can be applied to a variety of application areas.
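nHHD is a C++ library that handles general (non-periodic) boundaries; as a rough analogue only, the Helmholtz split can be sketched spectrally in Python for a periodic 2D field. The constant mean flow (playing the role of the harmonic part here) is left in the divergence-free term, so this is a simplified illustration, not the natural decomposition itself.

```python
import numpy as np

def helmholtz_2d(u, v):
    """Split a periodic 2D flow (u, v) into curl-free and divergence-free
    parts by projecting onto the wave vector in Fourier space."""
    ky, kx = np.meshgrid(np.fft.fftfreq(u.shape[0]),
                         np.fft.fftfreq(u.shape[1]), indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                            # avoid dividing by zero at k = 0
    uh, vh = np.fft.fft2(u), np.fft.fft2(v)
    div = kx * uh + ky * vh                   # k . u_hat
    cf_u = np.fft.ifft2(kx * div / k2).real   # curl-free (irrotational) part
    cf_v = np.fft.ifft2(ky * div / k2).real
    return (cf_u, cf_v), (u - cf_u, v - cf_v)
```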
Delorme, Arnaud; Makeig, Scott
2004-03-15
We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel, and event information importing; data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots); preprocessing (including artifact rejection, filtering, epoch selection, and averaging); independent component analysis (ICA); and time/frequency decompositions, including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to the available memory. Middle-layer functions allow users to customize data processing using the command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, a user tutorial, and extensive documentation.
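EEGLAB runs in MATLAB; as a language-neutral sketch of the ICA step at its core, the snippet below applies scikit-learn's FastICA — one ICA flavor among several — to a synthetic multichannel mixture. The source waveforms and channel count are invented.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(7 * t),                # oscillatory source
                np.sign(np.sin(3 * t)),       # square-wave artifact
                rng.laplace(size=2000)]       # sparse noise source
X = sources @ rng.random((3, 8))              # mix onto 8 "channels"

ica = FastICA(n_components=3, random_state=0)
S = ica.fit_transform(X)                      # recovered component activations
A = ica.mixing_                               # channel weights ("scalp maps")
```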
Using McStas for modelling complex optics, using simple building bricks
NASA Astrophysics Data System (ADS)
Willendrup, Peter K.; Udby, Linda; Knudsen, Erik; Farhi, Emmanuel; Lefmann, Kim
2011-04-01
The McStas neutron ray-tracing simulation package is a versatile tool for producing accurate neutron simulations, extensively used for design and optimization of instruments, virtual experiments, data analysis, and user training. In McStas, component organization and simulation flow are intrinsically linear: the neutron interacts with the beamline components in sequential order, one by one. Historically, a beamline component with several parts had to be implemented with a complete, internal description of all these parts, e.g., a guide component including all four mirror plates and the logic required to allow scattering between the mirrors. For quite a while, users have requested "components inside components", or meta-components, allowing the functionality of several simple components to be combined to achieve more complex behaviour, i.e., four single mirror plates together defining a guide. We show here that it is now possible to define meta-components in McStas, and present a set of detailed, validated examples, including a guide with an embedded, wedged, polarizing mirror system of the Helmholtz-Zentrum Berlin type.
Hencken, Kenneth; Flower, William L.
1999-01-01
A compact optical probe is disclosed, particularly useful for analysis of emissions in industrial environments. The instant invention provides a geometry for optically based measurements that allows all optical components (source, detector, relay optics, etc.) to be located in proximity to one another. The geometry of the probe disclosed herein provides a means for making optical measurements in environments where it is difficult and/or expensive to gain access to the vicinity of a flow stream to be measured. Significantly, the lens geometry of the optical probe allows the analysis location within a flow stream being monitored to be moved while maintaining optical alignment of all components, even when the optical probe is focused on a plurality of different analysis points within the flow stream.
Analysis and design of a mechanical system to use with the Ronchi and Fizeau tests
NASA Astrophysics Data System (ADS)
Galán-Martínez, Arturo D.; Santiago-Alvarado, Agustín; González-García, Jorge; Cruz-Martínez, Víctor M.; Cordero-Dávila, Alberto; Granados-Agustin, Fermin S.; Robledo-Sánchez, Carlos
2013-11-01
Nowadays, there is a demand for more efficient opto-mechanical mounts which allow for the implementation of robust optical arrays in a quick and simple fashion; that is to say, mounts are needed which facilitate alignment of the optical components in order to perform the desired movements of each component. Optical testing systems available on the market today are costly, heavy, and sometimes require multiple kits depending on the dimensions of the optical components. In this paper, we present the design and analysis of a mechanical system with interchangeable basic mounts which allows for the application of both the Ronchi and Fizeau tests for the evaluation of concave reflective surfaces with diameters of 2 to 10 cm. The mechanical system design is done using the product design process methodology, while the analysis is performed using the commercial software SolidWorks.
Hydrodynamic design of generic pump components
NASA Technical Reports Server (NTRS)
Eastland, A. H. J.; Dodson, H. C.
1991-01-01
Inducer and impeller base geometries were defined for a fuel pump for a generic generator cycle. Blade surface data and inlet flowfield definition are available in sufficient detail to allow computational fluid dynamic analysis of the two components.
NASA Technical Reports Server (NTRS)
Brown, Andrew M.
2014-01-01
Numerical and analytical methods were developed to determine damage accumulation in specific engine components when speed variation is included. The Dither Life Ratio (DLR) is shown to be well over a factor of 2 for a specific example. The steady-state assumption is shown to be accurate for most turbopump cases, allowing rapid calculation of DLR. If hot-fire speed data are unknown, a Monte Carlo method was developed that uses speed statistics for similar engines. Application of these techniques allows the analyst to reduce both uncertainty and excess conservatism, and high values of DLR could allow a previously unacceptable part to pass HCF criteria without redesign. Given the benefit and ease of implementation, it is recommended that any finite-life turbomachine component analysis adopt these techniques. Probability values were calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new Excel macros were written to calculate the combined load for any specific probability level, and closed-form curve fits were generated for the widely used 3σ and 2σ probability levels. For the design of lightweight aerospace components, obtaining an accurate, reproducible, statistically meaningful answer is critical.
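As a sketch of the Monte Carlo combination of random and harmonic loads discussed above — not the author's Excel macros — one can sample the harmonic load at a uniform random phase, add a Gaussian random load, and read off the magnitude at a chosen probability level. The amplitudes are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

A_h = 100.0        # harmonic load amplitude (placeholder units)
sigma_r = 40.0     # RMS of the random load component (placeholder)

harmonic = A_h * np.sin(rng.uniform(0, 2 * np.pi, n))  # random-phase harmonic
random_part = rng.normal(0, sigma_r, n)                # Gaussian random load
combined = np.abs(harmonic + random_part)

# load not exceeded with 99.865% probability (the "3-sigma" level)
print(f"3-sigma combined load: {np.percentile(combined, 99.865):.1f}")
```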
Model-free fMRI group analysis using FENICA.
Schöpf, V; Windischberger, C; Robinson, S; Kasess, C H; Fischmeister, F PhS; Lanzenberger, R; Albrecht, J; Kleemann, A M; Kopietz, R; Wiesmann, M; Moser, E
2011-03-01
Exploratory analysis of functional MRI data allows activation to be detected even if the time course differs from that which is expected. Independent component analysis (ICA) has emerged as a powerful approach, but current extensions to the analysis of group studies suffer from a number of drawbacks: they can be computationally demanding, results are dominated by technical and motion artefacts, and some methods require that time courses be the same for all subjects or that templates be defined to identify common components. We have developed a group ICA (gICA) method which is based on single-subject ICA decompositions and the assumption that the spatial distribution of signal changes in components which reflect activation is similar between subjects. This approach, which we have called Fully Exploratory Network Independent Component Analysis (FENICA), identifies group activation in two stages. ICA is performed on the single-subject level, then consistent components are identified via spatial correlation. Group activation maps are generated in a second-level GLM analysis. FENICA was applied to data from three studies employing a wide range of stimulus and presentation designs: an event-related motor task, a block-design cognition task, and an event-related chemosensory experiment. In all cases, the group maps identified by FENICA as being the most consistent over subjects correspond to task activation. There is good agreement between FENICA results and regions identified in prior GLM-based studies. In the chemosensory task, additional regions are identified by FENICA and temporal concatenation ICA, which we show are related to the stimulus but exhibit a delayed response. FENICA is a fully exploratory method that allows activation to be identified without assumptions about temporal evolution, and isolates activation from other sources of signal fluctuation in fMRI. It has the advantage over other gICA methods that it is computationally undemanding, spotlights components relating to activation rather than artefacts, allows the use of familiar statistical thresholding through deployment of a higher-level GLM analysis, and can be applied to studies where the paradigm is different for all subjects.
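A minimal sketch of the consistency idea — not FENICA's exact algorithm — is to correlate single-subject ICA spatial maps across subjects and rank components by their mean best-match correlation:

```python
import numpy as np

def most_consistent(maps):
    """maps[s]: (n_components, n_voxels) spatial maps from subject s's ICA.
    Rank subject 0's components by mean best-match correlation elsewhere."""
    ref = maps[0]
    scores = []
    for comp in ref:
        best = [max(abs(np.corrcoef(comp, other_comp)[0, 1])
                    for other_comp in other)
                for other in maps[1:]]
        scores.append(np.mean(best))
    return np.argsort(scores)[::-1]   # most consistent components first
```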
NASA Astrophysics Data System (ADS)
Khodasevich, M. A.; Sinitsyn, G. V.; Skorbanova, E. A.; Rogovaya, M. V.; Kambur, E. I.; Aseev, V. A.
2016-06-01
Analysis of multiparametric data on the transmission spectra of 24 divins (Moldovan cognacs) in the 190-2600 nm range allows outliers to be identified and removed from the sample set before further analysis. Principal component analysis and a classification tree with a single-rank predictor constructed in the 2D space of principal components allow divin manufacturers to be classified. It is shown that the accuracy of the syringaldehyde, ethyl acetate, vanillin, and gallic acid concentrations in divins calculated with regression on latent structures (partial least squares) depends on the sample volume and is 3, 6, 16, and 20%, respectively, which is acceptable for the application.
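The regression-on-latent-structures step corresponds to partial least squares (PLS) regression; below is a hedged scikit-learn sketch on synthetic stand-in spectra, reporting a cross-validated relative error. The matrix sizes and component count are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

X = np.random.rand(24, 500)      # stand-in transmission spectra (24 samples)
y = 1 + np.random.rand(24)       # stand-in concentrations (kept away from 0)

pls = PLSRegression(n_components=5)
y_pred = cross_val_predict(pls, X, y, cv=6)
rel_err = np.mean(np.abs(y_pred.ravel() - y) / y) * 100
print(f"mean relative error: {rel_err:.1f}%")
```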
Investigation of domain walls in PPLN by confocal raman microscopy and PCA analysis
NASA Astrophysics Data System (ADS)
Shur, Vladimir Ya.; Zelenovskiy, Pavel; Bourson, Patrice
2017-07-01
Confocal Raman microscopy (CRM) is a powerful tool for the investigation of ferroelectric domains. Mechanical stresses and electric fields existing in the vicinity of neutral and charged domain walls modify the frequency, intensity, and width of spectral lines [1], thus allowing visualization of micro- and nanodomain structures both at the surface and in the bulk of the crystal [2,3]. Stresses and fields are naturally coupled in ferroelectrics due to the inverse piezoelectric effect and can hardly be separated in Raman spectra. PCA is a powerful statistical method for the analysis of large data matrices, providing a set of orthogonal variables called principal components (PCs). PCA is widely used for classification of experimental data, for example in crystallization experiments, for detection of small amounts of components in solid mixtures, etc. [4,5]. In Raman spectroscopy, PCA has been applied to the analysis of phase transitions and provided the critical pressure with good accuracy [6]. In the present work we applied the principal component analysis (PCA) method for the first time to Raman spectra measured in periodically poled lithium niobate (PPLN). We found that the principal components demonstrate different sensitivity to mechanical stresses and electric fields in the vicinity of the domain walls. This allowed us to separately visualize the spatial distributions of mechanical stresses and electric fields at the surface and in the bulk of PPLN.
Comparative study of human blood Raman spectra and biochemical analysis of patients with cancer
NASA Astrophysics Data System (ADS)
Shamina, Lyudmila A.; Bratchenko, Ivan A.; Artemyev, Dmitry N.; Myakinin, Oleg O.; Moryatov, Alexander A.; Orlov, Andrey E.; Kozlov, Sergey V.; Zakharov, Valery P.
2018-04-01
In this study we measured spectral features of blood by Raman spectroscopy. The correlation between the obtained spectral data and the results of biochemical studies is investigated. Analysis of specific spectra allows identification of informative spectral bands proportional to components whose content is associated with body fluid homeostasis changes in various pathological conditions. Regression analysis of the obtained spectral data allows lung cancer to be discriminated from other tumors with a posterior probability of 88.3%. The potential of applying surface-enhanced Raman spectroscopy with the utilized experimental setup for further studies of the component composition of body fluids was estimated. The greatest signal amplification was achieved for the gold substrate with a surface roughness of 1 μm. In general, the developed approach to body fluid analysis provides the basis of a useful and minimally invasive method of pathology screening.
NASA Astrophysics Data System (ADS)
Rogowitz, Bernice E.; Matasci, Naim
2011-03-01
The explosion of online scientific data from experiments, simulations, and observations has given rise to an avalanche of algorithmic, visualization, and imaging methods. There has also been enormous growth in the introduction of tools that provide interactive interfaces for exploring these data dynamically. Most systems, however, do not support the real-time exploration of patterns and relationships across tools and do not provide guidance on which colors, colormaps, or visual metaphors will be most effective. In this paper, we introduce a general architecture for sharing metadata between applications and a "Metadata Mapper" component that allows the analyst to decide how metadata from one component should be represented in another, guided by perceptual rules. This system is designed to support "brushing" [1], in which highlighting a region of interest in one application automatically highlights corresponding values in another, allowing the scientist to develop insights from multiple sources. Our work builds on the component-based iPlant Cyberinfrastructure [2] and provides a general approach to supporting interactive exploration across independent visualization and visual analysis components.
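As an illustration only (the iPlant components and perceptual rules are not reproduced), a Metadata-Mapper-style broker can be sketched as per-target translation rules through which brushed selections are routed. All class names are hypothetical.

```python
class MetadataMapper:
    """Route selection metadata between tools via per-target translations."""
    def __init__(self):
        self.rules = []   # (source, target, transform) triples

    def connect(self, source, target, transform=lambda v: v):
        self.rules.append((source, target, transform))

    def publish(self, source, value):
        for src, target, transform in self.rules:
            if src is source:
                target.highlight(transform(value))

class Heatmap:
    def highlight(self, rows):
        print("heatmap rows:", rows)

class Tree:
    def highlight(self, nodes):
        print("tree nodes:", nodes)

mapper, hm, tree = MetadataMapper(), Heatmap(), Tree()
mapper.connect(hm, tree, transform=lambda rows: [f"gene{r}" for r in rows])
mapper.publish(hm, [1, 4])   # brushing the heatmap highlights tree nodes
```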
HPLC-Orbitrap analysis for identification of organic molecules in complex material
NASA Astrophysics Data System (ADS)
Gautier, T.; Schmitz-Afonso, I.; Carrasco, N.; Touboul, D.; Szopa, C.; Buch, A.; Pernot, P.
2015-10-01
We performed High Performance Liquid Chromatography (HPLC) coupled to Orbitrap High Resolution Mass Spectrometry (OHR MS) analysis of Titan's tholins. This analysis allowed us to determine the exact composition and structure of some of the major components of tholins.
Development and test of advanced composite components. Center Directors discretionary fund program
NASA Technical Reports Server (NTRS)
Faile, G.; Hollis, R.; Ledbetter, F.; Maldonado, J.; Sledd, J.; Stuckey, J.; Waggoner, G.; Engler, E.
1985-01-01
This report describes the design, analysis, fabrication, and test of a complex bathtub fitting. Graphite fibers in an epoxy matrix were utilized in the manufacture of 11 components representing four different design and layup concepts. Design allowables were developed for use in the final stress analysis. Strain gage measurements were taken throughout the static load test, and correlation of test and analysis data was performed, yielding a good understanding of the material behavior and instrumentation requirements for future applications.
NASA Astrophysics Data System (ADS)
Popov, V. N.; Botygin, I. A.; Kolochev, A. S.
2017-01-01
The approach allows data from international meteorological information exchange codes to be represented using a metadescription, i.e., a formalism associated with certain categories of resources. Development of the metadata components was based on an analysis of data from surface meteorological observations, vertical atmospheric sounding, atmospheric wind sounding, weather radar observations, satellite observations, and others. A common set of metadata components, including classes, divisions, and groups, was formed for a generalized description of the meteorological data. The structure and content of the main components of the generalized metadescription are presented in detail using the example of meteorological observations from land and sea stations. The functional structure of a distributed computing system is described, which allows large volumes of meteorological data to be stored for further processing in solving problems of analysis and forecasting of climatic processes.
An economic analysis methodology for project evaluation and programming.
DOT National Transportation Integrated Search
2013-08-01
Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key : quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...
BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0
NASA Technical Reports Server (NTRS)
1991-01-01
The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.
ADAPTION OF NONSTANDARD PIPING COMPONENTS INTO PRESENT DAY SEISMIC CODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. T. Clark; M. J. Russell; R. E. Spears
2009-07-01
With spiraling energy demand and flat energy supply, there is a need to extend the life of older nuclear reactors. This sometimes requires that existing systems be evaluated to present-day seismic codes. Older reactors built in the 1960s and early 1970s often used fabricated piping components that were code compliant during their initial construction time period, but are outside the standard parameters of present-day piping codes. There are several approaches available to the analyst in evaluating these non-standard components to modern codes. The simplest approach is to use the flexibility factors and stress indices for similar standard components with the assumption that the non-standard component's flexibility factors and stress indices will be very similar. This approach can require significant engineering judgment. A more rational approach available in Section III of the ASME Boiler and Pressure Vessel Code, which is the subject of this paper, involves calculation of flexibility factors using finite element analysis of the non-standard component. Such analysis allows modeling of geometric and material nonlinearities. Flexibility factors based on these analyses are sensitive to the load magnitudes used in their calculation, load magnitudes that need to be consistent with those produced by the linear system analyses where the flexibility factors are applied. This can lead to iteration, since the magnitude of the loads produced by the linear system analysis depends on the magnitude of the flexibility factors. After the loading applied to the non-standard component finite element model has been matched to the loads produced by the associated linear system model, the component finite element model can then be used to evaluate the performance of the component under those loads with the nonlinear analysis provisions of the Code, should the load levels lead to calculated stresses in excess of allowable stresses. This paper details the application of component-level finite element modeling to account for geometric and material nonlinear component behavior in a linear elastic piping system model. Note that this technique can be applied to the analysis of B31 piping systems.
A new approach to the human muscle model.
Baildon, R W; Chapman, A E
1983-01-01
Hill's (1938) two-component muscle model is used as the basis for digital computer simulation of human muscular contraction by means of an iterative process. The contractile (CC) and series elastic (SEC) components are lumped components of structures which produce and transmit torque to the external environment. The CC is described in angular terms along four dimensions, as a series of non-planar torque-angle-angular velocity surfaces stacked on top of each other, each surface being appropriate to a given level of muscular activation. The SEC is described similarly along dimensions of torque, angular stretch, overall muscle angular displacement, and activation. The iterative process introduces negligible error and allows the mechanical outcome of a variety of normal muscular contractions to be evaluated parsimoniously. The model allows analysis of many aspects of muscle behaviour as well as optimization studies. Definition of relevant relations should also allow reproduction and prediction of the outcome of contractions in individuals.
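A minimal numerical sketch of such a two-component iteration — with assumed, simplified CC and SEC forms rather than the paper's four-dimensional surfaces — might look like this for an isometric contraction:

```python
# Assumed forms: Hill hyperbola for the CC, linear SEC stiffness.
T0, vmax, c = 50.0, 10.0, 4.0   # max isometric torque, max velocity, shape
k_sec = 400.0                   # SEC stiffness (N*m/rad), assumed

def cc_velocity(T):
    # invert Hill's force-velocity hyperbola for shortening velocity
    return vmax * (T0 - T) / (T0 + c * T)

theta_m = 0.0     # whole-muscle angle held fixed (isometric)
theta_cc = 0.0    # contractile component angle
dt = 1e-4
for _ in range(5000):             # iterate over 0.5 s
    stretch = theta_m - theta_cc  # SEC stretch grows as the CC shortens
    T = k_sec * abs(stretch)      # torque transmitted by the SEC
    theta_cc -= cc_velocity(T) * dt
print(f"torque after 0.5 s: {T:.1f} N*m")   # rises toward T0
```

The transmitted torque climbs toward the isometric maximum as the CC shortens against the SEC, mirroring the lumped-component behaviour described above.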
Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools
ERIC Educational Resources Information Center
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…
ERIC Educational Resources Information Center
Kirk, Emily R.; Becker, Jennifer A.; Skinner, Christopher H.; Fearrington, Jamie Yarbr; McCane-Bowling, Sara J.; Amburn, Christie; Luna, Elisa; Greear, Corinne
2010-01-01
Teacher referrals for consultation resulted in two independent teams collecting evidence that allowed for a treatment component evaluation of color wheel (CW) procedures and/or interdependent group-oriented reward (IGOR) procedures on inappropriate vocalizations in one third- and one first-grade classroom. Both studies involved the application of…
Structural reliability analysis of laminated CMC components
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.
1991-01-01
For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. The focus here is on the time-independent failure response of these materials, and a reliability analysis associated with the initiation of matrix cracking is presented. A public-domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and serves as a design aid for analyzing structural components made from laminated CMC materials. Issues relevant to the effect of component size are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.
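For the final estimation step, here is a hedged sketch using SciPy's maximum-likelihood fit of the three-parameter Weibull distribution (shape, i.e., Weibull modulus; location/threshold; scale, i.e., characteristic strength) to a synthetic failure population:

```python
import numpy as np
from scipy.stats import weibull_min

# synthetic failure strengths (MPa) with an underlying Weibull distribution
strengths = weibull_min.rvs(c=8.0, loc=0.0, scale=350.0, size=30,
                            random_state=np.random.default_rng(1))

shape, loc, scale = weibull_min.fit(strengths)   # three-parameter MLE
print(f"modulus={shape:.2f}, threshold={loc:.1f} MPa, "
      f"characteristic strength={scale:.1f} MPa")
```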
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.
1992-01-01
An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.
NASA Technical Reports Server (NTRS)
Jensen, Ralph H.; Dever, Timothy P.
2006-01-01
Design of a flywheel module, designated the G2 module, is described. The G2 flywheel is a 60,000 RPM, 525 W-hr, 1 kW system designed for a laboratory environment; it will be used for component testing and system demonstrations, with the goal of applying flywheels to aerospace energy storage and integrated power and attitude control system (IPACS) applications. G2 has a modular design, which allows new motors, magnetic bearings, touchdown bearings, and rotors to be installed without a complete redesign of the system. This design process involves several engineering disciplines, and requirements are developed for the speed, energy storage, power level, and operating environment. The G2 rotor system consists of a multilayer carbon fiber rim with a titanium hub on which the other components mount, and rotordynamics analysis is conducted to ensure rigid and flexible rotor modes are controllable or outside of the operating speed range. Magnetic bearings are sized using 1-D magnetic circuit analysis and refined using 3-D finite element analysis. The G2 magnetic bearing system was designed by Texas A&M and has redundancy that allows derated operation after the loss of some components, and an existing liquid-cooled, two-pole permanent magnet motor/generator is used. The touchdown bearing system is designed with a squeeze film damper system allowing spin down from full operating speed in case of a magnetic bearing failure. The G2 flywheel will enable module-level demonstrations of component technology, and will be a key building block in system-level attitude control and IPACS demonstrations.
Steingass, Christof Björn; Jutzi, Manfred; Müller, Jenny; Carle, Reinhold; Schmarr, Hans-Georg
2015-03-01
Ripening-dependent changes of pineapple volatiles were studied in a nontargeted profiling analysis. Volatiles were isolated via headspace solid-phase microextraction and analyzed by comprehensive 2D gas chromatography and mass spectrometry (HS-SPME-GC×GC-qMS). Profile patterns presented in the contour plots were evaluated by applying image processing techniques and subsequent multivariate statistical data analysis. Statistical methods comprised unsupervised hierarchical cluster analysis (HCA) and principal component analysis (PCA) to classify the samples. Supervised partial least squares discriminant analysis (PLS-DA) and partial least squares (PLS) regression were applied to discriminate different ripening stages and describe the development of volatiles during postharvest storage, respectively. In this way, chemical markers allowing class separation were revealed. The workflow permitted the rapid distinction between premature green-ripe pineapples and postharvest-ripened sea-freighted fruits. When PCA was restricted to two principal components, the volatile profiles of fully ripe air-freighted pineapples were similar to those of green-ripe fruits postharvest-ripened for 6 days after simulated sea-freight export. However, considering the third principal component as well allowed differentiation between air-freighted fruits and the four progressing postharvest maturity stages of sea-freighted pineapples.
Typification of cider brandy on the basis of cider used in its manufacture.
Rodríguez Madrera, Roberto; Mangas Alonso, Juan J
2005-04-20
A study of the typification of cider brandies on the basis of the origin of the raw material used in their manufacture was conducted using chemometric techniques (principal component analysis, linear discriminant analysis, and Bayesian analysis) together with their composition in volatile compounds, as analyzed by gas chromatography with flame ionization detection for the major volatiles and mass spectrometric detection for the minor ones. Significant principal components computed by a double cross-validation procedure allowed the structure of the database to be visualized as a function of the raw material, that is, cider made from fresh apple juice versus cider made from apple juice concentrate. Feasible and robust discriminant rules were computed and validated by a cross-validation procedure that allowed the authors to classify fresh and concentrate cider brandies, obtaining classification hits of >92%. The most discriminating variables for typifying cider brandies according to their raw material were 1-butanol and ethyl hexanoate.
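A sketch of the discriminant step — linear discriminant analysis validated by cross-validation, as in the study, though on synthetic stand-in data — using scikit-learn:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# X: volatile-compound concentrations (e.g., 1-butanol, ethyl hexanoate, ...)
# y: 0 = fresh-juice cider brandy, 1 = concentrate cider brandy
X = np.random.rand(60, 12)
y = np.random.randint(0, 2, 60)

lda = LinearDiscriminantAnalysis()
hits = cross_val_score(lda, X, y, cv=10)    # cross-validated hit rate
print(f"mean classification hits: {hits.mean():.0%}")
```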
Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.
1989-01-01
A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode.
ΔΔPT: a comprehensive toolbox for the analysis of protein motion
2013-01-01
Background: Normal Mode Analysis is one of the most successful techniques for studying motions in proteins and macromolecules. It can provide information on the mechanism of protein functions, be used to aid crystallography and NMR data reconstruction, and calculate protein free energies. Results: ΔΔPT is a toolbox allowing calculation of elastic network models and principal component analysis. It allows the analysis of PDB files or trajectories taken from Gromacs, Amber, and DL_POLY. As well as calculating the normal modes, it also allows comparison of the modes with experimental protein motion, variation of modes with mutation or ligand binding, and calculation of molecular dynamics entropies. Conclusions: This toolbox makes the respective tools available to a wide community of potential NMA users, and allows them unrivalled ability to analyse normal modes using a variety of techniques and current software. PMID:23758746
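As a sketch of the elastic-network calculation underlying such toolboxes (the cutoff and spring constant are assumed values, not ΔΔPT's defaults), an anisotropic network model builds a Hessian from C-alpha coordinates and diagonalizes it:

```python
import numpy as np

def anm_modes(coords, cutoff=15.0, gamma=1.0):
    """Anisotropic network model: Hessian from C-alpha coordinates,
    normal modes from its eigendecomposition."""
    n = len(coords)
    hess = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = d @ d
            if r2 > cutoff ** 2:
                continue
            block = -gamma * np.outer(d, d) / r2     # off-diagonal 3x3 block
            hess[3*i:3*i+3, 3*j:3*j+3] = block
            hess[3*j:3*j+3, 3*i:3*i+3] = block
            hess[3*i:3*i+3, 3*i:3*i+3] -= block      # diagonal accumulates
            hess[3*j:3*j+3, 3*j:3*j+3] -= block
    evals, evecs = np.linalg.eigh(hess)
    return evals[6:], evecs[:, 6:]   # skip the six rigid-body zero modes
```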
Distributed optical fiber vibration sensor based on spectrum analysis of Polarization-OTDR system.
Zhang, Ziyi; Bao, Xiaoyi
2008-07-07
A fully distributed optical fiber vibration sensor is demonstrated based on spectrum analysis of a polarization-OTDR system. Without performing any data averaging, vibration disturbances up to 5 kHz are successfully demonstrated in a 1 km fiber link with 10 m spatial resolution. The FFT is performed at each spatial resolution cell; relating the disturbance at each frequency component to location allows simultaneous detection of multiple events with the same or different frequency components.
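The per-location spectral analysis can be sketched as an FFT along the pulse (time) axis of a matrix of repeated OTDR traces. The pulse rate, trace count, and random data below are placeholders.

```python
import numpy as np

n_pulses, n_positions, prf = 20000, 100, 20000   # pulse repetition freq. (Hz)
traces = np.random.randn(n_pulses, n_positions)  # one column per fiber cell

spectra = np.abs(np.fft.rfft(traces, axis=0))    # FFT at every position
freqs = np.fft.rfftfreq(n_pulses, d=1 / prf)

# a vibration event appears as a spectral peak localized in position;
# skip the DC bin when locating the dominant frequency per cell
peak_freq = freqs[spectra[1:].argmax(axis=0) + 1]
```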
Solder Reflow Failures in Electronic Components During Manual Soldering
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander; Greenwell, Chris; Felt, Frederick
2008-01-01
This viewgraph presentation reviews the solder reflow failures in electronic components that occur during manual soldering. It discusses the specifics of manual-soldering-induced failures in plastic devices with internal solder joints. The failure analysis revealed that molten solder had squeezed up to the die surface along the die/molding compound interface, and the dice were not protected with glassivation, allowing solder to short the gate and source to the drain contact. The failure analysis concluded that the parts failed due to overheating during manual soldering.
NASA Astrophysics Data System (ADS)
Aouabdi, Salim; Taibi, Mahmoud; Bouras, Slimane; Boutasseta, Nadir
2017-06-01
This paper describes an approach for identifying localized gear tooth defects, such as pitting, using phase currents measured from an induction machine driving the gearbox. A new anomaly detection tool is based on the multi-scale entropy (MSE) algorithm SampEn, which allows correlations in signals to be identified over multiple time scales. Motor current signature analysis (MCSA) is used in conjunction with principal component analysis (PCA), and observed values are compared with those predicted from a model built using nominally healthy data. Simulation results show that the proposed method is able to detect gear tooth pitting in current signals.
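For reference, a compact sample entropy implementation with the usual coarse-graining step of multi-scale entropy is sketched below; the defaults (m = 2, r = 0.2·std) follow common convention, not necessarily the paper's settings.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn: -log of the chance that sequences matching for m points
    (within tolerance r*std, Chebyshev distance) also match for m+1."""
    x = np.asarray(x, dtype=float)
    tol, n = r * x.std(), len(x)

    def matches(length):
        t = np.array([x[i:i + length] for i in range(n - m)])
        return sum(np.sum(np.abs(t[i + 1:] - t[i]).max(axis=1) <= tol)
                   for i in range(len(t)))

    return -np.log(matches(m + 1) / matches(m))

def coarse_grain(x, scale):
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

signal = np.random.randn(3000)   # stand-in current signal
mse = [sample_entropy(coarse_grain(signal, s)) for s in range(1, 11)]
```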
Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.
2009-01-01
Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575
User's manual for the Composite HTGR Analysis Program (CHAP-1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.
1977-03-01
CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.
Alerts Analysis and Visualization in Network-based Intrusion Detection Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Dr. Li
2010-08-01
The alerts produced by network-based intrusion detection systems, e.g., Snort, can be difficult for network administrators to efficiently review and respond to due to the enormous number of alerts generated in a short time frame. This work describes how the visualization of raw IDS alert data assists network administrators in understanding the current state of a network and quickens the process of reviewing and responding to intrusion attempts. The project presented in this work consists of three primary components. The first component provides a visual mapping of the network topology that allows the end user to easily browse clustered alerts. The second component is based on the flocking behavior of birds, whereby birds tend to follow other birds with similar behaviors; this component allows the end user to see the clustering process and provides an efficient means for reviewing alert data. The third component discovers and visualizes patterns of multistage attacks by profiling the attacker's behaviors.
Gruen, D.M.; Young, C.E.; Pellin, M.J.
1989-12-26
A charged particle spectrometer is described for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode. 12 figs.
Analysis of scorpion venom composition by Raman Spectroscopy
NASA Astrophysics Data System (ADS)
Martínez-Zérega, Brenda E.; González-Solís, José L.
2015-01-01
In this work we study the venom of two Centruroides scorpion species using Raman spectroscopy. Analysis of the spectra allows the chemical composition of the venoms to be determined and the main differences and similarities among the species to be established. It is also shown that the use of principal component analysis may help distinguish between the scorpion species.
Neural Networks for Rapid Design and Analysis
NASA Technical Reports Server (NTRS)
Sparks, Dean W., Jr.; Maghami, Peiman G.
1998-01-01
Artificial neural networks have been employed for rapid and efficient dynamics and control analysis of flexible systems. Specifically, feedforward neural networks are designed to approximate nonlinear dynamic components over prescribed input ranges and are used in simulations as a means to speed up the overall time-response analysis process. To capture the recursive nature of dynamic components with artificial neural networks, recurrent networks are employed, which use state feedback with the appropriate number of time delays as inputs. Once properly trained, neural networks can give very good approximations to nonlinear dynamic components and, by their judicious use in simulations, allow the analyst the potential to speed up the analysis process considerably. To illustrate this potential speed-up, an existing simulation model of a spacecraft reaction wheel system is executed, first conventionally, and then with an artificial neural network in place.
Deflection Analysis of the Space Shuttle External Tank Door Drive Mechanism
NASA Technical Reports Server (NTRS)
Tosto, Michael A.; Trieu, Bo C.; Evernden, Brent A.; Hope, Drew J.; Wong, Kenneth A.; Lindberg, Robert E.
2008-01-01
Upon observing an abnormal closure of the Space Shuttle's External Tank Doors (ETD), a dynamic model was created in MSC/ADAMS to conduct deflection analyses of the Door Drive Mechanism (DDM). For a similar analysis, the traditional approach would be to construct a full finite element model of the mechanism. The purpose of this paper is to describe an alternative approach that models the flexibility of the DDM using a lumped-parameter approximation to capture the compliance of individual parts within the drive linkage. This approach allows for rapid construction of a dynamic model in a time-critical setting, while still retaining the appropriate equivalent stiffness of each linkage component. As a validation of these equivalent stiffnesses, finite element analysis (FEA) was used to iteratively update the model toward convergence. Following this analysis, deflections recovered from the dynamic model can be used to calculate stress and classify each component's deformation as either elastic or plastic. Based on the modeling assumptions used in this analysis and the maximum input forcing condition, two components in the DDM show a factor of safety less than or equal to 0.5. However, to accurately evaluate the induced stresses, additional mechanism rigging information would be necessary to characterize the input forcing conditions. This information would also allow for the classification of stresses as either elastic or plastic.
Bayesian hierarchical functional data analysis via contaminated informative priors.
Scarpa, Bruno; Dunson, David B
2009-09-01
A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.
The effect of filtering on the determination of lunar tides
NASA Astrophysics Data System (ADS)
Palumbo, A.; Mazzarella, A.
1980-01-01
The determination of lunar tides obtained by a combination of a filtering process and the fixed lunar age technique is proposed. It is shown that such a method allows an improvement of the signal-to-noise ratio without altering the amplitude and phase angle of the signal. It consequently allows significant determination of the lunar semidiurnal component M2 from data series shorter than those required by other methods, and the deduction of other interesting lunisolar components which have not previously been significantly determined in surface pressure and temperature data. The analysis of data for Gan, the Vesuvian Observatory, and the Eiffel Tower has provided new determinations of L2(p) and has allowed comparison between the results obtained by the present and other methods.
Calibration method and apparatus for measuring the concentration of components in a fluid
Durham, M.D.; Sagan, F.J.; Burkhardt, M.R.
1993-12-21
A calibration method and apparatus for use in measuring the concentrations of components of a fluid is provided. The measurements are determined from the intensity of radiation over a selected range of radiation wavelengths using peak-to-trough calculations. The peak-to-trough calculations are simplified by compensating for radiation absorption by the apparatus. The invention also allows absorption characteristics of an interfering fluid component to be accurately determined and negated thereby facilitating analysis of the fluid. 7 figures.
Calibration method and apparatus for measuring the concentration of components in a fluid
Durham, Michael D.; Sagan, Francis J.; Burkhardt, Mark R.
1993-01-01
A calibration method and apparatus for use in measuring the concentrations of components of a fluid is provided. The measurements are determined from the intensity of radiation over a selected range of radiation wavelengths using peak-to-trough calculations. The peak-to-trough calculations are simplified by compensating for radiation absorption by the apparatus. The invention also allows absorption characteristics of an interfering fluid component to be accurately determined and negated thereby facilitating analysis of the fluid.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobs, F.S.; Filby, R.H.
Instrumental neutron activation analysis was used to measure the concentrations of 30 elements in Athabasca oil sands and oil-sand components. The oil sands were separated into solid residue, bitumen, and fines by Soxhlet extraction with toluene. The mineral content of the extracted bitumen was dependent on the treatment of the oil sand prior to extraction. The geochemically important and organically associated trace element contents of the bitumen (and asphaltenes) were determined by subtracting the mineral contributions from the total measured concentrations. The method allows analysis of the bitumen without the necessity of ultracentrifugation or membrane filtration, which might remove geochemically important components of the bitumen. The method permits classification of trace elements into organic and inorganic combinations.
[3D visualization and analysis of vocal fold dynamics].
Bohr, C; Döllinger, M; Kniesburges, S; Traxdorf, M
2016-04-01
Visual investigation methods of the larynx mainly allow for two-dimensional presentation of the three-dimensional structure of vocal fold dynamics. The vertical component of the vocal fold dynamics is often neglected, yielding a loss of information. The latest studies show that the vertical dynamic components are in the range of the medio-lateral dynamics and play a significant role within the phonation process. This work presents a method for future 3D reconstruction and visualization of endoscopically recorded vocal fold dynamics. The setup contains a high-speed camera (HSC) and a laser projection system (LPS). The LPS projects a regular grid onto the vocal fold surfaces and, in combination with the HSC, allows a three-dimensional reconstruction of the vocal fold surface. Hence, quantitative information on displacements and velocities can be provided. The applicability of the method is presented for one ex vivo human larynx, one ex vivo porcine larynx, and one synthetic silicone larynx. The setup introduced allows the reconstruction of the entire visible vocal fold surface for each oscillation state. This enables a detailed analysis of the three-dimensional dynamics (i.e., displacements, velocities, accelerations) of the vocal folds. The next goal is the miniaturization of the LPS to allow clinical in vivo analysis in humans. We anticipate new insight into dependencies between 3D dynamic behavior and the quality of the acoustic outcome for healthy and disordered phonation.
Component pattern analysis of chemicals using multispectral THz imaging system
NASA Astrophysics Data System (ADS)
Kawase, Kodo; Ogawa, Yuichi; Watanabe, Yuki
2004-04-01
We have developed a novel basic technology for terahertz (THz) imaging, which allows detection and identification of chemicals by introducing component spatial pattern analysis. The spatial distributions of the chemicals were obtained from terahertz multispectral transillumination images, using absorption spectra previously measured with a widely tunable THz-wave parametric oscillator. Furthermore, we applied this technique to the detection and identification of illicit drugs concealed in envelopes. The samples we used were methamphetamine and MDMA, two of the most widely consumed illegal drugs in Japan, and aspirin as a reference.
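Component spatial pattern analysis amounts to per-pixel spectral unmixing: given previously measured reference absorption spectra, each pixel's multispectral absorbance is decomposed into non-negative component abundances. A hedged SciPy sketch with assumed array shapes:

```python
import numpy as np
from scipy.optimize import nnls

def unmix(cube, A):
    """cube: (n_freq, H, W) absorbance images at the probed THz frequencies.
    A: (n_freq, n_chem) reference absorption spectra, one column per chemical.
    Returns (n_chem, H, W) non-negative abundance maps."""
    n_freq, H, W = cube.shape
    maps = np.zeros((A.shape[1], H, W))
    for i in range(H):
        for j in range(W):
            maps[:, i, j], _ = nnls(A, cube[:, i, j])
    return maps
```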
SAS program for quantitative stratigraphic correlation by principal components
Hohn, M.E.
1985-01-01
A SAS program is presented which constructs a composite section of stratigraphic events through principal components analysis. The variables in the analysis are stratigraphic sections and the observational units are range limits of taxa. The program standardizes data in each section, extracts eigenvectors, estimates missing range limits, and computes the composite section from scores of events on the first principal component. An option of several types of diagnostic plots is provided; these help one to determine conservative range limits or unrealistic estimates of missing values. Inspection of the graphs and eigenvalues allows one to evaluate goodness of fit between the composite and measured data. The program is easily extended to the creation of a rank-order composite.
Self organising maps for visualising and modelling
2012-01-01
The paper describes the motivation of SOMs (Self Organising Maps) and how they have become more accessible thanks to the wide availability of modern, more powerful, cost-effective computers. Their advantages compared to Principal Components Analysis and Partial Least Squares are discussed: they can be applied to non-linear data, are less dependent on least-squares solutions and normality of errors, and are less influenced by outliers. In addition, there is a wide variety of intuitive methods for visualisation that allow full use of the map space. Modern problems in analytical chemistry, including applications to cultural heritage studies and to environmental, metabolomic and biological problems, result in complex datasets. Methods for visualising maps are described, including best matching units, hit histograms, unified distance matrices and component planes. Supervised SOMs for classification, including multifactor data and variable selection, are discussed, as is their use in Quality Control. The paper is illustrated using four case studies, namely the Near Infrared spectroscopy of food, the thermal analysis of polymers, metabolomic analysis of saliva using NMR, and on-line HPLC for pharmaceutical process monitoring. PMID:22594434
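A minimal self-organising map in plain NumPy, showing the best-matching-unit update with a shrinking learning rate and neighbourhood (all hyperparameters are illustrative):

```python
import numpy as np

def train_som(X, rows=10, cols=10, iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Move the best-matching unit and its grid neighbours toward each
    sample, shrinking the learning rate and neighbourhood over time."""
    rng = np.random.default_rng(seed)
    W = rng.random((rows, cols, X.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        frac = t / iters
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
        bmu = np.unravel_index(((W - x) ** 2).sum(-1).argmin(), (rows, cols))
        d2 = ((grid - np.array(bmu)) ** 2).sum(-1)       # grid distances
        h = np.exp(-d2 / (2 * sigma ** 2))               # neighbourhood
        W += lr * h[..., None] * (x - W)
    return W

X = np.random.rand(500, 4)   # e.g., preprocessed spectral features
W = train_som(X)             # W[r, c] is the codebook vector of unit (r, c)
```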
NASA Astrophysics Data System (ADS)
Kistenev, Yury V.; Karapuzikov, Alexander I.; Kostyukova, Nadezhda Yu.; Starikova, Marina K.; Boyko, Andrey A.; Bukreeva, Ekaterina B.; Bulanova, Anna A.; Kolker, Dmitry B.; Kuzmin, Dmitry A.; Zenov, Konstantin G.; Karapuzikov, Alexey A.
2015-06-01
An analysis of human exhaled air by means of infrared (IR) laser photoacoustic spectroscopy is presented. Eleven healthy nonsmoking volunteers (control group) and seven patients with chronic obstructive pulmonary disease (COPD, target group) were involved in the study. The principal component analysis method was used to select the most informative ranges of the absorption spectra of patients' exhaled air in terms of the separation of the studied groups. It is shown that the absorption-spectrum profiles of exhaled air in these informative ranges allow COPD patients to be distinguished from the control group.
ERIC Educational Resources Information Center
Fortuin, K. P. J.; van Koppen, C. S. A.; Kroeze, C.
2013-01-01
Professionals in the environmental domain require cognitive interdisciplinary skills to be able to develop sustainable solutions to environmental problems. We demonstrate that education in environmental systems analysis allows for the development of these skills. We identify three components of cognitive interdisciplinary skills: (1) the ability…
The Integration of Psycholinguistic and Discourse Processing Theories of Reading Comprehension.
ERIC Educational Resources Information Center
Beebe, Mona J.
To assess the compatibility of miscue analysis and recall analysis as independent elements in a theory of reading comprehension, a study was performed that operationalized each theory and separated its components into measurable units to allow empirical testing. A cueing strategy model was estimated, but the discourse processing model was broken…
A methodology for the semi-automatic digital image analysis of fragmental impactites
NASA Astrophysics Data System (ADS)
Chanou, A.; Osinski, G. R.; Grieve, R. A. F.
2014-04-01
A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated, and modal abundances can be deduced where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a large number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in the physical characteristics of some examples of fragmental impactites.
Taranik, Maksim; Kopanitsa, Georgy
2017-01-01
The paper presents a decision support system developed for healthcare providers. The system allows healthcare providers to detect and decrease nonconformities in health records and to forecast the sum of insurance payments while taking nonconformities into account. Its components are the ISO 13606 standard, fuzzy logic, and the case-based reasoning concept. Implementation of the system allowed the healthcare provider to increase insurance payments by 10%.
Virtual directions in paleomagnetism: A global and rapid approach to evaluate the NRM components.
NASA Astrophysics Data System (ADS)
Ramón, Maria J.; Pueyo, Emilio L.; Oliva-Urcia, Belén; Larrasoaña, Juan C.
2017-02-01
We introduce a method and software to process demagnetization data for a rapid and integrative estimation of characteristic remanent magnetization (ChRM) components. The virtual directions (VIDI) of a paleomagnetic site are “all” possible directions that can be calculated from a given demagnetization routine of “n” steps (with m being the number of specimens at the site). If the ChRM can be defined for a site, it will be represented in the VIDI set. Directions can be calculated for successive steps using principal component analysis, both anchored to the origin (resultant virtual directions, RVD; m·(n²+n)/2) and not anchored (difference virtual directions, DVD; m·(n²−n)/2). The number of directions per specimen (of order n²) is very large and will enhance all ChRM components, together with noisy regions where two components were fitted together (mixing their unblocking intervals). In the same way, resultant and difference virtual circles (RVC, DVC) are calculated. Virtual directions and circles are a global and objective approach to unravel different natural remanent magnetization (NRM) components for a paleomagnetic site without any assumption. To better constrain the stable components, some filters can be applied, such as setting an upper bound on the MAD, removing samples with anomalous intensities, or requiring a minimum number of demagnetization steps (objective filters), or selecting a given unblocking interval (subjective, but based on expertise). On the other hand, the VPD program also allows the application of standard approaches (classic PCA fitting of directions and circles) and other ancillary methods (stacking routine, linearity spectrum analysis), giving an objective, global and robust picture of the demagnetization structure with minimal assumptions. Application of the VIDI method to natural cases (outcrops in the Pyrenees and u-channel data from a Roman dam infill in northern Spain) and their comparison to other approaches (classic end-point, demagnetization circle analysis, stacking routine and linearity spectrum analysis) allows validation of this technique. VIDI is a global approach and is especially useful for large data sets and rapid estimation of the NRM components.
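As a rough illustration of the virtual-directions idea — not the VPD program itself — the following Python sketch computes the unanchored (difference) virtual directions for a single synthetic specimen by fitting a principal direction to every contiguous interval of demagnetization steps, with a MAD-style quality filter; the data, threshold, and all names are invented for the example.

```python
# Illustrative sketch: difference virtual directions for one specimen via SVD
# fits (not anchored to the origin) over every contiguous step interval.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n = 10                                     # demagnetization steps
# Synthetic demag path: one linear component decaying toward the origin + noise.
direction = np.array([0.5, 0.7, 0.5]); direction /= np.linalg.norm(direction)
steps = np.linspace(1.0, 0.0, n)[:, None] * direction + rng.normal(0, 0.02, (n, 3))

dvd = []
for i, j in combinations(range(n), 2):     # all intervals, of order n^2 total
    if j - i < 2:                          # need >= 3 points for a stable fit
        continue
    seg = steps[i:j + 1]
    centered = seg - seg.mean(axis=0)      # free (unanchored) PCA fit
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    d = vt[0]                              # best-fit direction
    mad = np.degrees(np.arctan2(np.sqrt(s[1]**2 + s[2]**2), s[0]))  # MAD-like
    if mad < 15:                           # an objective quality filter
        dvd.append(d * np.sign(d @ direction))   # resolve sign ambiguity

dvd = np.array(dvd)
mean_dir = dvd.mean(axis=0) / np.linalg.norm(dvd.mean(axis=0))
print(len(dvd), "accepted directions; mean direction:", mean_dir.round(3))
```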
DOE-FG02-00ER62797 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweedler, J.V.
2004-12-01
Specific Aims: The overall goal of this proposal has been to develop and interface a new technology, molecular gates, with microfabricated systems to add an important capability to microfabricated DNA measurement systems. This project specifically focused on demonstrating how molecular gates could be used to capture a single analyte band, among a stream of bands from a separation or a flow injection analysis experiment, and release it for later measurement, thus allowing further manipulations on the selected analyte. Since the original proposal, the molecular gate concept has been greatly expanded to allow the gates to be used as externally controllable intelligent interconnects in multilayer microfluidic networks. We have demonstrated: (1) the ability of the molecular gates to work with a much wider range of biological molecules including DNA, proteins and small metabolites; and (2) the capability of performing an electrophoretic separation and sequestering individual picoliter volume components (or even classes of components) into separate channels for further analysis. Both capabilities will enable characterization of small mass amounts of complex mixtures of DNA, proteins and even small molecules--allowing them to be further separated and chemically characterized.
Analysis of possible designs of processing units with radial plasma flows
NASA Astrophysics Data System (ADS)
Kolesnik, V. V.; Zaitsev, S. V.; Vashilin, V. S.; Limarenko, M. V.; Prochorenkov, D. S.
2018-03-01
Analysis of plasma-ion methods of obtaining thin-film coatings shows that their development has followed the increasing use of sputter deposition processes, which allow one to obtain multicomponent coatings with a varying percentage of particular components. One of the methods that allow multicomponent coatings with virtually any composition of elementary components to be formed is coating deposition using quasi-magnetron sputtering systems [1]. This requires the creation of an axial magnetic field of a defined configuration with a flux density in the range of 0.01-0.1 T [2]. In order to compare and analyze various configurations of processing-unit magnetic systems, it is necessary to obtain the following dependencies: the dependence of the magnetic core cross-section on the input power to the inductors, the distribution of magnetic induction within the equatorial plane in the corresponding sections, and the distribution of the magnetic induction value in the area of the cathode target location.
Integration of passive driver-assistance systems with on-board vehicle systems
NASA Astrophysics Data System (ADS)
Savchenko, V. V.; Poddubko, S. N.
2018-02-01
The implementation in an OIAS of functions such as driver-state monitoring and high-precision calculation of the vehicle's current navigation coordinates, together with the modularity of the OIAS construction and the possible increase in functionality through integration with other onboard systems, gives this approach a promising development future. The development of intelligent transport systems and their components allows fundamentally new tasks to be set and solved for the safety of human-machine transport systems, and the automatic analysis of heterogeneous information flows provides a synergistic effect. The analysis of cross-modal information exchange in human-machine transport systems from a uniform methodological point of view will allow an integrated assessment of the state of the basic components of the human-machine system, and of the dynamics of the changing situation-centered environment (including the external environment) in their interrelations, to be formed in real time with an accuracy acceptable for solving applied problems.
Fernández-Arjona, María del Mar; Grondona, Jesús M.; Granados-Durán, Pablo; Fernández-Llebrez, Pedro; López-Ávalos, María D.
2017-01-01
It is known that microglia morphology and function are closely related, but only a few studies have objectively described different morphological subtypes. To address this issue, morphological parameters of microglial cells were analyzed in a rat model of aseptic neuroinflammation. After the injection of a single dose of the enzyme neuraminidase (NA) within the lateral ventricle (LV), an acute inflammatory process occurs. Sections from NA-injected animals and sham controls were immunolabeled with the microglial marker IBA1, which highlights ramifications and features of the cell shape. Using images obtained by section scanning, individual microglial cells were sampled from various regions (septofimbrial nucleus, hippocampus and hypothalamus) at different times post-injection (2, 4 and 12 h). Each cell yielded a set of 15 morphological parameters by means of image analysis software. Five initial parameters (including fractal measures) were statistically different in cells from NA-injected rats (most of them IL-1β positive, i.e., M1-state) compared to those from control animals (none of them IL-1β positive, i.e., surveillant state). However, additional multimodal parameters proved more suitable for hierarchical cluster analysis (HCA). This method yielded a classification of the microglia population into four clusters. Furthermore, a linear discriminant analysis (LDA) suggested three specific parameters to objectively classify any microglia by a decision tree. In addition, a principal components analysis (PCA) revealed two extra valuable variables that allowed microglia to be further classified into a total of eight sub-clusters or types. The spatio-temporal distribution of these different morphotypes in our rat inflammation model allowed specific morphotypes to be related to microglial activation status and brain location. An objective method for microglia classification based on morphological parameters is proposed. Main points: Microglia undergo a quantifiable morphological change upon neuraminidase-induced inflammation. Hierarchical cluster and principal components analysis allow morphological classification of microglia. Brain location of microglia is a relevant factor. PMID:28848398
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
Estévez, Jorge; Selva, Verónica; Benabent, Mónica; Mangas, Iris; Sogorb, Miguel Ángel; Vilanova, Eugenio
2016-11-25
Some effects of organophosphorus compound (OP) esters cannot be explained through actions on the currently recognized targets acetylcholinesterase or neuropathy target esterase (NTE). In a soluble chicken brain fraction, three components (Eα, Eβ and Eγ) of phenylvalerate esterase activity (PVase) were kinetically discriminated and their relationship with acetylcholine-hydrolyzing activity (cholinesterase activity) was studied in previous works. In this work, four enzymatic components (CS1, CS2, CS3 and CS4) of cholinesterase activity have been discriminated in the soluble fraction, according to their sensitivity to the irreversible inhibitors mipafox, paraoxon, PMSF and iso-OMPA and to the reversible inhibitors ethopropazine and BW284C51. Cholinesterase component CS1 can be related to the Eα component of PVase activity and identified as butyrylcholinesterase (BuChE). No association or similarity could be established between the other PVase components (Eβ and Eγ) and the other cholinesterase components (CS2, CS3, CS4). The kinetic analysis has allowed us to establish a method for discriminating the enzymatic components based on a simple test with two inhibitors. It can be used as a biomarker in toxicological studies and for monitoring these cholinesterase components during isolation and molecular identification processes, which will allow OP toxicity to be understood through a multi-target approach. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method
NASA Astrophysics Data System (ADS)
De Waal, Sybrand A.
1996-07-01
A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in one or the other of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959 summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.
Big-Data RHEED analysis for understanding epitaxial film growth processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P
Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and the wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With the ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of LaxCa1-xMnO3 films grown on etched (001) SrTiO3 substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors and the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big-data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to using forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
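A minimal sketch of this kind of pipeline, assuming synthetic stand-in frames rather than real RHEED data, might look as follows in Python: flatten the image sequence, run PCA, and cluster the component scores with k-means to separate growth regimes.

```python
# Hedged sketch of the statistical pipeline described (not the authors' code).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
t = np.arange(400)                                   # frame index
# Two synthetic spatial patterns whose weights oscillate (layer-by-layer-like)
# and drift (disorder-to-order-like change), plus noise.
p1, p2 = rng.normal(size=(2, 64 * 64))
weights = np.c_[np.sin(2 * np.pi * t / 40), np.clip(t / 200, 0, 1)]
frames = weights @ np.vstack([p1, p2]) + rng.normal(0, 0.5, (400, 64 * 64))

pca = PCA(n_components=5)
scores = pca.fit_transform(frames)                   # time series of components
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

print("explained variance ratios:", pca.explained_variance_ratio_[:2].round(3))
print("cluster sizes (two growth regimes):", np.bincount(labels))
```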
NASA Technical Reports Server (NTRS)
Haber, Benjamin M.; Green, Joseph J.
2010-01-01
The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions that provide geometric orbit-related analysis. This includes propagation of orbits at varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time-series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.
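The geometric core of such a ground-track calculation is compact. Below is an illustrative Python re-implementation of the circular-orbit case (the actual module is a set of MATLAB functions; the constants and the sample orbit are assumptions chosen for the example).

```python
# Circular orbit around a rotating spheroid -> sub-satellite ground track.
import numpy as np

MU = 398600.4418          # km^3/s^2, Earth gravitational parameter
W_EARTH = 7.2921159e-5    # rad/s, Earth rotation rate

def ground_track(alt_km, inc_deg, raan_deg, t):
    """Sub-satellite latitude/longitude (deg) for a circular orbit."""
    a = 6378.137 + alt_km
    n = np.sqrt(MU / a**3)                    # mean motion
    u = n * t                                 # argument of latitude
    inc, raan = np.radians(inc_deg), np.radians(raan_deg)
    # Position direction in ECI: Rz(raan) @ Rx(inc) @ [cos u, sin u, 0]
    x = np.cos(raan) * np.cos(u) - np.sin(raan) * np.cos(inc) * np.sin(u)
    y = np.sin(raan) * np.cos(u) + np.cos(raan) * np.cos(inc) * np.sin(u)
    z = np.sin(inc) * np.sin(u)
    lat = np.degrees(np.arcsin(z))
    lon = np.degrees(np.arctan2(y, x) - W_EARTH * t)   # rotate into ECEF
    return lat, (lon + 180) % 360 - 180                # wrap to [-180, 180)

t = np.arange(0, 2 * 5554, 10.0)              # ~two orbits at 400 km altitude
lat, lon = ground_track(400.0, 51.6, 0.0, t)  # ISS-like inclination
print("latitude extremes:", lat.min().round(1), lat.max().round(1))
```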
Morin, R.H.
1997-01-01
Returns from drilling in unconsolidated cobble and sand aquifers commonly do not identify lithologic changes that may be meaningful for hydrogeologic investigations. Vertical resolution of saturated, Quaternary, coarse braided-stream deposits is significantly improved by interpreting natural gamma (G), epithermal neutron (N), and electromagnetically induced resistivity (IR) logs obtained from wells at the Capital Station site in Boise, Idaho. Interpretation of these geophysical logs is simplified because these sediments are derived largely from high-gamma-producing source rocks (granitics of the Boise River drainage), contain few clays, and have undergone little diagenesis. Analysis of G, N, and IR data from these deposits with principal components analysis provides an objective means to determine if units can be recognized within the braided-stream deposits. In particular, performing principal components analysis on G, N, and IR data from eight wells at Capital Station (1) allows the variable system dimensionality to be reduced from three to two by selecting the two eigenvectors with the greatest variance as axes for principal component scatterplots, (2) generates principal components with interpretable physical meanings, (3) distinguishes sand from cobble-dominated units, and (4) provides a means to distinguish between cobble-dominated units.
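A minimal sketch of this dimensionality reduction, using synthetic G, N, and IR values rather than the Capital Station logs, might look like this in Python:

```python
# Illustrative sketch: reduce three logs (G, N, IR) to two principal
# components and check that two lithologies separate in the score plane.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
# Synthetic depth samples: two lithologies with different mean log responses.
sand   = rng.normal([80, 30, 40], [5, 4, 6], (150, 3))   # G, N, IR
cobble = rng.normal([60, 45, 25], [5, 4, 6], (150, 3))
logs = np.vstack([sand, cobble])
logs = (logs - logs.mean(axis=0)) / logs.std(axis=0)      # standardize

pca = PCA(n_components=2)                # keep the two highest-variance axes
scores = pca.fit_transform(logs)
print("variance retained:", pca.explained_variance_ratio_.sum().round(3))
# Units separate along PC1 if that component captures the lithologic contrast.
print("mean PC1 (sand vs cobble):",
      scores[:150, 0].mean().round(2), scores[150:, 0].mean().round(2))
```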
Fast characterization of cheeses by dynamic headspace-mass spectrometry.
Pérès, Christophe; Denoyer, Christian; Tournayre, Pascal; Berdagué, Jean-Louis
2002-03-15
This study describes a rapid method to characterize cheeses by analysis of their volatile fraction using dynamic headspace-mass spectrometry. Major factors governing the extraction and concentration of the volatile components were first studied. These components were extracted from the headspace of the cheeses in a stream of helium and concentrated on a Tenax TA trap. They were then desorbed by heating and injected directly into the source of a mass spectrometer via a short deactivated silica transfer line. The mass spectra of the mixture of volatile components were considered as fingerprints of the analyzed substances. Forward stepwise factorial discriminant analysis afforded a limited number of characteristic mass fragments that allowed a good classification of the batches of cheeses studied.
NASA Technical Reports Server (NTRS)
1991-01-01
Analytical Design Service Corporation, Ann Arbor, MI, used NASTRAN (a NASA Structural Analysis program that analyzes a design and predicts how parts will perform) in tests of transmissions, engine cooling systems, internal engine parts, and body components. They also use it to design future automobiles. Analytical software can save millions by allowing computer simulated analysis of performance even before prototypes are built.
Mottram, Hazel R; Woodbury, Simon E; Rossell, J Barry; Evershed, Richard P
2003-01-01
Maize oil commands a premium price and is thus a target for adulteration with cheaper vegetable oils. Detection of this activity presents a particular challenge to the analyst because of the natural variability in the fatty acid composition of maize oils and because of their high sterol and tocopherol contents. This paper describes a method that allows detection of adulteration at concentrations of just 5% (m/m), based on the Mahalanobis distances of the principal component scores of the delta(13)C values of major and minor vegetable oil components. The method makes use of a database consisting of delta(13)C values and relative abundances of the major fatty acyl components of over 150 vegetable oils. The sterols and tocopherols of 16 maize oils and 6 potential adulterant oils were found to be depleted in (13)C by a constant amount relative to the bulk oil. Moreover, since maize oil contains particularly high levels of sterols and tocopherols, their delta(13)C values were not significantly altered when groundnut oil was added at up to 20% (m/m), and it is therefore possible to use the values for the minor components to predict the values that would be expected in a pure oil; comparison of the predicted values with those obtained experimentally then allows adulteration to be detected. A refinement involved performing a discriminant analysis on the delta(13)C values of the bulk oil and the major fatty acids (16:0, 18:1 and 18:2) and using the Mahalanobis distances to determine the percentage of adulterant oil present. This approach may be refined further by including the delta(13)C values of the minor components in the discriminant analysis, thereby increasing the sensitivity of the approach to concentrations at which adulteration would not be attractive economically. Copyright 2003 John Wiley & Sons, Ltd.
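The Mahalanobis-distance step described here can be illustrated with a short sketch (synthetic delta(13)C values, not the actual oil database; the shift applied to the "adulterated" sample is invented):

```python
# Hedged sketch: PCA scores of reference delta(13)C data, then flag a test
# sample by its Mahalanobis distance in score space.
import numpy as np
from sklearn.decomposition import PCA
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(4)
# Reference oils: delta(13)C of bulk oil and fatty acids 16:0, 18:1, 18:2.
pure = rng.multivariate_normal([-29.0, -31.0, -30.0, -29.5],
                               0.25 * np.eye(4), size=150)
pca = PCA(n_components=2).fit(pure)
scores = pca.transform(pure)
inv_cov = np.linalg.inv(np.cov(scores, rowvar=False))
center = scores.mean(axis=0)

def distance(sample):
    return mahalanobis(pca.transform(sample[None, :])[0], center, inv_cov)

authentic = pure[0]
adulterated = authentic + np.array([1.2, 1.0, 1.5, 1.1])  # shifted delta(13)C
print("authentic   D_M:", round(distance(authentic), 2))
print("adulterated D_M:", round(distance(adulterated), 2))
```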
Analysis and visualization of single-trial event-related potentials
NASA Technical Reports Server (NTRS)
Jung, T. P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T. J.
2001-01-01
In this study, a linear decomposition technique, independent component analysis (ICA), is applied to single-trial multichannel EEG data from event-related potential (ERP) experiments. Spatial filters derived by ICA blindly separate the input data into a sum of temporally independent and spatially fixed components arising from distinct or overlapping brain or extra-brain sources. Both the data and their decomposition are displayed using a new visualization tool, the "ERP image," that can clearly characterize single-trial variations in the amplitudes and latencies of evoked responses, particularly when sorted by a relevant behavioral or physiological variable. These tools were used to analyze data from a visual selective attention experiment on 28 control subjects plus 22 neurological patients whose EEG records were heavily contaminated with blink and other eye-movement artifacts. Results show that ICA can separate artifactual, stimulus-locked, response-locked, and non-event-related background EEG activities into separate components, a taxonomy not obtained from conventional signal averaging approaches. This method allows: (1) removal of pervasive artifacts of all types from single-trial EEG records, (2) identification and segregation of stimulus- and response-locked EEG components, (3) examination of differences in single-trial responses, and (4) separation of temporally distinct but spatially overlapping EEG oscillatory activities with distinct relationships to task events. The proposed methods also allow the interaction between ERPs and the ongoing EEG to be investigated directly. We studied the between-subject component stability of ICA decomposition of single-trial EEG epochs by clustering components with similar scalp maps and activation power spectra. Components accounting for blinks, eye movements, temporal muscle activity, event-related potentials, and event-modulated alpha activities were largely replicated across subjects. Applying ICA and ERP image visualization to the analysis of sets of single trials from event-related EEG (or MEG) experiments can increase the information available from ERP (or ERF) data. Copyright 2001 Wiley-Liss, Inc.
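A toy version of the artifact-removal workflow, using scikit-learn's FastICA as a stand-in for the infomax ICA used in such studies, might look as follows; the "channels" and sources are synthetic, and the kurtosis-based artifact pick is a simplification:

```python
# Minimal sketch: unmix multichannel "EEG", zero the blink-like component,
# and re-mix the remaining sources.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 10, 2000)
s_neural = np.sin(2 * np.pi * 10 * t)                    # 10 Hz alpha-like source
s_blink = (np.abs((t % 2) - 1) < 0.05).astype(float)     # spiky blink-like source
s_noise = rng.normal(0, 0.3, t.size)
S = np.c_[s_neural, s_blink, s_noise]

A = rng.normal(size=(3, 3))         # unknown mixing (scalp projection) matrix
X = S @ A.T                         # observed "channels"

ica = FastICA(n_components=3, random_state=0)
sources = ica.fit_transform(X)      # estimated independent components

# Identify the blink component by its spikiness (kurtosis proxy) and remove it.
kurt = ((sources - sources.mean(0))**4).mean(0) / sources.var(0)**2
sources[:, np.argmax(kurt)] = 0
X_clean = ica.inverse_transform(sources)
print("channel variance before/after:", X.var(0).round(2), X_clean.var(0).round(2))
```

Note that ICA recovers sources only up to sign and scale, so in practice the artifact component is identified by its scalp map and spectrum rather than by amplitude alone.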
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cobb, G.P.; Braman, R.S.; Gilbert, R.A.
Atmospheric organics were sampled and analyzed by using the carbon hollow tube-gas chromatography method. Chromatograms from spice mixtures, cigarettes, and ambient air were analyzed. Principal factor analysis of row-order chromatographic data produces factors which are eigenchromatograms of the components in the samples. Component sources are identified from the eigenchromatograms in all experiments, and the individual eigenchromatogram corresponding to a particular source is determined in most cases. Organic sources in ambient air and in cigarettes are identified with 87% certainty. Analysis of clove cigarettes allows the determination of the relative amount of clove in different cigarettes. A new nondestructive quality control method using the hollow tube-gas chromatography analysis is discussed.
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
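For readers who want the flavor of the underlying "face space" computation without the GUI, here is a hedged Python sketch (InterFace itself is a MATLAB application; random pixel data stand in for aligned face images):

```python
# Sketch of the PCA "face space" idea the package exposes.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_faces, h, w = 100, 32, 32
faces = rng.normal(size=(n_faces, h * w))        # flattened "images"

pca = PCA(n_components=20).fit(faces)
coords = pca.transform(faces)                    # positions in face space

# Reconstruct a face from its first k components, as a face-space exploration.
k = 10
approx = pca.mean_ + coords[0, :k] @ pca.components_[:k]
err = np.linalg.norm(faces[0] - approx) / np.linalg.norm(faces[0])
print(f"relative reconstruction error with {k} components: {err:.2f}")

# Morph-like interpolation: move halfway between two faces in PCA space.
halfway = pca.inverse_transform((coords[0] + coords[1]) / 2)
print("morphed image vector length:", halfway.shape[0])
```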
Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)
NASA Astrophysics Data System (ADS)
De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.
1993-01-01
The advanced multivariate statistical technique "maximum likelihood common factor analysis (MLCFA)" is shown to be superior to "principal component analysis (PCA)" for decomposing overlapping peaks into their individual component spectra of which neither the number of components nor the peak shape of the component spectra is known. An examination of the maximum resolving power of both techniques, MLCFA and PCA, by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.
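As an approximate illustration of why a noise-modeling factor analysis can outperform PCA on overlapping peaks, the sketch below uses scikit-learn's maximum-likelihood FactorAnalysis as a stand-in (MLCFA as applied in the paper is a related but distinct technique), on two synthetic overlapping Gaussian peaks:

```python
# Two overlapping "Auger" peaks mixed along a synthetic depth profile.
import numpy as np
from sklearn.decomposition import FactorAnalysis, PCA

rng = np.random.default_rng(7)
e = np.linspace(0, 1, 200)                       # energy axis
peak1 = np.exp(-(e - 0.45)**2 / 0.004)           # component spectra
peak2 = np.exp(-(e - 0.55)**2 / 0.004)
conc = rng.uniform(0, 1, (60, 2))                # depth-profile concentrations
spectra = conc @ np.vstack([peak1, peak2]) + rng.normal(0, 0.02, (60, 200))

fa = FactorAnalysis(n_components=2).fit(spectra)
pca = PCA(n_components=2).fit(spectra)
# Both return two components, but FA models a separate noise variance per
# channel explicitly, which is the property common factor analysis exploits
# when resolving overlapping peaks.
print("FA loadings shape:", fa.components_.shape)
print("PCA variance explained:", pca.explained_variance_ratio_.round(3))
```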
A practically unconditionally gradient stable scheme for the N-component Cahn-Hilliard system
NASA Astrophysics Data System (ADS)
Lee, Hyun Geun; Choi, Jeong-Whan; Kim, Junseok
2012-02-01
We present a practically unconditionally gradient stable conservative nonlinear numerical scheme for the N-component Cahn-Hilliard system modeling the phase separation of an N-component mixture. The scheme is based on a nonlinear splitting method and is solved by an efficient and accurate nonlinear multigrid method. The scheme allows us to convert the N-component Cahn-Hilliard system into a system of N-1 binary Cahn-Hilliard equations and significantly reduces the required computer memory and CPU time. We observe that our numerical solutions are consistent with the linear stability analysis results. We also demonstrate the efficiency of the proposed scheme with various numerical experiments.
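The scheme itself relies on nonlinear splitting and multigrid solves, which are beyond a short example, but the underlying equation is easy to demonstrate. The sketch below solves a single binary 1D Cahn-Hilliard equation with a simple semi-implicit Fourier method — a much-simplified stand-in for the authors' scheme, with all parameters chosen only for illustration:

```python
# One binary 1D Cahn-Hilliard equation, semi-implicit spectral time stepping.
import numpy as np

N, length, eps, dt, nsteps = 256, 2 * np.pi, 0.05, 1e-4, 5000
k = 2 * np.pi * np.fft.fftfreq(N, d=length / N)   # wavenumbers
k2, k4 = k**2, k**4

rng = np.random.default_rng(8)
c = 0.1 * rng.standard_normal(N)                  # near-symmetric mixture
mass0 = c.mean()

for _ in range(nsteps):
    # c_t = Laplacian(c^3 - c) - eps^2 * Biharmonic(c): the stiff fourth-order
    # term is treated implicitly, the nonlinear term explicitly.
    nonlin_hat = np.fft.fft(c**3 - c)
    c_hat = (np.fft.fft(c) - dt * k2 * nonlin_hat) / (1 + dt * eps**2 * k4)
    c = np.real(np.fft.ifft(c_hat))

print("mass conserved:", np.isclose(c.mean(), mass0))
print("order-parameter range after separation:", c.min().round(2), c.max().round(2))
```

Mass is conserved because the k = 0 mode is untouched by the update, mirroring the conservative property the paper's scheme maintains for all N components.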
Faster tissue interface analysis from Raman microscopy images using compressed factorisation
NASA Astrophysics Data System (ADS)
Palmer, Andrew D.; Bannerman, Alistair; Grover, Liam; Styles, Iain B.
2013-06-01
The structure of an artificial ligament was examined using Raman microscopy in combination with novel data analysis. Basis approximation and compressed principal component analysis are shown to provide efficient compression of confocal Raman microscopy images, alongside powerful methods for unsupervised analysis. This scheme allows the acceleration of data-mining methods such as principal component analysis, as they can be performed on the compressed data representation, decreasing the factorisation time of a single image from five minutes to under a second. Using this workflow, the interface region between a chemically engineered ligament construct and a bone-mimic anchor was examined. Natural ligament contains a striated interface between the bone and tissue that provides improved mechanical load tolerance; a similar interface was found in the ligament construct.
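The compress-then-factorize idea can be demonstrated with a generic basis projection (a Gaussian basis here; the paper's basis approximation may differ), timing PCA on the raw matrix against PCA on the fitted coefficients:

```python
# Fit each spectrum with a small basis, run PCA on the coefficients, and map
# the loadings back to the spectral axis.
import numpy as np
from time import perf_counter
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
wn = np.linspace(0, 1, 1500)                         # "wavenumber" axis
basis = np.exp(-(wn[None, :] - np.linspace(0, 1, 40)[:, None])**2 / 0.002)
coeffs_true = rng.uniform(0, 1, (5000, 40))
cube = coeffs_true @ basis + rng.normal(0, 0.01, (5000, 1500))  # pixels x channels

t0 = perf_counter()
PCA(n_components=5).fit(cube)                        # factorize raw data
t_raw = perf_counter() - t0

t0 = perf_counter()
coeffs, *_ = np.linalg.lstsq(basis.T, cube.T, rcond=None)  # compress: 1500 -> 40
pca_c = PCA(n_components=5).fit(coeffs.T)            # factorize coefficients
loadings = pca_c.components_ @ basis                 # map back to spectral axis
t_comp = perf_counter() - t0

print(f"raw PCA: {t_raw:.2f}s   compressed PCA: {t_comp:.2f}s")
```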
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, compared to standard principal component analysis (PCA), generates a modified set of components with sparse loadings; this is used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
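A hedged sketch of the fault-detection step — standard scikit-learn SparsePCA plus a simplified Hotelling T2 on the scores, not the authors' modified SPCA — could look as follows, with rows standing in for scans and columns for the ~88 metrics:

```python
# SparsePCA scores + a diagonal-covariance Hotelling T^2 approximation.
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(10)
normal_scans = rng.normal(0, 1, (200, 88))
faulty_scan = rng.normal(0, 1, 88) + 4 * (rng.uniform(size=88) < 0.1)

spca = SparsePCA(n_components=5, alpha=1.0, random_state=0).fit(normal_scans)
scores = spca.transform(normal_scans)
mean, var = scores.mean(axis=0), scores.var(axis=0)

def t2(x):
    # Hotelling T^2 with a diagonal score covariance (a simplification;
    # sparse component scores are not exactly orthogonal).
    s = spca.transform(x[None, :])[0]
    return np.sum((s - mean)**2 / var)

limit = np.percentile([t2(x) for x in normal_scans], 99)  # empirical 99% limit
print(f"T2 of faulty scan: {t2(faulty_scan):.1f}   99% limit: {limit:.1f}")
```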
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
Periodic response of nonlinear systems
NASA Technical Reports Server (NTRS)
Nataraj, C.; Nelson, H. D.
1988-01-01
A procedure is developed to determine approximate periodic solutions of autonomous and non-autonomous systems. The trigonometric collocation method (TCM) is formalized to allow for the analysis of relatively small order systems directly in physical coordinates. The TCM is extended to large order systems by utilizing modal analysis in a component mode synthesis strategy. The procedure was coded and verified by several check cases. Numerical results for two small order mechanical systems and one large order rotor dynamic system are presented. The method allows for the possibility of approximating periodic responses for large order forced and self-excited nonlinear systems.
Clinical Growth: An Evolutionary Concept Analysis.
Barkimer, Jessica
2016-01-01
Clinical growth is an essential component of nursing education, although challenging to evaluate. Considering the paradigm shift toward constructivism and student-centered learning, clinical growth requires an examination within contemporary practices. A concept analysis of clinical growth in nursing education produced defining attributes, antecedents, and consequences. Attributes included higher-level thinking, socialization, skill development, self-reflection, self-investment, interpersonal communication, and linking theory to practice. Identification of critical attributes allows educators to adapt to student-centered learning in the clinical environment. These findings allow educators to determine significant research questions, develop situation-specific theories, and identify strategies to enhance student learning in the clinical environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Fuchs, Marcus; Nouidui, Thierry
This paper discusses design decisions for exporting Modelica thermofluid flow components as Functional Mockup Units. The purpose is to provide guidelines that will allow building energy simulation programs and HVAC equipment manufacturers to effectively use FMUs for modeling of HVAC components and systems. We provide an analysis for direct input-output dependencies of such components and discuss how these dependencies can lead to algebraic loops that are formed when connecting thermofluid flow components. Based on this analysis, we provide recommendations that increase the computing efficiency of such components and systems that are formed by connecting multiple components. We explain what code optimizations are lost when providing thermofluid flow components as FMUs rather than Modelica code. We present an implementation of a package for FMU export of such components, explain the rationale for selecting the connector variables of the FMUs and finally provide computing benchmarks for different design choices. It turns out that selecting temperature rather than specific enthalpy as input and output signals does not lead to a measurable increase in computing time, but selecting nine small FMUs rather than a large FMU increases computing time by 70%.
Optical sensor technology for a noninvasive continuous monitoring of blood components
NASA Astrophysics Data System (ADS)
Kraitl, Jens; Timm, Ulrich; Lewis, Elfed; Ewald, Hartmut
2010-02-01
NIR spectroscopy and photoplethysmography (PPG) are used for the measurement of blood components. The absorption coefficient of blood differs at different wavelengths. This fact is used to calculate the optical absorption characteristics of blood, which yield information about blood components like hemoglobin (Hb), carboxyhemoglobin (CoHb) and arterial oxygen saturation (SpO2). The measured PPG time signals and the ratio between the peak-to-peak pulse amplitudes are used for the measurement of these parameters. Hemoglobin is the main component of red blood cells. The primary function of Hb is the transport of oxygen from the lungs to the tissue and of carbon dioxide back to the lungs. The Hb concentration in human blood is an important parameter in evaluating the physiological status of an individual and an essential parameter in every blood count. Currently, invasive methods are used to measure the Hb concentration, whereby blood is taken from the patient and subsequently analyzed. Apart from the discomfort of drawing blood samples, an added disadvantage of this method is the delay between the blood collection and its analysis, which does not allow real-time patient monitoring in critical situations. A noninvasive method allows pain-free continuous on-line patient monitoring with minimum risk of infection and facilitates real-time data monitoring, allowing immediate clinical reaction to the measured data.
Pitcher, Alex; Emberson, Jonathan; Lacro, Ronald V.; Sleeper, Lynn A.; Stylianou, Mario; Mahony, Lynn; Pearson, Gail D.; Groenink, Maarten; Mulder, Barbara J.; Zwinderman, Aeilko H.; De Backer, Julie; De Paepe, Anne M.; Arbustini, Eloisa; Erdem, Guliz; Jin, Xu Yu; Flather, Marcus D.; Mullen, Michael J.; Child, Anne H.; Forteza, Alberto; Evangelista, Arturo; Chiu, Hsin-Hui; Wu, Mei-Hwan; Sandor, George; Bhatt, Ami B.; Creager, Mark A.; Devereux, Richard B.; Loeys, Bart; Forfar, J. Colin; Neubauer, Stefan; Watkins, Hugh; Boileau, Catherine; Jondeau, Guillaume; Dietz, Harry C.; Baigent, Colin
2015-01-01
Rationale A number of randomized trials are underway, which will address the effects of angiotensin receptor blockers (ARBs) on aortic root enlargement and a range of other end points in patients with Marfan syndrome. If individual participant data from these trials were to be combined, a meta-analysis of the resulting data, totaling approximately 2,300 patients, would allow estimation across a number of trials of the treatment effects both of ARB therapy and of β-blockade. Such an analysis would also allow estimation of treatment effects in particular subgroups of patients on a range of end points of interest and would allow a more powerful estimate of the effects of these treatments on a composite end point of several clinical outcomes than would be available from any individual trial. Design A prospective, collaborative meta-analysis based on individual patient data from all randomized trials in Marfan syndrome of (i) ARBs versus placebo (or open-label control) and (ii) ARBs versus β-blockers will be performed. A prospective study design, in which the principal hypotheses, trial eligibility criteria, analyses, and methods are specified in advance of the unblinding of the component trials, will help to limit bias owing to data-dependent emphasis on the results of particular trials. The use of individual patient data will allow for analysis of the effects of ARBs in particular patient subgroups and for time-to-event analysis for clinical outcomes. The meta-analysis protocol summarized in this report was written on behalf of the Marfan Treatment Trialists' Collaboration and finalized in late 2012, without foreknowledge of the results of any component trial, and will be made available online (http://www.ctsu.ox.ac.uk/research/meta-trials). PMID:25965707
NASA Astrophysics Data System (ADS)
Prete, Antonio Del; Franchi, Rodolfo; Antermite, Fabrizio; Donatiello, Iolanda
2018-05-01
Residual stresses appear in a component as a consequence of thermo-mechanical processes (e.g. the ring rolling process), casting, and heat treatments. When machining these kinds of components, distortions arise due to the redistribution of residual stresses resulting from the foregoing process history inside the material. If distortions are excessive, they can lead to a large number of scrap parts. Since dimensional accuracy can directly affect engine efficiency, dimensional control for aerospace components is a non-trivial issue. In this paper, the problem of distortions in large thin-walled aero-engine components made of nickel superalloys has been addressed. In order to estimate distortions of inner diameters after internal turning operations, a 3D Finite Element Method (FEM) analysis has been developed for a real industrial test case. The entire process history has been taken into account by developing FEM models of the ring rolling process and heat treatments. Three different strategies for the ring rolling process have been studied, and the combination of related parameters that yields the best dimensional accuracy has been found. Furthermore, grain size evolution and recrystallization phenomena during the manufacturing process have been numerically investigated using a semi-empirical Johnson-Mehl-Avrami-Kolmogorov (JMAK) model. The volume subtractions have been simulated by Boolean trimming: one-step and multi-step analyses have been performed. The multi-step procedure allowed the selection of the best material-removal sequence in order to reduce machining distortions.
Stashenko, Elena E; Martínez, Jairo R; Ruíz, Carlos A; Arias, Ginna; Durán, Camilo; Salgar, William; Cala, Mónica
2010-01-01
Chromatographic (GC/flame ionization detection, GC/MS) and statistical analyses were applied to the study of essential oils and extracts obtained from flowers, leaves, and stems of Lippia origanoides plants, growing wild in different Colombian regions. Retention indices, mass spectra, and standard substances were used in the identification of 139 substances detected in these essential oils and extracts. Principal component analysis allowed L. origanoides classification into three chemotypes, characterized according to their essential oil major components. Alpha- and beta-phellandrenes, p-cymene, and limonene distinguished chemotype A; carvacrol and thymol were the distinctive major components of chemotypes B and C, respectively. Pinocembrin (5,7-dihydroxyflavanone) was found in L. origanoides chemotype A supercritical fluid (CO(2)) extract at a concentration of 0.83+/-0.03 mg/g of dry plant material, which makes this plant an interesting source of an important bioactive flavanone with diverse potential applications in cosmetic, food, and pharmaceutical products.
NASA Astrophysics Data System (ADS)
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2014-12-01
A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while maintaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing independence on the components. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise) and study the ability of the algorithm to recover the original (known) sources of deformation. Second, we apply vbICA to different tectonically active scenarios, such as earthquakes in central and northern Italy, as well as the study of slow slip events in Cascadia.
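A stripped-down version of this blind-source-separation setting can be sketched with FastICA standing in for vbICA (vbICA's mixture-of-Gaussians source model is not reproduced here); the seasonal and slow-slip-like sources, station responses, and noise level are all synthetic:

```python
# Synthetic "GPS network": seasonal + transient sources mixed across stations.
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(11)
t = np.arange(1000) / 365.25                        # time in years
seasonal = np.sin(2 * np.pi * t)                    # annual signal
transient = 1 / (1 + np.exp(-(t - 1.5) * 20))       # slow-slip-like step
S = np.c_[seasonal, transient]

A = rng.normal(size=(30, 2))                        # station responses
X = S @ A.T + rng.normal(0, 0.1, (1000, 30))        # 30 stations, with noise

ica = FastICA(n_components=2, random_state=0)
S_ica = ica.fit_transform(X)
S_pca = PCA(n_components=2).fit_transform(X)

# Correlate recovered components with the true transient (sign/scale arbitrary).
corr = lambda a, b: abs(np.corrcoef(a, b)[0, 1])
print("ICA match:", max(corr(S_ica[:, i], transient) for i in range(2)).round(3))
print("PCA match:", max(corr(S_pca[:, i], transient) for i in range(2)).round(3))
```

Because the PCA axes are chosen for variance rather than independence, they generally mix the two physical sources, whereas ICA can recover each one separately — the point the abstract makes about the BSS problem.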
Development and analysis of new type microresonator with electro-optic feedback
NASA Astrophysics Data System (ADS)
Janusas, Giedrius; Palevicius, Arvydas; Cekas, Elingas; Brunius, Alfredas; Bauce, Jokubas
2016-04-01
Micro-resonators are fundamental components integrated in a host of MEMS applications: safety and stability systems, biometric sensors, switches, mechanical filters, micro-mirror devices, material characterization, gyroscopes, etc. A constituent part of the micro-resonator is a diffractive optical element (DOE). Different methods and materials are used to produce diffraction gratings for DOEs. Two-dimensional or three-dimensional periodic structures with micrometer-scale periods are widely used in microsystems and their components. They can be used as elements for micro-scale synthesis, processing, and analysis of chemical and biological samples. The micro-resonator itself was designed using a composite piezoelectric material. In cases where microscopes, vibrometers, or other direct measurement methods are destructive and can hardly be employed for in-situ analysis, indirect measurement of the electrical signal generated by the composite piezoelectric layer allows natural frequency changes to be measured. The piezoelectric layer also allows the creation of a novel micro-resonator with controllable parameters, which could assure much higher functionality of micro-electromechanical systems. A novel micro-resonator for pollution detection is proposed. A mathematical model of the micro-resonator and its dynamical, electrical and optical characteristics are presented.
Status of the calibration and alignment framework at the Belle II experiment
NASA Astrophysics Data System (ADS)
Dossett, D.; Sevior, M.; Ritter, M.; Kuhr, T.; Bilka, T.; Yaschenko, S.;
2017-10-01
The Belle II detector at the SuperKEKB e+e- collider plans to take first collision data in 2018. The monetary and CPU-time costs associated with storing and processing the data mean that it is crucial for the detector components at Belle II to be calibrated quickly and accurately. A fast and accurate calibration system would allow the high level trigger to increase the efficiency of event selection, and can give users analysis-quality reconstruction promptly. A flexible framework to automate the fast production of calibration constants is being developed in the Belle II Analysis Software Framework (basf2). Detector experts only need to create two components from C++ base classes in order to use the automation system. The first collects data from Belle II event data files and outputs much smaller files to pass to the second component. This runs the main calibration algorithm to produce calibration constants ready for upload into the conditions database. A Python framework coordinates the input files, the order of processing, and the submission of jobs. Splitting the operation into collection and algorithm-processing stages allows the framework to optionally parallelize the collection stage on a batch system.
NASA Technical Reports Server (NTRS)
Kenyon, Scott J.; Mikolajewska, Joanna; Mikolajewski, Maciej; Polidan, Ronald S.; Slovak, Mark H.
1993-01-01
We present an analysis of new and existing photometric and spectroscopic observations of the ongoing eruption in the symbiotic star AG Pegasi, showing that this binary has evolved considerably since the turn of the century. Recent dramatic changes in both the UV continuum and the wind from the hot component allow a more detailed analysis than in previous papers. AG Peg is composed of a normal M3 giant and a hot, compact star embedded in a dense, ionized nebula. The hot component powers the activity observed in this system, including a dense wind and a photoionized region within the outer atmosphere of the red giant. The hot component contracted in radius at roughly constant luminosity from 1850 to 1985. Its bolometric luminosity declined by a factor of about 4 during the past 5 yr. Both the mass loss rate from the hot component and the emission activity decreased in step with the hot component's total luminosity, while photospheric radiation from the red giant companion remained essentially constant.
Optical analysis of thermal induced structural distortions
NASA Technical Reports Server (NTRS)
Weinswig, Shepard; Hookman, Robert A.
1991-01-01
The techniques used for the analysis of thermally induced structural distortions of optical components such as scanning mirrors and telescope optics are outlined. Particular attention is given to the methodology used in the thermal and structural analysis of the GOES scan mirror, the optical analysis using Zernike coefficients, and the optical system performance evaluation. It is pointed out that the use of Zernike coefficients allows an accurate, effective, and simple linkage between thermal/mechanical effects and the optical design.
Sample extraction is one of the most important steps in arsenic speciation analysis of solid dietary samples. One of the problem areas in this analysis is the partial extraction of arsenicals from seafood samples. The partial extraction allows the toxicity of the extracted arse...
Discrimination among Panax species using spectral fingerprinting
USDA-ARS?s Scientific Manuscript database
Spectral fingerprints of samples of three Panax species (P. quinquefolius L., P. ginseng, and P. notoginseng) were acquired using UV, NIR, and MS spectrometry. With principal components analysis (PCA), all three methods allowed visual discrimination between all three species. All three methods wer...
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategy analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow one i) to determine natural groups or clusters of control strategies with similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both the analysis and the interpretation of complex multicriteria data sets and allows an improved use of information for the effective evaluation of control strategies.
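An illustrative Python pipeline combining the three techniques on a synthetic evaluation matrix (rows as control strategies, columns as evaluation criteria; not the BSM2 data) might read:

```python
# CA -> PCA -> DA on a strategies-by-criteria evaluation matrix.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(12)
# Two families of strategies with different effluent-quality/energy trade-offs.
group_a = rng.normal([1.0, 0.2, 0.5, 0.8], 0.1, (15, 4))
group_b = rng.normal([0.4, 0.9, 0.6, 0.3], 0.1, (15, 4))
X = np.vstack([group_a, group_b])

clusters = AgglomerativeClustering(n_clusters=2).fit_predict(X)   # CA
scores = PCA(n_components=2).fit_transform(X)                     # PCA/FA step
lda = LinearDiscriminantAnalysis().fit(X, clusters)               # DA

# |coefficients| rank the criteria that best discriminate the clusters.
print("cluster sizes:", np.bincount(clusters))
print("discriminant weights per criterion:", np.abs(lda.coef_[0]).round(2))
```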
NASA Astrophysics Data System (ADS)
van Westen, Cees; Bakker, Wim; Zhang, Kaixi; Jäger, Stefan; Assmann, Andre; Kass, Steve; Andrejchenko, Vera; Olyazadeh, Roya; Berlin, Julian; Cristal, Irina
2014-05-01
Within the framework of the EU FP7 Marie Curie Project CHANGES (www.changes-itn.eu) and the EU FP7 Copernicus project INCREO (http://www.increo-fp7.eu), a spatial decision support system is under development with the aim to analyse the effect of risk reduction planning alternatives on reducing the risk now and in the future, and to support decision makers in selecting the best alternatives. The Spatial Decision Support System will be composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if the measure is implemented, and provides a cost-benefit (or cost-effectiveness/spatial multi-criteria evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows users to define future scenarios in terms of climate change, land use change and population change, and the time periods for which these scenarios will be made. The component does not generate these scenarios but uses input maps for the effect of the scenarios on the hazard and assets maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps but also in other forms (risk curves, tables, graphs). The envisaged users of the platform are organizations involved in the planning of risk reduction measures that have staff capable of visualizing and analysing spatial data at a municipal scale.
NASA Technical Reports Server (NTRS)
Metscher, Jonathan F.; Lewandowski, Edward J.
2013-01-01
A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
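Because the steady-state signals are nearly sinusoidal, each force or voltage reduces to a complex phasor at the operating frequency. A minimal sketch of that extraction; the frequency, amplitudes, and phases are invented for illustration and this is not the Sage/MATLAB implementation:

```python
import numpy as np

def phasor(signal, t, f0):
    """Complex phasor of a steady-state sinusoid, obtained by correlating
    with e^{-j 2*pi*f0*t} over an integer number of cycles."""
    ref = np.exp(-2j * np.pi * f0 * t)
    return 2 * np.mean(signal * ref)   # == A*e^{j*phi} for A*cos(2*pi*f0*t + phi)

f0 = 100.0                                   # operating frequency, Hz (assumed)
t = np.arange(0, 1.0, 1e-4)                  # exactly 100 cycles
piston_vel = 1.5 * np.cos(2 * np.pi * f0 * t + 0.3)
alt_voltage = 48.0 * np.cos(2 * np.pi * f0 * t - 0.9)

for name, sig in [("velocity", piston_vel), ("voltage", alt_voltage)]:
    p = phasor(sig, t, f0)
    print(name, abs(p), np.angle(p))         # amplitude and phase for the diagram
```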
NASA Astrophysics Data System (ADS)
Moretti, Massimo; Tropeano, Marcello; Loon, A. J. (Tom) van; Acquafredda, Pasquale; Baldacconi, Rossella; Festa, Vincenzo; Lisco, Stefania; Mastronuzzi, Giuseppe; Moretti, Vincenzo; Scotti, Rosa
2016-06-01
Beach sands from the Rosa Marina locality (Adriatic coast, southern Italy) were analysed mainly microscopically in order to trace the source areas of their lithoclastic and bioclastic components. The main cropping-out sedimentary units were also studied with the objective of identifying the potential source areas of lithoclasts. This allowed us to establish how the various rock units contribute to the formation of beach sands. The analysis of the bioclastic components allows us to estimate the actual role of organisms in the supply of this material to the beach. Identification of taxa that are present in the beach sands as shell fragments or other remains was carried out at the genus or family level. Ecological investigation of the same beach and the recognition of sub-environments (mainly distinguished on the basis of the nature of the substrate and of the water depth) was the key topic that allowed us to establish the actual source areas of bioclasts in the Rosa Marina beach sands. The sedimentological analysis (including a physical study of the beach and the calculation of some statistical parameters concerning the grain-size curves) shows that the Rosa Marina beach is nowadays subject to erosion.
Life Predicted in a Probabilistic Design Space for Brittle Materials With Transient Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Palfi, Tamas; Reh, Stefan
2005-01-01
Analytical techniques have progressively become more sophisticated, and now we can consider the probabilistic nature of the entire space of random input variables on the lifetime reliability of brittle structures. This was demonstrated with NASA's CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code combined with the commercially available ANSYS/Probabilistic Design System (ANSYS/PDS), a probabilistic analysis tool that is an integral part of the ANSYS finite-element analysis program. ANSYS/PDS allows probabilistic loads, component geometry, and material properties to be considered in the finite-element analysis. CARES/Life predicts the time-dependent probability of failure of brittle material structures under generalized thermomechanical loading, such as that found in a turbine engine hot-section. Glenn researchers coupled ANSYS/PDS with CARES/Life to assess the effects of the stochastic variables of component geometry, loading, and material properties on the predicted life of the component for fully transient thermomechanical loading and cyclic loading.
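A toy Monte Carlo sketch of the underlying idea: sample the stochastic inputs and evaluate a two-parameter Weibull failure probability. The distributions and Weibull parameters are assumptions for illustration, not CARES/Life or ANSYS/PDS internals:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# stochastic inputs (all distributions assumed for illustration)
load = rng.normal(500.0, 40.0, n)            # applied load, N
area = rng.normal(1e-4, 5e-6, n)             # cross-section, m^2
m, sigma0 = 10.0, 9.0e6                      # Weibull modulus and scale, Pa

stress = load / area
p_fail = 1.0 - np.exp(-(stress / sigma0) ** m)   # 2-parameter Weibull
print("mean failure probability:", p_fail.mean())
```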
Generation mechanisms of fundamental rogue wave spatial-temporal structure.
Ling, Liming; Zhao, Li-Chen; Yang, Zhan-Ying; Guo, Boling
2017-08-01
We discuss the generation mechanism of fundamental rogue wave structures in N-component coupled systems, based on analytical solutions of the nonlinear Schrödinger equation and modulational instability analysis. Our analysis discloses that the pattern of a fundamental rogue wave is determined by the evolution energy and growth rate of the resonant perturbation that is responsible for forming the rogue wave. This finding allows one to predict the rogue wave pattern without the need to solve the N-component coupled nonlinear Schrödinger equation. Furthermore, our results show that N-component coupled nonlinear Schrödinger systems may possess at most N different fundamental rogue wave patterns. These results can be extended to evaluate the type and number of fundamental rogue wave structures in other coupled nonlinear systems.
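For reference, such analyses typically start from the focusing N-component coupled nonlinear Schrödinger (Manakov-type) system, written here in a standard dimensionless form; the paper's exact normalization may differ:

```latex
i\,\frac{\partial \psi_j}{\partial t}
  + \frac{1}{2}\,\frac{\partial^2 \psi_j}{\partial x^2}
  + \Big(\sum_{k=1}^{N} |\psi_k|^2\Big)\,\psi_j = 0,
\qquad j = 1,\dots,N .
```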
Neuroforecasting Aggregate Choice
Knutson, Brian; Genevsky, Alexander
2018-01-01
Advances in brain-imaging design and analysis have allowed investigators to use neural activity to predict individual choice, while emerging Internet markets have opened up new opportunities for forecasting aggregate choice. Here, we review emerging research that bridges these levels of analysis by attempting to use group neural activity to forecast aggregate choice. A survey of initial findings suggests that components of group neural activity might forecast aggregate choice, in some cases even beyond traditional behavioral measures. In addition to demonstrating the plausibility of neuroforecasting, these findings raise the possibility that not all neural processes that predict individual choice forecast aggregate choice to the same degree. We propose that although integrative choice components may confer more consistency within individuals, affective choice components may generalize more broadly across individuals to forecast aggregate choice. PMID:29706726
Nguyen, Phuong H
2007-05-15
Principal component analysis is a powerful method for projecting the multidimensional conformational space of peptides or proteins onto lower-dimensional subspaces in which the main conformations are present, making it easier to reveal the structures of molecules from, e.g., molecular dynamics simulation trajectories. However, the identification of all conformational states is still difficult if the subspaces consist of more than two dimensions. This is mainly due to the fact that the principal components are not independent of one another, and states in the subspaces cannot be visualized. In this work, we propose a simple and fast scheme that allows one to obtain all conformational states in the subspaces. The basic idea is that instead of directly identifying the states in the subspace spanned by principal components, we first transform this subspace into another subspace formed by components that are independent of one another. These independent components are obtained from the principal components by employing the independent component analysis method. Because of the independence between components, all states in this new subspace are defined as all possible combinations of the states obtained from each single independent component. This makes the conformational analysis much simpler. We test the performance of the method by analyzing the conformations of the glycine tripeptide and the alanine hexapeptide. The analyses show that our method is simple and quickly reveals all conformational states in the subspaces. The folding pathways between the identified states of the alanine hexapeptide are analyzed and discussed in some detail. © 2007 Wiley-Liss, Inc.
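A minimal sketch of the proposed two-step projection (PCA followed by ICA on the retained components), using scikit-learn on placeholder trajectory data rather than the actual glycine/alanine trajectories:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# X: trajectory frames x internal coordinates (e.g. dihedral angles);
# placeholder data standing in for an MD trajectory
rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 20))

pcs = PCA(n_components=3).fit_transform(X)       # low-dimensional subspace
ics = FastICA(n_components=3, random_state=0).fit_transform(pcs)

# states along each independent axis can now be found one axis at a time,
# e.g. by locating minima between histogram peaks
hist, edges = np.histogram(ics[:, 0], bins=50)
```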
Leucocyte classification for leukaemia detection using image processing techniques.
Putzu, Lorenzo; Caocci, Giovanni; Di Ruberto, Cecilia
2014-11-01
The counting and classification of blood cells allow for the evaluation and diagnosis of a vast number of diseases. The analysis of white blood cells (WBCs) allows for the detection of acute lymphoblastic leukaemia (ALL), a blood cancer that can be fatal if left untreated. Currently, the morphological analysis of blood cells is performed manually by skilled operators. However, this method has numerous drawbacks, such as slow analysis, non-standard accuracy, and dependence on the operator's skill. Few examples of automated systems that can analyse and classify blood cells have been reported in the literature, and most of these systems are only partially developed. This paper presents a complete and fully automated method for WBC identification and classification using microscopic images. In contrast to other approaches that identify the nuclei first, which are more prominent than other components, the proposed approach isolates the whole leucocyte and then separates the nucleus and cytoplasm. This approach is necessary to analyse each cell component in detail. From each cell component, different features, such as shape, colour and texture, are extracted using a new approach for background pixel removal. This feature set was used to train different classification models in order to determine which one is most suitable for the detection of leukaemia. Using our method, 245 of 267 total leucocytes were properly identified (92% accuracy) from 33 images taken with the same camera and under the same lighting conditions. Performing this evaluation using different classification models allowed us to establish that the support vector machine with a Gaussian radial basis kernel is the most suitable model for the identification of ALL, with an accuracy of 93% and a sensitivity of 98%. Furthermore, we evaluated the goodness of our new feature set, which displayed better performance with each evaluated classification model. The proposed method permits the analysis of blood cells automatically via image processing techniques, and it represents a medical tool to avoid the numerous drawbacks associated with manual observation. This process could also be used for counting, as it provides excellent performance and allows for early diagnostic suspicion, which can then be confirmed by a haematologist through specialised techniques. Copyright © 2014 Elsevier B.V. All rights reserved.
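A hedged sketch of the final classification step, standing in for the paper's pipeline with a scikit-learn SVM using a Gaussian RBF kernel on a placeholder feature matrix (the real features are the extracted shape, colour and texture descriptors):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# features: shape/colour/texture descriptors per leucocyte; labels: 1 = ALL
rng = np.random.default_rng(7)
features = rng.normal(size=(245, 30))            # placeholder feature set
labels = rng.integers(0, 2, 245)                 # placeholder labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale", C=1.0))
print(cross_val_score(clf, features, labels, cv=5).mean())
```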
Using dynamic mode decomposition for real-time background/foreground separation in video
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutz, Jose Nathan; Grosek, Jacob; Brunton, Steven
The technique of dynamic mode decomposition (DMD) is disclosed herein for the purpose of robustly separating video frames into background (low-rank) and foreground (sparse) components in real-time. Foreground/background separation is achieved at the computational cost of just one singular value decomposition (SVD) and one linear equation solve, thus producing results orders of magnitude faster than robust principal component analysis (RPCA). Additional techniques, including techniques for analyzing the video for multi-resolution time-scale components, and techniques for reusing computations to allow processing of streaming video in real time, are also described herein.
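A compact sketch of the disclosed idea, using exact DMD and taking the mode with near-zero continuous-time frequency as the low-rank background; the streaming and multi-resolution refinements described in the entry are omitted:

```python
import numpy as np

def dmd_background(frames, dt=1.0):
    """Split a video (pixels x time, float array) into a low-rank background
    and a sparse foreground using exact DMD."""
    X1, X2 = frames[:, :-1], frames[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    Atilde = U.conj().T @ X2 @ Vh.conj().T / s    # reduced operator
    lam, W = np.linalg.eig(Atilde)
    Phi = X2 @ Vh.conj().T / s @ W                # DMD modes
    omega = np.log(lam) / dt                      # continuous-time frequencies
    b = np.linalg.lstsq(Phi, frames[:, 0].astype(complex), rcond=None)[0]
    k = np.argmin(np.abs(omega))                  # near-zero frequency = background
    t = np.arange(frames.shape[1]) * dt
    background = np.outer(Phi[:, k], b[k] * np.exp(omega[k] * t)).real
    return background, frames - background        # foreground = sparse residual
```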
Transient Reliability Analysis Capability Developed for CARES/Life
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
2001-01-01
The CARES/Life software developed at the NASA Glenn Research Center provides a general-purpose design tool that predicts the probability of the failure of a ceramic component as a function of its time in service. This award-winning software has been widely used by U.S. industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, and graphite) structures in a wide variety of 21st century applications. Present capabilities of the NASA CARES/Life code include probabilistic life prediction of ceramic components subjected to fast fracture, slow crack growth (stress corrosion), and cyclic fatigue failure modes. Currently, this code can compute the time-dependent reliability of ceramic structures subjected to simple time-dependent loading. For example, in slow crack growth failure conditions CARES/Life can handle sustained and linearly increasing time-dependent loads, whereas in cyclic fatigue applications various types of repetitive constant-amplitude loads can be accounted for. However, in real applications applied loads are rarely that simple but vary with time in more complex ways, such as engine startup, shutdown, and dynamic and vibrational loads. In addition, when a given component is subjected to transient environmental and/or thermal conditions, the material properties also vary with time. A methodology has now been developed to allow the CARES/Life computer code to perform reliability analysis of ceramic components undergoing transient thermal and mechanical loading. This means that CARES/Life will be able to analyze finite element models of ceramic components that simulate dynamic engine operating conditions. The methodology developed is generalized to account for material property variation (in strength distribution and fatigue) as a function of temperature. This allows CARES/Life to analyze components undergoing rapid temperature change, in other words, components undergoing thermal shock. In addition, the capability has been developed to perform reliability analysis for components that undergo proof testing involving transient loads. This methodology was developed for environmentally assisted crack growth (crack growth as a function of time and loading), but it will be extended to account for cyclic fatigue (crack growth as a function of load cycles) as well.
Creation of a virtual cutaneous tissue bank
NASA Astrophysics Data System (ADS)
LaFramboise, William A.; Shah, Sujal; Hoy, R. W.; Letbetter, D.; Petrosko, P.; Vennare, R.; Johnson, Peter C.
2000-04-01
Cellular and non-cellular constituents of skin contain fundamental morphometric features and structural patterns that correlate with tissue function. High-resolution digital image acquisition is performed using an automated system and proprietary software to assemble adjacent images and create a contiguous, lossless digital representation of individual microscope slide specimens. Serial extraction, evaluation and statistical analysis of cutaneous features are performed utilizing an automated analysis system to derive normal cutaneous parameters comprising essential structural skin components. Automated digital cutaneous analysis allows for fast extraction of microanatomic data with accuracy approximating manual measurement. The process provides rapid assessment of features both within individual specimens and across sample populations. The images, component data, and statistical analyses comprise a bioinformatics database to serve as an architectural blueprint for skin tissue engineering and as a diagnostic standard of comparison for pathologic specimens.
Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V
2014-10-28
Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the data set are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of La(x)Ca(1-x)MnO(3) films grown on etched (001) SrTiO(3) substrates, but is universal. The multivariate methods including principal component analysis and k-means clustering provide insight into the relevant behaviors, the timing and nature of a disordered to ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to use forward prediction methods to potentially allow significantly more control over growth process and hence final film quality.
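As a rough illustration of the Fourier step, the sketch below detrends a synthetic specular-spot intensity trace and reads off the layer-by-layer oscillation period; the frame rate and signal are invented for illustration and do not come from the La(x)Ca(1-x)MnO(3) data:

```python
import numpy as np

# I: specular-spot intensity vs time from a RHEED movie (synthetic here)
dt = 0.1                                         # seconds per frame (assumed)
t = np.arange(0, 200, dt)
I = np.exp(-t / 80.0) + 0.2 * np.cos(2 * np.pi * t / 12.0)  # baseline + LBL oscillation

baseline = np.polyval(np.polyfit(t, I, 3), t)    # slow drift / baseline
spec = np.abs(np.fft.rfft(I - baseline))
freqs = np.fft.rfftfreq(len(I), dt)
f_lbl = freqs[1:][np.argmax(spec[1:])]           # skip the DC bin
print("monolayer period ~", 1.0 / f_lbl, "s")
```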
Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz
2018-05-05
A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand and to investigate this methodology to determine the effect of specific physical therapy techniques in balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups (P < 0.05). The proposed methodology allows Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.
Progress Toward Efficient Laminar Flow Analysis and Design
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Campbell, Matthew L.; Streit, Thomas
2011-01-01
A multi-fidelity system of computer codes for the analysis and design of vehicles having extensive areas of laminar flow is under development at the NASA Langley Research Center. The overall approach consists of the loose coupling of a flow solver, a transition prediction method and a design module using shell scripts, along with interface modules to prepare the input for each method. This approach allows the user to select the flow solver and transition prediction module, as well as run mode for each code, based on the fidelity most compatible with the problem and available resources. The design module can be any method that designs to a specified target pressure distribution. In addition to the interface modules, two new components have been developed: 1) an efficient, empirical transition prediction module (MATTC) that provides n-factor growth distributions without requiring boundary layer information; and 2) an automated target pressure generation code (ATPG) that develops a target pressure distribution that meets a variety of flow and geometry constraints. The ATPG code also includes empirical estimates of several drag components to allow the optimization of the target pressure distribution. The current system has been developed for the design of subsonic and transonic airfoils and wings, but may be extendable to other speed ranges and components. Several analysis and design examples are included to demonstrate the current capabilities of the system.
Optimization of replacement and inspection decisions for multiple components on a power system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mauney, D.A.
1994-12-31
The use of optimization on the rescheduling of replacement dates provided a very proactive approach to deciding when components on individual units need to be addressed with a run/repair/replace decision. Including the effects of time value of money and taxes and unit need inside the spreadsheet model allowed the decision maker to concentrate on the effects of engineering input and replacement date decisions on the final net present value (NPV). The personal computer (PC)-based model was applied to a group of 140 forced-outage-critical fossil plant tube components across a power system. The estimated resulting NPV of the optimization was in the tens of millions of dollars. This PC spreadsheet model allows the interaction of inputs from structural reliability risk assessment models, plant foreman interviews, and actual failure history on a by-component, by-unit basis across a complete power production system. This model includes not only the forced outage performance of these components caused by tube failures but, in addition, the forecasted need of the individual units on the power system and the expected cost of their replacement power if forced off line. The use of cash flow analysis techniques in the spreadsheet model results in the calculation of an NPV for a whole combination of replacement dates. This allows rapid assessments of "what if" scenarios of major maintenance projects on a systemwide basis and not just on a unit-by-unit basis.
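The core of such a spreadsheet reduces to discounted cash flows per candidate replacement date. A minimal sketch with invented costs and discount rate; tax effects and replacement-power pricing are omitted:

```python
def npv(cash_flows, rate):
    """Net present value of (year, cash) pairs at the given discount rate."""
    return sum(cash / (1.0 + rate) ** year for year, cash in cash_flows)

rate = 0.08                                      # discount rate (assumed)
replace_cost = -2.0e6                            # component replacement, $
forced_outage_cost = -0.5e6                      # expected annual risk cost if deferred

# "what if": replace now vs. defer five years while carrying the outage risk
now = npv([(0, replace_cost)], rate)
defer = npv([(y, forced_outage_cost) for y in range(5)] + [(5, replace_cost)], rate)
print("replace now:", now, " defer 5 yr:", defer)
```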
Thermal analysis of the in-vessel components of the ITER plasma-position reflectometry.
Quental, P B; Policarpo, H; Luís, R; Varela, P
2016-11-01
The ITER plasma position reflectometry system measures the edge electron density profile of the plasma, providing real-time supplementary contribution to the magnetic measurements of the plasma-wall distance. Some of the system components will be in direct sight of the plasma and therefore subject to plasma and stray radiation, which may cause excessive temperatures and stresses. In this work, thermal finite element analysis of the antenna and adjacent waveguides is conducted with ANSYS V17 (ANSYS® Academic Research, Release 17.0, 2016). Results allow the identification of critical temperature points, and solutions are proposed to improve the thermal behavior of the system.
24 CFR 3280.303 - General requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Section 3280.303 Housing and Urban Development Regulations Relating to Housing and Urban Development... quality of work of the various trades. (c) Structural analysis. The strength and rigidity of the component... tests specified in paragraph (g) of this section. (f) Allowable design stress. The design stresses of...
Test bed experiments for various telerobotic system characteristics and configurations
NASA Technical Reports Server (NTRS)
Duffie, Neil A.; Wiker, Steven F.; Zik, John J.
1990-01-01
Dexterous manipulation and grasping in telerobotic systems depends on the integration of high-performance sensors, displays, actuators and controls into systems in which careful consideration has been given to human perception and tolerance. Research underway at the Wisconsin Center for Space Automation and Robotics (WCSAR) has the objective of enhancing the performance of these systems and their components, and quantifying the effects of the many electrical, mechanical, control, and human factors that affect their performance. This will lead to a fundamental understanding of performance issues which will in turn allow designers to evaluate sensor, actuator, display, and control technologies with respect to generic measures of dexterous performance. As part of this effort, an experimental test bed was developed which has telerobotic components with exceptionally high fidelity in master/slave operation. A Telerobotic Performance Analysis System has also been developed which allows performance to be determined for various system configurations and electro-mechanical characteristics. Both this performance analysis system and test bed experiments are described.
Aerosol in the Pacific troposphere
NASA Technical Reports Server (NTRS)
Clarke, Antony D.
1989-01-01
The use of near real-time optical techniques is emphasized for the measurement of mid-tropospheric aerosol over the Central Pacific. The primary focus is on measurement of the aerosol size distribution over the range of particle diameters from 0.15 to 5.0 microns that are essential for modeling CO2 backscatter values in support of the laser atmospheric wind sounder (LAWS) program. The measurement system employs a LAS-X (Laser Aerosol Spectrometer; PMS, Boulder, CO) with a custom 256-channel pulse height analyzer and software for detailed measurement and analysis of aerosol size distributions. A thermal preheater system (Thermo-Optic Aerosol Discriminator, TOAD) conditions the aerosol in a manner that allows the discrimination of the size distribution of individual aerosol components such as sulfuric acid, sulfates and refractory species. This allows assessment of the relative contribution of each component to the BCO2 signal. This is necessary since the different components have different sources, exhibit independent variability and provide different BCO2 signals for a given mass and particle size. Field activities involve experiments designed to examine both temporal and spatial variability of these aerosol components from ground-based and aircraft platforms.
Chemical Composition Variability of Essential Oils of Daucus gracilis Steinh. from Algeria.
Benyelles, Batoul; Allali, Hocine; Dib, Mohamed El Amine; Djabou, Nassim; Paolini, Julien; Costa, Jean
2017-06-01
The chemical compositions of 20 Algerian Daucus gracilis essential oils were investigated using GC-FID, GC/MS, and NMR analyses. Altogether, 47 compounds were identified, accounting for 90 - 99% of the total oil compositions. The main components were linalool (18; 12.5 - 22.6%), 2-methylbutyl 2-methylbutyrate (20; 9.2 - 20.2%), 2-methylbutyl isobutyrate (10; 4.2 - 12.2%), ammimajane (47; 2.6 - 37.1%), (E)-β-ocimene (15; 0.2 - 12.8%) and 3-methylbutyl isovalerate (19; 3.3 - 9.6%). The chemical composition of the essential oils obtained from separate organs was also studied. GC and GC/MS analysis of D. gracilis leaves and flowers allowed the identification of 47 compounds, amounting to 92.3% and 94.1% of the total oil composition, respectively. GC and GC/MS analysis of D. gracilis leaf and flower oils identified linalool (22.7%), 2-methylbutyl 2-methylbutyrate (18.9%), 2-methylbutyl isovalerate (13.6%), ammimajane (10.4%), 3-methylbutyl isovalerate (10.3%), (E)-β-ocimene (8.4%) and isopentyl 2-methylbutyrate (8.1%) as the main components. The chemical variability of the Algerian oil samples was studied using statistical analysis, which allowed the discrimination of three main groups. A direct correlation between the altitudes, the nature of the soils and the chemical compositions of the D. gracilis essential oils was evidenced. © 2017 Wiley-VHCA AG, Zurich, Switzerland.
Shape Mode Analysis Exposes Movement Patterns in Biology: Flagella and Flatworms as Case Studies
Werner, Steffen; Rink, Jochen C.; Riedel-Kruse, Ingmar H.; Friedrich, Benjamin M.
2014-01-01
We illustrate shape mode analysis as a simple, yet powerful technique to concisely describe complex biological shapes and their dynamics. We characterize undulatory bending waves of beating flagella and reconstruct a limit cycle of flagellar oscillations, paying particular attention to the periodicity of angular data. As a second example, we analyze non-convex boundary outlines of gliding flatworms, which allows us to expose stereotypic body postures that can be related to two different locomotion mechanisms. Further, shape mode analysis based on principal component analysis allows discrimination of different flatworm species, despite large motion-associated shape variability. Thus, complex shape dynamics is characterized by a small number of shape scores that change in time. We present this method using descriptive examples, explaining abstract mathematics in a graphic way. PMID:25426857
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in the grouping strategies used for uncertainty components. The variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the flow and reactive transport process). For test and demonstration purposes, the developed methodology was implemented in a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information to help environmental managers and decision-makers formulate policies and strategies.
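A minimal sketch of the variance-based (first-order Sobol) computation that underlies such a framework, using the Saltelli sampling estimator on a toy model; the input distributions and the model are placeholders, and the Bayesian-network grouping layer is not shown:

```python
import numpy as np

def sobol_first_order(model, d, n=10_000, rng=None):
    """First-order Sobol indices via the Saltelli estimator
    (uniform [0, 1] inputs, for illustration only)."""
    rng = rng or np.random.default_rng(0)
    A, B = rng.uniform(size=(n, d)), rng.uniform(size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = []
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # replace column i from B
        S.append(np.mean(fB * (model(ABi) - fA)) / var)
    return np.array(S)

# toy model standing in for a groundwater response of interest
model = lambda X: X[:, 0] + 2.0 * X[:, 1] + 0.1 * X[:, 2]
print(sobol_first_order(model, d=3))             # ~[0.20, 0.80, 0.00]
```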
NASA Astrophysics Data System (ADS)
Carello, M.; Amirth, N.; Airale, A. G.; Monti, M.; Romeo, A.
2017-12-01
Advanced thermoplastic prepreg composite materials stand out with regard to their ability to allow complex designs with high specific strength and stiffness. This makes them an excellent choice for lightweight automotive components to reduce mass and increase fuel efficiency, while maintaining the functionality (and mechanical characteristics) of traditional thermosetting prepregs, with a production cycle time and recyclability suited to mass-production manufacturing. Currently, the aerospace and automotive sectors struggle to carry out accurate finite element (FE) component analyses and in some cases are unable to validate the obtained results. In this study, a structural finite element analysis (FEA) has been performed on a thermoplastic fiber-reinforced component designed and manufactured through an integrated injection molding process, which consists of thermoforming the prepreg laminate and overmolding the other parts. This process is usually referred to as hybrid molding: the zones subjected to additional stresses are reinforced with thermoformed thermoplastic prepreg as required and overmolded with a short-fiber thermoplastic resin in a single process. This paper aims to establish an accurate predictive model on a rational basis and an innovative methodology for the structural analysis of thermoplastic composite components by comparison with experimental test results.
Blömer, Wilhelm; Steinbrück, Arnd; Schröder, Christian; Grothaus, Franz-Josef; Melsheimer, Oliver; Mannel, Henrich; Forkel, Gerhard; Eilers, Thomas; Liebs, Thoralf R; Hassenpflug, Joachim; Jansson, Volkmar
2015-07-01
Every joint registry aims to improve patient care by identifying implants that have an inferior performance. For this reason, each registry records the implant name that has been used in the individual patient. In most registries, a paper-based approach has been utilized for this purpose. However, in addition to being time-consuming, this approach does not account for the fact that failure patterns are not necessarily implant specific but can be associated with design features that are used in a number of implants. Therefore, we aimed to develop and evaluate an implant product library that allows both time-saving barcode scanning on-site in the hospital for the registration of the implant components and a detailed description of implant specifications. A task force consisting of representatives of the German Arthroplasty Registry, industry, and computer specialists agreed on a solution that allows barcode scanning of implant components and that also uses a detailed standardized classification describing arthroplasty components. The manufacturers classified all their components that are sold in Germany according to this classification. The implant database was analyzed regarding the completeness of components by algorithms and real-time data. The implant library could be set up successfully. At this point, the implant database includes more than 38,000 items, all of which were classified by the manufacturers according to the predefined scheme. Using patient data from the German Arthroplasty Registry, several errors were detected in the database, all of which were corrected by the respective implant manufacturers. The implant library that was developed for the German Arthroplasty Registry not only allows on-site barcode scanning for the registration of the implant components, but its classification tree also allows a sophisticated analysis of implant characteristics, regardless of brand or manufacturer. The database is maintained by the implant manufacturers, thereby allowing registries to focus their resources on other areas of research. The database might represent a possible global model, which might encourage harmonization between joint replacement registries, enabling comparisons between them.
Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.
Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G
2012-05-01
This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with its Solver utility, has been used to perform deconvolution analysis of both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this program, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters.
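The same deconvolution can be sketched outside Excel. Below, a first-order glow-peak expression (the Kitis et al. 1998 parametrization, a common choice for such fits) is fitted with SciPy in place of Solver, on a synthetic two-peak curve; all peak parameters are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

K = 8.617e-5                                     # Boltzmann constant, eV/K

def fo_peak(T, Im, E, Tm):
    """First-order TL glow peak (Kitis et al. 1998 parametrization)."""
    x = (E / (K * T)) * (T - Tm) / Tm
    d, dm = 2 * K * T / E, 2 * K * Tm / E
    return Im * np.exp(1 + x - (T / Tm) ** 2 * np.exp(x) * (1 - d) - dm)

def two_peaks(T, I1, E1, T1, I2, E2, T2):
    return fo_peak(T, I1, E1, T1) + fo_peak(T, I2, E2, T2)

T = np.linspace(350, 550, 400)                   # temperature, K
rng = np.random.default_rng(0)
glow = two_peaks(T, 1.0, 1.1, 420, 0.6, 1.4, 480) + rng.normal(0, 0.01, T.size)
p0 = (0.9, 1.0, 415, 0.5, 1.3, 475)              # initial guesses, as Solver needs
popt, _ = curve_fit(two_peaks, T, glow, p0=p0)   # fitted peak parameters
```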
Intuitive Tools for the Design and Analysis of Communication Payloads for Satellites
NASA Technical Reports Server (NTRS)
Culver, Michael R.; Soong, Christine; Warner, Joseph D.
2014-01-01
In an effort to make future communications satellite payload design more efficient and accessible, two tools were created with intuitive graphical user interfaces (GUIs). The first tool allows payload designers to graphically design their payload by using simple drag and drop of payload components onto a design area within the program. Information about each picked component is pulled from a database of common space-qualified communication components sold by commercial companies. Once a design is completed, various reports can be generated, such as the Master Equipment List. The second tool is a link budget calculator designed specifically for ease of use. Other features of this tool include access to a database of NASA ground-based apertures for near-Earth and deep-space communication, the Tracking and Data Relay Satellite System (TDRSS) base apertures, and information about the solar system relevant to link budget calculations. The link budget tool allows for over 50 different combinations of user inputs, eliminating the need for multiple spreadsheets and the user errors associated with using them. Both of the aforementioned tools increase the productivity of space communication systems designers, and have the colloquial latitude to allow non-communication experts to design preliminary communication payloads.
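The second tool's core computation is the standard link budget equation. A hedged sketch with invented Ka-band numbers; the tool's databases and its 50-plus input combinations are not reproduced:

```python
import math

def fspl_db(d_m, f_hz):
    """Free-space path loss in dB for distance d_m and frequency f_hz."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * d_m * f_hz / c)

# illustrative downlink numbers (all assumed)
eirp_dbw = 45.0                                  # transmit EIRP, dBW
gt_dbk = 30.0                                    # ground station G/T, dB/K
f, d = 26e9, 2000e3                              # carrier Hz, slant range m
losses_db = 3.0                                  # pointing, atmosphere, etc.
k_dbw = -228.6                                   # Boltzmann constant, dBW/K/Hz

cn0 = eirp_dbw + gt_dbk - fspl_db(d, f) - losses_db - k_dbw
ebn0 = cn0 - 10 * math.log10(10e6)               # at an assumed 10 Mbps
print(f"C/N0 = {cn0:.1f} dB-Hz, Eb/N0 = {ebn0:.1f} dB")
```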
Client-side Skype forensics: an overview
NASA Astrophysics Data System (ADS)
Meißner, Tina; Kröger, Knut; Creutzburg, Reiner
2013-03-01
IT security and computer forensics are important components of information technology. In the present study, a client-side forensic analysis of Skype is performed. It is designed to explain which kinds of user data are stored on a computer and which tools allow the extraction of those data for a forensic investigation. Both methods are described: a manual analysis and an analysis with (mainly) open-source tools.
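As an illustration of the kind of extraction involved, the sketch below reads chat records from the SQLite main.db used by classic (pre-cloud) Skype clients. The profile path, the user name "alice", and the assumption that the Messages table carries author/timestamp/body_xml columns reflect that era's commonly documented layout; they are not guaranteed for every client version:

```python
import sqlite3
from datetime import datetime, timezone

# Path and schema assume a classic Skype client layout, where chat
# history lives in a per-profile SQLite database named main.db.
db = sqlite3.connect(r"C:/Users/alice/AppData/Roaming/Skype/alice/main.db")
rows = db.execute(
    "SELECT author, timestamp, body_xml FROM Messages ORDER BY timestamp"
)
for author, ts, body in rows:
    when = datetime.fromtimestamp(ts, tz=timezone.utc)   # Unix epoch seconds
    print(when.isoformat(), author, body)
```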
DATMAN: A reliability data analysis program using Bayesian updating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, M.; Feltus, M.A.
1996-12-31
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
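A minimal sketch of the conjugate update underlying this kind of analysis, for a component failure rate with a gamma prior and Poisson failure counts; the prior numbers and evidence are invented, and DATMAN's distribution-fitting menu is not reproduced:

```python
from scipy import stats

# gamma prior on the failure rate (per hour), e.g. from generic industry data
alpha0, beta0 = 2.0, 4.0e4                       # shape, rate (assumed prior)

failures, hours = 3, 1.2e4                       # new plant-specific evidence
alpha, beta = alpha0 + failures, beta0 + hours   # conjugate gamma-Poisson update

post = stats.gamma(a=alpha, scale=1.0 / beta)
print("posterior mean rate:", post.mean())
print("90% credible interval:", post.ppf([0.05, 0.95]))
```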
Dong, Fengxia; Mitchell, Paul D; Colquhoun, Jed
2015-01-01
Measuring farm sustainability performance is a crucial component of improving agricultural sustainability. While extensive assessments and indicators exist that reflect the different facets of agricultural sustainability, because of the relatively large number of measures and the interactions among them, a composite indicator that integrates and aggregates over all variables is particularly useful. This paper describes and empirically evaluates a method for constructing a composite sustainability indicator that individually scores and ranks farm sustainability performance. The method first uses non-negative polychoric principal component analysis to reduce the number of variables, remove correlation among variables and transform categorical variables to continuous variables. Next, the method applies common-weight data envelopment analysis to these principal components to score each farm individually. The method solves for the weights endogenously and allows the identification of practices that are important in the sustainability evaluation. An empirical application to Wisconsin cranberry farms finds heterogeneity in sustainability practice adoption, implying that some farms could adopt relevant practices to improve the overall sustainability performance of the industry. Copyright © 2014 Elsevier Ltd. All rights reserved.
A Facility and Architecture for Autonomy Research
NASA Technical Reports Server (NTRS)
Pisanich, Greg; Clancy, Daniel (Technical Monitor)
2002-01-01
Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation ToolKit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows mixing simulation components from various computing platforms and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.
Manufacture and Experimental Analysis of a Concentrated Strain Based Deployable Truss Structure
2006-05-01
high-modulus pull-truded carbon fiber rods (CFRs) for the majority of the length. The other components were compliant flexure joints made of Nitinol (NiTi), a shape memory...allows it to recover its original shape. The most common SMA is an alloy of nickel and titanium called Nitinol. This particular alloy has very good
Real-space analysis of radiation-induced specific changes with independent component analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borek, Dominika; Bromberg, Raquel; Hattne, Johan
A method of analysis is presented that allows for the separation of specific radiation-induced changes into distinct components in real space. The method relies on independent component analysis (ICA) and can be effectively applied to electron density maps and other types of maps, provided that they can be represented as sets of numbers on a grid. Here, for glucose isomerase crystals, ICA was used in a proof-of-concept analysis to separate temperature-dependent and temperature-independent components of specific radiation-induced changes for data sets acquired from multiple crystals across multiple temperatures. ICA identified two components, with the temperature-independent component being responsible for the majority of specific radiation-induced changes at temperatures below 130 K. The patterns of specific temperature-independent radiation-induced changes suggest a contribution from the tunnelling of electron holes as a possible explanation. In the second case, where a group of 22 data sets was collected on a single thaumatin crystal, ICA was used in another type of analysis to separate specific radiation-induced effects happening on different exposure-level scales. Here, ICA identified two components of specific radiation-induced changes that likely result from radiation-induced chemical reactions progressing with different rates at different locations in the structure. In addition, ICA unexpectedly identified the radiation-damage state corresponding to reduced disulfide bridges rather than the zero-dose extrapolated state as the highest contrast structure. The application of ICA to the analysis of specific radiation-induced changes in real space and the data pre-processing for ICA that relies on singular value decomposition, which was used previously in data space to validate a two-component physical model of X-ray radiation-induced changes, are discussed in detail. This work lays a foundation for a better understanding of protein-specific radiation chemistries and provides a framework for analysing effects of specific radiation damage in crystallographic and cryo-EM experiments.
Non-linear principal component analysis applied to Lorenz models and to North Atlantic SLP
NASA Astrophysics Data System (ADS)
Russo, A.; Trigo, R. M.
2003-04-01
A non-linear generalisation of Principal Component Analysis (PCA), denoted Non-Linear Principal Component Analysis (NLPCA), is introduced and applied to the analysis of three data sets. Non-Linear Principal Component Analysis allows for the detection and characterisation of low-dimensional non-linear structure in multivariate data sets. This method is implemented using a 5-layer feed-forward neural network introduced originally in the chemical engineering literature (Kramer, 1991). The method is described and details of its implementation are addressed. Non-Linear Principal Component Analysis is first applied to a data set sampled from the Lorenz attractor (1963). It is found that the NLPCA approximations are more representative of the data than are the corresponding PCA approximations. The same methodology was applied to the less known Lorenz attractor (1984); however, the results obtained were not as good as those attained with the famous 'Butterfly' attractor. Further work with this model is underway in order to assess whether NLPCA techniques can be more representative of the data characteristics than are the corresponding PCA approximations. The application of NLPCA to relatively 'simple' dynamical systems, such as those proposed by Lorenz, is well understood. However, the application of NLPCA to a large climatic data set is much more challenging. Here, we have applied NLPCA to the sea level pressure (SLP) field for the entire North Atlantic area, and the results show a slight increase in the explained variance. Finally, directions for future work are presented.
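A minimal sketch of the 5-layer Kramer network, using a scikit-learn multilayer perceptron as an autoassociative stand-in (input, mapping layer, 1-node bottleneck, demapping layer, output); the circle data are a toy example of non-linear structure that a single linear PC cannot capture:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# noisy circle: a 1-D curved structure hidden in 2-D data
rng = np.random.default_rng(0)
phi = rng.uniform(0, 2 * np.pi, 2000)
X = np.column_stack([np.cos(phi), np.sin(phi)]) + rng.normal(0, 0.05, (2000, 2))

# 5-layer Kramer network trained autoassociatively (targets = inputs)
ae = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                  max_iter=5000, random_state=0).fit(X, X)

def bottleneck(model, X, layer=2):
    """Forward pass through the trained net; return the 1-node bottleneck activation."""
    a = X
    for i, (W, b) in enumerate(zip(model.coefs_, model.intercepts_)):
        a = a @ W + b
        if i == layer - 1:                        # activations of the bottleneck layer
            return np.tanh(a)
        if i < len(model.coefs_) - 1:
            a = np.tanh(a)
    return a

score = bottleneck(ae, X)                         # the non-linear principal component
```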
[Laser microdissection for biology and medicine].
Podgornyĭ, O V; Lazarev, V N; Govorun, V M
2012-01-01
For routine extraction of DNA, RNA, proteins and metabolites, small tissue pieces are placed into lysing solution. These tissue pieces in general contain different cell types. For this reason, the lysate contains components of different cell types, which complicates the interpretation of molecular analysis results. Laser microdissection overcomes this problem. Laser microdissection is a method to procure tissue samples containing defined cell subpopulations, individual cells and even subcellular components under direct microscopic visualization. Collected samples can be subjected to different downstream molecular assays: DNA analysis, RNA transcript profiling, cDNA library generation and gene expression analysis, proteomic analysis and metabolite profiling. Laser microdissection has wide applications in oncology (research and routine), cellular and molecular biology, biochemistry and forensics. This paper reviews the principles of different laser microdissection instruments, examples of laser microdissection applications and problems of sample preparation for laser microdissection.
Exploring the Factor Structure of Neurocognitive Measures in Older Individuals
Santos, Nadine Correia; Costa, Patrício Soares; Amorim, Liliana; Moreira, Pedro Silva; Cunha, Pedro; Cotter, Jorge; Sousa, Nuno
2015-01-01
Here we focus on factor analysis from a best-practices point of view, by investigating the factor structure of neuropsychological tests and using the results obtained to illustrate how to choose a reasonable solution. The sample (n=1051 individuals) was randomly divided into two groups: one for exploratory factor analysis (EFA) and principal component analysis (PCA), to investigate the number of factors underlying the neurocognitive variables; the second to test the "best fit" model via confirmatory factor analysis (CFA). For the exploratory step, three extraction (maximum likelihood, principal axis factoring and principal components) and two rotation (orthogonal and oblique) methods were used. The analysis methodology allowed us to explore how different cognitive/psychological tests correlated with or discriminated between dimensions, indicating that to capture latent structures in similar sample sizes and measures, with approximately normal data distributions, reflective models with oblimin rotation might prove the most adequate. PMID:25880732
An Analysis of the United States Naval Aviation Schedule Removal Component (SRC) Card Process
2009-12-01
JSF has the ability to communicate in flight with its maintenance system, ALIS. Its Prognostic Health Management (PHM) System abilities allow it to...end-users. PLCS allows users of the system, through a central database, visibility of a component's history and lifecycle data. Since both OOMA...used in PLM systems. This research recommends a PLM system that is Web-based and uses DoD-mandated UID technology as the future for data
A Representative Shuttle Environmental Control System
NASA Technical Reports Server (NTRS)
Brose, H. F.; Stanley, M. D.; Leblanc, J. C.
1977-01-01
The Representative Shuttle Environmental Control System (RSECS) provides a ground test bed to be used in the early accumulation of component and system operating data, the evaluation of potential system improvements, and possibly the analysis of Shuttle Orbiter test and flight anomalies. Selected components are being subjected to long term tests to determine endurance and corrosion resistance capability prior to Orbiter vehicle experience. Component and system level tests in several cases are being used to support flight certification of Orbiter hardware. These activities are conducted as a development program to allow for timeliness, flexibility, and cost effectiveness not possible in a program burdened by flight documentation and monitoring constraints.
Structural health monitoring apparatus and methodology
NASA Technical Reports Server (NTRS)
Giurgiutiu, Victor (Inventor); Yu, Lingyu (Inventor); Bottai, Giola Santoni (Inventor)
2011-01-01
Disclosed is an apparatus and methodology for structural health monitoring (SHM) in which smart devices interrogate structural components to predict failure, expedite needed repairs, and thus increase the useful life of those components. Piezoelectric wafer active sensors (PWAS) are applied to or integrated with structural components, and various data collected therefrom provide the ability to detect and locate cracking, corrosion, and disbonding through use of pitch-catch, pulse-echo, electro/mechanical impedance, and phased array technology. Stand-alone hardware and an associated software program are provided that allow selection of multiple types of SHM investigations as well as multiple types of data analysis to perform a wholesome investigation of a structure.
Component Models for Fuzzy Data
ERIC Educational Resources Information Center
Coppi, Renato; Giordani, Paolo; D'Urso, Pierpaolo
2006-01-01
The fuzzy perspective in statistical analysis is first illustrated with reference to the "Informational Paradigm" allowing us to deal with different types of uncertainties related to the various informational ingredients (data, model, assumptions). The fuzzy empirical data are then introduced, referring to "J" LR fuzzy variables as observed on "I"…
Analysis of Hybrid-Electric Propulsion System Designs for Small Unmanned Aircraft Systems
2010-03-01
5. Fundamental Aerodynamics... turbocharger, allowing the turbine and compressor to run at different speeds. The concept would simplify designing small diesel engines, which are...ICEs. Weight reductions in ancillary components like turbochargers and cooling systems must also be achieved for use in aviation. Since small
System for analysis of explosives
Haas, Jeffrey S [San Ramon, CA
2010-06-29
A system for analysis of explosives. Samples are spotted on a thin layer chromatography plate. Multi-component explosives standards are spotted on the thin layer chromatography plate. The thin layer chromatography plate is dipped in a solvent mixture and chromatography is allowed to proceed. The thin layer chromatography plate is dipped in reagent 1. The thin layer chromatography plate is heated. The thin layer chromatography plate is dipped in reagent 2.
Kondic, L; Kramár, M; Pugnaloni, Luis A; Carlevaro, C Manuel; Mischaikow, K
2016-06-01
In the companion paper [Pugnaloni et al., Phys. Rev. E 93, 062902 (2016)10.1103/PhysRevE.93.062902], we use classical measures based on force probability density functions (PDFs), as well as Betti numbers (quantifying the number of components, related to force chains, and loops), to describe the force networks in tapped systems of disks and pentagons. In the present work, we focus on the use of persistence analysis, which allows us to describe these networks in much more detail. This approach allows us not only to describe but also to quantify the differences between the force networks in different realizations of a system, in different parts of the considered domain, or in different systems. We show that persistence analysis clearly distinguishes the systems that are very difficult or impossible to differentiate using other means. One important finding is that the differences in force networks between disks and pentagons are most apparent when loops are considered: the quantities describing properties of the loops may differ significantly even if other measures (properties of components, Betti numbers, force PDFs, or the stress tensor) do not distinguish clearly or at all the investigated systems.
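The Betti-number bookkeeping used above can be illustrated on a thresholded contact network: beta_0 counts connected components (pieces of force chains) and beta_1 counts independent loops. The sketch below, assuming a random geometric graph with hypothetical contact forces, computes both with networkx; it illustrates the quantities being tracked, not the authors' persistence computation.

```python
# Betti numbers of a force network thresholded at increasing force levels:
# beta_0 = number of connected components, beta_1 = E - V + beta_0 (cycle rank).
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
G = nx.random_geometric_graph(60, 0.25, seed=1)   # hypothetical contact network
for u, v in G.edges:
    G.edges[u, v]["force"] = rng.uniform(0, 1)    # hypothetical normal forces

for theta in (0.2, 0.5, 0.8):
    strong = [(u, v) for u, v, f in G.edges(data="force") if f >= theta]
    H = G.edge_subgraph(strong)                   # keep only strong contacts
    V, E = H.number_of_nodes(), H.number_of_edges()
    b0 = nx.number_connected_components(H)
    b1 = E - V + b0
    print(f"threshold {theta}: beta_0 = {b0}, beta_1 = {b1}")
```

Persistence analysis goes one step further by tracking how long each component and loop survives as the threshold sweeps through all force levels.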
Smartphone Analytics: Mobilizing the Lab into the Cloud for Omic-Scale Analyses.
Montenegro-Burke, J Rafael; Phommavongsay, Thiery; Aisporna, Aries E; Huan, Tao; Rinehart, Duane; Forsberg, Erica; Poole, Farris L; Thorgersen, Michael P; Adams, Michael W W; Krantz, Gregory; Fields, Matthew W; Northen, Trent R; Robbins, Paul D; Niedernhofer, Laura J; Lairson, Luke; Benton, H Paul; Siuzdak, Gary
2016-10-04
Active data screening is an integral part of many scientific activities, and mobile technologies have greatly facilitated this process by minimizing the reliance on large hardware instrumentation. To meet the demands of the rapidly growing field of metabolomics and the heavy workload of data processing, we designed the first remote metabolomic data screening platform for mobile devices. Two mobile applications (apps), XCMS Mobile and METLIN Mobile, facilitate access to XCMS and METLIN, the most important components of the computer-based XCMS Online platform. These mobile apps allow for the visualization and analysis of metabolic data throughout the entire analytical process. Specifically, XCMS Mobile and METLIN Mobile provide the capabilities for remote monitoring of data processing, real-time notifications for the data processing, visualization and interactive analysis of processed data (e.g., cloud plots, principal component analysis, box plots, extracted ion chromatograms, and hierarchical cluster analysis), and database searching for metabolite identification. These apps, available on the Apple iOS and Google Android operating systems, allow for the migration of metabolomic research onto mobile devices for better accessibility beyond direct instrument operation. The utility of the XCMS Mobile and METLIN Mobile functionalities is demonstrated here through the metabolomic LC-MS analyses of stem cells, colon cancer, aging, and bacterial metabolism.
Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.
Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark
2016-03-16
The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
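A minimal principal component regression sketch on synthetic "voltammograms" is shown below: PCA on the training currents, followed by linear regression of concentration on the retained scores. All shapes and values are hypothetical; the point is that the model is only as good as training data collected under matched conditions.

```python
# Principal component regression (PCR): PCA scores -> linear model.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
potentials = np.linspace(-0.4, 1.3, 150)
signature = np.exp(-((potentials - 0.6) / 0.1) ** 2)  # hypothetical analyte peak

conc = rng.uniform(0.0, 1.0, 40)                      # training concentrations
I_train = np.outer(conc, signature) + 0.02 * rng.normal(size=(40, 150))

pcr = make_pipeline(PCA(n_components=2), LinearRegression())
pcr.fit(I_train, conc)

I_new = 0.5 * signature + 0.02 * rng.normal(size=150)
print(pcr.predict(I_new.reshape(1, -1)))  # ~0.5 only if conditions match training
```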
Piotrowski, T; Rodrigues, G; Bajon, T; Yartsev, S
2014-03-01
Multi-institutional collaborations allow more information to be analyzed, but data from different sources may vary in subgroup sizes and/or conditions of measurement. Rigorous statistical analysis is required before pooling the data into a larger set. Careful comparison of all components of the data acquisition is indispensable: identical conditions allow for enlargement of the database with improved statistical analysis, while clearly defined differences provide an opportunity for establishing a better practice. An optimal sequence of the required normality, asymptotic normality, and independence tests is proposed. An example is presented analyzing six subgroups of position corrections in three directions obtained during image guidance procedures for 216 prostate cancer patients from two institutions. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
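The kind of pre-pooling checks described can be sketched with scipy.stats: per-subgroup normality, homogeneity of variance, and equality of means. The exact sequence of tests proposed in the paper may differ; this is an illustrative two-subgroup version with hypothetical data.

```python
# Pre-pooling checks for two subgroups of position corrections (mm).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
a = rng.normal(0.0, 2.0, 120)   # hypothetical corrections, institution A
b = rng.normal(0.3, 2.1, 96)    # hypothetical corrections, institution B

for name, x in (("A", a), ("B", b)):
    _, p = stats.shapiro(x)                 # normality within each subgroup
    print(f"institution {name}: Shapiro-Wilk p = {p:.3f}")

_, p_var = stats.levene(a, b)               # homogeneity of variances
_, p_mean = stats.ttest_ind(a, b, equal_var=False)  # Welch's t-test on means
print(f"variances p = {p_var:.3f}, means p = {p_mean:.3f}")
print("pool the subgroups:", p_var > 0.05 and p_mean > 0.05)
```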
NASA Astrophysics Data System (ADS)
de Lauro, E.; de Martino, S.; Falanga, M.; Palo, M.
2006-08-01
We analyze time series of Strombolian volcanic tremor, focusing our attention on the frequency band [0.1-0.5] Hz (very long period (VLP) tremor). Although this frequency band is largely affected by noise, we identify two significant components using Independent Component Analysis, with frequencies of ~0.2 and ~0.4 Hz, respectively. We show that these components display wavefield features similar to those of the high-frequency Strombolian signals (>0.5 Hz): they are radially polarised and located within the crater area. This characterization is lost when an enhancement of energy appears; in this case, the presence of microseismic noise becomes relevant. Investigating the entire large data set available, we determine how microseismic noise influences the signals, and we ascribe the microseismic noise source to the Scirocco wind. Moreover, our analysis shows that the Strombolian conduit vibrates like the asymmetric cavity associated with musical instruments generating self-sustained tones.
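The processing chain described, band-passing to the VLP band and then unmixing, can be sketched with scipy and scikit-learn on synthetic multichannel records; the sampling rate, filter order, and mixing below are all assumptions.

```python
# Band-pass multichannel records to 0.1-0.5 Hz, then unmix with FastICA.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

fs = 50.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(4)
s1 = np.sin(2 * np.pi * 0.2 * t)            # ~0.2 Hz source
s2 = np.sin(2 * np.pi * 0.4 * t)            # ~0.4 Hz source
A = rng.normal(size=(6, 2))                 # mixing onto 6 channels
X = (A @ np.vstack((s1, s2))).T + 0.5 * rng.normal(size=(t.size, 6))

b, a = butter(4, [0.1, 0.5], btype="band", fs=fs)
Xf = filtfilt(b, a, X, axis=0)              # zero-phase VLP band-pass

ica = FastICA(n_components=2, random_state=0)
S = ica.fit_transform(Xf)                   # recovered independent components
print(S.shape)                              # (n_samples, 2)
```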
A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON
King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix
2008-01-01
As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, and interaction with virtual environments to analysis and visualization. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language, but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside the simulation. PMID:19430597
NASA Astrophysics Data System (ADS)
Luce, R.; Hildebrandt, P.; Kuhlmann, U.; Liesen, J.
2016-09-01
The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for non-negative matrix factorization which is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed.
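In the same spirit, a spectra-by-time matrix from a monomolecular A → B reaction can be factored with scikit-learn's NMF into non-negative component spectra and kinetic profiles; the bands, rate constant, and noise level below are assumptions, and each synthetic species deliberately has one band free of interference, matching the prerequisite stated above.

```python
# NMF of a (time x wavenumber) matrix D into kinetics W and spectra H.
import numpy as np
from sklearn.decomposition import NMF

wn = np.linspace(800, 1800, 400)                  # wavenumber axis (cm^-1)
gauss = lambda c, w: np.exp(-((wn - c) / w) ** 2)
spec_A = gauss(1000, 15) + 0.3 * gauss(1450, 20)  # species A; 1000 cm^-1 is clean
spec_B = gauss(1600, 15) + 0.3 * gauss(1450, 20)  # species B; 1600 cm^-1 is clean

t = np.linspace(0, 10, 60)
c_A = np.exp(-0.3 * t)                            # A -> B with k = 0.3 (assumed)
c_B = 1.0 - c_A
D = np.outer(c_A, spec_A) + np.outer(c_B, spec_B)
D += 0.01 * np.random.default_rng(5).random(D.shape)  # non-negative noise

model = NMF(n_components=2, init="nndsvd", max_iter=1000)
W = model.fit_transform(D)      # kinetic profiles (columns ~ c_A, c_B)
H = model.components_           # component spectra (rows ~ spec_A, spec_B)
print(W.shape, H.shape)
```

Fitting exponentials to the recovered columns of W would then yield the rate constants.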
Ferreiro-González, Marta; Barbero, Gerardo F; Álvarez, José A; Ruiz, Antonio; Palma, Miguel; Ayuso, Jesús
2017-04-01
Adulteration of olive oil is not only a major economic fraud but can also have major health implications for consumers. In this study, a combination of visible spectroscopy with a novel multivariate curve resolution method (CR), principal component analysis (PCA) and linear discriminant analysis (LDA) is proposed for the authentication of virgin olive oil (VOO) samples. VOOs are well-known products with the typical properties of a two-component system due to the two main groups of compounds that contribute to the visible spectra (chlorophylls and carotenoids). Application of the proposed CR method to VOO samples provided the two pure-component spectra for the aforementioned families of compounds. A correlation study of the real spectra and the resolved component spectra was carried out for different types of oil samples (n=118). LDA using the correlation coefficients as variables to discriminate samples allowed the authentication of 95% of virgin olive oil samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
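The final classification step can be sketched as follows: each sample is represented by its correlation with the two resolved pure-component spectra and classified with LDA. The correlation values and class structure below are synthetic stand-ins.

```python
# LDA on (r_chlorophyll, r_carotenoid) correlation features, synthetic data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(6)
authentic = rng.normal((0.97, 0.95), 0.01, size=(60, 2))    # hypothetical VOO
adulterated = rng.normal((0.90, 0.85), 0.03, size=(58, 2))  # hypothetical blends
X = np.vstack((authentic, adulterated))
y = np.array([1] * 60 + [0] * 58)

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.score(X, y))              # resubstitution accuracy of the sketch
print(lda.predict([[0.96, 0.94]]))  # classify one new sample
```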
Performance analysis and prediction in triathlon.
Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B
2016-01-01
Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
Rui, Wen; Chen, Hongyuan; Tan, Yuzhi; Zhong, Yanmei; Feng, Yifan
2010-05-01
A rapid method for the analysis of the main components of the total glycosides of Ranunculus japonicus (TGOR) was developed using ultra-performance liquid chromatography with quadrupole time-of-flight mass spectrometry (UPLC/Q-TOF-MS). The separation was performed on a Waters Acquity UPLC system, and the accurate masses of molecules and their fragment ions were determined by Q-TOF MS. Twenty compounds, including lactone glycosides, flavonoid glycosides and flavonoid aglycones, were identified or tentatively deduced on the basis of their elemental compositions, MS/MS data and the relevant literature. The results demonstrated that lactone glycosides and flavonoids were the main constituents of TGOR. Furthermore, an effective and rapid analytical approach was established, allowing for the comprehensive and systematic characterization of these complex samples.
A Review of Feature Extraction Software for Microarray Gene Expression Data
Tan, Ching Siang; Ting, Wai Soon; Mohamad, Mohd Saberi; Chan, Weng Howe; Deris, Safaai; Ali Shah, Zuraini
2014-01-01
When gene expression data are too large to be processed, they are transformed into a reduced representation set of genes. Transforming large-scale gene expression data into a set of genes is called feature extraction. If the genes extracted are carefully chosen, this gene set can extract the relevant information from the large-scale gene expression data, allowing further analysis by using this reduced representation instead of the full size data. In this paper, we review numerous software applications that can be used for feature extraction. The software reviewed is mainly for Principal Component Analysis (PCA), Independent Component Analysis (ICA), Partial Least Squares (PLS), and Local Linear Embedding (LLE). A summary and sources of the software are provided in the last section for each feature extraction method. PMID:25250315
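For orientation, the four families reviewed all have scikit-learn counterparts; the sketch below runs each on a synthetic samples-by-genes matrix. These are illustrative substitutes, not the specific software packages reviewed in the paper.

```python
# One-screen tour of PCA, ICA, PLS, and LLE on synthetic expression data.
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.cross_decomposition import PLSRegression
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(7)
X = rng.normal(size=(80, 2000))   # 80 samples x 2000 genes, hypothetical
y = rng.normal(size=80)           # phenotype, used only by the supervised PLS

Z_pca = PCA(n_components=10).fit_transform(X)
Z_ica = FastICA(n_components=10, random_state=0).fit_transform(X)
Z_pls = PLSRegression(n_components=10).fit(X, y).transform(X)
Z_lle = LocallyLinearEmbedding(n_components=10, n_neighbors=15).fit_transform(X)

for name, Z in (("PCA", Z_pca), ("ICA", Z_ica), ("PLS", Z_pls), ("LLE", Z_lle)):
    print(name, Z.shape)          # each: (80, 10) reduced representation
```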
NASA Astrophysics Data System (ADS)
Hus, Jean-Christophe; Bruschweiler, Rafael
2002-07-01
A general method is presented for the reconstruction of interatomic vector orientations from nuclear magnetic resonance (NMR) spectroscopic data of tensor interactions of rank 2, such as dipolar coupling and chemical shielding anisotropy interactions, in solids and partially aligned liquid-state systems. The method, called PRIMA, is based on a principal component analysis of the covariance matrix of the NMR parameters collected for multiple alignments. The five nonzero eigenvalues and their eigenvectors efficiently allow the approximate reconstruction of the vector orientations of the underlying interactions. The method is demonstrated for an isotropic distribution of sample orientations as well as for finite sets of orientations and internuclear vectors encountered in protein systems.
Preliminary analysis of a membrane-based atmosphere-control subsystem
NASA Technical Reports Server (NTRS)
Mccray, Scott B.; Newbold, David D.; Ray, Rod; Ogle, Kathryn
1993-01-01
Controlled ecological life support systems will require subsystems for maintaining the concentrations of atmospheric gases within acceptable ranges in human habitat chambers and plant growth chambers. The goal of this work was to develop a membrane-based atmosphere control (MBAC) subsystem that allows the controlled exchange of atmospheric components (e.g., oxygen, carbon dioxide, and water vapor) between these chambers. The MBAC subsystem promises to offer a simple, non-energy-intensive method to separate, store and exchange atmospheric components, producing optimal concentrations of components in each chamber. In this paper, the results of a preliminary analysis of the MBAC subsystem for control of oxygen and nitrogen are presented. Additionally, the MBAC subsystem and its operation are described.
NASA Technical Reports Server (NTRS)
Schoenwald, Adam; Mohammed, Priscilla; Bradley, Damon; Piepmeier, Jeffrey; Wong, Englin; Gholian, Armen
2016-01-01
Radio-frequency interference (RFI) has negatively impacted scientific measurements across a wide range of passive remote sensing satellites. This has been observed in the L-band radiometers SMOS, Aquarius and, more recently, SMAP [1, 2]. RFI has also been observed at higher frequencies such as K band [3]. Improvements in technology have allowed wider bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements [4]. This work explores the use of ICA (Independent Component Analysis) as a blind source separation technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
RFI Detection and Mitigation using Independent Component Analysis as a Pre-Processor
NASA Technical Reports Server (NTRS)
Schoenwald, Adam J.; Gholian, Armen; Bradley, Damon C.; Wong, Mark; Mohammed, Priscilla N.; Piepmeier, Jeffrey R.
2016-01-01
Radio-frequency interference (RFI) has negatively impacted scientific measurements of passive remote sensing satellites. This has been observed in the L-band radiometers Soil Moisture and Ocean Salinity (SMOS), Aquarius and more recently, Soil Moisture Active Passive (SMAP). RFI has also been observed at higher frequencies such as K band. Improvements in technology have allowed wider bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements. This work explores the use of Independent Component Analysis (ICA) as a blind source separation (BSS) technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
Walsh, Mark; Thomas, Scott G.; Howard, Janet C.; Evans, Edward; Guyer, Kirk; Medvecz, Andrew; Swearingen, Andrew; Navari, Rudolph M.; Ploplis, Victoria; Castellino, Francis J.
2011-01-01
25–35% of all seriously injured multiple trauma patients are coagulopathic upon arrival to the emergency department, and therefore early diagnosis and intervention in this subset of patients is important. In addition to standard plasma-based tests of coagulation, the thromboelastogram (TEG®) has resurfaced as an ideal test in the trauma population to help guide the clinician in the administration of blood components in a goal-directed fashion. We describe how thromboelastographic analysis is used to assist in the management of trauma patients with coagulopathies presenting to the emergency department, in surgery, and in the postoperative period. Indications for the utilization of the TEG® and platelet mapping as point-of-care testing that can guide blood component therapy in a goal-directed fashion in the trauma population are presented, with emphasis on the more common reasons such as the massive transfusion protocol, the management of traumatic brain injury with bleeding, the diagnosis and management of trauma in patients on platelet antagonists, the utilization of recombinant FVIIa, and the management of coagulopathy in terminal trauma patients in preparation for organ donation. The TEG® allows for judicious and protocol-assisted utilization of blood components in a setting that has recently gained acceptance. In our program, the inclusion of the perfusionist with expertise in performing and interpreting TEG® analysis allows the multidisciplinary trauma team to more effectively manage blood products and resuscitation in this population. PMID:22164456
Evaluation of background radiation dose contributions in the United Arab Emirates.
Goddard, Braden; Bosc, Emmanuel; Al Hasani, Sarra; Lloyd, Cody
2018-09-01
The natural background radiation consists of three main components: cosmic, terrestrial, and skyshine. Although there are currently methods available to measure the total dose rate from background radiation, no established methods exist that allow for the measurement of each component of the background radiation. This analysis consists of a unique methodology in which the dose rate contribution from each component of the natural background radiation is measured and calculated. This project evaluates the natural background dose rate in the Abu Dhabi City region from all three of these components using the developed methodology. Evaluating and understanding the different components of background radiation provides a baseline allowing for the detection, and possibly attribution, of elevated radiation levels. Measurements using a high-pressure ion chamber with different shielding configurations, together with two offshore measurements, provided dose rate information that was attributed to the different components of the background radiation. Additional spectral information was obtained using an HPGe detector to verify and quantify the presence of terrestrial radionuclides. By evaluating the dose rates of the different shielding configurations, the cosmic, terrestrial, and skyshine contributions in the Abu Dhabi City region were determined to be 33.0 ± 1.7, 15.7 ± 2.5, and 2.4 ± 2.1 nSv/h, respectively. Copyright © 2018. Published by Elsevier Ltd.
CAPRI: A Geometric Foundation for Computational Analysis and Design
NASA Technical Reports Server (NTRS)
Haimes, Robert
2006-01-01
CAPRI is a software building tool-kit that refers to two ideas: (1) a simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. A complete definition of the geometry and application programming interface can be found in the document CAPRI: Computational Analysis PRogramming Interface appended to this report. In summary, the interface is subdivided into the following functional components:
1. Utility routines -- initialization of CAPRI, loading CAD parts, querying the operational status, and closing the system down.
2. Geometry data-base queries -- functions that allow top-level applications to discover and obtain detailed information on any geometric component in the Volume definition.
3. Point queries -- calls that allow grid generators, or solvers doing node adaptation, to snap points directly onto geometric entities.
4. Calculated or geometrically derived queries -- entry points that calculate data from the geometry to aid in grid generation.
5. Boundary data routines -- allow general data to be attached to Boundaries so that boundary conditions can be specified and stored within CAPRI's data-base.
6. Tag-based routines -- allow the specification of properties associated with either Volume (material properties) or Boundary (surface properties) entities.
7. Geometry-based interpolation routines -- facilitate multidisciplinary coupling and allow zooming through Boundary Attachments.
8. Geometric creation and manipulation -- calls that construct simple solid entities and perform Boolean solid operations. Geometry constructed in this manner has the advantage that, if the data is kept consistent with the CAD package, a new design can be incorporated directly and is manufacturable.
9. Master Model access -- allows querying of the parameters and dimensions of the model. The feature tree is also exposed, so it is easy to see where the parameters are applied. Calls exist to modify the parameters and to suppress/unsuppress nodes in the tree. Part regeneration is performed by a single API call, and a new part becomes available within CAPRI (if the regeneration was successful). This is described in a separate document.
Components 1-7 are considered the CAPRI base-level reader.
Vizualization of Arctic Landscapes in the Geoinformation System
NASA Astrophysics Data System (ADS)
Panidi, E. A.; Tsepelev, V. Yu.; Bobkov, A. A.
2010-12-01
In order to investigate the long-term dynamics of the ice cover, the authors suggest using a geoinformation system (GIS) which allows operational and historical analysis of the variability of water-ice landscapes in the Polar Region. Such a GIS should include long-term monthly average fields of sea ice and of hydrological and atmospheric characteristics. All collected data and the results of their processing have been structured in ArcGIS™. For presentation as Internet resources, all datasets were transformed to the open KML format for use in the virtual globe Google Earth™. The two-component system, built on ArcGIS™ and Google Earth™, allows accumulation, processing, and joint synchronous and asynchronous analysis of the data, and provides a wide circle of remote users with access to visual analysis of the datasets.
NASA Technical Reports Server (NTRS)
Reed, John A.; Afjeh, Abdollah A.
1995-01-01
A major difficulty in designing aeropropulsion systems is that of identifying and understanding the interactions between the separate engine components and disciplines (e.g., fluid mechanics, structural mechanics, heat transfer, material properties, etc.). The traditional analysis approach is to decompose the system into separate components, with the interaction between components being evaluated by the application of each of the single disciplines in a sequential manner. Here, one discipline uses information from the calculation of another discipline to determine the effects of component coupling. This approach, however, may not properly identify the consequences of these effects during the design phase, leaving the interactions to be discovered and evaluated during engine testing. This contributes to the time and cost of developing new propulsion systems as, typically, several design-build-test cycles are needed to fully identify multidisciplinary effects and reach the desired system performance. The alternative to sequential, isolated component analysis is to use multidisciplinary coupling at a more fundamental level. This approach has been made more plausible by recent advancements in computational simulation along with the application of concurrent engineering concepts. Computer simulation systems designed to provide an environment capable of integrating the various disciplines into a single simulation system have been proposed and are currently being developed. One such system is being developed by the Numerical Propulsion System Simulation (NPSS) project. The NPSS project, being developed at the Interdisciplinary Technology Office at the NASA Lewis Research Center, is a 'numerical test cell' designed to provide for comprehensive computational design and analysis of aerospace propulsion systems. It will provide multidisciplinary analyses on a variety of computational platforms, and a user interface consisting of expert systems, database management and visualization tools, to allow the designer to investigate the complex interactions inherent in these systems. An interactive programming software system, known as the Application Visualization System (AVS), was utilized for the development of the propulsion system simulation. The modularity of this system provides the ability to couple propulsion system components, as well as disciplines, and allows existing, well-established analysis codes to be integrated into the overall system simulation. This feature allows the user to customize the simulation model by inserting desired analysis codes. The prototypical simulation environment for multidisciplinary analysis, called Turbofan Engine System Simulation (TESS), which incorporates many of the characteristics of the simulation environment proposed herein, is detailed.
NASA Technical Reports Server (NTRS)
Hudson, Nicolas; Lin, Ying; Barengoltz, Jack
2010-01-01
A method is developed for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission. A scenario is analyzed in which multiple core samples would be acquired using a rotary percussive coring tool deployed from an arm on a MER-class rover. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and makes the analysis tractable by breaking the process down into small analyzable steps.
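The Markov-chain bookkeeping described above reduces to repeated multiplication of an expected-count vector by a transport matrix. The sketch below uses a toy component set and made-up probabilities purely to show the mechanics.

```python
# Expected VEM counts propagated through a per-step transport matrix.
import numpy as np

components = ["coring bit", "arm", "sample", "environment"]
x = np.array([10.0, 2.0, 0.0, 100.0])  # expected VEMs at start (assumed)

# T[i, j]: probability that a VEM on component j is on component i after
# one SAH step. Columns sum to 1; all values are illustrative only.
T = np.array([
    [0.90, 0.01, 0.00, 0.001],
    [0.01, 0.95, 0.00, 0.001],
    [0.05, 0.02, 1.00, 0.000],   # the sample retains whatever reaches it
    [0.04, 0.02, 0.00, 0.998],
])

for _ in range(5):               # five discrete sampling steps
    x = T @ x
print(dict(zip(components, np.round(x, 3))))
print("expected VEMs in the sample chain:", round(x[2], 3))
```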
AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments
NASA Astrophysics Data System (ADS)
Ashcroft, Brian Alan; Oosterkamp, Tjerk
2010-11-01
We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications, allowing much more flexibility than other packages. Included are both programming tools and graphical user interface classes, providing a standard, consistent, and easy-to-maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project, so the scripter can focus on their custom application instead of boilerplate code generation.
Operation and maintenance results from ISFOC CPV plants
NASA Astrophysics Data System (ADS)
Gil, Eduardo; Martinez, María; de la Rubia, Oscar
2017-09-01
The analysis of field operation and maintenance data collected over a period of more than eight years, from CPV installations comprising three different CPV technologies (including the second generation of one of these technologies), has allowed us to obtain valuable information about the long-term degradation of the CPV systems. Through the study of the previously defined maintenance control ratio and by applying root cause analysis methodology, the components responsible for the most unplanned interventions for each technology were identified. By focusing maintenance efforts on these components, a reduction of the unplanned interventions and the total cost of maintenance has been achieved over the years. Therefore, the deployment of an effective maintenance plan that identifies critical components is essential to minimize the risk for investors and maximize the CPV power plants' lifetime and energy output, increasing the availability of CPV installations and boosting market confidence in CPV systems.
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
Portable XRF and principal component analysis for bill characterization in forensic science.
Appoloni, C R; Melquiades, F L
2014-02-01
Several modern techniques have been applied to prevent the counterfeiting of money bills. The objective of this study was to demonstrate the potential of the portable X-ray fluorescence (PXRF) technique and the multivariate analysis method of principal component analysis (PCA) for the classification of bills for use in forensic science. Bills of Dollar, Euro and Real (Brazilian currency) were measured directly at different colored regions, without any previous preparation. Spectral interpretation allowed the identification of Ca, Ti, Fe, Cu, Sr, Y, Zr and Pb. PCA separated the bills into three groups, with subgroups among the Brazilian currency. In conclusion, the samples were classified according to their origin, identifying the elements responsible for differentiation and the basic pigment composition. PXRF combined with multivariate discriminant methods is a promising technique for rapid and non-destructive identification of false bills in forensic science. Copyright © 2013 Elsevier Ltd. All rights reserved.
Learning representative features for facial images based on a modified principal component analysis
NASA Astrophysics Data System (ADS)
Averkin, Anton; Potapov, Alexey
2013-05-01
The paper is devoted to facial image analysis and particularly deals with the problem of automatic evaluation of the attractiveness of human faces. We propose a new approach for the automatic construction of a feature space based on a modified principal component analysis. Input data sets for the algorithm are learning data sets of facial images rated by one person. The proposed approach allows one to extract features of an individual's subjective perception of facial beauty and to predict attractiveness values for new facial images that were not included in the learning data set. The Pearson correlation coefficient between the values predicted by our method for new facial images and the personal attractiveness estimation values equals 0.89. This means that the proposed approach is promising and can be used for predicting subjective facial attractiveness values in real facial-image analysis systems.
Jung, Kwanghee; Takane, Yoshio; Hwang, Heungsun; Woodward, Todd S
2016-06-01
We extend dynamic generalized structured component analysis (GSCA) to enhance its data-analytic capability in structural equation modeling of multi-subject time series data. Time series data of multiple subjects are typically hierarchically structured, where time points are nested within subjects who are in turn nested within a group. The proposed approach, named multilevel dynamic GSCA, accommodates the nested structure in time series data. Explicitly taking the nested structure into account, the proposed method allows investigating subject-wise variability of the loadings and path coefficients by looking at the variance estimates of the corresponding random effects, as well as fixed loadings between observed and latent variables and fixed path coefficients between latent variables. We demonstrate the effectiveness of the proposed approach by applying the method to the multi-subject functional neuroimaging data for brain connectivity analysis, where time series data-level measurements are nested within subjects.
Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K
2010-12-01
Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids, one can efficiently reconstruct the current sources (CSD) using the inverse Current Source Density method (iCSD). It is possible to decompose the resultant spatiotemporal information about the current dynamics into functional components using Independent Component Analysis (ICA). We show on test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points that meaningful results are obtained with spatial ICA decomposition of the reconstructed CSD. The components obtained through decomposition of the CSD are better defined and allow easier physiological interpretation than the results of a similar analysis of the corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components, we use the technique to study the somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method brings up new, more detailed information on the timing and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.
NASA Technical Reports Server (NTRS)
Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.
2015-01-01
The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.
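For reference, the continuity closure described above takes the standard anelastic form used in dual-Doppler retrievals; the notation below is assumed rather than taken from the paper, and it is written in Cartesian coordinates for clarity (the coplane technique performs the equivalent integration in its own cylindrical coplane frame):

```latex
\frac{\partial(\bar{\rho}u)}{\partial x}
  + \frac{\partial(\bar{\rho}v)}{\partial y}
  + \frac{\partial(\bar{\rho}w)}{\partial z} = 0,
\qquad
w(z) = \frac{\bar{\rho}(z_0)\,w(z_0)}{\bar{\rho}(z)}
  - \frac{1}{\bar{\rho}(z)} \int_{z_0}^{z}
    \left[ \frac{\partial(\bar{\rho}u)}{\partial x}
         + \frac{\partial(\bar{\rho}v)}{\partial y} \right] \mathrm{d}z'
```

The two directly solved components supply the horizontal divergence, and the boundary-condition vertical velocity w(z_0) corresponds to the estimated winds near the surface or at nadir whose accuracy the abstract identifies as the dominant error source.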
NASA Astrophysics Data System (ADS)
Palo, M.; de Lauro, E.; de Martino, S.; Falanga, M.
2006-12-01
We analyze time series of Strombolian volcanic tremor recorded during the experiment performed in 1997 using 21 three-component broadband seismometers. This work is devoted to a careful analysis of the frequency band [0.1-0.5] Hz in order to obtain information about the properties of the volcanic tremor and the microseismic noise. Although this frequency band is largely affected by noise, we infer the possibility of simpler hidden structures. We identify two significant components using Independent Component Analysis, with frequencies of about 0.2 and 0.4 Hz, respectively. We show that these components display wavefield features similar to those of the high-frequency Strombolian signals (greater than 0.5 Hz): they are radially polarised and located within the crater area. This characterization is lost when an enhancement of energy appears; in this case the presence of microseismic noise becomes relevant. Investigating the entire large data set available, we determine how microseismic noise influences the signals, and we ascribe the microseismic noise source to the Scirocco wind. Moreover, our analysis allows one to affirm that the Strombolian conduit vibrates like the asymmetric cavity associated with musical instruments generating self-sustained tones.
USSAERO version D computer program development using ANSI standard FORTRAN 77 and DI-3000 graphics
NASA Technical Reports Server (NTRS)
Wiese, M. R.
1986-01-01
The D version of the Unified Subsonic Supersonic Aerodynamic Analysis (USSAERO) program is the result of numerous modifications and enhancements to the B01 version. These changes include conversion to ANSI standard FORTRAN 77; use of the DI-3000 graphics package; removal of the overlay structure; a revised input format; the addition of an input data analysis routine; and an increase in the number of aeronautical components allowed.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
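The trick can be sketched in a few lines of numpy: one thin SVD of the p × n data gives a basis U for the column space, every bootstrap PC then lives in that space, and each bootstrap iteration only needs an n × n SVD. Dimensions below are small for illustration.

```python
# Exact low-dimensional bootstrap PCA: per-bootstrap work is n x n, not p x n.
import numpy as np

rng = np.random.default_rng(8)
p, n, B = 5000, 40, 200
X = rng.normal(size=(p, n))            # columns = subjects (assume centered)

U, d, Vt = np.linalg.svd(X, full_matrices=False)  # one p-dim SVD, done once
DVt = d[:, None] * Vt                  # n x n coordinates of the data in span(U)

pc1 = np.empty((B, n))                 # bootstrap PC1s in subspace coordinates
for i in range(B):
    idx = rng.integers(0, n, size=n)   # resample subjects with replacement
    A_b, _, _ = np.linalg.svd(DVt[:, idx], full_matrices=False)
    a1 = A_b[:, 0]
    pc1[i] = a1 * np.sign(a1[0])       # crude fix for the sign indeterminacy

# p-dimensional standard errors from the n x n bootstrap covariance alone;
# the full PC for bootstrap i would be U @ pc1[i], formed only on demand.
cov = np.cov(pc1, rowvar=False)
se_full = np.sqrt(np.einsum("pi,ij,pj->p", U, cov, U))
print(se_full.shape)                   # (p,) without ever storing B p-vectors
```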
Aural Classification and Temporal Robustness
2010-11-01
Canada – Atlantic; November 2010. Background: This project aims to develop a robust classifier that uses... [The remainder of this record is table-of-contents and figure-list residue; the recoverable section headings are "Discriminant score" and "Principal component analysis", a figure-list entry ends "...allows class separation", and the recoverable caption reads "Figure 7: Hypothetical clutter and target pdfs and posterior probabilities shown as surfaces".]
ERIC Educational Resources Information Center
Wilcox, Bethany R.; Lewandowski, H. J.
2017-01-01
Laboratory courses represent a unique and potentially important component of the undergraduate physics curriculum, which can be designed to allow students to authentically engage with the process of experimental physics. Among other possible benefits, participation in these courses throughout the undergraduate physics curriculum presents an…
The Model for External Reliance of Localities In (MERLIN) Coastal Management Zones is a proposed solution to allow scaling of variables to smaller, nested geographies. Utilizing a Principal Components Analysis and data normalization techniques, smaller scale trends are linked to ...
NASA Technical Reports Server (NTRS)
Fabanich, William
2014-01-01
SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold-side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces and solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the de-featuring and repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the mark-up of that geometry. These so-called mark-ups control how finite element (FE) meshes are generated and allow the tagging of features (e.g., edges, solids, surfaces). These tags represent parameters that include submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. Domain tags were also attached to individual surfaces and solids, and to groups of them, to allow them to be used later within TD to populate objects such as heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine these objects each time, as one would if using TD Mesher. The use of SpaceClaim/TD Direct has helped simplify the process for importing existing geometries and creating high-fidelity FE meshes to represent complex parts. It has also saved time and effort in the subsequent analysis.
Eberlin, Livia S; Abdelnur, Patricia V; Passero, Alan; de Sa, Gilberto F; Daroda, Romeu J; de Souza, Vanderlea; Eberlin, Marcos N
2009-08-01
High performance thin layer chromatography (HPTLC) combined with on-spot detection and characterization via easy ambient sonic-spray ionization mass spectrometry (EASI-MS) is applied to the analysis of biodiesel (B100) and biodiesel-petrodiesel blends (BX). HPTLC provides chromatographic resolution of major components whereas EASI-MS allows on-spot characterization performed directly on the HPTLC surface at ambient conditions. Constituents (M) are detected by EASI-MS in a one component-one ion fashion as either [M + Na](+) or [M + H](+). For both B100 and BX samples, typical profiles of fatty acid methyl esters (FAME) detected as [FAME + Na](+) ions allow biodiesel typification. The spectrum of the petrodiesel spot displays a homologous series of protonated alkyl pyridines which are characteristic for petrofuels (natural markers). The spectrum for residual or admixture oil spots is characterized by sodiated triglycerides [TAG + Na](+). The application of HPTLC to analyze B100 and BX samples and its combination with EASI-MS for on-spot characterization and quality control is demonstrated.
Failure Analysis in Platelet Molded Composite Systems
NASA Astrophysics Data System (ADS)
Kravchenko, Sergii G.
Long-fiber discontinuous composite systems in the form of chopped prepreg tapes provide an advanced, structural-grade molding compound allowing for the fabrication of complex three-dimensional components. Understanding of the process-structure-property relationship is essential for the application of prepreg platelet molded components, especially because of their possibly irregular, disordered, heterogeneous morphology. Herein, a structure-property relationship was analyzed in composite systems of many platelets. Regular and irregular morphologies were considered. Platelet-based systems with more ordered morphology possess superior mechanical performance. While regular morphologies allow for a careful inspection of failure mechanisms derived from the morphological characteristics, irregular morphologies are representative of the composite architectures resulting from uncontrolled deposition and molding with chopped prepregs. Progressive failure analysis (PFA) was used to study the damaged deformation up to ultimate failure in a platelet-based composite system. Computational damage mechanics approaches were utilized to conduct the PFA. The developed computational models granted understanding of how the details of the composite structure, meaning the platelet geometry and the system morphology (geometrical arrangement and orientation distribution of platelets), define the effective mechanical properties of a platelet-molded composite system: its stiffness, strength, and variability in properties.
Noise characteristics of the Escherichia coli rotary motor
2011-01-01
Background The chemotaxis pathway in the bacterium Escherichia coli allows cells to detect changes in external ligand concentration (e.g. nutrients). The pathway regulates the flagellated rotary motors and hence the cells' swimming behaviour, steering them towards more favourable environments. While the molecular components are well characterised, the motor behaviour measured by tethered cell experiments has been difficult to interpret. Results We study the effects of sensing and signalling noise on the motor behaviour. Specifically, we consider fluctuations stemming from ligand concentration, receptor switching between their signalling states, adaptation, modification of proteins by phosphorylation, and motor switching between its two rotational states. We develop a model which includes all signalling steps in the pathway, and discuss a simplified version, which captures the essential features of the full model. We find that the noise characteristics of the motor contain signatures from all these processes, albeit with varying magnitudes. Conclusions Our analysis allows us to address how cell-to-cell variation affects motor behaviour and the question of optimal pathway design. A similar comprehensive analysis can be applied to other two-component signalling pathways. PMID:21951560
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldmann, I. P., E-mail: ingo@star.ucl.ac.uk
2014-01-01
Independent component analysis (ICA) has recently been shown to be a promising new path in data analysis and de-trending of exoplanetary time series signals. Such approaches do not require or assume any prior or auxiliary knowledge about the data or instrument in order to de-convolve the astrophysical light curve signal from instrument or stellar systematic noise. These methods are often known as 'blind-source separation' (BSS) algorithms. Unfortunately, all BSS methods suffer from an amplitude and sign ambiguity of their de-convolved components, which severely limits these methods in low signal-to-noise (S/N) observations where their scalings cannot be determined otherwise. Here we present a novel approach to calibrate ICA using sparse wavelet calibrators. The Amplitude Calibrated Independent Component Analysis (ACICA) allows for the direct retrieval of the independent components' scalings and the robust de-trending of low S/N data. Such an approach gives us a unique and unprecedented insight into the underlying morphology of a data set, which makes this method a powerful tool for exoplanetary data de-trending and signal diagnostics.
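The amplitude and sign ambiguity described above is easy to reproduce. Below is a minimal sketch (not the ACICA algorithm itself) using scikit-learn's FastICA on a synthetic transit signal mixed with a periodic systematic; the recovered components match the true sources only up to scale and sign. All signal shapes and the mixing matrix are invented for illustration.

```python
# A minimal BSS sketch illustrating the scaling/sign ambiguity of ICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)

# Synthetic "astrophysical" transit signal plus an instrument systematic.
transit = np.where(np.abs(t - 0.5) < 0.05, -1.0, 0.0)     # box-shaped dip
systematic = 0.5 * np.sin(2 * np.pi * 7 * t)              # periodic drift
S = np.c_[transit, systematic]

A = np.array([[1.0, 0.6], [0.8, 1.0]])                    # mixing matrix
X = S @ A.T + 0.01 * rng.standard_normal((t.size, 2))     # observed channels

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)                              # recovered sources

# The recovered components correlate with the transit only up to sign,
# and their amplitudes are arbitrary -- the ambiguity ACICA addresses.
for k in range(2):
    c = np.corrcoef(S_est[:, k], S[:, 0])[0, 1]
    print(f"component {k}: correlation with transit = {c:+.2f}")
```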
NASA Technical Reports Server (NTRS)
Oeftering, Richard C.; Wade, Raymond P.; Izadnegahdar, Alain
2011-01-01
The Component-Level Electronic-Assembly Repair (CLEAR) project at the NASA Glenn Research Center is aimed at developing technologies that will enable space-flight crews to perform in situ component-level repair of electronics on Moon and Mars outposts, where there is no existing infrastructure for logistics spares. These technologies must provide effective repair capabilities yet meet the payload and operational constraints of space facilities. Effective repair depends on a diagnostic capability that is versatile but easy to use by crew members who have limited training in electronics. CLEAR studied two techniques that involve extensive precharacterization of "known good" circuits to produce graphical signatures that provide an easy-to-use comparison method to quickly identify faulty components. Analog Signature Analysis (ASA) allows relatively rapid diagnostics of complex electronics by technicians with limited experience. Because of frequency limits and the growing dependence on broadband technologies, ASA must be augmented with other capabilities. To meet this challenge while preserving ease of use, CLEAR proposed an alternative called Complex Signature Analysis (CSA). Tests of ASA and CSA were used to compare capabilities and to determine whether the techniques provided an overlapping or complementary capability. The results showed that the methods are complementary.
Video analysis of projectile motion using tablet computers as experimental tools
NASA Astrophysics Data System (ADS)
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
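As a sketch of the component-by-component analysis step, assuming the app exports the tracked vertical positions as a time series, a quadratic least-squares fit recovers the initial conditions and g. The synthetic data below stand in for real tracking output.

```python
# y(t) = y0 + v0*t - (g/2)*t^2, so a parabola fit yields g and v0.
import numpy as np

g_true, v0, y0 = 9.81, 4.0, 1.2
t = np.arange(0, 0.8, 1 / 30)                    # 30 fps video frames
y = y0 + v0 * t - 0.5 * g_true * t**2
y += np.random.default_rng(1).normal(0, 0.005, t.size)   # tracking noise

a2, a1, a0 = np.polyfit(t, y, 2)                 # quadratic least squares
print(f"g ~ {-2 * a2:.2f} m/s^2, v0 ~ {a1:.2f} m/s, y0 ~ {a0:.2f} m")
```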
NASA Astrophysics Data System (ADS)
de Souza, V.; Apel, W. D.; Arteaga, J. C.; Badea, F.; Bekk, K.; Bertaina, M.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Brüggemann, M.; Buchholz, P.; Cantoni, E.; Chiavassa, A.; Cossavella, F.; Daumiller, K.; di Pierro, F.; Doll, P.; Engel, R.; Engler, J.; Finger, M.; Fuhrmann, D.; Ghia, P. L.; Gils, H. J.; Glasstetter, R.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kang, D.; Kickelbick, D.; Klages, H. O.; Kolotaev, Y.; Łuczak, P.; Mathes, H. J.; Mayer, H. J.; Milke, J.; Mitrica, B.; Morello, C.; Navarra, G.; Nehls, S.; Oehlschläger, J.; Ostapchenko, S.; Over, S.; Petcu, M.; Pierog, T.; Rebel, H.; Roth, M.; Schieler, H.; Schröder, F.; Sima, O.; Stümpert, M.; Toma, G.; Trinchero, G. C.; Ulrich, H.; van Buren, J.; Walkowiak, W.; Weindl, A.; Wochele, J.; Wommer, M.; Zabierowski, J.
2009-04-01
KASCADE-Grande is a multi-component detector located at Karlsruhe, Germany. It was optimized to measure cosmic ray air showers with energies between 5×10(16) and 10(18) eV. Its capabilities are based on the use of several techniques to measure the electromagnetic and muon components of the shower in an independent way, which allows a direct comparison to hadronic interaction models and a good estimation of the primary cosmic ray composition. In this paper, we present the status of the experiment, an update of the data analysis and the latest results.
A Geometric Interpretation of the Effective Uniaxial Anisotropy Field in Magnetic Films
NASA Astrophysics Data System (ADS)
Kozlov, V. I.
2018-01-01
It is shown that the effective uniaxial anisotropy field usually considered in thin magnetic films (TMFs), which is noncollinear to the magnetization vector, explains many physical processes in films but is insufficient for a deeper understanding of them. The analysis of the magnetization discontinuity in films under certain conditions yields the component of the effective uniaxial anisotropy field collinear to the magnetization vector. This component explains the magnetization discontinuity and allows one to speak of the total effective uniaxial anisotropy field in TMFs.
Nesakumar, Noel; Baskar, Chanthini; Kesavan, Srinivasan; Rayappan, John Bosco Balaguru; Alwarappan, Subbiah
2018-05-22
The moisture content of beetroot varies during long-term cold storage. In this work, we propose a strategy to identify the moisture content and age of beetroot using principal component analysis coupled with Fourier transform infrared spectroscopy (FTIR). Frequent FTIR measurements were recorded directly from the beetroot sample surface over a period of 34 days for analysing its moisture content, employing attenuated total reflectance in the spectral ranges of 2614-4000 and 1465-1853 cm(-1) with a spectral resolution of 8 cm(-1). In order to estimate the transmittance peak height (Tp) and the area under the transmittance curve over the spectral ranges of 2614-4000 and 1465-1853 cm(-1), a Gaussian curve fitting algorithm was applied to the FTIR data. Principal component and nonlinear regression analyses were utilized for FTIR data analysis. The score plot over the ranges of 2614-4000 and 1465-1853 cm(-1) allowed beetroot quality discrimination. Beetroot quality predictive models were developed by employing a biphasic dose response function. Validation experiment results confirmed that the accuracy of the beetroot quality predictive model reached 97.5%. This research work proves that FTIR spectroscopy in combination with principal component analysis and beetroot quality predictive models could serve as an effective tool for discriminating moisture content in fresh, half-spoiled and completely spoiled stages of beetroot samples and for providing status alerts.
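A hedged sketch of the peak-parameter step described above: fitting a Gaussian to a single transmittance band with scipy's curve_fit yields the peak height Tp and the band area. The band position, width, and noise level are illustrative, not the paper's values.

```python
# Fit a Gaussian dip to one transmittance band; report Tp and band area.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(nu, amp, center, width, offset):
    return offset - amp * np.exp(-((nu - center) / width) ** 2)

nu = np.linspace(2614, 4000, 500)                       # wavenumber, cm^-1
rng = np.random.default_rng(2)
y = gaussian(nu, 0.4, 3300, 180, 1.0) + rng.normal(0, 0.01, nu.size)

p0 = [0.3, 3350, 150, 1.0]                              # rough initial guess
popt, _ = curve_fit(gaussian, nu, y, p0=p0)
amp, center, width, offset = popt
area = amp * width * np.sqrt(np.pi)                     # analytic Gaussian area
print(f"Tp ~ {amp:.3f} at {center:.0f} cm^-1, band area ~ {area:.1f}")
```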
Analysis of feline and canine allergen components in patients sensitized to pets.
Ukleja-Sokołowska, Natalia; Gawrońska-Ukleja, Ewa; Żbikowska-Gotz, Magdalena; Socha, Ewa; Lis, Kinga; Sokołowski, Łukasz; Kuźmiński, Andrzej; Bartuzi, Zbigniew
2016-01-01
Component resolved allergen diagnosis allows for a precise evaluation of the sensitization profiles of patients sensitized to felines and canines. An accurate interpretation of these results allows better insight into the evolution of a given patient's sensitizations, and allows for a more precise evaluation of their prognoses. 70 patients (42 women and 28 men, aged 18-65, average 35.5) with a positive feline or canine allergy diagnosis were included in the research group. 30 patients with a negative allergy diagnosis were included in the control group. The total IgE levels of all patients with allergies as well as their allergen-specific IgE to feline and canine allergens were measured. Specific IgE levels to canine (Can f 1, Can f 2, Can f 3, Can f 5) and feline (Fel d 1, Fel d 2, Fel d 4) allergen components were also measured with the use of the ImmunoCap method. Monosensitization to only one canine or feline component was found in 30% of patients. As predicted, the main feline allergen was Fel d 1, which sensitized as many as 93.9% of patients sensitized to felines. Among 65 patients sensitized to at least one feline component, for 30 patients (46.2%) the only sensitizing feline component was Fel d 1. In that group, 19 patients (63.3%) were not simultaneously sensitized to dogs, while 11 (36.7%), despite their isolated sensitization to feline Fel d 1, displayed concurrent sensitizations to one of the canine allergen components. Fel d 4 sensitized 49.2% of the research group. 64.3% of patients sensitized to canine components had heightened levels of specific IgE to Can f 1. Monosensitization in that group occurred for 32.1% of the patients. Sensitization to Can f 5 was observed among 52.4% of the patients. Concurrent sensitizations to several allergen components, not only cross-reactive but also originating in different protein families, are a significant problem for patients sensitized to animals.
NASA Astrophysics Data System (ADS)
Werth, Alexandra; Liakat, Sabbir; Dong, Anqi; Woods, Callie M.; Gmachl, Claire F.
2018-05-01
An integrating sphere is used to enhance the collection of backscattered light in a noninvasive glucose sensor based on quantum cascade laser spectroscopy. The sphere enhances signal stability by roughly an order of magnitude, allowing us to use a thermoelectrically (TE) cooled detector while maintaining comparable glucose prediction accuracy levels. Using a smaller TE-cooled detector reduces the form factor, creating a mobile sensor. Principal component analysis of spectra taken from human subjects yields principal components that closely match the absorption peaks of glucose. These principal components are used as regressors in a linear regression algorithm to make glucose concentration predictions, over 75% of which are clinically accurate.
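The prediction pipeline described above is essentially principal component regression. A minimal sketch, with synthetic spectra standing in for the measured backscatter spectra and an invented glucose-dependent feature, might look as follows.

```python
# Principal component regression: PCA scores as regressors for glucose.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n_samples, n_wavelengths = 40, 200
spectra = rng.standard_normal((n_samples, n_wavelengths))   # baseline noise
glucose = rng.uniform(70, 250, n_samples)                   # mg/dL, synthetic

# Embed a glucose-dependent absorption feature so the regression has signal.
feature = np.exp(-np.linspace(-3, 3, n_wavelengths) ** 2)
spectra += np.outer(glucose / 50.0, feature)

model = make_pipeline(PCA(n_components=5), LinearRegression())
model.fit(spectra, glucose)
print("predicted:", model.predict(spectra[:3]).round(1))
print("actual:   ", glucose[:3].round(1))
```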
Planning Models for Tuberculosis Control Programs
Chorba, Ronald W.; Sanders, J. L.
1971-01-01
A discrete-state, discrete-time simulation model of tuberculosis is presented, with submodels of preventive interventions. The model allows prediction of the prevalence of the disease over the simulation period. Preventive and control programs and their optimal budgets may be planned by using the model for cost-benefit analysis: costs are assigned to the program components and disease outcomes to determine the ratio of program expenditures to future savings on medical and socioeconomic costs of tuberculosis. Optimization is achieved by allocating funds in successive increments to alternative program components in simulation and identifying those components that lead to the greatest reduction in prevalence for the given level of expenditure. The method is applied to four hypothetical disease prevalence situations. PMID:4999448
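The incremental allocation procedure lends itself to a simple greedy sketch: at each step, the next budget increment goes to whichever program component the simulation credits with the largest marginal reduction in prevalence. The response curves and dollar figures below are invented stand-ins for the simulation model.

```python
# Greedy incremental budget allocation across program components.
import math

def prevalence_reduction(component, funds):
    # Diminishing returns per component (illustrative parameters only).
    effectiveness = {"screening": 0.8, "treatment": 1.2, "vaccination": 0.5}
    return effectiveness[component] * math.log1p(funds / 10_000)

budget, increment = 100_000, 10_000
allocation = {c: 0 for c in ("screening", "treatment", "vaccination")}

while budget > 0:
    # Marginal benefit of the next increment for each component.
    gains = {c: prevalence_reduction(c, allocation[c] + increment)
                - prevalence_reduction(c, allocation[c])
             for c in allocation}
    best = max(gains, key=gains.get)
    allocation[best] += increment
    budget -= increment

print(allocation)
```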
NASA Astrophysics Data System (ADS)
Piron, P.; Vargas Catalan, E.; Karlsson, M.
2018-02-01
Subwavelength gratings are gratings with a period smaller than the incident wavelength. They allow only the zeroth order of diffraction, possess form birefringence, and can be modeled as birefringent plates. In this paper, we present the first results of an experimental method designed to measure their polarization properties. The method consists of measuring the variation of the light transmitted through two linear polarizers, with the subwavelength component between them, for several orientations of the polarizers. The basic principles of the method are introduced and the experimental setup is presented. Several types of components are numerically studied and the optical measurements of one component are presented.
Structural analysis of gluten-free doughs by fractional rheological model
NASA Astrophysics Data System (ADS)
Orczykowska, Magdalena; Dziubiński, Marek; Owczarz, Piotr
2015-02-01
This study examines the effects of various components of the tested gluten-free doughs, such as corn starch, amaranth flour, pea protein isolate, and cellulose in the form of plantain fibers, on the rheological properties of such doughs. The rheological properties of gluten-free doughs were assessed by using the rheological fractional standard linear solid model (FSLSM). Parameter analysis of the Maxwell-Wiechert fractional derivative rheological model allows us to state that gluten-free doughs present the typical behavior of viscoelastic quasi-solid bodies. We obtained the contribution of each component used in the preparation of gluten-free doughs to either a hard-gel or soft-gel structure. The detailed analysis of the mechanical structure of gluten-free dough was done by applying the FSLSM, which explains quite precisely the effects of individual ingredients of the dough on its rheological properties.
Raman signatures of ferroic domain walls captured by principal component analysis.
Nataf, G F; Barrett, N; Kreisel, J; Guennou, M
2018-01-24
Ferroic domain walls are currently investigated by several state-of-the-art techniques in order to get a better understanding of their distinct, functional properties. Here, principal component analysis (PCA) of Raman maps is used to study ferroelectric domain walls (DWs) in LiNbO3 and ferroelastic DWs in NdGaO3. It is shown that PCA allows us to quickly and reliably identify small Raman peak variations at ferroelectric DWs and that the value of a peak shift can be deduced, accurately and without a priori assumptions, from a first-order Taylor expansion of the spectra. The ability of PCA to separate the contributions of ferroelastic domains and DWs to Raman spectra is emphasized. More generally, our results provide a novel route for the statistical analysis of any property mapped across a DW.
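The first-order Taylor argument can be checked numerically: for a small shift d, a spectrum obeys f(x - d) ≈ f(x) - d·f'(x), so PCA of a map of slightly shifted peaks returns a derivative-like line shape as its leading component, with scores proportional to the local shift. A minimal sketch with a synthetic Gaussian band:

```python
# PCA recovers a peak shift as a derivative-shaped leading component.
import numpy as np
from sklearn.decomposition import PCA

x = np.linspace(-10, 10, 400)
peak = lambda x0: np.exp(-((x - x0) ** 2) / 2.0)

shifts = np.linspace(0, 0.2, 50)          # small shifts, e.g. across a DW
spectra = np.array([peak(d) for d in shifts])

pca = PCA(n_components=1)
scores = pca.fit_transform(spectra)[:, 0]

# Scores vary linearly with the imposed shift (sign of the component
# is arbitrary, as always with PCA).
slope = np.polyfit(shifts, scores, 1)[0]
print(f"score vs. shift correlation: "
      f"{np.corrcoef(scores, shifts)[0, 1]:+.4f}, slope {slope:+.3f}")
```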
Lipophilicity of oils and fats estimated by TLC.
Naşcu-Briciu, Rodica D; Sârbu, Costel
2013-04-01
A representative series of natural toxins belonging to the alkaloid and mycotoxin classes was investigated by TLC on classical chemically bonded plates and also on oil- and fat-impregnated plates. Their lipophilicity indices are employed in the characterization and comparison of oils and fats. The retention results allowed an accurate indirect estimation of oil and fat lipophilicity. The investigated fats and oils, alongside classical chemically bonded phases, are classified and compared by means of multivariate exploratory techniques, such as cluster analysis, principal component analysis, and fuzzy principal component analysis. Additionally, a concrete hierarchy of oils and fats derived from the observed lipophilic character is suggested. Human fat appears very similar to animal fats, but also to the RP-18, RP-18W, and RP-8 phases. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ceramic component reliability with the restructured NASA/CARES computer program
NASA Technical Reports Server (NTRS)
Powers, Lynn M.; Starlinger, Alois; Gyekenyesi, John P.
1992-01-01
The Ceramics Analysis and Reliability Evaluation of Structures (CARES) integrated design program on statistical fast fracture reliability of monolithic ceramic components is enhanced to include the use of a neutral data base, two-dimensional modeling, and variable problem size. The data base allows for the efficient transfer of element stresses, temperatures, and volumes/areas from the finite element output to the reliability analysis program. Elements are divided to ensure a direct correspondence between the subelements and the Gaussian integration points. Two-dimensional modeling is accomplished by assessing the volume flaw reliability with shell elements. To demonstrate the improvements in the algorithm, example problems are selected from a round-robin conducted by WELFEP (WEakest Link failure probability prediction by Finite Element Postprocessors).
Maneshi, Mona; Vahdat, Shahabeddin; Gotman, Jean; Grova, Christophe
2016-01-01
Independent component analysis (ICA) has been widely used to study functional magnetic resonance imaging (fMRI) connectivity. However, the application of ICA in multi-group designs is not straightforward. We have recently developed a new method named “shared and specific independent component analysis” (SSICA) to perform between-group comparisons in the ICA framework. SSICA is sensitive to extract those components which represent a significant difference in functional connectivity between groups or conditions, i.e., components that could be considered “specific” for a group or condition. Here, we investigated the performance of SSICA on realistic simulations, and task fMRI data and compared the results with one of the state-of-the-art group ICA approaches to infer between-group differences. We examined SSICA robustness with respect to the number of allowable extracted specific components and between-group orthogonality assumptions. Furthermore, we proposed a modified formulation of the back-reconstruction method to generate group-level t-statistics maps based on SSICA results. We also evaluated the consistency and specificity of the extracted specific components by SSICA. The results on realistic simulated and real fMRI data showed that SSICA outperforms the regular group ICA approach in terms of reconstruction and classification performance. We demonstrated that SSICA is a powerful data-driven approach to detect patterns of differences in functional connectivity across groups/conditions, particularly in model-free designs such as resting-state fMRI. Our findings in task fMRI show that SSICA confirms results of the general linear model (GLM) analysis and when combined with clustering analysis, it complements GLM findings by providing additional information regarding the reliability and specificity of networks. PMID:27729843
Smolinski, Tomasz G; Buchanan, Roger; Boratyn, Grzegorz M; Milanova, Mariofanna; Prinz, Astrid A
2006-01-01
Background Independent Component Analysis (ICA) proves to be useful in the analysis of neural activity, as it allows for identification of distinct sources of activity. Applied to measurements registered in a controlled setting and under exposure to an external stimulus, it can facilitate analysis of the impact of the stimulus on those sources. The link between the stimulus and a given source can be verified by a classifier that is able to "predict" the condition a given signal was registered under, solely based on the components. However, the ICA's assumption about statistical independence of sources is often unrealistic and turns out to be insufficient to build an accurate classifier. Therefore, we propose to utilize a novel method, based on hybridization of ICA, multi-objective evolutionary algorithms (MOEA), and rough sets (RS), that attempts to improve the effectiveness of signal decomposition techniques by providing them with "classification-awareness." Results The preliminary results described here are very promising, and further investigation of other MOEAs and/or RS-based classification accuracy measures should be pursued. Even a quick visual analysis of those results can provide an interesting insight into the problem of neural activity analysis. Conclusion We present a methodology of classificatory decomposition of signals. One of the main advantages of our approach is the fact that rather than solely relying on often unrealistic assumptions about statistical independence of sources, components are generated in light of the underlying classification problem itself. PMID:17118151
Shellie, Robert; Marriott, Philip; Morrison, Paul
2004-09-01
The use of gas chromatography (GC)-mass spectrometry (MS), GC-time-of-flight MS (TOFMS), comprehensive two-dimensional GC (GCxGC)-flame ionization detection (FID), and GCxGC-TOFMS is discussed for the characterization of the eight important representative components, including Z-alpha-santalol, epi-alpha-bisabolol, Z-alpha-trans-bergamotol, epi-beta-santalol, Z-beta-santalol, E,E-farnesol, Z-nuciferol, and Z-lanceol, in the oil of west Australian sandalwood (Santalum spicatum). Single-column GC-MS lacks the resolving power to separate all of the listed components as pure peaks and allow precise analytical measurement of individual component abundances. With enhanced peak resolution capabilities in GCxGC, these components are sufficiently well resolved to be quantitated using flame ionization detection, following initial characterization of components by using GCxGC-TOFMS.
Three-Point Flexural Properties of Bonded Reinforcement Elements for Pleasure Craft Decks
NASA Astrophysics Data System (ADS)
Di Bella, G.; Galtieri, G.; Borsellino, C.
2018-02-01
The aim of this work was both to study the performance of pleasure craft reinforced components bonded using a structural adhesive, and to compare them with those obtained using over-lamination as the joining system typically employed in shipbuilding. With this aim, two different lots of components were prepared: in the first lot, the reinforcement structures were laminated directly on the investigated composite components; in the second, they were made separately in a mould and then bonded to the composite components. This last method allowed evaluation of the introduction of a product/process innovation in a field typically resistant to innovation, still tied to craft and non-standardized procedures. The results of bending tests, performed in order to evaluate the mechanical behaviour of the reinforced components, demonstrated the soundness of this innovative design choice. Finally, a finite element analysis was performed.
Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques
2017-10-01
The availability of a deep well that penetrates deep into Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study these rocks. One such borehole, the Chinese Continental Scientific Drilling Main Hole, is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of Eastern China. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ gamma ray spectroscopy measurements of major and trace elements in the borehole. Dry weight percent oxide concentration logs obtained for this study were SiO2, K2O, TiO2, H2O, CO2, Na2O, Fe2O3, FeO, CaO, MnO, MgO, P2O5 and Al2O3. Cross plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description, respectively. Cross plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data are sufficiently accurate to be tremendously useful in UHP metamorphic rock analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
PCA/HEXTE Observations of Coma and A2319
NASA Technical Reports Server (NTRS)
Rephaeli, Yoel
1998-01-01
The Coma cluster was observed in 1996 for 90 ks by the PCA and HEXTE instruments aboard the RXTE satellite, the first simultaneous, pointed measurement of Coma in the broad 2-250 keV energy band. The high sensitivity achieved during this long observation allows precise determination of the spectrum. Our analysis of the measurements clearly indicates that in addition to the main thermal emission from hot intracluster gas at kT = 7.5 keV, a second spectral component is required to best fit the data. If thermal, it can be described with a temperature of 4.7 keV contributing about 20% of the total flux. The additional spectral component can also be described by a power law, possibly due to Compton scattering of relativistic electrons by the CMB. This interpretation is based on the diffuse radio synchrotron emission, which has a spectral index of 2.34, within the range allowed by fits to the RXTE spectral data. A Compton origin of the measured nonthermal component would imply that the volume-averaged magnetic field in the central region of Coma is B = 0.2 micro-Gauss, a value deduced directly from the radio and X-ray measurements (and thus free of the usual assumption of energy equipartition). Barring the presence of unknown systematic errors in the RXTE source or background measurements, our spectral analysis yields considerable evidence for Compton X-ray emission in the Coma cluster.
Simulink models for performance analysis of high speed DQPSK modulated optical link
NASA Astrophysics Data System (ADS)
Sharan, Lucky; Rupanshi; Chaubey, V. K.
2016-03-01
This paper attempts to present the design approach for development of simulation models to study and analyze the transmission of 10 Gbps DQPSK signal over a single channel Peer to Peer link using Matlab Simulink. The simulation model considers the different optical components used in link design with their behavior represented initially by theoretical interpretation, including the transmitter topology, Mach Zehnder Modulator(MZM) module and, the propagation model for optical fibers etc. thus allowing scope for direct realization in experimental configurations. It provides the flexibility to incorporate the various photonic components as either user-defined or fixed and, can also be enhanced or removed from the model as per the design requirements. We describe the detailed operation and need of every component model and its representation in Simulink blocksets. Moreover the developed model can be extended in future to support Dense Wavelength Division Multiplexing (DWDM) system, thereby allowing high speed transmission with N × 40 Gbps systems. The various compensation techniques and their influence on system performance can be easily investigated by using such models.
Simulink models for performance analysis of high speed DQPSK modulated optical link
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharan, Lucky, E-mail: luckysharan@pilani.bits-pilani.ac.in; Rupanshi, E-mail: f2011222@pilani.bits-pilani.ac.in; Chaubey, V. K., E-mail: vkc@pilani.bits-pilani.ac.in
2016-03-09
This paper attempts to present the design approach for development of simulation models to study and analyze the transmission of 10 Gbps DQPSK signal over a single channel Peer to Peer link using Matlab Simulink. The simulation model considers the different optical components used in link design with their behavior represented initially by theoretical interpretation, including the transmitter topology, Mach Zehnder Modulator(MZM) module and, the propagation model for optical fibers etc. thus allowing scope for direct realization in experimental configurations. It provides the flexibility to incorporate the various photonic components as either user-defined or fixed and, can also be enhanced or removed from the model as per the design requirements. We describe the detailed operation and need of every component model and its representation in Simulink blocksets. Moreover the developed model can be extended in future to support Dense Wavelength Division Multiplexing (DWDM) system, thereby allowing high speed transmission with N × 40 Gbps systems. The various compensation techniques and their influence on system performance can be easily investigated by using such models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilke, Jeremiah J; Kenny, Joseph P.
2015-02-01
Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
Ship Speed Retrieval From Single Channel TerraSAR-X Data
NASA Astrophysics Data System (ADS)
Soccorsi, Matteo; Lehner, Susanne
2010-04-01
A method to estimate the speed of a moving ship is presented. The technique, introduced in Kirscht (1998), is extended to marine applications and validated on TerraSAR-X High-Resolution (HR) data. The generation of a sequence of single-look SAR images from a single-channel image corresponds to an image time series with reduced resolution. This allows applying change detection techniques to the time series to evaluate the velocity components of the ship in range and azimuth. The evaluation of the displacement vector of a moving target in consecutive images of the sequence allows the estimation of the azimuth velocity component. The range velocity component is estimated by evaluating the variation of the signal amplitude during the sequence. In order to apply the technique to TerraSAR-X Spot Light (SL) data, a further processing step is needed: the phase has to be corrected as presented in Eineder et al. (2009) due to the SL acquisition mode; otherwise the image sequence cannot be generated. The analysis, when possible validated by the Automatic Identification System (AIS), was performed in the framework of the ESA project MARISS.
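A simplified sketch of the displacement-estimation step alone (not the published single-channel sequence generation or the SL phase correction): the shift of a bright target between two consecutive looks can be located by 2-D cross-correlation and converted to an azimuth velocity. Pixel spacing and frame interval below are assumed values.

```python
# Locate a target's shift between two looks via 2-D cross-correlation.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(4)
img1 = rng.normal(0, 0.1, (64, 64))      # look 1 (speckle-like noise)
img2 = rng.normal(0, 0.1, (64, 64))      # look 2
img1[30, 30] = img2[30, 33] = 5.0        # ship moved 3 pixels in azimuth

corr = correlate2d(img2, img1, mode="same")
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
shift_rg = dy - img1.shape[0] // 2       # range shift, pixels
shift_az = dx - img1.shape[1] // 2       # azimuth shift, pixels

pixel_spacing, dt = 1.0, 0.2             # m/pixel and s between looks (assumed)
print(f"range shift {shift_rg} px, "
      f"azimuth velocity ~ {shift_az * pixel_spacing / dt:.1f} m/s")
```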
Tidal analysis of Met rocket wind data
NASA Technical Reports Server (NTRS)
Bedinger, J. F.; Constantinides, E.
1976-01-01
A method of analyzing Met Rocket wind data is described. Modern tidal theory and specialized analytical techniques were used to resolve specific tidal modes and prevailing components in observed wind data. A representation of the wind which is continuous in both space and time was formulated. Such a representation allows direct comparison with theory, allows the derivation of other quantities such as temperature and pressure which in turn may be compared with observed values, and allows the formation of a wind model which extends over a broader range of space and time. Significant diurnal tidal modes with wavelengths of 10 and 7 km were present in the data and were resolved by the analytical technique.
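A minimal sketch of resolving a prevailing component and a diurnal tidal mode: a least-squares fit of a mean wind plus a 24-hour harmonic pair, the same idea the analysis applies mode by mode. The synthetic winds below stand in for Met Rocket profiles.

```python
# Least-squares fit of prevailing wind + diurnal tide to a wind series.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 48, 97)                       # hours, two days of data
wind = 10 + 5 * np.cos(2 * np.pi * (t - 6) / 24) + rng.normal(0, 1, t.size)

# Design matrix: prevailing component + diurnal cosine/sine pair.
omega = 2 * np.pi / 24
A = np.c_[np.ones_like(t), np.cos(omega * t), np.sin(omega * t)]
mean, c, s = np.linalg.lstsq(A, wind, rcond=None)[0]

amplitude, phase = np.hypot(c, s), np.arctan2(s, c)
print(f"prevailing {mean:.1f} m/s, diurnal amplitude {amplitude:.1f} m/s, "
      f"phase {np.degrees(phase):.0f} deg")
```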
2013-12-13
components can be part of multiple systems simultaneously. Measures of effectiveness determine how well a certain action is meeting its operational...have been collected throughout this literature review as well as during the analysis of data in chapter 4. Performance measures allow for an...for analysis as well. Eradication, however, can occur throughout the year. This is particularly true in South America, because coca can be harvested
Kalivas, John H; Georgiou, Constantinos A; Moira, Marianna; Tsafaras, Ilias; Petrakis, Eleftherios A; Mousdis, George A
2014-04-01
Quantitative analysis of food adulterants is an important health and economic issue that needs to be fast and simple. Spectroscopy has significantly reduced analysis time. However, still needed are preparations of analyte calibration samples matrix-matched to prediction samples, which can be laborious and costly. Reported in this paper is the application of a newly developed pure component Tikhonov regularization (PCTR) process that does not require laboratory-prepared calibration samples or reference analysis methods and, hence, is a greener calibration method. The PCTR method requires an analyte pure component spectrum and non-analyte spectra. As a food analysis example, synchronous fluorescence spectra of extra virgin olive oil samples adulterated with sunflower oil are used. Results are shown to be better than those obtained using ridge regression with reference calibration samples. The flexibility of PCTR allows the inclusion of reference samples, and the approach is generic for use with other instrumental methods and food products. Copyright © 2013 Elsevier Ltd. All rights reserved.
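A hedged sketch of the pure-component calibration idea (not the published PCTR code): build a regularized model vector b that gives unit response to the analyte pure-component spectrum s while suppressing the non-analyte spectra N, by minimizing ||Nb||² + λ||b||² subject to s·b = 1. All spectra below are synthetic placeholders.

```python
# Tikhonov-style pure-component calibration with a closed-form solve.
import numpy as np

rng = np.random.default_rng(6)
n_channels = 120
grid = np.linspace(-4, 4, n_channels)
pure = np.exp(-grid**2)                          # analyte pure spectrum s
N = rng.standard_normal((25, n_channels))        # non-analyte spectra

lam = 0.01                                       # small ridge parameter
# b is proportional to (N^T N + lam I)^-1 s, rescaled so that s.b = 1.
M = N.T @ N + lam * np.eye(n_channels)
b = np.linalg.solve(M, pure)
b /= pure @ b                                    # enforce unit analyte response

# A mixture spectrum: 0.3 analyte plus an interferent background.
sample = 0.3 * pure + 0.5 * N[0]
print(f"estimated analyte amount: {sample @ b:.3f}")   # close to 0.3
```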
Gosetti, Fabio; Chiuminatto, Ugo; Mazzucco, Eleonora; Mastroianni, Rita; Marengo, Emilio
2015-01-15
The study investigates the sunlight photodegradation process of carminic acid, a natural red colourant used in beverages. For this purpose, both carminic acid aqueous standard solutions and sixteen different commercial beverages, ten containing carminic acid and six containing E120 dye, were subjected to photoirradiation. The results show different patterns of degradation, not only between the standard solutions and the beverages, but also from beverage to beverage. Due to the different beverage recipes, unpredictable reactions take place between the dye and the other ingredients. To identify the dye degradation products in a very complex scenario, a methodology was used, based on the combined use of principal component analysis with discriminant analysis and ultra-high-performance liquid chromatography coupled with tandem high resolution mass spectrometry. The methodology is unaffected by beverage composition and allows the degradation products of carminic acid dye to be identified for each beverage. Copyright © 2014 Elsevier Ltd. All rights reserved.
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)
2002-01-01
In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory architectures.
Develop advanced nonlinear signal analysis topographical mapping system
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1993-01-01
This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will be an advanced nonlinear signal analysis topographical mapping system (ATMS), a nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbopump families.
Cocco, Simona; Monasson, Remi; Weigt, Martin
2013-01-01
Various approaches have explored the covariation of residues in multiple-sequence alignments of homologous proteins to extract functional and structural information. Among those are principal component analysis (PCA), which identifies the most correlated groups of residues, and direct coupling analysis (DCA), a global inference method based on the maximum entropy principle, which aims at predicting residue-residue contacts. In this paper, inspired by the statistical physics of disordered systems, we introduce the Hopfield-Potts model to naturally interpolate between these two approaches. The Hopfield-Potts model allows us to identify relevant ‘patterns’ of residues from the knowledge of the eigenmodes and eigenvalues of the residue-residue correlation matrix. We show how the computation of such statistical patterns makes it possible to accurately predict residue-residue contacts with a much smaller number of parameters than DCA. This dimensional reduction allows us to avoid overfitting and to extract contact information from multiple-sequence alignments of reduced size. In addition, we show that low-eigenvalue correlation modes, discarded by PCA, are important to recover structural information: the corresponding patterns are highly localized, that is, they are concentrated in few sites, which we find to be in close contact in the three-dimensional protein fold. PMID:23990764
MPIC: a mitochondrial protein import components database for plant and non-plant species.
Murcha, Monika W; Narsai, Reena; Devenish, James; Kubiszewski-Jakubiak, Szymon; Whelan, James
2015-01-01
In the 2 billion years since the endosymbiotic event that gave rise to mitochondria, variations in mitochondrial protein import have evolved across different species. With the genomes of an increasing number of plant species sequenced, it is possible to gain novel insights into mitochondrial protein import pathways. We have generated the Mitochondrial Protein Import Components (MPIC) Database (DB; http://www.plantenergy.uwa.edu.au/applications/mpic) providing searchable information on the protein import apparatus of plant and non-plant mitochondria. An in silico analysis was carried out, comparing the mitochondrial protein import apparatus from 24 species representing various lineages from Saccharomyces cerevisiae (yeast) and algae to Homo sapiens (human) and higher plants, including Arabidopsis thaliana (Arabidopsis), Oryza sativa (rice) and other more recently sequenced plant species. Each of these species was extensively searched and manually assembled for analysis in the MPIC DB. The database presents an interactive diagram in a user-friendly manner, allowing users to select their import component of interest. The MPIC DB presents an extensive resource facilitating detailed investigation of the mitochondrial protein import machinery and allowing patterns of conservation and divergence to be recognized that would otherwise have been missed. To demonstrate the usefulness of the MPIC DB, we present a comparative analysis of the mitochondrial protein import machinery in plants and non-plant species, revealing plant-specific features that have evolved. © The Author 2014. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Measurement of 0.511-MeV gamma rays with a balloon-borne Ge/Li/ spectrometer
NASA Technical Reports Server (NTRS)
Ling, J. C.; Mahoney, W. A.; Willett, J. B.; Jacobson, A. S.
1977-01-01
A collimated high-resolution gamma ray spectrometer was flown on a balloon over Palestine, Texas, on June 10, 1974, to obtain measurements of the terrestrial and extraterrestrial 0.511-MeV gamma rays. The spectrometer consists of four 40-cu-cm Ge(Li) crystals operating in the energy range 0.06-10 MeV; this cluster of detectors is surrounded by a CsI(Na) anticoincidence shield. This system is used primarily to allow measurements of the two escape peaks associated with high-energy gamma ray lines. It also allows a measurement of the background component of the 0.511-MeV flux produced by beta(+) decays in materials inside the CsI(Na) shield. It is shown that the measurements of the atmospheric fluxes are consistent with earlier results after allowance is made for an additional component of the background due to beta(+) decays produced by neutron- and proton-initiated interactions with materials in and near the detector. Results of the extraterrestrial flux require an extensive detailed analysis of the time-varying background because of activation buildup and balloon spatial drifts.
An Analysis of Multiple Configurations of Next-Generation Cathodes in a Low Power Hall Thruster
2009-03-01
compressor, the roughing pump , and the cryo-head temperature indicators. Figure 6. SPASS lab vacuum chamber and associated components. To measure...in progress to add additional cryo- pumps to the existing vacuum chamber that may allow higher propellant flow rates without exceeding ~1x10-5 torr... Vacuum Facility .........................................................................................................45 Test Assembly
Kevin M. Potter
2009-01-01
Forest genetic sustainability is an important component of forest health because genetic diversity and evolutionary processes allow for the adaptation of species and for the maintenance of ecosystem functionality and resilience. Phylogenetic community analyses, a set of new statistical methods for describing the evolutionary relationships among species, offer an...
Hydrogen from coal cost estimation guidebook
NASA Technical Reports Server (NTRS)
Billings, R. E.
1981-01-01
In an effort to establish baseline information whereby specific projects can be evaluated, a current set of parameters which are typical of coal gasification applications was developed. Using these parameters a computer model allows researchers to interrelate cost components in a sensitivity analysis. The results make possible an approximate estimation of hydrogen energy economics from coal, under a variety of circumstances.
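A toy sketch of the kind of sensitivity analysis such a model supports: a simple levelized-cost relation for hydrogen from coal, swept one cost component at a time. The formula and all parameter values are illustrative assumptions, not figures from the guidebook.

```python
# One-at-a-time sensitivity sweep over interrelated cost components.
def hydrogen_cost(coal_price=2.0,      # $/GJ of coal (assumed)
                  efficiency=0.55,     # GJ H2 per GJ coal (assumed)
                  capital_charge=3.0,  # $/GJ H2, annualized capital (assumed)
                  o_and_m=1.0):        # $/GJ H2, operations and maintenance
    return coal_price / efficiency + capital_charge + o_and_m

base = hydrogen_cost()
cases = {"coal price +50%": {"coal_price": 3.0},
         "efficiency -10 pts": {"efficiency": 0.45},
         "capital +50%": {"capital_charge": 4.5}}
for label, kwargs in cases.items():
    delta = hydrogen_cost(**kwargs) - base
    print(f"{label}: {delta:+.2f} $/GJ (base {base:.2f} $/GJ)")
```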
Luce, Robert; Hildebrandt, Peter; Kuhlmann, Uwe; Liesen, Jörg
2016-09-01
The key challenge of time-resolved Raman spectroscopy is the identification of the constituent species and the analysis of the kinetics of the underlying reaction network. In this work we present an integral approach that allows for determining both the component spectra and the rate constants simultaneously from a series of vibrational spectra. It is based on an algorithm for nonnegative matrix factorization that is applied to the experimental data set following a few pre-processing steps. As a prerequisite for physically unambiguous solutions, each component spectrum must include one vibrational band that does not significantly interfere with the vibrational bands of other species. The approach is applied to synthetic "experimental" spectra derived from model systems comprising a set of species with component spectra differing with respect to their degree of spectral interferences and signal-to-noise ratios. In each case, the species involved are connected via monomolecular reaction pathways. The potential and limitations of the approach for recovering the respective rate constants and component spectra are discussed. © The Author(s) 2016.
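A minimal sketch of the factorization step, assuming the spectra have already been pre-processed: nonnegative matrix factorization splits a time series of spectra D (time × wavenumber) into component spectra and time-dependent concentration profiles, from which rate constants could then be fitted. The two-component A→B kinetics below are synthetic.

```python
# NMF factorization of a time series of spectra into spectra + kinetics.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)
x = np.linspace(0, 10, 300)
band = lambda c, w: np.exp(-((x - c) / w) ** 2)
S_true = np.array([band(3, 0.4), band(7, 0.4)])     # two component spectra

t = np.linspace(0, 5, 40)
C_true = np.c_[np.exp(-t), 1 - np.exp(-t)]          # A -> B kinetics
D = C_true @ S_true + 0.01 * rng.random((t.size, x.size))

nmf = NMF(n_components=2, init="nndsvd", max_iter=500)
C = nmf.fit_transform(D)                            # concentration profiles
S = nmf.components_                                 # component spectra
print("recovered shapes:", C.shape, S.shape)
```

Note that each synthetic component here has a band that does not interfere with the other, matching the prerequisite stated in the abstract for physically unambiguous solutions.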
Evaluation of surface energy and radiation balance systems on the Konza Prairie
NASA Technical Reports Server (NTRS)
Fritschen, Leo J.
1987-01-01
Four Surface Energy and Radiation Balance Systems (SERBS) were installed and operated for two weeks in Kansas during July of 1986. Surface energy and radiation balances were investigated on six sites on the Konza Prairie about 3 km south of Manhattan, Kansas. Measurements were made to allow the computation of these radiation components: total solar and diffuse radiation, reflected solar radiation, net radiation, and longwave radiation upward and downward. Measurements were also made to allow the computation of the sensible and latent heat fluxes by the Bowen ratio method, using differential psychrometers on automatic exchange mechanisms. The report includes a description of the experimental sites, data acquisition systems and sensors, data acquisition system operating instructions, and the software used for data acquisition and analysis. In addition, data listings and plots of the energy balance components for all days and systems are given.
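A minimal sketch of the Bowen ratio computation the differential psychrometers support: the ratio β = γΔT/Δe partitions the available energy Rn − G into sensible (H) and latent (LE) heat flux. The input values below are illustrative.

```python
# Bowen ratio energy-balance partitioning of available energy.
GAMMA = 0.066  # psychrometric constant, kPa/K (near sea level)

def bowen_fluxes(rn, g, dT, de):
    """rn, g in W/m^2; dT in K and de in kPa over the same height interval."""
    beta = GAMMA * dT / de              # Bowen ratio, H/LE
    le = (rn - g) / (1.0 + beta)        # latent heat flux
    return beta, (rn - g) - le, le      # beta, H, LE

beta, H, LE = bowen_fluxes(rn=450.0, g=50.0, dT=0.8, de=0.12)
print(f"beta = {beta:.2f}, H = {H:.0f} W/m^2, LE = {LE:.0f} W/m^2")
```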
Automation of a N-S S and C Database Generation for the Harrier in Ground Effect
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Chaderjian, Neal M.; Pandya, Shishir; Kwak, Dochan (Technical Monitor)
2001-01-01
A method of automating the generation of a time-dependent, Navier-Stokes static stability and control database for the Harrier aircraft in ground effect is outlined. Reusable, lightweight components are described which allow different facets of the computational fluid dynamic simulation process to utilize a consistent interface to a remote database. These components also allow changes and customizations to be easily incorporated into the solution process to enhance performance, without relying upon third-party support. An analysis of the multi-level parallel solver OVERFLOW-MLP is presented, and the results indicate that it is feasible to utilize large numbers of processors (approximately 100) even with a grid system with a relatively small number of cells (approximately 10(exp 6)). A more detailed discussion of the simulation process, as well as refined data for the scaling of the OVERFLOW-MLP flow solver, will be included in the full paper.
Predicting and explaining the movement of mesoscale oceanographic features using CLIPS
NASA Technical Reports Server (NTRS)
Bridges, Susan; Chen, Liang-Chun; Lybanon, Matthew
1994-01-01
The Naval Research Laboratory has developed an oceanographic expert system that describes the evolution of mesoscale features in the Gulf Stream region of the northwest Atlantic Ocean. These features include the Gulf Stream current and the warm and cold core eddies associated with the Gulf Stream. An explanation capability was added to the eddy prediction component of the expert system in order to allow the system to justify the reasoning process it uses to make predictions. The eddy prediction and explanation components of the system have recently been redesigned and translated from OPS83 to C and CLIPS and the new system is called WATE (Where Are Those Eddies). The new design has improved the system's readability, understandability and maintainability and will also allow the system to be incorporated into the Semi-Automated Mesoscale Analysis System which will eventually be embedded into the Navy's Tactical Environmental Support System, Third Generation, TESS(3).
Joining of Silicon Carbide Through the Diffusion Bonding Approach
NASA Technical Reports Server (NTRS)
Halbig, Michael; Singh, Mrityunjay
2009-01-01
In order for ceramics to be fully utilized as components for high-temperature and structural applications, joining and integration methods are needed. Such methods will allow for the fabrication of complex shapes and for the insertion of a ceramic component into a system that may have different adjacent materials. Monolithic silicon carbide (SiC) is a ceramic material of focus due to its high-temperature strength and stability. Titanium foils were used as an interlayer to form diffusion bonds between chemical vapor deposited (CVD) SiC ceramics with the aid of hot pressing. The influence of variables such as interlayer thickness and processing time was investigated to determine which conditions contributed to bonds that were well adhered and crack free. Optical microscopy, scanning electron microscopy, and electron microprobe analysis were used to characterize the bonds and to identify the reaction-formed phases.
Modeling the microstructure of surface by applying BRDF function
NASA Astrophysics Data System (ADS)
Plachta, Kamil
2017-06-01
The paper presents the modeling of surface microstructure using a bidirectional reflectance distribution function (BRDF). This function contains full information about the reflectance properties of flat surfaces: it is possible to determine the shares of the specular, directional, and diffuse components in the reflected luminous stream. The software is based on the author's algorithm, which uses selected elements of BRDF models and allows the share of each component to be determined. Based on the obtained data, the surface microstructure of each material can be modeled, which allows the properties of these materials to be determined. The concentrator directs the reflected solar radiation onto the photovoltaic surface, increasing at the same time the value of the incident luminous stream. The paper presents an analysis of selected materials that can be used to construct the solar concentrator system. The use of the concentrator increases the power output of the photovoltaic system by up to 17% as compared to the standard solution.
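A hedged sketch of a three-term reflection model of the kind decomposed above, with specular, directional (glossy lobe), and diffuse shares of the reflected stream; the lobe shape and the weights are illustrative, not the author's fitted values.

```python
# Toy three-component reflection model: specular + directional + diffuse.
import numpy as np

def brdf_components(theta_i, theta_r, k_spec=0.5, k_dir=0.3, k_diff=0.2,
                    lobe_width=np.radians(8)):
    """Reflected-intensity shares for incidence theta_i, viewing theta_r."""
    specular = k_spec * (np.abs(theta_r - theta_i) < 1e-3)     # mirror spike
    directional = k_dir * np.exp(-((theta_r - theta_i) / lobe_width) ** 2)
    diffuse = k_diff * np.cos(theta_r) / np.pi                 # Lambertian
    return specular, directional, diffuse

theta_r = np.radians(np.arange(0, 90, 5))
spec, direc, diff = brdf_components(np.radians(30), theta_r)
for ang, sv, dv, fv in zip(np.degrees(theta_r), spec, direc, diff):
    if sv + dv + fv > 0.01:
        print(f"{ang:4.0f} deg: specular {sv:.2f} "
              f"directional {dv:.2f} diffuse {fv:.3f}")
```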
NASA Astrophysics Data System (ADS)
Durigon, Angelica; Lier, Quirijn de Jong van; Metselaar, Klaas
2016-10-01
To date, measuring plant transpiration at the canopy scale is laborious, and its estimation by numerical modelling can be used to obtain high-frequency data. When used to simulate transpiration of water-stressed plants, the model by Jacobs (1994) needs to be reparametrized. We compare the importance of model variables affecting simulated transpiration of water-stressed plants. A systematic literature review was performed to recover existing parameterizations to be tested in the model. Data from a field experiment with common bean under full and deficit irrigation were used to correlate estimations to forcing variables applying principal component analysis. New parameterizations resulted in a moderate reduction of prediction errors and in an increase in model performance. The Ags model was sensitive to changes in the mesophyll conductance and leaf angle distribution parameterizations, allowing model improvement. Simulated transpiration could be separated into temporal components. The daily, afternoon-depression and long-term components for the fully irrigated treatment were more related to atmospheric forcing variables (specific humidity deficit between stomata and air, relative air humidity and canopy temperature). The daily and afternoon-depression components for the deficit-irrigated treatment were related to both atmospheric and soil dryness, and the long-term component was related to soil dryness.
Determination of molecular weight distributions in native and pretreated wood.
Leskinen, Timo; Kelley, Stephen S; Argyropoulos, Dimitris S
2015-03-30
The analysis of native wood components by size-exclusion chromatography (SEC) is challenging. Isolation, derivatization and solubilization of wood polymers are required prior to the analysis. The present approach allowed the determination of molecular weight distributions of the carbohydrates and of lignin in native and processed woods, without preparative component isolation steps. For the first time, a component-selective SEC analysis of sawdust preparations was made possible by the combination of two selective derivatization methods, namely ionic liquid assisted benzoylation of the carbohydrate fraction and acetobromination of the lignin in acetic acid media. These were optimized for wood samples. The developed method was thus used to examine changes in softwood samples after degradative mechanical and/or chemical treatments, such as ball milling, steam explosion, green liquor pulping, and chemical oxidation with 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ). The methodology can also be applied to examine changes in molecular weight and lignin-carbohydrate linkages that occur during wood-based biorefinery operations, such as pretreatments and enzymatic saccharification. Copyright © 2014 Elsevier Ltd. All rights reserved.
Papaemmanouil, Christina; Tsiafoulis, Constantinos G; Alivertis, Dimitrios; Tzamaloukas, Ouranios; Miltiadou, Despoina; Tzakos, Andreas G; Gerothanassis, Ioannis P
2015-06-10
We report a rapid, direct, and unequivocal spin-chromatographic separation and identification of minor components in the lipid fraction of milk and common dairy products with the use of selective one-dimensional (1D) total correlation spectroscopy (TOCSY) nuclear magnetic resonance (NMR) experiments. The method allows the complete backbone spin-coupling network to be elucidated even in strongly overlapped regions and in the presence of major components with 4 × 10(2) to 3 × 10(3) times stronger NMR signal intensities. The proposed spin-chromatography method does not require any derivatization steps for the lipid fraction, is selective with excellent resolution, is sensitive with quantitation capability, and compares favorably to two-dimensional (2D) TOCSY and gas chromatography-mass spectrometry (GC-MS) methods of analysis. The results of the present study demonstrate that the 1D TOCSY NMR spin-chromatography method can become a procedure of primary interest in food analysis and, more generally, in complex mixture analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhai, Y.; Loesser, G.; Smith, M.
ITER diagnostic first walls (DFWs) and diagnostic shield modules (DSMs) inside the port plugs (PPs) are designed to protect diagnostic instruments and components from a harsh plasma environment and provide structural support while allowing for diagnostic access to the plasma. The design of DFWs and DSMs is driven by 1) plasma radiation and nuclear heating during normal operation and 2) electromagnetic loads during plasma events and the associated component structural responses. A multi-physics engineering analysis protocol for the design has been established at Princeton Plasma Physics Laboratory, and it was used for the design of ITER DFWs and DSMs. The analyses were performed to address challenging design issues based on the resultant stresses and deflections of the DFW-DSM-PP assembly for the main load cases. The ITER Structural Design Criteria for In-Vessel Components (SDC-IC) required for design by analysis and three major issues driving the mechanical design of ITER DFWs are discussed. General guidelines for the DSM design have been established as a result of design parametric studies.
Hyperchromatic laser scanning cytometry
NASA Astrophysics Data System (ADS)
Tárnok, Attila; Mittag, Anja
2007-02-01
In the emerging fields of high-content and high-throughput single cell analysis for Systems Biology and Cytomics, multi- and polychromatic analysis of biological specimens has become increasingly important. By combining different technologies and staining methods, polychromatic analysis (i.e. using 8 or more fluorescent colors at a time) can be pushed forward to measure anything stainable in a cell, an approach termed hyperchromatic cytometry. For cytometric cell analysis, microscope-based Slide Based Cytometry (SBC) technologies are ideal as, unlike flow cytometry, they are non-consumptive, i.e. the analyzed sample is fixed on the slide. Based on the relocation feature, identical cells can be subsequently reanalyzed. In this manner, data on the single cell level can be collected after manipulation steps. In this overview, various components for hyperchromatic cytometry are demonstrated for a SBC instrument, the Laser Scanning Cytometer (Compucyte Corp., Cambridge, MA): 1) polychromatic cytometry, 2) iterative restaining (using the same fluorochrome for restaining and subsequent reanalysis), 3) differential photobleaching (differentiating fluorochromes by their different photostability), 4) photoactivation (activating fluorescent nanoparticles or photocaged dyes), and 5) photodestruction (destruction of FRET dyes). With the intelligent combination of several of these techniques, hyperchromatic cytometry allows virtually all components of relevance to be quantified and analyzed on the identical cell. The combination of high-throughput and high-content SBC analysis with high-resolution confocal imaging allows clear verification of phenotypically distinct subpopulations of cells with structural information. The information gained per specimen is limited only by the number of available antibodies and by sterical hindrance.
Drought Patterns Forecasting using an Auto-Regressive Logistic Model
NASA Astrophysics Data System (ADS)
del Jesus, M.; Sheffield, J.; Méndez Incera, F. J.; Losada, I. J.; Espejo, A.
2014-12-01
Drought is characterized by a water deficit that may manifest across a large range of spatial and temporal scales. Drought may create important socio-economic consequences, often of catastrophic dimensions. A quantifiable definition of drought is elusive because, depending on its impacts, consequences and generation mechanism, different water deficit periods may be identified as a drought by virtue of some definitions but not by others. Droughts are linked to the water cycle and, although a climate change signal may not have emerged yet, they are also intimately linked to climate. In this work we develop an auto-regressive logistic model for drought prediction at different temporal scales that makes use of a spatially explicit framework. Our model allows us to include covariates, continuous or categorical, to improve the performance of the auto-regressive component. Our approach makes use of dimensionality reduction (principal component analysis) and classification techniques (K-Means and maximum dissimilarity) to simplify the representation of complex climatic patterns, such as sea surface temperature (SST) and sea level pressure (SLP), while including information on their spatial structure, i.e. considering their spatial patterns. This procedure allows us to include in the analysis multivariate representations of complex climatic phenomena, such as the El Niño-Southern Oscillation. We also explore the impact of other climate-related variables such as sunspots. The model allows us to quantify the uncertainty of the forecasts and can be easily adapted to make predictions under future climatic scenarios. The framework herein presented may be extended to other applications such as flash flood analysis or risk assessment of natural hazards.
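A minimal sketch of the auto-regressive logistic idea: the probability of drought in a month depends on the previous month's state plus climate covariates (here a stand-in for a leading SST principal component). The synthetic series below only illustrates the fitting step.

```python
# Auto-regressive logistic model: lagged drought state + climate covariate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
n_months = 360
sst_pc = rng.standard_normal(n_months)          # stand-in for an SST PC score

# Generate a persistent binary drought series influenced by the covariate.
state = np.zeros(n_months, dtype=int)
for t in range(1, n_months):
    logit = -1.5 + 2.5 * state[t - 1] + 0.8 * sst_pc[t]
    state[t] = rng.random() < 1 / (1 + np.exp(-logit))

X = np.c_[state[:-1], sst_pc[1:]]               # lagged state + covariate
y = state[1:]
model = LogisticRegression().fit(X, y)
print("coefficients (persistence, SST PC):", model.coef_.round(2))
```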
Shaughnessy, Allen F; Allen, Lucas; Duggan, Ashley
2017-05-01
Reflection, a process of self-analysis to promote learning through better understanding of one's experiences, is often used to assess learners' metacognitive ability. However, written reflective exercises that are not submitted for assessment may allow learners to explore their experiences and indicate learning and professional growth without explicitly connecting to intentional sense-making. To identify core components of learning about medicine or medical education from family medicine residents' written reflections. Family medicine residents wrote reflections about their experiences throughout an academic year. Qualitative thematic analysis was used to identify core components in 767 reflections written by 33 residents. We identified four themes of learning: 'Elaborated reporting' and 'metacognitive monitoring' represent explicit, purposeful self-analysis that typically would be characterised as reflective learning about medicine. 'Simple reporting' and 'goal setting' signal an analysis of experience that indicates learning and professional growth but that is overlooked as a component of learning. The identified themes elucidate the explicit and implicit forms of written reflection as sense-making and learning. An expanded theoretical understanding of reflection as inclusive of conscious sense-making as well as implicit discovery better enables the art of physician self-development.
Designers workbench: toward real-time immersive modeling
NASA Astrophysics Data System (ADS)
Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu
2000-05-01
This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting, and analysis tasks. The paper outlines the fundamental tools, design metaphors, and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. Traditionally, however, a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology or 'digital gap' experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, the emphasis is on content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric, or mathematically defined primitives, whereas the analysis component provides the tools required for pre- and post-processing steps of finite element analysis tasks applied to the created models.
2011-01-01
Background: Ontologies are increasingly used to structure and semantically describe entities of domains, such as genes and proteins in the life sciences. Their increasing size and the high frequency of updates, resulting in a large set of ontology versions, necessitate efficient management and analysis of this data. Results: We present GOMMA, a generic infrastructure for managing and analyzing life science ontologies and their evolution. GOMMA utilizes a generic repository to uniformly and efficiently manage ontology versions and different kinds of mappings. Furthermore, it provides components for ontology matching and for determining evolutionary ontology changes. These components are used by analysis tools such as the Ontology Evolution Explorer (OnEX) and the detection of unstable ontology regions. We introduce the component-based infrastructure and show analysis results for selected components and life science applications. GOMMA is available at http://dbs.uni-leipzig.de/GOMMA. Conclusions: GOMMA provides a comprehensive and scalable infrastructure to manage large life science ontologies and analyze their evolution. Key functions include generic storage of ontology versions and mappings, and support for ontology matching and for determining ontology changes. The supported features for analyzing ontology changes are helpful for assessing their impact on ontology-dependent applications such as term enrichment. GOMMA complements OnEX by providing functionality to manage various versions of mappings between two ontologies and allows different match approaches to be combined. PMID:21914205
Polytopic vector analysis in igneous petrology: Application to lunar petrogenesis
NASA Technical Reports Server (NTRS)
Shervais, John W.; Ehrlich, R.
1993-01-01
Lunar samples represent a heterogeneous assemblage of rocks with complex inter-relationships that are difficult to decipher using standard petrogenetic approaches. These inter-relationships reflect several distinct petrogenetic trends as well as thermomechanical mixing of distinct components. Additional complications arise from the unequal quality of chemical analyses and from the fact that many samples (e.g., breccia clasts) are too small to be representative of the system from which they derived. Polytopic vector analysis (PVA) is a multivariate procedure used as a tool for exploratory data analysis. PVA allows the analyst to classify samples and clarifies relationships among heterogeneous samples with complex petrogenetic histories. It differs from orthogonal factor analysis in that it uses non-orthogonal multivariate sample vectors to extract sample endmember compositions. The output from a Q-mode (sample-based) factor analysis is the initial step in PVA; the Q-mode analysis, using criteria established by Miesch and by Klovan and Miesch, is used to determine the number of endmembers in the data system. The second step is the determination of endmembers and mixing proportions, with all output expressed in the same geochemical variables as the input. The composition of endmembers is derived by analysis of the variability of the data set; endmembers need not be present in the data set, nor is it necessary for their composition to be known a priori. Any set of endmembers defines a 'polytope' or classification figure (a triangle for a three-component system, a tetrahedron for a four-component system, a 'five-tope' in four dimensions for a five-component system, and so on).
A finite element model of the human head for auditory bone conduction simulation.
Taschke, Henning; Hudde, Herbert
2006-01-01
In order to investigate the mechanisms of bone conduction, a finite element model of the human head was developed; the most important steps of the modelling process are described. The model was excited by means of percutaneously applied forces in order to gain deeper insight into how the parts of the peripheral hearing organ and the surrounding tissue vibrate. The analysis is based on a division of the bone conduction mechanisms into components, whose frequency-dependent patterns of vibration are analyzed. Furthermore, the model allows for the calculation of the contribution of each component to the overall bone-conducted sound. The components interact in a complicated way, which depends strongly on the nature of the excitation and the spatial region to which it is applied.
Butler, Rebecca A.
2014-01-01
Stroke aphasia is a multidimensional disorder in which patient profiles reflect variation along multiple behavioural continua. We present a novel approach to separating the principal aspects of chronic aphasic performance and isolating their neural bases. Principal components analysis was used to extract core factors underlying the performance of 31 participants with chronic stroke aphasia on a large, detailed battery of behavioural assessments. The rotated principal components analysis revealed three key factors, which we labelled phonology, semantic and executive/cognition on the basis of the common elements in the tests that loaded most strongly on each component. The phonology factor explained the most variance, followed by the semantic factor and then the executive-cognition factor. The use of principal components analysis rendered participants' scores on these three factors orthogonal and therefore ideal for use as simultaneous continuous predictors in a voxel-based correlational methodology analysis of high-resolution structural scans. Phonological processing ability was uniquely related to left posterior perisylvian regions including Heschl's gyrus, the posterior middle and superior temporal gyri and superior temporal sulcus, as well as the white matter underlying the posterior superior temporal gyrus. The semantic factor was uniquely related to the left anterior middle temporal gyrus and the underlying temporal stem. The executive-cognition factor was not correlated selectively with the structural integrity of any particular region, as might be expected in light of the widely distributed and multi-functional nature of the regions that support executive functions. The identified phonological and semantic areas align well with those highlighted by other methodologies such as functional neuroimaging and neurostimulation. The use of principal components analysis allowed us to characterize the neural bases of participants' behavioural performance more robustly and selectively than the use of raw assessment scores or diagnostic classifications, because principal components analysis extracts statistically unique, orthogonal behavioural components of interest. As such, in addition to improving our understanding of lesion-symptom mapping in stroke aphasia, the same approach could be used to clarify brain-behaviour relationships in other neurological disorders. PMID:25348632
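The analysis strategy (PCA of standardized test scores followed by an orthogonal rotation, with factor scores used as predictors) can be sketched as follows. The data, test names, and the choice of a hand-rolled varimax routine are assumptions for illustration, not the paper's code.

```python
# Minimal sketch: PCA on standardized behavioural scores, varimax rotation of
# the loadings, and per-patient factor scores for lesion-symptom analysis.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Standard varimax rotation of a (tests x factors) loading matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0))))
        R = u @ vt
        if s.sum() < d * (1 + tol):
            break
        d = s.sum()
    return loadings @ R

rng = np.random.default_rng(1)
scores = rng.normal(size=(31, 12))        # 31 patients x 12 tests (synthetic)
z = StandardScaler().fit_transform(scores)

pca = PCA(n_components=3).fit(z)
rotated = varimax(pca.components_.T)      # 12 x 3 rotated loadings
factor_scores = z @ rotated               # per-patient factors for correlation
```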
Analysis system for characterisation of simple, low-cost microfluidic components
NASA Astrophysics Data System (ADS)
Smith, Suzanne; Naidoo, Thegaran; Nxumalo, Zandile; Land, Kevin; Davies, Emlyn; Fourie, Louis; Marais, Philip; Roux, Pieter
2014-06-01
There is an inherent trade-off between the cost and operational integrity of microfluidic components, especially when intended for use in point-of-care devices. We present an analysis system developed to characterise microfluidic components for blood cell counting, enabling the balance between function and cost to be established quantitatively. Microfluidic components for sample and reagent introduction, mixing, and dispensing of fluids were investigated. A simple inlet port plugging mechanism is used to introduce and dispense a sample of blood, while a reagent is released into the microfluidic system through compression and bursting of a blister pack; mixing and dispensing of the sample and reagent are facilitated via air actuation. For these microfluidic components to be implemented successfully, a number of aspects need to be characterised in developing an integrated point-of-care device design. The functional components were measured using a microfluidic component analysis system established in-house. Experiments were carried out to determine: (1) the force and speed requirements for sample inlet port plugging and for blister pack compression and release, using two linear actuators and load cells to plug the inlet port, compress the blister pack, and measure the resulting forces; (2) the accuracy and repeatability of the total volumes of sample and reagent dispensed; and (3) the degree of mixing and the dispensing uniformity of the sample and reagent for cell counting analysis. A programmable syringe pump was used for air actuation to facilitate mixing and dispensing of the sample and reagent. Two high-speed cameras formed part of the analysis system and allowed visualisation of the fluidic operations within the microfluidic device. Additional quantitative measures such as microscopy were also used to assess mixing and dilution accuracy, as well as the uniformity of fluid dispensing - all of which are important requirements for the successful implementation of a blood cell counting system.
Blind source separation of ex-vivo aorta tissue multispectral images
Galeano, July; Perez, Sandra; Montoya, Yonatan; Botina, Deivid; Garzón, Johnson
2015-01-01
Blind source separation (BSS) methods aim to decompose a given signal into its main components or source signals. These techniques have been widely used in the literature for the analysis of biomedical images, in order to extract the main components of an organ or tissue under study; the analysis of skin images for the extraction of melanin and hemoglobin is one example. This paper presents a proof of concept of the use of source separation on ex-vivo aorta tissue multispectral images. The images are acquired with an interference filter-based imaging system and processed by means of two algorithms: Independent Component Analysis and Non-negative Matrix Factorization. In both cases, it is possible to obtain maps that quantify the concentration of the main chromophores present in aortic tissue. The algorithms also recover the spectral absorbance of the main tissue components. These spectral signatures were compared against theoretical ones using correlation coefficients, which report values close to 0.9, a good estimator of the method's performance. The correlation coefficients also lead to the identification of the concentration maps according to the evaluated chromophore. The results suggest that multi/hyperspectral systems together with image processing techniques are a potential tool for the analysis of cardiovascular tissue. PMID:26137366
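The non-negative matrix factorization route can be illustrated with a short sketch. Everything here (cube shape, number of chromophores, band count) is an invented placeholder, not the authors' pipeline.

```python
# Illustrative unmixing of a multispectral absorbance cube into chromophore
# concentration maps (C) and spectral signatures (S) with NMF.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
h, w, bands = 64, 64, 16
cube = rng.random((h, w, bands))        # synthetic absorbance cube

X = cube.reshape(-1, bands)             # (pixels, bands), non-negative
model = NMF(n_components=3, init="nndsvd", max_iter=500)
C = model.fit_transform(X)              # concentrations, (pixels, 3)
S = model.components_                   # spectral signatures, (3, bands)

maps = C.reshape(h, w, 3)               # per-chromophore concentration images
# Recovered spectra can then be compared against theoretical absorbance
# curves, e.g. np.corrcoef(S[0], reference_spectrum)[0, 1], where
# reference_spectrum is an assumed literature curve.
```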
A grid-embedding transonic flow analysis computer program for wing/nacelle configurations
NASA Technical Reports Server (NTRS)
Atta, E. H.; Vadyak, J.
1983-01-01
An efficient grid-interfacing zonal algorithm was developed for computing the three-dimensional transonic flow field about wing/nacelle configurations. The algorithm uses the full-potential formulation and the AF2 approximate factorization scheme. The flow field solution is computed using a component-adaptive grid approach in which separate grids are employed for the individual components of the multi-component configuration, with each component grid optimized for a particular geometry such as the wing or nacelle. The wing and nacelle component grids are allowed to overlap, and flow field information is transmitted from one grid to another through the overlap region using trivariate interpolation. This report presents a discussion of the computational methods used to generate both the wing and nacelle component grids, the technique used to interface the component grids, and the method used to obtain the inviscid flow solution. Computed results and correlations with experiment are presented. Also presented are discussions of the organization of the wing grid generation (GRGEN3) and nacelle grid generation (NGRIDA) computer programs, the grid interface (LK) computer program, and the wing/nacelle flow solution (TWN) computer program. Descriptions of the respective subroutines, definitions of the required input parameters, a discussion of the interpretation of the output, and sample cases illustrating application of the analysis are provided for each of the four computer programs.
Ugliano, Maurizio
2016-12-01
This work describes the application of disposable screen-printed carbon paste sensors for the analysis of the main white wine oxidizable compounds, as well as for rapid fingerprinting and classification of white wines from different grape varieties. The response of individual white wine antioxidants such as flavanols, flavanol derivatives, phenolic acids, SO2, and ascorbic acid was first assessed in model wine. Analysis of commercial white wines gave voltammograms featuring two unresolved anodic waves corresponding to the oxidation of different compounds, mostly phenolic antioxidants. Calculating the first-order derivative of measured current vs. applied potential resolved these two waves, highlighting several electrode processes corresponding to the oxidation of individual wine components. Through the application of principal component analysis (PCA), the derivative voltammograms were used to discriminate among wines of different varieties. Copyright © 2016 Elsevier Ltd. All rights reserved.
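The signal treatment described (derivative voltammograms, then PCA) is simple enough to sketch; the synthetic data and array names below are illustrative only.

```python
# First-derivative voltammograms to sharpen overlapping anodic waves,
# followed by PCA for variety discrimination. Synthetic data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
potential = np.linspace(0.0, 1.2, 300)            # applied potential (V)
currents = rng.random((20, 300)).cumsum(axis=1)   # 20 wines, raw voltammograms

# d(current)/d(potential) resolves unresolved oxidation waves.
derivs = np.gradient(currents, potential, axis=1)

pcs = PCA(n_components=2).fit_transform(derivs)   # scores for a 2-D variety plot
```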
Proton transfer reaction mass spectrometry: on-line trace gas analysis at the ppb level
NASA Astrophysics Data System (ADS)
Hansel, A.; Jordan, A.; Holzinger, R.; Prazeller, P.; Vogel, W.; Lindinger, W.
1995-11-01
A system for trace gas analysis using proton transfer reaction mass spectrometry (PTR-MS) has been developed which allows on-line measurements of components at concentrations as low as 1 ppb. The method is based on reactions of H3O+ ions, which undergo non-dissociative proton transfer to most of the common organic trace constituents but do not react with any of the components present in clean air. Examples of the analysis of breath taken from smokers and non-smokers as well as from patients suffering from cirrhosis of the liver, and of air in buildings as well as of ambient air taken at a road crossing, demonstrate the wide range of applicability of this method. An enhanced level of acetonitrile in the breath is a most suitable indicator that a person is a smoker, and enhanced levels of propanol strongly indicate that a person has a severe liver deficiency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-01-05
SandiaMCR was developed to identify pure components and their concentrations from spectral data. The software efficiently implements multivariate curve resolution by alternating least squares (MCR-ALS), principal component analysis (PCA), and singular value decomposition (SVD); version 3.37 also includes the PARAFAC-ALS and Tucker-1 algorithms for trilinear analysis. The alternating least squares methods can be used to determine the composition with incomplete or no prior information on the constituents and their concentrations. The software allows the specification of numerous preprocessing, initialization, data selection, and compression options for the efficient processing of large data sets, including the definition of equality and non-negativity constraints to realistically restrict the solution set, various normalization or weighting options based on the statistics of the data, several initialization choices, and data compression. It has been designed to give a practicing spectroscopist the tools required to routinely analyze data in a reasonable time and without requiring expert intervention.
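For orientation, a bare-bones MCR-ALS iteration looks like the following. This is a generic textbook sketch (random initialization, clipping for non-negativity), not SandiaMCR itself, whose constraint handling is far richer.

```python
# Alternating least squares for D ~ C @ S with non-negativity on both factors:
# D is (samples x channels), C the concentrations, S the pure spectra.
import numpy as np

def mcr_als(D, n_components, n_iter=100):
    rng = np.random.default_rng(0)
    S = rng.random((n_components, D.shape[1]))       # initial spectra guess
    for _ in range(n_iter):
        C = np.linalg.lstsq(S.T, D.T, rcond=None)[0].T  # solve for C given S
        C = np.clip(C, 0, None)                          # non-negativity
        S = np.linalg.lstsq(C, D, rcond=None)[0]         # solve for S given C
        S = np.clip(S, 0, None)
    return C, S
```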
Multiaxial Cyclic Thermoplasticity Analysis with Besseling's Subvolume Method
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1983-01-01
A modification was formulated to Besseling's Subvolume Method to allow it to use multilinear, temperature-dependent stress-strain curves to perform cyclic thermoplasticity analyses. This method automatically reproduces certain aspects of real material behavior important in the analysis of aircraft gas turbine engine (AGTE) components, including the Bauschinger effect, cross-hardening, and memory. The constitutive equation was implemented in a finite element computer program called CYANIDE. Subsequently, classical time-dependent plasticity (creep) was added to the program. Since its inception, the program has been assessed against laboratory and component testing and engine experience. This experience verified the ability of the program to simulate AGTE material response characteristics and demonstrated its utility in providing data for life analyses. In the area of life analysis, the multiaxial thermoplasticity capabilities of the method have proved a match for actual AGTE life experience.
NASA Technical Reports Server (NTRS)
Miura, A.; Pritchett, P. L.
1982-01-01
A general stability analysis is given of the Kelvin-Helmholtz instability for the case of sheared MHD flow of finite thickness in a compressible plasma, allowing for arbitrary orientation of the magnetic field, flow velocity, and wave vector in the plane perpendicular to the velocity gradient. The stability problem is reduced to the solution of a single second-order differential equation including a gravitational term to represent the coupling between the Kelvin-Helmholtz mode and the interchange mode. Compressibility and a magnetic field component parallel to the flow are found to be stabilizing effects, with destabilization of only the fast magnetosonic mode in the transverse case and the presence of both Alfven and slow magnetosonic components in the parallel case. The results are used in a discussion of the stability of sheared plasma flow at the magnetopause boundary and in the solar wind.
Exploring Innovation Capabilities of Hospital CIOs: An Empirical Assessment.
Esdar, Moritz; Liebe, Jan-David; Weiß, Jan-Patrick; Hübner, Ursula
2017-01-01
Hospital CIOs play a central role in the adoption of innovative health IT. Until now, it has remained unclear which particular conditions, in terms of intrapersonal as well as organisational factors, constitute their capability to innovate. An inventory of 20 items was developed to capture these conditions and examined by analysing data obtained from 164 German hospital CIOs. Principal component analysis resulted in three internally consistent components that constitute large portions of the CIOs' innovation capability: organisational innovation culture, entrepreneurship personality, and openness towards users. The results were used to build composite indicators that allow further evaluations.
Improvement of the material and transport component of the system of construction waste management
NASA Astrophysics Data System (ADS)
Kostyshak, Mikhail; Lunyakov, Mikhail
2017-10-01
The relevance of this research stems from the growth of construction operations and the rising volume of construction and demolition waste. The article considers modern approaches to managing the turnover of construction waste, the sequence of building reconstruction or demolition processes, the information flow over the complete turnover cycle of construction and demolition waste, and methods for improving the material and transport component of the construction waste management system. The analysis showed that the proposed construction waste management mechanism can increase the efficiency and environmental safety of this branch and its regions.
Parameters modelling of amaranth grain processing technology
NASA Astrophysics Data System (ADS)
Derkanosova, N. M.; Shelamova, S. A.; Ponomareva, I. N.; Shurshikova, G. V.; Vasilenko, O. A.
2018-03-01
The article presents a technique for calculating the structure of a multicomponent bakery mixture for the production of enriched products, taking into account the instability of nutrient content while satisfying technological requirements and, at the same time, consumer preferences. The results of modelling and the analysis of optimal solutions are given for the example of calculating the structure of a three-component mixture of wheat and rye flour with an enriching component, whole-hulled amaranth flour, applied to the technology of bread baked from a mixture of rye and wheat flour on a liquid leaven.
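One common way to pose such a blend calculation is as a small linear program; the sketch below is a hypothetical illustration with invented costs, nutrient contents, and bounds, not the authors' model.

```python
# Choose mass fractions of rye, wheat, and amaranth flour that meet a minimum
# protein target at lowest cost. All numbers are placeholders.
import numpy as np
from scipy.optimize import linprog

cost    = np.array([0.5, 0.6, 1.2])     # relative cost per kg (assumed)
protein = np.array([8.0, 10.0, 14.0])   # protein, g/100 g (assumed)

res = linprog(
    c=cost,
    A_ub=[-protein], b_ub=[-11.0],      # protein >= 11 g/100 g
    A_eq=[[1, 1, 1]], b_eq=[1.0],       # fractions sum to 1
    bounds=[(0.1, 0.8)] * 3,            # keep every component present
)
print("fractions (rye, wheat, amaranth):", res.x.round(3))
```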
Extensions to decomposition of the redistributive effect of health care finance.
Zhong, Hai
2009-10-01
The total redistributive effect (RE) of health-care finance has been decomposed into vertical, horizontal, and reranking effects, and the vertical effect has been further decomposed into tax rate and tax structure effects. We extend this latter decomposition to the horizontal and reranking components of the RE. We also show how to measure the vertical, horizontal, and reranking effects of each component of the redistributive system, allowing the RE of health-care finance to be analysed in the context of that system. The methods are illustrated with an application to the RE of health-care financing in Canada.
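For reference, the decomposition referred to here is commonly written in the Aronson-Johnson-Lambert form below; the exact sign convention is an assumption on my part, as it varies across this literature.

```latex
% One common convention; signs vary by author.
\mathrm{RE} \;=\; \underbrace{V}_{\text{vertical}}
           \;-\; \underbrace{H}_{\text{horizontal inequity}}
           \;-\; \underbrace{R}_{\text{reranking}}
```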
NASA Technical Reports Server (NTRS)
Dalee, Robert C.; Bacskay, Allen S.; Knox, James C.
1990-01-01
An overview of the CASE/A-ECLSS series modeling package is presented. CASE/A is an analytical tool that has yielded engineering productivity gains during ECLSS design activities. A component verification program was performed to assure component modeling validity, based on test data from the Phase II comparative test program completed at the Marshall Space Flight Center. An integrated plotting feature has been added to the program which allows the operator to analyze on-screen data trends or obtain hard-copy plots from within the CASE/A operating environment. New command features in the areas of schematic, output, and model management, as well as component data editing, have been incorporated to enhance the engineer's productivity during a modeling program.
Cluster tool solution for fabrication and qualification of advanced photomasks
NASA Astrophysics Data System (ADS)
Schaetz, Thomas; Hartmann, Hans; Peter, Kai; Lalanne, Frederic P.; Maurin, Olivier; Baracchi, Emanuele; Miramond, Corinne; Brueck, Hans-Juergen; Scheuring, Gerd; Engel, Thomas; Eran, Yair; Sommer, Karl
2000-07-01
The reduction of wavelength in optical lithography, together with phase shift technology and optical proximity correction (OPC), requires rapid progress in the cost-effective qualification of photomasks. Knowledge about CD variation, loss of pattern fidelity (especially for OPC patterns), and mask defects, in terms of their impact at wafer level, is becoming a key issue for mask quality assessment. As part of the European Community supported ESPRIT project 'Q-CAP', a new cluster concept has been developed which allows the combination of hardware tools as well as software tools via network communication. It is designed to be open to any tool manufacturer and mask house. The bi-directional network access allows the exchange of all relevant mask data, including grayscale images, measurement results, lithography parameters, defect coordinates, layout data, process data, etc., and its storage in an SQL database. The system uses SEMI format descriptions as well as standard network hardware and software components for the client-server communication. Each tool is used mainly to perform its specific application, without spending expensive time on optional analysis, but the availability of the database allows each component to share the full data set gathered by all components. The cluster can therefore be considered as one single virtual tool. The paper shows the advantages of the cluster approach, the benefits of the tools already linked together, and a vision of a mask house in the near future.
Analysis of the Space Propulsion System Problem Using RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis; Rabiti, Cristian
This paper presents the solution of the space propulsion problem using a PRA code currently under development at Idaho National Laboratory (INL). RAVEN (Reactor Analysis and Virtual control ENvironment) is a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities. It is designed to derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures) and to perform both Monte Carlo sampling of randomly distributed events and Event Tree based analysis. To facilitate input/output handling, a Graphical User Interface (GUI) and a post-processing data-mining module are available. RAVEN also interfaces with several numerical codes such as RELAP5 and RELAP-7, as well as with ad hoc system simulators. For the space propulsion system problem, an ad hoc simulator was developed in the Python language and interfaced to RAVEN. The simulator fully models both deterministic behaviors (e.g., system dynamics and interactions between system components) and stochastic behaviors (i.e., failures of components/systems such as distribution lines and thrusters). Stochastic analysis is performed using random-sampling-based methodologies (i.e., Monte Carlo), both to determine the reliability of the space propulsion system and to propagate the uncertainties associated with a specific set of parameters. As indicated in the scope of the benchmark problem, the results generated by the stochastic analysis are used to generate risk-informed insights, such as the conditions under which different strategies can be followed.
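The Monte Carlo side of such an analysis reduces to sampling component failure times and counting mission successes. The toy sketch below illustrates the idea only; failure rates, mission length, and the redundancy structure are invented, and this is not RAVEN.

```python
# Toy Monte-Carlo reliability estimate: the system succeeds if the
# distribution line survives AND at least one of two redundant thrusters
# survives the mission.
import numpy as np

rng = np.random.default_rng(42)
n_samples, mission_time = 100_000, 1000.0     # hours (assumed)
rate_line, rate_thruster = 1e-4, 5e-5         # failure rates per hour (assumed)

t_line = rng.exponential(1 / rate_line, n_samples)
t_thr = rng.exponential(1 / rate_thruster, (n_samples, 2))

ok = (t_line > mission_time) & (t_thr.max(axis=1) > mission_time)
print("estimated mission reliability:", ok.mean())
```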
Comparative proteomic analysis of male and female venoms from the Cuban scorpion Rhopalurus junceus.
Rodríguez-Ravelo, Rodolfo; Batista, Cesar V F; Coronas, Fredy I V; Zamudio, Fernando Z; Hernández-Orihuela, Lorena; Espinosa-López, Georgina; Ruiz-Urquiola, Ariel; Possani, Lourival D
2015-12-01
A complete mass spectrometry analysis of venom components from male and female scorpions of the species Rhopalurus junceus of Cuba is reported. On the order of 200 individual molecular masses were identified in both venoms, of which 63 are identical in males and females. This means that venom components differ significantly between individuals of different sexes, although the most abundant components are present in both; the relative abundance of the identical components differs between the sexes. Three well-defined groups of different peptides were separated and identified: the first corresponds to peptides with molecular masses of 1000-2000 Da, the second to peptides of 3500-4500 Da, and the third to peptides of 6500-8000 Da. A total of 86 peptides rich in disulfide bridges were found in the venoms, 27 with three disulfide bridges and 59 with four. LC-MS/MS analysis allowed the identification and amino acid sequence determination of 31 novel peptides in male venom. Two new putative K(+)-channel peptides were sequenced by Edman degradation. They contain 37 amino acid residues, are packed by three disulfide bridges, and were assigned the systematic numbers α-KTx 1.18 and α-KTx 2.15. Copyright © 2015 Elsevier Ltd. All rights reserved.
Staneva, Jordanka; Denkova, Pavletta; Todorova, Milka; Evstatieva, Ljuba
2011-01-05
(1)H NMR spectroscopy was used as a method for quantitative analysis of the sesquiterpene lactones present in a crude lactone fraction isolated from Arnica montana. Eight main components - the tigloyl-, methacryloyl-, isobutyryl-, and 2-methylbutyryl-esters of helenalin (H) and 11α,13-dihydrohelenalin (DH) - were identified in the studied sample. The method allows determination of the total amount of sesquiterpene lactones as well as the quantities of the helenalin-type and 11α,13-dihydrohelenalin-type esters separately. Furthermore, 6-O-tigloylhelenalin (HT, 1), 6-O-methacryloylhelenalin (HM, 2), 6-O-tigloyl-11α,13-dihydrohelenalin (DHT, 5), and 6-O-methacryloyl-11α,13-dihydrohelenalin (DHM, 6) were quantified as individual components. Copyright © 2010 Elsevier B.V. All rights reserved.
Boubaker, Moez Ben; Picard, Donald; Duchesne, Carl; Tessier, Jayson; Alamdari, Houshang; Fafard, Mario
2018-05-17
This paper reports on the application of an acousto-ultrasonic (AU) scheme for the inspection of industrial-size carbon anode blocks used in the production of primary aluminium by the Hall-Héroult process. A frequency-modulated wave is used to excite the anode blocks at multiple points. The collected attenuated AU signals are decomposed using the Discrete Wavelet Transform (DWT), after which vectors of features are calculated. Principal Component Analysis (PCA) is used to cluster the AU responses of the anodes. The approach allows cracks in the blocks to be located, and the AU features were found to be sensitive to crack severity. The results are validated using images collected after cutting some anodes. Copyright © 2018 Elsevier B.V. All rights reserved.
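A plausible form of such a feature pipeline is sketched below; the wavelet choice, decomposition level, and energy features are assumptions for illustration, not the paper's exact settings.

```python
# DWT decomposition of each AU signal, energy per sub-band as the feature
# vector, then PCA scores for clustering anode responses.
import numpy as np
import pywt                                   # PyWavelets
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
signals = rng.normal(size=(30, 4096))         # 30 excitation points (synthetic)

def wavelet_energies(x, wavelet="db4", level=5):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    return np.array([np.sum(c**2) for c in coeffs])   # energy per sub-band

features = np.array([wavelet_energies(s) for s in signals])
scores = PCA(n_components=2).fit_transform(features)  # 2-D map for clustering
```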
Size-exclusion chromatography system for macromolecular interaction analysis
Stevens, Fred J.
1988-01-01
A low-pressure, microcomputer-controlled system employing high performance liquid chromatography (HPLC) allows precise analysis of the interaction of two reversibly associating macromolecules such as proteins. Since a macromolecular complex migrates faster than its components during size-exclusion chromatography, the difference between the elution profile of a mixture of two macromolecules and the summation of the elution profiles of the two components provides a quantifiable indication of the degree of molecular interaction. This delta profile is used qualitatively to reveal the presence or absence of significant interaction, or to rank the relative degree of interaction when comparing samples; in combination with a computer simulation, it is further used to quantify the magnitude of the interaction in an arrangement wherein a microcomputer is coupled to the analytical instrumentation in a novel manner.
A topological multilayer model of the human body.
Barbeito, Antonio; Painho, Marco; Cabral, Pedro; O'Neill, João
2015-11-04
Geographical information systems deal with spatial databases in which topological models are described with alphanumeric information, and their graphical interfaces implement the multilayer concept and provide powerful interaction tools. In this study, we apply these concepts to the human body, creating a representation that allows an interactive, precise, and detailed anatomical study. A vector surface component of the human body is built using a three-dimensional (3-D) reconstruction methodology. The multilayer concept is implemented by associating raster components with the corresponding vector surfaces, which include neighbourhood topology enabling spatial analysis. A root mean square error of 0.18 mm validated the three-dimensional reconstruction technique for internal anatomical structures. The expanded identification capability and the new neighbourhood analysis function are the main tools provided by this model.
Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.
Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A
2005-04-07
Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic, and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphical User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; with these operators properly defined, it is possible to use and parameterize them as generic, re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capabilities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters, and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and the integration of heterogeneous analytical tools.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques into hospital systems to analyze process execution, and we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
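The first step of most process discovery techniques is extracting directly-follows relations from the event log (case id, activity, timestamp). The library-free sketch below shows that step on an invented two-case log.

```python
# Count directly-follows pairs per case from a minimal event log.
from collections import Counter, defaultdict

log = [  # (case_id, activity, timestamp) - invented rows
    (1, "admit", 1), (1, "triage", 2), (1, "treat", 3), (1, "discharge", 4),
    (2, "admit", 1), (2, "treat", 2), (2, "discharge", 3),
]

traces = defaultdict(list)
for case, activity, ts in sorted(log, key=lambda r: (r[0], r[2])):
    traces[case].append(activity)

df = Counter()
for trace in traces.values():
    for a, b in zip(trace, trace[1:]):
        df[(a, b)] += 1          # activity a directly followed by b

print(df)  # e.g. ('admit', 'treat') occurs in case 2 but not case 1
```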
Development of Nomarski microscopy for quantitative determination of surface topography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, J. S.; Gordon, R. L.; Lessor, D. L.
1979-01-01
The use of Nomarski differential interference contrast (DIC) microscopy has been extended to provide nondestructive, quantitative analysis of a sample's surface topography. Theoretical modeling has determined the dependence of the image intensity on the microscope's optical components, the sample's optical properties, and the sample's surface orientation relative to the microscope. Results include expressions to allow the inversion of image intensity data to determine sample surface slopes. A commercial Nomarski system has been modified and characterized to allow the evaluation of the optical model. Data have been recorded with smooth, planar samples that verify the theoretical predictions.
Hamy, Valentin; Dikaios, Nikolaos; Punwani, Shonit; Melbourne, Andrew; Latifoltojar, Arash; Makanyanga, Jesica; Chouhan, Manil; Helbren, Emma; Menys, Alex; Taylor, Stuart; Atkinson, David
2014-02-01
Motion correction in Dynamic Contrast Enhanced (DCE-) MRI is challenging because rapid intensity changes can compromise common (intensity-based) registration algorithms. In this study we introduce a novel registration technique based on robust principal component analysis (RPCA), which decomposes a given time-series into a low-rank and a sparse component. This allows robust separation of the motion components, which can be registered, from the intensity variations, which are left unchanged. This Robust Data Decomposition Registration (RDDR) is demonstrated on both simulated data and a wide range of clinical data. Robustness to different types of motion and breathing choices during acquisition is demonstrated for a variety of imaged organs including liver, small bowel, and prostate. The analysis of clinically relevant regions of interest showed both a decrease of error (15-62% reduction following registration) in tissue time-intensity curves and improved areas under the curve (AUC60) at early enhancement. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
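The low-rank plus sparse split at the heart of RPCA can be sketched with a compact principal-component-pursuit style solver. This is a generic textbook iteration under assumed defaults, not the paper's RDDR code.

```python
# Split a (pixels x frames) matrix M into low-rank L (anatomy/motion to be
# registered) plus sparse S (contrast enhancement) by iterative thresholding.
import numpy as np

def rpca(M, lam=None, mu=None, n_iter=200):
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))   # standard sparsity weight
    mu = mu or 0.25 * np.abs(M).mean()      # threshold scale (heuristic)
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Singular-value soft thresholding -> low-rank update.
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U * np.maximum(s - mu, 0)) @ Vt
        # Elementwise soft thresholding -> sparse update.
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam * mu, 0)
    return L, S
```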
Virtual Laboratories to Achieve Higher-Order Learning in Fluid Mechanics
NASA Astrophysics Data System (ADS)
Ward, A. S.; Gooseff, M. N.; Toto, R.
2009-12-01
Bloom's higher-order cognitive skills (analysis, evaluation, and synthesis) are recognized as necessary in engineering education, yet they are difficult to achieve in traditional lecture formats. Laboratory components supplement traditional lectures in an effort to emphasize active learning and provide higher-order challenges, but these laboratories are often subject to the constraints of (a) increasing student enrollment, (b) limited funding for operational, maintenance, and instructional expenses, and (c) increasing demands on undergraduate student credit requirements. Here, we present results from a pilot project implementing virtual (online) laboratory experiences as an alternative to a traditional laboratory experience in Fluid Mechanics, a required third-year course. Students and faculty were surveyed to identify the topics that were most difficult, and virtual laboratory and design components were developed to supplement the lecture material. Each laboratory includes a traditional lab component requiring student analysis and evaluation, and concludes with a design exercise that imposes additional problem constraints and allows students to apply their laboratory observations to a real-world situation.
An Analysis of Freight Forwarder Operations in an International Distribution Channel.
1987-01-01
[Only fragments of this record survived extraction: table-of-contents entries ("International Marketing Mix", "Security Assistance Distribution Channel"); the partial sentence "...an item is ultimately derived from the interaction of variables in the marketing mix. Of those variables, the distribution functions seem to allow the..."; and a citation to "Component of the Marketing Mix," Proceedings, NCPDM Fall Meeting, National Council of Physical Distribution Management, San Francisco, CA, 1982.]
Molecular Design and Evaluation of Biodegradable Polymers Using a Statistical Approach
Lewitus, Dan; Rios, Fabian; Rojas, Ramiro; Kohn, Joachim
2013-01-01
The challenging paradigm of bioresorbable polymers, whether in drug delivery or tissue engineering, states that fine-tuning of the interplay between polymer properties (e.g., thermal, degradation) and the degree of cell/tissue replacement and remodeling is required. In this paper we describe how changes in the molecular architecture of a series of terpolymers allow the design of polymers with varying glass transition temperatures and degradation rates. The effect of each component in the terpolymers is quantified via design of experiments (DoE) analysis. A linear relationship between terpolymer components and the resulting Tg (ranging from 34 to 86 °C) was demonstrated; these findings were further supported by mass-per-flexible-bond (MPFB) analysis. The effect of terpolymer composition on the in vitro degradation of these polymers revealed molecular weight losses ranging from 20 to 60% within the first 24 hours, and DoE modeling further illustrated the linear (but reciprocal) relationship between structural elements and degradation. Thus, we describe a simple technique that provides insight into the structure-property relationship of degradable polymers, applied here to a new family of tyrosine-derived polycarbonates, allowing for the optimal design of materials for specific applications. PMID:23888354
Scientific Data Management (SDM) Center for Enabling Technologies. 2007-2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ludascher, Bertram; Altintas, Ilkay
Over the past five years, our activities have both established Kepler as a viable scientific workflow environment and demonstrated its value across multiple science applications. We have published numerous peer-reviewed papers on the technologies highlighted in this short paper and have given Kepler tutorials at SC06, SC07, SC08, and SciDAC 2007. Our outreach activities have allowed scientists to learn best practices and better utilize Kepler to address their individual workflow problems. Our contributions to advancing the state-of-the-art in scientific workflows have focused on the following areas, with progress in each described in subsequent sections. Workflow development: the development of a deeper understanding of scientific workflows "in the wild" and of the requirements for support tools that allow easy construction of complex scientific workflows. Generic workflow components and templates: the development of generic actors (i.e., workflow components and processes) which can be broadly applied to scientific problems. Provenance collection and analysis: the design of a flexible provenance collection and analysis infrastructure within the workflow environment. Workflow reliability and fault tolerance: the improvement of the reliability and fault-tolerance of workflow environments.
Woodrow, Graham
2007-06-01
Complex abnormalities of body composition occur in peritoneal dialysis (PD). These abnormalities reflect changes in hydration, nutrition, and body fat, and they are of major clinical significance. Clinical assessment of these body compartments is insensitive and inaccurate; frequently, simultaneous changes in hydration, wasting, and body fat content occur, confounding the clinical assessment of each component. Body composition can be described by models of varying complexity that use one or more measurement techniques. "Gold standard" methods provide accurate and precise data but are not practical for routine clinical use. Dual energy X-ray absorptiometry allows measurement of regional as well as whole-body composition, which can provide further information of clinical relevance. Simpler techniques such as anthropometry and bioelectrical impedance analysis are suited to routine use in clinic or at the bedside, but may be less accurate. Body composition methodology sometimes makes assumptions regarding relationships between components, particularly with regard to hydration, which may be invalid in pathologic states; uncritical application of these methods to the PD patient may result in erroneous interpretation of results. Understanding the foundations and limitations of body composition techniques allows for optimal application in clinical practice.
Spectroscopic study of honey from Apis mellifera from different regions in Mexico
NASA Astrophysics Data System (ADS)
Frausto-Reyes, C.; Casillas-Peñuelas, R.; Quintanar-Stephano, JL; Macías-López, E.; Bujdud-Pérez, JM; Medina-Ramírez, I.
2017-05-01
The objective of this study was to analyze, by Raman and UV-Vis-NIR spectroscopic techniques, Mexican honey from Apis mellifera, using representative samples with different botanical origins (unifloral and multifloral) and from diverse climates. The results show that Raman spectroscopy together with principal component analysis can be used to determine the floral origin of honey, independently of the sampling region. For this, the effect of heating the honey was analyzed: heating greatly reduced the fluorescence background in the Raman spectra, which allowed the visualization of the fructose and glucose peaks. Using UV-Vis-NIR spectroscopy, a characteristic transmittance spectrum profile was obtained for each honey type. In addition, to obtain an objective characterization of color, CIE Yxy and CIE L*a*b* colorimetric records were made for each honey type. Applying principal component analysis and correlating with the chromaticity coordinates allowed the honey samples to be classified in one plot by cutoff wavelength, maximum transmittance, tone, and lightness. The results show that it is possible to obtain a spectroscopic record of honeys with specific characteristics by reducing the effects of fluorescence.
Distributed Engine Control Empirical/Analytical Verification Tools
NASA Technical Reports Server (NTRS)
DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan
2013-01-01
NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The blockset is a software development tool that includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms, set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can treat all components within the control system, both virtual and real, including the communication data network, smart sensor and actuator nodes, the centralized control system (FADEC, full-authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed, including control system performance, reliability, weight, and bandwidth utilization.
Yücel, Yasin; Sultanoğlu, Pınar
2013-09-01
Chemical characterisation was carried out on 45 honey samples collected from the Hatay region of Turkey. The concentrations of 17 elements were determined by inductively coupled plasma optical emission spectrometry (ICP-OES). Ca, K, Mg, and Na were the most abundant elements, with mean contents of 219.38, 446.93, 49.06, and 95.91 mg kg(-1), respectively; the trace element mean contents ranged between 0.03 and 15.07 mg kg(-1). Chemometric methods, namely principal component analysis (PCA) and cluster analysis (CA), were applied to classify the honeys according to mineral content. The first and most important principal component (PC) was strongly associated with the values of Al, B, Cd, and Co. CA showed eight clusters corresponding to the eight botanical origins of the honeys, and PCA explained 75.69% of the variance with the first six PC variables. Chemometric analysis of the analytical data allowed accurate classification of the honey samples according to origin. Copyright © 2013 Elsevier Ltd. All rights reserved.
Development of a Multi-Disciplinary Computing Environment (MDICE)
NASA Technical Reports Server (NTRS)
Kingsley, Gerry; Siegel, John M., Jr.; Harrand, Vincent J.; Lawrence, Charles; Luker, Joel J.
1999-01-01
The growing need for and importance of multi-component and multi-disciplinary engineering analysis has been understood for many years. For many applications, loose (or semi-implicit) coupling is optimal and allows the use of various legacy codes without requiring major modifications. For this purpose, CFDRC and NASA LeRC have developed a computational environment enabling coupling between various flow analysis codes at several levels of fidelity. This environment, referred to as the Visual Computing Environment (VCE), is being successfully applied to the analysis of several aircraft engine components. Recently, CFDRC and AFRL/VAAC (WL) have extended the framework and scope of VCE to enable complex multi-disciplinary simulations, with an initial focus on aeroelastic aircraft applications. The developed software, referred to as MDICE-AE, is an extensible system suitable for the integration of several engineering analysis disciplines. This paper describes the methodology, basic architecture, chosen software technologies, salient library modules, and the current status of and plans for MDICE. A fluid-structure interaction application is described in a separate companion paper.
NASA Astrophysics Data System (ADS)
Sollberger, David; Greenhalgh, Stewart A.; Schmelzbach, Cedric; Van Renterghem, Cédéric; Robertsson, Johan O. A.
2018-04-01
We provide a six-component (6-C) polarization model for P-, SV-, SH-, Rayleigh-, and Love-waves, both inside an elastic medium and at the free surface. It is shown that single-station 6-C data, composed of three components of rotational motion and three components of translational motion, provide the opportunity to unambiguously identify the wave type, propagation direction, and local P- and S-wave velocities at the receiver location by use of polarization analysis. Extracting such information by conventional processing of three-component (3-C) translational data would require large and dense receiver arrays. The additional rotational components allow the rank of the coherency matrix used for polarization analysis to be extended. This enables us to accurately determine the wave type and wave parameters (propagation direction and velocity) of seismic phases even if more than one wave is present in the analysis time window, which is not possible with standard, purely translational 3-C recordings. To identify modes of vibration and to extract the accompanying wave parameters, we adapt the multiple signal classification (MUSIC) algorithm. Due to the strong nonlinearity of the MUSIC estimator function, it can be used to detect the presence of specific wave types within the analysis window at very high resolution. We show how the extracted wavefield properties can be used, in a fully automated way, to separate the wavefield into its different wave modes using only a single 6-C recording station. As an example, we apply the method to remove surface wave energy while preserving the underlying reflection signal, and to suppress energy arriving from undesired directions, such as side-scattered waves.
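A generic MUSIC estimator over candidate polarization vectors can be sketched as below. The toy polarization model and all parameters are assumptions; the paper's physical 6-C steering vectors are replaced here by arbitrary unit vectors for illustration.

```python
# MUSIC on a 6-C window: project candidate polarization vectors onto the
# noise subspace of the coherency matrix; the estimator peaks sharply where
# a candidate matches the signal polarization.
import numpy as np

def music_spectrum(data, steering, n_sources=1):
    """data: (6, n_samples) window; steering: (n_candidates, 6) unit vectors."""
    C = data @ data.conj().T / data.shape[1]     # 6x6 coherency matrix
    w, V = np.linalg.eigh(C)                     # eigenvalues ascending
    En = V[:, : 6 - n_sources]                   # noise subspace
    proj = np.abs(steering.conj() @ En) ** 2     # (n_candidates, 6 - n_sources)
    return 1.0 / proj.sum(axis=1)

# Toy usage: recover a known polarization from noisy single-station data.
rng = np.random.default_rng(0)
true_a = np.array([1.0, 0, 0, 0, 0.5, 0])
true_a /= np.linalg.norm(true_a)
data = np.outer(true_a, rng.normal(size=400)) + 0.01 * rng.normal(size=(6, 400))
cands = rng.normal(size=(1000, 6))
cands /= np.linalg.norm(cands, axis=1, keepdims=True)
best = cands[np.argmax(music_spectrum(data, cands))]   # close to +/- true_a
```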
Interpretation of a compositional time series
NASA Astrophysics Data System (ADS)
Tolosana-Delgado, R.; van den Boogaart, K. G.
2012-04-01
Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The Statistical Analysis of Compositional Data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows any sort of multivariate analysis to be applied to a log-ratio transformed composition, as long as the transformation is invertible. This principle is fully applicable to time series analysis. We discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed, and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows the results to be expressed as D(D - 1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA. In this data set, the proportion of annual precipitation falling in winter, spring, summer, and autumn is considered a 4-component time series. Three invertible log-ratios are defined for the calculations, balancing rainfall in autumn vs. winter, in summer vs. spring, and in autumn-winter vs. spring-summer. Results suggest a 2-year correlation range and a certain oscillatory behaviour in the last balance, which does not occur in the other two.
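The "working in coordinates" idea can be made concrete with a small centred log-ratio (clr) transform and its inverse; this is a hedged illustration of the log-ratio approach cited above, not the authors' balance construction.

```python
# clr maps a composition to unconstrained coordinates and back: apply
# ordinary time-series tools to z, then map predictions back to shares.
import numpy as np

def clr(x):
    """Centred log-ratio of a composition (positive parts, any total)."""
    logx = np.log(x)
    return logx - logx.mean(axis=-1, keepdims=True)

def clr_inv(z):
    """Back-transform clr coordinates to a composition summing to 1."""
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

seasons = np.array([0.20, 0.35, 0.30, 0.15])  # winter/spring/summer/autumn shares
z = clr(seasons)
print(clr_inv(z))                             # -> [0.2 0.35 0.3 0.15]
```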
Independent Orbiter Assessment (IOA): Analysis of the landing/deceleration subsystem
NASA Technical Reports Server (NTRS)
Compton, J. M.; Beaird, H. G.; Weissinger, W. D.
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Landing/Deceleration Subsystem hardware. The Landing/Deceleration Subsystem enables the Orbiter to perform a safe landing, providing for landing-gear deployment, steering and braking control throughout the landing rollout to wheel-stop, and ground-handling capability during the ground-processing phase of the flight cycle. Specifically, the Landing/Deceleration hardware consists of the following components: Nose Landing Gear (NLG); Main Landing Gear (MLG); Brake and Antiskid (B and AS); Electrical Power Distribution and Controls (EPD and C); Nose Wheel Steering (NWS); and Hydraulics Actuators. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect of each failure mode. Due to the lack of redundancy in the Landing/Deceleration Subsystem, there is a high number of critical items.
NASA Technical Reports Server (NTRS)
Fabanich, William A., Jr.
2014-01-01
SpaceClaim/TD Direct has been used extensively in the development of the Advanced Stirling Radioisotope Generator (ASRG) thermal model. This paper outlines the workflow for that aspect of the task and includes proposed best practices and lessons learned. The ASRG thermal model was developed to predict component temperatures and power output and to provide insight into the prime contractor's thermal modeling efforts. The insulation blocks, heat collectors, and cold side adapter flanges (CSAFs) were modeled with this approach. The model was constructed using mostly TD finite difference (FD) surfaces/solids. However, some complex geometry could not be reproduced with TD primitives while maintaining the desired degree of geometric fidelity. Using SpaceClaim permitted the import of original CAD files and enabled the defeaturing/repair of those geometries. TD Direct (a SpaceClaim add-on from CRTech) adds features that allowed the "mark-up" of that geometry. These "mark-ups" control how finite element (FE) meshes are generated through the "tagging" of features (e.g. edges, solids, surfaces). These tags represent parameters that include submodels, material properties, material orienters, optical properties, and radiation analysis groups. TD aliases were used for most tags to allow analysis to be performed with a variety of parameter values. "Domain-tags" were also attached to individual surfaces and solids, and to groups of them, so they could be used later within TD to populate objects such as heaters and contactors. These tools allow the user to make changes to the geometry in SpaceClaim and then easily synchronize the mesh in TD without having to redefine the objects each time, as one would when using TDMesher. The use of SpaceClaim/TD Direct helps simplify the import of existing geometries and the creation of high-fidelity FE meshes to represent complex parts. It also saves time and effort in the subsequent analysis.
The VRFurnace: A Virtual Reality Application for Energy System Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Peter Eric
2001-01-01
The VRFurnace is a unique VR application designed to analyze a complete coal-combustion CFD model of a power plant furnace. Although other applications have been created that analyze furnace performance, no other has included the added complications of particle tracking and the reactions associated with coal combustion. Currently the VRFurnace is a versatile analysis tool. Data translators have been written to allow data from most of the major commercial CFD software packages, as well as standard data formats of hand-written code, to be uploaded into the VR application. Because of this, almost any type of CFD model of any power plant component can be analyzed immediately. The ease of use of the VRFurnace is another of its qualities. The menu system created for the application not only guides first-time users through the various button combinations but also helps the experienced user keep track of which tool is being used. Because the VRFurnace was designed for use in the C6 device at Iowa State University's Virtual Reality Applications Center, it is naturally a collaborative project. The projection-based system allows many people to be involved in the analysis process. This type of environment opens the design process to include not only CFD analysts but management teams and plant operators as well, by making it easier for engineers to explain design changes. The 3D visualization allows power plant components to be studied in the context of their natural physical environments, giving engineers a chance to use their innate pattern recognition and intuitive skills to bring to light key relationships that may have previously gone unrecognized. More specifically, the tools that have been developed make better use of the third dimension that the synthetic environment provides. Whereas the plane tools make it easier to track down interesting features of a given flow field, the box tools allow the user to focus on these features and reduce the data load on the computer.
NASA Astrophysics Data System (ADS)
Campbell, Ian S.; Ton, Alain T.; Mulligan, Christopher C.
2011-07-01
An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.
Identification of phases, symmetries and defects through local crystallography
Belianinov, Alex; He, Qian; Kravchenko, Mikhail; ...
2015-07-20
Here we report that advances in electron and probe microscopies allow 10 pm or higher precision in measurements of atomic positions. This level of fidelity is sufficient to correlate the length (and hence energy) of bonds, as well as bond angles, to functional properties of materials. Traditionally, this relied on mapping locally measured parameters to macroscopic variables, for example, the average unit cell. This description effectively ignores the information contained in the microscopic degrees of freedom available in a high-resolution image. Here we introduce an approach for local analysis of material structure based on statistical analysis of individual atomic neighbourhoods. Clustering and multivariate algorithms such as principal component analysis explore the connectivity of lattice and bond structure, as well as identify minute structural distortions, thus allowing for chemical description and identification of phases. This analysis lays the framework for building image genomes and structure–property libraries, based on conjoining structural and spectral realms through local atomic behaviour.
A data analysis expert system for large established distributed databases
NASA Technical Reports Server (NTRS)
Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick
1987-01-01
A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.
Analysis of a human phenomenon: self-concept.
LeMone, P
1991-01-01
This analysis of self-concept includes an examination of definitions, historical perspectives, theoretical basis, and closely related terms. Antecedents, consequences, defining attributes, and a definition were formulated based on the analysis. The purpose of the analysis was to provide support for the use of the label "self-concept" as a broad category that encompasses the self-esteem, identity, and body-image nursing diagnoses within Taxonomy I. This classification could allow the use of a broad diagnostic label to better describe conditions that necessitate nursing care. It may also further explain the relationships between and among those diagnoses that describe human responses to disturbance of any component of the self-concept.
NASA Astrophysics Data System (ADS)
Moreau, O.; Libbrecht, C.; Lee, D.-W.; Surdej, J.
2005-06-01
Using an optimal image subtraction technique, we have derived the V and R light curves of the four lensed QSO components of Q2237+0305 from the monitoring CCD frames obtained by the GLITP collaboration with the 2.6 m NOT telescope in 1999/2000 (Alcalde et al. 2002). We give here a detailed account of the data reduction and analysis and of the error estimates. In agreement with Woźniak et al. (2000a,b), the good photometric accuracy derived for the GLITP data allows us to discuss the possible interpretation of the light curve of component A as due to a microlensing event taking place in the deflecting galaxy. This interpretation is strengthened by the colour dependence of the early rise of the light curve of component A, as it probably corresponds to a caustic crossing by the QSO source.
NASA Astrophysics Data System (ADS)
de la Cruz, Javier; Cano, Ulises; Romero, Tatiana
2016-10-01
A critical parameter for a PEM fuel cell's electrical contact is the nominal clamping pressure. Predicting the mechanical behavior of all components in a fuel cell stack is a very complex task due to the diversity of materials properties. Prior to the integration of a 3 kW PEMFC power plant, a numerical simulation was performed in order to obtain the mechanical stress distribution for two of the most pressure-sensitive components of the stack: the membrane and the graphite plates. The stress distribution of the above-mentioned components was numerically simulated by finite element analysis, and the stress magnitude for the membrane was confirmed using pressure films. Stress values were found within the elastic zone, which guarantees the mechanical integrity of the fuel cell components. These low stress levels, particularly for the membrane, should prolong the life and integrity of the fuel cell stack according to its design specifications.
Lukasczyk, Jonas; Weber, Gunther; Maciejewski, Ross; ...
2017-06-01
Tracking graphs are a well-established tool in topological analysis to visualize the evolution of components and their properties over time, i.e., when components appear, disappear, merge, and split. However, tracking graphs are limited to a single level threshold, and the graphs may vary substantially even under small changes to the threshold. To examine the evolution of features for varying levels, users have to compare multiple tracking graphs without a direct visual link between them. We propose a novel, interactive, nested graph visualization based on the fact that the tracked superlevel set components for different levels are related to each other through their nesting hierarchy. This approach allows us to set multiple tracking graphs in context to each other and enables users to effectively follow the evolution of components for different levels simultaneously. We show the effectiveness of our approach on datasets from finite pointset methods, computational fluid dynamics, and cosmology simulations.
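Not the paper's nested multi-level visualization, but a sketch of the basic single-threshold tracking step it generalizes: label superlevel-set components at consecutive time steps and link those that spatially overlap (multiple links at a node indicate a merge or split):

```python
import numpy as np
from scipy.ndimage import label

def tracking_edges(field_t0, field_t1, threshold):
    """fields: 2-D scalar arrays at consecutive time steps."""
    lab0, _ = label(field_t0 > threshold)    # components of superlevel set
    lab1, _ = label(field_t1 > threshold)
    both = (lab0 > 0) & (lab1 > 0)           # pixels covered at both steps
    return set(zip(lab0[both], lab1[both]))  # (component@t0, component@t1)
```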
First impressions: gait cues drive reliable trait judgements.
Thoresen, John C; Vuong, Quoc C; Atkinson, Anthony P
2012-09-01
Personality trait attribution can underpin important social decisions and yet requires little effort; even a brief exposure to a photograph can generate lasting impressions. Body movement is a channel readily available to observers and allows judgements to be made when facial and body appearances are less visible, e.g., from great distances. Across three studies, we assessed the reliability of trait judgements of point-light walkers and identified motion-related visual cues driving observers' judgements. The findings confirm that observers make reliable, albeit inaccurate, trait judgements, and these were linked to a small number of motion components derived from a Principal Component Analysis of the motion data. Parametric manipulation of the motion components linearly affected trait ratings, providing strong evidence that the visual cues captured by these components drive observers' trait judgements. Subsequent analyses suggest that reliability of trait ratings was driven by impressions of emotion, attractiveness and masculinity. Copyright © 2012 Elsevier B.V. All rights reserved.
The carbon component of the UK power price
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kris Voorspools
2006-08-01
CO₂ emissions trading is in full swing in Europe and is already having an impact on the price of power in the UK. If EU allowances (EUAs) trade at €20/t-CO₂, the EUA component in the power price is estimated to be slightly below €10/MWh. In the case of UK power for delivery 1 year ahead, this is ~10% of the market price of power. The introduction of a carbon component into the UK power price took place long before the 'official' start of the ETS in 2005. Analysis of historical data on the prices of power, gas, coal and EUAs shows that the first trace of a CO₂ component in UK power dates back to August 2003, shortly after EUAs first started to trade. In April 2004, CO₂ was fully integrated into the UK power price. 4 refs., 5 figs.
NASA Astrophysics Data System (ADS)
van Berkel, M.; Kobayashi, T.; Igami, H.; Vandersteen, G.; Hogeweij, G. M. D.; Tanaka, K.; Tamura, N.; Zwart, H. J.; Kubo, S.; Ito, S.; Tsuchiya, H.; de Baar, M. R.; LHD Experiment Group
2017-12-01
A new methodology to analyze non-linear components in perturbative transport experiments is introduced. The methodology has been experimentally validated in the Large Helical Device for the electron heat transport channel. Electron cyclotron resonance heating with different modulation frequencies by two gyrotrons has been used to directly quantify the amplitude of the non-linear component at the inter-modulation frequencies. The measurements show significant quadratic non-linear contributions and also the absence of cubic and higher order components. The non-linear component is analyzed using the Volterra series, which is the non-linear generalization of transfer functions. This allows us to study the radial distribution of the non-linearity of the plasma and to reconstruct linear profiles where the measurements were not distorted by non-linearities. The reconstructed linear profiles are significantly different from the measured profiles, demonstrating the significant impact that non-linearity can have.
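A toy numerical illustration of the measurement principle (the frequencies and nonlinearity strength are arbitrary): driving a quadratic channel at two modulation frequencies produces inter-modulation lines at f1 ± f2 that a purely linear channel lacks:

```python
import numpy as np

fs, T = 1000.0, 10.0
t = np.arange(0, T, 1 / fs)
f1, f2 = 25.0, 40.0                        # two modulation frequencies (illustrative)
drive = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
response = drive + 0.2 * drive**2          # quadratic non-linear channel

spec = np.abs(np.fft.rfft(response)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f in (f2 - f1, f1 + f2):               # inter-modulation frequencies
    k = np.argmin(np.abs(freqs - f))
    print(f"{f:5.1f} Hz amplitude: {spec[k]:.3f}")   # nonzero only if non-linear
```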
Søndergaard, Anders Aspegren; Shepperson, Benjamin; Stapelfeldt, Henrik
2017-07-07
We present an efficient, noise-robust method based on Fourier analysis for reconstructing the three-dimensional measure of the alignment degree, ⟨cos²θ⟩, directly from its two-dimensional counterpart, ⟨cos²θ_2D⟩. The method applies to nonadiabatic alignment of linear molecules induced by a linearly polarized, nonresonant laser pulse. Our theoretical analysis shows that the Fourier transform of the time-dependent ⟨cos²θ_2D⟩ trace over one molecular rotational period contains additional frequency components compared to the Fourier transform of ⟨cos²θ⟩. These additional frequency components can be identified and removed from the Fourier spectrum of ⟨cos²θ_2D⟩. By rescaling of the remaining frequency components, the Fourier spectrum of ⟨cos²θ⟩ is obtained and, finally, ⟨cos²θ⟩ is reconstructed through inverse Fourier transformation. The method allows the reconstruction of the ⟨cos²θ⟩ trace from a measured ⟨cos²θ_2D⟩ trace, which is the typical observable of many experiments, and thereby provides direct comparison to calculated ⟨cos²θ⟩ traces, which is the commonly used alignment metric in theoretical descriptions. We illustrate our method by applying it to the measurement of nonadiabatic alignment of I₂ molecules. In addition, we present an efficient algorithm for calculating the matrix elements of cos²θ_2D and any other observable in the symmetric top basis. These matrix elements are required in the rescaling step, and they allow for highly efficient numerical calculation of ⟨cos²θ_2D⟩ and ⟨cos²θ⟩ in general.
Variational Bayesian Learning for Wavelet Independent Component Analysis
NASA Astrophysics Data System (ADS)
Roussos, E.; Roberts, S.; Daubechies, I.
2005-11-01
In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuro-scientific goal of extracting relevant "maps" from the data. This can be stated as a 'blind' source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.
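The variational Bayesian wavelet-ICA model itself is not available off the shelf; as a rough illustration of ICA as a sparsity-seeking source separator, here is standard FastICA applied to synthetic heavy-tailed sources:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 2000
sources = rng.laplace(size=(n, 3))           # sparse (heavy-tailed) sources
mixing = rng.normal(size=(3, 3))             # unknown mixing in practice
observed = sources @ mixing.T + 0.05 * rng.normal(size=(n, 3))

ica = FastICA(n_components=3, random_state=0)
recovered = ica.fit_transform(observed)      # estimated sources, up to scale/order
```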
NASA Astrophysics Data System (ADS)
Koch, Julian; Cüneyd Demirel, Mehmet; Stisen, Simon
2018-05-01
The process of model evaluation is not only an integral part of model development and calibration but also of paramount importance when communicating modelling results to the scientific community and stakeholders. The modelling community has a large and well-tested toolbox of metrics to evaluate temporal model performance. In contrast, the toolbox for spatial performance evaluation has not kept pace with the growing availability of spatial observations or with the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study makes a contribution towards advancing spatial-pattern-oriented model calibration by rigorously testing a multiple-component performance metric. The promoted SPAtial EFficiency (SPAEF) metric reflects three equally weighted components: correlation, coefficient of variation and histogram overlap. This multiple-component approach is found to be advantageous for the complex task of comparing spatial patterns. SPAEF, its three components individually, and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are applied in a spatial-pattern-oriented model calibration of a catchment model in Denmark. Results suggest the importance of multiple-component metrics because stand-alone metrics tend to fail to provide holistic pattern information. The three SPAEF components are found to be independent, which allows them to complement each other in a meaningful way. In order to optimally exploit spatial observations made available by remote sensing platforms, this study suggests applying bias-insensitive metrics which further allow for a comparison of variables which are related but may differ in unit. This study applies SPAEF in the hydrological context using the mesoscale Hydrologic Model (mHM; version 5.8), but we see great potential across disciplines related to spatially distributed earth system modelling.
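A sketch of SPAEF under one common formulation consistent with the three components named above (correlation α, ratio of coefficients of variation β, and histogram overlap γ of the z-scored fields); treat the exact definition as an assumption to be checked against the original paper:

```python
import numpy as np

def spaef(obs, sim, bins=100):
    """obs, sim: spatial fields of the same shape; returns SPAEF (1 = perfect)."""
    obs, sim = obs.ravel(), sim.ravel()
    alpha = np.corrcoef(obs, sim)[0, 1]                       # pattern correlation
    beta = (np.std(sim) / np.mean(sim)) / (np.std(obs) / np.mean(obs))
    z_obs = (obs - obs.mean()) / obs.std()                    # z-scoring makes the
    z_sim = (sim - sim.mean()) / sim.std()                    # comparison bias-insensitive
    lo = min(z_obs.min(), z_sim.min()); hi = max(z_obs.max(), z_sim.max())
    h_obs, _ = np.histogram(z_obs, bins=bins, range=(lo, hi))
    h_sim, _ = np.histogram(z_sim, bins=bins, range=(lo, hi))
    gamma = np.minimum(h_obs, h_sim).sum() / h_obs.sum()      # histogram intersection
    return 1 - np.sqrt((alpha - 1)**2 + (beta - 1)**2 + (gamma - 1)**2)
```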
NASA Astrophysics Data System (ADS)
Cappon, Derek J.; Farrell, Thomas J.; Fang, Qiyin; Hayward, Joseph E.
2016-12-01
Optical spectroscopy of human tissue has been widely applied within the field of biomedical optics to allow rapid, in vivo characterization and analysis of the tissue. Instruments of this type often employ an imaging spectrometer to allow for simultaneous analysis of distinct signals. This is especially important when performing spatially resolved diffuse reflectance spectroscopy. In this article, an algorithm is presented that allows for the automated processing of 2-dimensional images acquired from an imaging spectrometer. The algorithm automatically defines distinct spectrometer tracks and adaptively compensates for distortion introduced by optical components in the imaging chain. Crosstalk resulting from the overlap of adjacent spectrometer tracks in the image is detected and subtracted from each signal. The algorithm's performance is demonstrated in the processing of spatially resolved diffuse reflectance spectra recovered from an Intralipid and ink liquid phantom and is shown to increase the range of wavelengths over which usable data can be recovered.
Long, J.M.; Fisher, W.L.
2006-01-01
We present a method for spatial interpretation of environmental variation in a reservoir that integrates principal components analysis (PCA) of environmental data with geographic information systems (GIS). To illustrate our method, we used data from a Great Plains reservoir (Skiatook Lake, Oklahoma) with longitudinal variation in physicochemical conditions. We measured 18 physicochemical features, mapped them using GIS, and then calculated and interpreted four principal components. Principal component 1 (PC1) was readily interpreted as longitudinal variation in water chemistry, but the other principal components (PC2-4) were difficult to interpret. Site scores for PC1-4 were calculated in GIS by summing weighted overlays of the 18 measured environmental variables, with the factor loadings from the PCA as the weights. PC1-4 were then ordered into a landscape hierarchy, an emergent property of this technique, which enabled their interpretation. PC1 was interpreted as a reservoir scale change in water chemistry, PC2 was a microhabitat variable of rip-rap substrate, PC3 identified coves/embayments and PC4 consisted of shoreline microhabitats related to slope. The use of GIS improved our ability to interpret the more obscure principal components (PC2-4), which made the spatial variability of the reservoir environment more apparent. This method is applicable to a variety of aquatic systems, can be accomplished using commercially available software programs, and allows for improved interpretation of the geographic environmental variability of a system compared to using typical PCA plots. © Copyright by the North American Lake Management Society 2006.
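A minimal sketch of the scoring step described above: a site's score on each principal component is the loading-weighted sum of its standardized variables, which is exactly what the GIS weighted overlay reproduces (array shapes are hypothetical):

```python
import numpy as np
from sklearn.decomposition import PCA

def pc_site_scores(X, n_pcs=4):
    """X: (n_sites, n_vars) raw environmental measurements."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable
    pca = PCA(n_components=n_pcs).fit(Z)
    loadings = pca.components_.T               # (n_vars, n_pcs) weights
    return Z @ loadings                        # weighted-overlay score per site and PC
```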
Mapping carcass and meat quality QTL on Sus Scrofa chromosome 2 in commercial finishing pigs
Heuven, Henri CM; van Wijk, Rik HJ; Dibbits, Bert; van Kampen, Tony A; Knol, Egbert F; Bovenhuis, Henk
2009-01-01
Quantitative trait loci (QTL) affecting carcass and meat quality located on SSC2 were identified using variance component methods. A large number of traits involved in meat and carcass quality was detected in a commercial crossbred population: 1855 pigs sired by 17 boars from a synthetic line, which were homozygous (A/A) for IGF2. Using combined linkage and linkage disequilibrium mapping (LDLA), several QTL significantly affecting loin muscle mass, ham weight and ham muscles (outer ham and knuckle ham) and meat quality traits, such as Minolta-L* and -b*, ultimate pH and Japanese colour score were detected. These results agreed well with previous QTL-studies involving SSC2. Since our study is carried out on crossbreds, different QTL may be segregating in the parental lines. To address this question, we compared models with a single QTL-variance component with models allowing for separate sire and dam QTL-variance components. The same QTL were identified using a single QTL variance component model compared to a model allowing for separate variances, with minor differences with respect to QTL location. However, the variance component method made it possible to detect QTL segregating in the paternal line (e.g. HAMB), the maternal lines (e.g. Ham) or in both (e.g. pHu). Combining association and linkage information among haplotypes slightly improved the significance of the QTL compared to an analysis using linkage information only. PMID:19284675
The banana code-natural blend processing in the olfactory circuitry of Drosophila melanogaster.
Schubert, Marco; Hansson, Bill S; Sachse, Silke
2014-01-01
Odor information is predominantly perceived as complex odor blends. For Drosophila melanogaster, one of the most attractive blends is emitted by an over-ripe banana. To analyze how the fly's olfactory system processes natural blends, we combined the experimental advantages of gas chromatography and functional imaging (GC-I). In this way, natural banana compounds were presented successively to the fly antenna at close to naturally occurring concentrations. This technique allowed us to identify the active odor components, use these compounds as stimuli, and measure odor-induced Ca(2+) signals in input and output neurons of the Drosophila antennal lobe (AL), the first olfactory neuropil. We demonstrate that mixture interactions of a natural blend are very rare and occur only at the AL output level, resulting in a surprisingly linear blend representation. However, the information regarding single components is strongly modulated by the olfactory circuitry within the AL, leading to a higher similarity between the representation of individual components and the banana blend. This observed modulation might tune the olfactory system in a way to distinctively categorize odor components and improve the detection of suitable food sources. Functional GC-I thus enables analysis of virtually any unknown natural odorant blend and its components at their relative naturally occurring concentrations and allows characterization of neuronal responses of complete neural assemblies. This technique can be seen as a valuable complementary method to classical GC/electrophysiology techniques, and will be a highly useful tool in future investigations of insect-insect and insect-plant chemical interactions.
Optimized principal component analysis on coronagraphic images of the fomalhaut system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meshkat, Tiffany; Kenworthy, Matthew A.; Quanz, Sascha P.
We present the results of a study to optimize the principal component analysis (PCA) algorithm for planet detection, a new algorithm complementing angular differential imaging and locally optimized combination of images (LOCI) for increasing the contrast achievable next to a bright star. The stellar point spread function (PSF) is constructed by removing linear combinations of principal components, allowing the flux from an extrasolar planet to shine through. The number of principal components used determines how well the stellar PSF is globally modeled. Using more principal components may decrease the number of speckles in the final image, but also increases the background noise. We apply PCA to Fomalhaut Very Large Telescope NaCo images acquired at 4.05 μm with an apodized phase plate. We do not detect any companions, with a model-dependent upper mass limit of 13-18 M_Jup from 4-10 AU. PCA achieves greater sensitivity than the LOCI algorithm for the Fomalhaut coronagraphic data by up to 1 mag. We make several adaptations to the PCA code and determine which of these prove the most effective at maximizing the signal-to-noise from a planet very close to its parent star. We demonstrate that optimizing the number of principal components used in PCA proves most effective for pulling out a planet signal.
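A bare-bones sketch of PCA-based PSF subtraction in this spirit (none of the paper's adaptations are reproduced): project each frame onto the first k principal components of the image stack and subtract that model:

```python
import numpy as np

def pca_psf_subtract(stack, k):
    """stack: (n_frames, n_pixels) flattened coronagraphic frames."""
    X = stack - stack.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    basis = Vt[:k]                      # first k principal components
    psf_model = X @ basis.T @ basis     # per-frame low-rank PSF estimate
    return X - psf_model                # residual: planet flux + noise

# Larger k models the stellar PSF better (fewer speckles) but, as noted
# above, also removes more planet flux and raises the background noise.
```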
artdaq: DAQ software development made simple
NASA Astrophysics Data System (ADS)
Biery, Kurt; Flumerfelt, Eric; Freeman, John; Ketchum, Wesley; Lukhanin, Gennadiy; Rechenmacher, Ron
2017-10-01
For a few years now, the artdaq data acquisition software toolkit has provided numerous experiments with ready-to-use components which allow for rapid development and deployment of DAQ systems. Developed within the Fermilab Scientific Computing Division, artdaq provides data transfer, event building, run control, and event analysis functionality. This latter feature includes built-in support for the art event analysis framework, allowing experiments to run art modules for real-time filtering, compression, disk writing and online monitoring. As art, also developed at Fermilab, is used for offline analysis as well, a major advantage of artdaq is that it allows developers to easily switch between developing online and offline software. artdaq continues to be improved. Support for an alternate mode of running, whereby data from some subdetector components are only streamed if requested, has been added; this option will reduce unnecessary DAQ throughput. Real-time reporting of DAQ metrics has been implemented, along with the flexibility to choose the format through which experiments receive the reports; these formats include the Ganglia, Graphite and syslog software packages, along with flat ASCII files. Additionally, work has been performed investigating more flexible modes of online monitoring, including the capability to run multiple online monitoring processes on different hosts, each running its own set of art modules. Finally, a web-based GUI interface through which users can configure details of their DAQ system has been implemented, increasing the ease of use of the system. Already successfully deployed on the LArIAT, DarkSide-50, DUNE 35ton and Mu2e experiments, artdaq will be employed for SBND and is a strong candidate for use on ICARUS and protoDUNE. With each experiment comes new ideas for how artdaq can be made more flexible and powerful. The above improvements will be described, along with potential ideas for the future.
Coevality in Young Eclipsing Binaries
NASA Astrophysics Data System (ADS)
Simon, M.; Toraskar, Jayashree
2017-06-01
The ages of the components in very short period pre-main-sequence (PMS) binaries are essential to an understanding of their formation. We considered a sample of seven PMS eclipsing binaries (EBs) with ages 1-6.3 MY and component masses 0.2-1.4 M⊙. The very high precision with which their masses and radii have been measured, and the capability provided by the Modules for Experiments in Stellar Astrophysics to calculate their evolutionary tracks at exactly the measured masses, allows the determination of age differences of the components independent of their luminosities and effective temperatures. We found that the components of five EBs, ASAS J052821+0338.5, Parenago 1802, JW 380, CoRoT 223992193, and UScoCTIO 5, formed within 0.3 MY of each other. The parameters for the components of V1174 Ori imply an implausibly large age difference of 2.7 MY and should be reconsidered. The seventh EB in our sample, RX J0529.4+0041, fell outside the applicability of our analysis.
Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.
Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin
2016-01-01
This paper introduces a new approach, the Principal Component Gradient Analysis (PCGA), to detect ecological gradients in time-series populations, i.e. several time-series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time-series from heterogeneous populations which express differing trends. PCGA makes use of polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in strength of a common underlying signal obtained by using PCGA groups is quantified using Monte Carlo simulations. In terms of validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other considered methods. Furthermore, in each example the PCGA groups enhanced the strength of a common underlying signal comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of mechanisms causing time-series population gradients as well as objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
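A compact sketch of the PCGA grouping idea as described above; the angular group boundaries are an illustrative choice, not the paper's:

```python
import numpy as np

def pcga_angles(series):
    """series: (n_time, n_series) population of time-series; one angle each."""
    Z = (series - series.mean(axis=0)) / series.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    load1, load2 = Vt[0], Vt[1]          # loadings on PC1 and PC2
    return np.arctan2(load2, load1)      # polar angle of each series' loading pair

# Illustrative grouping of, e.g., tree-ring series into four angular sectors:
# groups = np.digitize(pcga_angles(rings), np.linspace(-np.pi, np.pi, 5))
```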
Common mode error in Antarctic GPS coordinate time series and its effect on bedrock-uplift estimates
NASA Astrophysics Data System (ADS)
Liu, Bin; King, Matt; Dai, Wujiao
2018-05-01
Spatially correlated common mode error (CME) always exists in regional or larger GPS networks. We applied independent component analysis (ICA) to GPS vertical coordinate time series in Antarctica from 2010 to 2014 and made a comparison with principal component analysis (PCA). Using PCA/ICA, the time series can be decomposed into a set of temporal components and their spatial responses. We assume the components with common spatial responses are CME. An average reduction of ~40% in the RMS values was achieved with both PCA and ICA filtering. However, the common mode components obtained from the two approaches have different spatial and temporal features. The ICA time series present interesting correlations with modeled atmospheric and non-tidal ocean loading displacements. A white noise (WN) plus power law noise (PL) model was adopted in the GPS velocity estimation using maximum likelihood estimation (MLE) analysis, with a ~55% reduction of the velocity uncertainties after filtering using ICA. Meanwhile, spatiotemporal filtering reduces the amplitude of the PL and periodic terms in the GPS time series. Finally, we compare the GPS uplift velocities, after correction for elastic effects, with recent models of glacial isostatic adjustment (GIA). The agreement of the GPS-observed velocities with four GIA models is generally improved after the spatiotemporal filtering, with a mean reduction of ~0.9 mm/yr in the WRMS values, possibly allowing for more confident separation of the various GIA model predictions.
NASA Astrophysics Data System (ADS)
Wojciechowski, Adam
2017-04-01
In order to assess ecodiversity, understood as a comprehensive natural landscape factor (Jedicke 2001), it is necessary to apply research methods which recognize the environment in a holistic way. Principal component analysis may be considered one such method, as it allows one to distinguish the main factors determining landscape diversity on the one hand, and enables one to discover regularities shaping the relationships between various elements of the environment under study on the other. The procedure adopted to assess ecodiversity with the use of principal component analysis involves: a) determining and selecting appropriate factors of the assessed environment qualities (hypsometric, geological, hydrographic, plant, and others); b) calculating the absolute value of individual qualities for the basic areas under analysis (e.g. river length, forest area, altitude differences, etc.); c) principal component analysis and obtaining factor maps (maps of selected components); d) generating a resultant, detailed map and isolating several classes of ecodiversity. An assessment of ecodiversity with the use of principal component analysis was conducted in a test area of 299.67 km² in Debnica Kaszubska commune. The whole commune is situated in the Weichselian glaciation area of high hypsometric and morphological diversity as well as high geo- and biodiversity. The analysis was based on topographical maps of the commune area at a scale of 1:25,000 and maps of forest habitats. Nine factors reflecting basic environment elements were calculated: maximum height (m), minimum height (m), average height (m), the length of watercourses (km), the area of water reservoirs (m²), total forest area (ha), coniferous forest habitats area (ha), deciduous forest habitats area (ha), and alder habitats area (ha). The values of the individual factors were analysed for 358 grid cells of 1 km². Based on the principal component analysis, four major factors affecting commune ecodiversity were distinguished: a hypsometric component (PC1), a deciduous forest habitats component (PC2), a river valleys and alder habitats component (PC3), and a lakes component (PC4). The distinguished factors characterise the natural qualities of the postglacial area and reflect well the role of the four most important groups of environment components in shaping the ecodiversity of the area under study. The map of ecodiversity of Debnica Kaszubska commune was created on the basis of the first four principal component scores, and five classes of diversity were isolated: very low, low, average, high and very high. As a result of the assessment, five commune regions of very high ecodiversity were identified. These regions are also very attractive for tourists and valuable in terms of their rich nature, which includes protected areas such as Slupia Valley Landscape Park. The suggested method of ecodiversity assessment with the use of principal component analysis may constitute an alternative methodological proposition to the research methods used so far. Literature: Jedicke E., 2001. Biodiversität, Geodiversität, Ökodiversität. Kriterien zur Analyse der Landschaftsstruktur - ein konzeptioneller Diskussionsbeitrag. Naturschutz und Landschaftsplanung, 33(2/3), 59-68.
Akbari, Hamed; Macyszyn, Luke; Da, Xiao; Wolf, Ronald L.; Bilello, Michel; Verma, Ragini; O’Rourke, Donald M.
2014-01-01
Purpose To augment the analysis of dynamic susceptibility contrast material–enhanced magnetic resonance (MR) images to uncover unique tissue characteristics that could potentially facilitate treatment planning through a better understanding of the peritumoral region in patients with glioblastoma. Materials and Methods Institutional review board approval was obtained for this study, with waiver of informed consent for retrospective review of medical records. Dynamic susceptibility contrast-enhanced MR imaging data were obtained for 79 patients, and principal component analysis was applied to the perfusion signal intensity. The first six principal components were sufficient to characterize more than 99% of variance in the temporal dynamics of blood perfusion in all regions of interest. The principal components were subsequently used in conjunction with a support vector machine classifier to create a map of heterogeneity within the peritumoral region, and the variance of this map served as the heterogeneity score. Results The calculated principal components allowed near-perfect separability of tissue that was likely highly infiltrated with tumor and tissue that was unlikely infiltrated with tumor. The heterogeneity map created by using the principal components showed a clear relationship between voxels judged by the support vector machine to be highly infiltrated and subsequent recurrence. The results demonstrated a significant correlation (r = 0.46, P < .0001) between the heterogeneity score and patient survival. The hazard ratio was 2.23 (95% confidence interval: 1.4, 3.6; P < .01) between patients with high and low heterogeneity scores on the basis of the median heterogeneity score. Conclusion Analysis of dynamic susceptibility contrast-enhanced MR imaging data by using principal component analysis can help identify imaging variables that can be subsequently used to evaluate the peritumoral region in glioblastoma. These variables are potentially indicative of tumor infiltration and may become useful tools in guiding therapy, as well as individualized prognostication. © RSNA, 2014 PMID:24955928
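An illustrative pipeline following the recipe above: the first six principal components of each voxel's perfusion time curve feed a support vector machine classifier. The data shapes and labels here are placeholders, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
curves = rng.random((500, 60))            # (n_voxels, n_timepoints) DSC-MRI signals
y = rng.integers(0, 2, 500)               # 1 = likely tumor-infiltrated (placeholder)

clf = make_pipeline(StandardScaler(), PCA(n_components=6), SVC(probability=True))
clf.fit(curves, y)
infiltration_prob = clf.predict_proba(curves)[:, 1]   # basis of the heterogeneity map
```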
SpecViz: Interactive Spectral Data Analysis
NASA Astrophysics Data System (ADS)
Earl, Nicholas Michael; STScI
2016-06-01
The astronomical community is about to enter a new generation of scientific enterprise. With next-generation instrumentation and advanced capabilities, the need has arisen to equip astronomers with the necessary tools to deal with large, multi-faceted data. The Space Telescope Science Institute has initiated a data analysis forum for the creation, development, and maintenance of software tools for the interpretation of these new data sets. SpecViz is a spectral 1-D interactive visualization and analysis application built with Python in an open source development environment. A user-friendly GUI allows for a fast, interactive approach to spectral analysis. SpecViz supports handling of unique and instrument-specific data, incorporation of advanced spectral unit handling and conversions in a flexible, high-performance interactive plotting environment. Active spectral feature analysis is possible through interactive measurement and statistical tools. It can be used to build wide-band SEDs, with the capability of combining or overplotting data products from various instruments. SpecViz sports advanced toolsets for filtering and detrending spectral lines; identifying, isolating, and manipulating spectral features; as well as utilizing spectral templates for renormalizing data in an interactive way. SpecViz also includes a flexible model fitting toolset that allows for multi-component models, as well as custom models, to be used with various fitting and decomposition routines. SpecViz also features robust extension via custom data loaders and connection to the central communication system underneath the interface for more advanced control. Incorporation with Jupyter notebooks via connection with the active iPython kernel allows for SpecViz to be used in addition to a user’s normal workflow without demanding the user drastically alter their method of data analysis. In addition, SpecViz allows the interactive analysis of multi-object spectroscopy in the same straightforward, consistent way. Through the development of such tools, STScI hopes to unify astronomical data analysis software for JWST and other instruments, allowing for efficient, reliable, and consistent scientific results.
Pietsch, Torsten; Haberler, Christine
2016-01-01
The revised WHO classification of tumors of the CNS 2016 has introduced the concept of the integrated diagnosis. The definition of medulloblastoma entities now requires a combination of the traditional histological information with additional molecular/genetic features. For definition of the histopathological component of the medulloblastoma diagnosis, the tumors should be assigned to one of the four entities classic, desmoplastic/nodular (DNMB), extensive nodular (MBEN), or large cell/anaplastic (LC/A) medulloblastoma. The genetically defined component comprises the four entities WNT-activated, SHH-activated and TP53 wildtype, SHH-activated and TP53 mutant, or non-WNT/non-SHH medulloblastoma. Robust and validated methods are available to allow a precise diagnosis of these medulloblastoma entities according to the updated WHO classification, and for differential diagnostic purposes. A combination of immunohistochemical markers including β-catenin, Yap1, p75-NGFR, Otx2, and p53, in combination with targeted sequencing and copy number assessment such as FISH analysis for MYC genes allows a precise assignment of patients for risk-adapted stratification. It also allows comparison to results of study cohorts in the past and provides a robust basis for further treatment refinement. PMID:27781424
Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M
2007-01-01
We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite its increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that spatial components of background noise have uncorrelated time courses. Another version, which gives a closer approximation, is based on the assumption that the time courses are statistically independent. The accuracy of the structural approximation is compared to that of an existing model, based on a single Kronecker product, using both the Frobenius norm of the difference between the spatiotemporal sample covariance and a model, and scatter plots. The performance of our models and previous ones is compared in source analysis of a large number of single dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
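Why Kronecker structure keeps inversion cheap: for a single product C = T ⊗ S, the identity (T ⊗ S)⁻¹ = T⁻¹ ⊗ S⁻¹ means only the small factors are ever inverted. A sketch of that single-product baseline (the paper's sum-of-products extension is not reproduced; the covariance factors are placeholders):

```python
import numpy as np

n_space, n_time = 30, 200
rng = np.random.default_rng(1)

# Placeholder symmetric positive definite covariance factors
S = 0.2 * np.ones((n_space, n_space)) + np.eye(n_space)   # spatial covariance
A = rng.normal(size=(n_time, n_time))
T = A @ A.T / n_time + np.eye(n_time)                     # temporal covariance

T_inv, S_inv = np.linalg.inv(T), np.linalg.inv(S)
# (T kron S)^(-1) = (T^(-1) kron S^(-1)): the (6000 x 6000) inverse is
# available implicitly from the small factors and never has to be formed.
```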
Model Performance Evaluation and Scenario Analysis ...
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components and the reconstruction back to time series provide diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is due to the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool…
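A minimal sketch of one of the goodness-of-fit measures listed above, the Nash-Sutcliffe efficiency: one minus the ratio of the model error variance to the variance of the observations (1 is a perfect fit; 0 means the model is no better than the observed mean):

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """obs, sim: observed and simulated time series of equal length."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# nash_sutcliffe([1, 2, 3], [1.1, 1.9, 3.2])  -> 0.97
```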
Dimensionality reduction for the quantitative evaluation of a smartphone-based Timed Up and Go test.
Palmerini, Luca; Mellone, Sabato; Rocchi, Laura; Chiari, Lorenzo
2011-01-01
The Timed Up and Go is a clinical test to assess mobility in the elderly and in Parkinson's disease. Lately, instrumented versions of the test are being considered, where inertial sensors assess motion. To improve pervasiveness and ease of use and to reduce cost, we consider a smartphone's accelerometer as the measurement system. Several parameters (usually highly correlated) can be computed from the signals recorded during the test. To avoid redundancy and obtain the features that are most sensitive to the locomotor performance, a dimensionality reduction was performed through principal component analysis (PCA). Forty-nine healthy subjects of different ages were tested. PCA was performed to extract new features (principal components) which are non-redundant combinations of the original parameters and account for most of the data variability. They can be useful for exploratory analysis and outlier detection. Then, a reduced set of the original parameters was selected through correlation analysis with the principal components. This set could be recommended for studies based on healthy adults. The proposed procedure could be used as a first-level feature selection in classification studies (e.g. healthy vs. Parkinson's disease, fallers vs. non-fallers) and could allow, in the future, a complete system for movement analysis to be incorporated in a smartphone.
Designers Workbench: Towards Real-Time Immersive Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuester, F; Duchaineau, M A; Hamann, B
2001-10-03
This paper introduces the DesignersWorkbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing (CAM), and computer-aided engineering (CAE) systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The DesignersWorkbench aims at closing this technology or "digital gap" experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.
MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.
Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan
2017-01-01
Food-webs and other classes of ecological network motifs are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, where they differ only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin-of-attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population-type models, allowing for rapid assessment of their dynamical and behavioural properties.
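MI-Sim itself is MATLAB-based; purely as a language-neutral illustration of the kind of motif it simulates, here is a minimal predation-motif integration in Python (parameter values are arbitrary, not MI-Sim defaults):

```python
import numpy as np
from scipy.integrate import solve_ivp

def predation(t, y, r, K, a, e, m):
    """Predation motif: logistic producer N, consumer P with mass-action uptake."""
    N, P = y
    dN = r * N * (1 - N / K) - a * N * P          # producer growth minus predation
    dP = e * a * N * P - m * P                    # consumer growth minus mortality
    return [dN, dP]

sol = solve_ivp(predation, (0, 200), [1.0, 0.1],
                args=(1.0, 10.0, 0.2, 0.5, 0.3), dense_output=True)
print("final state (N, P):", sol.y[:, -1])
```

Steady-state and stability analysis of such a motif then amounts to finding the fixed points of the right-hand side and examining the eigenvalues of its Jacobian, which is what MI-Sim automates behind its GUI.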
Principal component analysis of Raman spectra for TiO2 nanoparticle characterization
NASA Astrophysics Data System (ADS)
Ilie, Alina Georgiana; Scarisoareanu, Monica; Morjan, Ion; Dutu, Elena; Badiceanu, Maria; Mihailescu, Ion
2017-09-01
The Raman spectra of anatase/rutile mixed-phase Sn-doped TiO2 nanoparticles and undoped TiO2 nanoparticles, synthesised by laser pyrolysis, with nanocrystallite dimensions varying from 8 to 28 nm, were processed with custom-written software that applies Principal Component Analysis (PCA) to the measured spectra to verify the possibility of objective auto-characterization of nanoparticles from their vibrational modes. The photo-excited process of Raman scattering is very sensitive to the material characteristics, especially in the case of nanomaterials, where more properties become relevant for the vibrational behaviour. We used PCA, a statistical procedure that performs eigenvalue decomposition of the data covariance, to automatically analyse each sample's measured Raman spectrum and to infer the correlation between nanoparticle dimensions, tin and carbon concentration, and their principal component values (PCs). This type of application allows an approximation of the crystallite size, or tin concentration, from a measurement of the Raman spectrum of the sample alone. The study of the loadings of the principal components provides information on the way the vibrational modes are affected by the nanoparticle features and on the spectral regions relevant for the classification.
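A minimal sketch of PCA as an eigenvalue decomposition of the data covariance applied to a spectra matrix (random numbers stand in for measured Raman spectra):

```python
import numpy as np

def pca_scores(spectra, n_pc=3):
    """PCA by eigenvalue decomposition of the data covariance:
    rows are measured spectra, columns are wavenumber channels."""
    Xc = spectra - spectra.mean(axis=0)          # mean-center each channel
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)             # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_pc]          # keep the largest n_pc
    loadings = vecs[:, idx]                      # how channels load on each PC
    return Xc @ loadings, loadings

rng = np.random.default_rng(1)
spectra = rng.normal(size=(30, 200))             # stand-in for 30 spectra
scores, loadings = pca_scores(spectra)
print(scores.shape, loadings.shape)              # (30, 3) (200, 3)
# correlating scores[:, k] with crystallite size or Sn content would then
# indicate which PC tracks which physical property, as described above
```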
Shade avoidance components and pathways in adult plants revealed by phenotypic profiling.
Nozue, Kazunari; Tat, An V; Kumar Devisetty, Upendra; Robinson, Matthew; Mumbach, Maxwell R; Ichihashi, Yasunori; Lekkala, Saradadevi; Maloof, Julin N
2015-04-01
Shade from neighboring plants limits light for photosynthesis; as a consequence, plants have a variety of strategies to avoid canopy shade and compete with their neighbors for light. Collectively the response to foliar shade is called the shade avoidance syndrome (SAS). The SAS includes elongation of a variety of organs, acceleration of flowering time, and additional physiological responses, which are seen throughout the plant life cycle. However, current mechanistic knowledge is mainly limited to shade-induced elongation of seedlings. Here we use phenotypic profiling of seedling, leaf, and flowering time traits to untangle complex SAS networks. We used over-representation analysis (ORA) of shade-responsive genes, combined with previous annotation, to logically select 59 known and candidate novel mutants for phenotyping. Our analysis reveals shared and separate pathways for each shade avoidance response. In particular, auxin pathway components were required for shade avoidance responses in hypocotyl, petiole, and flowering time, whereas jasmonic acid pathway components were only required for petiole and flowering time responses. Our phenotypic profiling allowed discovery of seventeen novel shade avoidance mutants. Our results demonstrate that logical selection of mutants increased success of phenotypic profiling to dissect complex traits and discover novel components.
Performance deterioration based on existing (historical) data; JT9D jet engine diagnostics program
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1978-01-01
The results of the collection and analysis of historical data pertaining to the deterioration of JT9D engine performance are presented. The results of analyses of prerepair and postrepair engine test stand performance data from a number of airlines to establish the individual as well as average losses in engine performance with respect to service use are included. Analysis of the changes in mechanical condition of parts, obtained by inspection of used gas-path parts of varying age, allowed preliminary assessments of component performance deterioration levels and identification of the causative factors. These component performance estimates, refined by data from special engine back-to-back testing related to module performance restoration, permitted the development of preliminary models of engine component/module performance deterioration with respect to usage. The preliminary assessment of the causes of module performance deterioration and the trends with usage are explained, along with the role each module plays in overall engine performance deterioration. Preliminary recommendations with respect to operating and maintenance practices which could be adopted to control the level of performance deterioration are presented. The needs for additional component sensitivity testing as well as outstanding issues are discussed.
Time-of-flight expansion of binary Bose–Einstein condensates at finite temperature
NASA Astrophysics Data System (ADS)
Lee, K. L.; Jørgensen, N. B.; Wacker, L. J.; Skou, M. G.; Skalmstang, K. T.; Arlt, J. J.; Proukakis, N. P.
2018-05-01
Ultracold quantum gases provide a unique setting for studying and understanding the properties of interacting quantum systems. Here, we investigate a multi-component system of 87Rb–39K Bose–Einstein condensates (BECs) with tunable interactions both theoretically and experimentally. Such multi-component systems can be characterized by their miscibility, where miscible components lead to a mixed ground state and immiscible components form a phase-separated state. Here we perform the first full simulation of the dynamical expansion of this system including both BECs and thermal clouds, which allows for a detailed comparison with experimental results. In particular we show that striking features emerge in time-of-flight (TOF) for BECs with strong interspecies repulsion, even for systems which were separated in situ by a large gravitational sag. An analysis of the centre of mass positions of the BECs after expansion yields qualitative agreement with the homogeneous criterion for phase-separation, but reveals no clear transition point between the mixed and the separated phases. Instead one can identify a transition region, for which the presence of a gravitational sag is found to be advantageous. Moreover, we analyse the situation where only one component is condensed and show that the density distribution of the thermal component also shows some distinct features. Our work sheds new light on the analysis of multi-component systems after TOF and will guide future experiments on the detection of miscibility in these systems.
Amazon Business And GSA Advantage: A Comparative Analysis
2017-12-01
…training for businesses or a customer-ordering guide; however, the site does offer a help center where businesses and users can submit questions. … a component of GSA Advantage, is an online procurement tool that allows customers to request quotes for (1) commercial supplies and services under…
Preliminary candidate advanced avionics system for general aviation
NASA Technical Reports Server (NTRS)
Mccalla, T. M.; Grismore, F. L.; Greatline, S. E.; Birkhead, L. M.
1977-01-01
An integrated avionics system design was carried out to a level that indicates subsystem function and the methods of overall system integration. Sufficient detail was included to allow identification of possible system component technologies and to perform reliability, modularity, maintainability, cost, and risk analysis of the system design. Retrofit to older aircraft and the availability of the system for single-engine, two-place aircraft were also considered.
2012-01-01
Background: The aim of the paper is to assess, by principal components analysis (PCA), the heavy metal contamination of soil and of vegetables widely used as food by people who live in areas contaminated by heavy metals (HMs) due to long-lasting mining activities. This chemometric technique allowed us to select the best model for determining the risk of HMs to the food chain as well as to people's health. Results: Many PCA models were computed with different variables: heavy metal contents and some agro-chemical parameters which characterize the soil samples from contaminated and uncontaminated areas, HM contents of different types of vegetables grown and consumed in these areas, and the complex parameter target hazard quotient (THQ). Results were discussed in terms of principal component analysis. Conclusion: There were two major benefits in processing the data by PCA: firstly, it helped in optimizing the number and type of data that best render the HM contamination of the soil and vegetables; secondly, it was valuable for selecting the vegetable species which present the highest/minimum risk of a negative impact on the food chain and human health. PMID:23234365
Hales, Claire A; Robinson, Emma S J; Houghton, Conor J
2016-01-01
Human decision making is modified by emotional state. Rodents exhibit similar biases during interpretation of ambiguous cues that can be altered by affective state manipulations. In this study, the impact of negative affective state on judgement bias in rats was measured using an ambiguous-cue interpretation task. Acute treatment with an anxiogenic drug (FG7142), chronic restraint stress, and social isolation all induced a bias towards more negative interpretation of the ambiguous cue. The diffusion model was fitted to the behavioural data to allow further analysis of the underlying decision-making processes. To uncover the way in which parameters vary together in relation to affective state manipulations, independent component analysis was conducted on the rate-of-information-accumulation and distance-to-decision-threshold parameters for the control data. Results from this analysis were applied to parameters from the negative affective state manipulations. These projected components were compared to the control components to reveal the changes in decision-making processes that are due to affective state manipulations. Negative affective bias in rodents induced by either FG7142 or chronic stress is due to a combination of more negative interpretation of the ambiguous cue, reduced anticipation of the high reward, and increased anticipation of the low reward.
NASA Astrophysics Data System (ADS)
Weinmann, Martin; Jutzi, Boris; Hinz, Stefan; Mallet, Clément
2015-07-01
3D scene analysis, in terms of automatically assigning 3D points a respective semantic label, has become a topic of great importance in photogrammetry, remote sensing, computer vision and robotics. In this paper, we address the issue of how to increase the distinctiveness of geometric features and how to select the most relevant ones for 3D scene analysis. We present a new, fully automated and versatile framework composed of four components: (i) neighborhood selection, (ii) feature extraction, (iii) feature selection and (iv) classification. For each component, we consider a variety of approaches chosen for simplicity, efficiency and reproducibility, so that end-users can easily apply the different components without expert knowledge in the respective domains. In a detailed evaluation involving 7 neighborhood definitions, 21 geometric features, 7 approaches for feature selection, 10 classifiers and 2 benchmark datasets, we demonstrate that the selection of optimal neighborhoods for individual 3D points significantly improves the results of 3D scene analysis. Additionally, we show that the selection of adequate feature subsets may further increase the quality of the derived results while significantly reducing both processing time and memory consumption.
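Geometric features of this kind are commonly derived from the eigenvalues of the covariance matrix of a point's local neighborhood; the sketch below shows a representative (assumed, not necessarily the paper's exact) feature set:

```python
import numpy as np

def eigen_features(neighborhood):
    """Geometric features of a 3D point neighborhood from the eigenvalues
    of its covariance matrix (component (ii) of such frameworks)."""
    C = np.cov(neighborhood.T)
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(C))[::-1]   # l1 >= l2 >= l3
    s = l1 + l2 + l3
    return {
        "linearity":  (l1 - l2) / l1,
        "planarity":  (l2 - l3) / l1,
        "sphericity": l3 / l1,
        "omnivariance": (l1 * l2 * l3) ** (1.0 / 3.0),
        "change_of_curvature": l3 / s,
    }

rng = np.random.default_rng(2)
# a thin noisy planar patch: planarity should dominate
plane_pts = np.c_[rng.uniform(size=(50, 2)), 0.01 * rng.normal(size=50)]
print(eigen_features(plane_pts))
```

Varying the neighborhood size (component (i)) changes these eigenvalues and hence the feature values, which is why optimal per-point neighborhood selection matters so much in the reported evaluation.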
Rinaldi, Maurizio; Gindro, Roberto; Barbeni, Massimo; Allegrone, Gianna
2009-01-01
Orange (Citrus sinensis L.) juice comprises a complex mixture of volatile components that are difficult to identify and quantify. Classification and discrimination of the varieties on the basis of the volatile composition could help to guarantee the quality of a juice and to detect possible adulteration of the product. The objectives were to provide information on the amounts of volatile constituents in fresh-squeezed juices from four orange cultivars and to establish suitable discrimination rules to differentiate orange juices using new chemometric approaches. Fresh juices of four orange cultivars were analysed by headspace solid-phase microextraction (HS-SPME) coupled with GC-MS. Principal component analysis, linear discriminant analysis and heuristic methods, such as neural networks, allowed clustering of the data from HS-SPME analysis, while genetic algorithms addressed the problem of data reduction. To check the quality of the results, the chemometric techniques were also evaluated on a sample. Thirty volatile compounds were identified by HS-SPME and GC-MS analyses and their relative amounts calculated. Differences in the composition of orange juice volatile components were observed. The chosen orange cultivars could be discriminated using neural networks, genetic relocation algorithms and linear discriminant analysis. Genetic algorithms applied to the data were also able to detect the most significant compounds. SPME is a useful technique to investigate orange juice volatile composition, and a flexible chemometric approach is able to correctly separate the juices.
Visual target modulation of functional connectivity networks revealed by self-organizing group ICA.
van de Ven, Vincent; Bledowski, Christoph; Prvulovic, David; Goebel, Rainer; Formisano, Elia; Di Salle, Francesco; Linden, David E J; Esposito, Fabrizio
2008-12-01
We applied a data-driven analysis based on self-organizing group independent component analysis (sogICA) to fMRI data from a three-stimulus visual oddball task. SogICA is particularly suited to the investigation of underlying functional connectivity and does not rely on a predefined model of the experiment, which overcomes some of the limitations of hypothesis-driven analysis. Unlike most previous applications of ICA in functional imaging, our approach allows the analysis of the data at the group level, which is of particular interest in higher-order cognitive studies. SogICA is based on the hierarchical clustering of spatially similar independent components derived from single-subject decompositions. We identified four main clusters of components, centered on the posterior cingulate, bilateral insula, bilateral prefrontal cortex, and right posterior parietal and prefrontal cortex, consistently across all participants. Post hoc comparison of time courses revealed that the insula, prefrontal cortex and right fronto-parietal components showed higher activity for targets than for distractors. Activation for distractors was higher in the posterior cingulate cortex, where deactivation was observed for targets. While our results conform to previous neuroimaging studies, they also complement conventional results by showing functional connectivity networks with unique contributions to the task that were consistent across subjects. SogICA can thus be used to probe functional networks of active cognitive tasks at the group level and can provide additional insights to generate new hypotheses for further study. Copyright 2007 Wiley-Liss, Inc.
Current understanding of point defects and diffusion processes in silicon
NASA Technical Reports Server (NTRS)
Tan, T. Y.; Goesele, U.
1985-01-01
The effects of oxidation of Si which established that vacancies (V) and Si self-interstitials (I) coexist in Si at high temperatures under thermal equilibrium and oxidizing conditions are discussed. Some essential points associated with Au diffusion in Si are then discussed. Analysis of Au diffusion results allowed a determination of the I component and an estimate of the V component of the Si self-diffusion coefficient. A discussion of theories of high-concentration P diffusion into Si is then presented. Although no completely satisfactory theory yet exists, significant progress has recently been made in treating some essential aspects of this subject.
Extracting Independent Local Oscillatory Geophysical Signals by Geodetic Tropospheric Delay
NASA Technical Reports Server (NTRS)
Botai, O. J.; Combrinck, L.; Sivakumar, V.; Schuh, H.; Bohm, J.
2010-01-01
Zenith Tropospheric Delay (ZTD) due to water vapor, derived from space geodetic techniques and numerical weather prediction reanalysis data, exhibits non-linear and non-stationary properties akin to those in the crucial geophysical signals of interest to the research community. These time series, once decomposed into additive (and stochastic) components, carry information about long-term global change (the trend) and other interpretable (quasi-)periodic components such as seasonal cycles and noise. Such stochastic components could be functions that exhibit at most one extremum within a data span, or monotonic functions within a certain temporal span. In this contribution, we examine the use of the combined Ensemble Empirical Mode Decomposition (EEMD) and Independent Component Analysis (ICA), the EEMD-ICA algorithm, to extract the independent local oscillatory stochastic components in the tropospheric delay derived from the European Centre for Medium-Range Weather Forecasts (ECMWF) over six geodetic sites (HartRAO, Hobart26, Wettzell, Gilcreek, Westford, and Tsukub32). The proposed methodology allows independent geophysical processes to be extracted and assessed. Analysis of the quality index of the Independent Components (ICs) derived for each cluster of local oscillatory components (also called the Intrinsic Mode Functions (IMFs)) for all the geodetic stations considered in the study demonstrates that they are strongly site dependent. Such strong dependency seems to suggest that the localized geophysical signals embedded in the ZTD over the geodetic sites are not correlated. Further, from the viewpoint of non-linear dynamical systems, four geophysical signals, namely the Quasi-Biennial Oscillation (QBO) index derived from the NCEP/NCAR reanalysis, the Southern Oscillation Index (SOI) anomaly from NCEP, the SIDC monthly Sun Spot Number (SSN), and the Length of Day (LoD), are linked to the extracted signal components from ZTD. Results from the synchronization analysis show that ZTD and the geophysical signals exhibit (albeit subtle) site-dependent phase synchronization indices.
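A minimal sketch of the EEMD-ICA idea on a synthetic ZTD-like series, assuming the PyEMD package (PyPI name "EMD-signal") for EEMD and scikit-learn's FastICA; the study's actual implementation details are not given in the abstract, so this is only illustrative:

```python
import numpy as np
from PyEMD import EEMD                      # assumed: pip install EMD-signal
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 6, 1200)
# stand-in for a ZTD series: trend + seasonal cycle + noise
ztd = 0.3 * t + np.sin(2 * np.pi * t) + 0.2 * rng.normal(size=t.size)

imfs = EEMD(trials=50).eemd(ztd, t)         # local oscillatory components (IMFs)
# pass the oscillatory IMFs (dropping the final residual/trend) to ICA
ics = FastICA(n_components=2, random_state=0).fit_transform(imfs[:-1].T)
print("IMFs:", imfs.shape, "ICs:", ics.shape)
```

The extracted ICs would then be compared against external indices (QBO, SOI, SSN, LoD) via a synchronization or correlation analysis, as the study describes.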
Gao, Min-Tian; Shimamura, Takashi; Ishida, Nobuhiro; Takahashi, Haruo
2012-09-01
In this study, component analysis of a novel biodiesel-producing alga, Pseudochoricystis ellipsoidea, was performed. The results indicated that proteins and amino acids are abundant in P. ellipsoidea while the sugar content is relatively low. Using the algal biomass residue left after oil extraction as a nutrient source, rather than as a carbon source, provided a new way of lowering the total production cost of biodiesel. In both lactic acid and ethanol fermentations, the use of the residue instead of high-cost yeast extract allowed significant savings, showing the promise of the algal biomass residue for use as a fermentation nutrient source. Copyright © 2012 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Roldán, J. B.; Miranda, E.; González-Cordero, G.; García-Fernández, P.; Romero-Zaliz, R.; González-Rodelas, P.; Aguilera, A. M.; González, M. B.; Jiménez-Molinos, F.
2018-01-01
A multivariate analysis of the parameters that characterize the reset process in Resistive Random Access Memory (RRAM) has been performed. The different correlations obtained can help shed light on the current components that contribute to the Low Resistance State (LRS) of the technology considered. In addition, a screening method for the Quantum Point Contact (QPC) current component is presented. For this purpose, the second derivative of the current has been obtained using a novel numerical method which allows the QPC model parameters to be determined. Once the procedure is completed, a whole Resistive Switching (RS) series of thousands of curves is studied by means of a genetic algorithm. The extracted QPC parameter distributions are characterized in depth to obtain information about the filamentary pathways associated with the LRS in the low-voltage conduction regime.
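The authors' novel numerical method is not reproduced here; as a generic stand-in, a smoothed numerical second derivative of a noisy current-voltage trace can be obtained with a Savitzky-Golay filter:

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(4)
v = np.linspace(0.0, 1.0, 501)                          # voltage sweep
# synthetic LRS trace: ohmic part plus a kink appearing near 0.6 V
i = 1e-4 * v + 5e-5 * np.maximum(v - 0.6, 0.0) ** 2
i += 1e-7 * rng.normal(size=v.size)                     # measurement noise

dv = v[1] - v[0]
# smoothed numerical second derivative d2I/dV2 (window/order are tunable)
d2i = savgol_filter(i, window_length=51, polyorder=3, deriv=2, delta=dv)
print("maximum curvature near V =", v[np.argmax(d2i)])
```

Peaks in the second derivative flag where a new conduction component switches on, which is the kind of signature a QPC screening method exploits.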
Large-scale structural optimization
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, J.
1983-01-01
Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large-scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single rather than trade-off design methodology, and the incompatibility of large-scale optimization with single-program, single-computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop; full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique and made to embody the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.
Designing and encoding models for synthetic biology.
Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas
2009-08-06
A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding its subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formulation and parametrization. Evaluation of simulation results and validation of the model close the systems biology 'loop'.
Chang, Le; Baseggio, Oscar; Sementa, Luca; Cheng, Daojian; Fronzoni, Giovanna; Toffoli, Daniele; Aprà, Edoardo; Stener, Mauro; Fortunelli, Alessandro
2018-06-13
We introduce Individual Component Maps of Rotatory Strength (ICM-RS) and Rotatory Strength Density (RSD) plots as analysis tools of chiro-optical linear response spectra deriving from time-dependent density functional theory (TDDFT) simulations. ICM-RS and RSD allow one to visualize the origin of chiro-optical response in momentum or real space, including signed contributions and therefore highlighting cancellation terms that are ubiquitous in chirality phenomena, and should be especially useful in analyzing the spectra of complex systems. As test cases, we use ICM-RS and RSD to analyze circular dichroism spectra of selected (Ag-Au)30(SR)18 monolayer-protected metal nanoclusters, showing the potential of the proposed tools to derive insight and understanding, and eventually rational design, in chiro-optical studies of complex systems.
Evaluation of LANDSAT MSS vs TM simulated data for distinguishing hydrothermal alteration
NASA Technical Reports Server (NTRS)
Abrams, M. J.; Kahle, A. B.; Madura, D. P.; Soha, J. M.
1978-01-01
The LANDSAT Follow-On (LFO) data was simulated to demonstrate the mineral exploration capability of this system for segregating different types of hydrothermal alteration and to compare this capability with that of the existing LANDSAT system. Multispectral data were acquired for several test sites with the Bendix 24-channel MSDS scanner. Contrast enhancements, band ratioing, and principal component transformations were used to process the simulated LFO data for analysis. For Red Mountain, Arizona, the LFO data allowed identification of silicified areas, not identifiable with LANDSAT 1 and 2 data. The improved LFO resolution allowed detection of small silicic outcrops and of a narrow silicified dike. For Cuprite - Ralston, Nevada, the LFO spectral bands allowed discrimination of argillic and opalized altered areas; these could not be spectrally discriminated using LANDSAT 1 and 2 data. Addition of data from the 1.3- and 2.2- micrometer regions allowed better discriminations of hydrothermal alteration types.
Concept and set-up of an IR-gas sensor construction kit
NASA Astrophysics Data System (ADS)
Sieber, Ingo; Perner, Gernot; Gengenbach, Ulrich
2015-10-01
The paper presents an approach to a cost-efficient, modularly built non-dispersive infrared (NDIR) gas sensor based on a construction kit. The modularity of the approach offers several advantages. First of all, it allows adaptation of the performance of the gas sensor to individual specifications by choosing suitable modular components. The sensitivity of the sensor can, for example, be altered by selecting a source which emits a favorable wavelength spectrum with respect to the absorption spectrum of the gas to be measured, or by tuning the measuring distance (the ray path inside the medium to be measured). Furthermore, the developed approach is very well suited for use in teaching. Together with students, a construction kit based on an optical free-space system was developed and partly implemented, to be further used as a teaching and training aid for bachelor and master students at our institute. The components of the construction kit are interchangeable and freely fixable on a base plate. The components are classified into five groups: sources, reflectors, detectors, gas feed, and analysis cell. The source, the detector, and the positions of the components are fundamental for experimenting with and testing different configurations and beam paths. The reflectors are implemented by an aluminum-coated adhesive foil mounted onto a support structure fabricated by additive manufacturing. This approach allows derivation of the reflecting surface geometry from the optical design tool and generation of the 3D-printing files by applying related design rules. The rapid fabrication process and the adjustment of the modules on the base plate allow rapid, almost LEGO®-like, experimental assessment of design ideas. The subject of this paper is the modeling, design, and optimization of the reflective optical components as well as of the optical subsystem. The realization of a sample set-up used as a teaching aid and the optical measurement of the beam path in comparison to the simulation results are shown as well.
Ion Elevators and Escalators in Multilevel Structures for Lossless Ion Manipulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Yehia M.; Hamid, Ahmed M.; Cox, Jonathan T.
2017-01-19
We describe two approaches based upon ion 'elevator' and 'escalator' components that allow moving ions to different levels in structures for lossless ion manipulations (SLIM). Guided by ion motion simulations, we designed elevator and escalator components providing essentially lossless transmission in multi-level designs, based upon ion current measurements. The ion elevator design allowed ions to efficiently bridge a 4 mm gap between levels. The component was integrated in a SLIM and coupled to a QTOF mass spectrometer using an ion funnel interface to evaluate the m/z range transmitted as compared to transmission within a level (e.g. in a linear section). Singly-charged ions of m/z 600-2700 produced similar mass spectra for both elevator and straight (linear motion) components. In the ion escalator design, traveling waves (TW) were utilized to transport ions efficiently between two SLIM levels. Ion current measurements and ion mobility (IM) spectrometry analysis illustrated that ions can be transported between TW-SLIM levels with no significant loss of either ions or IM resolution. These developments provide a path for the development of multi-level designs providing, e.g., much longer IM path lengths, more compact designs, and the implementation of much more complex SLIM devices in which, e.g., different levels may operate at different temperatures or with different gases.
NASA Astrophysics Data System (ADS)
Silva, N.; Esper, A.
2012-01-01
The work presented in this article represents the results of applying RAMS analysis to a critical space control system, at both system and software levels. The system-level RAMS analysis allowed the assignment of criticalities to the high-level components, which was further refined by a tailored software-level RAMS analysis. The importance of the software-level RAMS analysis in the identification of new failure modes and its impact on the system-level RAMS analysis are discussed. Changes in the software architecture have also been recommended in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, highlighting its importance for space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.
NASA Astrophysics Data System (ADS)
Krasnenko, N. P.; Kapegesheva, O. F.; Shamanaeva, L. G.
2017-11-01
The spatiotemporal dynamics of the standard deviations of three wind velocity components measured with a mini-sodar in the atmospheric boundary layer are analyzed. During the day on September 16 and at night on September 12, values of the standard deviation varied from 0.5 to 4 m/s for the x- and y-components, and from 0.2 to 1.2 m/s for the z-component. An analysis of the vertical profiles of the standard deviations of the three wind velocity components for a 6-day measurement period has shown that the increase of σx and σy with altitude is well described by a power-law dependence with an exponent changing from 0.22 to 1.3 depending on the time of day, while σz depends linearly on the altitude. The approximation constants have been found and their errors have been estimated. The established physical regularities and the approximation constants allow the spatiotemporal dynamics of the standard deviations of the three wind velocity components in the atmospheric boundary layer to be described, and can be recommended for application in ABL models.
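A power-law profile of this form, σ(z) = a·z^b, can be fitted and its parameter errors estimated with a nonlinear least-squares routine; a brief sketch with synthetic data standing in for the sodar profiles:

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(z, a, b):
    return a * z ** b

z = np.linspace(10.0, 200.0, 20)                        # altitude, m
rng = np.random.default_rng(5)
sigma_x = 0.4 * z ** 0.8 * (1 + 0.05 * rng.normal(size=z.size))  # fake profile

(a, b), pcov = curve_fit(power_law, z, sigma_x, p0=(1.0, 0.5))
a_err, b_err = np.sqrt(np.diag(pcov))                   # 1-sigma parameter errors
print(f"sigma_x(z) ~ {a:.2f} * z^{b:.2f} (exponent error {b_err:.2f})")
```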
Barba, Lida; Sánchez-Macías, Davinia; Barba, Iván; Rodríguez, Nibaldo
2018-06-01
Guinea pig meat consumption is increasing rapidly worldwide. Evaluating the contribution of carcass components to overall carcass quality can potentially allow estimation of the value added to food of animal origin and make research in guinea pigs more practicable. The aim of this study was to propose a methodology for modelling the contribution of different carcass components to the overall carcass quality of guinea pigs by using non-invasive pre- and post-mortem carcass measurements. The selection of predictors was based on correlation analysis and statistical significance, whereas the prediction models were based on multiple linear regression. The results showed higher accuracy when the contribution of carcass components was predicted in grams than when it was expressed as a percentage of carcass quality components. The proposed prediction models can be useful for the guinea pig meat industry and research institutions by using non-invasive, time- and cost-efficient carcass component measuring techniques. Copyright © 2018 Elsevier Ltd. All rights reserved.
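A minimal sketch of the described workflow, correlation-based predictor selection followed by multiple linear regression, on made-up stand-in measurements (the variable names below are hypothetical, not the study's):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n = 60
# hypothetical non-invasive measurements: live weight, body length, chest girth
X = rng.normal(loc=[1000, 30, 18], scale=[120, 2, 1.5], size=(n, 3))
# hypothetical response: muscle mass in grams
muscle_g = 0.35 * X[:, 0] + 4.0 * X[:, 2] + rng.normal(scale=15, size=n)

# keep only predictors meaningfully correlated with the response
r = np.array([np.corrcoef(X[:, j], muscle_g)[0, 1] for j in range(X.shape[1])])
keep = np.abs(r) > 0.3
model = LinearRegression().fit(X[:, keep], muscle_g)
print("kept predictors:", np.where(keep)[0],
      " R^2 =", round(model.score(X[:, keep], muscle_g), 3))
```

Predicting grams rather than percentages, as the study reports, avoids the compositional constraint that percentages must sum to 100, which tends to degrade regression accuracy.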
Sánchez-Salcedo, Eva M; Tassotti, Michele; Del Rio, Daniele; Hernández, Francisca; Martínez, Juan José; Mena, Pedro
2016-12-01
This study reports the (poly)phenolic fingerprinting and chemometric discrimination of leaves of eight mulberry clones from Morus alba and Morus nigra cultivated in Spain. High-throughput UHPLC-MS(n) (Ultra-High-Performance Liquid Chromatography-Mass Spectrometry) analysis allowed the tentative identification of a total of 31 compounds. The phenolic profile of mulberry leaf was characterized by a high number of flavonol derivatives, mainly glycosylated forms of quercetin and kaempferol. Caffeoylquinic acids, simple phenolic acids, and some organic acids were also detected. Seven compounds were identified for the first time in mulberry leaves. The chemometric analysis (cluster analysis and principal component analysis) of the chromatographic data allowed the characterization of the different mulberry clones and served to explain the great intraspecific variability in mulberry secondary metabolism. This screening of the complete phenolic profile of mulberry leaves can support quality control, germplasm screening, and bioactivity evaluation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Direct analysis of herbal powders by pipette-tip electrospray ionization mass spectrometry.
Wang, Haixing; So, Pui-Kin; Yao, Zhong-Ping
2014-01-27
Conventional electrospray ionization mass spectrometry (ESI-MS) is widely used for the analysis of solution samples. The development of solid-substrate ESI-MS allows direct ionization analysis of bulky solid samples. In this study, we developed pipette-tip ESI-MS, a technique that combines pipette tips with a syringe and a syringe pump, for the direct analysis of herbal powders, another common form of sample. We demonstrated that various herbal powder samples, including herbal medicines and food samples, could be readily extracted online and analyzed using this technique. Various powder samples, such as Rhizoma coptidis, lotus plumule, great burdock achene, black pepper, Panax ginseng, roasted coffee beans, Fructus Schisandrae Chinensis and Fructus Schisandrae Sphenantherae, were analyzed using pipette-tip ESI-MS, and quality mass spectra with stable and durable signals could be obtained. Both positive and negative ion modes were attempted, and various compounds including amino acids, oligosaccharides, glycosides, alkaloids, organic acids, ginsenosides, flavonoids and lignans could be detected. Principal component analysis (PCA) based on the acquired mass spectra allowed rapid differentiation of closely related herbal species. Copyright © 2013 Elsevier B.V. All rights reserved.
Systems-Level Analysis of Innate Immunity
Zak, Daniel E.; Tam, Vincent C.; Aderem, Alan
2014-01-01
Systems-level analysis of biological processes strives to comprehensively and quantitatively evaluate the interactions between the relevant molecular components over time, thereby enabling development of models that can be employed to ultimately predict behavior. Rapid development in measurement technologies (omics), when combined with the accessible nature of the cellular constituents themselves, is allowing the field of innate immunity to take significant strides toward this lofty goal. In this review, we survey exciting results derived from systems biology analyses of the immune system, ranging from gene regulatory networks to influenza pathogenesis and systems vaccinology. PMID:24655298
DOSY Analysis of Micromolar Analytes: Resolving Dilute Mixtures by SABRE Hyperpolarization.
Reile, Indrek; Aspers, Ruud L E G; Tyburn, Jean-Max; Kempf, James G; Feiters, Martin C; Rutjes, Floris P J T; Tessari, Marco
2017-07-24
DOSY is an NMR spectroscopy technique that resolves resonances according to the analytes' diffusion coefficients. It has found use in correlating NMR signals and estimating the number of components in mixtures. Applications of DOSY in dilute mixtures are, however, held back by excessively long measurement times. We demonstrate herein, how the enhanced NMR sensitivity provided by SABRE hyperpolarization allows DOSY analysis of low-micromolar mixtures, thus reducing the concentration requirements by at least 100-fold. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
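For context, the relation underlying DOSY (standard pulsed-field-gradient NMR theory, not quoted in the abstract itself) links the measured signal attenuation to the diffusion coefficient D via the Stejskal-Tanner equation:

I(g) = I_0 \exp\!\left[-D\,\gamma^2 g^2 \delta^2 \left(\Delta - \tfrac{\delta}{3}\right)\right]

where \gamma is the gyromagnetic ratio, g the gradient amplitude, \delta the gradient pulse duration, and \Delta the diffusion delay. Fitting this decay at each resonance yields the per-analyte diffusion coefficients by which DOSY resolves a mixture; the SABRE enhancement reported here shortens the time needed to measure the decay at micromolar concentrations.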
Komorowski, Dariusz; Pietraszek, Stanislaw
2016-01-01
This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters such as dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The use of the continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the time scale of the observation (analysis). Wavelet analysis allows using long time windows when more precise low-frequency information is needed, and shorter windows when high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using the properties of circular convolution to improve the speed of calculation. This method allows results to be obtained for relatively long EGG records in a fairly short time, much faster than with classical methods based on running spectrum analysis (RSA). In this study, the authors demonstrate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown for an example analysis of four-channel EGG recordings performed for a non-caloric meal.
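A minimal sketch of an FFT-based CWT, multiplying the signal's FFT by a frequency-domain Morlet wavelet at each scale (the normalization is simplified and the paper's exact implementation is not reproduced here):

```python
import numpy as np

def cwt_fft(x, scales, dt=1.0, w0=6.0):
    """Continuous wavelet transform via the FFT (circular convolution),
    using an analytic Morlet wavelet defined in the frequency domain."""
    n = x.size
    omega = 2 * np.pi * np.fft.fftfreq(n, d=dt)   # angular frequencies
    xf = np.fft.fft(x)
    out = np.empty((len(scales), n), dtype=complex)
    for k, s in enumerate(scales):
        # Morlet in the frequency domain, nonzero for positive frequencies only
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        out[k] = np.fft.ifft(xf * np.conj(psi_hat) * np.sqrt(s))
    return out

t = np.arange(0, 60, 0.5)                         # 2 Hz sampling
x = np.sin(2 * np.pi * 0.05 * t)                  # 3 cpm, a normogastric-like rate
W = cwt_fft(x, scales=np.geomspace(2, 64, 30), dt=0.5)
print(np.abs(W).shape)                            # (scales, time)
```

Computing each scale as a single FFT-based multiplication is what makes long EGG records tractable, compared with evaluating the convolution integral directly.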
NASA Astrophysics Data System (ADS)
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which allows the dimensionality of the data space to be reduced while maintaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies: PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we introduce the vbICA technique and present its application to synthetic data that simulate a GPS network recording ground deformation in a tectonically active region, with synthetic time series containing interseismic, coseismic, and postseismic deformation, plus seasonal deformation, and white and coloured noise. We study the ability of the algorithm to recover the original (known) sources of deformation, and then apply it to a real scenario: the Emilia seismic sequence (2012, northern Italy), an example of a seismic sequence that occurred in a slowly converging tectonic setting, characterized by several local to regional anthropogenic or natural sources of deformation, mainly subsidence due to fluid withdrawal and sediment compaction. We apply both PCA and vbICA to displacement time series recorded by continuous GPS and InSAR (Pezzo et al., EGU2015-8950).
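vbICA itself is not available in common libraries; as a simpler stand-in that still illustrates the PCA-vs-ICA contrast on a synthetic GPS-like mixture, scikit-learn's FastICA can be used (all signals below are made up):

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(7)
t = np.arange(0.0, 1500.0)                                 # days
seasonal = np.sin(2 * np.pi * t / 365.25)                  # annual signal
postseismic = np.log1p(np.maximum(t - 700.0, 0.0) / 50.0)  # transient after day 700
S = np.c_[seasonal, postseismic]                           # two "true" sources

A = rng.normal(size=(2, 12))                               # mixing onto 12 station channels
X = S @ A + 0.05 * rng.normal(size=(t.size, 12))

ics = FastICA(n_components=2, random_state=0).fit_transform(X)
pcs = PCA(n_components=2).fit_transform(X)
# ICA recovers the seasonal and transient sources up to sign and scale,
# while the orthogonal PCs remain mixtures of the two
print("ICs:", ics.shape, "PCs:", pcs.shape)
```

The vbICA method described above goes further by modelling each source pdf as a Gaussian mixture, which relaxes the distributional assumptions FastICA makes.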
Understanding electrostatic charge behaviour in aircraft fuel systems
NASA Astrophysics Data System (ADS)
Ogilvy, Jill A.; Hooker, Phil; Bennett, Darrell
2015-10-01
This paper presents work on the simulation of electrostatic charge build-up and decay in aircraft fuel systems. A model (EC-Flow) has been developed by BAE Systems under contract to Airbus to allow the user to assess the effects of changes in design or in refuel conditions. Some of the principles behind the model are outlined. The model allows for a range of system components, including metallic and non-metallic pipes, valves, filters, junctions, bends and orifices. A purpose-built experimental rig was built at the Health and Safety Laboratory in Buxton, UK, to provide comparison data. The rig comprises a fuel delivery system, a test section where different components may be introduced into the system, and a Faraday pail for measuring generated charge. Diagnostics include wall currents, charge densities and pressure losses. The paper presents sample results from the fitting of model predictions to measurement data and shows how the analysis may be used to explain some of the observed trends.
STAR Online Framework: from Metadata Collection to Event Analysis and System Control
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.
2015-05-01
In preparation for the new era of RHIC running (the RHIC-II upgrades and, possibly, the eRHIC era), the STAR experiment is expanding its modular Message Interface and Reliable Architecture framework (MIRA). MIRA has allowed STAR to integrate meta-data collection, monitoring, and online QA components in a very agile and efficient manner using a messaging infrastructure approach. In this paper, we briefly summarize our past achievements, provide an overview of the recent development activities focused on messaging patterns, and describe our experience with the complex event processor (CEP) recently integrated into the MIRA framework. The CEP was used in the recent RHIC Run 14, which provided practical use cases. Finally, we present our requirements and expectations for the planned expansion of our systems, which will allow our framework to acquire features typically associated with detector control systems. Special attention is given to aspects related to latency, scalability and interoperability within the heterogeneous set of services and the various data and meta-data acquisition components coexisting in the STAR online domain.
Estimating the vibration level of an L-shaped beam using power flow techniques
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.; Mccollum, M.; Rassineux, J. L.; Gilbert, T.
1986-01-01
The response of one component of an L-shaped beam, with point force excitation on the other component, is estimated using the power flow method. The transmitted power from the source component to the receiver component is expressed in terms of the transfer and input mobilities at the excitation point and the joint. The response is estimated both in narrow frequency bands, using the exact geometry of the beams, and as a frequency averaged response using infinite beam models. The results using this power flow technique are compared to the results obtained using finite element analysis (FEA) of the L-shaped beam for the low frequency response and to results obtained using statistical energy analysis (SEA) for the high frequencies. The agreement between the FEA results and the power flow method results at low frequencies is very good. SEA results are in terms of frequency averaged levels and these are in perfect agreement with the results obtained using the infinite beam models in the power flow method. The narrow frequency band results from the power flow method also converge to the SEA results at high frequencies. The advantage of the power flow method is that detail of the response can be retained while reducing computation time, which will allow the narrow frequency band analysis of the response to be extended to higher frequencies.
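For context, a commonly used mobility formulation of the transmitted power for a point-coupled source-receiver pair is shown below; the paper's exact expression is not quoted in the abstract, so treat this as the generic textbook relation rather than the authors' formula:

P_t = \frac{1}{2}\,|F|^2\,\frac{|Y_{ts}|^2}{|Y_s + Y_r|^2}\,\operatorname{Re}\{Y_r\}

where F is the excitation force, Y_{ts} the transfer mobility of the source component between the excitation point and the joint, and Y_s and Y_r the input mobilities of the source and receiver components at the joint. Frequency-averaging the mobilities (e.g., by using infinite-beam mobilities) is what yields the smooth high-frequency estimates that converge to the SEA results.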
Translational PK/PD of Anti-Infective Therapeutics
Rathi, Chetan; Lee, Richard E.; Meibohm, Bernd
2016-01-01
Translational PK/PD modeling has emerged as a critical technique for quantitative analysis of the relationship between dose, exposure and response of antibiotics. By combining model components for pharmacokinetics, bacterial growth kinetics and concentration-dependent drug effects, these models are able to quantitatively capture and simulate the complex interplay between antibiotic, bacterium and host organism. Fine-tuning of these basic model structures allows one to further account for complicating factors such as resistance development, combination therapy, or host responses. With this tool set at hand, mechanism-based PK/PD modeling and simulation makes it possible to develop optimal dosing regimens for novel and established antibiotics for maximum efficacy and minimal resistance development. PMID:27978987
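A minimal sketch of the basic structure described here: one-compartment drug elimination coupled to capacity-limited bacterial growth with a concentration-dependent Emax kill term (all parameter values are arbitrary illustrations):

```python
import numpy as np
from scipy.integrate import solve_ivp

def pkpd(t, y, ke, kg, Nmax, Emax, EC50):
    """PK component (drug concentration C) driving a PD kill term on a
    capacity-limited bacterial population N."""
    C, N = y
    dC = -ke * C                                  # first-order drug elimination
    kill = Emax * C / (EC50 + C)                  # concentration-dependent kill rate
    dN = kg * N * (1 - N / Nmax) - kill * N       # growth minus drug effect
    return [dC, dN]

sol = solve_ivp(pkpd, (0, 24), [8.0, 1e6],
                args=(0.3, 0.8, 1e9, 2.0, 1.0), max_step=0.1)
print("bacterial count at 24 h:", f"{sol.y[1, -1]:.2e}")
```

Simulating such a system across candidate dosing regimens is how dose-exposure-response relationships are explored before committing to a regimen.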
The approach to engineering tasks composition on knowledge portals
NASA Astrophysics Data System (ADS)
Novogrudska, Rina; Globa, Larysa; Schill, Alexsander; Romaniuk, Ryszard; Wójcik, Waldemar; Karnakova, Gaini; Kalizhanova, Aliya
2017-08-01
The paper presents an approach to engineering task composition on engineering knowledge portals. The specific features of engineering tasks are highlighted, and their analysis forms the basis for the integration of partial engineering tasks. A formal algebraic system for engineering task composition is proposed, allowing context-independent formal structures for the description of engineering task elements to be defined. A method of engineering task composition is developed that allows partial calculation tasks to be integrated into general calculation tasks on engineering portals, performed on user request. The real-world scenario «Calculation of the strength for the power components of magnetic systems» is presented, confirming the applicability and efficiency of the proposed approach.
Vibrational Spectroscopy as a Promising Toolbox for Analyzing Functionalized Ceramic Membranes.
Kiefer, Johannes; Bartels, Julia; Kroll, Stephen; Rezwan, Kurosch
2018-01-01
Ceramic materials find use in many fields, including the life sciences and environmental engineering. For example, ceramic membranes have been shown to be promising filters for water treatment and virus retention. The analysis of such materials, however, remains challenging. In the present study, the potential of three vibrational spectroscopic methods for characterizing functionalized ceramic membranes for water treatment is evaluated. For this purpose, Raman scattering, infrared (IR) absorption, and solvent infrared spectroscopy (SIRS) were employed. The data were analyzed with respect to spectral changes as well as using principal component analysis (PCA). The Raman spectra allow an unambiguous discrimination of the sample types. The IR spectra do not change systematically with the functionalization state of the material. Solvent infrared spectroscopy allows a systematic distinction and enables studying the molecular interactions between the membrane surface and the solvent.
NASA Astrophysics Data System (ADS)
Alyami, Abeer; Saviello, Daniela; McAuliffe, Micheal A. P.; Cucciniello, Raffaele; Mirabile, Antonio; Proto, Antonio; Lewis, Liam; Iacopino, Daniela
2017-08-01
Au nanorods were used as an alternative to the commonly used Ag nanoparticles as Surface-Enhanced Raman Scattering (SERS) probes for identification of the dye composition of blue BIC ballpoint pens. When used in combination with Thin Layer Chromatography (TLC), Au nanorod colloids allowed identification of the major dye components of the BIC pen ink, otherwise not identifiable by normal Raman spectroscopy. Thanks to their enhanced chemical stability compared to Ag colloids, Au nanorods provided stable and reproducible SERS signals and allowed easy identification of phthalocyanine and triarylene dyes in the pen ink mixture. These findings were supported by FTIR and MALDI analyses, also performed on the pen ink. Furthermore, the self-assembly of Au nanorods into large-area ordered superstructures allowed identification of BIC pen traces. SERS spectra of good intensity and high reproducibility were obtained using Au nanorod vertical arrays, due to the high density of hot spots and the morphological reproducibility of these superstructures. These results open the way to the employment of SERS for fast screening and for quantitative analysis of pens and faded pens, which is relevant to the fields of forensic and art conservation science.
Analysis of dynamical response of air blast loaded safety device
NASA Astrophysics Data System (ADS)
Tropkin, S. N.; Tlyasheva, R. R.; Bayazitov, M. I.; Kuzeev, I. R.
2018-03-01
Equipment at many oil and gas processing plants in the Russian Federation is considerably worn out. This decreases the reliability and durability of the equipment and raises the accident rate. An air explosion is one of the most dangerous scenarios for plants in the oil and gas industry, usually caused by the uncontrolled emission and ignition of oil products. An air explosion can pose significant danger to the life and health of plant staff, which necessitates the use of a safety device. A new type of safety device has been designed. Numerical simulation is necessary to analyse the design parameters and performance of the safety device subjected to air blast loading. A coupled fluid-structure interaction analysis is performed to determine the strength of the protective device and its performance. The coupled Euler-Lagrange (CEL) method, available in Abaqus by SIMULIA, is selected as the most appropriate analysis tool to study blast wave interaction with the safety device. Absorption factors of the blast wave are evaluated for the safety device. These factors allow one to assess the efficiency of the safety device and of its main structural component, the dampener. The use of CEL allowed the dampener behaviour to be modelled quickly and accurately, and a parametric model to be developed to determine the safety device sizes.
Grimbergen, M C M; van Swol, C F P; Kendall, C; Verdaasdonk, R M; Stone, N; Bosch, J L H R
2010-01-01
The overall quality of Raman spectra in the near-infrared region, where biological samples are often studied, has benefited from various improvements to optical instrumentation over the past decade. However, obtaining ample spectral quality for analysis is still challenging due to device requirements and the short integration times required for (in vivo) clinical applications of Raman spectroscopy. Multivariate analytical methods, such as principal component analysis (PCA) and linear discriminant analysis (LDA), are routinely applied to Raman spectral datasets to develop classification models. Data compression is necessary prior to discriminant analysis to prevent or decrease the degree of over-fitting. The logical threshold for the selection of principal components (PCs) to be used in discriminant analysis is likely to be at a point before the PCs begin to introduce equivalent signal and noise and, hence, include no additional value. Assessment of the signal-to-noise ratio (SNR) at a certain peak or over a specific spectral region will depend on the sample measured. Therefore, the mean SNR over the whole spectral region (SNR(msr)) is determined in the original spectrum as well as for spectra reconstructed from an increasing number of principal components. This paper introduces a method of assessing the influence of signal and noise from individual PC loadings and indicates a method of selecting PCs for LDA. To evaluate this method, two data sets with different SNRs were used. The sets were obtained with the same Raman system and the same measurement parameters on bladder tissue collected during white-light cystoscopy (set A) and fluorescence-guided cystoscopy (set B). This method shows that the mean SNR over the spectral range in the original Raman spectra of these two data sets is related to the signal and noise contributions of the principal component loadings. The difference in mean SNR over the spectral range can also be appreciated, since fewer principal components can reliably be used in the low-SNR data set (set B) than in the high-SNR data set (set A). Although no definitive threshold could be found, this method may help to determine the cutoff for the number of principal components used in discriminant analysis. Future analysis of a selection of spectral databases using this technique will allow optimum thresholds to be selected for different applications and spectral data quality levels.
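A minimal sketch of the underlying idea, tracking a crude mean-SNR estimate as spectra are reconstructed from increasing numbers of PCs (the noise estimator here is a generic first-difference one, not the paper's):

```python
import numpy as np
from sklearn.decomposition import PCA

def mean_snr(spectra):
    """Crude mean SNR over the spectral range: mean |signal| divided by a
    noise estimate taken from first differences between adjacent channels."""
    noise = np.diff(spectra, axis=1).std() / np.sqrt(2.0)
    return np.abs(spectra).mean() / noise

rng = np.random.default_rng(8)
clean = np.sin(np.linspace(0, 8, 300))[None, :] * rng.uniform(0.5, 1.5, (40, 1))
spectra = clean + 0.2 * rng.normal(size=clean.shape)   # noisy stand-in spectra

pca = PCA().fit(spectra)
scores = pca.transform(spectra)
print("original mean SNR:", round(mean_snr(spectra), 2))
for k in (1, 2, 5, 10, 20):
    recon = scores[:, :k] @ pca.components_[:k] + pca.mean_
    print(f"{k:2d} PCs -> reconstruction SNR {mean_snr(recon):.2f}")
# a natural cutoff is where the reconstruction SNR approaches the original's,
# i.e. where additional PCs start contributing mostly noise
```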
NASA Astrophysics Data System (ADS)
Golenko, Mariya; Golenko, Nikolay
2014-05-01
Numerical modeling of the spatial structure of currents in several regions of the Baltic Sea is performed on the basis of POM (Princeton Ocean Model). The calculations were performed under westerly (most frequent in the Baltic) and north-easterly wind forcings. In the regions adjacent to the Kaliningrad Region, Polish and Lithuanian coasts, these winds generate oppositely directed geostrophic, drift and other types of currents. On the whole, these processes can be considered as downwelling and upwelling. Apart from the regions mentioned above, the Slupsk Furrow region, which determines the mass and momentum exchange between the Western and Central Baltic, is also considered. In the analysis of the currents, not only the whole model velocity but also the components directed along and across the barotropic geostrophic current velocity are considered. The along-geostrophic component, in turn, is separated into the geostrophic current itself and an ageostrophic part; the across-geostrophic component is entirely ageostrophic. The velocity components directed along and across the geostrophic current approximately describe the velocity components directed along the coast (along isobaths) and from the coast towards the open sea. The suggested approach allowed the current spatial structures typical of different wind forcings to be presented as two maps, with the components directed along and across the barotropic geostrophic current velocity. On these maps, the areas of intensive alongshore currents are clearly depicted (e.g., near the base of the Hel Spit and in the region of the Slupsk Sill). The combined analysis of the vectors of the whole and geostrophic velocities reveals the areas where the geostrophic component is significantly strengthened or weakened by the ageostrophic component. Under a westerly wind, such current features are clearly observed near the end of the Hel Spit and at the southern border of the Slupsk Sill; under a north-easterly wind, near the base of the Hel Spit, at the southern border of the Slupsk Furrow, and near the Curonian Spit (where the relief bends). The maps presenting the spatial distributions of the across-shore velocities discriminate the areas where mass and momentum transport from the shore to the open sea in the surface layer, and vice versa, takes place. The areas where sharp changes of the different velocity components are expected under changing winds, as well as the areas where such changes are expected to be minimal, are also revealed. The model is validated using field surveys of current velocities by ADCP in the area adjacent to the Kaliningrad Region. The comparison of current velocities has shown close correspondence. Over a rather wide area, the directions and amplitudes of the model and ADCP surface velocities are close, which is additionally confirmed by a comparison of the local vorticity distributions. On the vertical transects of the ADCP current velocity directed across the shoreline, the geostrophic jet is clearly pronounced; its horizontal and vertical scales correspond closely to those of the model jet. Moreover, the more detailed calculations possible with the model have shown that the geostrophic currents amount on average to 40-60% of the whole velocity, while the two components of the ageostrophic velocity directed along and across the geostrophic velocity are highly variable (from 10 to 60% of the whole velocity).
The ageostrophic component directed along the geostrophic current generally strengthens it (up to 20-40% in average and up to 60-70% near the end of the Hel Spit). But in some regions, for example, in the Slupsk Furrow the ageostrophic component slows down the geostrophic current (to 30-40%). In some narrow local areas immediately adjacent to the coast currents directed oppositely to the general quasi geostrophic jet were registered on both field and model data. Before the comparison with the field data these local jets revealed on the model data were considered as improbable. As a result, the comparative analysis of the field and model data led to more detail understanding of dynamic processes in some coastal parts of the Baltic Sea.
Peiris, Ramila H; Ignagni, Nicholas; Budman, Hector; Moresoli, Christine; Legge, Raymond L
2012-09-15
Characterization of the interactions between natural colloidal/particulate- and protein-like matter is important for understanding their contribution to different physicochemical phenomena such as membrane fouling, adsorption of bacteria onto surfaces, and various applications of nanoparticles in nanomedicine and nanotoxicology. Precise interpretation of the extent of such interactions is, however, hindered by the inability of most characterization methods to provide rapid, sensitive and accurate measurements. Here we report on a fluorescence-based excitation-emission matrix (EEM) approach in combination with principal component analysis (PCA) to extract information related to the interaction between natural colloidal/particulate- and protein-like matter. Surface plasmon resonance (SPR) analysis and fiber-optic probe based surface fluorescence measurements were used to confirm that the proposed approach can characterize colloidal/particulate-protein interactions at the physical level. This method has the potential to become a fundamental measurement of these interactions, with the advantage that it can be performed rapidly and with high sensitivity. Copyright © 2012 Elsevier B.V. All rights reserved.
A cyclostationary multi-domain analysis of fluid instability in Kaplan turbines
NASA Astrophysics Data System (ADS)
Pennacchi, P.; Borghesani, P.; Chatterton, S.
2015-08-01
Hydraulic instabilities represent a critical problem for Francis and Kaplan turbines, reducing their useful life through increased fatigue of the components and cavitation phenomena. Whereas an exhaustive list of publications on computational fluid-dynamic models of hydraulic instability is available, the possibility of applying diagnostic techniques based on vibration measurements has not been investigated sufficiently, partly because hydro turbine units are seldom equipped with the appropriate sensors. The aim of this study is to fill this knowledge gap and to fully exploit the potential of combining cyclostationary analysis tools, able to describe complex dynamics such as those of fluid-structure interactions, with order tracking procedures, which allow domain transformations and consequently the separation of synchronous and non-synchronous components. This paper focuses on experimental data obtained on a full-scale Kaplan turbine unit, operating in a real power plant, tackling the issues of adapting such diagnostic tools for the analysis of hydraulic instabilities and proposing techniques and methodologies for a highly automated condition monitoring system.
Big Data in Reciprocal Space: Sliding Fast Fourier Transforms for Determining Periodicity
Vasudevan, Rama K.; Belianinov, Alex; Gianfrancesco, Anthony G.; ...
2015-03-03
Significant advances in atomically resolved imaging of crystals and surfaces have occurred in the last decade, allowing unprecedented insight into local crystal structures and periodicity. Yet the analysis of long-range periodicity from local imaging data, critical to correlating functional properties and chemistry with the local crystallography, remains a challenge. Here, we introduce a Sliding Fast Fourier Transform (FFT) filter to analyze atomically resolved images of in-situ grown La5/8Ca3/8MnO3 films. We demonstrate the ability of the sliding FFT algorithm to differentiate two sub-lattices resulting from a mixed-terminated surface. Principal Component Analysis (PCA) and Independent Component Analysis (ICA) of the Sliding FFT dataset reveal the distinct changes in crystallography, step edges and boundaries between the multiple sub-lattices. The method is universal for images with any periodicity, and is especially amenable to atomically resolved probe and electron-microscopy data for rapid identification of the sub-lattices present.
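A minimal sketch of the sliding-FFT step (Python; the window size, step and Hann taper are illustrative choices, not details taken from the paper). Each row of the returned matrix is one local reciprocal-space snapshot, so PCA or ICA can be applied to it directly:

```python
import numpy as np

def sliding_fft(image, win=64, step=16):
    """Slide a win x win window across a 2D image and collect the
    magnitude of each window's 2D FFT as one flattened row.

    Returns (positions, data), where data has shape (n_windows, win*win)
    and is directly usable as input to PCA or ICA.
    """
    rows, cols = image.shape
    positions, data = [], []
    taper = np.hanning(win)[:, None] * np.hanning(win)[None, :]  # reduce edge leakage
    for r in range(0, rows - win + 1, step):
        for c in range(0, cols - win + 1, step):
            patch = image[r:r + win, c:c + win] * taper
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
            positions.append((r, c))
            data.append(spectrum.ravel())
    return np.array(positions), np.array(data)
```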
Delfino, Ines; Perna, Giuseppe; Lasalvia, Maria; Capozzi, Vito; Manti, Lorenzo; Camerlingo, Carlo; Lepore, Maria
2015-03-01
A micro-Raman spectroscopy investigation has been performed in vitro on single human mammary epithelial cells after irradiation with graded x-ray doses. Analysis by principal component analysis (PCA) and interval-PCA (i-PCA) methods has allowed us to point out the small differences induced in the Raman spectra by irradiation. This experimental approach has enabled us to delineate radiation-induced changes in protein, nucleic acid, lipid, and carbohydrate content. In particular, the dose dependence of the PCA and i-PCA components has been analyzed. Our results confirm that micro-Raman spectroscopy, coupled to properly chosen data analysis methods, is a very sensitive technique for detecting early molecular changes at the single-cell level following exposure to ionizing radiation. This would help in developing innovative approaches to monitor cancer radiotherapy outcomes so as to reduce the overall radiation dose and minimize damage to the surrounding healthy cells, both aspects being of great importance in the field of radiation therapy.
NASA Technical Reports Server (NTRS)
1979-01-01
The accompanying photos show two types of offshore oil platforms used by Exxon Corporation. In the upper photo is a leg-supported gravity platform; the other structure is a "jacket-type" platform, built in sections, towed to sea and assembled on-site. In construction of platforms like these, Exxon Production Research Company, Houston, Texas, conducts extensive structural investigations of decks, supporting members and other platform components, making use of the NASTRAN® (NASA Structural Analysis) computer program. NASTRAN is a predictive tool which analyzes a computerized design and reports how the structure will react to a great many conditions it will encounter in its operational environment; in this case, NASTRAN studies the effects of waves, winds, ocean storms and other stress-inducing factors. NASTRAN allows Exxon Production Research to perform more complex and more detailed analysis than was possible with previous programs. The same program has also been used by Exxon Research and Engineering Company, Florham Park, New Jersey, in analysis of pressure vessels, turbine components and composite building boards.
Spangenberg, J E; Dionisi, F
2001-09-01
The fatty acids from cocoa butters of different origins, varieties, and suppliers and a number of cocoa butter equivalents (Illexao 30-61, Illexao 30-71, Illexao 30-96, Choclin, Coberine, Chocosine-Illipé, Chocosine-Shea, Shokao, Akomax, Akonord, and Ertina) were investigated by bulk stable carbon isotope analysis and compound specific isotope analysis. The interpretation is based on principal component analysis combining the fatty acid concentrations and the bulk and molecular isotopic data. The scatterplot of the first two principal components allowed detection of the addition of vegetable fats to cocoa butters. Enrichment in the heavy carbon isotope ((13)C) of the bulk cocoa butter and of the individual fatty acids is related to mixing with other vegetable fats and possibly to thermally or oxidatively induced degradation during processing (e.g., drying and roasting of the cocoa beans or deodorization of the pressed fat) or storage. The feasibility of the analytical approach for authenticity assessment is discussed.
Relevant principal component analysis applied to the characterisation of Portuguese heather honey.
Martins, Rui C; Lopes, Victor V; Valentão, Patrícia; Carvalho, João C M F; Isabel, Paulo; Amaral, Maria T; Batista, Maria T; Andrade, Paula B; Silva, Branca M
2008-01-01
The main purpose of this study was the characterisation of 'Serra da Lousã' heather honey using a novel statistical methodology, relevant principal component analysis, in order to assess the correlations between production year, locality and composition. Herein, we also report its chemical composition in terms of sugars, glycerol and ethanol, and physicochemical parameters. The sugar profiles of 'Serra da Lousã' heather and 'Terra Quente de Trás-os-Montes' lavender honeys were compared and allowed their discrimination: 'Serra da Lousã' honeys do not contain sucrose, and generally exhibit lower contents of turanose, trehalose and maltose and higher contents of fructose and glucose. Different localities from 'Serra da Lousã' provided groups of samples with high and low glycerol contents. Glycerol and ethanol contents were revealed to be independent of the sugar profiles. These data and statistical models can be very useful in the comparison and detection of adulterations during the quality control analysis of 'Serra da Lousã' honey.
Feijão, Tália; Afonso, Olga; Maia, André F; Sunkel, Claudio E
2013-10-01
Kinetochores bind spindle microtubules and also act as signaling centers that monitor this interaction. Defects in kinetochore assembly lead to chromosome missegregation and aneuploidy. The interaction between microtubules and chromosomes involves a conserved super-complex of proteins, known as the KNL1Mis12Ndc80 (KMN) network, composed of the KNL1 (Spc105), Mis12, and Ndc80 complexes. Previous studies indicate that all components of the network are required for kinetochore-microtubule attachment and play relevant roles in chromosome congression, biorientation, and segregation. Here, we report a comparative study addressing the role of the different KMN components, using dsRNA and in vivo fluorescence microscopy in Drosophila S2 cells, which allows us to suggest that different KMN network components might perform different roles in chromosome segregation and mitotic checkpoint signaling. Depletion of the different components results in mostly lateral kinetochore-microtubule attachments that are relatively stable upon depletion of Mis12 or Ndc80 but very unstable after Spc105 depletion. In vivo analysis upon depletion of Mis12, Ndc80, and to some extent Spc105, shows that lateral kinetochore-microtubule interactions are still functional, allowing poleward kinetochore movement. We also find that the different KMN network components differentially affect the localization of spindle assembly checkpoint (SAC) proteins at kinetochores. Depletion of Ndc80 and Spc105 abolishes the mitotic checkpoint, whereas depletion of Mis12 causes a delay in mitotic progression. Taken together, our results suggest that the Mis12 and Ndc80 complexes help to properly orient microtubule attachment, whereas Spc105 plays a predominant role in kinetochore-microtubule attachment as well as in the poleward movement of chromosomes, the SAC response, and cell viability. Copyright © 2013 Wiley Periodicals, Inc.
Ho, Hsing-Hao; Li, Ya-Hui; Lee, Jih-Chin; Wang, Chih-Wei; Yu, Yi-Lin; Hueng, Dueng-Yuan; Ma, Hsin-I; Hsu, Hsian-He; Juan, Chun-Jung
2018-01-01
We estimated the volume of vestibular schwannomas with an ice cream cone formula using thin-sliced magnetic resonance images (MRI) and compared the estimation accuracy among different estimating formulas and between different models. The study was approved by a local institutional review board. A total of 100 patients with vestibular schwannomas examined by MRI between January 2011 and November 2015 were enrolled retrospectively. Informed consent was waived. Volumes of vestibular schwannomas were estimated by cuboidal, ellipsoidal, and spherical formulas based on a one-component model, and by cuboidal, ellipsoidal, Linskey's, and ice cream cone formulas based on a two-component model. The estimated volumes were compared to the volumes measured by planimetry. Intraobserver reproducibility and interobserver agreement were tested. Estimation error, including absolute percentage error (APE) and percentage error (PE), was calculated. Statistical analysis included the intraclass correlation coefficient (ICC), linear regression analysis, one-way analysis of variance, and paired t-tests, with P < 0.05 considered statistically significant. Overall tumor size was 4.80 ± 6.8 mL (mean ± standard deviation). All ICCs were no less than 0.992, indicating high intraobserver reproducibility and high interobserver agreement. Cuboidal formulas significantly overestimated the tumor volume by a factor of 1.9 to 2.4 (P ≤ 0.001). The one-component ellipsoidal and spherical formulas overestimated the tumor volume with an APE of 20.3% and 29.2%, respectively. The two-component ice cream cone method, and the ellipsoidal and Linskey's formulas, significantly reduced the APE to 11.0%, 10.1%, and 12.5%, respectively (all P < 0.001). The ice cream cone method and the other two-component formulas, including the ellipsoidal and Linskey's formulas, allow vestibular schwannoma volume to be estimated more accurately than any of the one-component formulas.
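For orientation, a small worked sketch of the volume formulas (Python). The one-component expressions follow the standard conventions (cuboidal V = abc, ellipsoidal V = (pi/6)abc, spherical V = (pi/6)d^3 with d the mean diameter); the cone-plus-half-ellipsoid construction below is only a plausible reading of the "ice cream cone" idea, not necessarily the authors' exact formula:

```python
import math

def one_component_volumes(a, b, c):
    """Volume estimates from three orthogonal tumor diameters a, b, c (cm)."""
    cuboidal = a * b * c                      # overestimates, per the study
    ellipsoidal = math.pi / 6 * a * b * c
    d = (a + b + c) / 3                       # mean diameter
    spherical = math.pi / 6 * d ** 3
    return cuboidal, ellipsoidal, spherical

def ice_cream_cone(d_canal, h_canal, a, b, c):
    """Two-component sketch (assumed construction): the intracanalicular
    part modeled as a cone of diameter d_canal and height h_canal, the
    cisternal part as half an ellipsoid with diameters a, b, c."""
    cone = math.pi / 12 * d_canal ** 2 * h_canal
    half_ellipsoid = math.pi / 12 * a * b * c
    return cone + half_ellipsoid
```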
NASA Astrophysics Data System (ADS)
Schelkanova, Irina; Toronov, Vladislav
2011-07-01
Although near infrared spectroscopy (NIRS) is now widely used both in emerging clinical techniques and in cognitive neuroscience, the development of apparatuses and signal processing methods for these applications is still a hot research topic. The main unresolved problem in functional NIRS is the separation of functional signals from contamination by systemic and local physiological fluctuations. This problem has been approached using various signal processing methods, including blind signal separation techniques. In particular, principal component analysis (PCA) and independent component analysis (ICA) have been applied to data acquired at the same wavelength and at multiple sites on human or animal heads during functional activation. These signal processing procedures resulted in a number of principal or independent components that could be attributed to functional activity, but their physiological meaning remained unknown. On the other hand, the best physiological specificity is provided by broadband NIRS. Also, a comparison with functional magnetic resonance imaging (fMRI) allows the spatial origin of fNIRS signals to be determined. In this study we applied PCA and ICA to broadband NIRS data to distill the components correlating with the breath-hold activation paradigm and compared them with simultaneously acquired fMRI signals. Breath holding was used because it generates blood carbon dioxide (CO2), which increases the blood-oxygen-level-dependent (BOLD) signal, as CO2 acts as a cerebral vasodilator. Vasodilation causes increased cerebral blood flow, which washes deoxyhaemoglobin out of the cerebral capillary bed, thus increasing both the cerebral blood volume and oxygenation. Although the original signals were quite diverse, we found very few distinct components, which corresponded to fMRI signals at different locations in the brain and to different physiological chromophores.
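A minimal sketch of the PCA/ICA step described here (Python with scikit-learn; the array shapes, the component count and the breath-hold regressor are illustrative assumptions): components are extracted from the multichannel time series and ranked by their correlation with the activation paradigm.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def paradigm_components(signals, paradigm, n_components=5):
    """signals  : (n_timepoints, n_channels) broadband NIRS time series.
    paradigm : (n_timepoints,) regressor encoding the breath-hold blocks.
    Returns ICA sources sorted by |correlation| with the paradigm."""
    reduced = PCA(n_components=n_components).fit_transform(signals)
    sources = FastICA(n_components=n_components, random_state=0).fit_transform(reduced)
    corr = np.array([abs(np.corrcoef(s, paradigm)[0, 1]) for s in sources.T])
    order = np.argsort(corr)[::-1]          # paradigm-related components first
    return sources[:, order], corr[order]
```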
Vergara, Victor M; Ulloa, Alvaro; Calhoun, Vince D; Boutte, David; Chen, Jiayu; Liu, Jingyu
2014-09-01
Multi-modal data analysis techniques, such as Parallel Independent Component Analysis (pICA), are essential in neuroscience, medical imaging and genetic studies. The pICA algorithm allows the simultaneous decomposition of up to two data modalities, achieving better performance than separate ICA decompositions and enabling the discovery of links between modalities. However, advances in data acquisition techniques facilitate the collection of more than two data modalities from each subject. Examples of commonly measured modalities include genetic information, structural magnetic resonance imaging (MRI) and functional MRI. In order to take full advantage of the available data, this work extends the pICA approach to incorporate three modalities in one comprehensive analysis. Simulations demonstrate the three-way pICA performance in identifying pairwise links between modalities and estimating independent components which more closely resemble the true sources than components found by pICA or separate ICA analyses. In addition, the three-way pICA algorithm is applied to real experimental data obtained from a study that investigated genetic effects on alcohol dependence. The data modalities considered include functional MRI (contrast images during an alcohol exposure paradigm), gray matter concentration images from structural MRI, and genetic single nucleotide polymorphisms (SNP). The three-way pICA approach identified links between a SNP component (pointing to brain function and mental disorder associated genes, including BDNF, GRIN2B and NRG1), a functional component related to increased activation in the precuneus area, and a gray matter component comprising part of the default mode network and the caudate. Although such findings need further verification, the simulation and in-vivo results validate the three-way pICA algorithm presented here as a useful tool in biomedical data fusion applications. Copyright © 2014 Elsevier Inc. All rights reserved.
Ulloa, Alvaro; Jingyu Liu; Vergara, Victor; Jiayu Chen; Calhoun, Vince; Pattichis, Marios
2014-01-01
In the biomedical field, current technology allows for the collection of multiple data modalities from the same subject. In consequence, there is increasing interest in methods to analyze multi-modal data sets. Methods based on independent component analysis have proven to be effective in jointly analyzing multiple modalities, including brain imaging and genetic data. This paper describes a new algorithm, three-way parallel independent component analysis (3pICA), for jointly identifying genomic loci associated with brain function and structure. The proposed algorithm relies on the use of multi-objective optimization methods to identify correlations among the modalities and maximally independent sources within each modality. We test the robustness of the proposed approach by varying the effect size, cross-modality correlation, noise level, and dimensionality of the data. Simulation results suggest that 3pICA is robust to data with SNR levels from 0 to 10 dB and effect sizes from 0 to 3, while presenting its best performance with high cross-modality correlations and more than one subject per 1,000 variables. In an experimental study with 112 human subjects, the method identified links between a genetic component (pointing to brain function and mental disorder associated genes, including PPP3CC, KCNQ5, and CYP7B1), a functional component related to signal decreases in the default mode network during the task, and a brain structure component indicating increases of gray matter in brain regions of the default mode region. Although such findings need further replication, the simulation and in-vivo results validate the three-way parallel ICA algorithm presented here as a useful tool in biomedical data decomposition applications.
Regional prioritisation of flood risk in mountainous areas
NASA Astrophysics Data System (ADS)
Rogelis, M. C.; Werner, M.; Obregón, N.; Wright, G.
2015-07-01
A regional analysis of flood risk was carried out in the mountainous area surrounding the city of Bogotá (Colombia). Vulnerability at the regional level was assessed on the basis of a principal component analysis carried out with variables recognised in the literature to contribute to vulnerability, using watersheds as the unit of analysis. The area exposed was obtained from a simplified flood analysis at the regional level, providing a mask from which the vulnerability variables were extracted. The vulnerability indicator obtained from the principal component analysis was combined with an existing susceptibility indicator, thus providing an index that allows the watersheds to be prioritised in support of flood risk management at the regional level. Results show that the components of vulnerability can be expressed in terms of constituent indicators: socio-economic fragility, composed of demography and lack of well-being; lack of resilience, composed of education, preparedness and response capacity, rescue capacity, and social cohesion and participation; and physical exposure, composed of exposed infrastructure and exposed population. A sensitivity analysis shows that the classification of vulnerability is robust for watersheds with low and high values of the vulnerability indicator, while some watersheds with intermediate values of the indicator are sensitive to shifting between medium and high vulnerability. The complex interaction between vulnerability and hazard is evidenced in the case study. Environmental degradation in vulnerable watersheds shows the influence that vulnerability exerts on hazard, and vice versa, thus establishing a cycle that builds up risk conditions.
Tailored multivariate analysis for modulated enhanced diffraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caliandro, Rocco; Guccione, Pietro; Nico, Giovanni
2015-10-21
Modulated enhanced diffraction (MED) is a technique allowing the dynamic structural characterization of crystalline materials subjected to an external stimulus, which is particularly suited for in situ and operando structural investigations at synchrotron sources. Contributions from the (active) part of the crystal system that varies synchronously with the stimulus can be extracted by an offline analysis, which can only be applied in the case of periodic stimuli and linear system responses. In this paper a new decomposition approach based on multivariate analysis is proposed. Standard principal component analysis (PCA) is adapted to treat MED data: specific figures of merit based on the scores and loadings are found, and the directions of the principal components obtained by PCA are modified to maximize such figures of merit. As a result, a general method to decompose MED data, called optimum constrained components rotation (OCCR), is developed, which produces very precise results on simulated data, even in the case of nonperiodic stimuli and/or nonlinear responses. The multivariate analysis approach is able to supply in one shot both the diffraction pattern related to the active atoms (through the OCCR loadings) and the time dependence of the system response (through the OCCR scores). When applied to real data, OCCR was able to supply only the latter information, as the former was hindered by changes in the abundances of different crystal phases, which occurred alongside the structural variations in the specific case considered. To develop a decomposition procedure able to cope with this combined effect represents the next challenge in MED analysis.
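A toy sketch of the general idea behind OCCR (Python; the figure of merit used here, correlation of the leading component's scores with the known stimulus, is only a stand-in for the paper's actual score- and loading-based criteria): starting from ordinary PCA, a rotation in the plane of the first two components is chosen to maximize the merit function.

```python
import numpy as np

def occr_like_rotation(patterns, stimulus, angles=np.linspace(0, np.pi, 181)):
    """patterns : (n_timepoints, n_2theta) series of diffraction patterns.
    stimulus : (n_timepoints,) external modulation applied to the sample.
    Rotates the first two PCA components so that the leading component's
    scores correlate maximally with the stimulus (toy figure of merit)."""
    centered = patterns - patterns.mean(axis=0)
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = U[:, :2] * s[:2]                       # scores of the first two PCs
    merit = lambda a: abs(np.corrcoef(
        np.cos(a) * scores[:, 0] + np.sin(a) * scores[:, 1], stimulus)[0, 1])
    best = max(angles, key=merit)
    rot = np.array([[np.cos(best), -np.sin(best)],
                    [np.sin(best),  np.cos(best)]])
    # Rotated scores give the time response; rotated loadings the pattern.
    return scores @ rot, rot.T @ Vt[:2]
```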
NASA Technical Reports Server (NTRS)
Steele, John; Rector, tony; Gazda, Daniel; Lewis, John
2009-01-01
An EMU water processing kit (Airlock Coolant Loop Recovery, A/L CLR) was developed as a corrective action to the Extravehicular Mobility Unit (EMU) coolant flow disruptions experienced on the International Space Station (ISS) in May of 2004 and thereafter. Conservative schedules for A/L CLR use and component life were initially developed and implemented based on prior analysis results and analytical modeling. The examination of postflight samples and EMU hardware in November of 2006 indicated that the A/L CLR kits were functioning well and had excess capacity that would allow a relaxation of the initially conservative schedules of use and component life. A relaxed use schedule and list of component lives were implemented thereafter. Since the adoption of the relaxed A/L CLR schedules of use and component lives, several A/L CLR kit components, transport loop water samples, and sensitive EMU transport loop components have been examined to gauge the impact of the relaxed requirements. The intent of this paper is to summarize the findings of that evaluation and to outline updated schedules for A/L CLR use and component life.
Mager, P P; Rothe, H
1990-10-01
Multicollinearity of physicochemical descriptors leads to serious consequences in quantitative structure-activity relationship (QSAR) analysis, such as incorrect estimators and test statistics for the regression coefficients of the ordinary least-squares (OLS) model usually applied to QSARs. Besides the diagnosis of simple collinearity, principal component regression analysis (PCRA) also allows the diagnosis of various types of multicollinearity. Only if the absolute values of the PCRA estimators are order statistics that decrease monotonically can the effects of multicollinearity be circumvented. Otherwise, obscure phenomena may be observed, such as good data recognition but low predictive power of a QSAR model.
Samsir, Sri A'jilah; Bunawan, Hamidun; Yen, Choong Chee; Noor, Normah Mohd
2016-09-01
In this dataset, we distinguish 15 accessions of Garcinia mangostana from Peninsular Malaysia using Fourier transform-infrared spectroscopy coupled with chemometric analysis. We found that the position and intensity of characteristic peaks at 3600-3100 cm(-1) in the IR spectra allowed discrimination of G. mangostana from different locations. Further principal component analysis (PCA) of all the accessions suggests that two main clusters were formed: samples from Johor, Melaka, and Negeri Sembilan (South) clustered together in one group, while samples from Perak, Kedah, Penang, Selangor, Kelantan, and Terengganu (North and East Coast) formed another cluster.
Kabytaev, Kuanysh; Durairaj, Anita; Shin, Dmitriy; Rohlfing, Curt L; Connolly, Shawn; Little, Randie R; Stoyanov, Alexander V
2016-02-01
An on-line liquid chromatography-mass spectrometry platform that includes the orthogonal techniques of ion exchange and reversed-phase chromatography is applied to C-peptide analysis. Additional improvement is achieved by the subsequent application of cation- and anion-exchange purification steps that allow the isolation of components whose isoelectric points lie in a narrow pH range before the final reversed-phase mass spectrometry analysis. The utility of this approach for isolating fractions in a desired "pI window" when profiling complex mixtures is discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ferrero, Alejandro; Rabal, Ana; Campos, Joaquín; Martínez-Verdú, Francisco; Chorro, Elísabet; Perales, Esther; Pons, Alicia; Hernanz, María Luisa
2013-02-01
A reduced set of measurement geometries allows the spectral reflectance of special effect coatings to be predicted for any other geometry. A physical model based on flake-related parameters has been used to determine nonredundant measurement geometries for the complete description of the spectral bidirectional reflectance distribution function (BRDF). The analysis of experimental spectral BRDF was carried out by means of principal component analysis. From this analysis, a set of nine measurement geometries was proposed to characterize special effect coatings. It was shown that, for two different special effect coatings, these geometries provide a good prediction of their complete color shift.
ERIC Educational Resources Information Center
Au, Wayne; Ferrare, Joseph J.
2014-01-01
Background/Context: Charter school policy has evolved into a major component of the current education reform movement in the United States. As of 2012, all but nine U.S. states allowed charter schools, and in one of those nine, Washington State, charter school legislation was passed by popular vote in November 2012. There is a substantial, if…
Multifamily Building Operator Job/Task Analysis and Report: September 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, C. M.
The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Building Operator JTA identifies and catalogs all of the tasks performed by multifamily building operators, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.
Multifamily Energy Auditor Job/Task Analysis and Report: September 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, C. M.
The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Energy Auditor JTA identifies and catalogs all of the tasks performed by multifamily energy auditors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.
NASA Astrophysics Data System (ADS)
Jin, Zhibin; Pei, Shiling; Li, Xiaozhen; Liu, Hongyan; Qiang, Shizhong
2016-11-01
The running safety of railway vehicles on bridges can be negatively affected by earthquake events. This phenomenon has traditionally been investigated with only the lateral ground excitation component considered. This paper presents results from a numerical investigation of the contribution of the vertical ground motion component to the derailment of vehicles on simply-supported bridges. A full nonlinear wheel-rail contact model was used in the investigation, together with the Hertzian contact theory and nonlinear creepage theory, which allows the wheel to jump vertically and separate from the rail. The wheel-rail relative displacement was used as the criterion for derailment events. A total of 18 ground motion records were used in the analysis to account for the uncertainty of ground motions. The results showed that inclusion of vertical ground motion will likely increase the chance of derailment. It is recommended to include the vertical ground motion component in earthquake-induced derailment analysis to ensure conservative estimates. The derailment event on bridges was found to be more closely related to the deck acceleration than to the ground acceleration.
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Aboudi, Jacob
1998-01-01
The objective of this three-year project was to develop and deliver to NASA Lewis one-dimensional and two-dimensional higher-order theories, and related computer codes, for the analysis, optimization and design of cylindrical functionally graded materials/structural components for use in advanced aircraft engines (e.g., combustor linings, rotor disks, heat shields, blisk blades). To satisfy this objective, a quasi one-dimensional version of the higher-order theory, HOTCFGM-1D, and four computer codes based on this theory, for the analysis, design and optimization of cylindrical structural components functionally graded in the radial direction were developed. The theory is applicable to thin multi-phased composite shell/cylinders subjected to macroscopically axisymmetric thermomechanical and inertial loading applied uniformly along the axial direction such that the overall deformation is characterized by a constant average axial strain. The reinforcement phases are uniformly distributed in the axial and circumferential directions, and arbitrarily distributed in the radial direction, thereby allowing functional grading of the internal reinforcement in this direction.
Mudge, Elizabeth; Lopes-Lutz, Daise; Brown, Paula; Schieber, Andreas
2011-08-10
Alkylamides are a class of compounds present in plants of the genus Echinacea (Asteraceae), which have been shown to have high bioavailability and immunomodulatory effects. Fast analysis to identify these components in a variety of products is essential for profiling products used in clinical trials and for their quality control. A method based on ultrafast liquid chromatography (UFLC) coupled with diode array detection and electrospray ionization mass spectrometry was developed for the analysis of alkylamides from the roots of Echinacea angustifolia (DC.) Hell., Echinacea purpurea (L.) Moench, and commercial dietary supplements. A total of 24 alkylamides were identified by LC-MS. The analysis time for these components is 15 min. Compared to the alkylamide profiles determined in the Echinacea root materials, the commercial products showed a more complex profile due to the blending of root and aerial parts of E. purpurea. This versatile method allows for the identification of alkylamides in a variety of Echinacea products and presents the most extensive characterization of alkylamides in E. angustifolia roots so far.
New Representation of Bearings in LS-DYNA
NASA Technical Reports Server (NTRS)
Carney, Kelly S.; Howard, Samuel A.; Miller, Brad A.; Benson, David J.
2014-01-01
Non-linear, dynamic, finite element analysis is used in various engineering disciplines to evaluate high-speed, dynamic impact and vibration events. Some of these applications require connecting rotating to stationary components. For example, bird impacts on rotating aircraft engine fan blades are a common analysis performed using this type of analysis tool. Traditionally, rotating machines utilize some type of bearing to allow rotation in one degree of freedom while offering constraints in the other degrees of freedom. Most times, bearings are modeled simply as linear springs with rotation. This is a simplification that is not necessarily accurate under the conditions of high-velocity, high-energy, dynamic events such as impact problems. For this reason, it is desirable to utilize a more realistic non-linear force-deflection characteristic of real bearings to model the interaction between rotating and non-rotating components during dynamic events. The present work describes a rolling element bearing model developed for use in non-linear, dynamic finite element analysis. This rolling element bearing model has been implemented in LS-DYNA as a new element, *ELEMENT_BEARING.
The banana code—natural blend processing in the olfactory circuitry of Drosophila melanogaster
Schubert, Marco; Hansson, Bill S.; Sachse, Silke
2014-01-01
Odor information is predominantly perceived as complex odor blends. For Drosophila melanogaster one of the most attractive blends is emitted by an over-ripe banana. To analyze how the fly's olfactory system processes natural blends we combined the experimental advantages of gas chromatography and functional imaging (GC-I). In this way, natural banana compounds were presented successively to the fly antenna at close to naturally occurring concentrations. This technique allowed us to identify the active odor components, use these compounds as stimuli, and measure odor-induced Ca2+ signals in input and output neurons of the Drosophila antennal lobe (AL), the first olfactory neuropil. We demonstrate that mixture interactions of a natural blend are very rare and occur only at the AL output level, resulting in a surprisingly linear blend representation. However, the information regarding single components is strongly modulated by the olfactory circuitry within the AL, leading to a higher similarity between the representation of individual components and the banana blend. This observed modulation might tune the olfactory system to distinctively categorize odor components and improve the detection of suitable food sources. Functional GC-I thus enables the analysis of virtually any unknown natural odorant blend and its components at their relative occurring concentrations and allows the characterization of neuronal responses of complete neural assemblies. This technique can be seen as a valuable complementary method to classical GC/electrophysiology techniques, and will be a highly useful tool in future investigations of insect-insect and insect-plant chemical interactions. PMID:24600405
GATE: software for the analysis and visualization of high-dimensional time series expression data.
MacArthur, Ben D; Lachmann, Alexander; Lemischka, Ihor R; Ma'ayan, Avi
2010-01-01
We present Grid Analysis of Time series Expression (GATE), an integrated computational software platform for the analysis and visualization of high-dimensional biomolecular time series. GATE uses a correlation-based clustering algorithm to arrange molecular time series on a two-dimensional hexagonal array and dynamically colors individual hexagons according to the expression level of the molecular component to which they are assigned, to create animated movies of systems-level molecular regulatory dynamics. In order to infer potential regulatory control mechanisms from patterns of correlation, GATE also allows interactive interrogation of movies against a wide variety of prior knowledge datasets. GATE movies can be paused and are interactive, allowing users to reconstruct networks and perform functional enrichment analyses. Movies created with GATE can be saved in Flash format and can be inserted directly into PDF manuscript files as interactive figures. GATE is available for download and is free for academic use from http://amp.pharm.mssm.edu/maayan-lab/gate.htm
Mathematical supply-chain modelling: Product analysis of cost and time
NASA Astrophysics Data System (ADS)
Easters, D. J.
2014-03-01
Establishing a mathematical supply-chain model is a proposition that has received attention due to its inherent benefits of improving global supply-chain efficiencies. This paper discusses the prevailing relationships found within apparel supply-chain environments and contemplates the complex issues involved in constituting a mathematical model. Principal results identified within the data suggest that the multifarious nature of global supply-chain activities requires a degree of simplification in order to fully expose the factors that affect each sub-section of the chain. Subsequently, the research findings allowed the division of supply-chain components into sub-sections, which together amounted to a coherent method of product development activity. Concurrently, the supply-chain model was found to allow systematic mathematical analysis of cost and time within the multiple contexts of each sub-section encountered. The paper indicates the supply-chain model structure and the mathematics, and considers how product analysis of cost and time can improve the comprehension of product lifecycle management.
NASA Technical Reports Server (NTRS)
Lodwick, G. D. (Principal Investigator)
1976-01-01
A digital computer and multivariate statistical techniques were used to analyze 4-band multispectral data. A representation of the original data for each of the four bands allows a certain degree of terrain interpretation; however, variations in the appearance of sites within and between bands, without additional criteria for deciding which representation should be preferred, create difficulties for classification. Investigation of the video data groups produced by principal components analysis and cluster analysis techniques shows that effective correlations with classifications of terrain produced by conventional methods could be established. The analyses also highlighted underlying relationships between the various elements. The approach used allows large areas (185 km by 185 km) to be classified into fundamental units within a matter of hours and can be applied to those parts of the Earth where facilities for conventional studies are poor or lacking.
Iurov, Iu B; Khazatskiĭ, I A; Akindinov, V A; Dovgilov, L V; Kobrinskiĭ, B A; Vorsanova, S G
2000-08-01
The original software FISHMet has been developed and tested for improving the efficiency of diagnosis of hereditary diseases caused by chromosome aberrations and for chromosome mapping by the fluorescence in situ hybridization (FISH) method. The program allows the creation and analysis of pseudocolor chromosome images and hybridization signals under Windows 95; it supports computer analysis and editing of the results of pseudocolor in situ hybridization, including the successive superposition of the initial black-and-white images acquired through fluorescent filters (blue, green, and red), and the editing of each image individually or of the composite pseudocolor image in BMP, TIFF, and JPEG formats. The components of the image analysis system (LOMO, Leitz Ortoplan, and Axioplan fluorescence microscopes; COHU 4910 and Sanyo VCB-3512P CCD cameras; Miro-Video, Scion LG-3, and VG-5 image capture boards; and Pentium 100 and Pentium 200 computers) and specialized software for image capture and visualization (Scion Image PC and Video-Cup) were used with good results in the study.
Pintus, M A; Gaspa, G; Nicolazzi, E L; Vicario, D; Rossoni, A; Ajmone-Marsan, P; Nardone, A; Dimauro, C; Macciotta, N P P
2012-06-01
The large number of markers available relative to the number of phenotypes represents one of the main issues in genomic selection. In this work, principal component analysis was used to reduce the number of predictors for calculating genomic breeding values (GEBV). Bulls of 2 cattle breeds farmed in Italy (634 Brown and 469 Simmental) were genotyped with the 54K Illumina beadchip (Illumina Inc., San Diego, CA). After data editing, 37,254 and 40,179 single nucleotide polymorphisms (SNP) were retained for Brown and Simmental, respectively. Principal component analysis carried out on the SNP genotype matrix extracted 2,257 and 3,596 new variables in the 2 breeds, respectively. Bulls were sorted by birth year to create reference and prediction populations. The effect of principal components on deregressed proofs in reference animals was estimated with a BLUP model. Results were compared with those obtained by using SNP genotypes as predictors with either the BLUP or Bayes_A method. Traits considered were milk, fat, and protein yields, fat and protein percentages, and somatic cell score. The GEBV were obtained for the prediction population by blending direct genomic prediction and pedigree indexes. No substantial differences were observed in squared correlations between GEBV and EBV in prediction animals among the 3 methods in the 2 breeds. The principal component analysis method allowed for a reduction of about 90% in the number of independent variables when predicting direct genomic values, with a substantial decrease in calculation time and without loss of accuracy. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
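A minimal sketch of the dimension-reduction strategy described in this abstract (Python; the ridge penalty stands in for the BLUP shrinkage, and the variance cutoff is illustrative): genotypes are compressed by PCA, effects are estimated on the PC scores, and SNP-level effects are recovered by back-transformation.

```python
import numpy as np

def pc_reduced_snp_effects(genotypes, proofs, var_explained=0.99, ridge=1.0):
    """genotypes : (n_bulls, n_snps) 0/1/2-coded genotype matrix.
    proofs    : (n_bulls,) deregressed proofs of the reference bulls.
    Returns SNP effects from ridge regression on the PCA scores."""
    X = genotypes - genotypes.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(var, var_explained)) + 1
    T = U[:, :k] * s[:k]                      # far fewer predictors than SNPs
    beta = np.linalg.solve(T.T @ T + ridge * np.eye(k), T.T @ proofs)
    return Vt[:k].T @ beta                    # back-transform to SNP effects
```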
NASA Astrophysics Data System (ADS)
Litvinenko, S. V.; Bielobrov, D. O.; Lysenko, V.; Skryshevsky, V. A.
2016-08-01
An electronic tongue based on an array of low-selectivity photovoltaic (PV) sensors and principal component analysis is proposed for the detection of various alcohol solutions. The sensor array is created by forming a p-n junction on a silicon wafer with a porous silicon layer on the opposite side. A dynamic set of sensors is formed thanks to the inhomogeneous distribution of the surface recombination rate across the porous silicon side. A photocurrent sensitive to molecular adsorption is induced by scanning this side with a laser beam. Water, ethanol, iso-propanol, and their mixtures were selected for testing. It is shown that exploiting the random dispersion of surface recombination rates over different spots of the rear side of the p-n junction, together with principal component analysis of the PV signals, allows the identification of the above liquid substances and their mixtures.
Migration of cemented femoral components after THR. Roentgen stereophotogrammetric analysis.
Kiss, J; Murray, D W; Turner-Smith, A R; Bithell, J; Bulstrode, C J
1996-09-01
We studied the migration of 58 cemented Hinek femoral components for total hip replacement, using roentgen stereophotogrammetric analysis over four years. The implants migrated faster during the first year than subsequently, and the pattern of migration in the second period was very different. During the first year they subsided, tilted into varus and internally rotated. After this there was slow distal migration with no change in orientation. None of the prostheses has yet failed. The early migration is probably caused by resorption of bone damaged by surgical trauma or the heat generated by the polymerisation of bone cement. Later migration may be due to creep in the bone cement or the surrounding fibrous membrane. The prosthesis which we studied allows the preservation of some of the femoral neck, and comparison with published migration studies of the Charnley stem suggests that this decreases rotation and may help to prevent loosening.
Aerothermal modeling. Executive summary
NASA Technical Reports Server (NTRS)
Kenworthy, M. K.; Correa, S. M.; Burrus, D. L.
1983-01-01
One of the significant ways in which the performance level of aircraft turbine engines has been improved is by the use of advanced materials and cooling concepts that allow a significant increase in turbine inlet temperature level, with attendant thermodynamic cycle benefits. Further cycle improvements have been achieved with higher pressure ratio compressors. The higher turbine inlet temperatures and compressor pressure ratios, with correspondingly higher temperature cooling air, have created a very hostile environment for the hot section components. To provide the technology needed to reduce hot section maintenance costs, NASA has initiated the Hot Section Technology (HOST) program. One key element of this overall program is the Aerothermal Modeling Program. The overall objective of this program is to evolve and validate improved analysis methods for use in the design of aircraft turbine engine combustors. The use of such combustor analysis capabilities can be expected to provide significant improvement in the life and durability characteristics of both combustor and turbine components.
Thermogravimetric analysis of the gasification of microalgae Chlorella vulgaris.
Figueira, Camila Emilia; Moreira, Paulo Firmino; Giudici, Reinaldo
2015-12-01
The gasification of microalgae Chlorella vulgaris under an atmosphere of argon and water vapor was investigated by thermogravimetric analysis. The data were interpreted by using conventional isoconversional methods and also by the independent parallel reaction (IPR) model, in which the degradation is considered to happen individually to each pseudo-component of biomass (lipid, carbohydrate and protein). The IPR model allows obtaining the kinetic parameters of the degradation reaction of each component. Three main stages were observed during the gasification process and the differential thermogravimetric curve was satisfactorily fitted by the IPR model considering three pseudocomponents. The comparison of the activation energy values obtained by the methods and those found in the literature for other microalgae was satisfactory. Quantification of reaction products was performed using online gas chromatography. The major products detected were H2, CO and CH4, indicating the potential for producing fuel gas and syngas from microalgae. Copyright © 2015 Elsevier Ltd. All rights reserved.
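A sketch of the independent parallel reaction (IPR) model in code (Python with SciPy; the first-order Arrhenius form for each pseudo-component and the constant heating rate are standard assumptions, and the parameter layout is mine): the overall DTG curve is modeled as the weighted sum of three independent decompositions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

R = 8.314  # gas constant, J/(mol K)

def ipr_dtg(temps, params, beta=10 / 60):
    """DTG curve predicted by the IPR model: three independent first-order
    pseudo-components (lipid, carbohydrate, protein) under a constant
    heating rate beta (K/s). params = [c, A, E] for each component."""
    params = np.asarray(params)
    c, A, E = params[0::3], params[1::3], params[2::3]

    def rhs(T, alpha):                  # dalpha_i/dT for each component
        return A / beta * np.exp(-E / (R * T)) * (1 - alpha)

    sol = solve_ivp(rhs, (temps[0], temps[-1]), np.zeros(3), t_eval=temps)
    rates = A[:, None] / beta * np.exp(-E[:, None] / (R * sol.t)) * (1 - sol.y)
    return c @ rates                    # weighted sum over the components

def fit_ipr(temps, dtg_measured, p0):
    """Least-squares fit of the IPR parameters to a measured DTG curve."""
    return least_squares(lambda p: ipr_dtg(temps, p) - dtg_measured,
                         p0, bounds=(0, np.inf)).x
```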
Asymptotics of empirical eigenstructure for high dimensional spiked covariance.
Wang, Weichen; Fan, Jianqing
2017-06-01
We derive the asymptotic distributions of the spiked eigenvalues and eigenvectors under a generalized and unified asymptotic regime, which takes into account the magnitude of spiked eigenvalues, sample size, and dimensionality. This regime allows high dimensionality and diverging eigenvalues and provides new insights into the roles that the leading eigenvalues, sample size, and dimensionality play in principal component analysis. Our results are a natural extension of those in Paul (2007) to a more general setting and solve the rates of convergence problems in Shen et al. (2013). They also reveal the biases of estimating leading eigenvalues and eigenvectors by using principal component analysis, and lead to a new covariance estimator for the approximate factor model, called shrinkage principal orthogonal complement thresholding (S-POET), that corrects the biases. Our results are successfully applied to outstanding problems in estimation of risks of large portfolios and false discovery proportions for dependent test statistics and are illustrated by simulation studies.
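The eigenvalue bias that the paper corrects is easy to reproduce in simulation (Python; the spike strengths and dimensions are arbitrary choices): when p is comparable to or larger than n, the leading sample eigenvalues overshoot the population spikes, for strong spikes roughly by sigma^2 * p/n.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 1000                                     # sample size << dimensionality
spikes = np.array([50.0, 20.0])                      # population spiked eigenvalues
cov_diag = np.concatenate([spikes, np.ones(p - 2)])  # unit-variance bulk

X = rng.standard_normal((n, p)) * np.sqrt(cov_diag)  # rows ~ N(0, diag(cov_diag))
sample_eig = np.sort(np.linalg.eigvalsh(X.T @ X / n))[::-1]

print("population spikes:", spikes)
print("sample estimates :", sample_eig[:2])
print("p/n =", p / n)   # approximate upward bias for strong spikes
```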
Ryder, Alan G
2002-03-01
Eighty-five solid samples consisting of illegal narcotics diluted with several different materials were analyzed by near-infrared (785 nm excitation) Raman spectroscopy. Principal Component Analysis (PCA) was employed to classify the samples according to narcotic type. The best sample discrimination was obtained by using the first derivative of the Raman spectra. Furthermore, restricting the spectral variables for PCA to 2 or 3% of the original spectral data according to the most intense peaks in the Raman spectrum of the pure narcotic resulted in a rapid discrimination method for classifying samples according to narcotic type. This method allows for the easy discrimination between cocaine, heroin, and MDMA mixtures even when the Raman spectra are complex or very similar. This approach of restricting the spectral variables also decreases the computational time by a factor of 30 (compared to the complete spectrum), making the methodology attractive for rapid automatic classification and identification of suspect materials.
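A sketch of the variable-restriction step (Python; the 3% retention and the derivative follow the description above, while the peak-picking rule and the handling of the reference spectrum are assumptions):

```python
import numpy as np

def restricted_pca_inputs(spectra, pure_reference, keep_fraction=0.03):
    """Restrict PCA to the channels under the most intense peaks of the
    pure narcotic's Raman spectrum, using first-derivative spectra.

    spectra        : (n_samples, n_channels) Raman spectra of the mixtures.
    pure_reference : (n_channels,) Raman spectrum of the pure narcotic.
    """
    deriv = np.gradient(spectra, axis=1)             # first-derivative spectra
    n_keep = max(1, int(keep_fraction * spectra.shape[1]))
    channels = np.argsort(pure_reference)[-n_keep:]  # most intense channels
    return deriv[:, channels]                        # ~30x fewer variables for PCA
```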
Repressing the effects of variable speed harmonic orders in operational modal analysis
NASA Astrophysics Data System (ADS)
Randall, R. B.; Coats, M. D.; Smith, W. A.
2016-10-01
Discrete frequency components such as machine shaft orders can disrupt the operation of normal Operational Modal Analysis (OMA) algorithms. With constant speed machines, they have been removed using time synchronous averaging (TSA). This paper compares two approaches for varying-speed machines. In one method, signals are transformed into the order domain and, after the removal of shaft-speed-related components by a cepstral notching method, are transformed back to the time domain to allow normal OMA. In the other, simpler approach, an exponential shortpass lifter is applied directly to the cepstrum in the time domain to enhance the modal information at the expense of other disturbances. For simulated gear signals with speed variations of both ±5% and ±15%, the simpler approach was found to give better results. The TSA method is shown not to work in either case. The paper compares the results with those obtained using a stationary random excitation.
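A minimal sketch of the second, simpler approach (Python; the lifter time constant is an illustrative parameter): an exponential shortpass lifter applied to the real cepstrum attenuates high-quefrency content, including shaft-order rahmonics, while retaining the smooth modal information.

```python
import numpy as np

def exponential_shortpass_lifter(signal, tau, fs):
    """Apply an exponential shortpass lifter in the real cepstrum.

    tau : lifter time constant in seconds; quefrencies well beyond tau
          (e.g. discrete-order rahmonics) are strongly attenuated.
    """
    spectrum = np.fft.fft(signal)
    log_mag = np.log(np.abs(spectrum) + 1e-12)
    cepstrum = np.fft.ifft(log_mag).real
    n = len(signal)
    q = np.minimum(np.arange(n), n - np.arange(n)) / fs  # symmetric quefrency axis
    liftered = cepstrum * np.exp(-q / tau)               # exponential shortpass lifter
    new_log_mag = np.fft.fft(liftered).real
    # Recombine the liftered magnitude with the original phase.
    return np.fft.ifft(np.exp(new_log_mag) * np.exp(1j * np.angle(spectrum))).real
```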
Two-component flux explanation for the high energy neutrino events at IceCube
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Chien-Yi; Dev, P. S. Bhupal; Soni, Amarjit
2015-10-01
Understanding the spectral and flavor composition of the astrophysical neutrino flux responsible for the recently observed ultrahigh-energy events at IceCube is important for both astrophysics and particle physics. Here, we perform a statistical likelihood analysis of the three-year IceCube data and derive the allowed range of the spectral index and flux normalization for various well-motivated physical flavor compositions at the source. While most of the existing analyses so far assume the flavor composition of the neutrinos at an astrophysical source to be (1:2:0), it seems rather unnatural to assume only one type of source once we recognize the possibility of at least two physical sources. Bearing this in mind, we entertain the possibility of a two-component source for the analysis of the IceCube data. It appears that our two-component hypothesis explains some key features of the data better than a single-component scenario; i.e., it addresses the apparent energy gap between 400 TeV and about 1 PeV and easily accommodates the observed track-to-shower ratio. Given the extreme importance of the flavor composition for the correct interpretation of the underlying astrophysical processes as well as for the ramifications for particle physics, this two-component flux should be tested as more data are accumulated.
Savage, Mark E.; Simpson, Walter W.
1999-01-01
An electrical connector accommodates high current, is not labor intensive to assemble and disassemble, and allows a wide range of motion to accommodate mechanical variations and movement of connected components. The connector comprises several parts with joints therebetween, wherein each joint provides electrical connection between and allows relative motion of the joined parts. The combination of parts and joints maintains electrical connection between two electrical components even if the components are misaligned or move after connection.
Ultrasonic Apparatus and Technique to Measure Changes in Intracranial Pressure
NASA Technical Reports Server (NTRS)
Yost, William T. (Inventor); Cantrell, John H. (Inventor)
2002-01-01
Changes in intracranial pressure can be measured dynamically and non-invasively by monitoring one or more cerebrospinal fluid pulsatile components. Pulsatile components such as systolic and diastolic blood pressures are partially transferred to the cerebrospinal fluid by way of blood vessels contained in the surrounding brain tissue and membrane. As intracranial pressure varies, these cerebrospinal fluid pulsatile components also vary; thus, intracranial pressure can be measured dynamically. Furthermore, the use of acoustics allows the measurement to be completely non-invasive. In the preferred embodiment, phase comparison of a reflected acoustic signal to a reference signal using a constant-frequency pulsed phase-locked-loop ultrasonic device allows the pulsatile components to be monitored. Calibrating the device by inducing a known change in intracranial pressure allows conversion to changes in intracranial pressure.
NASA Astrophysics Data System (ADS)
Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.
2016-04-01
This work is devoted to the analysis of seismic signals continuously recorded at Campi Flegrei Caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series, adapted and improved to give automatic procedures for detecting seismic events often buried in the high-level ambient noise. The extracted waveforms, characterized by an improved signal-to-noise ratio, allow the recognition of Long-Period precursors, showing that the seismic activity accompanying the mini-uplift crisis (in 2006), which climaxed in the three days of 26-28 October, had already started at the beginning of October and lasted until mid-November. A more complete seismic catalog is thus provided, which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher-order statistics; secondly, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the individuated signals directly to the sources. We take advantage of Convolutive Independent Component Analysis (CICA), which provides the basic signals along the three directions of motion, so that a direct polarization analysis can be performed with no other filtering procedures. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e. a small area in the Solfatara, also in the case of the small events that both precede and follow the main activity. From a dynamical point of view, they can be described by two degrees of freedom, indicating a low level of complexity associated with the vibrations from a superficial hydrothermal system. Our results allow us to move towards a full description of the complexity of the source, which can be used, by means of the small-intensity precursors, for hazard-model development and forecast-model testing, and they provide an illustrative example of the applicability of the CICA method to regions with low seismicity and high ambient noise.
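The convolutive ICA variant used above is specialized; as a minimal stand-in, the sketch below applies instantaneous FastICA (scikit-learn) to synthetic three-channel mixtures to show the blind-source-separation step in its simplest form. The signals and the mixing matrix are fabricated for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 20, 4000)
s1 = np.sin(2 * np.pi * 0.5 * t) * np.exp(-0.1 * t)   # decaying LP-like tone
s2 = np.sign(np.sin(2 * np.pi * 0.11 * t))            # slow square wave
s3 = rng.normal(size=t.size)                          # ambient noise
S = np.c_[s1, s2, s3]
A = rng.normal(size=(3, 3))                           # unknown mixing matrix
X = S @ A.T                                           # observed mixed channels

ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X)    # estimated independent components
A_est = ica.mixing_             # estimated mixing matrix
```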
NASA Astrophysics Data System (ADS)
Donders, S.; Pluymers, B.; Ragnarsson, P.; Hadjit, R.; Desmet, W.
2010-04-01
In the vehicle design process, design decisions are more and more based on virtual prototypes. Due to competitive and regulatory pressure, vehicle manufacturers are forced to improve product quality, to reduce time-to-market and to launch an increasing number of design variants on the global market. To speed up the design iteration process, substructuring and component mode synthesis (CMS) methods are commonly used, involving the analysis of substructure models and the synthesis of the substructure analysis results. Substructuring and CMS enable efficient decentralized collaboration across departments and make it possible to benefit from the availability of parallel computing environments. However, traditional CMS methods become prohibitively inefficient when substructures are coupled along large interfaces, i.e. with a large number of degrees of freedom (DOFs) at the interface between substructures. The reason is that the analysis of substructures involves the calculation of a number of enrichment vectors, one for each interface DOF. Since large interfaces are common in vehicles (e.g. the continuous line connections between the body and the windshield, roof or floor), this interface bottleneck poses a clear limitation in the vehicle noise, vibration and harshness (NVH) design process. There is therefore a need to describe the interface dynamics more efficiently. This paper presents a wave-based substructuring (WBS) approach, which reduces the interface representation between substructures in an assembly by expressing the interface DOFs in terms of a limited set of basis functions ("waves"). As the number of basis functions can be much lower than the number of interface DOFs, this greatly facilitates the substructure analysis procedure and results in faster design predictions. The waves are calculated once from a full nominal assembly analysis, but these nominal waves can be re-used for the assembly of modified components. The WBS approach thus enables efficient structural modification predictions of the global modes, so that efficient vibro-acoustic design modification, optimization and robust design become possible. The results show that wave-based substructuring offers a clear benefit for vehicle design modifications, by improving both the speed of component reduction processes and the efficiency and accuracy of design iteration predictions, as compared to conventional substructuring approaches.
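A minimal sketch of the interface-reduction idea, under the assumption that a truncated SVD of nominal-assembly interface displacements is an acceptable surrogate for the paper's wave basis; the interface data below are synthetic placeholders.

```python
import numpy as np

# Stand-in "nominal assembly" interface displacements: a few smooth
# underlying waves plus noise, sampled at many interface DOFs.
rng = np.random.default_rng(1)
n_dof, n_samples, n_waves = 600, 40, 6
waves = rng.normal(size=(n_dof, n_waves))
U_iface = waves @ rng.normal(size=(n_waves, n_samples)) \
          + 0.01 * rng.normal(size=(n_dof, n_samples))

# Energy-ranked basis from an SVD; keep vectors covering 99.9% of energy.
W, s, _ = np.linalg.svd(U_iface, full_matrices=False)
k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999)) + 1
basis = W[:, :k]                     # n_dof x k, with k << n_dof

# Any interface vector is represented by k generalized coordinates.
u = U_iface[:, 0]
q = basis.T @ u                      # reduction to wave coordinates
u_rec = basis @ q                    # expansion back to physical DOFs
print(k, np.linalg.norm(u - u_rec) / np.linalg.norm(u))  # k ~ 6, small error
```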
JGrass-NewAge hydrological system: an open-source platform for the replicability of science.
NASA Astrophysics Data System (ADS)
Bancheri, Marialaura; Serafin, Francesco; Formetta, Giuseppe; Rigon, Riccardo; David, Olaf
2017-04-01
JGrass-NewAge is an open-source semi-distributed hydrological modelling system. It is based on the Object Modeling System (OMS version 3), on the JGrasstools and on the GeoTools. OMS3 allows the creation of independent software packages that can be connected at run time into a working modelling solution. These components are available as a library/dependency or as a repository to fork in order to add further features. Different tools are adopted to ease the integration, the interoperability and the use of each package. Most of the components are Gradle-integrated, since Gradle represents the state of the art in build systems, especially for Java projects. Continuous integration is a further layer between the local source code (client side) and the remote repository (server side) and ensures the building and testing of the source code at each commit. Finally, the use of Zenodo makes the code hosted on GitHub unique, citable and traceable, with a defined DOI. Following these standards, each part of the hydrological cycle is implemented in JGrass-NewAge as a component that can be selected, adopted, and connected to obtain a user-"customized" hydrological model. A variety of modelling solutions are possible, allowing a complete hydrological analysis. Moreover, thanks to the JGrasstools and the GeoTools, visualization of the data and of the results in a selected GIS is possible. After the geomorphological analysis of the watershed, the spatial interpolation of the meteorological inputs can be performed using both deterministic (IDW) and geostatistical (kriging) algorithms. For the radiation balance, the shortwave and longwave radiation can be estimated, which are, in turn, inputs for the simulation of evapotranspiration according to the Priestley-Taylor and Penman-Monteith formulas. Three degree-day models are implemented for snow melt and snow water equivalent (SWE). The runoff production can be simulated using two different components, "Adige" and "Embedded Reservoirs". The travel time theory has recently been integrated for a coupled analysis of solute transport. Finally, each component can be connected to different calibration tools such as LUCA and PSO. Further information about the actual implementation can be found at https://github.com/geoframecomponents, while the OMS projects with the examples, data and results are available at https://github.com/GEOframeOMSProjects.
Convection equation modeling: A non-iterative direct matrix solution algorithm for use with SINDA
NASA Technical Reports Server (NTRS)
Schrage, Dean S.
1993-01-01
The determination of the boundary conditions for a component-level analysis, using discrete finite-element and finite-difference modeling techniques, often requires an analysis of complex coupled phenomena that cannot be described algebraically. For example, an analysis of the temperature field of a coldplate surface with an integral fluid loop requires a solution to the parabolic heat equation and also requires the boundary conditions that describe the local fluid temperature. However, the local fluid temperature is described by a convection equation that can only be solved with knowledge of the locally coupled coldplate temperatures. Generally speaking, it is not computationally efficient, and sometimes not even possible, to perform a direct coupled analysis of the component-level and boundary-condition models within a single analysis code. An alternative is to perform a disjoint analysis but transmit the necessary information between models during the simulation to provide an indirect coupling. For this approach to be effective, the component-level model retains full detail while the boundary-condition model is simplified to provide a fast, first-order prediction of the phenomenon in question. Specifically, in the present study the coldplate structure is analyzed with a discrete numerical model (SINDA) while the fluid-loop convection equation is analyzed with a discrete analytical model (direct matrix solution). This indirect coupling allows a satisfactory prediction of the boundary condition without compromising the overall computational efficiency of the component-level analysis. The present study discusses the complete derivation and the direct matrix solution algorithm for the convection equation. Discretization is analyzed and discussed with regard to solution accuracy, stability and computation speed. Case studies considering a pulsed and a harmonic inlet disturbance to the fluid loop are analyzed to support the discussion of numerical dissipation and accuracy. In addition, the issues of melding or integrating the code with standard solvers such as SINDA are discussed to advise the user of the potential problems to be encountered.
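The sketch below shows one plausible direct matrix solution (not the paper's actual algorithm): a steady 1-D fluid-loop convection equation, m_dot*cp*dT/dx = h*P*(T_wall - T), discretized with first-order upwinding and solved with a banded solver, taking coldplate wall temperatures as the coupling input. All physical parameters are made up.

```python
import numpy as np
from scipy.linalg import solve_banded

n, L = 50, 1.0
dx = L / n
mdot_cp, hP = 20.0, 150.0           # fabricated fluid and convection parameters
T_wall = 300.0 + 10.0 * np.sin(np.linspace(0, np.pi, n))  # from the SINDA side
T_in = 290.0                        # loop inlet temperature

a = mdot_cp / dx
rhs = hP * T_wall
rhs[0] += a * T_in                  # inlet boundary condition

# Upwind scheme: (a + hP) T_i - a T_{i-1} = hP T_wall_i; lower-bidiagonal
# system stored in solve_banded's (1, 0) banded format.
ab = np.zeros((2, n))
ab[0, :] = a + hP                   # main diagonal
ab[1, :-1] = -a                     # subdiagonal (last slot unused)
T_fluid = solve_banded((1, 0), ab, rhs)
```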
Rank estimation and the multivariate analysis of in vivo fast-scan cyclic voltammetric data
Keithley, Richard B.; Carelli, Regina M.; Wightman, R. Mark
2010-01-01
Principal component regression has been used in the past to separate current contributions from different neuromodulators measured with in vivo fast-scan cyclic voltammetry. Traditionally, a percent-cumulative-variance approach has been used to determine the rank of the training-set voltammetric matrix during model development; however, this approach suffers from several disadvantages, including the use of arbitrary percentages and the requirement of extreme precision of training sets. Here we propose that Malinowski's F-test, a method based on a statistical analysis of the variance contained within the training set, can be used to improve factor selection for the analysis of in vivo fast-scan cyclic voltammetric data. These two methods of rank estimation were compared at all steps in the calibration protocol, including the number of principal components retained, overall noise levels, model validation as determined using a residual analysis procedure, and predicted concentration information. By analyzing 119 training sets from two different laboratories amassed over several years, we were able to gain insight into the heterogeneity of in vivo fast-scan cyclic voltammetric data and study how differences in factor selection propagate throughout the entire principal component regression analysis procedure. Visualizing cyclic voltammetric representations of the data contained in the retained and discarded principal components showed that using Malinowski's F-test for rank estimation of in vivo training sets allowed noise to be removed more accurately. Malinowski's F-test also improved the robustness of our criterion for judging multivariate model validity, even though signal-to-noise ratios of the data varied. In addition, pH change was the majority noise carrier of in vivo training sets, while dopamine prediction was more sensitive to noise. PMID:20527815
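A compact sketch of Malinowski's F-test on reduced eigenvalues, written from the standard published formulation; the stopping rule here (scan factors from largest to smallest and stop at the first insignificant one) is a simplification, and the routine should be checked against the original papers before serious use.

```python
import numpy as np
from scipy.stats import f as f_dist

def malinowski_rank(D, alpha=0.05):
    """Pseudorank of data matrix D (rows x cols) via an F-test on
    Malinowski's reduced eigenvalues."""
    r, c = D.shape
    s = min(r, c)
    lam = np.linalg.svd(D, compute_uv=False) ** 2      # eigenvalues of D'D
    j = np.arange(1, s + 1)
    rev = lam / ((r - j + 1) * (c - j + 1))            # reduced eigenvalues
    for n in range(1, s):
        pool = rev[n:].sum() / (s - n)                 # mean REV of remaining
        p = f_dist.sf(rev[n - 1] / pool, 1, s - n)
        if p > alpha:                                  # factor n is noise-like
            return n - 1
    return s

rng = np.random.default_rng(2)
signal = rng.normal(size=(80, 3)) @ rng.normal(size=(3, 40))      # true rank 3
print(malinowski_rank(signal + 0.01 * rng.normal(size=(80, 40))))  # expect ~3
```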
Histogram contrast analysis and the visual segregation of IID textures.
Chubb, C; Econopouly, J; Landy, M S
1994-09-01
A new psychophysical methodology is introduced, histogram contrast analysis, that allows one to measure stimulus transformations, f, used by the visual system to draw distinctions between different image regions. The method involves the discrimination of images constructed by selecting texture micropatterns randomly and independently (across locations) on the basis of a given micropattern histogram. Different components of f are measured by use of different component functions to modulate the micropattern histogram until the resulting textures are discriminable. When no discrimination threshold can be obtained for a given modulating component function, a second titration technique may be used to measure the contribution of that component to f. The method includes several strong tests of its own assumptions. An example is given of the method applied to visual textures composed of small, uniform squares with randomly chosen gray levels. In particular, for a fixed mean gray level mu and a fixed gray-level variance sigma 2, histogram contrast analysis is used to establish that the class S of all textures composed of small squares with jointly independent, identically distributed gray levels with mean mu and variance sigma 2 is perceptually elementary in the following sense: there exists a single, real-valued function f S of gray level, such that two textures I and J in S are discriminable only if the average value of f S applied to the gray levels in I is significantly different from the average value of f S applied to the gray levels in J. Finally, histogram contrast analysis is used to obtain a seventh-order polynomial approximation of f S.
An efficient method for removing point sources from full-sky radio interferometric maps
NASA Astrophysics Data System (ADS)
Berger, Philippe; Oppermann, Niels; Pen, Ue-Li; Shaw, J. Richard
2017-12-01
A new generation of wide-field radio interferometers designed for 21-cm surveys is being built as drift scan instruments allowing them to observe large fractions of the sky. With large numbers of antennas and frequency channels, the enormous instantaneous data rates of these telescopes require novel, efficient, data management and analysis techniques. The m-mode formalism exploits the periodicity of such data with the sidereal day, combined with the assumption of statistical isotropy of the sky, to achieve large computational savings and render optimal analysis methods computationally tractable. We present an extension to that work that allows us to adopt a more realistic sky model and treat objects such as bright point sources. We develop a linear procedure for deconvolving maps, using a Wiener filter reconstruction technique, which simultaneously allows filtering of these unwanted components. We construct an algorithm, based on the Sherman-Morrison-Woodbury formula, to efficiently invert the data covariance matrix, as required for any optimal signal-to-noise ratio weighting. The performance of our algorithm is demonstrated using simulations of a cylindrical transit telescope.
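The covariance inversion described above rests on the Sherman-Morrison-Woodbury identity; the sketch below applies it to a diagonal-plus-low-rank covariance and checks the result against a dense inverse. Sizes and matrices are arbitrary stand-ins, not the telescope's actual data products.

```python
import numpy as np

# Woodbury: (N + U C U^T)^{-1}
#   = N^{-1} - N^{-1} U (C^{-1} + U^T N^{-1} U)^{-1} U^T N^{-1},
# costing O(n k^2) instead of O(n^3) when k << n.
rng = np.random.default_rng(3)
n, k = 1000, 5
N_diag = rng.uniform(1.0, 2.0, size=n)        # diagonal noise covariance
U = rng.normal(size=(n, k))                   # low-rank (point-source) term
C = np.eye(k)

Ninv_U = U / N_diag[:, None]                  # N^{-1} U via a diagonal solve
core = np.linalg.inv(np.linalg.inv(C) + U.T @ Ninv_U)   # only k x k inverted

def apply_inverse(x):
    """Apply (N + U C U^T)^{-1} to a data vector x via Woodbury."""
    y = x / N_diag
    return y - Ninv_U @ (core @ (U.T @ y))

x = rng.normal(size=n)
dense = np.linalg.inv(np.diag(N_diag) + U @ C @ U.T) @ x
print(np.allclose(apply_inverse(x), dense))   # True
```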
TASK ALLOCATION IN GEO-DISTRIBUTED CYBER-PHYSICAL SYSTEMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aggarwal, Rachit; Smidts, Carol
This paper studies the task allocation algorithm for a distributed test facility (DTF), which aims to assemble geo-distributed cyber (software) and physical (hardware-in-the-loop) components into a prototype cyber-physical system (CPS). This allows low-cost testing on an early conceptual prototype (ECP) of the ultimate CPS (UCPS) to be developed. The DTF provides an instrumentation interface for carrying out reliability experiments remotely, such as fault propagation analysis and in-situ testing of hardware and software components in a simulated environment. Unfortunately, the geo-distribution introduces an overhead that is not inherent to the UCPS, i.e. a significant time delay in communication that threatens the stability of the ECP and is not an appropriate representation of the behavior of the UCPS. This can be mitigated by implementing a task allocation algorithm to find a suitable configuration and assign the software components to appropriate computational locations dynamically. This allows the ECP to operate more efficiently, with less probability of becoming unstable due to the delays introduced by geo-distribution. The task allocation algorithm proposed in this work uses a Monte Carlo approach along with dynamic programming to identify the optimal network configuration that keeps the time delays to a minimum.
NASA Astrophysics Data System (ADS)
Maciel, M. J.; Costa, C. G.; Silva, M. F.; Gonçalves, S. B.; Peixoto, A. C.; Ribeiro, A. Fernando; Wolffenbuttel, R. F.; Correia, J. H.
2016-08-01
This paper reports on the development of a technology for the wafer-level fabrication of an optical Michelson interferometer, an essential component in a micro opto-electromechanical system (MOEMS) for a miniaturized optical coherence tomography (OCT) system. The MOEMS consists of a titanium dioxide/silicon dioxide dielectric beam splitter and chromium/gold micro-mirrors. These optical components are deposited on 45° tilted surfaces to allow the horizontal/vertical separation of the incident beam in the final micro-integrated system. The fabrication process consists of 45° saw dicing of a glass substrate and the subsequent deposition of dielectric multilayers and metal layers. The 45° saw dicing is fully characterized in this paper, including an analysis of the surface roughness. The optimum process results in surfaces with a roughness of 19.76 nm (rms). The saw dicing process for a high-quality final surface is a compromise between the dicing blade's grit size (#1200) and the cutting speed (0.3 mm s-1). The proposed wafer-level fabrication allows rapid, low-cost processing, high compactness and the possibility of wafer-level alignment/assembly with other optical micro-components for OCT integrated imaging.
NASA Astrophysics Data System (ADS)
Boerwinkel, Dirk Jan; Yarden, Anat; Waarlo, Arend Jan
2017-12-01
To determine what knowledge of genetics is needed for decision-making on genetic-related issues, a consensus-reaching approach was used. An international group of 57 experts, involved in teaching, studying, or developing genetic education and communication or working with genetic applications in medicine, agriculture, or forensics, answered the questions: "What knowledge of genetics is relevant to those individuals not professionally involved in science?" and "Why is this knowledge relevant?" The answers were classified in different knowledge components following the PISA 2015 science framework. During a workshop with the participants, the results were discussed and applied to seven cases in which genetic knowledge is relevant for decision-making. The analysis of these discussions resulted in a revised framework consisting of nine conceptual knowledge components, three sociocultural components, and four epistemic components. The framework can be used in curricular decisions; its open character allows for including new technologies and applications and facilitates comparisons of different cases.
Enhancements to the Engine Data Interpretation System (EDIS)
NASA Technical Reports Server (NTRS)
Hofmann, Martin O.
1993-01-01
The Engine Data Interpretation System (EDIS) expert system project assists the data review personnel at NASA/MSFC in performing post-test data analysis and engine diagnosis of the Space Shuttle Main Engine (SSME). EDIS uses knowledge of the engine, its components, and simple thermodynamic principles instead of, and in addition to, heuristic rules gathered from the engine experts. EDIS reasons in cooperation with human experts, following roughly the pattern of logic exhibited by human experts. EDIS concentrates on steady-state static faults, such as small leaks, and component degradations, such as pump efficiencies. The objective of this contract was to complete the set of engine component models, integrate heuristic rules into EDIS, integrate the Power Balance Model into EDIS, and investigate modification of the qualitative reasoning mechanisms to allow 'fuzzy' value classification. The result of this contract is an operational version of EDIS. EDIS will become a module of the Post-Test Diagnostic System (PTDS) and will, in this context, provide system-level diagnostic capabilities which integrate component-specific findings provided by other modules.
Non-rigid image registration using a statistical spline deformation model.
Loeckx, Dirk; Maes, Frederik; Vandermeulen, Dirk; Suetens, Paul
2003-07-01
We propose a statistical spline deformation model (SSDM) as a method to solve non-rigid image registration. Within this model, the deformation is expressed using a statistically trained B-spline deformation mesh. The model is trained by principal component analysis of a training set. This approach reduces the number of degrees of freedom needed for non-rigid registration by retaining only the most significant modes of variation observed in the training set. User-defined transformation components, such as affine modes, are merged with the principal components into a unified framework. Optimization proceeds along the transformation components rather than along the individual spline coefficients. The concept of SSDMs is applied to the temporal registration of thorax CR images using pattern intensity as the registration measure. Our results show that, using 30 training pairs, a reduction of 33% in the number of degrees of freedom is possible without deterioration of the result. The same accuracy as without SSDMs is still achieved after a reduction of up to 66% of the degrees of freedom.
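A minimal sketch of the SSDM construction, assuming the training set is already expressed as vectors of B-spline coefficients: PCA (via SVD) retains the dominant modes, and a deformation is then generated from a few mode weights instead of all coefficients. Training data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n_train, n_coeff = 30, 300                 # 30 training deformations
train = rng.normal(size=(n_train, n_coeff))

mean = train.mean(axis=0)
_, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
var = s**2 / np.sum(s**2)
n_modes = int(np.searchsorted(np.cumsum(var), 0.95)) + 1  # keep 95% variance
modes = Vt[:n_modes]                       # principal deformation modes

def deformation(weights):
    """Spline coefficients generated from a few statistical mode weights."""
    return mean + weights @ modes

print(n_modes, deformation(np.zeros(n_modes)).shape)
```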
Systems and methods for interactive virtual reality process control and simulation
Daniel, Jr., William E.; Whitney, Michael A.
2001-01-01
A system for visualizing, controlling and managing information includes a data analysis unit for interpreting and classifying raw data using analytical techniques. A data flow coordination unit routes data from its source to other components within the system. A data preparation unit handles the graphical preparation of the data, and a data rendering unit presents the data in a three-dimensional interactive environment where the user can observe, interact with, and interpret the data. A user can view the information on various levels, from a high-level overall process view, to a view illustrating linkages between variables, to the hard data itself, or to the results of an analysis of the data. The system allows a user to monitor a physical process in real time and further allows the user to manage and control the information in a manner not previously possible.
Self-organizing maps: a versatile tool for the automatic analysis of untargeted imaging datasets.
Franceschi, Pietro; Wehrens, Ron
2014-04-01
MS-based imaging approaches allow for location-specific identification of chemical components in biological samples, opening up possibilities of much more detailed understanding of biological processes and mechanisms. Data analysis, however, is challenging, mainly because of the sheer size of such datasets. This article presents a novel approach based on self-organizing maps, extending previous work in order to be able to handle the large number of variables present in high-resolution mass spectra. The key idea is to generate prototype images, representing spatial distributions of ions, rather than prototypical mass spectra. This allows for a two-stage approach, first generating typical spatial distributions and associated m/z bins, and later analyzing the interesting bins in more detail using accurate masses. The possibilities and advantages of the new approach are illustrated on an in-house dataset of apple slices. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
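A toy self-organizing map in the spirit of the approach above: each m/z bin is treated as a data point whose feature vector is its flattened spatial ion image, so the learned prototypes are prototype images rather than prototype spectra. Map size, training schedules, and data are illustrative choices, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(5)
n_bins, n_pixels = 500, 32 * 32
X = rng.random((n_bins, n_pixels))         # stand-in ion image per m/z bin

grid = 6                                   # 6 x 6 map of units
protos = rng.random((grid * grid, n_pixels))
coords = np.array([(i, j) for i in range(grid) for j in range(grid)], float)

for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)            # decaying learning rate
    sigma = max(grid / 2 * (1 - epoch / 20), 0.5)
    for x in X[rng.permutation(n_bins)]:
        bmu = np.argmin(((protos - x) ** 2).sum(axis=1))   # best-matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * sigma**2))   # neighborhood function on the grid
        protos += lr * h[:, None] * (x - protos)

# protos.reshape(grid * grid, 32, 32) now holds the prototype images.
```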
Baseline estimation in flame's spectra by using neural networks and robust statistics
NASA Astrophysics Data System (ADS)
Garces, Hugo; Arias, Luis; Rojas, Alejandro
2014-09-01
This work presents a baseline estimation method for flame spectra based on an artificial-intelligence structure, a neural network, combining robust statistics with multivariate analysis to automatically discriminate the measured wavelengths belonging to the continuous feature for model adaptation, thereby overcoming the restriction of having to measure the target baseline for training. The main contributions of this paper are: to analyze a flame spectra database by computing Jolliffe statistics from Principal Component Analysis, detecting wavelengths not correlated with most of the measured data and hence corresponding to the baseline; to systematically determine the optimal number of neurons in hidden layers based on Akaike's Final Prediction Error; to estimate the baseline over the full wavelength range of the sampled spectra; and to train a neural network that generalizes the relation between measured and baseline spectra. The main application of our research is to compute total radiation with baseline information, allowing diagnosis of the combustion-process state for optimization at early stages.
NASA Astrophysics Data System (ADS)
Schneider, Sandra; Prijs, Vera F.; Schoonhoven, Ruurd
2003-06-01
Lower sideband distortion product otoacoustic emissions (DPOAEs), measured in the ear canal upon stimulation with two continuous pure tones, are the result of interfering contributions from two different mechanisms, the nonlinear distortion component and the linear reflection component. The two contributors have been shown to have a different amplitude and, in particular, a different phase behavior as a function of the stimulus frequencies. The dominance of either component was investigated in an extensive (f1,f2) area study of DPOAE amplitude and phase in the guinea pig, which allows for both qualitative and quantitative analysis of isophase contours. Making a minimum of additional assumptions, simple relations between the direction of constant phase in the (f1,f2) plane and the group delays in f1-sweep, f2-sweep, and fixed f2/f1 paradigms can be derived, both for distortion (wave-fixed) and reflection (place-fixed) components. The experimental data indicate the presence of both components in the lower sideband DPOAEs, with the reflection component as the dominant contributor for low f2/f1 ratios and the distortion component for intermediate ratios. At high ratios the behavior cannot be explained by dominance of either component.
Systematic study of anharmonic features in a principal component analysis of gramicidin A.
Kurylowicz, Martin; Yu, Ching-Hsing; Pomès, Régis
2010-02-03
We use principal component analysis (PCA) to detect functionally interesting collective motions in molecular-dynamics simulations of membrane-bound gramicidin A. We examine the statistical and structural properties of all PCA eigenvectors and eigenvalues for the backbone and side-chain atoms. All eigenvalue spectra show two distinct power-law scaling regimes, quantitatively separating large from small covariance motions. Time trajectories of the largest PCs converge to Gaussian distributions at long timescales, but groups of small-covariance PCs, which are usually ignored as noise, have subdiffusive distributions. These non-Gaussian distributions imply anharmonic motions on the free-energy surface. We characterize the anharmonic components of motion by analyzing the mean-square displacement for all PCs. The subdiffusive components reveal picosecond-scale oscillations in the mean-square displacement at frequencies consistent with infrared measurements. In this regime, the slowest backbone mode exhibits tilting of the peptide planes, which allows carbonyl oxygen atoms to provide surrogate solvation for water and cation transport in the channel lumen. Higher-frequency modes are also apparent, and we describe their vibrational spectra. Our findings expand the utility of PCA for quantifying the essential features of motion on the anharmonic free-energy surface made accessible by atomistic molecular-dynamics simulations. Copyright (c) 2010 Biophysical Society. Published by Elsevier Inc. All rights reserved.
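A sketch of the mean-square-displacement diagnostic applied to a principal-component time series; the trajectory below is synthetic (diffusion plus an oscillation) and only illustrates how oscillatory or subdiffusive MSD curves would show up, not the gramicidin data themselves.

```python
import numpy as np

def msd(pc, max_lag):
    """MSD(tau) = < (pc(t + tau) - pc(t))^2 > over all time origins t."""
    return np.array([np.mean((pc[lag:] - pc[:-lag]) ** 2)
                     for lag in range(1, max_lag + 1)])

rng = np.random.default_rng(6)
t = np.arange(10000) * 0.001               # fabricated time step
pc = (np.cumsum(rng.normal(size=t.size)) * 0.01     # diffusive drift
      + 0.5 * np.sin(2 * np.pi * 30 * t))           # picosecond-like oscillation
curve = msd(pc, 500)   # oscillations superposed on a diffusive rise
```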
JAMS - a software platform for modular hydrological modelling
NASA Astrophysics Data System (ADS)
Kralisch, Sven; Fischer, Christian
2015-04-01
Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in corresponding simulation models. Software frameworks that allow for the seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an open-source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied in recent years. We will present the JAMS core concepts and give an overview of models, simulation components and support tools available for the framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
[CompuRecord--A perioperative information management-system for anesthesia].
Martin, J; Ederle, D; Milewski, P
2002-08-01
Procedures for the automatic documentation of anesthesia have been described repeatedly since 1977. Because of limited desktop layouts and a focus on intraoperative documentation only, widespread adoption has not been achieved so far. Today's systems offer graphically oriented desktops that can be operated intuitively. The CompuRecord(R) system (Philips Healthcare) is a perioperative management system for anesthesia. It is constructed from modular components and records the complete anesthesiological care of a patient, from the pre-anesthetic assessment to the recovery room. Additional modules allow economic analysis, support quality management and export a core data set. Except for the original software, all other components of the system, including the network components, are standard IT products, which reduces the costs of procurement, expansion and support. The advantages of an automatic anesthesia documentation system are the frequent and detailed recording of anesthesiological data and the possibility of a meticulous calculation of costs for each patient. The anesthesiologist's time spent on documentation is reduced remarkably, with a limited and reasonable amount of data to be recorded; this leaves more time for attending to the patient. The time necessary for training is kept low by the touch screens of the CompuRecord(R) system, which can be operated intuitively. Prior to purchase, an exact analysis of processes and of subsequent costs should be carried out. Standardized documentation makes it possible to establish standard operating procedures in a department of anesthesia. With the systems available today, an implementation is already possible despite restricted manpower resources.
Accurate Determination of Soluble Axl by Enzyme-Linked Immunosorbent Assay.
Dengler, Mirko; Huber, Heidemarie; Müller, Christian J; Zellmer, Angela; Rauch, Peter; Mikulits, Wolfgang
2016-11-01
Levels of soluble Axl (sAxl) are routinely assessed in human sera by sandwich enzyme-linked immunosorbent assay (ELISA). Although sAxl values are suggested to diagnose different types of disorders, no uniform ELISA method is available that allows reliable interassay comparison of results. Furthermore, little is known about the stability of sAxl under storage conditions, which is a relevant parameter for biomedical trials. The evaluation of sAxl stability under various stress conditions and the determination of proper conditions for using the sAxl ELISA in routine clinical applications are therefore of great interest. In this study, serum samples were subjected to freeze-thaw cycles and incubation at different temperatures to analyze the stability of sAxl by ELISA. Dilution and spike-in experiments were carried out to examine the impact of serum and diluent components on the ELISA performance. Various diluents and media were employed to resolve masking effects of the serum. The assay components were further optimized for long-term usability by treatment with stabilizers and validation under temperature stress. Indeed, sAxl showed long-term stability in serum during freeze-thaw cycles and incubation under temperature stress conditions. The dilution experiments revealed that unknown components in the serum caused masking effects that can be reduced by proper dilutions. The assay performance was further increased by using a standardized buffer system to dilute serum samples. Stabilization of the coated plates and of the streptavidin-horseradish peroxidase allowed long-term storage for up to 6 months. In sum, our data establish proper ELISA conditions that allow the accurate analysis of sAxl levels in human serum.
RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk
NASA Astrophysics Data System (ADS)
van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina
2015-04-01
Within the framework of the EU FP7 Marie Curie project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in the planning of risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g., dams, dikes, check-dams), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives lead to risk reduction under different future scenarios. The SDSS was developed with open-source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular: the various parts are loosely coupled, extensible, flexible, web-based, and use standards for interoperability. The SDSS is composed of a number of integrated components. The risk assessment component allows spatial risk analysis to be carried out with different degrees of complexity, ranging from simple exposure (overlay of hazard and asset maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness/spatial multi-criteria evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows future scenarios to be defined in terms of climate change, land use change and population change, together with the time periods for which these scenarios are made. The component does not generate these scenarios itself but uses input maps for the effect of the scenarios on the hazard and asset maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps but also in other forms (risk curves, tables, graphs).
NASA Astrophysics Data System (ADS)
Ogruc Ildiz, G.; Arslan, M.; Unsalan, O.; Araujo-Andrade, C.; Kurt, E.; Karatepe, H. T.; Yilmaz, A.; Yalcinkaya, O. B.; Herken, H.
2016-01-01
In this study, a methodology based on Fourier-transform infrared spectroscopy together with principal component analysis and partial least squares methods is proposed for the analysis of blood plasma samples, in order to identify spectral changes correlated with biomarkers associated with schizophrenia and bipolar disorder. Our main goal was to use the spectral information to calibrate statistical models that discriminate and classify blood plasma samples belonging to bipolar and schizophrenic patients. IR spectra of 30 blood plasma samples from each of the bipolar and schizophrenic patient groups and a healthy control group were collected. The results obtained from principal component analysis (PCA) show a clear discrimination between the bipolar (BP), schizophrenic (SZ) and control group (CG) blood samples and make it possible to identify three main spectral regions exhibiting the major differences correlated with both mental disorders (biomarkers). Furthermore, a model for the classification of the blood samples was calibrated using partial least squares discriminant analysis (PLS-DA), allowing the correct classification of BP, SZ and CG samples. The results obtained with this methodology suggest that it can be used as a complementary diagnostic tool for the detection and discrimination of these mental diseases.
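A minimal PLS-DA sketch in the sense used above: PLS regression against one-hot class labels, with classification by the largest predicted response. Spectra, group sizes, and the component count are placeholders, not the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import LabelBinarizer

rng = np.random.default_rng(7)
X = rng.normal(size=(90, 400))            # 90 spectra x 400 wavenumber points
y = np.repeat(["BP", "SZ", "CG"], 30)     # three groups of 30 samples

lb = LabelBinarizer()
Y = lb.fit_transform(y)                   # one-hot encoding of the classes
pls = PLSRegression(n_components=5).fit(X, Y)
pred = lb.classes_[np.argmax(pls.predict(X), axis=1)]
print((pred == y).mean())                 # training accuracy (random data here)
```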
Diffusion Modelling Reveals the Decision Making Processes Underlying Negative Judgement Bias in Rats
Hales, Claire A.; Robinson, Emma S. J.; Houghton, Conor J.
2016-01-01
Human decision making is modified by emotional state. Rodents exhibit similar biases during interpretation of ambiguous cues that can be altered by affective state manipulations. In this study, the impact of negative affective state on judgement bias in rats was measured using an ambiguous-cue interpretation task. Acute treatment with an anxiogenic drug (FG7142), and chronic restraint stress and social isolation both induced a bias towards more negative interpretation of the ambiguous cue. The diffusion model was fit to behavioural data to allow further analysis of the underlying decision making processes. To uncover the way in which parameters vary together in relation to affective state manipulations, independent component analysis was conducted on rate of information accumulation and distances to decision threshold parameters for control data. Results from this analysis were applied to parameters from negative affective state manipulations. These projected components were compared to control components to reveal the changes in decision making processes that are due to affective state manipulations. Negative affective bias in rodents induced by either FG7142 or chronic stress is due to a combination of more negative interpretation of the ambiguous cue, reduced anticipation of the high reward and increased anticipation of the low reward. PMID:27023442
NASA Astrophysics Data System (ADS)
Morren, Geert; Wolf, Martin; Lemmerling, Philippe; Wolf, Ursula; Choi, Jee H.; Gratton, Enrico; De Lathauwer, Lieven; Van Huffel, Sabine
2002-06-01
Fast changes, in the range of milliseconds, in the optical properties of cerebral tissue, which are associated with brain activity, can be detected using non-invasive near-infrared spectroscopy (NIRS). These changes in light scattering are due to an alteration in the refractive index at neuronal membranes. The aim of this study was to develop highly sensitive data analysis algorithms to detect this fast signal, which is small compared to other physiological signals. A frequency-domain tissue oximeter, whose laser diodes were modulated at 110 MHz, was used. The amplitude, mean intensity and phase of the modulated optical signal were measured at a 96 Hz sampling rate. The probe, consisting of four crossed source-detector pairs, was placed above the motor cortex, contralateral to the hand performing a tapping exercise consisting of alternating rest and tapping periods of 20 s each. The tapping frequency, which was set to 3.55 Hz, or 2.5 times the heart rate of the subject, to avoid the influence of harmonics on the signal, could not be observed in any of the individual signals measured by the detectors. An adaptive filter was used to remove the arterial pulsatility from the optical signals. Independent Component Analysis then allowed the separation of signal components in which the tapping frequency was clearly visible.
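A sketch of the adaptive-cancellation step, assuming a reference input correlated with the arterial pulse is available: a least-mean-squares (LMS) filter tracks the pulsatile component, and the residual retains the non-pulsatile signal. Apart from the 96 Hz sample rate and 3.55 Hz tapping frequency quoted above, all signal parameters are invented.

```python
import numpy as np

def lms_cancel(signal, reference, n_taps=16, mu=0.01):
    """Subtract the LMS filter's estimate of the reference-correlated
    component; the returned residual is the 'cleaned' signal."""
    w = np.zeros(n_taps)
    out = np.zeros_like(signal)
    for n in range(n_taps, len(signal)):
        x = reference[n - n_taps:n][::-1]    # most recent samples first
        e = signal[n] - w @ x                # residual = cleaned sample
        w += 2 * mu * e * x                  # LMS weight update
        out[n] = e
    return out

fs = 96.0                                    # sampling rate from the study
t = np.arange(0, 60, 1 / fs)
pulse = np.sin(2 * np.pi * 1.42 * t)         # ~85 bpm arterial component (fake)
neural = 0.05 * np.sin(2 * np.pi * 3.55 * t) # tapping-frequency signal
measured = neural + pulse + 0.02 * np.random.default_rng(8).normal(size=t.size)
cleaned = lms_cancel(measured, pulse)
```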
Kurosumi, M; Mizukoshi, K
2018-05-01
The types of shape feature that constitute a face have not been comprehensively established, and most previous studies of age-related changes in facial shape have focused on individual characteristics, such as wrinkles, sagging skin, etc. In this study, we quantitatively measured differences in face shape between individuals and investigated how shape features change with age. We analyzed the faces of 280 Japanese women aged 20-69 years three-dimensionally and used principal component analysis to establish the shape features that characterize individual differences. We also evaluated the relationship between each feature and age, clarifying the shape features characteristic of different age groups. Changes in facial shape in middle age included a decreased volume of the upper face and an increased volume of the whole cheeks and around the chin. Changes in older people included an increased volume of the lower cheeks and around the chin, sagging skin, and jaw distortion. Principal component analysis was effective for identifying facial shape features that represent individual and age-related differences. This method allowed straightforward measurements, such as the increase or decrease in cheek volume caused by soft-tissue changes or skeletal changes to the forehead or jaw, simply by acquiring three-dimensional facial images. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Modelling of Damage Evolution in Braided Composites: Recent Developments
NASA Astrophysics Data System (ADS)
Wang, Chen; Roy, Anish; Silberschmidt, Vadim V.; Chen, Zhong
2017-12-01
Composites reinforced with woven or braided textiles exhibit high structural stability and excellent damage tolerance thanks to yarn interlacing. With their high stiffness-to-weight and strength-to-weight ratios, braided composites are attractive for aerospace and automotive components as well as sports protective equipment. In these potential applications, components are typically subjected to multi-directional static, impact and fatigue loadings. To enhance material analysis and design for such applications, understanding the mechanical behaviour of braided composites and developing predictive capabilities become crucial. Significant progress has been made in recent years in the development of new modelling techniques that elucidate the static and dynamic responses of braided composites. However, because of their unique interlacing geometric structure and complicated failure modes, predicting damage initiation and evolution in components is still a challenge. This work therefore presents a comprehensive literature analysis focused on the state of the art in progressive damage analysis of braided composites with finite-element simulations. Models recently employed in studies of the mechanical behaviour, impact response and fatigue of braided composites are presented systematically. The review highlights the importance, advantages and limitations of the applied failure criteria and damage evolution laws for yarns and composite unit cells. In addition, this work provides a useful reference for future research on FE simulations of braided composites.
Sonoda, T; Ona, T; Yokoi, H; Ishida, Y; Ohtani, H; Tsuge, S
2001-11-15
Detailed quantitative analysis of lignin monomer composition, comprising p-coumaryl, coniferyl, and sinapyl alcohol and p-coumaraldehyde, coniferaldehyde, and sinapaldehyde, in plants has not been fully studied, mainly because of artifact formation during the lignin isolation procedure, partial loss of the lignin components inherent in the chemical degradative methods, and difficulty in interpreting the complex spectra generally observed for the lignin components. Here we propose a new method to quantify lignin monomer composition in detail by pyrolysis-gas chromatography (Py-GC) using acetylated lignin samples. The lignin acetylation procedure helps prevent the secondary formation of cinnamaldehydes from the corresponding alcohol forms during pyrolysis, which is otherwise unavoidable to some extent in the conventional Py-GC process. On the basis of the characteristic peaks on the pyrograms of the acetylated samples, lignin monomer compositions in various dehydrogenative polymers (DHP) as lignin model compounds were determined, taking even minor components such as cinnamaldehydes into consideration. The compositions observed by Py-GC were in good agreement with the lignin monomer contents supplied during DHP synthesis. The new Py-GC method combined with sample preacetylation allowed accurate quantitative analysis of detailed lignin monomer composition using microgram quantities of extractive-free plant samples.
Kinematics and dynamics of barred spiral galaxies
NASA Astrophysics Data System (ADS)
Hernandez, Olivier
The total mass (luminous and dark) of galaxies is derived from their circular velocities. Spectroscopic Fabry-Perot observations of the ionized gas component of spiral galaxies allow one to derive their kinematics. In the case of purely axisymmetric velocity fields--as in non-active and unbarred spiral galaxies--the circular velocities can be derived directly. However, the velocity fields of barred galaxies (which constitute two-thirds of spirals) exhibit strong non-circular motions and require a careful analysis to retrieve the circular component. This thesis proposes the steps necessary to recover the axisymmetric component of barred spiral galaxies. The first step was to develop the best instrumentation possible for this work. [Special characters omitted.], the most sensitive photon-counting camera ever developed, was coupled to a Fabry-Perot interferometer. Observations of a sample of barred spiral galaxies--the BHαBAR sample--were assembled in order to obtain the most rigorous velocity fields. Then, the Tremaine-Weinberg method, which can determine the bar pattern speed and is usually applied to observations of the stellar component, was tested on the ionized gas and gave satisfactory results. Finally, all the above techniques were applied to the BHαBAR sample in order to study the key parameters of the galaxies' evolution--bar pattern speeds, multiple stationary waves, resonances, etc.--which will allow one to use N-body+SPH simulations to model the non-circular motions properly and determine the true total mass of barred spiral galaxies.
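A toy numerical version of the Tremaine-Weinberg estimator mentioned above: along slits parallel to the line of nodes, the slope of the intensity-weighted mean velocity against the intensity-weighted mean position gives Omega_p*sin(i). The maps below are synthetic placeholders constructed so the estimator recovers the input pattern speed.

```python
import numpy as np

rng = np.random.default_rng(10)
ny, nx = 64, 64
x = np.linspace(-30.0, 30.0, nx)            # position along the line of nodes
sigma = rng.random((ny, nx)) + 0.1          # stand-in surface-brightness map
omega_true, inc = 2.0, np.radians(45.0)
v_los = omega_true * np.sin(inc) * x[None, :] \
        + 0.5 * rng.normal(size=(ny, nx))   # line-of-sight velocity + noise

num = (sigma * v_los).sum(axis=1)           # <Sigma v> per slit (each row)
den = (sigma * x[None, :]).sum(axis=1)      # <Sigma x> per slit
slope = np.polyfit(den, num, 1)[0]          # TW plot: slope = Omega_p sin(i)
print(slope / np.sin(inc))                  # ~2.0, the input pattern speed
```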
Frank, Yulia A.; Kadnikov, Vitaly V.; Gavrilov, Sergey N.; Banks, David; Gerasimchuk, Anna L.; Podosokorskaya, Olga A.; Merkel, Alexander Y.; Chernyh, Nikolai A.; Mardanov, Andrey V.; Ravin, Nikolai V.; Karnachuk, Olga V.; Bonch-Osmolovskaya, Elizaveta A.
2016-01-01
The goal of this work was to study the diversity of microorganisms inhabiting a deep subsurface aquifer system in order to understand their functional roles and the interspecies relations formed in the course of buried organic matter degradation. A microbial community of a deep subsurface thermal aquifer in the Tomsk Region, Western Siberia, was monitored over the course of 5 years via a 2.7 km deep borehole, 3P, drilled down to the Palaeozoic basement. The borehole water discharges at a temperature of ca. 50°C. Its chemical composition varies, but it consistently contains acetate, propionate, and traces of hydrocarbons, and it gives rise to microbial mats along the surface flow. Community analysis by PCR-DGGE profiling of 16S rRNA genes, performed repeatedly within the 5 years, revealed several dominant phylotypes consistently found in the borehole water and a highly variable diversity of prokaryotes brought to the surface with the borehole outflow. The major planktonic components of the microbial community were Desulfovirgula thermocuniculi and Methanothermobacter spp. The composition of the minor part of the community was unstable, and molecular analysis did not reveal any regularity in its variations, except for some predominance of uncultured Firmicutes. Batch cultures with complex organic substrates inoculated with water samples were set up in order to enrich prokaryotes from the variable part of the community. PCR-DGGE analysis of these enrichments yielded uncultured Firmicutes, Chloroflexi, and Ignavibacteriae. A continuous-flow microaerophilic enrichment culture from a water sample amended with acetate contained Hydrogenophilus thermoluteolus, which had previously been detected in the microbial mat developing at the outflow of the borehole. The cultivation results allowed us to assume that the variable components of the 3P well community are hydrolytic organotrophs degrading buried biopolymers, while the constant planktonic components of the community degrade dissolved fermentation products to methane and CO2, possibly via interspecies hydrogen transfer. Occasional washout of minor community components capable of oxygen respiration leads to the development of microbial mats at the outflow of the borehole, where residual dissolved fermentation products are aerobically oxidized. Long-term community analysis combining molecular and cultivation techniques allowed us to characterize the stable and variable parts of the community and propose their environmental roles. PMID:28082967
DOT National Transportation Integrated Search
1990-10-01
Guaranteed Ride Home programs work like a safety net to take the worry out of ridesharing. They allow employees to carpool or vanpool worry-free, confident that their employer has a program in place that will provide a ride if they really need one. S...
Multifamily Quality Control Inspector Job/Task Analysis and Report: September 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, C. M.
The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Quality Control Inspector JTA identifies and catalogs all of the tasks performed by multifamily quality control inspectors, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.
Multifamily Retrofit Project Manager Job/Task Analysis and Report: September 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, C. M.
The development of job/task analyses (JTAs) is one of three components of the Guidelines for Home Energy Professionals project and will allow industry to develop training resources, quality assurance protocols, accredited training programs, and professional certifications. The Multifamily Retrofit Project Manager JTA identifies and catalogs all of the tasks performed by multifamily retrofit project managers, as well as the knowledge, skills, and abilities (KSAs) needed to perform the identified tasks.
Contracting for Navy Husbanding Services: An Analysis of the Fat Leonard Case
2017-12-01
management and internal control frameworks. This research analyzes each alleged act of fraud in the Fat Leonard case and aligns the act with the contract...management phase in which the alleged act occurred and with the internal control component that most contributed to and allowed the alleged act to be...contract administration phases. Furthermore, the findings indicate that the internal control deficiencies were in the control environment and
NASA Technical Reports Server (NTRS)
Deckman, G.; Rousseau, J. (Editor)
1973-01-01
The Wash Water Recovery System (WWRS) is intended for use in processing shower bath water onboard a spacecraft. The WWRS utilizes flash evaporation, vapor compression, and pyrolytic reaction to process the wash water to allow recovery of potable water. Wash water flashing and foaming characteristics are evaluated, physical properties of concentrated wash water are determined, and a long-term feasibility study of the system is performed. In addition, a computer analysis of the system and a detailed design of a 10 lb/hr vortex-type water vapor compressor were completed. The computer analysis also sized the remaining system components on the basis of the new vortex compressor design.
NASA Astrophysics Data System (ADS)
Gilmanshin, I. R.; Kirpichnikov, A. P.
2017-09-01
A study of the functioning algorithm of the early-detection module for excessive losses proves that the module can be modeled using absorbing Markov chains. Of particular interest are the probabilistic characteristics of this algorithm, which relate the reliability indicators of individual elements, or the probabilities of occurrence of certain events, to the likelihood of transmitting reliable information. The relations identified during the analysis allow thresholds to be set on the reliability characteristics of the system components.
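As a concrete illustration of the absorbing-Markov-chain machinery invoked above, the sketch below computes absorption statistics from the canonical transition blocks; the transition probabilities are invented placeholders, not values from the study:

```python
import numpy as np

# Canonical form of an absorbing Markov chain: transient-to-transient block Q
# and transient-to-absorbing block R. All values below are illustrative only.
Q = np.array([[0.6, 0.2, 0.1],
              [0.1, 0.5, 0.2],
              [0.0, 0.3, 0.4]])
R = np.array([[0.1, 0.0],
              [0.0, 0.2],
              [0.1, 0.2]])

N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix: expected visits per transient state
t = N @ np.ones(3)                # expected number of steps before absorption
B = N @ R                         # probability of ending in each absorbing state
print(t, B, sep="\n")
```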
Change Mechanisms of Schema-Centered Group Psychotherapy with Personality Disorder Patients
Tschacher, Wolfgang; Zorn, Peter; Ramseyer, Fabian
2012-01-01
Background This study addressed the temporal properties of personality disorders and their treatment by schema-centered group psychotherapy. It investigated the change mechanisms of psychotherapy using a novel method by which psychotherapy can be modeled explicitly in the temporal domain. Methodology and Findings 69 patients were assigned to a specific schema-centered behavioral group psychotherapy, 26 to social skills training as a control condition. The largest diagnostic subgroups were narcissistic and borderline personality disorder. Both treatments offered 30 group sessions of 100 min duration each, at a frequency of two sessions per week. Therapy process was described by components resulting from principal component analysis of patients' session-reports that were obtained after each session. These patient-assessed components were Clarification, Bond, Rejection, and Emotional Activation. The statistical approach focused on time-lagged associations of components using time-series panel analysis. This method provided a detailed quantitative representation of therapy process. It was found that Clarification played a core role in schema-centered psychotherapy, reducing rejection and regulating the emotion of patients. This was also a change mechanism linked to therapy outcome. Conclusions/Significance The process-oriented methodology introduced here made it possible to highlight the mechanisms by which psychotherapeutic treatment became effective. Additionally, process models depicted the actual patterns that differentiated specific diagnostic subgroups. Time-series analysis explores Granger causality, a non-experimental approximation of causality based on temporal sequences. This methodology, resting upon naturalistic data, can explicate mechanisms of action in psychotherapy research and illustrate the temporal patterns underlying personality disorders. PMID:22745811
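The time-lagged association logic behind this analysis can be illustrated with a minimal lagged regression; the data below are simulated stand-ins for the session-report components, not the study's data:

```python
import numpy as np

def lagged_effect(x, y):
    """OLS of y_t on (1, y_{t-1}, x_{t-1}); returns the coefficient of x_{t-1}."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1], x[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return beta[2]

rng = np.random.default_rng(0)
n = 200
clarification = rng.normal(size=n)        # hypothetical per-session component scores
rejection = np.empty(n)
rejection[0] = rng.normal()
rejection[1:] = -0.4 * clarification[:-1] + rng.normal(size=n - 1)

# Recovers a negative lagged effect: high Clarification predicts lower Rejection
# at the next session, mirroring the kind of association the study reports.
print(lagged_effect(clarification, rejection))
```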
NASA Astrophysics Data System (ADS)
Ozeki, Yasuyuki; Otsuka, Yoichi; Sato, Shuya; Hashimoto, Hiroyuki; Umemura, Wataru; Sumimura, Kazuhiko; Nishizawa, Norihiko; Fukui, Kiichi; Itoh, Kazuyoshi
2013-02-01
We have developed a video-rate stimulated Raman scattering (SRS) microscope with frame-by-frame wavenumber tunability. The system uses a 76-MHz picosecond Ti:sapphire laser and a subharmonically synchronized, 38-MHz Yb fiber laser. The Yb fiber laser pulses are spectrally sliced by a fast wavelength-tunable filter, which consists of a galvanometer scanner, a 4-f optical system and a reflective grating. The spectral resolution of the filter is ~3 cm⁻¹. The wavenumber was scanned from 2800 to 3100 cm⁻¹ with an arbitrary waveform synchronized to the frame trigger. For imaging, we introduced an 8-kHz resonant scanner and a galvanometer scanner. We were able to acquire SRS images of 500 × 480 pixels at a frame rate of 30.8 frames/s. These images were then processed by principal component analysis followed by a modified algorithm of independent component analysis. This algorithm allows blind separation of constituents with overlapping Raman bands from SRS spectral images. The independent component (IC) spectra give spectroscopic information, and IC images can be used to produce pseudo-color images. We demonstrate various label-free imaging modalities such as 2D spectral imaging of the rat liver, two-color 3D imaging of a vessel in the rat liver, and spectral imaging of several sections of intestinal villi in the mouse. Various structures in the tissues such as lipid droplets, cytoplasm, fibrous texture, nucleus, and water-rich region were successfully visualized.
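The PCA-then-ICA unmixing step can be sketched with standard library routines; the array shapes below are hypothetical (and deliberately small), and scikit-learn's FastICA stands in for the authors' modified independent component analysis algorithm:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Hypothetical hyperspectral stack: rows = pixels, columns = wavenumber channels.
X = np.random.rand(128 * 120, 64)

pca = PCA(n_components=5)                       # compress and denoise first
scores = pca.fit_transform(X)

ica = FastICA(n_components=5, random_state=0)   # then blind source separation
ic_maps = ica.fit_transform(scores)             # per-pixel abundances of each IC

# Back-project the ICA mixing matrix through the PCA basis to get IC spectra
# in the original wavenumber space (up to the removed mean).
ic_spectra = ica.mixing_.T @ pca.components_

maps = ic_maps.reshape(128, 120, 5)             # IC images for pseudo-color display
```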
Session 6: Dynamic Modeling and Systems Analysis
NASA Technical Reports Server (NTRS)
Csank, Jeffrey; Chapman, Jeffryes; May, Ryan
2013-01-01
These presentations cover some of the ongoing work in dynamic modeling and dynamic systems analysis. The first presentation discusses dynamic systems analysis and how to integrate dynamic performance information into the systems analysis. The ability to evaluate the dynamic performance of an engine design may allow tradeoffs between the dynamic performance and operability of a design, resulting in a more efficient engine design. The second presentation discusses the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS). T-MATS is a simulation system with a library containing the basic building blocks that can be used to create dynamic thermodynamic systems. Some of the key features include turbomachinery components, such as turbines, compressors, etc., and basic control system blocks. T-MATS is written in the MATLAB/Simulink environment and is open source software. The third presentation focuses on getting additional performance from the engine by allowing the limit regulators to be active only when a limit is in danger of being violated. Typical aircraft engine control architecture is based on a min-max scheme, which is designed to keep the engine operating within prescribed mechanical/operational safety limits. Using a conditionally active min-max limit regulator scheme, additional performance can be gained by disabling non-relevant limit regulators.
Influence of argon impurities on the elastic scattering of x-rays from imploding beryllium capsules
Saunders, A. M.; Chapman, D. A.; Kritcher, A. L.; ...
2018-03-01
Here, we investigate the effect of argon impurities on the elastic component of x-ray scattering spectra taken from directly driven beryllium capsule implosions at the OMEGA laser. The plasma conditions were obtained in a previous analysis [18] by fitting the inelastic scattering component. We show that the known argon impurity in the beryllium modifies the elastic scattering due to the larger number of bound electrons. We indeed find significant deviations in the elastic scattering from roughly 1 at.% argon contained in the beryllium. With knowledge of the argon impurity fraction, we use the elastic scattering component to determine the charge state of the compressed beryllium, as the fits are rather insensitive to the argon charge state. Lastly, we discuss how doping small fractions of mid- or high-Z elements into low-Z materials could allow ionization balance studies in dense plasmas.
NASA Astrophysics Data System (ADS)
Juretzek, Carina; Hadziioannou, Céline
2014-05-01
Our knowledge about common and different origins of Love and Rayleigh waves observed in the microseism band of the ambient seismic noise field is still limited, including the understanding of source locations and source mechanisms. Multi-component array methods are suitable to address this issue. In this work we use a 3-component beamforming algorithm to obtain the source directions and polarization states of the ambient seismic noise field within the primary and secondary microseism bands recorded at the Gräfenberg array in southern Germany. The method makes it possible to distinguish between differently polarized waves present in the seismic noise field and estimates Love and Rayleigh wave source directions and their seasonal variations using one year of array data. We find mainly coinciding directions for the strongest acting sources of both wave types in the primary microseism band and different source directions in the secondary microseism band.
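A minimal single-component beamformer conveys the core of the approach (the algorithm used here is 3-component and polarization-aware, which this sketch does not attempt); the array geometry, frequency, and slowness grid below are placeholders:

```python
import numpy as np

def beam_power(data, coords, freq, dt, s_grid):
    """Narrowband plane-wave beam power over a grid of horizontal slownesses.

    data: (n_sta, n_samp) waveforms; coords: (n_sta, 2) station offsets in km;
    s_grid: (n_s, 2) trial slowness vectors in s/km. Sign conventions vary."""
    spec = np.fft.rfft(data, axis=1)
    f = np.fft.rfftfreq(data.shape[1], dt)
    X = spec[:, np.argmin(np.abs(f - freq))]       # pick the analysis frequency bin
    delays = s_grid @ coords.T                     # (n_s, n_sta) plane-wave delays
    steer = np.exp(-2j * np.pi * freq * delays)    # steering vectors per trial slowness
    return np.abs(steer @ X) ** 2 / data.shape[0] ** 2

# Hypothetical array: 9 stations, 20 s at 20 Hz, secondary-microseism frequency.
rng = np.random.default_rng(0)
data = rng.normal(size=(9, 400))
coords = rng.uniform(-50, 50, size=(9, 2))
s = np.linspace(-0.5, 0.5, 41)
s_grid = np.array([(sx, sy) for sx in s for sy in s])
P = beam_power(data, coords, 0.15, 0.05, s_grid)   # grid peak gives the back azimuth
```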
Genetic mixed linear models for twin survival data.
Ha, Il Do; Lee, Youngjo; Pawitan, Yudi
2007-07-01
Twin studies are useful for assessing the relative importance of the genetic or heritable component compared with the environmental component. In this paper we develop a methodology for studying the heritability of age-at-onset or lifespan traits, with application to the analysis of twin survival data. Due to the limited period of observation, the data can be left truncated and right censored (LTRC). Under the LTRC setting we propose a genetic mixed linear model, which allows general fixed predictors and random components to capture genetic and environmental effects. Inferences are based upon the hierarchical likelihood (h-likelihood), which provides a statistically efficient and unified framework for various mixed-effect models. We also propose a simple and fast computation method for dealing with large data sets. The method is illustrated by survival data from the Swedish Twin Registry. Finally, a simulation study is carried out to evaluate its performance.
Open-cycle OTEC system performance analysis. [Claude cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewandowski, A.A.; Olson, D.A.; Johnson, D.H.
1980-10-01
An algorithm developed to calculate the performance of Claude-cycle ocean thermal energy conversion (OTEC) systems is described. The algorithm treats each component of the system separately and then interfaces them to form a complete system, allowing a component to be changed without changing the rest of the algorithm. Two components that are subject to change are the evaporator and condenser. For this study we developed mathematical models of a channel-flow evaporator and of both a horizontal-jet and a spray direct-contact condenser. The algorithm was then programmed to run on SERI's CDC 7600 computer and used to calculate the effect on performance of deaerating the warm and cold water streams before they enter the evaporator and condenser, respectively. This study indicates that there is no advantage to removing air from these streams compared with removing the air from the condenser.
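The component-wise architecture described above maps naturally onto a simple interface pattern; the sketch below uses trivial invented stand-ins for the report's evaporator and condenser models, purely to show how one component can be swapped without touching the rest of the cycle:

```python
from dataclasses import dataclass

@dataclass
class Stream:
    m_dot: float  # mass flow rate, kg/s
    T: float      # temperature, K

class ChannelFlowEvaporator:
    """Placeholder model: every component exposes the same solve() interface."""
    def solve(self, warm: Stream) -> Stream:
        return Stream(warm.m_dot * 0.005, warm.T - 0.5)   # illustrative flash fraction

class SprayCondenser:
    """Placeholder direct-contact condenser; swappable for a horizontal-jet model."""
    def solve(self, vapor: Stream, cold: Stream) -> Stream:
        return Stream(vapor.m_dot + cold.m_dot, cold.T + 2.0)

def run_cycle(evaporator, condenser, warm: Stream, cold: Stream) -> Stream:
    vapor = evaporator.solve(warm)        # components are interfaced, not fused,
    return condenser.solve(vapor, cold)   # so either model can be replaced alone

out = run_cycle(ChannelFlowEvaporator(), SprayCondenser(),
                Stream(1000.0, 299.0), Stream(500.0, 278.0))
```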
NASA Astrophysics Data System (ADS)
Kirovskaya, I. A.; Mironova, E. V.; Ushakov, O. V.; Nor, P. E.; Yureva, A. V.; Matyash, Yu I.
2018-01-01
A method has been developed for determining the hydrogen index of the surface isoelectric state (pHiso) at various pressures of gases that are possible components of ambient and process media. Using this method, changes in pHiso of binary and more complex semiconductors, components of the new ZnSe-CdS system, under the influence of nitrogen dioxide have been found. The limiting sensitivity of the surfaces, i.e., the minimum NO2 pressure causing a change in pHiso, has been estimated. The most active components of the ZnSe-CdS system, recommended as materials for NO2 measuring cells, have been revealed. A relationship between the composition-dependent patterns of the surface (acid-base) and bulk (in particular, theoretically calculated crystal density) properties has been established, allowing the most effective materials for sensor technology and for semiconductor analysis to be identified.
Influence of argon impurities on the elastic scattering of x-rays from imploding beryllium capsules
NASA Astrophysics Data System (ADS)
Saunders, A. M.; Chapman, D. A.; Kritcher, A. L.; Schoff, M.; Shuldberg, C.; Landen, O. L.; Glenzer, S. H.; Falcone, R. W.; Gericke, D. O.; Döppner, T.
2018-03-01
We investigate the effect of argon impurities on the elastic component of x-ray scattering spectra taken from directly driven beryllium capsule implosions at the OMEGA laser. The plasma conditions were obtained in a previous analysis [18] by fitting the inelastic scattering component. We show that the known argon impurity in the beryllium modifies the elastic scattering due to the larger number of bound electrons. We indeed find significant deviations in the elastic scattering from roughly 1 at.% argon contained in the beryllium. With knowledge of the argon impurity fraction, we use the elastic scattering component to determine the charge state of the compressed beryllium, as the fits are rather insensitive to the argon charge state. Finally, we discuss how doping small fractions of mid- or high-Z elements into low-Z materials could allow ionization balance studies in dense plasmas.
NASA Astrophysics Data System (ADS)
Godinho, R. M.; Cabrita, M. T.; Alves, L. C.; Pinheiro, T.
2015-04-01
Studies of the elemental composition of whole marine diatom cells are of high interest, as they constitute a direct measurement of environmental changes and allow the consequences of anthropogenic alterations for organisms, ecosystems, and global marine geochemical cycles to be anticipated. Nuclear microscopy is a powerful tool that allows direct measurement of whole cells, giving qualitative imaging of elemental distribution and quantitative determination of intracellular concentrations. Major obstacles to the analysis of marine microalgae are the high salinity of the medium and the recurrent presence of extracellular exudates produced by algae to maintain colonies in natural media and in vitro. The objective of this paper was to optimize the methodology of sample preparation of marine unicellular algae for elemental analysis with nuclear microscopy, allowing further studies on the cellular response to metals. Primary cultures of Coscinodiscus wailesii maintained in vitro were used to optimize protocols for elemental analysis with nuclear microscopy techniques. Adequate cell preparation procedures to isolate the cells from media components and exudates were established. The use of chemical agents proved to be inappropriate for elemental determination and for intracellular morphological analysis. The assessment of morphology and elemental partitioning in cell compartments obtained with nuclear microscopy techniques made it possible to infer their function in the natural environment and imbalances under the exposure condition. Exposure to metal affected C. wailesii morphology and internal elemental distribution.
Gutiérrez-Cacciabue, Dolores; Teich, Ingrid; Poma, Hugo Ramiro; Cruz, Mercedes Cecilia; Balzarini, Mónica; Rajal, Verónica Beatriz
2014-01-01
Several recreational surface waters in Salta, Argentina, were selected to assess their quality. Seventy percent of the measurements exceeded at least one of the limits established by international legislation, making the waters unsuitable for their intended use. To interpret the results of these complex data, multivariate techniques were applied. The Arenales River, due to the variability observed in the data, was divided in two, upstream and downstream, representing low- and high-pollution sites, respectively; Cluster Analysis supported that differentiation. Arenales River downstream and Campo Alegre Reservoir were the most different environments, and the Vaqueros and La Caldera Rivers were the most similar. Canonical Correlation Analysis allowed exploration of correlations between physicochemical and microbiological variables, except in both parts of the Arenales River, and Principal Component Analysis allowed relationships among the nine measured variables to be found in all aquatic environments. Variable loadings showed that Arenales River downstream was impacted by industrial and domestic activities, Arenales River upstream was affected by agricultural activities, Campo Alegre Reservoir was disturbed by anthropogenic and ecological effects, and the La Caldera and Vaqueros Rivers were influenced by recreational activities. Discriminant Analysis allowed identification of the subgroups of variables responsible for seasonal and spatial variations. Enterococcus, dissolved oxygen, conductivity, E. coli, pH, and fecal coliforms are sufficient to spatially describe the quality of the aquatic environments. Regarding seasonal variations, dissolved oxygen, conductivity, fecal coliforms, and pH can be used to describe water quality during the dry season, while dissolved oxygen, conductivity, total coliforms, E. coli, and Enterococcus can be used during the wet season. Thus, the use of multivariate techniques allowed monitoring tasks to be optimized and the costs involved to be minimized. PMID:25190636
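A minimal version of the Principal Component Analysis step might look as follows; the data matrix is a random stand-in for the nine measured physicochemical and microbiological variables:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = sampling events, columns = the nine measured variables (hypothetical data).
X = np.random.rand(40, 9)

Z = StandardScaler().fit_transform(X)     # standardize: variables differ in units
pca = PCA(n_components=2).fit(Z)

print(pca.explained_variance_ratio_)      # variance captured by PC1 and PC2
print(pca.components_)                    # loadings: which variables drive each PC
```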
Hierarchical control and performance evaluation of multi-vehicle autonomous systems
NASA Astrophysics Data System (ADS)
Balakirsky, Stephen; Scrapper, Chris; Messina, Elena
2005-05-01
This paper will describe how the Mobility Open Architecture Tools and Simulation (MOAST) framework can facilitate performance evaluations of RCS compliant multi-vehicle autonomous systems. This framework provides an environment that allows for simulated and real architectural components to function seamlessly together. By providing repeatable environmental conditions, this framework allows for the development of individual components as well as component performance metrics. MOAST is composed of high-fidelity and low-fidelity simulation systems, a detailed model of real-world terrain, actual hardware components, a central knowledge repository, and architectural glue to tie all of the components together. This paper will describe the framework's components in detail and provide an example that illustrates how the framework can be utilized to develop and evaluate a single architectural component through the use of repeatable trials and experimentation that includes both virtual and real components functioning together.
Analyses of exobiological and potential resource materials in the Martian soil.
Mancinelli, R L; Marshall, J R; White, M R
1992-01-01
Potential Martian soil components relevant to exobiology include water, organic matter, evaporites, clays, and oxides. These materials are also resources for human expeditions to Mars. When found in particular combinations, some of these materials constitute diagnostic paleobiomarker suites, allowing insight to be gained into the probability of life originating on Mars. Critically important to exobiology is the method of data analysis and data interpretation. To that end we are investigating methods of analysis of potential biomarker and paleobiomarker compounds and resource materials in soils and rocks pertinent to Martian geology. Differential thermal analysis coupled with gas chromatography is shown to be a highly useful analytical technique for detecting this wide and complex variety of materials.
Analyses of exobiological and potential resource materials in the Martian soil
NASA Technical Reports Server (NTRS)
Mancinelli, Rocco L.; Marshall, John R.; White, Melisa R.
1992-01-01
Potential Martian soil components relevant to exobiology include water, organic matter, evaporites, clays, and oxides. These materials are also resources for human expeditions to Mars. When found in particular combinations, some of these materials constitute diagnostic paleobiomarker suites, allowing insight to be gained into the probability of life originating on Mars. Critically important to exobiology is the method of data analysis and data interpretation. To that end, methods of analysis of potential biomarker and paleobiomarker compounds and resource materials in soils and rocks pertinent to Martian geology are investigated. Differential thermal analysis coupled with gas chromatography is shown to be a highly useful analytical technique for detecting this wide and complex variety of materials.
Tailored multivariate analysis for modulated enhanced diffraction
Caliandro, Rocco; Guccione, Pietro; Nico, Giovanni; ...
2015-10-21
Modulated enhanced diffraction (MED) is a technique allowing the dynamic structural characterization of crystalline materials subjected to an external stimulus, which is particularly suited for in situ and operando structural investigations at synchrotron sources. Contributions from the (active) part of the crystal system that varies synchronously with the stimulus can be extracted by an offline analysis, which can only be applied in the case of periodic stimuli and linear system responses. In this paper a new decomposition approach based on multivariate analysis is proposed. Standard principal component analysis (PCA) is adapted to treat MED data: specific figures of merit based on the scores and loadings are defined, and the directions of the principal components obtained by PCA are modified to maximize these figures of merit. As a result, a general method to decompose MED data, called optimum constrained components rotation (OCCR), is developed, which produces very precise results on simulated data, even in the case of nonperiodic stimuli and/or nonlinear responses. The multivariate analysis approach is able to supply in one shot both the diffraction pattern related to the active atoms (through the OCCR loadings) and the time dependence of the system response (through the OCCR scores). When applied to real data, however, OCCR was able to supply only the latter information, as the former was hindered by changes in the abundances of different crystal phases, which occurred alongside structural variations in the specific case considered. Developing a decomposition procedure able to cope with this combined effect represents the next challenge in MED analysis.
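The idea of rotating PCA directions to maximize a figure of merit can be sketched as below; the roughness criterion used here is an invented stand-in, not one of the figures of merit defined in the paper:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.decomposition import PCA

# Hypothetical MED data: rows = time points during the stimulus, cols = 2-theta bins.
X = np.random.rand(200, 300)
L = PCA(n_components=2).fit(X).components_     # standard PCA loadings, shape (2, n_bins)

def roughness(theta):
    """Stand-in figure of merit: smoothness of the rotated first loading."""
    l1 = np.cos(theta) * L[0] + np.sin(theta) * L[1]   # rotate within the PCA plane
    return np.sum(np.diff(l1) ** 2)

theta = minimize_scalar(roughness, bounds=(0.0, np.pi), method="bounded").x
rotated = np.array([np.cos(theta) * L[0] + np.sin(theta) * L[1],
                    -np.sin(theta) * L[0] + np.cos(theta) * L[1]])
# 'rotated' plays the role of OCCR-style components optimized for a chosen criterion.
```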
NASA Astrophysics Data System (ADS)
Pal, Robert; Beeby, Andrew
2014-09-01
An inverted microscope has been adapted to allow time-gated imaging and spectroscopy to be carried out on samples containing responsive lanthanide probes. The adaptation employs readily available components, including a pulsed light source, time-gated camera, spectrometer and photon counting detector, allowing imaging, emission spectroscopy and lifetime measurements. Each component is controlled by a suite of software written in LabVIEW and is powered via conventional USB ports.
Construction of a Cyber Attack Model for Nuclear Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varuttamaseni, Athi; Bari, Robert A.; Youngblood, Robert
The consideration of how one compromised piece of digital equipment can impact neighboring equipment is critical to understanding the progression of cyber attacks. The degree of influence that one component may have on another depends on a variety of factors, including the sharing of resources such as network bandwidth or processing power, the level of trust between components, and the inclusion of segmentation devices such as firewalls. The interactions among components via mechanisms that are unique to the digital world are not usually considered in traditional PRA. This means potential sequences of events that may occur during an attack may be missed if one were to look only at conventional accident sequences. This paper presents a method where, starting from the initial attack vector, the progression of a cyber attack can be modeled. The propagation of the attack is modeled by considering certain attributes of the digital components in the system. These attributes determine the potential vulnerability of a component to a class of attack and the capability gained by the attackers once they are in control of the equipment. The use of attributes allows similar components (components with the same set of attributes) to be modeled in the same way, thereby reducing the computing resources required for analysis of large systems.
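The attribute-driven propagation described above can be sketched as a breadth-first traversal over a component graph; the components, attributes, and capabilities below are invented for illustration and do not come from the paper:

```python
from collections import deque

# Each component carries vulnerability attributes and the capabilities an
# attacker gains on compromise (all values are hypothetical placeholders).
components = {
    "hmi":  {"vuln": {"remote_exec"},      "gain": {"network_access"}},
    "plc":  {"vuln": {"network_access"},   "gain": {"actuator_control"}},
    "pump": {"vuln": {"actuator_control"}, "gain": set()},
}
links = {"hmi": ["plc"], "plc": ["pump"], "pump": []}

def propagate(entry, capabilities):
    """BFS: a neighbor is compromised if attacker capabilities match its attributes."""
    compromised, queue = {entry}, deque([entry])
    caps = set(capabilities) | components[entry]["gain"]
    while queue:
        node = queue.popleft()
        for nxt in links[node]:
            if nxt not in compromised and components[nxt]["vuln"] & caps:
                compromised.add(nxt)
                caps |= components[nxt]["gain"]   # attacker capabilities grow
                queue.append(nxt)
    return compromised

print(propagate("hmi", {"remote_exec"}))   # -> {'hmi', 'plc', 'pump'}
```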
Signatures of personality on dense 3D facial images.
Hu, Sile; Xiong, Jieyi; Fu, Pengcheng; Qiao, Lu; Tan, Jingze; Jin, Li; Tang, Kun
2017-03-06
It has long been speculated that cues on the human face exist that allow observers to make reliable judgments of others' personality traits. However, direct evidence of association between facial shapes and personality is missing from the current literature. This study assessed the personality attributes of 834 Han Chinese volunteers (405 males and 429 females), utilising the five-factor personality model ('Big Five'), and collected their neutral 3D facial images. Dense anatomical correspondence was established across the 3D facial images in order to allow high-dimensional quantitative analyses of the facial phenotypes. In this paper, we developed a Partial Least Squares (PLS)-based method. We used the composite partial least squares component (CPSLC) to test association between the self-tested personality scores and the dense 3D facial image data, then used principal component analysis (PCA) for further validation. Among the five personality factors, agreeableness and conscientiousness in males and extraversion in females were significantly associated with specific facial patterns. The personality-related facial patterns were extracted and their effects were extrapolated on simulated 3D facial models.
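A minimal PLS-based association test in the spirit of this analysis (using scikit-learn's PLSRegression on random stand-in data rather than the study's landmark coordinates and CPSLC construction) might look like this:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical data: X = flattened 3D landmark coordinates per face,
# y = one self-reported Big Five score per subject.
X = np.random.rand(834, 3000)
y = np.random.rand(834)

pls = PLSRegression(n_components=2).fit(X, y)
scores = pls.x_scores_              # per-subject scores on each latent component
pattern = pls.x_loadings_[:, 0]     # facial pattern most associated with the trait
```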
Modified method to improve the design of Petlyuk distillation columns.
Zapiain-Salinas, Javier G; Barajas-Fernández, Juan; González-García, Raúl
2014-01-01
A response surface analysis was performed to study the effect of the composition and feeding thermal conditions of ternary mixtures on the number of theoretical stages and the energy consumption of Petlyuk columns. A modification of the pre-design algorithm was necessary for this purpose. The modified algorithm provided feasible results in 100% of the studied cases, compared with only 8.89% for the current algorithm. The proposed algorithm allowed us to attain the desired separations regardless of the type of mixture and the operating conditions in the feed stream, something that was not possible with the traditional pre-design method. The results showed that the type of mixture had a great influence on the number of stages and on energy consumption. A higher number of stages and a lower consumption of energy were attained with mixtures rich in the light component, while higher energy consumption occurred when the mixture was rich in the heavy component. The proposed strategy expands the search for an optimal design of Petlyuk columns within a feasible region, which allows us to find a feasible design that meets output specifications and low thermal loads.
Post-flight Analysis of the Argon Filled Ion Chamber
NASA Technical Reports Server (NTRS)
Tai, H.; Goldhagen, P.; Jones, I. W.; Wilson, J. W.; Maiden, D. L.; Shinn, J. L.
2003-01-01
Atmospheric ionizing radiation is a complex mixture of primary galactic and solar cosmic rays and a multitude of secondary particles produced in collisions with air nuclei. The first series of Atmospheric Ionizing Radiation (AIR) measurement flights on the NASA research aircraft ER-2 took place in June 1997. The ER-2 flight package consisted of fifteen instruments from six countries, chosen to provide varying sensitivity to specific components. These AIR ER-2 flight measurements characterize the AIR environment during solar minimum, allowing the continued development of environmental models of this complex mixture of ionizing radiation. This will enable scientists to study the ionizing radiation health hazard associated with the high-altitude operation of a commercial supersonic transport and to estimate single event upsets for advanced avionics systems design. The argon-filled ion chamber measurements, representing about 40 percent of the contribution to radiation risk, are analyzed herein; model discrepancies for the solar minimum environment are on the order of 5 percent or less. Other biologically significant components remain to be analyzed.
Harmane and harmalan are bioactive components of classical clonidine-displacing substance.
Parker, Christine A; Anderson, Neil J; Robinson, Emma S J; Price, Rhiannon; Tyacke, Robin J; Husbands, Stephen M; Dillon, Michael P; Eglen, Richard M; Hudson, Alan L; Nutt, David J; Crump, Matthew P; Crosby, John
2004-12-28
Elucidation of the structure of the endogenous ligand(s) for imidazoline binding sites, clonidine-displacing substance (CDS), has been a major goal for many years. Crude CDS from bovine lung was purified by reverse-phase high-pressure liquid chromatography. Electrospray mass spectrometry (ESMS) and nuclear magnetic resonance (¹H NMR) analysis revealed the presence of L-tryptophan and 1-carboxy-1-methyltetrahydro-β-carboline in the active CDS extract. Competition radioligand binding studies, however, failed to show displacement of specific [³H]clonidine binding to rat brain membranes for either compound. Further purification of the bovine lung extract allowed the isolation of the β-carbolines harmane and harmalan, as confirmed by ESMS, ¹H NMR, and comparison with synthetic standards. Both compounds exhibited a high (nanomolar) affinity for both type 1 and type 2 imidazoline binding sites, and the synthetic standards were shown to coelute with the active classical CDS extracts. We therefore propose that the β-carbolines harmane and harmalan represent active components of classical CDS. The identification of these compounds will allow us to establish clear physiological roles for CDS.
Allen, Samuel J; Ott, Lisa S
2012-07-01
There are a wide and growing variety of feedstocks for biodiesel fuel. Most commonly, these feedstocks contain triglycerides which are transesterified into the fatty acid alkyl esters (FAAEs) which comprise biodiesel fuel. While the transesterification reaction itself is simple, monitoring the reaction progress and reaction products is not. Gas chromatography-mass spectrometry is useful for assessing the FAAE products, but does not directly address either the tri-, di-, or monoglycerides present from incomplete transesterification or the free fatty acids which may also be present. Analysis of the biodiesel reaction mixture is complicated by the solubility and physical property differences among the components of the transesterification reaction mixture. In this contribution, we present a simple, rapid HPLC method which allows for monitoring all of the main components in a biodiesel fuel transesterification reaction, with specific emphasis on the ability to monitor the reaction as a function of time. The utilization of a relatively new, core-shell stationary phase for the HPLC column allows for efficient separation of peaks with short elution times, saving both time and solvent.
NASA Astrophysics Data System (ADS)
Stisen, S.; Demirel, C.; Koch, J.
2017-12-01
Evaluation of performance is an integral part of model development and calibration, and it is also of paramount importance when communicating modelling results to stakeholders and the scientific community. There exists a comprehensive and well-tested toolbox of metrics to assess temporal model performance in the hydrological modelling community. On the contrary, the experience in evaluating spatial performance does not correspond to the wide availability of spatial observations and to the sophisticated model codes simulating the spatial variability of complex hydrological processes. This study aims at making a contribution towards advancing spatial-pattern-oriented model evaluation for distributed hydrological models. This is achieved by introducing a novel spatial performance metric which provides robust pattern performance during model calibration. The promoted SPAtial EFficiency (spaef) metric reflects three equally weighted components: correlation, coefficient of variation, and histogram overlap. This multi-component approach is necessary in order to adequately compare spatial patterns. spaef, its three components individually, and two alternative spatial performance metrics, i.e. connectivity analysis and fractions skill score, are tested in a spatial-pattern-oriented model calibration of a catchment model in Denmark. The calibration is constrained by a remote-sensing-based spatial pattern of evapotranspiration and discharge time series at two stations. Our results stress that stand-alone metrics tend to fail to provide holistic pattern information to the optimizer, which underlines the importance of multi-component metrics. The three spaef components are independent, which allows them to complement each other in a meaningful way. This study promotes the use of bias-insensitive metrics which allow comparison of variables that are related but may differ in unit, in order to optimally exploit the spatial observations made available by remote sensing platforms. We see great potential for spaef across environmental disciplines dealing with spatially distributed modelling.
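A compact implementation of the three-component metric, following the published spaef definition as we understand it (correlation α, ratio of coefficients of variation β, and histogram overlap γ of z-scored values; the bin count is a free choice), might look like this:

```python
import numpy as np

def spaef(sim, obs, bins=100):
    """SPAtial EFficiency sketch: 1 indicates a perfect spatial-pattern match."""
    sim, obs = np.asarray(sim).ravel(), np.asarray(obs).ravel()
    alpha = np.corrcoef(sim, obs)[0, 1]                              # pattern correlation
    beta = (np.std(sim) / np.mean(sim)) / (np.std(obs) / np.mean(obs))  # CV ratio
    zs = (sim - sim.mean()) / sim.std()     # z-scoring makes the histograms
    zo = (obs - obs.mean()) / obs.std()     # comparable across units and bias
    lo, hi = min(zs.min(), zo.min()), max(zs.max(), zo.max())
    hs, _ = np.histogram(zs, bins=bins, range=(lo, hi))
    ho, _ = np.histogram(zo, bins=bins, range=(lo, hi))
    gamma = np.minimum(hs, ho).sum() / ho.sum()                      # histogram overlap
    return 1 - np.sqrt((alpha - 1) ** 2 + (beta - 1) ** 2 + (gamma - 1) ** 2)

# Example with two random fields standing in for simulated and observed ET maps.
rng = np.random.default_rng(1)
print(spaef(rng.normal(2, 0.5, (50, 50)), rng.normal(2, 0.5, (50, 50))))
```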
Napoli, Edoardo M; Siracusa, Laura; Saija, Antonella; Speciale, Antonio; Trombetta, Domenico; Tuttolomondo, Teresa; La Bella, Salvatore; Licata, Mario; Virga, Giuseppe; Leone, Raffaele; Leto, Claudio; Rubino, Laura; Ruberto, Giuseppe
2015-07-01
To identify the best biotypes, an extensive survey of Sicilian wild rosemary was carried out by collecting 57 samples from various sites, followed by taxonomic characterization from an agronomic perspective. All the biotypes collected were classified as Rosmarinus officinalis L. A cluster analysis based on the morphological characteristics of the plants allowed the division of the biotypes into seven main groups, although the characteristics examined were found to be highly similar and not area-dependent. Moreover, all samples were analyzed for their phytochemical content, applying an extraction protocol to obtain the nonvolatile components and hydrodistillation to collect the essential oils for the volatile components. The extracts were characterized by LC-UV-DAD/ESI-MS, and the essential oils by GC-FID and GC/MS analyses. In the nonvolatile fractions, 18 components were identified, namely, 13 flavones, two organic acids, and three diterpenes. In the volatile fractions, a total of 82 components were found, with α-pinene and camphene predominant among the monoterpene hydrocarbons and 1,8-cineole, camphor, borneol, and verbenone predominant among the oxygenated monoterpenes. Cluster analyses were carried out on both phytochemical profiles, allowing the separation of the rosemary samples into different chemical groups. Finally, the total phenol content and the antioxidant activity of the essential oils and extracts were determined with the Folin-Ciocalteu (FC) colorimetric assay, the UV radiation-induced peroxidation in liposomal membranes (UV-IP test), and the scavenging activity toward the superoxide radical (O2·-). The present study confirmed that the essential oils and organic extracts of the Sicilian rosemary samples analyzed showed considerable antioxidant/free radical-scavenging activity. Copyright © 2015 Verlag Helvetica Chimica Acta AG, Zürich.
Component spectra extraction from terahertz measurements of unknown mixtures.
Li, Xian; Hou, D B; Huang, P J; Cai, J H; Zhang, G X
2015-10-20
The aim of this work is to extract component spectra from unknown mixtures in the terahertz region. To that end, a method, hard modeling factor analysis (HMFA), was applied to resolve terahertz spectral matrices collected from the unknown mixtures. This method does not require any expertise of the user and allows the consideration of nonlinear effects such as peak variations or peak shifts. It describes the spectra using a peak-based nonlinear mathematic model and builds the component spectra automatically by recombination of the resolved peaks through correlation analysis. Meanwhile, modifications on the method were made to take the features of terahertz spectra into account and to deal with the artificial baseline problem that troubles the extraction process of some terahertz spectra. In order to validate the proposed method, simulated wideband terahertz spectra of binary and ternary systems and experimental terahertz absorption spectra of amino acids mixtures were tested. In each test, not only the number of pure components could be correctly predicted but also the identified pure spectra had a good similarity with the true spectra. Moreover, the proposed method associated the molecular motions with the component extraction, making the identification process more physically meaningful and interpretable compared to other methods. The results indicate that the HMFA method with the modifications can be a practical tool for identifying component terahertz spectra in completely unknown mixtures. This work reports the solution to this kind of problem in the terahertz region for the first time, to the best of the authors' knowledge, and represents a significant advance toward exploring physical or chemical mechanisms of unknown complex systems by terahertz spectroscopy.
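The peak-based hard model underlying HMFA builds spectra from analytic line shapes; as a minimal, generic sketch (not the HMFA algorithm itself, and with a synthetic band rather than measured terahertz data), one can fit two Lorentzian peaks to a noisy spectrum:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, center, width):
    """Single Lorentzian line, the building block of a peak-based hard model."""
    return amp * width ** 2 / ((x - center) ** 2 + width ** 2)

def two_peaks(x, a1, c1, w1, a2, c2, w2):
    return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

x = np.linspace(0.2, 3.0, 400)                       # THz axis (illustrative)
y_true = two_peaks(x, 1.0, 1.2, 0.08, 0.6, 2.1, 0.12)
y = y_true + 0.01 * np.random.default_rng(1).normal(size=x.size)

# Fitted peak parameters (position, width, amplitude) are what a method like
# HMFA would then recombine into component spectra via correlation analysis.
popt, _ = curve_fit(two_peaks, x, y, p0=[1.0, 1.1, 0.1, 0.5, 2.0, 0.1])
```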
Domain adaptation via transfer component analysis.
Pan, Sinno Jialin; Tsang, Ivor W; Kwok, James T; Yang, Qiang
2011-02-01
Domain adaptation allows knowledge from a source domain to be transferred to a different but related target domain. Intuitively, discovering a good feature representation across domains is crucial. In this paper, we first propose to find such a representation through a new learning method, transfer component analysis (TCA), for domain adaptation. TCA tries to learn some transfer components across domains in a reproducing kernel Hilbert space using maximum mean discrepancy. In the subspace spanned by these transfer components, data properties are preserved and data distributions in different domains are close to each other. As a result, with the new representations in this subspace, we can apply standard machine learning methods to train classifiers or regression models in the source domain for use in the target domain. Furthermore, in order to uncover the knowledge hidden in the relations between the data labels from the source and target domains, we extend TCA in a semisupervised learning setting, which encodes label information into transfer components learning. We call this extension semisupervised TCA. The main contribution of our work is that we propose a novel dimensionality reduction framework for reducing the distance between domains in a latent space for domain adaptation. We propose both unsupervised and semisupervised feature extraction approaches, which can dramatically reduce the distance between domain distributions by projecting data onto the learned transfer components. Finally, our approach can handle large datasets and naturally lead to out-of-sample generalization. The effectiveness and efficiency of our approach are verified by experiments on five toy datasets and two real-world applications: cross-domain indoor WiFi localization and cross-domain text classification.
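The maximum mean discrepancy that TCA minimizes can be written down compactly; the sketch below computes a biased RBF-kernel MMD estimate between two hypothetical domains (the criterion only, not the full TCA eigendecomposition):

```python
import numpy as np

def mmd_rbf(Xs, Xt, gamma=1.0):
    """Squared maximum mean discrepancy with an RBF kernel (biased estimate)."""
    def k(A, B):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
        return np.exp(-gamma * d)
    return k(Xs, Xs).mean() + k(Xt, Xt).mean() - 2 * k(Xs, Xt).mean()

Xs = np.random.rand(50, 3)          # source-domain features (hypothetical)
Xt = np.random.rand(60, 3) + 0.5    # shifted target domain
print(mmd_rbf(Xs, Xt))              # larger value = larger domain discrepancy
```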
Genome-Assisted Prediction of Quantitative Traits Using the R Package sommer.
Covarrubias-Pazaran, Giovanny
2016-01-01
Most traits of agronomic importance are quantitative in nature, and genetic markers have been used for decades to dissect such traits. Recently, genomic selection has earned attention as next generation sequencing technologies became feasible for major and minor crops. Mixed models have become a key tool for fitting genomic selection models, but most current genomic selection software can only include a single variance component other than the error, making hybrid prediction using additive, dominance and epistatic effects unfeasible for species displaying heterotic effects. Moreover, likelihood-based software for fitting mixed models with multiple random effects that allows the user to specify the variance-covariance structure of random effects has not been fully exploited. A new open-source R package called sommer is presented to facilitate the use of mixed models for genomic selection and hybrid prediction purposes, using more than one variance component and allowing specification of covariance structures. The use of sommer for genomic prediction is demonstrated through several examples using maize and wheat genotypic and phenotypic data. At its core, the program contains three algorithms for estimating variance components: Average Information (AI), Expectation-Maximization (EM) and Efficient Mixed Model Association (EMMA). Kernels for calculating the additive, dominance and epistatic relationship matrices are included, along with other useful functions for genomic analysis. Results from sommer were comparable to those of other software, but the analysis was faster than Bayesian counterparts by a margin of hours to days. In addition, the ability to deal with missing data, combined with greater flexibility and speed than other REML-based software, was achieved by putting together some of the most efficient algorithms to fit models in an accessible environment such as R.
NASA Astrophysics Data System (ADS)
Thomas, R. G.; Berry, K.; Arrigo, J.; Hooper, R. P.
2013-12-01
Technical 'hands-on' training workshops are designed to bring together scientists, technicians, and program managers from universities, government agencies, and the private sector to discuss methods used and advances made in instrumentation and data analysis. Through classroom lectures and discussions combined with a field-day component, hands-on workshop participants get a 'full life cycle' perspective, from instrumentation concepts and deployment to data analysis. Using film to document this process is becoming increasingly popular, allowing scientists to add a story-telling component to their research. With the availability of high-quality and low-priced professional video equipment and editing software, scientists are becoming digital storytellers. The science video developed from the 'hands-on' workshop, Optical Water Quality Sensors for Nutrients: Concepts, Deployment, and Analysis, encapsulates the objectives of technical training workshops for participants. Through the use of still photography, video, interviews, and sound, the short video, An Introduction to CUAHSI's Hands-on Workshops, produced by a co-instructor of the workshop, acts as a multi-purpose tool. The 10-minute piece provides an overview of workshop field-day activities and works to bridge the gap between classroom learning, instrumentation application, and data analysis. CUAHSI 'hands-on' technical workshops have been collaboratively executed with faculty from several universities and with the U.S. Geological Survey. The video was designed to attract new participants to these professional development workshops, to stimulate a connection with the environment, to act as a workshop legacy resource, and to serve as a guide for prospective hands-on workshop organizers. The effective use of film and short videos in marketing scientific programs, such as technical trainings, allows scientists to visually demonstrate the technologies currently being employed and to provide a more intriguing perspective on scientific research.
NASA Astrophysics Data System (ADS)
Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.
2014-12-01
The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to include climate change impacts into mitigation and adaptation strategies. However, there are few processes in place to conduct quantitative assessments of forest conditions in relation to mountain hydrology, while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) to allow ease of access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. This data is featured within three components of the DSS: the Mapping Viewer, Statistical Analysis Portal, and Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once the areas of interest are targeted, the Statistical Analysis Portal provides subbasin level statistics for each variable over time by utilizing a recently developed web-based data analysis and visualization tool called Plotly. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without the need to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.
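A minimal example of the kind of in-browser charting that Plotly enables (with an invented subbasin time series standing in for the DSS datasets) is:

```python
import pandas as pd
import plotly.express as px

# Hypothetical subbasin snowmelt record; in the DSS, such series would come
# from the Statistical Analysis Portal rather than being constructed locally.
df = pd.DataFrame({"year": list(range(1980, 2011)),
                   "snowmelt_mm": [500 - 3 * i for i in range(31)]})

fig = px.line(df, x="year", y="snowmelt_mm",
              title="Subbasin snowmelt trend (illustrative)")
fig.show()   # interactive chart in the browser, no dataset download required
```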
High Fidelity System Simulation of Multiple Components in Support of the UEET Program
NASA Technical Reports Server (NTRS)
Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton
2006-01-01
The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized the High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry, which was then extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have certain benefits and applicability for specific applications. "Feedback" zooming allows the flow-up of information from a high-fidelity analysis to be used to update the NPSS model results by forcing the NPSS solver to converge to the high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy to flow up the high-fidelity analysis results to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful in terms of enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.
49 CFR 230.24 - Maximum allowable stress.
Code of Federal Regulations, 2010 CFR
2010-10-01
§ 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
49 CFR 230.24 - Maximum allowable stress.
Code of Federal Regulations, 2012 CFR
2012-10-01
§ 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
49 CFR 230.24 - Maximum allowable stress.
Code of Federal Regulations, 2014 CFR
2014-10-01
§ 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
49 CFR 230.24 - Maximum allowable stress.
Code of Federal Regulations, 2013 CFR
2013-10-01
§ 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
49 CFR 230.24 - Maximum allowable stress.
Code of Federal Regulations, 2011 CFR
2011-10-01
§ 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...
2014-01-01
Background The possibility of applying a novel chemometric approach which could allow the differentiation of marble samples, all from different quarries located in the Mediterranean basin and frequently used in ancient times for artistic purposes, was investigated. By suggesting tentative attributions or allowing unlikely ones to be ruled out, this kind of differentiation could indeed be of valuable support to restorers and other professionals in the field of cultural heritage. Experimental data were obtained using only thermal analytical techniques: Thermogravimetry (TG), Derivative Thermogravimetry (DTG) and Differential Thermal Analysis (DTA). Results The extraction of kinetic parameters from the curves obtained using these thermal analytical techniques allowed Activation Energy values to be evaluated together with the logarithm of the Arrhenius pre-exponential factor of the main TG-DTG process. The data thus obtained, after subsequent chemometric evaluation (using Principal Components Analysis), have already proved useful in identifying the quarry of origin of a small number of archaeological marble finds. Conclusion One of the most evident advantages of the thermoanalytical-chemometric approach adopted is that it allows the confident identification of an unknown find composed of a marble known to be present among the reference samples considered, that is, contained in the reference file. With equal certainty, on the other hand, it prevents erroneous or highly uncertain identifications when the find being tested does not belong to the reference file considered. PMID:24982691
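The kinetic-parameter extraction rests on the Arrhenius relation ln k = ln A - Ea/(RT), so a straight-line fit of ln k against 1/T yields both parameters; the rate constants below are invented placeholders, not TG-DTG data from the study:

```python
import numpy as np

R = 8.314                                        # gas constant, J mol^-1 K^-1

# Hypothetical rate constants k(T) for the main TG-DTG mass-loss step.
T = np.array([900.0, 950.0, 1000.0, 1050.0])     # K
k = np.array([1.2e-4, 5.8e-4, 2.3e-3, 8.1e-3])   # s^-1

# ln k = ln A - Ea/(R T): slope gives -Ea/R, intercept gives ln A.
slope, lnA = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R

print(Ea / 1000.0, lnA)   # activation energy (kJ/mol) and log pre-exponential factor
```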
NASA Technical Reports Server (NTRS)
Koumal, D. E.
1979-01-01
The design and evaluation of built-up attachments and bonded joint concepts for use at elevated temperatures are documented. Joint concept screening, verification of GR/PI material, fabrication of design allowables panels, definition of test matrices, and analysis of bonded and bolted joints are among the tasks completed. The results provide data for the design and fabrication of lightly loaded components for advanced space transportation systems and high-speed aircraft.
ChemCam for Mars Science Laboratory rover, undergoing pre-flight testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2011-10-20
Los Alamos National Laboratory and partners developed a laser instrument, ChemCam, that will ride on the elevated mast of the Mars Science Laboratory rover Curiosity. The system allows Curiosity to "zap" rocks from a distance, reading their chemical composition through spectroscopic analysis. In this video, laboratory shaker-table testing of the instrument ensures that all of its components are solidly attached and resistant to damage from the rigors of launch, travel and landing.
ChemCam for Mars Science Laboratory rover, undergoing pre-flight testing
None
2018-06-06
Los Alamos National Laboratory and partners developed a laser instrument, ChemCam, that will ride on the elevated mast of the Mars Science Laboratory rover Curiosity. The system allows Curiosity to "zap" rocks from a distance, reading their chemical composition through spectroscopic analysis. In this video, laboratory shaker-table testing of the instrument ensures that all of its components are solidly attached and resistant to damage from the rigors of launch, travel and landing.
Equilibrium Phase Behavior of the Square-Well Linear Microphase-Forming Model.
Zhuang, Yuan; Charbonneau, Patrick
2016-07-07
We have recently developed a simulation approach to calculate the equilibrium phase diagram of particle-based microphase formers. Here, this approach is used to calculate the phase behavior of the square-well linear model for different strengths and ranges of the linear long-range repulsive component. The results are compared with various theoretical predictions for microphase formation. The analysis further allows us to better understand the mechanism for microphase formation in colloidal suspensions.
Synthesis and characterization of polycrystalline CdSiP2
NASA Astrophysics Data System (ADS)
Bereznaya, S. A.; Korotchenko, Z. V.; Sarkisov, S. Yu; Korolkov, I. V.; Kuchumov, B. M.; Saprykin, A. I.; Atuchin, V. V.
2018-05-01
A modified method is proposed for the synthesis of the CdSiP2 compound from elemental starting components. The developed technique allows the synthesis process to be completed within 30 h. The phase and chemical composition of the synthesized material were confirmed by x-ray diffraction analysis and scanning electron microscopy with energy-dispersive spectroscopy. A transparent crystal block sized 3 × 3 × 2 mm³ was cut from the polycrystalline ingot and characterized by optical methods.
The corona of the broad-line radio galaxy 3C 390.3
Lohfink, A. M.; Ogle, P.; Tombesi, F.; ...
2015-11-13
We present the results from a joint Suzaku/NuSTAR broadband spectral analysis of 3C 390.3. The high-quality data enable us to clearly separate the primary continuum from the reprocessed components, allowing us to detect a high-energy spectral cut-off ($E_{\mathrm{cut}} = 117^{+18}_{-14}$ keV) and to place constraints on the Comptonization parameters of the primary continuum for the first time. The hard-over-soft compactness is $69^{+124}_{-24}$ and the optical depth is $4.1^{+0.5}_{-3.6}$; this leads to an electron temperature of $30^{+32}_{-8}$ keV. Expanding our study of the Comptonization spectrum to the optical/UV by studying the simultaneous Swift-UVOT data, we find indications that the compactness of the corona allows only a small fraction of the total UV/optical flux to be Comptonized. Our analysis of the reprocessed emission shows that 3C 390.3 has only a small amount of reflection (R ~ 0.3), and of that the vast majority is from distant neutral matter. Furthermore, we also discover a soft X-ray excess in the source, which can be described by a weak ionized reflection component from the inner parts of the accretion disk. In addition to the backscattered emission, we also detect the highly ionized iron emission lines Fe xxv and Fe xxvi.
Propagation path effects for rayleigh and love waves. Semi-annual technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herrin, E.; Goforth, T.
Seismic surface waves are usually composed of overlapping wave trains representing multi-path propagation. A first task in the analysis of such waves is to identify and separate the various component wave trains so that each can be analyzed separately. Phase-matched filters are a class of linear filters in which the Fourier phase of the filter is made equal to that of a given signal. The authors previously described an iterative technique which can be used to find a phase-matched filter for a particular component of a seismic signal. Application of the filters to digital records of Rayleigh waves allowed multiple arrivals to be identified and removed, and allowed recovery of the complex spectrum of the primary wave train along with its apparent group velocity dispersion curve. A comparable analysis of Love waves presents additional complications. Love waves are contaminated by both Love and Rayleigh multipathing and by primary off-axis Rayleigh energy. In the case of explosions, there is much less energy generated as Love waves than as Rayleigh waves. The applicability of phase-matched filtering to Love waves is demonstrated by its use on earthquakes occurring in the Norwegian Sea and near Iceland and on a nuclear explosion in Novaya Zemlya. Despite severe multipathing in two of the three events, the amplitude and phase of each of the primary Love waves were recovered without significant distortion.
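The core of a phase-matched filter is to remove the reference wave train's Fourier phase so that it compresses to a pulse at zero lag, leaving multipath arrivals dispersed and easy to window out; a one-pass sketch (the technique described in the report is iterative) is:

```python
import numpy as np

def phase_matched_filter(record, reference):
    """Apply a filter whose Fourier phase equals that of the reference signal.

    record, reference: equal-length 1-D arrays; in practice the reference phase
    would be built from an assumed group-velocity dispersion curve."""
    S = np.fft.rfft(record)
    phi = np.angle(np.fft.rfft(reference))
    # Subtracting the reference phase compresses the matching wave train
    # toward zero lag; other arrivals remain spread out in time.
    return np.fft.irfft(S * np.exp(-1j * phi), n=len(record))
```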
Chaton, Catherine T.
2017-01-01
Sedimentation velocity analytical ultracentrifugation (SV-AUC) has seen a resurgence in popularity as a technique for characterizing macromolecules and complexes in solution. SV-AUC is a particularly powerful tool for studying protein conformation, complex stoichiometry, and interacting systems in general. Deconvoluting velocity data to determine a sedimentation coefficient distribution c(s) allows for the study of either individual proteins or multi-component mixtures. The standard c(s) approach estimates molar masses of the sedimenting species based on determination of the frictional ratio (f/f0) from boundary shapes. The frictional ratio in this case is a weight-averaged parameter, which can lead to distortion of mass estimates and loss of information when attempting to analyze mixtures of macromolecules with different shapes. A two-dimensional extension of the c(s) analysis approach provides size-and-shape distributions that describe the data in terms of a sedimentation coefficient and frictional ratio grid. This allows for better resolution of species with very distinct shapes that may co-sediment and provides better molar mass determinations for multi-component mixtures. An example case is illustrated using globular and non-globular proteins of different masses with nearly identical sedimentation coefficients that could only be resolved using the size-and-shape distribution. Other applications of this analytical approach to complex biological systems are presented, focusing on proteins involved in the innate immune response to cytosolic microbial DNA. PMID:26412652
The genetic analysis of functional connectomics in Drosophila
Meinertzhagen, Ian A.; Lee, Chi-Hon
2014-01-01
Fly and vertebrate nervous systems share many organizational characteristics, such as layers, columns and glomeruli, and utilize similar synaptic components, such as ion channels and receptors. Both also exhibit similar network features. Recent technological advances, especially in electron microscopy, now allow us to determine synaptic circuits and identify pathways cell-by-cell, as part of the fly's connectome. Genetic tools provide the means to identify synaptic components, as well as to record and manipulate neuronal activity, adding function to the connectome. This review discusses technical advances in these emerging areas of functional connectomics, offering prognoses in each and identifying the challenges in bridging structural connectomics to molecular biology and synaptic physiology, thereby determining the fundamental computational mechanisms that underlie behaviour. PMID:23084874
Exploring Galaxy Formation and Evolution via Structural Decomposition
NASA Astrophysics Data System (ADS)
Kelvin, Lee; Driver, Simon; Robotham, Aaron; Hill, David; Cameron, Ewan
2010-06-01
The Galaxy And Mass Assembly (GAMA) structural decomposition pipeline (GAMA-SIGMA: Structural Investigation of Galaxies via Model Analysis) will provide multi-component information for a sample of ~12,000 galaxies across 9 bands ranging from the near-UV to the near-IR. This will allow the relationship between structural properties and broadband, optical-to-near-IR spectral energy distributions of bulge, bar, and disk components to be explored, revealing clues as to the history of baryonic mass assembly within a hierarchical clustering framework. Data are initially taken from the SDSS & UKIDSS-LAS surveys to test the robustness of our automated decomposition pipeline. These will eventually be replaced with data from the forthcoming higher-resolution VST & VISTA surveys, expanding the sample to ~30,000 galaxies.
Clinical care costing method for the Clinical Care Classification System.
Saba, Virginia K; Arnold, Jean M
2004-01-01
The objective is to provide a means for calculating the cost of nursing care using the Clinical Care Classification System (CCCS). The method combines the three CCCS indicators of care components, actions, and outcomes with Clinical Care Pathways (CCPs). The cost of patient care is based on the type of action time multiplied by the care components and nursing costs. The Clinical Care Costing Method (CCCM) for the CCCS makes it possible to measure and cost out clinical practice, and may be used with CCPs in the electronic patient medical record. The CCPs make it easy to track clinical nursing care across time, settings, population groups, and geographical locations. Collected data may be used many times, allowing for improved documentation, analysis, and costing out of care.
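The stated costing rule is a simple product, sketched below as a toy calculation; the action types, standard times, and hourly rate are all invented for illustration and are not taken from the CCCS.

```python
# Hypothetical illustration of the costing rule stated above:
# cost = standard time for the action type (hours)
#        x number of care components x nursing cost rate (per hour).
ACTION_HOURS = {"assess": 0.5, "perform": 1.0, "teach": 0.75, "manage": 0.25}
NURSE_RATE = 40.0  # currency units per hour, hypothetical

def intervention_cost(action_type, n_components):
    return ACTION_HOURS[action_type] * n_components * NURSE_RATE

# e.g. a pathway step touching 3 care components with a "perform" action:
print(intervention_cost("perform", 3))  # -> 120.0
```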
Capacitance probe for detection of anomalies in non-metallic plastic pipe
Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.
2010-11-23
The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.
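The patent text above does not spell out the comparison algorithm; one plausible reading is to flag points where the capacitance-versus-location trace deviates from a running baseline. A hedged sketch, with the window size and threshold chosen arbitrarily:

```python
import numpy as np

def flag_anomalies(location, capacitance, window=25, k=4.0):
    """Flag positions where capacitance deviates from a running-median
    baseline by more than k robust standard deviations; `window` and `k`
    are illustrative tuning choices, not taken from the patent."""
    c = np.asarray(capacitance, float)
    pad = window // 2
    padded = np.pad(c, pad, mode="edge")
    baseline = np.array([np.median(padded[i:i + window]) for i in range(c.size)])
    resid = c - baseline
    sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # MAD scale
    return np.asarray(location)[np.abs(resid) > k * sigma]

# Synthetic trace: smooth pipe response plus a localized defect near x = 7 m.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 500)
cap = 100 + 0.5 * np.sin(x) + rng.normal(0, 0.05, x.size)
cap[(x > 6.9) & (x < 7.1)] += 2.0
print(flag_anomalies(x, cap))
```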
Application of SERS spectroscopy for detection of trace components in urinary deposits
NASA Astrophysics Data System (ADS)
Pucetaite, Milda; Velicka, Martynas; Tamosaityte, Sandra; Sablinskas, Valdas
2014-03-01
Surface-enhanced Raman scattering (SERS) spectroscopy can be a useful tool for disease diagnosis and prevention. The advantage of SERS over conventional Raman spectroscopy is its significantly increased signal (by a factor of up to 10^6-10^8), which allows detection of trace amounts of substances in a sample. So far, this technique has been successfully used for the analysis of food, works of art and various biochemical/biomedical samples. In this work, we explore the possibility of applying SERS spectroscopy to the detection of trace components in urinary deposits. Early discovery, together with identification of the exact chemical composition of urinary sediments, could be crucial for taking appropriate preventive measures that inhibit kidney stone formation or growth. In this initial study, SERS spectra (excitation wavelength: 1064 nm) of the main components of urinary deposits (calcium oxalate, uric acid, cystine, etc.) were recorded using silver (Ag) colloid. Spectra of 10^-3-10^-5 M solutions were obtained. While little or no Raman signal was detected without the Ag colloid, characteristic peaks of the substances could be clearly separated in the SERS spectra. This suggests that even small amounts of the components could be detected and taken into account when determining the type of kidney stone forming in the urinary system. We found for the first time that trace amounts of the components constituting urinary deposits can be detected by SERS spectroscopy. In a future study, the analysis of centrifuged urine samples will be carried out.
NASA Astrophysics Data System (ADS)
Thompson, A. P.; Swiler, L. P.; Trott, C. R.; Foiles, S. M.; Tucker, G. J.
2015-03-01
We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum.
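The linear fit at the heart of SNAP is compact enough to sketch. The "bispectrum" features below are random placeholders standing in for the real descriptors (which LAMMPS computes from the local neighbor density); only the weighted least-squares step described above is shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n_configs, n_bispec = 200, 30
# Placeholder feature matrix: row i holds the bispectrum components summed
# over the atoms of configuration i (random stand-ins here).
B = rng.normal(size=(n_configs, n_bispec))
E_qm = B @ rng.normal(size=n_bispec) + rng.normal(0, 0.01, n_configs)

# Weighted least squares for the linear SNAP coefficients:
# minimize sum_i w_i * (B[i] . c - E_qm[i])**2
w = np.full(n_configs, 1.0)                 # per-configuration weights
sw = np.sqrt(w)
coeffs, *_ = np.linalg.lstsq(B * sw[:, None], E_qm * sw, rcond=None)
print("max energy residual:", np.abs(B @ coeffs - E_qm).max())
```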
Mantini, Dante; Petrucci, Francesca; Del Boccio, Piero; Pieragostino, Damiana; Di Nicola, Marta; Lugaresi, Alessandra; Federici, Giorgio; Sacchetta, Paolo; Di Ilio, Carmine; Urbani, Andrea
2008-01-01
Independent component analysis (ICA) is a signal processing technique that can be utilized to recover independent signals from a set of their linear mixtures. We propose ICA for the analysis of signals obtained from large proteomics investigations, such as clinical multi-subject studies based on MALDI-TOF MS profiling. The method is validated on simulated and experimental data, demonstrating its capability of correctly extracting protein profiles from MALDI-TOF mass spectra. A comparison of peak detection with an open-source and two commercial methods shows its superior reliability in reducing the false discovery rate of protein peak masses. Moreover, the integration of ICA with statistical tests for detecting differences in peak intensities between experimental groups allows the identification of protein peaks that could be indicators of a diseased state. This data-driven approach proves to be a promising tool for biomarker-discovery studies based on MALDI-TOF MS technology. The MATLAB implementation of the method described in the article and both the simulated and experimental data are freely available at http://www.unich.it/proteomica/bioinf/.
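The unmixing step can be illustrated with scikit-learn's FastICA standing in for the authors' MATLAB implementation; the two-component simulated spectra below are purely illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
mz = np.linspace(1000, 12000, 2000)
gauss = lambda centers: sum(np.exp(-((mz - c) / 15.0) ** 2) for c in centers)
# Two independent "protein profiles" mixed with random weights per subject.
sources = np.vstack([gauss([2500, 6100]), gauss([4200, 9800])])   # (2, 2000)
mixing = rng.uniform(0.2, 1.0, size=(40, 2))                      # 40 subjects
spectra = mixing @ sources + rng.normal(0, 0.005, (40, mz.size))

# Treat m/z bins as samples and subjects as features, so the recovered
# sources are profiles over m/z and the mixing matrix holds subject loadings.
ica = FastICA(n_components=2, random_state=0)
profiles = ica.fit_transform(spectra.T)     # (2000, 2) independent profiles
loadings = ica.mixing_                      # (40, 2) per-subject weights
# A statistical test on `loadings` between experimental groups would then
# flag candidate biomarker peaks in the corresponding `profiles`.
```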
Blau, Ashley; Brown, Alison; Mahanta, Lisa; Amr, Sami S.
2016-01-01
The Translational Genomics Core (TGC) at Partners Personalized Medicine (PPM) serves as a fee-for-service core laboratory for Partners Healthcare researchers, providing access to technology platforms and analysis pipelines for genomic, transcriptomic, and epigenomic research projects. The interaction of the TGC with various components of PPM provides it with a unique infrastructure that allows for greater IT and bioinformatics opportunities, such as sample tracking and data analysis. The following article describes some of the unique opportunities available to an academic research core operating within PPM, such as the ability to develop analysis pipelines with a dedicated bioinformatics team and maintain a flexible Laboratory Information Management System (LIMS) with the support of an internal IT team, as well as the operational challenges encountered in responding to emerging technologies, diverse investigator needs, and high staff turnover. In addition, the implementation and operational role of the TGC in the Partners Biobank genotyping project of over 25,000 samples is presented as an example of core activities working with other components of PPM. PMID:26927185
Ping-Keng Jao; Yuan-Pin Lin; Yi-Hsuan Yang; Tzyy-Ping Jung
2015-08-01
An emerging challenge for emotion classification using electroencephalography (EEG) is how to effectively alleviate day-to-day variability in raw data. This study employed robust principal component analysis (RPCA) to address the problem, with the hypothesis that background, emotion-irrelevant EEG perturbations account for much of the variability across days and tend to submerge emotion-related EEG dynamics. The empirical results of this study clearly validated the hypothesis and demonstrated the RPCA's feasibility through the analysis of a five-day dataset of 12 subjects. The RPCA made it possible to separate the sparse emotion-relevant EEG dynamics from the accompanying background perturbations across days. Subsequently, leveraging the RPCA-purified EEG trials from more days steadily improved the emotion-classification performance, which was not found in the case using the raw EEG features. Therefore, incorporating the RPCA into existing emotion-aware machine-learning frameworks on a longitudinal dataset of each individual may shed light on the development of a robust affective brain-computer interface (ABCI) that can alleviate ecological inter-day variability.
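Here RPCA denotes the low-rank-plus-sparse decomposition. Below is a generic principal component pursuit sketch via ADMM, a textbook scheme rather than necessarily the exact algorithm of the paper; applied to a trials-by-features EEG matrix, the low-rank part would model the shared background and the sparse part the trial-specific, emotion-related deviations.

```python
import numpy as np

def rpca(X, lam=None, tol=1e-7, max_iter=500):
    """Principal component pursuit by ADMM: X ~ L + S with L low rank
    (shared background structure) and S sparse (trial-specific deviations)."""
    X = np.asarray(X, float)
    m, n = X.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = m * n / (4.0 * np.abs(X).sum())
    shrink = lambda M, tau: np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)
    L = np.zeros_like(X); S = np.zeros_like(X); Y = np.zeros_like(X)
    for _ in range(max_iter):
        U, s, Vt = np.linalg.svd(X - S + Y / mu, full_matrices=False)
        L = (U * shrink(s, 1.0 / mu)) @ Vt        # singular-value thresholding
        S = shrink(X - L + Y / mu, lam / mu)      # elementwise soft threshold
        Y += mu * (X - L - S)
        if np.linalg.norm(X - L - S) <= tol * np.linalg.norm(X):
            break
    return L, S
```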
Internal Dynamics and Chiral Analysis of Pulegone, Using Microwave Broadband Spectroscopy
NASA Astrophysics Data System (ADS)
Krin, Anna; Perez, Cristobal; Schnell, Melanie; Quesada-Moreno, María del Mar; López-González, Juan Jesús; Avilés-Moreno, Juan Ramón; Pinacho, Pablo; Blanco, Susana; Lopez, Juan Carlos
2017-06-01
Essential oils, such as peppermint or pennyroyal oil, are widely used in medicine, pharmacology and cosmetics. Their major constituents, terpenes, are mostly chiral molecules and thus may exhibit different biological functionality with respect to their enantiomers. Here, we present recent results on the enantiomers of pulegone, one of the components of the peppermint (Mentha piperita L.) and pennyroyal (Mentha pulegium) essential oils, using the microwave three-wave mixing (M3WM) technique. M3WM relies on the fact that the scalar triple product of the dipole moment components μ_{a}, μ_{b} and μ_{c} differs in sign between the enantiomers. A loop of three dipole-allowed rotational transitions is required for the analysis of a chiral molecule. Since the recorded signal will be exactly out of phase for the two enantiomers, an unambiguous differentiation between them is possible, even in complex mixtures. In addition to the chiral analysis of pulegone, its internal dynamics, resulting from the independent rotation of two of its three methyl groups, will be discussed. Moreover, a cluster of pulegone with one water molecule will be presented.
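The sign argument is easy to make concrete: for the three scalar dipole components the triple product reduces to μa·μb·μc, and a reflection flips the sign of one component. A minimal sketch with arbitrary illustrative values (in Debye):

```python
import numpy as np

mu = np.array([1.2, -0.8, 0.5])    # (mu_a, mu_b, mu_c) of one enantiomer

def triple_product(m):
    # mu_a * mu_b * mu_c changes sign under reflection, i.e. between
    # enantiomers, which M3WM detects as a 180-degree phase flip.
    return m[0] * m[1] * m[2]

mirror = mu * np.array([1, 1, -1])  # reflection flips one component
print(triple_product(mu), triple_product(mirror))  # same magnitude, opposite sign
```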
NASA Astrophysics Data System (ADS)
Hoynant, G.
2007-12-01
Fourier analysis allows periodic components in a time series of measurements to be identified in the form of a spectrum of the periodic components mathematically included in the series. Reading a spectrum is often delicate, and contradictory interpretations can be presented in some cases, as for the luminosity of the Seyfert galaxy NGC 4151 despite the very large number of observations since 1968. The present study identifies the causes of these difficulties through an experimental approach based on the analysis of synthetic series containing a single periodic component. The total duration of the campaign must be long compared to the periods to be identified: this ratio governs the separation capability of the spectral analysis. A large number of observations is obviously favourable, but the intervals between measurements are not critical: the analysis can accommodate intervals significantly longer than the periods to be identified. However, interruptions along the campaign, with separate sessions of observations, make the physical understanding of the analysis difficult and sometimes impossible. An analysis performed on such an imperfect series shows peaks that reflect not the signal itself but the chronological distribution of the measurements. These chronological peaks become numerous and prominent when there are vacant periods in the campaign. One method for authenticating a peak as a peak of the signal is to cut the chronological series into pieces of the same length as the period to be identified and to superpose all these pieces. The present study shows that some chronological peaks can exhibit superposition graphics almost as clear as those of the signal peaks. In practice, the search for periodic components requires organising the campaign specifically, with a neutral chronological distribution of measurements and no vacant periods; authenticating a peak as a peak of the signal requires a dominating amplitude or a periodic-superposition graphic significantly clearer than that of any peak with a comparable or larger amplitude.
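The abstract does not prescribe an estimator; the Lomb-Scargle periodogram (scipy.signal.lombscargle) is a standard spectral tool for unevenly sampled campaigns and reproduces the effect described. A sketch with an invented campaign:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
P = 13.0                                          # true period, days
t_full = np.sort(rng.uniform(0, 400, 300))        # continuous campaign
t_gap = t_full[(t_full < 120) | (t_full > 280)]   # campaign with a long gap

for label, t in (("continuous", t_full), ("gapped", t_gap)):
    y = np.sin(2 * np.pi * t / P) + rng.normal(0, 0.2, t.size)
    w = 2 * np.pi * np.linspace(0.005, 0.2, 2000)   # angular frequencies
    p = lombscargle(t, y - y.mean(), w)
    top = np.sort(w[np.argsort(p)[-3:]] / (2 * np.pi))
    print(label, "strongest frequencies (1/day):", np.round(top, 4))
# The gapped campaign tends to show extra strong peaks tied to the sampling
# pattern, alongside the true peak at 1/13 ~ 0.0769 per day.
```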
EPE analysis of sub-N10 BEoL flow with and without fully self-aligned via using Coventor SEMulator3D
NASA Astrophysics Data System (ADS)
Franke, Joern-Holger; Gallagher, Matt; Murdoch, Gayle; Halder, Sandip; Juncker, Aurelie; Clark, William
2017-03-01
During the last few decades, the semiconductor industry has been able to scale device performance up while driving costs down. What started off as simple geometrical scaling, driven mostly by advances in lithography, has recently been accompanied by advances in processing techniques and in device architectures. The trend to combine efforts using process technology and lithography is expected to intensify as further scaling becomes ever more difficult. One promising component of future nodes is "scaling boosters", i.e. processing techniques that enable further scaling. An indispensable component in developing these ever more complex processing techniques is semiconductor process modeling software. Visualization of complex 3D structures in SEMulator3D, along with budget analysis on film thicknesses, CD and etch budgets, allows process integrators to compare flows before any physical wafers are run. Hundreds of "virtual" wafers allow comparison of different processing approaches, along with EUV or DUV patterning options for defined layers and different overlay schemes. This "virtual fabrication" technology enables massively parallel process variation studies that would be highly time-consuming or expensive in experiment. Here, we focus on one particular scaling booster, the fully self-aligned via (FSAV). We compare metal-via-metal (me-via-me) chains with self-aligned and fully self-aligned vias using a calibrated model for imec's N7 BEoL flow. To model overall variability, 3D Monte Carlo modeling of as many variability sources as possible is critical. We use Coventor SEMulator3D to extract minimum me-me distances and contact areas and show how fully self-aligned vias allow better me-via distance control and tighter via-me contact-area variability compared with the standard self-aligned via (SAV) approach.
NASA Astrophysics Data System (ADS)
Silberman, L.; Dekel, A.; Eldar, A.; Zehavi, I.
2001-08-01
We allow for nonlinear effects in the likelihood analysis of galaxy peculiar velocities and obtain ~35% lower values for the cosmological density parameter $\Omega_m$ and for the amplitude of mass density fluctuations $\sigma_8 \Omega_m^{0.6}$. This result is obtained under the assumption that the power spectrum in the linear regime is of the flat ΛCDM model ($h = 0.65$, $n = 1$, COBE normalized) with only $\Omega_m$ as a free parameter. Since the likelihood is driven by the nonlinear regime, we "break" the power spectrum at $k_b \sim 0.2\,(h^{-1}\,\mathrm{Mpc})^{-1}$ and fit a power law at $k > k_b$. This allows for independent matching of the nonlinear behavior and an unbiased fit in the linear regime. The analysis assumes Gaussian fluctuations and errors and a linear relation between velocity and density. Tests using mock catalogs that properly simulate nonlinear effects demonstrate that this procedure results in a reduced bias and a better fit. We find for the Mark III and SFI data $\Omega_m = 0.32 \pm 0.06$ and $0.37 \pm 0.09$, respectively, with $\sigma_8 \Omega_m^{0.6} = 0.49 \pm 0.06$ and $0.63 \pm 0.08$, in agreement with constraints from other data. The quoted 90% errors include distance errors and cosmic variance, for fixed values of the other parameters. The improvement in the likelihood due to the nonlinear correction is very significant for Mark III and moderately significant for SFI. When allowing deviations from ΛCDM, we find an indication for a wiggle in the power spectrum: an excess near $k \sim 0.05\,(h^{-1}\,\mathrm{Mpc})^{-1}$ and a deficiency at $k \sim 0.1\,(h^{-1}\,\mathrm{Mpc})^{-1}$, or a "cold flow." This may be related to the wiggle seen in the power spectrum from redshift surveys and the second peak in the cosmic microwave background (CMB) anisotropy. A $\chi^2$ test applied to modes of a principal component analysis (PCA) shows that the nonlinear procedure improves the goodness of fit and reduces a spatial gradient that was of concern in the purely linear analysis. The PCA allows us to address spatial features of the data and to evaluate and fine-tune the theoretical and error models. It demonstrates in particular that the models used are appropriate for the cosmological parameter estimation performed. We address the potential for optimal data compression using PCA.
NASA Astrophysics Data System (ADS)
Liang, B.; Iwnicki, S. D.; Zhao, Y.
2013-08-01
The power spectrum is defined as the square of the magnitude of the Fourier transform (FT) of a signal. The advantage of FT analysis is that it allows the decomposition of a signal into individual periodic frequency components and establishes the relative intensity of each component. It is the most commonly used signal processing technique today. If the same principle is applied to detect periodic components within a Fourier spectrum itself, the process is called cepstrum analysis. Cepstrum analysis is a very useful tool for detecting families of harmonics with uniform spacing or the families of sidebands commonly found in gearbox, bearing and engine vibration fault spectra. Higher-order spectra (HOS), also known as polyspectra, consist of higher-order moments of spectra and are able to detect non-linear interactions between frequency components. Of the HOS, the most commonly used is the bispectrum. The bispectrum is a third-order frequency-domain measure, which contains information that standard power spectral analysis techniques cannot provide. It is well known that neural networks can represent complex non-linear relationships, and therefore they are extremely useful for fault identification and classification. This paper presents an application of the power spectrum, cepstrum, bispectrum and neural networks for fault pattern extraction of induction motors. The potential for using the power spectrum, cepstrum, bispectrum and neural networks as a means of differentiating between healthy and faulty induction motor operation is examined. A series of experiments is presented, and the advantages and disadvantages of the methods are discussed. It has been found that a combination of power spectrum, cepstrum and bispectrum analyses plus a neural network could be a very useful tool for condition monitoring and fault diagnosis of induction motors.
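The first two quantities are compact enough to sketch. Below, a toy sideband signal (with invented fault frequencies) shows a 20 Hz sideband family emerging as a cepstral peak at the reciprocal spacing.

```python
import numpy as np

def power_spectrum(x):
    """Squared magnitude of the FFT, as defined above."""
    return np.abs(np.fft.rfft(x)) ** 2

def cepstrum(x):
    """Inverse FFT of the log power spectrum: uniformly spaced harmonic or
    sideband families appear as a peak at the reciprocal spacing (quefrency)."""
    return np.abs(np.fft.irfft(np.log(power_spectrum(x) + 1e-12)))

fs = 4000
t = np.arange(fs) / fs                        # 1 s of data
# 400 Hz carrier amplitude-modulated at 20 Hz -> sidebands 20 Hz apart.
x = (1 + 0.5 * np.sin(2 * np.pi * 20 * t)) * np.sin(2 * np.pi * 400 * t)
c = cepstrum(x)
q = np.arange(c.size) / fs                    # quefrency axis, seconds
i = 50 + np.argmax(c[50:c.size // 2])         # skip the very low quefrencies
print("cepstral peak at quefrency (s):", q[i])  # at (a multiple of) 1/20 s
```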
Pont, Laura; Sanz-Nebot, Victoria; Vilaseca, Marta; Jaumot, Joaquim; Tauler, Roma; Benavente, Fernando
2018-05-01
In this study, we describe a chemometric data analysis approach to assist in the interpretation of the complex datasets from the analysis of high-molecular-mass oligomeric proteins by ion mobility mass spectrometry (IM-MS). The homotetrameric protein transthyretin (TTR) is involved in familial amyloidotic polyneuropathy type I (FAP-I). FAP-I is associated with a specific TTR mutant variant (TTR(Met30)) that can be easily detected by analyzing the monomeric forms of the mutant protein. However, the mechanism of protein misfolding and aggregation onset, which could be triggered by structural changes in the native tetrameric protein, remains under investigation. Serum TTR from healthy controls and FAP-I patients was purified under non-denaturing conditions by conventional immunoprecipitation in solution and analyzed by IM-MS. IM-MS allowed separation and characterization of several tetrameric, trimeric and dimeric TTR gas ions owing to their different drift times. After appropriate data pre-processing, multivariate curve resolution alternating least squares (MCR-ALS) was applied to the complex datasets. A group of seven independent components, each characterized by its ion mobility profile and mass spectrum, was resolved to explain the observed data variance in control and patient samples. Then, principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) were used for exploration and classification. Only four of the seven resolved components were needed for accurate differentiation. Furthermore, the specific TTR ions identified in the mass spectra of these components and the resolved ion mobility profiles provided straightforward insight into the oligomeric TTR proteoforms most relevant to the disease.
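A bare-bones MCR-ALS loop conveys the idea; real analyses (including, presumably, the one reported here) add constraints such as normalization or closure, and the synthetic ion-mobility data below are purely illustrative.

```python
import numpy as np

def mcr_als(D, S0, n_iter=200):
    """Crude MCR-ALS: factor D (drift times x m/z bins) as C @ S with
    nonnegativity imposed by clipping, starting from spectral estimates S0."""
    S = S0.copy()
    for _ in range(n_iter):
        C = np.clip(D @ np.linalg.pinv(S), 0, None)   # ion mobility profiles
        S = np.clip(np.linalg.pinv(C) @ D, 0, None)   # component mass spectra
    return C, S

# Synthetic demo: two components with overlapping drift-time profiles.
rng = np.random.default_rng(3)
drift = np.linspace(0, 10, 300)
C_true = np.stack([np.exp(-((drift - 4.0) / 0.8) ** 2),
                   np.exp(-((drift - 5.5) / 1.2) ** 2)], axis=1)
S_true = rng.uniform(0, 1, (2, 500))                  # component "mass spectra"
D = C_true @ S_true + rng.normal(0, 0.01, (300, 500))
C, S = mcr_als(D, S_true + rng.uniform(0, 0.1, S_true.shape))
```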
Development of a Spatial Decision Support System for Analyzing Changes in Hydro-meteorological Risk
NASA Astrophysics Data System (ADS)
van Westen, Cees
2013-04-01
In the framework of the EU FP7 Marie Curie ITN Network "CHANGES: Changing Hydro-meteorological Risks, as Analyzed by a New Generation of European Scientists (http://www.changes-itn.eu)", a spatial decision support system is under development with the aim of analyzing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. The SDSS is one of the main outputs of the CHANGES network, which will develop an advanced understanding of how global changes, related to environmental and climate change as well as socio-economic change, may affect the temporal and spatial patterns of hydro-meteorological hazards and associated risks in Europe; how these changes can be assessed, modeled, and incorporated in sustainable risk management strategies, focusing on spatial planning, emergency preparedness and risk communication. The CHANGES network consists of 11 full partners and 6 associate partners, of which 5 are private companies, representing 10 European countries. The CHANGES network has hired 12 Early Stage Researchers (ESRs) and is currently hiring 3-6 more researchers for the implementation of the SDSS. The Spatial Decision Support System will be composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness/Spatial Multi-Criteria Evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows future scenarios to be defined in terms of climate change, land use change and population change, together with the time periods for which these scenarios will be made. This component does not generate the scenarios itself but uses input maps describing their effects on the hazard and assets maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps but also in other forms (risk curves, tables, graphs). The envisaged users of the platform are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. This paper presents the main components of the SDSS and the overall design and plans for the user interface.
Principal component analysis on a torus: Theory and application to protein dynamics.
Sittel, Florian; Filk, Thomas; Stock, Gerhard
2017-12-28
A dimensionality reduction method for high-dimensional circular data is developed, which is based on a principal component analysis (PCA) of data points on a torus. Adopting a geometrical view of PCA, various distance measures on a torus are introduced and the associated problem of projecting data onto the principal subspaces is discussed. The main idea is that the (periodicity-induced) projection error can be minimized by transforming the data such that the maximal gap of the sampling is shifted to the periodic boundary. In a second step, the covariance matrix and its eigendecomposition can be computed in a standard manner. Adopting molecular dynamics simulations of two well-established biomolecular systems (Aib9 and villin headpiece), the potential of the method to analyze the dynamics of backbone dihedral angles is demonstrated. The new approach allows for a robust and well-defined construction of metastable states and provides low-dimensional reaction coordinates that accurately describe the free energy landscape. Moreover, it offers a direct interpretation of covariances and principal components in terms of the angular variables. Apart from its application to PCA, the method of maximal gap shifting is general and can be applied to any other dimensionality reduction method for circular data.
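The maximal-gap shift is simple to state in code. A sketch following the idea described above, for toy dihedral data in degrees:

```python
import numpy as np

def shift_maximal_gap(angles_deg):
    """For each circular coordinate (column), rotate the data so the largest
    unsampled gap lies at the periodic boundary (+/-180 deg); standard PCA
    can then be applied to the shifted coordinates."""
    X = np.asarray(angles_deg, float).copy()
    for j in range(X.shape[1]):
        s = np.sort(X[:, j])
        gaps = np.diff(np.append(s, s[0] + 360.0))  # includes wrap-around gap
        i = int(np.argmax(gaps))
        cut = s[i] + gaps[i] / 2.0                  # middle of the maximal gap
        X[:, j] = (X[:, j] - cut) % 360.0 - 180.0   # move the gap to +/-180
    return X

# Toy dihedral time series with two metastable clusters:
rng = np.random.default_rng(4)
phi = np.concatenate([rng.normal(-60, 15, 500), rng.normal(150, 10, 300)])
phi = (phi + 180.0) % 360.0 - 180.0
Z = shift_maximal_gap(phi[:, None])
cov = np.cov(Z.T)        # covariance/eigendecomposition proceed as usual
```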
Chemical Polymorphism of Essential Oils of Artemisia vulgaris Growing Wild in Lithuania.
Judzentiene, Asta; Budiene, Jurga
2018-02-01
Compositional variability of mugwort (Artemisia vulgaris L.) essential oils has been investigated in this study. Plant material (above-ground parts at full flowering stage) was collected from forty-four wild populations in Lithuania. The oils from the aerial parts were obtained by hydrodistillation and analyzed by GC(FID) and GC/MS. In total, up to 111 components were identified in the oils. The major constituents were sabinene, 1,8-cineole, artemisia ketone, both thujone isomers, camphor, cis-chrysanthenyl acetate, davanone and davanone B. The compositional data were subjected to statistical analysis. The application of PCA (Principal Component Analysis) and AHC (Agglomerative Hierarchical Clustering) allowed the oils to be grouped into six clusters. AHC made it possible to distinguish an artemisia ketone chemotype, which, to the best of our knowledge, is very scarce. Additionally, two rare oil types, cis-chrysanthenyl acetate and sabinene, were determined for the plants growing in Lithuania. Moreover, davanone was found for the first time as a principal component in mugwort oils. The study revealed significant chemical polymorphism of the essential oils of mugwort plants native to Lithuania and has expanded our chemotaxonomic knowledge both of the A. vulgaris species and of the Artemisia genus.
Derivation of Boundary Manikins: A Principal Component Analysis
NASA Technical Reports Server (NTRS)
Young, Karen; Margerum, Sarah; Barr, Abbe; Ferrer, Mike A.; Rajulu, Sudhakar
2008-01-01
When designing any human-system interface, it is critical to provide realistic anthropometry to properly represent how a person fits within a given space. This study aimed to identify a minimum number of boundary manikins, i.e. representative models of the anthropometry of subjects from a target population, which would realistically represent that population. The boundary manikin anthropometry was derived using Principal Component Analysis (PCA). PCA is a statistical approach to reducing a multi-dimensional dataset using eigenvectors and eigenvalues. The measurements used in the PCA were identified as those critical for suit and cockpit design. The PCA yielded a total of 26 manikins per gender, along with their anthropometry from the target population. Reduction techniques were implemented to reduce this number further, with a final result of 20 female and 22 male subjects. The anthropometry of the boundary manikins was then used to create 3D digital models (to be discussed in subsequent papers) intended for use by designers to test components of their space suit design and to verify that the requirements specified in the Human Systems Integration Requirements (HSIR) document are met. The end goal is to allow designers to generate suits which accommodate the diverse anthropometry of the user population.
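The boundary-selection idea can be sketched generically: project standardized measurements onto the leading principal components, place points on an enclosing ellipsoid in PC space, and map them back to measurement space. All data below, the number of retained components, and the 2.5-sigma boundary are illustrative assumptions, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(5)
# Invented anthropometry: stature, chest depth, arm length (mm).
X = rng.multivariate_normal(
    [1750, 450, 600],
    [[4900, 800, 900], [800, 400, 300], [900, 300, 900]],
    size=2000)

mean, std = X.mean(0), X.std(0)
Z = (X - mean) / std
vals, vecs = np.linalg.eigh(np.cov(Z.T))
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

k, r = 2, 2.5              # retained PCs; boundary radius in sigma units
corners = np.array([[c1, c2] for c1 in (-1, 1) for c2 in (-1, 1)], float)
corners /= np.linalg.norm(corners, axis=1, keepdims=True)
pc_pts = r * corners * np.sqrt(vals[:k])           # points on the ellipsoid
manikins = pc_pts @ vecs[:, :k].T * std + mean     # back to measurement space
print(np.round(manikins))                          # candidate boundary cases
```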
Nguyen, Quoc-Thang; Miledi, Ricardo
2003-09-30
Current computer programs for intracellular recordings often lack advanced data management, are usually incompatible with other applications and are also difficult to adapt to new experiments. We have addressed these shortcomings in e-Phys, a suite of electrophysiology applications for intracellular recordings. The programs in e-Phys use Component Object Model (COM) technologies available in the Microsoft Windows operating system to provide enhanced data storage, increased interoperability between e-Phys and other COM-aware applications, and easy customization of data acquisition and analysis thanks to a script-based integrated programming environment. Data files are extensible, hierarchically organized and integrated in the Windows shell by using the Structured Storage technology. Data transfers to and from other programs are facilitated by implementing the ActiveX Automation standard and distributed COM (DCOM). ActiveX Scripting allows experimenters to write their own event-driven acquisition and analysis programs in the VBScript language from within e-Phys. Scripts can reuse components available from other programs on other machines to create distributed meta-applications. This paper describes the main features of e-Phys and how this package was used to determine the effect of the atypical antipsychotic drug clozapine on synaptic transmission at the neuromuscular junction.