These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

A survey of deconvolution approaches in teleseismic receiver function analysis  

NASA Astrophysics Data System (ADS)

Receiver function analysis is frequently used to image the Earth's crustal and upper mantle structure. The essential processing step in this analysis is the source normalization, which can be accomplished through deconvolution. Though a variety of deconvolution approaches have been employed over the years to solve this problem, no systematic comparison of these approaches has yet been done. Here, we present the results of such a comparison with the aim of evaluating the various deconvolution approaches and providing some guidelines as to which approach may be better suited for specific applications. The following deconvolution approaches are systematically compared: frequency-domain spectral division using both water-level and damping-factor regularization, multi-taper cross-correlation in the frequency domain, time-domain least squares filtering, and iterative time-domain deconvolution. We carry out benchmark tests on synthetic and real data to assess how the various approaches perform for different input conditions - e.g., data quality (including noise content), data volume based on number of stations and events, and the complexity of the target structure. Our results show that the different approaches produce receiver functions that are equally robust provided that a suitable regularization parameter is found - a task that is usually more easily accomplished in the time domain. However, in the case of noisy data, we find that the iterative time-domain deconvolution can generate as much ringing in the resulting receiver function as poorly regularized frequency-domain spectral division. If computational speed is sought, for example when dealing with large data sets, then the use of frequency-domain approaches might be more attractive. We also find that some deconvolution approaches may be better adapted than others to address specific imaging goals. For example, iterative time-domain deconvolution can be used to quickly construct profiles of first-order discontinuities (e.g., Moho and its multiples) by restricting the number of iterations (n=10-20) and thus filtering out higher-order converted signals.
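As context for the frequency-domain spectral division mentioned in this record, here is a minimal NumPy sketch of water-level regularization combined with a Gaussian low-pass filter. The function and parameter names (radial, vertical, water_level, gauss_a) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def waterlevel_receiver_function(radial, vertical, dt, water_level=0.01, gauss_a=2.5):
    """Frequency-domain spectral division with water-level regularization (sketch)."""
    n = len(radial)
    R = np.fft.rfft(radial, n)            # radial-component spectrum
    Z = np.fft.rfft(vertical, n)          # vertical-component (source) spectrum
    power = (Z * np.conj(Z)).real
    # Water level: lift small values of |Z|^2 to a fraction of the maximum
    denom = np.maximum(power, water_level * power.max())
    freqs = np.fft.rfftfreq(n, d=dt)
    gauss = np.exp(-((2.0 * np.pi * freqs) ** 2) / (4.0 * gauss_a ** 2))  # low-pass filter
    return np.fft.irfft(R * np.conj(Z) / denom * gauss, n)
```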

Spieker, Kathrin; Rondenay, Stéphane; Halpaap, Felix

2014-05-01

2

Chemometric data analysis for deconvolution of overlapped ion mobility profiles.  

PubMed

We present the details of a data analysis approach for deconvolution of overlapped or unresolved ion mobility (IM) species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab-compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM-overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution. PMID:22948903
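A minimal sketch of the PCA screening step described above, using scikit-learn on a hypothetical post-IM/CID data matrix; the in-house preprocessing platform and the self-modeling mixture analysis step are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical post-IM/CID matrix: rows = IM arrival-time bins, columns = m/z bins
rng = np.random.default_rng(0)
X = rng.random((60, 500))

pca = PCA(n_components=5)
scores = pca.fit_transform(X)            # arrival-time trends of each component
print(pca.explained_variance_ratio_)     # more than one strong component hints at overlapped species
```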

Zekavat, Behrooz; Solouki, Touradj

2012-11-01

3

An L0 sparse analysis prior for blind Poissonian image deconvolution.

PubMed

This paper proposes a new approach for blindly deconvolving images that are contaminated by Poisson noise. The proposed approach incorporates a new prior, that is the L0 sparse analysis prior, together with the total variation constraint into the maximum a posteriori (MAP) framework for deconvolution. A greedy analysis pursuit numerical scheme is exploited to solve the L0 regularized MAP problem. Experimental results show that our approach not only produces smooth results substantially suppressing artifacts and noise, but also preserves intensity changes sharply. Both quantitative and qualitative comparisons to the specialized state-of-the-art algorithms demonstrate its superiority. PMID:24663705

Gong, Xiaojin; Lai, Baisheng; Xiang, Zhiyu

2014-02-24

4

Impact of sampling technique on appraisal of pulsatile insulin secretion by deconvolution and cluster analysis.  

PubMed

Little is known about the optimal experimental conditions for assessing pulsatile insulin secretion in vivo. To address this, we employed a recently validated canine model (n = 12) to determine the consequences of 1) sampling from the systemic circulation (SC) vs. the portal vein (PV), 2) sampling intensity and duration, and 3) deconvolution vs. cluster analysis on assessing pulsatile insulin secretion. PV vs. SC sampling resulted in an approximately 40% higher pulse frequency by deconvolution (9.0 +/- 0.5 vs. 6.6 +/- 0.9 pulses/h, P < 0.02) and cluster analysis (7.5 +/- 0.3 vs. 5.6 +/- 0.6 pulses/h, P < 0.01) due to a higher signal-to-noise ratio (19 +/- 4.8 PV vs. 12 +/- 1.8 SC). PV sampling also disclosed a higher calculated contribution of the pulsatile vs. nonpulsatile mode of delivery to total insulin secretion (57 +/- 4 vs. 28 +/- 5%, P < 0.001). Analysis of the relevance of sampling intensity revealed that 1-min data yielded a markedly higher estimate of pulse frequency with PV sampling than 2-min data (9.0 +/- 0.5 vs. 5.4 +/- 0.5, P < 0.02, deconvolution; 7.5 +/- 0.3 vs. 4.3 +/- 0.6 pulses/h, P < 0.001, cluster). Optimal sampling duration was shown to be 40 min or more. We conclude that the resolving power of the analytical tool, the anatomic site of blood withdrawal, the frequency of blood sampling, and the duration of the total observation interval all significantly influence estimated insulin secretory pulse frequency and the fraction of insulin secreted in pulses. With the assumption that PV 1-min insulin data constitute the "gold standard," our in vivo inferences of 7.5-9.0 insulin pulses/h closely recapitulate in vitro islet secretory activity. PMID:8572204

Pørksen, N; Munn, S; Steers, J; Veldhuis, J D; Butler, P C

1995-12-01

5

Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution  

NASA Technical Reports Server (NTRS)

The coloring effect on the acoustic emission signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, the change in AE signal characteristics can be better interpreted as the result of the change in only the states of the process. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of AE signals generated during the stretching became more distinctive and could be used more effectively as tools for process monitoring.
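A minimal sketch of the frequency-domain deconvolution idea: divide the AE spectrum by the instrumentation transfer function, here derived from an assumed measured impulse response, with a simple floor to avoid division by near-zero spectral values. Names and the regularization choice are assumptions, not the paper's implementation.

```python
import numpy as np

def remove_instrument_coloring(ae_signal, impulse_response, eps=1e-3):
    """Divide the AE spectrum by the instrumentation transfer function (sketch)."""
    n = len(ae_signal)
    S = np.fft.rfft(ae_signal, n)
    H = np.fft.rfft(impulse_response, n)        # transfer function of the acquisition chain
    H_safe = np.where(np.abs(H) < eps, eps, H)  # crude regularization near spectral zeros
    return np.fft.irfft(S / H_safe, n)
```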

Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

1987-01-01

6

GlowFit—a new tool for thermoluminescence glow-curve deconvolution  

Microsoft Academic Search

A new computer program, GlowFit, for deconvoluting first-order kinetics thermoluminescence (TL) glow-curves has been developed. A non-linear function describing a single glow-peak is fitted to experimental points using the least squares Levenberg–Marquardt method. The main advantage of GlowFit is the ability to resolve complex TL glow-curves consisting of strongly overlapping peaks, as those observed in heavily-doped LiF:Mg,Ti (MTT) detectors. This
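For orientation, a minimal SciPy sketch of fitting a single first-order glow peak by Levenberg-Marquardt least squares, using the widely cited Kitis analytical approximation as an assumed peak shape; this is not GlowFit itself, and the synthetic data and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

K_B = 8.617e-5  # Boltzmann constant, eV/K

def first_order_peak(T, Im, E, Tm):
    """Kitis first-order approximation of a single TL glow peak (assumed form)."""
    x = (E / (K_B * T)) * (T - Tm) / Tm
    return Im * np.exp(1.0 + x
                       - (T ** 2 / Tm ** 2) * np.exp(x) * (1.0 - 2.0 * K_B * T / E)
                       - 2.0 * K_B * Tm / E)

# Hypothetical glow curve: temperature in K, intensity in arbitrary units
T = np.linspace(350.0, 550.0, 200)
intensity = first_order_peak(T, 1.0, 1.3, 450.0)
intensity += np.random.default_rng(2).normal(0, 0.01, T.size)

# Levenberg-Marquardt fit of peak height Im, activation energy E, and peak temperature Tm
popt, pcov = curve_fit(first_order_peak, T, intensity, p0=(1.0, 1.2, 445.0), method="lm")
```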

M. Puchalska; P. Bilski

2006-01-01

7

Deconvolution analysis of 99mTc-methylene diphosphonate kinetics in metabolic bone disease

Microsoft Academic Search

The kinetics of 99mTc-methylene diphosphonate (MDP) and 47Ca were studied in three patients with osteoporosis, three patients with hyperparathyroidism, and two patients with osteomalacia. The activities of 99mTc-MDP were recorded in the lumbar spine, paravertebral soft tissues, and in venous blood samples for 1 h after injection. The results were submitted to deconvolution analysis to determine regional bone accumulation rates.

J. Knop; E. Kröger; P. Stritzke; C. Schneider; H.-P. Kruse

1981-01-01

8

OEXP Analysis Tools Workshop  

NASA Technical Reports Server (NTRS)

This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

1988-01-01

9

Multichannel blind separation and deconvolution of images for document analysis.  

PubMed

In this paper, we apply Bayesian blind source separation (BSS) from noisy convolutive mixtures to jointly separate and restore source images degraded through unknown blur operators, and then linearly mixed. We found that this problem arises in several image processing applications, among which there are some interesting instances of degraded document analysis. In particular, the convolutive mixture model is proposed for describing multiple views of documents affected by the overlapping of two or more text patterns. We consider two different models, the interchannel model, where the data represent multispectral views of a single-sided document, and the intrachannel model, where the data are given by two sets of multispectral views of the recto and verso side of a document page. In both cases, the aim of the analysis is to recover clean maps of the main foreground text, but also to enhance and extract other document features, such as faint or masked patterns. We adopt Bayesian estimation for all the unknowns and describe the typical local correlation within the individual source images through the use of suitable Gibbs priors, accounting also for well-behaved edges in the images. This a priori information is particularly suitable for the kind of objects depicted in the images treated, i.e., homogeneous text on a homogeneous background, and, as such, is capable of stabilizing the ill-posed, inverse problem considered. The method is validated through numerical and real experiments that are representative of various real scenarios. PMID:20028627

Tonazzini, Anna; Gerace, Ivan; Martinelli, Francesca

2010-04-01

10

Investigation of the CLEAN deconvolution method for use with Late Time Response analysis of multiple objects  

NASA Astrophysics Data System (ADS)

This paper investigates the application of the CLEAN non-linear deconvolution method to Late Time Response (LTR) analysis for detecting multiple objects in Concealed Threat Detection (CTD). When an Ultra-Wide Band (UWB) frequency radar signal is used to illuminate a conductive target, surface currents are induced upon the object which in turn give rise to LTR signals. These signals are re-radiated from the target, and the results from a number of targets are presented. The experiment was performed using a double-ridged horn antenna in a pseudo-monostatic arrangement. A vector network analyser (VNA) has been used to provide the UWB Frequency Modulated Continuous Wave (FMCW) radar signal. The distance between the transmitting antenna and the target objects has been kept at 1 metre for all the experiments performed, and the power level at the VNA was set to 0 dBm. The targets in the experimental setup are suspended in air in a laboratory environment. Matlab has been used in post processing to perform linear and non-linear deconvolution of the signal. The Wiener filter, Fast Fourier Transform (FFT) and Continuous Wavelet Transform (CWT) are used to process the return signals and extract the LTR features from the noise clutter. A Generalized Pencil-of-Function (GPOF) method was then used to extract the complex poles of the signal. Artificial Neural Networks (ANN) and Linear Discriminant Analysis (LDA) have been used to classify the data.
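A minimal 1-D sketch of Hogbom-style CLEAN, the non-linear deconvolution named above: repeatedly locate the largest residual, subtract a scaled and shifted copy of the point-spread function, and accumulate clean components. The circular shift and stopping rule are simplifications.

```python
import numpy as np

def clean_1d(dirty, psf, gain=0.1, n_iter=500, threshold=1e-3):
    """Hogbom-style CLEAN on a 1-D trace (illustrative sketch)."""
    residual = np.asarray(dirty, dtype=float).copy()
    components = np.zeros_like(residual)
    peak_psf = int(np.argmax(np.abs(psf)))
    for _ in range(n_iter):
        k = int(np.argmax(np.abs(residual)))
        amp = residual[k]
        if abs(amp) < threshold:
            break
        components[k] += gain * amp
        # subtract a scaled PSF centred on the located peak (circular shift for simplicity)
        residual -= gain * amp * np.roll(psf, k - peak_psf)
    return components, residual
```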

Hutchinson, Simon; Taylor, Christopher T.; Fernando, Michael; Andrews, David; Bowring, Nicholas

2014-10-01

11

On the possibility of using commercial software packages for thermoluminescence glow curve deconvolution analysis.  

PubMed

This paper explores the possibility of using commercial software for thermoluminescence glow curve deconvolution (GCD) analysis. The program PEAKFIT has been used to perform GCD analysis of complex glow curves of quartz and dosimetric materials. First-order TL peaks were represented successfully using the Weibull distribution function. Second-order and general-order TL peaks were represented accurately by using the Logistic asymmetric functions with varying symmetry parameters. Analytical expressions were derived for determining the energy E from the parameters of the Logistic asymmetric functions. The accuracy of these analytical expressions for E was tested for a wide variety of kinetic parameters and was found to be comparable to the commonly used expressions in the TL literature. The effectiveness of fit of the analytical functions used here was tested using the figure of merit (FOM) and was found to be comparable to the accuracy of recently published GCD expressions for first- and general-order kinetics. PMID:12382713

Pagonis, V; Kitis, G

2002-01-01

12

Level-1 / PBBT Analysis Tool (LPAT)

E-print Network

The Level-1/PBBT Analysis Tool (LPAT) was designed to assist in the analysis of North American Standard Level-1 Inspection data. The analysis tool interface (Fig. 2) is used to define the data for analysis: specific test periods, number of axles, and an option to consider specific

13

Extended Testability Analysis Tool  

NASA Technical Reports Server (NTRS)

The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

Melcher, Kevin; Maul, William A.; Fulton, Christopher

2012-01-01

14

Flight Operations Analysis Tool  

NASA Technical Reports Server (NTRS)

Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

2006-01-01

15

Analysis of force-deconvolution methods in frequency-modulation atomic force microscopy  

PubMed Central

In frequency-modulation atomic force microscopy the direct observable is the frequency shift of an oscillating cantilever in a force field. This frequency shift is not a direct measure of the actual force, and thus, to obtain the force, deconvolution methods are necessary. Two prominent methods proposed by Sader and Jarvis (Sader–Jarvis method) and Giessibl (matrix method) are investigated with respect to the deconvolution quality. Both methods show a nontrivial dependence of the deconvolution quality on the oscillation amplitude. The matrix method exhibits spikelike features originating from a numerical artifact. By interpolation of the data, the spikelike features can be circumvented. The Sader–Jarvis method has a continuous amplitude dependence showing two minima and one maximum, which is an inherent property of the deconvolution algorithm. The optimal deconvolution depends on the ratio of the amplitude and the characteristic decay length of the force for the Sader–Jarvis method. However, the matrix method generally provides the higher deconvolution quality. PMID:22496997

Illek, Esther; Giessibl, Franz J

2012-01-01

16

A System Analysis Tool  

SciTech Connect

In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
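A toy sketch of the slicing idea described above: given a dependency map from each entity to the entities it influences (which, in a multi-domain system, need not all be program variables), a breadth-first traversal returns everything affected by a change. The dependency map is hypothetical.

```python
from collections import deque

def forward_slice(influences, changed):
    """Return every entity reachable from `changed` in a dependency graph.
    `influences` maps an entity to the entities whose values it affects."""
    affected, queue = set(), deque([changed])
    while queue:
        v = queue.popleft()
        for w in influences.get(v, ()):
            if w not in affected:
                affected.add(w)
                queue.append(w)
    return affected

# Example: a change to `sensor_gain` propagates through two downstream values
deps = {"sensor_gain": ["calibrated_reading"], "calibrated_reading": ["alarm_state"]}
print(forward_slice(deps, "sensor_gain"))   # {'calibrated_reading', 'alarm_state'} (order may vary)
```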

CAMPBELL,PHILIP L.; ESPINOZA,JUAN

2000-06-01

17

Draper Station Analysis Tool  

NASA Technical Reports Server (NTRS)

Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation based design and verification of complex dynamic systems.

Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

2011-01-01

18

Analysis of a deconvolution-based information retrieval algorithm in X-ray grating-based phase-contrast imaging  

NASA Astrophysics Data System (ADS)

Grating-based X-ray phase-contrast imaging is a promising imaging modality to increase soft tissue contrast in comparison to conventional attenuation-based radiography. Complementary and otherwise inaccessible information is provided by the dark-field image, which shows the sub-pixel size granularity of the measured object. This could especially turn out to be useful in mammography, where tumourous tissue is connected with the presence of tiny microcalcifications. In addition to the well-established image reconstruction process, an analysis method was introduced by Modregger [1], which is based on deconvolution of the underlying scattering distribution within a single pixel, revealing information about the sample. Subsequently, the different contrast modalities can be calculated from the scattering distribution. The method has already proved to deliver additional information in the higher moments of the scattering distribution and possibly reaches better image quality with respect to an increased contrast-to-noise ratio. Several measurements were carried out using melamine foams as phantoms. We analysed the dependency of the deconvolution-based method with respect to the dark-field image on different parameters such as dose, number of iterations of the iterative deconvolution algorithm and dark-field signal. A disagreement was found in the reconstructed dark-field values between the FFT method and the iterative method. Usage of the resulting characteristics might be helpful in future applications.
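The record does not spell out the iterative deconvolution scheme, so the following is only a generic Richardson-Lucy-style sketch for recovering a 1-D scattering distribution from a measured pixel profile and a known kernel; the kernel is assumed normalized to unit sum and the data non-negative.

```python
import numpy as np

def iterative_deconvolution(measured, kernel, n_iter=50):
    """Generic Richardson-Lucy-style iteration for a 1-D distribution;
    the algorithm used in the cited work may differ in detail."""
    measured = np.asarray(measured, dtype=float)
    estimate = np.full_like(measured, measured.mean())
    kernel_mirror = kernel[::-1]
    for _ in range(n_iter):
        blurred = np.convolve(estimate, kernel, mode="same")
        ratio = measured / np.maximum(blurred, 1e-12)   # avoid division by zero
        estimate = estimate * np.convolve(ratio, kernel_mirror, mode="same")
    return estimate
```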

Horn, Florian; Bayer, Florian; Pelzer, Georg; Rieger, Jens; Ritter, André; Weber, Thomas; Zang, Andrea; Michel, Thilo; Anton, Gisela

2014-03-01

19

A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis  

Microsoft Academic Search

DNA methylation is an indispensable epigenetic modification required for regulating the expression of mammalian genomes. Immunoprecipitation-based methods for DNA methylome analysis are rapidly shifting the bottleneck in this field from data generation to data analysis, necessitating the development of better analytical tools. In particular, an inability to estimate absolute methylation levels remains a major analytical difficulty associated with immunoprecipitation-based DNA

Daniel J Turner; Paul Flicek; Heng Li; Eugene Kulesha; Stefan Gräf; Nathan Johnson; Javier Herrero; Eleni M Tomazou; Natalie P Thorne; Liselotte Bäckdahl; Marlis Herberth; Kevin L Howe; David K Jackson; Marcos M Miretti; John C Marioni; Ewan Birney; Tim J P Hubbard; Richard Durbin; Simon Tavaré; Thomas A Down; Vardhman K Rakyan; Stephan Beck

2008-01-01

20

Hurricane Data Analysis Tool  

NASA Technical Reports Server (NTRS)

In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software and data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from the Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. They are globally-merged pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of interest and time, single imagery, overlay of two different products, animation, a time-skip capability, and different image size outputs. Users can save an animation as a file (animated gif) and import it into other presentation software, such as Microsoft PowerPoint. Since the tool can directly access the real data, more features and functionality can be added in the future.

Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

2011-01-01

21

SPLAT: Spectral Analysis Tool  

NASA Astrophysics Data System (ADS)

SPLAT is a graphical tool for displaying, comparing, modifying and analyzing astronomical spectra stored in NDF, FITS and TEXT files as well as in NDX format. It can read in many spectra at the same time and then display these as line plots. Display windows can show one or several spectra at the same time and can be interactively zoomed and scrolled, centered on specific wavelengths, provide continuous coordinate readout, produce printable hardcopy and be configured in many ways. Analysis facilities include the fitting of a polynomial to selected parts of a spectrum, the fitting of Gaussian, Lorentzian and Voigt profiles to emission and absorption lines and the filtering of spectra using average, median and line-shape window functions as well as wavelet denoising. SPLAT also supports a full range of coordinate systems for spectra, which allows coordinates to be displayed and aligned in many different coordinate systems (wavelength, frequency, energy, velocity) and transformed between these and different standards of rest (topocentric, heliocentric, dynamic and kinematic local standards of rest, etc). SPLAT is distributed as part of the Starlink (ascl:1110.012) software collection.

Draper, Peter W.

2014-02-01

22

Java Radar Analysis Tool  

NASA Technical Reports Server (NTRS)

Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

Zaczek, Mariusz P.

2005-01-01

23

Deconvolution of Gas Chromatograms with Excel  

NASA Astrophysics Data System (ADS)

The use of Excel for peak deconvolution by nonlinear regression analysis is discussed. Excel is easily employed in an experiment that introduced students to the use of nonlinear regression analysis for the deconvolution of overlapped gas chromatographic peaks (J. Chem. Educ. 1994, 71, 483-486). Excel gave similar results to those reported previously that used the Fortran program KINFIT for nonlinear regression analysis.
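A minimal SciPy analogue of the nonlinear-regression deconvolution described above (the original exercise uses Excel): fit a sum of two Gaussians to a synthetic pair of overlapped peaks and recover the individual peak areas. The data and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(t, a1, t1, w1, a2, t2, w2):
    """Sum of two Gaussian peaks, a common model for overlapped GC peaks."""
    return (a1 * np.exp(-(t - t1) ** 2 / (2 * w1 ** 2))
            + a2 * np.exp(-(t - t2) ** 2 / (2 * w2 ** 2)))

t = np.linspace(0, 10, 500)                          # retention time, arbitrary units
y = two_gaussians(t, 1.0, 4.5, 0.4, 0.6, 5.3, 0.35)  # synthetic overlapped peaks
y += np.random.default_rng(3).normal(0, 0.01, t.size)

popt, pcov = curve_fit(two_gaussians, t, y, p0=(1, 4.4, 0.5, 0.5, 5.4, 0.3))
# Individual peak areas from the fitted heights and widths
areas = (popt[0] * popt[2] * np.sqrt(2 * np.pi), popt[3] * popt[5] * np.sqrt(2 * np.pi))
```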

Arena, James V.; Leu, Tuan M.

1999-06-01

24

Rupture behaviors of the 2011 Tohoku earthquake and its strongest foreshock through an empirical Green's function deconvolution analysis  

NASA Astrophysics Data System (ADS)

An empirical Green's function (EGF) deconvolution analysis was applied to study the source characteristics of the 2011 Mw 9.0 Tohoku and 2011 Mw 7.4 Sanriku-Oki earthquakes. For the 2011 Tohoku earthquake, we demonstrate that nucleation released weak but high-frequency energy and that the rupture propagated downward and sped toward the deep region after the up-dip slip extended to the trench. Moreover, the 2011 Sanriku-Oki earthquake results and a previous study on the 1994 Sanriku-Oki earthquake suggest that large earthquakes in the subduction zone around the Tohoku area prefer to rapidly rupture toward the deeper (down-dip) region.

Wen, Yi-Ying

2014-02-01

25

FSSC Science Tools: Pulsar Analysis  

NASA Technical Reports Server (NTRS)

This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

Thompson, Dave

2010-01-01

26

Independent component analysis (ICA) algorithms for improved spectral deconvolution of overlapped signals in 1H NMR analysis: application to foods and related products.  

PubMed

The major challenge facing NMR spectroscopic mixture analysis is the overlapping of signals and the resulting impossibility of easily recovering the structures for identification of the individual components and of integrating separated signals for quantification. In this paper, various independent component analysis (ICA) algorithms [mutual information least dependent component analysis (MILCA); stochastic non-negative ICA (SNICA); joint approximate diagonalization of eigenmatrices (JADE); and robust, accurate, direct ICA algorithm (RADICAL)] as well as deconvolution methods [simple-to-use interactive self-modeling mixture analysis (SIMPLISMA) and multivariate curve resolution-alternating least squares (MCR-ALS)] are applied for simultaneous (1)H NMR spectroscopic determination of organic substances in complex mixtures. Among others, we studied constituents of the following matrices: honey, soft drinks, and liquids used in electronic cigarettes. Good quality spectral resolution of up to eight-component mixtures was achieved (correlation coefficients between resolved and experimental spectra were not less than 0.90). In general, the relative errors in the recovered concentrations were below 12%. SIMPLISMA and MILCA algorithms were found to be preferable for NMR spectra deconvolution and showed similar performance. The proposed method was used for analysis of authentic samples. The resolved ICA concentrations match well with the results of reference gas chromatography-mass spectrometry as well as the MCR-ALS algorithm used for comparison. ICA deconvolution considerably improves the application range of direct NMR spectroscopy for analysis of complex mixtures. PMID:24604756
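For illustration only, a minimal unmixing sketch using scikit-learn's FastICA, which is not one of the algorithms compared in this record (MILCA, SNICA, JADE, RADICAL) but demonstrates the same idea on hypothetical spectra arranged as a mixtures-by-points matrix.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
S = np.abs(rng.normal(size=(3, 2000)))     # stand-in "pure component" spectra
A = rng.uniform(0.1, 1.0, size=(5, 3))     # mixing matrix: 5 mixtures of 3 components
X = A @ S                                  # measured mixture spectra (5 x 2000)

ica = FastICA(n_components=3, random_state=0, max_iter=1000)
S_est = ica.fit_transform(X.T).T           # recovered source spectra, up to scale and sign
```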

Monakhova, Yulia B; Tsikin, Alexey M; Kuballa, Thomas; Lachenmeier, Dirk W; Mushtakova, Svetlana P

2014-05-01

27

Deconvolution of Gas Chromatograms with Excel  

Microsoft Academic Search

The use of Excel for peak deconvolution by nonlinear regression analysis is discussed. Excel is easily employed in an experiment that introduced students to the use of nonlinear regression analysis for the deconvolution of overlapped gas chromatographic peaks (J. Chem. Educ. 1994, 71, 483-486). Excel gave similar results to those reported previously that used the Fortran program KINFIT for nonlinear

James V. Arena; Tuan M. Leu

1999-01-01

28

Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis  

SciTech Connect

We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 µm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-Ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 µm/pixel, without the use of oil-based lenses. A full textural analysis on track No. 82 is presented here, as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

Greenberg, M.; Ebel, D.S. (AMNH)

2009-03-19

29

Marginal Abatement Cost Analysis Tool  

EPA Science Inventory

The Non-CO2 Marginal Abatement Cost Analysis Tool is an extensive bottom-up engineering-economic spreadsheet model capturing the relevant cost and performance data on sectors emitting non-CO2 GHGs. The tool has 24 regions and 7 sectors and produces marginal abatement cost curves...

30

Information Gathering and Analysis Tools  

NSDL National Science Digital Library

The National Center for Environmental Decision-making Research aims "to improv[e] environmental decision making" at regional, state, and local levels. Administered by the Joint Institute for Energy and Environment in Knoxville, Tennessee, NCEDR offers many decision-making resources, most prominently, tools for information gathering and analysis. Users may select from eight categories of tool use, from Identifying Values to Post-Decision Assessment. Within each category, subcategories offer information and tools on economic market assessment, ecological relationships, and other topics. Additional Links and commentary on Strengths & Weaknesses (of tools), Communicating the Results, Looking Ahead, and Key Sources round out the site.

National Center for Environmental Decision-making Research

31

VCAT: Visual Crosswalk Analysis Tool  

SciTech Connect

VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

2012-08-31

32

MORESANE: MOdel REconstruction by Synthesis-ANalysis Estimators. A sparse deconvolution algorithm for radio interferometric imaging  

E-print Network

(arXiv abridged abstract) Recent years have seen huge developments in radio telescopes and a tremendous increase in their capabilities. Such systems make it mandatory to design more sophisticated techniques not only for transporting, storing and processing this new generation of radio interferometric data, but also for restoring the astrophysical information contained in such data. In this paper we present a new radio deconvolution algorithm named MORESANE and its application to fully realistic simulated data of MeerKAT, one of the SKA precursors. This method has been designed for the difficult case of restoring diffuse astronomical sources which are faint in brightness, complex in morphology and possibly buried in the dirty beam's side lobes of bright radio sources in the field. MORESANE is a greedy algorithm which combines complementary types of sparse recovery methods in order to reconstruct the most appropriate sky model from observed radio visibilities. A synthesis approach is used for the reconst...

Dabbech, Arwa; Mary, David; Slezak, Eric; Smirnov, Oleg; Kenyon, Jonathan S

2014-01-01

33

LPA1,LPA2. Deconvolution Program  

SciTech Connect

The program is suitable for many applications in applied mathematics, experimental physics, signal-analysis systems, and a range of engineering applications, e.g., spectrum deconvolution, signal analysis, and system property analysis.

Ping-An, L.; Jiang-Lai, Y. [Bejiing Normal University, Bejiing (China)

1991-01-01

34

Deconvolution Analysis for Classifying Gastric Adenocarcinoma Patients Based on Differential Scanning Calorimetry Serum Thermograms  

PubMed Central

Recently, differential scanning calorimetry (DSC) has been acknowledged as a novel tool for diagnosing and monitoring several diseases. This highly sensitive technique has been traditionally used to study thermally induced protein folding/unfolding transitions. In previous research papers, DSC profiles from blood samples of patients were analyzed and they exhibited marked differences in the thermal denaturation profile. Thus, we investigated the use of this novel technology in blood serum samples from 25 healthy subjects and 30 patients with gastric adenocarcinoma (GAC) at different stages of tumor development with a new multiparametric approach. The analysis of the calorimetric profiles of blood serum from GAC patients allowed us to discriminate three stages of cancer development (I to III) from those of healthy individuals. After a multiparametric analysis, a classification of blood serum DSC parameters from patients with GAC is proposed. Certain parameters exhibited significant differences (P < 0.05) and allowed the discrimination of healthy subjects/patients from patients at different tumor stages. The results of this work validate DSC as a novel technique for GAC patient classification and staging, and offer new graphical tools and value ranges for the acquired parameters in order to discriminate healthy from diseased subjects with increased disease burden. PMID:25614381

Vega, Sonia; Garcia-Gonzalez, María Asuncion; Lanas, Angel; Velazquez-Campoy, Adrian; Abian, Olga

2015-01-01

35

Deconvolution analysis for classifying gastric adenocarcinoma patients based on differential scanning calorimetry serum thermograms.  

PubMed

Recently, differential scanning calorimetry (DSC) has been acknowledged as a novel tool for diagnosing and monitoring several diseases. This highly sensitive technique has been traditionally used to study thermally induced protein folding/unfolding transitions. In previous research papers, DSC profiles from blood samples of patients were analyzed and they exhibited marked differences in the thermal denaturation profile. Thus, we investigated the use of this novel technology in blood serum samples from 25 healthy subjects and 30 patients with gastric adenocarcinoma (GAC) at different stages of tumor development with a new multiparametric approach. The analysis of the calorimetric profiles of blood serum from GAC patients allowed us to discriminate three stages of cancer development (I to III) from those of healthy individuals. After a multiparametric analysis, a classification of blood serum DSC parameters from patients with GAC is proposed. Certain parameters exhibited significant differences (P < 0.05) and allowed the discrimination of healthy subjects/patients from patients at different tumor stages. The results of this work validate DSC as a novel technique for GAC patient classification and staging, and offer new graphical tools and value ranges for the acquired parameters in order to discriminate healthy from diseased subjects with increased disease burden. PMID:25614381

Vega, Sonia; Garcia-Gonzalez, María Asuncion; Lanas, Angel; Velazquez-Campoy, Adrian; Abian, Olga

2015-01-01

36

Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis  

PubMed Central

The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue and numerous datasets are publically available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprises mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908

Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M.

2015-01-01

37

Boronic acid-protected gold clusters capable of asymmetric induction: spectral deconvolution analysis of their electronic absorption and magnetic circular dichroism.  

PubMed

Gold clusters protected by 3-mercaptophenylboronic acid (3-MPB) with a mean core diameter of 1.1 nm are successfully isolated, and their absorption, magnetic circular dichroism (MCD), and chiroptical responses in metal-based electronic transition regions, which can be induced by surface D-/L-fructose complexation, are examined. It is well known that MCD basically corresponds to electronic transitions in the absorption spectrum, so a simultaneous deconvolution analysis of the electronic absorption and MCD spectra of the gold cluster compound is conducted under the constrained requirement that a single set of Gaussian components be used for their fitting. We then find that the fructose-induced chiroptical response is explained in terms of the experimentally obtained deconvoluted spectra. We believe this spectral analysis will benefit a better understanding of the electronic states and the origin of the optical activity in chiral metal clusters. PMID:22303900
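A minimal sketch of the constrained simultaneous fit described above: one shared set of Gaussian centers and widths for both the absorption and MCD spectra, with separate signed amplitudes for each. The spectra, number of bands, and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def gauss_sum(x, centers, widths, amps):
    """Sum of Gaussian bands with given centers, widths, and amplitudes."""
    return sum(a * np.exp(-(x - c) ** 2 / (2 * w ** 2))
               for c, w, a in zip(centers, widths, amps))

def residuals(p, x, absorption, mcd, n):
    """Shared centers/widths; separate amplitude sets for absorption and MCD."""
    c, w = p[:n], p[n:2 * n]
    a_abs, a_mcd = p[2 * n:3 * n], p[3 * n:]
    return np.concatenate([gauss_sum(x, c, w, a_abs) - absorption,
                           gauss_sum(x, c, w, a_mcd) - mcd])

# Hypothetical spectra on a common energy axis, fitted with n = 3 shared bands
x = np.linspace(1.5, 4.0, 400)
absorption = gauss_sum(x, [2.0, 2.8, 3.5], [0.15, 0.2, 0.25], [1.0, 0.7, 0.5])
mcd = gauss_sum(x, [2.0, 2.8, 3.5], [0.15, 0.2, 0.25], [0.3, -0.4, 0.2])
p0 = np.array([2.1, 2.7, 3.4, 0.2, 0.2, 0.2, 1.0, 1.0, 1.0, 0.1, -0.1, 0.1])
fit = least_squares(residuals, p0, args=(x, absorption, mcd, 3))
```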

Yao, Hiroshi; Saeki, Masanori; Sasaki, Akito

2012-02-28

38

Constrained spherical deconvolution analysis of the limbic network in human, with emphasis on a direct cerebello-limbic pathway  

PubMed Central

The limbic system is part of an intricate network involved in several functions, such as memory and emotion. Traditionally, the role of the cerebellum was considered to be mainly associated with motor control; however, evidence is emerging for a role of the cerebellum in learning skills, emotion control, and mnemonic and behavioral processes, also involving connections with the limbic system. In 15 normal subjects we studied limbic connections by probabilistic Constrained Spherical Deconvolution (CSD) tractography. The main result of our work was to prove for the first time in the human brain the existence of a direct cerebello-limbic pathway, which was previously hypothesized but never demonstrated. We also extended our analysis to the other limbic connections, including the cingulate fasciculus, inferior longitudinal fasciculus, uncinate fasciculus, anterior thalamic connections, and fornix. Although these pathways have already been described in the tractographic literature, we provided reconstruction, quantitative analysis, and Fractional Anisotropy (FA) right-left symmetry comparison using probabilistic CSD tractography, which is known to provide a potential improvement compared to previously used Diffusion Tensor Imaging (DTI) techniques. The demonstration of the existence of a cerebello-limbic pathway could constitute an important step in the knowledge of the anatomic substrate of non-motor cerebellar functions. Finally, the CSD statistical data about limbic connections in healthy subjects could be potentially useful in the diagnosis of pathological disorders damaging this system. PMID:25538606

Arrigo, Alessandro; Mormina, Enricomaria; Anastasi, Giuseppe Pio; Gaeta, Michele; Calamuneri, Alessandro; Quartarone, Angelo; De Salvo, Simona; Bruschetta, Daniele; Rizzo, Giuseppina; Trimarchi, Fabio; Milardi, Demetrio

2014-01-01

39

eCRAM computer algorithm for implementation of the charge ratio analysis method to deconvolute electrospray ionization mass spectra  

NASA Astrophysics Data System (ADS)

A computer program (eCRAM) has been developed for automated processing of electrospray mass spectra based on the charge ratio analysis method. The eCRAM algorithm deconvolutes electrospray mass spectra solely from the ratio of mass-to-charge (m/z) values of multiply charged ions. The program first determines the ion charge by correlating the ratio of m/z values for any two (i.e., consecutive or non-consecutive) multiply charged ions to the unique ratios of two integers. The mass, and subsequently the identity of the charge carrying species, is further determined from m/z values and charge states of any two ions. For the interpretation of high-resolution electrospray mass spectra, eCRAM correlates isotopic peaks that share the same isotopic compositions. This process is also performed through charge ratio analysis after correcting the multiply charged ions to their lowest common ion charge. The application of eCRAM algorithm has been demonstrated with theoretical mass-to-charge ratios for proteins lysozyme and carbonic anhydrase, as well as experimental data for both low and high-resolution FT-ICR electrospray mass spectra of a range of proteins (ubiquitin, cytochrome c, transthyretin, lysozyme and calmodulin). This also included the simulated data for mixtures by combining experimental data for ubiquitin, cytochrome c and transthyretin.
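A simplified sketch of the charge-ratio idea for the special case of two ions assumed to differ by one charge and to carry protons; eCRAM itself handles arbitrary, non-consecutive charge-state pairs. The m/z values below are approximate, ubiquitin-like illustrations.

```python
PROTON_MASS = 1.00728  # Da; the charge carrier is assumed to be a proton

def mass_from_adjacent_charge_states(mz_low_charge, mz_high_charge):
    """Estimate charge and neutral mass from two ions assumed to differ by one
    charge (mz_low_charge > mz_high_charge). Illustrative simplification only."""
    z = (mz_high_charge - PROTON_MASS) / (mz_low_charge - mz_high_charge)
    z = round(z)                               # charge of the lower charge state
    mass = z * (mz_low_charge - PROTON_MASS)   # neutral monoisotopic-style mass
    return z, mass

# Hypothetical peaks near m/z 1071.6 (8+) and 952.7 (9+) give a mass near 8565 Da
print(mass_from_adjacent_charge_states(1071.6, 952.7))
```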

Maleknia, Simin D.; Green, David C.

2010-02-01

40

Failure environment analysis tool applications  

NASA Technical Reports Server (NTRS)

Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure; or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

Pack, Ginger L.; Wadsworth, David B.

1993-01-01

41

Common Bolted Joint Analysis Tool  

NASA Technical Reports Server (NTRS)

Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

Imtiaz, Kauser

2011-01-01

42

Application of network identification by deconvolution method to the thermal analysis of the pump-probe transient thermoreflectance signal.  

PubMed

The paper discusses the possibility of applying the network identification by deconvolution (NID) method to the analysis of the thermal transient behavior due to a laser delta pulse excitation in a pump-probe transient thermoreflectance experiment. NID is a method based on linear RC network theory using Fourier's law of heat conduction. This approach allows the extraction of the thermal time constant spectrum of the sample under study after excitation by either a step or pulse function. Furthermore, using some mathematical transformations, the method allows analyzing the detail of the heat flux path through the sample, starting from the excited top free surface, by introducing two characteristic functions: the cumulative structure function and the differential structure function. We start with a review of the theoretical background of the NID method in the case of a step function excitation and then show how this method can be adjusted to be used in the case of a delta pulse function excitation. We show how the NID method can be extended to analyze the thermal transients of many optical experiments in which the excitation function is a laser pulse. The effect of the semi-infinite substrate as well as extraction of the interface and thin film thermal resistances will be discussed. PMID:19655973

Ezzahri, Y; Shakouri, A

2009-07-01

43

Automated Deconvolution of Overlapped Ion Mobility Profiles  

NASA Astrophysics Data System (ADS)

Presence of unresolved ion mobility (IM) profiles limits the efficient utilization of IM mass spectrometry (IM-MS) systems for isomer differentiation. Here, we introduce an automated ion mobility deconvolution (AIMD) computer software for streamlined deconvolution of overlapped IM-MS profiles. AIMD is based on a previously reported post-IM/collision-induced dissociation (CID) deconvolution approach [ J. Am. Soc. Mass Spectrom. 23, 1873 (2012)] and, unlike the previously reported manual approach, it does not require resampling of post-IM/CID data. A novel data preprocessing approach is utilized to improve the accuracy and efficiency of the deconvolution process. Results from AIMD analysis of overlapped IM profiles of data from (1) Waters Synapt G1 for a binary mixture of isomeric peptides (amino acid sequences: GRGDS and SDGRG) and (2) Waters Synapt G2-S for a binary mixture of isomeric trisaccharides (raffinose and isomaltotriose) are presented.

Brantley, Matthew; Zekavat, Behrooz; Harper, Brett; Mason, Rachel; Solouki, Touradj

2014-10-01

44

General Mission Analysis Tool (GMAT)  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

Hughes, Steven P.

2007-01-01

45

Flow Analysis Tool White Paper  

NASA Technical Reports Server (NTRS)

Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

Boscia, Nichole K.

2012-01-01

46

System analysis: Developing tools for the future  

SciTech Connect

This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report include a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

1996-02-01

47

Tool for Magnetic Analysis Package  

NSDL National Science Digital Library

The FIT-MART Launcher package is a self-contained file for simulating systems of interacting quantum magnetic moments ("spins"). The interactions are modeled using the Heisenberg model, calculations are carried out by numerically diagonalizing the matrix representation of the Heisenberg Hamiltonian, and several types of plots are generated to describe various aspects of the model. The FIT-MART package is a "Fully Integrated Tool for Magnetic Analysis in Research & Teaching" (hence the acronym) which provides a very simple interface for defining complex quantum spin models, carrying out complex calculations, and visualizing the results using several graphical representations. These representations include plots of the energy spectrum as well as plots of the magnetization and magnetic susceptibility as a function of temperature and magnetic field. The FIT-MART package is an Open Source Physics package written to help students as well as researchers who are studying magnetism. It is distributed as a ready-to-run (compiled) Java archive. Double-clicking the osp_fit_mart.jar file will run the package if Java is installed. In future versions of this package, curricular materials will be included to help students to learn about magnetism, and automated fitting routines will be included to help researchers quickly and easily model experimental data.

Engelhardt, Larry; Rainey, Cameron

2010-05-19
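
The calculation FIT-MART automates, numerical diagonalization of a Heisenberg Hamiltonian, can be sketched for the smallest non-trivial case. The snippet below builds the two-site spin-1/2 Heisenberg dimer and diagonalizes it; the coupling constant J is an arbitrary assumption, and this is plain numpy, not the FIT-MART package.

```python
import numpy as np

# Spin-1/2 operators.
sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = 0.5 * np.array([[0, -1j], [1j, 0]])
sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

def two_site(op_a, op_b):
    """Operator op_a acting on site 1 times op_b acting on site 2."""
    return np.kron(op_a, op_b)

# Heisenberg dimer: H = J * S1 . S2  (J > 0 antiferromagnetic; value assumed).
J = 1.0
H = J * (two_site(sx, sx) + two_site(sy, sy) + two_site(sz, sz))

energies = np.linalg.eigvalsh(H)
print(np.round(energies, 3))   # [-0.75, 0.25, 0.25, 0.25]: singlet below the triplet
```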

48

A tool for analysis of Internet metrics  

Microsoft Academic Search

Estimating the performance parameters of Internet is not an easy task, currently several estimation tools exist and many others are under development. Unfortunately, the tools only estimate a single metric, are not always robust and do not provide a friendly interface to the user. In this document, we present a novel multimetric analysis tool that is able to estimate accurately

Ramírez Pacheco; R. D. Torres

2005-01-01

49

ADVANCED POWER SYSTEMS ANALYSIS TOOLS  

SciTech Connect

The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using the scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which has a major impact on the fate of the combustion system. To optimize Atran key factors such as mineral fragmentation and coalescence, the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet application included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam with Newton Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity using the incorporation of grey-scale binning acquired by the SEM image was developed. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is dependent on the chemistry of the particle, it is possible to map chemically similar areas which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which is currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash that will be represented by that mineral on a frame-by-frame basis. The slag viscosity model was improved to provide improved predictions of slag viscosity and temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.

Robert R. Jensen; Steven A. Benson; Jason D. Laumb

2001-08-31

50

General Mission Analysis Tool (GMAT) Mathematical Specifications  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

Hughes, Steve

2007-01-01

51

Understanding Blind Deconvolution Algorithms.  

PubMed

Blind deconvolution is the recovery of a sharp version of a blurred image when the blur kernel is unknown. Recent algorithms have afforded dramatic progress, yet many aspects of the problem remain challenging and hard to understand. The goal of this paper is to analyze and evaluate recent blind deconvolution algorithms both theoretically and experimentally. We explain the previously reported failure of the naive MAP approach by demonstrating that it mostly favors no-blur explanations. On the other hand, we show that since the kernel size is often smaller than the image size, a MAP estimation of the kernel alone can be well constrained and accurately recover the true blur. The plethora of recent deconvolution techniques makes an experimental evaluation on ground-truth data important. We have collected blur data with ground truth and compared recent algorithms under equal settings. Additionally, our data demonstrates that the shift-invariant blur assumption made by most algorithms is often violated. PMID:21788664

Levin, Anat; Weiss, Yair; Durand, Fredo; Freeman, William T

2011-07-21
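
The review's point that the kernel alone is well constrained can be illustrated with a toy calculation: when a sharp image and its blurred counterpart are both available, a small kernel follows from a ridge-regularized least-squares fit. This is a hedged sketch, not any of the surveyed algorithms; the image, kernel, noise level, and regularization weight are assumptions, and a fully blind method would alternate this step with an image estimate.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(1)

sharp = rng.random((40, 40))
true_k = np.array([[0.0, 0.2, 0.0],
                   [0.2, 0.2, 0.2],
                   [0.0, 0.2, 0.0]])
blurred = convolve2d(sharp, true_k, mode="valid") + 0.005 * rng.standard_normal((38, 38))

# Each blurred pixel is a dot product of a 3x3 sharp patch with the kernel.
rows, cols = blurred.shape
X = np.array([sharp[i:i + 3, j:j + 3].ravel()
              for i in range(rows) for j in range(cols)])
b = blurred.ravel()

lam = 1e-3  # ridge regularization weight (assumed)
k = np.linalg.solve(X.T @ X + lam * np.eye(9), X.T @ b).reshape(3, 3)
print(np.round(k, 2))  # close to true_k: few unknowns, many constraints
```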

52

Integrating Reliability Analysis with a Performance Tool  

NASA Technical Reports Server (NTRS)

A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

1995-01-01

53

Statistical Tools for Forensic Analysis of Toolmarks  

SciTech Connect

Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

2004-04-22

54

GAIA: Graphical Astronomy and Image Analysis Tool  

NASA Astrophysics Data System (ADS)

GAIA is an image and data-cube display and analysis tool for astronomy. It provides the usual facilities of image display tools, plus more astronomically useful ones such as aperture and optimal photometry, contouring, source detection, surface photometry, arbitrary region analysis, celestial coordinate readout, calibration and modification, grid overlays, blink comparison, defect patching and the ability to query on-line catalogues and image servers. It can also display slices from data-cubes, extract and visualize spectra as well as perform full 3D rendering. GAIA uses the Starlink software environment (ascl:1110.012) and is derived from the ESO SkyCat tool (ascl:1109.019).

Draper, Peter W.; Gray, Norman; Berry, David S.; Taylor, Mark

2014-03-01

55

Fast holographic deconvolution: a new technique for precision radio interferometry  

E-print Network

We introduce the Fast Holographic Deconvolution method for analyzing interferometric radio data. Our new method is an extension of A-projection/software-holography/forward modeling analysis techniques and shares their ...

Goeke, Robert F.

56

Stochastic Simulation Tool for Aerospace Structural Analysis  

NASA Technical Reports Server (NTRS)

Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

Knight, Norman F.; Moore, David F.

2006-01-01
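
The Monte Carlo idea in the record above, scatter the design inputs and see which one drives the response, can be sketched with a toy response function. The function, tolerances, and the use of a simple correlation as a sensitivity proxy are all assumptions; they stand in for the finite element model and the MSC.Robust Design machinery.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Scatter the design inputs about nominal values (toy tolerances, assumed).
thickness = rng.normal(2.0, 0.05, n)     # panel thickness [mm]
modulus = rng.normal(70e3, 2e3, n)       # Young's modulus [MPa]
load = rng.normal(1.5, 0.10, n)          # applied pressure [MPa]

# Toy response: a stand-in for a finite element output such as peak deflection.
deflection = load / (modulus * thickness ** 3)

# Rank which input's variability most influences the output (correlation as a proxy).
for name, x in [("thickness", thickness), ("modulus", modulus), ("load", load)]:
    print(name, round(np.corrcoef(x, deflection)[0, 1], 2))
```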

57

Dynamic Analysis Tool for Legged Robots  

Microsoft Academic Search

The paper introduces a systematic approach for dealing with legged robot mechanism analysis. First, we briefly summarize basic mathematical tools for studying the dynamics of these multi-loop and parallel mechanisms using a unified spatial formulation which is useful for computer algorithms. The dynamic behavior analysis is based on two stages. The first one deals with establishing the equations of motion

F. B. Ouezdou; O. Bruneau; J. C. Guinot

1998-01-01

58

NEAT: Nebular Empirical Analysis Tool  

NASA Astrophysics Data System (ADS)

NEAT is a fully automated code which carries out a complete analysis of lists of emission lines to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally to estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances.

Wesson, R.; Stock, D.; Scicluna, P.

2014-11-01
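
NEAT's Monte Carlo propagation of line-flux uncertainties can be sketched in a few lines: resample the measured fluxes within their errors, recompute the derived quantity each time, and summarize the spread. The fluxes and the derived quantity (a simple flux ratio) are made up for illustration and are not NEAT's abundance calculations.

```python
import numpy as np

rng = np.random.default_rng(3)

# Measured emission-line fluxes and their 1-sigma uncertainties (toy values).
flux = {"line_a": (120.0, 6.0), "line_b": (40.0, 4.0)}

n_trials = 20000
a = rng.normal(*flux["line_a"], n_trials)
b = rng.normal(*flux["line_b"], n_trials)

# Propagate: recompute the derived quantity for every resampled flux set.
ratio = a / b

lo, med, hi = np.percentile(ratio, [16, 50, 84])
print(f"ratio = {med:.2f} (+{hi - med:.2f} / -{med - lo:.2f})")
```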

59

Total variation blind deconvolution  

Microsoft Academic Search

We present a blind deconvolution algorithm based on the total variational (TV) minimization method proposed by Acar and Vogel (1994). The motivation for regularizing with the TV norm is that it is extremely effective for recovering edges of images as well as some blurring functions, e.g., motion blur and out-of-focus blur. An alternating minimization (AM) implicit iterative scheme is devised

Tony F. Chan; Chiu-Kwong Wong

1998-01-01
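
A hedged, one-dimensional sketch of the alternating-minimization idea described above: gradient steps on the signal with a smoothed total-variation penalty alternate with gradient steps on the blur kernel. The signal, kernel, step sizes, and smoothing constant are assumptions, and this simplified explicit scheme is illustrative rather than the implicit AM iteration of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def conv(x, k):
    return np.convolve(x, k, mode="same")

# Piecewise-constant test signal blurred by an unknown kernel.
x_true = np.repeat([0.0, 1.0, 0.2, 0.8], 32)
k_true = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
y = conv(x_true, k_true) + 0.01 * rng.standard_normal(x_true.size)

x = y.copy()
k = np.ones(5) / 5.0
lam, eps, step = 0.02, 1e-3, 0.1   # TV weight, TV smoothing, step size (assumed)

for _ in range(300):
    # Signal step: data-fit gradient plus smoothed total-variation gradient.
    r = conv(x, k) - y
    dx = np.diff(x, append=x[-1])
    tv_grad = -np.diff(dx / np.sqrt(dx ** 2 + eps), prepend=0.0)
    x -= step * (np.convolve(r, k[::-1], mode="same") + lam * tv_grad)
    # Kernel step: least-squares gradient, then project to a normalized kernel.
    k -= step * np.array([np.dot(np.roll(x, i - 2), r) for i in range(5)]) / x.size
    k = np.clip(k, 0, None)
    k /= k.sum()

print(np.round(k, 2))   # estimated kernel; roughly recovers k_true
```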

60

The Vampir Performance Analysis Tool-Set  

Microsoft Academic Search

This paper presents the Vampir tool-set for performance analysis of parallel applications. It consists of the run-time measurement system VampirTrace and the visualization tools Vampir and VampirServer. It describes the major features and outlines the underlying implementation that is necessary to provide low overhead and good scalability. Furthermore, it gives a short overview about the development history and future work

Andreas Knüpfer; Holger Brunst; Jens Doleschal; Matthias Jurenz; Matthias Lieber; Holger Mickler; Matthias S. Muller; Wolfgang E. Nagel

2008-01-01

61

Built Environment Energy Analysis Tool Overview (Presentation)  

SciTech Connect

This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

Porter, C.

2013-04-01

62

Application in Alzheimer's Disease Early Detection Deconvolution  

E-print Network

Slide presentation outline: Application in Alzheimer's Disease Early Detection; Deconvolution; Feature Extraction from PET Images; Conclusions and Future Work.

Renaut, Rosemary

63

Windprofiler optimization using digital deconvolution procedures  

NASA Astrophysics Data System (ADS)

Digital improvements to data acquisition procedures used for windprofiler radars have the potential for improving the height coverage at optimum resolution, and permit improved height resolution. A few newer systems already use this capability. Real-time deconvolution procedures offer even further optimization, and this has not been effectively employed in recent years. In this paper we demonstrate the advantages of combining these features, with particular emphasis on the advantages of real-time deconvolution. Using several multi-core CPUs, we have been able to achieve speeds of up to 40 GHz from a standard commercial motherboard, allowing data to be digitized and processed without the need for any type of hardware except for a transmitter (and associated drivers), a receiver and a digitizer. No Digital Signal Processor chips are needed, allowing great flexibility with analysis algorithms. By using deconvolution procedures, we have then been able not only to optimize height resolution, but also to make advances in dealing with spectral contaminants like ground echoes and other near-zero-Hz spectral contamination. Our results also demonstrate the ability to produce fine-resolution measurements, revealing small-scale structures within the backscattered echoes that were previously not possible to see. Resolutions of 30 m are possible for VHF radars. Furthermore, our deconvolution technique allows the removal of range-aliasing effects in real time, a major bonus in many instances. Results are shown using new radars in Canada and Costa Rica.

Hocking, W. K.; Hocking, A.; Hocking, D. G.; Garbanzo-Salas, M.

2014-10-01
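
The core deconvolution step the record relies on can be sketched as regularized frequency-domain deconvolution of a received range profile by the known transmitted pulse. The pulse shape, reflectivity profile, and regularization constant below are assumptions, not details of the radars described.

```python
import numpy as np

rng = np.random.default_rng(5)

# Known transmitted pulse shape and a "true" fine-scale reflectivity profile.
pulse = np.hanning(16)
pulse /= pulse.sum()
reflectivity = np.zeros(256)
reflectivity[[60, 70, 150]] = [1.0, 0.6, 0.8]

received = np.convolve(reflectivity, pulse, mode="full")[:256]
received += 0.01 * rng.standard_normal(256)

# Regularized (Wiener-style) frequency-domain deconvolution.
P = np.fft.rfft(pulse, 256)
R = np.fft.rfft(received)
eps = 1e-3                               # regularization constant, assumed
estimate = np.fft.irfft(R * np.conj(P) / (np.abs(P) ** 2 + eps), 256)

print(np.argsort(estimate)[-3:])         # strongest recovered ranges, near 60, 70, 150
```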

64

Target deconvolution techniques in modern phenotypic profiling  

PubMed Central

The past decade has seen rapid growth in the use of diverse compound libraries in classical phenotypic screens to identify modulators of a given process. The subsequent process of identifying the molecular targets of active hits, also called ‘target deconvolution’, is an essential step for understanding compound mechanism of action and for using the identified hits as tools for further dissection of a given biological process. Recent advances in ‘omics’ technologies, coupled with in silico approaches and the reduced cost of whole genome sequencing, have greatly improved the workflow of target deconvolution and have contributed to a renaissance of ‘modern’ phenotypic profiling. In this review, we will outline how both new and old techniques are being used in the difficult process of target identification and validation as well as discuss some of the ongoing challenges remaining for phenotypic screening. PMID:23337810

Lee, Jiyoun; Bogyo, Matthew

2013-01-01

65

Photogrammetry Tool for Forensic Analysis  

NASA Technical Reports Server (NTRS)

A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.

Lane, John

2012-01-01
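
Steps 1-4 of the record above amount to expressing one calibration cube in another cube's coordinate system and then chaining the results into a global frame. The sketch below shows the core operation, a standard Kabsch (orthogonal Procrustes) fit of the rigid transform between two frames from matched reference points; the point coordinates are made up.

```python
import numpy as np

def rigid_transform(A, B):
    """Find R, t such that R @ A_i + t ~= B_i for matched 3-D points (Kabsch)."""
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

# Reference points of cube 2 in cube 2's own frame and as measured in cube 1's frame.
pts_cube2_frame = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
pts_cube1_frame = pts_cube2_frame @ R_true.T + np.array([2.0, -1.0, 0.5])

R, t = rigid_transform(pts_cube2_frame, pts_cube1_frame)
print(np.allclose(R, R_true), np.round(t, 2))   # True, [ 2.  -1.   0.5]
```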

66

CRAB: Distributed analysis tool for CMS  

NASA Astrophysics Data System (ADS)

CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers and adopts a data driven model for the end user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis work flow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee the physicists an efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

Sala, Leonardo; CMS Collaboration

2012-12-01

67

Design and Analysis Tools for Supersonic Inlets  

NASA Technical Reports Server (NTRS)

Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

Slater, John W.; Folk, Thomas C.

2009-01-01

68

Mars Reconnaissance Orbiter Uplink Analysis Tool  

NASA Technical Reports Server (NTRS)

This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.

Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

2008-01-01

69

Tools for Life Support Systems Analysis  

NASA Astrophysics Data System (ADS)

An analysis of the optimum level of closure of a life support system is a complex task involving hundreds, if not thousands, of parameters. In the absence of complete data on candidate technologies and a complete definition of the mission architecture and requirements, many assumptions are necessary. Because of the large number of parameters, it is difficult to fully comprehend and compare studies performed by different analysts. The Systems Integration, Modeling, and Analysis (SIMA) Project Element within NASA's Advanced Life Support (ALS) Project has taken measures to improve this situation by issuing documents that define ALS requirements, baseline assumptions, and reference missions. As a further step to capture and retain available knowledge and to facilitate system-level studies, various software tools are being developed. These include a database tool for storing, organizing, and updating technology parameters, modeling tools for evaluating time-average and dynamic system performance, and sizing tools for estimating overall system mass, volume, power, cooling, logistics, and crew time. This presentation describes ongoing work on the development and integration of these tools for life support systems analysis.

Lange, K.; Ewert, M.

70

Link Analysis Tools for Intelligence and Counterterrorism  

Microsoft Academic Search

Association rule mining is an important data analysis tool that can be applied with success to a variety of domains. However, most association rule mining algorithms seek to discover statistically signifl- cant patterns (i.e. those with considerable support). We argue that, in law-enforcement, intelligence and counterterrorism work, sometimes it is necessary to look for patterns which do not have large

Antonio Badia; Mehmed M. Kantardzic

2005-01-01

71

Comparing Work Skills Analysis Tools. Project Report.  

ERIC Educational Resources Information Center

This document outlines the processes and outcomes of a research project conducted to review work skills analysis tools (products and/or services) that profile required job skills and/or assess individuals' acquired skills. The document begins with a brief literature review and discussion of pertinent terminology. Presented next is a list of…

Barker, Kathryn

72

Integrated multidisciplinary analysis tool IMAT users' guide  

NASA Technical Reports Server (NTRS)

The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

Meissner, Frances T. (editor)

1988-01-01

73

Deconvolution in Astronomy: A Review  

Microsoft Academic Search

This article reviews different deconvolution methods. The all-pervasive presence of noise is what makes deconvolution particularly difficult. The diversity of resulting algorithms reflects different ways of estimating the true signal under various idealizations of its properties. Different ways of approaching signal recovery are based on different instrumental noise models, whether the astronomical objects are pointlike or extended, and indeed on

J. L. Starck; E. Pantin; F. Murtagh

2002-01-01

74

Compact multiframe blind deconvolution.  

PubMed

We describe a multiframe blind deconvolution (MFBD) algorithm that uses spectral ratios (the ratio of the Fourier spectra of two data frames) to model the inherent temporal signatures encoded by the observed images. In addition, by focusing on the separation of the object spectrum and system transfer functions only at spatial frequencies where the measured signal is above the noise level, we significantly reduce the number of unknowns to be determined. This "compact" MFBD yields high-quality restorations in a much shorter time than is achieved with MFBD algorithms that do not model the temporal signatures; it may also provide higher-fidelity solutions. PMID:21403711

Hope, Douglas A; Jefferies, Stuart M

2011-03-15
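
The spectral ratio at the heart of the compact MFBD idea can be sketched directly: for two frames of the same object, the ratio of their Fourier spectra cancels the object and tracks the change in the transfer function, provided it is evaluated only where the signal is above the noise. The toy transfer functions, noise level, and threshold below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def gaussian_otf(shape, sigma):
    """Toy transfer function: a Gaussian low-pass in the frequency domain."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    return np.exp(-2 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))

obj = rng.random((64, 64))
O = np.fft.fft2(obj)
noise_sigma = 0.01

# Two frames of the same object seen through different transfer functions.
H1, H2 = gaussian_otf(obj.shape, 1.0), gaussian_otf(obj.shape, 2.0)
D1 = H1 * O + noise_sigma * np.fft.fft2(rng.standard_normal(obj.shape))
D2 = H2 * O + noise_sigma * np.fft.fft2(rng.standard_normal(obj.shape))

# Spectral ratio D2/D1 ~= H2/H1: the unknown object spectrum cancels. Keep only
# frequencies where the measured signal is well above the noise level (assumed cut).
noise_level = noise_sigma * np.sqrt(obj.size)
mask = np.abs(D1) > 10 * noise_level
ratio = np.zeros_like(D2)
ratio[mask] = D2[mask] / D1[mask]

print(int(mask.sum()), "frequencies kept")
print(float(np.abs(ratio[mask] - (H2 / H1)[mask]).mean()))  # small: ratio tracks H2/H1
```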

75

From sensor networks to connected analysis tools  

NASA Astrophysics Data System (ADS)

Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the applications and web services developed so far as well as to be developed in the future.

Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

2012-04-01

76

Sequential deconvolution input reconstruction  

NASA Astrophysics Data System (ADS)

The reconstruction of inputs from measured outputs is examined. It is shown that the rank deficiency that arises in de-convolving non-collocated arrangements is associated with a kernel that is non-zero only over the part of the time axis where delay from wave propagation prevents uniqueness. Input deconvolution, therefore, follows in the same manner for collocated and non-collocated scenarios, collocation being the special case where the prediction lag can be zero. This paper illustrates that deconvolution carried out on a sliding window is a conditionally stable process and the condition for stability is derived. Examination of the Cramer-Rao Lower Bound of the inputs in frequency shows that the inference model should be formulated such that the spectra of the inputs to be reconstructed, and of the realized measurement noise, are within the model bandwidth. An expression for the error in the reconstructed input as a function of the noise sequence is developed and is used to control the regularization, when regularization is needed. The paper brings attention to the fact that finite dimensional models cannot display true dead time and that failure to recognize this matter has led to algorithms that, in general, propose to violate the physical constraints.

Bernal, Dionisio; Ussia, Alessia

2015-01-01
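
A hedged sketch of the underlying input-reconstruction problem: with a known impulse response, the measured output defines a Toeplitz system for the unknown input, solved here with Tikhonov regularization. The system, noise level, and regularization weight are assumptions, and this batch solution ignores the sliding-window and prediction-lag machinery the paper actually analyzes.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import lfilter

rng = np.random.default_rng(7)
n = 200

# Assumed discrete-time system: a lightly damped second-order filter.
b_coef, a_coef = [0.05], [1.0, -1.7, 0.74]
impulse = lfilter(b_coef, a_coef, np.eye(1, n).ravel())

# Unknown input and the noisy measured output.
u_true = np.sin(2 * np.pi * np.arange(n) / 40) * (np.arange(n) > 20)
y = lfilter(b_coef, a_coef, u_true) + 0.002 * rng.standard_normal(n)

# Convolution (Toeplitz) model y = H u, solved with Tikhonov regularization.
H = toeplitz(impulse, np.zeros(n))
lam = 1e-4
u_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

rms_error = np.sqrt(np.mean((u_hat - u_true) ** 2))
print(round(float(rms_error), 3))   # small compared with the unit-amplitude input
```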

77

Data Analysis Tools for NSTX-U Physics Meeting  

E-print Network

Slide presentation: Data Analysis Tools for NSTX-U, Bill Davis and Stan Kaye, Physics Meeting, B-318, Aug. 26, 2013. Provides an overview of web-based data analysis tools and the question of which tools should be developed.

Princeton Plasma Physics Laboratory

78

Integrated tools for control-system analysis  

NASA Technical Reports Server (NTRS)

The basic functions embedded within a user friendly software package (MATRIXx) are used to provide a high level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations such as the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

1989-01-01
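
Two of the evaluations listed above, the Bode frequency response and the closed-loop eigenvalues, can be sketched for a toy plant with proportional output feedback. The plant, gain, and the use of scipy in place of MATRIXx are assumptions for illustration.

```python
import numpy as np
from scipy import signal

# Toy plant: G(s) = 1 / (s^2 + 0.8 s + 1), with proportional output feedback gain k.
A = np.array([[0.0, 1.0], [-1.0, -0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])
k = 2.0

# Bode frequency response of the open-loop plant at a few sample frequencies.
w, mag, phase = signal.bode(signal.StateSpace(A, B, C, D), np.logspace(-1, 1, 5))
print(np.round(mag, 1))      # gain [dB] at the sampled frequencies

# Closed-loop eigenvalues with u = -k y.
A_cl = A - k * (B @ C)
print(np.round(np.linalg.eigvals(A_cl), 2))   # all real parts negative: stable
```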

79

Challenges Facing Design and Analysis Tools  

NASA Technical Reports Server (NTRS)

The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

2001-01-01

80

Tools for Next Generation Sequencing Data Analysis  

PubMed Central

As NGS technology continues to improve, the amount of data generated per run grows exponentially. Unfortunately, the primary bottleneck in NGS studies is still bioinformatics analysis. Not all researchers have access to a bioinformatics core or dedicated bioinformatician. Additionally, much of the software for NGS analyses is written to run in a Unix / Linux environment. Researchers unfamiliar with the Unix command line may be unable to use these tools, or face a steep learning curve in trying to do so. Commercial packages exist, such as the CLC Genomics Workbench, DNANexus, and GenomeQuest. However, these commercial packages often incorporate proprietary algorithms to perform data analysis and may be costly. Galaxy provides a solution to this problem by incorporating popular open-source and community linux command line tools into an easy to use web-based environment. After sequence data has been uploaded and mapped, there are a variety of workflows for NGS analyses that use open-source tools. This includes peak-calling analyses for ChIP-Seq (MACS, GeneTrack indexer, Peak predictor), RNA-Seq (Tophat, Cufflinks), and finding small insertions, deletions, and SNPs using SAMtools. Any researcher can apply a workflow to his NGS data and retrieve results, without having to interact with a command line. Additionally, since Galaxy is cloud-based, expensive computing hardware for performing analyses is not needed. In this presentation we will provide an overview of two popular open source RNA-Seq analysis tools, Tophat and Cufflinks, and demonstrate how they can be used in Galaxy.

Bodi, K.

2011-01-01

81

Compressive blind image deconvolution.  

PubMed

We propose a novel blind image deconvolution (BID) regularization framework for compressive sensing (CS) based imaging systems capturing blurred images. The proposed framework relies on a constrained optimization technique, which is solved by a sequence of unconstrained sub-problems, and allows the incorporation of existing CS reconstruction algorithms in compressive BID problems. As an example, a non-convex lp quasi-norm with 0 < p < 1 is employed as a regularization term for the image, while a simultaneous auto-regressive regularization term is selected for the blur. Nevertheless, the proposed approach is very general and it can be easily adapted to other state-of-the-art BID schemes that utilize different, application specific, image/blur regularization terms. Experimental results, obtained with simulations using blurred synthetic images and real passive millimeter-wave images, show the feasibility of the proposed method and its advantages over existing approaches. PMID:23744684

Amizic, Bruno; Spinoulas, Leonidas; Molina, Rafael; Katsaggelos, Aggelos K

2013-10-01
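
One common way to handle the lp quasi-norm (0 < p < 1) mentioned above is iteratively reweighted least squares, sketched here on a small compressive-sensing recovery problem. The sensing matrix, sparsity level, p, and weighting scheme are assumptions; this is not the authors' full blind-deconvolution framework, which also estimates the blur.

```python
import numpy as np

rng = np.random.default_rng(8)

n, m, p = 100, 40, 0.7          # signal length, measurements, quasi-norm exponent (assumed)
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.standard_normal(5) * 3

A = rng.standard_normal((m, n)) / np.sqrt(m)    # compressive sensing matrix
y = A @ x_true

# lp-regularized recovery via iteratively reweighted least squares (IRLS).
lam, eps = 1e-3, 1e-6
x = A.T @ y
for _ in range(50):
    w = p * (x ** 2 + eps) ** (p / 2 - 1)        # weights from the current estimate
    x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)

print(round(float(np.abs(x - x_true).max()), 3))  # small: the sparse signal is recovered
```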

82

JOINT BLIND DECONVOLUTION AND SPECTRAL UNMIXING OF HYPERSPECTRAL IMAGES  

E-print Network

Only a fragment of the abstract is recoverable: there has been extensive work on imaging through turbulence for acquiring useful images; the interest here is spectral imaging for space applications (Qiang Zhang, Wake Forest University, Winston-Salem, NC 27109).

Plemmons, Robert J.

83

ATOM: a system for building customized program analysis tools  

Microsoft Academic Search

ATOM (Analysis Tools with OM) is a single framework for building a wide range of customized program analysis tools. It provides the common infrastructure present in all code-instrumenting tools; this is the difficult and time-consuming part. The user simply defines the tool-specific details in instrumentation and analysis routines. Building a basic block counting tool like Pixie with ATOM requires only

Amitabh Srivastava; Alan Eustace

1994-01-01

84

Independent evaluation of a commercial deconvolution reporting software for gas chromatography mass spectrometry analysis of pesticide residues in fruits and vegetables.  

PubMed

The gas chromatography mass spectrometry (GC-MS) deconvolution reporting software (DRS) from Agilent Technologies has been evaluated for its ability as a screening tool to detect a large number of pesticides in incurred and fortified samples extracted with acetone/dichloromethane/light petroleum (Mini-Luke method). The detection of pesticides is based on fixed retention times using retention time locking (RTL) and full scan mass spectral comparison with a partly custom-built automated mass spectral deconvolution and identification system (AMDIS) database. The GC-MS was equipped with a programmable temperature vaporising (PTV) injector system which enables more sample to be injected. In a blind study of 52 real samples a total number of 158 incurred pesticides were found. In addition to the 85 pesticides found by manual interpretation of GC-NPD/ECD chromatograms, the DRS revealed 73 more pesticides (+46%). The DRS system also shows its potential to discover pesticides which are normally not searched for (EPN in long beans from Thailand). A spiking experiment was performed on blank matrices of apple, orange and lettuce with 177 different pesticides at concentration levels of 0.02 and 0.1 mg/kg. The samples were analysed on GC-MS full scan and the AMDIS match factor was used as a mass spectral quality criterion. The threshold level of the AMDIS match factor was set at 20 to eliminate most of the false positives. AMDIS match factors from 20 up to 69 are regarded only as an indication of a positive hit and must be followed by manual interpretation. Pesticides giving AMDIS match factors of ≥ 70 are regarded as identified. To simplify and decrease the large amount of data generated at each concentration level, the AMDIS match factors ≥ 20 were averaged (mean AMF) for each pesticide across the commodities and their replicates. Among 177 different pesticides spiked at the 0.02 and 0.1 mg/kg levels, the percentages of mean AMF values ≥ 70 were 23% and 80%, respectively. For 531 individual detections of pesticides (177 pesticides x 3 replicates) giving an AMDIS match factor ≥ 20 in apple, orange and lettuce, the detection rates at 0.02 mg/kg were 71%, 63% and 72%, respectively. For the 0.1 mg/kg level the detection rates were 89%, 85% and 89%, respectively. In real samples some manual interpretation must be performed in addition. However, screening by GC-MS/DRS is about 5-10 times faster compared to screening with GC-NPD/ECD because the time used for manual interpretation is much shorter and there is no need for re-injection on GC-MS for the identification of suspect peaks found on GC-NPD/ECD. PMID:20172528

Norli, Hans Ragnar; Christiansen, Agnethe; Holen, Børge

2010-03-26
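
The screening rules described above reduce to simple bookkeeping on the AMDIS match factors: values of at least 70 count as identified, values from 20 to 69 as indications needing manual review, and the factors of at least 20 are averaged into a mean AMF. A sketch with made-up match factors:

```python
import numpy as np

# Hypothetical AMDIS match factors for one pesticide: 3 commodities x 3 replicates.
match_factors = np.array([[82, 75, 68],
                          [71, 15, 74],
                          [90, 66, 79]])

above_20 = match_factors[match_factors >= 20]
mean_amf = above_20.mean()                      # mean AMF over hits with AMF >= 20
identified = match_factors >= 70                # identified detections
indicated = (match_factors >= 20) & (match_factors < 70)

print(round(mean_amf, 1))
print("detection rate (AMF >= 20):", (match_factors >= 20).mean())
print("identified:", identified.sum(), "indication only:", indicated.sum())
```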

85

Data Analysis with Graphical Models: Software Tools  

NASA Technical Reports Server (NTRS)

Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

Buntine, Wray L.

1994-01-01
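
The exact Bayes factors mentioned above are easiest to see in a conjugate example. The sketch below compares a fixed-probability binomial model against a Beta-Binomial alternative whose parameter is integrated out in closed form; the prior and data are made-up values, not an example from the paper.

```python
from math import comb, lgamma, exp, log

def log_beta(a, b):
    """Log of the Beta function via log-gamma."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

# Data: k successes out of n trials (made-up values).
n, k = 40, 29

# M0: theta fixed at 0.5.  M1: theta ~ Beta(1, 1), integrated out exactly.
log_m0 = log(comb(n, k)) + n * log(0.5)
log_m1 = log(comb(n, k)) + log_beta(k + 1, n - k + 1) - log_beta(1, 1)

bayes_factor = exp(log_m1 - log_m0)
print(round(bayes_factor, 2))   # >1 favours the Beta-Binomial model for these data
```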

86

Enhancement of Local Climate Analysis Tool  

NASA Astrophysics Data System (ADS)

The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

2012-12-01

87

Microfracturing and new tools improve formation analysis  

SciTech Connect

This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wire line logs, and temperature logging in air-filled holes which are new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are created usually at several different depths to determine stress variation as a function of depth and rock type. To obtain and oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open hole parameters for designing the main fracture treatment.

McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))

1992-12-07

88

GIS-based hydrogeochemical analysis tools (QUIMET)  

NASA Astrophysics Data System (ADS)

A software platform (QUIMET) was developed to improve the sorting, analysis, calculations, visualizations, and interpretations of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The instruments for analysis cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schöeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows performing a complete statistical analysis of the data including descriptive statistic univariate and bivariate analysis, the latter including generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.

Velasco, V.; Tubau, I.; Vázquez-Suñè, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernàndez-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

2014-09-01
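
The ionic balance calculation listed among QUIMET's instruments is a standard check that cation and anion charge in a water analysis agree. A sketch with a made-up major-ion analysis and an assumed 5% acceptance threshold:

```python
# Major-ion analysis in mg/L (made-up sample) converted to meq/L for the balance.
cations = {"Ca": (80.0, 40.08, 2), "Mg": (24.0, 24.31, 2),
           "Na": (46.0, 22.99, 1), "K": (4.0, 39.10, 1)}
anions = {"HCO3": (244.0, 61.02, 1), "SO4": (96.0, 96.06, 2),
          "Cl": (71.0, 35.45, 1), "NO3": (12.0, 62.00, 1)}

def meq(ions):
    """Sum of milliequivalents per litre: concentration / molar mass * charge."""
    return sum(mg_l / molar_mass * charge for mg_l, molar_mass, charge in ions.values())

cat, an = meq(cations), meq(anions)
balance_error = 100 * (cat - an) / (cat + an)

print(round(cat, 2), round(an, 2), round(balance_error, 1))
print("acceptable" if abs(balance_error) <= 5 else "check the analysis")  # 5% cut assumed
```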

89

Distribution System Analysis Tools for Studying High Penetration of PV  

E-print Network

Report fragment: describes the project titled "Distribution System Analysis Tools for Studying High Penetration of PV with Grid Support Features."

90

Software tools supporting business process analysis and modelling  

Microsoft Academic Search

Examines the wide range of business process analysis/modelling (BPA/M) tools available, and compares the features of 12 specific tools. Presents two case studies with examples of software tool analysis results. The discussion addresses whether these tools meet the needs of managers of change and business process re-engineering (BPR) initiatives, and offers suggestions for future tool evolution. The focus is to

Bing Yu; David T. Wright

1997-01-01

91

Three-dimensional analysis tool for segmenting and measuring the structure of telomeres in mammalian nuclei  

NASA Astrophysics Data System (ADS)

Quantitative analysis in combination with fluorescence microscopy calls for innovative digital image measurement tools. We have developed a three-dimensional tool for segmenting and analyzing FISH stained telomeres in interphase nuclei. After deconvolution of the images, we segment the individual telomeres and measure a distribution parameter we call ρT. This parameter describes if the telomeres are distributed in a sphere-like volume (ρT ~ 1) or in a disk-like volume (ρT >> 1). Because of the statistical nature of this parameter, we have to correct for the fact that we do not have an infinite number of telomeres to calculate this parameter. In this study we show a way to do this correction. After sorting mouse lymphocytes and calculating ρT and using the correction introduced in this paper we show a significant difference between nuclei in G2 and nuclei in either G0/G1 or S phase. The mean values of ρT for G0/G1, S and G2 are 1.03, 1.02 and 13 respectively.

Vermolen, Bart J.; Young, Ian T.; Chuang, Alice; Wark, Landon; Chuang, Tony; Mai, Sabine; Garini, Yuval

2005-03-01
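
The record does not give the formula for the distribution parameter, so the sketch below uses an assumed stand-in with the same qualitative behaviour: the ratio of the largest to the smallest eigenvalue of the covariance matrix of the telomere positions, which is of order 1 for a sphere-like cloud and much larger for a disk-like one. It is an illustration only, not necessarily the published definition.

```python
import numpy as np

rng = np.random.default_rng(9)

def flatness(points):
    """Illustrative stand-in for a sphere-versus-disk distribution parameter:
    the ratio of the largest to the smallest eigenvalue of the point cloud's
    covariance matrix (assumed definition, for illustration only)."""
    evals = np.linalg.eigvalsh(np.cov(points.T))
    return evals[-1] / evals[0]

# Simulated telomere centroids: a sphere-like cloud and a flattened, disk-like cloud.
sphere_like = rng.standard_normal((40, 3))
disk_like = rng.standard_normal((40, 3)) * np.array([1.0, 1.0, 0.1])

print(round(flatness(sphere_like), 1))   # of order 1
print(round(flatness(disk_like), 1))     # much greater than 1
```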

92

SEAT: A strategic engagement analysis tool  

SciTech Connect

The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

Dreicer, J.; Michelsen, C.; Morgeson, D.

1988-01-01

93

Global spatial deconvolution of Lunar Prospector Th abundances

E-print Network

Abstract fragment: reports the first global spatial deconvolution analysis of planetary gamma-ray data, applied to the global distribution of lunar Th abundances measured by Lunar Prospector's gamma-ray instrument (authors include D. J. Lawrence and R. C. Puetter).

Spudis, Paul D.

94

Statistical Tools for Forensic Analysis of Toolmarks  

Microsoft Academic Search

Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to

David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

2004-01-01

95

Automated Steel Cleanliness Analysis Tool (ASCAT)  

SciTech Connect

The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled "Inclusion Analysis to Predict Casting Behavior" was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual-use instrument). In summary, the ASCAT will significantly advance the tools of the industry and address an urgent and broadly recognized need of the steel industry. 
Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

2005-12-30

96

Defining Digital Forensic Examination and Analysis Tools Using Abstraction Layers  

Microsoft Academic Search

This paper uses the theory of abstraction layers to describe the purpose and goals of digital forensic analysis tools. Using abstraction layers, we identify where tools can introduce errors and provide requirements that the tools must follow. Categories of forensic analysis types are also defined based on the abstraction layers. Abstraction layers are not a new concept, but their usage

Brian Carrier

2002-01-01

97

PyRAT - python radiography analysis tool (u)  

SciTech Connect

PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on Linux and Windows platforms. It is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process, and it utilizes the NOMAD mixed-variable optimization tool to perform the optimization.

Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory; Armstrong, Jerawan C [Los Alamos National Laboratory

2011-01-14

98

Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)  

SciTech Connect

NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

Melaina, M.; Penev, M.

2012-09-01

99

Built Environment Analysis Tool: April 2013  

SciTech Connect

This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

Porter, C.

2013-05-01

100

ISHM Decision Analysis Tool: Operations Concept  

NASA Technical Reports Server (NTRS)

The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. This paper describes both of these aspects.

2006-01-01

101

Bayesian spectral deconvolution with the exchange Monte Carlo method.  

PubMed

An analytical method to deconvolute spectral data into a number of simple bands is extremely important in the analysis of the chemical properties of matter. However, there are two fundamental problems with such deconvolution methods. One is how to determine the number of bands without resorting to heuristics. The other is the difficulty of avoiding parameter solutions trapped in local minima, which arises from the hierarchy and the nonlinearity of the system. In this study, we propose a novel method of spectral deconvolution based on Bayesian estimation with the exchange Monte Carlo method, which combines the integral approximation of stochastic complexity with exchange Monte Carlo sampling. We also experimentally show its effectiveness on synthetic data and on reflectance spectral data of olivine, one of the most common minerals of terrestrial planets. PMID:22226618
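
For orientation, the sketch below shows one generic shape such an approach can take: fitting a spectrum with a fixed number of Gaussian bands by sampling band parameters with a small replica-exchange (exchange Monte Carlo) scheme in Python. This is a minimal illustration, not the authors' algorithm; the model-selection step over the number of bands (via stochastic complexity) is omitted, and all function names, temperatures, and step sizes are assumptions.

    import numpy as np

    def gaussian_bands(x, params):
        """Sum of Gaussian bands; params is a flat array [amp, center, width] * K."""
        y = np.zeros_like(x)
        for a, c, w in params.reshape(-1, 3):
            y += a * np.exp(-0.5 * ((x - c) / (abs(w) + 1e-6)) ** 2)
        return y

    def energy(x, y, params, noise_sd=0.05):
        """Negative log-likelihood (up to a constant) for Gaussian observation noise."""
        r = y - gaussian_bands(x, params)
        return 0.5 * np.sum(r ** 2) / noise_sd ** 2

    def exchange_mc(x, y, k_bands, n_steps=20000, temps=(1.0, 2.0, 4.0, 8.0), step=0.02, seed=0):
        """Toy exchange Monte Carlo: parallel Metropolis chains plus replica swaps."""
        rng = np.random.default_rng(seed)
        n_rep = len(temps)
        params = [rng.uniform(0.1, 1.0, 3 * k_bands) for _ in range(n_rep)]
        energies = [energy(x, y, p) for p in params]
        best, best_e = params[0].copy(), energies[0]
        for _ in range(n_steps):
            # Metropolis update within each replica at its own temperature.
            for r in range(n_rep):
                prop = params[r] + rng.normal(0.0, step, params[r].size)
                e_new = energy(x, y, prop)
                if rng.random() < np.exp(min(0.0, (energies[r] - e_new) / temps[r])):
                    params[r], energies[r] = prop, e_new
            # Propose a swap between a random pair of adjacent temperatures.
            r = rng.integers(0, n_rep - 1)
            d = (1.0 / temps[r] - 1.0 / temps[r + 1]) * (energies[r] - energies[r + 1])
            if rng.random() < np.exp(min(0.0, d)):
                params[r], params[r + 1] = params[r + 1], params[r]
                energies[r], energies[r + 1] = energies[r + 1], energies[r]
            if energies[0] < best_e:
                best, best_e = params[0].copy(), energies[0]
        return best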

Nagata, Kenji; Sugita, Seiji; Okada, Masato

2012-04-01

102

Interpretation and deconvolution of nanodisc native mass spectra.  

PubMed

Nanodiscs are a promising system for studying gas-phase and solution complexes of membrane proteins and lipids. We previously demonstrated that native electrospray ionization allows mass spectral analysis of intact Nanodisc complexes at single lipid resolution. This report details an improved theoretical framework for interpreting and deconvoluting native mass spectra of Nanodisc lipoprotein complexes. In addition to the intrinsic lipid count and charge distributions, Nanodisc mass spectra are significantly shaped by constructive overlap of adjacent charge states at integer multiples of the lipid mass. We describe the mathematical basis for this effect and develop a probability-based algorithm to deconvolute the underlying mass and charge distributions. The probability-based deconvolution algorithm is applied to a series of dimyristoylphosphatidylcholine Nanodisc native mass spectra and used to provide a quantitative picture of the lipid loss in gas-phase fragmentation. PMID:24353133
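
As a rough illustration of how peaks from many charge states reinforce at the correct neutral mass, the toy sketch below maps each observed m/z peak to neutral-mass hypotheses over a range of charges and histograms them. It is not the authors' probability-based deconvolution algorithm; the charge range, bin width, and the approximate DMPC lipid mass quoted in the comment are assumptions chosen only for illustration.

    import numpy as np

    PROTON = 1.007276  # Da

    def candidate_masses(mz_peaks, intensities, charges=range(5, 31)):
        """Map each m/z peak to neutral-mass hypotheses over an assumed charge range."""
        masses, weights = [], []
        for mz, inten in zip(mz_peaks, intensities):
            for z in charges:
                masses.append(z * (mz - PROTON))   # neutral mass for charge z
                weights.append(inten)
        return np.array(masses), np.array(weights)

    def mass_histogram(masses, weights, bin_width=50.0):
        """Histogram the hypotheses; true masses reinforce across charge states,
        producing a comb spaced by the lipid mass (roughly 678 Da for DMPC)."""
        bins = np.arange(masses.min(), masses.max() + bin_width, bin_width)
        hist, edges = np.histogram(masses, bins=bins, weights=weights)
        return 0.5 * (edges[:-1] + edges[1:]), hist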

Marty, Michael T; Zhang, Hao; Cui, Weidong; Gross, Michael L; Sligar, Stephen G

2014-02-01

103

Interpretation and Deconvolution of Nanodisc Native Mass Spectra  

NASA Astrophysics Data System (ADS)

Nanodiscs are a promising system for studying gas-phase and solution complexes of membrane proteins and lipids. We previously demonstrated that native electrospray ionization allows mass spectral analysis of intact Nanodisc complexes at single lipid resolution. This report details an improved theoretical framework for interpreting and deconvoluting native mass spectra of Nanodisc lipoprotein complexes. In addition to the intrinsic lipid count and charge distributions, Nanodisc mass spectra are significantly shaped by constructive overlap of adjacent charge states at integer multiples of the lipid mass. We describe the mathematical basis for this effect and develop a probability-based algorithm to deconvolute the underlying mass and charge distributions. The probability-based deconvolution algorithm is applied to a series of dimyristoylphosphatidylcholine Nanodisc native mass spectra and used to provide a quantitative picture of the lipid loss in gas-phase fragmentation.

Marty, Michael T.; Zhang, Hao; Cui, Weidong; Gross, Michael L.; Sligar, Stephen G.

2013-12-01

104

Interpretation and Deconvolution of Nanodisc Native Mass Spectra  

PubMed Central

Nanodiscs are a promising system for studying gas-phase and solution complexes of membrane proteins and lipids. We previously demonstrated that native electrospray ionization allows mass spectral analysis of intact Nanodisc complexes at single lipid resolution. This report details an improved theoretical framework for interpreting and deconvoluting native mass spectra of Nanodisc lipoprotein complexes. In addition to the intrinsic lipid count and charge distributions, Nanodisc mass spectra are significantly shaped by constructive overlap of adjacent charge states at integer multiples of the lipid mass. We describe the mathematical basis for this effect and develop a probability-based algorithm to deconvolute the underlying mass and charge distributions. The probability-based deconvolution algorithm is applied to a series of dimyristoylphosphatidylcholine Nanodisc native mass spectra and used to provide a quantitative picture of the lipid loss in gas-phase fragmentation. PMID:24353133

Marty, Michael T.; Zhang, Hao; Cui, Weidong; Gross, Michael L.; Sligar, Stephen G.

2014-01-01

105

Application of deconvolution based pattern recognition algorithm for identification of rings in spectra from RICH detectors  

NASA Astrophysics Data System (ADS)

This paper proposes a new pattern recognition algorithm to determine rings in two-dimensional spectra from RICH detectors. It defines a two-dimensional boosted Gold deconvolution algorithm. The paper also thoroughly analyzes the influence of the input parameters for different kinds of data. By choosing suitable input parameters for the deconvolution, one obtains an efficient tool for identifying rings in two-dimensional spectra. Illustrative examples argue in favor of the proposed algorithm.
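
The Gold algorithm named in the abstract is a positivity-preserving multiplicative deconvolution scheme. A one-dimensional sketch with the optional boosting step is given below for orientation; the cited work applies a two-dimensional boosted variant to ring images, and the iteration count and boosting power here are illustrative assumptions only.

    import numpy as np

    def gold_deconvolution(y, h, n_iter=200, boost_every=50, boost_power=1.2):
        """1-D Gold deconvolution of spectrum y with response h (both non-negative)."""
        y = np.asarray(y, dtype=float)
        h = np.asarray(h, dtype=float)
        x = np.full_like(y, max(y.mean(), 1e-6))          # positive starting estimate
        hty = np.correlate(y, h, mode="same")             # H^T y
        for k in range(1, n_iter + 1):
            hthx = np.correlate(np.convolve(x, h, mode="same"), h, mode="same")  # H^T H x
            x = x * hty / np.maximum(hthx, 1e-12)          # multiplicative Gold update
            if boost_every and k % boost_every == 0:
                x = np.power(x, boost_power)               # "boosting" sharpens peaks
        return x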

Morháč, Miroslav; Hlaváč, Stanislav; Veselský, Martin; Matoušek, Vladislav

2010-09-01

106

Spacecraft Electrical Power System (EPS) generic analysis tools and techniques  

NASA Technical Reports Server (NTRS)

An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

Morris, Gladys M.; Sheppard, Mark A.

1992-01-01

107

Tools for Decision Analysis: Analysis of Risky Decisions  

NSDL National Science Digital Library

This site offers a decision making procedure for solving complex problems step by step. It presents the decision-analysis process for both public and private decision-making, using different decision criteria, different types of information, and information of varying quality. It describes the elements in the analysis of decision alternatives and choices, as well as the goals and objectives that guide decision-making. The key issues related to a decision-maker's preferences regarding alternatives, criteria for choice, and choice modes, together with the risk assessment tools are also presented.

108

[An experimental study of liver perfusion using non-diffusible radiotracers: differentiation of the arterial and portal venous components by deconvolution analysis of first-pass time-activity curves].  

PubMed

The transfer function of the liver perfusion is an idealized time-activity curve that could be registered over the liver if a non-diffusible tracer were injected directly into the abdominal aorta and no tracer recirculation occurred. The reproducibility of the transfer function was experimentally investigated in foxhounds. Both the routes of tracer application and the modes of data evaluation were varied, and the perfusion was investigated under physiological and pathological conditions. The transfer function was calculated by deconvolution analysis of first-pass time-activity curves using the matrix regularization method. The transfer function showed clearly distinguishable arterial and portal-venous components. Repeated peripheral venous and central aortic applications resulted in reproducible curves. In addition to the arterial and portal-venous components, the subcomponents of the portal-venous component could also be identified in the transfer function after ligation of the appropriate vessels. The accuracy of the mathematical procedure was tested by computer simulations. The simulation studies also demonstrated that the matrix regularization technique is suitable for deconvolution analysis of time-activity curves even when they are significantly contaminated by statistical noise. Calculation of the transfer function of liver perfusion and of its quantitative parameters thus seems to be a reliable method for non-invasive investigation of liver hemodynamics under physiological and pathological conditions. PMID:3194234
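
Matrix-regularization deconvolution of first-pass curves can be pictured as a regularized least-squares solve built from a discrete convolution matrix. The sketch below uses a generic Tikhonov (second-difference) penalty; it is not the exact regularization scheme of the study, and the penalty operator and weight lam are assumptions.

    import numpy as np
    from scipy.linalg import toeplitz

    def deconvolve_tac(c_input, c_organ, dt, lam=0.1):
        """Solve c_organ = (c_input * h) * dt for the transfer function h."""
        c_input = np.asarray(c_input, float)
        c_organ = np.asarray(c_organ, float)
        n = len(c_input)
        A = dt * toeplitz(c_input, np.zeros(n))     # lower-triangular convolution matrix
        L = np.diff(np.eye(n), n=2, axis=0)         # second-difference smoothness operator
        lhs = A.T @ A + lam ** 2 * (L.T @ L)
        rhs = A.T @ c_organ
        return np.linalg.solve(lhs, rhs)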

Szabó, Z; Torsello, G; Reifenrath, C; Porschen, R; Vosberg, H

1988-10-01

109

Scalable analysis tools for sensitivity analysis and UQ (3160) results.  

SciTech Connect

The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

2009-09-01

110

Knowledge base navigator facilitating regional analysis inter-tool communication.  

SciTech Connect

To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).

Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

2004-08-01

111

KINEMATIC ANALYSIS OF A NEW PARALLEL MACHINE TOOL: THE ORTHOGLIDE  

E-print Network

P. Wenger and D. Chablat. ... singularities or self-collisions. Parallel kinematic machine tools attract the interest of more and more researchers. Despite this, it is worth noting that many users of machine tools are still not convinced by the potential

Paris-Sud XI, Université de

112

High-resolution imaging by multiple-image deconvolution  

NASA Astrophysics Data System (ADS)

Image deconvolution is a powerful tool for improving the quality of images corrupted by blurring and noise. However, in some cases, the imaging system is affected by anisotropic resolution, i.e., the resolution depends on the direction in the imaging plane or volume. Such a distortion cannot be corrected by image deconvolution. One example, from astronomy, is the Large Binocular Telescope (LBT) under construction on the top of Mount Graham (Arizona). A second example, from microscopy, is the confocal microscope. In both cases, the situation can be improved if different images of the same target can be detected by rotating the instrument or by rotating the target. Then the problem arises of obtaining a unique high-resolution image from different images taken at different orientation angles. Such a problem is called multiple-image deconvolution. In this paper, after a brief illustration of the two examples mentioned above, the problem of multiple-image deconvolution is formulated and preliminarily investigated in a continuous setting (all directions are available), showing that, while resolution is anisotropic in the multiple images, it becomes isotropic in the reconstructed image. Next, methods and algorithms for the solution of the problem are presented and their accuracy illustrated by means of the results of a few numerical experiments. Finally, the possibility of a further improvement of resolution by means of super-resolving methods is briefly discussed and demonstrated.
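
A standard way to merge several images of one object observed with different point-spread functions is a multi-frame Richardson-Lucy iteration, sketched below for orientation; this is a textbook scheme rather than the authors' specific method, and it assumes all PSFs are centered, normalized, and of the same size as the images.

    import numpy as np

    def _otf(psf, shape):
        """Optical transfer function of a centered, normalized PSF."""
        return np.fft.fft2(np.fft.ifftshift(psf), s=shape)

    def multi_image_rl(images, psfs, n_iter=50, eps=1e-12):
        """Multi-frame Richardson-Lucy: one object estimate, several blurred views."""
        shape = images[0].shape
        x = np.full(shape, float(np.mean(images[0])))
        otfs = [_otf(h, shape) for h in psfs]
        for _ in range(n_iter):
            update = np.zeros(shape)
            for y, H in zip(images, otfs):
                blurred = np.real(np.fft.ifft2(np.fft.fft2(x) * H))
                ratio = y / np.maximum(blurred, eps)
                update += np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(H)))  # H^T ratio
            x *= update / len(images)
        return x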

Bertero, M.; Boccacci, P.; Desiderà, G.; Vicidomini, G.

2006-10-01

113

Dynamic contrast-enhanced CT of head and neck tumors: perfusion measurements using a distributed-parameter tracer kinetic model. Initial results and comparison with deconvolution-based analysis.  

PubMed

The objective of this work was to evaluate the feasibility of a two-compartment distributed-parameter (DP) tracer kinetic model to generate functional images of several physiologic parameters from dynamic contrast-enhanced CT data obtained of patients with extracranial head and neck tumors and to compare the DP functional images to those obtained by deconvolution-based DCE-CT data analysis. We performed post-processing of DCE-CT studies, obtained from 15 patients with benign and malignant head and neck cancer. We introduced a DP model of the impulse residue function for a capillary-tissue exchange unit, which accounts for the processes of convective transport and capillary-tissue exchange. The calculated parametric maps represented blood flow (F), intravascular blood volume (v(1)), extravascular extracellular blood volume (v(2)), vascular transit time (t(1)), permeability-surface area product (PS), transfer ratios k(12) and k(21), and the fraction of extracted tracer (E). Based on the same regions of interest (ROI) analysis, we calculated the tumor blood flow (BF), blood volume (BV) and mean transit time (MTT) by using a modified deconvolution-based analysis taking into account the extravasation of the contrast agent for PS imaging. We compared the corresponding values by using Bland-Altman plot analysis. We outlined 73 ROIs including tumor sites, lymph nodes and normal tissue. The Bland-Altman plot analysis revealed that the two methods showed an accepted degree of agreement for blood flow, and, thus, can be used interchangeably for measuring this parameter. Slightly worse agreement was observed between v(1) in the DP model and BV but even here the two tracer kinetic analyses can be used interchangeably. Under consideration of whether both techniques may be used interchangeably was the case of t(1) and MTT, as well as for measurements of the PS values. The application of the proposed DP model is feasible in the clinical routine and it can be used interchangeably for measuring blood flow and vascular volume with the commercially available reference standard of the deconvolution-based approach. The lack of substantial agreement between the measurements of vascular transit time and permeability-surface area product may be attributed to the different tracer kinetic principles employed by both models and the detailed capillary tissue exchange physiological modeling of the DP technique. PMID:17921579
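
The Bland-Altman comparison used above reduces to a small calculation over paired ROI measurements; a minimal sketch follows (the 1.96-standard-deviation limits of agreement are the conventional choice, assumed here rather than taken from the paper).

    import numpy as np

    def bland_altman(a, b):
        """Bland-Altman statistics for paired measurements, e.g. DP-model blood flow
        versus deconvolution-based BF over the same ROIs."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        mean = 0.5 * (a + b)                            # x-axis of the plot
        diff = a - b                                    # y-axis of the plot
        bias = diff.mean()
        sd = diff.std(ddof=1)
        limits = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
        return mean, diff, bias, limits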

Bisdas, Sotirios; Konstantinou, George N; Lee, Puor Sherng; Thng, Choon Hua; Wagenblast, Jens; Baghi, Mehran; Koh, Tong San

2007-10-21

114

Dynamic contrast-enhanced CT of head and neck tumors: perfusion measurements using a distributed-parameter tracer kinetic model. Initial results and comparison with deconvolution-based analysis  

NASA Astrophysics Data System (ADS)

The objective of this work was to evaluate the feasibility of a two-compartment distributed-parameter (DP) tracer kinetic model to generate functional images of several physiologic parameters from dynamic contrast-enhanced CT data obtained of patients with extracranial head and neck tumors and to compare the DP functional images to those obtained by deconvolution-based DCE-CT data analysis. We performed post-processing of DCE-CT studies, obtained from 15 patients with benign and malignant head and neck cancer. We introduced a DP model of the impulse residue function for a capillary-tissue exchange unit, which accounts for the processes of convective transport and capillary-tissue exchange. The calculated parametric maps represented blood flow (F), intravascular blood volume (v1), extravascular extracellular blood volume (v2), vascular transit time (t1), permeability-surface area product (PS), transfer ratios k12 and k21, and the fraction of extracted tracer (E). Based on the same regions of interest (ROI) analysis, we calculated the tumor blood flow (BF), blood volume (BV) and mean transit time (MTT) by using a modified deconvolution-based analysis taking into account the extravasation of the contrast agent for PS imaging. We compared the corresponding values by using Bland-Altman plot analysis. We outlined 73 ROIs including tumor sites, lymph nodes and normal tissue. The Bland-Altman plot analysis revealed that the two methods showed an accepted degree of agreement for blood flow, and, thus, can be used interchangeably for measuring this parameter. Slightly worse agreement was observed between v1 in the DP model and BV but even here the two tracer kinetic analyses can be used interchangeably. Under consideration of whether both techniques may be used interchangeably was the case of t1 and MTT, as well as for measurements of the PS values. The application of the proposed DP model is feasible in the clinical routine and it can be used interchangeably for measuring blood flow and vascular volume with the commercially available reference standard of the deconvolution-based approach. The lack of substantial agreement between the measurements of vascular transit time and permeability-surface area product may be attributed to the different tracer kinetic principles employed by both models and the detailed capillary tissue exchange physiological modeling of the DP technique.

Bisdas, Sotirios; Konstantinou, George N.; Sherng Lee, Puor; Thng, Choon Hua; Wagenblast, Jens; Baghi, Mehran; San Koh, Tong

2007-10-01

115

A Multidimensional Analysis Tool for Visualizing Online Interactions  

ERIC Educational Resources Information Center

This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

Kim, Minjeong; Lee, Eunchul

2012-01-01

116

A Schema Analysis and Reconciliation Tool Environment for Heterogeneous Databases  

Microsoft Academic Search

To support the development of uniform query interfaces over distributed and heterogeneous databases, tools for the analysis and reconciliation of database conceptual schemas are required. The paper presents the ARTEMIS tool environment developed to support the analyst in the process of analyzing and reconciling sets of heterogeneous data schemas. Schema analysis in ARTEMIS is performed according to the concept of

Silvana Castano; Valeria De Antonellis

1999-01-01

117

Tools for Knowledge Analysis, Synthesis, and Sharing  

NASA Astrophysics Data System (ADS)

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

Medland, Michael B.

2007-04-01

118

An Integrated Tool for System Analysis of Sample Return Vehicles  

NASA Technical Reports Server (NTRS)

The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

2012-01-01

119

Model analysis tools in the Virtual Model Repository (VMR)  

NASA Astrophysics Data System (ADS)

The Virtual Model Repository (VMR) provides scientific analysis tools for a wide variety of numerical models of the Earth's magnetosphere. Data discovery, visualization tools and data/model comparisons are provided in a consistent and intuitive format. A large collection of numerical model runs are available to analyze, including the large Earth magnetosphere event run library at the CCMC and many runs from the University of Michigan. Relevant data useful for data/model comparisons is found using various APIs and included in many of the visualization tools. Recent additions to the VMR include a comprehensive suite of tools for analysis of the Global Ionosphere Thermosphere Model (GITM).

De Zeeuw, D.; Ridley, A. J.

2013-12-01

120

Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool  

NASA Technical Reports Server (NTRS)

This software user manual describes the implementation and use the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual

Maul, William A.; Fulton, Christopher E.

2011-01-01

121

Blind deconvolution of a noisy degraded image.  

PubMed

We develop a unified algorithm for performing blind deconvolution of a noisy degraded image. By incorporating a low-pass filter into the asymmetric multiplicative iterative algorithm and extending it to multiframe blind deconvolution, this algorithm accomplishes the blind deconvolution and noise removal concurrently. We report numerical experiments applying the algorithm to the restoration of short-exposure, atmospheric-turbulence-degraded images. These experiments demonstrate that the unified algorithm achieves both good blind deconvolution performance and high-resolution image restoration. PMID:19381188

Zhang, Jianlin; Zhang, Qiheng; He, Guangming

2009-04-20

122

Tools for Knowledge Analysis, Synthesis, and Sharing  

ERIC Educational Resources Information Center

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

Medland, Michael B.

2007-01-01

123

FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)  

NASA Technical Reports Server (NTRS)

The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A
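
The transitive-closure preprocessing mentioned above can be pictured as a plain reachability computation over the failure digraph; the sketch below is illustrative only and does not model FEAT's phantom-bridge gates or its bi-directional queries.

    def transitive_closure(edges, nodes):
        """Floyd-Warshall-style closure of a failure-propagation digraph.

        edges: set of (source, target) pairs. After closure, "which events can
        this failure reach?" becomes a simple set lookup."""
        reach = set(edges)
        for k in nodes:
            for i in nodes:
                if (i, k) in reach:
                    for j in nodes:
                        if (k, j) in reach:
                            reach.add((i, j))
        return reach

    # Example: a -> b -> c implies a failure at a can propagate to c.
    closure = transitive_closure({("a", "b"), ("b", "c")}, ["a", "b", "c"])
    assert ("a", "c") in closure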

Pack, G.

1994-01-01

124

An alternating minimization method for blind deconvolution from Poisson data  

NASA Astrophysics Data System (ADS)

Blind deconvolution is a particularly challenging inverse problem since information on both the desired target and the acquisition system have to be inferred from the measured data. When the collected data are affected by Poisson noise, this problem is typically addressed by the minimization of the Kullback-Leibler divergence, in which the unknowns are sought in particular feasible sets depending on the a priori information provided by the specific application. If these sets are separated, then the resulting constrained minimization problem can be addressed with an inexact alternating strategy. In this paper we apply this optimization tool to the problem of reconstructing astronomical images from adaptive optics systems, and we show that the proposed approach succeeds in providing very good results in the blind deconvolution of nondense stellar clusters.
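
Stated in generic notation (not necessarily the authors' exact formulation), the minimization referred to here is the constrained minimization of the generalized Kullback-Leibler divergence between the Poisson data y and the blurred model h * x + b:

    \min_{x \in X,\; h \in H} \; D_{\mathrm{KL}}\big(y \,\|\, h \ast x + b\big),
    \qquad
    D_{\mathrm{KL}}(y \,\|\, z) \;=\; \sum_{i} \Big( y_i \ln \frac{y_i}{z_i} + z_i - y_i \Big),

where y is the measured image, b a known background, x the object, h the point-spread function, and X, H the feasible sets encoding the a priori information. The alternating strategy keeps h fixed while taking (inexact) minimization steps in x over X, then fixes x and steps in h over H, repeating until a stopping rule is met.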

Prato, Marco; La Camera, Andrea; Bonettini, Silvia

2014-10-01

125

Computer models and analysis tools for neural microcircuits  

Microsoft Academic Search

This chapter surveys web resources regarding computer models and analysis tools for neural microcircuits. In particular, it describes the features of a new website (www.lsm.tugraz.at) that facilitates the creation of computer models for cortical neural microcircuits of various sizes and levels of detail, as well as tools for evaluating the computational power of these models in a Matlab environment.

Thomas Natschläger; Henry Markram

2002-01-01

126

Tools and techniques for failure analysis and qualification of MEMS.  

SciTech Connect

Many of the tools and techniques used to evaluate and characterize ICs can be applied to MEMS technology. In this paper we discuss various tools and techniques used to provide structural, chemical, and electrical analysis and how these data aid in qualifying MEMS technologies.

Walraven, Jeremy Allen

2003-07-01

127

Risø-R-1359(EN) Fractography analysis of tool samples  

E-print Network

Kristian Vinter Dahl. Tool samples used for industrial cold forging have been investigated using light optical microscopy and scanning electron microscopy. ... of material occurs as a crack formation at a notch inside of the tool. Generally the cold forging dies

128

SIMPLE: a universal tool box for event trace analysis  

Microsoft Academic Search

The event trace analysis system SIMPLE allows the evaluation of arbitrarily formatted event traces. SIMPLE is designed as a software package which comprises independent tools that are all based on a new kind of event trace access: the trace format is described in a trace description language (TDL) and evaluation tools access the event trace through a standardized problem-oriented event

P. Dauphin; R. Hofmann; F. Lemmen; B. Mohr

1996-01-01

129

Generalized Aliasing as a Basis for Program Analysis Tools  

E-print Network

This dissertation describes the design of a system, Ajax, that addresses this problem by using generalized aliasing. To enable the construction of many tools, Ajax imposes a clean separation between analysis engines and the tools that use them to, for example, search for accesses to objects and build object models. To support these tools, Ajax includes a novel

130

Quasar: A New Tool for Concurrent Ada Programs Analysis  

E-print Network

We present a new tool, Quasar, which is based on ASIS and which fully uses the concept of patterns. We demonstrate the usefulness of Quasar by analyzing several variations of a non-trivial concurrent program.

Evangelista, Sami

131

HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL  

EPA Science Inventory

An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

132

BRFSS: Prevalence Data and Data Analysis Tools  

NSDL National Science Digital Library

BRFSS is the nation's premier system of health-related telephone surveys that collect state data about U.S. residents regarding their health-related risk behaviors, chronic health conditions, and use of preventive services. BRFSS collects data in all 50 states as well as the District of Columbia and three U.S. territories. BRFSS completes more than 400,000 adult interviews each year, making it the largest continuously conducted health survey system in the world. These tools allow the user to perform various analyses and display the data in different ways.

Center for Disease Control

133

DERMAL ABSORPTION OF PESTICIDES CALCULATED BY DECONVOLUTION  

EPA Science Inventory

Using published human data on skin-to-urine and blood-to-urine transfer of 12 pesticides and herbicides, the skin-to-blood transfer rates for each compound were estimated by two numerical deconvolution techniques. Regular constrained deconvolution produced an estimated upper limi...

134

Fully Parallel MHD Stability Analysis Tool  

NASA Astrophysics Data System (ADS)

A feasibility study of fully parallelizing the plasma stability code MARS is presented. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and it is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both fluid and kinetic plasma models, already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iteration algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Two approaches to parallelizing the solution of the eigenvalue problem are evaluated: 1) repeat steps of the present MARS algorithm using parallel libraries and procedures; 2) solve linear block-diagonal sets of equations, formulated in the inverse iteration algorithm in MARS, by parallel libraries and procedures. The results of these studies will be reported.

Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

2012-10-01

135

TERPRED: A Dynamic Structural Data Analysis Tool  

PubMed Central

Computational protein structure prediction mainly involves the main-chain prediction and the side-chain confirmation determination. In this research, we developed a new structural bioinformatics tool, TERPRED for generating dynamic protein side-chain rotamer libraries. Compared with current various rotamer sampling methods, our work is unique in that it provides a method to generate a rotamer library dynamically based on small sequence fragments of a target protein. The Rotamer Generator provides a means for existing side-chain sampling methods using static pre-existing rotamer libraries, to sample from dynamic target-dependent libraries. Also, existing side-chain packing algorithms that require large rotamer libraries for optimal performance, could possibly utilize smaller, target-relevant libraries for improved speed. PMID:25302339

Walker, Karl; Cramer, Carole L.; Jennings, Steven F.; Huang, Xiuzhen

2012-01-01

136

JAVA based LCD Reconstruction and Analysis Tools  

SciTech Connect

We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

Bower, G.

2004-10-11

137

Pervaporation: a useful tool for speciation analysis  

NASA Astrophysics Data System (ADS)

The application of pervaporation as both an auxiliary and a fundamental device for speciation analysis in liquid and solid samples is discussed. Examples of various determinations, including the coupling of the technique to both a gas chromatograph and flow-injection configurations, applied mostly to environmental and biological samples, are presented, giving clear evidence of the double role of the pervaporation process.

Luque de Castro, M. D.; Papaefstathiou, I.

1998-02-01

138

A new tool for contamination analysis  

SciTech Connect

The Contamination Analysis Unit (CAU) is a sensing system that facilitates a new approach to industrial cleaning. Through use of portable mass spectrometry and various desorption techniques, the CAU provides in-process, near-real-time measurement of surface cleanliness levels. It can be of help in significantly reducing hazardous waste generation and toxic air emissions from manufacturing operations.

Meltzer, M.; Gregg, H.

1996-06-01

139

Nonlinear Robustness Analysis Tools for Flight Control Law Validation & Verification  

NASA Astrophysics Data System (ADS)

Loss of control in flight is among the highest aviation accident categories for both the number of accidents and the number of fatalities. The flight controls community is seeking improved validation tools for safety-critical flight control systems. Current validation tools rely heavily on linear analysis, which ignores the inherently nonlinear nature of the aircraft dynamics and flight control system. Specifically, current practices in validating the flight control system involve gridding the flight envelope and checking various criteria based on linear analysis to ensure safety of the flight control system. The analysis and certification methods currently applied assume the aircraft's dynamics are linear. In reality, the behavior of the aircraft is always nonlinear due to its aerodynamic characteristics and physical limitations imposed by the actuators. This thesis develops nonlinear analysis tools capable of certifying flight control laws for nonlinear aircraft dynamics. The proposed analysis tools can handle both the aerodynamic nonlinearities and the physical limitations imposed by the actuators in the aircraft's dynamics. This proposed validation technique will extend and enrich the predictive capability of existing flight control law validation methods to analyze nonlinearities. The objective of this thesis is to provide the flight control community with an advanced set of analysis tools to reduce aviation fatality and accident rates.

Chakraborty, Abhijit

140

General Purpose Textual Sentiment Analysis and Emotion Detection Tools  

E-print Network

Alexandre Denis, Samuel Cruz-Lara, Nadia Bellalem (LORIA). Textual sentiment analysis and emotion detection consists in retrieving the sentiment or emotion carried by a text or document. This task can be useful

Paris-Sud XI, Université de

141

Pin: building customized program analysis tools with dynamic instrumentation  

Microsoft Academic Search

Robust and powerful software instrumentation tools are essential for program analysis tasks such as profiling, performance evaluation, and bug detection. To meet this need, we have developed a new instrumentation system called Pin to instrument executables while they are running. For efficiency, Pin uses several techniques, including inlining, register re-allocation, liveness analysis, and instruction scheduling to optimize instrumentation. This fully automated

Chi-Keung Luk; Robert S. Cohn; Robert Muth; Harish Patil; Artur Klauser; P. Geoffrey Lowney; Steven Wallace; Vijay Janapa Reddi; Kim M. Hazelwood

2005-01-01

142

GEOGRAPHIC ANALYSIS TOOL FOR HEALTH AND ENVIRONMENTAL RESEARCH (GATHER)  

EPA Science Inventory

GATHER, Geographic Analysis Tool for Health and Environmental Research, is an online spatial data access system that provides members of the public health community and general public access to spatial data that is pertinent to the analysis and exploration of public health issues...

143

Recurrence time statistics: Versatile tools for genomic DNA sequence analysis  

E-print Network

... enables us to carry out sequence analysis on the whole genomic scale by a PC. One of the more important structures in a DNA sequence is repeat-related. Often

Gao, Jianbo

144

Pin: building customized program analysis tools with dynamic instrumentation  

Microsoft Academic Search

Robust and powerful software instrumentation tools are essential for program analysis tasks such as profiling, performance evaluation, and bug detection. To meet this need, we have developed a new instrumentation system called Pin. Our goals are to provide easy-to-use, portable, transparent, and efficient instrumentation. Instrumentation tools (called Pintools) are written in C/C++ using Pin's rich API. Pin follows the model

Chi-Keung Luk; Robert Cohn; Robert Muth; Harish Patil; Artur Klauser; Geoff Lowney; Steven Wallace; Vijay Janapa Reddi; Kim Hazelwood

2005-01-01

145

Aristotle: a system for development of program analysis based tools  

Microsoft Academic Search

Aristotle provides program analysis information, and supports the development of software engineering tools. Aristotle's front end consists of parsers that gather control flow, local dataflow and symbol table information for procedural language programs. We implemented a parser for C by incorporating analysis routines into the GNU C parser; a C++ parser is being implemented using similar techniques. Aristotle tools use the data provided by the parsers to perform a

Mary Jean Harrold; Loren Larsen; John Lloyd; David Nedved; Melanie Page; Gregg Rothermel; Manvinder Singh; Michael Smith

1995-01-01

146

Radar Interferometry Time Series Analysis and Tools  

NASA Astrophysics Data System (ADS)

We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including localized subsidence bowls with constant velocity and seasonal deformation fluctuations. We will present results and insights from the application of these time series analysis techniques to several land subsidence study sites with varying deformation and environmental conditions, e.g., arid Phoenix and coastal Houston-Galveston metropolitan areas and rural Texas sink holes. We consistently find that the time invested in implementing, applying and comparing multiple InSAR time series approaches for a given study site is rewarded with a deeper understanding of the techniques and deformation phenomena. To this end, and with support from NSF, we are preparing a first-version of an InSAR post-processing toolkit to be released to the InSAR science community. These studies form a baseline of results to compare against the higher spatial and temporal sampling anticipated from TerraSAR-X as well as the trade-off between spatial coverage and resolution when relying on ScanSAR interferometry.
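
The linear inversion-singular value decomposition step mentioned above can be sketched per pixel as a pseudo-inverse solve over the interferogram network. The code below is a generic small-baseline-style illustration, not the toolkit under preparation; parameterizing the unknowns as phase increments between consecutive acquisition dates is an assumption.

    import numpy as np

    def sbas_invert(ifg_phase, date_pairs, n_dates, rcond=1e-10):
        """Invert unwrapped interferogram phases for one pixel via an SVD pseudo-inverse.

        ifg_phase : (M,) phase of M interferograms
        date_pairs: list of (i, j) acquisition-date indices with i < j
        Returns cumulative phase at each date relative to the first acquisition."""
        A = np.zeros((len(date_pairs), n_dates - 1))
        for row, (i, j) in enumerate(date_pairs):
            A[row, i:j] = 1.0                      # each interferogram sums increments i..j-1
        increments = np.linalg.pinv(A, rcond=rcond) @ ifg_phase   # minimum-norm solution
        return np.concatenate(([0.0], np.cumsum(increments)))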

Buckley, S. M.

2006-12-01

147

RCytoscape: tools for exploratory network analysis  

PubMed Central

Background Biomolecular pathways and networks are dynamic and complex, and the perturbations to them which cause disease are often multiple, heterogeneous and contingent. Pathway and network visualizations, rendered on a computer or published on paper, however, tend to be static, lacking in detail, and ill-equipped to explore the variety and quantities of data available today, and the complex causes we seek to understand. Results RCytoscape integrates R (an open-ended programming environment rich in statistical power and data-handling facilities) and Cytoscape (powerful network visualization and analysis software). RCytoscape extends Cytoscape's functionality beyond what is possible with the Cytoscape graphical user interface. To illustrate the power of RCytoscape, a portion of the Glioblastoma multiforme (GBM) data set from the Cancer Genome Atlas (TCGA) is examined. Network visualization reveals previously unreported patterns in the data suggesting heterogeneous signaling mechanisms active in GBM Proneural tumors, with possible clinical relevance. Conclusions Progress in bioinformatics and computational biology depends upon exploratory and confirmatory data analysis, upon inference, and upon modeling. These activities will eventually permit the prediction and control of complex biological systems. Network visualizations -- molecular maps -- created from an open-ended programming environment rich in statistical power and data-handling facilities, such as RCytoscape, will play an essential role in this progression. PMID:23837656

2013-01-01

148

A Semi-Automated Functional Test Data Analysis Tool  

SciTech Connect

The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

Xu, Peng; Haves, Philip; Kim, Moosung

2005-05-01

149

GATB: Genome Assembly & Analysis Tool Box  

PubMed Central

Motivation: Efficient and fast next-generation sequencing (NGS) algorithms are essential to analyze the terabytes of data generated by the NGS machines. A serious bottleneck can be the design of such algorithms, as they require sophisticated data structures and advanced hardware implementation. Results: We propose an open-source library dedicated to genome assembly and analysis to hasten the process of developing efficient software. The library is based on a recent optimized de-Bruijn graph implementation allowing complex genomes to be processed on desktop computers using fast algorithms with low memory footprints. Availability and implementation: The GATB library is written in C++ and is available at the following Web site http://gatb.inria.fr under the A-GPL license. Contact: lavenier@irisa.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24990603
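
GATB itself is a C++ library; the toy Python fragment below only illustrates the de Bruijn graph idea it is built around and does not reflect the GATB API or its memory-efficient representation.

    from collections import defaultdict

    def de_bruijn_graph(reads, k=4):
        """Toy de Bruijn graph: nodes are (k-1)-mers, edges come from observed k-mers."""
        graph = defaultdict(set)
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                graph[kmer[:-1]].add(kmer[1:])
        return graph

    # Overlapping reads yield a small connected graph of 3-mers.
    g = de_bruijn_graph(["ACGTACGT", "CGTACGTA"], k=4)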

Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique

2014-01-01

150

The role and selection of the filter function in Fourier self-deconvolution revisited.  

PubMed

Overlapped bands often appear in applications of infrared spectroscopy, for instance in the analysis of the amide I band of proteins. Fourier self-deconvolution (FSD) is a popular band-narrowing mathematical method, allowing for the resolution of overlapped bands. The filter function used in FSD plays a significant role in the factor by which the deconvolved bands are actually narrowed (the effective narrowing), as well as in the final signal-to-noise degradation induced by FSD. Moreover, the filter function determines, to a good extent, the band-shape of the deconvolved bands. For instance, the intensity of the harmful side-lobule oscillations that appear in over-deconvolution depends strongly on the filter function used. In the present paper we characterized the resulting band shape, effective narrowing, and signal-to-noise degradation in infra-, self-, and over-deconvolution conditions for several filter functions: Triangle, Bessel, Hanning, Gaussian, Sinc2, and Triangle2. We also introduced and characterized new filters based on the modification of the Blackman filter. Our conclusion is that the Bessel filter (in infra-, self-, and mild over-deconvolution), the newly introduced BL3 filter (in self- and mild/moderate over-deconvolution), and the Gaussian filter (in moderate/strong over-deconvolution) are the most suitable filter functions to be used in FSD. PMID:19589217
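
For readers unfamiliar with FSD, the sketch below shows the basic pipeline: inverse transform of the spectrum, exponential weighting against an assumed Lorentzian half-width, apodization by a filter function, and forward transform. It is a simplified illustration on an evenly sampled spectrum; the filters examined in the paper (Bessel, BL3, and so on) would replace the simple triangle or Gaussian choices used here, and the narrowing factor is an assumed input.

    import numpy as np

    def fourier_self_deconvolution(nu, spectrum, gamma, narrowing=2.0, filter_kind="triangle"):
        """Minimal FSD sketch assuming Lorentzian-like bands of half-width gamma (HWHM)."""
        d_nu = nu[1] - nu[0]
        x = np.fft.fftfreq(len(nu), d=d_nu)        # conjugate ("retardation") axis
        interferogram = np.fft.ifft(spectrum)
        # Partially cancel the Lorentzian decay exp(-2*pi*gamma*|x|); the residual
        # decay corresponds to bands narrowed by the requested factor.
        gain = np.exp(2.0 * np.pi * gamma * (1.0 - 1.0 / narrowing) * np.abs(x))
        x_max = np.abs(x).max()
        if filter_kind == "triangle":
            w = np.clip(1.0 - np.abs(x) / x_max, 0.0, 1.0)
        else:                                      # crude Gaussian-type filter
            w = np.exp(-(np.abs(x) / (0.5 * x_max)) ** 2)
        return np.real(np.fft.fft(interferogram * gain * w))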

Lórenz-Fonfría, Víctor A; Padrós, Esteve

2009-07-01

151

Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions  

NASA Technical Reports Server (NTRS)

Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

2002-01-01

152

Deconvolution of dynamic mechanical networks  

PubMed Central

Time-resolved single-molecule biophysical experiments yield data that contain a wealth of dynamic information, in addition to the equilibrium distributions derived from histograms of the time series. In typical force spectroscopic setups the molecule is connected via linkers to a readout device, forming a mechanically coupled dynamic network. Deconvolution of equilibrium distributions, filtering out the influence of the linkers, is a straightforward and common practice. We have developed an analogous dynamic deconvolution theory for the more challenging task of extracting kinetic properties of individual components in networks of arbitrary complexity and topology. Our method determines the intrinsic linear response functions of a given object in the network, describing the power spectrum of conformational fluctuations. The practicality of our approach is demonstrated for the particular case of a protein linked via DNA handles to two optically trapped beads at constant stretching force, which we mimic through Brownian dynamics simulations. Each well in the protein free energy landscape (corresponding to folded, unfolded, or possibly intermediate states) will have its own characteristic equilibrium fluctuations. The associated linear response function is rich in physical content, because it depends both on the shape of the well and its diffusivity—a measure of the internal friction arising from such processes as the transient breaking and reformation of bonds in the protein structure. Starting from the autocorrelation functions of the equilibrium bead fluctuations measured in this force clamp setup, we show how an experimentalist can accurately extract the state-dependent protein diffusivity using a straightforward two-step procedure. PMID:21118989

Hinczewski, Michael; von Hansen, Yann; Netz, Roland R.

2010-01-01

153

A computational tool for quantitative analysis of vascular networks.  

PubMed

Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a light-weight, user-friendly software package, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points/unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge. PMID:22110636
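As a rough illustration of the "branching index" (branch points per unit area) mentioned above, the sketch below counts skeleton pixels with three or more neighbours in a binary vessel mask. It assumes scikit-image and scipy are available; it is not AngioTool's actual implementation, and the use of the whole image area as the normalizing area is a simplification.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize

def branching_index(vessel_mask, pixel_area=1.0):
    """Illustrative branching-index calculation (branch points / unit area)
    from a binary vessel mask; not AngioTool's actual algorithm."""
    skeleton = skeletonize(vessel_mask.astype(bool))
    # Count the 8-connected neighbours of every skeleton pixel.
    neighbour_kernel = np.array([[1, 1, 1],
                                 [1, 0, 1],
                                 [1, 1, 1]])
    neighbours = ndimage.convolve(skeleton.astype(int), neighbour_kernel,
                                  mode="constant", cval=0)
    # A skeleton pixel with three or more neighbours is treated as a branch point.
    branch_points = skeleton & (neighbours >= 3)
    total_area = vessel_mask.size * pixel_area
    return branch_points.sum() / total_area

# Toy mask: a cross-shaped "vessel" yields a single branching region.
mask = np.zeros((64, 64), dtype=bool)
mask[30:34, 5:60] = True
mask[5:60, 30:34] = True
print(branching_index(mask, pixel_area=1.0))
```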

Zudaire, Enrique; Gambardella, Laure; Kurcz, Christopher; Vermeren, Sonja

2011-01-01

154

A Computational Tool for Quantitative Analysis of Vascular Networks  

PubMed Central

Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a light-weight, user-friendly software package, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called “branching index” (branch points / unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge. PMID:22110636

Zudaire, Enrique; Gambardella, Laure; Kurcz, Christopher; Vermeren, Sonja

2011-01-01

155

Computational Tools for the Secondary Analysis of Metabolomics Experiments  

PubMed Central

Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites, researchers can achieve a systems-level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret, as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered, whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software tools are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685
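A minimal sketch of the enrichment-analysis step described above, using a one-sided hypergeometric (over-representation) test per pathway. The pathway definitions and metabolite IDs are placeholders, and real tools would add multiple-testing correction.

```python
from scipy.stats import hypergeom

def pathway_enrichment(altered, background, pathways):
    """One-sided hypergeometric (over-representation) test per pathway.
    `altered` and `background` are sets of metabolite IDs; `pathways`
    maps a pathway name to the set of metabolites it contains."""
    altered = set(altered) & set(background)
    results = {}
    for name, members in pathways.items():
        members = set(members) & set(background)
        hits = len(altered & members)
        # P(X >= hits) when drawing len(altered) metabolites from the
        # background, of which len(members) belong to this pathway.
        p = hypergeom.sf(hits - 1, len(background), len(members), len(altered))
        results[name] = (hits, p)
    return results

# Hypothetical example with placeholder pathway definitions.
background = {f"M{i}" for i in range(200)}
altered = {"M1", "M2", "M3", "M4", "M10", "M50"}
pathways = {"TCA cycle": {"M1", "M2", "M3", "M4", "M5", "M6"},
            "Glycolysis": {"M50", "M51", "M52", "M53"}}
for name, (hits, p) in pathway_enrichment(altered, background, pathways).items():
    print(f"{name}: {hits} hits, p = {p:.3g}")
```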

Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

2013-01-01

156

Efficient Bayesian-based multiview deconvolution  

PubMed Central

Light-sheet fluorescence microscopy is able to image large specimens with high resolution by capturing the samples from multiple angles. Multiview deconvolution can substantially improve the resolution and contrast of the images, but its application has been limited owing to the large size of the data sets. Here we present a Bayesian-based derivation of multiview deconvolution that drastically improves the convergence time, and we provide a fast implementation using graphics hardware. PMID:24747812
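The paper's contribution is a Bayesian derivation that converges much faster than baseline iterations, plus a GPU implementation. As context only, the sketch below shows a plain sequential multiview Richardson-Lucy update in numpy/scipy, the kind of iteration such methods accelerate; shapes and PSFs are synthetic and the code is not the authors' algorithm.

```python
import numpy as np
from scipy.signal import fftconvolve

def multiview_rl(views, psfs, iterations=20):
    """Baseline multiview Richardson-Lucy deconvolution (illustrative only).
    `views` and `psfs` are lists of equally shaped 2D numpy arrays."""
    estimate = np.clip(np.mean(views, axis=0), 1e-12, None)
    for _ in range(iterations):
        for view, psf in zip(views, psfs):
            psf_mirror = psf[::-1, ::-1]
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = view / np.clip(blurred, 1e-12, None)
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            estimate = np.clip(estimate, 0.0, None)
    return estimate

def gaussian_psf(shape, sx, sy):
    """Normalized anisotropic Gaussian PSF centred in an array of `shape`."""
    y, x = np.indices(shape) - np.array(shape)[:, None, None] // 2
    psf = np.exp(-(x**2 / (2 * sx**2) + y**2 / (2 * sy**2)))
    return psf / psf.sum()

# Toy example: one object observed through two differently elongated PSFs.
truth = np.zeros((64, 64)); truth[32, 32] = 1.0; truth[20, 40] = 0.5
psf_a, psf_b = gaussian_psf((64, 64), 4, 1), gaussian_psf((64, 64), 1, 4)
views = [fftconvolve(truth, p, mode="same") for p in (psf_a, psf_b)]
restored = multiview_rl(views, [psf_a, psf_b], iterations=30)
```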

Preibisch, Stephan; Amat, Fernando; Stamataki, Evangelia; Sarov, Mihail; Singer, Robert H; Myers, Eugene; Tomancak, Pavel

2014-01-01

157

Mass++: A Visualization and Analysis Tool for Mass Spectrometry.  

PubMed

We have developed Mass++, a plug-in style visualization and analysis tool for mass spectrometry. Its plug-in style enables users to customize it and to develop original functions. Mass++ has several kinds of plug-ins, including rich viewers and analysis methods for proteomics and metabolomics. Plug-ins for supporting vendors' raw data are currently available; hence, Mass++ can read several data formats. Mass++ is both a desktop tool and a software development platform. Original functions can be developed without editing the Mass++ source code. Here, we present this tool's capability to rapidly analyze MS data and develop functions by providing examples of label-free quantitation and implementing plug-ins or scripts. Mass++ is freely available at http://www.first-ms3d.jp/english/ . PMID:24965016

Tanaka, Satoshi; Fujita, Yuichiro; Parry, Howell E; Yoshizawa, Akiyasu C; Morimoto, Kentaro; Murase, Masaki; Yamada, Yoshihiro; Yao, Jingwen; Utsunomiya, Shin-Ichi; Kajihara, Shigeki; Fukuda, Mitsuru; Ikawa, Masayuki; Tabata, Tsuyoshi; Takahashi, Kentaro; Aoshima, Ken; Nihei, Yoshito; Nishioka, Takaaki; Oda, Yoshiya; Tanaka, Koichi

2014-07-01

158

Deconvolution of immittance data: some old and new methods  

SciTech Connect

The background and history of various deconvolution approaches are briefly summarized; different methods are compared; and available computational resources are described. These underutilized data analysis methods are valuable in both electrochemistry and immittance spectroscopy areas, and freely available computer programs are cited that provide an automatic test of the appropriateness of Kronig-Kramers transforms, a powerful nonlinear-least-squares inversion method, and a new Monte-Carlo inversion method. The important distinction, usually ignored, between discrete-point distributions and continuous ones is emphasized, and both recent parametric and non-parametric deconvolution/inversion procedures for frequency-response data are discussed and compared. Information missing in a recent parametric measurement-model deconvolution approach is pointed out and remedied, and its priority evaluated. Comparisons are presented between the standard parametric least squares inversion method and a new non-parametric Monte Carlo one that allows complicated composite distributions of relaxation times (DRT) to be accurately estimated without the uncertainty present with regularization methods. Also, detailed Monte-Carlo DRT estimates for the supercooled liquid 0.4Ca(NO3)2·0.6KNO3 (CKN) at 350 K are compared with appropriate frequency-response-model fit results. These composite models were derived from stretched-exponential Kohlrausch temporal response with the inclusion of either of two different series electrode-polarization functions.
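For orientation, the sketch below shows a conventional Tikhonov-regularized, non-negative least-squares estimate of a distribution of relaxation times (DRT) from impedance data. It represents the standard regularized inversion the abstract contrasts with the Monte Carlo approach, not the paper's method, and the regularization strength `lam` is an illustrative choice.

```python
import numpy as np
from scipy.optimize import nnls

def drt_tikhonov(freq, Z, n_tau=60, lam=1e-2):
    """Illustrative Tikhonov-regularized DRT estimate.
    Model: Z(w) ~ R_inf + sum_k g_k / (1 + j*w*tau_k), with g_k >= 0."""
    omega = 2.0 * np.pi * np.asarray(freq)
    tau = np.logspace(np.log10(1.0 / omega.max()),
                      np.log10(1.0 / omega.min()), n_tau)
    kernel = 1.0 / (1.0 + 1j * omega[:, None] * tau[None, :])
    # Unknowns: [g_1 ... g_n_tau, R_inf]; fit real and imaginary parts jointly.
    A = np.hstack([kernel, np.ones((len(omega), 1))])
    A_ri = np.vstack([A.real, A.imag])
    b_ri = np.concatenate([Z.real, Z.imag])
    # Tikhonov regularization: append lam * I rows (R_inf is not regularized).
    reg = np.hstack([lam * np.eye(n_tau), np.zeros((n_tau, 1))])
    A_aug = np.vstack([A_ri, reg])
    b_aug = np.concatenate([b_ri, np.zeros(n_tau)])
    coeffs, _ = nnls(A_aug, b_aug)
    return tau, coeffs[:n_tau], coeffs[n_tau]

# Synthetic test: two discrete relaxations plus a series resistance.
freq = np.logspace(-2, 5, 80)
w = 2 * np.pi * freq
Z = 10.0 + 50.0 / (1 + 1j * w * 1e-1) + 20.0 / (1 + 1j * w * 1e-3)
tau, g, R_inf = drt_tikhonov(freq, Z)
```

The choice of `lam` illustrates the regularization uncertainty the abstract refers to: too small and the recovered DRT is noisy, too large and discrete relaxations are smeared into broad peaks.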

Tuncer, Enis [ORNL; Macdonald, Ross J. [University of North Carolina

2007-01-01

159

Separation analysis, a tool for analyzing multigrid algorithms  

NASA Technical Reports Server (NTRS)

The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier-type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers that improve performance. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.

Costiner, Sorin; Taasan, Shlomo

1995-01-01

160

Aqueous humor analysis as a diagnostic tool in toxoplasma uveitis  

Microsoft Academic Search

Analysis of local toxoplasma antibody production to confirm a suspected clinical diagnosis of toxoplasma chorioretinitis is a valuable diagnostic tool. Determination of toxoplasma antibodies in the blood of the patient is of limited use. When blood toxoplasma tests are negative this indicates that toxoplasma as a causative organism in the pathogenesis of uveitis is unlikely. A positive blood test is a

A. Kijlstra; L. Luyendijk; G. S. Baarsma; A. Rothova; C. M. C. Schweitzer; Z. Timmerman; J. de Vries; A. C. Breebaart

1989-01-01

161

Football analysis using spatio-temporal tools Joachim Gudmundsson  

E-print Network

Analysing a football match is without doubt an important task, specifically for analysing the performance of football players and teams. The aim, functionality

Wolle, Thomas

162

An Online Image Analysis Tool for Science Education  

ERIC Educational Resources Information Center

This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

2008-01-01

163

Sparse Component Analysis: a New Tool for Data Mining  

E-print Network

In many practical problems for data mining, the data X under consideration (given as an m × N matrix) ... Data mining techniques can be divided into the following classes [3

Cichocki, Andrzej

164

FUTURE POWER GRID INITIATIVE Market Design Analysis Tool  

E-print Network

Power market design plays a critical role in the outcomes related to power system reliability and market efficiency. However

165

A Performance Analysis Tool for Nokia Mobile Phone Software  

Microsoft Academic Search

Performance problems are often observed in embedded software systems. The reasons for poor performance are frequently not obvious. Bottlenecks can occur in any of the software components along the execution path. Therefore it is important to instrument and monitor the different components contributing to the runtime behavior of an embedded software system. Performance analysis tools can help locate performance bottlenecks

Edu Metz; Raimondas Lencevicius

2003-01-01

166

Analysis Tools for Next Generation Hadron Spectroscopy Experiments  

E-print Network

The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized

167

Selected Tools for Risk Analysis in Logistics Processes  

NASA Astrophysics Data System (ADS)

As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a comprehensive approach to risk management allows for the following: - evaluation of significant risk groups associated with the implementation of logistics processes, - composition of integrated strategies of risk management, - composition of tools for risk analysis in logistics processes.

Kulińska, Ewa

2012-03-01

168

A Tool for Efficient Fault Tree Analysis (extended version)  

E-print Network

like system reliability and availability. This paper presents DFTCalc, a powerful tool for FTA, keeping the underlying state space small. Risk analysis is a key feature in reliability engineering: in order to design and build medical devices, smart grids, and internet shops that meet the required

Vellekoop, Michel

169

An Automated Data Analysis Tool for Livestock Market Data  

ERIC Educational Resources Information Center

This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

Williams, Galen S.; Raper, Kellie Curry

2011-01-01

170

Discovery and New Frontiers Project Budget Analysis Tool  

NASA Technical Reports Server (NTRS)

The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses and compare mission scenarios to the current program budget, and to rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.
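A toy sketch of the roll-up arithmetic described above: spreading a fixed-year mission cost over a development profile, inflating to real-year dollars, and summing across missions. All mission numbers and the inflation rate are hypothetical and the calculation is deliberately simplified relative to the actual tool.

```python
def rollup_program_cost(missions, inflation=0.025, base_year=2011):
    """Hypothetical roll-up of multiple mission cost profiles into total
    real-year program cost per year (a simplified sketch, not the D&NF tool)."""
    totals = {}
    for m in missions:
        for offset, fraction in enumerate(m["yearly_fractions"]):
            year = m["start_year"] + offset
            # Inflate the fixed-year cost slice to real-year dollars for that year.
            real = m["fixed_year_cost"] * fraction * (1 + inflation) ** (year - base_year)
            totals[year] = totals.get(year, 0.0) + real
    return dict(sorted(totals.items()))

# Two hypothetical missions, costs in $M (fixed-year), spread over their phases.
missions = [
    {"fixed_year_cost": 450.0, "start_year": 2012,
     "yearly_fractions": [0.10, 0.30, 0.35, 0.15, 0.10]},
    {"fixed_year_cost": 700.0, "start_year": 2014,
     "yearly_fractions": [0.05, 0.25, 0.40, 0.20, 0.10]},
]
for year, cost in rollup_program_cost(missions).items():
    print(year, round(cost, 1))
```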

Newhouse, Marilyn E.

2011-01-01

171

Tools  

NSDL National Science Digital Library

The goal of the lesson is not for students to learn what the simple machines are, even though this is an underlying theme. Students will approach the lesson in a much more open-minded fashion. They will discuss tools and how they function. This will naturally lead to acknowledgment of how tools make our lives easier. By categorizing everyday items, students will come to understand the natural functions of tools. This base of knowledge will lead into exercises and discussions about how complex machines are a conglomerate of simpler tools and motions, as well as how tools have changed and become more sophisticated throughout history. At the end of the lesson, to reemphasize the importance of tools in human society, students will write a paper in which they imagine a world without a particular tool.

Science NetLinks

2005-06-13

172

Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study  

NASA Technical Reports Server (NTRS)

An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

Flores, Melissa; Malin, Jane T.

2013-01-01

173

A dataflow analysis tool for parallel processing of algorithms  

NASA Technical Reports Server (NTRS)

A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
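One of the performance bounds obtainable from a dataflow graph is the critical-path length, which lower-bounds the schedule length regardless of the number of identical processors. The sketch below computes it for a small hypothetical graph; it is a generic illustration, not the tool described in the abstract.

```python
from functools import lru_cache

def critical_path_length(graph, cost):
    """Longest weighted path through a dataflow DAG; a lower bound on the
    repetitive schedule length no matter how many processors are used.
    `graph` maps node -> list of successor nodes; `cost` maps node -> time."""
    @lru_cache(maxsize=None)
    def longest_from(node):
        successors = graph.get(node, [])
        tail = max((longest_from(s) for s in successors), default=0)
        return cost[node] + tail
    return max(longest_from(n) for n in cost)

# Hypothetical signal-processing dataflow graph (node execution times in ms).
graph = {"read": ["fft", "gain"], "fft": ["mix"], "gain": ["mix"], "mix": ["out"]}
cost = {"read": 1.0, "fft": 4.0, "gain": 2.0, "mix": 3.0, "out": 1.0}
print(critical_path_length(graph, cost))   # 1 + 4 + 3 + 1 = 9 ms
```

The other elementary bound, total work divided by processor count, follows from summing `cost.values()`; together the two bounds bracket the achievable iteration period.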

Jones, Robert L., III

1993-01-01

174

Tool Support for Parametric Analysis of Large Software Simulation Systems  

NASA Technical Reports Server (NTRS)

The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
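As a rough illustration of combining Monte Carlo sampling with n-factor combinatorial variation (here n = 2, i.e. pairwise coverage), the sketch below greedily generates cases until every pair of discrete parameter levels appears at least once and then appends random samples. Parameter names are hypothetical and this is not the tool's actual test-case generator.

```python
import itertools
import random

def pairwise_plus_monte_carlo(levels, n_random=100, seed=0):
    """Generate test cases that (1) cover every pairwise combination of
    discrete parameter levels at least once and (2) add random Monte Carlo
    samples; a simple illustration of 2-factor + MC test generation."""
    rng = random.Random(seed)
    names = sorted(levels)
    cases = []
    # Greedy pairwise coverage: keep drawing random cases, keeping only those
    # that cover at least one previously uncovered (level, level) pair.
    uncovered = {((a, la), (b, lb))
                 for a, b in itertools.combinations(names, 2)
                 for la in levels[a] for lb in levels[b]}
    while uncovered:
        case = {n: rng.choice(levels[n]) for n in names}
        covered = {((a, case[a]), (b, case[b]))
                   for a, b in itertools.combinations(names, 2)}
        if covered & uncovered:
            uncovered -= covered
            cases.append(case)
    # Additional pure Monte Carlo cases.
    cases += [{n: rng.choice(levels[n]) for n in names} for _ in range(n_random)]
    return cases

# Hypothetical simulation parameters with discretized levels.
levels = {"mass_kg": [80, 100, 120], "thrust_N": [0.5, 1.0, 2.0],
          "sensor_noise": ["low", "nominal", "high"]}
cases = pairwise_plus_monte_carlo(levels, n_random=20)
```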

Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

2008-01-01

175

fMRI analysis software tools: an evaluation framework  

NASA Astrophysics Data System (ADS)

Performance comparison of functional Magnetic Resonance Imaging (fMRI) software tools is a very difficult task. In this paper, a framework for comparison of fMRI analysis results obtained with different software packages is proposed. An objective evaluation is possible only after pre-processing steps that normalize input data in a standard domain. Segmentation and registration algorithms are implemented in order to classify voxels as belonging to the brain or not, and to find the non-rigid transformation that best aligns the volume under inspection with a standard one. Through the definitions of intersection and union from fuzzy logic, an index was defined which quantifies the information overlap between Statistical Parametrical Maps (SPMs). Direct comparison between fMRI results can only highlight differences. In order to assess the best result, an index that represents the goodness of the activation detection is required. The transformation of the activation map into a standard domain allows the use of a functional atlas for labeling the active voxels. For each functional area, the Activation Weighted Index (AWI), which identifies the mean activation level of the whole area, was defined. By means of this brief but comprehensive description, it is easy to find a metric for the objective evaluation of fMRI analysis tools. Through the first evaluation method, the situations where the SPMs are inconsistent were identified. The results of the AWI analysis suggest which tool has higher sensitivity and specificity. The proposed method seems to be a valid evaluation tool when applied to an adequate number of patients.
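A minimal sketch of an overlap index built from the fuzzy intersection (voxel-wise minimum) and union (voxel-wise maximum) of two activation maps, as described above. The exact normalization used in the paper may differ, and the maps here are synthetic.

```python
import numpy as np

def fuzzy_overlap_index(spm_a, spm_b):
    """Overlap between two statistical parametric maps treated as fuzzy sets:
    sum of voxel-wise minima over sum of voxel-wise maxima (a fuzzy
    Jaccard-style index). Maps are rescaled to [0, 1] memberships first."""
    def membership(spm):
        spm = np.clip(np.asarray(spm, dtype=float), 0.0, None)
        peak = spm.max()
        return spm / peak if peak > 0 else spm
    a, b = membership(spm_a), membership(spm_b)
    intersection = np.minimum(a, b).sum()
    union = np.maximum(a, b).sum()
    return intersection / union if union > 0 else 0.0

# Two synthetic t-maps with partially overlapping "activations".
rng = np.random.default_rng(1)
map_a = np.zeros((32, 32)); map_a[10:20, 10:20] = rng.uniform(2, 6, (10, 10))
map_b = np.zeros((32, 32)); map_b[14:24, 14:24] = rng.uniform(2, 6, (10, 10))
print(round(fuzzy_overlap_index(map_a, map_b), 3))
```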

Pedoia, Valentina; Colli, Vittoria; Strocchi, Sabina; Vite, Cristina; Binaghi, Elisabetta; Conte, Leopoldo

2011-03-01

176

Rosetta CONSERT operations and data analysis preparation: simulation software tools.  

NASA Astrophysics Data System (ADS)

The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae Lander. The accurate analysis of travel time measurements will deliver unique knowledge of the nucleus interior dielectric properties. The challenging complexity of CONSERT operations requirements, combining both Rosetta and Philae, allows only a limited set of opportunities to acquire data. Thus, we need a fine analysis of the impact of the Rosetta trajectory, Philae position and comet shape on CONSERT measurements, in order to make optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is a key to achieving the best science return of the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation, taking into account the signal polarization. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation. This allows computation on domains that are large relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.

Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

2014-05-01

177

Analysis tools for discovering strong parity violation at hadron colliders  

NASA Astrophysics Data System (ADS)

Several arguments suggest parity violation may be observable in high energy strong interactions. We introduce new analysis tools to describe the azimuthal dependence of multiparticle distributions, or “azimuthal flow.” Analysis uses the representations of the orthogonal group O(2) and dihedral groups DN necessary to define parity completely in two dimensions. Classification finds that collective angles used in event-by-event statistics represent inequivalent tensor observables that cannot generally be represented by a single “reaction plane.” Many new parity-violating observables exist that have never been measured, while many parity-conserving observables formerly lumped together are now distinguished. We use the concept of “event-shape sorting” to suggest separating right- and left-handed events, and we discuss the effects of transverse and longitudinal spin. The analysis tools are statistically robust, and can be applied equally to low or high multiplicity events at the Tevatron, RHIC or RHIC Spin, and the LHC.

Backović, Mihailo; Ralston, John P.

2011-07-01

178

Galaxy, a web-based genome analysis tool for experimentalists  

PubMed Central

High-throughput data production has revolutionized molecular biology. However, massive increases in data generation capacity require analysis approaches that are more sophisticated, and often very computationally intensive. Thus, making sense of high-throughput data requires informatics support. Galaxy (http://galaxyproject.org) is a software system that provides this support through a framework that gives experimentalists simple interfaces to powerful tools, while automatically managing the computational details. Galaxy is available both as a publicly available web service, which provides tools for the analysis of genomic, comparative genomic, and functional genomic data, and as a downloadable package that can be deployed in individual labs. Either way, it allows experimentalists without informatics or programming expertise to perform complex large-scale analysis with just a web browser. PMID:20069535

Blankenberg, Daniel; Von Kuster, Gregory; Coraor, Nathaniel; Ananda, Guruprasad; Lazarus, Ross; Mangan, Mary; Nekrutenko, Anton; Taylor, James

2014-01-01

179

Virtual tool mark generation for efficient striation analysis.  

PubMed

This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners. PMID:24502818

Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

2014-07-01

180

Microscopy image segmentation tool: robust image data analysis.  

PubMed

We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy. PMID:24689586

Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K

2014-03-01

181

Microscopy image segmentation tool: Robust image data analysis  

SciTech Connect

We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)]

2014-03-15

182

Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment  

PubMed Central

Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

2014-01-01

183

Genomics Assisted Ancestry Deconvolution in Grape  

PubMed Central

The genus Vitis (the grapevine) is a group of highly diverse, diploid woody perennial vines consisting of approximately 60 species from across the northern hemisphere. It is the world’s most valuable horticultural crop with ~8 million hectares planted, most of which is processed into wine. To gain insights into the use of wild Vitis species during the past century of interspecific grape breeding and to provide a foundation for marker-assisted breeding programmes, we present a principal components analysis (PCA) based ancestry estimation method to calculate admixture proportions of hybrid grapes in the United States Department of Agriculture grape germplasm collection using genome-wide polymorphism data. We find that grape breeders have backcrossed to both the domesticated V. vinifera and wild Vitis species and that reasonably accurate genome-wide ancestry estimation can be performed on interspecific Vitis hybrids using a panel of fewer than 50 ancestry informative markers (AIMs). We compare measures of ancestry informativeness used in selecting SNP panels for two-way admixture estimation, and verify the accuracy of our method on simulated populations of admixed offspring. Our method of ancestry deconvolution provides a first step towards selection at the seed or seedling stage for desirable admixture profiles, which will facilitate marker-assisted breeding that aims to introgress traits from wild Vitis species while retaining the desirable characteristics of elite V. vinifera cultivars. PMID:24244717
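As a simplified illustration of PCA-based two-way admixture estimation, the sketch below fits a PCA to two reference panels and places each hybrid along the axis between the parental centroids. Genotypes are simulated, and this is not the authors' exact procedure (which additionally selects ancestry-informative markers and validates on simulated crosses).

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_admixture(ref_a, ref_b, hybrids):
    """Two-way admixture proportions from genotype matrices (individuals x
    SNPs, coded 0/1/2). Projects samples onto the PCA of the two reference
    panels and interpolates between the parental centroids along PC1."""
    pca = PCA(n_components=1)
    scores_ref = pca.fit_transform(np.vstack([ref_a, ref_b]))
    centroid_a = scores_ref[:len(ref_a)].mean()
    centroid_b = scores_ref[len(ref_a):].mean()
    scores_h = pca.transform(hybrids)[:, 0]
    frac_a = (scores_h - centroid_b) / (centroid_a - centroid_b)
    return np.clip(frac_a, 0.0, 1.0)      # fraction of ancestry from panel A

# Simulated example: 50 SNPs with different allele frequencies per species.
rng = np.random.default_rng(0)
p_a, p_b = rng.uniform(0.05, 0.95, 50), rng.uniform(0.05, 0.95, 50)
ref_a = rng.binomial(2, p_a, size=(40, 50))
ref_b = rng.binomial(2, p_b, size=(40, 50))
f1 = rng.binomial(1, p_a, size=(10, 50)) + rng.binomial(1, p_b, size=(10, 50))
print(pca_admixture(ref_a, ref_b, f1).round(2))   # F1 hybrids come out near 0.5
```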

Sawler, Jason; Reisch, Bruce; Aradhya, Mallikarjuna K.; Prins, Bernard; Zhong, Gan-Yuan; Schwaninger, Heidi; Simon, Charles; Buckler, Edward; Myles, Sean

2013-01-01

184

Genomics assisted ancestry deconvolution in grape.  

PubMed

The genus Vitis (the grapevine) is a group of highly diverse, diploid woody perennial vines consisting of approximately 60 species from across the northern hemisphere. It is the world's most valuable horticultural crop with ~8 million hectares planted, most of which is processed into wine. To gain insights into the use of wild Vitis species during the past century of interspecific grape breeding and to provide a foundation for marker-assisted breeding programmes, we present a principal components analysis (PCA) based ancestry estimation method to calculate admixture proportions of hybrid grapes in the United States Department of Agriculture grape germplasm collection using genome-wide polymorphism data. We find that grape breeders have backcrossed to both the domesticated V. vinifera and wild Vitis species and that reasonably accurate genome-wide ancestry estimation can be performed on interspecific Vitis hybrids using a panel of fewer than 50 ancestry informative markers (AIMs). We compare measures of ancestry informativeness used in selecting SNP panels for two-way admixture estimation, and verify the accuracy of our method on simulated populations of admixed offspring. Our method of ancestry deconvolution provides a first step towards selection at the seed or seedling stage for desirable admixture profiles, which will facilitate marker-assisted breeding that aims to introgress traits from wild Vitis species while retaining the desirable characteristics of elite V. vinifera cultivars. PMID:24244717

Sawler, Jason; Reisch, Bruce; Aradhya, Mallikarjuna K; Prins, Bernard; Zhong, Gan-Yuan; Schwaninger, Heidi; Simon, Charles; Buckler, Edward; Myles, Sean

2013-01-01

185

SATRAT: Staphylococcus aureus transcript regulatory network analysis tool  

PubMed Central

Staphylococcus aureus is a commensal organism that primarily colonizes the nose of healthy individuals. S. aureus causes a spectrum of infections that range from skin and soft-tissue infections to fatal invasive diseases. S. aureus uses a large number of virulence factors that are regulated in a coordinated fashion. The complex regulatory mechanisms have been investigated in numerous high-throughput experiments. Access to this data is critical to studying this pathogen. Previously, we developed a compilation of microarray experimental data to enable researchers to search, browse, compare, and contrast transcript profiles. We have substantially updated this database and have built a novel exploratory tool—SATRAT—the S. aureus transcript regulatory network analysis tool, based on the updated database. This tool is capable of performing deep searches using a query and generating an interactive regulatory network based on associations among the regulators of any query gene. We believe this integrated regulatory network analysis tool would help researchers explore the missing links and identify novel pathways that regulate virulence in S. aureus. Also, the data model and the network generation code used to build this resource are open sourced, enabling researchers to build similar resources for other bacterial systems. PMID:25653902

Nagarajan, Vijayaraj; Elasri, Mohamed O.

2015-01-01

186

Applying AI tools to operational space environmental analysis  

NASA Technical Reports Server (NTRS)

The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

Krajnak, Mike; Jesse, Lisa; Mucks, John

1995-01-01

187

A blind deconvolution approach to ultrasound imaging.  

PubMed

In this paper, a single-input multiple-output (SIMO) channel model is introduced for the deconvolution process of ultrasound imaging; the ultrasound pulse is the single system input and tissue reflectivity functions are the channel impulse responses. A sparse regularized blind deconvolution model is developed by projecting the tissue reflectivity functions onto the null space of a cross-relation matrix and projecting the ultrasound pulse onto a low-resolution space. In this way, the computational load is greatly reduced and the estimation accuracy can be improved because the proposed deconvolution model contains fewer variables. Subsequently, an alternating direction method of multipliers (ADMM) algorithm is introduced to efficiently solve the proposed blind deconvolution problem. Finally, the performance of the proposed blind deconvolution method is examined using both computer-simulated data and practical in vitro and in vivo data. The results show a great improvement in the quality of ultrasound images in terms of signal-to-noise ratio and spatial resolution gain. PMID:24626035
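The cross-relation idea referred to above can be made concrete with the classical two-channel case: in the noiseless SIMO model y1 = h1*s and y2 = h2*s, so y1*h2 = y2*h1 and the stacked channel vector lies in the null space of a cross-relation matrix. The sketch below recovers the channels (up to a common scale) from that null space via SVD; it is the textbook construction, not the paper's sparse-regularized ADMM formulation, and the channel length L is assumed known.

```python
import numpy as np

def convolution_matrix(y, L):
    """Matrix C such that C @ h == np.convolve(y, h) for len(h) == L."""
    N = len(y)
    C = np.zeros((N + L - 1, L))
    for j in range(L):
        C[j:j + N, j] = y
    return C

def cross_relation_estimate(y1, y2, L):
    """Blind two-channel estimate from the cross-relation y1*h2 - y2*h1 = 0:
    the stacked vector [h2; h1] spans the null space of M = [C(y1), -C(y2)],
    recovered (up to scale) as the right singular vector of the smallest
    singular value."""
    M = np.hstack([convolution_matrix(y1, L), -convolution_matrix(y2, L)])
    _, _, Vt = np.linalg.svd(M)
    v = Vt[-1]
    return v[L:], v[:L]        # h1_est, h2_est, each up to a common scale

# Toy test: one reflectivity sequence observed through two short channels.
rng = np.random.default_rng(3)
s = rng.standard_normal(200)
h1, h2 = np.array([1.0, 0.5, -0.2]), np.array([0.8, -0.3, 0.1])
y1, y2 = np.convolve(s, h1), np.convolve(s, h2)
h1_est, h2_est = cross_relation_estimate(y1, y2, L=3)
print(h1_est / h1_est[0], h2_est / h2_est[0])    # proportional to h1, h2
```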

Yu, Chengpu; Zhang, Cishen; Xie, Lihua

2012-02-01

188

Validating and Verifying a New Thermal-Hydraulic Analysis Tool  

SciTech Connect

The Idaho National Engineering and Environmental Laboratory (INEEL) has developed a new analysis tool by coupling the Fluent computational fluid dynamics (CFD) code to the RELAP5-3D{sup C}/ATHENA advanced thermal-hydraulic analysis code. This tool enables researchers to perform detailed, three-dimensional analyses using Fluent's CFD capability while the boundary conditions required by the Fluent calculation are provided by the balance-of-system model created using RELAP5-3D{sup C}/ATHENA. Both steady-state and transient calculations can be performed, using many working fluids and neutronics models ranging from point to three-dimensional. A general description of the techniques used to couple the codes is given. The validation and verification (V and V) matrix is outlined. V and V is presently ongoing. (authors)

Schultz, Richard R.; Weaver, Walter L.; Ougouag, Abderrafi M. [INEEL - Idaho National Engineering and Environmental Laboratory, Idaho Falls, ID 83415 (United States); Wieselquist, William A. [North Carolina State University, 700 Hillsborough St, Raleigh, NC 27606 (United States)

2002-07-01

189

Space mission scenario development and performance analysis tool  

NASA Technical Reports Server (NTRS)

This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

2004-01-01

190

Analysis Tools for Next-Generation Hadron Spectroscopy Experiments  

E-print Network

The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

M. Battaglieri; B. J. Briscoe; A. Celentano; S. -U. Chung; A. D'Angelo; R. De Vita; M. Döring; J. Dudek; S. Eidelman; S. Fegan; J. Ferretti; G. Fox; G. Galata; H. Garcia-Tecocoatzi; D. I. Glazier; B. Grube; C. Hanhart; M. Hoferichter; S. M. Hughes; D. G. Ireland; B. Ketzer; F. J. Klein; B. Kubis; B. Liu; P. Masjuan; V. Mathieu; B. McKinnon; R. Mitchell; F. Nerling; S. Paul; J. R. Pelaez; J. Rademacker; A. Rizzo; C. Salgado; E. Santopinto; A. V. Sarantsev; T. Sato; T. Schlüter; M. L. L. da Silva; I. Stankovic; I. Strakovsky; A. Szczepaniak; A. Vassallo; N. K. Walford; D. P. Watts; L. Zana

2014-12-19

191

SMART (Shop floor Modeling, Analysis and Reporting Tool Project  

NASA Technical Reports Server (NTRS)

This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description of it is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

1999-01-01

192

Stranger: An Automata-Based String Analysis Tool for PHP  

Microsoft Academic Search

Stranger is an automata-based string analysis tool for finding and eliminating string-related security vulnerabilities in PHP applications. Stranger uses symbolic forward and backward reachability analyses to compute the possible values that the string expressions can take during program execution. Stranger can automatically (1) prove that an application is free from specified attacks or (2) generate vulnerability signatures that characterize all

Fang Yu; Muath Alkhalaf; Tevfik Bultan

2010-01-01

193

An integrated thermal management analysis tool [for aircraft]  

Microsoft Academic Search

A computational tool, developed to perform subsystem and system level integrated thermal management assessment and design calculations, is described in this paper. The Vehicle Integrated Thermal Management Analysis Code (VITMAC) simulates the coupled thermal-fluid response of airframe/engine active cooling circuits, airframe/engine structural components, and fuel tanks subject to aeroheating and internal/engine heat loads. VITMAC simulates both the steady-state and transient

F. Issacci; A. Telal Wassel; V. Van Griethuysen

1996-01-01

194

Millennial scale system impulse response of polar climates - deconvolution results between δ18O records from Greenland and Antarctica  

NASA Astrophysics Data System (ADS)

Deconvolution has long been used in science to recover real input given a system's impulse response and output. In this study, we applied spectral division deconvolution to select polar δ18O time series to investigate the possible relationship between the climates of the Polar Regions, i.e., the equivalent of a climate system's 'impulse response.' While the records may be the result of nonlinear processes, deconvolution remains an appropriate tool because the two polar climates are synchronized, forming a Hilbert transform pair. In order to compare records, the age models of three Greenland and four Antarctic records have been matched via a Monte Carlo method using the methane-matched pair GRIP and BYRD as a basis for the calculations. For all twelve polar pairs, various deconvolution schemes (Wiener, Damped Least Squares, Tikhonov, Kalman filter) give consistent, quasi-periodic, impulse responses of the system. Multitaper analysis reveals strong, millennial-scale, quasi-periodic oscillations in these system responses with a range of 2,500 to 1,000 years. These are not symmetric, as the transfer function from north to south differs from that of south to north. However, the difference is systematic and occurs in the predominant period of the deconvolved signals. Specifically, the north-to-south transfer function is generally of longer period than the south-to-north transfer function. High-amplitude power peaks at 5.0 ky to 1.7 ky characterize the former, while the latter contains peaks at mostly short periods, with a range of 2.5 ky to 1.0 ky. Consistent with many observations, the deconvolved, quasi-periodic transfer functions share the predominant periodicities found in the data, some of which are likely related to solar forcing (2.5-1.0 ky), while some are probably indicative of the internal oscillations of the climate system (1.6-1.4 ky). The approximately 1.5 ky transfer function may represent the internal periodicity of the system, perhaps even related to the periodicity of the thermohaline circulation (THC). Simplified models of the polar climate fluctuations are shown to support these findings.
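A generic sketch of damped spectral-division deconvolution, one of the schemes listed above, estimating the impulse response between an input and an output series with a water-level style regularizer. The synthetic series and damping value are illustrative and do not reproduce the authors' processing of the δ18O records.

```python
import numpy as np

def damped_spectral_division(output, inp, damping=0.01):
    """Estimate the impulse response h with output ~ h * inp via damped
    frequency-domain spectral division (water-level style regularization):
    H(f) = O(f) conj(I(f)) / (|I(f)|^2 + eps), eps = damping * max|I|^2."""
    n = len(output)
    O, I = np.fft.rfft(output, n), np.fft.rfft(inp, n)
    power = np.abs(I) ** 2
    H = O * np.conj(I) / (power + damping * power.max())
    return np.fft.irfft(H, n)

# Synthetic check: a known quasi-periodic impulse response is recovered.
rng = np.random.default_rng(7)
t = np.arange(1024) * 0.1                                  # e.g. 0.1 kyr sampling
inp = rng.standard_normal(t.size)
true_h = np.exp(-t / 3.0) * np.cos(2 * np.pi * t / 1.5)    # ~1.5 kyr periodicity
output = np.convolve(inp, true_h)[:t.size] + 0.05 * rng.standard_normal(t.size)
h_est = damped_spectral_division(output, inp, damping=0.02)
```

Raising the damping suppresses noise amplification at frequencies where the input spectrum is weak, at the cost of smoothing the recovered impulse response; that trade-off is the regularization choice the deconvolution schemes above handle in different ways.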

Reischmann, E.; Yang, X.; Rial, J. A.

2013-12-01

195

Blind-deconvolution optical-resolution photoacoustic microscopy in vivo.  

PubMed

Optical-resolution photoacoustic microscopy (OR-PAM) is becoming a vital tool for studying the microcirculation system in vivo. By increasing the numerical aperture of optical focusing, the lateral resolution of OR-PAM can be improved; however, the depth of focus and thus the imaging range will be sacrificed correspondingly. In this work, we report our development of blind-deconvolution optical-resolution photoacoustic microscopy (BD-PAM) that can provide a lateral resolution ~2-fold finer than that of conventional OR-PAM (3.04 vs. 5.78 μm), without physically increasing the system's numerical aperture. The improvement achieved with BD-PAM is demonstrated by imaging graphene nanoparticles and the microvasculature of mouse ears in vivo. Our results suggest that BD-PAM may become a valuable tool for many biomedical applications that require both fine spatial resolution and extended depth of focus. PMID:23546115

Chen, Jianhua; Lin, Riqiang; Wang, Huina; Meng, Jing; Zheng, Hairong; Song, Liang

2013-03-25

196

On the next generation of reliability analysis tools  

NASA Technical Reports Server (NTRS)

The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

Babcock, Philip S., IV; Leong, Frank; Gai, Eli

1987-01-01

197

Aerospace Power Systems Design and Analysis (APSDA) Tool  

NASA Technical Reports Server (NTRS)

The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool's user interface operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

Truong, Long V.

1998-01-01

198

ISAC: A tool for aeroservoelastic modeling and analysis  

NASA Technical Reports Server (NTRS)

The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

Adams, William M., Jr.; Hoadley, Sherwood Tiffany

1993-01-01

199

Analysis Tools for Next-Generation Hadron Spectroscopy Experiments  

E-print Network

The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near...

Battaglieri, M; Celentano, A; Chung, S -U; D'Angelo, A; De Vita, R; Döring, M; Dudek, J; Eidelman, S; Fegan, S; Ferretti, J; Fox, G; Galata, G; Garcia-Tecocoatzi, H; Glazier, D I; Grube, B; Hanhart, C; Hoferichter, M; Hughes, S M; Ireland, D G; Ketzer, B; Klein, F J; Kubis, B; Liu, B; Masjuan, P; Mathieu, V; McKinnon, B; Mitchell, R; Nerling, F; Paul, S; Pelaez, J R; Rademacker, J; Rizzo, A; Salgado, C; Santopinto, E; Sarantsev, A V; Sato, T; Schlüter, T; da Silva, M L L; Stankovic, I; Strakovsky, I; Szczepaniak, A; Vassallo, A; Walford, N K; Watts, D P; Zana, L

2014-01-01

200

Multiscale CLEAN Deconvolution of Radio Synthesis Images  

NASA Astrophysics Data System (ADS)

Radio synthesis imaging is dependent upon deconvolution algorithms to counteract the sparse sampling of the Fourier plane. These deconvolution algorithms find an estimate of the true sky brightness from the necessarily incompletely sampled visibility data. The most widely used radio synthesis deconvolution method is the CLEAN algorithm of Hogbom. This algorithm works extremely well for collections of point sources and surprisingly well for extended objects. However, the performance for extended objects can be improved by adopting a multiscale approach. We describe and demonstrate a conceptually simple and algorithmically straightforward extension to CLEAN that models the sky brightness by the summation of components of emission having different size scales. While previous multiscale algorithms work sequentially on decreasing scale sizes, our algorithm works simultaneously on a range of specified scales. Applications to both real and simulated data sets are given.
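
For readers unfamiliar with CLEAN, the following is a minimal one-dimensional Hogbom-style CLEAN loop in Python on synthetic data. It is not the multiscale algorithm described above, which would additionally choose among tapered components of several specified scales at each iteration; the beam shape, gain, and threshold are illustrative.

# A minimal 1-D Hogbom-style CLEAN sketch on synthetic data (not the
# multiscale algorithm of the paper).
import numpy as np

def clean_1d(dirty, beam, gain=0.1, n_iter=500, threshold=1e-3):
    """Iteratively subtract shifted, scaled copies of the dirty beam."""
    residual = dirty.copy()
    components = np.zeros_like(dirty)
    center = int(np.argmax(beam))                 # beam peak position
    for _ in range(n_iter):
        peak = int(np.argmax(np.abs(residual)))
        if np.abs(residual[peak]) < threshold:
            break
        amp = gain * residual[peak]
        residual -= amp * np.roll(beam, peak - center)
        components[peak] += amp
    return components, residual

# Synthetic example: two point sources observed through a broad Gaussian beam
x = np.arange(256)
beam = np.exp(-0.5 * ((x - 128) / 6.0) ** 2)
dirty = 1.0 * np.roll(beam, 100 - 128) + 0.6 * np.roll(beam, 140 - 128)
components, residual = clean_1d(dirty, beam)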

Cornwell, T. J.

2008-11-01

201

Ultrasonic range resolution enhancement using L1 norm deconvolution  

Microsoft Academic Search

Deconvolution techniques have been widely used to improve resolution and quality of ultrasonic images. One of the optimality criteria is minimizing the Lp norm of the estimation error. It has been shown that the L1 norm deconvolution is more robust than the L2 norm. Linear programming and iterative reweighted least squares (IRLS) algorithms have been implemented in L1 deconvolution. Because
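
The IRLS approach to L1-norm deconvolution mentioned above can be sketched as follows. This is an assumed, generic formulation (minimizing the L1 norm of the data residual by solving a sequence of reweighted least-squares problems), not the authors' implementation; the pulse shape and reflector positions are synthetic.

# A minimal IRLS sketch for L1-norm deconvolution: minimize ||y - H x||_1
# by solving reweighted least-squares problems. Toy data throughout.
import numpy as np
from scipy.linalg import toeplitz

def l1_deconv_irls(y, h, n_iter=50, eps=1e-6):
    n = len(y)
    col = np.r_[h, np.zeros(n - len(h))]
    H = toeplitz(col, np.r_[h[0], np.zeros(n - 1)])   # causal convolution matrix
    x = np.linalg.lstsq(H, y, rcond=None)[0]          # L2 starting point
    for _ in range(n_iter):
        r = y - H @ x
        w = 1.0 / np.maximum(np.abs(r), eps)          # IRLS weights ~ 1 / |residual|
        W = np.diag(w)
        x = np.linalg.solve(H.T @ W @ H + 1e-8 * np.eye(n), H.T @ W @ y)
    return x

# Synthetic ultrasonic-style example: sparse reflectors blurred by a short pulse
rng = np.random.default_rng(0)
pulse = np.array([0.2, 0.8, 1.0, 0.6, 0.1])
x_true = np.zeros(200)
x_true[[40, 48, 120]] = [1.0, -0.7, 0.5]
y = np.convolve(x_true, pulse)[:200] + 0.02 * rng.standard_normal(200)
x_hat = l1_deconv_irls(y, pulse)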

Jianqiang Xin; Nihat M. Bilgutay

1993-01-01

202

Judo match analysis,a powerful coaching tool, basic and advanced tools  

E-print Network

In this second paper on match analysis, we analyze in depth the competition steps, showing the evolution of this tool at the National Federation level on the basis of our first classification. Furthermore, match analysis is the most important source of technical assessment. Studying competition with this tool is essential for coaches because they can obtain useful information for their coaching. Match analysis is today the master key in situation sports like judo, helping in a useful way with the difficult task of the coach, or better of national or Olympic coaching teams. This paper presents a deeper study of high-level judo competitions from both the male and female points of view, explaining in the light of biomechanics not only the evolution of throws over time and the introduction of innovative and chaotic techniques, but also the evolution of fighting style in these high-level competitions, both connected with the growth of this Olympic sport in the world arena. It is shown how new interesting ways are opened by this powerful coac...

Sacripanti, A

2013-01-01

203

Evaluating control displays with the Engineering Control Analysis Tool (ECAT)  

SciTech Connect

In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

Plott, B. [Alion Science and Technology, MA and D Operation, 4949 Pearl E. Circle, 300, Boulder, CO 80301 (United States)

2006-07-01

204

Colossal Tooling Design: 3D Simulation for Ergonomic Analysis  

NASA Technical Reports Server (NTRS)

The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

2003-01-01

205

POPBAM: Tools for Evolutionary Analysis of Short Read Sequence Alignments  

PubMed Central

Background While many bioinformatics tools currently exist for assembling and discovering variants from next-generation sequence data, there are very few tools available for performing evolutionary analyses from these data. Evolutionary and population genomics studies hold great promise for providing valuable insights into natural selection, the effect of mutations on phenotypes, and the origin of species. Thus, there is a need for an extensible and flexible computational tool that can fit into a growing number of evolutionary bioinformatics pipelines. Results This paper describes the POPBAM software, which is a comprehensive set of computational tools for evolutionary analysis of whole-genome alignments consisting of multiple individuals, from multiple populations or species. POPBAM works directly from BAM-formatted assembly files, calls variant sites, and calculates a variety of commonly used evolutionary sequence statistics. POPBAM is designed primarily to perform analyses in sliding windows across chromosomes or scaffolds. POPBAM accurately measures nucleotide diversity, population divergence, linkage disequilibrium, and the frequency spectrum of mutations from two or more populations. POPBAM can also produce phylogenetic trees of all samples in a BAM file. Finally, I demonstrate that the implementation of POPBAM is both fast and memory-efficient, and also can feasibly scale to the analysis of large BAM files with many individuals and populations. Software: The POPBAM program is written in C/C++ and is available from http://dgarriga.github.io/POPBAM. The program has few dependencies and can be built on a variety of Linux platforms. The program is open-source and users are encouraged to participate in the development of this resource. PMID:24027417
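
As a toy illustration of the kind of sliding-window statistic POPBAM reports, the snippet below computes nucleotide diversity (pi) in windows across a small alignment. It is not POPBAM code (POPBAM is written in C/C++ and works from BAM files); the sequences, window size, and step are made up.

# Illustrative sliding-window nucleotide diversity (pi) on toy sequences.
from itertools import combinations

def pi_per_site(seqs):
    """Average pairwise differences per site, skipping comparisons with N."""
    n_pairs, diffs, sites = 0, 0, len(seqs[0])
    for a, b in combinations(seqs, 2):
        diffs += sum(1 for x, y in zip(a, b) if x != y and "N" not in (x, y))
        n_pairs += 1
    return diffs / (n_pairs * sites)

def sliding_pi(seqs, window, step):
    out = []
    for start in range(0, len(seqs[0]) - window + 1, step):
        chunk = [s[start:start + window] for s in seqs]
        out.append((start, pi_per_site(chunk)))
    return out

seqs = ["ACGTACGTACGTACGT",
        "ACGTACGAACGTACGT",
        "ACGTTCGTACGTACGA"]
print(sliding_pi(seqs, window=8, step=4))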

Garrigan, Daniel

2013-01-01

206

Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.  

ERIC Educational Resources Information Center

This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

Carlson, David H.

1986-01-01

207

TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS  

SciTech Connect

We present the Tool for Astrophysical Data Analysis (TA-DA), a new software aimed to greatly simplify and improve the analysis of stellar photometric data in comparison with theoretical models, and allow the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

Da Rio, Nicola [European Space Agency, Keplerlaan 1, 2200-AG Noordwijk (Netherlands); Robberto, Massimo, E-mail: ndario@rssd.esa.int [Space Telescope Science Institute, 3700 San Martin Dr., Baltimore, MD 21218 (United States)

2012-12-01

208

A Simulation Tool for Analysis of Alternative Paradigms for the New Electricity Business  

E-print Network

Preliminary results are presented on the development of a simulation tool for the analysis of alternative paradigms for the new electricity business. Since power systems

209

Protocol analysis as a tool for behavior analysis  

PubMed Central

The study of thinking is made difficult by the fact that many of the relevant stimuli and responses are not apparent. Although the use of verbal reports has a long history in psychology, it is only recently that Ericsson and Simon's (1993) book on verbal reports explicated the conditions under which such reports may be reliable and valid. We review some studies in behavior analysis and cognitive psychology that have used talk-aloud reporting. We review particular methods for collecting reliable and valid verbal reports using the “talk-aloud” method as well as discuss alternatives to the talk-aloud procedure that are effective under different task conditions, such as the use of reports after completion of very rapid task performances. We specifically caution against the practice of asking subjects to reflect on the causes of their own behavior and the less frequently discussed problems associated with providing inappropriate social stimulation to participants during experimental sessions. PMID:22477126

Austin, John; Delaney, Peter F.

1998-01-01

210

Operations other than war: Requirements for analysis tools research report  

SciTech Connect

This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

Hartley, D.S. III

1996-12-01

211

The Precision Formation Flying Integrated Analysis Tool (PFFIAT)  

NASA Technical Reports Server (NTRS)

Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

2004-01-01

212

Target deconvolution strategies in drug discovery  

Microsoft Academic Search

Recognition of some of the limitations of target-based drug discovery has recently led to the renaissance of a more holistic approach in which complex biological systems are investigated for phenotypic changes upon exposure to small molecules. The subsequent identification of the molecular targets that underlie an observed phenotypic response — termed target deconvolution — is an important aspect of current

Christina Schlüpen; Roberto Raggiaschi; Giovanni Gaviraghi; Georg C. Terstappen

2007-01-01

213

Blind seismic deconvolution using variational Bayesian method  

NASA Astrophysics Data System (ADS)

Blind seismic deconvolution, which must recover both the seismic wavelet and the reflectivity sequence, is a strongly ill-posed problem. The reflectivity sequence is modeled as a Bernoulli-Gaussian (BG) process, depending on four parameters (noise variance, high and low reflector variances, and reflector density). These parameters need to be estimated from the seismic record, which is the convolution of the reflectivity sequence and the seismic wavelet. In this paper, we propose a variational Bayesian method for blind seismic deconvolution which can determine the reflectivity sequence and the seismic wavelet. The connection between variational Bayesian blind deconvolution and the minimization of the Kullback-Leibler divergence of two probability distributions is also established. Gamma and beta distributions are used as prior distributions for the unknown parameters (hyperparameters), and we show how these distributions can be inferred in actual situations. The proposed algorithms are tested by simulation and compared to existing blind deconvolution methods. The results show that the variational Bayesian method has better agreement with the actual values.
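
The Bernoulli-Gaussian forward model underlying this kind of blind deconvolution can be written down in a few lines; the sketch below simulates a reflectivity sequence and a noisy trace. The reflector density, variances, and Ricker wavelet parameters are illustrative assumptions, not estimates produced by the variational Bayesian method.

# A minimal sketch of the Bernoulli-Gaussian forward model; all parameter
# values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 500
density = 0.05               # probability that a sample contains a reflector
var_hi, var_lo = 1.0, 1e-4   # variances of reflector / background samples
var_noise = 1e-3

# Bernoulli-Gaussian reflectivity: Gaussian samples whose variance depends on
# the Bernoulli indicator at each time sample
q = rng.random(n) < density
r = np.where(q, np.sqrt(var_hi), np.sqrt(var_lo)) * rng.standard_normal(n)

# Ricker-type wavelet (the unknown that the blind method must estimate)
t = np.linspace(-0.05, 0.05, 51)
f0 = 30.0
w = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-(np.pi * f0 * t) ** 2)

# Observed trace: convolution of reflectivity and wavelet plus Gaussian noise
trace = np.convolve(r, w, mode="same") + np.sqrt(var_noise) * rng.standard_normal(n)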

Yanqin, Li; Guoshan, Zhang

2014-11-01

214

Blind deconvolution through digital signal processing  

Microsoft Academic Search

This paper addresses the problem of deconvolving two signals when both are unknown. The authors call this problem blind deconvolution. The discussion develops two related solutions which can be applied through digital signal processing in certain practical cases. The case of reverberated and resonated sound forms the center of the development. The specific problem of restoring old acoustic recordings provides

T. M. Cannon; R. B. Ingebretsen

1975-01-01

215

Fast Wavelet-Regularized Image Deconvolution  

Microsoft Academic Search

We present a modified version of the deconvolution algorithm introduced by Figueiredo and Nowak, which leads to a substantial acceleration. The algorithm essentially consists in alternating between a Landweber-type iteration and a wavelet-domain denoising step. Our key innovations are 1) the use of a Shannon wavelet basis, which decouples the problem across subbands, and 2) the use
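
A minimal one-dimensional version of the underlying thresholded-Landweber (ISTA) scheme is sketched below: a Landweber gradient step alternated with wavelet-domain soft thresholding. It uses PyWavelets with an ordinary Daubechies basis and makes no attempt at the Shannon-wavelet, subband-decoupled acceleration that is the paper's contribution; the blur kernel, test signal, and parameters are toy values.

# Thresholded-Landweber (ISTA) sketch for 1-D wavelet-regularized deconvolution.
import numpy as np
import pywt

def blur(x, h):
    """Circular convolution via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, len(x))))

def blur_adjoint(x, h):
    """Adjoint of the circular convolution operator."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(h, len(x)))))

def ista_wavelet_deconv(y, h, lam=0.01, step=1.0, n_iter=200, wavelet="db4"):
    x = y.copy()
    for _ in range(n_iter):
        grad = blur_adjoint(blur(x, h) - y, h)            # Landweber step
        z = x - step * grad
        coeffs = pywt.wavedec(z, wavelet, mode="periodization")
        coeffs = [pywt.threshold(c, lam * step, mode="soft") for c in coeffs]
        x = pywt.waverec(coeffs, wavelet, mode="periodization")
    return x

# Toy example: a piecewise-constant signal blurred by a normalized Gaussian
n = 256
t = np.arange(n)
x_true = (t > 60).astype(float) - 0.5 * (t > 180)
h = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
h /= h.sum()
y = blur(x_true, h) + 0.01 * np.random.default_rng(2).standard_normal(n)
x_hat = ista_wavelet_deconv(y, h)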

Cédric Vonesch; Michael Unser

2007-01-01

216

Nonstationary sparsity-constrained seismic deconvolution  

NASA Astrophysics Data System (ADS)

The Robinson convolution model is mainly restricted by three inappropriate assumptions, i.e., statistically white reflectivity, minimum-phase wavelet, and stationarity. Modern reflectivity inversion methods (e.g., sparsity-constrained deconvolution) generally attempt to suppress the problems associated with the first two assumptions but often ignore that seismic traces are nonstationary signals, which undermines the basic assumption of an unchanging wavelet in reflectivity inversion. Through tests on reflectivity series, we confirm the effects of nonstationarity on reflectivity estimation and the loss of significant information, especially in deep layers. To overcome the problems caused by nonstationarity, we propose a nonstationary convolutional model, and then use the attenuation curve in log spectra to detect and correct the influences of nonstationarity. We use Gabor deconvolution to handle nonstationarity and sparsity-constrained deconvolution to separate reflectivity and wavelet. The combination of the two deconvolution methods effectively handles nonstationarity and greatly reduces the problems associated with the unreasonable assumptions regarding reflectivity and wavelet. Using marine seismic data, we show that correcting nonstationarity helps recover subtle reflectivity information and enhances the characterization of details with respect to the geological record.
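
The nonstationary convolutional model can be illustrated with a simple synthetic: each reflectivity spike is convolved with a source wavelet whose spectrum is attenuated as exp(-pi f t / Q) at that spike's travel time. The Q value, wavelet, and reflectivity below are illustrative, and the minimum-phase correction applied in practice (e.g., in Gabor deconvolution) is omitted.

# A minimal sketch of a nonstationary (Q-attenuated) convolutional model.
import numpy as np

rng = np.random.default_rng(3)
n, dt, Q = 600, 0.002, 60.0
t = np.arange(n) * dt

# Sparse reflectivity and a zero-phase Ricker source wavelet
r = np.zeros(n)
r[rng.choice(n, 25, replace=False)] = rng.standard_normal(25)
tw = np.arange(-0.1, 0.1, dt)
f0 = 35.0
w = (1 - 2 * (np.pi * f0 * tw) ** 2) * np.exp(-(np.pi * f0 * tw) ** 2)

freqs = np.fft.rfftfreq(len(tw), dt)
W = np.fft.rfft(w)

trace = np.zeros(n)
half = len(tw) // 2
for k in np.nonzero(r)[0]:
    # attenuate the source spectrum according to this reflector's travel time
    Wk = W * np.exp(-np.pi * freqs * t[k] / Q)
    wk = np.fft.irfft(Wk, len(tw))
    lo, hi = k - half, k - half + len(tw)
    a, b = max(lo, 0), min(hi, n)
    trace[a:b] += r[k] * wk[a - lo:b - lo]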

Sun, Xue-Kai; Sam, Zandong Sun; Xie, Hui-Wen

2014-12-01

217

Signal restoration through deconvolution applied to deep  

E-print Network

Signal restoration through deconvolution is applied to deep mantle seismic probes to improve the signal-to-noise ratio, sharpen seismic arrival onset, and act as an empirical deconvolution in the wave train. The resulting restored time series facilitates more accurate and objective relative travel time estimation.

Renaut, Rosemary

218

Signal restoration through deconvolution applied to deep  

E-print Network

In [1] we present a method of signal restoration through deconvolution applied to deep mantle seismic probes. The resulting restored time series facilitates more accurate and objective relative travel time estimation.

Renaut, Rosemary

219

Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation  

SciTech Connect

The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

Neubauer, J.

2014-12-01

220

VisIt: Interactive Parallel Visualization and Graphical Analysis Tool  

NASA Astrophysics Data System (ADS)

VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. See the table below for more details about the tool’s features. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

Department Of Energy (DOE) Advanced Simulation; Computing Initiative (ASCI)

2011-03-01

221

Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool  

SciTech Connect

Report describes process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

Unwin, Stephen D.; Seiple, Timothy E.

2009-05-28

222

Mechanical System Analysis/Design Tool (MSAT) Quick Guide  

NASA Technical Reports Server (NTRS)

MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input stream for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation - the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design of product development process.

Lee, HauHua; Kolb, Mark; Madelone, Jack

1998-01-01

223

Stacks: an analysis tool set for population genomics  

PubMed Central

Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics. PMID:23701397

CATCHEN, JULIAN; HOHENLOHE, PAUL A.; BASSHAM, SUSAN; AMORES, ANGEL; CRESKO, WILLIAM A.

2014-01-01

224

Orbit Analysis Tools Software user's manual, version 1  

NASA Astrophysics Data System (ADS)

In the course of our work in mission planning and analysis we have developed a set of computer programs that address many of the questions commonly asked by designers when planning a new satellite system and by managers wishing to assess the performance of an existing system. The Orbit Analysis Tools Software (OATS) is an organization of this collection of computer programs unified by a single graphical user interface. The graphical and tabular output from OATS may be printed directly from the program or cut and pasted via the clipboard into any other Macintosh application program. The FaceIt1 utility is used to establish the interface between the FORTRAN code and the Macintosh Toolbox.

Hope, Alan S.; Middour, Jay

1993-04-01

225

PyRAT (python radiography analysis tool): overview  

SciTech Connect

PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization based inversion approach with goal of identifying unknown object configurations - MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define feasible region and discrete neighbor for the MVO problem - initial data analysis + material library → a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.
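
Feature (4) refers to standard Tikhonov regularization of a linear inverse problem; a generic zeroth-order version (not PyRAT's actual formulation) is sketched below on a synthetic ill-conditioned blur problem.

# Generic Tikhonov-regularized solution of a linear inverse problem.
import numpy as np

def tikhonov(A, b, alpha):
    """Solve min ||A x - b||^2 + alpha^2 ||x||^2 via the augmented system."""
    n = A.shape[1]
    A_aug = np.vstack([A, alpha * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

# Ill-conditioned toy problem: recover a smooth profile from blurred data
rng = np.random.default_rng(4)
n = 80
x = np.linspace(0, 1, n)
A = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.05 ** 2))   # blur matrix
x_true = np.sin(2 * np.pi * x)
b = A @ x_true + 0.01 * rng.standard_normal(n)
x_hat = tikhonov(A, b, alpha=0.1)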

Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory

2011-01-14

226

Application of regularized Richardson–Lucy algorithm for deconvolution of confocal microscopy images  

PubMed Central

Although confocal microscopes have considerably smaller contribution of out-of-focus light than widefield microscopes, the confocal images can still be enhanced mathematically if the optical and data acquisition effects are accounted for. For that, several deconvolution algorithms have been proposed. As a practical solution, maximum-likelihood algorithms with regularization have been used. However, the choice of regularization parameters is often unknown although it has considerable effect on the result of deconvolution process. The aims of this work were: to find good estimates of deconvolution parameters; and to develop an open source software package that would allow testing different deconvolution algorithms and that would be easy to use in practice. Here, Richardson–Lucy algorithm has been implemented together with the total variation regularization in an open source software package IOCBio Microscope. The influence of total variation regularization on deconvolution process is determined by one parameter. We derived a formula to estimate this regularization parameter automatically from the images as the algorithm progresses. To assess the effectiveness of this algorithm, synthetic images were composed on the basis of confocal images of rat cardiomyocytes. From the analysis of deconvolved results, we have determined under which conditions our estimation of total variation regularization parameter gives good results. The estimated total variation regularization parameter can be monitored during deconvolution process and used as a stopping criterion. An inverse relation between the optimal regularization parameter and the peak signal-to-noise ratio of an image is shown. Finally, we demonstrate the use of the developed software by deconvolving images of rat cardiomyocytes with stained mitochondria and sarcolemma obtained by confocal and widefield microscopes. PMID:21323670
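
A one-dimensional sketch of Richardson-Lucy iteration with total-variation regularization, in the spirit of the algorithm described above, is given below. The regularization weight is fixed by hand here, whereas the IOCBio implementation works on 3D confocal stacks and estimates that parameter automatically as the iterations progress; the PSF and test signal are synthetic.

# 1-D Richardson-Lucy deconvolution with a total-variation factor (sketch).
import numpy as np

def conv(x, h):
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, len(x))))

def conv_adjoint(x, h):
    return np.real(np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(h, len(x)))))

def rl_tv(y, psf, lam=0.002, n_iter=200, eps=1e-8):
    x = np.full_like(y, y.mean())
    for _ in range(n_iter):
        ratio = y / (conv(x, psf) + eps)
        rl_factor = conv_adjoint(ratio, psf)              # standard RL factor
        grad = np.gradient(x)
        div = np.gradient(grad / (np.abs(grad) + eps))    # div(grad x / |grad x|)
        x = np.clip(x * rl_factor / (1.0 - lam * div + eps), 0, None)
    return x

# Toy fluorescence-like example: nonnegative peaks blurred by a Gaussian PSF
rng = np.random.default_rng(5)
n = 256
x_true = np.zeros(n)
x_true[[60, 61, 62, 150, 151]] = [4.0, 8.0, 4.0, 6.0, 6.0]
psf = np.exp(-0.5 * (np.arange(-15, 16) / 4.0) ** 2)
psf /= psf.sum()
y = rng.poisson(np.clip(conv(x_true, psf) * 50.0, 0, None)) / 50.0
x_hat = rl_tv(y, psf)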

Laasmaa, M; Vendelin, M; Peterson, P

2011-01-01

227

CRITICA: coding region identification tool invoking comparative analysis  

NASA Technical Reports Server (NTRS)

Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
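
The noncomparative dicodon-bias signal can be illustrated by counting in-frame hexanucleotide frequencies, as in the toy snippet below; this is not CRITICA's code, and the sequence is made up.

# Relative frequencies of in-frame hexanucleotides ("dicodons") in one frame.
from collections import Counter

def dicodon_frequencies(seq, frame=0):
    """Count hexanucleotides starting at codon boundaries in one reading frame."""
    seq = seq.upper()
    counts = Counter(
        seq[i:i + 6]
        for i in range(frame, len(seq) - 5, 3)      # step by one codon
        if set(seq[i:i + 6]) <= set("ACGT")
    )
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

orf = "ATGGCTGCTAAAGCTGCTGAAGCTTAA"
freqs = dicodon_frequencies(orf)
print(sorted(freqs.items(), key=lambda kv: -kv[1])[:5])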

Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

1999-01-01

228

ROBUST 2008 Poster Section 2008 c JCMF Detecting atoms in deconvolution  

E-print Network

We study the atomic deconvolution problem, propose an estimator for an atom location, and give its asymptotics in the ordinary deconvolution problem. In the ordinary deconvolution problem one wants

Jureckova, Jana

229

Design and Analysis Tool for External-Compression Supersonic Inlets  

NASA Technical Reports Server (NTRS)

A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids useable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

Slater, John W.

2012-01-01

230

IPMP 2013--a comprehensive data analysis tool for predictive microbiology.  

PubMed

Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Model (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggested that the IPMP 2013 could be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology. PMID:24334095
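
As an example of the kind of curve fitting such a tool automates, the sketch below fits a reparameterized (Zwietering-type) Gompertz growth model to synthetic log-count data with SciPy. The data points and starting values are invented, and IPMP 2013 itself implements this and the other models mentioned without requiring the user to write code.

# Fitting a reparameterized Gompertz growth model to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, y0, ymax, mu_max, lag):
    """Zwietering-type Gompertz: y in log10 CFU/g, mu_max in log10/h, lag in h."""
    A = ymax - y0
    return y0 + A * np.exp(-np.exp(mu_max * np.e / A * (lag - t) + 1.0))

t = np.array([0, 2, 4, 6, 8, 10, 12, 16, 20, 24, 30], dtype=float)
y = np.array([3.0, 3.0, 3.1, 3.6, 4.5, 5.4, 6.2, 7.4, 8.0, 8.2, 8.3])

p0 = [3.0, 8.3, 0.5, 4.0]                       # rough initial guesses
popt, pcov = curve_fit(gompertz, t, y, p0=p0)
y0, ymax, mu_max, lag = popt
print(f"mu_max = {mu_max:.3f} log10/h, lag = {lag:.2f} h")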

Huang, Lihan

2014-02-01

231

Assessing Extremes Climatology Using NWS Local Climate Analysis Tool  

NASA Astrophysics Data System (ADS)

The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. Field offices need standardized, scientifically sound methodology for local climate analysis (such as trend, composites, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice at the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. The LCAT focus is on the local scale (as opposed to the national and global scales of CPC products). LCAT will: (1) improve the professional competency of local office staff and their expertise in providing local information to their users, thereby improving the quality of local climate services; (2) ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, thereby allowing improvement of CPC climate products; and (3) allow testing of local climate variables beyond temperature averages and precipitation totals, such as the climatology of tornadoes, flash floods, storminess, and extreme weather events, thereby expanding the suite of NWS climate products. The LCAT development utilizes the NWS Operations and Services Improvement Process (OSIP) to document the field and user requirements, develop solutions, and prioritize resources. OSIP is a five-work-stage process separated by four gate reviews. LCAT is currently at work stage three: Research Demonstration and Solution Analysis. Gate 1 and 2 reviews identified LCAT as a high strategic priority project with a very high operational need. The Integrated Working Team, consisting of NWS field representatives, assists in tool function design and identification of LCAT operational deployment support.

Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

2010-12-01

232

Analysis and specification tools in relation to the APSE  

NASA Technical Reports Server (NTRS)

Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem was translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada based or Ada related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few chosen specification and design tools, could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

Hendricks, John W.

1986-01-01

233

CHRONOS's Paleontological-Stratigraphic Interval Construction and Analysis Tool (PSICAT)  

NASA Astrophysics Data System (ADS)

The Paleontological-Stratigraphic Interval Construction and Analysis Tool (PSICAT) is a Java-based graphical editing tool for creating and viewing stratigraphic column diagrams from drill cores and outcrops. It is customized to the task of working with stratigraphic columns and captures data digitally as you draw and edit the diagram. The data and diagrams are captured in open formats, and integration with the CHRONOS system (www.chronos.org) will allow the user to easily upload their data and diagrams into CHRONOS. Because the data and diagrams are stored in CHRONOS, they will be accessible to anyone, anywhere, at any time. PSICAT is designed with a modular, plug-in-based architecture that will allow it to support a wide variety of functionality, tasks, and geoscientific communities. PSICAT is currently being developed for use by the ANDRILL project (http://www.andrill.org) on their upcoming drilling expeditions in Antarctica, but a general community version will be also available. PSICAT will allow unprecedented communication between Antarctica-based scientists and shore-based scientists, potentially allowing shore-based scientists to interact in almost real time with on-ice operations and data collection.

Reed, J. A.; Cervato, C.; Fielding, C. R.; Fils, D.

2005-12-01

234

Multi-Mission Power Analysis Tool (MMPAT) Version 3  

NASA Technical Reports Server (NTRS)

The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, it can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

Wood, Eric G.; Chang, George W.; Chen, Fannie C.

2012-01-01

235

A Freeware Java Tool for Spatial Point Analysis of Neuronal Structures  

E-print Network

A freeware tool, called PAJ, has been developed for spatial point analysis of neuronal structures. This Java-based tool takes 3D Cartesian coordinates as input and is based on previously described statistical analysis (Diggle 2003). In PAJ, data is copied

Condron, Barry

236

Combinatorial tools for the analysis of transcriptional regulation  

SciTech Connect

In this paper, we discuss virtual experiments for the study of major regulatory processes such as translation, signalization or transcription pathways. An essential part of these processes is the formation of protein clusters held together by a small number of binding domains that can be shared by many different proteins. Analysis of these clusters is complicated by the vast number of different arrangements of proteins that can trigger a specific reaction. We propose combinatorial tools that can help predict the effects on the rate of transcription of either changes in transcriptional factors concentration, or due to the introduction of chimeras combining domains not usually present on a protein. 15 refs., 5 figs., 3 tabs.

Bergeron, A.; Gaul, E.; Bergeron, D. [Universite du Quebec a Montreal (Canada)

1996-12-31

237

Decision Analysis Tool to Compare Energy Pathways for Transportation  

SciTech Connect

With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies are proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). Biomass is seen as an important domestic energy feedstock, and there are multiple pathways in which it can be linked to the transport sector. Contenders include the use of cellulosic ethanol from biomass to replace gasoline or the use of a biomass-fueled combined cycle electrical power generation facility in conjunction with plug-in hybrid electric vehicles (PHEVs). This paper reviews a project that is developing a scenario decision analysis tool to assist policy makers, program managers, and others to obtain a better understanding of these uncertain possibilities and how they may interact over time.

Bloyd, Cary N.

2010-06-30

238

SINEBase: a database and tool for SINE analysis.  

PubMed

SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis. PMID:23203982

Vassetzky, Nikita S; Kramerov, Dmitri A

2013-01-01

239

Sensitivity analysis of an information fusion tool: OWA operator  

NASA Astrophysics Data System (ADS)

The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
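
For reference, the basic OWA aggregation with order weights derived from a regular increasing monotone (RIM) linguistic quantifier Q(r) = r**alpha can be sketched as follows; alpha plays the role of the optimism degree. The scores are illustrative, and the Minimal Variability weights discussed in the paper would come from a separate optimization not shown here.

# OWA aggregation with quantifier-based order weights (illustrative values).
import numpy as np

def owa_weights_quantifier(n, alpha):
    """Order weights w_i = Q(i/n) - Q((i-1)/n) for Q(r) = r**alpha."""
    i = np.arange(1, n + 1)
    return (i / n) ** alpha - ((i - 1) / n) ** alpha

def owa(values, weights):
    """OWA: apply the order weights to the values sorted in descending order."""
    return np.dot(weights, np.sort(values)[::-1])

def orness(weights):
    n = len(weights)
    return np.dot((n - np.arange(1, n + 1)) / (n - 1), weights)

criteria_scores = np.array([0.9, 0.4, 0.6, 0.7])      # scores of one alternative
for alpha in (0.5, 1.0, 2.0):                          # optimistic .. pessimistic
    w = owa_weights_quantifier(len(criteria_scores), alpha)
    print(alpha, round(owa(criteria_scores, w), 3), round(orness(w), 3))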

Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

2007-04-01

240

System-of-Systems Technology-Portfolio-Analysis Tool  

NASA Technical Reports Server (NTRS)

Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

2012-01-01

241

A Web-based Tool For The Analysis Of Concept Inventory Data  

Microsoft Academic Search

“FOCIA” stands for Free Online Concept Inventory Analyzer. FOCIA, our new web-based tool will allow teachers and researchers in any location to upload their test data and instantly receive a complete analysis report. Analyses included with this tool are basic test statistics, Traditional Item Analysis, Concentration Analysis, Model Analysis Theory results, pre and post test comparison, including the calculations of

Joseph P. Beuckman; Scott V. Franklin; Rebecca S. Lindell

2005-01-01

242

A Web-based Tool For The Analysis Of Concept Inventory Data  

Microsoft Academic Search

``FOCIA'' stands for Free Online Concept Inventory Analyzer. FOCIA, our new web-based tool will allow teachers and researchers in any location to upload their test data and instantly receive a complete analysis report. Analyses included with this tool are basic test statistics, Traditional Item Analysis, Concentration Analysis, Model Analysis Theory results, pre and post test comparison, including the calculations of

Joseph P. Beuckman; Scott V. Franklin; Rebecca S. Lindell

2005-01-01

243

Immunoglobulin Analysis Tool: A Novel Tool for the Analysis of Human and Mouse Heavy and Light Chain Transcripts  

PubMed Central

Sequence analysis of immunoglobulin (Ig) heavy and light chain transcripts can refine categorization of B cell subpopulations and can shed light on the selective forces that act during immune responses or immune dysregulation, such as autoimmunity, allergy, and B cell malignancy. High-throughput sequencing yields Ig transcript collections of unprecedented size. The authoritative web-based IMGT/HighV-QUEST program is capable of analyzing large collections of transcripts and provides annotated output files to describe many key properties of Ig transcripts. However, additional processing of these flat files is required to create figures, or to facilitate analysis of additional features and comparisons between sequence sets. We present an easy-to-use Microsoft® Excel® based software, named Immunoglobulin Analysis Tool (IgAT), for the summary, interrogation, and further processing of IMGT/HighV-QUEST output files. IgAT generates descriptive statistics and high-quality figures for collections of murine or human Ig heavy or light chain transcripts ranging from 1 to 150,000 sequences. In addition to traditionally studied properties of Ig transcripts – such as the usage of germline gene segments, or the length and composition of the CDR-3 region – IgAT also uses published algorithms to calculate the probability of antigen selection based on somatic mutational patterns, the average hydrophobicity of the antigen-binding sites, and predictable structural properties of the CDR-H3 loop according to Shirai’s H3-rules. These refined analyses provide in-depth information about the selective forces acting upon Ig repertoires and allow the statistical and graphical comparison of two or more sequence sets. IgAT is easy to use on any computer running Excel® 2003 or higher. Thus, IgAT is a useful tool to gain insights into the selective forces and functional properties of small to extremely large collections of Ig transcripts, thereby assisting a researcher to mine a data set to its fullest. PMID:22754554

Rogosch, Tobias; Kerzel, Sebastian; Hoi, Kam Hon; Zhang, Zhixin; Maier, Rolf F.; Ippolito, Gregory C.; Zemlin, Michael

2012-01-01

244

Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools  

SciTech Connect

The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualization), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

Barrand, Guy

2002-08-20

245

TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0  

NASA Technical Reports Server (NTRS)

The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.

Ortiz, C. J.

1994-01-01

246

Blind Poissonian images deconvolution with framelet regularization.  

PubMed

We propose a maximum a posteriori approach to blind deconvolution of Poissonian images, with framelet regularization for the image and total variation (TV) regularization for the point spread function. Compared with TV-based methods, our algorithm not only suppresses noise effectively but also recovers edges and detailed information. Moreover, the split Bregman method is exploited to solve the resulting minimization problem. Comparative results on both simulated and real images are reported. PMID:23455078

Fang, Houzhang; Yan, Luxin; Liu, Hai; Chang, Yi

2013-02-15

247

Regularized Blind Deconvolution with Poisson Data  

NASA Astrophysics Data System (ADS)

We propose easy-to-implement algorithms to perform blind deconvolution of nonnegative images in the presence of noise of Poisson type. Alternate minimization of a regularized Kullback-Leibler cost function is achieved via multiplicative update rules. The scheme allows us to prove convergence of the iterates to a stationary point of the cost function. Numerical examples are reported to demonstrate the feasibility of the proposed method.
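
The following is a minimal sketch of the kind of alternating multiplicative (Richardson-Lucy-type) update scheme described above, assuming circular convolution and omitting the regularization term of the Kullback-Leibler cost; it is not the authors' algorithm, only the unregularized skeleton of such a scheme.

```python
# Minimal sketch of alternating multiplicative (Richardson-Lucy-type) updates
# for blind deconvolution under Poisson noise. The paper minimizes a
# *regularized* Kullback-Leibler cost; the regularization term is omitted here,
# so this is only the unregularized skeleton of such a scheme. Circular
# convolution via the FFT is assumed, and the PSF is stored at full image size.
import numpy as np


def blind_rl(y, n_outer=30, n_inner=3, eps=1e-12):
    y = np.asarray(y, dtype=float)
    x = np.full_like(y, y.mean())               # flat nonnegative image start
    h = np.zeros_like(y)
    h[0, 0] = 1.0                               # delta-function PSF guess
    for _ in range(n_outer):
        H = np.fft.fft2(h)
        for _ in range(n_inner):                # image step, PSF fixed
            Ax = np.real(np.fft.ifft2(np.fft.fft2(x) * H)) + eps
            x *= np.real(np.fft.ifft2(np.fft.fft2(y / Ax) * np.conj(H)))
        X = np.fft.fft2(x)
        for _ in range(n_inner):                # PSF step, image fixed
            Ah = np.real(np.fft.ifft2(X * np.fft.fft2(h))) + eps
            h *= np.real(np.fft.ifft2(np.fft.fft2(y / Ah) * np.conj(X)))
            h = np.clip(h, 0, None)
            h /= h.sum()                        # keep PSF nonnegative, unit sum
    return x, h
```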

Lecharlier, Loïc; De Mol, Christine

2013-10-01

248

Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools  

PubMed Central

Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparisons of the archaeological record with extractive foraging behaviors in nonhuman primates have focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue has been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

2014-01-01

249

A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra  

NASA Astrophysics Data System (ADS)

A quantitative evaluation of various deconvolution methods and their applications in processing plasma emitted spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, Richardson-Lucy's method, the maximum a posteriori method and Gold's method. The evaluation criteria include minimization of the sum of squared errors and the sum of squared relative error of parameters, and their rate of convergence. After comparing deconvolved results using these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when considering all the criteria above. The applications to the actual plasma spectra obtained from the EAST tokamak with these methods are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that the effects of instruments can be satisfactorily eliminated and clear spectra are recovered.
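
For concreteness, the update rules of two of the iterative methods compared above (Gold's ratio method and Richardson-Lucy) can be written for a 1-D spectrum blurred by a known, normalized instrument function; the sketch below simply restates these textbook updates and is not the authors' implementation.

```python
# Hedged sketch of two of the iterative schemes named above, applied to a 1-D
# spectrum y blurred by a known instrument function h (assumed to sum to one).
# These are the textbook update rules, not the authors' code.
import numpy as np


def gold(y, h, n_iter=200, eps=1e-12):
    """Gold's ratio method: x <- x * y / (h * x)."""
    x = y.astype(float).copy()
    for _ in range(n_iter):
        x *= y / (np.convolve(x, h, mode="same") + eps)
    return x


def richardson_lucy(y, h, n_iter=200, eps=1e-12):
    """Richardson-Lucy: x <- x * corr(h, y / (h * x))."""
    x = y.astype(float).copy()
    h_flipped = h[::-1]
    for _ in range(n_iter):
        ratio = y / (np.convolve(x, h, mode="same") + eps)
        x *= np.convolve(ratio, h_flipped, mode="same")
    return x
```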

Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

2013-06-01

250

Genomic Resources and Tools for Gene Function Analysis in Potato  

PubMed Central

Potato, a highly heterozygous tetraploid, is undergoing an exciting phase of genomics resource development. The potato research community has established extensive genomic resources, such as large expressed sequence tag (EST) data collections, microarrays and other expression profiling platforms, and large-insert genomic libraries. Moreover, potato will now benefit from a global potato physical mapping effort, which is serving as the underlying resource for a full potato genome sequencing project, now well underway. These tools and resources are having a major impact on potato breeding and genetics. The genome sequence will provide an invaluable comparative genomics resource for cross-referencing to the other Solanaceae, notably tomato, whose sequence is also being determined. Most importantly perhaps, a potato genome sequence will pave the way for the functional analysis of the large numbers of potato genes that await discovery. Potato, being easily transformable, is highly amenable to the investigation of gene function by biotechnological approaches. Recent advances in the development of Virus Induced Gene Silencing (VIGS) and related methods will facilitate rapid progress in the analysis of gene function in this important crop. PMID:19107214

Bryan, Glenn J.; Hein, Ingo

2008-01-01

251

Advanced tools for astronomical time series and image analysis  

NASA Astrophysics Data System (ADS)

The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.

Scargle, Jeffrey D.

252

jSIPRO - analysis tool for magnetic resonance spectroscopic imaging.  

PubMed

Magnetic resonance spectroscopic imaging (MRSI) involves a huge number of spectra to be processed and analyzed. Several tools enabling MRSI data processing have been developed and widely used. However, the processing programs primarily focus on sophisticated spectra processing and offer limited support for the analysis of the calculated spectroscopic maps. In this paper the jSIPRO (java Spectroscopic Imaging PROcessing) program is presented, which is a java-based graphical interface enabling post-processing, viewing, analysis and result reporting of MRSI data. Interactive graphical processing as well as protocol-controlled batch processing are available in jSIPRO. jSIPRO does not contain a built-in fitting program. Instead, it makes use of fitting programs from third parties and manages the data flows. Currently, automatic spectra processing using the LCModel, TARQUIN and jMRUI programs is supported. Concentration and error values, fitted spectra, metabolite images and various parametric maps can be viewed for each calculated dataset. Metabolite images can be exported in the DICOM format either for archiving purposes or for use in neurosurgery navigation systems. PMID:23870172

Jiru, Filip; Skoch, Antonin; Wagnerova, Dita; Dezortova, Monika; Hajek, Milan

2013-10-01

253

Spectral Analysis Tool 6.2 for Windows  

NASA Technical Reports Server (NTRS)

Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.
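
As a rough numerical analogue of one SAT-style calculation, the sketch below estimates an interference-to-signal power ratio inside an assumed receiver passband from two sampled waveforms using Welch periodograms; the signal parameters, passband, and waveforms are invented for illustration and bear no relation to SAT's actual models.

```python
# Rough numerical analogue of one SAT-style calculation: estimating the
# interference-to-signal power ratio inside an assumed receiver passband from
# sampled waveforms, using Welch periodograms. All signal parameters, the
# passband, and the waveforms themselves are invented for illustration.
import numpy as np
from scipy.signal import welch

fs = 1e6                                          # sample rate, Hz (assumed)
t = np.arange(100_000) / fs
desired = np.cos(2 * np.pi * 100e3 * t)           # desired carrier at 100 kHz
interferer = 0.3 * np.cos(2 * np.pi * 103e3 * t)  # weaker interferer at 103 kHz

f, p_sig = welch(desired, fs=fs, nperseg=4096)
_, p_int = welch(interferer, fs=fs, nperseg=4096)

band = (f > 95e3) & (f < 105e3)                   # hypothetical receiver passband
i_over_s = np.trapz(p_int[band], f[band]) / np.trapz(p_sig[band], f[band])
print(f"in-band I/S: {10 * np.log10(i_over_s):.1f} dB")
```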

Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

2006-01-01

254

Fishing the target of antitubercular compounds: in silico target deconvolution model development and validation.  

PubMed

An in silico target prediction protocol for antitubercular (antiTB) compounds has been proposed in this work. This protocol is an extension of a recently published 'domain fishing model' (DFM), validating its predicted targets on a set of 42 common antitubercular drugs. For the 23 antiTB compounds of the set that are directly linked to targets (see text for definition), the DFM exhibited a very good target prediction accuracy of 95%. For the 19 compounds indirectly linked to targets, a reasonable pathway/embedded-pathway prediction accuracy of 84% was also achieved. Since mostly eukaryotic ligand-binding data were used for the DFM generation, the high target prediction accuracy for prokaryotes (which is an extrapolation from the training data) was unexpected and provides an additional proof of concept for the DFM. To estimate the general applicability of the model, a ligand-target coverage analysis was performed. It was found that, although the DFM only modestly covers the entire TB proteome (32% of all proteins), it captures 70% of the proteome subset targeted by 42 common antiTB compounds, which is in agreement with the good predictive ability of the DFM for the targets of the compounds chosen here. In a prospective validation, the model successfully predicted the targets of new antiTB compounds, CBR-2092 and Amiclenomycin. Together, these findings suggest that in silico target prediction tools may be a useful supplement to existing, experimental target deconvolution strategies. PMID:19301903

Prathipati, Philip; Ma, Ngai Ling; Manjunatha, Ujjini H; Bender, Andreas

2009-06-01

255

Study of academic achievements using spatial analysis tools  

NASA Astrophysics Data System (ADS)

In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid introduced three new degrees, all of them adapted to the European Higher Education Area, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study was carried out on their academic achievement with the aim of finding the level of dependence among the following variables: the final mark obtained in secondary studies, the option followed in secondary studies (Art, Science and Technology, or Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two annual sittings of that examination the mark was obtained. A second group of 77 students was evaluated independently of the former group; these were students who had entered the College in the previous academic year (2009/10) and decided to transfer to the new curricula. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all the students' secondary schools were geographically coded according to their type (public, private, and private subsidized) and fees. Each student was represented by an average geometric point so that it could be linked to their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, since it includes variables such as the distance from the student's home to the College, and it can serve as a tool to estimate the probability of success or failure for incoming students in the following academic years. Keywords: academic achievement, spatial analysis, GIS, Bologna.

González, C.; Velilla, C.; Sánchez-Girón, V.

2012-04-01

256

Mission operations data analysis tools for Mars Observer guidance and control  

NASA Technical Reports Server (NTRS)

Mission operations for the Mars Observer (MO) Project at the Jet Propulsion Laboratory were supported by a variety of ground data processing software and analysis tools. Some of these tools were generic to multimission spacecraft mission operations, some were specific to the MO spacecraft, and others were custom tailored to the operation and control of the Attitude and Articulation Control Subsystem (AACS). The focus of this paper is on the data analysis tools for the AACS. Four different categories of analysis tools are presented; with details offered for specific tools. Valuable experience was gained from the use of these tools and through their development. These tools formed the backbone and enhanced the efficiency of the AACS Unit in the Mission Operations Spacecraft Team. These same tools, and extensions thereof, have been adopted by the Galileo mission operations, and are being designed into Cassini and other future spacecraft mission operations.

Kan, Edwin P.

1994-01-01

257

Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution  

NASA Astrophysics Data System (ADS)

The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. The convolution process results in smoothed measurements and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of the sensor response and improve the resolution of continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm2 area on the cross section. The acquired sensor response reveals that cross terms (i.e. the response of the pick-up coil for one axis to magnetic signal along the other axes) that were often omitted in previous deconvolution practice are clearly not negligible. Utilizing the detailed estimate of the magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution in UDECON is achieved by searching for the minimum ABIC while shifting the sensor response (to account for possible mispositioning of the sample on the tray) and varying a smoothness parameter within ranges defined by the user. Comparison of deconvolution results using the sensor response estimated from integrated point source measurements and other methods suggests that the integrated point source estimate yields better results (smaller ABIC). The noise characteristics of magnetometer measurements and the reliability of the UDECON algorithm were tested using repeated natural remanence measurements (a total of 400 times) of a u-channel sample before and after stepwise alternating field demagnetization. Using a series of synthetic data constructed from a real paleomagnetic record, we demonstrate that optimized deconvolution using UDECON can greatly help reveal detailed paleomagnetic information such as excursions that may be smoothed out during pass-through measurement. Application of UDECON to the vast amount of existing and future pass-through paleomagnetic and rock magnetic measurements on sediments recovered especially through ocean drilling programs will contribute to our understanding of the geodynamo and the paleo-environment by providing more detailed records of geomagnetic and environmental changes.
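
A heavily simplified stand-in for the deconvolution step alone is sketched below: the measurement is modeled as m = G x + noise with G built from an assumed single-axis sensor response, and a plain Tikhonov solution with a second-difference smoothness penalty replaces UDECON's ABIC-optimised estimate. The ABIC search, the response-shift optimisation, and the cross terms discussed above are deliberately left out, and all names and numbers are hypothetical.

```python
# Heavily simplified stand-in for the deconvolution step only: the measurement
# m is modeled as m = G x + noise, where G holds an assumed single-axis sensor
# response, and a Tikhonov solution with a second-difference smoothness penalty
# replaces UDECON's ABIC-optimised estimate. Names and numbers are hypothetical.
import numpy as np


def tikhonov_deconvolve(m, G, lam):
    """Solve min ||G x - m||^2 + lam * ||D x||^2 with a second-difference D."""
    n = G.shape[1]
    D = np.diff(np.eye(n), 2, axis=0)
    return np.linalg.solve(G.T @ G + lam * D.T @ D, G.T @ m)


# Toy usage: a Gaussian "sensor response" smearing a spiky magnetization record.
pos = np.linspace(0.0, 40.0, 200)                   # cm along the u-channel
G = np.exp(-0.5 * ((pos[:, None] - pos[None, :]) / 2.5) ** 2)
G /= G.sum(axis=1, keepdims=True)                   # each row integrates to one
true = np.zeros_like(pos)
true[60] = 1.0
true[80:120] = 0.3
measured = G @ true + 0.01 * np.random.default_rng(0).normal(size=pos.size)
recovered = tikhonov_deconvolve(measured, G, lam=1e-2)
```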

Xuan, C.; Oda, H.

2013-12-01

258

Genetics and Population Analysis LOT: a Tool for Linkage Analysis of Ordinal Traits for Pedigree Data  

E-print Network

Summary: Existing linkage-analysis methods address binary or quantitative traits. However, many complex diseases and human conditions, particularly behavioral disorders, are rated on ordinal scales. Herein, we introduce LOT, a tool that performs linkage analysis of ordinal traits for pedigree data. It implements a latent-variable proportional-odds logistic model that relates inheritance patterns to the distribution of the ordinal trait. The likelihood-ratio test is used for testing evidence of linkage. Availability: The LOT program is available for download at

Meizhuo Zhang; Rui Feng; Xiang Chen; Buqu Hu; Heping Zhang

2008-01-01

259

Kriging the Fields: a New Statistical Tool for Wave Propagation Analysis  

E-print Network

Kriging the Fields: a New Statistical Tool for Wave Propagation Analysis Ph. De Doncker, J analysis is presented based on the spatial statistics tools known as kriging and variographic analysis 91192, Gif-sur-Yvette, France, e-mail: Marc.Helier@supelec.fr, tabbara@lss.supelec.fr Kriging has
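
Since the snippet above is fragmentary, the sketch below only illustrates the generic ordinary-kriging interpolation such an analysis builds on, with an assumed exponential variogram and made-up sample values; it is not taken from the paper.

```python
# Purely illustrative ordinary-kriging interpolation of a scalar field sampled
# at scattered points, with an assumed exponential variogram; parameter values
# and sample data are made up and not taken from the excerpted paper.
import numpy as np


def ordinary_kriging(xy, z, xy_new, sill=1.0, corr_range=50.0, nugget=1e-3):
    def gamma(h):                          # exponential variogram model
        return np.where(h == 0.0, 0.0,
                        nugget + sill * (1.0 - np.exp(-h / corr_range)))

    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, :] = 1.0
    A[:, n] = 1.0
    A[n, n] = 0.0                          # Lagrange-multiplier row/column
    b = np.append(gamma(np.linalg.norm(xy - xy_new, axis=1)), 1.0)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)                # kriged estimate at xy_new


pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
vals = np.array([1.0, 0.6, 0.8, 0.3])
print(ordinary_kriging(pts, vals, np.array([5.0, 5.0])))
```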

Libre de Bruxelles, Université

260

Analysis of the influence of tool dynamics in diamond turning  

SciTech Connect

This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

1988-12-01

261

Thermal Management Tools for Propulsion System Trade Studies and Analysis  

NASA Technical Reports Server (NTRS)

Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

McCarthy, Kevin; Hodge, Ernie

2011-01-01

262

SEM analysis as a diagnostic tool for photovoltaic cell degradation  

NASA Astrophysics Data System (ADS)

The importance of scanning electron microscopy (SEM) analysis as a diagnostic tool for analyzing the degradation of a polycrystalline photovoltaic cell has been studied. The main aim of this study is to characterize the surface morphology of hot-spot (degraded) regions of cells in photovoltaic solar cells. In recent years, production of hetero- and multi-junction solar cells has experienced tremendous growth as compared to conventional silicon (Si) solar cells. Thin-film photovoltaic solar cells generally are more prone to exhibiting defects and associated degradation modes. To improve the lifetime of these cells and modules, it is imperative to fully understand the cause and effect of defects and degradation modes. The objective of this paper is to diagnose the observed degradation in polycrystalline silicon cells using scanning electron microscopy (SEM). In this study poly-Si cells were characterized before and after reverse biasing; the reverse biasing was done to evaluate the cells' susceptibility to leakage currents and hot-spot formation. After reverse biasing, some cells were found to exhibit hotspots, as confirmed by infrared thermography. The surface morphology of these hotspots re

Osayemwenre, Gilbert; Meyer, E. L.

2013-04-01

263

Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions  

E-print Network

Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions Dung V hack tools. However, beside U3 technology, attackers also have another more flexible alternative, portable application or application virtualization, which allows a wide range of hack tools to be compiled

Halgamuge, Malka N.

264

Vesta and HED Meteorites: Determining Minerals and their Abundances with Mid-IR Spectral Deconvolution I  

NASA Astrophysics Data System (ADS)

We identify the known mineral compositions and abundances of laboratory samples of Howardite, Eucrite and Diogenite (HED) meteorites (Salisbury et al. 1991, Icarus 9, 280-297) using an established spectral deconvolution algorithm (Ramsey, 1996 Ph.D. Dissertation, ASU; Ramsey and Christensen 1998, JGR 103, 577-596) for mid-infrared spectral libraries of mineral separates of varying grain sizes. Most notably, the spectral deconvolution algorithm fit the known plagioclase and pyroxene compositions for all of the HED meteorite spectra determined by laboratory analysis. Our results for the HED samples give us a high degree of confidence that our results are valid and that the spectral deconvolution algorithm is viable. Mineral compositions and abundances are also determined using the same technique for one possible HED parent body, Vesta, using mid-infrared spectra that were obtained from ground-based telescopes (Sprague et al. 1993, A.S.P. 41; Lim et al. 2005, Icarus 173, 385-408) and the Infrared Space Observatory (ISO) (Dotto et al. 2000, A&A 358, 1133-1141). Mid-infrared spectra of Vesta come from different areas on its surface. The ISO Vesta spectral deconvolution is suggestive of troilite, olivine, augite, chromite, wollastonite, and sodalite at one location. Modeling of other locations is underway. We also were successful in modeling spectra from locations on the Moon where no Apollo samples are available and for several locations on Mercury's surface using the same techniques (see lunar and mercurian abstracts, this meeting). These results demonstrate promise for the spectral deconvolution method to correctly make mineral identifications on remotely observed objects, in particular main-belt asteroids, the Moon, and Mercury. This work was funded by NSF AST0406796.
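
The generic linear "spectral deconvolution" referred to above amounts to fitting a measured spectrum as a non-negative linear combination of library end-member spectra; the sketch below shows that standard non-negative least-squares formulation on synthetic data and is not the Ramsey/Christensen implementation.

```python
# Sketch of the generic linear "spectral deconvolution" idea: model a measured
# spectrum as a non-negative linear combination of library end-member spectra
# and read the abundances off the coefficients. This is the standard NNLS
# formulation with synthetic data, not the Ramsey/Christensen implementation.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
E = np.abs(rng.normal(size=(500, 6)))            # stand-in library: 6 end-members
true_frac = np.array([0.5, 0.2, 0.3, 0.0, 0.0, 0.0])
y = E @ true_frac + 0.01 * rng.normal(size=500)  # synthetic mixed spectrum

coeffs, misfit = nnls(E, y)
abundances = coeffs / coeffs.sum()               # normalize to fractional abundances
print(np.round(abundances, 3), "rms misfit:", misfit / np.sqrt(len(y)))
```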

Hanna, Kerri D.; Sprague, A. L.

2007-10-01

265

Experimental analysis of change detection algorithms for multitooth machine tool fault detection  

NASA Astrophysics Data System (ADS)

This paper describes an industrial application of a fault diagnosis method for a multitooth machine tool. Different statistical approaches have been used to detect and diagnose insert breakage in multitooth tools based on the analysis of the electrical power consumption of the tool drives. Great effort has been made to obtain a robust method able to avoid any re-calibration process after, for example, a maintenance operation. From the point of view of maintenance costs, these multitooth tools are the most critical part of the machine tools used for mass production in the car industry. These tools integrate different kinds of machining operations and cutting conditions.
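
As one concrete example of a change-detection statistic of the kind such studies evaluate, the sketch below applies a two-sided CUSUM to a drive-power series; the baseline window, reference value, and threshold are assumptions and do not reproduce the authors' algorithms.

```python
# Hedged illustration of one generic change-detection statistic (a two-sided
# CUSUM on drive power) of the kind evaluated in such studies; the baseline
# window, reference value k, and threshold h are assumptions, not the authors'.
import numpy as np


def cusum_alarm(power, k=0.5, h=5.0, baseline=50):
    """Return the first index at which a shift in mean power is flagged, else None."""
    x = np.asarray(power, dtype=float)
    mu = x[:baseline].mean()                 # baseline statistics from healthy cuts
    sigma = x[:baseline].std() + 1e-9
    z = (x - mu) / sigma
    g_up = g_down = 0.0
    for i, zi in enumerate(z):
        g_up = max(0.0, g_up + zi - k)       # accumulates upward drift
        g_down = max(0.0, g_down - zi - k)   # accumulates downward drift
        if g_up > h or g_down > h:
            return i
    return None
```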

Reñones, Aníbal; de Miguel, Luis J.; Perán, José R.

2009-10-01

266

Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools  

NASA Astrophysics Data System (ADS)

Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are really scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is built around a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency management to know in advance the different levels of risk perception and preparedness existing among several sectors of the population. Knowing where the most vulnerable population is located may optimize the use of resources, better direct the initial efforts, and organize the evacuation and attention procedures. As part of the CBEWS, a comprehensive survey was applied in the study area to measure, among other features, the levels of risk perception, preparation and information received about natural hazards. After a statistical and direct analysis of the complete social dataset recorded, the spatial distribution of this information is currently being mapped. Based on the boundary features (municipalities and sub-districts) of the Italian Institute of Statistics (ISTAT), a local-scale background has been adopted (address-level detail is not accessible for privacy reasons, so the local district ID inside each municipality is the finest level used), and the spatial location of the surveyed population has been completed. The geometric component has been defined, and it is now possible to map the local distribution of the social parameters derived from the perception questionnaires. The raw information and the social-statistical analyses offer different views and "visual concepts" of risk perception; for this reason, a complete GeoDB is being built to organize the dataset. From a technical point of view, the environment for data sharing is based on an entirely open-source web-service environment, offering a hand-crafted, user-friendly interface to this kind of information. The final aim is to offer different views of the dataset, using the same scale prototype and hierarchical data structure, in order to present and compare the social distribution of risk perception at the most detailed level.

Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

2010-05-01

267

Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis  

USGS Publications Warehouse

Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

2009-01-01

268

Ultrasound medical image deconvolution using CLEAN L.-T. Chiraa  

E-print Network

Ultrasound medical image deconvolution using CLEAN algorithm L.-T. Chiraa , J.-M. Giraultb , T reconstruction problem of ultrasound medical images using blind deconvolution algorithm has been recognized tissues scatters number. 1 Introduction The medical diagnostic using ultrasounds has intensively used

Boyer, Edmond

269

Robust wavelet estimation and blind deconvolution of noisy surface seismics  

E-print Network

- tained an optimal deconvolution output using Wiener filtering. The new procedure performs well, even ... the idea is to construct the deconvolution filter so that the output is maximally non-Gaussian (Donoho, 1981) ... of bandwidth, the Wiggins algorithm breaks down (Longbottom et al., 1988; White, 1988). Now industry practice
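
To make the Wiener-filtering step mentioned in this excerpt concrete, the sketch below performs a regularized frequency-domain deconvolution of a trace given a known (or previously estimated) wavelet; the robust wavelet-estimation stage that is the paper's actual contribution is not shown, and the noise-to-signal level is an assumption.

```python
# Minimal frequency-domain Wiener deconvolution of a trace y = w * r + noise
# with a known (or previously estimated) wavelet w; offered only to make the
# Wiener-filtering step concrete. The noise-to-signal level is an assumption.
import numpy as np


def wiener_deconvolve(y, w, noise_to_signal=1e-2):
    n = len(y)
    W = np.fft.rfft(w, n)
    Y = np.fft.rfft(y, n)
    filt = np.conj(W) / (np.abs(W) ** 2 + noise_to_signal)   # regularized inverse
    return np.fft.irfft(Y * filt, n)
```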

van der Baan, Mirko

270

Mineral abundance determination: Quantitative deconvolution of thermal emission spectra  

Microsoft Academic Search

A linear retrieval (spectral deconvolution) algorithm is developed and applied to high-resolution laboratory infrared spectra of particulate mixtures and their end- members. The purpose is to place constraints on, and test the viability of, linear spectral deconvolution of high-resolution emission spectra. The effects of addition of noise, data reproducibility, particle size variation, an increasing number of minerals in the mixtures,

Michael S. Ramsey; Philip R. Christensen

1998-01-01

271

The Mission Planning Lab: A Visualization and Analysis Tool  

Microsoft Academic Search

Simulation and visualization are powerful decision making tools that are time saving and cost effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called

S. C. Daugherty; B. W. Cervantes

2009-01-01

272

Analysis and simulation tools for solar array power systems  

Microsoft Academic Search

This dissertation presents simulation tools developed specifically for the design of solar array power systems. Contributions are made in several aspects of the system design phases, including solar source modeling, system simulation, and controller verification. A tool to automate the study of solar array configurations using general purpose circuit simulators has been developed based on the modeling of individual solar

Nattorn Pongratananukul

2005-01-01

273

3D finite element analysis of tool wear in machining  

Microsoft Academic Search

The paper is focused on the 3D numerical prediction of tool wear in metal cutting operations. In particular, an analytical model, able to take into account the diffusive wear mechanism, was implemented through a specific subroutine. Furthermore, an advanced approach to model heat transfer phenomena at the tool–chip interface was included in the numerical simulation. The adopted simulation strategy gave

A. Attanasio; E. Ceretti; S. Rizzuti; D. Umbrello; F. Micari

2008-01-01

274

Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis  

NASA Astrophysics Data System (ADS)

The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited, as they were produced by simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel-bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas have been cleaned and impregnated with a two-component glue (EPOTEK 301). For approx. 10 * 10 cm, a volume of mixed glue of 20 ml was required. This low-viscosity, transparent glue allowed for an easy application on the target area by means of a syringe and permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method makes it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). After that, selected areas were investigated through thin-section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

Klapper, D.; Kueppers, U.; Castro, J. M.

2009-12-01

275

Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis  

NASA Astrophysics Data System (ADS)

The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited, as they were produced by simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel-bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas have been cleaned and impregnated with two-component glue (EPOTEK 301). For approx. 10 * 10 cm, a volume of mixed glue of 20 ml was required. Using a syringe, this low-viscosity, transparent glue could be easily applied on the target area. We found that the glue permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method makes it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). After that, selected areas were investigated through thin-section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.

2010-05-01

276

Constrained variable projection method for blind deconvolution  

NASA Astrophysics Data System (ADS)

This paper is focused on the solution of the blind deconvolution problem, here modeled as a separable nonlinear least squares problem. The well-known ill-posedness, both in recovering the blurring operator and the true image, makes the problem really difficult to handle. We show that, by imposing appropriate constraints on the variables and with well-chosen regularization parameters, it is possible to obtain an objective function that is fairly well behaved. Hence, the resulting nonlinear minimization problem can be effectively solved by classical methods, such as the Gauss-Newton algorithm.
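
A schematic variable-projection loop for such a separable problem is sketched below: the blur is parameterized by a single nonlinear width parameter, the linearly-entering signal is eliminated by a regularized least-squares solve, and only the reduced residual is passed to a Gauss-Newton-type solver. The Gaussian blur model, Tikhonov term, and bounds are assumptions made for illustration, not the formulation of the paper.

```python
# Schematic variable-projection loop for a separable problem: the blur is
# parameterized by one nonlinear width parameter, the linearly-entering signal
# is eliminated by a regularized least-squares solve, and only the reduced
# residual is handed to a Gauss-Newton-type solver. The Gaussian blur model,
# Tikhonov term, and bounds are assumptions made for illustration.
import numpy as np
from scipy.optimize import least_squares


def blur_matrix(sigma, n):
    """Dense 1-D Gaussian blurring matrix (toy size)."""
    i = np.arange(n)
    K = np.exp(-0.5 * ((i[:, None] - i[None, :]) / sigma) ** 2)
    return K / K.sum(axis=1, keepdims=True)


def reduced_residual(theta, y, lam=1e-3):
    """Eliminate the linear variable for the current blur parameter theta[0]."""
    K = blur_matrix(theta[0], len(y))
    x = np.linalg.solve(K.T @ K + lam * np.eye(len(y)), K.T @ y)
    return K @ x - y


n = 100
true = np.zeros(n)
true[40:60] = 1.0
y = blur_matrix(3.0, n) @ true + 0.01 * np.random.default_rng(2).normal(size=n)
fit = least_squares(reduced_residual, x0=[1.5], args=(y,), bounds=(0.5, 10.0))
print("estimated blur width:", fit.x[0])
```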

Cornelio, A.; Loli Piccolomini, E.; Nagy, J. G.

2012-09-01

277

Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics  

SciTech Connect

The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

Jodoin, Vincent J [ORNL]; Lee, Ronald W [ORNL]; Peplow, Douglas E. [ORNL]; Lefebvre, Jordan P [ORNL]

2011-01-01

278

AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool  

SciTech Connect

Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

Keith J. Halford

2009-10-01

279

General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft  

NASA Technical Reports Server (NTRS)

The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user-definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

Dove, Edwin; Hughes, Steve

2007-01-01

280

New Geant4 based simulation tools for space radiation shielding and effects analysis  

Microsoft Academic Search

We present here a set of tools for space applications based on the Geant4 simulation toolkit, developed for radiation shielding analysis as part of the European Space Agency (ESA) activities in the Geant4 collaboration. The Sector Shielding Analysis Tool (SSAT) and the Materials and Geometry Association (MGA) utility will first be described. An overview of the main features of the

G. Santina; P. Nieminen; H. Evansa; E. Daly; F. Lei; P. R. Truscott; C. S. Dyer; B. Quaghebeur; D. Heynderickx

2003-01-01

281

SEE: a tool for the visualization and analysis of rodent exploratory behavior  

E-print Network

for the exploration of exploratory behavior. The raw data for SEE are a time series of the animal's coordinates. SEE: a tool for the visualization and analysis of rodent exploratory behavior Dan Drai, Ilan Golani behavior creates a need for a visualization and analysis tool that will highlight regularities and help

Golani, Ilan

282

MIRAGE: A Management Tool for the Analysis and Deployment of Network Security Policies  

E-print Network

MIRAGE: A Management Tool for the Analysis and Deployment of Network Security Policies Joaquin.surname@telecom-bretagne.eu Abstract. We present the core functionality of MIRAGE, a management tool for the analysis and deployment policies into network security component configurations. In both cases, MIRAGE provides intra

Garcia-Alfaro, Joaquin

283

TOOL-ASSISTED MULTI-FACET ANALYSIS OF FORMAL SPECIFICATIONS (USING ATELIER-B AND PROB)  

E-print Network

TOOL-ASSISTED MULTI-FACET ANALYSIS OF FORMAL SPECIFICATIONS (USING ATELIER-B AND PROB) Christian specification. The B method and the Atelier-B tool are used for formal specifications, for safety property to discover errors and therefore to improve the former specifications. KEY WORDS Formal Analysis, B Method

Paris-Sud XI, Université de

284

Tools for Scalable Parallel Program Analysis - Vampir VNG and DeWiz  

Microsoft Academic Search

Large scale high-performance computing systems pose a tough obstacle for today's program analysis tools. Their demands in computational performance and memory capacity for processing program analysis data exceed the capabilities of standard workstations and traditional analysis tools. The sophisticated approaches of Vampir NG (VNG) and the Debugging Wizard DeWiz intend to provide novel ideas for scalable parallel program analysis. While

Holger Brunst; Dieter Kranzlmüller; Wolfgang E. Nagel

2004-01-01

285

FAILURE PREDICTION AND STRESS ANALYSIS OF MICROCUTTING TOOLS  

E-print Network

-based products are limited because silicon is brittle. Products can be made from other engineering materials and need to be machined in microscale. This research deals with predicting microtool failure by studying spindle runout and tool deflection effects...

Chittipolu, Sujeev

2010-07-14

286

Evaluation of a Surface Exploration Traverse Analysis and Navigation Tool  

E-print Network

SEXTANT is an extravehicular activity (EVA) mission planner tool developed in MATLAB, which computes the most efficient path between waypoints across a planetary surface. The traverse efficiency can be optimized around ...

Gilkey, Andrea L.

287

Novel tools for sequence and epitope analysis of glycosaminoglycans  

E-print Network

Our understanding of glycosaminoglycan (GAG) biology has been limited by a lack of sensitive and efficient analytical tools designed to deal with these complex molecules. GAGs are heterogeneous and often sulfated linear ...

Behr, Jonathan Robert

2007-01-01

288

Tools for scalable performance analysis on Petascale systems  

Microsoft Academic Search

Tools are becoming increasingly important to efficiently utilize the computing power available in contemporary large-scale systems. The drastic increase in the size and complexity of these systems requires tools to be scalable while producing meaningful and easily digestible information that may help the user pinpoint problems at scale. The goal of this tutorial is to introduce some state-of-the-art

I-hsin Chung; Seetharami R. Seelam; Bernd Mohr; Jesús Labarta

2009-01-01

289

ExperiScope: an analysis tool for interaction data  

Microsoft Academic Search

ABSTRACT Wepresent ExperiScope, an analytical tool to help designers,and ,experimenters ,explore ,the ,results of quantitative,evaluations ,of interaction ,techniques. ExperiScope combines ,a new ,visualization incorporating aspects of the ,KLM and the three-state model ,with an interface helping users to rapidly cluster similar patterns of interactions. The tool makes it easy to identify and compare key patterns of use encountered during data

François Guimbretière; Morgan Dixon; Ken Hinckley

2007-01-01

290

pathFinder: A Static Network Analysis Tool for Pharmacological Analysis of Signal Transduction Pathways  

NSDL National Science Digital Library

The study of signal transduction is becoming a de facto part of the analysis of gene expression and protein profiling techniques. Many online tools are used to cluster genes in various ways or to assign gene products to signal transduction pathways. Among these, pathFinder is a unique tool that can find signal transduction pathways between first, second, or nth messengers and their targets within the cell. pathFinder can identify qualitatively all possible signal transduction pathways connecting any starting component and target within a database of two-component pathways (directional dyads). One or more intermediate pathway components can be excluded to simulate the use of pharmacological inhibitors or genetic deletion (knockout). Missing elements in a pathway connecting the activator or initiator and target can also be inferred from a null pathway result. The value of this static network analysis tool is illustrated by the prediction from pathFinder analysis of a novel cyclic AMP–dependent, protein kinase A–independent signaling pathway in neuroendocrine cells, which has been experimentally confirmed.
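
The core operation such a tool performs can be pictured as enumerating all simple directed paths between a starting messenger and a target over a database of directional dyads, optionally excluding components to mimic an inhibitor or knockout. The sketch below does exactly that on an invented toy pathway set; it is not pathFinder's code, and the dyads shown are illustrative only.

```python
# Sketch of the core operation such a tool performs: enumerate all simple
# directed paths between a starting messenger and a target over a database of
# two-component "directional dyads", optionally excluding components to mimic a
# pharmacological inhibitor or knockout. The dyads below are an invented toy
# example, not pathFinder's database.
from collections import defaultdict


def all_paths(dyads, source, target, excluded=frozenset(), max_len=8):
    graph = defaultdict(list)
    for upstream, downstream in dyads:
        if upstream not in excluded and downstream not in excluded:
            graph[upstream].append(downstream)

    paths, stack = [], [(source, [source])]
    while stack:                                  # depth-first enumeration
        node, path = stack.pop()
        for nxt in graph[node]:
            if nxt in path or len(path) >= max_len:
                continue                          # keep paths simple and short
            if nxt == target:
                paths.append(path + [nxt])
            else:
                stack.append((nxt, path + [nxt]))
    return paths


dyads = [("cAMP", "PKA"), ("cAMP", "Epac"), ("Epac", "Rap1"),
         ("PKA", "CREB"), ("Rap1", "ERK"), ("ERK", "CREB")]
print(all_paths(dyads, "cAMP", "CREB", excluded={"PKA"}))    # simulated PKA knockout
```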

Babru B. Samal (NIH; National Institute of Mental Health--Intramural Research Programs (NIMH-IRP) Bioinformatics Core REV)

2008-08-05

291

Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches  

NASA Astrophysics Data System (ADS)

We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
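
To make the geometry side of this approach concrete, the sketch below evaluates B-spline basis functions with the Cox-de Boor recursion and combines them with weights as a rational (NURBS) mapping onto an arch-like curve; no analysis step (stiffness assembly, boundary conditions) is attempted, and the knot vector, control points, and weights are arbitrary choices for illustration.

```python
# Tiny illustration of the geometry side: evaluate B-spline basis functions via
# the Cox-de Boor recursion and combine them with weights as a rational (NURBS)
# mapping onto an arch-like curve. Knot vector, control points, and weights are
# arbitrary choices; no stiffness assembly or boundary conditions are attempted.
import numpy as np


def bspline_basis(i, p, u, knots):
    """Value of the i-th degree-p B-spline basis function at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
            * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
            * bspline_basis(i + 1, p - 1, u, knots)
    return left + right


knots = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]             # one quadratic segment
ctrl = np.array([[-1.0, 0.0], [0.0, 2.0], [1.0, 0.0]])
weights = np.array([1.0, 0.5, 1.0])
u = 0.3
N = np.array([bspline_basis(i, 2, u, knots) for i in range(3)])
point = (N * weights) @ ctrl / (N @ weights)       # rational (NURBS) mapping
print(point)
```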

Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

2014-12-01

292

Teaching Advanced Data Analysis Tools to High School Astronomy Students  

NASA Astrophysics Data System (ADS)

A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed instruction tailored to their experience level along with proper support and mentoring. This project was funded by a grant from the National Science Foundation, Grant # PHY1157078.

Black, David V.; Herring, Julie; Hintz, Eric G.

2015-01-01

293

Pressure transient testing and productivity analysis for horizontal wells  

E-print Network

, a "blind" deconvolution method was developed to restore the pressure response free of wellbore storage distortion, and to detect the afterflow/unloading rate function using Fourier analysis of the observed pressure data. This new deconvolution method...

Cheng, Yueming

2004-11-15

294

A computational tool for ionosonde CADI's ionogram analysis  

NASA Astrophysics Data System (ADS)

The purpose of this work is to present a new computational tool for ionograms generated with a Canadian Advanced Digital Ionosonde (CADI). This new tool uses the fuzzy relation paradigm to identify the F trace and from it extract the parameters foF2, h'F, and hpF2. The tool was extensively tested with ionosondes that operate at low latitudes and near the equatorial region. The ionograms used in this work were recorded at São José dos Campos (23.2° S, 45.9° W; dip latitude 17.6° S) and Palmas (10.2° S, 48.2° W; dip latitude 5.5° S). These automatically extracted ionospheric parameters were compared with those obtained manually and a good agreement was found. The developed tool will greatly expedite and standardize ionogram processing. Therefore, this new tool will facilitate the exchange of information among the many groups that operate ionosondes of the CADI type, and will be very helpful for space weather purposes.

Pillat, Valdir Gil; Guimarães, Lamartine Nogueira Frutuoso; Fagundes, Paulo Roberto; da Silva, José Demísio Simões

2013-03-01

295

Heuristic charge assignment for deconvolution of electrospray ionization mass spectra.  

PubMed

We propose a new algorithm for deconvolution of electrospray ionization mass spectra based on direct assignment of charge to the measured signal at each mass-to-charge ratio (m/z). We investigate two heuristics for charge assignment: the entropy-based heuristic is adapted from a deconvolution algorithm by Reinhold and Reinhold; the multiplicative-correlation heuristic is adapted from the multiplicative-correlation deconvolution algorithm of Hagen and Monnig. The entropy-based heuristic is insensitive to overestimates of z_max, the maximum ion charge. We test the deconvolution algorithm on two single-component samples: the measured spectrum of human beta-endorphin has two prominent and one very weak line, whereas myoglobin has a well-developed quasi-Gaussian envelope of 17 peaks. In both cases, the deconvolution algorithm gives a clean deconvoluted spectrum with one dominant peak and very few artefacts. The relative heights of the peaks due to the parent molecules in the deconvoluted spectrum of a mixture of two peptides, which are expected to ionize with equal efficiency, give an accurate measure of their relative concentration in the sample. PMID:12590391
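
As a rough illustration of what "direct assignment of charge" can look like in practice, the sketch below scores each m/z point against candidate charges 1..z_max by checking how much intensity appears where the same neutral mass would show up at the other charge states, then accumulates a zero-charge spectrum. It is a generic toy version, not the paper's entropy-based or multiplicative-correlation heuristics; the proton mass constant, mass grid, and z_max default are assumptions.

```python
# Toy direct-charge-assignment deconvolution (illustrative only; not the
# entropy-based or multiplicative-correlation heuristics of the paper).
import numpy as np

PROTON = 1.00728  # proton mass in Da

def charge_deconvolve(mz, intensity, z_max=30, mass_grid=None):
    """mz must be sorted ascending; returns a zero-charge (neutral mass) spectrum."""
    if mass_grid is None:
        mass_grid = np.arange(1000.0, 25000.0, 1.0)   # assumed mass range, 1 Da bins
    zero_charge = np.zeros_like(mass_grid)
    charges = np.arange(1, z_max + 1)
    for x, y in zip(mz, intensity):
        if y <= 0:
            continue
        scores = []
        for z in charges:
            M = z * (x - PROTON)                      # neutral mass if this point had charge z
            support_mz = M / charges + PROTON         # where that mass appears at every charge
            scores.append(np.interp(support_mz, mz, intensity, left=0.0, right=0.0).sum())
        z_best = charges[int(np.argmax(scores))]      # heuristic charge assignment
        j = np.searchsorted(mass_grid, z_best * (x - PROTON))
        if j < mass_grid.size:
            zero_charge[j] += y                       # accumulate intensity at the assigned mass
    return mass_grid, zero_charge
```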

Zheng, Huiru; Ojha, Piyush C; McClean, Stephen; Black, Norman D; Hughes, John G; Shaw, Chris

2003-01-01

296

Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).  

PubMed

This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings. PMID:18406925

Rath, Frank

2008-01-01

297

An Evaluation of Visual and Textual Network Analysis Tools  

SciTech Connect

User testing is an integral component of user-centered design, but has only rarely been applied to visualization for cyber security applications. This article presents the results of a comparative evaluation between a visualization-based application and a more traditional, table-based application for analyzing computer network packet captures. We conducted this evaluation as part of the user-centered design process. Participants performed both structured, well-defined tasks and exploratory, open-ended tasks with both tools. We measured accuracy and efficiency for the well-defined tasks, counted the number of insights for the exploratory tasks, and recorded user perceptions for each tool. The results of this evaluation demonstrated that users performed significantly more accurately on the well-defined tasks, discovered a higher number of insights, and showed a clear preference for the visualization tool. The study design presented may be useful for future researchers performing user testing on visualization for cyber security applications.

Goodall, John R. [ORNL]

2011-01-01

298

Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique  

ERIC Educational Resources Information Center

A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…

Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

2005-01-01

299

Towards robust deconvolution of low-dose perfusion CT: sparse perfusion deconvolution using online dictionary learning.  

PubMed

Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameters estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance than existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422
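
The dictionary-learning prior at the heart of this approach can be illustrated with scikit-learn: learn patch atoms from a high-dose map, then sparse-code the patches of a noisy low-dose map in that dictionary. This is only the regularization ingredient (the coupling with the deconvolution-based hemodynamic estimation is not shown), and the random images, patch size, and sparsity settings are placeholders.

```python
# Sketch of the dictionary prior: learn atoms from a "high-dose" map and
# sparse-code a noisy "low-dose" map (regularization step only).
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d

rng = np.random.default_rng(0)
high_dose = rng.random((64, 64))                               # stand-in for a high-dose CBF map
low_dose = high_dose + 0.1 * rng.standard_normal((64, 64))     # noisy low-dose surrogate

patch_size = (8, 8)
train = extract_patches_2d(high_dose, patch_size, max_patches=2000, random_state=0)
train = train.reshape(len(train), -1)
train -= train.mean(axis=1, keepdims=True)

dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0, random_state=0)
dico.fit(train)                                                # learn patch atoms

patches = extract_patches_2d(low_dose, patch_size)
flat = patches.reshape(len(patches), -1)
means = flat.mean(axis=1, keepdims=True)
codes = dico.transform(flat - means)                           # sparse codes (OMP by default)
denoised = (codes @ dico.components_ + means).reshape(patches.shape)
restored = reconstruct_from_patches_2d(denoised, low_dose.shape)
```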

Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C

2013-05-01

300

Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs  

SciTech Connect

In imaging applications that focus on quantitative analysis--such as X-ray radiography in the security sciences--it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
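
A stripped-down version of the idea, transplanted to a small 1-D problem, is sketched below: a Gaussian likelihood, a simple smoothness prior in place of the paper's total-variation/Poisson model, and a random-walk Metropolis sampler whose pointwise posterior standard deviation serves as the uncertainty map. All sizes, noise levels, and weights are illustrative.

```python
# Sketch of sampling-based UQ for a small 1-D deconvolution problem.
import numpy as np

rng = np.random.default_rng(1)
n = 60
x_true = np.zeros(n); x_true[20:35] = 1.0                      # simple edge-like signal
kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2); kernel /= kernel.sum()

def blur(x):
    return np.convolve(x, kernel, mode="same")

sigma = 0.02
y = blur(x_true) + sigma * rng.standard_normal(n)

lam = 50.0                                                     # smoothness weight (assumed)
def log_post(x):
    misfit = y - blur(x)
    rough = np.diff(x)
    return -0.5 * np.sum(misfit ** 2) / sigma ** 2 - 0.5 * lam * np.sum(rough ** 2)

x = y.copy()
lp = log_post(x)
samples, step = [], 0.02
for it in range(20000):                                        # random-walk Metropolis
    prop = x + step * rng.standard_normal(n)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        x, lp = prop, lp_prop
    if it > 5000 and it % 10 == 0:
        samples.append(x.copy())

samples = np.array(samples)
post_mean = samples.mean(axis=0)
post_std = samples.std(axis=0)                                 # pointwise uncertainty
```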

Howard, M. [NSTec]; Luttman, A. [NSTec]; Fowler, M. [NSTec]

2014-11-01

301

A requirements analysis for videogame design support tools  

Microsoft Academic Search

Designing videogames involves weaving together systems of rules, called game mechanics, which support and structure compelling player experiences. Thus a significant portion of game design involves reasoning about the effects of different potential game mechanics on player experience. Unlike some design fields, such as architecture and mechanical design, that have CAD tools to support designers

Mark J. Nelson; Michael Mateas

2009-01-01

302

Usability tool for analysis of web designs using mouse tracks  

Microsoft Academic Search

This paper presents MouseTrack as a web logging system that tracks mouse movements on websites. The system includes a visualization tool that displays the mouse cursor path followed by website visitors. It helps web site administrators run usability tests and analyze the collected data. Practitioners can track any existing webpage by simply entering its URL. This paper includes a design

Ernesto Arroyo; Ted Selker; Willy Wei

2006-01-01

303

Residual Stress Field Analysis and Prediction in Nitrided Tool Steel  

Microsoft Academic Search

Residual stresses are present in engineering components as an unintended consequence of manufacturing processes, but they are also deliberately introduced to beneficial effect during surface engineering procedures. Plasma nitriding is a process of particular importance for forming tools and dies, giving significant advantages in wear and fatigue resistance through the generation of near-surface compressive residual stresses. A precise knowledge of

B. Podgornik; V. Leskovšek; M. Kova?i?; J. Vižintin

2011-01-01

304

PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis  

ERIC Educational Resources Information Center

This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

Waycott, Jenny; Jones, Ann; Scanlon, Eileen

2005-01-01

305

Economic analysis and optimization of tool portfolio in semiconductor manufacturing  

Microsoft Academic Search

The tool portfolio of a plant refers to the makeup, in quantity and type, of processing machines in the plant. It is determined by taking into consideration the future trends of process and machine technologies and the forecasts of product evolution and product demands. Portfolio planning is also a multicriteria decision-making task involving tradeoffs among investment cost, throughput, cycle time,

Yon-Chun Chou; Chuan-Shun Wu

2002-01-01

306

INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION  

EPA Science Inventory

Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

307

TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data  

PubMed Central

High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/. PMID:23408855

Fimereli, Danai; Detours, Vincent; Konopka, Tomasz

2013-01-01

308

Multiscale Analysis of Surface Topography from Single Point Incremental Forming using an Acetal Tool  

NASA Astrophysics Data System (ADS)

Single point incremental forming (SPIF) is a sheet metal manufacturing process that forms a part by incrementally applying point loads to the material to achieve the desired deformations and final part geometry. This paper investigates the differences in surface topography between a carbide tool and an acetal-tipped tool. Area-scale analysis is performed on the confocal areal surface measurements per ASME B46. The objective of this paper is to determine at which scales surfaces formed by two different tool materials can be differentiated. It is found that the surfaces in contact with the acetal forming tool have greater relative areas at all scales greater than 5 × 10⁴ µm² than the surfaces in contact with the carbide tools. The surfaces not in contact with the tools during forming, also referred to as the free surface, are unaffected by the tool material.

Ham, M.; Powers, B. M.; Loiselle, J.

2014-03-01

309

Deconvolution of mixed magnetism in multilayer graphene  

SciTech Connect

Magnetic properties of graphite modified at the edges by KCl and exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using low field-high field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge state spins and the interaction between them decides the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.

Swain, Akshaya Kumar [IITB-Monash Research Academy, Department of Metallurgical Engineering and Materials Science, IIT Bombay, Mumbai 400076 (India)]; Bahadur, Dhirendra, E-mail: dhirenb@iitb.ac.in [Department of Metallurgical Engineering and Materials Science, IIT Bombay, Mumbai 400076 (India)]

2014-06-16

310

Multichannel blind deconvolution using low rank recovery  

NASA Astrophysics Data System (ADS)

We introduce a new algorithm for multichannel blind deconvolution. Given the outputs of K linear time- invariant channels driven by a common source, we wish to recover their impulse responses without knowledge of the source signal. Abstractly, this problem amounts to finding a solution to an overdetermined system of quadratic equations. We show how we can recast the problem as solving a system of underdetermined linear equations with a rank constraint. Recent results in the area of low rank recovery have shown that there are effective convex relaxations to problems of this type that are also scalable computationally, allowing us to recover 100s of channel responses after a moderate observation time. We illustrate the effectiveness of our methodology with a numerical simulation of a passive "noise imaging" experiment.
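
For the two-channel case, the classical cross-relation formulation gives a concrete feel for why the problem becomes an overdetermined linear system: y1*h2 - y2*h1 = 0, so the stacked impulse responses span the null space of a structured matrix and can be read off its smallest singular vector (up to a common scale). The sketch below implements that baseline, not the paper's low-rank/nuclear-norm relaxation; the channel length and signal sizes are assumptions.

```python
# Sketch: two-channel cross-relation identification via the smallest singular vector.
import numpy as np
from scipy.linalg import toeplitz, svd

rng = np.random.default_rng(0)
L = 8                                       # channel length (assumed known)
s = rng.standard_normal(200)                # unknown common source
h1, h2 = rng.standard_normal(L), rng.standard_normal(L)
y1, y2 = np.convolve(s, h1), np.convolve(s, h2)

def conv_matrix(y, L):
    """Matrix C with C @ h == np.convolve(y, h) for len(h) == L."""
    col = np.concatenate([y, np.zeros(L - 1)])
    return toeplitz(col, np.zeros(L))

# y1*h2 - y2*h1 = 0, so [C(y1), -C(y2)] @ [h2; h1] = 0: take the right singular
# vector belonging to the smallest singular value (the estimate is up to scale).
A = np.hstack([conv_matrix(y1, L), -conv_matrix(y2, L)])
_, _, Vt = svd(A, full_matrices=False)
v = Vt[-1]
h2_est, h1_est = v[:L], v[L:]

scale = h1[np.argmax(np.abs(h1))] / h1_est[np.argmax(np.abs(h1))]
print(np.max(np.abs(h1 - scale * h1_est)), np.max(np.abs(h2 - scale * h2_est)))
```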

Romberg, Justin; Tian, Ning; Sabra, Karim

2013-05-01

311

Quantitative deconvolution of human thermal infrared emittance.  

PubMed

The bioheat transfer models conventionally employed in etiology of human thermal infrared (TIR) emittance rely upon two assumptions: universal graybody emissivity and significant transmission of heat from subsurface tissue layers. In this work, a series of clinical and laboratory experiments were designed and carried out to conclusively evaluate the validity of the two assumptions. Results obtained from the objective analyses of TIR images of human facial and tibial regions demonstrated significant variations in spectral thermophysical properties at different anatomic locations on the human body. The limited validity of the two assumptions signifies the need for quantitative deconvolution of human TIR emittance in clinical, psychophysiological and critical applications. A novel approach to joint inversion of the bioheat transfer model is also introduced, leveraging the deterministic temperature-dependency of proton resonance frequency in low-lipid human soft tissue for characterizing the relationship between subsurface 3D tissue temperature profiles and corresponding TIR emittance. PMID:23086533

Arthur, D T J; Khan, M M

2013-01-01

312

An advanced image analysis tool for the quantification and characterization of breast cancer in microscopy images.  

PubMed

The paper presents an advanced image analysis tool for the accurate and fast characterization and quantification of cancer and apoptotic cells in microscopy images. The proposed tool utilizes adaptive thresholding and a Support Vector Machines classifier. The segmentation results are enhanced through a Majority Voting and a Watershed technique, while an object labeling algorithm has been developed for the fast and accurate validation of the recognized cells. Expert pathologists evaluated the tool and the reported results are satisfying and reproducible. PMID:25681102

Goudas, Theodosios; Maglogiannis, Ilias

2015-03-01

313

Dynamics analysis of a 5-UPS/PRPU parallel machine tool  

Microsoft Academic Search

Abstract—This paper presents the dynamics analysis of a novel 5-UPS/PRPU 5-degree-of-freedom parallel machine tool. The characteristic of the parallel machine tool is described. The mathematical model of the dynamics of the parallel machine tool is set up by using Lagrangian formulations, and the formulation is implemented with MATLAB software for studying the inverse dynamics through some typical motions of the parallel machine

Yongsheng Zhao; Yulei Hou; Yi Shi; Ling Lu

2007-01-01

314

CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS  

EPA Science Inventory

Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

315

Forensic Analysis of Windows Hosts Using UNIX-based Tools  

SciTech Connect

Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

Cory Altheide

2004-07-19

316

Computational mechanics analysis tools for parallel-vector supercomputers  

NASA Technical Reports Server (NTRS)

Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

1993-01-01

317

A tool for analysis and classification of sleep stages  

Microsoft Academic Search

Scoring sleep stages is a critical process in assessing several sleep studies and slumber disorders. Sleep is classified in two major states: non-rapid-eye-movement (non-REM) sleep and REM sleep. Non-REM sleep comprises stages N1, N2 and N3. We develop a tool for automatically scoring the stages of sleep following the rules of the 2007 AASM (American Academy of Sleep Medicine). The

Quoc Khai Le; Quang Dang Khoa Truong; Van Toi Vo

2011-01-01

318

The Mission Planning Lab: A Visualization and Analysis Tool  

NASA Technical Reports Server (NTRS)

Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

Daugherty, Sarah C.; Cervantes, Benjamin W.

2009-01-01

319

Extended duration human-robot interaction: Tools and analysis  

Microsoft Academic Search

Extended human-robot interactions possess unique aspects which are not exhibited in short-term interactions spanning a few minutes or extremely long-term spanning days. In order to comprehensively monitor such interactions, we need special recording mechanisms which ensure the interaction is captured at multiple spatio-temporal scales, viewpoints and modalities (audio, video, physio). To minimize cognitive burden, we need tools which can automate the

Ravi Kiran Sarvadevabhatla; Victor Ng-Thow-Hing; Sandra Okita

2010-01-01

320

Analysis tools for the calibration and commissioning of the AOF  

NASA Astrophysics Data System (ADS)

The Adaptive Optics Facility (AOF) is an AO-oriented upgrade envisaged to be implemented at UT4 in Paranal in 2013-2014, and which could serve as a test case for the E-ELT. Counting on the largest Deformable Secondary Mirror ever built (1170 actuators) and on four off-axis Na laser launch telescopes, the AOF will operate in distinct modes (GLAO, LTAO, SCAO), in accordance with the instruments attached to the two telescope Nasmyth ports (GALACSI+MUSE, GRAAL+HAWK-I) and to the Cassegrain port (ERIS). Tools are under development to allow fast testing of important parameters for these systems during commissioning and for subsequent assessment of telemetry data. These concern the determination of turbulence parameters and Cn2 profiling, measurement of Strehl and ensquared energies, misregistration calculation, bandwidth and overall performance, etc. Our tools are presented as Graphical User Interfaces developed in the Matlab environment, and will be able to grab, through a dedicated server, data saved in SPARTA standards. We present here the tools developed to date and discuss details of what can be obtained from the AOF, based on simulations.

Garcia-Rissmann, Aurea; Kolb, Johann; Le Louarn, Miska; Madec, Pierre-Yves; Muller, Nicolas

2013-12-01

321

LC-TOOL-2003-015 Java Physics Generator and Analysis Modules  

E-print Network

LC-TOOL-2003-015 Java Physics Generator and Analysis Modules. Michael T. Ronan (LBNL, Berkeley, CA). ...physics event generators are used in defining a common generator interface package. Portable libraries provide high-level OO study tools. Complete physics generation, parallel detector simulations

322

Building an Exploratory Visual Analysis Tool for Qualitative Researchers Tanuka Bhowmick  

E-print Network

Building an Exploratory Visual Analysis Tool for Qualitative Researchers. Tanuka Bhowmick. Geo..., Vancouver, WA, June 26-28, 2006. Abstract: Qualitative research is an integral part of both academic research in various disciplines... newer and more sophisticated exploratory tools that assist qualitative

Klippel, Alexander

323

Striation Patterns Classification of Tool Marks Based on Extended Fractal Analysis  

Microsoft Academic Search

Currently, optical devices such as microscopes and CCD cameras are utilized for identification of tool marks in the field of forensic science, which mainly depends on the experience of forensic scientists. A new approach using extended fractal analysis technology to classify tool marks such as striation patterns is presented. It computes four directional multi-scale extended fractal parameters and the maximum

Min Yang; Dong-Yun Li; Li Mou; Wei-Dong Wang

2008-01-01

324

A User-Friendly Self-Similarity Analysis Tool Thomas Karagiannis, Michalis Faloutsos, Mart Molle  

E-print Network

and the absence of publicly available software. In this paper, we introduce SELFIS, a comprehensive tool for an in-depth LRD analysis, including several LRD estimators. In addition, SELFIS includes a powerful ap... acquired SELFIS within a month of its release, which clearly demonstrates the need for such a tool

Molle, Mart

325

Analysis of Mutation Testing Tools Johnathan Snyder, Department of Computer Science, University of Alabama  

E-print Network

Analysis of Mutation Testing Tools. Johnathan Snyder, Department of Computer Science, University of Alabama. Conclusion: In my research, I studied three mutation testing tools for Java: MuJava, Jumble, and PIT. All of them use byte-level mutation, which speeds up the time it takes to generate the mutants and run

Gray, Jeffrey G.

326

BLAST: at the core of a powerful and diverse set of sequence analysis tools  

E-print Network

BLAST: at the core of a powerful and diverse set of sequence analysis tools. Scott Mc..., ...Institutes of Health, Building 38A, 8600 Rockville Pike, Bethesda, MD 20894, USA. Received February 20, 2004; Revised April 2, 2004; Accepted April 14, 2004. ABSTRACT: Basic Local Alignment Search Tool (BLAST) is one

Narasimhan, Giri

327

Machine Tool Accuracy Analysis M.A.Sc. Candidate: Ricky Chan  

E-print Network

Machine Tool Accuracy Analysis. M.A.Sc. Candidate: Ricky Chan. Supervisor: Dr. Stephen Veldhuis. ...coordination between machine tool axes. This research studies and characterizes the positional error in a 3... Abstract: CNC machining is an essential part of almost all manufacturing industries. Machine accuracy

Bone, Gary

328

BUILDING AN URBAN ENERGY PERFORMANCE FRAMEWORK: INTEGRATING SPATIAL ANALYSIS AND BUILDING SIMULATION TOOLS FOR CAMPUS PLANNING  

Microsoft Academic Search

The tools that currently benchmark energy consumption beyond the building level are limited. This paper describes a framework utilizing simulation and spatial analysis tools to identify a credible set of campus energy performance indicators integrating both the building and site levels and taking into account the spatial arrangement surrounding each building. The research method proposes a series of simulation experiments

Khaled A. Tarabieh; Ali M. Malkawi

329

Comparative study of some methods in blind deconvolution  

E-print Network

This study presents some techniques used in Blind Deconvolution with emphasis on applications to digital communications. The literature contains many algorithms developed and tested in different situations, but very limited research was conducted...

Mbarek, Kais

1995-01-01

330

Blind Deconvolution for Ultrasound Sequences Using a Noninverse Greedy Algorithm  

PubMed Central

The blind deconvolution of ultrasound sequences in medical ultrasound imaging is still a major problem despite the efforts made. This paper presents a blind noninverse deconvolution algorithm to eliminate the blurring effect, using the envelope of the acquired radio-frequency sequences and an a priori Laplacian distribution for the deconvolved signal. The algorithm is executed in two steps. First, the point spread function is automatically estimated from the measured data. Second, the data are reconstructed in a nonblind way using the proposed algorithm. The algorithm is a nonlinear blind deconvolution which works as a greedy algorithm. The results on simulated signals and real images are compared with different state-of-the-art deconvolution methods. Our method shows good results for scatterer detection, speckle noise suppression, and execution time. PMID:24489533

Chira, Liviu-Teodor; Rusu, Corneliu; Tauber, Clovis; Girault, Jean-Marc

2013-01-01

331

Blind deconvolution for ultrasound sequences using a noninverse greedy algorithm.  

PubMed

The blind deconvolution of ultrasound sequences in medical ultrasound imaging is still a major problem despite the efforts made. This paper presents a blind noninverse deconvolution algorithm to eliminate the blurring effect, using the envelope of the acquired radio-frequency sequences and an a priori Laplacian distribution for the deconvolved signal. The algorithm is executed in two steps. First, the point spread function is automatically estimated from the measured data. Second, the data are reconstructed in a nonblind way using the proposed algorithm. The algorithm is a nonlinear blind deconvolution which works as a greedy algorithm. The results on simulated signals and real images are compared with different state-of-the-art deconvolution methods. Our method shows good results for scatterer detection, speckle noise suppression, and execution time. PMID:24489533

Chira, Liviu-Teodor; Rusu, Corneliu; Tauber, Clovis; Girault, Jean-Marc

2013-01-01

332

The discrete Kalman filtering approach for seismic signals deconvolution  

SciTech Connect

Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is deconvolution; deconvolution is an inverse filtering process based on Wiener filter theory. This theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is therefore used to generate an estimate of the reflectivity function. The main advantages of Kalman filtering are its ability to handle continually time-varying models and its high resolution. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering operates on the reflectivity function, so the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with an energy waveform referred to as the seismic wavelet. A higher wavelet frequency gives a smaller wavelength; graphs of these results are presented.
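
One way to see how a discrete Kalman filter can act as a deconvolution operator is to put the recent reflectivity samples into the state vector and treat the seismic trace as the wavelet-weighted observation of that state. The sketch below does exactly that for a synthetic trace; the wavelet, noise levels, and reflectivity statistics are made up, and the "primitive deconvolution" pre-step described in the abstract is not reproduced.

```python
# Sketch: Kalman filtering of a synthetic trace with the reflectivity as state.
import numpy as np

rng = np.random.default_rng(0)
L = 20
t = np.arange(L)
wavelet = np.exp(-0.5 * ((t - 6) / 2.0) ** 2) * np.cos(0.9 * (t - 6))   # assumed wavelet

n = 300
reflectivity = (rng.random(n) < 0.05) * rng.standard_normal(n)          # sparse reflectors
trace = np.convolve(reflectivity, wavelet)[:n] + 0.02 * rng.standard_normal(n)

# State x_k = [r_k, r_{k-1}, ..., r_{k-L+1}]: shift dynamics, process noise enters
# only the newest sample; measurement z_k = wavelet . x_k + noise.
F = np.eye(L, k=-1)
H = wavelet.reshape(1, L)
Q = np.zeros((L, L)); Q[0, 0] = 1.0        # assumed reflectivity variance
R = np.array([[0.02 ** 2]])

x, P = np.zeros((L, 1)), np.eye(L)
r_est = np.zeros(n)
for k in range(n):
    x, P = F @ x, F @ P @ F.T + Q                            # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (trace[k] - H @ x)                           # update with sample k
    P = (np.eye(L) - K @ H) @ P
    if k >= 6:
        r_est[k - 6] = x[6, 0]   # fixed-lag estimate of r_{k-6} (wavelet peaks ~6 samples in)
```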

Kurniadi, Rizal; Nurhandoko, Bagus Endar B. [Department of Physics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung (Indonesia)]

2012-06-20

333

The discrete Kalman filtering approach for seismic signals deconvolution  

NASA Astrophysics Data System (ADS)

Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is deconvolution; deconvolution is an inverse filtering process based on Wiener filter theory. This theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is therefore used to generate an estimate of the reflectivity function. The main advantages of Kalman filtering are its ability to handle continually time-varying models and its high resolution. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering operates on the reflectivity function, so the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with an energy waveform referred to as the seismic wavelet. A higher wavelet frequency gives a smaller wavelength; graphs of these results are presented.

Kurniadi, Rizal; Nurhandoko, Bagus Endar B.

2012-06-01

334

Information-Theoretic Deconvolution Approximation of Treatment Effect Distribution  

E-print Network

of the proposed deconvolution estimator. This method is applied to data from the U.S. Job Training Partnership Act... of the treatment effect distribution. However, the average treatment effect may mask important distributional

Perloff, Jeffrey M.

335

Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution  

NASA Technical Reports Server (NTRS)

A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors as embodied in one preferred embodiment in a global positioning system receiver. The method and receiver of the present invention estimate, and then compensate for, multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)

1999-01-01

336

Two-dimensional blind Bayesian deconvolution of medical ultrasound images  

Microsoft Academic Search

A new approach to 2-D blind deconvolution of ultrasonic images in a Bayesian framework is presented. The radio-frequency image data are modeled as a convolution of the point-spread function and the tissue function, with additive white noise. The deconvolution algorithm is derived from statistical assumptions about the tissue function, the point-spread function, and the noise. It is solved as an

R. Jirik; T. Taxt

2008-01-01

337

Deconvolution of long-pulse lidar signals with matrix formulation.  

PubMed

A deconvolution technique for deriving more resolved signals from lidar signals with typical CO2 laser pulses is proposed, utilizing special matrices constructed from the temporal profile of laser pulses. It is shown that near-range signals can be corrected and small-scale variations of backscattered signals can be retrieved with this technique. Deconvolution errors as a result of noise in lidar data and in the laser pulse profile are also investigated numerically by computer simulation. PMID:18259329
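
The matrix formulation amounts to building a (lower-triangular Toeplitz) convolution matrix from the sampled pulse profile and inverting it with some regularization. Below is a minimal sketch with a made-up long-tailed pulse, a made-up backscatter profile, and an arbitrary Tikhonov weight standing in for whatever regularization the paper actually uses.

```python
# Sketch: convolution matrix from the pulse profile + Tikhonov-regularized inversion.
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)
n = 400
t = np.arange(n, dtype=float)
pulse = t[:40] * np.exp(-t[:40] / 8.0); pulse /= pulse.sum()            # long-tailed pulse (assumed)
backscatter = np.exp(-t / 150.0) * (1 + 0.3 * np.sin(t / 7.0))          # assumed true profile

A = toeplitz(np.concatenate([pulse, np.zeros(n - pulse.size)]), np.zeros(n))
measured = A @ backscatter + 1e-3 * rng.standard_normal(n)

lam = 1e-2                                                              # regularization weight (assumed)
recovered = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ measured)
```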

Park, Y J; Dho, S W; Kong, H J

1997-07-20

338

Mineral abundance determination: Quantitative deconvolution of thermal emission spectra  

Microsoft Academic Search

A linear retrieval (spectral deconvolution) algorithm is developed and applied to high-resolution laboratory infrared spectra of particulate mixtures and their end-members. The purpose is to place constraints on, and test the viability of, linear spectral deconvolution of high-resolution emission spectra. The effects of addition of noise, data reproducibility, particle size variation, an increasing number of minerals in the mixtures, and
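
The core of such a linear retrieval is an abundance estimate obtained by (typically non-negative) least squares against an end-member library. A minimal sketch with synthetic spectra in place of laboratory data:

```python
# Sketch: non-negative linear spectral unmixing of a mixture against end-members.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_bands, n_endmembers = 200, 4
E = np.abs(rng.standard_normal((n_bands, n_endmembers)))   # end-member spectra (columns, synthetic)
true_abund = np.array([0.5, 0.3, 0.2, 0.0])
mixture = E @ true_abund + 0.01 * rng.standard_normal(n_bands)

abund, residual_norm = nnls(E, mixture)                    # non-negative least squares fit
abund = abund / abund.sum()                                # normalized fractional abundances
```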

Michael S. Ramsey; Philip R. Christensen

1998-01-01

339

Process-oriented evaluation of user interactions in integrated system analysis tools  

E-print Network

When computer-based tools are used for analysis of complex systems, the design of user interactions and interfaces becomes an essential part of development that determines the overall quality. The objective of this study ...

Lee, Chaiwoo

340

Development of Statistical Energy Analysis Tools for Toyota Motor Engineering & Manufacturing  

E-print Network

Development of Statistical Energy Analysis Tools for Toyota Motor Engineering & Manufacturing Duke University | Bass Connections in Energy IETC | May 21, 2014 Jason Chen, Robert Collins, Gary Gao, Daniel Schaffer, Jill Wu ESL-IE-14...

Chen, J; Collins, Ro.; Gao, G.; Schaffer, D.; Wu, J.

2014-01-01

341

The Ribosomal Database Project: improved alignments and new tools for rRNA analysis.  

PubMed

The Ribosomal Database Project (RDP) provides researchers with quality-controlled bacterial and archaeal small subunit rRNA alignments and analysis tools. An improved alignment strategy uses the Infernal secondary structure aware aligner to provide a more consistent higher quality alignment and faster processing of user sequences. Substantial new analysis features include a new Pyrosequencing Pipeline that provides tools to support analysis of ultra high-throughput rRNA sequencing data. This pipeline offers a collection of tools that automate the data processing and simplify the computationally intensive analysis of large sequencing libraries. In addition, a new Taxomatic visualization tool allows rapid visualization of taxonomic inconsistencies and suggests corrections, and a new class Assignment Generator provides instructors with a lesson plan and individualized teaching materials. Details about RDP data and analytical functions can be found at http://rdp.cme.msu.edu/. PMID:19004872

Cole, J R; Wang, Q; Cardenas, E; Fish, J; Chai, B; Farris, R J; Kulam-Syed-Mohideen, A S; McGarrell, D M; Marsh, T; Garrity, G M; Tiedje, J M

2009-01-01

342

Tool supported reliability analysis of finite-source retrial queues Janos Sztrik  

E-print Network

state. Some numerical examples are given demonstrating the effect of failure and repair rates of the server... to the original system specification and hiding details of the analysis techniques. Such tools map the model

Sztrik, János

343

Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool  

SciTech Connect

The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model.

Johnson, P.E. [Oak Ridge National Lab., TN (United States)]; Lester, P.B. [Dept. of Energy Oak Ridge Operations, TN (United States)]

1998-05-01

344

The Ribosomal Database Project: improved alignments and new tools for rRNA analysis  

PubMed Central

The Ribosomal Database Project (RDP) provides researchers with quality-controlled bacterial and archaeal small subunit rRNA alignments and analysis tools. An improved alignment strategy uses the Infernal secondary structure aware aligner to provide a more consistent higher quality alignment and faster processing of user sequences. Substantial new analysis features include a new Pyrosequencing Pipeline that provides tools to support analysis of ultra high-throughput rRNA sequencing data. This pipeline offers a collection of tools that automate the data processing and simplify the computationally intensive analysis of large sequencing libraries. In addition, a new Taxomatic visualization tool allows rapid visualization of taxonomic inconsistencies and suggests corrections, and a new class Assignment Generator provides instructors with a lesson plan and individualized teaching materials. Details about RDP data and analytical functions can be found at http://rdp.cme.msu.edu/. PMID:19004872

Cole, J. R.; Wang, Q.; Cardenas, E.; Fish, J.; Chai, B.; Farris, R. J.; Kulam-Syed-Mohideen, A. S.; McGarrell, D. M.; Marsh, T.; Garrity, G. M.; Tiedje, J. M.

2009-01-01

345

EZ and GOSSIP, two new VO compliant tools for spectral analysis  

NASA Astrophysics Data System (ADS)

We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and Integral Science Data Center (Geneve). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.

Franzetti, P.; Garill, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.

2008-10-01

346

Support Vector Regression for Censored Data (SVRc): A Novel Tool for Survival Analysis  

Microsoft Academic Search

A crucial challenge in predictive modeling for survival analysis is managing censored observations in the data. The Cox proportional hazards model is the standard tool for the analysis of continuous censored survival data. We propose a novel machine learning algorithm, support vector regression for censored data (SVRc) for improved analysis of medical survival data. SVRc leverages the high-dimensional capabilities of

Faisal M. Khan; Valentina Bayer Zubek

2008-01-01

347

Tools for multivariable spectral and coherence analysis Author: Tryphon Georgiou report: February 2007  

E-print Network

resolution tools for use in sensor networks and arrays, data mining, and spectral analysis. Signal analysis tools recently developed have been applied to non-invasive temperature sensing via ultrasound... A fundamental issue discussed in this report is the question of how to quantify uncertainty in spectral analysis

Georgiou, Tryphon T.

348

METABONOMICS AS A CLINICAL TOOL OF ANALYSIS, LCMS APPROACH  

Microsoft Academic Search

Metabolic differences between test and control groups (i.e., metabonomics) are routinely assessed by using multivariate analysis of data obtained commonly from NMR, GC-MS and LC-MS. Multivariate analysis (e.g., principal component analysis, PCA) is commonly used to extract potential metabolites responsible for clinical observations. Metabonomics applied to the clinical field is challenging because physiological variabilities like gender, age, race, etc. might

Muhammed Alzweiri; David Watson; John Parkinson

2012-01-01

349

Computational mechanics analysis tools for parallel-vector supercomputers  

NASA Technical Reports Server (NTRS)

Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

1993-01-01

350

Chemometric deconvolution of gas chromatographic unresolved conjugated linoleic acid isomers triplet in milk samples.  

PubMed

A generally known problem of the GC separation of the trans-7,cis-9; cis-9,trans-11; and trans-8,cis-10 CLA (conjugated linoleic acid) isomers was studied by GC-MS on a 100 m capillary column coated with a cyanopropyl silicone phase at isothermal column temperatures in the range of 140-170 degrees C. The resolution of these CLA isomers obtained at the given conditions was not high enough for direct quantitative analysis, but it was, however, sufficient for the determination of their peak areas by commercial deconvolution software. Resolution factors of the overlapped CLA isomers, determined by the separation of a model CLA mixture prepared by mixing a commercial CLA mixture and a CLA isomer fraction obtained by HPLC semi-preparative separation of milk fatty acid methyl esters, were used to validate the deconvolution procedure. The developed deconvolution procedure allowed the determination of the content of the studied CLA isomers in ewes' and cows' milk samples, where the dominant isomer cis-9,trans-11 is eluted between two small isomers, trans-7,cis-9 and trans-8,cis-10 (in a ratio up to 1:100). PMID:19056089
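
Commercial packages aside, the basic peak-deconvolution step can be mimicked by fitting a sum of three Gaussians to the unresolved region and integrating each component. The sketch below does this on a synthetic triplet; the retention times, peak widths, and the roughly 1:100 amplitude ratio are illustrative, not the paper's calibration.

```python
# Sketch: fit three overlapping Gaussians and integrate each component.
import numpy as np
from scipy.optimize import curve_fit

def triplet(t, a1, c1, w1, a2, c2, w2, a3, c3, w3):
    g = lambda a, c, w: a * np.exp(-0.5 * ((t - c) / w) ** 2)
    return g(a1, c1, w1) + g(a2, c2, w2) + g(a3, c3, w3)

t = np.linspace(0.0, 10.0, 2000)                              # retention time axis (min, arbitrary)
rng = np.random.default_rng(0)
truth = (0.01, 4.6, 0.12, 1.00, 4.8, 0.12, 0.01, 5.0, 0.12)   # small / dominant / small peaks
signal = triplet(t, *truth) + 0.001 * rng.standard_normal(t.size)

p0 = (0.05, 4.55, 0.1, 0.8, 4.8, 0.1, 0.05, 5.05, 0.1)        # rough initial guesses
popt, _ = curve_fit(triplet, t, signal, p0=p0)
areas = [popt[i] * abs(popt[i + 2]) * np.sqrt(2 * np.pi) for i in (0, 3, 6)]   # Gaussian peak areas
```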

Blasko, Jaroslav; Kubinec, Róbert; Ostrovský, Ivan; Pavlíková, Eva; Krupcík, Ján; Soják, Ladislav

2009-04-01

351

Fast, Automated Implementation of Temporally Precise Blind Deconvolution of Multiphasic Excitatory Postsynaptic Currents  

PubMed Central

Records of excitatory postsynaptic currents (EPSCs) are often complex, with overlapping signals that display a large range of amplitudes. Statistical analysis of the kinetics and amplitudes of such complex EPSCs is nonetheless essential to the understanding of transmitter release. We therefore developed a maximum-likelihood blind deconvolution algorithm to detect exocytotic events in complex EPSC records. The algorithm is capable of characterizing the kinetics of the prototypical EPSC as well as delineating individual release events at higher temporal resolution than other extant methods. The approach also accommodates data with low signal-to-noise ratios and those with substantial overlaps between events. We demonstrated the algorithm’s efficacy on paired whole-cell electrode recordings and synthetic data of high complexity. Using the algorithm to align EPSCs, we characterized their kinetics in a parameter-free way. Combining this approach with maximum-entropy deconvolution, we were able to identify independent release events in complex records at a temporal resolution of less than 250 µs. We determined that the increase in total postsynaptic current associated with depolarization of the presynaptic cell stems primarily from an increase in the rate of EPSCs rather than an increase in their amplitude. Finally, we found that fluctuations owing to postsynaptic receptor kinetics and experimental noise, as well as the model dependence of the deconvolution process, explain our inability to observe quantized peaks in histograms of EPSC amplitudes from physiological recordings. PMID:22761670
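
A much simpler relative of this idea is deconvolution-based event detection with a *known* kernel: Wiener-deconvolve the current trace with an assumed biexponential EPSC shape and threshold the local maxima of the result to obtain event times. The sketch below shows only that generic step on synthetic data; the paper's maximum-likelihood blind estimation of the kernel and its maximum-entropy refinement are not reproduced, and the sampling rate, time constants, amplitudes, and SNR weighting are all assumptions.

```python
# Sketch: Wiener deconvolution with an assumed EPSC kernel + threshold detection.
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-4                                            # 10 kHz sampling (assumed)
t = np.arange(0.0, 0.5, dt)

kernel = np.exp(-t / 3e-3) - np.exp(-t / 0.3e-3)     # biexponential EPSC shape (assumed taus)
kernel /= kernel.max()

events = np.sort(rng.choice(t.size - 1000, size=40, replace=False))
amps = 50e-12 * (1 + 0.5 * rng.random(events.size))  # ~50-75 pA events
rate = np.zeros(t.size); rate[events] = amps
trace = np.convolve(rate, kernel)[:t.size] + 1e-12 * rng.standard_normal(t.size)

# Wiener deconvolution in the frequency domain
K, Y = np.fft.rfft(kernel), np.fft.rfft(trace)
snr = 1e2                                            # assumed signal-to-noise weighting
deconv = np.fft.irfft(Y * np.conj(K) / (np.abs(K) ** 2 + 1.0 / snr), n=t.size)

# threshold local maxima of the deconvolved trace to localize release events
noise_std = np.std(deconv[deconv < np.percentile(deconv, 90)])
peaks = np.flatnonzero((deconv[1:-1] > 5 * noise_std) &
                       (deconv[1:-1] >= deconv[:-2]) &
                       (deconv[1:-1] >= deconv[2:])) + 1
```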

Andor-Ardó, Daniel; Keen, Erica C.; Hudspeth, A. J.; Magnasco, Marcelo O.

2012-01-01

352

Fast, automated implementation of temporally precise blind deconvolution of multiphasic excitatory postsynaptic currents.  

PubMed

Records of excitatory postsynaptic currents (EPSCs) are often complex, with overlapping signals that display a large range of amplitudes. Statistical analysis of the kinetics and amplitudes of such complex EPSCs is nonetheless essential to the understanding of transmitter release. We therefore developed a maximum-likelihood blind deconvolution algorithm to detect exocytotic events in complex EPSC records. The algorithm is capable of characterizing the kinetics of the prototypical EPSC as well as delineating individual release events at higher temporal resolution than other extant methods. The approach also accommodates data with low signal-to-noise ratios and those with substantial overlaps between events. We demonstrated the algorithm's efficacy on paired whole-cell electrode recordings and synthetic data of high complexity. Using the algorithm to align EPSCs, we characterized their kinetics in a parameter-free way. Combining this approach with maximum-entropy deconvolution, we were able to identify independent release events in complex records at a temporal resolution of less than 250 µs. We determined that the increase in total postsynaptic current associated with depolarization of the presynaptic cell stems primarily from an increase in the rate of EPSCs rather than an increase in their amplitude. Finally, we found that fluctuations owing to postsynaptic receptor kinetics and experimental noise, as well as the model dependence of the deconvolution process, explain our inability to observe quantized peaks in histograms of EPSC amplitudes from physiological recordings. PMID:22761670

Andor-Ardó, Daniel; Keen, Erica C; Hudspeth, A J; Magnasco, Marcelo O

2012-01-01

353

Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.  

PubMed

Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution, due to the atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line-of-sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimate the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Monte Carlo Markov chain algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) or not (estimation). The performances of the methods are compared with classical ones, on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172
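
For a single spatial pixel, the estimation part of the problem reduces to sampling the amplitude, centre, and width of a Gaussian emission line. The sketch below does this with a plain random-walk Metropolis sampler and flat, bounded priors; it ignores the spatial blurring (so it is "estimation", not "deconvolution" in the paper's terminology), and the wavelength grid, noise level, and step sizes are assumptions.

```python
# Sketch: MCMC estimation of one Gaussian emission line for a single spaxel.
import numpy as np

rng = np.random.default_rng(0)
wave = np.linspace(6540.0, 6590.0, 120)              # wavelength grid (Angstrom, assumed)

def line(p):
    amp, mu, sig = p
    return amp * np.exp(-0.5 * ((wave - mu) / sig) ** 2)

true_p = np.array([5.0, 6564.0, 3.0])
noise_sigma = 0.5
data = line(true_p) + noise_sigma * rng.standard_normal(wave.size)

def log_post(p):
    amp, mu, sig = p
    if amp <= 0 or sig <= 0 or not (wave[0] < mu < wave[-1]):
        return -np.inf                               # flat priors with physical bounds
    return -0.5 * np.sum((data - line(p)) ** 2) / noise_sigma ** 2

p = np.array([1.0, 6560.0, 5.0])
lp = log_post(p)
step = np.array([0.2, 0.2, 0.2])
samples = []
for it in range(30000):                              # random-walk Metropolis
    prop = p + step * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        p, lp = prop, lp_prop
    if it > 5000:
        samples.append(p.copy())

amp_mean, centre_mean, width_mean = np.mean(samples, axis=0)   # posterior mean estimates
```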

Villeneuve, Emma; Carfantan, Hervé

2014-10-01

354

Semi-blind spectral deconvolution with adaptive Tikhonov regularization.  

PubMed

Deconvolution has become one of the most used methods for improving spectral resolution. Deconvolution is an ill-posed problem, especially when the point spread function (PSF) is unknown. Non-blind deconvolution methods use a predefined PSF, but in practice the PSF is not known exactly. Blind deconvolution methods estimate the PSF and spectrum simultaneously from the observed spectra, which become even more difficult in the presence of strong noise. In this paper, we present a semi-blind deconvolution method to improve the spectral resolution that does not assume a known PSF but models it as a parametric function in combination with the a priori knowledge about the characteristics of the instrumental response. First, we construct the energy functional, including Tikhonov regularization terms for both the spectrum and the parametric PSF. Moreover, an adaptive weighting term is devised in terms of the magnitude of the first derivative of spectral data to adjust the Tikhonov regularization for the spectrum. Then we minimize the energy functional to obtain the spectrum and the parameters of the PSF. We also discuss how to select the regularization parameters. Comparative results with other deconvolution methods on simulated degraded spectra, as well as on experimental infrared spectra, are presented. PMID:23146190
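
A much-simplified version of the semi-blind idea is to model the PSF as a Gaussian of unknown width and minimize a Tikhonov-type energy jointly: for each candidate width the spectrum update is a closed-form regularized solve, and the width itself carries a quadratic penalty toward a nominal instrumental value (standing in for the paper's a priori knowledge of the instrumental response). The adaptive derivative-based weighting of the paper is omitted, and all parameter values below are placeholders.

```python
# Sketch of joint (semi-blind) energy minimization with a parametric Gaussian PSF.
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(0)
n = 300
x = np.arange(n)
spectrum = np.exp(-0.5 * ((x - 100) / 3.0) ** 2) + 0.7 * np.exp(-0.5 * ((x - 180) / 5.0) ** 2)

def psf_matrix(width, half=25):
    k = np.arange(-half, half + 1)
    psf = np.exp(-0.5 * (k / width) ** 2); psf /= psf.sum()
    col = np.zeros(n); col[:half + 1] = psf[half:]
    row = np.zeros(n); row[:half + 1] = psf[half::-1]
    return toeplitz(col, row)

observed = psf_matrix(4.0) @ spectrum + 0.01 * rng.standard_normal(n)   # true width = 4.0

lam, gamma = 1e-2, 1.0          # regularization weights (assumed)
width_prior = 4.5               # assumed nominal instrumental width
D = np.diff(np.eye(n), axis=0)  # first-difference (smoothness) operator

best = None
for width in np.linspace(1.0, 8.0, 15):                                 # grid over the PSF parameter
    A = psf_matrix(width)
    est = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ observed)      # closed-form spectrum update
    cost = (np.sum((A @ est - observed) ** 2)
            + lam * np.sum((D @ est) ** 2)
            + gamma * (width - width_prior) ** 2)
    if best is None or cost < best[0]:
        best = (cost, width, est)

_, width_est, spectrum_est = best
```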

Yan, Luxin; Liu, Hai; Zhong, Sheng; Fang, Houzhang

2012-11-01

355

Spectral probability density as a tool for ambient noise analysis.  

PubMed

This paper presents the empirical probability density of the power spectral density as a tool to assess the field performance of passive acoustic monitoring systems and the statistical distribution of underwater noise levels across the frequency spectrum. Using example datasets, it is shown that this method can reveal limitations such as persistent tonal components and insufficient dynamic range, which may be undetected by conventional techniques. The method is then combined with spectral averages and percentiles, which illustrates how the underlying noise level distributions influence these metrics. This combined approach is proposed as a standard, integrative presentation of ambient noise spectra. PMID:23556689
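
In practice the spectral probability density can be assembled by computing a Welch PSD for each snapshot of the recording and histogramming the resulting levels in every frequency bin, alongside the usual percentile curves. A sketch with synthetic noise in place of hydrophone data (sampling rate, snapshot length, and bin width are arbitrary):

```python
# Sketch: empirical probability density of spectral levels per frequency bin.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 8000
snapshots = rng.standard_normal((200, fs))              # 200 one-second records (synthetic)

psds = []
for snap in snapshots:
    f, pxx = welch(snap, fs=fs, nperseg=1024)
    psds.append(10 * np.log10(pxx))                     # spectral levels in dB
psds = np.array(psds)                                   # shape (n_snapshots, n_freqs)

levels = np.arange(psds.min(), psds.max() + 0.5, 0.5)   # 0.5 dB level bins
spd = np.array([np.histogram(psds[:, i], bins=levels, density=True)[0]
                for i in range(psds.shape[1])]).T       # empirical P(level | frequency)
percentiles = np.percentile(psds, [5, 50, 95], axis=0)  # conventional summary curves
```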

Merchant, Nathan D; Barton, Tim R; Thompson, Paul M; Pirotta, Enrico; Dakin, D Tom; Dorocicz, John

2013-04-01

356

A Tool for Long-Range Dependent Analysis via the R\\/S Statistic  

Microsoft Academic Search

Self-similarity and long-range dependence have been found to apply as models of traffic in modern computer networks. This behavior has important implications for the design, analysis, control and performance of such networks, thus an accurate identification and quantification of such behavior is important. A novel tool for the analysis of long-range dependence via the R\\/S statistic is presented. The tool,

Julio C. Ramirez Pacheco; Deni Torres Roman

2006-01-01

357

Development of a task analysis tool to facilitate user interface design  

NASA Technical Reports Server (NTRS)

A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

Scholtz, Jean C.

1992-01-01

358

Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools  

NASA Technical Reports Server (NTRS)

A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

Orr, Stanley A.; Narducci, Robert P.

2009-01-01

359

High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis  

PubMed Central

The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

Simonyan, Vahan; Mazumder, Raja

2014-01-01

360

A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.  

PubMed

Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. PMID:24522836
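
The calibration step underlying such a tool (regressing known protein concentrations on absorbance spectra with partial least squares, then predicting concentration profiles from inline spectra) can be illustrated as follows. The scikit-learn implementation and the component count are assumptions for illustration, not the authors' Matlab® setup.

    # Sketch: PLS calibration for co-eluting protein quantification from spectra.
    # spectra_cal: (n_samples, n_wavelengths) absorbance spectra of calibration
    # mixtures; conc_cal: (n_samples, n_proteins) known concentrations.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def calibrate(spectra_cal, conc_cal, n_components=5):
        pls = PLSRegression(n_components=n_components)
        pls.fit(spectra_cal, conc_cal)
        return pls

    def predict_elution_profiles(pls, spectra_inline):
        """Per-protein concentration at each inline spectrum (negatives clipped)."""
        return np.clip(pls.predict(spectra_inline), 0.0, None)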

Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

2014-07-01

361

Analysis, Diagnosis, and Short-Range Forecast Tools  

NSDL National Science Digital Library

This lesson is divided into three sections. The first section discusses the importance of analysis and diagnosis in evaluating NWP in the forecast process. In section two, we discuss a methodology for dealing with discrepancies between both the official forecast and NWP compared to analysis and diagnosis. The third section shows a representative example of the methodology.

2014-09-14

362

Configural Frequency Analysis as a Statistical Tool for Developmental Research.  

ERIC Educational Resources Information Center

Configural frequency analysis (CFA) is suggested as a technique for longitudinal research in developmental psychology. Stability and change in answers to multiple choice and yes-no item patterns obtained with repeated measurements are identified by CFA and illustrated by developmental analysis of an item from Gorham's Proverb Test. (Author/DWH)
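
A minimal sketch of the basic CFA step is given below: observed frequencies of answer configurations are compared with the frequencies expected under item independence, and configurations occurring significantly more or less often are flagged as types or antitypes. The binomial test and the independence model are the standard textbook choices, not necessarily the exact variant used in the article.

    # Sketch: first-order configural frequency analysis for dichotomous items.
    # Observed counts of each answer configuration are tested against the
    # frequency expected under item independence; significant excesses are
    # "types", significant deficits are "antitypes".
    import numpy as np
    from itertools import product
    from scipy.stats import binomtest

    def cfa(data):
        """data: (n_subjects, n_items) array of 0/1 answers (small n_items only)."""
        data = np.asarray(data)
        n, k = data.shape
        p_item = data.mean(axis=0)                         # marginal P(answer == 1)
        results = {}
        for pattern in product((0, 1), repeat=k):
            observed = int(np.all(data == pattern, axis=1).sum())
            expected_p = float(np.prod([p if b else 1.0 - p
                                        for b, p in zip(pattern, p_item)]))
            pval = binomtest(observed, n, expected_p).pvalue
            results[pattern] = (observed, n * expected_p, pval)
        return results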

Lienert, Gustav A.; Oeveste, Hans Zur

1985-01-01

363

Pathway-based Analysis Tools for Complex Diseases: A Review.  

PubMed

Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique that allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods: powerful methods with the potential to uncover the biological depths of complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced, and then a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases. PMID:25462153

Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

2014-10-28

364

Negotiation Process Analysis: A Research and Training Tool.  

ERIC Educational Resources Information Center

This paper proposes the use of interaction process analysis to study negotiation behaviors. Following a review of current literature in the field, the paper presents a theoretical framework for the analysis of both labor/management and social negotiation processes. Central to the framework described are two systems of activities that together…

Williams, Timothy

365

A Web-based Tool For The Analysis Of Concept Inventory Data  

NASA Astrophysics Data System (ADS)

"FOCIA" stands for Free Online Concept Inventory Analyzer. FOCIA, our new web-based tool, will allow teachers and researchers in any location to upload their test data and instantly receive a complete analysis report. Analyses included with this tool are basic test statistics, Traditional Item Analysis, Concentration Analysis, Model Analysis Theory results, and pre- and post-test comparison, including the calculations of gain, normalized change and effect size. The tool currently analyzes data from the Lunar Phases Concept Inventory (LPCI), the Force Concept Inventory (FCI), the Astronomy Diagnostic Test (ADT), the Force and Motion Conceptual Evaluation (FMCE) and, generically, any multiple-choice test. It will be expanded to analyze data from other commonly utilized concept inventories in the PER community and from user-designed and uploaded tools. In this paper, we will discuss the development of this analysis tool, including some technical details of implementation and a description of what is available for use. Instructors and researchers are encouraged to use the latest version of the analysis tool via our website, http://www.sciedures.org.
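
The pre/post statistics mentioned above can be computed with the standard definitions (Hake's normalized gain, per-student normalized change, and Cohen's d); the sketch below uses these textbook formulas and is not FOCIA's actual implementation.

    # Sketch: standard pre/post comparison statistics (not FOCIA's internal code).
    import numpy as np

    def normalized_gain(pre_mean, post_mean, max_score=100.0):
        """Hake's class-average normalized gain."""
        return (post_mean - pre_mean) / (max_score - pre_mean)

    def normalized_change(pre, post, max_score=100.0):
        """Average per-student normalized change; students at ceiling are excluded."""
        pre, post = np.asarray(pre, float), np.asarray(post, float)
        c = np.where(post >= pre,
                     (post - pre) / np.maximum(max_score - pre, 1e-12),
                     (post - pre) / np.maximum(pre, 1e-12))
        return c[pre < max_score].mean()

    def cohens_d(pre, post):
        """Effect size of the pre-to-post shift with a pooled standard deviation."""
        pooled = np.sqrt((np.var(pre, ddof=1) + np.var(post, ddof=1)) / 2.0)
        return (np.mean(post) - np.mean(pre)) / pooled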

Beuckman, Joseph P.; Franklin, Scott V.; Lindell, Rebecca S.

2005-09-01

366

A Web-based Tool For The Analysis Of Concept Inventory Data  

NSDL National Science Digital Library

"FOCIA" stands for Free Online Concept Inventory Analyzer. FOCIA, our new web-based tool will allow teachers and researchers in any location to upload their test data and instantly receive a complete analysis report. Analyses included with this tool are basic test statistics, Traditional Item Analysis, Concentration Analysis, Model Analysis Theory results, pre and post test comparison, including the calculations of gain, normalized change and effect size. The tool currently analyzes data from the Lunar Phases Concept Inventory (LPCI), the Force Concept Inventory (FCI), the Astronomy Diagnostic Test (ADT), the Force and Motion Concept Inventory (FMCE) and generically, any multiple choice test. It will be expanded to analyze data from other commonly utilized concept inventories in the PER community and, from user-designed and uploaded tools. In this paper, we will discuss the development of this analysis tool including some technical details of implementation and a description of what is available for use. Instructors and researchers are encouraged to use the latest version of the analysis tool via our website, http://www.sciedures.org.

Beuckman, Joseph; Franklin, Scott V.; Lindell, Rebecca S.

2009-11-30

367

Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop  

SciTech Connect

Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding of application behavior, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

Miller, Barton

2014-06-30

368

Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan  

SciTech Connect

Application performance is hindered by a variety of factors but most notably driven by the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key when trying to optimize performance. Understanding application performance properties is facilitated with various performance profiling tools. The scope of profiling tools varies in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex and large-scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in tool for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with an external tool such as a cache simulator, Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics analysis due to the emphasis on the tool's output, which consists of trace files. Moreover, the tool is dependent on the run-time system to provide the necessary infrastructure to expose low-level system detail; therefore, we also discuss any theoretical benefits that can be achieved if such modules were present.

Janjusic, Tommy [ORNL]; Kartsaklis, Christos [ORNL]; Wang, Dali [ORNL]

2013-01-01

369

Cash flow analysis: a tool for the dairy farmer  

E-print Network

rose 3.8$, while the price of milk dropped 1.5$. This has put a huge strain on profit margins. Another drop of 4g in the price of milk is projected for 1985. Also, as was mentioned above, dairy farming is quite capital intensive. The capital assets..., and supporting cash flow estimates for a projected analysis. The latter is where the real benefit of cash flow analysis is derived. Projected cash flow analysis, with its projected cash flow statement, provides dairymen with many varied uses. One use...

McMorrough, Mark D

1985-01-01

370

An Analysis Tool for Flight Dynamics Monte Carlo Simulations  

E-print Network

and analysis work to understand vehicle operating limits and identify circumstances that lead to mission failure. A Monte Carlo simulation approach that varies a wide range of physical parameters is typically used to generate thousands of test cases...

Restrepo, Carolina 1982-

2011-05-20

371

Introducing an Online Cooling Tower Performance Analysis Tool  

E-print Network

in detail to highlight important design considerations and issues. This will include how the Merkel Theory, psychrometric properties, tower types, and historical weather data are incorporated into the analysis....
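
Merkel's method reduces the tower calculation to a dimensionless integral over the water temperature range; a sketch of that numerical integration is shown below. The saturated-air enthalpy correlation is an assumed user-supplied function, and the linear air-enthalpy balance is the usual textbook simplification rather than a detail of the tool described above.

    # Sketch: Merkel number by numerical integration of
    #   Me = c_pw * integral dT / (h_sat(T) - h_air(T))
    # h_sat is an assumed, user-supplied saturated-air enthalpy correlation
    # (kJ/kg dry air as a function of water temperature in degC); the air
    # enthalpy rises linearly from the inlet value via the energy balance.
    import numpy as np

    def merkel_number(t_water_in, t_water_out, h_air_in, h_sat, lg_ratio, c_pw=4.186):
        """Counterflow tower; assumes h_sat(T) > h_air everywhere (feasible duty)."""
        t = np.linspace(t_water_out, t_water_in, 101)            # cold -> hot water
        h_air = h_air_in + c_pw * lg_ratio * (t - t_water_out)   # air-side energy balance
        return c_pw * np.trapz(1.0 / (h_sat(t) - h_air), t)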

Muller, M.R.; Muller, M.B.; Rao, P.

2012-01-01

372

Application of surface chemical analysis tools for characterization of nanoparticles.  

PubMed

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES), X-ray photoelectron spectroscopy (XPS), time-of-flight secondary-ion mass spectrometry (TOF-SIMS), low-energy ion scattering (LEIS), and scanning-probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface-analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties. PMID:20052578

Baer, D R; Gaspar, D J; Nachimuthu, P; Techane, S D; Castner, D G

2010-02-01

373

Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles  

SciTech Connect

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties.

Baer, Donald R.; Gaspar, Daniel J.; Nachimuthu, Ponnusamy; Techane, Sirnegeda D.; Castner, David G.

2010-02-01

374

Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles  

PubMed Central

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties. PMID:20052578

Baer, DR; Gaspar, DJ; Nachimuthu, P; Techane, SD; Castner, DG

2010-01-01

375

Database Management and Analysis Tools of Machine Induction  

Microsoft Academic Search

This paper surveys machine induction techniques for database management and analysis. Our premise is that machine induction facilitates an evolution from relatively unstructured data stores to efficient and correct database implementations.

Doug Fischer; Gilford Hapanyengwi

1993-01-01

376

National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion  

NASA Technical Reports Server (NTRS)

Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

Follen, G.; Naiman, C.; Evans, A.

1999-01-01

377

OPE The Campus Safety and Security Data Analysis Cutting Tool  

NSDL National Science Digital Library

Provided by the Office of Postsecondary Education (OPE) of the US Department of Education, this searchable database allows users to browse records of reported criminal offenses at over 6000 colleges and universities. The database contains records for 1997-99 and may be browsed by region, state, city, type of institution, instructional program, and number of students. Users can also simply type in the name of a specific institution. Initial entries include basic contact information and links to statistics for criminal offenses, hate offenses, and arrests. Each entry page also links to the relevant page at the National Center for Education Statistics IPEDS COOL (College Opportunities On-Line) website (reviewed in the March 31, 2000 Scout Report), a tool for comparison shopping between different colleges and universities.

378

PRAAD: Preprocessing and Analysis Tool for Arabic Ancient Documents  

Microsoft Academic Search

This paper presents the new system PRAAD for preprocessing and analysis of Arabic historical documents. It is composed of two important parts: pre-processing and analysis of ancient documents. After digitization, the color or greyscale ancient document images are distorted by the presence of strong background artefacts such as scan optical blur and noise, show-through and bleed-through effects, and spots. In

Wafa Boussellaa; Abderrazak Zahour; Bruno Taconet; Adel Alimi; Abdellatif Benabdelhafid

2007-01-01

379

MR PRISM: a spectral analysis tool for the PRISM  

NASA Astrophysics Data System (ADS)

We describe a computer application designed to analyze hyperspectral data collected by the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM). The application links the spectral, imaging and mapping perspectives on the eventual CRISM dataset by presenting the user with three different ways to analyze the data. One of the goals when developing this application is to build in the latest algorithms for detection of spectrally compelling targets on the surface of the Red Planet, so they may be available to the Planetary Science community without cost and with a minimal learning barrier to cross. This will allow the Astrobiology community to look for targets of interest such as hydrothermal minerals, sulfate minerals and hydrous minerals and be able to map the extent of these minerals using the most up-to-date and effective algorithms. The application is programmed in Java and will be made available for Windows, Mac and Linux platforms. Users will be able to embed Groovy scripts into the program in order to extend its functionality. The first collection of CRISM data will occur in September of 2006 and this data will be made publicly available six months later via the Planetary Data System (PDS). Potential users in the community should therefore look forward to a release date in mid-2007. Although exploration of the CRISM data set is the motivating force for developing these software tools, the ease of writing additional Groovy scripts to access other data sets makes the tools useful for mineral exploration, crop management, and characterization of extreme environments here on Earth or other terrestrial planets. The system can be easily implemented for use by high school, college, and graduate level students.

Brown, Adrian J.; Storrie-Lombardi, Michael

2006-08-01

380

Deconvolution of dynamic dual photon microscopy images of cerebral microvasculature to assess the hemodynamic status of the brain  

NASA Astrophysics Data System (ADS)

Assessing the hemodynamic status of the brain and its variations in response to stimulations is required to understand the local cerebral circulatory mechanisms. Dynamic contrast enhanced imaging of cerebral microvasculature provides information that can be used in understanding physiology of cerebral diseases. Bolus tracking is used to extract characteristic parameters that quantify local cerebral blood flow. However, post-processing of the data is needed to segment the field of view (FOV) and to perform deconvolution to remove the effects of input bolus profile and the path it travels to reach the imaging window. Finding the arterial input function (AIF) and dealing with the ill-posedness of the deconvolution system are the main challenges of this process. We propose using ICA to segment the FOV and to extract a local AIF as well as the venous output function that is required for deconvolution. This also helps to stabilize the system as ICA suppresses noise efficiently. Tikhonov regularization (with L-curve analysis to find the best regularization parameter) is used to make the system stable. In-vivo dynamic 2PLSM images of a rat brain in two conditions (when the animal is at rest and when it is stimulated) are used in this study. The experimental and simulation studies provided promising results that demonstrate the feasibility and importance of performing deconvolution.
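
The deconvolution step can be sketched as below: a residue function is recovered from a tissue curve and an AIF using Tikhonov regularization. The lower-triangular discretization of the convolution and the crude product-criterion stand-in for the L-curve corner are generic choices, not the authors' exact implementation.

    # Sketch: Tikhonov-regularized deconvolution of a tissue curve by an AIF,
    # with a crude product-criterion search standing in for the L-curve corner.
    import numpy as np

    def tikhonov_deconvolve(aif, tissue, dt, lambdas=np.logspace(-3, 2, 50)):
        aif, tissue = np.asarray(aif, float), np.asarray(tissue, float)
        n = len(aif)
        # Discretize the convolution with the AIF as a lower-triangular Toeplitz matrix.
        A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                           for i in range(n)])
        best = None
        for lam in lambdas:
            r = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ tissue)
            residual = np.linalg.norm(A @ r - tissue)
            seminorm = np.linalg.norm(r)
            score = np.log(residual) + np.log(seminorm)   # crude proxy for the L-curve corner
            if best is None or score < best[0]:
                best = (score, lam, r)
        return best[2], best[1]       # residue function and selected lambda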

Mehrabian, Hatef; Lindvere, Liis; Stefanovic, Bojana; Martel, Anne L.

2011-03-01

381

Ionic liquid thermal stabilities: decomposition mechanisms and analysis tools.  

PubMed

The increasing amount of papers published on ionic liquids generates an extensive quantity of data. The thermal stability data of divergent ionic liquids are collected in this paper with attention to the experimental set-up. The influence and importance of the latter parameters are broadly addressed. Both ramped temperature and isothermal thermogravimetric analysis are discussed, along with state-of-the-art methods, such as TGA-MS and pyrolysis-GC. The strengths and weaknesses of the different methodologies known to date demonstrate that analysis methods should be in line with the application. The combination of data from advanced analysis methods allows us to obtain in-depth information on the degradation processes. Aided with computational methods, the kinetics and thermodynamics of thermal degradation are revealed piece by piece. The better understanding of the behaviour of ionic liquids at high temperature allows selective and application driven design, as well as mathematical prediction for engineering purposes. PMID:23598738

Maton, Cedric; De Vos, Nils; Stevens, Christian V

2013-07-01

382

A Search and Analysis Tool for RHESSI Complementary Observations  

NASA Astrophysics Data System (ADS)

To maximize its scientific return, the RHESSI mission relies on coordinated analysis of context (or complementary) observations from ground- and space-based observatories. To facilitate this analysis, we have developed a database and software system that allows users to search, browse, and retrieve complementary data for RHESSI-observed flares. The system includes images, spectra, and lightcurves that span a wide range of wavelengths (radio, optical, EUV, soft X-rays, and hard X-rays), as well as magnetic field observations. These observations are mirrored within 24 hours from remote sites around the world to a central archive at the NASA/GSFC Solar Data Analysis Center. The archive currently includes SOHO (EIT, MDI), TRACE, H-alpha (Meudon, BBSO), radio (OVSA, RSTN, Phoenix, Nobeyama, Nancay), and the Czech Hard X-ray Spectrometer observations. This poster will demonstrate an IDL interface to the RHESSI complementary data archive.

Zarro, D. M.; Tolbert, K.; Dennis, B. R.

2002-05-01

383

A Search and Analysis Tool for HESSI Ancillary Observations  

NASA Astrophysics Data System (ADS)

To fully maximize its scientific return, the HESSI mission will rely heavily on coordinated analysis of context (or ancillary) observations from ground- and space-based observatories. To facilitate this analysis, we have developed an archive and software system that allows users to search, browse, and retrieve ancillary data for HESSI-observed flares. The system includes images, spectra, and lightcurves that span a wide range of wavelengths (radio, optical, UV/EUV, soft X-rays, and hard X-rays) as well as magnetic field observations. These observations are mirrored within 24 hours from remote sites around the world to a central server at the NASA/GSFC Solar Data Analysis Center. The central archive currently includes Yohkoh (SXT), SOHO (EIT, MDI), TRACE, H-alpha (Meudon), radio (RSTN, OVSA, Phoenix, Nancay, Nobeyama) and the Czech Hard X-ray Spectrometer observations. This poster will demonstrate an IDL interface that we have developed for accessing the HESSI ancillary data archive.

Zarro, D. M.; Tolbert, K.; Dennis, B.

2001-12-01

384

DEBRISK, a Tool for Re-Entry Risk Analysis  

NASA Astrophysics Data System (ADS)

An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to assure the risk mitigation of their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. Using this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads. This altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its birth criterion. In the simplest approach the child is born after demise of the parent object. This could be the case of an object A containing an object B which is in the interior of object A and thus not exposed to the atmosphere. Each object is defined by its shape, attitude and dimensions; its material, along with the material's physical properties; and its state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring. A new-born object inherits the state vector of the parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached. The mass of melted material is computed from the excess heat and the material properties. After each step the amount of ablated material is determined using the lumped mass approach and is peeled off from the object, updating the mass and shape of the ablated object. The mass in the lumped mass equation is termed 'thermal mass' and consists of the part of the object that is exposed to the flow (so excluding the mass of the contained children). A fair number of predefined materials are implemented, along with their thermal properties. In order to allow users to modify the properties or to add new materials, user-defined materials can be used. In that case properties such as specific heat, emissivity and conductivity can either be entered as constants or as temperature dependent by entering a table. Materials can be derived from existing ones, which is useful in case only one or a few of the material properties change. The code has been developed in the Java language, benefitting from the object-oriented approach. Most methods that are used in DEBRISK to compute drag coefficients and heating rates are based on engineering methods developed in the 1950s and 1960s, which are used as well in similar tools (ORSAT, SESAME, ORSAT-J, ...). The paper presents a set of comparisons with literature cases of similar tools in order to verify the implementation of those methods in the developed software.
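
The lumped-mass demise logic summarized above can be sketched as a simple time integration: heat the fragment until the melting temperature is reached, then convert excess heat into ablated mass. The heat-flux function, time step, and material inputs below are placeholders; DEBRISK's aerothermal correlations are not reproduced.

    # Sketch: lumped-mass heating and ablation of a single fragment. q_dot(t) is an
    # assumed heat-flux history [W/m^2]; material data and the time step are
    # placeholders, and the excess heat in the melting-transition step is neglected.
    def lumped_mass_demise(mass, area, c_p, t_melt, heat_of_fusion, q_dot,
                           temp_init=300.0, duration=300.0, dt=0.1):
        temp, t = temp_init, 0.0
        while mass > 0.0 and t < duration:
            q = q_dot(t) * area * dt                            # heat absorbed this step [J]
            if temp < t_melt:
                temp = min(temp + q / (mass * c_p), t_melt)     # heat the thermal mass
            else:
                mass -= q / heat_of_fusion                      # excess heat ablates material
            t += dt
        return max(mass, 0.0)                                   # surviving mass at end of run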

Omaly, P.; Spel, M.

2012-01-01

385

GIPSY 3D: Analysis, visualization and VO-Tools  

NASA Astrophysics Data System (ADS)

The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing System (GIPSY), and new ones based on use cases elaborated in collaboration with advanced users. One of the main goals is to provide local interoperability between GIPSY (visualization and data analysis) and other VO software. The connectivity with the Virtual Observatory environment will provide general access to 3D data VO archives and services, maximizing the potential for scientific discovery.

Ruíz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; van der Hulst, J. M.

2009-07-01

386

Application of surface chemical analysis tools for characterization of nanoparticles  

Microsoft Academic Search

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES), X-ray photoelectron spectroscopy (XPS), time-of-flight secondary-ion mass spectrometry (TOF-SIMS), low-energy ion scattering (LEIS), and scanning-probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized.

Donald R. Baer; Daniel J. Gaspar; Ponnusamy Nachimuthu; Sirnegeda D. Techane; David G. Castner

2010-01-01

387

Development of a User Interface for a Regression Analysis Software Tool  

NASA Technical Reports Server (NTRS)

An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

Ulbrich, Norbert Manfred; Volden, Thomas R.

2010-01-01

388

Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology  

PubMed Central

The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu). PMID:24109552

Grüning, Björn A.; Paszkiewicz, Konrad; Pritchard, Leighton

2013-01-01

389

Primary Trait Analysis: A Tool for Classroom-Based Assessment.  

ERIC Educational Resources Information Center

Describes Primary Trait Analysis (PTA) as a technique for instructors interested in classroom-based assessment. Provides a history of PTA at Raymond Walters College (RWC) and explains how RWC faculty developed PTA scales to communicate learning objectives and evaluative criteria to students. Provides several specific scales as examples. (EV)

Baughin, Judith A.; Brod, Evelyn F.; Page, Deborah L.

2002-01-01

390

Market research for requirements analysis using linguistic tools  

Microsoft Academic Search

Numerous studies in recent months have proposed the use of linguistic instruments to support requirements analysis. There are two main reasons for this: (i) the progress made in natural language processing, (ii) the need to provide the developers of software systems with support in the early phases of requirements definition and conceptual modelling. This paper presents the results of an

Luisa Mich; Mariangela Franch; Pierluigi Novi Inverardi

2004-01-01

391

A software tool for the analysis of neuronal morphology data  

PubMed Central

Anatomy plays a fundamental role in supporting and shaping nervous system activity. The remarkable progress of computer processing power within the last two decades has enabled the generation of electronic databases of complete three-dimensional (3D) dendritic and axonal morphology for neuroanatomical studies. Several laboratories are freely posting their reconstructions online after result publication, e.g., NeuroMorpho.Org (Nat Rev Neurosci 7:318-324, 2006). These neuroanatomical archives represent a crucial resource to explore the relationship between structure and function in the brain (Front Neurosci 6:49, 2012). However, such 'Cartesian' descriptions bear little intuitive information for neuroscientists. Here, we developed a simple prototype of a MATLAB-based software tool to quantitatively describe the 3D neuronal structures from public repositories. The program imports neuronal reconstructions and quantifies statistical distributions of basic morphological parameters such as branch length, tortuosity, branch genealogy and bifurcation angles. Using these morphological distributions, our algorithm can generate a set of virtual neurons readily usable for network simulations. PMID:24529393
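
As an illustration of the per-branch statistics such a tool extracts, the sketch below computes branch length and tortuosity from an SWC reconstruction. The SWC parsing and the definition of a branch as the path between bifurcation or terminal points follow the common convention and are not taken from the MATLAB prototype itself.

    # Sketch: branch length and tortuosity from an SWC reconstruction.
    # SWC columns: id, type, x, y, z, radius, parent_id (parent_id == -1 at the root).
    import numpy as np

    def branch_stats(swc_path):
        rows = np.loadtxt(swc_path)
        xyz = {int(r[0]): r[2:5] for r in rows}
        parent = {int(r[0]): int(r[6]) for r in rows}
        children = {}
        for node, par in parent.items():
            children.setdefault(par, []).append(node)
        # Branches end at tips (no children) or bifurcations (two or more children).
        endpoints = [n for n in xyz if len(children.get(n, [])) != 1]
        stats = []
        for node in endpoints:
            path_len, cur = 0.0, node
            while parent[cur] != -1:                  # walk towards the root
                nxt = parent[cur]
                path_len += np.linalg.norm(xyz[cur] - xyz[nxt])
                cur = nxt
                if len(children.get(cur, [])) != 1:   # previous branch point reached
                    break
            if path_len > 0.0:
                euclid = np.linalg.norm(xyz[node] - xyz[cur])
                stats.append((path_len, path_len / max(euclid, 1e-9)))
        return stats                                  # list of (length, tortuosity)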

2014-01-01

392

A geospatial tool for wildfire threat analysis in central Texas  

NASA Astrophysics Data System (ADS)

Wildland fires in the United States are not always confined to wilderness areas. The growth of population centers and housing developments in wilderness areas has blurred the boundaries between rural and urban. This merger of human development and natural landscape is known in the wildland fire community as the wildland urban interface or WUI, and it is within this interface that many wildland fires increasingly occur. As wildland fire intrusions in the WUI increase so too does the need for tools to assess potential impact to valuable assets contained within the interface. This study presents a methodology that combines real-time weather data, a wildland fire behavior model, satellite remote sensing and geospatial data in a geographic information system to assess potential risk to human developments and natural resources within the Austin metropolitan area and surrounding ten counties of central Texas. The methodology uses readily available digital databases and satellite images within Texas, in combination with an industry standard fire behavior model to assist emergency and natural resource managers assess potential impacts from wildland fire. Results of the study will promote prevention of WUI fire disasters, facilitate watershed and habitat protection, and help direct efforts in post wildland fire mitigation and restoration.

Hunter, Bruce Allan

393

Bayesian blind deconvolution from differently exposed image pairs.  

PubMed

Photographs acquired under low-lighting conditions require long exposure times and therefore exhibit significant blurring due to the shaking of the camera. Using shorter exposure times results in sharper images but with a very high level of noise. In this paper we address the problem of utilizing two such images in order to obtain an estimate of the original scene and present a novel blind deconvolution algorithm for solving it. We formulate the problem in a hierarchical Bayesian framework by utilizing prior knowledge on the unknown image and blur, and also on the dependency between the two observed images. By incorporating a fully Bayesian analysis, the developed algorithm estimates all necessary model parameters along with the unknown image and blur, such that no user-intervention is needed. Moreover, we employ a variational Bayesian inference procedure, which allows for the statistical compensation of errors occurring at different stages of the restoration, and also provides uncertainties of the estimates. Experimental results with synthetic and real images demonstrate that the proposed method provides very high quality restoration results and compares favorably to existing methods even though no user supervision is needed. PMID:20529746
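
A far simpler frequency-domain illustration of why a blurred/noisy exposure pair constrains the blur kernel is given below (regularized spectral division of the long exposure by the short one). This is not the paper's hierarchical variational Bayesian algorithm; the regularization constant and kernel size are arbitrary assumptions.

    # Sketch: regularized spectral division of a long-exposure (blurred) image by a
    # short-exposure (noisy but sharp) image of the same scene, assuming both are
    # registered, equally sized, and normalized to [0, 1]. Illustrative only.
    import numpy as np

    def estimate_blur_kernel(blurred, noisy_sharp, eps=1e-2, ksize=31):
        B = np.fft.fft2(blurred)
        S = np.fft.fft2(noisy_sharp)
        K = B * np.conj(S) / (np.abs(S) ** 2 + eps)   # Wiener-style division
        k = np.real(np.fft.fftshift(np.fft.ifft2(K)))
        cy, cx = np.array(k.shape) // 2
        half = ksize // 2
        k = k[cy - half:cy + half + 1, cx - half:cx + half + 1]  # crop kernel support
        k = np.clip(k, 0.0, None)
        return k / k.sum()                            # normalize to unit mass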

Babacan, Sevket Derin; Wang, Jingnan; Molina, Rafael; Katsaggelos, Aggelos K

2010-11-01

394

An integrated data analysis tool for improving measurements on the MST RFPa)  

NASA Astrophysics Data System (ADS)

Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
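
The core idea of combining two diagnostics that measure the same quantity can be illustrated with a Gaussian product of likelihoods, i.e. inverse-variance weighting; the actual tool uses a full forward model and MCMC, so the sketch below is only conceptual.

    # Sketch: inverse-variance weighting of two independent temperature estimates.
    def combine_measurements(te_sxr, sigma_sxr, te_ts, sigma_ts):
        w_sxr, w_ts = 1.0 / sigma_sxr**2, 1.0 / sigma_ts**2
        te_post = (w_sxr * te_sxr + w_ts * te_ts) / (w_sxr + w_ts)
        sigma_post = (w_sxr + w_ts) ** -0.5
        return te_post, sigma_post

    # e.g. SXR: 800 +/- 80 eV and TS: 900 +/- 120 eV combine to ~831 +/- 67 eV,
    # tighter than either input alone.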

Reusch, L. M.; Galante, M. E.; Franz, P.; Johnson, J. R.; McGarry, M. B.; Stephens, H. D.; Den Hartog, D. J.

2014-11-01

395

Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT): Semi-Annual Progress Report  

SciTech Connect

This report summarizes work carried out by the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of July 1, 2011 through December 31, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. The UV-CDAT team is positioned to address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, Visualization interfaces.

Williams, D N

2012-02-29

396

Leaf analysis as an exploratory tool in mineralogy  

NASA Astrophysics Data System (ADS)

PIXE analysis has been used for more than a decade at the University of Manitoba to determine trace-element concentrations in a wide variety of materials including minerals. Detailed analysis of the elemental composition of leaves is of interest because the macronutrient and micronutrient elements present in plant tissues are already well known from chemical studies. In the present work samples from species Betula populifolia and Picea glauca were irradiated at an incident proton energy of 40 MeV to determine possible additional trace-element concentrations due to migration from mineral deposits present underground. In addition to known nutrient elements, other elements such as Rb, Sr, Cd and Ba were readily detected. In some samples the presence of Pt was also identified.

Mirzai, A. A.; McKee, J. S. C.; Yeo, Y. H.; Gallop, D.; Medved, J.

1990-04-01

397

Regression analyses and motile sperm subpopulation structure study as improving tools in boar semen quality analysis  

Microsoft Academic Search

A precise estimation of the fertilizing ability of a boar ejaculate would be very useful to improve pig assisted reproduction results. For this purpose, we tested the mathematical combination of several parameters of the boar semen quality analysis, including the computer-assisted semen motility analysis (CASA), as a predictive fertility tool. The utilized mathematical relations among parameters were logistic and linear

Armando Quintero-Moreno; Teresa Rigau; Joan E Rodr??guez-Gil

2004-01-01

398

CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips  

PubMed Central

Corynebacteria are used for a wide variety of industrial purposes, but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence and taxonomy that may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for genome analysis that aims to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

2014-01-01

399

A Static Analysis Tool for Detecting Web Application Injection Vulnerabilities for ASP Program  

Microsoft Academic Search

Publicly reported web application vulnerabilities have grown strongly in recent years. Cross-site scripting (XSS) and SQL injection have been the most dominant classes of web vulnerabilities, and web application security has become a great challenge. For this reason, the static analysis tool ASPWC presented in this paper detects XSS attacks and SQL injection vulnerabilities based on taint analysis,

Xin-hua Zhang; Zhi-jian Wang

2010-01-01

400

BULLWHIP EFFECT AND SUPPLY CHAIN MODELLING AND ANALYSIS USING CPN TOOLS  

E-print Network

… Chain analysis and demonstrated their model using a case study from the food industry. Their model uses … Supply chains in the food industry are also modeled in [1]. The authors propose a supply chain management …

van der Aalst, Wil

401

Exploring NASA and ESA Atmospheric Data Using GIOVANNI, the Online Visualization and Analysis Tool  

NASA Technical Reports Server (NTRS)

Giovanni, the NASA Goddard online visualization and analysis tool (http://giovanni.gsfc.nasa.gov), allows users to explore various atmospheric phenomena without learning remote sensing data formats and downloading voluminous data. Using NASA MODIS (Terra and Aqua) and ESA MERIS (ENVISAT) aerosol data as an example, we demonstrate Giovanni usage for online multi-sensor remote sensing data comparison and analysis.

Leptoukh, Gregory

2007-01-01

402

HiTRACE-Web: an online tool for robust analysis of high-throughput capillary electrophoresis  

E-print Network

… software named HiTRACE (High Throughput Robust Analysis of Capillary Electrophoresis). HiTRACE has been … Its conventional application areas include genomic mapping, forensic identification and genome

Das, Rhiju

403

Toward Enhancing Automated Credibility Assessment: A Model for Question Type Classification and Tools for Linguistic Analysis  

ERIC Educational Resources Information Center

The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…

Moffitt, Kevin Christopher

2011-01-01

404

MIRAGE: A Management Tool for the Analysis and Deployment of Network Security Policies  

E-print Network

We present the core functionality of MIRAGE … cases, MIRAGE provides intra-component analysis to detect inconsistencies in single component

Boyer, Edmond

405

CoryneBase: Corynebacterium genomic resources and analysis tools at your fingertips.  

PubMed

Corynebacteria are used for a wide variety of industrial purposes, but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence and taxonomy that may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for genome analysis that aims to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

Heydari, Hamed; Siow, Cheuk Chuen; Tan, Mui Fern; Jakubovics, Nick S; Wee, Wei Yee; Mutha, Naresh V R; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

2014-01-01

406

Transana Video Analysis Software as a Tool for Consultation: Applications to Improving PTA Meeting Leadership  

ERIC Educational Resources Information Center

The chief aim of this article is to illustrate the potential of using Transana, a qualitative video analysis tool, for effective and efficient school-based consultation. In this illustrative study, the Transana program facilitated analysis of excerpts of video from a representative sample of Parent Teacher Association (PTA) meetings over the…

Rush, Craig

2012-01-01

407

Laboratory Experiments for Code Validation of Multiutility Spacecraft Charging Analysis Tool (MUSCAT)  

Microsoft Academic Search

The multiutility spacecraft charging analysis tool (MUSCAT), a spacecraft charging analysis software package, has been developed as a collaboration between the Japan Aerospace Exploration Agency and the Kyushu Institute of Technology. Laboratory experiments for fundamental code validation were carried out in both facilities' plasma chambers. MUSCAT is a particle simulation code based on particle-in-cell (PIC) and particle tracking (PT) algorithms capable of

Satoshi Hosoda; Takanobu Muranaka; Hitoshi Kuninaka; Jeongho Kim; Shinji Hatta; Naomi Kurahara; Mengu Cho; Hiroko O. Ueda; Kiyokazu Koga; Tateo Goka

2008-01-01

408

Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet  

NASA Technical Reports Server (NTRS)

The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

2000-01-01

409

ADVISOR: a systems analysis tool for advanced vehicle modeling  

Microsoft Academic Search

This paper provides an overview of the Advanced Vehicle Simulator (ADVISOR), the US Department of Energy's (DOE's) vehicle simulation tool, written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance,

T. Markel; A. Brooker; T. Hendricks; V. Johnson; K. Kelly; B. Kramer; M O’Keefe; S. Sprik; K. Wipke

2002-01-01

410

Introduction to Social Network Analysis (SNA) as an investigative tool  

Microsoft Academic Search

Social behavior is brought about mainly through social ties and connections. Our contacts with other people shape our view of the world, reinforce our identity, and the interactions provide us with all kinds of opportunities and resources to get things done. The social capital associated with networks is also one of the primary ways facilitating crime. Therefore, the systematic analysis

Renée C. van der Hulst

2009-01-01

411

Decision Analysis Tool to Compare Energy Pathways for Transportation  

SciTech Connect

With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies have been proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). A prototype model, the Analytica Transportation Energy Analysis Model (ATEAM), has been developed using the Analytica decision modeling environment, which visualizes the model structure as a hierarchy of influence diagrams. This report summarizes the FY2010 ATEAM accomplishments.

Bloyd, Cary N.; Stork, Kevin

2011-02-01

412

Tools for Functional Postgenomic Analysis of Listeria monocytogenes?  

PubMed Central

We describe the development of genetic tools for regulated gene expression, the introduction of chromosomal mutations, and improved plasmid transfer by electroporation in the food-borne pathogen Listeria monocytogenes. pIMK, a kanamycin-resistant, site-specific, integrative listeriophage vector was constructed and then modified for overexpression (pIMK2) or for isopropyl-β-d-thiogalactopyranoside (IPTG)-regulated expression (pIMK3 and pIMK4). The dynamic range of promoters was assessed by determining luciferase activity, P60 secretion, and internalin A-mediated invasion. These analyses demonstrated that pIMK4 and pIMK3 have a stringently controlled dynamic range of 540-fold. Stable gene overexpression was achieved with pIMK2, giving a range of expression for the three vectors of 1,350-fold. The lactococcal pORI280 system was optimized for the generation of chromosomal mutations and used to create five new prfA star mutants. The combination of pIMK4 and pORI280 allowed streamlined creation of “IPTG-dependent” mutants. This was exemplified by the creation of a clean deletion mutant of the universally essential secA gene, which exhibited a rapid loss of viability upon withdrawal of IPTG. We also improved plasmid transfer by electroporation into three commonly used laboratory strains of L. monocytogenes. A 125-fold increase in transformation efficiency for EGDe compared with the widely used protocol of Park and Stewart (S. F. Park and G. S. Stewart, Gene 94:129-132, 1990) was observed. Maximal transformation efficiencies of 5.7 × 10⁶ and 6.7 × 10⁶ CFU per μg were achieved for EGDe and 10403S, respectively, with a replicating plasmid. An efficiency of 2 × 10⁷ CFU per μg is the highest efficiency reported thus far for L. monocytogenes F2365. PMID:18441118

Monk, Ian R.; Gahan, Cormac G. M.; Hill, Colin

2008-01-01

413

Power Systems Life Cycle Analysis Tool (Power L-CAT).  

SciTech Connect

The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
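As a rough illustration of the levelized-cost arithmetic such a model automates, the sketch below computes a single levelized production cost from discounted lifetime costs and generation. It is a generic formula, not the Power L-CAT model; every input value shown is a hypothetical placeholder.

```python
# Generic levelized cost of electricity (LCOE) sketch: discounted lifetime
# costs divided by discounted lifetime generation. Illustrative only; the
# parameter values below are hypothetical, not taken from Power L-CAT or NETL.

def lcoe(capital_cost, annual_om, annual_fuel, annual_mwh, discount_rate, lifetime_years):
    """Return levelized production cost in $/MWh (capital assumed spent at year 0)."""
    disc_costs = capital_cost
    disc_energy = 0.0
    for year in range(1, lifetime_years + 1):
        df = 1.0 / (1.0 + discount_rate) ** year
        disc_costs += (annual_om + annual_fuel) * df
        disc_energy += annual_mwh * df
    return disc_costs / disc_energy

# Hypothetical gas combined-cycle plant:
print(round(lcoe(550e6, 20e6, 90e6, 4.2e6, 0.07, 30), 1), "$/MWh")
```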

Andruski, Joel; Drennen, Thomas E.

2011-01-01

414

Computational Tools and Facilities for the Next-Generation Analysis and Design Environment  

NASA Technical Reports Server (NTRS)

This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including, real-time simulations, immersive systems, collaborative engineering environment, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

1997-01-01

415

MESH-BASED SPHERICAL DECONVOLUTION FOR PHYSICALLY VALID FIBER ORIENTATION RECONSTRUCTION VIA DIFFUSION-WEIGHTED MRI  

E-print Network

University of California, Los Angeles. High angular resolution diffusion imaging (HARDI) methods … fiber geometry. Index Terms: deconvolution, magnetic resonance imaging, inverse problems, optimization.

Thompson, Paul

416

Deconvolution of variable rate reservoir performance data using B-splines  

E-print Network

This work presents the development, validation and application of a novel deconvolution method based on B-splines for analyzing variable-rate reservoir performance data. Variable-rate deconvolution is a mathematically unstable problem which has been...
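For context on why regularization is needed, the sketch below sets up a generic discrete rate/pressure deconvolution as a lower-triangular convolution system and stabilizes it with a simple ridge (Tikhonov) penalty. This is only a didactic stand-in that assumes a uniformly sampled rate history; it is not the B-spline formulation developed in this work.

```python
# Generic regularized deconvolution sketch for variable-rate data:
# dp_i = sum_j q_{i-j} g_j * dt, solved with a ridge penalty because the
# unregularized convolution matrix is ill-conditioned. NOT the B-spline
# method of this thesis; a minimal illustration under hypothetical inputs.
import numpy as np

def deconvolve_ridge(rate, pressure_drop, dt, lam=1e-2):
    """Estimate the constant-rate response g from rate q and pressure drop dp."""
    n = len(pressure_drop)
    Q = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1):
            Q[i, j] = rate[i - j] * dt          # discrete convolution matrix
    A = Q.T @ Q + lam * np.eye(n)               # Tikhonov-regularized normal equations
    return np.linalg.solve(A, Q.T @ np.asarray(pressure_drop, dtype=float))
```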

Ilk, Dilhan

2007-04-25

417

Second generation sequencing allows for mtDNA mixture deconvolution and high resolution detection of heteroplasmy  

PubMed Central

Aim To use parallel array pyrosequencing to deconvolute mixtures of mitochondrial DNA (mtDNA) sequence and provide high resolution analysis of mtDNA heteroplasmy. Methods The hypervariable segment 1 (HV1) of the mtDNA control region was analyzed from 30 individuals using the 454 GS Junior instrument. Mock mixtures were used to evaluate the system’s ability to deconvolute mixtures and to reliably detect heteroplasmy, including heteroplasmic differences between 5 family members of the same maternal lineage. Amplicon sequencing was performed on polymerase chain reaction (PCR) products generated with primers that included multiplex identifiers (MID) and adaptors for pyrosequencing. Data analysis was performed using NextGENe® software. The analysis of an autosomal short tandem repeat (STR) locus (D18S51) and a Y-STR locus (DYS389 I/II) was performed simultaneously with a portion of HV1 to illustrate that multiplexing can encompass different markers of forensic interest. Results Mixtures, including heteroplasmic variants, can be detected routinely down to a component ratio of 1:250 (20 minor variant copies with a coverage rate of 5000 sequences) and can be readily detected down to 1:1000 (0.1%) with expanded coverage. Amplicon sequences from D18S51, DYS389 I/II, and the second half of HV1 were successfully partitioned and analyzed. Conclusions The ability to routinely deconvolute mtDNA mixtures down to a level of 1:250 allows for high resolution analysis of mtDNA heteroplasmy, and for differentiation of individuals from the same maternal lineage. The pyrosequencing approach results in poor resolution of homopolymeric sequences, and PCR/sequencing artifacts require a filtering mechanism similar to that for STR stutter and spectral bleed through. In addition, chimeric sequences from jumping PCR must be addressed to make the method operational. PMID:21674826
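The detection limits quoted above scale directly with coverage; a short back-of-the-envelope sketch makes the relationship explicit. This is not part of the published pipeline, and the 20,000-read figure is an illustrative assumption rather than a number reported in the paper.

```python
# Back-of-the-envelope relation between coverage and minor-variant copies:
# ~20 minor reads at 5000x coverage corresponds to the 1:250 routine limit.
# The 20,000x figure below is an illustrative assumption for the 1:1000 case.

def expected_minor_reads(coverage, component_ratio):
    """Expected reads from a minor component present at 1/component_ratio of the mixture."""
    return coverage / component_ratio

print(expected_minor_reads(5000, 250))    # 20.0 -> routine detection threshold
print(expected_minor_reads(20000, 1000))  # 20.0 -> 0.1% detectable with expanded coverage
```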

Holland, Mitchell M.; McQuillan, Megan R.; O’Hanlon, Katherine A.

2011-01-01

418

GIPSY 3D: Analysis, Visualization and VO Tools for Datacubes  

NASA Astrophysics Data System (ADS)

The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing System (GIPSY), and new ones based on use cases elaborated in collaboration with advanced users. One of the main goals is to provide local interoperability between GIPSY and other VO software. The connectivity with the Virtual Observatory environment will provide general access to 3D data VO archives and services, maximizing the potential for scientific discovery.

Ruíz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; van der Hulst, J. M.

2009-09-01

419

Stakeholder analysis: a useful tool for biobank planning.  

PubMed

Stakeholders are individuals, groups, or organizations that are affected by or can affect a particular action undertaken by others. Biobanks relate to a number of donors, researchers, research institutions, regulatory bodies, funders, and others. These stakeholders can potentially have a strong influence upon the organization and operation of a biobank. A sound strategy for stakeholder engagement is considered essential in project management and organization theory. In this article, we review relevant stakeholder theory and demonstrate how a stakeholder analysis was undertaken in the early stage of a planned research biobank at a public hospital in Norway. PMID:24835062

Bjugn, Roger; Casati, Bettina

2012-06-01

420

Liquid chromatography: a tool for the analysis of metal species.  

PubMed

An overview is presented of classic and more recent applications of liquid chromatography for the analysis of metal species. The different approaches involving ion-exchange, ion-pair, and chelation separation mechanisms are discussed, as well as the new philosophy of simply removing interferents before specific detection of metal ions (alkali and alkaline earths, rare earths, heavy and transition metals). New, more selective materials enabling difficult separations and studies on multimodal or hyphenated techniques for metal speciation (e.g. arsenic and chromium) are considered. PMID:10457482

Sarzanini, C

1999-07-30

421

Configuration Analysis Tool (CAT). System Description and users guide (revision 1)  

NASA Technical Reports Server (NTRS)

A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

1982-01-01

422

Preparing NASA Atmospheric Data Exploration Tools for NPOESS Data Visualization and Analysis  

Microsoft Academic Search

The NASA Goddard Space Flight Center (GSFC) Earth Sciences (GES) Atmospheric Composition Data and Information Services Center (ACDISC) (http://acdisc.gsfc.nasa.gov/ or google on 'acdisc'), known for its development and deployment of responsive, user oriented, data management and value added data tools, has implemented a visualization and analysis tool that is rapidly receiving much use by the Atmospheric Composition (AC) community. Giovanni,

G. G. Leptoukh; S. J. Kempler; I. Gerasimov; S. W. Berrick; J. Johnson; S. Ahmad

2005-01-01

423

Surgem: Next Generation CAD Tools for Interactive Patient Specific Surgical Planning and Hemodynamic Analysis  

Microsoft Academic Search

The first version of an anatomy editing/surgical planning tool targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel shape editing concepts and human-shape interaction (HSI) technologies have been combined to facilitate interactive shape alterations and grid generation. At a surgery planning phase, these tools are applied to design and evaluate possible modifications of patient-specific anatomies of

Jarek Rossignac; Kerem Pekkan; Brian Whited; Kirk Kanter; Ajit Yoganathan; Wallace H. Coulter

424

SizeUp: A Tool for Interactive Comparative Collection Analysis for Very Large Species Collections  

E-print Network

Wide-ranging biological data; global … How do we compare and analyze large data sets, and visualize the result in a user-friendly tool? Multiple problems: no formal definition for 'quality' …

Ozor, Andrew

2009-11-18

425

Tool and Task Analysis Guide for Vocational Welding (150 Tasks). Performance Based Vocational Education.  

ERIC Educational Resources Information Center

This book contains a task inventory, a task analysis of 150 tasks from that inventory, and a tool list for performance-based welding courses in the state of Indiana. The task inventory and tool list reflect 28 job titles found in Indiana. In the first part of the guide, tasks are listed by these domains: carbon-arc, electron beam, G.M.A.W., gas…

John H. Hinds Area Vocational School, Elwood, IN.

426

AVISPA: a web tool for the prediction and analysis of alternative splicing  

PubMed Central

Transcriptome complexity and its relation to numerous diseases underpins the need to predict in silico splice variants and the regulatory elements that affect them. Building upon our recently described splicing code, we developed AVISPA, a Galaxy-based web tool for splicing prediction and analysis. Given an exon and its proximal sequence, the tool predicts whether the exon is alternatively spliced, displays tissue-dependent splicing patterns, and whether it has associated regulatory elements. We assess AVISPA's accuracy on an independent dataset of tissue-dependent exons, and illustrate how the tool can be applied to analyze a gene of interest. AVISPA is available at http://avispa.biociphers.org. PMID:24156756

2013-01-01

427

Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide  

NASA Technical Reports Server (NTRS)

The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.
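As a loose illustration of what a "basic controller with fundamental limiters" can look like, the sketch below implements a single step of a saturated PI law with anti-windup. It is a hypothetical stand-in, not TTECTrA's control design; all gains, limits, and variable names are invented.

```python
# Hypothetical saturated PI step with simple anti-windup, loosely illustrating
# the "basic controller plus limiter" structure described above. Not TTECTrA
# code; gains and limits are placeholders.

def pi_step(error, integrator, kp=0.8, ki=0.3, dt=0.02, u_min=0.0, u_max=1.0):
    integrator_new = integrator + error * dt
    u = kp * error + ki * integrator_new
    if not (u_min <= u <= u_max):
        u = min(max(u, u_min), u_max)   # saturate the command (the "limiter")
        integrator_new = integrator     # freeze integration while saturated
    return u, integrator_new
```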

Csank, Jeffrey T.; Zinnecker, Alicia M.

2014-01-01

428

IBIXFIT: A Tool For The Analysis Of Microcalorimeter PIXE Spectra  

SciTech Connect

PIXE analysis software has long been tuned mainly to the needs of Si(Li) detector based spectra analysis and quantification methods based on Kα or Lα X-ray lines. Still, recent evidence related to the study of relative line intensities and new developments in detection equipment, namely the emergence of commercial microcalorimeter based X-ray detectors, have brought up the possibility that in the near future PIXE will become more than just major lines quantification. A main issue that became evident as a consequence of this was the need to be able to fit PIXE spectra without prior knowledge of relative line intensities. Considering new developments it may be necessary to generalize PIXE to a wider notion of ion beam induced X-ray (IBIX) emission, to include the quantification of processes such as Radiative Auger Emission. In order to answer this need, the IBIXFIT code was created, based largely on the Bayesian Inference and Simulated Annealing routines implemented in the Datafurnace code [1]. In this presentation, IBIXFIT is used to fit a microcalorimeter spectrum of a BaxSr(1-x)TiO3 thin film sample, and the specific possibility of selecting between fixed and free line ratios, combined with other specificities of the IBIXFIT algorithm, is shown to be essential to overcome the problems faced.

Taborda, A.; Reis, M. A. [Instituto Tecnologico e Nuclear, Estrada Nacional 10, Sacavem, Apartado 21, 2686-953 Sacavem (Portugal); Centro de Fisica Atomica da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, 1649-003 Lisboa (Portugal); Alves, L. C.; Barradas, N. P. [Instituto Tecnologico e Nuclear, Estrada Nacional 10, Sacavem, Apartado 21, 2686-953 Sacavem (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, 1649-003 Lisboa (Portugal); Chaves, P. C. [Instituto Tecnologico e Nuclear, Estrada Nacional 10, Sacavem, Apartado 21, 2686-953 Sacavem (Portugal); Centro de Fisica Atomica da Universidade de Lisboa, Av. Prof. Gama Pinto, 2, 1649-003 Lisboa (Portugal); Instituto Superior Tecnico da Universidade Tecnica de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa (Portugal)

2011-06-01

429

Irena : tool suite for modeling and analysis of small-angle scattering.  

SciTech Connect

Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
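For orientation, the Guinier fit named above reduces to a straight-line fit of ln I(q) against q² at low q; the numpy sketch below shows that reduction. It is a generic illustration, not Irena's Igor Pro implementation, and it assumes the caller has already restricted q to the Guinier region.

```python
# Generic Guinier fit: ln I(q) ~ ln(I0) - (Rg^2 / 3) * q^2 over the low-q range.
# Illustrative only; not the Igor Pro implementation in the Irena suite.
import numpy as np

def guinier_fit(q, intensity):
    """Return (I0, Rg) from data already restricted to the Guinier region (q*Rg < ~1.3)."""
    slope, intercept = np.polyfit(np.asarray(q) ** 2, np.log(intensity), 1)
    return np.exp(intercept), np.sqrt(-3.0 * slope)
```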

Ilavsky, J.; Jemian, P.

2009-04-01

430

Multilayered Analysis of Teacher-Student Interactions: Concepts and Perspectives Guiding Video Analysis with Tattoo, the Analytic Transcription Tool  

Microsoft Academic Search

This article describes the development of a video analysis software tool designed to make explicit and open the process of systematic analysis of video material on teaching–learning interactions. The need for an efficient and transparent way of transcribing and analysing video materials was brought forth in a sequence of studies of interaction in music education in Sweden, where spoken language,

Tore West

2007-01-01

431

The Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT): Data Analysis and Visualization for Geoscience Data

E-print Network

Dean N. Williams, Charles Doutriaux, John Patchett, Sean Williams, Galen Shipman, … Poco, Tommy Ellqvist, Emanuele Santos, Gerald Potter, Brian Smith, Thomas Maxwell, David Kindig

432

Mathematical tools for the analysis and exploitation of polarimetric measurements  

NASA Astrophysics Data System (ADS)

Measured Mueller matrices contain up to sixteen independent parameters for each measurement configuration (spectral profile of the probe wave of the polarimeter, angle of incidence, observation direction...) and for each spatially resolved element of the sample (imaging polarimetry). Thus, polarimetric techniques are widely used for the study of a great variety of material samples in optics and remote sensing. Nevertheless, the relevant physical information does not appear explicitly in the measured parameters, and thus a sound knowledge of the structure of the physical information contained in a Mueller matrix is required in order to develop appropriate procedures for polarimetric analysis. In this paper, the physically invariant polarimetric quantities are identified and decoupled, and the main approaches for serial and parallel decompositions of measured Mueller matrices into simple components are reviewed.
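To make the notion of invariant polarimetric quantities concrete, the sketch below computes three standard scalars from a 4×4 Mueller matrix: diattenuation, polarizance, and the Gil-Bernabeu depolarization index. This is a generic illustration and does not reproduce the serial/parallel decompositions reviewed in the paper.

```python
# Three standard scalar descriptors of a 4x4 Mueller matrix M:
# diattenuation |(m01, m02, m03)|/m00, polarizance |(m10, m20, m30)|/m00,
# and the Gil-Bernabeu depolarization index. Generic sketch, not the paper's code.
import numpy as np

def mueller_invariants(M):
    M = np.asarray(M, dtype=float)
    m00 = M[0, 0]
    diattenuation = np.linalg.norm(M[0, 1:]) / m00
    polarizance = np.linalg.norm(M[1:, 0]) / m00
    depol_index = np.sqrt((np.sum(M ** 2) - m00 ** 2) / (3.0 * m00 ** 2))
    return diattenuation, polarizance, depol_index

print(mueller_invariants(np.eye(4)))  # identity (non-depolarizing): (0.0, 0.0, 1.0)
```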

Gil, José J.

2013-09-01

433

Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis  

SciTech Connect

This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation relevant activities, facilities, and programs. Foremost of the geospatial tools are the 'Digital Virtual Globes' (e.g., Google Earth, Virtual Earth) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest, which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of states' site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open source', such virtual globes can also provide a new, essentially free, means to conduct broad area search for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a. 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated infrastructure. The digital globes also provide highly accurate terrain mapping for better geospatial context and allow detailed 3-D perspectives of all sites or areas of interest. 3-D modeling software (i.e., Google's SketchUp 6, newly available in 2007), when used in conjunction with these digital globes, can significantly enhance individual building characterization and visualization (including interiors), allowing for better assessments including walk-arounds or fly-arounds and perhaps better decision making on multiple levels (e.g., the best placement for International Atomic Energy Agency (IAEA) video monitoring cameras).

Pabian, Frank V. [Los Alamos National Laboratory]

2012-08-14

434

The application of compressive sampling to radio astronomy. I. Deconvolution  

NASA Astrophysics Data System (ADS)

Compressive sampling is a new paradigm for sampling, based on sparseness of signals or signal representations. It is much less restrictive than Nyquist-Shannon sampling theory and thus explains and systematises the widespread experience that methods such as the Högbom CLEAN can violate the Nyquist-Shannon sampling requirements. In this paper, a CS-based deconvolution method for extended sources is introduced. This method can reconstruct both point sources and extended sources (using the isotropic undecimated wavelet transform as a basis function for the reconstruction step). We compare this CS-based deconvolution method with two CLEAN-based deconvolution methods: the Högbom CLEAN and the multiscale CLEAN. This new method shows the best performance in deconvolving extended sources for both uniform and natural weighting of the sampled visibilities. Both visual and numerical results of the comparison are provided.
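The core machinery behind such CS-based deconvolution is sparsity-regularized inversion; the sketch below shows a generic iterative soft-thresholding (ISTA) deconvolution with a known PSF. It thresholds the image directly rather than in the isotropic undecimated wavelet domain used by the authors, so it only captures the point-source case.

```python
# Generic ISTA sketch for sparse deconvolution with a known PSF:
# minimize 0.5*||h*x - y||^2 + lam*||x||_1. Thresholding is applied in the
# image domain here (point-source prior); the paper thresholds IUWT
# coefficients to handle extended emission as well.
import numpy as np

def ista_deconvolve(dirty, psf, lam=1e-3, step=0.5, n_iter=200):
    """dirty, psf: 2-D arrays of equal shape; step should satisfy step <= 1/max|PSF_fft|^2."""
    psf_fft = np.fft.fft2(np.fft.ifftshift(psf))
    y_fft = np.fft.fft2(dirty)
    x = np.zeros_like(dirty, dtype=float)
    for _ in range(n_iter):
        resid_fft = psf_fft * np.fft.fft2(x) - y_fft
        grad = np.real(np.fft.ifft2(np.conj(psf_fft) * resid_fft))   # gradient of data term
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)     # soft threshold
    return x
```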

Li, F.; Cornwell, T. J.; de Hoog, F.

2011-04-01

435

Application of Compressive Sampling to Radio Astronomy I: Deconvolution  

NASA Astrophysics Data System (ADS)

Compressive sampling is a new paradigm for sampling, based on sparseness of signals or signal representations. It is much less restrictive than Nyquist-Shannon sampling theory and thus explains and systematises the widespread experience that methods such as the Högbom CLEAN can violate the Nyquist-Shannon sampling requirements. In this paper, a CS-based deconvolution method for extended sources is introduced. This method can reconstruct both point sources and extended sources (using the isotropic undecimated wavelet transform as a basis function for the reconstruction step). We compare this CS-based deconvolution method with two CLEAN-based deconvolution methods: the Högbom CLEAN and the multiscale CLEAN. This new method shows the best performance in deconvolving extended sources for both uniform and natural weighting of the sampled visibilities. Both visual and numerical results of the comparison are provided.

Li, Feng; Brown, Shea; Cornwell, Tim J.; de Hoog, Frank

2011-06-01

436

Wavelet-based deconvolution of ultrasonic signals in nondestructive evaluation  

E-print Network

In this paper, the inverse problem of reconstructing reflectivity function of a medium is examined within a blind deconvolution framework. The ultrasound pulse is estimated using higher-order statistics, and Wiener filter is used to obtain the ultrasonic reflectivity function through wavelet-based models. A new approach to the parameter estimation of the inverse filtering step is proposed in the nondestructive evaluation field, which is based on the theory of Fourier-Wavelet regularized deconvolution (ForWaRD). This new approach can be viewed as a solution to the open problem of adaptation of the ForWaRD framework to perform the convolution kernel estimation and deconvolution interdependently. The results indicate stable solutions of the estimated pulse and an improvement in the radio-frequency (RF) signal taking into account its signal-to-noise ratio (SNR) and axial resolution. Simulations and experiments showed that the proposed approach can provide robust and optimal estimates of the reflectivity function.
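The Wiener filtering step referred to above can be written in a few lines; the sketch below applies a generic frequency-domain Wiener inverse filter given an estimated pulse. It omits the wavelet-domain regularization that distinguishes ForWaRD, and the noise-to-signal level is a user-supplied assumption.

```python
# Generic frequency-domain Wiener deconvolution of an RF trace given an
# estimated pulse. Only the Wiener step is shown; ForWaRD additionally
# regularizes in the wavelet domain, which is not sketched here.
import numpy as np

def wiener_deconvolve(rf_trace, pulse, noise_to_signal=1e-2):
    n = len(rf_trace)
    H = np.fft.fft(pulse, n)                               # pulse spectrum (zero-padded)
    G = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)    # Wiener inverse filter
    return np.real(np.fft.ifft(G * np.fft.fft(rf_trace)))  # estimated reflectivity
```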

Herrera, Roberto Henry; Rodríguez, Manuel

2012-01-01

437

Funtools: Fits Users Need Tools for Quick, Quantitative Analysis  

NASA Technical Reports Server (NTRS)

The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

Mandel, Eric; Brederkamp, Joe (Technical Monitor)

2001-01-01

438

A software tool for 3D dose verification and analysis  

NASA Astrophysics Data System (ADS)

The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve 3D verification of highly conformal radiotherapy treatments. This is because of the very high dose gradients used in modern treatment techniques, which can result in a small error in the spatial dose distribution leading to a serious complication. In order to gain the full benefits of using 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. We present in this paper a software solution that provides a comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which is based on magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as two distributions planned using different methods of treatment planning systems, or different dose calculation algorithms.
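The simplest 3D evaluation such a tool supports is a voxel-wise dose-difference pass rate; the sketch below computes one under a hypothetical 3%-of-maximum tolerance and a low-dose cut-off. It is illustrative only and does not include the distance-to-agreement or gamma-type metrics a full verification package would provide.

```python
# Voxel-wise 3D dose comparison sketch: fraction of evaluated voxels whose
# dose difference is within tol_fraction of the reference maximum. The 3%
# tolerance and 10% low-dose cut-off are hypothetical choices.
import numpy as np

def dose_difference_pass_rate(measured, planned, tol_fraction=0.03, low_dose_cutoff=0.10):
    measured = np.asarray(measured, dtype=float)
    planned = np.asarray(planned, dtype=float)
    ref_max = planned.max()
    mask = planned > low_dose_cutoff * ref_max            # ignore very low-dose voxels
    passed = np.abs(measured[mask] - planned[mask]) <= tol_fraction * ref_max
    return passed.mean()
```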

Sa'd, M. Al; Graham, J.; Liney, G. P.

2013-06-01

439

Joint decorrelation, a versatile tool for multichannel data analysis.  

PubMed

We review a simple yet versatile approach for the analysis of multichannel data, focusing in particular on brain signals measured with EEG, MEG, ECoG, LFP or optical imaging. Sensors are combined linearly with weights that are chosen to provide optimal signal-to-noise ratio. Signal and noise can be variably defined to match the specific need, e.g. reproducibility over trials, frequency content, or differences between stimulus conditions. We demonstrate how the method can be used to remove power line or cardiac interference, enhance stimulus-evoked or stimulus-induced activity, isolate narrow-band cortical activity, and so on. The approach involves decorrelating both the original and filtered data by joint diagonalization of their covariance matrices. We trace its origins; offer an easy-to-understand explanation; review a range of applications; and chart failure scenarios that might lead to misleading results, in particular due to overfitting. In addition to its flexibility and effectiveness, a major appeal of the method is that it is easy to understand. PMID:24990357
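The linear sensor combination described above reduces to a generalized eigenvalue problem on two covariance matrices; a minimal numpy/scipy sketch of that step is given below. How the "signal" and "noise" covariances are defined (evoked vs. baseline, narrow-band vs. broadband, and so on) is left to the application, as in the paper.

```python
# Joint-decorrelation sketch: choose sensor weights w maximizing
# (w' Cs w) / (w' Cn w) via a generalized eigendecomposition.
# Cs and Cn are covariances of "signal" and "noise" data segments,
# defined to match the analysis goal.
import numpy as np
from scipy.linalg import eigh

def joint_decorrelation_filters(signal_cov, noise_cov, n_components=3):
    eigvals, eigvecs = eigh(signal_cov, noise_cov)     # generalized EVD
    order = np.argsort(eigvals)[::-1]                  # highest SNR first
    return eigvecs[:, order[:n_components]]            # columns are spatial filters

# Usage (data: samples x channels): components = data @ joint_decorrelation_filters(Cs, Cn)
```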

de Cheveigné, Alain; Parra, Lucas C

2014-09-01

440

Pathomx: an interactive workflow-based tool for the analysis of metabolomic data.  

PubMed

Background: Metabolomics is a systems approach to the analysis of cellular processes through small-molecule metabolite profiling. Standardisation of sample handling and acquisition approaches has contributed to reproducibility. However, the development of robust methods for the analysis of metabolomic data is a work-in-progress. The tools that do exist are often not well integrated, requiring manual data handling and custom scripting on a case-by-case basis. Furthermore, existing tools often require experience with programming environments such as MATLAB or R to use, limiting accessibility. Here we present Pathomx, a workflow-based tool for the processing, analysis and visualisation of metabolomic and associated data in an intuitive and extensible environment. Results: The core application provides a workflow editor, IPython kernel and a HumanCyc™-derived database of metabolites, proteins and genes. Toolkits provide reusable tools that may be linked together to create complex workflows. Pathomx is released with a base set of plugins for the import, processing and visualisation of data. The IPython backend provides integration with existing platforms including MATLAB and R, allowing data to be seamlessly transferred. Pathomx is supplied with a series of demonstration workflows and datasets. To demonstrate the use of the software we here present an analysis of 1D and 2D ¹H NMR metabolomic data from a model system of mammalian cell growth under hypoxic conditions. Conclusions: Pathomx is a useful addition to the analysis toolbox. The intuitive interface lowers the barrier to entry for non-experts, while scriptable tools and integration with existing tools supports complex analysis. We welcome contributions from the community. PMID:25490956

Fitzpatrick, Martin A; McGrath, Catherine M; Young, Stephen P

2014-12-10

441

Semantic integration of gene expression analysis tools and data sources using software connectors  

PubMed Central

Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

2013-01-01

442

Principal component analysis: A tool for processing hyperspectral infrared data  

NASA Astrophysics Data System (ADS)

During the last decades, new instruments have been designed and built to improve observations of atmospheric temperature, water vapor, and winds. In the area of infrared remote sensing, new technologies will enable the next generation of instruments, like the Geostationary Imaging Fourier Transform Spectrometer (GIFTS), to collect high spectral and spatial resolution data at very high data rates. If not properly compressed, those data rates will exceed the capacity of the current operational downlink technology and will require expensive data systems to process the data on the ground. This dissertation focuses o
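The data-reduction idea introduced above can be illustrated with a generic PCA sketch: project each spectrum onto a small number of leading principal components and reconstruct from the scores. This is a textbook illustration, not the GIFTS processing chain.

```python
# Generic PCA compression/reconstruction for hyperspectral spectra arranged
# as (n_channels, n_observations). Textbook illustration only; not the GIFTS
# ground-processing implementation.
import numpy as np

def pca_compress(spectra, n_components=20):
    mean = spectra.mean(axis=1, keepdims=True)
    u, s, _ = np.linalg.svd(spectra - mean, full_matrices=False)
    basis = u[:, :n_components]               # leading principal directions
    scores = basis.T @ (spectra - mean)       # compressed representation
    return mean, basis, scores

def pca_reconstruct(mean, basis, scores):
    return mean + basis @ scores
```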