Note: This page contains sample records for the topic "deconvolution analysis tool."
While these samples are representative of the collection's content,
they are not comprehensive, nor are they the most current set.
We encourage you to perform a real-time search
to obtain the most current and comprehensive results.
Last update: November 12, 2013.


Microsoft Academic Search

The Spectral Deconvolution Analysis Tool (SDAT) is being written to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT will utilize spectral deconvolution spectroscopy techniques to analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution High Purity Germanium (HPGe) spectra that are utilized for aerosol monitoring. Spectral deconvolution spectroscopy is an analysis method that utilizes the

S. R. Biegalski; K. M. Foltz Biegalski; D. A. Haas


Testing the Spectral Deconvolution Algorithm Tool (SDAT) with Xe Spectra.  

National Technical Information Service (NTIS)

The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both B...

D. A. Haas; K. M. Biegalski; S. R. Biegalski



Fourier self-deconvolution in the analysis of near infrared spectra of chemically complex samples  

Microsoft Academic Search

A demonstrated analytical tool for band enhancements in the mid-IR, Fourier self-deconvolution can serve similarly in the analysis of NIR spectra even when the spectra, obtained with dispersion instruments in the diffuse reflectance mode, have broad, overlapping bands not easily resolved. Unlike derivative enhancement methods, Fourier self-deconvolution preserves peak areas so that the deconvolved results may prove advantageous in using

William F. McClure; Anthony M. C. Davies
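The band-narrowing mechanism described in this record can be sketched numerically. This is a minimal illustration, not the authors' implementation; the wavenumber axis, half-width, and apodization values below are invented for the demo:

```python
import numpy as np

# Unit-area Lorentzian band of HWHM gamma on an arbitrary wavenumber axis.
n = 1024
x = np.linspace(-50, 50, n)
dx = x[1] - x[0]
gamma = 4.0
spectrum = gamma / np.pi / (x**2 + gamma**2)

# FSD: in the conjugate ("interferogram") domain, multiply by exp(2*pi*gamma*|t|)
# to cancel the Lorentzian decay, then re-apodize with a Gaussian to bound noise.
t = np.fft.fftfreq(n, d=dx)
sigma = 3.0                                   # target Gaussian width after FSD
weight = np.exp(2*np.pi*gamma*np.abs(t)) * np.exp(-(np.pi*sigma*t)**2)
narrowed = np.real(np.fft.ifft(np.fft.fft(spectrum) * weight))

# Unlike derivative methods, FSD preserves the integrated band area.
area_before = spectrum.sum() * dx
area_after = narrowed.sum() * dx
```

Because the weight equals 1 at t = 0, the integrated area is preserved exactly while the band height grows, which is the property the abstract highlights.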



Chemometric data analysis for deconvolution of overlapped ion mobility profiles.  


We present the details of a data analysis approach for deconvolution of the ion mobility (IM) overlapped or unresolved species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution. PMID:22948903

Zekavat, Behrooz; Solouki, Touradj
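The PCA screening step this abstract describes can be mimicked on synthetic data. In this hedged sketch (the profiles, spectra, noise level, and rank threshold are all made up; no real post-IM/CID MS data are used), the number of singular values above the noise floor estimates how many species share an IM peak:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical post-IM/CID data: rows = IM arrival-time bins, cols = m/z channels.
t = np.linspace(0, 10, 60)
profile_a = np.exp(-0.5*((t - 4.5)/0.8)**2)   # two overlapping IM profiles
profile_b = np.exp(-0.5*((t - 5.5)/0.8)**2)
spec_a = rng.random(200)                      # distinct CID fragment spectra
spec_b = rng.random(200)

data = np.outer(profile_a, spec_a) + np.outer(profile_b, spec_b)
data += 1e-3 * rng.standard_normal(data.shape)   # measurement noise

# PCA via SVD of the mean-centered matrix: singular values well above the
# noise floor count the species hidden under the single IM peak.
centered = data - data.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
n_components = int(np.sum(s > 10 * s[2:].mean()))
```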



Deconvolution Analysis of Cardiac Natriuretic Peptides During Acute Volume Overload  

Microsoft Academic Search

Cardiac natriuretic peptides, especially amino terminal pro-Brain Natriuretic Peptide (NT-proBNP), are emerging as powerful circulating markers of cardiac function. However, the in vivo secretion and elimination (t1/2) of these peptides during acute volume overload have not been studied. We present the first report of the secretion and elimination of cardiac natriuretic peptides, based on deconvolution analysis of

Chris J. Pemberton; Michael L. Johnson; Tim G. Yandle; Eric A. Espiner



Direct chiral liquid chromatography determination of aryloxyphenoxypropionic herbicides in soil: deconvolution tools for peak processing.  


In this paper, the enantiomeric separation of two aryloxyphenoxypropionic esters (fluazifop-butyl and quizalofop-ethyl) and a safener herbicide (mefenpyr-diethyl), which is widely used for protecting crop plants, has been studied by direct liquid chromatography (LC) with UV detection on an α(1)-acid glycoprotein as chiral stationary phase. Optimization of separation conditions was done by factorial experimental design. Experimental factors and ranges selected were propanol (5-10%), phosphate buffer pH (6.5-7.0), and column temperature (15-25 °C). Responses were expressed in terms of enantioresolution (R(s)) and adjusted retention time of the second eluted enantiomer (t(r2)'). The chemometric method used to explore data was response surface analysis. Multiple response analyses were carried out to determine the combination of experimental factors which simultaneously optimize experimental responses. Under optimum conditions for enantioseparation of each herbicide, partially overlapped or fully resolved enantiomers were obtained. Deconvolution tools were employed as an integration method to fit chromatographic data and to achieve more precise enantiomeric ratio (ER) and enantiomeric fraction (EF) values. Applicability of both direct chiral LC and peak deconvolution methods was evaluated in spiked soil samples at different R/S enantiomeric ratios. Acceptable and reproducible recoveries between 71% and 96% with precision in the range 1-6% were achieved for herbicide-spiked levels from 0.50 to 9.0 µg g(-1). In addition, parameters such as R(s), ER, and EF were calculated and compared with values obtained using the common valley drop integration method. PMID:21487707

Guillén-Casla, V; Magro-Moral, J; Rosales-Conrado, N; Pérez-Arribas, L V; León-González, M E; Polo-Díez, L M



The deconvolution operation in convex analysis: An introduction  

SciTech Connect

Performing the infimal convolution of two functions is a frequent and useful operation in Convex Analysis: it is, to a great extent, the dual operation of the addition; it serves (like other "convolutions" in Analysis) to regularize functions; it has nice geometrical and economic interpretations. The deconvolution of a (convex) function by another one is a new operation, first defined in a clear-cut manner, which is to the infimal convolution what the subtraction is to the addition for real numbers; it appears in conjugating the difference of convex functions; it serves in solving explicitly convolution equations; it has an interpretation in terms of subtraction of epigraphs. Since its introduction, the deconvolution operation has been studied more thoroughly by the author and his former students or associates. What we intend to present here is a short (and, hopefully, pedagogical) introduction to the deconvolution operation, in a simplified setting. This can be viewed as a complement to chapters IV and X in the book.

Hiriart-Urruty, J.B.
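In standard convex-analysis notation, the two operations this record contrasts can be written side by side (the symbols here are generic and not necessarily the chapter's own):

```latex
\[
(f \,\square\, g)(x) = \inf_{y}\,\bigl\{ f(x-y) + g(y) \bigr\},
\qquad
(f \,\ominus\, g)(x) = \sup_{y}\,\bigl\{ f(x+y) - g(y) \bigr\}.
\]
```

Taking z = y inside the infimum shows ((f □ g) ⊖ g) ≤ f, the analogue of (a + b) − b = a for real numbers, and f ⊖ g can be read as a (star-)difference of epigraphs, as the abstract notes.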



PVT Analysis With A Deconvolution Algorithm  

Microsoft Academic Search

Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis

Kouzes, Richard T.



CAM-CM: a signal deconvolution tool for in vivo dynamic contrast-enhanced imaging of complex tissues  

PubMed Central

Summary: In vivo dynamic contrast-enhanced imaging tools provide non-invasive methods for analyzing various functional changes associated with disease initiation, progression and responses to therapy. The quantitative application of these tools has been hindered by their inability to accurately resolve and characterize targeted tissues due to spatially mixed tissue heterogeneity. The Convex Analysis of Mixtures - Compartment Modeling (CAM-CM) signal deconvolution tool has been developed to automatically identify pure-volume pixels located at the corners of the clustered pixel time series scatter simplex and subsequently estimate tissue-specific pharmacokinetic parameters. CAM-CM can dissect complex tissues into regions with differential tracer kinetics at pixel-wise resolution and provide a systems biology tool for defining imaging signatures predictive of phenotypes. Availability: The MATLAB source code can be downloaded at the authors' website Contact: Supplementary information: Supplementary data are available at Bioinformatics online.

Chen, Li; Chan, Tsung-Han; Choyke, Peter L.; Hillman, Elizabeth M. C.; Chi, Chong-Yung; Bhujwalla, Zaver M.; Wang, Ge; Wang, Sean S.; Szabo, Zsolt; Wang, Yue



Multispectral imaging analysis: spectral deconvolution and applications in biology  

NASA Astrophysics Data System (ADS)

Multispectral imaging has been in use for over half a century. Owing to advances in digital photographic technology, multispectral imaging is now used in settings ranging from clinical medicine to industrial quality control. Our efforts focus on the use of multispectral imaging coupled with spectral deconvolution for measurement of endogenous tissue fluorophores and for animal tissue analysis by multispectral fluorescence, absorbance, and reflectance data. Multispectral reflectance and fluorescence images may be useful in evaluation of pathology in histological samples. For example, current hematoxylin/eosin diagnosis limits spectral analysis to shades of red and blue/grey. It is possible to extract much more information using multispectral techniques. To collect this information, a series of filters or a device such as an acousto-optical tunable filter (AOTF) or liquid-crystal filter (LCF) can be used with a CCD camera, enabling collection of images at many more wavelengths than is possible with a simple filter wheel. In multispectral data processing, the "unmixing" of reflectance or fluorescence data and the classification based upon the resulting spectra are required. In addition to multispectral techniques, extraction of topological information may be possible by reflectance deconvolution or multiple-angle imaging, which could aid in accurate diagnosis of skin lesions or isolation of specific biological components in tissue. The goal of these studies is to develop spectral signatures that will provide us with specific and verifiable tissue structure/function information. In addition, relatively complex classification techniques must be developed so that the data are of use to the end user.

Leavesley, Silas; Ahmed, Wamiq; Bayraktar, Bulent; Rajwa, Bartek; Sturgis, Jennifer; Robinson, J. P.
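The "unmixing" step mentioned above can be illustrated as linear spectral unmixing with a nonnegativity constraint. This is a generic sketch, not the authors' pipeline; the endmember spectra and abundances below are synthetic stand-ins for measured pure-component signatures:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember spectra sampled at 8 wavelengths; a real analysis
# would take these columns from a library of pure-component signatures.
rng = np.random.default_rng(1)
endmembers = rng.random((8, 2))          # columns = pure-component spectra

true_abundance = np.array([0.7, 0.3])
pixel = endmembers @ true_abundance      # noise-free mixed-pixel measurement

# Linear unmixing: minimize ||E a - pixel|| subject to a >= 0.
abundance, residual = nnls(endmembers, pixel)
```

With noise-free data and well-separated endmembers, the nonnegative least-squares fit recovers the mixing fractions exactly; on real images the same solve is repeated per pixel.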



Deconvolution of single photon counting data with a reference method and global analysis  

Microsoft Academic Search

A method based on quenched references and global analysis was used to deconvolve time-resolved single photon counting data. The results from both computer simulated data and real experiments showed that highly accurate and reliable deconvolutions were possible. Fluorescence lifetimes and Stern-Volmer quenching constants for quenching with NaI were determined for the reference substances para-terphenyl, PPO (2,5-diphenyloxazol), POPOP (1,4-bis-(5-phenyl-2-oxazolyl)-benzene), and dimethyl-POPOP,

J.-E. Löfroth



Fully automated deconvolution method for on-line analysis of time-resolved fluorescence spectroscopy data based on an iterative Laguerre expansion technique  

Microsoft Academic Search

Time-resolved fluorescence spectroscopy (TRFS) is a powerful analytical tool for quantifying the biochemical composition of organic and inorganic materials. The potential of TRFS for tissue diagnosis has been recently demonstrated. To facilitate the translation of TRFS to the clinical arena, algorithms for online TRFS data analysis are essential. A fast model-free TRFS deconvolution algorithm based on the Laguerre expansion method

Aditi S. Dabir; Chintan A. Trivedi; Yeontack Ryu; Paritosh Pande; Javier A. Jo
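The core of the Laguerre expansion idea can be shown in a simplified, non-iterative form (the paper's algorithm is iterative; the IRF shape, lifetime, scale parameter, and basis size below are invented): expand the fluorescence impulse response on Laguerre functions and fit the expansion coefficients by linear least squares after convolving the basis with the instrument response.

```python
import numpy as np
from scipy.special import eval_laguerre

dt = 0.05
t = np.arange(0, 20, dt)

# Orthonormal-style Laguerre functions exp(-a*t/2) * L_j(a*t); a is a
# hypothetical time-scale parameter.
alpha = 1.0
basis = np.stack([np.exp(-alpha*t/2) * eval_laguerre(j, alpha*t)
                  for j in range(6)], axis=1)

irf = np.exp(-0.5*((t - 2.0)/0.3)**2)         # Gaussian instrument response
true_decay = np.exp(-t/1.5)                   # mono-exponential fluorescence decay
measured = dt * np.convolve(irf, true_decay)[:t.size]

# Convolve each basis function with the IRF, then solve a linear least-squares
# problem for the expansion coefficients (no iterative deconvolution needed).
conv_basis = np.stack([dt * np.convolve(irf, basis[:, j])[:t.size]
                       for j in range(basis.shape[1])], axis=1)
coeffs, *_ = np.linalg.lstsq(conv_basis, measured, rcond=None)
recovered = basis @ coeffs
```

Because the model is linear in the coefficients, the deconvolution reduces to one matrix solve, which is what makes Laguerre-based methods attractive for online analysis.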



Intrinsic fluorescence spectroscopy of glutamate dehydrogenase: Integrated behavior and deconvolution analysis  

NASA Astrophysics Data System (ADS)

In this paper, we present a deconvolution method aimed at spectrally resolving the broad fluorescence spectra of proteins, namely, of the enzyme bovine liver glutamate dehydrogenase (GDH). The analytical procedure is based on the deconvolution of the emission spectra into three distinct Gaussian fluorescing bands Gj. The relative changes of the Gj parameters are directly related to the conformational changes of the enzyme, and provide interesting information about the fluorescence dynamics of the individual emitting contributions. Our deconvolution method results in an excellent fitting of all the spectra obtained with GDH in a number of experimental conditions (various conformational states of the protein) and describes very well the dynamics of a variety of phenomena, such as the dependence of hexamers association on protein concentration, the dynamics of thermal denaturation, and the interaction process between the enzyme and external quenchers. The investigation was carried out by means of different optical experiments, i.e., native enzyme fluorescence, thermal-induced unfolding, and fluorescence quenching studies, utilizing both the analysis of the “average” behavior of the enzyme and the proposed deconvolution approach.

Pompa, P. P.; Cingolani, R.; Rinaldi, R.
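The three-Gaussian decomposition described above can be reproduced on synthetic data with an ordinary nonlinear fit. A sketch only; the band positions, widths, and amplitudes are invented and are not the GDH values:

```python
import numpy as np
from scipy.optimize import curve_fit

def three_gaussians(x, *p):
    """Sum of three Gaussian bands; p = (A, mu, sigma) for each band."""
    return sum(p[i] * np.exp(-0.5*((x - p[i+1]) / p[i+2])**2)
               for i in range(0, 9, 3))

# Synthetic "protein-like" emission spectrum built from three bands
# (hypothetical positions loosely in the Trp emission range, in nm).
x = np.linspace(300, 420, 400)
true_p = [1.0, 330.0, 12.0,  0.7, 345.0, 15.0,  0.4, 365.0, 18.0]
spectrum = three_gaussians(x, *true_p)

# "Deconvolution" here = fitting the band parameters; initial guesses
# bracket the expected band positions.
p0 = [0.8, 325.0, 10.0,  0.8, 350.0, 10.0,  0.3, 370.0, 15.0]
popt, _ = curve_fit(three_gaussians, x, spectrum, p0=p0)
```

Tracking how the fitted (A, mu, sigma) triplets shift across experimental conditions is what lets such a decomposition report on conformational changes.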



Multichannel Blind Separation and Deconvolution of Images for Document Analysis  

Microsoft Academic Search

In this paper, we apply Bayesian blind source separation (BSS) from noisy convolutive mixtures to jointly separate and restore source images degraded through unknown blur operators, and then linearly mixed. We found that this problem arises in several image processing applications, among which there are some interesting instances of degraded document analysis. In particular, the convolutive mixture model is proposed

Anna Tonazzini; Ivan Gerace; Francesca Martinelli



Homomorphic Deconvolution.  

National Technical Information Service (NTIS)

The technique of homomorphic deconvolution is described briefly and applied to a class of problems in which one of the components can be represented by a sum of complex exponentials and the other by a pulse train.

A. V. Oppenheim
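For the signal class this record names, the technique reduces to cepstral analysis: under log|FFT| a convolution becomes a sum, and the exponential-sum component and the pulse train occupy different "quefrency" ranges. A toy sketch with invented components:

```python
import numpy as np

n = 256
h = 0.9 ** np.arange(n)                    # sum-of-exponentials component
p = np.zeros(n)
p[0], p[40], p[80] = 1.0, 0.5, 0.25        # decaying pulse train, period 40
signal = np.convolve(h, p)[:n]

# Real cepstrum of the composite signal.
spectrum = np.abs(np.fft.fft(signal))
cepstrum = np.real(np.fft.ifft(np.log(spectrum)))

# The exponential lives at low quefrency; the pulse train appears as a peak
# at its period, so the two factors can be "liftered" apart.
peak_quefrency = int(np.argmax(np.abs(cepstrum[10:n//2])) + 10)
```

Zeroing the cepstrum outside a chosen quefrency window and exponentiating back is the homomorphic-deconvolution step that separates the two components.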



FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses  

PubMed Central

The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NF-κB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/p50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high throughput analysis of dye dilution studies within clinical and pharmacological screens with objective and quantitative conclusions.

Shokhirev, Maxim Nikolaievich; Hoffmann, Alexander



Application of automated mass spectrometry deconvolution and identification software for pesticide analysis in surface waters.  


A new approach to surface water analysis has been investigated in order to enhance the detection of different organic contaminants in Nathan Creek, British Columbia. Water samples from Nathan Creek were prepared by liquid/liquid extraction using dichloromethane (DCM) as an extraction solvent and analyzed by a gas chromatography-mass spectrometry method in scan mode (GC-MS scan). To increase sensitivity for pesticide detection, acquired scan data were further analyzed by Automated Mass Spectrometry Deconvolution and Identification Software (AMDIS) incorporated into the Agilent Deconvolution Reporting Software (DRS), which also includes mass spectral libraries for 567 pesticides. Extracts were reanalyzed by gas chromatography-mass spectrometry with single ion monitoring (GC-MS-SIM) to confirm and quantitate detected pesticides. The pesticides atrazine, dimethoate, diazinon, metalaxyl, myclobutanil, napropamide, oxadiazon, propazine and simazine were detected at three sampling sites on the mainstream of the Nathan Creek. Results of the study are further discussed in terms of detectivity and identification level for each pesticide found. The proposed approach of monitoring pesticides in surface waters enables their detection and identification at trace levels. PMID:17090491

Furtula, Vesna; Derksen, George; Colodey, Alan



A novel fluorescence imaging technique combining deconvolution microscopy and spectral analysis for quantitative detection of opportunistic pathogens  

SciTech Connect

A novel fluorescence imaging technique based on deconvolution microscopy and spectral analysis is presented here as an alternative to confocal laser scanning microscopy. It allowed rapid, specific and simultaneous identification of five major opportunistic pathogens, relevant for public health, in suspension and provided quantitative results.

Le Puil, Michael [Florida Gulf Coast University; Biggerstaff, John P. [University of Tennessee, Knoxville (UTK); Weidow, B. [University of Tennessee, Knoxville (UTK); Price, Jeffery R [ORNL; Naser, S. [University of Central Florida; White, D.C. [University of Tennessee, Knoxville (UTK); Alberte, R. [Florida Gulf Coast University



Analysis of thermoluminescence kinetics of CaF2(Tm) peaks with glow curve deconvolution  

Microsoft Academic Search

The purpose of this study is to achieve detailed information on kinetic parameters of the peaks of a dosimetric material. First and general order kinetic formulas are applied with a computerized deconvolution technique to CaF2(Tm) thermoluminescence glow curves. A comparison is presented between deconvolution kinetic parameters and parameters obtained applying Chen's method.

C. Bacci; P. Bernardini; A. di Domenico; C. Furetta; B. Rispoli
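The first-order building block of such a glow-curve fit is the Randall-Wilkins peak; a sketch with illustrative parameter values (not the CaF2:Tm kinetic parameters):

```python
import numpy as np

# First-order (Randall-Wilkins) thermoluminescence glow peak:
# I(T) = n0 * s * exp(-E/kT) * exp(-(s/beta) * integral of exp(-E/kT') dT').
k_B = 8.617e-5            # Boltzmann constant, eV/K
E = 1.1                   # trap depth (eV), hypothetical
s = 1e12                  # frequency factor (1/s), hypothetical
beta = 2.0                # linear heating rate (K/s)
n0 = 1.0                  # initial trapped-charge population (arb. units)

T = np.linspace(300, 550, 2000)
boltz = np.exp(-E / (k_B * T))

# Trapezoid-rule cumulative integral of the escape probability over the ramp.
integral = np.concatenate(([0.0],
                           np.cumsum(0.5*(boltz[1:] + boltz[:-1])*np.diff(T))))
I = n0 * s * boltz * np.exp(-(s / beta) * integral)

T_max = T[np.argmax(I)]   # peak position, the quantity Chen's method works from
```

Glow-curve deconvolution fits a sum of such peaks (first- or general-order) to the measured curve, and the fitted (E, s) values can then be compared against Chen's peak-shape estimates as the abstract describes.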



Deconvolution-based resolution enhancement of chemical ice core records obtained by continuous flow analysis  

NASA Astrophysics Data System (ADS)

Continuous flow analysis (CFA) has become a popular measuring technique for obtaining high-resolution chemical ice core records due to an attractive combination of measuring speed and resolution. However, when analyzing the deeper sections of ice cores or cores from low-accumulation areas, there is still need for further improvement of the resolution. Here a method for resolution enhancement of CFA data is presented. It is demonstrated that it is possible to improve the resolution of CFA data by restoring some of the detail that was lost in the measuring process, thus improving the usefulness of the data for high-resolution studies such as annual layer counting. The presented method uses deconvolution techniques and is robust to the presence of noise in the measurements. If integrated into the data processing, it requires no additional data collection. The method is applied to selected ice core data sequences from Greenland and Antarctica, and the results demonstrate that the data quality can be significantly improved.

Rasmussen, S. O.; Andersen, K. K.; Johnsen, S. J.; Bigler, M.; McCormack, T.
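A minimal stand-in for a noise-robust deconvolution of this kind is the Wiener filter; everything below (the annual-layer signal, the smoothing kernel, the noise level) is synthetic and not taken from the actual CFA system:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
annual = (np.arange(n) // 16 % 2).astype(float)     # idealized annual layering

# Hypothetical CFA smoothing kernel (dispersion in tubing and detector volume).
kernel = np.zeros(n)
g = np.exp(-0.5 * (np.arange(-25, 26) / 4.0) ** 2)
kernel[:51] = g / g.sum()
kernel = np.roll(kernel, -25)                       # center kernel at sample 0

measured = np.real(np.fft.ifft(np.fft.fft(annual) * np.fft.fft(kernel)))
measured += 0.01 * rng.standard_normal(n)           # measurement noise

# Wiener deconvolution: a regularized inverse filter that stays stable
# in the presence of the noise term.
K = np.fft.fft(kernel)
nsr = 1e-3                                          # noise-to-signal power estimate
G = np.conj(K) / (np.abs(K) ** 2 + nsr)
restored = np.real(np.fft.ifft(np.fft.fft(measured) * G))
```

The regularization term keeps the filter from amplifying frequencies where noise dominates, which is the robustness property the abstract emphasizes for annual-layer counting.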



Contrasts for multichannel blind deconvolution  

Microsoft Academic Search

A class of optimization criteria is proposed whose maximization allows us to carry out blind multichannel deconvolution in the presence of additive noise. Contrasts presented in the paper encompass those related to source separation and independent component analysis problems

Pierre Comon



Fried deconvolution  

NASA Astrophysics Data System (ADS)

In this paper we present a new approach to deblur the effect of atmospheric turbulence in the case of long-range imaging. Our method is based on an analytical formulation, the Fried kernel, of the atmosphere modulation transfer function (MTF) and a framelet-based deconvolution algorithm. An important parameter is the refractive index structure constant, which requires specific measurements to be known. We therefore propose a method which provides a good estimate of this parameter from the input blurred image. The final algorithms are very easy to implement and show very good results on both simulated blur and real images.

Gilles, Jérôme; Osher, Stanley



AIRY-LN: an ad-hoc numerical tool for deconvolution of images from the LBT instrument LINC-NIRVANA  

NASA Astrophysics Data System (ADS)

LINC-NIRVANA (LN) is the German-Italian Fizeau beam combiner for the Large Binocular Telescope (LBT), composed of two 8.4-m apertures on a single mount. It will provide multiple images of the same astrophysical target corresponding to different orientations of the 22.8-m maximum baseline. Starting from the already existing Software Package AIRY (a set of IDL-based modules developed within the CAOS "system" and dedicated to simulation and/or deconvolution of single or multiple images), an ad-hoc version has been especially designed for the data that will be obtained with LN. In this paper, we present the resulting Software Package AIRY-LN. Its capabilities, including quick-look methods, methods for specific classes of astronomical objects, PSF extraction, and a blind deconvolution algorithm are detailed. An IDL-licence-free (by means of the IDL Virtual Machine) and observer-oriented version of the whole package (with pre-set LN image-processing parameters) is also presented.

Desiderà, Gabriele; La Camera, Andrea; Boccacci, Patrizia; Bertero, Mario; Carbillet, Marcel



Physics analysis tools  

SciTech Connect

There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level ones, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

Kunz, P.F.



Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis  

SciTech Connect

We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 µm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-Ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 µm/pixel, without the use of oil-based lenses. A full textural analysis on track No. 82 is presented here as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129, and No. 140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D Deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

Greenberg, M.; Ebel, D.S. (AMNH)



Three-dimensional analysis tool for segmenting and measuring the structure of telomeres in mammalian nuclei  

Microsoft Academic Search

Quantitative analysis in combination with fluorescence microscopy calls for innovative digital image measurement tools. We have developed a three-dimensional tool for segmenting and analyzing FISH stained telomeres in interphase nuclei. After deconvolution of the images, we segment the individual telomeres and measure a distribution parameter we call rhoT. This parameter describes if the telomeres are distributed in a sphere-like volume

Bart J. Vermolen; Ian T. Young; Alice Chuang; Landon Wark; Tony Chuang; Sabine Mai; Yuval Garini



Swift Science Analysis Tools  

NASA Astrophysics Data System (ADS)

Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site, and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as they are processed in the Swift Data Center (SDC). Once all the data for an observation are available, they will be transferred to the HEASARC and data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.

Marshall, F. E.; Swift Team



Inverse filtering and deconvolution  

Microsoft Academic Search

This paper studies the so-called inverse filtering and deconvolution problem from different angles. To start with, both exact and almost deconvolution problems are formulated, and the necessary and sufficient conditions for their solvability are investigated. Exact and almost deconvolution problems seek filters that can estimate the unknown inputs of the given plant or system either exactly or approximately, whatever may

Ali Saberi; Anton A. Stoorvogel; Peddapullaiah Sannuti



A deconvolution method to improve automated 3D-analysis of dendritic spines: application to a mouse model of Huntington's disease.  


Dendritic spines are postsynaptic structures the morphology of which correlates with the strength of synaptic efficacy. Measurements of spine density and spine morphology are achievable using recent imaging and bioinformatics tools. The three-dimensional automated analysis requires optimization of image acquisition and treatment. Here, we studied the critical steps for optimal confocal microscopy imaging of dendritic spines. We characterize the deconvolution process and show that it improves spine morphology analysis. With this method, images of dendritic spines from medium spiny neurons are automatically detected by the software Neuronstudio, which retrieves spine density as well as spine diameter and volume. This approach is illustrated with three-dimensional analysis of dendritic spines in a mouse model of Huntington's disease: the transgenic R6/2 mice. In symptomatic mutant mice, we confirm the decrease in spine density, and the method brings further information, showing a decrease in spine volume and dendrite diameter. Moreover, we show a significant decrease in spine density at a presymptomatic age, which had so far gone unnoticed. PMID:21822732

Heck, Nicolas; Betuing, Sandrine; Vanhoutte, Peter; Caboche, Jocelyne



Convex constraint analysis: a natural deconvolution of circular dichroism curves of proteins.  


A new algorithm, called convex constraint analysis, has been developed to deduce the chiral contribution of the common secondary structures directly from experimental CD curves of a large number of proteins. The analysis is based on CD data reported by Yang, J.T., Wu, C.-S.C. and Martinez, H.M. [Methods Enzymol., 130, 208-269 (1986)]. Application of the decomposition algorithm for simulated protein data sets resulted in component spectra [B (lambda, i)] identical to the originals and weights [C (i, k)] with excellent Pearson correlation coefficients (R) [Chang, C.T., Wu, C.-S.C. and Yang, J.T. (1978) Anal. Biochem., 91, 12-31]. Test runs were performed on sets of simulated protein spectra created by the Monte Carlo technique using poly-L-lysine-based pure component spectra. The significant correlation coefficients (R greater than 0.9) demonstrated the high power of the algorithm. The algorithm, applied to globular protein data, independent of X-ray data, revealed that the CD spectrum of a given protein is composed of at least four independent sources of chirality. Three of the computed component curves show remarkable resemblance to the CD spectra of known protein secondary structures. This approach yields a significant improvement in secondary structural evaluations when compared with previous methods, as compared with X-ray data, and yields a realistic set of pure component spectra. The new method is a useful tool not only in analyzing CD spectra of globular proteins but also has the potential for the analysis of integral membrane proteins. PMID:1946324

Perczel, A; Hollósi, M; Tusnády, G; Fasman, G D



Inter-pixel capacitance: prospects for deconvolution  

NASA Astrophysics Data System (ADS)

We provide an IDL implementation of Fourier deconvolution and apply it to data from a WFC3 H1RG near-infrared array detector that exhibits inter-pixel capacitance (IPC). The deconvolution removes the most obvious deleterious effect of IPC: the cross-like pattern (i.e. the arithmetic symbol for addition, "+") of charge around pixels with large dark current, a.k.a. "hot" pixels. We also exhibit a case in which our physical interpretation of a detector defect changed qualitatively before and after the deconvolution: in a region of multiple "scratches" on the detector, before deconvolution we thought each "scratch" was spatially resolved by the 18-micron pixels, but after deconvolution, it appears that the width of each "scratch" is much smaller than a pixel. We briefly discuss a number of potential benefits of deconvolution of data from arrays with IPC: the preliminary analysis in this report may motivate and expedite additional simulations and observations upon which a more definitive and quantitative analysis will be based. In practice, deconvolution of IPC can be performed after images have been calibrated with the nominal (i.e. existing) pipeline process, although we discuss how, in principle and potentially for optimum effect, the deconvolution of IPC should be performed on all data (science images, flat-fields, etc.) at the outset, after reading out the arrays and prior to calibrating the science images with darks, flat-fields, cosmic-ray rejection, etc. A subsequent report will quantify the benefits of IPC-deconvolution prior to, instead of after, pixel-by-pixel calibration; this report only outlines the potential benefits qualitatively.
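The Fourier-domain removal of a cross-like IPC kernel can be sketched in a few lines. The 3x3 kernel form and the coupling value alpha below are assumptions for illustration, not WFC3 calibration values, and the sketch is in Python rather than the IDL implementation described above:

```python
import numpy as np

alpha = 0.02                                   # assumed coupling fraction
kernel = np.array([[0.0,   alpha,         0.0],
                   [alpha, 1 - 4 * alpha, alpha],
                   [0.0,   alpha,         0.0]])

def ipc_deconvolve(image, kernel):
    """Undo IPC blurring by dividing in the Fourier domain."""
    k = np.zeros_like(image, dtype=float)
    k[:3, :3] = kernel
    k = np.roll(k, (-1, -1), axis=(0, 1))      # centre the kernel at (0, 0)
    return np.real(np.fft.ifft2(np.fft.fft2(image) / np.fft.fft2(k)))

# A single hot pixel spread into a "+" pattern is restored to a point:
img = np.zeros((16, 16))
img[8, 8] = 1000.0                             # one "hot" pixel
k_full = np.zeros((16, 16)); k_full[:3, :3] = kernel
k_full = np.roll(k_full, (-1, -1), axis=(0, 1))
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(k_full)))
restored = ipc_deconvolve(blurred, kernel)
```

Division by the kernel's transform is safe here because a small coupling alpha keeps the transform bounded away from zero at all spatial frequencies.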

McCullough, P.



eCRAM computer algorithm for implementation of the charge ratio analysis method to deconvolute electrospray ionization mass spectra  

NASA Astrophysics Data System (ADS)

A computer program (eCRAM) has been developed for automated processing of electrospray mass spectra based on the charge ratio analysis method. The eCRAM algorithm deconvolutes electrospray mass spectra solely from the ratio of mass-to-charge (m/z) values of multiply charged ions. The program first determines the ion charge by correlating the ratio of m/z values for any two (i.e., consecutive or non-consecutive) multiply charged ions to the unique ratios of two integers. The mass, and subsequently the identity of the charge carrying species, is further determined from m/z values and charge states of any two ions. For the interpretation of high-resolution electrospray mass spectra, eCRAM correlates isotopic peaks that share the same isotopic compositions. This process is also performed through charge ratio analysis after correcting the multiply charged ions to their lowest common ion charge. The application of eCRAM algorithm has been demonstrated with theoretical mass-to-charge ratios for proteins lysozyme and carbonic anhydrase, as well as experimental data for both low and high-resolution FT-ICR electrospray mass spectra of a range of proteins (ubiquitin, cytochrome c, transthyretin, lysozyme and calmodulin). This also included the simulated data for mixtures by combining experimental data for ubiquitin, cytochrome c and transthyretin.
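The charge-ratio arithmetic described above can be sketched as follows. This is a minimal illustration of the idea, not the published eCRAM code; the protonated-ion mass convention and the example protein mass are assumptions:

```python
# Two ions of the same neutral mass M carrying z and z+1 protons
# (mp = 1.00728 Da) satisfy
#   m1 = (M + z*mp)/z  and  m2 = (M + (z+1)*mp)/(z+1),  with m1 > m2,
# so z = (m2 - mp)/(m1 - m2) and M = z*(m1 - mp).

PROTON = 1.00728  # Da, mass of the charge-carrying proton

def charge_and_mass(m1, m2):
    """Given m/z of two consecutive charge states (m1 > m2), return (z, M)."""
    if m1 <= m2:
        raise ValueError("expected m1 > m2 (lower charge state first)")
    z = round((m2 - PROTON) / (m1 - m2))   # charge must be an integer
    return z, z * (m1 - PROTON)

# Example with an assumed 8560.0 Da protein observed at 10+ and 11+
m10 = (8560.0 + 10 * PROTON) / 10
m11 = (8560.0 + 11 * PROTON) / 11
z, M = charge_and_mass(m10, m11)
```

The same ratio test extended to non-consecutive charge states is what lets the method work from any two multiply charged ions.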

Maleknia, Simin D.; Green, David C.



Boronic acid-protected gold clusters capable of asymmetric induction: spectral deconvolution analysis of their electronic absorption and magnetic circular dichroism.  


Gold clusters protected by 3-mercaptophenylboronic acid (3-MPB) with a mean core diameter of 1.1 nm are successfully isolated, and their absorption, magnetic circular dichroism (MCD), and chiroptical responses in metal-based electronic transition regions, which can be induced by surface D-/L-fructose complexation, are examined. Because MCD basically corresponds to electronic transitions in the absorption spectrum, simultaneous deconvolution analysis of the electronic absorption and MCD spectra of the gold cluster compound is conducted under the constraint that a single set of Gaussian components be used for their fitting. We then find that the fructose-induced chiroptical response is explained in terms of the experimentally deconvoluted spectra. We expect this spectral analysis to contribute to a better understanding of the electronic states and the origin of the optical activity in chiral metal clusters. PMID:22303900
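The constrained simultaneous fit can be sketched with one shared set of Gaussian centres and widths and independent amplitudes for the two spectra. The model form and the synthetic two-band data below are assumptions for illustration, not the authors' measurements:

```python
import numpy as np
from scipy.optimize import least_squares

x = np.linspace(350, 700, 300)          # wavelength grid, nm

def gaussians(x, centres, widths, amps):
    return sum(a * np.exp(-((x - c) / w) ** 2)
               for c, w, a in zip(centres, widths, amps))

# Synthetic "absorption" and "MCD" built from two shared bands
abs_data = gaussians(x, [450, 560], [30, 40], [1.0, 0.6])
mcd_data = gaussians(x, [450, 560], [30, 40], [0.3, -0.5])

def residual(p):
    c, w = p[0:2], p[2:4]               # shared centres and widths
    return np.concatenate([gaussians(x, c, w, p[4:6]) - abs_data,
                           gaussians(x, c, w, p[6:8]) - mcd_data])

# One set of centres/widths, separate amplitudes for each spectrum
p0 = [440.0, 570.0, 25.0, 35.0, 0.8, 0.8, 0.1, -0.1]
fit = least_squares(residual, p0)
centres = sorted(fit.x[0:2])
```

Concatenating both residual vectors is what enforces the single-component-set constraint: the optimizer cannot improve one spectrum's fit by moving a band that the other spectrum also uses.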

Yao, Hiroshi; Saeki, Masanori; Sasaki, Akito



Blind deconvolution via cumulant extrema  

Microsoft Academic Search

Classical deconvolution is concerned with the task of recovering an excitation signal, given the response of a known time-invariant linear operator to that excitation. Deconvolution is discussed along with its more challenging counterpart, blind deconvolution, where no knowledge of the linear operator is assumed. This discussion focuses on a class of deconvolution algorithms based on higher-order statistics, and more particularly,

J. A. Cadzow



Heliostat cost-analysis tool  

Microsoft Academic Search

A heliostat cost analysis tool (HELCAT) that processes manufacturing, transportation, and installation cost data was developed to provide a consistent structure for cost analyses. HELCAT calculates a representative product price based on direct input data and various economic, financial, and accounting assumptions. The characteristics of this tool and its initial application in the evaluation of second generation heliostat cost

L. D. Brandt; R. E. Chang



Shot Planning and Analysis Tools  

Microsoft Academic Search

Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand.

A Casey; R Beeler; A Conder; R Fallejo; M Flegel; M Hutton; K Jancaitis; V Lakamsani; D Potter; S Reisdorf; J Tappero; P Whitman; W Carr; Z Liao



Information Gathering and Analysis Tools  

NSDL National Science Digital Library

The National Center for Environmental Decision-making Research aims "to improv[e] environmental decision making" at regional, state, and local levels. Administered by the Joint Institute for Energy and Environment in Knoxville, Tennessee, NCEDR offers many decision-making resources, most prominently, tools for information gathering and analysis. Users may select from eight categories of tool use, from Identifying Values to Post-Decision Assessment. Within each category, subcategories offer information and tools on economic market assessment, ecological relationships, and other topics. Additional Links and commentary on Strengths & Weaknesses (of tools), Communicating the Results, Looking Ahead, and Key Sources round out the site.

National Center for Environmental Decision-making Research


Medical Ultrasound Image Deconvolution  

NASA Astrophysics Data System (ADS)

In medical pulse-echo ultrasound imaging, a constant sound speed of 1,540 m/s in soft tissues is assumed. When the actual speed is different, the mismatch can lead to image distortions. Even if the assumed speed is correct, ultrasound images can be difficult to interpret due to image blurring and the presence of speckle. However, this can be improved by non-blind deconvolution if the point-spread function (PSF) is known. In clinical applications, a sufficiently accurate estimate of the PSF is difficult to obtain because of the unknown properties (including speed of sound) of soft tissues. In this paper, we address two topics: first, we explore the sensitivity of our deconvolution algorithm to variations in the speed of sound in the tissue; second, we extend our deconvolution algorithm to enable it to adapt to (and estimate) an unknown sound speed. In the first topic, the results reveal that the deconvolution output is sufficiently sensitive to the accuracy of the sound speed that the speed itself can be estimated using deconvolution. However, qualitative assessment suggests that we may not need the exact speed of sound for successful deconvolution so long as the assumed speed does not deviate significantly from the true value. In the second topic, the goal is to gradually adapt the assumed sound speed to improve the deconvolution and eventually estimate the true sound speed. We tested our algorithm with in vitro phantoms, where the estimation error was found to be +0.01 ± 0.60% (mean ± standard deviation). In addition to the speed estimation itself, our method has also proved capable of producing better restoration of the ultrasound images than deconvolution with an assumed speed of 1,540 m/s when this assumption is significantly in error.
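Non-blind Wiener deconvolution with a known PSF, the starting point of the approach above, can be sketched in one dimension. The pulse shape, noise-to-signal ratio, and scatterer positions below are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

def wiener_deconvolve(y, psf, nsr=1e-3):
    """Wiener estimate of x from y = x (*) psf: X = Y H* / (|H|^2 + NSR)."""
    H = np.fft.fft(psf)
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft(np.fft.fft(y) * G))

n = 256
t = np.arange(-32, 32)
pulse = np.exp(-(t / 6.0) ** 2) * np.cos(0.5 * t)     # assumed pulse shape
psf = np.roll(np.pad(pulse, (0, n - pulse.size)), -32)  # centred at sample 0

x = np.zeros(n)
x[[60, 100, 140]] = [1.0, 0.5, 0.8]                   # point scatterers
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(psf)))
restored = wiener_deconvolve(y, psf, nsr=1e-6)
```

Sweeping the assumed speed of sound would, in this picture, amount to recomputing the PSF for each candidate speed and keeping the one whose deconvolution output is sharpest.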

Shin, H.-C.; Prager, R. W.; Gomersall, H.; Kingsbury, N.; Treece, G. M.; Gee, A. H.


Analysis of an iterative deconvolution approach for estimating the source wavelet during waveform inversion of crosshole georadar data  

Microsoft Academic Search

A major issue in the application of waveform inversion methods to crosshole georadar data is the accurate estimation of the source wavelet. Here, we explore the viability and robustness of incorporating this step into a time-domain waveform inversion procedure through an iterative deconvolution approach. Our results indicate that, at least in non-dispersive electrical environments, such an approach provides remarkably accurate

Florian Belina; James Irving; Jacques Ernst; Klaus Holliger


Bayesian deconvolution and analysis of photoelectron or any other spectra: Fermi-liquid versus marginal Fermi-liquid behavior of the 3d electrons in Ni  

NASA Astrophysics Data System (ADS)

We present a simple and effective iterative deconvolution of noisy experimental spectra D broadened by the spectrometer function. We show that this ``iterative Bayesian deconvolution'' is closely related to the more complex ``Bayesian analysis,'' also known as the quantified maximum-entropy method. A model m of the true spectral function is needed in both cases. The Bayesian analysis is the most powerful and precise method to relate measured spectra D to the corresponding theoretical models m via the respective probabilities, but two grave conceptual problems together with two severe technical difficulties prevented widespread application. We remove these four obstacles by (i) demonstrating analytically and also by computer simulations that the most probable deconvolution â obtained as a by-product from the Bayesian analysis gets closer to the true spectral function as the quality of m increases, (ii) finding it equivalent but vastly more efficient to optimize the parameters contained in a given model m by the usual least-squares fit between D and the convolution of m prior to the Bayesian analysis instead of using the Bayesian analysis itself for that purpose, (iii) approximating the convolution by a summation over the energies of the n data points only, with the normalization of the spectrometer function chosen to minimize the errors at both edges of the spectrum, and (iv) avoiding the severe convergence problems frequently encountered in the Bayesian analysis by a simple reformulation of the corresponding system of n nonlinear equations.
We also apply our version of the Bayesian analysis to angle-resolved photoelectron spectra taken at normal emission from Ni(111) close to the Fermi energy at about 12 K, using two different physical models: Compared with the marginal Fermi liquid, the Fermi-liquid line shape turns out to be about 10^4 times more probable to conform with the observed structure of the majority and minority spin peaks in the low-photon and small-binding-energy region.
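A minimal illustration of iterative deconvolution in this spirit is the closely related Richardson-Lucy iteration, which likewise starts from a model m of the true spectral function (here simply a flat one). This is a sketch of the general technique with synthetic data, not the authors' code:

```python
import numpy as np

def iterative_deconvolve(D, s, m, n_iter=200):
    """Sharpen spectrum D blurred by spectrometer function s, starting
    from a model m, via multiplicative Richardson-Lucy updates."""
    s = s / s.sum()
    a = m.astype(float).copy()
    for _ in range(n_iter):
        conv = np.convolve(a, s, mode="same")
        ratio = D / np.maximum(conv, 1e-12)
        a = a * np.convolve(ratio, s[::-1], mode="same")  # correlate with s
    return a

# Two narrow lines blurred by a Gaussian resolution function:
truth = np.zeros(200)
truth[[80, 120]] = [1.0, 0.7]
s = np.exp(-(np.arange(-20, 21) / 5.0) ** 2)
D = np.convolve(truth, s / s.sum(), mode="same")
a_hat = iterative_deconvolve(D, s, m=np.ones(200))
```

The multiplicative update conserves the total counts in the spectrum while progressively concentrating intensity back into the narrow lines.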

Gerhardt, U.; Marquardt, S.; Schroeder, N.; Weiss, S.



Atlas Distributed Analysis Tools  

NASA Astrophysics Data System (ADS)

The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experiences obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data was registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large scale Data production using Grid flavors in several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich



Tc-99m DTPA renal scintigraphy using deconvolution analysis with six functional images of the mean time to evaluate acute pyelonephritis.  


In 38 children with proved P-fimbriated Escherichia coli acute pyelonephritis, Tc-99m DTPA dynamic renal scintigraphy in the zoom mode using deconvolution analysis was performed, and the results were compared with those of Tc-99m DMSA scans. From the dynamic study, six functional images of the mean time were generated. Each functional image was analyzed separately to search for focal areas of increased mean time within the kidney contour, especially over the kidney parenchyma. Time-activity curves from these areas were generated and analyzed. Tc-99m DMSA scintigraphy showed generalized or focal decreased uptake in 32 (41.8%) kidneys, and deconvolution analysis of Tc-99m DTPA scintigraphy revealed pathologic renographic curves in 58 (77.6%) kidneys. Prolonged whole-kidney and normal renal parenchymal transit times (dilatation without obstruction) were found in 38 (50%) kidneys, whereas prolonged whole-kidney and renal parenchymal transit times (dilatation with obstruction) were observed in 20 (27.6%) kidneys. Separate analysis of each of the six functional images of the mean time showed focal areas of increased mean time in the kidney parenchyma of 11 kidneys. In five cases, time-activity curves from these areas showed a sharp increase of activity on the descending part of the curve, which might reflect the return of urine from the collecting system into kidney cortex (i.e., intrarenal reflux). These results showed that in a urinary tract with acute pyelonephritis, urodynamic changes may lead to obstructive nephropathy and intrarenal reflux. Tc-99m DTPA renal scintigraphy in the zoom mode using deconvolution analysis with six functional images of the mean time has proved to be a valuable method to evaluate acute pyelonephritis, thus allowing dynamic and morphologic analysis of the urinary tract at the same time. PMID:9988072
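The deconvolution step underlying such transit-time measurements can be sketched as a triangular-matrix solve: the renal curve R is the blood input I convolved with a retention function H, and the mean transit time follows as area(H)/H(0). The input and retention shapes below are assumed synthetic curves, not the clinical software used in the study:

```python
import numpy as np
from scipy.linalg import toeplitz, solve_triangular

dt = 1.0                                  # frame duration (arbitrary units)
t = np.arange(60.0)
blood = np.exp(-t / 4.0)                  # assumed blood input curve I(t)
H_true = (t < 12).astype(float)           # retention function: MTT = 12 frames
renal = np.convolve(blood, H_true)[:60] * dt   # simulated renal curve R(t)

# R = A @ H, with A the lower-triangular convolution matrix of the input
A = np.tril(toeplitz(blood)) * dt
H_est = solve_triangular(A, renal, lower=True)
mtt = H_est.sum() * dt / H_est[0]         # mean transit time = area / plateau
```

With noisy clinical curves this direct solve is ill-conditioned and is usually regularized or smoothed first; the noiseless sketch recovers the retention function exactly.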

Poropat, M; Batinić, D; Basić, M; Nizić, L J; Dodig, D; Milosević, D; Votava-Raić, A; Tezak, S; Vrljicak, K; Huić, D; Medvedec, M



Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols  

EPA Science Inventory

In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...


EPR spectrum deconvolution and dose assessment of fossil tooth enamel using maximum likelihood common factor analysis  

Microsoft Academic Search

In order to determine the components which give rise to the EPR spectrum around g = 2, we have applied Maximum Likelihood Common Factor Analysis (MLCFA) to the EPR spectra of enamel sample 1126, which has previously been analysed by continuous wave and pulsed EPR as well as EPR microscopy. MLCFA yielded consistent results on three sets of X-band spectra
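The component-counting idea behind such factor analyses can be illustrated with a plain SVD rank estimate standing in for full maximum likelihood common factor analysis. All lineshapes and mixtures below are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 120)

# Three assumed component lineshapes
components = np.array([np.exp(-((x - c) / 0.08) ** 2)
                       for c in (0.3, 0.5, 0.7)])

# Ten "measured" spectra: random positive mixtures of the components
spectra = rng.uniform(0.2, 1.0, (10, 3)) @ components

# Singular values separate the true components from numerical noise
s = np.linalg.svd(spectra, compute_uv=False)
n_components = int(np.sum(s > 1e-8 * s[0]))
```

With real noisy spectra the cutoff must be set against the noise floor rather than machine precision, which is the harder statistical problem MLCFA addresses.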

G. Vanhaelewyn; F. Callens; R. Grün



Application of Automated Mass Spectrometry Deconvolution and Identification Software for Pesticide Analysis in Surface Waters  

Microsoft Academic Search

A new approach to surface water analysis has been investigated in order to enhance the detection of different organic contaminants in Nathan Creek, British Columbia. Water samples from Nathan Creek were prepared by liquid/liquid extraction using dichloromethane (DCM) as the extraction solvent and analyzed by a gas chromatography mass spectrometry method in scan mode (GC-MS scan). To increase sensitivity for pesticides




VCAT: Visual Crosswalk Analysis Tool  

SciTech Connect

VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory



Analysis Tools for Fusion Simulations  

NASA Astrophysics Data System (ADS)

In this talk, we highlight two analysis tools for evaluating fusion simulations. The first tool is for interactively exploring the topology of the magnetic field using a Poincaré map. Unlike traditional Poincaré maps that rely on a dense set of puncture points to form a contiguous representation of the magnetic surface, we use a sparse set of connected puncture points. The puncture points are connected based on a rational approximation of the safety factor. The resulting analysis not only allows for the visualization of magnetic surfaces using a minimal number of puncture points but also identifies features such as magnetic islands. The second tool is for performing query-based analysis on simulations utilizing particles. To assist in the analysis of simulation codes that utilize millions to billions of particles, we have developed analysis tools that combine parallel coordinate plots with accelerated index searches. Parallel coordinate plots allow one to identify trends within multivariate data, while accelerated index searches allow one to quickly perform range-based queries on a large number of multivariate entries.
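At its simplest, the query-based selection described above amounts to range predicates over particle attributes; production tools accelerate this with bitmap or tree indexes, but a toy sketch with assumed attribute names is:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Assumed particle attributes; real codes carry many more per particle
particles = {"energy": rng.uniform(0.0, 10.0, n),
             "pitch":  rng.uniform(-1.0, 1.0, n)}

# Range query: energetic, field-aligned particles
mask = (particles["energy"] > 8.0) & (np.abs(particles["pitch"]) > 0.9)
selected = np.flatnonzero(mask)
```

An index structure replaces the full-array scan implied by the boolean mask, which is what makes such queries interactive at billions of particles.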

Sanderson, Allen; Kruger, Scott; Breslau, Joshua; Ethier, Stephane



WEAT: Web Enabled Analysis Tool  

NSDL National Science Digital Library

Behavioral Risk Factor Surveillance System: The BRFSS, the world’s largest telephone survey, tracks health risks in the United States. Information from the survey is used to improve the health of the American people. This tool allows users to create cross-tabulations and perform logistic analysis on these data.

Centers for Disease Control


Graphical Multiprocessing Analysis Tool (GMAT)  

Microsoft Academic Search

The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development we have developed two methodologies for the graphical display of the execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states,

M. K. Seager; S. Campbell; S. Sikora; R. Strout; M. Zosel



Scenario Analysis Tool Suite: A User's Guide.  

National Technical Information Service (NTIS)

This document is a user guide for the software product, the Scenario Analysis Tool Suite (version 1.5). The tool suite implements several scenario analysis techniques, Morphological analysis, Field Anomaly Relaxation analysis, Battelle approach, Bayesian ...

C. Dilek



VOSpec: VO Spectral Analysis Tool  

NASA Astrophysics Data System (ADS)

VOSpec is a multi-wavelength spectral analysis tool with access to spectra, theoretical models and atomic and molecular line databases registered in the VO. The standard tools of VOSpec include line and continuum fitting, redshift and reddening correction, spectral arithmetic and convolution between spectra, equivalent width and flux calculations, and a best fitting algorithm for fitting selected SEDs to a TSAP service. VOSpec offers several display modes (tree vs table) and organising functionalities according to the available metadata for each service, including distance from the observation position.

Esavo Team



Blind image deconvolution  

Microsoft Academic Search

The goal of image restoration is to reconstruct the original scene from a degraded observation. This recovery process is critical to many image processing applications. Although classical linear image restoration has been thoroughly studied, the more difficult problem of blind image restoration has numerous research possibilities. We introduce the problem of blind deconvolution for images, provide an overview of the

DEEPA KUNDUR; D. Hatzinakos



Blind image deconvolution revisited  

Microsoft Academic Search

The article discusses the major approaches, such as projection-based blind deconvolution and maximum likelihood restoration, that we overlooked previously (see ibid., no. 5, 1996). We discuss them for completeness along with some other works found in the literature. As the area of blind image restoration is a rapidly growing field of research, new methods are constantly being developed

D. Kundur; D. Hatzinakos



Heliostat cost-analysis tool  

SciTech Connect

Estimated production costs of solar energy systems serve as guides for future component development and as measures of the potential economic viability of the technologies. The analysis of heliostat costs is particularly important since the heliostat field is the largest cost component of a solar central receiver plant. A heliostat cost analysis tool (HELCAT) that processes manufacturing, transportation, and installation cost data has been developed to provide a consistent structure for cost analyses. HELCAT calculates a representative product price based on direct input data (e.g. direct materials, direct labor, capital requirements) and various economic, financial, and accounting assumptions. The characteristics of this tool and its initial application in the evaluation of second generation heliostat cost estimates are discussed. A set of nominal economic and financial parameters is also suggested.

Brandt, L.D.; Chang, R.E.



Methods for deconvoluting and interpreting complex gamma- and x-ray spectral regions  

SciTech Connect

Germanium and silicon detectors are now widely used for the detection and measurement of x and gamma radiation. However, some analysis situations and spectral regions have heretofore been too complex to deconvolute and interpret by techniques in general use. One example is the L x-ray spectrum of an element taken with a Ge or Si detector. This paper describes some new tools and methods that were developed to analyze complex spectral regions; they are illustrated with examples.

Gunnink, R.



Failure environment analysis tool applications  

NASA Astrophysics Data System (ADS)

Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

Pack, Ginger L.; Wadsworth, David B.



Total variation blind deconvolution  

Microsoft Academic Search

We present a blind deconvolution algorithm based on the total variation (TV) minimization method proposed by Acar and Vogel (1994). The motivation for regularizing with the TV norm is that it is extremely effective for recovering edges of images as well as some blurring functions, e.g., motion blur and out-of-focus blur. An alternating minimization (AM) implicit iterative scheme is devised

Tony F. Chan; Chiu-Kwong Wong



Joint deconvolution and imaging  

NASA Astrophysics Data System (ADS)

We investigate a Wiener fusion method to optimally combine multiple estimates for the problem of image deblurring given a known blur and a corpus of sharper training images. Nearest-neighbor estimation of high frequency information from training images is fused with a standard Wiener deconvolution estimate. Results show an improvement in sharpness and decreased artifacts compared to either the standard Wiener filter or the nearest-neighbor reconstruction.

Anderson, Hyrum S.; Gupta, Maya R.



Graphical Multiprocessing Analysis Tool (GMAT)  

SciTech Connect

The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development we have developed two methodologies for the graphical display of the execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states, and timeline, which represents tasks as a flowing sequence of events. Information about the code can be displayed as the application runs (dynamic mode) or played back with time under user control (static mode). This document discusses the design and user interface issues involved in developing the parallel application display GMAT family. Also, we present an introductory user's guide for both tools. 4 figs.

Seager, M.K.; Campbell, S.; Sikora, S.; Strout, R.; Zosel, M.



EASY-GOING deconvolution: Automated MQMAS NMR spectrum analysis based on a model with analytical crystallite excitation efficiencies.  


The EASY-GOING deconvolution (EGdeconv) program is extended to enable fast and automated fitting of multiple quantum magic angle spinning (MQMAS) spectra guided by evolutionary algorithms. We implemented an analytical crystallite excitation model for spectrum simulation. Currently these efficiencies are limited to two-pulse and z-filtered 3QMAS spectra of spin 3/2 and 5/2 nuclei, whereas for higher spin-quantum numbers ideal excitation is assumed. The analytical expressions are explained in full to avoid ambiguity and facilitate others to use them. The EGdeconv program can fit interaction parameter distributions. It currently includes a Gaussian distribution for the chemical shift and an (extended) Czjzek distribution for the quadrupolar interaction. We provide three case studies to illustrate EGdeconv's capabilities for fitting MQMAS spectra. The EGdeconv program is available as is on our website for 64-bit Linux operating systems. PMID:23376481
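The evolutionary fitting loop can be illustrated with SciPy's differential evolution minimizing a least-squares cost over lineshape parameters. This is a toy stand-in with an assumed single-Gaussian spectrum model, not EGdeconv's own algorithm or MQMAS lineshapes:

```python
import numpy as np
from scipy.optimize import differential_evolution

x = np.linspace(-50, 50, 400)            # frequency axis (arbitrary units)

def model(params):
    centre, width, amp = params
    return amp * np.exp(-((x - centre) / width) ** 2)

target = model([5.0, 8.0, 1.0])          # synthetic "spectrum" to be fitted

def cost(params):
    return float(np.sum((model(params) - target) ** 2))

# Evolutionary search over bounded parameter space, as in guided fitting
result = differential_evolution(cost,
                                bounds=[(-20, 20), (1, 20), (0, 2)],
                                seed=1)
```

Bounded global search is what makes this style of fitting robust to poor starting guesses, at the cost of many more model evaluations than a local least-squares fit.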

Grimminck, Dennis L A G; van Meerten, Bas; Verkuijlen, Margriet H W; van Eck, Ernst R H; Meerts, W Leo; Kentgens, Arno P M



EASY-GOING deconvolution: Automated MQMAS NMR spectrum analysis based on a model with analytical crystallite excitation efficiencies  

NASA Astrophysics Data System (ADS)

The EASY-GOING deconvolution (EGdeconv) program is extended to enable fast and automated fitting of multiple quantum magic angle spinning (MQMAS) spectra guided by evolutionary algorithms. We implemented an analytical crystallite excitation model for spectrum simulation. Currently these efficiencies are limited to two-pulse and z-filtered 3QMAS spectra of spin 3/2 and 5/2 nuclei, whereas for higher spin-quantum numbers ideal excitation is assumed. The analytical expressions are explained in full to avoid ambiguity and facilitate others to use them. The EGdeconv program can fit interaction parameter distributions. It currently includes a Gaussian distribution for the chemical shift and an (extended) Czjzek distribution for the quadrupolar interaction. We provide three case studies to illustrate EGdeconv's capabilities for fitting MQMAS spectra. The EGdeconv program is available as is on our website for 64-bit Linux operating systems.

Grimminck, Dennis L. A. G.; van Meerten, Bas; Verkuijlen, Margriet H. W.; van Eck, Ernst R. H.; Leo Meerts, W.; Kentgens, Arno P. M.



Digital Deconvolution: Image Sampling and Restoration Techniques.  

National Technical Information Service (NTIS)

The finite length digital deconvolution problem is formulated and discussed in terms of modern optimization theory. The ill-conditioned nature of deconvolution is identified and classic deconvolution operators are examined in terms of their respective eff...

D. J. Udovic R. Mittra



Comparative Analysis of Active Bandwidth Estimation Tools  

Microsoft Academic Search

A comparative analysis of state-of-the-art active probing tools for bandwidth estimation is outlined. Techniques and tools for capacity, available bandwidth and bulk transfer capacity estimation are simultaneously assessed. First, a generic framework for the design of active bandwidth estimation tools is proposed as a result of our analysis of the implementation and performance of a number of available techniques

Federico Montesino-pouzols



RADC SCAT automated sneak circuit analysis tool  

Microsoft Academic Search

The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst so that prior experience with sneak analysis is not necessary for performance. Both sneak circuits and design concerns are targeted by this tool, with both digital

E. L. DePalma



Analysis of immunohistochemical expression of k19 in oral epithelial dysplasia and oral squamous cell carcinoma using color deconvolution-image analysis method.  


K19 is an intermediate filament protein that has been investigated in oral squamous cell carcinoma (OSCC), but that has not been correlated with the amount of keratin produced within the well-differentiated OSCC grade. The aim of the present study was to objectively analyze K19 immunoexpression in OSCC and to validate the utility of K19 in differentiation among grades of oral epithelial dysplasia (OED). Formalin-fixed tissues of 36 primary OSCC (22 well, 10 moderately, 4 poorly differentiated), 43 OED (23 mild, 8 moderate, 12 severe), and 11 normal oral epithelium (NOE) were included. K19 was immunostained using the HRP-DAB method. The percentage of K19-positive area was found using the color deconvolution program in the ImageJ® image analysis system (public domain software, National Institutes of Health, Bethesda, MD, USA) and analyzed using independent-samples t tests and the ANOVA test. K19 scores in NOE, mild, moderate and severe OED were 1.8, 3.4, 21, and 50.3%, respectively, with a significant association with grade (t test P < 0.05). Well-differentiated OSCC with <30% keratin pearl formation expressed significantly higher K19 scores compared to well-differentiated OSCC with >30% keratin pearls (28.6 and 1.2%, respectively, P < 0.05). K19 scores in moderately and poorly differentiated OSCC were 60.8 and 61.3%, respectively. K19 scores significantly differentiated between two subgroups of tumors within the well-differentiated OSCC grade and reflected histologic differentiation as well as probably predicting the clinical outcome. Combining the K19 immunostain with the regular H&E stain may be helpful to facilitate and assure assigning a more accurate grade for OED. PMID:20882374
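The colour-deconvolution step that the ImageJ plugin performs follows Ruifrok and Johnston's optical-density unmixing. The stain vectors below are values commonly quoted for H-DAB separation, and the positivity threshold and synthetic image are assumptions for illustration:

```python
import numpy as np

# Normalised stain OD vectors (rows): haematoxylin, DAB, and a residual
# channel completed as the cross product of the first two.
stains = np.array([[0.650, 0.704, 0.286],
                   [0.268, 0.570, 0.776],
                   [0.0,   0.0,   0.0]])
stains[2] = np.cross(stains[0], stains[1])
stains /= np.linalg.norm(stains, axis=1, keepdims=True)
unmix = np.linalg.inv(stains.T)          # maps OD vectors to stain amounts

def dab_positive_fraction(rgb, threshold=0.15):
    """Fraction of pixels whose DAB optical density exceeds the threshold."""
    od = -np.log10(np.maximum(rgb.astype(float), 1.0) / 255.0)
    conc = od.reshape(-1, 3) @ unmix.T   # per-pixel stain concentrations
    dab = conc[:, 1].reshape(rgb.shape[:2])
    return float((dab > threshold).mean())

# Tiny synthetic image: left half white background, right half DAB-brown
img = np.full((10, 10, 3), 255, dtype=np.uint8)
img[:, 5:] = np.array([103, 82, 71], dtype=np.uint8)
frac = dab_positive_fraction(img)
```

Working in optical density (log space) is essential: stain absorbances add there, so unmixing reduces to inverting a fixed 3x3 matrix per pixel.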

Safadi, Rima A; Musleh, Atika S; Al-Khateeb, Taiseer H; Hamasha, Abed Al-Hadi






Climate Data Analysis Tools - (CDAT)  

NASA Astrophysics Data System (ADS)

Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces, including command-line interaction, stand-alone scripts (applications), and graphical user interfaces (GUIs). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System, or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System, or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor-intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing-system and data-management software for linking storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL, will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software subsystems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules, and as a result of its flexibility, PCMDI has integrated other popular software components, such as the Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware, CDAT aims to give climate researchers the ability to access and analyze multi-dimensional distributed climate datasets.

Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.



System analysis: Developing tools for the future  

SciTech Connect

This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). It also discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing performed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.



Bioinformatics tools for secretome analysis.  


Over recent years, analyses of secretomes (complete sets of secreted proteins) have been reported in various organisms, cell types, and pathologies, and such studies are quickly gaining popularity. Fungi secrete enzymes that can break down potential food sources; plant secreted proteins are primarily parts of the cell wall proteome; and human secreted proteins are involved in cellular immunity and communication, and provide useful information for the discovery of novel biomarkers, such as for cancer diagnosis. Continuous development of methodologies supports the wide identification and quantification of secreted proteins in a given cellular state. The role of secreted factors is also investigated in the context of the regulation of major signaling events, and connectivity maps are built to describe the differential expression and dynamic changes of secretomes. Bioinformatics has become the bridge between secretome data and computational tasks for managing, mining, and retrieving information. Predictions can be made based on this information, contributing to the elucidation of a given organism's physiological state and the determination of the specific malfunction in disease states. Here we provide an overview of the available bioinformatics databases and software that are used to analyze the biological meaning of secretome data, including descriptions of the main functions and limitations of these tools. The important challenges of data analysis are mainly related to the integration of biological information from dissimilar sources. Improvements in databases and developments in software will likely substantially contribute to the usefulness and reliability of secretome studies. This article is part of a Special Issue entitled: An Updated Secretome. PMID:23395702

Caccia, Dario; Dugo, Matteo; Callari, Maurizio; Bongarzone, Italia



Tool for Magnetic Analysis Package  

NSDL National Science Digital Library

The FIT-MART Launcher package is a self-contained file for simulating systems of interacting quantum magnetic moments ("spins"). The interactions are modeled using the Heisenberg model, calculations are carried out by numerically diagonalizing the matrix representation of the Heisenberg Hamiltonian, and several types of plots are generated to describe various aspects of the model. The FIT-MART package is a "Fully Integrated Tool for Magnetic Analysis in Research & Teaching" (hence the acronym) which provides a very simple interface for defining complex quantum spin models, carrying out complex calculations, and visualizing the results using several graphical representations. These representations include plots of the energy spectrum as well as plots of the magnetization and magnetic susceptibility as a function of temperature and magnetic field. The FIT-MART package is an Open Source Physics package written to help students as well as researchers who are studying magnetism. It is distributed as a ready-to-run (compiled) Java archive. Double-clicking the osp_fit_mart.jar file will run the package if Java is installed. In future versions of this package, curricular materials will be included to help students learn about magnetism, and automated fitting routines will be included to help researchers quickly and easily model experimental data.
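The core computation the abstract describes, numerically diagonalizing a Heisenberg Hamiltonian, is easy to sketch for the smallest case. The following fragment (our illustration in Python, not FIT-MART's Java code) diagonalizes H = J S1·S2 for two spin-1/2 moments:

```python
import numpy as np

# spin-1/2 operators (hbar = 1)
sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = 0.5 * np.array([[0, -1j], [1j, 0]])
sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

def heisenberg_dimer(J=1.0):
    """Energy spectrum of H = J * S1.S2 for two coupled spin-1/2 moments."""
    H = J * sum(np.kron(s, s) for s in (sx, sy, sz))
    return np.linalg.eigvalsh(H)   # ascending eigenvalues of the Hermitian H
```

For antiferromagnetic J > 0 this yields the singlet at -3J/4 and a threefold-degenerate triplet at J/4; magnetization and susceptibility curves of the kind FIT-MART plots follow from Boltzmann-weighting these levels.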

Engelhardt, Larry; Rainey, Cameron




SciTech Connect

The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions, and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity using grey-scale binning of the acquired SEM image. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately.
Since grey-scale intensity depends on the chemistry of the particle, it is possible to map chemically similar areas, which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which is currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash represented by each mineral on a frame-by-frame basis. The slag viscosity model was improved to provide better predictions of slag viscosity and temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.

Robert R. Jensen; Steven A. Benson; Jason D. Laumb



Tool support for architecture analysis and design  

Microsoft Academic Search

The needs of software architectural design and analysis have led to a desire to create CASE tools to support the processes. Such a tool should help: to document an architecture; to reuse architectural artifacts; to aid in exploring architectural alternatives; and to support architectural metrics. This position paper first presents a set of requirements that an ideal tool for architectural

Rick Kazman



Iterative Deconvolution and Receiver-Function Estimation  

Microsoft Academic Search

We describe and apply an iterative, time-domain deconvolution approach to receiver-function estimation and illustrate the reliability and advantages of the technique using synthetic- and observation-based examples. The iterative technique is commonly used in earthquake time-function studies and offers several advantages in receiver-function analysis, such as intuitively stripping the largest receiver-function arrivals from the observed seismograms first
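The "strip the largest arrivals first" idea can be sketched as a greedy, matching-pursuit-style loop. This is our simplified reading of the approach, not the authors' code (which, among other refinements, low-pass filters the result):

```python
import numpy as np

def iterative_deconv(num, den, n_iters=50):
    """Greedy time-domain deconvolution: at each step, place a spike where
    the residual best correlates with the source wavelet `den`."""
    n = len(num)
    rf = np.zeros(n)                        # spike-train estimate
    resid = num.astype(float).copy()
    norm = float(den @ den)
    for _ in range(n_iters):
        # cross-correlation of the residual with the wavelet (causal lags)
        xc = np.array([resid[lag:] @ den[:n - lag] for lag in range(n)])
        lag = int(np.argmax(np.abs(xc)))
        amp = xc[lag] / norm
        rf[lag] += amp
        shifted = np.zeros(n)
        shifted[lag:] = den[:n - lag]
        resid -= amp * shifted              # strip that arrival
    return rf
```

Convolving `rf` back with `den` and inspecting the residual gives the usual fit diagnostic for choosing the number of iterations.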

Juan Pablo Ligorrfa; Charles J. Ammon



NCI Interactive Budget Analysis Tool

This tool provides users an interactive overview of the National Cancer Institute (NCI) budget and Fact Book data since Fiscal Year 1999. Additional historical NCI budget information can be obtained through the NCI Fact Book Collection.


Gradient-based image deconvolution  

NASA Astrophysics Data System (ADS)

Image restoration and deconvolution from a blurry and noisy observation is known to be ill-posed. To stabilize the recovery, total variation (TV) regularization is often utilized for its edge-preserving properties. We take a different approach to TV regularization for image restoration. We first recover the horizontal and vertical differences of the image individually through successful deconvolution algorithms; each difference image is sparser, or more compressible, under a TV measure than the original image. We then develop a novel deconvolution method that recovers the horizontal and vertical gradients, respectively, and estimates the original image from these gradients. Various experiments comparing the effectiveness of the proposed method against traditional TV methods are presented. Experimental results show the improved performance of our method for deconvolution problems.
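The second stage, estimating an image from its two recovered gradient fields, has a closed form under periodic boundary conditions. The sketch below is the standard FFT least-squares solve, not necessarily the authors' estimator:

```python
import numpy as np

def image_from_gradients(gx, gy):
    """Least-squares recovery of an image from its periodic forward
    differences via the FFT; the image mean is unrecoverable (set to zero)."""
    h, w = gx.shape
    dx = np.exp(2j * np.pi * np.fft.fftfreq(w)) - 1.0  # forward-diff spectrum (cols)
    dy = np.exp(2j * np.pi * np.fft.fftfreq(h)) - 1.0  # forward-diff spectrum (rows)
    DX, DY = dx[None, :], dy[:, None]
    denom = np.abs(DX) ** 2 + np.abs(DY) ** 2
    denom[0, 0] = 1.0                                  # DC term is unconstrained
    F = (np.conj(DX) * np.fft.fft2(gx) + np.conj(DY) * np.fft.fft2(gy)) / denom
    F[0, 0] = 0.0
    return np.real(np.fft.ifft2(F))
```

When the two gradient fields are mutually consistent, this reproduces the original image exactly up to its mean; for independently deconvolved (hence slightly inconsistent) gradients it returns the least-squares compromise.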

Huang, Heyan; Yang, Hang; Ma, Siliang



Industry Sector Analysis Mexico: Machine Tools, Lathes.  

National Technical Information Service (NTIS)

The market survey covers the machine tool lathes market in Mexico. The analysis contains statistical and narrative information on projected market demand, end-users; receptivity of Mexican consumers to U.S. products; the competitive situation, and market ...

L. P. Sanroman; M. Gerard



Tools for Time-Frequency Analysis.  

National Technical Information Service (NTIS)

Together with students and postdocs, the PI has worked on the mathematical aspects and applications of various tools in time-frequency or time-scale analysis. They have brought a deeper understanding to the geometry of redundant representations (frames), ...

I. Daubechies



Visualization Tool for Engineering Vector Analysis.  

National Technical Information Service (NTIS)

A prototype computer tool was designed to support an interactive, visual approach to the learning of vector analysis. The objective was to construct a prototype environment in which vectors, scalars and points could be entered, manipulated and observed in...

B. L. Miranda



Statistical Tools for Forensic Analysis of Toolmarks  

SciTech Connect

Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser



Quantitative fluorescence microscopy and image deconvolution.  


Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. 
A very common image-processing algorithm, image deconvolution, is used to remove blurred signal from an image. PMID:23931516

Swedlow, Jason R



Image deconvolution in digital autoradiography  

PubMed Central

Digital autoradiography (DAR) is a powerful method to determine quantitatively the “small-scale” (i.e., submillimeter) distribution of a radiotracer within a tissue section. However, the limited spatial resolution of the DAR image, due to blurring by the point spread function (PSF), can result in a poor correlation with tissue histology and immunohistochemistry. The authors attempt to overcome this limitation by recovering the radiotracer distribution by image deconvolution using the Richardson-Lucy algorithm and a measured PSF obtained from a small radioactive source on hydrophobic microscope slide. Simulation studies have shown that the deconvolution algorithm reliably recovers the pixel values corresponding to the radioactivity distributions. As an example, the proposed image restoration approach has been tested with DAR images of different radiolabeled markers on tumor sections obtained from clinical and preclinical animal model studies. Digital autoradiograms following deconvolution show improved sharpness and contrast relative to the unprocessed autoradiograms.
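The Richardson-Lucy update used above is compact enough to sketch. Assuming circular boundaries and a measured PSF no larger than the image (simplifications of our own; the paper works with a measured point-source PSF), a minimal NumPy version is:

```python
import numpy as np

def richardson_lucy(blurred, psf, iterations=100):
    """Richardson-Lucy deconvolution, assuming circular boundaries and a
    PSF array no larger than the image."""
    h, w = blurred.shape
    padded = np.zeros((h, w))
    padded[:psf.shape[0], :psf.shape[1]] = psf / psf.sum()
    # center the normalized PSF on the origin for FFT-based convolution
    padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)),
                     axis=(0, 1))
    otf = np.fft.rfft2(padded)

    def conv(a, kernel):
        return np.fft.irfft2(np.fft.rfft2(a) * kernel, s=(h, w))

    est = np.maximum(blurred, 0.0)              # initial estimate must be >= 0
    for _ in range(iterations):
        ratio = blurred / (conv(est, otf) + 1e-12)
        est = est * conv(ratio, np.conj(otf))   # multiplicative RL update
    return est
```

Each iteration multiplies the estimate by the correlation of the data/model ratio with the PSF, which preserves non-negativity and total counts, the properties that make RL attractive for count-limited autoradiograms.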

Zhang, Mutian; Chen, Qing; Li, Xiao-Feng; O'Donoghue, Joseph; Ruan, Shutian; Zanzonico, Pat; Ling, C. Clifton; Humm, John L.



Shock and Discontinuities Analysis Tool (SDAT)  

NASA Astrophysics Data System (ADS)

We have developed an analysis/visualization tool to study shocks and other discontinuities from satellite observations of plasma and magnetic field data. The tool uses an extension of the Viñas-Scudder analysis method based on the Rankine-Hugoniot conservation equations and published in JGR (1985). SDAT provides shock parameters such as the shock normal n, the shock speed Us, the angle θBn between the normal and the upstream magnetic field, the Alfvén and magnetosonic Mach numbers, the deHoffman-Teller velocity, and many other important parameters that describe the shock. SDAT was developed fully in IDL. As currently configured, SDAT reads ASCII data from any space mission for the analysis. All data displays and graphics generated by SDAT are written in PostScript, and a summary of the analysis is generated as an ASCII file. The input plasma (density, V, Tp, and Te) and magnetic field (B) data are read from different files at their own native resolution. The tool allows for data zooming in time, and the input data can be in any coordinate system. Examples of real satellite data are included to facilitate learning to use the tool. This tool is available to the space physics community. As configured, the tool is essentially complete for shock analysis; however, work continues to generalize SDAT for the analysis of other types of discontinuities.
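For orientation, the simplest shock-normal estimate, magnetic coplanarity, fits in a few lines. This is a textbook formula for comparison purposes, not the Rankine-Hugoniot fitting that SDAT implements:

```python
import numpy as np

def coplanarity_normal(b_up, b_down):
    """Magnetic-coplanarity shock normal: n proportional to
    (B1 x B2) x (B1 - B2), normalized to unit length."""
    n = np.cross(np.cross(b_up, b_down), b_up - b_down)
    return n / np.linalg.norm(n)
```

With upstream and downstream B averages in the same coordinate system, the angle θBn then follows from the dot product of this normal with the upstream field.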

Viñas, A. F.; Holland, M. P.



Effects of truncation on deconvolution  

NASA Astrophysics Data System (ADS)

Many methods for deconvolving images assume either that the entire convolution is available, or that the convolution is adequately modelled as a circular convolution. In reality, neither is usually the case, and only a section of a much larger blurred (and contaminated) image is observed. The truncation gives rise to null objects in reconstructions obtained by deconvolution methods. It is possible to formulate the problem as exact, though underdetermined, and to apply singular value decomposition to deriving an inverse operator. We compare different practical methods for performing deconvolution with a scanning finite impulse response filter derived in this manner.
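The wrap-around error behind the circular-convolution assumption is easy to demonstrate on a toy signal (our example, not the paper's data):

```python
import numpy as np

# Circular convolution, which FFT-based deconvolution implicitly assumes,
# differs from the true linear convolution wherever the blur wraps around.
x = np.array([1.0, 2.0, 3.0, 4.0])            # "image"
h = np.array([0.5, 0.5, 0.0, 0.0])            # two-tap blur, zero-padded
linear = np.convolve(x, h)[:len(x)]           # truncated linear convolution
circular = np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(h), n=len(x))
# circular[0] wrongly picks up x[3] by wrap-around; all later samples agree
```

A deconvolution method that models the blur as circular will therefore misattribute energy near the borders, which is exactly the truncation effect the abstract analyzes.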

Lane, Richard G.; Irwan, Roy; Bones, Philip J.



Fast Deconvolution Using Frequency-Dependent Regularization  

Microsoft Academic Search

A very fast deconvolution method, which is based on the Fast Fourier Transform, can be used to design a matrix of causal finite impulse response filters whose performance is optimized at a large number of discrete frequencies. The method is very efficient for both single-channel deconvolution, which can be used for loudspeaker equalisation, and multi-channel deconvolution, which can be used
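A single-channel sketch of the idea, with the regularization parameter standing in for the frequency-dependent profile, might look like this (the function name and parameterization are ours; a practical causal design would also include a modeling delay):

```python
import numpy as np

def regularized_inverse(c, beta, n):
    """Length-n FFT-designed inverse filter for impulse response c.

    beta may be a scalar or a per-rfft-bin array, giving the
    frequency-dependent regularization described above."""
    C = np.fft.rfft(c, n)
    return np.fft.irfft(np.conj(C) / (np.abs(C) ** 2 + beta), n)
```

Raising `beta` in frequency bands where `C` is weak trades equalization accuracy for a shorter, better-behaved filter, which is the point of making the regularization frequency-dependent.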

Ole Kirkeby; Per Rubak; Angelo Farina


Automated Steel Cleanliness Analysis Tool (ASCAT)  

Microsoft Academic Search

The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting

Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Scott Story



The Galileo Fault Tree Analysis Tool  

Microsoft Academic Search

We present Galileo, a dynamic fault tree modeling and analysis tool that combines the innovative DIF- Tree analysis methodology with a rich user interface built using package-oriented programming. DIFTree integrates binary decision diagram and Markov meth- ods under the common notation of dynamic fault trees, allowing the user to exploit the benefits of both tech- niques while avoiding the need

Kevin J. Sullivan; Joanne Bechta Dugan; David Coppit



Blind Deconvolution via Sequential Imputations  

Microsoft Academic Search

The sequential imputation procedure is applied to adaptively and sequentially reconstruct discrete input signals that are blurred by an unknown linear moving average channel and contaminated by additive Gaussian noises, a problem known as blind deconvolution in digital communication. A rejuvenation procedure for improving the efficiency of sequential imputation is introduced and theoretically justified. The proposed method does not require

Jun S. Liu; Rong Chen



Deconvolution using the complex cepstrum  

SciTech Connect

The theory, description, and implementation of a generalized linear filtering system for the nonlinear filtering of convolved signals are presented. A detailed look at the problems and requirements associated with the deconvolution of signal components is undertaken. Related properties are also developed. A synthetic example is shown and is followed by an application using real seismic data. 29 figures.
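The additive property that makes cepstral deconvolution work is easy to verify numerically. The sketch below uses the real cepstrum for brevity; the complex cepstrum of the report additionally retains phase, which requires unwrapping:

```python
import numpy as np

def real_cepstrum(x, n):
    """Real cepstrum: inverse FFT of the log magnitude spectrum (length n).

    The small constant guards the log against exact spectral zeros."""
    return np.fft.irfft(np.log(np.abs(np.fft.rfft(x, n)) + 1e-12), n)
```

Because convolution multiplies spectra, the cepstrum of a convolved signal is the sum of the component cepstra (when all are computed at a common, sufficiently long FFT length), so components occupying different quefrency ranges can be separated by liftering.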

Riley, H.B.



The Vampir Performance Analysis Tool-Set  

Microsoft Academic Search

This paper presents the Vampir tool-set for performance analysis of parallel applications. It consists of the run-time measurement system VampirTrace and the visualization tools Vampir and VampirServer. It describes the major features and outlines the underlying implementation that is necessary to provide low overhead and good scalability. Furthermore, it gives a short overview about the development history and future work

Andreas Knüpfer; Holger Brunst; Jens Doleschal; Matthias Jurenz; Matthias Lieber; Holger Mickler; Matthias S. Muller; Wolfgang E. Nagel



RADC SCAT automated sneak circuit analysis tool  

NASA Astrophysics Data System (ADS)

The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst so that prior experience with sneak analysis is not necessary for performance. Both sneak circuits and design concerns are targeted by this tool, with both digital and analog circuits being examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.

Depalma, Edward L.


Graphical Tools for Comparative Genome Analysis  

PubMed Central

Visualization of data is important for many data-rich disciplines. In biology, where data sets are becoming larger and more complex, graphical analysis is felt to be ever more pertinent. Although some patterns and trends in data sets may only be determined by sophisticated computational analysis, viewing data by eye can provide us with an extraordinary amount of information in an instant. Recent advances in bioinformatic technologies allow us to link graphical tools to data sources with ease, so we can visualize our data sets dynamically. Here, an overview of graphical software tools for comparative genome analysis is given, showing that a range of simple tools can provide us with a powerful view of the differences and similarities between genomes.



UMMPerfusion: an open source software tool towards quantitative MRI perfusion analysis in clinical routine.  


To develop a generic Open Source MRI perfusion analysis tool for quantitative parameter mapping to be used in a clinical workflow, along with methods for quality management of perfusion data, we implemented a classic, pixel-by-pixel deconvolution approach to quantify T1-weighted contrast-enhanced dynamic MR imaging (DCE-MRI) perfusion data as an OsiriX plug-in. It features parallel computing capabilities and an automated reporting scheme for quality management. Furthermore, by its implementation design, it can easily be extended to other perfusion algorithms. Obtained results are saved as DICOM objects and directly added to the patient study. The plug-in was evaluated on ten MR perfusion data sets of the prostate and a calibration data set by comparing the obtained parametric maps (plasma flow, volume of distribution, and mean transit time) to a widely used reference implementation in IDL. For all data, parametric maps could be calculated, and the plug-in worked correctly and stably. On average, a deviation of 0.032 ± 0.02 ml/100 ml/min for the plasma flow, 0.004 ± 0.0007 ml/100 ml for the volume of distribution, and 0.037 ± 0.03 s for the mean transit time was observed between our implementation and the reference implementation. By using computer hardware with eight CPU cores, calculation time could be reduced by a factor of 2.5. We successfully developed an Open Source OsiriX plug-in for T1-DCE-MRI perfusion analysis in a routine, quality-managed clinical environment. Using model-free deconvolution, it allows for perfusion analysis in various clinical applications. With our plug-in, information about measured physiological processes can be obtained and transferred into clinical practice. PMID:22832894
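A common realization of model-free perfusion deconvolution is truncated-SVD inversion of the arterial-input-function (AIF) convolution matrix. The sketch below follows that standard approach; the function name, the truncation threshold, and the parameter extraction are our assumptions, not the UMMPerfusion source:

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, rel_thresh=0.2):
    """Deconvolve a tissue curve by the AIF via truncated SVD.

    Returns (plasma flow, volume of distribution, mean transit time)."""
    n = len(aif)
    # causal (lower-triangular) convolution matrix of the AIF
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.zeros_like(s)
    mask = s > rel_thresh * s[0]            # truncate noise-dominated modes
    s_inv[mask] = 1.0 / s[mask]
    residue = Vt.T @ (s_inv * (U.T @ tissue))   # flow-scaled residue function
    flow = residue.max()                        # plasma flow
    vd = dt * residue.sum()                     # volume of distribution (area)
    mtt = vd / flow                             # central volume theorem
    return flow, vd, mtt
```

The three returned quantities correspond to the parametric maps the plug-in computes per pixel; the truncation threshold regularizes the otherwise ill-posed inversion.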

Zöllner, Frank G; Weisser, Gerald; Reich, Marcel; Kaiser, Sven; Schoenberg, Stefan O; Sourbron, Steven P; Schad, Lothar R



Application of parallel computing to the Monte Carlo simulation of electron scattering in solids: A rapid method for profile deconvolution  

Microsoft Academic Search

X-ray microanalysis by analytical electron microscopy (AEM) has proven to be a powerful tool for characterizing the spatial distribution of solute elements in materials. True compositional variations over spatial scales smaller than the actual resolution for microanalysis can be determined if the measured composition profile is deconvoluted. Explicit deconvolutions of such data, via conventional techniques such as Fourier transforms, are

A. D. Romig Jr.; S. J. Plimpton; J. R. Michael; R. L. Myklebust; D. E. Newbury



Statistical Deconvolution for Superresolution Fluorescence Microscopy  

PubMed Central

Superresolution microscopy techniques based on the sequential activation of fluorophores can achieve image resolution of ~10 nm but require a sparse distribution of simultaneously activated fluorophores in the field of view. Image analysis procedures for this approach typically discard data from crowded molecules with overlapping images, wasting valuable image information that is only partly degraded by overlap. A data analysis method that exploits all available fluorescence data, regardless of overlap, could increase the number of molecules processed per frame and thereby accelerate superresolution imaging speed, enabling the study of fast, dynamic biological processes. Here, we present a computational method, referred to as deconvolution-STORM (deconSTORM), which uses iterative image deconvolution in place of single- or multiemitter localization to estimate the sample. DeconSTORM approximates the maximum likelihood sample estimate under a realistic statistical model of fluorescence microscopy movies comprising numerous frames. The model incorporates Poisson-distributed photon-detection noise, the sparse spatial distribution of activated fluorophores, and temporal correlations between consecutive movie frames arising from intermittent fluorophore activation. We first quantitatively validated this approach with simulated fluorescence data and showed that deconSTORM accurately estimates superresolution images even at high densities of activated fluorophores where analysis by single- or multiemitter localization methods fails. We then applied the method to experimental data of cellular structures and demonstrated that deconSTORM enables an approximately fivefold or greater increase in imaging speed by allowing a higher density of activated fluorophores/frame.

Mukamel, Eran A.; Babcock, Hazen; Zhuang, Xiaowei



Comparison of gas chromatography-pulsed flame photometric detection-mass spectrometry, automated mass spectral deconvolution and identification system and gas chromatography-tandem mass spectrometry as tools for trace level detection and identification.  


The complexity of a matrix is in many cases the major limiting factor in the detection and identification of trace level analytes. In this work, the ability to detect and identify trace levels of pesticides in complex matrices was studied and compared for three relatively new methods: (a) GC-PFPD-MS, where simultaneous PFPD (pulsed flame photometric detection) and MS analysis is performed. The PFPD indicates the exact chromatographic time of suspected peaks for their MS identification and provides elemental information; (b) automatic GC-MS data analysis using the AMDIS ("Automated Mass Spectral Deconvolution and Identification System") software by the National Institute of Standards and Technology; (c) GC-MS-MS analysis. A pesticide mixture (MX-5), containing diazinon, methyl parathion, ethyl parathion, methyl trithion and ethion, was spiked, in descending levels from 1 ppm to 10 ppb, into soil and sage (spice) extracts, and the detection level and identification quality were evaluated in each experiment. PFPD-MS and AMDIS exhibited similar performance, both superior to standard GC-MS, revealing and identifying compounds that did not exhibit an observable GC peak (either buried under the chromatographic background baseline or co-eluting with other interfering GC peaks). GC-MS-MS featured improved detection limits (lower by a factor of 6-8) compared to AMDIS and PFPD-MS. The GC-PFPD-MS-MS combination was found useful in several cases where no reconstructed ion chromatogram MS-MS peaks existed, but an MS-MS spectrum could still be extracted at the elution time indicated by PFPD. The level of identification and confirmation with MS-MS was inferior to that of the other two techniques. In comparison with the soil matrix, detection limits obtained with the loaded sage matrix were poorer by similar factors for all the techniques studied (factors of 5.8, >6.5 and 4.0 for AMDIS, PFPD-MS and MS-MS, respectively). 
Based on the above results, the paper discusses the trade-offs between detectivity and identification level with the compared three techniques as well as other more traditional techniques and approaches. PMID:10701673

Dagan, S



Global spatial deconvolution of Lunar Prospector Th abundances  

NASA Astrophysics Data System (ADS)

We have completed the first global spatial deconvolution analysis of planetary gamma-ray data for lunar Th abundances as measured by the Lunar Prospector Gamma-ray Spectrometer. We tested two different spatial deconvolution techniques - Jansson's method and the Pixon method - and determined that the Pixon method provides superior performance. The final deconvolved map results in a spatial resolution improvement of a factor of 1.5-2. The newly deconvolved data allow us to clearly delineate nearside Th enhancements and depressions, validate enhanced Th abundances associated with specific lunar red spots, and reveal new details of the Th distribution at the Aristarchus plateau.

Lawrence, D. J.; Puetter, R. C.; Elphic, R. C.; Feldman, W. C.; Hagerty, J. J.; Prettyman, T. H.; Spudis, P. D.



Link Analysis Tools for Intelligence and Counterterrorism  

Microsoft Academic Search

Association rule mining is an important data analysis tool that can be applied with success to a variety of domains. However, most association rule mining algorithms seek to discover statistically significant patterns (i.e. those with considerable support). We argue that, in law-enforcement, intelligence and counterterrorism work, sometimes it is necessary to look for patterns which do not have large

Antonio Badia; Mehmed M. Kantardzic




Microsoft Academic Search

The Advanced Composites Repair Analysis Tool (ACRAT) has been under development for the USAF Advanced Composites Program Office under an Ogden ALC Design Engineering Program (DEP) Contractual Engineering Task (CET) Order. ACRAT is an integrated prototype software system consisting of commercial-off-the-shelf (COTS) and public domain CAE simulation codes and customized databases. The objective has been to develop Beta versions of

Thomas E. Mack; James Y. Song


lmbench: Portable Tools for Performance Analysis  

Microsoft Academic Search

lmbench is a micro-benchmark suite designed to focus attention on the basic building blocks of many common system applications, such as databases, simulations, software development, and networking. In almost all cases, the individual tests are the result of analysis and isolation of a customer's actual performance problem. These tools can be, and currently are, used

Larry W. Mcvoy; Carl Staelin



Timeline analysis tools for law enforcement  

Microsoft Academic Search

The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program data visualization, manipulation and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was

John Mucks



Hypnos graphic interface: a sleep analysis tool  

Microsoft Academic Search

This paper describes the features of the Hypnos graphic interface, a sleep analysis tool developed to work in a multi-platform environment. It comprises a set of components that can be executed on different machines and constitute the three main elements of the application: an interface for visualising, editing and manipulating biomedical signals; a distributed file system; and a scripting language for

S. Oliveira; P. Goncalves; A. C. Rosa



Multidimensional multichannel FIR deconvolution using Gröbner bases.  


We present a new method for general multidimensional multichannel deconvolution with finite impulse response (FIR) convolution and deconvolution filters using Gröbner bases. Previous work formulates the problem of multichannel FIR deconvolution as the construction of a left inverse of the convolution matrix, which is solved by numerical linear algebra. However, this approach requires the prior information of the support of deconvolution filters. Using algebraic geometry and Gröbner bases, we find necessary and sufficient conditions for the existence of exact deconvolution FIR filters and propose simple algorithms to find these deconvolution filters. The main contribution of our work is to extend the previous Gröbner basis results on multidimensional multichannel deconvolution for polynomial or causal filters to general FIR filters. The proposed algorithms obtain a set of FIR deconvolution filters with a small number of nonzero coefficients (a desirable feature in the impulsive noise environment) and do not require the prior information of the support. Moreover, we provide a complete characterization of all exact deconvolution FIR filters, from which good FIR deconvolution filters under the additive white noise environment are found. Simulation results show that our approaches achieve good results under different noise settings. PMID:17022265
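The left-inverse formulation that the abstract contrasts with the Gröbner-basis approach can be sketched numerically. Below, two hypothetical coprime FIR channels are deconvolved exactly by solving for filters g1, g2 with h1*g1 + h2*g2 = δ via least squares; the channel coefficients and filter length are illustrative, not from the paper:

```python
import numpy as np

def conv_matrix(h, n):
    """Toeplitz matrix C such that C @ g == np.convolve(h, g) for len(g) == n."""
    C = np.zeros((len(h) + n - 1, n))
    for j in range(n):
        C[j:j + len(h), j] = h
    return C

# Two hypothetical coprime FIR channels (no common zeros).
h1 = np.array([1.0, 0.5])
h2 = np.array([1.0, -0.5])

# Seek FIR filters g1, g2 with h1*g1 + h2*g2 = delta (exact deconvolution),
# i.e. a left inverse of the stacked convolution matrix.
n = 1                                         # deconvolution filter length to try
A = np.hstack([conv_matrix(h1, n), conv_matrix(h2, n)])
delta = np.zeros(A.shape[0]); delta[0] = 1.0
g = np.linalg.lstsq(A, delta, rcond=None)[0]
g1, g2 = g[:n], g[n:]
recon = np.convolve(h1, g1) + np.convolve(h2, g2)   # should equal delta

# Recover a signal exactly from its two blurred channel outputs.
x = np.array([1.0, -2.0, 3.0])
xhat = np.convolve(np.convolve(x, h1), g1) + np.convolve(np.convolve(x, h2), g2)
```

For these two length-2 channels the Bezout identity is satisfied by scalar filters g1 = g2 = 0.5; longer channels generally require longer filters, and (as the abstract notes) the numerical approach needs the filter support chosen in advance.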

Zhou, Jianping; Do, Minh N



Decision Analysis Tools for Volcano Observatories  

NASA Astrophysics Data System (ADS)

Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

Hincks, T. H.; Aspinall, W.; Woo, G.



Patent citation analysis: A policy analysis tool  

Microsoft Academic Search

Patent citation analysis is a recent development which uses bibliometric techniques to analyse the wealth of patent citation information. This paper describes the various facets of patent citations and patent citation studies, and their important applications. Construction of technology indicators being an important use of patent citations, various patent citation based technological indicators and their applications are also described.

M. M. S. Karki



Interpretation of magnetic data based on Euler deconvolution and crustal magnetization in Taiwan  

NASA Astrophysics Data System (ADS)

Taiwan is located between the Ryukyu subduction zone in the northeast and the Luzon Arc subduction zone in the south. For understanding the complex plate collision processes, seismology is one of the primary tools used to describe tectonic structure in Taiwan. It is still essential and useful to use other geophysical data to support or verify the results from seismology studies. In this study, two analysis methodologies are applied to new magnetic data for Taiwan. A complete new field magnetic dataset for Taiwan was surveyed in 2004 (Yen et al., 2009). A detailed magnetic survey was carried out over the whole Taiwan area (including the Central Range) in order to provide a complete magnetic anomaly map. The survey comprises about 6000 observation points over an area of about 30,000 square kilometers. After correction of the observed magnetic data, the magnetic anomaly map was obtained. Several fundamental quantities, such as the anomalies after reduction-to-pole correction and the Curie point depth, are derived to assess their tectonic significance. To further interpret the magnetic anomalies in Taiwan, the Euler deconvolution method with a structural index and a crustal magnetization calculation are used to explore the geological and tectonic deformation. Theoretically, the Euler deconvolution method has the advantage of locating the horizontal positions and corresponding depths of magnetic sources, and the crustal magnetization calculation is effective in determining the magnetic susceptibility distribution arising from different rock properties. In this study, the new magnetic anomalies in Taiwan have been inverted with Euler deconvolution and crustal magnetization. The results show that even for spatially complicated arrangements of magnetic sources, these analysis methods provide good resolution. Although it is difficult to resolve the detailed 3D spatial extent exactly, the horizontal locations and depths of the magnetic bodies can be obtained. 
The interpretation methods used here are more useful in complicated magnetic anomaly areas.
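Euler deconvolution reduces to a small linear least-squares problem in each data window: the homogeneity equation relates the field and its gradients to the unknown source coordinates for an assumed structural index. A minimal profile sketch over a synthetic point source (structural index N = 3; coordinates and field values are illustrative, not from the Taiwan survey):

```python
import numpy as np

# Synthetic profile over a point source (structural index N = 3),
# f(x) = ((x - x0)^2 + z0^2)^(-3/2), observed at the surface z = 0.
x0_true, z0_true, N = 10.0, 4.0, 3.0
x = np.linspace(0.0, 20.0, 81)
r2 = (x - x0_true)**2 + z0_true**2
f  = r2**-1.5
fx = -3.0 * (x - x0_true) * r2**-2.5     # horizontal derivative
fz =  3.0 * z0_true * r2**-2.5           # vertical derivative at z = 0

# Euler's homogeneity equation, rearranged into a linear system in (x0, z0):
#   x0*fx + z0*fz = x*fx + N*f      (evaluated at z = 0)
A = np.column_stack([fx, fz])
b = x * fx + N * f
x0_est, z0_est = np.linalg.lstsq(A, b, rcond=None)[0]
```

With exact analytic derivatives the least-squares solve recovers the source location and depth; on real gridded data the derivatives are estimated numerically and the system is solved in sliding windows.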

Hsieh, H.; Yen, H.; Chao, B. F.



Paleocurrent analysis: important tool in basin analysis  

SciTech Connect

A paleocurrent study of the Chiwaukum graben, a major structural depression in central Washington, illustrates the usefulness of paleocurrent indicators in determining a more complete picture of a basin's geological history - beyond the more traditional uses regarding depositional environment and provenance. For the Chiwaukum graben, paleocurrent mapping has given considerable insight into the structural development of the area. The Chiwaukum graben is bounded on the east and west by the Entiat and Leavenworth faults, respectively, and preserves a lower and a middle Tertiary sequence of fluviatile sandstones, shales, and conglomerates. Paleocurrent studies of the lower Tertiary sandstones demonstrate that the dominant paleotransport direction in the central portion of the graben was southwesterly (233°). An integrated interpretation of paleocurrent and other data suggests that the area of the Entiat Mountains was the primary source area and a topographic high during deposition. Paleocurrent data also support the existence of exposed bedrock hills within the graben during sedimentation. Due to a lack of northeast-oriented paleocurrent vectors near the Leavenworth fault, the area southwest of the fault probably had little topographic expression and possibly was a site of deposition during at least part of lower Tertiary time. Paleocurrent analysis suggests that major relief along central and northern portions of the Leavenworth fault may be postdepositional.

Buza, J.W.



3D image restoration for confocal microscopy: toward a wavelet deconvolution for the study of complex biological structures  

NASA Astrophysics Data System (ADS)

Image restoration algorithms provide efficient tools for recovering part of the information lost in the imaging process of a microscope. We describe recent progress in the application of deconvolution to confocal microscopy. The point spread function of a Biorad-MRC1024 confocal microscope was measured under various imaging conditions, and used to process 3D-confocal images acquired in an intact preparation of the inner ear developed at Karolinska Institutet. Using these experiments we investigate the application of denoising methods based on wavelet analysis as a natural regularization of the deconvolution process. Within the Bayesian approach to image restoration, we compare wavelet denoising with the use of a maximum entropy constraint as another natural regularization method. Numerical experiments performed with test images show a clear advantage of the wavelet denoising approach, allowing one to 'cool down' the image with respect to the signal, while suppressing much of the fine-scale artifacts appearing during deconvolution due to the presence of noise, incomplete knowledge of the point spread function, or undersampling problems. We further describe a natural development of this approach, which consists of performing the Bayesian inference directly in the wavelet domain.

Boutet de Monvel, Jacques; Le Calvez, Sophie; Ulfendahl, Mats



Blind deconvolution of speckle images.  

NASA Astrophysics Data System (ADS)

A technique for deconvolving an image from both a single convolution and an ensemble of differently blurred images is presented. The method is more robust than the earlier blind deconvolution algorithms proposed by Ayers and Dainty. The performance of the algorithm in the presence of noise is evaluated. It is also demonstrated how the algorithm can be modified to utilize the much greater amount of information contained in an ensemble of differently blurred pictures of an image. Reconstructions using both computer simulations and infrared astronomical speckle data are presented. The speckle reconstructions are compared with those obtained by both Fourier phase retrieval and bispectral estimation.

Lane, R. G.



Tracker Video Analysis and Modeling Tool  

NSDL National Science Digital Library

The Tracker Video Analysis and Modeling Tool allows students to model and analyze the motion of objects in videos. By overlaying simple dynamical models directly onto videos, students may see how well a model matches the real world. Interference patterns and spectra can also be analyzed with Tracker. Tracker 4.81 installers, which include the Xuggle open source video engine, are available for Linux, Mac OS X, and Windows. Tracker is an Open Source Physics tool built on the OSP code library. Additional Tracker resources, demonstration experiments, and videos can be found by searching ComPADRE for "Tracker." Additional Tracker resources including Tracker help and sample videos are available from the Tracker home page at Cabrillo College below.

Brown, Douglas




EPA Science Inventory

Using published human data on skin-to-urine and blood-to-urine transfer of 12 pesticides and herbicides, the skin-to-blood transfer rates for each compound were estimated by two numerical deconvolution techniques. Regular constrained deconvolution produced an estimated upper limi...



Microsoft Academic Search

The recovery of geological reflection coefficients from seismic data includes a deconvolution operation. The sparse spike deconvolution algorithm used in seismic inversion is computed with an l1 minimization. Although this procedure was developed in 1973, there is no mathematical model that explains the efficiency of this approach for seismic data. Using recent results on sparse signal representations in re-



A method for wavelet estimation and deconvolution  

SciTech Connect

This paper describes the usual deconvolution procedure: after the expected output is defined, a deconvolution operator is derived from the input seismic data and then convolved with the seismic data to yield the output. The author's new method includes the following steps: defining an area in a seismic section, calculating a criterion using a predefined objective function, estimating the wavelet by an iterative method, deriving the inverse wavelet from the known wavelet, and finally obtaining the deconvolution output. Wavelet estimation is the key to the success of this method. Each iteration outputs deconvolved data; the criterion is then calculated, and the next iterative parameter is determined and used to compute a correction value for improving the wavelet. A good deconvolution section can be obtained finally.

Zhu, X. (Geophysical Research Inst., Bureau of Oil Geophysical Prospecting, Zhuozhou City, Hebei Province (CN))



IRTool: an IRST X Windows analysis tool  

NASA Astrophysics Data System (ADS)

IRTool is an IRST X Windows analysis tool, which is being developed by Arete Associates and NSWC/WO under the sponsorship of the Office of Naval Research in support of the Infrared Analysis Modeling and Measurements Program (IRAMMP). The tool consists of an integrated set of physics based modules to support IRST multispectral and space-time analyses. The primary modules are for (1) modeling atmospheric effects, (2) simulating ocean and cloud scenes without and with sensor effects, (3) modeling and injecting target signatures into real and simulated data, and (4) analytic calculation of the expected signal-to-noise ratio (ESNR) for an airborne target on a specified trajectory. Additional modules support data processing and analysis for clutter characterization and model validation. These modules have undergone extensive verification and comparison with data. IRTool has an interactive X Windows driver, which launches stand alone modules to run in the UNIX background. The user can interactively display and plot module outputs using IDL programs written for IRTool. IRTool is available from the IRAMMP program manager (Douglas Crowder).

Davis, Philip J.; Branlund, Eric; Church, Steven R.; Chmielewski, Don; Klesch, David; Krumrey, Erik P.; Crowder, Douglas



Direct deconvolution of radio synthesis images using L1 minimisation  

NASA Astrophysics Data System (ADS)

Aims: We introduce an algorithm for the deconvolution of radio synthesis images that accounts for the non-coplanar-baseline effect, allows multiscale reconstruction onto arbitrarily positioned pixel grids, and allows the antenna elements to have direction-dependent gains. Methods: Using numerical L1-minimisation techniques established in the application of compressive sensing to radio astronomy, we directly solve the deconvolution equation using graphics processing unit (GPU) hardware. This approach relies on an analytic expression for the contribution of a pixel in the image to the observed visibilities, and the well-known expression for Dirac delta function pixels is used along with two new approximations for Gaussian pixels, which allow for multi-scale deconvolution. The algorithm is similar to the CLEAN algorithm in that it fits the reconstructed pixels in the image to the observed visibilities while minimising the total flux; however, unlike CLEAN, it operates on the ungridded visibilities, enforces positivity, and has guaranteed global convergence. The pixels in the image can be arbitrarily distributed and arbitrary gains between each pixel and each antenna element can also be specified. Results: Direct deconvolution of the observed visibilities is shown to be feasible for several deconvolution problems, including a 1 megapixel wide-field image with over 400 000 visibilities. Correctness of the algorithm is shown using synthetic data, and the algorithm shows good image reconstruction performance for wide field images and requires no regridding of visibilities. Though this algorithm requires significantly more computation than methods based on the CLEAN algorithm, we demonstrate that it is trivially parallelisable across multiple GPUs and potentially can be scaled to GPU clusters. We also demonstrate that a significant speed up is possible through the use of multi-scale analysis using Gaussian pixels.
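The L1-minimisation step can be illustrated with ISTA (iterative shrinkage-thresholding), a standard solver for this class of objective; the paper's GPU solver and visibility model are considerably more elaborate. A small dense 1-D sketch with illustrative sources and beam:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse "sky" of point sources blurred by a known beam-like kernel.
n = 64
truth = np.zeros(n)
truth[[12, 30, 45]] = [2.0, -1.0, 1.5]

beam = np.exp(-np.arange(-6, 7)**2 / 8.0)
A = np.zeros((n, n))                      # the convolution written as a matrix
for j in range(n):
    A[:, j] = np.convolve((np.arange(n) == j).astype(float), beam, mode="same")
y = A @ truth + 0.01 * rng.standard_normal(n)

# ISTA: gradient step on ||Ax - y||^2 followed by soft-thresholding,
# which is the proximal map of the L1 penalty.
lam = 0.05
L = np.linalg.norm(A, 2)**2               # Lipschitz constant of the gradient
xk = np.zeros(n)
for _ in range(300):
    g = xk - (A.T @ (A @ xk - y)) / L
    xk = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)
```

The thresholding drives most coefficients to exactly zero while the data term keeps flux concentrated near the true sources; this sketch omits the positivity constraint and multiscale pixels used in the paper.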

Hardy, Stephen J.



Assessment of perfusion by dynamic contrast-enhanced imaging using a deconvolution approach based on regression and singular value decomposition  

Microsoft Academic Search

The assessment of tissue perfusion by dynamic contrast-enhanced (DCE) imaging involves a deconvolution process. For analysis of DCE imaging data, we implemented a regression approach to select appropriate regularization parameters for deconvolution using the standard and generalized singular value decomposition methods. Monte Carlo simulation experiments were carried out to study the performance and to compare with other existing methods used
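The truncated-SVD deconvolution that this kind of perfusion analysis builds on can be sketched in a few lines. The arterial input function, residue function, and fixed threshold below are illustrative placeholders, not the paper's regression-based parameter selection:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical arterial input function (AIF) and tissue residue function.
dt = 1.0
t = np.arange(40) * dt
aif = (t / 4.0) * np.exp(-t / 4.0)           # gamma-variate-like bolus
residue = np.exp(-t / 8.0)                   # true residue function, R(0) = 1
flow = 0.6

# Tissue curve C(t) = flow * (AIF convolved with R), as a triangular system.
n = len(t)
A = np.zeros((n, n))
for j in range(n):
    A[j:, j] = aif[:n - j]
A *= dt
c = flow * (A @ residue) + 0.001 * rng.standard_normal(n)

# Truncated-SVD deconvolution: discard singular values below a relative cutoff.
U, s, Vt = np.linalg.svd(A)
keep = s > 0.1 * s[0]                        # regularization threshold (tunable)
s_inv = np.zeros_like(s)
s_inv[keep] = 1.0 / s[keep]
fr_est = Vt.T @ (s_inv * (U.T @ c))          # estimate of flow * R(t)
flow_est = fr_est.max()                      # perfusion estimate: peak of flow*R
```

The cutoff trades noise suppression against bias, which is exactly the regularization-parameter choice the paper addresses with its regression approach.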

Tong San Koh; X. Y. Wu; L. H. Cheong; C. C. T. Lim



SEAT: A strategic engagement analysis tool  

SciTech Connect

The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

Dreicer, J.; Michelsen, C.; Morgeson, D.



Blind deconvolution by means of the Richardson–Lucy algorithm  

Microsoft Academic Search

A blind deconvolution algorithm based on the Richardson-Lucy deconvolution algorithm is presented. Its performance in the presence of noise is found to be superior to that of other blind deconvolution algorithms. Results are presented and compared with results obtained from implementation of a Wiener filter blind deconvolution algorithm. The algorithm is developed further to incorporate functional forms of
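The basic alternating scheme can be sketched as follows: each half-iteration applies a Richardson-Lucy update to one unknown (object or PSF) while the other is held fixed. All signals and parameters are illustrative, and the paper's formulation includes refinements not shown here:

```python
import numpy as np

rng = np.random.default_rng(3)

# Blurred observation of a simple 1-D object with an unknown smooth PSF.
truth = np.zeros(64); truth[[25, 38]] = [1.0, 0.7]
k = np.exp(-np.arange(-5, 6)**2 / 6.0); k /= k.sum()
y = np.convolve(truth, k, mode="same")
y = np.maximum(y + 0.001 * rng.standard_normal(64), 1e-8)

# Blind Richardson-Lucy: alternate multiplicative RL updates for the
# object estimate f and the 11-tap PSF estimate h.
f = np.full(64, y.mean())                 # object estimate
h = np.ones(11) / 11                      # flat initial PSF guess
for _ in range(100):
    pred = np.maximum(np.convolve(f, h, mode="same"), 1e-12)
    f *= np.convolve(y / pred, h[::-1], mode="same")
    pred = np.maximum(np.convolve(f, h, mode="same"), 1e-12)
    # PSF update: correlate the ratio image with the object estimate;
    # lags -5..+5 sit at indices 58..68 of the full correlation.
    corr = np.convolve(y / pred, f[::-1], mode="full")
    h *= corr[58:69] / max(f.sum(), 1e-12)
    h = np.maximum(h, 0.0)
    h /= h.sum()                          # keep the PSF normalized
```

Renormalizing the PSF each iteration resolves the inherent scale ambiguity between object and PSF; in practice convergence of blind schemes like this is sensitive to initialization and noise, which is the behavior the paper analyzes.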

D. A. Fish; A. M. Brinicombe; E. R. Pike; J. G. Walker



Deconvolution of wellbore pressure and flow rate  

SciTech Connect

Determination of the influence function of a well/reservoir system from the deconvolution of wellbore flow rate and pressure is presented. Deconvolution is fundamental and is particularly applicable to system identification. A variety of different deconvolution algorithms are presented. The simplest algorithm is a direct method that works well for data without measurement noise but that fails in the presence of even small amounts of noise. The authors show, however, that a modified algorithm that imposes constraints on the solution set works well, even with significant measurement errors.
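The noise sensitivity of the direct method described above is easy to reproduce: writing causal convolution as a lower-triangular Toeplitz system gives exact recovery on clean data, while tiny measurement noise is amplified through the triangular solve. The impulse response below is an illustrative worst case, not wellbore data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Causal convolution y = g * u as a lower-triangular Toeplitz system.
g = np.array([1.0, 2.0])                  # non-minimum-phase impulse response
n = 30
u_true = np.zeros(n); u_true[[3, 10]] = [1.0, -0.5]
G = np.zeros((n, n))
for j in range(n):
    m = min(len(g), n - j)
    G[j:j + m, j] = g[:m]
y = G @ u_true

# Direct deconvolution: exact without measurement noise...
u_clean = np.linalg.solve(G, y)

# ...but unstable once small noise is added: the triangular back-substitution
# amplifies errors geometrically for this impulse response.
y_noisy = y + 1e-6 * rng.standard_normal(n)
u_noisy = np.linalg.solve(G, y_noisy)
```

This is the failure mode the abstract describes, and it motivates the constrained algorithm: imposing constraints on the solution set suppresses exactly this kind of error growth.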

Kuchuk, F.J. (Schlumberger-Doll Research Center, Ridgefield, CT (USA)); Carter, R.G. (National Aeronautics and Space Administration, Hampton, VA (USA). Langley Research Center); Ayestaran, L. (Schlumberger Technical Services, Dubai (AE))



Deconvolution and Optimal Filtering in Seismology  

NASA Astrophysics Data System (ADS)

Deconvolution is an important, well-studied problem that is commonly encountered in seismology [1-4]. During hydrocarbon exploration, seismic receivers measure a noisy version of the earth’s response that is blurred by a source wavelet (for example, from marine air guns, or land dynamite charges). Deconvolution becomes necessary to deduce the earth’s response from the blurred and noisy receiver measurements. In earthquake seismology, the receivers measure seismic waves generated by earthquakes. In this case, deconvolution is employed to isolate the earthquake source time function, which characterizes the underground faulting process, from propagation effects [5-7].
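A standard building block in this setting is the Wiener (regularized inverse) filter, which deconvolves in the frequency domain while suppressing noise at frequencies where the wavelet spectrum is weak. A sketch with an illustrative wavelet and reflectivity series (not from the chapter):

```python
import numpy as np

rng = np.random.default_rng(5)

# Reflectivity series blurred by a smooth source wavelet (circular convolution).
n = 256
refl = np.zeros(n); refl[[40, 90, 91, 170]] = [1.0, -0.8, 0.5, 0.6]
t = np.arange(n)
wavelet = np.exp(-((t - 8) / 3.0)**2)           # hypothetical smooth wavelet
trace = np.real(np.fft.ifft(np.fft.fft(refl) * np.fft.fft(wavelet)))
trace += 0.01 * rng.standard_normal(n)

# Wiener (regularized inverse) filter in the frequency domain:
#   X_hat(f) = conj(W(f)) / (|W(f)|^2 + lam) * Y(f)
W = np.fft.fft(wavelet)
lam = 1e-2                                      # noise-dependent regularization
est = np.real(np.fft.ifft(np.conj(W) / (np.abs(W)**2 + lam) * np.fft.fft(trace)))

# Naive inverse filtering (lam = 0) amplifies noise where |W| is tiny.
naive = np.real(np.fft.ifft(np.fft.fft(trace) / W))
```

The regularization constant plays the role of the noise-to-signal ratio in the full Wiener derivation; the naive inverse is included only to show why some regularization is essential.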

Neelamani, Ramesh


Method and tool for network vulnerability analysis  


A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
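Once edges are weighted by metrics such as attacker effort, finding a high-risk path is a shortest-path computation over the attack graph. A minimal sketch using Dijkstra's algorithm; the node names and weights are invented for illustration and are not from the patent:

```python
import heapq

# Hypothetical attack graph: nodes are attack states, edge weights are
# attacker "effort" scores (illustrative numbers only).
graph = {
    "internet":     [("dmz_web", 3.0), ("vpn", 7.0)],
    "dmz_web":      [("app_server", 4.0), ("vpn", 2.0)],
    "vpn":          [("internal_lan", 3.0)],
    "app_server":   [("internal_lan", 1.0), ("db", 5.0)],
    "internal_lan": [("db", 2.0)],
    "db":           [],
}

def cheapest_attack_path(graph, start, goal):
    """Dijkstra search for the minimum-effort (highest-risk) attack path."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

cost, path = cheapest_attack_path(graph, "internet", "db")
```

The patent's "epsilon optimal paths" generalize this by keeping all paths within a tolerance of the optimum, so that countermeasures can be targeted at the handful of cheapest routes rather than a single one.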

Swiler, Laura Painton (Albuquerque, NM); Phillips, Cynthia A. (Albuquerque, NM)



Timeline analysis tools for law enforcement  

NASA Astrophysics Data System (ADS)

The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program data visualization, manipulation and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User funded enhancements and Rome Lab funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, results of the initial NYSP evaluation and the plan for a more comprehensive NYSP evaluation.

Mucks, John



PyRAT - python radiography analysis tool (u)  

SciTech Connect

PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on LINUX and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed variable optimization tool to perform the optimization.

Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory; Armstrong, Jerawan C [Los Alamos National Laboratory



Parallelization of a Blind Deconvolution Algorithm (Postprint).  

National Technical Information Service (NTIS)

Often it is of interest to deblur imagery in order to obtain higher- resolution images. Deblurring requires knowledge of the blurring function - information that is often not available separately from the blurred imagery. Blind deconvolution algorithms ov...

C. L. Matson K. J. Borelli



Monte Carlo Approach to Numerical Deconvolution.  

National Technical Information Service (NTIS)

A numerical procedure for solving deconvolution problems is presented. The procedure is based on the Monte Carlo method, which statistically estimates each element in the deconvolved excitation. A discrete Fourier transform technique is used to improve th...

M. P. Ekstrom



Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)  

SciTech Connect

NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

Melaina, M.; Penev, M.



Use risk analysis as a safety tool  

SciTech Connect

Safety managers of hydrocarbon processing industry (HPI) facilities must use a yardstick to estimate potential losses from accidents and convert accident risk into a quantified value. Quantitative risk assessment (QRA) is a yardstick that places numeric values on losses such as fatalities to employees and the public, interruption of business during reconstruction, damage to buildings, equipment, etc. and departure from business areas. No realistic method can remove all risk from an operating facility. However, stricter environmental and worker protection regulations require HPI managers to evaluate their operations, find vulnerable or hot-spot areas and take action. Logical analysis methods such as fault tree, event tree or cause-consequence diagrams can use safety audit information to identify and rank high-risk areas. After finding hazardous situations, manpower and capital can be allocated to eliminate potential accidents and reduce risk. The paper discusses QRA as a management tool, underlying principles, yardsticks to quantify risk, event analysis, limitations of QRA, and setting-up of a QRA project.

Mani, G. (Kuwait National Petroleum Co. (Kuwait). Shuaiba Refinery)



Three-Dimensional Imaging by Deconvolution Microscopy  

Microsoft Academic Search

Deconvolution is a computational method used to reduce out-of-focus fluorescence in three-dimensional (3D) microscope images. It can be applied in principle to any type of microscope image but has most often been used to improve images from conventional fluorescence microscopes. Compared to other forms of 3D light microscopy, like confocal microscopy, the advantage of deconvolution microscopy is that it can

James G. McNally; Tatiana Karpova; John Cooper; José Angel Conchello



Receiver function estimated by maximum entropy deconvolution  

Microsoft Academic Search

Maximum entropy deconvolution is presented to estimate receiver function, with the maximum entropy as the rule to determine auto-correlation and cross-correlation functions. The Toeplitz equation and Levinson algorithm are used to calculate the iterative formula of the error-predicting filter, and the receiver function is then estimated. During extrapolation, the reflective coefficient is always less than 1, which keeps maximum entropy deconvolution stable. The

Qing-Ju Wu; Xiao-Bo Tian; Nai-Ling Zhang; Wei-Ping Li; Rong-Sheng Zeng
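The Toeplitz/Levinson step described in the abstract above, solving a symmetric Toeplitz system of normal equations for a prediction filter, can be sketched with SciPy's Levinson-based solver; the autocorrelation values below are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.linalg import solve_toeplitz, toeplitz

# Illustrative autocorrelation sequence of an AR(1)-like signal (rho = 0.5).
r = np.array([1.0, 0.5, 0.25, 0.125])

# A 3-tap prediction filter a solves the Toeplitz normal equations R a = r[1:4],
# where R is the 3x3 autocorrelation matrix with first column r[:3].
a = solve_toeplitz(r[:3], r[1:4])            # Levinson-Durbin recursion inside

# Cross-check against a dense solve of the explicitly formed Toeplitz matrix.
a_ref = np.linalg.solve(toeplitz(r[:3]), r[1:4])
```

For a true AR(1) process with coefficient 0.5, both solves recover the one-tap predictor [0.5, 0, 0]; the Levinson route does it in O(n^2) rather than O(n^3).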



New Analysis Techniques in the CEPBA-Tools Environment  

NASA Astrophysics Data System (ADS)

The CEPBA tools environment is a performance analysis environment that initially focused on trace visualization and analysis. Current development efforts try to go beyond the presentation of simple statistics by introducing more intelligence in the analysis of the raw data.

Labarta, Jesus


Principal components analysis as a tool for Quaternary paleoclimatic research  

SciTech Connect

Nine small lakes on southeast Baffin Island, NWT, Canada, were cored and the sediments retrieved were analyzed for sediment size and composition, magnetic susceptibility, sediment geochemistry, organic matter content, and carbon isotopic composition. Age control was obtained from 85 AMS radiocarbon dates. In total, 1,847 measurements were made on twelve cores. The size of the data set precluded the use of visual analysis of the trends within each of the variable data sets. The method used to deconvolute the paleoenvironmental signal was one of principal components analysis and regression. Principal components analysis was carried out on the entire data set to determine which variables caused most of the variance within the overall signal. This showed that three principal components axes (PCAs) could account for 79% of the total variance within the data set. For each PCA, the closest correlated variable was chosen (sand content, total organic matter content, and sedimentation rate) and for each lake core, this variable was regressed against time. Residuals from the regression trend were then derived and normalized to a Z score. Z scores for each variable were plotted against age. Then, within 500 year timeslots, the median residual Z score was determined. This gave a stepped record of residuals throughout the Holocene and indicated periods of significant environmental change within the lakes' watersheds. Comparing this to previously obtained pollen and diatom records from the same area showed similarity and also illustrated important local differences.

Miller, R.J.O. (Univ. of Colorado, Boulder, CO (United States). Dept. of Geological Sciences)
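The variance-partitioning step in the abstract above (three axes accounting for 79% of the variance) can be sketched as a fraction-of-variance computation on a standardized data matrix; the synthetic data below merely stand in for the lake-sediment measurements.

```python
import numpy as np

def pca_variance_explained(X, n_components=3):
    """Fraction of total variance captured by the first n principal
    components of a (samples x variables) data matrix."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each variable
    # eigenvalues of the correlation matrix, sorted descending
    eigvals = np.linalg.eigvalsh(np.cov(Xs, rowvar=False))[::-1]
    return eigvals[:n_components].sum() / eigvals.sum()

# Synthetic data with three dominant latent factors plus small noise.
rng = np.random.default_rng(1)
latent = rng.normal(size=(300, 3))
loadings = rng.normal(size=(3, 12))
X = latent @ loadings + 0.05 * rng.normal(size=(300, 12))
frac = pca_variance_explained(X, n_components=3)
```

Because the synthetic data are built from three latent factors, the first three components capture nearly all of the variance; real multiproxy data behave like the 79% figure reported above.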



Compensated deconvolution from wavefront sensing  

NASA Astrophysics Data System (ADS)

The U.S. Air Force has a continuing mission to obtain imagery of earth-orbiting objects. One of the means for obtaining this imagery is through the use of ground-based observatories. A fundamental problem associated with imaging objects through the atmosphere is that atmospheric turbulence inflicts a large, random aberration on the telescope which effectively limits the realizable resolution to that of a much smaller telescope. Several approaches have been taken to overcome these effects including pure post processing, pure adaptive optics, and hybrid techniques involving both adaptive optics and image post processing. One key result from past approaches is that partially compensated systems can be used in conjunction with image processing to overcome most of the optical effects of atmospheric turbulence while retaining nearly the performance of a fully compensated system. One hybrid approach is compensated deconvolution from wavefront sensing (CDWFS). This method uses wavefront sensor measurements in conjunction with short exposure images to improve the effective optical performance. This thesis formulates and executes a plan which allows fundamental questions regarding partially compensated adaptive optics performance to be answered. Specifically, imaging of extended objects using the CDWFS technique is investigated, through simulation. The simulation results demonstrate that the CDWFS technique can be used to reduce the required closed-loop bandwidth of an imaging system, permitting longer integration times in the wavefront sensor, and thus allowing dimmer objects to be imaged without the use of an artificial guidestar.

Thorson, Lori A.



Forensic physical memory analysis: an overview of tools and techniques  

Microsoft Academic Search

Forensic physical memory analysis has gradually evolved from basic techniques such as string searching to more complex methods. As computer malware becomes more sophisticated, tools and techniques for memory analysis suffer inadequacies. Given this, it is essential to examine tools and techniques for physical memory analysis. By understanding their behaviour, limitations and advantages it is possible then

Gabriela Limon Garcia


Constrained iterations for blind deconvolution and convexity issues  

NASA Astrophysics Data System (ADS)

The need for image restoration arises in many applications of various scientific disciplines, such as medicine and astronomy and, in general, whenever an unknown image must be recovered from blurred and noisy data [M. Bertero, P. Boccacci, Introduction to Inverse Problems in Imaging, Institute of Physics Publishing, Philadelphia, PA, USA, 1998]. The algorithm studied in this work restores the image without the knowledge of the blur, using little a priori information and a blind inverse filter iteration. It represents a variation of the methods proposed in Kundur and Hatzinakos [A novel blind deconvolution scheme for image restoration using recursive filtering, IEEE Trans. Signal Process. 46(2) (1998) 375-390] and Ng et al. [Regularization of RIF blind image deconvolution, IEEE Trans. Image Process. 9(6) (2000) 1130-1134]. The problem of interest here is an inverse one, that cannot be solved by simple filtering since it is ill-posed. The imaging system is assumed to be linear and space-invariant: this allows a simplified relationship between unknown and observed images, described by a point spread function modeling the distortion. The blurring, though, makes the restoration ill-conditioned: regularization is therefore also needed, obtained by adding constraints to the formulation. The restoration is modeled as a constrained minimization: particular attention is given here to the analysis of the objective function and on establishing whether or not it is a convex function, whose minima can be located by classic optimization techniques and descent methods. Numerical examples are applied to simulated data and to real data derived from various applications. Comparison with the behavior of methods [D. Kundur, D. Hatzinakos, A novel blind deconvolution scheme for image restoration using recursive filtering, IEEE Trans. Signal Process. 46(2) (1998) 375-390] and [M. Ng, R.J. Plemmons, S. Qiao, Regularization of RIF Blind Image Deconvolution, IEEE Trans. Image Process. 9(6) (2000) 1130-1134] show the effectiveness of our variant.

Spaletta, Giulia; Caucci, Luca



Stereo Wavesurfer: A Tool for Dialog Analysis  

Microsoft Academic Search

Researchers in the Interactive Systems Group at UTEP have been using a research tool called Didi for some time now. It was originally designed to be easily adaptable. This tool has proven to be adaptable as it has been changed by different researchers to suit particular needs. As a result, multiple versions of the program exist. In addition to this,

Ernesto Medina; Thamar Solorio


Computational tools for poverty measurement and analysis  

Microsoft Academic Search

This paper introduces some relatively straightforward computational tools for estimating poverty measures from the sort of data that are typically available from published sources. All that is required for using these tools is an elementary regression package. The methodology also easily lends itself to a number of poverty simulations, some of which are discussed. The paper addresses the central question:

Gaurav Datt



Deconvolution of immittance data: some old and new methods  

SciTech Connect

The background and history of various deconvolution approaches are briefly summarized; different methods are compared; and available computational resources are described. These underutilized data analysis methods are valuable in both electrochemistry and immittance spectroscopy areas, and freely available computer programs are cited that provide an automatic test of the appropriateness of Kronig-Kramers transforms, a powerful nonlinear-least-squares inversion method, and a new Monte-Carlo inversion method. The important distinction, usually ignored, between discrete-point distributions and continuous ones is emphasized, and both recent parametric and non-parametric deconvolution/inversion procedures for frequency-response data are discussed and compared. Information missing in a recent parametric measurement-model deconvolution approach is pointed out and remedied, and its priority evaluated. Comparisons are presented between the standard parametric least squares inversion method and a new non-parametric Monte Carlo one that allows complicated composite distributions of relaxation times (DRT) to be accurately estimated without the uncertainty present with regularization methods. Also, detailed Monte-Carlo DRT estimates for the supercooled liquid 0.4Ca(NO3)2·0.6KNO3 (CKN) at 350 K are compared with appropriate frequency-response-model fit results. These composite models were derived from stretched-exponential Kohlrausch temporal response with the inclusion of either of two different series electrode-polarization functions.

Tuncer, Enis [ORNL; Macdonald, Ross J. [University of North Carolina



Scalable analysis tools for sensitivity analysis and UQ (3160) results.  

SciTech Connect

The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.



Statistical expression deconvolution from mixed tissue samples  

PubMed Central

Motivation: Global expression patterns within cells are used for purposes ranging from the identification of disease biomarkers to basic understanding of cellular processes. Unfortunately, tissue samples used in cancer studies are usually composed of multiple cell types and the non-cancerous portions can significantly affect expression profiles. This severely limits the conclusions that can be made about the specificity of gene expression in the cell-type of interest. However, statistical analysis can be used to identify differentially expressed genes that are related to the biological question being studied. Results: We propose a statistical approach to expression deconvolution from mixed tissue samples in which the proportion of each component cell type is unknown. Our method estimates the proportion of each component in a mixed tissue sample; this estimate can be used to provide estimates of gene expression from each component. We demonstrate our technique on xenograft samples from breast cancer research and publicly available experimental datasets found in the National Center for Biotechnology Information Gene Expression Omnibus repository. Availability: R code for estimating sample proportions is freely available to non-commercial users. Contact:

Clarke, Jennifer; Seo, Pearl; Clarke, Bertrand
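The paper above estimates proportions without assuming known cell-type signatures. As a simpler, related illustration, when reference expression profiles for each component are available, mixing proportions can be recovered by non-negative least squares; the profiles below are synthetic, not from the study.

```python
import numpy as np
from scipy.optimize import nnls

def estimate_proportions(mixed, components):
    """Estimate mixing proportions of known component expression
    profiles in a mixed sample by non-negative least squares.
    mixed: (genes,) observed mixture; components: (genes, k) pure profiles.
    Returns proportions normalized to sum to 1."""
    coef, _ = nnls(components, mixed)
    return coef / coef.sum()

# Synthetic check: mix two made-up "tissue" profiles 70/30.
rng = np.random.default_rng(0)
pure = rng.uniform(1, 10, size=(500, 2))
mix = 0.7 * pure[:, 0] + 0.3 * pure[:, 1]
p = estimate_proportions(mix, pure)
```

With noiseless data the non-negativity constraint is inactive and the least-squares solution recovers the 70/30 split exactly; with noisy data the same call returns a regularized estimate.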



Statistical Tools for Forensic Analysis of Toolmarks.  

National Technical Information Service (NTIS)

Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition t...

D. Baldwin M. Morris S. Bajic Z. Zhou M. J. Kreiser



Tools for Knowledge Analysis, Synthesis, and Sharing  

NASA Astrophysics Data System (ADS)

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

Medland, Michael B.



Software Tools for High-Throughput Analysis and Archiving of Immunohistochemistry Staining Data Obtained with Tissue Microarrays  

PubMed Central

The creation of tissue microarrays (TMAs) allows for the rapid immunohistochemical analysis of thousands of tissue samples, with numerous different antibodies per sample. This technical development has created a need for tools to aid in the analysis and archival storage of the large amounts of data generated. We have developed a comprehensive system for high-throughput analysis and storage of TMA immunostaining data, using a combination of commercially available systems and novel software applications developed in our laboratory specifically for this purpose. Staining results are recorded directly into an Excel worksheet and are reformatted by a novel program (TMA-Deconvoluter) into a format suitable for hierarchical clustering analysis or other statistical analysis. Hierarchical clustering analysis is a powerful means of assessing relatedness within groups of tumors, based on their immunostaining with a panel of antibodies. Other analyses, such as generation of survival curves, construction of Cox regression models, or assessment of intra- or interobserver variation, can also be done readily on the reformatted data. Finally, the immunoprofile of a specific case can be rapidly retrieved from the archives and reviewed through the use of Stainfinder, a novel web-based program that creates a direct link between the clustered data and a digital image database. An on-line demonstration of this system is available at

Liu, Chih Long; Prapong, Wijan; Natkunam, Yasodha; Alizadeh, Ash; Montgomery, Kelli; Gilks, C. Blake; van de Rijn, Matt



Quantitative SWOT analysis on global competitiveness of machine tool industry  

Microsoft Academic Search

The strategic importance of global competitiveness of the machine tool industry in Japan is steadily increasing, and therefore machine tool manufacturers require effective corporate strategy to achieve sustainable competitive advantages. Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis is one of the most effective approaches used for analysing strategic management policy of an organization. However, the use of conventional SWOT analysis

H. Shinno; H. Yoshioka; S. Marpaung; S. Hachiga



SWOT Analysis Support Tool for Verification of Business Strategy  

Microsoft Academic Search

To verify business strategies, it is essential to extract and analyze business information. The amount of business information is so enormous that it is time-consuming for an analyst to extract and analyze. Thus, we propose a support tool for verifying business strategies using SWOT analysis. This tool supports the extraction of factors for analysis. However, the information includes unnecessary factors,

M. Samejima; Y. Shimizu; M. Akiyoshi; N. Komoda



Effect of static analysis tools on software security: preliminary investigation  

Microsoft Academic Search

Static analysis tools can handle large-scale software and find thousands of defects. But do they improve software security? We evaluate the effect of static analysis tool use on software security in open source projects. We measure security by vulnerability reports in the National Vulnerability Database.

Vadim Okun; William F. Guthrie; Romain Gaucher; Paul E. Black



Gas chromatography coupled to mass spectrometry analysis of volatiles, sugars, organic acids and aminoacids in Valencia Late orange juice and reliability of the Automated Mass Spectral Deconvolution and Identification System for their automatic identification and quantification.  


Neutral volatiles and non-volatile polar compounds (sugars, organic acids and aminoacids) present in Valencia Late orange juice have been analysed by Gas Chromatography coupled to Mass Spectrometry (GC-MS). Before analysis, the neutral volatiles have been extracted by Headspace-Solid Phase Microextraction (HS-SPME), and the non-volatile polar compounds have been transformed to their corresponding volatile trimethylsilyl (TMS) derivatives. From the resulting raw GC-MS data files, the reliability of the Automated Mass Spectral Deconvolution and Identification System (AMDIS) to perform accurate identification and quantification of the compounds present in the sample has been tested. Hence, both raw GC-MS data files have been processed automatically by using AMDIS and manually by using Xcalibur™, the manufacturer's data processing software for the GC-MS platform used. Results indicate that the reliability of AMDIS for accurate identification and quantification of the compounds present in the sample strongly depends on a number of operational settings, for both the MS and AMDIS, which must be optimized for the particular type of assayed sample. After optimization of these settings, AMDIS and Xcalibur™ yield practically the same results. A total of 85 volatiles and 22 polar compounds have been identified and quantified in Valencia Late orange juice. PMID:22533907

Cerdán-Calero, Manuela; Sendra, José María; Sentandreu, Enrique



Maximum likelihood deconvolution: a new perspective  

SciTech Connect

Maximum-likelihood deconvolution can be presented from at least two very different points of view. Unfortunately, in most journal articles, it is couched in the mystique of state-variable models and estimation theory, both of which are generally quite foreign to geophysical signal processors. This paper explains maximum-likelihood deconvolution using the well-known convolutional model and some relatively simple ideas from optimization theory. Both of these areas should be well known to geophysical signal processors. Although it is straightforward to develop the theory of maximum-likelihood deconvolution using the convolutional model and optimization theory, this approach does not lead to practical computational algorithms. Recursive algorithms must be used; they are orders of magnitude faster than the batch algorithms that are associated with the convolutional model.

Mendel, J.M.



Multichannel Wiener deconvolution of vertical seismic profiles  

SciTech Connect

The authors describe a technique for performing optimal, least-squares deconvolution of vertical seismic profile (VSP) data. The method is a two-step process that involves (1) estimating the source signature and (2) applying a least-squares optimum deconvolution operator that minimizes the noise not coherent with the source signature estimate. The optimum inverse problem, formulated in the frequency domain, gives as a solution an operator that can be interpreted as a simple inverse to the estimated aligned signature multiplied by semblance across the array. An application to a zero-offset VSP acquired with a dynamite source shows the effectiveness of the operator in attaining the two conflicting goals of adaptively spiking the effective source signature and minimizing the noise. Signature design for seismic surveys could benefit from observing that the optimum deconvolution operator gives a flat signal spectrum if and only if the seismic source has the same amplitude spectrum as the noise.

Haldorsen, J.B.U. (Geco Prakla, Hannover (Germany)); Miller, D.E. (Schlumberger-Doll Research, Ridgefield, CT (United States)); Walsh, J.J. (Schlumberger Cambridge Research (United Kingdom))
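A minimal 1-D frequency-domain analogue of the two-step scheme above, dividing by the estimated source spectrum with least-squares damping where the signature is weak, might look like the following. The wavelet and reflectivity are invented, and this sketch omits the paper's multichannel semblance weighting.

```python
import numpy as np

def wiener_deconvolve(trace, wavelet, noise_power=1e-2):
    """Frequency-domain least-squares (Wiener) deconvolution:
    divide by the source spectrum, damped where the signal is weak."""
    n = len(trace)
    W = np.fft.rfft(wavelet, n)
    T = np.fft.rfft(trace)
    H = np.conj(W) / (np.abs(W) ** 2 + noise_power)   # regularized inverse
    return np.fft.irfft(T * H, n)

# Spike train convolved with a short, minimum-phase wavelet, then recovered.
wavelet = np.array([1.0, -0.6, 0.2])
reflectivity = np.zeros(64)
reflectivity[[10, 30]] = [1.0, -0.5]
trace = np.convolve(reflectivity, wavelet)[:64]
est = wiener_deconvolve(trace, wavelet, noise_power=1e-4)
```

With a small `noise_power` the operator spikes the wavelet almost perfectly; raising it trades resolution for noise suppression, the compromise discussed in the abstract.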



Deconvolution of Thomson scattering temperature profiles  

SciTech Connect

Deconvolution of Thomson scattering (TS) profiles is required when the gradient length of the electron temperature (T_e) or density (n_e) is comparable to the instrument function length (Δ_R). The most correct method for deconvolution to obtain underlying T_e and n_e profiles is by consideration of scattered signals. However, deconvolution at the scattered-signal level is complex since it requires knowledge of all spectral and absolute calibration data. In this paper a simple technique is presented where only knowledge of the instrument function I(r) and the measured profiles, T_e,observed(r) and n_e,observed(r), is required to obtain the underlying T_e(r) and n_e(r). This method is appropriate for most TS systems and is particularly important where high spatial sampling is obtained relative to Δ_R.

Scannell, R.; Beurskens, M.; Carolan, P. G.; Kirk, A.; Walsh, M. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxfordshire, OX14 3DB (United Kingdom); O'Gorman, T. [Department of Physics, University College Cork, Cork (Ireland); Osborne, T. H. [General Atomics, P.O. Box, San Diego, California 92186-5608 (United States)
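The paper's correction uses I(r) and the observed profiles directly; as a generic stand-in, a Van Cittert iteration with a known instrument kernel illustrates recovering an underlying profile from its instrument-blurred measurement. The kernel and profile below are invented, not the authors' method.

```python
import numpy as np

def van_cittert(observed, kernel, n_iter=200):
    """Iteratively sharpen a measured profile given a normalized
    instrument function: est <- est + (observed - kernel * est)."""
    k = np.asarray(kernel, float)
    k = k / k.sum()
    est = observed.copy()
    for _ in range(n_iter):
        est = est + (observed - np.convolve(est, k, mode="same"))
    return est

# Gaussian "temperature" profile blurred by a 3-point instrument function.
x = np.arange(64, dtype=float)
true = np.exp(-((x - 32.0) / 4.0) ** 2)
kernel = np.array([0.25, 0.5, 0.25])
observed = np.convolve(true, kernel, mode="same")
est = van_cittert(observed, kernel)
```

The fixed-point update converges because the smoothing operator's eigenvalues lie in (0, 1) for this kernel; broader instrument functions need damping or regularization.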



Deconvolution of images with periodic striping noise  

NASA Astrophysics Data System (ADS)

In this paper a new deconvolution algorithm is presented concerning images contaminated by periodic stripes. Inspired by the 2-D power spectrum distribution property of periodic stripes in the frequency domain, we construct a novel regularized inverse filter which allows the algorithm to suppress the amplification of striping noise in the Fourier inverse step and further get rid of most of them, and mirror-wavelet denoising is followed to remove the left colored noise. In simulations with striped images, this algorithm outperforms the traditional mirror-wavelet based deconvolution in terms of both visual effect and SNR comparison, only at the expense of slightly heavier computation load. The same idea about regularized inverse filter can also be used to improve other deconvolution algorithms, such as wavelet packets and wiener filters, when they are employed to images stained by periodic stripes.

Wang, Zuoguan; Xu, Wujun; Fu, Yutian
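A 1-D analogue of the regularized inverse filter above can be sketched by damping the deconvolution where the blur spectrum is small and explicitly zeroing the known stripe-frequency bin. The blur kernel, stripe frequency and scene are all invented for the demonstration, and the mirror-wavelet denoising stage is omitted.

```python
import numpy as np

def notch_inverse_filter(data, kernel, stripe_bins, eps=1e-3):
    """Regularized Fourier inverse filter that also zeroes the
    frequency bins carrying periodic striping noise."""
    n = len(data)
    K = np.fft.rfft(kernel, n)
    X = np.fft.rfft(data) * np.conj(K) / (np.abs(K) ** 2 + eps)
    X[list(stripe_bins)] = 0.0          # notch the stripe harmonics
    return np.fft.irfft(X, n)

# Smooth scene, blurred, plus a periodic stripe at DFT bin 20.
n = 128
x = np.arange(n, dtype=float)
true = np.exp(-((x - 64.0) / 10.0) ** 2)
kernel = np.array([0.25, 0.5, 0.25])
stripe = 0.3 * np.cos(2 * np.pi * 20 * x / n)
data = np.convolve(true, kernel)[:n] + stripe
restored = notch_inverse_filter(data, kernel, stripe_bins=[20])
```

Because a smooth scene has almost no energy at the stripe frequency, the notch removes the stripe while the `eps` term keeps the inverse from amplifying it beforehand.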



Wavelet-Based Deconvolution for Ill-Conditioned Systems.  

National Technical Information Service (NTIS)

This thesis proposes a new approach to wavelet-based image deconvolution that comprises Fourier-domain system inversion followed by wavelet-domain noise suppression. In contrast to other wavelet-based deconvolution approaches, the algorithm employs a regu...

R. Neelamani



Super-exponential algorithms for multichannel blind deconvolution  

Microsoft Academic Search

Multichannel blind deconvolution has been receiving increasing attention. Shalvi and Weinstein proposed an attractive approach to single-channel blind deconvolution called the super-exponential methods. The objective of this correspondence is to extend the Shalvi and Weinstein (1993, 1994) approach to the multichannel case and present super-exponential algorithms for multichannel blind deconvolution. We propose three approaches to multichannel blind deconvolution. In the

Yujiro Inouye; Kazuaki Tanebe



Werner deconvolution for variable altitude aeromagnetic data  

SciTech Connect

The standard Werner deconvolution method is extended to include the effects of variable sensor altitude but this leads to a deconvolution algorithm that is unstable for slowly changing flight height. By expressing the sensor altitude as a linear function of horizontal position (within a specified window), the authors show that the numerical instability can be avoided. The subsequent selection and averaging of the raw solutions is controlled by three parameters that can be adjusted to specific survey data characteristics. Results for an aeromagnetic survey over Vancouver Island, British Columbia show that, in comparison with the variable altitude approach, the standard Werner method produces unacceptable errors when applied to variable altitude data.

Ostrowski, J.S. (Horler Information Inc., Ottawa, Ontario (Canada)); Pilkington, M.; Teskey, D.J. (Geological Survey of Canada, Ottawa, Ontario (Canada))



Tools for Knowledge Analysis, Synthesis, and Sharing  

ERIC Educational Resources Information Center

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

Medland, Michael B.



SIMMER as a safety analysis tool  

SciTech Connect

SIMMER has been used for numerous applications in fast reactor safety, encompassing both accident and experiment analysis. Recent analyses of transition-phase behavior in potential core disruptive accidents have integrated SIMMER testing with the accident analysis. Results of both the accident analysis and the verification effort are presented as a comprehensive safety analysis program.

Smith, L.L.; Bell, C.R.; Bohl, W.R.; Bott, T.F.; Dearing, J.F.; Luck, L.B.



Comment on "Atomic spectral line-free parameter deconvolution procedure".  


Recently Milosavljević and Poparić [Phys. Rev. E 63, 036404 (2001)] proposed a method for the deconvolution of isolated asymmetric plasma-broadened atomic (neutral) spectral lines. The authors claim that their method enables complete plasma diagnostics by applying this deconvolution to a single experimental line profile. In the present Comment the proposed deconvolution procedure and its application are reexamined. PMID:12786332

Nikolić, D; Djurović, S; Mijatović, Z; Kobilarov, R



IC failure analysis: techniques and tools for quality reliability improvement  

Microsoft Academic Search

The role of failure analysis is discussed. Failure analysis techniques and tools, including electrical measurements, optical microscopy, thermal imaging analysis, electron beam techniques, light emission microscopy, ion beam techniques, and scanning probe microscopy, are reviewed. Opportunities for advances in the field of IC failure analysis are considered




Tools for Knowledge Analysis, Synthesis, and Sharing  

Microsoft Academic Search

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact

Michael B. Medland



Dynamics Diagnostics: Methods, Equipment and Analysis Tools  

Microsoft Academic Search

There are two reasons why it is important to diagnose, identify, or analyse the dynamic behaviour of machine tools. Firstly, the many techniques that can be used to optimise the machining process invariably require some model of the structural dynamics. This is especially true for the problem of avoiding unstable chatter vibrations, and for predicting the surface finish. Knowledge of

Neil D. Sims


Global Spatial Deconvolution of Lunar Prospector Th Abundances Using the Pixon and Jansson Deconvolution Methods  

NASA Astrophysics Data System (ADS)

We have carried out a spatial deconvolution of global lunar thorium abundances using two methods: Pixon and Jansson. We conclude that the Pixon method produces significantly improved deconvolved maps, which we use to revisit geologically complex regions,

Lawrence, D. J.; Puetter, R. C.; Elphic, R. C.; Feldman, W. C.; Hagerty, J. J.; Prettyman, T. H.; Spudis, P. D.



Tools for Physics Analysis in CMS  

NASA Astrophysics Data System (ADS)

The CMS Physics Analysis Toolkit (PAT) is presented. The PAT is a high-level analysis layer enabling the development of common analysis efforts across and within physics analysis groups. It aims at fulfilling the needs of most CMS analyses, providing both ease-of-use for the beginner and flexibility for the advanced user. The main PAT concepts are described in detail and some examples from realistic physics analyses are given.

Hinzmann, Andreas



GRPANL: a program for deconvoluting and interpreting complex peak clusters  

SciTech Connect

GRPANL (GRouP ANaLysis) is a general-purpose peak fitting program that first determines gamma-ray and x-ray energies and intensities for specified peaks or clusters of peaks in a spectrum and then proceeds to interpret these results, determining both the radioisotopes detected and the amounts of each in the sample. Versions of the program are now running on the Digital Equipment Corporation (DEC) VAX and PDP-11 computers. The code has several unique capabilities for deconvoluting and interpreting difficult analytical situations that other codes usually cannot handle.

Gunnink, R.; Ruhter, W.D.; Niday, J.B.
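A toy version of the peak-cluster deconvolution step, fitting two overlapping Gaussian photopeaks with a shared width, can be written with SciPy's least-squares fitter. The peak positions, amplitudes and channel grid are invented; GRPANL itself uses far richer peak-shape and interpretation models.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, c1, a2, c2, w):
    """Two Gaussian peaks with a shared width, as in a simple
    gamma-line doublet fit (centroids c1, c2; amplitudes a1, a2)."""
    return (a1 * np.exp(-0.5 * ((x - c1) / w) ** 2)
            + a2 * np.exp(-0.5 * ((x - c2) / w) ** 2))

# Synthetic overlapping doublet on a 400-channel grid.
x = np.linspace(0, 40, 400)
y = two_gaussians(x, 100.0, 18.0, 60.0, 22.0, 1.5)

p0 = [80.0, 17.0, 80.0, 23.0, 2.0]       # rough initial guesses
popt, _ = curve_fit(two_gaussians, x, y, p0=p0)
```

From reasonable starting values the fit separates the doublet cleanly; the recovered centroids and amplitudes are the quantities a code like GRPANL would then map to isotope energies and activities.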



A simple maximum entropy deconvolution algorithm  

NASA Astrophysics Data System (ADS)

A simple maximum entropy image deconvolution algorithm, now implemented in the Astronomical Image Processing System AIPS as task VM, is described. VM uses a simple Newton-Raphson approach to optimise the relative entropy of the image subject to constraints upon the rms error and total power enforced by Lagrange multipliers. Some examples of the application of VM to VLA data are given.

Cornwell, T. J.; Evans, K. F.



Blind deconvolution through digital signal processing  

Microsoft Academic Search

This paper addresses the problem of deconvolving two signals when both are unknown. The authors call this problem blind deconvolution. The discussion develops two related solutions which can be applied through digital signal processing in certain practical cases. The case of reverberated and resonated sound forms the center of the development. The specific problem of restoring old acoustic recordings provides

T. M. Cannon; R. B. Ingebretsen



Iterative blind deconvolution method and its applications  

Microsoft Academic Search

A simple iterative technique has been developed for blind deconvolution of two convolved functions. The method is described, and a number of results obtained from a computational implementation are presented. Some further possible applications are indicated. The convolution c(x) of two functions, f(x) and g(x), can be expressed mathematically by the integral equation
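In discrete form, the convolution relation the abstract refers to is a finite sum; the following minimal sketch (illustrative, not the authors' code) computes c = f * g directly.

```python
def convolve(f, g):
    """Discrete analogue of c(x) = integral of f(t) g(x - t) dt."""
    c = [0.0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            c[i + j] += fi * gj
    return c
```

For instance, `convolve([1, 2], [3, 4])` gives `[3.0, 10.0, 8.0]`. Blind deconvolution is the much harder inverse problem: recovering both f and g from c alone.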

G. R. Ayers; J. C. Dainty



Super-exponential methods for blind deconvolution  

Microsoft Academic Search

A class of iterative methods for solving the blind deconvolution problem, i.e. for recovering the input of an unknown, possibly nonminimum-phase linear system by observation of its output, is presented. These methods are universal (they do not require prior knowledge of the input distribution), are computationally efficient and statistically stable, and converge to the desired solution regardless of initialization at a

Ofir Shalvi; Ehud Weinstein



Fast deconvolution of multichannel systems using regularization  

Microsoft Academic Search

A very fast deconvolution method, which is based on the fast Fourier transform (FFT), can be used to control the outputs from a multichannel plant comprising any number of control sources and error sensors. The result is a matrix of causal finite impulse response filters whose performance is optimized at a large number of discrete frequencies. The paper is particularly
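The frequency-domain regularization described here amounts to replacing the exact inverse 1/H(k) with H*(k)/(|H(k)|² + β), which bounds the filter gain wherever the plant response H is small. A minimal single-channel sketch follows; the naive DFT routines and the value of β are illustrative assumptions, and the paper's method handles full matrices of responses for many sources and sensors.

```python
import cmath

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def deconvolve(y, h, beta=1e-3):
    """Regularized frequency-domain deconvolution (circular convolution model)."""
    N = len(y)
    H = dft(list(h) + [0.0] * (N - len(h)))   # zero-pad the impulse response
    Y = dft(y)
    # Regularized inverse: conj(H)/(|H|^2 + beta) caps the gain where H is small.
    X = [Hk.conjugate() * Yk / (abs(Hk) ** 2 + beta) for Hk, Yk in zip(H, Y)]
    return [v.real for v in idft(X)]
```

In a real multichannel implementation the DFTs would be FFTs, which is where the speed of the method comes from.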

Ole Kirkeby; Philip A. Nelson; Hareo Hamada; Felipe Orduna-Bustamante



Multichannel blind seismic deconvolution using dynamic programming  

Microsoft Academic Search

In this paper, we present an algorithm for multichannel blind deconvolution of seismic signals, which exploits the lateral continuity of earth layers using a dynamic programming approach. We assume that reflectors in consecutive channels, related to distinct layers, form continuous paths across channels. We introduce a measure for evaluating the quality of a continuous path, and iteratively apply dynamic programming to

Alon Heimer; Israel Cohen



Physically constrained Fourier transform deconvolution method.  


An iterative Fourier-transform-based deconvolution method for resolution enhancement is presented. This method makes use of the a priori information that the data are real and positive. The method is robust in the presence of noise and is efficient especially for large data sets, since the fast Fourier transform can be employed. PMID:19412237
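A positivity constraint of the kind the abstract describes can be imposed in the simplest iterative schemes by clipping negative values at each step. The sketch below uses a Van Cittert-style spatial-domain iteration rather than the paper's Fourier formulation; the kernel, step size, and iteration count are illustrative assumptions.

```python
def conv_same(x, h):
    """Linear convolution truncated to len(x); h is centered at len(h)//2."""
    n, c = len(x), len(h) // 2
    out = []
    for i in range(n):
        s = 0.0
        for j, hj in enumerate(h):
            k = i + j - c
            if 0 <= k < n:
                s += x[k] * hj
        out.append(s)
    return out

def van_cittert_positive(y, h, alpha=0.5, iters=300):
    """Iterative deconvolution with positivity enforced at every step:
    x <- max(0, x + alpha * (y - h * x))."""
    x = [max(v, 0.0) for v in y]
    for _ in range(iters):
        r = conv_same(x, h)
        x = [max(0.0, xi + alpha * (yi - ri)) for xi, yi, ri in zip(x, y, r)]
    return x
```

On noise-free data blurred by a short smoothing kernel, the iteration recovers an isolated spike while never producing negative values.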

Flaherty, Francis A



ATACOBOL: A COBOL Test Coverage Analysis Tool and Its Applications  

Microsoft Academic Search

A coverage testing tool, ATACOBOL (Automatic Test Analysis for COBOL), that applies the data flow coverage technique is developed for software development on the IBM System/390 mainframe. We show that the data flow coverage criteria can identify possible problematic paths that map to the actual testing semantics required by Y2K compliance software testing. However, the mainframe environment lacks testing tools that equip

Sam K. S. Sze; Michael R. Lyu



SIMPLE: a universal tool box for event trace analysis  

Microsoft Academic Search

The event trace analysis system SIMPLE allows the evaluation of arbitrarily formatted event traces. SIMPLE is designed as a software package which comprises independent tools that are all based on a new kind of event trace access: the trace format is described in a trace description language (TDL) and evaluation tools access the event trace through a standardized problem-oriented event

P. Dauphin; R. Hofmann; F. Lemmen; B. Mohr



Principles and Tools for Collaborative Entity-Based Intelligence Analysis  

Microsoft Academic Search

Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have

Eric A. Bier; Stuart K. Card; John W. Bodnar



Entity-based collaboration tools for intelligence analysis  

Microsoft Academic Search

Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have

Eric A. Bier; Stuart K. Card; John W. Bodnar



BRFSS: Prevalence Data and Data Analysis Tools  

NSDL National Science Digital Library

BRFSS is the nation's premier system of health-related telephone surveys that collect state data about U.S. residents regarding their health-related risk behaviors, chronic health conditions, and use of preventive services. BRFSS collects data in all 50 states as well as the District of Columbia and three U.S. territories. BRFSS completes more than 400,000 adult interviews each year, making it the largest continuously conducted health survey system in the world. These tools allow the user to perform various analyses and display the data in different ways.

Centers for Disease Control and Prevention


Extending Iris: The VAO SED Analysis Tool  

NASA Astrophysics Data System (ADS)

Iris is a tool developed by the Virtual Astronomical Observatory (VAO) for building and analyzing Spectral Energy Distributions (SEDs). Iris was designed to be extensible, so that new components and models can be developed by third parties and then included at runtime. Iris can be extended in different ways: new file readers allow users to integrate data in custom formats into Iris SEDs; new models can be fitted to the data, in the form of template libraries for template fitting, data tables, and arbitrary Python functions. The interoperability-centered design of Iris and the Virtual Observatory standards and protocols can enable new science functionalities involving SED data.

Laurino, O.; Busko, I.; Cresitello-Dittmar, M.; D'Abrusco, R.; Doe, S.; Evans, J.; Pevunova, O.



A wavelet time-scale deconvolution filter design for nonstationary signal transmission systems through a multipath fading channel  

Microsoft Academic Search

This study attempts to develop a time-scale deconvolution filter for optimal signal reconstruction of nonstationary processes with a stationary increment transmitted through a multipath fading and colored noisy channel with stochastic tap coefficients. A deconvolution filter based on wavelet analysis/synthesis filter bank is proposed to solve this problem via a three-stage filter bank. A fractal signal transmitted through a multipath

Bor-Sen Chen; Yue-Chiech Chung; Der-Feng Huang



The environment power system analysis tool development program  

NASA Astrophysics Data System (ADS)

The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy to use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis, and a data base for storing system designs and results of analysis.

Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.



Statistical tools for clinical gait analysis  

Microsoft Academic Search

Gait analysis studies involve continuous curves of data measured over a gait cycle. Curve analysis and interpretation require adequate statistical methods. Three principal problems may be encountered in clinical practice: (i) the reliability of gait curves for a given patient, (ii) classifying a new subject as belonging to a given population or not and (iii) comparison of two populations (independent

A. Duhamel; J. L. Bourriez; P. Devos; P. Krystkowiak; A. Destée; P. Derambure; L. Defebvre



START: System Testability Analysis and Research Tool  

Microsoft Academic Search

START, a software package for automatic test sequencing and testability analysis of complex, hierarchically described modular systems, is described, and its use in modeling systems is examined. START uses algorithms based on information theory, heuristic search, and graph theory to solve various faces of the test sequencing and testability analysis problems. A system is modeled in the failure space as

Krishna R. Pattipati; Somnath Deb; Mahesh Dontamsetty; Amit Maitra



Visualizing patent statistics by means of social network analysis tools  

Microsoft Academic Search

The present paper reviews the literature on social network analysis with applications to bibliometric data, and in particular, patent information. Several approaches of network analysis are conducted in the field of optoelectronics to exemplify the power of network analysis tools. Cooperation networks between inventors and applicants are illustrated, emphasizing bibliometric measures such as activity, citation frequency, etc. as well as

Christian Sternitzke; Adam Bartkowski; Reinhard Schramm



Dynamic characteristic analysis on machine tool by Ansys  

Microsoft Academic Search

The paper uses the finite element method and experiments on a machine tool to carry out the analysis of dynamic characteristics. This paper uses ANSYS to do the modal analysis, which establishes a finite element model of the spring damper connected on the machine surface based on the three-dimensional solid model. Comparing test data and the results of finite element analysis, the

Hao Yang; Wu Yang; Daming Wang



Healthcare BI: a tool for meaningful analysis.  


Implementing an effective business intelligence (BI) system requires organization-wide preparation and education to allow for meaningful analysis of information. Hospital executives should take steps to ensure that: staff entering data are proficient in how the data are to be used for decision making, and integration is based on clean data from primary sources of entry; managers have the business acumen required for effective data analysis; and decision makers understand how multidimensional BI offers new ways of analysis that represent significant improvements over historical approaches using static reporting. PMID:21634274

Rohloff, Rose



The effects of error magnitude and bandwidth selection for deconvolution with unknown error distribution  

PubMed Central

The error distribution is generally unknown in deconvolution problems with real applications. A separate independent experiment is thus often conducted to collect the additional noise data in those studies. In this paper, we study nonparametric deconvolution estimation from a contaminated sample coupled with an additional noise sample. A ridge-based kernel deconvolution estimator is proposed and its asymptotic properties are investigated depending on the error magnitude. We then present a data-driven bandwidth selection algorithm combining the bootstrap method and the idea of simulation extrapolation. The finite sample performance of the proposed methods and the effects of error magnitude are evaluated through simulation studies. A real data analysis for a gene Illumina BeadArray study is performed to illustrate the use of the proposed methods.

Wang, Xiao-Feng; Ye, Deping



Capabilities of the analysis tools of the IMPEx infrastructure  

NASA Astrophysics Data System (ADS)

The EU-FP7 Project "Integrated Medium for Planetary Exploration" was established as a result of scientific collaboration between institutions across Europe and is working on the integration of a set of interactive data analysis and modeling tools in the field of space plasma and planetary physics. According to [1] these tools are comprised of AMDA, Clweb and 3DView from the data analysis and visualisation sector as well as Hybrid/MHD and Paraboloid magnetospheric models from the simulation sector. This presentation focuses on how these various tools will access observational and modeled data and display them in innovative and interactive ways.

Génot, V.; Khodachenko, M. L.; Kallio, E. J.; Topf, F.; Al-Ubaidi, T.; Gangloff, M.; Budnik, E.; Bouchemit, M.; Renard, B.; Bourel, N.; Penou, E.; André, N.; Modolo, R.; Hess, S.; Schmidt, W.; Alexeev, I. I.; Belenkaya, E. S.



On the Application of Euler Deconvolution to the Analytic Signal  

NASA Astrophysics Data System (ADS)

In recent years, papers on Euler deconvolution (ED) have used formulations that account for the unknown background field, allowing the structural index (N) to be considered an unknown to be solved for, together with the source coordinates. Among them, Hsu (2002) and Fedi and Florio (2002) independently pointed out that the use of an adequate m-order derivative of the field, instead of the field itself, allows solving for both N and the source position. For the same reason, Keating and Pilkington (2004) proposed the ED of the analytic signal. A function analyzed by ED must be homogeneous but also harmonic, because it must be possible to compute its vertical derivative, as is well known from potential field theory. Huang et al. (1995) demonstrated that the analytic signal is a homogeneous function, but, for instance, it is rather obvious that the magnetic field modulus (corresponding to the analytic signal of a gravity field) is not a harmonic function (e.g., Grant & West, 1965). Thus, it appears that a straightforward application of ED to the analytic signal is not possible, because a vertical derivative of this function cannot be computed correctly with standard potential field analysis tools. In this note we theoretically and empirically check what kind of errors are caused in ED by such a wrong assumption about analytic signal harmonicity. We discuss results on profile and map synthetic data, and use a simple method to compute the vertical derivative of non-harmonic functions measured on a horizontal plane. Our main conclusions are: 1. To approximate a correct evaluation of the vertical derivative of a non-harmonic function, it is useful to compute it with finite differences, by using upward continuation. 2. We found that the errors on the vertical derivative computed as if the analytic signal were harmonic reflect mainly on the structural index estimate; these errors can mislead an interpretation even though the depth estimates are almost correct. 3. Consistent estimates of depth and structural index are instead obtained by using a finite-difference vertical derivative of the analytic signal. 4. Analysis of a case history confirms the strong error in the estimation of the structural index if the analytic signal is treated as a harmonic function.
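The finite-difference vertical derivative via upward continuation that the authors advocate can be sketched for a 1-D profile: upward continue the field by multiplying its spectrum by exp(−|k|Δz), then difference against the original level. The profile, the naive DFT, the wavenumber convention (unit sample spacing), and the choice of Δz below are illustrative assumptions; for a harmonic test field cos(kx) the result can be checked against the analytic derivative −|k|·V.

```python
import cmath, math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def upward_continue(v, dz):
    """Attenuate each wavenumber component by exp(-|k| dz), unit sample spacing."""
    N = len(v)
    V = dft(v)
    for m in range(N):
        k = 2 * math.pi * min(m, N - m) / N
        V[m] *= math.exp(-k * dz)
    return [c.real for c in idft(V)]

def vertical_derivative(v, dz=0.1):
    """Finite-difference vertical derivative from the upward-continued field."""
    vu = upward_continue(v, dz)
    return [(a - b) / dz for a, b in zip(vu, v)]
```

The same two-level differencing applies to a non-harmonic function such as the analytic signal, which is the point of the note: the finite-difference derivative does not assume harmonicity.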

Fedi, M.; Florio, G.; Pasteka, R.



RADC SCAT: Automated Sneak Circuit Analysis Tool.  

National Technical Information Service (NTIS)

Standard Sneak Analysis procedures are costly from a time, money and personnel perspective. The processing of design data, available only during the latter portions of the development cycle, is highly labor intensive, and it is difficult to institute a design cha...

E. L. DePalma



LCD Root Simulation and Analysis Tools.  

National Technical Information Service (NTIS)

The North American Linear Collider Detector group has developed a simulation program package based on the ROOT system. The package consists of Fast simulation, the reconstruction of the Full simulated data, and physics analysis utilities.

M. Iwasaki



LCD ROOT Simulation and Analysis Tools  

SciTech Connect

The North American Linear Collider Detector group has developed a simulation program package based on the ROOT system. The package consists of Fast simulation, the reconstruction of the Full simulated data, and physics analysis utilities.

Iwasaki, Masako



New Tools in Nonlinear System Analysis.  

National Technical Information Service (NTIS)

This project was aimed at developing novel theories for the analysis and design of systems exhibiting essentially nonlinear behavior, such as systems utilizing quantized decision making, periodic orbits, switching, etc. The original primary directions of ...

A. Megretski



JAVA based LCD Reconstruction and Analysis Tools  

SciTech Connect

We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

Bower, G.



Collaborative Tools for CounterTerrorism Analysis  

Microsoft Academic Search

One of the major challenges in counter-terrorism analysis involves connecting the relatively few and sparse terrorism-related dots embedded within massive amounts of data flowing into the government's intelligence and counter-terrorism agencies. Information technologies have the potential to empower intelligence agencies or analysts with the ability to find pertinent data faster, conduct more efficient and effective analysis, share information with others

R. Popp; Krishna Pattipati; P. Willett; D. Serfaty; W. Stacy; K. Carley; J. Allanach; Haiying Tu; Satnam Singh



Deconvolution of electron diffraction patterns of amorphous materials formed with convergent beam.  


To perform reduced density function (G(r)) analysis on electron diffraction patterns of amorphous materials formed with convergent beams, the effects of convergence must be removed from the diffraction data. Assuming electrons incident upon the sample in different directions are incoherent, this can be done using deconvolution (Ultramicroscopy 76 (1999) 115). In this letter we show that the combination of an energy-filtering transmission electron microscope with an image plate increases the accuracy with which diffraction data can be measured and, subsequently, the accuracy of the deconvolution. PMID:12524200

McBride, W; Cockayne, D J H; Tsuda, K



A 3D image analysis tool for SPECT imaging  

NASA Astrophysics Data System (ADS)

We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.

Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.



A new tool for contamination analysis  

SciTech Connect

The Contamination Analysis Unit (CAU) is a sensing system that facilitates a new approach to industrial cleaning. Through use of portable mass spectrometry and various desorption techniques, the CAU provides in-process, near-real-time measurement of surface cleanliness levels. It can be of help in significantly reducing hazardous waste generation and toxic air emissions from manufacturing operations.

Meltzer, M.; Gregg, H.



Pervaporation: a useful tool for speciation analysis  

NASA Astrophysics Data System (ADS)

The application of pervaporation as both an auxiliary and a fundamental device for speciation analysis in liquid and solid samples is discussed. Examples of various determinations, including the coupling of the technique to both a gas chromatograph and flow-injection configurations, applied mostly to environmental and biological samples, are presented, giving clear evidence of the double role of the pervaporation process.

Luque de Castro, M. D.; Papaefstathiou, I.



Is citation analysis a legitimate evaluation tool?  

Microsoft Academic Search

A comprehensive discussion on the use of citation analysis to rate scientific performance and the controversy surrounding it. The general adverse criticism that citation counts include an excessive number of negative citations (citations to incorrect results worthy of attack), self-citations (citations to the works of the citing authors), and citations to methodological papers is analyzed. Included are a discussion of

E. Garfield



Radar Interferometry Time Series Analysis and Tools  

Microsoft Academic Search

We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including

S. M. Buckley



Efficient deconvolution of noisy periodic interference signals.  


The interference signal formed by combining two coherent light beams carries information on the path difference between the beams. When the path difference is a periodic function of time, as, for example, when one beam is reflected from a vibrating surface and the other from a fixed surface, the interference signal is periodic with the same period as the vibrating surface. Bessel functions provide an elegant and efficient means for deconvoluting such periodic interference signals, thus making it possible to obtain the displacement of the moving surface with nanometer resolution. Here we describe the mathematical basis for the signal deconvolution and employ this technique to obtain the amplitude of miniature capillary waves on water as a test case. PMID:16604773
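The Bessel-function structure the authors exploit comes from the Jacobi-Anger expansion: a sinusoidally modulated interference phase cos(z sin ωt) contains only even harmonics, with amplitudes given by Bessel functions J_n(z), so the modulation depth z (and hence the surface displacement amplitude) can be read off from measured harmonic ratios. The sketch below checks the expansion numerically; the series truncation and sampling density are illustrative choices, not taken from the paper.

```python
import math

def bessel_j(n, z, terms=30):
    """Power series for the Bessel function of the first kind, J_n(z)."""
    return sum((-1) ** k / (math.factorial(k) * math.factorial(k + n))
               * (z / 2) ** (2 * k + n) for k in range(terms))

def cosine_coefficient(z, n, samples=4096):
    """n-th Fourier cosine coefficient of cos(z * sin(theta))."""
    total = sum(math.cos(z * math.sin(2 * math.pi * i / samples))
                * math.cos(n * 2 * math.pi * i / samples)
                for i in range(samples))
    mean = total / samples
    return mean if n == 0 else 2 * mean
```

By Jacobi-Anger, `cosine_coefficient(z, 0)` equals J0(z), even-n coefficients equal 2·J_n(z), and odd coefficients vanish; inverting a measured harmonic ratio for z is then a one-dimensional root-finding problem.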

Behroozi, Feredoon; Behroozi, Peter S



Drug target deconvolution by chemical proteomics.  


Drug target deconvolution is a process where the action of a drug, a small molecule, is characterised by identifying the proteins binding the drug and initiating the biological effect. The biologically relevant target has to be extracted, or deconvoluted, from a list of proteins identified in such an approach. Besides the medically desired action of the drug, the identification of other proteins binding the drug can help to identify side effects and toxicity at a very early stage of drug development. The current approach to identify the proteins binding to the drug is an affinity-enrichment based approach, where the drug molecule is immobilised to a matrix through a linker and the proteins binding to the drug are identified by proteomics. PMID:21763176

Raida, Manfred



Fourier deconvolution of photoacoustic FTIR spectra  

NASA Astrophysics Data System (ADS)

Fourier self-deconvolution is a fairly routine numerical method for increasing the apparent resolution of spectra in which the intrinsic bandwidths are much greater than the instrumental resolution. The present work demonstrates that a photoacoustic (PA) interferogram obtained with a Fourier transform spectrometer can be used directly in this calculation, without the usual intermediate computation of a spectrum. Phase errors in the interferogram must be eliminated as a first step in this procedure. The technique has been applied to PA IR interferograms acquired for an Alberta coal and for kaolinite, a common layer silicate. Several new bands were identified in the coal spectrum and assigned utilizing previously published results for coal. Results for kaolinite illustrate a behaviour characteristic of deconvolution of single bands; in addition, an OH-stretching band usually not detectable in IR spectra of kaolinite was observed and verified by comparison with Raman data.
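The essence of Fourier self-deconvolution can be shown on a synthetic example: a Lorentzian band corresponds to an exponentially damped interferogram, so multiplying the interferogram by a growing exponential (with a rate smaller than the damping rate) narrows the band. In the sketch below the damping rate, weighting rate, and signal length are illustrative assumptions, and the spectrum is obtained with a naive DFT rather than an FFT.

```python
import cmath, math

def dft_mag(x):
    """Magnitude spectrum via a naive DFT."""
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n in range(N))) for k in range(N)]

def fwhm_bins(mag):
    """Number of bins above half the peak magnitude (a crude width measure)."""
    half = max(mag) / 2
    return sum(1 for v in mag if v > half)

# A damped cosine interferogram: its spectrum is a pair of Lorentzian-like bands.
N, a, f = 256, 0.08, 0.25
signal = [math.exp(-a * t) * math.cos(2 * math.pi * f * t) for t in range(N)]

# Fourier self-deconvolution: partially undo the damping (b < a) before transforming.
b = 0.06
sharpened = [s * math.exp(b * t) for s, t in zip(signal, range(N))]

width_before = fwhm_bins(dft_mag(signal))
width_after = fwhm_bins(dft_mag(sharpened))
```

Here `width_after` comes out smaller than `width_before`; in practice an apodization window is also applied to control the noise that the exponential weighting amplifies.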

Friesen, W. I.; Michaelian, K. H.



Application of regularized Richardson-Lucy algorithm for deconvolution of confocal microscopy images  

PubMed Central

Although confocal microscopes have considerably smaller contribution of out-of-focus light than widefield microscopes, the confocal images can still be enhanced mathematically if the optical and data acquisition effects are accounted for. For that, several deconvolution algorithms have been proposed. As a practical solution, maximum-likelihood algorithms with regularization have been used. However, the choice of regularization parameters is often unknown although it has considerable effect on the result of deconvolution process. The aims of this work were: to find good estimates of deconvolution parameters; and to develop an open source software package that would allow testing different deconvolution algorithms and that would be easy to use in practice. Here, Richardson–Lucy algorithm has been implemented together with the total variation regularization in an open source software package IOCBio Microscope. The influence of total variation regularization on deconvolution process is determined by one parameter. We derived a formula to estimate this regularization parameter automatically from the images as the algorithm progresses. To assess the effectiveness of this algorithm, synthetic images were composed on the basis of confocal images of rat cardiomyocytes. From the analysis of deconvolved results, we have determined under which conditions our estimation of total variation regularization parameter gives good results. The estimated total variation regularization parameter can be monitored during deconvolution process and used as a stopping criterion. An inverse relation between the optimal regularization parameter and the peak signal-to-noise ratio of an image is shown. Finally, we demonstrate the use of the developed software by deconvolving images of rat cardiomyocytes with stained mitochondria and sarcolemma obtained by confocal and widefield microscopes.
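The multiplicative Richardson-Lucy update at the heart of the method is compact: x ← x · (hᵀ ⊛ (y / (h ⊛ x))). The sketch below implements the unregularized 1-D update only (no total-variation term), with a symmetric normalized kernel and an illustrative iteration count; it is not the IOCBio Microscope code.

```python
def conv_same(x, h):
    """Linear convolution truncated to len(x); h is centered at len(h)//2."""
    n, c = len(x), len(h) // 2
    out = []
    for i in range(n):
        s = 0.0
        for j, hj in enumerate(h):
            k = i + j - c
            if 0 <= k < n:
                s += x[k] * hj
        out.append(s)
    return out

def richardson_lucy(y, h, iters=300, eps=1e-12):
    """Unregularized Richardson-Lucy; h assumed symmetric, so h^T equals h."""
    x = [max(v, eps) for v in y]
    for _ in range(iters):
        est = conv_same(x, h)                       # forward model h * x
        ratio = [yi / max(ei, eps) for yi, ei in zip(y, est)]
        corr = conv_same(ratio, h)                  # correlate ratio with h
        x = [xi * ci for xi, ci in zip(x, corr)]    # multiplicative update
    return x
```

The multiplicative form keeps the estimate nonnegative automatically, which is why the paper only needs the one regularization parameter for the total-variation term.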

Laasmaa, M; Vendelin, M; Peterson, P



Orbit Analysis Tools Software user's manual, version 1  

Microsoft Academic Search

In the course of our work in mission planning and analysis we have developed a set of computer programs that address many of the questions commonly asked by designers when planning a new satellite system and by managers wishing to assess the performance of an existing system. The Orbit Analysis Tools Software (OATS) is an organization of this collection of

Alan S. Hope; Jay Middour



Competitive intelligence process and tools for intelligence analysis  

Microsoft Academic Search

Purpose – The purpose of this survey research is twofold. First, to study and report the process that is commonly used to create and maintain a competitive intelligence (CI) program in organizations. And second, to provide an analysis of several emergent text mining, web mining and visualization-based CI tools, which are specific to collection and analysis of intelligence. Design/methodology/approach –

Ranjit Bose



Implementation of inventory analysis tool for optimization and policy selection  

Microsoft Academic Search

This paper describes the development and application of a web-based, low-cost, user-friendly Inventory Analysis Tool for stock availability optimization and enhanced delivery performance. The inventory optimization attempts to find dynamically the best inventory policy and safety stock for Stock Keeping Units with independent demands. The analysis is based on supply and demand data, which includes

Siong Sheng Chin; Edmund Chan; Terence Yeo



Soaplab - a unified Sesame door to analysis tools  

Microsoft Academic Search

Soaplab is a set of Web Services providing programmatic access to many applications on remote computers. Because such applications in the scientific environment usually analyze data, Soaplab is often referred to as an Analysis Web Service. It uses a unified (and partly standardized) API to find an analysis tool, discover what data it requires and what data it produces, to

Martin Senger; Peter Rice; Tom Oinn




Geographic Analysis Tool for Health and Environmental Research (GATHER)

EPA Science Inventory

GATHER, Geographic Analysis Tool for Health and Environmental Research, is an online spatial data access system that provides members of the public health community and general public access to spatial data that is pertinent to the analysis and exploration of public health issues...


Regularized Blind Deconvolution with Poisson Data  

NASA Astrophysics Data System (ADS)

We propose easy-to-implement algorithms to perform blind deconvolution of nonnegative images in the presence of noise of Poisson type. Alternate minimization of a regularized Kullback-Leibler cost function is achieved via multiplicative update rules. The scheme allows us to prove convergence of the iterates to a stationary point of the cost function. Numerical examples are reported to demonstrate the feasibility of the proposed method.

Lecharlier, Loïc; De Mol, Christine



Deconvolution of appearance potential spectra II  

Microsoft Academic Search

We have recently introduced a new method for the inversion of autoconvolution integrals [1]. In this paper, we describe an improvement of the previous algorithm which results from the consequent use of cubical spline functions. The treatment given here is mathematically more consistent and offers better numerical stability than that given earlier. The method is applied to the deconvolution of

V. Dose; Th. Fauster



Introducing Tool Support for Retrospective Analysis of Release Planning Decisions  

Microsoft Academic Search

The release planning activity in market-driven requirements engineering is crucial but difficult. The quality of the decisions on product content and release timing determines the market success, but as predictions of market value and development cost are uncertain, the decisions are not always optimal. This paper presents a prototype tool for retrospective analysis of release planning decisions based on tool

Lena Karlsson; Björn Regnell



Development of a climate data analysis tool (CDAT)  

SciTech Connect

The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

Marlais, S.M.



Diamond-turning tool setting by interferogram analysis  

SciTech Connect

A method was developed to establish a numerically controlled tool path with respect to the work spindle centerline. Particularly adapted to the diamond turning of optics, this method is based upon interferogram analysis and is applicable to the establishment of the work spindle centerline relative to the tool path for any center-turned optic having a well-defined vertex radius of curvature. The application reported is for an f/2 concave spherical mirror.

Rasnick, W.H.; Yoder, R.C.



Serial concept maps: tools for concept analysis.  


Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments. PMID:17547345

All, Anita C; Huycke, LaRae I



Database tools for enhanced analysis of TMX-U data  

SciTech Connect

A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers.

Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.



Analysis Tool Web Services from the EMBL-EBI.  


Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo



A Semi-Automated Functional Test Data Analysis Tool  

SciTech Connect

The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

Xu, Peng; Haves, Philip; Kim, Moosung



Quantifying mineral abundances of complex mixtures by coupling spectral deconvolution of SWIR spectra (2.1-2.4 µm) and regression tree analysis  

USGS Publications Warehouse

This paper presents a methodology for assessing mineral abundances of mixtures having more than two constituents using absorption features in the 2.1-2.4 µm wavelength region. In the first step, the absorption behaviour of mineral mixtures is parameterised by exponential Gaussian optimisation. Next, mineral abundances are predicted by regression tree analysis using these parameters as inputs. The approach is demonstrated on a range of prepared samples with known abundances of kaolinite, dioctahedral mica, smectite, calcite and quartz and on a set of field samples from Morocco. The latter contained varying quantities of other minerals, some of which did not have diagnostic absorption features in the 2.1-2.4 µm region. Cross validation showed that the prepared samples of kaolinite, dioctahedral mica, smectite and calcite were predicted with a root mean square error (RMSE) less than 9 wt.%. For the field samples, the RMSE was less than 8 wt.% for calcite, dioctahedral mica and kaolinite abundances. Smectite could not be well predicted, which was attributed to spectral variation of the cations within the dioctahedral layered smectites. Substitution of part of the quartz by chlorite at the prediction phase hardly affected the accuracy of the predicted mineral content; this suggests that the method is robust in handling the omission of minerals during the training phase. The degree of expression of absorption components was different between the field sample and the laboratory mixtures. This demonstrates that the method should be calibrated and trained on local samples. Our method allows the simultaneous quantification of more than two minerals within a complex mixture and thereby enhances the perspectives of spectral analysis for mineral abundances.
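A minimal illustration of the first step, turning an absorption feature into a handful of parameters that a regression tree can consume, is sketched below using simple continuum removal (a simplification of the paper's exponential Gaussian optimisation; the window and feature are synthetic):

```python
import numpy as np

def absorption_parameters(wavelength, reflectance):
    """Parameterize a single absorption feature by continuum removal,
    returning (depth, center, width): the kind of inputs a regression
    tree would use to predict mineral abundance."""
    # straight-line continuum between the endpoints of the window
    cont = np.interp(wavelength, wavelength[[0, -1]], reflectance[[0, -1]])
    removed = reflectance / cont
    i = int(np.argmin(removed))
    depth = 1.0 - removed[i]
    half = 1.0 - depth / 2.0
    below = np.where(removed <= half)[0]          # full width at half depth
    width = wavelength[below[-1]] - wavelength[below[0]]
    return depth, wavelength[i], width
```

With such parameters extracted per sample, any regression-tree package can then be trained against known abundances.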

Mulder, V.L.; Plotze, Michael; de Bruin, Sytze; Schaepman, Michael E.; Mavris, C.; Kokaly, Raymond F.; Egli, Markus



Vulnerability assessment using two complementary analysis tools  

SciTech Connect

To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits a security force against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response-versus-adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

Paulus, W.K.



A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra  

NASA Astrophysics Data System (ADS)

A quantitative evaluation of various deconvolution methods and their applications in processing plasma-emitted spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, the Richardson-Lucy method, the maximum a posteriori method and Gold's method. The evaluation criteria include minimization of the sum of squared errors, the sum of squared relative errors of the parameters, and the rate of convergence. After comparing the deconvolved results of these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when considering all the criteria above. The applications of these methods to actual plasma spectra obtained from the EAST tokamak are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that the effects of instruments can be satisfactorily eliminated and clear spectra are recovered.
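Of the algorithms compared here, Gold's method is particularly compact; a minimal sketch for a discrete response matrix (our own illustrative implementation, not the one evaluated in the paper) is:

```python
import numpy as np

def gold_deconvolve(y, A, n_iter=1000, eps=1e-12):
    """Gold's multiplicative deconvolution: given a measured spectrum y
    and a discrete response matrix A, iterate
        x <- x * (A^T y) / (A^T A x).
    Starting from a positive x, every iterate stays nonnegative."""
    Aty = A.T @ y
    AtA = A.T @ A
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        x = x * Aty / (AtA @ x + eps)
    return x
```

The built-in nonnegativity is what makes Gold's method popular for counting spectra, where negative bin contents are unphysical.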

Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai



Comparison of gas chromatography–pulsed flame photometric detection–mass spectrometry, automated mass spectral deconvolution and identification system and gas chromatography–tandem mass spectrometry as tools for trace level detection and identification  

Microsoft Academic Search

The complexity of a matrix is in many cases the major limiting factor in the detection and identification of trace-level analytes. In this work, the ability to detect and identify trace levels of pesticides in complex matrices was studied and compared using three relatively new methods: (a) GC–PFPD–MS, where simultaneous PFPD (pulsed flame photometric detection) and MS analysis is

Shai Dagan



Application of the Origen Fallout Analysis Tool and the Delfic Fallout Planning Tool to National Technical Nuclear Forensics.  

National Technical Information Service (NTIS)

The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool fo...

D. E. Peplow J. P. Lefebvre R. W. Lee V. J. Jodoin



Physics analysis tools for beauty physics in ATLAS  

NASA Astrophysics Data System (ADS)

The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, Monte Carlo truth information association, used to validate the software against simulated data, an important part of developing the physics analysis tools.

Anastopoulos, C.; B-Thacker, E.; Catmore, J.; Dallison, S.; Derue, F.; Epp, B.; Jussel, P.; Kaczmarska, A.; Mora, L. d.; Radziewski, H. v.; Řezníček, P.; Stahl, T.



Interoperability of the analysis tools within the IMPEx project  

NASA Astrophysics Data System (ADS)

The growing amount of data in planetary sciences requires adequate tools for visualisation enabling in-depth analysis. Within the FP7 IMPEx infrastructure, data will originate from heterogeneous sources: large observational databases (CDAWeb, AMDA at CDPP, ...), simulation databases for hybrid and MHD codes (FMI, LATMOS), planetary magnetic field models database and online services (SINP). Together with the common "time series" visualisation functionality for both in-situ and modeled data (provided by the AMDA and CLWeb tools), IMPEx will also provide immersion capabilities into the complex 3D data originating from models (provided by 3DView). The functionalities of these tools will be described. The emphasis will be put on how these tools 1) can share information (for instance Time Tables or user-composed parameters) and 2) be operated synchronously via dynamic connections based on Virtual Observatory standards.

Génot, Vincent; Khodachenko, Maxim; Kallio, Esa; Al-Ubaidi, Tarek; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Renard, Benjamin; Bourel, Natacha; Modolo, Ronan; Hess, Sébastien; André, Nicolas; Penou, Emmanuel; Topf, Florian; Alexeev, Igor; Belenkaya, Elena; Kalegaev, Vladimir; Schmidt, Walter



Parallel Analysis Tools for Ultra-Large Climate Data Sets  

NASA Astrophysics Data System (ADS)

While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database (MOAB), performing vector operations on arbitrary grids (Intrepid) and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParCAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.

Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian



Diffusion of latent semantic analysis as a research tool: A social network analysis approach  

Microsoft Academic Search

Latent Semantic Analysis (LSA) is a relatively new research tool with a wide range of applications in different fields ranging from discourse analysis to cognitive science, from information retrieval to machine learning and so on. In this paper, we chart the development and diffusion of LSA as a research tool using Social Network Analysis (SNA) approach that reveals the social

Yasar Tonta; Hamid R. Darvish



Basis function multifield bispectral deconvolution analysis  

SciTech Connect

A different procedure for calculating linear and nonlinear coefficients of model systems for fully developed turbulence is derived. This procedure can be applied to systems with multiple interacting fields; in the single-field case the linear coefficients consist of mode frequencies and growth rates. This method differs from previous methods in the use of a limited set of functions, or basis set, from which the nonlinear terms in the turbulence equation are approximated in a series expansion. The algorithm is derived from this assumption using a least squares approach. This approach has been tested on simulations of fully developed two-dimensional turbulence and compared to previous methods. It is able to reconstruct coefficients to several significant figures of precision, offers excellent noise rejection, and can moreover operate on far smaller data sets than previous methods require.
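The least-squares coefficient fit at the core of such a procedure can be sketched on a scalar toy problem (hypothetical basis functions; the multifield turbulence case adds fields and couplings but not new machinery):

```python
import numpy as np

def fit_model_coefficients(phi, dt, basis_funcs):
    """Least-squares estimate of the coefficients c_k in
        dphi/dt = sum_k c_k * f_k(phi)
    from a sampled time series phi: the core regression step of a
    basis-function deconvolution procedure."""
    dphi = np.gradient(phi, dt)                        # numerical time derivative
    B = np.column_stack([f(phi) for f in basis_funcs])
    c, *_ = np.linalg.lstsq(B, dphi, rcond=None)
    return c
```

Given a trajectory generated by a known model, the fit should return the model's own coefficients, which is a convenient self-test for such codes.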

Baver, D.A.; Terry, P.W. [Department of Physics, University of Wisconsin, Madison, Wisconsin 53706 (United States)



PVT Analysis with a Deconvolution Algorithm.  

National Technical Information Service (NTIS)

Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information a...

R. T. Kouzes



International comparative analysis of building regulations: an analytical tool  

Microsoft Academic Search

Purpose – The purpose of this paper is to introduce a tool for the international comparative analysis of regulatory regimes in the field of building regulation. Design/methodology/approach – On the basis of a heuristic model drawn from regulatory literature, a typology of building regulatory regimes is introduced. Each type is illustrated with a number of real-life examples from North America,

Jeroen van der Heijden



Assessing Extremes Climatology Using NWS Local Climate Analysis Tool  

Microsoft Academic Search

The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast

M. M. Timofeyeva; A. Hollingshead; D. Hilderbrand; B. Mayes; T. Hartley; N. M. Kempf McGavock; E. Lau; E. A. Olenic; B. Motta; R. Bunge; L. E. Brown; F. Fritsch



An Online Image Analysis Tool for Science Education  

ERIC Educational Resources Information Center

This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.



An Automated Data Analysis Tool for Livestock Market Data  

ERIC Educational Resources Information Center

This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

Williams, Galen S.; Raper, Kellie Curry



Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint  

SciTech Connect

This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

Christensen, C.; Horowitz, S.



TRIAC: A code for track measurements using image analysis tools  

Microsoft Academic Search

A computer program named TRIAC, written in MATLAB, has been developed for track recognition and track parameter measurements from images of the Solid State Nuclear Track Detector CR39. The program, using image analysis tools, counts the number of tracks for dosimetry purposes and classifies the tracks according to their radii for the spectrometry of alpha-particles. Comparison of manual scanning counts

D. L. Patiris; K. Blekas; K. G. Ioannides



DSD-Crasher: a hybrid analysis tool for bug finding  

Microsoft Academic Search

DSD-Crasher is a bug finding tool that follows a three-step approach to program analysis: D. Capture the program's intended execution behavior with dynamic invariant detection. The derived invariants exclude many unwanted values from the program's input domain. S. Statically analyze the program within the restricted input domain to explore many paths. D. Automatically generate test cases that focus on veri-

Christoph Csallner; Yannis Smaragdakis



Lagrangian analysis. Modern tool of the dynamics of solids  

Microsoft Academic Search

Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting, are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for

J. Cagnoux; P. Chartagnac; P. Hereil; M. Perez; L. Seaman



CEOP centralized data system and integrated analysis tools  

Microsoft Academic Search

The amount of earth environmental data has increased explosively because of recent advances in observational techniques. In the CEOP (Coordinated Enhanced Observing Period) project, in order to improve our understanding of the water and energy cycle, a large amount of data is collected and archived. In this paper, we introduce the CEOP centralized data system and integrated analysis tool. The centralized system

Kenji Taniguchi; Toshihiro Nemoto; Eiji Ikoma; Masaki Yasukawa; Toshio Koike; Masaru Kitsuregawa


Kinetic Visualizations: A New Class of Tools for Intelligence Analysis  

Microsoft Academic Search

Intelligence analysis requires detecting and exploiting patterns hidden in complex data. When the critical aspects of a data set can be effectively visually presented, displays become powerful tools by harnessing the pattern-recognition capabilities of human vision. To this end, shape, color, and interactive techniques are widely utilized in intelligence displays. Unfortunately, the volume and complexity of

Robert J. Bobrow; Aaron Helsinger


GenMiner: a Data Mining Tool for Protein Analysis  

Microsoft Academic Search

We present an integrated tool for preprocessing and analysis of genetic data through data mining. Our goal is the prediction of the functional behavior of proteins, a critical problem in functional genomics. During the last years, many programming approaches have been developed for the identification of short amino-acid chains, which are included in families of related proteins. These chains are

Gerasimos Hatzidamianos; Sotiris Diplaris; Ioannis Athanasiadis; Pericles A. Mitkas



A suite of community tools for spectro-polarimetric analysis  

Microsoft Academic Search

The National Center for Atmospheric Research (NCAR) has undertaken a 3-year initiative to develop the Community Spectro-polarimetric Analysis Center (CSAC). The goal of this effort is to provide the community with standardized tools for extracting the solar magnetic field vector and related atmospheric parameters from spectro-polarimetric observations. The emphasis will be to develop portable, efficient, and well-documented procedures for analysis

B. Lites; R. Casini; J. Garcia; H. Socas-Navarro



DARE-COTS. A domain analysis support tool  

Microsoft Academic Search

DARE-COTS (Domain Analysis Research Environment for Commercial Off-The-Shelf software) is a CASE tool that supports domain analysis-the activity of identifying and documenting the commonalities and variabilities in related software systems. DARE-COTS supports the capture of domain information from experts, documents and code in a domain. Captured domain information is stored in a domain book that typically contains a generic architecture

William Frakes; Ruben Prieto-Diaz; Christopher Fox



Recursive deconvolution of combinatorial chemical libraries.  


A recursive strategy that solves for the active members of a chemical library is presented. A pentapeptide library with an alphabet of Gly, Leu, Phe, and Tyr (1024 members) was constructed on a solid support by the method of split synthesis. One member of this library (NH2-Tyr-Gly-Gly-Phe-Leu) is a native binder to a beta-endorphin antibody. A variation of the split synthesis approach is used to build the combinatorial library. In four vials, a member of the library's alphabet is coupled to a solid support. After each coupling, a portion of the resin from each of the four reaction vials was set aside and catalogued. The solid support from each vial is then combined, mixed, and redivided. The steps of (i) coupling, (ii) saving and cataloging, and (iii) randomizing were repeated until a pentapeptide library was obtained. The four pentapeptide libraries where the N-terminal amino acid is defined were screened against the beta-endorphin antibody and quantitated via an ELISA. The amino acid of the four pools that demonstrated the most binding was then coupled to the four tetrapeptide partial libraries that had been set aside and catalogued during the split synthesis. This recursive deconvolution was repeated until the best binders were deduced. Besides the anticipated native binder, two other members of the library displayed significant binding. This recursive method of deconvolution does not use a molecular tag, requires only one split synthesis, and can be applied to the deconvolution of nonlinear small-molecule combinatorial libraries and linear oligomeric combinatorial libraries, since it is based only on the procedure of the synthesis. PMID:7972077
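The recursive pooling logic can be mimicked in a toy simulation; here a hypothetical scoring function stands in for the antibody ELISA readout, and one residue is fixed per round:

```python
import itertools

ALPHABET = ["Gly", "Leu", "Phe", "Tyr"]
TARGET = ("Tyr", "Gly", "Gly", "Phe", "Leu")   # the native beta-endorphin binder

def binding(peptide):
    # toy assay: one unit of signal per position matching the target
    return sum(a == b for a, b in zip(peptide, TARGET))

def pool_signal(prefix, length=5):
    """Mean binding over the pool of peptides sharing a fixed prefix,
    standing in for the ELISA readout of one partial library."""
    tails = itertools.product(ALPHABET, repeat=length - len(prefix))
    scores = [binding(prefix + tail) for tail in tails]
    return sum(scores) / len(scores)

def recursive_deconvolution(length=5):
    # fix one residue per round: keep the letter whose pool binds best
    prefix = ()
    for _ in range(length):
        best = max(ALPHABET, key=lambda aa: pool_signal(prefix + (aa,), length))
        prefix += (best,)
    return prefix
```

Note that only 4 pools are assayed per round (20 assays total for 1024 candidates), which is the practical appeal of the recursive strategy.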

Erb, E; Janda, K D; Brenner, S



Atomic spectral line free parameter deconvolution procedure.  


We report an advanced numerical procedure for deconvolution of the theoretical asymmetric convolution integral of a Gaussian and a plasma-broadened spectral line profile j_{A,R}(lambda). Our method determines all broadening parameters self-consistently and directly from the line profile, with minimal assumptions or prior knowledge. This method is useful for obtaining complete information on all plasma parameters directly from the recorded shape of a single line, which is very important when no other diagnostic methods are available. The method is also convenient for determining plasma parameters in the case of a symmetric profile such as a Voigt profile. PMID:11308772
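A common, simpler alternative for such convolved profiles is Fourier-space deconvolution; the sketch below (illustrative only, not the authors' self-consistent procedure) recovers the Lorentzian half-width of a Voigt-like profile when the Gaussian width is known:

```python
import numpy as np

def lorentz_width_from_voigt(x, profile, sigma_g):
    """Recover the Lorentzian HWHM gamma from a measured Voigt-like
    profile when the Gaussian sigma is known, by dividing out the
    Gaussian factor in Fourier space:
        FT(Voigt) = FT(Gauss) * FT(Lorentz),  FT(Lorentz)(t) = exp(-gamma*|t|).
    """
    dx = x[1] - x[0]
    t = 2.0 * np.pi * np.fft.rfftfreq(len(x), d=dx)   # angular frequencies
    ft = np.abs(np.fft.rfft(profile))
    ft = ft / ft[0]                                   # normalize FT to 1 at t=0
    ft_l = ft / np.exp(-0.5 * (sigma_g * t) ** 2)     # divide out the Gaussian
    k = slice(1, 20)                                  # low-frequency, low-noise fit
    gamma = -np.polyfit(t[k], np.log(ft_l[k]), 1)[0]  # slope of log FT(Lorentz)
    return gamma
```

The fit is restricted to low frequencies because dividing by the Gaussian factor amplifies noise exponentially at high frequency, the classic instability of Fourier deconvolution.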

Milosavljević, V; Poparić, G



Analysis tools for non-radially pulsating objects  

NASA Astrophysics Data System (ADS)

At the University of Canterbury we have been developing a set of tools for the analysis of spectra of various types of non-radially pulsating objects. This set currently includes: calculation of the moments, calculation of the phase across the profile, and basic binary profile fitting for the determination of orbital characteristics and projected rotational velocity (v sin i) measurement. Recently the ability to calculate cross-correlation profiles using either specified or synthesized line lists has been added, all implemented in MATLAB. A number of observations of γ Doradus candidates are currently being used to test these tools. For information on our observing facilities see Pollard et al. (2007).
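The cross-correlation step can be sketched with a minimal FFT routine working in pixel units (wavelength calibration and line-list synthesis are omitted):

```python
import numpy as np

def cross_correlate(spectrum, template):
    """Normalized circular cross-correlation of an observed spectrum
    against a template; the lag at the peak gives the shift in pixels."""
    s = spectrum - spectrum.mean()
    t = template - template.mean()
    cc = np.real(np.fft.ifft(np.fft.fft(s) * np.conj(np.fft.fft(t))))
    cc = cc / (np.linalg.norm(s) * np.linalg.norm(t))
    lags = np.fft.fftfreq(len(s), d=1.0 / len(s))   # 0, 1, ..., -1 ordering
    return lags, cc
```

In practice the pixel lag at the correlation peak is converted to a radial velocity via the dispersion of the spectrograph.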

Wright, D. J.; Pollard, K. R.; Cottrell, P. L.



Deconvolution of MODIS imagery using multiscale maximum entropy  

Microsoft Academic Search

A multiscale maximum entropy method (MEM) for image deconvolution is implemented and applied to MODIS (moderate resolution imaging spectroradiometer) data to remove instrument point-spread function (PSF) effects. The implementation utilizes three efficient computational methods: a fast Fourier transform convolution, a wavelet image decomposition and an algorithm for gradient method step-size estimation that together enable rapid image deconvolution. Multiscale entropy uses
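The first of those computational methods, FFT convolution, can be sketched generically (circular boundary conditions; not the MODIS-specific code):

```python
import numpy as np

def fft_convolve2d(image, psf):
    """Circular 2-D convolution via the FFT: O(N log N) rather than the
    O(N^2) cost of direct summation, which is what makes iterative
    deconvolution of large images practical."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) *
                                np.fft.fft2(psf, s=image.shape)))
```

By the convolution theorem this reproduces direct circular convolution exactly, up to floating-point round-off.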

C. J. Jackett; P. J. Turner; J. L. Lovell; R. N. Williams



Generalized contrasts for multichannel blind deconvolution of linear systems  

Microsoft Academic Search

Two contrasts for the problem of multichannel blind deconvolution have been given and theoretically studied by Comon [1996]. The maximization of these criteria allows us to solve the problem of multi-input\\/multi-output (MIMO) blind deconvolution. In this paper, we show that many other contrast functions may be considered. The two aforementioned criteria are proved to be included in the wide class

Eric Moreau; Jean-Christophe Pesquet



Blind deconvolution of multivariate signals: A deflation approach  

Microsoft Academic Search

It is well established that the blind deconvolution problem makes sense as soon as y (or, equivalently, w) is non-Gaussian. An adaptive blind deconvolution problem is discussed in the case where the observation y is a p-variate signal given by a particular function where the impulse responses (h_k), k = 1, ..., p, are unknown p × 1 transfer functions, and

Ph. Loubaton; P. Regalia



On the Optimal Rates of Convergence for Nonparametric Deconvolution Problems  

Microsoft Academic Search

Deconvolution problems arise in a variety of situations in statistics. An interesting problem is to estimate the density $f$ of a random variable $X$ based on $n$ i.i.d. observations from $Y = X + \varepsilon$, where $\varepsilon$ is a measurement error with a known distribution. In this paper, the effect of errors in variables of nonparametric deconvolution is examined. Insights
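The estimator studied in this setting can be sketched in a few lines (an illustrative naive version, not the paper's: known Gaussian measurement error, Fourier inversion of the empirical characteristic function of Y, damped by a Gaussian kernel of bandwidth h > sigma_err for stability):

```python
import numpy as np

def deconv_density(y, x_grid, sigma_err, h):
    """Deconvolution density estimate of f_X from samples y of
    Y = X + eps, with eps ~ N(0, sigma_err**2) known. The empirical
    characteristic function of Y is divided by the error CF and
    damped by a Gaussian kernel CF (requires h > sigma_err)."""
    t = np.linspace(-20.0 / h, 20.0 / h, 1001)
    ecf = np.exp(1j * np.outer(t, y)).mean(axis=1)          # CF of Y
    phi = ecf * np.exp(0.5 * (sigma_err**2 - h**2) * t**2)  # divide & damp
    dt = t[1] - t[0]
    return np.array([(np.exp(-1j * t * x) * phi).sum().real * dt
                     for x in x_grid]) / (2.0 * np.pi)
```

The exponential growth of the inverted error characteristic function is what drives the slow convergence rates analyzed in the paper; the damping kernel keeps the inversion finite at the cost of extra smoothing.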

Jianqing Fan



Field Quality Analysis as a Tool to Monitor Magnet Production  

SciTech Connect

Field harmonics offer a powerful tool to examine the mechanical structure of accelerator magnets. A large deviation from the nominal values suggests a mechanical defect. Magnets with such defects are likely to have a poor quench performance. Similarly, a trend suggests wear in tooling or a gradual change in the magnet assembly or in the size of a component. This paper presents the use of the field quality as a tool to monitor the magnet production of the Relativistic Heavy Ion Collider (RHIC). Several examples are briefly described. Field quality analysis can also rule out a suspected geometric error if it cannot be supported by the symmetry and the magnitude of the measured harmonics.

Gupta, R.; Anerella, M.; Cozzolino, J.; Fisher, D.; Ghosh, A.; Jain, A.; Sampson, W.; Schmalzle, J.; Thompson, P.; Wanderer, P.; Willen, E.



Power tools for gene expression and clonal analysis in Drosophila  

PubMed Central

The development of two-component expression systems in Drosophila melanogaster, one of the most powerful genetic models, has allowed the precise manipulation of gene function in specific cell populations. These expression systems, in combination with site-specific recombination approaches, have also led to the development of new methods for clonal lineage analysis. We present a hands-on user guide to the techniques and approaches that have greatly increased resolution of genetic analysis in the fly, with a special focus on their application for lineage analysis. Our intention is to provide guidance and suggestions regarding which genetic tools are most suitable for addressing different developmental questions.

Rodriguez, Alberto del Valle; Didiano, Dominic; Desplan, Claude



Analysis of variance–principal component analysis: A soft tool for proteomic discovery  

Microsoft Academic Search

A soft tool for detection of biomarkers in high dimensional data sets has been developed. The tool combines analysis of variance (ANOVA) and principal component analysis (PCA). Covariations are separated using ANOVA into main effects and interaction. The covariances for each effect are combined with the pure error and subjected to PCA. If the main effect is significant compared to

Peter de B. Harrington; Nancy E. Vieira; Jimmy Espinoza; Jyh Kae Nien; Roberto Romero; Alfred L. Yergey



Noise-aware image deconvolution with multidirectional filters.  


In this paper we propose an approach for handling noise in deconvolution algorithms based on multidirectional filters. Most image deconvolution techniques are sensitive to noise: even a small amount will degrade the quality of the image estimate dramatically. We found that by applying a directional low-pass filter to the blurred image, we can reduce the noise level while preserving the blur information in the direction orthogonal to the filter. We therefore apply a series of directional filters at different orientations to the blurred image, and a guided-filter-based edge-preserving image deconvolution is used to estimate an accurate Radon transform of the clear image from each filtered image. Finally, we reconstruct the original image using the inverse Radon transform. We compare our deconvolution algorithm with many competitive deconvolution techniques in terms of the improvement in signal-to-noise ratio and visual quality. PMID:24085180

Yang, Hang; Zhu, Ming; Huang, Heyan; Zhang, Zhongbo



Perceived Image Quality Improvements from the Application of Image Deconvolution to Retinal Images from an Adaptive Optics Fundus Imager  

NASA Astrophysics Data System (ADS)

Aim: The objective of this project was to apply an image restoration methodology based on wavefront measurements obtained with a Shack-Hartmann sensor and to evaluate the restored image quality against medical criteria. Methods: Implementing an adaptive optics (AO) technique, a fundus imager was used to achieve low-order correction to images of the retina. The high-order correction was provided by deconvolution. A Shack-Hartmann wavefront sensor measures aberrations, and the wavefront measurement is the basis for activating a deformable mirror. Image restoration to remove remaining aberrations is achieved by direct deconvolution using the point spread function (PSF), or by blind deconvolution. The PSF is estimated using the measured wavefront aberrations. Direct application of classical deconvolution methods such as inverse filtering, Wiener filtering or iterative blind deconvolution (IBD) to the AO retinal images obtained from the adaptive optical imaging system is not satisfactory because of the very large image size, difficulty in modeling the system noise, and inaccuracy in PSF estimation. Our approach combines direct and blind deconvolution to exploit available system information and to avoid non-convergence and time-consuming iterative processes. Results: The deconvolution was applied to human subject data and the resulting restored images were compared by a trained ophthalmic researcher. Qualitative analysis showed significant improvements. Neovascularization can be visualized with the adaptive optics device that cannot be resolved with the standard fundus camera. The individual nerve fiber bundles are easily resolved, as are melanin structures in the choroid. Conclusion: This project demonstrated that computer-enhanced, adaptive optics images show greater detail of anatomical and pathological structures.

Soliz, P.; Nemeth, S. C.; Erry, G. R. G.; Otten, L. J.; Yang, S. Y.


Deconvolution techniques for passive radar imaging  

NASA Astrophysics Data System (ADS)

Forming images of aircraft using passive radar systems that exploit illuminators of opportunity, such as commercial television and FM radio systems, involves reconstructing an image from sparse samples of its Fourier transform. For a given flight path, a single receiver-transmitter pair produces one arc of data in Fourier space. Since the resulting Fourier sampling patterns bear a superficial resemblance to those found in radio astronomy, we consider using deconvolution techniques borrowed from radio astronomy, namely the CLEAN algorithm, to form images from passive radar data. Some deconvolution techniques, such as the CLEAN algorithm, work best on images which are well-modeled as a set of distinct point scatterers. Hence, such algorithms are well-suited to high-frequency imaging of man-made targets, as the current on the scatterer surface tends to collect at particular points. When using low frequencies of interest in passive radar, the images are more distributed. In addition, the complex-valued nature of radar imaging presents a complication not present in radio astronomy, where the underlying images are real valued. These effects conspire to present a great challenge to the CLEAN algorithm, indicating the need to explore more sophisticated techniques.
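A minimal Högbom-style CLEAN loop looks like the following (real-valued illustrative sketch only; as the abstract notes, the passive-radar case is complex-valued, which this toy does not handle):

```python
import numpy as np

def clean(dirty_img, dirty_beam, gain=0.1, n_iter=500, threshold=1e-3):
    """Hogbom CLEAN sketch: repeatedly find the brightest residual pixel,
    subtract a scaled, shifted copy of the dirty beam, and record the
    point-source component."""
    resid = dirty_img.astype(float).copy()
    comps = np.zeros_like(resid)
    beam_peak = dirty_beam.max()
    by, bx = np.unravel_index(np.argmax(dirty_beam), dirty_beam.shape)
    for _ in range(n_iter):
        iy, ix = np.unravel_index(np.argmax(np.abs(resid)), resid.shape)
        peak = resid[iy, ix]
        if abs(peak) < threshold:
            break
        comps[iy, ix] += gain * peak
        # shift the beam (centered at its own maximum) onto the peak
        shifted = np.roll(dirty_beam, (iy - by, ix - bx), axis=(0, 1))
        resid -= gain * peak * shifted / beam_peak
    return comps, resid
```

The loop gain and stopping threshold control how aggressively flux is transferred from the residual to the component list; the algorithm's implicit point-source model is exactly the assumption that breaks down for the distributed, low-frequency targets discussed above.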

Lanterman, Aaron D.; Munson, David C.



Asymmetric iterative blind deconvolution of multiframe images  

NASA Astrophysics Data System (ADS)

Imaging through a stochastically varying distorting medium, such as a turbulent atmosphere, requires multiple short-exposure frames to ensure maximum resolution of object features. Restoration methods are used to extract the common underlying object from the speckle images, and blind deconvolution techniques are required as typically there is little prior information available about either the image or the individual PSFs. A method is presented for multiframe restoration based on iterative blind deconvolution, which alternates between restoring the image and the PSF estimates. A maximum-likelihood approach is employed via the Richardson-Lucy (RL) method, which automatically ensures positivity and conservation of the total number of photons. The restoration is accelerated by applying a vector extrapolation technique; the frame sequence is treated as a 3D volume of data and processed to produce a 3D stack of PSFs and a single 2D image of the object. The problem of convergence to an undesirable solution, such as a delta function, is addressed by weighting the number of image or PSF iterations according to how quickly each is converging; this leads to the asymmetrical nature of the algorithm. Noise artifacts are suppressed by using a damped RL algorithm to prevent overfitting of the corrupted data. Results are presented for real single-frame and simulated multiframe speckle imaging.
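The core Richardson-Lucy update referred to above can be sketched as follows (non-blind, undamped form; the method described alternates this update between image and PSF estimates and adds damping, which is omitted here):

```python
import numpy as np

def richardson_lucy(data, psf, n_iter=50):
    """Richardson-Lucy iteration via circular FFT convolution. With a
    PSF that sums to one, each update preserves positivity and the
    total photon count, as noted in the abstract."""
    fft = np.fft.rfft2
    ifft = lambda a: np.fft.irfft2(a, data.shape)
    H = fft(psf)
    x = np.full(data.shape, data.mean())  # flat positive start
    for _ in range(n_iter):
        blurred = ifft(fft(x) * H)
        ratio = data / np.maximum(blurred, 1e-12)
        x = x * ifft(fft(ratio) * np.conj(H))  # adjoint blur of the ratio
    return x
```

Because every factor in the update is non-negative, positivity is automatic, which is the property that makes RL attractive for photon-limited speckle data.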

Biggs, David S.; Andrews, Mark



Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)  

NASA Astrophysics Data System (ADS)

Forecasting technology capabilities requires a tool and a process for capturing state-of-the-art technology metrics and estimates for future metrics. A decision support tool, known as the Advanced Technology Lifecycle Analysis System (ATLAS), contains a Technology Tool Box (TTB) database designed to accomplish this goal. Sections of this database correspond to a Work Breakdown Structure (WBS) developed by NASA's Exploration Systems Research and Technology (ESRT) Program. These sections cover the waterfront of technologies required for human and robotic space exploration. Records in each section include technology performance, operations, and programmatic metrics. Timeframes in the database provide metric values for the state of the art (Timeframe 0) and forecasts for timeframes that correspond to spiral development milestones in NASA's Exploration Systems Mission Directorate (ESMD) development strategy. Collecting and vetting data for the TTB will involve technologists from across the agency, the aerospace industry and academia. Technologists will have opportunities to submit technology metrics and forecasts to the TTB development team. Semi-annual forums will facilitate discussions about the basis of forecast estimates. As the tool and process mature, the TTB will serve as a powerful communication and decision support tool for the ESRT program.

Doyle, Monica M.; O'Neil, Daniel A.; Christensen, Carissa B.



Multi-Spacecraft Analysis with Generic Visualization Tools  

NASA Astrophysics Data System (ADS)

To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.



Vesta and HED Meteorites: Determining Minerals and their Abundances with Mid-IR Spectral Deconvolution I  

NASA Astrophysics Data System (ADS)

We identify the known mineral compositions and abundances of laboratory samples of Howardite, Eucrite and Diogenite (HED) meteorites (Salisbury et al. 1991, Icarus 9, 280-297) using an established spectral deconvolution algorithm (Ramsey, 1996 Ph.D. Dissertation, ASU; Ramsey and Christiansen 1998, JGR 103, 577-596) for mid-infrared spectral libraries of mineral separates of varying grain sizes. Most notably, the spectral deconvolution algorithm fit the known plagioclase and pyroxene compositions for all of the HED meteorite spectra determined by laboratory analysis. Our results for the HED samples give us a high degree of confidence that our results are valid and that the spectral deconvolution algorithm is viable. Mineral compositions and abundances are also determined using the same technique for one possible HED parent body, Vesta, using mid-infrared spectra that were obtained from ground-based telescopes (Sprague et al. 1993, A.S.P. 41; Lim et al. 2005, Icarus 173, 385-408) and the Infrared Space Observatory (ISO) (Dotto et al. 2000, A&A 358, 1133-1141). Mid-infrared spectra of Vesta come from different areas on its surface. The ISO Vesta spectral deconvolution is suggestive of troilite, olivine, augite, chromite, wollastonite, and sodalite at one location. Modeling of other locations is underway. We were also successful in modeling spectra from locations on the Moon where no Apollo samples are available and for several locations on Mercury's surface using the same techniques (see lunar and mercurian abstracts this meeting). These results demonstrate promise for the spectral deconvolution method to correctly make mineral identifications on remotely observed objects, in particular main-belt asteroids, the Moon, and Mercury. This work was funded by NSF AST0406796.
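Spectral deconvolution of this kind models a measured mid-IR spectrum as a non-negatively weighted linear sum of library endmember spectra. A toy version of that least-squares fit (projected-gradient iteration standing in for the cited algorithm's solver; all data here are synthetic):

```python
import numpy as np

def unmix(spectrum, endmembers, n_iter=2000):
    """Linear spectral unmixing sketch: solve min ||E a - s||^2 with
    a >= 0 by projected gradient descent. Columns of `endmembers`
    are library spectra; returns fractional abundances."""
    E = endmembers
    a = np.zeros(E.shape[1])
    step = 1.0 / np.linalg.norm(E.T @ E, 2)  # 1 / largest eigenvalue
    for _ in range(n_iter):
        a = np.maximum(a - step * (E.T @ (E @ a - spectrum)), 0.0)
    return a / a.sum()  # normalize to fractional abundances
```

The non-negativity constraint is what lets the fitted weights be read as modal mineral abundances rather than arbitrary regression coefficients.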

Hanna, Kerri D.; Sprague, A. L.



Development of Multi-Utility Spacecraft Charging Analysis Tool (MUSCAT)  

Microsoft Academic Search

A new numerical software package to analyze spacecraft charging, named "multi-utility spacecraft charging analysis tool" (MUSCAT), has been developed. MUSCAT consists of an integrated graphical user interface tool called "Vineyard" and the solver. Vineyard enables satellite engineers to compute spacecraft charging with little knowledge of the numerical calculations. Functions include 3-D satellite modeling, parameter input such as material and orbit

Takanobu Muranaka; Satoshi Hosoda; Jeong-Ho Kim; Shinji Hatta; Koichiro Ikeda; Takamitsu Hamanaga; Mengu Cho; Hideyuki Usui; Hiroko O. Ueda; Kiyokazu Koga; Tateo Goka



Experimental monitoring and data analysis tools for protein folding  

Microsoft Academic Search

Protein folding is a complex process that can take place through different pathways depending on the inducing agent and on the monitored time scale. This diversity of possibilities requires a good design of experiments and powerful data analysis tools that allow operating with multitechnique measurements and/or with diverse experiments related to different aspects of the process of interest. Multivariate curve resolution–alternating

Patrick Cutler; Paul J. Gemperline; Anna de Juan



Galileo: A Tool for Dynamic Fault Tree Analysis  

Microsoft Academic Search

Galileo is a prototype software tool for dependability analysis of fault tolerant computer-based systems. Reliability models are specified using dynamic fault trees, which provide special constructs for modeling sequential failure modes in addition to standard combinatorial fault tree gates. Independent modules are determined automatically, and separate modules are solved combinatorially (using Binary Decision Diagrams) or using Markov Methods.

Joanne Bechta Dugan



Scenario Analysis in an Automated Tool for Requirements Engineering  

Microsoft Academic Search

This paper presents an automated tool for scenario-driven requirements engineering where scenario analysis plays the central role. It is shown that a scenario can be described by three views of data flow, entity relationship and state transition models by slight extensions of classic data flow, entity relationship and state transition diagrams. The notions of consistency and completeness of a

Hong Zhu; Lingzi Jin



DSD-Crasher: A hybrid analysis tool for bug finding  

Microsoft Academic Search

DSD-Crasher is a bug-finding tool that follows a three-step approach to program analysis: D. Capture the program's intended execution behavior with dynamic invariant detection. The derived invariants exclude many unwanted values from the program's input domain. S. Statically analyze the program within the restricted input domain to explore many paths. D. Automatically generate test cases that focus on verifying the results of the static analysis. Thereby confirmed results are never false positives, as opposed to the high false positive rate inherent in conservative static

Christoph Csallner; Yannis Smaragdakis; Tao Xie



Tool for bonded optical element thermal stability analysis  

NASA Astrophysics Data System (ADS)

An analytical tool is presented which supports the opto-mechanical design of bonded optical elements. Given the mounting requirements from the optical engineer, the alignment stability and optical stresses in bonded optics can be optimized for the adhesive and housing material properties. While a perfectly athermalized mount is desirable, it is not realistic. The tool permits evaluation of element stability and stress over the expected thermal range at nominal, or worst case, achievable assembly and manufacturing tolerances. Selection of the most appropriate mount configuration and materials, which maintain the optical engineer's design, is then possible. The tool is based on a stress-strain analysis using Hooke's Law in the worst case plane through the optic centerline. The optimal bond line is determined for the selected adhesive, housing and given optic materials using the basic athermalization equation. Since a mounting solution is expected to be driven close to an athermalized design, the stress variations are considered linearly related to strain. A review of the equation set, the tool input and output capabilities and formats and an example will be discussed.

Klotz, Gregory L.



Graphical Tools for Network Meta-Analysis in STATA  

PubMed Central

Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia




Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.  

EPA Science Inventory

The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...


Networking Sensor Observations, Forecast Models & Data Analysis Tools  

NASA Astrophysics Data System (ADS)

This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of the wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations, and from a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics over polygon areas is applied to the extracted areas of high particulate matter to estimate the population expected to be impacted by smoke from the wildfires.
We described the process for enabling the fire location, smoke forecast, smoke observation, and population statistics services to be registered with the GEOSS registry and made findable through the GEOSS Clearinghouse. The fusion of data sources and different web service interfaces illustrate the agility in using standard interfaces and help define the type of input and output interfaces needed to connect models and analysis tools within sensor webs.

Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.



Transient multiple impulsive signal analysis for pneumatic percussive rivet tools  

NASA Astrophysics Data System (ADS)

This paper presents a systematic study of the analysis of the multiple impulsive transient signals generated by pneumatic percussive rivet tools, i.e. rivet hammers and bucking bars. Detailed discussions are provided on how to conduct vibration measurements of the tools and on methods of analysing the multiple impulsive signals. Important issues such as the triggering method, averaging, windowing, recording, and conversion from a fast Fourier transform narrowband spectrum to a one-third octave-band spectrum are included in the paper. In addition, the implementation of the ISO 5349 standard for this type of measurement is addressed. It is believed that the methods presented in this paper can be applied to similar transient signals generated by other types of mechanical systems.
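The narrowband-to-one-third-octave conversion mentioned above amounts to summing FFT bin powers between band edges placed at 2^(±1/6) of each centre frequency; a sketch (illustrative only, synthetic flat spectrum; standard bands also round the centre frequencies to preferred values, which is skipped here):

```python
import numpy as np

def third_octave_bands(freqs, power, f_center):
    """Sum narrowband FFT power into one-third-octave bands whose
    lower/upper edges lie at f_center / 2**(1/6) and f_center * 2**(1/6)."""
    out = np.zeros(len(f_center))
    for i, fc in enumerate(f_center):
        lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)
        out[i] = power[(freqs >= lo) & (freqs < hi)].sum()
    return out
```

Summing power (not amplitude) per band preserves the total energy of the narrowband spectrum, which matters when the result feeds an ISO 5349 weighting.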

Cherng, John G.; Peng, Sheng-Lih



SAGE: A tool for time-series analysis of Greenland  

NASA Astrophysics Data System (ADS)

The National Snow and Ice Data Center (NSIDC) has developed an operational tool for time-series analysis. This production tool is known as "Services for the Analysis of the Greenland Environment" (SAGE). Using an integrated workspace approach, a researcher has the ability to find relevant data and perform various analysis functions on the data, as well as retrieve the data and analysis results. While there continues to be compelling observational evidence for increased surface melting and rapid thinning along the margins of the Greenland ice sheet, there are still uncertainties with respect to estimates of the mass balance of Greenland's ice sheet as a whole. To better understand the dynamics of these issues, it is important for scientists to have access to a variety of datasets from multiple sources, and to be able to integrate and analyze the data. SAGE provides data from various sources, such as AMSR-E and AVHRR datasets, which can be analyzed individually through various time-series plots and aggregation functions, or together with scatterplots or overlaid time-series plots, to provide quick and useful results to support various research products. The application is available online. SAGE was built on top of NSIDC's existing Searchlight engine. The SAGE interface gives users access to much of NSIDC's relevant Greenland raster data holdings, as well as data from outside sources. Additionally, various web services provide access for other clients to utilize the functionality that the SAGE interface provides. Combined, these methods of accessing the tool allow scientists to devote more of their time to their research, and less to trying to find and retrieve the data they need.

Duerr, R. E.; Gallaher, D. W.; Khalsa, S. S.; Lewis, S.



POPBAM: Tools for Evolutionary Analysis of Short Read Sequence Alignments  

PubMed Central

Background: While many bioinformatics tools currently exist for assembling and discovering variants from next-generation sequence data, there are very few tools available for performing evolutionary analyses from these data. Evolutionary and population genomics studies hold great promise for providing valuable insights into natural selection, the effect of mutations on phenotypes, and the origin of species. Thus, there is a need for an extensible and flexible computational tool that can fit into a growing number of evolutionary bioinformatics pipelines. Results: This paper describes the POPBAM software, which is a comprehensive set of computational tools for evolutionary analysis of whole-genome alignments consisting of multiple individuals, from multiple populations or species. POPBAM works directly from BAM-formatted assembly files, calls variant sites, and calculates a variety of commonly used evolutionary sequence statistics. POPBAM is designed primarily to perform analyses in sliding windows across chromosomes or scaffolds. POPBAM accurately measures nucleotide diversity, population divergence, linkage disequilibrium, and the frequency spectrum of mutations from two or more populations. POPBAM can also produce phylogenetic trees of all samples in a BAM file. Finally, I demonstrate that the implementation of POPBAM is both fast and memory-efficient, and can feasibly scale to the analysis of large BAM files with many individuals and populations. Software: The POPBAM program is written in C/C++ and is freely available. The program has few dependencies and can be built on a variety of Linux platforms. The program is open-source and users are encouraged to participate in the development of this resource.

Garrigan, Daniel



A comparison of deconvolution and the Rutland-Patlak plot in parenchymal renal uptake rate  

PubMed Central

Introduction: Deconvolution and the Rutland-Patlak (R-P) plot are two of the most commonly used methods for analyzing dynamic radionuclide renography. Both methods allow estimation of absolute and relative renal uptake of the radiopharmaceutical and of its rate of transit through the kidney. Materials and Methods: Seventeen patients (32 kidneys) were referred for further evaluation by renal scanning. All patients were positioned supine with their backs to the scintillation gamma camera, so that the kidneys and the heart were both in the field of view. Approximately 5-7 mCi of 99mTc-DTPA (diethylenetriamine penta-acetic acid) in about 0.5 ml of saline was injected intravenously, and sequential 20 s frames were acquired; the study of each patient lasted approximately 20 min. The time-activity curves of the parenchymal region of interest of each kidney, as well as of the heart, were obtained for analysis. The data were then analyzed with deconvolution and the R-P plot. Results: A strong positive association (n = 32; r = 0.83; R2 = 0.68) was found between the values obtained by the two methods. Bland-Altman analysis showed that 97% of the values (31 of 32 cases) were within the limits of agreement (mean ± 1.96 standard deviations). Conclusion: We believe that the R-P method is likely to be more reproducible than the iterative deconvolution method, because the deconvolution technique relies heavily on the accuracy of the first point analyzed, with any errors carried forward into the calculations of all subsequent points, whereas the R-P technique is based on an initial analysis of the data by means of the R-P plot; it can be considered an alternative technique for calculating the renal uptake rate.
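The Rutland-Patlak side of this comparison reduces to a straight-line fit of kidney-to-plasma activity ratios; a sketch (illustrative Python with hypothetical variable names, trapezoidal integration of the plasma curve, synthetic data only):

```python
import numpy as np

def rutland_patlak_slope(t, kidney, plasma):
    """Fit kidney(t)/plasma(t) = K * cumint(plasma)(t)/plasma(t) + b by
    ordinary least squares; the slope K estimates the parenchymal
    uptake rate, as in the Rutland-Patlak plot."""
    # cumulative trapezoidal integral of the plasma curve
    cum = np.concatenate(([0.0],
        np.cumsum(0.5 * (plasma[1:] + plasma[:-1]) * np.diff(t))))
    x = cum / plasma
    y = kidney / plasma
    slope = np.polyfit(x, y, 1)[0]
    return slope
```

Because the fit uses many early points rather than a single initial value, an error in one frame perturbs the slope only mildly, which is the reproducibility argument made in the conclusion above.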

Al-Shakhrah, Issa A



Spatial-Resolution Enhancement of SMOS Data: A Deconvolution-Based Approach  

Microsoft Academic Search

A deconvolution-based model has been developed in an attempt to improve the spatial resolution of future soil moisture and ocean salinity (SMOS) data. This paper is devoted to the analysis and evaluation of different algorithms using brightness temperature images obtained from an upgraded version of the SMOS end-to-end performance simulator. Particular emphasis is placed on the use of least-square-derived Lagrangian
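The least-squares deconvolution family evaluated here can be illustrated by its simplest member, a regularized Fourier-domain inverse filter, where a Lagrange-multiplier-like weight trades resolution enhancement against noise amplification (illustrative sketch, not the paper's algorithm):

```python
import numpy as np

def tikhonov_deconvolve(data, psf, lam=1e-2):
    """Regularized least-squares deconvolution in the Fourier domain:
    X = conj(H) * Y / (|H|^2 + lam). Small `lam` sharpens more but
    amplifies noise at frequencies where |H| is small."""
    H = np.fft.rfft2(psf)
    Y = np.fft.rfft2(data)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.fft.irfft2(X, data.shape)
```

Setting lam to zero recovers the unstable inverse filter; the regularized form is the closed-form solution of the least-squares problem with a quadratic penalty, which is why Lagrangian formulations lead naturally to filters of this shape.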

María Piles; Adriano Camps; Mercè Vall-Llossera; Marco Talone



GLIDER: Free tool for imagery data visualization, analysis and mining  

NASA Astrophysics Data System (ADS)

Satellite imagery can be analyzed to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of such thematic information can vary from military applications, such as detecting assets of interest, to science applications, such as characterizing land-use/land-cover change at local, regional and global scales. However, extracting thematic information using satellite imagery is a non-trivial task. It requires a user to preprocess the data by applying operations for radiometric and geometric corrections. The user also needs to be able to visualize the data and apply different image enhancement operations to digitally improve the images to identify subtle information that might otherwise be missed. Finally, the user needs to apply different information extraction algorithms to the imagery to obtain the thematic information. At present, there are limited tools that provide users with the capability to easily extract and exploit the information contained within satellite imagery. This presentation introduces GLIDER, a free software tool addressing this void. GLIDER provides users with an easy-to-use tool to visualize, analyze and mine satellite imagery. GLIDER allows users to visualize and analyze satellite imagery in its native sensor view, an important capability because any transformation to either a geographic coordinate system or a projected coordinate system entails spatial and intensity interpolation and, hence, loss of information. GLIDER allows users to perform their analysis in the native sensor view without any loss of information. GLIDER provides users with a full suite of image processing algorithms that can be used to enhance the satellite imagery. It also provides pattern recognition and data mining algorithms for information extraction. GLIDER allows its users to project satellite data and the analysis/mining results onto a globe and overlay additional data layers.
Traditional analysis tools generally do not provide a good interface between visualization and analysis, especially a 3D view, and GLIDER fills this gap. This feature gives users extremely useful spatial context for their data and analysis/mining results. This presentation will demonstrate the latest version of GLIDER and also describe its supporting documentation, such as video tutorials and online resources.

Ramachandran, R.; Graves, S. J.; Berendes, T.; Maskey, M.; Chidambaram, C.; Hogan, P.; Gaskin, T.



Protocol analysis as a tool for behavior analysis  

PubMed Central

The study of thinking is made difficult by the fact that many of the relevant stimuli and responses are not apparent. Although the use of verbal reports has a long history in psychology, it is only recently that Ericsson and Simon's (1993) book on verbal reports explicated the conditions under which such reports may be reliable and valid. We review some studies in behavior analysis and cognitive psychology that have used talk-aloud reporting. We review particular methods for collecting reliable and valid verbal reports using the “talk-aloud” method as well as discuss alternatives to the talk-aloud procedure that are effective under different task conditions, such as the use of reports after completion of very rapid task performances. We specifically caution against the practice of asking subjects to reflect on the causes of their own behavior and the less frequently discussed problems associated with providing inappropriate social stimulation to participants during experimental sessions.

Austin, John; Delaney, Peter F.



Principles and tools for collaborative entity-based intelligence analysis.  


Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis. PMID:20075480

Bier, Eric A; Card, Stuart K; Bodnar, John W


Operations other than war: Requirements for analysis tools research report  

SciTech Connect

This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

Hartley, D.S. III



TRIAC: A code for track measurements using image analysis tools  

NASA Astrophysics Data System (ADS)

A computer program named TRIAC, written in MATLAB, has been developed for track recognition and track parameter measurements from images of the Solid State Nuclear Track Detectors CR39. The program uses image analysis tools to count the number of tracks for dosimetry purposes and to classify the tracks according to their radii for the spectrometry of alpha-particles. Comparisons of manual scanning counts with those output by the automatic system are presented for detectors exposed to a radon-rich environment. The system was also tested for its ability to differentiate tracks recorded by alpha-particles of different energies.

Patiris, D. L.; Blekas, K.; Ioannides, K. G.



Java Analysis Tools for Element Production Calculations in Computational Astrophysics  

NASA Astrophysics Data System (ADS)

We are developing a set of extendable, cross-platform tools and interfaces using Java and vector graphic technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as stand-alone applications or distributed across a network. These tools, which have broad applications in general scientific visualization, are currently being used to explore and analyze a large library of nuclear reaction rates and visualize results of explosive nucleosynthesis calculations with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to easily include new rates and compare and adjust current ones. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic (ten-fold) reduction in visualization file sizes while maintaining high visual quality and interactive control. ORNL Physics Division is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

Lingerfelt, E.; Hix, W.; Guidry, M.; Smith, M.



Analysis Tool for Predicting the Transient Hydrodynamics Resulting from the Rapid Filling of Voided Piping Systems.  

National Technical Information Service (NTIS)

An analysis tool is constructed for the purpose of predicting the transient hydrodynamics resulting from the rapid filling of voided piping systems. The basic requirements of such an analysis tool are established, and documentation is presented for severa...

R. L. Williamson



Deconvolution Estimation in Measurement Error Models: The R Package decon.  


Data from many scientific areas often come with measurement error. Density or distribution function estimation from contaminated data and nonparametric regression with errors-in-variables are two important topics in measurement error models. In this paper, we present a new software package decon for R, which contains a collection of functions that use the deconvolution kernel methods to deal with the measurement error problems. The functions allow the errors to be either homoscedastic or heteroscedastic. To make the deconvolution estimators computationally more efficient in R, we adapt the fast Fourier transform algorithm for density estimation with error-free data to the deconvolution kernel estimation. We discuss the practical selection of the smoothing parameter in deconvolution methods and illustrate the use of the package through both simulated and real examples. PMID:21614139
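
The deconvolution-kernel idea used by decon can be illustrated outside R as well. The sketch below (Python, not the decon package itself) implements the closed-form deconvoluting kernel for a Gaussian kernel with homoscedastic Laplace measurement error; the sample size, bandwidth, and error scale are illustrative assumptions, not values from the paper.

```python
import numpy as np

def decon_kde_laplace(w, b, h, grid):
    """Deconvolution kernel density estimate for data w = x + Laplace(b) noise.

    For a Gaussian kernel K and Laplace error (characteristic function
    1 / (1 + b^2 t^2)), the deconvoluting kernel has the closed form
    K_dec(u) = K(u) * (1 + (b/h)^2 * (1 - u^2)).
    """
    u = (grid[:, None] - w[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    K_dec = K * (1.0 + (b / h) ** 2 * (1.0 - u**2))
    return K_dec.mean(axis=1) / h

# Simulated contaminated sample: latent X ~ N(0, 1) observed with Laplace error.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(0.0, 1.0, n)
w = x + rng.laplace(0.0, 0.4, n)
grid = np.linspace(-5.0, 5.0, 401)
f_hat = decon_kde_laplace(w, b=0.4, h=0.35, grid=grid)
```

Because the correction term integrates to zero, the estimate still integrates to one; unlike a naive KDE of w, it undoes (part of) the extra spread introduced by the measurement error.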

Wang, Xiao-Feng; Wang, Bin



Alternate Norms for the Cabrelli and Wiggins Blind Deconvolution Algorithms.  

National Technical Information Service (NTIS)

Two blind deconvolution algorithms are extended to include alternate norms and used to estimate transient source signatures. The inputs to the blind algorithms are received signals which have undergone propagation through a medium and may be difficult to ...

L. A. Pflug M. K. Broadhead



A theorem on the difficulty of numerical deconvolution  

Microsoft Academic Search

The difficulty of numerical deconvolution is explained by examining the matrix of the linear equations generated by approximating the linear system superposition integral. It is shown that any row of the matrix is approximately a linear combination of other rows.
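
The theorem's premise is easy to reproduce numerically: discretizing the superposition integral for a smooth blurring kernel yields a matrix whose rows are nearly linear combinations of one another. The sketch below uses a hypothetical Gaussian kernel and a circulant discretization (neither taken from the paper) to show the resulting numerical rank deficiency and enormous condition number.

```python
import numpy as np

n = 64
t = np.arange(n)

# Smooth Gaussian blurring kernel, normalized to unit area.
psf = np.exp(-0.5 * ((t - n // 2) / 3.0) ** 2)
psf /= psf.sum()

# Circulant system matrix: row i is the kernel centred on sample i,
# i.e. the discretized superposition integral y = H x.
H = np.array([np.roll(psf, i - n // 2) for i in range(n)])

# Rows overlap heavily, so H is numerically rank-deficient and
# catastrophically ill-conditioned.
print(f"condition number: {np.linalg.cond(H):.2e}")
print(f"numerical rank:   {np.linalg.matrix_rank(H)} of {n}")
```

Inverting such a matrix amplifies any data noise by roughly the condition number, which is the practical content of the theorem.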

B. Hunt



Spatial Deconvolution Studies of Nearside Lunar Prospector Thorium Abundances  

NASA Astrophysics Data System (ADS)

We have carried out spatial deconvolution studies of Lunar Prospector thorium abundances. We show that these techniques can be useful in improving interpretations of low-spatial resolution datasets such as orbital gamma-ray data.

Lawrence, D. J.; Elphic, R. C.; Feldman, W. C.; Hagerty, J. J.; Prettyman, T. H.



Image analysis tools and emerging algorithms for expression proteomics  

PubMed Central

Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS.

English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.



Resolution of Conventional and Masked Apertures with Lucy-Richardson Deconvolution  

NASA Astrophysics Data System (ADS)

Non-redundant aperture masking has recently been used to achieve resolution of binary stars beyond the conventional diffraction limit, at separations as small as 0.5 lambda/D. FGS/TFI on JWST will include a non-redundant mask to exploit this "super-resolution" property. Although the usual data analysis method for non-redundant apertures is an interferometric approach, the super-resolution advantage of masked apertures can also be understood through deconvolution. By running iterative Lucy-Richardson deconvolutions on different apertures (including Gaussian, annulus, circular, long-baseline interferometer, and non-redundant masking apertures), we are able to demonstrate the resolution advantages of aperture masking with a straightforward approach.

Sitarski, Breann; Lloyd, J.



Rapid deconvolution of NMR powder spectra by weighted fast Fourier transformation.  


The resolution enhancement conferred by numerical deconvolution of powder pattern spectra to spectra characteristic of a single alignment greatly simplifies solid state NMR spectral analysis. This is especially beneficial when the spectrum is a superposition of signals from multiple environments or sites of labelling. We have developed an innovative method to deconvolute (depake) spectra governed by axially symmetric second rank tensor interactions which possess a P2(cos theta) dependence upon orientation, where theta is the angle between the symmetry axis and the external magnetic field. Our approach differs substantially from previously published procedures which are iterative or require matrix inversion and, hence, are slow. The new method, instead, utilizes weighting functions in time and frequency domains to facilitate a rapidly executed solution based upon fast Fourier transformation (FFT). Its efficacy is demonstrated with 2H and 31P NMR data for model membranes. PMID:9472792

McCabe, M A; Wassall, S R



Multichannel blind deconvolution and equalization using the natural gradient  

Microsoft Academic Search

Multichannel deconvolution and equalization is an important task for numerous applications in communications, signal processing, and control. We extend the efficient natural gradient search method of Amari, Cichocki and Yang (see Advances in Neural Information Processing Systems, p.752-63, 1995) to derive a set of on-line algorithms for combined multichannel blind source separation and time-domain deconvolution/equalization of additive, convolved signal mixtures.

Shun-ichi Amari; Scott C. Douglas; Andrzej Cichocki; Howard H. Yang




Microsoft Academic Search

The paper proposes a new wavelet-based Bayesian approach to image deconvolution, under the space-invariant blur and additive white Gaussian noise assumptions. Image deconvolution exploits the well known sparsity of the wavelet coefficients, described by heavy-tailed priors. The present approach admits any prior given by a linear (finite or infinite) combination of Gaussian densities. To compute the maximum a…

M. B. Dias; Torre Norte



Blind deconvolution of noisy complex-valued image  

NASA Astrophysics Data System (ADS)

A new algorithm, resulting from adaptations of iterative techniques for blind deconvolution developed by Ayers and Dainty and by ourselves, is introduced. The new algorithm allows the blind deconvolution of a single blurred image, each of whose pixels is complex valued. Results are reported of applying the algorithm to images contaminated with pseudo-random noise. The effect upon the algorithm's convergence of introducing various constraints is reported.

Davey, B. L. K.; Lane, R. G.; Bates, R. H. T.



Bayesian multiscale deconvolution applied to gamma-ray spectroscopy  

Microsoft Academic Search

A common task in gamma-ray astronomy is to extract spectral information, such as model constraints and incident photon spectrum estimates, given the measured energy deposited in a detector and the detector response. This is the classic problem of spectral "deconvolution" or spectral inversion. The methods of forward folding (i.e., parameter fitting) and maximum entropy "deconvolution" (i.e., estimating independent input photon…

C. A. Young; A. Connors; E. Kolaczyk; M. McConnell; G. Rank; J. M. Ryan; V. Schönfelder



Array-conditioned deconvolution of multiple-component teleseismic recordings  

NASA Astrophysics Data System (ADS)

A variety of methodologies have been developed over the years to analyze converted and scattered seismic waves, ranging from single station applications to high-resolution imaging using dense arrays of broadband seismometers. A key step in the data preprocessing chain for these imaging techniques is the 'source-normalization', which requires the construction and application of a deconvolution operator to remove the extended earthquake source function, replacing it with an approximate impulse. Recently, the increased availability of dense seismic array data has motivated the development of novel multichannel deconvolution methods. Here, we investigate the applicability of an array-conditioned deconvolution technique, originally developed for analyzing borehole seismic exploration data, to teleseismic receiver functions and data preprocessing steps for scattered wavefield imaging. This multichannel deconvolution constructs an approximate inverse filter to the estimated source signature by solving an overdetermined set of deconvolution equations, using an array of receivers detecting a common source. We find that this technique improves the efficiency and automation of receiver function calculation and data preprocessing workflow. Synthetic experiments demonstrate that this optimal deconvolution automatically determines and subsequently attenuates the noise from data, enhancing Ps converted phases in seismograms with various noise levels. In this context, the array-conditioned deconvolution presents a new, effective and automatic means for processing large amounts of array data, as it does not require any ad-hoc regularization; the regularization is achieved naturally by using the noise present in the array itself. 
Application of the array deconvolution to a teleseismic data set from a dense array in the Slave craton yields deconvolved data that clearly identify the Ps conversion at the Moho, and suggest the presence of local crustal heterogeneities beneath each receiver. The performance of this technique with noisy data promises the potential of exploiting earthquakes with smaller magnitudes, which would increase the number of usable sources, and thus provide more comprehensive azimuthal coverage for teleseismic imaging.
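
As context for the 'source-normalization' step, here is a minimal single-station sketch of the classic water-level spectral division, the ad-hoc regularization that, per the abstract, the array-conditioned method replaces with noise estimated across the array. The Gaussian source wavelet and two-spike receiver function below are synthetic assumptions, not data from the study.

```python
import numpy as np

def waterlevel_deconvolve(numerator, denominator, level=0.01):
    """Frequency-domain deconvolution stabilized by a water level:
    the denominator power spectrum is clipped from below at `level`
    times its maximum before the spectral division."""
    N = np.fft.fft(numerator)
    D = np.fft.fft(denominator)
    power = np.abs(D) ** 2
    return np.real(np.fft.ifft(N * np.conj(D) / np.maximum(power, level * power.max())))

# Synthetic test: a Gaussian source wavelet, and a receiver function with a
# direct arrival at lag 0 plus a Ps-like conversion at lag 10.
n = 128
t = np.arange(n)
wavelet = np.roll(np.exp(-0.5 * ((t - n // 2) / 3.0) ** 2), -(n // 2))
rf_true = np.zeros(n)
rf_true[0], rf_true[10] = 1.0, 0.5
observed = np.real(np.fft.ifft(np.fft.fft(wavelet) * np.fft.fft(rf_true)))

rf = waterlevel_deconvolve(observed, wavelet)
```

The recovered trace shows the two arrivals as band-limited spikes; the water level bounds the noise amplification that an unregularized division would suffer at frequencies where the source spectrum is weak.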

Chen, C.; Miller, D. E.; Djikpesse, H.; Haldorsen, J. B.; Rondenay, S.



CGHPRO - A comprehensive data analysis tool for array CGH  

PubMed Central

Background Array CGH (Comparative Genomic Hybridisation) is a molecular cytogenetic technique for the genome wide detection of chromosomal imbalances. It is based on the co-hybridisation of differentially labelled test and reference DNA onto arrays of genomic BAC clones, cDNAs or oligonucleotides, and after correction for various intervening variables, loss or gain in the test DNA can be indicated from spots showing aberrant signal intensity ratios. Now that this technique is no longer confined to highly specialized laboratories and is entering the realm of clinical application, there is a need for a user-friendly software package that facilitates estimates of DNA dosage from raw signal intensities obtained by array CGH experiments, and which does not depend on a sophisticated computational environment. Results We have developed a user-friendly and versatile tool for the normalization, visualization, breakpoint detection and comparative analysis of array-CGH data. CGHPRO is a stand-alone JAVA application that guides the user through the whole process of data analysis. The import option for image analysis data covers several data formats, but users can also customize their own data formats. Several graphical representation tools assist in the selection of the appropriate normalization method. Intensity ratios of each clone can be plotted in a size-dependent manner along the chromosome ideograms. The interactive graphical interface offers the chance to explore the characteristics of each clone, such as the involvement of the clone's sequence in segmental duplications. Circular Binary Segmentation and unsupervised Hidden Markov Model algorithms facilitate objective detection of chromosomal breakpoints. The storage of all essential data in a back-end database allows the simultaneous comparative analysis of different cases. The various display options also facilitate the definition of shortest regions of overlap and simplify the identification of odd clones.
Conclusion CGHPRO is a comprehensive and easy-to-use data analysis tool for array CGH. Since all of its features are available offline, CGHPRO may be especially suitable in situations where protection of sensitive patient data is an issue. It is distributed under GNU GPL licence and runs on Linux and Windows.

Chen, Wei; Erdogan, Fikret; Ropers, H-Hilger; Lenzner, Steffen; Ullmann, Reinhard



Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique  

ERIC Educational Resources Information Center

A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…

Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.



Novel ultra-fast deconvolution method for fluorescence lifetime imaging microscopy based on the Laguerre expansion technique  

Microsoft Academic Search

A new deconvolution method for fluorescence lifetime imaging microscopy (FLIM) based on the Laguerre expansion technique is presented. The performance of this method was tested on synthetic FLIM images derived from a multiexponential model and from fluorescence lifetime standards, and then compared to standard algorithms of FLIM analysis. Our results demonstrated significant advantages of the Laguerre method over standard algorithms.

J. A. Jo; Q. Fang; T. Papaioannou; L. Marcu



Semi-blind spectral deconvolution with adaptive Tikhonov regularization.  


Deconvolution has become one of the most used methods for improving spectral resolution. Deconvolution is an ill-posed problem, especially when the point spread function (PSF) is unknown. Non-blind deconvolution methods use a predefined PSF, but in practice the PSF is not known exactly. Blind deconvolution methods estimate the PSF and spectrum simultaneously from the observed spectra, which becomes even more difficult in the presence of strong noise. In this paper, we present a semi-blind deconvolution method to improve the spectral resolution that does not assume a known PSF but models it as a parametric function, in combination with a priori knowledge about the characteristics of the instrumental response. First, we construct the energy functional, including Tikhonov regularization terms for both the spectrum and the parametric PSF. Moreover, an adaptive weighting term is devised in terms of the magnitude of the first derivative of the spectral data to adjust the Tikhonov regularization for the spectrum. Then we minimize the energy functional to obtain the spectrum and the parameters of the PSF. We also discuss how to select the regularization parameters. Comparative results with other deconvolution methods on simulated degraded spectra, as well as on experimental infrared spectra, are presented. PMID:23146190
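
The paper's semi-blind scheme estimates a parametric PSF jointly with the spectrum; the sketch below illustrates only the plain Tikhonov building block, solved in closed form in the Fourier domain, on a simulated degraded spectrum with a known (hypothetical) Gaussian PSF. It is not the authors' adaptive method.

```python
import numpy as np

def tikhonov_deconvolve(y, psf, lam):
    """Closed-form Tikhonov-regularized deconvolution:
    argmin_x ||h * x - y||^2 + lam ||x||^2, solved per Fourier
    frequency as X = conj(H) Y / (|H|^2 + lam)."""
    H = np.fft.fft(psf)
    Y = np.fft.fft(y)
    return np.real(np.fft.ifft(np.conj(H) * Y / (np.abs(H) ** 2 + lam)))

# Simulated degraded spectrum: two narrow lines blurred by a Gaussian PSF,
# plus weak noise.
rng = np.random.default_rng(1)
n = 256
t = np.arange(n)
x_true = np.zeros(n)
x_true[100], x_true[120] = 1.0, 0.7
psf = np.roll(np.exp(-0.5 * ((t - n // 2) / 4.0) ** 2), -(n // 2))
psf /= psf.sum()
y = np.real(np.fft.ifft(np.fft.fft(psf) * np.fft.fft(x_true)))
y += rng.normal(0.0, 1e-3, n)

x_hat = tikhonov_deconvolve(y, psf, lam=1e-3)
```

The regularization parameter lam trades resolution against noise amplification, which is exactly the trade-off the adaptive weighting in the paper tunes locally along the spectrum.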

Yan, Luxin; Liu, Hai; Zhong, Sheng; Fang, Houzhang



PyRAT (python radiography analysis tool): overview  

SciTech Connect

PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization based inversion approach with the goal of identifying unknown object configurations - MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define feasible region and discrete neighbor for the MVO problem - initial data analysis + material library → a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.

Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory



A new tool for accelerator system modeling and analysis  

SciTech Connect

A novel computer code is being developed to generate system level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were used to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated.

Gillespie, G.H.; Hill, B.W. [G.H. Gillespie Associates, Inc., Del Mar, CA (United States); Jameson, R.A. [Los Alamos National Lab., NM (United States)



Ganalyzer: A Tool for Automatic Galaxy Image Analysis  

NASA Astrophysics Data System (ADS)

We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at, and the data used in the experiment are available at
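
Ganalyzer's early processing steps (locating the galaxy centre and generating the radial intensity plot) can be sketched as below. The binning scheme and the synthetic blob are illustrative assumptions; this is not Ganalyzer's actual code.

```python
import numpy as np

def radial_intensity(image, cx, cy, nbins=16):
    """Mean pixel intensity as a function of distance from (cx, cy),
    i.e. the 'radial intensity plot' from which peak slopes are measured."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - cx, yy - cy).ravel()
    v = image.ravel()
    edges = np.linspace(0.0, r.max(), nbins + 1)
    idx = np.clip(np.digitize(r, edges) - 1, 0, nbins - 1)
    sums = np.bincount(idx, weights=v, minlength=nbins)
    counts = np.bincount(idx, minlength=nbins)
    return sums / np.maximum(counts, 1)

# Synthetic 'galaxy': brightness falling off exponentially with radius.
yy, xx = np.indices((64, 64))
img = np.exp(-np.hypot(xx - 32, yy - 32) / 8.0)
profile = radial_intensity(img, 32, 32)
```

For a real galaxy image, the slopes of the peaks detected in this profile (computed along angular slices rather than averaged over all angles) carry the spirality information Ganalyzer classifies on.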

Shamir, Lior



The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy  

NASA Astrophysics Data System (ADS)

In recent years, the amount of data available to climate scientists has grown exponentially. Whether we're looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.



Application of the Lucy–Richardson Deconvolution Procedure to High Resolution Photoemission Spectra  

SciTech Connect

Angle-resolved photoemission has developed into one of the leading probes of the electronic structure and associated dynamics of condensed matter systems. As with any experimental technique, the ability to resolve features in the spectra is ultimately limited by the resolution of the instrumentation used in the measurement. Previously developed for sharpening astronomical images, the Lucy-Richardson deconvolution technique proves to be a useful tool for improving the photoemission spectra obtained in modern hemispherical electron spectrometers, where the photoelectron spectrum is displayed as a 2D image in energy and momentum space.
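
The Lucy-Richardson iteration itself is compact. Here is a minimal 1-D sketch with circular boundaries and a hypothetical Gaussian instrument response standing in for the spectrometer resolution function; it is not the authors' processing code.

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50):
    """Lucy-Richardson deconvolution (1-D, circular boundaries):
    x <- x * correlate(observed / convolve(x, psf), psf)."""
    H = np.fft.fft(psf)
    conv = lambda v, G: np.real(np.fft.ifft(np.fft.fft(v) * G))
    x = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        ratio = observed / np.maximum(conv(x, H), 1e-12)
        x = x * conv(ratio, np.conj(H))  # conj(H) <=> correlation with the PSF
    return x

# Two sharp spectral features blurred by the instrument response.
n = 256
t = np.arange(n)
x_true = np.zeros(n)
x_true[100], x_true[120] = 1.0, 0.7
psf = np.roll(np.exp(-0.5 * ((t - n // 2) / 4.0) ** 2), -(n // 2))
psf /= psf.sum()
observed = np.real(np.fft.ifft(np.fft.fft(psf) * np.fft.fft(x_true)))

restored = richardson_lucy(observed, psf)
```

Two properties make the iteration attractive for spectra: the estimate stays non-negative, and (for a normalized PSF) the total intensity is conserved, so sharpening does not manufacture or destroy counts.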

Rameau, J.; Yang, H.-B.; Johnson, P.D.



Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool  

SciTech Connect

This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health and safety risk analysis for decontamination and decommissioning projects.

Unwin, Stephen D.; Seiple, Timothy E.



Analysis of a public sector organizational unit using strategic and operational analysis tools  

Microsoft Academic Search

This paper reports on a project that analyzed the processes carried out by a unit within a public sector organization. The method used a combination of strategic and operational analysis tools. This combination proved to be complementary and effective in practice. The outcome of the study suggests that where a process analysis project has strategic considerations, as many do, then…

Malcolm Brady



Lag Sequential Analysis as a Tool for Functional Analysis of Student Disruptive Behavior in Classrooms  

Microsoft Academic Search

Lag sequential analysis of individual interactions was explored as a tool to generate hypotheses regarding the social control of inappropriate classroom behavior of students with severe behavior disorders. Four single-subject experiments with two students who displayed high rates of disruptive behavior in special education classrooms were completed using lag sequential analysis to identify antecedent and subsequent social events that…




Analysis of flow cytometry data using an automatic processing tool.  


In spite of recent advances in flow cytometry technology, most cytometry data is still analyzed manually, which is labor-intensive for large datasets and prone to bias and inconsistency. We designed an automatic processing tool (APT) to rapidly and consistently define and describe cell populations across large datasets. Image processing, smoothing, and clustering algorithms were used to generate an expert system that automatically reproduces the functionality of commercial manual cytometry processing tools. The algorithms were developed using a dataset collected from CMV-infected infants and combined within a graphical user interface to create the APT. The APT was used to identify regulatory T-cells in HIV-infected adults, based on expression of FOXP3. Results from the APT were compared directly with the manual analyses of five immunologists and showed close agreement, with a concordance correlation coefficient of 0.96 (95% CI 0.91-0.98). The APT was well accepted by users and able to process around 100 data files per hour. By applying consistent criteria to all data generated by a study, the APT can provide a level of objectivity that is difficult to match using conventional manual analysis. PMID:18613039

Jeffries, David; Zaidi, Irfan; de Jong, Bouke; Holland, Martin J; Miles, David J C



Spasmodic dysphonia, perceptual and acoustic analysis: presenting new diagnostic tools.  


In this article, we investigate whether (1) the IINFVo (Impression, Intelligibility, Noise, Fluency and Voicing) perceptual rating scale and (2) the AMPEX (Auditory Model Based Pitch Extractor) acoustical analysis are suitable for evaluating adductor spasmodic dysphonia (AdSD). Voice recordings of 12 patients were analysed. The inter-rater and intra-rater consistency showed highly significant correlations for the IINFVo rating scale, with the exception of the parameter Noise. AMPEX reliably analyses vowels (correlation between PUVF, the percentage of frames with unreliable F0, and Voicing: 0.748), running speech (correlation between PVF, the percentage of voiced frames, and Voicing: 0.699) and syllables. Correlations between IINFVo and AMPEX range from 0.608 to 0.818, except for Noise. This study indicates that IINFVo and AMPEX could be robust and complementary assessment tools for the evaluation of AdSD. Both tools provide valuable information about voice quality, stability of F0 (fundamental frequency) and specific dimensions controlling the transitions between voiced and unvoiced segments. PMID:19866529

Siemons-Lühring, Denise Irene; Moerman, Mieke; Martens, Jean-Pierre; Deuster, Dirk; Müller, Frank; Dejonckere, Philippe



[Meta-analysis as a tool for evaluation of evidence].  


Nowadays, more than one clinical trial is usually performed to evaluate a new treatment or therapeutic intervention, which necessitates a combined evaluation of their results. Integrating evidence from several trials also helps to establish the current state of knowledge. These are the main goals of meta-analyses. Since the end of the 1980s, meta-analyses have been widely used in clinical research. At the beginning of a meta-analysis, a protocol has to be developed. As in the protocol of a clinical trial, the inclusion and exclusion criteria for trials, the hypotheses and the planned analyses have to be fixed. After a careful search for trials, a combined statistical analysis is performed. An investigation of heterogeneity, i.e., differences between study results, is indispensable. In recent years, meta-analysis as a tool has been criticized. The criticism mainly stems from poorly conducted meta-analyses which generated results without prespecified hypotheses or which merely pooled study results. Well-planned meta-analyses, on the contrary, have an increasing influence in clinical research. PMID:10714131
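The combined statistical analysis and heterogeneity check described in the abstract can be sketched with a standard fixed-effect, inverse-variance pooling plus Cochran's Q statistic. The effect sizes below are hypothetical log odds ratios, not data from any cited trial:

```python
import math

def fixed_effect_meta(effects, ses):
    """Inverse-variance fixed-effect pooled estimate plus Cochran's Q heterogeneity."""
    w = [1 / se ** 2 for se in ses]                                # study weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)     # weighted mean
    se_pooled = math.sqrt(1 / sum(w))
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))   # heterogeneity
    return pooled, se_pooled, q

# Hypothetical log odds ratios and standard errors from three trials
pooled, se, q = fixed_effect_meta([-0.30, -0.10, -0.25], [0.10, 0.15, 0.12])
```

A large Q relative to its degrees of freedom (number of studies minus one) signals the between-study heterogeneity that the abstract says must always be investigated.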

Koch, A; Ziegler, S



[Meta-analysis as a tool for gaining knowledge].  


Nowadays, more than one clinical trial is usually performed to evaluate a new treatment or therapeutic intervention, which necessitates a combined evaluation of their results. Integrating evidence from several trials also helps to establish the current state of knowledge. These are the main goals of meta-analyses. Since the end of the 1980s, meta-analyses have been widely used in clinical research. At the beginning of a meta-analysis, a protocol has to be developed. As in the protocol of a clinical trial, the inclusion and exclusion criteria for trials, the hypotheses and the planned analyses have to be fixed. After a careful search for trials, a combined statistical analysis is performed. An investigation of heterogeneity, i.e., differences between study results, is indispensable. In recent years, meta-analysis as a tool has been criticized. The criticism mainly stems from poorly conducted meta-analyses which generated results without prespecified hypotheses or which merely pooled study results. Well-planned meta-analyses, on the contrary, have an increasing influence in clinical research. PMID:10851854

Koch, A; Ziegler, S



Immunoglobulin Analysis Tool: A Novel Tool for the Analysis of Human and Mouse Heavy and Light Chain Transcripts  

PubMed Central

Sequence analysis of immunoglobulin (Ig) heavy and light chain transcripts can refine categorization of B cell subpopulations and can shed light on the selective forces that act during immune responses or immune dysregulation, such as autoimmunity, allergy, and B cell malignancy. High-throughput sequencing yields Ig transcript collections of unprecedented size. The authoritative web-based IMGT/HighV-QUEST program is capable of analyzing large collections of transcripts and provides annotated output files to describe many key properties of Ig transcripts. However, additional processing of these flat files is required to create figures, or to facilitate analysis of additional features and comparisons between sequence sets. We present an easy-to-use Microsoft® Excel®-based software tool, named Immunoglobulin Analysis Tool (IgAT), for the summary, interrogation, and further processing of IMGT/HighV-QUEST output files. IgAT generates descriptive statistics and high-quality figures for collections of murine or human Ig heavy or light chain transcripts ranging from 1 to 150,000 sequences. In addition to traditionally studied properties of Ig transcripts – such as the usage of germline gene segments, or the length and composition of the CDR-3 region – IgAT also uses published algorithms to calculate the probability of antigen selection based on somatic mutational patterns, the average hydrophobicity of the antigen-binding sites, and predictable structural properties of the CDR-H3 loop according to Shirai’s H3-rules. These refined analyses provide in-depth information about the selective forces acting upon Ig repertoires and allow the statistical and graphical comparison of two or more sequence sets. IgAT is easy to use on any computer running Excel® 2003 or higher. 
Thus, IgAT is a useful tool to gain insights into the selective forces and functional properties of small to extremely large collections of Ig transcripts, thereby assisting a researcher to mine a data set to its fullest.

Rogosch, Tobias; Kerzel, Sebastian; Hoi, Kam Hon; Zhang, Zhixin; Maier, Rolf F.; Ippolito, Gregory C.; Zemlin, Michael




Microsoft Academic Search

This paper presents a software tool for performing the static and dynamic analysis of regular and simple multi-storied structures. This analysis procedure is based on IS 1893:2002 (Part I), Criteria for Earthquake Resistant Design of Structures. This tool is a Windows-based software program developed in Visual Basic. The tool provides a user-friendly GUI for the calculation of base



Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics  

Microsoft Academic Search

The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict

Vincent J Jodoin; Ronald W Lee; Douglas E. Peplow; Jordan P Lefebvre



Elementary mode analysis: a useful metabolic pathway analysis tool for characterizing cellular metabolism  

Microsoft Academic Search

Elementary mode analysis is a useful metabolic pathway analysis tool to identify the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose the intricate metabolic network comprised of highly interconnected reactions into uniquely organized pathways. These pathways consisting of a minimal set of enzymes that can support steady state operation of

Cong T. Trinh; Aaron Wlaschin; Friedrich Srienc



Reference-Free XRF - Principle, Calibrated Instrumentation and Spectra Deconvolution  

NASA Astrophysics Data System (ADS)

The Physikalisch-Technische Bundesanstalt operates its own laboratory at the electron storage ring BESSY II in Berlin. One major task of this laboratory, hosting the departments Radiometry and X-ray Metrology with Synchrotron Radiation, is the use of well-defined synchrotron radiation for calibration of different types of detectors in the spectral range from UV/VUV to the harder X-ray range. Well-known radiation sources in conjunction with calibrated instrumentation are used for X-ray fluorescence analysis (XRF), allowing for completely reference-free quantification. Here, XRF spectra deconvolution with experimentally determined detector response functions has been developed and further improved by using line-sets for each subshell of an involved element. Synchrotron radiation originating from a bending magnet can be partially seen as an equivalent to the solar emission spectrum in the soft and hard X-ray range. Using different electron energies in the storage ring of BESSY II as well as in PTB's own Metrology Light Source (MLS), different parts of the solar spectrum can be approximated, allowing for complementary simulations of excitation conditions for XRF remote sensing of planetary surfaces.
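The core idea of spectral deconvolution with detector response functions can be sketched as a linear unmixing problem: the measured spectrum is modeled as a weighted sum of per-line response columns, and the weights (counts per fluorescence line) are recovered by least squares. This is an illustrative toy, with Gaussians standing in for measured response functions, not the PTB procedure itself:

```python
import numpy as np

# Hypothetical detector response functions sampled on a common energy axis
energy = np.linspace(0.0, 10.0, 200)

def response(center, width=0.3):
    r = np.exp(-0.5 * ((energy - center) / width) ** 2)
    return r / r.sum()                      # normalized line response

R = np.column_stack([response(c) for c in (2.0, 4.5, 7.0)])  # one column per line set
true_counts = np.array([500.0, 120.0, 300.0])
spectrum = R @ true_counts                  # noise-free synthetic spectrum

# Deconvolution: least-squares fit of the response columns to the measured spectrum
fitted, *_ = np.linalg.lstsq(R, spectrum, rcond=None)
```

In practice one would add noise weighting and a non-negativity constraint, but the structure, response matrix times line intensities, is the same.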

Kolbe, M.; Beckhoff, B.; Mantler, M.



Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools  

SciTech Connect

The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualization), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

Barrand, Guy



A Web-based Tool For The Analysis Of Concept Inventory Data  

Microsoft Academic Search

“FOCIA” stands for Free Online Concept Inventory Analyzer. FOCIA, our new web-based tool, will allow teachers and researchers in any location to upload their test data and instantly receive a complete analysis report. Analyses included with this tool are basic test statistics, Traditional Item Analysis, Concentration Analysis, Model Analysis Theory results, and pre- and post-test comparison, including the calculations of

Joseph P. Beuckman; Scott V. Franklin; Rebecca S. Lindell




The use of current risk analysis tools evaluated towards preventing external domino accidents  

Microsoft Academic Search

Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour with which all possible risks are identified and evaluated. The diversity in risk analysis procedures is such that there are many appropriate techniques for any circumstance and the choice has become more

G. L. L. Reniers; W. Dullaert; B. J. M. Ale




Microsoft Academic Search

The SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis has been a useful tool for industry. This article proposes the application of the SWOT tool for use as a decision-making aid as new vocational programs are planned.

Radha Balamuralikrishna; John C. Dugger


Data Analysis Tools Using JAVA/Internet Technology at Arnold Engineering Development Center.  

National Technical Information Service (NTIS)

AEDC is in the process of bringing Virtual Presence capabilities to its customers through Data Analysis Tools using the Java Programming Language and Internet Technologies. These technology tools are maturing at a time when U. S. dominance and market shar...

D. Pemberton



Process Documentation and Execution: Introducing a Tool to Support Analysis of Alternatives.  

National Technical Information Service (NTIS)

PROBLEM STATEMENT: Develop software tools to support analytical data preparation processes: Populate authoritative databases * Prepare data for analysis * Post-process model output. What features would you like to see in these tools? Usability * Reliabili...

T. A. Dufresne R. L. Turner



Array-conditioned deconvolution of multiple-component teleseismic recordings  

NASA Astrophysics Data System (ADS)

We investigate the applicability of an array-conditioned deconvolution technique, developed for analysing borehole seismic exploration data, to teleseismic receiver functions and data pre-processing steps for scattered wavefield imaging. This multichannel deconvolution technique constructs an approximate inverse filter to the estimated source signature by solving an overdetermined set of deconvolution equations, using an array of receivers detecting a common source. We find that this technique improves the efficiency and automation of receiver function calculation and data pre-processing workflow. We apply this technique to synthetic experiments and to teleseismic data recorded in a dense array in northern Canada. Our results show that this optimal deconvolution automatically determines and subsequently attenuates the noise from data, enhancing P-to-S converted phases in seismograms with various noise levels. In this context, the array-conditioned deconvolution presents a new, effective and automatic means for processing large amounts of array data, as it does not require any ad-hoc regularization; the regularization is achieved naturally by using the noise present in the array itself.
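The multichannel scheme described above solves for an approximate inverse filter to the source signature. A heavily simplified stand-in is stabilized spectral division: deconvolve each receiver's trace by the estimated source spectrum with a small water level. The wavelet and toy Green's functions below are hypothetical and the water-level regularization replaces the paper's array-derived regularization:

```python
import numpy as np

n = 256
wavelet = np.zeros(n)
wavelet[:3] = [1.0, -0.5, 0.2]        # assumed short source signature (toy)

# Two receivers: a direct arrival plus a later converted phase (toy Green's functions)
g1, g2 = np.zeros(n), np.zeros(n)
g1[[0, 60]] = [1.0, 0.4]
g2[[0, 80]] = [1.0, 0.3]
traces = [np.real(np.fft.ifft(np.fft.fft(g) * np.fft.fft(wavelet))) for g in (g1, g2)]

# Approximate inverse filter to the source signature: stabilized spectral division
S = np.fft.fft(wavelet)
water = 1e-6 * np.max(np.abs(S)) ** 2
inv_filter = np.conj(S) / (np.abs(S) ** 2 + water)
recovered = [np.real(np.fft.ifft(np.fft.fft(t) * inv_filter)) for t in traces]
```

Each recovered trace approximates the receiver-side impulse response, the receiver-function analogue of enhancing the P-to-S conversions mentioned in the abstract.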

Chen, C.-W.; Miller, D. E.; Djikpesse, H. A.; Haldorsen, J. B. U.; Rondenay, S.



EVA - An Interactive Online Tool for Extreme Value Analysis  

NASA Astrophysics Data System (ADS)

Forecasting and analysing extreme events and their impact is a duty of operational forecasters, even though such events occur infrequently. In such situations forecasters often rely on a synopsis of different forecast models, their own experience, historical observations and intuition. Historical data in particular are usually not available with the completeness and timeliness needed in operational forecasting and warning. A forecaster needs a comprehensive overview; there is no time to dig data out of a database, search for extremes and compile a rather complicated extreme value analysis. On the other hand, in the field of engineering, expertise on extreme events is often requested from a modern weather service, and in many cases the time for elaboration is limited. EVA (Extreme Value Analysis) was developed at ZAMG during METEORISK, a project among alpine weather and hydrological services dealing with meteorological and hydrological risks. The EVA system consists of two main components. The first is an effective database containing pre-processed precipitation data (rain, snow and snow height) from meteorological events of durations from 1 minute up to 15 days, measured at each station in the partner regions. The second is a set of web tools for the actual extreme value analysis. Different theoretical models can be chosen to calculate annualities. The output is presented either in tabular form, showing all extreme events at a station together with the theoretically calculated return times, or graphically, where parameters like precipitation amount at certain return times and confidence intervals are plotted together with the empirical distribution of the actual measurements. Additional plots (quantile-quantile plots, empirical and fitted theoretical distribution models) allowing a more detailed assessment of the extreme value analysis can be requested. 
To complete the analysis of a particular extreme event, ECMWF ERA40 sea level and upper air pressure fields and temperature distributions are available within the system. In the years after METEORISK, the EVA system has been expanded by ZAMG to include further parameters such as wind speed and temperature. The system has lately been harmonized, so that ZAMG now has a single platform providing fast extreme value analysis for all relevant meteorological parameters. A further development is the EVA-maps application. Forecasted extreme events at station locations and actual measurements are compared to historical extreme events. Return times of the forecasted and measured events are classified and displayed in a map. A mouse-over menu offers detailed analysis of the situation at each station. EVA-maps is a powerful aid to forecasters, giving them a comprehensive overview of forecasted precipitation in relation to extreme events of the past.
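The return times that EVA computes can be illustrated with the simplest textbook model: a Gumbel distribution fitted to annual maxima by the method of moments. This is a generic sketch, not the theoretical models used in EVA, and the station maxima below are hypothetical:

```python
import math

def gumbel_return_level(annual_maxima, T):
    """Return level for return period T (years), Gumbel fit by method of moments."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi    # scale parameter
    mu = mean - 0.5772 * beta              # location (Euler-Mascheroni constant)
    return mu - beta * math.log(-math.log(1 - 1 / T))

# Hypothetical annual maximum daily precipitation (mm) at one station
maxima = [42, 55, 38, 61, 47, 70, 52, 44, 58, 49]
level_100 = gumbel_return_level(maxima, 100)   # 100-year return level estimate
```

Inverting the same relation gives the return time of an observed or forecasted event, which is what EVA-maps classifies and displays on its maps.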

Zingerle, C.; Buchauer, M.; Neururer, A.; Schellander, H.



jSIPRO - Analysis tool for magnetic resonance spectroscopic imaging.  


Magnetic resonance spectroscopic imaging (MRSI) involves a huge number of spectra to be processed and analyzed. Several tools enabling MRSI data processing have been developed and widely used. However, the processing programs primarily focus on sophisticated spectra processing and offer limited support for the analysis of the calculated spectroscopic maps. In this paper the jSIPRO (java Spectroscopic Imaging PROcessing) program is presented, which is a java-based graphical interface enabling post-processing, viewing, analysis and result reporting of MRSI data. Interactive graphical processing as well as protocol-controlled batch processing are available in jSIPRO. jSIPRO does not contain a built-in fitting program. Instead, it makes use of fitting programs from third parties and manages the data flows. Currently, automatic spectra processing using the LCModel, TARQUIN and jMRUI programs is supported. Concentration and error values, fitted spectra, metabolite images and various parametric maps can be viewed for each calculated dataset. Metabolite images can be exported in the DICOM format either for archiving purposes or for use in neurosurgery navigation systems. PMID:23870172

Jiru, Filip; Skoch, Antonin; Wagnerova, Dita; Dezortova, Monika; Hajek, Milan



Improved Cell Typing by Charge-State Deconvolution of Matrix-Assisted Laser Desorption/Ionization Mass Spectra  

SciTech Connect

Robust, specific, and rapid identification of toxic strains of bacteria and viruses, to guide the mitigation of their adverse health effects and optimum implementation of other response actions, remains a major analytical challenge. This need has driven the development of methods for classification of microorganisms using mass spectrometry, particularly matrix-assisted laser desorption ionization MS (MALDI) that allows high throughput analyses with minimum sample preparation. We describe a novel approach to cell typing based on pattern recognition of MALDI spectra, which involves charge-state deconvolution in conjunction with a new correlation analysis procedure. The method is applicable to both prokaryotic and eukaryotic cells. Charge-state deconvolution improves the quantitative reproducibility of spectra because multiply-charged ions resulting from the same biomarker attaching a different number of protons are recognized and their abundances are combined. This allows a clearer distinction of bacterial strains or of cancerous and normal liver cells. Improved class distinction provided by charge-state deconvolution was demonstrated by cluster spacing on canonical variate score charts and by correlation analyses. Deconvolution may enhance detection of early disease state or therapy progress markers in various tissues analyzed by MALDI.
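The charge-state deconvolution step described above, recognizing multiply protonated ions of the same biomarker and combining their abundances, can be sketched as follows. The proton-mass arithmetic is standard; the peak list, tolerance, and biomarker mass are hypothetical, and this is not the authors' implementation:

```python
PROTON = 1.00728  # mass of a proton, Da

def deconvolute(peaks, tol=0.5):
    """Collapse multiply charged peaks onto neutral masses, summing abundances.

    peaks: list of (m_over_z, charge, abundance); tol: mass match window in Da.
    """
    masses = []  # list of [neutral_mass, total_abundance]
    for mz, z, ab in peaks:
        neutral = z * mz - z * PROTON          # remove the z attached protons
        for entry in masses:
            if abs(entry[0] - neutral) <= tol: # same biomarker, different charge
                entry[1] += ab
                break
        else:
            masses.append([neutral, ab])
    return masses

# Hypothetical biomarker of ~5730 Da observed singly and doubly protonated
peaks = [(5731.0, 1, 120.0), (2866.0, 2, 80.0), (4101.3, 1, 40.0)]
combined = deconvolute(peaks)
```

Summing the charge states in neutral-mass space is what makes the spectra more quantitatively reproducible before the correlation analysis.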

Wilkes, Jon G.; Buzantu, Dan A.; Dare, Diane J.; Dragan, Yvonne P.; Chiarelli, M. Paul; Holland, Ricky D.; Beaudoin, Michael; Heinze, Thomas M.; Nayak, Rajesh; Shvartsburg, Alexandre A.



Edge-preserving image deconvolution with nonlocal domain transform  

NASA Astrophysics Data System (ADS)

In this paper, we propose a new approach for efficient edge-preserving image deconvolution based on a nonlocal domain transform (NLDT). For simplicity, we present the geodesic distance-preserving procedure that transforms a 1D signal embedded in 2D space into a new 1D domain. The nonlocal domain transform derives from the (1D) nonlocal means filter kernel and iteratively and separably applies 1D edge-aware operations. To address the main issue with noisy images, namely finding robust estimates of their derivatives, we develop an efficient joint nonlocal domain transform filter for the deblurring process. Furthermore, we derive the discrepancy principle to automatically adjust the regularization parameter at each iteration. We compare our deconvolution algorithm with many competitive deconvolution techniques in terms of ISNR and visual quality.

Yang, Hang; Zhang, Zhongbo; Zhu, Ming; Huang, Heyan



Deconvolution microscopy of living cells for phase-contrast imaging  

NASA Astrophysics Data System (ADS)

The goal of deconvolution microscopy for phase-contrast imaging is to reassign the optical blur to its original position and to reduce statistical noise, thus visualizing the cellular structures of living cells in three dimensions and at subresolution scale. The major features of this technology for phase-contrast microscopy are discussed through a series of theoretical analyses. A few possible sources of aberrations and image degradation processes are presented. The theoretical and experimental results have shown that deconvolution microscopy can enhance resolution and contrast by either subtracting or reassigning out-of-focus blur.
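The "reassigning" variant of deconvolution is commonly implemented with an iterative scheme such as Richardson-Lucy; the abstract does not name its algorithm, so the 1D sketch below is a generic illustration with a hypothetical Gaussian point-spread function:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=200):
    """Iteratively reassign blurred intensity back to its source positions."""
    psf_flip = psf[::-1]
    estimate = np.full_like(observed, observed.mean())  # flat positive start
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid division by zero
        estimate = estimate * np.convolve(ratio, psf_flip, mode="same")
    return estimate

# 1D toy: two point sources blurred by a Gaussian point-spread function
truth = np.zeros(64)
truth[[20, 40]] = [1.0, 0.5]
psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
psf /= psf.sum()
observed = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(observed, psf)
```

The multiplicative update preserves non-negativity and total intensity, which is exactly the "reassign rather than subtract" behavior the abstract contrasts with simple blur subtraction.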

Cheng, Guanxiao; Xu, Ping; Sun, Zhilong; Hong, Chunquan; Li, Zelin



CProb: a computational tool for conducting conditional probability analysis.  


Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download. PMID:18948494
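The stressor-response calculation at the heart of CPA reduces to an empirical estimate of P(response exceeds a threshold | stressor exceeds a threshold). This sketch is generic, not CProb's code, and the paired observations and cutpoints are hypothetical:

```python
def conditional_probability(stressor, response, x_cut, y_cut):
    """Empirical P(response >= y_cut | stressor >= x_cut) from paired observations."""
    exceed = [(s, r) for s, r in zip(stressor, response) if s >= x_cut]
    if not exceed:
        return float("nan")                       # no exceedances: undefined
    return sum(1 for _, r in exceed if r >= y_cut) / len(exceed)

# Hypothetical paired data: sediment contaminant level vs. impairment score
contaminant = [0.2, 0.5, 0.9, 1.4, 1.8, 2.3, 2.9, 3.5]
impairment  = [0.1, 0.2, 0.3, 0.6, 0.5, 0.8, 0.9, 0.7]
p = conditional_probability(contaminant, impairment, x_cut=1.0, y_cut=0.6)
```

Sweeping `x_cut` across the observed stressor range and plotting the resulting probabilities gives the conditional probability curve that CProb draws alongside scatterplots and empirical CDFs.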

Hollister, Jeffrey W; Walker, Henry A; Paul, John F



Failure analysis of a tool-steel torque shaft  

SciTech Connect

A low design load drive shaft from an experimental diesel truck engine failed unexpectedly during highway testing. The shaft was driven by a turbine used to deliver power from an experimental exhaust heat recovery system to the engine's crankshaft. During design, fatigue was not considered a major problem because of the low operating cyclic stresses. An independent testing laboratory analyzed the failure by routine metallography. The structure of the hardened S-7 tool steel shaft was banded, and the laboratory attributed the failure to fatigue induced by the banded microstructure. NASA was asked to confirm this analysis. Visual examination of the failed shaft, plus knowledge of the torsional load that it carried, pointed to a 100% ductile failure with no evidence of fatigue. Scanning electron microscopy confirmed this. Torsional test specimens were produced from pieces of the failed shaft, and torsional overload testing produced failures identical to the one that had occurred in the truck engine. This pointed to a failure caused by a high overload; although the microstructure was defective, it was not the cause of the failure.

Reagan, J.R.



GPFrontend and GPGraphics: graphical analysis tools for genetic association studies  

PubMed Central

Background Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run on UNIX-like operating systems. Graphical output is often not provided, or is expected to be produced with other command line tools such as gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data, enabling GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate the output of different WGA analysis programs, among them GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.



A Virtual Environment Task-Analysis Tool for the Creation of Virtual Art Exhibits  

Microsoft Academic Search

This paper describes the creation of a hypothetical virtual art exhibit using a virtual environment task analysis tool. The Virtual Environment Task Analysis Tool (VETAT-ART) is a paper-and-pencil tool developed to provide structure and guidance to the needs-analysis process that is essential to the development of lifelike virtual exhibits. To illustrate its potential usefulness, VETAT-ART is applied to the design

Anne Parent



Second generation sequencing allows for mtDNA mixture deconvolution and high resolution detection of heteroplasmy  

PubMed Central

Aim To use parallel array pyrosequencing to deconvolute mixtures of mitochondrial DNA (mtDNA) sequence and provide high resolution analysis of mtDNA heteroplasmy. Methods The hypervariable segment 1 (HV1) of the mtDNA control region was analyzed from 30 individuals using the 454 GS Junior instrument. Mock mixtures were used to evaluate the system’s ability to deconvolute mixtures and to reliably detect heteroplasmy, including heteroplasmic differences between 5 family members of the same maternal lineage. Amplicon sequencing was performed on polymerase chain reaction (PCR) products generated with primers that included multiplex identifiers (MID) and adaptors for pyrosequencing. Data analysis was performed using NextGENe® software. The analysis of an autosomal short tandem repeat (STR) locus (D18S51) and a Y-STR locus (DYS389 I/II) was performed simultaneously with a portion of HV1 to illustrate that multiplexing can encompass different markers of forensic interest. Results Mixtures, including heteroplasmic variants, can be detected routinely down to a component ratio of 1:250 (20 minor variant copies with a coverage rate of 5000 sequences) and can be readily detected down to 1:1000 (0.1%) with expanded coverage. Amplicon sequences from D18S51, DYS389 I/II, and the second half of HV1 were successfully partitioned and analyzed. Conclusions The ability to routinely deconvolute mtDNA mixtures down to a level of 1:250 allows for high resolution analysis of mtDNA heteroplasmy, and for differentiation of individuals from the same maternal lineage. The pyrosequencing approach results in poor resolution of homopolymeric sequences, and PCR/sequencing artifacts require a filtering mechanism similar to that for STR stutter and spectral bleed through. In addition, chimeric sequences from jumping PCR must be addressed to make the method operational.

Holland, Mitchell M.; McQuillan, Megan R.; O'Hanlon, Katherine A.



Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools  

NASA Astrophysics Data System (ADS)

Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. Despite the profuse literature about risk perception, works that portray this feature spatially are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better target educational activities to increase the preparedness of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to various mass movements and floods; in particular, a large event in 1987 affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is built around a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel for previously defined risk scenarios. 
Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency management to know in advance the different levels of risk perception and preparedness existing among several sectors of the population. Knowing where the most vulnerable population is located may optimize the use of resources, better direct the initial efforts and organize the evacuation and assistance procedures. As part of the CBEWS, a comprehensive survey was conducted in the study area to measure, among other features, the levels of risk perception, preparedness and information received about natural hazards. After statistical and direct analysis of the complete social dataset recorded, spatial mapping of the information is now in progress. Based on boundary features (municipalities and sub-districts) from the Italian Institute of Statistics (ISTAT), a local-scale base map has been prepared (address-level detail is not accessible under privacy rules, so the local district ID inside each municipality is the finest level used) and the spatial location of the surveyed population has been completed. With the geometric component defined, it is now possible to create local distributions of the social parameters derived from the perception questionnaire results. The raw information and socio-statistical analyses offer different views, or "visual concepts", of risk perception. For this reason a complete GeoDB is being built to organize the dataset. From a technical point of view, the environment for data sharing is based on a fully open-source web-service environment, offering a hand-crafted, user-friendly interface to this kind of information. The final aim is to offer different views of the dataset, using the same scale prototype and hierarchical data structure, to present and compare the social distribution of risk perception at the most detailed level.

Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone



Analysis of the JPLAN Exercise as an Experimental Learning Tool.  

National Technical Information Service (NTIS)

The purpose of this study is to develop an effective educational tool for the Combat Logistics course offered at the Air Force Institute of Technology. The current tool being used, the JPLAN Exercise, was identified as outdated in several areas. The resea...

M. J. Lynch M. E. Washington




Microsoft Academic Search

Abstract- Selection of tool steels is not limited to one type of steel for satisfying customer needs. On the other hand, technical specifications of tool steels are not absolute and consistently deviate from the mean. Thus, in this paper, fuzzy logic theory has been used for selecting appropriate tool steels with price analysis. The main steps are as follows: 1.




Microsoft Academic Search

A Planetary Entry Systems Synthesis Tool, with applications to conceptual design and modeling of entry systems, has been developed. This tool is applicable to exploration missions that employ entry, descent and landing or aerocapture. An integrated framework brings together relevant disciplinary analyses and enables rapid design and analysis of the atmospheric entry mission segment. Tool performance has been validated against

D. M. Kipp; J. A. Dec; G. W. Wells; R. D. Braun


Computer-aided circuit analysis tools for RFIC simulation: algorithms, features, and limitations  

Microsoft Academic Search

The design of the radio frequency (RF) section in a communication integrated circuit (IC) is a challenging problem. Although several computer-aided analysis tools are available for RFIC design, they are not effectively used, because there is a lack of understanding about their features and limitations. These tools provide fast simulation of RFICs. However, no single tool delivers a complete solution

Kartikeya Mayaram; David C. Lee; Shahriar Moinian; David A. Rich; Jaijeet Roychowdhury



Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis  

NASA Astrophysics Data System (ADS)

The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited, as they were produced by simultaneous events stemming from primary eruptive, transport, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. Because the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas were cleaned and impregnated with a two-component glue (EPOTEK 301). For an area of approx. 10 × 10 cm, a volume of 20 ml of mixed glue was required. Using a syringe, this low-viscosity, transparent glue could be easily applied to the target area. We found that the glue permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry for the sample to be laid open. 
This impregnation method makes it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). After that, selected areas were investigated through thin section analysis. We were able to define depositional units on the (sub)-mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.



Tool Chatter Monitoring in Turning Operations Using Wavelet Analysis of Ultrasound Waves  

Microsoft Academic Search

This paper presents a new method for tool chatter monitoring using the wavelet analysis of ultrasound waves. Ultrasound waves are pulsed through the cutting tool towards the nose and are reflected back off the cutting edge. Fluctuating states of contact and non-contact between the tool insert and the workpiece, which are generated as a result of tool chatter, affect the

J. H. Lange; N. H. Abu-Zahra



Online Analysis of Wind and Solar Part II: Transmission Tool  

SciTech Connect

To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the limited predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa



Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics  

SciTech Connect

The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

Jodoin, Vincent J. [ORNL]; Lee, Ronald W. [ORNL]; Peplow, Douglas E. [ORNL]; Lefebvre, Jordan P. [ORNL]



Sub-diffraction limit differentiation of single fluorophores using Single Molecule Image Deconvolution (SMID)  

NASA Astrophysics Data System (ADS)

In order to better understand biological systems, researchers demand new techniques and improvements in single molecule differentiation. We present a unique approach utilizing an analysis of the standard deviation of the Gaussian point spread function of single immobile fluorescent molecules. This technique, Single Molecule Image Deconvolution (SMID), is applicable to standard TIRF instrumentation and standard fluorophores. We demonstrate the method by measuring the separation of two Cy3 molecules attached to the ends of short double-stranded DNA immobilized on a surface without photobleaching. Preliminary results and further applications will be presented.

Decenzo, Shawn H.; Desantis, Michael C.; Wang, Y. M.
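
The quantity SMID analyzes, the standard deviation of a single molecule's Gaussian point spread function, can be illustrated with a toy estimator. The moment-based sketch below is illustrative only, not the authors' fitting procedure:

```python
import numpy as np

def psf_sigma(image):
    """Estimate the per-axis standard deviation of a (noise-free) isotropic
    Gaussian spot from intensity-weighted second moments."""
    ys, xs = np.indices(image.shape)
    total = image.sum()
    cx = (xs * image).sum() / total            # intensity-weighted centroid
    cy = (ys * image).sum() / total
    var = (((xs - cx) ** 2 + (ys - cy) ** 2) * image).sum() / total
    return np.sqrt(var / 2.0)                  # var = sigma_x^2 + sigma_y^2
```

For two unresolved emitters, the apparent sigma broadens with their separation, which is the effect the abstract exploits.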



Scatter correction for gamma cameras using constrained deconvolution  

Microsoft Academic Search

Several groups have proposed the use of multiwindow (usually dual window) techniques for scatter correction in SPECT (single photon emission computed tomography) and conventional gamma camera imaging. The authors have developed a technique for constrained deconvolution of the primary photopeak from the observed spectrum using multiple (5-30) windows. They have tested it both with simulations and with a modified gamma

D. R. Haynor; R. L. Harrison; T. K. Lewellen



Identification of RC networks by deconvolution: chances and limits  

Microsoft Academic Search

This paper deals with the identification of RC networks from their time- or frequency-domain responses. A new method is presented based on a recent approach of the network description where all response functions are calculated by convolution integrals. The identification is carried out by deconvolution (NID method). This paper discusses the practical details of the method. Special attention is paid

V. Szekely



Blind separation and blind deconvolution: an information-theoretic approach  

Microsoft Academic Search

Blind separation and blind deconvolution are related problems in unsupervised learning. In this contribution, static non-linearities are used in combination with an information-theoretic objective function, making the approach more rigorous than previous ones. We derive a new algorithm and with it perform nearly perfect separation of up to 10 digitally mixed human speakers, better performance than any previous algorithms for

Anthony J. Bell; Terrence J. Sejnowski
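
The information-theoretic approach can be sketched as a natural-gradient infomax update with a logistic nonlinearity. This is a minimal illustration of the idea, not the paper's exact algorithm or learning schedule:

```python
import numpy as np

def infomax_unmix(x, lr=0.05, iterations=300):
    """Batch natural-gradient infomax: learn an unmixing matrix W so that
    the logistic-squashed outputs have maximal joint entropy."""
    n_src, n_samp = x.shape
    W = np.eye(n_src)
    for _ in range(iterations):
        u = W @ x
        y = 1.0 / (1.0 + np.exp(-u))           # logistic squashing
        # natural-gradient update: dW = lr * (I + (1 - 2y) u^T / N) W
        W += lr * (np.eye(n_src) + (1.0 - 2.0 * y) @ u.T / n_samp) @ W
    return W
```

Separation quality is judged here only by the drop in correlation between the recovered channels; the published work uses more careful learning schedules and scales to many more sources.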



A quantitative evaluation of various iterative deconvolution algorithms  

Microsoft Academic Search

Various iterative deconvolution algorithms are evaluated that are commonly used to restore degraded chromatographic or spectroscopic peak data. The evaluation criteria include RMS errors, relative errors in peak areas, peak area variances, and rate of convergence. The iterative algorithms to be evaluated include Van Cittert's method, Van Cittert's method with constraint operators, relaxation based methods, and Gold's ratio method. The

P. B. Crilly
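
Of the algorithms evaluated, Van Cittert's method with a constraint operator is simple enough to sketch in a few lines; the non-negativity clip below stands in for the constraint operators the paper compares (an illustration, not the evaluation code):

```python
import numpy as np

def van_cittert(observed, kernel, iterations=50, relax=1.0, nonneg=True):
    """Van Cittert iterative deconvolution: repeatedly add back the
    residual between the observed data and the re-blurred estimate."""
    estimate = observed.copy()
    for _ in range(iterations):
        reblurred = np.convolve(estimate, kernel, mode="same")
        estimate = estimate + relax * (observed - reblurred)
        if nonneg:
            estimate = np.clip(estimate, 0.0, None)  # constraint operator
    return estimate
```

The relaxation factor and the constraint are exactly the knobs whose effect on RMS error and peak areas the abstract says were compared.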



Higher order spectra based deconvolution of ultrasound images  

Microsoft Academic Search

We address the problem of improving the spatial resolution of ultrasound images through blind deconvolution. The ultrasound image formation process in the RF domain can be expressed as a spatio-temporal convolution between the tissue response and the ultrasonic system response, plus additive noise. Convolutional components of the dispersive attenuation and aberrations introduced by propagating through the object being imaged are

Udantha R. Abeyratne; Athina P. Petropulu; John M. Reid



Spent Nuclear Fuel Characterization Through Neutron Flux Deconvolution  

SciTech Connect

A method to determine the composition of spent fuel through spectral deconvolution of the neutron flux emitted from the fuel is proposed. Recently developed GaAs({sup 10}B) semiconductor detector arrays are used. The results of Monte Carlo simulations of the detector responses, illustrating the feasibility of the spectral unfolding technique for spent fuel characterization, are presented.

Hartman, Michael R.; Lee, John C.



Multichannel blind separation and deconvolution of sources with arbitrary distributions  

Microsoft Academic Search

Blind deconvolution and separation of linearly mixed and convolved sources is an important and challenging task for numerous applications. While several algorithms have shown promise in these tasks, these techniques may fail to separate signal mixtures containing both sub- and super-Gaussian distributed sources. In this paper, we present a simple and efficient extension of a family of algorithms that enables

Scott C. Douglas; Andrzej Cichocki; Shun-ichi Amari



A bound optimization approach to wavelet-based image deconvolution  

Microsoft Academic Search

We address the problem of image deconvolution under lp norm (and other) penalties expressed in the wavelet domain. We propose an algorithm based on the bound optimization approach; this approach allows deriving EM-type algorithms without using the concept of missing/hidden data. The algorithm has provable monotonicity with both orthogonal and redundant wavelet transforms. We also derive bounds

Mário A. T. Figueiredo; Robert D. Nowak
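
The EM-type algorithms that the bound-optimization view yields reduce, in their simplest form, to iterative shrinkage-thresholding. The sketch below makes the simplifying assumption that the sparsifying transform is the identity rather than a wavelet frame:

```python
import numpy as np

def ista_deconvolve(y, kernel, lam=0.05, iterations=200):
    """Iterative shrinkage-thresholding for l1-penalized deconvolution:
    a gradient step on the data term followed by a soft threshold.
    Assumes the blur kernel has unit maximum frequency gain."""
    x = np.zeros_like(y)
    for _ in range(iterations):
        residual = y - np.convolve(x, kernel, mode="same")
        x = x + np.correlate(residual, kernel, mode="same")  # H^T residual
        x = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)    # soft threshold
    return x
```

Each iteration provably decreases the penalized objective, which is the monotonicity property the abstract refers to.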



Deconvolution of Mineral Absorption Bands: An Improved Approach  

Microsoft Academic Search

Although visible and near-infrared reflectance spectra contain absorption bands that are characteristic of the composition and structure of the absorbing species, deconvolving a complex spectrum is nontrivial. An improved approach to spectral deconvolution is presented here that accurately represents absorption bands as discrete mathematical distributions and resolves composite absorption features into individual absorption bands. The frequently used Gaussian model of

Jessica M. Sunshine; Carle M. Pieters; Stephen F. Pratt
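
When band centers and widths are held fixed, resolving a composite feature into Gaussian bands reduces to a linear least-squares problem in the amplitudes. This is a deliberate simplification for illustration; the paper's improved model also fits band centers, widths, and shapes:

```python
import numpy as np

def fit_band_amplitudes(x, spectrum, centers, width):
    """Resolve a composite absorption feature into fixed-center Gaussian
    bands by linear least squares on the band amplitudes."""
    centers = np.asarray(centers, dtype=float)
    basis = np.exp(-((x[:, None] - centers[None, :]) ** 2)
                   / (2.0 * width ** 2))       # one column per band
    amps, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
    return amps
```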



Constrained Least Squares Filtering Algorithm for Ultrasound Image Deconvolution  

Microsoft Academic Search

A new medical ultrasound tissue model is considered in this paper, which incorporates random fluctuations of the tissue response and provides more realistic interpretation of the received pulse-echo ultrasound signal. Using this new model, we propose an algorithm for restoration of the degraded ultrasound image. The proposed deconvolution is a modification of the classical regularization technique which combines Wiener filter

Wee-Soon Yeoh; Cishen Zhang
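
The classical regularization technique that the proposed method modifies is the Wiener filter; a frequency-domain sketch for a 1-D pulse-echo signal (an illustrative baseline, not the paper's modified scheme):

```python
import numpy as np

def wiener_deconvolve(observed, pulse, noise_power=1e-2):
    """Frequency-domain Wiener deconvolution of a 1-D signal.
    noise_power acts as the noise-to-signal ratio regularizer."""
    n = len(observed)
    H = np.fft.fft(pulse, n)
    Y = np.fft.fft(observed)
    # Wiener filter: conj(H) / (|H|^2 + noise-to-signal ratio)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft(W * Y))
```

Raising `noise_power` trades resolution for noise suppression, the same trade-off the constrained least-squares formulation controls with its regularization term.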



Multi-Parseval frame-based nonconvex sparse image deconvolution  

NASA Astrophysics Data System (ADS)

Image deconvolution is an ill-posed, low-level vision task, restoring a clear image from the blurred and noisy observation. From the perspective of statistics, previous work on image deconvolution has been formulated as a maximum a posteriori or a general Bayesian inference problem, with Gaussian or heavy-tailed non-Gaussian prior image models (e.g., a Student's t distribution). We propose a Parseval frame-based nonconvex image deconvolution strategy via penalizing the l0-norm of the coefficients of multiple different Parseval frames. With these frames, flexible filtering operators are provided to adaptively capture the point singularities, the curvilinear edges and the oscillating textures in natural images. The proposed optimization problem is implemented by borrowing the idea of the recent penalty decomposition method, resulting in a simple and efficient iteration algorithm. Experimental results show that the proposed deconvolution scheme is highly competitive among state-of-the-art methods, in both the improvement of signal-to-noise ratio and visual perception.

Shao, Wen-Ze; Deng, Hai-Song; Wei, Zhi-Hui



Blind Image Deconvolution via Particle Swarm Optimization with Entropy Evaluation  

Microsoft Academic Search

This study addresses a blind image deconvolution which uses only blurred image and tiny point spread function (PSF) information to restore the original image. In order to mitigate the problem trapping into a local solution in conventional algorithms, the evolutionary learning is reasonably to apply to this task. In this paper, particle swarm optimization (PSO) is therefore utilized to seek

Tsung-ying Sun; Chan-cheng Liu; Yu-peng Jheng; Jyun-hong Jheng; Shang-jeng Tsai; Sheng-ta Hsieh



A globally convergent approach for blind MIMO adaptive deconvolution  

Microsoft Academic Search

We discuss the blind deconvolution of multiple input/multiple output (MIMO) linear convolutional mixtures and propose a set of hierarchical criteria motivated by the maximum entropy principle. The proposed criteria are based on the constant-modulus (CM) criterion in order to guarantee that all minima achieve perfect restoration of the different sources. The approach is moreover robust to errors in channel order estimation.

Azzédine Touzni; Inbar Fijalkow; Michael G. Larimore; John R. Treichler



An approximate deconvolution procedure for large-eddy simulation  

Microsoft Academic Search

An alternative approach to large-eddy simulation based on approximate deconvolution (ADM) is developed. The main ingredient is an approximation of the nonfiltered field by truncated series expansion of the inverse filter operator. A posteriori tests for decaying compressible isotropic turbulence show excellent agreement with direct numerical simulation. The computational overhead of ADM is similar to that of a scale-similarity model

S. Stolz; N. A. Adams
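
The ADM ingredient, approximating the unfiltered field by a truncated series expansion of the inverse filter operator, can be sketched in one dimension. The kernel and truncation order below are illustrative choices, not the paper's:

```python
import numpy as np

def adm_deconvolve(filtered, g_kernel, order=5):
    """Approximate deconvolution: u* = sum_{k=0..N} (I - G)^k applied to
    the filtered field, where G is the (invertible-in-truncation) filter."""
    term = filtered.copy()
    estimate = filtered.copy()
    for _ in range(order):
        term = term - np.convolve(term, g_kernel, mode="same")  # (I - G) term
        estimate = estimate + term
    return estimate
```

Since each Fourier mode is recovered with gain 1 - (1 - G)^(N+1), low-wavenumber content is restored almost exactly while poorly resolved modes stay damped, which is what makes the truncation usable in LES.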



Blind deconvolution of ultrasonic signals in nondestructive testing applications  

Microsoft Academic Search

Advanced nondestructive testing techniques use a laser to generate ultrasonic waves at the surface of a test material. An air-coupled transducer receives the ultrasound that is the convolution of the signal leaving the test material and the distortion function. Blind deconvolution methods are applied to estimate the signal leaving the material

A. K. Nandi; D. Mampel; B. Roscher



Automation Tools for Finite Element Analysis of Adhesively Bonded Joints.  

National Technical Information Service (NTIS)

This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint Finite Element model, in which the adhesive is characterised using springs, these au...

F. Tahmasebi



Breast image feature learning with adaptive deconvolutional networks  

NASA Astrophysics Data System (ADS)

Feature extraction is a critical component of medical image analysis. Many computer-aided diagnosis approaches employ hand-designed, heuristically extracted lesion features. An alternative approach is to learn features directly from images. In this preliminary study, we explored the use of Adaptive Deconvolutional Networks (ADN) for learning high-level features in diagnostic breast mass lesion images with potential application to computer-aided diagnosis (CADx) and content-based image retrieval (CBIR). ADNs (Zeiler et al., 2011) are recently proposed unsupervised, generative hierarchical models that decompose images via convolutional sparse coding and max pooling. We trained the ADNs to learn multiple layers of representation for two breast image data sets on two different modalities (739 full field digital mammography (FFDM) and 2393 ultrasound images). Feature map calculations were accelerated by use of GPUs. Following Zeiler et al., we applied the Spatial Pyramid Matching (SPM) kernel (Lazebnik et al., 2006) on the inferred feature maps and combined this with a linear support vector machine (SVM) classifier for the task of binary classification between cancer and non-cancer breast mass lesions. Non-linear, local structure preserving dimension reduction, Elastic Embedding (Carreira-Perpiñán, 2010), was then used to visualize the SPM kernel output in 2D and qualitatively inspect image relationships learned. Performance was found to be competitive with current CADx schemes that use human-designed features, e.g., achieving a 0.632+ bootstrap AUC (by case) of 0.83 [0.78, 0.89] for an ultrasound image set (1125 cases).

Jamieson, Andrew R.; Drukker, Karen; Giger, Maryellen L.



A Library of Cortical Morphology Analysis Tools to Study Development, Aging and Genetics of Cerebral Cortex  

Microsoft Academic Search

Sharing of analysis techniques and tools is among the main driving forces of modern neuroscience. We describe a library of tools developed to quantify global and regional differences in cortical anatomy in high resolution structural MR images. This library is distributed as a plug-in application for popular structural analysis software, BrainVisa (BV). It contains tools to measure global and regional

Peter Kochunov; William Rogers; Jean-Francois Mangin; Jack Lancaster


Development and Application of a Military Intelligence (MI) Job Comparison and Analysis Tool (JCAT).  

National Technical Information Service (NTIS)

As part of a military intelligence MOS-intelligence/electronic warfare (IEW) analysis method, the Job Comparison and Analysis Tool (JCAT) was developed to identify military occupational specialty (MOS) capabilities and IEW system demands in terms of abili...

S. Seven A. Akman F. A. Muckler B. G. Knapp D. Burstein



Silencer! A Tool for Substrate Noise Coupling Analysis.  

National Technical Information Service (NTIS)

This thesis presents Silencer!, a fully automated, schematic-driven tool for substrate noise coupling simulation and analysis. It has been integrated into the CADENCE DFII environment and seamlessly enables substrate coupling analysis in...

P. Birrer



Image restoration for confocal microscopy: improving the limits of deconvolution, with application to the visualization of the mammalian hearing organ.  

PubMed Central

Deconvolution algorithms have proven very effective in conventional (wide-field) fluorescence microscopy. Their application to confocal microscopy is hampered, in biological experiments, by the presence of important levels of noise in the images and by the lack of a precise knowledge of the point spread function (PSF) of the system. We investigate the application of wavelet-based processing tools to deal with these problems, in particular wavelet denoising methods, which turn out to be very effective in application to three-dimensional confocal images. When used in combination with more classical deconvolution algorithms, these methods provide a robust and efficient restoration scheme allowing one to deal with difficult imaging conditions. To make our approach applicable in practical situations, we measured the PSF of a Biorad-MRC1024 confocal microscope under a large set of imaging conditions, including in situ acquisitions. As a specific biological application, we present several examples of restorations of three-dimensional confocal images acquired inside an intact preparation of the hearing organ. We also provide a quantitative assessment of the gain in quality achieved by wavelet-aided restorations over classical deconvolution schemes, based on a set of numerical experiments that we performed with test images.

Boutet de Monvel, J; Le Calvez, S; Ulfendahl, M
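
Wavelet soft-threshold denoising of the kind used here as a preprocessing step can be sketched with a one-level Haar transform. This 1-D toy stands in for the paper's three-dimensional multiscale scheme:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold denoising of an even-length
    1-D signal: shrink the detail coefficients, keep the approximation."""
    even, odd = signal[0::2], signal[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(signal, dtype=float)   # inverse Haar transform
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out
```

For a smooth underlying signal the detail channel is dominated by noise, so shrinking it reduces the error before any deconvolution is attempted.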



Application of parallel computing to the Monte Carlo simulation of electron scattering in solids: A rapid method for profile deconvolution  

SciTech Connect

X-ray microanalysis by analytical electron microscopy (AEM) has proven to be a powerful tool for characterizing the spatial distribution of solute elements in materials. True compositional variations over spatial scales smaller than the actual resolution for microanalysis can be determined if the measured composition profile is deconvoluted. Explicit deconvolutions of such data, via conventional techniques such as Fourier transforms, are not possible due to statistical noise in AEM microanalytical data. Hence, the method of choice is to accomplish the deconvolution via iterative convolutions. In this method, a function describing the assumed true composition profile, calculated by physically permissible thermodynamic and kinetic modeling, is convoluted with the x-ray generation function and the result compared to the measured composition profile. If the measured and calculated profiles agree within experimental error, it is assumed that the true compositional profile has been determined. If the measured and calculated composition profiles are in disagreement, the assumptions in the physical model are adjusted and the convolution process repeated. To employ this procedure it is necessary to calculate the x-ray generation function explicitly. While a variety of procedures are available for calculating this function, the most accurate procedure is to use Monte Carlo modeling of electron scattering.

Romig, A.D. Jr.; Plimpton, S.J.; Michael, J.R. (Sandia National Labs., Albuquerque, NM (USA)); Myklebust, R.L.; Newbury, D.E. (National Inst. of Standards and Technology, Gaithersburg, MD (USA))
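
The iterative-convolution idea, convolving trial composition profiles with the known broadening function and keeping the best match, is reduced below to a one-parameter grid search for brevity. The smooth-step trial profile is a hypothetical stand-in for the thermodynamic/kinetic model the abstract describes:

```python
import numpy as np

def fit_by_forward_convolution(measured, x, spread, widths):
    """Deconvolution via iterative convolution: blur candidate profiles
    with the known x-ray generation (spread) function and keep the
    candidate whose blurred version best matches the measurement."""
    best_w, best_err = None, np.inf
    for w in widths:
        trial = 0.5 * (1.0 + np.tanh(x / w))        # trial composition step
        blurred = np.convolve(trial, spread, mode="same")
        err = np.sqrt(np.mean((blurred - measured) ** 2))
        if err < best_err:
            best_w, best_err = w, err
    return best_w
```

This avoids explicit inversion entirely, which is why it tolerates the statistical noise that defeats Fourier deconvolution of AEM data.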



Mesh-based spherical deconvolution: A flexible approach to reconstruction of non-negative fiber orientation distributions  

PubMed Central

Diffusion-weighted MRI has enabled the imaging of white matter architecture in vivo. Fiber orientations have classically been assumed to lie along the major eigenvector of the diffusion tensor, but this approach has well-characterized shortcomings in voxels containing multiple fiber populations. Recently proposed methods for recovery of fiber orientation via spherical deconvolution utilize a spherical harmonics framework and are susceptible to noise, yielding physically-invalid results even when additional measures are taken to minimize such artifacts. In this work, we reformulate the spherical deconvolution problem onto a discrete spherical mesh. We demonstrate how this formulation enables the estimation of fiber orientation distributions which strictly satisfy the physical constraints of realness, symmetry, and non-negativity. Moreover, we analyze the influence of the flexible regularization parameters included in our formulation for tuning the smoothness of the resultant fiber orientation distribution (FOD). We show that the method is robust and reliable by reconstructing known crossing fiber anatomy in multiple subjects. Finally, we provide a software tool for computing the FOD using our new formulation in hopes of simplifying and encouraging the adoption of spherical deconvolution techniques.

Patel, Vishal; Shi, Yonggang; Thompson, Paul M.; Toga, Arthur W.
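
On a discrete mesh, fitting a non-negative fiber orientation distribution amounts to a constrained linear inverse problem. The projected-gradient solver below is a minimal stand-in for the paper's formulation; it omits the smoothness regularization the authors analyze:

```python
import numpy as np

def nonneg_deconvolve(A, b, iterations=500, step=None):
    """Solve min ||A x - b||^2 subject to x >= 0 by projected gradient
    descent, enforcing non-negativity exactly at every iterate."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step from spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iterations):
        grad = A.T @ (A @ x - b)
        x = np.clip(x - step * grad, 0.0, None)  # project onto x >= 0
    return x
```

Unlike an unconstrained spherical-harmonics fit, the iterate can never go negative, which mirrors the physical-validity constraint motivating the mesh-based reformulation.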




Microsoft Academic Search

In aerospace, carbon fibre-reinforced polymer (CFRP) materials and postbuckling skin-stiffened structures are key technologies that have been used to improve structural efficiency. However, the application of composite postbuckling structures in aircraft has been limited as today's analysis tools cannot accurately predict structural collapse in compression. In this work, a finite element analysis tool for design and certification of aerospace structures




New Geant4 based simulation tools for space radiation shielding and effects analysis  

Microsoft Academic Search

We present here a set of tools for space applications based on the Geant4 simulation toolkit, developed for radiation shielding analysis as part of the European Space Agency (ESA) activities in the Geant4 collaboration. The Sector Shielding Analysis Tool (SSAT) and the Materials and Geometry Association (MGA) utility will first be described. An overview of the main features of the

G. Santin; P. Nieminen; H. Evans; E. Daly; F. Lei; P. R. Truscott; C. S. Dyer; B. Quaghebeur; D. Heynderickx



Securing Java code: heuristics and an evaluation of static analysis tools  

Microsoft Academic Search

A secure coding standard for Java does not exist. Even if a standard did exist, it is not known how well static analysis tools could enforce it. In this work, we show how well eight static analysis tools can identify violations of a comprehensive collection of coding heuristics for increasing the quality and security of Java SE code. A new

Michael S. Ware; Christopher J. Fox



XAssist: A Tool For Automated Analysis of Automated X-ray Data Analysis Data  

Microsoft Academic Search

XAssist (http://…) is a tool (available for download) for automatically downloading, reducing and performing preliminary analysis on Chandra, ASCA, ROSAT and (eventually) XMM-Newton data. The system is capable of reprocessing data, running source detection algorithms (a built-in routine is used for ASCA and ROSAT data, CIAO's wavdetect is used for Chandra data), determining median background levels, filtering flares from background

A. Ptak



Tools for Scalable Parallel Program Analysis - Vampir NG and DeWiz  

Microsoft Academic Search

Large scale high-performance computing systems pose a tough obstacle for today's program analysis tools. Their demands in computational performance and memory capacity for processing program analysis data exceed the capabilities of standard workstations and traditional analysis tools. The sophisticated approaches of Vampir NG (VNG) and the Debugging Wizard DeWiz intend to provide novel ideas for scalable parallel program analysis. While

Holger Brunst; Dieter Kranzlmüller; Wolfgang E. Nagel



pathFinder: A Static Network Analysis Tool for Pharmacological Analysis of Signal Transduction Pathways  

NSDL National Science Digital Library

The study of signal transduction is becoming a de facto part of the analysis of gene expression and protein profiling techniques. Many online tools are used to cluster genes in various ways or to assign gene products to signal transduction pathways. Among these, pathFinder is a unique tool that can find signal transduction pathways between first, second, or nth messengers and their targets within the cell. pathFinder can identify qualitatively all possible signal transduction pathways connecting any starting component and target within a database of two-component pathways (directional dyads). One or more intermediate pathway components can be excluded to simulate the use of pharmacological inhibitors or genetic deletion (knockout). Missing elements in a pathway connecting the activator or initiator and target can also be inferred from a null pathway result. The value of this static network analysis tool is illustrated by the prediction from pathFinder analysis of a novel cyclic AMP–dependent, protein kinase A–independent signaling pathway in neuroendocrine cells, which has been experimentally confirmed.

Babru B. Samal (NIH, National Institute of Mental Health Intramural Research Programs (NIMH-IRP) Bioinformatics Core); Lee E. Eiden (NIH, Section on Molecular Neuroscience)
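
The dyad-graph search pathFinder performs can be sketched as breadth-first enumeration of simple paths with optional node exclusion. The component names in the example are illustrative (cAMP and PKA appear in the abstract; Epac and CREB are hypothetical entries, not taken from the pathFinder database):

```python
from collections import deque

def find_pathways(dyads, start, target, excluded=frozenset()):
    """Enumerate all simple signal-transduction pathways from start to
    target in a database of directional dyads (edges). Excluding nodes
    mimics pharmacological inhibition or genetic knockout."""
    graph = {}
    for src, dst in dyads:
        graph.setdefault(src, []).append(dst)
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in excluded and nxt not in path:  # simple paths only
                queue.append(path + [nxt])
    return paths
```

Excluding the kinase node and still finding a route is exactly the kind of query that suggested the PKA-independent pathway mentioned in the abstract.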



Protein kinase structure and function analysis with chemical tools.  


Protein kinases are the largest enzyme superfamily involved in cell signal transduction and represent therapeutic targets for a range of diseases. There have been intensive efforts from many labs to understand their catalytic mechanisms, discover inhibitors and discern their cellular functions. In this review, we will describe two approaches developed to analyze protein kinases: bisubstrate analog inhibition and phosphonate analog utilization. Both of these methods have been used in combination with the protein semisynthesis method, expressed protein ligation, to advance our understanding of kinase-substrate interactions and the functional elucidation of phosphorylation. Previous work on the nature of the protein kinase mechanism suggests it follows a dissociative transition state. A bisubstrate analog was designed against the insulin receptor kinase to mimic the geometry of a dissociative transition state reaction coordinate distance. This bisubstrate compound proved to be a potent inhibitor against the insulin receptor kinase and occupied both peptide and nucleotide binding sites. Bisubstrate compounds with altered hydrogen bonding potential as well as varying spacers between the adenine and the peptide demonstrate the importance of the original design features. We have also shown that related bisubstrate analogs can be used to potently block serine/threonine kinases including protein kinase A. Since many protein kinases recognize folded protein substrates for efficient phosphorylation, it was advantageous to incorporate the peptide-ATP conjugates into protein structures. Using expressed protein ligation, a Src-ATP conjugate was produced and shown to be a high affinity ligand for the Csk tyrosine kinase. Nonhydrolyzable mimics of phosphoSer/phosphoTyr can be useful in examining the functionality of phosphorylation events. 
Using expressed protein ligation, we have employed phosphonomethylene phenylalanine and phosphonomethylene alanine to probe the phosphorylation of Tyr and Ser, respectively. These tools have permitted an analysis of the SH2-phosphatases (SHP1 and SHP2), revealing a novel intramolecular stimulation of catalytic activity mediated by the corresponding phosphorylation events. They have also been used to characterize the cellular regulation of the melatonin rhythm enzyme by phosphorylation. PMID:16213197

Shen, Kui; Hines, Aliya C; Schwarzer, Dirk; Pickin, Kerry A; Cole, Philip A



An Evaluation of Visual and Textual Network Analysis Tools  

SciTech Connect

User testing is an integral component of user-centered design, but has only rarely been applied to visualization for cyber security applications. This article presents the results of a comparative evaluation between a visualization-based application and a more traditional, table-based application for analyzing computer network packet captures. We conducted this evaluation as part of the user-centered design process. Participants performed both structured, well-defined tasks and exploratory, open-ended tasks with both tools. We measured accuracy and efficiency for the well-defined tasks, counted insights for the exploratory tasks, and recorded user perceptions of each tool. The results of this evaluation demonstrated that users performed significantly more accurately on the well-defined tasks, discovered more insights, and showed a clear preference for the visualization tool. The study design presented may be useful for future researchers performing user testing on visualization for cyber security applications.

Goodall, John R [ORNL



Discriminating adenocarcinoma from normal colonic mucosa through deconvolution of Raman spectra.  


In this work, we considered the feasibility of Raman spectroscopy for discriminating between adenocarcinomatous and normal mucosal formalin-fixed colonic tissues. Unlike earlier studies in colorectal cancer, a spectral deconvolution model was implemented to derive spectral information. Eleven samples of human colon were used, and 55 spectra were analyzed. Each spectrum was resolved into 25 bands from 975 to 1720 cm(-1), where modes of proteins, lipids, and nucleic acids are observed. From a comparative study of band intensities, those presenting higher differences between tissue types were correlated to biochemical assignments. Results from the fitting procedure were further used as inputs for linear discriminant analysis, where combinations of band intensities and intensity ratios were tested, yielding accuracies up to 81%. This analysis yields objective discriminating parameters after fitting optimization. The bands with higher diagnostic relevance detected by spectral deconvolution make it possible to confine the study to a few spectral regions instead of broader ranges. A critical view upon the limitations of this approach is presented, along with a comparison of our results to earlier ones obtained in fresh colonic tissues. This enabled us to assess the effect of formalin fixation on colonic tissues and determine its relevance to the present analysis. PMID:22191931
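
The linear-discriminant step described above can be sketched with a plain two-class Fisher discriminant on band intensities. This is a generic numpy illustration on synthetic two-feature data, not the study's fitted Raman bands:

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher linear discriminant.

    Returns the projection vector w and a decision threshold placed at the
    midpoint of the projected class means.
    """
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: pooled (unnormalized) covariance of both classes
    Sw = np.cov(X0.T, bias=True) * len(X0) + np.cov(X1.T, bias=True) * len(X1)
    w = np.linalg.solve(Sw, m1 - m0)          # direction maximizing separation
    thr = 0.5 * ((X0 @ w).mean() + (X1 @ w).mean())
    return w, thr
```

A sample is assigned to class 1 when its projection `x @ w` exceeds `thr`; with well-separated band intensities this reproduces the kind of discrimination accuracy the abstract reports.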

Lopes, Patricia Cambraia; Moreira, Joaquim Agostinho; Almeida, Abilio; Esteves, Artur; Gregora, Ivan; Ledinsky, Martin; Lopes, Jose Machado; Henrique, Rui; Oliveira, Albino



Discriminating adenocarcinoma from normal colonic mucosa through deconvolution of Raman spectra  

NASA Astrophysics Data System (ADS)

In this work, we considered the feasibility of Raman spectroscopy for discriminating between adenocarcinomatous and normal mucosal formalin-fixed colonic tissues. Unlike earlier studies in colorectal cancer, a spectral deconvolution model was implemented to derive spectral information. Eleven samples of human colon were used, and 55 spectra were analyzed. Each spectrum was resolved into 25 bands from 975 to 1720 cm-1, where modes of proteins, lipids, and nucleic acids are observed. From a comparative study of band intensities, those presenting higher differences between tissue types were correlated to biochemical assignments. Results from the fitting procedure were further used as inputs for linear discriminant analysis, where combinations of band intensities and intensity ratios were tested, yielding accuracies up to 81%. This analysis yields objective discriminating parameters after fitting optimization. The bands with higher diagnostic relevance detected by spectral deconvolution make it possible to confine the study to a few spectral regions instead of broader ranges. A critical view upon the limitations of this approach is presented, along with a comparison of our results to earlier ones obtained in fresh colonic tissues. This enabled us to assess the effect of formalin fixation on colonic tissues and determine its relevance to the present analysis.

Cambraia Lopes, Patricia; Moreira, Joaquim Agostinho; Almeida, Abilio; Esteves, Artur; Gregora, Ivan; Ledinsky, Martin; Lopes, Jose Machado; Henrique, Rui; Oliveira, Albino



Deconvolution-Based CT and MR Brain Perfusion Measurement: Theoretical Model Revisited and Practical Implementation Details  

PubMed Central

Deconvolution-based analysis of CT and MR brain perfusion data is widely used in clinical practice and it is still a topic of ongoing research activities. In this paper, we present a comprehensive derivation and explanation of the underlying physiological model for intravascular tracer systems. We also discuss practical details that are needed to properly implement algorithms for perfusion analysis. Our description of the practical computer implementation is focused on the most frequently employed algebraic deconvolution methods based on the singular value decomposition. In particular, we further discuss the need for regularization in order to obtain physiologically reasonable results. We include an overview of relevant preprocessing steps and provide numerous references to the literature. We cover both CT and MR brain perfusion imaging in this paper because they share many common aspects. The combination of both the theoretical as well as the practical aspects of perfusion analysis explicitly emphasizes the simplifications to the underlying physiological model that are necessary in order to apply it to measured data acquired with current CT and MR scanners.
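
A minimal numpy sketch of the truncated-SVD deconvolution the abstract refers to follows. The discretization of the arterial input function (AIF) into a lower-triangular Toeplitz matrix and the truncation threshold `lam` are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def tsvd_deconvolve(aif, conc, dt, lam=0.2):
    """Recover the flow-scaled residue function from a tissue concentration
    curve by truncated-SVD deconvolution of the arterial input function."""
    n = len(aif)
    # Discrete convolution with the AIF as a lower-triangular Toeplitz matrix
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    # Regularize: zero out singular values below lam * s_max
    s_inv = np.where(s > lam * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ conc))
```

In practice the threshold (and more elaborate regularization) is what keeps the recovered residue function physiologically reasonable in the presence of noise, as the paper discusses.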

Fieselmann, Andreas; Kowarschik, Markus; Ganguly, Arundhuti; Hornegger, Joachim; Fahrig, Rebecca



Automated Multivariate Optimization Tool for Energy Analysis: Preprint  

SciTech Connect

Building energy simulations are often used for trial-and-error evaluation of ''what-if'' options in building design--a limited search for an optimal solution, or ''optimization''. Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.



Translational meta-analysis tool for temporal gene expression profiles.  


The widespread use of microarray technology has led to highly complex datasets that often address similar or related biological questions. In translational medicine, research is often based on measurements obtained at different points in time, which the researcher views as a progression over time. If a biological stimulus shows an effect on a particular gene that is reversed over time, this would show, for instance, as a peak in the gene's temporal expression profile. Our program SPOT helps researchers find these patterns in large sets of microarray data. We created the software tool using open-source platforms and the Semantic Web tool Protégé-OWL. PMID:22874385

Tusch, Guenter; Tole, Olvi



Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).  


This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings. PMID:18406925
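
One concrete piece of the FMEA method described above is the risk priority number, RPN = severity × occurrence × detectability, used to rank failure modes. A toy sketch with invented radiotherapy failure modes and scores (not taken from the article):

```python
# Hypothetical failure modes scored 1-10 on severity (S), occurrence (O),
# and detectability (D); the product RPN = S * O * D drives prioritization.
failure_modes = [
    ("wrong patient setup", 9, 3, 4),
    ("dose miscalculation", 10, 2, 2),
    ("delayed chart review", 4, 6, 5),
]

# Rank failure modes by descending RPN
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN={s * o * d}")
```

Note how a low-severity but frequent, hard-to-detect failure can outrank a catastrophic but rare and detectable one, which is exactly why the article cautions that these tools must be applied with understanding.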

Rath, Frank



Design tools for daylighting illumination and energy analysis  

Microsoft Academic Search

This paper reviews the problems and potential of using daylighting to provide illumination in building interiors. It describes some of the design tools now or soon to be available for incorporating daylighting into the building design process. It also describes state-of-the-art methods for analyzing the impacts daylighting can have on selection of lighting controls, lighting energy consumption, heating and cooling loads,




TAPAs: A Tool for the Analysis of Process Algebras  

Microsoft Academic Search

Process Algebras are formalisms for modelling concurrent systems that permit mathematical reasoning with respect to a set of desired properties. TAPAs is a tool that can be used to support the use of process algebras to specify and analyze concurrent systems. It does not aim at guaranteeing high performance, but has been developed as a support to teaching.

Francesco Calzolai; Rocco De Nicola; Michele Loreti; Francesco Tiezzi




EPA Science Inventory

Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...


A semiotic analysis of iMarketing tools  

Microsoft Academic Search

This paper tries to point out current developments in the commercialization of the Internet and its various effects on the World Wide Web. Approaching Hypertext Theory from the viewpoint of applied Semiotics, the author analyzes recently developed Internet Marketing Tools such as Banner Ad Keying and Keywords in Discussion Groups.

Moritz Neumüller



Analysis of the Electrical Engineering Problems Using Computer Tools  

Microsoft Academic Search

This paper shows possibilities of using computer tools in Basic Circuit Theory. There is a brief summary of the computer approach to Electrical Engineering Education in the world, which we found among books, papers, and web presentations. The survey of the computer exercises prepared at the Department of Circuit Theory of the Czech Technical University in Prague is introduced.

Vaclav Havlicek; Roman Cmejla


Clinical decision support tools: analysis of online drug information databases  

Microsoft Academic Search

BACKGROUND: Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information

Kevin A Clauson; Wallace A Marsh; Hyla H Polen; Matthew J Seamon; Blanca I Ortiz



Regional energy planning through SWOT analysis and strategic planning tools  

Microsoft Academic Search

Strategic planning processes, which are commonly used as a tool for region development and territorial structuring, can be harnessed by politicians and public administrations, at the local level, to redesign the regional energy system and encourage renewable energy development and environmental preservation. In this sense, the province of Jaén, a southern Spanish region whose economy is mainly based on olive

J. Terrados; G. Almonacid; L. Hontoria



Sensitivity analysis of volatility - a new tool for risk management  

Microsoft Academic Search

The extension of GARCH models to the multivariate setting has been fraught with difficulties. In this paper, we suggest to work with univariate portfolio GARCH models. We show how the multivariate dimension of the portfolio allocation problem may be recovered from the univariate approach. The main tool we use is the

Simone Manganelli; Vladimiro Ceci; Walter Vecchiato



A flexible tool for scenario analysis of network demand  

NASA Astrophysics Data System (ADS)

This is another in a sequence of papers reporting on the development of innovative methods and tools for estimating demand requirements for network supply capabilities. An extension of the demand estimation methodology, this paper focuses on steps required to assess the adequacy of performance of candidate networks by means of an integrated tool. The steps include mapping units in a scenario to units in the associated database to determine their aggregate demand, developing an appropriate logical network with computational constraints dictated by the scenario, and calculating inter-unit demand of the units in the logical network. Because of the complexity of the end-to-end process, assuring repeatability while facilitating rapid exploration of issues is a challenge. Earlier tools implementing this process were fragmented and prone to error, requiring significant analyst effort to accomplish even the smallest changes. To address these limitations, the process has been implemented in an easy to use, integrated tool. This allows complete flexibility in manipulating data and promotes rapid but repeatable analyses of tailored scenarios.

O'Donnel, Jack E.; George, Ayanah S.; Wynn, Danielle M.; Brett, Samuel W.; Ridder, Jeffrey P.; Signori, David T.; Schoenborn, Heather W.



A requirements analysis for videogame design support tools  

Microsoft Academic Search

Designing videogames involves weaving together systems of rules, called game mechanics, which support and structure compelling player experiences. Thus a significant portion of game design involves reasoning about the effects of different potential game mechanics on player experience. Unlike some design fields, such as architecture and mechanical design, that have CAD tools to support designers

Mark J. Nelson; Michael Mateas



Tools for integrated sequence-structure analysis with UCSF Chimera  

Microsoft Academic Search

Background: Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure

Elaine C. Meng; Eric F. Pettersen; Gregory S. Couch; Conrad C. Huang; Thomas E. Ferrin



Tools for Education Policy Analysis [with CD-ROM].  

ERIC Educational Resources Information Center

|This manual contains a set of tools to assist policymakers in analyzing and revamping educational policy. Its main focus is on some economic and financial aspects of education and selected features in the arrangements for service delivery. Originally offered as a series of training workshops for World Bank staff to work with clients in the…

Mingat, Alain; Tan, Jee-Peng


Mapping and spatiotemporal analysis tool for hydrological data: Spellmap  

Technology Transfer Automated Retrieval System (TEKTRAN)

The lack of data management and analysis tools is one of the major limitations to effectively evaluating and using large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...


Adaptive deconvolution based on spectral decomposition  

NASA Astrophysics Data System (ADS)

An adaptive algorithm for estimating the input to a linear system is presented. This explicit self-tuning filter is based on the identification of an innovations model. From that model, input and measurement noise ARMA-descriptions are decomposed, using second order moments. Identifiability results guarantee a unique decomposition. Main tools in the algorithm are the solution of two linear systems of equations. The filter design is based on the polynomial approach to Wiener filtering.
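
The end product of such a design is a Wiener-type input estimator. As a simplified, non-adaptive stand-in for the authors' polynomial approach, a frequency-domain Wiener deconvolution with an assumed known channel `h` and signal-to-noise ratio can be sketched as:

```python
import numpy as np

def wiener_deconvolve(y, h, snr=100.0):
    """Frequency-domain Wiener deconvolution: G = H* / (|H|^2 + 1/snr).

    y: observed output (assumed circular convolution of input with h);
    h: known channel impulse response; snr: assumed signal-to-noise ratio.
    """
    n = len(y)
    H = np.fft.fft(h, n)                      # zero-padded channel response
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(np.fft.fft(y) * G))
```

The `1/snr` term plays the same regularizing role as the noise model in the self-tuning filter: it keeps the estimator bounded where the channel response is small.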

Ahlen, Anders; Sternad, Mikael



A pitfall in the reconstruction of fibre ODFs using spherical deconvolution of diffusion MRI data.  


Diffusion weighted (DW) MRI facilitates non-invasive quantification of tissue microstructure and, in combination with appropriate signal processing, three-dimensional estimates of fibrous orientation. In recent years, attention has shifted from the diffusion tensor model, which assumes a unimodal Gaussian diffusion displacement profile to recover fibre orientation (with various well-documented limitations), towards more complex high angular resolution diffusion imaging (HARDI) analysis techniques. Spherical deconvolution (SD) approaches assume that the fibre orientation density function (fODF) within a voxel can be obtained by deconvolving a 'common' single fibre response function from the observed set of DW signals. In practice, this common response function is not known a priori and thus an estimated fibre response must be used. Here the establishment of this single-fibre response function is referred to as 'calibration'. This work examines the vulnerability of two different SD approaches to inappropriate response function calibration: (1) constrained spherical harmonic deconvolution (CSHD)--a technique that exploits spherical harmonic basis sets and (2) damped Richardson-Lucy (dRL) deconvolution--a technique based on the standard Richardson-Lucy deconvolution. Through simulations, the impact of a discrepancy between the calibrated diffusion profiles and the observed ('Target') DW-signals in both single and crossing-fibre configurations was investigated. The results show that CSHD produces spurious fODF peaks (consistent with well known ringing artefacts) as the discrepancy between calibration and target response increases, while dRL demonstrates a lower over-all sensitivity to miscalibration (with a calibration response function for a highly anisotropic fibre being optimal). However, dRL demonstrates a reduced ability to resolve low anisotropy crossing-fibres compared to CSHD. 
It is concluded that the range and spatial-distribution of expected single-fibre anisotropies within an image must be carefully considered to ensure selection of the appropriate algorithm, parameters and calibration. Failure to choose the calibration response function carefully may severely impact the quality of any resultant tractography. PMID:23085109
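
The standard Richardson-Lucy update on which dRL is built can be sketched in one dimension. This is the plain, undamped iteration, not the authors' damped variant:

```python
import numpy as np

def richardson_lucy(y, psf, n_iter=50, eps=1e-12):
    """Iterative RL deconvolution: x <- x * (psf_flipped * (y / (psf * x))),
    where * denotes convolution. Preserves nonnegativity of the estimate."""
    x = np.full_like(y, y.mean())        # flat, nonnegative initial estimate
    psf_flip = psf[::-1]                 # adjoint of convolution with psf
    for _ in range(n_iter):
        blurred = np.convolve(x, psf, mode="same")
        ratio = y / np.maximum(blurred, eps)   # eps guards against divide-by-0
        x = x * np.convolve(ratio, psf_flip, mode="same")
    return x
```

The miscalibration issue the abstract studies corresponds here to running the iteration with a `psf` that differs from the one that actually produced `y`.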

Parker, G D; Marshall, D; Rosin, P L; Drage, N; Richmond, S; Jones, D K



Pyrosequencing data analysis software: a useful tool for EGFR, KRAS, and BRAF mutation analysis  

PubMed Central

Background Pyrosequencing is a new technology and can be used for mutation tests. However, its data analysis is a manual process and involves sophisticated algorithms. During this process, human errors may occur. A better way of analyzing pyrosequencing data is needed in the clinical diagnostic laboratory. Computer software is potentially useful for pyrosequencing data analysis. We have developed such software, which is able to perform pyrosequencing mutation data analysis for epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene homolog and v-raf murine sarcoma viral oncogene homolog B1. The input data for analysis includes the targeted nucleotide sequence, common mutations in the targeted sequence, pyrosequencing dispensing order, pyrogram peak order and peak heights. The output includes mutation type and percentage of mutant gene in the specimen. Results The data from 1375 pyrosequencing test results were analyzed using the software in parallel with manual analysis. The software was able to generate correct results for all 1375 cases. Conclusion The software developed is a useful molecular diagnostic tool for pyrosequencing mutation data analysis. This software can increase laboratory data analysis efficiency and reduce data analysis error rate. Virtual slides The virtual slide(s) for this article can be found here:



Proteomic tools for the analysis of transient interactions between metalloproteins.  


Metalloproteins play major roles in cell metabolism and signalling pathways. In many cases, they show moonlighting behaviour, acting in different processes, depending on the physiological state of the cell. To understand these multitasking proteins, we need to discover the partners with which they carry out such novel functions. Although many technological and methodological tools have recently been reported for the detection of protein interactions, specific approaches to studying the interactions involving metalloproteins are not yet well developed. The task is even more challenging for metalloproteins, because they often form short-lived complexes that are difficult to detect. In this review, we gather the different proteomic techniques and biointeractomic tools reported in the literature. All of them have shown their applicability to the study of transient and weak protein-protein interactions, and are therefore suitable for metalloprotein interactions. PMID:21352492

Martínez-Fábregas, Jonathan; Rubio, Silvia; Díaz-Quintana, Antonio; Díaz-Moreno, Irene; De la Rosa, Miguel Á



Independent component analysis as a tool for the dimensionality reduction and the representation of hyperspectral images  

Microsoft Academic Search

Independent component analysis (ICA) is a multivariate data analysis process largely studied these last years in the signal processing community for blind source separation. This paper proposes to show the interest of ICA as a tool for unsupervised analysis of hyperspectral images. The commonly used principal component analysis (PCA) is the mean square optimal projection for gaussian data leading to

M. Lennon; G. Mercier; M. C. Mouchot; L. Hubert-Moy



Tools for Analysis and Design of Distributed Resources—Part I: Tools for Feasibility Studies  

Microsoft Academic Search

The feasibility study of a power system is aimed at determining whether the system can adequately satisfy the load. The study will usually include other purposes, such as estimating the life-cycle cost of the system. This paper summarizes the main capabilities of current software packages for feasibility analysis of distributed energy resources. The document presents a short description

J. A. Martinez; J. Martin-Arnedo



Online Tool for Analysis of Denaturing Gradient Gel Electrophoresis Profiles  

PubMed Central

We present an online tool (EquiBands, that quantifies the matching of two bands considered to be the same in different samples, even when samples are applied to different denaturing gradient gel electrophoresis gels. With an environmental example we demonstrate the procedure for the classification of two bands of different samples with the help of EquiBands.

Huber, Florian; Peduzzi, Peter



Online tool for analysis of denaturing gradient gel electrophoresis profiles.  


We present an online tool (EquiBands, that quantifies the matching of two bands considered to be the same in different samples, even when samples are applied to different denaturing gradient gel electrophoresis gels. With an environmental example we demonstrate the procedure for the classification of two bands of different samples with the help of EquiBands. PMID:15240327

Huber, Florian; Peduzzi, Peter



Virtual Reality As Communication Tool: A Socio-Cognitive Analysis  

Microsoft Academic Search

Virtual Reality (VR) is usually described by the media as a particular collection of technological hardware: a computer capable of 3D real-time animation, a head-mounted display, data gloves equipped with one or more position trackers. However, this focus on technology is disappointing for communication researchers and VR designers. To overcome this limitation this paper describes VR as a communication tool:

Giuseppe Riva



Lectin Microarrays: Simple Tools for the Analysis of Complex Glycans  

Microsoft Academic Search

The emerging roles for post-translational modifications in the regulation of cellular function have turned the spotlight on glycosylation. Given the prevalence of protein and lipid glycosylation, it has become imperative to create and utilize new tools to study these critical biopolymers. In particular, there has been an emphasis on the development of high-throughput methodologies to study the structural and functional

Lakshmi Krishnamoorthy; Lara K. Mahal


Ontological analysis of gene expression data: current tools, limitations, and open problems  

PubMed Central

Summary Independent of the platform and the analysis methods used, the result of a microarray experiment is, in most cases, a list of differentially expressed genes. An automatic ontological analysis approach has been recently proposed to help with the biological interpretation of such results. Currently, this approach is the de facto standard for the secondary analysis of high throughput experiments and a large number of tools have been developed for this purpose. We present a detailed comparison of 14 such tools using the following criteria: scope of the analysis, visualization capabilities, statistical model(s) used, correction for multiple comparisons, reference microarrays available, installation issues and sources of annotation data. This detailed analysis of the capabilities of these tools will help researchers choose the most appropriate tool for a given type of analysis. More importantly, in spite of the fact that this type of analysis has been generally adopted, this approach has several important intrinsic drawbacks. These drawbacks are associated with all tools discussed and represent conceptual limitations of the current state-of-the-art in ontological analysis. We propose these as challenges for the next generation of secondary data analysis tools.

Khatri, Purvesh



Development of a Viscoelastic Finite Element Tool for Asphalt Pavement Low Temperature Cracking Analysis  

Microsoft Academic Search

This paper proposed and developed a tailored tool, “VE2D” for pavement low temperature cracking analysis based on viscoelastic two-dimensional (2D) finite element (FE) method. The tool can provide accurate thermal stress evaluation and thermal cracking prediction while considering the entire pavement structure rather than just the asphalt concrete layer. Also, this tool has four innovative features: Firstly, it incorporates the

Sheng Hu; Fujie Zhou; Lubinda F. Walubita




USGS Publications Warehouse

Observations of the earthquake source-time function are enhanced if path, recording-site, and instrument complexities can be removed from seismograms. Assuming that a small earthquake has a simple source, its seismogram can be treated as an empirical Green's function and deconvolved from the seismogram of a larger and/or more complex earthquake by spectral division. When the deconvolution is well posed, the quotient spectrum represents the apparent source-time function of the larger event. This study shows that with high-quality locally recorded earthquake data it is feasible to Fourier transform the quotient and obtain a useful result in the time domain. In practice, the deconvolution can be stabilized by one of several simple techniques. Application of the method is given. Refs.
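
One of the "simple techniques" for stabilizing such a spectral division is a water-level correction: the denominator is never allowed to fall below a fixed fraction of its maximum. A sketch (the water-level fraction `wl` is an illustrative parameter):

```python
import numpy as np

def waterlevel_deconvolve(big, green, wl=0.01):
    """Estimate an apparent source-time function by stabilized spectral
    division of a large event's record by an empirical Green's function.

    The denominator |G|^2 is clipped from below at wl * max(|G|^2),
    which bounds the quotient where the Green's function spectrum is small.
    """
    n = len(big)
    B = np.fft.fft(big)
    G = np.fft.fft(green, n)                       # zero-pad to common length
    denom = np.maximum(np.abs(G) ** 2, wl * np.max(np.abs(G) ** 2))
    return np.real(np.fft.ifft(B * np.conj(G) / denom))
```

When the division is well posed (the Green's function spectrum has no deep holes), the water level is never reached and the quotient is exact; otherwise it trades bias for stability, in the spirit of the study's stabilization step.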

Mueller, Charles S.



Soft constraints in nonlinear spectral fitting with regularized lineshape deconvolution.  


This article presents a novel method for incorporating a priori knowledge into regularized nonlinear spectral fitting as soft constraints. Regularization was recently introduced to lineshape deconvolution as a method for correcting spectral distortions. Here, the deconvoluted lineshape was described by a new type of lineshape model and applied to spectral fitting. The nonlinear spectral fitting was carried out in two steps that were subject to hard constraints and soft constraints, respectively. The hard constraints step provided a starting point and, therefore, only the changes of the relevant variables were constrained in the soft constraints step and incorporated into the linear substeps of the Levenberg-Marquardt algorithm. The method was demonstrated using localized averaged echo time point resolved spectroscopy proton spectroscopy of human brains. PMID:22618964

Zhang, Yan; Shen, Jun



Integrated Tool for System Analysis of Sample Return Vehicles.  

National Technical Information Service (NTIS)

The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis...

J. A. Samareh; R. G. Winski; R. W. Maddock



Validity Analysis of Tool and Trade Knowledge Test Items.  

National Technical Information Service (NTIS)

The objective of the present analysis was to develop experimental scales for the differential prediction of performance in construction and related mechanical jobs. The present report covers analysis of items for (1) difficulty level in samples representa...

W. H. Helme; W. R. Graham



X3DBio1: a visual analysis tool for biomolecular structure exploration  

NASA Astrophysics Data System (ADS)

Protein tertiary structure analysis provides valuable information on biochemical function. The structure-to-function relationship can be directly addressed through three-dimensional (3D) biomolecular structure exploration and comparison. We present X3DBio1, a visual analysis tool for 3D biomolecular structure exploration, which allows for easy visual analysis of 2D intra-molecular contact maps and 3D density exploration for protein, DNA, and RNA structures. A case study is also presented in this paper to illustrate the utility of the tool. X3DBio1 is open source and freely downloadable. We expect this tool can be applied to solve a variety of biological problems.

Yi, Hong; Singh, Abhishek; Yingling, Yaroslava G.



An integrated tool for system analysis of sample return vehicles  

Microsoft Academic Search

The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis,

Jamshid A. Samareh; Robert W. Maddock; Richard G. Winski



Deconvolution for digital in-line holographic microscopy  

NASA Astrophysics Data System (ADS)

To improve the resolution in point source digital in-line holography, we present two deconvolutions, one for the illumination system (coherent or partially coherent light source such as a laser or diode and pinhole) and one for the finite numerical aperture of the hologram. We show that for a system with moderate numerical aperture, optimal resolution of λ/2 laterally and λ in depth can be achieved.

Nickerson, Brenden Scott; Kreuzer, Hans Jürgen



New criteria for blind deconvolution of nonminimum phase systems (channels)  

Microsoft Academic Search

A necessary and sufficient condition for blind deconvolution (without observing the input) of nonminimum-phase linear time-invariant systems (channels) is derived. Based on this condition, several optimization criteria are proposed, and their solution is shown to correspond to the desired response. These criteria involve the computation only of second- and fourth-order moments, implying a simple tap update procedure. The proposed methods

Ofir Shalvi; Ehud Weinstein
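The second- and fourth-order moment criteria summarized above revolve around a kurtosis-type quantity K(y) = E[y⁴] − 3E[y²]². A minimal, scale-invariant sketch (the function name and the demonstration channel below are ours, not the authors'):

```python
import numpy as np

def normalized_kurtosis(y):
    # Scale-invariant form of the fourth-order quantity K(y) = E[y^4] - 3 E[y^2]^2
    # used in kurtosis-based blind deconvolution criteria.
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    m2 = np.mean(y ** 2)
    return np.mean(y ** 4) / m2 ** 2 - 3.0
```

For an i.i.d. non-Gaussian input, passing the signal through any channel other than a pure delay and scaling strictly shrinks |K| of the power-normalized output, so a blind equalizer can be driven by maximizing |K| subject to a power constraint.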



Wavelet deconvolution in a periodic setting using cross-validation  

Microsoft Academic Search

The wavelet deconvolution method WaveD, using band-limited wavelets, offers both theoretical and computational advantages over traditional compactly supported wavelets. A translation-invariant WaveD with a fast algorithm improves performance further. A twofold cross-validation method for choosing the threshold parameter and the finest resolution level in WaveD is introduced. The algorithm's performance is compared with fixed-constant tuning and the default tuning

Leming Qu; Partha S. Routh; Kyungduk Ko



Ill-posedness of space-variant image deconvolution  

NASA Astrophysics Data System (ADS)

In optical systems with quite good correction, the field-dependence of aberrations can often be neglected. In low-performance systems, the field-dependence of the point spread function must be taken into account when applying deconvolution methods. However, the number of publications dealing with space-variant deconvolution to compensate system aberrations is quite low. In this contribution, we investigate the fundamental difficulty accompanying space-variant deconvolution, which makes the problem ill-posed even in the case of non-vanishing modulation transfer functions and the assumption of noise-free imaging. The spatial frequencies of the image spectrum are mixed depending on the field-dependencies of the optical aberrations. In general, it is therefore not possible to reconstruct the individual frequencies exactly. Some discrete examples with a non-unique solution are presented. For the 2D case, we show and investigate how the most popular algorithms deal with this fundamental problem for different typical types of optical aberrations. Depending on the aberration, the computational results for those algorithms range from very good results to images with artifacts. The Lucy-Richardson method, which is often recommended in the case of space-invariant image reconstruction since it may even reconstruct frequencies above the cut-off frequency, provides poor results for unsymmetrical space-variant aberrations such as coma or a simple tilt. However, we will show that a simpler method, the Landweber algorithm, is better suited to deal with those kinds of aberrations.

Kieweg, Michael; Gross, Herbert; Sievers, Torsten; Müller, Lothar
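The Landweber algorithm favored above for unsymmetrical space-variant aberrations is a simple gradient-type update on a generic linear blur operator. A minimal sketch under our own assumptions (a small explicit matrix A stands in for the space-variant blur; the step-size rule and function name are ours):

```python
import numpy as np

def landweber(A, b, n_iter=500, tau=None):
    # Landweber iteration x_{k+1} = x_k + tau * A^T (b - A x_k) for a linear
    # blur model b = A x; converges for 0 < tau < 2 / ||A||_2^2.
    if tau is None:
        tau = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative default step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + tau * A.T @ (b - A @ x)
    return x
```

Unlike a direct inverse, stopping the iteration early acts as regularization, which is one reason the method degrades gracefully on ill-posed space-variant problems.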



Dependability Analysis Using a Fault Injection Tool Based on Synthesizability of HDL Models  

Microsoft Academic Search

This paper presents a fault injection tool called SINJECT that supports several synthesizable and non-synthesizable fault models for dependability analysis of digital systems modeled by popular HDLs. The tool provides injection of transient and permanent faults into the Verilog as well as VHDL models of a digital circuit to study the fault behavior, fault propagation and fault coverage. Moreover, using

Hamid R. Zarandi; Seyed Ghassem Miremadi; Ali Reza Ejlali



Data stream management system: Tools for live stream handling & their application on trivial network analysis problems  

Microsoft Academic Search

This paper presents the handling and analysis of network packets using the data stream management system tool TelegraphCQ. The number of tools for analyzing data traffic on the Internet is continuously increasing, because there is a growing need in many different application domains. The high volume of data that flows within a network requires one to rethink the fundamental architecture of a DBMS

Nadeem Akhtar; Mohammed A Qadeer; Faraz Khan; Faridul Haque



The Ribosomal Database Project: improved alignments and new tools for rRNA analysis  

Microsoft Academic Search

The Ribosomal Database Project (RDP) provides researchers with quality-controlled bacterial and archaeal small subunit rRNA alignments and analysis tools. An improved alignment strategy uses the Infernal secondary structure aware aligner to provide a more consistent higher quality alignment and faster processing of user sequences. Substantial new analysis features include a new Pyrosequencing Pipeline that provides tools

James R. Cole; Q. Wang; E. Cardenas; J. Fish; B. Chai; Ryan J. Farris; A. S. Kulam-syed-mohideen; D. M. Mcgarrell; Terry L. Marsh; George M. Garrity; James M. Tiedje



Software tool for automated design and cost benefit analysis of offshore grid  

Microsoft Academic Search

This paper discusses a software tool for automated design and cost benefit analysis of an offshore grid. The tool utilises a graphical user interface (GUI) and a catalogue of components (cables, transformers, converters, platforms) to design an offshore wind farm and create the corresponding network model. A set of calculations that includes load flow, cable sizing inspection, reactive power compensation,

Dusko P. Nedic; Muhammad Ali; Jovica V. Milanovic



Expanding capabilities of EMTP-like tools: from analysis to design  

Microsoft Academic Search

EMTP-like tools are widely used for simulation of transients in power systems. The implementation of several features in some of these tools has significantly expanded their applications. Some of them can be now used to perform sensitivity analysis, select power components, or introduce modifications in a power system during a simulation. This letter describes the new features. Although the document

J. A. Martinez; J. Martin-Arnedo



Experiences on implementation of GIS based tools for analysis, planning and design of distribution systems  

Microsoft Academic Search

A geo-spatial information system (GIS) is a powerful tool to visualize a distribution network as it exists on the ground. GIS-based electrical distribution network analysis & design tools enable the planning engineer to visualize the areas of low voltages and high losses; reconfigure the network for reduction of losses in planning mode; and design the distribution network to extend supply to new customers and

M. V. K. Rao; B. S. Varma; C. Radhakrishna



A comparative analysis of DEA as a discrete alternative multiple criteria decision tool  

Microsoft Academic Search

The application of Data Envelopment Analysis (DEA) as a discrete alternative multiple criteria decision making (MCDM) tool has been gaining more attention in the literature. In this literature, DEA has been applied as an MCDM tool and compared analytically to other MCDM models and their structures, especially those that are based on multiple objective linear programming approaches. In this paper,

Joseph Sarkis



An agent-based tool for infrastructure interdependency policy analysis.  

SciTech Connect

Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructure interdependencies such as those between the electric power and natural gas markets. These markets are undergoing fundamental transformations including major changes in electric generator fuel sources. Electric generators that use natural gas as a fuel source are rapidly gaining market share. These generators introduce direct interdependency between the electric power and natural gas markets. These interdependencies have been investigated using the emergent behavior of CAS model agents within the Spot Market Agent Research Tool Version 2.0 Plus Natural Gas (SMART II+).

North, M. J.



Spectral probability density as a tool for ambient noise analysis.  


This paper presents the empirical probability density of the power spectral density as a tool to assess the field performance of passive acoustic monitoring systems and the statistical distribution of underwater noise levels across the frequency spectrum. Using example datasets, it is shown that this method can reveal limitations such as persistent tonal components and insufficient dynamic range, which may be undetected by conventional techniques. The method is then combined with spectral averages and percentiles, which illustrates how the underlying noise level distributions influence these metrics. This combined approach is proposed as a standard, integrative presentation of ambient noise spectra. PMID:23556689

Merchant, Nathan D; Barton, Tim R; Thompson, Paul M; Pirotta, Enrico; Dakin, D Tom; Dorocicz, John
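The construction described in this abstract, an empirical probability density of spectral levels in each frequency bin, can be sketched roughly as follows. This is an illustration under our own assumptions (Hann window, non-overlapping segments, dB levels), not the authors' implementation:

```python
import numpy as np

def spectral_probability_density(x, fs, nfft=256, level_bins=None):
    # Empirical probability density of per-segment PSD levels (in dB),
    # computed as one histogram per frequency bin across time segments.
    n_seg = len(x) // nfft
    segs = x[: n_seg * nfft].reshape(n_seg, nfft)
    win = np.hanning(nfft)
    spec = np.fft.rfft(segs * win, axis=1)
    psd = (np.abs(spec) ** 2) / (fs * np.sum(win ** 2))  # simple PSD scaling
    levels = 10 * np.log10(np.maximum(psd, 1e-30))
    if level_bins is None:
        level_bins = np.linspace(levels.min(), levels.max(), 41)
    spd = np.stack([np.histogram(levels[:, k], bins=level_bins, density=True)[0]
                    for k in range(levels.shape[1])])
    return spd, level_bins
```

Plotted as an image over frequency and level, persistent tonal components appear as narrow level distributions in single frequency bins, and insufficient dynamic range shows up as hard edges in the histograms, effects that a long-term spectral average alone can hide.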



Mechanisms proposed for spectrogram correlation and transformation deconvolution in FM bat sonar  

NASA Astrophysics Data System (ADS)

Big brown bats use time/frequency distributions to represent FM biosonar pulses and echoes as a consequence of reception through frequency tuned channels of the inner ear and subsequent processing by similarly tuned neural channels in the auditory pathway. Integration time is 350 µs, yet delay resolution is 2-10 µs, which must be based on detecting changes in the echo spectrum caused by interference between overlapping reflections inside the integration time. However, bats perceive not merely the echo interference spectrum but the numerical value of the delay separation from the spectrum, which requires deconvolution. Because spectrograms are the initial representation, this process is spectrogram correlation and transformation (SCAT). Proposed SCAT deconvolution mechanisms include extraction of echo envelope ripples for time-domain spectrometry, cepstral analysis of echoes, use of coherent or noncoherent reconstruction with basis functions, segmentation of onsets of overlapping replicas at moderate to long time separations, and localization of the occurrence of spectral interference ripples at specific times within dechirped spectrograms. Physiological evidence from single-unit recordings reveals a cepstral-like time-frequency process based on freqlets; both single-unit and multiunit responses reveal features which may prove to be time-domain basis functions; and multiunit responses exhibit modulations by onset and envelope ripple. [Work supported by NIH and ONR.]

Simmons, James A.



Peak Studio: a tool for the visualization and analysis of fragment analysis files.  


While emerging technologies such as next-generation sequencing are increasingly important tools for the analysis of metagenomic communities, molecular fingerprinting techniques such as automated ribosomal intergenic spacer analysis (ARISA) and terminal restriction fragment length polymorphisms (T-RFLP) remain in use due to their rapid speed and low cost. Peak Studio is a Java-based graphical user interface (GUI) designed for the visualization and analysis of fragment analysis (FSA) files generated by the Applied Biosystems capillary electrophoresis instrument. Specifically designed for ARISA and T-RFLP experiments, Peak Studio provides the user the ability to freely adjust the parameters of a peak-calling algorithm and immediately see the implications for downstream clustering by principal component analysis. Peak Studio is fully open-source and, unlike proprietary solutions, can be deployed on any computer running Windows, OS X or Linux. Peak Studio allows data to be saved in multiple formats and can serve as a pre-processing suite that prepares data for statistical analysis in programs such as SAS or R. PMID:23760901

McCafferty, Jonathan; Reid, Robert; Spencer, Melanie; Hamp, Timothy; Fodor, Anthony




Microsoft Academic Search

Metabolic differences between test and control groups (i.e., metabonomics) are routinely assessed using multivariate analysis of data obtained commonly from NMR, GC-MS and LC-MS. Multivariate analysis (e.g., principal component analysis, PCA) is commonly used to extract potential metabolites responsible for clinical observations. Metabonomics applied to the clinical field is challenging because physiological variabilities such as gender, age and race might

Muhammed Alzweiri; David Watson; John Parkinson



Distribution automation system with real-time analysis tools  

SciTech Connect

In the past 10 years, the electric power industry's involvement in distribution automation (DA) has been principally focused on remote monitoring and control of the distribution systems and their equipment. SCADA has constituted the most significant attribute. Electric utilities in many locations around the world have installed numerous SCADA systems for their distribution substations and feeders and built the infrastructure for real-time distribution operation and control. Real-time data is available to human operators, enabling them to monitor more and more events in their distribution systems and to control automatic equipment remotely. At the same time, the added volume of real-time data has also created data overloads at some distribution control centers. Without proper decision support tools, operators could only rely on their past experience in making operating decisions based on a subset of the data they receive. As a result, large volumes of real-time data along with much of the infrastructure built for automatic operation of the distribution systems could remain underutilized. This article features PG and E's approach to utilizing the existing infrastructures and available data more effectively by introducing intelligence to DA/SCADA systems through development of advanced analytical tools for operations decision support.

Shirmohammadi, D. [Shir Consultants, San Ramon, CA (United States); Liu, W.H.E.; Lau, K.C. [Pacific Gas and Electric Co., San Ramon, CA (United States); Hong, H.W. [Opercon Systems, Inc., Alameda, CA (United States)



Applying observations of work activity in designing prototype data analysis tools  

SciTech Connect

Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

Springmeyer, R.R.






Interfacing interactive data analysis tools with the grid: The PPDG CS-11 activity  

SciTech Connect

For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's ''remote access'' technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics DataGrid project ( has recently embarked on an effort to ''Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services.'' The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments (ALICE, ATLAS, BaBar, D0, CMS, JLab, STAR, others welcome), to determine if existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed.

Olson, Douglas L.; Perl, Joseph



Maximum correlated Kurtosis deconvolution and application on gear tooth chip fault detection  

NASA Astrophysics Data System (ADS)

In this paper a new deconvolution method is presented for the detection of gear and bearing faults from vibration data. The proposed maximum correlated kurtosis deconvolution method takes advantage of the periodic nature of the faults as well as the impulse-like vibration behaviour associated with most types of faults. The results are compared to the standard minimum entropy deconvolution method on both simulated and experimental data. The experimental data are from a gearbox with a gear chip fault, and results are compared between healthy and faulty vibrations. The results indicate that the proposed maximum correlated kurtosis deconvolution method performs considerably better than the traditional minimum entropy deconvolution method, often several times better at fault detection. In addition to this improved performance, deconvolution of separate fault periods is possible, allowing for concurrent fault detection. Finally, an online implementation is proposed and shown to perform well and be computationally achievable on a personal computer.

McDonald, Geoff L.; Zhao, Qing; Zuo, Ming J.
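The correlated kurtosis objective maximized by such a method scores a signal against a candidate fault period, rewarding impulse trains that repeat at that period. A first-shift sketch (circular alignment via np.roll is our simplification; the paper's definition uses linear shifts, and the function name is ours):

```python
import numpy as np

def correlated_kurtosis(x, T, M=1):
    # Correlated kurtosis of shift order M for a candidate fault period T (in samples):
    # CK_M(T) = sum_n (prod_{m=0..M} x[n - mT])^2 / (sum_n x[n]^2)^(M+1)
    x = np.asarray(x, dtype=float)
    prod = x.copy()
    for m in range(1, M + 1):
        prod = prod * np.roll(x, m * T)  # circular shift as a simple approximation
    return np.sum(prod ** 2) / np.sum(x ** 2) ** (M + 1)
```

Sweeping T over plausible fault periods and picking the maximizer is the intuition behind tuning the deconvolution filter to one fault period at a time, which is what enables concurrent detection of separate faults.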



An open-source deconvolution software package for 3-D quantitative fluorescence microscopy imaging  

PubMed Central

Deconvolution techniques have been widely used for restoring the 3-D quantitative information of an unknown specimen observed using a wide-field fluorescence microscope. Deconv, an open-source deconvolution software package, was developed for 3-D quantitative fluorescence microscopy imaging and was released under the GNU Public License. Deconv provides numerical routines for simulation of a 3-D point spread function and deconvolution routines implementing three constrained iterative deconvolution algorithms: one based on a Poisson noise model and two others based on a Gaussian noise model. These algorithms are presented and evaluated using synthetic images and experimentally obtained microscope images, and the use of the library is explained. Deconv allows users to assess the utility of these deconvolution algorithms and to determine which are suited for a particular imaging application. The design of Deconv makes it easy for deconvolution capabilities to be incorporated into existing imaging applications.




Evaluation of the Business Attraction Module in Montana's Highway Economic Analysis Tool.  

National Technical Information Service (NTIS)

Montana's Highway Economic Analysis Tool (HEAT) was created to forecast the economic benefits of highway projects, including possible growth in employment. Of particular interest is the business attraction module in HEAT, which estimates the direct emplo...

A. J. Horowitz X. Jin Y. Zhu



Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool  

SciTech Connect

The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model.

Johnson, P.E. [Oak Ridge National Lab., TN (United States); Lester, P.B. [Dept. of Energy Oak Ridge Operations, TN (United States)



GRAS: a general-purpose 3-D Modular Simulation tool for space environment effects analysis  

Microsoft Academic Search

Geant4 Radiation Analysis for Space (GRAS) is a modular, extendable tool for space environment effects simulation. Analyses include cumulative ionizing and NIEL doses, effects to humans, charging, fluence and transient effects in three-dimensional geometry models.

Giovanni Santin; Vladimir Ivanchenko; Hugh Evans; Petteri Nieminen; Eamonn Daly



Portfolio-Analysis Tool for Missile Defense (PAT-MD). Methodology and User's Manual.  

National Technical Information Service (NTIS)

RAND's Portfolio-Analysis Tool for Missile Defense (PAT-MD) was built to support high-level discussion and decisionmaking in the Missile Defense Agency (MDA) by providing summary portfolio-style characterizations of alternative investment options. These ch...

P. Dreyer P. K. Davis



Analysis and Thermal-Design Improvements of Downhole Tools for Use in Hot-Dry Wells.  

National Technical Information Service (NTIS)

Design improvements made for downhole thermal protection of systems based on results obtained from the analysis of the electronics, heat sink, and dewar packaged in a steel tubular body are described. Results include heat flux at the tool surface, tempera...

G. A. Bennett G. R. Sherman



Using a Bracketed Analysis as a Learning Tool.  

ERIC Educational Resources Information Center

Bracketed analysis is an examination of experiences within a defined time frame or "bracket." It assumes the ability to learn from any source: behaviors, emotions, rational and irrational thought, insights, reflections, and reactions. A bracketed analysis to determine what went wrong with a grant proposal that missed deadlines illustrates its…

Main, Keith



Configural Frequency Analysis as a Statistical Tool for Developmental Research.  

ERIC Educational Resources Information Center

Configural frequency analysis (CFA) is suggested as a technique for longitudinal research in developmental psychology. Stability and change in answers to multiple choice and yes-no item patterns obtained with repeated measurements are identified by CFA and illustrated by developmental analysis of an item from Gorham's Proverb Test. (Author/DWH)

Lienert, Gustav A.; Oeveste, Hans Zur



Core Curriculum Analysis: A Tool for Educational Design  

ERIC Educational Resources Information Center

This paper examines the outcome of a dimensional core curriculum analysis. The analysis process was an integral part of an educational development project, which aimed to compact and clarify the curricula of the degree programmes. The task was also in line with the harmonising of the degree structures as part of the Bologna process within higher…

Levander, Lena M.; Mikkola, Minna



Socioeconomic analysis: a tool for assessing the potential of nanotechnologies  

Microsoft Academic Search

Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of

Jean-Marc Brignon



Prototype Visualization Tools For Multi-Experiment Performance Analysis  

Microsoft Academic Search

The analysis of modern, parallelized applications, such as scientific modeling, is of interest to a variety of people within the computing community of the Department of Defense (DoD). Persons desiring insight into the performance of these large programs include application users, application programmers/developers, portfolio and center managers, and others. The analysis needed requires the examination of large data sets obtained

R. Araiza; J. Nava; A. Taylor; P. Teller; D. Cronk; S. Moore



Bioinformatics enrichment tools: paths toward the comprehensive functional analysis of large gene lists  

PubMed Central

Functional analysis of large gene lists, derived in most cases from emerging high-throughput genomic, proteomic and bioinformatics scanning approaches, is still a challenging and daunting task. The gene-annotation enrichment analysis is a promising high-throughput strategy that increases the likelihood for investigators to identify biological processes most pertinent to their study. Approximately 68 bioinformatics enrichment tools that are currently available in the community are collected in this survey. Tools are uniquely categorized into three major classes, according to their underlying enrichment algorithms. The comprehensive collections, unique tool classifications and associated questions/issues will provide a more comprehensive and up-to-date view regarding the advantages, pitfalls and recent trends in a simpler tool-class level rather than by a tool-by-tool approach. Thus, the survey will help tool designers/developers and experienced end users understand the underlying algorithms and pertinent details of particular tool categories/tools, enabling them to make the best choices for their particular research interests.

Huang, Da Wei; Sherman, Brad T.; Lempicki, Richard A.



Evaluation of a semiautomatic software tool for left ventricular function analysis with 16-slice computed tomography  

Microsoft Academic Search

The purpose of the study was to evaluate a semiautomatic analysis tool for assessing global left ventricular myocardial function with multislice computed tomography (MSCT). We examined 33 patients with MSCT using 16×0.5 mm detector collimation and magnetic resonance imaging (MRI) on a 1.5-T scanner. MSCT data were analyzed using semiautomatic volumetric analysis software (ANET, CSCF-001A, Toshiba). This software tool automatically creates

Marc Dewey; Mira Müller; Florian Teige; Bernd Hamm



An Automated Refereeing and Analysis Tool for the Four-Legged League  

Microsoft Academic Search

The aim of this paper is to propose an automated refereeing and analysis tool for robot soccer. This computer vision based tool can be applied for diverse tasks such as: (i) automated game refereeing, (ii) computer-based analysis of the game, and derivation of game statistics, (iii) automated annotations and semantic descriptions of the game, which could be used for the

Javier Ruiz-del-solar; Patricio Loncomilla; Paul A. Vallejos



Experimental analysis and modeling of the dynamic performance of machine tool spindle-bearing systems  

Microsoft Academic Search

In this paper, an analysis of the dynamic characteristics of machine tool spindle-bearing systems is presented. The research utilized the force impact-response testing method. The results are applied to the analysis and modeling of the dynamic performance of machine tool spindle-bearing systems. As an indicator of dynamic performance, the impulse response matrices are experimentally obtained. Two types of impulse response

Evgueni V. Bordatchev; Peter E. Orban; Adam Rehorn



Identifying Security Relevant Warnings from Static Code Analysis Tools through Code Tainting  

Microsoft Academic Search

Static code analysis tools are often used by developers as early vulnerability detectors. Due to their automation they are less time-consuming and error-prone than manual reviews. However, they produce large quantities of warnings that developers have to manually examine and understand. In this paper, we look at a solution that makes static code analysis tools more useful as an early vulnerability

Dejan Baca



A Simple and Efficient Tool for Design Analysis of Synchronous Reluctance Motor  

Microsoft Academic Search

We present an efficient tool for design analysis of a synchronous reluctance motor (SynRM). We use winding function analysis (WFA) instead of finite-element analysis (FEA). With WFA, parameter sensitivity can be analyzed and the effects of parameters on the machine design can be evaluated very rapidly (under linear condition). We investigated the effect of rotor skewing, stator winding chording, pole

Tahar Hamiti; Thierry Lubin; Abderrezak Rezzoug



Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool  

Microsoft Academic Search

The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be

Wondwesen Tafesse; Tor Korneliussen; Kåre Skallerud



Systematic studies of the Richardson-Lucy deconvolution algorithm applied to VHE gamma data  

NASA Astrophysics Data System (ADS)

The Richardson-Lucy deconvolution algorithm was applied to astronomical images in the very high-energy regime with photon energies above 100 GeV. Through a systematic study with respect to source significance, background level and source morphology we were able to derive optimal deconvolution parameters. The results presented show that deconvolution makes it possible to study structural details well below the angular resolution of the very high-energy γ-ray experiment.

Heinz, S.; Jung, I.; Stegmann, C.
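The Richardson-Lucy algorithm applied in this study is a multiplicative update derived from a Poisson noise model. A 1-D sketch under our own simplifying assumptions (known normalized PSF, 'same'-mode boundary handling, uniform initial estimate; function name ours):

```python
import numpy as np

def richardson_lucy(data, psf, n_iter=30, eps=1e-12):
    # Minimal 1-D Richardson-Lucy iteration:
    #   est <- est * correlate(data / convolve(est, psf), psf)
    # The mirrored PSF implements the correlation step; eps guards division by zero.
    est = np.full_like(data, data.mean(), dtype=float)
    psf_mirror = psf[::-1]
    for _ in range(n_iter):
        conv = np.convolve(est, psf, mode="same")
        ratio = data / np.maximum(conv, eps)
        est *= np.convolve(ratio, psf_mirror, mode="same")
    return est
```

The update preserves non-negativity by construction, which is one reason the method is popular for photon-counting data such as γ-ray images.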



Texture image analysis for osteoporosis detection with morphological tools  

NASA Astrophysics Data System (ADS)

The disease of osteoporosis shows itself both in a reduction of the bone mass and a degradation of the microarchitecture of the bone tissue. Radiological images of the heel bone are analyzed in order to extract information about microarchitectural patterns. We first extract the gray-scale skeleton of the microstructures contained in the underlying images. More precisely, we apply the thinning procedure proposed by Mersal which preserves connectivity of the microarchitecture. Then, a post-processing of the resulting skeleton consists in detecting the points of intersection of the trabecular bones (multiple points). The modified skeleton can be considered as a powerful tool to extract discriminant features between Osteoporotic Patients (OP) and Control Patients (CP). For instance, computing the distance between two horizontal (respectively vertical) adjacent trabecular bones is a straightforward task once the multiple points are available. Statistical tests indicate that the proposed method is more suitable to discriminate between OP and CP than conventional methods based on binary skeleton.

Sevestre-Ghalila, Sylvie; Benazza-Benyahia, Amel; Cherif, Hichem; Souid, Wided



GCAFIT—A new tool for glow curve analysis in thermoluminescence nanodosimetry  

NASA Astrophysics Data System (ADS)

Glow curve analysis is widely used for dosimetric studies and applications. A new computer program, GCAFIT, for deconvoluting first-order kinetics thermoluminescence (TL) glow curves and evaluating the activation energy of each glow peak has therefore been developed using the MATLAB technical computing language. A non-linear function describing a single glow peak is fitted to the experimental points using the Levenberg-Marquardt least-squares method. The developed GCAFIT software was used to analyze the glow curves of TLD-100, TLD-600, and TLD-700 nanodosimeters. The activation energy E obtained with GCAFIT was compared with that obtained by the peak shape methods of Grossweiner, Lushchik, and Halperin-Braner. The frequency factor S for each glow peak was also calculated. The standard deviations are discussed in each case and compared with those of other investigators. The results show that GCAFIT is capable of accurately analyzing first-order TL glow curves. Unlike other software programs, GCAFIT does not require the activation energy as an input datum; instead, the activation energy of each glow peak is given in the output data. The resolution of the experimental glow curve influences the results: as the resolution increases, the results obtained by GCAFIT become more accurate. The values of activation energy obtained by GCAFIT are in good agreement with those obtained by the peak shape methods; the agreement with the Halperin-Braner and Lushchik methods is better than with that of Grossweiner. High E and S values were observed for peak 5; we believe these values are not real because peak 5 may in fact consist of two or three unresolved peaks. We therefore treated E and S for peak 5 as an effective activation energy, Eeff, and an effective frequency factor, Seff. The temperature value for peak 5 was likewise treated as an effective quantity, Tm,eff.
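For first-order kinetics, the single-peak function that a program like GCAFIT fits is the Randall-Wilkins expression, I(T) = n0·s·exp(−E/kT)·exp(−(s/β)∫exp(−E/kT′)dT′). A numerical sketch in Python; the trap parameters E, s, n0 and the heating rate β below are illustrative, not values from the paper:

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def randall_wilkins(T, n0, E, s, beta):
    """First-order TL glow peak (Randall-Wilkins):
    I(T) = n0 * s * exp(-E/kT) * exp(-(s/beta) * integral exp(-E/kT') dT')."""
    rate = s * np.exp(-E / (K_B * T))         # trap escape rate at each T
    dT = np.diff(T)
    # cumulative trapezoidal integral of the escape rate over temperature
    integral = np.concatenate(([0.0], np.cumsum(0.5 * (rate[1:] + rate[:-1]) * dT)))
    return n0 * rate * np.exp(-integral / beta)

# illustrative trap parameters: E = 1.0 eV, s = 1e12 1/s, heating rate 1 K/s
T = np.linspace(300.0, 600.0, 3001)
I = randall_wilkins(T, n0=1.0e5, E=1.0, s=1.0e12, beta=1.0)
Tm = float(T[np.argmax(I)])  # peak temperature of the simulated glow curve
```

Fitting this function to measured points with Levenberg-Marquardt least squares, as the abstract describes, then yields E and s for each resolved peak without requiring E as an input.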

Abd El-Hafez, A. I.; Yasin, M. N.; Sadek, A. M.



A Web-based Tool For The Analysis Of Concept Inventory Data  

NASA Astrophysics Data System (ADS)

"FOCIA" stands for Free Online Concept Inventory Analyzer. FOCIA, our new web-based tool, will allow teachers and researchers in any location to upload their test data and instantly receive a complete analysis report. Analyses included with this tool are basic test statistics, Traditional Item Analysis, Concentration Analysis, Model Analysis Theory results, and pre- and post-test comparison, including the calculation of gain, normalized change, and effect size. The tool currently analyzes data from the Lunar Phases Concept Inventory (LPCI), the Force Concept Inventory (FCI), the Astronomy Diagnostic Test (ADT), the Force and Motion Conceptual Evaluation (FMCE), and, generically, any multiple-choice test. It will be expanded to analyze data from other concept inventories commonly used in the PER community and from user-designed and uploaded tools. In this paper, we discuss the development of this analysis tool, including some technical details of implementation and a description of what is available for use. Instructors and researchers are encouraged to use the latest version of the analysis tool via our website,
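The gain, normalized change, and effect-size quantities mentioned above have standard definitions in the PER literature (Hake's normalized gain, the Marx-Cummings normalized change, and Cohen's d). A sketch of the calculations with made-up scores, not FOCIA's actual code:

```python
import math

def normalized_gain(pre_mean, post_mean, max_score=100.0):
    """Hake's class-average normalized gain <g> = (post - pre) / (max - pre)."""
    return (post_mean - pre_mean) / (max_score - pre_mean)

def normalized_change(pre, post, max_score=100.0):
    """Per-student normalized change c: gain-like if the score rose,
    loss relative to the pre-score if it fell (Marx & Cummings)."""
    if post > pre:
        return (post - pre) / (max_score - pre)
    if post < pre:
        return (post - pre) / pre
    return 0.0

def cohens_d(pre_scores, post_scores):
    """Effect size as Cohen's d with a pooled standard deviation."""
    n1, n2 = len(pre_scores), len(post_scores)
    m1, m2 = sum(pre_scores) / n1, sum(post_scores) / n2
    v1 = sum((x - m1) ** 2 for x in pre_scores) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in post_scores) / (n2 - 1)
    pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled
```

For example, a class moving from a 40% to a 70% mean on a 100-point inventory has a normalized gain of 0.5, regardless of the absolute point change.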

Beuckman, Joseph P.; Franklin, Scott V.; Lindell, Rebecca S.



Deconvolution of Microfluorometric Histograms with B Splines  

Microsoft Academic Search

We consider the problem of estimating a probability density from observations from that density which are further contaminated by random errors. We propose a method of estimation using spline functions, discuss the numerical implementation of the method, and prove its consistency. The problem is motivated by the analysis of DNA content obtained by microfluorometry, and an example of such an

John Mendelsohn; John Rice



Ferret: A Computer Visualization and Analysis Tool for Gridded Data.  

National Technical Information Service (NTIS)

Program FERRET is an interactive computer visualization and analysis environment designed to meet the needs of physical scientists analyzing large and complex gridded data sets. FERRET was originally conceived and written to analyze the numerical ocean m...

S. Hankin; J. Davison; K. O'Brien; D. E. Harrison



Collaboration and Modeling Tools for Counter-Terrorism Analysis.  

National Technical Information Service (NTIS)

One of the major challenges in counter-terrorism analysis today involves connecting the relatively few and sparse terrorism-related dots embedded within massive amounts of data flowing into the government's intelligence and counter-terrorism agencies. Inf...

D. Serfaty; K. Pattipati; P. Willett; R. Popp; W. Stacy



Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles  

SciTech Connect

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time-of-flight secondary ion mass spectrometry (TOF-SIMS); low-energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coatings, etc., have on nanoparticle properties.

Baer, Donald R.; Gaspar, Daniel J.; Nachimuthu, Ponnusamy; Techane, Sirnegeda D.; Castner, David G.



Dynamics and Control Analysis Package (DCAP): An Automated Analysis and Design Tool for Structural Control of Space Structures.  

National Technical Information Service (NTIS)

The Dynamics and Control Analysis Package (DCAP) automated design and checking tool for the dynamics and control of large flexible structures is described. The DCAP includes programs for nonlinear simulation of multibody systems dynamics; linear or linear...

R. P. Singh; R. J. Vandervoort; C. Arduini; A. Festa; C. Maccone



DEBRISK, a Tool for Re-Entry Risk Analysis  

NASA Astrophysics Data System (ADS)

An act of the French parliament, adopted in 2008, requires satellite manufacturers to evaluate end-of-life operations in order to assure risk mitigation for their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energies of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. Using this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads; this altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its own birth criterion. In the simplest approach the child is born after the demise of the parent object. This could be the case for an object B contained in the interior of an object A and thus not exposed to the atmosphere. Each object is defined by: - its shape, attitude and dimensions, - its material along with its physical properties, - its state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring. A newborn object inherits the state vector of its parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached, and the mass of melted material is computed from the excess heat and the material properties.
After each step the amount of ablated material is determined using the lumped-mass approach and is peeled off the object, updating the mass and shape of the ablating object. The mass in the lumped-mass equation is termed the 'thermal mass' and consists of the part of the object that is exposed to the flow (thus excluding the mass of the contained children). A number of predefined materials are implemented, along with their thermal properties. To allow users to modify the properties or to add new materials, user-defined materials can be used; in that case, properties such as specific heat, emissivity and conductivity can be entered either as constants or as temperature-dependent tables. Materials can also be derived from existing materials, which is useful when only one or a few of the material properties change. The code has been developed in the Java language, benefiting from the object-oriented approach. Most methods used in DEBRISK to compute drag coefficients and heating rates are based on engineering methods developed in the 1950s and 1960s, which are used as well in similar tools (ORSAT, SESAME, ORSAT-J, ...). The paper presents a set of comparisons with literature cases of similar tools in order to verify the implementation of those methods in the developed software.
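The lumped-mass bookkeeping described above, integrating the heating rate until the melting temperature is reached and then converting excess heat into ablated mass, can be sketched in a few lines. The material properties and heating rate below are illustrative (roughly aluminium-like), not DEBRISK database values, and the real tool couples this step to a 3DOF trajectory and shape update:

```python
def ablation_step(mass, temp, q_dot, area, dt, cp, t_melt, h_fusion):
    """One lumped-mass step: incoming heat raises the object's temperature
    up to the melting point; any excess heat ablates mass via the heat of fusion."""
    q = q_dot * area * dt                    # heat absorbed this step [J]
    q_to_melt = mass * cp * (t_melt - temp)  # heat still needed to reach t_melt
    if q <= q_to_melt:
        return mass, temp + q / (mass * cp)  # still heating up
    ablated = (q - q_to_melt) / h_fusion     # excess heat melts material
    return max(mass - ablated, 0.0), t_melt

# roughly aluminium-like values, for illustration only
m, T = 10.0, 300.0  # kg, K
for _ in range(60):
    m, T = ablation_step(m, T, q_dot=1.0e6, area=0.1, dt=1.0,
                         cp=900.0, t_melt=933.0, h_fusion=3.9e5)
```

The object first soaks up heat until it reaches the melting temperature, then loses mass at a rate set by the excess heat flux over the heat of fusion, which is the behavior the abstract describes.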

Omaly, P.; Spel, M.



X-ray scatter removal by deconvolution  

Microsoft Academic Search

The distribution of scattered x rays detected in a two-dimensional projection radiograph at diagnostic x-ray energies is measured as a function of field size and object thickness at a fixed x-ray potential and air gap. An image intensifier-TV based imaging system is used for image acquisition, manipulation, and analysis. A scatter point spread function (PSF) with an assumed linear, spatially
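With a shift-invariant scatter PSF, the measured radiograph can be modeled as the primary image plus the primary convolved with the scatter kernel, so the primary is recovered by an inverse filter in the Fourier domain. A toy sketch with a synthetic image and Gaussian scatter kernel (a real detector would also need noise regularization, e.g. a Wiener filter):

```python
import numpy as np

def add_scatter(primary, scatter_psf):
    """Forward model: measured = primary + primary (*) scatter_psf (circular conv.)."""
    return np.real(np.fft.ifft2(np.fft.fft2(primary) * (1.0 + np.fft.fft2(scatter_psf))))

def remove_scatter(measured, scatter_psf):
    """Inverse filter: divide by the transfer function 1 + FFT(scatter_psf)."""
    return np.real(np.fft.ifft2(np.fft.fft2(measured) / (1.0 + np.fft.fft2(scatter_psf))))

# synthetic primary image and a broad Gaussian scatter PSF (scatter fraction 0.4)
rng = np.random.default_rng(1)
primary = rng.random((32, 32))
d = np.minimum(np.arange(32), 32 - np.arange(32))  # circular distances
psf = np.exp(-(d[:, None] ** 2 + d[None, :] ** 2) / 20.0)
psf *= 0.4 / psf.sum()                             # total scatter-to-primary ratio
measured = add_scatter(primary, psf)
recovered = remove_scatter(measured, psf)
```

Because the scatter fraction is below one, the transfer function 1 + FFT(psf) never vanishes, so the inverse filter is stable for this noise-free model.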

J. A. Seibert; J. M. Boone



Analysis of the exhalome: a diagnostic tool of the future.  


Investigations on breath analysis have provided preliminary data on its potential in the noninvasive diagnosis of lung diseases. Although the conventional comparisons of exhaled breath in study populations (ie, diseased vs healthy) may help to identify patients with various lung diseases, we believe that the analysis of exhaled breath holds promise beyond this scenario. On the basis of preliminary findings, we hypothesize that breath analysis (1) could be applied not only to identify patients with lung disease but also to better phenotype healthy subjects at risk and patients with a particular disease, which is in line with current efforts toward individualized medicine; (2) could be useful in estimating internal body time to determine the optimal time of drug administration, thereby maximizing drug activity and reducing toxicity (chronopharmacology); and (3) could be applied to monitor drugs or drug metabolites, thus enhancing adherence to prescribed medications and enabling studies on pharmacokinetics. PMID:24008952

Martinez-Lozano Sinues, Pablo; Zenobi, Renato; Kohler, Malcolm



Comparative genomic hybridization: Tools for high resolution analysis  

SciTech Connect

Comparative genomic hybridization (CGH) is a powerful FISH-based technique that allows detection and mapping of genome imbalances using genomic DNA as probe. Limitations are resolution (limited by the use of metaphase chromosomes as target and by the high noise generated by the technique), sensitivity (imbalances as large as 10 Mb or more may remain undetected), and cumbersome analysis. We have utilized a statistical procedure based on eigenanalysis that allows extraction of information from many chromosome images to provide a result suitable for objective statistical evaluation. The analysis allows the recognition of consistent patterns along the chromosome and discards patterns of random appearance (assumed to be noise). An obvious application for this powerful analysis is for CGH experiments. We have performed test experiments using genomic DNAs from patients with partial trisomies or deletions of a chromosome segment. Image ratio analysis was performed and 20-30 ratio images of the relevant chromosome were subjected to eigenanalysis (in our system the processing time is 60-90 seconds). Results showed a superior contrast enhancement of the regions corresponding to the unbalanced genomic segment with sharp limits between normal and abnormal fluorescence ratio. This allowed the precise mapping of unbalanced regions that was in agreement with standard cytogenetic analysis. Because there is no limit to the number of chromosomes that can be analyzed at any one time, the method has the potential to increase the sensitivity of CGH by eliminating the noise component. We are currently testing the possibility of using this method to resolve complex unbalanced rearrangements that would be impossible to analyze by standard cytogenetic analysis or that would require multiple fluorescence in situ hybridization experiments.
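The eigenanalysis step described above, extracting the ratio pattern consistent across 20-30 chromosome images while discarding patterns of random appearance, can be illustrated with a simple principal-eigenvector computation. The simulated ratio profiles and segment boundaries below are made up for illustration and are not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_images, length = 30, 100
pattern = np.ones(length)
pattern[40:60] = 1.5  # simulated gained segment (fluorescence ratio 1.5)
# 30 noisy ratio profiles of the same chromosome
profiles = pattern + rng.normal(0.0, 0.3, size=(n_images, length))

# eigenanalysis: the leading eigenvector of the (uncentered) second-moment
# matrix captures the pattern consistent across images, while uncorrelated
# noise spreads over the remaining eigenvectors with much smaller eigenvalues
M = profiles.T @ profiles / n_images
eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
v = eigvecs[:, -1]
if v.sum() < 0:
    v = -v  # eigenvector sign is arbitrary
```

Averaging alone would also suppress noise, but the eigen-decomposition additionally separates any secondary consistent patterns into lower-ranked components, which is what makes the noise-versus-signal distinction explicit.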

Knapp, R.D.; Antonacci, R.; Haddad, B. (Baylor College of Medicine, Houston, TX, United States); and others



A geospatial tool for wildfire threat analysis in central Texas  

NASA Astrophysics Data System (ADS)

Wildland fires in the United States are not always confined to wilderness areas. The growth of population centers and housing developments in wilderness areas has blurred the boundaries between rural and urban. This merger of human development and natural landscape is known in the wildland fire community as the wildland-urban interface (WUI), and it is within this interface that many wildland fires increasingly occur. As wildland fire intrusions in the WUI increase, so too does the need for tools to assess the potential impact to valuable assets contained within the interface. This study presents a methodology that combines real-time weather data, a wildland fire behavior model, satellite remote sensing, and geospatial data in a geographic information system to assess potential risk to human developments and natural resources within the Austin metropolitan area and the surrounding ten counties of central Texas. The methodology uses readily available digital databases and satellite images within Texas, in combination with an industry-standard fire behavior model, to help emergency and natural resource managers assess potential impacts from wildland fire. Results of the study will promote prevention of WUI fire disasters, facilitate watershed and habitat protection, and help direct efforts in post-wildland-fire mitigation and restoration.

Hunter, Bruce Allan