Note: This page contains sample records for the topic deconvolution analysis tool from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results.
Last update: August 15, 2014.
1

Deconvolution  

Microsoft Academic Search

In this chapter, we present deconvolution techniques to reconstruct the influence (impulse) function of the well and reservoir system from pressure-rate and pressure-pressure data sets. Deconvolution is simply solving the convolution integral (the Volterra integral equation of the first kind) for the convolution kernel. Basically, deconvolution enables us to reconstruct (compute) an equivalent (extrapolated) constant-rate drawdown response for all production
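The sketch below illustrates the basic operation described here: the Volterra convolution integral is discretized into a lower-triangular system and solved for the impulse response by regularized least squares. The synthetic rate history, impulse response, and regularization weight are assumptions for illustration, not the authors' algorithm.

```python
# Illustrative sketch only: deconvolve a pressure-drop signal recorded under a
# variable rate history by discretizing the convolution integral and solving
# the resulting lower-triangular system with Tikhonov regularization.
import numpy as np

dt = 0.1                                  # time step (hours), assumed
t = np.arange(1, 201) * dt
g_true = np.exp(-t)                       # assumed "true" impulse response
q = np.where(t < 10, 50.0, 30.0)          # assumed two-step rate history

# dp[i] = sum_{j<=i} q[i-j] * g[j] * dt   (discrete, causal convolution)
C = np.zeros((t.size, t.size))
for i in range(t.size):
    for j in range(i + 1):
        C[i, j] = q[i - j] * dt
dp = C @ g_true + np.random.normal(0.0, 0.05, t.size)   # noisy pressure drop

lam = 0.1                                 # regularization weight, tuned by hand
g_est = np.linalg.solve(C.T @ C + lam * np.eye(t.size), C.T @ dp)
```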

Fikri J. Kuchuk; Mustafa Onur; Florian Hollaender

2010-01-01

2

PVT Analysis With A Deconvolution Algorithm  

SciTech Connect

Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis algorithm developed by Symetrica, Inc., and they have demonstrated the application of their deconvolution algorithm to PVT with very promising results. The thrust of such a deconvolution algorithm used with PVT is to allow for identification and rejection of naturally occurring radioactive material, reducing alarm rates, rather than the complete identification of all radionuclides, which is the goal of spectroscopic portal monitors. Under this condition, there could be a significant increase in sensitivity to threat materials. The advantage of this approach is an enhancement to the low cost, robust detection capability of PVT-based radiation portal monitor systems. The success of this method could provide an inexpensive upgrade path for a large number of deployed PVT-based systems to provide significantly improved capability at a much lower cost than deployment of NaI(Tl)-based systems of comparable sensitivity.

Kouzes, Richard T.

2011-02-01

3

Poissonian image deconvolution with analysis sparsity priors  

NASA Astrophysics Data System (ADS)

Deconvolving Poissonian images has been a significant subject in various application areas such as astronomical, microscopic, and medical imaging. In this paper, a regularization-based approach is proposed to solve Poissonian image deconvolution by minimizing a regularization energy functional composed of the generalized Kullback-Leibler divergence as the data-fidelity term, sparsity prior constraints as the regularization term, and a non-negativity constraint. We consider two sparsity prior constraints: a framelet-based analysis prior and a combination of framelet and total variation analysis priors. Furthermore, we show that the resulting minimization problems can be efficiently solved by the split Bregman method. Comparative experimental results, both quantitative and qualitative, show that our algorithm can effectively remove blur, suppress noise, and reduce artifacts.

Fang, Houzhang; Yan, Luxin

2013-04-01

4

A survey of deconvolution approaches in teleseismic receiver function analysis  

NASA Astrophysics Data System (ADS)

Receiver function analysis is frequently used to image the Earth's crustal and upper mantle structure. The essential processing step in this analysis is the source normalization, which can be accomplished through deconvolution. Though a variety of deconvolution approaches have been employed over the years to solve this problem, no systematic comparison of these approaches has yet been done. Here, we present the results of such a comparison with the aim of evaluating the various deconvolution approaches and providing some guidelines as to which approach may be better suited for specific applications. The following deconvolution approaches are systematically compared: frequency-domain spectral division using both water-level and damping-factor regularization, multi-taper cross-correlation in the frequency domain, time-domain least squares filtering, and iterative time-domain deconvolution. We carry out benchmark tests on synthetic and real data to assess how the various approaches perform for different input conditions - e.g., data quality (including noise content), data volume based on number of stations and events, and the complexity of the target structure. Our results show that the different approaches produce receiver functions that are equally robust provided that a suitable regularization parameter is found - a task that is usually more easily accomplished in the time domain. However, in the case of noisy data, we find that the iterative time-domain deconvolution can generate as much ringing in the resulting receiver function as poorly regularized frequency-domain spectral division. If computational speed is sought, for example when dealing with large data sets, then the use of frequency-domain approaches might be more attractive. We also find that some deconvolution approaches may be better adapted than others to address specific imaging goals. For example, iterative time-domain deconvolution can be used to quickly construct profiles of first-order discontinuities (e.g., Moho and its multiples) by restricting the number of iterations (n=10-20) and thus filtering out higher-order converted signals.
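As a concrete illustration of one of the approaches compared here, the sketch below implements frequency-domain spectral division with water-level regularization and a Gaussian low-pass filter. The function name, filter width, and default water level are assumptions, not the authors' code.

```python
# Hedged sketch of water-level regularized spectral division for a receiver
# function, assuming synthetic radial (r) and vertical (z) component traces.
import numpy as np

def waterlevel_rf(r, z, dt, water=0.01, gauss=2.5):
    """Frequency-domain deconvolution R/Z with a water level and Gaussian filter."""
    n = len(r)
    freq = np.fft.rfftfreq(n, dt)
    R, Z = np.fft.rfft(r), np.fft.rfft(z)
    denom = (Z * np.conj(Z)).real
    denom = np.maximum(denom, water * denom.max())              # water level floor
    G = np.exp(-(2 * np.pi * freq) ** 2 / (4 * gauss ** 2))     # Gaussian low-pass
    return np.fft.irfft(R * np.conj(Z) * G / denom, n)
```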

Spieker, Kathrin; Rondenay, Stéphane; Halpaap, Felix

2014-05-01

5

Chemometric data analysis for deconvolution of overlapped ion mobility profiles.  

PubMed

We present the details of a data analysis approach for deconvolution of ion mobility (IM) overlapped or unresolved species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution. PMID:22948903
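A minimal sketch of the PCA step described here, applied to a synthetic matrix of post-IM/CID MS scans; the matrix shape, variance threshold, and rank heuristic are illustrative assumptions and stand in for the authors' in-house platform and the self-modeling mixture analysis step.

```python
# Illustrative PCA screen for mobility-overlapped species: if more than one
# component carries substantial variance, the arrival-time window likely
# contains co-drifting species. Synthetic data; thresholds are arbitrary.
import numpy as np

X = np.random.rand(60, 500)         # rows: IM arrival-time bins, cols: m/z channels (assumed shape)
Xc = X - X.mean(axis=0)             # mean-center each m/z channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
n_species = int(np.sum(explained > 0.05))   # crude rank estimate, threshold assumed
print(f"components above threshold: {n_species}")
```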

Zekavat, Behrooz; Solouki, Touradj

2012-11-01

6

CAM-CM: a signal deconvolution tool for in vivo dynamic contrast-enhanced imaging of complex tissues  

PubMed Central

Summary: In vivo dynamic contrast-enhanced imaging tools provide non-invasive methods for analyzing various functional changes associated with disease initiation, progression and responses to therapy. The quantitative application of these tools has been hindered by their inability to accurately resolve and characterize targeted tissues due to spatially mixed tissue heterogeneity. The Convex Analysis of Mixtures – Compartment Modeling (CAM-CM) signal deconvolution tool has been developed to automatically identify pure-volume pixels located at the corners of the clustered pixel time series scatter simplex and subsequently estimate tissue-specific pharmacokinetic parameters. CAM-CM can dissect complex tissues into regions with differential tracer kinetics at pixel-wise resolution and provide a systems biology tool for defining imaging signatures predictive of phenotypes. Availability: The MATLAB source code can be downloaded at the authors' website www.cbil.ece.vt.edu/software.htm Contact: yuewang@vt.edu Supplementary information: Supplementary data are available at Bioinformatics online.

Chen, Li; Chan, Tsung-Han; Choyke, Peter L.; Hillman, Elizabeth M. C.; Chi, Chong-Yung; Bhujwalla, Zaver M.; Wang, Ge; Wang, Sean S.; Szabo, Zsolt; Wang, Yue

2011-01-01

7

An L0 sparse analysis prior for blind Poissonian image deconvolution.  

PubMed

This paper proposes a new approach for blindly deconvolving images that are contaminated by Poisson noise. The proposed approach incorporates a new prior, that is the L0 sparse analysis prior, together with the total variation constraint into the maximum a posteriori (MAP) framework for deconvolution. A greedy analysis pursuit numerical scheme is exploited to solve the L0 regularized MAP problem. Experimental results show that our approach not only produces smooth results substantially suppressing artifacts and noise, but also preserves intensity changes sharply. Both quantitative and qualitative comparisons to the specialized state-of-the-art algorithms demonstrate its superiority. PMID:24663705

Gong, Xiaojin; Lai, Baisheng; Xiang, Zhiyu

2014-02-24

8

Experimental analysis and application of sparsity constrained deconvolution  

NASA Astrophysics Data System (ADS)

Sparsity constrained deconvolution can improve the resolution of band-limited seismic data compared to conventional deconvolution. However, such deconvolution methods result in nonunique solutions and suppress weak reflections. The Cauchy function, modified Cauchy function, and Huber function are commonly used constraint criteria in sparse deconvolution. We used numerical experiments to analyze the ability of sparsity constrained deconvolution to restore reflectivity sequences and protect weak reflections under different constraint criteria. The experimental results demonstrate that the performance of sparsity constrained deconvolution depends on the agreement between the constraint criteria and the probability distribution of the reflectivity sequences; furthermore, the modified Cauchy-constrained criterion protects the weak reflections better than the other criteria. Based on the model experiments, the probability distribution of the reflectivity sequences of carbonate and clastic formations is statistically analyzed by using well-logging data, and then the modified Cauchy-constrained deconvolution is applied to real seismic data, much improving the resolution.

Li, Guo-Fa; Qin, De-Hai; Peng, Geng-Xin; Yue, Ying; Zhai, Tong-Li

2013-06-01

9

Comparative analysis of UWB deconvolution and feature-extraction algorithms for GPR landmine detection  

Microsoft Academic Search

In this work we developed target recognition algorithms for landmine detection with ultra-wideband ground penetrating radar (UWB GPR). Due to non-stationarity of UWB signals their processing requires advanced techniques, namely regularized deconvolution, time-frequency or time-scale analysis. We use deconvolution to remove GPR and soil characteristics from the received signals. An efficient algorithm of deconvolution, based on a regularized Wiener inverse

Timofei G. Savelyev; Motoyuki Sato

2004-01-01

10

Richardson-Lucy deconvolution as a general tool for combining images with complementary strengths.  

PubMed

We use Richardson-Lucy (RL) deconvolution to combine multiple images of a simulated object into a single image in the context of modern fluorescence microscopy techniques. RL deconvolution can merge images with very different point-spread functions, such as in multiview light-sheet microscopes [1,2], while preserving the best resolution information present in each image. We show that RL deconvolution is also easily applied to merge high-resolution, high-noise images with low-resolution, low-noise images, relevant when complementing conventional microscopy with localization microscopy. We also use RL deconvolution to merge images produced by different simulated illumination patterns, relevant to structured illumination microscopy (SIM) [3,4] and image scanning microscopy (ISM). The quality of our ISM reconstructions is at least as good as reconstructions using standard inversion algorithms for ISM data, but our method follows a simpler recipe that requires no mathematical insight. Finally, we apply RL deconvolution to merge a series of ten images with varying signal and resolution levels. This combination is relevant to gated stimulated-emission depletion (STED) microscopy, and shows that merges of high-quality images are possible even in cases for which a non-iterative inversion algorithm is unknown. PMID:24436314
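For reference, a minimal 1-D Richardson-Lucy iteration is sketched below; applying the same multiplicative update with each view's PSF in turn is one plausible way to merge views, stated here as an assumption rather than the paper's exact recipe.

```python
# Minimal 1-D Richardson-Lucy deconvolution sketch (synthetic inputs assumed;
# the PSF is assumed to be normalized to sum to 1).
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iters=50):
    est = np.full_like(observed, observed.mean(), dtype=float)
    psf_flip = psf[::-1]
    for _ in range(iters):
        blurred = fftconvolve(est, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # guard against divide-by-zero
        est *= fftconvolve(ratio, psf_flip, mode="same")
    return est
```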

Ingaramo, Maria; York, Andrew G; Hoogendoorn, Eelco; Postma, Marten; Shroff, Hari; Patterson, George H

2014-03-17

11

Comparative analysis of UWB deconvolution and feature-extraction algorithms for GPR landmine detection  

NASA Astrophysics Data System (ADS)

In this work we developed target recognition algorithms for landmine detection with ultra-wideband ground penetrating radar (UWB GPR). Due to the non-stationarity of UWB signals, their processing requires advanced techniques, namely regularized deconvolution and time-frequency or time-scale analysis. We use deconvolution to remove GPR and soil characteristics from the received signals. An efficient algorithm of deconvolution, based on a regularized Wiener inverse filter with wavelet noise level estimation, has been developed. The criteria of efficiency were stability of the signal after deconvolution, the difference between the received signal and the back-convolved signal, and computational speed. The novelty of the algorithm is noise level estimation with wavelet decomposition, which defines the noise level separately for any signal, independently of its statistics. The algorithm was compared with an iterative time-domain deconvolution algorithm based on regularization. For target recognition we apply singular value decomposition (SVD) to a time-frequency signal distribution. Here we compare the Wigner transform and continuous wavelet transform (CWT) for discriminant feature selection. The developed algorithms have been checked on data acquired with a stepped-frequency GPR.
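The sketch below combines the two ingredients named here: a regularized Wiener inverse filter and a wavelet-based noise level estimate (MAD of the finest-scale detail coefficients). The regularization form, wavelet choice, and reliance on the PyWavelets package are assumptions, not the authors' implementation.

```python
# Hedged sketch: Wiener-type inverse filtering of a received GPR trace against
# a reference waveform, with the noise variance estimated from the finest-scale
# wavelet detail coefficients (sigma = MAD / 0.6745).
import numpy as np
import pywt

def wiener_deconvolve(received, reference, wavelet="db4"):
    _, detail = pywt.dwt(received, wavelet)                 # finest-scale details
    sigma2 = (np.median(np.abs(detail)) / 0.6745) ** 2      # robust noise variance

    n = len(received)
    Y, H = np.fft.rfft(received, n), np.fft.rfft(reference, n)
    W = np.conj(H) / (np.abs(H) ** 2 + n * sigma2)          # regularized inverse filter
    return np.fft.irfft(Y * W, n)
```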

Savelyev, Timofei G.; Sato, Motoyuki

2004-09-01

12

Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research  

NASA Astrophysics Data System (ADS)

During the last decade stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers caused by the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. The induced change in the natural isotopic composition of an element allows, among other things, for studying the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag of particular biological samples. Owing to the broad potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, e.g. because Sr is present in significant amounts in natural samples. In addition, biological systems underlie complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer incorporated into the natural sample matrix, or the degree of impurities and species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer studies is critically discussed, with special emphasis on evaluating different data processing strategies using the example of enriched stable Sr isotopes [1]. The analytical key parameters such as blank (Kr, Sr and Rb), variation of the natural Sr isotopic composition in the sample, mass bias, interferences (Rb) and total combined uncertainty are considered. A full metrological protocol for data processing using IPD is presented, based on data gained during two transgenerational marking studies of fish, in which the transfer of a Sr isotope double spike (84Sr and 86Sr) from female spawners of common carp (Cyprinus carpio L.) and brown trout (Salmo trutta f.f.) [2] to the centre of the otoliths of their offspring was studied by (LA)-MC-ICP-MS. [1] J. Irrgeher, A. Zitek, M. Cervicek and T. Prohaska, J. Anal. At. Spectrom., 2014, 29, 193-200. [2] A. Zitek, J. Irrgeher, M. Kletzl, T. Weismann and T. Prohaska, Fish. Manage. Ecol., 2013, 20, 654-361.
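A minimal sketch of the multiple-linear-regression idea behind IPD follows: the measured Sr isotope pattern is expressed as a linear combination of a natural pattern and two enriched spikes, and the molar fractions are recovered by least squares. The natural abundances are approximate textbook values, while the spike compositions and mixing fractions are invented for illustration.

```python
# Hedged sketch of isotope pattern deconvolution (IPD) by multiple linear
# regression for the Sr system; all abundance vectors are illustrative.
import numpy as np

isotopes = ["84Sr", "86Sr", "87Sr", "88Sr"]
natural = np.array([0.0056, 0.0986, 0.0700, 0.8258])   # approximate natural abundances
spike84 = np.array([0.96,   0.02,   0.01,   0.01])     # assumed enriched-84Sr spike
spike86 = np.array([0.01,   0.96,   0.01,   0.02])     # assumed enriched-86Sr spike

A = np.column_stack([natural, spike84, spike86])
measured = 0.7 * natural + 0.2 * spike84 + 0.1 * spike86   # synthetic mixed pattern

fractions, *_ = np.linalg.lstsq(A, measured, rcond=None)   # molar fractions per source
print(dict(zip(["natural", "84Sr spike", "86Sr spike"], fractions.round(3))))
```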

Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

2014-05-01

13

GlowFit—a new tool for thermoluminescence glow-curve deconvolution  

Microsoft Academic Search

A new computer program, GlowFit, for deconvoluting first-order kinetics thermoluminescence (TL) glow-curves has been developed. A non-linear function describing a single glow-peak is fitted to experimental points using the least-squares Levenberg–Marquardt method. The main advantage of GlowFit is its ability to resolve complex TL glow-curves consisting of strongly overlapping peaks, such as those observed in heavily-doped LiF:Mg,Ti (MTT) detectors. This

M. Puchalska; P. Bilski

2006-01-01

14

Objective determination of the water level in frequency-domain deconvolution for receiver function analysis  

NASA Astrophysics Data System (ADS)

Deconvolution is the central operation carried out in teleseismic receiver function (RF) analysis. It transforms the recorded teleseismic signal into the Earth's impulse response by effectively removing the source and instrument responses from this signal. The operation can be carried out either in the time domain or in the frequency domain. Time-domain deconvolution is generally more computationally intensive, but it allows for automatic convergence towards a stable solution (i.e., an RF devoid of ringing) for noisy data. Frequency-domain deconvolution is faster to compute, but it often requires user input to find the optimal regularization/water-level parameter that yields a stable solution. In this study, we investigate ways to objectively determine the optimal water level parameter for frequency-domain deconvolution of teleseismic RFs. Using synthetic and field data, we compare various optimization schemes with L-curves that provide a tradeoff between the root-mean-square error, L2-norm, signal sparseness and spectral flatness of the computed RF. We find that maximising the spectral flatness of the computed RF is the best way to find the optimum water level. Applications to field data from central and northern Norway illustrate the viability of this objective optimization scheme. The resulting RF profiles show clear signals from the Moho (with relief associated with the central Scandes) as well as from the 410 and 660 km-discontinuities below Norway.
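One way to automate the choice, sketched below under assumed inputs, is to sweep candidate water levels and keep the one whose deconvolved trace has the highest spectral flatness (geometric over arithmetic mean of its power spectrum). The candidate grid and the omission of any low-pass filtering are simplifications of the scheme described here.

```python
# Sketch: pick the water level that maximizes spectral flatness of the
# resulting receiver function. Candidate grid and inputs are assumptions.
import numpy as np

def spectral_flatness(x):
    p = np.abs(np.fft.rfft(x)) ** 2 + 1e-20
    return np.exp(np.mean(np.log(p))) / np.mean(p)      # geometric / arithmetic mean

def best_water_level(r, z, candidates=np.logspace(-4, -0.5, 30)):
    R, Z = np.fft.rfft(r), np.fft.rfft(z)
    power = (Z * np.conj(Z)).real
    best, best_flat = candidates[0], -np.inf
    for c in candidates:
        rf = np.fft.irfft(R * np.conj(Z) / np.maximum(power, c * power.max()), len(r))
        flat = spectral_flatness(rf)
        if flat > best_flat:
            best, best_flat = c, flat
    return best
```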

Halpaap, Felix; Spieker, Kathrin; Rondenay, Stéphane

2014-05-01

15

Deconvolution analysis in radionuclide quantitation of left-to-right cardiac shunts.  

PubMed

A poor bolus injection results in an unsatisfactory quantitative radionuclide angiocardiogram in as many as 20% of children with possible left-to-right (L-R) cardiac shunts. Deconvolution analysis was applied to similar studies in experimental animals to determine whether dependence on the input bolus could be minimized. Repeated good-bolus, prolonged (greater than 2.5 sec), or multiple-peak injections were made in four normal dogs and seven dogs with surgically created atrial septal defects (ASD). QP/QS was determined using the gamma function. The mean QP/QS from ten good-bolus studies in each animal was used as the standard for comparison. In five trials in normal animals, where a prolonged or double-peak bolus led to a shunt calculation (QP/QS greater than 1.2:1), deconvolution resulted in QP/QS = 1.0. Deconvolution improved shunt quantitation in eight of ten trials in animals that received a prolonged bolus. The correlation between the reference QP/QS and the QP/QS calculated from uncorrected bad-bolus studies was only 0.39 (p greater than 0.20). After deconvolution using a low-pass filter, the correlation improved significantly (r = 0.77, p less than 0.01). The technique gave inconsistent results with multiple-peak bolus injections. Deconvolution analysis in these studies is useful in preventing normals from being classified as shunts, and in improving shunt quantitation after a prolonged bolus. Clinical testing of this technique in children with suspected L-R shunts seems warranted. PMID:536823

Alderson, P O; Douglass, K H; Mendenhall, K G; Guadiani, V A; Watson, D C; Links, J M; Wagner, H N

1979-06-01

16

Undercomplete Blind Subspace Deconvolution  

Microsoft Academic Search

We introduce the blind subspace deconvolution (BSSD) problem, which is the extension of both the blind source deconvolution (BSD) and the independent subspace analysis (ISA) tasks. We examine the case of the undercomplete BSSD (uBSSD). Applying temporal concatenation we reduce this problem to ISA. The associated `high dimensional' ISA problem can be handled by a recent technique called joint f-decorrelation

Zoltan Szabo; Barnabas Poczos; Andras Lorincz

2007-01-01

17

Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function  

SciTech Connect

A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog ((99m)Tc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided for by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing are able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.

Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.

1987-06-01

18

Quantitative analysis of nucleic acids, proteins, and viruses by Raman band deconvolution.  

PubMed

A constrained, iterative Fourier deconvolution method is employed to enhance the resolution of Raman spectra of biological molecules for quantitative assessment of macromolecular secondary structures and hydrogen isotope exchange kinetics. In an application to the Pf1 filamentous bacterial virus, it is shown that the Raman amide I band contains no component other than that due to alpha-helix, indicating the virtual 100% helicity of coat proteins in the native virion. Comparative analysis of the amide I band of six filamentous phages (fd, If1, IKe, Pf1, Xf, and Pf3), all at the same experimental conditions, indicates that the subunit helix-percentage ranges from a high of 100% in Pf1 to a low of 71% in Xf. Deconvolution of amide I of Pf3 at elevated temperatures, for which an alpha-to-beta transition was previously reported (Thomas, G. J., Jr., and L. A. Day, 1981, Proc. Natl. Acad. Sci. USA., 78:2962-2966), allows quantitative evaluation of the contributions of both alpha-helix and beta-strand conformations to the structure of the thermally perturbed viral coat protein. Weak Raman lines of viral DNA bases and coat protein side chains, which are poorly resolved instrumentally, are also distinguished for all viruses by the deconvolution procedure. Application to the carbon-8 hydrogen isotope exchange reaction of a purine constituent of transfer RNA permits accurate determination of the exchange rate constant, which is in agreement with calculations based upon curve-fitting methods. PMID:6083811

Thomas, G J; Agard, D A

1984-12-01

19

A further analysis for the minimum-variance deconvolution filter performance  

NASA Technical Reports Server (NTRS)

Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.
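The limiting behavior described here can be illustrated numerically with a frequency-domain Wiener-type deconvolution filter standing in for the state-space MVD filter; the wavelet, signal spectrum, and SNR values below are assumptions for the demonstration.

```python
# Numerical illustration only: a Wiener-type filter approaches the inverse
# filter 1/H at high SNR and a scaled matched filter conj(H) at low SNR.
import numpy as np

h = np.array([1.0, -0.6, 0.2])              # assumed wavelet (channel response)
H = np.fft.rfft(h, 256)
Sx = 1.0                                     # white reflectivity spectrum (assumed)

for snr in (1e6, 1e-6):                      # SNR -> infinity, SNR -> 0
    Sn = Sx / snr
    F = np.conj(H) * Sx / (np.abs(H) ** 2 * Sx + Sn)
    if snr > 1:
        # high SNR: F approaches the inverse filter 1/H
        print("max |F - 1/H| :", np.max(np.abs(F - 1.0 / H)))
    else:
        # low SNR: F approaches a scaled matched filter conj(H)*Sx/Sn
        print("max |F - conj(H)*Sx/Sn| :", np.max(np.abs(F - np.conj(H) * Sx / Sn)))
```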

Chi, Chong-Yung

1987-01-01

20

Fully automated deconvolution method for on-line analysis of time-resolved fluorescence spectroscopy data based on an iterative Laguerre expansion technique  

NASA Astrophysics Data System (ADS)

Time-resolved fluorescence spectroscopy (TRFS) is a powerful analytical tool for quantifying the biochemical composition of organic and inorganic materials. The potential of TRFS for tissue diagnosis has been recently demonstrated. To facilitate the translation of TRFS to the clinical arena, algorithms for online TRFS data analysis are essential. A fast model-free TRFS deconvolution algorithm based on the Laguerre expansion method has previously been introduced. One limitation of this method, however, is the need to heuristically select two parameters that are crucial for the accurate estimation of the fluorescence decay: the Laguerre parameter α and the expansion order. Here, a new implementation of the Laguerre deconvolution method is introduced, in which a nonlinear least-squares optimization of the Laguerre parameter α is performed, and the optimal expansion order is selected based on a minimum description length (MDL) criterion. In addition, estimation of the zero-time delay between the recorded instrument response and fluorescence decay is also performed based on a normalized mean square error (NMSE) criterion. The method is validated on experimental data from fluorescence lifetime standards, endogenous tissue fluorophores, and human tissue. The proposed automated Laguerre deconvolution method will facilitate online applications of TRFS, such as real-time clinical tissue diagnosis.

Dabir, Aditi S.; Trivedi, Chintan A.; Ryu, Yeontack; Pande, Paritosh; Jo, Javier A.

2009-03-01

21

Finite element error analysis of a zeroth order approximate deconvolution model based on a mixed formulation  

NASA Astrophysics Data System (ADS)

A suitable discretization for the Zeroth Order Model in Large Eddy Simulation of turbulent flows is sought. This is a low order model, but its importance lies in the insight that it provides for the analysis of the higher order models actually used in practice by the pioneers Stolz and Adams [N.A. Adams, S. Stolz, On the approximate deconvolution procedure for LES, Phys. Fluids 2 (1999) 1699-1701; N.A. Adams, S. Stolz, Deconvolution methods for subgrid-scale approximation in large eddy simulation, in: B.J. Geurts (Ed.), Modern Simul. Strategies for Turbulent Flow, Edwards, Philadelphia, 2001, pp. 21-44] and others. The higher order models have proven to be of high accuracy. However, stable discretizations of them have proven to be tricky and other stabilizations, such as time relaxation and eddy viscosity, are often added. We propose a discretization based on a mixed variational formulation that gives the correct energy balance. We show it to be unconditionally stable and prove convergence.

Manica, Carolina Cardoso; Merdan, Songul Kaya

2007-07-01

22

Application of automated mass spectrometry deconvolution and identification software for pesticide analysis in surface waters.  

PubMed

A new approach to surface water analysis has been investigated in order to enhance the detection of different organic contaminants in Nathan Creek, British Columbia. Water samples from Nathan Creek were prepared by liquid/liquid extraction using dichloromethane (DCM) as an extraction solvent and analyzed by a gas chromatography-mass spectrometry method in scan mode (GC-MS scan). To increase sensitivity for pesticide detection, acquired scan data were further analyzed by the Automated Mass Spectrometry Deconvolution and Identification Software (AMDIS) incorporated into the Agilent Deconvolution Reporting Software (DRS), which also includes mass spectral libraries for 567 pesticides. Extracts were reanalyzed by gas chromatography-mass spectrometry single ion monitoring (GC-MS-SIM) to confirm and quantitate the detected pesticides. The pesticides atrazine, dimethoate, diazinon, metalaxyl, myclobutanil, napropamide, oxadiazon, propazine and simazine were detected at three sampling sites on the mainstream of the Nathan Creek. Results of the study are further discussed in terms of detectivity and identification level for each pesticide found. The proposed approach of monitoring pesticides in surface waters enables their detection and identification at trace levels. PMID:17090491

Furtula, Vesna; Derksen, George; Colodey, Alan

2006-01-01

23

Performance Analysis Tools  

NSDL National Science Digital Library

A variety of profiling and execution analysis tools exist for both serial and parallel programs. They range widely in usefulness and complexity. Most of the more sophisticated and useful tools have a learning curve associated with them and would deserve a full day tutorial themselves. The purpose of this tutorial is to briefly review a range of performance analysis tools, and to provide pointers for more information to many of these tools. Although a number of the tools reviewed are cross-platform, the emphasis of this tutorial is their usage on the IBM SP platform.

Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Escale, Francesc; Lipari, Don; Team, Intel T.

24

Hurricane Data Analysis Tool  

NSDL National Science Digital Library

The Hurricane Data Analysis Tool (formerly the TRMM QuikScat Analysis tool) allows users to overlay various data products relevant in the study of hurricanes in either an area plot or a time plot using an interactive tool. The data products being offered include TRMM's product 3B42, TMI's sea surface temperature, NCEP Reanalysis sea level pressure, QuikScat's wind and global Merged IR product. This tool is beneficial for users to obtain a visualization of a single product, animation or a comparison of two products during a hurricane event.

2008-01-01

25

Band composition analysis: a new procedure for deconvolution of the mass spectra of organometallic compounds.  

PubMed

A new chemometric procedure called band composition analysis (BCA) designed for the deconvolution of mass spectra of organometallics is proposed. BCA generates theoretical bands T_i, then combines them to obtain a model band M, which is finally compared with the experimental band E. All of these steps are realized with computer assistance. This modeling yields four parameters characterizing the experimental band: theoretical and model variances s²(theor) and s²(model), a fit factor α and a contribution x_i from the theoretical band. If s²(theor) > 20 the band is deemed complex and needs modeling. The values α > 90 indicate that there is good agreement between the experimental and model bands. BCA is particularly effective for the modeling of complex isotopic bands often present in organometallics. Two illustrations of BCA, for tetrabutyltin, C16H36Sn, and 1,1',2,2',3,3'-hexachloroferrocene, C10H4Cl6Fe, are shown. PMID:12938102

Szymura, Jacek A; Lamkiewicz, Jan

2003-08-01

26

ATAMM analysis tool  

NASA Technical Reports Server (NTRS)

Diagnostics software for analyzing Algorithm to Architecture Mapping Model (ATAMM) based concurrent processing systems is presented. ATAMM is capable of modeling the execution of large grain algorithms on distributed data flow architectures. The tool graphically displays algorithm activities and processor activities for evaluation of the behavior and performance of an ATAMM based system. The tool's measurement capabilities indicate computing speed, throughput, concurrency, resource utilization, and overhead. Evaluations are performed on a simulated system using the software tool. The tool is used to estimate theoretical lower bound performance. Analysis results are shown to be comparable to the predictions.

Jones, Robert; Stoughton, John; Mielke, Roland

1991-01-01

27

State Analysis Database Tool  

NASA Technical Reports Server (NTRS)

The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

Rasmussen, Robert; Bennett, Matthew

2006-01-01

28

Analysis of force-deconvolution methods in frequency-modulation atomic force microscopy  

PubMed Central

Summary In frequency-modulation atomic force microscopy the direct observable is the frequency shift of an oscillating cantilever in a force field. This frequency shift is not a direct measure of the actual force, and thus, to obtain the force, deconvolution methods are necessary. Two prominent methods proposed by Sader and Jarvis (Sader–Jarvis method) and Giessibl (matrix method) are investigated with respect to the deconvolution quality. Both methods show a nontrivial dependence of the deconvolution quality on the oscillation amplitude. The matrix method exhibits spikelike features originating from a numerical artifact. By interpolation of the data, the spikelike features can be circumvented. The Sader–Jarvis method has a continuous amplitude dependence showing two minima and one maximum, which is an inherent property of the deconvolution algorithm. The optimal deconvolution depends on the ratio of the amplitude and the characteristic decay length of the force for the Sader–Jarvis method. However, the matrix method generally provides the higher deconvolution quality.

Illek, Esther; Giessibl, Franz J

2012-01-01

29

Extended Testability Analysis Tool  

NASA Technical Reports Server (NTRS)

The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

Melcher, Kevin; Maul, William A.; Fulton, Christopher

2012-01-01

30

Data enhancement and analysis through mathematical deconvolution of signals from scientific measuring instruments  

NASA Technical Reports Server (NTRS)

Mathematical deconvolution of digitized analog signals from scientific measuring instruments is shown to be a means of extracting important information which is otherwise hidden due to time-constant and other broadening or distortion effects caused by the experiment. Three different approaches to deconvolution and their subsequent application to recorded data from three analytical instruments are considered. To demonstrate the efficacy of deconvolution, the use of these approaches to solve the convolution integral for the gas chromatograph, magnetic mass spectrometer, and the time-of-flight mass spectrometer are described. Other possible applications of these types of numerical treatment of data to yield superior results from analog signals of the physical parameters normally measured in aerospace simulation facilities are suggested and briefly discussed.

Wood, G. M.; Rayborn, G. H.; Ioup, J. W.; Ioup, G. E.; Upchurch, B. T.; Howard, S. J.

1981-01-01

31

Analysis of a deconvolution-based information retrieval algorithm in X-ray grating-based phase-contrast imaging  

NASA Astrophysics Data System (ADS)

Grating-based X-ray phase-contrast imaging is a promising imaging modality to increase soft tissue contrast in comparison to conventional attenuation-based radiography. Complementary and otherwise inaccessible information is provided by the dark-field image, which shows the sub-pixel size granularity of the measured object. This could especially turn out to be useful in mammography, where tumourous tissue is connected with the presence of supertiny microcalcifications. In addition to the well-established image reconstruction process, an analysis method was introduced by Modregger [1], which is based on deconvolution of the underlying scattering distribution within a single pixel revealing information about the sample. Subsequently, the different contrast modalities can be calculated with the scattering distribution. The method already proved to deliver additional information in the higher moments of the scattering distribution and possibly reaches better image quality with respect to an increased contrast-to-noise ratio. Several measurements were carried out using melamine foams as phantoms. We analysed the dependency of the deconvolution-based method with respect to the dark-field image on different parameters such as dose, number of iterations of the iterative deconvolution algorithm and dark-field signal. A disagreement was found in the reconstructed dark-field values between the FFT method and the iterative method. Usage of the resulting characteristics might be helpful in future applications.

Horn, Florian; Bayer, Florian; Pelzer, Georg; Rieger, Jens; Ritter, André; Weber, Thomas; Zang, Andrea; Michel, Thilo; Anton, Gisela

2014-03-01

32

Swift Science Analysis Tools  

NASA Astrophysics Data System (ADS)

Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site (http://swiftsc.gsfc.nasa.gov), and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as it is processed in the Swift Data Center (SDC). Once all the data for an observation is available, the data will be transferred to the HEASARC and data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.

Marshall, F. E.; Swift Team

2003-05-01

33

ASC Data Analysis Tool Architecture  

NASA Astrophysics Data System (ADS)

The AXAF Science Center (ASC) is using an "open architecture" approach to develop a set of data analysis tools. The tool architecture is designed to allow the same tools to be instantiated in a variety of environments and via a variety of control mechanisms. This modular tool design allows the same tools to be used in all phases of ASC data analysis: automated pipeline processing, user-interactive command-line interaction and GUI driven analysis.

Conroy, M.; Doe, S.; Herrero, J.

34

A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis  

Microsoft Academic Search

DNA methylation is an indispensable epigenetic modification required for regulating the expression of mammalian genomes. Immunoprecipitation-based methods for DNA methylome analysis are rapidly shifting the bottleneck in this field from data generation to data analysis, necessitating the development of better analytical tools. In particular, an inability to estimate absolute methylation levels remains a major analytical difficulty associated with immunoprecipitation-based DNA

Daniel J Turner; Paul Flicek; Heng Li; Eugene Kulesha; Stefan Gräf; Nathan Johnson; Javier Herrero; Eleni M Tomazou; Natalie P Thorne; Liselotte Bäckdahl; Marlis Herberth; Kevin L Howe; David K Jackson; Marcos M Miretti; John C Marioni; Ewan Birney; Tim J P Hubbard; Richard Durbin; Simon Tavaré; Thomas A Down; Vardhman K Rakyan; Stephan Beck

2008-01-01

35

Neutron multiplicity analysis tool  

SciTech Connect

I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This program was developed to help speed the analysis of Monte Carlo neutron transport simulation (MCNP) data, and only requires the count-rate data to calculate the mass of material using INCC's analysis methods instead of the full neutron multiplicity distribution required to run analysis in INCC. This paper describes what is implemented within EXCOM, including the methods used, how the program corrects for deadtime, and how uncertainty is calculated. This paper also describes how to use EXCOM within Excel.

Stewart, Scott L. [Los Alamos National Laboratory]

2010-01-01

36

Geodetic Strain Analysis Tool  

NASA Technical Reports Server (NTRS)

A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of strain fields from the GPS and strain meter data that sample them, to aid understanding of the strengths and weaknesses of strain field derivation from continuous GPS (CGPS) and other geodetic data in a variety of tectonic settings, to converge on the "best practices" strain derivation strategy for the Solid Earth Science ESDR System (SESES) project given the CGPS station distribution in the western U.S., and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
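A small sketch of the four data products named above, computed from gridded horizontal velocities with standard 2-D infinitesimal strain formulas; the function name, grid layout, and use of finite differences are assumptions and are not taken from the tool itself.

```python
# Illustrative computation of dilatation, maximum shear, shear angle and
# principal strain components from gridded velocities vx, vy (assumed inputs).
import numpy as np

def strain_products(vx, vy, dx, dy):
    dvx_dy, dvx_dx = np.gradient(vx, dy, dx)      # axis 0: y spacing, axis 1: x spacing
    dvy_dy, dvy_dx = np.gradient(vy, dy, dx)

    exx, eyy = dvx_dx, dvy_dy
    exy = 0.5 * (dvx_dy + dvy_dx)                 # symmetric strain-rate component

    dilatation = exx + eyy
    max_shear = np.sqrt(((exx - eyy) / 2) ** 2 + exy ** 2)
    # orientation of the principal axis, one common definition of the shear angle
    shear_angle = 0.5 * np.degrees(np.arctan2(2 * exy, exx - eyy))
    principal = np.stack([(exx + eyy) / 2 + max_shear,
                          (exx + eyy) / 2 - max_shear])
    return dilatation, max_shear, shear_angle, principal
```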

Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan

2011-01-01

37

Testing of reliability - Analysis tools  

NASA Technical Reports Server (NTRS)

An outline is presented of issues raised in verifying the accuracy of reliability analysis tools. State-of-the-art reliability analysis tools implement various decomposition, aggregation, and estimation techniques to compute the reliability of a diversity of complex fault-tolerant computer systems. However, no formal methodology has been formulated for validating the reliability estimates produced by these tools. The author presents three stages of testing that can be performed on most reliability analysis tools to effectively increase confidence in a tool. These testing stages were applied to the SURE (Semi-Markov Unreliability Range Evaluator) reliability analysis tool, and the results of the testing are discussed.

Hayhurst, Kelly J.

1989-01-01

38

Comparison of several algorithms for blind deconvolution: analysis of noise effects  

NASA Astrophysics Data System (ADS)

The object of this communication is to compare two inversion algorithms in their application to the blind deconvolution problem. After a brief summary of the previous works in this field, we describe the Richardson-Lucy and the steepest descent algorithms and we introduce these methods in the basic error reduction algorithm of Ayers and Dainty. These algorithms are compared when used for blind deconvolution of simulated binary objects convolved by a point spread function and corrupted by a Gaussian additive noise. We consider the effects of the noise level on the reconstruction error, together with the effects of the algorithmic parameters (inner and outer iteration numbers). Particular effects occurring during the reconstruction process are also shown.

Lanteri, Henri; Barilli, Marco; Beaumont, Hubert; Aime, Claude; Gaucherel, Philippe; Touma, H.

1995-12-01

39

Draper Station Analysis Tool  

NASA Technical Reports Server (NTRS)

Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

2011-01-01

40

Portfolio Analysis Tool  

NASA Technical Reports Server (NTRS)

Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but do not have access for modifying criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without need to rewrite computer code or to rehire experts, and thereby further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

2005-01-01

41

Hurricane Data Analysis Tool  

NASA Technical Reports Server (NTRS)

In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. They are globally-merged pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convective systems, etc. Basic functions include selection of area of interest and time, single imagery, overlay of two different products, animation, a time-skip capability and different image size outputs. Users can save an animation as a file (animated gif) and import it in other presentation software, such as Microsoft PowerPoint. Since the tool can directly access the real data, more features and functionality can be added in the future.

Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

2011-01-01

42

Functional speciation of metal-dissolved organic matter complexes by size exclusion chromatography coupled to inductively coupled plasma mass spectrometry and deconvolution analysis  

Microsoft Academic Search

High performance size exclusion chromatography coupled to inductively coupled plasma mass spectrometry (HP-SEC–ICP-MS), in combination with deconvolution analysis, has been used to obtain multielemental qualitative and quantitative information about the distributions of metal complexes with different forms of natural dissolved organic matter (DOM). High performance size exclusion chromatography coupled to inductively coupled plasma mass spectrometry chromatograms only provide continuous distributions

Francisco Laborda; Sergio Ruiz-Beguería; Eduardo Bolea; Juan R. Castillo

2009-01-01

43

FSSC Science Tools: Pulsar Analysis  

NASA Technical Reports Server (NTRS)

This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

Thompson, Dave

2010-01-01

44

General Mission Analysis Tool (GMAT).  

National Technical Information Service (NTIS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology developme...

S. P. Hughes

2007-01-01

45

A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis  

PubMed Central

DNA methylation is an indispensable epigenetic modification of mammalian genomes. Consequently there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation.

Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Graf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Backdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavare, Simon; Beck, Stephan

2009-01-01

46

A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis.  

PubMed

DNA methylation is an indispensable epigenetic modification required for regulating the expression of mammalian genomes. Immunoprecipitation-based methods for DNA methylome analysis are rapidly shifting the bottleneck in this field from data generation to data analysis, necessitating the development of better analytical tools. In particular, an inability to estimate absolute methylation levels remains a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling. To address this issue, we developed a cross-platform algorithm-Bayesian tool for methylation analysis (Batman)-for analyzing methylated DNA immunoprecipitation (MeDIP) profiles generated using oligonucleotide arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). We developed the latter approach to provide a high-resolution whole-genome DNA methylation profile (DNA methylome) of a mammalian genome. Strong correlation of our data, obtained using mature human spermatozoa, with those obtained using bisulfite sequencing suggests that combining MeDIP-seq or MeDIP-chip with Batman provides a robust, quantitative and cost-effective functional genomic strategy for elucidating the function of DNA methylation. PMID:18612301

Down, Thomas A; Rakyan, Vardhman K; Turner, Daniel J; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M; Thorne, Natalie P; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L; Jackson, David K; Miretti, Marcos M; Marioni, John C; Birney, Ewan; Hubbard, Tim J P; Durbin, Richard; Tavaré, Simon; Beck, Stephan

2008-07-01

47

A co-operative study on the clinical value of dynamic renal scanning with deconvolution analysis.  

PubMed

An international project was set up to study the clinical usefulness of intrarenal transit times derived from the renogram by deconvolution. A common data sheet, to collect clinical, biochemical, radiological and isotopic information, was completed by the centres. Five hundred and ninety-one patients were studied and the results analysed. The mean transit time (MTT) in normal kidneys was found to be 3.6 +/- 1.1 min. If the MTT is greater than 7.6 min, a kidney is likely to be obstructed. In vesico-ureteric reflux, the transit times are prolonged, but they are normal in infection, hypertension, parenchymal disease and minimally irradiated kidneys. In transplantation, when the kidney is normal, the transit times are shorter than in the natural kidney; in acute rejection, transit times are prolonged. PMID:7049298

Piepsz, A; Ham, H R; Erbsmann, F; Hall, M; Diffey, B L; Goggin, M J; Hall, F M; Miller, J A; Lumbroso, J; Di Paola, R; Bazin, J P; Di Paola, M; Fries, D

1982-06-01

48

LPA1,LPA2. Deconvolution Program  

SciTech Connect

The program is suitable for a wide range of applications in applied mathematics, experimental physics, signal processing, and engineering, e.g., spectrum deconvolution, signal analysis, and system property analysis.

Ping-An, L.; Jiang-Lai, Y. [Beijing Normal University, Beijing (China)]

1991-01-01

49

Marginal Abatement Cost Analysis Tool  

EPA Science Inventory

The Non-CO2 Marginal Abatement Cost Analysis Tool is an extensive bottom-up engineering-economic spreadsheet model capturing the relevant cost and performance data on sectors emitting non-CO2 GHGs. The tool has 24 regions and 7 sectors and produces marginal abatement cost curves...

50

Heliostat cost-analysis tool  

Microsoft Academic Search

A heliostat cost analysis tool (HELCAT) that processes manufacturing, transportation, and installation cost data was developed which provides a consistent structure for cost analyses. The HELCAT calculates a representative product price based on direct input data and various economic, financial, and accounting assumptions. The characteristics of this tool and its initial application in the evaluation of second generation heliostat cost

L. D. Brandt; R. E. Chang

1981-01-01

51

Analysis tools for reliability databases.  

National Technical Information Service (NTIS)

This report outlines the work performed at Risoe, under contract with the Swedish Nuclear Power Inspectorate, with the goal to develop analysis tools for reliability databases, that can suit the information needs of the users of the TUD (Reliability/ Main...

J. Dorrepaal

1996-01-01

52

Automatic Stripe Analysis Tool.  

National Technical Information Service (NTIS)

This report discusses the design and implementation of an automatic stripe analysis application for use in metrology. It has been implemented in Mathworks Matlab scripting environment and wrapped in an easy-to-use graphical user interface (GUI). The algor...

J. R. Bickford

2013-01-01

53

Convolution-deconvolution in DIGES  

SciTech Connect

Convolution and deconvolution operations are by all means a very important aspect of SSI analysis, since they influence the input to the seismic analysis. This paper documents some of the convolution/deconvolution procedures which have been implemented into the DIGES code. The 1-D propagation of shear and dilatational waves in typical layered configurations, involving a stack of layers overlying a rock, is treated by DIGES in a similar fashion to that of available codes, e.g., CARES and SHAKE. For certain configurations, however, there is no need to perform such analyses, since the corresponding solutions can be obtained in analytic form. Typical cases involve deposits which can be modeled by a uniform halfspace or simple layered halfspaces. For such cases DIGES uses closed-form solutions. These solutions are given for one- as well as two-dimensional deconvolution. The types of waves considered include P, SV and SH waves. Non-vertical incidence is given special attention, since deconvolution can be defined differently depending on the problem of interest. For all wave cases considered, corresponding transfer functions are presented in closed form. Transient solutions are obtained in the frequency domain. Finally, a variety of forms are considered for representing the free-field motion, both in terms of deterministic as well as probabilistic representations. These include (a) acceleration time histories, (b) response spectra, (c) Fourier spectra and (d) cross-spectral densities.
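
As a rough illustration of the kind of frequency-domain deconvolution described above, the following Python sketch deconvolves a free-field surface motion down to rock for the simplest closed-form case, a single uniform viscoelastic layer over rigid rock with vertically incident SH waves; the function name and the layer parameters are illustrative assumptions and not part of the DIGES code.

import numpy as np

def deconvolve_to_rock(acc_surface, dt, h=30.0, vs=250.0, xi=0.05):
    """Spectral division of a surface record by the layer transfer function
    (illustrative layer thickness h [m], shear velocity vs [m/s], damping xi)."""
    n = len(acc_surface)
    freqs = np.fft.rfftfreq(n, d=dt)
    vs_star = vs * (1.0 + 1j * xi)              # complex velocity (hysteretic damping)
    k = 2.0 * np.pi * freqs / vs_star           # complex wavenumber
    H = 1.0 / np.cos(k * h)                     # surface-to-rock transfer function
    return np.fft.irfft(np.fft.rfft(acc_surface) / H, n=n)

# usage: rock_motion = deconvolve_to_rock(surface_record, dt=0.01)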

Philippacopoulos, A.J.; Simos, N. [Brookhaven National Lab., Upton, NY (United States). Dept. of Advanced Technology

1995-05-01

54

Deconvolution filtering: Temporal smoothing revisited.  

PubMed

Inferences made from analysis of BOLD data regarding neural processes are potentially confounded by multiple competing sources: cardiac and respiratory signals, thermal effects, scanner drift, and motion-induced signal intensity changes. To address this problem, we propose deconvolution filtering, a process of systematically deconvolving and reconvolving the BOLD signal via the hemodynamic response function such that the resultant signal is composed of maximally likely neural and neurovascular signals. To test the validity of this approach, we compared the accuracy of BOLD signal variants (i.e., unfiltered, deconvolution filtered, band-pass filtered, and optimized band-pass filtered BOLD signals) in identifying useful properties of highly confounded, simulated BOLD data: (1) reconstructing the true, unconfounded BOLD signal, (2) correlation with the true, unconfounded BOLD signal, and (3) reconstructing the true functional connectivity of a three-node neural system. We also tested this approach by detecting task activation in BOLD data recorded from healthy adolescent girls (control) during an emotion processing task. Results for the estimation of functional connectivity of simulated BOLD data demonstrated that analysis (via standard estimation methods) using deconvolution filtered BOLD data achieved superior performance to analysis performed using unfiltered BOLD data and was statistically similar to well-tuned band-pass filtered BOLD data. Contrary to band-pass filtering, however, deconvolution filtering is built upon physiological arguments and has the potential, at low TR, to match the performance of an optimal band-pass filter. The results from task estimation on real BOLD data suggest that deconvolution filtering provides superior or equivalent detection of task activations relative to comparable analyses on unfiltered signals and also provides decreased variance over the estimate. In turn, these results suggest that standard preprocessing of the BOLD signal ignores significant sources of noise that can be effectively removed without damaging the underlying signal. PMID:24768215
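
The following Python sketch illustrates the general deconvolve-then-reconvolve idea described above, using a canonical double-gamma HRF and a Wiener-style regularized spectral division; it is an assumed minimal implementation, not the authors' algorithm, and all parameter values are illustrative.

import numpy as np
from scipy.stats import gamma

def hrf(t, a1=6.0, a2=16.0, ratio=1.0 / 6.0):
    # canonical double-gamma haemodynamic response (illustrative parameters)
    return gamma.pdf(t, a1) - ratio * gamma.pdf(t, a2)

def deconvolution_filter(bold, tr, lam=0.1):
    """Deconvolve a BOLD series by the HRF and reconvolve it again, so that
    only signal the HRF can actually carry survives (regularised division)."""
    n = len(bold)
    H = np.fft.rfft(hrf(np.arange(n) * tr), n)
    B = np.fft.rfft(bold, n)
    neural = B * np.conj(H) / (np.abs(H) ** 2 + lam)   # Wiener-style deconvolution
    return np.fft.irfft(neural * H, n)                 # reconvolution step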

Bush, Keith; Cisler, Josh

2014-07-01

55

VCAT: Visual Crosswalk Analysis Tool  

SciTech Connect

VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

2012-08-31

56

Heliostat cost-analysis tool  

NASA Astrophysics Data System (ADS)

A heliostat cost analysis tool (HELCAT) that processes manufacturing, transportation, and installation cost data was developed which provides a consistent structure for cost analyses. The HELCAT calculates a representative product price based on direct input data and various economic, financial, and accounting assumptions. The characteristics of this tool and its initial application in the evaluation of second generation heliostat cost estimates are discussed. A set of nominal economic and financial parameters is also suggested.

Brandt, L. D.; Chang, R. E.

1981-10-01

57

A new spectral deconvolution - Selected ion monitoring method for the analysis of alkylated polycyclic aromatic hydrocarbons in complex mixtures.  

PubMed

A new gas chromatography/mass spectrometry (GC/MS) method is proffered for the analysis of polycyclic aromatic hydrocarbons (PAH) and their alkylated homologs in complex samples. Recent work elucidated the fragmentation pathways of alkylated PAH, concluding that multiple fragmentation patterns per homolog (MFPPH) are needed to correctly identify all isomers. Programming the MS in selected ion monitoring (SIM) mode to detect homolog-specific MFPPH ions delivers the selectivity and sensitivity that the conventional SIM and/or full-scan mass spectrometry methods fail to provide. New spectral deconvolution software eliminates the practice of assigning alkylated homolog peaks via pattern recognition within laboratory-defined retention windows. Findings show that SIM/molecular-ion detection of C1-C4 PAH, now the standard, yields concentrations that differ from those obtained by SIM/MFPPH by thousands of percent for some homologs. The SIM/MFPPH methodology is also amenable to the analysis of polycyclic aromatic sulfur heterocycles (PASH) and their alkylated homologs, since many PASH have the same m/z ions as those of PAH and, thus, are false positives in SIM/1-ion PAH detection methods. PMID:24840423

Robbat, Albert; Wilton, Nicholas M

2014-07-01

58

Analysis of protein film voltammograms as Michaelis-Menten saturation curves yield the electron cooperativity number for deconvolution.  

PubMed

Deconvolution of protein film voltammetric data by fitting multiple components (sigmoids, derivative peaks) often is ambiguous when features are partially overlapping, due to exchangeability between the width and the number of components. Here, a new method is presented to obtain the width of the components. This is based on the equivalence between the sigmoidal catalytic response as function of electrode potential, and the classical saturation curve obtained for the enzyme activity as function of the soluble substrate concentration, which is also sigmoidal when plotted versus log[S]. Thus, analysis of the catalytic voltammogram with Lineweaver-Burk, Eadie-Hofstee, and Hanes-Woolf plots is feasible. This provides a very sensitive measure of the cooperativity number (Hill coefficient), which for electrons equals the apparent (fractional) number of electrons that determine the width, and thereby the number of components (kinetic phases). This analysis is applied to the electrocatalytic oxygen reduction by Paracoccus denitrificans cytochrome aa(3) (cytochrome c oxidase). Four partially overlapping kinetic phases are observed that (stepwise) increase the catalytic efficiency with increasingly reductive potential. Translated to cell biology, the activity of the terminal oxidase stepwise adapts to metabolic demand for oxidative phosphorylation. PMID:22265100
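
A minimal sketch of the fitting step this analysis implies is shown below: the sigmoidal catalytic wave is fitted as a function of electrode potential with an apparent electron (Hill) cooperativity number n_app that sets the wave width. The function form, constants, and starting values are illustrative assumptions, not the authors' code.

import numpy as np
from scipy.optimize import curve_fit

F, R, T = 96485.0, 8.314, 298.15   # C/mol, J/(mol K), K

def catalytic_wave(E, i_lim, E_half, n_app):
    """Sigmoidal catalytic wave (reduction); fractional n_app values indicate
    overlapping kinetic phases, as discussed in the record above."""
    return i_lim / (1.0 + np.exp(n_app * F / (R * T) * (E - E_half)))

# hypothetical usage on a measured voltammogram (E in volts, i as current magnitude):
# popt, _ = curve_fit(catalytic_wave, E, i, p0=[i.max(), -0.2, 1.0])
# i_lim_fit, E_half_fit, n_app_fit = popt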

Heering, Hendrik A

2012-10-01

59

Data-driven deconvolution  

Microsoft Academic Search

In this paper we study an automatic empirical procedure for density deconvolution based on observations that are contaminated by additive measurement errors from a known distribution. The assumptions placed on the density to be estimated are mild and apart from continuity do not include additional smoothness conditions. The procedure uses a class of deconvoluting kernel estimates and selects the smoothing

Christian H. Hesse

1999-01-01

60

New spectral deconvolution algorithms for the analysis of polycyclic aromatic hydrocarbons and sulfur heterocycles by comprehensive two-dimensional gas chromatography-quadrupole mass spectrometry.  

PubMed

New mass spectral deconvolution algorithms have been developed for comprehensive two-dimensional gas chromatography/quadrupole mass spectrometry (GC × GC/qMS). This paper reports the first use of spectral deconvolution of full scan quadrupole GC × GC/MS data for the quantitative analysis of polycyclic aromatic hydrocarbons (PAH) and polycyclic aromatic sulfur heterocycles (PASH) in coal tar-contaminated soil. A method employing four ions per isomer and multiple fragmentation patterns per alkylated homologue (MFPPH) is used to quantify target compounds. These results are in good agreement with GC/MS concentrations, and an examination of method precision, accuracy, selectivity, and sensitivity is discussed. MFPPH and SIM/1-ion concentration differences are also examined. PMID:24063305

Antle, Patrick M; Zeigler, Christian D; Gankin, Yuriy; Robbat, Albert

2013-11-01

61

Static Analysis Tool Exposition (SATE) IV.  

National Technical Information Service (NTIS)

The NIST Software Assurance Metrics And Tool Evaluation (SAMATE) project conducted the fourth Static Analysis Tool Exposition (SATE IV) to advance research in static analysis tools that find security defects in source code. The main goals of SATE were to ...

A. Delaitre P. E. Black V. Okun

2013-01-01

62

WEAT: Web Enabled Analysis Tool  

NSDL National Science Digital Library

Behavioral Risk Factor Surveillance System: The BRFSS, the world's largest telephone survey, tracks health risks in the United States. Information from the survey is used to improve the health of the American people. This tool allows users to create cross-tabulations and perform logistic analysis on these data.  

Centers for Disease Control and Prevention

63

Heliostat cost-analysis tool  

SciTech Connect

Estimated production costs of solar energy systems serve as guides for future component development and as measures of the potential economic viability of the technologies. The analysis of heliostat costs is particularly important since the heliostat field is the largest cost component of a solar central receiver plant. A heliostat cost analysis tool (HELCAT) that processes manufacturing, transportation, and installation cost data has been developed to provide a consistent structure for cost analyses. HELCAT calculates a representative product price based on direct input data (e.g. direct materials, direct labor, capital requirements) and various economic, financial, and accounting assumptions. The characteristics of this tool and its initial application in the evaluation of second generation heliostat cost estimates are discussed. A set of nominal economic and financial parameters is also suggested.

Brandt, L.D.; Chang, R.E.

1981-10-01

64

A comparison of deconvolution techniques for stress relaxation.  

PubMed

Stress relaxation (or equivalently creep) allows a large range of the relaxation (retardation) spectrum of materials to be examined, particularly at lower frequencies. However, higher frequency components of the relaxation curves (typically of the order of Hertz) are attenuated due to the finite time taken to strain the specimen. This higher frequency information can be recovered by deconvolution of the stress and strain during the loading period. This paper examines the use of three separate deconvolution techniques: numerical (Fourier) deconvolution, semi-analytical deconvolution using a theoretical form of the strain, and deconvolution by a linear approximation method. Both theoretical data (where the exact form of the relaxation function is known) and experimental data were used to assess the accuracy and applicability of the deconvolution methods. All of the deconvolution techniques produced a consistent improvement in the higher frequency data up to the frequencies of the order of Hertz, with the linear approximation method showing better resolution in high-frequency analysis of the theoretical data. When the different deconvolution techniques were applied to experimental data, similar results were found for all three deconvolution techniques. Deconvolution of the stress and strain during loading is a simple and practical method for the recovery of higher frequency data from stress-relaxation experiments. PMID:12413967
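
The numerical (Fourier) deconvolution variant can be sketched in a few lines of Python: the stress spectrum is divided by the strain spectrum recorded during loading, with a small regularization term standing in for the noise filtering discussed above. The function name and the eps value are illustrative assumptions.

import numpy as np

def fourier_deconvolve(stress, strain, dt, eps=1e-3):
    """Recover the relaxation (impulse) response by dividing the stress
    spectrum by the strain spectrum; eps regularises near-zero strain
    components, which would otherwise amplify noise."""
    n = len(stress)
    S = np.fft.rfft(stress, n)
    E = np.fft.rfft(strain, n)
    G = S * np.conj(E) / (np.abs(E) ** 2 + eps * np.abs(E).max() ** 2)
    return np.fft.irfft(G, n) / dt   # approximate impulse response sampled at dt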

Holmes, A D; Lu, W W; Luk, K D K; Leong, J C Y

2002-11-01

65

Common Bolted Joint Analysis Tool  

NASA Technical Reports Server (NTRS)

Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

Imtiaz, Kauser

2011-01-01

66

Dynamic Hurricane Data Analysis Tool  

NASA Technical Reports Server (NTRS)

A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

2009-01-01

67

Deconvolution analysis to determine relaxation time spectra of internal friction peaks  

SciTech Connect

A new method for analysis of an internal friction vs. temperature peak to obtain an approximation of the spectrum of relaxation times responsible for the peak is described. This method, referred to as direct spectrum analysis (DSA), is shown to provide an accurate estimate of the distribution of relaxation times. The method is validated for various spectra, and it is shown that: (1) it provides approximations to known input spectra which replicate the position, amplitude, width and shape with good accuracy (typically 10%); (2) it does not yield approximations which have false spectral peaks.

Cost, J.R.

1985-01-01

68

Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols  

EPA Science Inventory

In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...

69

EPR spectrum deconvolution and dose assessment of fossil tooth enamel using maximum likelihood common factor analysis  

Microsoft Academic Search

In order to determine the components which give rise to the EPR spectrum around g = 2, we have applied Maximum Likelihood Common Factor Analysis (MLCFA) to the EPR spectra of enamel sample 1126, which has previously been analysed by continuous wave and pulsed EPR as well as EPR microscopy. MLCFA yielded consistent results on three sets of X-band spectra

G. Vanhaelewyn; F. Callens; R. Grün

2000-01-01

70

Behavior based network traffic analysis tool  

Microsoft Academic Search

Pattern matching systems are mainly based on network models, which are formed from detailed analysis of user statistics and network traffic. These models are used in developing traffic analysis tools. This paper focuses on the development of a behavior analysis tool for any operating system and its use in detecting internal active/passive attacks. Many kinds of tools and firewalls are in

Sindhu Kakuru

2011-01-01

71

General Mission Analysis Tool (GMAT)  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

Hughes, Steven P.

2007-01-01

72

RADC SCAT automated sneak circuit analysis tool  

Microsoft Academic Search

The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst so that prior experience with sneak analysis is not necessary for performance. Both sneak circuits and design concerns are targeted by this tool, with both digital

E. L. DePalma

1990-01-01

73

System analysis: Developing tools for the future  

SciTech Connect

This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools described in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

1996-02-01

74

Wavespace-Based Coherent Deconvolution  

NASA Technical Reports Server (NTRS)

Array deconvolution is commonly used in aeroacoustic analysis to remove the influence of a microphone array's point spread function from a conventional beamforming map. Unfortunately, the majority of deconvolution algorithms assume that the acoustic sources in a measurement are incoherent, which can be problematic for some aeroacoustic phenomena with coherent, spatially-distributed characteristics. While several algorithms have been proposed to handle coherent sources, some are computationally intractable for many problems while others require restrictive assumptions about the source field. Newer generalized inverse techniques hold promise, but are still under investigation for general use. An alternate coherent deconvolution method is proposed based on a wavespace transformation of the array data. Wavespace analysis offers advantages over curved-wave array processing, such as providing an explicit shift-invariance in the convolution of the array sampling function with the acoustic wave field. However, usage of the wavespace transformation assumes the acoustic wave field is accurately approximated as a superposition of plane wave fields, regardless of true wavefront curvature. The wavespace technique leverages Fourier transforms to quickly evaluate a shift-invariant convolution. The method is derived for and applied to ideal incoherent and coherent plane wave fields to demonstrate its ability to determine magnitude and relative phase of multiple coherent sources. Multi-scale processing is explored as a means of accelerating solution convergence. A case with a spherical wave front is evaluated. Finally, a trailing edge noise experiment case is considered. Results show the method successfully deconvolves incoherent, partially-coherent, and coherent plane wave fields to a degree necessary for quantitative evaluation. Curved wave front cases warrant further investigation. A potential extension to nearfield beamforming is proposed.
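
The shift-invariance that motivates the wavespace formulation can be illustrated with a short Python sketch in which the array point-spread function is applied to a wavespace map by an FFT-evaluated convolution; this is an assumed illustration of the property, not the authors' implementation.

import numpy as np

def apply_psf(wavespace_map, psf):
    """Apply the array point-spread function to a wavespace source map by a
    shift-invariant (circular) convolution evaluated with FFTs; pad the
    inputs beforehand if wrap-around effects matter."""
    F = np.fft.fft2(wavespace_map) * np.fft.fft2(psf, s=wavespace_map.shape)
    return np.real(np.fft.ifft2(F))

# A deconvolution loop (gradient- or CLEAN-style) would call apply_psf once
# per iteration instead of forming a full source-to-sensor propagation matrix.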

Bahr, Christopher J.; Cattafesta, Louis N., III

2012-01-01

75

Survey of visualization and analysis tools  

NASA Technical Reports Server (NTRS)

A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of some of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

Meyer, P. J.

1994-01-01

76

Unsupervised Blind Deconvolution.  

National Technical Information Service (NTIS)

To reduce the influence of atmospheric turbulence on images of space-based objects we are developing a maximum a posteriori deconvolution approach. In contrast to techniques found in the literature, we are focusing on the statistics of the point-spread f...

L. Mugnier R. Gudimetla R. B. Galle R. L. Johnson S. Gladysz

2013-01-01

77

Blind image deconvolution  

Microsoft Academic Search

The goal of image restoration is to reconstruct the original scene from a degraded observation. This recovery process is critical to many image processing applications. Although classical linear image restoration has been thoroughly studied, the more difficult problem of blind image restoration has numerous research possibilities. We introduce the problem of blind deconvolution for images, provide an overview of the

DEEPA KUNDUR; D. Hatzinakos

1996-01-01

78

Strategic analysis tools for high tech marketing  

Microsoft Academic Search

High tech marketing is characterized by high levels of technical, market and financial uncertainties, rapidly declining prices, collapsing markets and shortening product life cycles. Conventional strategic analysis tools are inadequate for effective analysis in developing high tech marketing strategy. This paper reviews a portfolio of contemporary strategic analysis tools that have been used effectively in developing high tech marketing strategies

C. D'Cruz; K. Ports

2003-01-01

79

ADVANCED POWER SYSTEMS ANALYSIS TOOLS  

SciTech Connect

The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions, and utilizing a wide range of fossil fuel properties. The project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, as well as the heterogeneous and homogeneous interactions of the organically associated elements, must be considered as they apply to the operating conditions. The ash composition predicted by the resulting model compares favorably to measured results. Enhancements to the existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms used to perform calculations of mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey-scale intensity depends on the chemistry of the particle, it is possible to map chemically similar areas, which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which is currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash that will be represented by that mineral, on a frame-by-frame basis. The slag viscosity model was improved to provide better predictions of slag viscosity and of the temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.

Robert R. Jensen; Steven A. Benson; Jason D. Laumb

2001-08-31

80

Analysis Tools for CFD Multigrid Solvers  

NASA Technical Reports Server (NTRS)

Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

2004-01-01

81

NCI Interactive Budget Analysis Tool  

Cancer.gov

This tool provides users an interactive overview of the National Cancer Institute (NCI) budget and Fact Book data since Fiscal Year 1999. Additional historical NCI budget information can be obtained through the NCI Fact Book Collection.

82

EASY-GOING deconvolution: Automated MQMAS NMR spectrum analysis based on a model with analytical crystallite excitation efficiencies  

NASA Astrophysics Data System (ADS)

The EASY-GOING deconvolution (EGdeconv) program is extended to enable fast and automated fitting of multiple quantum magic angle spinning (MQMAS) spectra guided by evolutionary algorithms. We implemented an analytical crystallite excitation model for spectrum simulation. Currently these efficiencies are limited to two-pulse and z-filtered 3QMAS spectra of spin 3/2 and 5/2 nuclei, whereas for higher spin-quantum numbers ideal excitation is assumed. The analytical expressions are explained in full to avoid ambiguity and facilitate others to use them. The EGdeconv program can fit interaction parameter distributions. It currently includes a Gaussian distribution for the chemical shift and an (extended) Czjzek distribution for the quadrupolar interaction. We provide three case studies to illustrate EGdeconv's capabilities for fitting MQMAS spectra. The EGdeconv program is available as is on our website http://egdeconv.science.ru.nl for 64-bit Linux operating systems.

Grimminck, Dennis L. A. G.; van Meerten, Bas; Verkuijlen, Margriet H. W.; van Eck, Ernst R. H.; Leo Meerts, W.; Kentgens, Arno P. M.

2013-03-01

83

EASY-GOING deconvolution: Automated MQMAS NMR spectrum analysis based on a model with analytical crystallite excitation efficiencies.  

PubMed

The EASY-GOING deconvolution (EGdeconv) program is extended to enable fast and automated fitting of multiple quantum magic angle spinning (MQMAS) spectra guided by evolutionary algorithms. We implemented an analytical crystallite excitation model for spectrum simulation. Currently these efficiencies are limited to two-pulse and z-filtered 3QMAS spectra of spin 3/2 and 5/2 nuclei, whereas for higher spin-quantum numbers ideal excitation is assumed. The analytical expressions are explained in full to avoid ambiguity and facilitate others to use them. The EGdeconv program can fit interaction parameter distributions. It currently includes a Gaussian distribution for the chemical shift and an (extended) Czjzek distribution for the quadrupolar interaction. We provide three case studies to illustrate EGdeconv's capabilities for fitting MQMAS spectra. The EGdeconv program is available as is on our website http://egdeconv.science.ru.nl for 64-bit Linux operating systems. PMID:23376481

Grimminck, Dennis L A G; van Meerten, Bas; Verkuijlen, Margriet H W; van Eck, Ernst R H; Meerts, W Leo; Kentgens, Arno P M

2013-03-01

84

Total variation blind deconvolution  

Microsoft Academic Search

We present a blind deconvolution algorithm based on the total variation (TV) minimization method proposed by Acar and Vogel (1994). The motivation for regularizing with the TV norm is that it is extremely effective for recovering edges of images as well as some blurring functions, e.g., motion blur and out-of-focus blur. An alternating minimization (AM) implicit iterative scheme is devised

Tony F. Chan; Chiu-Kwong Wong

1998-01-01

85

General Mission Analysis Tool (GMAT) Mathematical Specifications  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

Hughes, Steve

2007-01-01

86

IRTool: an IRST X Windows analysis tool  

Microsoft Academic Search

IRTool is an IRST X Windows analysis tool, which is being developed by Arete Associates and NSWC/WO under the sponsorship of the Office of Naval Research in support of the Infrared Analysis Modeling and Measurements Program (IRAMMP). The tool consists of an integrated set of physics based modules to support IRST multispectral and space-time analyses. The primary modules are for

Philip J. Davis; Eric Branlund; Steven R. Church; Don Chmielewski; David Klesch; Erik P. Krumrey; Douglas Crowder

1995-01-01

87

Obtaining meaningful results from Fourier deconvolution of reaction time data.  

PubMed

The technique of Fourier deconvolution is a powerful tool for testing distributional predictions of stage models of reaction time. However, direct application of Fourier theory to reaction time data has sometimes produced disappointing results. This article reviews Fourier transform theory as it applies to the problem of deconvolving a component of the reaction time distribution. Problems encountered in deconvolution are shown to be due to the presence of noise in the Fourier transforms of the sampled distributions, which is amplified by the operation of deconvolution. A variety of filtering techniques for the removal of noise are discussed, including window functions, adaptive kernel smoothing, and optimal Wiener filtering. The best results were obtained using a window function whose pass band was determined empirically from the power spectrum of the deconvolved distribution. These findings are discussed in relation to other, nontrigonometric approaches to the problem of deconvolution. PMID:2270239
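
A minimal Python sketch of the windowed Fourier deconvolution discussed above follows; the raised-cosine window and its cutoff fraction are illustrative stand-ins for the empirically chosen pass band, and the function names and tolerance are assumptions.

import numpy as np

def deconvolve_rt(f_observed, f_component, cutoff_frac=0.1):
    """Deconvolve one component from an observed RT density (both sampled on
    the same time grid).  The division amplifies high-frequency noise, so a
    raised-cosine low-pass window is applied; cutoff_frac would be chosen by
    inspecting the power spectrum of the raw quotient."""
    n = len(f_observed)
    F_obs = np.fft.rfft(f_observed)
    F_cmp = np.fft.rfft(f_component)
    quotient = F_obs / np.where(np.abs(F_cmp) > 1e-12, F_cmp, 1e-12)
    k = np.arange(len(quotient))
    window = 0.5 * (1.0 + np.cos(np.pi * np.clip(k / (cutoff_frac * len(quotient)), 0.0, 1.0)))
    return np.fft.irfft(quotient * window, n)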

Smith, P L

1990-11-01

88

Digital Deconvolution: Image Sampling and Restoration Techniques.  

National Technical Information Service (NTIS)

The finite length digital deconvolution problem is formulated and discussed in terms of modern optimization theory. The ill-conditioned nature of deconvolution is identified and classic deconvolution operators are examined in terms of their respective eff...

D. J. Udovic R. Mittra

1975-01-01

89

Functional speciation of metal-dissolved organic matter complexes by size exclusion chromatography coupled to inductively coupled plasma mass spectrometry and deconvolution analysis  

NASA Astrophysics Data System (ADS)

High performance size exclusion chromatography coupled to inductively coupled plasma mass spectrometry (HP-SEC-ICP-MS), in combination with deconvolution analysis, has been used to obtain multielemental qualitative and quantitative information about the distributions of metal complexes with different forms of natural dissolved organic matter (DOM). High performance size exclusion chromatography coupled to inductively coupled plasma mass spectrometry chromatograms only provide continuous distributions of metals with respect to molecular masses, due to the high heterogeneity of dissolved organic matter, which consists of humic substances as well as biomolecules and other organic compounds. A functional speciation approach, based on the determination of the metals associated to different groups of homologous compounds, has been followed. Dissolved organic matter groups of homologous compounds are isolated from the aqueous samples under study and their high performance size exclusion chromatography coupled to inductively coupled plasma mass spectrometry elution profiles fitted to model Gaussian peaks, characterized by their respective retention times and peak widths. High performance size exclusion chromatography coupled to inductively coupled plasma mass spectrometry chromatograms of the samples are deconvoluted with respect to these model Gaussian peaks. This methodology has been applied to the characterization of metal-dissolved organic matter complexes in compost leachates. The most significant groups of homologous compounds involved in the complexation of metals in the compost leachates studied have been hydrophobic acids (humic and fulvic acids) and low molecular mass hydrophilic compounds. The environmental significance of these compounds is related to the higher biodegradability of the low molecular mass hydrophilic compounds and the lower mobility of humic acids. In general, the hydrophilic compounds accounted for the complexation of around 50% of the leached metals, with variable contributions of humic and fulvic acids, depending on the nature of the samples and the metals.
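
The deconvolution of a chromatogram against fixed model Gaussian peaks can be sketched as a non-negative least-squares fit, as in the following Python fragment; the function name, retention times, and peak widths shown are illustrative assumptions, not values from this study.

import numpy as np
from scipy.optimize import nnls

def deconvolve_chromatogram(t, signal, peak_params):
    """Deconvolve a metal-specific ICP-MS chromatogram against model Gaussian
    peaks; peak_params is a list of (retention_time, sigma) pairs obtained by
    fitting the elution profiles of the isolated DOM fractions."""
    design = np.column_stack([
        np.exp(-0.5 * ((t - rt) / sigma) ** 2) for rt, sigma in peak_params
    ])
    amplitudes, _ = nnls(design, signal)     # non-negative contribution per fraction
    return amplitudes, design @ amplitudes

# e.g. humic acids, fulvic acids and a low-molecular-mass hydrophilic fraction:
# amps, fit = deconvolve_chromatogram(t, cu_signal, [(8.2, 0.4), (9.6, 0.5), (12.1, 0.6)])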

Laborda, Francisco; Ruiz-Beguería, Sergio; Bolea, Eduardo; Castillo, Juan R.

2009-05-01

90

IMPAIR: massively parallel deconvolution on the GPU  

NASA Astrophysics Data System (ADS)

The IMPAIR software is a high throughput image deconvolution tool for processing large out-of-core datasets of images, varying from large images with spatially varying PSFs to large numbers of images with spatially invariant PSFs. IMPAIR implements a parallel version of the tried and tested Richardson-Lucy deconvolution algorithm regularised via a custom wavelet thresholding library. It exploits the inherently parallel nature of the convolution operation to achieve quality results on consumer grade hardware: through the NVIDIA Tesla GPU implementation, the multi-core OpenMP implementation, and the cluster computing MPI implementation of the software. IMPAIR aims to address the problem of parallel processing in both top-down and bottom-up approaches: by managing the input data at the image level, and by managing the execution at the instruction level. These combined techniques will lead to a scalable solution with minimal resource consumption and maximal load balancing. IMPAIR is being developed as both a stand-alone tool for image processing, and as a library which can be embedded into non-parallel code to transparently provide parallel high throughput deconvolution.
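
For reference, the core Richardson-Lucy iteration that IMPAIR parallelizes can be written in a few lines of Python; this sketch omits the wavelet regularization and all GPU/MPI machinery and is not the IMPAIR code itself.

import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Plain Richardson-Lucy deconvolution for a non-negative image and a
    spatially invariant PSF (normalised to unit sum)."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full(image.shape, image.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate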

Sherry, Michael; Shearer, Andy

2013-02-01

91

Fast Holographic Deconvolution: A New Technique for Precision Radio Interferometry  

NASA Astrophysics Data System (ADS)

We introduce the Fast Holographic Deconvolution method for analyzing interferometric radio data. Our new method is an extension of A-projection/software-holography/forward modeling analysis techniques and shares their precision deconvolution and wide-field polarimetry, while being significantly faster than current implementations that use full direction-dependent antenna gains. Using data from the MWA 32 antenna prototype, we demonstrate the effectiveness and precision of our new algorithm. Fast Holographic Deconvolution may be particularly important for upcoming 21 cm cosmology observations of the Epoch of Reionization and Dark Energy where foreground subtraction is intimately related to the precision of the data reduction.

Sullivan, I. S.; Morales, M. F.; Hazelton, B. J.; Arcus, W.; Barnes, D.; Bernardi, G.; Briggs, F. H.; Bowman, J. D.; Bunton, J. D.; Cappallo, R. J.; Corey, B. E.; Deshpande, A.; deSouza, L.; Emrich, D.; Gaensler, B. M.; Goeke, R.; Greenhill, L. J.; Herne, D.; Hewitt, J. N.; Johnston-Hollitt, M.; Kaplan, D. L.; Kasper, J. C.; Kincaid, B. B.; Koenig, R.; Kratzenberg, E.; Lonsdale, C. J.; Lynch, M. J.; McWhirter, S. R.; Mitchell, D. A.; Morgan, E.; Oberoi, D.; Ord, S. M.; Pathikulangara, J.; Prabu, T.; Remillard, R. A.; Rogers, A. E. E.; Roshi, A.; Salah, J. E.; Sault, R. J.; Udaya Shankar, N.; Srivani, K. S.; Stevens, J.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Waterson, M.; Webster, R. L.; Whitney, A. R.; Williams, A.; Williams, C. L.; Wyithe, J. S. B.

2012-11-01

92

Target deconvolution techniques in modern phenotypic profiling  

PubMed Central

The past decade has seen rapid growth in the use of diverse compound libraries in classical phenotypic screens to identify modulators of a given process. The subsequent process of identifying the molecular targets of active hits, also called ‘target deconvolution’, is an essential step for understanding compound mechanism of action and for using the identified hits as tools for further dissection of a given biological process. Recent advances in ‘omics’ technologies, coupled with in silico approaches and the reduced cost of whole genome sequencing, have greatly improved the workflow of target deconvolution and have contributed to a renaissance of ‘modern’ phenotypic profiling. In this review, we will outline how both new and old techniques are being used in the difficult process of target identification and validation as well as discuss some of the ongoing challenges remaining for phenotypic screening.

Lee, Jiyoun; Bogyo, Matthew

2013-01-01

93

Integrating Reliability Analysis with a Performance Tool  

NASA Technical Reports Server (NTRS)

A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

1995-01-01

94

Growth hormone secretory rates in children as estimated by deconvolution analysis of 24-h plasma concentration profiles.  

PubMed

The kinetics of growth-hormone (GH) distribution and elimination was estimated in five GH-deficient children who received 11 intravenous single injections of GH. The plasma disappearance data were analyzed in terms of a two-compartment model. The kinetic parameters obtained were then used to calculate the GH secretory rate by a numerical deconvolution technique. A simple formula was derived for calculating the cumulated secretion from the area under the concentration curve in 145 healthy children of various ages, heights, and stages of puberty. The estimated 24-h GH secretion increased with age, corresponding to a two- to fourfold increase during adolescence. The highest secretion rates were found in pubertal stages 3-4. In prepubertal children, height correlated markedly with GH secretion (r = 0.83). These results give an indication of the range of GH secretion in normally growing children, which is important for estimating substitution doses for the treatment of GH-deficient children. PMID:2610253
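
A minimal sketch of such a numerical deconvolution is given below: the convolution integral linking secretion rate and plasma concentration is discretized with a bi-exponential (two-compartment) disposition function and inverted under a non-negativity constraint. All kinetic parameter values and names are illustrative assumptions, not those estimated in the study.

import numpy as np
from scipy.optimize import nnls

def secretion_rate(conc, dt, a1=0.7, l1=0.05, a2=0.3, l2=0.01):
    """Estimate a secretion-rate profile by deconvolving plasma concentrations
    with a bi-exponential disposition function (illustrative coefficients a1,
    a2 and rate constants l1, l2 per minute)."""
    n = len(conc)
    t = np.arange(n) * dt
    impulse = a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, :i + 1] = impulse[i::-1] * dt   # discretised convolution matrix
    rate, _ = nnls(A, conc)                  # non-negativity stabilises the inversion
    return rate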

Albertsson-Wikland, K; Rosberg, S; Libre, E; Lundberg, L O; Groth, T

1989-12-01

95

Test of the accuracy of the computerized glow curve deconvolution algorithm for the analysis of thermoluminescence glow curves  

NASA Astrophysics Data System (ADS)

The accuracy of the thermoluminescence (TL) kinetics parameters obtained using the computerized glow curve deconvolution (CGCD) algorithm was tested. The differential equation governing the electron traffic in the one trap-one recombination (OTOR) level model was solved with almost no approximation using the Fehlberg-Runge-Kutta method. A set of simulated glow peaks was generated for a wide range of kinetics parameter values. These glow peaks were then fitted using the general-order kinetics equation. Comparisons between the kinetics parameter values of the simulated glow peaks and those obtained by the CGCD method were made. The results show that the accuracy of the different kinetics parameters obtained by the CGCD method is not the same and that it varies according to the value of the kinetics order (b). The overlapping of two glow peaks with very close maximum peak positions (Tms) results in a glow peak with unexpected values for the kinetics parameters. A set of different cases of overlapping glow peaks is also discussed.
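
For readers who wish to reproduce the forward step, the OTOR rate equations can be integrated for a linear heating ramp with a standard ODE solver, as in the Python sketch below; the trap parameters are illustrative and scipy's Runge-Kutta solver stands in for the Fehlberg-Runge-Kutta scheme used in the study.

import numpy as np
from scipy.integrate import solve_ivp

kB = 8.617e-5   # Boltzmann constant, eV/K

def otor_glow_curve(E=1.0, s=1e12, N=1e10, n0=1e9, An=1e-8, Am=1e-7,
                    beta=1.0, T0=300.0, T1=600.0):
    """Integrate the OTOR rate equations for T = T0 + beta*t and return
    (temperature, TL intensity); switch method to 'Radau' if the system
    turns stiff for other parameter choices."""
    def rhs(t, y):
        n, nc = y
        T = T0 + beta * t
        p = s * np.exp(-E / (kB * T))            # thermal excitation rate
        m = n + nc                               # charge neutrality
        return [-n * p + nc * (N - n) * An,      # trapped electrons
                n * p - nc * (N - n) * An - nc * m * Am]

    t_end = (T1 - T0) / beta
    sol = solve_ivp(rhs, (0.0, t_end), [n0, 0.0], method="RK45",
                    dense_output=True, rtol=1e-8)
    t = np.linspace(0.0, t_end, 2000)
    n, nc = sol.sol(t)
    return T0 + beta * t, nc * (n + nc) * Am     # TL intensity = recombination rate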

Sadek, A. M.

2013-06-01

96

Deconvolution in Astronomy: A Review  

Microsoft Academic Search

This article reviews different deconvolution methods. The all-pervasive presence of noise is what makes deconvolution particularly difficult. The diversity of resulting algorithms reflects different ways of estimating the true signal under various idealizations of its properties. Different ways of approaching signal recovery are based on different instrumental noise models, whether the astronomical objects are pointlike or extended, and indeed on

J. L. Starck; E. Pantin; F. Murtagh

2002-01-01

97

Industry Sector Analysis Mexico: Machine Tools, Lathes.  

National Technical Information Service (NTIS)

The market survey covers the machine tool lathes market in Mexico. The analysis contains statistical and narrative information on projected market demand, end-users; receptivity of Mexican consumers to U.S. products; the competitive situation, and market ...

L. P. Sanroman M. Gerard

1992-01-01

98

Industry Sector Analysis Mexico: Metalworking Machine Tools.  

National Technical Information Service (NTIS)

The market survey covers the metalworking machine tools market in Mexico. The analysis contains statistical and narrative information on projected market demand, end-users; receptivity of Mexican consumers to U.S. products; the competitive situation, and ...

1991-01-01

99

Numerical Uncertainty Quantification for Radiation Analysis Tools.  

National Technical Information Service (NTIS)

Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle d...

B. Anderson M. Clowdsley S. Blattnig

2007-01-01

100

Analysis of Ten Reverse Engineering Tools  

NASA Astrophysics Data System (ADS)

Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in case of maintaining large-scale legacy systems tool support is essential. Reverse engineering tools provide various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

Koskinen, Jussi; Lehmonen, Tero

101

Design and Analysis Tool Validation.  

National Technical Information Service (NTIS)

The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplif...

R. Judkoff

1981-01-01

102

Heliostat Cost-Analysis Tool.  

National Technical Information Service (NTIS)

Estimated production costs of solar energy systems serve as guides for future component development and as measures of the potential economic viability of the technologies. The analysis of heliostat costs is particularly important since the heliostat fiel...

L. D. Brandt R. E. Chang

1981-01-01

103

GAIA: Graphical Astronomy and Image Analysis Tool  

NASA Astrophysics Data System (ADS)

GAIA is an image and data-cube display and analysis tool for astronomy. It provides the usual facilities of image display tools, plus more astronomically useful ones such as aperture and optimal photometry, contouring, source detection, surface photometry, arbitrary region analysis, celestial coordinate readout, calibration and modification, grid overlays, blink comparison, defect patching and the ability to query on-line catalogues and image servers. It can also display slices from data-cubes, extract and visualize spectra as well as perform full 3D rendering. GAIA uses the Starlink software environment (ascl:1110.012) and is derived from the ESO SkyCat tool (ascl:1109.019).

Draper, Peter W.; Gray, Norman; Berry, David S.; Taylor, Mark

2014-03-01

104

Tools for Basic Statistical Analysis  

NASA Technical Reports Server (NTRS)

Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
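For readers who want to reproduce these calculations outside of Excel, the following sketch shows Python equivalents of three of the described functions (descriptive statistics, a normal-distribution estimate, and a linear regression with a significance check); the sample values are invented and the toolset itself remains an Excel implementation.

```python
# Python equivalents of three of the spreadsheet calculations (illustrative
# values; the toolset itself is a set of Excel programs).
import numpy as np
from scipy import stats

x = np.array([2.1, 2.5, 3.0, 3.2, 3.8, 4.1])   # user-entered data set
y = np.array([1.9, 2.6, 3.1, 3.5, 3.9, 4.4])   # paired response for regression

# Descriptive statistics
print("mean:", x.mean(), "std:", x.std(ddof=1), "n:", x.size)

# Normal Distribution Estimates: value at a given cumulative probability,
# given a sample mean and standard deviation
p = 0.95
print("95th percentile:", stats.norm.ppf(p, loc=x.mean(), scale=x.std(ddof=1)))

# Linear Regression-ANOVA: fit y = a + b*x and check significance
fit = stats.linregress(x, y)
print("slope:", fit.slope, "intercept:", fit.intercept, "p-value:", fit.pvalue)
```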

Luz, Paul L.

2005-01-01

105

SPIDA — A Novel Data Analysis Tool  

Microsoft Academic Search

In modern businesses, intelligent data analysis (IDA) is an important aspect of turning data into information and then into action. Data analysis has become a practical area and data analysis methods are nowadays used as tools. This approach to data analysis requires IDA platforms that support users and prevent them from making errors or from using methods in the wrong

D Nauck; M Spott; B Azvine

2003-01-01

106

Automated Steel Cleanliness Analysis Tool (ASCAT)  

Microsoft Academic Search

The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting

Gary Casuccio; Michael Potter; Fred Schwerer; Richard J. Fruehan; Scott Story

2005-01-01

107

The Galileo Fault Tree Analysis Tool  

Microsoft Academic Search

We present Galileo, a dynamic fault tree modeling and analysis tool that combines the innovative DIFTree analysis methodology with a rich user interface built using package-oriented programming. DIFTree integrates binary decision diagram and Markov methods under the common notation of dynamic fault trees, allowing the user to exploit the benefits of both techniques while avoiding the need

Kevin J. Sullivan; Joanne Bechta Dugan; David Coppit

1999-01-01

108

Built Environment Energy Analysis Tool Overview (Presentation)  

SciTech Connect

This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

Porter, C.

2013-04-01

109

A network evaluation and analysis tool  

SciTech Connect

The rapid emergence of large heterogeneous networks, distributed systems, and massively parallel computers has resulted in economies of scale, enhanced productivity, efficient communication, resource sharing, and increased reliability, which are computationally beneficial. In addition to these benefits, networking presents technical challenges and problems with respect to maintaining and ensuring the security, design, compatibility, integrity, functionality, and management of these systems. In this paper we describe a computer security tool, Network Evaluation and Analysis Tool (NEAT), that we have developed to address these concerns.

Stoltz, L.A.; Whiteson, R.; Fasel, P.K.; Temple, R.; Dreicer, J.S.

1993-01-01

110

A network evaluation and analysis tool  

SciTech Connect

The rapid emergence of large heterogeneous networks, distributed systems, and massively parallel computers has resulted in economies of scale, enhanced productivity, efficient communication, resource sharing, and increased reliability, which are computationally beneficial. In addition to these benefits, networking presents technical challenges and problems with respect to maintaining and ensuring the security, design, compatibility, integrity, functionality, and management of these systems. In this paper we describe a computer security tool, Network Evaluation and Analysis Tool (NEAT), that we have developed to address these concerns.

Stoltz, L.A.; Whiteson, R.; Fasel, P.K.; Temple, R.; Dreicer, J.S.

1993-05-01

111

Photogrammetry Tool for Forensic Analysis  

NASA Technical Reports Server (NTRS)

A system allows crime scene and accident scene investigators to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
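The coordinate-merging procedure above amounts to estimating rigid transforms between cube frames from shared reference points. The sketch below illustrates that step with a standard SVD-based (Kabsch) solution; it is not the tool's own code, and the point coordinates are invented.

```python
# Estimate the rigid transform mapping points from one cube's frame into
# another's (Kabsch/Procrustes solution via SVD); coordinates are invented.
import numpy as np

def rigid_transform(P, Q):
    """Return R, t such that (R @ P.T).T + t approximates Q, where P (Nx3)
    holds points in the source frame and Q (Nx3) the same points in the
    target frame."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Reference points expressed in cube 2's frame (src) and in cube 1's frame (dst)
src = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
dst = np.array([[2.0, 1, 0], [2, 2, 0], [1, 1, 0], [2, 1, 1]])
R, t = rigid_transform(src, dst)
merged = (R @ src.T).T + t   # cube 2 data re-expressed in cube 1's (global) frame
```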

Lane, John

2012-01-01

112

Performance Analysis of GYRO: A Tool Evaluation  

SciTech Connect

The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

2005-06-26

113

RADC SCAT automated sneak circuit analysis tool  

NASA Astrophysics Data System (ADS)

The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst so that prior experience with sneak analysis is not necessary for performance. Both sneak circuits and design concerns are targeted by this tool, with both digital and analog circuits being examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.

Depalma, Edward L.

114

Deconvolution: a wavelet frame approach  

Microsoft Academic Search

This paper is devoted to analyzing deconvolution algorithms based on wavelet frame approaches, which have already appeared in Chan et al. (SIAM J. Sci. Comput. 24(4), 1408–1432, 2003; Appl. Comput. Harmon. Anal. 17, 91–115, 2004a; Int. J. Imaging Syst. Technol. 14, 91–104, 2004b) as wavelet frame based high resolution image reconstruction methods. We first give a complete formulation of deconvolution in terms

Anwei Chai; Zuowei Shen

2007-01-01

115

Mars Reconnaissance Orbiter Uplink Analysis Tool  

NASA Technical Reports Server (NTRS)

This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
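The core bookkeeping described above (light-time correction, merging into one time-ordered listing, and keep-out padding) can be illustrated in a few lines. The actual tool is written in PHP; this Python sketch uses invented times, an assumed one-way light time, and an assumed sign convention for the correction.

```python
# Conceptual sketch only: the deployed tool is written in PHP.  Times, the
# one-way light time, and the sign of the correction are all assumed here.
OWLT = 12.5 * 60   # assumed one-way light time, seconds
PAD = 5 * 60       # keep-out padding around each MRO event, seconds

mro_events = [(1000.0, "MRO slew"), (4000.0, "MRO downlink")]
mer_events = [(1900.0, "MER uplink"), (3000.0, "MER uplink")]

# Shift MER times onto the common reference time (assumed direction)
mer_common = [(t - OWLT, name) for t, name in mer_events]

merged = sorted(mro_events + mer_common)   # single time-ordered event listing
conflicts = [e for e in mer_common
             if any(abs(e[0] - t) <= PAD for t, _ in mro_events)]
print(merged)
print(conflicts)
```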

Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

2008-01-01

116

Design and Analysis Tools for Supersonic Inlets  

NASA Technical Reports Server (NTRS)

Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

Slater, John W.; Folk, Thomas C.

2009-01-01

117

Wavelet bicoherence: A new turbulence analysis tool  

Microsoft Academic Search

A recently introduced tool for the analysis of turbulence, wavelet bicoherence [B. Ph. van Milligen, C. Hidalgo and E. Sánchez, Phys. Rev. Lett. 16 (1995) 395], is investigated. It is capable of detecting phase coupling - nonlinear interactions of the lowest (quadratic) order - with time resolution. To demonstrate its potential, it is applied to numerical models of chaos and

B. Ph. van Milligen; E. Sánchez; T. Estrada; C. Hidalgo; B. Brañas; B. Carreras; L. García

1995-01-01

118

lmbench: Portable Tools for Performance Analysis  

Microsoft Academic Search

lmbench is a micro-benchmark suite designed to focus attention on the basic building blocks of many common system applications, such as databases, simulations, software development, and networking. In almost all cases, the individual tests are the result of analysis and isolation of a customer's actual performance problem. These tools can be, and currently are, used

Larry W. Mcvoy; Carl Staelin

1996-01-01

119

Deconvolution of the vestibular evoked myogenic potential.  

PubMed

The vestibular evoked myogenic potential (VEMP) and the associated variance modulation can be understood by a convolution model. Two functions of time are incorporated into the model: the motor unit action potential (MUAP) of an average motor unit, and the temporal modulation of the MUAP rate of all contributing motor units, briefly called rate modulation. The latter is the function of interest, whereas the MUAP acts as a filter that distorts the information contained in the measured data. Here, it is shown how to recover the rate modulation by undoing the filtering using a deconvolution approach. The key aspects of our deconvolution algorithm are as follows: (1) the rate modulation is described in terms of just a few parameters; (2) the MUAP is calculated by Wiener deconvolution of the VEMP with the rate modulation; (3) the model parameters are optimized using a figure-of-merit function where the most important term quantifies the difference between measured and model-predicted variance modulation. The effectiveness of the algorithm is demonstrated with simulated data. An analysis of real data confirms the view that there are basically two components, which roughly correspond to the waves p13-n23 and n34-p44 of the VEMP. The rate modulation corresponding to the first, inhibitory component is much stronger than that corresponding to the second, excitatory component. But the latter is more extended so that the two modulations have almost the same equivalent rectangular duration. PMID:22079097
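The Wiener-deconvolution step mentioned above can be sketched as a regularized spectral division: given the measured VEMP and a candidate rate modulation, the MUAP estimate is obtained in the frequency domain. The signals and the regularization constant below are purely illustrative and are not taken from the paper.

```python
# Regularized (Wiener-style) deconvolution of a toy VEMP by a toy rate
# modulation; values and the regularization constant are illustrative.
import numpy as np

def wiener_deconvolve(y, h, lam=1e-3):
    """Estimate x with y ~= h (*) x, using Tikhonov-regularized spectral division."""
    Y = np.fft.rfft(y)
    H = np.fft.rfft(h, n=len(y))
    X = Y * np.conj(H) / (np.abs(H) ** 2 + lam)
    return np.fft.irfft(X, n=len(y))

t = np.arange(256) / 1000.0                        # 1 kHz sampling, seconds
rate = np.exp(-((t - 0.015) / 0.005) ** 2)         # toy (inhibitory) rate modulation
muap_true = np.sin(2 * np.pi * 80 * t) * np.exp(-t / 0.01)
vemp = np.convolve(rate, muap_true)[: t.size]      # forward convolution model
muap_est = wiener_deconvolve(vemp, rate)           # MUAP recovered from the VEMP
```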

Lütkenhöner, Bernd; Basel, Türker

2012-02-01

120

The CO5BOLD analysis tool.  

NASA Astrophysics Data System (ADS)

The interactive IDL-based CO5BOLD Analysis Tool (CAT) was developed to facilitate an easy and quick analysis of numerical simulation data produced with the 2D/3D radiation magnetohydrodynamics code CO5BOLD. The basic mode of operation is the display and analysis of cross-sections through a model either as 2D slices or 1D graphs. A wide range of physical quantities can be selected. Further features include the export of models into VAPOR format or the output of images and animations. A short overview including scientific analysis examples is given.

Wedemeyer, S.

121

Specification Techniques for Automatic Performance Analysis Tools  

Microsoft Academic Search

Performance analysis of parallel programs is a time-consuming task and requires a lot of experience. It is the goal of the KOJAK project at the Research Centre Juelich to develop an automatic performance analysis environment. A key requirement for the success of this new environment is its easy integration with already existing tools on the target platform. The design should

Michael Gerndt; Hans-georg Eßer

2000-01-01

122

Paramedir: A Tool for Programmable Performance Analysis  

NASA Technical Reports Server (NTRS)

Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

2004-01-01

123

Integrated multidisciplinary analysis tool IMAT users' guide  

NASA Technical Reports Server (NTRS)

The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

Meissner, Frances T. (editor)

1988-01-01

124

Decision Analysis Tools for Volcano Observatories  

NASA Astrophysics Data System (ADS)

Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

Hincks, T. H.; Aspinall, W.; Woo, G.

2005-12-01

125

Vibration analysis as a predictive maintenance tool  

SciTech Connect

Vibration analysis is a powerful and effective tool in both predicting and isolating incipient fault conditions. Vibration can assist in the identification of root cause failure analysis and can be used to establish maintenance procedures on a condition assessment basis rather than a scheduled or calendar basis. Recent advances in technology allow for not only new types of testing to be performed, but when integrated with other types of machine information, can lead to even greater insight and accuracy of the entire predictive maintenance program. Case studies and recent findings will be presented along with a discussion of how vibration is used as an invaluable tool in the detection of defects in gearboxes, mill stands, and roll chatter detection and correction. Acceptable vibration criteria and cost benefit summaries will be included.

Dischner, J.M. [Computational Systems, Inc., Knoxville, TN (United States)

1995-09-01

126

Integrated tools for control-system analysis  

NASA Technical Reports Server (NTRS)

The basic functions embedded within a user friendly software package (MATRIXx) are used to provide a high level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
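Two of the listed evaluations (closed-loop eigenvalues and a Bode frequency response) can be illustrated with standard scientific-Python calls; the original package is MATRIXx, so this is only a conceptual analogue with an invented plant and gain matrix.

```python
# Conceptual analogue (the original package is MATRIXx): closed-loop
# eigenvalues and a Bode response for an invented plant and gain.
import numpy as np
from scipy import signal

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # toy plant dynamics
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])
K = np.array([[3.0, 1.5]])                 # assumed state-feedback gain

# Closed-loop eigenvalues under u = -K x
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))

# Bode frequency response of the open-loop plant
w, mag, phase = signal.bode(signal.StateSpace(A, B, C, D))
```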

Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

1989-01-01

127

Challenges Facing Design and Analysis Tools  

NASA Technical Reports Server (NTRS)

The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

2001-01-01

128

Image deconvolution in digital autoradiography  

PubMed Central

Digital autoradiography (DAR) is a powerful method to determine quantitatively the “small-scale” (i.e., submillimeter) distribution of a radiotracer within a tissue section. However, the limited spatial resolution of the DAR image, due to blurring by the point spread function (PSF), can result in a poor correlation with tissue histology and immunohistochemistry. The authors attempt to overcome this limitation by recovering the radiotracer distribution by image deconvolution using the Richardson-Lucy algorithm and a measured PSF obtained from a small radioactive source on a hydrophobic microscope slide. Simulation studies have shown that the deconvolution algorithm reliably recovers the pixel values corresponding to the radioactivity distributions. As an example, the proposed image restoration approach has been tested with DAR images of different radiolabeled markers on tumor sections obtained from clinical and preclinical animal model studies. Digital autoradiograms following deconvolution show improved sharpness and contrast relative to the unprocessed autoradiograms.
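As an illustration of the restoration step described above, the sketch below implements a basic Richardson-Lucy iteration with a measured (here synthetic) PSF; it is a minimal version for demonstration, not the authors' implementation.

```python
# Minimal Richardson-Lucy deconvolution with a (synthetic) PSF; for
# demonstration only, not the authors' implementation.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30, eps=1e-12):
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        estimate *= fftconvolve(image / (blurred + eps), psf_mirror, mode="same")
    return estimate

# Toy example: blur a point-like activity distribution and restore it
truth = np.zeros((64, 64))
truth[32, 32] = 1.0
yy, xx = np.mgrid[-7:8, -7:8]
psf = np.exp(-(xx ** 2 + yy ** 2) / (2 * 2.0 ** 2))
observed = fftconvolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(observed, psf)
```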

Zhang, Mutian; Chen, Qing; Li, Xiao-Feng; O'Donoghue, Joseph; Ruan, Shutian; Zanzonico, Pat; Ling, C. Clifton; Humm, John L.

2009-01-01

129

Subsurface scattering deconvolution for improved NIR-visible facial image correlation  

Microsoft Academic Search

Significant improvements in face-recognition performance have recently been achieved by obtaining near infrared (NIR) probe images. We demonstrate that by taking into account the differential effects of sub-surface scattering, correlation between facial images in the visible (VIS) and NIR wavelengths can be significantly improved. Hence, by using Fourier analysis and Gaussian deconvolution with variable thresholds for the scattering deconvolution radius

Josef Kittler; David Windridge; Debaditya Goswami

2008-01-01

130

Regularization of deconvolution using steerable pyramids  

Microsoft Academic Search

In this paper, a method is proposed to regularize deconvolution based on steerable pyramids. Deconvolution is the process of recovering an ideal image from an observation, usually degraded by blurring and a certain noise process (additive, multiplicative, ...). Since deconvolution is an ill-posed problem, noise will be amplified when no regularization is applied. This regularization step formulates our prior knowledge

Filip Rooms; Wilfried Philips

2002-01-01

131

Sparse Deconvolution Using Support Vector Machines  

Microsoft Academic Search

Sparse deconvolution is a classical subject in digital signal processing, having many practical applications. Support vector machine (SVM) algorithms show a series of characteristics, such as sparse solutions and implicit regularization, which make them attractive for solving sparse deconvolution problems. Here, a sparse deconvolution algorithm based on the SVM framework for signal processing is presented and analyzed, including comparative

José Luis Rojo-Álvarez; Manel Martínez-Ramón; Jordi Muñoz-Marí; Gustavo Camps-Valls; Aníbal R. Figueiras-Vidal

2007-01-01

132

Desktop Analysis Reporting Tool (DART) User's Guide  

SciTech Connect

The Desktop Analysis Reporting Tool (DART) is a software package that allows a user to easily view and analyze radiation portal monitor (RPM) daily files that span long periods. DART gives users the capability to determine the state of health of a monitor, troubleshoot and diagnose problems, and view data in various time frames to perform trend analysis. In short, it converts the data strings written in the daily files into meaningful tables and plots. DART is an application-based program that was designed to maximize the benefit of a centralized data repository while distributing the workload to individual desktop machines. This networked approach requires a more complex database manager (SQL Server); however, SQL Server is not currently provided with the DART Installation Disk. SQL Express is sufficient for local data analysis and requires the installation of SQL Express and DART on each machine intended for analysis.

Lousteau, Angela L [ORNL; Alcala, Scott [ORNL

2012-04-01

133

Evaluation of HALO Deconvolution Schemes.  

National Technical Information Service (NTIS)

The final report chronicles various aspects of the research conducted by the principal investigator on the wavefront deconvolution problem. Only work carried out since the interim report RADC-TR-80-154, May 1980, was released is discussed because the mate...

R. Barakat

1981-01-01

134

Deconvolution using the complex cepstrum  

SciTech Connect

The theory, description, and implementation of a generalized linear filtering system for the nonlinear filtering of convolved signals are presented. A detailed look at the problems and requirements associated with the deconvolution of signal components is undertaken. Related properties are also developed. A synthetic example is shown and is followed by an application using real seismic data. 29 figures.
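The idea behind cepstral deconvolution is that convolution in the time domain becomes addition in the cepstral domain, so components can be separated by liftering. The sketch below computes a complex cepstrum and applies a low-quefrency lifter in a naive way (in particular, phase unwrapping and the linear-phase term need more care in practice); it is illustrative only.

```python
# Naive homomorphic (cepstral) separation sketch; real implementations need
# careful phase unwrapping and handling of the linear-phase component.
import numpy as np

def complex_cepstrum(x):
    X = np.fft.fft(x)
    log_X = np.log(np.abs(X) + 1e-12) + 1j * np.unwrap(np.angle(X))
    return np.fft.ifft(log_X).real

def inverse_complex_cepstrum(c):
    return np.fft.ifft(np.exp(np.fft.fft(c))).real

x = np.random.default_rng(0).standard_normal(256)   # stand-in for a recorded trace
c = complex_cepstrum(x)

lifter = np.zeros_like(c)
lifter[:20] = 1.0
lifter[-19:] = 1.0                                   # keep low quefrencies only
smooth_part = inverse_complex_cepstrum(c * lifter)   # e.g. source-wavelet estimate
```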

Riley, H B

1980-12-01

135

Enhancement of Local Climate Analysis Tool  

NASA Astrophysics Data System (ADS)

The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

2012-12-01

136

Microfracturing and new tools improve formation analysis  

SciTech Connect

This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wire line logs, and temperature logging in air-filled holes, which are new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are created usually at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open hole parameters for designing the main fracture treatment.

McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))

1992-12-07

137

DEVELOPING NEW TOOLS FOR POLICY ANALYSIS  

SciTech Connect

For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U. S. Department of Energy (DOE) regulations and directives related to safeguards and security (S&S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire (S&S) directives set will be conducted using software tools to analyze the present directives with a view toward 1) identifying areas of positive synergism among topical areas, 2) identifying areas of unnecessary duplication within and among topical areas, and 3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

Donovan, Richard L.; Schwartz, Michael J.; Selby, Kevin B.; Uecker, Norma J.

2010-08-11

138

Determinants for global cargo analysis tools  

NASA Astrophysics Data System (ADS)

The purpose of Global TRADER (GT) is not only to gather and query supply-chain transactional data for facts but also to analyze that data for hidden knowledge for the purpose of useful and meaningful pattern prediction. The application of advanced analytics provides benefits beyond simple information retrieval from GT, including computer-aided detection of useful patterns and associations. Knowledge discovery, offering a breadth and depth of analysis unattainable by manual processes, involves three components: repository structures, analytical engines, and user tools and reports. For a large and complex domain like supply-chains, there are many stages to developing the most advanced analytic capabilities; however, significant benefits accrue as components are incrementally added. These benefits include detecting emerging patterns; identifying new patterns; fusing data; creating models that can learn and predict behavior; and identifying new features for future tools. The GT Analyst Toolset was designed to overcome a variety of constraints, including lack of third party data, partial data loads, non-cleansed data (non-disambiguation of parties, misspellings, transpositions, etc.), and varying levels of analyst experience and expertise. The end result was a set of analytical tools that are flexible, extensible, tunable, and able to support a wide range of analyst demands.

Wilmoth, M.; Kay, W.; Sessions, C.; Hancock, M.

2007-05-01

139

Optimal application of Morrison's iterative noise removal for deconvolution  

NASA Technical Reports Server (NTRS)

Morrison's iterative method of noise removal can be applied for both noise removal alone and noise removal prior to deconvolution. The method is applied to data with noise added at various levels to determine its optimum use. The phase shift method of migration and modeling is evaluated and the results are compared to Stolt's approach. A method is introduced by which the optimum iteration number for deconvolution can be found. Statistical computer simulation is used to describe the optimum use of two convergent iterative techniques for seismic data. The Always-Convergent deconvolution technique was applied to data recorded during the quantitative analysis of materials through NonDestructive Evaluation (NDE) in which ultrasonic signals were used to detect flaws in substances such as composites.
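The abstract does not spell out Morrison's iteration itself, so the sketch below uses the closely related Van Cittert iterative deconvolution to illustrate how a convergent iterative scheme behaves and why an optimum iteration number exists: each pass adds back the residual between the data and the reblurred estimate, and too many passes amplify noise. Kernel and data are invented.

```python
# Van Cittert iterative deconvolution (used here as a stand-in; the abstract
# does not give Morrison's exact iteration).  Kernel and data are invented.
import numpy as np

def van_cittert(y, h, iterations=15, beta=1.0):
    f = y.copy()
    for _ in range(iterations):
        f = f + beta * (y - np.convolve(f, h, mode="same"))
    return f

h = np.array([0.25, 0.5, 0.25])                  # toy blurring kernel
x = np.zeros(64)
x[20], x[40] = 1.0, 0.5                          # true spike series
y = np.convolve(x, h, mode="same")
y_noisy = y + 0.01 * np.random.default_rng(1).standard_normal(y.size)

# A few iterations sharpen the spikes; too many iterations amplify the noise,
# hence the need for an optimum iteration number.
x_est = van_cittert(y_noisy, h, iterations=10)
```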

Ioup, George E.; Ioup, Juliette W.

1986-01-01

140

Interpretation and deconvolution of nanodisc native mass spectra.  

PubMed

Nanodiscs are a promising system for studying gas-phase and solution complexes of membrane proteins and lipids. We previously demonstrated that native electrospray ionization allows mass spectral analysis of intact Nanodisc complexes at single lipid resolution. This report details an improved theoretical framework for interpreting and deconvoluting native mass spectra of Nanodisc lipoprotein complexes. In addition to the intrinsic lipid count and charge distributions, Nanodisc mass spectra are significantly shaped by constructive overlap of adjacent charge states at integer multiples of the lipid mass. We describe the mathematical basis for this effect and develop a probability-based algorithm to deconvolute the underlying mass and charge distributions. The probability-based deconvolution algorithm is applied to a series of dimyristoylphosphatidylcholine Nanodisc native mass spectra and used to provide a quantitative picture of the lipid loss in gas-phase fragmentation. PMID:24353133
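The constructive-overlap effect described above can be illustrated numerically: peaks from adjacent charge states nearly coincide when the underlying masses differ by integer multiples of the lipid mass. All masses, charge states, and lipid counts below are assumed round numbers for illustration and are not taken from the paper.

```python
# All masses and counts are assumed round values, not taken from the paper.
M_BELT = 47000.0    # assumed combined mass of the two scaffold proteins, Da
M_LIPID = 677.9     # approximate average mass of DMPC, Da
M_PROTON = 1.00728

def mz(n_lipids, z):
    """m/z of a Nanodisc carrying n_lipids lipids at charge state z."""
    return (M_BELT + n_lipids * M_LIPID + z * M_PROTON) / z

peaks_22 = {round(mz(n, 22), 1) for n in range(140, 170)}
peaks_23 = {round(mz(n, 23), 1) for n in range(140, 170)}

# Adjacent charge states produce peak series with spacings M_LIPID/22 and
# M_LIPID/23 that periodically line up, giving constructive overlap.
close_pairs = [(a, b) for a in sorted(peaks_22) for b in sorted(peaks_23)
               if abs(a - b) < 2.0]
print(len(close_pairs), "near-coincident peak positions")
```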

Marty, Michael T; Zhang, Hao; Cui, Weidong; Gross, Michael L; Sligar, Stephen G

2014-02-01

141

Three-dimensional analysis tool for segmenting and measuring the structure of telomeres in mammalian nuclei  

NASA Astrophysics Data System (ADS)

Quantitative analysis in combination with fluorescence microscopy calls for innovative digital image measurement tools. We have developed a three-dimensional tool for segmenting and analyzing FISH stained telomeres in interphase nuclei. After deconvolution of the images, we segment the individual telomeres and measure a distribution parameter we call ρT. This parameter describes if the telomeres are distributed in a sphere-like volume (ρT ≈ 1) or in a disk-like volume (ρT >> 1). Because of the statistical nature of this parameter, we have to correct for the fact that we do not have an infinite number of telomeres to calculate this parameter. In this study we show a way to do this correction. After sorting mouse lymphocytes and calculating ρT and using the correction introduced in this paper we show a significant difference between nuclei in G2 and nuclei in either G0/G1 or S phase. The mean values of ρT for G0/G1, S and G2 are 1.03, 1.02 and 13 respectively.

Vermolen, Bart J.; Young, Ian T.; Chuang, Alice; Wark, Landon; Chuang, Tony; Mai, Sabine; Garini, Yuval

2005-03-01

142

The NRL Protocol Analysis Tool: A Position Paper  

Microsoft Academic Search

The author gives a brief description of the NRL protocol analysis tool, and contrasts its approach with other approaches. The NRL protocol analysis tool was developed in order to assist in security proofs for protocols. However, it has also proved to be useful in pointing out previously undiscovered flaws in already published protocols. The successes using the protocol analysis tool

Catherine Meadows

1991-01-01

143

SEAT: A strategic engagement analysis tool  

SciTech Connect

The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

Dreicer, J.; Michelsen, C.; Morgeson, D.

1988-01-01

144

Software tools supporting business process analysis and modelling  

Microsoft Academic Search

Examines the wide range of business process analysis/modelling (BPA/M) tools available, and compares the features of 12 specific tools. Presents two case studies with examples of software tool analysis results. The discussion addresses whether these tools meet the needs of managers of change and business process re-engineering (BPR) initiatives, and offers suggestions for future tool evolution. The focus is to

Bing Yu; David T. Wright

1997-01-01

145

[Gait analysis--a new diagnostic tool].  

PubMed

Three-dimensional gait analysis is a systematic measurement, description, and assessment of human gait. Gait analysis is established as a useful diagnostic tool in patients with gait problems, as it is not possible to obtain an adequate and detailed understanding of such a complex mechanism as gait in a conventional clinical examination. The method has provided a better understanding of both normal gait and abnormal gait patterns; it is a suitable instrument for evaluation of treatment results as well as for scientific work. The first gait laboratory for clinical use in Norway was established in 2002 in the Section for child neurology at Rikshospitalet University Hospital in Oslo, Norway. In this article the procedure for gait analysis is described and the clinical value is indicated by a case record of a child with cerebral palsy. Gait analysis has entailed a change of policy with regard to surgical treatment in this patient group. Previously, operative intervention at a single level was usual, whereas current practice involves simultaneous interventions at several levels of both lower extremities. After three years' experience we recommend gait analysis in routine diagnostics, particularly as a preoperative evaluation, in all children with gait problems and in the follow up after surgery or other treatment. PMID:16100541

Lofterød, Bjørn; Terjesen, Terje; Skaaret, Ingrid

2005-08-11

146

Bayesian deconvolution method applied to experimental bidirectional transmittance distribution functions  

NASA Astrophysics Data System (ADS)

Optical simulations are a common tool in the development of luminaires for lighting applications. The reliability of the virtual prototype is strongly dependent on the accuracy of the input data such as the emission characteristics of the light source and the scattering properties of the optical components (reflectors, filters and diffusers). These scattering properties are characterized by the bidirectional scatter distribution function (BSDF). Experimental determination of the BSDF of the materials is however very sensitive to the characteristics of the measuring instrument, i.e. the dimensions of the illumination spot, the detector aperture, etc. These instrumental characteristics are reflected in the instrument function. In order to eliminate the influence of the instrument function the use of a Bayesian deconvolution technique is proposed. A suitable stopping rule for the iterative deconvolution algorithm is presented. The deconvolution method is validated using Monte Carlo ray tracing software by simulating a BSDF measurement instrument and a virtual sample with a known bidirectional transmittance distribution function (BTDF). The Bayesian deconvolution technique is applied to experimental BTDF data of holographic diffusers, which exhibit a symmetrical angular broadening under normal incident irradiation. In addition, the effect of applying deconvolved experimental BTDF data on simulations of luminance maps is illustrated.

Audenaert, Jan; Leloup, Frédéric B.; Durinck, Guy; Deconinck, Geert; Hanselaer, Peter

2013-03-01

147

Automated Steel Cleanliness Analysis Tool (ASCAT)  

SciTech Connect

The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled ''Inclusion Analysis to Predict Casting Behavior'' was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. 
Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of ''technical selling'' through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with a extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant by plant direct sales program.

Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

2005-12-30

148

Wavelet bicoherence: A new turbulence analysis tool  

SciTech Connect

A recently introduced tool for the analysis of turbulence, wavelet bicoherence [van Milligen, Hidalgo, and Sanchez, Phys. Rev. Lett. 16, 395 (1995)], is investigated. It is capable of detecting phase coupling, i.e., nonlinear interactions of the lowest (quadratic) order, with time resolution. To demonstrate its potential, it is applied to numerical models of chaos and turbulence and to real measurements. It detected the coupling interaction between two coupled van der Pol oscillators. When applied to a model of drift wave turbulence relevant to plasma physics, it detected a highly localized coherent structure. Analyzing reflectometry measurements made in fusion plasmas, it detected temporal intermittency and a strong increase in nonlinear phase coupling coinciding with the L/H (low-to-high confinement mode) transition. © 1995 American Institute of Physics.
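The paper works with a wavelet bicoherence; as a simpler illustration of the underlying quantity, the sketch below computes the conventional segment-averaged Fourier squared bicoherence at one frequency pair, which approaches 1 when the phases at f1, f2, and f1+f2 are quadratically coupled. The frequencies and signal construction are invented.

```python
# Conventional Fourier squared bicoherence (segment-averaged) at one
# frequency pair; the paper itself uses a wavelet-based estimator.
import numpy as np

def squared_bicoherence(x, fs, f1, f2, nseg, seglen):
    segs = x[: nseg * seglen].reshape(nseg, seglen) * np.hanning(seglen)
    X = np.fft.rfft(segs, axis=1)
    freqs = np.fft.rfftfreq(seglen, d=1 / fs)
    i1, i2, i3 = (np.argmin(np.abs(freqs - f)) for f in (f1, f2, f1 + f2))
    num = np.abs(np.sum(X[:, i1] * X[:, i2] * np.conj(X[:, i3]))) ** 2
    den = np.sum(np.abs(X[:, i1] * X[:, i2]) ** 2) * np.sum(np.abs(X[:, i3]) ** 2)
    return num / den

fs, seglen, nseg = 1024.0, 256, 64
t = np.arange(seglen) / fs
rng = np.random.default_rng(0)
segments = []
for _ in range(nseg):
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)        # random phases per segment...
    segments.append(np.cos(2 * np.pi * 64 * t + p1)
                    + np.cos(2 * np.pi * 96 * t + p2)
                    + 0.5 * np.cos(2 * np.pi * 160 * t + p1 + p2))  # ...but coupled
x = np.concatenate(segments)
print(squared_bicoherence(x, fs, 64, 96, nseg, seglen))   # close to 1
```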

van Milligen, B.P.; Sanchez, E.; Estrada, T.; Hidalgo, C.; Branas, B. [Asociacion EURATOM-CIEMAT, Madrid (Spain)]; Carreras, B. [Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831 (United States)]; Garcia, L. [Universidad Carlos III, Madrid (Spain)]

1995-08-01

149

Airborne LIDAR Data Processing and Analysis Tools  

NASA Astrophysics Data System (ADS)

Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers high quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, the LIDAR systems collect voluminous irregularly-spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of the ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageable sized tiles, typically 4 square kilometers in dimension; 3) filtering point data to separate measurements for the ground from those for non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster based analysis; and 5) converting the gridded data into standard GIS import formats. The ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
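Step 4 of the processing chain above (interpolating irregularly spaced returns onto a regular grid) can be sketched with simple per-cell averaging; production pipelines use more sophisticated interpolation, and the coordinates below are synthetic.

```python
# Per-cell averaging of irregular returns onto a regular grid (step 4);
# real pipelines use more sophisticated interpolation.  Synthetic data.
import numpy as np
from scipy.stats import binned_statistic_2d

rng = np.random.default_rng(0)
x = rng.uniform(0, 2000, 100_000)                 # easting, metres
y = rng.uniform(0, 2000, 100_000)                 # northing, metres
z = 100 + 0.01 * x + rng.normal(0, 0.2, x.size)   # synthetic ground elevations

cell = 5.0                                        # grid spacing, metres
edges = np.arange(0, 2000 + cell, cell)
grid, _, _, _ = binned_statistic_2d(x, y, z, statistic="mean", bins=[edges, edges])
# 'grid' is now a 400 x 400 raster of mean elevations ready for GIS export
```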

Zhang, K.

2007-12-01

150

AIDA: Adaptive Image Deconvolution Algorithm  

NASA Astrophysics Data System (ADS)

AIDA is an implementation and extension of the MISTRAL myopic deconvolution method developed by Mugnier et al. (2004) (see J. Opt. Soc. Am. A 21:1841-1854). The MISTRAL approach has been shown to yield object reconstructions with excellent edge preservation and photometric precision when used to process astronomical images. AIDA improves upon the original MISTRAL implementation. AIDA, written in Python, can deconvolve multiple frame data and three-dimensional image stacks encountered in adaptive optics and light microscopic imaging.

Hom, Erik; Haase, Sebastian; Marchis, Franck

2013-10-01

151

Defining Digital Forensic Examination and Analysis Tools Using Abstraction Layers  

Microsoft Academic Search

This paper uses the theory of abstraction layers to describe the purpose and goals of digital forensic analysis tools. Using abstraction layers, we identify where tools can introduce errors and provide requirements that the tools must follow. Categories of forensic analysis types are also defined based on the abstraction layers. Abstraction layers are not a new concept, but their usage

Brian Carrier

2002-01-01

152

Statistical tools for regional data analysis using GIS  

Microsoft Academic Search

A GIS provides a powerful collection of tools for the management, visualization and analysis of spatial data. These tools can be even more powerful when they are integrated with statistical methods for spatial data analysis and many GIS users are requesting this integration. The Geostatistical Analyst extension to ArcGIS was developed to integrate statistical methods with GIS tools for mapping

Konstantin Krivoruchko; Carol A. Gotway; Alex Zhigimont

2003-01-01

153

Industry Sector Analysis Mexico: Machine Tools.  

National Technical Information Service (NTIS)

Mexico represents a promising market for U.S. machine tools industry. Metal cutting machine tools such as grinding machines, lathes and drilling and metal forming machines such as punching, bending, presses and forming machines will be among those offerin...

F. Ceron J. Koloditch

1993-01-01

154

Built Environment Analysis Tool: April 2013  

SciTech Connect

This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

Porter, C.

2013-05-01

155

Tool commonality analysis for yield enhancement  

Microsoft Academic Search

ULSI semiconductor processing today involves hundreds of process steps through various semiconductor processing tools. Any tool excursion could lead to serious and costly yield problems. Tool commonality among bad lots is a proven technique to identify the root cause of the problem. As the complexity of process and the number of process steps increase, it is a very challenging task

George Kong

2002-01-01

156

Deconvolution of Plant Type(s) for Homeland Security Enforcement Using Remote Sensing on a UAV Collection Platform.  

National Technical Information Service (NTIS)

The technological ability to distinguish drug plants from other plant types, termed deconvolution, can be a valuable technological tool in the fight against drug trafficking and the war on terrorism. The use of computers and associated hardware as well as...

J. A. Tindall

2006-01-01

157

Solar Array Verification Analysis Tool (SAVANT) Developed  

NASA Technical Reports Server (NTRS)

Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

1999-01-01

158

Comparative analysis of pharmacophore screening tools.  

PubMed

The pharmacophore concept is of central importance in computer-aided drug design (CADD) mainly because of its successful application in medicinal chemistry and, in particular, high-throughput virtual screening (HTVS). The simplicity of the pharmacophore definition enables the complexity of molecular interactions between ligand and receptor to be reduced to a handful of features. With many pharmacophore screening software packages available, it is of the utmost interest to explore the behavior of these tools when applied to different biological systems. In this work, we present a comparative analysis of eight pharmacophore screening algorithms (Catalyst, Unity, LigandScout, Phase, Pharao, MOE, Pharmer, and POT) for their use in typical HTVS campaigns against four different biological targets by using default settings. The results herein presented show how the performance of each pharmacophore screening tool might be specifically related to factors such as the characteristics of the binding pocket, the use of specific pharmacophore features, and the use of these techniques in specific steps/contexts of the drug discovery pipeline. Algorithms with rmsd-based scoring functions are able to predict more compound poses correctly than overlay-based scoring functions. However, the ratio of correctly predicted compound poses versus incorrectly predicted poses is better for overlay-based scoring functions, which also ensure better performance in compound library enrichments. While the ensemble of these observations can be used to choose the most appropriate class of algorithm for specific virtual screening projects, we remarked that pharmacophore algorithms are often equally good, and in this respect, we also analyzed how pharmacophore algorithms can be combined in order to increase the success of hit compound identification. This study provides a valuable benchmark set for further developments in the field of pharmacophore search algorithms, e.g., by using pose predictions and compound library enrichment criteria. PMID:22646988

Sanders, Marijn P A; Barbosa, Arménio J M; Zarzycka, Barbara; Nicolaes, Gerry A F; Klomp, Jan P G; de Vlieg, Jacob; Del Rio, Alberto

2012-06-25

159

Blind deconvolution applied to acoustical systems identification with supporting experimental results.  

PubMed

Many acoustical applications require the analysis of a signal that is corrupted by an unknown filtering function. Examples arise in the areas of noise or vibration control, room acoustics, structural vibration analysis, and speech processing. Here, the observed signal can be modeled as the convolution of the desired signal with an unknown system impulse response. Blind deconvolution refers to the process of learning the inverse of this unknown impulse response and applying it to the observed signal to remove the filtering effects. Unlike classical deconvolution, which requires prior knowledge of the impulse response, blind deconvolution requires only reasonable prior estimates of the input signal's statistics. The significant contribution of this work lies in experimental verification of a blind deconvolution algorithm in the context of acoustical system identification. Previous experimental work concerning blind deconvolution in acoustics has been minimal, as previous literature concerning blind deconvolution uses computer simulated data. This paper examines experiments involving three classical acoustic systems: driven pipe, driven pipe with open side branch, and driven pipe with Helmholtz resonator side branch. Experimental results confirm that the deconvolution algorithm learns these systems' inverse impulse responses, and that application of these learned inverses removes the effects of the filters. PMID:14587599
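
(Illustrative sketch, not the algorithm used in this work.) One classical way to learn an inverse filter from the observed signal alone is minimum-entropy deconvolution in the style of Wiggins, which adapts an FIR filter so that its output has maximal normalized kurtosis. The function and parameter names below are hypothetical.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.signal import lfilter

def minimum_entropy_deconvolution(x, filter_len=30, n_iter=20):
    """Adapt an FIR inverse filter from the observed signal alone by
    maximizing the normalized kurtosis (varimax norm) of its output.
    Wiggins-style MED, shown only as a generic blind-deconvolution example."""
    r = np.correlate(x, x, mode="full")
    mid = len(r) // 2
    R = toeplitz(r[mid:mid + filter_len])     # autocorrelation matrix of x

    f = np.zeros(filter_len)
    f[0] = 1.0                                # start from a spike filter
    for _ in range(n_iter):
        y = lfilter(f, [1.0], x)              # current deconvolved output
        # cross-correlation of y**3 with the delayed input (circular at the edges)
        g = np.array([np.dot(y**3, np.roll(x, lag)) for lag in range(filter_len)])
        scale = np.sum(y**2) / np.sum(y**4)
        f = np.linalg.solve(R, scale * g)
        f /= np.linalg.norm(f)                # keep the filter normalized
    return f
```

On a toy example, one would convolve a spiky source with an unknown FIR system, pass the result to this function, and check that filtering with the returned coefficients restores a spikier signal.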

Roan, Michael J; Gramann, Mark R; Erling, Josh G; Sibul, Leon H

2003-10-01

160

Blind deconvolution applied to acoustical systems identification with supporting experimental results  

NASA Astrophysics Data System (ADS)

Many acoustical applications require the analysis of a signal that is corrupted by an unknown filtering function. Examples arise in the areas of noise or vibration control, room acoustics, structural vibration analysis, and speech processing. Here, the observed signal can be modeled as the convolution of the desired signal with an unknown system impulse response. Blind deconvolution refers to the process of learning the inverse of this unknown impulse response and applying it to the observed signal to remove the filtering effects. Unlike classical deconvolution, which requires prior knowledge of the impulse response, blind deconvolution requires only reasonable prior estimates of the input signal's statistics. The significant contribution of this work lies in experimental verification of a blind deconvolution algorithm in the context of acoustical system identification. Previous experimental work concerning blind deconvolution in acoustics has been minimal, as previous literature concerning blind deconvolution uses computer simulated data. This paper examines experiments involving three classical acoustic systems: driven pipe, driven pipe with open side branch, and driven pipe with Helmholtz resonator side branch. Experimental results confirm that the deconvolution algorithm learns these systems' inverse impulse responses, and that application of these learned inverses removes the effects of the filters.

Roan, Michael J.; Gramann, Mark R.; Erling, Josh G.; Sibul, Leon H.

2003-10-01

161

Spacecraft Electrical Power System (EPS) generic analysis tools and techniques  

NASA Technical Reports Server (NTRS)

An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

Morris, Gladys M.; Sheppard, Mark A.

1992-01-01

162

Tools for Decision Analysis: Analysis of Risky Decisions  

NSDL National Science Digital Library

This site offers a decision making procedure for solving complex problems step by step. It presents the decision-analysis process for both public and private decision-making, using different decision criteria, different types of information, and information of varying quality. It describes the elements in the analysis of decision alternatives and choices, as well as the goals and objectives that guide decision-making. The key issues related to a decision-maker's preferences regarding alternatives, criteria for choice, and choice modes, together with the risk assessment tools are also presented.

163

Mars UHF relay telecom: Engineering tools and analysis  

Microsoft Academic Search

A compact set of tools has evolved for engineering analysis of Mars UHF relay telecommunications performance at NASA's Jet Propulsion Laboratory. The tools model all telecom variables from the RF layer up through the protocol link layer. The tools can solve for point solutions, analyze dynamic performance based on link geometry, and generate a variety of multi-day performance statistics.

Bradford W. Arnold; David J. Bell; Monika J. Danos; Peter A. Ilott; Ricardo Mendoza; Mazen Shihabi

2012-01-01

164

Scalable analysis tools for sensitivity analysis and UQ (3160) results.  

SciTech Connect

The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

2009-09-01

165

Blind deconvolution of speckle images.  

NASA Astrophysics Data System (ADS)

A technique for deconvolving an image from both a single convolution and an ensemble of differently blurred images is presented. The method is more robust than the earlier blind deconvolution algorithms proposed by Ayers and Dainty. The performance of the algorithm in the presence of noise is evaluated. It is also demonstrated how the algorithm can be modified to utilize the much greater amount of information contained in an ensemble of differently blurred pictures of an image. Reconstructions using both computer simulations and infrared astronomical speckle data are presented. The speckle reconstructions are compared with those obtained by both Fourier phase retrieval and bispectral estimation.

Lane, R. G.

1992-09-01

166

Principal components analysis as a tool for Quaternary paleoclimatic research  

SciTech Connect

Nine small lakes on southeast Baffin Island, NWT, Canada, were cored and the sediments retrieved were analyzed for sediment size and composition, magnetic susceptibility, sediment geochemistry, organic matter content, and carbon isotopic composition. Age control was obtained from 85 AMS radiocarbon dates. In total, 1,847 measurements were made on twelve cores. The size of the data set precluded the use of visual analysis of the trends within each of the variable data sets. The method used to deconvolute the paleoenvironmental signal was one of principal components analysis and regression. Principal components analysis was carried out on the entire data set to determine which variables caused most of the variance within the overall signal. This showed that three principal components axes (PCAs) could account for 79% of the total variance within the data set. For each PCA, the closest correlated variable was chosen (sand content, total organic matter content, and sedimentation rate) and for each lake core, this variable was regressed against time. Residuals from the regression trend were then derived and normalized to a Z score. Z scores for each variable were plotted against age. Then, within 500 year timeslots, the median residual Z score was determined. This gave a stepped record of residuals throughout the Holocene and indicated periods of significant environmental change within the lakes' watersheds. Comparing this to previously obtained pollen and diatom records from the same area showed similarity and also illustrated important local differences.
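
A rough sketch of the workflow summarized above (principal components to rank variables by explained variance, then regression of a chosen variable against age and normalization of the residuals to Z scores), assuming a standardized measurement matrix; the array and function names are hypothetical.

```python
import numpy as np

def pca_and_residual_zscores(data, ages, var_index):
    """data: (n_samples, n_variables) matrix of measurements.
    Returns the explained-variance ratios from PCA and the Z-scored
    residuals of one chosen variable after removing its linear age trend."""
    X = (data - data.mean(axis=0)) / data.std(axis=0)   # standardize columns
    _, s, _ = np.linalg.svd(X, full_matrices=False)     # PCA via SVD
    explained = s**2 / np.sum(s**2)

    # Linear regression of the chosen variable against age
    y = X[:, var_index]
    A = np.vstack([ages, np.ones_like(ages)]).T
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    residuals = y - A @ coef
    z = (residuals - residuals.mean()) / residuals.std()
    return explained, z
```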

Miller, R.J.O. (Univ. of Colorado, Boulder, CO (United States). Dept. of Geological Sciences)

1992-01-01

167

DERMAL ABSORPTION OF PESTICIDES CALCULATED BY DECONVOLUTION  

EPA Science Inventory

Using published human data on skin-to-urine and blood-to-urine transfer of 12 pesticides and herbicides, the skin-to-blood transfer rates for each compound were estimated by two numerical deconvolution techniques. Regular constrained deconvolution produced an estimated upper limi...

168

Direct deconvolution of radio synthesis images using L1 minimisation  

NASA Astrophysics Data System (ADS)

Aims: We introduce an algorithm for the deconvolution of radio synthesis images that accounts for the non-coplanar-baseline effect, allows multiscale reconstruction onto arbitrarily positioned pixel grids, and allows the antenna elements to have direction-dependent gains. Methods: Using numerical L1-minimisation techniques established in the application of compressive sensing to radio astronomy, we directly solve the deconvolution equation using graphics processing unit (GPU) hardware. This approach relies on an analytic expression for the contribution of a pixel in the image to the observed visibilities, and the well-known expression for Dirac delta function pixels is used along with two new approximations for Gaussian pixels, which allow for multi-scale deconvolution. The algorithm is similar to the CLEAN algorithm in that it fits the reconstructed pixels in the image to the observed visibilities while minimising the total flux; however, unlike CLEAN, it operates on the ungridded visibilities, enforces positivity, and has guaranteed global convergence. The pixels in the image can be arbitrarily distributed and arbitrary gains between each pixel and each antenna element can also be specified. Results: Direct deconvolution of the observed visibilities is shown to be feasible for several deconvolution problems, including a 1 megapixel wide-field image with over 400 000 visibilities. Correctness of the algorithm is shown using synthetic data, and the algorithm shows good image reconstruction performance for wide field images and requires no regridding of visibilities. Though this algorithm requires significantly more computation than methods based on the CLEAN algorithm, we demonstrate that it is trivially parallelisable across multiple GPUs and potentially can be scaled to GPU clusters. We also demonstrate that a significant speed up is possible through the use of multi-scale analysis using Gaussian pixels.
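
As a hedged illustration of the underlying idea (not the GPU implementation described above), L1-regularized deconvolution with a positivity constraint can be sketched on a toy 1-D problem using iterative soft thresholding (ISTA); all names below are hypothetical.

```python
import numpy as np
from scipy.signal import fftconvolve

def ista_deconvolve(observed, psf, lam=0.05, n_iter=200):
    """Sparse deconvolution by ISTA:
    minimize 0.5*||conv(psf, x) - observed||^2 + lam*||x||_1, with x >= 0."""
    n = len(observed)
    H = np.fft.fft(psf, n)
    step = 1.0 / (np.max(np.abs(H))**2 + 1e-12)     # 1 / Lipschitz constant
    x = np.zeros(n)
    for _ in range(n_iter):
        resid = fftconvolve(x, psf, mode="same") - observed
        # flipped-PSF correlation is an approximate adjoint for "same" mode;
        # adequate for a sketch
        grad = fftconvolve(resid, psf[::-1], mode="same")
        x = x - step * grad
        x = np.maximum(x - step * lam, 0.0)         # soft-threshold + positivity
    return x
```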

Hardy, Stephen J.

2013-09-01

169

Industry Sector Analysis Mexico: Machine Tools.  

National Technical Information Service (NTIS)

The Industry Sector Analyses (I.S.A.) for machine tools contains statistical and narrative information on projected market demand, end-users, receptivity of Mexican consumers to U.S. products, the competitive situation - Mexican production, total import m...

1990-01-01

170

Industry Sector Analysis Mexico: Metalworking Machine Tools.  

National Technical Information Service (NTIS)

The Industry Sector Analyses (I.S.A.) for metalworking machine tools contains statistical and narrative information on projected market demand, end-users, receptivity of Mexican consumers to U.S. products, the competitive situation - Mexican production, t...

1991-01-01

171

Tools for Knowledge Analysis, Synthesis, and Sharing  

NASA Astrophysics Data System (ADS)

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

Medland, Michael B.

2007-04-01

172

Interactive Graphics Tools for Analysis of MOLA and Other Data  

NASA Technical Reports Server (NTRS)

We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

Frey, H.; Roark, J.; Sakimoto, S.

2000-01-01

173

General Mission Analysis Tool (GMAT) User's Guide (Draft)  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

Hughes, Steven P.

2007-01-01

174

Scenarios in selected tools for environmental systems analysis  

Microsoft Academic Search

A number of different tools for analysing environmental impacts of different systems have been developed. These include procedural tools such as strategic environmental assessment (SEA) and environmental management systems (EMS) as well as analytical ones such as life cycle assessment (LCA), life cycle costing (LCC), cost–benefit analysis (CBA) and the system of economic and environmental accounts (SEEA) including input–output analysis

Mattias Höjer; Sofia Ahlroth; Karl-Henrik Dreborg; Tomas Ekvall; Göran Finnveden; Olof Hjelm; Elisabeth Hochschorner; Måns Nilsson; Viveka Palm

2008-01-01

175

Multi-body Dynamic Contact Analysis Tool for Transmission Design.  

National Technical Information Service (NTIS)

The report describes the development of a finite element based multi-body contact analysis tool. This analysis tool is meant for use in the design of complex and flexible geared transmissions, such as those found in rotor-craft and automobiles. The techn...

S. Vijayakar S. Abad R. Gunda

2003-01-01

176

Space mission scenario development and performance analysis tool  

Microsoft Academic Search

This paper discusses a new and innovative approach for a rapid spacecraft multidisciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during

M. Kordon; J. Baker; J. Gilbert; D. Hanks

2005-01-01

177

SWOT Analysis Support Tool for Verification of Business Strategy  

Microsoft Academic Search

To verify business strategies, it is essential to extract and analyze business information. The amount of business information is so enormous that it is time-consuming for an analyst to extract and analyze. Thus, we propose a support tool for verifying business strategies using SWOT analysis. This tool supports the extraction of factors for analysis. However, the information includes unnecessary factors,

M. Samejima; Y. Shimizu; M. Akiyoshi; N. Komoda

2006-01-01

178

Traffic Analysis Toolbox Volume II. Decision Support Methodology for Selecting Traffic Analysis Tools.  

National Technical Information Service (NTIS)

This report provides an overview of the role of traffic analysis tools in the transportation analysis process and provides a detailed decision support methodology for selecting the appropriate type of analysis tool for the job at hand. An introduction to ...

K. Jeannotte A. Chandra V. Alexiadis A. Skabardonis

2004-01-01

179

A Multidimensional Analysis Tool for Visualizing Online Interactions  

ERIC Educational Resources Information Center

This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

Kim, Minjeong; Lee, Eunchul

2012-01-01

180

An Integrated Tool for System Analysis of Sample Return Vehicles  

NASA Technical Reports Server (NTRS)

The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

2012-01-01

181

FDTD simulation tools for UWB antenna analysis.  

SciTech Connect

This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

Brocato, Robert Wesley

2004-12-01

182

FDTD simulation tools for UWB antenna analysis.  

SciTech Connect

This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

Brocato, Robert Wesley

2005-02-01

183

Tools for Knowledge Analysis, Synthesis, and Sharing  

ERIC Educational Resources Information Center

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

Medland, Michael B.

2007-01-01

184

CCMR: Wound Dressing Tool and Wound Analysis  

NSDL National Science Digital Library

The goal of our project is to develop a Wound Dressing Tool (WDT) that, in addition to extracting overabundant chemicals as the VAC system does, also allows for variable rates of mass transfer and provides a way for clinicians to monitor the fluid chemical composition of the wound bed during the healing and treatment processes.

Men, Shannon

2005-08-17

185

Knowledge Mapping: A Multipurpose Task Analysis Tool.  

ERIC Educational Resources Information Center

Describes knowledge mapping, a tool developed to increase the objectivity and accuracy of task difficulty ratings for job design. Application in a semiconductor manufacturing environment is discussed, including identifying prerequisite knowledge for a given task; establishing training development priorities; defining knowledge levels; identifying…

Esque, Timm J.

1988-01-01

186

Model analysis tools in the Virtual Model Repository (VMR)  

NASA Astrophysics Data System (ADS)

The Virtual Model Repository (VMR) provides scientific analysis tools for a wide variety of numerical models of the Earth's magnetosphere. Data discovery, visualization tools and data/model comparisons are provided in a consistent and intuitive format. A large collection of numerical model runs are available to analyze, including the large Earth magnetosphere event run library at the CCMC and many runs from the University of Michigan. Relevant data useful for data/model comparisons is found using various APIs and included in many of the visualization tools. Recent additions to the VMR include a comprehensive suite of tools for analysis of the Global Ionosphere Thermosphere Model (GITM).

De Zeeuw, D.; Ridley, A. J.

2013-12-01

187

Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool  

NASA Technical Reports Server (NTRS)

This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

Maul, William A.; Fulton, Christopher E.

2011-01-01

188

Fourier transform spectrometer instrument lineshape (ILS) retrieval by Fourier deconvolution  

NASA Astrophysics Data System (ADS)

This work presents a Fourier deconvolution (FD) technique for retrieving the instrument lineshape (ILS) function of high-resolution Fourier transform infrared (FTIR) spectrometers. The ILS retrieved by FD is compared with the results obtained using the LINEFIT technique of Hase. The effect of a non-ideal ILS function on quantitative analysis of HBr is explored and improvements in the results of quantitative analyses are demonstrated.

Bernardo, Cirilo; Griffith, David W. T.

2005-10-01

189

Time domain deconvolution approach relying on Galois sequences  

Microsoft Academic Search

Purpose – The paper seeks to present a novel pulsed eddy currents (PEC) non-destructive technique to investigate deep cracks in metallic structures. Design/methodology/approach – The method is based on the time-domain analysis of the “defect response” and joins the PEC approach with the exploitation of the peculiar auto-correlation properties of the Galois sequences. The procedure, relying on the deconvolution of

Pietro Burrascano; Mario Carpentieri; Alessandro Pirani; Marco Ricci; Francesco Tissi

2007-01-01

190

SIMMER as a safety analysis tool  

SciTech Connect

SIMMER has been used for numerous applications in fast reactor safety, encompassing both accident and experiment analysis. Recent analyses of transition-phase behavior in potential core disruptive accidents have integrated SIMMER testing with the accident analysis. Results of both the accident analysis and the verification effort are presented as a comprehensive safety analysis program.

Smith, L.L.; Bell, C.R.; Bohl, W.R.; Bott, T.F.; Dearing, J.F.; Luck, L.B.

1982-01-01

191

FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)  

NASA Technical Reports Server (NTRS)

The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A
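
A toy sketch of the digraph idea described above: downstream effects of a failure are the nodes reachable from it, and candidate causes of a failure are the nodes that can reach it. The graph, node names, and representation below are hypothetical and are not FEAT's actual model format.

```python
from collections import deque

def reachable(graph, start_nodes):
    """Breadth-first reachability over a failure-propagation digraph
    given as {node: [nodes it propagates to]}."""
    seen = set(start_nodes)
    queue = deque(start_nodes)
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Hypothetical model: a pump failure propagates to low flow, then overtemp, etc.
digraph = {
    "pump_fail": ["low_flow"],
    "valve_stuck": ["low_flow"],
    "low_flow": ["overtemp"],
    "overtemp": ["shutdown"],
}
effects = reachable(digraph, ["pump_fail"])        # downstream effects

reverse = {}
for src, dsts in digraph.items():
    for dst in dsts:
        reverse.setdefault(dst, []).append(src)
causes = reachable(reverse, ["overtemp"])          # candidate causes
```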

Pack, G.

1994-01-01

192

Fuzzy logic components for iterative deconvolution systems  

NASA Astrophysics Data System (ADS)

Deconvolution systems rely heavily on expert knowledge and would benefit from approaches that capture this expert knowledge. Fuzzy logic is an approach that is used to capture expert knowledge rules and produce outputs that range in degree. This paper describes a fuzzy-deconvolution-system that integrates traditional Richardson-Lucy deconvolution with fuzzy components. The system is intended for restoration of 3D widefield images taken under conditions of refractive index mismatch. The system uses a fuzzy rule set for calculating sample refractive index, a fuzzy median filter for inter-iteration noise reduction, and a fuzzy rule set for stopping criteria.
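
For context, the classical Richardson-Lucy iteration that such a system builds on can be sketched in a few lines; this omits the fuzzy rule sets, fuzzy median filter, and stopping criteria described above, and is shown for 1-D or 2-D arrays rather than 3-D widefield stacks.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=30):
    """Plain Richardson-Lucy deconvolution for non-negative data."""
    psf_mirror = psf[::-1, ::-1] if psf.ndim == 2 else psf[::-1]
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```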

Northan, Brian M.

2013-02-01

193

IC failure analysis: techniques and tools for quality reliability improvement  

Microsoft Academic Search

The role of failure analysis is discussed. Failure analysis techniques and tools, including electrical measurements, optical microscopy, thermal imaging analysis, electron beam techniques, light emission microscopy, ion beam techniques, and scanning probe microscopy, are reviewed. Opportunities for advances in the field of IC failure analysis are considered

JERRY M. SODEN; RICHARD E. ANDERSON

1993-01-01

194

Symbolic data analysis tools for recommendation systems  

Microsoft Academic Search

Recommender systems have become an important tool to cope with the information overload problem by acquiring data about user\\u000a behavior. After tracing the user’s behavior, through actions or rates, computational recommender systems use information-\\u000a filtering techniques to recommend items. In order to recommend new items, one of the three major approaches is generally adopted:\\u000a content-based filtering, collaborative filtering, or hybrid

Byron Leite Dantas Bezerra; Francisco de Assis Tenorio de Carvalho

2011-01-01

195

Development of wavelet analysis tools for turbulence  

NASA Technical Reports Server (NTRS)

Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

1992-01-01

196

Development of wavelet analysis tools for turbulence  

NASA Astrophysics Data System (ADS)

Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

Bertelrud, A.; Erlebacher, G.; Dussouillez, Ph.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, Ph.

1992-07-01

197

Interaction Analysis: A Tool for Understanding Negotiations.  

ERIC Educational Resources Information Center

The authors explain how interaction analysis can extend existing methods of analyzing negotiations by examining economic bargaining theories. They code the transcript of a negotiation using two coding schemes and apply Markov chain analysis to the results. (Author/SK)

Bednar, David A.; Curington, William P.

1983-01-01

198

Tools for Physics Analysis in CMS  

NASA Astrophysics Data System (ADS)

The CMS Physics Analysis Toolkit (PAT) is presented. The PAT is a high-level analysis layer enabling the development of common analysis efforts across and within physics analysis groups. It aims at fulfilling the needs of most CMS analyses, providing both ease-of-use for the beginner and flexibility for the advanced user. The main PAT concepts are described in detail and some examples from realistic physics analyses are given.

Hinzmann, Andreas

2011-12-01

199

The physics analysis tools project for the ATLAS experiment  

NASA Astrophysics Data System (ADS)

The Large Hadron Collider is expected to start colliding proton beams in 2009. The enormous amount of data produced by the ATLAS experiment (~1 PB per year) will be used in searches for the Higgs boson and physics beyond the Standard Model. In order to meet this challenge, a suite of common Physics Analysis Tools has been developed as part of the Physics Analysis software project. These tools run within the ATLAS software framework, ATHENA, covering a wide range of applications. There are tools responsible for event selection based on analysed data and detector quality information, tools responsible for specific physics analysis operations including data quality monitoring and physics validation, and complete analysis toolkits (frameworks) with the goal of aiding the physicist in performing an analysis while hiding the details of the ATHENA framework.

Lenzi, Bruno; ATLAS Collaboration

2012-12-01

200

Automated Security Protocol Analysis With the AVISPA Tool  

Microsoft Academic Search

The AVISPA Tool is a push-button tool for the Automated Validation of Internet Security Protocols and Applications. It provides a modular and expressive formal language for specifying protocols and their security properties, and integrates different back-ends that implement a variety of automatic protocol analysis techniques. Experimental results, carried out on a large library of Internet security protocols, indicate that the

Luca Viganò

2006-01-01

201

ATACOBOL: A COBOL Test Coverage Analysis Tool and Its Applications  

Microsoft Academic Search

A coverage testing tool, ATACOBOL (Automatic Test Analysis for COBOL), that applies a data flow coverage technique is developed for software development on the IBM System/390 mainframe. We show that the data flow coverage criteria can identify possible problematic paths that map to the actual testing semantics required by Y2K compliance software testing. However, the mainframe environment lacks testing tools that equip

Sam K. S. Sze; Michael R. Lyu

2000-01-01

202

Tools and techniques for failure analysis and qualification of MEMS.  

SciTech Connect

Many of the tools and techniques used to evaluate and characterize ICs can be applied to MEMS technology. In this paper we discuss various tools and techniques used to provide structural, chemical, and electrical analysis and how these data aid in qualifying MEMS technologies.

Walraven, Jeremy Allen

2003-07-01

203

Analysis of tools for simulation of Shipboard Electric Power Systems  

Microsoft Academic Search

Navy Shipboard Power Systems have different characteristics when compared with utility power systems. To conduct system studies on shipboard power systems, an effective simulation tool is required to model the Shipboard Electric Power Systems. This paper presents the results of the analysis of three popular simulation tools, ATP, PSpice and Saber for transient simulation of Shipboard Electric Power Systems (SPSs).

H. Zhang; K. l. Butler; N. d. r. Sarma; H. Docarmo; S. Gopalakrishnan; A. Adediran

2001-01-01

204

Design of a Syntax Validation Tool for Requirements Analysis Using Structured Analysis and Design Technique (SADT).  

National Technical Information Service (NTIS)

This thesis investigation presents the prototype development of a validation tool for checking the syntax of Structured Analysis and Design Technique (SADT) method from a structured analysis diagram. The tool provides the requirements analyst and the desi...

D. H. Jung

1988-01-01

205

Monte Carlo Approach to Numerical Deconvolution.  

National Technical Information Service (NTIS)

A numerical procedure for solving deconvolution problems is presented. The procedure is based on the Monte Carlo method, which statistically estimates each element in the deconvolved excitation. A discrete Fourier transform technique is used to improve th...

M. P. Ekstrom

1976-01-01

206

Deconvolution of sinusoidal rapid EPR scans.  

PubMed

In rapid scan EPR the magnetic field is scanned through the signal in a time that is short relative to electron spin relaxation times. Previously it was shown that the slow-scan lineshape could be recovered from triangular rapid scans by Fourier deconvolution. In this paper a general Fourier deconvolution method is described and demonstrated to recover the slow-scan lineshape from sinusoidal rapid scans. Since an analytical expression for the Fourier transform of the driving function for a sinusoidal scan was not readily apparent, a numerical method was developed to do the deconvolution. The slow scan EPR lineshapes recovered from rapid triangular and sinusoidal scans are in excellent agreement for lithium phthalocyanine, a trityl radical, and the nitroxyl radical, tempone. The availability of a method to deconvolute sinusoidal rapid scans makes it possible to scan faster than is feasible for triangular scans because of hardware limitations on triangular scans. PMID:21163677
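
A generic sketch of the Fourier-deconvolution step (regularized spectral division), not the specific sinusoidal-scan treatment worked out in the paper; the signal names are hypothetical and the driving function is assumed known.

```python
import numpy as np

def fourier_deconvolve(measured, driving, eps=1e-3):
    """Recover an underlying signal from `measured`, assuming it is the
    convolution of that signal with `driving`; `eps` regularizes the
    division where the driving spectrum is small (Wiener-like damping)."""
    M = np.fft.fft(measured)
    D = np.fft.fft(driving, n=len(measured))
    recovered = np.fft.ifft(M * np.conj(D) / (np.abs(D)**2 + eps))
    return recovered.real
```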

Tseitlin, Mark; Rinard, George A; Quine, Richard W; Eaton, Sandra S; Eaton, Gareth R

2011-02-01

207

Deconvolution of Sinusoidal Rapid EPR Scans  

PubMed Central

In rapid scan EPR the magnetic field is scanned through the signal in a time that is short relative to electron spin relaxation times. Previously it was shown that the slow scan lineshape could be recovered from triangular rapid scans by Fourier deconvolution. In this paper a general Fourier deconvolution method is described and demonstrated to recover the slow scan lineshape from sinusoidal rapid scans. Since an analytical expression for the Fourier transform of the driving function for a sinusoidal scan was not readily apparent, a numerical method was developed to do the deconvolution. The slow scan EPR lineshapes recovered from rapid triangular and sinusoidal scans are in excellent agreement for lithium phthalocyanine, a trityl radical, and the nitroxyl radical, tempone. The availability of a method to deconvolute sinusoidal rapid scans makes it possible to scan faster than is feasible for triangular scans because of hardware limitations on triangular scans.

Tseitlin, Mark; Rinard, George A.; Quine, Richard W.; Eaton, Sandra S.; Eaton, Gareth R.

2011-01-01

208

Extending Iris: The VAO SED Analysis Tool  

NASA Astrophysics Data System (ADS)

Iris is a tool developed by the Virtual Astronomical Observatory (VAO) for building and analyzing Spectral Energy Distributions (SEDs). Iris was designed to be extensible, so that new components and models can be developed by third parties and then included at runtime. Iris can be extended in different ways: new file readers allow users to integrate data in custom formats into Iris SEDs; new models can be fitted to the data, in the form of template libraries for template fitting, data tables, and arbitrary Python functions. The interoperability-centered design of Iris and the Virtual Observatory standards and protocols can enable new science functionalities involving SED data.

Laurino, O.; Busko, I.; Cresitello-Dittmar, M.; D'Abrusco, R.; Doe, S.; Evans, J.; Pevunova, O.

2013-10-01

209

BRFSS: Prevalence Data and Data Analysis Tools  

NSDL National Science Digital Library

BRFSS is the nation's premier system of health-related telephone surveys that collect state data about U.S. residents regarding their health-related risk behaviors, chronic health conditions, and use of preventive services. BRFSS collects data in all 50 states as well as the District of Columbia and three U.S. territories. BRFSS completes more than 400,000 adult interviews each year, making it the largest continuously conducted health survey system in the world. These tools allow the user to perform various analyses and display the data in different ways.

Control, Center F.

210

Efficient Bayesian-based multiview deconvolution.  

PubMed

Light-sheet fluorescence microscopy is able to image large specimens with high resolution by capturing the samples from multiple angles. Multiview deconvolution can substantially improve the resolution and contrast of the images, but its application has been limited owing to the large size of the data sets. Here we present a Bayesian-based derivation of multiview deconvolution that drastically improves the convergence time, and we provide a fast implementation using graphics hardware. PMID:24747812

Preibisch, Stephan; Amat, Fernando; Stamataki, Evangelia; Sarov, Mihail; Singer, Robert H; Myers, Eugene; Tomancak, Pavel

2014-06-01

211

The role and selection of the filter function in Fourier self-deconvolution revisited.  

PubMed

Overlapped bands often appear in applications of infrared spectroscopy, for instance in the analysis of the amide I band of proteins. Fourier self-deconvolution (FSD) is a popular band-narrowing mathematical method, allowing for the resolution of overlapped bands. The filter function used in FSD plays a significant role in the factor by which the deconvolved bands are actually narrowed (the effective narrowing), as well as in the final signal-to-noise degradation induced by FSD. Moreover, the filter function determines, to a good extent, the band-shape of the deconvolved bands. For instance, the intensity of the harmful side-lobule oscillations that appear in over-deconvolution depends importantly on the filter function used. In the present paper we characterized the resulting band shape, effective narrowing, and signal-to-noise degradation in infra-, self-, and over-deconvolution conditions for several filter functions: Triangle, Bessel, Hanning, Gaussian, Sinc2, and Triangle2. We also introduced and characterized new filters based on the modification of the Blackmann filter. Our conclusion is that the Bessel filter (in infra-, self-, and mild over-deconvolution), the newly introduced BL3 filter (in self- and mild/moderate over-deconvolution), and the Gaussian filter (in moderate/strong over-deconvolution) are the most suitable filter functions to be used in FSD. PMID:19589217
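
A bare-bones sketch of Fourier self-deconvolution: transform the spectrum to the Fourier domain, divide out the assumed Lorentzian decay, re-apodize with a filter (a Gaussian is used here, one of the options discussed above), and transform back. The half-width `gamma` and the filter width are hypothetical inputs, and no attempt is made here to control over-deconvolution.

```python
import numpy as np

def fourier_self_deconvolution(spectrum, wn_step, gamma, filter_width):
    """spectrum: intensities on an evenly spaced wavenumber grid (cm^-1);
    wn_step: grid spacing (cm^-1); gamma: assumed Lorentzian HWHM (cm^-1);
    filter_width: width of the Gaussian filter in the Fourier domain."""
    n = len(spectrum)
    interferogram = np.fft.rfft(spectrum)
    # Fourier-domain variable conjugate to wavenumber (a "retardation" axis)
    t = np.fft.rfftfreq(n, d=wn_step)
    deconvolved = interferogram * np.exp(2 * np.pi * gamma * t)   # undo Lorentzian decay
    filtered = deconvolved * np.exp(-(t / filter_width) ** 2)     # Gaussian apodization
    return np.fft.irfft(filtered, n)
```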

Lórenz-Fonfría, Víctor A; Padrós, Esteve

2009-07-01

212

Industry Sector Analysis Mexico: Machine-Tools: Presses.  

National Technical Information Service (NTIS)

The market survey covers the machine tool presses market in Mexico. The analysis contains statistical and narrative information on projected market demand, end-users; receptivity of Mexican consumers to U.S. products; the competitive situation, and market...

A. Harris

1991-01-01

213

Analysis tools for turbulence studies at Alcator C-Mod  

NASA Astrophysics Data System (ADS)

A new suite of analysis tools written in IDL is being developed to support experimental investigation of turbulence at Alcator C-Mod. The tools include GUIs for spectral analysis (coherence, cross-phase and bicoherence) and characteristic frequency calculations. A user-friendly interface for the GENRAY code, to facilitate in-between shot ray-tracing analysis, is also being developed. The spectral analysis tool is being used to analyze data from existing edge turbulence diagnostics, such as the O-mode correlation reflectometer and Gas Puff Imaging, during I-mode, ITB and EDA H-mode plasmas. GENRAY and the characteristic frequency tool are being used to study diagnostic accessibility limits set by wave propagation and refraction for X-mode Doppler Backscattering and Correlation Electron Cyclotron Emission (CECE) systems that are being planned for core turbulence studies at Alcator C-Mod.
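
The basic spectral quantities mentioned above (coherence and cross-phase between two fluctuating signals) can be sketched with standard SciPy routines; the signals, sampling rate, and segment length below are hypothetical stand-ins for digitizer data.

```python
import numpy as np
from scipy.signal import csd, coherence

fs = 2.0e6                                   # hypothetical sampling rate [Hz]
rng = np.random.default_rng(0)
# sig_a, sig_b stand in for two fluctuation channels (e.g. reflectometer, GPI)
sig_a = rng.standard_normal(65536)
sig_b = np.roll(sig_a, 5) + 0.5 * rng.standard_normal(65536)

f, coh = coherence(sig_a, sig_b, fs=fs, nperseg=1024)   # magnitude-squared coherence
f, Pxy = csd(sig_a, sig_b, fs=fs, nperseg=1024)         # cross-spectral density
cross_phase = np.angle(Pxy)                             # radians per frequency bin
```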

Burns, C.; Shehata, S.; White, A. E.; Cziegler, I.; Dominguez, A.; Terry, J. L.; Pace, D. C.

2010-11-01

214

HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL  

EPA Science Inventory

An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

215

The environment power system analysis tool development program  

NASA Technical Reports Server (NTRS)

The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

1990-01-01

216

SEA-MAT: Matlab Tools for Oceanographic Analysis  

NSDL National Science Digital Library

This site is a collaborative effort to organize and distribute Matlab tools for the oceanographic community. Matlab is multi-platform software that provides numeric computation, technical graphics and visualization, and an intuitive programming language for applications in engineering and science. This site provides links to and downloads of code for oceanographic data analysis tools, including time series analysis, numerical modelling, mapping, hydrography, and data interface.

1998-09-16

217

Analysis and computer tools for separation processes involving nonideal mixtures  

SciTech Connect

The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

Lucia, A.

1992-05-01

218

Blind deconvolution in spread spectrum communications over non minimum phase channels  

Microsoft Academic Search

Higher-than-second-order statistics are a powerful signal processing tool, and a lot of research effort has been made to find new and implementable cumulant-based algorithms for system identification and deconvolution of non-minimum-phase systems for application in cases of practical interest. In communications problems, equalization in the presence of time-dispersive channels (usually non minimum

Massimiliano Martone

1994-01-01

219

LCD ROOT Simulation and Analysis Tools  

SciTech Connect

The North American Linear Collider Detector group has developed a simulation program package based on the ROOT system. The package consists of Fast simulation, the reconstruction of the Full simulated data, and physics analysis utilities.

Iwasaki, Masako

2001-02-08

220

LCD Root Simulation and Analysis Tools.  

National Technical Information Service (NTIS)

The North American Linear Collider Detector group has developed a simulation program package based on the ROOT system. The package consists of Fast simulation, the reconstruction of the Full simulated data, and physics analysis utilities.

M. Iwasaki

2001-01-01

221

RADC SCAT: Automated Sneak Circuit Analysis Tool.  

National Technical Information Service (NTIS)

Standard Sneak Analysis procedures are costly from a time, money, and personnel perspective. The processing of design data available only during the latter portions of the development cycle is highly labor intensive and makes it difficult to institute a design cha...

E. L. DePalma

1990-01-01

222

An adaptive local deconvolution method for implicit LES  

NASA Astrophysics Data System (ADS)

The adaptive local deconvolution method (ALDM) is proposed as a new nonlinear discretization scheme designed for implicit large-eddy simulation (ILES) of turbulent flows. In ILES the truncation error of the discretization of the convective terms functions as a subgrid-scale model. Therefore, the model is implicitly contained within the discretization, and an explicit computation of model terms becomes unnecessary. The discretization is based on a solution-adaptive deconvolution operator which allows the truncation error to be controlled. Deconvolution parameters are determined by an analysis of the spectral numerical viscosity. An automatic optimization based on an evolutionary algorithm is employed to obtain a set of parameters which results in an optimum spectral match for the numerical viscosity with theoretical predictions for isotropic turbulence. Simulations of large-scale forced and decaying three-dimensional homogeneous isotropic turbulence show an excellent agreement with theory and experimental data and demonstrate the good performance of the implicit model. As an example of transitional flows, instability and breakdown of the three-dimensional Taylor-Green vortex are considered. The implicit model correctly predicts instability growth and transition to developed turbulence. It is shown that the implicit model performs at least as well as established explicit models.

Hickel, Stefan; Adams, Nikolaus A.; Domaradzki, J. Andrzej

2006-03-01

223

A Robust Deconvolution Method based on Transdimensional Hierarchical Bayesian Inference  

NASA Astrophysics Data System (ADS)

Analysis of P-S and S-P conversions allows us to map receiver-side crustal and lithospheric structure. This analysis often involves deconvolution of the parent wave field from the scattered wave field as a means of suppressing source-side complexity. A variety of deconvolution techniques exist including damped spectral division, Wiener filtering, iterative time-domain deconvolution, and the multitaper method. All of these techniques require estimates of noise characteristics as input parameters. We present a deconvolution method based on transdimensional Hierarchical Bayesian inference in which both noise magnitude and noise correlation are used as parameters in calculating the likelihood probability distribution. Because the noise for P-S and S-P conversion analysis in terms of receiver functions is a combination of both background noise - which is relatively easy to characterize - and signal-generated noise - which is much more difficult to quantify - we treat measurement errors as an unknown quantity, characterized by a probability density function whose mean and variance are model parameters. This transdimensional Hierarchical Bayesian approach has been successfully used previously in the inversion of receiver functions in terms of shear and compressional wave speeds of an unknown number of layers [1]. In our method we used a Markov chain Monte Carlo (MCMC) algorithm to find the receiver function that best fits the data while accurately assessing the noise parameters. In order to parameterize the receiver function we model the receiver function as an unknown number of Gaussians of unknown amplitude and width. The algorithm takes multiple steps before calculating the acceptance probability of a new model, in order to avoid getting trapped in local misfit minima. Using both observed and synthetic data, we show that the MCMC deconvolution method can accurately obtain a receiver function as well as an estimate of the noise parameters given the parent and daughter components. Furthermore, we demonstrate that this new approach is far less susceptible to generating spurious features even at high noise levels. Finally, the method yields not only the most-likely receiver function, but also quantifies its full uncertainty. [1] Bodin, T., M. Sambridge, H. Tkalčić, P. Arroucau, K. Gallagher, and N. Rawlinson (2012), Transdimensional inversion of receiver functions and surface wave dispersion, J. Geophys. Res., 117, B02301
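
A greatly simplified, fixed-dimension sketch of the hierarchical idea: the noise level is sampled alongside the Gaussian-pulse parameters with a Metropolis-Hastings rule. This omits the transdimensional birth/death moves, the noise-correlation parameter, and the multi-step proposals of the actual method; all priors are taken as flat for brevity, and the names are hypothetical.

```python
import numpy as np

def gaussians(t, params):
    """Sum of Gaussian pulses; params is a flat array of (amp, center, width)."""
    out = np.zeros_like(t)
    for amp, mu, sig in params.reshape(-1, 3):
        out += amp * np.exp(-0.5 * ((t - mu) / sig) ** 2)
    return out

def metropolis_deconvolve(t, data, n_pulses=3, n_steps=20000, step=0.05):
    """Sample pulse parameters and the (log) noise std jointly."""
    rng = np.random.default_rng(1)
    params = rng.uniform(0.1, 1.0, size=3 * n_pulses)
    log_sigma = 0.0                              # hyperparameter: log noise std

    def log_like(p, ls):
        resid = data - gaussians(t, p)
        return -0.5 * np.sum(resid**2) / np.exp(2 * ls) - len(t) * ls

    ll = log_like(params, log_sigma)
    for _ in range(n_steps):
        prop_p = params + step * rng.standard_normal(params.size)
        prop_s = log_sigma + step * rng.standard_normal()
        prop_ll = log_like(prop_p, prop_s)
        if np.log(rng.uniform()) < prop_ll - ll:   # accept/reject (flat priors)
            params, log_sigma, ll = prop_p, prop_s, prop_ll
    return params, np.exp(log_sigma)
```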

Kolb, J.; Lekic, V.

2012-12-01

224

Tool for Rapid Analysis of Monte Carlo Simulations  

NASA Technical Reports Server (NTRS)

Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

2013-01-01

225

Nonlinear Robustness Analysis Tools for Flight Control Law Validation & Verification  

NASA Astrophysics Data System (ADS)

Loss of control in flight is among the highest aviation accident categories for both the number of accidents and the number of fatalities. The flight controls community is seeking improved validation tools for safety-critical flight control systems. Current validation tools rely heavily on linear analysis, which ignores the inherent nonlinear nature of the aircraft dynamics and flight control system. Specifically, current practices in validating the flight control system involve gridding the flight envelope and checking various criteria based on linear analysis to ensure safety of the flight control system. The analysis and certification methods currently applied assume the aircraft dynamics are linear. In reality, the behavior of the aircraft is always nonlinear due to its aerodynamic characteristics and the physical limitations imposed by the actuators. This thesis develops nonlinear analysis tools capable of certifying flight control laws for nonlinear aircraft dynamics. The proposed analysis tools can handle both the aerodynamic nonlinearities and the physical limitations imposed by the actuators in the aircraft dynamics. The proposed validation technique will extend and enrich the predictive capability of existing flight control law validation methods to analyze nonlinearities. The objective of this thesis is to provide the flight control community with an advanced set of analysis tools to reduce aviation accident and fatality rates.

Chakraborty, Abhijit

226

Automated Scalability Analysis Tools for Message Passing Parallel Programs  

NASA Technical Reports Server (NTRS)

In order to develop scalable parallel applications, a number of programming decisions have to be made during the development of the program. Performance tools that help in making these decisions are few, if they exist at all. Traditionally, performance tools have focused on exposing performance bottlenecks of small-scale executions of the program. However, it is common knowledge that programs that perform exceptionally well on small processor configurations, more often than not, perform poorly when executed on larger processor configurations. Hence, new tools that predict the execution characteristics of scaled-up programs are an essential part of an application developer's toolkit. In this paper we discuss important issues that need to be considered in order to build useful scalability analysis tools for parallel programs. We introduce a simple tool that automatically extracts scalability characteristics of a class of deterministic parallel programs. We show, with the help of a number of results on the Intel iPSC/860, that predictions are within reasonable bounds.

Sarukkai, Sekhar R.; Mehra, Pankaj; Tucker, Deanne (Technical Monitor)

1994-01-01

227

BIOMEDICAL IMAGE ANALYSIS USING WAVELET TOOLS FOR EMERGENCY MEDICAL APPLICATIONS  

Microsoft Academic Search

In this paper, the analysis of 2D signals, especially emergency biomedical images, using the wavelet tools of MATLAB is presented for medical applications. In terms of 2D signal analysis, an image is taken and corrupted with four different types of noise (salt-and-pepper, speckle, Gaussian and Poisson). After that all of the noisy images are de-noised
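
The paper works in MATLAB's wavelet toolbox; as a loose analogue, the sketch below assumes the PyWavelets (pywt) package and shows one common wavelet de-noising recipe (soft thresholding of detail coefficients with a universal threshold). The wavelet choice, threshold rule, and synthetic image are illustrative assumptions, not the paper's procedure.

    import numpy as np
    import pywt  # PyWavelets, assumed available

    def denoise_image(img, wavelet="db4", level=2):
        """Soft-threshold wavelet de-noising of a 2D image (illustrative only)."""
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        # Noise level estimated from the finest diagonal detail coefficients
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        thresh = sigma * np.sqrt(2 * np.log(img.size))     # universal threshold
        new_coeffs = [coeffs[0]] + [
            tuple(pywt.threshold(c, thresh, mode="soft") for c in detail)
            for detail in coeffs[1:]
        ]
        return pywt.waverec2(new_coeffs, wavelet)

    # Example: corrupt a synthetic image with Gaussian noise, then de-noise it
    clean = np.zeros((64, 64))
    clean[16:48, 16:48] = 1.0
    noisy = clean + 0.2 * np.random.default_rng(1).standard_normal(clean.shape)
    restored = denoise_image(noisy)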

Fazlul Haque; Hanif Ali; M Adnan Kiber

2010-01-01

228

Soaplab - a unified Sesame door to analysis tools  

Microsoft Academic Search

Soaplab is a set of Web Services providing programmatic access to many applications on remote computers. Because such applications in the scientific environment usually analyze data, Soaplab is often referred to as an Analysis Web Service. It uses a unified (and partly standardized) API to find an analysis tool, discover what data it requires and what data it produces, to

Martin Senger; Peter Rice; Tom Oinn

2003-01-01

229

Protocol Analysis of a Federated Search Tool: Designing for Users  

Microsoft Academic Search

Librarians at Springfield College conducted usability testing of Endeavor's federated search tool, ENCompass for Resource Access. The purpose of the testing was to make informed decisions prior to customizing the look and function of the software's interface in order to make the product more usable for their patrons. Protocol, or think-aloud, analysis was selected as a testing and analysis method.

Emily Alling; Rachael Naismith

2007-01-01

230

GEOGRAPHIC ANALYSIS TOOL FOR HEALTH AND ENVIRONMENTAL RESEARCH (GATHER)  

EPA Science Inventory

GATHER, Geographic Analysis Tool for Health and Environmental Research, is an online spatial data access system that provides members of the public health community and general public access to spatial data that is pertinent to the analysis and exploration of public health issues...

231

Analysis of Event Synchronization in A Parallel Programming Tool  

Microsoft Academic Search

Understanding synchronization is important for a parallel programming tool that uses dependence analysis as the basis for advising programmers on the correctness of parallel constructs. This paper discusses static analysis methods that can be applied to parallel programs with event variable synchronization. The objective is to be able to predict potential data races in a parallel program. The focus is on how dependences and synchronization statements inside

David Callahan; Ken Kennedy; Jaspal Subhlok

1990-01-01

232

A Survey on Tools for Binary Code Analysis  

Microsoft Academic Search

Different strategies for binary analysis are widely used in systems dealing with software maintenance and system security. Binary code is self-contained; though it is easy to execute, it is not easy to read and understand. Binary analysis tools are useful in software maintenance because the binary of software has all the information necessary to recover the source code. It is

Shengying Li

233

Cost-Benefit Analysis: Tools for Decision Making.  

ERIC Educational Resources Information Center

Suggests that cost-benefit analysis can be a helpful tool for assessing difficult and complex problems in child care facilities. Defines cost-benefit analysis as an approach to determine the most economical way to manage a program, describes how to analyze costs and benefits through hypothetical scenarios, and discusses some of the problems…

Bess, Gary

2002-01-01

234

Applying engineering feedback analysis tools to climate dynamics  

Microsoft Academic Search

The application of feedback analysis tools from engineering control theory to problems in climate dynamics is discussed through two examples. First, the feedback coupling between the thermohaline circulation and wind-driven circulation in the North Atlantic Ocean is analyzed with a relatively simple model, in order to better understand the coupled system dynamics. The simulation behavior is compared with analysis using

Douglas G. MacMynowski; Eli Tziperman

2008-01-01

235

Diamond-turning tool setting by interferogram analysis  

SciTech Connect

A method was developed to establish a numerically controlled tool path with respect to the work spindle centerline. Particularly adapted to the diamond turning of optics, this method is based upon interferogram analysis and is applicable to the establishment of the work spindle centerline relative to the tool path for any center-turned optic having a well-defined vertex radius of curvature. The application reported is for an f/2 concave spherical mirror.

Rasnick, W.H.; Yoder, R.C.

1980-10-22

236

Development of a climate data analysis tool (CDAT)  

SciTech Connect

The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

Marlais, S.M.

1997-09-01

237

Inverse problems and image deconvolution  

NASA Astrophysics Data System (ADS)

Starting from a physical transformation process described by a Fredholm integral equation of the first kind, we first recall the main difficulties that appear in linear inverse problems in both the continuous and the discrete case. We describe several situations corresponding to various properties of the kernel of the integral equation. The need to take into account properties of the solution not contained in the model is then highlighted. This leads to the regularization principles, for which the classical point of view as well as the Bayesian interpretation are briefly recalled. We then focus on the problem of deconvolution as applied to astronomical images. A complete model of image formation is described in Section 4, and a general method for deriving image restoration algorithms, the Split Gradient Method (SGM), is detailed in Section 5. We show in Section 6 that when this method is applied to likelihood maximization problems with a positivity constraint, the ISRA algorithm is recovered in the case of pure additive Gaussian noise, while in the case of pure Poisson noise the well-known EM (Richardson-Lucy) algorithm is easily obtained. The method is then applied to the more realistic situation typical of CCD detectors, Poisson photo-conversion noise plus Gaussian readout noise, and to a new situation corresponding to data acquired with a Low Light Level CCD. Some numerical results are exhibited in Section 7 for these last two cases. Finally, we show how all these algorithms can be regularized in the context of the SGM, and we give a general conclusion.
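
Since the abstract notes that the SGM reduces to the Richardson-Lucy (EM) iteration for pure Poisson noise with a positivity constraint, a minimal numpy sketch of that classical iteration is given below; it is not the SGM framework itself, and the PSF handling, iteration count, and damping floor are assumptions.

    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
        """Classical Richardson-Lucy (EM) deconvolution for Poisson-noise data."""
        estimate = np.full(image.shape, image.mean(), dtype=float)
        psf_mirror = psf[::-1, ::-1]
        for _ in range(n_iter):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = image / np.maximum(blurred, eps)       # avoid division by zero
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate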

Lanteri, H.; Theys, C.

238

Analysis Tool Web Services from the EMBL-EBI  

PubMed Central

Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
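
A minimal sketch of the usual run/poll/retrieve pattern for these services is shown below, assuming the requests package; the endpoint path and form-field names are written from memory as assumptions and should be checked against the current EMBL-EBI Web Services documentation before use.

    import time
    import requests  # third-party HTTP client, assumed available

    # Base URL and field names are illustrative of the run/poll/retrieve
    # pattern; verify them against the EMBL-EBI documentation.
    BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

    def blast(email, sequence):
        job_id = requests.post(f"{BASE}/run", data={
            "email": email,              # a valid address is required
            "program": "blastp",
            "database": "uniprotkb",
            "stype": "protein",
            "sequence": sequence,
        }).text
        while requests.get(f"{BASE}/status/{job_id}").text == "RUNNING":
            time.sleep(5)                # poll until the job leaves RUNNING
        return requests.get(f"{BASE}/result/{job_id}/out").text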

McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

2013-01-01

239

Vulnerability assessment using two complementary analysis tools  

SciTech Connect

To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

Paulus, W.K.

1993-07-01

240

Deconvolution of Thomson scattering temperature profiles  

SciTech Connect

Deconvolution of Thomson scattering (TS) profiles is required when the gradient length of the electron temperature (T_e) or density (n_e) is comparable to the instrument function length (Δ_R). The most correct method of deconvolution to obtain the underlying T_e and n_e profiles is by consideration of the scattered signals. However, deconvolution at the scattered-signal level is complex since it requires knowledge of all spectral and absolute calibration data. In this paper a simple technique is presented where only knowledge of the instrument function I(r) and the measured profiles, T_e,observed(r) and n_e,observed(r), is required to obtain the underlying T_e(r) and n_e(r). This method is appropriate for most TS systems and is particularly important where high spatial sampling is obtained relative to Δ_R.
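
The paper's specific correction scheme is not reproduced here; as a generic illustration of the underlying idea, the sketch below builds a smearing matrix from an assumed Gaussian instrument function I(r) and recovers the underlying profile from the measured one by Tikhonov-regularized least squares. The instrument width, grid, and regularization weight are all assumptions.

    import numpy as np

    def deconvolve_profile(measured, r, instrument_fn, reg=1e-3):
        """Recover an underlying profile from the measured one, given the
        instrument function, via regularized least squares (illustrative)."""
        # Each row of K is the instrument function centred on one measurement
        # position, normalized to unit sum, so that measured ~= K @ underlying.
        K = np.array([instrument_fn(r - ri) for ri in r])
        K /= K.sum(axis=1, keepdims=True)
        A = K.T @ K + reg * np.eye(len(r))
        return np.linalg.solve(A, K.T @ measured)

    # Synthetic example: a steep profile smeared by a Gaussian instrument
    # function of assumed width 1.0 (same units as r).
    r = np.linspace(0.0, 10.0, 200)
    instr = lambda x: np.exp(-0.5 * (x / 1.0) ** 2)
    true_te = 1.0 + np.tanh(5.0 - r)
    K = np.array([instr(r - ri) for ri in r])
    K /= K.sum(axis=1, keepdims=True)
    measured_te = K @ true_te
    recovered_te = deconvolve_profile(measured_te, r, instr)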

Scannell, R.; Beurskens, M.; Carolan, P. G.; Kirk, A.; Walsh, M. [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxfordshire, OX14 3DB (United Kingdom)]; O'Gorman, T. [Department of Physics, University College Cork, Cork (Ireland)]; Osborne, T. H. [General Atomics, P.O. Box, San Diego, California 92186-5608 (United States)]

2011-05-15

241

Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions  

NASA Technical Reports Server (NTRS)

Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

2002-01-01

242

Blind-deconvolution optical-resolution photoacoustic microscopy in vivo.  

PubMed

Optical-resolution photoacoustic microscopy (OR-PAM) is becoming a vital tool for studying the microcirculation system in vivo. By increasing the numerical aperture of optical focusing, the lateral resolution of OR-PAM can be improved; however, the depth of focus and thus the imaging range will be sacrificed correspondingly. In this work, we report our development of blind-deconvolution optical-resolution photoacoustic microscopy (BD-PAM) that can provide a lateral resolution ~2-fold finer than that of conventional OR-PAM (3.04 vs. 5.78 μm), without physically increasing the system's numerical aperture. The improvement achieved with BD-PAM is demonstrated by imaging graphene nanoparticles and the microvasculature of mouse ears in vivo. Our results suggest that BD-PAM may become a valuable tool for many biomedical applications that require both fine spatial resolution and extended depth of focus. PMID:23546115

Chen, Jianhua; Lin, Riqiang; Wang, Huina; Meng, Jing; Zheng, Hairong; Song, Liang

2013-03-25

243

Color Deconvolution and Support Vector Machines  

NASA Astrophysics Data System (ADS)

Methods for machine learning (support vector machines) and image processing (color deconvolution) are combined in this paper for the purpose of separating colors in images of documents. After determining the background color, samples from the image that are representative of the colors to be separated are mapped to a feature space. Given the clusters of samples of either color the support vector machine (SVM) method is used to find an optimal separating line between the clusters in feature space. Deconvolution image processing parameters are determined from the separating line. A number of examples of applications in forensic casework are presented.
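
A minimal scikit-learn sketch of the central step, fitting a linear SVM to two clusters of colour samples in a two-dimensional feature space and reading off the separating line, is given below; the feature mapping and the sample points are synthetic placeholders rather than the forensic data used in the paper.

    import numpy as np
    from sklearn.svm import SVC  # scikit-learn, assumed available

    # Two clusters of colour samples already mapped to a 2D feature space
    # (the mapping itself is not shown; these points are synthetic).
    rng = np.random.default_rng(0)
    ink_a = rng.normal([0.2, 0.7], 0.05, size=(100, 2))
    ink_b = rng.normal([0.6, 0.3], 0.05, size=(100, 2))
    X = np.vstack([ink_a, ink_b])
    y = np.array([0] * 100 + [1] * 100)

    svm = SVC(kernel="linear").fit(X, y)
    w, b = svm.coef_[0], svm.intercept_[0]
    # Optimal separating line in feature space: w[0]*u + w[1]*v + b = 0;
    # its orientation would then drive the colour-deconvolution parameters.
    print("separating line:", w, b)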

Berger, Charles E. H.; Veenman, Cor J.

244

Werner deconvolution for variable altitude aeromagnetic data  

SciTech Connect

The standard Werner deconvolution method is extended to include the effects of variable sensor altitude but this leads to a deconvolution algorithm that is unstable for slowly changing flight height. By expressing the sensor altitude as a linear function of horizontal position (within a specified window), the authors show that the numerical instability can be avoided. The subsequent selection and averaging of the raw solutions is controlled by three parameters that can be adjusted to specific survey data characteristics. Results for an aeromagnetic survey over Vancouver Island, British Columbia show that, in comparison with the variable altitude approach, the standard Werner method produces unacceptable errors when applied to variable altitude data.

Ostrowski, J.S. (Horler Information Inc., Ottawa, Ontario (Canada)); Pilkington, M.; Teskey, D.J. (Geological Survey of Canada, Ottawa, Ontario (Canada))

1993-10-01

245

Software Construction and Analysis Tools for Future Space Missions  

NASA Technical Reports Server (NTRS)

NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

2002-01-01

246

Parallel Analysis Tools for Ultra-Large Climate Data Sets  

NASA Astrophysics Data System (ADS)

While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.

Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

2013-04-01

247

Statistical analysis: a tool for understanding monitoring data  

Microsoft Academic Search

Monitoring data often need deep analysis in order to understand the real behaviour of a structure or a soil mass which may be overshadowed by unknown phenomena or boundary conditions. Temperature is one of the most common parameters which affect field measurements and create problems in data interpretation and use. One of the most powerful tools which can be used

Domenico Bruzzi; Alessandro Fassò; Orietta Nicolis; Giorgio Pezzetti

248

An integrated analysis and simulation tool for avionics system development  

Microsoft Academic Search

The Loral Instrumentation System 500 is a complete avionics simulation and analysis development tool that addresses the spectrum of MIL STD 1553 avionics development needs, including: real-time monitoring; simulations; data capture, time stamping, and archival; multiple bus correlation; interactive data browsing; exhaustive postanalysis; and report generation. The expandable and portable System 500 is based on a UNIX/X Window System foundation.

J. R. LaPlante

1991-01-01

249

Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study.  

National Technical Information Service (NTIS)

An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and...

J. T. Malin; M. Flores

2013-01-01

250

An Automated Data Analysis Tool for Livestock Market Data  

ERIC Educational Resources Information Center

This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

Williams, Galen S.; Raper, Kellie Curry

2011-01-01

251

DSD-Crasher: a hybrid analysis tool for bug finding  

Microsoft Academic Search

DSD-Crasher is a bug finding tool that follows a three-step approach to program analysis: D. Capture the program's intended execution behavior with dynamic invariant detection. The derived invariants exclude many unwanted values from the program's input domain. S. Statically analyze the program within the restricted input domain to explore many paths. D. Automatically generate test cases that focus on veri-

Christoph Csallner; Yannis Smaragdakis

2006-01-01

252

Assessing Extremes Climatology Using NWS Local Climate Analysis Tool  

Microsoft Academic Search

The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast

M. M. Timofeyeva; A. Hollingshead; D. Hilderbrand; B. Mayes; T. Hartley; N. M. Kempf McGavock; E. Lau; E. A. Olenic; B. Motta; R. Bunge; L. E. Brown; F. Fritsch

2010-01-01

253

Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint  

SciTech Connect

This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

Christensen, C.; Horowitz, S.

2008-07-01

254

Chianti: a tool for change impact analysis of java programs  

Microsoft Academic Search

This paper reports on the design and implementation of Chianti, a change impact analysis tool for Java that is implemented in the context of the Eclipse environment. Chianti analyzes two versions of an application and decomposes their difference into a set of atomic changes. Change impact is then reported in terms of affected (regression or unit) tests whose execution

Xiaoxia Ren; Fenil Shah; Frank Tip; Barbara G. Ryder; Ophelia Chesley

2004-01-01

255

Strategic groups analysis (SGA) as a tool for strategic marketing  

Microsoft Academic Search

Argues in favour of the convenience of using strategic groups analysis (SGA) as a business management tool that is especially useful for strategic marketing planning. To illustrate the great versatility offered by SGA, we take as a reference the results obtained from a study of the Spanish retail grocery sector. By way of this empirical work, we analyse how SGA

Carlos Flavián; Yolanda Polo

1999-01-01

256

Separation analysis, a tool for analyzing multigrid algorithms  

NASA Technical Reports Server (NTRS)

The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier type analysis. The separation operator of a two level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose performance relaxations and inter-level transfers. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.

Costiner, Sorin; Taasan, Shlomo

1995-01-01

257

Discovery and New Frontiers Project Budget Analysis Tool  

NASA Technical Reports Server (NTRS)

The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses and compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.
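
As a toy illustration of the roll-up described above, the sketch below spreads each mission's fixed-year cost over calendar years, escalates it to real-year dollars with an assumed flat inflation rate, and sums across missions; the rate, base year, phasing, and mission costs are invented for illustration and have no relation to the actual D&NF budgets.

    # Toy roll-up: every number here is an assumption for illustration only.
    INFLATION = 0.025
    BASE_YEAR = 2011

    def real_year_profile(fixed_year_cost, share_by_year):
        """share_by_year maps calendar year -> fraction of total mission cost."""
        return {year: fixed_year_cost * frac * (1 + INFLATION) ** (year - BASE_YEAR)
                for year, frac in share_by_year.items()}

    mission_a = real_year_profile(450e6, {2012: 0.2, 2013: 0.5, 2014: 0.3})
    mission_b = real_year_profile(300e6, {2013: 0.4, 2014: 0.6})

    # Roll up the real-year costs across missions for comparison with budget
    program_total = {}
    for profile in (mission_a, mission_b):
        for year, cost in profile.items():
            program_total[year] = program_total.get(year, 0.0) + cost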

Newhouse, Marilyn E.

2011-01-01

258

Nonnegative deconvolution for time of arrival estimation  

Microsoft Academic Search

The interaural time difference (ITD) of arrival is a primary cue for acoustic sound source localization. Traditional estimation techniques for ITD based upon cross-correlation are related to maximum-likelihood estimation of a simple generative model. We generalize the time difference estimation into a deconvolution problem with nonnegativity constraints. The resulting nonnegative least squares optimization can be efficiently solved using a novel
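
A minimal sketch of the nonnegative-deconvolution formulation is shown below, assuming SciPy: the received signal is modelled as a shifted copy of a reference signal, a Toeplitz convolution matrix is built from the reference, and scipy.optimize.nnls recovers a nonnegative impulse train whose peak gives the delay. The synthetic signals and noise level are assumptions.

    import numpy as np
    from scipy.optimize import nnls
    from scipy.linalg import toeplitz

    rng = np.random.default_rng(0)
    ref = rng.standard_normal(64)                 # reference (direct-path) signal
    delay = 7
    received = np.concatenate([np.zeros(delay), ref[:-delay]])
    received += 0.05 * rng.standard_normal(received.size)

    # Columns of A are copies of the reference shifted by 0, 1, 2, ... samples,
    # so the generative model is received = A @ h with h a nonnegative impulse train.
    A = toeplitz(ref, np.r_[ref[0], np.zeros(ref.size - 1)])
    h, _ = nnls(A, received)
    print("estimated time of arrival (samples):", int(np.argmax(h)))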

Yuanqing Lin; Daniel D. Lee; Lawrence K. Saul

2004-01-01

259

Histogram deconvolution - An aid to automated classifiers  

NASA Technical Reports Server (NTRS)

It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

Lorre, J. J.

1983-01-01

260

Iterative blind deconvolution method and its applications  

Microsoft Academic Search

A simple iterative technique has been developed for blind deconvolution of two convolved functions. The method is described, and a number of results obtained from a computational implementation are presented. Some further possible applications are indicated. The convolution c(x) of two functions, f(x) and g(x), can be expressed mathematically by the integral equation
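
A crude sketch in the spirit of this alternating scheme is given below: each half-iteration performs a Wiener-like division in the Fourier domain and then re-imposes nonnegativity (and unit energy for the PSF) in the image domain. The regularization constant, iteration count, and initialization are assumptions, not the published algorithm's exact constraints.

    import numpy as np

    def blind_deconvolve(c, n_iter=50, alpha=1e-3):
        """Crude alternating estimation of f and g from c = f * g (illustrative)."""
        C = np.fft.fft2(c)
        rng = np.random.default_rng(0)
        f = np.abs(np.fft.ifft2(rng.random(c.shape)))      # arbitrary starting object
        g = np.ones_like(c, dtype=float) / c.size
        for _ in range(n_iter):
            # Wiener-like update of the PSF estimate g given the current f ...
            F = np.fft.fft2(f)
            g = np.abs(np.fft.ifft2(C * np.conj(F) / (np.abs(F) ** 2 + alpha)))
            g = np.clip(g, 0, None)
            g /= g.sum()                                   # image-domain constraints
            # ... then the symmetric update of the object estimate f given g
            G = np.fft.fft2(g)
            f = np.clip(np.abs(np.fft.ifft2(C * np.conj(G) / (np.abs(G) ** 2 + alpha))), 0, None)
        return f, g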

G. R. Ayers; J. C. Dainty

1988-01-01

261

Blind deconvolution through digital signal processing  

Microsoft Academic Search

This paper addresses the problem of deconvolving two signals when both are unknown. The authors call this problem blind deconvolution. The discussion develops two related solutions which can be applied through digital signal processing in certain practical cases. The case of reverberated and resonated sound forms the center of the development. The specific problem of restoring old acoustic recordings provides

T. M. Cannon; R. B. Ingebretsen

1975-01-01

262

Multichannel blind deconvolution of polarimetric imagery.  

PubMed

A maximum likelihood blind deconvolution algorithm is derived for incoherent polarimetric imagery using expectation maximization. In this approach, the unpolarized and fully polarized components of the scene are estimated along with the corresponding angles of polarization and channel point spread functions. The scene state of linear polarization is determined unambiguously using this parameterization. Results are demonstrated using laboratory data. PMID:18758542

Lemaster, Daniel A; Cain, Stephen C

2008-09-01

263

Mammographic image restoration using maximum entropy deconvolution  

Microsoft Academic Search

An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise

A. Jannetta; J. C. Jackson; C. J. Kotre; I. P. Birch; K. J. Robson; R. Padgett

2004-01-01

264

Solving a Deconvolution Problem in Photon Spectrometry  

Microsoft Academic Search

We solve numerically a deconvolution problem to extract the undisturbed spectrum from the measured distribution contaminated by the finite resolution of the measuring device. A problem of this kind emerges when one wants to infer the momentum distribution of the neutral pions by detecting the π0 decay photons using the photon spectrometer of the ALICE LHC experiment at CERN [1].

D. Aleksandrov; J. Alme; V. Basmanov; B. Batyunya; D. Blau; M. Bogolyubsky; V. Budilov; D. Budnikov; J. I. Buskenes; X. Cai; F. Chuman; A. Deloff; V. Demanov; O. Djuvsland; T. Dobrowolski; M. Faltys; D. Fehlker; S. Fil’chagin; A. Hiei; P. T. Hille; T. Horaguchi; M. Huang; R. Il’kaev; I. Ilkiv; M. Ippolitov; T. Iwasaki; A. Kazantsev; K. Karadzhev; Y. Kharlov; Y. Kucheryaev; P. Kurashvili; A. Kuryakin; D. T. Larsen; S. Lindal; L. Liu; G. Lovhoiden; K. Ma; A. Mamonov; V. Manko; Y. Mao; J. Mares; Y. Maruyama; H. Müller; K. Mizoguchi; S. Nazarenko; G. Nazarov; S. Nikolaev; S. Nikolaev; P. Nomokonov; J. Nystrand; A. Pavlinov; D. Peresunko; V. Petrov; K. Polak; B. Polichtchouk; T. Potcheptsov; V. Punin; H. Qvigstad; K. Redlich; D. Roehrich; S. Sadovsky; V. Senko; K. Shigaki; I. Sibiryak; T. Siemiarczuk; B. Skaali; K. Skjerdal; G. Shabratova; A. Soloviev; P. Stolpovsky; T. Sugitate; M. Sukhorukov; H. Torii; T. Tveter; K. Ullaland; J. Wikne; O. Vikhlyantsev; A. Vinogradov; Y. Vinogradov; A. Vodopyanov; R. Wan; Y. Wang; G. Wilk; D. Wang; C. Xu; Z. Yin; V. Yanovsky; X. Yuan; S. Zaporozhets; A. Zenin; X. Zhang; D. Zhou

2010-01-01

265

Super-exponential methods for blind deconvolution  

Microsoft Academic Search

A class of iterative methods for solving the blind deconvolution problem, i.e. for recovering the input of an unknown possibly nonminimum-phase linear system by observation of its output, is presented. These methods are universal: they do not require prior knowledge of the input distribution, are computationally efficient and statistically stable, and converge to the desired solution regardless of initialization at a

Ofir Shalvi; Ehud Weinstein

1993-01-01

266

Fast deconvolution of multichannel systems using regularization  

Microsoft Academic Search

A very fast deconvolution method, which is based on the fast Fourier transform (FFT), can be used to control the outputs from a multichannel plant comprising any number of control sources and error sensors. The result is a matrix of causal finite impulse response filters whose performance is optimized at a large number of discrete frequencies. The paper is particularly
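
A single-channel sketch of the frequency-domain idea is given below: invert the response bin by bin as conj(H)/(|H|^2 + beta) with a modelling delay for causality; the multichannel case replaces the scalar division by a regularized matrix inverse per frequency bin. The FFT length, delay, and beta value are assumptions.

    import numpy as np

    def regularized_inverse_filter(h, n_fft=1024, beta=0.01, delay=None):
        """FFT-based regularized inverse of an impulse response h (single channel).

        Returns a causal FIR filter x such that h convolved with x approximates
        a delayed impulse; beta trades inversion accuracy against filter effort.
        """
        if delay is None:
            delay = n_fft // 2                     # modelling delay for causality
        H = np.fft.rfft(h, n_fft)
        k = np.arange(H.size)
        target = np.exp(-2j * np.pi * k * delay / n_fft)
        X = np.conj(H) * target / (np.abs(H) ** 2 + beta)
        return np.fft.irfft(X, n_fft)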

Ole Kirkeby; Philip A. Nelson; Hareo Hamada; Felipe Orduna-Bustamante

1998-01-01

267

Resolution Enhancement by Fourier Self-Deconvolution.  

National Technical Information Service (NTIS)

The use of the Fourier self-deconvolution (FSD) method in a computer program operating with standard BOMEM interferogram files is described. Line broadening mechanisms are discussed as well as broadening effects on the measured positions of close-lying sp...

O. Appelblad

1984-01-01

268

Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study  

NASA Technical Reports Server (NTRS)

An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

Flores, Melissa; Malin, Jane T.

2013-01-01

269

Mass spectrometry tools for analysis of intermolecular interactions.  

PubMed

The small quantities of protein required for mass spectrometry (MS) make it a powerful tool to detect binding (protein-protein, protein-small molecule, etc.) of proteins that are difficult to express in large quantities, as is the case for many intrinsically disordered proteins. Chemical cross-linking, proteolysis, and MS analysis, combined, are a powerful tool for the identification of binding domains. Here, we present a traditional approach to determine protein-protein interaction binding sites using heavy water ((18)O) as a label. This technique is relatively inexpensive and can be performed on any mass spectrometer without specialized software. PMID:22821539

Auclair, Jared R; Somasundaran, Mohan; Green, Karin M; Evans, James E; Schiffer, Celia A; Ringe, Dagmar; Petsko, Gregory A; Agar, Jeffrey N

2012-01-01

270

A dataflow analysis tool for parallel processing of algorithms  

NASA Technical Reports Server (NTRS)

A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.

Jones, Robert L., III

1993-01-01

271

Analysis tools for non-radially pulsating objects  

NASA Astrophysics Data System (ADS)

At the University of Canterbury we have been developing a set of tools for the analysis of spectra of varying types of non-radially pulsating objects. This set currently includes: calculation of the moments, calculation of the phase across the profile, as well as basic binary profile fitting for determination of orbital characteristics and projected rotational velocity (v sin i) measurement. Recently the ability to calculate cross-correlation profiles using either specified or synthesized line lists has been added, all implemented in MATLAB. A number of observations of γ Doradus candidates are currently being used to test these tools. For information on our observing facilities see Pollard et al. (2007).

Wright, D. J.; Pollard, K. R.; Cottrell, P. L.

2007-06-01

272

Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)  

NASA Technical Reports Server (NTRS)

The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

2005-01-01

273

Tool Support for Parametric Analysis of Large Software Simulation Systems  

NASA Technical Reports Server (NTRS)

The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

2008-01-01

274

JADS Analysis Toolbox. A Tool for Analysis of Distributed Simulations.  

National Technical Information Service (NTIS)

Science Applications International Corporation (SAIC) has developed a software product for the Joint Advanced Distributed Simulation (JADS) called the JADS Analysis Toolbox that has been very helpful in the troubleshooting, analysis, and visualization of ...

D. G. Gonzalez; J. Black

1999-01-01

275

GPR antipersonnel mine detection: improved deconvolution and time-frequency feature extraction  

NASA Astrophysics Data System (ADS)

This work deals with the processing of GPR (ground penetrating radar) signals for AP (anti-personnel) mine detection. It focuses on two steps in this processing, namely the deconvolution of the system impulse response, and the extraction of target features for classification. The objective of the work is to find discriminant and robust target features by means of time-frequency analysis. Deconvolution is an ill-posed inverse problem, which can be solved with regularization methods. In this paper a deconvolution algorithm, based on the iterative v-method, is proposed. For discriminant feature selection the Wigner distribution (WD) is considered. Singular value decomposition (SVD) along with the concept of the center of mass as the most robust feature are used for feature extraction from the WD. The proposed normalized time-frequency-energetic features have a good discriminant power, which doesn't degrade with increasing object depth.
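
As a rough sketch of the feature-extraction step, the code below uses a spectrogram as a stand-in for the Wigner distribution, takes the SVD of the time-frequency matrix, and computes the normalized center of mass of the rank-one energy distribution; the window length and normalization are assumptions, and the paper's actual WD-based processing is not reproduced.

    import numpy as np
    from scipy.signal import spectrogram  # stand-in for the Wigner distribution

    def tf_center_of_mass(trace, fs):
        """Normalized time-frequency center of mass of the dominant SVD component."""
        f, t, S = spectrogram(trace, fs=fs, nperseg=64)
        U, s, Vt = np.linalg.svd(S, full_matrices=False)
        # Rank-one approximation of the time-frequency energy distribution
        E = s[0] * np.outer(np.abs(U[:, 0]), np.abs(Vt[0]))
        E /= E.sum()
        t_cm = (E.sum(axis=0) * t).sum() / t[-1]   # normalized time coordinate
        f_cm = (E.sum(axis=1) * f).sum() / f[-1]   # normalized frequency coordinate
        return t_cm, f_cm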

Savelyev, Timofei G.; van Kempen, Luc; Sahli, Hichem

2003-08-01

276

Tool for cost-effectiveness analysis in teleradiology  

NASA Astrophysics Data System (ADS)

Although several studies have demonstrated the feasibility of teleradiology, its cost-effectiveness has not yet been proven, but this is a prerequisite for wide acceptance and routine use in the medical environment. Thus, our aim is to develop a tool for cost-effectiveness analysis of radiological remote expert consultation and other teleradiological applications. The overall objective of the cost-effectiveness analysis tool is to provide the user with a detailed and comprehensive overview of the costs and benefits associated with the teleradiological application under consideration. A recommendation on the cost-effectiveness of a particular application should be given, and, if possible, an identification of superior alternative(s). The tool comprises several modules for data input, cost analysis, analysis of benefits, cost-effectiveness evaluation, and sensitivity analysis. It is applied in the EC project TELEMED (R1086) for the economic assessment of the pan-European teleradiological application pilot 'Remote Expert Consultation'. Simulating costs and benefits of these and other teleradiology applications yields essential information for future routine use of teleradiology.

Gerneth, Marlene; Bartsch, Frank-Reinhard; Schosser, Rudolf

1993-09-01

277

Rosetta CONSERT operations and data analysis preparation: simulation software tools.  

NASA Astrophysics Data System (ADS)

The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae Lander. The accurate analysis of travel time measurements will deliver unique knowledge of the nucleus interior dielectric properties. The challenging complexity of CONSERT operations requirements, combining both Rosetta and Philae, allows only a small set of opportunities to acquire data. Thus, we need a fine analysis of the impact of Rosetta trajectory, Philae position and comet shape on CONSERT measurements, in order to take optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return of the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation, taking into account the signal polarization. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation. This allows computation on large domains relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.

Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

2014-05-01

278

Phenostat: visualization and statistical tool for analysis of phenotyping data.  

PubMed

The effective extraction of information from multidimensional data sets derived from phenotyping experiments is a growing challenge in biology. Data visualization tools are important resources that can aid in exploratory data analysis of complex data sets. Phenotyping experiments of model organisms produce data sets in which a large number of phenotypic measures are collected for each individual in a group. A critical initial step in the analysis of such multidimensional data sets is the exploratory analysis of data distribution and correlation. To facilitate the rapid visualization and exploratory analysis of multidimensional complex trait data, we have developed a user-friendly, web-based software tool called Phenostat. Phenostat is composed of a dynamic graphical environment that allows the user to inspect the distribution of multiple variables in a data set simultaneously. Individuals can be selected by directly clicking on the graphs and thus displaying their identity, highlighting corresponding values in all graphs, allowing their inclusion or exclusion from the analysis. Statistical analysis is provided by R package functions. Phenostat is particularly suited for rapid distribution and correlation analysis of subsets of data. An analysis of behavioral and physiologic data stemming from a large mouse phenotyping experiment using Phenostat reveals previously unsuspected correlations. Phenostat is freely available to academic institutions and nonprofit organizations and can be used from our website at: (http://www.bioinfo.embl.it/phenostat/). PMID:17674099

Reuveni, Eli; Carola, Valeria; Banchaabouchi, Mumna Al; Rosenthal, Nadia; Hancock, John M; Gross, Cornelius

2007-09-01

279

A survey of visualization tools for biological network analysis  

PubMed Central

The analysis and interpretation of relationships between biological molecules, networks and concepts is becoming a major bottleneck in systems biology. Very often the sheer amount of data and its heterogeneity provides a challenge for the visualization of the data. There is a wide variety of graph representations available, which most often map the data onto 2D graphs to visualize biological interactions. These methods are applicable to a wide range of problems; nevertheless, many of them reach a limit in terms of user friendliness when thousands of nodes and connections have to be analyzed and visualized. In this study we review visualization tools that are currently available for the visualization of biological networks, most of them developed in recent years. We comment on the functionality, the limitations and the specific strengths of these tools, and how these tools could be further developed in the direction of data integration and information sharing.

Pavlopoulos, Georgios A; Wegener, Anna-Lynn; Schneider, Reinhard

2008-01-01

280

Virtual tool mark generation for efficient striation analysis(.).  

PubMed

This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners. PMID:24502818

Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

2014-07-01

281

Industrial Geospatial Analysis Tool for Energy Evaluation (IGATE-E)  

SciTech Connect

IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries, by industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process step energy consumption, and major energy-intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimations and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

Alkadi, Nasr E [ORNL]; Starke, Michael R [ORNL]; Ma, Ookie [DOE EERE]; Nimbalkar, Sachin U [ORNL]; Cox, Daryl [ORNL]

2013-01-01

282

A conceptual design tool for RBCC engine performance analysis  

SciTech Connect

Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960s. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

Olds, J.R.; Saks, G. [Aerospace Systems Design Laboratory, School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332-0150 (United States)]

1997-01-01

283

Multi-Spacecraft Analysis with Generic Visualization Tools  

NASA Astrophysics Data System (ADS)

To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

2010-12-01

284

Auto-calibrating spherical deconvolution based on ODF sparsity.  

PubMed

Spherical deconvolution models the diffusion MRI signal as the convolution of a fiber orientation density function (fODF) with a single fiber response. We propose a novel calibration procedure that automatically determines this fiber response. This has three advantages: First, the user no longer needs to provide an estimate of the response. Second, we estimate a per-voxel fiber response, which is more adequate for the analysis of patient data with focal white matter degeneration. Third, parameters of the estimated response reflect diffusion properties of the white matter tissue, and can be used for quantitative analysis. Our method works by finding a tradeoff between a low fitting error and a sparse fODF. Results on simulated data demonstrate that auto-calibration successfully avoids erroneous fODF peaks that can occur with standard deconvolution, and that it resolves fiber crossings with better angular resolution than FORECAST, an alternative method. Parameter maps and tractography results corroborate applicability to clinical data. PMID:24505724
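The calibration procedure itself is specific to the paper, but the core idea of balancing data fidelity against fODF sparsity can be illustrated generically. The sketch below is a hypothetical example, not the authors' algorithm: non-negative deconvolution with an L1 penalty solved by projected proximal gradient, where the response matrix A, the problem sizes, and the regularization weight are all invented for illustration.

```python
import numpy as np

def sparse_nonneg_deconv(A, s, lam=0.1, n_iter=500):
    """Minimal sketch: recover a sparse, non-negative fODF-like vector f
    from a signal s = A @ f via proximal gradient (ISTA) with an L1 penalty.
    Illustrative only; not the paper's auto-calibration procedure."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant
    f = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ f - s)
        # gradient step, then prox of lam*||f||_1 restricted to f >= 0
        f = np.maximum(f - step * (grad + lam), 0.0)
    return f

# toy usage: two "fibers" out of 64 candidate orientations
rng = np.random.default_rng(0)
A = rng.normal(size=(60, 64))            # stands in for the response matrix
f_true = np.zeros(64); f_true[[10, 40]] = [1.0, 0.7]
s = A @ f_true + 0.01 * rng.normal(size=60)
f_hat = sparse_nonneg_deconv(A, s)
print(np.argsort(f_hat)[-2:])            # should point back at indices 10 and 40
```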

Schultz, Thomas; Groeschel, Samuel

2013-01-01

285

Applying AI tools to operational space environmental analysis  

NASA Technical Reports Server (NTRS)

The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

Krajnak, Mike; Jesse, Lisa; Mucks, John

1995-01-01

286

Space mission scenario development and performance analysis tool  

NASA Technical Reports Server (NTRS)

This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

2004-01-01

287

SMART (Shop floor Modeling, Analysis and Reporting Tool Project  

NASA Technical Reports Server (NTRS)

This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is found in the full documentation given to the NASA liaison. This documentation is also available on the A.R.I.S.E. Center web site, under a protected directory; only authorized users can gain access to this site.

Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

1999-01-01

288

MODELLING AND ANALYSIS TOOLS FOR INTERFEROMETRIC SAR OBSERVATIONS  

Microsoft Academic Search

A quantitative deformation monitoring using the differential interferometric SAR (DInSAR) technique may be achieved when multiple DInSAR observations and suitable modelling and analysis tools are employed. The paper begins with a description of the main characteristics of the DInSAR data. Then, it discusses a new modelling and filtering strategy, which takes advantage of the specific properties of the DInSAR observations.

M. Crosetto; B. Crippa; R. Barzaghi; M. Agudo

289

Application of signal processing tools for power quality analysis  

Microsoft Academic Search

This paper presents the application of signal processing tools for power quality analysis. Three signal processing techniques are considered: the discrete Fourier transform, wavelet filters and the discrete short-time Fourier transform. An adjustable speed drive with a six-pulse converter is designed using EMTP/ATP, and normal energizing of utility capacitors is presented. Finally, each kind of electrical disturbance

F. Jurado; Natividad Acero; Blas Ogayar

2002-01-01

290

Techniques and Tools for the Temporal Analysis of Retrieved Information  

Microsoft Academic Search

In this paper we present a set of visual interfaces to query newspaper databases with conditions on their contents, structure and temporal properties. Query results are presented in various interfaces designed to facilitate the reformulation of query conditions and the analysis of the temporal distribution of news. The group of techniques and tools described here has proven useful for the

Rafael Berlanga; Juan Pérez; María José Aramburu; Dolores Llidó

291

Comparison of gas chromatography–pulsed flame photometric detection–mass spectrometry, automated mass spectral deconvolution and identification system and gas chromatography–tandem mass spectrometry as tools for trace level detection and identification  

Microsoft Academic Search

The complexity of a matrix is in many cases the major limiting factor in the detection and identification of trace level analytes. In this work, the ability to detect and identify trace levels of pesticides in complex matrices was studied and compared for three relatively new methods: (a) GC–PFPD–MS where simultaneous PFPD (pulsed flame photometric detection) and MS analysis is

Shai Dagan

2000-01-01

292

Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool  

NASA Technical Reports Server (NTRS)

The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
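TSPT's own processing chain is not detailed here, but one standard temporal technique of the kind it describes (reducing cloud-induced noise by compositing daily observations) can be sketched simply. The example below is a generic maximum-value composite over fixed windows, assuming a daily NDVI array with NaN marking missing data; it is illustrative only, not TSPT code.

```python
import numpy as np

def max_value_composite(ndvi, window=16):
    """Maximum-value compositing: for each pixel, keep the largest NDVI seen
    in each `window`-day period, which suppresses cloud-depressed values.
    `ndvi` has shape (days, rows, cols); NaN marks missing observations."""
    days = ndvi.shape[0]
    composites = []
    for start in range(0, days, window):
        chunk = ndvi[start:start + window]
        composites.append(np.nanmax(chunk, axis=0))
    return np.stack(composites)            # shape: (n_windows, rows, cols)

# toy usage: 64 days of cloud-contaminated NDVI for a 2x2 pixel region
rng = np.random.default_rng(1)
clean = np.full((64, 2, 2), 0.6)
noisy = clean - rng.uniform(0, 0.5, clean.shape) * (rng.random(clean.shape) < 0.4)
print(max_value_composite(noisy).mean())   # close to 0.6 after compositing
```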

McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

2008-01-01

293

Aerospace Power Systems Design and Analysis (APSDA) Tool  

NASA Technical Reports Server (NTRS)

The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. user interface. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

Truong, Long V.

1998-01-01

294

Cultural Consensus Analysis as a Tool for Clinic Improvements  

PubMed Central

Some problems in clinic function recur because of unexpected value differences between patients, faculty, and residents. Cultural consensus analysis (CCA) is a method used by anthropologists to identify groups with shared values. After conducting an ethnographic study and using focus groups, we developed and validated a CCA tool for use in clinics. Using this instrument, we identified distinct groups with 6 important value differences between those groups. An analysis of these value differences suggested specific and pragmatic interventions to improve clinic functioning. The instrument has also performed well in preliminary tests at another clinic.

Smith, C Scott; Morris, Magdalena; Hill, William; Francovich, Chris; McMullin, Juliet; Chavez, Leo; Rhoads, Caroline

2004-01-01

295

Interactive Software Fault Analysis Tool for Operational Anomaly Resolution  

NASA Technical Reports Server (NTRS)

Resolving software operational anomalies frequently requires a significant amount of resources for software troubleshooting activities. The time required to identify a root cause of the anomaly in the software may lead to significant timeline impacts and in some cases, may extend to compromise of mission and safety objectives. An integrated tool that supports software fault analysis based on the observed operational effects of an anomaly could significantly reduce the time required to resolve operational anomalies; increase confidence for the proposed solution; identify software paths to be re-verified during regression testing; and, as a secondary product of the analysis, identify safety critical software paths.

Chen, Ken

2002-01-01

296

ISAC: A tool for aeroservoelastic modeling and analysis  

NASA Technical Reports Server (NTRS)

The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

Adams, William M., Jr.; Hoadley, Sherwood Tiffany

1993-01-01

297

DSD-Crasher: A hybrid analysis tool for bug finding  

Microsoft Academic Search

ABSTRACT DSD-Crasher is a bug finding tool that follows a three-step approach to program analysis: D. Capture the program's intended execution behavior with dynamic invariant detection. The derived invariants exclude many unwanted values from the program's input domain. S. Statically analyze the program within the restricted input domain to explore many paths. D. Automatically generate test cases that focus on verifying the results of the static analysis. Thereby confirmed results are never false positives, as opposed to the high false positive rate inherent in conservative static

Christoph Csallner; Yannis Smaragdakis; Tao Xie

2008-01-01

298

IMPLEMENTING THE STANDARD SPECTRUM METHOD FOR ANALYSIS OF β-γ COINCIDENCE SPECTRA  

Microsoft Academic Search

The standard deconvolution analysis tool (SDAT) algorithms were developed and tested at the University of Texas at Austin. These algorithms utilize the standard spectrum technique for spectral analysis of β-γ coincidence spectra for nuclear explosion monitoring. Work has been conducted under this contract to implement these algorithms into a useable scientific software package with a graphical user interface. Improvements include

S. Biegalski; Adam E. Flory; Brian T. Schrom; James H. Ely; Derek A. Haas; Ted W. Bowyer; James C. Hayes

2011-01-01

299

Design and Application of the Exploration Maintainability Analysis Tool  

NASA Technical Reports Server (NTRS)

Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
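EMAT itself is a full probabilistic simulator; the toy sketch below only illustrates the basic Monte Carlo idea described above (sample random failures from component reliabilities and check them against a stocked spares inventory). The component values, spares counts, and mission duration are invented for illustration.

```python
import numpy as np

def sparing_monte_carlo(mtbf_hours, spares, mission_hours, n_runs=20000, seed=0):
    """Toy sketch in the spirit of a maintainability simulation (not EMAT):
    each component type fails as a Poisson process with rate 1/MTBF; a run
    'succeeds' if the stocked spares cover every failure of every type."""
    rng = np.random.default_rng(seed)
    rates = mission_hours / np.asarray(mtbf_hours, dtype=float)  # expected failures
    successes = 0
    for _ in range(n_runs):
        failures = rng.poisson(rates)              # failures per component type
        if np.all(failures <= np.asarray(spares)):
            successes += 1
    return successes / n_runs

# usage: three hypothetical component types with differing reliability and spares
p = sparing_monte_carlo(mtbf_hours=[5000, 20000, 1000],
                        spares=[2, 1, 5],
                        mission_hours=8760)
print(f"Estimated probability spares cover all failures: {p:.3f}")
```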

Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

2012-01-01

300

Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.  

EPA Science Inventory

The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

301

Test structure for SPM tip shape deconvolution  

NASA Astrophysics Data System (ADS)

A test structure for SPM cantilever tip shape deconvolution is described. It consists of a silicon mono-crystalline wafer and an array of silicon sharp tips on its surface. Different types of tip shapes are observed with this structure. Images of the cantilever tip before and after contact-mode scanning are compared. Experimental studies of the developed test structure containing an array of sharp tips indicate that it allows three-dimensional images of the scanning tip to be obtained.

Bykov, V.; Gologanov, A.; Shevyakov, V.

302

Deconvolution of diode-laser spectra  

NASA Technical Reports Server (NTRS)

A new technique has been developed for deconvolving diode-laser spectra. This technique treats Doppler broadening, collisional broadening, and instrumental effects simultaneously. This technique is superior to previous deconvolution methods in the recovery of line-strength and transition-frequency information. A section of the ethane spectrum near 12 microns is used as an example. This new approach applies to any spectroscopy in which the instrumental resolution is narrower than actual linewidths.

Halsey, G. W.; Jennings, D. E.; Blass, W. E.

1985-01-01

303

Blind Poissonian images deconvolution with framelet regularization.  

PubMed

We propose a maximum a posteriori blind Poissonian images deconvolution approach with framelet regularization for the image and total variation (TV) regularization for the point spread function. Compared with the TV based methods, our algorithm not only suppresses noise effectively but also recovers edges and detailed information. Moreover, the split Bregman method is exploited to solve the resulting minimization problem. Comparative results on both simulated and real images are reported. PMID:23455078
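The paper's blind, framelet-regularized algorithm is not reproduced here; purely as a point of reference, the sketch below shows the classical Richardson-Lucy iteration for Poisson-noise deconvolution with a known PSF, the kind of non-blind baseline such methods build on. The PSF and test image are arbitrary.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
    """Classical Richardson-Lucy deconvolution for Poisson noise with a
    *known* PSF (not the blind, framelet-regularized method of the paper)."""
    psf = psf / psf.sum()
    psf_flipped = psf[::-1, ::-1]
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, eps)
        estimate *= fftconvolve(ratio, psf_flipped, mode="same")
    return estimate

# toy usage: blur a point source with a Gaussian PSF, add Poisson noise, restore
x, y = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / 4.0)
truth = np.zeros((64, 64)); truth[32, 32] = 200.0
observed = np.random.default_rng(2).poisson(fftconvolve(truth, psf / psf.sum(), mode="same"))
restored = richardson_lucy(observed.astype(float), psf)
```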

Fang, Houzhang; Yan, Luxin; Liu, Hai; Chang, Yi

2013-02-15

304

Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)  

NASA Technical Reports Server (NTRS)

An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

2007-01-01

305

Integrated Modeling Tools for Thermal Analysis and Applications  

NASA Technical Reports Server (NTRS)

Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along their own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB-based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems), which integrates many aspects of structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature-varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model, and also makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, applications where the thermal modeling is "in the loop" with sensitivity analysis, optimization and optical performance, drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST), are presented.

Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

1999-01-01

306

Seismic analysis and learning tools for high school students  

NASA Astrophysics Data System (ADS)

Easy access and processing of geophysical data is still a challenge in modern geosciences. Many existing tools require a long learning time for a user to master them effectively, and moreover their PC interfaces need to be updated to modern GUI (Graphical User Interface) standards. Thus most of the existing tools are not applicable for educational purposes. New high-level programming languages like Java offer opportunities to develop user-friendly tools for geophysical data access and processing. We present our software tool for statistical analysis of earthquake catalogs named "Seismotectonics". It was developed as part of the SEIS/SCHOOL/NORWAY project and is used in the framework of electronic learning and as a practical tool for high school students. Having a user-friendly interface, it can be easily operated by students while providing the most important features available in professional software. Based on Java technology, it runs on most computers and operating systems. The application includes a geological map of Norway (Mosar, 2002) and a Fennoscandian earthquake catalog covering the period 1375-2001. The software provides features for histogram and fault-plane-solution plotting and space-time filtering of a catalog or its subset; in addition, extra features useful for educational purposes, such as earthquake animation, are available. We present two studies based on this package: i) catalog searches for explosions and frosting events, which have remarkable time-distribution patterns, and ii) isolating large historical earthquakes whose magnitudes may be positively biased. In the latter case, the Luroy earthquake of 31 Aug. 1819, claimed to be the largest in NW Europe, was found by the high school students not to be the largest.
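One of the catalog searches mentioned above exploits the fact that man-made explosions cluster in local working hours while natural seismicity does not. A minimal illustration of that time-of-day screening follows; it is not the Seismotectonics code, and the time-zone offset and hour band are assumptions.

```python
import numpy as np

def daytime_fraction(origin_hours_utc, local_offset=1):
    """Fraction of events occurring in assumed local working hours (07-17).
    Quarry/mine blasts cluster strongly in this band, unlike earthquakes."""
    local = (np.asarray(origin_hours_utc) + local_offset) % 24
    working = (local >= 7) & (local < 17)
    return working.mean()

# toy usage: random "earthquakes" versus a mixture containing daytime-only "blasts"
rng = np.random.default_rng(3)
quakes = rng.uniform(0, 24, 300)           # uniform in time of day
blasts = rng.uniform(8, 15, 100)           # working hours only
print(daytime_fraction(quakes), daytime_fraction(np.concatenate([quakes, blasts])))
```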

Boulaenko, M. E.; Husebye, E. S.

2003-04-01

307

Coastal Online Analysis and Synthesis Tool 2.0 (COAST)  

NASA Technical Reports Server (NTRS)

The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages code samples shared by the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

2009-01-01

308

Colossal Tooling Design: 3D Simulation for Ergonomic Analysis  

NASA Technical Reports Server (NTRS)

The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

2003-01-01

309

A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra  

NASA Astrophysics Data System (ADS)

A quantitative evaluation of various deconvolution methods and their applications in processing plasma emitted spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, Richardson-Lucy's method, the maximum a posteriori method and Gold's method. The evaluation criteria include minimization of the sum of squared errors and the sum of squared relative error of parameters, and their rate of convergence. After comparing deconvolved results using these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when considering all the criteria above. The applications to the actual plasma spectra obtained from the EAST tokamak with these methods are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that the effects of instruments can be satisfactorily eliminated and clear spectra are recovered.
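For orientation, the sketch below shows one common form of Gold's ratio iteration for a 1-D spectrum with a known instrument response matrix. It is a simplified illustration, not the evaluation code used in the paper, and the response function and line positions are invented.

```python
import numpy as np

def gold_deconvolve(response, measured, n_iter=200, eps=1e-12):
    """One common form of Gold's ratio iteration for 1-D spectra:
    x <- x * measured / (R @ x), which preserves non-negativity.
    `response` is the (square) instrument response matrix."""
    x = np.full(response.shape[1], measured.mean(), dtype=float)
    for _ in range(n_iter):
        predicted = response @ x
        x *= measured / np.maximum(predicted, eps)
    return x

# toy usage: two narrow lines broadened by a Gaussian instrument function
n = 200
grid = np.arange(n)
response = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 3.0) ** 2)
truth = np.zeros(n); truth[[80, 95]] = [10.0, 6.0]
measured = response @ truth
recovered = gold_deconvolve(response, measured)
```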

Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

2013-06-01

310

SAGE: A tool for time-series analysis of Greenland  

NASA Astrophysics Data System (ADS)

The National Snow and Ice Data Center (NSIDC) has developed an operational tool for analysis. This production tool is known as "Services for the Analysis of the Greenland Environment" (SAGE). Using an integrated workspace approach, a researcher has the ability to find relevant data and perform various analysis functions on the data, as well as retrieve the data and analysis results. While there continues to be compelling observational evidence for increased surface melting and rapid thinning along the margins of the Greenland ice sheet, there are still uncertainties with respect to estimates of mass balance of Greenland's ice sheet as a whole. To better understand the dynamics of these issues, it is important for scientists to have access to a variety of datasets from multiple sources, and to be able to integrate and analyze the data. SAGE provides data from various sources, such as AMSR-E and AVHRR datasets, which can be analyzed individually through various time-series plots and aggregation functions; or they can be analyzed together with scatterplots or overlaid time-series plots to provide quick and useful results to support various research products. The application is available at http://nsidc.org/data/sage/. SAGE was built on top of NSIDC's existing Searchlight engine. The SAGE interface gives users access to much of NSIDC's relevant Greenland raster data holdings, as well as data from outside sources. Additionally, various web services provide access for other clients to utilize the functionality that the SAGE interface provides. Combined, these methods of accessing the tool allow scientists the ability to devote more of their time to their research, and less on trying to find and retrieve the data they need.

Duerr, R. E.; Gallaher, D. W.; Khalsa, S. S.; Lewis, S.

2011-12-01

311

POPBAM: Tools for Evolutionary Analysis of Short Read Sequence Alignments  

PubMed Central

Background While many bioinformatics tools currently exist for assembling and discovering variants from next-generation sequence data, there are very few tools available for performing evolutionary analyses from these data. Evolutionary and population genomics studies hold great promise for providing valuable insights into natural selection, the effect of mutations on phenotypes, and the origin of species. Thus, there is a need for an extensible and flexible computational tool that can fit into a growing number of evolutionary bioinformatics pipelines. Results This paper describes the POPBAM software, which is a comprehensive set of computational tools for evolutionary analysis of whole-genome alignments consisting of multiple individuals, from multiple populations or species. POPBAM works directly from BAM-formatted assembly files, calls variant sites, and calculates a variety of commonly used evolutionary sequence statistics. POPBAM is designed primarily to perform analyses in sliding windows across chromosomes or scaffolds. POPBAM accurately measures nucleotide diversity, population divergence, linkage disequilibrium, and the frequency spectrum of mutations from two or more populations. POPBAM can also produce phylogenetic trees of all samples in a BAM file. Finally, I demonstrate that the implementation of POPBAM is both fast and memory-efficient, and also can feasibly scale to the analysis of large BAM files with many individuals and populations. Software: The POPBAM program is written in C/C++ and is available from http://dgarriga.github.io/POPBAM. The program has few dependencies and can be built on a variety of Linux platforms. The program is open-source and users are encouraged to participate in the development of this resource.
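POPBAM's implementation works directly on BAM files; purely to illustrate one of the statistics it reports, the sketch below computes nucleotide diversity (pi) in non-overlapping windows from per-site minor-allele counts. The input representation is invented for the example and does not correspond to POPBAM's data structures.

```python
def window_pi(site_counts, window_size, seq_length, sample_size):
    """Non-overlapping windows of nucleotide diversity (pi).
    `site_counts` maps 0-based position -> minor allele count at that site,
    for `sample_size` sampled chromosomes; monomorphic sites contribute 0."""
    n = sample_size
    pi = [0.0] * ((seq_length + window_size - 1) // window_size)
    for pos, j in site_counts.items():
        # unbiased per-site heterozygosity for a biallelic site
        pi[pos // window_size] += 2.0 * j * (n - j) / (n * (n - 1))
    return [p / window_size for p in pi]   # per-bp diversity in each window

# toy usage: 10 kb scaffold, 20 chromosomes sampled, a handful of SNPs
snps = {120: 5, 800: 1, 4100: 10, 4105: 2, 9050: 8}
print(window_pi(snps, window_size=1000, seq_length=10000, sample_size=20))
```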

Garrigan, Daniel

2013-01-01

312

Blind image deconvolution for symmetric blurs by polynomial factorization  

NASA Astrophysics Data System (ADS)

In image acquisition, the captured image is often the result of the object being convolved with a blur. Deconvolution is necessary to undo the effects of the blur. However, in reality we often know very little of its exact structure, and therefore we have to perform blind deconvolution. Most existing methods are computationally intensive. Here, we show that if the blur is symmetric, we have an efficient algorithm for deconvolution based on polynomial factorization in the z-domain.

Lam, Edmund Y.; Goodman, Joseph W.

1999-07-01

313

Development of unified plotting tools for GA transport analysis  

NASA Astrophysics Data System (ADS)

A collection of python classes for the TGYRO suite of codes (NEO, GYRO, TGYRO, TGLF) has been developed that provide both the expert user with conceptually simple access to all code output data, and the casual end user with simple command-line control of plotting. The user base for these transport analysis codes continues to grow, raising the urgency of modernizing and unifying the plotting tools used for post-simulation analysis. Simultaneously, there is a push toward larger-scale fusion modeling underscoring the need for a revised, modernized approach to data management and analysis. The TGYRO suite is currently in use at all major fusion laboratories worldwide, and allows the user to make steady-state profile predictions for existing devices and future reactors, and simultaneously to carry out fundamental research on plasma transport (both collisional and turbulent).

Buuck, M.; Candy, J.

2011-11-01

314

PERISCOPE: An Online-Based Distributed Performance Analysis Tool  

NASA Astrophysics Data System (ADS)

This paper presents PERISCOPE - an online distributed performance analysis tool that searches for a wide range of performance bottlenecks in parallel applications. It consists of a set of agents that capture and analyze application and hardware-related properties in an autonomous fashion. The paper focuses on the Periscope design, the different search methodologies, and the steps involved to do an online performance analysis. A new graphical user-friendly interface based on Eclipse is introduced. Through the use of this new easy-to-use graphical interface, remote execution, selection of the type of analysis, and the inspection of the found properties can be performed in an intuitive and easy way. In addition, a real-world application, namely, the GENE code, a grand challenge problem of plasma physics is analyzed using Periscope. The results are illustrated in terms of found properties and scalability issues.

Benedict, Shajulin; Petkov, Ventsislav; Gerndt, Michael

315

Protocol analysis as a tool for behavior analysis  

PubMed Central

The study of thinking is made difficult by the fact that many of the relevant stimuli and responses are not apparent. Although the use of verbal reports has a long history in psychology, it is only recently that Ericsson and Simon's (1993) book on verbal reports explicated the conditions under which such reports may be reliable and valid. We review some studies in behavior analysis and cognitive psychology that have used talk-aloud reporting. We review particular methods for collecting reliable and valid verbal reports using the “talk-aloud” method as well as discuss alternatives to the talk-aloud procedure that are effective under different task conditions, such as the use of reports after completion of very rapid task performances. We specifically caution against the practice of asking subjects to reflect on the causes of their own behavior and the less frequently discussed problems associated with providing inappropriate social stimulation to participants during experimental sessions.

Austin, John; Delaney, Peter F.

1998-01-01

316

The Precision Formation Flying Integrated Analysis Tool (PFFIAT)  

NASA Technical Reports Server (NTRS)

Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

2004-01-01

317

TRIAC: A code for track measurements using image analysis tools  

NASA Astrophysics Data System (ADS)

A computer program named TRIAC, written in MATLAB, has been developed for track recognition and track parameter measurements from images of the Solid State Nuclear Track Detectors CR39. The program, using image analysis tools, counts the number of tracks for dosimetry purposes and classifies the tracks according to their radii for the spectrometry of alpha-particles. Comparisons of manual scanning counts with those output by the automatic system are presented for detectors exposed to a radon-rich environment. The system was also tested to differentiate tracks recorded by alpha-particles of different energies.

Patiris, D. L.; Blekas, K.; Ioannides, K. G.

2006-03-01

318

EEG analysis using wavelet-based information tools.  

PubMed

Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity. PMID:16675027
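As a rough illustration of the quantities described (relative wavelet energies and a normalized wavelet entropy), here is a minimal sketch using PyWavelets on a single EEG epoch. The wavelet choice, decomposition level, and test signal are arbitrary assumptions, not the authors' exact pipeline.

```python
import numpy as np
import pywt

def wavelet_energy_entropy(signal, wavelet="db4", level=5):
    """Relative wavelet energies per decomposition level and the normalized
    (Shannon) wavelet entropy of an EEG epoch; a minimal sketch of the
    quantities discussed, not the authors' exact processing chain."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()                      # relative wavelet energies
    entropy = -np.sum(p * np.log(p + 1e-15)) / np.log(len(p))
    return p, entropy

# toy usage: a noisy 10 Hz rhythm sampled at 256 Hz
t = np.arange(0, 4, 1 / 256)
epoch = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(4).normal(size=t.size)
rel_energy, wentropy = wavelet_energy_entropy(epoch)
print(rel_energy.round(3), round(wentropy, 3))
```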

Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

2006-06-15

319

Image and Data-analysis Tools For Paleoclimatic Reconstructions  

NASA Astrophysics Data System (ADS)

We propose here a directory of instruments and computer science resources chosen to address the problems that concern paleoclimatic reconstructions. In particular, the following points are discussed: 1) Numerical analysis of paleo-data (fossil abundances, species analyses, isotopic signals, chemical-physical parameters, biological data): a) statistical analyses (univariate, diversity, rarefaction, correlation, ANOVA, F and T tests, Chi^2); b) multidimensional analyses (principal components, correspondence, cluster analysis, seriation, discriminant, autocorrelation, spectral analysis); c) neural analyses (backpropagation net, Kohonen feature map, Hopfield net, genetic algorithms). 2) Graphical analysis (visualization tools) of paleo-data (quantitative and qualitative fossil abundances, species analyses, isotopic signals, chemical-physical parameters): a) 2-D data analyses (graph, histogram, ternary, survivorship); b) 3-D data analyses (direct volume rendering, isosurfaces, segmentation, surface reconstruction, surface simplification, generation of tetrahedral grids). 3) Quantitative and qualitative digital image analysis (macro- and microfossil image analysis, Scanning Electron Microscope and Optical Polarized Microscope image capture and analysis, morphometric data analysis, 3-D reconstructions): a) 2-D image analysis (correction of image defects, enhancement of image detail, converting texture and directionality to grey scale or colour differences, visual enhancement using pseudo-colour, pseudo-3D, thresholding of image features, binary image processing, measurements, stereological measurements, measuring features on a white background); b) 3-D image analysis (basic stereological procedures; two-dimensional structures: area fraction from the point count, volume fraction from the point count; three-dimensional structures: surface area and the line intercept count; three-dimensional microstructures: line length and the area point count; measurement using grids, measuring area with pixels, measurement parameters, shape and position, image processing to enable thresholding and measurement, image processing to extract measurable information, combining multiple images, photogrammetry measurement application).

Pozzi, M.

320

PVT Analysis with a Deconvolution Algorithm.  

National Technical Information Service (NTIS)

Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information a...

R. T. Kouzes

2011-01-01

321

Image analysis tools and emerging algorithms for expression proteomics  

PubMed Central

Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS.

English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

2012-01-01

322

System analysis tools for an ELT at ESO  

NASA Astrophysics Data System (ADS)

Engineering of complex, large scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a certain component of the telescope like the telescope structure necessitates a system approach to evaluate the structural effects on the optical performance. This paper shows several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: Using modal results of a finite element analysis the SMI-toolbox allows an easy generation of structural models with different sizes and levels of accuracy for the control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition to this, a sparse state space model object was developed for Matlab to gain in computational efficiency and reduced memory requirements due to the sparsity pattern of both the structural models and the control architecture. As one result these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects, system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.

Mueller, Michael; Koch, Franz

2006-06-01

323

General Mission Analysis Tool (GMAT) Architectural Specification. Draft  

NASA Technical Reports Server (NTRS)

Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

Hughes, Steven P.; Conway, Darrel, J.

2007-01-01

324

GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS  

SciTech Connect

We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
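Ganalyzer's measurement can be approximated conceptually: sample the intensity around circles of growing radius about the galaxy centre and track how the brightest angle shifts with radius. The sketch below is a simplified stand-in for that radial-intensity/spirality step, not Ganalyzer's code; the nearest-pixel sampling scheme and the slope fit are assumptions made for illustration.

```python
import numpy as np

def radial_intensity_plot(image, center, radii, n_theta=360):
    """Sample pixel intensity around circles of increasing radius about the
    galaxy centre (nearest-pixel sampling). Rows = radii, columns = angle.
    In a spiral galaxy the brightest angle shifts with radius; in an
    elliptical it stays roughly fixed."""
    cy, cx = center
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    plot = np.empty((len(radii), n_theta))
    for i, r in enumerate(radii):
        ys = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
        xs = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
        plot[i] = image[ys, xs]
    return plot

def peak_shift_per_radius(plot):
    """Slope (radians per radius step) of the brightest angle versus radius,
    used here as a crude proxy for spirality."""
    peaks = np.unwrap(np.argmax(plot, axis=1) * 2 * np.pi / plot.shape[1])
    return np.polyfit(np.arange(len(peaks)), peaks, 1)[0]
```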

Shamir, Lior, E-mail: lshamir@mtu.edu [Department of Computer Science, Lawrence Technological University, 21000 West Ten Mile Road, Southfield, MI 48075 (United States)

2011-08-01

325

Automated Web Gis Based Hydrograph Analysis Tool, WHAT  

NASA Astrophysics Data System (ADS)

The separation of the base flow component from a varying streamflow hydrograph is called "hydrograph analysis." In this study, two digital filter based separation modules, the BFLOW and Eckhardt filters, were incorporated into the Web based Hydrograph Analysis Tool (WHAT) system. A statistical component was also developed to provide fundamental information for flow frequency analysis and time series analysis. The Web Geographic Information System (GIS) version of the WHAT system accesses and uses U.S. Geological Survey (USGS) daily streamflow data from the USGS web server. The results from the Eckhardt filter method were compared with the results from the BFLOW filter method that was previously validated, since measured base flow data were not available for this study. Following validation, the two digital filter methods in the WHAT system were run for 50 Indiana gaging stations. The Nash-Sutcliffe coefficient values comparing the results of the two digital filter methods were over 0.91 for all 50 gaging stations, suggesting the filtered base flow using the Eckhardt filter method will typically match measured base flow. Manual separation of base flow from streamflow can lead to inconsistency in the results, while the WHAT system provides consistent results in less than a minute. Although base flow separation algorithms in the WHAT system cannot consider reservoir release and snowmelt that can affect stream hydrographs, the Web based WHAT system provides an efficient tool for hydrologic model calibration and validation. The base flow information from the WHAT system can also play an important role for sustainable ground water and surface water exploitation, including irrigation and industrial uses, and estimation of pollutant loading from both base flow and direct runoff. Thus, best management practices can be appropriately applied to reduce and intercept pollutant leaching if base flow contributes significant amounts of pollutants to the stream. This Web GIS based system also demonstrates how remote, distributed resources can be shared through the Internet using Web programming.
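The Eckhardt filter incorporated into the WHAT system has a standard recursive form, sketched below with parameter values often quoted for perennial streams (alpha = 0.98, BFImax = 0.80). The initialisation and the toy hydrograph are illustrative assumptions, not WHAT's exact configuration.

```python
import numpy as np

def eckhardt_baseflow(q, alpha=0.98, bfi_max=0.80):
    """Eckhardt recursive digital filter for base flow separation:
    b_t = ((1 - BFImax)*alpha*b_{t-1} + (1 - alpha)*BFImax*Q_t) / (1 - alpha*BFImax),
    constrained so base flow never exceeds total streamflow."""
    q = np.asarray(q, dtype=float)
    b = np.empty_like(q)
    b[0] = q[0]                               # simple initialisation (assumption)
    for t in range(1, len(q)):
        bt = ((1 - bfi_max) * alpha * b[t - 1] + (1 - alpha) * bfi_max * q[t]) \
             / (1 - alpha * bfi_max)
        b[t] = min(bt, q[t])
    return b

# toy usage: a recession interrupted by one storm peak (daily flows)
flow = [10, 9, 8, 40, 30, 20, 14, 11, 9, 8]
print(eckhardt_baseflow(flow).round(1))
```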

Lim, Kyoung Jae; Engel, Bernard A.; Tang, Zhenxu; Choi, Joongdae; Kim

2005-12-01

326

CRITICA: coding region identification tool invoking comparative analysis  

NASA Technical Reports Server (NTRS)

Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
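The dicodon (hexanucleotide) bias component can be illustrated with a toy scoring function: sum the log-odds of in-frame 6-mers under coding versus background frequency tables. The tables in the example are invented, and the overlapping-6-mer scoring scheme is an illustrative simplification; CRITICA estimates its own tables iteratively from the data.

```python
import math

def dicodon_log_odds(seq, coding_freq, background_freq, pseudo=1e-6):
    """Sum of log-odds scores for the in-frame hexanucleotides ('dicodons')
    of a candidate reading frame: positive totals favour 'coding'. The
    frequency tables are assumed to have been estimated elsewhere; this
    only illustrates the scoring step, not CRITICA's implementation."""
    score = 0.0
    for i in range(0, len(seq) - 5, 3):        # step by codons, read 6-mers
        hexamer = seq[i:i + 6]
        score += math.log((coding_freq.get(hexamer, 0.0) + pseudo)
                          / (background_freq.get(hexamer, 0.0) + pseudo))
    return score

# toy usage with made-up frequency tables
coding = {"ATGGCT": 0.002, "GCTAAA": 0.0015}
background = {"ATGGCT": 0.0005, "GCTAAA": 0.0005}
print(dicodon_log_odds("ATGGCTAAAGCT", coding, background))
```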

Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

1999-01-01

327

Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool  

SciTech Connect

Report describes process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

Unwin, Stephen D.; Seiple, Timothy E.

2009-05-28

328

Deconvoluting the influences of heat and plastic deformation on internal strains generated by friction stir processing  

NASA Astrophysics Data System (ADS)

Internal-strain profiles in friction-stir processed aluminum-alloy plates were investigated using neutron diffraction. Three different specimens were prepared with a purpose of separating the effects of frictional heating and severe plastic deformation on the internal-strain distribution: (Case 1) a plate processed with both stirring pin and tool shoulder, (Case 2) a plate processed only with the tool shoulder, and (Case 3) a plate processed only with the pin. The comparison between Cases 1 and 2 shows distinctly different strain profiles revealing deconvoluted effects of the different sources (i.e., heat, deformation, or the combination) on the internal strains generated during the friction-stir processing.

Woo, Wanchuck; Choo, Hahn; Brown, Donald W.; Bourke, Mark A. M.; Feng, Zhili; David, Stan A.; Hubbard, Camden R.; Liaw, Peter K.

2005-06-01

329

Blind Deconvolution of the SXT PSF Core Part  

NASA Astrophysics Data System (ADS)

The performance and speed of blind deconvolution algorithms for the restoration of SXT images depend on a good initial guess for the shape of the PSF. From the analysis of several compact flare kernels we came to the conclusion that a good guess for the PSF can be obtained directly from images of compact X-ray structures observed by SXT. Recently, we conducted extensive mission-long searches for compact structures through the entire database of SXT full-resolution frames. The searches returned many compact structures which may serve to construct an initial approximation of the PSF for the BID restoration method. We show a selection of the most compact structures found and their locations on the SXT CCD detector. Using observations of this selected set of structures, we construct constraints on the Al12 PSF shrouds and compare them with ground calibration data.

Gburek, S.; Sylwester, J.; Martens, P. C. H.

2002-01-01

330

An online database for plant image analysis software tools  

PubMed Central

Background Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is best suited for their research. Results We present an online, manually curated, database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software in a uniform and concise manner enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. Conclusions The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work.

2013-01-01

331

Pulse elongation and deconvolution filtering for medical ultrasonic imaging  

Microsoft Academic Search

Range sidelobe artifacts which are associated with pulse compression methods can be reduced with a new method composed of pulse elongation and deconvolution (PED). While pulse compression and PED yield similar signal-to-noise ratio (SNR) improvements, PED inherently minimizes the range sidelobe artifacts. The deconvolution is implemented as a stabilized inverse filter. With proper selection of the excitation waveform an exact
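The abstract describes the deconvolution step only as a "stabilized inverse filter". A minimal frequency-domain sketch of one common stabilization (a Wiener-style regularized inverse) is given below; the regularization constant, pulse model, and signal are assumptions for illustration, not the authors' actual filter design.

```python
import numpy as np

def stabilized_inverse_filter(received, pulse, eps=1e-2):
    """Deconvolve a long excitation pulse from the received echo signal.

    The plain inverse 1/H(f) amplifies noise where the pulse spectrum is small;
    adding a small constant eps (assumed value) to |H|^2 stabilizes it.
    """
    n = len(received)
    H = np.fft.rfft(pulse, n)                 # pulse spectrum
    Y = np.fft.rfft(received, n)              # echo spectrum
    G = np.conj(H) / (np.abs(H) ** 2 + eps)   # regularized inverse filter
    return np.fft.irfft(Y * G, n)

# Toy example: a reflector sequence convolved with an elongated pulse
rng = np.random.default_rng(0)
reflectors = np.zeros(256); reflectors[[60, 100, 103]] = [1.0, 0.6, 0.5]
pulse = np.hanning(31) * np.cos(2 * np.pi * 0.2 * np.arange(31))
echo = np.convolve(reflectors, pulse, mode="full")[:256]
echo += 0.01 * rng.standard_normal(256)
restored = stabilized_inverse_filter(echo, pulse)
```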

Bruno Haider; Peter A. Lewin; Kai E. Thomenius

1998-01-01

332

Mineral abundance determination: Quantitative deconvolution of thermal emission spectra  

Microsoft Academic Search

A linear retrieval (spectral deconvolution) algorithm is developed and applied to high-resolution laboratory infrared spectra of particulate mixtures and their end- members. The purpose is to place constraints on, and test the viability of, linear spectral deconvolution of high-resolution emission spectra. The effects of addition of noise, data reproducibility, particle size variation, an increasing number of minerals in the mixtures,
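A linear retrieval of this kind treats a mixture spectrum as a weighted sum of end-member spectra and solves for non-negative abundances. The sketch below uses a generic non-negative least-squares solver with a synthetic library standing in for laboratory end-member spectra; the normalization convention and band range are assumptions, not the published algorithm's exact choices.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_spectrum(mixture, endmembers):
    """Estimate end-member abundances from an emission spectrum.

    mixture    : (n_bands,) measured spectrum
    endmembers : (n_bands, n_minerals) library of end-member spectra
    Returns abundances normalized to sum to 1 and the residual norm.
    """
    abundances, residual_norm = nnls(endmembers, mixture)
    total = abundances.sum()
    return (abundances / total if total > 0 else abundances), residual_norm

# Synthetic check: a 60/30/10 mixture of three fake end-member spectra
bands = np.linspace(6, 14, 200)                       # wavelength in micrometres
library = np.column_stack([np.exp(-((bands - c) / 1.2) ** 2)
                           for c in (7.5, 9.3, 11.0)])
true_mix = library @ np.array([0.6, 0.3, 0.1])
est, resid = unmix_spectrum(true_mix + 0.005 * np.random.randn(200), library)
```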

Michael S. Ramsey; Philip R. Christensen

1998-01-01

333

Complex deconvolution in non-coherent radar system  

Microsoft Academic Search

This paper is concerned with the estimation of the target magnitude response given the non-coherent receiver measurements and the complex information of the transmitted signal. This is a signal reconstruction problem from partial information which involves phase retrieval as well as deconvolution. Nonlinear least squares and error reduction algorithm are formulated for solving this complex non-coherent deconvolution problem. Some preliminary

Kai-Bor Yu; M. K. Sistanizadeh

1986-01-01

334

Sparse deconvolution: Comparison of statistical and deterministic approaches  

Microsoft Academic Search

Sparse spike train deconvolution is a classical inverse problem which gave rise to many deterministic and stochastic algorithms since the mid-80’s. In the past decade, sparse approximation has been an intensive field of research, leading to the development of a number of algorithms including greedy strategies and convex relaxation methods. Spike train deconvolution can be seen as a specific sparse
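The convex-relaxation family of algorithms mentioned here typically minimizes a least-squares data term plus an L1 penalty. A minimal iterative soft-thresholding (ISTA) sketch for spike-train deconvolution is shown below; the wavelet, penalty weight, and iteration count are chosen purely for illustration and are not from the paper's comparison.

```python
import numpy as np

def ista_spike_deconvolution(y, wavelet, lam=0.1, n_iter=500):
    """Sparse spike-train deconvolution via iterative soft-thresholding.

    Minimizes 0.5*||y - h*x||^2 + lam*||x||_1, where * denotes convolution.
    """
    n = len(y)
    H = np.fft.rfft(wavelet, 2 * n)                  # convolution operator in frequency domain
    def conv(v): return np.fft.irfft(np.fft.rfft(v, 2 * n) * H, 2 * n)[:n]
    def corr(v): return np.fft.irfft(np.fft.rfft(v, 2 * n) * np.conj(H), 2 * n)[:n]
    L = np.max(np.abs(H)) ** 2                       # Lipschitz constant bound
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = corr(conv(x) - y)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

# Toy example: two spikes blurred by a Ricker-like wavelet plus noise
t = np.arange(-10, 11)
wavelet = (1 - 0.1 * t**2) * np.exp(-0.05 * t**2)
spikes = np.zeros(200); spikes[[50, 58]] = [1.0, -0.7]
y = np.convolve(spikes, wavelet)[:200] + 0.01 * np.random.randn(200)
x_hat = ista_spike_deconvolution(y, wavelet)
```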

Sebastien Bourguignon; Charles Soussen; Herve Carfantan; Jerome Idier

2011-01-01

335

Analysis of a public sector organizational unit using strategic and operational analysis tools  

Microsoft Academic Search

This paper reports on a project which analyzed the processes carried out by a unit within a public sector organization. The method used a combination of strategic and operational analysis tools. This combination proved to be complementary and effective in practice. This outcome of the study suggests that where a process analysis project has strategic considerations, as many do, then

Malcolm Brady

2008-01-01

336

CHRONOS's Paleontological-Stratigraphic Interval Construction and Analysis Tool (PSICAT)  

NASA Astrophysics Data System (ADS)

The Paleontological-Stratigraphic Interval Construction and Analysis Tool (PSICAT) is a Java-based graphical editing tool for creating and viewing stratigraphic column diagrams from drill cores and outcrops. It is customized to the task of working with stratigraphic columns and captures data digitally as you draw and edit the diagram. The data and diagrams are captured in open formats, and integration with the CHRONOS system (www.chronos.org) will allow the user to easily upload their data and diagrams into CHRONOS. Because the data and diagrams are stored in CHRONOS, they will be accessible to anyone, anywhere, at any time. PSICAT is designed with a modular, plug-in-based architecture that will allow it to support a wide variety of functionality, tasks, and geoscientific communities. PSICAT is currently being developed for use by the ANDRILL project (http://www.andrill.org) on their upcoming drilling expeditions in Antarctica, but a general community version will be also available. PSICAT will allow unprecedented communication between Antarctica-based scientists and shore-based scientists, potentially allowing shore-based scientists to interact in almost real time with on-ice operations and data collection.

Reed, J. A.; Cervato, C.; Fielding, C. R.; Fils, D.

2005-12-01

337

Vesta and HED Meteorites: Determining Minerals and their Abundances with Mid-IR Spectral Deconvolution I  

NASA Astrophysics Data System (ADS)

We identify the known mineral compositions and abundances of laboratory samples of Howardite, Eucrite and Diogenite (HED) meteorites (Salisbury et al. 1991, Icarus 9, 280-297) using an established spectral deconvolution algorithm (Ramsey, 1996 Ph.D. Dissertation, ASU; Ramsey and Christensen 1998, JGR 103, 577-596) for mid-infrared spectral libraries of mineral separates of varying grain sizes. Most notably, the spectral deconvolution algorithm fit the known plagioclase and pyroxene compositions for all of the HED meteorite spectra determined by laboratory analysis. Our results for the HED samples give us a high degree of confidence that our results are valid and that the spectral deconvolution algorithm is viable. Mineral compositions and abundances are also determined using the same technique for one possible HED parent body, Vesta, using mid-infrared spectra that were obtained from ground-based telescopes (Sprague et al. 1993, A.S.P. 41; Lim et al. 2005, Icarus 173, 385-408) and the Infrared Space Observatory (ISO) (Dotto et al. 2000, A&A 358, 1133-1141). Mid-infrared spectra of Vesta come from different areas on its surface. The ISO Vesta spectral deconvolution is suggestive of troilite, olivine, augite, chromite, wollastonite, and sodalite at one location. Modeling of other locations is underway. We were also successful in modeling spectra from locations on the Moon where no Apollo samples are available and for several locations on Mercury's surface using the same techniques (see lunar and mercurian abstracts this meeting). These results demonstrate promise for the spectral deconvolution method to correctly make mineral identifications on remotely observed objects, in particular main-belt asteroids, the Moon, and Mercury. This work was funded by NSF AST0406796.

Hanna, Kerri D.; Sprague, A. L.

2007-10-01

338

Tools for Energetic Particle Data Analysis Throughout the Solar System  

NASA Astrophysics Data System (ADS)

The current moment presents a unique opportunity for simultaneous observation of energetic particles throughout the solar system to investigate such questions as the site and mechanisms of suprathermal ion acceleration. Data from charged particle detectors of similar capabilities present on the MESSENGER, JUNO, Cassini, New Horizons, and Voyager spacecraft, much of it publicly available, can be leveraged to illuminate longstanding questions. We present a web-launchable set of tools (MIDL) which use a common semantic data model (HELIOLIB) to allow data discovery and analysis of in situ measurements throughout the solar system. Comparisons of spectral features and their evolution measured by similar detectors as the same event propagates outward through the solar system should allow exciting new science. We will show specific results using PEPSSI (New Horizons) cruise science data and provide pointers to other public data sets with MIDL access.

Brown, L. E.; Hill, M. E.; Vandegriff, J. D.

2012-12-01

339

Color infrared (CIR) photography: A tool for environmental analysis  

NASA Technical Reports Server (NTRS)

Research carried out under NASA auspices suggests that in the future remote sensors may play an important role in monitoring our environment. One medium, color infrared photography, appears to have immediate utility. Its capability to identify, measure the acreage of, and monitor the health of agricultural and woodland resources has been demonstrated, as has its capability to identify the sources and extent of certain types of water pollution. CIR is also beginning to demonstrate considerable potential as a tool for urban analysis. The great value of CIR is that it can provide these data quickly and inexpensively, and for that reason will be preferred to more complex multispectral systems by budget-conscious administrators.

Lindgren, D. T.

1971-01-01

340

Validation of tool mark analysis of cut costal cartilage.  

PubMed

This study was designed to establish the potential error rate associated with the generally accepted method of tool mark analysis of cut marks in costal cartilage. Three knives with different blade types were used to make experimental cut marks in costal cartilage of pigs. Each cut surface was cast, and each cast was examined by three analysts working independently. The presence of striations, regularity of striations, and presence of a primary and secondary striation pattern were recorded for each cast. The distance between each striation was measured. The results showed that striations were not consistently impressed on the cut surface by the blade's cutting edge. Also, blade type classification by the presence or absence of striations led to a 65% misclassification rate. Use of the classification tree and cross-validation methods and inclusion of the mean interstriation distance decreased the error rate to c. 50%. PMID:22081951

Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

2012-03-01

341

In silico tools for the analysis of antibiotic biosynthetic pathways.  

PubMed

Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data. PMID:24631213

Weber, Tilmann

2014-05-01

342

Input Range Testing for the General Mission Analysis Tool (GMAT)  

NASA Technical Reports Server (NTRS)

This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the Graphical User Interface and the script. The examples are illustrated from a scripting perspective because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.

Hughes, Steven P.

2007-01-01

343

Efficient analysis tool for coupled-SAW-resonator filters.  

PubMed

The advantages of the coupled-mode (COM) formalism and the transmission-matrix approach are combined in order to create exact and computationally efficient analysis and synthesis tools for the design of coupled surface acoustic wave resonator filters. The models for the filter components, in particular gratings, interdigital transducers (IDTs) and multistrip couplers (MSCs), are based on the COM approach that delivers closed-form expressions. To determine the pertinent COM parameters, the COM differential equations are solved and the solution is compared with analytically derived expressions from the transmission-matrix approach and the Green's function method. The most important second-order effects, such as energy storage, propagation loss, and mechanical and electrical loading, are fully taken into account. As an example, a two-pole, acoustically coupled resonator filter at 914.5 MHz on AT quartz is investigated. Excellent agreement between theory and measurement is found. PMID:18267581

Scholl, G; Christ, A; Ruile, W; Russer, P H; Weigel, R

1991-01-01

344

PARSESNP: a tool for the analysis of nucleotide polymorphisms  

PubMed Central

PARSESNP is a tool for the display and analysis of polymorphisms in genes. Using a reference DNA sequence, an exon/intron position model and a list of polymorphisms, it determines the effects of these polymorphisms on the expressed gene product, as well as the changes in restriction enzyme recognition sites. It shows the locations and effects of the polymorphisms in summary on a stylized graphic and in detail on a display of the protein sequence aligned with the DNA sequence. The addition of a homology model, in the form of an alignment of related protein sequences, allows for prediction of the severity of missense changes. PARSESNP is available on the World Wide Web at http://www.proweb.org/parsesnp/.

Taylor, Nicholas E.; Greene, Elizabeth A.

2003-01-01

345

System-of-Systems Technology-Portfolio-Analysis Tool  

NASA Technical Reports Server (NTRS)

Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

2012-01-01

346

Parallel performance wizard: A performance analysis tool for partitioned global-address-space programming  

Microsoft Academic Search

Given the complexity of parallel programs, developers often must rely on performance analysis tools to help them improve the performance of their code. While many tools support the analysis of message-passing programs, no tool exists that fully supports programs written in programming models that present a partitioned global address space (PGAS) to the programmer, such as UPC and SHMEM.

Hung-hsun Su; Max Billingsley; Alan D. George

2008-01-01

347

ADAP-GC 2.0: deconvolution of coeluting metabolites from GC/TOF-MS data for metabolomics studies.  

PubMed

ADAP-GC 2.0 has been developed to deconvolute coeluting metabolites that frequently exist in real biological samples of metabolomics studies. Deconvolution is based on a chromatographic model peak approach that combines five metrics of peak qualities for constructing/selecting model peak features. Prior to deconvolution, ADAP-GC 2.0 takes raw mass spectral data as input, extracts ion chromatograms for all the observed masses, and detects chromatographic peak features. After deconvolution, it aligns components across samples and exports the qualitative and quantitative information of all of the observed components. Centered on the deconvolution, the entire data analysis workflow is fully automated. ADAP-GC 2.0 has been tested using three different types of samples. The testing results demonstrate significant improvements of ADAP-GC 2.0, compared to the previous ADAP 1.0, to identify and quantify metabolites from gas chromatography/time-of-flight mass spectrometry (GC/TOF-MS) data in untargeted metabolomics studies. PMID:22747237

Ni, Yan; Qiu, Yunping; Jiang, Wenxin; Suttlemyre, Kyle; Su, Mingming; Zhang, Wenchao; Jia, Wei; Du, Xiuxia

2012-08-01

348

A novel sports ball aerodynamics analysis tool: soccer ball design  

Microsoft Academic Search

A novel soccer ball design tool involving the optimisation of the aerodynamics is developed in this paper. The design tool allows the flight of different balls with various input conditions to be compared and the behaviour of new ball designs to be assessed. The tool was developed by combining CFD results with information from wind tunnel tests, trajectory testing, and

Matt J. Carré; Sarah Barber

2010-01-01

349

A novel sports ball aerodynamics analysis tool: soccer ball design  

Microsoft Academic Search

A novel soccer ball design tool involving the optimisation of the aerodynamics is developed in this paper. The design tool allows the flight of different balls with various input conditions to be compared and the behaviour of new ball designs to be assessed. The tool was developed by combining CFD results with information from wind tunnel tests, trajectory testing, and

Matt J. Carré; Sarah Barber

2012-01-01

350

MetaFIND: A feature analysis tool for metabolomics data  

PubMed Central

Background Metabolomics, or metabonomics, refers to the quantitative analysis of all metabolites present within a biological sample and is generally carried out using NMR spectroscopy or Mass Spectrometry. Such analysis produces a set of peaks, or features, indicative of the metabolic composition of the sample and may be used as a basis for sample classification. Feature selection may be employed to improve classification accuracy or aid model explanation by establishing a subset of class discriminating features. Factors such as experimental noise, choice of technique and threshold selection may adversely affect the set of selected features retrieved. Furthermore, the high dimensionality and multi-collinearity inherent within metabolomics data may exacerbate discrepancies between the set of features retrieved and those required to provide a complete explanation of metabolite signatures. Given these issues, the latter in particular, we present the MetaFIND application for 'post-feature selection' correlation analysis of metabolomics data. Results In our evaluation we show how MetaFIND may be used to elucidate metabolite signatures from the set of features selected by diverse techniques over two metabolomics datasets. Importantly, we also show how MetaFIND may augment standard feature selection and aid the discovery of additional significant features, including those which represent novel class discriminating metabolites. MetaFIND also supports the discovery of higher level metabolite correlations. Conclusion Standard feature selection techniques may fail to capture the full set of relevant features in the case of high dimensional, multi-collinear metabolomics data. We show that the MetaFIND 'post-feature selection' analysis tool may aid metabolite signature elucidation, feature discovery and inference of metabolic correlations.

Bryan, Kenneth; Brennan, Lorraine; Cunningham, Padraig

2008-01-01

351

Data Analysis Tools Using JAVA/Internet Technology at Arnold Engineering Development Center.  

National Technical Information Service (NTIS)

AEDC is in the process of bringing Virtual Presence capabilities to its customers through Data Analysis Tools using the Java Programming Language and Internet Technologies. These technology tools are maturing at a time when U. S. dominance and market shar...

D. Pemberton

1999-01-01

352

Process Documentation and Execution: Introducing a Tool to Support Analysis of Alternatives.  

National Technical Information Service (NTIS)

PROBLEM STATEMENT: Develop software tools to support analytical data preparation processes: populate authoritative databases, prepare data for analysis, and post-process model output. What features would you like to see in these tools? Usability, reliabili...

T. A. Dufresne; R. L. Turner

2005-01-01

353

Generalized Analysis Tools for Multi-Spacecraft Missions  

NASA Astrophysics Data System (ADS)

Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data", published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly, Eds., 1998). On one hand, the least-squares approach has the advantage of applying to any number of spacecraft [1], but it is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but it appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. The weights given to the spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new framework, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI SR-001, 1998. [2] Chanteur, G.: Spatial Interpolation for Four Spacecraft: Theory, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 371-393, ISSI SR-001, 1998. [3] Chanteur, G.: Accuracy of field gradient estimations by Cluster: Explanation of its dependency upon elongation and planarity of the tetrahedron, pp. 265-268, ESA SP-449, 2000. [4] Vogt, J., Paschmann, G., and Chanteur, G.: Reciprocal Vectors, pp. 33-46, ISSI SR-008, 2008.
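As an illustration of the four-spacecraft barycentric formalism that this work generalizes, the sketch below computes reciprocal vectors of a tetrahedron and the associated linear gradient estimator in the spirit of reference [2]; the positions, weights of one, and field values are synthetic, and the generalized weighted case of the abstract is not implemented here.

```python
import numpy as np

def reciprocal_vectors(positions):
    """Reciprocal vectors k_a of a spacecraft tetrahedron.

    positions : (4, 3) array of spacecraft position vectors.
    k_a is normal to the face formed by the other three spacecraft and is
    scaled so that k_a . (r_a - r_b) = 1 for any spacecraft b in that face.
    """
    k = np.zeros((4, 3))
    for a in range(4):
        b, c, d = [i for i in range(4) if i != a]
        normal = np.cross(positions[c] - positions[b], positions[d] - positions[b])
        k[a] = normal / np.dot(positions[a] - positions[b], normal)
    return k

def linear_gradient(positions, values):
    """Linear gradient estimate: grad f ~= sum_a k_a * f_a."""
    k = reciprocal_vectors(positions)
    return (k * np.asarray(values)[:, None]).sum(axis=0)

# Synthetic check with a linear field f(r) = g . r + c
r = np.array([[0., 0., 0.], [100., 0., 0.], [0., 100., 0.], [0., 0., 100.]])
g_true = np.array([2.0e-3, -1.0e-3, 0.5e-3])
f = r @ g_true + 7.0
print(linear_gradient(r, f))   # recovers g_true to rounding error
```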

Chanteur, G. M.

2011-12-01

354

Micropollutants in urban watersheds : substance flow analysis as management tool  

NASA Astrophysics Data System (ADS)

Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, the pollutants emitted from traffic, like copper or PAHs, reach surface water during rain events, often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to develop flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we propose to test the application of SFA for a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for the benthic organisms. The major sources (total of 73%) of copper in receiving surface water are roofs and contact lines of trolleybuses. Thus technical solutions have to be found to manage this specific source of contamination. Application of the SFA approach to the four pharmaceuticals reveals that CSOs represent an important source of contamination: between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads from the city of Lausanne to the Lake are due to CSOs. These results will help in defining the best management strategy to limit Lake Geneva contamination. SFA is thus a promising tool for integrated urban water management.

Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

2009-04-01

355

Tool for Sizing Analysis of the Advanced Life Support System  

NASA Technical Reports Server (NTRS)

Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated dynamically or in steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.

Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

2005-01-01

356

Genomic Resources and Tools for Gene Function Analysis in Potato  

PubMed Central

Potato, a highly heterozygous tetraploid, is undergoing an exciting phase of genomics resource development. The potato research community has established extensive genomic resources, such as large expressed sequence tag (EST) data collections, microarrays and other expression profiling platforms, and large-insert genomic libraries. Moreover, potato will now benefit from a global potato physical mapping effort, which is serving as the underlying resource for a full potato genome sequencing project, now well underway. These tools and resources are having a major impact on potato breeding and genetics. The genome sequence will provide an invaluable comparative genomics resource for cross-referencing to the other Solanaceae, notably tomato, whose sequence is also being determined. Most importantly perhaps, a potato genome sequence will pave the way for the functional analysis of the large numbers of potato genes that await discovery. Potato, being easily transformable, is highly amenable to the investigation of gene function by biotechnological approaches. Recent advances in the development of Virus Induced Gene Silencing (VIGS) and related methods will facilitate rapid progress in the analysis of gene function in this important crop.

Bryan, Glenn J.; Hein, Ingo

2008-01-01

357

Elementary Mode Analysis: A Useful Metabolic Pathway Analysis Tool for Characterizing Cellular Metabolism  

PubMed Central

Elementary Mode Analysis is a useful Metabolic Pathway Analysis tool to identify the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose the intricate metabolic network comprised of highly interconnected reactions into uniquely organized pathways. These pathways consisting of a minimal set of enzymes that can support steady state operation of cellular metabolism represent independent cellular physiological states. Such pathway definition provides a rigorous basis to systematically characterize cellular phenotypes, metabolic network regulation, robustness, and fragility that facilitate understanding of cell physiology and implementation of metabolic engineering strategies. This mini-review aims to overview the development and application of elementary mode analysis as a metabolic pathway analysis tool in studying cell physiology and as a basis of metabolic engineering.

Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich

2010-01-01

358

Study of academic achievements using spatial analysis tools  

NASA Astrophysics Data System (ADS)

In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study was carried out with these students about their academic achievement with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These were students who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, which can be used as a tool to estimate the probability of success or failure for incoming students in the following academic years. Keywords: Academic achievement, spatial analyst, GIS, Bologna.

González, C.; Velilla, C.; Sánchez-Girón, V.

2012-04-01

359

Comparative view of in silico DNA sequencing analysis tools.  

PubMed

DNA sequencing is an important tool for discovery of genetic variants. The task of detecting single-nucleotide variants is complicated by noise and sequencing artifacts in sequencing data. Several in silico tools have been developed to assist this process. These tools interpret the raw chromatogram data and perform a specialized base-calling and quality-control assessment procedure to identify variants. The approach used to identify variants differs between the tools, with some specific to SNPs and others to indels. The choice of a tool is guided by the design of the sequencing project and the nature of the variant to be discovered. In this chapter, these tools are compared to facilitate the choice of a tool used for variant discovery. PMID:21779999

Tongsima, Sissades; Assawamakin, Anunchai; Piriyapongsa, Jittima; Shaw, Philip J

2011-01-01

360

Towards robust deconvolution of low-dose perfusion CT: sparse perfusion deconvolution using online dictionary learning.  

PubMed

Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameters estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance than existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422
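The paper's SPD method learns a dictionary from high-dose maps; its equations are not given in the abstract, so the sketch below shows only the standard deconvolution baseline such methods build on, truncated-SVD deconvolution of the tissue curve with the arterial input function. The truncation threshold, sampling interval, and synthetic curves are assumed values for illustration.

```python
import numpy as np

def svd_perfusion_deconvolution(tissue_curve, aif, dt=1.0, rel_threshold=0.2):
    """Baseline truncated-SVD deconvolution for CT perfusion.

    Solves c(t) = dt * (AIF convolved with k)(t) for the flow-scaled residue
    function k; CBF is proportional to max(k). rel_threshold (assumed) discards
    singular values below a fraction of the largest one to suppress noise.
    """
    n = len(aif)
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.zeros_like(s)
    mask = s > rel_threshold * s[0]
    s_inv[mask] = 1.0 / s[mask]
    k = Vt.T @ (s_inv * (U.T @ tissue_curve))
    return k, k.max()          # residue function and flow-scaled CBF estimate

# Toy example: gamma-variate AIF and an exponential residue function
t = np.arange(0, 60, 1.0)
aif = (t / 6.0) ** 3 * np.exp(-t / 2.0)
true_k = 0.6 * np.exp(-t / 4.0)
tissue = np.convolve(aif, true_k)[:len(t)]
k_est, cbf = svd_perfusion_deconvolution(tissue, aif)
```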

Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C

2013-05-01

361

Fast Hartley transform and truncated singular value algorithm for circular deconvolution  

Microsoft Academic Search

A fast Hartley transform (FHT) algorithm for solving well-conditioned circular deconvolution is suggested. The arithmetic operations are roughly halved compared with the fast Fourier transform deconvolution algorithm. The connection between the Moore-Penrose generalized inverse of the circulant matrix and FHT matrices is investigated, and then the least-squares solution for circular deconvolution is developed. An efficient, numerically stable circular deconvolution algorithm is suggested
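For a circulant system the spectral values of the kernel play the role of singular values, so a truncated least-squares solution amounts to zeroing small spectral coefficients. The sketch below illustrates this with the FFT rather than the FHT for brevity; the truncation tolerance and test signals are assumed values, not those of the paper.

```python
import numpy as np

def truncated_circular_deconvolution(y, h, rel_tol=1e-3):
    """Least-squares circular deconvolution with truncated spectral values.

    Solves y = h (circularly convolved with) x. The eigenvalues of the circulant
    matrix built from h are FFT(h); components where |FFT(h)| falls below rel_tol
    times the maximum are discarded (a truncated Moore-Penrose solution).
    """
    H = np.fft.fft(h)
    Y = np.fft.fft(y)
    keep = np.abs(H) > rel_tol * np.abs(H).max()
    X = np.where(keep, Y / np.where(keep, H, 1.0), 0.0)
    return np.fft.ifft(X).real

# Toy example: circularly blur a spike signal and recover it
x = np.zeros(64); x[[10, 30, 31]] = [1.0, 0.5, 0.8]
h = np.zeros(64); h[:5] = [0.1, 0.3, 0.4, 0.15, 0.05]
y = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real
x_rec = truncated_circular_deconvolution(y, h)
```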

Lizhi Cheng

1997-01-01

362

Spectral power distribution deconvolution scheme for phosphor-converted white light-emitting diode using multiple Gaussian functions.  

PubMed

We propose a procedure to deconvolute the spectral power distribution (SPD) of phosphor-converted LEDs (pc-LEDs). The procedure involves a two-step process using multiple Gaussian functions. The first step is a preliminary process to deconvolute an SPD using a pair of Gaussian functions. Using the results from the first step, the second step determines (a) the number of Gaussian functions to be used in the analysis and (b) the initial values and regression domains of the coefficients of each Gaussian function for subsequent multiple-regression operations. Successful deconvolution is confirmed by comparing the values of lumen, correlated color temperature, and color rendering index with the experimental data of cool and warm pc-LEDs. The proposed approach is illustrated to evaluate the yellow-to-blue ratio and the phosphor power conversion efficiency. PMID:23400063
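A hedged sketch of the core multiple-Gaussian regression step: the SPD is modeled as a sum of Gaussians and fitted by nonlinear least squares, with a two-Gaussian preliminary fit followed by a refined fit. The starting values, band count, and synthetic spectrum below are assumptions for illustration, not the published two-step procedure's actual settings.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_sum(wl, *params):
    """Sum of Gaussians; params = (amp1, center1, width1, amp2, center2, width2, ...)."""
    total = np.zeros_like(wl, dtype=float)
    for a, c, w in zip(params[0::3], params[1::3], params[2::3]):
        total += a * np.exp(-((wl - c) / w) ** 2)
    return total

def fit_spd(wavelength, spd, initial_guess):
    popt, _ = curve_fit(gaussian_sum, wavelength, spd, p0=initial_guess, maxfev=20000)
    return popt

# Synthetic cool-white pc-LED spectrum: blue LED peak plus broad phosphor emission
wl = np.linspace(380, 780, 401)
spd = (1.0 * np.exp(-((wl - 452) / 12) ** 2)
       + 0.7 * np.exp(-((wl - 545) / 45) ** 2)
       + 0.3 * np.exp(-((wl - 600) / 40) ** 2))
# Step 1: rough blue/yellow split with two Gaussians; step 2: refit with three
rough = fit_spd(wl, spd, [1.0, 450, 15, 0.8, 560, 60])
refined = fit_spd(wl, spd, list(rough) + [0.3, 610, 40])
```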

Song, Bong-Min; Han, Bongtae

2013-02-10

363

A comparison of deconvolution and the Rutland-Patlak plot in parenchymal renal uptake rate  

PubMed Central

Introduction: Deconvolution and the Rutland-Patlak (R-P) plot are two of the most commonly used methods for analyzing dynamic radionuclide renography. Both methods allow estimation of absolute and relative renal uptake of the radiopharmaceutical and of its rate of transit through the kidney. Materials and Methods: Seventeen patients (32 kidneys) were referred for further evaluation by renal scanning. All patients were positioned supine with their backs to the scintillation gamma camera, so that the kidneys and the heart were both in the field of view. Approximately 5-7 mCi of 99mTc-DTPA (diethylenetriamine penta-acetic acid) in about 0.5 ml of saline was injected intravenously and sequential 20 s frames were acquired; the study on each patient lasted approximately 20 min. The time-activity curves of the parenchymal region of interest of each kidney, as well as of the heart, were obtained for analysis. The data were then analyzed with deconvolution and the R-P plot. Results: A strong positive association (n = 32; r = 0.83; R2 = 0.68) was found between the values obtained by applying the two methods. Bland-Altman statistical analysis demonstrated that 31 of the 32 cases (97%) were within the limits of agreement (mean ± 1.96 standard deviations). Conclusion: We believe the R-P analysis method is likely to be more reproducible than the iterative deconvolution method, because the deconvolution technique (the iterative method) relies heavily on the accuracy of the first point analyzed, as any errors are carried forward into the calculations of all the subsequent points, whereas the R-P technique is based on an initial analysis of the data by means of the R-P plot and can be considered an alternative technique for calculating the renal uptake rate.
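The Rutland-Patlak analysis itself reduces to a straight-line fit: kidney counts normalized by blood-pool counts are plotted against the integrated blood-pool curve normalized the same way, and the slope estimates the parenchymal uptake rate. The sketch below assumes uniformly sampled frames and synthetic curves; the frame interval and fitting window are illustrative choices, not the study's acquisition protocol.

```python
import numpy as np

def rutland_patlak_uptake(renal, blood, frame_seconds=20.0, fit_range=(2, 20)):
    """Estimate a relative renal uptake rate from a Rutland-Patlak plot.

    x(t) = integral of blood curve / blood(t),  y(t) = renal(t) / blood(t);
    over the uptake phase y is approximately linear in x, and the slope is
    the uptake rate constant (per second here).
    """
    renal = np.asarray(renal, float)
    blood = np.asarray(blood, float)
    cum_blood = np.cumsum(blood) * frame_seconds        # rectangle-rule integral
    x = cum_blood / blood
    y = renal / blood
    lo, hi = fit_range                                   # frames used for the fit
    slope, intercept = np.polyfit(x[lo:hi], y[lo:hi], 1)
    return slope, intercept

# Synthetic 20 s frames: decaying blood pool, kidney that accumulates tracer
frames = np.arange(60)
blood = 100.0 * np.exp(-frames / 30.0) + 10.0
renal = 0.004 * np.cumsum(blood) * 20.0 + 0.3 * blood   # uptake plus vascular fraction
print(rutland_patlak_uptake(renal, blood))              # slope close to 0.004
```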

Al-Shakhrah, Issa A

2012-01-01

364

Deconvolution of mixed magnetism in multilayer graphene  

NASA Astrophysics Data System (ADS)

Magnetic properties of graphite modified at the edges by KCl and exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using low-field and high-field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge-state spins and the interaction between them decide the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.

Swain, Akshaya Kumar; Bahadur, Dhirendra

2014-06-01

365

Quantitative deconvolution of human thermal infrared emittance.  

PubMed

The bioheat transfer models conventionally employed in etiology of human thermal infrared (TIR) emittance rely upon two assumptions; universal graybody emissivity and significant transmission of heat from subsurface tissue layers. In this work, a series of clinical and laboratory experiments were designed and carried out to conclusively evaluate the validity of the two assumptions. Results obtained from the objective analyses of TIR images of human facial and tibial regions demonstrated significant variations in spectral thermophysical properties at different anatomic locations on human body. The limited validity of the two assumptions signifies need for quantitative deconvolution of human TIR emittance in clinical, psychophysiological and critical applications. A novel approach to joint inversion of the bioheat transfer model is also introduced, levering the deterministic temperature-dependency of proton resonance frequency in low-lipid human soft tissue for characterizing the relationship between subsurface 3D tissue temperature profiles and corresponding TIR emittance. PMID:23086533

Arthur, D T J; Khan, M M

2013-01-01

366

A similarity theory of approximate deconvolution models of turbulence  

NASA Astrophysics Data System (ADS)

We apply the phenomenology of homogeneous, isotropic turbulence to the family of approximate deconvolution models proposed by Stolz and Adams. In particular, we establish that the models themselves have an energy cascade with two asymptotically different inertial ranges. Delineation of these gives insight into the resolution requirements of using approximate deconvolution models. The approximate deconvolution model's energy balance contains both an enhanced energy dissipation and a modification to the model's kinetic energy. The modification of the model's kinetic energy induces a secondary energy cascade which accelerates scale truncation. The enhanced energy dissipation completes the scale truncation by reducing the model's micro-scale from the Kolmogorov micro-scale.
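The approximate deconvolution family referred to here builds the deconvolution operator as a truncated Neumann series in the filter, D_N = sum over n = 0..N of (I - G)^n. The sketch below applies that operator to a simple discrete smoothing filter to show how increasing N sharpens a filtered field; the filter choice and test signal are illustrative and do not reproduce the Stolz-Adams LES setting analyzed in the paper.

```python
import numpy as np

def apply_filter(u):
    """A simple discrete smoothing filter G (three-point average, periodic)."""
    return 0.25 * np.roll(u, 1) + 0.5 * u + 0.25 * np.roll(u, -1)

def approximate_deconvolution(u_bar, order=3):
    """Van Cittert style operator D_N = sum_{n=0}^{N} (I - G)^n applied to u_bar."""
    result = np.zeros_like(u_bar)
    term = u_bar.copy()                      # (I - G)^0 applied to u_bar
    for _ in range(order + 1):
        result += term
        term = term - apply_filter(term)     # next power of (I - G)
    return result

# Illustration: filter a field, then approximately deconvolve it
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(8 * x)
u_bar = apply_filter(u)
u_star = approximate_deconvolution(u_bar, order=5)
# u_star is closer to u than u_bar is, especially for the 8-cycle component
```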

Layton, William; Neda, Monika

2007-09-01

367

Blind deconvolution of two-dimensional complex data  

SciTech Connect

Inspired by the work of Lane and Bates on automatic multidimensional deconvolution, the authors have developed a systematic approach and an operational code for performing the deconvolution of multiply-convolved two-dimensional complex data sets in the absence of noise. They explain, in some detail, the major algorithmic steps, where noise or numerical errors can cause problems, their approach in dealing with numerical rounding errors, and where special noise-mitigating techniques can be used toward making blind deconvolution practical. Several examples of deconvolved imagery are presented, and future research directions are noted.

Ghiglia, D.C.; Romero, L.A.

1994-01-01

368

Abnormal hepatobiliary clearance of 99mTc-N-(2,6-diethylphenylcarbamoylmethyl)iminodiacetic acid in the altered state of thyroid--by an analysis of deconvolution method.  

PubMed

Hepatobiliary clearance of 99mTc-EHIDA was investigated in cases with altered thyroid function by the deconvolution method. The results indicated that the mean hepatic transit time of all control subjects was less than 10 minutes. On the other hand, the mean hepatic transit time of cases with altered thyroid function was prolonged to more than 13 minutes. In particular, cases showing a serum TSH concentration elevated above the normal range (4.6 microU/ml) tended to have a high incidence of markedly prolonged mean hepatic transit times. These results suggest that thyroid hormone may influence the hepatic metabolism of hepatobiliary radiopharmaceuticals. This phenomenon could also partly explain the cause of liver dysfunction seen in subjects with altered thyroid states. PMID:2305113

Tanno, M; Yamada, H; Kurihara, N; Kyomasu, Y; Nakayama, M; Mashima, Y; Chiba, K; Takahashi, R; Satoh, T

1990-01-01

369

SEM analysis as a diagnostic tool for photovoltaic cell degradation  

NASA Astrophysics Data System (ADS)

The importance of scanning electron microscopy (SEM) analysis as a diagnostic tool for analyzing the degradation of a polycrystalline photovoltaic cell has been studied. The main aim of this study is to characterize the surface morphology of hot spot (degraded) regions in photovoltaic solar cells. In recent years, production of hetero- and multi-junction solar cells has experienced tremendous growth compared to conventional silicon (Si) solar cells. Thin-film photovoltaic solar cells are generally more prone to exhibiting defects and associated degradation modes. To improve the lifetime of these cells and modules, it is imperative to fully understand the cause and effect of defects and degradation modes. The objective of this paper is to diagnose the observed degradation in polycrystalline silicon cells using scanning electron microscopy (SEM). In this study poly-Si cells were characterized before and after reverse biasing, which was done to evaluate the cells' susceptibility to leakage currents and hotspot formation. After reverse biasing, some cells were found to exhibit hotspots, as confirmed by infrared thermography. The surface morphology of these hotspots re

Osayemwenre, Gilbert; Meyer, E. L.

2013-04-01

370

Thermal Management Tools for Propulsion System Trade Studies and Analysis  

NASA Technical Reports Server (NTRS)

Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

McCarthy, Kevin; Hodge, Ernie

2011-01-01

371

Automatic analysis of Rutherford backscattering spectrometry spectra  

Microsoft Academic Search

It has been shown that Bayesian statistics is a powerful tool in the analysis of ion beam analysis (IBA) data. Past work has shown its applicability to the deconvolution of the detector response function from micro-Rutherford backscattering spectrometry (RBS) and micro-proton-induced X-ray emission (PIXE) spectra, subtraction of the background from PIXE spectra, the extraction of depth profiles from PIXE spectra

J. Padayachee; K. A. Meyer; V. M. Prozesky

2001-01-01

372

Developing a high-quality software tool for fault tree analysis  

Microsoft Academic Search

Sophisticated dependability analysis techniques are being developed in academia and research labs, but few have gained wide acceptance in industry. To be valuable, such techniques must be supported by usable, dependable software tools. We present our approach to addressing these issues in developing a dynamic fault tree analysis tool called Galileo. Galileo is designed to support efficient system-level analysis by

Joanne Bechta Dugan; Kevin J. Sullivan; David Coppit

1999-01-01

373

Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools  

NASA Astrophysics Data System (ADS)

Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, studies that portray this feature spatially are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular, a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is built around a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency management to know in advance the different levels of risk perception and preparedness existing among several sectors of the population. Knowing where the most vulnerable population is located may optimize the use of resources, better direct the initial efforts and organize the evacuation and attention procedures. As part of the CBEWS, a comprehensive survey was applied in the study area to measure, among other features, the levels of risk perception, preparation and information received about natural hazards. After a statistical and direct analysis of the complete social dataset recorded, a spatial distribution of the information is currently in progress. Based on the boundary features (municipalities and sub-districts) of the Italian Institute of Statistics (ISTAT), a local-scale background has been established (individual addresses are not accessible under privacy rules, so the local district ID within each municipality was the level of detail used), and the spatial location of the surveyed population has been completed. The geometric component has been defined, and it is now possible to create a local distribution of the social parameters derived from the perception questionnaire results. The raw information and socio-statistical analyses offer different views and "visual concepts" of risk perception. For this reason a complete GeoDB is under development for the organization of the dataset.
From a technical point of view, the data-sharing environment is based on a fully open-source web-service stack, offering a purpose-built, user-friendly interface to this kind of information. The final aim is to offer different views of the dataset, using the same scale prototype and hierarchical data structure, to provide and compare the spatial distribution of risk perception at the most detailed level.

Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

2010-05-01

374

A dynamic deep space communication link analysis tool for the deep space network (DSN)  

Microsoft Academic Search

A dynamic deep space communication link analysis tool is described in this paper. This tool, developed by The Aerospace Corporation, provides the capability to analyze coverage and data throughput for communication links between a spacecraft and the Jet Propulsion Laboratory's (JPL) Deep Space Network (DSN). The tool determines the link margin and data throughput over time during the trajectory of

Y. Y. Krikorian; M. K. Sue; G. V. Leon; L. Cooper; S. K. Do; D. L. Emmons; D. J. Dichmann; J. P. McVey; E. T. Campbell

2005-01-01

375

Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique  

ERIC Educational Resources Information Center

A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…

Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

2005-01-01

376

Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis  

NASA Astrophysics Data System (ADS)

The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or likely to be obscured. Hence, our understanding of proximal deposits is often limited as they were produced by the simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas have been cleaned and impregnated with two-component glue (EPOTEK 301). For approx. 10 * 10 cm, a volume of mixed glue of 20 ml was required. Using a syringe, this low-viscosity, transparent glue could be easily applied on the target area. We found that the glue permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method renders it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D-plane. In a first step, the sample surface has been scanned and analysed by means of image analysis software (Image J). After that, selected areas were investigated through thin section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.

2010-05-01

377

Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis  

NASA Astrophysics Data System (ADS)

The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited as they were produced by simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, reconstructing the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics is difficult. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluctuations in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the bomb content is enriched in discrete layers without clear bomb sags. The sample areas were cleaned and impregnated with a two-component glue (EPOTEK 301). For an area of approx. 10 × 10 cm, about 20 ml of mixed glue was required. This low-viscosity, transparent glue was easily applied to the target area by means of a syringe and permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method makes it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D-plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (Image J). After that, selected areas were investigated through thin section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

Klapper, D.; Kueppers, U.; Castro, J. M.

2009-12-01

378

Experimental analysis of change detection algorithms for multitooth machine tool fault detection  

NASA Astrophysics Data System (ADS)

This paper describes an industrial application of a fault diagnosis method for a multitooth machine tool. Different statistical approaches have been used to detect and diagnose insert breakage in multitooth tools based on the analysis of the electrical power consumption of the tool drives. Great effort has been made to obtain a robust method able to avoid any re-calibration process after, for example, a maintenance operation. From the point of view of maintenance costs, these multitooth tools are the most critical part of the machine tools used for mass production in the car industry. These tools integrate different kinds of machining operations and cutting conditions.
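
The kind of change detection alluded to in this record can be illustrated with a simple one-sided CUSUM statistic applied to a drive-power signal. This is a minimal sketch under assumed parameters (a nominal mean, an allowance k and a decision threshold h), not the statistical procedure actually used in the paper:

```python
import numpy as np

def cusum_detect(power, nominal_mean, k=0.5, h=5.0):
    """One-sided CUSUM change detector (illustrative sketch): flag the first
    sample at which the cumulative positive deviation of the drive-power
    signal from its nominal mean exceeds the decision threshold h.
    k and h are expressed in units of the signal's standard deviation."""
    x = np.asarray(power, dtype=float)
    sigma = x.std() if x.std() > 0 else 1.0
    z = (x - nominal_mean) / sigma
    g = 0.0
    for i, zi in enumerate(z):
        g = max(0.0, g + zi - k)   # accumulate deviations above the allowance k
        if g > h:
            return i               # index at which a change (e.g. insert breakage) is declared
    return None                    # no change detected
```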

Reñones, Aníbal; de Miguel, Luis J.; Perán, José R.

2009-10-01

379

Deconvolution Estimation in Measurement Error Models: The R Package decon  

PubMed Central

Data from many scientific areas often come with measurement error. Density or distribution function estimation from contaminated data and nonparametric regression with errors-in-variables are two important topics in measurement error models. In this paper, we present a new software package decon for R, which contains a collection of functions that use the deconvolution kernel methods to deal with the measurement error problems. The functions allow the errors to be either homoscedastic or heteroscedastic. To make the deconvolution estimators computationally more efficient in R, we adapt the fast Fourier transform algorithm for density estimation with error-free data to the deconvolution kernel estimation. We discuss the practical selection of the smoothing parameter in deconvolution methods and illustrate the use of the package through both simulated and real examples.
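
The deconvolution kernel density estimator underlying such packages can be sketched directly. The snippet below is an illustrative Python implementation for homoscedastic Gaussian measurement error; the kernel choice, bandwidth h and error scale sigma_err are assumptions, and this is not the decon package's own R interface:

```python
import numpy as np

def deconv_kde(w, x_grid, h, sigma_err):
    """Deconvolution kernel density estimate for data w = x + e contaminated
    with N(0, sigma_err^2) measurement error (illustrative sketch). Uses a
    kernel whose Fourier transform is (1 - t^2)^3 on [-1, 1], so the
    inversion integral is band-limited."""
    t = np.linspace(-1.0 / h, 1.0 / h, 2001)               # frequency grid
    dt = t[1] - t[0]
    phi_K = np.clip(1.0 - (h * t) ** 2, 0.0, None) ** 3    # kernel characteristic function at h*t
    phi_W = np.exp(1j * np.outer(t, w)).mean(axis=1)       # empirical characteristic function of w
    phi_e = np.exp(-0.5 * (sigma_err * t) ** 2)            # characteristic function of the error
    integrand = phi_K * phi_W / phi_e
    est = np.array([np.sum(np.exp(-1j * t * x) * integrand).real for x in x_grid])
    return est * dt / (2.0 * np.pi)

# Toy example: recover the density of x from noisy observations w.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 500)
w = x + rng.normal(0.0, 0.4, 500)
f_hat = deconv_kde(w, np.linspace(-4, 4, 161), h=0.35, sigma_err=0.4)
```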

Wang, Xiao-Feng; Wang, Bin

2011-01-01

380

The discrete Kalman filtering approach for seismic signals deconvolution  

SciTech Connect

Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is deconvolution; conventional deconvolution uses inverse filters based on Wiener filter theory. That theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is then used to generate an estimate of the reflectivity function. The main advantages of Kalman filtering are its ability to handle continually time-varying models and its high resolution. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering process works on the reflectivity function, so the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with the energy waveform referred to as the seismic wavelet. A higher wavelet frequency gives a smaller wavelength; graphs of these results are presented.
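
The workflow described here (primitive deconvolution followed by Kalman filtering of the reflectivity estimate) can be sketched as follows. The wavelet, noise levels and filter variances below are assumptions chosen for illustration, and the scalar random-walk state model is a simplification of the authors' formulation:

```python
import numpy as np

def kalman_smooth(z, q=1e-3, r=1e-2, x0=0.0, p0=1.0):
    """Scalar discrete Kalman filter with a random-walk state model
    (illustrative sketch)."""
    x, p = x0, p0
    out = np.empty(len(z))
    for k, zk in enumerate(z):
        p = p + q                      # predict: uncertainty grows by the process noise
        gain = p / (p + r)             # Kalman gain
        x = x + gain * (zk - x)        # update with the new measurement
        p = (1.0 - gain) * p
        out[k] = x
    return out

# Synthetic trace: reflectivity circularly convolved with a Ricker wavelet plus noise.
rng = np.random.default_rng(1)
n = 256
refl = np.zeros(n)
refl[[40, 90, 150, 200]] = [1.0, -0.7, 0.5, -0.4]
t = np.arange(-32, 32) * 0.004
wavelet = (1 - 2 * (np.pi * 25 * t) ** 2) * np.exp(-(np.pi * 25 * t) ** 2)  # 25 Hz Ricker
W = np.fft.rfft(wavelet, n)
trace = np.fft.irfft(np.fft.rfft(refl) * W, n) + 0.02 * rng.standard_normal(n)

# Primitive deconvolution (damped spectral division by the wavelet), then Kalman filtering.
eps = 1e-2 * np.abs(W).max()
refl_est = np.fft.irfft(np.fft.rfft(trace) * np.conj(W) / (np.abs(W) ** 2 + eps ** 2), n)
refl_kf = kalman_smooth(refl_est)
```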

Kurniadi, Rizal; Nurhandoko, Bagus Endar B. [Department of Physics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung (Indonesia)]

2012-06-20

381

Bregmanized Nonlocal Regularization for Deconvolution and Sparse Reconstruction  

Microsoft Academic Search

We propose two algorithms based on Bregman iteration and operator splitting technique for nonlocal TV regularization problems. The convergence of the algorithms is analyzed and applications to deconvolution and sparse reconstruction are presented.

Xiaoqun Zhang; Martin Burger; Xavier Bresson

382

Bregmanized Nonlocal Regularization for Deconvolution and Sparse Reconstruction  

Microsoft Academic Search

We propose two algorithms based on Bregman iteration and operator splitting technique for nonlocal TV regularization problems. The convergence of the algorithms is analyzed and applications to deconvolution and sparse reconstruction are presented.

Xiaoqun Zhang; Martin Burger; Xavier Bresson; Stanley Osher

2010-01-01

383

Application of the Lucy–Richardson Deconvolution Procedure to High Resolution Photoemission Spectra  

SciTech Connect

Angle-resolved photoemission has developed into one of the leading probes of the electronic structure and associated dynamics of condensed matter systems. As with any experimental technique the ability to resolve features in the spectra is ultimately limited by the resolution of the instrumentation used in the measurement. Previously developed for sharpening astronomical images, the Lucy-Richardson deconvolution technique proves to be a useful tool for improving the photoemission spectra obtained in modern hemispherical electron spectrometers where the photoelectron spectrum is displayed as a 2D image in energy and momentum space.
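
The Lucy-Richardson iteration named in this record is compact enough to sketch. The 1-D version below is illustrative (photoemission data are deconvolved as 2-D energy-momentum images); the variable names and the shift-invariant resolution function are assumptions:

```python
import numpy as np

def richardson_lucy(measured, response, n_iter=50):
    """Richardson-Lucy deconvolution of a 1-D spectrum (illustrative sketch).
    `measured` holds non-negative counts; `response` is the instrument
    resolution function, assumed shift-invariant."""
    response = np.asarray(response, dtype=float)
    response = response / response.sum()          # normalize the response to unit area
    mirror = response[::-1]
    estimate = np.full(len(measured), float(np.mean(measured)))
    for _ in range(n_iter):
        blurred = np.convolve(estimate, response, mode="same")
        ratio = np.asarray(measured, dtype=float) / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, mirror, mode="same")
    return estimate
```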

Rameau, J.; Yang, H.-B.; Johnson, P.D.

2010-07-01

384

Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution  

NASA Technical Reports Server (NTRS)

A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors as embodied in one preferred embodiment in a global positioning system receiver. The method and receiver of the present invention estimate and then compensate for multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)

1999-01-01

385

Bayesian regularization and nonnegative deconvolution for room impulse response estimation  

Microsoft Academic Search

This paper proposes Bayesian Regularization And Nonnegative Deconvolution (BRAND) for accurately and robustly estimating acoustic room impulse responses for applications such as time-delay estimation and echo cancellation. Similar to conventional deconvolution methods, BRAND estimates the coefficients of convolutive finite-impulse-response (FIR) filters using least-square optimization. However, BRAND exploits the nonnegative, sparse structure of acoustic room impulse responses with nonnegativity constraints and

Yuanqing Lin; Daniel D. Lee

2006-01-01

386

Mineral abundance determination: Quantitative deconvolution of thermal emission spectra  

Microsoft Academic Search

A linear retrieval (spectral deconvolution) algorithm is developed and applied to high-resolution laboratory infrared spectra of particulate mixtures and their end-members. The purpose is to place constraints on, and test the viability of, linear spectral deconvolution of high-resolution emission spectra. The effects of addition of noise, data reproducibility, particle size variation, an increasing number of minerals in the mixtures, and

Michael S. Ramsey; Philip R. Christensen

1998-01-01

387

Bayesian Multiscale Deconvolution Applied to Gamma-ray Spectroscopy  

Microsoft Academic Search

A common task in gamma-ray astronomy is to extract spectral information, such as model constraints and incident photon spectrum estimates, given the measured energy deposited in a detector and the detector response. This is the classic problem of spectral “deconvolution” or spectral inversion [2]. The methods of forward folding (i.e. parameter fitting) and maximum entropy “deconvolution” (i.e. estimating independent input

C. A. Young; A. Connors; E. Kolaczyk; M. McConnell; G. Rank; J. M. Ryan; V. Schoenfelder

2003-01-01

388

Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics  

SciTech Connect

The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

Jodoin, Vincent J. [ORNL]; Lee, Ronald W. [ORNL]; Peplow, Douglas E. [ORNL]; Lefebvre, Jordan P. [ORNL]

2011-01-01

389

Online Analysis of Wind and Solar Part I: Ramping Tool  

SciTech Connect

To facilitate wider penetration of renewable resources without compromising system reliability, given concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

2012-01-31

390

MTpy - Python Tools for Magnetotelluric Data Processing and Analysis  

NASA Astrophysics Data System (ADS)

We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, the data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software, adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to provide a simplification of the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or Obspy. It contains sub-packages and modules for the various tasks within the standard work-flow of MT data processing and interpretation. In order to allow the inclusion of already existing and well established software, MTpy provides not only pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework, which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.

Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

2014-05-01

391

General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft  

NASA Technical Reports Server (NTRS)

The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

Dove, Edwin; Hughes, Steve

2007-01-01

392

AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool  

USGS Publications Warehouse

Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

Halford, Keith

2009-01-01

393

Wavelet transform-based Fourier deconvolution for resolving oscillographic signals.  

PubMed

Fourier self-deconvolution is an effective means of resolving overlapped bands, but this method requires a mathematical model to yield the deconvolution and it is quite sensitive to noise in the unresolved bands. Wavelet transform is a technique for noise reduction and deterministic feature capturing because its time-frequency localization or scale is not the same over the entire time-frequency domain. In this work, wavelet transform-based Fourier deconvolution was proposed, in which a discrete approximation (such as A(2)) obtained from performing a wavelet transform on the original data was substituted for the original data to be deconvolved, and another appropriate discrete approximation (such as A(5)) was used as a lineshape function to yield the deconvolution. Again, instead of the apodization function, the B-spline wavelet was used to smooth the deconvolved data to enhance the signal-to-noise ratio. As a consequence, this method does not suffer as badly as Fourier self-deconvolution from noise in the original data. Thus, resolution enhancement can be increased significantly, especially for signals with higher noise levels. Furthermore, this method does not require a mathematical model to yield the deconvolution, which makes it very convenient for deconvolving electrochemical signals. PMID:18968359
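
The classical Fourier self-deconvolution step that this method builds on can be sketched in a few lines. The snippet below assumes Lorentzian bands of known half-width gamma and uses a simple exponential re-apodization instead of the paper's wavelet approximations; it is illustrative, not the authors' algorithm:

```python
import numpy as np

def fourier_self_deconvolution(y, gamma, narrowing=0.4, dx=1.0):
    """Fourier self-deconvolution of overlapping Lorentzian bands
    (illustrative sketch). gamma is the assumed half-width of the Lorentzian
    lineshape; narrowing sets the residual width (smaller values narrow more
    but amplify noise)."""
    n = len(y)
    f = np.fft.rfftfreq(n, d=dx)
    Y = np.fft.rfft(y)
    # Divide out the Fourier transform of the Lorentzian lineshape, exp(-2*pi*gamma*|f|),
    # then re-apodize with a narrower Lorentzian to limit noise amplification.
    Y_deconv = Y * np.exp(2 * np.pi * gamma * f) * np.exp(-2 * np.pi * gamma * narrowing * f)
    return np.fft.irfft(Y_deconv, n)
```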

Zhang, X; Zheng, J; Gao, H

2001-08-01

394

Industry Sector Analysis Mexico: Machine Tools and Metal Working Equipment.  

National Technical Information Service (NTIS)

The Industry Sector Analyses (I.S.A.) for machine tools and metalworking equipment contains statistical and narrative information on projected market demand, end-users, receptivity of Mexican consumers to U.S. products, the competitive situation - Mexican...

1990-01-01

395

Friction stir welding using unthreaded tools: analysis of the flow  

Microsoft Academic Search

Friction stir welding (FSW) is a solid-phase welding process. Material flow during FSW is very complex and not fully understood. Most studies in the literature used threaded pins, since most industrial applications currently use threaded pins. However, initially threaded tools may become unthreaded because of tool wear when used for high melting point alloys or reinforced aluminium alloys. In

O. Lorrain; V. Favier; H. Zahrouni; D. Lawrjaniec

2010-01-01

396

CATLAC: Calibration and Validation Analysis Tool of Local Area Coverage for the SeaWifs Mission.  

National Technical Information Service (NTIS)

Calibration and validation Analysis Tool of Local Area Coverage (CATLAC) is an analysis package for selecting and graphically displaying Earth and space targets for calibration and validation activities on a polar orbiting satellite. The package is writte...

R. H. Woodward; W. W. Gregg; F. S. Patt

1994-01-01

397

Guide on the Consistent Application of Traffic Analysis Tools and Methods.  

National Technical Information Service (NTIS)

Federal Highway Administration, in support of the Traffic Analysis and Simulation Pooled Fund Study, initiated this study to identify and address consistency in the selection and use of traffic analysis tools. This document offers recommendations on the m...

R. Dowling

2011-01-01

398

New Geant4 based simulation tools for space radiation shielding and effects analysis  

Microsoft Academic Search

We present here a set of tools for space applications based on the Geant4 simulation toolkit, developed for radiation shielding analysis as part of the European Space Agency (ESA) activities in the Geant4 collaboration. The Sector Shielding Analysis Tool (SSAT) and the Materials and Geometry Association (MGA) utility will first be described. An overview of the main features of the

G. Santin; P. Nieminen; H. Evans; E. Daly; F. Lei; P. R. Truscott; C. S. Dyer; B. Quaghebeur; D. Heynderickx

2003-01-01

399

SICOMAT: A system for SImulation and COntrol analysis of MAchine Tools  

Microsoft Academic Search

Presents a software package for the simulation and the control analysis of machine tool axes. This package which is called SICOMAT (SImulation and COntrol analysis of MAchine Tools), provides a large variety of toolboxes to analyze the behavior and the control of the machine. The software takes into account several elements such as the flexibility of bodies, the interaction between

M. Gautier; M. T. Pham; W. Khalil; Ph. Lemoine; Ph. Poignet

2001-01-01

400

Development of Quantitative Risk Analysis tool for the fire safety in railway tunnel  

Microsoft Academic Search

In the context of strengthening the fire safety level in transport tunnels, much effort has been made to develop techniques to quantify the fire risk in the transport tunnel system. In this paper, the development of a quantitative risk analysis tool for fire safety in railway tunnels will be described. Inside the analysis tool, a number of scenarios are constructed

Sungwook Yoon; Hang Choi

401

Model-Based Tool-Chain Infrastructure for Automated Analysis of Embedded Systems  

Microsoft Academic Search

In many safety-critical applications of embedded systems, the system dynamics exhibits hybrid behaviors. To enable automatic analysis of these embedded systems, many analysis tools have been developed based on the hybrid automata model. These tools are constructed with their own domain-specific modeling languages (DSMLs), but they differ in various aspects. To enable meaningful semantic interpretation of DSMLs, we

Hang Su; Graham Hemingway; Kai Chen; T. John Koo

2006-01-01

402

The Moon: Determining Minerals and their Abundances with Mid-IR Spectral Deconvolution II  

NASA Astrophysics Data System (ADS)

We determine the mineral compositions and abundances at three locations on the lunar surface using an established spectral deconvolution algorithm (Ramsey 1996, Ph.D. Dissertation, ASU; Ramsey and Christiansen 1998, JGR 103, 577-596) for mid-infrared spectral libraries of mineral separates of varying grain sizes. Spectral measurements of the lunar surface were obtained at the Infrared Telescope Facility (IRTF) on Mauna Kea, HI with Boston University's Mid-Infrared Spectrometer and Imager (MIRSI). Our chosen locations, Aristarchus, Grimaldi and Mersenius C, have been previously observed in the VIS near-IR from ground-based telescopes and spacecraft (Zisk et al. 1977, The Moon 17, 59-99; Hawke et al. 1993, GRL 20, 419-422; McEwen et al. 1994, Science 266, 1858-1862; Peterson et al. 1995, 22, 3055-3058; Warell et al. 2006, Icarus 180, 281-291), however there are no sample returns for analysis. Surface mineral deconvolutions of the Grimaldi Basin infill are suggestive of anorthosite, labradorite, orthopyroxene, olivine, garnet and phosphate. Peterson et al. (1995) indicated the infill of Grimaldi Basin has a noritic anorthosite or anorthositic norite composition. Our spectral deconvolution supports these results. Modeling of other lunar locations is underway. We have also successfully modeled laboratory spectra of HED meteorites, Vesta, and Mercury (see meteorites and mercurian abstracts this meeting). These results demonstrate the spectral deconvolution method to be robust for making mineral identifications on remotely observed objects, in particular main-belt asteroids, the Moon, and Mercury. This work was funded by NSF AST406796.
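
The linear retrieval at the heart of this kind of spectral deconvolution can be sketched as a non-negative least-squares unmixing of the measured spectrum against a library of end-member spectra. This is an illustrative sketch in the spirit of the Ramsey-Christensen algorithm, not the authors' code; the matrix layout and normalization are assumptions:

```python
import numpy as np
from scipy.optimize import nnls

def linear_unmix(mixed_spectrum, endmembers):
    """Linear spectral deconvolution (illustrative sketch): find non-negative
    end-member abundances whose linear combination best fits the measured
    emissivity spectrum, then normalize them to sum to one.
    endmembers: array of shape (n_wavelengths, n_minerals)."""
    A = np.asarray(endmembers, dtype=float)
    b = np.asarray(mixed_spectrum, dtype=float)
    abundances, residual_norm = nnls(A, b)       # non-negative least squares fit
    total = abundances.sum()
    return (abundances / total if total > 0 else abundances), residual_norm
```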

Kozlowski, Richard W.; Donaldson Hanna, K.; Sprague, A. L.; Grosse, F. A.; Boop, T. S.; Warell, J.; Boccafola, K.

2007-10-01

403

Fast, Automated Implementation of Temporally Precise Blind Deconvolution of Multiphasic Excitatory Postsynaptic Currents  

PubMed Central

Records of excitatory postsynaptic currents (EPSCs) are often complex, with overlapping signals that display a large range of amplitudes. Statistical analysis of the kinetics and amplitudes of such complex EPSCs is nonetheless essential to the understanding of transmitter release. We therefore developed a maximum-likelihood blind deconvolution algorithm to detect exocytotic events in complex EPSC records. The algorithm is capable of characterizing the kinetics of the prototypical EPSC as well as delineating individual release events at higher temporal resolution than other extant methods. The approach also accommodates data with low signal-to-noise ratios and those with substantial overlaps between events. We demonstrated the algorithm’s efficacy on paired whole-cell electrode recordings and synthetic data of high complexity. Using the algorithm to align EPSCs, we characterized their kinetics in a parameter-free way. Combining this approach with maximum-entropy deconvolution, we were able to identify independent release events in complex records at a temporal resolution of less than 250 µs. We determined that the increase in total postsynaptic current associated with depolarization of the presynaptic cell stems primarily from an increase in the rate of EPSCs rather than an increase in their amplitude. Finally, we found that fluctuations owing to postsynaptic receptor kinetics and experimental noise, as well as the model dependence of the deconvolution process, explain our inability to observe quantized peaks in histograms of EPSC amplitudes from physiological recordings.
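
A much simpler event-detection scheme conveys the basic idea of deconvolving a recording against a prototypical EPSC waveform. The sketch below uses Wiener template deconvolution and a robust threshold; it is not the maximum-likelihood blind algorithm of the paper, and the noise-to-signal ratio and threshold are assumed values:

```python
import numpy as np

def detect_events(trace, template, nsr=0.05, threshold=4.0):
    """Event detection by template deconvolution (illustrative sketch):
    Wiener-deconvolve the current recording with a prototypical EPSC
    waveform and report samples where the deconvolved trace exceeds a
    robust noise threshold."""
    trace = np.asarray(trace, dtype=float)
    n = len(trace)
    H = np.fft.rfft(np.asarray(template, dtype=float), n)
    D = np.fft.irfft(np.conj(H) * np.fft.rfft(trace) / (np.abs(H) ** 2 + nsr), n)
    mad = np.median(np.abs(D - np.median(D))) * 1.4826     # robust noise estimate
    return np.flatnonzero(D > threshold * max(mad, 1e-12)) # indices of putative release events
```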

Andor-Ardo, Daniel; Keen, Erica C.; Hudspeth, A. J.; Magnasco, Marcelo O.

2012-01-01

404

A computational tool for ionosonde CADI's ionogram analysis  

NASA Astrophysics Data System (ADS)

The purpose of this work is to present a new computational tool for ionograms generated with a Canadian Advanced Digital Ionosonde (CADI). This new tool uses the fuzzy relation paradigm to identify the F trace and from it extract the parameters foF2, h'F, and hpF2. The tool was very extensively tested with ionosondes that operate at low latitudes and near the equatorial region. The ionograms used in this work were recorded at São José dos Campos (23.2° S, 45.9° W; dip latitude 17.6° S) and Palmas (10.2° S, 48.2° W; dip latitude 5.5° S). These automatically extracted ionospheric parameters were compared with those obtained manually and a good agreement was found. The developed tool will greatly expedite and standardize ionogram processing. Therefore, this new tool will facilitate the exchange of information among the many groups that operate ionosondes of the CADI type, and will be very helpful for space weather purposes.

Pillat, Valdir Gil; Guimarães, Lamartine Nogueira Frutuoso; Fagundes, Paulo Roberto; da Silva, José Demísio Simões

2013-03-01

405

pathFinder: A Static Network Analysis Tool for Pharmacological Analysis of Signal Transduction Pathways  

NSDL National Science Digital Library

The study of signal transduction is becoming a de facto part of the analysis of gene expression and protein profiling techniques. Many online tools are used to cluster genes in various ways or to assign gene products to signal transduction pathways. Among these, pathFinder is a unique tool that can find signal transduction pathways between first, second, or nth messengers and their targets within the cell. pathFinder can identify qualitatively all possible signal transduction pathways connecting any starting component and target within a database of two-component pathways (directional dyads). One or more intermediate pathway components can be excluded to simulate the use of pharmacological inhibitors or genetic deletion (knockout). Missing elements in a pathway connecting the activator or initiator and target can also be inferred from a null pathway result. The value of this static network analysis tool is illustrated by the prediction from pathFinder analysis of a novel cyclic AMP–dependent, protein kinase A–independent signaling pathway in neuroendocrine cells, which has been experimentally confirmed.
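
Enumerating paths through a database of directional dyads, with selected components excluded, is a small graph-search problem. The sketch below illustrates the idea with a toy network; the node names and the depth-first enumeration are assumptions for illustration, not pathFinder's actual implementation:

```python
def find_pathways(dyads, start, target, excluded=frozenset()):
    """Enumerate all simple paths from start to target in a directed graph of
    two-component pathways (directional dyads), optionally excluding
    components to mimic pharmacological inhibition or genetic knockout
    (illustrative sketch)."""
    graph = {}
    for a, b in dyads:
        graph.setdefault(a, set()).add(b)
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == target:
            paths.append(path)
            continue
        for nxt in graph.get(node, ()):
            if nxt not in excluded and nxt not in path:
                stack.append((nxt, path + [nxt]))
    return paths

# Hypothetical toy network: a cAMP-dependent route to ERK that survives a PKA "knockout".
dyads = [("cAMP", "PKA"), ("cAMP", "Epac"), ("PKA", "ERK"),
         ("Epac", "Rap1"), ("Rap1", "ERK")]
print(find_pathways(dyads, "cAMP", "ERK", excluded={"PKA"}))
```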

Babru B. Samal (NIH, National Institute of Mental Health Intramural Research Programs (NIMH-IRP) Bioinformatics Core); Lee E. Eiden (NIH, Section on Molecular Neuroscience)

2008-08-05

406

Affinity-based target deconvolution of safranal  

PubMed Central

Background and purpose of the study: Affinity-based target deconvolution is an emerging method for the identification of interactions between drugs/drug candidates and cellular proteins, and helps to predict potential activities and side effects of a given compound. In the present study, we hypothesized that part of the pharmacological effects of safranal, one of the major constituents of Crocus sativus L., relies on its physical interaction with target proteins. Methods: An affinity chromatography solid support was prepared by covalent attachment of safranal to agarose beads. After passing tissue lysate through the column, safranal-bound proteins were isolated and separated on SDS-PAGE or two-dimensional gel electrophoresis. Proteins were identified using MALDI-TOF/TOF mass spectrometry and Mascot software. Results and major conclusion: Data showed that safranal physically binds to beta actin, cytochrome b-c1 complex subunit 1, trifunctional enzyme subunit beta, and ATP synthase subunits alpha and beta. These interactions may explain part of safranal's pharmacological effects. However, the phenotypic and/or biological relevance of these interactions remains to be elucidated by future pharmacological studies.

2013-01-01

407

Mammographic image restoration using maximum entropy deconvolution  

NASA Astrophysics Data System (ADS)

An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification settings of 1.8 magnification/fine focus and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization.

Jannetta, A.; Jackson, J. C.; Kotre, C. J.; Birch, I. P.; Robson, K. J.; Padgett, R.

2004-11-01

408

Mammographic image restoration using maximum entropy deconvolution.  

PubMed

An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem, that of reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic noisy radiological images with no associated penalty in terms of reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification settings of 1.8 magnification/fine focus and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization. PMID:15584533

Jannetta, A; Jackson, J C; Kotre, C J; Birch, I P; Robson, K J; Padgett, R

2004-11-01

409

Development and Distribution of Data Analysis Reduction Tools: the ISOPHOT Interactive Analysis Case  

NASA Astrophysics Data System (ADS)

The different interactive data analysis software packages ("xIA": CIA, LIA, PIA, SIA) developed for the ISO instruments were conceived primarily as calibration tools. However, all of them were used more and more for astronomical data reduction throughout the mission, imposing new requirements on user friendliness, documentation, distribution, compatibility and user help systems. The strategies followed in the case of the ISOPHOT Interactive Analysis, as well as the experience gained, valuable for every current and future observatory, will be discussed in this paper.

Gabriel, C.

410

Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).  

PubMed

This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings. PMID:18406925

Rath, Frank

2008-01-01

411

Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)  

SciTech Connect

This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.

Rath, Frank [Department of Engineering Professional Development, University of Wisconsin-Madison, Madison, WI (United States)], E-mail: rath@engr.wisc.edu

2008-05-01

412

Performance Analysis of Available Bandwidth Estimation Tools for Grid Networks  

Microsoft Academic Search

Modern large-scale grid computing for processing advanced science and engineering applications relies on geographically distributed clusters. In such highly distributed environments, estimating the available bandwidth between clusters is a key issue for task scheduling. We analyze the performance of two well known available bandwidth estimation tools, pathload and abget, with the aim of using them in grid environments. Our experiments

Daniel M. Batista; Luciano J. Chaves; Nelson L. S. da Fonseca; Artur Ziviani

2009-01-01

413

Regional energy planning through SWOT analysis and strategic planning tools  

Microsoft Academic Search

Strategic planning processes, which are commonly used as a tool for region development and territorial structuring, can be harnessed by politicians and public administrations, at the local level, to redesign the regional energy system and encourage renewable energy development and environmental preservation. In this sense, the province of Jaén, a southern Spanish region whose economy is mainly based on olive

J. Terrados; G. Almonacid; L. Hontoria

2007-01-01

414

QV and PV curves as a planning tool of analysis  

Microsoft Academic Search

This paper deals with the problem of system voltage security. In this sense, the focus is on load margin, QV curves and system loss reduction. The idea is to use these tools in the planning scenario to determine the best locations for installation of distributed generation. For this purpose, from a base case, the system load margin and its losses are

Pablo Guimaraes; Ubaldo Fernandez; Tito Ocariz; Fritz W. Mohn; A. C. Zambroni de Souza

2011-01-01

415

Cognitive Bargaining Model: An Analysis Tool for Third Party Incentives.  

National Technical Information Service (NTIS)

Although threats and punishments have historically been the more prevalent tools of U.S. foreign policy, the current U.S. administration is signaling a reorientation toward a more positive inducement strategy. Much is written on incentives, but few have t...

B. C. Busch

2009-01-01

416

Usability tool for analysis of web designs using mouse tracks  

Microsoft Academic Search

This paper presents MouseTrack as a web logging system that tracks mouse movements on websites. The system includes a visualization tool that displays the mouse cursor path followed by website visitors. It helps web site administrators run usability tests and analyze the collected data. Practitioners can track any existing webpage by simply entering its URL. This paper includes a design

Ernesto Arroyo; Ted Selker; Willy Wei

2006-01-01

417

Protected marine reserves as fisheries management tools: a bioeconomic analysis  

Microsoft Academic Search

This paper develops a dynamic computational bioeconomic model with the objective of assessing protected marine reserves as fisheries management tools. Data on the North East Atlantic cod stock are used to determine the bioeconomically optimal size of a marine reserve for the Barents Sea cod fishery, as a function of the net transfer rate between the protected and unprotected areas

Ussif Rashid Sumaila

1998-01-01

418

PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis  

ERIC Educational Resources Information Center

This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

Waycott, Jenny; Jones, Ann; Scanlon, Eileen

2005-01-01

419

OutbreakTools: A new platform for disease outbreak analysis using the R software  

PubMed Central

The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks.

Jombart, Thibaut; Aanensen, David M.; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Hohle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A.; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

2014-01-01

420

OutbreakTools: A new platform for disease outbreak analysis using the R software.  

PubMed

The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

Jombart, Thibaut; Aanensen, David M; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

2014-06-01

421

Reference-Free XRF - Principle, Calibrated Instrumentation and Spectra Deconvolution  

NASA Astrophysics Data System (ADS)

The Physikalisch-Technische Bundesanstalt operates its own laboratory at the electron storage ring BESSY II in Berlin. One major task of this laboratory, hosting the departments Radiometry and X-ray Metrology with Synchrotron Radiation, is the use of well-defined synchrotron radiation for the calibration of different types of detectors in the spectral range from UV/VUV to the harder X-ray range. Well-known radiation sources in conjunction with calibrated instrumentation are used for X-ray fluorescence analysis (XRF), allowing for completely reference-free quantification. Here, XRF spectra deconvolution with experimentally determined detector response functions has been developed and further improved by using line sets for each subshell of each element involved. Synchrotron radiation originating from a bending magnet can be partially seen as an equivalent to the solar emission spectrum in the soft and hard X-ray range. Using different electron energies in the storage ring of BESSY II as well as of PTB's own Metrology Light Source (MLS), different parts of the solar spectrum can be approximated, allowing for complementary simulations of excitation conditions for XRF remote sensing of planetary surfaces.

Kolbe, M.; Beckhoff, B.; Mantler, M.

2010-03-01

422

A hidden-state Markov model for cell population deconvolution.  

PubMed

Microarrays measure gene expression typically from a mixture of cell populations during different stages of a biological process. However, the specific effects of the distinct or pure populations on measured gene expression are difficult or impossible to determine. The ability to deconvolve measured gene expression into the contributions from pure populations is critical to maximizing the potential of microarray analysis for investigating complex biological processes. In this paper, we describe a novel approach called the multinomial hidden Markov model (MHMM) that produces: (i) a maximum a posteriori estimate of the fraction represented by each pure population and (ii) gene expression values for each pure population. Our method uses an unsupervised, probabilistic approach for handling missing data points and clusters genes based on expression in pure populations. MHMM, used with several yeast datasets, identified statistically significant temporal dynamics. This method, unlike the linear decomposition models used previously for deconvolution, can extract information from different types of data, does not require a priori identification of pure gene expression, exploits the temporal nature of time series data, and is less affected by missing data. PMID:17238843

Roy, Sushmita; Lane, Terran; Allen, Chris; Aragon, Anthony D; Werner-Washburne, Margaret

2006-12-01

423

Maximum a posteriori deconvolution of ultrasonic signals using multiple transducers  

PubMed

A new method for deconvolution of ultrasonic pulse-echo measurements employing a multiple-transducer setup is proposed in the paper. An optimal way of estimating the material reflection sequence for a linear signal generation model using maximum a posteriori estimation is proposed. The method combines the measurements from a number of transducers covering different frequency bands, yielding an optimal estimate of the reflection sequence. The main idea of this approach is to complement the information unavailable from one transducer in some frequency bands with the information from the other transducers. The method is based on the assumption that the measurements are performed using transducers with identical apertures and apodization, which are located exactly at the same position relative to the test object during the measurement. An error analysis presented in the paper proves that when the above assumptions are fulfilled, the proposed method, by utilizing more data for estimation, consistently yields more accurate reflection sequence estimates than the classical Wiener filter. Experimental evidence is presented using both simulated and real ultrasonic data as a verification of the correctness of the multiple-transducer model and the estimation scheme. An illustration of the advantages of the method is also given using real ultrasonic data. PMID:10875373
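
The spirit of the multi-band fusion can be conveyed with a frequency-domain MAP estimator under a simplified model: each transducer measures the reflection sequence convolved with a known pulse plus white noise, and the reflection sequence has a white Gaussian prior. This is a minimal sketch under those assumptions, not the paper's estimator:

```python
import numpy as np

def map_deconvolve(signals, pulses, noise_vars, prior_var=1.0):
    """Frequency-domain MAP fusion of pulse-echo measurements from several
    transducers (illustrative sketch). Each measurement y_i = h_i * x + noise_i
    with known pulse h_i and noise variance noise_vars[i]; the reflection
    sequence x is given a white Gaussian prior with variance prior_var.
    All signals are assumed to have the same length."""
    n = len(signals[0])
    num = np.zeros(n // 2 + 1, dtype=complex)
    den = np.full(n // 2 + 1, 1.0 / prior_var)
    for y, h, s2 in zip(signals, pulses, noise_vars):
        H = np.fft.rfft(np.asarray(h, dtype=float), n)
        num += np.conj(H) * np.fft.rfft(np.asarray(y, dtype=float)) / s2
        den += np.abs(H) ** 2 / s2
    return np.fft.irfft(num / den, n)
```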

Olofsson; Stepinski

2000-06-01

424

Improved Cell Typing by Charge-State Deconvolution of Matrix-Assisted Laser Desorption/Ionization Mass Spectra  

SciTech Connect

Robust, specific, and rapid identification of toxic strains of bacteria and viruses, to guide the mitigation of their adverse health effects and optimum implementation of other response actions, remains a major analytical challenge. This need has driven the development of methods for classification of microorganisms using mass spectrometry, particularly matrix-assisted laser desorption ionization MS (MALDI) that allows high throughput analyses with minimum sample preparation. We describe a novel approach to cell typing based on pattern recognition of MALDI spectra, which involves charge-state deconvolution in conjunction with a new correlation analysis procedure. The method is applicable to both prokaryotic and eukaryotic cells. Charge-state deconvolution improves the quantitative reproducibility of spectra because multiply-charged ions resulting from the same biomarker attaching a different number of protons are recognized and their abundances are combined. This allows a clearer distinction of bacterial strains or of cancerous and normal liver cells. Improved class distinction provided by charge-state deconvolution was demonstrated by cluster spacing on canonical variate score charts and by correlation analyses. Deconvolution may enhance detection of early disease state or therapy progress markers in various tissues analyzed by MALDI.
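
The core idea, combining the abundances of multiply charged ions that stem from the same biomarker, can be sketched with a naive neutral-mass scoring scheme. The mass grid, tolerance and maximum charge below are assumptions, and real charge-state deconvolution would also use isotope spacing or peak consistency checks:

```python
import numpy as np

def combine_charge_states(mz, intensity, mass_grid, max_charge=3,
                          proton=1.00728, tol=0.5):
    """Naive charge-state deconvolution of a centroided spectrum
    (illustrative sketch): for every candidate neutral mass M, sum the
    intensities of peaks lying within tol of the expected m/z of the
    [M + zH]^z+ ion for z = 1..max_charge, so that the contributions of
    multiply charged ions of the same biomarker are combined."""
    mz = np.asarray(mz, dtype=float)
    intensity = np.asarray(intensity, dtype=float)
    combined = np.zeros(len(mass_grid))
    for i, M in enumerate(mass_grid):
        for z in range(1, max_charge + 1):
            expected = (M + z * proton) / z
            combined[i] += intensity[np.abs(mz - expected) < tol].sum()
    return combined
```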

Wilkes, Jon G.; Buzantu, Dan A.; Dare, Diane J.; Dragan, Yvonne P.; Chiarelli, M. Paul; Holland, Ricky D.; Beaudoin, Michael; Heinze, Thomas M.; Nayak, Rajesh; Shvartsburg, Alexandre A.

2006-05-30

425

Minimum variance image blending for robust ultrasound image deconvolution  

NASA Astrophysics Data System (ADS)

When the ultrasound wave propagates through the human body, its velocity and attenuation change from region to region, which makes the PSF shape vary. The PSF estimation problem is ill-posed and rarely error free, which produces PSF estimation errors and makes the image over-blurred by sidelobe artifacts. For the commercialization of ultrasound deconvolution methods, robustness of the image deconvolution without artifacts is essential. Many minimum variance beamformer algorithms exist; they are robust to noise and efficiently achieve high resolution. We treat the channel data as image pixels and present a new spatially varying MV (minimum variance) blending scheme for the deconvolved images in the image processing domain. By stochastically blending the deconvolved images, we obtain high-resolution results that sufficiently suppress blur artifacts even when the input deconvolved images contain restoration errors. We verify our algorithm on real data. In all cases, we observe that the artifacts are suppressed and the result shows the highest resolution among the deconvolution methods.
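
A simple pixel-wise variant of variance-based blending conveys the idea of down-weighting unstable restorations. The inverse-variance weighting and window size below are assumptions; the paper's scheme operates on channel data and is considerably more elaborate:

```python
import numpy as np
from scipy import ndimage

def min_variance_blend(images, win=7):
    """Blend several deconvolved images pixel-wise with weights inversely
    proportional to their local variance (illustrative sketch). Pixels where
    a restoration is unstable receive a low weight, which suppresses
    sidelobe and blur artifacts."""
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    mean = np.stack([ndimage.uniform_filter(im, win) for im in stack])
    mean_sq = np.stack([ndimage.uniform_filter(im * im, win) for im in stack])
    var = np.maximum(mean_sq - mean ** 2, 1e-12)
    weights = 1.0 / var
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)
```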

Park, Sungchan; Kang, Jooyoung; Kim, Yun-Tae; Kim, Kyuhong; Kim, Jung-Ho; Song, Jong Keun

2014-03-01

426

Forensic Analysis of Windows Hosts Using UNIX-based Tools  

SciTech Connect

Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

Cory Altheide

2004-07-19

427

Design tools for daylighting illumination and energy analysis  

SciTech Connect

The problems and potentials of using daylighting to provide illumination in building interiors are reviewed. This report describes some of the design tools now or soon to be available for incorporating daylighting into the building design process. It also describes state-of-the-art methods for analyzing the impacts daylighting can have on selection of lighting controls, lighting energy consumption, heating and cooling loads, and peak power demand.

Selkowitz, S.

1982-07-01

428

Bayesian Networks: A Decision Tool to Improve Portfolio Risk Analysis  

Microsoft Academic Search

This paper demonstrates how Bayesian Networks can aid decisions of individual security analysts and portfolio managers. We present a decision tool to improve analysts' forecasts, portfolio decision-making, and risk analysis. Our paper is related to findings in behavioral finance that show buying and selling behavior that is consistent with biased decision-making. However, most behavioral finance literature is descriptive, not normative. Our goal

Riza Demirer; Ronald R. Mau; Catherine Shenoy

429

Energy life-cycle analysis modeling and decision support tool  

Microsoft Academic Search

As one of DOE's five multi-program national laboratories, Pacific Northwest Laboratory (PNL) develops and deploys technology for national missions in energy and the environment. The Energy Information Systems Group, within the Laboratory's Computer Sciences Department, focuses on the development of the computational and data communications infrastructure and automated tools for the Transmission and Distribution energy sector and for advanced process

M. Hoza; M. E. White

1993-01-01

430

The Mission Planning Lab: A Visualization and Analysis Tool  

NASA Technical Reports Server (NTRS)

Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called the Mission Planning Lab (MPL).

Daugherty, Sarah C.; Cervantes, Benjamin W.

2009-01-01

431

CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS  

EPA Science Inventory

Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
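As a concrete illustration of the idea (with purely synthetic data; the stressor, response, and threshold below are assumptions), a conditional probability of this kind can be estimated directly from paired observations:

```python
# Hypothetical illustration: estimate P(impaired condition | stressor exceeds a threshold)
# from paired samples and compare it with the unconditional (baseline) probability.
import numpy as np

rng = np.random.default_rng(1)
stressor = rng.uniform(0, 10, 500)                               # synthetic contaminant levels
impaired = rng.random(500) < 1 / (1 + np.exp(-(stressor - 6)))   # synthetic ecological response

threshold = 5.0
exceeds = stressor > threshold
p_cond = impaired[exceeds].mean()    # P(impaired | stressor > threshold)
p_base = impaired.mean()             # baseline P(impaired)
print(f"P(impaired | stressor > {threshold}) = {p_cond:.2f} vs baseline {p_base:.2f}")
```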

432

New Miscue Analysis: A Tool for Comprehending Reading.  

ERIC Educational Resources Information Center

New miscue analysis, which combines characteristics of cloze procedure with traditional miscue analysis, seems to overcome some of the limitations encountered in the traditional method. In new miscue analysis, subjects read selections below and at their level of reading ability and are not agitated by being asked to read material that is too…

Anderson, Jonathan

433

A freeware java tool for spatial point analysis of neuronal structures.  

PubMed

Spatial point analysis is an analytical approach towards understanding patterns in the distribution of single points, such as synapses. To aid in this type of analysis of neuronal structures, a freeware tool, called PAJ, has been developed. This Java-based tool takes 3D Cartesian coordinates as input and performs a range of analyses to test for underlying patterns. In addition, Monte Carlo analysis is performed to compare experimental input with randomized input. This tool should be especially useful in determining whether neuronal structures are spatially patterned such that individual units interact with each other. PMID:18350260
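A minimal sketch of this style of analysis (not PAJ itself) is shown below: it compares the mean nearest-neighbour distance of observed 3D points against Monte Carlo realizations of uniformly random points in the same bounding box; the input coordinates and the choice of statistic are assumptions.

```python
# Minimal sketch (not PAJ itself): test 3D point coordinates for clustering by comparing
# the mean nearest-neighbour distance with Monte Carlo realizations of random points.
import numpy as np
from scipy.spatial import cKDTree

def mean_nn_distance(points):
    d, _ = cKDTree(points).query(points, k=2)   # k=2: nearest neighbour other than self
    return d[:, 1].mean()

def monte_carlo_test(points, n_sim=999, seed=0):
    rng = np.random.default_rng(seed)
    observed = mean_nn_distance(points)
    lo, hi = points.min(axis=0), points.max(axis=0)
    sims = [mean_nn_distance(rng.uniform(lo, hi, points.shape)) for _ in range(n_sim)]
    # small p-value: points are closer together than expected at random (clustered)
    p = (1 + sum(s <= observed for s in sims)) / (n_sim + 1)
    return observed, p

coords = np.random.default_rng(2).normal(size=(200, 3))   # hypothetical 3D coordinates
obs, p = monte_carlo_test(coords)
print(f"mean NN distance = {obs:.3f}, Monte Carlo p = {p:.3f}")
```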

Condron, Barry G

2008-01-01

434

Applications of custom developed object based analysis tool: Precipitation in Pacific, Tropical cyclones precipitation, Hail areas  

NASA Astrophysics Data System (ADS)

In the last few years an object-based analysis software tool was developed at the University of Ljubljana in collaboration with the National Center for Atmospheric Research (NCAR). The tool was originally based on ideas from the Method for Object-Based Diagnostic Evaluation (MODE) developed by NCAR but has since evolved and changed considerably and is now available as a separate free software package, called the Forward in Time object analysis tool (FiT tool). The software was used to analyze numerous datasets, mainly focusing on precipitation. A climatology of satellite and model precipitation in the low- and mid-latitude Pacific Ocean was constructed by identifying and tracking individual precipitation systems and estimating their lifespan, movement, and size. A global climatology of tropical cyclone precipitation was produced using satellite data, and areas with hail in Slovenia were tracked and analyzed using radar data. The tool will be presented along with some results of these applications.
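The object-identification step in this kind of analysis can be sketched as follows (this is not the FiT tool itself; the synthetic field and the 90th-percentile threshold are assumptions):

```python
# Minimal sketch of the object-identification step (not the FiT tool itself): threshold
# a precipitation field and label contiguous objects, then report the size of each.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
precip = ndimage.gaussian_filter(rng.random((100, 100)), sigma=3)   # synthetic field

mask = precip > np.percentile(precip, 90)     # keep the strongest 10% of the field
labels, n_objects = ndimage.label(mask)       # contiguous precipitation objects
areas = ndimage.sum(mask, labels, index=range(1, n_objects + 1))
print(n_objects, "objects; grid-point areas:", areas.astype(int))
```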

Skok, Gregor; Rakovec, Jože; Strajnar, Benedikt; Bacmeister, Julio; Tribbia, Joe

2014-05-01

435

Super-resolution thermal ghost imaging based on deconvolution  

NASA Astrophysics Data System (ADS)

The resolution of classical imaging is limited by the Rayleigh diffraction limit, whereas ghost imaging can overcome this limit and enhance the resolution. In this paper, we propose a super-resolution thermal ghost imaging scheme based on deconvolution, which can further improve the imaging resolution. Because the traditional thermal ghost imaging result is the convolution between the original image of the object and the correlation function, a deconvolution algorithm can be used to reduce the effect of the convolution and enhance the imaging resolution. The correlation function of a ghost imaging system plays the role of the point spread function (PSF) of a classical imaging system; however, while the PSF is hard to obtain in a classical imaging system, it is easy to obtain in a ghost imaging system. A deconvolution algorithm can therefore be applied readily in ghost imaging, and in practice the imaging resolution can be increased by a factor of 2-3.
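The core idea can be sketched as follows (not the authors' implementation): the measured correlation function is treated as the PSF and the ghost image is deconvolved with it; Richardson-Lucy is used here only as one standard deconvolution choice, and the Gaussian correlation function and test object are assumptions.

```python
# Minimal sketch: deconvolve a simulated ghost image with the (known) correlation
# function acting as the PSF. Richardson-Lucy is one standard deconvolution choice;
# the Gaussian correlation function and the test object are assumptions.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=30):
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full(blurred.shape, 0.5)
    for _ in range(n_iter):
        conv = fftconvolve(estimate, psf, mode="same")
        estimate *= fftconvolve(blurred / (conv + 1e-12), psf_mirror, mode="same")
    return estimate

x = np.linspace(-3, 3, 21)
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2)); psf /= psf.sum()   # correlation function
obj = np.zeros((64, 64)); obj[28:36, 20:24] = 1.0; obj[28:36, 40:44] = 1.0
ghost_image = fftconvolve(obj, psf, mode="same")    # ghost image = object * correlation fn
recovered = richardson_lucy(ghost_image, psf)       # the two bars become separable again
```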

Chen, Zhipeng; Shi, Jianhong; Li, Yuan; Li, Qing; Zeng, Guihua

2014-07-01

436

Factor analysis as a tool in groundwater quality management: two southern African case studies  

Microsoft Academic Search

Although developed as a tool in the social sciences, R-mode factor analysis, a multivariate statistical technique, has proven highly effective in studies of groundwater quality. The technique examines the relationships between variables (such as chemical parameters in groundwater) as expressed across a number of cases (such as sampling points). In this study, two examples are presented. The first is
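A small illustration of R-mode factor analysis on synthetic hydrochemical data follows (this is not the case-study data; the variable names, factor count, and underlying "salinity"/"pollution" structure are assumptions for demonstration):

```python
# Illustration on synthetic data: R-mode factor analysis of hydrochemical variables with
# scikit-learn; variables loading on the same factor point to a shared likely source.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 120
salinity, pollution = rng.normal(size=n), rng.normal(size=n)
data = pd.DataFrame({
    "Na":  2 * salinity + rng.normal(scale=0.3, size=n),
    "Cl":  2 * salinity + rng.normal(scale=0.3, size=n),
    "NO3": pollution    + rng.normal(scale=0.3, size=n),
    "SO4": pollution    + rng.normal(scale=0.3, size=n),
    "pH":  rng.normal(size=n),
})

X = StandardScaler().fit_transform(data)
fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
loadings = pd.DataFrame(fa.components_.T, index=data.columns,
                        columns=["Factor 1", "Factor 2"])
print(loadings.round(2))
```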

David Love; Dieter Hallbauer; Amos Amos; Roumiana Hranova

2004-01-01

437

The Analysis of User Behaviour of a Network Management Training Tool using a Neural Network  

Microsoft Academic Search

A novel method for the analysis and interpretation of data that describes the interaction between trainee network managers and a network management training tool is presented. A simulation-based approach is currently being used to train network managers through the use of a simulated network. The motivation is to provide a tool for exposing trainees to a life-like situation

Helen DONELAN; Colin PATTINSON; Dominic PALMER-BROWN

438

Ecotoxicological Mechanisms and Models in an Impact Analysis Tool for Oil Spills  

Microsoft Academic Search

In an international collaborative effort, an impact analysis tool is being developed to predict the effect of accidental oil spills on recruitment and production of Atlantic cod (Gadus morhua) in the Barents Sea. The tool consists of three coupled ecological models that describe (1) plankton biomass dynamics, (2) cod larvae growth, and (3) fish stock dynamics. The discussions from a

Frederik De Laender; Gro Harlaug Olsen; Tone Frost; Bjørn Einar Grøsvik; Merete Grung; Bjørn Henrik Hansen; A. Jan Hendriks; Morten Hjorth; Colin R. Janssen; Chris Klok; Trond Nordtug; Mathijs Smit; JoLynn Carroll; Lionel Camus

2011-01-01

439

Army Sustainability Modelling Analysis and Reporting Tool Phase 1: User Manual and Results Interpretation Guide.  

National Technical Information Service (NTIS)

This report is designed to assist users of the Army Sustainability Modelling Analysis and Reporting Tool (A-SMART) in setting up input parameters and scenarios, running the models and interpreting the model outputs. A-SMART is a software tool under contra...

A. Roth; J. Stewien; M. Zucchi; M. K. Richmond; S. Miller

2009-01-01

440

Model Execution and Evaluation Tool: Current Status and Initial MM5 Ensemble Member Analysis Results.  

National Technical Information Service (NTIS)

A web-based mesoscale model analysis tool built with Java Server Pages is under development. With this tool a user is able to (1) run either the MM5 or WRF mesoscale model in stand-alone mode or as an ensemble on a Linux cluster; (2) generate error statis...

S. F. Kirby

2003-01-01

441

An Overview of Some Automated Tools for Formal Analysis and Verification  

Microsoft Academic Search

The article presents several automated tools for performing formal analysis and verification. The choice of tools is restricted to ones that work on concurrent untimed discrete-event systems and exploit state-space searching. For many years computer-aided simulation has been an important technique widely used in science as well as industry. What is more, this is still true

Tomas Vojnar

1998-01-01

442

GASP! A Standardized Performance Analysis Tool Interface for Global Address Space Programming Models  

Microsoft Academic Search

The global address space (GAS) programming model provides important potential productivity advantages over traditional parallel programming models. Languages us