Science.gov

Sample records for deconvolution analysis tool

  1. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
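
    DeConv_Tool itself is distributed as IDL routines; purely as a language-neutral illustration of the maximum-likelihood branch it implements, the sketch below runs a Richardson-Lucy style iteration and prints the residual mean and standard deviation that, per the abstract, the tool monitors during computation. The Gaussian PSF, image size, and iteration count are illustrative assumptions, not values from the package.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
    """Maximum-likelihood (Richardson-Lucy) deconvolution for Poisson-noise data."""
    estimate = np.full(observed.shape, observed.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for i in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        estimate *= fftconvolve(observed / np.maximum(blurred, eps), psf_mirror, mode="same")
        residual = observed - blurred
        # Residual statistics of the kind DeConv_Tool tracks to judge convergence.
        print(f"iter {i + 1:3d}  mean(resid)={residual.mean():+.3e}  std(resid)={residual.std():.3e}")
    return estimate

# Toy example: blur a point source with an assumed Gaussian PSF, add Poisson noise, deconvolve.
rng = np.random.default_rng(0)
truth = np.zeros((64, 64)); truth[32, 32] = 100.0
yy, xx = np.mgrid[-7:8, -7:8]
psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2)); psf /= psf.sum()
observed = rng.poisson(fftconvolve(truth, psf, mode="same").clip(min=0) + 1.0).astype(float)
restored = richardson_lucy(observed, psf)
```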

  2. PVT Analysis With A Deconvolution Algorithm

    SciTech Connect

    Kouzes, Richard T.

    2011-02-01

    Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis algorithm developed by Symetrica, Inc., and they have demonstrated the application of their deconvolution algorithm to PVT with very promising results. The thrust of such a deconvolution algorithm used with PVT is to allow for identification and rejection of naturally occurring radioactive material, reducing alarm rates, rather than the complete identification of all radionuclides, which is the goal of spectroscopic portal monitors. Under this condition, there could be a significant increase in sensitivity to threat materials. The advantage of this approach is an enhancement to the low cost, robust detection capability of PVT-based radiation portal monitor systems. The success of this method could provide an inexpensive upgrade path for a large number of deployed PVT-based systems to provide significantly improved capability at a much lower cost than deployment of NaI(Tl)-based systems of comparable sensitivity.

  3. Two Photon Fluorescence Microscopy and Deconvolution Analysis

    NASA Astrophysics Data System (ADS)

    Nerbun, Claire; Berland, Keith

    2001-10-01

    In two photon fluorescence microscopy, diffraction phenomena cause image distortions. A point spread function (PSF) quantitatively describes such distortions. The PSF, in conjunction with Fourier transforms, is employed in deconvolution analysis in order to perform image restoration. In this experiment, the three dimensional PSF was measured using yellow-green fluorescent spheres of diameter 0.11 microns. Wild type and butanol-supersensitive mutant 7 Saccharomyces cerevisiae yeast cells labeled with rhodamine were three dimensionally imaged. Normal human fibroblast cells were also three dimensionally imaged. Inverse filter and modified residual norm steepest descent deconvolution algorithms were used to perform two dimensional image restorations. The yeast cells were then examined for any abnormal cell morphology, specifically in the actin cytoskeleton. The human fibroblast cells labeled with rhodamine were examined for actin filament structures. Two-dimensional Gaussian profiles were fitted to the PSF to explore the impact of diffraction in two photon fluorescence microscopy. The results of this experiment and the model analyses will be presented.

  4. Poissonian image deconvolution with analysis sparsity priors

    NASA Astrophysics Data System (ADS)

    Fang, Houzhang; Yan, Luxin

    2013-04-01

    Deconvolving Poissonian images has been a significant subject in various application areas such as astronomical, microscopic, and medical imaging. In this paper, a regularization-based approach is proposed to solve Poissonian image deconvolution by minimizing a regularization energy functional composed of the generalized Kullback-Leibler divergence as the data-fidelity term, sparsity prior constraints as the regularization term, and a non-negativity constraint. We consider two sparsity priors: a framelet-based analysis prior and a combination of framelet and total variation analysis priors. Furthermore, we show that the resulting minimization problems can be efficiently solved by the split Bregman method. The comparative experimental results, including quantitative and qualitative analysis, demonstrate that our algorithm can effectively remove blur, suppress noise, and reduce artifacts.
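
    As a minimal, illustrative sketch of the energy functional described above (hypothetical variable names; the framelet analysis term, the non-negativity projection, and the split Bregman solver itself are omitted), the generalized Kullback-Leibler data-fidelity term and a total variation analysis prior can be evaluated as follows:

```python
import numpy as np

def generalized_kl(y, hx, eps=1e-12):
    """Generalized Kullback-Leibler divergence D(y || Hx), the Poisson data-fidelity term."""
    hx = np.maximum(hx, eps)
    y_safe = np.maximum(y, eps)
    return float(np.sum(y * np.log(y_safe / hx) - y + hx))

def total_variation(x):
    """Isotropic total variation of a 2-D image (one of the analysis priors)."""
    gx = np.diff(x, axis=1, append=x[:, -1:])
    gy = np.diff(x, axis=0, append=x[-1:, :])
    return float(np.sum(np.sqrt(gx**2 + gy**2)))

def energy(x, y, blur, lam):
    """Regularization energy: KL(y || blur(x)) + lam * TV(x), to be minimized over x >= 0."""
    return generalized_kl(y, blur(x)) + lam * total_variation(x)

# Tiny usage example with a trivial identity "blur" operator.
rng = np.random.default_rng(1)
x0 = rng.random((32, 32))
print(energy(x0, rng.poisson(50 * x0).astype(float) / 50, blur=lambda z: z, lam=0.1))
```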

  5. MMAD: microarray microdissection with analysis of differences is a computational tool for deconvoluting cell type-specific contributions from tissue samples

    PubMed Central

    Liebner, David A.; Huang, Kun; Parvin, Jeffrey D.

    2014-01-01

    Background: One of the significant obstacles in the development of clinically relevant microarray-derived biomarkers and classifiers is tissue heterogeneity. Physical cell separation techniques, such as cell sorting and laser-capture microdissection, can enrich samples for cell types of interest, but are costly, labor intensive and can limit investigation of important interactions between different cell types. Results: We developed a new computational approach, called microarray microdissection with analysis of differences (MMAD), which performs microdissection in silico. Notably, MMAD (i) allows for simultaneous estimation of cell fractions and gene expression profiles of contributing cell types, (ii) adjusts for microarray normalization bias, (iii) uses the corrected Akaike information criterion during model optimization to minimize overfitting and (iv) provides mechanisms for comparing gene expression and cell fractions between samples in different classes. Computational microdissection of simulated and experimental tissue mixture datasets showed tight correlations between predicted and measured gene expression of pure tissues as well as tight correlations between reported and estimated cell fraction for each of the individual cell types. In simulation studies, MMAD showed superior ability to detect differentially expressed genes in mixed tissue samples when compared with standard metrics, including both significance analysis of microarrays and cell type-specific significance analysis of microarrays. Conclusions: We have developed a new computational tool called MMAD, which is capable of performing robust tissue microdissection in silico, and which can improve the detection of differentially expressed genes. MMAD software as implemented in MATLAB is publicly available for download at http://sourceforge.net/projects/mmad/. Contact: david.liebner@gmail.com Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:24085566
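
    MMAD's full model (normalization-bias correction, AICc-guided model selection, class comparisons) goes beyond a short snippet, but the core in silico microdissection idea, expressing a mixed-tissue profile as a non-negative combination of cell-type signatures, can be sketched as a non-negative least-squares fit. The signature matrix and mixture below are simulated placeholders, not MMAD's actual inputs or algorithm.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

# Simulated pure cell-type expression signatures: genes x cell types.
n_genes, n_types = 500, 3
signatures = rng.gamma(shape=2.0, scale=50.0, size=(n_genes, n_types))

# Simulated mixed-tissue sample with "true" fractions 0.6 / 0.3 / 0.1 plus noise.
true_fractions = np.array([0.6, 0.3, 0.1])
mixture = signatures @ true_fractions + rng.normal(0, 5.0, size=n_genes)

# Non-negative least squares, then renormalize so the fractions sum to one.
coef, _ = nnls(signatures, mixture)
fractions = coef / coef.sum()
print("estimated cell fractions:", np.round(fractions, 3))
```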

  6. Chemometric Data Analysis for Deconvolution of Overlapped Ion Mobility Profiles

    NASA Astrophysics Data System (ADS)

    Zekavat, Behrooz; Solouki, Touradj

    2012-11-01

    We present the details of a data analysis approach for deconvolution of the ion mobility (IM) overlapped or unresolved species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution.
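
    A minimal sketch of the PCA step, assuming the post-IM/CID MS data have already been arranged as an arrival-time x m/z intensity matrix (the interactive self-modeling mixture-analysis step that recovers the TIMS and CID spectra is not shown):

```python
import numpy as np

def pca_rank_estimate(data, var_threshold=0.99):
    """PCA of an arrival-time x m/z matrix; returns scores, loadings, explained
    variance, and the number of components needed to reach var_threshold, a
    rough indicator of how many species co-elute in the mobility peak."""
    centered = data - data.mean(axis=0)
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    explained = (s**2) / np.sum(s**2)
    n_components = int(np.searchsorted(np.cumsum(explained), var_threshold) + 1)
    scores = U * s          # arrival-time scores
    loadings = Vt           # m/z loadings
    return scores, loadings, explained, n_components

# Toy mixture of two overlapping mobility profiles with distinct fragment spectra.
t = np.linspace(0, 1, 200)[:, None]
profile_a = np.exp(-((t - 0.45) / 0.06) ** 2)
profile_b = np.exp(-((t - 0.55) / 0.06) ** 2)
spec_a = np.random.default_rng(2).random(300)
spec_b = np.random.default_rng(3).random(300)
data = profile_a * spec_a + profile_b * spec_b
_, _, explained, k = pca_rank_estimate(data)
print("components explaining 99% of the variance:", k)
```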

  7. Chemometric data analysis for deconvolution of overlapped ion mobility profiles.

    PubMed

    Zekavat, Behrooz; Solouki, Touradj

    2012-11-01

    We present the details of a data analysis approach for deconvolution of the ion mobility (IM) overlapped or unresolved species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution. PMID:22948903

  8. Importance of FTIR Spectra Deconvolution for the Analysis of Amorphous Calcium Phosphates

    NASA Astrophysics Data System (ADS)

    Brangule, Agnese; Gross, Karlis Agris

    2015-03-01

    This work will consider Fourier transform infrared spectroscopy - diffuse reflectance infrared reflection (FTIR-DRIFT) for collecting the spectra and deconvolution to identify changes in bonding as a means of more powerful detection. Spectra were recorded from amorphous calcium phosphate synthesized by wet precipitation, and from bone. FTIR-DRIFT was used to study the chemical environments of PO4, CO3 and amide. Deconvolution of spectra separated overlapping bands in the ν4 PO4, ν2 CO3, ν3 CO3 and amide region, allowing a more detailed analysis of changes at the atomic level. Amorphous calcium phosphate dried at 80 °C, despite showing an X-ray diffraction amorphous structure, displayed carbonate in positions resembling a carbonated hydroxyapatite. Additional peaks were designated as A1 type, A2 type or B type. Deconvolution allowed the separation of CO3 positions in bone from amide peaks. FTIR-DRIFT spectrometry in combination with deconvolution offers an advanced tool for qualitative and quantitative determination of CO3, PO4 and HPO4 and shows promise to measure the degree of order.
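
    Band deconvolution of this kind is commonly carried out by fitting a sum of band shapes to the region of interest; the generic scipy sketch below fits two overlapping Gaussian bands to a synthetic spectrum. The band shapes, wavenumber range, and initial guesses are illustrative assumptions, not the authors' protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, width):
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

def two_bands(x, a1, c1, w1, a2, c2, w2):
    """Two overlapping bands, e.g. in a nu4 PO4 / nu2 CO3 region."""
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

# Synthetic spectrum: two overlapped bands plus noise (wavenumbers in cm^-1).
x = np.linspace(540, 640, 400)
rng = np.random.default_rng(4)
y = two_bands(x, 1.0, 563, 8.0, 0.7, 601, 10.0) + rng.normal(0, 0.02, x.size)

p0 = [0.8, 560, 10, 0.5, 600, 10]                  # initial guesses
popt, pcov = curve_fit(two_bands, x, y, p0=p0)
areas = popt[0] * abs(popt[2]) * np.sqrt(2 * np.pi), popt[3] * abs(popt[5]) * np.sqrt(2 * np.pi)
print("fitted band centers:", popt[1], popt[4], " band areas:", areas)
```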

  9. Deconvolution Program

    Energy Science and Technology Software Center (ESTSC)

    1999-02-18

    The program is suitable for many applications in applied mathematics, experimental physics, signal-analysis systems, and some engineering applications, e.g., spectrum deconvolution, signal analysis, and system property analysis.

  10. Multispectral imaging analysis: spectral deconvolution and applications in biology

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas; Ahmed, Wamiq; Bayraktar, Bulent; Rajwa, Bartek; Sturgis, Jennifer; Robinson, J. P.

    2005-03-01

    Multispectral imaging has been in use for over half a century. Owing to advances in digital photographic technology, multispectral imaging is now used in settings ranging from clinical medicine to industrial quality control. Our efforts focus on the use of multispectral imaging coupled with spectral deconvolution for measurement of endogenous tissue fluorophores and for animal tissue analysis by multispectral fluorescence, absorbance, and reflectance data. Multispectral reflectance and fluorescence images may be useful in evaluation of pathology in histological samples. For example, current hematoxylin/eosin diagnosis limits spectral analysis to shades of red and blue/grey. It is possible to extract much more information using multispectral techniques. To collect this information, a series of filters or a device such as an acousto-optical tunable filter (AOTF) or liquid-crystal filter (LCF) can be used with a CCD camera, enabling collection of images at many more wavelengths than is possible with a simple filter wheel. In multispectral data processing, the "unmixing" of reflectance or fluorescence data and the subsequent classification based upon these spectra are required. In addition to multispectral techniques, extraction of topological information may be possible by reflectance deconvolution or multiple-angle imaging, which could aid in accurate diagnosis of skin lesions or isolation of specific biological components in tissue. The goal of these studies is to develop spectral signatures that will provide us with specific and verifiable tissue structure/function information. In addition, relatively complex classification techniques must be developed so that the data are of use to the end user.
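
    A minimal sketch of the linear spectral "unmixing" step described above, applied per pixel to a multispectral image cube with a known matrix of reference (endmember) spectra; the cube and spectra here are synthetic placeholders:

```python
import numpy as np

def unmix_cube(cube, endmembers):
    """Least-squares spectral unmixing.
    cube:        (rows, cols, n_bands) multispectral image
    endmembers:  (n_bands, n_sources) reference spectra
    returns      (rows, cols, n_sources) abundance maps"""
    rows, cols, n_bands = cube.shape
    pixels = cube.reshape(-1, n_bands).T                  # (n_bands, n_pixels)
    abundances, *_ = np.linalg.lstsq(endmembers, pixels, rcond=None)
    return abundances.T.reshape(rows, cols, -1)

# Synthetic 2-fluorophore example on a 32x32 image with 8 spectral bands.
rng = np.random.default_rng(5)
endmembers = rng.random((8, 2))
true_maps = rng.random((32, 32, 2))
cube = true_maps @ endmembers.T + rng.normal(0, 0.01, (32, 32, 8))
maps = unmix_cube(cube, endmembers)
print("max abundance error:", np.abs(maps - true_maps).max())
```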

  11. Quantitative polymerase chain reaction analysis by deconvolution of internal standard

    PubMed Central

    2010-01-01

    Background Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. Results We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. Conclusions This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results. PMID:20429911

  12. Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade, stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers caused by the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. Here, the induced change in the natural isotopic composition of an element allows, among other things, for studying the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag of particular biological samples. Due to the broad potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, e.g. because Sr is available in significant amounts in natural samples. In addition, biological systems underlie complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer incorporated into the natural sample matrix, as well as the degree of impurities and species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer studies is critically discussed, with special emphasis on evaluating different data processing strategies on the example of enriched stable Sr isotopes [1]. The analytical key parameters such as blank (Kr, Sr and Rb), variation of the natural Sr isotopic composition in the sample, mass bias, interferences (Rb) and total combined uncertainty are considered. A full metrological protocol for data processing using IPD is presented, based on data gained during two transgenerational marking studies of fish in which the transfer of a Sr isotope double spike (84Sr and 86Sr) from female spawners of common carp (Cyprinus carpio L.) and brown trout (Salmo trutta f.f.) [2] to the centre of the otoliths of their offspring was studied by (LA)-MC-ICP-MS. [1] J. Irrgeher, A. Zitek, M. Cervicek and T. Prohaska, J. Anal. At. Spectrom., 2014, 29, 193-200. [2] A. Zitek, J. Irrgeher, M. Kletzl, T. Weismann and T. Prohaska, Fish. Manage. Ecol., 2013, 20, 654-361.

  13. Improving the precision of fMRI BOLD signal deconvolution with implications for connectivity analysis.

    PubMed

    Bush, Keith; Cisler, Josh; Bian, Jiang; Hazaroglu, Gokce; Hazaroglu, Onder; Kilts, Clint

    2015-12-01

    An important, open problem in neuroimaging analyses is developing analytical methods that ensure precise inferences about neural activity underlying fMRI BOLD signal despite the known presence of confounds. Here, we develop and test a new meta-algorithm for conducting semi-blind (i.e., no knowledge of stimulus timings) deconvolution of the BOLD signal that estimates, via bootstrapping, both the underlying neural events driving BOLD as well as the confidence of these estimates. Our approach includes two improvements over the current best-performing deconvolution approach: (1) we optimize the parametric form of the deconvolution feature space; and (2) we pre-classify neural event estimates into two subgroups, either known or unknown, based on the confidence of the estimates prior to conducting neural event classification. This knows-what-it-knows approach significantly improves neural event classification over the current best-performing algorithm, as tested in a detailed computer simulation of highly confounded fMRI BOLD signal. We then implemented a massively parallelized version of the bootstrapping-based deconvolution algorithm and executed it on a high-performance computer to conduct large-scale (i.e., voxelwise) estimation of the neural events for a group of 17 human subjects. We show that by restricting the computation of inter-regional correlation to include only those neural events estimated with high confidence, the method appeared to have higher sensitivity for identifying the default mode network compared to a standard BOLD signal correlation analysis when compared across subjects. PMID:26226647
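
    The bootstrapped, semi-blind meta-algorithm in the paper is more involved than a snippet allows, but the underlying deconvolution problem, recovering a latent neural event series whose convolution with a hemodynamic response function (HRF) reproduces the BOLD time series, can be sketched as ridge-regularized least squares against a Toeplitz HRF matrix. The double-gamma HRF parameters, TR, and regularization weight below are common illustrative choices, not the authors' settings.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.stats import gamma

def canonical_hrf(tr=2.0, duration=32.0):
    """Double-gamma hemodynamic response function sampled at the TR."""
    t = np.arange(0, duration, tr)
    hrf = gamma.pdf(t, a=6) - 1.0 / 6.0 * gamma.pdf(t, a=16)
    return hrf / hrf.max()

def deconvolve_bold(bold, hrf, lam=1.0):
    """Ridge-regularized deconvolution of neural events from a BOLD time series."""
    n = bold.size
    col = np.zeros(n); col[:hrf.size] = hrf
    H = toeplitz(col, np.zeros(n))                      # convolution matrix
    return np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ bold)

# Synthetic demo: sparse neural events -> BOLD -> deconvolved estimate.
rng = np.random.default_rng(6)
events = np.zeros(200); events[[20, 60, 61, 140]] = 1.0
hrf = canonical_hrf()
bold = np.convolve(events, hrf)[:200] + rng.normal(0, 0.05, 200)
estimate = deconvolve_bold(bold, hrf, lam=0.5)
print("indices of the largest estimated events:", np.argsort(estimate)[-4:])
```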

  14. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, the change in AE signal characteristics can be better interpreted as the result of changes in the state of the process alone. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of AE signals generated during the stretching became more distinctive and could be more effectively used as tools for process monitoring.
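
    A generic sketch of the frequency-domain step described above: divide the measured AE spectrum by the instrumentation transfer function, with a small water-level floor to keep the inverse stable where the system response is weak. The sensor impulse response, sampling rate, and water level here are assumptions for the synthetic example; the paper identifies the real transfer functions from the instrumentation.

```python
import numpy as np

def frequency_domain_deconvolve(measured, system_impulse_response, water_level=0.05):
    """Remove the instrumentation 'coloring' by spectral division.
    water_level regularizes frequencies where the system response is weak."""
    n = len(measured)
    M = np.fft.rfft(measured, n)
    H = np.fft.rfft(system_impulse_response, n)
    floor = water_level * np.abs(H).max()
    H_reg = np.where(np.abs(H) < floor, floor * np.exp(1j * np.angle(H)), H)
    return np.fft.irfft(M / H_reg, n)

# Synthetic example: a broadband AE burst colored by a resonant sensor response.
fs = 1.0e6                                             # 1 MHz sampling rate (assumed)
t = np.arange(2048) / fs
source = np.exp(-t * 2e4) * np.random.default_rng(7).normal(size=t.size)
system = np.exp(-t * 1e4) * np.sin(2 * np.pi * 1.5e5 * t)   # assumed sensor response
measured = np.convolve(source, system)[: t.size]
recovered = frequency_domain_deconvolve(measured, system)
```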

  15. Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function

    SciTech Connect

    Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.

    1987-06-01

    A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog (99mTc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided for by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing are able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.

  16. Application of spectral deconvolution and inverse mechanistic modelling as a tool for root cause investigation in protein chromatography.

    PubMed

    Brestrich, Nina; Hahn, Tobias; Hubbuch, Jürgen

    2016-03-11

    In chromatographic protein purification, process variations, aging of columns, or processing errors can lead to deviations of the expected elution behavior of product and contaminants and can result in a decreased pool purity or yield. A different elution behavior of all or several involved species leads to a deviating chromatogram. The causes for deviations are however hard to identify by visual inspection and complicate the correction of a problem in the next cycle or batch. To overcome this issue, a tool for root cause investigation in protein chromatography was developed. The tool combines a spectral deconvolution with inverse mechanistic modelling. Mid-UV spectral data and Partial Least Squares Regression were first applied to deconvolute peaks to obtain the individual elution profiles of co-eluting proteins. The individual elution profiles were subsequently used to identify errors in process parameters by curve fitting to a mechanistic chromatography model. The functionality of the tool for root cause investigation was successfully demonstrated in a model protein study with lysozyme, cytochrome c, and ribonuclease A. Deviating chromatograms were generated by deliberately caused errors in the process parameters flow rate and sodium-ion concentration in loading and elution buffer according to a design of experiments. The actual values of the three process parameters and, thus, the causes of the deviations were estimated with errors of less than 4.4%. Consequently, the established tool for root cause investigation is a valuable approach to rapidly identify process variations, aging of columns, or processing errors. This might help to minimize batch rejections or contribute to an increased productivity. PMID:26879457

  17. A further analysis for the minimum-variance deconvolution filter performance

    NASA Technical Reports Server (NTRS)

    Chi, Chong-Yung

    1987-01-01

    Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.
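
    The limiting behavior summarized above can be illustrated numerically with the closely related Wiener deconvolution filter, whose response conj(H) Sx / (|H|^2 Sx + Sn) tends to the inverse filter 1/H as SNR grows and to a matched-filter-like response proportional to conj(H) as SNR vanishes. This is a stand-in illustration of the same qualitative result, not the state-space MVD derivation used in the correspondence; the wavelet h and SNR values are arbitrary assumptions.

```python
import numpy as np

# Frequency response of a toy wavelet (channel) h.
h = np.array([1.0, -0.6, 0.2])
H = np.fft.rfft(h, 256)

def wiener_response(H, snr):
    """Wiener deconvolution response for white signal/noise spectra with the given SNR."""
    return np.conj(H) * snr / (np.abs(H) ** 2 * snr + 1.0)

for snr in (1e6, 1e-6):
    W = wiener_response(H, snr)
    ref_inverse = 1.0 / H                     # inverse-filter reference
    ref_matched = np.conj(H) * snr            # matched-filter-like reference, up to a gain
    err_inv = np.max(np.abs(W - ref_inverse) / np.abs(ref_inverse))
    err_mat = np.max(np.abs(W - ref_matched) / np.maximum(np.abs(ref_matched), 1e-30))
    print(f"SNR={snr:g}: rel. deviation from inverse filter {err_inv:.2e}, "
          f"from matched filter {err_mat:.2e}")
```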

  18. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    NASA Astrophysics Data System (ADS)

    Madavarapu, Sateesh Babu

    Studies of aluminosilicate materials to replace traditional construction materials such as ordinary Portland cement (OPC), and so reduce their associated effects, have been an important research area for the past decades. Many properties, such as strength, have already been studied, and the primary focus is to learn about the reaction mechanism and the effect of the parameters on the formed products. The aim of this research was to explore the structural changes and reaction product analysis of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at the molecular level, but not all methods are economical and simple. To understand the mechanisms of alkali-activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used, and the effect of the parameters on the reaction products has been analyzed. To analyze complex systems like geopolymers using FTIR, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time- and temperature-dependent analyses were done on slag pastes to understand the polymerization of reactive silica in the system with time and temperature variation. For the time-dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature-dependent analysis was done by curing the samples at 60 °C and 80 °C. Similarly, fly ash was studied by activating it with alkali hydroxides and alkali silicates. Under the same curing conditions, the fly ash samples were evaluated to analyze the effects of added silicates on alkali activation. The peak shifts in the FTIR spectra explain the changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentration of silicate monomer in the activating solution and the position of the main Si-O-T (where T is Al/Si) stretching band in the FTIR spectrum, which gives an indication of the relative changes in the Si/Al ratio. Also, the effect of the cation and silicate concentration in the activating solution has been discussed using the Fourier self-deconvolution technique.

  19. Radionuclide quantitation of left-to-right cardiac shunts using deconvolution analysis: concise communication

    SciTech Connect

    Ham, H.R.; Dobbeleir, A.; Viart, P.; Piepsz, A.; Lenaers, A.

    1981-08-01

    Quantitative radionuclide angiocardiography (QRAC) was performed with and without deconvolution analysis (DA) in 87 children with various heart disorders. QRAC shunt quantitation was possible without DA in 70% of the cases and with DA in 95%. Among 21 patients with prolonged bolus injections, quantitation of the shunt was possible in 52% of the cases without DA and in all cases with DA. Correlation between oximetry and QRAC with DA was better than between oximetry and QRAC without DA. It is concluded that QRAC with DA is a more reliable, noninvasive means for detection and quantitation of left-to-right cardiac shunts than QRAC without DA.

  20. Investigation of the CLEAN deconvolution method for use with Late Time Response analysis of multiple objects

    NASA Astrophysics Data System (ADS)

    Hutchinson, Simon; Taylor, Christopher T.; Fernando, Michael; Andrews, David; Bowring, Nicholas

    2014-10-01

    This paper investigates the application of the CLEAN non-linear deconvolution method to Late Time Response (LTR) analysis for detecting multiple objects in Concealed Threat Detection (CTD). When an Ultra-Wide Band (UWB) frequency radar signal is used to illuminate a conductive target, surface currents are induced upon the object, which in turn give rise to LTR signals. These signals are re-radiated from the target, and the results from a number of targets are presented. The experiment was performed using a double-ridged horn antenna in a pseudo-monostatic arrangement. A vector network analyser (VNA) has been used to provide the UWB Frequency Modulated Continuous Wave (FMCW) radar signal. The distance between the transmitting antenna and the target objects has been kept at 1 metre for all the experiments performed, and the power level at the VNA was set to 0 dBm. The targets in the experimental setup are suspended in air in a laboratory environment. Matlab has been used in post-processing to perform linear and non-linear deconvolution of the signal. The Wiener filter, Fast Fourier Transform (FFT) and Continuous Wavelet Transform (CWT) are used to process the return signals and extract the LTR features from the noise clutter. A Generalized Pencil-of-Function (GPOF) method was then used to extract the complex poles of the signal. Artificial Neural Networks (ANN) and Linear Discriminant Analysis (LDA) have been used to classify the data.
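
    A minimal 1-D sketch of the CLEAN loop referred to above: repeatedly locate the strongest response in the "dirty" signal, subtract a scaled copy of the known response centred at that sample, and record the component. The loop gain, stopping threshold, and synthetic signals are illustrative assumptions, not the parameters used in the paper.

```python
import numpy as np

def clean_1d(dirty, psf, gain=0.1, threshold=0.01, max_iter=500):
    """Hogbom-style CLEAN on a 1-D signal with a known, centered PSF."""
    residual = dirty.astype(float).copy()
    components = np.zeros_like(residual)
    half = len(psf) // 2
    for _ in range(max_iter):
        k = int(np.argmax(np.abs(residual)))
        peak = residual[k]
        if np.abs(peak) < threshold * np.abs(dirty).max():
            break
        components[k] += gain * peak
        # Subtract the scaled PSF centred on the detected peak.
        lo, hi = max(0, k - half), min(len(residual), k + half + 1)
        residual[lo:hi] -= gain * peak * psf[lo - (k - half): hi - (k - half)]
    return components, residual

# Two closely spaced "targets" convolved with a broad response.
x = np.zeros(256); x[100] = 1.0; x[115] = 0.6
psf = np.exp(-0.5 * (np.arange(-32, 33) / 6.0) ** 2)
dirty = np.convolve(x, psf, mode="same")
comps, res = clean_1d(dirty, psf / psf.max())
print("recovered component locations:", np.nonzero(comps)[0][:10])
```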

  1. MORESANE: MOdel REconstruction by Synthesis-ANalysis Estimators. A sparse deconvolution algorithm for radio interferometric imaging

    NASA Astrophysics Data System (ADS)

    Dabbech, A.; Ferrari, C.; Mary, D.; Slezak, E.; Smirnov, O.; Kenyon, J. S.

    2015-04-01

    Context. Recent years have seen huge developments of radio telescopes and a tremendous increase in their capabilities (sensitivity, angular and spectral resolution, field of view, etc.). Such systems make designing more sophisticated techniques mandatory not only for transporting, storing, and processing this new generation of radio interferometric data, but also for restoring the astrophysical information contained in such data. Aims. In this paper we present a new radio deconvolution algorithm named MORESANE and its application to fully realistic simulated data of MeerKAT, one of the SKA precursors. This method has been designed for the difficult case of restoring diffuse astronomical sources that are faint in brightness, complex in morphology, and possibly buried in the dirty beam's side lobes of bright radio sources in the field. Methods. MORESANE is a greedy algorithm that combines complementary types of sparse recovery methods in order to reconstruct the most appropriate sky model from observed radio visibilities. A synthesis approach is used for reconstructing images, in which the synthesis atoms representing the unknown sources are learned using analysis priors. We applied this new deconvolution method to fully realistic simulations of the radio observations of a galaxy cluster and of an HII region in M 31. Results. We show that MORESANE is able to efficiently reconstruct images composed of a wide variety of sources (compact point-like objects, extended tailed radio galaxies, low-surface brightness emission) from radio interferometric data. Comparisons with state-of-the-art algorithms indicate that MORESANE provides competitive results in terms of both the total flux/surface brightness conservation and fidelity of the reconstructed model. MORESANE seems particularly well suited to recovering diffuse and extended sources, as well as bright and compact radio sources known to be hosted in galaxy clusters.

  2. Fried deconvolution

    NASA Astrophysics Data System (ADS)

    Gilles, Jérôme; Osher, Stanley

    2012-06-01

    In this paper we present a new approach to deblur the effect of atmospheric turbulence in the case of long-range imaging. Our method is based on an analytical formulation, the Fried kernel, of the atmosphere modulation transfer function (MTF) and a framelet-based deconvolution algorithm. An important parameter is the refractive index structure constant, which normally requires specific measurements to be known. We therefore propose a method which provides a good estimate of this parameter from the input blurred image itself. The final algorithms are very easy to implement and show very good results on both simulated blur and real images.

  3. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  4. Error analysis of tumor blood flow measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis.

    PubMed

    Murase, Kenya; Miyazaki, Shohei

    2007-05-21

    We performed error analysis of tumor blood flow (TBF) measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis, based on computer simulations. For analysis, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) consisting of gamma-variate functions using an adiabatic approximation to the tissue homogeneity model under various plasma flow (Fp), mean capillary transit time (Tc), permeability-surface area product (PS) and signal-to-noise ratio (SNR) values. Deconvolution analyses based on truncated singular value decomposition with a fixed threshold value (TSVD-F), with an adaptive threshold value (TSVD-A) and with the threshold value determined by generalized cross validation (TSVD-G) were used to estimate Fp values from the simulated concentration-time curves in the VOI and AIF. First, we investigated the relationship between the optimal threshold value and SNR in TSVD-F, and then derived the equation describing the relationship between the threshold value and SNR for TSVD-A. Second, we investigated the dependences of the estimated Fp values on Tc, PS, the total duration for data acquisition and the shape of AIF. Although TSVD-F with a threshold value of 0.025, TSVD-A with the threshold value determined by the equation derived in this study and TSVD-G could estimate the Fp values in a similar manner, the standard deviation of the estimates was the smallest and largest for TSVD-A and TSVD-G, respectively. PS did not largely affect the estimates, while Tc did in all methods. Increasing the total duration significantly improved the variations in the estimates in all methods. TSVD-G was most sensitive to the shape of AIF, especially when the total duration was short. In conclusion, this study will be useful for understanding the reliability and limitation of model-independent deconvolution analysis when applied to TBF measurement using an extravascular contrast agent. PMID:17473352

  5. Error analysis of tumor blood flow measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis

    NASA Astrophysics Data System (ADS)

    Murase, Kenya; Miyazaki, Shohei

    2007-05-01

    We performed error analysis of tumor blood flow (TBF) measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis, based on computer simulations. For analysis, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) consisting of gamma-variate functions using an adiabatic approximation to the tissue homogeneity model under various plasma flow (Fp), mean capillary transit time (Tc), permeability-surface area product (PS) and signal-to-noise ratio (SNR) values. Deconvolution analyses based on truncated singular value decomposition with a fixed threshold value (TSVD-F), with an adaptive threshold value (TSVD-A) and with the threshold value determined by generalized cross validation (TSVD-G) were used to estimate Fp values from the simulated concentration-time curves in the VOI and AIF. First, we investigated the relationship between the optimal threshold value and SNR in TSVD-F, and then derived the equation describing the relationship between the threshold value and SNR for TSVD-A. Second, we investigated the dependences of the estimated Fp values on Tc, PS, the total duration for data acquisition and the shape of AIF. Although TSVD-F with a threshold value of 0.025, TSVD-A with the threshold value determined by the equation derived in this study and TSVD-G could estimate the Fp values in a similar manner, the standard deviation of the estimates was the smallest and largest for TSVD-A and TSVD-G, respectively. PS did not largely affect the estimates, while Tc did in all methods. Increasing the total duration significantly improved the variations in the estimates in all methods. TSVD-G was most sensitive to the shape of AIF, especially when the total duration was short. In conclusion, this study will be useful for understanding the reliability and limitation of model-independent deconvolution analysis when applied to TBF measurement using an extravascular contrast agent.
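
    For illustration, a minimal TSVD-F-style sketch (fixed relative threshold) of the model-independent deconvolution step: build the convolution matrix of the AIF, truncate its small singular values, recover the flow-scaled residue function, and take Fp as its peak. The gamma-variate AIF, exponential residue function, time step, and threshold are synthetic placeholders in the spirit of the simulations, not the authors' exact settings.

```python
import numpy as np

def tsvd_deconvolve(aif, tissue, dt, rel_threshold=0.025):
    """Model-independent deconvolution C_voi(t) = Fp * (AIF * R)(t) via truncated SVD."""
    n = len(aif)
    # Lower-triangular convolution matrix of the AIF (discrete convolution, step dt).
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
    impulse = Vt.T @ (s_inv * (U.T @ tissue))      # estimate of Fp * R(t)
    return impulse, impulse.max()                  # flow estimate = peak of Fp * R(t)

# Synthetic example: gamma-variate AIF, exponential residue function, known Fp.
dt = 1.0
t = np.arange(60) * dt
aif = (t / 6.0) ** 3 * np.exp(-t / 2.0)
Fp_true, Tc = 0.02, 8.0
residue = np.exp(-t / Tc)
tissue = dt * np.convolve(aif, Fp_true * residue)[: t.size]
impulse, Fp_est = tsvd_deconvolve(aif, tissue, dt)
print(f"true Fp={Fp_true}, TSVD estimate={Fp_est:.4f}")
```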

  6. Demand Response Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2012-03-01

    Demand Response Analysis Tool is a software tool developed at the Lawrence Berkeley National Laboratory. It was initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool with standardized methods to evaluate demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  7. Demand Response Analysis Tool

    SciTech Connect

    2012-03-01

    Demand Response Analysis Tool is a software tool developed at the Lawrence Berkeley National Laboratory. It was initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool with standardized methods to evaluate demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  8. AIRY-LN: an ad-hoc numerical tool for deconvolution of images from the LBT instrument LINC-NIRVANA

    NASA Astrophysics Data System (ADS)

    Desiderà, Gabriele; La Camera, Andrea; Boccacci, Patrizia; Bertero, Mario; Carbillet, Marcel

    2008-07-01

    LINC-NIRVANA (LN) is the German-Italian Fizeau beam combiner for the Large Binocular Telescope (LBT), composed of two 8.4-m apertures on a single mount. It will provide multiple images of the same astrophysical target corresponding to different orientations of the 22.8-m maximum baseline. Starting from the already existing Software Package AIRY (a set of IDL-based modules developed within the CAOS "system" and dedicated to simulation and/or deconvolution of single or multiple images), an ad-hoc version has been especially designed for the data that will be obtained with LN. In this paper, we present the resulting Software Package AIRY-LN. Its capabilities, including quick-look methods, methods for specific classes of astronomical objects, PSF extraction, and a blind deconvolution algorithm, are detailed. An IDL-licence-free (by means of the IDL Virtual Machine) and observer-oriented version of the whole package (with pre-set LN image processing parameters) is also presented.

  9. Motion correction of PET brain images through deconvolution: I. Theoretical development and analysis in software simulations

    NASA Astrophysics Data System (ADS)

    Faber, T. L.; Raghunath, N.; Tudorascu, D.; Votaw, J. R.

    2009-02-01

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. Existing correction methods that use known patient motion obtained from tracking devices either require multi-frame acquisitions, detailed knowledge of the scanner, or specialized reconstruction algorithms. A deconvolution algorithm has been developed that alleviates these drawbacks by using the reconstructed image to estimate the original non-blurred image using maximum likelihood expectation maximization (MLEM) techniques. A high-resolution digital phantom was created by shape-based interpolation of the digital Hoffman brain phantom. Three different sets of 20 movements were applied to the phantom. For each frame of the motion, sinograms with attenuation and three levels of noise were simulated and then reconstructed using filtered backprojection. The average of the 20 frames was considered the motion-blurred image, which was restored with the deconvolution algorithm. After correction, contrast increased from a mean of 2.0, 1.8 and 1.4 in the motion-blurred images, for the three increasing amounts of movement, to a mean of 2.5, 2.4 and 2.2. Mean error was reduced by an average of 55% with motion correction. In conclusion, deconvolution can be used for correction of motion blur when subject motion is known.

  10. Rupture behaviors of the 2011 Tohoku earthquake and its strongest foreshock through an empirical Green's function deconvolution analysis

    NASA Astrophysics Data System (ADS)

    Wen, Yi-Ying

    2014-02-01

    An empirical Green's function (EGF) deconvolution analysis was applied to study the source characteristics of the 2011 Mw 9.0 Tohoku and 2011 Mw 7.4 Sanriku-Oki earthquakes. For the 2011 Tohoku earthquake, we demonstrate that nucleation released weak but high-frequency energy and that the rupture propagated downward and sped toward the deep region after the up-dip slip extended to the trench. Moreover, the 2011 Sanriku-Oki earthquake results and a previous study on the 1994 Sanriku-Oki earthquake suggest that large earthquakes in the subduction zone around the Tohoku area prefer to rapidly rupture toward the deeper (down-dip) region.

  11. Radionuclide quantitation of left-to-right cardiac shunts using deconvolution analysis: concise communication. [Tc-99m]

    SciTech Connect

    Ham, H.R.; Dobbeleir, A.; Viart, P.; Piepsz, A; Lenaers, A.

    1981-08-01

    Quantitative radionuclide angiocardiography (QRAC) was performed with and without deconvolution analysis (DA) in 87 children with various heart disorders. QRAC shunt quantitation was possible without DA in 70% of the cases and with DA in 95%. Among 21 patients with prolonged bolus injection, quantitation of the shunt was possible in 52% of the cases without DA and in all cases with DA. Correlation between oximetry and QRAC with DA was better than between oximetry and QRAC without DA. It is concluded that QRAC with DA is a more reliable, noninvasive means for detection and quantitation of left-to-right cardiac shunts than QRAC without DA.

  12. Streaming Multiframe Deconvolutions on GPUs

    NASA Astrophysics Data System (ADS)

    Lee, M. A.; Budavári, T.

    2015-09-01

    Atmospheric turbulence distorts all ground-based observations, which is especially detrimental to faint detections. The point spread function (PSF) defining this blur is unknown for each exposure and varies significantly over time, making image analysis difficult. Lucky imaging and traditional co-adding throw away lots of information. We developed blind deconvolution algorithms that can simultaneously obtain robust solutions for the background image and all the PSFs. It is done in a streaming setting, which makes it practical for a large number of big images. We implemented a new tool that runs on GPUs and achieves exceptional running times that can scale to the new time-domain surveys. Our code can quickly and effectively recover high-resolution images exceeding the quality of traditional co-adds. We demonstrate the power of the method on the repeated exposures in the Sloan Digital Sky Survey's Stripe 82.

  13. ATAMM analysis tool

    NASA Technical Reports Server (NTRS)

    Jones, Robert; Stoughton, John; Mielke, Roland

    1991-01-01

    Diagnostics software for analyzing Algorithm to Architecture Mapping Model (ATAMM) based concurrent processing systems is presented. ATAMM is capable of modeling the execution of large grain algorithms on distributed data flow architectures. The tool graphically displays algorithm activities and processor activities for evaluation of the behavior and performance of an ATAMM based system. The tool's measurement capabilities indicate computing speed, throughput, concurrency, resource utilization, and overhead. Evaluations are performed on a simulated system using the software tool. The tool is used to estimate theoretical lower bound performance. Analysis results are shown to be comparable to the predictions.

  14. TSVD analysis of Euler deconvolution to improve estimating magnetic source parameters: An example from the Åsele area, Sweden

    NASA Astrophysics Data System (ADS)

    Beiki, Majid

    2013-03-01

    In this paper, I introduce a new approach based on truncated singular value decomposition (TSVD) analysis for improving the implementation of grid-based Euler deconvolution with constraints of quasi-2D magnetic sources. I show that by using TSVD analysis of the gradient matrix of the magnetic field anomaly (reduced to pole) for data points located within a square window centered at the maximum of the analytic signal amplitude, we are able to estimate the strike direction and dip angle of 2D structures from the acquired eigenvectors. It is also shown that implementation of the standard grid-based Euler deconvolution can be considerably improved by solving Euler's homogeneity equation for source location and structural index simultaneously, using the TSVD method. The dimensionality of the magnetic anomalies can be indicated by the ratio between the smallest and intermediate eigenvalues acquired from the TSVD analysis of the gradient matrix. For 2D magnetic sources, the uncertainty of the estimated source location and structural index is significantly reduced by truncating the smallest eigenvalue. Application of the method is demonstrated on an aeromagnetic data set from the Åsele area in Sweden. The geology of this area is dominated by several dike swarms. For these dolerite dikes, the introduced method has provided useful information on strike directions and dip angles in addition to the estimated source location and structural index.
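
    The paper's contribution is estimating the structural index together with the location; as a simplified, fixed-structural-index sketch of the windowed Euler solve with a truncated-SVD pseudo-inverse (synthetic field and gradients over a hypothetical window of grid points), Euler's homogeneity equation can be rearranged into the linear system solved below:

```python
import numpy as np

def euler_window_solve(x, y, z, T, Tx, Ty, Tz, N, rel_threshold=1e-3):
    """Solve Euler's homogeneity equation for source location (x0, y0, z0) and
    background field B within one data window, with a fixed structural index N,
    using a truncated-SVD pseudo-inverse.
      (x - x0)Tx + (y - y0)Ty + (z - z0)Tz = N (B - T)
    rearranged as  [Tx Ty Tz N] [x0 y0 z0 B]^T = x Tx + y Ty + z Tz + N T."""
    A = np.column_stack([Tx, Ty, Tz, np.full_like(Tx, float(N))])
    b = x * Tx + y * Ty + z * Tz + N * T
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ b))            # [x0, y0, z0, B]

# Synthetic test: T = 1/r is homogeneous of degree -1, i.e. structural index N = 1.
xs, ys = np.meshgrid(np.arange(-5, 6, 1.0), np.arange(-5, 6, 1.0))
zs = np.zeros_like(xs)
src = np.array([1.0, -2.0, 4.0])                 # assumed true source position
dx, dy, dz = xs - src[0], ys - src[1], zs - src[2]
r = np.sqrt(dx**2 + dy**2 + dz**2)
T = 1.0 / r
Tx, Ty, Tz = -dx / r**3, -dy / r**3, -dz / r**3
sol = euler_window_solve(xs.ravel(), ys.ravel(), zs.ravel(),
                         T.ravel(), Tx.ravel(), Ty.ravel(), Tz.ravel(), N=1)
print("estimated [x0, y0, z0, B]:", np.round(sol, 3))
```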

  15. MS-DIAL: Data Independent MS/MS Deconvolution for Comprehensive Metabolome Analysis

    PubMed Central

    Tsugawa, Hiroshi; Cajka, Tomas; Kind, Tobias; Ma, Yan; Higgins, Brendan; Ikeda, Kazutaka; Kanazawa, Mitsuhiro; VanderGheynst, Jean; Fiehn, Oliver; Arita, Masanori

    2015-01-01

    Data-independent acquisition (DIA) in liquid chromatography tandem mass spectrometry (LC-MS/MS) provides more comprehensive untargeted acquisition of molecular data. Here we provide an open-source software pipeline, MS-DIAL, to demonstrate how DIA improves simultaneous identification and quantification of small molecules by mass spectral deconvolution. For reversed-phase LC-MS/MS, our program with an enriched LipidBlast library identified a total of 1,023 lipid compounds from nine algal strains to highlight their chemotaxonomic relationships. PMID:25938372

  16. Combining deconvolution and noise analysis for the estimation of transmitter release rates at the calyx of Held.

    PubMed

    Neher, E; Sakaba, T

    2001-01-15

    The deconvolution method has been used in the past to estimate release rates of synaptic vesicles, but it cannot be applied to synapses where nonlinear interactions of quanta occur. We have extended this method to take into account a nonlinear current component resulting from the delayed clearance of glutamate from the synaptic cleft. We applied it to the calyx of Held and verified the important assumption of constant miniature EPSC (mEPSC) size by combining deconvolution with a variant of nonstationary fluctuation analysis. We found that amplitudes of mEPSCs decreased strongly after extended synaptic activity. Cyclothiazide (CTZ), an inhibitor of glutamate receptor desensitization, eliminated this reduction, suggesting that postsynaptic receptor desensitization occurs during strong synaptic activity at the calyx of Held. Constant mEPSC sizes could be obtained in the presence of CTZ and kynurenic acid (Kyn), a low-affinity blocker of AMPA-receptor channels. CTZ and Kyn prevented postsynaptic receptor desensitization and saturation and also minimized voltage-clamp errors. Therefore, we conclude that in the presence of these drugs, release rates at the calyx of Held can be reliably estimated over a wide range of conditions. Moreover, the method presented should provide a convenient way to study the kinetics of transmitter release at other synapses. PMID:11160425

  17. Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis

    SciTech Connect

    Greenberg, M.; Ebel, D.S.

    2009-03-19

    We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 µm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-ray fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 µm/pixel, without the use of oil-based lenses. A full textural analysis of track No. 82 is presented here as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

  18. Physics analysis tools

    SciTech Connect

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For purposes of this paper, the analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages; thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

  19. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides more decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  20. UDECON: deconvolution optimization software for restoring high-resolution records from pass-through paleomagnetic measurements

    NASA Astrophysics Data System (ADS)

    Xuan, Chuang; Oda, Hirokuni

    2015-12-01

    The rapid accumulation of continuous paleomagnetic and rock magnetic records acquired from pass-through measurements on superconducting rock magnetometers (SRM) has greatly contributed to our understanding of the paleomagnetic field and paleo-environment. Pass-through measurements are inevitably smoothed and altered by the convolution effect of SRM sensor response, and deconvolution is needed to restore high-resolution paleomagnetic and environmental signals. Although various deconvolution algorithms have been developed, the lack of easy-to-use software has hindered the practical application of deconvolution. Here, we present standalone graphical software UDECON as a convenient tool to perform optimized deconvolution for pass-through paleomagnetic measurements using the algorithm recently developed by Oda and Xuan (Geochem Geophys Geosyst 15:3907-3924, 2014). With the preparation of a format file, UDECON can directly read pass-through paleomagnetic measurement files collected at different laboratories. After the SRM sensor response is determined and loaded to the software, optimized deconvolution can be conducted using two different approaches (i.e., "Grid search" and "Simplex method") with adjustable initial values or ranges for smoothness, corrections of sample length, and shifts in measurement position. UDECON provides a suite of tools to view conveniently and check various types of original measurement and deconvolution data. Multiple steps of measurement and/or deconvolution data can be compared simultaneously to check the consistency and to guide further deconvolution optimization. Deconvolved data together with the loaded original measurement and SRM sensor response data can be saved and reloaded for further treatment in UDECON. Users can also export the optimized deconvolution data to a text file for analysis in other software.

  1. Configuration Analysis Tool

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1983-01-01

    Configuration Analysis Tool (CAT) is an information storage and report generation system that aids configuration management activities. Configuration management is a discipline composed of many techniques selected to track and direct the evolution of complex systems. CAT is an interactive program that accepts, organizes, and stores information pertinent to specific phases of a project.

  2. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  3. PCard Data Analysis Tool

    SciTech Connect

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool:
    - Helps identify exceptions or questionable and non-compliant purchases,
    - Creates a random audit sample on request,
    - Allows users to create and run new or ad-hoc queries and reports,
    - Monitors disputed charges,
    - Creates predefined emails to cardholders requesting documentation and/or clarification,
    - Tracks audit status, notes, email status (date sent, response), and audit resolution.

  4. PCard Data Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool:
    - Helps identify exceptions or questionable and non-compliant purchases,
    - Creates a random audit sample on request,
    - Allows users to create and run new or ad-hoc queries and reports,
    - Monitors disputed charges,
    - Creates predefined emails to cardholders requesting documentation and/or clarification,
    - Tracks audit status, notes, email status (date sent, response), and audit resolution.

  5. Graphical Contingency Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides additional decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  6. Excitation pulse deconvolution in luminescence lifetime analysis for oxygen measurements in vivo.

    PubMed

    Mik, Egbert G; Donkersloot, Cornelis; Raat, Nicolaas J H; Ince, Can

    2002-07-01

    Oxygen-dependent quenching of phosphorescence has been proven to be a valuable tool for the measurement of oxygen concentrations both in vitro and in vivo. For biological measurements, the relatively long lifetimes of phosphorescence have promoted time-domain-based devices using xenon arc flashlamps as the most common excitation light source. The resulting complex form of the excitation pulse leads to complications in the analysis of phosphorescence lifetimes and ultimately to errors in the recovered pO2 values. Although the problem has been recognized, the consequences for in vivo phosphorescence lifetime measurements have been neglected so far. In this study, the consequences of finite excitation flash duration are analyzed using computer simulations, and a method for the recovery of phosphorescence decay times from complex photometric signals is presented. The analysis explains why different calibration constants are reported in the literature and presents a unified view whereby calibration constants are not solely a property of the dye but also of the measuring device. It is concluded that complex excitation pulse patterns without appropriate analysis methods lead to device-specific calibration constants and nonlinearity and can be a potent source of errors when applied in vivo. The method of analysis presented in this article allows reliable phosphorescence lifetime measurements to be made for oxygen pressure measurements and can easily be applied to existing phosphorimeters. PMID:12126302
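
    One way to see the effect the authors describe is to model the detected signal as the convolution of the measured excitation pulse with a single-exponential phosphorescence decay, and to fit the lifetime on that basis rather than fitting a bare exponential to the tail. The sketch below is a generic illustration of that idea, not the authors' algorithm; the pulse shape, lifetime, sampling step, and noise level are all assumed.

        import numpy as np
        from scipy.optimize import curve_fit

        dt = 1.0                                  # microseconds per sample (assumed)
        t = np.arange(0, 600, dt)

        # Assumed flash-lamp-like excitation pulse with a slow tail.
        pulse = np.exp(-t / 15.0) - np.exp(-t / 3.0)
        pulse /= pulse.sum()

        def detected(t, amplitude, tau):
            """Convolution of the excitation pulse with a single-exponential decay."""
            kernel = np.exp(-t / tau)
            return amplitude * np.convolve(pulse, kernel)[: len(t)] * dt

        # Simulate a measurement with tau = 100 us and fit it back.
        rng = np.random.default_rng(1)
        data = detected(t, 50.0, 100.0) + rng.normal(0, 0.02, len(t))
        popt, _ = curve_fit(detected, t, data, p0=[10.0, 50.0])
        print("recovered lifetime (us): %.1f" % popt[1])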

  7. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents 3 different tools developed recently for contamination analysis:
    - HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files.
    - Java RGA extractor: can load in multiple SRS.ana files and extract pressure vs. time data.
    - C++ Contamination Simulation code: 3D particle tracing code for modeling transport of dust particulates and molecules. Uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  8. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  9. Stack Trace Analysis Tool

    SciTech Connect

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.
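
    The call-prefix-tree merge at the heart of this approach can be sketched as follows. This is an illustrative re-implementation of the idea in Python, not STAT's actual MRNet-based code, and the example traces are made up.

        # Merge per-task stack traces into a single call prefix tree.
        # Each node records which task ranks reached that call frame.
        def merge_traces(traces):
            """traces: dict mapping task rank -> list of frames from main() downward."""
            root = {"name": "<root>", "tasks": set(), "children": {}}
            for rank, frames in traces.items():
                node = root
                node["tasks"].add(rank)
                for frame in frames:
                    child = node["children"].setdefault(
                        frame, {"name": frame, "tasks": set(), "children": {}})
                    child["tasks"].add(rank)
                    node = child
            return root

        def print_tree(node, indent=0):
            print("  " * indent + "%s  tasks=%s" % (node["name"], sorted(node["tasks"])))
            for child in node["children"].values():
                print_tree(child, indent + 1)

        traces = {
            0: ["main", "solve", "mpi_wait"],
            1: ["main", "solve", "mpi_wait"],
            2: ["main", "io_write"],            # an equivalence class of its own
        }
        print_tree(merge_traces(traces))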

  10. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
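
    In its simplest form, the question the tool answers ("if this variable changes, what else changes, and along what path?") is a reachability query over a dependence graph. The sketch below is a generic forward-slicing toy, not the authors' multi-domain implementation; the graph and variable names are invented.

        from collections import deque

        # Dependence graph: edge a -> b means "b is computed from a".
        deps = {
            "input_rate": ["buffer_size"],
            "buffer_size": ["latency", "memory_use"],
            "latency": ["alarm"],
            "memory_use": [],
            "alarm": [],
        }

        def forward_slice(graph, start):
            """Return every variable affected by `start`, with one path to each."""
            paths = {start: [start]}
            queue = deque([start])
            while queue:
                node = queue.popleft()
                for nxt in graph.get(node, []):
                    if nxt not in paths:
                        paths[nxt] = paths[node] + [nxt]
                        queue.append(nxt)
            return paths

        for var, path in forward_slice(deps, "input_rate").items():
            print(var, "via", " -> ".join(path))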

  11. Stack Trace Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2008-01-16

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.

  12. Stack Trace Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.

  13. Frequency Response Analysis Tool

    SciTech Connect

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
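
    As a rough illustration of the quantity such a tool computes, frequency response is commonly expressed in MW per 0.1 Hz, from the change in a balancing authority's net interchange (or generation) between pre-event and post-event settling windows divided by the corresponding change in frequency. The sketch below is a simplified, sign-agnostic illustration of that ratio, not FRAT's BAL-003-1 implementation; the windows, sample rate, and numbers are invented.

        import numpy as np

        # Invented one-second SCADA-like samples around an under-frequency event at t = 60 s.
        t = np.arange(0, 300)
        freq = np.where(t < 60, 60.0, 59.95)                 # Hz, 50 mHz dip
        net_interchange = np.where(t < 60, -500.0, -465.0)   # MW (more export / less import after the event)

        pre = slice(20, 52)       # pre-event averaging window (assumed)
        post = slice(80, 140)     # post-event settling window (assumed)

        delta_f = freq[post].mean() - freq[pre].mean()                       # Hz
        delta_mw = net_interchange[post].mean() - net_interchange[pre].mean()

        # Magnitude of frequency response in MW per 0.1 Hz.
        response = abs(delta_mw) / (abs(delta_f) * 10.0)
        print("frequency response: %.1f MW / 0.1 Hz" % response)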

  14. Convolution-deconvolution in DIGES

    SciTech Connect

    Philippacopoulos, A.J.; Simos, N.

    1995-05-01

    Convolution and deconvolution operations are by all means a very important aspect of SSI analysis since they influence the input to the seismic analysis. This paper documents some of the convolution/deconvolution procedures which have been implemented into the DIGES code. The 1-D propagation of shear and dilatational waves in typical layered configurations involving a stack of layers overlying a rock is treated by DIGES in a similar fashion to that of available codes, e.g. CARES, SHAKE. For certain configurations, however, there is no need to perform such analyses since the corresponding solutions can be obtained in analytic form. Typical cases involve deposits which can be modeled by a uniform halfspace or simple layered halfspaces. For such cases DIGES uses closed-form solutions. These solutions are given for one- as well as two-dimensional deconvolution. The types of waves considered include P, SV and SH waves. Non-vertical incidence is given special attention since deconvolution can be defined differently depending on the problem of interest. For all wave cases considered, corresponding transfer functions are presented in closed form. Transient solutions are obtained in the frequency domain. Finally, a variety of forms are considered for representing the free field motion both in terms of deterministic as well as probabilistic representations. These include (a) acceleration time histories, (b) response spectra, (c) Fourier spectra, and (d) cross-spectral densities.
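
    A common way to carry out this kind of 1-D deconvolution in practice is to divide spectra by a transfer function in the frequency domain, with a stabilization ("water level") so that noise is not amplified at frequencies where the transfer function is small. The sketch below is a generic illustration of that operation, not DIGES's closed-form solutions; the transfer function and the synthetic surface motion are invented.

        import numpy as np

        # Synthetic surface motion and an assumed soil-column transfer function H(f).
        n, dt = 1024, 0.01
        t = np.arange(n) * dt
        surface = np.exp(-((t - 2.0) / 0.2) ** 2) * np.sin(2 * np.pi * 5 * t)

        freqs = np.fft.rfftfreq(n, dt)
        H = 1.0 / (1.0 + (freqs / 8.0) ** 2)      # toy low-pass "soil amplification"

        # Water-level deconvolution: divide spectra, but never by less than wl * max|H|.
        wl = 0.05
        H_stab = np.where(np.abs(H) < wl * np.abs(H).max(), wl * np.abs(H).max(), H)
        rock_spec = np.fft.rfft(surface) / H_stab
        rock_motion = np.fft.irfft(rock_spec, n)
        print("peak surface / peak deconvolved:",
              round(surface.max(), 3), "/", round(rock_motion.max(), 3))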

  15. Neutron multiplicity analysis tool

    SciTech Connect

    Stewart, Scott L

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This program was developed to help speed the analysis of Monte Carlo neutron transport simulation (MCNP) data, and only requires the count-rate data to calculate the mass of material using INCC's analysis methods instead of the full neutron multiplicity distribution required to run analysis in INCC. This paper describes what is implemented within EXCOM, including the methods used, how the program corrects for deadtime, and how uncertainty is calculated. This paper also describes how to use EXCOM within Excel.

  16. Geodetic Strain Analysis Tool

    NASA Technical Reports Server (NTRS)

    Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan

    2011-01-01

    A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of strain fields from the GPS and strain meter data that sample them; to aid understanding of the strengths and weaknesses of strain field derivation from continuous GPS (CGPS) and other geodetic data from a variety of tectonic settings; to converge on the "best practices" strain derivation strategy for the Solid Earth Science ESDR System (SESES) project, given the CGPS station distribution in the western U.S.; and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
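
    The four data products listed above follow directly from a 2D strain(-rate) tensor. The sketch below shows one standard way to compute them with an eigen-decomposition; it is a generic illustration with an invented tensor, not the tool's code, and sign and azimuth conventions vary between packages.

        import numpy as np

        # Invented symmetric 2D strain-rate tensor components (e.g., nanostrain/yr).
        exx, eyy, exy = 120.0, -40.0, 35.0
        E = np.array([[exx, exy],
                      [exy, eyy]])

        # Principal components: eigenvalues/vectors of the symmetric tensor.
        vals, vecs = np.linalg.eigh(E)           # ascending order: e2 <= e1
        e2, e1 = vals
        e1_angle = np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1]))  # e1 axis from x direction

        dilatation = exx + eyy                   # trace of the tensor
        max_shear = 0.5 * (e1 - e2)              # maximum (tensor) shear
        max_shear_angle = e1_angle + 45.0        # max-shear directions lie 45 deg from e1

        print("principal:", round(e1, 1), round(e2, 1), "axis angle:", round(e1_angle, 1))
        print("dilatation:", dilatation, "max shear:", round(max_shear, 1),
              "shear angle:", round(max_shear_angle, 1))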

  17. Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis.

    PubMed

    Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M

    2014-01-01

    The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue, and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell-specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprise mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908
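
    The core of the PSEA approach is a linear regression of bulk tissue expression onto cell-type reference signals, optionally with group-specific terms so that a disease-associated change can be attributed to a particular cell population. The sketch below is a greatly simplified illustration of that idea on simulated data, not the published PSEA code; the reference signals, group design, and gene values are all invented.

        import numpy as np

        rng = np.random.default_rng(3)
        n_samples = 40
        group = np.repeat([0, 1], n_samples // 2)          # 0 = control, 1 = disease

        # Invented cell-type reference signals (e.g., averaged marker-gene expression).
        neuron_ref = rng.uniform(0.4, 1.0, n_samples) - 0.25 * group   # neuronal loss
        astro_ref = rng.uniform(0.4, 1.0, n_samples) + 0.15 * group    # astrogliosis

        # Simulated bulk expression of one gene: expressed mainly in neurons,
        # with a disease-specific change in its neuronal expression level.
        gene = (5.0 * neuron_ref + 1.0 * astro_ref - 2.0 * group * neuron_ref
                + rng.normal(0, 0.1, n_samples))

        # Design matrix: neuron term, astrocyte term, and a disease x neuron interaction.
        X = np.column_stack([neuron_ref, astro_ref, group * neuron_ref])
        coef, *_ = np.linalg.lstsq(X, gene, rcond=None)
        print("neuron, astrocyte, disease-x-neuron coefficients:", np.round(coef, 2))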

  18. Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis

    PubMed Central

    Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M.

    2015-01-01

    The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue, and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell-specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprise mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908

  19. Deconvolution filtering: Temporal smoothing revisited

    PubMed Central

    Bush, Keith; Cisler, Josh

    2014-01-01

    Inferences made from analysis of BOLD data regarding neural processes are potentially confounded by multiple competing sources: cardiac and respiratory signals, thermal effects, scanner drift, and motion-induced signal intensity changes. To address this problem, we propose deconvolution filtering, a process of systematically deconvolving and reconvolving the BOLD signal via the hemodynamic response function such that the resultant signal is composed of maximally likely neural and neurovascular signals. To test the validity of this approach, we compared the accuracy of BOLD signal variants (i.e., unfiltered, deconvolution filtered, band-pass filtered, and optimized band-pass filtered BOLD signals) in identifying useful properties of highly confounded, simulated BOLD data: (1) reconstructing the true, unconfounded BOLD signal, (2) correlation with the true, unconfounded BOLD signal, and (3) reconstructing the true functional connectivity of a three-node neural system. We also tested this approach by detecting task activation in BOLD data recorded from healthy adolescent girls (control) during an emotion processing task. Results for the estimation of functional connectivity of simulated BOLD data demonstrated that analysis (via standard estimation methods) using deconvolution filtered BOLD data achieved superior performance to analysis performed using unfiltered BOLD data and was statistically similar to well-tuned band-pass filtered BOLD data. Contrary to band-pass filtering, however, deconvolution filtering is built upon physiological arguments and has the potential, at low TR, to match the performance of an optimal band-pass filter. The results from task estimation on real BOLD data suggest that deconvolution filtering provides superior or equivalent detection of task activations relative to comparable analyses on unfiltered signals and also provides decreased variance over the estimate. In turn, these results suggest that standard preprocessing of the BOLD signal ignores significant sources of noise that can be effectively removed without damaging the underlying signal. PMID:24768215
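
    The deconvolve-then-reconvolve operation described above can be sketched with a regularized frequency-domain deconvolution against a canonical hemodynamic response function (HRF). This is only a schematic illustration of the idea, not the authors' estimator; the HRF shape, regularization constant, TR, and simulated signal are assumed.

        import numpy as np

        tr = 1.0                                   # repetition time in seconds (assumed)
        n = 300
        t = np.arange(n) * tr

        # Simple two-gamma-like HRF approximation (illustrative, not the SPM canonical HRF).
        ht = np.arange(0, 30, tr)
        hrf = (ht ** 5) * np.exp(-ht) / 120.0 - 0.1 * (ht ** 10) * np.exp(-ht) / 3628800.0

        # Simulated neural events convolved with the HRF, plus drift and noise confounds.
        rng = np.random.default_rng(4)
        neural = (rng.random(n) < 0.05).astype(float)
        bold = np.convolve(neural, hrf)[:n] + 0.002 * t + rng.normal(0, 0.05, n)

        # Deconvolution filter: regularized inverse filtering by the HRF, then reconvolution.
        H = np.fft.rfft(hrf, n)
        lam = 0.1 * np.max(np.abs(H)) ** 2         # regularization strength (assumed)
        neural_est = np.fft.irfft(np.fft.rfft(bold) * np.conj(H) / (np.abs(H) ** 2 + lam), n)
        bold_filtered = np.convolve(neural_est, hrf)[:n]

        print("corr(filtered, noise-free BOLD): %.2f" % np.corrcoef(
            bold_filtered, np.convolve(neural, hrf)[:n])[0, 1])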

  20. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  1. Effect of ultra-strong static magnetic field on bacteria: application of Fourier-transform infrared spectroscopy combined with cluster analysis and deconvolution.

    PubMed

    Hu, Xing; Qiu, Zunan; Wang, Yerui; She, Zichao; Qian, Guangren; Ren, Zhongming

    2009-09-01

    A new method based on Fourier-transform infrared (FTIR) spectroscopy combined with cluster analysis and deconvolution was established to investigate the biological effect of an ultra-strong static magnetic field (SMF) of 10.0 T on Escherichia coli and Staphylococcus aureus. FTIR spectroscopy was applied to characterize the spectroscopic fingerprints of these bacterial cells with or without treatment with the SMF. The results of the cluster analysis indicated that the SMF had significant effects on E. coli compared with S. aureus, and the effects were reflected in changes in the 1500-1200 cm(-1) spectral region. The deconvolution results of this major indication region showed that the composition and conformation of nucleic acid, protein, and fatty acid of E. coli were altered under the magnetic conditions. PMID:19441065

  2. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but do not have access for modifying criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, and thereby further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  3. Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. They are globally-merged pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, mesoscale convective systems, etc. Basic functions include selection of area of interest and time, single imagery, overlay of two different products, animation, a time-skip capability, and different image size outputs. Users can save an animation as a file (animated gif) and import it in other presentation software, such as Microsoft PowerPoint. Since the tool can directly access the real data, more features and functionality can be added in the future.

  4. Java Radar Analysis Tool

    NASA Technical Reports Server (NTRS)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  5. Automated deconvolution of overlapped ion mobility profiles.

    PubMed

    Brantley, Matthew; Zekavat, Behrooz; Harper, Brett; Mason, Rachel; Solouki, Touradj

    2014-10-01

    Presence of unresolved ion mobility (IM) profiles limits the efficient utilization of IM mass spectrometry (IM-MS) systems for isomer differentiation. Here, we introduce an automated ion mobility deconvolution (AIMD) computer software for streamlined deconvolution of overlapped IM-MS profiles. AIMD is based on a previously reported post-IM/collision-induced dissociation (CID) deconvolution approach [J. Am. Soc. Mass Spectrom. 23, 1873 (2012)] and, unlike the previously reported manual approach, it does not require resampling of post-IM/CID data. A novel data preprocessing approach is utilized to improve the accuracy and efficiency of the deconvolution process. Results from AIMD analysis of overlapped IM profiles of data from (1) Waters Synapt G1 for a binary mixture of isomeric peptides (amino acid sequences: GRGDS and SDGRG) and (2) Waters Synapt G2-S for a binary mixture of isomeric trisaccharides (raffinose and isomaltotriose) are presented. PMID:25096279

  6. Automated Deconvolution of Overlapped Ion Mobility Profiles

    NASA Astrophysics Data System (ADS)

    Brantley, Matthew; Zekavat, Behrooz; Harper, Brett; Mason, Rachel; Solouki, Touradj

    2014-10-01

    Presence of unresolved ion mobility (IM) profiles limits the efficient utilization of IM mass spectrometry (IM-MS) systems for isomer differentiation. Here, we introduce an automated ion mobility deconvolution (AIMD) computer software for streamlined deconvolution of overlapped IM-MS profiles. AIMD is based on a previously reported post-IM/collision-induced dissociation (CID) deconvolution approach [ J. Am. Soc. Mass Spectrom. 23, 1873 (2012)] and, unlike the previously reported manual approach, it does not require resampling of post-IM/CID data. A novel data preprocessing approach is utilized to improve the accuracy and efficiency of the deconvolution process. Results from AIMD analysis of overlapped IM profiles of data from (1) Waters Synapt G1 for a binary mixture of isomeric peptides (amino acid sequences: GRGDS and SDGRG) and (2) Waters Synapt G2-S for a binary mixture of isomeric trisaccharides (raffinose and isomaltotriose) are presented.

  7. Elevated growth hormone secretory rate in premature infants: deconvolution analysis of pulsatile growth hormone secretion in the neonate.

    PubMed

    Wright, N M; Northington, F J; Miller, J D; Veldhuis, J D; Rogol, A D

    1992-09-01

    Premature infants have higher circulating concentrations of growth hormone (GH) than term infants. Previous investigations of these differences have used sampling frequencies of every 30 min with subsequent application of pulse detection algorithms, such as the CLUSTER program, to assess serum GH pulse parameters. To determine differences in GH secretory rates or GH t1/2 values between premature and term infants, we have sampled 11 neonates at 15-min intervals. We performed deconvolution analysis of the resultant plasma GH values to estimate GH secretory and clearance parameters. Five premature infants (gestational age range 24-34 wk) and six term infants (gestational age range 38-42 wk) were sampled every 15 min for 6 h. All subjects had indwelling arterial catheters. GH was measured (in duplicate) by RIA using 10 microL of plasma. Premature infants had higher secretory burst amplitudes (2.2 +/- 0.13 micrograms/L/min versus 1.4 +/- 0.27 micrograms/L/min, p = 0.02), higher production rates (product of the total number of bursts and the mean mass of GH secreted per burst, 811 +/- 173 micrograms/L/6 h versus 283 +/- 77 micrograms/L/6 h, p = 0.03), and a higher mass of GH per secretory burst (106 +/- 25 micrograms/L versus 38 +/- 11 micrograms/L, p = 0.049) than term infants. The integrated plasma GH concentration exhibited a strong trend toward a higher value in the premature infants (18,100 +/- 800 micrograms/L versus 10,200 +/- 2,700 micrograms/L, p = 0.067).(ABSTRACT TRUNCATED AT 250 WORDS) PMID:1408463
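
    Deconvolution analysis of this kind models the measured hormone concentration as secretion events convolved with exponential clearance and estimates secretion and half-life parameters from the sampled profile. The sketch below illustrates the forward model and a least-squares fit, using instantaneous (bolus-like) bursts at assumed times; it is a toy illustration, not the multi-parameter deconvolution method used in the study, and all numbers are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.arange(0, 360, 15.0)                    # minutes: 15-min sampling over 6 h
        burst_times = np.array([30.0, 150.0, 270.0])   # assumed burst onsets

        def gh_conc(t, amplitude, t_half):
            """Sum of instantaneous secretory bursts cleared with half-life t_half."""
            k = np.log(2) / t_half
            conc = np.zeros_like(t)
            for tb in burst_times:
                conc += amplitude * np.exp(-k * (t - tb)) * (t >= tb)
            return conc

        # Simulate a profile (amplitude 40 ug/L, half-life 20 min) and refit it.
        rng = np.random.default_rng(5)
        data = gh_conc(t, 40.0, 20.0) + rng.normal(0, 1.0, t.size)
        popt, _ = curve_fit(gh_conc, t, data, p0=[10.0, 10.0])
        print("amplitude %.1f ug/L, half-life %.1f min" % tuple(popt))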

  8. Digital deconvolution filter derived from linear discriminant analysis and application for multiphoton fluorescence microscopy.

    PubMed

    Sullivan, Shane Z; Schmitt, Paul D; Muir, Ryan D; DeWalt, Emma L; Simpson, Garth J

    2014-04-01

    A digital filter derived from linear discriminant analysis (LDA) is developed for recovering impulse responses in photon counting from a high speed photodetector (rise time of ~1 ns) and applied to remove ringing distortions from impedance mismatch in multiphoton fluorescence microscopy. Training of the digital filter was achieved by defining temporally coincident and noncoincident transients and identifying the projection within filter-space that best separated the two classes. Once trained, data analysis by digital filtering can be performed quickly. Assessment of the reliability of the approach was performed through comparisons of simulated voltage transients, in which the ground truth results were known a priori. The LDA filter was also found to recover deconvolved impulses for single photon counting from highly distorted ringing waveforms from an impedance mismatched photomultiplier tube. The LDA filter was successful in removing these ringing distortions from two-photon excited fluorescence micrographs and through data simulations was found to extend the dynamic range of photon counting by approximately 3 orders of magnitude through minimization of detector paralysis. PMID:24559143
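
    The filter construction described above amounts to Fisher linear discriminant analysis on short voltage-transient windows: compute the class means and pooled within-class scatter for "photon" and "no photon" windows, then project new windows onto w = Sw^(-1)(mu1 - mu0). The sketch below is a generic LDA-filter toy on simulated transients, not the authors' trained filter; the waveform shapes, window length, and noise level are invented.

        import numpy as np

        rng = np.random.default_rng(6)
        n_samples, n_train = 16, 400

        def transient(photon):
            """Simulated detector window: damped-ringing pulse if a photon arrived."""
            x = rng.normal(0, 0.05, n_samples)
            if photon:
                k = np.arange(n_samples)
                x += np.exp(-k / 4.0) * np.cos(2 * np.pi * k / 5.0)
            return x

        X = np.array([transient(i % 2 == 1) for i in range(n_train)])
        y = np.arange(n_train) % 2

        mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
        Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)      # pooled within-class scatter
        w = np.linalg.solve(Sw, mu1 - mu0)                  # LDA projection vector
        threshold = 0.5 * (w @ mu0 + w @ mu1)

        # Apply the trained filter to new windows.
        test = np.array([transient(p) for p in (True, False, True)])
        print("photon decisions:", (test @ w > threshold).tolist())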

  9. Blind decorrelation and deconvolution algorithm for multiple-input multiple-output system: II. Analysis and simulation

    NASA Astrophysics Data System (ADS)

    Chen, Da-Ching; Yu, Tommy; Yao, Kung; Pottie, Gregory J.

    1999-11-01

    For single-input multiple-output (SIMO) systems blind deconvolution based on second-order statistics has been shown promising given that the sources and channels meet certain assumptions. In our previous paper we extend the work to multiple-input multiple-output (MIMO) systems by introducing a blind deconvolution algorithm to remove all channel dispersion followed by a blind decorrelation algorithm to separate different sources from their instantaneous mixture. In this paper we first explore more details embedded in our algorithm. Then we present simulation results to show that our algorithm is applicable to MIMO systems excited by a broad class of signals such as speech, music and digitally modulated symbols.

  10. Unsupervised Blind Deconvolution

    NASA Astrophysics Data System (ADS)

    Baena-Galle, R.; Kann, L.; Mugnier, L.; Gudimetla, R.; Johnson, R.; Gladysz, S.

    2013-09-01

    "Blind" deconvolution is rarely executed blindly. All available methods have parameters which the user fine-tunes until the most visually-appealing reconstruction is achieved. The "art" of deconvolution is to find constraints which allow for the best estimate of an object to be recovered, but in practice these parameterized constraints often reduce deconvolution to the struggle of trial and error. In the course of AFOSR-sponsored activities we are developing a general maximum a posteriori framework for the problem of imaging through atmospheric turbulence, with the emphasis on multi-frame blind deconvolution. Our aim is to develop deconvolution strategy which is reference-less, i.e. no calibration PSF is required, extendable to longer exposures, and applicable to imaging with adaptive optics. In the first part of the project the focus has been on developing a new theory of statistics of images taken through turbulence, both with-, and without adaptive optics. Images and their Fourier transforms have been described as random phasor sums, their fluctuations controlled by wavefront "cells" and moments of the phase. The models were validated using simulations and real data from the 3.5m telescope at the Starfire Optical Range in New Mexico. Another important ingredient of the new framework is the capability to estimate the average PSF automatically from the target observations. A general approach, applicable to any type of object, has been proposed. Here use is made of an object-cancelling transformation of the image sequence. This transformation yields information about the atmospheric PSF. Currently, the PSF estimation module and the theoretical constraints on PSF variability are being incorporated into multi-frame blind deconvolution. In preliminary simulation tests we obtained significantly sharper images with respect to the starting observations and PSF estimates which closely track the input kernels. Thanks to access to the SOR 3.5m telescope we are now testing our deconvolution approach on images of real, extended objects. Adaptive-optics-assisted I-band observations at SOR rarely exceed Strehl ratios of 15% and therefore, in many cases, deconvolution post adaptive optics would be necessary for object identification.

  11. Tiling Microarray Analysis Tools

    SciTech Connect

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) Rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment). 2) Post processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation), 3) Significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and Interval refinement (filtering based on multiple statistics, overlap comparisons), 4) Data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and Data reports (spreadsheet summaries and detailed profiles)

  12. Tiling Microarray Analysis Tools

    Energy Science and Technology Software Center (ESTSC)

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) Rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment). 2) Post processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation), 3) Significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and Interval refinement (filtering based on multiple statistics, overlap comparisons), 4) Data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and Data reports (spreadsheet summaries and detailed profiles)

  13. Sandia PUF Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2014-06-11

    This program is a graphical user interface for measuring and performing inter-active analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.

  14. Sandia PUF Analysis Tool

    SciTech Connect

    2014-06-11

    This program is a graphical user interface for measuring and performing inter-active analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
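
    The noise and inter-chip metrics mentioned above are typically fractional Hamming distances between binary PUF responses: intra-chip distance across repeated measurements of one device (noise) and inter-chip distance across devices (uniqueness). The sketch below computes both for simulated signatures; it is not the Sandia tool, and the bit length, noise rate, and device count are invented.

        import numpy as np
        from itertools import combinations

        rng = np.random.default_rng(7)
        n_chips, n_repeats, n_bits = 8, 20, 256
        noise_rate = 0.05                         # assumed per-bit flip probability

        # Each chip has a fixed random signature; repeated reads add bit-flip noise.
        signatures = rng.integers(0, 2, (n_chips, n_bits))
        reads = np.array([[sig ^ (rng.random(n_bits) < noise_rate)
                           for _ in range(n_repeats)] for sig in signatures])

        def frac_hd(a, b):
            return np.count_nonzero(a != b) / n_bits

        intra = [frac_hd(reads[c, i], reads[c, j])
                 for c in range(n_chips) for i, j in combinations(range(n_repeats), 2)]
        inter = [frac_hd(signatures[i], signatures[j])
                 for i, j in combinations(range(n_chips), 2)]

        print("mean intra-chip (noise) HD: %.3f" % np.mean(intra))
        print("mean inter-chip HD:         %.3f" % np.mean(inter))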

  15. Comparison of environmental TLD (thermoluminescent dosimeter) results obtained using glow curve deconvolution and region of interest analysis

    SciTech Connect

    Not Available

    1987-01-01

    We tested a Harshaw Model 4000 TLD Reader in the Sandia Environmental TLD Program. An extra set of LiF TLD-700 chips was prepared for each field location and calibration level. At the end of quarter one, half of the TLDs were read on the Model 4000 and the other half were read on our standard Harshaw Model 2000. This presentation compares the results of the two systems. The Model 4000 results are reported for two regions of interest and for background subtraction using Harshaw Glow Curve Deconvolution Software.

  16. Wavespace-Based Coherent Deconvolution

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Cattafesta, Louis N., III

    2012-01-01

    Array deconvolution is commonly used in aeroacoustic analysis to remove the influence of a microphone array's point spread function from a conventional beamforming map. Unfortunately, the majority of deconvolution algorithms assume that the acoustic sources in a measurement are incoherent, which can be problematic for some aeroacoustic phenomena with coherent, spatially-distributed characteristics. While several algorithms have been proposed to handle coherent sources, some are computationally intractable for many problems while others require restrictive assumptions about the source field. Newer generalized inverse techniques hold promise, but are still under investigation for general use. An alternate coherent deconvolution method is proposed based on a wavespace transformation of the array data. Wavespace analysis offers advantages over curved-wave array processing, such as providing an explicit shift-invariance in the convolution of the array sampling function with the acoustic wave field. However, usage of the wavespace transformation assumes the acoustic wave field is accurately approximated as a superposition of plane wave fields, regardless of true wavefront curvature. The wavespace technique leverages Fourier transforms to quickly evaluate a shift-invariant convolution. The method is derived for and applied to ideal incoherent and coherent plane wave fields to demonstrate its ability to determine magnitude and relative phase of multiple coherent sources. Multi-scale processing is explored as a means of accelerating solution convergence. A case with a spherical wave front is evaluated. Finally, a trailing edge noise experiment case is considered. Results show the method successfully deconvolves incoherent, partially-coherent, and coherent plane wave fields to a degree necessary for quantitative evaluation. Curved wave front cases warrant further investigation. A potential extension to nearfield beamforming is proposed.

  17. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  18. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure; or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  19. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprising two tasks: (1) Create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) Write a program that functions as a NASTRAN pre-processor to generate an FEM model for grid-stiffened structures. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool they already have; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD ROM & diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  20. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
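
    A small illustration of the kind of calculation such a tool steps through: estimate preload from installation torque with a nut factor, add the bolt's share of the external load through a joint stiffness ratio, and report a margin of safety against an allowable. This is a textbook-style sketch with invented numbers, not comBAT's methodology, databases, or factors.

        # Illustrative bolted-joint numbers (all values assumed, SI units).
        torque = 25.0            # N*m installation torque
        nut_factor = 0.2         # typical dry nut factor (assumed)
        diameter = 0.008         # m, nominal bolt diameter
        stress_area = 36.6e-6    # m^2, tensile stress area for an M8 bolt
        yield_strength = 640e6   # Pa, assumed bolt material allowable
        ext_load = 4000.0        # N external axial load on the joint
        stiffness_ratio = 0.25   # fraction of external load seen by the bolt (assumed)
        safety_factor = 1.25

        preload = torque / (nut_factor * diameter)            # ~15.6 kN
        bolt_load = preload + stiffness_ratio * ext_load
        bolt_stress = bolt_load / stress_area
        ms_yield = yield_strength / (safety_factor * bolt_stress) - 1.0

        # Simple gapping check: clamp load remaining on the joint members.
        clamp = preload - (1.0 - stiffness_ratio) * ext_load
        print("preload %.0f N, bolt stress %.0f MPa, MS(yield) %.2f, clamp %.0f N"
              % (preload, bolt_stress / 1e6, ms_yield, clamp))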

  1. Automatic interpretation of magnetic data based on Euler deconvolution with unprescribed structural index

    NASA Astrophysics Data System (ADS)

    Gerovska, Daniela; Arazo-Bravo, Marcos J.

    2003-10-01

    A tool for fully automatic magnetic data interpretation, solving Euler's homogeneity equation with an unprescribed structural index and a linear background in each moving window, is presented here. The implemented Euler deconvolution algorithm is based on the properties of the differential similarity transformation, which decouples the coordinates and the structural index of the singular point and the parameters of the linear background field. Since the deconvolution algorithm resolves the singular point locations well, this allows the application of a two-stage clustering technique, focusing the estimated singular point coordinates and structural indices, followed by a statistical analysis of the final solutions. The automatic technique was tested on simple and complex 3D model magnetic anomalies. Finally, the technique was applied to real magnetic anomaly data from the Burgas region and the adjoining Black Sea shelf of Bulgaria. The tool consists of two main functions, written in Matlab v.5.3, requiring Matlab's SPLINE and STATISTICS toolkits.
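
    For context, the classical windowed Euler deconvolution with a prescribed structural index N solves, in each moving window, the linear system x0*Tx + y0*Ty + z0*Tz + N*B = x*Tx + y*Ty + z*Tz + N*T for the source coordinates and background B. The sketch below implements that conventional fixed-N version on a synthetic anomaly; it is not the unprescribed-index, differential-similarity-transformation algorithm presented in the paper, and the field, grid, and window are invented.

        import numpy as np

        # Synthetic anomaly from a point-like source with a constant background;
        # T ~ 1/r^3, so the structural index is N = 3 for this toy field.
        x0_t, y0_t, depth, N, B_true = 150.0, 200.0, 40.0, 3.0, 10.0
        xv = np.arange(0.0, 300.0, 5.0)
        x, y = np.meshgrid(xv, xv)
        r = np.sqrt((x - x0_t) ** 2 + (y - y0_t) ** 2 + depth ** 2)
        T = 1.0e6 / r ** 3 + B_true

        # Horizontal gradients numerically; vertical gradient analytically for the toy field.
        dTdy, dTdx = np.gradient(T, 5.0)
        dTdz = 3.0e6 * depth / r ** 5

        # One window centered on the anomaly (a real tool slides this over the grid),
        # solving for [x0, y0, z0, B] by least squares with observations at z = 0.
        win = (np.abs(x - 150.0) <= 30.0) & (np.abs(y - 200.0) <= 30.0)
        gx, gy, gz, tt = dTdx[win], dTdy[win], dTdz[win], T[win]
        A = np.column_stack([gx, gy, gz, np.full(gx.size, N)])
        b = x[win] * gx + y[win] * gy + N * tt
        x0, y0, z0, B = np.linalg.lstsq(A, b, rcond=None)[0]
        print("source at (%.1f, %.1f), depth %.1f, background %.2f" % (x0, y0, z0, B))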

  2. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: mean, standard deviation, median, minimum, and maximum. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  3. Windprofiler optimization using digital deconvolution procedures

    NASA Astrophysics Data System (ADS)

    Hocking, W. K.; Hocking, A.; Hocking, D. G.; Garbanzo-Salas, M.

    2014-10-01

    Digital improvements to data acquisition procedures used for windprofiler radars have the potential for improving the height coverage at optimum resolution, and permit improved height resolution. A few newer systems already use this capability. Real-time deconvolution procedures offer even further optimization, and this has not been effectively employed in recent years. In this paper we demonstrate the advantages of combining these features, with particular emphasis on the advantages of real-time deconvolution. Using several multi-core CPUs, we have been able to achieve speeds of up to 40 GHz from a standard commercial motherboard, allowing data to be digitized and processed without the need for any type of hardware except for a transmitter (and associated drivers), a receiver and a digitizer. No Digital Signal Processor chips are needed, allowing great flexibility with analysis algorithms. By using deconvolution procedures, we have then been able to not only optimize height resolution, but also have been able to make advances in dealing with spectral contaminants like ground echoes and other near-zero-Hz spectral contamination. Our results also demonstrate the ability to produce fine-resolution measurements, revealing small-scale structures within the backscattered echoes that were previously not possible to see. Resolutions of 30 m are possible for VHF radars. Furthermore, our deconvolution technique allows the removal of range-aliasing effects in real time, a major bonus in many instances. Results are shown using new radars in Canada and Costa Rica.

  4. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster composed of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  5. Target deconvolution techniques in modern phenotypic profiling.

    PubMed

    Lee, Jiyoun; Bogyo, Matthew

    2013-02-01

    The past decade has seen rapid growth in the use of diverse compound libraries in classical phenotypic screens to identify modulators of a given process. The subsequent process of identifying the molecular targets of active hits, also called 'target deconvolution', is an essential step for understanding compound mechanism of action and for using the identified hits as tools for further dissection of a given biological process. Recent advances in 'omics' technologies, coupled with in silico approaches and the reduced cost of whole genome sequencing, have greatly improved the workflow of target deconvolution and have contributed to a renaissance of 'modern' phenotypic profiling. In this review, we will outline how both new and old techniques are being used in the difficult process of target identification and validation as well as discuss some of the ongoing challenges remaining for phenotypic screening. PMID:23337810

  6. Bayesian least squares deconvolution

    NASA Astrophysics Data System (ADS)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
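
    For context, the classical (non-Bayesian) LSD profile is the weighted least-squares solution Z = (M^T S^-2 M)^-1 M^T S^-2 V, where M is the line-pattern matrix built from the line mask and S holds the per-pixel noise. The sketch below implements only this classical step, with nearest-bin line placement on a uniform velocity grid; the Gaussian-process prior of the paper is not reproduced, and all names are illustrative.

```python
import numpy as np

def lsd_profile(stokes_v, noise_sigma, line_velocities, line_weights,
                velocity_grid, profile_bins):
    """Classical least squares deconvolution (LSD) of a Stokes V spectrum.

    stokes_v        : observed spectrum sampled on velocity_grid
    noise_sigma     : per-pixel noise standard deviations
    line_velocities : central velocity of each spectral line (same units)
    line_weights    : weight of each line (e.g. depth * wavelength * Lande factor)
    profile_bins    : uniform velocity bins of the common (mean) LSD profile
    Returns the LSD profile Z and its formal covariance.
    """
    dv = profile_bins[1] - profile_bins[0]
    # Line-pattern matrix M: each line contributes a shifted copy of the
    # common profile (nearest-bin interpolation for simplicity).
    M = np.zeros((len(velocity_grid), len(profile_bins)))
    for v0, w in zip(line_velocities, line_weights):
        idx = np.round((velocity_grid - v0 - profile_bins[0]) / dv).astype(int)
        ok = (idx >= 0) & (idx < len(profile_bins))
        M[ok, idx[ok]] += w

    S2inv = np.diag(1.0 / noise_sigma ** 2)
    cov = np.linalg.inv(M.T @ S2inv @ M)
    Z = cov @ (M.T @ S2inv @ stokes_v)
    return Z, cov
```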

  7. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

  8. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

  9. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-Rex. This talk is a combination of existing presentations; a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  10. Climate Data Analysis Tools - (CDAT)

    NASA Astrophysics Data System (ADS)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and as a result of its flexibility, PCMDI has integrated other popular software components, such as: the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware software, CDAT's focus is to allow climate researchers the ability to access and analyze multi-dimensional distributed climate datasets.

  11. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements, and some tools developed in-house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  12. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of some of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

  13. Design and analysis tool validation

    SciTech Connect

    Judkoff, R.

    1981-07-01

    The Solar Energy Research Institute (SERI) is developing a procedure for the validation of Building Energy Analysis Simulation Codes (BEAS). These codes are being used increasingly in the building design process, both directly and as the basis for simplified design tools and guidelines. The importance of the validity of the BEAS in predicting building energy performance is obvious when one considers the money and energy that could be wasted by energy-inefficient designs. However, to date, little or no systematic effort has been made to ensure the validity of the various BEAS. The validation work at SERI consists of three distinct parts: Comparative Study, Analytical Verification, and Empirical Validation. The procedures have been developed for the first two parts and have been implemented on a sampling of the major BEAS; results have shown major problems in one of the BEAS tested. Furthermore, when one building design was run using several of the BEAS, large differences were found in the predicted annual cooling and heating loads. The empirical validation procedure has been developed, and five two-zone test cells have been constructed for validation; a summer validation run will take place as soon as the data acquisition system is completed. Additionally, a test validation exercise is now in progress using the low-cal house to fine-tune the empirical validation procedure and better define monitoring data requirements.

  14. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which has a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, along with the heterogeneous and homogeneous interactions of the organically associated elements, must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. In image analysis, a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is dependent on the chemistry of the particle, it is possible to map chemically similar areas, which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which is currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash that will be represented by that mineral on a frame-by-frame basis. The slag viscosity model was improved to provide better predictions of slag viscosity and temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.
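
    The spreadsheet upgrades mentioned above rely on Newton-Raphson iteration for the balance calculations. The generic solver below, applied to a toy energy-balance residual, is included purely to illustrate that iteration; it is not the EERC spreadsheet logic, and the numbers in the usage line are hypothetical.

```python
def newton_raphson(residual, x0, tol=1e-8, max_iter=50, h=1e-6):
    """Generic Newton-Raphson root finder with a numerical derivative.

    residual : function f(x) whose root is sought (e.g. an energy balance
               written as enthalpy_in(T) - enthalpy_out(T) = 0)
    x0       : initial guess (e.g. a first estimate of flame temperature)
    """
    x = x0
    for _ in range(max_iter):
        f = residual(x)
        dfdx = (residual(x + h) - residual(x - h)) / (2.0 * h)
        step = f / dfdx
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Hypothetical usage: solve a toy energy balance c_p*(T - 298) - q = 0
print(newton_raphson(lambda T: 1.1 * (T - 298.0) - 1500.0, x0=1000.0))
```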

  15. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.
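
    For readers unfamiliar with two-grid analysis, the sketch below shows a plain two-grid correction cycle for a 1-D Poisson problem; the Idealized Relaxation and Idealized Coarse Grid analyses amount to replacing the smoother or the coarse-grid solve in such a cycle with an exact version and observing the effect on convergence. This is an illustration under those assumptions, not code from the pilot multigrid solver.

```python
import numpy as np

def poisson_matrix(n, h):
    """Standard 3-point stencil for -u'' with zero Dirichlet boundaries."""
    return (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) / h**2

def two_grid_cycle(u, f, h, sweeps=3, omega=2.0 / 3.0):
    """One two-grid correction cycle for -u'' = f on an odd number of
    interior points (boundaries held at zero).

    Replacing `smooth` or the coarse-grid solve below with an exact
    ("idealized") version is the essence of the two-grid analysis tools.
    """
    n = len(u)
    A = poisson_matrix(n, h)

    def smooth(u, f, sweeps):
        for _ in range(sweeps):                      # damped Jacobi smoother
            up = np.concatenate(([0.0], u, [0.0]))
            jac = 0.5 * (up[:-2] + up[2:] + h * h * f)
            u = (1.0 - omega) * u + omega * jac
        return u

    u = smooth(u, f, sweeps)                         # pre-smoothing
    r = f - A @ u                                    # fine-grid residual

    m = (n - 1) // 2                                 # coarse interior points
    rc = 0.25 * r[0:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2]   # full weighting
    ec = np.linalg.solve(poisson_matrix(m, 2 * h), rc)          # coarse solve

    e = np.zeros(n)                                  # linear interpolation
    e[1::2] = ec
    e[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])
    e[0] = 0.5 * ec[0]
    e[-1] = 0.5 * ec[-1]

    return smooth(u + e, f, sweeps)                  # correct + post-smooth
```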

  16. Tectonic Analysis of Esh El-Mallaha Area, Gulf of Suez Using Euler Deconvolution for Aeromagnetic Data

    NASA Astrophysics Data System (ADS)

    Aboud, E.; Ushijima, K.

    2004-05-01

    The Esh El-Mallaha area is located on the western coast of the Gulf of Suez, which is considered the main source of hydrocarbon resources in Egypt. The main exploration problem of the Gulf of Suez (and surrounding areas) is the existence of Pre-Miocene salt that masks the seismic energy; as a result, the seismic method is usually unable to provide information about the subsurface structure. A solution may exist in potential field methods such as the magnetic method, which is highly sensitive to the basement and not affected by salt. Herein, aeromagnetic data over the Esh El-Mallaha area have been interpreted to provide a new look at the subsurface structure and tectonics of the area. This interpretation includes the application of the Euler method, which has been considered a sufficient tool in magnetic interpretation. Comparing the results of the Euler method with the available geologic data (wells, geologic maps), the Euler method facilitates the identification of new faults as well as the mapping of faults already known from geologic information. Generally, the area is characterized by two basin structures trending in the NW-SE direction (parallel to the Gulf of Suez). These two basins are separated by a high topographic feature (the Esh El-Mallaha range) and bounded by faults, most probably of normal type.

  17. Multi-mission telecom analysis tool

    NASA Technical Reports Server (NTRS)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  18. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  19. Model Analysis ToolKit

    Energy Science and Technology Software Center (ESTSC)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes: - define parameters - define observations - define model (python function) - define samplesets (sets of parameter combinations) Currently supported functionality includes: - forward model runs - Latin-Hypercube sampling of parameters - multi-dimensional parameter studies - parallel execution of parameter samples - model calibration using internal Levenberg-Marquardt algorithm - model calibration using lmfit package - model calibration using levmar package - Markov Chain Monte Carlo using pymc package. MATK facilitates model analysis using: - scipy - calibration (scipy.optimize) - rpy2 - Python interface to R

  20. Model Analysis ToolKit

    SciTech Connect

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes: - define parameters - define observations - define model (python function) - define samplesets (sets of parameter combinations) Currently supported functionality includes: - forward model runs - Latin-Hypercube sampling of parameters - multi-dimensional parameter studies - parallel execution of parameter samples - model calibration using internal Levenberg-Marquardt algorithm - model calibration using lmfit package - model calibration using levmar package - Markov Chain Monte Carlo using pymc package. MATK facilitates model analysis using: - scipy - calibration (scipy.optimize) - rpy2 - Python interface to R
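
    MATK's own API is not reproduced here; as an indication of the kind of calibration workflow it wraps, the sketch below fits a hypothetical two-parameter forward model to synthetic observations with SciPy's Levenberg-Marquardt least-squares driver. All model and variable names are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical forward model: exponential decay with two parameters.
def model(params, t):
    amplitude, rate = params
    return amplitude * np.exp(-rate * t)

# Synthetic "observations" the calibration should reproduce.
t_obs = np.linspace(0.0, 10.0, 25)
rng = np.random.default_rng(0)
y_obs = model([2.0, 0.3], t_obs) + 0.02 * rng.standard_normal(t_obs.size)

# Residual function: the quantity a Levenberg-Marquardt calibration minimizes.
def residuals(params):
    return model(params, t_obs) - y_obs

fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
print("calibrated parameters:", fit.x)
```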

  1. Budget Risk & Prioritization Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-12-31

    BRPAtool performs the following: assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors; enables analysis of different budget scenarios; can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks; provides real-time analysis; enables managers to determine the multipliers and where funding is best applied; and promotes solid budget defense.

  2. Deconvolution in a ridgelet and curvelet domain

    NASA Astrophysics Data System (ADS)

    Easley, Glenn R.; Berenstein, Carlos A.; Healy, Dennis M., Jr.

    2005-03-01

    We present techniques for performing image reconstruction based on deconvolution in the Radon domain. To deal with a variety of possible boundary conditions, we work with a corresponding generalized discrete Radon transform in order to obtain projection slices for deconvolution. By estimating the projections using wavelet techniques, we are able to do deconvolution directly in a ridgelet domain. We also show how this method can be carried out locally, so that deconvolution can be done in a curvelet domain as well. These techniques suggest a whole new paradigm for developing deconvolution algorithms, which can incorporate leading deconvolution schemes. We conclude by showing experimental results indicating that these new algorithms can significantly improve upon current leading deconvolution methods.
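
    A simplified way to see the idea: by the projection-slice property, each Radon projection of a blurred image is the 1-D convolution of the object's projection with the PSF's projection, so the projections can be deconvolved independently before backprojection. The sketch below uses a plain 1-D Wiener filter and scikit-image's radon/iradon for that step; the ridgelet/curvelet estimators of the paper are not reproduced, and the PSF is assumed to be supplied as a centred image of the same size as the data.

```python
import numpy as np
from skimage.transform import radon, iradon

def radon_domain_deconvolution(blurred, psf, angles=None, k=1e-2):
    """Deconvolve an image by Wiener-filtering its Radon projections.

    blurred : 2-D blurred image
    psf     : 2-D point spread function, same shape as `blurred`, centred
    k       : Wiener regularisation constant (assumed value)
    """
    if angles is None:
        angles = np.linspace(0.0, 180.0, max(blurred.shape), endpoint=False)

    sino = radon(blurred, theta=angles, circle=False)
    psf_sino = radon(psf, theta=angles, circle=False)

    deconv = np.empty_like(sino)
    for j in range(sino.shape[1]):                 # one projection per angle
        # ifftshift moves the centred PSF projection peak to sample 0
        H = np.fft.fft(np.fft.ifftshift(psf_sino[:, j]))
        P = np.fft.fft(sino[:, j])
        W = np.conj(H) / (np.abs(H) ** 2 + k)      # 1-D Wiener filter
        deconv[:, j] = np.real(np.fft.ifft(P * W))

    return iradon(deconv, theta=angles, circle=False)
```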

  3. 2010 Solar Market Transformation Analysis and Tools

    SciTech Connect

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  4. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  5. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3, ...) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will fit data to a linear equation y = f(x) and will perform an ANOVA to check its significance.
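
    The sketch below reproduces a few of these calculations in Python with SciPy rather than Excel, purely for illustration: descriptive statistics, the normal-distribution quantile lookup, and a simple linear regression with a significance test. The sample data are made up.

```python
import numpy as np
from scipy import stats

data = np.array([4.1, 3.8, 4.4, 4.0, 3.9, 4.3, 4.2])   # example sample x(i)

# Descriptive statistics
print("mean      :", np.mean(data))
print("std (n-1) :", np.std(data, ddof=1))
print("median    :", np.median(data))

# "Normal Distribution Estimates": the value whose cumulative probability is p,
# given a sample mean and standard deviation of the normal distribution.
p = 0.95
value = stats.norm.ppf(p, loc=np.mean(data), scale=np.std(data, ddof=1))
print(f"{p:.0%} quantile of fitted normal:", value)

# Simple linear regression with a significance test, similar in spirit to the
# Linear Regression-ANOVA sheet.
x = np.arange(data.size)
res = stats.linregress(x, data)
print("slope:", res.slope, "p-value:", res.pvalue)
```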

  6. Wavelet-Based Image Deconvolution for Wide Field CCD Imagery

    NASA Astrophysics Data System (ADS)

    Merino, M. T.; Fors, O.; Cardinal, R.; Otazu, X.; Núñez, J.; Hildebrand, A. R.

    2006-07-01

    We show how a wavelet-based image adaptive deconvolution algorithm can provide significant improvements in the analysis of wide-field CCD images. To illustrate it, we apply our deconvolution protocol to a set of images from a Baker-Nunn telescope. This f/1 instrument has an outstanding field of view of 4.4° x 4.4° with high optical quality, offering unique properties for studying our deconvolution process and results. In particular, we obtain an estimated gain in limiting magnitude of ΔR ≈ 0.6 mag and in limiting resolution of ≈ 3.9 arcsec. These results increase the number of targets and the efficiency of the underlying scientific project.

  7. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  8. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  9. Photogrammetry Tool for Forensic Analysis

    NASA Technical Reports Server (NTRS)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
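
    Merging the coordinate systems found for different cubes, as in step 4 above, can be done by estimating a rigid rotation and translation from corresponding marked points; the sketch below uses the standard Kabsch (SVD-based) solve. This is a generic illustration, not the NASA system's photogrammetry pipeline, and the example points are hypothetical.

```python
import numpy as np

def rigid_transform(points_a, points_b):
    """Best-fit rotation R and translation t with R @ a + t ~= b (Kabsch).

    points_a, points_b : (N, 3) arrays of the same physical points expressed
    in cube A's and cube B's coordinate systems respectively.
    """
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    H = (points_a - ca).T @ (points_b - cb)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

# Hypothetical usage: express cube-A coordinates in cube B's (or a global) frame.
a_pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
b_pts = a_pts @ R_true.T + np.array([5.0, 2.0, 0.5])
R, t = rigid_transform(a_pts, b_pts)
print(np.allclose(a_pts @ R.T + t, b_pts))   # True: cube A mapped into frame B
```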

  10. Deconvolution procedure of the UV-vis spectra. A powerful tool for the estimation of the binding of a model drug to specific solubilisation loci of bio-compatible aqueous surfactant-forming micelle

    NASA Astrophysics Data System (ADS)

    Calabrese, Ilaria; Merli, Marcello; Turco Liveri, Maria Liria

    2015-05-01

    The evolution of the UV-vis spectra of Nile Red loaded into Tween 20 micelles with pH and [Tween 20] has been analysed in a non-conventional manner by exploiting the deconvolution method. The number of buried sub-bands has been found to depend on both pH and bio-surfactant concentration, and their positions have been associated with Nile Red confined in aqueous solution and in the three micellar solubilisation sites. For the first time, by using an extended classical two-pseudo-phase model, the robust treatment of the spectrophotometric data allows the estimation of the Nile Red binding constants to the available loci. Hosting capability towards Nile Red is enhanced by increasing pH. Comparison between the binding constant values classically evaluated and those estimated by the deconvolution protocol revealed that the overall binding values match the mean values of the local binding sites. This result suggests that the deconvolution procedure provides more precise and reliable values, which are more representative of drug confinement.
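
    As a generic illustration of decomposing an absorption band into buried sub-bands (the kind of spectral deconvolution described above), the sketch below fits a sum of Gaussians to a synthetic UV-vis spectrum with SciPy. The band positions, widths and noise level are invented, and the paper's extended two-pseudo-phase binding analysis is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_sum(wavelength, *params):
    """Sum of Gaussian sub-bands; params = (A1, mu1, sigma1, A2, mu2, sigma2, ...)."""
    y = np.zeros_like(wavelength, dtype=float)
    for A, mu, sigma in zip(params[0::3], params[1::3], params[2::3]):
        y += A * np.exp(-0.5 * ((wavelength - mu) / sigma) ** 2)
    return y

# Hypothetical absorbance spectrum built from three overlapping sub-bands,
# standing in for a dye in different solubilisation loci.
wl = np.linspace(450.0, 650.0, 400)
true = gaussian_sum(wl, 0.6, 530.0, 18.0, 0.9, 560.0, 22.0, 0.4, 600.0, 25.0)
absorbance = true + 0.005 * np.random.default_rng(1).standard_normal(wl.size)

# Initial guesses for (amplitude, centre, width) of each buried sub-band.
p0 = [0.5, 525.0, 15.0, 1.0, 565.0, 20.0, 0.5, 605.0, 20.0]
popt, pcov = curve_fit(gaussian_sum, wl, absorbance, p0=p0)
print(np.round(popt, 2))   # recovered sub-band parameters
```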

  11. Built Environment Energy Analysis Tool Overview (Presentation)

    SciTech Connect

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  12. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  13. Performance analysis of GYRO: a tool evaluation

    NASA Astrophysics Data System (ADS)

    Worley, P.; Candy, J.; Carrington, L.; Huck, K.; Kaiser, T.; Mahinthakumar, G.; Malony, A.; Moore, S.; Reed, D.; Roth, P.; Shan, H.; Shende, S.; Snavely, A.; Sreepathi, S.; Wolf, F.; Zhang, Y.

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  14. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  15. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input flies are copied over to the proper directories and are dynamically read into the tool s interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.

  16. Tools for Life Support Systems Analysis

    NASA Astrophysics Data System (ADS)

    Lange, K.; Ewert, M.

    An analysis of the optimum level of closure of a life support system is a complex task involving hundreds, if not thousands, of parameters. In the absence of complete data on candidate technologies and a complete definition of the mission architecture and requirements, many assumptions are necessary. Because of the large number of parameters, it is difficult to fully comprehend and compare studies performed by different analysts. The Systems Integration, Modeling, and Analysis (SIMA) Project Element within NASA's Advanced Life Support (ALS) Project has taken measures to improve this situation by issuing documents that define ALS requirements, baseline assumptions, and reference missions. As a further step to capture and retain available knowledge and to facilitate system-level studies, various software tools are being developed. These include a database tool for storing, organizing, and updating technology parameters, modeling tools for evaluating time-average and dynamic system performance, and sizing tools for estimating overall system mass, volume, power, cooling, logistics, and crew time. This presentation describes ongoing work on the development and integration of these tools for life support systems analysis.

  17. Dynamic contrast-enhanced CT of head and neck tumors: perfusion measurements using a distributed-parameter tracer kinetic model. Initial results and comparison with deconvolution-based analysis

    NASA Astrophysics Data System (ADS)

    Bisdas, Sotirios; Konstantinou, George N.; Sherng Lee, Puor; Thng, Choon Hua; Wagenblast, Jens; Baghi, Mehran; San Koh, Tong

    2007-10-01

    The objective of this work was to evaluate the feasibility of a two-compartment distributed-parameter (DP) tracer kinetic model to generate functional images of several physiologic parameters from dynamic contrast-enhanced CT data obtained from patients with extracranial head and neck tumors and to compare the DP functional images to those obtained by deconvolution-based DCE-CT data analysis. We performed post-processing of DCE-CT studies obtained from 15 patients with benign and malignant head and neck tumors. We introduced a DP model of the impulse residue function for a capillary-tissue exchange unit, which accounts for the processes of convective transport and capillary-tissue exchange. The calculated parametric maps represented blood flow (F), intravascular blood volume (v1), extravascular extracellular volume (v2), vascular transit time (t1), permeability-surface area product (PS), transfer ratios k12 and k21, and the fraction of extracted tracer (E). Based on the same regions of interest (ROI) analysis, we calculated the tumor blood flow (BF), blood volume (BV) and mean transit time (MTT) by using a modified deconvolution-based analysis taking into account the extravasation of the contrast agent for PS imaging. We compared the corresponding values by using Bland-Altman plot analysis. We outlined 73 ROIs including tumor sites, lymph nodes and normal tissue. The Bland-Altman plot analysis revealed that the two methods showed an acceptable degree of agreement for blood flow, and, thus, can be used interchangeably for measuring this parameter. Slightly worse agreement was observed between v1 in the DP model and BV, but even here the two tracer kinetic analyses can be used interchangeably. Whether both techniques may be used interchangeably remained questionable for t1 and MTT, as well as for measurements of the PS values. The application of the proposed DP model is feasible in the clinical routine and it can be used interchangeably for measuring blood flow and vascular volume with the commercially available reference standard of the deconvolution-based approach. The lack of substantial agreement between the measurements of vascular transit time and permeability-surface area product may be attributed to the different tracer kinetic principles employed by both models and the detailed capillary tissue exchange physiological modeling of the DP technique.
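
    For orientation, the deconvolution-based reference approach estimates the tissue impulse response by deconvolving the tissue enhancement curve with the arterial input function; blood flow, blood volume and mean transit time then follow from that response. The sketch below uses a truncated-SVD deconvolution, a common choice for this step, with illustrative names and thresholds; it is not the vendor software or the authors' DP model.

```python
import numpy as np

def svd_perfusion(aif, tissue, dt, rel_threshold=0.2):
    """Deconvolution-based perfusion estimates from DCE curves.

    aif, tissue : arterial input and tissue enhancement curves (same sampling)
    dt          : sampling interval in seconds
    Assumes tissue(t) = BF * (aif convolved with residue function R)(t).
    Returns blood flow BF, blood volume BV and mean transit time MTT.
    """
    n = len(aif)
    # Lower-triangular convolution matrix built from the AIF.
    A = np.zeros((n, n))
    for i in range(n):
        A[i, : i + 1] = aif[i::-1] * dt

    # Truncated SVD regularises the ill-posed inversion.
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
    bf_times_r = Vt.T @ (s_inv * (U.T @ tissue))   # = BF * R(t)

    BF = bf_times_r.max()                      # flow (per unit tissue volume)
    MTT = np.trapz(bf_times_r, dx=dt) / BF     # area of R(t), since R(0) = 1
    BV = BF * MTT                              # central volume theorem
    return BF, BV, MTT
```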

  18. A new scoring function for top-down spectral deconvolution

    DOE PAGESBeta

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.

  19. A new scoring function for top-down spectral deconvolution

    SciTech Connect

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.

  20. Accelerator physics analysis with interactive tools

    SciTech Connect

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  1. Comparing Work Skills Analysis Tools. Project Report.

    ERIC Educational Resources Information Center

    Barker, Kathryn

    This document outlines the processes and outcomes of a research project conducted to review work skills analysis tools (products and/or services) that profile required job skills and/or assess individuals' acquired skills. The document begins with a brief literature review and discussion of pertinent terminology. Presented next is a list of

  2. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  3. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  4. From sensor networks to connected analysis tools

    NASA Astrophysics Data System (ADS)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the applications and web services developed so far as well as to be developed in the future.

  5. Possibilities of deconvolution of image sampling structures

    NASA Astrophysics Data System (ADS)

    Hozman, Jiri

    2006-03-01

    The paper deals with the influence of the 2D sampling process upon image quality. The Optical Transfer Function (OTF), which is closely related to the Point Spread Function (PSF) of optical and electro-optical imaging systems, can be regarded as an objective measure of their quality. The main goal was the implementation of direct and blind deconvolution methods in the MATLAB environment, in order to estimate these parameters and use them for computation of other characteristics, such as the Modulation Transfer Function (MTF) and the Phase Transfer Function (PTF). Relations between these functions are very useful in deriving the MTF for various geometrical shapes of elementary detectors of image sensors. This paper focuses on direct deconvolution by inverse and Wiener filtering, with special attention to blind deconvolution using Iterative Blind Deconvolution (IBD), Simulated Annealing (SA) and Blind Deconvolution by Genetic Algorithm (BDGA). The whole process has been modeled in MATLAB. A Graphical User Interface (GUI) was also developed for setting the deconvolution method parameters. The parameter PSNR was also used for comparison and evaluation. The image deconvolution method based on Blind Deconvolution by Genetic Algorithm seems to be very useful, especially from the point of view of computational requirements and results.
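
    The OTF/MTF/PTF relations mentioned above can be illustrated directly: the sketch below computes the OTF of a sampled 1-D PSF slice by Fourier transform, splits it into MTF and PTF, and multiplies in the |sinc| MTF of an ideal rectangular detector element. The numbers and names are assumptions for illustration; the MATLAB GUI of the paper is not reproduced.

```python
import numpy as np

def otf_from_psf(psf, sample_pitch):
    """Return spatial frequencies, MTF and PTF of a 1-D sampled PSF slice."""
    psf = psf / psf.sum()                          # unit-volume normalisation
    otf = np.fft.rfft(np.fft.ifftshift(psf))       # PSF assumed centred
    freqs = np.fft.rfftfreq(len(psf), d=sample_pitch)
    mtf = np.abs(otf)
    ptf = np.unwrap(np.angle(otf))
    return freqs, mtf, ptf

def rect_detector_mtf(freqs, width):
    """MTF of an ideal rectangular detector element of the given width (|sinc|)."""
    return np.abs(np.sinc(freqs * width))          # np.sinc is sin(pi x)/(pi x)

# Hypothetical usage: Gaussian blur sampled every 5 microns, 10-micron pixels.
x = (np.arange(256) - 128) * 5e-3                  # positions in mm
psf = np.exp(-0.5 * (x / 0.02) ** 2)
f, mtf, ptf = otf_from_psf(psf, sample_pitch=5e-3)
pixel_mtf = rect_detector_mtf(f, width=10e-3)
system_mtf = mtf * pixel_mtf                       # cascaded components multiply
```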

  6. Virtual tools for teaching electrocardiographic rhythm analysis.

    PubMed

    Criley, John Michael; Nelson, William P

    2006-01-01

    Electrocardiographic (ECG) rhythm analysis is inadequately taught in training programs, resulting in undertrained physicians reading a large percentage of the 40 million ECGs recorded annually. The effective use of simple tools (calipers, ruler, and magnifier) required for crucial measurements and comparisons of intervals requires considerable time for interactive instruction and is difficult to teach in the classroom. The ECGViewer (Blaufuss Medical Multimedia Laboratories, Palo Alto, Calif) program was developed using virtual tools, easily manipulated by computer mouse, that can be used to analyze archived scanned ECGs on computer screens and in classroom projection. Trainees manipulate the on-screen tools from their seats by wireless mouse while the instructor makes corrections with a second mouse, in clear view of the trainees. An on-screen ladder diagram may be constructed by the trainee and critiqued by the instructor. The ECGViewer program has been successfully used and well received by trainees at the medical school, residency, and subspecialty fellowship levels. PMID:16387064

  7. Fairing Separation Analysis Using SepTOOL

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Kosareo, Daniel N.

    2015-01-01

    This document describes the relevant equations programmed in spreadsheet software, SepTOOL, developed by ZIN Technologies, Inc. (ZIN) to determine the separation clearance between a launch vehicle payload fairing and remaining stages. The software uses closed form rigid body dynamic solutions of the vehicle in combination with flexible body dynamics of the fairing, which is obtained from flexible body dynamic analysis or from test data, and superimposes the two results to obtain minimum separation clearance for any given set of flight trajectory conditions. Using closed form solutions allows SepTOOL to perform separation calculations several orders of magnitude faster than numerical methods, which allows users to perform real-time parameter studies. Moreover, SepTOOL can optimize vehicle performance to minimize separation clearance. This tool can evaluate various shapes and sizes of fairings along with different vehicle configurations and trajectories. These geometries and parameters are entered through a user-friendly interface. Although the software was specifically developed for evaluating the separation clearance of launch vehicle payload fairings, separation dynamics of other launch vehicle components can be evaluated provided that aerodynamic loads acting on the vehicle during the separation event are negligible. This document describes the development of SepTOOL, providing the analytical procedure and theoretical equations; the implementation of these equations is not disclosed. Realistic examples are presented, and the results are verified with ADAMS (MSC Software Corporation) simulations. It should be noted that SepTOOL is a preliminary separation clearance assessment software for payload fairings and should not be used for final clearance analysis.

  8. Rock fracture characterization with GPR by means of deterministic deconvolution

    NASA Astrophysics Data System (ADS)

    Arosio, Diego

    2016-03-01

    In this work I address GPR characterization of rock fracture parameters, namely thickness and filling material. Rock fractures can generally be considered as thin beds, i.e., two interfaces whose separation is smaller than the resolution limit dictated by the Rayleigh criterion. Analysis of the amplitude of the thin bed response in the time domain might permit estimation of fracture features for arbitrarily thin beds, but it is difficult to achieve and could be applied only to favorable cases (i.e., when all factors affecting amplitude are identified and corrected for). Here I explore the possibility of estimating fracture thickness and filling in the frequency domain by means of GPR. After introducing some theoretical aspects of the thin bed response, I simulate GPR data on sandstone blocks with air- and water-filled fractures of known thickness. On the basis of some simplifying assumptions, I propose a 4-step procedure in which deterministic deconvolution is used to retrieve the magnitude and phase of the thin bed response in the selected frequency band. After the deconvolved curves are obtained, fracture thickness and filling are estimated by means of a fitting process, which is more sensitive to fracture thickness. Results are encouraging and suggest that GPR could be a fast and effective tool to determine fracture parameters in a non-destructive manner. Further GPR experiments in the lab are needed to test the proposed processing sequence and to validate the results obtained so far.
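
    Deterministic deconvolution of the kind mentioned above is commonly implemented as stabilized spectral division. The Python sketch below is a generic water-level version, not the author's 4-step procedure; the function name and the water-level fraction are assumptions, and no band selection is applied.

        import numpy as np

        def water_level_deconvolution(trace, wavelet, water_level=0.01):
            """Divide the trace spectrum by the source-wavelet spectrum,
            replacing small |W| values with a water-level floor."""
            n = len(trace)
            T = np.fft.rfft(trace, n)
            W = np.fft.rfft(wavelet, n)
            floor = water_level * np.abs(W).max()
            W_stab = np.where(np.abs(W) < floor,
                              floor * np.exp(1j * np.angle(W)), W)
            R = T / W_stab
            return np.abs(R), np.angle(R)   # magnitude and phase of the response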

  9. Vibration analysis as a predictive maintenance tool

    SciTech Connect

    Dischner, J.M.

    1995-09-01

    Vibration analysis is a powerful and effective tool for both predicting and isolating incipient fault conditions. It can assist in root cause failure analysis and can be used to establish maintenance procedures on a condition assessment basis rather than on a scheduled or calendar basis. Recent advances in technology not only allow new types of testing to be performed but, when integrated with other types of machine information, can lead to even greater insight and accuracy for the entire predictive maintenance program. Case studies and recent findings will be presented, along with a discussion of how vibration is used as an invaluable tool in the detection of defects in gearboxes and mill stands and in roll chatter detection and correction. Acceptable vibration criteria and cost-benefit summaries will be included.

  10. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers, which, when coupled with current CPU power, enable new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information, some global in character, while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

  11. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  13. Deconvolution of wellbore pressure and flow rate

    SciTech Connect

    Kuchuk, F.J.; Carter, R.G. (Langley Research Center); Ayestaran, L.

    1990-03-01

    Determination of the influence function of a well/reservoir system from the deconvolution of wellbore flow rate and pressure is presented. Deconvolution is fundamental and is particularly applicable to system identification. A variety of different deconvolution algorithms are presented. The simplest algorithm is a direct method that works well for data without measurement noise but that fails in the presence of even small amounts of noise. The authors show, however, that a modified algorithm that imposes constraints on the solution set works well, even with significant measurement errors.
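
    The contrast between unconstrained and constrained deconvolution noted above can be sketched in a few lines of Python. The example assumes the simplified convolution form dp = q * g (ignoring the rate-derivative formulation) and uses non-negativity as a stand-in constraint; it is illustrative only, not the authors' algorithm.

        import numpy as np
        from scipy.linalg import toeplitz
        from scipy.optimize import nnls

        def estimate_influence(q, dp, dt):
            """Deconvolve an influence function g from rate q and pressure drop dp."""
            Q = toeplitz(q, np.zeros_like(q)) * dt              # lower-triangular convolution matrix
            g_direct, *_ = np.linalg.lstsq(Q, dp, rcond=None)   # direct method; breaks down with noise
            g_constrained, _ = nnls(Q, dp)                      # simple constrained alternative
            return g_direct, g_constrained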

  14. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  15. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques should be applied, and to handle multiple users' tasks simultaneously. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, as well as plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).

  16. Fast positive deconvolution of hyperspectral images.

    PubMed

    Henrot, Simon; Soussen, Charles; Brie, David

    2013-02-01

    In this brief, we provide an efficient scheme for performing deconvolution of large hyperspectral images under a positivity constraint, while accounting for spatial and spectral smoothness of the data. PMID:22955906
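
    A generic way to impose a positivity constraint together with spatial smoothness is projected gradient descent. The sketch below is a simplified stand-in for the scheme summarized above (the spectral-smoothness term and the efficiency tricks needed for large hyperspectral cubes are omitted); the step size and penalty weight are arbitrary illustrative choices.

        import numpy as np
        from scipy.signal import fftconvolve

        def positive_deconvolution(y, psf, lam=0.1, step=0.1, n_iter=200):
            """Minimize ||psf*x - y||^2 + lam*||grad x||^2 subject to x >= 0."""
            x = np.clip(y, 0, None).astype(float)
            psf_flip = psf[::-1, ::-1]
            for _ in range(n_iter):
                resid = fftconvolve(x, psf, mode='same') - y
                grad = fftconvolve(resid, psf_flip, mode='same')
                lap = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
                       np.roll(x, 1, 1) + np.roll(x, -1, 1) - 4 * x)   # spatial Laplacian
                x = np.clip(x - step * (grad - lam * lap), 0, None)    # gradient step + projection
            return x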

  17. GIS-based hydrogeochemical analysis tools (QUIMET)

    NASA Astrophysics Data System (ADS)

    Velasco, V.; Tubau, I.; Vázquez-Suñé, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernàndez-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

    2014-09-01

    A software platform (QUIMET) was developed to improve the sorting, analysis, calculation, visualization, and interpretation of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The analysis tools cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schoeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows a complete statistical analysis of the data to be performed, including descriptive univariate and bivariate statistics, the latter including the generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.

  18. Minimum entropy deconvolution and blind equalisation

    NASA Technical Reports Server (NTRS)

    Satorius, E. H.; Mulligan, J. J.

    1992-01-01

    Relationships between minimum entropy deconvolution, developed primarily for geophysics applications, and blind equalization are pointed out. It is seen that a large class of existing blind equalization algorithms are directly related to the scale-invariant cost functions used in minimum entropy deconvolution. Thus the extensive analyses of these cost functions can be directly applied to blind equalization, including the important asymptotic results of Donoho.
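
    The scale-invariant cost functions mentioned above can be made concrete with the classical Wiggins-style minimum entropy deconvolution iteration, which maximizes a normalized fourth-moment (varimax-like) measure of the filter output. The sketch below is a textbook version under simplifying assumptions (edge effects ignored, a small ridge added for numerical stability), not the analysis in the record.

        import numpy as np
        from scipy.linalg import toeplitz, solve

        def med_filter(x, L=30, n_iter=30):
            """Return an FIR filter that sharpens x by maximizing a varimax-type norm."""
            r = np.correlate(x, x, mode='full')
            mid = len(x) - 1
            R = toeplitz(r[mid:mid + L])                 # input autocorrelation matrix
            R = R + 1e-6 * R[0, 0] * np.eye(L)           # small ridge for stability
            f = np.zeros(L)
            f[L // 2] = 1.0                              # start from a spike
            for _ in range(n_iter):
                y = np.convolve(x, f)                    # full convolution, length N+L-1
                b = np.array([np.dot(y[i:i + len(x)] ** 3, x) for i in range(L)])
                f = solve(R, b)
                f /= np.linalg.norm(f)                   # the cost is scale invariant
            return f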

  19. Integrated FDIR Analysis Tool for Space Applications

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni; Di Tommaso, Umberto

    2013-08-01

    The crucial role of health management in space applications has been the subject of many studies carried out by NASA and ESA and is held in high regard by Thales Alenia Space. The common objective is to improve reliability and availability of space systems. This paper will briefly illustrate the evolution of IDEHAS (IntegrateD Engineering Harness Avionics and Software), an advanced tool currently used in Thales Alenia Space - Italy in several space programs and recently enhanced to fully support FDIR (Fault Detection Isolation and Recovery) analysis. The FDIR analysis logic flow will be presented, emphasizing the improvements offered to Mission Support & Operations activities. Finally the benefits provided to the Company and a list of possible future enhancements will be given.

  20. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect

    Gary Casuccio; Michael Potter; Fred Schwerer; Dr. Richard J. Fruehan; Dr. Scott Story

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers Association) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled "Inclusion Analysis to Predict Casting Behavior" was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual use instrument). In summary, the ASCAT will significantly advance the tools of the industry and address an urgent and broadly recognized need of the steel industry. Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

  1. RSAT 2011: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Defrance, Matthieu; Medina-Rivera, Alejandra; Sand, Olivier; Herrmann, Carl; Thieffry, Denis; van Helden, Jacques

    2011-07-01

    RSAT (Regulatory Sequence Analysis Tools) comprises a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. Thirteen new programs have been added to the 30 described in the 2008 NAR Web Software Issue, including an automated sequence retrieval from EnsEMBL (retrieve-ensembl-seq), two novel motif discovery algorithms (oligo-diff and info-gibbs), a 100-times faster version of matrix-scan enabling the scanning of genome-scale sequence sets, and a series of facilities for random model generation and statistical evaluation (random-genome-fragments, random-motifs, random-sites, implant-sites, sequence-probability, permute-matrix). Our most recent work also focused on motif comparison (compare-matrices) and evaluation of motif quality (matrix-quality) by combining theoretical and empirical measures to assess the predictive capability of position-specific scoring matrices. To process large collections of peak sequences obtained from ChIP-seq or related technologies, RSAT provides a new program (peak-motifs) that combines several efficient motif discovery algorithms to predict transcription factor binding motifs, match them against motif databases and predict their binding sites. Availability (web site, stand-alone programs and SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services): http://rsat.ulb.ac.be/rsat/. PMID:21715389

  2. Simplified building energy analysis tool for architects

    NASA Astrophysics Data System (ADS)

    Chaisuparasmikul, Pongsak

    Energy Modeler is an energy software program designed to study the relative change in energy use (heating, cooling, and lighting loads) across different architectural design schemes. This research focuses on developing a tool to improve the energy efficiency of the built environment. The research studied the impact of different architectural design responses for two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects to perceive, quickly visualize, and compare the energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings are probably among the best contributions to environmental sustainability, since they reduce the depletion of the world's fossil fuels (oil, natural gas, coal, etc.). Architects, when they set about designing an environmentally responsive building for an owner or the public, often lack the energy-based information and design tools to tell them whether the building loads and energy consumption are responsive to the modifications they have made. Buildings are dynamic in nature and change over time, with many design variables involved. Architects need energy-based rules or tools to assist them in the design process. Energy-efficient design for sustainable solutions requires attention throughout the design process and is closely tied to architectural decisions. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form, and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative aim is to make these tools applicable to the earliest stage of design, where more informed analysis of possible alternatives could yield the most benefit and the greatest cost savings, both economic and environmental. This is where computer modeling and simulation can really lead to better, more energy-efficient buildings. Both apply to the internal environment and human comfort, and to the environmental impact of the surroundings.

  3. Multi-Mission Power Analysis Tool

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2011-01-01

    Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.

  4. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
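
    The weighted attack-graph search described above maps naturally onto standard shortest-path algorithms. The sketch below is a hypothetical illustration (node names, weights, and the use of networkx are assumptions, not part of the patented method): edge weights stand for attacker effort, and the lowest-cost path approximates a high-risk attack path.

        import networkx as nx

        G = nx.DiGraph()
        # edges: (attack state, next state, attacker effort)
        G.add_weighted_edges_from([
            ("internet", "web_server", 2.0),
            ("internet", "vpn_gateway", 5.0),
            ("web_server", "app_server", 3.0),
            ("vpn_gateway", "app_server", 1.0),
            ("app_server", "database", 4.0),
        ])
        path = nx.dijkstra_path(G, "internet", "database", weight="weight")
        cost = nx.dijkstra_path_length(G, "internet", "database", weight="weight")
        print("highest-risk (lowest-effort) path:", path, "effort:", cost)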

  5. Blind source deconvolution for deep Earth seismology

    NASA Astrophysics Data System (ADS)

    Stefan, W.; Renaut, R.; Garnero, E. J.; Lay, T.

    2007-12-01

    We present an approach to automatically estimate an empirical source characterization of deep earthquakes recorded teleseismically and subsequently remove the source from the recordings by applying regularized deconvolution. A principal goal of this work is to effectively deblur the seismograms, resulting in more impulsive and narrower pulses and permitting better constraints in high-resolution waveform analyses. Our method consists of two stages: (1) we first estimate the empirical source by automatically registering traces to their first principal component, with a weighting scheme based on their deviation from this shape, and we then use this shape as an estimate of the earthquake source. (2) We compare different deconvolution techniques to remove the source characteristic from the trace. In particular, Total Variation (TV) regularized deconvolution is used, which exploits the fact that most natural signals have an underlying sparseness in an appropriate basis, in this case the impulsive onsets of seismic arrivals. We show several examples of deep-focus Fiji-Tonga region earthquakes for the phases S and ScS, comparing source responses for the separate phases. TV deconvolution is compared to water-level deconvolution, Tikhonov deconvolution, and L1-norm deconvolution, for both data and synthetics. This approach significantly improves our ability to study subtle waveform features that are commonly masked by either noise or the earthquake source. Eliminating source complexities improves our ability to resolve deep mantle triplications and waveform complexities associated with possible double crossings of the post-perovskite phase transition, as well as increasing stability in waveform analyses used for deep mantle anisotropy measurements.
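
    Of the deconvolution variants compared above, damped (Tikhonov-style) spectral division is the simplest to write down. The Python sketch below removes an empirical source wavelet from a single trace in the frequency domain; the damping fraction is an illustrative assumption, and the TV-regularized approach favoured in the record is not reproduced here.

        import numpy as np

        def damped_spectral_division(trace, source, alpha=0.05):
            """Tikhonov-style deconvolution of an empirical source from a seismogram."""
            n = len(trace)
            D = np.fft.rfft(trace, n)
            S = np.fft.rfft(source, n)
            damp = alpha * np.abs(S).max() ** 2          # damping relative to peak power
            R = np.conj(S) * D / (np.abs(S) ** 2 + damp)
            return np.fft.irfft(R, n)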

  6. Airborne LIDAR Data Processing and Analysis Tools

    NASA Astrophysics Data System (ADS)

    Zhang, K.

    2007-12-01

    Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers with high quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute LIDAR data. However, LIDAR systems collect voluminous, irregularly spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageably sized tiles, typically 4 square kilometers in area; 3) filtering point data to separate measurements of the ground from those of non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster-based analysis; and 5) converting the gridded data into standard GIS import formats. ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
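
    Step 4 above, interpolating irregular ground returns onto a regular grid, can be prototyped with SciPy; ALDPAT itself is a C++ suite, so the snippet below is only an independent illustration using synthetic points and an assumed 1 m grid spacing.

        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(0)
        x, y = rng.uniform(0, 100, 5000), rng.uniform(0, 100, 5000)   # irregular ground points
        z = 0.05 * x + 2.0 * np.sin(y / 10.0)                         # synthetic elevations

        xi = np.arange(0.0, 100.0, 1.0)                               # 1 m raster grid
        yi = np.arange(0.0, 100.0, 1.0)
        XI, YI = np.meshgrid(xi, yi)
        dem = griddata((x, y), z, (XI, YI), method='linear')          # gridded elevation model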

  7. Inverse problems and image deconvolution

    NASA Astrophysics Data System (ADS)

    Lanteri, H.; Theys, C.

    Starting from a physical transformation process described by a Fredholm integral equation of the first kind, we first recall the main difficulties appearing in linear inverse problems, in the continuous case as well as in the discrete case. We describe several situations corresponding to various properties of the kernel of the integral equation. The need to take into account properties of the solution not contained in the model is then put in evidence. This leads to the regularization principles, for which the classical point of view as well as the Bayesian interpretation are briefly recalled. We then focus on the problem of deconvolution, especially as applied to astronomical images. A complete model of image formation is described in Section 4, and a general method allowing image restoration algorithms to be derived, the Split Gradient Method (SGM), is detailed in Section 5. We show in Section 6 that when this method is applied to likelihood maximization problems with a positivity constraint, the ISRA algorithm is recovered in the case of pure additive Gaussian noise, while in the case of pure Poisson noise, the well-known EM (Richardson-Lucy) algorithm is easily obtained. The method is then applied to the more realistic situation typical of CCD detectors, Poisson photo-conversion noise plus Gaussian readout noise, and to a new situation corresponding to data acquired with a Low Light Level CCD. Some numerical results are exhibited in Section 7 for these last two cases. Finally, we show how all these algorithms can be regularized in the context of the SGM, and we give a general conclusion.
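
    For the pure Poisson-noise case mentioned above, the EM/Richardson-Lucy update has a compact multiplicative form. The sketch below is a plain, unregularized version, not the Split Gradient Method itself; the flat initialization and the small epsilon guard are assumptions.

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(y, psf, n_iter=50, eps=1e-12):
            """Multiplicative EM/Richardson-Lucy iterations for Poisson noise."""
            x = np.full(y.shape, float(y.mean()))      # flat, positive starting image
            psf_flip = psf[::-1, ::-1]                 # adjoint of the blur operator
            for _ in range(n_iter):
                blurred = fftconvolve(x, psf, mode='same')
                ratio = y / (blurred + eps)
                x = x * fftconvolve(ratio, psf_flip, mode='same')
            return x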

  8. Deconvolution of immittance data: some old and new methods

    SciTech Connect

    Tuncer, Enis; Macdonald, Ross J.

    2007-01-01

    The background and history of various deconvolution approaches are briefly summarized; different methods are compared; and available computational resources are described. These underutilized data analysis methods are valuable in both electrochemistry and immittance spectroscopy areas, and freely available computer programs are cited that provide an automatic test of the appropriateness of Kronig-Kramers transforms, a powerful nonlinear-least-squares inversion method, and a new Monte-Carlo inversion method. The important distinction, usually ignored, between discrete-point distributions and continuous ones is emphasized, and both recent parametric and non-parametric deconvolution/inversion procedures for frequency-response data are discussed and compared. Information missing in a recent parametric measurement-model deconvolution approach is pointed out and remedied, and its priority evaluated. Comparisons are presented between the standard parametric least squares inversion method and a new non-parametric Monte Carlo one that allows complicated composite distributions of relaxation times (DRT) to be accurately estimated without the uncertainty present with regularization methods. Also, detailed Monte-Carlo DRT estimates for the supercooled liquid 0.4Ca(NO3)2-0.6KNO3 (CKN) at 350 K are compared with appropriate frequency-response-model fit results. These composite models were derived from stretched-exponential Kohlrausch temporal response with the inclusion of either of two different series electrode-polarization functions.

  9. Built Environment Analysis Tool: April 2013

    SciTech Connect

    Porter, C.

    2013-05-01

    This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  10. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    SciTech Connect

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

  11. A blind deconvolution approach to ultrasound imaging.

    PubMed

    Yu, Chengpu; Zhang, Cishen; Xie, Lihua

    2012-02-01

    In this paper, a single-input multiple-output (SIMO) channel model is introduced for the deconvolution process of ultrasound imaging; the ultrasound pulse is the single system input and tissue reflectivity functions are the channel impulse responses. A sparse regularized blind deconvolution model is developed by projecting the tissue reflectivity functions onto the null space of a cross-relation matrix and projecting the ultrasound pulse onto a low-resolution space. In this way, the computational load is greatly reduced and the estimation accuracy can be improved because the proposed deconvolution model contains fewer variables. Subsequently, an alternating direction method of multipliers (ADMM) algorithm is introduced to efficiently solve the proposed blind deconvolution problem. Finally, the performance of the proposed blind deconvolution method is examined using both computer-simulated data and practical in vitro and in vivo data. The results show a great improvement in the quality of ultrasound images in terms of signal-to-noise ratio and spatial resolution gain. PMID:24626035
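
    The cross-relation idea underlying the SIMO model above can be demonstrated for two channels: since y1*h2 = y2*h1, stacking the two convolution matrices yields a matrix whose null space contains the channel responses (up to a common scale). The sketch below is a generic illustration of that step only, not the sparse-regularized ADMM formulation of the paper.

        import numpy as np

        def cross_relation_channels(y1, y2, L):
            """Estimate two channel impulse responses of length L from outputs alone."""
            N = len(y1)

            def conv_matrix(y):
                M = np.zeros((N + L - 1, L))
                for i in range(L):
                    M[i:i + N, i] = y
                return M

            A = np.hstack([conv_matrix(y2), -conv_matrix(y1)])   # A @ [h1; h2] = 0
            _, _, Vt = np.linalg.svd(A)
            h = Vt[-1]                                           # smallest singular vector
            return h[:L], h[L:]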

  12. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose versus depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.

  13. ISHM Decision Analysis Tool: Operations Concept

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  14. Solar Array Verification Analysis Tool (SAVANT) Developed

    NASA Technical Reports Server (NTRS)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  15. Comparative analysis of pharmacophore screening tools.

    PubMed

    Sanders, Marijn P A; Barbosa, Arménio J M; Zarzycka, Barbara; Nicolaes, Gerry A F; Klomp, Jan P G; de Vlieg, Jacob; Del Rio, Alberto

    2012-06-25

    The pharmacophore concept is of central importance in computer-aided drug design (CADD), mainly because of its successful application in medicinal chemistry and, in particular, high-throughput virtual screening (HTVS). The simplicity of the pharmacophore definition enables the complexity of molecular interactions between ligand and receptor to be reduced to a handful of features. With many pharmacophore screening software packages available, it is of the utmost interest to explore the behavior of these tools when applied to different biological systems. In this work, we present a comparative analysis of eight pharmacophore screening algorithms (Catalyst, Unity, LigandScout, Phase, Pharao, MOE, Pharmer, and POT) for their use in typical HTVS campaigns against four different biological targets, using default settings. The results presented here show how the performance of each pharmacophore screening tool might be specifically related to factors such as the characteristics of the binding pocket, the use of specific pharmacophore features, and the use of these techniques in specific steps/contexts of the drug discovery pipeline. Algorithms with rmsd-based scoring functions are able to predict more compound poses correctly than overlay-based scoring functions. However, the ratio of correctly to incorrectly predicted compound poses is better for overlay-based scoring functions, which also ensure better performance in compound library enrichment. While these observations can be used to choose the most appropriate class of algorithm for specific virtual screening projects, we noted that pharmacophore algorithms are often equally good, and in this respect we also analyzed how pharmacophore algorithms can be combined in order to increase the success of hit compound identification. This study provides a valuable benchmark set for further developments in the field of pharmacophore search algorithms, e.g., by using pose predictions and compound library enrichment criteria. PMID:22646988

  16. HisTOOLogy: an open-source tool for quantitative analysis of histological sections.

    PubMed

    Magliaro, C; Tirella, A; Mattei, G; Pirone, A; Ahluwalia, A

    2015-12-01

    HisTOOLogy is an open-source software for the quantification of digital colour images of histological sections. The simple graphical user interface enables both expert and non-expert users to rapidly extract useful information from stained tissue sections. The software's main feature is a generalizable colour separation algorithm based on k-means clustering which accurately and reproducibly returns the amount of colour per unit area for any stain, thus allowing the quantification of tissue components. Here we describe HisTOOLogy's algorithms and graphical user interface structure, showing how it can be used to separate different dye colours in several classical stains. In addition, to demonstrate how the tool can be employed to obtain quantitative information on biological tissues, the effect of different hepatic tissue decellularization protocols on cell removal and matrix preservation was assessed through image analysis using HisTOOLogy and compared with conventional DNA and total protein content assays. HisTOOLogy's performance was also compared with ImageJ's colour deconvolution plug-in, demonstrating its advantages in terms of ease of use and speed of colour separation. PMID:26258893
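
    The core operation described above, clustering pixel colours and reporting the amount of each colour per unit area, can be approximated in a few lines of scikit-learn. This is a generic stand-in, not HisTOOLogy's own algorithm; the number of stains and the function name are assumptions.

        import numpy as np
        from sklearn.cluster import KMeans

        def stain_area_fractions(rgb_image, n_stains=3):
            """Cluster pixels by colour and return cluster centres and area fractions."""
            pixels = rgb_image.reshape(-1, 3).astype(float)
            km = KMeans(n_clusters=n_stains, n_init=10, random_state=0).fit(pixels)
            counts = np.bincount(km.labels_, minlength=n_stains)
            return km.cluster_centers_, counts / counts.sum()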

  17. PyRAT - python radiography analysis tool (u)

    SciTech Connect

    Temple, Brian A; Buescher, Kevin L; Armstrong, Jerawan C

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on Linux and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed-variable optimization tool to perform the optimization.

  18. Parallelization of a blind deconvolution algorithm

    NASA Astrophysics Data System (ADS)

    Matson, Charles L.; Borelli, Kathy J.

    2006-09-01

    Often it is of interest to deblur imagery in order to obtain higher-resolution images. Deblurring requires knowledge of the blurring function, information that is often not available separately from the blurred imagery. Blind deconvolution algorithms overcome this problem by jointly estimating both the high-resolution image and the blurring function from the blurred imagery. Because blind deconvolution algorithms are iterative in nature, they can take minutes to days to deblur an image, depending on how many frames of data are used for the deblurring and on the platforms on which the algorithms are executed. Here we present our progress in parallelizing a blind deconvolution algorithm to increase its execution speed. This progress includes sub-frame parallelization and a code structure that is not specialized to a specific computer hardware architecture.

  19. Deconvolution of Thomson scattering temperature profiles

    SciTech Connect

    Scannell, R.; Beurskens, M.; Carolan, P. G.; Kirk, A.; Walsh, M.; Osborne, T. H.

    2011-05-15

    Deconvolution of Thomson scattering (TS) profiles is required when the gradient length of the electron temperature (T_e) or density (n_e) is comparable to the instrument function length (Δ_R). The most correct method of deconvolution to obtain the underlying T_e and n_e profiles is by consideration of the scattered signals. However, deconvolution at the scattered-signal level is complex, since it requires knowledge of all spectral and absolute calibration data. In this paper a simple technique is presented in which only knowledge of the instrument function I(r) and the measured profiles, T_e,observed(r) and n_e,observed(r), is required to obtain the underlying T_e(r) and n_e(r). This method is appropriate for most TS systems and is particularly important where high spatial sampling is obtained relative to Δ_R.

  20. Increasing axial resolution of 3D data sets using deconvolution algorithms.

    PubMed

    Topor, P; Zimanyi, M; Mateasik, A

    2011-09-01

    Deconvolution algorithms are tools for the restoration of data degraded by blur and noise. An incorporation of regularization functions into the iterative form of reconstruction algorithms can improve the restoration performance and characteristics (e.g. noise and artefact handling). In this study, algorithms based on Richardson-Lucy deconvolution algorithm are tested. The ability of these algorithms to improve axial resolution of three-dimensional data sets is evaluated on model synthetic data. Finally, unregularized Richardson-Lucy algorithm is selected for the evaluation and reconstruction of three-dimensional chromosomal data sets of Drosophila melanogaster. Problems concerning the reconstruction process are discussed and further improvements are proposed. PMID:21599665

  1. Space variant deconvolution for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Prashanth, R.; Bhattacharya, Shanti

    2011-12-01

    We present a method for mitigating space variant blur occurring in the images acquired using Optical coherence tomography (OCT). The effect of Gaussian beam divergence on the image resolution is analyzed mathematically to develop space dependent two dimensional point spread functions that define the blurring kernel. Two standard deconvolution algorithms are used to deblur the images using the space dependent point spread functions. We show that the deconvolution method is effective in improving the transverse resolution of cross sectional OCT images at regions up to several times as deep as the confocal region of the Gaussian beam.

  2. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    SciTech Connect

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  3. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  4. HANSIS software tool for the automated analysis of HOLZ lines.

    PubMed

    Holec, D; Sridhara Rao, D V; Humphreys, C J

    2009-06-01

    A software tool, named as HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between the HOLZ intersections can be measured and the data can be presented graphically with a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns. PMID:19375228

  5. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.

  6. Knowledge base navigator facilitating regional analysis inter-tool communication.

    SciTech Connect

    Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

    2004-08-01

    To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).

  7. Histogram deconvolution - An aid to automated classifiers

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  8. Calibration of Wide-Field Deconvolution Microscopy for Quantitative Fluorescence Imaging

    PubMed Central

    Lee, Ji-Sook; Wee, Tse-Luen (Erika); Brown, Claire M.

    2014-01-01

    Deconvolution enhances contrast in fluorescence microscopy images, especially in low-contrast, high-background wide-field microscope images, improving characterization of features within the sample. Deconvolution can also be combined with other imaging modalities, such as confocal microscopy, and most software programs seek to improve resolution as well as contrast. Quantitative image analyses require instrument calibration and with deconvolution, necessitate that this process itself preserves the relative quantitative relationships between fluorescence intensities. To ensure that the quantitative nature of the data remains unaltered, deconvolution algorithms need to be tested thoroughly. This study investigated whether the deconvolution algorithms in AutoQuant X3 preserve relative quantitative intensity data. InSpeck Green calibration microspheres were prepared for imaging, z-stacks were collected using a wide-field microscope, and the images were deconvolved using the iterative deconvolution algorithms with default settings. Afterwards, the mean intensities and volumes of microspheres in the original and the deconvolved images were measured. Deconvolved data sets showed higher average microsphere intensities and smaller volumes than the original wide-field data sets. In original and deconvolved data sets, intensity means showed linear relationships with the relative microsphere intensities given by the manufacturer. Importantly, upon normalization, the trend lines were found to have similar slopes. In original and deconvolved images, the volumes of the microspheres were quite uniform for all relative microsphere intensities. We were able to show that AutoQuant X3 deconvolution software data are quantitative. In general, the protocol presented can be used to calibrate any fluorescence microscope or image processing and analysis procedure. PMID:24688321

  9. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  10. Friction analysis between tool and chip

    NASA Astrophysics Data System (ADS)

    Wang, Min; Xu, Binshi; Zhang, Jiaying; Dong, Shiyun

    2010-12-01

    Elastic-plastic mechanics is applied to analyze the friction between the tool and the chip. Based on slip-line field theory, a series of theoretical formulas and the friction coefficient between the tool and the chip are derived, so that the cutting process can be investigated. Using the orthogonal cutting model and Mohr's circle of stress, the cutting mechanism of the cladding and the surface integrity of the machined part can be studied.

  11. TAFFYS: An Integrated Tool for Comprehensive Analysis of Genomic Aberrations in Tumor Samples

    PubMed Central

    Feng, Huanqing; Wang, Minghui

    2015-01-01

    Background Tumor single nucleotide polymorphism (SNP) array is a common platform for investigating cancer genomic aberrations and the functionally important altered genes. Original SNP array signals are usually corrupted by noise and need to be de-convoluted into an absolute copy number profile by analytical methods. Unfortunately, in contrast with the popularity of tumor Affymetrix SNP arrays, methods that are specifically designed for this platform are still limited. The complicated characteristics of the noise in the signals are one of the difficulties in dissecting tumor Affymetrix SNP array data, as they inevitably blur the distinction between aberrations and create an obstacle for copy number aberration (CNA) identification. Results We propose a tool named TAFFYS for comprehensive analysis of tumor Affymetrix SNP array data. TAFFYS introduces a wavelet-based de-noising approach and a copy number-specific signal variance model for suppressing and modelling the noise in the signals. A hidden Markov model is then employed for copy number inference. Finally, using the absolute copy number profile, the statistical significance of each aberration region is calculated for different aberration types, including amplification, deletion and loss of heterozygosity (LOH). The results show that the copy number-specific variance model and the wavelet de-noising algorithm fit the Affymetrix SNP array signals well, leading to more accurate estimates for diluted tumor samples (even with only 30% cancer cells) than other existing methods. Examinations also demonstrate good compatibility and extensibility across different Affymetrix SNP array platforms. Application to 35 breast tumor samples shows that TAFFYS can automatically dissect the tumor samples and reveal statistically significant aberration regions where cancer-related genes are located. Conclusions TAFFYS provides an efficient and convenient tool for identifying copy number alterations and allelic imbalance and for assessing recurrent aberrations in tumor Affymetrix SNP array data. PMID:26111017
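
    As an illustration of the wavelet de-noising step described above (not TAFFYS itself), a soft-thresholding sketch using PyWavelets on a synthetic copy-number signal; the wavelet choice, decomposition level and threshold rule are assumptions, not taken from the paper:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(log_ratio, wavelet="db4", level=4):
    """Soft-threshold wavelet de-noising of a 1D copy-number signal (illustrative)."""
    coeffs = pywt.wavedec(log_ratio, wavelet, level=level)
    # Robust noise estimate (MAD) from the finest detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(log_ratio)))  # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(log_ratio)]

# Synthetic example: a noisy profile with one gained segment.
rng = np.random.default_rng(0)
signal = np.r_[np.zeros(500), 0.58 * np.ones(300), np.zeros(400)]
smooth = wavelet_denoise(signal + rng.normal(scale=0.3, size=signal.size))
```

    In TAFFYS the de-noised signal then feeds a hidden Markov model for copy number inference; that step is not reproduced here.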

  12. Software Tools for High-Throughput Analysis and Archiving of Immunohistochemistry Staining Data Obtained with Tissue Microarrays

    PubMed Central

    Liu, Chih Long; Prapong, Wijan; Natkunam, Yasodha; Alizadeh, Ash; Montgomery, Kelli; Gilks, C. Blake; van de Rijn, Matt

    2002-01-01

    The creation of tissue microarrays (TMAs) allows for the rapid immunohistochemical analysis of thousands of tissue samples, with numerous different antibodies per sample. This technical development has created a need for tools to aid in the analysis and archival storage of the large amounts of data generated. We have developed a comprehensive system for high-throughput analysis and storage of TMA immunostaining data, using a combination of commercially available systems and novel software applications developed in our laboratory specifically for this purpose. Staining results are recorded directly into an Excel worksheet and are reformatted by a novel program (TMA-Deconvoluter) into a format suitable for hierarchical clustering analysis or other statistical analysis. Hierarchical clustering analysis is a powerful means of assessing relatedness within groups of tumors, based on their immunostaining with a panel of antibodies. Other analyses, such as generation of survival curves, construction of Cox regression models, or assessment of intra- or interobserver variation, can also be done readily on the reformatted data. Finally, the immunoprofile of a specific case can be rapidly retrieved from the archives and reviewed through the use of Stainfinder, a novel web-based program that creates a direct link between the clustered data and a digital image database. An on-line demonstration of this system is available at http://genome-www.stanford.edu/TMA/explore.shtml. PMID:12414504
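
    A sketch of the downstream clustering step on a reformatted staining matrix, assuming hypothetical ordinal scores (0 = negative, 1 = weak, 2 = strong); this illustrates the general approach, not the TMA-Deconvoluter or Stainfinder code:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical matrix: rows = tumor cases, columns = antibodies, ordinal scores.
rng = np.random.default_rng(1)
scores = rng.integers(0, 3, size=(30, 8)).astype(float)

# Average-linkage hierarchical clustering on pairwise Euclidean distances.
Z = linkage(pdist(scores, metric="euclidean"), method="average")
groups = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 groups
print(groups)
```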

  13. Semi-blind nonstationary deconvolution: Joint reflectivity and Q estimation

    NASA Astrophysics Data System (ADS)

    Gholami, Ali

    2015-06-01

    Source signature deconvolution and attenuation compensation, or inverse quality factor (Q) filtering, are two challenging problems in seismic data analysis that are used to extend the temporal bandwidth of the data. Separate estimates of the wavelet and, especially, of the Earth Q model are by themselves problematic and add further uncertainties to inverse problems that are already clearly ill-conditioned. The two problems are formulated in the framework of polynomial extrapolation and a closed-form solution is provided based on Lagrange interpolation. Analysis of the stability issue shows that the errors in the estimated results grow exponentially with both the problem size N and the inverse of Q. In order to circumvent both the instability and the uncertainty of the Q model, these problems are addressed in a unified formulation as a semi-blind nonstationary deconvolution (SeND) that decomposes the observed trace into the least number of nonstationary wavelets selected from a dictionary via a basis pursuit algorithm. The dictionary is constructed from the known source wavelet with different propagation times, each attenuated with a range of possible Q values. Using Horner's rule, an efficient algorithm is also provided for applying the dictionary and its adjoint. SeND is an extension of conventional sparse spike deconvolution to its nonstationary form, which provides the reflectivity and Q models simultaneously without requiring a priori Q information. In tests on numerical data for which the wavelet and attenuation mechanism are both known, SeND estimates both the original reflectivity and the Q model with higher accuracy than conventional spectral ratio techniques. Application of the algorithm to field data indicates a substantial improvement in temporal resolution of a seismic record.
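
    The dictionary idea can be sketched under simplifying assumptions: a Ricker source wavelet, constant-Q attenuation applied as exp(-pi f t / Q) in the frequency domain, and an l1 solver standing in for the paper's basis pursuit; all parameter values are illustrative only:

```python
import numpy as np
from sklearn.linear_model import Lasso  # l1 stand-in for basis pursuit

dt, n = 0.002, 500                       # 2 ms sampling, 1 s trace
t = np.arange(n) * dt
freqs = np.fft.rfftfreq(n, dt)

def ricker(f0=30.0):
    tw = t - 0.1
    return (1 - 2 * (np.pi * f0 * tw) ** 2) * np.exp(-(np.pi * f0 * tw) ** 2)

def attenuated_atom(shift, Q, wavelet):
    """Delay the wavelet by `shift` samples and apply exp(-pi f tau / Q)."""
    spec = np.fft.rfft(np.roll(wavelet, shift))
    return np.fft.irfft(spec * np.exp(-np.pi * freqs * shift * dt / Q), n)

wavelet = ricker()
shifts = np.arange(0, n, 2)              # candidate reflection times
q_values = [30, 60, 120]                 # candidate Q values
D = np.column_stack([attenuated_atom(s, q, wavelet)
                     for q in q_values for s in shifts])

# Synthetic trace: two reflectors with Q = 60 plus noise.
trace = attenuated_atom(120, 60, wavelet) - 0.5 * attenuated_atom(300, 60, wavelet)
trace += np.random.default_rng(2).normal(scale=0.01, size=n)

model = Lasso(alpha=0.005, fit_intercept=False, max_iter=5000).fit(D, trace)
coeffs = model.coef_.reshape(len(q_values), len(shifts))
# Nonzero entries jointly indicate reflection time and the best-fitting Q.
```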

  14. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
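
    The record's solver is derived in spherical coordinates; as a generic illustration of the leapfrog update structure used by FDTD, here is a minimal one-dimensional Cartesian Yee scheme with a Gaussian (UWB-like) soft source, with arbitrary grid sizes and constants:

```python
import numpy as np

c0 = 3.0e8                               # speed of light, m/s
eps0, mu0 = 8.854e-12, 4e-7 * np.pi
nz, nt, dz = 400, 800, 1e-3
dt = dz / (2 * c0)                       # satisfies the Courant limit

ez = np.zeros(nz)                        # electric field on integer nodes
hy = np.zeros(nz - 1)                    # magnetic field on the staggered grid

for n in range(nt):
    hy += dt / (mu0 * dz) * (ez[1:] - ez[:-1])          # update H from curl E
    ez[1:-1] += dt / (eps0 * dz) * (hy[1:] - hy[:-1])   # update E from curl H
    ez[50] += np.exp(-((n - 60) / 15.0) ** 2)           # Gaussian pulse soft source
```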

  15. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  16. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  17. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  18. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their

  19. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

  20. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own

  1. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis. Python and Java are used for integration. Results are presented and compared with the results from previous studies.

  2. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modeling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic.
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A minimum of 4Mb of free RAM is highly recommended. The UNIX version of FEAT includes both FEAT v3.6 for the Macintosh and XFEAT. XFEAT is written in C-language for Sun series workstations running SunOS, SGI workstations running IRIX, DECstations running ULTRIX, and Intergraph workstations running CLIX version 6. It requires the MIT X Window System, Version 11 Revision 4, with OSF/Motif 1.1.3, and 16Mb of RAM. The standard distribution medium for FEAT 3.6 (Macintosh version) is a set of three 3.5 inch Macintosh format diskettes. The standard distribution package for the UNIX version includes the three FEAT 3.6 Macintosh diskettes plus a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format which contains XFEAT. Alternate distribution media and formats for XFEAT are available upon request. FEAT has been under development since 1990. Both FEAT v3.6 for the Macintosh and XFEAT v3.5 were released in 1993.
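
    The propagation queries described above can be illustrated with a toy digraph, where an edge u -> v means "failure of u propagates to v"; node names are invented and the example uses reachability directly rather than FEAT's precomputed transitive closure:

```python
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("power_bus", "pump_A"), ("power_bus", "pump_B"),
    ("pump_A", "coolant_loop"), ("pump_B", "coolant_loop"),
    ("coolant_loop", "heat_exchanger"), ("heat_exchanger", "avionics_bay"),
])

# Effects of a failure set: everything reachable from the failed nodes.
failed = {"power_bus"}
effects = set().union(*(nx.descendants(g, f) for f in failed))

# Possible causes of an observed failure: everything that can reach it.
causes = nx.ancestors(g, "avionics_bay")
print("effects:", effects)
print("possible causes:", causes)
```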

  3. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A

  4. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

  5. Millennial scale system impulse response of polar climates - deconvolution results between δ18O records from Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Reischmann, E.; Yang, X.; Rial, J. A.

    2013-12-01

    Deconvolution has long been used in science to recover real input given a system's impulse response and output. In this study, we applied spectral division deconvolution to selected polar δ18O time series to investigate the possible relationship between the climates of the Polar Regions, i.e., the equivalent of a climate system's 'impulse response.' While the records may be the result of nonlinear processes, deconvolution remains an appropriate tool because the two polar climates are synchronized, forming a Hilbert transform pair. In order to compare records, the age models of three Greenland and four Antarctic records have been matched via a Monte Carlo method using the methane-matched pair GRIP and BYRD as a basis for the calculations. For all twelve polar pairs, various deconvolution schemes (Wiener, damped least squares, Tikhonov, Kalman filter) give consistent, quasi-periodic impulse responses of the system. Multitaper analysis reveals strong, millennial-scale, quasi-periodic oscillations in these system responses with a range of 2,500 to 1,000 years. These are not symmetric, as the transfer function from north to south differs from that of south to north. However, the difference is systematic and occurs in the predominant period of the deconvolved signals. Specifically, the north-to-south transfer function is generally of longer period than the south-to-north transfer function. High-amplitude power peaks at 5.0 ky to 1.7 ky characterize the former, while the latter contains peaks at mostly short periods, in the range of 2.5 ky to 1.0 ky. Consistent with many observations, the deconvolved, quasi-periodic transfer functions share the predominant periodicities found in the data, some of which are likely related to solar forcing (2.5-1.0 ky), while some are probably indicative of the internal oscillations of the climate system (1.6-1.4 ky). The approximately 1.5 ky transfer function may represent the internal periodicity of the system, perhaps even related to the periodicity of the thermohaline circulation (THC). Simplified models of the polar climate fluctuations are shown to support these findings.
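
    A minimal sketch of spectral division deconvolution with water-level regularization, assuming two age-matched, evenly resampled series held in arrays `north` and `south` (not provided here); the water-level fraction is an arbitrary choice, and this is only one of the schemes (Wiener, damped least squares, Tikhonov, Kalman) mentioned above:

```python
import numpy as np

def spectral_division(output, inp, water_level=0.05):
    """Estimate h such that output ~ h convolved with inp (illustrative sketch)."""
    n = len(output)
    O, I = np.fft.rfft(output), np.fft.rfft(inp)
    denom = np.abs(I) ** 2
    denom = np.maximum(denom, water_level * denom.max())  # stabilize the division
    return np.fft.irfft(O * np.conj(I) / denom, n)

# Hypothetical usage with two matched records:
# h_ns = spectral_division(south, north)   # north-to-south transfer function
# h_sn = spectral_division(north, south)   # south-to-north transfer function
```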

  6. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  7. Quantifying mineral abundances of complex mixtures by coupling spectral deconvolution of SWIR spectra (2.1-2.4 μm) and regression tree analysis

    USGS Publications Warehouse

    Mulder, V.L.; Plotze, Michael; de Bruin, Sytze; Schaepman, Michael E.; Mavris, C.; Kokaly, Raymond F.; Egli, Markus

    2013-01-01

    This paper presents a methodology for assessing mineral abundances of mixtures having more than two constituents using absorption features in the 2.1-2.4 μm wavelength region. In the first step, the absorption behaviour of mineral mixtures is parameterised by exponential Gaussian optimisation. Next, mineral abundances are predicted by regression tree analysis using these parameters as inputs. The approach is demonstrated on a range of prepared samples with known abundances of kaolinite, dioctahedral mica, smectite, calcite and quartz and on a set of field samples from Morocco. The latter contained varying quantities of other minerals, some of which did not have diagnostic absorption features in the 2.1-2.4 μm region. Cross validation showed that the prepared samples of kaolinite, dioctahedral mica, smectite and calcite were predicted with a root mean square error (RMSE) less than 9 wt.%. For the field samples, the RMSE was less than 8 wt.% for calcite, dioctahedral mica and kaolinite abundances. Smectite could not be well predicted, which was attributed to spectral variation of the cations within the dioctahedral layered smectites. Substitution of part of the quartz by chlorite at the prediction phase hardly affected the accuracy of the predicted mineral content; this suggests that the method is robust in handling the omission of minerals during the training phase. The degree of expression of absorption components was different between the field sample and the laboratory mixtures. This demonstrates that the method should be calibrated and trained on local samples. Our method allows the simultaneous quantification of more than two minerals within a complex mixture and thereby enhances the perspectives of spectral analysis for mineral abundances.
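
    The second step (regression tree prediction from absorption-feature parameters) can be sketched with scikit-learn; the feature names, the synthetic relationship and the tree depth below are assumptions for illustration, not values from the study:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Hypothetical training table: absorption-feature parameters per spectrum
# (e.g. band depth, width, centre from the exponential Gaussian step) and a
# known mineral abundance in wt.% as the target.
rng = np.random.default_rng(3)
X = rng.random((60, 3))                          # [depth, width, centre] placeholders
y = 100 * X[:, 0] * (1 - 0.3 * X[:, 1])          # synthetic abundance for illustration

tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
rmse = np.sqrt(np.mean((tree.predict(X) - y) ** 2))  # cross-validation used in practice
```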

  8. Estimating missing information by maximum likelihood deconvolution.

    PubMed

    Heintzmann, Rainer

    2007-01-01

    The ability of iteratively constrained maximum likelihood (ML) deconvolution to reconstruct out-of-band information is discussed and exemplified by simulations. The frequency dependent relative energy regain, a novel way of quantifying the reconstruction ability, is introduced. The positivity constraint of ML deconvolution allows reconstructing information outside the spatial frequency bandwidth which is set by the optical system. This is demonstrated for noise-free and noisy data. It is also shown that this property depends on the type of object under investigation. An object is constructed where no significant out-of-band reconstruction is possible. It is concluded that in practical situations the amount of possible out-of-band reconstruction depends on the agreement between reality and the model describing "typical objects" incorporated into the algorithm by appropriate penalty functions. PMID:16914319

  9. Blind Poissonian images deconvolution with framelet regularization.

    PubMed

    Fang, Houzhang; Yan, Luxin; Liu, Hai; Chang, Yi

    2013-02-15

    We propose a maximum a posteriori blind Poissonian images deconvolution approach with framelet regularization for the image and total variation (TV) regularization for the point spread function. Compared with the TV based methods, our algorithm not only suppresses noise effectively but also recovers edges and detailed information. Moreover, the split Bregman method is exploited to solve the resulting minimization problem. Comparative results on both simulated and real images are reported. PMID:23455078

  10. Deconvolution/identification techniques for nonnegative signals

    SciTech Connect

    Goodman, D.M.; Yu, D.R.

    1991-11-01

    Several methods for solving the nonparametric deconvolution/identification problem when the unknown is nonnegative are presented. First we consider the constrained least squares method and discuss three ways to estimate the regularization parameter: the discrepancy principle, Mallows' C_L, and generalized cross validation. Next we consider maximum entropy methods. Last, we present a new conjugate gradient algorithm. A preliminary comparison is presented; detailed Monte-Carlo experiments will be presented at the conference. 13 refs.
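
    A sketch of the nonnegativity-constrained least squares idea, using a Gaussian blur kernel and SciPy's NNLS solver; in practice a regularization term would be added and its weight chosen by the discrepancy principle, Mallows' C_L or generalized cross validation, which is not shown here:

```python
import numpy as np
from scipy.linalg import convolution_matrix
from scipy.optimize import nnls

rng = np.random.default_rng(4)
kernel = np.exp(-0.5 * ((np.arange(21) - 10) / 3.0) ** 2)   # Gaussian blur
kernel /= kernel.sum()

x_true = np.zeros(100)
x_true[[20, 45, 70]] = [1.0, 0.6, 0.9]                      # nonnegative input
A = convolution_matrix(kernel, len(x_true), mode="same")    # linear blur operator
y = A @ x_true + rng.normal(scale=0.01, size=len(x_true))   # noisy observation

x_hat, residual_norm = nnls(A, y)                           # nonnegative least squares
```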

  11. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property data bases have been acquired. Recently we have initiated a project to integrate these codes and data bases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS, describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials data base including high-rate-of-strain data.

  12. Real-time multi-view deconvolution

    PubMed Central

    Schmid, Benjamin; Huisken, Jan

    2015-01-01

    Summary: In light-sheet microscopy, overall image content and resolution are improved by acquiring and fusing multiple views of the sample from different directions. State-of-the-art multi-view (MV) deconvolution simultaneously fuses and deconvolves the images in 3D, but processing takes a multiple of the acquisition time and constitutes the bottleneck in the imaging pipeline. Here, we show that MV deconvolution in 3D can finally be achieved in real-time by processing cross-sectional planes individually on the massively parallel architecture of a graphics processing unit (GPU). Our approximation is valid in the typical case where the rotation axis lies in the imaging plane. Availability and implementation: Source code and binaries are available on github (https://github.com/bene51/), native code under the repository gpu_deconvolution, Java wrappers implementing Fiji plugins under SPIM_Reconstruction_Cuda. Contact: bschmid@mpi-cbg.de or huisken@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26112291

  13. Sparse deconvolution of B-scan images.

    PubMed

    Olofsson, Tomas; Wennerström, Erik

    2007-08-01

    In this paper, a new computationally efficient sparse deconvolution algorithm for use on B-scan images from objects with relatively few scattering targets is presented. It is based on a linear image formation model that has been used earlier in connection with linear minimum mean squared error (MMSE) two-dimensional (2-D) deconvolution. The MMSE deconvolution results have shown improved resolution compared to synthetic aperture focusing technique (SAFT), but at the cost of increased computation time. The proposed algorithm uses the sparsity of the image, reducing the degrees of freedom in the reconstruction problem, to reduce the computation time and to improve the resolution. The dominating task in the algorithm consists in detecting the set of active scattering targets, which is done by iterating between an updating pass that detects new points to include in the set, and a downdating pass that removes redundant points. In the update, a spatiotemporal matched filter is used to isolate potential candidates. A subset of those are chosen using a detection criterion. The amplitudes of the detected scatterers are found by MMSE. The algorithm properties are illustrated using synthetic and real B-scans. The results show excellent resolution enhancement and noise-suppression capabilities. The involved computation times are analyzed. PMID:17703667

  14. Klonos: A Similarity Analysis Based Tool for Software Porting

    Energy Science and Technology Software Center (ESTSC)

    2014-07-30

    Klonos is a compiler-based tool that can help users port scientific applications. The tool is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). It combines syntactic and cost-model-provided metrics into clusters, which aggregate similar subroutines that can be ported similarly. The tool generates a porting plan, which allows programmers and compilers to reuse porting experience as much as possible during the porting process.

  15. Deconvolution of mineral absorption bands - An improved approach

    NASA Technical Reports Server (NTRS)

    Sunshine, Jessica M.; Pieters, Carle M.; Pratt, Stephen F.

    1990-01-01

    Although visible and near IR reflectance spectra contain absorption bands that are characteristic of the composition and structure of the absorbing species, deconvolving a complex spectrum is nontrivial. An improved approach to spectral deconvolution is presented that accurately represents absorption bands as discrete mathematical distributions and resolves composite absorption features into individual absorption bands. The frequently used Gaussian model of absorption bands is shown to be inappropriate for the Fe(2+) electronic transition absorptions in pyroxene spectra. A modified Gaussian model is derived using a power law relationship of energy to average bond length. The modified Gaussian model is shown to provide an objective and consistent tool for deconvolving individual absorption bands in the more complex orthopyroxene, clinopyroxene, pyroxene mixtures, and olivine spectra.
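
    The contrast between the two band shapes can be sketched as follows; the modified Gaussian is written here as a Gaussian in a power of wavelength (with exponent n = -1, i.e. an energy-like variable), which is a common way of stating the model, and the band parameters are arbitrary:

```python
import numpy as np

def gaussian_band(wl, strength, center, width):
    """Standard Gaussian absorption band in wavelength."""
    return strength * np.exp(-((wl - center) ** 2) / (2 * width ** 2))

def modified_gaussian_band(wl, strength, center, width, n=-1.0):
    """Gaussian in the transformed variable wl**n (n = -1 ~ energy scale);
    note that `width` is then expressed in the transformed units."""
    x, mu = wl ** n, center ** n
    return strength * np.exp(-((x - mu) ** 2) / (2 * width ** 2))

wl = np.linspace(0.7, 1.4, 500)                    # micrometres, around a 1-um band
band_g = gaussian_band(wl, -0.4, 1.0, 0.08)
band_m = modified_gaussian_band(wl, -0.4, 1.0, 0.08)
```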

  16. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2014-10-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and it is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.

  17. Self-Constrained Euler Deconvolution Using Potential Field Data of Different Altitudes

    NASA Astrophysics Data System (ADS)

    Zhou, Wenna; Nan, Zeyu; Li, Jiyan

    2016-02-01

    Euler deconvolution has become one of the most common tools for semi-automatic interpretation of potential field data. The structural index (SI) is a main determining factor of the quality of depth estimation. In this paper, we first present an improved Euler deconvolution method that eliminates the influence of the SI by using potential field data at different altitudes. Data at different altitudes can be obtained by upward continuation or directly from airborne measurements. The Euler deconvolution equations at different altitudes within a certain range have very similar forms. Therefore, the ratio of the Euler equations at two different altitudes can be used to discard the SI. Thus, the depth and location of the geologic source can be calculated directly using the improved Euler deconvolution without any prior information. In particular, the influence of noise can be reduced by using upward continuation to different altitudes. The new method is called self-constrained Euler deconvolution (SED). Subsequently, based on the SED algorithm, we derive the full tensor gradient (FTG) form of the improved method. Using the multi-component FTG data has added advantages in data interpretation. The FTG form is composed of x-, y- and z-directional components. Because more components are used, the FTG form yields more accurate results and more detailed information. The proposed modification is tested using different synthetic models, and satisfactory results are obtained. Finally, we apply the new approach to the Bishop model magnetic data and to real gravity data. All the results demonstrate that the new approach is a useful tool for interpreting potential field and full tensor gradient data.
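
    For context, the conventional single-altitude Euler deconvolution that SED improves upon reduces to a small least-squares solve per data window; the sketch below uses the standard homogeneity equation with an assumed structural index and does not reproduce the paper's two-altitude ratio step:

```python
import numpy as np

def euler_window_solve(x, y, z, T, Tx, Ty, Tz, structural_index):
    """Solve (x-x0)Tx + (y-y0)Ty + (z-z0)Tz = N(B - T) for (x0, y0, z0, B)
    over one window of field values T and gradients (Tx, Ty, Tz)."""
    N = structural_index
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + N * T
    (x0, y0, z0, B), *_ = np.linalg.lstsq(A, b, rcond=None)
    return x0, y0, z0, B
```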

  18. Healthcare BI: a tool for meaningful analysis.

    PubMed

    Rohloff, Rose

    2011-05-01

    Implementing an effective business intelligence (BI) system requires organizationwide preparation and education to allow for meaningful analysis of information. Hospital executives should take steps to ensure that: Staff entering data are proficient in how the data are to be used for decision making, and integration is based on clean data from primary sources of entry. Managers have the business acumen required for effective data analysis. Decision makers understand how multidimensional BI offers new ways of analysis that represent significant improvements over historical approaches using static reporting. PMID:21634274

  19. JAVA based LCD Reconstruction and Analysis Tools

    SciTech Connect

    Bower, G.

    2004-10-11

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  20. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and easily extensible. A Python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  1. Application of regularized Richardson-Lucy algorithm for deconvolution of confocal microscopy images

    PubMed Central

    Laasmaa, M; Vendelin, M; Peterson, P

    2011-01-01

    Although confocal microscopes have considerably smaller contribution of out-of-focus light than widefield microscopes, the confocal images can still be enhanced mathematically if the optical and data acquisition effects are accounted for. For that, several deconvolution algorithms have been proposed. As a practical solution, maximum-likelihood algorithms with regularization have been used. However, the choice of regularization parameters is often unknown although it has considerable effect on the result of deconvolution process. The aims of this work were: to find good estimates of deconvolution parameters; and to develop an open source software package that would allow testing different deconvolution algorithms and that would be easy to use in practice. Here, the Richardson-Lucy algorithm has been implemented together with the total variation regularization in an open source software package IOCBio Microscope. The influence of total variation regularization on deconvolution process is determined by one parameter. We derived a formula to estimate this regularization parameter automatically from the images as the algorithm progresses. To assess the effectiveness of this algorithm, synthetic images were composed on the basis of confocal images of rat cardiomyocytes. From the analysis of deconvolved results, we have determined under which conditions our estimation of total variation regularization parameter gives good results. The estimated total variation regularization parameter can be monitored during deconvolution process and used as a stopping criterion. An inverse relation between the optimal regularization parameter and the peak signal-to-noise ratio of an image is shown. Finally, we demonstrate the use of the developed software by deconvolving images of rat cardiomyocytes with stained mitochondria and sarcolemma obtained by confocal and widefield microscopes. PMID:21323670
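
    The core Richardson-Lucy update (without the total-variation term whose weight the paper estimates) can be sketched as follows; this is a generic illustration, not the IOCBio Microscope implementation:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=50, eps=1e-12):
    """Plain Richardson-Lucy iteration for a 2D image and PSF."""
    psf_flipped = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean(), dtype=float)
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, eps)
        estimate *= fftconvolve(ratio, psf_flipped, mode="same")
    return estimate
```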

  2. HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL

    EPA Science Inventory

    An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

  3. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy to use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  4. AstroStat: Statistical analysis tool

    NASA Astrophysics Data System (ADS)

    VO-India team

    2015-07-01

    AstroStat performs statistical analysis on data and is compatible with Virtual Observatory (VO) standards. It accepts data in a variety of formats and performs various statistical tests using a menu driven interface. Analyses, performed in R, include exploratory tests, visualizations, distribution fitting, correlation and causation, hypothesis testing, multivariate analysis and clustering. AstroStat is available in two versions with an identical interface and features: as a web service that can be run using any standard browser and as an offline application.

  5. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  6. Pervaporation: a useful tool for speciation analysis

    NASA Astrophysics Data System (ADS)

    Luque de Castro, M. D.; Papaefstathiou, I.

    1998-02-01

    The application of pervaporation as both an auxiliary and a fundamental device for speciation analysis in liquid and solid samples is discussed. Examples of various determinations, including the coupling of the technique to both a gas chromatograph and flow-injection configurations, applied mostly to environmental and biological samples, are presented, giving clear evidence of the double role of the pervaporation process.

  7. Deconvolution methods for image deblurring in optical coherence tomography.

    PubMed

    Liu, Yiheng; Liang, Yanmei; Mu, Guoguang; Zhu, Xiaonong

    2009-01-01

    Two-dimensional deconvolution methods are proposed to deblur optical coherence tomography images. One employs a two-dimensional deconvolution with a matrix given by the product of the longitudinal and transversal point-spread functions as its kernel, which can be taken as the general point-spread function of an optical coherence tomography system. The other uses two one-dimensional deconvolutions with the longitudinal and transversal point-spread functions successively. It is shown that the two deconvolution methods can deblur the experimentally obtained optical coherence tomography images effectively. PMID:19109602

  8. A 3D image analysis tool for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
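
    A minimal sketch of the intensity-threshold segmentation and volume measurement described above, with a placeholder threshold fraction and voxel volume (the study's actual parameters and the fuzzy-connectedness variant are not reproduced):

```python
import numpy as np
from scipy import ndimage

def gastric_volume(spect, threshold_fraction=0.4, voxel_volume_ml=0.1):
    """Threshold a 3D SPECT array, keep the largest connected region,
    and return its volume in millilitres together with the binary mask."""
    mask = spect >= threshold_fraction * spect.max()
    labels, n = ndimage.label(mask)                        # connected components
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    largest = labels == (np.argmax(sizes) + 1)             # largest region only
    return largest.sum() * voxel_volume_ml, largest
```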

  9. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  10. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  11. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    SciTech Connect

    Bush, B.; Penev, M.; Melaina, M.; Zuboy, J.

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  12. Radar Interferometry Time Series Analysis and Tools

    NASA Astrophysics Data System (ADS)

    Buckley, S. M.

    2006-12-01

    We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including localized subsidence bowls with constant velocity and seasonal deformation fluctuations. We will present results and insights from the application of these time series analysis techniques to several land subsidence study sites with varying deformation and environmental conditions, e.g., arid Phoenix and coastal Houston-Galveston metropolitan areas and rural Texas sink holes. We consistently find that the time invested in implementing, applying and comparing multiple InSAR time series approaches for a given study site is rewarded with a deeper understanding of the techniques and deformation phenomena. To this end, and with support from NSF, we are preparing a first-version of an InSAR post-processing toolkit to be released to the InSAR science community. These studies form a baseline of results to compare against the higher spatial and temporal sampling anticipated from TerraSAR-X as well as the trade-off between spatial coverage and resolution when relying on ScanSAR interferometry.
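
    The linear inversion-SVD step can be sketched for a single pixel and a toy interferogram network; acquisition dates, pairs and phase values below are invented, and the minimum-norm pseudoinverse solution stands in for the full SBAS-style processing:

```python
import numpy as np

dates = np.array([0, 35, 70, 105, 140])                 # days since first acquisition
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
obs = np.array([0.4, 0.5, 0.9, 0.6, 0.5, 1.1])          # range change (cm) per pair

# Unknowns: displacement at each date relative to the first acquisition.
G = np.zeros((len(pairs), len(dates) - 1))
for row, (early, late) in enumerate(pairs):
    G[row, late - 1] = 1.0
    if early > 0:
        G[row, early - 1] = -1.0

displacement = np.linalg.pinv(G) @ obs                  # SVD-based minimum-norm solution
time_series = np.r_[0.0, displacement]
```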

  13. Multivariate exploratory tools for microarray data analysis.

    PubMed

    Szabo, Aniko; Boucher, Kenneth; Jones, David; Tsodikov, Alexander D; Klebanov, Lev B; Yakovlev, Andrei Y

    2003-10-01

    The ultimate success of microarray technology in basic and applied biological sciences depends critically on the development of statistical methods for gene expression data analysis. The most widely used tests for differential expression of genes are essentially univariate. Such tests disregard the multidimensional structure of microarray data. Multivariate methods are needed to utilize the information hidden in gene interactions and hence to provide more powerful and biologically meaningful methods for finding subsets of differentially expressed genes. The objective of this paper is to develop methods of multidimensional search for biologically significant genes, considering expression signals as mutually dependent random variables. To attain these ends, we consider the utility of a pertinent distance between random vectors and its empirical counterpart constructed from gene expression data. The distance furnishes exploratory procedures aimed at finding a target subset of differentially expressed genes. To determine the size of the target subset, we resort to successive elimination of smaller subsets resulting from each step of a random search algorithm based on maximization of the proposed distance. Different stopping rules associated with this procedure are evaluated. The usefulness of the proposed approach is illustrated with an application to the analysis of two sets of gene expression data. PMID:14557111

  14. RCytoscape: tools for exploratory network analysis

    PubMed Central

    2013-01-01

    Background Biomolecular pathways and networks are dynamic and complex, and the perturbations to them which cause disease are often multiple, heterogeneous and contingent. Pathway and network visualizations, rendered on a computer or published on paper, however, tend to be static, lacking in detail, and ill-equipped to explore the variety and quantities of data available today, and the complex causes we seek to understand. Results RCytoscape integrates R (an open-ended programming environment rich in statistical power and data-handling facilities) and Cytoscape (powerful network visualization and analysis software). RCytoscape extends Cytoscape's functionality beyond what is possible with the Cytoscape graphical user interface. To illustrate the power of RCytoscape, a portion of the Glioblastoma multiforme (GBM) data set from the Cancer Genome Atlas (TCGA) is examined. Network visualization reveals previously unreported patterns in the data suggesting heterogeneous signaling mechanisms active in GBM Proneural tumors, with possible clinical relevance. Conclusions Progress in bioinformatics and computational biology depends upon exploratory and confirmatory data analysis, upon inference, and upon modeling. These activities will eventually permit the prediction and control of complex biological systems. Network visualizations -- molecular maps -- created from an open-ended programming environment rich in statistical power and data-handling facilities, such as RCytoscape, will play an essential role in this progression. PMID:23837656

  15. Strategies for the deconvolution of hypertelescope images

    NASA Astrophysics Data System (ADS)

    Aime, C.; Lantéri, H.; Diet, M.; Carlotti, A.

    2012-07-01

    Aims: We study the possibility of deconvolving hypertelescope images and propose a procedure that can be used provided that the densification factor is small enough to make the process reversible. Methods: We present the simulation of hypertelescope images for an array of cophased densified apertures. We distinguish between two types of aperture densification, one called FAD (full aperture densification) corresponding to Labeyrie's original technique, and the other FSD (full spectrum densification) corresponding to a densification factor twice as low. Images are compared to the Fizeau mode. A single image of the observed object is obtained in the hypertelescope modes, while in the Fizeau mode the response produces an ensemble of replicas of the object. Simulations are performed for noiseless images and in a photodetection regime. Assuming first that the point spread function (PSF) does not change much over the object extent, we use two classical techniques to deconvolve the images, namely the Richardson-Lucy and image space reconstruction algorithms. Results: Both algorithms fail to achieve satisfying results. We interpret this as meaning that it is inappropriate to deconvolve a relation that is not a convolution, even if the variation in the PSF is very small across the object extent. We propose instead the application of a redilution to the densified image prior to its deconvolution, i.e. to recover an image similar to the Fizeau observation. This inverse operation is possible only when the rate of densification is no more than in the FSD case. This being done, the deconvolution algorithms become efficient. The deconvolution brings together the replicas into a single high-quality image of the object. This is heuristically explained as an inpainting of the Fourier plane. This procedure makes it possible to obtain improved images while retaining the benefits of hypertelescopes for image acquisition consisting of detectors with a small number of pixels.
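
    For readers unfamiliar with the first of the two classical algorithms named above, a minimal Richardson-Lucy sketch for a known, shift-invariant PSF follows; the Gaussian PSF and toy object are illustrative assumptions, and the sketch corresponds to the shift-invariant regime the authors recover by rediluting the densified image.

      # Minimal Richardson-Lucy deconvolution for a known, shift-invariant PSF.
      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(image, psf, n_iter=100, eps=1e-12):
          psf_mirror = psf[::-1, ::-1]
          estimate = np.full_like(image, image.mean())
          for _ in range(n_iter):
              blurred = fftconvolve(estimate, psf, mode="same")
              ratio = image / (blurred + eps)
              estimate *= fftconvolve(ratio, psf_mirror, mode="same")
          return estimate

      # Toy usage: two point sources blurred by a Gaussian PSF.
      yy, xx = np.mgrid[-7:8, -7:8]
      psf = np.exp(-(xx**2 + yy**2) / 4.0)
      psf /= psf.sum()
      obj = np.zeros((64, 64))
      obj[30, 30], obj[30, 36] = 1.0, 0.7
      data = fftconvolve(obj, psf, mode="same")
      restored = richardson_lucy(data, psf)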

  16. GATB: Genome Assembly & Analysis Tool Box

    PubMed Central

    Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique

    2014-01-01

    Motivation: Efficient and fast next-generation sequencing (NGS) algorithms are essential to analyze the terabytes of data generated by the NGS machines. A serious bottleneck can be the design of such algorithms, as they require sophisticated data structures and advanced hardware implementation. Results: We propose an open-source library dedicated to genome assembly and analysis to speed up the development of efficient software. The library is based on a recent optimized de Bruijn graph implementation allowing complex genomes to be processed on desktop computers using fast algorithms with low memory footprints. Availability and implementation: The GATB library is written in C++ and is available at the following Web site http://gatb.inria.fr under the A-GPL license. Contact: lavenier@irisa.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24990603

  17. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    Energy Science and Technology Software Center (ESTSC)

    2012-09-13

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projection screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position.

  18. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    SciTech Connect

    2012-09-13

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projection screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position.

  19. Advanced tools for in vivo skin analysis.

    PubMed

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed. PMID:20534081

  20. Nonlinear Robustness Analysis Tools for Flight Control Law Validation & Verification

    NASA Astrophysics Data System (ADS)

    Chakraborty, Abhijit

    Loss of control in flight is among the highest aviation accident categories for both the number of accidents and the number of fatalities. The flight controls community is seeking improved validation tools for safety-critical flight control systems. Current validation tools rely heavily on linear analysis, which ignores the inherent nonlinear nature of the aircraft dynamics and flight control system. Specifically, current practices in validating the flight control system involve gridding the flight envelope and checking various criteria based on linear analysis to ensure safety of the flight control system. The analysis and certification methods currently applied assume the aircraft's dynamics are linear. In reality, the behavior of the aircraft is always nonlinear due to its aerodynamic characteristics and physical limitations imposed by the actuators. This thesis develops nonlinear analysis tools capable of certifying flight control laws for nonlinear aircraft dynamics. The proposed analysis tools can handle both the aerodynamic nonlinearities and the physical limitations imposed by the actuators in the aircraft's dynamics. This proposed validation technique will extend and enrich the predictive capability of existing flight control law validation methods to analyze nonlinearities. The objective of this thesis is to provide the flight control community with an advanced set of analysis tools to reduce aviation fatality and accident rates.

  1. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  2. Automated Scalability Analysis Tools for Message Passing Parallel Programs

    NASA Technical Reports Server (NTRS)

    Sarukkai, Sekhar R.; Mehra, Pankaj; Tucker, Deanne (Technical Monitor)

    1994-01-01

    In order to develop scalable parallel applications, a number of programming decisions have to be made during the development of the program. Performance tools that help in making these decisions are few, if they exist at all. Traditionally, performance tools have focused on exposing performance bottlenecks of small-scale executions of the program. However, it is common knowledge that programs that perform exceptionally well on small processor configurations, more often than not, perform poorly when executed on larger processor configurations. Hence, new tools that predict the execution characteristics of scaled-up programs are an essential part of an application developer's toolkit. In this paper we discuss important issues that need to be considered in order to build useful scalability analysis tools for parallel programs. We introduce a simple tool that automatically extracts scalability characteristics of a class of deterministic parallel programs. We show, with the help of a number of results on the Intel iPSC/860, that predictions are within reasonable bounds.

  3. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability and later addresses validity. Factor analysis is a way to determine the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis could be exploratory or confirmatory, and helps in selecting the items with the best performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. PMID:26572119

  4. Cost-Benefit Analysis: Tools for Decision Making.

    ERIC Educational Resources Information Center

    Bess, Gary

    2002-01-01

    Suggests that cost-benefit analysis can be a helpful tool for assessing difficult and complex problems in child care facilities. Defines cost-benefit analysis as an approach to determine the most economical way to manage a program, describes how to analyze costs and benefits through hypothetical scenarios, and discusses some of the problems…

  5. Using Audience Analysis as a Learning and Evaluation Tool.

    ERIC Educational Resources Information Center

    Goodsell, Diana

    This paper describes an activity that is not only effective for teaching audience analysis to introductory public speaking students, but also serves as an instructor feedback tool. The activity outlined in the paper takes students through every step of the audience analysis process, from selecting question types to tallying and formulating…

  6. Development of a climate data analysis tool (CDAT)

    SciTech Connect

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  7. Improved Gabor Deconvolution and Its Extended Applications

    NASA Astrophysics Data System (ADS)

    Sun, Xuekai; Sun, Sam Zandong

    2016-02-01

    In log time-frequency spectra, the nonstationary convolution model is a linear equation, and thus we improved Gabor deconvolution by employing a log hyperbolic smoothing scheme that can be implemented as an iterative process. Numerical tests and practical applications demonstrate that improved Gabor deconvolution can further broaden frequency bandwidth with less computational expense than the ordinary method. Moreover, we attempt to enlarge this method's application value by addressing nonstationarity and evaluating Q values. In fact, the energy relationship of each hyperbolic bin (i.e., attenuation curve) can be taken as a quantitative indicator in balancing nonstationarity and conditioning seismic traces to the assumption of an unchanging wavelet, which in turn reveals more useful information for constrained reflectivity inversion. Meanwhile, a statistical method for Q-value estimation is also proposed, utilizing this linear model's gradient. In practice, not only do the estimates agree well with geologic settings, but applications to Q-compensation migration are also favorable in characterizing deep geologic structures, such as the pinch-out boundary and water channel.
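
    The sketch below is not the authors' hyperbolic smoothing scheme; it is a simplified nonstationary spectral-whitening stand-in in the same spirit, which smooths the log Gabor (STFT) amplitude over time and frequency and divides it out. The window length, smoothing size and stabilization constant are arbitrary choices for illustration.

      # Simplified nonstationary deconvolution: estimate a slowly varying
      # time-frequency amplitude from the Gabor (STFT) spectrum of a trace and
      # divide it out. This is a stand-in for the paper's hyperbolic smoothing
      # scheme, not a reimplementation of it.
      import numpy as np
      from scipy.signal import stft, istft
      from scipy.ndimage import uniform_filter

      def gabor_whiten(trace, fs, nperseg=64, smooth=(5, 9), stab=1e-3):
          f, t, Z = stft(trace, fs=fs, nperseg=nperseg)
          amp = np.abs(Z)
          # smooth the log-amplitude over (frequency, time) to approximate the
          # wavelet/attenuation envelope, then remove it from the spectrum
          env = np.exp(uniform_filter(np.log(amp + stab), size=smooth))
          Z_white = Z / (env + stab * env.max())
          _, out = istft(Z_white, fs=fs, nperseg=nperseg)
          return out[:len(trace)]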

  8. AIDA: An Adaptive Image Deconvolution Algorithm

    NASA Astrophysics Data System (ADS)

    Hom, Erik; Marchis, F.; Lee, T. K.; Haase, S.; Agard, D. A.; Sedat, J. W.

    2007-10-01

    We recently described an adaptive image deconvolution algorithm (AIDA) for myopic deconvolution of multi-frame and three-dimensional data acquired through astronomical and microscopic imaging [Hom et al., J. Opt. Soc. Am. A 24, 1580 (2007)]. AIDA is a reimplementation and extension of the MISTRAL method developed by Mugnier and co-workers and shown to yield object reconstructions with excellent edge preservation and photometric precision [J. Opt. Soc. Am. A 21, 1841 (2004)]. Written in Numerical Python with calls to a robust constrained conjugate gradient method, AIDA has significantly improved run times over the original MISTRAL implementation. AIDA includes a scheme to automatically balance maximum-likelihood estimation and object regularization, which significantly decreases the amount of time and effort needed to generate satisfactory reconstructions. Here, we present a gallery of results demonstrating the effectiveness of AIDA in processing planetary science images acquired using adaptive-optics systems. Offered as an open-source alternative to MISTRAL, AIDA is available for download and further development at: http://msg.ucsf.edu/AIDA. This work was supported in part by the W. M. Keck Observatory, the National Institutes of Health, NASA, the National Science Foundation Science and Technology Center for Adaptive Optics at UC-Santa Cruz, and the Howard Hughes Medical Institute.

  9. Structure preserving color deconvolution for immunohistochemistry images

    NASA Astrophysics Data System (ADS)

    Chen, Ting; Srinivas, Chukka

    2015-03-01

    Immunohistochemistry (IHC) staining is an important technique for the detection of one or more biomarkers within a single tissue section. In digital pathology applications, the correct unmixing of the tissue image into its individual constituent dyes for each biomarker is a prerequisite for accurate detection and identification of the underlying cellular structures. A popular technique thus far is the color deconvolution method proposed by Ruifrok et al. However, Ruifrok's method independently estimates the individual dye contributions at each pixel, which potentially leads to "holes and cracks" in the cells in the unmixed images. This is clearly inadequate since strong spatial dependencies exist in the tissue images which contain rich cellular structures. In this paper, we formulate the unmixing algorithm into a least-squares framework of image patches, and propose a novel color deconvolution method which explicitly incorporates the spatial smoothness and structure continuity constraint into a neighborhood graph regularizer. An analytical closed-form solution to the cost function is derived for this algorithm for fast implementation. The algorithm is evaluated on a clinical data set containing a number of 3,3-Diaminobenzidine (DAB) and hematoxylin (HTX) stained IHC slides and demonstrates better unmixing results than the existing strategy.
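
    For context, the following sketch shows the pixelwise Ruifrok-style unmixing that the paper takes as its baseline: convert RGB to optical density and invert a stain matrix. The HTX/DAB stain vectors are approximate values chosen for illustration, and the paper's patch-based, graph-regularized closed-form solution is not reproduced here.

      # Pixelwise color deconvolution (the Ruifrok-style baseline): convert RGB
      # to optical density and invert a stain matrix. Stain vectors below are
      # approximate, not calibrated, values.
      import numpy as np

      stains = np.array([[0.65, 0.70, 0.29],    # hematoxylin (HTX), approx.
                         [0.27, 0.57, 0.78]])   # DAB, approx.
      stains = stains / np.linalg.norm(stains, axis=1, keepdims=True)

      def unmix(rgb, stains, i0=255.0, eps=1e-6):
          """rgb: (H, W, 3) uint8 image -> (H, W, n_stains) stain concentrations."""
          od = -np.log10((rgb.astype(float) + eps) / i0)   # optical density
          pinv = np.linalg.pinv(stains)                    # (3, n_stains)
          conc = od.reshape(-1, 3) @ pinv
          return conc.reshape(rgb.shape[:2] + (stains.shape[0],))

    The paper replaces this independent per-pixel inversion with a patch-level least-squares cost whose neighborhood-graph regularizer enforces the spatial smoothness and structure continuity discussed above; that closed-form solution is not shown here.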

  10. Compressive Deconvolution in Medical Ultrasound Imaging.

    PubMed

    Chen, Zhouye; Basarab, Adrian; Kouame, Denis

    2016-03-01

    The potential of compressive sampling in ultrasound imaging has recently been extensively evaluated by several research teams. Following the different application setups, it has been shown that the RF data may be reconstructed from a small number of measurements and/or using a reduced number of ultrasound pulse emissions. Nevertheless, RF image spatial resolution, contrast and signal-to-noise ratio are affected by the limited bandwidth of the imaging transducer and the physical phenomena related to US wave propagation. To overcome these limitations, several deconvolution-based image processing techniques have been proposed to enhance the ultrasound images. In this paper, we propose a novel framework, named compressive deconvolution, that reconstructs enhanced RF images from compressed measurements. Exploiting a unified formulation of the direct acquisition model, combining random projections and 2D convolution with a spatially invariant point spread function, the benefit of our approach is the joint data volume reduction and image quality improvement. The proposed optimization method, based on the Alternating Direction Method of Multipliers, is evaluated on both simulated and in vivo data. PMID:26513780

  11. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  12. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  13. A Semi-Automated Functional Test Data Analysis Tool

    SciTech Connect

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

  14. Removing boundary artifacts for real-time iterated shrinkage deconvolution.

    PubMed

    Sorel, Michal

    2012-04-01

    We propose a solution to the problem of boundary artifacts appearing in several recently published fast deblurring algorithms based on iterated shrinkage thresholding in a sparse domain and Fourier domain deconvolution. Our approach adapts an idea proposed by Reeves for deconvolution by the Wiener filter. The time of computation less than doubles. PMID:22106148

  15. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed to be lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  16. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  17. Nonlinear deconvolution with deblending: a new analyzing technique for spectroscopy

    NASA Astrophysics Data System (ADS)

    Sennhauser, C.; Berdyugina, S. V.; Fluri, D. M.

    2009-12-01

    Context: Spectroscopy data in general often deals with an entanglement of spectral line properties, especially in the case of blended line profiles, independently of how high the quality of the data may be. In stellar spectroscopy and spectropolarimetry, where atomic transition parameters are usually known, the use of multi-line techniques to increase the signal-to-noise ratio of observations has become common practice. These methods extract an average line profile by means of either least squares deconvolution (LSD) or principle component analysis (PCA). However, only a few methods account for the blending of line profiles, and when they do, they assume that line profiles add linearly. Aims: We abandon the simplification of linear line-adding for Stokes I and present a novel approach that accounts for the nonlinearity in blended profiles, also illuminating the process of a reasonable deconvolution of a spectrum. Only the combination of those two enables us to treat spectral line variables independently, constituting our method of nonlinear deconvolution with deblending (NDD). The improved interpretation of a common line profile achieved compensates for the additional expense in calculation time, especially when it comes to the application to (Zeeman) doppler imaging (ZDI). Methods: By examining how absorption lines of different depths blend with each other and describing the effects of line-adding in a mathematically simple, yet physically meaningful way, we discover how it is possible to express a total line depth in terms of a (nonlinear) combination of contributing individual components. Thus, we disentangle blended line profiles and underlying parameters in a truthful manner and strongly increase the reliability of the common line patterns retrieved. Results: By comparing different versions of LSD with our NDD technique applied to simulated atomic and molecular intensity spectra, we are able to illustrate the improvements provided by our method to the interpretation of the recovered mean line profiles. As a consequence, it is possible for the first time to retrieve an intrinsic line pattern from a molecular band, offering the opportunity to fully include them in a NDD-based ZDI. However, we also show that strong line broadening deters the existence of a unique solution for heavily blended lines such as in molecular bandheads.

  18. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Oda, H.

    2013-12-01

    The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. The convolution process results in a smoothed measurement and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of sensor response and improve the resolution for continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that cross terms (i.e., the response of the pick-up coil for one axis to magnetic signal along the other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution in UDECON is achieved by searching for the minimum ABIC while shifting the sensor response (to account for possible mispositioning of the sample on the tray) and a smoothness parameter in ranges defined by the user. Comparison of deconvolution results using sensor response estimated from integrated point source measurements and other methods suggests that the integrated point source estimate yields better results (smaller ABIC). The noise characteristics of magnetometer measurements and the reliability of the UDECON algorithm were tested using repeated (a total of 400 times) natural remanence measurements of a u-channel sample before and after stepwise alternating field demagnetizations. Using a series of synthetic data constructed based on a real paleomagnetic record, we demonstrate that optimized deconvolution using UDECON can greatly help reveal detailed paleomagnetic information such as excursions that may be smoothed out during pass-through measurement. Application of UDECON to the vast amount of existing and future pass-through paleomagnetic and rock magnetic measurements on sediments recovered especially through ocean drilling programs will contribute to our understanding of the geodynamo and paleo-environment by providing more detailed records of geomagnetic and environmental changes.
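
    A single-axis toy version of the convolution model described above is sketched below: a measured sensor-response curve is assembled into a circulant matrix and the remanence is recovered by Tikhonov-regularized least squares with a second-difference roughness penalty. The fixed smoothness parameter is an illustrative stand-in for UDECON's ABIC-based selection, and the cross-axis terms emphasized in the abstract are omitted.

      # Single-axis toy deconvolution m = R s: build a circulant matrix from a
      # measured sensor-response curve and solve Tikhonov-regularized least
      # squares with a second-difference roughness penalty. The fixed lambda
      # stands in for UDECON's ABIC-based selection; cross-axis terms omitted.
      import numpy as np
      from scipy.linalg import toeplitz

      def deconvolve(measured, response, lam=1e-2):
          n = len(measured)
          pad = np.zeros(n)
          pad[:len(response)] = response
          col = np.roll(pad, -(len(response) // 2))        # center response at lag 0
          R = toeplitz(col, np.roll(col[::-1], 1))         # circulant convolution matrix
          D = np.diff(np.eye(n), n=2, axis=0)              # second-difference operator
          return np.linalg.solve(R.T @ R + lam * (D.T @ D), R.T @ measured)

      # Toy usage at 1 mm spacing: a narrow remanence feature smeared by an
      # assumed broad, Gaussian-like sensor response.
      z = np.arange(400)                                   # positions in mm
      response = np.exp(-0.5 * ((np.arange(81) - 40) / 15.0) ** 2)
      response /= response.sum()
      remanence = np.exp(-0.5 * ((z - 200) / 3.0) ** 2)
      measured = np.convolve(remanence, response, mode="same")
      estimate = deconvolve(measured, response)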

  19. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) Program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification. 2) Model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  20. Proteomic Tools for the Analysis of Cytoskeleton Proteins.

    PubMed

    Scarpati, Michael; Heavner, Mary Ellen; Wiech, Eliza; Singh, Shaneen

    2016-01-01

    Proteomic analyses have become an essential part of the toolkit of the molecular biologist, given the widespread availability of genomic data and open source or freely accessible bioinformatics software. Tools are available for detecting homologous sequences, recognizing functional domains, and modeling the three-dimensional structure for any given protein sequence. Although a wealth of structural and functional information is available for a large number of cytoskeletal proteins, with representatives spanning all of the major subfamilies, the majority of cytoskeletal proteins remain partially or totally uncharacterized. Moreover, bioinformatics tools provide a means for studying the effects of synthetic mutations or naturally occurring variants of these cytoskeletal proteins. This chapter discusses various freely available proteomic analysis tools, with a focus on in silico prediction of protein structure and function. The selected tools are notable for providing an easily accessible interface for the novice, while retaining advanced functionality for more experienced computational biologists. PMID:26498799

  1. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.

  2. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites, researchers can achieve a systems-level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review, the functionality and use of these software tools are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685
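
    As an example of the enrichment analysis mentioned above, the sketch below runs a hypergeometric over-representation test for a single pathway; all of the set sizes are made up for illustration and no specific tool's implementation is being reproduced.

      # Over-representation (enrichment) test for one metabolic pathway using
      # the hypergeometric distribution. All set sizes are illustrative.
      from scipy.stats import hypergeom

      N = 800   # metabolites in the background/reference set
      K = 35    # background metabolites annotated to the pathway
      n = 40    # metabolites flagged as altered in the experiment
      k = 8     # altered metabolites that fall in the pathway

      # P(at least k pathway members among n draws without replacement)
      p_value = hypergeom.sf(k - 1, N, K, n)
      print(f"enrichment p-value: {p_value:.3g}")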

  3. Discovery of protein acetylation patterns by deconvolution of peptide isomer mass spectra

    PubMed Central

    Abshiru, Nebiyu; Caron-Lizotte, Olivier; Rajan, Roshan Elizabeth; Jamai, Adil; Pomies, Christelle; Verreault, Alain; Thibault, Pierre

    2015-01-01

    Protein post-translational modifications (PTMs) play important roles in the control of various biological processes including protein–protein interactions, epigenetics and cell cycle regulation. Mass spectrometry-based proteomics approaches enable comprehensive identification and quantitation of numerous types of PTMs. However, the analysis of PTMs is complicated by the presence of indistinguishable co-eluting isomeric peptides that result in composite spectra with overlapping features that prevent the identification of individual components. In this study, we present Iso-PeptidAce, a novel software tool that enables deconvolution of composite MS/MS spectra of isomeric peptides based on features associated with their characteristic fragment ion patterns. We benchmark Iso-PeptidAce using dilution series prepared from mixtures of known amounts of synthetic acetylated isomers. We also demonstrate its applicability to different biological problems such as the identification of site-specific acetylation patterns in histones bound to chromatin assembly factor-1 and profiling of histone acetylation in cells treated with different classes of HDAC inhibitors. PMID:26468920
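
    The sketch below illustrates the underlying idea of deconvolving a composite spectrum of co-eluting isomers, using non-negative least squares on synthetic fragment-ion patterns; Iso-PeptidAce's actual scoring and spectral processing are richer than this and are not reproduced here.

      # Estimate relative abundances of co-eluting isomers from a composite
      # fragment-ion intensity vector via non-negative least squares. The
      # reference patterns and composite spectrum are synthetic.
      import numpy as np
      from scipy.optimize import nnls

      # columns = characteristic fragment patterns of three acetylated isomers
      A = np.array([[1.0, 0.1, 0.0],
                    [0.8, 0.0, 0.2],
                    [0.1, 0.9, 0.1],
                    [0.0, 0.7, 0.6],
                    [0.2, 0.1, 1.0]])
      true_frac = np.array([0.5, 0.3, 0.2])
      rng = np.random.default_rng(1)
      composite = A @ true_frac + 0.01 * rng.normal(size=A.shape[0])

      frac, _ = nnls(A, composite)
      frac /= frac.sum()                 # normalize to relative abundances
      print(frac)                        # close to [0.5, 0.3, 0.2]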

  4. Spatial deconvolution of spectropolarimetric data: an application to quiet Sun magnetic elements

    NASA Astrophysics Data System (ADS)

    Quintero Noda, C.; Asensio Ramos, A.; Orozco Suárez, D.; Ruiz Cobo, B.

    2015-07-01

    Context. One of the difficulties in extracting reliable information about the thermodynamical and magnetic properties of solar plasmas from spectropolarimetric observations is the presence of light dispersed inside the instruments, known as stray light. Aims: We aim to analyze quiet Sun observations after the spatial deconvolution of the data. We examine the validity of the deconvolution process with noisy data as we analyze the physical properties of quiet Sun magnetic elements. Methods: We used a regularization method that decouples the Stokes inversion from the deconvolution process, so that large maps can be quickly inverted without much additional computational burden. We applied the method on Hinode quiet Sun spectropolarimetric data. We examined the spatial and polarimetric properties of the deconvolved profiles, comparing them with the original data. After that, we inverted the Stokes profiles using the Stokes Inversion based on Response functions (SIR) code, which allow us to obtain the optical depth dependence of the atmospheric physical parameters. Results: The deconvolution process increases the contrast of continuum images and makes the magnetic structures sharper. The deconvolved Stokes I profiles reveal the presence of the Zeeman splitting while the Stokes V profiles significantly change their amplitude. The area and amplitude asymmetries of these profiles increase in absolute value after the deconvolution process. We inverted the original Stokes profiles from a magnetic element and found that the magnetic field intensity reproduces the overall behavior of theoretical magnetic flux tubes, that is, the magnetic field lines are vertical in the center of the structure and start to fan when we move far away from the center of the magnetic element. The magnetic field vector inferred from the deconvolved Stokes profiles also mimic a magnetic flux tube but in this case we found stronger field strengths and the gradients along the line-of-sight are larger for the magnetic field intensity and for its inclination. Moreover, the discontinuity between the magnetic and non magnetic environment in the flux tube gets sharper. Conclusions: The deconvolution process used in this paper reveals information that the smearing induced by the point spread function (PSF) of the telescope hides. Additionally, the deconvolution is done with a low computational load, making it appealing for its use on the analysis of large data sets. A copy of the IDL code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/579/A3

  5. Perceived Image Quality Improvements from the Application of Image Deconvolution to Retinal Images from an Adaptive Optics Fundus Imager

    NASA Astrophysics Data System (ADS)

    Soliz, P.; Nemeth, S. C.; Erry, G. R. G.; Otten, L. J.; Yang, S. Y.

    Aim: The objective of this project was to apply an image restoration methodology based on wavefront measurements obtained with a Shack-Hartmann sensor and to evaluate the restored image quality based on medical criteria. Methods: Implementing an adaptive optics (AO) technique, a fundus imager was used to achieve low-order correction to images of the retina. The high-order correction was provided by deconvolution. A Shack-Hartmann wavefront sensor measures aberrations. The wavefront measurement is the basis for activating a deformable mirror. Image restoration to remove remaining aberrations is achieved by direct deconvolution using the point spread function (PSF) or a blind deconvolution. The PSF is estimated using measured wavefront aberrations. Direct application of classical deconvolution methods such as inverse filtering, Wiener filtering or iterative blind deconvolution (IBD) to the AO retinal images obtained from the adaptive optical imaging system is not satisfactory because of the very large image size, difficulty in modeling the system noise, and inaccuracy in PSF estimation. Our approach combines direct and blind deconvolution to exploit available system information and to avoid non-convergence and time-consuming iterative processes. Results: The deconvolution was applied to human subject data and the resulting restored images were compared by a trained ophthalmic researcher. Qualitative analysis showed significant improvements. Neovascularization that cannot be resolved with the standard fundus camera can be visualized with the adaptive optics device. The individual nerve fiber bundles are easily resolved, as are melanin structures in the choroid. Conclusion: This project demonstrated that computer-enhanced, adaptive optics images have greater detail of anatomical and pathological structures.
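
    The following is a minimal sketch of the Wiener-filter step named above, assuming the PSF has already been estimated from the Shack-Hartmann wavefront data; the noise-to-signal constant is a hand-tuned assumption rather than a value derived from the instrument's noise model.

      # Wiener deconvolution of an image given an estimated PSF. The
      # noise-to-signal ratio 'nsr' is a hand-tuned constant here.
      import numpy as np

      def wiener_deconvolve(image, psf, nsr=1e-2):
          psf_pad = np.zeros_like(image, dtype=float)
          psf_pad[:psf.shape[0], :psf.shape[1]] = psf
          psf_pad = np.roll(psf_pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)),
                            axis=(0, 1))                  # center the PSF at the origin
          H = np.fft.fft2(psf_pad)
          G = np.fft.fft2(image)
          F = np.conj(H) * G / (np.abs(H) ** 2 + nsr)
          return np.real(np.fft.ifft2(F))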

  6. Towards robust deconvolution of low-dose perfusion CT: sparse perfusion deconvolution using online dictionary learning.

    PubMed

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C

    2013-05-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance compared to existing methods and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422
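
    A minimal sketch of the dictionary-learning ingredient of this approach is given below: learn an overcomplete patch dictionary from a high-dose map and sparse-code low-dose patches against it. The coupling with the deconvolution step, which is the core of the SPD method, is omitted, and the synthetic maps, patch size and sparsity settings are assumptions.

      # Sketch of the dictionary-learning ingredient: learn a patch dictionary
      # from a (stand-in) high-dose map, then sparse-code low-dose patches
      # against it. The coupling with deconvolution is omitted.
      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning
      from sklearn.feature_extraction.image import (extract_patches_2d,
                                                    reconstruct_from_patches_2d)

      rng = np.random.default_rng(0)
      high_dose = rng.random((128, 128))              # stand-in high-dose CBF map
      low_dose = high_dose + 0.1 * rng.normal(size=high_dose.shape)

      patches = extract_patches_2d(high_dose, (8, 8), max_patches=2000, random_state=0)
      X = patches.reshape(len(patches), -1)
      X = X - X.mean(axis=1, keepdims=True)
      dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0,
                                         random_state=0).fit(X)

      noisy = extract_patches_2d(low_dose, (8, 8))
      Y = noisy.reshape(len(noisy), -1)
      means = Y.mean(axis=1, keepdims=True)
      codes = dico.transform(Y - means)               # sparse codes (OMP by default)
      denoised_patches = (codes @ dico.components_ + means).reshape(noisy.shape)
      denoised = reconstruct_from_patches_2d(denoised_patches, low_dose.shape)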

  7. Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning

    PubMed Central

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C.

    2014-01-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance compared to existing methods and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422

  8. ML-blind deconvolution algorithm: recent developments

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Santosh; Szarowski, Donald H.; Turner, James N.; O'Connor, Nathan J.; Holmes, Timothy J.

    1996-04-01

    The Maximum Likelihood based blind deconvolution (ML-blind) algorithm is used to deblur 3D microscope images. This approach was first introduced to the microscope community by us circa 1992. The basic advantage of a blind algorithm is that it simplifies the user interface protocols and reconstructs both the object and the Point Spread Function. In this paper we will discuss the recent improvements to the algorithm that improve its robustness and accelerate its convergence. For instance, powerful and physically justified constraints are enforced on the reconstructed PSF at every iteration to improve robustness. A line search technique is added to the object reconstruction to accelerate the convergence of the object estimate. A simple modification to the algorithm enables adaptation for the transmitted-light brightfield modality. Finally, we incorporate montaging in order to process large data fields.

  9. Solving a Deconvolution Problem in Photon Spectrometry

    NASA Astrophysics Data System (ADS)

    Alice-Phos Collaboration; Aleksandrov, D.; Alme, J.; Basmanov, V.; Batyunya, B.; Blau, D.; Bogolyubsky, M.; Budilov, V.; Budnikov, D.; Buskenes, J. I.; Cai, X.; Chuman, F.; Deloff, A.; Demanov, V.; Djuvsland, O.; Dobrowolski, T.; Faltys, M.; Fehlker, D.; Fil'Chagin, S.; Hiei, A.; Hille, P. T.; Horaguchi, T.; Huang, M.; Il'Kaev, R.; Ilkiv, I.; Ippolitov, M.; Iwasaki, T.; Kazantsev, A.; Karadzhev, K.; Kharlov, Y.; Kucheryaev, Y.; Kurashvili, P.; Kuryakin, A.; Larsen, D. T.; Lindal, S.; Liu, L.; Lovhoiden, G.; Ma, K.; Mamonov, A.; Manko, V.; Mao, Y.; Mares, J.; Maruyama, Y.; Müller, H.; Mizoguchi, K.; Nazarenko, S.; Nazarov, G.; Nikolaev, S.; Nikulin, S.; Nomokonov, P.; Nystrand, J.; Pavlinov, A.; Peresunko, D.; Petrov, V.; Polak, K.; Polichtchouk, B.; Potcheptsov, T.; Punin, V.; Qvigstad, H.; Redlich, K.; Roehrich, D.; Sadovsky, S.; Senko, V.; Shigaki, K.; Sibiryak, I.; Siemiarczuk, T.; Skaali, B.; Skjerdal, K.; Shabratova, G.; Soloviev, A.; Stolpovsky, P.; Sugitate, T.; Sukhorukov, M.; Torii, H.; Tveter, T.; Ullaland, K.; Wikne, J.; Vikhlyantsev, O.; Vinogradov, A.; Vinogradov, Y.; Vodopyanov, A.; Wan, R.; Wang, Y.; Wilk, G.; Wang, D.; Xu, C.; Yin, Z.; Yanovsky, V.; Yuan, X.; Zaporozhets, S.; Zenin, A.; Zhang, X.; Zhou, D.; ALICE-PHOS Collaboration

    2010-08-01

    We solve numerically a deconvolution problem to extract the undisturbed spectrum from the measured distribution contaminated by the finite resolution of the measuring device. A problem of this kind emerges when one wants to infer the momentum distribution of the neutral pions by detecting the π0 decay photons using the photon spectrometer of the ALICE LHC experiment at CERN [1]. The underlying integral equation connecting the sought for pion spectrum and the measured gamma spectrum has been discretized and subsequently reduced to a system of linear algebraic equations. The latter system, however, is known to be ill-posed and must be regularized to obtain a stable solution. This task has been accomplished here by means of the Tikhonov regularization scheme combined with the L-curve method. The resulting pion spectrum is in an excellent quantitative agreement with the pion spectrum obtained from a Monte Carlo simulation.
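
    The sketch below illustrates the regularization strategy described above on a toy version of the problem: a Gaussian resolution kernel is discretized into a matrix and the Tikhonov parameter is chosen from a crude L-curve scan. The kernel, the falling test spectrum and the corner-picking heuristic are simplifications, not the paper's actual setup.

      # Discretize a smearing (resolution) kernel into a matrix, then choose
      # the Tikhonov parameter from a crude L-curve scan. The Gaussian kernel,
      # the falling test spectrum and the corner heuristic are simplifications.
      import numpy as np

      n = 100
      x = np.linspace(0.0, 10.0, n)
      K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.4) ** 2)   # resolution matrix
      K /= K.sum(axis=1, keepdims=True)

      true = np.exp(-x / 2.0)                                     # falling test spectrum
      rng = np.random.default_rng(2)
      data = K @ true + 0.01 * rng.normal(size=n)

      lambdas = np.logspace(-6, 1, 40)
      residual, seminorm, solutions = [], [], []
      for lam in lambdas:
          sol = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ data)
          solutions.append(sol)
          residual.append(np.linalg.norm(K @ sol - data))
          seminorm.append(np.linalg.norm(sol))

      # crude L-curve "corner": the point of the normalized log-log curve
      # closest to its lower-left corner (a simple heuristic, not the full
      # curvature-based rule)
      r, s = np.log(residual), np.log(seminorm)
      rn = (r - r.min()) / (r.max() - r.min())
      sn = (s - s.min()) / (s.max() - s.min())
      best = solutions[int(np.argmin(rn ** 2 + sn ** 2))]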

  10. Multichannel blind deconvolution using low rank recovery

    NASA Astrophysics Data System (ADS)

    Romberg, Justin; Tian, Ning; Sabra, Karim

    2013-05-01

    We introduce a new algorithm for multichannel blind deconvolution. Given the outputs of K linear time- invariant channels driven by a common source, we wish to recover their impulse responses without knowledge of the source signal. Abstractly, this problem amounts to finding a solution to an overdetermined system of quadratic equations. We show how we can recast the problem as solving a system of underdetermined linear equations with a rank constraint. Recent results in the area of low rank recovery have shown that there are effective convex relaxations to problems of this type that are also scalable computationally, allowing us to recover 100s of channel responses after a moderate observation time. We illustrate the effectiveness of our methodology with a numerical simulation of a passive "noise imaging" experiment.
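
    For orientation, the sketch below implements the classical two-channel cross-relation identity that this line of work builds on (y1 * h2 = y2 * h1), recovering the channel pair as the smallest singular vector of a linear system; the paper's low-rank convex relaxation is not reproduced, and the channel length and signal statistics are illustrative assumptions.

      # Classical two-channel cross-relation blind deconvolution: because
      # y1 = h1 * x and y2 = h2 * x, we have y1 * h2 - y2 * h1 = 0, which is
      # linear in (h1, h2); the channel pair is the (scaled) null vector of a
      # structured matrix. This is a textbook sketch, not the paper's low-rank
      # convex relaxation.
      import numpy as np
      from scipy.linalg import convolution_matrix, svd

      rng = np.random.default_rng(3)
      L, N = 8, 200                                 # channel length, signal length
      h1, h2 = rng.normal(size=L), rng.normal(size=L)
      x = rng.normal(size=N)                        # unknown common source
      y1, y2 = np.convolve(x, h1), np.convolve(x, h2)

      # cross-relation matrix acting on the stacked vector [h2; h1]
      C = np.hstack([convolution_matrix(y1, L, mode="full"),
                     -convolution_matrix(y2, L, mode="full")])
      _, _, Vt = svd(C, full_matrices=False)
      h2_est, h1_est = Vt[-1][:L], Vt[-1][L:]

      # recovered up to one common scale factor
      scale = np.dot(h1, h1_est) / np.dot(h1_est, h1_est)
      print(np.max(np.abs(h1_est * scale - h1)),
            np.max(np.abs(h2_est * scale - h2)))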

  11. Deconvolution of mixed magnetism in multilayer graphene

    SciTech Connect

    Swain, Akshaya Kumar; Bahadur, Dhirendra

    2014-06-16

    Magnetic properties of graphite modified at the edges by KCl and exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using low-field and high-field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge-state spins and the interaction between them decide the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.
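
    One common way to deconvolute such loops is sketched below: fit the magnetization as a saturating ferromagnetic term plus a linear paramagnetic background. The anhysteretic tanh closure and the synthetic data are assumptions made for illustration and do not represent the authors' procedure.

      # Illustrative decomposition of M(H) into a saturating ferromagnetic term
      # plus a linear paramagnetic background. The tanh closure and synthetic
      # data are assumptions, not the authors' model.
      import numpy as np
      from scipy.optimize import curve_fit

      def model(H, Ms, H0, chi):
          return Ms * np.tanh(H / H0) + chi * H

      H = np.linspace(-5.0, 5.0, 201)               # applied field (arb. units)
      rng = np.random.default_rng(4)
      M = model(H, 0.8, 0.3, 0.05) + 0.005 * rng.normal(size=H.size)

      (Ms, H0, chi), _ = curve_fit(model, H, M, p0=[1.0, 0.5, 0.0])
      M_ferro = Ms * np.tanh(H / H0)                # ferromagnetic component
      M_para = chi * H                              # paramagnetic component
      print(f"Ms = {Ms:.3f}, chi = {chi:.3f}")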

  12. The Adversarial Route Analysis Tool: A Web Application

    SciTech Connect

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  13. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  14. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  15. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  16. An Automated Data Analysis Tool for Livestock Market Data

    ERIC Educational Resources Information Center

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  19. Separation analysis, a tool for analyzing multigrid algorithms

    NASA Technical Reports Server (NTRS)

    Costiner, Sorin; Taasan, Shlomo

    1995-01-01

    The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier type analysis. The separation operator of a two level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose performance relaxations and inter-level transfers. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.

  20. Generalized iterative deconvolution for receiver function estimation

    NASA Astrophysics Data System (ADS)

    Wang, Yinzhi; Pavlis, Gary L.

    2016-02-01

    This paper describes a generalization of the iterative deconvolution method commonly used as a component of passive array wavefield imaging. We show that the iterative method should be thought of as a sparse output deconvolution method with the number of terms retained dependent on the convergence criteria. The generalized method we introduce uses an inverse operator to shape the assumed wavelet to a peaked function at zero lag. We show that the conventional method is equivalent to using a damped least-squares spiking filter with extremely large damping and proper scaling. In that case, the inverse operator used in the generalized method reduces to the cross-correlation operator. The theoretical insight of realizing the output is a sparse series provides a basis for the second important addition of the generalized method: an output shaping wavelet. A constant output shaping wavelet is a critical component in scattered wave imaging to avoid mixing data of variable bandwidth. We demonstrate that the new approach can improve resolution by using an inverse operator tuned to maximize resolution. We also show that the signal-to-noise ratio of the result can be improved by applying a different convergence criterion than the standard method, which measures the energy left after each iteration. The efficacy of the approach was evaluated with synthetic experiments in various signal and noise conditions. We further validated the approach with real data from the USArray. We compared our results with data from the EarthScope Automated Receiver Survey and found modest improvements in consistency, measured by correlation coefficients with station stacks, and a reduced number of outliers.
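
    The conventional iterative time-domain deconvolution that the authors generalize can be illustrated with a minimal sketch: at each iteration the residual is cross-correlated with the assumed wavelet, a spike is placed at the lag of maximum correlation, the scaled wavelet is subtracted, and iteration stops when the residual energy stops decreasing. This is a generic illustration, not the authors' code; the function name, tolerance and test signal are assumptions.

```python
import numpy as np

def iterative_deconvolution(d, w, max_spikes=200, tol=1e-3):
    """Sparse-spike iterative deconvolution (illustrative sketch).

    d : observed trace, assumed to be a spike series convolved with wavelet w
    w : assumed source wavelet, shorter than d
    Returns a spike series r such that np.convolve(r, w)[:len(d)] approximates d.
    """
    n = len(d)
    r = np.zeros(n)
    residual = np.asarray(d, dtype=float).copy()
    wnorm = float(np.dot(w, w))
    prev_energy = float(np.dot(residual, residual))
    for _ in range(max_spikes):
        # cross-correlate the residual with the wavelet; keep non-negative lags only
        xc = np.correlate(residual, w, mode="full")[len(w) - 1:]
        lag = int(np.argmax(np.abs(xc)))
        amp = xc[lag] / wnorm                      # least-squares spike amplitude
        r[lag] += amp
        # subtract the scaled, shifted wavelet from the residual
        end = min(n, lag + len(w))
        residual[lag:end] -= amp * w[:end - lag]
        energy = float(np.dot(residual, residual))
        if prev_energy - energy < tol * prev_energy:   # energy-based stopping rule
            break
        prev_energy = energy
    return r

# Tiny demonstration with a hypothetical wavelet and two spikes.
w = np.array([0.2, 1.0, 0.5, 0.1])
true = np.zeros(50)
true[10] = 1.0
true[30] = -0.6
d = np.convolve(true, w)[:50]
print(np.flatnonzero(np.abs(iterative_deconvolution(d, w)) > 0.1))  # expected near [10, 30]
```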

  1. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    NASA Astrophysics Data System (ADS)

    Clough, D.; Fletcher, S.; Longstaff, A. P.; Willoughby, P.

    2012-05-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increases the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high value parts, for manufacturers to obtain a thermal profile of their machine to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of the thermal errors. In particular, it considers the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.
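
    One common way to link easily measured temperatures to the thermal displacement they cause is a simple multivariable regression fitted to calibration data. The sketch below is a generic illustration of that idea only; the sensor layout, calibration numbers and linear model form are assumptions, not the authors' condition-monitoring model.

```python
import numpy as np

# Hypothetical calibration data: rows are time samples,
# columns are temperature sensors (spindle front, spindle rear, ambient).
temperatures = np.array([
    [22.1, 21.8, 20.5],
    [25.4, 23.9, 20.7],
    [28.9, 26.3, 20.9],
    [31.2, 28.0, 21.0],
])
# Measured spindle growth (micrometres) at the same times, e.g. from a test mandrel.
thermal_error_um = np.array([0.0, 4.8, 10.1, 13.9])

# Least-squares fit: error ~ c0 + c1*T1 + c2*T2 + c3*T3
X = np.column_stack([np.ones(len(temperatures)), temperatures])
coeffs, *_ = np.linalg.lstsq(X, thermal_error_um, rcond=None)

def predict_error(temps):
    """Predict thermal error from a new temperature reading (illustrative only)."""
    return float(coeffs[0] + np.dot(coeffs[1:], temps))

print(predict_error([27.0, 25.0, 20.8]))
```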

  2. Discovery and New Frontiers Project Budget Analysis Tool

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), the mission development and operations profile by phase (percent of total mission cost and duration), the launch vehicle, and the launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect on the overall program budget of changes in future mission or launch vehicle costs, of differing development profiles or operational durations of a future mission, or of a replan of a current mission. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.
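
    The basic roll-up described above can be sketched as follows: spread each mission's fixed-year cost over its phases, inflate each year's share to real-year dollars, and sum across missions per year. The mission data, phase splits and flat inflation rate below are purely hypothetical and are not drawn from the tool.

```python
from collections import defaultdict

INFLATION = 0.027          # assumed flat annual inflation rate
BASE_YEAR = 2011           # fixed-year dollars are quoted in this year

# Hypothetical missions: total cost in fixed-year $M, phase start year, duration (years),
# and the fraction of total cost spent in that phase.
missions = [
    {"name": "Mission A", "total": 450.0,
     "phases": [("development", 2012, 3, 0.70), ("operations", 2015, 4, 0.30)]},
    {"name": "Mission B", "total": 600.0,
     "phases": [("development", 2014, 4, 0.75), ("operations", 2018, 3, 0.25)]},
]

def real_year_profile(missions):
    """Roll up per-year program costs in real-year dollars (illustrative sketch)."""
    profile = defaultdict(float)
    for m in missions:
        for _, start, duration, fraction in m["phases"]:
            yearly_fixed = m["total"] * fraction / duration     # flat spread within phase
            for year in range(start, start + duration):
                profile[year] += yearly_fixed * (1 + INFLATION) ** (year - BASE_YEAR)
    return dict(sorted(profile.items()))

for year, cost in real_year_profile(missions).items():
    print(year, round(cost, 1))
```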

  3. The Cube Analysis and Rendering Tool for Astronomy

    NASA Astrophysics Data System (ADS)

    Rosolowsky, E.; Kern, J.; Federl, P.; Jacobs, J.; Loveland, S.; Taylor, J.; Sivakoff, G.; Taylor, R.

    2015-09-01

    We present the design principles and current status of the Cube Analysis and Rendering Tool for Astronomy (CARTA). The CARTA project is designing a cube visualization tool for the Atacama Large Millimeter/submillimeter Array. CARTA will join the domain-specific software already developed for millimetre-wave interferometry with a server-side visualization solution. This connection will enable archive-hosted exploration of three-dimensional data cubes. CARTA will also provide a desktop client that is indistinguishable from the web-based interface. While such a goal is ambitious for a short project, the team is focusing on a well-developed framework which can readily accommodate community code development through plugins.

  4. A dataflow analysis tool for parallel processing of algorithms

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool are presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
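
    A basic example of the kind of performance bound obtainable from a dataflow graph is the critical (longest) path through the task DAG, which lower-bounds the schedule length regardless of how many processors are available. The graph and task times below are invented for illustration and are not taken from the paper.

```python
from functools import lru_cache

# Hypothetical dataflow graph: node -> (execution time, list of successor nodes).
graph = {
    "read":    (2, ["filter"]),
    "filter":  (5, ["fft", "gain"]),
    "fft":     (8, ["combine"]),
    "gain":    (3, ["combine"]),
    "combine": (4, []),
}

@lru_cache(maxsize=None)
def longest_path_from(node):
    """Length of the critical path starting at `node` (lower bound on schedule length)."""
    time, successors = graph[node]
    return time + max((longest_path_from(s) for s in successors), default=0)

critical_path_bound = max(longest_path_from(n) for n in graph)
print(critical_path_bound)   # 19 for this graph: read -> filter -> fft -> combine
```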

  5. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  6. Software tools for the analysis of high resolution mass spectra

    SciTech Connect

    Price, J.M.; Myers, J.D.; Rockwood, A.W.

    1995-12-31

    The authors present a suite of software tools for the analysis of the high resolution mass spectra of large molecular weight species. This program, called MERCURY, includes tools for: the prediction of isotope distribution patterns for arbitrary molecular formulas, including fragment ions; a prototype automated wavelet transform algorithm for assigning the charge states of spectral features; a "protein editor" to conveniently generate molecular formulas for biopolymers and biopolymer fragment ions from sequence information; and the assignment of protein sequences by computer assisted interpretation of ladder sequences from collisionally activated dissociation (CAD) data.
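
    The isotope-pattern prediction mentioned can be sketched by repeatedly convolving the isotopic abundance distributions of the constituent elements on a nominal-mass grid. The rounded abundance table and example formula below are illustrative assumptions; this is not the MERCURY code.

```python
import numpy as np

# Rounded natural abundances on a unit (nominal) mass grid, lightest isotope first.
ISOTOPES = {
    "C": [0.989, 0.011],           # 12C, 13C
    "H": [0.99988, 0.00012],       # 1H, 2H
    "N": [0.9964, 0.0036],         # 14N, 15N
    "O": [0.9976, 0.0004, 0.0020], # 16O, 17O, 18O
}

def isotope_pattern(formula, max_peaks=8):
    """Relative isotope pattern for a composition dict, e.g. {'C': 6, 'H': 12, 'O': 6}."""
    pattern = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):
            # adding one atom convolves its isotope distribution into the pattern
            pattern = np.convolve(pattern, ISOTOPES[element])
    pattern = pattern[:max_peaks]
    return pattern / pattern.max()        # normalise to the most abundant peak

print(isotope_pattern({"C": 6, "H": 12, "O": 6}))   # glucose composition, illustrative
```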

  7. Mass Spectrometry Tools for Analysis of Intermolecular Interactions

    PubMed Central

    Auclair, Jared R.; Somasundaran, Mohan; Green, Karin M.; Evans, James E.; Schiffer, Celia A.; Ringe, Dagmar; Petsko, Gregory A.; Agar, Jeffrey N.

    2015-01-01

    The small quantities of protein required for mass spectrometry (MS) make it a powerful tool to detect binding (protein-protein, protein-small molecule, etc.) of proteins that are difficult to express in large quantities, as is the case for many intrinsically disordered proteins. Chemical cross-linking, proteolysis, and MS analysis, combined, are a powerful tool for the identification of binding domains. Here, we present a traditional approach to determine protein-protein interaction binding sites using heavy water (18O) as a label. This technique is relatively inexpensive and can be performed on any mass spectrometer without specialized software. PMID:22821539

  8. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  9. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow), improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
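
    The combination of Monte Carlo generation with n-factor combinatorial coverage can be illustrated by a small greedy sketch that keeps drawing random parameter vectors until every 2-factor value combination has been exercised. The parameter names and values are invented, and the actual tool's generation strategy is certainly more sophisticated than this.

```python
import itertools
import random

# Hypothetical discretised parameter space for a simulation.
parameters = {
    "mass_kg":      [90, 100, 110],
    "thruster":     ["A", "B"],
    "sensor_noise": ["low", "high"],
    "controller":   ["pid", "lqr"],
}

def pairwise_monte_carlo(parameters, seed=0):
    """Randomly generated cases that together cover all 2-factor value combinations."""
    rng = random.Random(seed)
    names = list(parameters)
    # All (parameter, value) pairs that must appear together in at least one case.
    targets = set()
    for n1, n2 in itertools.combinations(names, 2):
        for v1, v2 in itertools.product(parameters[n1], parameters[n2]):
            targets.add(((n1, v1), (n2, v2)))
    cases = []
    while targets:
        case = {n: rng.choice(parameters[n]) for n in names}
        covered = {((n1, case[n1]), (n2, case[n2]))
                   for n1, n2 in itertools.combinations(names, 2)}
        if covered & targets:             # keep only cases that cover something new
            cases.append(case)
            targets -= covered
    return cases

print(len(pairwise_monte_carlo(parameters)))   # far fewer cases than the full 36-point grid
```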

  10. Galaxy, a web-based genome analysis tool for experimentalists

    PubMed Central

    Blankenberg, Daniel; Von Kuster, Gregory; Coraor, Nathaniel; Ananda, Guruprasad; Lazarus, Ross; Mangan, Mary; Nekrutenko, Anton; Taylor, James

    2014-01-01

    High-throughput data production has revolutionized molecular biology. However, massive increases in data generation capacity require analysis approaches that are more sophisticated, and often very computationally intensive. Thus, making sense of high-throughput data requires informatics support. Galaxy (http://galaxyproject.org) is a software system that provides this support through a framework that gives experimentalists simple interfaces to powerful tools, while automatically managing the computational details. Galaxy is available both as a publicly available web service, which provides tools for the analysis of genomic, comparative genomic, and functional genomic data, and as a downloadable package that can be deployed in individual labs. Either way, it allows experimentalists without informatics or programming expertise to perform complex large-scale analysis with just a web browser. PMID:20069535

  11. Pathway analysis tools and toxicogenomics reference databases for risk assessment.

    PubMed

    Ganter, Brigitte; Zidek, Nadine; Hewitt, Philip R; Müller, Dieter; Vladimirova, Antoaneta

    2008-01-01

    The pharmaceutical industry has begun to leverage a range of new technologies (proteomics, pharmacogenomics, metabolomics and molecular toxicology [e.g., toxicogenomics]) and analysis tools that are becoming increasingly integrated in the area of drug discovery and development. The approach of analyzing the vast amount of toxicogenomics data generated using molecular pathway and networks analysis tools in combination with analysis of reference data will be the main focus of this review. We will demonstrate how this combined approach can increase the understanding of the molecular mechanisms that lead to chemical-induced toxicity and application of this knowledge to compound risk assessment. We will provide an example of the insights achieved through a molecular toxicology analysis based on the well-known hepatotoxicant lipopolysaccharide to illustrate the utility of these new tools in the analysis of complex data sets, both in vivo and in vitro. The ultimate objective is a better lead selection process that improves the chances for success across the different stages of drug discovery and development. PMID:18154447

  12. Blind deconvolution of two-dimensional complex data

    SciTech Connect

    Ghiglia, D.C.; Romero, L.A.

    1994-01-01

    Inspired by the work of Lane and Bates on automatic multidimensional deconvolution, the authors have developed a systematic approach and an operational code for performing the deconvolution of multiply-convolved two-dimensional complex data sets in the absence of noise. They explain, in some detail, the major algorithmic steps, where noise or numerical errors can cause problems, their approach in dealing with numerical rounding errors, and where special noise-mitigating techniques can be used toward making blind deconvolution practical. Several examples of deconvolved imagery are presented, and future research directions are noted.

  13. Phenostat: visualization and statistical tool for analysis of phenotyping data.

    PubMed

    Reuveni, Eli; Carola, Valeria; Banchaabouchi, Mumna Al; Rosenthal, Nadia; Hancock, John M; Gross, Cornelius

    2007-09-01

    The effective extraction of information from multidimensional data sets derived from phenotyping experiments is a growing challenge in biology. Data visualization tools are important resources that can aid in exploratory data analysis of complex data sets. Phenotyping experiments of model organisms produce data sets in which a large number of phenotypic measures are collected for each individual in a group. A critical initial step in the analysis of such multidimensional data sets is the exploratory analysis of data distribution and correlation. To facilitate the rapid visualization and exploratory analysis of multidimensional complex trait data, we have developed a user-friendly, web-based software tool called Phenostat. Phenostat is composed of a dynamic graphical environment that allows the user to inspect the distribution of multiple variables in a data set simultaneously. Individuals can be selected by directly clicking on the graphs and thus displaying their identity, highlighting corresponding values in all graphs, allowing their inclusion or exclusion from the analysis. Statistical analysis is provided by R package functions. Phenostat is particularly suited for rapid distribution and correlation analysis of subsets of data. An analysis of behavioral and physiologic data stemming from a large mouse phenotyping experiment using Phenostat reveals previously unsuspected correlations. Phenostat is freely available to academic institutions and nonprofit organizations and can be used from our website at: (http://www.bioinfo.embl.it/phenostat/). PMID:17674099

  14. Field Quality Analysis as a Tool to Monitor Magnet Production

    SciTech Connect

    Gupta, R.; Anerella, M.; Cozzolino, J.; Fisher, D.; Ghosh, A.; Jain, A.; Sampson, W.; Schmalzle, J.; Thompson, P.; Wanderer, P.; Willen, E.

    1997-10-18

    Field harmonics offer a powerful tool to examine the mechanical structure of accelerator magnets. A large deviation from the nominal values suggests a mechanical defect. Magnets with such defects are likely to have a poor quench performance. Similarly, a trend suggests a wear in tooling or a gradual change in the magnet assembly or in the size of a component. This paper presents the use of the field quality as a tool to monitor the magnet production of the Relativistic Heavy Ion Collider (RHIC). Several examples are briefly described. Field quality analysis can also rule out a suspected geometric error if it cannot be supported by the symmetry and the magnitude of the measured harmonics.

  15. The design and implementation of a workflow analysis tool.

    PubMed

    Curcin, Vasa; Ghanem, Moustafa; Guo, Yike

    2010-09-13

    Motivated by the use of scientific workflows as a user-oriented mechanism for building executable scientific data integration and analysis applications, this article introduces a framework and a set of associated methods for analysing the execution properties of scientific workflows. Our framework uses a number of formal modelling techniques to characterize the process and data behaviour of workflows and workflow components and to reason about their functional and execution properties. We use the framework to design the architecture of a customizable tool that can be used to analyse the key execution properties of scientific workflows at the authoring stage. Our design is generic and can be applied to a wide variety of scientific workflow languages and systems, and is evaluated by building a prototype of the tool for the Discovery Net system. We demonstrate and discuss the utility of the framework and tool using workflows from a real-world medical informatics study. PMID:20679131

  16. Microscopy image segmentation tool: Robust image data analysis

    NASA Astrophysics Data System (ADS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  17. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  18. Limited-memory scaled gradient projection methods for real-time image deconvolution in microscopy

    NASA Astrophysics Data System (ADS)

    Porta, F.; Zanella, R.; Zanghirati, G.; Zanni, L.

    2015-04-01

    Gradient projection methods have given rise to effective tools for image deconvolution in several relevant areas, such as microscopy, medical imaging and astronomy. Due to the large scale of the optimization problems arising in today's imaging applications and to the growing demand for real-time reconstructions, an interesting challenge is to design new acceleration techniques for gradient schemes that preserve their simplicity and the low computational cost of each iteration. In this work we propose an acceleration strategy for a state-of-the-art scaled gradient projection method for image deconvolution in microscopy. The acceleration idea is derived by adapting a step-length selection rule, recently introduced for limited-memory steepest descent methods in unconstrained optimization, to the special constrained optimization framework arising in image reconstruction. We describe how important issues related to the generalization of the step-length rule to the imaging optimization problem have been faced, and we evaluate the improvements due to the acceleration strategy by numerical experiments on large-scale image deconvolution problems.
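
    A stripped-down illustration of this class of methods is gradient projection with an adaptive (Barzilai-Borwein-type) step length applied to a nonnegatively constrained least-squares deconvolution. The sketch below omits the scaling matrix, the limited-memory step-length rule and the line search that the actual algorithm relies on, and the dense blur matrix is only for demonstration (real codes work with FFTs).

```python
import numpy as np

def gradient_projection_deconvolve(A, b, iters=200):
    """Minimise 0.5*||A x - b||^2 subject to x >= 0 with a projected BB gradient method."""
    x = np.maximum(b, 0.0)                                # feasible starting point
    grad = A.T @ (A @ x - b)
    step = 1.0 / max(np.linalg.norm(A, 2) ** 2, 1e-12)    # safe initial step length
    for _ in range(iters):
        x_new = np.maximum(x - step * grad, 0.0)          # gradient step + projection
        grad_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, grad_new - grad
        sy = float(s @ y)
        if sy > 1e-12:
            step = float(s @ s) / sy                      # BB1 step length
        x, grad = x_new, grad_new
    return x

# Tiny usage example with a hypothetical 1-D Gaussian blur.
rng = np.random.default_rng(0)
n = 64
kernel = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2)
A = np.zeros((n, n))
for i in range(n):
    for j, k in enumerate(range(i - 4, i + 5)):
        if 0 <= k < n:
            A[i, k] = kernel[j]
x_true = np.maximum(rng.normal(size=n), 0.0)
b = A @ x_true + 0.01 * rng.normal(size=n)
x_hat = gradient_projection_deconvolve(A, b)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```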

  19. Deconvolution of Complex 1D NMR Spectra Using Objective Model Selection

    PubMed Central

    Hughes, Travis S.; Wilson, Henry D.; de Vera, Ian Mitchelle S.; Kojetin, Douglas J.

    2015-01-01

    Fluorine (19F) NMR has emerged as a useful tool for characterization of slow dynamics in 19F-labeled proteins. One-dimensional (1D) 19F NMR spectra of proteins can be broad, irregular and complex, due to exchange of probe nuclei between distinct electrostatic environments, and therefore cannot be deconvoluted and analyzed in an objective way using currently available software. We have developed a Python-based deconvolution program, decon1d, which uses Bayesian information criteria (BIC) to objectively determine which model (number of peaks) would most likely produce the experimentally obtained data. The method also allows for fitting of intermediate exchange spectra, which is not supported by current software in the absence of a specific kinetic model. In current methods, determination of the deconvolution model best supported by the data is done manually through comparison of residual error values, which can be time consuming and requires model selection by the user. In contrast, the BIC method used by decon1d provides a quantitative method for model comparison that penalizes for model complexity, helping to prevent over-fitting of the data and allowing identification of the most parsimonious model. The decon1d program is freely available as a downloadable Python script at the project website (https://github.com/hughests/decon1d/). PMID:26241959
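
    The BIC-based selection idea can be illustrated generically: fit the spectrum with an increasing number of peaks and keep the model with the lowest BIC, which penalizes the extra parameters of each added peak. The Gaussian line shape, synthetic data and fitting routine below are assumptions for illustration only and are not the decon1d implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def multi_gaussian(x, *params):
    """Sum of Gaussians; params = (amplitude, centre, width) repeated per peak."""
    y = np.zeros_like(x)
    for i in range(0, len(params), 3):
        amp, centre, width = params[i:i + 3]
        y += amp * np.exp(-0.5 * ((x - centre) / width) ** 2)
    return y

def best_model_by_bic(x, y, max_peaks=4):
    """Return (number of peaks, fitted parameters) for the model with the lowest BIC."""
    n = len(x)
    best = None
    for k in range(1, max_peaks + 1):
        # crude initial guesses spread across the x range
        p0 = []
        for centre in np.linspace(x.min(), x.max(), k + 2)[1:-1]:
            p0 += [y.max(), centre, (x.max() - x.min()) / (4 * k)]
        try:
            popt, _ = curve_fit(multi_gaussian, x, y, p0=p0, maxfev=20000)
        except RuntimeError:
            continue
        rss = float(np.sum((y - multi_gaussian(x, *popt)) ** 2))
        bic = n * np.log(rss / n) + 3 * k * np.log(n)   # Gaussian-noise BIC, 3 params per peak
        if best is None or bic < best[0]:
            best = (bic, k, popt)
    return best[1], best[2]

# Synthetic two-peak "spectrum" (illustrative).
x = np.linspace(0, 10, 400)
y = multi_gaussian(x, 1.0, 3.0, 0.4, 0.6, 6.5, 0.7)
y += 0.02 * np.random.default_rng(1).normal(size=x.size)
print(best_model_by_bic(x, y)[0])   # expected to select 2 peaks
```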

  20. Systematic Omics Analysis Review (SOAR) tool to support risk assessment.

    PubMed

    McConnell, Emma R; Bell, Shannon M; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  1. Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment

    PubMed Central

    McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  2. Industrial Geospatial Analysis Tool for Energy Evaluation (IGATE-E)

    SciTech Connect

    Alkadi, Nasr E; Starke, Michael R; Ma, Ookie; Nimbalkar, Sachin U; Cox, Daryl

    2013-01-01

    IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries by industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process-step energy consumption, and major energy-intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimations and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

  3. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    PubMed

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-01

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification. PMID:26002210
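
    The Procrustes comparison itself is simple to sketch: rotate one set of projection scores onto another and use the residual after the optimal rotation as a dissimilarity. The example below uses scipy's orthogonal Procrustes solver on two hypothetical score matrices; the actual "Procrustes maps" sweep this comparison over a grid of the parameter being optimized.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

def procrustes_dissimilarity(scores_a, scores_b):
    """Residual norm after optimally rotating scores_b onto scores_a (smaller = more similar)."""
    A = scores_a - scores_a.mean(axis=0)       # centre both projections
    B = scores_b - scores_b.mean(axis=0)
    A /= np.linalg.norm(A)                     # scale so the measure is relative
    B /= np.linalg.norm(B)
    R, _ = orthogonal_procrustes(B, A)         # rotation mapping B toward A
    return float(np.linalg.norm(B @ R - A))

# Two hypothetical 2-D projections of the same 50 samples.
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 2))
rotated = base @ np.array([[0.0, -1.0], [1.0, 0.0]]) + 0.05 * rng.normal(size=(50, 2))
print(procrustes_dissimilarity(base, rotated))   # small value: projections agree up to rotation
```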

  4. Tools for comparative analysis of alternatives: competing or complementary perspectives?

    PubMed

    Hofstetter, Patrick; Bare, Jane C; Hammitt, James K; Murphy, Patricia A; Rice, Glenn E

    2002-10-01

    A third generation of environmental policy making and risk management will increasingly impose environmental measures, which may give rise to analyzing countervailing risks. Therefore, a comprehensive analysis of all risks associated with the decision alternatives will aid decision-makers in prioritizing alternatives that effectively reduce both target and countervailing risks. Starting with the metaphor of the ripples caused by a stone that is thrown into a pond, we identify 10 types of ripples that symbolize, in our case, risks that deserve closer examination: direct, upstream, downstream, accidental risks, occupational risks, risks due to offsetting behavior, change in disposable income, macro-economic changes, depletion of natural resources, and risks to the manmade environment. Tools to analyze these risks were developed independently and recently have been applied to overlapping fields of application. This suggests that either the tools should be linked in a unified framework for comparative analysis or that the appropriate field of application for single tools should be better understood. The goals of this article are to create a better foundation for the understanding of the nature and coverage of available tools and to identify the remaining gaps. None of the tools is designed to deal with all 10 types of risk. Provided data suggest that, of the 10 types of identified risks, those associated with changes in disposable income may be particularly significant when decision alternatives differ with respect to their effects on disposable income. Finally, the present analysis was limited to analytical questions and did not capture the important role of the decision-making process itself. PMID:12442983

  5. SATRAT: Staphylococcus aureus transcript regulatory network analysis tool

    PubMed Central

    Nagarajan, Vijayaraj; Elasri, Mohamed O.

    2015-01-01

    Staphylococcus aureus is a commensal organism that primarily colonizes the nose of healthy individuals. S. aureus causes a spectrum of infections that range from skin and soft-tissue infections to fatal invasive diseases. S. aureus uses a large number of virulence factors that are regulated in a coordinated fashion. The complex regulatory mechanisms have been investigated in numerous high-throughput experiments. Access to this data is critical to studying this pathogen. Previously, we developed a compilation of microarray experimental data to enable researchers to search, browse, compare, and contrast transcript profiles. We have substantially updated this database and have built a novel exploratory tool, SATRAT (the S. aureus transcript regulatory network analysis tool), based on the updated database. This tool is capable of performing deep searches using a query and generating an interactive regulatory network based on associations among the regulators of any query gene. We believe this integrated regulatory network analysis tool would help researchers explore the missing links and identify novel pathways that regulate virulence in S. aureus. Also, the data model and the network generation code used to build this resource are open sourced, enabling researchers to build similar resources for other bacterial systems. PMID:25653902

  6. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
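
    The core operation, flagging relief breakpoints along a longitudinal stream profile extracted from a DEM, can be illustrated with a simple slope-change test. The profile, threshold and function name below are invented, and the published method is based on Hack's (1973) stream-gradient approach rather than this raw finite-difference check.

```python
import numpy as np

def find_knickpoints(distance_m, elevation_m, slope_jump=0.02):
    """Indices where the channel slope changes abruptly along the profile (illustrative sketch)."""
    slope = np.diff(elevation_m) / np.diff(distance_m)    # local channel gradient per segment
    jumps = slope[1:] - slope[:-1]                        # change in gradient between segments
    return [i + 1 for i, j in enumerate(jumps) if abs(j) > slope_jump]

# Hypothetical profile: distance from headwater (m) vs elevation (m), with one sharp break.
distance = np.array([0, 500, 1000, 1500, 2000, 2500, 3000], dtype=float)
elevation = np.array([800, 790, 781, 773, 730, 690, 652], dtype=float)
print(find_knickpoints(distance, elevation))   # flags the steepening between 1500 m and 2000 m
```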

  7. A conceptual design tool for RBCC engine performance analysis

    NASA Astrophysics Data System (ADS)

    Olds, John R.; Saks, Greg

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework.

  8. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

  9. Matrix of Response Functions for Deconvolution of Gamma-ray Spectra

    NASA Astrophysics Data System (ADS)

    Shustov, A. E.; Ulin, S. E.

    An approach to creating the response-function matrix for a xenon gamma-ray detector is discussed. A set of gamma-ray spectra was obtained by Geant4 simulation to generate the matrix. The iterative algorithms used allow the initial gamma-ray spectra to be deconvolved and restored. The processed spectrum contains peaks that help to identify a radioactive source and estimate its activity. Results and analysis of the deconvolution of experimental spectra are shown.
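
    The iterative deconvolution step can be illustrated with a standard multiplicative (Richardson-Lucy / ML-EM type) unfolding against a response matrix. The toy matrix and measured spectrum below are placeholders; the detector-specific response described in the record comes from Geant4 simulation, not from this sketch.

```python
import numpy as np

def unfold_spectrum(R, measured, iters=200):
    """Multiplicative iterative unfolding: estimate s such that R @ s approximates `measured`.

    R        : response matrix, R[i, j] = probability that a photon from true bin j
               is recorded in measured bin i
    measured : measured pulse-height spectrum (counts)
    """
    s = np.full(R.shape[1], measured.sum() / R.shape[1])   # flat, positive starting spectrum
    for _ in range(iters):
        predicted = R @ s
        predicted[predicted <= 0] = 1e-12                  # avoid division by zero
        s *= (R.T @ (measured / predicted)) / R.T.sum(axis=1).clip(min=1e-12)
    return s

# Toy 3-bin example: a full-energy peak plus downscatter into lower bins.
R = np.array([[0.6, 0.2, 0.1],
              [0.1, 0.6, 0.2],
              [0.0, 0.1, 0.6]])
true_spectrum = np.array([100.0, 0.0, 300.0])
measured = R @ true_spectrum
print(np.round(unfold_spectrum(R, measured)))   # approaches the true spectrum
```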

  10. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

  11. Blind deconvolution for ultrasound sequences using a noninverse greedy algorithm.

    PubMed

    Chira, Liviu-Teodor; Rusu, Corneliu; Tauber, Clovis; Girault, Jean-Marc

    2013-01-01

    The blind deconvolution of ultrasound sequences in medical ultrasound technique is still a major problem despite the efforts made. This paper presents a blind noninverse deconvolution algorithm to eliminate the blurring effect, using the envelope of the acquired radio-frequency sequences and an a priori Laplacian distribution for the deconvolved signal. The algorithm is executed in two steps. Firstly, the point spread function is automatically estimated from the measured data. Secondly, the data are reconstructed in a nonblind way using the proposed algorithm. The algorithm is a nonlinear blind deconvolution that works as a greedy algorithm. The results on simulated signals and real images are compared with different state-of-the-art deconvolution methods. Our method shows good results for scatterer detection, speckle noise suppression, and execution time. PMID:24489533

  12. The discrete Kalman filtering approach for seismic signals deconvolution

    SciTech Connect

    Kurniadi, Rizal; Nurhandoko, Bagus Endar B.

    2012-06-20

    Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is deconvolution; conventional deconvolution uses inverse filters based on Wiener filter theory. This theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is then used to generate an estimate of the reflectivity function. The main advantages of Kalman filtering are its capability to handle continually time-varying models and its high resolution. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering process works on the reflectivity function; hence the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with the energy waveform, which is referred to as the seismic wavelet. A higher-frequency wavelet gives a smaller wavelength; graphs of these results are presented.
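
    For reference, the discrete Kalman filter recursion applied to the estimate has the standard predict/update form. The scalar random-walk state model below is only a generic illustration, not the seismic state-space model used in the paper.

```python
import numpy as np

def kalman_filter_1d(measurements, process_var=1e-3, meas_var=1e-1):
    """Scalar discrete Kalman filter with a random-walk state model (illustrative)."""
    x, P = 0.0, 1.0                       # state estimate and its variance
    estimates = []
    for z in measurements:
        # predict: x_k|k-1 = x_k-1, P_k|k-1 = P_k-1 + Q (random-walk dynamics)
        P = P + process_var
        # update with measurement z_k (H = 1): K = P / (P + R)
        K = P / (P + meas_var)
        x = x + K * (z - x)
        P = (1.0 - K) * P
        estimates.append(x)
    return np.array(estimates)

# Noisy observations of a slowly varying signal (hypothetical).
rng = np.random.default_rng(0)
truth = np.cumsum(0.02 * rng.normal(size=200))
observed = truth + 0.3 * rng.normal(size=200)
filtered = kalman_filter_1d(observed)
print(float(np.mean((filtered - truth) ** 2)) < float(np.mean((observed - truth) ** 2)))
```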

  13. Deconvolution Estimation in Measurement Error Models: The R Package decon

    PubMed Central

    Wang, Xiao-Feng; Wang, Bin

    2011-01-01

    Data from many scientific areas often come with measurement error. Density or distribution function estimation from contaminated data and nonparametric regression with errors-in-variables are two important topics in measurement error models. In this paper, we present a new software package decon for R, which contains a collection of functions that use the deconvolution kernel methods to deal with the measurement error problems. The functions allow the errors to be either homoscedastic or heteroscedastic. To make the deconvolution estimators computationally more efficient in R, we adapt the fast Fourier transform algorithm for density estimation with error-free data to the deconvolution kernel estimation. We discuss the practical selection of the smoothing parameter in deconvolution methods and illustrate the use of the package through both simulated and real examples. PMID:21614139
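
    As a reminder of the underlying estimator (written here in generic textbook notation rather than the package's own), the deconvolution kernel density estimate for observations Y_j = X_j + ε_j with known error characteristic function φ_ε is

```latex
\hat f_X(x) \;=\; \frac{1}{nh}\sum_{j=1}^{n} K^{*}\!\left(\frac{x - Y_j}{h}\right),
\qquad
K^{*}(u) \;=\; \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-\mathrm{i}tu}\,
\frac{\phi_K(t)}{\phi_\varepsilon(t/h)}\,\mathrm{d}t,
```

    where K is an ordinary kernel with Fourier transform φ_K and h is the smoothing bandwidth; the FFT adaptation mentioned in the abstract presumably evaluates this estimator on a grid rather than pointwise.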

  14. Blind Deconvolution for Ultrasound Sequences Using a Noninverse Greedy Algorithm

    PubMed Central

    Chira, Liviu-Teodor; Rusu, Corneliu; Tauber, Clovis; Girault, Jean-Marc

    2013-01-01

    The blind deconvolution of ultrasound sequences in medical ultrasound technique is still a major problem despite the efforts made. This paper presents a blind noninverse deconvolution algorithm to eliminate the blurring effect, using the envelope of the acquired radio-frequency sequences and an a priori Laplacian distribution for the deconvolved signal. The algorithm is executed in two steps. Firstly, the point spread function is automatically estimated from the measured data. Secondly, the data are reconstructed in a nonblind way using the proposed algorithm. The algorithm is a nonlinear blind deconvolution that works as a greedy algorithm. The results on simulated signals and real images are compared with different state-of-the-art deconvolution methods. Our method shows good results for scatterer detection, speckle noise suppression, and execution time. PMID:24489533

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  16. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES Beta

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; et al

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  17. SMART (Shop floor Modeling, Analysis and Reporting Tool Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description of it is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

  18. The RUMBA software: tools for neuroimaging data analysis.

    PubMed

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses. PMID:15067169

  19. Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)

    1999-01-01

    A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors, as embodied in one preferred embodiment in a global positioning system receiver. The method and receiver of the present invention estimate and then compensate for multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

  20. AstroStat-A VO tool for statistical analysis

    NASA Astrophysics Data System (ADS)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analyses are done using the public-domain statistical software R, and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser, and as an offline application. AstroStat provides an easy-to-use interface which allows for both fetching data and performing powerful statistical analysis on them.

  1. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
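
    As a concrete illustration of the kind of Markov reliability model such a tool would construct automatically, the sketch below builds the generator matrix of a simple duplex system by hand and evaluates its reliability over a mission time. The states, failure rate, and mission duration are invented for the example and are not taken from the paper.

```python
# Minimal sketch: reliability of a duplex (1-out-of-2) system from a
# hand-built Markov model.  States: 0 = both units up, 1 = one unit up,
# 2 = system failed (absorbing).  The failure rate and mission time are
# illustrative values only.
import numpy as np
from scipy.linalg import expm

lam = 1e-4          # assumed per-unit failure rate [1/h]
Q = np.array([
    [-2 * lam, 2 * lam, 0.0],   # from state 0: either unit fails
    [0.0,     -lam,     lam],   # from state 1: remaining unit fails
    [0.0,      0.0,     0.0],   # state 2 is absorbing
])

p0 = np.array([1.0, 0.0, 0.0])  # start with both units operational
t = 1000.0                      # mission time [h]
p_t = p0 @ expm(Q * t)          # state probabilities at time t
print(f"reliability at {t} h: {1.0 - p_t[2]:.6f}")
```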

  2. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  3. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
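
    The temporal processing idea can be illustrated with a toy maximum-value composite: within each compositing window, the largest cloud-free value is kept per pixel. This is only a generic sketch on synthetic data; the array shapes, the QA masking, and the 8-day window are assumptions, not TSPT internals.

```python
# Toy maximum-value composite over a daily NDVI stack: within each
# compositing window, keep the largest valid NDVI per pixel so that
# cloudy (flagged) observations are suppressed.  Shapes and the window
# length are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
ndvi = rng.uniform(0.1, 0.8, size=(16, 64, 64))          # (day, row, col)
cloudy = rng.random(ndvi.shape) < 0.3                     # fake QA flags
ndvi_masked = np.where(cloudy, -np.inf, ndvi)             # exclude bad pixels

window = 8                                                # days per composite
composites = ndvi_masked.reshape(-1, window, 64, 64).max(axis=1)
composites[np.isinf(composites)] = np.nan                 # no valid obs at all
print(composites.shape)                                   # (2, 64, 64)
```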

  4. Development of New Modeling and Analysis Tools for Solar Sails

    NASA Technical Reports Server (NTRS)

    Lou, Michael; Fang, Houfei; Yang, Bingen

    2004-01-01

    Existing finite-element-based structural analysis codes are ineffective in treating deployable gossamer space systems, including solar sails that are formed by long space-deployable booms and extremely large thin-film membrane apertures. Recognizing this, the NASA Space Transportation Technology Program has initiated and sponsored a focused research effort to develop new and computationally efficient structural modeling and analysis tools for solar sails. The technical approach of this ongoing effort will be described. Two solution methods on which the technical approach is based, the Distributed Transfer Function Method and the Parameter-Variation-Principle method, are also discussed.

  5. Federal metering data analysis needs and existing tools

    SciTech Connect

    Henderson, Jordan W.; Fowler, Kimberly M.

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  6. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules are discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The linear time-invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  7. Adaptive deconvolution using a SAW storage correlator

    NASA Astrophysics Data System (ADS)

    Bowers, J. E.; Kino, G. S.; Behar, D.; Olaisen, H.

    1981-05-01

    A new analog adaptive filter for deconvolving distorted signals is described. The filter uses a storage correlator which implements a clipped version of the least mean squared algorithm and uses a special iterative technique to achieve fast convergence. The new filter has a potential bandwidth of 100 MHz and would eventually handle pulsed signals of 10-microsecond width. For signals with time-bandwidth products of less than 100, the adaptation time is less than 1 ms, which allows operation in real time for most applications, including resolution of radar signals in a cluttered environment, removal of echoes from television signals, deconvolution of distorted signals in nondestructive evaluation, and also in telephony. The filter is particularly suited for radar and communications, as it processes signals directly in the VHF range. Two experiments related to ghost suppression of a pulse and to the field of NDE are described in this paper. The results are in good agreement with computer simulations and show a ghost suppression of 15 dB for the first example and a sidelobe suppression of 8 dB for a transducer signal. The adaptation time is less than 450 microseconds.
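
    A software sketch of a clipped (sign-sign) LMS update of the kind the storage correlator realizes in analog hardware is shown below; the tap count, step size, and ghost model are invented for the demonstration and are not the device parameters from the paper.

```python
# Sign-sign ("clipped") LMS sketch: adapt an FIR filter so that its output
# cancels a delayed, attenuated ghost of the desired signal.  All parameters
# are illustrative; the paper implements the clipped update in a SAW device.
import numpy as np

rng = np.random.default_rng(1)
n, taps, mu = 5000, 16, 5e-3
s = rng.standard_normal(n)                 # desired (undistorted) signal
x = s + 0.5 * np.roll(s, 3)                # received signal with a ghost
w = np.zeros(taps)                         # equalizer taps

for k in range(taps, n):
    frame = x[k - taps + 1:k + 1][::-1]    # most recent samples first
    e = s[k] - w @ frame                   # error vs. the desired signal
    w += mu * np.sign(e) * np.sign(frame)  # clipped (sign-sign) update

# rough mean-squared error of the equalized signal with the final taps
print("residual error power:",
      np.mean((s[taps:] - np.convolve(x, w)[taps:n]) ** 2))
```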

  8. Deconvolution of high rate flicker electroretinograms.

    PubMed

    Alokaily, A.; Bohorquez, J.; Özdamar, Ö.

    2014-01-01

    Flicker electroretinograms are steady-state electroretinograms (ERGs) generated by high rate flash stimuli that produce overlapping periodic responses. When a flash stimulus is delivered at low rates, a transient response named flash ERG (FERG) representing the activation of neural structures within the outer retina is obtained. Although FERGs and flicker ERGs are used in the diagnosis of many retinal diseases, their waveform relationships have not been investigated in detail. This study examines this relationship by extracting transient FERGs from specially generated quasi steady-state flicker ERGs at stimulation rates above 10 Hz and from similarly generated conventional flicker ERGs. The ability to extract the transient FERG responses by deconvolving flicker responses to temporally jittered stimuli at high rates is investigated at varying rates. FERGs were obtained from seven normal subjects stimulated with LED-based displays, delivering steady-state and low-jitter quasi steady-state responses at five rates (10, 15, 32, 50, 68 Hz). The deconvolution method enabled a successful extraction of "per stimulus" unit transient ERG responses for all high stimulation rates. The deconvolved FERGs were used successfully to synthesize flicker ERGs obtained at the same high stimulation rates. PMID:25571234
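
    The extraction step can be illustrated with a generic frequency-domain deconvolution of a jittered stimulus train, assuming the recording is the circular convolution of the train with a per-stimulus transient. The synthetic waveform, rates, and regularization constant below are illustrative only and do not reproduce the authors' method in detail.

```python
# Frequency-domain deconvolution sketch: recover a per-stimulus transient
# from the steady-state response to a jittered (quasi-periodic) stimulus
# train, assuming circular convolution r = q * a.  Waveforms and rates are
# synthetic, not recorded ERGs.
import numpy as np

fs, dur = 1000, 2.0                         # sampling rate [Hz], sweep length [s]
n = int(fs * dur)
t = np.arange(n) / fs

transient = np.exp(-t / 0.02) * np.sin(2 * np.pi * 30 * t)   # fake transient
onsets = np.cumsum(np.random.default_rng(2).integers(55, 75, 30))
onsets = onsets[onsets < n]
q = np.zeros(n); q[onsets] = 1.0            # jittered stimulus train

r = np.real(np.fft.ifft(np.fft.fft(q) * np.fft.fft(transient)))   # response
r += 0.01 * np.random.default_rng(3).standard_normal(n)           # noise

Q = np.fft.fft(q)
A = np.fft.fft(r) * np.conj(Q) / (np.abs(Q) ** 2 + 1e-3)   # regularized division
a_hat = np.real(np.fft.ifft(A))[:200]                       # recovered transient
print(a_hat.shape)
```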

  9. Deconvolution techniques for improving the resolution of long-pulse lidars

    SciTech Connect

    Gurdev, L.L.; Dreischuh, T.N.; Stoyanov, D.V.

    1993-11-01

    Deconvolution techniques are developed for improving lidar resolution when the sampling intervals are shorter than the sensing laser pulse. Such approaches permit maximally resolved lidar returns in the case of arbitrarily shaped long laser pulses such as those used in CO2 lidars. The general algorithms are based on the Fourier-deconvolution technique as well as on the solution of a Volterra integral equation of the first kind. In the case of rectangular pulses a simple and convenient recurrence algorithm is proposed and analyzed in detail. The effect of stationary additive noise on algorithm performance is investigated. The theoretical analysis is supported by computer simulations demonstrating the increased resolution of the retrieved lidar profiles. 13 refs., 8 figs.
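
    One of the two general routes mentioned above, solving a first-kind Volterra equation, can be sketched as a lower-triangular linear system built from the causal pulse shape. The profile, pulse length, and noise level below are made up, and the paper's specific recurrence for rectangular pulses is not reproduced here.

```python
# Sketch of long-pulse deconvolution as a first-kind Volterra problem: the
# measured return is the true profile convolved with the (causal) pulse
# shape, so the convolution matrix is lower triangular and can be inverted
# by forward substitution.  Note that the substitution amplifies additive
# noise, which is why noise behavior matters in practice.
import numpy as np
from scipy.linalg import solve_triangular

n, pulse_len = 400, 25
pulse = np.ones(pulse_len) / pulse_len           # rectangular sensing pulse

profile = np.zeros(n)
profile[120:140] = 1.0                           # a "hard target" layer
profile[250:320] = 0.4                           # an extended layer

# Lower-triangular convolution (Volterra) matrix: row k sums the last
# pulse_len samples of the profile, weighted by the pulse shape.
A = np.zeros((n, n))
for k in range(n):
    A[k, max(0, k - pulse_len + 1):k + 1] = pulse[:min(k + 1, pulse_len)][::-1]

measured = A @ profile + 1e-3 * np.random.default_rng(4).standard_normal(n)
recovered = solve_triangular(A, measured, lower=True)
print("max abs error:", np.abs(recovered - profile).max())
```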

  10. Multi-Diffusion-Tensor Fitting via Spherical Deconvolution: A Unifying Framework

    PubMed Central

    Schultz, Thomas; Westin, Carl-Fredrik; Kindlmann, Gordon

    2016-01-01

    In analyzing diffusion magnetic resonance imaging, multi-tensor models address the limitations of the single diffusion tensor in situations of partial voluming and fiber crossings. However, selection of a suitable number of fibers and numerical difficulties in model fitting have limited their practical use. This paper addresses both problems by making spherical deconvolution part of the fitting process: We demonstrate that with an appropriate kernel, the deconvolution provides a reliable approximative fit that is efficiently refined by a subsequent descent-type optimization. Moreover, deciding on the number of fibers based on the orientation distribution function produces favorable results when compared to the traditional F-Test. Our work demonstrates the benefits of unifying previously divergent lines of work in diffusion image analysis. PMID:20879289

  11. Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs

    SciTech Connect

    Howard, M.; Luttman, A.; Fowler, M.

    2014-11-01

    In imaging applications that focus on quantitative analysis, such as X-ray radiography in the security sciences, it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
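
    A toy one-dimensional analogue of the sampling approach is sketched below: a Gaussian likelihood around a known blur, a total-variation prior, and single-site random-walk Metropolis updates, with the posterior spread used as a pointwise uncertainty. It is a sketch of the general idea only, not the paper's radiograph pipeline or its edge-enhancing prior construction.

```python
# Toy 1-D deconvolution posterior sampled with single-site random-walk
# Metropolis: Gaussian likelihood around a known blur plus a total-variation
# prior.  All sizes, noise levels, and prior weights are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n = 60
truth = np.zeros(n); truth[20:35] = 1.0
psf = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2); psf /= psf.sum()
data = np.convolve(truth, psf, mode="same") + 0.02 * rng.standard_normal(n)

sigma, lam = 0.02, 5.0
def neglogpost(x):
    resid = data - np.convolve(x, psf, mode="same")
    return 0.5 * np.sum(resid**2) / sigma**2 + lam * np.sum(np.abs(np.diff(x)))

x = np.zeros(n); cur = neglogpost(x)
samples = []
for it in range(20000):
    i = rng.integers(n)
    prop = x.copy(); prop[i] += 0.05 * rng.standard_normal()
    new = neglogpost(prop)
    if np.log(rng.random()) < cur - new:     # Metropolis accept/reject
        x, cur = prop, new
    if it % 100 == 0:
        samples.append(x.copy())

post = np.array(samples[50:])                # discard early samples as burn-in
print("mean pointwise posterior std:", post.std(axis=0).mean())
```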

  12. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    PubMed

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  14. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
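
    A minimal sketch of the Monte Carlo sparing logic described above is given below: exponential failure draws per component over the mission, with the mission scored as failed when repair demands exceed the spares carried. Failure rates, spare counts, and the success criterion are invented for illustration and are not EMAT data.

```python
# Minimal Monte Carlo sparing analysis: draw exponential failure times for
# each component type over the mission, count repair demands, and score the
# mission as failed when demands exceed the spares carried.
import numpy as np

rng = np.random.default_rng(6)
mission_h = 8760.0                               # one-year mission
mtbf_h   = np.array([4000.0, 12000.0, 2500.0])   # per component type [h]
spares   = np.array([3, 1, 4])                   # spares carried per type
trials = 20000

success = 0
for _ in range(trials):
    ok = True
    for mtbf, n_spares in zip(mtbf_h, spares):
        t, failures = 0.0, 0
        while True:                              # renewal process of failures
            t += rng.exponential(mtbf)
            if t > mission_h:
                break
            failures += 1
        if failures > n_spares:                  # ran out of spares
            ok = False
            break
    success += ok

print("estimated mission success probability:", success / trials)
```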

  15. Protocol analysis as a tool for behavior analysis

    PubMed Central

    Austin, John; Delaney, Peter F.

    1998-01-01

    The study of thinking is made difficult by the fact that many of the relevant stimuli and responses are not apparent. Although the use of verbal reports has a long history in psychology, it is only recently that Ericsson and Simon's (1993) book on verbal reports explicated the conditions under which such reports may be reliable and valid. We review some studies in behavior analysis and cognitive psychology that have used talk-aloud reporting. We review particular methods for collecting reliable and valid verbal reports using the talk-aloud method as well as discuss alternatives to the talk-aloud procedure that are effective under different task conditions, such as the use of reports after completion of very rapid task performances. We specifically caution against the practice of asking subjects to reflect on the causes of their own behavior and the less frequently discussed problems associated with providing inappropriate social stimulation to participants during experimental sessions. PMID:22477126

  16. Impact of beam deconvolution on noise properties in CMB measurements: Application to Planck LFI

    NASA Astrophysics Data System (ADS)

    Keihänen, E.; Kiiveri, K.; Lindholm, V.; Reinecke, M.; Suur-Uski, A.-S.

    2016-03-01

    We present an analysis of the effects of beam deconvolution on noise properties in CMB measurements. The analysis is built around the artDeco beam deconvolver code. We derive a low-resolution noise covariance matrix that describes the residual noise in deconvolution products, both in harmonic and pixel space. The matrix models the residual correlated noise that remains in time-ordered data after destriping, and the effect of deconvolution on this noise. To validate the results, we generate noise simulations that mimic the data from the Planck LFI instrument. A χ2 test for the full 70 GHz covariance in multipole range ℓ = 0 - 50 yields a mean reduced χ2 of 1.0037. We compare two destriping options, full and independent destriping, when deconvolving subsets of available data. Full destriping leaves substantially less residual noise, but leaves data sets intercorrelated. We also derive a white noise covariance matrix that provides an approximation of the full noise at high multipoles, and study the properties of high-resolution noise in pixel space through simulations.

  17. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  18. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and population statistics services to be registered with the GEOSS registry and made findable through the GEOSS Clearinghouse. The fusion of data sources and different web service interfaces illustrates the agility in using standard interfaces and helps define the type of input and output interfaces needed to connect models and analysis tools within sensor webs.

  19. Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique

    ERIC Educational Resources Information Center

    Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

    2005-01-01

    A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…
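
    The underlying analysis can be sketched as follows: a measured indicator spectrum is decomposed by least squares into the pure acid-form and base-form spectra, and the fitted fractions together with the known pH give pKa through the Henderson-Hasselbalch relation. The spectra below are synthetic Gaussians, not data from the experiment.

```python
# Least-squares "deconvolution" of an indicator spectrum into its acid and
# base forms, followed by a pKa estimate from the Henderson-Hasselbalch
# relation.  All spectra and conditions are synthetic.
import numpy as np

wl = np.linspace(400, 700, 301)                       # wavelength grid [nm]
gauss = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)
acid_form = 0.9 * gauss(440, 30)                      # pure HIn spectrum
base_form = 1.1 * gauss(590, 35)                      # pure In- spectrum

true_frac_base, pH = 0.65, 5.10                       # pretend conditions
mixture = (1 - true_frac_base) * acid_form + true_frac_base * base_form
mixture += 0.005 * np.random.default_rng(7).standard_normal(wl.size)

B = np.column_stack([acid_form, base_form])
(c_acid, c_base), *_ = np.linalg.lstsq(B, mixture, rcond=None)
frac_base = c_base / (c_acid + c_base)

pKa = pH - np.log10(frac_base / (1 - frac_base))      # Henderson-Hasselbalch
print(f"recovered base fraction {frac_base:.3f}, pKa estimate {pKa:.2f}")
```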

  1. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  2. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open-source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages code samples shared by the World Wind community, and COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into multi-layered, multi-temporal spatial context.

  3. TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS

    SciTech Connect

    Da Rio, Nicola; Robberto, Massimo

    2012-12-01

    We present the Tool for Astrophysical Data Analysis (TA-DA), a new software package aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

  4. PERISCOPE: An Online-Based Distributed Performance Analysis Tool

    NASA Astrophysics Data System (ADS)

    Benedict, Shajulin; Petkov, Ventsislav; Gerndt, Michael

    This paper presents PERISCOPE - an online distributed performance analysis tool that searches for a wide range of performance bottlenecks in parallel applications. It consists of a set of agents that capture and analyze application and hardware-related properties in an autonomous fashion. The paper focuses on the Periscope design, the different search methodologies, and the steps involved in doing an online performance analysis. A new graphical user-friendly interface based on Eclipse is introduced. Through the use of this new easy-to-use graphical interface, remote execution, selection of the type of analysis, and the inspection of the found properties can be performed in an intuitive and easy way. In addition, a real-world application, namely the GENE code, a grand challenge problem of plasma physics, is analyzed using Periscope. The results are illustrated in terms of found properties and scalability issues.

  5. Deconvolution for three-dimensional acoustic source identification based on spherical harmonics beamforming

    NASA Astrophysics Data System (ADS)

    Chu, Zhigang; Yang, Yang; He, Yansong

    2015-05-01

    Spherical Harmonics Beamforming (SHB) with solid spherical arrays has become a particularly attractive tool for doing acoustic sources identification in cabin environments. However, it presents some intrinsic limitations, specifically poor spatial resolution and severe sidelobe contaminations. This paper focuses on overcoming these limitations effectively by deconvolution. First and foremost, a new formulation is proposed, which expresses SHB's output as a convolution of the true source strength distribution and the point spread function (PSF) defined as SHB's response to a unit-strength point source. Additionally, the typical deconvolution methods initially suggested for planar arrays, deconvolution approach for the mapping of acoustic sources (DAMAS), nonnegative least-squares (NNLS), Richardson-Lucy (RL) and CLEAN, are adapted to SHB successfully, which are capable of giving rise to highly resolved and deblurred maps. Finally, the merits of the deconvolution methods are validated and the relationships of source strength and pressure contribution reconstructed by the deconvolution methods vs. focus distance are explored both with computer simulations and experimentally. Several interesting results have emerged from this study: (1) compared with SHB, DAMAS, NNLS, RL and CLEAN all can not only improve the spatial resolution dramatically but also reduce or even eliminate the sidelobes effectively, allowing clear and unambiguous identification of single source or incoherent sources. (2) The availability of RL for coherent sources is highest, then DAMAS and NNLS, and that of CLEAN is lowest due to its failure in suppressing sidelobes. (3) Whether or not the real distance from the source to the array center equals the assumed one that is referred to as focus distance, the previous two results hold. (4) The true source strength can be recovered by dividing the reconstructed one by a coefficient that is the square of the focus distance divided by the real distance from the source to the array center. (5) The reconstructed pressure contribution is almost not affected by the focus distance, always approximating to the true one. This study will be of great significance to the accurate localization and quantification of acoustic sources in cabin environments.
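
    The deconvolution step common to DAMAS and NNLS can be illustrated in one dimension: the beamforming map is modeled as the true source distribution multiplied by a PSF matrix, and a nonnegative least-squares solve sharpens it. The PSF below is a generic broad kernel, not an actual spherical-array PSF, and the grid is a line rather than a focus surface.

```python
# 1-D illustration of map deconvolution: build a PSF matrix, form a blurred
# "beamforming map" of two point sources, and deblur it with nonnegative
# least squares (the core step shared by DAMAS/NNLS-style methods).
import numpy as np
from scipy.optimize import nnls

grid = 80
truth = np.zeros(grid); truth[25] = 1.0; truth[52] = 0.6   # two point sources
x = np.arange(grid)
psf = lambda d: np.sinc(0.15 * d) ** 2                      # broad main lobe
A = psf(x[:, None] - x[None, :])                            # PSF matrix

beammap = A @ truth + 0.01 * np.random.default_rng(8).standard_normal(grid)
deconvolved, _ = nnls(A, beammap)

print("strongest grid points:", np.argsort(deconvolved)[-2:])
```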

  6. Informed constrained spherical deconvolution (iCSD).

    PubMed

    Roine, Timo; Jeurissen, Ben; Perrone, Daniele; Aelterman, Jan; Philips, Wilfried; Leemans, Alexander; Sijbers, Jan

    2015-08-01

    Diffusion-weighted (DW) magnetic resonance imaging (MRI) is a noninvasive imaging method, which can be used to investigate neural tracts in the white matter (WM) of the brain. However, the voxel sizes used in DW-MRI are relatively large, making DW-MRI prone to significant partial volume effects (PVE). These PVEs can be caused both by complex (e.g. crossing) WM fiber configurations and non-WM tissue, such as gray matter (GM) and cerebrospinal fluid. High angular resolution diffusion imaging methods have been developed to correctly characterize complex WM fiber configurations, but significant non-WM PVEs are also present in a large proportion of WM voxels. In constrained spherical deconvolution (CSD), the full fiber orientation distribution function (fODF) is deconvolved from clinically feasible DW data using a response function (RF) representing the signal of a single coherently oriented population of fibers. Non-WM PVEs cause a loss of precision in the detected fiber orientations and an emergence of false peaks in CSD, more prominently in voxels with GM PVEs. We propose a method, informed CSD (iCSD), to improve the estimation of fODFs under non-WM PVEs by modifying the RF to account for non-WM PVEs locally. In practice, the RF is modified based on tissue fractions estimated from high-resolution anatomical data. Results from simulation and in-vivo bootstrapping experiments demonstrate a significant improvement in the precision of the identified fiber orientations and in the number of false peaks detected under GM PVEs. Probabilistic whole brain tractography shows fiber density is increased in the major WM tracts and decreased in subcortical GM regions. The iCSD method significantly improves the fiber orientation estimation at the WM-GM interface, which is especially important in connectomics, where the connectivity between GM regions is analyzed. PMID:25660002

  7. The Smooth Decomposition as a nonlinear modal analysis tool

    NASA Astrophysics Data System (ADS)

    Bellizzi, Sergio; Sampaio, Rubens

    2015-12-01

    The Smooth Decomposition (SD) is a statistical analysis technique for finding structures in an ensemble of spatially distributed data such that the vector directions not only keep the maximum possible variance but also the motions, along the vector directions, are as smooth in time as possible. In this paper, the notion of the dual smooth modes is introduced and used in the framework of oblique projection to expand a random response of a system. The dual modes define a tool that transforms the SD into an efficient modal analysis tool. The main properties of the SD are discussed and some new optimality properties of the expansion are deduced. The parameters of the SD give access to modal parameters of a linear system (mode shapes, resonance frequencies and modal energy participations). In the case of nonlinear systems, a richer picture of the evolution of the modes versus energy can be obtained by analyzing the responses under several excitation levels. This novel analysis of a nonlinear system is illustrated by an example.
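
    A minimal sketch of a closely related formulation, smooth orthogonal decomposition, is given below: smooth modes are taken from a generalized eigenproblem between the covariance of the response and the covariance of its time derivative, so each mode balances large variance against slow time evolution. The paper's full construction (dual modes, oblique projection) is richer than this, and the 3-DOF data are synthetic.

```python
# Smooth-orthogonal-decomposition-style sketch: generalized eigenproblem
# between the response covariance and the covariance of its time derivative.
# Mode shapes, frequencies, and noise are synthetic.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(9)
fs, n_t = 200.0, 4000
t = np.arange(n_t) / fs

# Synthetic 3-DOF response: two harmonics mixed spatially, plus noise.
shapes = np.array([[1.0, 0.8, 0.3], [0.4, -0.6, 1.0]])      # (2 x 3)
q = np.vstack([np.sin(2 * np.pi * 3 * t), np.sin(2 * np.pi * 11 * t)])
X = q.T @ shapes + 0.05 * rng.standard_normal((n_t, 3))      # (time x dof)

Xd = np.gradient(X, 1.0 / fs, axis=0)                        # time derivative
Sxx = np.cov(X, rowvar=False)
Sdd = np.cov(Xd, rowvar=False)

lam, vecs = eigh(Sxx, Sdd)          # solves Sxx v = lam Sdd v
order = np.argsort(lam)[::-1]       # large lam <-> smooth, high-variance mode
print("smooth values:", lam[order])
print("leading smooth mode:", vecs[:, order[0]])
```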

  8. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction from the plant under the constraints imposed by the other parameters. The analysis results so obtained give a clear idea for deciding various parameter values before implementation of the actual plant in the field. They also give an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  9. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  10. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Sub-millimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  11. Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.

    PubMed

    Villeneuve, Emma; Carfantan, Hervé

    2014-10-01

    Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution, due to the atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line-of-sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimating the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Markov chain Monte Carlo (MCMC) algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) or not (estimation). The performances of the methods are compared with classical ones, on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172
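
    The per-pixel estimation problem can be sketched with an ordinary least-squares fit of one Gaussian emission line, converting the fitted centre to a line-of-sight velocity. This stands in for, and is much cruder than, the Bayesian MCMC treatment of the paper; the line, rest wavelength, and noise level are illustrative.

```python
# Fit amplitude, centre and width of one emission line in a single spaxel,
# then convert the centre shift to velocity and the width to a dispersion.
import numpy as np
from scipy.optimize import curve_fit

c_kms, lam0 = 299792.458, 656.28            # speed of light, H-alpha rest [nm]
lam = np.linspace(654.0, 659.0, 200)

def line(lam, flux, centre, sigma):
    return flux * np.exp(-0.5 * ((lam - centre) / sigma) ** 2)

rng = np.random.default_rng(10)
spectrum = line(lam, 1.0, 656.72, 0.12) + 0.05 * rng.standard_normal(lam.size)

popt, pcov = curve_fit(line, lam, spectrum, p0=[0.5, 656.3, 0.2])
flux, centre, sigma = popt
velocity = c_kms * (centre - lam0) / lam0            # Doppler shift
dispersion = c_kms * abs(sigma) / lam0
print(f"v = {velocity:.1f} km/s, sigma_v = {dispersion:.1f} km/s")
```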

  12. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  13. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, short-comings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined, one which uses multi-block patched grids, the other uses overset chimera grids. Turbulence and transition modeling will be discussed.

  14. SNooPy: Type Ia supernovae analysis tools

    NASA Astrophysics Data System (ADS)

    Burns, Christopher R.; Stritzinger, Maximilian; Phillips, M. M.; Kattner, ShiAnne; Persson, S. E.; Madore, Barry F.; Freedman, Wendy L.; Boldt, Luis; Campillay, Abdo; Contreras, Carlos; Folatelli, Gaston; Gonzalez, Sergio; Krzeminski, Wojtek; Morrell, Nidia; Salgado, Francisco; Suntzeff, Nicholas B.

    2015-05-01

    The SNooPy package (also known as SNpy), written in Python, contains tools for the analysis of Type Ia supernovae. It offers interactive plotting of light-curve data and models (and spectra), computation of reddening laws and K-corrections, LM non-linear least-squares fitting of light-curve data, and various types of spline fitting, including Dierckx and tension splines. The package also includes a SNIa lightcurve template generator in the CSP passbands, estimates of Milky Way extinction, and a module for dealing with filters and spectra.

  15. BEDTools: the Swiss-army tool for genome feature analysis

    PubMed Central

    Quinlan, Aaron R.

    2014-01-01

    Technological advances have enabled the use of DNA sequencing as a flexible tool to characterize genetic variation and to measure the activity of diverse cellular phenomena such as gene isoform expression and transcription factor binding. Extracting biological insight from the experiments enabled by these advances demands the analysis of large, multi-dimensional datasets. This unit describes the use of the BEDTools toolkit for the exploration of high-throughput genomics datasets. I present several protocols for common genomic analyses and demonstrate how simple BEDTools operations may be combined to create bespoke pipelines addressing complex questions. PMID:25199790

  16. BEDTools: The Swiss-Army Tool for Genome Feature Analysis.

    PubMed

    Quinlan, Aaron R

    2014-01-01

    Technological advances have enabled the use of DNA sequencing as a flexible tool to characterize genetic variation and to measure the activity of diverse cellular phenomena such as gene isoform expression and transcription factor binding. Extracting biological insight from the experiments enabled by these advances demands the analysis of large, multi-dimensional datasets. This unit describes the use of the BEDTools toolkit for the exploration of high-throughput genomics datasets. Several protocols are presented for common genomic analyses, demonstrating how simple BEDTools operations may be combined to create bespoke pipelines addressing complex questions. PMID:25199790

  17. Integrated network analysis and effective tools in plant systems biology

    PubMed Central

    Fukushima, Atsushi; Kanaya, Shigehiko; Nishida, Kozo

    2014-01-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolisms. PMID:25408696

  18. Java Analysis Tools for Element Production Calculations in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Lingerfelt, E.; Hix, W.; Guidry, M.; Smith, M.

    2002-12-01

    We are developing a set of extendable, cross-platform tools and interfaces using Java and vector graphic technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as stand-alone applications or distributed across a network. These tools, which have broad applications in general scientific visualization, are currently being used to explore and analyze a large library of nuclear reaction rates and visualize results of explosive nucleosynthesis calculations with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to easily include new rates and compare and adjust current ones. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic (ten-fold) reduction in visualization file sizes while maintaining high visual quality and interactive control. ORNL Physics Division is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  19. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    SciTech Connect

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
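
    To make the coupling between use profile and degradation concrete, the sketch below uses a generic empirical capacity-fade model with a square-root-of-time calendar term, an Arrhenius temperature factor, and a throughput-driven cycling term. The functional form and every coefficient are assumptions for illustration, not NREL's fitted degradation model.

```python
# Illustrative capacity-fade model of the kind such a tool couples to use
# profiles: calendar fade grows with sqrt(time) and an Arrhenius temperature
# factor, cycling fade with energy throughput.  All coefficients are invented.
import numpy as np

def capacity_fade(years, temp_C, cycles_per_day, dod):
    T = temp_C + 273.15
    arrhenius = np.exp(-6000.0 * (1.0 / T - 1.0 / 298.15))   # assumed Ea/R [K]
    calendar = 0.03 * arrhenius * np.sqrt(years)             # fraction lost
    cycling = 2e-5 * dod * cycles_per_day * 365.0 * years    # fraction lost
    return 1.0 - (calendar + cycling)                        # remaining capacity

# Compare a cool and a hot climate for the same 10-year, one-cycle-per-day use.
for climate, temp in [("cool", 15.0), ("hot", 35.0)]:
    print(climate, f"{capacity_fade(10.0, temp, 1.0, 0.8):.3f}")
```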

  20. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    NASA Technical Reports Server (NTRS)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  1. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput shotgun proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user's and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  2. Using conjoint analysis as a program evaluation tool

    SciTech Connect

    Moe, R.; Dion, S.

    1994-11-01

    While conjoint analysis has typically been applied in utility market research to estimate penetration of utility programs, help identify optimal program design features, and measure the trade-offs customers make when evaluating utility program options, it has seldom been used as a program evaluation tool. This paper discusses the use of conjoint analysis to estimate free ridership rates in evaluations of two utility DSM programs: a residential high efficiency heat pump/central air conditioning program and a low-flow showerhead program. The two studies incorporated different approaches for data collection and data analysis. The first study used a phone-mail approach to collect the data, a ranking method for scoring, and a SAS program for analysis; the second study collected data through in-person interviews, used a rating method for scoring, and Bretton Clark software for data analysis. The paper describes the design and results of both conjoint studies and how the standard conjoint analysis outputs were used to estimate the free rider rate for each program and, for one program, predict the penetration of high-efficiency models under alternative rebate structures. It also presents the results of both analyses and compares the estimated free rider rates to those derived for the same programs using other methods.

  3. Tools for integrated sequence-structure analysis with UCSF Chimera

    PubMed Central

    Meng, Elaine C; Pettersen, Eric F; Couch, Gregory S; Huang, Conrad C; Ferrin, Thomas E

    2006-01-01

    Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is available for Microsoft Windows, Apple Mac OS X, Linux, and other platforms from . PMID:16836757

  4. Improving space debris detection in GEO ring using image deconvolution

    NASA Astrophysics Data System (ADS)

    Núñez, Jorge; Núñez, Anna; Montojo, Francisco Javier; Condominas, Marta

    2015-07-01

    In this paper we present a method based on image deconvolution to improve the detection of space debris, mainly in the geostationary ring. Among the deconvolution methods we chose the iterative Richardson-Lucy (R-L), as the method that achieves good results with a reasonable amount of computation. For this work, we used two sets of real 4096 × 4096 pixel test images obtained with the Telescope Fabra-ROA at Montsec (TFRM). Using the first set of data, we establish the optimal number of iterations at 7, and applying the R-L method with 7 iterations to the images, we show that the astrometric accuracy does not vary significantly while the limiting magnitude of the deconvolved images increases significantly compared to the original ones. The increase is on average about 1.0 magnitude, which means that objects up to 2.5 times fainter can be detected after deconvolution. The application of the method to the second set of test images, which includes several faint objects, shows that, after deconvolution, up to four previously undetected faint objects are detected in a single frame. Finally, we carried out a study of some economic aspects of applying the deconvolution method, showing that an important economic impact can be envisaged.
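
    To make the iterative scheme concrete, the sketch below implements the standard Richardson-Lucy multiplicative update with a fixed number of iterations. It is a minimal NumPy/SciPy illustration assuming a known, normalised PSF, not the TFRM processing pipeline itself.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=7):
    """Standard Richardson-Lucy deconvolution; psf is assumed normalised to sum to 1."""
    estimate = np.full(image.shape, image.mean(), dtype=float)   # flat starting estimate
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)               # guard against division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")  # multiplicative update
    return estimate
```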

  5. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

  6. The Lagrangian analysis tool LAGRANTO - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-02-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions; e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.

  7. The LAGRANTO Lagrangian analysis tool - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10^-6 K m^2 kg^-1 s^-1). Full versions of this new version of LAGRANTO are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.
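
    The criterion strings are simple enough to interpret mechanically. The toy Python sketch below parses a "GT:PV:2"-style expression and applies it to per-trajectory arrays; it is an illustration of the idea only, not LAGRANTO's own implementation, and the field names and data layout are assumptions.

```python
import numpy as np

def select_trajectories(fields, criterion):
    """fields maps a variable name to an array of shape (n_trajectories, n_times)."""
    op, name, threshold = criterion.split(":")
    data, threshold = fields[name], float(threshold)
    mask = {"GT": data > threshold, "LT": data < threshold}[op]
    return np.where(mask.any(axis=1))[0]      # trajectories meeting the criterion at any time

# hypothetical usage: keep trajectories whose PV exceeds 2 PVU at some point
pv = np.random.gamma(shape=1.0, scale=1.2, size=(500, 48))
selected = select_trajectories({"PV": pv}, "GT:PV:2")
```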

  8. PyRAT (python radiography analysis tool): overview

    SciTech Connect

    Armstrong, Jerawan C; Temple, Brian A; Buescher, Kevin L

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization based inversion approach with goal of identifying unknown object configurations - MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define feasible region and discrete neighbor for the MVO problem - initial data analysis + material library → a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.

  9. HPCTOOLKIT: tools for performance analysis of optimized parallel programs

    SciTech Connect

    Adhianto, Laksono; Marin, Gabriel; Mellor-Crummey, John M; Tallent, Joseph F

    2009-01-01

    HPCTOOLKIT is an integrated suite of tools that supports measurement, analysis, attribution, and presentation of application performance for both sequential and parallel programs. HPCTOOLKIT can pinpoint and quantify scalability bottlenecks in fully optimized parallel programs with a measurement overhead of only a few percent. Recently, new capabilities were added to HPCTOOLKIT for collecting call path profiles for fully optimized codes without any compiler support, pinpointing and quantifying bottlenecks in multithreaded programs, exploring performance information and source code using a new user interface, and displaying hierarchical space-time diagrams based on traces of asynchronous call path samples. This paper provides an overview of HPCTOOLKIT and illustrates its utility for performance analysis of parallel applications.

  10. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
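
    The core measurement is the radial intensity plot. The sketch below shows one way such a plot can be built by sampling the image along concentric circles; the brightest position at each radius traces the arms, and its drift with radius indicates spirality. This is an illustrative reimplementation under simplifying assumptions, not Ganalyzer's own code.

```python
import numpy as np

def radial_intensity_plot(image, center, n_radii=40, n_theta=360):
    """Sample the galaxy image along concentric circles around its (integer) centre."""
    cy, cx = center
    max_r = min(cy, cx, image.shape[0] - 1 - cy, image.shape[1] - 1 - cx)
    radii = np.linspace(2, max_r, n_radii)
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    plot = np.empty((n_radii, n_theta))
    for i, r in enumerate(radii):
        ys = np.round(cy + r * np.sin(theta)).astype(int)
        xs = np.round(cx + r * np.cos(theta)).astype(int)
        plot[i] = image[ys, xs]
    # azimuth of the brightest pixel at each radius; its slope vs. radius (after
    # unwrapping) indicates how tightly the spiral arms wind
    arm_position = theta[np.argmax(plot, axis=1)]
    return radii, arm_position, plot
```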

  11. CRITICA: coding region identification tool invoking comparative analysis

    NASA Technical Reports Server (NTRS)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).

  12. Tutorial on platform for optical topography analysis tools.

    PubMed

    Sutoko, Stephanie; Sato, Hiroki; Maki, Atsushi; Kiguchi, Masashi; Hirabayashi, Yukiko; Atsumori, Hirokazu; Obata, Akiko; Funane, Tsukasa; Katura, Takusige

    2016-01-01

    Optical topography/functional near-infrared spectroscopy (OT/fNIRS) is a functional imaging technique that noninvasively measures cerebral hemoglobin concentration changes caused by neural activities. The fNIRS method has been extensively implemented to understand the brain activity in many applications, such as neurodisorder diagnosis and treatment, cognitive psychology, and psychiatric status evaluation. To assist users in analyzing fNIRS data with various application purposes, we developed a software package called platform for optical topography analysis tools (POTATo). We explain how to handle and analyze fNIRS data in the POTATo package and systematically describe domain preparation, temporal preprocessing, functional signal extraction, statistical analysis, and data/result visualization for a practical example of working memory tasks. This example is expected to give clear insight into analyzing data using POTATo. The results specifically show activation of the dorsolateral prefrontal cortex, consistent with previous studies. This emphasizes analysis robustness, which is required for validating decent preprocessing and functional signal interpretation. POTATo also provides a self-developed plug-in feature allowing users to create their own functions and incorporate them with established POTATo functions. With this feature, we continuously encourage users to improve fNIRS analysis methods. We also address the complications and resolving opportunities in signal analysis. PMID:26788547

  13. ELECTRA Launch and Re-Entry Safety Analysis Tool

    NASA Astrophysics Data System (ADS)

    Lazare, B.; Arnal, M. H.; Aussilhou, C.; Blazquez, A.; Chemama, F.

    2010-09-01

    The French Space Operations Act gives the National Technical Regulations the prime objective of protecting people, property, public health and the environment. In this frame, an independent technical assessment of French space operations is delegated to CNES. To perform this task, and also for its own operations, CNES needs efficient state-of-the-art tools for evaluating risks. The development of the ELECTRA tool, undertaken in 2007, meets the requirement for precise quantification of the risks involved in the launch and re-entry of spacecraft. The ELECTRA project draws on the proven expertise of CNES technical centers in the fields of flight analysis and safety, spaceflight dynamics and spacecraft design. The ELECTRA tool was specifically designed to evaluate the risks involved in the re-entry and return to Earth of all or part of a spacecraft. It will also be used for locating and visualizing nominal or accidental re-entry zones and comparing them with relevant geographic data such as population density, urban areas, and shipping lanes, among others. The method chosen for ELECTRA consists of two main steps: calculating the possible re-entry trajectories for each fragment after the spacecraft breaks up, and calculating the risks while taking into account the energy of the fragments, the population density and the protection afforded by buildings. For launch operations and active re-entry, the risk calculation is weighted by the probability of instantaneous failure of the spacecraft and integrated over the whole trajectory. ELECTRA's development is today at the end of the validation phase, the last step before delivery to users. The validation process has been performed in several ways: numerical verification of the risk formulation; benchmarking of the casualty area, the energy of the fragments at entry, and the level of protection provided by housing; adoption of space-transportation-industry best practices for dependability evaluation; and benchmarking of the world population distribution, leading to the choice of the widely used GPW V3 model. Validation was completed by numerous system tests, most of them by comparison with existing tools used operationally, for example, at the European spaceport in French Guiana. The purpose of this article is to review the methods and models chosen by CNES for describing the physical phenomena and the results of the validation process, including comparisons with other risk assessment tools.
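
    As a rough illustration of the second step, the expected number of casualties for a given failure epoch can be written as a sum over fragments of casualty area times local population density, reduced by sheltering and weighted by the failure probability. The sketch below is a toy formulation with assumed field names, not the ELECTRA data model.

```python
def casualty_expectation(fragments, p_failure):
    """fragments: iterable of dicts with assumed keys 'casualty_area_m2',
    'population_density_per_m2' and 'sheltering_factor' (0 = no protection)."""
    exposed = sum(f["casualty_area_m2"]
                  * f["population_density_per_m2"]
                  * (1.0 - f["sheltering_factor"])
                  for f in fragments)
    return p_failure * exposed
```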

  14. Deconvolution of spectral line profiles: solution of the inversion problem

    NASA Astrophysics Data System (ADS)

    Brablec, A.; Trunec, D.; Šťastný, F.

    1999-08-01

    We present a method for the deconvolution of spectral line profiles - the subtraction of the apparatus function. This inversion problem requires solution of the Fredholm integral equation of the first kind. For this purpose we suggest the use of B-splines. The efficiency of the approach is demonstrated using test examples as well as the deconvolution of the Hβ spectral line profile, measured by means of the Fabry-Perot interferometer. The Hβ line was emitted by a low temperature plasma in RF discharge burning in water vapour at reduced pressure. This deconvolution method is compared with the standard least squares method, where the initial profile is described by the Voigt function.

  15. The direct deconvolution of X-ray spectra

    NASA Technical Reports Server (NTRS)

    Kahn, S. M.; Blissett, R. J.

    1980-01-01

    The method of deconvolution of proportional counter X-ray spectra as outlined by Blissett and Cruise is reviewed with particular emphasis on low-energy spectra. This method involves the expansion of the incident spectrum in terms of a set of orthonormal singular functions which propagate independently through the detector matrix. When applied to low-energy detectors which typically exhibit absorption features in their efficiency curves, the initial Blissett and Cruise prescription is shown to be unstable. Alternative methods for handling the efficiency are described and evaluated. In the deconvolution of steep spectra, additional distortions are shown to arise as a result of sidelobe oscillations in the effective response functions. A selective weighting procedure is thus introduced to suppress spurious features of this type. Finally, the role of filtering in the deconvolution procedure is discussed and several suitable forms for the filter are suggested.
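
    In modern linear-algebra terms, an expansion in orthonormal singular functions with filtering amounts to a truncated (or weighted) SVD inversion of the detector response matrix. The sketch below is a generic illustration of that idea, not the Blissett and Cruise prescription itself.

```python
import numpy as np

def svd_deconvolve(response, counts, n_keep=12):
    """Invert counts = response @ spectrum by expanding in singular functions and
    keeping only the first n_keep modes; truncation acts as the filter that
    suppresses noise-dominated, oscillatory components."""
    U, s, Vt = np.linalg.svd(response, full_matrices=False)
    coeffs = U.T @ counts              # projection of the data onto each singular function
    inv = np.zeros_like(s)
    inv[:n_keep] = coeffs[:n_keep] / s[:n_keep]
    return Vt.T @ inv                  # reconstructed incident spectrum
```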

  16. Accuracy of peak deconvolution algorithms within chromatographic integrators

    SciTech Connect

    Papas, A.N.; Tougas, T.P.

    1990-02-01

    The soundness of present-day algorithms to deconvolve overlapping skewed peaks was investigated. From simulated studies based on the exponentially modified Gaussian model (EMG), chromatographic peak area inaccuracies for unresolved peaks are presented for the two deconvolution methods, the tangent skim and the perpendicular drop method. These inherent inaccuracies, in many cases exceeding 50%, are much greater than those calculated from ideal Gaussian profiles. Multiple linear regression (MLR) was used to build models that predict the relative error for either peak deconvolution method. MLR also provided a means for determining influential independent variables, defining the required chromatographic relationships needed for prediction. Once forecasted errors for both methods are calculated, selection of either peak deconvolution method can be made by minimum errors. These selection boundaries are contrasted to method selection criteria of present data systems algorithms.
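
    The kind of simulation behind these error estimates is easy to reproduce in outline. The sketch below generates two overlapping exponentially modified Gaussian peaks (with hypothetical parameters), integrates the merged trace using the perpendicular-drop rule, and compares the result against the true component areas; it illustrates the inherent bias rather than reproducing the paper's regression models.

```python
import numpy as np
from scipy.stats import exponnorm    # exponentially modified Gaussian (EMG)

t = np.linspace(0.0, 12.0, 6001)
dt = t[1] - t[0]
peak1 = exponnorm.pdf(t, K=2.0, loc=4.0, scale=0.25)          # larger, earlier peak
peak2 = 0.3 * exponnorm.pdf(t, K=2.0, loc=5.0, scale=0.25)    # smaller, overlapping peak
trace = peak1 + peak2

# perpendicular drop: split the merged trace at the valley between the two apexes
apex1, apex2 = int(np.argmax(peak1)), int(np.argmax(peak2))   # apexes from the known components
valley = apex1 + int(np.argmin(trace[apex1:apex2]))
area1, area2 = trace[:valley].sum() * dt, trace[valley:].sum() * dt
true1, true2 = peak1.sum() * dt, peak2.sum() * dt
print(f"peak 2 area error under perpendicular drop: {100 * (area2 - true2) / true2:+.1f}%")
```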

  17. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

  18. An online database for plant image analysis software tools

    PubMed Central

    2013-01-01

    Background Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is best suited for their research. Results We present an online, manually curated, database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software package in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. Conclusions The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work. PMID:24107223

  19. Validating whole slide digital morphometric analysis as a microscopy tool.

    PubMed

    Diller, Robert B; Kellar, Robert S

    2015-02-01

    Whole slide imaging (WSI) can be used to quantify multiple responses within tissue sections during histological analysis. Feature Analysis on Consecutive Tissue Sections (FACTS®) allows the investigator to perform digital morphometric analysis (DMA) within specified regions of interest (ROI) across multiple serial sections at faster rates when compared with manual morphometry methods. Using FACTS® in conjunction with WSI is a powerful analysis tool, which allows DMA to target specific ROI across multiple tissue sections stained for different biomarkers. DMA may serve as an appropriate alternative to classic, manual, histologic morphometric measures, which have historically relied on the selection of high-powered fields of views and manual scoring (e.g., a gold standard). In the current study, existing preserved samples were used to determine if DMA would provide similar results to manual counting methods. Rodent hearts (n=14, left ventricles) were stained with Masson's trichrome, and reacted for cluster of differentiation 68 (CD-68). This study found no statistically significant difference between a classic, manual method and the use of digital algorithms to perform similar counts (p=0.38). DMA offers researchers the ability to accurately evaluate morphological characteristics in a reproducible fashion without investigator bias and with higher throughput. PMID:25399639

  20. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will give NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers the ability to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. Field offices need standardized, scientifically sound methodology for local climate analysis (such as trend, composites, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice at the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. LCAT's focus is on the local scale (as opposed to national and global scales of CPC products). LCAT will: (1) improve the professional competency of local office staff and their expertise in providing local information to their users, improving the quality of local climate services; (2) ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, allowing improvement of CPC climate products; and (3) allow testing of local climate variables beyond temperature averages and precipitation totals, such as the climatology of tornadoes, flash floods, storminess, and extreme weather events, expanding the suite of NWS climate products. LCAT development utilizes the NWS Operations and Services Improvement Process (OSIP) to document the field and user requirements, develop solutions, and prioritize resources. OSIP is a five work-stage process separated by four gate reviews. LCAT is currently at work-stage three: Research Demonstration and Solution Analysis. Gate 1 and 2 reviews identified LCAT as a high strategic priority project with a very high operational need. The Integrated Working Team, consisting of NWS field representatives, assists in tool function design and identification of LCAT operational deployment support.

  1. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    Report describes process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

  2. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven along with its user-programmable features can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  3. Resonance frequency analysis: a new diagnostic tool for dental ankylosis.

    PubMed

    Bertl, Michael H; Weinberger, Thomas; Schwarz, Kerstin; Gruber, Reinhard; Crismani, Adriano G

    2012-06-01

    Ankylosed teeth are considered in orthodontic treatment planning; however, diagnostic tools to quantify the rigidity of the tooth-to-bone connection are rare. Resonance frequency analysis (RFA) can quantify the rigidity of the dental implant-to-bone connection and thus may serve as a potential diagnostic tool to identify ankylosed teeth. To test this assumption, we examined 15 primary mandibular molars with and 30 without clinical signs of ankylosis, using the Osstell Mentor system. A cut-off implant stability quotient (ISQ) of 43 provided a specificity of 100% and a sensitivity of 53.3% when measured in the mesio-distal direction or a sensitivity of 20% when measured in the bucco-lingual direction. Based on a receiver-operating characteristic (ROC), the area under the curve (AUC) of 0.807 showed the mesio-distal direction of measurement to be a test of moderate discriminatory power. Given its non-invasiveness, RFA may serve as a quantitative diagnostic supplement to the clinical examination of potentially ankylosed primary molars. PMID:22607343

  4. Analysis and specification tools in relation to the APSE

    NASA Technical Reports Server (NTRS)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem was translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada based or Ada related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few chosen specification and design tools, could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  5. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC) ultimately resulting in GMAT version R2013b.

  6. Improved Cell Typing by Charge-State Deconvolution of matrix-assisted laser desorption/ionization Mass Spectra

    SciTech Connect

    Wilkes, Jon G.; Buzantu, Dan A.; Dare, Diane J.; Dragan, Yvonne P.; Chiarelli, M. Paul; Holland, Ricky D.; Beaudoin, Michael; Heinze, Thomas M.; Nayak, Rajesh; Shvartsburg, Alexandre A.

    2006-05-30

    Robust, specific, and rapid identification of toxic strains of bacteria and viruses, to guide the mitigation of their adverse health effects and optimum implementation of other response actions, remains a major analytical challenge. This need has driven the development of methods for classification of microorganisms using mass spectrometry, particularly matrix-assisted laser desorption ionization MS (MALDI) that allows high throughput analyses with minimum sample preparation. We describe a novel approach to cell typing based on pattern recognition of MALDI spectra, which involves charge-state deconvolution in conjunction with a new correlation analysis procedure. The method is applicable to both prokaryotic and eukaryotic cells. Charge-state deconvolution improves the quantitative reproducibility of spectra because multiply-charged ions resulting from the same biomarker attaching a different number of protons are recognized and their abundances are combined. This allows a clearer distinction of bacterial strains or of cancerous and normal liver cells. Improved class distinction provided by charge-state deconvolution was demonstrated by cluster spacing on canonical variate score charts and by correlation analyses. Deconvolution may enhance detection of early disease state or therapy progress markers in various tissues analyzed by MALDI.
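
    The essential bookkeeping step is collapsing multiply charged ions onto their common neutral mass so that their abundances are summed. The toy Python sketch below assumes the charge state of each peak has already been assigned (for example from isotope spacing); it is an illustration of the principle, not the deconvolution procedure used in the study.

```python
from collections import defaultdict

PROTON_MASS = 1.00728   # Da

def combine_charge_states(peaks, mass_tol=0.5):
    """peaks: iterable of (mz, z, intensity) with charge state z already assigned.
    Ions mapping to the same neutral mass (within mass_tol) have their abundances
    summed, so a biomarker observed with 1, 2 or 3 protons contributes one feature."""
    combined = defaultdict(float)
    for mz, z, intensity in peaks:
        neutral_mass = z * (mz - PROTON_MASS)
        key = round(neutral_mass / mass_tol) * mass_tol
        combined[key] += intensity
    return dict(combined)

# hypothetical singly and doubly charged ions of the same ~5733 Da biomarker
print(combine_charge_states([(5734.3, 1, 120.0), (2867.7, 2, 80.0)]))
```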

  7. Decision Analysis Tool to Compare Energy Pathways for Transportation

    SciTech Connect

    Bloyd, Cary N.

    2010-06-30

    With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies are proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). Biomass is seen as an important domestic energy feedstock, and there are multiple pathways in which it can be linked to the transport sector. Contenders include the use of cellulosic ethanol from biomass to replace gasoline or the use of a biomass-fueled combined cycle electrical power generation facility in conjunction with plug-in hybrid electric vehicles (PHEVs). This paper reviews a project that is developing a scenario decision analysis tool to assist policy makers, program managers, and others to obtain a better understanding of these uncertain possibilities and how they may interact over time.

  8. Validation of tool mark analysis of cut costal cartilage.

    PubMed

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2012-03-01

    This study was designed to establish the potential error rate associated with the generally accepted method of tool mark analysis of cut marks in costal cartilage. Three knives with different blade types were used to make experimental cut marks in costal cartilage of pigs. Each cut surface was cast, and each cast was examined by three analysts working independently. The presence of striations, regularity of striations, and presence of a primary and secondary striation pattern were recorded for each cast. The distance between each striation was measured. The results showed that striations were not consistently impressed on the cut surface by the blade's cutting edge. Also, blade type classification by the presence or absence of striations led to a 65% misclassification rate. Use of the classification tree and cross-validation methods and inclusion of the mean interstriation distance decreased the error rate to c. 50%. PMID:22081951

  9. Input Range Testing for the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all objects fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is VERY important to note that the tests below must be performed for both the Graphical User Interface and the script!! The examples are illustrated using a scripting perspective, because it is simpler to write up. However, the test must be performed for both interfaces to GMAT.

  10. Bioelectrical impedance analysis: A new tool for assessing fish condition

    USGS Publications Warehouse

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R^2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.
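
    A minimal version of the calibration step, with the benchmarks quoted above used as acceptance checks, might look like the sketch below. The single linear predictor and the variable names are assumptions for illustration, not the standardized protocol the authors propose.

```python
import numpy as np

def bia_calibration_check(bia_predictor, dry_fat_percent,
                          min_range=29.0, min_n=60, min_r2=0.8):
    """Fit %dry fat against a BIA-derived predictor and test the fit against the
    benchmarks: fat-level range >= 29 percentage points, n >= 60, R^2 >= 0.8."""
    x = np.asarray(bia_predictor, dtype=float)
    y = np.asarray(dry_fat_percent, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    r2 = 1.0 - residuals.var() / y.var()
    passes = (y.max() - y.min() >= min_range) and (y.size >= min_n) and (r2 >= min_r2)
    return slope, intercept, r2, passes
```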

  11. PARSESNP: a tool for the analysis of nucleotide polymorphisms

    PubMed Central

    Taylor, Nicholas E.; Greene, Elizabeth A.

    2003-01-01

    PARSESNP is a tool for the display and analysis of polymorphisms in genes. Using a reference DNA sequence, an exon/intron position model and a list of polymorphisms, it determines the effects of these polymorphisms on the expressed gene product, as well as the changes in restriction enzyme recognition sites. It shows the locations and effects of the polymorphisms in summary on a stylized graphic and in detail on a display of the protein sequence aligned with the DNA sequence. The addition of a homology model, in the form of an alignment of related protein sequences, allows for prediction of the severity of missense changes. PARSESNP is available on the World Wide Web at http://www.proweb.org/parsesnp/. PMID:12824424

  12. Sensitivity analysis of an information fusion tool: OWA operator

    NASA Astrophysics Data System (ADS)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

    2007-04-01

    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
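
    For readers unfamiliar with OWA, the sketch below derives order weights from a regular increasing monotone quantifier Q(r) = r**alpha (one common Fuzzy Linguistic Quantifiers choice, with alpha standing in for the optimism degree) and applies them to the ordered arguments. The scores are hypothetical, and the Minimal Variability weights are not shown.

```python
import numpy as np

def owa_weights(n, alpha):
    """Order weights w_i = Q(i/n) - Q((i-1)/n) for the RIM quantifier Q(r) = r**alpha."""
    r = np.arange(n + 1) / n
    return np.diff(r ** alpha)

def owa(values, weights):
    """Weights are applied to the arguments sorted in descending order."""
    return float(np.dot(np.sort(values)[::-1], weights))

scores = np.array([0.70, 0.40, 0.90, 0.55])        # hypothetical criteria scores
for alpha in (0.5, 1.0, 2.0):                      # optimistic, neutral, pessimistic
    w = owa_weights(scores.size, alpha)
    print(alpha, round(owa(scores, w), 3))
```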

  13. Combinatorial tools for the analysis of transcriptional regulation

    SciTech Connect

    Bergeron, A.; Gaul, E.; Bergeron, D.

    1996-12-31

    In this paper, we discuss virtual experiments for the study of major regulatory processes such as translation, signalization or transcription pathways. An essential part of these processes is the formation of protein clusters held together by a small number of binding domains that can be shared by many different proteins. Analysis of these clusters is complicated by the vast number of different arrangements of proteins that can trigger a specific reaction. We propose combinatorial tools that can help predict the effects on the rate of transcription of either changes in transcriptional factors concentration, or due to the introduction of chimeras combining domains not usually present on a protein. 15 refs., 5 figs., 3 tabs.

  14. Molecular tools for analysis of gene function in parasitic microorganisms.

    PubMed

    Meissner, Markus; Agop-Nersesian, Carolina; Sullivan, William J

    2007-07-01

    With the completion of several genome sequences for parasitic protozoa, research in molecular parasitology entered the "post-genomic" era. Accompanied by global transcriptome and proteome analysis, huge datasets have been generated that have added many novel candidates to the list of drug and vaccine targets. The challenge is now to validate these factors and to bring science back to the bench to perform a detailed characterization. In some parasites, like Trypanosoma brucei, high-throughput genetic screens have been established using RNA interference [for a detailed review, see Motyka and Englund (2004)]. In most protozoan parasites, however, more time-consuming approaches have to be employed to identify and characterize the function of promising candidates in detail. This review aims to summarize the status of molecular genetic tools available for a variety of protozoan pathogens and discuss how they can be implemented to advance our understanding of parasite biology. PMID:17401559

  15. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  16. Development of unified plotting tools for GA transport analysis

    NASA Astrophysics Data System (ADS)

    Buuck, M.; Candy, J.

    2011-10-01

    A collection of Python classes for the TGYRO suite of codes (NEO, GYRO, TGYRO, TGLF) has been developed that provides both the expert user with conceptually simple access to all code output data, and the casual end user with simple command-line control of plotting. The user base for these transport analysis codes continues to grow, raising the urgency of modernizing and unifying the plotting tools used for post-simulation analysis. Simultaneously, there is a push toward larger-scale fusion modeling underscoring the need for a revised, modernized approach to data management and analysis. The TGYRO suite is currently in use at all major fusion laboratories worldwide, and allows the user to make steady-state profile predictions for existing devices and future reactors, and simultaneously to carry out fundamental research on plasma transport (both collisional and turbulent). Work supported by US DOE under DE-FG02-95ER54309 and the National Undergraduate Fellowship in Fusion Science and Engineering.

  17. Immunoglobulin analysis tool: a novel tool for the analysis of human and mouse heavy and light chain transcripts.

    PubMed

    Rogosch, Tobias; Kerzel, Sebastian; Hoi, Kam Hon; Zhang, Zhixin; Maier, Rolf F; Ippolito, Gregory C; Zemlin, Michael

    2012-01-01

    Sequence analysis of immunoglobulin (Ig) heavy and light chain transcripts can refine categorization of B cell subpopulations and can shed light on the selective forces that act during immune responses or immune dysregulation, such as autoimmunity, allergy, and B cell malignancy. High-throughput sequencing yields Ig transcript collections of unprecedented size. The authoritative web-based IMGT/HighV-QUEST program is capable of analyzing large collections of transcripts and provides annotated output files to describe many key properties of Ig transcripts. However, additional processing of these flat files is required to create figures, or to facilitate analysis of additional features and comparisons between sequence sets. We present an easy-to-use Microsoft Excel-based software, named Immunoglobulin Analysis Tool (IgAT), for the summary, interrogation, and further processing of IMGT/HighV-QUEST output files. IgAT generates descriptive statistics and high-quality figures for collections of murine or human Ig heavy or light chain transcripts ranging from 1 to 150,000 sequences. In addition to traditionally studied properties of Ig transcripts - such as the usage of germline gene segments, or the length and composition of the CDR-3 region - IgAT also uses published algorithms to calculate the probability of antigen selection based on somatic mutational patterns, the average hydrophobicity of the antigen-binding sites, and predictable structural properties of the CDR-H3 loop according to Shirai's H3-rules. These refined analyses provide in-depth information about the selective forces acting upon Ig repertoires and allow the statistical and graphical comparison of two or more sequence sets. IgAT is easy to use on any computer running Excel 2003 or higher. Thus, IgAT is a useful tool to gain insights into the selective forces and functional properties of small to extremely large collections of Ig transcripts, thereby assisting a researcher to mine a data set to its fullest. PMID:22754554

  18. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.

  19. Receiver function deconvolution using transdimensional hierarchical Bayesian inference

    NASA Astrophysics Data System (ADS)

    Kolb, J. M.; Lekić, V.

    2014-06-01

    Teleseismic waves can convert from shear to compressional (Sp) or compressional to shear (Ps) across impedance contrasts in the subsurface. Deconvolving the parent waveforms (P for Ps or S for Sp) from the daughter waveforms (S for Ps or P for Sp) generates receiver functions which can be used to analyse velocity structure beneath the receiver. Though a variety of deconvolution techniques have been developed, they are all adversely affected by background and signal-generated noise. In order to take into account the unknown noise characteristics, we propose a method based on transdimensional hierarchical Bayesian inference in which both the noise magnitude and noise spectral character are parameters in calculating the likelihood probability distribution. We use a reversible-jump implementation of a Markov chain Monte Carlo algorithm to find an ensemble of receiver functions whose relative fits to the data have been calculated while simultaneously inferring the values of the noise parameters. Our noise parametrization is determined from pre-event noise so that it approximates observed noise characteristics. We test the algorithm on synthetic waveforms contaminated with noise generated from a covariance matrix obtained from observed noise. We show that the method retrieves easily interpretable receiver functions even in the presence of high noise levels. We also show that we can obtain useful estimates of noise amplitude and frequency content. Analysis of the ensemble solutions produced by our method can be used to quantify the uncertainties associated with individual receiver functions as well as with individual features within them, providing an objective way for deciding which features warrant geological interpretation. This method should make possible more robust inferences on subsurface structure using receiver function analysis, especially in areas of poor data coverage or under noisy station conditions.
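
    The key ingredient that distinguishes the hierarchical formulation is that the noise parameters enter the likelihood rather than being fixed in advance. The fragment below shows that idea for the simplest case of an unknown noise magnitude with uncorrelated errors; the correlated-noise covariance, the transdimensional model search and the reversible-jump moves used in the paper are omitted from this sketch.

```python
import numpy as np

def log_likelihood(observed, predicted, sigma):
    """Gaussian log-likelihood with the noise magnitude sigma treated as a sampled
    (hierarchical) parameter; a larger sigma down-weights the data misfit but pays a
    normalisation penalty, so the data themselves constrain the noise level."""
    residual = observed - predicted
    n = residual.size
    return -0.5 * n * np.log(2.0 * np.pi * sigma**2) - 0.5 * np.sum(residual**2) / sigma**2
```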

  20. The Watershed Deposition Tool: A Tool for Incorporating Atmospheric Deposition in Watershed Analysis

    EPA Science Inventory

    The tool for providing the linkage between air and water quality modeling needed for determining the Total Maximum Daily Load (TMDL) and for analyzing related nonpoint-source impacts on watersheds has been developed. The Watershed Deposition Tool (WDT) takes gridded output of at...

  1. Second generation sequencing allows for mtDNA mixture deconvolution and high resolution detection of heteroplasmy

    PubMed Central

    Holland, Mitchell M.; McQuillan, Megan R.; O’Hanlon, Katherine A.

    2011-01-01

    Aim To use parallel array pyrosequencing to deconvolute mixtures of mitochondrial DNA (mtDNA) sequence and provide high resolution analysis of mtDNA heteroplasmy. Methods The hypervariable segment 1 (HV1) of the mtDNA control region was analyzed from 30 individuals using the 454 GS Junior instrument. Mock mixtures were used to evaluate the system’s ability to deconvolute mixtures and to reliably detect heteroplasmy, including heteroplasmic differences between 5 family members of the same maternal lineage. Amplicon sequencing was performed on polymerase chain reaction (PCR) products generated with primers that included multiplex identifiers (MID) and adaptors for pyrosequencing. Data analysis was performed using NextGENe® software. The analysis of an autosomal short tandem repeat (STR) locus (D18S51) and a Y-STR locus (DYS389 I/II) was performed simultaneously with a portion of HV1 to illustrate that multiplexing can encompass different markers of forensic interest. Results Mixtures, including heteroplasmic variants, can be detected routinely down to a component ratio of 1:250 (20 minor variant copies with a coverage rate of 5000 sequences) and can be readily detected down to 1:1000 (0.1%) with expanded coverage. Amplicon sequences from D18S51, DYS389 I/II, and the second half of HV1 were successfully partitioned and analyzed. Conclusions The ability to routinely deconvolute mtDNA mixtures down to a level of 1:250 allows for high resolution analysis of mtDNA heteroplasmy, and for differentiation of individuals from the same maternal lineage. The pyrosequencing approach results in poor resolution of homopolymeric sequences, and PCR/sequencing artifacts require a filtering mechanism similar to that for STR stutter and spectral bleed through. In addition, chimeric sequences from jumping PCR must be addressed to make the method operational. PMID:21674826
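
    A simple filter corresponding to the quoted detection limits (a minor component of roughly 1:250 supported by about 20 reads) can be sketched as below. This is a toy illustration with assumed inputs, not the NextGENe® workflow, and it ignores the homopolymer and chimeric-read filtering the authors discuss.

```python
from collections import Counter

def heteroplasmy_calls(bases_by_position, min_fraction=0.004, min_copies=20):
    """bases_by_position maps an mtDNA position to the list of base calls covering it;
    positions whose minor base clears both thresholds are reported."""
    calls = {}
    for pos, bases in bases_by_position.items():
        counts = Counter(bases)
        depth = sum(counts.values())
        (major, _), *minors = counts.most_common()
        for base, n in minors:
            if n >= min_copies and n / depth >= min_fraction:
                calls[pos] = (major, base, round(n / depth, 4))
    return calls
```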

  2. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    PubMed

    Caruana, Matthew V; Carvalho, Susana; Braun, David R; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W K

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  4. Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools

    SciTech Connect

    Barrand, Guy

    2002-08-20

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualization), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis identified several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of each component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
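
    As a purely illustrative sketch of the decoupling idea, the fragment below expresses two of the categories as abstract interfaces in Python; the actual AIDA interfaces are defined in Java and C++, and the method names here are invented rather than taken from the AIDA specification.

```python
from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    """Illustrative analogue of a histogram component interface."""
    @abstractmethod
    def fill(self, value: float, weight: float = 1.0) -> None: ...
    @abstractmethod
    def bin_height(self, index: int) -> float: ...

class IPlotter(ABC):
    """A plotter that depends only on the abstract histogram interface,
    so histogram and plotting implementations can be swapped independently."""
    @abstractmethod
    def plot(self, histogram: IHistogram1D) -> None: ...
```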

  5. Distortion Analysis Toolkit - A Software Tool for Easy Analysis of Nonlinear Audio Systems

    NASA Astrophysics Data System (ADS)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
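
    One basic measurement such a tool makes is the level of the harmonics generated from a pure sine excitation. The sketch below computes a simple total harmonic distortion estimate from an FFT; it is a generic illustration, and the test frequency, window choice and harmonic count are assumptions rather than the toolkit's internals.

```python
import numpy as np

def total_harmonic_distortion(response, fs, f0, n_harmonics=10):
    """Estimate THD of a system's response to a sine excitation at f0 (Hz)."""
    window = np.hanning(len(response))
    spectrum = np.abs(np.fft.rfft(response * window))
    freqs = np.fft.rfftfreq(len(response), d=1.0 / fs)

    def magnitude(f):
        return spectrum[np.argmin(np.abs(freqs - f))]  # nearest-bin magnitude

    fundamental = magnitude(f0)
    harmonics = np.sqrt(sum(magnitude(k * f0) ** 2
                            for k in range(2, n_harmonics + 1)))
    return harmonics / fundamental
```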

  6. Deconvolution and chromatic aberration corrections in quantifying colocalization of a transcription factor in three-dimensional cellular space.

    PubMed

    Abraham, Thomas; Allan, Sarah E; Levings, Megan K

    2010-08-01

    In the realm of multi-dimensional confocal microscopy, colocalization analysis of fluorescent emission signals has proven to be an invaluable tool for detecting molecular interactions between biological macromolecules at the subcellular level. We show here that image processing operations such as deconvolution and chromatic correction play a crucial role in the accurate determination of colocalization between biological macromolecules, particularly when the fluorescent signals are faint and lie in the blue and red emission regions. The cellular system presented here describes quantification of an activated forkhead box P3 (FOXP3) transcription factor in three-dimensional (3D) cellular space. 293T cells transfected with a conditionally active form of FOXP3 were stained with an anti-FOXP3 antibody conjugated to a fluorescent red dye (phycoerythrin, PE), and counterstained for DNA (nucleus) with a fluorescent blue dye (Hoechst). Because of the broad emission spectra of these dyes, the fluorescent signals were collected only from peak regions and were acquired sequentially. Since the PE signal was weak, a confocal pinhole size of two Airy units was used to collect the 3D image data sets. The raw images supplemented with the spectral data show the preferential association of activated FOXP3 molecules with the nucleus. However, the PE signals were found to be highly diffusive, and colocalization quantification from these raw images was not possible. In order to deconvolve the 3D raw image data set, point spread functions (PSFs) of these emissions were measured. From the measured PSFs, we found that the chromatic shifts between the blue and red colors were quite considerable. After applying both axial and lateral chromatic corrections, colocalization analysis performed on the deconvolved, chromatically corrected 3D image data set showed that 98% of DNA molecules were associated with FOXP3 molecules, whereas only 66% of FOXP3 molecules were colocalized with DNA molecules. In conclusion, our studies clearly demonstrate the importance of PSF measurement and chromatic aberration correction followed by deconvolution for the accurate localization of transcription factors in 3D cellular space. The reported imaging and processing methods can serve as a practical guide for quantitative fluorescence imaging of similar cellular systems and can provide a basis for further development. PMID:20392647
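
    Once the stacks have been deconvolved and chromatically aligned, the quoted percentages amount to voxel-overlap fractions between thresholded channels. A minimal sketch of that final step is shown below; the threshold values are assumptions and no background correction is included.

```python
import numpy as np

def colocalization_fractions(red_stack, blue_stack, red_thresh, blue_thresh):
    """Voxel-overlap colocalization between two 3D channels.

    Returns (fraction of blue voxels that are also red,
             fraction of red voxels that are also blue).
    """
    red_mask = red_stack > red_thresh
    blue_mask = blue_stack > blue_thresh
    blue_with_red = (blue_mask & red_mask).sum() / blue_mask.sum()
    red_with_blue = (red_mask & blue_mask).sum() / red_mask.sum()
    return blue_with_red, red_with_blue
```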

  7. Y0: An innovative tool for spatial data analysis

    NASA Astrophysics Data System (ADS)

    Wilson, Jeremy C.

    1993-08-01

    This paper describes an advanced analysis and visualization tool, called Y0 (pronounced "Why not?!"), that has been developed to directly support the scientific process for earth and space science research. Y0 aids the scientific research process by enabling the user to formulate algorithms and models within an integrated environment, and then interactively explore the solution space with the aid of appropriate visualizations. Y0 has been designed to provide strong support for both quantitative analysis and rich visualization. The user's algorithm or model is defined in terms of algebraic formulas in cells on worksheets, in a similar fashion to spreadsheet programs. Y0 is specifically designed to provide the data types and rich function set necessary for effective analysis and manipulation of remote sensing data. This includes various types of arrays, geometric objects, and objects for representing geographic coordinate system mappings. Visualization of results is tailored to the needs of remote sensing, with straightforward methods of composing, comparing, and animating imagery and graphical information, with reference to geographical coordinate systems. Y0 is based on advanced object-oriented technology. It is implemented in C++ for use in Unix environments, with a user interface based on the X window system. Y0 has been delivered under contract to Unidata, a group which provides data and software support to atmospheric researchers at universities affiliated with UCAR. This paper explores the key concepts in Y0, describes its utility for remote sensing analysis and visualization, and gives a specific example of its application to the problem of measuring glacier flow rates from Landsat imagery.

  8. Petrographic Image Analysis as a tool to improve reservoir data

    SciTech Connect

    Ballentine, F.M.; Shelby, D.L.; Philipson, C.A.

    1990-09-01

    Petrographic Image Analysis (PIA) can be used to acquire improved core analysis data in percussion sidewalls because the analysis focuses on the unaltered portions of the cores, thereby eliminating the bias that may be created by shattered and disturbed areas. It is also a valuable tool for reservoir characterization of all rock types. Petrophysical data acquired from laboratory tests on conventional and percussion sidewall cores, wireline logs, and PIA techniques were compared for a cored interval in the Tuscaloosa Formation. Comparison of log and conventional core porosities yielded good agreement. Sidewall core laboratory porosities were 2 to 10 porosity percent too high, owing to the shattering effect. However, PIA performed on percussion core samples compared well with both the conventional core laboratory and log porosities. Permeabilities obtained from conventional core laboratory measurements and PIA analyses compared well. Sidewall permeabilities (derived from particle size analysis) showed similar trends, but tended to be optimistic. These improved Tuscaloosa porosity and permeability values, as well as other parameters derived from PIA, can be useful in log interpretation and reservoir evaluation. PIA formation factor and porosity were used to define the cementation exponent m. Capillary pressure curves generated by PIA were utilized to correlate irreducible water saturation to PIA porosity and permeability values. PIA porosity and permeability were also used to determine critical water saturations. PIA is valuable as an aid to reservoir characterization. The Smackover Formation has been separated into several hydraulic units based on pore type and fluid flow properties. PIA was used to help delineate the differences between the pore systems in these reservoir units. Binary images and shape factor distributions were utilized to characterize the distributions and shapes of the pores within each unit.
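
    The cementation exponent mentioned above follows from the standard Archie relation F = phi**(-m) when the tortuosity factor is taken as one; whether the study used exactly this form is an assumption, but the computation itself is a one-liner.

```python
import numpy as np

def cementation_exponent(formation_factor, porosity):
    """Archie's relation F = porosity**(-m) solved for m (unit tortuosity factor)."""
    return -np.log(formation_factor) / np.log(porosity)

# example: a formation factor of 20 at 15% porosity gives m of roughly 1.58
print(cementation_exponent(20.0, 0.15))
```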

  9. EVA - An Interactive Online Tool for Extreme Value Analysis

    NASA Astrophysics Data System (ADS)

    Zingerle, C.; Buchauer, M.; Neururer, A.; Schellander, H.

    2009-09-01

    Forecasting and analysing extreme events and their impact is part of an operational forecaster's duties, even though such events occur infrequently. In such situations forecasters often rely on a synopsis of different forecast models, their own experience, historical observations and intuition. Historical data in particular are usually not available with the completeness and timeliness needed in operational forecasting and warning. A forecaster needs a comprehensive overview and has no time to dig data out of a database, search for extremes and compile a rather complicated extreme value analysis. On the other hand, a modern weather service is often asked to provide engineering expertise on extreme events, and in many cases the time for elaboration is limited. EVA (Extreme Value Analysis) was developed at ZAMG during METEORISK, a project among Alpine weather and hydrological services dealing with meteorological and hydrological risks. The EVA system consists of two main components. The first is an efficient database containing pre-processed precipitation data (rain, snow and snow height) from meteorological events of durations from 1 minute up to 15 days, measured at each station in the partner regions. The second part of the system is a set of web tools for the actual extreme value analysis. Different theoretical models can be chosen to calculate annualities (return periods). Presentation of the output is either tabular, showing all extreme events at a station together with the theoretically calculated return times, or graphical, where parameters such as the precipitation amount at certain return times and confidence intervals are plotted together with the empirical distribution of the actual measurements. Additional plots (quantile-quantile plots, empirical and fitted theoretical distribution models) allowing a more detailed assessment of the extreme value analysis can be requested. To complete the analysis of a particular extreme event, ECMWF ERA40 sea level and upper air pressure fields and temperature distributions are available within the system. In the years after METEORISK, the EVA system has been expanded by ZAMG with further parameters such as wind speed and temperature. The system has lately been harmonized, so that ZAMG now has a single platform providing fast extreme value analysis for all relevant meteorological parameters. A further development is the EVA-maps application. Forecasted extreme events at station locations and actual measurements are compared to historical extreme events. Return times of the forecasted and measured events are classified and displayed in a map, and a mouse-over menu offers a detailed analysis of the situation at each station. EVA-maps is a powerful aid for forecasters, giving them a comprehensive overview of forecasted precipitation in relation to extreme events of the past.
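
    Where the underlying statistics are needed outside the web interface, return levels of the kind EVA tabulates can be estimated by fitting an extreme value distribution to annual maxima. The sketch below uses scipy's generalized extreme value distribution and assumes a hypothetical input file of annual precipitation maxima; it is not EVA's own implementation.

```python
import numpy as np
from scipy import stats

# hypothetical input: one annual precipitation maximum per year (mm)
annual_maxima = np.loadtxt("annual_max_precip.txt")

# fit a generalized extreme value (GEV) distribution to the block maxima
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# return level = value exceeded on average once every T years
return_periods = np.array([10.0, 30.0, 100.0])
return_levels = stats.genextreme.ppf(1.0 - 1.0 / return_periods,
                                     shape, loc=loc, scale=scale)
for T, level in zip(return_periods, return_levels):
    print(f"{T:>5.0f}-year event: {level:.1f} mm")
```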

  10. Generalized Analysis Tools for Multi-Spacecraft Missions

    NASA Astrophysics Data System (ADS)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the least-squares approach has the advantage of applying to any number of spacecraft [1], but it is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but it appears limited to clusters of four spacecraft. Moreover, the barycentric approach makes it possible to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be possible. The weights given to the spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new framework, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI SR-001, 1998. [2] Chanteur, G.: Spatial Interpolation for Four Spacecraft: Theory, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 371-393, ISSI SR-001, 1998. [3] Chanteur, G.: Accuracy of field gradient estimations by Cluster: Explanation of its dependency upon elongation and planarity of the tetrahedron, pp. 265-268, ESA SP-449, 2000. [4] Vogt, J., Paschmann, G., and Chanteur, G.: Reciprocal Vectors, pp. 33-46, ISSI SR-008, 2008.
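
    For the standard four-spacecraft case described in [2], the reciprocal vectors and the resulting linear gradient estimator can be written in a few lines. The sketch below is a generic implementation of that classical formula, not the generalized weighted scheme proposed in the abstract.

```python
import numpy as np

def reciprocal_vectors(r):
    """Reciprocal vectors k_i of a spacecraft tetrahedron.

    r : (4, 3) array of spacecraft positions.
    k_i = (r_jl x r_jm) / (r_ji . (r_jl x r_jm)), where j, l, m are the other craft.
    """
    k = np.zeros((4, 3))
    for i in range(4):
        j, l, m = [p for p in range(4) if p != i]
        cross = np.cross(r[l] - r[j], r[m] - r[j])
        k[i] = cross / np.dot(r[i] - r[j], cross)
    return k

def linear_gradient(r, f):
    """Estimate the spatial gradient of a scalar field sampled at the four craft."""
    return reciprocal_vectors(r) @ np.asarray(f)   # grad f ~ sum_i k_i f_i
```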

  11. Micropollutants in urban watersheds : substance flow analysis as management tool

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted by traffic, like copper or PAHs, reach surface water during rain events often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to develop flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we test the application of SFA to a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, a profound system knowledge and many data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). The analysis for copper reveals that around 1500 kg of copper enters the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources (73% in total) of copper in receiving surface water are roofs and the contact lines of trolleybuses. Thus, technical solutions have to be found to manage this specific source of contamination. Application of the SFA approach to the four pharmaceuticals reveals that CSOs represent an important source of contamination: between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads from the city of Lausanne to the lake are due to CSOs. These results will help in defining the best management strategy to limit the contamination of Lake Geneva. SFA is thus a promising tool for integrated urban water management.

  12. New tools for the analysis and design of building envelopes

    SciTech Connect

    Papamichael, K.; Winkelmann, F.C.; Buhl, W.F.; Chauvet, H.

    1994-08-01

    We describe the integrated development of PowerDOE, a new version of the DOE-2 building energy analysis program, and the Building Design Advisor (BDA), a multimedia-based design tool that assists building designers with the concurrent consideration of multiple design solutions with respect to multiple design criteria. PowerDOE has a Windows-based Graphical User Interface (GUI) that makes it easier to use than DOE-2, while retaining DOE-2's calculation power and accuracy. BDA, with a similar GUI, is designed to link to multiple analytical models and databases. In its first release it is linked to PowerDOE and a Daylighting Analysis Module, as well as to a Case Studies Database and a Schematic Graphic Editor. These allow building designers to set performance goals and address key building envelope parameters from the initial, schematic phases of building design to the detailed specification of building components and systems required by PowerDOE. The consideration of the thermal performance of building envelopes through PowerDOE and BDA is integrated with non-thermal envelope performance aspects, such as daylighting, as well as with the performance of non-envelope building components and systems, such as electric lighting and HVAC. Future versions of BDA will support links to CAD and electronic product catalogs, as well as provide context-dependent design advice to improve performance.

  13. Spectral Analysis Tool 6.2 for Windows

    NASA Technical Reports Server (NTRS)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.
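
    As a rough point of reference for one quantity mentioned above, an interference-to-signal power ratio over a band of interest can be estimated from Welch power spectral densities. This generic sketch is not SAT's algorithm; the sampling rate, band edges and signal arrays are assumptions.

```python
import numpy as np
from scipy.signal import welch

def interference_to_signal_ratio_db(desired, interferer, fs, band):
    """I/S power ratio (dB) integrated over band = (f_lo, f_hi), from Welch PSDs."""
    f, p_sig = welch(desired, fs=fs)
    _, p_int = welch(interferer, fs=fs)
    sel = (f >= band[0]) & (f <= band[1])
    return 10.0 * np.log10(np.trapz(p_int[sel], f[sel]) /
                           np.trapz(p_sig[sel], f[sel]))
```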

  14. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  15. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems whether they operate dynamically or at steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies obtained by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.

  16. Real-time image deconvolution on the GPU

    NASA Astrophysics Data System (ADS)

    Klosowski, James T.; Krishnan, Shankar

    2011-01-01

    Two-dimensional image deconvolution is an important and well-studied problem with applications to image deblurring and restoration. Most of the best deconvolution algorithms use natural image statistics that act as priors to regularize the problem. Recently, Krishnan and Fergus provided a fast deconvolution algorithm that yields results comparable to the current state of the art. They use a hyper-Laplacian image prior to regularize the problem. The resulting optimization problem is solved using alternating minimization in conjunction with a half-quadratic penalty function. In this paper, we provide an efficient CUDA implementation of their algorithm on the GPU. Our implementation leverages many well-known CUDA optimization techniques, as well as several others that have a significant impact on this particular algorithm. We discuss each of these, as well as make a few observations regarding the CUFFT library. Our experiments were run on an Nvidia GeForce GTX 260. For a single-channel image of size 710 x 470, we obtain over 40 fps, while on a larger image of size 1900 x 1266, we get almost 6 fps (without counting disk I/O). In addition to linear performance scaling, we believe ours is the first implementation to perform deconvolutions at video rates. Our running times also demonstrate that our GPU implementation is over 27 times faster than the original CPU implementation.
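
    For readers who want a minimal point of reference, the core of such FFT-based deconvolution can be sketched on the CPU with numpy. The version below uses a simple quadratic gradient prior as a stand-in for the hyper-Laplacian prior and half-quadratic splitting described above, so it illustrates the structure of the computation (element-wise work between forward and inverse FFTs) rather than the authors' exact algorithm.

```python
import numpy as np

def fft_deconvolve(blurred, kernel, lam=2e-3):
    """Frequency-domain deconvolution with a quadratic (L2) gradient prior.

    blurred : 2D grayscale image
    kernel  : 2D blur kernel, assumed known, with its origin at the top-left
              element (otherwise the output is circularly shifted)
    lam     : regularization weight (assumed value, tune per image)
    """
    H, W = blurred.shape
    K = np.fft.fft2(kernel, s=(H, W))
    # frequency responses of horizontal/vertical finite-difference filters
    Dx = np.fft.fft2(np.array([[1.0, -1.0]]), s=(H, W))
    Dy = np.fft.fft2(np.array([[1.0], [-1.0]]), s=(H, W))
    numerator = np.conj(K) * np.fft.fft2(blurred)
    denominator = np.abs(K) ** 2 + lam * (np.abs(Dx) ** 2 + np.abs(Dy) ** 2)
    return np.real(np.fft.ifft2(numerator / denominator))
```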

  17. Spent Nuclear Fuel Characterization Through Neutron Flux Deconvolution

    SciTech Connect

    Hartman, Michael R.; Lee, John C.

    2001-06-17

    A method to determine the composition of spent fuel through spectral deconvolution of the neutron flux emitted from the fuel is proposed. Recently developed GaAs({sup 10}B) semiconductor detector arrays are used. The results of Monte Carlo simulations of the detector responses, illustrating the feasibility of the spectral unfolding technique for spent fuel characterization, are presented.

  18. Blind deconvolution using an improved L0 sparse representation

    NASA Astrophysics Data System (ADS)

    Ye, Pengzhao; Feng, Huajun; Li, Qi; Xu, Zhihai; Chen, Yueting

    2014-09-01

    In this paper, we present a method for single-image blind deconvolution. Many common blind deconvolution methods need to first generate a salient image, whereas this paper presents a novel L0 sparse expression to solve the ill-posed problem directly. It does not need to filter the blurred image as a separate restoration step and can use the gradient information as a fidelity term during optimization. The key to the blind deconvolution problem is to estimate an accurate kernel. First, based on an L2 sparse expression using the gradient operator as a prior, the kernel can be estimated roughly and efficiently in the frequency domain. We adopt a multi-scale scheme that estimates the blur kernel from the coarsest level to the finest level. After the kernel estimate at each level, the L0 sparse representation is employed as the fidelity term during restoration. After derivation, the L0 norm can be approximately converted into a sum term and an L1-norm term, which can be addressed by the Split-Bregman method. Using the estimated blur kernel and the TV deconvolution model, the final restored image is obtained. Experimental results show that the proposed method is fast and can accurately reconstruct the kernel, especially when the blur is motion blur, defocus blur, or a superposition of the two. The restored image is of higher quality than that of some state-of-the-art algorithms.
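
    The rough frequency-domain kernel estimate under an L2 gradient prior that this kind of method starts from has a closed form. The sketch below shows that one step only; the gradient images, kernel size and regularization weight are placeholders, and the full multi-scale L0 pipeline is not reproduced here.

```python
import numpy as np

def estimate_kernel_l2(latent_grad, blurred_grad, ksize=31, gamma=1e-2):
    """Closed-form blur-kernel estimate in the frequency domain (L2 prior).

    latent_grad  : gradient image of the current latent/salient estimate
    blurred_grad : gradient image of the blurred input
    """
    S = np.fft.fft2(latent_grad)
    B = np.fft.fft2(blurred_grad)
    K = np.real(np.fft.ifft2(np.conj(S) * B / (np.abs(S) ** 2 + gamma)))
    K = np.fft.fftshift(K)                      # move the kernel to the image centre
    cy, cx = K.shape[0] // 2, K.shape[1] // 2
    half = ksize // 2
    k = K[cy - half:cy + half + 1, cx - half:cx + half + 1]
    k = np.maximum(k, 0.0)                      # blur kernels are non-negative
    return k / k.sum()                          # and sum to one
```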

  19. Deconvolution of astronomical images using SOR with adaptive relaxation.

    PubMed

    Vorontsov, S V; Strakhov, V N; Jefferies, S M; Borelli, K J

    2011-07-01

    We address the potential performance of the successive overrelaxation technique (SOR) in image deconvolution, focusing our attention on the restoration of astronomical images distorted by atmospheric turbulence. SOR is the classical Gauss-Seidel iteration, supplemented with relaxation. As indicated by earlier work, the convergence properties of SOR, and its ultimate performance in the deconvolution of blurred and noisy images, can be made competitive with other iterative techniques, including conjugate gradients, by a proper choice of the relaxation parameter. The question of how to choose the relaxation parameter, however, remained open, and in practical work one had to rely on experimentation. In this paper, using constructive (rather than exact) arguments, we suggest a simple strategy for choosing the relaxation parameter and for updating its value in consecutive iterations to optimize the performance of the SOR algorithm (and its positivity-constrained version, +SOR) at finite iteration counts. We suggest an extension of the algorithm to the notoriously difficult problem of "blind" deconvolution, where both the true object and the point-spread function have to be recovered from the blurred image. We report the results of numerical inversions with artificial and real data, where the algorithm is compared with techniques based on conjugate gradients. In all of our experiments +SOR provides the highest quality results. In addition, +SOR is found to be able to detect moderately small changes in the true object between separate data frames: an important quality for multi-frame blind deconvolution, where stationarity of the object is a necessity. PMID:21747506

  20. Robust multichannel blind deconvolution via fast alternating minimization.

    PubMed

    Sroubek, Filip; Milanfar, Peyman

    2012-04-01

    Blind deconvolution, which comprises simultaneous blur and image estimation, is a strongly ill-posed problem. It is by now well known that if multiple images of the same scene are acquired, this multichannel (MC) blind deconvolution problem is better posed and allows blur estimation directly from the degraded images. We improve the MC idea by adding robustness to noise and stability in the case of large blurs or if the blur size is vastly overestimated. We formulate blind deconvolution as an l1-regularized optimization problem and seek a solution by alternately optimizing with respect to the image and with respect to the blurs. Each optimization step is converted to a constrained problem by variable splitting and then is addressed with an augmented Lagrangian method, which permits simple and fast implementation in the Fourier domain. The rapid convergence of the proposed method is illustrated on synthetically blurred data. Applicability is also demonstrated on the deconvolution of real photos taken by a digital camera. PMID:22084050

  1. Improving the efficiency of deconvolution algorithms for sound source localization.

    PubMed

    Lylloff, Oliver; Fernández-Grande, Efrén; Agerkvist, Finn; Hald, Jørgen; Roig, Elisabet Tiana; Andersen, Martin S

    2015-07-01

    The localization of sound sources with delay-and-sum (DAS) beamforming is limited by poor spatial resolution, particularly at low frequencies. Various methods based on deconvolution are examined to improve the resolution of the beamforming map, which can be modeled by a convolution of the unknown acoustic source distribution and the beamformer's response to a point source, i.e., the point-spread function. A significant limitation of deconvolution is, however, the additional computational effort compared to beamforming. In this paper, computationally efficient deconvolution algorithms are examined with computer simulations and experimental data. Specifically, the deconvolution problem is solved with a fast gradient projection method called the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA) and compared with a Fourier-based non-negative least squares algorithm. The results indicate that FISTA tends to provide an improved spatial resolution and is up to 30% faster and more robust to noise. In the spirit of reproducible research, the source code is available online. PMID:26233017
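
    For reference, a bare-bones FISTA for a non-negative deconvolution problem of the form min ||A x - b||^2 with x >= 0, where A applies the point-spread function by convolution, can be written as below. The step size, iteration count and "same"-mode adjoint are simplifying assumptions; this is not the authors' published code.

```python
import numpy as np
from scipy.signal import fftconvolve

def fista_nonneg_deconv(b, psf, step, n_iter=200):
    """FISTA for min ||psf * x - b||^2 subject to x >= 0 (sketch).

    step should satisfy step <= 1 / L, with L the squared operator norm of the
    convolution (roughly max |FFT(psf)|^2).
    """
    A = lambda x: fftconvolve(x, psf, mode="same")               # forward blur
    At = lambda y: fftconvolve(y, psf[::-1, ::-1], mode="same")  # approximate adjoint
    x = np.zeros_like(b, dtype=float)
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = At(A(y) - b)
        x_new = np.maximum(y - step * grad, 0.0)   # gradient step + projection
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```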

  2. Motion-compensated blind deconvolution of scanning laser ophthalmoscope imagery

    NASA Astrophysics Data System (ADS)

    O'Connor, Nathan J.; Bartsch, Dirk-Uwe G.; Freeman, William R.; Holmes, Timothy J.

    1998-06-01

    A deconvolution algorithm for use with scanning laser ophthalmoscope (SLO) data is being developed. The SLO is fundamentally a confocal microscope in which the objective lens is the human ocular lens. 3D data are collected by raster scanning to form images at different depths in retinal and choroidal layers. In this way, 3D anatomy may be imaged and stored as a series of optical sections. Given the poor optical quality of the human lens and random eye motion during data acquisition, any deconvolution method applied to SLO data must be able to account for distortions present in the observed data. The algorithm presented compensates for image warping and frame-to-frame displacement due to random eye motion, smearing along the optic axis, sensor saturation, and other problems. A preprocessing step is first used to compensate for frame-to-frame image displacement. The image warping, caused by random eye motion during raster scanning, is then corrected. Finally, a maximum-likelihood-based blind deconvolution algorithm is used to correct severe blurring along the optic axis. The blind deconvolution algorithm contains an iterative search for subpixel displacements remaining after image warping and frame-to-frame displacements are corrected. This iterative search is formulated to ensure that the likelihood functional is non-decreasing.

  3. Machine learning deconvolution filter kernels for image restoration

    NASA Astrophysics Data System (ADS)

    Mainali, Pradip; Wittebrood, Rimmert

    2015-03-01

    In this paper, we propose a novel algorithm to recover a sharp image from its corrupted form by deconvolution. The algorithm learns the deconvolution process. This is achieved by learning deconvolution filter kernels for a set of learnt basic pixel patterns. The algorithm consists of an offline learning stage and an online filtering stage. In the one-time offline learning stage, the algorithm learns a dictionary of local pixel-patch characteristics as the basic pixel patterns from a large number of natural images in the training database. The deconvolution filter coefficients for each pixel pattern are then optimized by using the source and corrupted image pairs in the training database. In the online stage, the algorithm only needs to find the nearest matching pixel pattern in the dictionary for each pixel and filter it using the filter optimized for the corresponding pixel pattern. Experimental results on natural images show that our method achieves state-of-the-art results on image deblurring. The proposed approach can be applied to recover a sharp image in applications such as cameras, HD/UHD TVs, and document scanning systems.
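
    A minimal sketch of the online stage described above might look as follows, assuming hypothetical patterns (a dictionary of normalized patches) and kernels (one learned filter per pattern) arrays produced by some offline training; the patch normalization and brute-force nearest-pattern search are kept deliberately simple and slow for clarity, and are assumptions rather than the paper's implementation.

```python
import numpy as np

def filter_with_learned_kernels(image, patterns, kernels, patch=5):
    """Online stage: per-pixel nearest-pattern lookup followed by learned filtering.

    patterns : (N, patch*patch) dictionary of normalized basic pixel patterns
    kernels  : (N, patch, patch) deconvolution filters, one per pattern
    """
    pad = patch // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            block = padded[i:i + patch, j:j + patch]
            p = block.ravel().astype(float)
            p = (p - p.mean()) / (p.std() + 1e-8)               # normalize the patch
            idx = np.argmin(((patterns - p) ** 2).sum(axis=1))  # nearest pattern
            out[i, j] = float((block * kernels[idx]).sum())     # apply its filter
    return out
```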

  4. Marginal blind deconvolution of adaptive optics retinal images.

    PubMed

    Blanco, L; Mugnier, L M

    2011-11-01

    Adaptive Optics corrected flood imaging of the retina has been in use for more than a decade and is now a well-developed technique. Nevertheless, raw AO flood images are usually of poor contrast because of the three-dimensional nature of the imaging, meaning that the image contains information coming from both the in-focus plane and the out-of-focus planes of the object, which also leads to a loss in resolution. Interpretation of such images is therefore difficult without an appropriate post-processing, which typically includes image deconvolution. The deconvolution of retina images is difficult because the point spread function (PSF) is not well known, a problem known as blind deconvolution. We present an image model for dealing with the problem of imaging a 3D object with a 2D conventional imager in which the recorded 2D image is a convolution of an invariant 2D object with a linear combination of 2D PSFs. The blind deconvolution problem boils down to estimating the coefficients of the PSF linear combination. We show that the conventional method of joint estimation fails even for a small number of coefficients. We derive a marginal estimation of the unknown parameters (PSF coefficients, object Power Spectral Density and noise level) followed by a MAP estimation of the object. We show that the marginal estimation has good statistical convergence properties and we present results on simulated and experimental data. PMID:22109201

  5. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/11 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study of their academic achievement was carried out with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, or Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year the latter mark was obtained. Another group of 77 students was evaluated independently of the former group; these were students who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all of the students' secondary schools were geographically coded, considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point so that it could be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to estimate the probability of success or failure for new incoming students in the following academic years. Keywords: Academic achievement, spatial analyst, GIS, Bologna.

  6. Breast image feature learning with adaptive deconvolutional networks

    NASA Astrophysics Data System (ADS)

    Jamieson, Andrew R.; Drukker, Karen; Giger, Maryellen L.

    2012-03-01

    Feature extraction is a critical component of medical image analysis. Many computer-aided diagnosis approaches employ hand-designed, heuristically extracted lesion features. An alternative approach is to learn features directly from images. In this preliminary study, we explored the use of Adaptive Deconvolutional Networks (ADN) for learning high-level features in diagnostic breast mass lesion images, with potential application to computer-aided diagnosis (CADx) and content-based image retrieval (CBIR). ADNs (Zeiler et al., 2011) are recently proposed unsupervised, generative hierarchical models that decompose images via convolutional sparse coding and max pooling. We trained the ADNs to learn multiple layers of representation for two breast image data sets on two different modalities (739 full-field digital mammography (FFDM) and 2393 ultrasound images). Feature map calculations were accelerated by use of GPUs. Following Zeiler et al., we applied the Spatial Pyramid Matching (SPM) kernel (Lazebnik et al., 2006) to the inferred feature maps and combined this with a linear support vector machine (SVM) classifier for the task of binary classification between cancer and non-cancer breast mass lesions. Non-linear, local-structure-preserving dimension reduction, Elastic Embedding (Carreira-Perpiñán, 2010), was then used to visualize the SPM kernel output in 2D and qualitatively inspect the image relationships learned. Performance was found to be competitive with current CADx schemes that use human-designed features, e.g., achieving a 0.632+ bootstrap AUC (by case) of 0.83 [0.78, 0.89] for an ultrasound image set (1125 cases).

  7. Fluence estimation by deconvolution via l1-norm minimization

    NASA Astrophysics Data System (ADS)

    García Hernández, J. C.; Lazaro-Ponthus, D.; Gmar, M.; Barthe, J.

    2011-03-01

    Advances in radiotherapy irradiation techniques have led to very complex treatments requiring more stringent control. The dosimetric properties of electronic portal imaging devices (EPIDs) have encouraged their use for treatment verification. Two main approaches have been proposed: the forward approach, where measured portal dose images are compared to predicted dose images, and the backward approach, where EPID images are used to estimate the dose delivered to the patient. Both approaches need EPID images to be converted into a fluence distribution by deconvolution. However, deconvolution is an ill-posed problem which is very sensitive to small variations in the input data. This study presents the application of a deconvolution method based on l1-norm minimization, a method known for being very stable when working with noisy data. The algorithm was first evaluated on synthetic images with different noise levels, and the results were satisfactory. The deconvolution algorithm was then applied to experimental portal images; the required EPID response kernel and energy fluence images were computed by Monte Carlo calculation, the accelerator treatment head and EPID models having been commissioned in a previous work. The obtained fluence images were in good agreement with simulated fluence images. This deconvolution algorithm may be generalized to an inverse problem with a general operator, where image formation is no longer modeled by a convolution but by a linear operation that may be seen as a position-dependent convolution. Moreover, this procedure would be detector independent and could be used for any detector type provided its response function is known.

  8. Suspected-target pesticide screening using gas chromatography-quadrupole time-of-flight mass spectrometry with high resolution deconvolution and retention index/mass spectrum library.

    PubMed

    Zhang, Fang; Wang, Haoyang; Zhang, Li; Zhang, Jing; Fan, Ruojing; Yu, Chongtian; Wang, Wenwen; Guo, Yinlong

    2014-10-01

    A strategy for suspected-target screening of pesticide residues in complicated matrices was developed using gas chromatography in combination with hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS). The screening workflow followed three key steps: initial detection, preliminary identification, and final confirmation. The initial detection of components in a matrix was done by high resolution mass spectrum deconvolution; the preliminary identification of suspected pesticides was based on a special retention index/mass spectrum (RI/MS) library that contained both the first-stage mass spectra (MS(1) spectra) and retention indices; and the final confirmation was accomplished by accurate mass measurements of representative ions with their response ratios from the MS(1) spectra or representative product ions from the second-stage mass spectra (MS(2) spectra). To evaluate the applicability of the workflow to real samples, three matrices (apple, spinach, and scallion), each spiked with 165 test pesticides at a set of concentrations, were selected as models. The results showed that the use of high-resolution TOF enabled effective extraction of spectra from noisy chromatograms, based on a narrow mass window (5 mDa), with suspected-target compounds identified by similarity matching of the deconvoluted full mass spectra and filtering on linear RIs. On average, over 74% of pesticides at 50 ng/mL could be identified using deconvolution and the RI/MS library. Over 80% of pesticides at 5 ng/mL or lower concentrations could be confirmed in each matrix using at least two representative ions with their response ratios from the MS(1) spectra. In addition, the application of product ion spectra was capable of confirming suspected pesticides with specificity for some pesticides in complicated matrices. In conclusion, GC-QTOF MS combined with the RI/MS library appears to be one of the most efficient tools for the analysis of suspected-target pesticide residues in complicated matrices. PMID:25059143

  9. Analysis of the influence of tool dynamics in diamond turning

    SciTech Connect

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  10. HPLC analysis as a tool for assessing targeted liposome composition.

    PubMed

    Oswald, Mira; Platscher, Michael; Geissler, Simon; Goepferich, Achim

    2016-01-30

    Functionalized phospholipids are indispensable materials for the design of targeted liposomes. Control over the quality and quantity of phospholipids is therefore key to the successful development and manufacture of such formulations. This was also the case for a complex liposomal preparation composed of 1,2-dioleoyl-sn-glycero-3-phosphocholine (DOPC), cholesterol (CHO), and 1,2-distearoyl-sn-glycero-3-phosphoethanolamine-N-[amino(polyethylene glycol)-2000] (DSPE-PEG2000). To this end, an RP-HPLC method was developed. Detection of the liposomal components was done via evaporative light scattering (ELS). The method was validated for linearity, precision, accuracy, sensitivity and robustness. The liposomal compounds had a non-linear quadratic response in the concentration range of 0.012-0.42 mg/ml with a correlation coefficient greater than 0.99, and the accuracy of the method was confirmed to be 95-105% of the theoretical concentration. Furthermore, degradation products from the liposomal formulation could be identified. The presented method was successfully implemented as a control tool during the preparation of functionalized liposomes. It underlines the benefit of HPLC analysis of phospholipids during liposome preparation as an easy and rapid control method for the functionalized lipid at each preparation step, as well as for the quantification of all components. PMID:26570988

  11. ADVISOR: a systems analysis tool for advanced vehicle modeling

    NASA Astrophysics Data System (ADS)

    Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

    This paper provides an overview of the US Department of Energy's (DOE's) Advanced Vehicle Simulator (ADVISOR), written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance, and the emissions of vehicles that use alternative technologies including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power sources) configurations. It excels at quantifying the relative change that can be expected due to the implementation of a technology compared to a baseline scenario. ADVISOR's capabilities and limitations are presented, and the power source models that are included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia in August 2001.

  12. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob. PMID:18948494
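
    The core computation behind such a tool can be sketched in a few lines: an empirical estimate of the probability that a site is in poor condition given that the stressor exceeds a threshold, evaluated over a range of thresholds. The sketch below is a generic illustration, not CProb's Excel/R implementation, and the array names are placeholders.

```python
import numpy as np

def conditional_probability_curve(stressor, impaired, thresholds):
    """P(impaired | stressor >= t) for each threshold t.

    stressor : 1D array of stressor measurements
    impaired : boolean array, True where the ecological response is poor
    """
    probs = []
    for t in thresholds:
        exceeds = stressor >= t
        probs.append(impaired[exceeds].mean() if exceeds.any() else np.nan)
    return np.array(probs)
```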

  13. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    NASA Technical Reports Server (NTRS)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  14. Clinical decision support tools: analysis of online drug information databases

    PubMed Central

    Clauson, Kevin A; Marsh, Wallace A; Polen, Hyla H; Seamon, Matthew J; Ortiz, Blanca I

    2007-01-01

    Background Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information by systematically comparing the most commonly used online drug information databases. Methods Five commercially available and two freely available online drug information databases were evaluated according to scope (presence or absence of answer), completeness (the comprehensiveness of the answers), and ease of use. Additionally, a composite score integrating all three criteria was utilized. Fifteen weighted categories comprised of 158 questions were used to conduct the analysis. Descriptive statistics and Chi-square were used to summarize the evaluation components and make comparisons between databases. Scheffe's multiple comparison procedure was used to determine statistically different scope and completeness scores. The composite score was subjected to sensitivity analysis to investigate the effect of the choice of percentages for scope and completeness. Results The rankings for the databases from highest to lowest, based on composite scores were Clinical Pharmacology, Micromedex, Lexi-Comp Online, Facts & Comparisons 4.0, Epocrates Online Premium, RxList.com, and Epocrates Online Free. Differences in scope produced three statistical groupings with Group 1 (best) performers being: Clinical Pharmacology, Micromedex, Facts & Comparisons 4.0, Lexi-Comp Online, Group 2: Epocrates Premium and RxList.com and Group 3: Epocrates Free (p < 0.05). Completeness scores were similarly stratified. Collapsing the databases into two groups by access (subscription or free), showed the subscription databases performed better than the free databases in the measured criteria (p < 0.001). Conclusion Online drug information databases, which belong to clinical decision support, vary in their ability to answer questions across a range of categories. PMID:17346336

  15. A measuring tool for tree-rings analysis

    NASA Astrophysics Data System (ADS)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for measuring and analyzing annual tree-ring widths. It consists of a professional scanner, a computer system, and software. In many respects this system is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gains, the possibility of archiving the measurement results at any stage of processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including those which are difficult to process because of a complex wood structure (inhomogeneous growth in different directions; missing, light, and false rings; etc.). This software can analyze pictures made with optical scanners or with analog or digital cameras. The software was written in C++ and is compatible with modern Windows operating systems. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of year is displayed on screen during the analysis and can be used for visual and numerical cross-dating and for comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The system is universal in application, which will allow its use for solving different problems in biology and ecology. With its help, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed from samples collected on the Kola Peninsula (northwestern Russia).
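
    The central measurement step, ring widths along an interactively traced path, can be sketched as follows in Python. The sketch assumes ring boundaries appear as dark minima in the brightness profile sampled along the path and that the scan resolution is known; it is a simplified stand-in for the C++ software described above, using synthetic data.

        # Sketch: ring widths from a brightness profile sampled along a measurement path.
        import numpy as np
        from scipy.signal import find_peaks

        dpi = 1200                      # assumed scanner resolution
        mm_per_px = 25.4 / dpi

        rng = np.random.default_rng(0)
        profile = 200 + 20 * rng.standard_normal(3000)           # synthetic brightness profile
        boundaries_true = np.cumsum(rng.integers(40, 120, 30))   # synthetic ring boundaries
        for b in boundaries_true:
            profile[max(0, b - 3):b + 3] -= 120                  # dark boundary bands

        minima, _ = find_peaks(-profile, prominence=60, distance=20)  # boundaries = pronounced minima
        ring_widths_mm = np.diff(minima) * mm_per_px
        print("Detected rings:", ring_widths_mm.size)
        print("Widths (mm):", np.round(ring_widths_mm, 3))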

  16. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed well in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, neither the performance of this method nor its implementation as a web-based tool had been assessed. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs well. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
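
    The geometric core of PAEA, measuring principal angles between the subspace spanned by expression signatures and the direction defined by a gene set, can be sketched with standard linear algebra. The sketch below uses random data and is a conceptual illustration only, not the published implementation.

        # Sketch: principal angles between a gene-set direction and an expression subspace.
        import numpy as np
        from scipy.linalg import subspace_angles

        rng = np.random.default_rng(0)
        n_genes = 500
        expr_signatures = rng.standard_normal((n_genes, 5))  # columns span the expression subspace
        gene_set = np.zeros((n_genes, 1))
        gene_set[rng.choice(n_genes, size=40, replace=False), 0] = 1.0  # indicator of one gene set

        # Smaller principal angle = the gene set aligns more closely with the
        # differential-expression subspace, i.e. stronger enrichment.
        angles = subspace_angles(expr_signatures, gene_set)
        print("Smallest principal angle (radians):", angles.min())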

  17. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    NASA Astrophysics Data System (ADS)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and as part of a community's resilience to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature on risk perception, studies that portray this feature spatially are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better target educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is based on a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency management to know in advance the different levels of risk perception and preparedness existing among several sectors of the population. Knowing where the most vulnerable population is located may optimize the use of resources, better direct the initial efforts, and organize the evacuation and attention procedures. As part of the CBEWS, a comprehensive survey was carried out in the study area to measure, among other features, the levels of risk perception, preparation, and information received about natural hazards. After a statistical and direct analysis of the complete social dataset recorded, a spatial distribution of the information is currently being produced. Based on the boundary features (municipalities and sub-districts) of the Italian Institute of Statistics (ISTAT), a local-scale background has been established (individual addresses are not accessible for privacy reasons, so the district ID within each municipality was used as the finest level of detail) and a spatial location of the surveyed population has been completed. The geometric component has been defined, and it is now possible to create a local distribution of social parameters derived from the perception questionnaire results. The raw information and socio-statistical analyses offer different views and "visual concepts" of risk perception. For this reason, a complete GeoDB is being developed to organize the dataset. From a technical point of view, the environment for data sharing is based on a fully open-source web-service environment, offering a user-friendly interface to this kind of information. The final aim is to offer different views of the dataset, using the same scale prototype and hierarchical data structure, in order to provide and compare the spatial distribution of risk perception at the most detailed level.

  18. Enabling computational tools for tomographic image and engineering data analysis

    NASA Astrophysics Data System (ADS)

    Raghu, Shravan Kumar

    Use of computational tools for solving various engineering problems involving structural mechanics, materials science and molecular dynamics will be discussed. Solutions for three different problems have been developed using powerful computational tools such as MATLAB, LAMMPS and ANSYS. The three problems that were investigated include: (a) Determination of morphological properties such as fiber volume fraction and void percentage using SEM images of the transverse cross sections of Glass-Fiber Reinforced Polymer (GFRP) and Carbon-Fiber Reinforced Polymer (CFRP) composite samples in conjunction with MATLAB software. A special MATLAB code was developed to analyze the SEM images. The code uses various image edge detection and shape detection techniques to determine the desired properties of the samples. The results obtained using the image analysis are in good agreement with the experimental values. (b) The second part presents computational studies to predict the melting point of metallic nanowires. 3-D atomistic models of gold (Au) nanowires were developed to investigate nanoscale effects on their melting behavior. Molecular Dynamics (MD) simulations were performed using the LAMMPS software package on Intel Core i7 computers at the Joint School of Nanoscience and Nanoengineering Computation Lab. Melting point simulations were carried out for gold nanowires of different diameters and lengths. In accordance with nanowire melting point theory, the theoretical melting points of gold nanowires of various diameters were calculated. The results indicate a convergence behavior in the melting point of gold nanowires with increasing diameter. The study indicated that the MD simulation results for the gold nanowires closely matched the theoretical values. The MD simulations were performed using two different embedded-atom method (EAM) potentials, both of which show similar behavior, thus supporting the reliability of the simulation results. A convergence of the melting point over different wire lengths was also observed, which provides a stable nanowire configuration that can be used for the simulations. (c) The third part presents a computational method to visualize ANSYS simulation results using the 3D stereoscopic facility available in the CAVE Laboratory at the Joint School of Nanoscience and Nanoengineering. This method can be used to visualize various failures in structures that are subjected to mechanical, thermal and other types of loadings.
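
    As an illustration of the first task, estimating fiber volume fraction from a transverse cross-section image, the Python sketch below thresholds a grayscale image and reports the bright-pixel areal fraction. It is a simplified stand-in for the MATLAB code described above (which also applies edge and shape detection), and the synthetic image and threshold value are assumptions.

        # Sketch: fiber volume fraction as the areal fraction of bright (fiber) pixels.
        import numpy as np

        rng = np.random.default_rng(1)
        img = 60 + 10 * rng.standard_normal((512, 512))      # stand-in for an SEM image (matrix)
        yy, xx = np.mgrid[0:512, 0:512]
        for _ in range(120):                                  # add bright circular "fibers"
            cy, cx = rng.integers(0, 512, size=2)
            img[(yy - cy) ** 2 + (xx - cx) ** 2 < 12 ** 2] = 200

        threshold = 128                            # assumed gray level separating fiber from matrix
        fiber_fraction = (img > threshold).mean()  # areal fraction of a transverse section
        print(f"Estimated fiber volume fraction: {fiber_fraction:.2%}")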

  19. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    NASA Astrophysics Data System (ADS)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.
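
    The dual-use idea, a single abstract tool interface usable from either Athena or a ROOT-based framework, can be illustrated schematically. The Python classes below are purely hypothetical stand-ins for the C++ interfaces described in the paper; names such as AnalysisTool, JetCalibrationTool, and apply_systematic are invented for the example.

        # Schematic illustration of a dual-use analysis tool with a systematics interface.
        from abc import ABC, abstractmethod

        class AnalysisTool(ABC):
            """Framework-agnostic tool interface (hypothetical)."""
            @abstractmethod
            def initialize(self): ...
            @abstractmethod
            def execute(self, event): ...
            @abstractmethod
            def apply_systematic(self, variation): ...

        class JetCalibrationTool(AnalysisTool):
            def __init__(self):
                self.scale = 1.0
            def initialize(self):
                print("JetCalibrationTool initialized")
            def execute(self, event):
                event["jet_pt"] = [self.scale * pt for pt in event["jet_pt"]]
                return event
            def apply_systematic(self, variation):
                # One standardized entry point for systematic variations,
                # regardless of which framework hosts the tool.
                self.scale = {"JES_up": 1.02, "JES_down": 0.98}.get(variation, 1.0)

        # Either host framework drives the same tool object in the same way.
        tool = JetCalibrationTool()
        tool.initialize()
        tool.apply_systematic("JES_up")
        print(tool.execute({"jet_pt": [50.0, 30.0]}))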

  20. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.

    2010-05-01

    The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited, as they were produced by simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluctuations in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel-bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; bombs are enriched in discrete layers without clear bomb sags. The sample areas were cleaned and impregnated with a two-component glue (EPOTEK 301). For an area of approximately 10 × 10 cm, a volume of 20 ml of mixed glue was required. Using a syringe, this low-viscosity, transparent glue could easily be applied to the target area. We found that the glue permeated the deposit to a depth of up to 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method makes it possible to cut and polish the sample and to investigate grain-size distribution, componentry, and grain morphology in situ in a 2D plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). Selected areas were then investigated through thin-section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  1. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, D.; Kueppers, U.; Castro, J. M.

    2009-12-01

    The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited, as they were produced by simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluctuations in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel-bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; bombs are enriched in discrete layers without clear bomb sags. The sample areas were cleaned and impregnated with a two-component glue (EPOTEK 301). For an area of approximately 10 × 10 cm, a volume of 20 ml of mixed glue was required. This low-viscosity, transparent glue could easily be applied to the target area with a syringe and permeated the deposit to a depth of up to 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method makes it possible to cut and polish the sample and to investigate grain-size distribution, componentry, and grain morphology in situ in a 2D plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). Selected areas were then investigated through thin-section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  2. Deconvolution-Based CT and MR Brain Perfusion Measurement: Theoretical Model Revisited and Practical Implementation Details

    PubMed Central

    Fieselmann, Andreas; Kowarschik, Markus; Ganguly, Arundhuti; Hornegger, Joachim; Fahrig, Rebecca

    2011-01-01

    Deconvolution-based analysis of CT and MR brain perfusion data is widely used in clinical practice and it is still a topic of ongoing research activities. In this paper, we present a comprehensive derivation and explanation of the underlying physiological model for intravascular tracer systems. We also discuss practical details that are needed to properly implement algorithms for perfusion analysis. Our description of the practical computer implementation is focused on the most frequently employed algebraic deconvolution methods based on the singular value decomposition. In particular, we further discuss the need for regularization in order to obtain physiologically reasonable results. We include an overview of relevant preprocessing steps and provide numerous references to the literature. We cover both CT and MR brain perfusion imaging in this paper because they share many common aspects. The combination of both the theoretical as well as the practical aspects of perfusion analysis explicitly emphasizes the simplifications to the underlying physiological model that are necessary in order to apply it to measured data acquired with current CT and MR scanners. PMID:21904538
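
    The algebraic deconvolution referred to above inverts the convolution of the arterial input function (AIF) with the flow-scaled residue function via a convolution matrix and a regularized (truncated) singular value decomposition. The Python sketch below illustrates that single step on synthetic curves; the curves and the truncation threshold are assumptions, and none of the clinical preprocessing is included.

        # Sketch: truncated-SVD deconvolution of a tissue curve with an arterial input function.
        import numpy as np

        dt = 1.0                                   # sampling interval [s]
        t = np.arange(0, 60, dt)
        aif = (t / 6.0) ** 3 * np.exp(-t / 2.0)    # synthetic arterial input function
        residue = np.exp(-t / 8.0)                 # true residue function R(t), flow set to 1
        tissue = dt * np.convolve(aif, residue)[: t.size]
        tissue += 0.01 * np.random.default_rng(2).standard_normal(t.size)

        # Lower-triangular (Toeplitz) convolution matrix built from the AIF.
        A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(t.size)]
                           for i in range(t.size)])

        U, s, Vt = np.linalg.svd(A)
        threshold = 0.2 * s.max()                  # assumed truncation level (regularization)
        s_inv = np.where(s > threshold, 1.0 / s, 0.0)
        residue_est = Vt.T @ (s_inv * (U.T @ tissue))
        print("Estimated flow-scaled residue at t=0:", residue_est[0])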

  3. Arabidopsis Co-expression Tool (ACT): web server tools for microarray-based gene expression analysis.

    PubMed

    Manfield, Iain W; Jen, Chih-Hung; Pinney, John W; Michalopoulos, Ioannis; Bradford, James R; Gilmartin, Philip M; Westhead, David R

    2006-07-01

    The Arabidopsis Co-expression Tool, ACT, ranks the genes across a large microarray dataset according to how closely their expression follows the expression of a query gene. A database stores pre-calculated co-expression results for approximately 21,800 genes based on data from over 300 arrays. These results can be corroborated by calculation of co-expression results for user-defined sub-sets of arrays or experiments from the NASC/GARNet array dataset. Clique Finder (CF) identifies groups of genes which are consistently co-expressed with each other across a user-defined co-expression list. The parameters can be altered easily to adjust cluster size and the output examined for optimal inclusion of genes with known biological roles. Alternatively, a Scatter Plot tool displays the correlation coefficients for all genes against two user-selected queries on a scatter plot which can be useful for visual identification of clusters of genes with similar r-values. User-input groups of genes can be highlighted on the scatter plots. Inclusion of genes with known biology in sets of genes identified using CF and Scatter Plot tools allows inferences to be made about the roles of the other genes in the set and both tools can therefore be used to generate short lists of genes for further characterization. ACT is freely available at www.Arabidopsis.leeds.ac.uk/ACT. PMID:16845059
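
    The core ranking operation, ordering all genes by the correlation of their expression profile with that of a query gene across many arrays, can be sketched in a few lines of Python (random data stand in for the NASC/GARNet arrays):

        # Sketch: rank genes by Pearson correlation with a query gene across arrays.
        import numpy as np

        rng = np.random.default_rng(42)
        n_genes, n_arrays = 1000, 300
        expression = rng.standard_normal((n_genes, n_arrays))    # stand-in expression matrix
        query_index = 10

        r = np.corrcoef(expression)[query_index]                 # r-values against the query gene
        ranking = np.argsort(-r)                                 # most co-expressed first
        print("Top 5 co-expressed gene indices:", ranking[1:6])  # index 0 is the query itself
        print("Their r-values:", np.round(r[ranking[1:6]], 3))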

  4. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    NASA Astrophysics Data System (ADS)

    González, Adriana; Delouille, Véronique; Jacques, Laurent

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  5. De-convoluting mixed crude oil in Prudhoe Bay Field, North Slope, Alaska

    USGS Publications Warehouse

    Peters, K.E.; Scott, Ramos L.; Zumberge, J.E.; Valin, Z.C.; Bird, K.J.

    2008-01-01

    Seventy-four crude oil samples from the Barrow arch on the North Slope of Alaska were studied to assess the relative volumetric contributions from different source rocks to the giant Prudhoe Bay Field. We applied alternating least squares to concentration data (ALS-C) for 46 biomarkers in the range C19-C35 to de-convolute mixtures of oil generated from the carbonate-rich Triassic Shublik Formation and the clay-rich Jurassic Kingak Shale and Cretaceous Hue Shale-gamma ray zone (Hue-GRZ) source rocks. ALS-C results for 23 oil samples from the prolific Ivishak Formation reservoir of the Prudhoe Bay Field indicate approximately equal contributions from Shublik Formation and Hue-GRZ source rocks (37% each), less from the Kingak Shale (26%), and little or no contribution from other source rocks. These results differ from published interpretations that most oil in the Prudhoe Bay Field originated from the Shublik Formation source rock. With few exceptions, the relative contribution of oil from the Shublik Formation decreases, while that from the Hue-GRZ increases in reservoirs along the Barrow arch from Point Barrow in the northwest to Point Thomson in the southeast (approximately 250 miles or 400 km). The Shublik contribution also decreases to a lesser degree between fault blocks within the Ivishak pool from west to east across the Prudhoe Bay Field. ALS-C provides a robust means to calculate the relative amounts of two or more oil types in a mixture. Furthermore, ALS-C does not require that pure end member oils be identified prior to analysis or that laboratory mixtures of these oils be prepared to evaluate mixing. ALS-C of biomarkers reliably de-convolutes mixtures because the concentrations of compounds in mixtures vary as linear functions of the amount of each oil type. ALS of biomarker ratios (ALS-R) cannot be used to de-convolute mixtures because compound ratios vary as nonlinear functions of the amount of each oil type.
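
    The ALS-C idea, treating each oil's biomarker concentrations as a non-negative linear mixture of end-member source contributions and alternating least-squares updates between mixing fractions and end members, can be sketched with a simple projected ALS loop. Synthetic data and three assumed end members stand in for the 46 biomarkers and the Shublik, Kingak, and Hue-GRZ sources; this is not the authors' implementation.

        # Sketch: alternating least squares on concentration data (ALS-C style unmixing).
        import numpy as np

        rng = np.random.default_rng(7)
        n_oils, n_biomarkers, n_sources = 74, 46, 3
        true_fracs = rng.dirichlet(np.ones(n_sources), size=n_oils)           # mixing fractions
        end_members = rng.uniform(1.0, 10.0, size=(n_sources, n_biomarkers))  # source signatures
        X = true_fracs @ end_members + 0.05 * rng.standard_normal((n_oils, n_biomarkers))

        # Alternate non-negative least-squares-style updates for fractions F and end members E.
        F = rng.uniform(size=(n_oils, n_sources))
        E = rng.uniform(1.0, 10.0, size=(n_sources, n_biomarkers))
        for _ in range(200):
            F = np.clip(X @ np.linalg.pinv(E), 0.0, None)
            F /= F.sum(axis=1, keepdims=True)                                 # fractions sum to one
            E = np.clip(np.linalg.pinv(F) @ X, 0.0, None)

        # Component order is arbitrary; compare recovered and true fractions by inspection.
        print("Recovered fractions, first three oils:", np.round(F[:3], 2))
        print("True fractions, first three oils:     ", np.round(true_fracs[:3], 2))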

  6. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  7. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.
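
    The estimation step, adjusting layer hydraulic conductivities until simulated flows match measured flows in a least-squares sense, can be illustrated without MODFLOW or PEST using a toy forward model in which each screened layer contributes inflow in proportion to its conductivity and thickness. Everything in the sketch below is an illustrative assumption, not the AnalyzeHOLE workflow.

        # Sketch: least-squares estimation of layer hydraulic conductivities from a flow log.
        import numpy as np
        from scipy.optimize import least_squares

        thickness = np.array([5.0, 10.0, 3.0, 12.0])      # layer thicknesses [m] (assumed)
        drawdown = 2.0                                    # pumping drawdown [m] (assumed)

        def simulated_flows(log10_K):
            """Toy forward model: cumulative inflow measured moving up the borehole."""
            layer_q = (10.0 ** log10_K) * thickness * drawdown
            return np.cumsum(layer_q[::-1])[::-1]

        rng = np.random.default_rng(4)
        true_log10_K = np.array([-4.0, -5.5, -3.5, -6.0])
        measured = simulated_flows(true_log10_K) * (1 + 0.02 * rng.standard_normal(4))

        fit = least_squares(lambda p: simulated_flows(p) - measured, x0=np.full(4, -5.0))
        print("Estimated log10(K):", np.round(fit.x, 2))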

  8. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    SciTech Connect

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

  9. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    NASA Technical Reports Server (NTRS)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government-funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include user-definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  10. Static Analysis Tools, a Practical Approach for Safety-Critical Software Verification

    NASA Astrophysics Data System (ADS)

    Lopes, R.; Vicente, D.; Silva, N.

    2009-05-01

    Static code analysis tools available today range from Lint-based syntax parsers to standards' compliance checkers to tools using more formal methods for verification. As safety-critical software complexity increases, these tools provide a means to ensure code quality, safety and dependability attributes. They also provide a means to introduce further automation in code analysis activities. The features presented by static code analysis tools are particularly interesting for V&V activities. In the scope of Independent Code Verification (IVE), two different static analysis tools have been used during Code Verification activities of the LISA Pathfinder onboard software in order to assess their contribution to the efficiency of the process and the quality of the results. Polyspace (The MathWorks) and FlexeLint (Gimpel) tools have been used as examples of high-budget and low-budget tools respectively. Several aspects have been addressed: effort has been categorised for closer analysis (e.g. setup and configuration time, execution time, analysis of the results, etc.), reported issues have been categorised according to their type, and the coverage of traditional IVE tasks by the static code analysis tools has been evaluated. Final observations were made by analysing the aforementioned subjects, namely regarding cost effectiveness, quality of results, complementarities between the results of different static code analysis tools, and the relation between automated code analysis and manual code inspection.

  11. Online Analysis of Wind and Solar Part II: Transmission Tool

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  12. Online Analysis of Wind and Solar Part I: Ramping Tool

    SciTech Connect

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  13. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    ERIC Educational Resources Information Center

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  15. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  16. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  17. Computational Tools for Parsimony Phylogenetic Analysis of Omics Data.

    PubMed

    Salazar, Jose; Amri, Hakima; Noursi, David; Abu-Asab, Mones

    2015-08-01

    High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on the creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. By way of introduction to these programs, we note that one particularly useful bioinformatics inferential model is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent with other analytical methods. OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through the standard programs MIX [from PHYLIP] and TNT) and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables by assigning each data point a binary value ("0" for normal states and "1" for abnormal states) and then outputs the converted data tables in the proper input file format for MIX or, with embedded commands, for TNT. SynpExtractor uses output files from MIX and TNT to extract the shared aberrations at each node of the cladogram, matching them with identifying labels from the dataset and exporting them to a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer an opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing (NGS) data. We make OmicsTract and SynpExtractor publicly and freely available for non-commercial use in order to strengthen and build capacity for the phylogenetic paradigm of omics analysis. PMID:26230532
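
    The conversion step performed by OmicsTract, mapping each quantitative data point to "0" (normal state) or "1" (aberrant state), can be sketched as below. The normal-range rule used here (values beyond two standard deviations of the healthy controls are flagged) is an assumption for illustration, and the MIX/TNT input formats themselves are not reproduced.

        # Sketch: binarize an omics table into 0 (normal) / 1 (aberrant) character states.
        import numpy as np

        rng = np.random.default_rng(3)
        n_controls, n_patients, n_features = 20, 15, 100
        controls = rng.normal(0.0, 1.0, size=(n_controls, n_features))
        patients = rng.normal(0.0, 1.0, size=(n_patients, n_features))
        patients[:, :10] += 3.0                                  # a block of aberrant features

        mu, sd = controls.mean(axis=0), controls.std(axis=0)
        binary = (np.abs(patients - mu) > 2.0 * sd).astype(int)  # assumed 2-sigma rule

        print("Aberrant characters per patient:", binary.sum(axis=1))
        # The rows of 'binary' would then be written out in the input format expected by MIX or TNT.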

  18. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of global climate change is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy-to-use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0 and 20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate change. Moreover, Climate Wizard is not a static product, but rather a data analysis framework designed to be used for climate change impact and adaptation planning, which can be expanded to include other information, such as downscaled future projections of hydrology, soil moisture, wildfire, vegetation, marine conditions, disease, and agricultural productivity. PMID:20016827
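
    The quantile ensemble analysis amounts to computing, for each region, percentiles of the projected change across the 16 GCMs. A minimal Python sketch with random stand-in projections:

        # Sketch: median and quartiles of projected temperature change across a GCM ensemble.
        import numpy as np

        rng = np.random.default_rng(5)
        n_gcms, n_regions = 16, 10
        projected_change = rng.normal(2.5, 0.8, size=(n_gcms, n_regions))  # degC, stand-in values

        q25, q50, q75 = np.percentile(projected_change, [25, 50, 75], axis=0)
        for i in range(n_regions):
            print(f"Region {i}: median {q50[i]:.2f} degC (IQR {q25[i]:.2f} to {q75[i]:.2f})")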

  19. GENOME RESOURCES AND COMPARATIVE ANALYSIS TOOLS FOR CARDIOVASCULAR RESEARCH

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Disorders of the cardiovascular (CV) system are often caused by the interaction of genetic and environmental factors that jointly contribute to individual susceptibility. Genomic data and bioinformatics tools generated from genome projects, coupled with functional verification, offer novel approache...

  20. SOURCE PULSE ENHANCEMENT BY DECONVOLUTION OF AN EMPIRICAL GREEN'S FUNCTION.

    USGS Publications Warehouse

    Mueller, Charles S.

    1985-01-01

    Observations of the earthquake source-time function are enhanced if path, recording-site, and instrument complexities can be removed from seismograms. Assuming that a small earthquake has a simple source, its seismogram can be treated as an empirical Green's function and deconvolved from the seismogram of a larger and/or more complex earthquake by spectral division. When the deconvolution is well posed, the quotient spectrum represents the apparent source-time function of the larger event. This study shows that with high-quality locally recorded earthquake data it is feasible to Fourier transform the quotient and obtain a useful result in the time domain. In practice, the deconvolution can be stabilized by one of several simple techniques. An application of the method is given.
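
    One of the simple stabilization techniques commonly used for spectral division is a water level: denominator amplitudes below a floor are raised to that floor before dividing, and the quotient is transformed back to the time domain. The Python sketch below applies this to synthetic traces; the traces and the water-level fraction are assumptions.

        # Sketch: water-level stabilized spectral division of two seismograms.
        import numpy as np

        dt = 0.01
        t = np.arange(0, 20, dt)
        green = np.exp(-t / 1.5) * np.sin(2 * np.pi * 2.0 * t)   # small event (empirical Green's function)
        source = np.zeros_like(t)
        source[100:180] = 1.0                                    # boxcar apparent source-time function
        large = dt * np.convolve(green, source)[: t.size]        # larger event = Green's function * source
        large += 0.001 * np.random.default_rng(6).standard_normal(t.size)

        G = np.fft.rfft(green)
        D = np.fft.rfft(large)
        water = 0.01 * np.abs(G).max()                           # assumed water-level fraction
        G_stab = np.where(np.abs(G) < water, water * np.exp(1j * np.angle(G)), G)
        stf = np.fft.irfft(D / G_stab, n=t.size) / dt
        print("Recovered source duration (s):", (stf > 0.5 * stf.max()).sum() * dt)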

  1. Image deconvolution under Poisson noise using SURE-LET approach

    NASA Astrophysics Data System (ADS)

    Xue, Feng; Liu, Jiaqi; Meng, Gang; Yan, Jing; Zhao, Min

    2015-10-01

    We propose an image deconvolution algorithm for data contaminated by Poisson noise. The SURE-LET method, which minimizes Stein's unbiased risk estimate (SURE), was first proposed to deal with Gaussian noise corruption. Our key contribution is to demonstrate that the SURE-LET approach is also applicable to Poisson-noisy images, and we propose an efficient algorithm. The formulation of SURE requires knowledge of the Gaussian noise variance. We experimentally found a simple and direct link between the noise variance estimated by the median absolute difference (MAD) method and the optimal one that leads to the best deconvolution performance in terms of mean squared error (MSE). Extensive experiments show that this optimal noise variance works satisfactorily for a wide range of natural images.
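
    The MAD-based noise estimate mentioned above can be sketched as follows: high-pass the image with simple pixel differences so that mostly noise remains, then scale the median absolute deviation to a Gaussian standard deviation (the 0.6745 factor). The empirical link to the "optimal" variance reported by the authors is not reproduced here.

        # Sketch: robust noise standard-deviation estimate via the median absolute deviation (MAD).
        import numpy as np

        rng = np.random.default_rng(9)
        clean = np.outer(np.hanning(256), np.hanning(256)) * 10.0   # smooth stand-in image
        noisy = clean + rng.normal(0.0, 2.0, clean.shape)           # true sigma = 2.0

        # Horizontal first differences suppress the smooth image content, leaving mostly noise.
        diffs = np.diff(noisy, axis=1).ravel() / np.sqrt(2.0)
        sigma_mad = np.median(np.abs(diffs - np.median(diffs))) / 0.6745
        print(f"MAD noise estimate: {sigma_mad:.2f} (true value 2.0)")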

  2. Deconvolution of impulse response by the truncated singular value decomposition

    NASA Astrophysics Data System (ADS)

    Cha, Hao; Dai, Mingzhen

    1992-12-01

    The impulse response of the target plays a very important part in target identification. The impulse response of an electromagnetic system can be obtained by solving a deconvolution problem. In this paper, the ill-posed problem is solved directly in the time domain by the truncated singular value decomposition (TSVD) method. The small singular values that cause the problem to be ill-posed are discarded, so that the deconvolution problem becomes well-posed. The impulse response of a sphere is used in simulations. The results show that the TSVD algorithm can improve the deconvolution result by about 10 dB (SNR) compared with the conventional conjugate gradient method.
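
    A minimal truncated-SVD deconvolution in the spirit of the approach described above can be sketched in Python; the synthetic kernel, spike response, and truncation level are assumptions.

        # Sketch: TSVD deconvolution of a measured response y = H x + noise.
        import numpy as np

        n = 200
        t = np.arange(n)
        h = np.exp(-t / 10.0)                                    # incident-pulse (system) kernel
        x_true = np.zeros(n)
        x_true[40], x_true[90] = 1.0, -0.6                       # impulse response to recover
        y = np.convolve(h, x_true)[:n] + 0.01 * np.random.default_rng(8).standard_normal(n)

        H = np.array([[h[i - j] if i >= j else 0.0 for j in range(n)] for i in range(n)])
        U, s, Vt = np.linalg.svd(H)
        keep = s > 0.05 * s.max()                                # discard small singular values
        x_est = Vt[keep].T @ ((U[:, keep].T @ y) / s[keep])
        print("Largest recovered peaks near indices:", np.argsort(-np.abs(x_est))[:2])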

  3. Deconvolution of Energy Spectra in the ATIC Experiment

    NASA Technical Reports Server (NTRS)

    Batkov, K. E.; Panov, A. D.; Adams, J. H.; Ahn, H. S.; Bashindzhagyan, G. L.; Chang, J.; Christl, M.; Fazley, A. R.; Ganel, O.; Gunasigha, R. M.; Guzik, T. G.

    2005-01-01

    The Advanced Thin Ionization Calorimeter (ATIC) balloon-borne experiment is designed to perform cosmic-ray elemental spectra measurements from below 100 GeV up to tens of TeV for nuclei from hydrogen to iron. The instrument is composed of a silicon matrix detector followed by a carbon target, interleaved with scintillator tracking layers, and a segmented BGO calorimeter composed of 320 individual crystals totalling 18 radiation lengths, used to determine the particle energy. The technique for deconvolution of the energy spectra measured in the thin calorimeter is based on detailed simulations of the response of the ATIC instrument to different cosmic-ray nuclei over a wide energy range. The method of deconvolution is described, and the energy spectrum of carbon obtained with this technique is presented.

  4. A pitfall in the reconstruction of fibre ODFs using spherical deconvolution of diffusion MRI data

    PubMed Central

    Parker, G.D.; Marshall, D.; Rosin, P.L.; Drage, N.; Richmond, S.; Jones, D.K.

    2013-01-01

    Diffusion weighted (DW) MRI facilitates non-invasive quantification of tissue microstructure and, in combination with appropriate signal processing, three-dimensional estimates of fibrous orientation. In recent years, attention has shifted from the diffusion tensor model, which assumes a unimodal Gaussian diffusion displacement profile to recover fibre orientation (with various well-documented limitations), towards more complex high angular resolution diffusion imaging (HARDI) analysis techniques. Spherical deconvolution (SD) approaches assume that the fibre orientation density function (fODF) within a voxel can be obtained by deconvolving a common single fibre response function from the observed set of DW signals. In practice, this common response function is not known a priori and thus an estimated fibre response must be used. Here the establishment of this single-fibre response function is referred to as calibration. This work examines the vulnerability of two different SD approaches to inappropriate response function calibration: (1) constrained spherical harmonic deconvolution (CSHD), a technique that exploits spherical harmonic basis sets, and (2) damped Richardson-Lucy (dRL) deconvolution, a technique based on the standard Richardson-Lucy deconvolution. Through simulations, the impact of a discrepancy between the calibrated diffusion profiles and the observed (Target) DW-signals in both single and crossing-fibre configurations was investigated. The results show that CSHD produces spurious fODF peaks (consistent with well known ringing artefacts) as the discrepancy between calibration and target response increases, while dRL demonstrates a lower over-all sensitivity to miscalibration (with a calibration response function for a highly anisotropic fibre being optimal). However, dRL demonstrates a reduced ability to resolve low anisotropy crossing-fibres compared to CSHD. It is concluded that the range and spatial-distribution of expected single-fibre anisotropies within an image must be carefully considered to ensure selection of the appropriate algorithm, parameters and calibration. Failure to choose the calibration response function carefully may severely impact the quality of any resultant tractography. PMID:23085109

  5. Bayesian deconvolution of mass and ion mobility spectra: from binary interactions to polydisperse ensembles.

    PubMed

    Marty, Michael T; Baldwin, Andrew J; Marklund, Erik G; Hochberg, Georg K A; Benesch, Justin L P; Robinson, Carol V

    2015-04-21

    Interpretation of mass spectra is challenging because they report a ratio of two physical quantities, mass and charge, which may each have multiple components that overlap in m/z. Previous approaches to disentangling the two have focused on peak assignment or fitting. However, the former struggle with complex spectra, and the latter are generally computationally intensive and may require substantial manual intervention. We propose a new data analysis approach that employs a Bayesian framework to separate the mass and charge dimensions. On the basis of this approach, we developed UniDec (Universal Deconvolution), software that provides a rapid, robust, and flexible deconvolution of mass spectra and ion mobility-mass spectra with minimal user intervention. Incorporation of the charge-state distribution in the Bayesian prior probabilities provides separation of the m/z spectrum into its physical mass and charge components. We have evaluated our approach using systems of increasing complexity, enabling us to deduce lipid binding to membrane proteins, to probe the dynamics of subunit exchange reactions, and to characterize polydispersity in both protein assemblies and lipoprotein Nanodiscs. The general utility of our approach will greatly facilitate analysis of ion mobility and mass spectra. PMID:25799115

  6. Exothermic Behavior of Thermal Decomposition of Sodium Percarbonate: Kinetic Deconvolution of Successive Endothermic and Exothermic Processes.

    PubMed

    Nakano, Masayoshi; Wada, Takeshi; Koga, Nobuyoshi

    2015-09-24

    This study focused on the kinetic modeling of the thermal decomposition of sodium percarbonate (SPC, sodium carbonate-hydrogen peroxide (2/3)). The reaction is characterized by apparently different kinetic profiles of mass-loss and exothermic behavior as recorded by thermogravimetry and differential scanning calorimetry, respectively. This phenomenon results from a combination of different kinetic features of the reaction involving two overlapping mass-loss steps controlled by the physico-geometry of the reaction and successive endothermic and exothermic processes caused by the detachment and decomposition of H2O2(g). For kinetic modeling, the overall reaction was initially separated into endothermic and exothermic processes using kinetic deconvolution analysis. Then, both the endothermic and exothermic processes were further separated into two reaction steps accounting for the physico-geometrically controlled reaction that occurs in two steps. Kinetic modeling through kinetic deconvolution analysis clearly illustrates that the appearance of the net exothermic effect is the result of a slight delay of the exothermic process relative to the endothermic process in each physico-geometrically controlled reaction step. This demonstrates that the kinetic modeling attempted in this study is useful for interpreting the exothermic behavior of solid-state reactions such as the oxidative decomposition of solids and the thermal decomposition of oxidizing agents. PMID:26371394

  7. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) the ability to converge to solutions more robustly, and 3) higher-fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferable to a single integrated tool. The tool suite, its characteristics, and its applicability are described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  8. Blind deconvolution estimation of fluorescence measurements through quadratic programming

    NASA Astrophysics Data System (ADS)

    Campos-Delgado, Daniel U.; Gutierrez-Navarro, Omar; Arce-Santana, Edgar R.; Skala, Melissa C.; Walsh, Alex J.; Jo, Javier A.

    2015-07-01

    Time-deconvolution of the instrument response from fluorescence lifetime imaging microscopy (FLIM) data is usually necessary for accurate fluorescence lifetime estimation. In many applications, however, the instrument response is not available, and a blind deconvolution approach is required. An iterative methodology is proposed to address the blind deconvolution problem departing from a dataset of FLIM measurements. In our formulation, a linear combination of a Laguerre-function basis models the fluorescence impulse response of the sample at each spatial point. Our blind deconvolution estimation (BDE) algorithm is formulated as a quadratic approximation problem, where the decision variables are the samples of the instrument response and the scaling coefficients of the basis functions. Because the approximation cost function depends bilinearly on the decision variables, the estimation process is nonlinear, and an alternating least-squares scheme iteratively solves the approximation problem. Our proposal searches for the samples of the instrument response with a global perspective, and for the scaling coefficients of the basis functions locally at each spatial point. To speed up convergence, the iterative methodology first applies a least-squares solution for the instrument response and quadratic programming for the scaling coefficients to just a subset of the measured fluorescence decays, yielding an initial estimate of the instrument response. After convergence, the final stage computes the fluorescence impulse response at all spatial points. A comprehensive validation stage considers synthetic and experimental FLIM datasets of ex vivo atherosclerotic plaques and human breast cancer cell samples, and highlights the advantages of the proposed BDE algorithm under different noise and initial conditions in the iterative scheme and different parameters of the proposal.
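
    A minimal sketch of the alternating least-squares idea follows, with two substitutions that are assumptions of this illustration rather than features of the paper: a decaying-exponential basis stands in for the Laguerre basis, and SciPy's non-negative least squares stands in for the quadratic program. SciPy >= 1.5 is assumed for scipy.linalg.convolution_matrix.

```python
# Hedged alternating least-squares sketch of blind deconvolution:
# y_p = u * h_p with h_p = B @ c_p; alternate between solving for the
# per-pixel coefficients c_p (non-negative least squares) and for the
# shared instrument response u (one global least-squares problem).
import numpy as np
from scipy.linalg import convolution_matrix
from scipy.optimize import nnls

def basis(n, taus):
    t = np.arange(n)
    B = np.exp(-t[:, None] / np.asarray(taus)[None, :])   # stand-in for Laguerre functions
    return B / np.linalg.norm(B, axis=0)

def blind_deconvolve(Y, taus, n_u, n_iter=30):
    """Y: (n_samples, n_pixels) measured decays.  Returns (u, C).  Small datasets only."""
    n = Y.shape[0]
    B = basis(n, taus)
    u = np.ones(n_u) / n_u                                 # flat initial instrument response
    C = np.zeros((B.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        # 1) Fix u: solve the basis coefficients of every pixel.
        A = convolution_matrix(u, n, mode='full')[:n] @ B
        for p in range(Y.shape[1]):
            C[:, p], _ = nnls(A, Y[:, p])
        # 2) Fix the coefficients: one global least squares for u.
        H = B @ C                                          # impulse response of each pixel
        M = np.vstack([convolution_matrix(H[:, p], n_u, mode='full')[:n]
                       for p in range(Y.shape[1])])
        u, *_ = np.linalg.lstsq(M, Y.T.reshape(-1), rcond=None)
        u /= max(np.abs(u).sum(), 1e-12)                   # fix the u/C scale ambiguity
    return u, C
```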

  9. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    SciTech Connect

    Jodoin, Vincent J; Lee, Ronald W; Peplow, Douglas E.; Lefebvre, Jordan P

    2011-01-01

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  10. Detection of Flat Bottom Holes Using Sparse Deconvolution

    NASA Astrophysics Data System (ADS)

    Carcreff, Ewen; Bourguignon, Sébastien; Duclos, Aroune; Simon, Laurent; Idier, Jérôme

    Ultrasonic non-destructive testing (NDT) is an efficient method to detect flaws in industrial parts. The detection of flat bottom holes (FBH) is a typical problem, which serves as a reference in the NDT community. It is nevertheless a hard task when the FBH echo arrives close to the backwall echo, because the two overlap. In this paper, we propose a sparse deconvolution approach to separate the FBH echo from the backwall echo and hence to detect the FBH with high resolution. From experimental data acquired with an FBH drilled in an aluminum plate, we show that the FBH echo can be modeled as a high-pass filtered version of the incident ultrasonic wave. We therefore build a propagation model that depends on the instrument response and on a specific attenuation function. A sparse deconvolution technique is then proposed to precisely locate the flaw and backwall positions. Applied to real data, the developed approach proves more efficient than conventional techniques such as gating or invariant sparse deconvolution.
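
    A hedged sketch of the sparse-deconvolution idea: recover a sparse reflectivity sequence from an A-scan by minimising a least-squares misfit plus an L1 penalty with the iterative shrinkage-thresholding algorithm (ISTA). The toy wavelet, regularisation weight, and spike positions are assumptions of this illustration; the paper's model additionally includes the frequency-dependent attenuation of the FBH echo.

```python
# L1-regularised (sparse) deconvolution with ISTA: separate two closely
# spaced echoes that overlap in the raw signal.  All signals are synthetic.
import numpy as np

def ista_deconvolve(y, h, lam=0.05, n_iter=500):
    n = len(y)
    H = np.zeros((n, n))
    for i in range(n):                            # convolution matrix, columns = shifted wavelet
        m = min(len(h), n - i)
        H[i:i + m, i] = h[:m]
    L = np.linalg.norm(H, 2) ** 2                 # Lipschitz constant of the data-fit gradient
    x = np.zeros(n)
    for _ in range(n_iter):
        z = x - (H.T @ (H @ x - y)) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

t = np.arange(200)
h = np.exp(-0.05 * t[:40]) * np.sin(0.6 * t[:40])             # toy ultrasonic wavelet
x_true = np.zeros(200); x_true[80] = 1.0; x_true[95] = -0.4   # backwall echo + FBH echo
y = np.convolve(x_true, h)[:200] + np.random.normal(0, 0.01, 200)
x_hat = ista_deconvolve(y, h)
print("significant reflector positions:", np.nonzero(np.abs(x_hat) > 0.1)[0])
```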

  11. Hopfield neural network deconvolution for weak lensing measurement

    NASA Astrophysics Data System (ADS)

    Nurbaeva, G.; Tewes, M.; Courbin, F.; Meylan, G.

    2015-05-01

    Weak gravitational lensing has the potential to place tight constraints on the equation of state of dark energy. However, this will only be possible if shear measurement methods can reach the required level of accuracy. We present a new method for measuring the ellipticity of galaxies used in weak lensing surveys. The method makes use of direct deconvolution of the data by the total point spread function (PSF). We adopt a linear algebra formalism that represents the PSF as a Toeplitz matrix. This allows us to solve the convolution equation by applying the Hopfield neural network iterative scheme. The ellipticity of galaxies in the deconvolved images is then measured using second-order moments of the autocorrelation function of the images. To our knowledge, this is the first time full image deconvolution has been used to measure weak lensing shear. We apply our method to the simulated weak lensing data proposed in the GREAT10 challenge and obtain a quality factor of Q = 87. This result is obtained after applying image denoising to the data, prior to the deconvolution. The additive and multiplicative biases on the shear power spectrum are then 𝒜 = +0.09 × 10⁻⁴ and ℳ/2 = +0.0357, respectively.
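
    The Toeplitz formulation can be sketched in one dimension as below; a plain gradient iteration on the quadratic energy stands in for the Hopfield network update used in the paper, and the PSF, sizes, and iteration count are illustrative assumptions.

```python
# 1-D sketch: represent convolution by the PSF as a Toeplitz matrix and
# solve the convolution equation by an iterative, energy-minimising update.
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_deconvolve(blurred, psf, n_iter=200):
    n = len(blurred)
    col = np.zeros(n); col[:len(psf)] = psf
    A = toeplitz(col, np.r_[psf[0], np.zeros(n - 1)])   # causal convolution operator
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iter):
        x -= step * (A.T @ (A @ x - blurred))           # descend on ||Ax - b||^2
    return x

# Usage: restored = toeplitz_deconvolve(observed_profile, psf_samples)
```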

  12. Deconvolution methods for LINC/NIRVANA data reduction

    NASA Astrophysics Data System (ADS)

    Anconelli, Barbara; Bertero, Mario; Boccacci, Patrizia; Carbillet, Marcel; Lanteri, Henri; Correia, Serge

    2004-10-01

    LINC/NIRVANA (LN) is the German-Italian beam combiner for the Large Binocular Telescope (LBT). It is a Fizeau interferometer and it will provide multiple images of the same target corresponding to different orientations of the baseline. For each of these images the resolution is not uniform over the field, since it is that of a 22.8 m mirror in the direction of the baseline and that of an 8.4 m mirror in the orthogonal direction. Therefore a unique high-resolution image can only be obtained by means of deconvolution methods. Four years of ongoing work by our group on this problem has already clarified the effects of partial adaptive optics correction and partial coverage of the u,v plane, and has produced the Software Package AIRY, a set of IDL-based, CAOS-compatible modules that can be used for simulation and/or deconvolution of multiple images from the LBT instrument LN. In this paper we present a general approach to the design of methods for the simultaneous deconvolution of multiple images of the same object. These include both quick-look methods, to be used to routinely process LN images, and ad hoc methods for specific classes of astronomical objects. We describe several examples of these methods whose implementation and validation are in progress. Finally we present the latest version of the Software Package AIRY.

  13. Tissue-specific sparse deconvolution for brain CT perfusion.

    PubMed

    Fang, Ruogu; Jiang, Haodi; Huang, Junzhou

    2015-12-01

    Enhancing perfusion maps in low-dose computed tomography perfusion (CTP) for cerebrovascular disease diagnosis is a challenging task, especially for the low-contrast tissue categories where infarct core and ischemic penumbra usually occur. Sparse perfusion deconvolution has recently been proposed to improve the image quality and diagnostic accuracy of low-dose perfusion CT by extracting complementary information from high-dose perfusion maps to restore the low-dose maps using a joint spatio-temporal model. However, the low-contrast tissue classes where infarct core and ischemic penumbra are likely to occur in cerebral perfusion CT tend to be over-smoothed, leading to loss of essential biomarkers. In this paper, we propose a tissue-specific sparse deconvolution approach to preserve the subtle perfusion information in the low-contrast tissue classes. We first build tissue-specific dictionaries from segmentations of high-dose perfusion maps using online dictionary learning, and then perform deconvolution-based hemodynamic parameter estimation for block-wise tissue segments on the low-dose CTP data. Extensive validation on clinical datasets of patients with cerebrovascular disease demonstrates the superior performance of our proposed method compared with the state of the art, potentially improving diagnostic accuracy by increasing the differentiation between normal and ischemic tissues in the brain. PMID:26055434

  14. Spherical Deconvolution Improves Quality of Single Particle Reconstruction

    PubMed Central

    Kishchenko, Gregory P.; Leith, Ardean

    2014-01-01

    One single-particle reconstruction technique is the reconstruction of macromolecules from projection images of randomly oriented particles (SPRR). In SPRR the reliability, and consequently the interpretation, of the final reconstruction is affected by errors arising from incorrect assignment of projection angles to individual particles. In order to improve the resolution of SPRR we studied the influence of imperfect angle assignment on 3D blurring. We find that this blurring can be described as a point spread function (PSF) that depends on the distance from the geometrical center of the reconstructed volume, and that blurring is higher at the periphery. This particular PSF can be described by an almost purely tangential angular function with a negligible radial component. We have developed a reliable algorithm for spherical deconvolution of the 3D reconstruction. This spherical deconvolution operation was tested on reconstructions of GroEL and mitochondrial ribosomes. We show that spherical deconvolution improves the quality of SPRR by reducing blurring and enhancing high-frequency components, particularly near the periphery of the reconstruction. PMID:24841283

  15. Analysis of the structure of vibration signals for tool wear detection

    NASA Astrophysics Data System (ADS)

    Alonso, F. J.; Salgado, D. R.

    2008-04-01

    The objective of this work is to develop a reliable tool condition monitoring system (TCMS) for industrial application. The proposed TCMS is based on the analysis of the structure of the tool vibration signals using singular spectrum analysis (SSA) and cluster analysis. SSA is a novel non-parametric technique of time series analysis that decomposes the acquired tool vibration signals into an additive set of time series. Cluster analysis is used to group the SSA decomposition in order to obtain several independent components in the frequency domain, which are presented to a feedforward back-propagation (FFBP) neural network to determine the tool flank wear. The results show that this use of SSA and cluster analysis provides an efficient automatic signal processing method, and that the proposed TCMS, based on this procedure, is fast and reliable for tool wear monitoring.
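
    The SSA step (embedding the vibration signal in a trajectory matrix, taking its SVD, and reconstructing additive components by diagonal averaging) can be sketched as follows. The window length is an assumption, and the clustering of components and the FFBP network used for wear estimation are not shown.

```python
# Hedged SSA sketch: decompose a 1-D signal into additive components.
# The components returned here sum back (to numerical precision) to the input.
import numpy as np

def ssa_decompose(x, window):
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix (window x k)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    components = []
    for i in range(len(s)):
        Xi = s[i] * np.outer(U[:, i], Vt[i])                   # rank-1 elementary matrix
        # Anti-diagonal averaging maps each elementary matrix back to a time series.
        comp = np.array([np.mean(np.diag(Xi[:, ::-1], k=d))
                         for d in range(k - 1, -window, -1)])
        components.append(comp)
    return np.array(components)

# Usage: comps = ssa_decompose(vibration_signal, window=50)
```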

  16. Simultaneous deconvolution of the bivariate distribution of molecular weight and chemical composition of polyolefins made with ziegler-natta catalysts.

    PubMed

    Alghyamah, Abdulaziz A; Soares, João B P

    2009-02-18

    Polyolefins made with Ziegler-Natta catalysts have non-uniform distributions of molecular weight (MWD) and chemical composition (CCD). The MWD is usually measured by high-temperature gel permeation chromatography (GPC) and the CCD by either temperature rising elution fractionation (TREF) or crystallization analysis fractionation (CRYSTAF). A mathematical model is needed to quantify the information provided by these analytical techniques and to relate it to the presence of multiple site types on Ziegler-Natta catalysts. We developed a robust computer algorithm to deconvolute the MWD and CCD of polyolefins simultaneously using Flory's most probable distribution and the cumulative CCD component of Stockmayer's distribution, which includes the soluble fraction commonly present in linear low-density polyethylene (LLDPE) resins, and we have applied this procedure for the first time to several industrial LLDPE resins. The deconvolution results are reproducible and consistent with theoretical expectations. PMID:21706614
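
    The molecular weight half of the problem can be sketched as a non-negative least-squares fit of Flory most-probable components on a log M axis, as below. The Mn grid, mass fractions, and the particular form of the weight distribution are assumptions of this illustration, and the simultaneous CCD fit via Stockmayer's distribution is not shown.

```python
# Hedged sketch: deconvolute a GPC weight distribution into Flory
# most-probable components (one per catalyst site type) with NNLS.
import numpy as np
from scipy.optimize import nnls

def flory_wlogM(M, Mn):
    # Weight distribution of one site type on a log10(M) axis (assumed form).
    return np.log(10.0) * (M / Mn) ** 2 * np.exp(-M / Mn)

def deconvolute_mwd(logM, w_measured, Mn_grid):
    M = 10.0 ** logM
    A = np.column_stack([flory_wlogM(M, Mn) for Mn in Mn_grid])
    fractions, residual = nnls(A, w_measured)
    return fractions / max(fractions.sum(), 1e-12), residual

logM = np.linspace(3, 7, 400)
Mn_grid = np.array([1e4, 2e4, 5e4, 1e5, 3e5, 1e6])        # candidate site-type Mn values
w_synthetic = 0.6 * flory_wlogM(10.0 ** logM, 2e4) + 0.4 * flory_wlogM(10.0 ** logM, 3e5)
fractions, _ = deconvolute_mwd(logM, w_synthetic, Mn_grid)
print("recovered mass fractions:", np.round(fractions, 3))   # ~0.6 and ~0.4 at the true sites
```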

  17. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease of use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed instruction tailored to their experience level along with proper support and mentoring. This project was funded by a grant from the National Science Foundation, Grant # PHY1157078.

  18. Lagrangian analysis. Modern tool of the dynamics of solids

    NASA Astrophysics Data System (ADS)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting, are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered : the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors insist that one or the other groups of data (stress and particle velocity) are sufficient to integrate the conservation equations in the case of the plane motion when both groups of data are necessary in the case of the spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The Lagrangian specificity of the required measurements is assured by the fact that a transducer enclosed within a solid material is necessarily linked in motion to the particles of the material which surround it. This Lagrangian instrumentation is described in the second chapter. The authors are concerned with the techniques considered today to be the most effective. These are, for stress : piezoresistive gauges (50 Ω and low impedance) and piezoelectric techniques (PVF2 gauges, quartz transducers) ; and for particle velocity : electromagnetic gauges, VISAR and IDL Doppler laser interferometers. In each case both the physical principles as well as techniques of use are set out in detail. For the most part, the authors use their own experience to describe the calibration of these instrumentation systems and to compare their characteristics : measurement range, response time, accuracy, useful recording time, detection area... These characteristics should be taken into account by the physicist when he has to choose the instrumentation systems best adapted to the Lagrangian analysis he intends to apply to any given material. The discussion at the end of chapter 2 should guide his choice both for plane and spherical one-dimensional motions. The third chapter examines to what extent the accuracy of Lagrangian analysis is affected by the accuracies of the numerical analysis methods and experimental techniques. 
By means of a discussion of different cases of analysis, the authors want to make the reader aware of the different kinds of sources of errors that may be encountered. This work brings up to date the state of studies on Lagrangian analysis methods based on a wide review of bibliographical sources together with the contribution made to research in this field by the four authors themselves in the course of the last ten years.

  19. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    NASA Astrophysics Data System (ADS)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.

  1. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain the stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.

  2. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    SciTech Connect

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  3. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    PubMed

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite also includes one script that generates Metatool input, CBasis2Metatool, based on a Metatool output file filtered by a list of convex basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. PMID:21554926

  4. Mechanisms proposed for spectrogram correlation and transformation deconvolution in FM bat sonar

    NASA Astrophysics Data System (ADS)

    Simmons, James A.

    2005-09-01

    Big brown bats use time/frequency distributions to represent FM biosonar pulses and echoes as a consequence of reception through frequency-tuned channels of the inner ear and subsequent processing by similarly tuned neural channels in the auditory pathway. Integration time is 350 μs, yet delay resolution is 2-10 μs, which must be based on detecting changes in the echo spectrum caused by interference between overlapping reflections inside the integration time. However, bats perceive not merely the echo interference spectrum but the numerical value of the delay separation from the spectrum, which requires deconvolution. Because spectrograms are the initial representation, this process is spectrogram correlation and transformation (SCAT). Proposed SCAT deconvolution mechanisms include extraction of echo envelope ripples for time-domain spectrometry, cepstral analysis of echoes, use of coherent or noncoherent reconstruction with basis functions, segmentation of onsets of overlapping replicas at moderate to long time separations, and localization of the occurrence of spectral interference ripples at specific times within dechirped spectrograms. Physiological evidence from single-unit recordings reveals a cepstral-like time-frequency process based on freqlets; both single-unit and multiunit responses reveal activity that may prove to reflect time-domain basis functions, and multiunit responses exhibit modulations by onset and envelope ripple. [Work supported by NIH and ONR.]
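
    The cepstral mechanism can be illustrated with a toy two-glint echo: the log spectrum of two overlapping replicas carries a ripple whose quefrency equals the delay separation, so a cepstral peak reads out that numerical value. The sample rate, FM sweep, glint spacing, and noise level below are illustrative assumptions, not measured bat signals.

```python
# Toy cepstral read-out of the delay separation between two overlapping echoes.
import numpy as np

fs = 400e3                                              # assumed sample rate, Hz
n = 2048
tau = np.arange(400) / fs                               # 1 ms call duration
sweep = np.sin(2 * np.pi * (160e3 * tau - 70e6 * tau ** 2))   # toy FM downsweep, ~160 -> 20 kHz
pulse = np.zeros(n); pulse[:400] = sweep * np.hanning(400)

delay = 50e-6                                           # two glints separated by 50 microseconds
echo = pulse + 0.8 * np.roll(pulse, int(delay * fs))
echo += np.random.normal(0, 0.01, n)                    # measurement noise

log_spec = np.log(np.abs(np.fft.rfft(echo)) + 1e-6)
cepstrum = np.abs(np.fft.irfft(log_spec))
quefrency = np.arange(len(cepstrum)) / fs
lo, hi = int(20e-6 * fs), int(150e-6 * fs)              # search plausible glint delays only
peak = quefrency[lo + np.argmax(cepstrum[lo:hi])]
print(f"cepstral peak at {peak * 1e6:.0f} microseconds (true separation: 50)")
```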

  5. Thorium concentrations in the lunar surface. V - Deconvolution of the central highlands region

    NASA Technical Reports Server (NTRS)

    Metzger, A. E.; Etchegaray-Ramirez, M. I.; Haines, E. L.

    1982-01-01

    The distribution of thorium in the lunar central highlands measured from orbit by the Apollo 16 gamma-ray spectrometer is subjected to a deconvolution analysis to yield improved spatial resolution and contrast. Use of two overlapping data fields for complete coverage also provides a demonstration of the technique's ability to model concentrations several degrees beyond the data track. Deconvolution reveals an association between Th concentration and the Kant Plateau, Descartes Mountain and Cayley plains surface formations. The Kant Plateau and Descartes Mountains model with Th less than 1 part per million, which is typical of farside highlands but is infrequently seen over any other nearside highland portions of the Apollo 15 and 16 ground tracks. It is noted that, if the Cayley plains are the result of basin-forming impact ejecta, the distribution of Th concentration with longitude supports an origin from the Imbrium basin rather than the Nectaris or Orientale basins. Nectaris basin materials are found to have a Th concentration similar to that of the Descartes Mountains, evidence that the latter may have been emplaced as Nectaris basin impact deposits.

  6. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    PubMed

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. PMID:26679001

  7. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  9. Assessing the surgeon's technical skills: analysis of the available tools.

    PubMed

    Memon, Muhammed Ashraf; Brigden, David; Subramanya, Manjunath S; Memon, Breda

    2010-05-01

    The concept of assessing competency in surgical practice is not new and has taken on an added urgency in view of the recent high-profile inquiries into "botched cases" involving surgeons of various levels in different parts of the world. Until very recently, surgeons in the United Kingdom and other parts of the world, although required to undergo formal and compulsory examinations to test their factual knowledge and decision making, were not required to demonstrate technical ability. Therefore, there existed (and still exist) no objective assessment criteria to test trainees' surgical skill, especially during the exit examination, which, if passed, provides unrestricted license to surgeons to practice their specialties. However, with the introduction of a new curriculum by various surgical societies and a demand from the lay community for better standards, new assessment tools are emerging that focus on technical competency and that could objectively and reliably measure surgical skills. Furthermore, training authorities and hospitals are keen to embrace these changes for satisfactory accreditation and reaccreditation processes and to assure the public of the safety of the public and private health care systems. In the United Kingdom, two new surgical tools (Surgical Direct Observation of Procedural Skill, and Procedure Based Assessments) have been simultaneously introduced to assess surgical trainees. The authors describe these two assessment methods, provide an overview of other assessment tools currently or previously used to assess surgical skills, critically analyze the two new assessment tools, and reflect on the merit of simultaneously introducing them. PMID:20520044

  10. Mapping and spatiotemporal analysis tool for hydrological data: Spellmap

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Lack of data management and analysis tools is one of the major limitations to effectively evaluating and using large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...

  11. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  12. Tools for Education Policy Analysis [with CD-ROM].

    ERIC Educational Resources Information Center

    Mingat, Alain; Tan, Jee-Peng

    This manual contains a set of tools to assist policymakers in analyzing and revamping educational policy. Its main focus is on some economic and financial aspects of education and selected features in the arrangements for service delivery. Originally offered as a series of training workshops for World Bank staff to work with clients in the…

  13. Maximum correlated Kurtosis deconvolution and application on gear tooth chip fault detection

    NASA Astrophysics Data System (ADS)

    McDonald, Geoff L.; Zhao, Qing; Zuo, Ming J.

    2012-11-01

    In this paper a new deconvolution method is presented for the detection of gear and bearing faults from vibration data. The proposed maximum correlated kurtosis deconvolution method takes advantage of the periodic nature of the faults as well as the impulse-like vibration behaviour associated with most types of faults. The results are compared to the standard minimum entropy deconvolution method on both simulated and experimental data. The experimental data come from a gearbox with a gear chip fault, and the results are compared between healthy and faulty vibrations. The results indicate that the proposed method performs considerably better than the traditional minimum entropy deconvolution method, often several times better at fault detection. In addition to this improved performance, deconvolution of separate fault periods is possible, allowing for concurrent fault detection. Finally, an online implementation is proposed and shown to perform well and to be computationally achievable on a personal computer.
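
    The correlated kurtosis criterion that the proposed filter maximises can be sketched as a metric, as below; only the metric is shown (the closed-form filter update derived in the paper is omitted), and the test signal is a synthetic stand-in for gearbox vibration.

```python
# Correlated kurtosis of M shifts for an assumed fault period T (in samples):
# it rewards impulse trains whose impulses repeat with that period.
import numpy as np

def correlated_kurtosis(y, T, M=1):
    y = np.asarray(y, dtype=float)
    prod = y.copy()
    for m in range(1, M + 1):
        shifted = np.zeros_like(y)
        shifted[m * T:] = y[:-m * T]
        prod = prod * shifted
    return np.sum(prod ** 2) / np.sum(y ** 2) ** (M + 1)

rng = np.random.default_rng(0)
noise = rng.normal(0, 1, 1000)
faulty = noise.copy()
faulty[::100] += 8.0                                  # fault impulses every 100 samples
print(correlated_kurtosis(noise, T=100), correlated_kurtosis(faulty, T=100))
```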

  14. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    SciTech Connect

    Rath, Frank

    2008-05-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.

  15. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    SciTech Connect

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  16. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  17. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision-making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called the Mission Planning Lab (MPL).

  18. Determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1991-01-01

    The final report on the determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution is presented. Papers and theses prepared during the reporting period are included. Of particular note is the central investigation itself: a methodology was developed to determine design and operation parameters for error minimization when deconvolution is included in the data analysis. An error surface is plotted versus the signal-to-noise ratio (SNR) and all parameters of interest. Instrumental characteristics determine a curve in this space. The SNR and parameter values that give the projection from the curve to the surface corresponding to the smallest error are the optimum values. These values are constrained by the curve and so will not necessarily correspond to an absolute minimum in the error surface.

  19. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  20. Analysis tools for the calibration and commissioning of the AOF

    NASA Astrophysics Data System (ADS)

    Garcia-Rissmann, Aurea; Kolb, Johann; Le Louarn, Miska; Madec, Pierre-Yves; Muller, Nicolas

    2013-12-01

    The Adaptive Optics Facility (AOF) is an AO-oriented upgrade envisaged to be implemented at UT4 in Paranal in 2013-2014, which could also serve as a test case for the E-ELT. Counting on the largest deformable secondary mirror ever built (1170 actuators) and on four off-axis Na laser launch telescopes, the AOF will operate in distinct modes (GLAO, LTAO, SCAO), in accordance with the instruments attached to the two telescope Nasmyth ports (GALACSI+MUSE, GRAAL+HAWK-I) and to the Cassegrain port (ERIS). Tools are under development to allow fast testing of important parameters for these systems at commissioning and for subsequent assessment of telemetry data. These concern the determination of turbulence parameters and Cn2 profiling, measurement of Strehl ratios and ensquared energies, misregistration calculation, bandwidth and overall performance, etc. Our tools are implemented as graphical user interfaces developed in the Matlab environment and will be able to grab, through a dedicated server, data saved in SPARTA standards. We present the tools developed to date and discuss what can be obtained from the AOF, based on simulations.

  1. The digital penalized LMS deconvolution method for TPC X-ray polarimeter signal processing

    NASA Astrophysics Data System (ADS)

    He, L.; Deng, Z.; Li, H.; Liu, Y. N.; Feng, H.

    2015-04-01

    This article presents the Digital Penalized LMS (Least Mean Square) deconvolution method for processing the output signal of X-ray polarimeter readout electronics. The deconvolution filter is used to recover the high-frequency component of the detector signal, which is lost due to the limited bandwidth of the readout electronics. The DPLMS deconvolution method does not require advance knowledge of the transfer function of the readout electronics system and can restrain the deconvolution noise by using a noise constraint. In this paper, the method is applied to simulation data generated with GEANT4, and the resulting photoelectron angular resolution of an X-ray polarimeter is presented.
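
    A hedged sketch of the LMS idea: train an FIR deconvolution filter from example electronics outputs and the corresponding desired fast pulses, with a simple weight-decay term standing in for the noise-constraint penalty of the DPLMS method. The tap count, step size, penalty weight, and synthetic pulse train are illustrative assumptions, not the paper's algorithm.

```python
# LMS-trained FIR deconvolution filter (sketch).  mu must stay small
# relative to the input power for the adaptation to remain stable.
import numpy as np

def lms_deconvolver(x, d, n_taps=32, mu=0.05, penalty=1e-6, n_epochs=5):
    w = np.zeros(n_taps)
    for _ in range(n_epochs):
        for n in range(n_taps - 1, len(x)):
            window = x[n - n_taps + 1:n + 1][::-1]       # newest sample first
            err = d[n] - w @ window
            w += mu * err * window - penalty * w         # LMS step plus crude penalty term
    return w

rng = np.random.default_rng(1)
d = np.zeros(4000)
d[rng.integers(50, 3950, 40)] = rng.uniform(0.5, 1.5, 40)   # desired fast detector pulses
h = np.exp(-np.arange(60) / 15.0); h /= h.sum()             # slow readout-electronics response
x = np.convolve(d, h)[:4000]                                # what the electronics would output
w = lms_deconvolver(x, d)
restored = np.convolve(x, w)[:len(x)]                       # roughly re-sharpened pulses
```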

  2. An automated technique for detailed μ-FTIR mapping of diamond and spectral deconvolution

    NASA Astrophysics Data System (ADS)

    Howell, Dan; Griffin, Bill; O'Neill, Craig; O'Reilly, Suzanne; Pearson, Norman; Handley, Heather

    2010-05-01

    Since the original classification of diamonds based upon their absorption in the one-phonon region of the mid-infrared (IR) range was first introduced, a vast amount of research has been carried out in this field. The result today is that IR analysis has become the principal tool for classifying diamonds based upon the concentration and aggregation state of nitrogen, the most common impurity found within their crystal lattice. These studies have shown that diamonds can contain a large range of nitrogen, from nominally nitrogen-free, i.e. below detection limits (termed Type II), to nitrogen-rich (termed Type I) with up to 5000 ppm. It has also been shown that the nitrogen concentration, aggregation and distribution in an individual stone can be either homogeneous or heterogeneous. Nitrogen has been shown to reside within diamond in three different aggregation states. The first is in the form of single substitutional nitrogen atoms, known as C centres; diamonds that contain nitrogen only in this form are termed Type Ib. The second aggregation state is pairs of nitrogen atoms forming A centres (termed Type IaA diamonds), and the final state is four nitrogen atoms tetrahedrally arranged around a vacancy, forming a B centre (termed Type IaB). The sequence of aggregation has been shown to progress from C centres to A centres to B centres and is a function of time and temperature. As such it is a commonly used tool in the geological study of diamonds to gauge their mantle residence time/temperature history. The first step in the sequence is thought to occur relatively quickly in geological terms; the vast age of most diamonds therefore makes Type Ib samples rare in cratonic diamond deposits. The second step takes considerably more time, meaning that the A to B centre conversion may not always continue through to completion, so diamonds containing a mixture of both A and B centres are commonly termed Type IaAB. IR analysis of diamond also has the capability of identifying other commonly found defects and impurities, whether these are intrinsic defects like platelets, extrinsic defects like hydrogen or boron atoms, or inclusions of minerals or fluids. Recent technological developments in the field of spectroscopy allow detailed μ-FTIR analysis to be performed rapidly in an automated fashion. The Nicolet iN10 microscope has an integrated design that maximises signal throughput and allows spectra to be collected with greater efficiency than is possible with conventional μ-FTIR spectrometer-microscope systems. Combining this with a computer-controlled x-y stage allows for the automated measuring of several thousand spectra in only a few hours. This affords us the ability to record 2D IR maps of diamond plates with minimal effort, but has created the need for an automated technique to process the large quantities of IR spectra and obtain quantitative data from them. We will present new software routines that can process large batches of IR spectra, including baselining, conversion to absorption coefficient, and deconvolution to identify and quantify the various nitrogen components. Possible sources of error in each step of the process will be highlighted so that the data produced can be critically assessed. The end result will be the production of various false-colour 2D maps that show the distribution of nitrogen concentrations and aggregation states, as well as other identifiable components.
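
    The per-spectrum deconvolution step can be sketched as a non-negative least-squares fit of reference A-centre and B-centre spectra to the baseline-corrected one-phonon region, as below. The reference spectra must be supplied from the literature (they are not generated here), and the ppm conversion factors are the commonly quoted values, to be checked against whichever calibration the processing actually uses.

```python
# Hedged sketch of nitrogen deconvolution for one baseline-corrected spectrum.
# a_ref and b_ref are literature reference spectra sampled on the same
# wavenumber axis and normalised to 1 cm^-1 absorption at 1282 cm^-1.
import numpy as np
from scipy.optimize import nnls

def deconvolve_nitrogen(absorption, a_ref, b_ref, platelet_ref=None):
    refs = [a_ref, b_ref] + ([platelet_ref] if platelet_ref is not None else [])
    A = np.column_stack(refs)
    coeffs, residual = nnls(A, absorption)
    n_A_ppm = 16.5 * coeffs[0]          # ~16.5 ppm per cm^-1 (A centres), commonly quoted value
    n_B_ppm = 79.4 * coeffs[1]          # ~79.4 ppm per cm^-1 (B centres), commonly quoted value
    aggregation = n_B_ppm / max(n_A_ppm + n_B_ppm, 1e-12)   # fraction of nitrogen in B centres
    return n_A_ppm, n_B_ppm, aggregation, residual
```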

  3. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the application of this methodology to selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab was developed. The established tool for selective inline quantification was successfully applied to the peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. PMID:24522836
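
    The calibration idea, mapping detector absorbance spectra to individual protein concentrations with partial least squares regression, can be sketched with scikit-learn on synthetic data. The wavelength range, pure-component spectra, and latent-variable count below are placeholders, not the calibration set of the study.

```python
# PLS calibration sketch: spectra -> concentrations of three co-eluting proteins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.arange(240, 301)                             # nm, assumed detector range
pure = np.abs(rng.normal(1.0, 0.3, (3, wavelengths.size)))    # stand-in pure-component spectra
Y_cal = rng.uniform(0, 2, (40, 3))                            # known calibration concentrations (g/L)
X_cal = Y_cal @ pure + rng.normal(0, 0.01, (40, wavelengths.size))

pls = PLSRegression(n_components=3)                           # latent-variable count: a tuning choice
pls.fit(X_cal, Y_cal)

X_run = np.array([[0.5, 1.2, 0.1]]) @ pure                    # one "inline" spectrum during elution
print(pls.predict(X_run))                                     # ~ [0.5, 1.2, 0.1]
```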

  4. Creation Of The Residual Stress By Influence Of Wear Of Cutting Tool And Their Analysis

    NASA Astrophysics Data System (ADS)

    Kordík, Marek; Čilliková, Mária; Mrazik, Jozef; Martinček, Juraj; Janota, Miroslav; Nicielnik, Henryk

    2015-12-01

    The aim of this paper is the analysis of a turned bearing ring made of material 14109 (DIN 100Cr6) without heat treatment. A mechanical destructive method was chosen for the analysis, which focused on the existence and character of residual stresses after the turning operation of the bearing ring with tools at different levels of wear. The experiment reveals the relationship between residual stress creation and cutting tool wear.

  5. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    SciTech Connect

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.

  6. Optimal application of Morrison's iterative noise removal for deconvolution. Appendices

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1987-01-01

    Morrison's iterative method of noise removal, or Morrison's smoothing, is applied in a simulation to noise-added data sets of various noise levels to determine its optimum use. Morrison's smoothing is applied for noise removal alone, and for noise removal prior to deconvolution. For the latter, an accurate method is analyzed to provide confidence in the optimization. The method consists of convolving the data with an inverse filter calculated by taking the inverse discrete Fourier transform of the reciprocal of the transform of the response of the system. Various length filters are calculated for the narrow and wide Gaussian response functions used. Deconvolution of non-noisy data is performed, and the error in each deconvolution calculated. Plots are produced of error versus filter length, and from these plots the most accurate filter lengths are determined. The statistical methodologies employed in the optimizations of Morrison's method are similar. A typical peak-type input is selected and convolved with the two response functions to produce the data sets to be analyzed. Both constant and ordinate-dependent Gaussian distributed noise is added to the data, where the noise levels of the data are characterized by their signal-to-noise ratios. The error measures employed in the optimizations are the L1 and L2 norms. Results of the optimizations for both Gaussians, both noise types, and both norms include figures of optimum iteration number and error improvement versus signal-to-noise ratio, and tables of results. The statistical variation of all quantities considered is also given.
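
    The inverse-filter construction described above (an inverse discrete Fourier transform of the reciprocal of the transform of the system response, truncated to the length under test) can be sketched as follows; the exponential response, filter length, and small-magnitude guard are illustrative assumptions.

```python
# Inverse-filter deconvolution sketch: build the filter from the system
# response in the frequency domain, truncate it, and convolve with the data.
import numpy as np

def inverse_filter_deconvolve(response, data, filter_length):
    n = len(data)
    H = np.fft.fft(response, n)
    H[np.abs(H) < 1e-8] = 1e-8                  # guard against division by near-zero bins
    inv = np.real(np.fft.ifft(1.0 / H))         # full-length inverse filter
    inv = inv[:filter_length]                   # truncate to the length being evaluated
    return np.convolve(data, inv)[:n]

t = np.arange(200)
response = np.exp(-t / 5.0); response /= response.sum()      # simple causal response
truth = np.zeros(200); truth[60] = 1.0; truth[120] = 0.5     # peak-type input
data = np.convolve(truth, response)[:200]                    # noise-free data, as in the error study
restored = inverse_filter_deconvolve(response, data, filter_length=200)
# `restored` approximately recovers the two spikes at samples 60 and 120.
```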

  7. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  8. Powerful tool for design analysis of linear control systems

    SciTech Connect

    Maddux, Jr, A S

    1982-05-10

    The methods for designing linear controls for electronic or mechanical systems have been understood and put into practice. What has not been readily available to engineers, however, is a practical, quick and inexpensive method for analyzing these linear control (feedback) systems once they have been designed into the electronic or mechanical hardware. Now, the PET, manufactured by Commodore Business Machines (CBM), operating with several peripherals via the IEEE 488 Bus, brings to the engineer for about $4000 a complete set of office tools for analyzing these system designs.

  9. An agent-based tool for infrastructure interdependency policy analysis.

    SciTech Connect

    North, M. J.

    2000-12-14

    Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructure interdependencies such as those between the electric power and natural gas markets. These markets are undergoing fundamental transformations including major changes in electric generator fuel sources. Electric generators that use natural gas as a fuel source are rapidly gaining market share. These generators introduce direct interdependency between the electric power and natural gas markets. These interdependencies have been investigated using the emergent behavior of CAS model agents within the Spot Market Agent Research Tool Version 2.0 Plus Natural Gas (SMART II+).

  10. Parachute system design, analysis, and simulation tool. Status report

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-12-31

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash, and line sail. In addition to these codes, material property data bases have been acquired. Recently we have initiated a project to integrate these codes and data bases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS, describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials data base that includes high-rate-of-strain data.

  11. Peak Studio: a tool for the visualization and analysis of fragment analysis files.

    PubMed

    McCafferty, Jonathan; Reid, Robert; Spencer, Melanie; Hamp, Timothy; Fodor, Anthony

    2012-10-01

    While emerging technologies such as next-generation sequencing are increasingly important tools for the analysis of metagenomic communities, molecular fingerprinting techniques such as automated ribosomal intergenic spacer analysis (ARISA) and terminal restriction fragment length polymorphism (T-RFLP) remain in use due to their rapid speed and low cost. Peak Studio is a Java-based graphical user interface (GUI) designed for the visualization and analysis of fragment analysis (FSA) files generated by the Applied Biosystems capillary electrophoresis instrument. Specifically designed for ARISA and T-RFLP experiments, Peak Studio provides the user with the ability to freely adjust the parameters of a peak-calling algorithm and immediately see the implications for downstream clustering by principal component analysis. Peak Studio is fully open source and, unlike proprietary solutions, can be deployed on any computer running Windows, OS X, or Linux. Peak Studio allows data to be saved in multiple formats and can serve as a pre-processing suite that prepares data for statistical analysis in programs such as SAS or R. PMID:23760901
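
    As a rough illustration of the kind of adjustable peak-calling step described above (not Peak Studio's own code), the following Python fragment detects peaks in a single electropherogram trace with user-tunable height and prominence thresholds; the hypothetical sizes array maps scan indices to estimated fragment lengths so the output can feed a sample-by-fragment matrix for principal component analysis.

      import numpy as np
      from scipy.signal import find_peaks

      def call_peaks(trace, sizes, min_height=50.0, min_prominence=20.0):
          """Return (fragment_size, intensity) pairs for called peaks."""
          trace = np.asarray(trace, dtype=float)
          sizes = np.asarray(sizes, dtype=float)
          idx, _ = find_peaks(trace, height=min_height,
                              prominence=min_prominence)
          return list(zip(sizes[idx], trace[idx]))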

  12. Deconvolution/identification techniques for 1-D transient signals

    SciTech Connect

    Goodman, D.M.

    1990-10-01

    This paper discusses a variety of nonparametric deconvolution and identification techniques that we have developed for application to 1-D transient signal problems. These methods are time-domain techniques that use direct methods for matrix inversion; therefore, they are not appropriate for "large data" problems. The techniques involve various regularization methods and permit the use of certain kinds of a priori information in estimating the unknown. They have been implemented in a package written in standard FORTRAN, which should make the package readily transportable to most computers. This paper is also meant to be an instruction manual for the package. 25 refs., 17 figs., 1 tab.
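
    As an illustration of time-domain deconvolution by direct matrix methods with regularization, the Python sketch below builds the convolution (Toeplitz) matrix explicitly and solves a zeroth-order Tikhonov-regularized normal equation. It is a generic example in the spirit of the approach described, not the FORTRAN package itself; the penalty weight lam and the requirement len(h) <= len(y) are assumptions of this sketch.

      import numpy as np
      from scipy.linalg import toeplitz

      def tikhonov_deconvolve(y, h, lam=1e-2):
          """Estimate x from y = A x + noise, where A is the (causal)
          convolution matrix of the impulse response h, using dense
          direct linear algebra."""
          n = len(y)
          first_col = np.r_[h, np.zeros(n - len(h))]
          first_row = np.r_[h[0], np.zeros(n - 1)]
          A = toeplitz(first_col, first_row)       # lower-triangular matrix
          return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)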

  13. Multi-frame blind deconvolution using sparse priors

    NASA Astrophysics Data System (ADS)

    Dong, Wende; Feng, Huajun; Xu, Zhihai; Li, Qi

    2012-05-01

    In this paper, we propose a method for multi-frame blind deconvolution. Two sparse priors, i.e., the natural image gradient prior and an l1-norm based prior, are used to regularize the latent image and the point spread functions (PSFs), respectively. An alternating minimization approach is adopted to solve the resulting optimization problem. We use both grayscale blurred frames from a data set and color frames captured with a digital camera to verify the robustness of our approach. Experimental results show that the proposed method can accurately reconstruct PSFs with complex structures and that the restored images are of high quality.
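
    The alternating-minimization structure can be sketched as follows. For brevity, this assumed example replaces the paper's sparse gradient and l1-norm priors with simple quadratic (Tikhonov) penalties so that each sub-problem has a closed-form frequency-domain update; it shows the skeleton of alternating between the latent image and the per-frame PSFs, not the authors' algorithm.

      import numpy as np

      def multiframe_blind_deconv(frames, psf_shape, iters=20,
                                  lam=1e-2, mu=1e-2):
          """Alternately update the latent image and per-frame PSFs.
          Quadratic penalties stand in for the paper's sparse priors."""
          shape = frames[0].shape
          Y = [np.fft.fft2(f) for f in frames]
          H = [np.fft.fft2(np.ones(psf_shape) / np.prod(psf_shape), shape)
               for _ in frames]                      # flat initial PSFs
          X = np.fft.fft2(np.mean(frames, axis=0))   # initial latent image
          for _ in range(iters):
              num = sum(np.conj(Hk) * Yk for Hk, Yk in zip(H, Y))
              X = num / (sum(np.abs(Hk) ** 2 for Hk in H) + lam)
              H = [np.conj(X) * Yk / (np.abs(X) ** 2 + mu) for Yk in Y]
          image = np.real(np.fft.ifft2(X))
          psfs = [np.real(np.fft.ifft2(Hk)) for Hk in H]
          return image, psfs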

  14. Direct linear algebraic deconvolution imaging of Compton telescope data.

    NASA Astrophysics Data System (ADS)

    Dixon, D. D.; Buchholz, J.; O'Neill, T. J.; Tümer, O. T.; White, R. S.; Zych, A. D.; Wheaton, W. A.

    A general direct linear algebraic deconvolution method for imaging Compton gamma-ray telescope event data is described. This method gives an image of the gamma-ray source distribution that is linearly related to the binned event data. Two algorithms for imposing the positivity constraint are investigated and compared: the Non-negative Least Squares method of Lawson and Hanson (1974) and the Inequality Constrained Generalized Least Squares method of Werner (1990). The latter is applied here to CGRO COMPTEL event data for Viewing Period #1, centered on the Crab Nebula. Preliminary flux images and their statistical significance are presented.
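
    The non-negativity-constrained solve can be illustrated with SciPy's implementation of the Lawson-Hanson NNLS algorithm. The response matrix R below is a random stand-in (an assumption of this example) for an instrument response mapping sky-pixel fluxes to binned event counts; only the form of the constrained least-squares step is meant to carry over.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(0)
      R = rng.random((200, 50))                    # (data bins) x (sky pixels)
      true_flux = np.zeros(50)
      true_flux[10] = 5.0                          # a single point source
      d = rng.poisson(R @ true_flux).astype(float) # simulated event counts
      flux_image, rnorm = nnls(R, d)               # non-negative least squares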

  15. Deconvolution techniques applied to Fabry-Perot interferometry data

    NASA Astrophysics Data System (ADS)

    Kerr, Robert B.; Noto, John; Jackson, James M.

    1994-09-01

    Spectral line profiles from light atomic gases in upper planetary atmospheres are commonly non-Maxwellian. The velocity distributions of these light gases are perturbed in complex ways by atmospheric escape processes, by the paucity of thermalizing collisions, and by infrequent but important collisions with hot ions in the plasmasphere. It has long been recognized that the velocity distributions can be used to unfold the physical processes leading to atmospheric escape and to the partitioning of neutral gas trajectory classifications (ballistic, escaping, or satellite). Unfortunately, isolation of the velocity distribution from the measured emission line profile is not a simple matter, especially when the velocity distribution is non-Maxwellian and when the instrument function used to measure the profile is itself a complex function. We have experimented with several techniques to accurately retrieve the velocity distribution of atomic hydrogen in the earth's exosphere from the hydrogen Balmer-alpha (H-alpha) emission line profile measured with a Fabry-Perot interferometer. Although the derived velocity distribution remains subject to contamination of the measured emission by extraterrestrial and terrestrial sources, the technique for separating the actual emission function from the combined instrument-plus-emission function is established in this work and is applicable to many other similar problems. In particular, two techniques are compared. First, a classical deconvolution technique is developed using objective, low-pass filtering. Second, a nonlinear deconvolution algorithm, commonly referred to as CLEAN by the radio astronomy community that developed it, is applied to the optical H-alpha spectra. We find that this second technique is useful for an accurate isolation of the velocity distribution of atomic hydrogen in the earth's exosphere, while the classical deconvolution is more useful for determining the full width at half maximum of the emission. The CLEAN technique does a superior job of isolating low signal-to-noise information in the emission profile wings, which is of particular interest for deriving the escaping atomic hydrogen population. It is particularly important that the CLEAN technique, when properly applied, is not susceptible to the addition of unrealistic information in the low signal-to-noise regions at the emission line extrema, whereas common deconvolution techniques are often quite suspect in these regions. Using this new technique and a new ability to ascribe hydrogen column abundance to H-alpha brightness measurements, we are now poised to derive atomic hydrogen escape fluxes without dependence upon models of escape flux dynamics.
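
    The CLEAN idea, iteratively locating the strongest feature, subtracting a scaled copy of the instrument function at that location, and recording a delta-function component, can be sketched in one dimension as below. This is a generic Hogbom-style illustration with assumed function names, not the authors' implementation or their low-pass-filtered classical deconvolution.

      import numpy as np

      def clean_1d(dirty, instr, gain=0.1, n_iter=500, threshold=0.0):
          """Hogbom-style CLEAN for a 1-D line profile (illustrative only)."""
          residual = np.asarray(dirty, dtype=float).copy()
          instr = np.asarray(instr, dtype=float)
          instr = instr / instr.max()          # peak-normalized instrument fn
          components = np.zeros_like(residual)
          half = len(instr) // 2
          for _ in range(n_iter):
              i = int(np.argmax(residual))
              peak = residual[i]
              if peak <= threshold:
                  break
              components[i] += gain * peak     # record a delta component
              lo = max(0, i - half)
              hi = min(len(residual), i - half + len(instr))
              residual[lo:hi] -= gain * peak * instr[lo - (i - half):
                                                     hi - (i - half)]
          return components, residual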

  16. Reconstructing the Genomic Content of Microbiome Taxa through Shotgun Metagenomic Deconvolution

    PubMed Central

    Carr, Rogan; Shen-Orr, Shai S.; Borenstein, Elhanan

    2013-01-01

    Metagenomics has transformed our understanding of the microbial world, allowing researchers to bypass the need to isolate and culture individual taxa and to directly characterize both the taxonomic and gene compositions of environmental samples. However, associating the genes found in a metagenomic sample with the specific taxa of origin remains a critical challenge. Existing binning methods, based on nucleotide composition or alignment to reference genomes allow only a coarse-grained classification and rely heavily on the availability of sequenced genomes from closely related taxa. Here, we introduce a novel computational framework, integrating variation in gene abundances across multiple samples with taxonomic abundance data to deconvolve metagenomic samples into taxa-specific gene profiles and to reconstruct the genomic content of community members. This assembly-free method is not bounded by various factors limiting previously described methods of metagenomic binning or metagenomic assembly and represents a fundamentally different approach to metagenomic-based genome reconstruction. An implementation of this framework is available at http://elbo.gs.washington.edu/software.html. We first describe the mathematical foundations of our framework and discuss considerations for implementing its various components. We demonstrate the ability of this framework to accurately deconvolve a set of metagenomic samples and to recover the gene content of individual taxa using synthetic metagenomic samples. We specifically characterize determinants of prediction accuracy and examine the impact of annotation errors on the reconstructed genomes. We finally apply metagenomic deconvolution to samples from the Human Microbiome Project, successfully reconstructing genus-level genomic content of various microbial genera, based solely on variation in gene count. These reconstructed genera are shown to correctly capture genus-specific properties. With the accumulation of metagenomic data, this deconvolution framework provides an essential tool for characterizing microbial taxa never before seen, laying the foundation for addressing fundamental questions concerning the taxa comprising diverse microbial communities. PMID:24146609
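
    Conceptually, the framework treats the abundance of each gene family across samples as a mixture of taxon contributions weighted by taxon abundance. The sketch below is an illustrative re-expression of that idea (not the authors' implementation): it estimates non-negative, taxon-specific copy numbers for each gene by regressing its abundance profile on the taxonomic abundance matrix.

      import numpy as np
      from scipy.optimize import nnls

      def deconvolve_gene_content(G, T):
          """G[g, s]: abundance of gene family g in sample s.
          T[t, s]: abundance of taxon t in sample s.
          Returns C[g, t], the estimated copies of gene g per unit of
          taxon t, fitted gene-by-gene with non-negative least squares."""
          n_genes, n_taxa = G.shape[0], T.shape[0]
          C = np.zeros((n_genes, n_taxa))
          for g in range(n_genes):
              C[g], _ = nnls(T.T, G[g])   # solve T.T @ c ~= G[g] with c >= 0
          return C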

  18. Energy life-cycle analysis modeling and decision support tool

    SciTech Connect

    Hoza, M.; White, M.E.

    1993-06-01

    As one of DOE's five multi-program national laboratories, Pacific Northwest Laboratory (PNL) develops and deploys technology for national missions in energy and the environment. The Energy Information Systems Group, within the Laboratory's Computer Sciences Department, focuses on the development of the computational and data communications infrastructure and automated tools for the Transmission and Distribution energy sector and for advanced process engineering applications. The energy industry is being forced to operate in new ways and under new constraints. It is in a reactive mode, reacting to policies and politics, and to economics and environmental pressures. The transmission and distribution sectors are being forced to find new ways to maximize the use of their existing infrastructure, increase energy efficiency, and minimize environmental impacts, while continuing to meet the demands of an ever-increasing population. The creation of a sustainable energy future will be a challenge for both the soft and hard sciences. It will require that we as creators of our future be bold in the way we think about our energy future and aggressive in its development. The development of tools to help bring about a sustainable future will not be simple either. The development of ELCAM, for example, represents a stretch for the computational sciences as well as for each of the domain sciences, such as economics, which will have to be team members.

  19. QUANTITATIVE TRAIT LOCUS ANALYSIS AS A GENE DISCOVERY TOOL

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative trait locus (QTL) analysis has been a mainstay approach for obtaining a genetic description of complex agronomic traits in plants. What is sometimes overlooked is the role QTL analysis can play in identifying the genes that underlie complex traits. In this chapter, I will describe the basic st...

  20. Core Curriculum Analysis: A Tool for Educational Design

    ERIC Educational Resources Information Center

    Levander, Lena M.; Mikkola, Minna

    2009-01-01

    This paper examines the outcome of a dimensional core curriculum analysis. The analysis process was an integral part of an educational development project, which aimed to compact and clarify the curricula of the degree programmes. The task was also in line with the harmonising of the degree structures as part of the Bologna process within higher