Science.gov

Sample records for deconvolution analysis tool

  1. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
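    The record above names the algorithms but not their update rules. As an illustrative sketch only (not the IDL DeConv_Tool code), the Maximum Likelihood criterion it lists is commonly optimized with the Richardson-Lucy iteration, which in NumPy looks roughly like this; the blurred image and PSF are assumed inputs:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=50, eps=1e-12):
    """Maximum-likelihood (Richardson-Lucy) deconvolution sketch.

    Illustrative re-implementation, not the IDL DeConv_Tool routines.
    blurred : observed 2D image; psf : point spread function (sums to 1).
    """
    blurred = np.asarray(blurred, dtype=float)
    estimate = np.full_like(blurred, blurred.mean())
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / (reblurred + eps)            # data / current model
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    # Residual statistics of the kind the tool monitors (mean and std of residuals).
    residual = blurred - fftconvolve(estimate, psf, mode="same")
    return estimate, residual.mean(), residual.std()
```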

  2. PVT Analysis With A Deconvolution Algorithm

    SciTech Connect

    Kouzes, Richard T.

    2011-02-01

    Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis algorithm developed by Symetrica, Inc., and they have demonstrated the application of their deconvolution algorithm to PVT with very promising results. The thrust of such a deconvolution algorithm used with PVT is to allow for identification and rejection of naturally occurring radioactive material, reducing alarm rates, rather than the complete identification of all radionuclides, which is the goal of spectroscopic portal monitors. Under this condition, there could be a significant increase in sensitivity to threat materials. The advantage of this approach is an enhancement to the low cost, robust detection capability of PVT-based radiation portal monitor systems. The success of this method could provide an inexpensive upgrade path for a large number of deployed PVT-based systems to provide significantly improved capability at a much lower cost than deployment of NaI(Tl)-based systems of comparable sensitivity.

  3. Chemometric data analysis for deconvolution of overlapped ion mobility profiles.

    PubMed

    Zekavat, Behrooz; Solouki, Touradj

    2012-11-01

    We present the details of a data analysis approach for deconvolution of the ion mobility (IM) overlapped or unresolved species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution. PMID:22948903
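    As a minimal sketch of the screening step described above (checking post-IM/CID MS profiles for overlapped species with principal component analysis), assuming a synthetic data matrix in place of the authors' Matlab-formatted data:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical post-IM/CID MS data: rows = IM arrival-time bins,
# columns = fragment m/z channels (random placeholders, not real spectra).
rng = np.random.default_rng(0)
X = rng.random((200, 500))

pca = PCA(n_components=5)
scores = pca.fit_transform(X)           # arrival-time-resolved component scores
print(pca.explained_variance_ratio_)    # more than one dominant component suggests overlapped species
```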

  4. Importance of FTIR Spectra Deconvolution for the Analysis of Amorphous Calcium Phosphates

    NASA Astrophysics Data System (ADS)

    Brangule, Agnese; Agris Gross, Karlis

    2015-03-01

    This work will consider Fourier transform infrared spectroscopy - diffuse reflectance infrared reflection (FTIR-DRIFT) for collecting the spectra and deconvolution to identify changes in bonding as a means of more powerful detection. Spectra were recorded from amorphous calcium phosphate synthesized by wet precipitation, and from bone. FTIR-DRIFT was used to study the chemical environments of PO4, CO3 and amide. Deconvolution of spectra separated overlapping bands in the ν4PO4, ν2CO3, ν3CO3 and amide region allowing a more detailed analysis of changes at the atomic level. Amorphous calcium phosphate dried at 80 °C, despite showing an X-ray diffraction amorphous structure, displayed carbonate in positions resembling a carbonated hydroxyapatite. Additional peaks were designated as A1 type, A2 type or B type. Deconvolution allowed the separation of CO3 positions in bone from amide peaks. FTIR-DRIFT spectrometry in combination with deconvolution offers an advanced tool for qualitative and quantitative determination of CO3, PO4 and HPO4 and shows promise to measure the degree of order.
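    A hedged illustration of the band-separation idea: fitting a sum of Gaussian components to an overlapped spectral region with SciPy. The band positions, widths and noise level below are placeholders, not values from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, width):
    return amp * np.exp(-((x - center) ** 2) / (2 * width ** 2))

def two_bands(x, a1, c1, w1, a2, c2, w2):
    # Sum of two overlapping bands, e.g. within the nu4 PO4 region.
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

# Synthetic absorbance segment standing in for a DRIFT spectrum.
x = np.linspace(520, 640, 300)
y = two_bands(x, 1.0, 563, 8, 0.7, 603, 10) + np.random.default_rng(1).normal(0, 0.01, x.size)

popt, pcov = curve_fit(two_bands, x, y, p0=[1, 560, 10, 1, 600, 10])
print(popt)   # fitted amplitudes, centers (cm^-1) and widths of the separated bands
```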

  5. Deconvolution Program

    Energy Science and Technology Software Center (ESTSC)

    1999-02-18

    The program is suitable for many applications in applied mathematics, experimental physics, signal-analysis systems, and engineering, e.g. spectrum deconvolution, signal analysis, and system property analysis.

  6. Multispectral imaging analysis: spectral deconvolution and applications in biology

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas; Ahmed, Wamiq; Bayraktar, Bulent; Rajwa, Bartek; Sturgis, Jennifer; Robinson, J. P.

    2005-03-01

    Multispectral imaging has been in use for over half a century. Owing to advances in digital photographic technology, multispectral imaging is now used in settings ranging from clinical medicine to industrial quality control. Our efforts focus on the use of multispectral imaging coupled with spectral deconvolution for measurement of endogenous tissue fluorophores and for animal tissue analysis by multispectral fluorescence, absorbance, and reflectance data. Multispectral reflectance and fluorescence images may be useful in evaluation of pathology in histological samples. For example, current hematoxylin/eosin diagnosis limits spectral analysis to shades of red and blue/grey. It is possible to extract much more information using multispectral techniques. To collect this information, a series of filters or a device such as an acousto-optical tunable filter (AOTF) or liquid-crystal filter (LCF) can be used with a CCD camera, enabling collection of images at many more wavelengths than is possible with a simple filter wheel. In multispectral data processing, the "unmixing" of reflectance or fluorescence data and the classification based upon the resulting spectra are required. In addition to multispectral techniques, extraction of topological information may be possible by reflectance deconvolution or multiple-angle imaging, which could aid in accurate diagnosis of skin lesions or isolation of specific biological components in tissue. The goal of these studies is to develop spectral signatures that will provide us with specific and verifiable tissue structure/function information. In addition, relatively complex classification techniques must be developed so that the data are of use to the end user.
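    A minimal sketch of the spectral "unmixing" step mentioned above, assuming known endmember spectra and non-negative least squares; the spectra are random placeholders rather than measured fluorophore or reflectance data:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember spectra (columns): two fluorophores sampled at 30 wavelengths.
rng = np.random.default_rng(2)
E = np.abs(rng.random((30, 2)))

# A mixed pixel spectrum = non-negative combination of the endmembers plus noise.
true_abundance = np.array([0.7, 0.3])
pixel = E @ true_abundance + rng.normal(0, 0.01, 30)

abundance, residual_norm = nnls(E, pixel)   # the "unmixing" step
print(abundance)                            # ~[0.7, 0.3]
```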

  7. Quantitative polymerase chain reaction analysis by deconvolution of internal standard

    PubMed Central

    2010-01-01

    Background Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. Results We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. Conclusions This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results. PMID:20429911

  8. Use of new spectral analysis methods in gamma spectra deconvolution

    NASA Astrophysics Data System (ADS)

    Pinault, Jean Louis

    1991-07-01

    A general deconvolution method applicable to X and gamma ray spectrometry is proposed. Using new spectral analysis methods, it is applied to an actual case: the accurate on-line analysis of three elements (Ca, Si, Fe) in a cement plant using neutron capture gamma rays. Neutrons are provided by a low activity (5 μg) 252Cf source; the detector is a BGO 3 in. × 8 in. scintillator. The principle of the method rests on the Fourier transform of the spectrum. The search for peaks and determination of peak areas are worked out in the Fourier representation, which enables separation of background and peaks and very efficiently discriminates peaks, or elements represented by several peaks. First, the spectrum is transformed so that in the new representation the full width at half maximum (FWHM) is independent of energy. Thus, the spectrum is arranged symmetrically and transformed into the Fourier representation. The latter is multiplied by a function in order to transform the original Gaussian peaks into Lorentzian peaks. An autoregressive filter is calculated, leading to a characteristic polynomial whose complex roots represent both the location and the width of each peak, provided that their absolute value is lower than unity. The amplitude of each component (the area of each peak or the sum of areas of peaks characterizing an element) is fitted by the weighted least squares method, taking into account that errors in spectra are independent and follow a Poisson law. Very accurate results are obtained, which would be hard to achieve by other methods. The DECO FORTRAN code has been developed for compatible PC microcomputers. Some features of the code are given.
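    The final fitting step described above (component amplitudes by weighted least squares with Poisson errors) can be sketched as follows; the peak profiles, positions and counts are hypothetical, and the Fourier-domain peak search itself is not reproduced:

```python
import numpy as np

def gaussian_profile(channels, center, width):
    return np.exp(-((channels - center) ** 2) / (2 * width ** 2))

rng = np.random.default_rng(3)
channels = np.arange(512)

# Hypothetical design matrix: one column per peak (or element) plus a flat background.
P = np.column_stack([gaussian_profile(channels, 150, 6),
                     gaussian_profile(channels, 160, 6),
                     gaussian_profile(channels, 300, 8),
                     np.ones_like(channels, dtype=float)])
true_amplitudes = np.array([500.0, 300.0, 800.0, 50.0])
y = rng.poisson(P @ true_amplitudes).astype(float)

# Poisson statistics: Var(y_i) ~ y_i, so each channel is weighted by 1 / y_i.
w = np.sqrt(1.0 / np.clip(y, 1.0, None))
amplitudes, *_ = np.linalg.lstsq(P * w[:, None], y * w, rcond=None)
print(amplitudes)   # fitted peak areas and background level
```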

  9. Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers caused by the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. Hereby the induced change in the natural isotopic composition of an element allows, amongst others, for studying the fate and fluxes of metals, trace elements and species in organisms or provides an intrinsic marker or tag of particular biological samples. Due to the vast potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges; as e.g. Sr is available in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer being incorporated into the natural sample matrix as well as the degree of impurities and species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer studies is critically discussed, where special emphasis is set on evaluating different data processing strategies on the example of enriched stable Sr isotopes.1 The analytical key parameters such as blank (Kr, Sr and Rb), variation of the natural Sr isotopic composition in the sample, mass bias, interferences (Rb) and total combined uncertainty are considered. A full metrological protocol for data processing using IPD is presented based on data gained during two transgenerational marking studies of fish, where the transfer of a Sr isotope double spike (84Sr and 86Sr) from female spawners of common carp (Cyprinus carpio L.) and brown trout (Salmo trutta f.f.)2 to the centre of the otoliths of their offspring was studied by (LA)-MC-ICP-MS. 1J. Irrgeher, A. Zitek, M. Cervicek and T. Prohaska, J. Anal. At. Spectrom., 2014, 29, 193-200. 2A. Zitek, J. Irrgeher, M. Kletzl, T. Weismann and T. Prohaska, Fish. Manage. Ecol., 2013, 20, 654-361.
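    A minimal sketch of the IPD idea (multiple linear regression on a measured isotope pattern). The natural Sr abundances are approximate literature values; the spike compositions and mixing fractions are invented for illustration:

```python
import numpy as np

# Isotope patterns for 84Sr, 86Sr, 87Sr, 88Sr (molar fractions).
natural = np.array([0.0056, 0.0986, 0.0700, 0.8258])   # approximate natural Sr
spike84 = np.array([0.95, 0.03, 0.01, 0.01])           # assumed enriched 84Sr tracer
spike86 = np.array([0.02, 0.95, 0.01, 0.02])           # assumed enriched 86Sr tracer
A = np.column_stack([natural, spike84, spike86])

# Measured (mixed) isotope pattern of a hypothetical spiked sample.
measured = 0.80 * natural + 0.12 * spike84 + 0.08 * spike86

# Multiple linear regression recovers the molar contributions without knowing
# the absolute amount of tracer incorporated into the sample.
fractions, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(fractions / fractions.sum())   # ~[0.80, 0.12, 0.08]
```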

  10. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, the change in AE signal characteristics can be better interpreted as the result of changes in the states of the process alone. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of the AE signals generated during stretching became more distinctive and could be used more effectively as tools for process monitoring.
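    The deconvolution step described above (multiplying the AE spectrum by the inverse of the instrumentation transfer function) can be sketched as a regularized inverse filter; the small regularization term is an addition for numerical stability, not part of the original paper:

```python
import numpy as np

def frequency_domain_deconvolve(measured, impulse_response, reg=1e-3):
    """Divide out an instrumentation transfer function in the frequency domain.

    Illustrative sketch: `measured` is the recorded AE signal and
    `impulse_response` the (assumed known) instrumentation impulse response.
    """
    measured = np.asarray(measured, dtype=float)
    n = len(measured)
    H = np.fft.rfft(impulse_response, n)          # instrumentation transfer function
    Y = np.fft.rfft(measured, n)                  # recorded AE spectrum
    X = Y * np.conj(H) / (np.abs(H) ** 2 + reg)   # regularized inverse filtering
    return np.fft.irfft(X, n)                     # deconvolved AE signal
```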

  11. Data Dependent Peak Model Based Spectrum Deconvolution for Analysis of High Resolution LC-MS Data

    PubMed Central

    2015-01-01

    A data dependent peak model (DDPM) based spectrum deconvolution method was developed for analysis of high resolution LC-MS data. To construct the selected ion chromatogram (XIC), a clustering method, the density based spatial clustering of applications with noise (DBSCAN), is applied to all m/z values of an LC-MS data set to group the m/z values into each XIC. The DBSCAN constructs XICs without the need for a user defined m/z variation window. After the XIC construction, the peaks of molecular ions in each XIC are detected using both the first and the second derivative tests, followed by an optimized chromatographic peak model selection method for peak deconvolution. A total of six chromatographic peak models are considered, including Gaussian, log-normal, Poisson, gamma, exponentially modified Gaussian, and hybrid of exponential and Gaussian models. The abundant nonoverlapping peaks are chosen to find the optimal peak models that are both data- and retention-time-dependent. Analysis of 18 spiked-in LC-MS data sets demonstrates that the proposed DDPM spectrum deconvolution method outperforms the traditional method. On average, the DDPM approach not only detected 58 more chromatographic peaks from each of the testing LC-MS data sets but also improved the retention time and peak area by 3% and 6%, respectively. PMID:24533635
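    A small sketch of the XIC-construction step, grouping m/z values with DBSCAN as described above; the m/z values, eps and min_samples below are illustrative, not the paper's settings:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical m/z readings across an LC-MS run: three ion species whose
# m/z values jitter slightly from scan to scan.
rng = np.random.default_rng(4)
mz = np.concatenate([rng.normal(301.1412, 0.0005, 400),
                     rng.normal(301.1498, 0.0005, 350),
                     rng.normal(523.2855, 0.0008, 500)])

# DBSCAN groups the m/z values into XICs without a user-defined m/z window.
labels = DBSCAN(eps=0.001, min_samples=20).fit_predict(mz.reshape(-1, 1))
print(sorted(set(labels)))   # one label per XIC (-1, if present, marks noise points)
```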

  12. Data dependent peak model based spectrum deconvolution for analysis of high resolution LC-MS data.

    PubMed

    Wei, Xiaoli; Shi, Xue; Kim, Seongho; Patrick, Jeffrey S; Binkley, Joe; Kong, Maiying; McClain, Craig; Zhang, Xiang

    2014-02-18

    A data dependent peak model (DDPM) based spectrum deconvolution method was developed for analysis of high resolution LC-MS data. To construct the selected ion chromatogram (XIC), a clustering method, the density based spatial clustering of applications with noise (DBSCAN), is applied to all m/z values of an LC-MS data set to group the m/z values into each XIC. The DBSCAN constructs XICs without the need for a user defined m/z variation window. After the XIC construction, the peaks of molecular ions in each XIC are detected using both the first and the second derivative tests, followed by an optimized chromatographic peak model selection method for peak deconvolution. A total of six chromatographic peak models are considered, including Gaussian, log-normal, Poisson, gamma, exponentially modified Gaussian, and hybrid of exponential and Gaussian models. The abundant nonoverlapping peaks are chosen to find the optimal peak models that are both data- and retention-time-dependent. Analysis of 18 spiked-in LC-MS data sets demonstrates that the proposed DDPM spectrum deconvolution method outperforms the traditional method. On average, the DDPM approach not only detected 58 more chromatographic peaks from each of the testing LC-MS data sets but also improved the retention time and peak area by 3% and 6%, respectively. PMID:24533635

  13. Application of spectral deconvolution and inverse mechanistic modelling as a tool for root cause investigation in protein chromatography.

    PubMed

    Brestrich, Nina; Hahn, Tobias; Hubbuch, Jürgen

    2016-03-11

    In chromatographic protein purification, process variations, aging of columns, or processing errors can lead to deviations of the expected elution behavior of product and contaminants and can result in a decreased pool purity or yield. A different elution behavior of all or several involved species leads to a deviating chromatogram. The causes for deviations are however hard to identify by visual inspection and complicate the correction of a problem in the next cycle or batch. To overcome this issue, a tool for root cause investigation in protein chromatography was developed. The tool combines a spectral deconvolution with inverse mechanistic modelling. Mid-UV spectral data and Partial Least Squares Regression were first applied to deconvolute peaks to obtain the individual elution profiles of co-eluting proteins. The individual elution profiles were subsequently used to identify errors in process parameters by curve fitting to a mechanistic chromatography model. The functionality of the tool for root cause investigation was successfully demonstrated in a model protein study with lysozyme, cytochrome c, and ribonuclease A. Deviating chromatograms were generated by deliberately caused errors in the process parameters flow rate and sodium-ion concentration in loading and elution buffer according to a design of experiments. The actual values of the three process parameters and, thus, the causes of the deviations were estimated with errors of less than 4.4%. Consequently, the established tool for root cause investigation is a valuable approach to rapidly identify process variations, aging of columns, or processing errors. This might help to minimize batch rejections or contribute to an increased productivity. PMID:26879457

  14. A further analysis for the minimum-variance deconvolution filter performance

    NASA Technical Reports Server (NTRS)

    Chi, Chong-Yung

    1987-01-01

    Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.
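    The limiting behavior summarized above mirrors that of the classical Wiener deconvolution filter; as an illustrative analogue (not the paper's MVD derivation), with H the channel response and S, N the signal and noise power spectra:

```latex
G(\omega) = \frac{H^{*}(\omega)}{|H(\omega)|^{2} + N(\omega)/S(\omega)},
\qquad
G(\omega) \to \frac{1}{H(\omega)} \quad \text{(inverse filter, } S/N \to \infty\text{)},
\qquad
G(\omega) \to \frac{S(\omega)}{N(\omega)}\, H^{*}(\omega) \quad \text{(matched filter, } S/N \to 0\text{)}.
```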

  15. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    NASA Astrophysics Data System (ADS)

    Madavarapu, Sateesh Babu

    Studies of aluminosilicate materials as replacements for traditional construction materials such as ordinary Portland cement (OPC), in order to reduce the associated environmental effects, have been an important research area for the past decades. Many properties like strength have already been studied, and the primary focus is to learn about the reaction mechanism and the effect of the parameters on the formed products. The aim of this research was to explore the structural changes and reaction product analysis of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at a molecular level, but not all methods are economic and simple. To understand the mechanisms of alkali activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used, and the effect of the parameters on the reaction products has been analyzed. To analyze complex systems like geopolymers using FTIR, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time and temperature dependent analyses were done on slag pastes to understand the polymerization of reactive silica in the system with time and temperature variance. For the time dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature dependent analysis was done by curing the samples at 60°C and 80°C. Similarly, fly ash was studied by activating with alkali hydroxides and alkali silicates. Under the same curing conditions the fly ash samples were evaluated to analyze the effects of added silicates for alkali activation. The peak shifts in the FTIR spectra reflect changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentration of silicate monomer in the activating solution and the position of the main Si-O-T (where T is Al/Si) stretching band in the FTIR spectrum, which gives an indication of the relative changes in the Si/Al ratio. Also, the effect of the cation and silicate concentration in the activating solution has been discussed using the Fourier self-deconvolution technique.

  16. Investigation of the CLEAN deconvolution method for use with Late Time Response analysis of multiple objects

    NASA Astrophysics Data System (ADS)

    Hutchinson, Simon; Taylor, Christopher T.; Fernando, Michael; Andrews, David; Bowring, Nicholas

    2014-10-01

    This paper investigates the application of the CLEAN non-linear deconvolution method to Late Time Response (LTR) analysis for detecting multiple objects in Concealed Threat Detection (CTD). When an Ultra-Wide Band (UWB) frequency radar signal is used to illuminate a conductive target, surface currents are induced upon the object which in turn give rise to LTR signals. These signals are re-radiated from the target, and results from a number of targets are presented. The experiment was performed using a double-ridged horn antenna in a pseudo-monostatic arrangement. A vector network analyser (VNA) has been used to provide the UWB Frequency Modulated Continuous Wave (FMCW) radar signal. The distance between the transmitting antenna and the target objects has been kept at 1 metre for all the experiments performed, and the power level at the VNA was set to 0 dBm. The targets in the experimental setup are suspended in air in a laboratory environment. Matlab has been used in post-processing to perform linear and non-linear deconvolution of the signal. The Wiener filter, Fast Fourier Transform (FFT) and Continuous Wavelet Transform (CWT) are used to process the return signals and extract the LTR features from the noise clutter. A Generalized Pencil-of-Function (GPOF) method was then used to extract the complex poles of the signal. Artificial Neural Networks (ANN) and Linear Discriminant Analysis (LDA) have been used to classify the data.
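    A bare-bones 1D CLEAN sketch of the kind investigated above: iteratively locate the strongest residual sample and subtract a scaled, shifted copy of the system point response. This is a generic textbook version under simplifying assumptions (peak-normalized beam, wrap-around shifts), not the authors' implementation:

```python
import numpy as np

def clean_1d(dirty, beam, gain=0.1, n_iter=500, threshold=1e-3):
    """Minimal 1D CLEAN sketch.

    dirty : measured response (e.g. a range/time profile);
    beam  : system point response, same length as `dirty`, peak-normalized.
    Returns the point-component model and the final residual.
    """
    residual = np.asarray(dirty, dtype=float).copy()
    components = np.zeros_like(residual)
    centre = int(np.argmax(np.abs(beam)))
    for _ in range(n_iter):
        peak = int(np.argmax(np.abs(residual)))
        if np.abs(residual[peak]) < threshold:
            break
        amp = gain * residual[peak]
        residual -= amp * np.roll(beam, peak - centre)   # wrap-around shift (sketch only)
        components[peak] += amp
    return components, residual
```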

  17. Fully automated deconvolution method for on-line analysis of time-resolved fluorescence spectroscopy data based on an iterative Laguerre expansion technique

    NASA Astrophysics Data System (ADS)

    Dabir, Aditi S.; Trivedi, Chintan A.; Ryu, Yeontack; Pande, Paritosh; Jo, Javier A.

    2009-03-01

    Time-resolved fluorescence spectroscopy (TRFS) is a powerful analytical tool for quantifying the biochemical composition of organic and inorganic materials. The potential of TRFS for tissue diagnosis has been recently demonstrated. To facilitate the translation of TRFS to the clinical arena, algorithms for online TRFS data analysis are essential. A fast model-free TRFS deconvolution algorithm based on the Laguerre expansion method has previously been introduced. One limitation of this method, however, is the need to heuristically select two parameters that are crucial for the accurate estimation of the fluorescence decay: the Laguerre parameter α and the expansion order. Here, a new implementation of the Laguerre deconvolution method is introduced, in which a nonlinear least-square optimization of the Laguerre parameter α is performed, and the optimal expansion order is selected based on a minimum description length criterion (MDL). In addition, estimation of the zero-time delay between the recorded instrument response and fluorescence decay is also performed based on normalized mean square error criterion (NMSE). The method is validated on experimental data from fluorescence lifetime standards, endogenous tissue fluorophores, and human tissue. The proposed automated Laguerre deconvolution method will facilitate online applications of TRFS, such as real-time clinical tissue diagnosis.

  18. XAP, a program for deconvolution and analysis of complex X-ray spectra

    USGS Publications Warehouse

    Quick, James E.; Haleby, Abdul Malik

    1989-01-01

    The X-ray analysis program (XAP) is a spectral-deconvolution program written in BASIC and specifically designed to analyze complex spectra produced by energy-dispersive X-ray analytical systems (EDS). XAP compensates for spectrometer drift, utilizes digital filtering to remove background from spectra, and solves for element abundances by least-squares, multiple-regression analysis. Rather than base analyses on only a few channels, broad spectral regions of a sample are reconstructed from standard reference spectra. The effects of this approach are (1) elimination of tedious spectrometer adjustments, (2) removal of background independent of sample composition, and (3) automatic correction for peak overlaps. Although the program was written specifically to operate a KEVEX 7000 X-ray fluorescence analytical system, it could be adapted (with minor modifications) to analyze spectra produced by scanning electron microscopes and electron microprobes, and X-ray diffractometer patterns obtained from whole-rock powders.

  19. MORESANE: MOdel REconstruction by Synthesis-ANalysis Estimators. A sparse deconvolution algorithm for radio interferometric imaging

    NASA Astrophysics Data System (ADS)

    Dabbech, A.; Ferrari, C.; Mary, D.; Slezak, E.; Smirnov, O.; Kenyon, J. S.

    2015-04-01

    Context. Recent years have seen huge developments of radio telescopes and a tremendous increase in their capabilities (sensitivity, angular and spectral resolution, field of view, etc.). Such systems make designing more sophisticated techniques mandatory not only for transporting, storing, and processing this new generation of radio interferometric data, but also for restoring the astrophysical information contained in such data. Aims. In this paper we present a new radio deconvolution algorithm named MORESANE and its application to fully realistic simulated data of MeerKAT, one of the SKA precursors. This method has been designed for the difficult case of restoring diffuse astronomical sources that are faint in brightness, complex in morphology, and possibly buried in the dirty beam's side lobes of bright radio sources in the field. Methods. MORESANE is a greedy algorithm that combines complementary types of sparse recovery methods in order to reconstruct the most appropriate sky model from observed radio visibilities. A synthesis approach is used for reconstructing images, in which the synthesis atoms representing the unknown sources are learned using analysis priors. We applied this new deconvolution method to fully realistic simulations of the radio observations of a galaxy cluster and of an HII region in M 31. Results. We show that MORESANE is able to efficiently reconstruct images composed of a wide variety of sources (compact point-like objects, extended tailed radio galaxies, low-surface brightness emission) from radio interferometric data. Comparisons with state-of-the-art algorithms indicate that MORESANE provides competitive results in terms of both the total flux/surface brightness conservation and fidelity of the reconstructed model. MORESANE seems particularly well suited to recovering diffuse and extended sources, as well as bright and compact radio sources known to be hosted in galaxy clusters.

  20. Analysis of force-deconvolution methods in frequency-modulation atomic force microscopy.

    PubMed

    Welker, Joachim; Illek, Esther; Giessibl, Franz J

    2012-01-01

    In frequency-modulation atomic force microscopy the direct observable is the frequency shift of an oscillating cantilever in a force field. This frequency shift is not a direct measure of the actual force, and thus, to obtain the force, deconvolution methods are necessary. Two prominent methods proposed by Sader and Jarvis (Sader-Jarvis method) and Giessibl (matrix method) are investigated with respect to the deconvolution quality. Both methods show a nontrivial dependence of the deconvolution quality on the oscillation amplitude. The matrix method exhibits spikelike features originating from a numerical artifact. By interpolation of the data, the spikelike features can be circumvented. The Sader-Jarvis method has a continuous amplitude dependence showing two minima and one maximum, which is an inherent property of the deconvolution algorithm. The optimal deconvolution depends on the ratio of the amplitude and the characteristic decay length of the force for the Sader-Jarvis method. However, the matrix method generally provides the higher deconvolution quality. PMID:22496997

  1. Fried deconvolution

    NASA Astrophysics Data System (ADS)

    Gilles, Jérôme; Osher, Stanley

    2012-06-01

    In this paper we present a new approach to deblur the effect of atmospheric turbulence in the case of long-range imaging. Our method is based on an analytical formulation, the Fried kernel, of the atmosphere modulation transfer function (MTF) and a framelet-based deconvolution algorithm. An important parameter is the refractive index structure constant, which normally requires specific measurements to be known. We then propose a method which provides a good estimation of this parameter from the input blurred image. The final algorithms are very easy to implement and show very good results on both simulated blur and real images.

  2. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  3. Error analysis of tumor blood flow measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis.

    PubMed

    Murase, Kenya; Miyazaki, Shohei

    2007-05-21

    We performed error analysis of tumor blood flow (TBF) measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis, based on computer simulations. For analysis, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) consisting of gamma-variate functions using an adiabatic approximation to the tissue homogeneity model under various plasma flow (F(p)), mean capillary transit time (T(c)), permeability-surface area product (PS) and signal-to-noise ratio (SNR) values. Deconvolution analyses based on truncated singular value decomposition with a fixed threshold value (TSVD-F), with an adaptive threshold value (TSVD-A) and with the threshold value determined by generalized cross validation (TSVD-G) were used to estimate F(p) values from the simulated concentration-time curves in the VOI and AIF. First, we investigated the relationship between the optimal threshold value and SNR in TSVD-F, and then derived the equation describing the relationship between the threshold value and SNR for TSVD-A. Second, we investigated the dependences of the estimated F(p) values on T(c), PS, the total duration for data acquisition and the shape of AIF. Although TSVD-F with a threshold value of 0.025, TSVD-A with the threshold value determined by the equation derived in this study and TSVD-G could estimate the F(p) values in a similar manner, the standard deviation of the estimates was the smallest and largest for TSVD-A and TSVD-G, respectively. PS did not largely affect the estimates, while T(c) did in all methods. Increasing the total duration significantly improved the variations in the estimates in all methods. TSVD-G was most sensitive to the shape of AIF, especially when the total duration was short. In conclusion, this study will be useful for understanding the reliability and limitation of model-independent deconvolution analysis when applied to TBF measurement using an extravascular contrast agent. PMID:17473352
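    The fixed-threshold variant (TSVD-F) described above can be sketched as follows: build the discrete convolution matrix from the AIF, zero out singular values below a fraction of the largest one, and take the maximum of the recovered impulse residue function as the flow estimate. Only the 0.025 threshold is taken from the abstract; the rest is an illustrative reconstruction:

```python
import numpy as np

def tsvd_deconvolve(aif, tissue, dt, threshold=0.025):
    """Truncated-SVD deconvolution sketch for DCE concentration-time curves."""
    aif = np.asarray(aif, dtype=float)
    tissue = np.asarray(tissue, dtype=float)
    n = len(aif)

    # Lower-triangular discrete convolution matrix built from the AIF.
    A = np.zeros((n, n))
    for i in range(n):
        A[i, : i + 1] = aif[i::-1] * dt

    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s >= threshold * s.max(), 1.0 / s, 0.0)   # truncate small singular values
    residue = Vt.T @ (s_inv * (U.T @ tissue))                  # flow-scaled impulse residue function
    return residue                                             # plasma flow estimate ~ residue.max()
```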

  4. Error analysis of tumor blood flow measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis

    NASA Astrophysics Data System (ADS)

    Murase, Kenya; Miyazaki, Shohei

    2007-05-01

    We performed error analysis of tumor blood flow (TBF) measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis, based on computer simulations. For analysis, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) consisting of gamma-variate functions using an adiabatic approximation to the tissue homogeneity model under various plasma flow (Fp), mean capillary transit time (Tc), permeability-surface area product (PS) and signal-to-noise ratio (SNR) values. Deconvolution analyses based on truncated singular value decomposition with a fixed threshold value (TSVD-F), with an adaptive threshold value (TSVD-A) and with the threshold value determined by generalized cross validation (TSVD-G) were used to estimate Fp values from the simulated concentration-time curves in the VOI and AIF. First, we investigated the relationship between the optimal threshold value and SNR in TSVD-F, and then derived the equation describing the relationship between the threshold value and SNR for TSVD-A. Second, we investigated the dependences of the estimated Fp values on Tc, PS, the total duration for data acquisition and the shape of AIF. Although TSVD-F with a threshold value of 0.025, TSVD-A with the threshold value determined by the equation derived in this study and TSVD-G could estimate the Fp values in a similar manner, the standard deviation of the estimates was the smallest and largest for TSVD-A and TSVD-G, respectively. PS did not largely affect the estimates, while Tc did in all methods. Increasing the total duration significantly improved the variations in the estimates in all methods. TSVD-G was most sensitive to the shape of AIF, especially when the total duration was short. In conclusion, this study will be useful for understanding the reliability and limitation of model-independent deconvolution analysis when applied to TBF measurement using an extravascular contrast agent.

  5. Demand Response Analysis Tool

    SciTech Connect

    2012-03-01

    Demand Response Analysis Tool is a software application developed at the Lawrence Berkeley National Laboratory and initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool with standardized methods to evaluate the demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  6. Demand Response Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2012-03-01

    Demand Response Analysis Tool is a software application developed at the Lawrence Berkeley National Laboratory and initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool with standardized methods to evaluate the demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  7. TSVD analysis of Euler deconvolution to improve estimating magnetic source parameters: An example from the Åsele area, Sweden

    NASA Astrophysics Data System (ADS)

    Beiki, Majid

    2013-03-01

    In this paper, I introduce a new approach based on truncated singular value decomposition (TSVD) analysis for improving implementation of grid-based Euler deconvolution with constraints of quasi 2D magnetic sources. I will show that by using TSVD analysis of the gradient matrix of the magnetic field anomaly (reduced to pole) for data points located within a square window centered at the maximum of the analytic signal amplitude, we are able to estimate the strike direction and dip angle of 2D structures from the acquired eigenvectors. It is also shown that implementation of the standard grid-based Euler deconvolution can be considerably improved by solving Euler's homogeneity equation for source location and structural index, simultaneously, using the TSVD method. The dimensionality of the magnetic anomalies can be indicated from the ratio between the smallest and intermediate eigenvalues acquired from the TSVD analysis of the gradient matrix. For 2D magnetic sources, the uncertainty of the estimated source location and structural index is significantly reduced by truncating the smallest eigenvalue. Application of the method is demonstrated on an aeromagnetic data set from the Åsele area in Sweden. The geology of this area is dominated by several dike swarms. For these dolerite dikes, the introduced method has provided useful information on strike directions and dip angles in addition to the estimated source location and structural index.

  8. ATAMM analysis tool

    NASA Technical Reports Server (NTRS)

    Jones, Robert; Stoughton, John; Mielke, Roland

    1991-01-01

    Diagnostics software for analyzing Algorithm to Architecture Mapping Model (ATAMM) based concurrent processing systems is presented. ATAMM is capable of modeling the execution of large grain algorithms on distributed data flow architectures. The tool graphically displays algorithm activities and processor activities for evaluation of the behavior and performance of an ATAMM based system. The tool's measurement capabilities indicate computing speed, throughput, concurrency, resource utilization, and overhead. Evaluations are performed on a simulated system using the software tool. The tool is used to estimate theoretical lower bound performance. Analysis results are shown to be comparable to the predictions.

  9. MS-DIAL: Data Independent MS/MS Deconvolution for Comprehensive Metabolome Analysis

    PubMed Central

    Tsugawa, Hiroshi; Cajka, Tomas; Kind, Tobias; Ma, Yan; Higgins, Brendan; Ikeda, Kazutaka; Kanazawa, Mitsuhiro; VanderGheynst, Jean; Fiehn, Oliver; Arita, Masanori

    2015-01-01

    Data-independent acquisition (DIA) in liquid chromatography tandem mass spectrometry (LC-MS/MS) provides more comprehensive untargeted acquisition of molecular data. Here we provide an open-source software pipeline, MS-DIAL, to demonstrate how DIA improves simultaneous identification and quantification of small molecules by mass spectral deconvolution. For reversed-phase LC-MS/MS, our program with an enriched LipidBlast library identified a total of 1,023 lipid compounds from nine algal strains to highlight their chemotaxonomic relationships. PMID:25938372

  10. Combining deconvolution and noise analysis for the estimation of transmitter release rates at the calyx of held.

    PubMed

    Neher, E; Sakaba, T

    2001-01-15

    The deconvolution method has been used in the past to estimate release rates of synaptic vesicles, but it cannot be applied to synapses where nonlinear interactions of quanta occur. We have extended this method to take into account a nonlinear current component resulting from the delayed clearance of glutamate from the synaptic cleft. We applied it to the calyx of Held and verified the important assumption of constant miniature EPSC (mEPSC) size by combining deconvolution with a variant of nonstationary fluctuation analysis. We found that amplitudes of mEPSCs decreased strongly after extended synaptic activity. Cyclothiazide (CTZ), an inhibitor of glutamate receptor desensitization, eliminated this reduction, suggesting that postsynaptic receptor desensitization occurs during strong synaptic activity at the calyx of Held. Constant mEPSC sizes could be obtained in the presence of CTZ and kynurenic acid (Kyn), a low-affinity blocker of AMPA-receptor channels. CTZ and Kyn prevented postsynaptic receptor desensitization and saturation and also minimized voltage-clamp errors. Therefore, we conclude that in the presence of these drugs, release rates at the calyx of Held can be reliably estimated over a wide range of conditions. Moreover, the method presented should provide a convenient way to study the kinetics of transmitter release at other synapses. PMID:11160425

  11. Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis

    SciTech Connect

    Greenberg, M.; Ebel, D.S.

    2009-03-19

    We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 μm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-ray fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 μm/pixel, without the use of oil-based lenses. A full textural analysis on track No. 82 is presented here as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

  12. Independent component analysis (ICA) algorithms for improved spectral deconvolution of overlapped signals in 1H NMR analysis: application to foods and related products.

    PubMed

    Monakhova, Yulia B; Tsikin, Alexey M; Kuballa, Thomas; Lachenmeier, Dirk W; Mushtakova, Svetlana P

    2014-05-01

    The major challenge facing NMR spectroscopic mixture analysis is the overlapping of signals and the resulting impossibility of easily recovering the structures for identification of the individual components and of integrating separated signals for quantification. In this paper, various independent component analysis (ICA) algorithms [mutual information least dependent component analysis (MILCA); stochastic non-negative ICA (SNICA); joint approximate diagonalization of eigenmatrices (JADE); and robust, accurate, direct ICA algorithm (RADICAL)] as well as deconvolution methods [simple-to-use-interactive self-modeling mixture analysis (SIMPLISMA) and multivariate curve resolution-alternating least squares (MCR-ALS)] are applied for simultaneous ¹H NMR spectroscopic determination of organic substances in complex mixtures. Among others, we studied constituents of the following matrices: honey, soft drinks, and liquids used in electronic cigarettes. Good quality spectral resolution of up to eight-component mixtures was achieved (correlation coefficients between resolved and experimental spectra were not less than 0.90). In general, the relative errors in the recovered concentrations were below 12%. SIMPLISMA and MILCA algorithms were found to be preferable for NMR spectra deconvolution and showed similar performance. The proposed method was used for analysis of authentic samples. The resolved ICA concentrations match well with the results of reference gas chromatography-mass spectrometry as well as the MCR-ALS algorithm used for comparison. ICA deconvolution considerably improves the application range of direct NMR spectroscopy for analysis of complex mixtures. PMID:24604756
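    A minimal sketch of ICA-based resolution of overlapped mixture spectra. FastICA (scikit-learn) stands in here for the MILCA/SNICA/JADE/RADICAL algorithms compared in the paper, and the Lorentzian "NMR" lines and mixing matrix are simulated placeholders:

```python
import numpy as np
from sklearn.decomposition import FastICA

def lorentzian(ppm, center, width):
    return width ** 2 / ((ppm - center) ** 2 + width ** 2)

rng = np.random.default_rng(5)
ppm = np.linspace(0, 10, 2000)

# Three simulated pure-component spectra (rows) and six mixture spectra.
S_true = np.vstack([lorentzian(ppm, 1.2, 0.02) + lorentzian(ppm, 3.6, 0.02),
                    lorentzian(ppm, 2.1, 0.02) + lorentzian(ppm, 5.3, 0.02),
                    lorentzian(ppm, 7.2, 0.02)])
M = rng.random((6, 3))                               # mixing coefficients
X = M @ S_true + rng.normal(0, 1e-3, (6, ppm.size))  # observed mixture spectra

ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X.T).T   # resolved component spectra (sign/scale ambiguous)
print(S_est.shape)                 # (3, 2000): one resolved spectrum per component
```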

  13. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  14. UDECON: deconvolution optimization software for restoring high-resolution records from pass-through paleomagnetic measurements

    NASA Astrophysics Data System (ADS)

    Xuan, Chuang; Oda, Hirokuni

    2015-11-01

    The rapid accumulation of continuous paleomagnetic and rock magnetic records acquired from pass-through measurements on superconducting rock magnetometers (SRM) has greatly contributed to our understanding of the paleomagnetic field and paleo-environment. Pass-through measurements are inevitably smoothed and altered by the convolution effect of SRM sensor response, and deconvolution is needed to restore high-resolution paleomagnetic and environmental signals. Although various deconvolution algorithms have been developed, the lack of easy-to-use software has hindered the practical application of deconvolution. Here, we present standalone graphical software UDECON as a convenient tool to perform optimized deconvolution for pass-through paleomagnetic measurements using the algorithm recently developed by Oda and Xuan (Geochem Geophys Geosyst 15:3907-3924, 2014). With the preparation of a format file, UDECON can directly read pass-through paleomagnetic measurement files collected at different laboratories. After the SRM sensor response is determined and loaded to the software, optimized deconvolution can be conducted using two different approaches (i.e., "Grid search" and "Simplex method") with adjustable initial values or ranges for smoothness, corrections of sample length, and shifts in measurement position. UDECON provides a suite of tools to view conveniently and check various types of original measurement and deconvolution data. Multiple steps of measurement and/or deconvolution data can be compared simultaneously to check the consistency and to guide further deconvolution optimization. Deconvolved data together with the loaded original measurement and SRM sensor response data can be saved and reloaded for further treatment in UDECON. Users can also export the optimized deconvolution data to a text file for analysis in other software.

  15. Physics analysis tools

    SciTech Connect

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level tools such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the process of analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages; thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

  16. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides more decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  17. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  18. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  19. Deconvolution Analysis for Classifying Gastric Adenocarcinoma Patients Based on Differential Scanning Calorimetry Serum Thermograms

    PubMed Central

    Vega, Sonia; Garcia-Gonzalez, María Asuncion; Lanas, Angel; Velazquez-Campoy, Adrian; Abian, Olga

    2015-01-01

    Recently, differential scanning calorimetry (DSC) has been acknowledged as a novel tool for diagnosing and monitoring several diseases. This highly sensitive technique has been traditionally used to study thermally induced protein folding/unfolding transitions. In previous research papers, DSC profiles from blood samples of patients were analyzed and they exhibited marked differences in the thermal denaturation profile. Thus, we investigated the use of this novel technology in blood serum samples from 25 healthy subjects and 30 patients with gastric adenocarcinoma (GAC) at different stages of tumor development with a new multiparametric approach. The analysis of the calorimetric profiles of blood serum from GAC patients allowed us to discriminate three stages of cancer development (I to III) from those of healthy individuals. After a multiparametric analysis, a classification of blood serum DSC parameters from patients with GAC is proposed. Certain parameters exhibited significant differences (P < 0.05) and allowed the discrimination of healthy subjects/patients from patients at different tumor stages. The results of this work validate DSC as a novel technique for GAC patient classification and staging, and offer new graphical tools and value ranges for the acquired parameters in order to discriminate healthy from diseased subjects with increased disease burden. PMID:25614381

  20. Deconvolution analysis for classifying gastric adenocarcinoma patients based on differential scanning calorimetry serum thermograms.

    PubMed

    Vega, Sonia; Garcia-Gonzalez, María Asuncion; Lanas, Angel; Velazquez-Campoy, Adrian; Abian, Olga

    2015-01-01

    Recently, differential scanning calorimetry (DSC) has been acknowledged as a novel tool for diagnosing and monitoring several diseases. This highly sensitive technique has been traditionally used to study thermally induced protein folding/unfolding transitions. In previous research papers, DSC profiles from blood samples of patients were analyzed and they exhibited marked differences in the thermal denaturation profile. Thus, we investigated the use of this novel technology in blood serum samples from 25 healthy subjects and 30 patients with gastric adenocarcinoma (GAC) at different stages of tumor development with a new multiparametric approach. The analysis of the calorimetric profiles of blood serum from GAC patients allowed us to discriminate three stages of cancer development (I to III) from those of healthy individuals. After a multiparametric analysis, a classification of blood serum DSC parameters from patients with GAC is proposed. Certain parameters exhibited significant differences (P < 0.05) and allowed the discrimination of healthy subjects/patients from patients at different tumor stages. The results of this work validate DSC as a novel technique for GAC patient classification and staging, and offer new graphical tools and value ranges for the acquired parameters in order to discriminate healthy from diseased subjects with increased disease burden. PMID:25614381

  1. PCard Data Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool: - Helps identify exceptions or questionable and non-compliant purchases, - Creates audit random sample on request, - Allows users to create and run new or ad-hoc queries and reports, - Monitors disputed charges, - Creates predefined Emails to Cardholders requesting documentation and/or clarification, - Tracks audit status, notes, Email status (date sent, response), audit resolution.

  2. PCard Data Analysis Tool

    SciTech Connect

    Hilts, Jim

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool: - Helps identify exceptions or questionable and non-compliant purchases, - Creates audit random sample on request, - Allows users to create and run new or ad-hoc queries and reports, - Monitors disputed charges, - Creates predefined Emails to Cardholders requesting documentation and/or clarification, - Tracks audit status, notes, Email status (date sent, response), audit resolution.

  3. Graphical Contingency Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis to provide more decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  4. Transmission Planning Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2015-06-23

    Developed to solve a specific problem: assist transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects six countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. The Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses, and QV analyses across many study scenarios and arranges them in a systematic way to aid power system planning engineers or transmission operators in the decision-making process or in the off-line study environment.

  5. A posteriori analysis of spatial filters for approximate deconvolution large eddy simulations of homogeneous incompressible flows

    NASA Astrophysics Data System (ADS)

    Staples, Anne; San, Omer

    2012-11-01

    We investigate the effect of low-pass spatial filters for approximate deconvolution large eddy simulation (AD-LES) of turbulent incompressible flows. We propose the hyper-differential filter as a means of increasing the accuracy of the AD-LES model without increasing the computational cost. Box filters, Pade filters, and differential filters with a wide range of parameters are studied in the AD-LES framework. The AD-LES model, in conjunction with these spatial filters, is tested in the numerical simulation of the three-dimensional Taylor-Green vortex problem. The numerical results are benchmarked against direct numerical simulation (DNS) data. An under-resolved numerical simulation is also used for comparison purposes. According to the criteria used, the numerical results yield the following two conclusions: first, the AD-LES model equipped with any of these spatial filters yields accurate results at a fraction of the computational cost of DNS. Second, the most accurate results are obtained with the hyper-differential filter, followed by the differential filter.
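
    As a rough illustration of the filtering machinery discussed here, the sketch below applies a 1-D differential low-pass filter in Fourier space and a few van Cittert-style approximate-deconvolution iterations. The filter form, parameter values, and test signal are assumptions for demonstration only, not the configurations studied in the paper.

    ```python
    # Minimal 1-D sketch (assumed forms): a differential low-pass filter applied
    # in Fourier space and a few van Cittert iterations of approximate deconvolution.
    import numpy as np

    n, L, delta = 256, 2 * np.pi, 0.05
    x = np.linspace(0.0, L, n, endpoint=False)
    k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi

    u = np.sin(x) + 0.3 * np.sin(8 * x) + 0.1 * np.sin(32 * x)   # test field


    def differential_filter(f):
        # (1 - delta^2 d^2/dx^2) f_bar = f  =>  f_bar_k = f_k / (1 + delta^2 k^2)
        return np.real(np.fft.ifft(np.fft.fft(f) / (1.0 + (delta * k) ** 2)))


    def approx_deconvolution(f_bar, n_iter=3):
        # van Cittert series: u* = sum_{i=0}^{N} (I - G)^i applied to f_bar
        u_star, residual = np.zeros_like(f_bar), f_bar.copy()
        for _ in range(n_iter + 1):
            u_star += residual
            residual -= differential_filter(residual)
        return u_star


    u_bar = differential_filter(u)
    u_star = approx_deconvolution(u_bar)
    print("filtering error :", np.max(np.abs(u - u_bar)))
    print("after AD        :", np.max(np.abs(u - u_star)))
    ```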

  6. Multisensor analysis tool

    NASA Astrophysics Data System (ADS)

    Gerlach, Frank W.; Cook, Daniel B.

    1991-09-01

    The multisensor analysis tool (MSAT) was developed as a software aid for the design and parametric analysis of electro-optical sensors and multisensor suites. An executable version of MSAT is uniquely specified by combining a generic window-based shell with user-supplied sensor knowledge. The application-independent shell utilizes buttons, pull-down menus, plots, on-line help, and a spreadsheet-like interface. Sensor knowledge is supplied by users in the form of those equations which define a model; no programming or debugging is required. MSAT writes source code from this knowledge and links it with the shell. The tool has no concept of either input or output; any parameter may be varied in order to monitor its effects on all other related parameters. MSAT runs on a Sun workstation, comes with full documentation, and has a database which is being continuously populated with new sensor models and knowledge.

  7. Swift Science Analysis Tools

    NASA Astrophysics Data System (ADS)

    Marshall, F. E.; Swift Team Team

    2003-05-01

    Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch, with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site (http://swiftsc.gsfc.nasa.gov), and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as they are processed in the Swift Data Center (SDC). Once all the data for an observation are available, the data will be transferred to the HEASARC and to data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.

  8. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle-tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine if molecules stick, and particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  9. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  10. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was originally developed based on intra-procedural control and data flow. It has been expanded commercially to inter-procedural flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
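
    The slicing question posed above ("if this variable changes, what else changes, and along what path?") can be illustrated with a toy forward slice over a hand-built dependence graph. The graph, variable names, and traversal below are illustrative assumptions, not the tool's actual engine.

    ```python
    # Toy forward slice over a hand-built dependence graph.
    # Nodes are variables; an edge a -> b means "b is computed from a".
    from collections import deque

    deps = {
        "sensor_raw": ["calibrated"],
        "calibrated": ["average", "alarm_flag"],
        "average":    ["report"],
        "alarm_flag": ["operator_page"],
        "config":     ["calibrated"],
    }


    def forward_slice(graph, start):
        # Breadth-first search: every variable reachable from `start`, with the
        # path by which the change propagates.
        seen, paths, queue = {start}, {start: [start]}, deque([start])
        while queue:
            node = queue.popleft()
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    paths[nxt] = paths[node] + [nxt]
                    queue.append(nxt)
        return paths


    for var, path in forward_slice(deps, "sensor_raw").items():
        print(var, "<-", " -> ".join(path))
    ```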

  11. Stack Trace Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.

  12. Stack Trace Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2008-01-16

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.

  13. Stack Trace Analysis Tool

    SciTech Connect

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.
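
    A minimal sketch of the merging step described in this abstract: per-task stack traces are folded into a call prefix tree whose nodes record the tasks that reached each frame, and tasks sharing an identical call path form an equivalence class. The trace data and structure below are illustrative assumptions, not STAT's implementation.

    ```python
    # Fold per-task stack traces into a call prefix tree (illustrative data only).
    from collections import defaultdict

    traces = {
        0: ["main", "solve", "mpi_wait"],
        1: ["main", "solve", "mpi_wait"],
        2: ["main", "io_write", "fsync"],
        3: ["main", "solve", "compute"],
    }


    def build_prefix_tree(traces):
        # Each node maps frame name -> (set of task ranks, child dict).
        root = {}
        for rank, frames in traces.items():
            node = root
            for frame in frames:
                tasks, children = node.setdefault(frame, (set(), {}))
                tasks.add(rank)
                node = children
        return root


    def print_tree(node, depth=0):
        for frame, (tasks, children) in node.items():
            print("  " * depth + f"{frame}  tasks={sorted(tasks)}")
            print_tree(children, depth + 1)


    print_tree(build_prefix_tree(traces))

    # Equivalence classes: tasks sharing an identical full call path.
    classes = defaultdict(list)
    for rank, frames in traces.items():
        classes[tuple(frames)].append(rank)
    print(dict(classes))
    ```

    Tasks 0 and 1 fall into one class here, which is the kind of grouping that lets a single representative be handed to a full-featured debugger.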

  14. Convolution-deconvolution in DIGES

    SciTech Connect

    Philippacopoulos, A.J.; Simos, N.

    1995-05-01

    Convolution and deconvolution operations are very important aspects of SSI analysis, since they influence the input to the seismic analysis. This paper documents some of the convolution/deconvolution procedures which have been implemented in the DIGES code. The 1-D propagation of shear and dilatational waves in typical layered configurations, involving a stack of layers overlying a rock, is treated by DIGES in a similar fashion to that of available codes, e.g., CARES and SHAKE. For certain configurations, however, there is no need to perform such analyses since the corresponding solutions can be obtained in analytic form. Typical cases involve deposits which can be modeled by a uniform halfspace or simple layered halfspaces. For such cases DIGES uses closed-form solutions. These solutions are given for one- as well as two-dimensional deconvolution. The types of waves considered include P, SV and SH waves. Non-vertical incidence is given special attention, since deconvolution can be defined differently depending on the problem of interest. For all wave cases considered, corresponding transfer functions are presented in closed form. Transient solutions are obtained in the frequency domain. Finally, a variety of forms are considered for representing the free-field motion, in terms of both deterministic and probabilistic representations. These include (a) acceleration time histories, (b) response spectra, (c) Fourier spectra and (d) cross-spectral densities.
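
    As a generic illustration of the frequency-domain operation underlying such analyses (not the closed-form DIGES solutions themselves), the sketch below convolves an assumed base motion with a single-mode soil transfer function and then deconvolves the surface record, with a simple water-level safeguard against division by near-zero values. The transfer-function form and all parameter values are assumptions.

    ```python
    # Generic frequency-domain deconvolution sketch with an assumed transfer function.
    import numpy as np

    dt, n = 0.01, 2048
    t = np.arange(n) * dt
    freq = np.fft.rfftfreq(n, d=dt)

    # Assumed single-mode soil amplification (resonance at 2 Hz, 5% damping).
    f0, zeta = 2.0, 0.05
    H = 1.0 / (1.0 - (freq / f0) ** 2 + 2j * zeta * freq / f0)

    base = np.exp(-((t - 5.0) ** 2) / 0.1) * np.sin(2 * np.pi * 3.0 * t)  # input pulse
    surface = np.fft.irfft(np.fft.rfft(base) * H, n)                      # convolution


    def deconvolve(record, H, water_level=0.01):
        Hs = H.copy()
        small = np.abs(Hs) < water_level * np.abs(Hs).max()
        Hs[small] = water_level * np.abs(Hs).max()        # clamp near-zero components
        return np.fft.irfft(np.fft.rfft(record) / Hs, n)


    recovered = deconvolve(surface, H)
    print("max recovery error:", np.max(np.abs(recovered - base)))
    ```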

  15. Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis

    PubMed Central

    Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M.

    2015-01-01

    The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell-specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprise mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908
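
    The regression idea behind population-specific expression analysis can be sketched as fitting a gene's bulk expression against cell-type reference signals. The code below does this with synthetic data and ordinary least squares; it is a sketch of the concept, not the published PSEA implementation.

    ```python
    # Regression sketch: bulk expression modeled as a weighted sum of cell-type
    # reference signals (synthetic data, illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 40

    # Reference signals: per-sample averages of marker genes for each cell type.
    neurons = rng.uniform(0.5, 1.5, n_samples)
    astrocytes = rng.uniform(0.5, 1.5, n_samples)
    microglia = rng.uniform(0.5, 1.5, n_samples)

    # Simulated bulk expression of one gene: mostly neuronal, small glial share.
    bulk = (3.0 * neurons + 0.5 * astrocytes + 0.2 * microglia
            + rng.normal(0, 0.05, n_samples))

    # Least-squares fit of bulk expression on the reference signals (plus intercept).
    X = np.column_stack([np.ones(n_samples), neurons, astrocytes, microglia])
    coef, *_ = np.linalg.lstsq(X, bulk, rcond=None)
    for name, c in zip(["intercept", "neurons", "astrocytes", "microglia"], coef):
        print(f"{name:10s} {c:6.2f}")
    ```

    Comparing such per-cell-type coefficients between disease and control groups is, in essence, how cell-type-specific expression changes would be flagged.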

  16. Frequency Response Analysis Tool

    SciTech Connect

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
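
    A minimal sketch of the kind of frequency response calculation such a tool automates: take a pre-disturbance frequency value and a post-disturbance settling value from the trace, divide the MW change by the frequency change, and report the result per 0.1 Hz. The event magnitude, averaging windows, and trace below are assumptions for illustration, not FRAT's algorithm.

    ```python
    # Assumed-convention sketch of a frequency response estimate from an event trace.
    import numpy as np

    t = np.arange(0.0, 60.0, 0.1)                         # seconds
    f = np.where(t < 10.0, 60.0,                          # synthetic frequency trace (Hz)
                 59.95 + 0.01 * np.exp(-(t - 10.0) / 8.0))
    delta_mw = 450.0                                      # MW lost at t = 10 s (assumed)

    f_a = f[(t > 2.0) & (t < 8.0)].mean()                 # pre-disturbance value "A"
    f_b = f[(t > 25.0) & (t < 35.0)].mean()               # settling value "B"

    response = delta_mw / ((f_a - f_b) / 0.1)             # MW per 0.1 Hz
    print(f"A = {f_a:.4f} Hz, B = {f_b:.4f} Hz, response = {response:.1f} MW/0.1 Hz")
    ```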

  17. Neutron multiplicity analysis tool

    SciTech Connect

    Stewart, Scott L

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool, which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha, and multiplicity analysis. The latter is done with both the point model and the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass, a calculational tool, EXCOM, has been produced using VBA within Excel. This program was developed to help speed the analysis of Monte Carlo neutron transport simulation (MCNP) data, and only requires the count-rate data to calculate the mass of material using INCC's analysis methods instead of the full neutron multiplicity distribution required to run analysis in INCC. This paper describes what is implemented within EXCOM, including the methods used, how the program corrects for deadtime, and how uncertainty is calculated. This paper also describes how to use EXCOM within Excel.

  18. Geodetic Strain Analysis Tool

    NASA Technical Reports Server (NTRS)

    Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan

    2011-01-01

    A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of strain fields from the GPS and strainmeter data that sample them, to aid understanding of the strengths and weaknesses of strain-field derivation from continuous GPS (CGPS) and other geodetic data in a variety of tectonic settings, to converge on the "best practices" strain-derivation strategy for the Solid Earth Science ESDR System (SESES) project given the CGPS station distribution in the western U.S., and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
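
    The four data products named above follow from standard 2-D small-strain formulas; the sketch below computes them from an assumed uniform velocity gradient. Sign and angle conventions differ between tools, so the definitions here are assumptions rather than this software's exact output.

    ```python
    # Standard 2-D strain-rate quantities from an assumed uniform velocity gradient.
    import numpy as np

    # Velocity gradient tensor L[i, j] = d v_i / d x_j (units: 1/yr), assumed data.
    L = np.array([[ 3.0e-8,  1.2e-8],
                  [-0.4e-8, -1.5e-8]])

    E = 0.5 * (L + L.T)                        # symmetric strain-rate tensor
    exx, eyy, exy = E[0, 0], E[1, 1], E[0, 1]

    dilatation = exx + eyy
    max_shear = np.hypot(0.5 * (exx - eyy), exy)
    principal = (0.5 * dilatation + max_shear, 0.5 * dilatation - max_shear)
    axis_angle = 0.5 * np.degrees(np.arctan2(2.0 * exy, exx - eyy))  # principal axis

    print(f"dilatation      : {dilatation:+.2e} 1/yr")
    print(f"maximum shear   : {max_shear:+.2e} 1/yr")
    print(f"principal rates : {principal[0]:+.2e}, {principal[1]:+.2e} 1/yr")
    print(f"principal angle : {axis_angle:+.1f} deg")
    ```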

  19. A new spectral deconvolution - selected ion monitoring method for the analysis of alkylated polycyclic aromatic hydrocarbons in complex mixtures.

    PubMed

    Robbat, Albert; Wilton, Nicholas M

    2014-07-01

    A new gas chromatography/mass spectrometry (GC/MS) method is proffered for the analysis of polycyclic aromatic hydrocarbons (PAH) and their alkylated homologs in complex samples. Recent work elucidated the fragmentation pathways of alkylated PAH, concluding that multiple fragmentation patterns per homolog (MFPPH) are needed to correctly identify all isomers. Programming the MS in selected ion monitoring (SIM) mode to detect homolog-specific MFPPH ions delivers the selectivity and sensitivity that the conventional SIM and/or full scan mass spectrometry methods fail to provide. New spectral deconvolution software eliminates the practice of assigning alkylated homolog peaks via pattern recognition within laboratory-defined retention windows. Findings show that differences in concentration by SIM/molecular ion detection of C1-C4 PAH, now the standard, yield concentration differences compared to SIM/MFPPH of thousands of percent for some homologs. The SIM/MFPPH methodology is also amenable to the analysis of polycyclic aromatic sulfur heterocycles (PASH) and their alkylated homologs, since many PASH have the same m/z ions as those of PAH and, thus, are false positives in SIM/1-ion PAH detection methods. PMID:24840423

  20. Constrained spherical deconvolution analysis of the limbic network in human, with emphasis on a direct cerebello-limbic pathway

    PubMed Central

    Arrigo, Alessandro; Mormina, Enricomaria; Anastasi, Giuseppe Pio; Gaeta, Michele; Calamuneri, Alessandro; Quartarone, Angelo; De Salvo, Simona; Bruschetta, Daniele; Rizzo, Giuseppina; Trimarchi, Fabio; Milardi, Demetrio

    2014-01-01

    The limbic system is part of an intricate network involved in several functions such as memory and emotion. Traditionally, the role of the cerebellum was considered to be mainly associated with motor control; however, evidence is accumulating for a role of the cerebellum in learning, emotional control, and mnemonic and behavioral processes, involving also connections with the limbic system. In 15 normal subjects we studied limbic connections by probabilistic Constrained Spherical Deconvolution (CSD) tractography. The main result of our work was to prove for the first time in the human brain the existence of a direct cerebello-limbic pathway, which was previously hypothesized but never demonstrated. We also extended our analysis to the other limbic connections, including the cingulate fasciculus, inferior longitudinal fasciculus, uncinate fasciculus, anterior thalamic connections and fornix. Although these pathways have already been described in the tractographic literature, we provide reconstruction, quantitative analysis and fractional anisotropy (FA) right-left symmetry comparisons using probabilistic CSD tractography, which is known to provide a potential improvement over previously used Diffusion Tensor Imaging (DTI) techniques. The demonstration of the existence of the cerebello-limbic pathway could constitute an important step in the knowledge of the anatomical substrate of non-motor cerebellar functions. Finally, the CSD statistical data about limbic connections in healthy subjects could be potentially useful in the diagnosis of pathological disorders damaging this system. PMID:25538606

  1. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  2. Effect of ultra-strong static magnetic field on bacteria: application of Fourier-transform infrared spectroscopy combined with cluster analysis and deconvolution.

    PubMed

    Hu, Xing; Qiu, Zunan; Wang, Yerui; She, Zichao; Qian, Guangren; Ren, Zhongming

    2009-09-01

    A new method based on Fourier-transform infrared (FTIR) spectroscopy combined with cluster analysis and deconvolution was established to investigate the biological effect of an ultra-strong static magnetic field (SMF) of 10.0 T on Escherichia coli and Staphylococcus aureus. FTIR spectroscopy was applied to characterize the spectroscopic fingerprints of these bacterial cells with or without the treatment of the SMF. After the calculation, the results of cluster analysis indicated that the SMF had significant effects on E. coli compared with S. aureus, and the effects were reflected by the changes of spectral region of 1500-1200 cm(-1). The deconvolution results of this major indication region showed that the composition and conformation of nucleic acid, protein, and fatty acid of E. coli were altered under the magnetic conditions. PMID:19441065

  3. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they do not have access for modifying the criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or rehire experts, and thereby further reducing the cost of maintaining and upgrading the computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  4. New spectral deconvolution algorithms for the analysis of polycyclic aromatic hydrocarbons and sulfur heterocycles by comprehensive two-dimensional gas chromatography-quadrupole mass spectrometry.

    PubMed

    Antle, Patrick M; Zeigler, Christian D; Gankin, Yuriy; Robbat, Albert

    2013-11-01

    New mass spectral deconvolution algorithms have been developed for comprehensive two-dimensional gas chromatography/quadrupole mass spectrometry (GC × GC/qMS). This paper reports the first use of spectral deconvolution of full scan quadrupole GC × GC/MS data for the quantitative analysis of polycyclic aromatic hydrocarbons (PAH) and polycyclic aromatic sulfur heterocycles (PASH) in coal tar-contaminated soil. A method employing four ions per isomer and multiple fragmentation patterns per alkylated homologue (MFPPH) is used to quantify target compounds. These results are in good agreement with GC/MS concentrations, and an examination of method precision, accuracy, selectivity, and sensitivity is discussed. MFPPH and SIM/1-ion concentration differences are also examined. PMID:24063305

  5. Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of interest and time, single imagery, overlay of two different products, animation, a time-skip capability, and different image size outputs. Users can save an animation as a file (animated GIF) and import it into other presentation software, such as Microsoft PowerPoint. Since the tool can directly access the real data, more features and functionality can be added in the future.

  6. Java Radar Analysis Tool

    NASA Technical Reports Server (NTRS)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  7. Automated Deconvolution of Overlapped Ion Mobility Profiles

    NASA Astrophysics Data System (ADS)

    Brantley, Matthew; Zekavat, Behrooz; Harper, Brett; Mason, Rachel; Solouki, Touradj

    2014-10-01

    Presence of unresolved ion mobility (IM) profiles limits the efficient utilization of IM mass spectrometry (IM-MS) systems for isomer differentiation. Here, we introduce an automated ion mobility deconvolution (AIMD) computer software for streamlined deconvolution of overlapped IM-MS profiles. AIMD is based on a previously reported post-IM/collision-induced dissociation (CID) deconvolution approach [ J. Am. Soc. Mass Spectrom. 23, 1873 (2012)] and, unlike the previously reported manual approach, it does not require resampling of post-IM/CID data. A novel data preprocessing approach is utilized to improve the accuracy and efficiency of the deconvolution process. Results from AIMD analysis of overlapped IM profiles of data from (1) Waters Synapt G1 for a binary mixture of isomeric peptides (amino acid sequences: GRGDS and SDGRG) and (2) Waters Synapt G2-S for a binary mixture of isomeric trisaccharides (raffinose and isomaltotriose) are presented.

  8. Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols

    EPA Science Inventory

    In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...

  9. Digital deconvolution filter derived from linear discriminant analysis and application for multiphoton fluorescence microscopy.

    PubMed

    Sullivan, Shane Z; Schmitt, Paul D; Muir, Ryan D; DeWalt, Emma L; Simpson, Garth J

    2014-04-01

    A digital filter derived from linear discriminant analysis (LDA) is developed for recovering impulse responses in photon counting from a high speed photodetector (rise time of ~1 ns) and applied to remove ringing distortions from impedance mismatch in multiphoton fluorescence microscopy. Training of the digital filter was achieved by defining temporally coincident and noncoincident transients and identifying the projection within filter-space that best separated the two classes. Once trained, data analysis by digital filtering can be performed quickly. Assessment of the reliability of the approach was performed through comparisons of simulated voltage transients, in which the ground truth results were known a priori. The LDA filter was also found to recover deconvolved impulses for single photon counting from highly distorted ringing waveforms from an impedance mismatched photomultiplier tube. The LDA filter was successful in removing these ringing distortions from two-photon excited fluorescence micrographs and through data simulations was found to extend the dynamic range of photon counting by approximately 3 orders of magnitude through minimization of detector paralysis. PMID:24559143
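
    The training step described here (finding the projection that best separates coincident and noncoincident transients) can be sketched with a plain Fisher discriminant on synthetic pulses. The waveform model, noise levels, and threshold rule below are assumptions for illustration, not the authors' filter.

    ```python
    # Fisher/LDA projection sketch on synthetic photon-counting transients.
    import numpy as np

    rng = np.random.default_rng(1)
    n_pts, n_train = 64, 200
    t = np.arange(n_pts)


    def pulse(t0, ringing=0.3):
        # Exponential impulse plus a damped oscillation standing in for ringing.
        clean = np.exp(-(t - t0) / 4.0) * (t >= t0)
        ring = (ringing * np.sin(2 * np.pi * (t - t0) / 6.0)
                * np.exp(-(t - t0) / 10.0) * (t >= t0))
        return clean + ring


    coincident = np.array([pulse(20) + rng.normal(0, 0.05, n_pts) for _ in range(n_train)])
    noncoincident = np.array([rng.normal(0, 0.05, n_pts) for _ in range(n_train)])

    # Fisher discriminant: w = Sw^{-1} (mu1 - mu2), with a small ridge for stability.
    mu1, mu2 = coincident.mean(0), noncoincident.mean(0)
    Sw = np.cov(coincident, rowvar=False) + np.cov(noncoincident, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(n_pts), mu1 - mu2)

    threshold = 0.5 * (coincident @ w).mean() + 0.5 * (noncoincident @ w).mean()
    test = pulse(20, ringing=0.6) + rng.normal(0, 0.05, n_pts)
    print("photon detected:", test @ w > threshold)
    ```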

  10. FSSC Science Tools: Pulsar Analysis

    NASA Technical Reports Server (NTRS)

    Thompson, Dave

    2010-01-01

    This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

  11. Blind decorrelation and deconvolution algorithm for multiple-input multiple-output system: II. Analysis and simulation

    NASA Astrophysics Data System (ADS)

    Chen, Da-Ching; Yu, Tommy; Yao, Kung; Pottie, Gregory J.

    1999-11-01

    For single-input multiple-output (SIMO) systems, blind deconvolution based on second-order statistics has been shown to be promising, given that the sources and channels meet certain assumptions. In our previous paper we extended the work to multiple-input multiple-output (MIMO) systems by introducing a blind deconvolution algorithm to remove all channel dispersion, followed by a blind decorrelation algorithm to separate different sources from their instantaneous mixture. In this paper we first explore more details embedded in our algorithm. Then we present simulation results to show that our algorithm is applicable to MIMO systems excited by a broad class of signals such as speech, music and digitally modulated symbols.

  12. Unsupervised Blind Deconvolution

    NASA Astrophysics Data System (ADS)

    Baena-Galle, R.; Kann, L.; Mugnier, L.; Gudimetla, R.; Johnson, R.; Gladysz, S.

    2013-09-01

    "Blind" deconvolution is rarely executed blindly. All available methods have parameters which the user fine-tunes until the most visually-appealing reconstruction is achieved. The "art" of deconvolution is to find constraints which allow for the best estimate of an object to be recovered, but in practice these parameterized constraints often reduce deconvolution to the struggle of trial and error. In the course of AFOSR-sponsored activities we are developing a general maximum a posteriori framework for the problem of imaging through atmospheric turbulence, with the emphasis on multi-frame blind deconvolution. Our aim is to develop deconvolution strategy which is reference-less, i.e. no calibration PSF is required, extendable to longer exposures, and applicable to imaging with adaptive optics. In the first part of the project the focus has been on developing a new theory of statistics of images taken through turbulence, both with-, and without adaptive optics. Images and their Fourier transforms have been described as random phasor sums, their fluctuations controlled by wavefront "cells" and moments of the phase. The models were validated using simulations and real data from the 3.5m telescope at the Starfire Optical Range in New Mexico. Another important ingredient of the new framework is the capability to estimate the average PSF automatically from the target observations. A general approach, applicable to any type of object, has been proposed. Here use is made of an object-cancelling transformation of the image sequence. This transformation yields information about the atmospheric PSF. Currently, the PSF estimation module and the theoretical constraints on PSF variability are being incorporated into multi-frame blind deconvolution. In preliminary simulation tests we obtained significantly sharper images with respect to the starting observations and PSF estimates which closely track the input kernels. Thanks to access to the SOR 3.5m telescope we are now testing our deconvolution approach on images of real, extended objects. Adaptive-optics-assisted I-band observations at SOR rarely exceed Strehl ratios of 15% and therefore, in many cases, deconvolution post adaptive optics would be necessary for object identification.

  13. Logistics Process Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2008-03-31

    LPAT is the resulting integrated system between the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool (sponsored by CERL). The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program. PAT is the evolution of an ANL-developed software system called the Fort Future Virtual Installation Tool (sponsored by CERL). The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location (sponsored by SDDC-TEA).

  14. Logistics Process Analysis Tool

    SciTech Connect

    2008-03-31

    LPAT is the resulting integrated system between the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool (sponsored by CERL). The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program. PAT is the evolution of an ANL-developed software system called the Fort Future Virtual Installation Tool (sponsored by CERL). The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location (sponsored by SDDC-TEA).

  15. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

    The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundreds of researchers from all over the world. Several predictions made with RSAT were validated experimentally and published. PMID:18495751

  16. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's distributed analysis efforts into a single project, GANGA. The goal is to test the reconstruction and analysis software in large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  17. Comparison of environmental TLD (thermoluminescent dosimeter) results obtained using glow curve deconvolution and region of interest analysis

    SciTech Connect

    Not Available

    1987-01-01

    We tested a Harshaw Model 4000 TLD Reader in the Sandia Environmental TLD Program. An extra set of LiF TLD-700 chips was prepared for each field location and calibration level. At the end of quarter one, half of the TLDs were read on the Model 4000 and the other half were read on our standard Harshaw Model 2000. This presentation compares the results of the two systems. The Model 4000 results are reported for two regions of interest and for background subtraction using Harshaw Glow Curve Deconvolution Software.

  18. Tiling Microarray Analysis Tools

    SciTech Connect

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) Rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment). 2) Post processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation), 3) Significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and Interval refinement (filtering based on multiple statistics, overlap comparisons), 4) Data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and Data reports (spreadsheet summaries and detailed profiles)

  19. Wavespace-Based Coherent Deconvolution

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Cattafesta, Louis N., III

    2012-01-01

    Array deconvolution is commonly used in aeroacoustic analysis to remove the influence of a microphone array's point spread function from a conventional beamforming map. Unfortunately, the majority of deconvolution algorithms assume that the acoustic sources in a measurement are incoherent, which can be problematic for some aeroacoustic phenomena with coherent, spatially-distributed characteristics. While several algorithms have been proposed to handle coherent sources, some are computationally intractable for many problems while others require restrictive assumptions about the source field. Newer generalized inverse techniques hold promise, but are still under investigation for general use. An alternate coherent deconvolution method is proposed based on a wavespace transformation of the array data. Wavespace analysis offers advantages over curved-wave array processing, such as providing an explicit shift-invariance in the convolution of the array sampling function with the acoustic wave field. However, usage of the wavespace transformation assumes the acoustic wave field is accurately approximated as a superposition of plane wave fields, regardless of true wavefront curvature. The wavespace technique leverages Fourier transforms to quickly evaluate a shift-invariant convolution. The method is derived for and applied to ideal incoherent and coherent plane wave fields to demonstrate its ability to determine magnitude and relative phase of multiple coherent sources. Multi-scale processing is explored as a means of accelerating solution convergence. A case with a spherical wave front is evaluated. Finally, a trailing edge noise experiment case is considered. Results show the method successfully deconvolves incoherent, partially-coherent, and coherent plane wave fields to a degree necessary for quantitative evaluation. Curved wave front cases warrant further investigation. A potential extension to nearfield beamforming is proposed.
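
    The shift-invariance argument can be illustrated generically: when the array sampling (point spread) function is shift-invariant, the dirty map is a convolution and can be deconvolved with FFTs. The sketch below applies a plain Wiener step to a 1-D toy problem; the source field, point spread function, and regularization are assumptions, not the coherent wavespace algorithm itself.

    ```python
    # FFT-based deconvolution of a shift-invariant 1-D point spread function.
    import numpy as np

    n = 256

    # Assumed source field: two plane-wave components (delta functions in wavespace).
    source = np.zeros(n)
    source[60], source[75] = 1.0, 0.6

    # Assumed array sampling function (PSF): a narrow sinc-squared kernel.
    psf = np.sinc((np.arange(n) - n // 2) / 2.0) ** 2
    psf /= psf.sum()

    dirty = np.real(np.fft.ifft(np.fft.fft(source) * np.fft.fft(np.fft.ifftshift(psf))))


    def wiener_deconvolve(dirty, psf, noise_power=1e-3):
        H = np.fft.fft(np.fft.ifftshift(psf))
        W = np.conj(H) / (np.abs(H) ** 2 + noise_power)
        return np.real(np.fft.ifft(np.fft.fft(dirty) * W))


    clean = wiener_deconvolve(dirty, psf)
    for idx in (60, 75):
        print(f"bin {idx}: true={source[idx]:.2f} dirty={dirty[idx]:.3f} clean={clean[idx]:.3f}")
    ```

    Because the kernel is the same at every bin, a single pair of forward and inverse transforms performs the whole deconvolution, which is the computational appeal of a shift-invariant formulation.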

  20. Sandia PUF Analysis Tool

    SciTech Connect

    2014-06-11

    This program is a graphical user interface for measuring and performing inter-active analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
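
    The noise and inter-chip metrics mentioned above can be sketched on synthetic signatures: average intra-chip Hamming distance across repeated measurements (noise) and average pairwise inter-chip Hamming distance. The bit length, error rate, and chip count below are arbitrary assumptions.

    ```python
    # Intra-chip noise and inter-chip Hamming distance on synthetic PUF signatures.
    import numpy as np

    rng = np.random.default_rng(7)
    n_chips, n_meas, n_bits = 8, 20, 128

    ideal = rng.integers(0, 2, size=(n_chips, n_bits))            # per-chip signatures
    flips = rng.random((n_chips, n_meas, n_bits)) < 0.05          # 5% measurement noise
    measurements = np.bitwise_xor(ideal[:, None, :], flips.astype(int))


    def hamming_fraction(a, b):
        return np.count_nonzero(a != b) / a.size


    noise = np.mean([hamming_fraction(measurements[c, m], ideal[c])
                     for c in range(n_chips) for m in range(n_meas)])
    inter = np.mean([hamming_fraction(ideal[i], ideal[j])
                     for i in range(n_chips) for j in range(i + 1, n_chips)])

    print(f"average intra-chip noise : {noise:.3f}   (ideal: close to 0)")
    print(f"average inter-chip HD    : {inter:.3f}   (ideal: close to 0.5)")
    ```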

  1. Sandia PUF Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2014-06-11

    This program is a graphical user interface for measuring and performing inter-active analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.

  2. Tiling Microarray Analysis Tools

    Energy Science and Technology Software Center (ESTSC)

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) Rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment). 2) Post processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation), 3) Significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and Interval refinement (filtering based on multiple statistics, overlap comparisons), 4) Data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and Data reports (spreadsheet summaries and detailed profiles)

  3. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  4. EASY-GOING deconvolution: Automated MQMAS NMR spectrum analysis based on a model with analytical crystallite excitation efficiencies

    NASA Astrophysics Data System (ADS)

    Grimminck, Dennis L. A. G.; van Meerten, Bas; Verkuijlen, Margriet H. W.; van Eck, Ernst R. H.; Leo Meerts, W.; Kentgens, Arno P. M.

    2013-03-01

    The EASY-GOING deconvolution (EGdeconv) program is extended to enable fast and automated fitting of multiple quantum magic angle spinning (MQMAS) spectra guided by evolutionary algorithms. We implemented an analytical crystallite excitation model for spectrum simulation. Currently these efficiencies are limited to two-pulse and z-filtered 3QMAS spectra of spin 3/2 and 5/2 nuclei, whereas for higher spin-quantum numbers ideal excitation is assumed. The analytical expressions are explained in full to avoid ambiguity and facilitate others to use them. The EGdeconv program can fit interaction parameter distributions. It currently includes a Gaussian distribution for the chemical shift and an (extended) Czjzek distribution for the quadrupolar interaction. We provide three case studies to illustrate EGdeconv's capabilities for fitting MQMAS spectra. The EGdeconv program is available as is on our website http://egdeconv.science.ru.nl for 64-bit Linux operating systems.

  5. Heliostat cost-analysis tool

    SciTech Connect

    Brandt, L.D.; Chang, R.E.

    1981-10-01

    Estimated production costs of solar energy systems serve as guides for future component development and as measures of the potential economic viability of the technologies. The analysis of heliostat costs is particularly important since the heliostat field is the largest cost component of a solar central receiver plant. A heliostat cost analysis tool (HELCAT) that processes manufacturing, transportation, and installation cost data has been developed to provide a consistent structure for cost analyses. HELCAT calculates a representative product price based on direct input data (e.g. direct materials, direct labor, capital requirements) and various economic, financial, and accounting assumptions. The characteristics of this tool and its initial application in the evaluation of second generation heliostat cost estimates are discussed. A set of nominal economic and financial parameters is also suggested.
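
    A minimal sketch of the kind of calculation such a cost tool performs is shown below: a representative unit price built up from direct materials, direct labor, overhead, and amortized capital. All rates, the capital-recovery treatment, and the function name are illustrative assumptions, not HELCAT's actual model.

        def heliostat_price(direct_materials, direct_labor, capital, annual_volume,
                            overhead_rate=0.35, capital_recovery_factor=0.15, profit_margin=0.10):
            """Toy representative-price calculation in the spirit of a cost-analysis
            tool; the rates and formula are illustrative assumptions, not HELCAT's."""
            overhead = overhead_rate * direct_labor
            capital_per_unit = capital_recovery_factor * capital / annual_volume
            cost = direct_materials + direct_labor + overhead + capital_per_unit
            return cost * (1.0 + profit_margin)

        print(f"${heliostat_price(1200.0, 300.0, 5.0e6, 10000):.2f} per heliostat")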

  6. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft Excel spreadsheet, that was developed originally at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM code for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool they already have; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  7. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  8. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
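
    For flavor, the sketch below computes a single tensile margin of safety for a preloaded bolt using a textbook simplification (allowable load divided by factored applied load, minus one). The load-sharing factor and numbers are hypothetical, and comBAT's full methodology covers far more checks than this.

        def bolt_margin_of_safety(preload, external_load, load_factor, allowable_tension,
                                  factor_of_safety=1.4):
            """Illustrative tensile margin-of-safety check for a preloaded bolt; a
            textbook simplification, not comBAT's full method."""
            bolt_load = preload + load_factor * external_load   # share of external load seen by bolt
            return allowable_tension / (factor_of_safety * bolt_load) - 1.0

        ms = bolt_margin_of_safety(preload=8000.0, external_load=2500.0,
                                   load_factor=0.25, allowable_tension=16000.0)
        print(f"margin of safety = {ms:+.2f}")   # positive means acceptable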

  9. Windprofiler optimization using digital deconvolution procedures

    NASA Astrophysics Data System (ADS)

    Hocking, W. K.; Hocking, A.; Hocking, D. G.; Garbanzo-Salas, M.

    2014-10-01

    Digital improvements to data acquisition procedures used for windprofiler radars have the potential for improving the height coverage at optimum resolution, and permit improved height resolution. A few newer systems already use this capability. Real-time deconvolution procedures offer even further optimization, but this has not been effectively employed in recent years. In this paper we demonstrate the advantages of combining these features, with particular emphasis on the advantages of real-time deconvolution. Using several multi-core CPUs, we have been able to achieve speeds of up to 40 GHz from a standard commercial motherboard, allowing data to be digitized and processed without the need for any type of hardware except for a transmitter (and associated drivers), a receiver and a digitizer. No Digital Signal Processor chips are needed, allowing great flexibility with analysis algorithms. By using deconvolution procedures, we have been able not only to optimize height resolution, but also to make advances in dealing with spectral contaminants like ground echoes and other near-zero-Hz spectral contamination. Our results also demonstrate the ability to produce fine-resolution measurements, revealing small-scale structures within the backscattered echoes that were previously not possible to see. Resolutions of 30 m are possible for VHF radars. Furthermore, our deconvolution technique allows the removal of range-aliasing effects in real time, a major bonus in many instances. Results are shown using new radars in Canada and Costa Rica.

  10. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

  11. IMPAIR: massively parallel deconvolution on the GPU

    NASA Astrophysics Data System (ADS)

    Sherry, Michael; Shearer, Andy

    2013-02-01

    The IMPAIR software is a high throughput image deconvolution tool for processing large out-of-core datasets of images, varying from large images with spatially varying PSFs to large numbers of images with spatially invariant PSFs. IMPAIR implements a parallel version of the tried and tested Richardson-Lucy deconvolution algorithm regularised via a custom wavelet thresholding library. It exploits the inherently parallel nature of the convolution operation to achieve quality results on consumer grade hardware: through the NVIDIA Tesla GPU implementation, the multi-core OpenMP implementation, and the cluster computing MPI implementation of the software. IMPAIR aims to address the problem of parallel processing in both top-down and bottom-up approaches: by managing the input data at the image level, and by managing the execution at the instruction level. These combined techniques will lead to a scalable solution with minimal resource consumption and maximal load balancing. IMPAIR is being developed as both a stand-alone tool for image processing, and as a library which can be embedded into non-parallel code to transparently provide parallel high throughput deconvolution.
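
    A minimal, single-core sketch of the underlying Richardson-Lucy iteration (without IMPAIR's wavelet regularisation or GPU/MPI parallelism) is given below, assuming a spatially invariant PSF and circular boundary conditions.

        import numpy as np

        def fft_convolve(img, psf):
            """Circular convolution of an image with a centred PSF via the FFT."""
            return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(psf))))

        def richardson_lucy(blurred, psf, iterations=50):
            """Plain Richardson-Lucy iteration; a minimal single-core sketch of the
            algorithm that IMPAIR parallelises, not IMPAIR itself."""
            H = np.fft.fft2(np.fft.ifftshift(psf))
            estimate = np.full_like(blurred, blurred.mean())
            for _ in range(iterations):
                reblurred = np.real(np.fft.ifft2(np.fft.fft2(estimate) * H))
                ratio = blurred / (reblurred + 1e-12)
                estimate *= np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(H)))
            return estimate

        # Toy example: two point sources blurred by a Gaussian PSF.
        n = 64
        truth = np.zeros((n, n)); truth[32, 32] = 1.0; truth[40, 20] = 0.5
        y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
        psf = np.exp(-(x ** 2 + y ** 2) / (2 * 2.0 ** 2)); psf /= psf.sum()
        restored = richardson_lucy(fft_convolve(truth, psf), psf)
        print(np.unravel_index(restored.argmax(), restored.shape))  # ~(32, 32)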

  12. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  13. A posteriori analysis of low-pass spatial filters for approximate deconvolution large eddy simulations of homogeneous incompressible flows

    NASA Astrophysics Data System (ADS)

    San, O.; Staples, A. E.; Iliescu, T.

    2015-01-01

    The goal of this paper is twofold: First, it investigates the effect of low-pass spatial filters for approximate deconvolution large eddy simulation (AD-LES) of turbulent incompressible flows. Second, it proposes the hyper-differential filter as a means of increasing the accuracy of the AD-LES model without increasing the computational cost. Box filters, Padé filters, and differential filters with a wide range of parameters are studied in the AD-LES framework. The AD-LES model, in conjunction with these spatial filters, is tested in the numerical simulation of the three-dimensional Taylor-Green vortex problem. The numerical results are benchmarked against direct numerical simulation (DNS) data. An under-resolved numerical simulation is also used for comparison purposes. Four criteria are used to investigate the AD-LES model equipped with these spatial filters: (i) the time series of the volume-averaged enstrophy; (ii) the volume-averaged third-order structure function; (iii) the L2-norm of the velocity and vorticity errors; and (iv) the volume-averaged velocity and vorticity correlation coefficients. According to these criteria, the numerical results yield the following two conclusions: First, the AD-LES model equipped with any of these spatial filters yields accurate results at a fraction of the computational cost of DNS. Second, the most accurate results are obtained with the hyper-differential filter, followed by the differential filter. We demonstrate that the results highly depend on the selection of the filtering procedure. Although a careful parameter choice makes each class of filters used in this study competitive, it seems that filters whose transfer function resembles that of the Fourier cut-off filter (such as the hyper-differential filters) tend to perform best.
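
    To illustrate the filter family being compared, the sketch below applies a differential filter, (I - δ² d²/dx²) ū = u, to a periodic 1-D signal via the FFT, so that its transfer function 1/(1 + δ²k²) is applied exactly. This is a one-dimensional illustration under assumed periodic boundaries, not the authors' 3-D AD-LES implementation.

        import numpy as np

        def differential_filter_1d(u, delta, L=2 * np.pi):
            """Apply a differential low-pass filter (I - delta^2 d^2/dx^2) ubar = u on a
            periodic 1-D grid via the FFT; an illustration of the filter family studied,
            not the authors' 3-D AD-LES implementation."""
            n = u.size
            k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)          # angular wavenumbers
            return np.real(np.fft.ifft(np.fft.fft(u) / (1.0 + delta ** 2 * k ** 2)))

        x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
        u = np.sin(x) + 0.3 * np.sin(20 * x)                    # resolved + small-scale content
        print(differential_filter_1d(u, delta=0.2)[:4])         # high wavenumbers strongly damped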

  14. Rapid gas chromatographic analysis of less abundant compounds in distilled spirits by direct injection with ethanol-water venting and mass spectrometric data deconvolution.

    PubMed

    Macnamara, Kevin; Lee, Michelle; Robbat, Albert

    2010-01-01

    The principal trace secondary compounds common to fermentation-derived distilled spirits can be rapidly quantified by directly injecting 5 μL of spirit without sample preparation to a narrow-bore 0.15 mm internal diameter capillary column. The ethanol-water is removed in an initial solvent venting step using a programmed temperature vapourization injector, followed by splitless transfer of the target analytes to the column. The larger injection facilitates trace analysis and ethanol-water removal extends column lifetime. Problems of coelution between analytes or with sample matrix were surmounted by using mass spectral deconvolution software for quantification. All operations in the analysis from injection with solvent venting to data reduction are fully automated for unattended sequential sample analysis. The synergy of the various contributory steps combines to offer an effective novel solution for this analysis. Applications include quantification of low ppm amounts of acids and esters and sub-ppm profiling of trace compounds from both the raw material malt and the ageing in wood barrels. PMID:19959175

  15. Bayesian least squares deconvolution

    NASA Astrophysics Data System (ADS)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  16. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

  17. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

  18. Deconvolution using a neural network

    SciTech Connect

    Lehman, S.K.

    1990-11-15

    Viewing one-dimensional deconvolution as a matrix inversion problem, we compare a neural network backpropagation matrix inverse with LMS and pseudo-inverse methods. This is largely an exercise in understanding how our neural network code works. 1 ref.
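
    A pseudo-inverse baseline of the kind referred to can be sketched as follows: build the banded convolution matrix H for a known blur kernel and recover the input with the Moore-Penrose pseudo-inverse. The kernel and signal are toy assumptions, not data from the report.

        import numpy as np

        # Treat 1-D deconvolution as solving y = H x, where H is the banded
        # convolution matrix of the blur kernel, and invert it with the
        # Moore-Penrose pseudo-inverse (one of the non-neural baselines).
        rng = np.random.default_rng(1)
        n = 100
        x_true = np.zeros(n); x_true[[20, 45, 70]] = [1.0, 0.6, 0.8]
        kernel = np.array([0.05, 0.2, 0.5, 0.2, 0.05])

        H = np.zeros((n, n))
        for i in range(n):
            for k, h in enumerate(kernel):
                j = i + k - len(kernel) // 2
                if 0 <= j < n:
                    H[i, j] = h

        y = H @ x_true + 0.001 * rng.standard_normal(n)     # blurred, noisy signal
        x_hat = np.linalg.pinv(H) @ y                        # pseudo-inverse estimate
        print(np.round(x_hat[[20, 45, 70]], 2))              # ~[1.0, 0.6, 0.8]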

  19. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  20. ETAT: Expository Text Analysis Tool.

    PubMed

    Vidal-Abarca, Eduardo; Reyes, Héctor; Gilabert, Ramiro; Calpe, Javier; Soria, Emilio; Graesser, Arthur C

    2002-02-01

    Qualitative methods that analyze the coherence of expository texts not only are time consuming, but also present challenges in collecting data on coding reliability. We describe software that analyzes expository texts more rapidly and produces a notable level of objectivity. ETAT (Expository Text Analysis Tool) analyzes the coherence of expository texts. ETAT adopts a symbolic representational system, known as conceptual graph structures. ETAT follows three steps: segmentation of a text into nodes, classification of the unidentified nodes, and linking the nodes with relational arcs. ETAT automatically constructs a graph in the form of nodes and their interrelationships, along with various attendant statistics and information about noninterrelated, isolated nodes. ETAT was developed in Java, so it is compatible with virtually all computer systems. PMID:12060996

  1. Automated resolution of chromatographic signals by independent component analysis-orthogonal signal deconvolution in comprehensive gas chromatography/mass spectrometry-based metabolomics.

    PubMed

    Domingo-Almenara, Xavier; Perera, Alexandre; Ramírez, Noelia; Brezmes, Jesus

    2016-07-01

    Comprehensive gas chromatography-mass spectrometry (GC×GC-MS) provides a different perspective in metabolomics profiling of samples. However, algorithms for GC×GC-MS data processing are needed in order to automatically process the data and extract the purest information about the compounds appearing in complex biological samples. This study shows the capability of independent component analysis-orthogonal signal deconvolution (ICA-OSD), an algorithm based on blind source separation and distributed in an R package called osd, to extract the spectra of the compounds appearing in GC×GC-MS chromatograms in an automated manner. We studied the performance of ICA-OSD by the quantification of 38 metabolites through a set of 20 Jurkat cell samples analyzed by GC×GC-MS. The quantification by ICA-OSD was compared with a supervised quantification by selective ions, and most of the R² coefficients of determination were in good agreement (R² > 0.90), while up to 24 cases exhibited an excellent linear relation (R² > 0.95). We concluded that ICA-OSD can be used to resolve co-eluted compounds in GC×GC-MS. PMID:27208528
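
    The blind-source-separation idea behind ICA-OSD can be illustrated with scikit-learn's FastICA (assumed to be installed) on two synthetic co-eluting elution profiles observed as mixtures on several channels; this is only a conceptual sketch, not the osd R package or the authors' pipeline.

        import numpy as np
        from sklearn.decomposition import FastICA  # assumes scikit-learn is available

        # Two co-eluting Gaussian elution profiles observed on several m/z channels
        # as different linear mixtures; ICA recovers the underlying profiles.
        t = np.linspace(0, 1, 400)
        s1 = np.exp(-(t - 0.45) ** 2 / (2 * 0.03 ** 2))
        s2 = np.exp(-(t - 0.55) ** 2 / (2 * 0.03 ** 2))
        mixing = np.array([[1.0, 0.2], [0.6, 0.9], [0.1, 1.0], [0.8, 0.5]])
        X = (mixing @ np.vstack([s1, s2])).T + 0.01 * np.random.default_rng(0).standard_normal((400, 4))

        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(X)     # columns approximate s1 and s2 (up to sign/scale)
        print(recovered.shape)               # (400, 2)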

  2. Climate Data Analysis Tools - (CDAT)

    NASA Astrophysics Data System (ADS)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and as a result of its flexibility, PCMDI has integrated other popular software components, such as: the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware software, CDAT's focus is to allow climate researchers the ability to access and analyze multi-dimensional distributed climate datasets.

  3. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools aimed at completing a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements, and some tools developed in-house. The tools that are included in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of such testing are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  4. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of some of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

  5. Deconvolution of the vestibular evoked myogenic potential.

    PubMed

    Lütkenhöner, Bernd; Basel, Türker

    2012-02-01

    The vestibular evoked myogenic potential (VEMP) and the associated variance modulation can be understood by a convolution model. Two functions of time are incorporated into the model: the motor unit action potential (MUAP) of an average motor unit, and the temporal modulation of the MUAP rate of all contributing motor units, briefly called rate modulation. The latter is the function of interest, whereas the MUAP acts as a filter that distorts the information contained in the measured data. Here, it is shown how to recover the rate modulation by undoing the filtering using a deconvolution approach. The key aspects of our deconvolution algorithm are as follows: (1) the rate modulation is described in terms of just a few parameters; (2) the MUAP is calculated by Wiener deconvolution of the VEMP with the rate modulation; (3) the model parameters are optimized using a figure-of-merit function where the most important term quantifies the difference between measured and model-predicted variance modulation. The effectiveness of the algorithm is demonstrated with simulated data. An analysis of real data confirms the view that there are basically two components, which roughly correspond to the waves p13-n23 and n34-p44 of the VEMP. The rate modulation corresponding to the first, inhibitory component is much stronger than that corresponding to the second, excitatory component. But the latter is more extended so that the two modulations have almost the same equivalent rectangular duration. PMID:22079097
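
    Step (2) of the algorithm is a Wiener deconvolution; a generic 1-D version is sketched below on synthetic data, with a toy motor-unit waveform and rate modulation standing in for real VEMP quantities. The SNR constant and waveforms are assumptions, not values from the paper.

        import numpy as np

        def wiener_deconvolve(measured, kernel, snr=100.0):
            """1-D Wiener deconvolution: recover an underlying rate modulation from a
            measured response given a known unit waveform (illustrative only, not the
            authors' parametric fitting algorithm)."""
            n = len(measured)
            K = np.fft.rfft(kernel, n)
            M = np.fft.rfft(measured, n)
            W = np.conj(K) / (np.abs(K) ** 2 + 1.0 / snr)
            return np.fft.irfft(M * W, n)

        # Synthetic example: biphasic rate modulation convolved with a toy MUAP.
        t = np.arange(0, 0.1, 1e-4)                                   # 100 ms at 10 kHz
        rate = np.exp(-((t - 0.015) / 0.003) ** 2) - 0.5 * np.exp(-((t - 0.035) / 0.004) ** 2)
        muap = np.sin(2 * np.pi * 150 * t) * np.exp(-t / 0.004)       # toy motor-unit waveform
        vemp = np.convolve(rate, muap)[:len(t)]
        rate_hat = wiener_deconvolve(vemp, muap)                      # approximates the rate modulation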

  6. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using the scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which has a major impact on the fate of the combustion system. To optimize Atran key factors such as mineral fragmentation and coalescence, the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet application included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam with Newton Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity using the incorporation of grey-scale binning acquired by the SEM image was developed. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is dependent on the chemistry of the particle, it is possible to map chemically similar areas which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which is currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash that will be represented by that mineral on a frame-by-frame basis. The slag viscosity model was improved to provide improved predictions of slag viscosity and temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.

  7. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  8. Deconvolution of gas chromatographic data

    NASA Technical Reports Server (NTRS)

    Howard, S.; Rayborn, G. H.

    1980-01-01

    The use of deconvolution methods on gas chromatographic data to obtain an accurate determination of the relative amounts of each material present by mathematically separating the merged peaks is discussed. Data were obtained on a gas chromatograph with a flame ionization detector. Chromatograms of five xylenes with differing degrees of separation were generated by varying the column temperature at selected rates. The merged peaks were then successfully separated by deconvolution. The concept of function continuation in the frequency domain was introduced in striving to reach the theoretical limit of accuracy, but proved to be only partially successful.

  9. Deconvolution using the complex cepstrum

    SciTech Connect

    Riley, H B

    1980-12-01

    The theory, description, and implementation of a generalized linear filtering system for the nonlinear filtering of convolved signals are presented. A detailed look at the problems and requirements associated with the deconvolution of signal components is undertaken. Related properties are also developed. A synthetic example is shown and is followed by an application using real seismic data. 29 figures.
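
    A minimal complex-cepstrum computation (FFT, log magnitude plus unwrapped phase, inverse FFT) is sketched below on a synthetic wavelet-times-reflectivity trace; practical seismic implementations add exponential weighting and careful phase handling that are omitted here.

        import numpy as np

        def complex_cepstrum(x):
            """Complex cepstrum via FFT -> log magnitude + unwrapped phase -> inverse FFT.
            A minimal sketch; production seismic use needs more careful phase handling."""
            X = np.fft.fft(x)
            log_X = np.log(np.abs(X) + 1e-12) + 1j * np.unwrap(np.angle(X))
            return np.real(np.fft.ifft(log_X))

        # In the cepstral domain convolution becomes addition, so a smooth wavelet
        # (low quefrency) can be separated from a sparse reflectivity by liftering.
        wavelet = np.exp(-np.arange(64) / 6.0) * np.sin(np.arange(64) / 2.0)
        reflectivity = np.zeros(256); reflectivity[[40, 90, 170]] = [1.0, -0.7, 0.5]
        trace = np.convolve(reflectivity, wavelet)[:256]
        ceps = complex_cepstrum(trace)
        print(ceps[:8].round(3))   # low-quefrency samples largely reflect the wavelet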

  10. Modeling of pharmacokinetic systems using stochastic deconvolution.

    PubMed

    Kakhi, Maziar; Chittenden, Jason

    2013-12-01

    In environments where complete mechanistic knowledge of the system dynamics is not available, a synergy of first-principle concepts, stochastic methods and statistical approaches can provide an efficient, accurate, and insightful strategy for model development. In this work, a system of ordinary differential equations describing system pharmacokinetics (PK) was coupled to a Wiener process for tracking the absorption rate coefficient, and was embedded in a nonlinear mixed effects population PK formalism. The procedure is referred to as "stochastic deconvolution" and it is proposed as a diagnostic tool to inform on a mapping function between the fraction of the drug absorbed and the fraction of the drug dissolved when applying one-stage methods to in vitro-in vivo correlation modeling. The goal of this work was to show that stochastic deconvolution can infer an a priori specified absorption profile given dense observational (simulated) data. The results demonstrate that the mathematical model is able to accurately reproduce the simulated data in scenarios where solution strategies for linear, time-invariant systems would assuredly fail. To this end, PK systems that are representative of Michaelis-Menten kinetics and enterohepatic circulation were investigated. Furthermore, the solution times are manageable using a modest computer hardware platform. PMID:24174399

  11. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  12. Model Analysis ToolKit

    SciTech Connect

    Harp, Dylan R.

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy: calibration (scipy.optimize)
    - rpy2: Python interface to R
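
    The workflow MATK wraps (define a model, sample parameters, run the model over the samples) can be sketched generically with SciPy's Latin-Hypercube sampler; the forward model and parameter bounds below are hypothetical, and the code does not use MATK's own API.

        import numpy as np
        from scipy.stats import qmc   # SciPy's quasi-Monte Carlo module

        def model(k, s):
            """Stand-in forward model (hypothetical): a decaying exponential observed at t = 1, 2, 3."""
            t = np.array([1.0, 2.0, 3.0])
            return s * np.exp(-k * t)

        # Latin-Hypercube sample of the two parameters, then forward runs over the sampleset.
        sampler = qmc.LatinHypercube(d=2, seed=0)
        unit = sampler.random(n=50)
        params = qmc.scale(unit, l_bounds=[0.1, 0.5], u_bounds=[2.0, 5.0])   # ranges for k and s
        runs = np.array([model(k, s) for k, s in params])
        print(runs.mean(axis=0), runs.std(axis=0))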

  13. Model Analysis ToolKit

    Energy Science and Technology Software Center (ESTSC)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy: calibration (scipy.optimize)
    - rpy2: Python interface to R

  14. 2010 Solar Market Transformation Analysis and Tools

    SciTech Connect

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  15. Budget Risk & Prioritization Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  16. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  17. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
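
    As a sketch of the linear regression-ANOVA calculation mentioned above, the snippet below fits y = a + b·x and forms the regression F-statistic from the usual sums of squares; it uses generic textbook formulas, not the Excel toolset itself.

        import numpy as np

        def linear_regression_anova(x, y):
            """Fit y = a + b*x and test the slope's significance with the standard
            regression ANOVA F-statistic (generic formulas, not the Excel toolset)."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            n = len(x)
            b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
            a = y.mean() - b * x.mean()
            y_hat = a + b * x
            ss_reg = np.sum((y_hat - y.mean()) ** 2)          # regression, 1 degree of freedom
            ss_res = np.sum((y - y_hat) ** 2)                 # residual, n - 2 degrees of freedom
            f_stat = ss_reg / (ss_res / (n - 2))
            return a, b, f_stat

        a, b, f = linear_regression_anova([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 8.1, 9.8])
        print(f"intercept={a:.2f} slope={b:.2f} F={f:.1f}")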

  18. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  19. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  20. Deconvolution procedure of the UV-vis spectra. A powerful tool for the estimation of the binding of a model drug to specific solubilisation loci of bio-compatible aqueous surfactant-forming micelle.

    PubMed

    Calabrese, Ilaria; Merli, Marcello; Turco Liveri, Maria Liria

    2015-05-01

    UV-vis-spectra evolution of Nile Red loaded into Tween 20 micelles with pH and [Tween 20] have been analysed in a non-conventional manner by exploiting the deconvolution method. The number of buried sub-bands has been found to depend on both pH and bio-surfactant concentration, whose positions have been associated to Nile Red confined in aqueous solution and in the three micellar solubilisation sites. For the first time, by using an extended classical two-pseudo-phases-model, the robust treatment of the spectrophotometric data allows the estimation of Nile Red binding constant to the available loci. Hosting capability towards Nile Red is exalted by the pH enhancement. Comparison between binding constant values classically evaluated and those estimated by the deconvolution protocol unveiled that overall binding values perfectly match with the mean values of the local binding sites. This result suggests that deconvolution procedure provides more precise and reliable values, which are more representative of drug confinement. PMID:25703359
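
    The core numerical step, resolving a measured band envelope into Gaussian sub-bands, can be sketched with a standard least-squares fit; below, SciPy's curve_fit recovers two synthetic sub-bands. The band count, positions, and widths are invented for illustration; in the paper they depend on pH and [Tween 20].

        import numpy as np
        from scipy.optimize import curve_fit

        def two_gaussians(wl, a1, c1, w1, a2, c2, w2):
            """Sum of two Gaussian sub-bands; amplitudes, centres and widths are free parameters."""
            return (a1 * np.exp(-(wl - c1) ** 2 / (2 * w1 ** 2))
                    + a2 * np.exp(-(wl - c2) ** 2 / (2 * w2 ** 2)))

        # Synthetic absorption spectrum standing in for a dye band envelope.
        wl = np.linspace(450, 650, 400)
        truth = two_gaussians(wl, 0.8, 530, 18, 0.5, 575, 25)
        spectrum = truth + 0.005 * np.random.default_rng(0).standard_normal(wl.size)

        p0 = [0.7, 525, 15, 0.4, 580, 20]                   # rough initial guesses
        popt, _ = curve_fit(two_gaussians, wl, spectrum, p0=p0)
        print(np.round(popt, 1))                            # recovered sub-band parameters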

  1. Deconvolution procedure of the UV-vis spectra. A powerful tool for the estimation of the binding of a model drug to specific solubilisation loci of bio-compatible aqueous surfactant-forming micelle

    NASA Astrophysics Data System (ADS)

    Calabrese, Ilaria; Merli, Marcello; Turco Liveri, Maria Liria

    2015-05-01

    UV-vis-spectra evolution of Nile Red loaded into Tween 20 micelles with pH and [Tween 20] have been analysed in a non-conventional manner by exploiting the deconvolution method. The number of buried sub-bands has been found to depend on both pH and bio-surfactant concentration, whose positions have been associated to Nile Red confined in aqueous solution and in the three micellar solubilisation sites. For the first time, by using an extended classical two-pseudo-phases-model, the robust treatment of the spectrophotometric data allows the estimation of Nile Red binding constant to the available loci. Hosting capability towards Nile Red is exalted by the pH enhancement. Comparison between binding constant values classically evaluated and those estimated by the deconvolution protocol unveiled that overall binding values perfectly match with the mean values of the local binding sites. This result suggests that deconvolution procedure provides more precise and reliable values, which are more representative of drug confinement.

  2. Photogrammetry Tool for Forensic Analysis

    NASA Technical Reports Server (NTRS)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.

  3. Genomic sequence analysis tools: a user's guide.

    PubMed

    Fortna, A; Gardiner, K

    2001-03-01

    The wealth of information from various genome sequencing projects provides the biologist with a new perspective from which to analyze, and design experiments with, mammalian systems. The complexity of the information, however, requires new software tools, and numerous such tools are now available. Which type and which specific system is most effective depends, in part, upon how much sequence is to be analyzed and with what level of experimental support. Here we survey a number of mammalian genomic sequence analysis systems with respect to the data they provide and the ease of their use. The hope is to aid the experimental biologist in choosing the most appropriate tool for their analyses. PMID:11226611

  4. Built Environment Energy Analysis Tool Overview (Presentation)

    SciTech Connect

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  5. A Clearer Picture of Total Variation Blind Deconvolution.

    PubMed

    Perrone, Daniele; Favaro, Paolo

    2016-06-01

    Blind deconvolution is the problem of recovering a sharp image and a blur kernel from a noisy blurry image. Recently, there has been a significant effort on understanding the basic mechanisms to solve blind deconvolution. While this effort resulted in the deployment of effective algorithms, the theoretical findings generated contrasting views on why these approaches worked. On the one hand, one could observe experimentally that alternating energy minimization algorithms converge to the desired solution. On the other hand, it has been shown that such alternating minimization algorithms should fail to converge and one should instead use a so-called Variational Bayes approach. To clarify this conundrum, recent work showed that a good image and blur prior is instead what makes a blind deconvolution algorithm work. Unfortunately, this analysis did not apply to algorithms based on total variation regularization. In this manuscript, we provide both analysis and experiments to get a clearer picture of blind deconvolution. Our analysis reveals the very reason why an algorithm based on total variation works. We also introduce an implementation of this algorithm and show that, in spite of its extreme simplicity, it is very robust and achieves a performance comparable to the top performing algorithms. PMID:26372205
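
    To make the discussion concrete, the sketch below implements one (sub)gradient step on the image sub-problem of a TV-regularised deconvolution energy; a blind scheme of the kind analysed alternates such steps with a kernel update, which is omitted here. The step size, smoothing constant, and circular boundary handling are assumptions, and this is not the authors' implementation.

        import numpy as np

        def tv_image_step(u, f, psf, lam=0.01, tau=0.2):
            """One gradient step on the image sub-problem of the energy
            ||k*u - f||^2 + lam*TV(u), with TV smoothed for differentiability.
            Illustrative sketch only; blind schemes alternate this with a kernel update."""
            H = np.fft.fft2(np.fft.ifftshift(psf))
            residual = np.real(np.fft.ifft2(np.fft.fft2(u) * H)) - f          # k*u - f
            data_grad = 2.0 * np.real(np.fft.ifft2(np.fft.fft2(residual) * np.conj(H)))
            gx = np.roll(u, -1, axis=1) - u                                   # forward differences
            gy = np.roll(u, -1, axis=0) - u
            mag = np.sqrt(gx ** 2 + gy ** 2 + 1e-8)
            px, py = gx / mag, gy / mag
            div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
            return u - tau * (data_grad - lam * div)                          # descent step

        # Typical use: start from u = f (the blurry image) and iterate tv_image_step.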

  6. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar, G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  7. Performance analysis of GYRO: a tool evaluation

    NASA Astrophysics Data System (ADS)

    Worley, P.; Candy, J.; Carrington, L.; Huck, K.; Kaiser, T.; Mahinthakumar, G.; Malony, A.; Moore, S.; Reed, D.; Roth, P.; Shan, H.; Shende, S.; Snavely, A.; Sreepathi, S.; Wolf, F.; Zhang, Y.

    2005-01-01

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wallclock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  8. A new scoring function for top-down spectral deconvolution

    SciTech Connect

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.

  9. A new scoring function for top-down spectral deconvolution

    DOE PAGESBeta

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.
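
    As a generic illustration of what "evaluating an isotopomer envelope" involves, the sketch below scores an observed envelope against a theoretical isotope distribution with a simple normalized dot product. It is emphatically not the L-score defined in the paper; the function name and the example intensities are hypothetical.

    import numpy as np

    def envelope_similarity(observed, theoretical):
        # Cosine-style similarity between an observed isotopomer envelope and a
        # theoretical isotope distribution, both given as peak-intensity vectors.
        obs = np.asarray(observed, float)
        theo = np.asarray(theoretical, float)
        obs = obs / (np.linalg.norm(obs) + 1e-12)
        theo = theo / (np.linalg.norm(theo) + 1e-12)
        return float(np.dot(obs, theo))

    # Hypothetical observed envelope compared against a hypothetical theoretical pattern.
    print(envelope_similarity([0.30, 1.00, 0.85, 0.45, 0.18],
                              [0.32, 1.00, 0.90, 0.50, 0.20]))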

  10. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  11. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
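
    The merge-and-flag step described above can be pictured with the following simplified sketch. It is not the tool's PHP code; the data layout, the fixed keep-out window, and the function name are assumptions, and the one-way light-time conversion is omitted.

    from datetime import datetime, timedelta

    def merge_events(mro_events, mer_events, keepout=timedelta(minutes=5)):
        # Merge two lists of (time, description) events into one time-ordered listing
        # and flag MER events that fall inside a keep-out window around any MRO event.
        merged = sorted([(t, "MRO", d) for t, d in mro_events] +
                        [(t, "MER", d) for t, d in mer_events])
        flagged = []
        for t, craft, desc in merged:
            conflict = craft == "MER" and any(abs(t - mt) <= keepout for mt, _ in mro_events)
            flagged.append((t, craft, desc, conflict))
        return flagged

    events = merge_events(
        mro_events=[(datetime(2008, 1, 1, 10, 0), "occultation start")],
        mer_events=[(datetime(2008, 1, 1, 10, 3), "X-band uplink window")],
    )
    for t, craft, desc, conflict in events:
        print(t.isoformat(), craft, desc, "CONFLICT" if conflict else "")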

  12. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  13. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  14. From sensor networks to connected analysis tools

    NASA Astrophysics Data System (ADS)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application-specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which require either real-time data or large data samples. As well as developing domain-specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the applications and web services developed so far, as well as those to be developed in the future.

  15. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.

  16. Blind deconvolution applied to acoustical systems identification with supporting experimental results

    NASA Astrophysics Data System (ADS)

    Roan, Michael J.; Gramann, Mark R.; Erling, Josh G.; Sibul, Leon H.

    2003-10-01

    Many acoustical applications require the analysis of a signal that is corrupted by an unknown filtering function. Examples arise in the areas of noise or vibration control, room acoustics, structural vibration analysis, and speech processing. Here, the observed signal can be modeled as the convolution of the desired signal with an unknown system impulse response. Blind deconvolution refers to the process of learning the inverse of this unknown impulse response and applying it to the observed signal to remove the filtering effects. Unlike classical deconvolution, which requires prior knowledge of the impulse response, blind deconvolution requires only reasonable prior estimates of the input signal's statistics. The significant contribution of this work lies in experimental verification of a blind deconvolution algorithm in the context of acoustical system identification. Previous experimental work concerning blind deconvolution in acoustics has been minimal, as previous literature concerning blind deconvolution uses computer simulated data. This paper examines experiments involving three classical acoustic systems: driven pipe, driven pipe with open side branch, and driven pipe with Helmholtz resonator side branch. Experimental results confirm that the deconvolution algorithm learns these systems' inverse impulse responses, and that application of these learned inverses removes the effects of the filters.

  17. Virtual tools for teaching electrocardiographic rhythm analysis.

    PubMed

    Criley, John Michael; Nelson, William P

    2006-01-01

    Electrocardiographic (ECG) rhythm analysis is inadequately taught in training programs, resulting in undertrained physicians reading a large percentage of the 40 million ECGs recorded annually. The effective use of simple tools (calipers, ruler, and magnifier) required for crucial measurements and comparisons of intervals requires considerable time for interactive instruction and is difficult to teach in the classroom. The ECGViewer (Blaufuss Medical Multimedia Laboratories, Palo Alto, Calif) program was developed using virtual tools easily manipulated by computer mouse that can be used to analyze archived scanned ECGs on computer screens and classroom projection. Trainees manipulate the on-screen tools from their seats by wireless mouse while the instructor makes corrections with a second mouse, in clear view of the trainees. An on-screen ladder diagram may be constructed by the trainee and critiqued by the instructor. The ECGViewer program has been successfully used and well received by trainees at medical school, residency, and subspecialty fellow levels. PMID:16387064

  18. SMART: Spectroscopic Modeling Analysis and Reduction Tool

    NASA Astrophysics Data System (ADS)

    IRS Team at Cornell University

    2012-10-01

    SMART is an IDL-based software tool, developed by the IRS Instrument Team at Cornell University, that allows users to reduce and analyze Spitzer data from all four modules of the Infrared Spectrograph, including the peak-up arrays. The software is designed to make full use of the ancillary files generated in the Spitzer Science Center pipeline so that it can either remove or flag artifacts and corrupted data and maximize the signal-to-noise ratio in the extraction routines. It can be run in both interactive and batch modes. SMART includes visualization tools for assessing data quality, basic arithmetic operations for either two-dimensional images or one-dimensional spectra, extraction of both point and extended sources, and a suite of spectral analysis tools.

  19. Fairing Separation Analysis Using SepTOOL

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Kosareo, Daniel N.

    2015-01-01

    This document describes the relevant equations programmed in spreadsheet software, SepTOOL, developed by ZIN Technologies, Inc. (ZIN) to determine the separation clearance between a launch vehicle payload fairing and remaining stages. The software uses closed form rigid body dynamic solutions of the vehicle in combination with flexible body dynamics of the fairing, which are obtained from flexible body dynamic analysis or from test data, and superimposes the two results to obtain minimum separation clearance for any given set of flight trajectory conditions. Using closed form solutions allows SepTOOL to perform separation calculations several orders of magnitude faster than numerical methods, which allows users to perform real-time parameter studies. Moreover, SepTOOL can optimize vehicle performance to minimize separation clearance. This tool can evaluate various shapes and sizes of fairings along with different vehicle configurations and trajectories. These geometries and parameters are entered through a user-friendly interface. Although the software was specifically developed for evaluating the separation clearance of launch vehicle payload fairings, separation dynamics of other launch vehicle components can be evaluated provided that aerodynamic loads acting on the vehicle during the separation event are negligible. This document describes the development of SepTOOL, presenting the analytical procedure and theoretical equations; the implementation of these equations is not disclosed. Realistic examples are presented, and the results are verified with ADAMS (MSC Software Corporation) simulations. It should be noted that SepTOOL is a preliminary separation clearance assessment software for payload fairings and should not be used for final clearance analysis.

  20. Rock fracture characterization with GPR by means of deterministic deconvolution

    NASA Astrophysics Data System (ADS)

    Arosio, Diego

    2016-03-01

    In this work I address GPR characterization of rock fracture parameters, namely thickness and filling material. Rock fractures can generally be considered as thin beds, i.e., two interfaces whose separation is smaller than the resolution limit dictated by Rayleigh's criterion. The analysis of the amplitude of the thin bed response in the time domain might permit estimation of fracture features for arbitrarily thin beds, but it is difficult to achieve and could be applied only to favorable cases (i.e., when all factors affecting amplitude are identified and corrected for). Here I explore the possibility of estimating fracture thickness and filling in the frequency domain by means of GPR. After introducing some theoretical aspects of thin bed response, I simulate GPR data on sandstone blocks with air- and water-filled fractures of known thickness. On the basis of some simplifying assumptions, I propose a 4-step procedure in which deterministic deconvolution is used to retrieve the magnitude and phase of the thin bed response in the selected frequency band. After deconvolved curves are obtained, fracture thickness and filling are estimated by means of a fitting process, which presents higher sensitivity to fracture thickness. Results are encouraging and suggest that GPR could be a fast and effective tool to determine fracture parameters in a non-destructive manner. Further GPR experiments in the lab are needed to test the proposed processing sequence and to validate the results obtained so far.
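
    A bare-bones frequency-domain version of deterministic deconvolution with a known source wavelet might look like the sketch below. It is not the author's 4-step procedure; the stabilization constant, band limits, and function name are assumptions.

    import numpy as np

    def deterministic_deconv(trace, wavelet, dt, fmin, fmax, eps=1e-2):
        # Deconvolve a GPR trace by a known (measured) source wavelet and return the
        # frequencies, magnitude, and unwrapped phase of the deconvolved response
        # inside the band [fmin, fmax]; eps stabilizes the spectral division.
        n = max(len(trace), len(wavelet))
        T = np.fft.rfft(trace, n)
        W = np.fft.rfft(wavelet, n)
        f = np.fft.rfftfreq(n, dt)
        denom = np.maximum(np.abs(W) ** 2, eps * np.max(np.abs(W) ** 2))
        R = T * np.conj(W) / denom
        band = (f >= fmin) & (f <= fmax)
        return f[band], np.abs(R[band]), np.unwrap(np.angle(R[band]))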

  1. Deconvolution of dynamic mechanical networks

    PubMed Central

    Hinczewski, Michael; von Hansen, Yann; Netz, Roland R.

    2010-01-01

    Time-resolved single-molecule biophysical experiments yield data that contain a wealth of dynamic information, in addition to the equilibrium distributions derived from histograms of the time series. In typical force spectroscopic setups the molecule is connected via linkers to a readout device, forming a mechanically coupled dynamic network. Deconvolution of equilibrium distributions, filtering out the influence of the linkers, is a straightforward and common practice. We have developed an analogous dynamic deconvolution theory for the more challenging task of extracting kinetic properties of individual components in networks of arbitrary complexity and topology. Our method determines the intrinsic linear response functions of a given object in the network, describing the power spectrum of conformational fluctuations. The practicality of our approach is demonstrated for the particular case of a protein linked via DNA handles to two optically trapped beads at constant stretching force, which we mimic through Brownian dynamics simulations. Each well in the protein free energy landscape (corresponding to folded, unfolded, or possibly intermediate states) will have its own characteristic equilibrium fluctuations. The associated linear response function is rich in physical content, because it depends both on the shape of the well and its diffusivity—a measure of the internal friction arising from such processes as the transient breaking and reformation of bonds in the protein structure. Starting from the autocorrelation functions of the equilibrium bead fluctuations measured in this force clamp setup, we show how an experimentalist can accurately extract the state-dependent protein diffusivity using a straightforward two-step procedure. PMID:21118989
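
    The quantity the dynamic deconvolution starts from, the autocorrelation of the measured bead fluctuations, can be computed as in the sketch below; the deconvolution step itself, which removes the linker response, is not reproduced here, and the synthetic series is only a stand-in for real force-clamp data.

    import numpy as np

    def autocorrelation(x):
        # Unbiased autocorrelation of an equilibrium fluctuation time series.
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        acf = np.correlate(x, x, mode="full")[n - 1:]
        return acf / (n - np.arange(n))      # divide by the number of overlapping samples

    # Synthetic AR(1)/Ornstein-Uhlenbeck-like series standing in for bead fluctuations.
    rng = np.random.default_rng(0)
    x = np.zeros(10000)
    for i in range(1, len(x)):
        x[i] = 0.99 * x[i - 1] + rng.normal(scale=0.1)
    print(autocorrelation(x)[:5])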

  2. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  3. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  4. Fuzzy logic components for iterative deconvolution systems

    NASA Astrophysics Data System (ADS)

    Northan, Brian M.

    2013-02-01

    Deconvolution systems rely heavily on expert knowledge and would benefit from approaches that capture this expert knowledge. Fuzzy logic is an approach that is used to capture expert knowledge rules and produce outputs that range in degree. This paper describes a fuzzy-deconvolution-system that integrates traditional Richardson-Lucy deconvolution with fuzzy components. The system is intended for restoration of 3D widefield images taken under conditions of refractive index mismatch. The system uses a fuzzy rule set for calculating sample refractive index, a fuzzy median filter for inter-iteration noise reduction, and a fuzzy rule set for stopping criteria.
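
    The structure of such a system can be sketched as ordinary Richardson-Lucy iterations with a noise-reduction filter applied between iterations. The sketch below uses a plain (crisp) three-point median in place of the paper's fuzzy median filter and omits the fuzzy rule sets; names and parameters are assumptions.

    import numpy as np

    def richardson_lucy_1d(observed, psf, iters=50, denoise_every=5):
        # Richardson-Lucy deconvolution with a median filter applied every few
        # iterations as inter-iteration noise reduction.
        observed = np.asarray(observed, float)
        psf = np.asarray(psf, float)
        psf = psf / psf.sum()
        psf_m = psf[::-1]                            # mirrored PSF for the correction step
        est = np.full_like(observed, observed.mean())
        for it in range(1, iters + 1):
            blurred = np.convolve(est, psf, mode="same")
            ratio = observed / np.maximum(blurred, 1e-12)
            est = est * np.convolve(ratio, psf_m, mode="same")
            if it % denoise_every == 0:
                padded = np.pad(est, 1, mode="edge")  # ordinary 3-point median filter
                est = np.median(np.stack([padded[:-2], padded[1:-1], padded[2:]]), axis=0)
        return est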

  5. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori Law and the Gutenberg Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The Javascript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
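
    Two of the statistical properties mentioned above have simple closed forms; the sketch below shows the standard Aki/Utsu maximum-likelihood b-value estimate and the modified Omori rate. It is an illustration in Python rather than the project's JavaScript/Java libraries, and the example catalog values are made up.

    import math

    def b_value(magnitudes, completeness_mag, bin_width=0.1):
        # Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter b-value for
        # events at or above the magnitude of completeness, with a half-bin correction.
        mags = [m for m in magnitudes if m >= completeness_mag]
        mean_m = sum(mags) / len(mags)
        return math.log10(math.e) / (mean_m - (completeness_mag - bin_width / 2.0))

    def omori_rate(t, K, c, p):
        # Modified Omori law: aftershock rate K / (t + c)^p at time t after the mainshock.
        return K / (t + c) ** p

    print(b_value([2.1, 2.4, 2.2, 3.0, 2.6, 2.8, 3.5, 2.3], completeness_mag=2.0))
    print(omori_rate(t=1.0, K=100.0, c=0.05, p=1.1))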

  6. Bayesian approach based blind image deconvolution with fuzzy median filter

    NASA Astrophysics Data System (ADS)

    Mohan, S. Chandra; Rajan, K.; Srinivasan, R.

    2011-10-01

    The inverse problem associated with reconstruction of Poisson blurred images has attracted attention in recent years. In this paper, we propose an alternative unified approach to the blind image deconvolution problem using a fuzzy median filter as a Gibbs prior to model the nature of inter-pixel interaction for better edge-preserving reconstruction. The performance of the algorithm at various SNR levels has been studied quantitatively using PSNR, RMSE and universal quality index (UQI). Comparative analysis with existing methods has also been carried out.
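
    The three figures of merit named above are easy to state; the sketch below gives straightforward global implementations (the original UQI is defined over sliding windows and then averaged, which is omitted here). Function names and the peak value are assumptions.

    import numpy as np

    def rmse(x, y):
        x, y = np.asarray(x, float), np.asarray(y, float)
        return float(np.sqrt(np.mean((x - y) ** 2)))

    def psnr(x, y, peak=255.0):
        # Peak signal-to-noise ratio in dB for images with the given peak value.
        return float(20.0 * np.log10(peak / rmse(x, y)))

    def uqi(x, y):
        # Global universal quality index of Wang and Bovik, computed over the whole image.
        x, y = np.asarray(x, float), np.asarray(y, float)
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cov = ((x - mx) * (y - my)).mean()
        return float(4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2) + 1e-12))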

  7. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  8. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  9. Minimum entropy deconvolution and blind equalisation

    NASA Technical Reports Server (NTRS)

    Satorius, E. H.; Mulligan, J. J.

    1992-01-01

    Relationships between minimum entropy deconvolution, developed primarily for geophysics applications, and blind equalization are pointed out. It is seen that a large class of existing blind equalization algorithms are directly related to the scale-invariant cost functions used in minimum entropy deconvolution. Thus the extensive analyses of these cost functions can be directly applied to blind equalization, including the important asymptotic results of Donoho.
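
    The scale-invariant cost at the heart of minimum entropy deconvolution is the varimax (normalized kurtosis) norm; the sketch below evaluates it and maximizes it over a short FIR filter with a naive numerical gradient. This is an illustration of the objective only, not Wiggins' original iterative solver, and the tap count and step sizes are arbitrary.

    import numpy as np

    def varimax(y):
        # Scale-invariant MED objective: V(y) = sum(y^4) / (sum(y^2))^2.
        y = np.asarray(y, float)
        return float(np.sum(y ** 4) / (np.sum(y ** 2) ** 2 + 1e-12))

    def med_filter(x, ntaps=15, iters=200, step=1e-2):
        # Naive gradient ascent on varimax(f * x) over the FIR coefficients f.
        x = np.asarray(x, float)
        f = np.zeros(ntaps)
        f[ntaps // 2] = 1.0                              # start from a spike (identity filter)
        for _ in range(iters):
            base = varimax(np.convolve(x, f, "same"))
            grad = np.zeros(ntaps)
            for i in range(ntaps):                       # numerical gradient, fine for a sketch
                fp = f.copy()
                fp[i] += 1e-5
                grad[i] = (varimax(np.convolve(x, fp, "same")) - base) / 1e-5
            f += step * grad / (np.linalg.norm(grad) + 1e-12)
        return f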

  10. Microfracturing and new tools improve formation analysis

    SciTech Connect

    McMechan, D.E.; Venditto, J.J.; Heemstra, T. . Power and Environment Committee); Simpson, G. ); Friend, L.L.; Rothman, E. )

    1992-12-07

    This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wire line logs, and temperature logging in air-filled holes, which are new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are usually created at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open hole parameters for designing the main fracture treatment.

  11. DEVELOPING NEW TOOLS FOR POLICY ANALYSIS

    SciTech Connect

    Donovan, Richard L.; Schwartz, Michael J.; Selby, Kevin B.; Uecker, Norma J.

    2010-08-11

    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U. S. Department of Energy (DOE) regulations and directives related to safeguards and security (S&S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire (S&S) directives set will be conducted using software tools to analyze the present directives with a view toward 1) identifying areas of positive synergism among topical areas, 2) identifying areas of unnecessary duplication within and among topical areas, and 3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

  12. GIS-based hydrogeochemical analysis tools (QUIMET)

    NASA Astrophysics Data System (ADS)

    Velasco, V.; Tubau, I.; Vázquez-Suñè, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernàndez-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

    2014-09-01

    A software platform (QUIMET) was developed to improve the sorting, analysis, calculations, visualizations, and interpretations of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The instruments for analysis cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schöeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows performing a complete statistical analysis of the data including descriptive statistic univariate and bivariate analysis, the latter including generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.
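
    One of the routine checks listed above, the ionic balance, reduces to a single formula; the sketch below computes the percent charge-balance error from cation and anion sums in meq/L. The sample concentrations are invented for illustration.

    def charge_balance_error(cations_meq, anions_meq):
        # Percent charge-balance error; |CBE| below roughly 5-10% is usually taken
        # as acceptable analytical quality for a groundwater sample.
        c, a = sum(cations_meq.values()), sum(anions_meq.values())
        return 100.0 * (c - a) / (c + a)

    cbe = charge_balance_error(
        cations_meq={"Ca": 4.2, "Mg": 1.9, "Na": 3.1, "K": 0.2},
        anions_meq={"HCO3": 4.6, "Cl": 2.8, "SO4": 1.7, "NO3": 0.2},
    )
    print(round(cbe, 2))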

  13. Data analysis tool for comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry.

    PubMed

    Castillo, Sandra; Mattila, Ismo; Miettinen, Jarkko; Orešič, Matej; Hyötyläinen, Tuulia

    2011-04-15

    Data processing and identification of unknown compounds in comprehensive two-dimensional gas chromatography combined with time-of-flight mass spectrometry (GC×GC/TOFMS) analysis is a major challenge, particularly when large sample sets are analyzed. Herein, we present a method for efficient treatment of large data sets produced by GC×GC/TOFMS implemented as a freely available open source software package, Guineu. To handle large data sets and to efficiently utilize all the features available in the vendor software (baseline correction, mass spectral deconvolution, peak picking, integration, library search, and signal-to-noise filtering), data preprocessed by instrument software are used as a starting point for further processing. Our software affords alignment of the data, normalization, data filtering, and utilization of retention indexes in the verification of identification as well as a novel tool for automated group-type identification of the compounds. Herein, different features of the software are studied in detail and the performance of the system is verified by the analysis of a large set of standard samples as well as of a large set of authentic biological samples, including the control samples. The quantitative features of our GC×GC/TOFMS methodology are also studied to further demonstrate the method performance and the experimental results confirm the reliability of the developed procedure. The methodology has already been successfully used for the analysis of several thousand samples in the field of metabolomics. PMID:21434611

  14. Integrated FDIR Analysis Tool for Space Applications

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni; Di Tommaso, Umberto

    2013-08-01

    The crucial role of health management in space applications has been the subject of many studies carried out by NASA and ESA and is held in high regard by Thales Alenia Space. The common objective is to improve reliability and availability of space systems. This paper will briefly illustrate the evolution of IDEHAS (IntegrateD Engineering Harness Avionics and Software), an advanced tool currently used in Thales Alenia Space - Italy in several space programs and recently enhanced to fully support FDIR (Fault Detection Isolation and Recovery) analysis. The FDIR analysis logic flow will be presented, emphasizing the improvements offered to Mission Support & Operations activities. Finally the benefits provided to the Company and a list of possible future enhancements will be given.

  15. Blind source deconvolution for deep Earth seismology

    NASA Astrophysics Data System (ADS)

    Stefan, W.; Renaut, R.; Garnero, E. J.; Lay, T.

    2007-12-01

    We present an approach to automatically estimate an empirical source characterization of deep earthquakes recorded teleseismically and subsequently remove the source from the recordings by applying regularized deconvolution. A principal goal in this work is to effectively deblur the seismograms, resulting in more impulsive and narrower pulses, permitting better constraints in high resolution waveform analyses. Our method consists of two stages: (1) we first estimate the empirical source by automatically registering traces to their 1st principal component with a weighting scheme based on their deviation from this shape; we then use this shape as an estimation of the earthquake source. (2) We compare different deconvolution techniques to remove the source characteristic from the trace. In particular, Total Variation (TV) regularized deconvolution is used, which utilizes the fact that most natural signals have an underlying sparseness in an appropriate basis, in this case, impulsive onsets of seismic arrivals. We show several examples of deep focus Fiji-Tonga region earthquakes for the phases S and ScS, comparing source responses for the separate phases. TV deconvolution is compared to water level deconvolution, Tikhonov deconvolution, and L1 norm deconvolution, for both data and synthetics. This approach significantly improves our ability to study subtle waveform features that are commonly masked by either noise or the earthquake source. Eliminating source complexities improves our ability to resolve deep mantle triplications, waveform complexities associated with possible double crossings of the post-perovskite phase transition, as well as increasing stability in waveform analyses used for deep mantle anisotropy measurements.
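
    Stage (1) above can be approximated very roughly as a principal-component stack of aligned traces, as in the sketch below; the actual registration and deviation-based weighting scheme of the paper is not reproduced, and the weighting used here is a simplification.

    import numpy as np

    def empirical_source(traces):
        # Estimate an empirical source wavelet from aligned teleseismic traces (rows)
        # as a weighted stack favoring traces that agree with the first principal component.
        X = np.asarray(traces, float)
        X = X - X.mean(axis=1, keepdims=True)
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        pc1 = Vt[0]                                       # dominant waveform shape
        corr = (X @ pc1) / (np.linalg.norm(X, axis=1) + 1e-12)
        w = np.clip(corr, 0.0, None)                      # down-weight deviant or flipped traces
        w = w / (w.sum() + 1e-12)
        return w @ X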

  16. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect

    Gary Casuccio; Michael Potter; Fred Schwerer; Dr. Richard J. Fruehan; Dr. Scott Story

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled ''Inclusion Analysis to Predict Casting Behavior'' was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. 
Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of ''technical selling'' through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with a extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant by plant direct sales program.

  17. Simplified building energy analysis tool for architects

    NASA Astrophysics Data System (ADS)

    Chaisuparasmikul, Pongsak

    Energy Modeler is an energy software program designed to study the relative change of energy uses (heating, cooling, and lighting loads) in different architectural design schemes. This research focuses on developing a tool to improve energy efficiency of the built environment. The research studied the impact of different architectural design response for two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects, perceive, quickly visualize, and compare energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings probably are the best indications of solutions to help environmental sustainability; by reducing the depletion of the world's fossil fuel (oil, natural gas, coal etc.). Architects when they set about designing an environmentally responsive building for an owner or the public, often lack the energy-based information and design tools to tell them whether the building loads and energy consumption are very responsive to the modifications that they made. Buildings are dynamic in nature and changeable over time, with many design variables involved. Architects really need energy-based rules or tools to assist them in the design process. Energy efficient design for sustainable solutions requires attention throughout the design process and is very related to architectural solutions. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative work is to make these tools applicable to the earliest stage of design, where more informed analysis of possible alternatives could yield the most benefit and the greatest cost savings both economic and environmental. This is where computer modeling and simulation can really lead to better and energy efficient buildings. Both apply to internal environment and human comfort, and environmental impact from surroundings.

  18. Multi-Mission Power Analysis Tool

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2011-01-01

    Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.

  19. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
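
    The search for low-effort (high-risk) paths over such a weighted attack graph is, at its simplest, a shortest-path computation; the sketch below runs Dijkstra's algorithm over a small, entirely hypothetical graph whose node names and effort weights are made up.

    import heapq

    def cheapest_attack_path(graph, start, goal):
        # Dijkstra over an attack graph whose edge weights represent attacker effort;
        # returns (total_effort, path) for the least-effort path from start to goal.
        pq = [(0.0, start, [start])]
        best = {}
        while pq:
            effort, node, path = heapq.heappop(pq)
            if node == goal:
                return effort, path
            if effort > best.get(node, float("inf")):
                continue                       # stale queue entry
            for nxt, w in graph.get(node, []):
                ne = effort + w
                if ne < best.get(nxt, float("inf")):
                    best[nxt] = ne
                    heapq.heappush(pq, (ne, nxt, path + [nxt]))
        return float("inf"), []

    attack_graph = {
        "internet": [("dmz_web_shell", 3.0), ("phished_user", 2.0)],
        "dmz_web_shell": [("internal_host", 4.0)],
        "phished_user": [("internal_host", 1.5)],
        "internal_host": [("domain_admin", 5.0)],
    }
    print(cheapest_attack_path(attack_graph, "internet", "domain_admin"))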

  20. A Robust Deconvolution Method based on Transdimensional Hierarchical Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Kolb, J.; Lekic, V.

    2012-12-01

    Analysis of P-S and S-P conversions allows us to map receiver-side crustal and lithospheric structure. This analysis often involves deconvolution of the parent wave field from the scattered wave field as a means of suppressing source-side complexity. A variety of deconvolution techniques exist including damped spectral division, Wiener filtering, iterative time-domain deconvolution, and the multitaper method. All of these techniques require estimates of noise characteristics as input parameters. We present a deconvolution method based on transdimensional Hierarchical Bayesian inference in which both noise magnitude and noise correlation are used as parameters in calculating the likelihood probability distribution. Because the noise for P-S and S-P conversion analysis in terms of receiver functions is a combination of both background noise - which is relatively easy to characterize - and signal-generated noise - which is much more difficult to quantify - we treat measurement errors as an unknown quantity, characterized by a probability density function whose mean and variance are model parameters. This transdimensional Hierarchical Bayesian approach has been successfully used previously in the inversion of receiver functions in terms of shear and compressional wave speeds of an unknown number of layers [1]. In our method, we use a Markov chain Monte Carlo (MCMC) algorithm to find the receiver function that best fits the data while accurately assessing the noise parameters. To parameterize the receiver function, we model it as an unknown number of Gaussians of unknown amplitude and width. The algorithm takes multiple steps before calculating the acceptance probability of a new model, in order to avoid getting trapped in local misfit minima. Using both observed and synthetic data, we show that the MCMC deconvolution method can accurately obtain a receiver function as well as an estimate of the noise parameters given the parent and daughter components. Furthermore, we demonstrate that this new approach is far less susceptible to generating spurious features even at high noise levels. Finally, the method yields not only the most-likely receiver function, but also quantifies its full uncertainty. [1] Bodin, T., M. Sambridge, H. Tkalčić, P. Arroucau, K. Gallagher, and N. Rawlinson (2012), Transdimensional inversion of receiver functions and surface wave dispersion, J. Geophys. Res., 117, B02301.
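
    Two ingredients of the method, the sum-of-Gaussians receiver-function parameterization and a likelihood in which the noise level is itself a parameter, are sketched below for the uncorrelated-noise case; the correlated-noise treatment, the transdimensional moves, and the MCMC loop itself are omitted, and all names are assumptions.

    import numpy as np

    def rf_model(t, gaussians):
        # Receiver function parameterized as a sum of (amplitude, center, width) Gaussians.
        t = np.asarray(t, float)
        rf = np.zeros_like(t)
        for amp, mu, sig in gaussians:
            rf += amp * np.exp(-0.5 * ((t - mu) / sig) ** 2)
        return rf

    def log_likelihood(data, t, gaussians, sigma_noise):
        # Gaussian log-likelihood in which the noise level sigma_noise is a model
        # parameter (uncorrelated noise assumed here for brevity).
        r = np.asarray(data, float) - rf_model(t, gaussians)
        n = len(r)
        return (-0.5 * np.sum(r ** 2) / sigma_noise ** 2
                - n * np.log(sigma_noise) - 0.5 * n * np.log(2.0 * np.pi))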

  1. Built Environment Analysis Tool: April 2013

    SciTech Connect

    Porter, C.

    2013-05-01

    This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  2. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    SciTech Connect

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

  3. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing, as the number of rays increase the associated uncertainty decreases, but the computational expense increases. Thus, a cost benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is what is the number of thicknesses that is needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  4. ISHM Decision Analysis Tool: Operations Concept

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  5. Solar Array Verification Analysis Tool (SAVANT) Developed

    NASA Technical Reports Server (NTRS)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  6. Comparative analysis of pharmacophore screening tools.

    PubMed

    Sanders, Marijn P A; Barbosa, Arménio J M; Zarzycka, Barbara; Nicolaes, Gerry A F; Klomp, Jan P G; de Vlieg, Jacob; Del Rio, Alberto

    2012-06-25

    The pharmacophore concept is of central importance in computer-aided drug design (CADD), mainly because of its successful application in medicinal chemistry and, in particular, high-throughput virtual screening (HTVS). The simplicity of the pharmacophore definition enables the complexity of molecular interactions between ligand and receptor to be reduced to a handful of features. With many pharmacophore screening software packages available, it is of the utmost interest to explore the behavior of these tools when applied to different biological systems. In this work, we present a comparative analysis of eight pharmacophore screening algorithms (Catalyst, Unity, LigandScout, Phase, Pharao, MOE, Pharmer, and POT) for their use in typical HTVS campaigns against four different biological targets by using default settings. The results herein presented show how the performance of each pharmacophore screening tool might be specifically related to factors such as the characteristics of the binding pocket, the use of specific pharmacophore features, and the use of these techniques in specific steps/contexts of the drug discovery pipeline. Algorithms with rmsd-based scoring functions are able to predict more compound poses correctly than overlay-based scoring functions. However, the ratio of correctly to incorrectly predicted compound poses is better for overlay-based scoring functions, which also ensure better performance in compound library enrichment. While the ensemble of these observations can be used to choose the most appropriate class of algorithm for specific virtual screening projects, we noted that pharmacophore algorithms are often equally good, and in this respect, we also analyzed how pharmacophore algorithms can be combined in order to increase the success of hit compound identification. This study provides a valuable benchmark set for further developments in the field of pharmacophore search algorithms, e.g., by using pose predictions and compound library enrichment criteria. PMID:22646988

  7. HisTOOLogy: an open-source tool for quantitative analysis of histological sections.

    PubMed

    Magliaro, C; Tirella, A; Mattei, G; Pirone, A; Ahluwalia, A

    2015-12-01

    HisTOOLogy is an open-source software package for the quantification of digital colour images of histological sections. The simple graphical user interface enables both expert and non-expert users to rapidly extract useful information from stained tissue sections. The software's main feature is a generalizable colour separation algorithm based on k-means clustering, which accurately and reproducibly returns the amount of colour per unit area for any stain, thus allowing the quantification of tissue components. Here we describe HisTOOLogy's algorithms and graphical user interface structure, showing how it can be used to separate different dye colours in several classical stains. In addition, to demonstrate how the tool can be employed to obtain quantitative information on biological tissues, the effect of different hepatic tissue decellularization protocols on cell removal and matrix preservation was assessed through image analysis using HisTOOLogy and compared with conventional DNA and total protein content assays. HisTOOLogy's performance was also compared with ImageJ's colour deconvolution plug-in, demonstrating its advantages in terms of ease of use and speed of colour separation. PMID:26258893
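
    A minimal sketch of the k-means colour-separation idea is shown below; it clusters the pixels of an RGB section image and reports the area fraction of each colour cluster. This illustrates the general approach only, and HisTOOLogy's own implementation may differ in detail.

      import numpy as np
      from sklearn.cluster import KMeans

      def colour_fractions(rgb_image, n_colours=3, random_state=0):
          """Cluster pixels by colour and return the cluster centres and area fractions."""
          pixels = rgb_image.reshape(-1, 3).astype(float)
          km = KMeans(n_clusters=n_colours, n_init=10, random_state=random_state)
          labels = km.fit_predict(pixels)
          fractions = np.bincount(labels, minlength=n_colours) / labels.size
          return km.cluster_centers_, fractions

      # Synthetic two-colour "section": 40% stain, 60% background.
      img = np.zeros((100, 100, 3), dtype=np.uint8)
      img[:, :40] = (150, 40, 120)    # stained region
      img[:, 40:] = (230, 230, 230)   # background
      centres, fracs = colour_fractions(img, n_colours=2)
      print(fracs)                    # -> fractions close to 0.4 and 0.6 (cluster order may vary)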

  8. Multichannel Wiener deconvolution of vertical seismic profiles

    SciTech Connect

    Haldorsen, J.B.U. ); Miller, D.E. ); Walsh, J.J. )

    1994-10-01

    The authors describe a technique for performing optimal, least-squares deconvolution of vertical seismic profile (VSP) data. The method is a two-step process that involves (1) estimating the source signature and (2) applying a least-squares optimum deconvolution operator that minimizes the noise not coherent with the source signature estimate. The optimum inverse problem, formulated in the frequency domain, gives as a solution an operator that can be interpreted as a simple inverse to the estimated aligned signature multiplied by semblance across the array. An application to a zero-offset VSP acquired with a dynamite source shows the effectiveness of the operator in attaining the two conflicting goals of adaptively spiking the effective source signature and minimizing the noise. Signature design for seismic surveys could benefit from observing that the optimum deconvolution operator gives a flat signal spectrum if and only if the seismic source has the same amplitude spectrum as the noise.
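
    A minimal sketch of frequency-domain least-squares deconvolution of traces by an estimated source signature is given below. It uses a generic Wiener-style stabilization constant rather than the semblance-weighted operator of the paper, and the synthetic wavelet and trace are illustrative.

      import numpy as np

      def wiener_deconvolve(traces, signature, noise_level=1e-2):
          """traces: (n_traces, n_samples); signature: (n_samples,)."""
          S = np.fft.rfft(signature)
          D = np.fft.rfft(traces, axis=1)
          operator = np.conj(S) / (np.abs(S) ** 2 + noise_level * np.max(np.abs(S) ** 2))
          return np.fft.irfft(D * operator, n=traces.shape[1], axis=1)

      # Synthetic check: a spike convolved with the signature is recovered.
      n = 256
      sig = np.exp(-0.5 * ((np.arange(n) - 20) / 4.0) ** 2)   # smooth source wavelet
      spike = np.zeros(n)
      spike[100] = 1.0
      trace = np.convolve(spike, sig)[:n]
      out = wiener_deconvolve(trace[None, :], sig)
      print(int(np.argmax(out[0])))                           # -> 100 (wavelet removed)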

  9. Parallelization of a blind deconvolution algorithm

    NASA Astrophysics Data System (ADS)

    Matson, Charles L.; Borelli, Kathy J.

    2006-09-01

    Often it is of interest to deblur imagery in order to obtain higher-resolution images. Deblurring requires knowledge of the blurring function - information that is often not available separately from the blurred imagery. Blind deconvolution algorithms overcome this problem by jointly estimating both the high-resolution image and the blurring function from the blurred imagery. Because blind deconvolution algorithms are iterative in nature, they can take minutes to days to deblur an image, depending on how many frames of data are used for the deblurring and on the platforms on which the algorithms are executed. Here we present our progress in parallelizing a blind deconvolution algorithm to increase its execution speed. This progress includes sub-frame parallelization and a code structure that is not specialized to a specific computer hardware architecture.

  10. PyRAT - python radiography analysis tool (u)

    SciTech Connect

    Temple, Brian A; Buescher, Kevin L; Armstrong, Jerawan C

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on LINUX and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed variable optimization tool to perform the optimization.

  11. Space variant deconvolution for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Prashanth, R.; Bhattacharya, Shanti

    2011-12-01

    We present a method for mitigating space-variant blur occurring in images acquired using optical coherence tomography (OCT). The effect of Gaussian beam divergence on the image resolution is analyzed mathematically to develop space-dependent two-dimensional point spread functions that define the blurring kernel. Two standard deconvolution algorithms are used to deblur the images using the space-dependent point spread functions. We show that the deconvolution method is effective in improving the transverse resolution of cross-sectional OCT images at regions up to several times as deep as the confocal region of the Gaussian beam.

  12. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    SciTech Connect

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  13. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  14. Calibration of Wide-Field Deconvolution Microscopy for Quantitative Fluorescence Imaging

    PubMed Central

    Lee, Ji-Sook; Wee, Tse-Luen (Erika); Brown, Claire M.

    2014-01-01

    Deconvolution enhances contrast in fluorescence microscopy images, especially in low-contrast, high-background wide-field microscope images, improving characterization of features within the sample. Deconvolution can also be combined with other imaging modalities, such as confocal microscopy, and most software programs seek to improve resolution as well as contrast. Quantitative image analyses require instrument calibration and with deconvolution, necessitate that this process itself preserves the relative quantitative relationships between fluorescence intensities. To ensure that the quantitative nature of the data remains unaltered, deconvolution algorithms need to be tested thoroughly. This study investigated whether the deconvolution algorithms in AutoQuant X3 preserve relative quantitative intensity data. InSpeck Green calibration microspheres were prepared for imaging, z-stacks were collected using a wide-field microscope, and the images were deconvolved using the iterative deconvolution algorithms with default settings. Afterwards, the mean intensities and volumes of microspheres in the original and the deconvolved images were measured. Deconvolved data sets showed higher average microsphere intensities and smaller volumes than the original wide-field data sets. In original and deconvolved data sets, intensity means showed linear relationships with the relative microsphere intensities given by the manufacturer. Importantly, upon normalization, the trend lines were found to have similar slopes. In original and deconvolved images, the volumes of the microspheres were quite uniform for all relative microsphere intensities. We were able to show that AutoQuant X3 deconvolution software data are quantitative. In general, the protocol presented can be used to calibrate any fluorescence microscope or image processing and analysis procedure. PMID:24688321
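
    The linearity check described above can be reproduced in a few lines: regress the measured mean microsphere intensities against the manufacturer's relative intensities before and after deconvolution, and compare the normalized slopes. The numbers below are illustrative, not the study's measurements.

      import numpy as np

      relative = np.array([0.3, 1.0, 10.0, 100.0])           # nominal relative intensities (%)
      widefield = np.array([12.0, 40.0, 410.0, 3900.0])      # measured mean intensities (a.u.)
      deconvolved = np.array([30.0, 105.0, 1010.0, 9800.0])

      for name, meas in [("wide-field", widefield), ("deconvolved", deconvolved)]:
          norm = meas / meas.max()                           # normalize to the brightest bead
          slope, intercept = np.polyfit(relative / relative.max(), norm, 1)
          print(f"{name}: normalized slope = {slope:.3f}, intercept = {intercept:.3f}")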

  15. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.
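
    As an illustration of the standard-curve step that ProMAT automates, the sketch below fits a four-parameter logistic curve to calibration standards and inverts it to estimate an unknown concentration. The 4PL model and all values are assumptions for illustration; ProMAT's actual models and R/Java implementation may differ.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(conc, bottom, top, ec50, hill):
          """Increasing four-parameter logistic response."""
          return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

      conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])      # standards (ng/mL)
      signal = np.array([120, 180, 400, 900, 1600, 2100, 2300.0])   # measured intensities

      params, _ = curve_fit(four_pl, conc, signal, p0=[100, 2400, 3.0, 1.0], bounds=(0, np.inf))
      bottom, top, ec50, hill = params
      print("fitted [bottom, top, EC50, Hill]:", np.round(params, 2))

      # Invert the fitted curve to estimate the concentration of an unknown sample.
      y = 1200.0
      conc_est = ec50 / (((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill))
      print("estimated concentration (ng/mL):", round(conc_est, 2))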

  16. Knowledge base navigator facilitating regional analysis inter-tool communication.

    SciTech Connect

    Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

    2004-08-01

    To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).

  17. Friction analysis between tool and chip

    NASA Astrophysics Data System (ADS)

    Wang, Min; Xu, Binshi; Zhang, Jiaying; Dong, Shiyun

    2010-12-01

    Elastic-plastic mechanics is applied to analyze the friction between tool and chip. According to slip-line field theory, a series of theoretical formulas and the friction coefficient between the tool and chip are derived, so the cutting process can be investigated. Based on the orthogonal cutting model and Mohr's stress circle, the cutting mechanism of the cladding and the surface integrity of machining can be studied.

  18. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  19. Usage-Based Evolution of Visual Analysis Tools

    SciTech Connect

    Hetzler, Elizabeth G.; Rose, Stuart J.; McQuerry, Dennis L.; Medvick, Patricia A.

    2005-06-12

    Visual analysis tools have been developed to help people in many different domains more effectively explore, understand, and make decisions from their information. Challenges in making a successful tool include suitability within a user's work processes, and tradeoffs between analytic power and tool complexity, both of which impact ease of learning. This paper describes experience working with users to help them apply visual analysis tools in several different domains, and examples of how the tools evolved significantly to better match users' goals and processes.

  20. Software Tools for High-Throughput Analysis and Archiving of Immunohistochemistry Staining Data Obtained with Tissue Microarrays

    PubMed Central

    Liu, Chih Long; Prapong, Wijan; Natkunam, Yasodha; Alizadeh, Ash; Montgomery, Kelli; Gilks, C. Blake; van de Rijn, Matt

    2002-01-01

    The creation of tissue microarrays (TMAs) allows for the rapid immunohistochemical analysis of thousands of tissue samples, with numerous different antibodies per sample. This technical development has created a need for tools to aid in the analysis and archival storage of the large amounts of data generated. We have developed a comprehensive system for high-throughput analysis and storage of TMA immunostaining data, using a combination of commercially available systems and novel software applications developed in our laboratory specifically for this purpose. Staining results are recorded directly into an Excel worksheet and are reformatted by a novel program (TMA-Deconvoluter) into a format suitable for hierarchical clustering analysis or other statistical analysis. Hierarchical clustering analysis is a powerful means of assessing relatedness within groups of tumors, based on their immunostaining with a panel of antibodies. Other analyses, such as generation of survival curves, construction of Cox regression models, or assessment of intra- or interobserver variation, can also be done readily on the reformatted data. Finally, the immunoprofile of a specific case can be rapidly retrieved from the archives and reviewed through the use of Stainfinder, a novel web-based program that creates a direct link between the clustered data and a digital image database. An on-line demonstration of this system is available at http://genome-www.stanford.edu/TMA/explore.shtml. PMID:12414504
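
    A minimal sketch of the hierarchical-clustering step is shown below, applied to a small reformatted staining matrix (cases x antibodies). The scores are illustrative, not TMA-Deconvoluter output, and real analyses use far larger panels.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Rows: tumour cases; columns: antibody scores (0 = negative, 1 = positive).
      staining = np.array([
          [1, 1, 0, 0, 1],
          [1, 1, 0, 0, 1],
          [0, 0, 1, 1, 0],
          [0, 0, 1, 1, 1],
      ], dtype=float)

      Z = linkage(staining, method="average", metric="euclidean")
      groups = fcluster(Z, t=2, criterion="maxclust")
      print(groups)   # -> two groups of related immunoprofiles, e.g. [1 1 2 2]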

  1. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  2. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  3. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

  4. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings. PMID:22045723

  5. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  6. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A minimum of 4Mb of free RAM is highly recommended. The UNIX version of FEAT includes both FEAT v3.6 for the Macintosh and XFEAT. XFEAT is written in C-language for Sun series workstations running SunOS, SGI workstations running IRIX, DECstations running ULTRIX, and Intergraph workstations running CLIX version 6. It requires the MIT X Window System, Version 11 Revision 4, with OSF/Motif 1.1.3, and 16Mb of RAM. The standard distribution medium for FEAT 3.6 (Macintosh version) is a set of three 3.5 inch Macintosh format diskettes. The standard distribution package for the UNIX version includes the three FEAT 3.6 Macintosh diskettes plus a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format which contains XFEAT. Alternate distribution media and formats for XFEAT are available upon request. FEAT has been under development since 1990. Both FEAT v3.6 for the Macintosh and XFEAT v3.5 were released in 1993.
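
    The transitive-closure preprocessing described above amounts to precomputing, for every node of the digraph, the set of nodes reachable from it, so that failure-propagation queries can be answered immediately. A minimal sketch on a toy digraph (not a FEAT model) follows.

      from collections import defaultdict

      edges = [("pump_fail", "loss_of_coolant"),
               ("loss_of_coolant", "overtemp"),
               ("sensor_fail", "overtemp_alarm_missed"),
               ("overtemp", "component_damage")]

      adj = defaultdict(set)
      for src, dst in edges:
          adj[src].add(dst)

      def transitive_closure(adj):
          """Reachable set per node via depth-first search."""
          def dfs(node, seen):
              for nxt in adj.get(node, ()):
                  if nxt not in seen:
                      seen.add(nxt)
                      dfs(nxt, seen)
              return seen
          nodes = set(adj) | {d for dsts in adj.values() for d in dsts}
          return {node: dfs(node, set()) for node in nodes}

      reach = transitive_closure(adj)
      print(sorted(reach["pump_fail"]))   # all effects propagated from a single failure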

  7. Millennial scale system impulse response of polar climates - deconvolution results between δ 18O records from Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Reischmann, E.; Yang, X.; Rial, J. A.

    2013-12-01

    Deconvolution has long been used in science to recover a real input given a system's impulse response and output. In this study, we applied spectral division deconvolution to selected polar δ18O time series to investigate the possible relationship between the climates of the Polar Regions, i.e., the equivalent of a climate system's 'impulse response.' While the records may be the result of nonlinear processes, deconvolution remains an appropriate tool because the two polar climates are synchronized, forming a Hilbert transform pair. In order to compare records, the age models of three Greenland and four Antarctica records have been matched via a Monte Carlo method using the methane-matched pair GRIP and BYRD as a basis for the calculations. For all twelve polar pairs, various deconvolution schemes (Wiener, damped least squares, Tikhonov, Kalman filter) give consistent, quasi-periodic impulse responses of the system. Multitaper analysis reveals strong, millennial-scale, quasi-periodic oscillations in these system responses with a range of 2,500 to 1,000 years. These are not symmetric, as the transfer function from north to south differs from that of south to north. However, the difference is systematic and occurs in the predominant period of the deconvolved signals. Specifically, the north-to-south transfer function is generally of longer period than the south-to-north transfer function. High-amplitude power peaks at 5.0 ky to 1.7 ky characterize the former, while the latter contains peaks at mostly short periods, with a range of 2.5 ky to 1.0 ky. Consistent with many observations, the deconvolved, quasi-periodic transfer functions share the predominant periodicities found in the data, some of which are likely related to solar forcing (2.5-1.0 ky), while some are probably indicative of the internal oscillations of the climate system (1.6-1.4 ky). The approximately 1.5 ky transfer function may represent the internal periodicity of the system, perhaps even related to the periodicity of the thermohaline circulation (THC). Simplified models of the polar climate fluctuations are shown to support these findings.
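
    A minimal sketch of the spectral-division (water-level) deconvolution used above is given below on synthetic series; the "northern" and "southern" records here are random stand-ins, not the δ18O data, and the water-level constant is illustrative.

      import numpy as np

      def spectral_division(output, inp, water_level=0.05):
          """Estimate h in output = inp (circularly) convolved with h."""
          n = len(output)
          O, I = np.fft.rfft(output), np.fft.rfft(inp)
          floor = water_level * np.max(np.abs(I))
          I_reg = np.where(np.abs(I) < floor, floor, np.abs(I)) * np.exp(1j * np.angle(I))
          return np.fft.irfft(O / I_reg, n=n)

      rng = np.random.default_rng(1)
      north = rng.standard_normal(1024)                     # stand-in "input" record
      h_true = np.zeros(1024)
      h_true[10], h_true[40] = 1.0, -0.5                    # toy impulse response
      south = np.fft.irfft(np.fft.rfft(north) * np.fft.rfft(h_true), n=1024)

      h_est = spectral_division(south, north)
      print(round(h_est[10], 2), round(h_est[40], 2))       # -> approximately 1.0 and -0.5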

  8. Quantifying mineral abundances of complex mixtures by coupling spectral deconvolution of SWIR spectra (2.1-2.4 μm) and regression tree analysis

    USGS Publications Warehouse

    Mulder, V.L.; Plotze, Michael; de Bruin, Sytze; Schaepman, Michael E.; Mavris, C.; Kokaly, Raymond F.; Egli, Markus

    2013-01-01

    This paper presents a methodology for assessing mineral abundances of mixtures having more than two constituents using absorption features in the 2.1-2.4 μm wavelength region. In the first step, the absorption behaviour of mineral mixtures is parameterised by exponential Gaussian optimisation. Next, mineral abundances are predicted by regression tree analysis using these parameters as inputs. The approach is demonstrated on a range of prepared samples with known abundances of kaolinite, dioctahedral mica, smectite, calcite and quartz and on a set of field samples from Morocco. The latter contained varying quantities of other minerals, some of which did not have diagnostic absorption features in the 2.1-2.4 μm region. Cross validation showed that the prepared samples of kaolinite, dioctahedral mica, smectite and calcite were predicted with a root mean square error (RMSE) less than 9 wt.%. For the field samples, the RMSE was less than 8 wt.% for calcite, dioctahedral mica and kaolinite abundances. Smectite could not be well predicted, which was attributed to spectral variation of the cations within the dioctahedral layered smectites. Substitution of part of the quartz by chlorite at the prediction phase hardly affected the accuracy of the predicted mineral content; this suggests that the method is robust in handling the omission of minerals during the training phase. The degree of expression of absorption components was different between the field sample and the laboratory mixtures. This demonstrates that the method should be calibrated and trained on local samples. Our method allows the simultaneous quantification of more than two minerals within a complex mixture and thereby enhances the perspectives of spectral analysis for mineral abundances.
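
    The second step described above (predicting abundances from absorption-feature parameters with a regression tree) can be sketched as follows; the features and "abundances" are synthetic placeholders rather than exponential-Gaussian fit parameters.

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(0)
      # Columns: e.g. depth, width and position of a diagnostic absorption feature.
      X = rng.uniform(0.0, 1.0, size=(200, 3))
      # Toy abundance, driven mostly by feature depth, plus noise (wt.%).
      y = 80 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 2, size=200)

      tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X[:150], y[:150])
      pred = tree.predict(X[150:])
      rmse = np.sqrt(np.mean((pred - y[150:]) ** 2))
      print(f"hold-out RMSE: {rmse:.1f} wt.%")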

  9. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

  10. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.

  11. Blind Poissonian images deconvolution with framelet regularization.

    PubMed

    Fang, Houzhang; Yan, Luxin; Liu, Hai; Chang, Yi

    2013-02-15

    We propose a maximum a posteriori blind Poissonian images deconvolution approach with framelet regularization for the image and total variation (TV) regularization for the point spread function. Compared with the TV based methods, our algorithm not only suppresses noise effectively but also recovers edges and detailed information. Moreover, the split Bregman method is exploited to solve the resulting minimization problem. Comparative results on both simulated and real images are reported. PMID:23455078

  12. Sparse deconvolution of B-scan images.

    PubMed

    Olofsson, Tomas; Wennerström, Erik

    2007-08-01

    In this paper, a new computationally efficient sparse deconvolution algorithm for use on B-scan images from objects with relatively few scattering targets is presented. It is based on a linear image formation model that has been used earlier in connection with linear minimum mean squared error (MMSE) two-dimensional (2-D) deconvolution. The MMSE deconvolution results have shown improved resolution compared to the synthetic aperture focusing technique (SAFT), but at the cost of increased computation time. The proposed algorithm uses the sparsity of the image, reducing the degrees of freedom in the reconstruction problem, to reduce the computation time and to improve the resolution. The dominating task in the algorithm consists in detecting the set of active scattering targets, which is done by iterating between an updating pass that detects new points to include in the set, and a downdating pass that removes redundant points. In the update, a spatiotemporal matched filter is used to isolate potential candidates. A subset of those are chosen using a detection criterion. The amplitudes of the detected scatterers are found by MMSE. The algorithm properties are illustrated using synthetic and real B-scans. The results show excellent resolution-enhancement and noise-suppression capabilities. The involved computation times are analyzed. PMID:17703667

  13. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property data bases have been acquired. Recently we have initiated a project to integrate these codes and data bases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials data base including high-rate-of-strain data.

  14. Self-Constrained Euler Deconvolution Using Potential Field Data of Different Altitudes

    NASA Astrophysics Data System (ADS)

    Zhou, Wenna; Nan, Zeyu; Li, Jiyan

    2016-02-01

    Euler deconvolution has become perhaps the most common tool for semi-automatic interpretation of potential field data. The structural index (SI) is a main determining factor of the quality of depth estimation. In this paper, we first present an improved Euler deconvolution method that eliminates the influence of the SI by using potential field data at different altitudes. The different-altitude data can be obtained by upward continuation or directly from airborne measurements. The Euler deconvolution equations at different altitudes within a certain range are very similar, so the ratio of the Euler equations at two different altitudes can be formed to discard the SI. Thus, the depth and location of a geologic source can be calculated directly using the improved Euler deconvolution without any prior information. In particular, the influence of noise can be decreased by using upward continuation to different altitudes. The new method is called self-constrained Euler deconvolution (SED). Subsequently, based on the SED algorithm, we derive the full tensor gradient (FTG) form of the new method. Using the multi-component data of the FTG has added advantages in data interpretation. The FTG form is composed of x-, y- and z-directional components; because more components are used, it yields more accurate results and more detailed information. The proposed method is tested using different synthetic models, and satisfactory results are obtained. Finally, we applied the new approach to Bishop model magnetic data and real gravity data. All the results demonstrate that the new approach is a useful tool for interpreting potential field and full tensor gradient data.
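
    The key step can be written compactly in conventional Euler-deconvolution notation (the paper's exact formulation may differ): the homogeneity equation holds at each observation altitude z_i, where T_i is the field measured there, B_i is the regional background, and (x_0, y_0, z_0) is the source position, so the ratio of the two equations no longer contains the structural index N.

      \[
      (x - x_0)\,\frac{\partial T_i}{\partial x}
      + (y - y_0)\,\frac{\partial T_i}{\partial y}
      + (z_i - z_0)\,\frac{\partial T_i}{\partial z}
      = N\,(B_i - T_i), \qquad i = 1, 2,
      \]
      \[
      \frac{(x - x_0)\,\partial_x T_1 + (y - y_0)\,\partial_y T_1 + (z_1 - z_0)\,\partial_z T_1}
           {(x - x_0)\,\partial_x T_2 + (y - y_0)\,\partial_y T_2 + (z_2 - z_0)\,\partial_z T_2}
      = \frac{B_1 - T_1}{B_2 - T_2},
      \]

    leaving the source coordinates (x_0, y_0, z_0) as the only unknowns to be solved for in a sliding window.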

  15. Klonos: A Similarity Analysis Based Tool for Software Porting

    Energy Science and Technology Software Center (ESTSC)

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. The tool is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). It combines syntactic and cost-model-provided metrics into clusters that aggregate similar subroutines, which can then be ported in a similar way. The generated porting plan allows programmers and compilers to reuse porting experience as much as possible during the porting process.

  16. A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra

    NASA Astrophysics Data System (ADS)

    Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

    2013-06-01

    A quantitative evaluation of various deconvolution methods and their applications in processing plasma-emitted spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, Richardson-Lucy's method, the maximum a posteriori method and Gold's method. The evaluation criteria include minimization of the sum of squared errors and the sum of squared relative errors of parameters, and the rate of convergence. After comparing deconvolved results using these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when considering all the criteria above. Applications of these methods to actual plasma spectra obtained from the EAST tokamak are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that the effects of instruments can be satisfactorily eliminated and clear spectra are recovered.
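
    For concreteness, a minimal sketch of one of the evaluated schemes (Richardson-Lucy) applied to a synthetic 1-D spectrum blurred by a known instrument function is shown below; the line shapes and iteration count are illustrative.

      import numpy as np

      def richardson_lucy(observed, psf, iterations=200):
          """Multiplicative Richardson-Lucy updates; psf is normalised to unit sum."""
          psf = psf / psf.sum()
          psf_mirror = psf[::-1]
          estimate = np.full_like(observed, observed.mean())
          for _ in range(iterations):
              blurred = np.convolve(estimate, psf, mode="same")
              ratio = observed / np.maximum(blurred, 1e-12)
              estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
          return estimate

      x = np.linspace(-10.0, 10.0, 401)
      truth = np.exp(-0.5 * ((x - 2) / 0.2) ** 2) + 0.6 * np.exp(-0.5 * ((x + 3) / 0.2) ** 2)
      instrument = np.exp(-0.5 * (np.linspace(-3, 3, 61) / 1.0) ** 2)
      observed = np.convolve(truth, instrument / instrument.sum(), mode="same")

      restored = richardson_lucy(observed, instrument)
      print(round(x[np.argmax(restored)], 2))   # -> close to 2.0 (narrow line recovered)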

  17. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2014-10-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and it is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both fluid and kinetic plasma models, already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is achieved by repeating steps of the present MARS algorithm using parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.

  18. Value Analysis: A Tool for Community Colleges.

    ERIC Educational Resources Information Center

    White, Rita A.

    Adoption of a value analysis program is proposed to aid colleges in identifying and implementing educationally sound labor-saving devices and procedures, enabling them to meet more students' needs at less cost with no quality reduction and a minimum of staff resistance. Value analysis is defined as a method for studying how well a product does…

  19. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    ERIC Educational Resources Information Center

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  20. JAVA based LCD Reconstruction and Analysis Tools

    SciTech Connect

    Bower, G.

    2004-10-11

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  1. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  2. HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL

    EPA Science Inventory

    An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

  3. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  4. Performance evaluation of image deconvolution techniques in space-variant astronomical imaging systems with nonlinearities

    NASA Astrophysics Data System (ADS)

    Fliegel, Karel; Janout, Petr; Bednář, Jan; Krasula, Lukáš; Vítek, Stanislav; Švihlík, Jan; Páta, Petr

    2015-09-01

    There are various deconvolution methods for suppression of blur in images. In this paper a survey of image deconvolution techniques is presented, with a focus on methods designed to handle images acquired with wide-field astronomical imaging systems. Image blur present in such images is space-variant, especially due to the space-variant point spread function (PSF) of the lens. The imaging system can also contain nonlinear electro-optical elements. Analysis of nonlinear and space-variant imaging systems is usually simplified so that the system is considered linear and space-invariant (LSI) under specific constraints. Performance analysis of selected image deconvolution methods is presented in this paper, while considering the space-variant nature of a wide-field astronomical imaging system. The impact of nonlinearity on the overall performance of the image deconvolution techniques is also analyzed. Test images with characteristics obtained from a real system with a space-variant wide-field input lens and a nonlinear image intensifier are used for the performance analysis.

  5. AstroStat: Statistical analysis tool

    NASA Astrophysics Data System (ADS)

    VO-India team

    2015-07-01

    AstroStat performs statistical analysis on data and is compatible with Virtual Observatory (VO) standards. It accepts data in a variety of formats and performs various statistical tests using a menu driven interface. Analyses, performed in R, include exploratory tests, visualizations, distribution fitting, correlation and causation, hypothesis testing, multivariate analysis and clustering. AstroStat is available in two versions with an identical interface and features: as a web service that can be run using any standard browser and as an offline application.

  6. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of, and the development of computer tools (algorithms) for, separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas: the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  7. Deconvolution methods for image deblurring in optical coherence tomography.

    PubMed

    Liu, Yiheng; Liang, Yanmei; Mu, Guoguang; Zhu, Xiaonong

    2009-01-01

    Two-dimensional deconvolution methods are proposed to deblur optical coherence tomography images. One employs a two-dimensional deconvolution with a matrix given by the product of the longitudinal and transversal point-spread functions as its kernel, which can be taken as the general point-spread function of an optical coherence tomography system. The other uses two one-dimensional deconvolutions with the longitudinal and transversal point-spread functions successively. It is shown that the two deconvolution methods can deblur the experimentally obtained optical coherence tomography images effectively. PMID:19109602
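
    The separability assumption behind the second method can be checked directly: blurring with a 2-D kernel formed as the outer product of longitudinal and transversal point-spread functions is equivalent to two successive 1-D convolutions, which is what allows the deblurring to be done axis by axis. The kernels below are illustrative, not OCT measurements.

      import numpy as np
      from scipy.signal import convolve2d

      longitudinal = np.array([0.25, 0.5, 0.25])           # depth-direction PSF
      transversal = np.array([0.1, 0.2, 0.4, 0.2, 0.1])    # lateral PSF
      psf_2d = np.outer(longitudinal, transversal)         # separable 2-D PSF

      rng = np.random.default_rng(0)
      image = rng.random((64, 64))

      blur_2d = convolve2d(image, psf_2d, mode="same")     # one 2-D convolution
      step1 = np.apply_along_axis(lambda col: np.convolve(col, longitudinal, mode="same"), 0, image)
      blur_sep = np.apply_along_axis(lambda row: np.convolve(row, transversal, mode="same"), 1, step1)

      print(np.allclose(blur_2d, blur_sep))                # -> True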

  8. Strategies for the deconvolution of hypertelescope images

    NASA Astrophysics Data System (ADS)

    Aime, C.; Lantéri, H.; Diet, M.; Carlotti, A.

    2012-07-01

    Aims: We study the possibility of deconvolving hypertelescope images and propose a procedure that can be used provided that the densification factor is small enough to make the process reversible. Methods: We present the simulation of hypertelescope images for an array of cophased densified apertures. We distinguish between two types of aperture densification, one called FAD (full aperture densification) corresponding to Labeyrie's original technique, and the other FSD (full spectrum densification) corresponding to a densification factor twice as low. Images are compared to the Fizeau mode. A single image of the observed object is obtained in the hypertelescope modes, while in the Fizeau mode the response produces an ensemble of replicas of the object. Simulations are performed for noiseless images and in a photodetection regime. Assuming first that the point spread function (PSF) does not change much over the object extent, we use two classical techniques to deconvolve the images, namely the Richardson-Lucy and image space reconstruction algorithms. Results: Both algorithms fail to achieve satisfying results. We interpret this as meaning that it is inappropriate to deconvolve a relation that is not a convolution, even if the variation in the PSF is very small across the object extent. We propose instead the application of a redilution to the densified image prior to its deconvolution, i.e. to recover an image similar to the Fizeau observation. This inverse operation is possible only when the rate of densification is no more than in the FSD case. This being done, the deconvolution algorithms become efficient. The deconvolution brings together the replicas into a single high-quality image of the object. This is heuristically explained as an inpainting of the Fourier plane. This procedure makes it possible to obtain improved images while retaining the benefits of hypertelescopes for image acquisition consisting of detectors with a small number of pixels.

  9. Comparison of Deconvolution Filters for Photoacoustic Tomography

    PubMed Central

    Van de Sompel, Dominique; Sasportas, Laura S.; Jokerst, Jesse V.; Gambhir, Sanjiv S.

    2016-01-01

    In this work, we compare the merits of three temporal data deconvolution methods for use in the filtered backprojection algorithm for photoacoustic tomography (PAT). We evaluate the standard Fourier division technique, the Wiener deconvolution filter, and a Tikhonov L-2 norm regularized matrix inversion method. Our experiments were carried out on subjects of various appearances, namely a pencil lead, two man-made phantoms, an in vivo subcutaneous mouse tumor model, and a perfused and excised mouse brain. All subjects were scanned using an imaging system with a rotatable hemispherical bowl, into which 128 ultrasound transducer elements were embedded in a spiral pattern. We characterized the frequency response of each deconvolution method, compared the final image quality achieved by each deconvolution technique, and evaluated each method’s robustness to noise. The frequency response was quantified by measuring the accuracy with which each filter recovered the ideal flat frequency spectrum of an experimentally measured impulse response. Image quality under the various scenarios was quantified by computing noise versus resolution curves for a point source phantom, as well as the full width at half maximum (FWHM) and contrast-to-noise ratio (CNR) of selected image features such as dots and linear structures in additional imaging subjects. It was found that the Tikhonov filter yielded the most accurate balance of lower and higher frequency content (as measured by comparing the spectra of deconvolved impulse response signals to the ideal flat frequency spectrum), achieved a competitive image resolution and contrast-to-noise ratio, and yielded the greatest robustness to noise. While the Wiener filter achieved a similar image resolution, it tended to underrepresent the lower frequency content of the deconvolved signals, and hence of the reconstructed images after backprojection. In addition, its robustness to noise was poorer than that of the Tikhonov filter. The performance of the Fourier filter was found to be the poorest of all three methods, based on the reconstructed images’ lowest resolution (blurriest appearance), generally lowest contrast-to-noise ratio, and lowest robustness to noise. Overall, the Tikhonov filter was deemed to produce the most desirable image reconstructions. PMID:27031832
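
    The three filter families compared above can be written as simple frequency-domain operators; the sketch below applies textbook forms of each to a synthetic noisy 1-D signal (the paper's Tikhonov variant is a regularised matrix inversion, and all tuning constants here are illustrative).

      import numpy as np

      rng = np.random.default_rng(0)
      n = 512
      truth = np.zeros(n)
      truth[200], truth[260] = 1.0, -0.7                        # toy acoustic sources
      h = np.exp(-0.5 * ((np.arange(n) - 10) / 2.0) ** 2)       # measured impulse response
      H = np.fft.rfft(h)
      y = np.fft.irfft(np.fft.rfft(truth) * H, n=n) + 0.01 * rng.standard_normal(n)
      Y = np.fft.rfft(y)

      freqs = np.fft.rfftfreq(n)
      L = 2j * np.pi * freqs                                    # derivative regulariser

      x_fourier = np.fft.irfft(Y / np.where(np.abs(H) < 1e-6, 1e-6, H), n=n)
      x_wiener = np.fft.irfft(Y * np.conj(H) / (np.abs(H) ** 2 + 1e-2), n=n)
      x_tikhonov = np.fft.irfft(Y * np.conj(H) / (np.abs(H) ** 2 + 1e-2 * np.abs(L) ** 2), n=n)

      for name, x in [("Fourier", x_fourier), ("Wiener", x_wiener), ("Tikhonov", x_tikhonov)]:
          # Peak location and residual noise in a source-free region; the
          # unregularised Fourier division is noise-dominated.
          print(name, int(np.argmax(np.abs(x))), round(float(np.std(x[350:])), 4))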

  10. Atomic spectral line free parameter deconvolution procedure

    NASA Astrophysics Data System (ADS)

    Milosavljević, V.; Poparić, G.

    2001-03-01

    We report an advanced numerical procedure for deconvolution of the theoretical asymmetric convolution integral of a Gaussian and a plasma-broadened spectral line profile j_A,R(λ) for spectral lines. Our method determines all broadening parameters self-consistently and directly from the line profile, with minimal assumptions or prior knowledge. This method is useful for obtaining complete information on all plasma parameters directly from the recorded shape of a single line, which is very important in cases where no other diagnostic methods are available. The method is also convenient for the determination of plasma parameters in the case of a symmetrical profile such as the Voigt profile.

  11. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  12. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  13. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    SciTech Connect

    Bush, B.; Penev, M.; Melaina, M.; Zuboy, J.

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  14. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    SciTech Connect

    Andraka, Charles E.

    2015-10-20

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projection screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position. This is key to evaluating a facet as suitable for a specific solar application. SOFAST reports the measurements of the facet as detailed surface normal location in a format suitable for ray tracing optical analysis codes. SOFAST also reports summary information as to the facet fitted shape (monomial) and error parameters. Useful plots of the error distribution are also presented.

  15. RCytoscape: tools for exploratory network analysis

    PubMed Central

    2013-01-01

    Background Biomolecular pathways and networks are dynamic and complex, and the perturbations to them which cause disease are often multiple, heterogeneous and contingent. Pathway and network visualizations, rendered on a computer or published on paper, however, tend to be static, lacking in detail, and ill-equipped to explore the variety and quantities of data available today, and the complex causes we seek to understand. Results RCytoscape integrates R (an open-ended programming environment rich in statistical power and data-handling facilities) and Cytoscape (powerful network visualization and analysis software). RCytoscape extends Cytoscape's functionality beyond what is possible with the Cytoscape graphical user interface. To illustrate the power of RCytoscape, a portion of the Glioblastoma multiforme (GBM) data set from the Cancer Genome Atlas (TCGA) is examined. Network visualization reveals previously unreported patterns in the data suggesting heterogeneous signaling mechanisms active in GBM Proneural tumors, with possible clinical relevance. Conclusions Progress in bioinformatics and computational biology depends upon exploratory and confirmatory data analysis, upon inference, and upon modeling. These activities will eventually permit the prediction and control of complex biological systems. Network visualizations -- molecular maps -- created from an open-ended programming environment rich in statistical power and data-handling facilities, such as RCytoscape, will play an essential role in this progression. PMID:23837656

  16. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    Energy Science and Technology Software Center (ESTSC)

    2015-10-20

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projection screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position. This is key to evaluating a facet as suitable for a specific solar application. SOFAST reports the measurements of the facet as detailed surface normal location in a format suitable for ray tracing optical analysis codes. SOFAST also reports summary information as to the facet fitted shape (monomial) and error parameters. Useful plots of the error distribution are also presented.

  17. A New Web-based Tool for Aerosol Data Analysis: the AERONET Data Synergy Tool

    NASA Astrophysics Data System (ADS)

    Giles, D. M.; Holben, B. N.; Slutsker, I.; Welton, E. J.; Chin, M.; Schmaltz, J.; Kucsera, T.; Diehl, T.

    2006-12-01

    The Aerosol Robotic Network (AERONET) provides important aerosol microphysical and optical properties via an extensive distribution of continental sites and sparsely-distributed coastal and oceanic sites among the major oceans and inland seas. These data provide only spatial point measurements while supplemental data are needed for a complete aerosol analysis. Ancillary data sets (e.g., MODIS true color imagery and back trajectory analyses) are available by navigating to several web data sources. In an effort to streamline aerosol data discovery and analysis, a new web data tool called the "AERONET Data Synergy Tool" was launched from the AERONET web site. This tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. The Data Synergy Tool user can access these data sources to obtain properties such as the optical depth, composition, absorption, size, spatial and vertical distribution, and source region of aerosols. AERONET Ascension Island and COVE platform site data will be presented to highlight the Data Synergy Tool capabilities in analyzing urban haze, smoke, and dust aerosol events over the ocean. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.

  18. GATB: Genome Assembly & Analysis Tool Box

    PubMed Central

    Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique

    2014-01-01

    Motivation: Efficient and fast next-generation sequencing (NGS) algorithms are essential to analyze the terabytes of data generated by the NGS machines. A serious bottleneck can be the design of such algorithms, as they require sophisticated data structures and advanced hardware implementation. Results: We propose an open-source library dedicated to genome assembly and analysis to speed up the development of efficient software. The library is based on a recent optimized de Bruijn graph implementation allowing complex genomes to be processed on desktop computers using fast algorithms with low memory footprints. Availability and implementation: The GATB library is written in C++ and is available at the following Web site http://gatb.inria.fr under the A-GPL license. Contact: lavenier@irisa.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24990603

  19. Serial concept maps: tools for concept analysis.

    PubMed

    All, Anita C; Huycke, LaRae I

    2007-05-01

    Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments. PMID:17547345

  20. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    Energy Science and Technology Software Center (ESTSC)

    2012-09-13

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projection screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position.

  1. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  2. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  3. Development of data analysis tool for combat system integration

    NASA Astrophysics Data System (ADS)

    Shin, Seung-Chun; Shin, Jong-Gye; Oh, Dae-Kyun

    2013-03-01

    System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT by presenting the test results.

  4. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability, to later approach its validity. Factor analysis is a way to know the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis could be exploratory or confirmatory, and helps in the selection of the items with better performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. PMID:26572119

  5. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h. PMID:18802442

  6. Compressive Deconvolution in Medical Ultrasound Imaging.

    PubMed

    Chen, Zhouye; Basarab, Adrian; Kouame, Denis

    2016-03-01

    The interest of compressive sampling in ultrasound imaging has recently been extensively evaluated by several research teams. Depending on the application setup, it has been shown that the RF data may be reconstructed from a small number of measurements and/or using a reduced number of ultrasound pulse emissions. Nevertheless, RF image spatial resolution, contrast and signal-to-noise ratio are affected by the limited bandwidth of the imaging transducer and the physical phenomena related to US wave propagation. To overcome these limitations, several deconvolution-based image processing techniques have been proposed to enhance the ultrasound images. In this paper, we propose a novel framework, named compressive deconvolution, that reconstructs enhanced RF images from compressed measurements. By exploiting a unified formulation of the direct acquisition model, which combines random projections with 2D convolution by a spatially invariant point spread function, our approach jointly reduces the data volume and improves image quality. The proposed optimization method, based on the Alternating Direction Method of Multipliers, is evaluated on both simulated and in vivo data. PMID:26513780
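
    The acquisition model exploited above can be stated compactly; the sketch below (not the authors' ADMM solver) only simulates the forward model, with an arbitrary PSF, image size and compression ratio, to make the joint structure of the inverse problem explicit.

        # Python sketch of the compressive-deconvolution forward model: y = Phi(h conv x) + noise
        import numpy as np
        from scipy.signal import convolve2d

        rng = np.random.default_rng(4)
        nx, ny = 64, 64
        x = rng.standard_normal((nx, ny))                       # stand-in RF image
        h = np.outer(np.hanning(7), np.hanning(7))              # assumed spatially invariant PSF
        h /= h.sum()

        m = (nx * ny) // 4                                      # keep 25% of the samples (assumed ratio)
        Phi = rng.standard_normal((m, nx * ny)) / np.sqrt(m)    # Gaussian random projections

        blurred = convolve2d(x, h, mode="same", boundary="wrap")
        y = Phi @ blurred.ravel() + 0.01 * rng.standard_normal(m)
        # Compressive deconvolution then jointly inverts Phi and the convolution to recover x from y.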

  7. Structure preserving color deconvolution for immunohistochemistry images

    NASA Astrophysics Data System (ADS)

    Chen, Ting; Srinivas, Chukka

    2015-03-01

    Immunohistochemistry (IHC) staining is an important technique for the detection of one or more biomarkers within a single tissue section. In digital pathology applications, the correct unmixing of the tissue image into its individual constituent dyes for each biomarker is a prerequisite for accurate detection and identification of the underlying cellular structures. A popular technique thus far is the color deconvolution method proposed by Ruifrok et al. [1]. However, Ruifrok's method independently estimates the individual dye contributions at each pixel, which potentially leads to "holes and cracks" in the cells in the unmixed images. This is clearly inadequate since strong spatial dependencies exist in the tissue images, which contain rich cellular structures. In this paper, we formulate the unmixing algorithm as a least-squares problem over image patches, and propose a novel color deconvolution method which explicitly incorporates the spatial smoothness and structure continuity constraint into a neighborhood graph regularizer. An analytical closed-form solution to the cost function is derived for this algorithm for fast implementation. The algorithm is evaluated on a clinical data set containing a number of IHC slides stained with 3,3'-diaminobenzidine (DAB) and hematoxylin (HTX), and demonstrates better unmixing results than the existing strategy.
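
    For orientation, the pixel-independent baseline that the proposed method extends can be written in a few lines; the stain vectors below are illustrative, not calibrated values, and the graph regularizer of the paper is not reproduced here.

        # Python sketch of baseline (Ruifrok-style) color deconvolution via Beer-Lambert unmixing
        import numpy as np

        h = np.array([0.650, 0.704, 0.286])          # assumed hematoxylin optical-density vector
        d = np.array([0.268, 0.570, 0.776])          # assumed DAB optical-density vector
        r = np.cross(h, d); r /= np.linalg.norm(r)   # residual (complementary) channel
        unmix = np.linalg.inv(np.vstack([h, d, r]))  # 3x3 deconvolution matrix

        def color_deconvolve(rgb):
            """rgb: float image in [0, 1], shape (H, W, 3) -> per-stain density maps (H, W, 3)."""
            od = -np.log10(np.clip(rgb, 1e-6, 1.0))  # optical density per channel
            return (od.reshape(-1, 3) @ unmix).reshape(rgb.shape)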

  8. Improved Gabor Deconvolution and Its Extended Applications

    NASA Astrophysics Data System (ADS)

    Sun, Xuekai; Sun, Sam Zandong

    2016-02-01

    In log time-frequency spectra, the nonstationary convolution model becomes a linear equation; we therefore improve Gabor deconvolution by employing a log hyperbolic smoothing scheme that can be implemented as an iterative process. Numerical tests and practical applications demonstrate that the improved Gabor deconvolution further broadens the frequency bandwidth at lower computational expense than the ordinary method. Moreover, we extend the method's applicability to assessing nonstationarity and estimating Q values. The energy relationship of each hyperbolic bin (i.e., the attenuation curve) can be taken as a quantitative indicator for balancing nonstationarity and conditioning seismic traces to the assumption of an unchanging wavelet, which in turn reveals more useful information for constrained reflectivity inversion. Meanwhile, a statistical method for Q-value estimation is also proposed that utilizes the gradient of this linear model. In practice, not only do the estimates agree well with geologic settings, but applications to Q-compensation migration are also favorable for characterizing deep geologic structures, such as pinch-out boundaries and water channels.

  9. Real-time multi-view deconvolution

    PubMed Central

    Schmid, Benjamin; Huisken, Jan

    2015-01-01

    Summary: In light-sheet microscopy, overall image content and resolution are improved by acquiring and fusing multiple views of the sample from different directions. State-of-the-art multi-view (MV) deconvolution simultaneously fuses and deconvolves the images in 3D, but processing takes a multiple of the acquisition time and constitutes the bottleneck in the imaging pipeline. Here, we show that MV deconvolution in 3D can finally be achieved in real-time by processing cross-sectional planes individually on the massively parallel architecture of a graphics processing unit (GPU). Our approximation is valid in the typical case where the rotation axis lies in the imaging plane. Availability and implementation: Source code and binaries are available on github (https://github.com/bene51/), native code under the repository ‘gpu_deconvolution’, Java wrappers implementing Fiji plugins under ‘SPIM_Reconstruction_Cuda’. Contact: bschmid@mpi-cbg.de or huisken@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26112291

  10. Database tools for enhanced analysis of TMX-U data

    SciTech Connect

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-03-06

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers.

  11. Development of a climate data analysis tool (CDAT)

    SciTech Connect

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  12. A Semi-Automated Functional Test Data Analysis Tool

    SciTech Connect

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

  13. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  14. Nonlinear deconvolution with deblending: a new analyzing technique for spectroscopy

    NASA Astrophysics Data System (ADS)

    Sennhauser, C.; Berdyugina, S. V.; Fluri, D. M.

    2009-12-01

    Context: Spectroscopy data in general often deal with an entanglement of spectral line properties, especially in the case of blended line profiles, regardless of how high the quality of the data may be. In stellar spectroscopy and spectropolarimetry, where atomic transition parameters are usually known, the use of multi-line techniques to increase the signal-to-noise ratio of observations has become common practice. These methods extract an average line profile by means of either least squares deconvolution (LSD) or principal component analysis (PCA). However, only a few methods account for the blending of line profiles, and when they do, they assume that line profiles add linearly. Aims: We abandon the simplification of linear line-adding for Stokes I and present a novel approach that accounts for the nonlinearity in blended profiles, also illuminating the process of a reasonable deconvolution of a spectrum. Only the combination of those two enables us to treat spectral line variables independently, constituting our method of nonlinear deconvolution with deblending (NDD). The improved interpretation of the common line profile compensates for the additional computation time, especially in application to (Zeeman) Doppler imaging (ZDI). Methods: By examining how absorption lines of different depths blend with each other and describing the effects of line-adding in a mathematically simple, yet physically meaningful way, we discover how it is possible to express a total line depth in terms of a (nonlinear) combination of contributing individual components. Thus, we disentangle blended line profiles and underlying parameters in a truthful manner and strongly increase the reliability of the retrieved common line patterns. Results: By comparing different versions of LSD with our NDD technique applied to simulated atomic and molecular intensity spectra, we illustrate the improvements our method provides for the interpretation of the recovered mean line profiles. As a consequence, it is possible for the first time to retrieve an intrinsic line pattern from a molecular band, offering the opportunity to fully include molecular bands in NDD-based ZDI. However, we also show that strong line broadening prevents a unique solution for heavily blended lines, such as in molecular bandheads.
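
    The conventional, linear LSD step that the NDD method generalizes can be sketched as a single least-squares solve; the line list, weights and grids below are made up for illustration, and the nonlinear deblending itself is not reproduced.

        # Python sketch of classical least-squares deconvolution (LSD) of a mean line profile
        import numpy as np

        c = 299792.458                                   # speed of light, km/s
        wav = np.linspace(5000.0, 5050.0, 4000)          # wavelength grid (Angstrom, assumed)
        vel = np.arange(-60.0, 61.0, 1.0)                # velocity grid for the mean profile (km/s)
        lines = [(5005.2, 0.4), (5018.7, 0.7), (5032.1, 0.3)]   # (central wavelength, weight), assumed

        # Mask matrix M maps the mean profile Z(v) onto the spectrum through every line.
        M = np.zeros((wav.size, vel.size))
        for lam0, w in lines:
            v_of_pix = (wav - lam0) / lam0 * c           # pixel velocities relative to this line
            for j, v in enumerate(vel):
                M[:, j] += w * np.clip(1.0 - np.abs(v_of_pix - v), 0.0, 1.0)  # linear interpolation kernel

        def lsd_profile(y):
            """y: observed residual spectrum (1 - I/Ic) -> least-squares mean profile z."""
            z, *_ = np.linalg.lstsq(M, y, rcond=None)
            return z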

  15. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Oda, H.

    2013-12-01

    The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. The convolution process results in smoothed measurements and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of the sensor response and improve the resolution of continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that cross terms (i.e., the response of the pick-up coil for one axis to magnetic signal along other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of the magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution in UDECON is achieved by searching for the minimum ABIC while shifting the sensor response (to account for possible mispositioning of the sample on the tray) and a smoothness parameter in ranges defined by the user. Comparison of deconvolution results using the sensor response estimated from integrated point source measurements and other methods suggests that the integrated point source estimate yields better results (smaller ABIC). The noise characteristics of magnetometer measurements and the reliability of the UDECON algorithm were tested using repeated (400 in total) natural remanence measurements of a u-channel sample before and after stepwise alternating field demagnetizations. Using a series of synthetic data constructed from a real paleomagnetic record, we demonstrate that optimized deconvolution using UDECON can greatly help reveal detailed paleomagnetic information such as excursions that may be smoothed out during pass-through measurement. Application of UDECON to the vast amount of existing and future pass-through paleomagnetic and rock magnetic measurements on sediments recovered especially through ocean drilling programs will contribute to our understanding of the geodynamo and paleo-environment by providing more detailed records of geomagnetic and environmental changes.
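
    A stripped-down version of the underlying inverse problem is sketched below: the pass-through measurement is modeled as the convolution of the sample remanence with the sensor response, and the remanence is recovered by smoothness-regularized least squares. This is not UDECON; the sensor response is a made-up Gaussian, cross terms are ignored, and the smoothness weight is fixed by hand rather than selected by ABIC minimization.

        # Python sketch: regularized deconvolution of a pass-through magnetometer measurement
        import numpy as np
        from scipy.linalg import toeplitz

        dz = 0.1                                        # measurement spacing (cm, assumed)
        z = np.arange(0.0, 40.0, dz)                    # positions along the u-channel
        g = np.exp(-0.5 * (np.linspace(-5.0, 5.0, 101) / 1.8) ** 2)   # assumed sensor response
        g /= g.sum()

        # Convolution matrix G so that the measurement is m = G @ x (response centered on each position).
        n = z.size
        half = g.size // 2
        col = np.zeros(n); col[:half + 1] = g[half:]
        row = np.zeros(n); row[:half + 1] = g[half::-1]
        G = toeplitz(col, row)

        # Second-difference operator enforcing smoothness of the deconvolved record.
        D = np.diff(np.eye(n), 2, axis=0)
        lam = 0.05                                      # assumed smoothness weight

        def deconvolve(m):
            """m: measured remanence profile -> estimated sample remanence x."""
            A = G.T @ G + lam * (D.T @ D)
            return np.linalg.solve(A, G.T @ m)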

  16. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between the disciplines become stronger as systems are designed to be lighter in weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications and requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  17. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  18. Design and analysis tools for concurrent blackboard systems

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1991-01-01

    A set of blackboard system design and analysis tools that consists of a knowledge source organizer, a knowledge source input/output connectivity analyzer, and a validated blackboard system simulation model is discussed. The author presents the structure and functionality of the knowledge source input/output connectivity analyzer. An example outlining the use of the analyzer to aid in the design of a concurrent tactical decision generator for air-to-air combat is presented. The blackboard system design and analysis tools were designed for generic blackboard systems and are application independent.

  19. Discovery of protein acetylation patterns by deconvolution of peptide isomer mass spectra

    PubMed Central

    Abshiru, Nebiyu; Caron-Lizotte, Olivier; Rajan, Roshan Elizabeth; Jamai, Adil; Pomies, Christelle; Verreault, Alain; Thibault, Pierre

    2015-01-01

    Protein post-translational modifications (PTMs) play important roles in the control of various biological processes including protein–protein interactions, epigenetics and cell cycle regulation. Mass spectrometry-based proteomics approaches enable comprehensive identification and quantitation of numerous types of PTMs. However, the analysis of PTMs is complicated by the presence of indistinguishable co-eluting isomeric peptides that result in composite spectra with overlapping features that prevent the identification of individual components. In this study, we present Iso-PeptidAce, a novel software tool that enables deconvolution of composite MS/MS spectra of isomeric peptides based on features associated with their characteristic fragment ion patterns. We benchmark Iso-PeptidAce using dilution series prepared from mixtures of known amounts of synthetic acetylated isomers. We also demonstrate its applicability to different biological problems such as the identification of site-specific acetylation patterns in histones bound to chromatin assembly factor-1 and profiling of histone acetylation in cells treated with different classes of HDAC inhibitors. PMID:26468920

  20. Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning

    PubMed Central

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C.

    2014-01-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that our method achieves superior performance compared with existing methods, and can potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422

  1. Towards robust deconvolution of low-dose perfusion CT: sparse perfusion deconvolution using online dictionary learning.

    PubMed

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C

    2013-05-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that our method achieves superior performance compared with existing methods, and can potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422

  2. Least-squares (LS) deconvolution of a series of overlapping cortical auditory evoked potentials: a simulation and experimental study

    NASA Astrophysics Data System (ADS)

    Bardy, Fabrice; Van Dun, Bram; Dillon, Harvey; Cowan, Robert

    2014-08-01

    Objective. To evaluate the viability of disentangling a series of overlapping ‘cortical auditory evoked potentials’ (CAEPs) elicited by different stimuli using least-squares (LS) deconvolution, and to assess the adaptation of CAEPs for different stimulus onset-asynchronies (SOAs). Approach. Optimal aperiodic stimulus sequences were designed by controlling the condition number of matrices associated with the LS deconvolution technique. First, theoretical considerations of LS deconvolution were assessed in simulations in which multiple artificial overlapping responses were recovered. Second, biological CAEPs were recorded in response to continuously repeated stimulus trains containing six different tone-bursts with frequencies 8, 4, 2, 1, 0.5, 0.25 kHz separated by SOAs jittered around 150 (120-185), 250 (220-285) and 650 (620-685) ms. The control condition had a fixed SOA of 1175 ms. In a second condition, using the same SOAs, trains of six stimuli were separated by a silence gap of 1600 ms. Twenty-four adults with normal hearing (<20 dB HL) were assessed. Main results. Results showed disentangling of a series of overlapping responses using LS deconvolution on simulated waveforms as well as on real EEG data. The use of rapid presentation and LS deconvolution did not, however, allow the recovered CAEPs to have a higher signal-to-noise ratio than for slowly presented stimuli. The LS deconvolution technique enables the analysis of a series of overlapping responses in EEG. Significance. LS deconvolution is a useful technique for the study of adaptation mechanisms of CAEPs for closely spaced stimuli whose characteristics change from stimulus to stimulus. High-rate presentation is necessary to develop an understanding of how the auditory system encodes natural speech or other intrinsically high-rate stimuli.
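
    The core of the LS deconvolution step can be illustrated with a toy design matrix: each column holds one latency of one stimulus type, rows are EEG samples, and ordinary least squares disentangles the overlapping responses. The sampling rate, onset jitter and epoch length below are arbitrary, and no optimization of the sequence's condition number (as done in the paper) is attempted.

        # Python sketch of LS deconvolution of overlapping evoked responses
        import numpy as np

        fs = 250                                         # sampling rate, Hz (assumed)
        L = int(0.5 * fs)                                # recover 500 ms of response per stimulus
        n_types = 3
        rng = np.random.default_rng(2)

        # Jittered onset times (in samples) for each stimulus type.
        onsets = {k: np.cumsum(rng.integers(int(0.12 * fs), int(0.19 * fs), size=40)) + k * 10
                  for k in range(n_types)}
        n = max(o.max() for o in onsets.values()) + L

        # Design matrix: one column per (stimulus type, latency) pair.
        X = np.zeros((n, n_types * L))
        for k, ons in onsets.items():
            for t0 in ons:
                X[t0:t0 + L, k * L:(k + 1) * L] += np.eye(L)

        def recover_caeps(eeg):
            """eeg: 1-D array of length n -> array (n_types, L) of disentangled responses."""
            beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)
            return beta.reshape(n_types, L)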

  3. Perceived Image Quality Improvements from the Application of Image Deconvolution to Retinal Images from an Adaptive Optics Fundus Imager

    NASA Astrophysics Data System (ADS)

    Soliz, P.; Nemeth, S. C.; Erry, G. R. G.; Otten, L. J.; Yang, S. Y.

    Aim: The objective of this project was to apply an image restoration methodology based on wavefront measurements obtained with a Shack-Hartmann sensor and to evaluate the restored image quality based on medical criteria. Methods: Implementing an adaptive optics (AO) technique, a fundus imager was used to achieve low-order correction to images of the retina. The high-order correction was provided by deconvolution. A Shack-Hartmann wavefront sensor measures aberrations. The wavefront measurement is the basis for activating a deformable mirror. Image restoration to remove remaining aberrations is achieved by direct deconvolution using the point spread function (PSF) or a blind deconvolution. The PSF is estimated using measured wavefront aberrations. Direct application of classical deconvolution methods such as inverse filtering, Wiener filtering or iterative blind deconvolution (IBD) to the AO retinal images obtained from the adaptive optical imaging system is not satisfactory because of the very large image size, difficulty in modeling the system noise, and inaccuracy in PSF estimation. Our approach combines direct and blind deconvolution to exploit available system information, avoid non-convergence and avoid time-consuming iterative processes. Results: The deconvolution was applied to human subject data and the resulting restored images were compared by a trained ophthalmic researcher. Qualitative analysis showed significant improvements. Neovascularization that cannot be resolved with the standard fundus camera can be visualized with the adaptive optics device. The individual nerve fiber bundles are easily resolved, as are melanin structures in the choroid. Conclusion: This project demonstrated that computer-enhanced, adaptive optics images have greater detail of anatomical and pathological structures.
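
    A compressed illustration of the restoration chain described above is given below: a PSF is formed from a (here synthetic) residual wavefront over the pupil and used in a Wiener filter. The pupil sampling, aberration and noise-to-signal constant are assumed values, and the combined direct/blind scheme of the paper is not reproduced.

        # Python sketch: PSF from a measured wavefront, followed by Wiener restoration
        import numpy as np

        N = 256
        y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
        r2 = x ** 2 + y ** 2
        pupil = (r2 <= 1.0).astype(float)

        phase = 0.8 * (2 * r2 - 1) * pupil              # stand-in for residual defocus (radians)
        field = pupil * np.exp(1j * phase)
        psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
        psf /= psf.sum()

        def wiener_restore(img, nsr=1e-2):
            """img: blurred retinal image with the same shape as psf; nsr: assumed noise-to-signal ratio."""
            H = np.fft.fft2(np.fft.ifftshift(psf))      # optical transfer function
            G = np.fft.fft2(img)
            return np.real(np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + nsr)))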

  4. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) Program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification. 2) Model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  5. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites, researchers can achieve a systems-level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review, the functionality and use of these software tools are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685
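
    As a minimal example of the 'enrichment analysis' branch mentioned above, the sketch below runs a hypergeometric over-representation test per pathway; the metabolite sets and pathway definitions are placeholders rather than entries from any real database.

        # Python/SciPy sketch: pathway over-representation (enrichment) analysis
        from scipy.stats import hypergeom

        background = {f"M{i}" for i in range(200)}                 # all measured metabolites (placeholder)
        altered = {"M1", "M2", "M3", "M10", "M11", "M40"}          # significantly changed metabolites
        pathways = {"TCA cycle": {"M1", "M2", "M3", "M4", "M5"},
                    "Glycolysis": {"M10", "M11", "M12", "M13"},
                    "Urea cycle": {"M40", "M41", "M42"}}

        N = len(background)                                        # population size
        n = len(altered)                                           # number of draws
        for name, members in pathways.items():
            K = len(members & background)                          # pathway metabolites measured
            k = len(members & altered)                             # pathway metabolites altered
            p = hypergeom.sf(k - 1, N, K, n)                       # P(X >= k) under random draws
            print(f"{name}: {k}/{K} altered, p = {p:.3g}")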

  6. Solving a Deconvolution Problem in Photon Spectrometry

    NASA Astrophysics Data System (ADS)

    Alice-Phos Collaboration; Aleksandrov, D.; Alme, J.; Basmanov, V.; Batyunya, B.; Blau, D.; Bogolyubsky, M.; Budilov, V.; Budnikov, D.; Buskenes, J. I.; Cai, X.; Chuman, F.; Deloff, A.; Demanov, V.; Djuvsland, O.; Dobrowolski, T.; Faltys, M.; Fehlker, D.; Fil'Chagin, S.; Hiei, A.; Hille, P. T.; Horaguchi, T.; Huang, M.; Il'Kaev, R.; Ilkiv, I.; Ippolitov, M.; Iwasaki, T.; Kazantsev, A.; Karadzhev, K.; Kharlov, Y.; Kucheryaev, Y.; Kurashvili, P.; Kuryakin, A.; Larsen, D. T.; Lindal, S.; Liu, L.; Lovhoiden, G.; Ma, K.; Mamonov, A.; Manko, V.; Mao, Y.; Mares, J.; Maruyama, Y.; Müller, H.; Mizoguchi, K.; Nazarenko, S.; Nazarov, G.; Nikolaev, S.; Nikulin, S.; Nomokonov, P.; Nystrand, J.; Pavlinov, A.; Peresunko, D.; Petrov, V.; Polak, K.; Polichtchouk, B.; Potcheptsov, T.; Punin, V.; Qvigstad, H.; Redlich, K.; Roehrich, D.; Sadovsky, S.; Senko, V.; Shigaki, K.; Sibiryak, I.; Siemiarczuk, T.; Skaali, B.; Skjerdal, K.; Shabratova, G.; Soloviev, A.; Stolpovsky, P.; Sugitate, T.; Sukhorukov, M.; Torii, H.; Tveter, T.; Ullaland, K.; Wikne, J.; Vikhlyantsev, O.; Vinogradov, A.; Vinogradov, Y.; Vodopyanov, A.; Wan, R.; Wang, Y.; Wilk, G.; Wang, D.; Xu, C.; Yin, Z.; Yanovsky, V.; Yuan, X.; Zaporozhets, S.; Zenin, A.; Zhang, X.; Zhou, D.; ALICE-PHOS Collaboration

    2010-08-01

    We solve numerically a deconvolution problem to extract the undisturbed spectrum from the measured distribution contaminated by the finite resolution of the measuring device. A problem of this kind emerges when one wants to infer the momentum distribution of the neutral pions by detecting the π0 decay photons using the photon spectrometer of the ALICE LHC experiment at CERN [1]. The underlying integral equation connecting the sought-for pion spectrum and the measured gamma spectrum has been discretized and subsequently reduced to a system of linear algebraic equations. The latter system, however, is known to be ill-posed and must be regularized to obtain a stable solution. This task has been accomplished here by means of the Tikhonov regularization scheme combined with the L-curve method. The resulting pion spectrum is in an excellent quantitative agreement with the pion spectrum obtained from a Monte Carlo simulation.
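
    The regularization strategy described above can be prototyped compactly: discretize the smearing kernel, solve the Tikhonov-regularized normal equations over a grid of parameters, and take the L-curve corner as the point of maximum curvature in log-log space. The kernel below is a generic Gaussian resolution matrix, not the PHOS response, and the noise level is assumed.

        # Python sketch: Tikhonov regularization of a discretized integral equation with L-curve selection
        import numpy as np

        n = 120
        E = np.linspace(0.0, 1.0, n)
        K = np.exp(-0.5 * ((E[:, None] - E[None, :]) / 0.03) ** 2)   # assumed resolution kernel
        K /= K.sum(axis=1, keepdims=True)

        x_true = np.exp(-6.0 * E)                                    # steeply falling test "spectrum"
        y = K @ x_true + 1e-3 * np.random.default_rng(3).standard_normal(n)

        lams = np.logspace(-6, 0, 60)
        rho, eta, xs = [], [], []
        for lam in lams:
            x = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ y)
            xs.append(x)
            rho.append(np.log(np.linalg.norm(K @ x - y)))            # log residual norm
            eta.append(np.log(np.linalg.norm(x)))                    # log solution norm
        rho, eta = np.array(rho), np.array(eta)

        # Discrete curvature of the L-curve; its maximum marks the chosen regularization parameter.
        d1r, d1e = np.gradient(rho), np.gradient(eta)
        d2r, d2e = np.gradient(d1r), np.gradient(d1e)
        curvature = (d1r * d2e - d2r * d1e) / ((d1r ** 2 + d1e ** 2) ** 1.5 + 1e-12)
        best = xs[int(np.argmax(curvature))]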

  7. Quantitative deconvolution of human thermal infrared emittance.

    PubMed

    Arthur, D T J; Khan, M M

    2013-01-01

    The bioheat transfer models conventionally employed in the etiology of human thermal infrared (TIR) emittance rely upon two assumptions: universal graybody emissivity and significant transmission of heat from subsurface tissue layers. In this work, a series of clinical and laboratory experiments were designed and carried out to conclusively evaluate the validity of the two assumptions. Results obtained from the objective analyses of TIR images of human facial and tibial regions demonstrated significant variations in spectral thermophysical properties at different anatomic locations on the human body. The limited validity of the two assumptions signifies the need for quantitative deconvolution of human TIR emittance in clinical, psychophysiological and critical applications. A novel approach to joint inversion of the bioheat transfer model is also introduced, leveraging the deterministic temperature dependency of the proton resonance frequency in low-lipid human soft tissue to characterize the relationship between subsurface 3D tissue temperature profiles and the corresponding TIR emittance. PMID:23086533

  8. Deconvolution of mixed magnetism in multilayer graphene

    NASA Astrophysics Data System (ADS)

    Swain, Akshaya Kumar; Bahadur, Dhirendra

    2014-06-01

    Magnetic properties of graphite modified at the edges by KCl and exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using low-field and high-field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge state spins and the interaction between them decide the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.

  9. Iterative deconvolution of quadrupole split NMR spectra

    NASA Astrophysics Data System (ADS)

    Mila, Frédéric; Takigawa, Masashi

    2013-08-01

    We propose a simple method to deconvolute NMR spectra of quadrupolar nuclei in order to separate the distribution of local magnetic hyperfine fields from the quadrupole splitting. It is based on an iterative procedure which allows one to express the intensity of a single NMR line directly as a linear combination of the intensities of the total experimental spectrum at a few related frequencies. This procedure is argued to be an interesting complement to Fourier transformation since it can lead to a significant noise reduction in some frequency ranges. This is demonstrated in the case of the 11B-NMR spectrum in SrCu2(BO3)2 at a field of 31.7 T, where a magnetization plateau at 1/6 of the saturation has been observed.
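
    Although the paper's iterative recursion is not reproduced here, the same underlying linear model can be written down and inverted directly as a hedged sketch: the measured spectrum is the intrinsic lineshape replicated at known quadrupole offsets with known relative weights (spin-3/2 values are used for illustration, with an arbitrary splitting and frequency grid).

        # Python sketch: recovering the intrinsic lineshape from a quadrupole-split spectrum
        import numpy as np

        df = 1.0                                        # frequency step (kHz, assumed)
        f = np.arange(-400.0, 400.0 + df, df)
        nu_q = 120.0                                    # quadrupole splitting (kHz, assumed)
        shifts = np.array([-nu_q, 0.0, +nu_q])
        weights = np.array([3.0, 4.0, 3.0]) / 10.0      # relative transition intensities for spin 3/2

        n = f.size
        C = np.zeros((n, n))                            # measured spectrum S = C @ G
        for s, w in zip(shifts, weights):
            k = int(round(s / df))
            idx = np.arange(n)
            src = idx - k
            ok = (src >= 0) & (src < n)
            C[idx[ok], src[ok]] += w

        def deconvolve_spectrum(S, lam=1e-3):
            """Recover the intrinsic hyperfine-field lineshape G from the split spectrum S."""
            return np.linalg.solve(C.T @ C + lam * np.eye(n), C.T @ S)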

  10. Deconvolution of mixed magnetism in multilayer graphene

    SciTech Connect

    Swain, Akshaya Kumar; Bahadur, Dhirendra

    2014-06-16

    Magnetic properties of graphite modified at the edges by KCl and exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using Low field-high field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge state spins and the interaction between them decides the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.

  11. ML-blind deconvolution algorithm: recent developments

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Santosh; Szarowski, Donald H.; Turner, James N.; O'Connor, Nathan J.; Holmes, Timothy J.

    1996-04-01

    The Maximum Likelihood based blind deconvolution (ML-blind) algorithm is used to deblur 3D microscope images. This approach was first introduced to the microscope community by us circa 1992. The basic advantage of a blind algorithm is that it simplifies the user interface protocols and reconstructs both the object and the point spread function (PSF). In this paper, we discuss the recent improvements to the algorithm that improve robustness and accelerate convergence. For instance, powerful and physically justified constraints are enforced on the reconstructed PSF at every iteration to improve robustness. A line search technique is added to the object reconstruction to accelerate the convergence of the object estimate. A simple modification to the algorithm enables adaptation for the transmitted light brightfield modality. Finally, we incorporate montaging in order to process large data fields.
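
    A generic maximum-likelihood blind deconvolution loop in the spirit sketched above alternates Richardson-Lucy updates of the object and the PSF while enforcing non-negativity and unit normalization of the PSF. The version below is a textbook sketch under periodic boundary conditions, not the described implementation, and the iteration counts are arbitrary.

        # Python sketch: alternating (blind) Richardson-Lucy updates of object and PSF
        import numpy as np

        def _conv(a, b):
            return np.real(np.fft.ifftn(np.fft.fftn(a) * np.fft.fftn(b)))

        def _corr(a, b):
            return np.real(np.fft.ifftn(np.fft.fftn(a) * np.conj(np.fft.fftn(b))))

        def blind_rl(image, psf0, n_outer=10, n_inner=5, eps=1e-12):
            """image, psf0: non-negative float arrays of the same shape (assumed initial PSF guess)."""
            obj = np.full_like(image, image.mean())
            psf = psf0 / psf0.sum()
            for _ in range(n_outer):
                for _ in range(n_inner):                  # update object with PSF held fixed
                    obj *= _corr(image / (_conv(obj, psf) + eps), psf)
                for _ in range(n_inner):                  # update PSF with object held fixed
                    psf *= _corr(image / (_conv(obj, psf) + eps), obj)
                    psf = np.clip(psf, 0.0, None)
                    psf /= psf.sum()                      # physically justified PSF constraints
            return obj, psf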

  12. Generalized iterative deconvolution for receiver function estimation

    NASA Astrophysics Data System (ADS)

    Wang, Yinzhi; Pavlis, Gary L.

    2016-02-01

    This paper describes a generalization of the iterative deconvolution method commonly used as a component of passive array wavefield imaging. We show that the iterative method should be thought of as a sparse-output deconvolution method in which the number of terms retained depends on the convergence criteria. The generalized method we introduce uses an inverse operator to shape the assumed wavelet into a peaked function at zero lag. We show that the conventional method is equivalent to using a damped least-squares spiking filter with extremely large damping and proper scaling; in that case, the inverse operator used in the generalized method reduces to the cross-correlation operator. The insight that the output is a sparse spike series provides the basis for the second important addition of the generalized method: an output shaping wavelet. A constant output shaping wavelet is a critical component in scattered wave imaging to avoid mixing data of variable bandwidth. We demonstrate that the new approach can improve resolution by using an inverse operator tuned to maximize resolution. We also show that the signal-to-noise ratio of the result can be improved by applying a different convergence criterion than the standard method, which measures the energy left after each iteration. The efficacy of the approach was evaluated with synthetic experiments under various signal and noise conditions. We further validated the approach with real data from the USArray. We compared our results with data from the EarthScope Automated Receiver Survey and found that our results show modest improvements in consistency, measured by correlation coefficients with station stacks, and a reduced number of outliers.
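
    For context, the conventional iterative sparse-spike scheme that this paper generalizes can be sketched as below: the residual is repeatedly cross-correlated with the wavelet, a spike is placed at the best lag, and iteration stops when the reduction in residual energy falls below a tolerance. The inverse-operator and output-shaping-wavelet extensions of the generalized method are not included, and all names and defaults are illustrative.

      import numpy as np
      from scipy.signal import fftconvolve

      def iterative_decon(d, w, n_spikes=100, tol=1e-3):
          # Estimate a sparse spike series r such that d is approximately the
          # convolution of the wavelet w with r.  d and w are 1D NumPy arrays.
          r = np.zeros(len(d))
          resid = d.astype(float).copy()
          wnorm = np.dot(w, w)
          prev_power = np.dot(resid, resid)
          for _ in range(n_spikes):
              xc = fftconvolve(resid, w[::-1], mode="full")   # cross-correlation
              idx = np.argmax(np.abs(xc))
              lag = idx - (len(w) - 1)
              if lag < 0 or lag >= len(d):
                  break
              r[lag] += xc[idx] / wnorm                       # best-fitting spike amplitude
              resid = d - fftconvolve(r, w, mode="full")[:len(d)]
              power = np.dot(resid, resid)
              if (prev_power - power) / prev_power < tol:     # convergence criterion
                  break
              prev_power = power
          return r, resid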

  13. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  14. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  15. An Automated Data Analysis Tool for Livestock Market Data

    ERIC Educational Resources Information Center

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  16. Selected Tools for Risk Analysis in Logistics Processes

    NASA Astrophysics Data System (ADS)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper risk management system. Implementation of a comprehensive approach to risk management allows for the following: evaluation of the significant risk groups associated with the implementation of logistics processes, composition of integrated risk management strategies, and composition of tools for risk analysis in logistics processes.

  17. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  18. The Adversarial Route Analysis Tool: A Web Application

    SciTech Connect

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  19. An Automated Data Analysis Tool for Livestock Market Data

    ERIC Educational Resources Information Center

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  20. PIXE analysis of obsidian tools from radiocarbon dated archaeological contexts

    NASA Astrophysics Data System (ADS)

    Butalag, K.; Calcagnile, L.; Quarta, G.; Maruccio, L.; D'Elia, M.

    2008-05-01

    Obsidian tools from archaeological sites in Turkey and Southern Italy, 14C-AMS dated to the Neolithic, were submitted to IBA analyses at CEDAD, University of Salento, Italy. PIXE and PIGE quantitative analysis of major and trace elements allowed the separation of the samples into different compositional groups corresponding to different sources of obsidian, identified on the basis of published data.

  1. Advanced Statistical and Data Analysis Tools for Astrophysics

    NASA Technical Reports Server (NTRS)

    Kashyap, V.; Scargle, Jeffrey D. (Technical Monitor)

    2001-01-01

    The goal of the project is to obtain, derive, and develop statistical and data analysis tools that would be of use in the analyses of high-resolution, high-sensitivity data that are becoming available with new instruments. This is envisioned as a cross-disciplinary effort with a number of collaborators.

  2. Separation analysis, a tool for analyzing multigrid algorithms

    NASA Technical Reports Server (NTRS)

    Costiner, Sorin; Taasan, Shlomo

    1995-01-01

    The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier-type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers for performance. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.
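
    For readers unfamiliar with two-level cycles, the error-propagation operator whose action on (and mixing of) eigenvector components the separation analysis studies can be written, in standard notation that is ours rather than necessarily the authors',

      E_{2L} = S_2^{\nu_2} (I - P A_c^{-1} R A) S_1^{\nu_1},

    where A is the fine-grid operator, A_c the coarse-grid operator, R and P the restriction and prolongation transfers, and S_1, S_2 the pre- and post-relaxation operators applied \nu_1 and \nu_2 times.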

  3. V-Lab™: Virtual laboratories -- The analysis tool for structural analysis of composite components

    SciTech Connect

    1999-07-01

    V-Lab™, an acronym for Virtual Laboratories, is a design and analysis tool for fiber-reinforced composite components. This program allows the user to perform analysis, numerical experimentation, and design prototyping using advanced composite stress and failure analysis tools. The software was designed to be intuitive and easy to use, even by designers who are not experts in composite materials or structural analysis. V-Lab™ is the software tool every specialist in design engineering, structural analysis, research and development and repair needs to perform accurate, fast and economical analysis of composite components.

  4. Discovery and New Frontiers Project Budget Analysis Tool

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), the mission development and operations profile by phase (percent of total mission cost and duration), the launch vehicle, and the launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real-year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect on the overall program budget of changes in future mission or launch vehicle costs, of differing development profiles or operational durations of a future mission, or of a replan of a current mission. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted mission lines) could easily be adapted to other applications.
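
    The roll-up described above amounts to spreading each mission's fixed-year cost over time by phase, inflating to real-year dollars, and summing across missions by year. A minimal Python sketch of that arithmetic is given below; the 3% inflation rate, the phasing rule and all names are illustrative assumptions, not values or code from the D&NF tool.

      def real_year_costs(total_cost_fy, start_year, phase_fractions, inflation=0.03):
          # Spread a fixed-year mission cost across years by phase fractions and
          # inflate each year to real-year dollars (3% is an assumed rate).
          costs = {}
          for i, frac in enumerate(phase_fractions):          # e.g. [0.1, 0.3, 0.4, 0.2]
              year = start_year + i
              costs[year] = total_cost_fy * frac * (1.0 + inflation) ** (year - start_year)
          return costs

      def program_rollup(mission_cost_dicts):
          # Sum the per-year costs of several missions for comparison against budget.
          total = {}
          for mission in mission_cost_dicts:
              for year, cost in mission.items():
                  total[year] = total.get(year, 0.0) + cost
          return total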

  5. Thermal Analysis for Condition Monitoring of Machine Tool Spindles

    NASA Astrophysics Data System (ADS)

    Clough, D.; Fletcher, S.; Longstaff, A. P.; Willoughby, P.

    2012-05-01

    Decreasing tolerances on parts manufactured, or inspected, on machine tools increases the requirement to have a greater understanding of machine tool capabilities, error sources and factors affecting asset availability. Continuous usage of a machine tool during production processes causes heat generation, typically at the moving elements, resulting in distortion of the machine structure. These effects, known as thermal errors, can contribute a significant percentage of the total error in a machine tool. There are a number of design solutions available to the machine tool builder to reduce thermal error, including liquid cooling systems, low thermal expansion materials and symmetric machine tool structures. However, these can only reduce the error, not eliminate it altogether. It is therefore advisable, particularly in the production of high-value parts, for manufacturers to obtain a thermal profile of their machine, to ensure it is capable of producing in-tolerance parts. This paper considers factors affecting practical implementation of condition monitoring of thermal errors. In particular, there is the requirement to find links between temperature, which is easily measurable during production, and the errors, which are not. To this end, various methods of testing, including the advantages of thermal imaging, are shown. Results are presented from machines in typical manufacturing environments, which also highlight the value of condition monitoring using thermal analysis.

  6. Blind deconvolution of two-dimensional complex data

    SciTech Connect

    Ghiglia, D.C.; Romero, L.A.

    1994-01-01

    Inspired by the work of Lane and Bates on automatic multidimensional deconvolution, the authors have developed a systematic approach and an operational code for performing the deconvolution of multiply-convolved two-dimensional complex data sets in the absence of noise. They explain, in some detail, the major algorithmic steps, where noise or numerical errors can cause problems, their approach in dealing with numerical rounding errors, and where special noise-mitigating techniques can be used toward making blind deconvolution practical. Several examples of deconvolved imagery are presented, and future research directions are noted.

  7. Analysis tools for non-radially pulsating objects

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Pollard, K. R.; Cottrell, P. L.

    2007-06-01

    At the University of Canterbury we have been developing a set of tools for the analysis of spectra of various types of non-radially pulsating objects. This set currently includes calculation of the moments, calculation of the phase across the profile, and basic binary profile fitting for the determination of orbital characteristics and measurement of the projected rotational velocity (v sin i). Recently the ability to calculate cross-correlation profiles using either specified or synthesized line lists has been added, all implemented in MATLAB. A number of observations of γ Doradus candidates are currently being used to test these tools. For information on our observing facilities see Pollard et al. (2007).

  8. The Cube Analysis and Rendering Tool for Astronomy

    NASA Astrophysics Data System (ADS)

    Rosolowsky, E.; Kern, J.; Federl, P.; Jacobs, J.; Loveland, S.; Taylor, J.; Sivakoff, G.; Taylor, R.

    2015-09-01

    We present the design principles and current status of the Cube Analysis and Rendering Tool for Astronomy (CARTA). The CARTA project is designing a cube visualization tool for the Atacama Large Millimeter/submillimeter Array. CARTA will join the domain-specific software already developed for millimetre-wave interferometry with a server-side visualization solution. This connection will enable archive-hosted exploration of three-dimensional data cubes. CARTA will also provide an indistinguishable desktop client. While such a goal is ambitious for a short project, the team is focusing on a well-developed framework which can readily accommodate community code development through plugins.

  9. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  10. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  11. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  12. Galaxy, a web-based genome analysis tool for experimentalists

    PubMed Central

    Blankenberg, Daniel; Von Kuster, Gregory; Coraor, Nathaniel; Ananda, Guruprasad; Lazarus, Ross; Mangan, Mary; Nekrutenko, Anton; Taylor, James

    2014-01-01

    High-throughput data production has revolutionized molecular biology. However, massive increases in data generation capacity require analysis approaches that are more sophisticated, and often very computationally intensive. Thus, making sense of high-throughput data requires informatics support. Galaxy (http://galaxyproject.org) is a software system that provides this support through a framework that gives experimentalists simple interfaces to powerful tools, while automatically managing the computational details. Galaxy is available both as a publicly available web service, which provides tools for the analysis of genomic, comparative genomic, and functional genomic data, and as a downloadable package that can be deployed in individual labs. Either way, it allows experimentalists without informatics or programming expertise to perform complex large-scale analysis with just a web browser. PMID:20069535

  13. Pathway analysis tools and toxicogenomics reference databases for risk assessment.

    PubMed

    Ganter, Brigitte; Zidek, Nadine; Hewitt, Philip R; Müller, Dieter; Vladimirova, Antoaneta

    2008-01-01

    The pharmaceutical industry has begun to leverage a range of new technologies (proteomics, pharmacogenomics, metabolomics and molecular toxicology [e.g., toxicogenomics]) and analysis tools that are becoming increasingly integrated in the area of drug discovery and development. The approach of analyzing the vast amount of toxicogenomics data generated using molecular pathway and networks analysis tools in combination with analysis of reference data will be the main focus of this review. We will demonstrate how this combined approach can increase the understanding of the molecular mechanisms that lead to chemical-induced toxicity and application of this knowledge to compound risk assessment. We will provide an example of the insights achieved through a molecular toxicology analysis based on the well-known hepatotoxicant lipopolysaccharide to illustrate the utility of these new tools in the analysis of complex data sets, both in vivo and in vitro. The ultimate objective is a better lead selection process that improves the chances for success across the different stages of drug discovery and development. PMID:18154447

  14. Limited-memory scaled gradient projection methods for real-time image deconvolution in microscopy

    NASA Astrophysics Data System (ADS)

    Porta, F.; Zanella, R.; Zanghirati, G.; Zanni, L.

    2015-04-01

    Gradient projection methods have given rise to effective tools for image deconvolution in several relevant areas, such as microscopy, medical imaging and astronomy. Due to the large scale of the optimization problems arising in current imaging applications and to the growing demand for real-time reconstructions, an interesting challenge is to design new acceleration techniques for gradient schemes that preserve their simplicity and the low computational cost of each iteration. In this work we propose an acceleration strategy for a state-of-the-art scaled gradient projection method for image deconvolution in microscopy. The acceleration idea is derived by adapting a step-length selection rule, recently introduced for limited-memory steepest descent methods in unconstrained optimization, to the special constrained optimization framework arising in image reconstruction. We describe how important issues related to the generalization of the step-length rule to the imaging optimization problem have been addressed, and we evaluate the improvements due to the acceleration strategy by numerical experiments on large-scale image deconvolution problems.
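
    As a schematic of this family of methods (not the authors' algorithm), the sketch below applies gradient projection to a nonnegatively constrained least-squares deconvolution, with a Barzilai-Borwein rule standing in for the limited-memory step-length selection discussed in the paper; it ignores the scaling matrix and the Poisson data terms commonly used in microscopy, and assumes a 2D image and a PSF of equal size.

      import numpy as np
      from scipy.signal import fftconvolve

      def gp_deconvolve(image, psf, n_iter=100, step0=1.0):
          # Minimize 0.5*||K(x) - image||^2 subject to x >= 0, where K is 2D
          # convolution with psf; the step length follows the BB1 rule.
          K  = lambda x: fftconvolve(x, psf, mode="same")
          KT = lambda x: fftconvolve(x, psf[::-1, ::-1], mode="same")
          x = np.clip(image.astype(float), 0.0, None)
          g = KT(K(x) - image)
          step = step0
          for _ in range(n_iter):
              x_new = np.clip(x - step * g, 0.0, None)        # gradient step + projection
              g_new = KT(K(x_new) - image)
              s, y = x_new - x, g_new - g
              sy = np.sum(s * y)
              step = np.sum(s * s) / sy if sy > 0 else step0  # Barzilai-Borwein step length
              x, g = x_new, g_new
          return x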

  15. Deconvolution of Complex 1D NMR Spectra Using Objective Model Selection

    PubMed Central

    Hughes, Travis S.; Wilson, Henry D.; de Vera, Ian Mitchelle S.; Kojetin, Douglas J.

    2015-01-01

    Fluorine (19F) NMR has emerged as a useful tool for characterization of slow dynamics in 19F-labeled proteins. One-dimensional (1D) 19F NMR spectra of proteins can be broad, irregular and complex, due to exchange of probe nuclei between distinct electrostatic environments, and therefore cannot be deconvoluted and analyzed in an objective way using currently available software. We have developed a Python-based deconvolution program, decon1d, which uses Bayesian information criteria (BIC) to objectively determine which model (number of peaks) would most likely produce the experimentally obtained data. The method also allows for fitting of intermediate exchange spectra, which is not supported by current software in the absence of a specific kinetic model. In current methods, determination of the deconvolution model best supported by the data is done manually through comparison of residual error values, which can be time consuming and requires model selection by the user. In contrast, the BIC method used by decon1d provides a quantitative method for model comparison that penalizes model complexity, helping to prevent over-fitting of the data, and allows identification of the most parsimonious model. The decon1d program is freely available as a downloadable Python script at the project website (https://github.com/hughests/decon1d/). PMID:26241959
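
    The BIC-based selection can be illustrated with a generic peak-fitting loop such as the one below, which fits an increasing number of Lorentzians and keeps the model with the lowest BIC. This is a simplified stand-in for decon1d (which also handles intermediate-exchange lineshapes); the initial-guess heuristic and all names are our assumptions.

      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzians(x, *params):
          # Sum of Lorentzian peaks; params = (amplitude, center, width) repeated.
          y = np.zeros_like(x, dtype=float)
          for a, c, w in zip(params[0::3], params[1::3], params[2::3]):
              y += a * w**2 / ((x - c)**2 + w**2)
          return y

      def best_model_by_bic(x, y, max_peaks=4):
          # Fit 1..max_peaks Lorentzians and keep the model with the lowest BIC.
          n = len(x)
          best = None
          for m in range(1, max_peaks + 1):
              p0 = []                                          # crude initial guesses
              for c in np.linspace(x.min(), x.max(), m + 2)[1:-1]:
                  p0 += [y.max(), c, (x.max() - x.min()) / (10 * m)]
              try:
                  popt, _ = curve_fit(lorentzians, x, y, p0=p0, maxfev=20000)
              except RuntimeError:
                  continue                                     # fit failed to converge
              rss = np.sum((y - lorentzians(x, *popt))**2)
              k = 3 * m                                        # number of fitted parameters
              bic = n * np.log(rss / n) + k * np.log(n)
              if best is None or bic < best[0]:
                  best = (bic, m, popt)
          return best                                          # (BIC, n_peaks, parameters)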

  16. Tools for Large-Scale Mobile Malware Analysis

    SciTech Connect

    Bierma, Michael

    2014-01-01

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems)[1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  17. Virtual tool mark generation for efficient striation analysis.

    PubMed

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-07-01

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners. PMID:24502818

  18. Virtual Tool Mark Generation for Efficient Striation Analysis

    SciTech Connect

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.

  19. The design and implementation of a workflow analysis tool.

    PubMed

    Curcin, Vasa; Ghanem, Moustafa; Guo, Yike

    2010-09-13

    Motivated by the use of scientific workflows as a user-oriented mechanism for building executable scientific data integration and analysis applications, this article introduces a framework and a set of associated methods for analysing the execution properties of scientific workflows. Our framework uses a number of formal modelling techniques to characterize the process and data behaviour of workflows and workflow components and to reason about their functional and execution properties. We use the framework to design the architecture of a customizable tool that can be used to analyse the key execution properties of scientific workflows at authoring stage. Our design is generic and can be applied to a wide variety of scientific workflow languages and systems, and is evaluated by building a prototype of the tool for the Discovery Net system. We demonstrate and discuss the utility of the framework and tool using workflows from a real-world medical informatics study. PMID:20679131

  20. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  1. MISR Data Visualization and Analysis Using the hdfscan Tool

    NASA Astrophysics Data System (ADS)

    Crean, K. A.; Diner, D. J.; Banerjee, P. K.

    2001-05-01

    A new software tool called hdfscan is available to display and analyze data formatted using the HDF-EOS grid and swath interfaces, as well as the native HDF SDS, vdata, vgroup, and raster interfaces. The hdfscan tool can display data in both image and textual form, and can also display attributes, metadata, annotations, file structure, projection information, and simple data statistics. hdfscan also includes a data editing capability. In addition, the tool contains unique features to aid in the interpretation of data from the Multi-angle Imaging SpectroRadiometer (MISR) instrument, which currently flies aboard NASA's Terra spacecraft. These features include the ability to unscale and unpack MISR data fields; the ability to display MISR data flag values according to their interpreted values as well as their raw values; and knowledge of special MISR fill values. MISR measures upwelling radiance from Earth in 4 spectral bands corresponding to blue, green, red, and near-infrared wavelengths, at each of 9 view angles including the nadir (vertical) direction plus 26.1, 45.6, 60.0, and 70.5 degrees forward and aftward of nadir. Data products derived from MISR measurements aim at improving our understanding of the Earth's environment and climate. The hdfscan tool runs in one of two modes, as selected by the user: command-line mode, or via a graphical user interface (GUI). This provides a user with flexibility in using the tool for either batch mode processing or interactive analysis. This presentation will describe features and functionalities of the hdfscan tool. The user interface will be shown, and menu options will be explained. Information on how to obtain the tool will be provided.

  2. Systematic Omics Analysis Review (SOAR) tool to support risk assessment.

    PubMed

    McConnell, Emma R; Bell, Shannon M; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  3. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Astrophysics Data System (ADS)

    Doyle, Monica M.; O'Neil, Daniel A.; Christensen, Carissa B.

    2005-02-01

    Forecasting technology capabilities requires a tool and a process for capturing state-of-the-art technology metrics and estimates for future metrics. A decision support tool, known as the Advanced Technology Lifecycle Analysis System (ATLAS), contains a Technology Tool Box (TTB) database designed to accomplish this goal. Sections of this database correspond to a Work Breakdown Structure (WBS) developed by NASA's Exploration Systems Research and Technology (ESRT) Program. These sections cover the waterfront of technologies required for human and robotic space exploration. Records in each section include technology performance, operations, and programmatic metrics. Timeframes in the database provide metric values for the state of the art (Timeframe 0) and forecasts for timeframes that correspond to spiral development milestones in NASA's Exploration Systems Mission Directorate (ESMD) development strategy. Collecting and vetting data for the TTB will involve technologists from across the agency, the aerospace industry and academia. Technologists will have opportunities to submit technology metrics and forecasts to the TTB development team. Semi-annual forums will facilitate discussions about the basis of forecast estimates. As the tool and process mature, the TTB will serve as a powerful communication and decision support tool for the ESRT program.

  4. The discrete Kalman filtering approach for seismic signals deconvolution

    SciTech Connect

    Kurniadi, Rizal; Nurhandoko, Bagus Endar B.

    2012-06-20

    Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is deconvolution; conventional deconvolution uses inverse filters based on Wiener filter theory. This theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is then used to generate an estimate of the reflectivity function. The main advantages of Kalman filtering are its capability to handle continually time-varying models and its high resolution capabilities. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering process works on the reflectivity function, hence the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with an energy waveform which is referred to as the seismic wavelet. A higher-frequency wavelet gives a smaller wavelength; graphs of these results are presented.
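
    A minimal scalar illustration of the filtering step in this workflow, applied sample by sample to the reflectivity series produced by the primitive deconvolution, is sketched below. The random-walk state model and the noise variances are assumptions made for the example, not values from the paper.

      import numpy as np

      def kalman_smooth_reflectivity(r_primitive, q=1e-4, r_meas=1e-2):
          # Scalar discrete Kalman filter with a random-walk state model, applied
          # to the reflectivity series from primitive deconvolution.
          x, p = 0.0, 1.0                      # state estimate and its variance
          out = np.zeros(len(r_primitive))
          for i, z in enumerate(r_primitive):
              p = p + q                        # predict (random-walk process noise q)
              k = p / (p + r_meas)             # Kalman gain
              x = x + k * (z - x)              # update with the measured sample z
              p = (1.0 - k) * p
              out[i] = x
          return out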

  5. Blind Deconvolution for Ultrasound Sequences Using a Noninverse Greedy Algorithm

    PubMed Central

    Chira, Liviu-Teodor; Rusu, Corneliu; Tauber, Clovis; Girault, Jean-Marc

    2013-01-01

    The blind deconvolution of ultrasound sequences in medical ultrasound is still a major problem despite the efforts made. This paper presents a blind noninverse deconvolution algorithm to eliminate the blurring effect, using the envelope of the acquired radio-frequency sequences and an a priori Laplacian distribution for the deconvolved signal. The algorithm is executed in two steps. First, the point spread function is automatically estimated from the measured data. Second, the data are reconstructed in a nonblind way using the proposed algorithm. The algorithm is a nonlinear blind deconvolution which works as a greedy algorithm. The results on simulated signals and real images are compared with those of different state-of-the-art deconvolution methods. Our method shows good results for scatterer detection, speckle noise suppression, and execution time. PMID:24489533

  6. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    PubMed

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-01

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification. PMID:26002210
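
    The core comparison behind a "Procrustes map" can be sketched as follows: two projection-pursuit score matrices are centred and scaled, the orthogonal rotation that best maps one onto the other is found, and the residual after rotation measures how similar the two projections are. This is an illustrative sketch using SciPy's orthogonal Procrustes routine, not the authors' implementation.

      import numpy as np
      from scipy.linalg import orthogonal_procrustes

      def procrustes_similarity(scores_a, scores_b):
          # Residual after the best orthogonal rotation mapping one set of
          # projection scores (samples x components) onto another; small values
          # indicate stable, mutually consistent projections.
          a = scores_a - scores_a.mean(axis=0)
          b = scores_b - scores_b.mean(axis=0)
          a = a / np.linalg.norm(a)
          b = b / np.linalg.norm(b)
          rot, _ = orthogonal_procrustes(a, b)
          return np.linalg.norm(a @ rot - b)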

  7. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.

  8. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960s. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

  9. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis. PMID:26864350

  10. Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)

    1999-01-01

    A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors, as embodied in one preferred embodiment in a Global Positioning System receiver. The method and receiver of the present invention estimate and then compensate for multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

  11. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

  12. [Incident reporting as a tool for error analysis in medicine].

    PubMed

    Staender, S

    2001-07-01

    Incident reporting is a tool for error analysis that has its tradition in high-risk industries such as aviation, nuclear power plants or the chemical process industry. The main purpose is to detect insufficiencies in the system as well as in the individual processes of each enterprise. In medicine, incident reporting has only rarely been used for error analysis; anaesthesiology has pioneered the application of this tool. In this report the method of incident reporting is described in general, with its indications and requirements as well as its limitations. Furthermore, a model for the definition of a critical incident in medicine is described, and the first conclusions from a national incident reporting programme in Switzerland are given. PMID:11512219

  13. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    EPA Science Inventory

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  14. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES Beta

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; et al

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  16. SMART (Shop floor Modeling, Analysis and Reporting Tool) Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description of it is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a projected directory. Only authorized users can gain access to this site.

  17. Deconvolution of high rate flicker electroretinograms.

    PubMed

    Alokaily, A; Bohórquez, J; Özdamar, Ö

    2014-01-01

    Flicker electroretinograms are steady-state electroretinograms (ERGs) generated by high-rate flash stimuli that produce overlapping periodic responses. When a flash stimulus is delivered at low rates, a transient response named the flash ERG (FERG), representing the activation of neural structures within the outer retina, is obtained. Although FERGs and flicker ERGs are used in the diagnosis of many retinal diseases, their waveform relationships have not been investigated in detail. This study examines this relationship by extracting transient FERGs from specially generated quasi steady-state flicker ERGs at stimulation rates above 10 Hz and from similarly generated conventional flicker ERGs. The ability to extract the transient FERG responses by deconvolving flicker responses to temporally jittered stimuli at high rates is investigated at varying rates. FERGs were obtained from seven normal subjects stimulated with LED-based displays delivering steady-state and low-jitter quasi steady-state responses at five rates (10, 15, 32, 50, 68 Hz). The deconvolution method enabled a successful extraction of "per stimulus" unit transient ERG responses for all high stimulation rates. The deconvolved FERGs were used successfully to synthesize flicker ERGs obtained at the same high stimulation rates. PMID:25571234
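
    If the recorded sweep is modelled as the circular convolution of the jittered stimulus impulse train with a single transient response, the per-stimulus response can be recovered by a regularized frequency-domain division, as sketched below. This is one standard way to perform such a deconvolution and may differ from the estimator used in the study; the regularization constant and names are assumptions.

      import numpy as np

      def deconvolve_transient(sweep, stim_train, eps=1e-8):
          # Recover the per-stimulus transient response, assuming the recorded
          # sweep is the circular convolution of the binary stimulus train with
          # that response; eps regularizes frequencies where the train has
          # little energy.
          S = np.fft.rfft(stim_train)
          Y = np.fft.rfft(sweep)
          X = Y * np.conj(S) / (np.abs(S)**2 + eps)
          return np.fft.irfft(X, n=len(sweep))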

  18. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

  19. AstroStat-A VO tool for statistical analysis

    NASA Astrophysics Data System (ADS)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analyses are done using the public-domain statistical software R, and the output is presented to the user in a neatly formatted form. The available analyses include exploratory tests, visualizations, distribution fitting, correlation and causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser, and as an offline application. AstroStat will provide an easy-to-use interface which allows for both fetching data and performing powerful statistical analysis on them.

  20. Nonnegative image reconstruction from sparse Fourier data: a new deconvolution algorithm

    NASA Astrophysics Data System (ADS)

    Bonettini, S.; Prato, M.

    2010-09-01

    This paper deals with image restoration problems where the data are nonuniform samples of the Fourier transform of the unknown object. We study the inverse problem in both semidiscrete and fully discrete formulations, and our analysis leads to an optimization problem involving the minimization of the data discrepancy under nonnegativity constraints. In particular, we show that such a problem is equivalent to a deconvolution problem in the image space. We propose a practical algorithm, based on the gradient projection method, to compute a regularized solution in the discrete case. The key point in our deconvolution-based approach is that the fast Fourier transform can be employed in the algorithm implementation without the need to preprocess the data. Numerical experiments on simulated and real data from the NASA RHESSI mission are also presented.
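
    A minimal sketch of the gradient projection idea for this kind of problem (not the authors' algorithm or their RHESSI setup): minimize 0.5*||Hx - b||^2 subject to x >= 0, with the blur H applied through the FFT so that no explicit matrix is formed. All sizes and values below are assumptions for illustration.

    ```python
    import numpy as np

    # Minimal gradient projection sketch (not the authors' algorithm): minimize
    # 0.5 * ||H x - b||^2 subject to x >= 0, with the blur H applied via the FFT.
    rng = np.random.default_rng(1)
    n = 256
    x_true = np.zeros(n)
    x_true[[40, 90, 140, 200]] = [2.0, 1.0, 1.5, 0.7]          # sparse "image"
    psf = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 4.0 ** 2)
    psf /= psf.sum()
    H = np.fft.rfft(np.roll(psf, -n // 2))                      # transfer function of the blur

    def blur(v):                                                # H v
        return np.fft.irfft(np.fft.rfft(v) * H, n)

    def blur_adj(v):                                            # H^T v
        return np.fft.irfft(np.fft.rfft(v) * np.conj(H), n)

    b = blur(x_true) + 0.01 * rng.standard_normal(n)

    x = np.zeros(n)
    step = 1.0                         # safe: ||H|| <= 1 for a normalized PSF
    for _ in range(500):
        grad = blur_adj(blur(x) - b)
        x = np.maximum(x - step * grad, 0.0)                    # gradient step, then projection onto x >= 0
    ```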

  1. Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs

    SciTech Connect

    Howard, M.; Luttman, A.; Fowler, M.

    2014-11-01

    In imaging applications that focus on quantitative analysis, such as X-ray radiography in the security sciences, it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
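
    The following toy sketch illustrates the general pattern of sampling-based uncertainty quantification for deconvolution: a Gaussian likelihood plus a simple smoothness prior, a random-walk Metropolis sampler, and a pointwise standard deviation computed from the samples. The paper's actual likelihood (a normal approximation to the Poisson), edge-enhancing prior, and sampler are more sophisticated; every numeric choice below is an assumption for illustration.

    ```python
    import numpy as np

    # Toy sketch of sampling-based uncertainty quantification for deconvolution:
    # Gaussian likelihood, quadratic smoothness prior, random-walk Metropolis.
    rng = np.random.default_rng(2)
    n = 40
    A = np.array([[np.exp(-0.5 * (i - j) ** 2 / 2.0 ** 2) for j in range(n)] for i in range(n)])
    A /= A.sum(axis=1, keepdims=True)                  # row-normalized blur matrix
    x_true = (np.arange(n) > n // 2).astype(float)     # a step edge
    sigma = 0.02
    b = A @ x_true + sigma * rng.standard_normal(n)

    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]           # first-difference operator

    def log_post(x):
        return -0.5 * np.sum((A @ x - b) ** 2) / sigma ** 2 - 5.0 * np.sum((D @ x) ** 2)

    x, lp, samples = np.zeros(n), log_post(np.zeros(n)), []
    for it in range(20000):
        prop = x + 0.02 * rng.standard_normal(n)       # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:        # Metropolis accept/reject
            x, lp = prop, lp_prop
        if it > 5000 and it % 10 == 0:
            samples.append(x.copy())

    samples = np.asarray(samples)
    post_mean = samples.mean(axis=0)
    post_std = samples.std(axis=0)                     # pointwise uncertainty in the deconvolved signal
    ```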

  2. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.

  3. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and a two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  4. Federal metering data analysis needs and existing tools

    SciTech Connect

    Henderson, Jordan W.; Fowler, Kimberly M.

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  5. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules are discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The resulting linear time-invariant state-space equations of motion are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle, illustrating some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  6. Impact of beam deconvolution on noise properties in CMB measurements: Application to Planck LFI

    NASA Astrophysics Data System (ADS)

    Keihänen, E.; Kiiveri, K.; Lindholm, V.; Reinecke, M.; Suur-Uski, A.-S.

    2016-03-01

    We present an analysis of the effects of beam deconvolution on noise properties in CMB measurements. The analysis is built around the artDeco beam deconvolver code. We derive a low-resolution noise covariance matrix that describes the residual noise in deconvolution products, both in harmonic and pixel space. The matrix models the residual correlated noise that remains in time-ordered data after destriping, and the effect of deconvolution on this noise. To validate the results, we generate noise simulations that mimic the data from the Planck LFI instrument. A χ2 test for the full 70 GHz covariance in the multipole range ℓ = 0-50 yields a mean reduced χ2 of 1.0037. We compare two destriping options, full and independent destriping, when deconvolving subsets of available data. Full destriping leaves substantially less residual noise, but leaves data sets intercorrelated. We also derive a white noise covariance matrix that provides an approximation of the full noise at high multipoles, and study the properties of high-resolution noise in pixel space through simulations.
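
    For reference, the validation statistic quoted above is the usual reduced χ2 of a residual noise map against the derived covariance; a sketch of the definition, with n a simulated noise vector of length N_d (e.g. the harmonic coefficients up to ℓ = 50) and N the covariance matrix, is:

    ```latex
    % Reduced chi-square of a residual noise map n (length N_d) against the
    % derived noise covariance matrix N; a mean value close to 1 over the
    % simulations indicates that N describes the residual noise correctly.
    \chi^2_{\mathrm{red}} = \frac{\mathbf{n}^{\mathsf{T}} \mathbf{N}^{-1} \mathbf{n}}{N_d},
    \qquad \left\langle \chi^2_{\mathrm{red}} \right\rangle \approx 1 .
    ```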

  7. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  9. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
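
    A minimal Monte Carlo sketch of the core EMAT idea (illustrative only, not the tool itself): sample the number of failures of each component type from assumed reliabilities, check whether the carried spares cover them, and estimate the probability that the spares manifest supports the mission. Component names, rates, and counts below are hypothetical.

    ```python
    import numpy as np

    # Minimal Monte Carlo sketch of the EMAT idea: sample component failures from
    # assumed exponential reliabilities and check whether the carried spares
    # cover them over the mission duration. All values are hypothetical.
    rng = np.random.default_rng(3)
    mission_hours = 500 * 24
    components = {                      # name: (failure rate per hour, installed count, spares)
        "pump":       (2e-5, 2, 1),
        "controller": (1e-5, 4, 2),
        "fan":        (5e-5, 6, 2),
    }

    def one_mission():
        for rate, count, spares in components.values():
            # Failures of this component type over the mission, approximated as
            # Poisson with mean = rate * installed count * duration.
            failures = rng.poisson(rate * count * mission_hours)
            if failures > spares:
                return False            # ran out of spares: loss of function
        return True

    n_runs = 20000
    successes = sum(one_mission() for _ in range(n_runs))
    print(f"P(mission supported by spares) ~= {successes / n_runs:.3f}")
    ```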

  10. Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique

    ERIC Educational Resources Information Center

    Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

    2005-01-01

    A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to

  12. Protocol analysis as a tool for behavior analysis

    PubMed Central

    Austin, John; Delaney, Peter F.

    1998-01-01

    The study of thinking is made difficult by the fact that many of the relevant stimuli and responses are not apparent. Although the use of verbal reports has a long history in psychology, it is only recently that Ericsson and Simon's (1993) book on verbal reports explicated the conditions under which such reports may be reliable and valid. We review some studies in behavior analysis and cognitive psychology that have used talk-aloud reporting. We review particular methods for collecting reliable and valid verbal reports using the “talk-aloud” method as well as discuss alternatives to the talk-aloud procedure that are effective under different task conditions, such as the use of reports after completion of very rapid task performances. We specifically caution against the practice of asking subjects to reflect on the causes of their own behavior and the less frequently discussed problems associated with providing inappropriate social stimulation to participants during experimental sessions. PMID:22477126

  13. Evaluating control displays with the Engineering Control Analysis Tool (ECAT)

    SciTech Connect

    Plott, B.

    2006-07-01

    In the Nuclear Power Industry increased use of automated sensors and advanced control systems is expected to reduce and/or change manning requirements. However, critical questions remain regarding the extent to which safety will be compromised if the cognitive workload associated with monitoring multiple automated systems is increased. Can operators/engineers maintain an acceptable level of performance if they are required to supervise multiple automated systems and respond appropriately to off-normal conditions? The interface to/from the automated systems must provide the information necessary for making appropriate decisions regarding intervention in the automated process, but be designed so that the cognitive load is neither too high nor too low for the operator who is responsible for the monitoring and decision making. This paper will describe a new tool that was developed to enhance the ability of human systems integration (HSI) professionals and systems engineers to identify operational tasks in which a high potential for human overload and error can be expected. The tool is entitled the Engineering Control Analysis Tool (ECAT). ECAT was designed and developed to assist in the analysis of: Reliability Centered Maintenance (RCM), operator task requirements, human error probabilities, workload prediction, potential control and display problems, and potential panel layout problems. (authors)

  14. A new bioinformatics analysis tools framework at EMBL–EBI

    PubMed Central

    Goujon, Mickael; McWilliam, Hamish; Li, Weizhong; Valentin, Franck; Squizzato, Silvano; Paern, Juri; Lopez, Rodrigo

    2010-01-01

    The EMBL-EBI provides access to various mainstream sequence analysis applications. These include sequence similarity search services such as BLAST, FASTA, InterProScan and multiple sequence alignment tools such as ClustalW, T-Coffee and MUSCLE. Through the sequence similarity search services, the users can search mainstream sequence databases such as EMBL-Bank and UniProt, and more than 2000 completed genomes and proteomes. We present here a new framework aimed at both novice as well as expert users that exposes novel methods of obtaining annotations and visualizing sequence analysis results through one uniform and consistent interface. These services are available over the web and via Web Services interfaces for users who require systematic access or want to interface with customized pipelines and workflows using common programming languages. The framework features novel result visualizations and integration of domain and functional predictions for protein database searches. It is available at http://www.ebi.ac.uk/Tools/sss for sequence similarity searches and at http://www.ebi.ac.uk/Tools/msa for multiple sequence alignments. PMID:20439314

  15. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  16. Application of the Lucy–Richardson Deconvolution Procedure to High Resolution Photoemission Spectra

    SciTech Connect

    Rameau, J.; Yang, H.-B.; Johnson, P.D.

    2010-07-01

    Angle-resolved photoemission has developed into one of the leading probes of the electronic structure and associated dynamics of condensed matter systems. As with any experimental technique the ability to resolve features in the spectra is ultimately limited by the resolution of the instrumentation used in the measurement. Previously developed for sharpening astronomical images, the Lucy-Richardson deconvolution technique proves to be a useful tool for improving the photoemission spectra obtained in modern hemispherical electron spectrometers where the photoelectron spectrum is displayed as a 2D image in energy and momentum space.
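
    For reference, the standard Richardson-Lucy update the abstract refers to is sketched below for a one-dimensional spectrum blurred by a known resolution kernel; the ARPES application applies the same iteration to two-dimensional energy-momentum images using the measured instrument resolution function, which is not modeled here.

    ```python
    import numpy as np

    # Standard Richardson-Lucy iteration, sketched in 1D for a measured spectrum d
    # and a known instrumental broadening kernel p.
    def richardson_lucy(d, p, n_iter=50):
        p = p / p.sum()
        p_flipped = p[::-1]
        u = np.full_like(d, d.mean())                 # flat, positive initial estimate
        for _ in range(n_iter):
            blurred = np.convolve(u, p, mode="same")
            ratio = d / np.maximum(blurred, 1e-12)
            u *= np.convolve(ratio, p_flipped, mode="same")
        return u

    # Example with a synthetic two-peak spectrum and a Gaussian resolution kernel.
    x = np.arange(400)
    d_true = np.exp(-0.5 * ((x - 150) / 3.0) ** 2) + 0.6 * np.exp(-0.5 * ((x - 170) / 3.0) ** 2)
    kernel = np.exp(-0.5 * (np.arange(-25, 26) / 8.0) ** 2)
    d_meas = np.convolve(d_true, kernel / kernel.sum(), mode="same")
    d_sharp = richardson_lucy(d_meas, kernel, n_iter=100)
    ```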

  17. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages, where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program: the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB-based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems), which integrates many aspects of the structural, optical, control, and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady-state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature-varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model and makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady-state solver uses a restricted-step-size Newton method, and the transient solver is an adaptive-step-size implicit method applicable to general differential algebraic systems. Temperature-dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, we present applications in which the thermal modeling is "in the loop" with sensitivity analysis, optimization, and optical performance, drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST).
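
    A minimal sketch of the linear conduction network described above, assembled by hand rather than from a finite element model: nodal conductances form a symmetric matrix G, one node is held at a fixed boundary temperature, and the steady-state heat balance G T = q is solved for the free node temperatures. Node names, conductances, and loads are illustrative assumptions.

    ```python
    import numpy as np

    # Minimal linear conduction network: solve the steady-state heat balance
    # G @ T = q for the free node temperatures, with one node held at a fixed
    # boundary temperature. All conductances and loads are illustrative.
    nodes = ["mirror", "bench", "strut", "radiator"]
    G_links = {("mirror", "bench"): 0.8, ("bench", "strut"): 1.2, ("strut", "radiator"): 2.0}  # W/K
    q = np.array([5.0, 2.0, 0.0, 0.0])       # absorbed/dissipated power per node, W
    T_radiator = 150.0                        # fixed boundary temperature, K

    n = len(nodes)
    G = np.zeros((n, n))
    for (a, b), g in G_links.items():
        i, j = nodes.index(a), nodes.index(b)
        G[i, i] += g; G[j, j] += g
        G[i, j] -= g; G[j, i] -= g

    free = [i for i, name in enumerate(nodes) if name != "radiator"]
    fixed = nodes.index("radiator")
    # Move the known boundary temperature to the right-hand side and solve.
    rhs = q[free] - G[np.ix_(free, [fixed])].ravel() * T_radiator
    T_free = np.linalg.solve(G[np.ix_(free, free)], rhs)
    temperatures = dict(zip([nodes[i] for i in free], T_free))
    ```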

  18. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed, open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. COAST leverages shared code samples from the World Wind community, and the direction of the COAST 2.0 enhancements is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  19. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  20. Deconvolution for three-dimensional acoustic source identification based on spherical harmonics beamforming

    NASA Astrophysics Data System (ADS)

    Chu, Zhigang; Yang, Yang; He, Yansong

    2015-05-01

    Spherical Harmonics Beamforming (SHB) with solid spherical arrays has become a particularly attractive tool for acoustic source identification in cabin environments. However, it presents some intrinsic limitations, specifically poor spatial resolution and severe sidelobe contamination. This paper focuses on overcoming these limitations effectively by deconvolution. First and foremost, a new formulation is proposed that expresses SHB's output as a convolution of the true source strength distribution and the point spread function (PSF), defined as SHB's response to a unit-strength point source. Additionally, the typical deconvolution methods initially suggested for planar arrays, the deconvolution approach for the mapping of acoustic sources (DAMAS), nonnegative least-squares (NNLS), Richardson-Lucy (RL) and CLEAN, are successfully adapted to SHB and are capable of producing highly resolved and deblurred maps. Finally, the merits of the deconvolution methods are validated, and the relationships of the source strength and pressure contribution reconstructed by the deconvolution methods vs. focus distance are explored both with computer simulations and experimentally. Several interesting results have emerged from this study: (1) compared with SHB, DAMAS, NNLS, RL and CLEAN can not only improve the spatial resolution dramatically but also reduce or even eliminate the sidelobes effectively, allowing clear and unambiguous identification of a single source or incoherent sources. (2) The applicability of RL to coherent sources is highest, followed by DAMAS and NNLS, while that of CLEAN is lowest due to its failure to suppress sidelobes. (3) The previous two results hold whether or not the real distance from the source to the array center equals the assumed distance, referred to as the focus distance. (4) The true source strength can be recovered by dividing the reconstructed one by a coefficient that is the square of the focus distance divided by the real distance from the source to the array center. (5) The reconstructed pressure contribution is almost unaffected by the focus distance, always approximating the true one. This study will be of great significance for the accurate localization and quantification of acoustic sources in cabin environments.
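
    The deconvolution step shared by DAMAS and NNLS can be sketched in a few lines: the beamforming map b is modeled as A q, where column j of A is the array's point spread function for a unit source at grid point j, and the nonnegative source strengths q are recovered by nonnegative least squares. The sketch below uses a synthetic one-dimensional PSF rather than the spherical-harmonics PSF of the paper.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Sketch of the deconvolution step common to DAMAS/NNLS: model the beamforming
    # map b as A @ q, where column j of A is the array's point spread function for
    # a unit source at grid point j, and recover nonnegative source strengths q.
    # The PSF here is synthetic; in practice it comes from the SHB response.
    rng = np.random.default_rng(4)
    n_grid = 120
    psf = lambda i, j: np.sinc(0.3 * (i - j)) ** 2          # synthetic, sidelobe-bearing PSF
    A = np.array([[psf(i, j) for j in range(n_grid)] for i in range(n_grid)])

    q_true = np.zeros(n_grid)
    q_true[[30, 34, 80]] = [1.0, 0.6, 0.8]
    b = A @ q_true + 0.01 * rng.standard_normal(n_grid)     # blurred, noisy beamformer output

    q_est, residual = nnls(A, b)                            # nonnegative least-squares deconvolution
    ```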

  1. Informed constrained spherical deconvolution (iCSD).

    PubMed

    Roine, Timo; Jeurissen, Ben; Perrone, Daniele; Aelterman, Jan; Philips, Wilfried; Leemans, Alexander; Sijbers, Jan

    2015-08-01

    Diffusion-weighted (DW) magnetic resonance imaging (MRI) is a noninvasive imaging method, which can be used to investigate neural tracts in the white matter (WM) of the brain. However, the voxel sizes used in DW-MRI are relatively large, making DW-MRI prone to significant partial volume effects (PVE). These PVEs can be caused both by complex (e.g. crossing) WM fiber configurations and non-WM tissue, such as gray matter (GM) and cerebrospinal fluid. High angular resolution diffusion imaging methods have been developed to correctly characterize complex WM fiber configurations, but significant non-WM PVEs are also present in a large proportion of WM voxels. In constrained spherical deconvolution (CSD), the full fiber orientation distribution function (fODF) is deconvolved from clinically feasible DW data using a response function (RF) representing the signal of a single coherently oriented population of fibers. Non-WM PVEs cause a loss of precision in the detected fiber orientations and an emergence of false peaks in CSD, more prominently in voxels with GM PVEs. We propose a method, informed CSD (iCSD), to improve the estimation of fODFs under non-WM PVEs by modifying the RF to account for non-WM PVEs locally. In practice, the RF is modified based on tissue fractions estimated from high-resolution anatomical data. Results from simulation and in-vivo bootstrapping experiments demonstrate a significant improvement in the precision of the identified fiber orientations and in the number of false peaks detected under GM PVEs. Probabilistic whole brain tractography shows fiber density is increased in the major WM tracts and decreased in subcortical GM regions. The iCSD method significantly improves the fiber orientation estimation at the WM-GM interface, which is especially important in connectomics, where the connectivity between GM regions is analyzed. PMID:25660002

  2. GLIDER: Free tool imagery data visualization, analysis and mining

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Graves, S. J.; Berendes, T.; Maskey, M.; Chidambaram, C.; Hogan, P.; Gaskin, T.

    2009-12-01

    Satellite imagery can be analyzed to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of such thematic information can vary from military applications, such as detecting assets of interest, to science applications, such as characterizing land-use/land-cover change at local, regional, and global scales. However, extracting thematic information using satellite imagery is a non-trivial task. It requires a user to preprocess the data by applying operations for radiometric and geometric corrections. The user also needs to be able to visualize the data and apply different image enhancement operations to digitally improve the images to identify subtle information that might otherwise be missed. Finally, the user needs to apply different information extraction algorithms to the imagery to obtain the thematic information. At present, there are limited tools that provide users with the capability to easily extract and exploit the information contained within satellite imagery. This presentation introduces GLIDER, a free software tool addressing this void. GLIDER provides users with an easy-to-use tool to visualize, analyze, and mine satellite imagery. GLIDER allows users to visualize and analyze satellite imagery in its native sensor view, an important capability because any transformation to either a geographic coordinate system or a projected coordinate system entails spatial and intensity interpolation and, hence, loss of information. GLIDER allows users to perform their analysis in the native sensor view without any loss of information. GLIDER provides users with a full suite of image processing algorithms that can be used to enhance the satellite imagery. It also provides pattern recognition and data mining algorithms for information extraction. GLIDER allows its users to project satellite data and the analysis/mining results onto a globe and overlay additional data layers. Traditional analysis tools generally do not provide a good interface between visualization and analysis, especially a 3D view, and GLIDER fills this gap. This feature gives users extremely useful spatial context for their data and analysis/mining results. This presentation will demonstrate the latest version of GLIDER and also describe its supporting documentation, such as video tutorials and online resources.

  3. Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.

    PubMed

    Villeneuve, Emma; Carfantan, Hervé

    2014-10-01

    Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution, due to the atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line-of-sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimate the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Monte Carlo Markov chain algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) or not (estimation). The performances of the methods are compared with classical ones, on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172
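
    The sketch below is not the authors' MCMC scheme; it only illustrates the parametric line model and the linearity in flux that their estimation schemes exploit: for any trial (center, width) the best-fit amplitude has a closed form, so only the nonlinear parameters need to be explored (here by a coarse grid, with assumed wavelengths and noise level).

    ```python
    import numpy as np

    # Sketch of the parametric emission-line model and of the linearity in flux
    # that the estimation schemes exploit: for each trial (center, width) the
    # best-fit amplitude is available in closed form, so only the nonlinear
    # parameters need to be scanned (the paper samples them with MCMC instead).
    rng = np.random.default_rng(5)
    lam = np.linspace(6540.0, 6590.0, 200)                 # wavelength axis (assumed units)

    def line(lam, flux, center, sigma):
        return flux * np.exp(-0.5 * ((lam - center) / sigma) ** 2)

    spec = line(lam, 1.3, 6563.0, 2.5) + 0.05 * rng.standard_normal(lam.size)

    best = (np.inf, None)
    for center in np.linspace(6550, 6575, 101):
        for sigma in np.linspace(1.0, 6.0, 51):
            g = line(lam, 1.0, center, sigma)
            flux = (g @ spec) / (g @ g)                    # closed-form amplitude (linear parameter)
            resid = spec - flux * g
            chi2 = resid @ resid
            if chi2 < best[0]:
                best = (chi2, (flux, center, sigma))

    flux, center, sigma = best[1]
    velocity_kms = (center - 6563.0) / 6563.0 * 299792.458  # Doppler shift -> line-of-sight velocity
    ```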

  4. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, intended for the design, analysis, and optimization of process plants, has features that accommodate the special requirements of cryogenics and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander, and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction from the plant, considering the constraints on other parameters. The analysis results give a clear idea of the parameter values to choose before the actual plant is implemented in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
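
    Before building the full simulation, a back-of-the-envelope energy balance for the ideal Linde-Hampson cycle gives the liquid yield per unit mass of compressed gas, y = (h1 - h2) / (h1 - hf). The sketch below uses placeholder enthalpy values standing in for the property-package lookups that Aspen HYSYS performs; it is a sanity check, not a substitute for the process model.

    ```python
    # Back-of-the-envelope energy balance for an ideal Linde-Hampson liquefier.
    # Enthalpy values below are placeholders for property-package lookups.
    h1 = 450.0   # kJ/kg, low-pressure gas returning at ambient temperature (assumed)
    h2 = 420.0   # kJ/kg, high-pressure gas at ambient temperature after compression (assumed)
    hf = 30.0    # kJ/kg, saturated liquid leaving the separator (assumed)

    # Steady-state balance around the heat exchanger / JT valve / separator:
    #   h2 = y * hf + (1 - y) * h1   =>   y = (h1 - h2) / (h1 - hf)
    y = (h1 - h2) / (h1 - hf)
    print(f"Liquid yield per kg of compressed gas: {y:.3f}")
    ```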

  5. Principles and tools for collaborative entity-based intelligence analysis.

    PubMed

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis. PMID:20075480

  6. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  7. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing, with typical control requirements on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  8. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  9. Image and Data-analysis Tools For Paleoclimatic Reconstructions

    NASA Astrophysics Data System (ADS)

    Pozzi, M.

    We propose here a directory of tools and computing resources chosen to address the problems that arise in paleoclimatic reconstruction. The following points are discussed in particular: 1) Numerical analysis of paleo-data (fossil abundances, species analyses, isotopic signals, chemical-physical parameters, biological data): a) statistical analyses (univariate, diversity, rarefaction, correlation, ANOVA, F and T tests, Chi^2); b) multidimensional analyses (principal components, correspondence, cluster analysis, seriation, discriminant, autocorrelation, spectral analysis); c) neural analyses (backpropagation nets, Kohonen feature maps, Hopfield nets, genetic algorithms). 2) Graphical analysis (visualization tools) of paleo-data (quantitative and qualitative fossil abundances, species analyses, isotopic signals, chemical-physical parameters): a) 2-D data analyses (graph, histogram, ternary, survivorship); b) 3-D data analyses (direct volume rendering, isosurfaces, segmentation, surface reconstruction, surface simplification, generation of tetrahedral grids). 3) Quantitative and qualitative digital image analysis (macro- and microfossil image analysis, Scanning Electron Microscope and Optical Polarized Microscope image capture and analysis, morphometric data analysis, 3-D reconstructions): a) 2-D image analysis (correction of image defects, enhancement of image detail, converting texture and directionality to grey-scale or colour differences, visual enhancement using pseudo-colour, pseudo-3D, thresholding of image features, binary image processing, measurements, stereological measurements, measuring features on a white background); b) 3-D image analysis (basic stereological procedures; two-dimensional structures: area fraction from the point count, volume fraction from the point count; three-dimensional structures: surface area and the line intercept count; three-dimensional microstructures: line length and the area point count; measurement using grids, measuring area with pixels, measurement parameters, shape and position, image processing to enable thresholding and measurement, image processing to extract measurable information, combining multiple images, photogrammetry measurement application).

  10. BEDTools: The Swiss-Army Tool for Genome Feature Analysis.

    PubMed

    Quinlan, Aaron R

    2014-01-01

    Technological advances have enabled the use of DNA sequencing as a flexible tool to characterize genetic variation and to measure the activity of diverse cellular phenomena such as gene isoform expression and transcription factor binding. Extracting biological insight from the experiments enabled by these advances demands the analysis of large, multi-dimensional datasets. This unit describes the use of the BEDTools toolkit for the exploration of high-throughput genomics datasets. Several protocols are presented for common genomic analyses, demonstrating how simple BEDTools operations may be combined to create bespoke pipelines addressing complex questions. PMID:25199790
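
    As a minimal illustration of the kind of composition the unit describes, the snippet below drives two widely used BEDTools subcommands (intersect and merge) from Python; the BED file names are placeholders.

    ```python
    import subprocess

    # Minimal illustration of composing BEDTools operations from Python; the BED
    # file names are placeholders. Step 1: keep peaks that overlap at least one
    # exon. Step 2: merge overlapping intervals of the result (bedtools merge
    # expects coordinate-sorted input).
    overlaps = subprocess.run(
        ["bedtools", "intersect", "-a", "peaks.bed", "-b", "exons.bed", "-u"],
        capture_output=True, text=True, check=True,
    ).stdout

    with open("peaks_in_exons.bed", "w") as fh:
        fh.write(overlaps)

    merged = subprocess.run(
        ["bedtools", "merge", "-i", "peaks_in_exons.bed"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(merged[:500])
    ```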

  11. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity. PMID:16675027
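
    A sketch of the quantifiers described above, computed with PyWavelets on an arbitrary one-dimensional signal: energy per decomposition level, relative wavelet energies, and the normalized total wavelet entropy. The EEG-specific preprocessing, epoching, and band assignments of the study are not reproduced; the wavelet, level count, and test signal are assumptions.

    ```python
    import numpy as np
    import pywt

    # Energy per decomposition level, relative wavelet energies, and normalized
    # total wavelet entropy, computed with PyWavelets on a synthetic signal.
    rng = np.random.default_rng(6)
    fs = 256
    t = np.arange(0, 8, 1 / fs)
    signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)  # stand-in for an EEG epoch

    coeffs = pywt.wavedec(signal, "db4", level=6)               # [cA6, cD6, cD5, ..., cD1]
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()                               # relative wavelet energies
    wavelet_entropy = -np.sum(p * np.log(p)) / np.log(p.size)   # normalized to [0, 1]
    ```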

  12. BEDTools: the Swiss-army tool for genome feature analysis

    PubMed Central

    Quinlan, Aaron R.

    2014-01-01

    Technological advances have enabled the use of DNA sequencing as a flexible tool to characterize genetic variation and to measure the activity of diverse cellular phenomena such as gene isoform expression and transcription factor binding. Extracting biological insight from the experiments enabled by these advances demands the analysis of large, multi-dimensional datasets. This unit describes the use of the BEDTools toolkit for the exploration of high-throughput genomics datasets. I present several protocols for common genomic analyses and demonstrate how simple BEDTools operations may be combined to create bespoke pipelines addressing complex questions. PMID:25199790

  13. Integrated network analysis and effective tools in plant systems biology

    PubMed Central

    Fukushima, Atsushi; Kanaya, Shigehiko; Nishida, Kozo

    2014-01-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolisms. PMID:25408696

  14. SNooPy: TypeIa supernovae analysis tools

    NASA Astrophysics Data System (ADS)

    Burns, Christopher R.; Stritzinger, Maximilian; Phillips, M. M.; Kattner, ShiAnne; Persson, S. E.; Madore, Barry F.; Freedman, Wendy L.; Boldt, Luis; Campillay, Abdo; Contreras, Carlos; Folatelli, Gaston; Gonzalez, Sergio; Krzeminski, Wojtek; Morrell, Nidia; Salgado, Francisco; Suntzeff, Nicholas B.

    2015-05-01

    The SNooPy package (also known as SNpy), written in Python, contains tools for the analysis of TypeIa supernovae. It offers interactive plotting of light-curve data and models (and spectra), computation of reddening laws and K-corrections, LM non-linear least-squares fitting of light-curve data, and various types of spline fitting, including Diercx and tension. The package also includes a SNIa lightcurve template generator in the CSP passbands, estimates of Milky-Way Extinction, and a module for dealing with filters and spectra.

  15. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can make variant analysis better. There are existing tools like HotNet2 and dmGWAS that can provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image, ami-4510312f. PMID:27158457

  16. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    NASA Technical Reports Server (NTRS)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  17. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    SciTech Connect

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  18. Java Analysis Tools for Element Production Calculations in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Lingerfelt, E.; Hix, W.; Guidry, M.; Smith, M.

    2002-12-01

    We are developing a set of extendable, cross-platform tools and interfaces using Java and vector graphic technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as stand-alone applications or distributed across a network. These tools, which have broad applications in general scientific visualization, are currently being used to explore and analyze a large library of nuclear reaction rates and visualize results of explosive nucleosynthesis calculations with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to easily include new rates and compare and adjust current ones. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic (ten-fold) reduction in visualization file sizes while maintaining high visual quality and interactive control. ORNL Physics Division is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  19. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  20. Minimum variance image blending for robust ultrasound image deconvolution

    NASA Astrophysics Data System (ADS)

    Park, Sungchan; Kang, Jooyoung; Kim, Yun-Tae; Kim, Kyuhong; Kim, Jung-Ho; Song, Jong Keun

    2014-03-01

    As an ultrasound wave propagates through the human body, its velocity and attenuation change from region to region, so the shape of the point spread function (PSF) varies. PSF estimation is an ill-posed problem and is rarely error free; the resulting estimation errors leave the image over-blurred by sidelobe artifacts. For ultrasound deconvolution to be commercially viable, image deconvolution that is robust against such artifacts is essential. Minimum variance (MV) beamformer algorithms are known to be robust to noise and to deliver high resolution efficiently. Treating the channel data as image pixels, we present a new spatially varying MV blending scheme that operates on the deconvolved images in the image-processing domain. Through stochastic blending of the deconvolved images, we obtain high-resolution results in which the blur artifacts are sufficiently suppressed even when the input deconvolved images contain restoration errors. We verify the algorithm on real data. In all cases, the artifacts are suppressed and the results show the highest resolution among the deconvolution methods compared.
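
    The per-pixel weighting behind minimum-variance blending can be sketched in Python as follows; this is a generic image-domain illustration rather than the authors' channel-data scheme, and the local window and residual-based variance estimate are assumptions.

        import numpy as np
        from scipy.ndimage import uniform_filter

        # Generic per-pixel minimum-variance blending of several deconvolved images
        # (an illustrative image-domain sketch, not the authors' channel-data scheme):
        # weights are inversely proportional to the local variance of each candidate's
        # residual against the observed image, so unreliable restorations contribute less.
        def mv_blend(deconvolved_stack, observed, window=7, eps=1e-6):
            weights = []
            for img in deconvolved_stack:
                residual = img - observed
                local_var = uniform_filter(residual**2, window) - uniform_filter(residual, window)**2
                weights.append(1.0 / (local_var + eps))      # low variance -> high confidence
            weights = np.stack(weights)
            weights /= weights.sum(axis=0, keepdims=True)    # normalize per pixel
            return (weights * np.stack(deconvolved_stack)).sum(axis=0)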

  1. Improving space debris detection in GEO ring using image deconvolution

    NASA Astrophysics Data System (ADS)

    Núñez, Jorge; Núñez, Anna; Montojo, Francisco Javier; Condominas, Marta

    2015-07-01

    In this paper we present a method based on image deconvolution to improve the detection of space debris, mainly in the geostationary ring. Among the deconvolution methods we chose the iterative Richardson-Lucy (R-L) algorithm, as the method that achieves the best results with a reasonable amount of computation. For this work, we used two sets of real 4096 × 4096 pixel test images obtained with the Telescope Fabra-ROA at Montsec (TFRM). Using the first set of data, we establish the optimal number of iterations at 7, and applying the R-L method with 7 iterations to the images, we show that the astrometric accuracy does not vary significantly, while the limiting magnitude of the deconvolved images increases significantly compared to the original ones. The increase is on average about 1.0 magnitude, which means that objects up to 2.5 times fainter can be detected after deconvolution. The application of the method to the second set of test images, which includes several faint objects, shows that, after deconvolution, up to four previously undetected faint objects are detected in a single frame. Finally, we carried out a study of some economic aspects of applying the deconvolution method, showing that an important economic impact can be envisaged.
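
    For reference, the Richardson-Lucy iteration has a compact generic form; the Python sketch below is the textbook algorithm rather than the TFRM processing pipeline, and assumes a 2-D float image and a unit-sum PSF.

        import numpy as np
        from scipy.signal import fftconvolve

        # Generic textbook Richardson-Lucy iteration (a minimal sketch, not the
        # TFRM pipeline); assumes a 2-D float image and a PSF normalized to unit sum.
        def richardson_lucy(image, psf, iterations=7, eps=1e-12):
            estimate = np.full(image.shape, image.mean(), dtype=float)
            psf_mirror = psf[::-1, ::-1]
            for _ in range(iterations):
                blurred = fftconvolve(estimate, psf, mode="same")
                ratio = image / (blurred + eps)                      # data / current model
                estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            return estimate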

  2. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence models the trajectories of a set of spacecraft evolving over time, calculates relevant parameters during this propagation, and maneuvers individual spacecraft to maintain a set of mission constraints established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

  3. The LAGRANTO Lagrangian analysis tool - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10⁻⁶ K m² kg⁻¹ s⁻¹). Full versions of LAGRANTO 2.0 are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.
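
    To make the selection criteria concrete, the Python sketch below shows one way a command such as "GT:PV:2" could be applied to trajectory data. It is purely illustrative and is not LAGRANTO's own implementation; each trajectory is assumed to be a dictionary of time series keyed by field name.

        import numpy as np

        # Illustrative only (not LAGRANTO's implementation): apply a criterion such
        # as "GT:PV:2" to trajectories, where each trajectory is a dict mapping a
        # field name to a 1-D array over time; a trajectory is kept if the comparison
        # holds at any time step.
        OPS = {"GT": np.greater, "LT": np.less, "GE": np.greater_equal, "LE": np.less_equal}

        def select_trajectories(trajectories, criterion):
            op_name, field, threshold = criterion.split(":")
            keep = OPS[op_name]
            return [tr for tr in trajectories if keep(tr[field], float(threshold)).any()]

        # selected = select_trajectories(trajectories, "GT:PV:2")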

  4. The Lagrangian analysis tool LAGRANTO - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-02-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.

  5. PyRAT (python radiography analysis tool): overview

    SciTech Connect

    Armstrong, Jerawan C; Temple, Brian A; Buescher, Kevin L

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) a hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of the linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem - initial data analysis + material library yields a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.

  6. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input stream for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation - the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design of product development process.

  7. Topological Tools For The Analysis Of Active Region Filament Stability

    NASA Astrophysics Data System (ADS)

    DeLuca, Edward E.; Savcheva, A.; van Ballegooijen, A.; Pariat, E.; Aulanier, G.; Su, Y.

    2012-05-01

    The combination of accurate NLFFF models and high resolution MHD simulations allows us to study the changes in stability of an active region filament before a CME. Our analysis strongly supports the following sequence of events leading up to the CME: first there is a build-up of magnetic flux in the filament through flux cancellation beneath a developing flux rope; as the flux rope develops, a hyperbolic flux tube (HFT) forms beneath the flux rope; reconnection across the HFT raises the flux rope while adding additional flux to it; the eruption is triggered when the flux rope becomes torus-unstable. The work applies topological analysis tools that have been developed over the past decade and points the way for future work on the critical problem of CME initiation in solar active regions. We will present the uses of this approach, current limitations and future prospects.

  8. Message Correlation Analysis Tool for NOvA

    NASA Astrophysics Data System (ADS)

    Lu, Qiming; Biery, Kurt A.; Kowalkowski, James B.

    2012-12-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run-time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) system of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  9. Stacks: an analysis tool set for population genomics

    PubMed Central

    CATCHEN, JULIAN; HOHENLOHE, PAUL A.; BASSHAM, SUSAN; AMORES, ANGEL; CRESKO, WILLIAM A.

    2014-01-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics. PMID:23701397

  10. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
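
    The radial-intensity step can be outlined in a few lines of Python; the sketch below is illustrative only, not the released Ganalyzer code, and the ring-sampling and peak-slope details are assumptions.

        import numpy as np

        # Illustrative outline (not the released Ganalyzer code) of the radial
        # intensity idea: sample the image on rings around the galaxy centre, track
        # the brightest angle on each ring, and use the slope of that angle with
        # radius as a simple spirality measure (near zero for elliptical galaxies).
        def spirality(image, cx, cy, radii, n_angles=360):
            angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
            peak_angle = []
            for r in radii:
                xs = np.clip((cx + r * np.cos(angles)).astype(int), 0, image.shape[1] - 1)
                ys = np.clip((cy + r * np.sin(angles)).astype(int), 0, image.shape[0] - 1)
                ring = image[ys, xs]                         # one row of the radial intensity plot
                peak_angle.append(angles[np.argmax(ring)])
            peak_angle = np.unwrap(np.array(peak_angle))     # remove 2*pi jumps between rings
            slope, _ = np.polyfit(np.asarray(radii, dtype=float), peak_angle, 1)
            return abs(slope)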

  11. CRITICA: coding region identification tool invoking comparative analysis

    NASA Technical Reports Server (NTRS)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).

  12. MIDAW: a web tool for statistical analysis of microarray data.

    PubMed

    Romualdi, Chiara; Vitulo, Nicola; Del Favero, Micky; Lanfranchi, Gerolamo

    2005-07-01

    MIDAW (microarray data analysis web tool) is a web interface integrating a series of statistical algorithms that can be used for processing and interpretation of microarray data. MIDAW consists of two main sections: data normalization and data analysis. In the normalization phase the simultaneous processing of several experiments with background correction, global and local mean and variance normalization are carried out. The data analysis section allows graphical display of expression data for descriptive purposes, estimation of missing values, reduction of data dimension, discriminant analysis and identification of marker genes. The statistical results are organized in dynamic web pages and tables, where the transcript/gene probes contained in a specific microarray platform can be linked (according to user choice) to external databases (GenBank, Entrez Gene, UniGene). Tutorial files help the user throughout the statistical analysis to ensure that the forms are filled out correctly. MIDAW has been developed using Perl and PHP and it uses R/Bioconductor languages and routines. MIDAW is GPL licensed and freely accessible at http://muscle.cribi.unipd.it/midaw/. Perl and PHP source codes are available from the authors upon request. PMID:15980553

  13. Spectral semi-blind deconvolution with least trimmed squares regularization

    NASA Astrophysics Data System (ADS)

    Deng, Lizhen; Zhu, Hu

    2014-11-01

    A spectral semi-blind deconvolution with least trimmed squares regularization (SBD-LTS) is proposed to improve spectral resolution. First, the regularization term on the spectrum data is modeled in the form of least trimmed squares, which helps preserve peak details. Then the regularization term on the PSF is modeled as an L1-norm to enhance the stability of kernel estimation. The cost function of SBD-LTS is formulated and the numerical solution processes are derived for deconvolving the spectra and estimating the PSF. The deconvolution results for simulated infrared spectra demonstrate that the proposed SBD-LTS can recover the spectrum effectively and estimate the PSF accurately, and has the additional merit of preserving details, especially in the presence of noise. The deconvolution result for an experimental Raman spectrum indicates that SBD-LTS can resolve the spectrum and improve the resolution effectively.
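
    The shape of such a cost function can be sketched schematically in Python; the snippet below is an assumption-laden reading of the idea rather than the paper's exact formulation, with lam_x, lam_k, and trim as illustrative parameters.

        import numpy as np

        # Schematic reading of an SBD-LTS-style cost (an assumption, not the paper's
        # exact formulation): least-squares fidelity, a least-trimmed-squares penalty
        # on the spectrum's first differences so the largest differences (the peaks)
        # are left unpenalized, and an L1 penalty on the PSF kernel.
        def sbd_lts_cost(y, x, k, lam_x=0.1, lam_k=0.01, trim=0.8):
            fidelity = np.sum((y - np.convolve(x, k, mode="same"))**2)
            sq_diff = np.sort(np.diff(x)**2)          # squared first differences, ascending
            h = int(trim * sq_diff.size)              # keep only the smallest `trim` fraction
            lts_term = np.sum(sq_diff[:h])
            l1_psf = np.sum(np.abs(k))
            return fidelity + lam_x * lts_term + lam_k * l1_psf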

  14. Extracting ocean surface information from altimeter returns - The deconvolution method

    NASA Technical Reports Server (NTRS)

    Rodriguez, Ernesto; Chapman, Bruce

    1989-01-01

    An evaluation of the deconvolution method for estimating ocean surface parameters from ocean altimeter waveforms is presented. It is shown that this method presents a fast, accurate way of determining the ocean surface parameters from noisy altimeter data. Three parameters may be estimated by using this method, including the altimeter-height error, the ocean-surface standard deviation, and the ocean-surface skewness. By means of a Monte Carlo experiment, an 'optimum' deconvolution algorithm and the accuracies with which the above parameters may be estimated using this algorithm are determined. Then the influence of instrument effects, such as errors in calibration and pointing-angle estimation, on the estimated parameters is examined. Finally, the deconvolution algorithm is used to estimate height and ocean-surface parameters from Seasat data.

  15. The direct deconvolution of X-ray spectra

    NASA Technical Reports Server (NTRS)

    Kahn, S. M.; Blissett, R. J.

    1980-01-01

    The method of deconvolution of proportional counter X-ray spectra as outlined by Blissett and Cruise is reviewed with particular emphasis on low-energy spectra. This method involves the expansion of the incident spectrum in terms of a set of orthonormal singular functions which propagate independently through the detector matrix. When applied to low-energy detectors which typically exhibit absorption features in their efficiency curves, the initial Blissett and Cruise prescription is shown to be unstable. Alternative methods for handling the efficiency are described and evaluated. In the deconvolution of steep spectra, additional distortions are shown to arise as a result of sidelobe oscillations in the effective response functions. A selective weighting procedure is thus introduced to suppress spurious features of this type. Finally, the role of filtering in the deconvolution procedure is discussed and several suitable forms for the filter are suggested.

  16. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids useable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

  17. Validating whole slide digital morphometric analysis as a microscopy tool.

    PubMed

    Diller, Robert B; Kellar, Robert S

    2015-02-01

    Whole slide imaging (WSI) can be used to quantify multiple responses within tissue sections during histological analysis. Feature Analysis on Consecutive Tissue Sections (FACTS®) allows the investigator to perform digital morphometric analysis (DMA) within specified regions of interest (ROI) across multiple serial sections at faster rates when compared with manual morphometry methods. Using FACTS® in conjunction with WSI is a powerful analysis tool, which allows DMA to target specific ROI across multiple tissue sections stained for different biomarkers. DMA may serve as an appropriate alternative to classic, manual, histologic morphometric measures, which have historically relied on the selection of high-powered fields of views and manual scoring (e.g., a gold standard). In the current study, existing preserved samples were used to determine if DMA would provide similar results to manual counting methods. Rodent hearts (n=14, left ventricles) were stained with Masson's trichrome, and reacted for cluster of differentiation 68 (CD-68). This study found no statistically significant difference between a classic, manual method and the use of digital algorithms to perform similar counts (p=0.38). DMA offers researchers the ability to accurately evaluate morphological characteristics in a reproducible fashion without investigator bias and with higher throughput. PMID:25399639

  18. An online database for plant image analysis software tools

    PubMed Central

    2013-01-01

    Background Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is best suited for their research. Results We present an online, manually curated, database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software in a uniform and concise manner enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. Conclusions The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work. PMID:24107223

  19. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices’ ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. Field offices need standardized, scientifically sound methodology for local climate analysis (such as trend, composites, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice to the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. LCAT focuses on the local scale (as opposed to the national and global scales of CPC products). The LCAT will:
    - Improve the professional competency of local office staff and their expertise in providing local information to users, thereby improving the quality of local climate services.
    - Ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, thereby allowing improvement of CPC climate products.
    - Allow testing of local climate variables beyond temperature averages and precipitation totals, such as the climatology of tornadoes, flash floods, storminess, and extreme weather events, thereby expanding the suite of NWS climate products.
    The LCAT development utilizes the NWS Operations and Services Improvement Process (OSIP) to document the field and user requirements, develop solutions, and prioritize resources. OSIP is a five work-stage process separated by four gate reviews. LCAT is currently at work-stage three: Research Demonstration and Solution Analysis. Gate 1 and 2 reviews identified LCAT as a high strategic priority project with a very high operational need. The Integrated Working Team, consisting of NWS field representatives, assists in tool function design and identification of LCAT operational deployment support.

  20. Deconvolution of dynamic dual photon microscopy images of cerebral microvasculature to assess the hemodynamic status of the brain

    NASA Astrophysics Data System (ADS)

    Mehrabian, Hatef; Lindvere, Liis; Stefanovic, Bojana; Martel, Anne L.

    2011-03-01

    Assessing the hemodynamic status of the brain and its variations in response to stimulation is required to understand the local cerebral circulatory mechanisms. Dynamic contrast enhanced imaging of cerebral microvasculature provides information that can be used in understanding the physiology of cerebral diseases. Bolus tracking is used to extract characteristic parameters that quantify local cerebral blood flow. However, post-processing of the data is needed to segment the field of view (FOV) and to perform deconvolution to remove the effects of the input bolus profile and the path it travels to reach the imaging window. The main challenges are finding the arterial input function (AIF) and dealing with the ill-posedness of the deconvolution system. We propose using independent component analysis (ICA) to segment the FOV and to extract a local AIF as well as the venous output function that is required for deconvolution. This also helps to stabilize the system, as ICA suppresses noise efficiently. Tikhonov regularization (with L-curve analysis to find the best regularization parameter) is used to make the system stable. In-vivo dynamic 2PLSM images of a rat brain in two conditions (when the animal is at rest and when it is stimulated) are used in this study. The experimental and simulation studies provide promising results that demonstrate the feasibility and importance of performing deconvolution.
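
    The Tikhonov step can be sketched as a standard bolus-tracking deconvolution. The Python outline below uses illustrative names (aif, tissue_curve, lambdas) and is not the authors' code; the L-curve corner is found with a crude maximum-curvature search.

        import numpy as np
        from scipy.linalg import toeplitz, svd

        # Generic bolus-tracking Tikhonov deconvolution with a crude L-curve corner
        # search (illustrative names, not the authors' code): the AIF becomes a
        # lower-triangular convolution matrix, the SVD gives filtered solutions for a
        # range of regularization parameters, and the parameter at the point of
        # maximum curvature of log(residual norm) vs. log(solution norm) is chosen.
        def tikhonov_deconvolve(aif, tissue_curve, dt, lambdas):
            A = dt * toeplitz(aif, np.zeros_like(aif))
            U, s, Vt = svd(A)
            s = np.maximum(s, 1e-12 * s.max())            # guard against zero singular values
            beta = U.T @ tissue_curve
            res_norm, sol_norm, solutions = [], [], []
            for lam in lambdas:
                f = s**2 / (s**2 + lam**2)                # Tikhonov filter factors
                r = Vt.T @ (f * beta / s)
                solutions.append(r)
                res_norm.append(np.linalg.norm(A @ r - tissue_curve))
                sol_norm.append(np.linalg.norm(r))
            x, y = np.log(res_norm), np.log(sol_norm)
            dx, dy = np.gradient(x), np.gradient(y)
            curvature = np.abs(dx * np.gradient(dy) - dy * np.gradient(dx)) / (dx**2 + dy**2) ** 1.5
            best = int(np.argmax(curvature))
            return solutions[best], lambdas[best]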

  1. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning (D&D) projects.

  2. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, the need for software modifications when configuring it for a particular spacecraft can be reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  3. Analysis of flow cytometry data using an automatic processing tool.

    PubMed

    Jeffries, David; Zaidi, Irfan; de Jong, Bouke; Holland, Martin J; Miles, David J C

    2008-09-01

    In spite of recent advances in flow cytometry technology, most cytometry data is still analyzed manually which is labor-intensive for large datasets and prone to bias and inconsistency. We designed an automatic processing tool (APT) to rapidly and consistently define and describe cell populations across large datasets. Image processing, smoothing, and clustering algorithms were used to generate an expert system that automatically reproduces the functionality of commercial manual cytometry processing tools. The algorithms were developed using a dataset collected from CMV-infected infants and combined within a graphical user interface, to create the APT. The APT was used to identify regulatory T-cells in HIV-infected adults, based on expression of FOXP3. Results from the APT were compared directly with the manual analyses of five immunologists and showed close agreement, with a concordance correlation coefficient of 0.96 (95% CI 0.91-0.98). The APT was well accepted by users and able to process around 100 data files per hour. By applying consistent criteria to all data generated by a study, the APT can provide a level of objectivity that is difficult to match using conventional manual analysis. PMID:18613039

  4. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-and-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  5. Analysis and specification tools in relation to the APSE

    NASA Technical Reports Server (NTRS)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  6. Validation of tool mark analysis of cut costal cartilage.

    PubMed

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2012-03-01

    This study was designed to establish the potential error rate associated with the generally accepted method of tool mark analysis of cut marks in costal cartilage. Three knives with different blade types were used to make experimental cut marks in costal cartilage of pigs. Each cut surface was cast, and each cast was examined by three analysts working independently. The presence of striations, regularity of striations, and presence of a primary and secondary striation pattern were recorded for each cast. The distance between each striation was measured. The results showed that striations were not consistently impressed on the cut surface by the blade's cutting edge. Also, blade type classification by the presence or absence of striations led to a 65% misclassification rate. Use of the classification tree and cross-validation methods and inclusion of the mean interstriation distance decreased the error rate to c. 50%. PMID:22081951

  7. Gnome--an Internet-based sequence analysis tool.

    PubMed

    Nakai, K; Tokimori, T; Ogiwara, A; Uchiyama, I; Niiyama, T

    1994-09-01

    Gnome (GenomeNet Open Mail-service Environment) is a sequence analysis tool that enables an end-user to make use of several Internet-based (mainly e-mail) services with an easy-to-use graphical user interface. Users can conduct homology and motif searches, and database-entry retrieval against the latest databases, by sending search requests to, and receiving their results from, a search server by e-mail. The search results are viewed and managed efficiently with this system. The Macintosh and X (Motif) versions of the Gnome client and the UNIX version of the Gnome server are available to academic users free of charge. PMID:7828072

  8. Input Range Testing for the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the Graphical User Interface and the script interface. The examples are illustrated from a scripting perspective, because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.

  9. PARSESNP: a tool for the analysis of nucleotide polymorphisms

    PubMed Central

    Taylor, Nicholas E.; Greene, Elizabeth A.

    2003-01-01

    PARSESNP is a tool for the display and analysis of polymorphisms in genes. Using a reference DNA sequence, an exon/intron position model and a list of polymorphisms, it determines the effects of these polymorphisms on the expressed gene product, as well as the changes in restriction enzyme recognition sites. It shows the locations and effects of the polymorphisms in summary on a stylized graphic and in detail on a display of the protein sequence aligned with the DNA sequence. The addition of a homology model, in the form of an alignment of related protein sequences, allows for prediction of the severity of missense changes. PARSESNP is available on the World Wide Web at http://www.proweb.org/parsesnp/. PMID:12824424

  10. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  11. Development of unified plotting tools for GA transport analysis

    NASA Astrophysics Data System (ADS)

    Buuck, M.; Candy, J.

    2011-10-01

    A collection of python classes for the TGYRO suite of codes (NEO, GYRO, TGYRO, TGLF) has been developed that provides the expert user with conceptually simple access to all code output data, and the casual end user with simple command-line control of plotting. The user base for these transport analysis codes continues to grow, raising the urgency of modernizing and unifying the plotting tools used for post-simulation analysis. Simultaneously, there is a push toward larger-scale fusion modeling, underscoring the need for a revised, modernized approach to data management and analysis. The TGYRO suite is currently in use at all major fusion laboratories worldwide, and allows the user to make steady-state profile predictions for existing devices and future reactors, and simultaneously to carry out fundamental research on plasma transport (both collisional and turbulent). Work supported by US DOE under DE-FG02-95ER54309 and the National Undergraduate Fellowship in Fusion Science and Engineering.

  12. Second generation sequencing allows for mtDNA mixture deconvolution and high resolution detection of heteroplasmy

    PubMed Central

    Holland, Mitchell M.; McQuillan, Megan R.; O’Hanlon, Katherine A.

    2011-01-01

    Aim To use parallel array pyrosequencing to deconvolute mixtures of mitochondrial DNA (mtDNA) sequence and provide high resolution analysis of mtDNA heteroplasmy. Methods The hypervariable segment 1 (HV1) of the mtDNA control region was analyzed from 30 individuals using the 454 GS Junior instrument. Mock mixtures were used to evaluate the system’s ability to deconvolute mixtures and to reliably detect heteroplasmy, including heteroplasmic differences between 5 family members of the same maternal lineage. Amplicon sequencing was performed on polymerase chain reaction (PCR) products generated with primers that included multiplex identifiers (MID) and adaptors for pyrosequencing. Data analysis was performed using NextGENe® software. The analysis of an autosomal short tandem repeat (STR) locus (D18S51) and a Y-STR locus (DYS389 I/II) was performed simultaneously with a portion of HV1 to illustrate that multiplexing can encompass different markers of forensic interest. Results Mixtures, including heteroplasmic variants, can be detected routinely down to a component ratio of 1:250 (20 minor variant copies with a coverage rate of 5000 sequences) and can be readily detected down to 1:1000 (0.1%) with expanded coverage. Amplicon sequences from D18S51, DYS389 I/II, and the second half of HV1 were successfully partitioned and analyzed. Conclusions The ability to routinely deconvolute mtDNA mixtures down to a level of 1:250 allows for high resolution analysis of mtDNA heteroplasmy, and for differentiation of individuals from the same maternal lineage. The pyrosequencing approach results in poor resolution of homopolymeric sequences, and PCR/sequencing artifacts require a filtering mechanism similar to that for STR stutter and spectral bleed through. In addition, chimeric sequences from jumping PCR must be addressed to make the method operational. PMID:21674826

  13. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.

  14. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    PubMed Central

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  15. Multi-Parseval frame-based nonconvex sparse image deconvolution

    NASA Astrophysics Data System (ADS)

    Shao, Wen-Ze; Deng, Hai-Song; Wei, Zhi-Hui

    2012-06-01

    Image deconvolution is an ill-posed, low-level vision task that restores a clear image from a blurred and noisy observation. From the perspective of statistics, previous work on image deconvolution has been formulated as a maximum a posteriori or a general Bayesian inference problem, with Gaussian or heavy-tailed non-Gaussian prior image models (e.g., a Student's t distribution). We propose a Parseval frame-based nonconvex image deconvolution strategy that penalizes the l0-norm of the coefficients of multiple different Parseval frames. With these frames, flexible filtering operators are provided to adaptively capture the point singularities, the curvilinear edges, and the oscillating textures in natural images. The proposed optimization problem is solved by borrowing the idea of the recent penalty decomposition method, resulting in a simple and efficient iterative algorithm. Experimental results show that the proposed deconvolution scheme is highly competitive among state-of-the-art methods, in terms of both the improvement in signal-to-noise ratio and visual perception.

  16. Deconvolution of astronomical images using SOR with adaptive relaxation.

    PubMed

    Vorontsov, S V; Strakhov, V N; Jefferies, S M; Borelli, K J

    2011-07-01

    We address the potential performance of the successive overrelaxation technique (SOR) in image deconvolution, focusing our attention on the restoration of astronomical images distorted by atmospheric turbulence. SOR is the classical Gauss-Seidel iteration, supplemented with relaxation. As indicated by earlier work, the convergence properties of SOR, and its ultimate performance in the deconvolution of blurred and noisy images, can be made competitive with other iterative techniques, including conjugate gradients, by a proper choice of the relaxation parameter. The question of how to choose the relaxation parameter, however, remained open, and in practical work one had to rely on experimentation. In this paper, using constructive (rather than exact) arguments, we suggest a simple strategy for choosing the relaxation parameter and for updating its value in consecutive iterations to optimize the performance of the SOR algorithm (and its positivity-constrained version, +SOR) at finite iteration counts. We suggest an extension of the algorithm to the notoriously difficult problem of "blind" deconvolution, where both the true object and the point-spread function have to be recovered from the blurred image. We report the results of numerical inversions with artificial and real data, where the algorithm is compared with techniques based on conjugate gradients. In all of our experiments +SOR provides the highest quality results. In addition, +SOR is found to be able to detect moderately small changes in the true object between separate data frames: an important quality for multi-frame blind deconvolution, where stationarity of the object is a necessity. PMID:21747506
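
    To make the role of the relaxation parameter concrete, the Python sketch below shows a generic SOR iteration for a linear system A x = b; in the deconvolution setting A would represent the (normal-equation) blurring operator and b the blurred, noisy data. The paper's adaptive parameter update and the positivity-constrained +SOR variant are not reproduced.

        import numpy as np

        # Generic SOR iteration for a linear system A x = b, shown only to make the
        # relaxation parameter concrete; the paper's adaptive omega update and the
        # positivity-constrained +SOR variant are not reproduced here.
        def sor(A, b, omega=1.5, iterations=100, x0=None):
            n = b.size
            x = np.zeros(n) if x0 is None else x0.astype(float).copy()
            for _ in range(iterations):
                for i in range(n):
                    sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
                    gauss_seidel = (b[i] - sigma) / A[i, i]
                    x[i] = (1.0 - omega) * x[i] + omega * gauss_seidel   # relaxation step
            return x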

  17. Motion-compensated blind deconvolution of scanning laser opthalmoscope imagery

    NASA Astrophysics Data System (ADS)

    O'Connor, Nathan J.; Bartsch, Dirk-Uwe G.; Freeman, William R.; Holmes, Timothy J.

    1998-06-01

    A deconvolution algorithm for use with scanning laser ophthalmoscope (SLO) data is being developed. The SLO is fundamentally a confocal microscope in which the objective lens is the human ocular lens. 3D data is collected by raster scanning to form images at different depths in retinal and choroidal layers. In this way, 3D anatomy may be imaged and stored as a series of optical sections. Given the poor optical quality of the human lens and random eye motion during data acquisition, any deconvolution method applied to SLO data must be able to account for distortions present in the observed data. The algorithm presented compensates for image warping and frame-to-frame displacement due to random eye motion, smearing along the optic axis, sensor saturation, and other problems. A preprocessing step is first used to compensate for frame-to-frame image displacement. The image warping, caused by random eye motion during raster scanning, is then corrected. Finally, a maximum likelihood based blind deconvolution algorithm is used to correct severe blurring along the optic axis. The blind deconvolution algorithm contains an iterative search for subpixel displacements remaining after image warping and frame-to-frame displacements are corrected. This iterative search is formulated to ensure that the likelihood functional is non-decreasing.

  18. Blind deconvolution using an improved L0 sparse representation

    NASA Astrophysics Data System (ADS)

    Ye, Pengzhao; Feng, Huajun; Li, Qi; Xu, Zhihai; Chen, Yueting

    2014-09-01

    In this paper, we present a method for single-image blind deconvolution. Many common blind deconvolution methods need to first generate a salient image, whereas this paper presents a novel L0 sparse expression to directly solve the ill-posed problem. It has no need to filter the blurred image as a restoration step and can use the gradient information as a fidelity term during optimization. The key to the blind deconvolution problem is to estimate an accurate kernel. First, based on an L2 sparse expression using the gradient operator as a prior, the kernel can be estimated roughly and efficiently in the frequency domain. We adopt a multi-scale scheme that estimates the blur kernel from a coarser level to a finer level. After the estimation of each level's kernel, L0 sparse representation is employed as the fidelity term during restoration. After derivation, the L0 norm can be approximately converted into a sum term and an L1-norm term, which can be addressed by the split Bregman method. By using the estimated blur kernel and the TV deconvolution model, the final restored image is obtained. Experimental results show that the proposed method is fast and can accurately reconstruct the kernel, especially when the blur is motion blur, defocus blur, or a superposition of the two. The restored image is of higher quality than that produced by some state-of-the-art algorithms.

  19. Machine learning deconvolution filter kernels for image restoration

    NASA Astrophysics Data System (ADS)

    Mainali, Pradip; Wittebrood, Rimmert

    2015-03-01

    In this paper, we propose a novel algorithm to recover a sharp image from its corrupted form by deconvolution. The algorithm learns the deconvolution process. This is achieved by learning the deconvolution filter kernels for a set of learnt basic pixel patterns. The algorithm consists of offline learning and online filtering stages. In the one-time offline learning stage, the algorithm learns a dictionary of local pixel-patch characteristics as the basic pixel patterns from a large number of natural images in the training database. The deconvolution filter coefficients for each pixel pattern are then optimized using the source and corrupted image pairs in the training database. In the online stage, the algorithm only needs to find the nearest matching pixel pattern in the dictionary for each pixel and filter it using the filter optimized for the corresponding pattern. Experimental results on natural images show that our method achieves state-of-the-art results on image deblurring. The proposed approach can be applied to recover a sharp image in applications such as cameras, HD/UHD TV, and document scanning systems.
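
    The offline "learn a filter per pixel pattern" and online "match, then filter" stages can be illustrated loosely as follows. This is a simplified stand-in under stated assumptions (7x7 patches, plain k-means for the pattern dictionary, unconstrained least-squares filters), not the published pipeline, and all names are invented.

      import numpy as np

      PATCH = 7                                   # patch side length

      def extract_patches(img, step=2):
          h, w = img.shape
          return np.array([img[i:i + PATCH, j:j + PATCH].ravel()
                           for i in range(0, h - PATCH, step)
                           for j in range(0, w - PATCH, step)])

      def kmeans(data, k, n_iter=20, seed=0):
          rng = np.random.default_rng(seed)
          centers = data[rng.choice(len(data), k, replace=False)].astype(float)
          for _ in range(n_iter):
              labels = np.argmin(((data[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
              for c in range(k):
                  if np.any(labels == c):
                      centers[c] = data[labels == c].mean(axis=0)
          return centers, labels

      def train(corrupt_imgs, sharp_imgs, k=32):
          X = np.vstack([extract_patches(im) for im in corrupt_imgs])
          y = np.hstack([extract_patches(im)[:, PATCH * PATCH // 2] for im in sharp_imgs])
          patterns, labels = kmeans(X - X.mean(axis=1, keepdims=True), k)
          filters = np.zeros((k, PATCH * PATCH))
          for c in range(k):                       # one least-squares filter per pattern
              idx = labels == c
              if idx.any():
                  filters[c], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
          return patterns, filters

      def restore_pixel(patch, patterns, filters):
          v = patch.ravel()
          c = np.argmin(((patterns - (v - v.mean())) ** 2).sum(axis=1))
          return float(filters[c] @ v)             # online stage: match, then filter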

  20. Improving the efficiency of deconvolution algorithms for sound source localization.

    PubMed

    Lylloff, Oliver; Fernández-Grande, Efrén; Agerkvist, Finn; Hald, Jørgen; Roig, Elisabet Tiana; Andersen, Martin S

    2015-07-01

    The localization of sound sources with delay-and-sum (DAS) beamforming is limited by poor spatial resolution, particularly at low frequencies. Various methods based on deconvolution are examined to improve the resolution of the beamforming map, which can be modeled by a convolution of the unknown acoustic source distribution and the beamformer's response to a point source, i.e., the point-spread function. A significant limitation of deconvolution, however, is the additional computational effort compared to beamforming. In this paper, computationally efficient deconvolution algorithms are examined with computer simulations and experimental data. Specifically, the deconvolution problem is solved with a fast gradient projection method called the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA), and compared with a Fourier-based non-negative least squares algorithm. The results indicate that FISTA tends to provide an improved spatial resolution and is up to 30% faster and more robust to noise. In the spirit of reproducible research, the source code is available online. PMID:26233017
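
    The fast gradient projection idea (FISTA with a non-negativity projection) applied to a generic linear model b ≈ A x can be sketched in a few lines; here A would be the matrix form of the beamformer's point-spread function. This is a textbook FISTA sketch under that assumption, not the paper's implementation, and the step size is simply 1/L with L estimated from the spectral norm of A.

      import numpy as np

      def fista_nnls(A, b, n_iter=500):
          """Minimize ||A x - b||^2 subject to x >= 0 with accelerated projected gradient."""
          L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          y, t = x.copy(), 1.0
          for _ in range(n_iter):
              grad = A.T @ (A @ y - b)
              x_new = np.maximum(y - grad / L, 0.0) # gradient step + projection
              t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
              y = x_new + ((t - 1.0) / t_new) * (x_new - x)
              x, t = x_new, t_new
          return x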

  1. Spent Nuclear Fuel Characterization Through Neutron Flux Deconvolution

    SciTech Connect

    Hartman, Michael R.; Lee, John C.

    2001-06-17

    A method to determine the composition of spent fuel through spectral deconvolution of the neutron flux emitted from the fuel is proposed. Recently developed GaAs({sup 10}B) semiconductor detector arrays are used. The results of Monte Carlo simulations of the detector responses, illustrating the feasibility of the spectral unfolding technique for spent fuel characterization, are presented.

  2. Blind deconvolution subject to sparse representation for fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Dai, Qionghai; Cai, Qiang; Guo, Peiyuan; Liu, Zaiwen

    2013-01-01

    Blind deconvolution is an effective fluorescence microscopy image processing technique for improving the quality of digital images degraded by photon-counting noise and out-of-focus blur. To solve the severely ill-posed deconvolution problem, in this paper we propose an alternating-minimization blind deconvolution method that uses sparse representation as a constraint to confine the solution space of the traditional Richardson-Lucy method, with a Gaussian model as the initial estimate of the point spread function (PSF). We assume that Poisson noise dominates during imaging. A maximum-likelihood estimation of the fluorescence image and the corresponding PSF is developed. By solving the Euler-Lagrange equation of the total cost function, which includes a data term derived from the assumed Poisson noise model and a regularization term corresponding to the sparse representation constraint, and applying the gradient descent method, we obtain the iterative update equations for the fluorescence image and the PSF, respectively. Compared with related blind deconvolution methods, our model shows superior performance in terms of both objective criteria and subjective visual quality when processing simulated and real degraded fluorescence microscopy images.
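
    For context, the classical Richardson-Lucy update for the Poisson maximum-likelihood part of such a model (with the PSF held fixed) looks as follows; the sparse-representation regularization and the alternating PSF update described in the abstract are not included, and the implementation details are assumptions rather than the authors' code.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
          """Plain Richardson-Lucy iterations for Poisson noise with a known PSF."""
          estimate = np.full(observed.shape, float(observed.mean()))
          psf_flip = psf[::-1, ::-1]                  # adjoint of the blur operator
          for _ in range(n_iter):
              blurred = fftconvolve(estimate, psf, mode="same")
              ratio = observed / (blurred + eps)      # data term of the Poisson likelihood
              estimate *= fftconvolve(ratio, psf_flip, mode="same")
          return estimate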

  3. Real-time image deconvolution on the GPU

    NASA Astrophysics Data System (ADS)

    Klosowski, James T.; Krishnan, Shankar

    2011-01-01

    Two-dimensional image deconvolution is an important and well-studied problem with applications to image deblurring and restoration. Most of the best deconvolution algorithms use natural image statistics that act as priors to regularize the problem. Recently, Krishnan and Fergus provided a fast deconvolution algorithm that yields results comparable to the current state of the art. They use a hyper-Laplacian image prior to regularize the problem. The resulting optimization problem is solved using alternating minimization in conjunction with a half-quadratic penalty function. In this paper, we provide an efficient CUDA implementation of their algorithm on the GPU. Our implementation leverages many well-known CUDA optimization techniques, as well as several others that have a significant impact on this particular algorithm. We discuss each of these, as well as make a few observations regarding the CUFFT library. Our experiments were run on an Nvidia GeForce GTX 260. For a single channel image of size 710 x 470, we obtain over 40 fps, while on a larger image of size 1900 x 1266, we get almost 6 fps (without counting disk I/O). In addition to linear performance, we believe ours is the first implementation to perform deconvolutions at video rates. Our running times also demonstrate that our GPU implementation is over 27 times faster than the original CPU implementation.
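
    The alternating half-quadratic scheme being accelerated here can be sketched (on the CPU, single channel) as below. For brevity the per-pixel w-step uses an L1 soft threshold rather than the hyper-Laplacian (alpha = 2/3) solution used by Krishnan and Fergus, the parameter values are illustrative, and none of the CUDA/CUFFT specifics of the paper are reproduced.

      import numpy as np

      def psf2otf(psf, shape):
          """Zero-pad a kernel and centre it at the origin before taking its FFT."""
          pad = np.zeros(shape)
          pad[:psf.shape[0], :psf.shape[1]] = psf
          for ax, s in enumerate(psf.shape):
              pad = np.roll(pad, -(s // 2), axis=ax)
          return np.fft.fft2(pad)

      def fast_deconv(blurred, psf, lam=2e3, beta=1.0, n_outer=8):
          K = psf2otf(psf, blurred.shape)
          Dx = psf2otf(np.array([[1.0, -1.0]]), blurred.shape)
          Dy = psf2otf(np.array([[1.0], [-1.0]]), blurred.shape)
          B = np.fft.fft2(blurred)
          x = blurred.copy()
          for _ in range(n_outer):
              gx = np.real(np.fft.ifft2(Dx * np.fft.fft2(x)))
              gy = np.real(np.fft.ifft2(Dy * np.fft.fft2(x)))
              wx = np.sign(gx) * np.maximum(np.abs(gx) - 1.0 / beta, 0.0)   # w-step
              wy = np.sign(gy) * np.maximum(np.abs(gy) - 1.0 / beta, 0.0)
              num = lam * np.conj(K) * B + beta * (np.conj(Dx) * np.fft.fft2(wx)
                                                   + np.conj(Dy) * np.fft.fft2(wy))
              den = lam * np.abs(K) ** 2 + beta * (np.abs(Dx) ** 2 + np.abs(Dy) ** 2)
              x = np.real(np.fft.ifft2(num / den))                          # FFT x-step
              beta *= 2.0                                                   # continuation
          return x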

  4. Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools

    SciTech Connect

    Barrand, Guy

    2002-08-20

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualization), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  5. Petrographic Image Analysis as a tool to improve reservoir data

    SciTech Connect

    Ballentine, F.M.; Shelby, D.L.; Philipson, C.A.

    1990-09-01

    Petrographic Image Analysis (PIA) can be used to acquire improved core analysis data in percussion sidewalls because the analysis focuses on the unaltered portions of the cores, thereby eliminating the bias that may be created by shattered and disturbed areas. It is also a valuable tool for reservoir characterization of all rock types. Petrophysical data acquired from laboratory tests on conventional and percussion sidewall cores, wireline logs, and PIA techniques were compared for a cored interval in the Tuscaloosa Formation. Comparison of log and conventional core porosities yielded good agreement. Sidewall core laboratory porosities were 2 to 10 porosity percent too high, owing to the shattering effect. However, PIA performed on percussion core samples compared well with both the conventional core laboratory and log porosities. Permeabilities obtained from conventional core laboratory measurements and PIA analyses compared well. Sidewall permeabilities (derived from particle size analysis) showed similar trends, but tended to be optimistic. These improved Tuscaloosa porosity and permeability values, as well as other parameters derived from PIA, can be useful in log interpretation and reservoir evaluation. PIA formation factor and porosity were used to define the cementation exponent m. Capillary pressure curves generated by PIA were utilized to correlate irreducible water saturation to PIA porosity and permeability values. PIA porosity and permeability were also used to determine critical water saturations. PIA is valuable as an aid to reservoir characterization. The Smackover Formation has been separated into several hydraulic units based on pore type and fluid flow properties. PIA was used to help delineate the differences between the pore systems in these reservoir units. Binary images and shape factor distributions were utilized to characterize the distributions and shapes of the pores within each unit.

  6. Generalized Analysis Tools for Multi-Spacecraft Missions

    NASA Astrophysics Data System (ADS)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On the one hand, the least-squares approach has the advantage of applying to any number of spacecraft [1], but it is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful, as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but it appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be applicable. The weights given to the spacecraft allow one to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new framework, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI SR-001, 1998. [2] Chanteur, G.: Spatial Interpolation for Four Spacecraft: Theory, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 371-393, ISSI SR-001, 1998. [3] Chanteur, G.: Accuracy of field gradient estimations by Cluster: Explanation of its dependency upon elongation and planarity of the tetrahedron, pp. 265-268, ESA SP-449, 2000. [4] Vogt, J., Paschmann, G., and Chanteur, G.: Reciprocal Vectors, pp. 33-46, ISSI SR-008, 2008.
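
    One common construction consistent with the weighted barycentric idea sketched above is shown below: build the weighted position tensor about the barycentre, invert it, and use the resulting vectors as a linear gradient estimator. The normalization and notation are assumptions for illustration (they may differ from the authors' definition), but the estimator is exact for a linearly varying field whenever at least four positions with positive weights are non-coplanar.

      import numpy as np

      def generalized_reciprocal_vectors(positions, weights=None):
          positions = np.asarray(positions, dtype=float)       # shape (N, 3)
          n = len(positions)
          w = np.ones(n) if weights is None else np.asarray(weights, dtype=float)
          w = w / w.sum()
          barycenter = (w[:, None] * positions).sum(axis=0)
          d = positions - barycenter
          R = (w[:, None, None] * d[:, :, None] * d[:, None, :]).sum(axis=0)
          R_inv = np.linalg.inv(R)          # needs >= 4 non-coplanar weighted points
          return (w[:, None] * d) @ R_inv   # q_a = w_a R^{-1} (r_a - r_b), R symmetric

      def estimate_gradient(positions, values, weights=None):
          q = generalized_reciprocal_vectors(positions, weights)
          return q.T @ np.asarray(values, dtype=float)

      # Example: linear field g(r) = a . r sampled by five spacecraft.
      rng = np.random.default_rng(1)
      pos = rng.standard_normal((5, 3))
      a = np.array([2.0, -1.0, 0.5])
      print(estimate_gradient(pos, pos @ a))   # recovers a (approximately [2, -1, 0.5])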

  7. Breast image feature learning with adaptive deconvolutional networks

    NASA Astrophysics Data System (ADS)

    Jamieson, Andrew R.; Drukker, Karen; Giger, Maryellen L.

    2012-03-01

    Feature extraction is a critical component of medical image analysis. Many computer-aided diagnosis approaches employ hand-designed, heuristic lesion extracted features. An alternative approach is to learn features directly from images. In this preliminary study, we explored the use of Adaptive Deconvolutional Networks (ADN) for learning high-level features in diagnostic breast mass lesion images with potential application to computer-aided diagnosis (CADx) and content-based image retrieval (CBIR). ADNs (Zeiler, et. al., 2011), are recently-proposed unsupervised, generative hierarchical models that decompose images via convolution sparse coding and max pooling. We trained the ADNs to learn multiple layers of representation for two breast image data sets on two different modalities (739 full field digital mammography (FFDM) and 2393 ultrasound images). Feature map calculations were accelerated by use of GPUs. Following Zeiler et. al., we applied the Spatial Pyramid Matching (SPM) kernel (Lazebnik, et. al., 2006) on the inferred feature maps and combined this with a linear support vector machine (SVM) classifier for the task of binary classification between cancer and non-cancer breast mass lesions. Non-linear, local structure preserving dimension reduction, Elastic Embedding (Carreira-Perpiñán, 2010), was then used to visualize the SPM kernel output in 2D and qualitatively inspect image relationships learned. Performance was found to be competitive with current CADx schemes that use human-designed features, e.g., achieving a 0.632+ bootstrap AUC (by case) of 0.83 [0.78, 0.89] for an ultrasound image set (1125 cases).

  8. Micropollutants in urban watersheds : substance flow analysis as management tool

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted by traffic, like copper or PAHs, reach surface water during rain events, often without any treatment. Pharmaceuticals resulting from human medical treatments get to surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to get flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we propose to test the application of SFA for a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and abundant data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources (total of 73%) of copper in receiving surface water are roofs and contact lines of trolleybuses. Thus technical solutions have to be found to manage this specific source of contamination. Application of the SFA approach to four pharmaceuticals reveals that CSOs represent an important source of contamination: between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads of Lausanne city to the Lake are due to CSOs. These results will help in defining the best management strategy to limit Lake Geneva contamination. SFA is thus a promising tool for integrated urban water management.

  9. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that operate dynamically or at steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.

  10. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  11. jSIPRO - analysis tool for magnetic resonance spectroscopic imaging.

    PubMed

    Jiru, Filip; Skoch, Antonin; Wagnerova, Dita; Dezortova, Monika; Hajek, Milan

    2013-10-01

    Magnetic resonance spectroscopic imaging (MRSI) involves a huge number of spectra to be processed and analyzed. Several tools enabling MRSI data processing have been developed and widely used. However, the processing programs primarily focus on sophisticated spectra processing and offer limited support for the analysis of the calculated spectroscopic maps. In this paper the jSIPRO (java Spectroscopic Imaging PROcessing) program is presented, which is a java-based graphical interface enabling post-processing, viewing, analysis and result reporting of MRSI data. Interactive graphical processing as well as protocol controlled batch processing are available in jSIPRO. jSIPRO does not contain a built-in fitting program. Instead, it makes use of fitting programs from third parties and manages the data flows. Currently, automatic spectra processing using the LCModel, TARQUIN and jMRUI programs is supported. Concentration and error values, fitted spectra, metabolite images and various parametric maps can be viewed for each calculated dataset. Metabolite images can be exported in the DICOM format either for archiving purposes or for use in neurosurgery navigation systems. PMID:23870172

  12. Spectral Analysis Tool 6.2 for Windows

    NASA Technical Reports Server (NTRS)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.

  13. Wavelet analysis as a nonstationary plasma fluctuation diagnostic tool

    SciTech Connect

    Santoso, S.; Powers, E.J.; Ouroua, A.; Heard, J.W.; Bengtson, R.D.

    1996-12-31

    Analysis of nonstationary plasma fluctuation data has been a long-time challenge for the plasma diagnostic community. For this reason, in this paper the authors present and apply wavelet transforms as a new diagnostic tool to analyze nonstationary plasma fluctuation data. Unlike the Fourier transform, which represents a given signal globally without temporal resolution, the wavelet transform provides a local representation of the given signal in the time-scale domain. The fundamental concepts and multiresolution properties of wavelet transforms, along with a brief comparison with the short-time Fourier transform, are presented in this paper. The selection of a prototype wavelet or a mother wavelet is also discussed. Digital implementations of wavelet spectral analysis, including time-scale power spectra and scale power spectra, are described. The efficacy of the wavelet approach is demonstrated by analyzing transient broadband electrostatic potential fluctuations inside the inversion radius of sawtoothing TEXT-U plasmas during electron cyclotron resonance heating. The potential signals are collected using a 2 MeV heavy ion beam probe.
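
    A numpy-only sketch of the time-scale power spectrum idea follows: a Morlet continuous wavelet transform evaluated via the FFT at a set of scales, with the scale-resolved power then averaged over time to give a scale power spectrum. The wavelet parameters, scales, and the synthetic transient are illustrative choices, not those used for the TEXT-U data.

      import numpy as np

      def morlet_cwt_power(signal, scales, dt=1.0, omega0=6.0):
          n = len(signal)
          omega = 2.0 * np.pi * np.fft.fftfreq(n, dt)      # angular frequencies
          sig_hat = np.fft.fft(signal)
          power = np.empty((len(scales), n))
          for i, s in enumerate(scales):
              # Fourier transform of the analytic Morlet wavelet at scale s
              psi_hat = (np.pi ** -0.25) * np.sqrt(2.0 * np.pi * s / dt) \
                        * np.exp(-0.5 * (s * omega - omega0) ** 2) * (omega > 0)
              w = np.fft.ifft(sig_hat * np.conj(psi_hat))
              power[i] = np.abs(w) ** 2                    # time-scale power
          return power, power.mean(axis=1)                 # scale power spectrum

      # Example: a short 60 Hz burst buried in noise.
      dt = 1e-3
      t = np.arange(2048) * dt
      x = np.sin(2 * np.pi * 60 * t) * np.exp(-((t - 1.0) / 0.05) ** 2) \
          + 0.3 * np.random.default_rng(0).standard_normal(t.size)
      scales = np.geomspace(2e-3, 0.1, 40)
      tf_power, scale_power = morlet_cwt_power(x, scales, dt=dt)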

  14. Elementary Mode Analysis: A Useful Metabolic Pathway Analysis Tool for Characterizing Cellular Metabolism

    PubMed Central

    Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich

    2010-01-01

    Elementary Mode Analysis is a useful Metabolic Pathway Analysis tool to identify the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose the intricate metabolic network comprised of highly interconnected reactions into uniquely organized pathways. These pathways consisting of a minimal set of enzymes that can support steady state operation of cellular metabolism represent independent cellular physiological states. Such pathway definition provides a rigorous basis to systematically characterize cellular phenotypes, metabolic network regulation, robustness, and fragility that facilitate understanding of cell physiology and implementation of metabolic engineering strategies. This mini-review aims to overview the development and application of elementary mode analysis as a metabolic pathway analysis tool in studying cell physiology and as a basis of metabolic engineering. PMID:19015845

  15. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year, the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey on academic achievement was carried out with these students with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and the sitting (of the two held each year) at which that examination mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These were students who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose, every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by their average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used to calculate the probability of success or failure for incoming students in the following academic years. Keywords: Academic achievement, spatial analyst, GIS, Bologna.

  16. Suspected-target pesticide screening using gas chromatography-quadrupole time-of-flight mass spectrometry with high resolution deconvolution and retention index/mass spectrum library.

    PubMed

    Zhang, Fang; Wang, Haoyang; Zhang, Li; Zhang, Jing; Fan, Ruojing; Yu, Chongtian; Wang, Wenwen; Guo, Yinlong

    2014-10-01

    A strategy for suspected-target screening of pesticide residues in complicated matrices was exploited using gas chromatography in combination with hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS). The screening workflow followed three key steps: initial detection, preliminary identification, and final confirmation. The initial detection of components in a matrix was done by high-resolution mass spectrum deconvolution; the preliminary identification of suspected pesticides was based on a special retention index/mass spectrum (RI/MS) library that contained both the first-stage mass spectra (MS(1) spectra) and retention indices; and the final confirmation was accomplished by accurate mass measurements of representative ions with their response ratios from the MS(1) spectra or representative product ions from the second-stage mass spectra (MS(2) spectra). To evaluate the applicability of the workflow in real samples, three matrices of apple, spinach, and scallion, each spiked with 165 test pesticides in a set of concentrations, were selected as the models. The results showed that the use of high-resolution TOF enabled effective extraction of spectra from noisy chromatograms based on a narrow mass window (5 mDa), with suspected-target compounds identified by similarity matching of the deconvoluted full mass spectra and filtering of linear RIs. On average, over 74% of pesticides at 50 ng/mL could be identified using deconvolution and the RI/MS library. Over 80% of pesticides at 5 ng/mL or lower concentrations could be confirmed in each matrix using at least two representative ions with their response ratios from the MS(1) spectra. In addition, the application of product ion spectra was capable of confirming suspected pesticides with specificity for some pesticides in complicated matrices. In conclusion, GC-QTOF MS combined with the RI/MS library seems to be one of the most efficient tools for the analysis of suspected-target pesticide residues in complicated matrices. PMID:25059143
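
    The preliminary-identification step (retention index filter followed by spectral similarity against the RI/MS library) can be illustrated roughly as below. The tolerance, score threshold, and library entry are invented for illustration, and the scoring is a simple cosine match rather than the exact similarity measure used by the instrument software.

      import numpy as np

      def cosine_similarity(spec_a, spec_b):
          """Spectra are dicts of {nominal m/z: intensity}."""
          mz = sorted(set(spec_a) | set(spec_b))
          a = np.array([spec_a.get(m, 0.0) for m in mz])
          b = np.array([spec_b.get(m, 0.0) for m in mz])
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      def match_component(ri, spectrum, library, ri_tol=10.0, min_score=0.8):
          hits = []
          for entry in library:
              if abs(entry["ri"] - ri) <= ri_tol:           # retention index filter
                  score = cosine_similarity(spectrum, entry["spectrum"])
                  if score >= min_score:
                      hits.append((entry["name"], score))
          return sorted(hits, key=lambda h: -h[1])

      library = [{"name": "pesticide_A", "ri": 1820.0,
                  "spectrum": {181: 100.0, 109: 45.0, 279: 20.0}}]
      print(match_component(1823.0, {181: 95.0, 109: 50.0, 279: 18.0}, library))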

  17. Genome-tools: a flexible package for genome sequence analysis.

    PubMed

    Lee, William; Chen, Swaine L

    2002-12-01

    Genome-tools is a Perl module, a set of programs, and a user interface that facilitates access to genome sequence information. The package is flexible, extensible, and designed to be accessible and useful to both nonprogrammers and programmers. Any relatively well-annotated genome available with standard GenBank genome files may be used with genome-tools. A simple Web-based front end permits searching any available genome with an intuitive interface. Flexible design choices also make it simple to handle revised versions of genome annotation files as they change. In addition, programmers can develop cross-genomic tools and analyses with minimal additional overhead by combining genome-tools modules with newly written modules. Genome-tools runs on any computer platform for which Perl is available, including Unix, Microsoft Windows, and Mac OS. By simplifying the access to large amounts of genomic data, genome-tools may be especially useful for molecular biologists looking at newly sequenced genomes, for which few informatics tools are available. The genome-tools Web interface is accessible at http://genome-tools.sourceforge.net, and the source code is available at http://sourceforge.net/projects/genome-tools. PMID:12503321

  18. Immunoglobulin analysis tool: a novel tool for the analysis of human and mouse heavy and light chain transcripts.

    PubMed

    Rogosch, Tobias; Kerzel, Sebastian; Hoi, Kam Hon; Zhang, Zhixin; Maier, Rolf F; Ippolito, Gregory C; Zemlin, Michael

    2012-01-01

    Sequence analysis of immunoglobulin (Ig) heavy and light chain transcripts can refine categorization of B cell subpopulations and can shed light on the selective forces that act during immune responses or immune dysregulation, such as autoimmunity, allergy, and B cell malignancy. High-throughput sequencing yields Ig transcript collections of unprecedented size. The authoritative web-based IMGT/HighV-QUEST program is capable of analyzing large collections of transcripts and provides annotated output files to describe many key properties of Ig transcripts. However, additional processing of these flat files is required to create figures, or to facilitate analysis of additional features and comparisons between sequence sets. We present an easy-to-use Microsoft(®) Excel(®) based software, named Immunoglobulin Analysis Tool (IgAT), for the summary, interrogation, and further processing of IMGT/HighV-QUEST output files. IgAT generates descriptive statistics and high-quality figures for collections of murine or human Ig heavy or light chain transcripts ranging from 1 to 150,000 sequences. In addition to traditionally studied properties of Ig transcripts - such as the usage of germline gene segments, or the length and composition of the CDR-3 region - IgAT also uses published algorithms to calculate the probability of antigen selection based on somatic mutational patterns, the average hydrophobicity of the antigen-binding sites, and predictable structural properties of the CDR-H3 loop according to Shirai's H3-rules. These refined analyses provide in-depth information about the selective forces acting upon Ig repertoires and allow the statistical and graphical comparison of two or more sequence sets. IgAT is easy to use on any computer running Excel(®) 2003 or higher. Thus, IgAT is a useful tool to gain insights into the selective forces and functional properties of small to extremely large collections of Ig transcripts, thereby assisting a researcher to mine a data set to its fullest. PMID:22754554

  19. Analysis of the influence of tool dynamics in diamond turning

    SciTech Connect

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  20. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob. PMID:18948494
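
    The underlying calculation is compact: the proportion of observations with a degraded response among those whose stressor exceeds each candidate threshold. The sketch below shows that computation on invented data; it is an illustration of conditional probability analysis in general, not of CProb's internal R scripts.

      import numpy as np

      def conditional_probability(stressor, response, response_criterion, thresholds):
          """Pr(response > criterion | stressor > threshold) for each threshold."""
          stressor = np.asarray(stressor, dtype=float)
          poor = np.asarray(response, dtype=float) > response_criterion
          probs = []
          for xc in thresholds:
              exceed = stressor > xc
              probs.append(poor[exceed].mean() if exceed.any() else np.nan)
          return np.array(probs)

      rng = np.random.default_rng(0)
      stress = rng.uniform(0, 10, 500)
      resp = stress + rng.normal(0, 2, 500)      # response loosely tracks the stressor
      thresholds = np.linspace(0, 9, 20)
      cp = conditional_probability(stress, resp, response_criterion=8.0, thresholds=thresholds)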

  1. Bioelectrical impedance analysis: A new tool for assessing fish condition

    USGS Publications Warehouse

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.

  2. ADVISOR: a systems analysis tool for advanced vehicle modeling

    NASA Astrophysics Data System (ADS)

    Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

    This paper provides an overview of the Advanced Vehicle Simulator (ADVISOR), the US Department of Energy's (DOE's) systems analysis tool for advanced vehicle modeling, written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance, and the emissions of vehicles that use alternative technologies including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power sources) configurations. It excels at quantifying the relative change that can be expected due to the implementation of technology compared to a baseline scenario. ADVISOR's capabilities and limitations are presented and the power source models that are included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia in August 2001.

  3. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    NASA Technical Reports Server (NTRS)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  4. SEM analysis as a diagnostic tool for photovoltaic cell degradation

    NASA Astrophysics Data System (ADS)

    Osayemwenre, Gilbert; Meyer, E. L.

    2013-04-01

    The importance of scanning electron microscopy (SEM) analysis as a diagnostic tool for analyzing the degradation of a polycrystalline photovoltaic cell has been studied. The main aim of this study is to characterize the surface morphology of hot spot (degraded) regions in photovoltaic solar cells. In recent years, production of hetero- and multi-junction solar cells has experienced tremendous growth compared to conventional silicon (Si) solar cells. Thin-film photovoltaic solar cells are generally more prone to exhibiting defects and associated degradation modes. To improve the lifetime of these cells and modules, it is imperative to fully understand the cause and effect of defects and degradation modes. The objective of this paper is to diagnose the observed degradation in polycrystalline silicon cells using scanning electron microscopy (SEM). In this study, poly-Si cells were characterized before and after reverse biasing; the reverse biasing was done to evaluate the cells' susceptibility to leakage currents and hotspot formation. After reverse biasing, some cells were found to exhibit hotspots, as confirmed by infrared thermography. The surface morphology of these hotspots re

  5. A measuring tool for tree-rings analysis

    NASA Astrophysics Data System (ADS)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system, and software. In many respects this system is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: a gain in productivity, the possibility of archiving the measurement results at any stage of processing, and operator comfort. New software has been developed that allows the processing of samples of different types (cores, saw cuts), including those that are difficult to process because of a complex wood structure (inhomogeneous growth in different directions; missing, light, and false rings; etc.). This software can analyze pictures made with optical scanners and analog or digital cameras. The software was written in C++ and is compatible with modern Windows operating systems. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring widths as a function of year is displayed on screen during the analysis, and it can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The system is universal in application, which will allow its use for solving different problems in biology and ecology. With the help of this system, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed based on samples collected on the Kola Peninsula (northwestern Russia).

  6. The analysis of crow population dynamics as a surveillance tool.

    PubMed

    Ludwig, A; Bigras-Poulin, M; Michel, P

    2009-12-01

    West Nile virus (WNV) infection, a zoonotic disease for which birds act as a reservoir, first appeared in North America in August 1999. It was first reported in Quebec in 2002. The Quebec surveillance system for WNV has several components, including the surveillance of mortality in corvid populations, which includes the American crow (Corvus brachyrhynchos). The main objectives of this study are to better understand the population dynamics of this species in Quebec and to evaluate the impact of WNV on these dynamics. We obtained observation data for living crows in this province for the period of 1990-2005 and then conducted a spectral analysis of these data. To study changes in crow population dynamics, the analysis was carried out before and after the appearance of WNV and space was divided in two different areas (urban and non-urban). Our results show the importance of cycles with periods of less than 1 year in non-urban areas and cycles with periods of greater than 1 year in urban areas in the normal population dynamics of the species. We obtained expected fluctuations in bird densities using an algorithm derived from spectral decomposition. When we compared these predictions with data observed after 2002, we found marked perturbations in population dynamics beginning in 2003 and lasting up to 2005. In the discussion, we present various hypotheses based on the behaviour of the American crow to explain the normal population dynamics observed in this species and the effect of type of area (urban versus non-urban). We also discuss how the predictive algorithm could be used as a disease surveillance tool and as a measure of the impact of a disease on wild fauna. PMID:19811623
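
    The "expected fluctuation" idea can be sketched as fitting the dominant periodic components of the pre-2003 series and extrapolating them forward for comparison with the post-WNV observations. The harmonic-regression stand-in below, the synthetic weekly counts, and the chosen periods are all illustrative assumptions, not the spectral-decomposition algorithm used in the study.

      import numpy as np

      def fit_harmonics(t, y, periods):
          """Least-squares fit of a mean plus sine/cosine pairs at the given periods."""
          cols = [np.ones_like(t)]
          for p in periods:
              cols += [np.sin(2 * np.pi * t / p), np.cos(2 * np.pi * t / p)]
          coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)

          def predict(t_new):
              cols_new = [np.ones_like(t_new)]
              for p in periods:
                  cols_new += [np.sin(2 * np.pi * t_new / p), np.cos(2 * np.pi * t_new / p)]
              return np.column_stack(cols_new) @ coef

          return predict

      # Synthetic weekly counts for 1990-2002 with annual and 3-year cycles plus noise.
      t = np.arange(0, 13, 1 / 52)
      rng = np.random.default_rng(2)
      y = 100 + 30 * np.sin(2 * np.pi * t) + 10 * np.sin(2 * np.pi * t / 3) \
          + 5 * rng.standard_normal(t.size)
      predict = fit_harmonics(t, y, periods=[1.0, 3.0])
      expected_2003_2005 = predict(np.arange(13, 16, 1 / 52))   # years 13-16 = 2003-2005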

  7. New Access and Analysis Tools for Voyager LECP Data

    NASA Astrophysics Data System (ADS)

    Brown, L. E.; Hill, M. E.; Decker, R. B.; Cooper, J. F.; Krimigis, S. M.; Vandegriff, J. D.

    2008-12-01

    The Low Energy Charged Particle (LECP) instruments on the Voyager 1 and 2 spacecraft have been returning unique scientific measurements since launching in 1977, most notably observations from the historic tour of the giant planets. As these spacecraft continue on their exit trajectories from the Solar system they have become an interstellar mission and have begun to probe the boundary between the heliosphere and the interstellar cloud and continue to make exciting discoveries. As the mission changed from one focused on discrete encounters to an open ended search for heliospheric boundaries and transitory disturbances, the positions and timing of which are not known, the data processing needs have changed. Open data policies and the push to draw data under the umbrella of emerging Virtual Observatories have added a data sharing component that was not a part of the original mission plans. We present our work in utilizing new, reusable software analysis tools to access legacy data in a way that leverages pre-existing data analysis techniques. We took an existing Applied Physics Laboratory application, Mission Independent Data Layer (MIDL) -- developed originally under a NASA Applied Information Research Program (AISRP) and subsequently used with data from Geotail, Cassini, IMP-8, ACE, Messenger, and New Horizons -- and applied it to Voyager data. We use the MIDL codebase to automatically generate standard data products such as daily summary plots and associated tabulated data that increase our ability to monitor the heliospheric environment on a regular basis. These data products will be publicly available and updated automatically and can be analyzed by the community using the ultra portable MIDL software launched from the data distribution website. The currently available LECP data will also be described with SPASE metadata and incorporated into the emerging Virtual Energetic Particle Observatory (VEPO).

  8. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
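
    The linear-algebra core of a principal-angle approach (not the full PAEA statistic or its enrichment p-values) is the set of principal angles between two subspaces, obtained from the singular values of the product of their orthonormal bases:

      import numpy as np

      def principal_angles(A, B):
          """Principal angles between the column spaces of A and B (rows = genes)."""
          Q1, _ = np.linalg.qr(A)
          Q2, _ = np.linalg.qr(B)
          sigma = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
          return np.arccos(np.clip(sigma, -1.0, 1.0))   # radians, ascending

      # Example: angles between two random 3-dimensional subspaces of R^1000.
      rng = np.random.default_rng(0)
      angles = principal_angles(rng.standard_normal((1000, 3)),
                                rng.standard_normal((1000, 3)))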

  9. Deconvolution-Based CT and MR Brain Perfusion Measurement: Theoretical Model Revisited and Practical Implementation Details

    PubMed Central

    Fieselmann, Andreas; Kowarschik, Markus; Ganguly, Arundhuti; Hornegger, Joachim; Fahrig, Rebecca

    2011-01-01

    Deconvolution-based analysis of CT and MR brain perfusion data is widely used in clinical practice and it is still a topic of ongoing research activities. In this paper, we present a comprehensive derivation and explanation of the underlying physiological model for intravascular tracer systems. We also discuss practical details that are needed to properly implement algorithms for perfusion analysis. Our description of the practical computer implementation is focused on the most frequently employed algebraic deconvolution methods based on the singular value decomposition. In particular, we further discuss the need for regularization in order to obtain physiologically reasonable results. We include an overview of relevant preprocessing steps and provide numerous references to the literature. We cover both CT and MR brain perfusion imaging in this paper because they share many common aspects. The combination of both the theoretical as well as the practical aspects of perfusion analysis explicitly emphasizes the simplifications to the underlying physiological model that are necessary in order to apply it to measured data acquired with current CT and MR scanners. PMID:21904538
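
    The frequently used truncated-SVD variant of this deconvolution can be sketched compactly: build the convolution matrix from the arterial input function (AIF), zero out small singular values as the regularization step, and recover the scaled residue function from the tissue curve. The threshold, the synthetic curves, and the simple (non-block-circulant) matrix are illustrative simplifications of the approaches reviewed in the paper.

      import numpy as np

      def svd_deconvolve(aif, tissue, dt, sv_threshold=0.2):
          n = len(aif)
          A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                             for i in range(n)])              # lower-triangular convolution
          U, s, Vt = np.linalg.svd(A)
          s_inv = np.where(s > sv_threshold * s.max(), 1.0 / s, 0.0)   # truncation
          residue_scaled = Vt.T @ (s_inv * (U.T @ tissue))    # = CBF * R(t)
          return residue_scaled, residue_scaled.max()         # max is a common CBF estimate

      # Synthetic example: gamma-variate AIF, exponential residue function.
      dt = 1.0
      t = np.arange(60.0)
      aif = (t / 5.0) ** 3 * np.exp(-t / 5.0)
      residue_true = np.exp(-t / 8.0)
      tissue = dt * np.convolve(aif, 0.02 * residue_true)[:60]
      residue_est, cbf = svd_deconvolve(aif, tissue, dt)      # cbf is approximately 0.02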

  10. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    NASA Astrophysics Data System (ADS)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experience. Different levels of risk perception and preparedness can influence directly people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works to spatially portray this feature are really scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness when displayed in a spatial format can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups of clusters of households within a community. It can also be useful for the emergency personal in order to optimally direct the actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) it's been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular, a large event in 1987 which affected a large portion of the valley and left 58 dead. The actual emergency plan for the study area is composed by a real time, highly detailed, decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personal in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency management, to know in advance the different levels of risk perception and preparedness existing among several sectors of the population. Knowing where the most vulnerable population is located may optimize the use of resources, better direct the initial efforts and organize the evacuation and attention procedures. As part of the CBEWS, a comprehensive survey was applied in the study area to measure, among others features, the levels of risk perception, preparation and information received about natural hazards. After a statistical and direct analysis on a complete social dataset recorded, a spatial information distribution is actually in progress. Based on boundaries features (municipalities and sub-districts) of Italian Institute of Statistics (ISTAT), a local scale background has been granted (a private address level is not accessible for privacy rules so the local districts-ID inside municipality has been the detail level performed) and a spatial location of the surveyed population has been completed. The geometric component has been defined and actually it is possible to create a local distribution of social parameters derived from perception questionnaries results. A lot of raw information and social-statistical analysis offer different mirror and "visual concept" of risk perception. For this reason a concrete complete GeoDB is under working for the complete organization of the dataset. 
From a technical point of view, the environment for data sharing is based on a fully open-source web-service environment, offering a hand-built and user-friendly interface to this kind of information. The final aim is to offer different views of the dataset, using the same scale prototype and hierarchical data structure, in order to provide and compare the spatial distribution of risk perception at the most detailed level.

  11. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    NASA Astrophysics Data System (ADS)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  12. Mission operations data analysis tools for Mars Observer guidance and control

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.

    1994-01-01

    Mission operations for the Mars Observer (MO) Project at the Jet Propulsion Laboratory were supported by a variety of ground data processing software and analysis tools. Some of these tools were generic to multimission spacecraft mission operations, some were specific to the MO spacecraft, and others were custom tailored to the operation and control of the Attitude and Articulation Control Subsystem (AACS). The focus of this paper is on the data analysis tools for the AACS. Four different categories of analysis tools are presented; with details offered for specific tools. Valuable experience was gained from the use of these tools and through their development. These tools formed the backbone and enhanced the efficiency of the AACS Unit in the Mission Operations Spacecraft Team. These same tools, and extensions thereof, have been adopted by the Galileo mission operations, and are being designed into Cassini and other future spacecraft mission operations.

  13. De-convoluting mixed crude oil in Prudhoe Bay Field, North Slope, Alaska

    USGS Publications Warehouse

    Peters, K.E.; Scott, Ramos L.; Zumberge, J.E.; Valin, Z.C.; Bird, K.J.

    2008-01-01

    Seventy-four crude oil samples from the Barrow arch on the North Slope of Alaska were studied to assess the relative volumetric contributions from different source rocks to the giant Prudhoe Bay Field. We applied alternating least squares to concentration data (ALS-C) for 46 biomarkers in the range C19-C35 to de-convolute mixtures of oil generated from carbonate-rich Triassic Shublik Formation and clay-rich Jurassic Kingak Shale and Cretaceous Hue Shale-gamma ray zone (Hue-GRZ) source rocks. ALS-C results for 23 oil samples from the prolific Ivishak Formation reservoir of the Prudhoe Bay Field indicate approximately equal contributions from Shublik Formation and Hue-GRZ source rocks (37% each), less from the Kingak Shale (26%), and little or no contribution from other source rocks. These results differ from published interpretations that most oil in the Prudhoe Bay Field originated from the Shublik Formation source rock. With few exceptions, the relative contribution of oil from the Shublik Formation decreases, while that from the Hue-GRZ increases in reservoirs along the Barrow arch from Point Barrow in the northwest to Point Thomson in the southeast (approximately 250 miles or 400 km). The Shublik contribution also decreases to a lesser degree between fault blocks within the Ivishak pool from west to east across the Prudhoe Bay Field. ALS-C provides a robust means to calculate the relative amounts of two or more oil types in a mixture. Furthermore, ALS-C does not require that pure end member oils be identified prior to analysis or that laboratory mixtures of these oils be prepared to evaluate mixing. ALS-C of biomarkers reliably de-convolutes mixtures because the concentrations of compounds in mixtures vary as linear functions of the amount of each oil type. ALS of biomarker ratios (ALS-R) cannot be used to de-convolute mixtures because compound ratios vary as nonlinear functions of the amount of each oil type.
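A minimal sketch of what an ALS-C-style calculation can look like, assuming only that each sample's biomarker concentrations are a non-negative linear mixture of a few end-member source signatures; the variable names and the synthetic demo data are hypothetical, and this is not the authors' code:

```python
import numpy as np
from scipy.optimize import nnls

def als_concentrations(X, n_sources, n_iter=50, seed=0):
    """X: (n_samples, n_biomarkers) concentrations. Returns C (mixing fractions)
    and S (end-member signatures) with X ~= C @ S, both non-negative."""
    rng = np.random.default_rng(seed)
    n_samples, n_biomarkers = X.shape
    S = rng.random((n_sources, n_biomarkers))            # initial end-member guess
    C = np.zeros((n_samples, n_sources))
    for _ in range(n_iter):
        for i in range(n_samples):                       # update mixing coefficients
            C[i], _ = nnls(S.T, X[i])
        for j in range(n_biomarkers):                    # update end-member signatures
            S[:, j], _ = nnls(C, X[:, j])
    return C / C.sum(axis=1, keepdims=True), S           # rows of C sum to 1

# tiny synthetic demo: 10 oils mixed from 3 sources, 46 biomarkers
rng = np.random.default_rng(1)
S_true = rng.random((3, 46))
C_true = rng.dirichlet(np.ones(3), size=10)
C_est, S_est = als_concentrations(C_true @ S_true, n_sources=3)
print(np.round(C_est[:3], 2))   # estimated relative contributions for the first samples
```

Note that the usual ALS ambiguities (component ordering and scaling) are left unresolved in this sketch.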

  14. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    NASA Astrophysics Data System (ADS)

    González, Adriana; Delouille, Véronique; Jacques, Laurent

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrarily to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  15. An Analysis of Teacher Selection Tools in Pennsylvania

    ERIC Educational Resources Information Center

    Vitale, Tracy L.

    2009-01-01

    The purpose of this study was to examine teacher screening and selection tools currently being utilized by public school districts in Pennsylvania and to compare these tools to the research on qualities of effective teachers. The researcher developed four research questions that guided her study. The Pennsylvania Association of School Personnel…

  16. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, D.; Kueppers, U.; Castro, J. M.

    2009-12-01

    The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited as they were produced by the simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas have been cleaned and impregnated with a two-component glue (EPOTEK 301). For approx. 10 * 10 cm, a volume of mixed glue of 20 ml was required. This low-viscosity, transparent glue allowed for an easy application on the target area by means of a syringe and permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method makes it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D-plane. In a first step, the sample surface has been scanned and analysed by means of image analysis software (Image J). After that, selected areas were investigated through thin section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  17. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.

    2010-05-01

    The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited as they were produced by the simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above mentioned deposit characteristics are hard to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas have been cleaned and impregnated with two-component glue (EPOTEK 301). For approx. 10 * 10 cm, a volume of mixed glue of 20 ml was required. Using a syringe, this low-viscosity, transparent glue could be easily applied on the target area. We found that the glue permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method makes it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D-plane. In a first step, the sample surface has been scanned and analysed by means of image analysis software (Image J). After that, selected areas were investigated through thin section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  18. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  19. Image deconvolution under Poisson noise using SURE-LET approach

    NASA Astrophysics Data System (ADS)

    Xue, Feng; Liu, Jiaqi; Meng, Gang; Yan, Jing; Zhao, Min

    2015-10-01

    We propose an image deconvolution algorithm for data contaminated by Poisson noise. The SURE-LET method, which minimizes Stein's unbiased risk estimate (SURE), was originally proposed to deal with Gaussian noise corruption. Our key contribution is to demonstrate that the SURE-LET approach is also applicable to Poisson-noisy images and to propose an efficient algorithm for this case. The formulation of SURE requires knowledge of the Gaussian noise variance. We experimentally found a simple and direct link between the noise variance estimated by the median absolute difference (MAD) method and the optimal one that leads to the best deconvolution performance in terms of mean squared error (MSE). Extensive experiments show that this optimal noise variance works satisfactorily for a wide range of natural images.
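The MAD-based noise estimate that the abstract refers to can be illustrated with the standard median-absolute-deviation estimator applied to finest-scale (Haar-like) detail coefficients; the exact mapping to the "optimal" variance reported in the paper is not reproduced here, and the 0.6745 scaling assumes Gaussian noise:

```python
import numpy as np

def mad_sigma(image):
    """MAD-based estimate of the noise standard deviation of a 2-D image."""
    img = np.asarray(image, dtype=float)
    # 2x2 Haar diagonal detail: dominated by noise for natural images
    d = 0.5 * (img[1::2, 1::2] - img[1::2, ::2] - img[::2, 1::2] + img[::2, ::2])
    return np.median(np.abs(d - np.median(d))) / 0.6745

# demo on a smooth image with a known Gaussian noise level
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
clean = 100 * np.sin(6 * x) * np.cos(4 * y)
noisy = clean + rng.normal(scale=5.0, size=clean.shape)
print(round(mad_sigma(noisy), 2))   # should come out close to 5.0
```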

  20. Spectral semi-blind deconvolution with hybrid regularization

    NASA Astrophysics Data System (ADS)

    Deng, L. Z.; Cao, L.; Zhu, H.

    2014-05-01

    A spectral semi-blind deconvolution with hybrid regularization (SBD-HR) is proposed to recover the spectrum and to estimate the parameter of the point spread function (PSF) adaptively. First, a weighted Tikhonov regularization term on the spectrum is introduced to preserve spectral details. The regularization term on the PSF is then modeled with the L1-norm instead of the L2-norm to enhance the stability of the kernel estimation. The numerical solution procedures for recovering the spectrum and for estimating the PSF parameter are derived. Simulation results for infrared spectrum deconvolution demonstrate that the proposed method recovers the spectrum from the degraded data more faithfully and estimates the PSF parameter accurately.

  1. Deconvolution of impulse response by the truncated singular value decomposition

    NASA Astrophysics Data System (ADS)

    Cha, Hao; Dai, Mingzhen

    1992-12-01

    The impulse response of the target plays a very important part in target identification. The impulse response of an electromagnetic system can be obtained by solving a deconvolution problem. In this paper, this ill-posed problem is solved directly in the time domain by the truncated singular value decomposition (TSVD) method. The small singular values that cause the problem to be ill-posed are discarded, so that the deconvolution problem becomes well-posed. The impulse response of a sphere is used for simulations. The results show that the TSVD algorithm improves the deconvolution result by about 10 dB (SNR) compared with the conventional conjugate gradient method.
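The TSVD idea itself is compact enough to sketch. The toy example below builds the convolution as a Toeplitz matrix, discards the small singular values, and inverts the truncated system; the synthetic signals stand in for the sphere response used in the paper:

```python
import numpy as np
from scipy.linalg import toeplitz

def tsvd_deconvolve(y, h, k):
    """Recover x from y = h * x (discrete convolution), keeping k singular values."""
    n = len(y)
    H = toeplitz(np.r_[h, np.zeros(n - len(h))], np.r_[h[0], np.zeros(n - 1)])
    U, s, Vt = np.linalg.svd(H)
    s_inv = np.zeros(n)
    s_inv[:k] = 1.0 / s[:k]          # discard the small singular values
    return Vt.T @ (s_inv * (U.T @ y))

# demo: blur a sparse signal, add noise, and deconvolve
rng = np.random.default_rng(0)
n = 200
x_true = np.zeros(n); x_true[40] = 1.0; x_true[90] = 0.6
h = np.exp(-np.arange(30) / 5.0)                 # toy system impulse response
y = np.convolve(h, x_true)[:n] + rng.normal(scale=0.01, size=n)
x_est = tsvd_deconvolve(y, h, k=60)
print(int(np.argmax(x_est)))                     # main impulse should reappear near index 40
```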

  2. Static Analysis Tools, a Practical Approach for Safety-Critical Software Verification

    NASA Astrophysics Data System (ADS)

    Lopes, R.; Vicente, D.; Silva, N.

    2009-05-01

    Static code analysis tools available today range from Lint-based syntax parsers to standards' compliance checkers to tools using more formal methods for verification. As safety-critical software complexity increases, these tools provide a means to ensure code quality, safety and dependability attributes. They also provide a means to introduce further automation in code analysis activities. The features presented by static code analysis tools are particularly interesting for V&V activities. In the scope of Independent Code Verification (IVE), two different static analysis tools have been used during Code Verification activities of the LISA Pathfinder onboard software in order to assess their contribution to the efficiency of the process and the quality of the results. Polyspace (The MathWorks) and FlexeLint (Gimpel) tools have been used as examples of high-budget and low-budget tools respectively. Several aspects have been addressed: effort has been categorised for closer analysis (e.g. setup and configuration time, execution time, analysis of the results, etc.), reported issues have been categorised according to their type, and the coverage of traditional IVE tasks by the static code analysis tools has been evaluated. Final observations have been made by analysing the aspects referred to above, namely cost effectiveness, quality of results, complementarity between the results of different static code analysis tools, and the relation between automated code analysis and manual code inspection.

  3. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

  4. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    SciTech Connect

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

  5. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    NASA Technical Reports Server (NTRS)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  6. Exothermic Behavior of Thermal Decomposition of Sodium Percarbonate: Kinetic Deconvolution of Successive Endothermic and Exothermic Processes.

    PubMed

    Nakano, Masayoshi; Wada, Takeshi; Koga, Nobuyoshi

    2015-09-24

    This study focused on the kinetic modeling of the thermal decomposition of sodium percarbonate (SPC, sodium carbonate-hydrogen peroxide (2/3)). The reaction is characterized by apparently different kinetic profiles of mass-loss and exothermic behavior as recorded by thermogravimetry and differential scanning calorimetry, respectively. This phenomenon results from a combination of different kinetic features of the reaction involving two overlapping mass-loss steps controlled by the physico-geometry of the reaction and successive endothermic and exothermic processes caused by the detachment and decomposition of H2O2(g). For kinetic modeling, the overall reaction was initially separated into endothermic and exothermic processes using kinetic deconvolution analysis. Then both the endothermic and exothermic processes were further separated into two reaction steps, accounting for the physico-geometrically controlled reaction that occurs in two steps. Kinetic modeling through kinetic deconvolution analysis clearly illustrates that the appearance of the net exothermic effect results from a slight delay of the exothermic process relative to the endothermic process in each physico-geometrically controlled reaction step. This demonstrates that the kinetic modeling attempted in this study is useful for interpreting the exothermic behavior of solid-state reactions such as the oxidative decomposition of solids and the thermal decomposition of oxidizing agents. PMID:26371394
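As a loose illustration of the deconvolution step (not the authors' physico-geometrical kinetic model), the sketch below separates a synthetic heat-flow curve into an endothermic and a slightly delayed exothermic contribution by least-squares fitting of two placeholder peak shapes:

```python
import numpy as np
from scipy.optimize import curve_fit

def peak(t, area, t0, width):
    """Normalised Gaussian peak used as a placeholder kinetic profile."""
    return area * np.exp(-0.5 * ((t - t0) / width) ** 2) / (width * np.sqrt(2 * np.pi))

def two_process_model(t, a_endo, t_endo, w_endo, a_exo, t_exo, w_exo):
    # endothermic contribution is negative, exothermic positive
    return -peak(t, a_endo, t_endo, w_endo) + peak(t, a_exo, t_exo, w_exo)

# synthetic DSC-like curve: the exotherm lags the endotherm slightly, so the net
# signal shows an apparent exothermic effect late in the reaction step
t = np.linspace(0, 60, 600)
signal = two_process_model(t, 40, 25, 4.0, 55, 28, 4.5)
signal += np.random.default_rng(0).normal(scale=0.05, size=t.size)

p0 = [30, 24, 3, 30, 27, 3]                     # rough initial guesses
popt, _ = curve_fit(two_process_model, t, signal, p0=p0)
print(f"endothermic peak at t = {popt[1]:.1f}, exothermic peak at t = {popt[4]:.1f}, "
      f"delay = {popt[4] - popt[1]:.1f}")
```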

  7. Online Analysis of Wind and Solar Part I: Ramping Tool

    SciTech Connect

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  8. Online Analysis of Wind and Solar Part II: Transmission Tool

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  9. Blind deconvolution estimation of fluorescence measurements through quadratic programming

    NASA Astrophysics Data System (ADS)

    Campos-Delgado, Daniel U.; Gutierrez-Navarro, Omar; Arce-Santana, Edgar R.; Skala, Melissa C.; Walsh, Alex J.; Jo, Javier A.

    2015-07-01

    Time-deconvolution of the instrument response from fluorescence lifetime imaging microscopy (FLIM) data is usually necessary for accurate fluorescence lifetime estimation. In many applications, however, the instrument response is not available. In such cases, a blind deconvolution approach is required. An iterative methodology is proposed to address the blind deconvolution problem starting from a dataset of FLIM measurements. In our formulation, a linear combination of Laguerre basis functions models the fluorescence impulse response of the sample at each spatial point. Our blind deconvolution estimation (BDE) algorithm is formulated as a quadratic approximation problem, where the decision variables are the samples of the instrument response and the scaling coefficients of the basis functions. The approximation cost function depends bilinearly on the decision variables. Hence, due to the nonlinear nature of the estimation process, an alternating least-squares scheme iteratively solves the approximation problem. Our proposal searches for the samples of the instrument response globally, and for the scaling coefficients of the basis functions locally at each spatial point. The iterative methodology relies on a least-squares solution for the instrument response and on quadratic programming for the scaling coefficients; it is first applied to just a subset of the measured fluorescence decays to obtain an initial estimate of the instrument response and speed up convergence. After convergence, the final stage computes the fluorescence impulse response at all spatial points. A comprehensive validation stage considers synthetic and experimental FLIM datasets of ex vivo atherosclerotic plaques and human breast cancer cell samples, and highlights the advantages of the proposed BDE algorithm under different noise levels, initial conditions in the iterative scheme, and parameters of the proposal.

  10. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  11. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    ERIC Educational Resources Information Center

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  13. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  14. Computational Tools for Parsimony Phylogenetic Analysis of Omics Data.

    PubMed

    Salazar, Jose; Amri, Hakima; Noursi, David; Abu-Asab, Mones

    2015-08-01

    High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential modeling is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. The OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through standard programs of MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables through assigning each data point into a binary value ("0" for normal states and "1" for abnormal states) then outputs the converted data tables into the proper input file formats for MIX or with embedded commands for TNT. SynapExtractor uses outfiles from MIX and TNT to extract the shared aberrations of each node of the cladogram, matching them with identifying labels from the dataset and exporting them into a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing (NGS) data. We make OmicsTract and SynpExtractor publicly and freely available for non-commercial use in order to strengthen and build capacity for the phylogenetic paradigm of omics analysis. PMID:26230532
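A hedged sketch of the kind of conversion described above: quantitative values are binarised to "0" (normal) or "1" (aberrant) and written as a simple discrete character matrix for a parsimony program. The threshold rule (outside mean ± 2 SD of the healthy group) and the output layout are illustrative assumptions, not the published programs' exact behaviour:

```python
import numpy as np

def binarise(expr, healthy_idx, n_sd=2.0):
    """expr: (n_samples, n_features) array. Returns a 0/1 matrix of aberrations."""
    healthy = expr[healthy_idx]
    mu, sd = healthy.mean(axis=0), healthy.std(axis=0, ddof=1)
    return ((expr < mu - n_sd * sd) | (expr > mu + n_sd * sd)).astype(int)

def write_matrix(path, names, binary):
    """Write a simple PHYLIP-style discrete character matrix."""
    n_taxa, n_chars = binary.shape
    with open(path, "w") as f:
        f.write(f"{n_taxa} {n_chars}\n")
        for name, row in zip(names, binary):
            f.write(f"{name:<10s}{''.join(map(str, row))}\n")

rng = np.random.default_rng(0)
expr = rng.normal(size=(6, 20))
expr[3:, :5] += 4.0                      # give the "diseased" samples shared aberrations
names = ["healthy1", "healthy2", "healthy3", "tumor1", "tumor2", "tumor3"]
binary = binarise(expr, healthy_idx=[0, 1, 2])
write_matrix("omics_binary.phy", names, binary)
print(binary[:, :5])
```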

  15. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate change. Moreover, Climate Wizard is not a static product, but rather a data analysis framework designed to be used for climate change impact and adaption planning, which can be expanded to include other information, such as downscaled future projections of hydrology, soil moisture, wildfire, vegetation, marine conditions, disease, and agricultural productivity. PMID:20016827

  16. GENOME RESOURCES AND COMPARATIVE ANALYSIS TOOLS FOR CARDIOVASCULAR RESEARCH

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Disorders of the cardiovascular (CV) system are often caused by the interaction of genetic and environmental factors that jointly contribute to individual susceptibility. Genomic data and bioinformatics tools generated from genome projects, coupled with functional verification, offer novel approache...

  17. Hopfield neural network deconvolution for weak lensing measurement

    NASA Astrophysics Data System (ADS)

    Nurbaeva, G.; Tewes, M.; Courbin, F.; Meylan, G.

    2015-05-01

    Weak gravitational lensing has the potential to place tight constraints on the equation of state of dark energy. However, this will only be possible if shear measurement methods can reach the required level of accuracy. We present a new method for measuring the ellipticity of galaxies used in weak lensing surveys. The method makes use of direct deconvolution of the data by the total point spread function (PSF). We adopt a linear algebra formalism that represents the PSF as a Toeplitz matrix. This allows us to solve the convolution equation by applying the Hopfield neural network iterative scheme. The ellipticity of galaxies in the deconvolved images is then measured using second-order moments of the autocorrelation function of the images. To our knowledge, this is the first time full image deconvolution has been used to measure weak lensing shear. We apply our method to the simulated weak lensing data proposed in the GREAT10 challenge and obtain a quality factor of Q = 87. This result is obtained after applying image denoising to the data, prior to the deconvolution. The additive and multiplicative biases on the shear power spectrum are then √A = +0.09 × 10^-4 and M/2 = +0.0357, respectively.
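A one-dimensional sketch of the linear-algebra setup described above: the PSF is embedded in a Toeplitz matrix and the convolution equation is solved by iterative minimisation of the quadratic misfit. A plain gradient descent stands in for the Hopfield-network iteration of the paper; both minimise a quadratic energy, but this is only an illustration under assumed toy data:

```python
import numpy as np
from scipy.linalg import toeplitz

n = 128
rng = np.random.default_rng(0)
x_true = np.zeros(n)
x_true[30:36] = 1.0
x_true[80:95] = 0.5                               # toy 1-D "galaxy" profile

psf = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
psf /= psf.sum()
col = np.zeros(n); col[:11] = psf[10:]            # centred PSF as a Toeplitz operator
row = np.zeros(n); row[:11] = psf[10::-1]
H = toeplitz(col, row)

y = H @ x_true + rng.normal(scale=0.002, size=n)  # blurred, noisy observation

# iterative minimisation of the quadratic energy E(x) = 0.5 * ||H x - y||^2
x = np.zeros(n)
step = 1.0 / np.linalg.norm(H, 2) ** 2            # safe step size for gradient descent
for _ in range(2000):
    x -= step * (H.T @ (H @ x - y))
print(round(float(np.abs(x - x_true).mean()), 3)) # mean absolute reconstruction error
```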

  18. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over having one integrated tool. This tool-suite, their characteristics, and their applicability will be described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  19. Simultaneous deconvolution of the bivariate distribution of molecular weight and chemical composition of polyolefins made with ziegler-natta catalysts.

    PubMed

    Alghyamah, Abdulaziz A; Soares, João B P

    2009-02-18

    Polyolefins made with Ziegler-Natta catalysts have non-uniform distributions of molecular weight (MWD) and chemical composition (CCD). The MWD is usually measured by high-temperature gel permeation chromatography (GPC) and the CCD by either temperature rising elution fractionation (TREF) or crystallization analysis fractionation (CRYSTAF). A mathematical model is needed to quantify the information provided by these analytical techniques and to relate it to the presence of multiple site types on Ziegler-Natta catalysts. We developed a robust computer algorithm to deconvolute the MWD and CCD of polyolefins simultaneously using Flory's most probable distribution and the cumulative CCD component of Stockmayer's distribution, which includes the soluble fraction commonly present in linear low-density polyethylene (LLDPE) resins and have applied this procedure for the first time to several industrial LLDPE resins. The deconvolution results are reproducible and consistent with theoretical expectations. PMID:21706614
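A simplified sketch of the molecular-weight half of the problem (the chemical-composition/Stockmayer part is omitted): each site type contributes a Flory most-probable component on the log M axis, dW/dlog10(M) = ln(10)·(M/Mn)²·exp(-M/Mn), and non-negative weights over a grid of candidate Mn values are found by non-negative least squares. This grid-plus-NNLS variant is an illustrative stand-in for the authors' algorithm:

```python
import numpy as np
from scipy.optimize import nnls

def flory_logM(logM, Mn):
    """Flory most-probable distribution on a log10(M) axis (unit area)."""
    M = 10.0 ** logM
    return np.log(10.0) * (M / Mn) ** 2 * np.exp(-M / Mn)

logM = np.linspace(2.5, 6.5, 300)                       # GPC axis, log10(M)

# synthetic "measured" MWD: three site types with different Mn and weights
true_Mn = np.array([8e3, 4e4, 1.6e5])
true_w = np.array([0.25, 0.45, 0.30])
mwd = sum(w * flory_logM(logM, Mn) for w, Mn in zip(true_w, true_Mn))
mwd += np.random.default_rng(0).normal(scale=0.002, size=logM.size)

# dense grid of candidate Mn values; NNLS picks a sparse non-negative combination
Mn_grid = np.logspace(3, 6, 40)
A = np.column_stack([flory_logM(logM, Mn) for Mn in Mn_grid])
weights, _ = nnls(A, mwd)
active = weights > 0.02
print(np.round(Mn_grid[active]), np.round(weights[active], 2))
```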

  20. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    SciTech Connect

    Jodoin, Vincent J; Lee, Ronald W; Peplow, Douglas E.; Lefebvre, Jordan P

    2011-01-01

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  1. Optimizing end-to-end system performance for millimeter and submillimeter spectroscopy of protostars : wideband heterodyne receivers and sideband-deconvolution techniques for rapid molecular-line surveys

    NASA Astrophysics Data System (ADS)

    Sumner, Matthew Casey

    This thesis describes the construction, integration, and use of a new 230-GHz ultra-wideband heterodyne receiver, as well as the development and testing of a new sideband-deconvolution algorithm, both designed to enable rapid, sensitive molecular-line surveys. The 230-GHz receiver, known as Z-Rex, is the first of a new generation of wideband receivers to be installed at the Caltech Submillimeter Observatory (CSO). Intended as a proof-of-concept device, it boasts an ultra-wide IF output range of ~6-18 GHz, offering as much as a twelvefold increase in the spectral coverage that can be achieved with a single LO setting. A similarly wideband IF system has been designed to couple this receiver to an array of WASP2 spectrometers, allowing the full bandwidth of the receiver to be observed at low resolution, ideal for extra-galactic redshift surveys. A separate IF system feeds a high-resolution 4-GHz AOS array frequently used for performing unbiased line surveys of galactic objects, particularly star-forming regions. The design and construction of the wideband IF system are presented, as is the work done to integrate the receiver and the high-resolution spectrometers into a working system. The receiver is currently installed at the CSO where it is available for astronomers' use. In addition to demonstrating wideband design principles, the receiver also serves as a testbed for a synthesizer-driven, active LO chain that is under consideration for future receiver designs. Several lessons have been learned, including the importance of driving the final amplifier of the LO chain into saturation and the absolute necessity of including a high-Q filter to remove spurious signals from the synthesizer output. The on-telescope performance of the synthesizer-driven LO chain is compared to that of the Gunn-oscillator units currently in use at the CSO. Although the frequency agility of the synthesized LO chain gives it a significant advantage for unbiased line surveys, the cleaner signal and broader tuning range of the Gunn continue to make it the preferred choice. The receiver and high-resolution spectrometer system were brought into a fully operational state late in 2007, when they were used to perform unbiased molecular-line surveys of several galactic sources, including the Orion KL hot core and a position in the L1157 outflow. In order to analyze these data, a new data pipeline was needed to deconvolve the double-sideband signals from the receiver and to model the molecular spectra. A highly automated sideband-deconvolution system has been created, and spectral-analysis tools are currently being developed. The sideband deconvolution relies on chi-square minimization to determine the optimal single-sideband spectrum in the presence of unknown sideband-gain imbalances and spectral baselines. Analytic results are presented for several different methods of approaching the problem, including direct optimization, nonlinear root finding, and a hybrid approach that utilizes a two-stage process to separate out the relatively weak nonlinearities so that the majority of the parameters can be found with a fast linear solver. Analytic derivations of the Jacobian matrices for all three cases are presented, along with a new Mathematica utility that enables the calculation of arbitrary gradients. The direct-optimization method has been incorporated into software, along with a spectral simulation engine that allows different deconvolution scenarios to be tested.
The software has been validated through the deconvolution of simulated data sets, and initial results from L1157 and Orion are presented. Both surveys demonstrate the power of the wideband receivers and improved data pipeline to enable exciting scientific studies. The L1157 survey was completed in only 20 hours of telescope time and offers moderate sensitivity over a > 50-GHz range, from 220 GHz to approximately 270 or 280 GHz. The speed with which this survey was completed implies that the new systems will permit unbiased line surveys to become a standard observational tool. The Orion survey is expected to offer ~30 mK sensitivity over a similar frequency range, improving previous results by an order of magnitude. The new receiver's ability to cover such broad bandwidths permits very deep surveys to be completed in a reasonable time, and the sideband-deconvolution algorithm is capable of preserving these low noise levels. Combined, these tools can provide line spectra with the sensitivity required for constraining astrochemical models and investigating prebiotic molecules.
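A toy version of the sideband-deconvolution idea, with sideband gains and baselines assumed known and equal (removing the nonlinear part that the thesis actually tackles): each double-sideband channel records the sum of one lower- and one upper-sideband channel, and shifting the LO between scans makes the combined linear system solvable by chi-square (least-squares) minimisation. The channel mapping and numbers below are hypothetical:

```python
import numpy as np

n_ssb = 400                                    # channels in the recovered SSB spectrum
n_if = 60                                      # IF channels per DSB scan
lo_offsets = [0, 7, 13, 22, 35, 51, 68, 90]    # LO shifts between scans, in channels

rng = np.random.default_rng(0)
ssb_true = np.zeros(n_ssb)
ssb_true[[80, 150, 230, 310]] = [1.0, 0.6, 0.8, 0.4]   # a few spectral "lines"

A = np.zeros((len(lo_offsets) * n_if, n_ssb))
for s, off in enumerate(lo_offsets):
    for k in range(n_if):
        lo = 150 + off                         # LO sits at this channel for scan s
        A[s * n_if + k, lo - (50 + k)] = 1.0   # lower-sideband channel
        A[s * n_if + k, lo + (50 + k)] = 1.0   # upper-sideband channel

y = A @ ssb_true + rng.normal(scale=0.01, size=A.shape[0])   # stacked DSB scans
ssb_est, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(ssb_est[[80, 150, 230, 310]], 2))   # should be close to the injected lines
```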

  2. The ISOPHOT Interactive Analysis PIA, a Calibration and Scientific Analysis Tool

    NASA Astrophysics Data System (ADS)

    Gabriel, C.; Acosta-Pulido, J.; Heinrichsen, I.; Morris, H.; Tai, W.-M.

    ISOPHOT is one of the instruments on board the Infrared Space Observatory (ISO), launched in November 1995 by the European Space Agency. ISOPHOT Interactive Analysis, PIA, is a scientific and calibration data analysis tool for ISOPHOT data reduction. The PIA software was designed both as a tool for use by the instrument team for calibration of the PHT instrument during and after the ISO mission, and as an interactive tool for ISOPHOT data analysis by general observers. It has been jointly developed by the ESA Astrophysics Division (responsible for planning, direction, and execution) and scientific institutes forming the ISOPHOT consortium. PIA is entirely based on the Interactive Data Language IDL (IDL92). All the capabilities of input/output, processing, and visualization can be reached from window menus for all the different ISOPHOT sub-systems, in all levels of data reduction from digital raw data, coming from the ISO telemetry, to the final level of calibrated images, spectra, and multi-filter, multi-aperture photometry. In this article, we describe the structure of the package, putting special emphasis on the internal data organization and management.

  3. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed instruction tailored to their experience level along with proper support and mentoring. This project was funded by a grant from the National Science Foundation, Grant # PHY1157078.

  4. Deconvolution of neural dynamics from fMRI data using a spatiotemporal hemodynamic response function.

    PubMed

    Aquino, K M; Robinson, P A; Schira, M M; Breakspear, M

    2014-07-01

    Functional magnetic resonance imaging (fMRI) is a powerful and broadly used means of non-invasively mapping human brain activity. However fMRI is an indirect measure that rests upon a mapping from neuronal activity to the blood oxygen level dependent (BOLD) signal via hemodynamic effects. The quality of estimated neuronal activity hinges on the validity of the hemodynamic model employed. Recent work has demonstrated that the hemodynamic response has non-separable spatiotemporal dynamics, a key property that is not implemented in existing fMRI analysis frameworks. Here both simulated and empirical data are used to demonstrate that using a physiologically based model of the spatiotemporal hemodynamic response function (stHRF) results in a quantitative improvement of the estimated neuronal response relative to unphysical space-time separable forms. To achieve this, an integrated spatial and temporal deconvolution is established using a recently developed stHRF. Simulated data allows the variation of key parameters such as noise and the spatial complexity of the neuronal drive, while knowing the neuronal input. The results demonstrate that the use of a spatiotemporally integrated HRF can avoid "ghost" neuronal responses that can otherwise be falsely inferred. Applying the spatiotemporal deconvolution to high resolution fMRI data allows the recovery of neuronal responses that are consistent with independent electrophysiological measures. PMID:24632091
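A purely temporal sketch of the deconvolution step (the spatial coupling that makes the stHRF distinctive is omitted): the BOLD series is modelled as a Toeplitz convolution of the neuronal drive with an assumed HRF, and the drive is recovered by ridge-regularised least squares. The double-gamma HRF below is a common approximation, not the physiologically based stHRF of the paper, and all parameter values are assumed for the demo:

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.stats import gamma

TR = 1.0                                           # repetition time, s (assumed)
t = np.arange(0, 30, TR)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)    # double-gamma approximation
hrf /= hrf.max()

n = 240
rng = np.random.default_rng(0)
drive = np.zeros(n)
events = [30, 90, 150, 200]
drive[events] = 1.0                                # brief neuronal events

col = np.r_[hrf, np.zeros(n - hrf.size)]
H = toeplitz(col, np.r_[hrf[0], np.zeros(n - 1)])  # convolution as a Toeplitz matrix
bold = H @ drive + rng.normal(scale=0.05, size=n)  # noisy BOLD time series

lam = 0.1                                          # ridge weight (tuning parameter)
drive_est = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ bold)
print(np.round(drive_est[events], 2))              # should stand well above the background
```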

  5. Mechanisms proposed for spectrogram correlation and transformation deconvolution in FM bat sonar

    NASA Astrophysics Data System (ADS)

    Simmons, James A.

    2005-09-01

    Big brown bats use time/frequency distributions to represent FM biosonar pulses and echoes as a consequence of reception through frequency-tuned channels of the inner ear and subsequent processing by similarly tuned neural channels in the auditory pathway. Integration time is 350 μs, yet delay resolution is 2-10 μs, which must be based on detecting changes in the echo spectrum caused by interference between overlapping reflections inside the integration time. However, bats perceive not merely the echo interference spectrum but the numerical value of the delay separation from the spectrum, which requires deconvolution. Because spectrograms are the initial representation, this process is spectrogram correlation and transformation (SCAT). Proposed SCAT deconvolution mechanisms include extraction of echo envelope ripples for time-domain spectrometry, cepstral analysis of echoes, use of coherent or noncoherent reconstruction with basis functions, segmentation of onsets of overlapping replicas at moderate to long time separations, and localization of the occurrence of spectral interference ripples at specific times within dechirped spectrograms. Physiological evidence from single-unit recordings reveals a cepstral-like time-frequency process based on freqlets, both single-unit and multiunit responses reveal what may prove to be time-domain basis functions, and multiunit responses exhibit modulations by onset and envelope ripple. [Work supported by NIH and ONR.]
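The cepstral route listed above can be illustrated numerically: two overlapping reflections separated by a short delay produce ripple in the echo spectrum, and the real cepstrum converts that ripple into a peak at the delay itself. A broadband noise burst stands in for the bat's FM call here purely to keep the numerics simple, and all parameter values are assumptions for the demo:

```python
import numpy as np

fs = 500_000                                  # sample rate, Hz
n = 2048
rng = np.random.default_rng(0)
call = rng.normal(size=n)                     # stand-in for the emitted pulse

tau = 20e-6                                   # two-glint separation to recover
d = int(round(tau * fs))                      # 10 samples
echo = call.copy()
echo[d:] += 0.8 * call[:-d]                   # second, weaker overlapping reflection

log_mag = np.log(np.abs(np.fft.rfft(echo)) + 1e-12)
cepstrum = np.fft.irfft(log_mag, n=n)
quefrency = np.arange(n) / fs                 # seconds

k = np.argmax(cepstrum[2:n // 2]) + 2         # skip the zero-quefrency term
print(f"estimated glint delay: {quefrency[k] * 1e6:.0f} us (true {tau * 1e6:.0f} us)")
```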

  6. Thorium concentrations in the lunar surface. V - Deconvolution of the central highlands region

    NASA Technical Reports Server (NTRS)

    Metzger, A. E.; Etchegaray-Ramirez, M. I.; Haines, E. L.

    1982-01-01

    The distribution of thorium in the lunar central highlands measured from orbit by the Apollo 16 gamma-ray spectrometer is subjected to a deconvolution analysis to yield improved spatial resolution and contrast. Use of two overlapping data fields for complete coverage also provides a demonstration of the technique's ability to model concentrations several degrees beyond the data track. Deconvolution reveals an association between Th concentration and the Kant Plateau, Descartes Mountain and Cayley plains surface formations. The Kant Plateau and Descartes Mountains model with Th less than 1 part per million, which is typical of farside highlands but is infrequently seen over any other nearside highland portions of the Apollo 15 and 16 ground tracks. It is noted that, if the Cayley plains are the result of basin-forming impact ejecta, the distribution of Th concentration with longitude supports an origin from the Imbrium basin rather than the Nectaris or Orientale basins. Nectaris basin materials are found to have a Th concentration similar to that of the Descartes Mountains, evidence that the latter may have been emplaced as Nectaris basin impact deposits.

  7. Blind deconvolution of 3D fluorescence microscopy using depth-variant asymmetric PSF.

    PubMed

    Kim, Boyoung; Naemura, Takeshi

    2016-06-01

    3D wide-field fluorescence microscopy suffers from depth-variant asymmetric blur. The depth variance and axial asymmetry are due to refractive index mismatch between the immersion and the specimen layer. The radial asymmetry is due to lens imperfections and local refractive index inhomogeneities in the specimen. To obtain a PSF with these characteristics, PSF premeasurement has been attempted. However, such measurements are of little use, since imaging conditions such as camera position and the refractive index of the specimen change between the premeasurement and the actual imaging. In this article, we focus on removing unknown depth-variant asymmetric blur in such an optical system under the assumption of refractive index homogeneity in the specimen. We propose estimating a few parameters of a mathematical PSF model, which has a depth-variant asymmetric shape, from the observed images. After generating an initial PSF from the analysis of intensities in the observed image, the parameters are estimated based on a maximum likelihood estimator. Using the estimated PSF, we implement an accelerated GEM algorithm for image deconvolution. The deconvolution results show the superiority of our algorithm in terms of accuracy, which is quantitatively evaluated by full width at half maximum (FWHM), relative contrast, and standard deviation values of intensity peaks. Microsc. Res. Tech. 79:480-494, 2016. © 2016 Wiley Periodicals, Inc. PMID:27062314
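
    The accelerated GEM algorithm and the depth-variant PSF estimation described above are not reproduced here. As a minimal stand-in, the sketch below shows classical Richardson-Lucy iteration (maximum-likelihood EM under Poisson noise) with a fixed, known PSF; the PSF and iteration count are assumptions.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy_3d(image, psf, n_iter=50, eps=1e-12):
          # Classical Richardson-Lucy with a shift-invariant PSF; `image` is a
          # non-negative 3D stack and `psf` is assumed normalized to sum to 1.
          image = np.asarray(image, dtype=float)
          psf_mirror = psf[::-1, ::-1, ::-1]
          estimate = np.full_like(image, image.mean())
          for _ in range(n_iter):
              blurred = fftconvolve(estimate, psf, mode="same")
              ratio = image / (blurred + eps)
              estimate *= fftconvolve(ratio, psf_mirror, mode="same")
          return estimate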

  8. An Optimal Deconvolution Method for Reconstructing Pneumatically Distorted Near-Field Sonic Boom Pressure Measurements

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Haering, Edward A., Jr.; Ehernberger, L. J.

    1996-01-01

    In-flight measurements of the SR-71 near-field sonic boom were obtained by an F-16XL airplane at flightpath separation distances from 40 to 740 ft. Twenty-two signatures were obtained from Mach 1.60 to Mach 1.84 and altitudes from 47,600 to 49,150 ft. The shock wave signatures were measured by the total and static sensors on the F-16XL noseboom. These near-field signature measurements were distorted by pneumatic attenuation in the pitot-static sensors; these effects were accounted for using optimal deconvolution. Measurement system magnitude and phase characteristics were determined from ground-based step-response tests and extrapolated to flight conditions using analytical models. Deconvolution was implemented using Fourier transform methods. Comparisons of the shock wave signatures reconstructed from the total and static pressure data are presented. The good agreement achieved gives confidence in the quality of the reconstruction analysis. Although originally developed to reconstruct the sonic boom signatures from SR-71 sonic boom flight tests, the methods presented here apply generally to other types of highly attenuated or distorted pneumatic measurements.
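
    The step-response-derived transfer function of the flight measurement system is not available here; the sketch below shows the Fourier-transform deconvolution step in generic form, as a regularized (Wiener-style) inverse filter applied to a recorded pressure trace. The `freq_response` array and the noise-to-signal weight are assumptions standing in for the calibrated sensor model.

      import numpy as np

      def deconvolve_pressure(measured, freq_response, noise_to_signal=1e-2):
          # Divide the measured spectrum by the complex sensor/tubing frequency
          # response; the regularization term keeps heavily attenuated bands
          # from amplifying noise. `freq_response` must be sampled on the same
          # rfft frequency grid as the measured record.
          M = np.fft.rfft(measured)
          H = np.asarray(freq_response, dtype=complex)
          W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
          return np.fft.irfft(W * M, n=len(measured))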

  9. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    NASA Astrophysics Data System (ADS)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantages and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
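
    For reference, the NURBS basis underlying such isogeometric discretizations follows the standard textbook definitions below (knot values ξ_i, degree p, weights w_i, control points P_i); these formulas are general background, not equations quoted from the paper.

      N_{i,0}(\xi) = \begin{cases} 1, & \xi_i \le \xi < \xi_{i+1}, \\ 0, & \text{otherwise}, \end{cases}
      \qquad
      N_{i,p}(\xi) = \frac{\xi - \xi_i}{\xi_{i+p} - \xi_i}\, N_{i,p-1}(\xi)
                   + \frac{\xi_{i+p+1} - \xi}{\xi_{i+p+1} - \xi_{i+1}}\, N_{i+1,p-1}(\xi),

      R_{i,p}(\xi) = \frac{N_{i,p}(\xi)\, w_i}{\sum_{j=1}^{n} N_{j,p}(\xi)\, w_j},
      \qquad
      \mathbf{C}(\xi) = \sum_{i=1}^{n} R_{i,p}(\xi)\, \mathbf{P}_i .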

  10. Lagrangian analysis. Modern tool of the dynamics of solids

    NASA Astrophysics Data System (ADS)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The Lagrangian specificity of the required measurements is assured by the fact that a transducer enclosed within a solid material is necessarily linked in motion to the particles of the material which surround it. This Lagrangian instrumentation is described in the second chapter. The authors are concerned with the techniques considered today to be the most effective. These are, for stress: piezoresistive gauges (50 Ω and low impedance) and piezoelectric techniques (PVF2 gauges, quartz transducers); and for particle velocity: electromagnetic gauges, VISAR and IDL Doppler laser interferometers. In each case both the physical principles and the techniques of use are set out in detail. For the most part, the authors use their own experience to describe the calibration of these instrumentation systems and to compare their characteristics: measurement range, response time, accuracy, useful recording time, detection area... These characteristics should be taken into account by the physicist when he has to choose the instrumentation systems best adapted to the Lagrangian analysis he intends to apply to any given material. The discussion at the end of chapter 2 should guide his choice both for plane and spherical one-dimensional motions. The third chapter examines to what extent the accuracy of Lagrangian analysis is affected by the accuracies of the numerical analysis methods and experimental techniques.
By means of a discussion of different cases of analysis, the authors want to make the reader aware of the different kinds of sources of errors that may be encountered. This work brings up to date the state of studies on Lagrangian analysis methods based on a wide review of bibliographical sources together with the contribution made to research in this field by the four authors themselves in the course of the last ten years.
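
    For orientation, the one-dimensional plane-motion conservation relations that such Lagrangian analyses integrate can be written as follows, with Lagrangian coordinate h, particle velocity u, stress σ (tension positive), engineering strain ε, specific internal energy E, and initial density ρ0; sign conventions vary between authors, and shock-physics work often takes compression as positive.

      \rho_0 \left.\frac{\partial u}{\partial t}\right|_h = \left.\frac{\partial \sigma}{\partial h}\right|_t,
      \qquad
      \left.\frac{\partial \varepsilon}{\partial t}\right|_h = \left.\frac{\partial u}{\partial h}\right|_t,
      \qquad
      \rho_0 \left.\frac{\partial E}{\partial t}\right|_h = \sigma \left.\frac{\partial u}{\partial h}\right|_t .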

  11. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint Finite Element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.

  12. The enhancement of fault detection and diagnosis in rolling element bearings using minimum entropy deconvolution combined with spectral kurtosis

    NASA Astrophysics Data System (ADS)

    Sawalhi, N.; Randall, R. B.; Endo, H.

    2007-08-01

    Spectral kurtosis (SK) represents a valuable tool for extracting transients buried in noise, which makes it very powerful for the diagnostics of rolling element bearings. However, a high value of SK requires that the individual transients are separated, which in turn means that if their repetition rate is high their damping must be sufficiently high that each dies away before the appearance of the next. This paper presents an algorithm for enhancing the surveillance capability of SK by using the minimum entropy deconvolution (MED) technique. The MED technique effectively deconvolves the effect of the transmission path and clarifies the impulses, even where they are not separated in the original signal. The paper illustrates these issues by analysing signals taken from a high-speed test rig, which contained a bearing with a spalled inner race. The results show that the use of the MED technique dramatically sharpens the pulses originating from the impacts of the balls with the spall and increases the kurtosis values to a level that reflects the severity of the fault. Moreover, when the algorithm was tested on signals taken from a gearbox for a bearing with a spalled outer race, it shows that each of the impulses originating from the impacts is made up of two parts (corresponding to entry into and exit from the spall). This agrees well with the literature but is often difficult to observe without the use of the MED technique. The use of the MED along with SK analysis also greatly improves the results of envelope analysis for making a complete diagnosis of the fault and trending its progression.
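
    A minimal sketch of an STFT-based spectral kurtosis estimate is given below, using the usual normalized fourth-order moment per frequency bin; the MED prefiltering stage described above and the window-length choices of the paper are not included, and the parameters shown are assumptions.

      import numpy as np
      from scipy.signal import stft

      def spectral_kurtosis(x, fs, nperseg=256):
          # SK(f) = <|X(t,f)|^4> / <|X(t,f)|^2>^2 - 2 for each frequency bin;
          # large values flag bands carrying impulsive (bearing-fault) energy.
          f, _, X = stft(x, fs=fs, nperseg=nperseg)
          mag2 = np.abs(X) ** 2
          sk = np.mean(mag2 ** 2, axis=1) / (np.mean(mag2, axis=1) ** 2 + 1e-20) - 2.0
          return f, sk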

  13. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    SciTech Connect

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  14. Maximum correlated Kurtosis deconvolution and application on gear tooth chip fault detection

    NASA Astrophysics Data System (ADS)

    McDonald, Geoff L.; Zhao, Qing; Zuo, Ming J.

    2012-11-01

    In this paper a new deconvolution method is presented for the detection of gear and bearing faults from vibration data. The proposed maximum correlated kurtosis deconvolution method takes advantage of the periodic nature of the faults as well as the impulse-like vibration behaviour associated with most types of faults. The results are compared to the standard minimum entropy deconvolution method on both simulated and experimental data. The experimental data are from a gearbox with a gear chip fault, and the results are compared between healthy and faulty vibrations. The results indicate that the proposed maximum correlated kurtosis deconvolution method performs considerably better than the traditional minimum entropy deconvolution method, and often performs several times better at fault detection. In addition to this improved performance, deconvolution of separate fault periods is possible, allowing for concurrent fault detection. Finally, an online implementation is proposed and shown to perform well and to be computationally achievable on a personal computer.
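
    In the maximum correlated kurtosis deconvolution literature the criterion being maximized for a filtered signal y_n, target fault period T, and shift order M is usually written as below; the notation may differ slightly from the paper's.

      CK_M(T) = \frac{\sum_{n=1}^{N} \left( \prod_{m=0}^{M} y_{n - mT} \right)^{2}}
                     {\left( \sum_{n=1}^{N} y_n^{2} \right)^{M+1}} .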

  15. Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.

    PubMed

    Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

    2011-08-01

    Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist the work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which searches for identical elementary modes in two metabolic networks. The suite T4M also includes one script that generates Metatool input: CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. PMID:21554926

  16. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    PubMed

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. PMID:26679001

  17. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  18. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  19. Tools for Education Policy Analysis [with CD-ROM].

    ERIC Educational Resources Information Center

    Mingat, Alain; Tan, Jee-Peng

    This manual contains a set of tools to assist policymakers in analyzing and revamping educational policy. Its main focus is on some economic and financial aspects of education and selected features in the arrangements for service delivery. Originally offered as a series of training workshops for World Bank staff to work with clients in the…

  1. Mapping and spatiotemporal analysis tool for hydrological data: Spellmap

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Lack of data management and analysis tools is one of the major limitations to effectively evaluating and using large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...

  2. TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.

    PubMed

    Fimereli, Danai; Detours, Vincent; Konopka, Tomasz

    2013-04-01

    High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/. PMID:23408855

  3. Determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1991-01-01

    The final report for work on the determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution is presented. Papers and theses prepared during the research report period are included. Among all the research results reported, note should be made of the specific investigation of the determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution. A methodology was developed to determine design and operation parameters for error minimization when deconvolution is included in data analysis. An error surface is plotted versus the signal-to-noise ratio (SNR) and all parameters of interest. Instrumental characteristics will determine a curve in this space. The SNR and parameter values which give the projection from the curve to the surface, corresponding to the smallest value for the error, are the optimum values. These values are constrained by the curve and so will not necessarily correspond to an absolute minimum in the error surface.

  4. The digital penalized LMS deconvolution method for TPC X-ray polarimeter signal processing

    NASA Astrophysics Data System (ADS)

    He, L.; Deng, Z.; Li, H.; Liu, Y. N.; Feng, H.

    2015-04-01

    This article presents the Digital Penalized LMS (Least Mean Square) deconvolution method for processing the output signal of the X-ray polarimeter readout electronics. The deconvolution filter is used to recover the high-frequency component of the detector signal, which is lost due to the limited bandwidth of the readout electronics. The DPLMS deconvolution method does not need to know the transfer function of the readout electronics system in advance and can restrain the deconvolution noise by using a noise constraint. In this paper, this method is applied to process simulation data generated by GEANT4, and the resulting photoelectron angular resolution of an X-ray polarimeter is presented.
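
    The DPLMS algorithm itself, including its noise constraint, is not reproduced here; the sketch below only illustrates the plain LMS part, adapting an FIR inverse filter so that the filtered readout output tracks a known reference waveform. The tap count, step size, and the availability of a reference signal are assumptions.

      import numpy as np

      def lms_inverse_filter(measured, reference, n_taps=32, mu=1e-3):
          # Standard LMS: adapt weights w so that (w * measured) approximates
          # the undistorted reference; w += mu * error * input. The step size
          # mu must be small relative to the input power for convergence.
          w = np.zeros(n_taps)
          out = np.zeros_like(np.asarray(reference, dtype=float))
          for n in range(n_taps, len(measured)):
              x = measured[n - n_taps:n][::-1]
              out[n] = w @ x
              err = reference[n] - out[n]
              w += mu * err * x
          return w, out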

  5. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    SciTech Connect

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  6. Translational meta-analysis tool for temporal gene expression profiles.

    PubMed

    Tusch, Guenter; Tole, Olvi

    2012-01-01

    Widespread use of microarray technology has led to highly complex datasets that often address similar or related biological questions. In translational medicine, research is often based on measurements that have been obtained at different points in time. However, the researcher looks at them as a progression over time. If a biological stimulus shows an effect on a particular gene that is reversed over time, this would show, for instance, as a peak in the gene's temporal expression profile. Our program SPOT helps researchers find these patterns in large sets of microarray data. We created the software tool using open-source platforms and the Semantic Web tool Protégé-OWL. PMID:22874385

  7. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  8. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

  9. Analysis tools for the calibration and commissioning of the AOF

    NASA Astrophysics Data System (ADS)

    Garcia-Rissmann, Aurea; Kolb, Johann; Le Louarn, Miska; Madec, Pierre-Yves; Muller, Nicolas

    2013-12-01

    The Adaptive Optics Facility (AOF) is an AO-oriented upgrade envisaged to be implemented at UT4 in Paranal in 2013-2014, and which could serve as a test case for the E-ELT. Counting on the largest Deformable Secondary Mirror ever built (1170 actuators) and on four off-axis Na laser launch telescopes, the AOF will operate in distinct modes (GLAO, LTAO, SCAO), in accordance with the instruments attached to the two telescope Nasmyth ports (GALACSI+MUSE, GRAAL+HAWK-I) and to the Cassegrain port (ERIS). Tools are under development to allow fast testing of important parameters for these systems at commissioning and for subsequent assessment of telemetry data. These concern the determination of turbulence parameters and Cn2 profiling, measurement of Strehl ratios and ensquared energies, misregistration calculation, bandwidth and overall performance, etc. Our tools are presented as Graphical User Interfaces developed in the Matlab environment, and will be able to grab, through a dedicated server, data saved in SPARTA standards. We present here the tools developed to date and discuss details of what can be obtained from the AOF, based on simulations.

  10. Spectral Analysis Tool (SAT) for Radio-Frequency Interference Analysis and Spectrum Management

    NASA Astrophysics Data System (ADS)

    Lo, V. Y.; Chen, F.; Rucker, J.

    1998-04-01

    A microcomputer-based software package for analyzing radio-frequency interference is presented in this article. The latest enhancement (Version 5) of the Spectral Analysis Tool (SAT) contains numerous features essential for effective spectrum management. Included in the SAT windows menu are editors for signal/interference, filters, and spikes, power spectrum plots, calculators for spectral power, telecommunication link budget tables, and interference analysis tables. The configuration menu for the analysis table contains options on antennas, amplifiers, and receivers for ground terminal characterization. The complete software package of SAT, with a Windows installation program, fits on a single 3.5-inch diskette for easy distribution. Signal and ground terminal models are described. Finally, examples of the application of SAT are also included in this article.

  11. OutbreakTools: A new platform for disease outbreak analysis using the R software

    PubMed Central

    Jombart, Thibaut; Aanensen, David M.; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A.; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

    2014-01-01

    The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

  12. Optimal application of Morrison's iterative noise removal for deconvolution. Appendices

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1987-01-01

    Morrison's iterative method of noise removal, or Morrison's smoothing, is applied in a simulation to noise-added data sets of various noise levels to determine its optimum use. Morrison's smoothing is applied for noise removal alone, and for noise removal prior to deconvolution. For the latter, an accurate method is analyzed to provide confidence in the optimization. The method consists of convolving the data with an inverse filter calculated by taking the inverse discrete Fourier transform of the reciprocal of the transform of the response of the system. Filters of various lengths are calculated for the narrow and wide Gaussian response functions used. Deconvolution of non-noisy data is performed, and the error in each deconvolution is calculated. Plots of error versus filter length are produced, and from these plots the most accurate filter lengths are determined. The statistical methodologies employed in the optimizations of Morrison's method are similar. A typical peak-type input is selected and convolved with the two response functions to produce the data sets to be analyzed. Both constant and ordinate-dependent Gaussian distributed noise is added to the data, where the noise levels of the data are characterized by their signal-to-noise ratios. The error measures employed in the optimizations are the L1 and L2 norms. Results of the optimizations for both Gaussians, both noise types, and both norms include figures of optimum iteration number and error improvement versus signal-to-noise ratio, and tables of results. The statistical variation of all quantities considered is also given.
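
    The inverse-filter construction described above can be sketched directly: take the inverse discrete Fourier transform of the reciprocal of the transform of the system response and truncate it to the desired length. The small spectral floor added below is an assumption to guard against division by near-zero response values.

      import numpy as np

      def inverse_filter(response, filter_length, floor=1e-6):
          # IDFT of the reciprocal of the response spectrum, truncated to the
          # requested filter length.
          H = np.fft.fft(response)
          H = np.where(np.abs(H) < floor, floor, H)
          return np.real(np.fft.ifft(1.0 / H))[:filter_length]

      def deconvolve(data, response, filter_length):
          # Deconvolution is then a convolution of the data with the inverse filter.
          return np.convolve(data, inverse_filter(response, filter_length), mode="same")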

  13. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the application of this methodology to the selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for the peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. PMID:24522836
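
    The hardware integration and Matlab interface are specific to the cited setup, but the regression core can be sketched with a generic PLS implementation: calibrate on spectra of mixtures with known composition, then predict one concentration trace per co-eluting protein from the spectra recorded during elution. The array names and component count below are assumptions.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      def inline_quantification(X_cal, Y_cal, X_run, n_components=5):
          # X_cal: (n_samples, n_wavelengths) calibration absorbance spectra
          # Y_cal: (n_samples, n_proteins)    known protein concentrations
          # X_run: (n_timepoints, n_wavelengths) spectra recorded during elution
          pls = PLSRegression(n_components=n_components)
          pls.fit(X_cal, Y_cal)
          # Returns deconvolved single-protein concentration traces over the run.
          return pls.predict(X_run)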

  14. A representation for Green's function retrieval by multidimensional deconvolution.

    PubMed

    Wapenaar, Kees; van der Neut, Joost

    2010-12-01

    Green's function retrieval by crosscorrelation may suffer from irregularities in the source distribution, asymmetric illumination, intrinsic losses, etc. Multidimensional deconvolution (MDD) may overcome these limitations. A unified representation for Green's function retrieval by MDD is proposed. From this representation, it follows that the traditional crosscorrelation method gives a Green's function of which the source is smeared in space and time. This smearing is quantified by a space-time point-spread function (PSF), which can be retrieved from measurements at an array of receivers. MDD removes this PSF and thus deblurs and deghosts the source of the Green's function obtained by correlation. PMID:21218859

  15. Multi-frame blind deconvolution using sparse priors

    NASA Astrophysics Data System (ADS)

    Dong, Wende; Feng, Huajun; Xu, Zhihai; Li, Qi

    2012-05-01

    In this paper, we propose a method for multi-frame blind deconvolution. Two sparse priors, i.e., the natural image gradient prior and an l1-norm based prior are used to regularize the latent image and point spread functions (PSFs) respectively. An alternating minimization approach is adopted to solve the resulted optimization problem. We use both gray scale blurred frames from a data set and some colored ones which are captured by a digital camera to verify the robustness of our approach. Experimental results show that the proposed method can accurately reconstruct PSFs with complex structures and the restored images are of high quality.
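
    A generic form of the alternating-minimization objective for K blurred frames g_k, latent image u, and PSFs h_k, with a gradient-sparsity prior on the image and an l1 prior on the PSFs, is shown below; the exact priors, exponents, and weights used in the paper may differ.

      \min_{u,\{h_k\}} \; \sum_{k=1}^{K} \tfrac{1}{2} \lVert h_k * u - g_k \rVert_2^2
        \; + \; \lambda \sum_{i} \lVert (\nabla u)_i \rVert^{\alpha}
        \; + \; \mu \sum_{k=1}^{K} \lVert h_k \rVert_1,
      \qquad h_k \ge 0, \quad \textstyle\sum_j (h_k)_j = 1 .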

  16. Reconstructing the Genomic Content of Microbiome Taxa through Shotgun Metagenomic Deconvolution

    PubMed Central

    Carr, Rogan; Shen-Orr, Shai S.; Borenstein, Elhanan

    2013-01-01

    Metagenomics has transformed our understanding of the microbial world, allowing researchers to bypass the need to isolate and culture individual taxa and to directly characterize both the taxonomic and gene compositions of environmental samples. However, associating the genes found in a metagenomic sample with the specific taxa of origin remains a critical challenge. Existing binning methods, based on nucleotide composition or alignment to reference genomes allow only a coarse-grained classification and rely heavily on the availability of sequenced genomes from closely related taxa. Here, we introduce a novel computational framework, integrating variation in gene abundances across multiple samples with taxonomic abundance data to deconvolve metagenomic samples into taxa-specific gene profiles and to reconstruct the genomic content of community members. This assembly-free method is not bounded by various factors limiting previously described methods of metagenomic binning or metagenomic assembly and represents a fundamentally different approach to metagenomic-based genome reconstruction. An implementation of this framework is available at http://elbo.gs.washington.edu/software.html. We first describe the mathematical foundations of our framework and discuss considerations for implementing its various components. We demonstrate the ability of this framework to accurately deconvolve a set of metagenomic samples and to recover the gene content of individual taxa using synthetic metagenomic samples. We specifically characterize determinants of prediction accuracy and examine the impact of annotation errors on the reconstructed genomes. We finally apply metagenomic deconvolution to samples from the Human Microbiome Project, successfully reconstructing genus-level genomic content of various microbial genera, based solely on variation in gene count. These reconstructed genera are shown to correctly capture genus-specific properties. With the accumulation of metagenomic data, this deconvolution framework provides an essential tool for characterizing microbial taxa never before seen, laying the foundation for addressing fundamental questions concerning the taxa comprising diverse microbial communities. PMID:24146609
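
    The published framework is considerably richer, but its central step can be illustrated as a per-gene non-negative regression of gene abundance across samples against the taxonomic abundance matrix; the array names and shapes below are assumptions, not the authors' implementation.

      import numpy as np
      from scipy.optimize import nnls

      def deconvolve_gene_content(taxa_abundance, gene_abundance):
          # taxa_abundance: (n_samples, n_taxa)  taxon abundances per sample
          # gene_abundance: (n_samples, n_genes) gene counts per sample
          # Returns (n_taxa, n_genes) estimated per-taxon gene copy numbers,
          # fitted one gene at a time by non-negative least squares.
          n_taxa, n_genes = taxa_abundance.shape[1], gene_abundance.shape[1]
          content = np.zeros((n_taxa, n_genes))
          for g in range(n_genes):
              content[:, g], _ = nnls(taxa_abundance, gene_abundance[:, g])
          return content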

  17. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    SciTech Connect

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.

  18. Peak Studio: a tool for the visualization and analysis of fragment analysis files.

    PubMed

    McCafferty, Jonathan; Reid, Robert; Spencer, Melanie; Hamp, Timothy; Fodor, Anthony

    2012-10-01

    While emerging technologies such as next-generation sequencing are increasingly important tools for the analysis of metagenomic communities, molecular fingerprinting techniques such as automated ribosomal intergenic spacer analysis (ARISA) and terminal restriction fragment length polymorphisms (T-RFLP) remain in use due to their rapid speed and low cost. Peak Studio is a java-based graphical user interface (GUI) designed for the visualization and analysis of fragment analysis (FSA) files generated by the Applied Biosystems capillary electrophoresis instrument. Specifically designed for ARISA and T-RFLP experiments, Peak Studio provides the user the ability to freely adjust the parameters of a peak-calling algorithm and immediately see the implications for downstream clustering by principal component analysis. Peak Studio is fully open-source and, unlike proprietary solutions, can be deployed on any computer running Windows, OS X or Linux. Peak Studio allows data to be saved in multiple formats and can serve as a pre-processing suite that prepares data for statistical analysis in programs such as SAS or R. PMID:23760901

  19. Powerful tool for design analysis of linear control systems

    SciTech Connect

    Maddux, Jr, A S

    1982-05-10

    The methods for designing linear controls for electronic or mechanical systems have been understood and put into practice. What has not been readily available to engineers, however, is a practical, quick and inexpensive method for analyzing these linear control (feedback) systems once they have been designed into the electronic or mechanical hardware. Now, the PET, manufactured by Commodore Business Machines (CBM), operating with several peripherals via the IEEE 488 Bus, brings to the engineer for about $4000 a complete set of office tools for analyzing these system designs.

  20. Parachute system design, analysis, and simulation tool. Status report

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-12-31

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property data bases have been acquired. Recently we have initiated a project to integrate these codes and data bases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials data base including high-rate-of-strain data.

  1. Multiscale Analysis of Surface Topography from Single Point Incremental Forming using an Acetal Tool

    NASA Astrophysics Data System (ADS)

    Ham, M.; Powers, B. M.; Loiselle, J.

    2014-03-01

    Single point incremental forming (SPIF) is a sheet metal manufacturing process that forms a part by incrementally applying point loads to the material to achieve the desired deformations and final part geometry. This paper investigates the differences in surface topography between a carbide tool and an acetal-tipped tool. Area-scale analysis is performed on the confocal areal surface measurements per ASME B46. The objective of this paper is to determine at which scales surfaces formed by two different tool materials can be differentiated. It is found that the surfaces in contact with the acetal forming tool have greater relative areas at all scales greater than 5 × 10⁴ μm² than the surfaces in contact with the carbide tools. The surfaces not in contact with the tools during forming, also referred to as the free surface, are unaffected by the tool material.

  2. An advanced image analysis tool for the quantification and characterization of breast cancer in microscopy images.

    PubMed

    Goudas, Theodosios; Maglogiannis, Ilias

    2015-03-01

    The paper presents an advanced image analysis tool for the accurate and fast characterization and quantification of cancer and apoptotic cells in microscopy images. The proposed tool utilizes adaptive thresholding and a Support Vector Machines classifier. The segmentation results are enhanced through a Majority Voting and a Watershed technique, while an object labeling algorithm has been developed for the fast and accurate validation of the recognized cells. Expert pathologists evaluated the tool and the reported results are satisfying and reproducible. PMID:25681102
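
    The exact pipeline (SVM classification, majority voting, object labeling) is not reproduced here; the sketch below shows only the adaptive-thresholding and watershed stages with scikit-image, assuming a grayscale image in [0, 1] with bright stained cells, and illustrative block size and peak spacing.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.filters import threshold_local
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      def segment_cells(gray):
          # Adaptive (local) threshold separates stained cells from background.
          mask = gray > threshold_local(gray, block_size=51, offset=-0.02)
          # Distance transform + marker-based watershed splits touching cells.
          distance = ndi.distance_transform_edt(mask)
          coords = peak_local_max(distance, min_distance=10, labels=mask.astype(int))
          markers = np.zeros(mask.shape, dtype=int)
          markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
          return watershed(-distance, markers, mask=mask)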

  3. Core Curriculum Analysis: A Tool for Educational Design

    ERIC Educational Resources Information Center

    Levander, Lena M.; Mikkola, Minna

    2009-01-01

    This paper examines the outcome of a dimensional core curriculum analysis. The analysis process was an integral part of an educational development project, which aimed to compact and clarify the curricula of the degree programmes. The task was also in line with the harmonising of the degree structures as part of the Bologna process within higher…

  4. QUANTITATIVE TRAIT LOCUS ANALYSIS AS A GENE DISCOVERY TOOL

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative trait locus analysis has been a mainstay approach for obtaining a genetic description of complex agronomic traits for plants. What is sometimes overlooked is the role QTL analysis can play in identifying genes that underlay complex traits. In this chapter, I will describe the basic st...

  5. Using a Bracketed Analysis as a Learning Tool.

    ERIC Educational Resources Information Center

    Main, Keith

    1995-01-01

    Bracketed analysis is an examination of experiences within a defined time frame or "bracket." It assumes the ability to learn from any source: behaviors, emotions, rational and irrational thought, insights, reflections, and reactions. A bracketed analysis to determine what went wrong with a grant proposal that missed deadlines illustrates its use.…

  6. Applications of custom developed object based analysis tool: Precipitation in Pacific, Tropical cyclones precipitation, Hail areas

    NASA Astrophysics Data System (ADS)

    Skok, Gregor; Rakovec, Jože; Strajnar, Benedikt; Bacmeister, Julio; Tribbia, Joe

    2014-05-01

    In the last few years an object-based analysis software tool was developed at the University of Ljubljana in collaboration with the National Center for Atmospheric Research (NCAR). The tool was originally based on ideas of the Method for Object-Based Diagnostic Evaluation (MODE) developed by NCAR but has since evolved and changed considerably and is now available as a separate free software package. The software is called the Forward in Time object analysis tool (FiT tool). The software was used to analyze numerous datasets - mainly focusing on precipitation. A climatology of satellite and model precipitation in the low- and mid-latitude Pacific Ocean was performed by identifying and tracking individual precipitation systems and estimating their lifespan, movement and size. A global climatology of tropical cyclone precipitation was performed using satellite data, and tracking and analysis of areas with hail in Slovenia was performed using radar data. The tool will be presented along with some results of these applications.

  7. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    NASA Technical Reports Server (NTRS)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  8. Visualization tools for vorticity transport analysis in incompressible flow.

    PubMed

    Sadlo, Filip; Peikert, Ronald; Sick, Mirjam

    2006-01-01

    Vortices are undesirable in many applications while indispensable in others. It is therefore of common interest to understand their mechanisms of creation. This paper aims at analyzing the transport of vorticity inside incompressible flow. The analysis is based on the vorticity equation and is performed along pathlines which are typically started in upstream direction from vortex regions. Different methods for the quantitative and explorative analysis of vorticity transport are presented and applied to CFD simulations of water turbines. Simulation quality is accounted for by including the errors of meshing and convergence into analysis and visualization. The obtained results are discussed and interpretations with respect to engineering questions are given. PMID:17080821
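
    For reference, the vorticity equation underlying this transport analysis, for incompressible flow of constant density with kinematic viscosity ν, reads:

      \frac{D\boldsymbol{\omega}}{Dt}
        = \frac{\partial \boldsymbol{\omega}}{\partial t} + (\mathbf{u} \cdot \nabla)\boldsymbol{\omega}
        = (\boldsymbol{\omega} \cdot \nabla)\mathbf{u} + \nu \nabla^{2} \boldsymbol{\omega},

    with the first right-hand term describing vortex stretching and tilting and the second viscous diffusion.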

  9. A grid service-based tool for hyperspectral imaging analysis

    NASA Astrophysics Data System (ADS)

    Carvajal, Carmen L.; Lugo, Wilfredo; Rivera, Wilson; Sanabria, John

    2005-06-01

    This paper outlines the design and implementation of Grid-HSI, a Service Oriented Architecture-based Grid application to enable hyperspectral imaging analysis. Grid-HSI provides users with a transparent interface to access computational resources and perform hyperspectral imaging analysis remotely through a set of Grid services. Grid-HSI is composed of a Portal Grid Interface, a Data Broker, and a set of specialized Grid services. Grid-based applications, contrary to other client-server approaches, provide the capabilities of persistence and potential transient processes on the web. Our experimental results on Grid-HSI show the suitability of the prototype system to perform hyperspectral imaging analysis efficiently.

  10. HPLC - A valuable tool for analysis of lipids from conventional and unconventional sources

    Technology Transfer Automated Retrieval System (TEKTRAN)

    During the last 20 years, high performance liquid chromatography (HPLC) has been transformed from being a tool for the qualitative separation of lipids to becoming an indispensable tool for the quantitative analysis of lipids. Some of these HPLC method have also provided valuable insights into rese...

  11. An Analysis Tool for School Inclusion for Pupils with Special Educational Needs and Disabilities

    ERIC Educational Resources Information Center

    Ryan, David

    2008-01-01

    The present paper describes a developmental tool (Inclusion Analysis) that enables schools to identify how inclusive or otherwise they are in a relatively straightforward way with a minimal time commitment. The developmental background to the creation of the tool is outlined in terms of inclusion in Northern Ireland and the implementation of the…

  12. Retinal image restoration by means of blind deconvolution

    NASA Astrophysics Data System (ADS)

    Marrugo, Andrés G.; Šorel, Michal; Šroubek, Filip; Millán, María S.

    2011-11-01

    Retinal imaging plays a key role in the diagnosis and management of ophthalmologic disorders, such as diabetic retinopathy, glaucoma, and age-related macular degeneration. Because of the acquisition process, retinal images often suffer from blurring and uneven illumination. This problem may seriously affect disease diagnosis and progression assessment. Here we present a method for color retinal image restoration by means of multichannel blind deconvolution. The method is applied to a pair of retinal images acquired within a lapse of time, ranging from several minutes to months. It consists of a series of preprocessing steps to adjust the images so they comply with the considered degradation model, followed by the estimation of the point-spread function and, ultimately, image deconvolution. The preprocessing is mainly composed of image registration, uneven illumination compensation, and segmentation of areas with structural changes. In addition, we have developed a procedure for the detection and visualization of structural changes. This enables the identification of subtle developments in the retina not caused by variation in illumination or blur. The method was tested on synthetic and real images. Encouraging experimental results show that the method is capable of significant restoration of degraded retinal images.

  13. Deconvolution and simulation of thermoluminescence glow curves with Mathcad.

    PubMed

    Kiisk, V

    2013-09-01

    The paper reports two quite general and user-friendly calculation codes (called TLD-MC and TLS-MC) for deconvolution and simulation, respectively, of thermoluminescence (TL) glow curves, which have been implemented using the well-known engineering computing software PTC Mathcad. An advantage of this commercial software is the flexibility and productivity in setting up tailored computations due to a natural math notation, an interactive calculation environment and the availability of advanced numerical methods. TLD-MC includes the majority of popular models used for TL glow-curve deconvolution (the user can easily implement additional models if necessary). The least-squares (Levenberg-Marquardt) optimisation of various analytical and even some non-analytical models is reasonably fast and the obtained figure-of-merit values are generally excellent. TLS-MC implements numerical solution of the original set of differential equations describing charge carrier dynamics involving arbitrary number of interactive electron and hole traps. The programs are freely available from the website http://www.physic.ut.ee/~kiisk/mcadapps.htm. PMID:23528325
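
    Among the popular analytical models that such deconvolution codes typically fit is the first-order (Randall-Wilkins) glow peak, written here in its standard form as general background rather than as an equation taken from the paper; β is the linear heating rate, E the trap depth, s the frequency factor, n0 the initial trapped-charge concentration, and k the Boltzmann constant.

      I(T) = n_0\, s\, \exp\!\left(-\frac{E}{kT}\right)
             \exp\!\left[-\frac{s}{\beta} \int_{T_0}^{T} \exp\!\left(-\frac{E}{kT'}\right) \mathrm{d}T'\right].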

  14. Molecular dynamics in cytochrome c oxidase Moessbauer spectra deconvolution

    SciTech Connect

    Bossis, Fabrizio; Palese, Luigi L.

    2011-01-07

    Research highlights: • Cytochrome c oxidase molecular dynamics serve to predict Moessbauer lineshape widths. • Half-height widths are used in the modeling of Lorentzian doublets. • Such spectral deconvolutions are useful in detecting the enzyme intermediates. -- Abstract: In this work low temperature molecular dynamics simulations of cytochrome c oxidase are used to predict an experimentally observable quantity, namely the Moessbauer spectral width. Predicted lineshapes are used to model Lorentzian doublets, with which published cytochrome c oxidase Moessbauer spectra were simulated. The constraints that molecular dynamics imposes on spectral lineshapes make it possible to obtain useful information, such as the presence of multiple chemical species in the binuclear center of cytochrome c oxidase. Moreover, a benchmark of quality for molecular dynamics simulations can be obtained. Despite the overwhelming importance of dynamics in electron-proton transfer systems, limited work has been devoted to unravelling how realistic molecular dynamics simulation results are. In this work, molecular dynamics based predictions are found to be in good agreement with published experimental spectra, showing that we can confidently rely on current simulations. Molecular dynamics based deconvolution of Moessbauer spectra will lead to renewed interest in the application of this approach in bioenergetics.

  15. Deconvolution in line scanners using a priori information

    NASA Astrophysics Data System (ADS)

    Wirnitzer, Bernhard; Spraggon-Hernandez, Tadeo

    2002-12-01

    In a digital camera, the MTF of the optical system must provide low-pass filtering in order to avoid aliasing. The MTF of incoherent imaging, however, is both in practice and in principle far from an ideal low-pass. Theoretically, a digital ARMA filter can be used to compensate for this drawback. In practice, such deconvolution filters suffer from instability because of time-variant noise and the space-variance of the MTF. In addition, in a line scanner the MTF in the scan direction differs slightly from one scanned image to the next. Inverse filtering therefore does not operate satisfactorily in an unknown environment. A new concept is presented which solves both problems using a priori information about the object, e.g. that parts of it are known to be binary. This information is sufficient to obtain a stable space- and time-variant ARMA deconvolution filter. The best results are achieved using nonlinear filtering and pattern feedback. The new method was used to improve the bit-error rate (BER) of a high-density matrix-code scanner by more than one order of magnitude. An audio scanner will be demonstrated which reads 12 seconds of music in CD quality from an audio-coded image of 18 mm × 55 mm.
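
    The paper's filter is a nonlinear, space- and time-variant ARMA filter driven by pattern feedback, which is not reproduced here; the sketch below only illustrates, on an invented 1-D example, why naive inverse filtering along the scan direction is unstable and how a regularized (Wiener-style) inverse tames it.

        # Regularized 1-D inverse (Wiener-style) filtering along the scan direction.
        # Illustrative only; it does not implement the paper's nonlinear ARMA filter
        # or its use of a priori (binary) object information.
        import numpy as np

        def wiener_deconvolve_1d(scan_line, psf_1d, noise_to_signal=1e-2):
            """Deconvolve one scan line with a known 1-D PSF via FFT-domain Wiener filtering."""
            n = scan_line.size
            kernel = np.zeros(n)
            kernel[:psf_1d.size] = psf_1d
            kernel = np.roll(kernel, -(psf_1d.size // 2))   # center the PSF at index 0
            H = np.fft.rfft(kernel)
            G = np.fft.rfft(scan_line)
            # Plain inversion G/H blows up where |H| is small; the NSR term regularizes it.
            W = np.conj(H) / (np.abs(H)**2 + noise_to_signal)
            return np.fft.irfft(G * W, n)

        # Illustrative usage: a blurred binary bar pattern plus noise.
        rng = np.random.default_rng(0)
        psf = np.array([0.05, 0.2, 0.5, 0.2, 0.05])
        truth = (np.arange(256) // 8 % 2).astype(float)
        blurred = np.convolve(truth, psf, mode="same") + rng.normal(0.0, 0.02, truth.size)
        restored = wiener_deconvolve_1d(blurred, psf, noise_to_signal=5e-3)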

  16. New tools for jet analysis in high energy collisions

    NASA Astrophysics Data System (ADS)

    Duffty, Daniel

    Our understanding of the fundamental interactions of particles has come far in the last century and is still pushing forward. As we build ever more powerful machines to probe higher and higher energies, we will need new tools not only to understand the new physics objects we are trying to detect, but also to understand the environment in which we are searching. We examine methods of identifying both boosted objects and low-energy jets that are shrouded in a sea of noise from other parts of the detector. We demonstrate the power of boosted-b tagging in a simulated W search. We also examine the effect of pileup on low-energy jet reconstruction. For this purpose we develop a new priority-based jet algorithm, "p-jets", which clusters the energy that belongs together while ignoring the rest.

  17. Texture image analysis for osteoporosis detection with morphological tools

    NASA Astrophysics Data System (ADS)

    Sevestre-Ghalila, Sylvie; Benazza-Benyahia, Amel; Cherif, Hichem; Souid, Wided

    2001-07-01

    Osteoporosis manifests itself both in a reduction of bone mass and in a degradation of the microarchitecture of the bone tissue. Radiological images of the heel bone are analyzed in order to extract information about microarchitectural patterns. We first extract the gray-scale skeleton of the microstructures contained in the underlying images; more precisely, we apply the thinning procedure proposed by Mersal, which preserves the connectivity of the microarchitecture. A post-processing of the resulting skeleton then detects the points of intersection of the trabecular bones (multiple points). The modified skeleton can be considered a powerful tool for extracting discriminant features between Osteoporotic Patients (OP) and Control Patients (CP). For instance, computing the distance between two adjacent horizontal (respectively vertical) trabecular bones is a straightforward task once the multiple points are available. Statistical tests indicate that the proposed method is better suited to discriminating between OP and CP than conventional methods based on a binary skeleton.
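
    As a stand-in for the skeleton/multiple-point step (the paper uses Mersal's gray-scale thinning, not the binary skeletonization shown here), the sketch below extracts a binary skeleton with scikit-image and flags pixels with three or more skeleton neighbours as multiple points.

        # Binary-skeleton stand-in for the gray-scale thinning used in the paper:
        # binarise, skeletonise, then mark branch points ("multiple points") as
        # skeleton pixels with 3 or more 8-connected skeleton neighbours.
        import numpy as np
        from scipy.ndimage import convolve
        from skimage.filters import threshold_otsu
        from skimage.morphology import skeletonize

        def multiple_points(gray_image):
            binary = gray_image > threshold_otsu(gray_image)   # crude binarisation
            skeleton = skeletonize(binary)
            kernel = np.array([[1, 1, 1],
                               [1, 0, 1],
                               [1, 1, 1]])
            neighbours = convolve(skeleton.astype(int), kernel, mode="constant")
            return skeleton & (neighbours >= 3)

        # Distances between adjacent trabeculae can then be measured along image rows
        # or columns between consecutive skeleton crossings.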

  18. TOOLS FOR COMPARATIVE ANALYSIS OF ALTERNATIVES: COMPETING OR COMPLEMENTARY PERSPECTIVES?

    EPA Science Inventory

    A third generation of environmental policymaking and risk management will increasingly impose environmental measures, which may give rise to analyzing countervailing risks. Therefore, a comprehensive analysis of these risks associated with the decision alternatives at hand will e...

  19. Sequential Analysis: A Tool for Monitoring Program Delivery.

    ERIC Educational Resources Information Center

    Howe, Holly L.; Hoff, Margaret B.

    1981-01-01

    The sensitivity and simplicity of Wald's sequential analysis test in monitoring a preventive health care program are discussed. Data exemplifying the usefulness and expedience of employing sequential methods are presented. (Author/GK)
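
    The abstract does not spell out the test's formulation, so the sketch below shows a generic Bernoulli version of Wald's sequential probability ratio test with invented hypotheses and error rates, merely to illustrate the mechanics of sequential monitoring.

        # Generic Bernoulli SPRT (Wald's sequential probability ratio test) sketch;
        # hypotheses (p0, p1) and error rates (alpha, beta) are purely illustrative.
        import math

        def sprt_bernoulli(outcomes, p0=0.10, p1=0.25, alpha=0.05, beta=0.10):
            """Monitor a stream of 0/1 outcomes; stop as soon as H0 (rate p0) or
            H1 (rate p1) can be accepted at the requested error rates."""
            upper = math.log((1.0 - beta) / alpha)   # accept H1 above this
            lower = math.log(beta / (1.0 - alpha))   # accept H0 below this
            llr = 0.0
            for n, x in enumerate(outcomes, start=1):
                llr += math.log(p1 / p0) if x else math.log((1.0 - p1) / (1.0 - p0))
                if llr >= upper:
                    return "accept H1 (rate looks like p1)", n
                if llr <= lower:
                    return "accept H0 (rate looks like p0)", n
            return "continue sampling", len(outcomes)

        # Example: a run with 3 adverse events in the first 12 cases.
        print(sprt_bernoulli([0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0]))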

  20. Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles

    PubMed Central

    Baer, DR; Gaspar, DJ; Nachimuthu, P; Techane, SD; Castner, DG

    2010-01-01

    The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time-of-flight secondary ion mass spectrometry (TOF-SIMS); low-energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties. PMID:20052578