Science.gov

Sample records for deconvolution analysis tool

  1. PVT Analysis With A Deconvolution Algorithm

    SciTech Connect

    Kouzes, Richard T.

    2011-02-01

    Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis algorithm developed by Symetrica, Inc., and they have demonstrated the application of their deconvolution algorithm to PVT with very promising results. The thrust of such a deconvolution algorithm used with PVT is to allow for identification and rejection of naturally occurring radioactive material, reducing alarm rates, rather than the complete identification of all radionuclides, which is the goal of spectroscopic portal monitors. Under this condition, there could be a significant increase in sensitivity to threat materials. The advantage of this approach is an enhancement to the low cost, robust detection capability of PVT-based radiation portal monitor systems. The success of this method could provide an inexpensive upgrade path for a large number of deployed PVT-based systems to provide significantly improved capability at a much lower cost than deployment of NaI(Tl)-based systems of comparable sensitivity.

  2. The deconvolution operation in convex analysis: An introduction

    SciTech Connect

    Hiriart-Urruty, J.B.

    1995-03-01

    Performing the infimal convolution of two functions is a frequent and useful operation in Convex Analysis: it is, to a great extent, the dual operation of addition; it serves (like other "convolutions" in Analysis) to regularize functions; and it has nice geometrical and economic interpretations. The deconvolution of a (convex) function by another one is a new operation, first defined in a clear-cut manner, which is to the infimal convolution what subtraction is to addition for real numbers; it appears in conjugating the difference of convex functions; it serves in solving convolution equations explicitly; and it has an interpretation in terms of subtraction of epigraphs. Since its introduction, the deconvolution operation has been studied more thoroughly by the author and his former students or associates. What we intend to present here is a short (and, hopefully, pedagogical) introduction to the deconvolution operation, in a simplified setting. This can be viewed as a complement to chapters IV and X in the book.
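
    As a quick reminder of the two operations discussed in this record (stated here in standard notation rather than quoted from the paper), the infimal convolution of f and g and the deconvolution of f by g can be written as

```latex
(f \,\square\, g)(x) = \inf_{y}\bigl\{ f(y) + g(x-y) \bigr\},
\qquad
(f \ominus g)(x) = \sup_{y}\bigl\{ f(x+y) - g(y) \bigr\},
```

    so that, loosely speaking, deconvolution plays the role for infimal convolution that subtraction plays for addition; geometrically it corresponds to a (star-)difference of epigraphs, in line with the interpretation mentioned above.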

  3. Importance of FTIR Spectra Deconvolution for the Analysis of Amorphous Calcium Phosphates

    NASA Astrophysics Data System (ADS)

    Brangule, Agnese; Agris Gross, Karlis

    2015-03-01

    This work will consider Fourier transform infrared spectroscopy - diffuse reflectance infrared reflection (FTIR-DRIFT) for collecting the spectra and deconvolution to identify changes in bonding as a means of more powerful detection. Spectra were recorded from amorphous calcium phosphate synthesized by wet precipitation, and from bone. FTIR-DRIFT was used to study the chemical environments of PO4, CO3 and amide. Deconvolution of spectra separated overlapping bands in the ν4PO4, ν2CO3, ν3CO3 and amide region, allowing a more detailed analysis of changes at the atomic level. Amorphous calcium phosphate dried at 80 °C, despite showing an X-ray diffraction amorphous structure, displayed carbonate in positions resembling a carbonated hydroxyapatite. Additional peaks were designated as A1 type, A2 type or B type. Deconvolution allowed the separation of CO3 positions in bone from amide peaks. FTIR-DRIFT spectrometry in combination with deconvolution offers an advanced tool for qualitative and quantitative determination of CO3, PO4 and HPO4 and shows promise to measure the degree of order.
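
    A common way to carry out the kind of band deconvolution described above is to fit the overlapping spectral region with a sum of component bands. The sketch below does this with Gaussian components and SciPy's curve_fit on a synthetic spectrum; the wavenumber window, band positions, widths and amplitudes are invented for illustration and are not taken from this record.

```python
# Hedged sketch: deconvolution of an overlapping FTIR region by fitting a sum of
# Gaussian bands to a synthetic spectrum (all band parameters are hypothetical).
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, centre, sigma):
    return amp * np.exp(-((x - centre) ** 2) / (2.0 * sigma ** 2))

def three_bands(x, *p):
    # Parameters packed as (amp1, c1, s1, amp2, c2, s2, amp3, c3, s3).
    return sum(gaussian(x, *p[i:i + 3]) for i in range(0, 9, 3))

rng = np.random.default_rng(0)
wavenumber = np.linspace(1350, 1550, 400)            # cm^-1, assumed carbonate region
spectrum = (gaussian(wavenumber, 1.0, 1415, 12) +    # synthetic "measured" envelope
            gaussian(wavenumber, 0.6, 1450, 10) +
            gaussian(wavenumber, 0.4, 1480, 9) +
            0.01 * rng.normal(size=wavenumber.size))

p0 = [1.0, 1410, 10, 0.5, 1455, 10, 0.5, 1475, 10]   # initial guesses
popt, _ = curve_fit(three_bands, wavenumber, spectrum, p0=p0)
for amp, centre, sigma in popt.reshape(3, 3):
    print(f"band at {centre:6.1f} cm^-1, amplitude {amp:.2f}, sigma {abs(sigma):.1f}")
```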

  4. Deconvolution Program

    1999-02-18

    The program is suitable for many applications in applied mathematics, experimental physics, signal analysis, and engineering, e.g., spectrum deconvolution, signal analysis, and system property analysis.

  5. Analysis of optical wavefront reconstruction and deconvolution in adaptive optics

    NASA Astrophysics Data System (ADS)

    Luke, David Russell

    2001-11-01

    It was in the spirit of ``reuniting divergent trends by clarifying the common features and interconnections of many distinct and diverse scientific facts'' that Courant and Hilbert published their book Methods of Mathematical Physics [43]. This thesis is written in the same spirit and with the same goal. We focus our attention on the problem of wavefront reconstruction and deconvolution in adaptive optics. This is an ill-posed, non-linear inverse problem that touches on the theory of harmonic analysis, variational analysis, signal processing, nonconvex optimization, regularization, statistics and probability. Numerical solutions rely on spectral and operator-splitting methods as well as limited memory and multi-resolution techniques. We introduce novel methods for wavefront reconstruction and compare our results against common techniques. Previous work on this problem is reviewed and unified in a non-parametric, analytic framework.

  6. Facilitating high resolution mass spectrometry data processing for screening of environmental water samples: An evaluation of two deconvolution tools.

    PubMed

    Bade, Richard; Causanilles, Ana; Emke, Erik; Bijlsma, Lubertus; Sancho, Juan V; Hernandez, Felix; de Voogt, Pim

    2016-11-01

    A screening approach was applied to influent and effluent wastewater samples. After injection in an LC-LTQ-Orbitrap, data analysis was performed using two deconvolution tools, MsXelerator (modules MPeaks and MS Compare) and Sieve 2.1. The outputs were searched incorporating an in-house database of >200 pharmaceuticals and illicit drugs or ChemSpider. This hidden target screening approach led to the detection of numerous compounds, including the illicit drug cocaine and its metabolite benzoylecgonine and the pharmaceuticals carbamazepine, gemfibrozil and losartan. The compounds found using both approaches were combined, and isotopic pattern and retention time prediction were used to filter out false positives. The remaining potential positives were reanalysed in MS/MS mode and their product ions were compared with literature and/or mass spectral libraries. The inclusion of the chemical database ChemSpider led to the tentative identification of several metabolites, including paraxanthine, theobromine, theophylline and carboxylosartan, as well as the pharmaceutical phenazone. The first three of these compounds are isomers and they were subsequently distinguished based on their product ions and predicted retention times. This work has shown that the use of deconvolution tools facilitates non-target screening and enables the identification of a higher number of compounds. PMID:27351148

  8. Dereplication of Natural Products Using GC-TOF Mass Spectrometry: Improved Metabolite Identification by Spectral Deconvolution Ratio Analysis

    PubMed Central

    Carnevale Neto, Fausto; Pilon, Alan C.; Selegato, Denise M.; Freire, Rafael T.; Gu, Haiwei; Raftery, Daniel; Lopes, Norberto P.; Castro-Gamboa, Ian

    2016-01-01

    Dereplication based on hyphenated techniques has been extensively applied in plant metabolomics, thereby avoiding re-isolation of known natural products. However, due to the complex nature of biological samples and their large concentration range, dereplication requires the use of chemometric tools to comprehensively extract information from the acquired data. In this work we developed a reliable GC-MS-based method for the identification of non-targeted plant metabolites by combining the Ratio Analysis of Mass Spectrometry deconvolution tool (RAMSY) with the Automated Mass Spectral Deconvolution and Identification System software (AMDIS). Plant species from Solanaceae, Chrysobalanaceae and Euphorbiaceae were selected as model systems due to their molecular diversity, ethnopharmacological potential, and economic value. The samples were analyzed by GC-MS after methoximation and silylation reactions. Dereplication was initiated with the use of a factorial design of experiments to determine the best AMDIS configuration for each sample, considering linear retention indices and mass spectral data. A heuristic factor (CDF, compound detection factor) was developed and applied to the AMDIS results in order to decrease the false-positive rates. Despite the enhancement in deconvolution and peak identification, the empirical AMDIS method was not able to fully deconvolute all GC peaks, leading to low MF values and/or missing metabolites. RAMSY was applied as a complementary deconvolution method to AMDIS for peaks exhibiting substantial overlap, resulting in recovery of low-intensity co-eluted ions. The results from this combination of optimized AMDIS with RAMSY attested to the ability of this approach to serve as an improved dereplication method for complex biological samples such as plant extracts. PMID:27747213

  9. Three component microseism analysis in Australia from deconvolution enhanced beamforming

    NASA Astrophysics Data System (ADS)

    Gal, Martin; Reading, Anya; Ellingsen, Simon; Koper, Keith; Burlacu, Relu; Tkalčić, Hrvoje; Gibbons, Steven

    2016-04-01

    Ocean-induced microseisms in the period range of 2-10 s are generated in deep oceans and near coastal regions as body and surface waves. The generation of these waves can take place over an extended area and in a variety of geographical locations at the same time. It is therefore common to observe multiple arrivals with a variety of slowness vectors, which makes it desirable to measure multiple arrivals accurately. We present a deconvolution-enhanced direction-of-arrival algorithm, for single- and three-component arrays, based on CLEAN. The algorithm iteratively removes sidelobe contributions in the power spectrum, thereby improving the signal-to-noise ratio of weaker sources. The power level on each component (vertical, radial and transverse) can be accurately estimated as the beamformer decomposes the power spectrum into point sources. We first apply the CLEAN-aided beamformer to synthetic data to show its performance under known conditions and then evaluate real (observed) data from a range of arrays with apertures between 10 and 70 km (ASAR, WRA and NORSAR) to showcase the improvement in resolution. We further give a detailed analysis of the three-component wavefield in Australia, including source locations, power levels, phase ratios, etc., recorded by two spiral arrays (PSAR and SQspa). For PSAR the analysis is carried out in the frequency range 0.35-1 Hz. We find LQ, Lg and fundamental and higher mode Rg wave phases. Additionally, we also observe the Sn phase. This is the first time this has been achieved through beamforming on microseism noise and underlines the potential for extra seismological information that can be extracted using the new implementation of CLEAN. The fundamental mode Rg waves are dominant in power at low frequencies and show power levels equal to LQ towards higher frequencies. Generation locations between Rg and LQ are mildly correlated at low frequencies and uncorrelated at higher frequencies. Results from SQspa will discuss lower frequencies around the
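
    To illustrate the general idea of CLEAN-based beamforming (iteratively removing the contribution of the strongest source so that weaker arrivals emerge from under its sidelobes), the following single-component sketch applies a simple CLEAN loop to a synthetic cross-spectral matrix. The array geometry, frequency, source slownesses, loop gain and stopping threshold are all invented; this is not the three-component implementation described in the record.

```python
# Hedged sketch of a CLEAN loop for array beamforming on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
freq = 0.5                                    # Hz (assumed)
coords = rng.uniform(-5e3, 5e3, (12, 2))      # 12 stations, ~10 km aperture (m)

def steering(slowness):
    """Plane-wave steering vector for a slowness vector (sx, sy) in s/m."""
    return np.exp(-2j * np.pi * freq * (coords @ slowness))

# Synthetic cross-spectral matrix: two plane-wave sources plus white noise.
sources = [(np.array([2.0e-4, 1.0e-4]), 1.0), (np.array([-1.6e-4, 2.4e-4]), 0.4)]
R = 0.05 * np.eye(len(coords), dtype=complex)
for slowness, power in sources:
    a = steering(slowness)
    R = R + power * np.outer(a, a.conj())

# Slowness grid for the conventional (Bartlett) beam power.
s_axis = np.linspace(-4e-4, 4e-4, 41)
grid = np.array([(sx, sy) for sx in s_axis for sy in s_axis])

def beam_power(R):
    out = np.empty(len(grid))
    for i, s in enumerate(grid):
        a = steering(s)
        out[i] = np.real(a.conj() @ R @ a)
    return out / len(coords) ** 2

# CLEAN loop: repeatedly remove a fraction (loop gain phi) of the strongest
# source from the cross-spectral matrix and record it as a point component.
phi, components = 0.2, []
for _ in range(200):
    p = beam_power(R)
    k = int(np.argmax(p))
    if p[k] < 0.02:                           # stop near the noise floor
        break
    a = steering(grid[k])
    R = R - phi * p[k] * np.outer(a, a.conj())
    components.append((grid[k], phi * p[k]))

# The strongest recovered components should cluster near the true slownesses.
for slowness, power in components[:4]:
    print(f"slowness ~ ({slowness[0]:.1e}, {slowness[1]:.1e}) s/m, power {power:.3f}")
```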

  10. CAM-CM: a signal deconvolution tool for in vivo dynamic contrast-enhanced imaging of complex tissues

    PubMed Central

    Chen, Li; Chan, Tsung-Han; Choyke, Peter L.; Hillman, Elizabeth M. C.; Chi, Chong-Yung; Bhujwalla, Zaver M.; Wang, Ge; Wang, Sean S.; Szabo, Zsolt; Wang, Yue

    2011-01-01

    Summary: In vivo dynamic contrast-enhanced imaging tools provide non-invasive methods for analyzing various functional changes associated with disease initiation, progression and responses to therapy. The quantitative application of these tools has been hindered by their inability to accurately resolve and characterize targeted tissues due to spatially mixed tissue heterogeneity. The Convex Analysis of Mixtures - Compartment Modeling (CAM-CM) signal deconvolution tool has been developed to automatically identify pure-volume pixels located at the corners of the clustered pixel time-series scatter simplex and subsequently estimate tissue-specific pharmacokinetic parameters. CAM-CM can dissect complex tissues into regions with differential tracer kinetics at pixel-wise resolution and provide a systems biology tool for defining imaging signatures predictive of phenotypes. Availability: The MATLAB source code can be downloaded at the authors' website www.cbil.ece.vt.edu/software.htm Contact: yuewang@vt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21785131

  11. Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade, stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers caused by the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. The induced change in the natural isotopic composition of an element allows, amongst others, for studying the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag for particular biological samples. Due to the broad potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, e.g. because Sr is available in significant amounts in natural samples. In addition, biological systems underlie complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer incorporated into the natural sample matrix, as well as the degree of impurities and species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer
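
    The core of isotope pattern deconvolution by multiple linear regression can be sketched in a few lines: the measured pattern is expressed as a least-squares combination of the natural pattern and the enriched tracer pattern. In the sketch below the natural Sr abundances are approximate literature values, while the tracer composition and the mixing fractions are hypothetical.

```python
# Hedged sketch of isotope pattern deconvolution (IPD) via least squares.
import numpy as np

natural = np.array([0.0056, 0.0986, 0.0700, 0.8258])   # approx. 84/86/87/88Sr abundances
spike   = np.array([0.9500, 0.0300, 0.0050, 0.0150])   # hypothetical 84Sr-enriched tracer

# Simulated measurement: 90 % natural Sr + 10 % tracer, plus a little noise.
rng = np.random.default_rng(1)
measured = 0.9 * natural + 0.1 * spike + rng.normal(0, 1e-4, natural.size)

# Multiple linear regression: measured ~ A @ [x_natural, x_tracer].
A = np.column_stack([natural, spike])
fractions = np.linalg.lstsq(A, measured, rcond=None)[0]
x_nat, x_trc = fractions / fractions.sum()              # normalise to molar fractions
print(f"natural fraction ~ {x_nat:.3f}, tracer fraction ~ {x_trc:.3f}")
```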

  12. Improving the precision of fMRI BOLD signal deconvolution with implications for connectivity analysis.

    PubMed

    Bush, Keith; Cisler, Josh; Bian, Jiang; Hazaroglu, Gokce; Hazaroglu, Onder; Kilts, Clint

    2015-12-01

    An important, open problem in neuroimaging analyses is developing analytical methods that ensure precise inferences about neural activity underlying fMRI BOLD signal despite the known presence of confounds. Here, we develop and test a new meta-algorithm for conducting semi-blind (i.e., no knowledge of stimulus timings) deconvolution of the BOLD signal that estimates, via bootstrapping, both the underlying neural events driving BOLD as well as the confidence of these estimates. Our approach includes two improvements over the current best performing deconvolution approach: (1) we optimize the parametric form of the deconvolution feature space; and (2) we pre-classify neural event estimates into two subgroups, either known or unknown, based on the confidence of the estimates prior to conducting neural event classification. This knows-what-it-knows approach significantly improves neural event classification over the current best performing algorithm, as tested in a detailed computer simulation of highly confounded fMRI BOLD signal. We then implemented a massively parallelized version of the bootstrapping-based deconvolution algorithm and executed it on a high-performance computer to conduct large-scale (i.e., voxelwise) estimation of the neural events for a group of 17 human subjects. We show that by restricting the computation of inter-regional correlation to include only those neural events estimated with high confidence, the method appeared to have higher sensitivity for identifying the default mode network compared to a standard BOLD signal correlation analysis when compared across subjects.

  13. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission (AE) signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, the change in AE signal characteristics can be better interpreted as the result of the change in only the states of the process. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of AE signals generated during stretching became more distinctive and can be more effectively used as a tool for process monitoring.
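
    The frequency-domain deconvolution described above amounts to dividing the spectrum of the recorded signal by the instrumentation transfer function. The minimal sketch below uses a made-up first-order transfer function and adds a small regularisation term to keep the inverse stable in the presence of noise; neither is taken from the paper, which identifies the actual system functions experimentally.

```python
# Hedged sketch of frequency-domain deconvolution with a regularised inverse filter.
import numpy as np

fs = 1.0e6                                               # sampling rate (Hz), assumed
t = np.arange(2048) / fs
source = np.random.default_rng(2).normal(size=t.size)    # stand-in AE source signal

# Hypothetical instrumentation response: first-order low-pass at 100 kHz.
f = np.fft.rfftfreq(t.size, 1.0 / fs)
H = 1.0 / (1.0 + 1j * f / 1.0e5)
recorded = np.fft.irfft(np.fft.rfft(source) * H, n=t.size)   # "measured" signal

# Regularised inverse filtering (Wiener-like): X = Y * conj(H) / (|H|^2 + eps).
eps = 1e-3
X = np.fft.rfft(recorded) * np.conj(H) / (np.abs(H) ** 2 + eps)
recovered = np.fft.irfft(X, n=t.size)

print("relative error:", np.linalg.norm(recovered - source) / np.linalg.norm(source))
```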

  14. A FORTRAN program for deconvolution analysis using the matrix algorithm method with special reference to renography.

    PubMed

    Kempi, V

    1987-04-01

    A FORTRAN IV program is presented for deconvolution analysis using the matrix algorithm method. With the deconvolution technique, retention functions are calculated from time-activity curve data representing both kidneys and the blood background. The program computes for each kidney the minimum and maximum time of the retention function. It also calculates the initial amplitudes, absolute as well as relative, and the mean transit time of the retention functions. The design of the program allows for optional review of intermediate outputs at important stages. It also allows for the plotting of conventional time-activity curves of both kidneys corrected for blood background. Finally, the program plots the retention functions and some of their characteristics.
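
    The matrix algorithm referred to here treats the renal curve as the discrete convolution of the blood (input) curve with a retention function, so deconvolution reduces to inverting a lower-triangular convolution matrix. Below is a minimal sketch with synthetic, noise-free curves; real renographic data would need smoothing or regularisation, and the FORTRAN program's exact formulation may differ.

```python
# Hedged sketch of matrix-based deconvolution for a synthetic renogram.
import numpy as np
from scipy.linalg import toeplitz, solve_triangular

dt = 10.0                                       # frame length (s), assumed
t = np.arange(0, 600, dt)
blood = np.exp(-t / 120.0)                      # synthetic blood-background curve
retention_true = np.clip(1.0 - t / 300.0, 0.0, None)   # assumed linear washout

# Forward model: kidney(t) = dt * (blood * retention)(t), a lower-triangular system.
C = toeplitz(blood, np.r_[blood[0], np.zeros(blood.size - 1)]) * dt
kidney = C @ retention_true

# Deconvolution: solve the triangular system for the retention function.
retention_est = solve_triangular(C, kidney, lower=True)
mtt = dt * retention_est.sum() / retention_est[0]       # mean transit time estimate
print(f"mean transit time ~ {mtt:.0f} s")
```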

  15. Reliability of multiresolution deconvolution for improving depth resolution in SIMS analysis

    NASA Astrophysics Data System (ADS)

    Boulakroune, M.'Hamed

    2016-11-01

    This paper deals with the effectiveness and reliability of a multiresolution deconvolution algorithm for the recovery of secondary ion mass spectrometry (SIMS) profiles altered by the measurement. This new algorithm is characterized as a regularized wavelet transform. It combines ideas from Tikhonov-Miller regularization, wavelet analysis and deconvolution algorithms in order to benefit from the advantages of each. The SIMS profiles were obtained by analysis of two structures of boron in a silicon matrix using a Cameca IMS-6f instrument at oblique incidence. The first structure is large, consisting of two distant wide boxes, and the second is a thin structure containing ten delta-layers, in which deconvolution by zone was applied. It is shown that this new multiresolution algorithm gives the best results. In particular, local application of the regularization parameter to blurred and estimated solutions at each resolution level led to smoothed signals without creating artifacts related to the noise content in the profile. This led to a significant improvement in the depth resolution and the peaks' maxima.

  16. Application of spectral deconvolution and inverse mechanistic modelling as a tool for root cause investigation in protein chromatography.

    PubMed

    Brestrich, Nina; Hahn, Tobias; Hubbuch, Jürgen

    2016-03-11

    In chromatographic protein purification, process variations, aging of columns, or processing errors can lead to deviations of the expected elution behavior of product and contaminants and can result in a decreased pool purity or yield. A different elution behavior of all or several involved species leads to a deviating chromatogram. The causes for deviations are however hard to identify by visual inspection and complicate the correction of a problem in the next cycle or batch. To overcome this issue, a tool for root cause investigation in protein chromatography was developed. The tool combines a spectral deconvolution with inverse mechanistic modelling. Mid-UV spectral data and Partial Least Squares Regression were first applied to deconvolute peaks to obtain the individual elution profiles of co-eluting proteins. The individual elution profiles were subsequently used to identify errors in process parameters by curve fitting to a mechanistic chromatography model. The functionality of the tool for root cause investigation was successfully demonstrated in a model protein study with lysozyme, cytochrome c, and ribonuclease A. Deviating chromatograms were generated by deliberately caused errors in the process parameters flow rate and sodium-ion concentration in loading and elution buffer according to a design of experiments. The actual values of the three process parameters and, thus, the causes of the deviations were estimated with errors of less than 4.4%. Consequently, the established tool for root cause investigation is a valuable approach to rapidly identify process variations, aging of columns, or processing errors. This might help to minimize batch rejections or contribute to an increased productivity.
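
    To make the spectral-deconvolution step more concrete, the sketch below calibrates a Partial Least Squares model on synthetic mid-UV-like spectra with known concentrations and then uses it to resolve co-eluting peaks into individual elution profiles. All spectra, concentrations and peak shapes are invented, and the inverse mechanistic-modelling step of the tool is not reproduced here.

```python
# Hedged sketch: PLS-based deconvolution of co-eluting peaks from synthetic spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
wavelengths = np.linspace(240, 300, 120)                 # nm, assumed mid-UV window

def pure_spectrum(centre, width):
    return np.exp(-((wavelengths - centre) ** 2) / (2 * width ** 2))

pure = np.vstack([pure_spectrum(c, w) for c, w in [(255, 8), (270, 10), (285, 7)]])

# Calibration set: mixtures with known concentrations of the three proteins.
C_cal = rng.uniform(0, 1, (40, 3))
X_cal = C_cal @ pure + 0.005 * rng.normal(size=(40, wavelengths.size))
pls = PLSRegression(n_components=3).fit(X_cal, C_cal)

# "Chromatogram": three partially co-eluting Gaussian peaks over time.
t = np.linspace(0, 30, 300)
C_true = np.column_stack([np.exp(-((t - m) ** 2) / 8.0) for m in (12, 14, 17)])
X_run = C_true @ pure + 0.005 * rng.normal(size=(t.size, wavelengths.size))

C_est = pls.predict(X_run)                               # deconvoluted elution profiles
print("max abs error of resolved profiles:", round(float(np.max(np.abs(C_est - C_true))), 3))
```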

  18. A further analysis for the minimum-variance deconvolution filter performance

    NASA Astrophysics Data System (ADS)

    Chi, Chong-Yung

    1987-06-01

    Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as the SNR goes to infinity, and like a matched filter as the SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when the SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.
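
    The limiting behaviour described in this record can be read off from a Wiener-type frequency response for the deconvolution filter. In the (assumed) notation below, V(ω) is the spectrum of the source wavelet, σ_μ² the variance of the white reflectivity sequence and σ_n² the noise variance:

```latex
H_{\mathrm{MVD}}(\omega)
  = \frac{\sigma_{\mu}^{2}\, V^{*}(\omega)}
         {\sigma_{\mu}^{2}\,\lvert V(\omega)\rvert^{2} + \sigma_{n}^{2}}
  \;\longrightarrow\;
\begin{cases}
  1/V(\omega), & \sigma_{n}^{2}/\sigma_{\mu}^{2} \to 0
    \quad \text{(inverse filter at high SNR)},\\[4pt]
  (\sigma_{\mu}^{2}/\sigma_{n}^{2})\, V^{*}(\omega), & \sigma_{\mu}^{2}/\sigma_{n}^{2} \to 0
    \quad \text{(matched filter at low SNR)}.
\end{cases}
```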

  19. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    NASA Astrophysics Data System (ADS)

    Madavarapu, Sateesh Babu

    Studies of aluminosilicate materials to replace traditional construction materials such as ordinary Portland cement (OPC), and thereby reduce the environmental effects of OPC production, have been an important research area for the past decades. Many properties like strength have already been studied, and the primary focus is to learn about the reaction mechanism and the effect of the parameters on the formed products. The aim of this research was to explore the structural changes and reaction product analysis of geopolymers (slag & fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at a molecular level, but not all methods are economic and simple. To understand the mechanisms of alkali-activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used, and the effect of the parameters on the reaction products has been analyzed. To analyze complex systems like geopolymers using FTIR, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time- and temperature-dependent analyses were done on slag pastes to understand the polymerization of reactive silica in the system with time and temperature variance. For the time-dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature-dependent analysis was done by curing the samples at 60°C and 80°C. Similarly, fly ash was studied by activation with alkali hydroxides and alkali silicates. Under the same curing conditions the fly ash samples were evaluated to analyze the effects of the added silicates on alkali activation. The peak shifts in the FTIR spectra explain the changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentrations of silicate monomer in the

  20. Investigation of the CLEAN deconvolution method for use with Late Time Response analysis of multiple objects

    NASA Astrophysics Data System (ADS)

    Hutchinson, Simon; Taylor, Christopher T.; Fernando, Michael; Andrews, David; Bowring, Nicholas

    2014-10-01

    This paper investigates the application of the CLEAN non-linear deconvolution method to Late Time Response (LTR) analysis for detecting multiple objects in Concealed Threat Detection (CTD). When an Ultra-Wide Band (UWB) frequency radar signal is used to illuminate a conductive target, surface currents are induced upon the object which in turn give rise to LTR signals. These signals are re-radiated from the target, and results from a number of targets are presented. The experiment was performed using a double-ridged horn antenna in a pseudo-monostatic arrangement. A vector network analyser (VNA) was used to provide the UWB Frequency Modulated Continuous Wave (FMCW) radar signal. The distance between the transmitting antenna and the target objects was kept at 1 metre for all the experiments performed, and the power level at the VNA was set to 0 dBm. The targets in the experimental setup are suspended in air in a laboratory environment. Matlab was used in post-processing to perform linear and non-linear deconvolution of the signal. The Wiener filter, Fast Fourier Transform (FFT) and Continuous Wavelet Transform (CWT) are used to process the return signals and extract the LTR features from the noise clutter. A Generalized Pencil-of-Function (GPOF) method was then used to extract the complex poles of the signal. Artificial Neural Networks (ANN) and Linear Discriminant Analysis (LDA) were used to classify the data.

  1. Application of automated mass spectrometry deconvolution and identification software for pesticide analysis in surface waters.

    PubMed

    Furtula, Vesna; Derksen, George; Colodey, Alan

    2006-01-01

    A new approach to surface water analysis has been investigated in order to enhance the detection of different organic contaminants in Nathan Creek, British Columbia. Water samples from Nathan Creek were prepared by liquid/liquid extraction using dichloromethane (DCM) as an extraction solvent and analyzed by a gas chromatography-mass spectrometry method in scan mode (GC-MS scan). To increase sensitivity for pesticide detection, acquired scan data were further analyzed by the Automated Mass Spectral Deconvolution and Identification System (AMDIS) incorporated into the Agilent Deconvolution Reporting Software (DRS), which also includes mass spectral libraries for 567 pesticides. Extracts were reanalyzed by gas chromatography-mass spectrometry in single ion monitoring mode (GC-MS-SIM) to confirm and quantitate the detected pesticides. The pesticides atrazine, dimethoate, diazinon, metalaxyl, myclobutanil, napropamide, oxadiazon, propazine and simazine were detected at three sampling sites on the mainstream of Nathan Creek. Results of the study are further discussed in terms of detectability and identification level for each pesticide found. The proposed approach to monitoring pesticides in surface waters enables their detection and identification at trace levels. PMID:17090491

  2. Terahertz deconvolution.

    PubMed

    Walker, Gillian C; Bowen, John W; Labaune, Julien; Jackson, J-Bianca; Hadjiloucas, Sillas; Roberts, John; Mourou, Gerard; Menu, Michel

    2012-12-01

    The ability to retrieve information from different layers within a stratified sample using terahertz pulsed reflection imaging and spectroscopy has traditionally been resolution limited by the pulse width available. In this paper, a deconvolution algorithm is presented which circumvents this resolution limit, enabling deep sub-wavelength and sub-pulse width depth resolution. The algorithm is explained through theoretical investigation, and demonstrated by reconstructing signals reflected from boundaries in stratified materials that cannot be resolved directly from the unprocessed time-domain reflection signal. Furthermore, the deconvolution technique has been used to recreate sub-surface images from a stratified sample: imaging the reverse side of a piece of paper. PMID:23262673

  3. XAP, a program for deconvolution and analysis of complex X-ray spectra

    USGS Publications Warehouse

    Quick, James E.; Haleby, Abdul Malik

    1989-01-01

    The X-ray analysis program (XAP) is a spectral-deconvolution program written in BASIC and specifically designed to analyze complex spectra produced by energy-dispersive X-ray analytical systems (EDS). XAP compensates for spectrometer drift, utilizes digital filtering to remove background from spectra, and solves for element abundances by least-squares, multiple-regression analysis. Rather than base analyses on only a few channels, broad spectral regions of a sample are reconstructed from standard reference spectra. The effects of this approach are (1) elimination of tedious spectrometer adjustments, (2) removal of background independent of sample composition, and (3) automatic correction for peak overlaps. Although the program was written specifically to operate a KEVEX 7000 X-ray fluorescence analytical system, it could be adapted with minor modifications to analyze spectra produced by scanning electron microscopes and electron microprobes, as well as X-ray diffractometer patterns obtained from whole-rock powders.

  4. Analysis of force-deconvolution methods in frequency-modulation atomic force microscopy

    PubMed Central

    Illek, Esther; Giessibl, Franz J

    2012-01-01

    Summary: In frequency-modulation atomic force microscopy, the direct observable is the frequency shift of an oscillating cantilever in a force field. This frequency shift is not a direct measure of the actual force, and thus, to obtain the force, deconvolution methods are necessary. Two prominent methods proposed by Sader and Jarvis (Sader–Jarvis method) and Giessibl (matrix method) are investigated with respect to the deconvolution quality. Both methods show a nontrivial dependence of the deconvolution quality on the oscillation amplitude. The matrix method exhibits spikelike features originating from a numerical artifact. By interpolation of the data, the spikelike features can be circumvented. The Sader–Jarvis method has a continuous amplitude dependence showing two minima and one maximum, which is an inherent property of the deconvolution algorithm. The optimal deconvolution depends on the ratio of the amplitude and the characteristic decay length of the force for the Sader–Jarvis method. However, the matrix method generally provides the higher deconvolution quality. PMID:22496997

  5. Toward robust deconvolution of pass-through paleomagnetic measurements: new tool to estimate magnetometer sensor response and laser interferometry of sample positioning accuracy

    NASA Astrophysics Data System (ADS)

    Oda, Hirokuni; Xuan, Chuang; Yamamoto, Yuhji

    2016-07-01

    Pass-through superconducting rock magnetometers (SRM) offer rapid and high-precision remanence measurements for continuous samples that are essential for modern paleomagnetism studies. However, continuous SRM measurements are inevitably smoothed and distorted due to the convolution effect of the SRM sensor response. Deconvolution is necessary to restore accurate magnetization from pass-through SRM data, and robust deconvolution requires a reliable estimate of the SRM sensor response as well as an understanding of uncertainties associated with the SRM measurement system. In this paper, we use the SRM at Kochi Core Center (KCC), Japan, as an example to introduce a new tool and procedure for accurate and efficient estimation of SRM sensor response. To quantify uncertainties associated with the SRM measurement due to track positioning errors and test their effects on deconvolution, we employed laser interferometry for precise monitoring of track positions both with and without placing a u-channel sample on the SRM tray. The acquired KCC SRM sensor response shows a significant cross-term of Z-axis magnetization on the X-axis pick-up coil and full widths of ~46-54 mm at half-maximum response for the three pick-up coils, which are significantly narrower than those (~73-80 mm) for the liquid-He-free SRM at Oregon State University. Laser interferometry measurements on the KCC SRM tracking system indicate positioning uncertainties of ~0.1-0.2 and ~0.5 mm for tracking with and without a u-channel sample on the tray, respectively. Positioning errors appear to have reproducible components of up to ~0.5 mm, possibly due to patterns or damage on the tray surface or the rope used for the tracking system. Deconvolution of 50,000 simulated measurement data sets with realistic errors introduced based on the position uncertainties indicates that although the SRM tracking system has recognizable positioning uncertainties, they do not significantly debilitate the use of deconvolution to accurately restore high

  6. Data enhancement and analysis through mathematical deconvolution of signals from scientific measuring instruments

    NASA Technical Reports Server (NTRS)

    Wood, G. M.; Rayborn, G. H.; Ioup, J. W.; Ioup, G. E.; Upchurch, B. T.; Howard, S. J.

    1981-01-01

    Mathematical deconvolution of digitized analog signals from scientific measuring instruments is shown to be a means of extracting important information which is otherwise hidden due to time-constant and other broadening or distortion effects caused by the experiment. Three different approaches to deconvolution and their subsequent application to recorded data from three analytical instruments are considered. To demonstrate the efficacy of deconvolution, the use of these approaches to solve the convolution integral for the gas chromatograph, magnetic mass spectrometer, and the time-of-flight mass spectrometer are described. Other possible applications of these types of numerical treatment of data to yield superior results from analog signals of the physical parameters normally measured in aerospace simulation facilities are suggested and briefly discussed.
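
    The common starting point for all three instrument applications is the convolution integral relating the true signal x to the recorded signal y through the instrument response h (standard notation, assumed here):

```latex
y(t) = \int_{-\infty}^{\infty} h(t-\tau)\, x(\tau)\, \mathrm{d}\tau + n(t),
```

    and deconvolution is the (generally ill-posed) task of estimating x from y given h, which is why careful numerical treatment of noise is required.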

  7. Analysis of a deconvolution-based information retrieval algorithm in X-ray grating-based phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Horn, Florian; Bayer, Florian; Pelzer, Georg; Rieger, Jens; Ritter, André; Weber, Thomas; Zang, Andrea; Michel, Thilo; Anton, Gisela

    2014-03-01

    Grating-based X-ray phase-contrast imaging is a promising imaging modality to increase soft tissue contrast in comparison to conventional attenuation-based radiography. Complementary and otherwise inaccessible information is provided by the dark-field image, which shows the sub-pixel size granularity of the measured object. This could especially turn out to be useful in mammography, where tumourous tissue is associated with the presence of very small microcalcifications. In addition to the well-established image reconstruction process, an analysis method was introduced by Modregger [1], which is based on deconvolution of the underlying scattering distribution within a single pixel, revealing information about the sample. Subsequently, the different contrast modalities can be calculated from the scattering distribution. The method has already proved to deliver additional information in the higher moments of the scattering distribution and possibly reaches better image quality with respect to an increased contrast-to-noise ratio. Several measurements were carried out using melamine foams as phantoms. We analysed the dependency of the deconvolution-based method with respect to the dark-field image on different parameters such as dose, number of iterations of the iterative deconvolution algorithm and dark-field signal. A disagreement was found in the reconstructed dark-field values between the FFT method and the iterative method. Usage of the resulting characteristics might be helpful in future applications.

  8. Streaming Multiframe Deconvolutions on GPUs

    NASA Astrophysics Data System (ADS)

    Lee, M. A.; Budavári, T.

    2015-09-01

    Atmospheric turbulence distorts all ground-based observations, which is especially detrimental to faint detections. The point spread function (PSF) defining this blur is unknown for each exposure and varies significantly over time, making image analysis difficult. Lucky imaging and traditional co-adding throw away a great deal of information. We developed blind deconvolution algorithms that can simultaneously obtain robust solutions for the background image and all the PSFs. It is done in a streaming setting, which makes it practical for large numbers of big images. We implemented a new tool that runs on GPUs and achieves exceptional running times that can scale to the new time-domain surveys. Our code can quickly and effectively recover high-resolution images exceeding the quality of traditional co-adds. We demonstrate the power of the method on the repeated exposures in the Sloan Digital Sky Survey's Stripe 82.

  9. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  10. Demand Response Analysis Tool

    2012-03-01

    The Demand Response Analysis Tool is software developed at the Lawrence Berkeley National Laboratory. It was initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool with standardized methods to evaluate the demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as the development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  11. MS-DIAL: Data Independent MS/MS Deconvolution for Comprehensive Metabolome Analysis

    PubMed Central

    Tsugawa, Hiroshi; Cajka, Tomas; Kind, Tobias; Ma, Yan; Higgins, Brendan; Ikeda, Kazutaka; Kanazawa, Mitsuhiro; VanderGheynst, Jean; Fiehn, Oliver; Arita, Masanori

    2015-01-01

    Data-independent acquisition (DIA) in liquid chromatography tandem mass spectrometry (LC-MS/MS) provides more comprehensive untargeted acquisition of molecular data. Here we provide an open-source software pipeline, MS-DIAL, to demonstrate how DIA improves simultaneous identification and quantification of small molecules by mass spectral deconvolution. For reversed-phase LC-MS/MS, our program with an enriched LipidBlast library identified a total of 1,023 lipid compounds from nine algal strains to highlight their chemotaxonomic relationships. PMID:25938372

  12. UDECON: deconvolution optimization software for restoring high-resolution records from pass-through paleomagnetic measurements

    NASA Astrophysics Data System (ADS)

    Xuan, Chuang; Oda, Hirokuni

    2015-11-01

    The rapid accumulation of continuous paleomagnetic and rock magnetic records acquired from pass-through measurements on superconducting rock magnetometers (SRM) has greatly contributed to our understanding of the paleomagnetic field and paleo-environment. Pass-through measurements are inevitably smoothed and altered by the convolution effect of SRM sensor response, and deconvolution is needed to restore high-resolution paleomagnetic and environmental signals. Although various deconvolution algorithms have been developed, the lack of easy-to-use software has hindered the practical application of deconvolution. Here, we present standalone graphical software UDECON as a convenient tool to perform optimized deconvolution for pass-through paleomagnetic measurements using the algorithm recently developed by Oda and Xuan (Geochem Geophys Geosyst 15:3907-3924, 2014). With the preparation of a format file, UDECON can directly read pass-through paleomagnetic measurement files collected at different laboratories. After the SRM sensor response is determined and loaded to the software, optimized deconvolution can be conducted using two different approaches (i.e., "Grid search" and "Simplex method") with adjustable initial values or ranges for smoothness, corrections of sample length, and shifts in measurement position. UDECON provides a suite of tools to view conveniently and check various types of original measurement and deconvolution data. Multiple steps of measurement and/or deconvolution data can be compared simultaneously to check the consistency and to guide further deconvolution optimization. Deconvolved data together with the loaded original measurement and SRM sensor response data can be saved and reloaded for further treatment in UDECON. Users can also export the optimized deconvolution data to a text file for analysis in other software.

  13. Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis

    SciTech Connect

    Greenberg, M.; Ebel, D.S.

    2009-03-19

    We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 μm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 μm/pixel, without the use of oil-based lenses. A full textural analysis of track No. 82 is presented here, as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analyses.

  14. Independent component analysis (ICA) algorithms for improved spectral deconvolution of overlapped signals in 1H NMR analysis: application to foods and related products.

    PubMed

    Monakhova, Yulia B; Tsikin, Alexey M; Kuballa, Thomas; Lachenmeier, Dirk W; Mushtakova, Svetlana P

    2014-05-01

    The major challenge facing NMR spectroscopic mixture analysis is the overlapping of signals and the arising impossibility to easily recover the structures for identification of the individual components and to integrate separated signals for quantification. In this paper, various independent component analysis (ICA) algorithms [mutual information least dependent component analysis (MILCA); stochastic non-negative ICA (SNICA); joint approximate diagonalization of eigenmatrices (JADE); and robust, accurate, direct ICA algorithm (RADICAL)] as well as deconvolution methods [simple-to-use-interactive self-modeling mixture analysis (SIMPLISMA) and multivariate curve resolution-alternating least squares (MCR-ALS)] are applied for simultaneous (1)H NMR spectroscopic determination of organic substances in complex mixtures. Among others, we studied constituents of the following matrices: honey, soft drinks, and liquids used in electronic cigarettes. Good quality spectral resolution of up to eight-component mixtures was achieved (correlation coefficients between resolved and experimental spectra were not less than 0.90). In general, the relative errors in the recovered concentrations were below 12%. SIMPLISMA and MILCA algorithms were found to be preferable for NMR spectra deconvolution and showed similar performance. The proposed method was used for analysis of authentic samples. The resolved ICA concentrations match well with the results of reference gas chromatography-mass spectrometry as well as the MCR-ALS algorithm used for comparison. ICA deconvolution considerably improves the application range of direct NMR spectroscopy for analysis of complex mixtures.
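
    A minimal illustration of ICA-based spectral unmixing is sketched below using scikit-learn's FastICA on synthetic, partially overlapping spectra. FastICA is used only as a readily available stand-in: the algorithms evaluated in this record (MILCA, SNICA, JADE, RADICAL) are not part of scikit-learn, and real 1H NMR data would additionally need phasing, baseline correction and alignment.

```python
# Hedged sketch: unmixing synthetic overlapped spectra with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
ppm = np.linspace(0, 10, 2000)

def lorentzian(centre, width):
    return width ** 2 / ((ppm - centre) ** 2 + width ** 2)

# Two synthetic "pure component" spectra with partially overlapping peaks.
s1 = lorentzian(2.00, 0.02) + lorentzian(3.5, 0.02)
s2 = lorentzian(2.05, 0.02) + lorentzian(7.2, 0.02)
S = np.vstack([s1, s2])

# Six mixture spectra with different (unknown) concentrations plus noise.
A = rng.uniform(0.2, 1.0, (6, 2))
X = A @ S + 0.001 * rng.normal(size=(6, ppm.size))

ica = FastICA(n_components=2, random_state=0)
resolved = ica.fit_transform(X.T).T          # estimated source spectra (up to sign/scale)

for r in resolved:
    corr = max(abs(np.corrcoef(r, s)[0, 1]) for s in S)
    print(f"best correlation with a true component: {corr:.3f}")
```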

  15. ATAMM analysis tool

    NASA Technical Reports Server (NTRS)

    Jones, Robert; Stoughton, John; Mielke, Roland

    1991-01-01

    Diagnostics software for analyzing Algorithm to Architecture Mapping Model (ATAMM) based concurrent processing systems is presented. ATAMM is capable of modeling the execution of large grain algorithms on distributed data flow architectures. The tool graphically displays algorithm activities and processor activities for evaluation of the behavior and performance of an ATAMM based system. The tool's measurement capabilities indicate computing speed, throughput, concurrency, resource utilization, and overhead. Evaluations are performed on a simulated system using the software tool. The tool is used to estimate theoretical lower bound performance. Analysis results are shown to be comparable to the predictions.

  16. Multi-tissue constrained spherical deconvolution for improved analysis of multi-shell diffusion MRI data.

    PubMed

    Jeurissen, Ben; Tournier, Jacques-Donald; Dhollander, Thijs; Connelly, Alan; Sijbers, Jan

    2014-12-01

    Constrained spherical deconvolution (CSD) has become one of the most widely used methods to extract white matter (WM) fibre orientation information from diffusion-weighted MRI (DW-MRI) data, overcoming the crossing fibre limitations inherent in the diffusion tensor model. It is routinely used to obtain high quality fibre orientation distribution function (fODF) estimates and fibre tractograms and is increasingly used to obtain apparent fibre density (AFD) measures. Unfortunately, CSD typically only supports data acquired on a single shell in q-space. With multi-shell data becoming more and more prevalent, there is a growing need for CSD to fully support such data. Furthermore, CSD can only provide high quality fODF estimates in voxels containing WM only. In voxels containing other tissue types such as grey matter (GM) and cerebrospinal fluid (CSF), the WM response function may no longer be appropriate and spherical deconvolution produces unreliable, noisy fODF estimates. The aim of this study is to incorporate support for multi-shell data into the CSD approach as well as to exploit the unique b-value dependencies of the different tissue types to estimate a multi-tissue ODF. The resulting approach is dubbed multi-shell, multi-tissue CSD (MSMT-CSD) and is compared to the state-of-the-art single-shell, single-tissue CSD (SSST-CSD) approach. Using both simulations and real data, we show that MSMT-CSD can produce reliable WM/GM/CSF volume fraction maps, directly from the DW data, whereas SSST-CSD has a tendency to overestimate the WM volume in voxels containing GM and/or CSF. In addition, compared to SSST-CSD, MSMT-CSD can substantially increase the precision of the fODF fibre orientations and reduce the presence of spurious fODF peaks in voxels containing GM and/or CSF. Both effects translate into more reliable AFD measures and tractography results with MSMT-CSD compared to SSST-CSD.

  17. Regulatory sequence analysis tools.

    PubMed

    van Helden, Jacques

    2003-07-01

    The web resource Regulatory Sequence Analysis Tools (RSAT) (http://rsat.ulb.ac.be/rsat) offers a collection of software tools dedicated to the prediction of regulatory sites in non-coding DNA sequences. These tools include sequence retrieval, pattern discovery, pattern matching, genome-scale pattern matching, feature-map drawing, random sequence generation and other utilities. Alternative formats are supported for the representation of regulatory motifs (strings or position-specific scoring matrices) and several algorithms are proposed for pattern discovery. RSAT currently holds >100 fully sequenced genomes and these data are regularly updated from GenBank.

  18. Systematic forensic toxicological analysis by GC-MS in serum using automated mass spectral deconvolution and identification system.

    PubMed

    Grapp, Marcel; Maurer, Hans H; Desel, Herbert

    2016-08-01

    Non-targeted screening of body fluids for psychoactive agents is an essential task for forensic toxicology. The challenge is the identification of xenobiotics of interest from background noise and endogenous matrix components. The aim of the present work was to evaluate the use of an Automated Mass Spectral Deconvolution and Identification System (AMDIS) for gas chromatography-mass spectrometry (GC-MS) based toxicological serum screening. One hundred fifty serum samples submitted to the authors' laboratory for systematic forensic toxicological analysis underwent GC-MS screening after neutral and basic liquid-liquid extraction. Recorded datasets were routinely evaluated both by experienced personnel and automatically using the AMDIS software combined with the Maurer/Pfleger/Weber GC-MS library MPW_2011. The results from manual and automated data evaluation were then systematically compared. AMDIS parameters for data deconvolution and substance identification had been successfully adapted to the GC-MS screening procedure in serum. The number of false positive hits could substantially be reduced without increasing the risk of overlooking relevant compounds. With AMDIS-based data evaluation, additional drugs were identified in 25 samples (17%) that had not been detected by manual data evaluation. Importantly, among these drugs, there were frequently prescribed and toxicologically relevant antidepressants and antipsychotic drugs such as citalopram, mirtazapine, quetiapine, or venlafaxine. For most of the identified drugs, their serum concentrations were in the therapeutic or subtherapeutic range. Thus, our study indicated that automated data evaluation by AMDIS provided reliable screening results. Copyright © 2015 John Wiley & Sons, Ltd.

  19. Graphical Contingency Analysis Tool

    SciTech Connect

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis that provides more decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  20. Convolution-deconvolution in DIGES

    SciTech Connect

    Philippacopoulos, A.J.; Simos, N.

    1995-05-01

    Convolution and deconvolution operations are an important aspect of SSI analysis since they influence the input to the seismic analysis. This paper documents some of the convolution/deconvolution procedures which have been implemented in the DIGES code. The 1-D propagation of shear and dilatational waves in typical layered configurations, involving a stack of layers overlying a rock, is treated by DIGES in a similar fashion to that of available codes, e.g. CARES and SHAKE. For certain configurations, however, there is no need to perform such analyses since the corresponding solutions can be obtained in analytic form. Typical cases involve deposits which can be modeled by a uniform halfspace or simple layered halfspaces. For such cases DIGES uses closed-form solutions. These solutions are given for one- as well as two-dimensional deconvolution. The types of waves considered include P, SV and SH waves. Non-vertical incidence is given special attention since deconvolution can be defined differently depending on the problem of interest. For all wave cases considered, the corresponding transfer functions are presented in closed form. Transient solutions are obtained in the frequency domain. Finally, a variety of forms are considered for representing the free-field motion, in terms of both deterministic and probabilistic representations. These include (a) acceleration time histories, (b) response spectra, (c) Fourier spectra and (d) cross-spectral densities.
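
    As a rough illustration of what convolution/deconvolution means in this 1-D wave-propagation setting, the sketch below propagates a synthetic rock motion to the surface through an assumed single-layer transfer function and then recovers it by regularised spectral division. The transfer function, layer properties and water-level stabilisation are illustrative assumptions, not the closed-form solutions implemented in DIGES.

        import numpy as np

        rng = np.random.default_rng(1)
        dt, n = 0.01, 2048
        t = np.arange(n) * dt

        # Synthetic "rock outcrop" acceleration time history (assumed input motion).
        rock = rng.normal(size=n) * np.exp(-((t - 5.0) / 2.0) ** 2)

        # Illustrative transfer function of one damped soil layer over a rigid base
        # (vertically incident SH waves): H(f) = 1 / cos(omega * h / Vs*),
        # with complex shear velocity Vs* = Vs * (1 + i*xi).  Properties are assumed.
        h, Vs, xi = 30.0, 250.0, 0.05
        f = np.fft.rfftfreq(n, dt)
        H = 1.0 / np.cos(2.0 * np.pi * f * h / (Vs * (1.0 + 1j * xi)))

        # Convolution: propagate the rock motion to the free surface.
        surface = np.fft.irfft(np.fft.rfft(rock) * H, n)

        # Deconvolution: recover the rock motion from the surface record, stabilised
        # with a simple water level on |H| (exactness is traded for stability).
        water = 0.05 * np.abs(H).max()
        H_safe = np.where(np.abs(H) < water, water * np.exp(1j * np.angle(H)), H)
        rock_rec = np.fft.irfft(np.fft.rfft(surface) / H_safe, n)

        print("peak recovery error:", float(np.max(np.abs(rock_rec - rock))))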

  1. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  2. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
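
    The reports listed above can be pictured with a toy detection matrix: given which tests detect which failure modes, one can derive detectability, test utilization and isolation ambiguity groups. The sketch below uses invented failure modes and tests and is not the ETA Tool or TEAMS Designer output format.

        from collections import defaultdict

        # Hypothetical fault-propagation result: which tests detect which failure modes.
        d_matrix = {
            "valve_stuck_open":   {"T1", "T3"},
            "valve_stuck_closed": {"T1", "T3"},
            "sensor_bias":        {"T2"},
            "pump_degraded":      {"T1", "T2", "T4"},
            "seal_leak":          set(),            # undetected
        }

        # Detectability report: how (and whether) each failure mode is detected.
        for fm, tests in d_matrix.items():
            status = ", ".join(sorted(tests)) if tests else "NOT DETECTED"
            print(f"{fm:20s} -> {status}")

        # Test utilization report: every failure mode each test detects.
        by_test = defaultdict(set)
        for fm, tests in d_matrix.items():
            for t in tests:
                by_test[t].add(fm)
        for t in sorted(by_test):
            print(f"{t}: {sorted(by_test[t])}")

        # Isolation classes: failure modes with identical test signatures
        # cannot be discriminated from one another.
        classes = defaultdict(list)
        for fm, tests in d_matrix.items():
            classes[frozenset(tests)].append(fm)
        for sig, fms in classes.items():
            if len(fms) > 1:
                print("ambiguity group:", fms)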

  3. PCard Data Analysis Tool

    SciTech Connect

    Hilts, Jim

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool: - Helps identify exceptions or questionable and non-compliant purchases, - Creates a random audit sample on request, - Allows users to create and run new or ad-hoc queries and reports, - Monitors disputed charges, - Creates predefined emails to cardholders requesting documentation and/or clarification, - Tracks audit status, notes, email status (date sent, response), and audit resolution.

  5. Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis

    PubMed Central

    Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M.

    2015-01-01

    The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue, and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell-specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population-specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprise mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908
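
    The core of PSEA is a regression of bulk expression on cell-type reference signals. The sketch below illustrates that idea on synthetic data with a plain least-squares fit; the published PSEA implementation additionally uses marker-gene-derived reference signals, intercept and group-interaction terms, so this is a simplified stand-in with invented numbers.

        import numpy as np

        rng = np.random.default_rng(2)
        n_samples = 40

        # Cell-type reference signals: in PSEA these are averages of marker-gene
        # expression across samples; here they are simulated population indices.
        neuron    = 1.0 + 0.3 * rng.normal(size=n_samples)
        astro     = 1.0 + 0.3 * rng.normal(size=n_samples)
        oligo     = 1.0 + 0.3 * rng.normal(size=n_samples)
        microglia = 1.0 + 0.3 * rng.normal(size=n_samples)

        # Simulate a gene expressed mainly in neurons (true coefficient 2.0).
        bulk = (2.0 * neuron + 0.2 * astro + 0.1 * oligo + 0.05 * microglia
                + 0.1 * rng.normal(size=n_samples))

        # Least-squares fit of bulk expression on the reference signals gives
        # cell-type-specific expression estimates for this gene.
        X = np.column_stack([neuron, astro, oligo, microglia])
        coef, *_ = np.linalg.lstsq(X, bulk, rcond=None)
        for name, c in zip(["neuron", "astrocyte", "oligodendrocyte", "microglia"], coef):
            print(f"{name:16s} {c:6.2f}")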

  7. Transmission Planning Analysis Tool

    SciTech Connect

    2015-06-23

    Developed to solve a specific problem: assist transmission planning for regional transfers in interconnected power systems. This work originated in a study for the U.S. Department of State to recommend transmission reinforcements for the Central American regional system that interconnects 6 countries. Transmission planning analysis is currently performed by engineers with domain-specific and system-specific knowledge without a unique methodology. The software codes of this disclosure assist engineers by defining systematic analysis procedures to help identify weak points and make decisions on transmission planning of regional interconnected power systems. The Transmission Planning Analysis Tool groups PSS/E results of multiple AC contingency analyses, voltage stability analyses and QV analyses over many study scenarios and arranges them in a systematic way to aid power system planning engineers or transmission operators in an effective decision-making process or in the off-line study environment.

  9. Constrained spherical deconvolution analysis of the limbic network in human, with emphasis on a direct cerebello-limbic pathway

    PubMed Central

    Arrigo, Alessandro; Mormina, Enricomaria; Anastasi, Giuseppe Pio; Gaeta, Michele; Calamuneri, Alessandro; Quartarone, Angelo; De Salvo, Simona; Bruschetta, Daniele; Rizzo, Giuseppina; Trimarchi, Fabio; Milardi, Demetrio

    2014-01-01

    The limbic system is part of an intricate network which is involved in several functions like memory and emotion. Traditionally the role of the cerebellum was considered mainly associated with motor control; however, growing evidence suggests a role of the cerebellum in learning skills, emotion control, and mnemonic and behavioral processes, also involving connections with the limbic system. In 15 normal subjects we studied limbic connections by probabilistic Constrained Spherical Deconvolution (CSD) tractography. The main result of our work was to prove for the first time in the human brain the existence of a direct cerebello-limbic pathway which was previously hypothesized but never demonstrated. We also extended our analysis to the other limbic connections including the cingulate fasciculus, inferior longitudinal fasciculus, uncinate fasciculus, anterior thalamic connections and fornix. Although these pathways have already been described in the tractographic literature, we provided reconstruction, quantitative analysis and Fractional Anisotropy (FA) right-left symmetry comparison using probabilistic CSD tractography, which is known to provide a potential improvement compared to previously used Diffusion Tensor Imaging (DTI) techniques. The demonstration of the existence of the cerebello-limbic pathway could constitute an important step in the knowledge of the anatomic substrate of non-motor cerebellar functions. Finally, the CSD statistical data about limbic connections in healthy subjects could be potentially useful in the diagnosis of pathological disorders damaging this system. PMID:25538606

  10. Swift Science Analysis Tools

    NASA Astrophysics Data System (ADS)

    Marshall, F. E.; Swift Team Team

    2003-05-01

    Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site (http://swiftsc.gsfc.nasa.gov), and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as it is processed in the Swift Data Center (SDC). Once all the data for an observation is available, the data will be transferred to the HEASARC and data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.

  11. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents 3 different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load in multiple SRS .ana files and extract pressure vs. time data; (3) C++ Contamination Simulation code: a 3D particle tracing code for modeling transport of dust particulates and molecules. It uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  12. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  13. Application of network identification by deconvolution method to the thermal analysis of the pump-probe transient thermoreflectance signal.

    PubMed

    Ezzahri, Y; Shakouri, A

    2009-07-01

    The paper discusses the possibility of applying the network identification by deconvolution (NID) method to the analysis of the thermal transient behavior due to a laser delta-pulse excitation in a pump-probe transient thermoreflectance experiment. NID is a method based on linear RC network theory using Fourier's law of heat conduction. This approach allows the extraction of the thermal time constant spectrum of the sample under study after excitation by either a step or a pulse function. Furthermore, using some mathematical transformations, the method allows analyzing the detail of the heat flux path through the sample, starting from the excited top free surface, by introducing two characteristic functions: the cumulative structure function and the differential structure function. We start with a review of the theoretical background of the NID method in the case of a step-function excitation and then show how this method can be adjusted for use in the case of a delta-pulse excitation. We show how the NID method can be extended to analyze the thermal transients of many optical experiments in which the excitation function is a laser pulse. The effect of the semi-infinite substrate as well as extraction of the interface and thin-film thermal resistances will be discussed.
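
    A rough sketch of the NID idea: in logarithmic time the derivative of the thermal step response equals the time-constant spectrum convolved with the fixed kernel w(z) = exp(z - exp(z)), so recovering the spectrum is a deconvolution in log-time. The example below is fully synthetic and uses a simple regularised Fourier deconvolution as a stand-in for the Bayesian/iterative schemes typically used in practice; all peak positions and widths are invented.

        import numpy as np

        # Log-time axis z = ln(t) and a synthetic time-constant spectrum R(z)
        # with two peaks (e.g. die and package time constants; values invented).
        z = np.linspace(-8.0, 8.0, 2048)
        dz = z[1] - z[0]
        R = (1.5 * np.exp(-0.5 * ((z + 2.0) / 0.25) ** 2)
             + 0.8 * np.exp(-0.5 * ((z - 1.0) / 0.40) ** 2))

        # NID relation: d a / d ln t = R(z) convolved with w(z) = exp(z - exp(z)).
        w = np.exp(z - np.exp(z))
        Wf = np.fft.fft(np.fft.ifftshift(w)) * dz           # kernel centred at z = 0

        # Forward model: derivative of the thermal step response in log-time.
        da_dz = np.real(np.fft.ifft(np.fft.fft(R) * Wf))

        # Deconvolution with simple Tikhonov-style damping (a stand-in for the
        # Bayesian/iterative deconvolution used in practice).
        eps = 1e-4 * np.abs(Wf).max() ** 2
        R_rec = np.real(np.fft.ifft(np.fft.fft(da_dz) * np.conj(Wf)
                                    / (np.abs(Wf) ** 2 + eps)))

        print("tallest recovered peak at z ≈", round(float(z[np.argmax(R_rec)]), 2))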

  14. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
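
    The question the tool answers ("if this variable changes, what else changes, and along what path?") amounts to reachability in a dependence graph. The sketch below shows a forward slice over a small hypothetical multi-domain graph; the actual tool derives its graph from program control and data flow rather than from a hand-written dictionary.

        from collections import deque

        # Hypothetical multi-domain dependence graph: edge u -> v means
        # "a change in u can change v" (data flow, control flow, or a
        # non-program relationship such as a configuration file feeding a program).
        deps = {
            "config.timeout": ["net.retry_loop"],
            "net.retry_loop": ["net.latency", "app.error_rate"],
            "net.latency":    ["app.response_time"],
            "app.error_rate": ["ops.alert"],
            "sensor.bias":    ["app.reading"],
        }

        def forward_slice(graph, start):
            """Return every variable reachable from `start` and one path to each."""
            paths = {start: [start]}
            queue = deque([start])
            while queue:
                u = queue.popleft()
                for v in graph.get(u, []):
                    if v not in paths:
                        paths[v] = paths[u] + [v]
                        queue.append(v)
            return paths

        for var, path in forward_slice(deps, "config.timeout").items():
            print(" -> ".join(path))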

  15. Stack Trace Analysis Tool

    SciTech Connect

    2013-02-19

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.
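
    The merge step described above can be illustrated compactly: per-task stack traces are folded into a call prefix tree whose nodes record the tasks that passed through them, and tasks sharing a leaf form an equivalence class. The sketch below uses invented traces and is not the MRNet-based STAT implementation.

        # Each task reports its stack trace as a list of frames, outermost first.
        traces = {
            0: ["main", "solve", "mpi_wait"],
            1: ["main", "solve", "mpi_wait"],
            2: ["main", "solve", "compute_kernel"],
            3: ["main", "io_write"],
        }

        def merge(traces):
            """Fold per-task stack traces into a call prefix tree.
            Each node records which tasks passed through it."""
            root = {"tasks": set(), "children": {}}
            for task, frames in traces.items():
                node = root
                node["tasks"].add(task)
                for frame in frames:
                    node = node["children"].setdefault(
                        frame, {"tasks": set(), "children": {}})
                    node["tasks"].add(task)
            return root

        def show(node, name="<root>", depth=0):
            print("  " * depth + f"{name}  tasks={sorted(node['tasks'])}")
            for child, sub in node["children"].items():
                show(sub, child, depth + 1)

        show(merge(traces))
        # Leaves with identical frame paths form the equivalence classes, e.g.
        # tasks {0, 1} stuck in mpi_wait and task 2 in compute_kernel.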

  18. Stack Trace Analysis Tool

    SciTech Connect

    2008-01-16

    STAT is a lightweight debugging tool that gathers and merges stack traces from all of the processes in a parallel application. STAT uses the MRNet tree-based overlay network to broadcast commands from the tool front-end to the STAT daemons and for the front-end to gather the traces from the STAT daemons. As the traces propagate through the MRNet network tree, they are merged across all tasks to form a single call prefix tree. The call prefix tree can be examined to identify tasks with similar function call patterns and to delineate a small set of equivalence classes. A representative task from each of these classes can then be fed into a full-featured debugger like TotalView for root cause analysis.

  19. Automated Deconvolution of Overlapped Ion Mobility Profiles

    NASA Astrophysics Data System (ADS)

    Brantley, Matthew; Zekavat, Behrooz; Harper, Brett; Mason, Rachel; Solouki, Touradj

    2014-10-01

    The presence of unresolved ion mobility (IM) profiles limits the efficient utilization of IM mass spectrometry (IM-MS) systems for isomer differentiation. Here, we introduce automated ion mobility deconvolution (AIMD) computer software for streamlined deconvolution of overlapped IM-MS profiles. AIMD is based on a previously reported post-IM/collision-induced dissociation (CID) deconvolution approach [J. Am. Soc. Mass Spectrom. 23, 1873 (2012)] and, unlike the previously reported manual approach, it does not require resampling of post-IM/CID data. A novel data preprocessing approach is utilized to improve the accuracy and efficiency of the deconvolution process. Results are presented from AIMD analysis of overlapped IM profiles of data from (1) a Waters Synapt G1 for a binary mixture of isomeric peptides (amino acid sequences: GRGDS and SDGRG) and (2) a Waters Synapt G2-S for a binary mixture of isomeric trisaccharides (raffinose and isomaltotriose).
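
    The AIMD approach itself relies on post-IM/CID fragment information, which is not reproduced here; as a simpler illustration of deconvolving an overlapped arrival-time distribution, the sketch below fits a two-Gaussian model to a synthetic profile with non-linear least squares. Centres, widths and noise levels are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_gaussians(t, a1, t1, w1, a2, t2, w2):
            return (a1 * np.exp(-0.5 * ((t - t1) / w1) ** 2)
                    + a2 * np.exp(-0.5 * ((t - t2) / w2) ** 2))

        # Synthetic overlapped arrival-time distribution for two isomers (ms).
        rng = np.random.default_rng(3)
        t = np.linspace(4.0, 10.0, 300)
        y = (two_gaussians(t, 1.0, 6.2, 0.35, 0.7, 7.0, 0.40)
             + 0.02 * rng.normal(size=t.size))

        # Deconvolution by fitting the two-component model.
        p0 = [1.0, 6.0, 0.3, 0.5, 7.3, 0.3]                # initial guesses
        popt, _ = curve_fit(two_gaussians, t, y, p0=p0)
        a1, t1, w1, a2, t2, w2 = popt
        print(f"component 1: centre {t1:.2f} ms, area {a1 * w1 * np.sqrt(2*np.pi):.2f}")
        print(f"component 2: centre {t2:.2f} ms, area {a2 * w2 * np.sqrt(2*np.pi):.2f}")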

  20. Frequency Response Analysis Tool

    SciTech Connect

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages a database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and a balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
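
    As a back-of-the-envelope illustration of a frequency response calculation, the sketch below divides the change in net actual interchange by the change in frequency for one hypothetical under-frequency event and reports it in MW per 0.1 Hz. The A/B values are invented, and BAL-003-1 prescribes specific measurement windows and reporting conventions that are not reproduced here.

        # Hypothetical pre-event (A) and post-event (B) values for one
        # under-frequency event, taken from PMU or SCADA averages.
        freq_a_hz = 60.000          # interconnection frequency before the event
        freq_b_hz = 59.940          # settled frequency after the event
        nia_a_mw = -1250.0          # net actual interchange before (MW)
        nia_b_mw = -1175.0          # net actual interchange after (MW)

        delta_f = freq_b_hz - freq_a_hz              # -0.060 Hz
        delta_nia = nia_b_mw - nia_a_mw              # +75 MW delivered to the grid

        # Frequency response in MW per 0.1 Hz (sign convention: more output for
        # a frequency drop yields a negative number, as commonly reported).
        response_mw_per_0p1hz = delta_nia / (delta_f * 10.0)
        print(f"frequency response: {response_mw_per_0p1hz:.1f} MW/0.1 Hz")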

  1. Neutron multiplicity analysis tool

    SciTech Connect

    Stewart, Scott L

    2010-01-01

    I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This

  2. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  4. Climate Data Analysis Tools

    SciTech Connect

    2009-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS).

  5. Uncertainty analysis of signal deconvolution using a measured instrument response function

    NASA Astrophysics Data System (ADS)

    Hartouni, E. P.; Beeman, B.; Caggiano, J. A.; Cerjan, C.; Eckart, M. J.; Grim, G. P.; Hatarik, R.; Moore, A. S.; Munro, D. H.; Phillips, T.; Sayre, D. B.

    2016-11-01

    A common analysis procedure minimizes the negative ln-likelihood that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). In the case investigated here, the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers, the IRF is constructed from measurements and models. IRF measurements have a finite precision that can contribute significantly to the uncertainty estimate of the physical model's parameters. We apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.
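
    The structure of the problem can be sketched as follows: the forward model is a parameterised physics model convolved with the measured IRF, and the finite precision of the IRF measurement can be propagated by refitting with IRF realisations drawn within that precision. The toy example below is a crude stand-in for the Bayesian marginalisation described in the paper; all shapes, widths and noise levels are invented.

        import numpy as np

        rng = np.random.default_rng(4)
        t = np.linspace(0.0, 50.0, 500)          # time-of-flight axis (arbitrary units)

        def physics_model(t, width):
            """Toy underlying emission history: a Gaussian burst of the given width."""
            return np.exp(-0.5 * ((t - 20.0) / width) ** 2)

        # "Measured" IRF: a one-sided exponential known only to ~5% per bin.
        irf_mean = np.exp(-np.clip(t - 5.0, 0.0, None) / 3.0) * (t >= 5.0)
        irf_mean /= irf_mean.sum()
        irf_sigma = 0.05 * irf_mean + 1e-6

        # Synthetic observation generated with width = 2.5.
        signal = np.convolve(physics_model(t, 2.5), irf_mean, mode="full")[: t.size]
        noise = 0.01
        data = signal + noise * rng.normal(size=t.size)

        def lnlike(width, irf):
            """Gaussian ln-likelihood of the data for a model convolved with an IRF."""
            model = np.convolve(physics_model(t, width), irf, mode="full")[: t.size]
            return -0.5 * np.sum(((data - model) / noise) ** 2)

        widths = np.linspace(2.0, 3.0, 101)
        best_mean = widths[int(np.argmax([lnlike(w, irf_mean) for w in widths]))]

        # Propagate IRF measurement uncertainty: refit with IRF realisations
        # drawn within the stated per-bin precision and look at the spread.
        best_widths = []
        for _ in range(50):
            draw = np.clip(irf_mean + irf_sigma * rng.normal(size=t.size), 0.0, None)
            draw /= draw.sum()
            best_widths.append(widths[int(np.argmax([lnlike(w, draw) for w in widths]))])
        print("best-fit width with mean IRF:", best_mean)
        print("width scatter from IRF uncertainty:", round(float(np.std(best_widths)), 3))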

  6. Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols

    EPA Science Inventory

    In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...

  7. Digital Deconvolution Filter Derived from Linear Discriminant Analysis and Application for Multiphoton Fluorescence Microscopy

    PubMed Central

    2015-01-01

    A digital filter derived from linear discriminant analysis (LDA) is developed for recovering impulse responses in photon counting from a high speed photodetector (rise time of ∼1 ns) and applied to remove ringing distortions from impedance mismatch in multiphoton fluorescence microscopy. Training of the digital filter was achieved by defining temporally coincident and noncoincident transients and identifying the projection within filter-space that best separated the two classes. Once trained, data analysis by digital filtering can be performed quickly. Assessment of the reliability of the approach was performed through comparisons of simulated voltage transients, in which the ground truth results were known a priori. The LDA filter was also found to recover deconvolved impulses for single photon counting from highly distorted ringing waveforms from an impedance mismatched photomultiplier tube. The LDA filter was successful in removing these ringing distortions from two-photon excited fluorescence micrographs and through data simulations was found to extend the dynamic range of photon counting by approximately 3 orders of magnitude through minimization of detector paralysis. PMID:24559143
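
    A minimal sketch of the underlying idea, with entirely synthetic waveforms: train a Fisher/LDA projection to separate pulse-like from ringing-only transients, then apply it as a sliding digital filter so that genuine photon arrivals stand out. This illustrates the general LDA-filter concept, not the published filter design or the actual detector parameters.

        import numpy as np

        rng = np.random.default_rng(5)
        n_len = 32                                   # samples per training transient

        def transient(ring=False):
            """Synthetic detector transient: a fast pulse, or ringing only."""
            t = np.arange(n_len)
            y = np.exp(-t / 3.0)                      # impulse-like response
            if ring:
                y = 0.4 * np.sin(2 * np.pi * t / 6.0) * np.exp(-t / 10.0)
            return y + 0.05 * rng.normal(size=n_len)

        # Training classes: coincident (real photon) vs. non-coincident (ringing).
        X1 = np.array([transient(ring=False) for _ in range(200)])
        X0 = np.array([transient(ring=True) for _ in range(200)])

        # Fisher linear discriminant: w = Sw^-1 (mu1 - mu0).
        mu1, mu0 = X1.mean(axis=0), X0.mean(axis=0)
        Sw = np.cov(X1, rowvar=False) + np.cov(X0, rowvar=False)
        w = np.linalg.solve(Sw + 1e-6 * np.eye(n_len), mu1 - mu0)

        # Apply as a sliding digital filter: peaks in the output mark genuine
        # photon arrivals, while ringing distortions are suppressed.
        waveform = np.zeros(1000)
        for t0 in (100, 400, 401, 700):              # photon arrivals (two nearly coincident)
            waveform[t0:t0 + n_len] += transient(ring=False)
        waveform[550:550 + n_len] += transient(ring=True)    # a ringing artefact
        filtered = np.correlate(waveform, w, mode="same")
        print("candidate photon times:",
              np.where(filtered > 0.5 * filtered.max())[0][:10])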

  8. Deconvolution of petroleum mixtures using mid-FTIR analysis and non-negative matrix factorization

    NASA Astrophysics Data System (ADS)

    Livanos, George; Zervakis, Michalis; Pasadakis, Nikos; Karelioti, Marouso; Giakos, George

    2016-11-01

    The aim of this study is to develop an efficient, robust and cost-effective methodology capable of both identifying the chemical fractions in complex commercial petroleum products and numerically estimating their concentration within the mixture sample. We explore a methodology based on attenuated total reflectance Fourier transform infrared (ATR-FTIR) analytical signals, combined with a modified factorization algorithm, to solve this ‘mixture problem’, first in qualitative and then in quantitative mode. The proposed decomposition approach is self-adapting to data without prior knowledge and is able to accurately estimate the weight contributions of constituents in the entire chemical compound. The results of applying the presented method to petroleum analysis indicate that it is possible to deconvolve the mixing process and recover the content of a chemically complex petroleum mixture using the infrared signals of a limited number of samples and the principal substances forming the mixture. A focus application of the proposed methodology is the quality control of commercial gasoline by identifying and quantifying the individual fractions utilized for its formulation via a fast, robust and efficient procedure based on mathematical analysis of the acquired spectra.
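
    The deconvolution concept can be illustrated with an off-the-shelf non-negative matrix factorization: mixture spectra are approximated as non-negative concentrations times non-negative component spectra. The sketch below uses synthetic "spectra" and scikit-learn's NMF as a generic stand-in for the modified factorization algorithm of the study; band positions, fraction names and blend compositions are invented.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(6)
        n_wavenumbers = 400
        wn = np.linspace(650, 4000, n_wavenumbers)          # cm^-1 axis (illustrative)

        def band(center, width, height):
            return height * np.exp(-0.5 * ((wn - center) / width) ** 2)

        # Three hypothetical pure-fraction spectra.
        pure = np.vstack([
            band(2920, 60, 1.0) + band(1460, 40, 0.5),
            band(3030, 50, 0.8) + band(1600, 30, 0.6),
            band(1170, 40, 0.7) + band(2960, 70, 0.9),
        ])

        # Simulated commercial blends = non-negative mixtures of the pure spectra.
        C_true = rng.dirichlet(np.ones(3), size=20)         # 20 samples, fractions sum to 1
        X = C_true @ pure + 0.01 * rng.random((20, n_wavenumbers))

        # Non-negative matrix factorization: X ≈ W H with W, H >= 0.
        model = NMF(n_components=3, init="nndsvda", max_iter=2000, random_state=0)
        W = model.fit_transform(X)                          # sample-by-component weights
        H = model.components_                               # recovered component "spectra"

        # Normalise rows of W so they can be read as concentration estimates
        # (component order is arbitrary, as with any unsupervised factorization).
        conc = W / W.sum(axis=1, keepdims=True)
        print("estimated fractions of sample 0:", np.round(conc[0], 2))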

  9. Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. These are globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convection systems. Basic functions include selection of area of

  10. Java Radar Analysis Tool

    NASA Technical Reports Server (NTRS)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  11. Blind decorrelation and deconvolution algorithm for multiple-input multiple-output system: II. Analysis and simulation

    NASA Astrophysics Data System (ADS)

    Chen, Da-Ching; Yu, Tommy; Yao, Kung; Pottie, Gregory J.

    1999-11-01

    For single-input multiple-output (SIMO) systems, blind deconvolution based on second-order statistics has been shown to be promising, given that the sources and channels meet certain assumptions. In our previous paper we extended the work to multiple-input multiple-output (MIMO) systems by introducing a blind deconvolution algorithm to remove all channel dispersion, followed by a blind decorrelation algorithm to separate different sources from their instantaneous mixture. In this paper we first explore more of the details embedded in our algorithm. Then we present simulation results to show that our algorithm is applicable to MIMO systems excited by a broad class of signals such as speech, music and digitally modulated symbols.

  12. Wavespace-Based Coherent Deconvolution

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Cattafesta, Louis N., III

    2012-01-01

    Array deconvolution is commonly used in aeroacoustic analysis to remove the influence of a microphone array's point spread function from a conventional beamforming map. Unfortunately, the majority of deconvolution algorithms assume that the acoustic sources in a measurement are incoherent, which can be problematic for some aeroacoustic phenomena with coherent, spatially-distributed characteristics. While several algorithms have been proposed to handle coherent sources, some are computationally intractable for many problems while others require restrictive assumptions about the source field. Newer generalized inverse techniques hold promise, but are still under investigation for general use. An alternate coherent deconvolution method is proposed based on a wavespace transformation of the array data. Wavespace analysis offers advantages over curved-wave array processing, such as providing an explicit shift-invariance in the convolution of the array sampling function with the acoustic wave field. However, usage of the wavespace transformation assumes the acoustic wave field is accurately approximated as a superposition of plane wave fields, regardless of true wavefront curvature. The wavespace technique leverages Fourier transforms to quickly evaluate a shift-invariant convolution. The method is derived for and applied to ideal incoherent and coherent plane wave fields to demonstrate its ability to determine magnitude and relative phase of multiple coherent sources. Multi-scale processing is explored as a means of accelerating solution convergence. A case with a spherical wave front is evaluated. Finally, a trailing edge noise experiment case is considered. Results show the method successfully deconvolves incoherent, partially-coherent, and coherent plane wave fields to a degree necessary for quantitative evaluation. Curved wave front cases warrant further investigation. A potential extension to nearfield beamforming is proposed.

  13. FSSC Science Tools: Pulsar Analysis

    NASA Technical Reports Server (NTRS)

    Thompson, Dave

    2010-01-01

    This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

  14. Sight Application Analysis Tool

    SciTech Connect

    Bronevetsky, G.

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  15. Logistics Process Analysis Tool

    SciTech Connect

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra-Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  17. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

    The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundreds of researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
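
    The matrix-based scanning mentioned above can be reduced to a few lines: convert a count matrix to log-odds scores against a background model and slide it along the sequence. The sketch below uses a toy 4-bp motif and a fixed score threshold; RSAT additionally supports Markov backgrounds and analytical P-values, which are not reproduced here.

        import math

        # Toy count matrix for a 4-bp motif (keys = bases, lists = positions).
        counts = {
            "A": [8, 0, 1, 9],
            "C": [1, 9, 0, 0],
            "G": [0, 1, 9, 0],
            "T": [1, 0, 0, 1],
        }
        background = {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}
        motif_len = 4
        total = sum(counts[b][0] for b in "ACGT")

        # Log-odds weight matrix with a pseudocount of 1.
        pssm = {b: [math.log(((counts[b][i] + 1) / (total + 4)) / background[b])
                    for i in range(motif_len)]
                for b in "ACGT"}

        def scan(sequence, threshold=2.0):
            """Return (position, score) for every window scoring above threshold."""
            hits = []
            for i in range(len(sequence) - motif_len + 1):
                window = sequence[i:i + motif_len]
                score = sum(pssm[base][j] for j, base in enumerate(window))
                if score >= threshold:
                    hits.append((i, round(score, 2)))
            return hits

        print(scan("TTACGAGGACAACGTTT"))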

  18. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS computing management board decided to integrate the collaboration's distributed analysis efforts into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  19. Subcellular Microanatomy by 3D Deconvolution Brightfield Microscopy: Method and Analysis Using Human Chromatin in the Interphase Nucleus

    PubMed Central

    Tadrous, Paul Joseph

    2012-01-01

    Anatomy has advanced using 3-dimensional (3D) studies at macroscopic (e.g., dissection, injection moulding of vessels, radiology) and microscopic (e.g., serial section reconstruction with light and electron microscopy) levels. This paper presents the first results in human cells of a new method of subcellular 3D brightfield microscopy. Unlike traditional 3D deconvolution and confocal techniques, this method is suitable for general application to brightfield microscopy. Unlike brightfield serial sectioning, it has subcellular resolution. Results are presented of the 3D structure of chromatin in the interphase nucleus of two human cell types, hepatocyte and plasma cell. I show how the freedom to examine these structures in 3D allows greater morphological discrimination between and within cell types, and the 3D structural basis for the classical “clock-face” motif of the plasma cell nucleus is revealed. The potential for further applications is discussed. PMID:22567315

  20. Tiling Microarray Analysis Tools

    SciTech Connect

    Nix, Davis Austin

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: (1) rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment); (2) post-processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation); (3) significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and interval refinement (filtering based on multiple statistics, overlap comparisons); (4) data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and data reports (spreadsheet summaries and detailed profiles).

  1. Sandia PUF Analysis Tool

    SciTech Connect

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.

  4. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  5. Extension of deconvolution algorithms for the mapping of moving acoustic sources.

    PubMed

    Fleury, Vincent; Bulté, Jean

    2011-03-01

    Several deconvolution algorithms are commonly used in aeroacoustics to estimate the power level radiated by static sources, for instance, the deconvolution approach for the mapping of acoustic sources (DAMAS), DAMAS2, CLEAN, and the CLEAN based on spatial source coherence algorithm (CLEAN-SC). However, few efficient methodologies are available for moving sources. In this paper, several deconvolution approaches are proposed to estimate the narrow-band spectra of low-Mach number uncorrelated sources. All of them are based on a beamformer output. Due to velocity, the beamformer output is inherently related to the source spectra over the whole frequency range, which makes the deconvolution very complex from a computational point of view. Using the conventional Doppler approximation and for limited time analysis, the problem can be separated into multiple independent problems, each involving a single source frequency, as for static sources. DAMAS, DAMAS2, CLEAN, and CLEAN-SC are then extended to moving sources. These extensions are validated from both synthesized data and real aircraft flyover noise measurements. Comparable performances to those of the corresponding static methodologies are recovered. All these approaches constitute complementary and efficient tools in order to quantify the noise level emitted from moving acoustic sources.
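
    For orientation, the static DAMAS formulation that these extensions build on can be sketched very compactly: the beamformer map is modelled as the (known) point spread matrix applied to non-negative source powers, and the powers are recovered by Gauss-Seidel iterations clipped at zero. The example below is a 1-D toy with an assumed Gaussian point spread function; the moving-source, Doppler-corrected variants of the paper are not reproduced.

        import numpy as np

        rng = np.random.default_rng(7)

        # Toy scan grid of 40 candidate source points and a known point spread
        # matrix (rows: beamformer steering points, columns: true source points).
        n = 40
        psf = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 2.0) ** 2)

        # Two uncorrelated sources; the beamformer map is their powers blurred by the PSF.
        x_true = np.zeros(n)
        x_true[12], x_true[25] = 1.0, 0.5
        b = psf @ x_true + 1e-3 * rng.random(n)

        # DAMAS-style Gauss-Seidel iterations with a non-negativity clip.
        x = np.zeros(n)
        for _ in range(500):
            for i in range(n):
                residual = b[i] - psf[i] @ x + psf[i, i] * x[i]
                x[i] = max(residual / psf[i, i], 0.0)

        print("recovered peaks at", np.argsort(x)[-2:],
              "with levels", np.round(np.sort(x)[-2:], 2))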

  6. Windprofiler optimization using digital deconvolution procedures

    NASA Astrophysics Data System (ADS)

    Hocking, W. K.; Hocking, A.; Hocking, D. G.; Garbanzo-Salas, M.

    2014-10-01

    Digital improvements to data acquisition procedures used for windprofiler radars have the potential for improving the height coverage at optimum resolution, and permit improved height resolution. A few newer systems already use this capability. Real-time deconvolution procedures offer even further optimization, and this has not been effectively employed in recent years. In this paper we demonstrate the advantages of combining these features, with particular emphasis on the advantages of real-time deconvolution. Using several multi-core CPUs, we have been able to achieve speeds of up to 40 GHz from a standard commercial motherboard, allowing data to be digitized and processed without the need for any type of hardware except for a transmitter (and associated drivers), a receiver and a digitizer. No Digital Signal Processor chips are needed, allowing great flexibility with analysis algorithms. By using deconvolution procedures, we have then been able to not only optimize height resolution, but also have been able to make advances in dealing with spectral contaminants like ground echoes and other near-zero-Hz spectral contamination. Our results also demonstrate the ability to produce fine-resolution measurements, revealing small-scale structures within the backscattered echoes that were previously not possible to see. Resolutions of 30 m are possible for VHF radars. Furthermore, our deconvolution technique allows the removal of range-aliasing effects in real time, a major bonus in many instances. Results are shown using new radars in Canada and Costa Rica.

  7. Bayesian least squares deconvolution

    NASA Astrophysics Data System (ADS)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
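
    For context, ordinary (non-Bayesian) LSD can be sketched as a weighted least-squares problem: the observed polarised spectrum is modelled as a line mask convolved with a common profile, and the profile is the normal-equations solution. The example below is synthetic, with invented line positions and weights; the paper replaces this estimator with a Gaussian-process prior and full Bayesian inference, which is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(8)

        # Velocity grid for the common (LSD) profile and a toy Stokes V shape.
        v = np.linspace(-50, 50, 101)                      # km/s
        z_true = 0.8 * (v / 15.0) * np.exp(-0.5 * (v / 15.0) ** 2)

        # Line mask: pixel positions and weights of 300 spectral lines (invented).
        n_pix, n_lines = 4000, 300
        centres = rng.integers(100, n_pix - 100, size=n_lines)
        weights = rng.uniform(0.1, 1.0, size=n_lines)

        # Design matrix M: observed spectrum = M @ z + noise.
        M = np.zeros((n_pix, v.size))
        half = v.size // 2
        for c, wgt in zip(centres, weights):
            M[c - half:c + half + 1, :] += wgt * np.eye(v.size)

        noise = 0.02
        spectrum = M @ z_true + noise * rng.normal(size=n_pix)

        # Weighted least squares: z = (M^T S^-2 M)^-1 M^T S^-2 y  (constant noise here).
        lhs = M.T @ M / noise ** 2
        rhs = M.T @ spectrum / noise ** 2
        z_lsd = np.linalg.solve(lhs, rhs)
        print("rms error of recovered LSD profile:", float(np.std(z_lsd - z_true)))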

  8. Water Quality Analysis Tool (WQAT)

    EPA Science Inventory

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-pro...

  9. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1994-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  10. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  11. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft Excel spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate finite element models for grid-stiffened structures. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  12. Predictive deconvolution and hybrid feature selection for computer-aided detection of prostate cancer.

    PubMed

    Maggio, Simona; Palladini, Alessandro; Marchi, Luca De; Alessandrini, Martino; Speciale, Nicolò; Masetti, Guido

    2010-02-01

    Computer-aided detection (CAD) schemes are decision-making support tools, useful to overcome limitations of problematic clinical procedures. Trans-rectal ultrasound image-based CAD would be extremely important to support prostate cancer diagnosis. An effective approach to realize a CAD scheme for this purpose is described in this work, employing a multi-feature kernel classification model based on generalized discriminant analysis. The mutual information of feature value and tissue pathological state is used to select features essential for tissue characterization. System-dependent effects are reduced through predictive deconvolution of the acquired radio-frequency signals. A clinical study, performed on ground truth images from biopsy findings, provides a comparison of the classification model applied before and after deconvolution, showing in the latter case a significant gain in accuracy and area under the receiver operating characteristic curve.
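    Predictive (Wiener) deconvolution of radio-frequency traces can be sketched as designing a prediction-error filter from the trace autocorrelation and subtracting the predicted part. The snippet below is a generic illustration under that assumption, not the authors' pipeline; the lag count, prediction gap, and stabilizer are arbitrary placeholders.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

def predictive_deconvolution(x, n_lags=20, gap=1, eps=1e-3):
    """Wiener prediction-error filtering of a 1-D RF trace.

    A length-n_lags prediction filter (prediction distance gap) is designed
    from the signal autocorrelation and its prediction is subtracted,
    whitening the trace and suppressing system-dependent reverberation.
    """
    # biased autocorrelation estimate, normalized at zero lag
    r = np.correlate(x, x, mode='full')[len(x) - 1:]
    r = r / r[0]
    # Toeplitz normal equations with a small white-noise stabilizer
    c = r[:n_lags].copy()
    c[0] += eps
    a = solve_toeplitz(c, r[gap:gap + n_lags])
    # prediction-error output: x[t] minus its prediction from past samples
    predicted = lfilter(np.r_[np.zeros(gap), a], 1.0, x)
    return x - predicted
```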

  13. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in numerical format, as well as in color-coded, spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.

  14. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.

  15. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  16. Automated resolution of chromatographic signals by independent component analysis-orthogonal signal deconvolution in comprehensive gas chromatography/mass spectrometry-based metabolomics.

    PubMed

    Domingo-Almenara, Xavier; Perera, Alexandre; Ramírez, Noelia; Brezmes, Jesus

    2016-07-01

    Comprehensive gas chromatography-mass spectrometry (GC×GC-MS) provides a different perspective in metabolomics profiling of samples. However, algorithms for GC×GC-MS data processing are needed in order to automatically process the data and extract the purest information about the compounds appearing in complex biological samples. This study shows the capability of independent component analysis-orthogonal signal deconvolution (ICA-OSD), an algorithm based on blind source separation and distributed in an R package called osd, to extract the spectra of the compounds appearing in GC×GC-MS chromatograms in an automated manner. We studied the performance of ICA-OSD by the quantification of 38 metabolites through a set of 20 Jurkat cell samples analyzed by GC×GC-MS. The quantification by ICA-OSD was compared with a supervised quantification by selective ions, and most of the R² coefficients of determination were in good agreement (R² > 0.90) while up to 24 cases exhibited an excellent linear relation (R² > 0.95). We concluded that ICA-OSD can be used to resolve co-eluted compounds in GC×GC-MS. PMID:27208528
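    ICA-OSD itself is distributed in the R package osd; purely as an illustration of the blind-source-separation step, the following Python sketch applies scikit-learn's FastICA to a synthetic co-elution window to separate elution profiles from spectra. The window construction and component count are assumptions for the toy example.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Toy co-elution window: 2 overlapping compounds, rows = scans, cols = m/z bins
rng = np.random.default_rng(0)
scans = np.arange(200)
elution_true = np.stack([np.exp(-0.5 * ((scans - 90) / 10) ** 2),
                         np.exp(-0.5 * ((scans - 110) / 12) ** 2)], axis=1)
spectra_true = rng.random((2, 300))            # stand-in mass spectra
X = elution_true @ spectra_true + rng.normal(0, 0.01, (200, 300))

ica = FastICA(n_components=2, random_state=0, max_iter=1000)
elution_est = ica.fit_transform(X)   # estimated elution profiles (scans x 2)
spectra_est = ica.mixing_            # estimated spectra (m/z bins x 2)
```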

  17. Automated resolution of chromatographic signals by independent component analysis-orthogonal signal deconvolution in comprehensive gas chromatography/mass spectrometry-based metabolomics.

    PubMed

    Domingo-Almenara, Xavier; Perera, Alexandre; Ramírez, Noelia; Brezmes, Jesus

    2016-07-01

    Comprehensive gas chromatography-mass spectrometry (GC×GC-MS) provides a different perspective in metabolomics profiling of samples. However, algorithms for GC×GC-MS data processing are needed in order to automatically process the data and extract the purest information about the compounds appearing in complex biological samples. This study shows the capability of independent component analysis-orthogonal signal deconvolution (ICA-OSD), an algorithm based on blind source separation and distributed in an R package called osd, to extract the spectra of the compounds appearing in GC×GC-MS chromatograms in an automated manner. We studied the performance of ICA-OSD by the quantification of 38 metabolites through a set of 20 Jurkat cell samples analyzed by GC×GC-MS. The quantification by ICA-OSD was compared with a supervised quantification by selective ions, and most of the R² coefficients of determination were in good agreement (R² > 0.90) while up to 24 cases exhibited an excellent linear relation (R² > 0.95). We concluded that ICA-OSD can be used to resolve co-eluted compounds in GC×GC-MS.

  18. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

  19. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

  20. General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P. (Compiler)

    2016-01-01

    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-Rex. This talk is a combination of existing presentations; a GMAT basics and overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project.

  1. Climate Data Analysis Tools - (CDAT)

    NASA Astrophysics Data System (ADS)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and as a result of its flexibility, PCMDI has integrated other popular software components, such as: the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware

  2. Deconvolution using the complex cepstrum

    SciTech Connect

    Riley, H B

    1980-12-01

    The theory, description, and implementation of a generalized linear filtering system for the nonlinear filtering of convolved signals are presented. A detailed look at the problems and requirements associated with the deconvolution of signal components is undertaken. Related properties are also developed. A synthetic example is shown and is followed by an application using real seismic data. 29 figures.
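    A minimal homomorphic (complex-cepstrum) deconvolution can be sketched in a few lines of numpy, as below. The phase handling is simplified (no removal of the linear phase term) and the quefrency cutoff is an arbitrary assumption, so this is an illustrative sketch rather than a production implementation of the system described in the report.

```python
import numpy as np

def complex_cepstrum(x):
    """Complex cepstrum via FFT, log magnitude and unwrapped phase."""
    X = np.fft.fft(x)
    log_X = np.log(np.abs(X)) + 1j * np.unwrap(np.angle(X))
    return np.fft.ifft(log_X).real

def homomorphic_deconvolution(x, cutoff):
    """Split a convolved signal into a smooth (low-quefrency) component,
    e.g. a source wavelet, and the remaining (high-quefrency) component."""
    c = complex_cepstrum(x)
    window = np.zeros(len(x))
    window[:cutoff] = 1.0
    window[-cutoff + 1:] = 1.0          # keep the symmetric low-quefrency part

    def reconstruct(cep):
        # invert the cepstrum: exp of its spectrum, then back to time domain
        return np.fft.ifft(np.exp(np.fft.fft(cep))).real

    smooth = reconstruct(c * window)
    detail = reconstruct(c * (1.0 - window))
    return smooth, detail
```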

  3. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  4. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. The strengths and limitations of some of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data-flow paradigm.

  5. Modeling of pharmacokinetic systems using stochastic deconvolution.

    PubMed

    Kakhi, Maziar; Chittenden, Jason

    2013-12-01

    In environments where complete mechanistic knowledge of the system dynamics is not available, a synergy of first-principle concepts, stochastic methods and statistical approaches can provide an efficient, accurate, and insightful strategy for model development. In this work, a system of ordinary differential equations describing system pharmacokinetics (PK) was coupled to a Wiener process for tracking the absorption rate coefficient, and was embedded in a nonlinear mixed effects population PK formalism. The procedure is referred to as "stochastic deconvolution" and it is proposed as a diagnostic tool to inform on a mapping function between the fraction of the drug absorbed and the fraction of the drug dissolved when applying one-stage methods to in vitro-in vivo correlation modeling. The goal of this work was to show that stochastic deconvolution can infer an a priori specified absorption profile given dense observational (simulated) data. The results demonstrate that the mathematical model is able to accurately reproduce the simulated data in scenarios where solution strategies for linear, time-invariant systems would assuredly fail. To this end, PK systems that are representative of Michaelis-Menten kinetics and enterohepatic circulation were investigated. Furthermore, the solution times are manageable using a modest computer hardware platform.
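    To make the construction concrete, the sketch below simulates a one-compartment oral-absorption model whose log absorption-rate coefficient follows a Wiener process, integrated with Euler-Maruyama. The model structure, parameter values, and the log-space choice for positivity are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def simulate_pk_with_stochastic_ka(dose=100.0, V=10.0, kel=0.2,
                                   ka0=1.0, sigma=0.3,
                                   t_end=24.0, dt=0.01, seed=0):
    """Euler-Maruyama sketch of a one-compartment oral-absorption model
    in which log(ka) follows a Wiener process (kept in log space so the
    absorption rate coefficient stays positive).  Illustrative only."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n + 1)
    a_gut = np.empty(n + 1); conc = np.empty(n + 1); log_ka = np.empty(n + 1)
    a_gut[0], conc[0], log_ka[0] = dose, 0.0, np.log(ka0)
    for i in range(n):
        ka = np.exp(log_ka[i])
        a_gut[i + 1] = a_gut[i] - ka * a_gut[i] * dt
        conc[i + 1] = conc[i] + (ka * a_gut[i] / V - kel * conc[i]) * dt
        log_ka[i + 1] = log_ka[i] + sigma * np.sqrt(dt) * rng.standard_normal()
    return t, conc, np.exp(log_ka)
```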

  6. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, minimal air pollutant emissions, and the utilization of a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to the existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the acquired SEM images. With these image analysis capabilities, a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is

  7. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  8. AIDA: Adaptive Image Deconvolution Algorithm

    NASA Astrophysics Data System (ADS)

    Hom, Erik; Haase, Sebastian; Marchis, Franck

    2013-10-01

    AIDA is an implementation and extension of the MISTRAL myopic deconvolution method developed by Mugnier et al. (2004) (see J. Opt. Soc. Am. A 21:1841-1854). The MISTRAL approach has been shown to yield object reconstructions with excellent edge preservation and photometric precision when used to process astronomical images. AIDA improves upon the original MISTRAL implementation. AIDA, written in Python, can deconvolve multiple frame data and three-dimensional image stacks encountered in adaptive optics and light microscopic imaging.

  9. Target deconvolution techniques in modern phenotypic profiling

    PubMed Central

    Lee, Jiyoun; Bogyo, Matthew

    2013-01-01

    The past decade has seen rapid growth in the use of diverse compound libraries in classical phenotypic screens to identify modulators of a given process. The subsequent process of identifying the molecular targets of active hits, also called ‘target deconvolution’, is an essential step for understanding compound mechanism of action and for using the identified hits as tools for further dissection of a given biological process. Recent advances in ‘omics’ technologies, coupled with in silico approaches and the reduced cost of whole genome sequencing, have greatly improved the workflow of target deconvolution and have contributed to a renaissance of ‘modern’ phenotypic profiling. In this review, we will outline how both new and old techniques are being used in the difficult process of target identification and validation as well as discuss some of the ongoing challenges remaining for phenotypic screening. PMID:23337810

  10. General Mission Analysis Tool (GMAT) Mathematical Specifications

    NASA Technical Reports Server (NTRS)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  11. Spatially varying regularization of deconvolution in 3D microscopy.

    PubMed

    Seo, J; Hwang, S; Lee, J-M; Park, H

    2014-08-01

    Confocal microscopy has become an essential tool to explore biospecimens in 3D. Confocal microscopy images are still degraded by out-of-focus blur and Poisson noise. Many deconvolution methods, including the Richardson-Lucy (RL) method, the Tikhonov method, and the split-gradient (SG) method, have been well received. The RL deconvolution method results in enhanced image quality, especially for Poisson noise. The Tikhonov deconvolution method improves on the RL method by imposing a prior model of spatial regularization, which encourages adjacent voxels to appear similar. The SG method also contains spatial regularization and is capable of incorporating many edge-preserving priors, resulting in improved image quality. The strength of the spatial regularization is fixed regardless of spatial location for the Tikhonov and SG methods. The Tikhonov and SG deconvolution methods are improved upon in this study by allowing the strength of the spatial regularization to differ for different spatial locations in a given image. The novel method shows improved image quality. The method was tested on phantom data for which the ground truth and the point spread function are known. A Kullback-Leibler (KL) divergence value of 0.097 is obtained when applying spatially variable regularization to the SG method, whereas a KL value of 0.409 is obtained with the Tikhonov method. In tests on real data, for which the ground truth is unknown, the reconstructed data show improved noise characteristics while maintaining important image features such as edges.
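    The flavor of a spatially varying regularization can be sketched with a one-step-late Richardson-Lucy iteration in which the penalty weight is a per-voxel map; setting lam_map to zero recovers plain RL. This is a simplified stand-in under stated assumptions, not the split-gradient implementation evaluated in the paper.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

def rl_spatially_regularized(image, psf, lam_map, n_iter=30, eps=1e-12):
    """Richardson-Lucy deconvolution with a spatially varying smoothness
    penalty applied in one-step-late fashion.

    image   : observed 2-D image (float array)
    psf     : point spread function kernel
    lam_map : per-pixel regularization strength (same shape as image)
    """
    psf_flip = psf[::-1, ::-1]
    estimate = np.full(image.shape, image.mean())
    for _ in range(n_iter):
        blurred = convolve(estimate, psf, mode='reflect')
        ratio = image / (blurred + eps)
        correction = convolve(ratio, psf_flip, mode='reflect')
        # gradient of a quadratic roughness penalty: value minus local mean
        roughness = estimate - uniform_filter(estimate, size=3)
        estimate = estimate * correction / (1.0 + lam_map * roughness)
        estimate = np.clip(estimate, 0.0, None)
    return estimate
```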

  12. Deconvolution procedure of the UV-vis spectra. A powerful tool for the estimation of the binding of a model drug to specific solubilisation loci of bio-compatible aqueous surfactant-forming micelle.

    PubMed

    Calabrese, Ilaria; Merli, Marcello; Turco Liveri, Maria Liria

    2015-05-01

    The evolution of the UV-vis spectra of Nile Red loaded into Tween 20 micelles with pH and [Tween 20] has been analysed in a non-conventional manner by exploiting the deconvolution method. The number of buried sub-bands has been found to depend on both pH and bio-surfactant concentration; their positions have been associated with Nile Red confined in aqueous solution and in the three micellar solubilisation sites. For the first time, by using an extended classical two-pseudo-phase model, the robust treatment of the spectrophotometric data allows the estimation of the Nile Red binding constant to the available loci. Hosting capability towards Nile Red is enhanced as pH increases. Comparison between the binding constant values classically evaluated and those estimated by the deconvolution protocol unveiled that the overall binding values perfectly match the mean values of the local binding sites. This result suggests that the deconvolution procedure provides more precise and reliable values, which are more representative of drug confinement. PMID:25703359
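    In practice, this kind of sub-band deconvolution amounts to fitting a sum of Gaussian bands to the measured spectrum. The sketch below does so with scipy.optimize.curve_fit on synthetic data; the band count, centres, and widths are placeholders, not the values reported in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussians(wl, *params):
    """Sum of Gaussian sub-bands; params = (amp, centre, width) per band."""
    y = np.zeros_like(wl, dtype=float)
    for amp, centre, width in zip(params[0::3], params[1::3], params[2::3]):
        y += amp * np.exp(-0.5 * ((wl - centre) / width) ** 2)
    return y

# synthetic absorbance spectrum with three hidden bands
wl = np.linspace(450, 650, 400)
true = [0.4, 530, 15, 0.7, 560, 18, 0.3, 600, 20]
absorbance = gaussians(wl, *true) + np.random.normal(0, 0.005, wl.size)

p0 = [0.5, 525, 10, 0.5, 565, 10, 0.5, 605, 10]   # initial guess per band
popt, _ = curve_fit(gaussians, wl, absorbance, p0=p0)
print(popt.reshape(-1, 3))   # fitted (amplitude, centre, width) of each band
```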

  13. A new scoring function for top-down spectral deconvolution

    DOE PAGES

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.

  14. A new scoring function for top-down spectral deconvolution

    SciTech Connect

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.
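    The published L-score is not reproduced here; as a generic stand-in, the sketch below scores an observed isotopomer envelope against a theoretical one with a simple normalized dot product, which is the kind of comparison an envelope-scoring function formalizes. The envelope values are made up for illustration.

```python
import numpy as np

def envelope_score(observed, theoretical):
    """Generic similarity between an observed isotopomer envelope and a
    theoretical one (both as peak-intensity vectors aligned by isotope
    index).  Not the published L-score; a plain normalized dot product."""
    obs = np.asarray(observed, dtype=float)
    theo = np.asarray(theoretical, dtype=float)
    obs = obs / (np.linalg.norm(obs) or 1.0)
    theo = theo / (np.linalg.norm(theo) or 1.0)
    return float(obs @ theo)

# averagine-like theoretical envelope vs. a noisy observation
theoretical = np.array([0.30, 0.45, 0.35, 0.20, 0.10])
observed = theoretical + np.random.normal(0, 0.03, theoretical.size)
print(envelope_score(observed, theoretical))
```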

  15. Model Analysis ToolKit

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy - calibration (scipy.optimize)
    - rpy2 - Python interface to R

  16. Model Analysis ToolKit

    SciTech Connect

    Harp, Dylan R.

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes:
    - define parameters
    - define observations
    - define model (python function)
    - define samplesets (sets of parameter combinations)
    Currently supported functionality includes:
    - forward model runs
    - Latin-Hypercube sampling of parameters
    - multi-dimensional parameter studies
    - parallel execution of parameter samples
    - model calibration using internal Levenberg-Marquardt algorithm
    - model calibration using lmfit package
    - model calibration using levmar package
    - Markov Chain Monte Carlo using pymc package
    MATK facilitates model analysis using:
    - scipy - calibration (scipy.optimize)
    - rpy2 - Python interface to R
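    MATK's own API is not shown here; as a generic illustration of the Levenberg-Marquardt calibration step it wraps, the sketch below fits a toy exponential model to synthetic observations with scipy.optimize.least_squares. The model form, parameter names, and data are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import least_squares

def model(params, t):
    """Toy forward model: exponential decay with two parameters."""
    amplitude, rate = params
    return amplitude * np.exp(-rate * t)

# synthetic observations standing in for field or experimental data
t = np.linspace(0.0, 10.0, 50)
obs = model([5.0, 0.3], t) + np.random.normal(0.0, 0.1, t.size)

def residuals(params):
    return model(params, t) - obs

# Levenberg-Marquardt calibration (the same class of algorithm MATK wraps)
fit = least_squares(residuals, x0=[1.0, 1.0], method='lm')
print(fit.x)   # calibrated parameter estimates
```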

  17. Fourier Transform Based Deconvolution Analysis of Frequency Modulation Lineshapes; Fluorocarbon Radical Densities in AN Ecr Etcher from Infrared Diode Laser Spectroscopy.

    NASA Astrophysics Data System (ADS)

    Wu, Jian-Zhong

    1995-11-01

    Highly sensitive modulation detection techniques produce a distorted and more complicated form of the original line shape. In this thesis, many existing theories for the analysis of these modulation lineshapes are compared mathematically. The Fourier transform of the modulation spectrum is shown to always be the product of the Fourier transform of the original spectrum and a known modulation function. If the latter is divided out, the inverse transform gives the original spectrum as it would have appeared with no modulation. Mathematical expressions of the modulation functions are also derived for several more general modulation waveforms: one-tone frequency modulation by an arbitrary waveform, two-tone frequency modulation (TTFM) by sine waveforms, and tone-burst frequency modulation (TBFM) by sine waveforms. The two seemingly different modulation schemes, TTFM and TBFM, are proven to be actually the same when no amplitude modulation is involved and the modulation index is low. Based on the fast Fourier transform (FFT), the deconvolution procedure for one-tone low-frequency wavelength modulation has been implemented in FORTRAN. This recovery procedure is applied to computer-simulated modulation spectra of a Gaussian and then to actual multiple-line, second-harmonic diode laser spectral data taken from a plasma. Also included are infrared absorption measurements of CF, CF2, CF3, and COF2 in an electron cyclotron resonance (ECR) plasma etcher with CHF3 and C2H2F4 as the main feed gases. The role of CFx (x = 1, 2, 3) radicals in Si/SiO2 etch selectivity has been examined by measuring the neutral CFx absolute concentrations in the gas phase for the same sets of conditions employed in etch rate measurements. We have observed P(10.5) and P(11.5) of the ²Π1/2 state and P(10.5) of the ²Π3/2 state in the fundamental band of CF in the ECR etcher, and used these transitions in CF absolute density determinations in C2H2F4 or CHF3 in the power range of 500 W to 1200 W and
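    The recovery step described above (divide the Fourier transform of the modulation spectrum by the known modulation function and invert) can be sketched as follows. The modulation transfer function is assumed to be given on the matching frequency grid, and a small stabilizer replaces the bare division; this is an illustration, not the FORTRAN implementation from the thesis.

```python
import numpy as np

def demodulate_lineshape(mod_spectrum, mod_function, eps=1e-3):
    """Recover the unmodulated lineshape from a modulation spectrum by
    dividing out the modulation function in the Fourier domain.

    mod_spectrum : sampled modulation (e.g. 2f wavelength-modulation) signal
    mod_function : Fourier-domain modulation transfer function on the same
                   frequency grid (assumed known analytically)
    eps          : Tikhonov-style stabilizer where mod_function is small
    """
    S = np.fft.fft(mod_spectrum)
    H = np.asarray(mod_function, dtype=complex)
    # regularized division instead of a bare S / H
    recovered = np.fft.ifft(S * np.conj(H) / (np.abs(H) ** 2 + eps))
    return recovered.real
```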

  18. SHARAD Radargram Analysis Tool Development in JMARS

    NASA Astrophysics Data System (ADS)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  19. 2010 Solar Market Transformation Analysis and Tools

    SciTech Connect

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  20. Budget Risk & Prioritization Analysis Tool

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  1. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  2. Shock and Discontinuities Analysis Tool (SDAT)

    NASA Astrophysics Data System (ADS)

    Viñas, A. F.; Holland, M. P.

    2005-05-01

    We have developed an analysis/visualization tool to study shocks and other discontinuities from satellite observations of plasma and magnetic field data. The tool uses an extension of the Viñas-Scudder analysis method based on the Rankine-Hugoniot conservation equations and published in JGR (1985). SDAT provides shock parameters such as the shock normal vector n, shock speed Us, the angle between the normal and the upstream magnetic field ΘBn, Alfvén and magnetosonic Mach numbers, the deHoffman-Teller velocity, and many other important parameters that describe the shock. SDAT was developed fully in IDL. As currently configured, SDAT reads ASCII data from any space mission for the analysis. All data displays and graphics generated by SDAT are written in PostScript, and a summary of the analysis is generated as an ASCII file. The input plasma (ρ, V, Tp and Te) and magnetic field (B) data are read from different files at their own native resolution. The tool allows for data zooming in time, and the input data can be in any coordinate system. Examples of real satellite data are included to facilitate the learning process in the usage of the tool. This tool is available to the space physics community. As configured, the tool is basically complete for shock analysis; however, work continues to generalize SDAT for the analysis of other types of discontinuities.
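    One small piece of this kind of analysis, estimating the shock normal from magnetic coplanarity, can be sketched as below. It is a textbook single-spacecraft approximation, not the full Rankine-Hugoniot fit performed by SDAT (which is written in IDL rather than Python); the field values in the example are invented.

```python
import numpy as np

def coplanarity_normal(b_up, b_down):
    """Magnetic-coplanarity estimate of the shock normal from averaged
    upstream and downstream magnetic field vectors (any Cartesian frame).
    Also returns theta_Bn, the angle between the normal and upstream B."""
    b_up, b_down = np.asarray(b_up, float), np.asarray(b_down, float)
    n = np.cross(np.cross(b_down, b_up), b_down - b_up)
    n = n / np.linalg.norm(n)
    theta_bn = np.degrees(np.arccos(abs(n @ b_up) / np.linalg.norm(b_up)))
    return n, theta_bn

# illustrative field values in nT
n, theta_bn = coplanarity_normal([5.0, -3.0, 2.0], [12.0, -8.0, 4.0])
print(n, theta_bn)
```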

  3. Dynamic contrast-enhanced CT of head and neck tumors: perfusion measurements using a distributed-parameter tracer kinetic model. Initial results and comparison with deconvolution-based analysis

    NASA Astrophysics Data System (ADS)

    Bisdas, Sotirios; Konstantinou, George N.; Sherng Lee, Puor; Thng, Choon Hua; Wagenblast, Jens; Baghi, Mehran; San Koh, Tong

    2007-10-01

    The objective of this work was to evaluate the feasibility of a two-compartment distributed-parameter (DP) tracer kinetic model to generate functional images of several physiologic parameters from dynamic contrast-enhanced CT data obtained from patients with extracranial head and neck tumors and to compare the DP functional images to those obtained by deconvolution-based DCE-CT data analysis. We performed post-processing of DCE-CT studies obtained from 15 patients with benign and malignant head and neck tumors. We introduced a DP model of the impulse residue function for a capillary-tissue exchange unit, which accounts for the processes of convective transport and capillary-tissue exchange. The calculated parametric maps represented blood flow (F), intravascular blood volume (v1), extravascular extracellular blood volume (v2), vascular transit time (t1), permeability-surface area product (PS), transfer ratios k12 and k21, and the fraction of extracted tracer (E). Based on the same region-of-interest (ROI) analysis, we calculated the tumor blood flow (BF), blood volume (BV) and mean transit time (MTT) by using a modified deconvolution-based analysis taking into account the extravasation of the contrast agent for PS imaging. We compared the corresponding values by using Bland-Altman plot analysis. We outlined 73 ROIs including tumor sites, lymph nodes and normal tissue. The Bland-Altman plot analysis revealed that the two methods showed an acceptable degree of agreement for blood flow, and, thus, can be used interchangeably for measuring this parameter. Slightly worse agreement was observed between v1 in the DP model and BV, but even here the two tracer kinetic analyses can be used interchangeably. Whether both techniques can be used interchangeably remains under consideration for t1 and MTT, as well as for measurements of the PS values. The application of the proposed DP model is feasible in the clinical routine and it can be used interchangeably for measuring
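    The agreement analysis used here is standard; a minimal Bland-Altman sketch (bias and 95% limits of agreement) for two sets of ROI measurements might look like the following. The data passed in would be the paired per-ROI estimates from the two methods; nothing below comes from the study itself.

```python
import numpy as np
import matplotlib.pyplot as plt

def bland_altman(a, b, ax=None):
    """Bland-Altman agreement plot for two measurement methods, e.g. blood
    flow from the DP model vs. the deconvolution-based BF estimate."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    mean, diff = (a + b) / 2.0, a - b
    bias, sd = diff.mean(), diff.std(ddof=1)
    ax = ax or plt.gca()
    ax.scatter(mean, diff, s=15)
    for y in (bias, bias + 1.96 * sd, bias - 1.96 * sd):
        ax.axhline(y, linestyle='--')
    ax.set_xlabel('mean of the two methods')
    ax.set_ylabel('difference (method 1 - method 2)')
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```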

  4. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to a linear equation y = f(x) and will perform an ANOVA to check its significance.
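    As a worked example of the "Normal Distribution from two Data Points" calculation, the sketch below recovers the mean and standard deviation from two quantiles using the inverse normal CDF; the numbers are illustrative, not from the toolset's documentation.

```python
from scipy.stats import norm

def normal_from_two_points(x1, p1, x2, p2):
    """Recover (mean, standard deviation) of a normal distribution from two
    data points and their cumulative probabilities."""
    z1, z2 = norm.ppf(p1), norm.ppf(p2)
    sigma = (x2 - x1) / (z2 - z1)
    mu = x1 - sigma * z1
    return mu, sigma

# example: the 10th percentile is 80 and the 90th percentile is 120
print(normal_from_two_points(80.0, 0.10, 120.0, 0.90))   # -> (100.0, ~15.6)
```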

  5. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  6. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  7. Photogrammetry Tool for Forensic Analysis

    NASA Technical Reports Server (NTRS)

    Lane, John

    2012-01-01

    A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes whose locations relative to each other are unknown, a procedure that would merge the data from each cube is as follows:
    1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system.
    2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system.
    3. This procedure is continued for all combinations of cubes.
    4. The coordinate systems found in this way are then merged into a single global coordinate system.
    In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used; when measuring the location of objects relative to a global coordinate system, the merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.

  8. Built Environment Energy Analysis Tool Overview (Presentation)

    SciTech Connect

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  9. Deconvolution of dynamic mechanical networks

    PubMed Central

    Hinczewski, Michael; von Hansen, Yann; Netz, Roland R.

    2010-01-01

    Time-resolved single-molecule biophysical experiments yield data that contain a wealth of dynamic information, in addition to the equilibrium distributions derived from histograms of the time series. In typical force spectroscopic setups the molecule is connected via linkers to a readout device, forming a mechanically coupled dynamic network. Deconvolution of equilibrium distributions, filtering out the influence of the linkers, is a straightforward and common practice. We have developed an analogous dynamic deconvolution theory for the more challenging task of extracting kinetic properties of individual components in networks of arbitrary complexity and topology. Our method determines the intrinsic linear response functions of a given object in the network, describing the power spectrum of conformational fluctuations. The practicality of our approach is demonstrated for the particular case of a protein linked via DNA handles to two optically trapped beads at constant stretching force, which we mimic through Brownian dynamics simulations. Each well in the protein free energy landscape (corresponding to folded, unfolded, or possibly intermediate states) will have its own characteristic equilibrium fluctuations. The associated linear response function is rich in physical content, because it depends both on the shape of the well and its diffusivity—a measure of the internal friction arising from such processes as the transient breaking and reformation of bonds in the protein structure. Starting from the autocorrelation functions of the equilibrium bead fluctuations measured in this force clamp setup, we show how an experimentalist can accurately extract the state-dependent protein diffusivity using a straightforward two-step procedure. PMID:21118989

  10. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  11. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  12. CRAB: Distributed analysis tool for CMS

    NASA Astrophysics Data System (ADS)

    Sala, Leonardo; CMS Collaboration

    2012-12-01

    CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers and adopts a data driven model for the end user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis work flow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee the physicists an efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

  13. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions can also use the tool to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
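
    To make the merging and keep-out logic concrete, the sketch below merges two hypothetical event lists, shifts one by a one-way light time, and flags cross-spacecraft events that fall inside a padded window. The event tuples, the 300 s default padding, and the 840 s light time are illustrative assumptions, not the tool's actual PHP implementation.

```python
from datetime import datetime, timedelta

# Hypothetical event records: (spacecraft, event name, event time).
mro_events = [("MRO", "uplink_start", datetime(2008, 1, 10, 12, 0, 0))]
mer_events = [("MER", "contact_window", datetime(2008, 1, 10, 12, 4, 30))]

def shift_by_light_time(events, one_way_light_time_s):
    """Shift event times by a one-way light time (seconds) to a common reference."""
    dt = timedelta(seconds=one_way_light_time_s)
    return [(sc, name, t - dt) for sc, name, t in events]

def merge_and_flag(events_a, events_b, keepout_padding_s=300.0):
    """Merge two event lists into one time-ordered list and flag pairs of
    events from different spacecraft that fall inside the keep-out window."""
    merged = sorted(events_a + events_b, key=lambda e: e[2])
    pad = timedelta(seconds=keepout_padding_s)
    conflicts = []
    for i, (sc_i, name_i, t_i) in enumerate(merged):
        for sc_j, name_j, t_j in merged[i + 1:]:
            if t_j - t_i > pad:
                break  # list is time-ordered, no later event can conflict
            if sc_i != sc_j:
                conflicts.append(((sc_i, name_i, t_i), (sc_j, name_j, t_j)))
    return merged, conflicts

merged, conflicts = merge_and_flag(shift_by_light_time(mro_events, 840.0), mer_events)
print(len(merged), "events,", len(conflicts), "potential keep-out conflicts")
```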

  14. Integrated Drill Core Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Conze, Ronald; Reed, Josh; Chen, Yu-Chung; Krysiak, Frank

    2010-05-01

    Data management in scientific drilling programs such as IODP, ICDP, and ANDRILL is applied to capture drilling and science data during an expedition and for long-term data storage and dissemination. Currently data management tools are linked directly with capture and visualization applications to allow for both a two-way flow of data between the database and the applications and an integrated data environment. The new system has meanwhile been tested by recent IODP and ICDP projects. The components comprise the Expedition Drilling Information System (ExpeditionDIS) used for data acquisition, PSICAT, the Paleontological Stratigraphic Interval Construction and Analysis Tool, for graphical editing and viewing of core description diagrams, and Corelyzer as part of CoreWall for scalable, extensible visualization, developed to enhance the study of geological cores. This interoperable configuration of tools provides an excellent all-in-one toolbox for core analysis.

  15. Comparing Work Skills Analysis Tools. Project Report.

    ERIC Educational Resources Information Center

    Barker, Kathryn

    This document outlines the processes and outcomes of a research project conducted to review work skills analysis tools (products and/or services) that profile required job skills and/or assess individuals' acquired skills. The document begins with a brief literature review and discussion of pertinent terminology. Presented next is a list of…

  16. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  17. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (Editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  18. Minimum entropy deconvolution and blind equalisation

    NASA Technical Reports Server (NTRS)

    Satorius, E. H.; Mulligan, J. J.

    1992-01-01

    Relationships between minimum entropy deconvolution, developed primarily for geophysics applications, and blind equalization are pointed out. It is seen that a large class of existing blind equalization algorithms are directly related to the scale-invariant cost functions used in minimum entropy deconvolution. Thus the extensive analyses of these cost functions can be directly applied to blind equalization, including the important asymptotic results of Donoho.
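
    For readers unfamiliar with the scale-invariant cost functions mentioned above, the sketch below evaluates the varimax-style norm V(y) = sum(y^4) / (sum(y^2))^2 of a deconvolution filter's output and maximizes it with a generic optimizer on synthetic data. The synthetic wavelet, filter length, and use of Nelder-Mead are illustrative assumptions; this is not the iterative scheme of the original minimum entropy deconvolution literature.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.signal import lfilter

rng = np.random.default_rng(0)
# Synthetic received signal: sparse "reflectivity" convolved with an unknown wavelet.
spikes = rng.choice([0.0, 1.0, -1.0], size=400, p=[0.95, 0.025, 0.025])
wavelet = np.array([1.0, -0.6, 0.3, -0.1])
x = np.convolve(spikes, wavelet, mode="same") + 0.01 * rng.standard_normal(400)

def neg_varimax(f, x):
    """Negative of the scale-invariant varimax norm V(y) = sum(y^4) / (sum(y^2))^2
    of the equalizer/deconvolution filter output y = f * x."""
    y = lfilter(f, [1.0], x)
    return -np.sum(y**4) / np.sum(y**2) ** 2

f0 = np.zeros(8)
f0[0] = 1.0  # start from an identity (pass-through) filter
res = minimize(neg_varimax, f0, args=(x,), method="Nelder-Mead")
print("varimax before:", -neg_varimax(f0, x), "after:", -neg_varimax(res.x, x))
```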

  19. Efficient Bayesian-based multiview deconvolution.

    PubMed

    Preibisch, Stephan; Amat, Fernando; Stamataki, Evangelia; Sarov, Mihail; Singer, Robert H; Myers, Eugene; Tomancak, Pavel

    2014-06-01

    Light-sheet fluorescence microscopy is able to image large specimens with high resolution by capturing the samples from multiple angles. Multiview deconvolution can substantially improve the resolution and contrast of the images, but its application has been limited owing to the large size of the data sets. Here we present a Bayesian-based derivation of multiview deconvolution that drastically improves the convergence time, and we provide a fast implementation using graphics hardware. PMID:24747812
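
    As a rough illustration of the underlying idea (not the paper's accelerated Bayesian derivation or its GPU implementation), the sketch below applies a textbook sequential multi-view Richardson-Lucy update to two synthetic views blurred along different axes. The toy point sources and PSFs are assumptions for illustration.

```python
import numpy as np
from scipy.signal import fftconvolve

def multiview_richardson_lucy(views, psfs, n_iter=25, eps=1e-12):
    """Textbook sequential multi-view Richardson-Lucy: for each view v,
    estimate <- estimate * ( (view_v / (estimate (*) psf_v)) (*) flip(psf_v) )."""
    estimate = np.full_like(views[0], np.mean(views[0]))
    for _ in range(n_iter):
        for view, psf in zip(views, psfs):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = view / np.maximum(blurred, eps)
            estimate *= fftconvolve(ratio, psf[::-1, ::-1], mode="same")
    return estimate

# Tiny 2-D example with two synthetic "views" blurred along different axes.
truth = np.zeros((64, 64)); truth[20, 20] = truth[40, 45] = 1.0
psf_a = np.outer(np.ones(7) / 7.0, [1.0])   # blur along rows
psf_b = np.outer([1.0], np.ones(7) / 7.0)   # blur along columns
views = [fftconvolve(truth, p, mode="same") for p in (psf_a, psf_b)]
print(multiview_richardson_lucy(views, [psf_a, psf_b]).max())
```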

  20. From sensor networks to connected analysis tools

    NASA Astrophysics Data System (ADS)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the

  1. Analytical Approximation of the Deconvolution of Strongly Overlapping Broad Fluorescence Bands

    NASA Astrophysics Data System (ADS)

    Dubrovkin, J. M.; Tomin, V. I.; Ushakou, D. V.

    2016-09-01

    A method was proposed for deconvoluting strongly overlapping spectral bands into separate components that enables the uniqueness of the deconvolution procedure to be monitored. The band model was an asymmetric polynomial-modified Gaussian function subjected to Fourier filtering (PMGFS), which allowed more accurate and physically reasonable band shapes to be obtained and also significantly improved the deconvolution convergence. The method was applied to the analysis of complexation in solutions of the molecular probe 4'-(diethylamino)-3-hydroxyflavone with added LiCl. Two-band fluorescence of the probe in such solutions was the result of proton transfer in an excited singlet state and overlapped strongly with the more intense spontaneous emission of complexes with the ions. Physically correct deconvolutions of overlapping bands could not always be obtained using available software.
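
    The following sketch shows the general band-deconvolution idea with a plain two-Gaussian model fitted by nonlinear least squares; the paper's asymmetric PMGFS band shape is not reproduced, and the wavelength grid, band parameters, and starting guess are illustrative assumptions. The sensitivity of the result to the starting guess is one simple way to see why monitoring uniqueness matters.

```python
import numpy as np
from scipy.optimize import curve_fit

def band(x, amp, center, width):
    """Generic Gaussian band; the PMGFS model in the paper is more elaborate."""
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

def two_bands(x, a1, c1, w1, a2, c2, w2):
    return band(x, a1, c1, w1) + band(x, a2, c2, w2)

# Synthetic spectrum of two strongly overlapping bands plus noise.
x = np.linspace(400.0, 650.0, 500)          # wavelength axis, nm (illustrative)
rng = np.random.default_rng(2)
y = two_bands(x, 1.0, 500.0, 25.0, 0.8, 540.0, 30.0) + 0.01 * rng.standard_normal(x.size)

p0 = [1.0, 490.0, 20.0, 0.7, 550.0, 20.0]   # starting guess influences uniqueness
popt, pcov = curve_fit(two_bands, x, y, p0=p0)
perr = np.sqrt(np.diag(pcov))               # one crude way to monitor stability
print("fitted centers:", popt[1], popt[4], "+/-", perr[1], perr[4])
```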

  2. Tools for sea urchin genomic analysis.

    PubMed

    Cameron, R Andrew

    2014-01-01

    The Sea Urchin Genome Project Web site, SpBase ( http://SpBase.org ), in association with a suite of publicly available sequence comparison tools, provides a platform from which to analyze genes and genomic sequences of the sea urchin. This information system is specifically designed to support laboratory bench studies in cell and molecular biology. In particular, these tools and datasets have supported the description of the gene regulatory networks of the purple sea urchin S. purpuratus. This chapter details methods for undertaking the first steps in finding genes and noncoding regulatory sequences for further analysis.

  3. X-ray deconvolution microscopy

    PubMed Central

    Ehn, Sebastian; Epple, Franz Michael; Fehringer, Andreas; Pennicard, David; Graafsma, Heinz; Noël, Peter; Pfeiffer, Franz

    2016-01-01

    Recent advances in single-photon-counting detectors are enabling the development of novel approaches to reach micrometer-scale resolution in x-ray imaging. One example of such technology is the family of MEDIPIX3RX-based detectors, such as the LAMBDA, which can be operated with a small pixel size in combination with real-time on-chip charge-sharing correction. This characteristic results in a close to ideal, box-like point spread function, which we made use of in this study. The proposed method is based on raster-scanning the sample with sub-pixel-sized steps in front of the detector. Subsequently, a deconvolution algorithm is employed to compensate for the blurring introduced by the overlap of pixels with a well-defined point spread function during the raster-scanning. The presented approach utilizes standard laboratory x-ray equipment while we report resolutions close to 10 μm. The achieved resolution is shown to follow the relationship p/n, with the pixel size p of the detector and the number of raster-scanning steps n. PMID:27446649

  4. X-ray deconvolution microscopy.

    PubMed

    Ehn, Sebastian; Epple, Franz Michael; Fehringer, Andreas; Pennicard, David; Graafsma, Heinz; Noël, Peter; Pfeiffer, Franz

    2016-04-01

    Recent advances in single-photon-counting detectors are enabling the development of novel approaches to reach micrometer-scale resolution in x-ray imaging. One example of such technology is the family of MEDIPIX3RX-based detectors, such as the LAMBDA, which can be operated with a small pixel size in combination with real-time on-chip charge-sharing correction. This characteristic results in a close to ideal, box-like point spread function, which we made use of in this study. The proposed method is based on raster-scanning the sample with sub-pixel-sized steps in front of the detector. Subsequently, a deconvolution algorithm is employed to compensate for the blurring introduced by the overlap of pixels with a well-defined point spread function during the raster-scanning. The presented approach utilizes standard laboratory x-ray equipment while we report resolutions close to 10 μm. The achieved resolution is shown to follow the relationship p/n, with the pixel size p of the detector and the number of raster-scanning steps n. PMID:27446649

  5. Wavenumber-frequency deconvolution of aeroacoustic microphone phased array data of arbitrary coherence

    NASA Astrophysics Data System (ADS)

    Bahr, Christopher J.; Cattafesta, Louis N.

    2016-11-01

    Deconvolution of aeroacoustic data acquired with microphone phased arrays is a computationally challenging task for distributed sources with arbitrary coherence. A new technique for performing such deconvolution is proposed. This technique relies on analysis of the array data in the wavenumber-frequency domain, allowing for fast convolution and reduced storage requirements when compared to traditional coherent deconvolution. A positive semidefinite constraint for the iterative deconvolution procedure is implemented and shows improved behavior in terms of quantifiable convergence metrics when compared to a standalone covariance inequality constraint. A series of simulations validates the method's ability to resolve coherence and phase angle relationships between partially coherent sources, as well as determines convergence criteria for deconvolution analysis. Simulations for point sources near the microphone phased array show potential for handling such data in the wavenumber-frequency domain. In particular, a physics-based integration boundary calculation is described, and can successfully isolate sources and track the appropriate integration bounds with and without the presence of flow. Magnitude and phase relationships between multiple sources are successfully extracted. Limitations of the deconvolution technique are determined from the simulations, particularly in the context of a simulated acoustic field in a closed test section wind tunnel with strong boundary layer contamination. A final application to a trailing edge noise experiment conducted in an open-jet wind tunnel matches best estimates of acoustic levels from traditional calculation methods and qualitatively assesses the coherence characteristics of the trailing edge noise source.

  6. Virtual tools for teaching electrocardiographic rhythm analysis.

    PubMed

    Criley, John Michael; Nelson, William P

    2006-01-01

    Electrocardiographic (ECG) rhythm analysis is inadequately taught in training programs, resulting in undertrained physicians reading a large percentage of the 40 million ECGs recorded annually. The effective use of simple tools (calipers, ruler, and magnifier) required for crucial measurements and comparisons of intervals requires considerable time for interactive instruction and is difficult to teach in the classroom. The ECGViewer (Blaufuss Medical Multimedia Laboratories, Palo Alto, Calif) program was developed using virtual tools easily manipulated by computer mouse that can be used to analyze archived scanned ECGs on computer screens and classroom projection. Trainees manipulate the on-screen tools from their seats by wireless mouse while the instructor makes corrections with a second mouse, in clear view of the trainees. An on-screen ladder diagram may be constructed by the trainee and critiqued by the instructor. The ECGViewer program has been successfully used and well received by trainees at the medical school, residency, and subspecialty fellowship levels.

  7. Fairing Separation Analysis Using SepTOOL

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Kosareo, Daniel N.

    2015-01-01

    This document describes the relevant equations programmed in spreadsheet software, SepTOOL, developed by ZIN Technologies, Inc. (ZIN) to determine the separation clearance between a launch vehicle payload fairing and remaining stages. The software uses closed form rigid body dynamic solutions of the vehicle in combination with flexible body dynamics of the fairing, which are obtained from flexible body dynamic analysis or from test data, and superimposes the two results to obtain the minimum separation clearance for any given set of flight trajectory conditions. Using closed form solutions allows SepTOOL to perform separation calculations several orders of magnitude faster than numerical methods, which allows users to perform real-time parameter studies. Moreover, SepTOOL can optimize vehicle performance to minimize separation clearance. This tool can evaluate various shapes and sizes of fairings along with different vehicle configurations and trajectories. These geometries and parameters are input through a user-friendly interface. Although the software was specifically developed for evaluating the separation clearance of launch vehicle payload fairings, separation dynamics of other launch vehicle components can be evaluated provided that aerodynamic loads acting on the vehicle during the separation event are negligible. This document describes the development of SepTOOL, providing the analytical procedure and theoretical equations, although the implementation of these equations is not disclosed. Realistic examples are presented, and the results are verified with ADAMS (MSC Software Corporation) simulations. It should be noted that SepTOOL is preliminary separation clearance assessment software for payload fairings and should not be used for final clearance analysis.

  8. Climate Data Analysis Tools Facilitate Scientific Investigations

    NASA Astrophysics Data System (ADS)

    Doutriaux, Charles; Drach, Robert; McCoy, Renata; Mlaker, Velimir; Williams, Dean

    2009-09-01

    Insightful analysis of geophysical phenomena often depends on the choice of software tools to access, manipulate, and visualize data sets of interest. These data exploration tasks can be efficiently executed in a single computational environment by the freely distributed Climate Data Analysis Tools (CDAT; http://www-pcmdi.llnl.gov/software-portal/). Although CDAT has been designed primarily for climate science applications, its capabilities are increasingly being enhanced to address the data needs of other geophysical sciences. The CDAT software is based on the object-oriented Python computer language, but the software extends existing constructs to achieve capabilities that are relevant to any gridded geophysical data. Depending on the user's preference, CDAT can be fully deployed by either a script or graphical user interface.

  9. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, the singular value loop transfer frequency response, the Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
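
    Independently of MATRIXx itself, the singular-value frequency-response evaluation mentioned above can be illustrated with a few lines of generic linear-algebra code; the toy state-space matrices and frequency grid below are assumptions for illustration.

```python
import numpy as np

def singular_value_response(A, B, C, D, omegas):
    """Singular values of H(jw) = C (jwI - A)^-1 B + D over a frequency grid,
    the quantity behind singular-value loop-transfer and robustness plots."""
    n = A.shape[0]
    sv = []
    for w in omegas:
        H = C @ np.linalg.solve(1j * w * np.eye(n) - A, B) + D
        sv.append(np.linalg.svd(H, compute_uv=False))
    return np.array(sv)

# Illustrative 2-state, 2-input/2-output system (not from the paper).
A = np.array([[0.0, 1.0], [-4.0, -0.8]])
B = np.eye(2)
C = np.eye(2)
D = np.zeros((2, 2))
omegas = np.logspace(-1, 2, 200)
sv = singular_value_response(A, B, C, D, omegas)
print("max singular value at 1 rad/s:", sv[np.argmin(np.abs(omegas - 1.0))].max())
```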

  10. Reduction of blurring in broadband volume holographic imaging using a deconvolution method

    PubMed Central

    Lv, Yanlu; Zhang, Xuanxuan; Zhang, Dong; Zhang, Lin; Luo, Yuan; Luo, Jianwen

    2016-01-01

    Volume holographic imaging (VHI) is a promising biomedical imaging tool that can simultaneously provide multi-depth or multispectral information. When a VHI system is probed with a broadband source, the intensity spreads in the horizontal direction, causing degradation of the image contrast. We theoretically analyzed the reason for the horizontal intensity spread, and the analysis was validated by the simulation and experimental results of the broadband impulse response of the VHI system. We proposed a deconvolution method to reduce the horizontal intensity spread and increase the image contrast. Imaging experiments with three different objects, including a bright-field-illuminated USAF test target, a lung tissue specimen, and fluorescent beads, were carried out to test the performance of the proposed method. The results demonstrated that the proposed method can significantly improve the horizontal contrast of images acquired by a broadband VHI system. PMID:27570703

  11. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers, which, when coupled with current CPU power, enable new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools

  12. Deconvolution of immittance data: some old and new methods

    SciTech Connect

    Tuncer, Enis; Macdonald, Ross J.

    2007-01-01

    The background and history of various deconvolution approaches are briefly summarized; different methods are compared; and available computational resources are described. These underutilized data analysis methods are valuable in both electrochemistry and immittance spectroscopy areas, and freely available computer programs are cited that provide an automatic test of the appropriateness of Kronig-Kramers transforms, a powerful nonlinear-least-squares inversion method, and a new Monte-Carlo inversion method. The important distinction, usually ignored, between discrete-point distributions and continuous ones is emphasized, and both recent parametric and non-parametric deconvolution/inversion procedures for frequency-response data are discussed and compared. Information missing in a recent parametric measurement-model deconvolution approach is pointed out and remedied, and its priority evaluated. Comparisons are presented between the standard parametric least squares inversion method and a new non-parametric Monte Carlo one that allows complicated composite distributions of relaxation times (DRT) to be accurately estimated without the uncertainty present with regularization methods. Also, detailed Monte-Carlo DRT estimates for the supercooled liquid 0.4Ca(NO3)2·0.6KNO3 (CKN) at 350 K are compared with appropriate frequency-response-model fit results. These composite models were derived from stretched-exponential Kohlrausch temporal response with the inclusion of either of two different series electrode-polarization functions.
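
    As a generic illustration of discrete DRT estimation (not the Monte Carlo or measurement-model procedures compared in the work above), the sketch below fits a discrete distribution of relaxation times to synthetic immittance data by Tikhonov-regularized least squares. The kernel, the tau grid, and the regularization weight are assumptions for illustration.

```python
import numpy as np

def fit_discrete_drt(omega, Z, taus, lam=1e-3):
    """Tikhonov-regularized least-squares estimate of discrete DRT weights g_k in
    Z(w) ~= sum_k g_k / (1 + j*w*tau_k); no sign constraint is imposed here."""
    K = 1.0 / (1.0 + 1j * np.outer(omega, taus))            # (n_freq, n_tau) kernel
    A = np.vstack([K.real, K.imag, np.sqrt(lam) * np.eye(len(taus))])
    b = np.concatenate([Z.real, Z.imag, np.zeros(len(taus))])
    g, *_ = np.linalg.lstsq(A, b, rcond=None)
    return g

# Synthetic impedance from two relaxation processes.
omega = np.logspace(-2, 5, 120)
Z = 2.0 / (1 + 1j * omega * 1e-1) + 1.0 / (1 + 1j * omega * 1e-3)
taus = np.logspace(-5, 1, 60)
g = fit_discrete_drt(omega, Z, taus)
print("peak DRT weight:", g.max(), "at tau =", taus[np.argmax(g)])
```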

  13. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  14. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  16. Deconvolution of images with periodic striping noise

    NASA Astrophysics Data System (ADS)

    Wang, Zuoguan; Xu, Wujun; Fu, Yutian

    2008-03-01

    In this paper a new deconvolution algorithm is presented for images contaminated by periodic stripes. Inspired by the 2-D power spectrum distribution of periodic stripes in the frequency domain, we construct a novel regularized inverse filter that allows the algorithm to suppress the amplification of striping noise in the Fourier inversion step and to remove most of the stripes; mirror-wavelet denoising then removes the remaining colored noise. In simulations with striped images, this algorithm outperforms traditional mirror-wavelet-based deconvolution in terms of both visual effect and SNR, at the expense of a slightly heavier computational load. The same regularized inverse filter idea can also be used to improve other deconvolution algorithms, such as wavelet-packet and Wiener filters, when they are applied to images contaminated by periodic stripes.
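
    A minimal frequency-domain sketch of the idea, under assumed blur and stripe models, is shown below: a Tikhonov-regularized inverse filter is further damped on the frequencies where periodic stripes concentrate. The mirror-wavelet denoising stage of the paper is omitted, and the box blur, stripe frequencies, and damping factor are illustrative assumptions.

```python
import numpy as np

def destripe_and_deconvolve(image, psf_otf, stripe_mask, reg=1e-2):
    """Toy frequency-domain version of the idea: a regularized inverse of the blur
    that is additionally damped on the frequencies occupied by periodic stripes.
    `psf_otf` is the blur transfer function on the image grid; `stripe_mask`
    is 1 where stripe energy concentrates and 0 elsewhere (both assumptions)."""
    F = np.fft.fft2(image)
    inv = np.conj(psf_otf) / (np.abs(psf_otf) ** 2 + reg)   # regularized inverse filter
    inv *= 1.0 / (1.0 + 50.0 * stripe_mask)                 # damp stripe frequencies
    return np.real(np.fft.ifft2(F * inv))

# Synthetic blurred image with vertical stripes.
rng = np.random.default_rng(3)
img = rng.random((128, 128))
otf = np.fft.fft2(np.outer(np.ones(5) / 5.0, np.ones(5) / 5.0), s=img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * otf))
stripes = 0.2 * np.sin(2 * np.pi * 16 * np.arange(128) / 128.0)[None, :]
mask = np.zeros(img.shape); mask[0, 16] = mask[0, -16] = 1.0   # stripe frequency bins
print(destripe_and_deconvolve(blurred + stripes, otf, mask).shape)
```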

  17. Deconvolution in 3-D optical microscopy.

    PubMed

    Shaw, P

    1994-09-01

    Fluorescent probes are becoming ever more widely used in the study of subcellular structure, and determination of their three-dimensional distributions has become very important. Confocal microscopy is now a common technique for overcoming the problem of out-of-focus flare in fluorescence imaging, but an alternative method uses digital image processing of conventional fluorescence images--a technique often termed 'deconvolution' or 'restoration'. This review attempts to explain image deconvolution in a non-technical manner. Deconvolution is also applicable to 3-D confocal images and can provide a further significant improvement in the clarity and interpretability of such images. Some examples of the application of image deconvolution to both conventional and confocal fluorescence images are shown.

  18. PLANET: a phage library analysis expert tool.

    PubMed

    Leplae, R; Tramontano, A

    1995-01-01

    In recent years, random peptide libraries displayed on filamentous phage have been widely used, and new ideas and techniques are continuously developing in the field (1-5). Notwithstanding this growing interest in the technique and in its promising results, and the enormous increase in usage and scope, very little effort has been devoted to the implementation of software able to handle and analyze the growing number of phage library-derived sequences. In our laboratory, phage libraries are extensively used and peptide sequences are continuously produced, so the need arose to create a database (6) to collect all the experimental results in a format compatible with GCG sequence analysis packages (7). We present here a description of an X Window-based software package named PLANET (Phage Library ANalysis Expert Tool) devoted to the maintenance and statistical analysis of the database.

  19. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while processing multiple users' tasks simultaneously. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  20. Target deconvolution strategies in drug discovery.

    PubMed

    Terstappen, Georg C; Schlüpen, Christina; Raggiaschi, Roberto; Gaviraghi, Giovanni

    2007-11-01

    Recognition of some of the limitations of target-based drug discovery has recently led to the renaissance of a more holistic approach in which complex biological systems are investigated for phenotypic changes upon exposure to small molecules. The subsequent identification of the molecular targets that underlie an observed phenotypic response--termed target deconvolution--is an important aspect of current drug discovery, as knowledge of the molecular targets will greatly aid drug development. Here, the broad panel of experimental strategies that can be applied to target deconvolution is critically reviewed.

  1. Werner deconvolution for variable altitude aeromagnetic data

    SciTech Connect

    Ostrowski, J.S. ); Pilkington, M.; Teskey, D.J. )

    1993-10-01

    The standard Werner deconvolution method is extended to include the effects of variable sensor altitude, but this leads to a deconvolution algorithm that is unstable for slowly changing flight height. By expressing the sensor altitude as a linear function of horizontal position (within a specified window), the authors show that the numerical instability can be avoided. The subsequent selection and averaging of the raw solutions is controlled by three parameters that can be adjusted to specific survey data characteristics. Results for an aeromagnetic survey over Vancouver Island, British Columbia, show that, in comparison with the variable altitude approach, the standard Werner method produces unacceptable errors when applied to variable altitude data.
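
    For orientation, the sketch below solves the classic constant-altitude Werner linearization for a thin dike by least squares in a single window; the paper's variable-altitude extension and its solution selection and averaging parameters are not reproduced, and the synthetic dike parameters are assumptions for illustration.

```python
import numpy as np

def werner_window_solution(x, T):
    """Classic constant-altitude Werner linearization for a thin dike,
    x^2 T = a0 + a1 x + b0 T + b1 x T, solved by least squares in one window.
    Returns the source position x0 = b1/2 and depth z = sqrt(-b0 - x0^2)."""
    A = np.column_stack([np.ones_like(x), x, T, x * T])
    coef, *_ = np.linalg.lstsq(A, x**2 * T, rcond=None)
    a0, a1, b0, b1 = coef
    x0 = 0.5 * b1
    z = np.sqrt(max(-b0 - x0**2, 0.0))
    return x0, z

# Synthetic thin-dike anomaly observed at constant altitude.
x = np.linspace(-500.0, 500.0, 201)
x0_true, z_true, A_amp, B_amp = 50.0, 120.0, 300.0, 150.0
T = (A_amp * (x - x0_true) + B_amp * z_true) / ((x - x0_true) ** 2 + z_true**2)
print(werner_window_solution(x, T))   # should recover roughly (50, 120)
```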

  2. Integrated FDIR Analysis Tool for Space Applications

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni; Di Tommaso, Umberto

    2013-08-01

    The crucial role of health management in space applications has been the subject of many studies carried out by NASA and ESA and is held in high regard by Thales Alenia Space. The common objective is to improve reliability and availability of space systems. This paper will briefly illustrate the evolution of IDEHAS (IntegrateD Engineering Harness Avionics and Software), an advanced tool currently used in Thales Alenia Space - Italy in several space programs and recently enhanced to fully support FDIR (Fault Detection Isolation and Recovery) analysis. The FDIR analysis logic flow will be presented, emphasizing the improvements offered to Mission Support & Operations activities. Finally the benefits provided to the Company and a list of possible future enhancements will be given.

  3. Nonstationary sparsity-constrained seismic deconvolution

    NASA Astrophysics Data System (ADS)

    Sun, Xue-Kai; Sam, Zandong Sun; Xie, Hui-Wen

    2014-12-01

    The Robinson convolution model is mainly restricted by three inappropriate assumptions, i.e., statistically white reflectivity, a minimum-phase wavelet, and stationarity. Modern reflectivity inversion methods (e.g., sparsity-constrained deconvolution) generally attempt to suppress the problems associated with the first two assumptions but often ignore that seismic traces are nonstationary signals, which undermines the basic assumption of an unchanging wavelet in reflectivity inversion. Through tests on reflectivity series, we confirm the effects of nonstationarity on reflectivity estimation and the loss of significant information, especially in deep layers. To overcome the problems caused by nonstationarity, we propose a nonstationary convolutional model, and then use the attenuation curve in log spectra to detect and correct the influences of nonstationarity. We use Gabor deconvolution to handle nonstationarity and sparsity-constrained deconvolution to separate the reflectivity and the wavelet. The combination of the two deconvolution methods effectively handles nonstationarity and greatly reduces the problems associated with the unreasonable assumptions regarding reflectivity and wavelet. Using marine seismic data, we show that correcting nonstationarity helps recover subtle reflectivity information and enhances the characterization of details with respect to the geological record.

  4. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect

    Gary Casuccio; Michael Potter; Fred Schwerer; Dr. Richard J. Fruehan; Dr. Scott Story

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer-controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  5. Simplified building energy analysis tool for architects

    NASA Astrophysics Data System (ADS)

    Chaisuparasmikul, Pongsak

    Energy Modeler is an energy software program designed to study the relative change of energy uses (heating, cooling, and lighting loads) in different architectural design schemes. This research focuses on developing a tool to improve energy efficiency of the built environment. The research studied the impact of different architectural design responses for two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects, perceive, quickly visualize, and compare energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings probably are the best indications of solutions to help environmental sustainability by reducing the depletion of the world's fossil fuels (oil, natural gas, coal, etc.). Architects, when they set about designing an environmentally responsive building for an owner or the public, often lack the energy-based information and design tools to tell them whether the building loads and energy consumption are very responsive to the modifications that they made. Buildings are dynamic in nature and changeable over time, with many design variables involved. Architects really need energy-based rules or tools to assist them in the design process. Energy efficient design for sustainable solutions requires attention throughout the design process and is closely related to architectural solutions. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative work is to make these tools

  6. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
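
    The attack-graph idea can be sketched with a small weighted directed graph; the nodes, actions, and effort weights below are purely illustrative assumptions, and networkx's shortest-path routine stands in for the patent's path-finding machinery.

```python
import networkx as nx

# Toy attack graph: nodes are attacker states, directed edges are single
# attacker actions weighted by an assumed "effort" metric (all values illustrative).
G = nx.DiGraph()
G.add_edge("internet", "dmz_web_server", action="exploit web app", weight=3)
G.add_edge("dmz_web_server", "internal_host", action="pivot via misconfigured firewall", weight=5)
G.add_edge("internet", "internal_host", action="phishing", weight=4)
G.add_edge("internal_host", "domain_admin", action="credential theft", weight=6)

# Lowest-total-effort path from the attacker's start state to the goal state;
# such high-risk paths are natural candidates for countermeasures first.
path = nx.shortest_path(G, "internet", "domain_admin", weight="weight")
effort = nx.shortest_path_length(G, "internet", "domain_admin", weight="weight")
print(path, "total effort:", effort)
```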

  7. A Distributed, Parallel Visualization and Analysis Tool

    SciTech Connect

    2007-12-01

    VisIt is an interactive parallel visualization and graphical analysis tool for viewing scientific data on UNIX and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range.

  9. Multi-Mission Power Analysis Tool

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2011-01-01

    Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.

  10. Timeline analysis tools for law enforcement

    NASA Astrophysics Data System (ADS)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation, and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User-funded enhancements and Rome Lab-funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology used in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, results of the initial NYSP evaluation and the plan for a more comprehensive NYSP evaluation.

  11. Airborne LIDAR Data Processing and Analysis Tools

    NASA Astrophysics Data System (ADS)

    Zhang, K.

    2007-12-01

    Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers with high-quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, the LIDAR systems collect voluminous irregularly spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of the ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageable-sized tiles, typically 4 square kilometers in area; 3) filtering point data to separate measurements for the ground from those for non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster-based analysis; and 5) converting the gridded data into standard GIS import formats. ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
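
    Step 4 of the list above (interpolating irregular ground returns onto a regular grid) can be sketched with generic gridding code; the synthetic points, the 5 m grid spacing, and the linear interpolation method are assumptions for illustration and do not reproduce ALDPAT's own implementation.

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical ground-classified LIDAR returns: (easting, northing) and elevation.
rng = np.random.default_rng(4)
xy = rng.uniform(0.0, 2000.0, size=(5000, 2))
z = 100.0 + 0.01 * xy[:, 0] + 2.0 * np.sin(xy[:, 1] / 300.0)

# Interpolate the irregularly spaced elevations onto a regular 5 m grid so that
# raster-based GIS analysis can follow (points outside the convex hull become NaN).
gx, gy = np.meshgrid(np.arange(0.0, 2000.0, 5.0), np.arange(0.0, 2000.0, 5.0))
dem = griddata(xy, z, (gx, gy), method="linear")
print("DEM shape:", dem.shape, "elevation range:", np.nanmin(dem), "-", np.nanmax(dem))
```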

  12. Semi-blind nonstationary deconvolution: Joint reflectivity and Q estimation

    NASA Astrophysics Data System (ADS)

    Gholami, Ali

    2015-06-01

    Source signature deconvolution and attenuation or inverse quality factor (Q) filtering are two challenging problems in seismic data analysis which are used for extending the temporal bandwidth of the data. The separate estimates of the wavelet and, especially, the Earth Q model are by themselves problematic and add further uncertainties to inverse problems which are clearly ill-conditioned. The two problems are formulated in the framework of polynomial extrapolation and a closed-form solution is provided based on Lagrange interpolation. Analysis of the stability issue shows that the errors in the estimated results grow exponentially with both the problem size N and the inverse of Q. In order to circumvent both the instability and the uncertainty of the Q model, these problems are addressed in a unified formulation as a semi-blind nonstationary deconvolution (SeND) that decomposes the observed trace into the least number of nonstationary wavelets selected from a dictionary via a basis pursuit algorithm. The dictionary is constructed from the known source wavelet with different propagation times, each attenuated with a range of possible Q values. Using Horner's rule, an efficient algorithm is also provided for application of the dictionary and its adjoint. SeND is an extension of the conventional sparse spike deconvolution to its nonstationary form, which provides the reflectivity and Q models simultaneously without requiring a priori Q information. Assuming that the wavelet and attenuation mechanism are both known, SeND allows both the original reflectivity and the Q models to be estimated from numerical data with higher accuracy, especially with respect to conventional spectral ratio techniques. The application of the algorithm to field data finally indicates a substantial improvement in temporal resolution on a seismic record.
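
    A toy version of the dictionary idea, under assumed wavelet and attenuation models, is sketched below: atoms are a Ricker-like wavelet delayed to candidate times and crudely damped for candidate Q values, and an off-the-shelf Lasso solver stands in for the paper's basis pursuit algorithm and Horner's-rule fast operators. All parameter values are illustrative, and the exact sparsity of the result depends on the penalty weight.

```python
import numpy as np
from sklearn.linear_model import Lasso

def attenuated_wavelet(n, t0, q, dt=0.002, f0=30.0):
    """Hypothetical dictionary atom: a Ricker-like wavelet delayed to time t0
    and exponentially damped according to a candidate quality factor q (crude model)."""
    t = np.arange(n) * dt
    tau = t - t0
    ricker = (1 - 2 * (np.pi * f0 * tau) ** 2) * np.exp(-(np.pi * f0 * tau) ** 2)
    damping = np.exp(-np.pi * f0 * np.maximum(t - t0, 0.0) / q)
    return ricker * damping

n = 500
times = np.arange(0.05, 0.95, 0.01)
qs = [30.0, 60.0, 120.0]
D = np.column_stack([attenuated_wavelet(n, t0, q) for t0 in times for q in qs])

# Synthetic trace: two reflectors with different attenuation histories.
trace = 1.0 * attenuated_wavelet(n, 0.25, 60.0) - 0.6 * attenuated_wavelet(n, 0.60, 30.0)

# Sparse selection of (delay, Q) atoms; Lasso stands in for basis pursuit here.
coef = Lasso(alpha=0.005, max_iter=5000).fit(D, trace).coef_
picked = np.flatnonzero(np.abs(coef) > 0.02)
print("selected atoms (index -> t0, Q):",
      [(i, times[i // len(qs)], qs[i % len(qs)]) for i in picked])
```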

  13. Built Environment Analysis Tool: April 2013

    SciTech Connect

    Porter, C.

    2013-05-01

    This documentation describes the development of the Built Environment Analysis Tool, which was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. The documentation also provides guidance on how to apply the tool.

  14. A novel nonstationary deconvolution method based on spectral modeling and variable-step sampling hyperbolic smoothing

    NASA Astrophysics Data System (ADS)

    Li, Fang; Wang, Shoudong; Chen, Xiaohong; Liu, Guochang; Zheng, Qiang

    2014-04-01

    Deconvolution is an important seismic processing tool for improving resolution. One of the key assumptions made in most deconvolution methods is that the seismic data are stationary. However, due to anelastic absorption, seismic data are usually nonstationary. In this paper, a novel nonstationary deconvolution approach is proposed based on spectral modeling and variable-step sampling (VSS) hyperbolic smoothing. To facilitate our method, we first apply the Gabor transform to perform a time-frequency decomposition of the nonstationary seismic trace. Second, we estimate the source wavelet amplitude spectrum by spectral modeling. Third, smoothing the Gabor magnitude spectrum of the seismic data along hyperbolic paths with VSS yields the magnitude of the attenuation function and also eliminates the effect of the source wavelet. Fourth, by assuming that the source wavelet and attenuation function are minimum phase, their phases can be determined by the Hilbert transform. Finally, the two estimated factors are removed by dividing the Gabor spectrum of the trace by them, which yields an estimate of the Gabor spectrum of the reflectivity. An inverse Gabor transform gives the time-domain reflectivity estimate. Tests on synthetic and field data show that the presented method is an effective tool that not only has the advantages of stationary deconvolution, but also compensates for the energy absorption, without knowing or estimating the quality factor Q.
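
    A bare-bones illustration of Gabor-domain deconvolution is sketched below: the trace is decomposed with a short-time Fourier transform, a smooth magnitude is estimated for each time slice and divided out, and the result is inverted. The paper's spectral modeling, variable-step hyperbolic smoothing, and minimum-phase (Hilbert transform) steps are not reproduced; the window length, smoothing width, and synthetic trace are assumptions.

```python
import numpy as np
from scipy.signal import stft, istft

def gabor_spectral_whitening(trace, fs, nperseg=64, smooth=9, eps=1e-6):
    """Divide each time slice of the Gabor (STFT) spectrum by a smoothed
    magnitude estimate, a crude stand-in for wavelet-times-attenuation removal."""
    f, t, Z = stft(trace, fs=fs, nperseg=nperseg)
    mag = np.abs(Z)
    kernel = np.ones(smooth) / smooth
    # Smooth along frequency within each time slice to get a slowly varying magnitude.
    smooth_mag = np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), 0, mag)
    Z_reflectivity = Z / (smooth_mag + eps)
    _, refl = istft(Z_reflectivity, fs=fs, nperseg=nperseg)
    return refl[: len(trace)]

fs = 500.0
rng = np.random.default_rng(5)
trace = rng.standard_normal(2000) * np.exp(-np.arange(2000) / 800.0)  # decaying, nonstationary
print(gabor_spectral_whitening(trace, fs).shape)
```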

  15. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  16. ISHM Decision Analysis Tool: Operations Concept

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  17. Solar Array Verification Analysis Tool (SAVANT) Developed

    NASA Technical Reports Server (NTRS)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  18. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    SciTech Connect

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.

  19. PyRAT - python radiography analysis tool (u)

    SciTech Connect

    Temple, Brian A; Buescher, Kevin L; Armstrong, Jerawan C

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on LINUX and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed variable optimization tool to perform the optimization.

  20. Millennial scale system impulse response of polar climates - deconvolution results between δ 18O records from Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Reischmann, E.; Yang, X.; Rial, J. A.

    2013-12-01

    Deconvolution has long been used in science to recover real input given a system's impulse response and output. In this study, we applied spectral division deconvolution to selected polar δ18O time series to investigate the possible relationship between the climates of the Polar Regions, i.e., the equivalent of a climate system's 'impulse response.' While the records may be the result of nonlinear processes, deconvolution remains an appropriate tool because the two polar climates are synchronized, forming a Hilbert transform pair. In order to compare records, the age models of three Greenland and four Antarctica records have been matched via a Monte Carlo method using the methane-matched pair GRIP and BYRD as a basis for the calculations. For all twelve polar pairs, various deconvolution schemes (Wiener, Damped Least Squares, Tikhonov, Kalman filter) give consistent, quasi-periodic, impulse responses of the system. Multitaper analysis reveals strong, millennial-scale, quasi-periodic oscillations in these system responses with a range of 2,500 to 1,000 years. These are not symmetric, as the transfer function from north to south differs from that of south to north. However, the difference is systematic and occurs in the predominant period of the deconvolved signals. Specifically, the north to south transfer function is generally of longer period than the south to north transfer function. High amplitude power peaks at 5.0 ky to 1.7 ky characterize the former, while the latter contains peaks at mostly short periods, with a range of 2.5 ky to 1.0 ky. Consistent with many observations, the deconvolved, quasi-periodic, transfer functions share the predominant periodicities found in the data, some of which are likely related to solar forcing (2.5-1.0 ky), while some are probably indicative of the internal oscillations of the climate system (1.6-1.4 ky). The approximately 1.5 ky transfer function may represent the internal periodicity of the system, perhaps even related to the
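
    One of the schemes named above, damped least squares, amounts to a stabilised spectral division; a minimal Python sketch is shown below under the assumption that the two records share a common, evenly sampled time axis (the function and parameter names are illustrative).

      import numpy as np

      def damped_spectral_division(north, south, damping=0.05):
          """Estimate an impulse response h with south ~ north * h (convolution).

          Frequency-domain damped least squares:
              H(f) = S(f) N*(f) / (|N(f)|**2 + eps),
          where eps is a fraction of the peak input power (the damping level).
          """
          n = len(north)
          N = np.fft.rfft(north)
          S = np.fft.rfft(south)
          power = np.abs(N) ** 2
          H = S * np.conj(N) / (power + damping * power.max())
          return np.fft.irfft(H, n)     # time-domain transfer function estimate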

  1. Deconvolution of time series in the laboratory

    NASA Astrophysics Data System (ADS)

    John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian

    2016-10-01

    In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter using a software approach. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
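
    For the feedforward case, the required drive signal follows from dividing the desired output spectrum by the measured frequency response; a hedged Python sketch (with a water level guarding the frequencies where the response is nearly zero, e.g. below a high-pass cutoff) is given below. Names and defaults are illustrative.

      import numpy as np

      def feedforward_drive(desired, freq_response, water_level=1e-3):
          """Input signal whose output through H approximates `desired`: X(f) = Y(f)/H(f).

          freq_response must be sampled at the rfft frequencies of `desired`
          (len(desired)//2 + 1 complex values).
          """
          Y = np.fft.rfft(desired)
          H = np.asarray(freq_response, dtype=complex)
          floor = water_level * np.abs(H).max()
          H_reg = np.where(np.abs(H) < floor, floor * np.exp(1j * np.angle(H)), H)
          return np.fft.irfft(Y / H_reg, len(desired))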

  2. Deconvolution/identification techniques for nonnegative signals

    SciTech Connect

    Goodman, D.M.; Yu, D.R.

    1991-11-01

    Several methods for solving the nonparametric deconvolution/identification problem when the unknown is nonnegative are presented. First we consider the constrained least squares method and discuss three ways to estimate the regularization parameter: the discrepancy principle, Mallow's C_L, and generalized cross validation. Next we consider maximum entropy methods. Last, we present a new conjugate gradient algorithm. A preliminary comparison is presented; detailed Monte-Carlo experiments will be presented at the conference. 13 refs.
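
    The constrained least squares variant can be sketched compactly in Python with a Toeplitz convolution matrix and nonnegative least squares; the ridge term stands in for the regularization whose parameter the abstract proposes to choose by the discrepancy principle, Mallow's C_L, or generalized cross validation (here it is simply fixed by hand), and the names are illustrative.

      import numpy as np
      from scipy.linalg import toeplitz
      from scipy.optimize import nnls

      def nonnegative_deconvolution(y, kernel, lam=0.1):
          """Solve min ||A x - y||^2 + lam ||x||^2 subject to x >= 0,
          where A is the full-convolution matrix of `kernel` and y = conv(kernel, x) + noise.
          """
          n = len(y) - len(kernel) + 1                     # length of the unknown input
          A = toeplitz(np.r_[kernel, np.zeros(n - 1)],     # first column
                       np.r_[kernel[0], np.zeros(n - 1)])  # first row -> shape (len(y), n)
          A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)]) # append ridge rows
          y_aug = np.r_[y, np.zeros(n)]
          x, _ = nnls(A_aug, y_aug)
          return x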

  3. Self-Constrained Euler Deconvolution Using Potential Field Data of Different Altitudes

    NASA Astrophysics Data System (ADS)

    Zhou, Wenna; Nan, Zeyu; Li, Jiyan

    2016-06-01

    Euler deconvolution has become one of the most common tools for semi-automatic interpretation of potential field data. The structural index (SI) is a main determining factor of the quality of depth estimation. In this paper, we first present an improved Euler deconvolution method that eliminates the influence of the SI by using potential field data at different altitudes. The different-altitude data can be obtained by upward continuation or directly from airborne measurements. Euler deconvolution at different altitudes within a certain range obeys very similar equations; therefore, the ratio of the Euler equations at two different altitudes can be formed to cancel the SI. Thus, the depth and location of a geologic source can be calculated directly with the improved Euler deconvolution, without any prior information. In particular, the influence of noise can be reduced by using upward continuation to different altitudes. The new method is called self-constrained Euler deconvolution (SED). Subsequently, based on the SED algorithm, we derive the full tensor gradient (FTG) form of the improved method. Using the multi-component FTG data has added advantages in data interpretation: the FTG form comprises the x-, y- and z-directional components, and because more components are used it yields more accurate results and more detailed information. The proposed method is tested on different synthetic models, and satisfactory results are obtained. Finally, we apply the new approach to Bishop model magnetic data and real gravity data. All the results demonstrate that the new approach is a useful tool for interpreting potential field and full tensor gradient data.
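
    Schematically, and in notation chosen here only for illustration (field T_i observed at altitude z_i, source at (x_0, y_0, z_0), background B_i, structural index N), the two-altitude ratio that removes the structural index can be written as:

      \[
      (x - x_0)\,\frac{\partial T_i}{\partial x} + (y - y_0)\,\frac{\partial T_i}{\partial y}
        + (z_i - z_0)\,\frac{\partial T_i}{\partial z} = N\,(B_i - T_i), \qquad i = 1, 2,
      \]
      \[
      \frac{(x - x_0)\,\partial T_1/\partial x + (y - y_0)\,\partial T_1/\partial y + (z_1 - z_0)\,\partial T_1/\partial z}
           {(x - x_0)\,\partial T_2/\partial x + (y - y_0)\,\partial T_2/\partial y + (z_2 - z_0)\,\partial T_2/\partial z}
        = \frac{B_1 - T_1}{B_2 - T_2},
      \]

    so N cancels and the source coordinates can be estimated without specifying the SI.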

  4. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  5. Deconvolution of Lateral Shear Interferograms

    NASA Astrophysics Data System (ADS)

    Ambrose, Joseph George

    1994-01-01

    This dissertation develops and presents an existing but little known method to provide an exact solution to the Wavefront Difference Equation routinely encountered in the reduction of Lateral Shear Interferograms (LSI). The method first suggested by Dr. Roland Shack treats LSI as a convolution of the wavefront with an odd impulse pair. This representation casts the Lateral Shear problem in terms of Fourier optics operators and filters with a simplified treatment of the reduction of the LSI possible. This work extends the original proposal applied to line scans of wavefronts to full two-dimensional recovery of the wavefront along with developing the associated mathematical theory and computer code to efficiently execute the wavefront reduction. Further, a number of applications of the wavefront reduction technique presented here are developed. The applications of the filtering technique developed here include optical imaging systems exhibiting the primary aberrations, a model of residual tool marks after fabrication and propagation of an optical probe through atmospheric turbulence. The computer program developed in this work resides on a PC and produces accurate results to a 1/500 wave when compared to ray traced input wavefronts. The combination of the relatively simple concept providing the basis of the reduction technique with the highly accurate results over a wide range of input wavefronts makes this a timely effort. Finally, the reduction technique can be applied to the accurate testing of aspheric optical components.
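
    In outline, and with notation chosen here only for illustration (lateral shear s along x, wavefront W, measured difference signal D, two-dimensional Fourier transforms denoted by hats with frequencies (ξ, η)), the odd-impulse-pair picture gives:

      \[
      D(x, y) = W\!\left(x + \tfrac{s}{2}, y\right) - W\!\left(x - \tfrac{s}{2}, y\right)
              = \Bigl[\,W * \bigl(\delta(x + \tfrac{s}{2}) - \delta(x - \tfrac{s}{2})\bigr)\Bigr](x, y),
      \]
      \[
      \hat{D}(\xi, \eta) = 2\,i\,\sin(\pi s \xi)\,\hat{W}(\xi, \eta)
      \;\;\Longrightarrow\;\;
      \hat{W}(\xi, \eta) = \frac{\hat{D}(\xi, \eta)}{2\,i\,\sin(\pi s \xi)}, \qquad \sin(\pi s \xi) \neq 0,
      \]

    so the wavefront follows from an inverse transform once the zeros of the sine are handled by the filter design.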

  6. Deconvolution of lateral shear interferograms

    NASA Astrophysics Data System (ADS)

    Ambrose, Joseph George

    This dissertation develops and presents an existing but little known method to provide an exact solution to the wavefront difference equation routinely encountered in the reduction of lateral shear interferograms (LSI). The method first suggested by Dr. Roland Shack treats LSI as a convolution of the wavefront with an odd impulse pair. This representation casts the lateral shear problem in terms of Fourier optics operators and filters with a simplified treatment of the reduction of the LSI possible. This work extends the original proposal applied to line scans of wavefronts to full two-dimensional recovery of the wavefront along with developing the associated mathematical theory and computer code to efficiently execute the wavefront reduction. Further, a number of applications of the wavefront reduction technique presented here are developed. The applications of the filtering technique developed here include optical imaging systems exhibiting the primary aberrations, a model of residual tool marks after fabrication, and propagation of an optical probe through atmospheric turbulence. The computer program developed resides on a PC and produces accurate results to a 1/500 wave when compared to ray traced input wavefronts. The combination of the relatively simple concept providing the basis of the reduction technique with the highly accurate results over a wide range of input wavefronts makes this a timely effort. Finally, the reduction technique can be applied to the accurate testing of aspheric optical components.

  7. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    SciTech Connect

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  8. Knowledge base navigator facilitating regional analysis inter-tool communication.

    SciTech Connect

    Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

    2004-08-01

    To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).

  9. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.

  10. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code closely correlates to predictions on bearing internal load distributions, stiffness, deflection and stresses.

  11. Quantifying mineral abundances of complex mixtures by coupling spectral deconvolution of SWIR spectra (2.1-2.4 μm) and regression tree analysis

    USGS Publications Warehouse

    Mulder, V.L.; Plotze, Michael; de Bruin, Sytze; Schaepman, Michael E.; Mavris, C.; Kokaly, Raymond F.; Egli, Markus

    2013-01-01

    This paper presents a methodology for assessing mineral abundances of mixtures having more than two constituents using absorption features in the 2.1-2.4 μm wavelength region. In the first step, the absorption behaviour of mineral mixtures is parameterised by exponential Gaussian optimisation. Next, mineral abundances are predicted by regression tree analysis using these parameters as inputs. The approach is demonstrated on a range of prepared samples with known abundances of kaolinite, dioctahedral mica, smectite, calcite and quartz and on a set of field samples from Morocco. The latter contained varying quantities of other minerals, some of which did not have diagnostic absorption features in the 2.1-2.4 μm region. Cross validation showed that the prepared samples of kaolinite, dioctahedral mica, smectite and calcite were predicted with a root mean square error (RMSE) less than 9 wt.%. For the field samples, the RMSE was less than 8 wt.% for calcite, dioctahedral mica and kaolinite abundances. Smectite could not be well predicted, which was attributed to spectral variation of the cations within the dioctahedral layered smectites. Substitution of part of the quartz by chlorite at the prediction phase hardly affected the accuracy of the predicted mineral content; this suggests that the method is robust in handling the omission of minerals during the training phase. The degree of expression of absorption components was different between the field sample and the laboratory mixtures. This demonstrates that the method should be calibrated and trained on local samples. Our method allows the simultaneous quantification of more than two minerals within a complex mixture and thereby enhances the perspectives of spectral analysis for mineral abundances.
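
    The regression-tree step can be sketched in Python with scikit-learn; the feature matrix below is a random placeholder for the exponential-Gaussian parameters (position, depth, width of the absorption features) and the abundances are placeholder targets, so only the workflow, not the data, is meaningful here.

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.random((60, 6))        # absorption-feature parameters per sample (placeholder)
      y = 100.0 * rng.random(60)     # known mineral abundance in wt.% (placeholder)

      tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=3, random_state=0)
      scores = cross_val_score(tree, X, y, cv=5, scoring="neg_root_mean_squared_error")
      print("cross-validated RMSE (wt.%):", -scores.mean())
      tree.fit(X, y)                 # final model, then applied to field-sample spectra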

  12. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  13. Blind deconvolution of images using optimal sparse representations.

    PubMed

    Bronstein, Michael M; Bronstein, Alexander M; Zibulevsky, Michael; Zeevi, Yehoshua Y

    2005-06-01

    The relative Newton algorithm, previously proposed for quasi-maximum likelihood blind source separation and blind deconvolution of one-dimensional signals, is generalized for blind deconvolution of images. A smooth approximation of the absolute value is used as the nonlinear term for sparse sources. In addition, we propose a method of sparsification, which allows blind deconvolution of arbitrary sources, and show how to find optimal sparsifying transformations by supervised learning.

  14. Application of 3-D digital deconvolution to optically sectioned images for improving the automatic analysis of fluorescent-labeled tumor specimens

    NASA Astrophysics Data System (ADS)

    Lockett, Stephen J.; Jacobson, Kenneth A.; Herman, Brian

    1992-06-01

    The analysis of fluorescent-stained clusters of cells has been improved by recording multiple images of the same microscopic scene at different focal planes and then applying a three-dimensional (3-D) out-of-focus background subtraction algorithm. The algorithm significantly reduced the out-of-focus signal and improved the spatial resolution. The method was tested on specimens of 10 μm diameter (φ) beads embedded in agarose and on a 5 μm breast tumor section labeled with a fluorescent DNA stain. The images were analyzed using an algorithm for automatically detecting fluorescent objects. The proportion of correctly detected in-focus beads and breast nuclei increased from 1/8 to 8/8 and from 56/104 to 81/104, respectively, after processing by the subtraction algorithm. Furthermore, the subtraction algorithm reduced the proportion of out-of-focus relative to in-focus total intensity detected in the bead images from 51% to 33%. Further developments of these techniques, which utilize the 3-D point spread function (PSF) of the imaging system and a 3-D segmentation algorithm, should result in the correct detection and precise quantification of virtually all cells in solid tumor specimens. Thus the approach should serve as a highly reliable automated screening method for a wide variety of clinical specimens.
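
    The abstract does not spell out the subtraction algorithm; one common form of such a scheme, nearest-neighbour subtraction, is sketched below in Python as an illustration (the blur width and scaling constant are placeholders, not values from the paper).

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def nearest_neighbor_subtraction(stack, sigma=2.0, c=0.45):
          """Subtract blurred copies of the adjacent focal planes from each plane
          of an optically sectioned stack (planes along axis 0); the blurred
          neighbours approximate the out-of-focus light they contribute.
          """
          stack = stack.astype(float)
          corrected = np.empty_like(stack)
          for k in range(stack.shape[0]):
              below = gaussian_filter(stack[max(k - 1, 0)], sigma)
              above = gaussian_filter(stack[min(k + 1, stack.shape[0] - 1)], sigma)
              corrected[k] = stack[k] - c * (below + above)
          return np.clip(corrected, 0.0, None)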

  15. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  16. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  17. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  18. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  19. Nonparametric, nonnegative deconvolution of large time series

    NASA Astrophysics Data System (ADS)

    Cirpka, O. A.

    2006-12-01

    There is a long tradition of characterizing hydrologic systems by linear models, in which the response of the system to a time-varying stimulus is computed by convolution of a system-specific transfer function with the input signal. Despite its limitations, the transfer-function concept has been shown to be valuable for many situations such as the precipitation/run-off relationships of catchments and solute transport in agricultural soils and aquifers. A practical difficulty lies in the identification of the transfer function. A common approach is to fit a parametric function, enforcing a particular shape of the transfer function, which may be in contradiction to the real behavior (e.g., multimodal transfer functions, long tails, etc.). In our nonparametric deconvolution, the transfer function is assumed to be an auto-correlated random time function, which is conditioned on the data by a Bayesian approach. Nonnegativity, which is a vital constraint for solute-transport applications, is enforced by the method of Lagrange multipliers. This makes the inverse problem nonlinear. In nonparametric deconvolution, identifying the auto-correlation parameters is crucial. Enforcing too much smoothness prohibits the identification of important features, whereas insufficient smoothing leads to physically meaningless transfer functions, mapping noise components in the two data series onto each other. We identify optimal smoothness parameters by the expectation-maximization method, which requires the repeated generation of many conditional realizations. The overall approach, however, is still significantly faster than Markov-Chain Monte-Carlo methods presented recently. We apply our approach to electric-conductivity time series measured in a river and monitoring wells in the adjacent aquifer. The data cover 1.5 years with a temporal resolution of 1 h. The identified transfer functions have lengths of up to 60 days, making up 1440 parameters. We believe that nonparametric deconvolution is an
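
    A much-simplified Python sketch of the underlying estimation problem is shown below: a nonnegative transfer function is fit by least squares with a second-difference smoothness penalty. The Bayesian conditioning, Lagrange-multiplier treatment, and expectation-maximization estimation of the smoothness parameters described above are not reproduced; the smoothness weight is fixed by hand and all names are illustrative.

      import numpy as np
      from scipy.linalg import toeplitz
      from scipy.optimize import lsq_linear

      def estimate_transfer_function(x, y, m, smooth=10.0):
          """Fit g >= 0 of length m with y ~ conv(x, g) (causal, same start).

          Solves min ||A g - y||^2 + smooth * ||D2 g||^2 subject to g >= 0,
          where A is the convolution matrix of the input series x and D2 is a
          second-difference operator enforcing a smooth impulse response.
          """
          n = len(y)
          col = np.r_[x[:n], np.zeros(max(0, n - len(x)))]
          A = toeplitz(col, np.r_[x[0], np.zeros(m - 1)])   # (n, m) convolution matrix
          D2 = np.diff(np.eye(m), n=2, axis=0)              # (m-2, m) second differences
          A_aug = np.vstack([A, np.sqrt(smooth) * D2])
          y_aug = np.r_[y, np.zeros(m - 2)]
          return lsq_linear(A_aug, y_aug, bounds=(0.0, np.inf)).x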

  20. Recursive deconvolution of combinatorial chemical libraries.

    PubMed

    Erb, E; Janda, K D; Brenner, S

    1994-11-22

    A recursive strategy that solves for the active members of a chemical library is presented. A pentapeptide library with an alphabet of Gly, Leu, Phe, and Tyr (1024 members) was constructed on a solid support by the method of split synthesis. One member of this library (NH2-Tyr-Gly-Gly-Phe-Leu) is a native binder to a beta-endorphin antibody. A variation of the split synthesis approach is used to build the combinatorial library. In four vials, a member of the library's alphabet is coupled to a solid support. After each coupling, a portion of the resin from each of the four reaction vials was set aside and catalogued. The solid support from each vial is then combined, mixed, and redivided. The steps of (i) coupling, (ii) saving and cataloging, and (iii) randomizing were repeated until a pentapeptide library was obtained. The four pentapeptide libraries where the N-terminal amino acid is defined were screened against the beta-endorphin antibody and quantitated via an ELISA. The amino acid of the four pools that demonstrated the most binding was then coupled to the four tetrapeptide partial libraries that had been set aside and catalogued during the split synthesis. This recursive deconvolution was repeated until the best binders were deduced. Besides the anticipated native binder, two other members of the library displayed significant binding. This recursive method of deconvolution does not use a molecular tag, requires only one split synthesis, and can be applied to the deconvolution of nonlinear small-molecule combinatorial libraries and linear oligomeric combinatorial libraries, since it is based only on the procedure of the synthesis. PMID:7972077

  1. Using Kepler for Tool Integration in Microarray Analysis Workflows

    PubMed Central

    Gan, Zhuohui; Stowe, Jennifer C.; Altintas, Ilkay; McCulloch, Andrew D.; Zambon, Alexander C.

    2015-01-01

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments. This makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines. PMID:26605000

  2. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and 4 tools-the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and Malnutrition Universal Screening Tool (MUST)-received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings. PMID:22045723

  3. Statistical methods for the forensic analysis of striated tool marks

    SciTech Connect

    Hoeksema, Amy Beth

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.

  4. Comparison of Deconvolution Filters for Photoacoustic Tomography

    PubMed Central

    Van de Sompel, Dominique; Sasportas, Laura S.; Jokerst, Jesse V.; Gambhir, Sanjiv S.

    2016-01-01

    In this work, we compare the merits of three temporal data deconvolution methods for use in the filtered backprojection algorithm for photoacoustic tomography (PAT). We evaluate the standard Fourier division technique, the Wiener deconvolution filter, and a Tikhonov L-2 norm regularized matrix inversion method. Our experiments were carried out on subjects of various appearances, namely a pencil lead, two man-made phantoms, an in vivo subcutaneous mouse tumor model, and a perfused and excised mouse brain. All subjects were scanned using an imaging system with a rotatable hemispherical bowl, into which 128 ultrasound transducer elements were embedded in a spiral pattern. We characterized the frequency response of each deconvolution method, compared the final image quality achieved by each deconvolution technique, and evaluated each method’s robustness to noise. The frequency response was quantified by measuring the accuracy with which each filter recovered the ideal flat frequency spectrum of an experimentally measured impulse response. Image quality under the various scenarios was quantified by computing noise versus resolution curves for a point source phantom, as well as the full width at half maximum (FWHM) and contrast-to-noise ratio (CNR) of selected image features such as dots and linear structures in additional imaging subjects. It was found that the Tikhonov filter yielded the most accurate balance of lower and higher frequency content (as measured by comparing the spectra of deconvolved impulse response signals to the ideal flat frequency spectrum), achieved a competitive image resolution and contrast-to-noise ratio, and yielded the greatest robustness to noise. While the Wiener filter achieved a similar image resolution, it tended to underrepresent the lower frequency content of the deconvolved signals, and hence of the reconstructed images after backprojection. In addition, its robustness to noise was poorer than that of the Tikhonov filter. The
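
    In notation used here only for illustration (measured signal d, impulse response h, their Fourier transforms D and H, convolution matrix A built from h, noise-to-signal power ratio P_n/P_s, and regularization constants ε and λ), the three filters compared above take the schematic forms

      \[
      \hat{S}_{\mathrm{div}}(f) = \frac{D(f)}{H(f) + \varepsilon}, \qquad
      \hat{S}_{\mathrm{Wiener}}(f) = \frac{H^{*}(f)\,D(f)}{|H(f)|^{2} + P_n(f)/P_s(f)}, \qquad
      \hat{s}_{\mathrm{Tikhonov}} = \bigl(A^{\mathsf{T}} A + \lambda I\bigr)^{-1} A^{\mathsf{T}} d,
      \]

    where the small constant ε stabilises the plain Fourier division.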

  5. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

  6. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  7. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be

  9. Model analysis tools in the Virtual Model Repository (VMR)

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Ridley, A. J.

    2013-12-01

    The Virtual Model Repository (VMR) provides scientific analysis tools for a wide variety of numerical models of the Earth's magnetosphere. Data discovery, visualization tools and data/model comparisons are provided in a consistent and intuitive format. A large collection of numerical model runs are available to analyze, including the large Earth magnetosphere event run library at the CCMC and many runs from the University of Michigan. Relevant data useful for data/model comparisons is found using various APIs and included in many of the visualization tools. Recent additions to the VMR include a comprehensive suite of tools for analysis of the Global Ionosphere Thermosphere Model (GITM).

  10. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  11. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

  12. Graphical Acoustic Liner Design and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
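
    The one-dimensional transmission-line result for the simplest element of such a liner, a rigidly backed cavity of depth L, can be sketched in Python as follows; facesheet resistance and mass reactance would be added on top of this in a full liner model, and the numbers below are placeholders.

      import numpy as np

      def cavity_backing_impedance(freq_hz, depth_m, c=343.0):
          """Normalized impedance of a hard-backed air cavity: zeta = -j / tan(k L)."""
          k = 2.0 * np.pi * np.asarray(freq_hz, dtype=float) / c
          return -1j / np.tan(k * depth_m)

      # Example: reactance of a 38 mm deep cavity across typical fan-noise frequencies.
      print(cavity_backing_impedance(np.linspace(500.0, 3000.0, 6), 0.038))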

  13. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  14. Deconvolution methods for structured illumination microscopy.

    PubMed

    Chakrova, Nadya; Rieger, Bernd; Stallinga, Sjoerd

    2016-07-01

    We compare two recently developed multiple-frame deconvolution approaches for the reconstruction of structured illumination microscopy (SIM) data: the pattern-illuminated Fourier ptychography algorithm (piFP) and the joint Richardson-Lucy deconvolution (jRL). The quality of the images reconstructed by these methods is compared in terms of the achieved resolution improvement, noise enhancement, and inherent artifacts. Furthermore, we study the issue of object-dependent resolution improvement by considering the modulation transfer functions derived from different types of objects. The performance of the considered methods is tested in experiments and benchmarked with a commercial SIM microscope. We find that the piFP method resolves periodic and isolated structures equally well, whereas the jRL method provides significantly higher resolution for isolated objects compared to periodic ones. Images reconstructed by the piFP and jRL algorithms are comparable to the images reconstructed using the generalized Wiener filter applied in most commercial SIM microscopes. An advantage of the discussed algorithms is that they allow the reconstruction of SIM images acquired under different types of illumination, such as multi-spot or random illumination. PMID:27409703
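
    The single-image Richardson-Lucy iteration that underlies the joint (jRL) scheme can be sketched in Python as below; the pattern-weighted multi-frame update and the piFP algorithm themselves are not reproduced, and parameter defaults are illustrative.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
          """Standard Richardson-Lucy deconvolution of a single 2-D image."""
          estimate = np.full(image.shape, image.mean(), dtype=float)
          psf_flipped = psf[::-1, ::-1]
          for _ in range(n_iter):
              blurred = fftconvolve(estimate, psf, mode="same")
              ratio = image / np.maximum(blurred, eps)
              estimate *= fftconvolve(ratio, psf_flipped, mode="same")
          return estimate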

  15. AIDA: An Adaptive Image Deconvolution Algorithm

    NASA Astrophysics Data System (ADS)

    Hom, Erik; Marchis, F.; Lee, T. K.; Haase, S.; Agard, D. A.; Sedat, J. W.

    2007-10-01

    We recently described an adaptive image deconvolution algorithm (AIDA) for myopic deconvolution of multi-frame and three-dimensional data acquired through astronomical and microscopic imaging [Hom et al., J. Opt. Soc. Am. A 24, 1580 (2007)]. AIDA is a reimplementation and extension of the MISTRAL method developed by Mugnier and co-workers and shown to yield object reconstructions with excellent edge preservation and photometric precision [J. Opt. Soc. Am. A 21, 1841 (2004)]. Written in Numerical Python with calls to a robust constrained conjugate gradient method, AIDA has significantly improved run times over the original MISTRAL implementation. AIDA includes a scheme to automatically balance maximum-likelihood estimation and object regularization, which significantly decreases the amount of time and effort needed to generate satisfactory reconstructions. Here, we present a gallery of results demonstrating the effectiveness of AIDA in processing planetary science images acquired using adaptive-optics systems. Offered as an open-source alternative to MISTRAL, AIDA is available for download and further development at: http://msg.ucsf.edu/AIDA. This work was supported in part by the W. M. Keck Observatory, the National Institutes of Health, NASA, the National Science Foundation Science and Technology Center for Adaptive Optics at UC-Santa Cruz, and the Howard Hughes Medical Institute.

  16. Structure preserving color deconvolution for immunohistochemistry images

    NASA Astrophysics Data System (ADS)

    Chen, Ting; Srinivas, Chukka

    2015-03-01

    Immunohistochemistry (IHC) staining is an important technique for the detection of one or more biomarkers within a single tissue section. In digital pathology applications, the correct unmixing of the tissue image into its individual constituent dyes for each biomarker is a prerequisite for accurate detection and identification of the underlying cellular structures. A popular technique thus far is the color deconvolution method [1] proposed by Ruifrok et al. However, Ruifrok's method independently estimates the individual dye contributions at each pixel, which potentially leads to "holes and cracks" in the cells in the unmixed images. This is clearly inadequate since strong spatial dependencies exist in the tissue images, which contain rich cellular structures. In this paper, we formulate the unmixing algorithm into a least-squares framework of image patches, and propose a novel color deconvolution method which explicitly incorporates the spatial smoothness and structure continuity constraint into a neighborhood graph regularizer. An analytical closed-form solution to the cost function is derived for this algorithm for fast implementation. The algorithm is evaluated on a clinical data set containing a number of 3,3-Diaminobenzidine (DAB) and hematoxylin (HTX) stained IHC slides and demonstrates better unmixing results than the existing strategy.
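
    The pixel-wise baseline that the paper improves on, Ruifrok-style unmixing in optical-density space, can be sketched in Python as below; the stain vectors are commonly quoted values used here only as placeholders (in practice they are calibrated per stain batch and scanner), and the graph-regularised least-squares extension of the paper is not reproduced.

      import numpy as np

      HTX = np.array([0.650, 0.704, 0.286])     # hematoxylin OD vector (placeholder)
      DAB = np.array([0.269, 0.568, 0.778])     # DAB OD vector (placeholder)
      RES = np.cross(HTX, DAB)                  # complementary residual channel

      def color_deconvolve(rgb, i0=255.0):
          """Pixel-wise unmixing: OD = -log10(I/I0), concentrations = OD @ pinv(stains)."""
          stains = np.stack([HTX, DAB, RES / np.linalg.norm(RES)])   # rows = stain vectors
          od = -np.log10(np.clip(rgb, 1.0, None) / i0)
          return od @ np.linalg.pinv(stains)    # (..., 3): hematoxylin, DAB, residual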

  17. Parachute system design, analysis, and simulation tool

    SciTech Connect

    Sundberg, W.D.; McBride, D.D.; Gwinn, K.W.; Waye, D.E.; Hailey, C.E.

    1992-01-01

    For over twenty years designers at Sandia National Laboratories have developed various parachute simulation codes to model deployment, inflation, loading, trajectories, aircraft downwash and line sail. In addition to these codes, material property data bases have been acquired. Recently we have initiated a project to integrate these codes and data bases into a single software tool entitled SPARSYS (Sandia PARachute SYstem Simulation). We have constructed a graphical user interface as the driver and framework for SPARSYS. In this paper we present a status report on SPARSYS describing progress in developing and incorporating independent modules, in developing an integrated trajectory package, and in developing a materials data base including high-rate-of-strain data.

  18. [Analysis on evaluation tool for literature quality in clinical study].

    PubMed

    Liu, Qing; Zhai, Wei; Tan, Ya-qin; Huang, Juan

    2014-09-01

    The tools used for evaluating literature quality are introduced. Evaluation tools that are publicly and extensively used worldwide for assessing the quality of clinical trial literature are analyzed, including the Jadad scale, the Consolidated Standards of Reporting Trials (CONSORT) statement, and the Grades of Recommendations Assessment, Development and Evaluation (GRADE) system, among others. Additionally, the current development, updates, and applications of these tools are discussed.

  19. Klonos: A Similarity Analysis Based Tool for Software Porting

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. The tool is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). It clusters subroutines using syntactic and cost-model-provided metrics, aggregating similar subroutines that can be ported in a similar way. The generated porting plan allows programmers and compilers to reuse porting experience as much as possible during the porting process.

  20. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  1. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2014-10-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and it is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is done by repeating steps of the present MARS algorithm using parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.

  2. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2013-10-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulating MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelizing the construction of the matrix for the eigenvalue problem and the inverse-iteration algorithm implemented in MARS for solving the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Preliminary results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.

  3. Fully Parallel MHD Stability Analysis Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang

    2015-11-01

    Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulating MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelizing the construction of the matrix for the eigenvalue problem and the inverse-iteration algorithm implemented in MARS for solving the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. The solution of the eigenvalue problem is parallelized by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Results of MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.

  4. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Oda, H.

    2013-12-01

    The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with remanence of the continuous sample. The convolution process results in a smoothed measurement and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of sensor response and improve the resolution for continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm2 area on the cross section. The acquired sensor response reveals that cross terms (i.e. response of the pick-up coil for one axis to magnetic signal along other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution
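
    As a rough illustration of what such a deconvolution does (this is not UDECON, which selects its smoothing objectively via ABIC minimization and uses the full measured sensor response including cross-axis terms), the sketch below convolves a synthetic remanence profile with an assumed single-axis Gaussian sensor response and recovers it by damped least squares; the response shape, remanence profile, and damping parameter are all invented.

```python
import numpy as np

# Assumed stand-in for the measured sensor response: a Gaussian pick-up
# curve sampled every 1 mm over a 40-cm interval.
z = np.linspace(-20.0, 20.0, 401)            # cm, 1 mm spacing
response = np.exp(-0.5 * (z / 4.0) ** 2)
response /= response.sum()
n = z.size

# Synthetic remanence of a u-channel: two magnetized intervals.
remanence = np.zeros(n)
remanence[120:160] = 1.0
remanence[250:270] = -0.5

# A pass-through measurement is the remanence convolved with the response.
measured = np.convolve(remanence, response, mode="same")
measured += np.random.default_rng(1).normal(0.0, 0.002, n)

# Deconvolution as damped least squares on the convolution matrix K.
K = np.array([[response[(i - j) + n // 2] if 0 <= (i - j) + n // 2 < n else 0.0
               for j in range(n)] for i in range(n)])
lam = 0.05                                   # hand-picked damping (assumption)
A = np.vstack([K, lam * np.eye(n)])
b = np.concatenate([measured, np.zeros(n)])
recovered, *_ = np.linalg.lstsq(A, b, rcond=None)
```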

  5. Value Analysis: A Tool for Community Colleges.

    ERIC Educational Resources Information Center

    White, Rita A.

    Adoption of a value analysis program is proposed to aid colleges in identifying and implementing educationally sound labor-saving devices and procedures, enabling them to meet more students' needs at less cost with no quality reduction and a minimum of staff resistance. Value analysis is defined as a method for studying how well a product does…

  6. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    ERIC Educational Resources Information Center

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  7. Spatial deconvolution of spectropolarimetric data: an application to quiet Sun magnetic elements

    NASA Astrophysics Data System (ADS)

    Quintero Noda, C.; Asensio Ramos, A.; Orozco Suárez, D.; Ruiz Cobo, B.

    2015-07-01

    for the magnetic field intensity and for its inclination. Moreover, the discontinuity between the magnetic and non-magnetic environment in the flux tube becomes sharper. Conclusions: The deconvolution process used in this paper reveals information that the smearing induced by the point spread function (PSF) of the telescope hides. Additionally, the deconvolution is done with a low computational load, making it appealing for use in the analysis of large data sets. A copy of the IDL code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/579/A3

  8. JAVA based LCD Reconstruction and Analysis Tools

    SciTech Connect

    Bower, G.

    2004-10-11

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  9. Perceived Image Quality Improvements from the Application of Image Deconvolution to Retinal Images from an Adaptive Optics Fundus Imager

    NASA Astrophysics Data System (ADS)

    Soliz, P.; Nemeth, S. C.; Erry, G. R. G.; Otten, L. J.; Yang, S. Y.

    Aim: The objective of this project was to apply an image restoration methodology based on wavefront measurements obtained with a Shack-Hartmann sensor and to evaluate the restored image quality based on medical criteria. Methods: Implementing an adaptive optics (AO) technique, a fundus imager was used to achieve low-order correction to images of the retina. The high-order correction was provided by deconvolution. A Shack-Hartmann wavefront sensor measures aberrations. The wavefront measurement is the basis for activating a deformable mirror. Image restoration to remove remaining aberrations is achieved by direct deconvolution using the point spread function (PSF) or a blind deconvolution. The PSF is estimated using measured wavefront aberrations. Direct application of classical deconvolution methods such as inverse filtering, Wiener filtering or iterative blind deconvolution (IBD) to the AO retinal images obtained from the adaptive optical imaging system is not satisfactory because of the very large image size, difficulty in modeling the system noise, and inaccuracy in PSF estimation. Our approach combines direct and blind deconvolution to exploit available system information, avoid non-convergence and time-consuming iterative processes. Results: The deconvolution was applied to human subject data and the resulting restored images were compared by a trained ophthalmic researcher. Qualitative analysis showed significant improvements. Neovascularization can be visualized with the adaptive optics device that cannot be resolved with the standard fundus camera. The individual nerve fiber bundles are easily resolved as are melanin structures in the choroid. Conclusion: This project demonstrated that computer-enhanced, adaptive optic images have greater detail of anatomical and pathological structures.
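
    The classical Wiener filtering step mentioned in the abstract can be sketched as follows; the Gaussian PSF, the noise-to-signal constant, and the synthetic test image are assumptions, and this is not the authors' combined direct/blind pipeline.

```python
import numpy as np
from scipy.ndimage import convolve

def wiener_deconvolve(image, psf, nsr=1e-2):
    """Frequency-domain Wiener deconvolution with a known PSF.

    nsr is an assumed constant noise-to-signal power ratio; in practice it
    would be estimated from the data or the wavefront-sensor statistics.
    """
    psf_padded = np.zeros_like(image, dtype=float)
    ky, kx = psf.shape
    psf_padded[:ky, :kx] = psf
    # Shift the PSF centre to the array origin (periodic convolution model).
    psf_padded = np.roll(psf_padded, (-(ky // 2), -(kx // 2)), axis=(0, 1))

    H = np.fft.fft2(psf_padded)
    Y = np.fft.fft2(image)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + nsr)    # Wiener filter
    return np.real(np.fft.ifft2(X))

# Toy usage: a grid of point-like features blurred by a Gaussian PSF.
rng = np.random.default_rng(0)
truth = np.zeros((128, 128))
truth[32::32, 32::32] = 1.0
yy, xx = np.mgrid[-7:8, -7:8]
psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2))
psf /= psf.sum()
blurred = convolve(truth, psf, mode="wrap") + rng.normal(0, 1e-3, truth.shape)
restored = wiener_deconvolve(blurred, psf, nsr=1e-3)
```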

  10. HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL

    EPA Science Inventory

    An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

  11. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy to use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  12. The environment power system analysis tool development program

    NASA Astrophysics Data System (ADS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-12-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy to use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  13. Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning

    PubMed Central

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C.

    2014-01-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance compared with existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422
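
    For orientation, a widely used baseline in CTP analysis is truncated-SVD deconvolution of the tissue enhancement curve against the arterial input function (AIF). The sketch below implements that classical baseline on synthetic curves; it is not the authors' sparse, dictionary-based SPD method, and the curve shapes, noise level, and truncation threshold are assumptions.

```python
import numpy as np

def tsvd_deconvolve(aif, tissue, dt, rel_thresh=0.1):
    """Classical truncated-SVD CTP deconvolution (baseline, not SPD).

    Solves tissue = dt * conv(aif, k) for k(t) = CBF * R(t); CBF is then
    estimated as max(k). Singular values below rel_thresh * s_max are
    discarded to suppress noise amplification.
    """
    n = len(aif)
    # Lower-triangular Toeplitz convolution matrix built from the AIF.
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_thresh * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ tissue))

# Synthetic example (curve shapes are assumptions, not clinical data).
dt = 1.0
t = np.arange(0, 60, dt)
aif = np.where(t > 5, (t - 5) ** 3 * np.exp(-(t - 5) / 1.5), 0.0)
aif /= aif.max()
cbf_true, mtt = 0.01, 4.0                        # flow and mean transit time
residue = np.exp(-t / mtt)                       # exponential residue function
tissue = dt * np.convolve(aif, cbf_true * residue)[: len(t)]
tissue += np.random.default_rng(2).normal(0, 1e-4, len(t))
k_est = tsvd_deconvolve(aif, tissue, dt)
print("estimated CBF:", k_est.max())
```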

  14. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  15. Suspected-target pesticide screening using gas chromatography-quadrupole time-of-flight mass spectrometry with high resolution deconvolution and retention index/mass spectrum library.

    PubMed

    Zhang, Fang; Wang, Haoyang; Zhang, Li; Zhang, Jing; Fan, Ruojing; Yu, Chongtian; Wang, Wenwen; Guo, Yinlong

    2014-10-01

    A strategy for suspected-target screening of pesticide residues in complicated matrices was exploited using gas chromatography in combination with hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS). The screening workflow followed three key steps: initial detection, preliminary identification, and final confirmation. The initial detection of components in a matrix was done by a high resolution mass spectrum deconvolution; the preliminary identification of suspected pesticides was based on a special retention index/mass spectrum (RI/MS) library that contained both the first-stage mass spectra (MS(1) spectra) and retention indices; and the final confirmation was accomplished by accurate mass measurements of representative ions with their response ratios from the MS(1) spectra or representative product ions from the second-stage mass spectra (MS(2) spectra). To evaluate the applicability of the workflow in real samples, three matrices of apple, spinach, and scallion, each spiked with 165 test pesticides in a set of concentrations, were selected as the models. The results showed that the use of high-resolution TOF enabled effective extractions of spectra from noisy chromatograms, which was based on a narrow mass window (5 mDa) and suspected-target compounds identified by the similarity match of deconvoluted full mass spectra and filtering of linear RIs. On average, over 74% of pesticides at 50 ng/mL could be identified using deconvolution and the RI/MS library. Over 80% of pesticides at 5 ng/mL or lower concentrations could be confirmed in each matrix using at least two representative ions with their response ratios from the MS(1) spectra. In addition, the application of product ion spectra was capable of confirming suspected pesticides with specificity for some pesticides in complicated matrices. In conclusion, GC-QTOF MS combined with the RI/MS library seems to be one of the most efficient tools for the analysis of suspected-target pesticide residues

  16. Analysis Tool for Atmospheric Neutrino Oscillations

    SciTech Connect

    Escamilla, J.; Ernst, D. J.; Latimer, D. C.

    2008-03-13

    Of all the neutrino oscillation data, the atmospheric data is statistically the most important. It is also the most complex to analyze. This is because the source is cosmic rays colliding with the atmosphere or the rock in the Earth and, most importantly, Super-K is the only experiment in which the angle of the neutrino is approximately determined by a measurement of the direction of the created and measured lepton. For pedagogic purposes, we first derive how such an analysis is done. We then derive a new computationally efficient but still quantitative way of doing the analysis and we present some results for a two neutrino model.
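
    For reference, the two-flavor survival probability that underlies such oscillation fits is the standard expression P(νμ→νμ) = 1 − sin²(2θ) sin²(1.27 Δm² L/E), with Δm² in eV², L in km and E in GeV. The snippet below evaluates it for illustrative baselines and energies; the parameter values are examples, not the fit results of the paper.

```python
import numpy as np

def survival_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """Two-flavor muon-neutrino survival probability (standard formula)."""
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Example: upward-going atmospheric neutrinos crossing roughly the Earth's
# diameter, with illustrative (not fitted) oscillation parameters.
L = 12800.0                             # km
E = np.array([1.0, 2.0, 5.0, 10.0])     # GeV
print(survival_probability(L, E, sin2_2theta=1.0, dm2_eV2=2.4e-3))
```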

  17. A new tool for contamination analysis

    SciTech Connect

    Meltzer, M.; Gregg, H.

    1996-06-01

    The Contamination Analysis Unit (CAU) is a sensing system that facilitates a new approach to industrial cleaning. Through use of portable mass spectrometry and various desorption techniques, the CAU provides in-process, near-real-time measurement of surface cleanliness levels. It can be of help in significantly reducing hazardous waste generation and toxic air emissions from manufacturing operations.

  18. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    SciTech Connect

    Bush, B.; Penev, M.; Melaina, M.; Zuboy, J.

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  19. A New Web-based Tool for Aerosol Data Analysis: the AERONET Data Synergy Tool

    NASA Astrophysics Data System (ADS)

    Giles, D. M.; Holben, B. N.; Slutsker, I.; Welton, E. J.; Chin, M.; Schmaltz, J.; Kucsera, T.; Diehl, T.

    2006-12-01

    The Aerosol Robotic Network (AERONET) provides important aerosol microphysical and optical properties via an extensive distribution of continental sites and sparsely-distributed coastal and oceanic sites among the major oceans and inland seas. These data provide only spatial point measurements while supplemental data are needed for a complete aerosol analysis. Ancillary data sets (e.g., MODIS true color imagery and back trajectory analyses) are available by navigating to several web data sources. In an effort to streamline aerosol data discovery and analysis, a new web data tool called the "AERONET Data Synergy Tool" was launched from the AERONET web site. This tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. The Data Synergy Tool user can access these data sources to obtain properties such as the optical depth, composition, absorption, size, spatial and vertical distribution, and source region of aerosols. AERONET Ascension Island and COVE platform site data will be presented to highlight the Data Synergy Tool capabilities in analyzing urban haze, smoke, and dust aerosol events over the ocean. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.

  20. A 3D image analysis tool for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.

  1. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
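
    The general idea of pointing an analyst at failure-driving variables can be illustrated with a simple, hypothetical ranking by pass/fail separation of each dispersed input; this is only an illustration of the concept, not TRAM's actual algorithm, and every variable, threshold, and score below is invented.

```python
import numpy as np

def rank_failure_drivers(inputs, failed):
    """Rank Monte Carlo input variables by how strongly their distributions
    separate between passing and failing runs.

    inputs : (n_runs, n_vars) array of dispersed input values
    failed : (n_runs,) boolean array, True where the run violated a requirement
    Returns variable indices sorted from most to least separating, using a
    simple standardized mean-difference score.
    """
    pass_vals = inputs[~failed]
    fail_vals = inputs[failed]
    pooled_std = inputs.std(axis=0) + 1e-12
    score = np.abs(fail_vals.mean(axis=0) - pass_vals.mean(axis=0)) / pooled_std
    return np.argsort(score)[::-1], score

# Toy data set: 5000 runs, 6 dispersed variables; failures are driven mostly
# by variable 2 exceeding a threshold (a made-up criterion for illustration).
rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 6))
failed = (X[:, 2] + 0.2 * rng.normal(size=5000)) > 1.5
order, scores = rank_failure_drivers(X, failed)
print("variables ranked by influence on failures:", order)
```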

  2. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    2015-10-20

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projections screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position. This is key to evaluating a facet as suitable for a specific solar application. SOFAST reports the measurements of the facet as detailed surface normal location in a format suitable for ray tracing optical analysis codes. SOFAST also reports summary information as to the facet fitted shape (monomial) and error parameters. Useful plots of the error distribution are also presented.

  3. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    SciTech Connect

    Andraka, Charles E.

    2015-10-20

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projections screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position. This is key to evaluating a facet as suitable for a specific solar application. SOFAST reports the measurements of the facet as detailed surface normal location in a format suitable for ray tracing optical analysis codes. SOFAST also reports summary information as to the facet fitted shape (monomial) and error parameters. Useful plots of the error distribution are also presented.

  4. Radar Interferometry Time Series Analysis and Tools

    NASA Astrophysics Data System (ADS)

    Buckley, S. M.

    2006-12-01

    We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including localized subsidence bowls with constant velocity and seasonal deformation fluctuations. We will present results and insights from the application of these time series analysis techniques to several land subsidence study sites with varying deformation and environmental conditions, e.g., arid Phoenix and coastal Houston-Galveston metropolitan areas and rural Texas sink holes. We consistently find that the time invested in implementing, applying and comparing multiple InSAR time series approaches for a given study site is rewarded with a deeper understanding of the techniques and deformation phenomena. To this end, and with support from NSF, we are preparing a first-version of an InSAR post-processing toolkit to be released to the InSAR science community. These studies form a baseline of results to compare against the higher spatial and temporal sampling anticipated from TerraSAR-X as well as the trade-off between spatial coverage and resolution when relying on ScanSAR interferometry.
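
    The linear inversion-singular value decomposition step mentioned above can be sketched for a single pixel of a small-baseline interferogram network: each interferogram constrains the sum of the interval displacements it spans, and the rate vector is recovered with a pseudo-inverse. The network geometry, deformation rate, and noise below are synthetic assumptions.

```python
import numpy as np

# Acquisition times (years) and the interferogram network: each pair (i, j)
# means an interferogram formed between acquisitions i and j.
times = np.array([0.0, 0.3, 0.6, 1.0, 1.4, 1.9])
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4), (4, 5), (3, 5)]

# Unknowns: mean displacement rate in each interval between acquisitions.
dt = np.diff(times)
B = np.zeros((len(pairs), len(dt)))
for k, (i, j) in enumerate(pairs):
    B[k, i:j] = dt[i:j]          # an interferogram sums the intervals it spans

# Synthetic "unwrapped phase converted to displacement" for one pixel:
# constant subsidence of 10 mm/yr plus noise (values are illustrative).
true_rate = -10.0
dphi = B @ (true_rate * np.ones(len(dt)))
dphi += np.random.default_rng(4).normal(0, 0.5, len(pairs))

# SVD-based least-squares inversion; the pseudo-inverse also handles rank
# deficiency if the network splits into disconnected subsets.
v = np.linalg.pinv(B) @ dphi
displacement = np.concatenate([[0.0], np.cumsum(v * dt)])
print(displacement)
```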

  5. Nonlinear Robustness Analysis Tools for Flight Control Law Validation & Verification

    NASA Astrophysics Data System (ADS)

    Chakraborty, Abhijit

    Loss of control in flight is among the highest aviation accident categories for both the number of accidents and the number of fatalities. The flight controls community is seeking improved validation tools for safety-critical flight control systems. Current validation tools rely heavily on linear analysis, which ignores the inherent nonlinear nature of the aircraft dynamics and flight control system. Specifically, current practices in validating the flight control system involve gridding the flight envelope and checking various criteria based on linear analysis to ensure safety of the flight control system. The analysis and certification methods currently applied assume the aircraft's dynamics are linear. In reality, the behavior of the aircraft is always nonlinear due to its aerodynamic characteristics and physical limitations imposed by the actuators. This thesis develops nonlinear analysis tools capable of certifying flight control laws for nonlinear aircraft dynamics. The proposed analysis tools can handle both the aerodynamic nonlinearities and the physical limitations imposed by the actuators in the aircraft's dynamics. This proposed validation technique will extend and enrich the predictive capability of existing flight control law validation methods to analyze nonlinearities. The objective of this thesis is to provide the flight control community with an advanced set of analysis tools to reduce aviation fatality and accident rates.

  6. Limited-memory scaled gradient projection methods for real-time image deconvolution in microscopy

    NASA Astrophysics Data System (ADS)

    Porta, F.; Zanella, R.; Zanghirati, G.; Zanni, L.

    2015-04-01

    Gradient projection methods have given rise to effective tools for image deconvolution in several relevant areas, such as microscopy, medical imaging and astronomy. Due to the large scale of the optimization problems arising in nowadays imaging applications and to the growing request of real-time reconstructions, an interesting challenge to be faced consists in designing new acceleration techniques for the gradient schemes, able to preserve their simplicity and low computational cost of each iteration. In this work we propose an acceleration strategy for a state-of-the-art scaled gradient projection method for image deconvolution in microscopy. The acceleration idea is derived by adapting a step-length selection rule, recently introduced for limited-memory steepest descent methods in unconstrained optimization, to the special constrained optimization framework arising in image reconstruction. We describe how important issues related to the generalization of the step-length rule to the imaging optimization problem have been faced and we evaluate the improvements due to the acceleration strategy by numerical experiments on large-scale image deconvolution problems.

  7. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability, to later approach its validity. Factor analysis is a way to know the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis could be exploratory or confirmatory, and helps in the selection of the items with better performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. PMID:26572119

  8. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability, to later approach its validity. Factor analysis is a way to know the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis could be exploratory or confirmatory, and helps in the selection of the items with better performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants.

  9. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.

  10. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    PubMed

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis. PMID:27391182

  11. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    PubMed

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  12. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    SciTech Connect

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Helms, Jovana; Imbro, Dennis Raymond; Sumner, Matthew C.

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  13. Development of a climate data analysis tool (CDAT)

    SciTech Connect

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  14. The discrete Kalman filtering approach for seismic signals deconvolution

    SciTech Connect

    Kurniadi, Rizal; Nurhandoko, Bagus Endar B.

    2012-06-20

    Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is deconvolution; conventional deconvolution applies inverse filters based on Wiener filter theory. That theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is then used to generate an estimate of the reflectivity function. The main advantage of Kalman filtering is the capability of the technique to handle continually time-varying models, together with its high resolution capabilities. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering process works on the reflectivity function, hence the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with an energy waveform referred to as the seismic wavelet. A higher wavelet frequency gives a smaller wavelength; graphs of these results are presented.
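
    The discrete Kalman filter recursion referred to here is the standard predict/update cycle for a linear state-space model. The sketch below shows that recursion on a generic scalar random-walk example; it is not the authors' seismic formulation, and the model matrices and noise levels are assumptions.

```python
import numpy as np

def kalman_filter(z, F, H, Q, R, x0, P0):
    """Standard discrete Kalman filter for x_k = F x_{k-1} + w, z_k = H x_k + v,
    with process covariance Q and measurement covariance R."""
    x, P = x0.copy(), P0.copy()
    estimates = []
    for zk in z:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (zk - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Toy usage: track a slowly varying "reflectivity-like" quantity in noise.
rng = np.random.default_rng(5)
n = 200
truth = np.cumsum(rng.normal(0, 0.05, n))          # random-walk state
z = truth + rng.normal(0, 0.3, n)                  # noisy scalar observations
est = kalman_filter(z.reshape(-1, 1), F=np.eye(1), H=np.eye(1),
                    Q=np.eye(1) * 0.05**2, R=np.eye(1) * 0.3**2,
                    x0=np.zeros(1), P0=np.eye(1))
```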

  15. Deconvolution of axisymmetric flame properties using Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Daun, Kyle J.; Thomson, Kevin A.; Liu, Fengshan; Smallwood, Greg J.

    2006-07-01

    We present a method based on Tikhonov regularization for solving one-dimensional inverse tomography problems that arise in combustion applications. In this technique, Tikhonov regularization transforms the ill-conditioned set of equations generated by onion-peeling deconvolution into a well-conditioned set that is less susceptible to measurement errors that arise in experimental settings. The performance of this method is compared to that of onion-peeling and Abel three-point deconvolution by solving for a known field variable distribution from projected data contaminated with an artificially generated error. The results show that Tikhonov deconvolution provides a more accurate field distribution than onion-peeling and Abel three-point deconvolution and is more stable than the other two methods as the distance between projected data points decreases.
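
    A minimal numerical illustration of the approach (with a generic smoothing forward operator standing in for the onion-peeling projection matrix, which is an assumption) solves the ill-conditioned system by Tikhonov regularization with a second-difference operator: x_lambda = argmin ||Ax - b||^2 + lambda^2 ||Lx||^2.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Tikhonov-regularized least squares with a second-difference operator L."""
    n = A.shape[1]
    # L penalizes curvature, favouring smooth reconstructed profiles.
    L = (np.diag(np.full(n, -2.0))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    lhs = A.T @ A + lam**2 * (L.T @ L)
    return np.linalg.solve(lhs, A.T @ b)

# Ill-conditioned toy forward model: a heavy smoothing operator applied to a
# smooth profile, plus noise (all values are synthetic assumptions).
rng = np.random.default_rng(6)
n = 80
x_true = np.exp(-((np.arange(n) - 40) / 12.0) ** 2)
A = np.array([[np.exp(-((i - j) / 6.0) ** 2) for j in range(n)] for i in range(n)])
b = A @ x_true + rng.normal(0, 0.05, n)

x_reg = tikhonov_solve(A, b, lam=1.0)   # smooth, stable reconstruction
# (An unregularized solve of A x = b would amplify the noise badly because
#  A is severely ill-conditioned; that instability is what the paper addresses.)
```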

  16. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.

  17. Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)

    1999-01-01

    A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors, as embodied in one preferred embodiment in a global positioning system receiver. The method and receiver of the present invention estimate and then compensate for multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

  18. [SIGAPS, a tool for the analysis of scientific publications].

    PubMed

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. It is based on the analysis of articles indexed in Medline and is calculated by taking into account the place of the author and the ranking of the journal according to the disciplinary field. It also offers tools for the bibliometric analysis of scientific production.

  19. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  20. Design and analysis tools for concurrent blackboard systems

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1991-01-01

    A set of blackboard system design and analysis tools that consists of a knowledge source organizer, a knowledge source input/output connectivity analyzer, and a validated blackboard system simulation model is discussed. The author presents the structure and functionality of the knowledge source input/output connectivity analyzer. An example outlining the use of the analyzer to aid in the design of a concurrent tactical decision generator for air-to-air combat is presented. The blackboard system design and analysis tools were designed for generic blackboard systems and are application independent.

  1. Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs

    SciTech Connect

    Howard, M.; Luttman, A.; Fowler, M.

    2014-11-01

    In imaging applications that focus on quantitative analysis, such as X-ray radiography in the security sciences, it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.

  2. Deconvoluting nonaxial recoil in Coulomb explosion measurements of molecular axis alignment

    NASA Astrophysics Data System (ADS)

    Christensen, Lauge; Christiansen, Lars; Shepperson, Benjamin; Stapelfeldt, Henrik

    2016-08-01

    We report a quantitative study of the effect of nonaxial recoil during Coulomb explosion of laser-aligned molecules and introduce a method to remove the blurring caused by nonaxial recoil in the fragment-ion angular distributions. Simulations show that nonaxial recoil affects correlations between the emission directions of fragment ions differently from the effect caused by imperfect molecular alignment. The method, based on analysis of the correlation between the emission directions of the fragment ions from Coulomb explosion, is used to deconvolute the effect of nonaxial recoil from experimental fragment angular distributions. The deconvolution method is then applied to a number of experimental data sets to correct the degree of alignment for nonaxial recoil, to select optimal Coulomb explosion channels for probing molecular alignment, and to estimate the highest degree of alignment that can be observed from selected Coulomb explosion channels.

  3. Optimizing Laguerre expansion based deconvolution methods for analysing bi-exponential fluorescence lifetime images.

    PubMed

    Zhang, Yongliang; Chen, Yu; Li, David Day-Uei

    2016-06-27

    Fast deconvolution is an essential step to calibrate instrument responses in big fluorescence lifetime imaging microscopy (FLIM) image analysis. This paper examined a computationally effective least squares deconvolution method based on Laguerre expansion (LSD-LE), recently developed for clinical diagnosis applications, and proposed new criteria for selecting Laguerre basis functions (LBFs) without considering the mutual orthonormalities between LBFs. Compared with the previously reported LSD-LE, the improved LSD-LE allows to use a higher laser repetition rate, reducing the acquisition time per measurement. Moreover, we extended it, for the first time, to analyze bi-exponential fluorescence decays for more general FLIM-FRET applications. The proposed method was tested on both synthesized bi-exponential and realistic FLIM data for studying the endocytosis of gold nanorods in Hek293 cells. Compared with the previously reported constrained LSD-LE, it shows promising results. PMID:27410552

  4. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) Program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification. 2) Model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  5. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.

  6. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large amount of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software is discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther reaching biological conclusions than ever before. PMID:24688685
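
    The enrichment-analysis step described above commonly reduces to a one-sided hypergeometric test per pathway: with a background of N measurable metabolites, K of them annotated to a pathway, and k of the n altered metabolites falling in that pathway, the enrichment p-value is P(X >= k). A sketch with invented numbers (and without the multiple-testing correction across pathways that a real analysis would add):

```python
from scipy.stats import hypergeom

def enrichment_pvalue(background_size, pathway_size, n_altered, n_altered_in_pathway):
    """One-sided hypergeometric enrichment p-value: probability of drawing at
    least n_altered_in_pathway pathway members when n_altered metabolites are
    drawn from a background containing pathway_size pathway members."""
    return hypergeom.sf(n_altered_in_pathway - 1,
                        background_size, pathway_size, n_altered)

# Invented example: 2000 measurable metabolites, a 45-member pathway, and
# 60 significantly altered metabolites of which 9 belong to the pathway.
p = enrichment_pvalue(2000, 45, 60, 9)
print(f"enrichment p-value: {p:.3g}")
```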

  7. Impact of beam deconvolution on noise properties in CMB measurements: Application to Planck LFI

    NASA Astrophysics Data System (ADS)

    Keihänen, E.; Kiiveri, K.; Lindholm, V.; Reinecke, M.; Suur-Uski, A.-S.

    2016-03-01

    We present an analysis of the effects of beam deconvolution on noise properties in CMB measurements. The analysis is built around the artDeco beam deconvolver code. We derive a low-resolution noise covariance matrix that describes the residual noise in deconvolution products, both in harmonic and pixel space. The matrix models the residual correlated noise that remains in time-ordered data after destriping, and the effect of deconvolution on this noise. To validate the results, we generate noise simulations that mimic the data from the Planck LFI instrument. A χ2 test for the full 70 GHz covariance in the multipole range ℓ = 0-50 yields a mean reduced χ2 of 1.0037. We compare two destriping options, full and independent destriping, when deconvolving subsets of available data. Full destriping leaves substantially less residual noise, but leaves data sets intercorrelated. We also derive a white noise covariance matrix that provides an approximation of the full noise at high multipoles, and study the properties of high-resolution noise in pixel space through simulations.

  8. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    PubMed

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  9. Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique

    ERIC Educational Resources Information Center

    Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

    2005-01-01

    A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…
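
    The kind of analysis the experiment teaches can be sketched as follows: treat each measured spectrum as a linear combination of the pure HIn and In- spectra, recover the two concentrations by least squares (spectral deconvolution in the simplest linear sense), and obtain pKa from pKa = pH - log10([In-]/[HIn]). The Gaussian band shapes, pH values, and noise level below are invented for illustration.

```python
import numpy as np

wavelengths = np.linspace(400, 700, 301)

def band(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Assumed pure-component spectra of the protonated (HIn) and deprotonated
# (In-) forms of the indicator (synthetic Gaussian bands).
eps_HIn = 1.0 * band(430, 30)
eps_In = 1.4 * band(620, 35)
components = np.column_stack([eps_HIn, eps_In])

# Simulate spectra of buffered solutions at several pH values for a true
# pKa of 7.0 and a total indicator concentration of 1 (arbitrary units).
true_pKa = 7.0
pH = np.array([5.5, 6.5, 7.0, 7.5, 8.5])
frac_In = 1.0 / (1.0 + 10 ** (true_pKa - pH))      # Henderson-Hasselbalch
spectra = components @ np.vstack([1 - frac_In, frac_In])
spectra += np.random.default_rng(7).normal(0, 0.005, spectra.shape)

# "Deconvolve" each measured spectrum into the two components by least
# squares, then recover pKa at each pH from the concentration ratio.
conc, *_ = np.linalg.lstsq(components, spectra, rcond=None)
pKa_est = pH - np.log10(conc[1] / conc[0])
print("estimated pKa:", pKa_est.mean())
```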

  10. The Adversarial Route Analysis Tool: A Web Application

    SciTech Connect

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries: a web-based geospatial application that helps the U.S. government plan operations that predict where an adversary might be. It is easily accessible and maintainable, and simple to use without much training.

  11. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  12. Selected Tools for Risk Analysis in Logistics Processes

    NASA Astrophysics Data System (ADS)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: - evaluation of significant risk groups associated with logistics processes implementation, - composition of integrated strategies of risk management, - composition of tools for risk analysis in logistics processes.

  13. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  14. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  15. An Automated Data Analysis Tool for Livestock Market Data

    ERIC Educational Resources Information Center

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  17. A Framework for Fast Image Deconvolution With Incomplete Observations.

    PubMed

    Simoes, Miguel; Almeida, Luis B; Bioucas-Dias, Jose; Chanussot, Jocelyn

    2016-11-01

    In image deconvolution problems, the diagonalization of the underlying operators by means of the fast Fourier transform (FFT) usually yields very large speedups. When there are incomplete observations (e.g., in the case of unknown boundaries), standard deconvolution techniques normally involve non-diagonalizable operators, resulting in rather slow methods or, otherwise, use inexact convolution models, resulting in the occurrence of artifacts in the enhanced images. In this paper, we propose a new deconvolution framework for images with incomplete observations that allows us to work with diagonalized convolution operators, and therefore is very fast. We iteratively alternate the estimation of the unknown pixels and of the deconvolved image, using, e.g., an FFT-based deconvolution method. This framework is an efficient, high-quality alternative to existing methods of dealing with the image boundaries, such as edge tapering. It can be used with any fast deconvolution method. We give an example in which a state-of-the-art method that assumes periodic boundary conditions is extended, using this framework, to unknown boundary conditions. Furthermore, we propose a specific implementation of this framework, based on the alternating direction method of multipliers (ADMM). We provide a proof of convergence for the resulting algorithm, which can be seen as a "partial" ADMM, in which not all variables are dualized. We report experimental comparisons with other primal-dual methods, where the proposed one performed at the level of the state of the art. Four different kinds of applications were tested in the experiments: deconvolution, deconvolution with inpainting, superresolution, and demosaicing, all with unknown boundaries. PMID:27576251
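
    A minimal sketch of the alternating idea (not the paper's partial-ADMM algorithm): iterate a fast, FFT-domain, Wiener-like deconvolution, which assumes periodic boundaries, with re-estimation of the unobserved pixels so the periodic model becomes consistent with the data. The PSF is assumed to be supplied as a full-size, centered kernel, and `mask` is a boolean array marking observed pixels; all names are illustrative.

```python
import numpy as np

def fft_deconv_unknown_boundary(y, psf, mask, n_iter=25, reg=1e-2):
    """Alternate (1) a fast Fourier-domain, Wiener-like deconvolution that
    assumes periodic boundaries with (2) re-estimation of the unobserved
    pixels (mask == False), so the blur stays diagonal in the FFT domain.
    Assumes `psf` is a full-size kernel centered in the array."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    y_full = np.where(mask, y, y[mask].mean())       # crude fill of unknown pixels
    x = y_full.copy()
    for _ in range(n_iter):
        X = np.conj(H) * np.fft.fft2(y_full) / (np.abs(H) ** 2 + reg)
        x = np.real(np.fft.ifft2(X))                 # step 1: deconvolve
        blurred = np.real(np.fft.ifft2(H * np.fft.fft2(x)))
        y_full = np.where(mask, y, blurred)          # step 2: update unknown pixels
    return x
```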

  18. Discovery and New Frontiers Project Budget Analysis Tool

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect on the overall program budget of changes in future mission or launch vehicle costs, of differing development profiles or operational durations of a future mission, or of a replan of a current mission. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted mission lines) could easily be adapted to other applications.

  19. The Moon: Determining Minerals and their Abundances with Mid-IR Spectral Deconvolution II

    NASA Astrophysics Data System (ADS)

    Kozlowski, Richard W.; Donaldson Hanna, K.; Sprague, A. L.; Grosse, F. A.; Boop, T. S.; Warell, J.; Boccafola, K.

    2007-10-01

    We determine the mineral compositions and abundances at three locations on the lunar surface using an established spectral deconvolution algorithm (Ramsey 1996, Ph.D. Dissertation, ASU; Ramsey and Christiansen 1998, JGR 103, 577-596) for mid-infrared spectral libraries of mineral separates of varying grain sizes. Spectral measurements of the lunar surface were obtained at the Infrared Telescope Facility (IRTF) on Mauna Kea, HI with Boston University's Mid-Infrared Spectrometer and Imager (MIRSI). Our chosen locations, Aristarchus, Grimaldi and Mersenius C, have been previously observed in the VIS near-IR from ground-based telescopes and spacecraft (Zisk et al. 1977, The Moon 17, 59-99; Hawke et al. 1993, GRL 20, 419-422; McEwen et al. 1994, Science 266, 1858-1862; Peterson et al. 1995, 22, 3055-3058; Warell et al. 2006, Icarus 180, 281-291), however there are no sample returns for analysis. Surface mineral deconvolutions of the Grimaldi Basin infill are suggestive of anorthosite, labradorite, orthopyroxene, olivine, garnet and phosphate. Peterson et al. (1995) indicated the infill of Grimaldi Basin has a noritic anorthosite or anorthositic norite composition. Our spectral deconvolution supports these results. Modeling of other lunar locations is underway. We have also successfully modeled laboratory spectra of HED meteorites, Vesta, and Mercury (see meteorites and mercurian abstracts this meeting). These results demonstrate the spectral deconvolution method to be robust for making mineral identifications on remotely observed objects, in particular main-belt asteroids, the Moon, and Mercury. This work was funded by NSF AST406796.

  1. Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.

    PubMed

    Villeneuve, Emma; Carfantan, Hervé

    2014-10-01

    Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution, due to the atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line-of-sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimate the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Monte Carlo Markov chain algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) or not (estimation). The performances of the methods are compared with classical ones, on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172
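
    In the spirit of the Bayesian line fitting described above, the toy sketch below runs a plain Metropolis sampler for the amplitude, center and width of a single Gaussian emission line in one spaxel and returns posterior means. It ignores the spatial blurring and the amplitude marginalization used in the paper; names and step sizes are illustrative.

```python
import numpy as np

def gaussian_line(wave, amp, center, sigma):
    return amp * np.exp(-0.5 * ((wave - center) / sigma) ** 2)

def metropolis_line_fit(wave, spectrum, noise_sigma, n_samples=5000,
                        step=(0.1, 0.05, 0.02), seed=0):
    """Posterior-mean estimates of (amplitude, center, width) of one emission
    line in a single spaxel, via a plain Metropolis random walk."""
    rng = np.random.default_rng(seed)
    theta = np.array([spectrum.max(), wave[np.argmax(spectrum)], 2.0])  # crude init

    def log_like(t):
        return -0.5 * np.sum(((spectrum - gaussian_line(wave, *t)) / noise_sigma) ** 2)

    ll, samples = log_like(theta), []
    for _ in range(n_samples):
        prop = theta + rng.normal(0.0, step)
        if prop[0] > 0 and prop[2] > 0:               # positivity of amplitude and width
            ll_prop = log_like(prop)
            if np.log(rng.uniform()) < ll_prop - ll:
                theta, ll = prop, ll_prop
        samples.append(theta.copy())
    return np.mean(samples[n_samples // 2:], axis=0)  # discard burn-in
```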

  2. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model-based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  3. ITERA: IDL Tool for Emission-line Ratio Analysis

    NASA Astrophysics Data System (ADS)

    Groves, Brent; Allen, Mark

    2013-07-01

    ITERA, the IDL Tool for Emission-line Ratio Analysis, is an IDL widget tool that allows you to plot ratios of any strong atomic and ionized emission lines as determined by standard photoionization and shock models. These "line ratio diagrams" can then be used to determine diagnostics for nebulae excitation mechanisms or nebulae parameters such as density, temperature, metallicity, etc. ITERA can also be used to determine line sensitivities to such parameters, compare observations with the models, or even estimate unobserved line fluxes.

  4. A dataflow analysis tool for parallel processing of algorithms

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.

  5. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  6. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.

  7. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  8. fMRI analysis software tools: an evaluation framework

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Colli, Vittoria; Strocchi, Sabina; Vite, Cristina; Binaghi, Elisabetta; Conte, Leopoldo

    2011-03-01

    Performance comparison of functional Magnetic Resonance Imaging (fMRI) software tools is a very difficult task. In this paper, a framework for comparison of fMRI analysis results obtained with different software packages is proposed. An objective evaluation is possible only after pre-processing steps that normalize input data in a standard domain. Segmentation and registration algorithms are implemented in order to classify voxels as belonging to the brain or not, and to find the non-rigid transformation that best aligns the volume under inspection with a standard one. Through the definitions of intersection and union from fuzzy logic, an index was defined that quantifies the information overlap between Statistical Parametric Maps (SPMs). Direct comparison between fMRI results can only highlight differences. In order to assess the best result, an index that represents the goodness of the activation detection is required. The transformation of the activation map into a standard domain allows the use of a functional atlas for labeling the active voxels. For each functional area, the Activation Weighted Index (AWI), which identifies the mean activation level of the whole area, was defined. By means of this brief but comprehensive description, it is easy to find a metric for the objective evaluation of fMRI analysis tools. Through the first evaluation method, the situations where the SPMs are inconsistent were identified. The results of the AWI analysis suggest which tool has higher sensitivity and specificity. The proposed method seems a valid evaluation tool when applied to an adequate number of patients.

  9. Suspension array technology: new tools for gene and protein analysis.

    PubMed

    Nolan, J P; Mandy, F F

    2001-11-01

    Flow cytometry has long been a key tool in the analysis of lymphocytes and other cells, owing to its ability to make quantitative, homogeneous, multiparameter measurements of particles. New developments in illumination sources, digital signal processing and microsphere chemistry are driving the development of flow cytometry in new areas of biomedical research. In particular, the maturation of approaches to perform highly parallel analyses using suspension arrays of microspheres with different morphospectral features is making flow cytometry an important tool in protein and genetic analysis. In this paper, we review the development of suspension array technology (SAT), current applications in protein and genomic analysis, and the prospects for this platform in a variety of large-scale screening applications. PMID:11838973

  10. Reconstructing influenza incidence by deconvolution of daily mortality time series

    PubMed Central

    Goldstein, Edward; Dushoff, Jonathan; Ma, Junling; Plotkin, Joshua B.; Earn, David J. D.; Lipsitch, Marc

    2009-01-01

    We propose a mathematically straightforward method to infer the incidence curve of an epidemic from a recorded daily death curve and time-to-death distribution; the method is based on the Richardson–Lucy deconvolution scheme from optics. We apply the method to reconstruct the incidence curves for the 1918 influenza epidemic in Philadelphia and New York State. The incidence curves are then used to estimate epidemiological quantities, such as daily reproductive numbers and infectivity ratios. We found that during a brief period before the official control measures were implemented in Philadelphia, the drop in the daily number of new infections due to an average infector was much larger than expected from the depletion of susceptibles during that period; this finding was subjected to extensive sensitivity analysis. Combining this with recorded evidence about public behavior, we conclude that public awareness and change in behavior is likely to have had a major role in the slowdown of the epidemic even in a city whose response to the 1918 influenza epidemic is considered to have been among the worst in the U.S. PMID:20080801
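
    A toy, one-dimensional Richardson-Lucy deconvolution in the spirit of the method described above (not the authors' exact implementation): the daily death series is deconvolved by a time-to-death distribution to recover a relative incidence curve. Function and variable names are illustrative.

```python
import numpy as np

def richardson_lucy_1d(deaths, delay_pmf, n_iter=50):
    """Recover a relative incidence curve from a daily death series and a
    time-to-death distribution via Richardson-Lucy iterations."""
    deaths = np.asarray(deaths, dtype=float)
    kernel = np.asarray(delay_pmf, dtype=float)
    kernel = kernel / kernel.sum()
    incidence = np.full_like(deaths, deaths.mean())  # flat initial guess
    for _ in range(n_iter):
        predicted = np.convolve(incidence, kernel, mode="full")[: len(deaths)]
        ratio = np.where(predicted > 0, deaths / predicted, 0.0)
        # correlate the ratio with the kernel (the RL multiplicative update)
        corr = np.convolve(ratio, kernel[::-1], mode="full")[len(kernel) - 1:][: len(deaths)]
        incidence = incidence * corr
    return incidence
```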

  11. Tools for Large-Scale Mobile Malware Analysis

    SciTech Connect

    Bierma, Michael

    2014-01-01

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems)[1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.

  12. SMART: Statistical Metabolomics Analysis-An R Tool.

    PubMed

    Liang, Yu-Jen; Lin, Yu-Ting; Chen, Chia-Wei; Lin, Chien-Wei; Chao, Kun-Mao; Pan, Wen-Harn; Yang, Hsin-Chou

    2016-06-21

    Metabolomics data provide unprecedented opportunities to decipher metabolic mechanisms by analyzing hundreds to thousands of metabolites. Data quality concerns and complex batch effects in metabolomics must be appropriately addressed through statistical analysis. This study developed an integrated analysis tool for metabolomics studies to streamline the complete analysis flow from initial data preprocessing to downstream association analysis. We developed Statistical Metabolomics Analysis-An R Tool (SMART), which can analyze input files with different formats, visually represent various types of data features, implement peak alignment and annotation, conduct quality control for samples and peaks, explore batch effects, and perform association analysis. A pharmacometabolomics study of antihypertensive medication was conducted and data were analyzed using SMART. Neuromedin N was identified as a metabolite significantly associated with angiotensin-converting-enzyme inhibitors in our metabolome-wide association analysis (p = 1.56 × 10^-4 in an analysis of covariance (ANCOVA) with an adjustment for unknown latent groups and p = 1.02 × 10^-4 in an ANCOVA with an adjustment for hidden substructures). This endogenous neuropeptide is highly related to neurotensin and neuromedin U, which are involved in blood pressure regulation and smooth muscle contraction. The SMART software, a user guide, and example data can be downloaded from http://www.stat.sinica.edu.tw/hsinchou/metabolomics/SMART.htm.
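
    As a hypothetical sketch of the kind of ANCOVA-style association test described above (column names and model terms are invented for illustration and are not taken from SMART):

```python
import pandas as pd
import statsmodels.formula.api as smf

def metabolite_association(df: pd.DataFrame) -> pd.Series:
    """ANCOVA-style test of one metabolite against medication status,
    adjusting for covariates; returns the medication term's p-values."""
    model = smf.ols("metabolite ~ C(on_ace_inhibitor) + age + sex + C(batch)",
                    data=df).fit()
    return model.pvalues.filter(like="on_ace_inhibitor")
```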

  13. Virtual tool mark generation for efficient striation analysis.

    PubMed

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-07-01

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5-10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
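
    The published comparison uses the statistical likelihood algorithm of Chumbley et al.; as a much simpler stand-in for that step, the sketch below scores a virtual tool-mark profile against a measured plate cross-section with normalized cross-correlation over all lags. Names are illustrative.

```python
import numpy as np

def best_alignment_corr(virtual_mark, plate_profile):
    """Maximum normalized cross-correlation of a (shorter) virtual tool-mark
    profile against a measured plate cross-section, over all lags."""
    v = (virtual_mark - virtual_mark.mean()) / virtual_mark.std()
    p = (plate_profile - plate_profile.mean()) / plate_profile.std()
    corr = np.correlate(p, v, mode="valid") / len(v)   # requires len(p) >= len(v)
    return corr.max()
```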

  14. A survey of visualization tools for biological network analysis

    PubMed Central

    Pavlopoulos, Georgios A; Wegener, Anna-Lynn; Schneider, Reinhard

    2008-01-01

    The analysis and interpretation of relationships between biological molecules, networks and concepts is becoming a major bottleneck in systems biology. Very often the sheer amount of data and its heterogeneity provide a challenge for the visualization of the data. There are a wide variety of graph representations available, which most often map the data onto 2D graphs to visualize biological interactions. These methods are applicable to a wide range of problems; nevertheless, many of them reach a limit in terms of user friendliness when thousands of nodes and connections have to be analyzed and visualized. In this study we review visualization tools that are currently available for the visualization of biological networks, most of them developed in recent years. We comment on the functionality, the limitations and the specific strengths of these tools, and how these tools could be further developed in the direction of data integration and information sharing. PMID:19040716

  15. Virtual Tool Mark Generation for Efficient Striation Analysis

    SciTech Connect

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.

  16. MISR Data Visualization and Analysis Using the hdfscan Tool

    NASA Astrophysics Data System (ADS)

    Crean, K. A.; Diner, D. J.; Banerjee, P. K.

    2001-05-01

    A new software tool called hdfscan is available to display and analyze data formatted using the HDF-EOS grid and swath interfaces, as well as the native HDF SDS, vdata, vgroup, and raster interfaces. The hdfscan tool can display data in both image and textual form, and can also display attributes, metadata, annotations, file structure, projection information, and simple data statistics. hdfscan also includes a data editing capability. In addition, the tool contains unique features to aid in the interpretation of data from the Multi-angle Imaging SpectroRadiometer (MISR) instrument, which currently flies aboard NASA's Terra spacecraft. These features include the ability to unscale and unpack MISR data fields; the ability to display MISR data flag values according to their interpreted values as well as their raw values; and knowledge of special MISR fill values. MISR measures upwelling radiance from Earth in 4 spectral bands corresponding to blue, green, red, and near-infrared wavelengths, at each of 9 view angles including the nadir (vertical) direction plus 26.1, 45.6, 60.0, and 70.5 degrees forward and aftward of nadir. Data products derived from MISR measurements aim at improving our understanding of the Earth's environment and climate. The hdfscan tool runs in one of two modes, as selected by the user: command-line mode, or via a graphical user interface (GUI). This provides a user with flexibility in using the tool for either batch mode processing or interactive analysis. This presentation will describe features and functionalities of the hdfscan tool. The user interface will be shown, and menu options will be explained. Information on how to obtain the tool will be provided.

  17. Systematic Omics Analysis Review (SOAR) tool to support risk assessment.

    PubMed

    McConnell, Emma R; Bell, Shannon M; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  18. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  19. Industrial Geospatial Analysis Tool for Energy Evaluation (IGATE-E)

    SciTech Connect

    Alkadi, Nasr E; Starke, Michael R; Ma, Ookie; Nimbalkar, Sachin U; Cox, Daryl

    2013-01-01

    IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of a zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries by industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process step energy consumption, and major energy intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimations and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

  20. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    PubMed

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-01

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification. PMID:26002210
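
    A minimal sketch of the Procrustes diagnostic idea, assuming two sets of projection-pursuit scores with matching row order: rotate one onto the other with an orthogonal Procrustes fit and report the residual disparity. A "Procrustes map" could then be built by repeating this comparison over a grid of the tuning parameter. Names are illustrative.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

def procrustes_disparity(scores_a, scores_b):
    """Rotate one set of projection scores onto another and return the
    residual disparity; smaller values mean more similar projections."""
    A = scores_a - scores_a.mean(axis=0)
    B = scores_b - scores_b.mean(axis=0)
    A = A / np.linalg.norm(A)
    B = B / np.linalg.norm(B)
    R, _ = orthogonal_procrustes(A, B)     # best orthogonal rotation of A onto B
    return np.sum((A @ R - B) ** 2)
```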

  1. Regional deconvolution method for partial volume correction in brain PET

    NASA Astrophysics Data System (ADS)

    Rusinek, Henry; Tsui, Wai-Hon; de Leon, Mony J.

    2001-05-01

    Correction of PET images for partial volume effects (PVE) is of particular utility in studies of metabolism in brain aging and brain disorders. PVE is commonly corrected using voxel-by-voxel factors obtained from a high resolution brain mask (obtained from the coregistered MR scan), after convolution with the point spread function (PSF) of the imaging system. In a recently proposed regional deconvolution (RD) method, the observed regional activity is expressed as linear combinations of the true metabolic activity. The weights are obtained by integrating the PSF over the geometric extent of the brain regions. We have analyzed the accuracy of RD and two other PVE correction algorithms under a variety of conditions using simulated PET scans. Each of the brain regions was assigned a distribution of metabolic activity, with gray matter/white matter contrast representative of subjects in several age categories. Simulations were performed over a wide range of PET resolutions. The influence of PET/MR misregistration and heterogeneity of brain metabolism was also evaluated. Our results demonstrate the importance of correcting PET metabolic images for PVE. Without such correction, the regional brain activity values are contaminated with 30-40% errors. Under most conditions studied, the accuracy of RD and of the three-compartmental method was superior to the accuracy of the two-compartmental method. Our study provides the first demonstration of the feasibility of the RD algorithm to provide accurate correction for a large number (n = 109) of brain compartments. PVE correction methods appear to be promising tools in studies of metabolism in the normal brain, brain aging, and brain disorders.
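
    A toy version of the regional-deconvolution model described above, assuming binary region masks on the MR grid and a Gaussian approximation of the PET point spread function: the observed regional means are modeled as a weight matrix times the true regional activities, and each weight is obtained by averaging the PSF-smoothed mask of one region over another region. Names are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def regional_deconvolution(region_masks, observed_means, psf_sigma):
    """Recover true regional activities from observed regional means, given
    binary region masks and a Gaussian approximation of the PET PSF."""
    n = len(region_masks)
    smoothed = [gaussian_filter(m.astype(float), psf_sigma) for m in region_masks]
    W = np.empty((n, n))
    for i, target in enumerate(region_masks):
        inside = target > 0
        for j in range(n):
            W[i, j] = smoothed[j][inside].mean()   # spill of region j into region i
    return np.linalg.solve(W, np.asarray(observed_means, dtype=float))
```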

  2. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960s. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.
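
    The tool itself is a quasi-1D model with component efficiencies; as a hedged illustration of that style of analysis (not the Georgia Tech spreadsheet), the sketch below evaluates textbook ideal-ramjet relations with no component losses and constant gas properties. The heating value and cp defaults are generic assumptions.

```python
import math

def ideal_ramjet(M0, T0, tau_lambda, gamma=1.4, cp=1004.5, h_pr=42.8e6, g0=9.81):
    """Ideal-ramjet specific thrust (N per kg/s of air) and Isp (s) from the
    flight Mach number M0, ambient temperature T0 (K), and the burner
    total-temperature ratio tau_lambda = Tt4 / T0. Loss-free textbook relations."""
    R = cp * (gamma - 1.0) / gamma
    a0 = math.sqrt(gamma * R * T0)                    # ambient speed of sound
    tau_r = 1.0 + 0.5 * (gamma - 1.0) * M0 ** 2       # ram temperature ratio
    f = cp * T0 * (tau_lambda - tau_r) / h_pr         # fuel-air ratio
    spec_thrust = a0 * M0 * (math.sqrt(tau_lambda / tau_r) - 1.0)
    isp = spec_thrust / (f * g0)
    return spec_thrust, isp
```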

  3. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
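
    A hypothetical sketch of the kind of profile-based detection a knickpoint finder performs (not the published Python/ArcGIS implementation): flag points along a longitudinal stream profile where the local channel slope changes abruptly. The threshold and names are illustrative.

```python
import numpy as np

def find_knickpoints(distance, elevation, min_slope_change=0.05):
    """Return indices along a longitudinal stream profile (distance in m,
    elevation in m) where the local slope changes abruptly."""
    slope = np.gradient(elevation, distance)
    curvature = np.abs(np.gradient(slope, distance))
    return np.where(curvature > min_slope_change)[0]   # candidate knickpoints
```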

  4. Multi-frame partially saturated images blind deconvolution

    NASA Astrophysics Data System (ADS)

    Ye, Pengzhao; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2016-09-01

    When blurred images contain saturated or over-exposed pixels, conventional blind deconvolution approaches often fail to estimate an accurate point spread function (PSF) and introduce local ringing artifacts. In this paper, we propose a method to deal with this problem under a modified multi-frame blind deconvolution framework. First, in the kernel estimation step, a light streak detection scheme using multi-frame blurred images is incorporated into the regularization constraint. Second, we deal with image regions affected by saturated pixels separately, by modeling a weighted matrix during each multi-frame deconvolution iteration. Both synthetic and real-world examples show that more accurate PSFs can be estimated and that the restored images have richer details and fewer negative effects compared to state-of-the-art methods.
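
    A minimal sketch of the saturation-weighting idea only, assuming intensities normalized to [0, 1]: a Richardson-Lucy-style update in which clipped pixels receive zero weight, so they do not drive ringing. It omits the multi-frame kernel estimation and light-streak detection steps of the paper; names are illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

def weighted_rl_step(x, y, psf, sat_level=0.98):
    """One Richardson-Lucy-style update in which saturated observed pixels
    (y >= sat_level) get zero weight in the data term."""
    w = (y < sat_level).astype(float)                  # 0 for clipped pixels
    pred = np.maximum(fftconvolve(x, psf, mode="same"), 1e-12)
    ratio = w * (y / pred) + (1.0 - w)                 # saturated pixels contribute 1
    return x * fftconvolve(ratio, psf[::-1, ::-1], mode="same")
```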

  5. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic Atmospheric Agency (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines

  6. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  7. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; et al

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  8. SMART (Shop floor Modeling, Analysis and Reporting Tool) Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a projected directory. Only authorized users can gain access to this site.

  9. The RUMBA software: tools for neuroimaging data analysis.

    PubMed

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses.

  10. AstroStat-A VO tool for statistical analysis

    NASA Astrophysics Data System (ADS)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analyses are done using the public domain statistical software R, and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser, and as an offline application. AstroStat will provide an easy-to-use interface which allows for both fetching data and performing powerful statistical analysis on them.

  11. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool features a user interface and operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  12. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. The TSPT uses MODIS metadata to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
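    The temporal processing described above can be illustrated with a small, hypothetical sketch (this is not TSPT code): a maximum-value composite over a stack of NDVI scenes, with quality-flagged observations excluded so that cloud-depressed values do not contaminate the composite. The array shapes and the simple threshold standing in for a MODIS QA flag are assumptions for the example.

    ```python
    import numpy as np

    def max_value_composite(stack, bad_mask):
        """Maximum-value composite of a (time, rows, cols) NDVI stack.

        stack    : float array of NDVI values, one slice per acquisition date
        bad_mask : bool array, True where a pixel is flagged bad (cloud, etc.)
        Flagged observations are excluded before taking the temporal maximum,
        which suppresses cloud-depressed NDVI values.
        """
        masked = np.where(bad_mask, -np.inf, stack)
        composite = masked.max(axis=0)
        composite[np.isinf(composite)] = np.nan   # bad on every date
        return composite

    # Toy example: 4 dates over a 2x2 window, with a few cloudy observations.
    ndvi = np.array([[[0.60, 0.55], [0.10, 0.58]],
                     [[0.62, 0.57], [0.61, 0.60]],
                     [[0.20, 0.56], [0.63, 0.59]],
                     [[0.61, 0.54], [0.62, 0.57]]])
    bad = ndvi < 0.3          # stand-in for a MODIS QA cloud flag
    print(max_value_composite(ndvi, bad))
    ```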

  13. Deconvolution of dynamic dual photon microscopy images of cerebral microvasculature to assess the hemodynamic status of the brain

    NASA Astrophysics Data System (ADS)

    Mehrabian, Hatef; Lindvere, Liis; Stefanovic, Bojana; Martel, Anne L.

    2011-03-01

    Assessing the hemodynamic status of the brain and its variations in response to stimulations is required to understand the local cerebral circulatory mechanisms. Dynamic contrast enhanced imaging of cerebral microvasculature provides information that can be used in understanding the physiology of cerebral diseases. Bolus tracking is used to extract characteristic parameters that quantify local cerebral blood flow. However, post-processing of the data is needed to segment the field of view (FOV) and to perform deconvolution to remove the effects of the input bolus profile and the path it travels to reach the imaging window. Finding the arterial input function (AIF) and dealing with the ill-posedness of the deconvolution system are the main challenges of this process. We propose using independent component analysis (ICA) to segment the FOV and to extract a local AIF as well as the venous output function that is required for deconvolution. This also helps to stabilize the system, as ICA suppresses noise efficiently. Tikhonov regularization (with L-curve analysis to find the best regularization parameter) is used to make the system stable. In-vivo dynamic 2PLSM images of a rat brain in two conditions (when the animal is at rest and when it is stimulated) are used in this study. The experimental and simulation studies provided promising results that demonstrate the feasibility and importance of performing deconvolution.
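    As an illustration of the regularization step described above, the following is a minimal NumPy sketch (not the authors' code) of Tikhonov-regularized bolus-tracking deconvolution with an L-curve choice of the regularization weight. The causal convolution-matrix construction and the maximum-curvature corner criterion are standard textbook choices assumed here for the example.

    ```python
    import numpy as np

    def tikhonov_deconvolve(aif, tissue_curve, lambdas):
        """Deconvolve C(t) = AIF * R(t) with Tikhonov regularization.

        Returns the residue function chosen by the L-curve criterion, i.e. the
        point of maximum curvature of (log residual norm, log solution norm).
        """
        n = len(tissue_curve)
        # Lower-triangular (causal) convolution matrix built from the AIF.
        A = np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                      for i in range(n)])
        solutions, res_norms, sol_norms = [], [], []
        for lam in lambdas:
            x = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ tissue_curve)
            solutions.append(x)
            res_norms.append(np.linalg.norm(A @ x - tissue_curve))
            sol_norms.append(np.linalg.norm(x))
        # L-curve corner: point of maximum curvature in log-log space.
        r, s = np.log(res_norms), np.log(sol_norms)
        dr, ds = np.gradient(r), np.gradient(s)
        ddr, dds = np.gradient(dr), np.gradient(ds)
        curvature = np.abs(dr * dds - ds * ddr) / (dr**2 + ds**2) ** 1.5
        best = int(np.argmax(curvature))
        return solutions[best], lambdas[best]
    ```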

  14. Improved Cell Typing by Charge-State Deconvolution of matrix-assisted laser desorption/ionization Mass Spectra

    SciTech Connect

    Wilkes, Jon G.; Buzantu, Dan A.; Dare, Diane J.; Dragan, Yvonne P.; Chiarelli, M. Paul; Holland, Ricky D.; Beaudoin, Michael; Heinze, Thomas M.; Nayak, Rajesh; Shvartsburg, Alexandre A.

    2006-05-30

    Robust, specific, and rapid identification of toxic strains of bacteria and viruses, to guide the mitigation of their adverse health effects and optimum implementation of other response actions, remains a major analytical challenge. This need has driven the development of methods for classification of microorganisms using mass spectrometry, particularly matrix-assisted laser desorption/ionization MS (MALDI), which allows high throughput analyses with minimum sample preparation. We describe a novel approach to cell typing based on pattern recognition of MALDI spectra, which involves charge-state deconvolution in conjunction with a new correlation analysis procedure. The method is applicable to both prokaryotic and eukaryotic cells. Charge-state deconvolution improves the quantitative reproducibility of spectra because multiply charged ions resulting from the same biomarker carrying different numbers of protons are recognized and their abundances are combined. This allows a clearer distinction of bacterial strains or of cancerous and normal liver cells. Improved class distinction provided by charge-state deconvolution was demonstrated by cluster spacing on canonical variate score charts and by correlation analyses. Deconvolution may enhance detection of early disease state or therapy progress markers in various tissues analyzed by MALDI.
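    The core idea of charge-state deconvolution, combining the abundances of multiply protonated ions that come from the same biomarker, can be sketched in a few lines of Python (an illustrative toy, not the authors' implementation; the peak charges and mass tolerance are assumed inputs):

    ```python
    import numpy as np

    PROTON = 1.00728  # Da

    def charge_state_deconvolve(mz, intensity, charge, tol=0.5):
        """Merge peaks whose neutral masses agree within `tol` Da and sum their
        intensities, so each biomarker is counted once regardless of charge state."""
        neutral = np.asarray(charge) * (np.asarray(mz) - PROTON)
        order = np.argsort(neutral)
        merged = []  # list of [neutral_mass, summed_intensity]
        for m, i in zip(neutral[order], np.asarray(intensity, dtype=float)[order]):
            if merged and abs(m - merged[-1][0]) <= tol:
                merged[-1][1] += i   # same biomarker, different charge state
            else:
                merged.append([m, i])
        return merged

    # Toy spectrum: one ~5000 Da biomarker seen at z = 1..3 plus an unrelated peak.
    mz = [5001.01, 2501.01, 1667.67, 1200.00]
    inten = [40.0, 90.0, 30.0, 20.0]
    z = [1, 2, 3, 1]
    print(charge_state_deconvolve(mz, inten, z))
    ```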

  15. Federal metering data analysis needs and existing tools

    SciTech Connect

    Henderson, Jordan W.; Fowler, Kimberly M.

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  16. Interactive Software Fault Analysis Tool for Operational Anomaly Resolution

    NASA Technical Reports Server (NTRS)

    Chen, Ken

    2002-01-01

    Resolving software operational anomalies frequently requires a significant amount of resources for software troubleshooting activities. The time required to identify a root cause of the anomaly in the software may lead to significant timeline impacts and in some cases, may extend to compromise of mission and safety objectives. An integrated tool that supports software fault analysis based on the observed operational effects of an anomaly could significantly reduce the time required to resolve operational anomalies; increase confidence for the proposed solution; identify software paths to be re-verified during regression testing; and, as a secondary product of the analysis, identify safety critical software paths.

  17. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules are discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The resulting linear time-invariant state-space equations of motion are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle, which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  18. Optimizing depth estimates from magnetic anomalies using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    Curto, Julia B.; Diniz, Tatiana; Vidotti, Roberta M.; Blakely, Richard J.; Fuck, Reinhardt A.

    2015-11-01

    We offer a methodology to analyze the spatial and statistical properties of the tilt derivative of magnetic anomalies, thus facilitating the mapping of the location and depth of concealed magnetic sources. This methodology uses commonly available geographic information system (GIS) software to estimate and interpolate horizontal distances between key attributes of the tilt derivative, which then are used to estimate the depth and location of causative bodies. Application of the method to synthetic data illustrates its reliability in determining depths to magnetic contacts. We also achieved consistent depth results using real data from the northwest portion of the Paraná Basin, Brazil, where magnetic anomaly interpretations are complicated by low geomagnetic inclinations and rocks with remanent magnetization. The tilt-derivative method provides more continuous and higher resolution contact information than the 3D Euler deconvolution technique.
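    The relation the method exploits can be made concrete with a short NumPy sketch (illustrative only; the paper's GIS-based interpolation of contour distances is not reproduced). The tilt derivative is theta = arctan[(dT/dz) / sqrt((dT/dx)^2 + (dT/dy)^2)], and for a vertical contact with induced (reduced-to-pole) magnetization the depth equals half the horizontal distance between the +45 and -45 degree contours, which is the assumption used below.

    ```python
    import numpy as np

    def tilt_derivative(grid, dx, dy):
        """Tilt derivative of a gridded anomaly; vertical derivative via |k| filter."""
        ny, nx = grid.shape
        kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
        ky = 2 * np.pi * np.fft.fftfreq(ny, dy)
        KX, KY = np.meshgrid(kx, ky)
        dTdz = np.real(np.fft.ifft2(np.fft.fft2(grid) * np.sqrt(KX**2 + KY**2)))
        dTdy, dTdx = np.gradient(grid, dy, dx)   # rows = y, columns = x
        return np.arctan2(dTdz, np.hypot(dTdx, dTdy))

    def depth_from_tilt_profile(theta_profile, dx):
        """Depth to a vertical contact from a tilt profile crossing it:
        half the distance between the +45 and -45 degree points."""
        deg = np.degrees(theta_profile)
        i_plus = np.abs(deg - 45.0).argmin()
        i_minus = np.abs(deg + 45.0).argmin()
        return 0.5 * abs(i_plus - i_minus) * dx
    ```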

  19. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  20. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    PubMed

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  1. Graphical Tools for Network Meta-Analysis in STATA

    PubMed Central

    Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results. PMID:24098547

  2. Graphical tools for network meta-analysis in STATA.

    PubMed

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness, network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and the presentation of the results in a concise and understandable way are all challenging aspects of the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  3. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew

  4. Receiver function deconvolution using transdimensional hierarchical Bayesian inference

    NASA Astrophysics Data System (ADS)

    Kolb, J. M.; Lekić, V.

    2014-06-01

    Teleseismic waves can convert from shear to compressional (Sp) or compressional to shear (Ps) across impedance contrasts in the subsurface. Deconvolving the parent waveforms (P for Ps or S for Sp) from the daughter waveforms (S for Ps or P for Sp) generates receiver functions which can be used to analyse velocity structure beneath the receiver. Though a variety of deconvolution techniques have been developed, they are all adversely affected by background and signal-generated noise. In order to take into account the unknown noise characteristics, we propose a method based on transdimensional hierarchical Bayesian inference in which both the noise magnitude and noise spectral character are parameters in calculating the likelihood probability distribution. We use a reversible-jump implementation of a Markov chain Monte Carlo algorithm to find an ensemble of receiver functions whose relative fits to the data have been calculated while simultaneously inferring the values of the noise parameters. Our noise parametrization is determined from pre-event noise so that it approximates observed noise characteristics. We test the algorithm on synthetic waveforms contaminated with noise generated from a covariance matrix obtained from observed noise. We show that the method retrieves easily interpretable receiver functions even in the presence of high noise levels. We also show that we can obtain useful estimates of noise amplitude and frequency content. Analysis of the ensemble solutions produced by our method can be used to quantify the uncertainties associated with individual receiver functions as well as with individual features within them, providing an objective way for deciding which features warrant geological interpretation. This method should make possible more robust inferences on subsurface structure using receiver function analysis, especially in areas of poor data coverage or under noisy station conditions.
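    The hierarchical treatment of noise can be illustrated by the likelihood ingredient alone (a simplified sketch, not the authors' reversible-jump code): the noise magnitude and a correlation parameter enter the Gaussian likelihood and are sampled along with the receiver-function model, rather than being fixed beforehand. The exponential correlation form is an assumption made for brevity.

    ```python
    import numpy as np

    def log_likelihood(d_obs, d_pred, sigma, r):
        """Gaussian log-likelihood with hierarchical noise parameters.

        sigma : noise magnitude (hyperparameter)
        r     : lag-one correlation controlling the noise spectral character
        """
        n = len(d_obs)
        lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
        C = sigma**2 * r**lags                    # correlated-noise covariance
        resid = np.asarray(d_obs) - np.asarray(d_pred)
        _, logdet = np.linalg.slogdet(C)
        return -0.5 * (resid @ np.linalg.solve(C, resid)
                       + logdet + n * np.log(2 * np.pi))
    ```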

  5. VMPLOT: A versatile analysis tool for mission operations

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.

    1993-01-01

    VMPLOT is a versatile analysis tool designed by the Magellan Spacecraft Team to graphically display engineering data used to support mission operations. While there is nothing revolutionary or innovative about graphical data analysis tools, VMPLOT has some distinguishing features that set it apart from other custom or commercially available software packages. These features include the ability to utilize time in a Universal Time Coordinated (UTC) or Spacecraft Clock (SCLK) format as an enumerated data type, the ability to automatically scale both axes based on the data to be displayed (including time), the ability to combine data from different files, and the ability to utilize the program either interactively or in batch mode, thereby enhancing automation. Another important feature of VMPLOT not visible to the user is the set of software engineering philosophies utilized. A layered approach was used to isolate program functionality to different layers. This was done to increase program portability to different platforms and to ease maintenance and enhancements due to changing requirements. The functionality of the unique features of VMPLOT is described, highlighting the algorithms that make these features possible. The software engineering philosophies used in the creation of the software tool are also summarized.

  6. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  7. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We described the process for enabling the fire location, smoke forecast, smoke observation, and

  8. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of the structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural

  9. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  10. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages code samples shared by the World Wind community, and COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  11. Improving the efficiency of deconvolution algorithms for sound source localization.

    PubMed

    Lylloff, Oliver; Fernández-Grande, Efrén; Agerkvist, Finn; Hald, Jørgen; Roig, Elisabet Tiana; Andersen, Martin S

    2015-07-01

    The localization of sound sources with delay-and-sum (DAS) beamforming is limited by poor spatial resolution, particularly at low frequencies. Various methods based on deconvolution are examined to improve the resolution of the beamforming map, which can be modeled by a convolution of the unknown acoustic source distribution and the beamformer's response to a point source, i.e., the point-spread function. A significant limitation of deconvolution is, however, an additional computational effort compared to beamforming. In this paper, computationally efficient deconvolution algorithms are examined with computer simulations and experimental data. Specifically, the deconvolution problem is solved with a fast gradient projection method called the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA), and compared with a Fourier-based non-negative least squares algorithm. The results indicate that FISTA tends to provide an improved spatial resolution and is up to 30% faster and more robust to noise. In the spirit of reproducible research, the source code is available online. PMID:26233017
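    A minimal sketch of the accelerated projected-gradient idea behind FISTA applied to the deconvolution problem min 0.5||Ax - b||^2 subject to x >= 0 is shown below (the paper's full formulation is not reproduced; here A stands for the discretized point-spread-function matrix and b for the beamforming map, both assumed given):

    ```python
    import numpy as np

    def fista_nonneg(A, b, n_iter=500):
        """FISTA-style iteration: gradient step, non-negativity projection,
        and Nesterov momentum on the auxiliary point y."""
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        y, t = x.copy(), 1.0
        for _ in range(n_iter):
            grad = A.T @ (A @ y - b)
            x_new = np.maximum(y - grad / L, 0.0)
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            x, t = x_new, t_new
        return x

    # Tiny synthetic check: recover a sparse non-negative source distribution.
    rng = np.random.default_rng(0)
    A = np.abs(rng.normal(size=(60, 40)))
    x_true = np.zeros(40); x_true[[5, 20]] = [1.0, 2.0]
    b = A @ x_true + 0.01 * rng.normal(size=60)
    print(np.round(fista_nonneg(A, b)[[5, 20]], 2))
    ```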

  12. Deconvolution of astronomical images using SOR with adaptive relaxation.

    PubMed

    Vorontsov, S V; Strakhov, V N; Jefferies, S M; Borelli, K J

    2011-07-01

    We address the potential performance of the successive overrelaxation technique (SOR) in image deconvolution, focusing our attention on the restoration of astronomical images distorted by atmospheric turbulence. SOR is the classical Gauss-Seidel iteration, supplemented with relaxation. As indicated by earlier work, the convergence properties of SOR, and its ultimate performance in the deconvolution of blurred and noisy images, can be made competitive to other iterative techniques, including conjugate gradients, by a proper choice of the relaxation parameter. The question of how to choose the relaxation parameter, however, remained open, and in the practical work one had to rely on experimentation. In this paper, using constructive (rather than exact) arguments, we suggest a simple strategy for choosing the relaxation parameter and for updating its value in consecutive iterations to optimize the performance of the SOR algorithm (and its positivity-constrained version, +SOR) at finite iteration counts. We suggest an extension of the algorithm to the notoriously difficult problem of "blind" deconvolution, where both the true object and the point-spread function have to be recovered from the blurred image. We report the results of numerical inversions with artificial and real data, where the algorithm is compared with techniques based on conjugate gradients. In all of our experiments +SOR provides the highest quality results. In addition +SOR is found to be able to detect moderately small changes in the true object between separate data frames: an important quality for multi-frame blind deconvolution where stationarity of the object is a necessity.
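    For orientation, a plain SOR sweep over the normal equations, with an optional positivity clamp standing in for the +SOR variant, looks as follows (a sketch only; the adaptive choice and updating of the relaxation parameter, which is the paper's contribution, is not reproduced here):

    ```python
    import numpy as np

    def sor_deconvolve(A, b, omega, n_sweeps=50, nonneg=True):
        """Gauss-Seidel/SOR sweeps on the normal equations A^T A x = A^T b."""
        M, rhs = A.T @ A, A.T @ b
        x = np.zeros(M.shape[0])
        for _ in range(n_sweeps):
            for i in range(len(x)):
                sigma = M[i] @ x - M[i, i] * x[i]       # off-diagonal contribution
                gs = (rhs[i] - sigma) / M[i, i]         # Gauss-Seidel value
                x[i] = (1 - omega) * x[i] + omega * gs  # relaxation step
                if nonneg:
                    x[i] = max(x[i], 0.0)               # positivity constraint (+SOR)
        return x
    ```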

  13. Detection of gravity field source boundaries using deconvolution method

    NASA Astrophysics Data System (ADS)

    Zuo, Boxin; Hu, Xiangyun; Liang, Yin; Han, Qi

    2014-12-01

    Complications arise in the interpretation of gravity fields because of interference from systematic degradations, such as boundary blurring and distortion. The major sources of these degradations are the various systematic errors that inevitably occur during gravity field data acquisition, discretization and geophysical forward modelling. To address this problem, we evaluate a deconvolution method that aims to detect the clear horizontal boundaries of anomalous sources by the suppression of systematic errors. A convolution-based multilayer projection model, based on the classical 3-D gravity field forward model, is innovatively derived to model the systematic error degradation. Our deconvolution algorithm is specifically designed based on this multilayer projection model, in which three types of systematic error are defined. The degradations of the different systematic errors are considered in the deconvolution algorithm. As the primary source of degradation, the convolution-based systematic error is the main object of the multilayer projection model. Both the random systematic error and the projection systematic error are shown to form an integral part of the multilayer projection model, and the mixed norm regularization method and the primal-dual optimization method are therefore employed to control these errors and stabilize the deconvolution solution. We herein analyse the parameter identification and convergence of the proposed algorithms, and synthetic and field data sets are both used to illustrate their effectiveness. Additional synthetic examples are specifically designed to analyse the effects of the projection systematic error, which is caused by the uncertainty associated with the estimation of the impulse response function.

  14. Deconvolution of astronomical images using SOR with adaptive relaxation.

    PubMed

    Vorontsov, S V; Strakhov, V N; Jefferies, S M; Borelli, K J

    2011-07-01

    We address the potential performance of the successive overrelaxation technique (SOR) in image deconvolution, focusing our attention on the restoration of astronomical images distorted by atmospheric turbulence. SOR is the classical Gauss-Seidel iteration, supplemented with relaxation. As indicated by earlier work, the convergence properties of SOR, and its ultimate performance in the deconvolution of blurred and noisy images, can be made competitive to other iterative techniques, including conjugate gradients, by a proper choice of the relaxation parameter. The question of how to choose the relaxation parameter, however, remained open, and in the practical work one had to rely on experimentation. In this paper, using constructive (rather than exact) arguments, we suggest a simple strategy for choosing the relaxation parameter and for updating its value in consecutive iterations to optimize the performance of the SOR algorithm (and its positivity-constrained version, +SOR) at finite iteration counts. We suggest an extension of the algorithm to the notoriously difficult problem of "blind" deconvolution, where both the true object and the point-spread function have to be recovered from the blurred image. We report the results of numerical inversions with artificial and real data, where the algorithm is compared with techniques based on conjugate gradients. In all of our experiments +SOR provides the highest quality results. In addition +SOR is found to be able to detect moderately small changes in the true object between separate data frames: an important quality for multi-frame blind deconvolution where stationarity of the object is a necessity. PMID:21747506

  15. Basic statistical tools in research and data analysis

    PubMed Central

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis. PMID:27729694
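    As a small illustration of the parametric versus non-parametric distinction mentioned above (written in Python/SciPy purely for concreteness, not tied to any package discussed in the article): the t-test assumes approximately normal data, while the Mann-Whitney U test is its rank-based alternative.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    group_a = rng.normal(5.0, 1.0, size=30)   # e.g. control measurements
    group_b = rng.normal(5.8, 1.0, size=30)   # e.g. treatment measurements

    # Parametric comparison of two independent groups (assumes normality).
    t_stat, t_p = stats.ttest_ind(group_a, group_b)

    # Non-parametric alternative when normality is doubtful.
    u_stat, u_p = stats.mannwhitneyu(group_a, group_b)

    print(f"t-test p = {t_p:.4f}, Mann-Whitney p = {u_p:.4f}")
    ```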

  16. TA-DA: A Tool for Astrophysical Data Analysis

    NASA Astrophysics Data System (ADS)

    Da Rio, Nicola; Robberto, Massimo

    2012-12-01

    We present the Tool for Astrophysical Data Analysis (TA-DA), new software aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and at allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

  17. TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS

    SciTech Connect

    Da Rio, Nicola; Robberto, Massimo

    2012-12-01

    We present the Tool for Astrophysical Data Analysis (TA-DA), new software aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and at allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

  18. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  19. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for obtaining the maximum liquefaction of the plant considering the constraints of other parameters. The analysis results so obtained give a clear idea for deciding various parameter values before implementation of the actual plant in the field. They also give an idea about the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  20. The Smooth Decomposition as a nonlinear modal analysis tool

    NASA Astrophysics Data System (ADS)

    Bellizzi, Sergio; Sampaio, Rubens

    2015-12-01

    The Smooth Decomposition (SD) is a statistical analysis technique for finding structures in an ensemble of spatially distributed data such that the vector directions not only keep the maximum possible variance but also the motions, along the vector directions, are as smooth in time as possible. In this paper, the notion of the dual smooth modes is introduced and used in the framework of oblique projection to expand a random response of a system. The dual modes define a tool that turns the SD into an efficient modal analysis tool. The main properties of the SD are discussed and some new optimality properties of the expansion are deduced. The parameters of the SD give access to the modal parameters of a linear system (mode shapes, resonance frequencies and modal energy participations). In the case of nonlinear systems, a richer picture of the evolution of the modes versus energy can be obtained by analyzing the responses under several excitation levels. This novel analysis of a nonlinear system is illustrated by an example.
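    The decomposition underlying the SD (often called smooth orthogonal decomposition) reduces to a generalized eigenproblem between the covariance of the responses and the covariance of their time derivatives. The sketch below is a rough illustration under that assumption and does not reproduce the dual-mode/oblique-projection machinery of the paper; the frequency estimate in the comment holds only for lightly damped linear systems.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    def smooth_decomposition(X, dt):
        """Smooth decomposition of an ensemble X of shape (n_samples, n_sensors).

        Directions maximize variance while keeping the projected motion smooth
        in time, i.e. they solve  Cov(X) a = lam Cov(dX/dt) a.
        """
        V = np.gradient(X, dt, axis=0)        # time derivative of the ensemble
        Sx = np.cov(X, rowvar=False)
        Sv = np.cov(V, rowvar=False)
        lam, modes = eigh(Sx, Sv)             # generalized symmetric eigenproblem
        order = np.argsort(lam)[::-1]         # most energetic/smooth first
        return lam[order], modes[:, order]

    # For a lightly damped linear system the dominant smooth modes approximate
    # the mode shapes, and 1/sqrt(lam) approximates the angular resonance frequencies.
    ```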

  1. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. The results are in the form of community noise metrics and

  2. Breast image feature learning with adaptive deconvolutional networks

    NASA Astrophysics Data System (ADS)

    Jamieson, Andrew R.; Drukker, Karen; Giger, Maryellen L.

    2012-03-01

    Feature extraction is a critical component of medical image analysis. Many computer-aided diagnosis approaches employ hand-designed, heuristic lesion extracted features. An alternative approach is to learn features directly from images. In this preliminary study, we explored the use of Adaptive Deconvolutional Networks (ADN) for learning high-level features in diagnostic breast mass lesion images, with potential application to computer-aided diagnosis (CADx) and content-based image retrieval (CBIR). ADNs (Zeiler et al., 2011) are recently proposed unsupervised, generative hierarchical models that decompose images via convolutional sparse coding and max pooling. We trained the ADNs to learn multiple layers of representation for two breast image data sets on two different modalities (739 full-field digital mammography (FFDM) images and 2393 ultrasound images). Feature map calculations were accelerated by use of GPUs. Following Zeiler et al., we applied the Spatial Pyramid Matching (SPM) kernel (Lazebnik et al., 2006) on the inferred feature maps and combined this with a linear support vector machine (SVM) classifier for the task of binary classification between cancer and non-cancer breast mass lesions. Non-linear, local structure preserving dimension reduction, Elastic Embedding (Carreira-Perpiñán, 2010), was then used to visualize the SPM kernel output in 2D and qualitatively inspect the image relationships learned. Performance was found to be competitive with current CADx schemes that use human-designed features, e.g., achieving a 0.632+ bootstrap AUC (by case) of 0.83 [0.78, 0.89] for an ultrasound image set (1125 cases).

  3. Fluence estimation by deconvolution via l1-norm minimization

    NASA Astrophysics Data System (ADS)

    García Hernández, J. C.; Lazaro-Ponthus, D.; Gmar, M.; Barthe, J.

    2011-03-01

    Advances in radiotherapy irradiation techniques have led to very complex treatments requiring more stringent control. The dosimetric properties of electronic portal imaging devices (EPID) encouraged their use for treatment verification. Two main approaches have been proposed: the forward approach, where measured portal dose images are compared to predicted dose images, and the backward approach, where EPID images are used to estimate the dose delivered to the patient. Both approaches need EPID images to be converted into a fluence distribution by deconvolution. However, deconvolution is an ill-posed problem which is very sensitive to small variations in the input data. This study presents the application of a deconvolution method based on l1-norm minimization; this is a method known for being very stable while working with noisy data. The algorithm was first evaluated on synthetic images with different noise levels, and the results were satisfactory. The deconvolution algorithm was then applied to experimental portal images; the required EPID response kernel and energy fluence images were computed by Monte-Carlo calculation, and the accelerator treatment head and EPID models had already been commissioned in a previous work. The obtained fluence images were in good agreement with simulated fluence images. This deconvolution algorithm may be generalized to an inverse problem with a general operator, where image formation is no longer modeled by a convolution but by a linear operation that might be seen as a position-dependent convolution. Moreover, this procedure would be detector independent and could be used for any detector type provided its response function is known.
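    The optimization problem being solved can be written down compactly. The sketch below uses the cvxpy modeling library (an assumption of this example; the paper presents its own algorithm) to solve min ||x||_1 subject to ||Kx - y||_2 <= eps, with a generic Gaussian blur standing in for the EPID response kernel:

    ```python
    import numpy as np
    import cvxpy as cp

    def l1_deconvolve(K, y, eps):
        """Estimate the fluence x from a portal image y = K x + noise by
        l1-norm minimization with a bounded data misfit."""
        x = cp.Variable(K.shape[1])
        problem = cp.Problem(cp.Minimize(cp.norm1(x)),
                             [cp.norm(K @ x - y, 2) <= eps])
        problem.solve()
        return x.value

    # Toy 1-D check with a Gaussian blur kernel standing in for the response.
    n = 64
    t = np.arange(n)
    K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 2.0) ** 2)
    x_true = np.zeros(n); x_true[[20, 40]] = [1.0, 0.5]
    y = K @ x_true + 0.01 * np.random.default_rng(0).normal(size=n)
    sol = l1_deconvolve(K, y, eps=0.2)
    print(np.flatnonzero(np.abs(sol) > 0.05))   # support of the recovered fluence
    ```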

  4. CFD Methods and Tools for Multi-Element Airfoil Analysis

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.; George, Michael W. (Technical Monitor)

    1995-01-01

    This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, short-comings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined, one which uses multi-block patched grids, the other uses overset chimera grids. Turbulence and transition modeling will be discussed.

  5. A POST ANALYSIS OF A PREVENTIVE AND CHRONIC HEALTHCARE TOOL.

    PubMed

    Borrayo, Brooke D; O'Lawrence, Henry

    2016-01-01

    This study uses the data set from Kaiser Permanente to examine the post-implementation of a preventive and chronic care tool that utilizes a clinical information system, delivery system design, and clinical decision support to maximize the office visit. The analysis suggests a significant positive relationship between the frequency of tool utilization and rates of addressing preventive and chronic care gaps. No significant positive relationship was found with the successfully captured rate, which reflects closing the care gap within 45 days. The use of the preventive care tool will assist members in satisfying the preventive care gap, cervical cancer screening, within 45 days of the encounter. PMID:27483973

  6. Integrated network analysis and effective tools in plant systems biology

    PubMed Central

    Fukushima, Atsushi; Kanaya, Shigehiko; Nishida, Kozo

    2014-01-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolisms. PMID:25408696

  7. BEDTools: the Swiss-army tool for genome feature analysis

    PubMed Central

    Quinlan, Aaron R.

    2014-01-01

    Technological advances have enabled the use of DNA sequencing as a flexible tool to characterize genetic variation and to measure the activity of diverse cellular phenomena such as gene isoform expression and transcription factor binding. Extracting biological insight from the experiments enabled by these advances demands the analysis of large, multi-dimensional datasets. This unit describes the use of the BEDTools toolkit for the exploration of high-throughput genomics datasets. I present several protocols for common genomic analyses and demonstrate how simple BEDTools operations may be combined to create bespoke pipelines addressing complex questions. PMID:25199790

  8. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity.
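    The two quantities named above, relative wavelet energy and (normalized) wavelet entropy, are straightforward to compute; a minimal sketch using the PyWavelets package is given below (the wavelet family, decomposition depth, and toy signal are arbitrary choices for the example, and the statistical complexity measure is not included):

    ```python
    import numpy as np
    import pywt

    def wavelet_information_measures(signal, wavelet="db4", level=5):
        """Relative wavelet energies per band and normalized wavelet entropy."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        energies = np.array([np.sum(c**2) for c in coeffs])
        p = energies / energies.sum()                        # relative energies
        entropy = -np.sum(p * np.log(p)) / np.log(len(p))    # normalized to [0, 1]
        return p, entropy

    # Toy epoch: a 10 Hz rhythm buried in noise, sampled at 256 Hz for 4 s.
    fs = 256
    t = np.arange(0, 4, 1 / fs)
    epoch = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)
    rel_energy, w_entropy = wavelet_information_measures(epoch)
    print(np.round(rel_energy, 3), round(w_entropy, 3))
    ```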

  9. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    NASA Technical Reports Server (NTRS)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  10. Sub-diffraction limit differentiation of single fluorophores using Single Molecule Image Deconvolution (SMID)

    NASA Astrophysics Data System (ADS)

    Decenzo, Shawn H.; Desantis, Michael C.; Wang, Y. M.

    2009-03-01

    In order to better understand biological systems, researchers demand new techniques and improvements in single molecule differentiation. We present a unique approach utilizing an analysis of the standard deviation of the Gaussian point spread function of single immobile fluorescent molecules. This technique, Single Molecule Image Deconvolution (SMID), is applicable to standard TIRF instrumentation and standard fluorophores. We demonstrate the method by measuring the separation of two Cy3 molecules attached to the ends of short double-stranded DNA immobilized on a surface without photobleaching. Preliminary results and further applications will be presented.
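    The idea of analyzing the fitted standard deviation of the point spread function can be shown with a small SciPy sketch (an illustration, not the SMID code): two emitters closer than the diffraction limit still fit well to a single Gaussian, but the fitted width exceeds the single-emitter width, and that excess encodes the separation. The pixel scale and PSF width below are arbitrary.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, amp, mu, sigma, offset):
        return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

    def fitted_sigma(x, profile):
        """Fit a Gaussian to an intensity profile and return its standard deviation."""
        p0 = [profile.max() - profile.min(), x[profile.argmax()], 2.0, profile.min()]
        popt, _ = curve_fit(gaussian, x, profile, p0=p0)
        return abs(popt[2])

    x = np.arange(0, 40, 1.0)              # pixel coordinate
    psf_sigma, sep = 3.0, 2.5              # single-emitter width, sub-PSF separation
    single = gaussian(x, 1.0, 20.0, psf_sigma, 0.0)
    pair = (gaussian(x, 1.0, 20 - sep / 2, psf_sigma, 0.0)
            + gaussian(x, 1.0, 20 + sep / 2, psf_sigma, 0.0))
    print(fitted_sigma(x, single), fitted_sigma(x, pair))   # pair is measurably wider
    ```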

  11. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  12. MetaNetVar: Pipeline for applying network analysis tools for genomic variants analysis

    PubMed Central

    Moyer, Eric; Hagenauer, Megan; Lesko, Matthew; Francis, Felix; Rodriguez, Oscar; Nagarajan, Vijayaraj; Huser, Vojtech; Busby, Ben

    2016-01-01

    Network analysis can make variant analysis better. There are existing tools like HotNet2 and dmGWAS that can provide various analytical methods. We developed a prototype of a pipeline called MetaNetVar that allows execution of multiple tools. The code is published at https://github.com/NCBI-Hackathons/Network_SNPs. A working prototype is published as an Amazon Machine Image (ami-4510312f). PMID:27158457

  13. The LAGRANTO Lagrangian analysis tool - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10^-6 K m^2 kg^-1 s^-1). Full versions of this new release of LAGRANTO are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.
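    To make the selection syntax concrete, here is a toy re-implementation of a single "GT:PV:2"-style criterion in Python (illustration only; LAGRANTO's actual command language supports many more operators, time specifications, and combined criteria):

    ```python
    import numpy as np

    OPS = {"GT": np.greater, "LT": np.less, "GE": np.greater_equal, "LE": np.less_equal}

    def select_trajectories(trajs, criterion):
        """Keep trajectories satisfying e.g. "GT:PV:2" at any time along the path.

        trajs maps a variable name (e.g. "PV") to an array of shape
        (n_trajectories, n_times)."""
        op_name, var, threshold = criterion.split(":")
        return OPS[op_name](trajs[var], float(threshold)).any(axis=1)

    pv = np.array([[0.5, 1.0, 2.5],    # crosses 2 PVU -> selected
                   [0.3, 0.8, 1.2]])   # stays below   -> rejected
    print(select_trajectories({"PV": pv}, "GT:PV:2"))
    ```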

  14. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is

  15. Net energy analysis: Powerful tool for selecting electric power options

    NASA Astrophysics Data System (ADS)

    Baron, S.

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  16. PyRAT (python radiography analysis tool): overview

    SciTech Connect

    Armstrong, Jerawan C; Temple, Brian A; Buescher, Kevin L

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization based inversion approach with goal of identifying unknown object configurations - MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define feasible region and discrete neighbor for the MVO problem - initial data analysis + material library -> a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.

  17. Stacks: an analysis tool set for population genomics

    PubMed Central

    CATCHEN, JULIAN; HOHENLOHE, PAUL A.; BASSHAM, SUSAN; AMORES, ANGEL; CRESKO, WILLIAM A.

    2014-01-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics. PMID:23701397
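
    The kernel-smoothed sliding window mentioned above can be sketched in a few lines. The snippet below is an illustrative stand-alone example, not Stacks code; the SNP positions, the per-SNP statistic, the window spacing, and the bandwidth are all placeholders.

        import numpy as np

        def smoothed_windows(positions, values, centers, bandwidth=150_000):
            # Gaussian-kernel weighted average of a per-SNP statistic around each window center.
            positions = np.asarray(positions, dtype=float)
            values = np.asarray(values, dtype=float)
            out = np.empty(len(centers))
            for i, c in enumerate(centers):
                w = np.exp(-0.5 * ((positions - c) / bandwidth) ** 2)
                out[i] = np.sum(w * values) / w.sum() if w.sum() > 0 else np.nan
            return out

        # Toy example: 1000 SNPs with an F_ST-like statistic on a 10-Mb scaffold.
        pos = np.sort(np.random.randint(0, 10_000_000, 1000))
        fst = np.clip(np.random.normal(0.05, 0.03, 1000), 0.0, 1.0)
        centers = np.arange(0, 10_000_000, 100_000)
        print(smoothed_windows(pos, fst, centers)[:5])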

  18. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detailed design stage of the product development process.

  19. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY '93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  20. Net energy analysis - powerful tool for selecting electric power options

    SciTech Connect

    Baron, S.

    1995-12-01

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners, and the net energy analysis technique is an excellent accounting system for the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  1. The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy

    NASA Astrophysics Data System (ADS)

    Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.

    2008-12-01

    In recent years, the amount of data available to climate scientists has grown exponentially. Whether we're looking at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth of the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Comparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

  2. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
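
    To make the radial-intensity-plot step concrete, here is a hedged NumPy sketch (not the Ganalyzer source): intensity is sampled on circles of increasing radius around the galaxy centre, and the angular drift of the brightest peak from one radius to the next approximates the slope used as a spirality measure. The function names and sampling choices are illustrative only.

        import numpy as np

        def radial_intensity(image, center, radius, n_theta=360):
            # Sample pixel intensity on a circle of the given radius, as a function of angle.
            theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
            xs = center[1] + radius * np.cos(theta)
            ys = center[0] + radius * np.sin(theta)
            xi = np.clip(np.round(xs).astype(int), 0, image.shape[1] - 1)
            yi = np.clip(np.round(ys).astype(int), 0, image.shape[0] - 1)
            return theta, image[yi, xi]

        def peak_angle(image, center, radius):
            # Angular position of the brightest sample at this radius; tracked
            # across radii, its drift approximates the arm slope (spirality).
            theta, profile = radial_intensity(image, center, radius)
            return theta[np.argmax(profile)]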

  3. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.

  4. CRITICA: coding region identification tool invoking comparative analysis

    NASA Technical Reports Server (NTRS)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
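
    The dicodon (hexanucleotide) bias component described above can be illustrated with a short, self-contained sketch; it is not the CRITICA source code, and the pseudocount and scoring choices are assumptions. Known coding sequences supply in-frame hexamer counts, and a candidate ORF is scored by summing log-odds against a background model.

        from collections import Counter
        from math import log

        def dicodon_counts(coding_seqs):
            # Count in-frame hexanucleotides ("dicodons") over known coding sequences.
            counts = Counter()
            for s in coding_seqs:
                s = s.upper()
                for i in range(0, len(s) - 5, 3):
                    counts[s[i:i + 6]] += 1
            return counts

        def dicodon_score(candidate_orf, coding_counts, background_counts, pseudo=1.0):
            # Sum of log-odds (coding vs. background) over the ORF's in-frame hexamers.
            total_c = sum(coding_counts.values())
            total_b = sum(background_counts.values())
            score = 0.0
            for i in range(0, len(candidate_orf) - 5, 3):
                hexamer = candidate_orf[i:i + 6].upper()
                p_c = (coding_counts[hexamer] + pseudo) / (total_c + pseudo * 4 ** 6)
                p_b = (background_counts[hexamer] + pseudo) / (total_b + pseudo * 4 ** 6)
                score += log(p_c / p_b)
            return score  # positive values favour the coding hypothesis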

  5. Tutorial on platform for optical topography analysis tools.

    PubMed

    Sutoko, Stephanie; Sato, Hiroki; Maki, Atsushi; Kiguchi, Masashi; Hirabayashi, Yukiko; Atsumori, Hirokazu; Obata, Akiko; Funane, Tsukasa; Katura, Takusige

    2016-01-01

    Optical topography/functional near-infrared spectroscopy (OT/fNIRS) is a functional imaging technique that noninvasively measures cerebral hemoglobin concentration changes caused by neural activities. The fNIRS method has been extensively implemented to understand brain activity in many applications, such as neurodisorder diagnosis and treatment, cognitive psychology, and psychiatric status evaluation. To assist users in analyzing fNIRS data with various application purposes, we developed software called the platform for optical topography analysis tools (POTATo). We explain how to handle and analyze fNIRS data in the POTATo package and systematically describe domain preparation, temporal preprocessing, functional signal extraction, statistical analysis, and data/result visualization for a practical example of working memory tasks. This example is expected to give clear insight into analyzing data using POTATo. The results specifically show that the observed activation of the dorsolateral prefrontal cortex is consistent with previous studies. This emphasizes analysis robustness, which is required for validating decent preprocessing and functional signal interpretation. POTATo also provides a self-developed plug-in feature allowing users to create their own functions and incorporate them with established POTATo functions. With this feature, we continuously encourage users to improve fNIRS analysis methods. We also address the complications in signal analysis and the opportunities for resolving them.

  6. ELECTRA © Launch and Re-Entry Safety Analysis Tool

    NASA Astrophysics Data System (ADS)

    Lazare, B.; Arnal, M. H.; Aussilhou, C.; Blazquez, A.; Chemama, F.

    2010-09-01

    The French Space Operations Act gives the National Technical Regulations the prime objective of protecting people, property, public health and the environment. In this frame, an independent technical assessment of French space operations is delegated to CNES. To perform this task, and also for its own operations, CNES needs efficient state-of-the-art tools for evaluating risks. The development of the ELECTRA© tool, undertaken in 2007, meets the requirement for precise quantification of the risks involved in the launch and re-entry of spacecraft. The ELECTRA© project draws on the proven expertise of CNES technical centers in the field of flight analysis and safety, spaceflight dynamics and the design of spacecraft. The ELECTRA© tool was specifically designed to evaluate the risks involved in the re-entry and return to Earth of all or part of a spacecraft. It will also be used for locating and visualizing nominal or accidental re-entry zones while comparing them with suitable geographic data such as population density, urban areas, and shipping lines, among others. The method chosen for ELECTRA© consists of two main steps: calculating the possible reentry trajectories for each fragment after the spacecraft breaks up; and calculating the risks while taking into account the energy of the fragments, the population density and the protection afforded by buildings. For launch operations and active re-entry, the risk calculation will be weighted by the probability of instantaneous failure of the spacecraft and integrated over the whole trajectory. ELECTRA©'s development is today at the end of the validation phase, the last step before delivery to users. The validation process has been performed in different ways: numerical verification of the risk formulation; benchmarking of the casualty area, the energy levels of the fragment entries and the level of protection afforded by housing modules; best practices in the space transportation industry concerning dependability evaluation; benchmarking process for

  7. Image restoration for confocal microscopy: improving the limits of deconvolution, with application to the visualization of the mammalian hearing organ.

    PubMed

    Boutet de Monvel, J; Le Calvez, S; Ulfendahl, M

    2001-05-01

    Deconvolution algorithms have proven very effective in conventional (wide-field) fluorescence microscopy. Their application to confocal microscopy is hampered, in biological experiments, by the presence of important levels of noise in the images and by the lack of a precise knowledge of the point spread function (PSF) of the system. We investigate the application of wavelet-based processing tools to deal with these problems, in particular wavelet denoising methods, which turn out to be very effective in application to three-dimensional confocal images. When used in combination with more classical deconvolution algorithms, these methods provide a robust and efficient restoration scheme allowing one to deal with difficult imaging conditions. To make our approach applicable in practical situations, we measured the PSF of a Biorad-MRC1024 confocal microscope under a large set of imaging conditions, including in situ acquisitions. As a specific biological application, we present several examples of restorations of three-dimensional confocal images acquired inside an intact preparation of the hearing organ. We also provide a quantitative assessment of the gain in quality achieved by wavelet-aided restorations over classical deconvolution schemes, based on a set of numerical experiments that we performed with test images.
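
    In the same spirit as the restoration scheme described above (but not the authors' code), a wavelet denoising step can be chained with a classical deconvolution step using off-the-shelf scikit-image routines. The toy object, the synthetic Gaussian stand-in for the measured PSF, the noise level, and the iteration count are all assumptions for illustration.

        import numpy as np
        from scipy.ndimage import gaussian_filter
        from skimage.restoration import denoise_wavelet, richardson_lucy

        rng = np.random.default_rng(0)
        truth = np.zeros((128, 128))
        truth[40:60, 50:90] = 1.0                                  # toy object
        psf = np.zeros((15, 15))
        psf[7, 7] = 1.0
        psf = gaussian_filter(psf, 2.0)
        psf /= psf.sum()                                           # synthetic stand-in PSF
        blurred = gaussian_filter(truth, 2.0)                      # same blur as the PSF
        noisy = rng.poisson(blurred * 50).astype(float) / 50.0     # photon-limited image

        denoised = denoise_wavelet(noisy, rescale_sigma=True)      # wavelet prefilter
        restored = richardson_lucy(denoised, psf, 30)              # classical deconvolution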

  8. Discriminating adenocarcinoma from normal colonic mucosa through deconvolution of Raman spectra

    NASA Astrophysics Data System (ADS)

    Cambraia Lopes, Patricia; Moreira, Joaquim Agostinho; Almeida, Abilio; Esteves, Artur; Gregora, Ivan; Ledinsky, Martin; Lopes, Jose Machado; Henrique, Rui; Oliveira, Albino

    2011-12-01

    In this work, we considered the feasibility of Raman spectroscopy for discriminating between adenocarcinomatous and normal mucosal formalin-fixed colonic tissues. Unlike earlier studies in colorectal cancer, a spectral deconvolution model was implemented to derive spectral information. Eleven samples of human colon were used, and 55 spectra were analyzed. Each spectrum was resolved into 25 bands from 975 to 1720 cm-1, where modes of proteins, lipids, and nucleic acids are observed. From a comparative study of band intensities, those presenting higher differences between tissue types were correlated to biochemical assignments. Results from the fitting procedure were further used as inputs for linear discriminant analysis, where combinations of band intensities and intensity ratios were tested, yielding accuracies up to 81%. This analysis yields objective discriminating parameters after fitting optimization. The bands with higher diagnostic relevance detected by spectral deconvolution make it possible to confine the study to selected spectral regions instead of broader ranges. A critical view of the limitations of this approach is presented, along with a comparison of our results to earlier ones obtained in fresh colonic tissues. This enabled us to assess the effect of formalin fixation on colonic tissues and to determine its relevance in the present analysis.
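
    A compact sketch of the two analysis stages described above is given below; it is not the authors' code, and the Lorentzian line shape, starting widths, and the way band amplitudes are fed to the discriminant are assumptions. Fitted band parameters from many spectra would form the feature matrix passed to a linear discriminant classifier.

        import numpy as np
        from scipy.optimize import curve_fit
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def lorentzian(x, amp, x0, w):
            return amp * w ** 2 / ((x - x0) ** 2 + w ** 2)

        def multi_lorentzian(x, *params):
            # params is a flat list: amp1, x01, w1, amp2, x02, w2, ...
            y = np.zeros_like(x, dtype=float)
            for i in range(0, len(params), 3):
                y += lorentzian(x, *params[i:i + 3])
            return y

        def fit_bands(wavenumber, intensity, band_centers, width0=10.0):
            # Decompose one spectrum into a sum of Lorentzian bands by least squares.
            p0 = []
            for c in band_centers:
                p0 += [intensity.max() / len(band_centers), c, width0]
            popt, _ = curve_fit(multi_lorentzian, wavenumber, intensity, p0=p0, maxfev=20000)
            return np.array(popt).reshape(-1, 3)   # rows: (amplitude, centre, width)

        # With X = band amplitudes per spectrum and y = tissue labels:
        # lda = LinearDiscriminantAnalysis().fit(X, y); predictions = lda.predict(X_new)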

  9. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    PubMed

    Huang, Lihan

    2014-02-01

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggested that the IPMP 2013 could be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology.
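
    As an illustration of the kind of fit IPMP 2013 automates (not its internal code), the re-parameterized (Zwietering-type) Gompertz growth model can be fitted to log-count data directly with SciPy; the data points and starting values below are invented for the example.

        import numpy as np
        from scipy.optimize import curve_fit

        def gompertz(t, y0, A, mu_max, lam):
            # Re-parameterized Gompertz: y0 = initial log count, A = total increase,
            # mu_max = maximum specific growth rate, lam = lag time.
            return y0 + A * np.exp(-np.exp(mu_max * np.e * (lam - t) / A + 1.0))

        t = np.array([0, 2, 4, 6, 8, 10, 12, 16, 20, 24], dtype=float)          # hours
        logN = np.array([3.0, 3.1, 3.3, 4.0, 5.1, 6.2, 7.0, 7.8, 8.0, 8.1])     # log CFU/g
        popt, pcov = curve_fit(gompertz, t, logN, p0=[3.0, 5.0, 0.5, 3.0])
        print(dict(zip(["y0", "A", "mu_max", "lag"], np.round(popt, 3))))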

  10. Cellular barcoding tool for clonal analysis in the hematopoietic system.

    PubMed

    Gerrits, Alice; Dykstra, Brad; Kalmykowa, Olga J; Klauke, Karin; Verovskaya, Evgenia; Broekhuis, Mathilde J C; de Haan, Gerald; Bystrykh, Leonid V

    2010-04-01

    Clonal analysis is important for many areas of hematopoietic stem cell research, including in vitro cell expansion, gene therapy, and cancer progression and treatment. A common approach to measure clonality of retrovirally transduced cells is to perform integration site analysis using Southern blotting or polymerase chain reaction-based methods. Although these methods are useful in principle, they generally provide a low-resolution, biased, and incomplete assessment of clonality. To overcome those limitations, we labeled retroviral vectors with random sequence tags or "barcodes." On integration, each vector introduces a unique, identifiable, and heritable mark into the host cell genome, allowing the clonal progeny of each cell to be tracked over time. By coupling the barcoding method to a sequencing-based detection system, we could identify major and minor clones in 2 distinct cell culture systems in vitro and in a long-term transplantation setting. In addition, we demonstrate how clonal analysis can be complemented with transgene expression and integration site analysis. This cellular barcoding tool permits a simple, sensitive assessment of clonality and holds great promise for future gene therapy protocols in humans, and any other applications when clonal tracking is important.

  11. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

  12. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    NASA Astrophysics Data System (ADS)

    González, Adriana; Delouille, Véronique; Jacques, Laurent

    2016-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrarily to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a similar quality to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.
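
    The article's scheme couples a wavelet image prior with weak PSF assumptions; as a much simpler illustration of the alternating structure common to blind deconvolution, the sketch below runs a toy Richardson-Lucy loop that alternately refines the image and the PSF. It is explicitly not the method of the paper, and the iteration counts and initializations are arbitrary.

        import numpy as np
        from scipy.signal import fftconvolve

        def rl_step(estimate, kernel, data, eps=1e-12):
            # One Richardson-Lucy multiplicative update of `estimate` given `kernel`.
            forward = fftconvolve(estimate, kernel, mode="same")
            ratio = data / np.maximum(forward, eps)
            return estimate * fftconvolve(ratio, kernel[::-1, ::-1], mode="same")

        def blind_rl(data, outer=10, inner=5):
            data = np.asarray(data, dtype=float)
            image = np.full_like(data, data.mean())          # flat starting image
            psf = np.ones_like(data) / data.size             # flat starting PSF (full frame)
            for _ in range(outer):
                for _ in range(inner):                       # refine PSF with the image fixed
                    psf = np.clip(rl_step(psf, image, data), 0.0, None)
                    psf /= psf.sum()
                for _ in range(inner):                       # refine image with the PSF fixed
                    image = np.clip(rl_step(image, psf, data), 0.0, None)
            return image, psf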

  13. De-convoluting mixed crude oil in Prudhoe Bay Field, North Slope, Alaska

    USGS Publications Warehouse

    Peters, K.E.; Scott, Ramos L.; Zumberge, J.E.; Valin, Z.C.; Bird, K.J.

    2008-01-01

    Seventy-four crude oil samples from the Barrow arch on the North Slope of Alaska were studied to assess the relative volumetric contributions from different source rocks to the giant Prudhoe Bay Field. We applied alternating least squares to concentration data (ALS-C) for 46 biomarkers in the range C19-C35 to de-convolute mixtures of oil generated from carbonate rich Triassic Shublik Formation and clay rich Jurassic Kingak Shale and Cretaceous Hue Shale-gamma ray zone (Hue-GRZ) source rocks. ALS-C results for 23 oil samples from the prolific Ivishak Formation reservoir of the Prudhoe Bay Field indicate approximately equal contributions from Shublik Formation and Hue-GRZ source rocks (37% each), less from the Kingak Shale (26%), and little or no contribution from other source rocks. These results differ from published interpretations that most oil in the Prudhoe Bay Field originated from the Shublik Formation source rock. With few exceptions, the relative contribution of oil from the Shublik Formation decreases, while that from the Hue-GRZ increases in reservoirs along the Barrow arch from Point Barrow in the northwest to Point Thomson in the southeast (~250 miles or 400 km). The Shublik contribution also decreases to a lesser degree between fault blocks within the Ivishak pool from west to east across the Prudhoe Bay Field. ALS-C provides a robust means to calculate the relative amounts of two or more oil types in a mixture. Furthermore, ALS-C does not require that pure end member oils be identified prior to analysis or that laboratory mixtures of these oils be prepared to evaluate mixing. ALS-C of biomarkers reliably de-convolutes mixtures because the concentrations of compounds in mixtures vary as linear functions of the amount of each oil type. ALS of biomarker ratios (ALS-R) cannot be used to de-convolute mixtures because compound ratios vary as nonlinear functions of the amount of each oil type.
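
    A schematic alternating-least-squares unmixing, in the spirit of the ALS-C approach described above but not the authors' implementation, can be written with non-negative least squares. Rows of the input matrix are oil samples and columns are biomarker concentrations; the number of end members, the iteration count, and the normalization convention are assumptions.

        import numpy as np
        from scipy.optimize import nnls

        def als_unmix(C, n_sources=3, n_iter=200, seed=0):
            # Factor C (samples x biomarkers) as F @ S with non-negative
            # per-sample source fractions F and end-member signatures S.
            rng = np.random.default_rng(seed)
            n_samples, n_markers = C.shape
            S = rng.random((n_sources, n_markers))
            F = np.zeros((n_samples, n_sources))
            for _ in range(n_iter):
                F = np.vstack([nnls(S.T, C[i])[0] for i in range(n_samples)])
                F /= np.maximum(F.sum(axis=1, keepdims=True), 1e-12)   # fractions sum to 1
                S = np.vstack([nnls(F, C[:, j])[0] for j in range(n_markers)]).T
            return F, S   # F: relative source contributions, S: end-member biomarker profiles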

  14. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices’ ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will give NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers the ability to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather- and climate-sensitive actions and to deliver information to the general public. Field offices need standardized, scientifically sound methodology for local climate analysis (such as trend, composites, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice at the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. The LCAT focus is on the local scale (as opposed to the national and global scales of CPC products). The LCAT will: -Improve professional competency of local office staff and expertise in providing local information to their users. LCAT will improve the quality of local climate services -Ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor. LCAT will allow improvement of CPC climate products -Allow testing of local climate variables beyond temperature averages and precipitation totals, such as climatology of

  15. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

  16. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    SciTech Connect

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    2016-01-01

    Climate-change-related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools matched to local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: data layer (that contains spatial data, socio-economic and environmental data, and analytic data), middle layer (that handles data processing, model management, and GIS operation), and application layer (that provides climate impacts forecast, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  17. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-and-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  18. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    SciTech Connect

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  19. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, the need for software modifications when configuring it for a particular spacecraft can be reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  20. Analysis and specification tools in relation to the APSE

    NASA Technical Reports Server (NTRS)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few chosen specification and design tools, could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  1. The Watershed Deposition Tool: A Tool for Incorporating Atmospheric Deposition in Watershed Analysis

    EPA Science Inventory

    A tool has been developed to provide the linkage between air and water quality modeling needed for determining the Total Maximum Daily Load (TMDL) and for analyzing related nonpoint-source impacts on watersheds. The Watershed Deposition Tool (WDT) takes gridded output of at...

  2. Input Range Testing for the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is VERY important to note that the tests below must be performed for both the Graphical User Interface and the script!! The examples are illustrated using a scripting perspective, because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.

  3. SOURCE PULSE ENHANCEMENT BY DECONVOLUTION OF AN EMPIRICAL GREEN'S FUNCTION.

    USGS Publications Warehouse

    Mueller, Charles S.

    1985-01-01

    Observations of the earthquake source-time function are enhanced if path, recording-site, and instrument complexities can be removed from seismograms. Assuming that a small earthquake has a simple source, its seismogram can be treated as an empirical Green's function and deconvolved from the seismogram of a larger and/or more complex earthquake by spectral division. When the deconvolution is well posed, the quotient spectrum represents the apparent source-time function of the larger event. This study shows that with high-quality locally recorded earthquake data it is feasible to Fourier transform the quotient and obtain a useful result in the time domain. In practice, the deconvolution can be stabilized by one of several simple techniques. An application of the method is given.
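
    One of the "simple techniques" alluded to above is a water-level regularization of the spectral division, sketched here in NumPy. This is a generic, textbook-style stabilization rather than necessarily the one used in the study; the records are assumed to be equally sampled and aligned, and the water-level fraction is arbitrary.

        import numpy as np

        def water_level_deconvolution(mainshock, egf, water_level=0.01):
            # Apparent source-time function via stabilized spectral division:
            # clamp small empirical-Green's-function spectral amplitudes before dividing.
            n = len(mainshock)
            M = np.fft.rfft(mainshock, n)
            G = np.fft.rfft(egf, n)
            power = np.abs(G) ** 2
            floor = water_level * power.max()
            quotient = M * np.conj(G) / np.maximum(power, floor)
            return np.fft.irfft(quotient, n)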

  4. Deconvolution from wave front sensing using the frozen flow hypothesis.

    PubMed

    Jefferies, Stuart M; Hart, Michael

    2011-01-31

    Deconvolution from wave front sensing (DWFS) is an image-reconstruction technique for compensating the image degradation due to atmospheric turbulence. DWFS requires the simultaneous recording of high cadence short-exposure images and wave-front sensor (WFS) data. A deconvolution algorithm is then used to estimate both the target object and the wave front phases from the images, subject to constraints imposed by the WFS data and a model of the optical system. Here we show that by capturing the inherent temporal correlations present in the consecutive wave fronts, using the frozen flow hypothesis (FFH) during the modeling, high-quality object estimates may be recovered in much worse conditions than when the correlations are ignored.

  5. Deconvolution Kalman filtering for force measurements of revolving wings

    NASA Astrophysics Data System (ADS)

    Vester, R.; Percin, M.; van Oudheusden, B.

    2016-09-01

    The applicability of a deconvolution Kalman filtering approach is assessed for the force measurements on a flat plate undergoing a revolving motion, as an alternative procedure to correct for test setup vibrations. The system identification process required for the correct implementation of the deconvolution Kalman filter is explained in detail. It is found that in the presence of a relatively complex forcing history, the DK filter is better suited to filter out structural test rig vibrations than conventional filtering techniques that are based on, for example, low-pass or moving-average filtering. The improvement is especially found in the characterization of the generated force peaks. Consequently, more reliable force data is obtained, which is vital to validate semi-empirical estimation models, but is also relevant to correlate identified flow phenomena to the force production.

  6. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    PubMed Central

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  7. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    PubMed

    Caruana, Matthew V; Carvalho, Susana; Braun, David R; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W K

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns.

  8. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  9. Bayesian deconvolution of mass and ion mobility spectra: from binary interactions to polydisperse ensembles.

    PubMed

    Marty, Michael T; Baldwin, Andrew J; Marklund, Erik G; Hochberg, Georg K A; Benesch, Justin L P; Robinson, Carol V

    2015-04-21

    Interpretation of mass spectra is challenging because they report a ratio of two physical quantities, mass and charge, which may each have multiple components that overlap in m/z. Previous approaches to disentangling the two have focused on peak assignment or fitting. However, the former struggle with complex spectra, and the latter are generally computationally intensive and may require substantial manual intervention. We propose a new data analysis approach that employs a Bayesian framework to separate the mass and charge dimensions. On the basis of this approach, we developed UniDec (Universal Deconvolution), software that provides a rapid, robust, and flexible deconvolution of mass spectra and ion mobility-mass spectra with minimal user intervention. Incorporation of the charge-state distribution in the Bayesian prior probabilities provides separation of the m/z spectrum into its physical mass and charge components. We have evaluated our approach using systems of increasing complexity, enabling us to deduce lipid binding to membrane proteins, to probe the dynamics of subunit exchange reactions, and to characterize polydispersity in both protein assemblies and lipoprotein Nanodiscs. The general utility of our approach will greatly facilitate analysis of ion mobility and mass spectra.
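
    The following toy sketch is emphatically not the UniDec algorithm; it only illustrates the ambiguity the Bayesian framework resolves: each m/z value maps to one candidate mass per charge state, and a prior over charge states decides how intensity is apportioned on the zero-charge mass axis. The charge range, mass grid, and flat prior are placeholders.

        import numpy as np

        PROTON = 1.007276  # Da

        def naive_charge_deconvolution(mz, intensity, charges=range(5, 31),
                                       mass_grid=None, charge_prior=None):
            mz = np.asarray(mz, dtype=float)
            intensity = np.asarray(intensity, dtype=float)
            charges = np.asarray(list(charges))
            if charge_prior is None:
                charge_prior = np.full(len(charges), 1.0 / len(charges))   # flat prior
            if mass_grid is None:
                mass_grid = np.arange(10_000.0, 200_000.0, 10.0)
            spectrum = np.zeros_like(mass_grid)
            for z, p in zip(charges, charge_prior):
                mass = z * (mz - PROTON)                  # candidate masses for charge z
                idx = np.searchsorted(mass_grid, mass)
                ok = (idx > 0) & (idx < len(mass_grid))
                np.add.at(spectrum, idx[ok], p * intensity[ok])
            return mass_grid, spectrum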

  10. Exothermic Behavior of Thermal Decomposition of Sodium Percarbonate: Kinetic Deconvolution of Successive Endothermic and Exothermic Processes.

    PubMed

    Nakano, Masayoshi; Wada, Takeshi; Koga, Nobuyoshi

    2015-09-24

    This study focused on the kinetic modeling of the thermal decomposition of sodium percarbonate (SPC, sodium carbonate-hydrogen peroxide (2/3)). The reaction is characterized by apparently different kinetic profiles of mass-loss and exothermic behavior as recorded by thermogravimetry and differential scanning calorimetry, respectively. This phenomenon results from a combination of different kinetic features of the reaction involving two overlapping mass-loss steps controlled by the physico-geometry of the reaction and successive endothermic and exothermic processes caused by the detachment and decomposition of H2O2(g). For kinetic modeling, the overall reaction was initially separated into endothermic and exothermic processes using kinetic deconvolution analysis. Then, both the endothermic and exothermic processes were further separated into two reaction steps accounting for the physico-geometrically controlled reaction that occurs in two steps. Kinetic modeling through kinetic deconvolution analysis clearly illustrates that the appearance of the net exothermic effect results from a slight delay of the exothermic process relative to the endothermic process in each physico-geometrically controlled reaction step. This demonstrates that the kinetic modeling attempted in this study is useful for interpreting the exothermic behavior of solid-state reactions such as the oxidative decomposition of solids and the thermal decomposition of oxidizing agents. PMID:26371394

  11. Exothermic Behavior of Thermal Decomposition of Sodium Percarbonate: Kinetic Deconvolution of Successive Endothermic and Exothermic Processes.

    PubMed

    Nakano, Masayoshi; Wada, Takeshi; Koga, Nobuyoshi

    2015-09-24

    This study focused on the kinetic modeling of the thermal decomposition of sodium percarbonate (SPC, sodium carbonate-hydrogen peroxide (2/3)). The reaction is characterized by apparently different kinetic profiles of mass-loss and exothermic behavior as recorded by thermogravimetry and differential scanning calorimetry, respectively. This phenomenon results from a combination of different kinetic features of the reaction involving two overlapping mass-loss steps controlled by the physico-geometry of the reaction and successive endothermic and exothermic processes caused by the detachment and decomposition of H2O2(g). For kinetic modeling, the overall reaction was initially separated into endothermic and exothermic processes using kinetic deconvolution analysis. Then, both the endothermic and exothermic processes were further separated into two reaction steps accounting for the physico-geometrically controlled reaction that occurs in two steps. Kinetic modeling through kinetic deconvolution analysis clearly illustrates that the appearance of the net exothermic effect results from a slight delay of the exothermic process relative to the endothermic process in each physico-geometrically controlled reaction step. This demonstrates that the kinetic modeling attempted in this study is useful for interpreting the exothermic behavior of solid-state reactions such as the oxidative decomposition of solids and the thermal decomposition of oxidizing agents.

  12. Seismic interferometry by multidimensional deconvolution without wavefield separation

    NASA Astrophysics Data System (ADS)

    Ravasi, Matteo; Meles, Giovanni; Curtis, Andrew; Rawlinson, Zara; Yikuo, Liu

    2015-07-01

    Seismic interferometry comprises a suite of methods to redatum recorded wavefields to those that would have been recorded if different sources (so-called virtual sources) had been activated. Seismic interferometry by cross-correlation has been formulated using either two-way (for full wavefields) or one-way (for directionally decomposed wavefields) representation theorems. To obtain improved Green's function estimates, the cross-correlation result can be deconvolved by a quantity that identifies the smearing of the virtual source in space and time, the so-called point-spread function. This type of interferometry, known as interferometry by multidimensional deconvolution (MDD), has so far been applied only to one-way directionally decomposed fields, requiring accurate wavefield decomposition from dual (e.g. pressure and velocity) recordings. Here we propose a form of interferometry by multidimensional deconvolution that uses full wavefields with two-way representations, and simultaneously invert for pressure and (normal) velocity Green's functions, rather than only velocity responses as for its one-way counterpart. Tests on synthetic data show that two-way MDD improves on results of interferometry by cross-correlation, and generally produces estimates of similar quality to those obtained by one-way MDD, suggesting that the preliminary decomposition into up- and downgoing components of the pressure field is not required if pressure and velocity data are jointly used in the deconvolution. We also show that constraints on the directionality of the Green's functions sought can be added directly into the MDD inversion process to further improve two-way multidimensional deconvolution. Finally, as a by-product of having pressure and particle velocity measurements, we adapt one- and two-way representation theorems to convert any particle velocity receiver into its corresponding virtual dipole/gradient source by means of MDD. Thus data recorded from standard monopolar (e
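
    Schematically, and without claiming to reproduce the authors' implementation, the MDD step can be written per frequency as a damped inversion of the point-spread function: G = C P^H (P P^H + eps I)^-1, where C is the cross-correlation function and P the point-spread function. The array layout and damping choice below are assumptions.

        import numpy as np

        def mdd(correlation, point_spread, eps=1e-3):
            # correlation: (n_freq, n_receivers, n_virtual_sources) complex spectra
            # point_spread: (n_freq, n_virtual_sources, n_virtual_sources) complex spectra
            green = np.empty_like(correlation)
            for k in range(correlation.shape[0]):
                C, P = correlation[k], point_spread[k]
                PH = P.conj().T
                damping = eps * np.trace(P @ PH).real / P.shape[0] * np.eye(P.shape[0])
                green[k] = C @ PH @ np.linalg.inv(P @ PH + damping)
            return green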

  13. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels, or minimal steps, to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K

  14. Lagrangian analysis. Modern tool of the dynamics of solids

    NASA Astrophysics Data System (ADS)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting, are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered : the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors insist that one or the other groups of data (stress and particle velocity) are sufficient to integrate the conservation equations in the case of the plane motion when both groups of data are necessary in the case of the spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The

  15. Deconvolution of interferometric data using interior point iterative algorithms

    NASA Astrophysics Data System (ADS)

    Theys, C.; Lantéri, H.; Aime, C.

    2016-09-01

    We address the problem of deconvolution of astronomical images that could be obtained with future large interferometers in space. The presentation is made in two complementary parts. The first part gives an introduction to image deconvolution with linear and nonlinear algorithms. The emphasis is placed on nonlinear iterative algorithms that enforce the constraints of non-negativity and constant flux. The Richardson-Lucy algorithm appears there as a special case for photon-counting conditions. More generally, the algorithm published recently by Lanteri et al. (2015) is based on scale-invariant divergences without any assumption about the statistical model of the data. The two proposed algorithms are interior-point algorithms, the latter being more efficient in terms of speed of calculation. These algorithms are applied to the deconvolution of simulated images corresponding to an interferometric system of 16 diluted telescopes in space. Two non-redundant configurations, one arranged around a circle and the other on a hexagonal lattice, are compared for their effectiveness on a simple astronomical object. The comparison is made in the direct and Fourier spaces. Raw "dirty" images have many artifacts due to replicas of the original object. Linear methods cannot remove these replicas, while iterative methods clearly show their efficacy in these examples.
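
    The Richardson-Lucy special case mentioned above fits in a few lines. This sketch assumes a known synthetic PSF and image, and keeps non-negativity through the multiplicative updates; the interior-point algorithms of the paper and the constant-flux constraint are not reproduced here.

      # Minimal Richardson-Lucy deconvolution: non-negativity is preserved by the
      # multiplicative updates.  PSF and image are synthetic; the interior-point
      # variants discussed above are not reproduced here.
      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(image, psf, n_iter=100, eps=1e-12):
          psf_mirror = psf[::-1, ::-1]
          estimate = np.full_like(image, image.mean())
          for _ in range(n_iter):
              blurred = fftconvolve(estimate, psf, mode="same") + eps
              estimate = estimate * fftconvolve(image / blurred, psf_mirror, mode="same")
          return estimate

      # toy example: a point source blurred by a Gaussian PSF
      y, x = np.mgrid[0:64, 0:64]
      psf = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 3.0 ** 2))
      psf /= psf.sum()
      obj = np.zeros((64, 64)); obj[20, 40] = 1.0
      data = np.clip(fftconvolve(obj, psf, mode="same"), 0, None)
      restored = richardson_lucy(data, psf)
      print(np.unravel_index(restored.argmax(), restored.shape))   # close to (20, 40)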

  16. Source pulse enhancement by deconvolution of an empirical Green's function

    NASA Astrophysics Data System (ADS)

    Mueller, Charles S.

    1985-01-01

    Observations of the earthquake source-time function are enhanced if path, recording-site, and instrument complexities can be removed from seismograms. Assuming that a small earthquake has a simple source, its seismogram can be treated as an empirical Green's function and deconvolved from the seismogram of a larger and/or more complex earthquake by spectral division. When the deconvolution is well posed, the quotient spectrum represents the apparent source-time function of the larger event. This study shows that with high-quality locally recorded earthquake data it is feasible to Fourier transform the quotient and obtain a useful result in the time domain. In practice, the deconvolution can be stabilized by one of several simple techniques. In this paper, the method is implemented and tested on high-quality digital recordings of aftershocks of the Jan. 9, 1982 Miramichi (New Brunswick) earthquake. In particular, seismograms from a Jan. 17 aftershock (017 13:33 GMT, local mag.=3.5) exhibit path or site effects which complicate the determination of source parameters. After deconvolution, the apparent far-field source of this event is a simple pulse in displacement with duration ≈ 0.07 second for both P and S.
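
    A common way to stabilize the spectral division described above is the water-level technique. The sketch below uses synthetic traces and an arbitrary water-level value; it is a generic illustration of the idea, not the stabilization or data of the Miramichi study.

      # Generic water-level stabilization of the spectral division: deconvolve an
      # empirical Green's function (small-event record) from a larger event's record.
      # Traces and parameters are synthetic, not the Miramichi data.
      import numpy as np

      def water_level_deconvolution(mainshock, egf, water_level=0.01):
          n = len(mainshock)
          M, E = np.fft.rfft(mainshock), np.fft.rfft(egf)
          power = np.abs(E) ** 2
          power = np.maximum(power, water_level * power.max())   # fill spectral holes
          return np.fft.irfft(M * np.conj(E) / power, n=n)       # apparent source pulse

      dt = 0.005
      t = np.arange(0.0, 2.0, dt)
      egf = np.exp(-((t - 0.5) / 0.02) ** 2) * np.cos(2 * np.pi * 10 * (t - 0.5))
      stf = np.where(t < 0.07, 1.0, 0.0)                         # 0.07 s boxcar source
      mainshock = dt * np.convolve(egf, stf)[: t.size]
      astf = water_level_deconvolution(mainshock, egf)
      print("recovered duration ~", dt * np.sum(astf > 0.5 * astf.max()), "s")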

  17. Tissue-specific sparse deconvolution for brain CT perfusion.

    PubMed

    Fang, Ruogu; Jiang, Haodi; Huang, Junzhou

    2015-12-01

    Enhancing perfusion maps in low-dose computed tomography perfusion (CTP) for cerebrovascular disease diagnosis is a challenging task, especially for low-contrast tissue categories where infarct core and ischemic penumbra usually occur. Sparse perfusion deconvolution has recently been proposed to effectively improve the image quality and diagnostic accuracy of low-dose perfusion CT by extracting complementary information from high-dose perfusion maps to restore the low-dose maps using a joint spatio-temporal model. However, the low-contrast tissue classes where infarct core and ischemic penumbra are likely to occur in cerebral perfusion CT tend to be over-smoothed, leading to loss of essential biomarkers. In this paper, we propose a tissue-specific sparse deconvolution approach to preserve the subtle perfusion information in the low-contrast tissue classes. We first build tissue-specific dictionaries from segmentations of high-dose perfusion maps using online dictionary learning, and then perform deconvolution-based hemodynamic parameter estimation for block-wise tissue segments on the low-dose CTP data. Extensive validation on clinical datasets of patients with cerebrovascular disease demonstrates the superior performance of our proposed method compared to the state of the art, with the potential to improve diagnostic accuracy by increasing the differentiation between normal and ischemic tissue in the brain. PMID:26055434
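
    For context, the baseline that sparse perfusion deconvolution improves on is the standard truncated-SVD deconvolution of the tissue curve by the arterial input function. The sketch below shows only that standard baseline on synthetic curves; it is not the tissue-specific sparse method of the paper, and all numbers are invented.

      # Standard truncated-SVD deconvolution baseline for CT perfusion (not the
      # tissue-specific sparse method of the paper): recover the flow-scaled residue
      # function k(t) = CBF * R(t) from a tissue curve and an arterial input function.
      import numpy as np

      def tsvd_deconvolution(aif, tissue, dt, rel_threshold=0.2):
          n = aif.size
          # lower-triangular (Toeplitz) convolution matrix built from the AIF
          A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                             for i in range(n)])
          U, s, Vt = np.linalg.svd(A)
          s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
          k = Vt.T @ (s_inv * (U.T @ tissue))
          return k, k.max()                     # residue function and CBF estimate

      dt = 1.0
      t = np.arange(0.0, 60.0, dt)
      aif = (t / 8.0) ** 3 * np.exp(-t / 4.0)           # gamma-variate arterial input
      cbf_true, mtt = 0.01, 4.0
      tissue = dt * np.convolve(aif, cbf_true * np.exp(-t / mtt))[: t.size]
      k, cbf = tsvd_deconvolution(aif, tissue, dt)
      print("true CBF:", cbf_true, "estimated:", round(float(cbf), 4))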

  19. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    PubMed

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behavior of 9528 online gamblers who voluntarily used an RG tool was analyzed. The number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that, overall, the functions of the tool had a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a higher extent and were found to have a greater risk of excessive gambling than the other classes. PMID:26753878
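
    As a rough illustration of the clustering step, the sketch below groups users on the three usage indicators with a Gaussian mixture on log-transformed counts and selects the number of classes by BIC. A Gaussian mixture is only a stand-in for a proper latent class model, and the data and variable names are invented, not the study's dataset.

      # Rough stand-in for the latent class step: cluster users on usage indicators
      # (site visits, self-tests, advice used) with a Gaussian mixture on log counts
      # and pick the number of classes by BIC.  Data and variable names are invented,
      # and a Gaussian mixture is only a proxy for a proper latent class model.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      usage = np.column_stack([
          rng.poisson(2.0, 9528),     # visits to the site
          rng.poisson(1.0, 9528),     # self-tests made
          rng.poisson(0.5, 9528),     # pieces of advice used
      ]).astype(float)
      X = np.log1p(usage)

      fits = {k: GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 8)}
      best_k = min(fits, key=lambda k: fits[k].bic(X))
      labels = fits[best_k].predict(X)
      print("classes chosen by BIC:", best_k)
      print("class sizes:", np.bincount(labels))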

  1. Generalized Analysis Tools for Multi-Spacecraft Missions

    NASA Astrophysics Data System (ADS)

    Chanteur, G. M.

    2011-12-01

    Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the least-squares approach has the advantage of applying to any number of spacecraft [1], but it is not convenient for analytical computations, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but it appears limited to clusters of four spacecraft. Moreover, the barycentric approach makes it possible to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. Weights given to spacecraft make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. The estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI
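
    The four-spacecraft barycentric estimator that this work generalizes can be illustrated directly: the reciprocal vectors of the tetrahedron give a linear estimate of a field gradient. Positions and field values below are made up, and the weighted generalization described in the abstract is not reproduced.

      # Reciprocal vectors of a four-spacecraft tetrahedron and the resulting linear
      # estimate of a field gradient.  Positions and field values are made up; the
      # weighted generalization described above is not reproduced here.
      import numpy as np

      def reciprocal_vectors(r):
          # r: (4, 3) spacecraft positions; returns k: (4, 3) reciprocal vectors
          k = np.zeros((4, 3))
          for a in range(4):
              b, c, d = (a + 1) % 4, (a + 2) % 4, (a + 3) % 4
              cross = np.cross(r[c] - r[b], r[d] - r[b])
              k[a] = cross / np.dot(r[a] - r[b], cross)
          return k

      r = np.array([[0.0, 0.0, 0.0], [100.0, 10.0, 0.0],
                    [20.0, 120.0, 5.0], [30.0, 15.0, 90.0]])   # positions (km)
      g_true = np.array([0.02, -0.01, 0.005])                  # imposed linear gradient
      f = r @ g_true + 3.0                                     # scalar field at each s/c

      k = reciprocal_vectors(r)
      g_est = (k * f[:, None]).sum(axis=0)                     # grad f ~ sum_i k_i f_i
      print("true:", g_true, "estimated:", np.round(g_est, 6))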

  2. Micropollutants in urban watersheds : substance flow analysis as management tool

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted by traffic, like copper or PAHs, reach surface water during rain events, often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to develop flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we propose to test the application of SFA for a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment

  3. Advanced tools for astronomical time series and image analysis

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.

    The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.

  4. Spectral Analysis Tool 6.2 for Windows

    NASA Technical Reports Server (NTRS)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32- bit computers employing Microsoft Windows operating systems.

  5. jSIPRO - analysis tool for magnetic resonance spectroscopic imaging.

    PubMed

    Jiru, Filip; Skoch, Antonin; Wagnerova, Dita; Dezortova, Monika; Hajek, Milan

    2013-10-01

    Magnetic resonance spectroscopic imaging (MRSI) involves a huge number of spectra to be processed and analyzed. Several tools enabling MRSI data processing have been developed and are widely used. However, the processing programs primarily focus on sophisticated spectrum processing and offer limited support for the analysis of the calculated spectroscopic maps. In this paper the jSIPRO (java Spectroscopic Imaging PROcessing) program is presented, which is a java-based graphical interface enabling post-processing, viewing, analysis and result reporting of MRSI data. Interactive graphical processing as well as protocol-controlled batch processing are available in jSIPRO. jSIPRO does not contain a built-in fitting program. Instead, it makes use of fitting programs from third parties and manages the data flows. Currently, automatic spectrum processing using the LCModel, TARQUIN and jMRUI programs is supported. Concentration and error values, fitted spectra, metabolite images and various parametric maps can be viewed for each calculated dataset. Metabolite images can be exported in the DICOM format, either for archiving purposes or for use in neurosurgery navigation systems. PMID:23870172

  6. New tools for the analysis and design of building envelopes

    SciTech Connect

    Papamichael, K.; Winkelmann, F.C.; Buhl, W.F.; Chauvet, H.

    1994-08-01

    We describe the integrated development of PowerDOE, a new version of the DOE-2 building energy analysis program, and the Building Design Advisor (BDA), a multimedia-based design tool that assists building designers with the concurrent consideration of multiple design solutions with respect to multiple design criteria. PowerDOE has a windows-based Graphical User Interface (GUI) that makes it easier to use than DOE-2, while retaining DOE-2's calculation power and accuracy. BDA, with a similar GUI, is designed to link to multiple analytical models and databases. In its first release it is linked to PowerDOE and a Daylighting Analysis Module, as well as to a Case Studies Database and a Schematic Graphic Editor. These allow building designers to set performance goals and address key building envelope parameters from the initial, schematic phases of building design to the detailed specification of building components and systems required by PowerDOE. The consideration of the thermal performance of building envelopes through PowerDOE and BDA is integrated with non-thermal envelope performance aspects, such as daylighting, as well as with the performance of non-envelope building components and systems, such as electric lighting and HVAC. Future versions of BDA will support links to CAD and electronic product catalogs, as well as provide context-dependent design advice to improve performance.

  7. DSA hole defectivity analysis using advanced optical inspection tool

    NASA Astrophysics Data System (ADS)

    Harukawa, Ryota; Aoki, Masami; Cross, Andrew; Nagaswami, Venkat; Tomita, Tadatoshi; Nagahara, Seiji; Muramatsu, Makoto; Kawakami, Shinichiro; Kosugi, Hitoshi; Rathsack, Benjamen; Kitano, Takahiro; Sweis, Jason; Mokhberi, Ali

    2013-04-01

    This paper discusses a defect density detection and analysis methodology that uses advanced optical wafer inspection capability to enable accelerated development of a DSA process and process tools, together with the inspection capability required to monitor such a process. The defectivity inspection methodologies are optimized for grapho-epitaxy directed self-assembly (DSA) contact holes with 25 nm sizes. A defect test reticle with programmed defects on guide patterns is designed for improved optimization of defectivity monitoring. Using this reticle, resist guide holes with a variety of sizes and shapes are patterned using an ArF immersion scanner. The negative tone development (NTD) type thermally stable resist guide is used for DSA of a polystyrene-b-poly(methyl methacrylate) (PS-b-PMMA) block copolymer (BCP). Using a variety of defects intentionally made by changing guide pattern sizes, the detection rate of each specific defect type has been analyzed. It is found in this work that, to maximize sensitivity, a two-pass scan with bright field (BF) and dark field (DF) modes provides the best overall defect type coverage and sensitivity. The performance of the two-pass scan with BF and DF modes is also revealed by defect analysis of baseline defectivity on a wafer processed with nominal process conditions.

  8. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  9. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and to process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated either dynamically or at steady state. Using Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
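
    The Equivalent System Mass metric that such trade studies are ranked by is commonly written as ESM = M + V·Veq + P·Peq + C·Ceq + CT·D·CTeq (mass, volume, power, cooling, crew time). The sketch below only illustrates that bookkeeping; ALSSAT itself is an Excel/VBA tool, and every subsystem number and equivalency factor shown is a placeholder assumption.

      # Sketch of the Equivalent System Mass bookkeeping that such sizing trade
      # studies rest on, using the commonly cited form
      #   ESM = M + V*Veq + P*Peq + C*Ceq + CT*D*CTeq
      # (mass, volume, power, cooling, crew time).  ALSSAT itself is an Excel/VBA
      # tool; all subsystem numbers and equivalency factors below are placeholders.
      from dataclasses import dataclass

      @dataclass
      class Subsystem:
          name: str
          mass_kg: float
          volume_m3: float
          power_kw: float
          cooling_kw: float
          crew_hr_per_week: float

      def esm(s, veq=66.7, peq=237.0, ceq=60.0, cteq=0.6, weeks=78):
          # equivalency factors (kg/m3, kg/kW, kg/kW, kg/crew-hr) are placeholders
          return (s.mass_kg + s.volume_m3 * veq + s.power_kw * peq
                  + s.cooling_kw * ceq + s.crew_hr_per_week * weeks * cteq)

      subsystems = [Subsystem("CO2 removal", 150, 0.8, 0.9, 0.9, 0.5),
                    Subsystem("Water recovery", 500, 2.5, 1.2, 1.2, 1.5)]
      for s in subsystems:
          print(f"{s.name}: {esm(s):.0f} kg-equivalent")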

  10. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students about their academic achievement with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students were evaluated independently of the former group. These students were those who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal addresses. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated to their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  11. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great-circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, or picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution pdf files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.
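
    The basic processing chain listed above (remove mean and trend, taper, filter) can be reproduced generically with scipy on a synthetic trace; the sketch below is an illustration of those steps, not SeismicCanvas code, and the corner frequencies and taper fraction are arbitrary.

      # Generic version of the pre-processing chain described above (demean/detrend,
      # taper, band-pass filter), applied to a synthetic trace with scipy rather
      # than SeismicCanvas itself.
      import numpy as np
      from scipy.signal import detrend, butter, sosfiltfilt, windows

      fs = 100.0                                   # sampling rate (Hz)
      t = np.arange(0.0, 60.0, 1.0 / fs)
      trace = (0.002 * t + np.sin(2 * np.pi * 1.5 * t)
               + 0.3 * np.random.default_rng(0).normal(size=t.size))

      x = detrend(trace, type="linear")            # remove mean and linear trend
      x = x * windows.tukey(x.size, alpha=0.05)    # 5% cosine taper at both ends
      sos = butter(4, [0.5, 5.0], btype="bandpass", fs=fs, output="sos")
      x = sosfiltfilt(sos, x)                      # zero-phase band-pass, 0.5-5 Hz
      print("peak amplitude after filtering:", float(np.abs(x).max()))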

  12. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    NASA Astrophysics Data System (ADS)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
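
    One of the simplest measures such a distortion-analysis tool reports is total harmonic distortion. The sketch below drives a made-up memoryless soft clipper with a sine and reads the harmonic levels off the FFT; it is a generic illustration of the measurement, not the Matlab toolkit itself.

      # Total harmonic distortion of a memoryless soft clipper, measured by driving
      # it with a sine and reading harmonic levels off the FFT.  This is a generic
      # illustration, not the Matlab toolkit described above.
      import numpy as np

      fs, f0, n = 48000, 1000, 48000               # 1 s of a 1 kHz sine at 48 kHz
      t = np.arange(n) / fs
      x = 0.8 * np.sin(2 * np.pi * f0 * t)
      y = np.tanh(3.0 * x)                         # simple "tube-like" nonlinearity

      spectrum = np.abs(np.fft.rfft(y * np.hanning(n)))
      level = lambda freq: spectrum[int(round(freq * n / fs))]
      harmonics = [level(k * f0) for k in range(2, 10)]
      thd = np.sqrt(sum(h ** 2 for h in harmonics)) / level(f0)
      print(f"THD = {100 * thd:.2f} %")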

  13. Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools

    SciTech Connect

    Barrand, Guy

    2002-08-20

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualization), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces based on modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  14. Deconvolution of Soft Ionization Mass Spectra of Chlorinated Paraffins To Resolve Congener Groups.

    PubMed

    Yuan, Bo; Alsberg, Tomas; Bogdal, Christian; MacLeod, Matthew; Berger, Urs; Gao, Wei; Wang, Yawei; de Wit, Cynthia A

    2016-09-20

    We describe and illustrate a three-step data-processing approach that enables individual congener groups of chlorinated paraffins (CPs) to be resolved in mass spectra obtained from either of two soft ionization methods: electron capture negative ionization mass spectrometry (ECNI-MS) or atmospheric pressure chemical ionization mass spectrometry (APCI-MS). In the first step, general fragmentation pathways of CPs are deduced from analysis of mass spectra of individual CP congeners. In the second step, all possible fragment ions in the general fragmentation pathways of CPs with 10 to 20 carbon atoms are enumerated and compared to mass spectra of CP mixture standards, and a deconvolution algorithm is applied to identify fragment ions that are actually observed. In the third step, isotope permutations of the observed fragment ions are calculated and used to identify isobaric overlaps, so that mass intensities of individual CP congener groups can be deconvolved from the unresolved isobaric ion signal intensities in mass spectra. For a specific instrument, the three steps only need to be done once to enable deconvolution of CPs in unknown samples. This approach enables congener group-level resolution of CP mixtures in environmental samples, and it opens up the possibility for quantification of congener groups. PMID:27531279
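
    The final deconvolution step described above, apportioning unresolved isobaric signal among congener groups from their calculated isotope patterns, amounts to a non-negative least-squares problem. The sketch below shows that generic formulation only; the pattern matrix and intensities are invented toy numbers, not real chlorinated-paraffin data or the authors' algorithm.

      # Toy version of the final deconvolution step: apportion intensities measured
      # at overlapping m/z channels among congener groups using their calculated
      # isotope patterns, via non-negative least squares.  The pattern matrix and
      # intensities are invented, not real chlorinated-paraffin data.
      import numpy as np
      from scipy.optimize import nnls

      # rows = m/z channels, columns = congener groups; entries = calculated
      # relative isotope abundances of each group's fragment ions in each channel
      patterns = np.array([[0.60, 0.00, 0.10],
                           [0.30, 0.55, 0.05],
                           [0.10, 0.30, 0.45],
                           [0.00, 0.15, 0.30],
                           [0.00, 0.00, 0.10]])
      true_amounts = np.array([100.0, 40.0, 25.0])
      measured = patterns @ true_amounts + np.random.default_rng(0).normal(0, 1.0, 5)

      amounts, _ = nnls(patterns, measured)
      print("deconvolved congener-group intensities:", amounts.round(1))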

  15. Remote heartbeat signal detection from visible spectrum recordings based on blind deconvolution

    NASA Astrophysics Data System (ADS)

    Kaur, Balvinder; Moses, Sophia; Luthra, Megha; Ikonomidou, Vasiliki N.

    2016-05-01

    While recent advances have shown that it is possible to acquire a signal equivalent to the heartbeat from visible-spectrum video recordings of human skin, extracting the heartbeat's exact timing information from it, for the purpose of heart rate variability analysis, remains a challenge. In this paper, we explore two novel methods to estimate the remote cardiac signal peak positions, aiming at a close representation of the R-peaks of the ECG signal. The first method is based on curve fitting (CF) using a modified filtered least mean square (LMS) optimization, and the second method is based on system estimation using blind deconvolution (BDC). To prove the efficacy of the developed algorithms, we compared the results obtained with the ground truth (ECG) signal. Both methods achieved a low relative error between the peaks of the two signals. This work, performed under an IRB-approved protocol, provides initial proof that blind deconvolution techniques can be used to estimate timing information of the cardiac signal closely correlated to that obtained by traditional ECG. The results show promise for further development of remote sensing of cardiac signals for the purpose of remote vital sign and stress detection for medical, security, military and civilian applications.

  16. Thorium concentrations in the lunar surface. V - Deconvolution of the central highlands region

    NASA Technical Reports Server (NTRS)

    Metzger, A. E.; Etchegaray-Ramirez, M. I.; Haines, E. L.

    1982-01-01

    The distribution of thorium in the lunar central highlands measured from orbit by the Apollo 16 gamma-ray spectrometer is subjected to a deconvolution analysis to yield improved spatial resolution and contrast. Use of two overlapping data fields for complete coverage also provides a demonstration of the technique's ability to model concentrations several degrees beyond the data track. Deconvolution reveals an association between Th concentration and the Kant Plateau, Descartes Mountain and Cayley plains surface formations. The Kant Plateau and Descartes Mountains model with Th less than 1 part per million, which is typical of farside highlands but is infrequently seen over any other nearside highland portions of the Apollo 15 and 16 ground tracks. It is noted that, if the Cayley plains are the result of basin-forming impact ejecta, the distribution of Th concentration with longitude supports an origin from the Imbrium basin rather than the Nectaris or Orientale basins. Nectaris basin materials are found to have a Th concentration similar to that of the Descartes Mountains, evidence that the latter may have been emplaced as Nectaris basin impact deposits.

  17. A comparison of different peak shapes for deconvolution of alpha-particle spectra

    NASA Astrophysics Data System (ADS)

    Marzo, Giuseppe A.

    2016-10-01

    Alpha-particle spectrometry is a standard technique for assessing the sample content in terms of alpha-decaying isotopes. A comparison of spectral deconvolutions performed adopting different peak shape functions has been carried out and a sensitivity analysis has been performed to test for the robustness of the results. As previously observed, there is evidence that the alpha peaks are well reproduced by a Gaussian modified by a function which takes into account the prominent tailing that an alpha-particle spectrum measured by means of a silicon detector exhibits. Among the different peak shape functions considered, that proposed by G. Bortels and P. Collaers, Int. J. Rad. Appl. Instrum. A 38, pp. 831-837 (1987) is the function which provides more accurate and more robust results when the spectral resolution is high enough to make such tailing significant. Otherwise, in the case of lower resolution alpha-particle spectra, simpler peak shape functions which are characterized by a lower number of fitting parameters provide adequate results. The proposed comparison can be useful for selecting the most appropriate peak shape function when accurate spectral deconvolution of alpha-particle spectra is sought.
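
    To illustrate how a tailing term enters such a fit, the sketch below fits a peak with a Gaussian convolved analytically with a one-sided (left) exponential. This simplified single-tail shape is only a stand-in for illustration, not the exact Bortels-Collaers function compared in the paper, and the synthetic spectrum is invented.

      # Fitting an alpha peak with a low-energy tail: a Gaussian convolved
      # analytically with a one-sided (left) exponential.  This is a simplified
      # stand-in for illustration, not the exact Bortels-Collaers peak shape.
      import numpy as np
      from scipy.special import erfc
      from scipy.optimize import curve_fit

      def tailed_gaussian(E, area, mu, sigma, tau):
          # Gaussian (mu, sigma) convolved with (1/tau) exp(x/tau) for x < 0
          z = (E - mu) / sigma
          return (area / (2.0 * tau)
                  * np.exp((E - mu) / tau + sigma ** 2 / (2.0 * tau ** 2))
                  * erfc((z + sigma / tau) / np.sqrt(2.0)))

      E = np.linspace(5100.0, 5300.0, 400)                 # energy (keV)
      counts = np.random.default_rng(0).poisson(tailed_gaussian(E, 1e4, 5245, 6.0, 12.0))

      p0 = [counts.sum() * (E[1] - E[0]), 5240.0, 5.0, 10.0]
      popt, _ = curve_fit(tailed_gaussian, E, counts, p0=p0)
      print("fitted centroid, sigma, tail constant:", np.round(popt[1:], 2))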

  18. Blind deconvolution of 3D fluorescence microscopy using depth-variant asymmetric PSF.

    PubMed

    Kim, Boyoung; Naemura, Takeshi

    2016-06-01

    3D wide-field fluorescence microscopy suffers from depth-variant asymmetric blur. The depth variance and axial asymmetry are due to refractive index mismatch between the immersion medium and the specimen layer. The radial asymmetry is due to lens imperfections and local refractive index inhomogeneities in the specimen. To obtain a PSF that has these characteristics, PSF premeasurement has been attempted. However, premeasured PSFs are useless, since imaging conditions such as the camera position and the refractive index of the specimen change between the premeasurement and the actual imaging. In this article, we focus on removing unknown depth-variant asymmetric blur in such an optical system under the assumption of refractive index homogeneity in the specimen. We propose estimating a few parameters of a mathematical PSF model, in which the PSF has a depth-variant asymmetric shape, from the observed images. After generating an initial PSF from an analysis of intensities in the observed image, the parameters are estimated based on a maximum likelihood estimator. Using the estimated PSF, we implement an accelerated GEM algorithm for image deconvolution. The deconvolution results show the superiority of our algorithm in terms of accuracy, quantitatively evaluated by FWHM, relative contrast, and the standard deviation of intensity peaks. Microsc. Res. Tech. 79:480-494, 2016. © 2016 Wiley Periodicals, Inc. PMID:27062314

  19. Benchmark of the non-parametric Bayesian deconvolution method implemented in the SINBAD code for X/γ rays spectra processing

    NASA Astrophysics Data System (ADS)

    Rohée, E.; Coulon, R.; Carrel, F.; Dautremer, T.; Barat, E.; Montagu, T.; Normand, S.; Jammes, C.

    2016-11-01

    Radionuclide identification and quantification are a serious concern for many applications, such as in situ monitoring at nuclear facilities, laboratory analysis, special nuclear materials detection, environmental monitoring, and waste measurements. High resolution gamma-ray spectrometry based on high purity germanium diode detectors is the best solution available for isotopic identification. Over the last decades, methods have been developed to improve gamma spectra analysis. However, some difficulties remain in the analysis when full energy peaks are folded together with a high ratio between their amplitudes, and when the Compton background is much larger than the signal of a single peak. In this context, this study deals with the comparison between a conventional analysis based on an "iterative peak fitting deconvolution" method and a "nonparametric Bayesian deconvolution" approach developed by the CEA LIST and implemented in the SINBAD code. The iterative peak fit deconvolution is used in this study as a reference method, largely validated by industrial standards, to unfold complex spectra from HPGe detectors. Complex spectra are studied from IAEA benchmark protocol tests and from measured spectra. The SINBAD code shows promising deconvolution capabilities compared to the conventional method, without any expert parameter fine tuning.

  20. Tools4miRs – one place to gather all the tools for miRNA analysis

    PubMed Central

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-01-01

    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which differ greatly in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform that currently gathers more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  1. Analysis of the influence of tool dynamics in diamond turning

    SciTech Connect

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  2. Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools

    NASA Astrophysics Data System (ADS)

    Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

    2010-05-01

    Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are really scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct actions in case of an emergency. In the framework of the Marie Curie Research Project, a Community Based Early Warning System (CBEWS) has been developed in the Mountain Community Valtellina of Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods, in particular a large event in 1987 which affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is composed of a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for risk scenarios previously defined. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency

  3. Analysis of Facial Injuries Caused by Power Tools.

    PubMed

    Kim, Jiye; Choi, Jin-Hee; Hyun Kim, Oh; Won Kim, Sug

    2016-06-01

    The number of injuries caused by power tools is steadily increasing as more domestic woodwork is undertaken and more power tools are used recreationally. The injuries caused by the different power tools as a consequence of accidents are an issue, because they can lead to substantial costs for patients and the national insurance system. The increase in hand surgery as a consequence of the use of power tools and its economic impact, and the characteristics of the hand injuries caused by power saws have been described. In recent years, the authors have noticed that, in addition to hand injuries, facial injuries caused by power tools commonly present to the emergency room. This study aimed to review the data in relation to facial injuries caused by power saws that were gathered from patients who visited the trauma center at our hospital over the last 4 years, and to analyze the incidence and epidemiology of the facial injuries caused by power saws. The authors found that facial injuries caused by power tools have risen continually. Facial injuries caused by power tools are accidental, and they cause permanent facial disfigurements and functional disabilities. Accidents are almost inevitable in particular workplaces; however, most facial injuries could be avoided by providing sufficient operator training and by tool operators wearing suitable protective devices. The evaluation of the epidemiology and patterns of facial injuries caused by power tools in this study should provide the information required to reduce the number of accidental injuries.

  4. Bioelectrical impedance analysis: A new tool for assessing fish condition

    USGS Publications Warehouse

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (range of dry fat levels of 29%, n = 60, yielding an R2 of 0.8). Reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.

  5. Thermal Management Tools for Propulsion System Trade Studies and Analysis

    NASA Technical Reports Server (NTRS)

    McCarthy, Kevin; Hodge, Ernie

    2011-01-01

    Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

  6. HPLC analysis as a tool for assessing targeted liposome composition.

    PubMed

    Oswald, Mira; Platscher, Michael; Geissler, Simon; Goepferich, Achim

    2016-01-30

    Functionalized phospholipids are indispensable materials for the design of targeted liposomes. Control over the quality and quantity of phospholipids is thereby key to the successful development and manufacture of such formulations. This was also the case for a complex liposomal preparation composed of 1,2-dioleoyl-sn-glycero-3-phosphocholine (DOPC), cholesterol (CHO), and 1,2-distearoyl-sn-glycero-3-phosphoethanolamine-N-[amino(polyethylene glycol)-2000] (DSPE-PEG2000). To this end, an RP-HPLC method was developed. Detection of the liposomal components was done via evaporative light scattering (ELS). The method was validated for linearity, precision, accuracy, sensitivity and robustness. The liposomal compounds had a non-linear, quadratic response in the concentration range of 0.012-0.42 mg/ml with a correlation coefficient greater than 0.99, and the accuracy of the method was confirmed to be 95-105% of the theoretical concentration. Furthermore, degradation products from the liposomal formulation could be identified. The presented method was successfully implemented as a control tool during the preparation of functionalized liposomes. It underlined the benefit of HPLC analysis of phospholipids during liposome preparation as an easy and rapid control method for the functionalized lipid at each preparation step as well as for the quantification of all components.
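
    A quadratic detector response of the kind described translates into a simple calibration workflow: fit area against concentration on the standards and invert the fit for unknowns. The sketch below illustrates only that generic workflow; the concentrations, areas and acceptance range are invented, not the validated method's data.

      # Quadratic ELS-detector calibration sketch: fit peak area against standard
      # concentration and invert the fit for an unknown sample.  All numbers are
      # invented, not the validated method's data.
      import numpy as np

      conc = np.array([0.012, 0.05, 0.10, 0.20, 0.30, 0.42])   # standards (mg/ml)
      area = np.array([0.8, 4.1, 9.5, 22.0, 37.5, 55.0])       # detector response

      coeffs = np.polyfit(conc, area, 2)                       # area = a*c^2 + b*c + d
      fit = np.poly1d(coeffs)
      r2 = 1 - np.sum((area - fit(conc)) ** 2) / np.sum((area - area.mean()) ** 2)

      def concentration(measured_area, a=coeffs[0], b=coeffs[1], d=coeffs[2]):
          roots = np.roots([a, b, d - measured_area])
          real = roots[np.isreal(roots)].real
          return real[(real >= 0.0) & (real <= 0.5)][0]        # root in calibrated range

      print("R^2 =", round(r2, 4))
      print("area 15.0 ->", round(concentration(15.0), 4), "mg/ml")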

  7. CCAT: Combinatorial Code Analysis Tool for transcriptional regulation

    PubMed Central

    Jiang, Peng; Singh, Mona

    2014-01-01

    Combinatorial interplay among transcription factors (TFs) is an important mechanism by which transcriptional regulatory specificity is achieved. However, despite the increasing number of TFs for which either binding specificities or genome-wide occupancy data are known, knowledge about cooperativity between TFs remains limited. To address this, we developed a computational framework for predicting genome-wide co-binding between TFs (CCAT, Combinatorial Code Analysis Tool), and applied it to Drosophila melanogaster to uncover cooperativity among TFs during embryo development. Using publicly available TF binding specificity data and DNaseI chromatin accessibility data, we first predicted genome-wide binding sites for 324 TFs across five stages of D. melanogaster embryo development. We then applied CCAT in each of these developmental stages, and identified from 19 to 58 pairs of TFs in each stage whose predicted binding sites are significantly co-localized. We found that nearby binding sites for pairs of TFs predicted to cooperate were enriched in regions bound in relevant ChIP experiments, and were more evolutionarily conserved than other pairs. Further, we found that TFs tend to be co-localized with other TFs in a dynamic manner across developmental stages. All generated data as well as source code for our front-to-end pipeline are available at http://cat.princeton.edu. PMID:24366875

  8. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
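
    As an illustration of the conditional probability analysis (CPA) described in this record, the short Python sketch below estimates P(response exceeds a threshold | stressor >= x) over a range of stressor cutoffs. It is not the CProb add-in itself; the variable names, thresholds and synthetic data are assumptions made only for the example.

      # Illustrative sketch of conditional probability analysis (CPA); not the CProb
      # add-in. The response threshold and the synthetic data are hypothetical.
      import numpy as np

      def conditional_probability(stressor, response, response_threshold):
          """P(response > threshold | stressor >= x), evaluated at each observed x."""
          stressor = np.asarray(stressor, dtype=float)
          response = np.asarray(response, dtype=float)
          cutoffs = np.sort(np.unique(stressor))
          probs = [np.mean(response[stressor >= x] > response_threshold) for x in cutoffs]
          return cutoffs, np.array(probs)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          stressor = rng.uniform(0, 10, 200)            # e.g. a sediment contaminant level
          response = stressor + rng.normal(0, 2, 200)   # e.g. a benthic degradation index
          x, p = conditional_probability(stressor, response, response_threshold=8.0)
          for xi, pi in zip(x[::40], p[::40]):
              print(f"P(degraded | stressor >= {xi:.2f}) = {pi:.2f}")

    Plotting the resulting probabilities against the stressor cutoffs, together with scatterplots and empirical cumulative distribution functions, reproduces the kind of output the record attributes to CProb.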

  9. A measuring tool for tree-rings analysis

    NASA Astrophysics Data System (ADS)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this system is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gain, the possibility of archiving the results of the measurements at any stage of processing, and operator comfort. New software has been developed that allows the processing of samples of different types (cores, saw cuts), including those that are difficult to process because of a complex wood structure (inhomogeneous growth in different directions; missing, light and false rings, etc.). This software can analyze pictures made with optical scanners or analog or digital cameras. The software was written in C++ and is compatible with modern Windows operating systems. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of year is displayed on screen during the analysis and can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The system is universal in application, which will allow its use for a variety of problems in biology and ecology. With its help, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed from samples collected on the Kola Peninsula (northwestern Russia).
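
    The measurement step such a system performs can be sketched in a few lines: ring boundaries are located along a traced path as pronounced brightness minima (dark latewood bands), and ring widths are the distances between consecutive boundaries. The sketch below uses a synthetic brightness profile and an assumed scanner resolution; it is only an illustration of the idea, not the software described in the record.

      # Illustrative ring-width measurement along a 1-D brightness profile.
      # The profile, resolution and peak-detection settings are invented.
      import numpy as np
      from scipy.signal import find_peaks

      px_per_mm = 100.0                                     # scanner resolution (assumed)
      x = np.arange(3000)
      boundaries_true = np.cumsum(np.random.default_rng(4).integers(60, 140, 25))
      profile = np.ones_like(x, dtype=float)
      for b in boundaries_true:
          profile -= 0.8 * np.exp(-0.5 * ((x - b) / 3.0) ** 2)   # dark latewood bands

      dips, _ = find_peaks(-profile, prominence=0.4)        # brightness minima = boundaries
      ring_widths_mm = np.diff(dips) / px_per_mm
      print("detected rings:", len(ring_widths_mm))
      print("first five widths (mm):", np.round(ring_widths_mm[:5], 2))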

  10. Comparison of active-set method deconvolution and matched-filtering for derivation of an ultrasound transit time spectrum

    NASA Astrophysics Data System (ADS)

    Wille, M.-L.; Zapf, M.; Ruiter, N. V.; Gemmeke, H.; Langton, C. M.

    2015-06-01

    The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has a better accuracy (0.13 μs versus 0.18 μs standard deviations), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. A higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity.
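
    The contrast between the two techniques can be sketched with synthetic data: a matched filter cross-correlates the received signal with the known chirp, while a non-negative least-squares solver (whose Lawson-Hanson implementation is an active-set method) recovers a sparse transit-time spectrum. The sampling rate, chirp band and path delays below are invented, and this is not the processing chain of the cited study.

      # Matched filtering vs. active-set (non-negative least-squares) deconvolution
      # for two overlapping arrivals of a coded chirp. Illustrative parameters only.
      import numpy as np
      from scipy.linalg import toeplitz
      from scipy.optimize import nnls            # Lawson-Hanson active-set algorithm
      from scipy.signal import chirp, find_peaks

      fs = 1.0e6                                  # sample rate, Hz (assumed)
      t = np.arange(0, 2.0e-4, 1.0 / fs)
      pulse = chirp(t, f0=1.0e5, f1=4.0e5, t1=t[-1])   # coded excitation

      n = 600
      transit = np.zeros(n)
      transit[100] = 1.0                          # direct path
      transit[140] = 0.6                          # reflected path, overlapping the first
      received = np.convolve(transit, pulse)[:n]

      # Matched filter: cross-correlate the received signal with the known pulse.
      matched = np.correlate(received, pulse, mode="full")[len(pulse) - 1:]

      # Active-set deconvolution: min ||C x - received||_2 subject to x >= 0,
      # where C is the (Toeplitz) convolution matrix of the pulse.
      first_col = np.zeros(n); first_col[:len(pulse)] = pulse
      first_row = np.zeros(n); first_row[0] = pulse[0]
      C = toeplitz(first_col, first_row)
      tts, _ = nnls(C, received)

      def two_strongest(y):
          peaks, _ = find_peaks(y)
          return np.sort(peaks[np.argsort(y[peaks])[-2:]])

      print("matched-filter arrival estimates (samples):", two_strongest(matched))
      print("active-set deconvolution arrival estimates (samples):", two_strongest(tts))

    In this noise-free toy case both approaches locate the two arrivals; the deconvolved spectrum is sparse, which corresponds to the improved side-lobe suppression reported in the record.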

  11. Comparison of active-set method deconvolution and matched-filtering for derivation of an ultrasound transit time spectrum.

    PubMed

    Wille, M-L; Zapf, M; Ruiter, N V; Gemmeke, H; Langton, C M

    2015-06-21

    The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has a better accuracy (0.13 μs versus 0.18 μs standard deviations), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. A higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity.

  12. Determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1991-01-01

    The final report on the determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution is presented. Papers and theses prepared during the reporting period are included. The central result is a methodology for determining design and operation parameters that minimize error when deconvolution is included in the data analysis. An error surface is plotted versus the signal-to-noise ratio (SNR) and all parameters of interest. Instrumental characteristics determine a curve in this space. The SNR and parameter values at which the projection from the curve onto the surface gives the smallest error are the optimum values. These values are constrained by the curve and so will not necessarily correspond to an absolute minimum of the error surface.

  13. Analysis of light scattering from a cutting tool edge.

    PubMed

    Wang, H; Malacara, D

    1994-07-01

    The scattering of light from cutting tools is studied. The contribution of cutting tool edge parameters (height and width) to scattering patterns and the influence of side surface roughness on scattering patterns are investigated. An angle-limited integrated scattering method is developed and analyzed for fast determination of edge parameters.

  14. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, D.; Kueppers, U.; Castro, J. M.

    2009-12-01

    -size distribution, componentry, and grain morphology in situ in a 2D plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). After that, selected areas were investigated through thin-section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  15. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.

    2010-05-01

    -size distribution, componentry, and grain morphology in situ in a 2D plane. In a first step, the sample surface was scanned and analysed by means of image analysis software (ImageJ). After that, selected areas were investigated through thin-section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  16. An automated technique for detailed μ-FTIR mapping of diamond and spectral deconvolution

    NASA Astrophysics Data System (ADS)

    Howell, Dan; Griffin, Bill; O'Neill, Craig; O'Reilly, Suzanne; Pearson, Norman; Handley, Heather

    2010-05-01

    Since the original classification of diamonds based upon their absorption in the one-phonon region of the mid-infrared (IR) range was first introduced, a vast amount of research has been carried out in this field. The result today is that IR analysis has become the principal tool for classifying diamonds based upon the concentration and aggregation state of nitrogen, the most common impurity found within their crystal lattice. These studies have shown that diamonds can contain a large range of nitrogen, from nominally nitrogen free, i.e. below detection limits (termed Type II), to nitrogen rich (termed Type I) with up to 5000 ppm. It has also been shown that the nitrogen concentration, aggregation and distribution in an individual stone can be either homogeneous or heterogeneous. Nitrogen has been shown to reside within diamond in three different aggregation states. The first is in the form of single substitutional nitrogen atoms, known as C centres. Diamonds that contain nitrogen only in this form are termed Type Ib. The second aggregation state is pairs of nitrogen atoms forming A centres (termed Type IaA diamonds), and the final state is four nitrogen atoms tetrahedrally arranged around a vacancy, forming a B centre (termed Type IaB). The sequence of aggregation has been shown to progress from C centres to A centres to B centres and is a function of time and temperature. As such it is a commonly used tool in the geological study of diamonds to gauge their mantle residence time / temperature history. The first step in the sequence is thought to occur relatively quickly in geological terms; the vast age of most diamonds therefore makes Type Ib samples rare in cratonic diamond deposits. The second step takes considerably more time, meaning that the A to B centre conversion may not always continue through to completion. So diamonds containing a mixture of both A and B centres are commonly termed Type IaAB. IR analysis of diamond also has the capability of identifying

  17. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    NASA Astrophysics Data System (ADS)

    Adams, David; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Farrell, Steven; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-12-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  18. Optimal application of Morrison's iterative noise removal for deconvolution. Appendices

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1987-01-01

    Morrison's iterative method of noise removal, or Morrison's smoothing, is applied in a simulation to noise-added data sets of various noise levels to determine its optimum use. Morrison's smoothing is applied for noise removal alone, and for noise removal prior to deconvolution. For the latter, an accurate method is analyzed to provide confidence in the optimization. The method consists of convolving the data with an inverse filter calculated by taking the inverse discrete Fourier transform of the reciprocal of the transform of the response of the system. Filters of various lengths are calculated for the narrow and wide Gaussian response functions used. Deconvolution of non-noisy data is performed, and the error in each deconvolution is calculated. Plots of error versus filter length are produced, and from these plots the most accurate filter lengths are determined. The statistical methodologies employed in the optimizations of Morrison's method are similar. A typical peak-type input is selected and convolved with the two response functions to produce the data sets to be analyzed. Both constant and ordinate-dependent Gaussian-distributed noise are added to the data, where the noise levels of the data are characterized by their signal-to-noise ratios. The error measures employed in the optimizations are the L1 and L2 norms. Results of the optimizations for both Gaussians, both noise types, and both norms include figures of optimum iteration number and error improvement versus signal-to-noise ratio, and tables of results. The statistical variation of all quantities considered is also given.
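
    The inverse-filter step described in this record can be illustrated with a short sketch: the filter is the inverse DFT of the reciprocal of the DFT of the system response, here a narrow Gaussian blurring a peak-type input. A small stabilising constant is added to the reciprocal so the sketch stays well behaved; the report instead controls the error through the filter length, and all numerical values below are invented.

      # Inverse-filter deconvolution via the reciprocal of the response transform.
      # Illustrative only; eps is a hypothetical stabiliser, not part of the report.
      import numpy as np

      n = 256
      x = np.arange(n)
      response = np.exp(-0.5 * ((x - n // 2) / 4.0) ** 2)
      response /= response.sum()                       # unit-area Gaussian response

      signal = np.zeros(n)
      signal[100], signal[130] = 1.0, 0.5              # peak-type input
      H = np.fft.fft(np.fft.ifftshift(response))       # centred response -> zero phase
      data = np.real(np.fft.ifft(np.fft.fft(signal) * H))

      eps = 1e-3                                       # hypothetical stabiliser
      inverse_filter = np.real(np.fft.ifft(np.conj(H) / (np.abs(H) ** 2 + eps)))
      restored = np.real(np.fft.ifft(np.fft.fft(data) * np.fft.fft(inverse_filter)))

      print("blurred values at the true peaks: ", data[100].round(3), data[130].round(3))
      print("restored values at the true peaks:", restored[100].round(3), restored[130].round(3))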

  19. Deconvolution/identification techniques for 1-D transient signals

    SciTech Connect

    Goodman, D.M.

    1990-10-01

    This paper discusses a variety of nonparametric deconvolution and identification techniques that we have developed for application to 1-D transient signal problems. These methods are time-domain techniques that use direct methods for matrix inversion. Therefore, they are not appropriate for "large data" problems. These techniques involve various regularization methods and permit the use of certain kinds of a priori information in estimating the unknown. These techniques have been implemented in a package using standard FORTRAN that should make the package readily transportable to most computers. This paper is also meant to be an instruction manual for the package. 25 refs., 17 figs., 1 tab.

  20. Note: On the deconvolution of Kelvin probe force microscopy data.

    PubMed

    Blümel, A; Plank, H; Klug, A; Fisslthaler, E; Sezen, M; Grogger, W; List, E J W

    2010-05-01

    In Kelvin probe force microscopy (KPFM) proper interpretation of the data is often difficult because the measured surface potential is affected by the interaction of the cantilever with the sample. In this work, the tip's interaction with a modeled surface potential distribution was simulated, leading to a calculated KPFM image. Although simplified, the calculation is capable of showing the influence of the cantilever in the correct qualitative manner, proven by a comparison with experimental data. Additionally, a deconvolution was performed on the simulated image, showing that for simple geometries revealing the "real" surface potential data is possible in principle. PMID:20515184

  1. Multi-frame blind deconvolution using sparse priors

    NASA Astrophysics Data System (ADS)

    Dong, Wende; Feng, Huajun; Xu, Zhihai; Li, Qi

    2012-05-01

    In this paper, we propose a method for multi-frame blind deconvolution. Two sparse priors, i.e., the natural image gradient prior and an l1-norm based prior, are used to regularize the latent image and the point spread functions (PSFs), respectively. An alternating minimization approach is adopted to solve the resulting optimization problem. We use both gray-scale blurred frames from a data set and colored ones captured by a digital camera to verify the robustness of our approach. Experimental results show that the proposed method can accurately reconstruct PSFs with complex structures and that the restored images are of high quality.

  2. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    SciTech Connect

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  3. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    NASA Technical Reports Server (NTRS)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  4. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically

  5. MTpy - Python Tools for Magnetotelluric Data Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

    2014-05-01

    We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, the data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software, adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to provide a simplification of the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or Obspy. It contains sub-packages and modules for the various tasks within the standard work-flow of MT data processing and interpretation. In order to allow the inclusion of already existing and well established software, MTpy not only provides pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework, which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.

  6. Online Analysis of Wind and Solar Part I: Ramping Tool

    SciTech Connect

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  7. Online Analysis of Wind and Solar Part II: Transmission Tool

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  8. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    ERIC Educational Resources Information Center

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  9. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  10. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  11. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    SciTech Connect

    Jodoin, Vincent J; Lee, Ronald W; Peplow, Douglas E.; Lefebvre, Jordan P

    2011-01-01

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  12. Computational Tools for Parsimony Phylogenetic Analysis of Omics Data.

    PubMed

    Salazar, Jose; Amri, Hakima; Noursi, David; Abu-Asab, Mones

    2015-08-01

    High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on the creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential modeling is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through the standard programs MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables by assigning each data point a binary value ("0" for normal states and "1" for abnormal states) and then outputs the converted data tables into the proper input file formats for MIX or with embedded commands for TNT. SynpExtractor uses outfiles from MIX and TNT to extract the shared aberrations at each node of the cladogram, matching them with identifying labels from the dataset and exporting them into a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing
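
    The binarization step described in this record can be sketched in a few lines: quantitative values are coded "0" (normal) or "1" (aberrant) relative to a control group before parsimony analysis. The two-standard-deviation rule used below is a hypothetical choice for illustration, not necessarily the rule implemented in OmicsTract.

      # Toy polarity assignment: 1 where a value lies outside the control range.
      # The threshold rule and the synthetic data are assumptions of this sketch.
      import numpy as np

      def binarize(values, controls, n_sd=2.0):
          """Return 1 where a value falls outside mean +/- n_sd*SD of the controls."""
          mu, sd = np.mean(controls, axis=0), np.std(controls, axis=0)
          return ((values < mu - n_sd * sd) | (values > mu + n_sd * sd)).astype(int)

      rng = np.random.default_rng(1)
      controls = rng.normal(10.0, 1.0, size=(20, 5))   # 20 healthy samples, 5 features
      patients = rng.normal(10.0, 1.0, size=(3, 5))
      patients[:, 2] += 6.0                            # one aberrant feature

      matrix = binarize(patients, controls)
      for i, row in enumerate(matrix):
          print(f"patient_{i} " + "".join(map(str, row)))

    The resulting 0/1 character matrix is the kind of input that parsimony programs such as MIX or TNT expect, with each row written in the program's own header format.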

  13. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate

  14. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest

  15. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over having one integrated tool. This tool-suite, their characteristics, and their applicability will be described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  16. Molecular dynamics in cytochrome c oxidase Moessbauer spectra deconvolution

    SciTech Connect

    Bossis, Fabrizio; Palese, Luigi L.

    2011-01-07

    Research highlights: (1) Cytochrome c oxidase molecular dynamics serve to predict Moessbauer lineshape widths. (2) Half-height widths are used in modeling of Lorentzian doublets. (3) Such spectral deconvolutions are useful in detecting the enzyme intermediates. -- Abstract: In this work, low temperature molecular dynamics simulations of cytochrome c oxidase are used to predict an experimentally observable quantity, namely the Moessbauer spectral width. Predicted lineshapes are used to model Lorentzian doublets, with which published cytochrome c oxidase Moessbauer spectra were simulated. The constraints that molecular dynamics imposes on spectral lineshapes permit useful information to be obtained, such as the presence of multiple chemical species in the binuclear center of cytochrome c oxidase. Moreover, a benchmark of quality for molecular dynamics simulations can be obtained. Despite the overwhelming importance of dynamics in electron-proton transfer systems, limited work has been devoted to unraveling how realistic molecular dynamics simulation results are. In this work, molecular dynamics based predictions are found to be in good agreement with published experimental spectra, showing that we can confidently rely on actual simulations. Molecular dynamics based deconvolution of Moessbauer spectra will lead to renewed interest in the application of this approach in bioenergetics.
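
    The modelling ingredient described here, a Lorentzian doublet whose half-height width is fixed to a value predicted from molecular dynamics, can be sketched with a standard least-squares fit. All numerical values below (velocity range, isomer shift, splitting, width) are invented for illustration and are not those of the cited work.

      # Fit of a Lorentzian doublet with the linewidth fixed to an "MD-predicted" value.
      # Synthetic spectrum; illustrative parameters only.
      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzian_doublet(v, area, centre, splitting, width):
          """Two equal-area Lorentzian lines separated by the quadrupole splitting."""
          def line(c):
              return (area / np.pi) * (width / 2) / ((v - c) ** 2 + (width / 2) ** 2)
          return line(centre - splitting / 2) + line(centre + splitting / 2)

      v = np.linspace(-4, 4, 400)                    # velocity, mm/s
      rng = np.random.default_rng(2)
      truth = lorentzian_doublet(v, area=1.0, centre=0.4, splitting=1.2, width=0.3)
      spectrum = truth + rng.normal(0, 0.01, v.size)

      width_md = 0.3                                 # linewidth fixed from an MD prediction
      popt, _ = curve_fit(lambda v, a, c, s: lorentzian_doublet(v, a, c, s, width_md),
                          v, spectrum, p0=[0.8, 0.0, 1.0])
      print("fitted area, centre shift, quadrupole splitting:", popt)

    Fixing the width removes one free parameter, which is what makes the presence or absence of additional doublets (i.e., multiple chemical species) easier to detect in the residuals.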

  17. Retinal image restoration by means of blind deconvolution

    NASA Astrophysics Data System (ADS)

    Marrugo, Andrés G.; Šorel, Michal; Šroubek, Filip; Millán, María S.

    2011-11-01

    Retinal imaging plays a key role in the diagnosis and management of ophthalmologic disorders, such as diabetic retinopathy, glaucoma, and age-related macular degeneration. Because of the acquisition process, retinal images often suffer from blurring and uneven illumination. This problem may seriously affect disease diagnosis and progression assessment. Here we present a method for color retinal image restoration by means of multichannel blind deconvolution. The method is applied to a pair of retinal images acquired within a lapse of time, ranging from several minutes to months. It consists of a series of preprocessing steps to adjust the images so they comply with the considered degradation model, followed by the estimation of the point-spread function and, ultimately, image deconvolution. The preprocessing is mainly composed of image registration, uneven illumination compensation, and segmentation of areas with structural changes. In addition, we have developed a procedure for the detection and visualization of structural changes. This enables the identification of subtle developments in the retina not caused by variation in illumination or blur. The method was tested on synthetic and real images. Encouraging experimental results show that the method is capable of significant restoration of degraded retinal images.

  18. Lidar backscatter signal recovery from phototransistor systematic effect by deconvolution.

    PubMed

    Refaat, Tamer F; Ismail, Syed; Abedin, M Nurul; Spuler, Scott M; Mayor, Shane D; Singh, Upendra N

    2008-10-10

    Backscatter lidar detection systems have been designed and integrated at NASA Langley Research Center using IR heterojunction phototransistors. The design focused on maximizing the system signal-to-noise ratio rather than noise minimization. The detection systems have been validated using the Raman-shifted eye-safe aerosol lidar (REAL) at the National Center for Atmospheric Research. Incorporating such devices introduces some systematic effects in the form of blurring to the backscattered signals. Characterization of the detection system transfer function aided in recovering such effects by deconvolution. The transfer function was obtained by measuring and fitting the system impulse response using single-pole approximation. An iterative deconvolution algorithm was implemented in order to recover the system resolution, while maintaining high signal-to-noise ratio. Results indicated a full recovery of the lidar signal, with resolution matching avalanche photodiodes. Application of such a technique to atmospheric boundary and cloud layers data restores the range resolution, up to 60 m, and overcomes the blurring effects.
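
    An iterative deconvolution with a single-pole (exponential) impulse response, in the spirit of the approach described here, can be sketched as a Van Cittert-type iteration. The time constant, relaxation factor and iteration count below are illustrative assumptions, not the published characterization of the phototransistor systems; with noisy data the iteration count itself acts as regularisation.

      # Van Cittert-type iterative deconvolution of a lidar return blurred by a
      # single-pole detector response. All parameters are illustrative.
      import numpy as np

      dt, tau = 1.0e-8, 2.0e-7                  # 10 ns sampling, 200 ns time constant
      t = np.arange(0, 5e-6, dt)
      irf = np.exp(-t / tau)
      irf /= irf.sum()                          # unit-area single-pole impulse response

      truth = np.exp(-t / 2e-6) * (1 + 0.8 * np.exp(-0.5 * ((t - 1.5e-6) / 5e-8) ** 2))
      blurred = np.convolve(truth, irf)[:t.size]

      estimate = blurred.copy()
      alpha = 0.5                               # relaxation factor
      for _ in range(200):                      # fixed iteration count for the sketch
          reblurred = np.convolve(estimate, irf)[:t.size]
          estimate += alpha * (blurred - reblurred)

      print("rms error before deconvolution:", np.sqrt(np.mean((blurred - truth) ** 2)))
      print("rms error after deconvolution: ", np.sqrt(np.mean((estimate - truth) ** 2)))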

  19. Deconvolution and simulation of thermoluminescence glow curves with Mathcad.

    PubMed

    Kiisk, V

    2013-09-01

    The paper reports two quite general and user-friendly calculation codes (called TLD-MC and TLS-MC) for deconvolution and simulation, respectively, of thermoluminescence (TL) glow curves, which have been implemented using the well-known engineering computing software PTC Mathcad. An advantage of this commercial software is the flexibility and productivity in setting up tailored computations due to a natural math notation, an interactive calculation environment and the availability of advanced numerical methods. TLD-MC includes the majority of popular models used for TL glow-curve deconvolution (the user can easily implement additional models if necessary). The least-squares (Levenberg-Marquardt) optimisation of various analytical and even some non-analytical models is reasonably fast and the obtained figure-of-merit values are generally excellent. TLS-MC implements numerical solution of the original set of differential equations describing charge carrier dynamics involving arbitrary number of interactive electron and hole traps. The programs are freely available from the website http://www.physic.ut.ee/~kiisk/mcadapps.htm.

  20. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.

  1. Further improvements in deconvolution of pass-through paleomagnetic measurement data: Accuracy of positioning and sensor response

    NASA Astrophysics Data System (ADS)

    Oda, H.; Xuan, C.; Yamamoto, Y.

    2015-12-01

    The pass-through superconducting rock magnetometer (SRM) is one of the most important tools for modern paleomagnetism research. It offers rapid and continuous measurements of the weak remanent magnetization preserved in various geological archives. However, pass-through SRM measurements are inevitably smoothed and even distorted due to the convolution effect of the SRM sensor response, and deconvolution is necessary to restore the high-resolution signal from pass-through measurements. Reliable deconvolution relies on an accurate estimate of the SRM sensor response and on our understanding of errors associated with sample measurements. In this presentation, we introduce a new practical tool and procedure to facilitate rapid and accurate measurement of the SRM sensor response, with a demonstration using an SRM at the Kochi Core Center, Japan. We also report accurate measurements of SRM tray positions obtained using laser interferometry. The measured positions with an actual u-channel sample on the tray show vibrations with peak amplitudes of ~50 μm following each stop of the tray. The vibrations diminish towards background level in ~0.4 s. Comparison with the positions expected from the stepping-motor counts shows random discrepancies with standard deviations of 0.1-0.2 mm. Measurements with the u-channel show higher standard deviations, including significant stepwise changes of up to ~0.5 mm.

  2. Maximum a posteriori blind image deconvolution with Huber-Markov random-field regularization.

    PubMed

    Xu, Zhimin; Lam, Edmund Y

    2009-05-01

    We propose a maximum a posteriori blind deconvolution approach using a Huber-Markov random-field model. Compared with the conventional maximum-likelihood method, our algorithm not only suppresses noise effectively but also significantly alleviates the artifacts produced by the deconvolution process. The performance of this method is demonstrated by computer simulations.
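
    The Huber-Markov ingredient of such a MAP estimator can be illustrated in isolation: the Huber function penalises differences between neighbouring pixels quadratically when they are small (noise) and linearly when they are large (edges), which is what preserves edges while suppressing ringing. The sketch below implements only this regulariser and its gradient, not the full blind deconvolution of the cited paper; the threshold value is arbitrary.

      # Huber-Markov image prior over first differences, with its gradient.
      # Only the regularisation term; delta and the test image are illustrative.
      import numpy as np

      def huber(r, delta):
          quad = 0.5 * r ** 2
          lin = delta * (np.abs(r) - 0.5 * delta)
          return np.where(np.abs(r) <= delta, quad, lin)

      def huber_grad(r, delta):
          return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

      def prior_energy_and_grad(img, delta=0.05):
          """Huber penalty summed over horizontal and vertical pixel differences."""
          dx = np.diff(img, axis=1)
          dy = np.diff(img, axis=0)
          energy = huber(dx, delta).sum() + huber(dy, delta).sum()
          grad = np.zeros_like(img)
          gx, gy = huber_grad(dx, delta), huber_grad(dy, delta)
          grad[:, :-1] -= gx; grad[:, 1:] += gx
          grad[:-1, :] -= gy; grad[1:, :] += gy
          return energy, grad

      img = np.zeros((32, 32)); img[:, 16:] = 1.0       # a step edge
      e_clean, _ = prior_energy_and_grad(img)
      noisy = img + np.random.default_rng(3).normal(0, 0.05, img.shape)
      e_noisy, _ = prior_energy_and_grad(noisy)
      print("prior energy, clean edge vs noisy edge:", e_clean, e_noisy)

    In a MAP image update, this gradient would be added to the data-fidelity gradient; the edge contributes only a linear (bounded) penalty, while the noise is penalised quadratically.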

  3. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  4. 99Tcm-MAG3 renogram deconvolution in normal subjects and in normal functioning kidney grafts.

    PubMed

    González, A; Puchal, R; Bajén, M T; Mairal, L; Prat, L; Martin-Comin, J

    1994-09-01

    This study provides values of transit times obtained by 99Tcm-mercaptoacetyl triglycine (99Tcm-MAG3) renogram deconvolution for both normal subjects and kidney graft recipients. The analysis included 50 healthy kidney units from 25 volunteers and 28 normal functioning kidney grafts. The parameters calculated for the whole kidney (WK) and for the renal parenchyma (P) were: mean transit time (MTT) and times at 20% (T20) and 80% (T80) of the initial height of the renal retention function. For healthy kidneys the WK MTT was 174 +/- 27 s and the P MTT 148 +/- 22 s. The WK T20 values were 230 +/- 33 s and P T20 231 +/- 34 s. The WK T80 was 108 +/- 19 s and P T80 106 +/- 12 s. Whole-kidney and parenchymal values of transit times for normal functioning kidney grafts do not present significant differences with respect to healthy kidneys. PMID:7816379

  5. Method for direct deconvolution of heat signals in transient adsorption calorimetry

    NASA Astrophysics Data System (ADS)

    Wolcott, Christopher A.; Campbell, Charles T.

    2015-03-01

    A method of heat signal analysis is presented for transient adsorption calorimetries including single crystal adsorption calorimetry (SCAC) which uses fast Fourier transforms (FFT) to determine the instrument response function and deconvolute the heat-versus-time signals. The method utilizes a heat signal generated by a laser pulse of known power-versus-time to extract the instrument response function for the calorimeter. The instrument response function is then used to extract the heat power signal from a molecular beam heat pulse of unknown intensity. This method allows for the extraction of the total heat deposited by the molecular beam pulse without any kinetic modeling even in the event of complex reaction dynamics. This method is compared to previous methods used to analyze SCAC data using example data from the two-step dissociative adsorption of methyl iodide on Pt(111). It is found to be equally accurate for extracting total heats and simpler to perform than the previous methods.
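
    The two FFT steps described in this record can be sketched with synthetic signals: (1) estimate the calorimeter response from a calibration pulse of known power by spectral division, and (2) use that response to deconvolve a measured signal of unknown heat input, whose integral gives the total heat. The toy detector model, sampling interval and stabilising constant below are assumptions of this sketch, not the SCAC instrument.

      # FFT-based response estimation and deconvolution for a toy calorimeter.
      # The single-exponential "detector" and all constants are hypothetical.
      import numpy as np

      n, dt = 1024, 1e-3                      # 1 ms sampling (assumed)
      t = np.arange(n) * dt

      def detector(power, tau=0.05):
          """Toy detector standing in for the real instrument response."""
          irf = np.exp(-t / tau); irf /= irf.sum()
          return np.convolve(power, irf)[:n]

      laser_power = np.where((t > 0.1) & (t < 0.15), 1.0, 0.0)   # known calibration pulse
      laser_signal = detector(laser_power)

      # Step 1: instrument response in the frequency domain, with a small stabiliser.
      eps = 1e-6
      H = np.fft.rfft(laser_signal) / (np.fft.rfft(laser_power) + eps)

      # Step 2: deconvolve the signal from an unknown heat pulse.
      unknown_power = 0.4 * np.where((t > 0.3) & (t < 0.32), 1.0, 0.0)
      measured = detector(unknown_power)
      recovered = np.fft.irfft(np.fft.rfft(measured) / (H + eps), n=n)

      print("true total heat:     ", unknown_power.sum() * dt)
      print("recovered total heat:", recovered.sum() * dt)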

  6. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
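
    The post-processing step described here can be sketched simply: spring forces and deformations are normalised by a tributary area and bondline thickness to give stresses and strains, which are then sorted in descending order. The normalisation quantities, input layout and numbers below are assumptions of this sketch, not the tools' actual file formats.

      # Toy conversion of spring results to adhesive stresses/strains, sorted.
      # Tributary area, bondline thickness and the sample values are hypothetical.
      import numpy as np

      def adhesive_stresses(spring_forces, spring_deformations, tributary_area, bondline_t):
          """Return (grid index, stress, strain) sorted by descending stress magnitude."""
          forces = np.asarray(spring_forces, dtype=float)
          defs = np.asarray(spring_deformations, dtype=float)
          stress = forces / tributary_area
          strain = defs / bondline_t
          order = np.argsort(-np.abs(stress))
          return order, stress[order], strain[order]

      order, stress, strain = adhesive_stresses(
          spring_forces=[120.0, -340.0, 55.0, 210.0],             # N
          spring_deformations=[1.2e-5, -3.4e-5, 5.5e-6, 2.1e-5],  # m
          tributary_area=2.5e-5,                                   # m^2 per spring
          bondline_t=2.0e-4)                                       # m
      for g, s, e in zip(order, stress, strain):
          print(f"grid {g}: peel stress {s / 1e6:7.2f} MPa, peel strain {e:.4f}")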

  7. Lagrangian analysis. Modern tool of the dynamics of solids

    NASA Astrophysics Data System (ADS)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors stress that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of spherical motion remains particularly interesting for the physicist because it gives access to the behavior of the material under deformation processes other than those imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress as functions of time in a material compressed by a plane or spherical dilatational wave. The

  8. TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.

    PubMed

    Fimereli, Danai; Detours, Vincent; Konopka, Tomasz

    2013-04-01

    High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/.
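
    The core idea of such read triage, keeping only reads that share sequence with a region of interest so downstream tools touch a fraction of the data, can be sketched with a simple k-mer filter. The k-mer size, matching rule and toy sequences below are illustrative and are not the defaults or implementation of TriageTools.

      # Toy k-mer based read triage against a region of interest.
      # The k value, hit threshold and sequences are hypothetical.
      def kmers(seq, k):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def triage(reads, region_seq, k=16, min_hits=1):
          index = kmers(region_seq, k)
          for read in reads:
              if sum(kmer in index for kmer in kmers(read, k)) >= min_hits:
                  yield read

      region = "ACGT" * 20                     # stand-in for a gene of interest
      reads = ["ACGT" * 8, "TTTTGGGGCCCCAAAA" * 2, "GGGG" * 8]
      print(list(triage(reads, region)))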

  9. Blind image deconvolution using machine learning for three-dimensional microscopy.

    PubMed

    Kenig, Tal; Kam, Zvi; Feuer, Arie

    2010-12-01

    In this work, we propose a novel method for the regularization of blind deconvolution algorithms. The proposed method employs example-based machine learning techniques for modeling the space of point spread functions. During an iterative blind deconvolution process, a prior term attracts the point spread function estimates to the learned point spread function space. We demonstrate the use of this regularizer within a Bayesian blind deconvolution framework and also integrate into the latter a method for noise reduction, thus creating a complete blind deconvolution method. The application of the proposed algorithm is demonstrated on synthetic and real-world three-dimensional images acquired by a wide-field fluorescence microscope, where blind deconvolution algorithms are indispensable, yielding excellent results. PMID:20975117

  10. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    SciTech Connect

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.
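
    A toy illustration of the kind of automated option search such a tool performs, with a placeholder cost function standing in for an EnergyPlus run; the design parameters and values are invented, and this is not the tool described above.

```python
# Toy illustration of automated option search (not the tool described above):
# exhaustively evaluate combinations of discrete design options with a placeholder
# cost model standing in for a full building-energy simulation.
from itertools import product

options = {                       # hypothetical design parameters
    "window_u_value": [1.2, 1.8, 2.4],
    "wall_insulation": ["R13", "R19", "R30"],
    "lighting_power_density": [6.0, 8.0, 10.0],
}

def simulate(design):
    """Placeholder for a building-energy simulation; returns a made-up annual energy use."""
    score = design["window_u_value"] * 40
    score += {"R13": 120, "R19": 95, "R30": 80}[design["wall_insulation"]]
    score += design["lighting_power_density"] * 9
    return score

# Exhaustive search over all option combinations; a real tool would use a smarter
# optimization engine and dispatch simulations through a run manager.
best = min(
    (dict(zip(options, combo)) for combo in product(*options.values())),
    key=simulate,
)
print("best design:", best, "energy:", simulate(best))
```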

  11. pathFinder: a static network analysis tool for pharmacological analysis of signal transduction pathways.

    PubMed

    Samal, Babru B; Eiden, Lee E

    2008-01-01

    The study of signal transduction is becoming a de facto part of the analysis of gene expression and protein profiling techniques. Many online tools are used to cluster genes in various ways or to assign gene products to signal transduction pathways. Among these, pathFinder is a unique tool that can find signal transduction pathways between first, second, or nth messengers and their targets within the cell. pathFinder can identify qualitatively all possible signal transduction pathways connecting any starting component and target within a database of two-component pathways (directional dyads). One or more intermediate pathway components can be excluded to simulate the use of pharmacological inhibitors or genetic deletion (knockout). Missing elements in a pathway connecting the activator or initiator and target can also be inferred from a null pathway result. The value of this static network analysis tool is illustrated by the prediction from pathFinder analysis of a novel cyclic AMP-dependent, protein kinase A-independent signaling pathway in neuroendocrine cells, which has been experimentally confirmed.
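
    A generic sketch of the kind of search pathFinder performs over a database of directional dyads, assuming a plain depth-first enumeration of simple paths with optional node exclusion to mimic inhibition or knockout; the dyads below are hypothetical, and this is not pathFinder's code.

```python
# Generic sketch of the kind of search pathFinder performs (not its actual code):
# enumerate all simple directed paths from a start node to a target in a network
# of two-component interactions (directional dyads), optionally excluding nodes
# to mimic pharmacological inhibition or genetic knockout.
def all_paths(dyads, start, target, excluded=frozenset()):
    graph = {}
    for src, dst in dyads:
        graph.setdefault(src, []).append(dst)

    def dfs(node, path):
        if node == target:
            yield path
            return
        for nxt in graph.get(node, []):
            if nxt not in path and nxt not in excluded:
                yield from dfs(nxt, path + [nxt])

    if start not in excluded:
        yield from dfs(start, [start])

# Hypothetical cAMP signalling dyads:
dyads = [("cAMP", "PKA"), ("cAMP", "Epac"), ("PKA", "CREB"),
         ("Epac", "Rap1"), ("Rap1", "ERK"), ("ERK", "CREB")]
print(list(all_paths(dyads, "cAMP", "CREB")))                       # all routes
print(list(all_paths(dyads, "cAMP", "CREB", excluded={"PKA"})))     # PKA "knockout"
```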

  12. The Appropriate Deconvolution Method on 1990 HST Saturn Equatorial Disturbance Images

    NASA Astrophysics Data System (ADS)

    Temma, T.; Chanover, N.

    2000-10-01

    We search for an appropriate method to restore WFPC1 Saturn equatorial storm images taken in 1990. As is well known, WFPC1 images suffer from serious spherical aberration, and therefore the choice of an appropriate restoration method is crucial for photometry or cloud morphology analysis. In our research, the Maximum Entropy Method (MEM), Lucy, and Wiener methods in the IRAF STSDAS package are adopted and compared, mainly regarding conservation of photometric properties. We first look at the deconvolution effect on a WFPC2 Saturn image. The brightness profiles and the brightness of Saturn and its satellites (Mimas and Tethys) relative to Rhea are examined. It has become clear that the Wiener method is not desirable for restoring the original image information because it produces circular negative annuli around the peaks. In addition, it shows poorer contrast improvements than the other two. Though the MEM and Lucy methods exhibit quite similar sharpening and photometric effects in this analysis, the Lucy method restores more brightness from the surrounding area around a point source. This results in about 2% lower relative brightness of Saturn's disk than that derived by MEM relative to Rhea. At the next stage, we follow this procedure. First, a noiseless model Jupiter image is constructed. Then, it is convolved with the WFPC1 point spread function, and Gaussian noise is added. Finally, the artificially blurred image is deconvolved, and the result is compared with the original noiseless model image. This comparison enables us to estimate the efficiency and validity of the deconvolution process. Furthermore, we will show some preliminary radiative transfer center-to-limb modelling results on restored 1990 Saturn images. We present possible explanations for the difference in cloud structure between the disturbed northern equatorial region and the quiet southern equatorial region.
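
    The blur-then-restore test described above can be reproduced in miniature with scikit-image, which provides Richardson-Lucy and Wiener deconvolution routines; the synthetic image, PSF, and parameters below are assumptions, and these routines are stand-ins for the IRAF STSDAS tasks used in the study.

```python
# Sketch of the blur-then-restore test using scikit-image (not the IRAF/STSDAS
# tasks used in the study): blur a synthetic "disk plus point source" image with a
# known PSF, add noise, then compare Richardson-Lucy and Wiener restorations.
import numpy as np
from scipy.signal import fftconvolve
from skimage.restoration import richardson_lucy, wiener

rng = np.random.default_rng(0)

# Synthetic object: a bright disk with a point source (crude stand-in for planet + satellite).
y, x = np.mgrid[:128, :128]
truth = ((x - 64) ** 2 + (y - 64) ** 2 < 20 ** 2).astype(float)
truth[20, 100] = 5.0

# Gaussian PSF and a noisy blurred observation (clipped to non-negative values).
py, px = np.mgrid[:15, :15]
psf = np.exp(-((px - 7) ** 2 + (py - 7) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
blurred = np.clip(fftconvolve(truth, psf, mode="same")
                  + 0.01 * rng.standard_normal(truth.shape), 0, None)

rl = richardson_lucy(blurred, psf, 30, clip=False)   # iterative, flux-preserving in principle
wn = wiener(blurred, psf, balance=0.1, clip=False)   # linear; can ring (negative annuli) near peaks

for name, img in [("blurred", blurred), ("Richardson-Lucy", rl), ("Wiener", wn)]:
    print(f"{name:>16}: total flux = {img.sum():8.1f}")
```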

  13. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that makes this methodology applicable to selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities.
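
    A sketch of the calibration idea, assuming simulated mixture spectra and scikit-learn's partial least squares implementation; this is not the authors' Matlab tool, and the pure-component spectra and concentrations are invented.

```python
# Sketch of the calibration idea (not the authors' Matlab tool): fit a partial
# least squares model that maps absorption spectra of protein mixtures to the
# concentrations of the individual co-eluting components. Spectra here are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.linspace(240, 320, 81)

def band(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Pure-component "spectra" for three hypothetical proteins.
pure = np.vstack([band(258, 8), band(275, 10), band(290, 12)])

# Calibration set: random concentration mixtures plus measurement noise.
C_train = rng.uniform(0, 2, size=(60, 3))                               # mg/mL
A_train = C_train @ pure + 0.01 * rng.standard_normal((60, len(wavelengths)))

pls = PLSRegression(n_components=3)
pls.fit(A_train, C_train)

# "Inline" prediction for a new, unseen elution spectrum.
c_true = np.array([[0.8, 1.5, 0.3]])
a_new = c_true @ pure + 0.01 * rng.standard_normal((1, len(wavelengths)))
print("predicted concentrations:", pls.predict(a_new).round(2))
```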

  14. Improving ground-penetrating radar data in sedimentary rocks using deterministic deconvolution

    USGS Publications Warehouse

    Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.; Byrnes, A.P.

    2003-01-01

    Resolution is key to confidently identifying unique geologic features using ground-penetrating radar (GPR) data. Source wavelet "ringing" (related to bandwidth) in a GPR section limits resolution because of wavelet interference, and can smear reflections in time and/or space. The resultant potential for misinterpretation limits the usefulness of GPR. Deconvolution offers the ability to compress the source wavelet and improve temporal resolution. Unlike statistical deconvolution, deterministic deconvolution is mathematically simple and stable while providing the highest possible resolution because it uses the source wavelet unique to the specific radar equipment. Source wavelets generated in, transmitted through and acquired from air allow successful application of deterministic approaches to wavelet suppression. We demonstrate the validity of using a source wavelet acquired in air as the operator for deterministic deconvolution in a field application using "400-MHz" antennas at a quarry site characterized by interbedded carbonates with shale partings. We collected GPR data on a bench adjacent to cleanly exposed quarry faces in which we placed conductive rods to provide conclusive groundtruth for this approach to deconvolution. The best deconvolution results, which are confirmed by the conductive rods for the 400-MHz antenna tests, were observed for wavelets acquired when the transmitter and receiver were separated by 0.3 m. Applying deterministic deconvolution to GPR data collected in sedimentary strata at our study site resulted in an improvement in resolution (50%) and improved spatial location (0.10-0.15 m) of geologic features compared to the same data processed without deterministic deconvolution. The effectiveness of deterministic deconvolution for increased resolution and spatial accuracy of specific geologic features is further demonstrated by comparing results of deconvolved data with nondeconvolved data acquired along a 30-m transect immediately adjacent
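
    A minimal sketch of deterministic deconvolution, assuming spectral division by a measured source wavelet stabilized with a simple water level; the wavelet, trace, and water-level value are illustrative, and this is not the authors' processing flow.

```python
# Minimal sketch of deterministic deconvolution (not the authors' processing flow):
# divide the trace spectrum by the spectrum of a source wavelet (e.g. measured in
# air), with a simple "water level" to stabilize the division where the wavelet
# spectrum is weak. Wavelet, trace, and water level here are illustrative assumptions.
import numpy as np

def deterministic_deconv(trace, wavelet, water_level=0.05):
    n = len(trace)
    T = np.fft.rfft(trace, n)
    W = np.fft.rfft(wavelet, n)
    floor = water_level * np.abs(W).max()
    W_stab = np.where(np.abs(W) < floor, floor * np.exp(1j * np.angle(W)), W)
    return np.fft.irfft(T / W_stab, n)

# Synthetic test: reflectivity spikes convolved with a "ringy" wavelet.
t = np.arange(512)
wavelet = np.exp(-((t - 20) / 6.0) ** 2) * np.cos(0.5 * (t - 20))
refl = np.zeros(512)
refl[[100, 140, 300]] = [1.0, -0.6, 0.8]
trace = np.convolve(refl, wavelet)[:512]

recovered = deterministic_deconv(trace, wavelet)
print("largest recovered spikes near indices:", np.argsort(np.abs(recovered))[-3:])
```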

  15. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    PubMed

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides non-linear chaos analysis with Lyapunov exponents and entropies; multivariate analysis with principal and independent component analyses; and pattern classification with discriminant analysis. This tool can also be utilized for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper. PMID:26679001
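
    As an illustration of one of the listed methods, the sketch below applies independent component analysis to synthetic multichannel signals with scikit-learn; the signals and mixing matrix are invented, and this is not the GUI tool itself.

```python
# Sketch of one analysis such a tool offers (independent component analysis), using
# scikit-learn on synthetic multichannel signals; this is not the GUI tool itself.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
t = np.linspace(0, 8, 2000)

# Two synthetic "sources": a 10 Hz rhythm and a sawtooth-like artifact.
s1 = np.sin(2 * np.pi * 10 * t)
s2 = 2 * (t * 3 % 1) - 1
S = np.c_[s1, s2] + 0.05 * rng.standard_normal((len(t), 2))

# Mix the sources into four "channels" (e.g. EEG electrodes).
A = rng.uniform(0.5, 1.5, size=(2, 4))
X = S @ A

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)            # recovered components (up to sign/scale/order)
print("recovered component shape:", S_est.shape)
```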

  17. Assessing the surgeon's technical skills: analysis of the available tools.

    PubMed

    Memon, Muhammed Ashraf; Brigden, David; Subramanya, Manjunath S; Memon, Breda

    2010-05-01

    The concept of assessing competency in surgical practice is not new and has taken on an added urgency in view of the recent high-profile inquiries into "botched cases" involving surgeons of various levels in different parts of the world. Until very recently, surgeons in the United Kingdom and other parts of the world, although required to undergo formal and compulsory examinations to test their factual knowledge and decision making, were not required to demonstrate technical ability. Therefore, there existed (and still exist) no objective assessment criteria to test trainees' surgical skill, especially during the exit examination, which, if passed, grants surgeons an unrestricted license to practice their specialties. However, with the introduction of a new curriculum by various surgical societies and a demand from the lay community for better standards, new assessment tools are emerging that focus on technical competency and that could objectively and reliably measure surgical skills. Furthermore, training authorities and hospitals are keen to embrace these changes for satisfactory accreditation and reaccreditation processes and to assure the public of the safety of public and private health care systems. In the United Kingdom, two new surgical tools (Surgical Direct Observation of Procedural Skill and Procedure Based Assessments) have been simultaneously introduced to assess surgical trainees. The authors describe these two assessment methods, provide an overview of other assessment tools currently or previously used to assess surgical skills, critically analyze the two new assessment tools, and reflect on the merit of simultaneously introducing them.

  18. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  19. Mapping and spatiotemporal analysis tool for hydrological data: Spellmap

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A lack of data management and analysis tools is one of the major limitations to effectively evaluating and using large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...

  20. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  1. Tools for Developing a Quality Management Program: Proactive Tools (Process Mapping, Value Stream Mapping, Fault Tree Analysis, and Failure Mode and Effects Analysis)

    SciTech Connect

    Rath, Frank

    2008-05-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result, there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools, how to use them, when they should (and should not) be used, and the intended purposes for their use. In addition, the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.

  2. Proteomic tools for the analysis of transient interactions between metalloproteins.

    PubMed

    Martínez-Fábregas, Jonathan; Rubio, Silvia; Díaz-Quintana, Antonio; Díaz-Moreno, Irene; De la Rosa, Miguel Á

    2011-05-01

    Metalloproteins play major roles in cell metabolism and signalling pathways. In many cases, they show moonlighting behaviour, acting in different processes, depending on the physiological state of the cell. To understand these multitasking proteins, we need to discover the partners with which they carry out such novel functions. Although many technological and methodological tools have recently been reported for the detection of protein interactions, specific approaches to studying the interactions involving metalloproteins are not yet well developed. The task is even more challenging for metalloproteins, because they often form short-lived complexes that are difficult to detect. In this review, we gather the different proteomic techniques and biointeractomic tools reported in the literature. All of them have shown their applicability to the study of transient and weak protein-protein interactions, and are therefore suitable for metalloprotein interactions.

  3. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    SciTech Connect

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  4. Process mapping as a tool for home health network analysis.

    PubMed

    Pluto, Delores M; Hirshorn, Barbara A

    2003-01-01

    Process mapping is a qualitative tool that allows service providers, policy makers, researchers, and other concerned stakeholders to get a "bird's eye view" of a home health care organizational network or a very focused, in-depth view of a component of such a network. It can be used to share knowledge about community resources directed at the older population, identify gaps in resource availability and access, and promote on-going collaborative interactions that encourage systemic policy reassessment and programmatic refinement. This article is a methodological description of process mapping, which explores its utility as a practice and research tool, illustrates its use in describing service-providing networks, and discusses some of the issues that are key to successfully using this methodology.

  5. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision-making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

  6. Design tools for daylighting illumination and energy analysis

    SciTech Connect

    Selkowitz, S.

    1982-07-01

    This work reviews the problems and potential of using daylighting to provide illumination in building interiors. It describes some of the design tools now or soon to be available for incorporating daylighting into the building design process, as well as state-of-the-art methods for analyzing the impacts daylighting can have on the selection of lighting controls, lighting energy consumption, heating and cooling loads, and peak power demand.

  7. OutbreakTools: A new platform for disease outbreak analysis using the R software

    PubMed Central

    Jombart, Thibaut; Aanensen, David M.; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A.; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

    2014-01-01

    The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

  8. On the Use of Factor Analysis as a Research Tool.

    ERIC Educational Resources Information Center

    Benson, Jeri; Nasser, Fadia

    1998-01-01

    Discusses the conceptual/theoretical design, statistical, and reporting issues in choosing factor analysis for research. Provides questions to consider when planning, analyzing, or reporting an exploratory factor analysis study. (SK)

  9. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
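
    A tiny illustration of the underlying idea, assuming invented paired observations of a stressor and an impairment indicator; this is not the CPROB tool.

```python
# Tiny illustration of the underlying idea (not the CPROB tool): estimate the
# conditional probability of an ecological impairment given that a stressor
# exceeds a threshold, from paired observations. Data here are invented.
stressor = [0.2, 1.4, 0.9, 2.1, 0.3, 1.8, 2.5, 0.7, 1.1, 2.9]   # e.g. contaminant concentration
impaired = [0,   1,   0,   1,   0,   1,   1,   0,   0,   1]     # 1 = impaired site

threshold = 1.0
exceed = [i for i, s in enumerate(stressor) if s > threshold]
p_impaired_given_exceed = sum(impaired[i] for i in exceed) / len(exceed)
print(f"P(impaired | stressor > {threshold}) = {p_impaired_given_exceed:.2f}")
```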

  10. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  11. New Miscue Analysis: A Tool for Comprehending Reading.

    ERIC Educational Resources Information Center

    Anderson, Jonathan

    New miscue analysis, which combines characteristics of cloze procedure with traditional miscue analysis, seems to overcome some of the limitations encountered in the traditional method. In new miscue analysis, subjects read selections below and at their level of reading ability and are not agitated by being asked to read material that is too…

  12. Iterative deconvolution of X-ray and optical SNR images

    NASA Technical Reports Server (NTRS)

    Nisenson, Peter; Standley, Clive; Hughes, John

    1992-01-01

    Blind Iterative Deconvolution (BID) is a technique originally developed to correct the degrading effects of atmospheric turbulence on astronomical images from single short-exposure, high signal-to-noise-ratio frames. At the Center for Astrophysics, we have implemented a version of BID following the general approach of Ayers and Dainty (1988), but extending the technique to use Wiener filtering, and developed it for application to high-energy images from Einstein and ROSAT. In the optical, the point spread function (PSF) that degrades the images is due to a combination of telescope and atmospheric aberrations. At high energies, the degrading function is the instrument response function, which is known to be unstable in both time and energy. In both cases, the PSF is poorly known, so BID can be used to extract the PSF from the image and then deconvolve the blurred image to produce a sharpened image. Other aspects of this technique are discussed.
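
    A toy alternating scheme in the spirit of the Ayers and Dainty approach with Wiener-style updates, written from scratch in NumPy; the constraints, stabilization constant, and test image are assumptions, and this is not the Center for Astrophysics implementation.

```python
# Toy alternating blind deconvolution in the spirit of Ayers & Dainty with a
# Wiener-style update (not the Center for Astrophysics implementation): estimate
# both the PSF and the object from a single blurred frame by alternating
# constrained Fourier-domain updates.
import numpy as np

def blind_deconvolve(g, n_iter=50, eps=1e-2):
    G = np.fft.fft2(g)
    f = g.copy()                              # object estimate starts as the blurred image
    h = np.ones_like(g) / g.size              # flat initial PSF estimate
    for _ in range(n_iter):
        F = np.fft.fft2(f)
        H = G * np.conj(F) / (np.abs(F) ** 2 + eps)     # Wiener-like PSF update
        h = np.real(np.fft.ifft2(H))
        h = np.clip(h, 0, None)
        h /= h.sum()                                    # positivity + unit-flux constraints
        H = np.fft.fft2(h)
        F = G * np.conj(H) / (np.abs(H) ** 2 + eps)     # Wiener-like object update
        f = np.clip(np.real(np.fft.ifft2(F)), 0, None)  # positivity constraint
    return f, h

# Synthetic test: point sources blurred by an "unknown" Gaussian PSF.
rng = np.random.default_rng(3)
truth = np.zeros((64, 64))
truth[20, 20] = 1.0
truth[40, 45] = 0.7
yy, xx = np.mgrid[:64, :64]
psf = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.5 ** 2))
psf /= psf.sum()
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))
blurred += 0.001 * rng.standard_normal(blurred.shape)

restored, psf_est = blind_deconvolve(blurred)
print("brightest pixel in restored image:", np.unravel_index(np.argmax(restored), restored.shape))
```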

  13. Denoised Wigner distribution deconvolution via low-rank matrix completion.

    PubMed

    Lee, Justin; Barbastathis, George

    2016-09-01

    Wigner distribution deconvolution (WDD) is a decades-old method for recovering phase from intensity measurements. Although the technique offers an elegant linear solution to the quadratic phase retrieval problem, it has seen limited adoption due to its high computational/memory requirements and the fact that the technique often exhibits high noise sensitivity. Here, we propose a method for noise suppression in WDD via low-rank noisy matrix completion. Our technique exploits the redundancy of an object's phase space to denoise its WDD reconstruction. We show in model calculations that our technique outperforms other WDD algorithms as well as modern iterative methods for phase retrieval such as ptychography. Our results suggest that a class of phase retrieval techniques relying on regularized direct inversion of ptychographic datasets (instead of iterative reconstruction techniques) can provide accurate quantitative phase information in the presence of high levels of noise. PMID:27607616

  14. Time-varying deconvolution of GPR data in civil engineering

    NASA Astrophysics Data System (ADS)

    Economou, Nikos; Vafidis, Antonis; Hamdan, Hamdan; Kritikakis, George; Andronikidis, Nikos; Dimitriadis, Kleisthenis

    2012-09-01

    Ground Penetrating Radar (GPR) profiles are often used in civil engineering problems. Overlapping reflections from thin subgrade layers are observed when a relatively low frequency antenna is used. An efficient GPR data processing method, which increases the dominant frequency of GPR data and the temporal resolution, is proposed. It is implemented in the t-f domain. The proposed time-varying deconvolution technique avoids the need to calculate an inverse zero-phase whitening operator and to subsequently apply band-pass filtering. The user must select the dominant frequency of the Ricker wavelet and use the phase of a reference electromagnetic wavelet, which is acquired experimentally, for stationary dephasing. Apart from delineating thin layers, this method reduces the number of antennas needed to image both shallow and deeper layers in civil engineering. The effectiveness of the proposed method is demonstrated through four civil engineering applications.

  15. Simultaneous Denoising, Deconvolution, and Demixing of Calcium Imaging Data.

    PubMed

    Pnevmatikakis, Eftychios A; Soudry, Daniel; Gao, Yuanjun; Machado, Timothy A; Merel, Josh; Pfau, David; Reardon, Thomas; Mu, Yu; Lacefield, Clay; Yang, Weijian; Ahrens, Misha; Bruno, Randy; Jessell, Thomas M; Peterka, Darcy S; Yuste, Rafael; Paninski, Liam

    2016-01-20

    We present a modular approach for analyzing calcium imaging recordings of large neuronal ensembles. Our goal is to simultaneously identify the locations of the neurons, demix spatially overlapping components, and denoise and deconvolve the spiking activity from the slow dynamics of the calcium indicator. Our approach relies on a constrained nonnegative matrix factorization that expresses the spatiotemporal fluorescence activity as the product of a spatial matrix that encodes the spatial footprint of each neuron in the optical field and a temporal matrix that characterizes the calcium concentration of each neuron over time. This framework is combined with a novel constrained deconvolution approach that extracts estimates of neural activity from fluorescence traces, to create a spatiotemporal processing algorithm that requires minimal parameter tuning. We demonstrate the general applicability of our method by applying it to in vitro and in vivo multi-neuronal imaging data, whole-brain light-sheet imaging data, and dendritic imaging data. PMID:26774160
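
    A greatly simplified stand-in for the spatiotemporal factorization, using plain non-negative matrix factorization from scikit-learn without the paper's spatial/temporal constraints or spike deconvolution; the synthetic movie and parameters are invented.

```python
# Simplified stand-in for the spatiotemporal factorization (plain NMF, without the
# paper's spatial/temporal constraints or spike deconvolution): factor a movie
# Y (pixels x frames) into spatial footprints A and temporal traces C, Y ≈ A @ C.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)
n_pixels, n_frames, n_cells = 400, 300, 3

# Synthetic data: each "neuron" has a localized footprint and a slow calcium-like trace.
A_true = np.zeros((n_pixels, n_cells))
for k in range(n_cells):
    A_true[rng.choice(n_pixels, 30, replace=False), k] = rng.uniform(0.5, 1.0, 30)
C_true = np.maximum(0, rng.standard_normal((n_cells, n_frames))).cumsum(axis=1) * 0.01
Y = A_true @ C_true + 0.01 * rng.random((n_pixels, n_frames))

model = NMF(n_components=n_cells, init="nndsvda", max_iter=500)
A_est = model.fit_transform(Y)          # estimated spatial footprints (pixels x cells)
C_est = model.components_               # estimated temporal traces (cells x frames)
print("reconstruction error:", round(model.reconstruction_err_, 3))
```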

  16. Ray-based blind deconvolution in ocean sound channels.

    PubMed

    Sabra, Karim G; Song, Hee-Chun; Dowling, David R

    2010-02-01

    This letter describes a ray-based blind deconvolution technique for ocean sound channels that produces broadband estimates of the source-to-array impulse response and the original source waveform from array-measured signals corrupted by (unknown) multipath propagation. The technique merely requires elementary knowledge of array geometry and sound speed at the array location. It is based on identifying a ray arrival direction to separate source waveform and acoustic-propagation phase contributions to the received signals. This technique successfully decoded underwater telecommunication sequences in the 3-4 kHz band that were broadcast over 4 km in a 120-m-deep ocean sound channel without a priori knowledge of sound channel characteristics.

  17. Positive deconvolution for superimposed extended source and point sources

    NASA Astrophysics Data System (ADS)

    Giovannelli, J.-F.; Coulais, A.

    2005-08-01

    The paper deals with the construction of images from visibilities acquired using aperture synthesis instruments: Fourier synthesis, deconvolution, and spectral interpolation/extrapolation. Its intended application is to specific situations in which the imaged object possesses two superimposed components: (i) an extended component together with (ii) a set of point sources. It is also specifically designed for the case of positive maps and accounts for a known support. Its originality lies in the joint estimation of the two components, consistent with the data, the properties of each component, positivity, and any known support. We approach the subject as an inverse problem within a regularization framework: a regularized least-squares criterion is specifically proposed and the estimated maps are defined as its minimizer. We have investigated several options for the numerical minimization and we propose a new efficient algorithm based on an augmented Lagrangian. Evaluation is carried out using simulated and real data (from radio interferometry), demonstrating the capability to accurately separate the two components.
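
    A generic sketch of positivity-constrained deconvolution by projected gradient descent on a regularized least-squares criterion; this is not the authors' augmented-Lagrangian algorithm, it estimates a single map rather than jointly separating extended and point-source components, and the beam and test scene are invented.

```python
# Generic sketch of positivity-constrained deconvolution by projected gradient
# descent on a regularized least-squares criterion (this is NOT the authors'
# augmented-Lagrangian algorithm, and it estimates a single map rather than
# jointly separating extended and point-source components).
import numpy as np

def positive_deconv(y, h, lam=1e-3, n_iter=500):
    """Minimize ||h * x - y||^2 + lam * ||x||^2 subject to x >= 0 (1-D, circular convolution)."""
    H = np.fft.rfft(h, len(y))
    step = 0.9 / (2 * (np.abs(H) ** 2).max() + 2 * lam)   # safe step from the Lipschitz bound
    x = np.zeros_like(y)
    for _ in range(n_iter):
        resid = np.fft.irfft(H * np.fft.rfft(x), len(y)) - y
        grad = 2 * np.fft.irfft(np.conj(H) * np.fft.rfft(resid), len(y)) + 2 * lam * x
        x = np.clip(x - step * grad, 0, None)             # gradient step + projection onto x >= 0
    return x

# Synthetic "dirty beam" and data: two point sources on a smooth extended background.
n = 256
t = np.arange(n)
beam = np.sinc((t - n // 2) / 4.0) ** 2
truth = 0.2 * np.exp(-((t - 128) / 40.0) ** 2)
truth[[80, 170]] += [1.0, 0.6]
y = np.fft.irfft(np.fft.rfft(beam) * np.fft.rfft(truth), n)

x_hat = positive_deconv(y, beam)
print("brightest recovered pixels:", np.argsort(x_hat)[-2:])
```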

  18. Image Deconvolution by Means of Frequency Blur Invariant Concept

    PubMed Central

    2014-01-01

    Different blur-invariant descriptors have been proposed so far, which are either in the spatial domain or based on properties available in the moment domain. In this paper, a frequency framework is proposed to develop blur-invariant features that are used to deconvolve an image degraded by Gaussian blur. These descriptors are obtained by establishing an equivalent relationship between the normalized Fourier transforms of the blurred and original images, both normalized by their respective fixed frequencies set to one. An advantage of the proposed invariant descriptors is that both the point spread function (PSF) and the original image can be estimated. The performance of the frequency invariants is demonstrated through experiments. Image deconvolution is presented as an additional application to verify the proposed blur-invariant features. PMID:25202743

  19. A unified approach to superresolution and multichannel blind deconvolution.

    PubMed

    Sroubek, Filip; Cristóbal, Gabriel; Flusser, Jan

    2007-09-01

    This paper presents a new approach to the blind deconvolution and superresolution problem of multiple degraded low-resolution frames of the original scene. We do not assume any prior information about the shape of degradation blurs. The proposed approach consists of building a regularized energy function and minimizing it with respect to the original image and blurs, where regularization is carried out in both the image and blur domains. The image regularization based on variational principles maintains stable performance under severe noise corruption. The blur regularization guarantees consistency of the solution by exploiting differences among the acquired low-resolution images. Several experiments on synthetic and real data illustrate the robustness and utilization of the proposed technique in real applications.

  20. Parametric study on sequential deconvolution for force identification

    NASA Astrophysics Data System (ADS)

    Lai, Tao; Yi, Ting-Hua; Li, Hong-Nan

    2016-09-01

    Force identification can be viewed mathematically as a mapping from the observed responses to the external forces through a matrix filled with system Markov parameters, which makes the problem difficult or even intractable for long time durations. A potentially efficient solution is to perform the identification sequentially. This paper presents a parametric study on the sequential deconvolution input reconstruction (SDR) method, which was proposed by Bernal. The behavior of the SDR method under different window parameters, noise levels, and sensor configurations is investigated. In addition, a new normalized standard deviation of the reconstruction error over time is derived to cover the effect of only independent noise entries. Sinusoidal and band-limited white-noise excitations are identified with good accuracy, even with 10% noise. The simulation results yield various conclusions that may be helpful to engineering practitioners.
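
    The convolution mapping itself can be illustrated with a small example: a lower-triangular Toeplitz matrix of Markov parameters maps the force history to the response history, and a short window can be inverted with Tikhonov-regularized least squares; this is not Bernal's SDR algorithm, and the single-degree-of-freedom system, force, and regularization are invented.

```python
# Illustration of the underlying mapping (not Bernal's SDR algorithm): the measured
# response is the convolution of the input force with the system's Markov parameters,
# y = T u with T lower-triangular Toeplitz; a short window is inverted here with
# Tikhonov-regularized least squares. System and force are invented single-channel examples.
import numpy as np
from scipy.linalg import toeplitz

# Markov parameters (sampled impulse response) of a lightly damped SDOF oscillator.
dt, wn, zeta, n = 0.01, 2 * np.pi * 5, 0.02, 400
t = np.arange(n) * dt
wd = wn * np.sqrt(1 - zeta ** 2)
markov = dt * np.exp(-zeta * wn * t) * np.sin(wd * t) / wd

# Lower-triangular Toeplitz matrix mapping the force history to the response history.
T = toeplitz(markov, np.zeros(n))

u_true = np.sin(2 * np.pi * 3 * t)                       # sinusoidal excitation
y = T @ u_true
y_noisy = y + 0.10 * np.std(y) * np.random.default_rng(5).standard_normal(n)   # 10% noise

# Tikhonov-regularized deconvolution of the whole window.
G = T.T @ T
lam = 1e-3 * np.trace(G) / n
u_est = np.linalg.solve(G + lam * np.eye(n), T.T @ y_noisy)
print("relative reconstruction error:",
      round(np.linalg.norm(u_est - u_true) / np.linalg.norm(u_true), 3))
```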