Science.gov

Sample records for deconvolution analysis tool

  1. BLIND DECONVOLUTION AND DEBLURRING IN IMAGE ANALYSIS

    E-print Network

    Qiu, Peihua

    Peter Hall and Peihua Qiu. ABSTRACT: Blind deconvolution problems arise in image analysis when both the extent of the image blur and the true image must be estimated from the image data. This is a blind deconvolution problem and is, of course, significantly more challenging than its conventional, non-blind counterpart.

  2. BLIND DECONVOLUTION AND DEBLURRING IN IMAGE ANALYSIS

    E-print Network

    Qiu, Peihua

    Peter Hall and Peihua Qiu. ABSTRACT: ... This is a blind deconvolution problem and is, of course, significantly more challenging than its more conventional, non-blind counterpart. See, for example, work of Kundur and Hatzinakos (1998) and Carasso (2001).

  3. Journal of Multivariate Analysis: Optimal Spherical Deconvolution

    E-print Network

    Rosenthal, Jeffrey S.

    Peter T. Kim, University of Guelph. This paper addresses the issue of optimal deconvolution density estimation on the 2-sphere. ... This can be done by deconvolution; however, as in the Euclidean case, the difficulty of the deconvolution turns out ...

  4. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
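
    The record above lists maximum-likelihood deconvolution among the implemented methods. As a hedged illustration (not the IDL DeConv_Tool code itself), the classic Richardson-Lucy iteration, the standard maximum-likelihood scheme for a known point spread function under Poisson noise, can be sketched in a few lines of Python; the iteration count and starting estimate are placeholder assumptions.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(blurred, psf, n_iter=30):
          # Maximum-likelihood (Richardson-Lucy) deconvolution for a known PSF.
          psf = psf / psf.sum()                      # normalize the PSF
          psf_mirror = psf[::-1, ::-1]               # flipped PSF for the correction step
          estimate = np.full(blurred.shape, blurred.mean())
          for _ in range(n_iter):
              reblurred = fftconvolve(estimate, psf, mode="same")
              ratio = blurred / (reblurred + 1e-12)  # guard against division by zero
              estimate *= fftconvolve(ratio, psf_mirror, mode="same")
          return estimate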

  5. PVT Analysis With A Deconvolution Algorithm

    SciTech Connect

    Kouzes, Richard T.

    2011-02-01

    Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis algorithm developed by Symetrica, Inc., and they have demonstrated the application of their deconvolution algorithm to PVT with very promising results. The thrust of such a deconvolution algorithm used with PVT is to allow for identification and rejection of naturally occurring radioactive material, reducing alarm rates, rather than the complete identification of all radionuclides, which is the goal of spectroscopic portal monitors. Under this condition, there could be a significant increase in sensitivity to threat materials. The advantage of this approach is an enhancement to the low cost, robust detection capability of PVT-based radiation portal monitor systems. The success of this method could provide an inexpensive upgrade path for a large number of deployed PVT-based systems to provide significantly improved capability at a much lower cost than deployment of NaI(Tl)-based systems of comparable sensitivity.

  6. Iterative deconvolution and semiblind deconvolution methods in magnetic archaeological prospecting

    E-print Network

    Bertero, Mario

    ... The reconstruction from magnetic surface measurements is a 2D deconvolution problem. Because this problem is ill-posed, it requires ... Exploiting image deconvolution tools, two iterative reconstruction methods are applied to minimize ...

  7. QML Blind Deconvolution: Asymptotic Analysis

    E-print Network

    Zeevi, Yehoshua Y. "Josh"

    Blind deconvolution is considered as a problem of quasi maximum likelihood (QML) estimation of the restoration kernel ... are discussed. Introduction: Blind deconvolution arises in various applications related to acoustics and optics ...

  8. AIRY-LN: an ad-hoc numerical tool for deconvolution of images from the LBT instrument LINC-NIRVANA

    E-print Network

    Bertero, Mario

    ... in different hour angles in order to obtain a better uv-coverage, an aperture-synthesis image ... the Large Binocular Telescope (LBT), composed of two 8.4-m apertures on a unique mount. It will provide multiple images of the same ...

  9. Importance of FTIR Spectra Deconvolution for the Analysis of Amorphous Calcium Phosphates

    NASA Astrophysics Data System (ADS)

    Brangule, Agnese; Gross, Karlis Agris

    2015-03-01

    This work will consider Fourier transform infrared spectroscopy - diffuse reflectance infrared reflection (FTIR-DRIFT) for collecting the spectra and deconvolution to identify changes in bonding as a means of more powerful detection. Spectra were recorded from amorphous calcium phosphate synthesized by wet precipitation, and from bone. FTIR-DRIFT was used to study the chemical environments of PO4, CO3 and amide. Deconvolution of spectra separated overlapping bands in the ν4PO4, ν2CO3, ν3CO3 and amide regions, allowing a more detailed analysis of changes at the atomic level. Amorphous calcium phosphate dried at 80 °C, despite showing an X-ray diffraction amorphous structure, displayed carbonate in positions resembling a carbonated hydroxyapatite. Additional peaks were designated as A1 type, A2 type or B type. Deconvolution allowed the separation of CO3 positions in bone from amide peaks. FTIR-DRIFT spectrometry in combination with deconvolution offers an advanced tool for qualitative and quantitative determination of CO3, PO4 and HPO4 and shows promise to measure the degree of order.

  10. MULTICHANNEL BLIND DECONVOLUTION OF ARBITRARY SIGNALS: ADAPTIVE ALGORITHMS AND STABILITY ANALYSES

    E-print Network

    Douglas, Scott C.

    ... Adaptive algorithms are presented for the multichannel blind deconvolution of arbitrary non-Gaussian source mixtures. Two of the algorithms are spatio-temporal extensions of recently derived blind signal separation algorithms that combine kurtosis-based contrast ...

  11. A deconvolution technique for Hubble Space Telescope FGS fringe analysis

    NASA Technical Reports Server (NTRS)

    Hershey, John L.

    1992-01-01

    A technique has been developed for directly transforming interferometer fringe visibility functions ('S curves') from the Hubble Space Telescope (HST) fine guidance sensors (FGSs) into intensity profiles of the program object. In a process analogous to Fourier transform image deconvolution, an S curve from a double star yields a pair of narrow profiles containing information on the separation (in one coordinate), and relative brightness of the two stars. The procedure has yielded high internal precision for the separation and relative brightness in tests with HST data. Simulations indicate that it can also deconvolve S curves from multiple stars or continuous intensity distributions such as resolvable stellar or extragalactic objects. Similar deconvolution analyses may be useful in other types of interferometry.

  12. Monitoring a Building Using Deconvolution Interferometry. II: Ambient-Vibration Analysis

    E-print Network

    Snieder, Roel

    ... We apply deconvolution interferometry to ambient-vibration data, instead of using earthquake data, to monitor a building. The time continuity of ambient vibrations is useful for temporal monitoring. We show that, because multiple sources ...

  13. Isotope pattern deconvolution as a rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade, stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers caused by the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. Hereby the induced change in the natural isotopic composition of an element allows, amongst others, for studying the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag of particular biological samples. Due to the vast potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, as e.g. Sr is available in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer incorporated into the natural sample matrix, the degree of impurities, or species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer studies is critically discussed, with special emphasis on evaluating different data processing strategies using the example of enriched stable Sr isotopes.1 The analytical key parameters such as blank (Kr, Sr and Rb), variation of the natural Sr isotopic composition in the sample, mass bias, interferences (Rb) and total combined uncertainty are considered. A full metrological protocol for data processing using IPD is presented, based on data gained during two transgenerational marking studies of fish, where the transfer of a Sr isotope double spike (84Sr and 86Sr) from female spawners of common carp (Cyprinus carpio L.) and brown trout (Salmo trutta f.f.)2 to the centre of the otoliths of their offspring was studied by (LA)-MC-ICP-MS. 1J. Irrgeher, A. Zitek, M. Cervicek and T. Prohaska, J. Anal. At. Spectrom., 2014, 29, 193-200. 2A. Zitek, J. Irrgeher, M. Kletzl, T. Weismann and T. Prohaska, Fish. Manage. Ecol., 2013, 20, 654-361.
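
    To make the regression idea concrete, here is a minimal, hypothetical sketch of isotope pattern deconvolution by least squares: the measured isotope abundances are modelled as a mixture of the natural pattern and enriched-spike patterns, and the mixing fractions are the regression coefficients. The spike compositions and the mixture are synthetic placeholders, not values from the study.

      import numpy as np

      # Isotope patterns (84Sr, 86Sr, 87Sr, 88Sr); spike compositions are hypothetical.
      natural = np.array([0.0056, 0.0986, 0.0700, 0.8258])
      spike84 = np.array([0.80, 0.10, 0.02, 0.08])
      spike86 = np.array([0.02, 0.90, 0.02, 0.06])
      measured = 0.7 * natural + 0.2 * spike84 + 0.1 * spike86   # synthetic spiked sample

      A = np.column_stack([natural, spike84, spike86])
      fractions, *_ = np.linalg.lstsq(A, measured, rcond=None)
      print(fractions)   # -> approximately [0.7, 0.2, 0.1]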

  14. Improving the precision of fMRI BOLD signal deconvolution with implications for connectivity analysis.

    PubMed

    Bush, Keith; Cisler, Josh; Bian, Jiang; Hazaroglu, Gokce; Hazaroglu, Onder; Kilts, Clint

    2015-12-01

    An important, open problem in neuroimaging analyses is developing analytical methods that ensure precise inferences about neural activity underlying fMRI BOLD signal despite the known presence of confounds. Here, we develop and test a new meta-algorithm for conducting semi-blind (i.e., no knowledge of stimulus timings) deconvolution of the BOLD signal that estimates, via bootstrapping, both the underlying neural events driving BOLD as well as the confidence of these estimates. Our approach includes two improvements over the current best performing deconvolution approach: 1) we optimize the parametric form of the deconvolution feature space; and 2) we pre-classify neural event estimates into two subgroups, either known or unknown, based on the confidence of the estimates prior to conducting neural event classification. This knows-what-it-knows approach significantly improves neural event classification over the current best performing algorithm, as tested in a detailed computer simulation of highly confounded fMRI BOLD signal. We then implemented a massively parallelized version of the bootstrapping-based deconvolution algorithm and executed it on a high-performance computer to conduct large scale (i.e., voxelwise) estimation of the neural events for a group of 17 human subjects. We show that, by restricting the computation of inter-regional correlation to include only those neural events estimated with high confidence, the method appeared to have higher sensitivity for identifying the default mode network across subjects compared to a standard BOLD signal correlation analysis. PMID:26226647

  15. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, the change in AE signal characteristics can be better interpreted as the result of the change in only the states of the process. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of AE signals generated during the stretching became more distinctive and could be more effectively used as tools for process monitoring.
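
    As a minimal sketch of the frequency-domain step described above (assuming the instrumentation transfer function has already been identified in a separate step; the impulse response h below is a placeholder), the measured spectrum is divided by the system response, with a small floor to keep the division stable.

      import numpy as np

      def deconvolve_spectrum(measured, h, eps=1e-3):
          # Recover the source signal x from y = h * x (convolution) by
          # multiplying the measured spectrum by the inverse system response.
          Y = np.fft.rfft(measured)
          H = np.fft.rfft(h, n=len(measured))
          H_safe = np.where(np.abs(H) < eps, eps, H)   # avoid dividing by ~0
          return np.fft.irfft(Y / H_safe, n=len(measured))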

  16. Data Dependent Peak Model Based Spectrum Deconvolution for Analysis of High Resolution LC-MS Data

    PubMed Central

    2015-01-01

    A data dependent peak model (DDPM) based spectrum deconvolution method was developed for analysis of high resolution LC-MS data. To construct the selected ion chromatogram (XIC), a clustering method, the density based spatial clustering of applications with noise (DBSCAN), is applied to all m/z values of an LC-MS data set to group the m/z values into each XIC. The DBSCAN constructs XICs without the need for a user defined m/z variation window. After the XIC construction, the peaks of molecular ions in each XIC are detected using both the first and the second derivative tests, followed by an optimized chromatographic peak model selection method for peak deconvolution. A total of six chromatographic peak models are considered, including Gaussian, log-normal, Poisson, gamma, exponentially modified Gaussian, and hybrid of exponential and Gaussian models. The abundant nonoverlapping peaks are chosen to find the optimal peak models that are both data- and retention-time-dependent. Analysis of 18 spiked-in LC-MS data sets demonstrates that the proposed DDPM spectrum deconvolution method outperforms the traditional method. On average, the DDPM approach not only detected 58 more chromatographic peaks from each of the testing LC-MS data sets but also improved the retention time and peak area by 3% and 6%, respectively. PMID:24533635

  17. Data dependent peak model based spectrum deconvolution for analysis of high resolution LC-MS data.

    PubMed

    Wei, Xiaoli; Shi, Xue; Kim, Seongho; Patrick, Jeffrey S; Binkley, Joe; Kong, Maiying; McClain, Craig; Zhang, Xiang

    2014-02-18

    A data dependent peak model (DDPM) based spectrum deconvolution method was developed for analysis of high resolution LC-MS data. To construct the selected ion chromatogram (XIC), a clustering method, the density based spatial clustering of applications with noise (DBSCAN), is applied to all m/z values of an LC-MS data set to group the m/z values into each XIC. The DBSCAN constructs XICs without the need for a user defined m/z variation window. After the XIC construction, the peaks of molecular ions in each XIC are detected using both the first and the second derivative tests, followed by an optimized chromatographic peak model selection method for peak deconvolution. A total of six chromatographic peak models are considered, including Gaussian, log-normal, Poisson, gamma, exponentially modified Gaussian, and hybrid of exponential and Gaussian models. The abundant nonoverlapping peaks are chosen to find the optimal peak models that are both data- and retention-time-dependent. Analysis of 18 spiked-in LC-MS data sets demonstrates that the proposed DDPM spectrum deconvolution method outperforms the traditional method. On average, the DDPM approach not only detected 58 more chromatographic peaks from each of the testing LC-MS data sets but also improved the retention time and peak area by 3% and 6%, respectively. PMID:24533635
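
    A toy illustration of the XIC-construction step described in the two records above: clustering the m/z values with DBSCAN so that each cluster becomes one extracted ion chromatogram, with no user-defined m/z tolerance window. The m/z values, eps, and min_samples below are illustrative assumptions, not the authors' settings.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # Hypothetical m/z values pooled from all scans of an LC-MS run.
      mz_values = np.array([150.081, 150.082, 150.083, 220.118, 220.119, 301.141])
      labels = DBSCAN(eps=0.005, min_samples=2).fit_predict(mz_values.reshape(-1, 1))
      for cluster_id in sorted(set(labels) - {-1}):   # label -1 marks noise points
          print(cluster_id, mz_values[labels == cluster_id])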

  18. Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function

    SciTech Connect

    Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.

    1987-06-01

    A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog (99mTc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided for by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing are able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.
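
    A toy illustration of the deconvolution step described above, under stated assumptions: recover an impulse-response (retention) function from an input curve and a measured organ time-activity curve, then read the mean transit time off as the first moment of the response. The curves below are synthetic placeholders, not scintigraphic data.

      import numpy as np
      from scipy.linalg import toeplitz

      dt = 0.5                                   # minutes per frame
      t = np.arange(0, 80, dt)
      inp = np.exp(-t / 3.0)                     # hypothetical input (blood) curve
      h_true = np.exp(-t / 10.0)                 # hypothetical hepatic impulse response
      liver = np.convolve(inp, h_true)[:t.size] * dt   # simulated liver time-activity curve

      # Discrete convolution matrix (lower triangular) and least-squares deconvolution.
      A = dt * np.tril(toeplitz(inp))
      h_est, *_ = np.linalg.lstsq(A, liver, rcond=None)
      mtt = np.sum(t * h_est) / np.sum(h_est)    # mean transit time, in minutes
      print(round(mtt, 1))                       # close to the 10 minutes built into h_true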

  19. A further analysis for the minimum-variance deconvolution filter performance

    NASA Technical Reports Server (NTRS)

    Chi, Chong-Yung

    1987-01-01

    Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.

  20. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    NASA Astrophysics Data System (ADS)

    Madavarapu, Sateesh Babu

    Studies on aluminosilicate materials to replace traditional construction materials such as ordinary Portland cement (OPC), in order to reduce the environmental effects of cement production, have been an important research area for the past decades. Many properties like strength have already been studied and the primary focus is to learn about the reaction mechanism and the effect of the parameters on the formed products. The aim of this research was to explore the structural changes and reaction product analysis of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at a molecular level but not all methods are economic and simple. To understand the mechanisms of alkali activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used, and the effect of the parameters on the reaction products has been analyzed. To analyze complex systems like geopolymers using FTIR, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time and temperature dependent analyses were done on slag pastes to understand the polymerization of reactive silica in the system with time and temperature variance. For the time dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature dependent analysis was done by curing the samples at 60°C and 80°C. Similarly, fly ash has been studied by activating with alkali hydroxides and alkali silicates. Under the same curing conditions, the fly ash samples were evaluated to analyze the effects of added silicates on alkali activation. The peak shifts in the FTIR spectra explain the changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentration of silicate monomer in the activating solution and the position of the main Si-O-T (where T is Al/Si) stretching band in the FTIR spectrum, which gives an indication of the relative changes in the Si/Al ratio. Also, the effect of the cation and silicate concentration in the activating solution has been discussed using the Fourier self-deconvolution technique.

  1. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  2. Multichannel deconvolution with long-range dependence: A minimax study

    E-print Network

    Pensky, Marianna

    Rida Benhaddou, Rafal ..., December 2013. Keywords: adaptivity; Besov spaces; block thresholding; deconvolution; Fourier analysis; functional data; long-range dependence; Meyer wavelets; minimax estimators; multichannel deconvolution; partial ...

  3. Investigation of the CLEAN deconvolution method for use with Late Time Response analysis of multiple objects

    NASA Astrophysics Data System (ADS)

    Hutchinson, Simon; Taylor, Christopher T.; Fernando, Michael; Andrews, David; Bowring, Nicholas

    2014-10-01

    This paper investigates the application of the CLEAN non-linear deconvolution method to Late Time Response (LTR) analysis for detecting multiple objects in Concealed Threat Detection (CTD). When an Ultra-Wide Band (UWB) frequency radar signal is used to illuminate a conductive target, surface currents are induced upon the object which in turn give rise to LTR signals. These signals are re-radiated from the target, and results from a number of targets are presented. The experiment was performed using a double-ridged horn antenna in a pseudo-monostatic arrangement. A vector network analyser (VNA) has been used to provide the UWB Frequency Modulated Continuous Wave (FMCW) radar signal. The distance between the transmitting antenna and the target objects has been kept at 1 metre for all the experiments performed, and the power level at the VNA was set to 0 dBm. The targets in the experimental setup are suspended in air in a laboratory environment. Matlab has been used in post-processing to perform linear and non-linear deconvolution of the signal. The Wiener filter, Fast Fourier Transform (FFT) and Continuous Wavelet Transform (CWT) are used to process the return signals and extract the LTR features from the noise clutter. A Generalized Pencil-of-Function (GPOF) method was then used to extract the complex poles of the signal. Artificial Neural Networks (ANN) and Linear Discriminant Analysis (LDA) have been used to classify the data.
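
    As a rough sketch of the CLEAN idea referred to above (a generic 1-D toy, not the authors' implementation): repeatedly locate the strongest peak in the residual, subtract a scaled, shifted copy of the known point-spread response, and accumulate the peak into the model. The gain and iteration count are illustrative choices.

      import numpy as np

      def clean_1d(dirty, psf, gain=0.1, n_iter=200):
          # psf is assumed to have odd length and to be centred on its middle sample.
          residual = dirty.astype(float).copy()
          model = np.zeros_like(residual)
          half = len(psf) // 2
          for _ in range(n_iter):
              k = int(np.argmax(np.abs(residual)))           # strongest remaining peak
              amp = gain * residual[k]
              model[k] += amp
              lo, hi = max(0, k - half), min(len(residual), k + half + 1)
              residual[lo:hi] -= amp * psf[half - (k - lo): half + (hi - k)]
          return model, residual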

  4. Demand Response Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2012-03-01

    Demand Response Analysis Tool is a software tool developed at the Lawrence Berkeley National Laboratory. It was initially funded by Southern California Edison. Our goal in developing this tool is to provide an online, usable analysis tool with standardized methods to evaluate the demand and demand response performance of commercial and industrial facilities. The tool provides load variability and weather sensitivity analysis capabilities as well as development of various types of baselines. It can be used by researchers, real estate management firms, utilities, or any individuals who are interested in analyzing their demand and demand response capabilities.

  5. XAP, a program for deconvolution and analysis of complex X-ray spectra

    USGS Publications Warehouse

    Quick, James E.; Haleby, Abdul Malik

    1989-01-01

    The X-ray analysis program (XAP) is a spectral-deconvolution program written in BASIC and specifically designed to analyze complex spectra produced by energy-dispersive X-ray analytical systems (EDS). XAP compensates for spectrometer drift, utilizes digital filtering to remove background from spectra, and solves for element abundances by least-squares, multiple-regression analysis. Rather than basing analyses on only a few channels, XAP reconstructs broad spectral regions of a sample from standard reference spectra. The effects of this approach are (1) elimination of tedious spectrometer adjustments, (2) removal of background independent of sample composition, and (3) automatic correction for peak overlaps. Although the program was written specifically to operate a KEVEX 7000 X-ray fluorescence analytical system, it could be adapted (with minor modifications) to analyze spectra produced by scanning electron microscopes and electron microprobes, as well as X-ray diffractometer patterns obtained from whole-rock powders.
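
    The regression step described above can be sketched generically (with synthetic spectra, not KEVEX data or the BASIC XAP code): after background removal, the sample spectrum is modelled as a linear combination of standard reference spectra, and the element abundances are the least-squares coefficients.

      import numpy as np

      channels = np.arange(512)
      std_fe = np.exp(-0.5 * ((channels - 200) / 6.0) ** 2)   # hypothetical Fe standard
      std_ca = np.exp(-0.5 * ((channels - 120) / 6.0) ** 2)   # hypothetical Ca standard
      sample = 3.0 * std_fe + 1.5 * std_ca                    # background-subtracted sample

      A = np.column_stack([std_fe, std_ca])
      abundances, *_ = np.linalg.lstsq(A, sample, rcond=None)
      print(abundances)   # -> approximately [3.0, 1.5]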

  6. A novel fluorescence imaging technique combining deconvolution microscopy and spectral analysis for quantitative detection of opportunistic pathogens

    SciTech Connect

    Le Puil, Michael; Biggerstaff, John P.; Weidow, B.; Price, Jeffery R; Naser, S.; White, D.C.; Alberte, R.

    2006-01-01

    A novel fluorescence imaging technique based on deconvolution microscopy and spectral analysis is presented here as an alternative to confocal laser scanning microscopy. It allowed rapid, specific and simultaneous identification of five major opportunistic pathogens, relevant for public health, in suspension and provided quantitative results.

  7. MORESANE: MOdel REconstruction by Synthesis-ANalysis Estimators. A sparse deconvolution algorithm for radio interferometric imaging

    NASA Astrophysics Data System (ADS)

    Dabbech, A.; Ferrari, C.; Mary, D.; Slezak, E.; Smirnov, O.; Kenyon, J. S.

    2015-04-01

    Context. Recent years have seen huge developments of radio telescopes and a tremendous increase in their capabilities (sensitivity, angular and spectral resolution, field of view, etc.). Such systems make designing more sophisticated techniques mandatory not only for transporting, storing, and processing this new generation of radio interferometric data, but also for restoring the astrophysical information contained in such data. Aims. In this paper we present a new radio deconvolution algorithm named MORESANE and its application to fully realistic simulated data of MeerKAT, one of the SKA precursors. This method has been designed for the difficult case of restoring diffuse astronomical sources that are faint in brightness, complex in morphology, and possibly buried in the dirty beam's side lobes of bright radio sources in the field. Methods. MORESANE is a greedy algorithm that combines complementary types of sparse recovery methods in order to reconstruct the most appropriate sky model from observed radio visibilities. A synthesis approach is used for reconstructing images, in which the synthesis atoms representing the unknown sources are learned using analysis priors. We applied this new deconvolution method to fully realistic simulations of the radio observations of a galaxy cluster and of an HII region in M 31. Results. We show that MORESANE is able to efficiently reconstruct images composed of a wide variety of sources (compact point-like objects, extended tailed radio galaxies, low-surface brightness emission) from radio interferometric data. Comparisons with the state of the art algorithms indicate that MORESANE provides competitive results in terms of both the total flux/surface brightness conservation and fidelity of the reconstructed model. MORESANE seems particularly well suited to recovering diffuse and extended sources, as well as bright and compact radio sources known to be hosted in galaxy clusters.

  8. Three-dimensional analysis tool for segmenting and measuring the structure of telomeres in mammalian nuclei

    E-print Network

    van Vliet, Lucas J.

    ... A tool for segmenting and analyzing FISH-stained telomeres in interphase nuclei. After deconvolution of the images, we segment the individual telomeres and measure a distribution parameter we call T. This parameter describes ...

  9. Physics analysis tools

    SciTech Connect

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low level tools such as a programming language to high level such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For purposes of this paper, the analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, thus it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

  10. Configuration Analysis Tool

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1983-01-01

    Configuration Analysis Tool (CAT) is an information storage and report generation system that aids configuration management activities. Configuration management is a discipline composed of many techniques selected to track and direct the evolution of complex systems. CAT is an interactive program that accepts, organizes, and stores information pertinent to specific phases of a project.

  11. PCard Data Analysis Tool

    SciTech Connect

    2005-04-01

    The Procurement Card data analysis and monitoring tool enables due-diligence review using predefined and user-created queries and reports. The system tracks individual compliance emails. More specifically, the tool: - Helps identify exceptions or questionable and non-compliant purchases, - Creates random audit samples on request, - Allows users to create and run new or ad-hoc queries and reports, - Monitors disputed charges, - Creates predefined emails to cardholders requesting documentation and/or clarification, - Tracks audit status, notes, email status (date sent, response), and audit resolution.

  12. Graphical Contingency Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-03-02

    GCA is a visual analytic tool for power grid contingency analysis to provide more decision support for power grid operations. GCA allows power grid operators to quickly gain situational awareness of the power grid by converting large amounts of operational data to the graphic domain with a color-contoured map; identify system trends and foresee and discern emergencies by performing trending analysis; identify the relationships between system configurations and affected assets by conducting clustering analysis; and identify the best action by interactively evaluating candidate actions.

  13. Analysis of a deconvolution-based information retrieval algorithm in X-ray grating-based phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Horn, Florian; Bayer, Florian; Pelzer, Georg; Rieger, Jens; Ritter, André; Weber, Thomas; Zang, Andrea; Michel, Thilo; Anton, Gisela

    2014-03-01

    Grating-based X-ray phase-contrast imaging is a promising imaging modality to increase soft tissue contrast in comparison to conventional attenuation-based radiography. Complementary and otherwise inaccessible information is provided by the dark-field image, which shows the sub-pixel size granularity of the measured object. This could especially turn out to be useful in mammography, where tumourous tissue is connected with the presence of supertiny microcalcifications. In addition to the well-established image reconstruction process, an analysis method was introduced by Modregger,1 which is based on deconvolution of the underlying scattering distribution within a single pixel, revealing information about the sample. Subsequently, the different contrast modalities can be calculated from the scattering distribution. The method has already proved to deliver additional information in the higher moments of the scattering distribution and possibly achieves better image quality in terms of an increased contrast-to-noise ratio. Several measurements were carried out using melamine foams as phantoms. We analysed the dependency of the deconvolution-based method, with respect to the dark-field image, on different parameters such as dose, number of iterations of the iterative deconvolution algorithm and dark-field signal. A disagreement was found in the reconstructed dark-field values between the FFT method and the iterative method. Usage of the resulting characteristics might be helpful in future applications.

  14. Data enhancement and analysis through mathematical deconvolution of signals from scientific measuring instruments

    NASA Technical Reports Server (NTRS)

    Wood, G. M.; Rayborn, G. H.; Ioup, J. W.; Ioup, G. E.; Upchurch, B. T.; Howard, S. J.

    1981-01-01

    Mathematical deconvolution of digitized analog signals from scientific measuring instruments is shown to be a means of extracting important information which is otherwise hidden due to time-constant and other broadening or distortion effects caused by the experiment. Three different approaches to deconvolution and their subsequent application to recorded data from three analytical instruments are considered. To demonstrate the efficacy of deconvolution, the use of these approaches to solve the convolution integral for the gas chromatograph, magnetic mass spectrometer, and the time-of-flight mass spectrometer are described. Other possible applications of these types of numerical treatment of data to yield superior results from analog signals of the physical parameters normally measured in aerospace simulation facilities are suggested and briefly discussed.
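
    For reference, the convolution integral that these approaches solve has the standard form shown below (in LaTeX notation), where s is the true signal, h is the instrument response, n is additive noise, and m is the recorded signal; deconvolution estimates s given m and a known or measured h.

      m(t) = \int_{-\infty}^{\infty} s(\tau)\, h(t - \tau)\, d\tau + n(t)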

  15. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: - HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files. - Java RGA extractor: can load multiple SRS .ana files and extract pressure vs. time data. - C++ Contamination Simulation code: a 3D particle tracing code for modeling transport of dust particulates and molecules; uses residence time to determine if molecules stick, and particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  16. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  17. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedural control and data flow. It has been expanded commercially to inter-procedural flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.

  18. High-resolution imaging by multiple-image deconvolution

    E-print Network

    Boccacci, Patrizia

    M. Bertero, P. Boccacci, G. Desiderà, and G. ... Deconvolution is a powerful tool for improving the quality of images corrupted by blurring and noise. However, ... on the direction in the imaging plane or volume. Such a distortion cannot be corrected by image deconvolution. ...

  19. Frequency Response Analysis Tool

    SciTech Connect

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  20. Geodetic Strain Analysis Tool

    NASA Technical Reports Server (NTRS)

    Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan

    2011-01-01

    A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of strain fields from the GPS and strain meter data that sample them, to aid understanding of the strengths and weaknesses of strain field derivation from continuous GPS (CGPS) and other geodetic data in a variety of tectonic settings, to converge on the "best practices" strain derivation strategy for the Solid Earth Science ESDR System (SESES) project given the CGPS station distribution in the western U.S., and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
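
    As an illustrative numerical sketch of the four data products named above (placeholder numbers, not SESES output), the 2D strain products can be computed from a velocity-gradient tensor that is assumed to have already been estimated from the geodetic velocities.

      import numpy as np

      G = np.array([[1.0e-7, 4.0e-8],     # du/dx, du/dy  (1/yr, hypothetical)
                    [2.0e-8, -6.0e-8]])   # dv/dx, dv/dy
      E = 0.5 * (G + G.T)                 # symmetric 2D strain-rate tensor
      dilatation = np.trace(E)
      e_min, e_max = np.linalg.eigvalsh(E)             # principal components
      max_shear = 0.5 * (e_max - e_min)
      shear_angle = 0.5 * np.degrees(np.arctan2(2.0 * E[0, 1], E[0, 0] - E[1, 1]))
      print(dilatation, max_shear, shear_angle, (e_min, e_max))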

  1. UDECON: deconvolution optimization software for restoring high-resolution records from pass-through paleomagnetic measurements

    NASA Astrophysics Data System (ADS)

    Xuan, Chuang; Oda, Hirokuni

    2015-12-01

    The rapid accumulation of continuous paleomagnetic and rock magnetic records acquired from pass-through measurements on superconducting rock magnetometers (SRM) has greatly contributed to our understanding of the paleomagnetic field and paleo-environment. Pass-through measurements are inevitably smoothed and altered by the convolution effect of the SRM sensor response, and deconvolution is needed to restore high-resolution paleomagnetic and environmental signals. Although various deconvolution algorithms have been developed, the lack of easy-to-use software has hindered the practical application of deconvolution. Here, we present the standalone graphical software UDECON as a convenient tool to perform optimized deconvolution for pass-through paleomagnetic measurements using the algorithm recently developed by Oda and Xuan (Geochem Geophys Geosyst 15:3907-3924, 2014). With the preparation of a format file, UDECON can directly read pass-through paleomagnetic measurement files collected at different laboratories. After the SRM sensor response is determined and loaded to the software, optimized deconvolution can be conducted using two different approaches (i.e., "Grid search" and "Simplex method") with adjustable initial values or ranges for smoothness, corrections of sample length, and shifts in measurement position. UDECON provides a suite of tools to conveniently view and check various types of original measurement and deconvolution data. Multiple steps of measurement and/or deconvolution data can be compared simultaneously to check the consistency and to guide further deconvolution optimization. Deconvolved data together with the loaded original measurement and SRM sensor response data can be saved and reloaded for further treatment in UDECON. Users can also export the optimized deconvolution data to a text file for analysis in other software.

  2. Streaming Multiframe Deconvolutions on GPUs

    NASA Astrophysics Data System (ADS)

    Lee, M. A.; Budavári, T.

    2015-09-01

    Atmospheric turbulence distorts all ground-based observations, which is especially detrimental to faint detections. The point spread function (PSF) defining this blur is unknown for each exposure and varies significantly over time, making image analysis difficult. Lucky imaging and traditional co-adding throw away lots of information. We developed blind deconvolution algorithms that can simultaneously obtain robust solutions for the background image and all the PSFs. It is done in a streaming setting, which makes it practical for large numbers of big images. We implemented a new tool that runs on GPUs and achieves exceptional running times that can scale to the new time-domain surveys. Our code can quickly and effectively recover high-resolution images exceeding the quality of traditional co-adds. We demonstrate the power of the method on the repeated exposures in the Sloan Digital Sky Survey's Stripe 82.

  3. Draper Station Analysis Tool

    NASA Technical Reports Server (NTRS)

    Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

    2011-01-01

    Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

  4. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but they do not have access to modify the criteria for analyzing projects: such access is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, and thereby further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  5. Nondestructive 3D confocal laser imaging with deconvolution of seven whole Stardust tracks with complementary XRF and quantitative analysis

    SciTech Connect

    Greenberg, M.; Ebel, D.S.

    2009-03-19

    We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 µm. It has been our goal to perform a total nondestructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-ray fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 µm/pixel, without the use of oil-based lenses. A full textural analysis on track No. 82 is presented here as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

  6. Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animation without downloading any software and data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT) and NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60°N-60°S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60°N-60°S) IR Dataset, is one of the TRMM ancillary datasets. They are globally-merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of the data beginning from February of 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of interest and time, single imagery, overlay of two different products, animation, a time skip capability and different image size outputs. Users can save an animation as a file (animated gif) and import it into other presentation software, such as Microsoft PowerPoint. Since the tool can directly access the real data, more features and functionality can be added in the future.

  7. SPLAT: Spectral Analysis Tool

    NASA Astrophysics Data System (ADS)

    Draper, Peter W.

    2014-02-01

    SPLAT is a graphical tool for displaying, comparing, modifying and analyzing astronomical spectra stored in NDF, FITS and TEXT files as well as in NDX format. It can read in many spectra at the same time and then display these as line plots. Display windows can show one or several spectra at the same time and can be interactively zoomed and scrolled, centered on specific wavelengths, provide continuous coordinate readout, produce printable hardcopy and be configured in many ways. Analysis facilities include the fitting of a polynomial to selected parts of a spectrum, the fitting of Gaussian, Lorentzian and Voigt profiles to emission and absorption lines and the filtering of spectra using average, median and line-shape window functions as well as wavelet denoising. SPLAT also supports a full range of coordinate systems for spectra, which allows coordinates to be displayed and aligned in many different coordinate systems (wavelength, frequency, energy, velocity) and transformed between these and different standards of rest (topocentric, heliocentric, dynamic and kinematic local standards of rest, etc). SPLAT is distributed as part of the Starlink (ascl:1110.012) software collection.
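
    The record above mentions fitting Gaussian profiles to emission and absorption lines; a minimal, generic example of such a line fit (not SPLAT itself, and using synthetic data) with scipy.optimize.curve_fit is sketched below.

      import numpy as np
      from scipy.optimize import curve_fit

      def gaussian(x, amp, center, sigma, offset):
          return amp * np.exp(-0.5 * ((x - center) / sigma) ** 2) + offset

      wave = np.linspace(6540.0, 6590.0, 200)            # synthetic wavelength grid
      flux = gaussian(wave, 5.0, 6563.0, 2.5, 1.0)       # synthetic emission line
      flux += np.random.normal(0.0, 0.05, wave.size)     # add noise
      popt, pcov = curve_fit(gaussian, wave, flux, p0=[4.0, 6560.0, 3.0, 1.0])
      print(popt)   # fitted amplitude, centre, width and continuum level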

  8. Java Radar Analysis Tool

    NASA Technical Reports Server (NTRS)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  9. Automatic detection of UXO magnetic anomalies using extended Euler deconvolution

    E-print Network

    Automatic detection of UXO magnetic anomalies using extended Euler deconvolution. Kristofer Davis et al. … whereas the gradient data have an SI of four. … the developed extended Euler deconvolution method, then the methodology of how we use it, and finally amplitude analysis.

  10. Spline-based deconvolution Amir Averbuch, Valery Zheludev

    E-print Network

    Averbuch, Amir

    Spline-based deconvolution. Amir Averbuch, Valery Zheludev (School of Computer Science, Tel Aviv). Keywords: deconvolution, 2D data, noised data, harmonic analysis, approximate solutions. Abstract: This paper proposes robust algorithms to perform deconvolution and inversion of the heat equation starting from 1D and 2D …

  11. A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis

    PubMed Central

    Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan

    2009-01-01

    DNA methylation is an indispensable epigenetic modification of mammalian genomes. Consequently, there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301

  12. One approach for doublet deconvolution to improve reliability in spectra analysis for in vivo lead measurement.

    PubMed

    Kondrashov, V S; Rothenberg, S J

    2001-04-01

    Calculation of lead concentration from K-series X-ray fluorescent studies uses a robust normalization technique based on the amplitude or area of the elastic signal. Parameter estimation of the elastic signal can be affected by the overlap of the Kbeta2 line, especially for concentrations greater than 40 ppm where the Kbeta2 amplitude can be greater than 1% of the elastic signal. We tested the combination of estimation by method of least moduli and doublet deconvolution. We found that the estimation of the area of the elastic signal is more robust to changes in the low-energy end of the region of interest with the combined method than with method of least-squares estimation and singlet processing. We recommend use of the combined method for creation of calibration curves at concentrations greater than or equal to 40 ppm. PMID:11225706
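
    To make the "method of least moduli" concrete, the sketch below contrasts an L1 (least-moduli) fit with an ordinary least-squares fit of a two-Gaussian doublet; the synthetic spectrum and peak parameters are illustrative only and are not taken from the paper.

    ```python
    # Sketch: fitting an overlapping two-peak doublet by least squares vs. least
    # moduli (L1). All numbers are hypothetical and only illustrate the principle.
    import numpy as np
    from scipy.optimize import minimize

    def doublet(x, p):
        """Two overlapping Gaussian peaks."""
        a1, c1, s1, a2, c2, s2 = p
        return (a1 * np.exp(-0.5 * ((x - c1) / s1) ** 2)
                + a2 * np.exp(-0.5 * ((x - c2) / s2) ** 2))

    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 100.0, 400)
    true_p = [10.0, 45.0, 4.0, 3.0, 55.0, 4.0]       # strong peak overlapped by a weak one
    y = doublet(x, true_p) + rng.normal(scale=0.5, size=x.size)

    p0 = np.array([8.0, 44.0, 5.0, 2.0, 56.0, 5.0])
    opts = {"maxiter": 5000, "maxfev": 10000}
    ls = minimize(lambda p: np.sum((y - doublet(x, p)) ** 2), p0, method="Nelder-Mead", options=opts)
    lm = minimize(lambda p: np.sum(np.abs(y - doublet(x, p))), p0, method="Nelder-Mead", options=opts)
    print("least squares:", np.round(ls.x, 2))
    print("least moduli :", np.round(lm.x, 2))
    ```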

  13. MORESANE: MOdel REconstruction by Synthesis-ANalysis Estimators. A sparse deconvolution algorithm for radio interferometric imaging

    E-print Network

    Dabbech, Arwa; Mary, David; Slezak, Eric; Smirnov, Oleg; Kenyon, Jonathan S

    2014-01-01

    (arXiv abridged abstract) Recent years have seen huge developments in radio telescopes and a tremendous increase in their capabilities. Such systems make it mandatory to design more sophisticated techniques not only for transporting, storing and processing this new generation of radio interferometric data, but also for restoring the astrophysical information contained in such data. In this paper we present a new radio deconvolution algorithm named MORESANE and its application to fully realistic simulated data of MeerKAT, one of the SKA precursors. This method has been designed for the difficult case of restoring diffuse astronomical sources which are faint in brightness, complex in morphology and possibly buried in the dirty beam's side lobes of bright radio sources in the field. MORESANE is a greedy algorithm which combines complementary types of sparse recovery methods in order to reconstruct the most appropriate sky model from observed radio visibilities. A synthesis approach is used for the reconst...

  14. Analysis/Design Tool

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Excelerator II, developed by INTERSOLV, Inc., provides a complete environment for rules-based expert systems. The software incorporates NASA's C Language Integrated Production System (CLIPS), a shell for constructing expert systems. Excelerator II provides complex verification and transformation routines based on matching that is simple and inexpensive. *Excelerator II was sold to SELECT Software Tools in June 1997 and is now called SELECT Excelerator. SELECT has assumed full support and maintenance for the product line.

  15. Nonstandard Tools for Nonsmooth Analysis

    E-print Network

    S. S. Kutateladze

    2012-06-11

    This is an overview of the basic tools of nonsmooth analysis which are grounded on nonstandard models of set theory. By way of illustration we give a criterion for an infinitesimally optimal path of a general discrete dynamic system.

  16. Atlas Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    de La Hoz, Santiago Gonzalez; Ruiz, Luis March; Liko, Dietrich

    2008-06-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis into a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval.

  17. Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis

    PubMed Central

    Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M.

    2015-01-01

    The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue, and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell-specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population-specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprises mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908
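
    The core idea of reference-based expression deconvolution, of which PSEA is one implementation, can be sketched as a linear regression of each gene's mixed-tissue expression on cell-type reference signals; the example below is a generic illustration with made-up numbers, not PSEA itself.

    ```python
    # Generic sketch of reference-based expression deconvolution (not PSEA code):
    # model mixed-tissue expression of a gene as a weighted sum of cell-type
    # reference signals and estimate the weights by least squares.
    import numpy as np

    rng = np.random.default_rng(2)
    n_samples, n_cell_types = 40, 4   # e.g. neurons, astrocytes, oligodendrocytes, microglia

    # Reference signals: per-sample expression of cell-type marker sets (hypothetical).
    references = rng.uniform(0.5, 2.0, size=(n_samples, n_cell_types))

    # Simulate one gene expressed mostly in cell type 0, with noise.
    true_weights = np.array([3.0, 0.2, 0.1, 0.05])
    gene_expr = references @ true_weights + rng.normal(scale=0.1, size=n_samples)

    # Estimate cell-type-specific contributions by ordinary least squares.
    design = np.column_stack([np.ones(n_samples), references])   # intercept + references
    coef, *_ = np.linalg.lstsq(design, gene_expr, rcond=None)
    print("intercept:", round(coef[0], 3))
    print("cell-type coefficients:", np.round(coef[1:], 3))
    ```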

  18. VCAT: Visual Crosswalk Analysis Tool

    SciTech Connect

    Cleland, Timothy J.; Forslund, David W.; Cleland, Catherine A.

    2012-08-31

    VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

  19. Tiling Microarray Analysis Tools

    Energy Science and Technology Software Center (ESTSC)

    2005-05-04

    TiMAT is a package of 23 command line Java applications for use in the analysis of Affymetrix tiled genomic microarray data. TiMAT enables: 1) rebuilding the genome annotation for entire tiled arrays (repeat filtering, chromosomal coordinate assignment); 2) post-processing of oligo intensity values (quantile normalization, median scaling, PMMM transformation); 3) significance testing (Wilcoxon rank sum and signed rank tests, intensity difference and ratio tests) and interval refinement (filtering based on multiple statistics, overlap comparisons); 4) data visualization (detailed thumbnail/zoomed view with Interval Plots and data export to Affymetrix's Integrated Genome Browser) and data reports (spreadsheet summaries and detailed profiles).
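
    One of the post-processing steps listed above, quantile normalization, is small enough to sketch directly; the matrix below is a toy intensity table, not TiMAT data, and TiMAT itself is a Java package rather than Python.

    ```python
    # Sketch of quantile normalization, one of the oligo-intensity post-processing
    # steps TiMAT lists (illustration only; not TiMAT code).
    import numpy as np

    def quantile_normalize(intensities):
        """Rows = probes, columns = arrays. Returns the quantile-normalized matrix."""
        ranks = np.argsort(np.argsort(intensities, axis=0), axis=0)   # per-column ranks
        sorted_cols = np.sort(intensities, axis=0)
        mean_quantiles = sorted_cols.mean(axis=1)                     # reference distribution
        return mean_quantiles[ranks]

    data = np.array([[5.0, 4.0, 3.0],
                     [2.0, 1.0, 4.0],
                     [3.0, 4.0, 6.0],
                     [4.0, 2.0, 8.0]])
    print(quantile_normalize(data))
    ```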

  20. Deconvolution filtering: Temporal smoothing revisited

    PubMed Central

    Bush, Keith; Cisler, Josh

    2014-01-01

    Inferences made from analysis of BOLD data regarding neural processes are potentially confounded by multiple competing sources: cardiac and respiratory signals, thermal effects, scanner drift, and motion-induced signal intensity changes. To address this problem, we propose deconvolution filtering, a process of systematically deconvolving and reconvolving the BOLD signal via the hemodynamic response function such that the resultant signal is composed of maximally likely neural and neurovascular signals. To test the validity of this approach, we compared the accuracy of BOLD signal variants (i.e., unfiltered, deconvolution filtered, band-pass filtered, and optimized band-pass filtered BOLD signals) in identifying useful properties of highly confounded, simulated BOLD data: (1) reconstructing the true, unconfounded BOLD signal, (2) correlation with the true, unconfounded BOLD signal, and (3) reconstructing the true functional connectivity of a three-node neural system. We also tested this approach by detecting task activation in BOLD data recorded from healthy adolescent girls (control) during an emotion processing task. Results for the estimation of functional connectivity of simulated BOLD data demonstrated that analysis (via standard estimation methods) using deconvolution filtered BOLD data achieved superior performance to analysis performed using unfiltered BOLD data and was statistically similar to well-tuned band-pass filtered BOLD data. Contrary to band-pass filtering, however, deconvolution filtering is built upon physiological arguments and has the potential, at low TR, to match the performance of an optimal band-pass filter. The results from task estimation on real BOLD data suggest that deconvolution filtering provides superior or equivalent detection of task activations relative to comparable analyses on unfiltered signals and also provides decreased variance over the estimate. In turn, these results suggest that standard preprocessing of the BOLD signal ignores significant sources of noise that can be effectively removed without damaging the underlying signal. PMID:24768215
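
    A minimal frequency-domain sketch of the deconvolve-then-reconvolve idea is given below; the HRF shape, TR, and regularization constant are assumptions for illustration and do not reproduce the authors' implementation.

    ```python
    # Sketch of deconvolution filtering: deconvolve a BOLD-like signal by a
    # hemodynamic response function (HRF) with Tikhonov regularization, then
    # reconvolve. HRF shape, TR and the regularization weight are assumptions.
    import numpy as np

    TR = 2.0                                   # seconds (assumed)
    t = np.arange(0, 30, TR)
    hrf = (t / 6.0) ** 2 * np.exp(-t / 3.0)    # crude gamma-like HRF, not the canonical SPM HRF
    hrf /= hrf.sum()

    rng = np.random.default_rng(3)
    n = 256
    neural = (rng.random(n) < 0.05).astype(float)          # sparse "neural" events
    bold = np.convolve(neural, hrf)[:n] + rng.normal(scale=0.05, size=n)

    # Regularized deconvolution in the frequency domain.
    H = np.fft.rfft(hrf, n)
    B = np.fft.rfft(bold)
    lam = 0.01 * np.max(np.abs(H)) ** 2                    # Tikhonov weight (assumed)
    neural_est = np.fft.irfft(np.conj(H) * B / (np.abs(H) ** 2 + lam), n)

    # Reconvolve the estimate to obtain the filtered BOLD signal.
    bold_filtered = np.convolve(neural_est, hrf)[:n]
    print("correlation with original BOLD:", np.corrcoef(bold, bold_filtered)[0, 1].round(3))
    ```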

  1. Sandia PUF Analysis Tool

    SciTech Connect

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
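
    The inter-chip Hamming distance metric mentioned above can be sketched in a few lines; the random bit signatures below are stand-ins for real PUF measurements.

    ```python
    # Sketch of the inter-chip Hamming distance metric used to characterize PUFs.
    # The bit signatures here are random stand-ins for real measurements.
    import numpy as np

    rng = np.random.default_rng(4)
    n_chips, n_bits = 8, 128
    signatures = rng.integers(0, 2, size=(n_chips, n_bits))

    distances = []
    for i in range(n_chips):
        for j in range(i + 1, n_chips):
            distances.append(np.count_nonzero(signatures[i] != signatures[j]))

    distances = np.array(distances)
    print("mean inter-chip Hamming distance: %.1f bits (ideal ~%d)"
          % (distances.mean(), n_bits // 2))
    ```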

  2. Sandia PUF Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2014-06-11

    This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.

  3. Constrained spherical deconvolution analysis of the limbic network in human, with emphasis on a direct cerebello-limbic pathway

    PubMed Central

    Arrigo, Alessandro; Mormina, Enricomaria; Anastasi, Giuseppe Pio; Gaeta, Michele; Calamuneri, Alessandro; Quartarone, Angelo; De Salvo, Simona; Bruschetta, Daniele; Rizzo, Giuseppina; Trimarchi, Fabio; Milardi, Demetrio

    2014-01-01

    The limbic system is part of an intricate network which is involved in several functions like memory and emotion. Traditionally, the role of the cerebellum was considered mainly associated with motor control; however, several lines of evidence are emerging for a role of the cerebellum in learning skills, emotion control, and mnemonic and behavioral processes, also involving connections with the limbic system. In 15 normal subjects we studied limbic connections by probabilistic Constrained Spherical Deconvolution (CSD) tractography. The main result of our work was to prove for the first time in the human brain the existence of a direct cerebello-limbic pathway, which was previously hypothesized but never demonstrated. We also extended our analysis to the other limbic connections including the cingulate fasciculus, inferior longitudinal fasciculus, uncinate fasciculus, anterior thalamic connections and fornix. Although these pathways have already been described in the tractographic literature, we provided reconstruction, quantitative analysis and Fractional Anisotropy (FA) right-left symmetry comparison using probabilistic CSD tractography, which is known to provide a potential improvement compared to previously used Diffusion Tensor Imaging (DTI) techniques. The demonstration of the existence of the cerebello-limbic pathway could constitute an important step in the knowledge of the anatomic substrate of non-motor cerebellar functions. Finally, the CSD statistical data about limbic connections in healthy subjects could be potentially useful in the diagnosis of pathological disorders damaging this system. PMID:25538606

  4. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  5. Industrial Geospatial Analysis Tool for Energy Evaluation 

    E-print Network

    Alkadi, N.; Starke, M.; Ma, O.; Nimbalkar, S.; Cox, D.; Dowling, K.; Johnson, B.; Khan, S.

    2013-01-01

    IGATE-E is an industrial energy analysis tool. The tool is intended to be a decision support and planning tool for a wide spectrum of energy analysts, engineers, researchers, government organizations, private consultants, industry partners, and the like...

  6. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
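
    As an illustration of the kind of margin-of-safety bookkeeping the abstract describes, the sketch below computes a simple tensile margin for one fastener; the preload model, factor of safety, and bolt properties are generic textbook assumptions, not comBAT's equations.

    ```python
    # Generic sketch of a bolted-joint tensile margin-of-safety check (not comBAT's
    # actual equations). Bolt properties, preload model and loads are assumptions.
    import math

    d = 0.25 * 0.0254                  # bolt diameter: 0.25 in expressed in metres
    tensile_area = math.pi / 4 * (0.9 * d) ** 2   # rough tensile stress area assumption
    ultimate_strength = 1100e6         # Pa, high-strength fastener (assumed)
    torque = 20.0                      # N*m applied assembly torque (assumed)
    nut_factor = 0.2                   # typical "K" nut factor assumption
    external_load = 4000.0             # N applied tensile load per bolt (assumed)
    load_factor = 1.4                  # ultimate factor of safety (assumed)

    preload = torque / (nut_factor * d)                  # classic T = K*d*F relation
    bolt_load = preload + load_factor * external_load    # conservative: full load to bolt
    allowable = ultimate_strength * tensile_area
    margin_of_safety = allowable / bolt_load - 1.0
    print("preload = %.0f N, bolt load = %.0f N, MS = %.2f"
          % (preload, bolt_load, margin_of_safety))
    ```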

  7. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets, both observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom-in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

  8. Shot Planning and Analysis Tools

    SciTech Connect

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

    2011-07-25

    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  9. Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols

    EPA Science Inventory

    In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...

  10. Constellation Reconfiguration: Tools and Analysis 

    E-print Network

    Davis, Jeremy John

    2011-10-21

    CONSTELLATION RECONFIGURATION: TOOLS AND ANALYSIS. A Dissertation by JEREMY JOHN DAVIS, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of DOCTOR... Major subject: Aerospace Engineering.

  11. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.
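
    A toy version of the "set of database tables populated with flow data from multiple sources" could look like the sketch below; the table name, columns, and threshold are hypothetical, chosen only to illustrate the idea.

    ```python
    # Toy sketch of a flow-record table fed from multiple collectors, in the spirit
    # of the database-backed design described above. Table and column names are
    # hypothetical, not taken from the paper.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE flow_records (
            source      TEXT,      -- collector that reported the flow (router, host agent, ...)
            src_ip      TEXT,
            dst_ip      TEXT,
            bytes       INTEGER,
            duration_s  REAL,
            start_time  TEXT
        )
    """)
    conn.executemany(
        "INSERT INTO flow_records VALUES (?, ?, ?, ?, ?, ?)",
        [("router1", "10.0.0.5", "10.0.1.9", 9000000, 12.0, "2012-01-01T00:00:00"),
         ("hostagent", "10.0.0.5", "10.0.1.9", 9100000, 12.5, "2012-01-01T00:00:00")],
    )

    # Flag flows whose average throughput falls below a (hypothetical) threshold.
    for row in conn.execute(
            "SELECT source, bytes/duration_s AS Bps FROM flow_records WHERE bytes/duration_s < 1e6"):
        print("slow flow reported by", row[0], "at", round(row[1]), "B/s")
    ```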

  12. Global spatial deconvolution of Lunar Prospector Th abundances D. J. Lawrence,1

    E-print Network

    Spudis, Paul D.

    Global spatial deconvolution of Lunar Prospector Th abundances. D. J. Lawrence, R. C. Puetter, et al. … completed the first global spatial deconvolution analysis of planetary gamma-ray data for lunar Th … deconvolution techniques (Jansson's method and the Pixon method) and determined that the Pixon method provides …

  13. Deconvolution and Blind Deconvolution in Astronomy

    E-print Network

    Masci, Frank

    Deconvolution and Blind Deconvolution in Astronomy. Chapter contents fragment: 1.1 Introduction; … Blind Deconvolution; 1.10 Conclusions and Chapter Summary. © 2001 by CRC Press LLC.

  14. Climate Data Analysis Tools - (CDAT)

    NASA Astrophysics Data System (ADS)

    Doutriaux, C.; Jennifer, A.; Drach, R.; Dubois, P.; Williams, D.

    2003-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications) and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS). One of the most difficult challenges facing climate researchers today is the cataloging and analysis of massive amounts of multi-dimensional global atmospheric and oceanic model data. To reduce the labor intensive and time-consuming process of data management, retrieval, and analysis, PCMDI and other DOE sites have come together to develop intelligent filing system and data management software for the linking of storage devices located throughout the United States and the international climate research community. This effort, headed by PCMDI, NCAR, and ANL will allow users anywhere to remotely access this distributed multi-petabyte archive and perform analysis. PCMDI's CDAT is an innovative system that supports exploration and visualization of climate scientific datasets. As an "open system", the software sub-systems (i.e., modules) are independent and freely available to the global climate community. CDAT is easily extended to include new modules and as a result of its flexibility, PCMDI has integrated other popular software components, such as: the popular Live Access Server (LAS) and the Distributed Oceanographic Data System (DODS). Together with ANL's Globus middleware software, CDAT's focus is to allow climate researchers the ability to access and analyze multi-dimensional distributed climate datasets.

  15. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of some of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed to a specific purpose and are non-extensive and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

  16. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at completing a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing executed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  17. Image Deconvolution by Multiscale Methods

    E-print Network

    Starck, Jean-Luc

    Image Deconvolution by Multiscale Methods. Jean-Luc Starck (Service d'Astrophysique, CEA), with Emmanuel Candes, David Donoho, Albert Bijaoui. Presentation fragment covering deconvolution methods in astronomy, the multiresolution support (NGC 2997 example), and deconvolution simulations.

  18. Image deconvolution, denoising and compression

    E-print Network

    Masci, Frank

    Image deconvolution, denoising and compression. T. E. Gureyev and Ya. I. Nesterets, 15.11.2002. Deconvolution problem: $D(x,y) = (I * P)(x,y) + N(x,y)$, where $*$ denotes convolution; given $D$, $P$ and $N$, find $I$ (i.e. compensate for noise and the PSF of the imaging system).

  19. KUICNET ALGORITHMS FOR BLIND DECONVOLUTION

    E-print Network

    Douglas, Scott C.

    KUICNET ALGORITHMS FOR BLIND DECONVOLUTION. Scott C. Douglas and S.-Y. Kung. … to the blind deconvolution task. The proposed algorithm has a simple form and is effective in deconvolving … with various distributions. I. INTRODUCTION: The blind deconvolution task figures prominently in many practical …

  20. Explicit deconvolution of wellbore storage distorted well test data 

    E-print Network

    Bahabanian, Olivier

    2007-04-25

    The analysis/interpretation of wellbore storage distorted pressure transient test data remains one of the most significant challenges in well test analysis. Deconvolution (i.e., the "conversion" of a variable-rate distorted pressure profile...

  1. Wavespace-Based Coherent Deconvolution

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Cattafesta, Louis N., III

    2012-01-01

    Array deconvolution is commonly used in aeroacoustic analysis to remove the influence of a microphone array's point spread function from a conventional beamforming map. Unfortunately, the majority of deconvolution algorithms assume that the acoustic sources in a measurement are incoherent, which can be problematic for some aeroacoustic phenomena with coherent, spatially-distributed characteristics. While several algorithms have been proposed to handle coherent sources, some are computationally intractable for many problems while others require restrictive assumptions about the source field. Newer generalized inverse techniques hold promise, but are still under investigation for general use. An alternate coherent deconvolution method is proposed based on a wavespace transformation of the array data. Wavespace analysis offers advantages over curved-wave array processing, such as providing an explicit shift-invariance in the convolution of the array sampling function with the acoustic wave field. However, usage of the wavespace transformation assumes the acoustic wave field is accurately approximated as a superposition of plane wave fields, regardless of true wavefront curvature. The wavespace technique leverages Fourier transforms to quickly evaluate a shift-invariant convolution. The method is derived for and applied to ideal incoherent and coherent plane wave fields to demonstrate its ability to determine magnitude and relative phase of multiple coherent sources. Multi-scale processing is explored as a means of accelerating solution convergence. A case with a spherical wave front is evaluated. Finally, a trailing edge noise experiment case is considered. Results show the method successfully deconvolves incoherent, partially-coherent, and coherent plane wave fields to a degree necessary for quantitative evaluation. Curved wave front cases warrant further investigation. A potential extension to nearfield beamforming is proposed.
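
    The shift-invariance that makes the wavespace formulation attractive is what lets a convolution with the array sampling function be undone with Fourier transforms; the sketch below shows a regularized FFT deconvolution of a plane-wave source map from a known, shift-invariant point spread function. The Gaussian response used here is a made-up stand-in, not the paper's microphone-array model.

    ```python
    # Sketch of shift-invariant deconvolution via FFTs, the property the wavespace
    # formulation exploits. The point spread function here is a made-up Gaussian
    # smoothing kernel, not an actual microphone-array response.
    import numpy as np

    n = 256

    # True source distribution over a wavespace-like grid: two discrete plane-wave
    # components of different strengths.
    true_map = np.zeros(n)
    true_map[40] = 1.0
    true_map[90] = 0.5

    # Shift-invariant point spread function (assumed Gaussian) and the "dirty" map.
    psf = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 3.0 ** 2)
    psf /= psf.sum()
    PSF = np.fft.fft(np.fft.ifftshift(psf))
    dirty_map = np.real(np.fft.ifft(np.fft.fft(true_map) * PSF))

    # Regularized inverse (Wiener-like) filter recovers the source map.
    eps = 1e-3
    clean_map = np.real(np.fft.ifft(np.fft.fft(dirty_map) * np.conj(PSF) / (np.abs(PSF) ** 2 + eps)))
    print("recovered peak locations:", sorted(np.argsort(clean_map)[-2:]))
    ```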

  2. Modeling error in Approximate Deconvolution Models

    E-print Network

    Adrian Dunca; Roger Lewandowski

    2012-10-09

    We investigate the asymptotic behaviour of the modeling error in approximate deconvolution models in the 3D periodic case, when the order $N$ of deconvolution goes to $\infty$. We consider successively the generalised Helmholtz filters of order $p$ and the Gaussian filter. For Helmholtz filters, we estimate the rate of convergence to zero thanks to energy budgets, Gronwall's Lemma and sharp inequalities for the Fourier coefficients of the residual stress. We next show why the same analysis does not allow us to conclude that the modeling error converges to zero in the case of the Gaussian filter, leaving open issues.
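
    For context, the order-$N$ approximate deconvolution operator on which such models are built is usually written in the van Cittert form below; this is the standard statement from the approximate deconvolution literature, quoted for background rather than copied from this paper.

    ```latex
    % Order-N approximate deconvolution operator built from a filter G, e.g. the
    % Helmholtz filter \bar{u} = (I - \delta^2 \Delta)^{-1} u (standard ADM form,
    % stated here for background only).
    D_N = \sum_{n=0}^{N} (I - G)^{n}, \qquad u \approx D_N \bar{u}, \qquad \bar{u} = G u .
    ```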

  3. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  4. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, and the heterogeneous and homogeneous interaction of the organically associated elements, must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to the existing EERC spreadsheet application included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity using grey-scale binning of the SEM image. The image analysis capabilities of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is dependent on the chemistry of the particle, it is possible to map chemically similar areas, which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which is currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash that will be represented by that mineral on a frame-by-frame basis. The slag viscosity model was improved to provide better predictions of slag viscosity and temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.

  5. NCI Interactive Budget Analysis Tool

    Cancer.gov

    This tool provides users an interactive overview of the National Cancer Institute (NCI) budget and Fact Book data since Fiscal Year 1999. Additional historical NCI budget information can be obtained through the NCI Fact Book Collection.

  6. EASY-GOING deconvolution: Automated MQMAS NMR spectrum analysis based on a model with analytical crystallite excitation efficiencies

    NASA Astrophysics Data System (ADS)

    Grimminck, Dennis L. A. G.; van Meerten, Bas; Verkuijlen, Margriet H. W.; van Eck, Ernst R. H.; Leo Meerts, W.; Kentgens, Arno P. M.

    2013-03-01

    The EASY-GOING deconvolution (EGdeconv) program is extended to enable fast and automated fitting of multiple quantum magic angle spinning (MQMAS) spectra guided by evolutionary algorithms. We implemented an analytical crystallite excitation model for spectrum simulation. Currently these efficiencies are limited to two-pulse and z-filtered 3QMAS spectra of spin 3/2 and 5/2 nuclei, whereas for higher spin-quantum numbers ideal excitation is assumed. The analytical expressions are explained in full to avoid ambiguity and facilitate others to use them. The EGdeconv program can fit interaction parameter distributions. It currently includes a Gaussian distribution for the chemical shift and an (extended) Czjzek distribution for the quadrupolar interaction. We provide three case studies to illustrate EGdeconv's capabilities for fitting MQMAS spectra. The EGdeconv program is available as is on our website http://egdeconv.science.ru.nl for 64-bit Linux operating systems.

  7. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  8. 2010 Solar Market Transformation Analysis and Tools

    SciTech Connect

    none,

    2010-04-01

    This document describes the DOE-funded solar market transformation analysis and tools under development in Fiscal Year 2010 so that stakeholders can access available resources and get engaged where interested.

  9. Statistical Tools for Forensic Analysis of Toolmarks

    SciTech Connect

    David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

    2004-04-22

    Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

  10. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in case of maintaining large-scale legacy systems tool support is essential. Reverse engineering tools provide various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  11. Model Analysis ToolKit

    Energy Science and Technology Software Center (ESTSC)

    2015-05-15

    MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes: define parameters, define observations, define model (Python function), and define samplesets (sets of parameter combinations). Currently supported functionality includes: forward model runs; Latin-Hypercube sampling of parameters; multi-dimensional parameter studies; parallel execution of parameter samples; model calibration using an internal Levenberg-Marquardt algorithm, the lmfit package, or the levmar package; and Markov Chain Monte Carlo using the pymc package. MATK facilitates model analysis using scipy for calibration (scipy.optimize) and rpy2 as a Python interface to R.

  12. Budget Risk & Prioritization Analysis Tool

    SciTech Connect

    2010-12-31

    BRPAtool performs the following: assists managers in making solid decisions on what scope/activities to reduce and/or eliminate to meet constrained budgets, based on multiple risk factors; enables analysis of different budget scenarios; analyzes risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks; provides real-time analysis; enables managers to determine the multipliers and where funding is best applied; and promotes solid budget defense.

  13. A comparison of deconvolution techniques

    SciTech Connect

    Ammon, C.J. . Inst. of Tectonics)

    1992-08-01

    In the following, I compare several approaches to the deconvolution of a wavelet from a seismogram, when the wavelet can be estimated beforehand. Specifically, I examine the frequency domain waterlevel method, a least-squares, frequency-dependent weighting scheme, and a time-domain approach using the Singular Value Decomposition. I illustrate each deconvolution technique by estimating receiver functions using both high quality and low quality seismic waveforms. Each technique performs well with high quality data, and all have problems with noisy data. In cases where the data do not allow a complete extraction of desired information, the time domain approach is the more useful method to interpret the resulting waveforms. Singular Value Decomposition also permits an optimal combination of information from several deconvolutions or a quantitative comparison of the results from several deconvolutions. Additionally, the time domain approach is readily adaptable to multi-waveform deconvolution.
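
    The frequency-domain water-level method the author compares is compact enough to sketch; the wavelet, seismogram, and water-level fraction below are synthetic and only illustrate the mechanics.

    ```python
    # Sketch of frequency-domain water-level deconvolution of a known wavelet from
    # a seismogram. Wavelet, trace and water level are synthetic illustrations.
    import numpy as np

    n = 512
    rng = np.random.default_rng(5)

    # Source wavelet: a decaying sinusoid (stand-in for an estimated source pulse).
    t = np.arange(n) * 0.05
    wavelet = np.exp(-t / 0.5) * np.sin(2 * np.pi * 2.0 * t)

    # "Earth response": a few spikes; the seismogram is wavelet * response + noise.
    response = np.zeros(n)
    response[[50, 120, 200]] = [1.0, -0.6, 0.3]
    seismogram = np.real(np.fft.ifft(np.fft.fft(wavelet) * np.fft.fft(response)))
    seismogram += rng.normal(scale=0.01, size=n)

    # Water-level deconvolution: clip the small spectral amplitudes of the wavelet.
    W = np.fft.fft(wavelet)
    S = np.fft.fft(seismogram)
    power = np.abs(W) ** 2
    water = 0.01 * power.max()                     # water-level fraction (assumed)
    est = np.real(np.fft.ifft(S * np.conj(W) / np.maximum(power, water)))
    print("largest recovered spikes at samples:", sorted(np.argsort(np.abs(est))[-3:]))
    ```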

  14. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  15. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
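
    A few of these calculations are easy to mirror outside a spreadsheet; the sketch below reproduces descriptive statistics, a normal-distribution percentile lookup, and a simple linear regression with SciPy on made-up data.

    ```python
    # Sketch of a few of the calculations described above (descriptive statistics,
    # a normal-distribution percentile, simple linear regression) on made-up data.
    import numpy as np
    from scipy import stats

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    y = np.array([2.1, 4.3, 5.9, 8.2, 9.8, 12.1])

    # Descriptive statistics.
    print("mean=%.3f  std=%.3f  median=%.3f" % (y.mean(), y.std(ddof=1), np.median(y)))

    # Value corresponding to a cumulative probability of 0.95 for N(mean, std).
    print("95th percentile estimate:",
          round(stats.norm.ppf(0.95, loc=y.mean(), scale=y.std(ddof=1)), 3))

    # Linear regression y = a*x + b with significance of the slope.
    result = stats.linregress(x, y)
    print("slope=%.3f  intercept=%.3f  p-value=%.2g"
          % (result.slope, result.intercept, result.pvalue))
    ```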

  16. Built Environment Energy Analysis Tool Overview (Presentation)

    SciTech Connect

    Porter, C.

    2013-04-01

    This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

  17. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  18. Photogrammetry Tool for Forensic Analysis

    NASA Technical Reports Server (NTRS)

    Lane, John

    2012-01-01

    A system gives crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.
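
    Step 4 of the procedure, merging per-cube coordinate systems into one global frame, reduces to estimating a rigid transform between corresponding points; a standard SVD-based (Kabsch) sketch is given below with made-up point sets, and is not the tool's actual implementation.

    ```python
    # Sketch of merging two coordinate systems from shared reference points using
    # the SVD-based (Kabsch) rigid-transform estimate. Points are made up; this is
    # not the NASA tool's implementation.
    import numpy as np

    def rigid_transform(a, b):
        """Find rotation R and translation t with b ~= a @ R.T + t (least squares)."""
        ca, cb = a.mean(axis=0), b.mean(axis=0)
        H = (a - ca).T @ (b - cb)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cb - R @ ca
        return R, t

    # Reference points marked on one cube, expressed in two different cube frames.
    points_cube1 = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
    theta = np.deg2rad(30.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
    points_cube2 = points_cube1 @ R_true.T + np.array([2.0, -1.0, 0.5])

    R, t = rigid_transform(points_cube1, points_cube2)
    print("recovered translation:", np.round(t, 3))
    ```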

  19. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  20. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.

  1. Bayesian least squares deconvolution

    NASA Astrophysics Data System (ADS)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
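
    A minimal numerical sketch of the kind of model described, assuming the usual linear LSD setup (observed polarized spectrum = line mask times common profile plus noise) and a squared-exponential GP prior on the profile; the mask construction, hyperparameters, and noise level are invented for illustration, and this is not the authors' released code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical multiline setup: n_pix spectrum pixels, n_v velocity bins, n_lines spectral lines.
n_pix, n_v, n_lines = 400, 41, 30
v = np.linspace(-100, 100, n_v)                      # velocity grid of the LSD profile (km/s)
true_z = 0.02 * np.exp(-0.5 * (v / 15.0) ** 2)       # hidden common signature

# Line mask M: each line contributes a weighted, shifted copy of the common profile.
M = np.zeros((n_pix, n_v))
for _ in range(n_lines):
    w = rng.uniform(0.2, 1.0)                        # line weight
    c = rng.integers(n_v // 2, n_pix - n_v // 2)     # line centre pixel
    M[c - n_v // 2: c + n_v // 2 + 1, :] += w * np.eye(n_v)

sigma = 5e-3
V = M @ true_z + rng.normal(0, sigma, n_pix)         # noisy observed spectrum

# GP-regularized (posterior mean) solution: z_hat = K M^T (M K M^T + sigma^2 I)^-1 V
amp, ell = 0.05, 20.0                                # hypothetical GP hyperparameters
K = amp**2 * np.exp(-0.5 * ((v[:, None] - v[None, :]) / ell) ** 2)
C = M @ K @ M.T + sigma**2 * np.eye(n_pix)
alpha = np.linalg.solve(C, V)
z_hat = K @ M.T @ alpha
z_cov = K - K @ M.T @ np.linalg.solve(C, M @ K)      # posterior covariance per velocity bin
print(z_hat[n_v // 2], np.sqrt(z_cov[n_v // 2, n_v // 2]))
```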

  2. Compressive blind image deconvolution.

    PubMed

    Amizic, Bruno; Spinoulas, Leonidas; Molina, Rafael; Katsaggelos, Aggelos K

    2013-10-01

    We propose a novel blind image deconvolution (BID) regularization framework for compressive sensing (CS) based imaging systems capturing blurred images. The proposed framework relies on a constrained optimization technique, which is solved by a sequence of unconstrained sub-problems, and allows the incorporation of existing CS reconstruction algorithms in compressive BID problems. As an example, a non-convex lp quasi-norm with 0 < p < 1 is employed as a regularization term for the image, while a simultaneous auto-regressive regularization term is selected for the blur. Nevertheless, the proposed approach is very general and it can be easily adapted to other state-of-the-art BID schemes that utilize different, application specific, image/blur regularization terms. Experimental results, obtained with simulations using blurred synthetic images and real passive millimeter-wave images, show the feasibility of the proposed method and its advantages over existing approaches. PMID:23744684

  3. Bayesian least squares deconvolution

    E-print Network

    Ramos, A Asensio

    2015-01-01

    Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  4. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  5. From sensor networks to connected analysis tools

    NASA Astrophysics Data System (ADS)

    Dawes, N.; Bavay, M.; Egger, T.; Sarni, S.; Salehi, A.; Davison, A.; Jeung, H.; Aberer, K.; Lehning, M.

    2012-04-01

    Multi-disciplinary data systems provide excellent tools for locating data, but most eventually provide a series of local files for further processing, providing marginal advantages for the regular user. The Swiss Experiment Platform (SwissEx) was built with the primary goal of enabling high density measurements, integrating them with lower density existing measurements and encouraging cross/inter-disciplinary collaborations. Nearing the end of the project, we have exceeded these goals, also providing connected tools for direct data access from analysis applications. SwissEx (www.swiss-experiment.ch) provides self-organising networks for rapid deployment and integrates these data with existing measurements from across environmental research. The data are categorised and documented according to their originating experiments and fieldsites as well as being searchable globally. Data from SwissEx are available for download, but we also provide tools to directly access data from within common scientific applications (Matlab, LabView, R) and numerical models such as Alpine3D (using a data acquisition plugin and preprocessing library, MeteoIO). The continuation project (the Swiss Environmental Data and Knowledge Platform) will aim to continue the ideas developed within SwissEx and (alongside cloud enablement and standardisation) work on the development of these tools for application specific tasks. We will work alongside several projects from a wide range of disciplines to help them to develop tools which either require real-time data, or large data samples. As well as developing domain specific tools, we will also be working on tools for the utilisation of the latest knowledge in data control, trend analysis, spatio-temporal statistics and downscaling (developed within the CCES Extremes project), which will be a particularly interesting application when combined with the large range of measurements already held in the system. This presentation will look at the applications and web services developed so far as well as to be developed in the future.

  6. NOVEL ONLINE ADAPTIVE LEARNING ALGORITHMS FOR BLIND DECONVOLUTION

    E-print Network

    Douglas, Scott C.

    Blind deconvolution is an important task for numerous applications in signal deconvolution and source separation. Keywords: adaptive algorithms, blind deconvolution, natural gradient.

  7. NOVEL ON-LINE ADAPTIVE LEARNING ALGORITHMS FOR BLIND DECONVOLUTION

    E-print Network

    Cichocki, Andrzej

    Blind deconvolution is an important task for numerous applications in signal deconvolution and source separation. Keywords: adaptive algorithms, blind deconvolution, natural gradient.

  8. Fairing Separation Analysis Using SepTOOL

    NASA Technical Reports Server (NTRS)

    Zalewski, Bart F.; Dial, William B.; Kosareo, Daniel N.

    2015-01-01

    This document describes the relevant equations programmed in spreadsheet software, SepTOOL, developed by ZIN Technologies, Inc. (ZIN) to determine the separation clearance between a launch vehicle payload fairing and remaining stages. The software uses closed form rigid body dynamic solutions of the vehicle in combination with flexible body dynamics of the fairing, which is obtained from flexible body dynamic analysis or from test data, and superimposes the two results to obtain minimum separation clearance for any given set of flight trajectory conditions. Using closed form solutions allows SepTOOL to perform separation calculations several orders of magnitude faster than numerical methods, which allows users to perform real time parameter studies. Moreover, SepTOOL can optimize vehicle performance to minimize separation clearance. This tool can evaluate various shapes and sizes of fairings along with different vehicle configurations and trajectories. These geometries and parameters are entered through a user-friendly interface. Although the software was specifically developed for evaluating the separation clearance of launch vehicle payload fairings, separation dynamics of other launch vehicle components can be evaluated provided that aerodynamic loads acting on the vehicle during the separation event are negligible. This document describes the development of SepTOOL and provides the analytical procedure and theoretical equations, although the implementation of these equations is not disclosed. Realistic examples are presented, and the results are verified with ADAMS (MSC Software Corporation) simulations. It should be noted that SepTOOL is preliminary separation clearance assessment software for payload fairings and should not be used for final clearance analysis.

  9. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  10. Deconvolution with Shapelets

    E-print Network

    Peter Melchior; Rene Andrae; Matteo Maturi; Matthias Bartelmann

    2008-06-25

    We seek to find a shapelet-based scheme for deconvolving galaxy images from the PSF which leads to unbiased shear measurements. Based on the analytic formulation of convolution in shapelet space, we construct a procedure to recover the unconvolved shapelet coefficients under the assumption that the PSF is perfectly known. Using specific simulations, we test this approach and compare it to other published approaches. We show that convolution in shapelet space leads to a shapelet model of order $n_{max}^h = n_{max}^g + n_{max}^f$ with $n_{max}^f$ and $n_{max}^g$ being the maximum orders of the intrinsic galaxy and the PSF models, respectively. Deconvolution is hence a transformation which maps a certain number of convolved coefficients onto a generally smaller number of deconvolved coefficients. By inferring the latter number from data, we construct the maximum-likelihood solution for this transformation and obtain unbiased shear estimates with a remarkable amount of noise reduction compared to established approaches. This finding is particularly valid for complicated PSF models and low $S/N$ images, which renders our approach suitable for typical weak-lensing conditions.
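
    The abstract describes deconvolution as a linear map from a larger set of convolved shapelet coefficients onto a smaller set of deconvolved ones, solved in a maximum-likelihood sense. A minimal sketch of that idea follows, with a random, well-conditioned stand-in for the true shapelet-space convolution matrix; all names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions: n_h convolved (observed) coefficients, n_f deconvolved ones (n_f < n_h).
n_h, n_f = 28, 10

# P plays the role of the PSF convolution matrix in shapelet space (here just a random
# stand-in; in practice it follows from the analytic convolution relations).
P = rng.normal(size=(n_h, n_f))

f_true = rng.normal(size=n_f)                    # intrinsic galaxy coefficients
sigma = 0.05
h_obs = P @ f_true + rng.normal(0, sigma, n_h)   # observed, convolved coefficients

# Maximum-likelihood estimate under Gaussian noise: ordinary least squares on the linear map.
f_hat, *_ = np.linalg.lstsq(P, h_obs, rcond=None)
print(np.max(np.abs(f_hat - f_true)))
```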

  11. Challenges Facing Design and Analysis Tools

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

    2001-01-01

    The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

  12. Quantitative fluorescence microscopy and image deconvolution.

    PubMed

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches: deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Proper use of these methods demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image; image deconvolution, which removes blurred signal from an image, is one very common such algorithm. PMID:23931516
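
    As an illustration of the restoration class of algorithms described above, here is a minimal Richardson-Lucy sketch (a widely used restoration algorithm, not necessarily the one applied in the chapter); the PSF, test object, and iteration count are hypothetical.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
    """Minimal Richardson-Lucy restoration: find an object that, convolved with the PSF,
    reproduces the observed image (multiplicative update, Poisson noise model)."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + eps)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Hypothetical test object: two point-like emitters blurred by a Gaussian PSF.
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
obj = np.zeros((64, 64)); obj[20, 20] = 1.0; obj[40, 44] = 0.5
image = fftconvolve(obj, psf / psf.sum(), mode="same")

restored = richardson_lucy(image, psf)
print(np.unravel_index(restored.argmax(), restored.shape))   # brightest peak near (20, 20)
```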

  13. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  14. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  15. Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning

    E-print Network

    Chen, Tsuhan

    In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow from low-dose perfusion CT, using online dictionary learning and then performing deconvolution-based hemodynamic parameter estimation.

  16. 9. Analysis a. Analysis tools for dam removal

    E-print Network

    Tullos, Desiree

    Hydrodynamic, sediment transport, and physical processes are frequently the main concerns associated with a dam removal due to the possible effects on infrastructure; the options for managing reservoir sediment when removing a dam are river erosion, mechanical removal, and stabilization (ASCE 1997).

  17. Holographic deconvolution microscopy for high-resolution particle tracking

    E-print Network

    Grier, David

    Holographic deconvolution microscopy achieves high-resolution particle tracking by applying three-dimensional deconvolution to holographic reconstructions.

  18. Physics 343 Lecture # 10: lab 5 + deconvolution

    E-print Network

    Baker, Andrew J.

    Schedule for this week: observations for deconvolution. Class handout: Difmap cookbook; useful website: http://www.astro.caltech.edu/~tjp/citvlb/

  19. Simultaneous Wavelet Deconvolution in Periodic Setting

    E-print Network

    Pensky, Marianna

    The paper proposes a method of deconvolution in a periodic setting which combines two important techniques and demonstrates the high-quality performance of the proposed method. Key words: deconvolution, Meyer wavelets, multichannel.

  20. INVERSE SCALE SPACE METHODS FOR BLIND DECONVOLUTION

    E-print Network

    Ferguson, Thomas S.

    In this paper we propose a blind deconvolution algorithm based on total variation regularization. Key words: blind deconvolution, Gaussian blur, denoising, inverse scale space methods.

  1. A CLEAN-based method for mosaic deconvolution

    E-print Network

    Hardcastle, Martin

    Algorithms such as CLEAN (Högbom 1974; Clark 1980) are used for the deconvolution of radio synthesis images; this paper presents a CLEAN-based method for mosaic deconvolution, covering single-field observation and mosaic reconstruction of the maps produced by an interferometer.

  2. MEDICALSCIENCES Reconstructing influenza incidence by deconvolution of

    E-print Network

    Plotkin, Joshua B.

    Reconstructing influenza incidence by deconvolution of daily mortality time series, based on the Richardson-Lucy deconvolution scheme from optics. We apply the method to reconstruct the incidence curves as a reflection of the evolution of the epidemic; there is much literature on the problem of deconvolution.

  3. Perfusion Quantification Using Gaussian Process Deconvolution

    E-print Network

    Edinburgh, University of

    Perfusion quantification using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using Gaussian process deconvolution (GPD) is presented.

  4. Multiscale Methods for Shape Constraints in Deconvolution

    E-print Network

    Munk, Axel

    We derive multiscale statistics for deconvolution in order to obtain confidence statements for qualitative features in an ill-posed setting, where the Fourier transform of the error density in the deconvolution model is of polynomial decay.

  5. IMAGE DECONVOLUTION M. Bertero and P. Boccacci

    E-print Network

    Boccacci, Patrizia

    Starting from the ill-posedness of image deconvolution in a continuous setting, we develop a detailed statistical model.

  6. Euler deconvolution in satellite geodesy Matthias Roth

    E-print Network

    Stuttgart, Universität

    Euler deconvolution is a semi-automatic method; beginning with Thompson (1982), who used magnetic data, Euler deconvolution became of great interest in research.

  7. A DESIGN AND ANALYSIS TOOL FOR SOLAR ELECTRIC SYSTEMS

    E-print Network

    Delaware, University of

    PV PLANNER: a design and analysis tool for solar electric systems. Updated user manual, May 2011.

  8. Deconvolution of gas chromatographic data

    NASA Technical Reports Server (NTRS)

    Howard, S.; Rayborn, G. H.

    1980-01-01

    The use of deconvolution methods on gas chromatographic data to obtain an accurate determination of the relative amounts of each material present by mathematically separating the merged peaks is discussed. Data were obtained on a gas chromatograph with a flame ionization detector. Chromatograms of five xylenes with differing degrees of separation were generated by varying the column temperature at selected rates. The merged peaks were then successfully separated by deconvolution. The concept of function continuation in the frequency domain was introduced in striving to reach the theoretical limit of accuracy, but proved to be only partially successful.
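
    A minimal sketch of frequency-domain deconvolution of merged peaks, using a Wiener-style regularized division rather than the original implementation; the peak shapes, broadening function, and regularization constant are hypothetical.

```python
import numpy as np

t = np.linspace(0.0, 20.0, 2048)

def gauss(t, centre, amp, width):
    return amp * np.exp(-0.5 * ((t - centre) / width) ** 2)

# Hypothetical "true" chromatogram: two narrow components, relative amounts 1.0 : 0.6.
true = gauss(t, 8.0, 1.0, 0.15) + gauss(t, 9.0, 0.6, 0.15)

# Instrument broadening that merges the peaks (normalised, centred for circular convolution).
broadening = gauss(t, t[t.size // 2], 1.0, 0.6)
broadening /= broadening.sum()
H = np.fft.fft(np.fft.ifftshift(broadening))

merged = np.real(np.fft.ifft(np.fft.fft(true) * H))
merged += np.random.default_rng(2).normal(0.0, 1e-3, t.size)    # detector noise

# Wiener-regularised division in the frequency domain (k is a hypothetical noise parameter).
k = 1e-4
restored = np.real(np.fft.ifft(np.fft.fft(merged) * np.conj(H) / (np.abs(H) ** 2 + k)))

# Relative amounts from the now-separated peak areas (expect roughly 0.6).
area_1 = restored[(t > 7.4) & (t < 8.5)].sum()
area_2 = restored[(t > 8.5) & (t < 9.6)].sum()
print(area_2 / area_1)
```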

  9. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  10. Deconvolution of wide field-of-view radiometer measurements of earth-emitted radiation. II - Analysis of first year of Nimbus 6 ERB data

    NASA Technical Reports Server (NTRS)

    Bess, T. D.; Green, R. N.; Smith, G. L.

    1981-01-01

    The theory of deconvolution considered by Smith and Green (1981) is applied to Nimbus 6 data in order to interpret the data with the maximum obtainable resolution. Attention is given to the data sample, sun-contaminated measurements, the measurement model, the deconvolution, the degree variance, the spherical harmonic coefficients, the geographical distribution of longwave radiation, time histories of zonal coefficients, and the effect of a grid system. Degree variance plots for 12 months of longwave radiation data show that the limit for a spherical harmonic representation of the Nimbus 6 wide field-of-view longwave data is 12th degree. The degree variance plots also reveal that most of the power is in the lower degree terms. Contour maps of the radiation field show the geographical distribution of earth emitted radiant exitance for each month.

  11. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

  12. Three-dimensional analysis tool for segmenting and measuring the structure of telomeres in mammalian nuclei

    NASA Astrophysics Data System (ADS)

    Vermolen, Bart J.; Young, Ian T.; Chuang, Alice; Wark, Landon; Chuang, Tony; Mai, Sabine; Garini, Yuval

    2005-03-01

    Quantitative analysis in combination with fluorescence microscopy calls for innovative digital image measurement tools. We have developed a three-dimensional tool for segmenting and analyzing FISH stained telomeres in interphase nuclei. After deconvolution of the images, we segment the individual telomeres and measure a distribution parameter we call ρT. This parameter describes whether the telomeres are distributed in a sphere-like volume (ρT ~ 1) or in a disk-like volume (ρT >> 1). Because of the statistical nature of this parameter, we have to correct for the fact that we do not have an infinite number of telomeres to calculate this parameter. In this study we show a way to do this correction. After sorting mouse lymphocytes, calculating ρT, and using the correction introduced in this paper, we show a significant difference between nuclei in G2 and nuclei in either G0/G1 or S phase. The mean values of ρT for G0/G1, S and G2 are 1.03, 1.02 and 13 respectively.

  13. GIS-based hydrogeochemical analysis tools (QUIMET)

    NASA Astrophysics Data System (ADS)

    Velasco, V.; Tubau, I.; Vázquez-Suñè, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernàndez-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

    2014-09-01

    A software platform (QUIMET) was developed to improve the sorting, analysis, calculations, visualizations, and interpretations of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The instruments for analysis cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schöeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows performing a complete statistical analysis of the data including descriptive statistic univariate and bivariate analysis, the latter including generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.
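
    One of the listed calculations, the ionic balance, can be sketched in a few lines; the ion list, equivalent weights, and sample concentrations below are illustrative assumptions, not QUIMET code.

```python
# Equivalent weights (g/eq) for major ions: molar mass divided by charge.
EQ_WEIGHT = {"Ca": 40.08 / 2, "Mg": 24.31 / 2, "Na": 22.99, "K": 39.10,
             "HCO3": 61.02, "Cl": 35.45, "SO4": 96.06 / 2, "NO3": 62.00}
CATIONS = {"Ca", "Mg", "Na", "K"}

def ionic_balance_error(sample_mg_per_l):
    """Charge-balance error (%) from major-ion concentrations in mg/L."""
    meq = {ion: c / EQ_WEIGHT[ion] for ion, c in sample_mg_per_l.items()}
    cations = sum(v for ion, v in meq.items() if ion in CATIONS)
    anions = sum(v for ion, v in meq.items() if ion not in CATIONS)
    return 100.0 * (cations - anions) / (cations + anions)

# Hypothetical groundwater sample (mg/L).
sample = {"Ca": 95.0, "Mg": 22.0, "Na": 60.0, "K": 4.0,
          "HCO3": 280.0, "Cl": 85.0, "SO4": 110.0, "NO3": 20.0}
print(f"ionic balance error: {ionic_balance_error(sample):+.1f}%")
```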

  14. Optimal application of Morrison's iterative noise removal for deconvolution

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1986-01-01

    Morrison's iterative method of noise removal can be applied both for noise removal alone and for noise removal prior to deconvolution. The method is applied to data with noise of various levels added in order to determine its optimum use. The phase shift method of migration and modeling is evaluated and the results are compared to Stolt's approach. A method is introduced by which the optimum number of iterations for deconvolution can be found. Statistical computer simulation is used to describe the optimum use of two convergent iterative techniques for seismic data. The Always-Convergent deconvolution technique was applied to data recorded during the quantitative analysis of materials through Nondestructive Evaluation (NDE), in which ultrasonic signals were used to detect flaws in substances such as composites.
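
    A minimal sketch of the general approach of combining iterative noise removal with iterative deconvolution: repeated convolution with the response function for smoothing, followed by a Van Cittert-type deconvolution iteration. This illustrates the family of methods rather than Morrison's exact prescription, and the signal, response, and iteration counts are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

x = np.linspace(-10, 10, 512)
response = np.exp(-0.5 * (x / 0.8) ** 2)
response /= response.sum()

true = np.exp(-0.5 * ((x - 1.5) / 0.3) ** 2) + 0.7 * np.exp(-0.5 * ((x + 1.0) / 0.3) ** 2)
data = np.convolve(true, response, mode="same") + rng.normal(0, 0.01, x.size)

def smooth(signal, kernel, n_pass=3):
    """Noise removal by repeated convolution with the (normalised) response function."""
    for _ in range(n_pass):
        signal = np.convolve(signal, kernel, mode="same")
    return signal

def van_cittert(data, kernel, n_iter=50, relax=1.0):
    """Van Cittert iteration: f_{k+1} = f_k + relax * (data - kernel * f_k)."""
    f = data.copy()
    for _ in range(n_iter):
        f = f + relax * (data - np.convolve(f, kernel, mode="same"))
    return f

denoised = smooth(data, response, n_pass=2)
restored = van_cittert(denoised, response, n_iter=40)
print(float(np.max(restored)), float(np.max(data)))   # restored peaks are sharper and taller
```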

  15. Determinants for global cargo analysis tools

    NASA Astrophysics Data System (ADS)

    Wilmoth, M.; Kay, W.; Sessions, C.; Hancock, M.

    2007-04-01

    The purpose of Global TRADER (GT) is not only to gather and query supply-chain transactional data for facts but also to analyze that data for hidden knowledge for the purpose of useful and meaningful pattern prediction. The application of advanced analytics provides benefits beyond simple information retrieval from GT, including computer-aided detection of useful patterns and associations. Knowledge discovery, offering a breadth and depth of analysis unattainable by manual processes, involves three components: repository structures, analytical engines, and user tools and reports. For a large and complex domain like supply-chains, there are many stages to developing the most advanced analytic capabilities; however, significant benefits accrue as components are incrementally added. These benefits include detecting emerging patterns; identifying new patterns; fusing data; creating models that can learn and predict behavior; and identifying new features for future tools. The GT Analyst Toolset was designed to overcome a variety of constraints, including lack of third party data, partial data loads, non-cleansed data (non-disambiguation of parties, misspellings, transpositions, etc.), and varying levels of analyst experience and expertise. The end result was a set of analytical tools that are flexible, extensible, tunable, and able to support a wide range of analyst demands.

  16. Data Analysis Tools for NSTX-U Physics Meeting

    E-print Network

    Princeton Plasma Physics Laboratory

    NSTX-U Monday Physics Meeting: Data Analysis Tools, Bill Davis (8/26/2013). Web Tool access at http://nstx.pppl.gov/nstx/Software/WebTools; designed for ease of use, with access to plotting Web Tools.

  17. Integrated FDIR Analysis Tool for Space Applications

    NASA Astrophysics Data System (ADS)

    Piras, Annamaria; Malucchi, Giovanni; Di Tommaso, Umberto

    2013-08-01

    The crucial role of health management in space applications has been the subject of many studies carried out by NASA and ESA and is held in high regard by Thales Alenia Space. The common objective is to improve reliability and availability of space systems. This paper will briefly illustrate the evolution of IDEHAS (IntegrateD Engineering Harness Avionics and Software), an advanced tool currently used in Thales Alenia Space – Italy in several space programs and recently enhanced to fully support FDIR (Fault Detection Isolation and Recovery) analysis. The FDIR analysis logic flow will be presented, emphasizing the improvements offered to Mission Support & Operations activities. Finally the benefits provided to the Company and a list of possible future enhancements will be given.

  18. Deconvolution in a ridgelet and curvelet domain

    NASA Astrophysics Data System (ADS)

    Easley, Glenn R.; Berenstein, Carlos A.; Healy, Dennis M., Jr.

    2005-03-01

    We present techniques for performing image reconstruction based on deconvolution in the Radon domain. To deal with a variety of possible boundary conditions, we work with a corresponding generalized discrete Radon transform in order to obtain projection slices for deconvolution. By estimating the projections using wavelet techniques, we are able to do deconvolution directly in a ridgelet domain. We also show how this method can be carried out locally, so that deconvolution can be done in a curvelet domain as well. These techniques suggest a whole new paradigm for developing deconvolution algorithms, which can incorporate leading deconvolution schemes. We conclude by showing experimental results indicating that these new algorithms can significantly improve upon current leading deconvolution methods.

  19. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect

    Gary Casuccio; Michael Potter; Fred Schwerer; Dr. Richard J. Fruehan; Dr. Scott Story

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled ''Inclusion Analysis to Predict Casting Behavior'' was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. 
Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of ''technical selling'' through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with a extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant by plant direct sales program.

  20. Deconvolution procedure of the UV-vis spectra. A powerful tool for the estimation of the binding of a model drug to specific solubilisation loci of bio-compatible aqueous surfactant-forming micelle

    NASA Astrophysics Data System (ADS)

    Calabrese, Ilaria; Merli, Marcello; Turco Liveri, Maria Liria

    2015-05-01

    The UV-vis spectral evolution of Nile Red loaded into Tween 20 micelles with pH and [Tween 20] has been analysed in a non-conventional manner by exploiting the deconvolution method. The number of buried sub-bands has been found to depend on both pH and bio-surfactant concentration, and the sub-band positions have been associated with Nile Red confined in aqueous solution and in the three micellar solubilisation sites. For the first time, by using an extended classical two-pseudo-phase model, the robust treatment of the spectrophotometric data allows the estimation of the Nile Red binding constant to the available loci. Hosting capability towards Nile Red increases with pH. Comparison between binding constant values evaluated classically and those estimated by the deconvolution protocol revealed that the overall binding values match the mean values of the local binding sites. This result suggests that the deconvolution procedure provides more precise and reliable values, which are more representative of drug confinement.
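
    A minimal sketch of deconvolving a band envelope into Gaussian sub-bands by non-linear least squares; the synthetic spectrum, number of sub-bands, and initial guesses are hypothetical and do not reproduce the paper's data or protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, centre, width):
    return amp * np.exp(-0.5 * ((x - centre) / width) ** 2)

def three_bands(x, a1, c1, w1, a2, c2, w2, a3, c3, w3):
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2) + gaussian(x, a3, c3, w3)

# Hypothetical absorption envelope built from three overlapping sub-bands plus noise.
wl = np.linspace(480, 640, 400)                      # wavelength (nm)
true_p = (0.30, 530, 14, 0.55, 565, 18, 0.20, 600, 16)
rng = np.random.default_rng(4)
absorbance = three_bands(wl, *true_p) + rng.normal(0, 0.004, wl.size)

# Deconvolution = non-linear least-squares fit of the sub-band parameters.
guess = (0.3, 525, 15, 0.5, 560, 15, 0.2, 605, 15)
popt, pcov = curve_fit(three_bands, wl, absorbance, p0=guess)

for i in range(3):
    amp, centre, width = popt[3 * i: 3 * i + 3]
    print(f"sub-band {i + 1}: centre {centre:6.1f} nm, amplitude {amp:.3f}, width {width:4.1f} nm")
```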

  1. Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)

    SciTech Connect

    Melaina, M.; Penev, M.

    2012-09-01

    NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.
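
    A minimal cash-flow sketch of the kind of business-case calculation described; all costs, revenues, the discount rate, and the lifetime are illustrative assumptions, not NREL model inputs.

```python
# Hypothetical station: upfront capital cost, then annual revenue net of operating cost,
# discounted to a net present value (NPV). All numbers are illustrative assumptions.
capital_cost = 1_500_000.0        # year-0 investment ($)
annual_revenue = 400_000.0        # $/year
annual_opex = 150_000.0           # $/year
discount_rate = 0.08
lifetime_years = 15

cash_flows = [-capital_cost] + [annual_revenue - annual_opex] * lifetime_years
npv = sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cash_flows))

# Simple (undiscounted) payback period in years.
payback = capital_cost / (annual_revenue - annual_opex)
print(f"NPV: ${npv:,.0f}   simple payback: {payback:.1f} years")
```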

  2. Timeline analysis tools for law enforcement

    NASA Astrophysics Data System (ADS)

    Mucks, John

    1997-02-01

    The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program data visualization, manipulation and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User funded enhancements and Rome Lab funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis it turns out is a popular methodology used in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, results of the initial NYSP evaluation and the plan for a more comprehensive NYSP evaluation.

  3. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton (Albuquerque, NM); Phillips, Cynthia A. (Albuquerque, NM)

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
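
    The weighted attack-graph idea can be sketched with a small directed graph and a shortest-path search; the nodes, edge weights, and epsilon threshold below are hypothetical and this is not the patented implementation.

```python
import networkx as nx

# Hypothetical attack graph: nodes are attack states, edge weights are attacker effort.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("internet", "dmz_web", 2.0),        # exploit public web server
    ("internet", "vpn_gateway", 5.0),    # phish VPN credentials
    ("dmz_web", "app_server", 3.0),      # pivot through misconfigured firewall rule
    ("vpn_gateway", "app_server", 1.0),  # reuse credentials internally
    ("app_server", "database", 2.0),     # escalate to database host
])

# Epsilon-optimal style analysis: enumerate low-effort paths to the critical asset.
target = "database"
best_cost = nx.dijkstra_path_length(G, "internet", target)
epsilon = 1.5
for path in nx.shortest_simple_paths(G, "internet", target, weight="weight"):
    cost = nx.path_weight(G, path, weight="weight")
    if cost > best_cost + epsilon:
        break
    print(f"effort {cost:4.1f}: {' -> '.join(path)}")
```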

  4. Multi-Mission Power Analysis Tool

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2011-01-01

    Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.

  5. Simplified building energy analysis tool for architects

    NASA Astrophysics Data System (ADS)

    Chaisuparasmikul, Pongsak

    Energy Modeler is an energy software program designed to study the relative change of energy uses (heating, cooling, and lighting loads) in different architectural design schemes. This research focuses on developing a tool to improve energy efficiency of the built environment. The research studied the impact of different architectural design response for two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects, perceive, quickly visualize, and compare energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings probably are the best indications of solutions to help environmental sustainability; by reducing the depletion of the world's fossil fuel (oil, natural gas, coal etc.). Architects when they set about designing an environmentally responsive building for an owner or the public, often lack the energy-based information and design tools to tell them whether the building loads and energy consumption are very responsive to the modifications that they made. Buildings are dynamic in nature and changeable over time, with many design variables involved. Architects really need energy-based rules or tools to assist them in the design process. Energy efficient design for sustainable solutions requires attention throughout the design process and is very related to architectural solutions. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative work is to make these tools applicable to the earliest stage of design, where more informed analysis of possible alternatives could yield the most benefit and the greatest cost savings both economic and environmental. This is where computer modeling and simulation can really lead to better and energy efficient buildings. Both apply to internal environment and human comfort, and environmental impact from surroundings.

  6. Airborne LIDAR Data Processing and Analysis Tools

    NASA Astrophysics Data System (ADS)

    Zhang, K.

    2007-12-01

    Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers with high-quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, the LIDAR systems collect voluminous irregularly-spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of the ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageable sized tiles, typically 4 square kilometers in dimension; 3) filtering point data to separate measurements for the ground from those for non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster based analysis; and 5) converting the gridded data into standard GIS import formats. The ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
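
    Step 4 of the processing chain (interpolating irregularly spaced elevations onto a regular grid) can be sketched with SciPy's griddata; the synthetic point cloud and grid spacing are illustrative assumptions, not ALDPAT code.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(5)

# Hypothetical ground-classified LIDAR returns: irregular (x, y) positions with elevations z.
n_pts = 5000
x = rng.uniform(0, 1000, n_pts)          # metres
y = rng.uniform(0, 1000, n_pts)
z = 50 + 0.02 * x + 5 * np.sin(y / 150)  # gently sloping synthetic terrain

# Interpolate onto a regular grid for raster-based analysis.
cell = 10.0                              # grid spacing (m)
gx, gy = np.meshgrid(np.arange(0, 1000 + cell, cell), np.arange(0, 1000 + cell, cell))
dem = griddata((x, y), z, (gx, gy), method="linear")

# Fill cells outside the convex hull (NaN) with nearest-neighbour values before export.
nearest = griddata((x, y), z, (gx, gy), method="nearest")
dem = np.where(np.isnan(dem), nearest, dem)
print(dem.shape, float(np.nanmin(dem)), float(np.nanmax(dem)))
```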

  7. Deconvolution of adaptive optics retinal images Julian C. Christou

    E-print Network

    Adaptive optics retinal images can be further improved by using deconvolution to remove the residual wave-front aberrations; qualitatively, deconvolution improves the images. Deconvolution has also been proposed as an alternative technique to adaptive wave-front correction.

  8. Dynamic contrast-enhanced CT of head and neck tumors: perfusion measurements using a distributed-parameter tracer kinetic model. Initial results and comparison with deconvolution-based analysis

    NASA Astrophysics Data System (ADS)

    Bisdas, Sotirios; Konstantinou, George N.; Sherng Lee, Puor; Thng, Choon Hua; Wagenblast, Jens; Baghi, Mehran; San Koh, Tong

    2007-10-01

    The objective of this work was to evaluate the feasibility of a two-compartment distributed-parameter (DP) tracer kinetic model to generate functional images of several physiologic parameters from dynamic contrast-enhanced CT data obtained from patients with extracranial head and neck tumors and to compare the DP functional images to those obtained by deconvolution-based DCE-CT data analysis. We performed post-processing of DCE-CT studies, obtained from 15 patients with benign and malignant head and neck cancer. We introduced a DP model of the impulse residue function for a capillary-tissue exchange unit, which accounts for the processes of convective transport and capillary-tissue exchange. The calculated parametric maps represented blood flow (F), intravascular blood volume (v1), extravascular extracellular blood volume (v2), vascular transit time (t1), permeability-surface area product (PS), transfer ratios k12 and k21, and the fraction of extracted tracer (E). Based on the same regions of interest (ROI) analysis, we calculated the tumor blood flow (BF), blood volume (BV) and mean transit time (MTT) by using a modified deconvolution-based analysis taking into account the extravasation of the contrast agent for PS imaging. We compared the corresponding values by using Bland-Altman plot analysis. We outlined 73 ROIs including tumor sites, lymph nodes and normal tissue. The Bland-Altman plot analysis revealed that the two methods showed an acceptable degree of agreement for blood flow, and, thus, can be used interchangeably for measuring this parameter. Slightly worse agreement was observed between v1 in the DP model and BV, but even here the two tracer kinetic analyses can be used interchangeably. Whether both techniques may be used interchangeably remained questionable for t1 versus MTT, as well as for measurements of the PS values. The application of the proposed DP model is feasible in the clinical routine and it can be used interchangeably for measuring blood flow and vascular volume with the commercially available reference standard of the deconvolution-based approach. The lack of substantial agreement between the measurements of vascular transit time and permeability-surface area product may be attributed to the different tracer kinetic principles employed by both models and the detailed capillary tissue exchange physiological modeling of the DP technique.
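
    The Bland-Altman comparison used in the study can be sketched in a few lines; the paired measurements below are synthetic stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical paired ROI measurements of blood flow from the two analyses (mL/100 g/min).
flow_dp = rng.normal(80, 25, 73)                       # distributed-parameter model
flow_deconv = flow_dp + rng.normal(2.0, 6.0, 73)       # deconvolution-based estimate

def bland_altman(a, b):
    """Return bias and 95% limits of agreement for two paired measurement series."""
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

bias, low, high = bland_altman(flow_deconv, flow_dp)
print(f"bias {bias:+.1f}, 95% limits of agreement [{low:+.1f}, {high:+.1f}]")
```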

  9. PyRAT - python radiography analysis tool (u)

    SciTech Connect

    Temple, Brian A; Buescher, Kevin L; Armstrong, Jerawan C

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on LINUX and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed variable optimization tool to perform the optimization.

  10. Built Environment Analysis Tool: April 2013

    SciTech Connect

    Porter, C.

    2013-05-01

    This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

  11. A new scoring function for top-down spectral deconvolution

    DOE PAGESBeta

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.

  13. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  14. Software and Analysis Tools Overview Physics Meeting

    E-print Network

    Princeton Plasma Physics Laboratory

    Overview: status of MDSplus; various plotting options; what's new in Web Tools; EFITviewer; XPC. Web Tools can now run from file input and are actively maintained, with Open Science options coming. Documentation and Web Tools can be found at http://nstx.pppl.gov/nstx/Software

  15. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  16. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    SciTech Connect

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  17. HANSIS software tool for the automated analysis of HOLZ lines.

    PubMed

    Holec, D; Sridhara Rao, D V; Humphreys, C J

    2009-06-01

    A software tool, named HANSIS (HOLZ analysis), has been developed for the automated analysis of higher-order Laue zone (HOLZ) lines in convergent beam electron diffraction (CBED) patterns. With this tool, the angles and distances between the HOLZ intersections can be measured and the data can be presented graphically with a user-friendly interface. It is capable of simultaneous analysis of several HOLZ patterns and thus provides a tool for systematic studies of CBED patterns. PMID:19375228

  18. Simultaneous Total Variation Image Inpainting and Blind Deconvolution

    E-print Network

    Ferguson, Thomas S.

    Simultaneous total variation image inpainting and blind deconvolution (Tony F. Chan, Andy M. Yip): boundary conditions for deconvolution required near the interface between observed and occluded regions are naturally generated through inpainting, and inpainting results are enhanced through deconvolution.

  19. Multidimensional Multichannel FIR Deconvolution Using Gröbner Bases

    E-print Network

    Do, Minh N.

    Addresses general multidimensional multichannel deconvolution with finite impulse response (FIR) convolution and deconvolution filters using Gröbner bases. Previous work formulates the problem of multichannel FIR ...

  20. Blind Deconvolution and Structured Matrix Computations with Applications to Array Imaging

    E-print Network

    Plemmons, Robert J.

    Chapter table-of-contents excerpt: 1.1 Introduction; 1.2 One-Dimensional Deconvolution Problems; ...; Multidimensional Deconvolution Problems; 1.6 Numerical Examples.

  1. NuDE Tool-Sets Requirement Analysis Design Implementation

    E-print Network

    The NuDE tool-sets cover requirement analysis, design, and implementation. NuDE is a formal-method-based framework for nuclear power plant (NPP) system development and verification; development institutions include Konkuk University.

  2. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  3. Integrated Turbopump Thermo-Mechanical Design and Analysis Tools

    NASA Astrophysics Data System (ADS)

    Platt, Mike

    2002-07-01

    This viewgraph presentation provides information on the thermo-mechanical design and analysis tools used to control the steady and transient thermo-mechanical effects which drive life, reliability, and cost. The thermo-mechanical analysis tools provide upfront design capability by effectively leveraging existing component design tools to analyze and control: fits, clearance, preload; cooling requirements; stress levels, LCF (low cycle fatigue) limits, and HCF (high cycle fatigue) margin.

  4. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.

  5. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  6. TAFFYS: An Integrated Tool for Comprehensive Analysis of Genomic Aberrations in Tumor Samples

    PubMed Central

    Feng, Huanqing; Wang, Minghui

    2015-01-01

    Background Tumor single nucleotide polymorphism (SNP) arrays are a common platform for investigating cancer genomic aberrations and the functionally important altered genes. Original SNP array signals are usually corrupted by noise, and need to be de-convoluted into an absolute copy number profile by analytical methods. Unfortunately, in contrast with the popularity of tumor Affymetrix SNP arrays, the methods that are specifically designed for this platform are still limited. The complicated noise characteristics of the signals are one of the difficulties in dissecting tumor Affymetrix SNP array data, as they inevitably blur the distinction between aberrations and create an obstacle for copy number aberration (CNA) identification. Results We propose a tool named TAFFYS for comprehensive analysis of tumor Affymetrix SNP array data. TAFFYS introduces a wavelet-based de-noising approach and a copy number-specific signal variance model for suppressing and modelling the noise in signals. A hidden Markov model is then employed for copy number inference. Finally, by using the absolute copy number profile, the statistical significance of each aberration region is calculated in terms of different aberration types, including amplification, deletion and loss of heterozygosity (LOH). The results show that the copy number-specific variance model and the wavelet de-noising algorithm fit well with Affymetrix SNP array signals, leading to more accurate estimation for diluted tumor samples (even with only 30% of cancer cells) than other existing methods. The examinations also demonstrate good compatibility and extensibility across different Affymetrix SNP array platforms. Application to the 35 breast tumor samples shows that TAFFYS can automatically dissect the tumor samples and reveal statistically significant aberration regions where cancer-related genes are located. Conclusions TAFFYS provides an efficient and convenient tool for identifying copy number alterations and allelic imbalance and for assessing recurrent aberrations in tumor Affymetrix SNP array data. PMID:26111017
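
    As a rough illustration of the wavelet de-noising step that TAFFYS applies before copy number inference, the sketch below soft-thresholds the wavelet coefficients of a synthetic piecewise-constant signal. The wavelet, decomposition level, and threshold rule are generic choices assumed for the example, not the settings used by TAFFYS.

      # Generic wavelet soft-threshold de-noising of a synthetic "copy number"
      # signal; wavelet and threshold are illustrative assumptions.
      import numpy as np
      import pywt

      rng = np.random.default_rng(7)
      signal = np.concatenate([np.full(200, 2.0), np.full(150, 3.0), np.full(250, 1.0)])
      noisy = signal + 0.4 * rng.standard_normal(signal.size)

      coeffs = pywt.wavedec(noisy, "haar", level=5)
      sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
      threshold = sigma * np.sqrt(2 * np.log(noisy.size))   # universal threshold
      denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
      denoised = pywt.waverec(denoised_coeffs, "haar")[:noisy.size]

      print("residual std before:", np.std(noisy - signal), "after:", np.std(denoised - signal))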

  7. DERMAL ABSORPTION OF PESTICIDES CALCULATED BY DECONVOLUTION

    EPA Science Inventory

    Using published human data on skin-to-urine and blood-to-urine transfer of 12 pesticides and herbicides, the skin-to-blood transfer rates for each compound were estimated by two numerical deconvolution techniques. Regular constrained deconvolution produced an estimated upper limi...
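
    A minimal numerical sketch of constrained deconvolution in this spirit is shown below: the output is modeled as the convolution of an unknown input with a known impulse response, and the input is recovered by non-negative least squares. The kernel and data are synthetic placeholders, not the pesticide transfer data from the study.

      # Constrained discrete deconvolution via non-negative least squares.
      # Kernel and observations are synthetic stand-ins for illustration.
      import numpy as np
      from scipy.linalg import toeplitz
      from scipy.optimize import nnls

      dt = 0.5                                    # sampling interval (arbitrary units)
      t = np.arange(0, 24, dt)
      kernel = np.exp(-t / 3.0)                   # assumed blood-to-urine impulse response
      true_input = np.exp(-((t - 6.0) / 2.0)**2)  # assumed skin-to-blood transfer rate

      A = toeplitz(kernel, np.zeros_like(kernel)) * dt        # causal convolution matrix
      observed = A @ true_input
      observed += 0.01 * observed.max() * np.random.default_rng(0).standard_normal(t.size)

      estimate, _ = nnls(A, observed)             # non-negativity acts as the constraint
      print("max abs error:", np.abs(estimate - true_input).max())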

  8. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered knowledge representation. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  9. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  10. Approximate Deconvolution Reduced Order Modeling

    E-print Network

    Xie, Xuping; Wang, Zhu; Iliescu, Traian

    2015-01-01

    This paper proposes a large eddy simulation reduced order model (LES-ROM) framework for the numerical simulation of realistic flows. In this LES-ROM framework, the proper orthogonal decomposition (POD) is used to define the ROM basis and a POD differential filter is used to define the large ROM structures. An approximate deconvolution (AD) approach is used to solve the ROM closure problem and develop a new AD-ROM. This AD-ROM is tested in the numerical simulation of the one-dimensional Burgers equation with a small diffusion coefficient (10^-3).
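
    The POD step in such frameworks can be sketched with a plain SVD of a snapshot matrix; the snippet below builds synthetic snapshots (not the Burgers solutions of the paper) and keeps the modes carrying most of the energy.

      # POD basis from a snapshot matrix via the SVD; snapshots are synthetic.
      import numpy as np

      x = np.linspace(0, 1, 200)
      times = np.linspace(0, 1, 80)
      snapshots = np.array([np.sin(np.pi * x) * np.exp(-tt)
                            + 0.3 * np.sin(3 * np.pi * x) * np.cos(5 * tt)
                            for tt in times]).T              # columns are snapshots

      U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
      energy = np.cumsum(s**2) / np.sum(s**2)
      r = int(np.searchsorted(energy, 0.999)) + 1            # modes for 99.9% of the energy
      pod_basis = U[:, :r]
      print(r, "POD modes capture", round(float(energy[r - 1]), 6), "of the snapshot energy")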

  11. Using Kepler for Tool Integration in Microarray Analysis Workflows

    PubMed Central

    Gan, Zhuohui; Stowe, Jennifer C.; Altintas, Ilkay; McCulloch, Andrew D.; Zambon, Alexander C.

    2015-01-01

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments. This makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools including Bioconductor packages, AltAnalyze, a python-based open source tool, and an R-based comparison tool to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.

  12. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  13. Bayesian approach based blind image deconvolution with fuzzy median filter

    NASA Astrophysics Data System (ADS)

    Mohan, S. Chandra; Rajan, K.; Srinivasan, R.

    2011-10-01

    The inverse problem associated with the reconstruction of Poisson-blurred images has attracted attention in recent years. In this paper, we propose an alternative unified approach to the blind image deconvolution problem using a fuzzy median filter as the Gibbs prior to model the nature of inter-pixel interaction for better edge-preserving reconstruction. The performance of the algorithm at various SNR levels has been studied quantitatively using PSNR, RMSE and the universal quality index (UQI). A comparative analysis with existing methods has also been carried out.

  14. Variational approach to parameter estimation in blind deconvolution

    E-print Network

    Granada, Universidad de

    Variational approach to parameter estimation in blind deconvolution (Rafael Molina, Universidad de Granada). Slide outline: problem formulation; Bayesian formulation and estimation of parameter values.

  15. Tools for Knowledge Analysis, Synthesis, and Sharing

    ERIC Educational Resources Information Center

    Medland, Michael B.

    2007-01-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own…

  16. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A

  17. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

  18. Relative focus map estimation using blind deconvolution.

    PubMed

    Kovács, Levente; Szirányi, Tamás

    2005-11-15

    An automatic focus map extraction method is presented that uses a modification of blind deconvolution for estimation of localized blurring functions. We use these local blurring functions [so-called point-spread functions (PSFs)] for extraction of focus areas on ordinary images. In this inverse task our goal is not image reconstruction but the estimation of localized PSFs and the relative focus map. Thus the method is less sensitive than general deconvolution is to noise and ill-posed deconvolution problems. The focus areas can be estimated without any knowledge of the shooting conditions or of the optical system used. PMID:16315708

  19. Spatial deconvolution of IRAS galaxies at 60 UM

    NASA Technical Reports Server (NTRS)

    Low, Frank J.

    1987-01-01

    Using IRAS in a slow scan observing mode to increase the spatial sampling rate and a deconvolution analysis to increase the spatial resolution, several bright galaxies were resolved at 60 micron. Preliminary results for M 82, NGC 1068, NGC 3079 and NGC 2623 show partially resolved emission from 10 to 26 arcsec., full width at half maximum, and extended emission from 30 to 90 arcsec. from the center. In addition, the interacting system, Arp 82, along with Mark 231 and Arp 220 were studied using the program ADDSCAN to average all available survey mode observations. The Arp 82 system is well resolved after deconvolution and its brighter component is extended; the two most luminous objects are not resolved with an upper limit of 15 arcsec. for Arp 220.

  20. Introducing an Online Cooling Tower Performance Analysis Tool 

    E-print Network

    Muller, M.R.; Muller, M.B.; Rao, P.

    2012-01-01

    Introduces an online cooling tower performance analysis tool. Authors: Michael B. Muller, Mechanical Engineer, Rutgers University, Piscataway, NJ; Michael R. Muller, Professor of Mechanical Engineering, Rutgers University, Piscataway, NJ; Prakash Rao, PhD, Mechanical Engineer ...

  1. HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL

    EPA Science Inventory

    An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

  2. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  3. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and easily extensible. A Python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  4. Deconvolution of dynamic mechanical networks

    E-print Network

    Michael Hinczewski; Yann von Hansen; Roland R. Netz

    2011-07-13

    Time-resolved single-molecule biophysical experiments yield data that contain a wealth of dynamic information, in addition to the equilibrium distributions derived from histograms of the time series. In typical force spectroscopic setups the molecule is connected via linkers to a read-out device, forming a mechanically coupled dynamic network. Deconvolution of equilibrium distributions, filtering out the influence of the linkers, is a straightforward and common practice. We have developed an analogous dynamic deconvolution theory for the more challenging task of extracting kinetic properties of individual components in networks of arbitrary complexity and topology. Our method determines the intrinsic linear response functions of a given molecule in the network, describing the power spectrum of conformational fluctuations. The practicality of our approach is demonstrated for the particular case of a protein linked via DNA handles to two optically trapped beads at constant stretching force, which we mimic through Brownian dynamics simulations. Each well in the protein free energy landscape (corresponding to folded, unfolded, or possibly intermediate states) will have its own characteristic equilibrium fluctuations. The associated linear response function is rich in physical content, since it depends both on the shape of the well and its diffusivity, a measure of the internal friction arising from processes such as the transient breaking and reformation of bonds in the protein structure. Starting from the autocorrelation functions of the equilibrium bead fluctuations measured in this force clamp setup, we show how an experimentalist can accurately extract the state-dependent protein diffusivity using a straightforward two-step procedure.
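
    The starting quantities of such a procedure, the autocorrelation function and power spectrum of an equilibrium time series, are easy to compute; the sketch below does so for a synthetic Ornstein-Uhlenbeck-like trajectory, which only stands in for the simulated bead fluctuations.

      # Autocorrelation and periodogram of a synthetic equilibrium trajectory.
      import numpy as np

      rng = np.random.default_rng(1)
      dt, n, tau = 1e-3, 50_000, 0.05
      x = np.empty(n)
      x[0] = 0.0
      for i in range(1, n):                        # crude Ornstein-Uhlenbeck integration
          x[i] = x[i-1] - (x[i-1] / tau) * dt + np.sqrt(2.0 * dt / tau) * rng.standard_normal()

      x -= x.mean()
      acf = np.correlate(x, x, mode="full")[n-1:] / np.arange(n, 0, -1)   # unbiased autocorrelation
      power = np.abs(np.fft.rfft(x))**2 * dt / n                          # one-sided periodogram

      tau_est = acf[:2000].sum() * dt / acf[0]     # integral correlation time
      print("estimated correlation time:", round(tau_est, 4), "(input:", tau, ")")
      print("low-frequency spectral level:", power[1:10].mean())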

  5. JAVA based LCD Reconstruction and Analysis Tools

    SciTech Connect

    Bower, G.

    2004-10-11

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  6. Understanding and evaluating blind deconvolution algorithms

    E-print Network

    Freeman, William

    2009-03-31

    Blind deconvolution is the recovery of a sharp version of a blurred image when the blur kernel is unknown. Recent algorithms have afforded dramatic progress, yet many aspects of the problem remain challenging and hard to ...

  8. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  9. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
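
    The kind of screening such a tool automates can be illustrated in a few lines: given Monte Carlo inputs and a pass/fail flag, rank the inputs by how strongly they separate failed from passing runs. This is only an illustrative proxy with synthetic data, not TRAM's actual algorithm.

      # Rank Monte Carlo input variables by a standardized mean difference
      # between failed and passing runs; data and failure rule are invented.
      import numpy as np

      rng = np.random.default_rng(2)
      n_runs = 5000
      inputs = {name: rng.normal(size=n_runs) for name in ("mass", "cd", "wind", "cg_offset")}
      failed = (0.8 * inputs["wind"] + 0.6 * inputs["cg_offset"]
                + 0.2 * rng.normal(size=n_runs)) > 1.5        # hypothetical failure criterion

      scores = {}
      for name, values in inputs.items():
          diff = values[failed].mean() - values[~failed].mean()
          scores[name] = abs(diff) / values.std()

      for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{name:10s} {score:.3f}")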

  10. Extrinsic Geometrical Methods for Neural Blind Deconvolution

    NASA Astrophysics Data System (ADS)

    Fiori, Simone

    2006-11-01

    The present contribution discusses a Riemannian-gradient-based algorithm and a projection-based learning algorithm over a curved parameter space for single-neuron learning. We consider the 'blind deconvolution' signal processing problem. The learning rule naturally arises from a criterion-function minimization over the unitary hyper-sphere setting. We consider the blind deconvolution performances of the two algorithms as well as their computational burden and numerical features.

  11. Minimum entropy deconvolution and blind equalisation

    NASA Technical Reports Server (NTRS)

    Satorius, E. H.; Mulligan, J. J.

    1992-01-01

    Relationships between minimum entropy deconvolution, developed primarily for geophysics applications, and blind equalization are pointed out. It is seen that a large class of existing blind equalization algorithms are directly related to the scale-invariant cost functions used in minimum entropy deconvolution. Thus the extensive analyses of these cost functions can be directly applied to blind equalization, including the important asymptotic results of Donoho.
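
    The scale-invariant objectives referred to here can be made concrete with the varimax (normalized fourth-moment) norm used in minimum entropy deconvolution; the sketch below evaluates it on an impulsive signal and on a blurred version, using an invented wavelet.

      # Varimax norm, a scale-invariant MED-style objective that favors
      # impulsive outputs; signal and wavelet are illustrative only.
      import numpy as np

      def varimax_norm(y):
          y = np.asarray(y, dtype=float)
          return np.sum(y**4) / np.sum(y**2)**2    # invariant under y -> c*y

      rng = np.random.default_rng(3)
      spikes = np.zeros(512)
      spikes[rng.integers(0, 512, size=8)] = 5.0 * rng.normal(size=8)      # sparse reflectivity
      wavelet = np.exp(-np.arange(32) / 6.0) * np.cos(np.arange(32) / 2.0) # assumed blurring wavelet
      blurred = np.convolve(spikes, wavelet, mode="same")

      print("varimax(spikes) :", varimax_norm(spikes))   # larger: more impulsive
      print("varimax(blurred):", varimax_norm(blurred))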

  12. A 3D image analysis tool for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.

    2005-04-01

    We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.
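
    The intensity-based thresholding used for the semi-automated analysis reduces, at its core, to picking a cutoff and counting voxels; the sketch below applies a percentage-of-maximum threshold to a synthetic volume and converts the voxel count to milliliters. Voxel size and threshold fraction are assumed values, not those of the scintigraphic study.

      # Percentage-of-maximum thresholding and volume estimate on a synthetic
      # 3D image; voxel dimensions and the 40% threshold are assumptions.
      import numpy as np

      voxel_size_mm = (4.0, 4.0, 4.0)
      z, y, x = np.mgrid[0:64, 0:64, 0:64]
      image = np.exp(-(((x - 32)**2 + (y - 32)**2 + (z - 32)**2) / (2 * 10.0**2)))
      image += 0.02 * np.random.default_rng(4).standard_normal(image.shape)

      mask = image >= 0.4 * image.max()
      voxel_volume_ml = np.prod(voxel_size_mm) / 1000.0        # mm^3 -> mL
      print("segmented volume: %.1f mL" % (mask.sum() * voxel_volume_ml))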

  13. AstroStat: Statistical analysis tool

    NASA Astrophysics Data System (ADS)

    VO-India Team

    2015-07-01

    AstroStat performs statistical analysis on data and is compatible with Virtual Observatory (VO) standards. It accepts data in a variety of formats and performs various statistical tests using a menu driven interface. Analyses, performed in R, include exploratory tests, visualizations, distribution fitting, correlation and causation, hypothesis testing, multivariate analysis and clustering. AstroStat is available in two versions with an identical interface and features: as a web service that can be run using any standard browser and as an offline application.

  14. Increasing axial resolution of 3D data sets using deconvolution algorithms.

    PubMed

    Topor, P; Zimanyi, M; Mateasik, A

    2011-09-01

    Deconvolution algorithms are tools for the restoration of data degraded by blur and noise. Incorporating regularization functions into the iterative form of reconstruction algorithms can improve the restoration performance and characteristics (e.g. noise and artefact handling). In this study, algorithms based on the Richardson-Lucy deconvolution algorithm are tested. The ability of these algorithms to improve the axial resolution of three-dimensional data sets is evaluated on model synthetic data. Finally, the unregularized Richardson-Lucy algorithm is selected for the evaluation and reconstruction of three-dimensional chromosomal data sets of Drosophila melanogaster. Problems concerning the reconstruction process are discussed and further improvements are proposed. PMID:21599665
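
    For reference, the unregularized Richardson-Lucy iteration selected in this study has a very compact form; the 1-D sketch below recovers two closely spaced peaks from a Gaussian blur. Sizes, widths, and iteration count are illustrative, and production work on 3-D stacks would use a measured PSF (or an existing implementation such as scikit-image's).

      # Unregularized 1-D Richardson-Lucy deconvolution on synthetic data.
      import numpy as np

      def richardson_lucy(observed, psf, iterations=200, eps=1e-12):
          psf = psf / psf.sum()
          psf_mirror = psf[::-1]
          estimate = np.full_like(observed, observed.mean())
          for _ in range(iterations):
              blurred = np.convolve(estimate, psf, mode="same")
              ratio = observed / (blurred + eps)
              estimate *= np.convolve(ratio, psf_mirror, mode="same")
          return estimate

      x = np.linspace(-1, 1, 201)
      truth = np.exp(-((x + 0.15) / 0.03)**2) + np.exp(-((x - 0.15) / 0.03)**2)
      psf = np.exp(-(x / 0.12)**2)
      observed = np.convolve(truth, psf / psf.sum(), mode="same")
      restored = richardson_lucy(observed, psf)

      mid = len(x) // 2
      print("valley/peak before:", observed[mid] / observed.max())
      print("valley/peak after :", restored[mid] / restored.max())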

  15. Deconvolution of immittance data: some old and new methods

    SciTech Connect

    Tuncer, Enis; Macdonald, Ross J.

    2007-01-01

    The background and history of various deconvolution approaches are briefly summarized; different methods are compared; and available computational resources are described. These underutilized data analysis methods are valuable in both electrochemistry and immittance spectroscopy areas, and freely available computer programs are cited that provide an automatic test of the appropriateness of Kronig-Kramers transforms, a powerful nonlinear-least-squares inversion method, and a new Monte-Carlo inversion method. The important distinction, usually ignored, between discrete-point distributions and continuous ones is emphasized, and both recent parametric and non-parametric deconvolution/inversion procedures for frequency-response data are discussed and compared. Information missing in a recent parametric measurement-model deconvolution approach is pointed out and remedied, and its priority evaluated. Comparisons are presented between the standard parametric least squares inversion method and a new non-parametric Monte Carlo one that allows complicated composite distributions of relaxation times (DRT) to be accurately estimated without the uncertainty present with regularization methods. Also, detailed Monte-Carlo DRT estimates for the supercooled liquid 0.4Ca(NO3)2-0.6KNO3 (CKN) at 350 K are compared with appropriate frequency-response-model fit results. These composite models were derived from stretched-exponential Kohlrausch temporal response with the inclusion of either of two different series electrode-polarization functions.

  16. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    SciTech Connect

    Bush, B.; Penev, M.; Melaina, M.; Zuboy, J.

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  17. Development of a climate data analysis tool (CDAT)

    SciTech Connect

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  18. A Semi-Automated Functional Test Data Analysis Tool

    SciTech Connect

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

  19. SOFAST: Sandia Optical Fringe Analysis Slope Tool

    Energy Science and Technology Software Center (ESTSC)

    2012-09-13

    SOFAST is used to characterize the surface slope of reflective mirrors for solar applications. SOFAST uses a large monitor or projection screen to display fringe patterns, and a machine vision camera to image the reflection of these patterns in the subject mirror. From these images, a detailed map of surface normals can be generated and compared to design or fitted mirror shapes. SOFAST uses standard Fringe Reflection (Deflectometry) approaches to measure the mirror surface normals. SOFAST uses an extrinsic analysis of key points on the facet to locate the camera and monitor relative to the facet coordinate system. It then refines this position based on the measured surface slope and integrated shape of the mirror facet. The facet is placed into a reference frame such that key points on the facet match the design facet in orientation and position.
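
    The geometric relation at the heart of fringe reflection is simple: at each mirror point the surface normal bisects the directions toward the camera and toward the screen pixel identified from the fringe phase. The sketch below evaluates that relation for one made-up point; the positions are assumptions, not SOFAST's calibration data.

      # Surface normal and slopes from one deflectometry correspondence;
      # camera, screen, and surface positions are invented for illustration.
      import numpy as np

      def unit(v):
          v = np.asarray(v, dtype=float)
          return v / np.linalg.norm(v)

      surface_point = np.array([0.0, 0.0, 0.0])
      camera_pos    = np.array([0.2, 0.0, 2.0])    # assumed camera location (m)
      screen_pixel  = np.array([-0.5, 0.1, 2.5])   # screen point decoded from fringe phase

      normal = unit(unit(camera_pos - surface_point) + unit(screen_pixel - surface_point))
      slope_x, slope_y = -normal[0] / normal[2], -normal[1] / normal[2]
      print("surface normal:", normal, " slopes:", slope_x, slope_y)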

  20. Blind source deconvolution for deep Earth seismology

    NASA Astrophysics Data System (ADS)

    Stefan, W.; Renaut, R.; Garnero, E. J.; Lay, T.

    2007-12-01

    We present an approach to automatically estimate an empirical source characterization of deep earthquakes recorded teleseismically and subsequently remove the source from the recordings by applying regularized deconvolution. A principal goal in this work is to effectively deblur the seismograms, resulting in more impulsive and narrower pulses, permitting better constraints in high resolution waveform analyses. Our method consists of two stages: (1) we first estimate the empirical source by automatically registering traces to their 1st principal component with a weighting scheme based on their deviation from this shape; we then use this shape as an estimate of the earthquake source. (2) We compare different deconvolution techniques to remove the source characteristic from the trace. In particular, Total Variation (TV) regularized deconvolution is used, which utilizes the fact that most natural signals have an underlying sparseness in an appropriate basis, in this case, impulsive onsets of seismic arrivals. We show several examples of deep focus Fiji-Tonga region earthquakes for the phases S and ScS, comparing source responses for the separate phases. TV deconvolution is compared to water-level deconvolution, Tikhonov deconvolution, and L1 norm deconvolution, for both data and synthetics. This approach significantly improves our ability to study subtle waveform features that are commonly masked by either noise or the earthquake source. Eliminating source complexities improves our ability to resolve deep mantle triplications, waveform complexities associated with possible double crossings of the post-perovskite phase transition, as well as increasing stability in waveform analyses used for deep mantle anisotropy measurements.
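
    Of the methods compared, water-level deconvolution is the easiest to sketch: divide spectra but floor the denominator at a fraction of its maximum. The snippet below applies it to a synthetic trace built from an invented source wavelet and a few spikes; it is a generic illustration, not the authors' processing chain.

      # Water-level (spectral division) deconvolution on synthetic seismic data.
      import numpy as np

      def water_level_deconv(trace, source, level=0.01):
          n = len(trace)
          S = np.fft.rfft(source, n)
          T = np.fft.rfft(trace, n)
          denom = np.abs(S)**2
          denom = np.maximum(denom, level * denom.max())   # the "water level" floor
          return np.fft.irfft(T * np.conj(S) / denom, n)

      rng = np.random.default_rng(5)
      source = np.exp(-np.arange(100) / 10.0) * np.sin(np.arange(100) / 3.0)  # assumed wavelet
      impulses = np.zeros(1024)
      impulses[[200, 230, 610]] = [1.0, -0.6, 0.4]          # nominal arrivals
      trace = np.convolve(impulses, source)[:1024] + 0.005 * rng.standard_normal(1024)

      recovered = water_level_deconv(trace, source, level=0.01)
      print("three largest recovered samples:", np.argsort(-np.abs(recovered))[:3])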

  1. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  2. Calibration of Wide-Field Deconvolution Microscopy for Quantitative Fluorescence Imaging

    PubMed Central

    Lee, Ji-Sook; Wee, Tse-Luen (Erika); Brown, Claire M.

    2014-01-01

    Deconvolution enhances contrast in fluorescence microscopy images, especially in low-contrast, high-background wide-field microscope images, improving characterization of features within the sample. Deconvolution can also be combined with other imaging modalities, such as confocal microscopy, and most software programs seek to improve resolution as well as contrast. Quantitative image analyses require instrument calibration and with deconvolution, necessitate that this process itself preserves the relative quantitative relationships between fluorescence intensities. To ensure that the quantitative nature of the data remains unaltered, deconvolution algorithms need to be tested thoroughly. This study investigated whether the deconvolution algorithms in AutoQuant X3 preserve relative quantitative intensity data. InSpeck Green calibration microspheres were prepared for imaging, z-stacks were collected using a wide-field microscope, and the images were deconvolved using the iterative deconvolution algorithms with default settings. Afterwards, the mean intensities and volumes of microspheres in the original and the deconvolved images were measured. Deconvolved data sets showed higher average microsphere intensities and smaller volumes than the original wide-field data sets. In original and deconvolved data sets, intensity means showed linear relationships with the relative microsphere intensities given by the manufacturer. Importantly, upon normalization, the trend lines were found to have similar slopes. In original and deconvolved images, the volumes of the microspheres were quite uniform for all relative microsphere intensities. We were able to show that AutoQuant X3 deconvolution software data are quantitative. In general, the protocol presented can be used to calibrate any fluorescence microscope or image processing and analysis procedure. PMID:24688321
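
    The linearity check described here amounts to fitting measured mean intensities against the nominal relative intensities before and after deconvolution, normalizing, and comparing slopes. The sketch below does exactly that with invented numbers standing in for the microsphere measurements.

      # Compare normalized calibration slopes for original and deconvolved data;
      # all intensity values here are illustrative placeholders.
      import numpy as np

      relative = np.array([0.3, 1.0, 3.0, 10.0, 30.0, 100.0])       # nominal relative intensities
      raw_means = np.array([12.0, 35.0, 110.0, 345.0, 1050.0, 3400.0])
      deconv_means = np.array([30.0, 95.0, 290.0, 960.0, 2900.0, 9500.0])

      def normalized_slope(y, x):
          slope, _ = np.polyfit(x / x.max(), y / y.max(), 1)
          return slope

      print("original slope   :", normalized_slope(raw_means, relative))
      print("deconvolved slope:", normalized_slope(deconv_means, relative))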

  3. Holographic deconvolution microscopy for high-resolution particle tracking

    E-print Network

    Grier, David

    Holographic deconvolution microscopy for high-resolution particle tracking (Lisa Dixon and co-authors). Related reference: "Depth-resolved holographic reconstructions by three-dimensional deconvolution," Opt. Express 21.

  4. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites, researchers can achieve a systems-level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered, whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review, the functionality and use of these software tools are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685
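
    Enrichment analysis of this kind usually reduces to an over-representation test; the sketch below runs a hypergeometric test for one pathway, with made-up counts standing in for a real metabolite list and pathway annotation.

      # Hypergeometric over-representation test for one pathway (counts invented).
      from scipy.stats import hypergeom

      n_measured = 180        # metabolites measured in the experiment
      n_altered = 25          # metabolites flagged as significantly altered
      pathway_size = 12       # measured metabolites annotated to this pathway
      pathway_hits = 6        # altered metabolites that fall in the pathway

      # P(X >= pathway_hits) when drawing n_altered metabolites without replacement
      p_value = hypergeom.sf(pathway_hits - 1, n_measured, pathway_size, n_altered)
      print(f"enrichment p-value: {p_value:.4g}")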

  5. Parallel Analysis Tools for Ultra-Large Climate Data Sets

    NASA Astrophysics Data System (ADS)

    Jacob, Robert; Krishna, Jayesh; Xu, Xiabing; Mickelson, Sheri; Wilde, Mike; Peterson, Kara; Bochev, Pavel; Latham, Robert; Tautges, Tim; Brown, David; Brownrigg, Richard; Haley, Mary; Shea, Dennis; Huang, Wei; Middleton, Don; Schuchardt, Karen; Yin, Jian

    2013-04-01

    While climate models have used parallelism for several years, the post-processing tools are still mostly single-threaded applications and many are closed source. These tools are becoming a bottleneck in the production of new climate knowledge when they confront terabyte-sized output from high-resolution climate models. The ParVis project is using and creating Free and Open Source tools that bring data and task parallelism to climate model analysis to enable analysis of large climate data sets. ParVis is using the Swift task-parallel language to implement a diagnostic suite that generates over 600 plots of atmospheric quantities. ParVis has also created a Parallel Gridded Analysis Library (ParGAL) which implements many common climate analysis operations in a data-parallel fashion using the Message Passing Interface. ParGAL has in turn been built on sophisticated packages for describing grids in parallel (the Mesh Oriented database, MOAB), performing vector operations on arbitrary grids (Intrepid), and reading data in parallel (PnetCDF). ParGAL is being used to implement a parallel version of the NCAR Command Language (NCL) called ParNCL. ParNCL/ParGAL not only speeds up analysis of large datasets but also allows operations to be performed on native grids, eliminating the need to transform data to latitude-longitude grids. All of the tools ParVis is creating are available as free and open source software.

  6. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) Program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification. 2) Model-checking tools for concurrent object-oriented software that achieve memorability through synergy with program abstraction and static analysis tools.

  7. Proteomic Tools for the Analysis of Cytoskeleton Proteins.

    PubMed

    Scarpati, Michael; Heavner, Mary Ellen; Wiech, Eliza; Singh, Shaneen

    2016-01-01

    Proteomic analyses have become an essential part of the toolkit of the molecular biologist, given the widespread availability of genomic data and open source or freely accessible bioinformatics software. Tools are available for detecting homologous sequences, recognizing functional domains, and modeling the three-dimensional structure for any given protein sequence. Although a wealth of structural and functional information is available for a large number of cytoskeletal proteins, with representatives spanning all of the major subfamilies, the majority of cytoskeletal proteins remain partially or totally uncharacterized. Moreover, bioinformatics tools provide a means for studying the effects of synthetic mutations or naturally occurring variants of these cytoskeletal proteins. This chapter discusses various freely available proteomic analysis tools, with a focus on in silico prediction of protein structure and function. The selected tools are notable for providing an easily accessible interface for the novice, while retaining advanced functionality for more experienced computational biologists. PMID:26498799

  8. Millennial-scale system impulse response of polar climates - deconvolution results between δ18O records from Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Reischmann, E.; Yang, X.; Rial, J. A.

    2013-12-01

    Deconvolution has long been used in science to recover real input given a system's impulse response and output. In this study, we applied spectral division deconvolution to selected polar δ18O time series to investigate the possible relationship between the climates of the Polar Regions, i.e., the equivalent of a climate system's 'impulse response.' While the records may be the result of nonlinear processes, deconvolution remains an appropriate tool because the two polar climates are synchronized, forming a Hilbert transform pair. In order to compare records, the age models of three Greenland and four Antarctica records have been matched via a Monte Carlo method using the methane-matched pair GRIP and BYRD as a basis for the calculations. For all twelve polar pairs, various deconvolution schemes (Wiener, damped least squares, Tikhonov, Kalman filter) give consistent, quasi-periodic impulse responses of the system. Multitaper analysis reveals strong, millennial-scale, quasi-periodic oscillations in these system responses with a range of 2,500 to 1,000 years. These are not symmetric, as the transfer function from north to south differs from that of south to north. However, the difference is systematic and occurs in the predominant period of the deconvolved signals. Specifically, the north-to-south transfer function is generally of longer period than the south-to-north transfer function. High-amplitude power peaks at 5.0 ky to 1.7 ky characterize the former, while the latter contains peaks at mostly shorter periods, in the range of 2.5 ky to 1.0 ky. Consistent with many observations, the deconvolved, quasi-periodic transfer functions share the predominant periodicities found in the data, some of which are likely related to solar forcing (2.5-1.0 ky), while some are probably indicative of the internal oscillations of the climate system (1.6-1.4 ky). The approximately 1.5 ky periodicity of the transfer functions may represent the internal periodicity of the system, perhaps even related to the periodicity of the thermohaline circulation (THC). Simplified models of the polar climate fluctuations are shown to support these findings.
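
    As a rough sketch of the spectral-division idea behind the Wiener and damped-least-squares schemes named above (not the authors' code; the damping level and the synthetic series are assumptions of the example), the transfer function can be estimated in the frequency domain and inverse-transformed to an impulse response:

      import numpy as np

      def spectral_division(x, y, damping=0.01):
          """Estimate h such that y is approximately the convolution of h with x."""
          X = np.fft.rfft(x)
          Y = np.fft.rfft(y)
          power = np.abs(X) ** 2
          eps = damping * power.max()          # damping level relative to peak power
          H = Y * np.conj(X) / (power + eps)   # damped least-squares spectral division
          return np.fft.irfft(H, n=len(x))

      # Synthetic check: series standing in for a pair of matched delta-18O records.
      rng = np.random.default_rng(0)
      x = rng.standard_normal(2048)
      true_h = np.exp(-np.arange(50) / 10.0)
      y = np.convolve(x, true_h)[: len(x)]
      h_est = spectral_division(x, y)          # should resemble true_h at early lags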

  9. Separation analysis, a tool for analyzing multigrid algorithms

    NASA Technical Reports Server (NTRS)

    Costiner, Sorin; Taasan, Shlomo

    1995-01-01

    The separation of vectors by multigrid (MG) algorithms is applied to the study of convergence and to the prediction of the performance of MG algorithms. The separation operator for a two-level cycle algorithm is derived. It is used to analyze the efficiency of the cycle when mixing of eigenvectors occurs. In particular cases the separation analysis reduces to Fourier-type analysis. The separation operator of a two-level cycle for a Schrödinger eigenvalue problem is derived and analyzed in a Fourier basis. Separation analysis gives information on how to choose relaxations and inter-level transfers for good performance. Separation analysis is a tool for analyzing and designing algorithms, and for optimizing their performance.

  10. Deconvolution of Thomson scattering temperature profiles

    SciTech Connect

    Scannell, R.; Beurskens, M.; Carolan, P. G.; Kirk, A.; Walsh, M.; Osborne, T. H.

    2011-05-15

    Deconvolution of Thomson scattering (TS) profiles is required when the gradient length of the electron temperature (T_e) or density (n_e) is comparable to the instrument function length (Δ_R). The most correct method of deconvolution to obtain the underlying T_e and n_e profiles is by consideration of the scattered signals. However, deconvolution at the scattered-signal level is complex since it requires knowledge of all spectral and absolute calibration data. In this paper a simple technique is presented where only knowledge of the instrument function I(r) and the measured profiles, T_e,observed(r) and n_e,observed(r), is required to obtain the underlying T_e(r) and n_e(r). This method is appropriate for most TS systems and is particularly important where high spatial sampling is obtained relative to Δ_R.
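
    A generic way to recover an underlying profile given only the measured profile and the instrument function is iterative deconvolution; the Richardson-Lucy sketch below is a stand-in for illustration only and is not the specific weighting scheme derived in the paper.

      import numpy as np

      def richardson_lucy(observed, instrument, n_iter=50):
          """observed, instrument: 1-D arrays on the same radial grid; observed >= 0."""
          instrument = instrument / instrument.sum()
          flipped = instrument[::-1]
          estimate = np.full_like(observed, observed.mean(), dtype=float)
          for _ in range(n_iter):
              blurred = np.convolve(estimate, instrument, mode="same")
              ratio = observed / np.maximum(blurred, 1e-12)
              estimate = estimate * np.convolve(ratio, flipped, mode="same")
          return estimate

      # e.g. deconvolve a Gaussian instrument function of width Delta_R from an
      # observed T_e profile sampled on the same radial grid (arrays supplied by the user).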

  11. Wave scattering deconvolution by seismic inversion

    SciTech Connect

    Sarwar, A.K.M.; Smith, D.L.

    1987-06-01

    The authors propose a wave scattering approach to the problem of deconvolution by the inversion of the reflection seismogram. They study the full wave solution of the one-dimensional wave equation for deconvolution. Both the reflectivity and the section multiple train can be predicted from the boundary data (the reflection seismogram). This is in contrast to the usual statistical approach in which reflectivity is unpredictable and random, and the section multiple train is the only predictable component of the seismogram. The proposed scattering approach also differs from Claerbout's method based on the Kunetz equation. The computer algorithm recursively solves for the pressure and particle velocity response and the impedance log. The method accomplishes deconvolution and impedance log reconstruction. The authors tested the method by computer model experiments and obtained satisfactory results using noise-free synthetic data. Further study is recommended for the method's application to real data.
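
    One small, standard piece of such a layered-medium calculation is reconstructing an impedance log from a recovered reflection-coefficient series; the sketch below illustrates only that step (the starting impedance value is an assumption), not the paper's full recursion on pressure and particle-velocity responses.

      import numpy as np

      def impedance_from_reflectivity(r, z0=1.5e6):
          """r: reflection coefficients per interface; z0: assumed impedance of the top layer."""
          z = np.empty(len(r) + 1)
          z[0] = z0
          for k, rk in enumerate(r):
              # r_k = (z[k+1] - z[k]) / (z[k+1] + z[k])  =>  z[k+1] = z[k] * (1 + r_k) / (1 - r_k)
              z[k + 1] = z[k] * (1.0 + rk) / (1.0 - rk)
          return z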

  12. A novel nonstationary deconvolution method based on spectral modeling and variable-step sampling hyperbolic smoothing

    NASA Astrophysics Data System (ADS)

    Li, Fang; Wang, Shoudong; Chen, Xiaohong; Liu, Guochang; Zheng, Qiang

    2014-04-01

    Deconvolution is an important seismic processing tool for improving resolution. One of the key assumptions made in most deconvolution methods is that the seismic data are stationary. However, due to anelastic absorption, seismic data are usually nonstationary. In this paper, a novel nonstationary deconvolution approach is proposed based on spectral modeling and variable-step sampling (VSS) hyperbolic smoothing. Firstly, we apply the Gabor transform to perform a time-frequency decomposition of the nonstationary seismic trace. Secondly, we estimate the source wavelet amplitude spectrum by spectral modeling. Thirdly, smoothing the Gabor magnitude spectrum of the seismic data along hyperbolic paths with VSS yields the magnitude of the attenuation function and also eliminates the effect of the source wavelet. Fourthly, by assuming that the source wavelet and attenuation function are minimum phase, their phases can be determined by the Hilbert transform. Finally, the two estimated factors are removed by dividing the Gabor spectrum of the trace by them, giving an estimate of the Gabor spectrum of the reflectivity. An inverse Gabor transform gives the time-domain reflectivity estimate. Tests on synthetic and field data show that the presented method is an effective tool that not only has the advantages of stationary deconvolution, but also compensates for energy absorption, without knowing or estimating the quality factor Q.
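
    The minimum-phase step can be illustrated with the standard Hilbert-transform relation between log-amplitude and phase; the sketch below (using NumPy and SciPy, and assuming the amplitude spectrum is sampled on a full periodic frequency grid) is a generic illustration, not the authors' implementation.

      import numpy as np
      from scipy.signal import hilbert

      def minimum_phase_spectrum(amp, eps=1e-8):
          """amp: amplitude spectrum sampled on a full (periodic) frequency grid."""
          log_amp = np.log(amp + eps)
          phase = -np.imag(hilbert(log_amp))   # Hilbert transform of the log-amplitude
          return amp * np.exp(1j * phase)      # complex minimum-phase spectrum

      # Wavelet and attenuation spectra built this way can then be divided out of the
      # Gabor spectrum of the trace to leave an estimate of the reflectivity spectrum.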

  13. Development of High Performance Fluxomics Tools for Microbial Metabolism Analysis

    E-print Network

    Subramanian, Venkat

    Development of High Performance Fluxomics Tools for Microbial Metabolism Analysis. Xueyang Feng. [Presentation fragments: metabolic flux as the flow of molecules; fluxomics relates genes, proteins, and metabolites; metabolic flux analysis as the most reliable description of cell metabolism.]

  14. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  15. Football analysis using spatio-temporal tools Joachim Gudmundsson

    E-print Network

    Wolle, Thomas

    Football analysis using spatio-temporal tools. Joachim Gudmundsson (University of Sydney and NICTA, Australia) and Thomas Wolle (thomas.wolle@gmail.com). Analysing a football match is without doubt an important task, specifically for analysing the performance of football players and teams. The aim, functionality ...

  16. ATACOBOL - A COBOL Test Coverage Analysis Tool and Its Applications

    E-print Network

    Lyu, Michael R.

    ATACOBOL - A COBOL Test Coverage Analysis Tool and Its Applications. Sam K.S. Sze, Hong Kong. ... to the actual testing semantics required by Y2K-compliance software testing ... the mainframe environment ... Year 2000 testing in the banking business. It illustrates how the all-uses criteria can be a stronger ...

  17. A FORMAL LANGUAGE AND ANALYSIS TOOL FOR BLACK BOX SPECIFICATIONS

    E-print Network

    Ferrer, Gabriel J.

    input and the entire history of interactions it has had with the environment. We have observed scenario. The analysis tool is also able to generate an animation of the specified software artifact. This animation is a GUI with a button for each stimulus. It shows both the current response and the entire list

  18. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  19. Generalized Aliasing as a Basis for Program Analysis Tools

    E-print Network

    Generalized Aliasing as a Basis for Program Analysis Tools. Robert O'Callahan, November 2000, CMU. Copyright 2001, Robert O'Callahan. [Thesis front matter: committee including co-chair Daniel Jackson, with Frank Pfenning and Craig Chambers; acknowledgments to advisors Daniel Jackson and Jeannette Wing.]

  20. Generalized Aliasing as a Basis for Program Analysis Tools

    E-print Network

    Generalized Aliasing as a Basis for Program Analysis Tools. Robert O'Callahan, November 2000, CMU. Copyright 2001, Robert O'Callahan. [Thesis front matter: committee including co-chair Daniel Jackson, with Frank Pfenning and Craig Chambers; acknowledgments to advisors Daniel Jackson and Jeannette Wing.]

  1. Boston University, Biomedical Forensic Sciences DNA Mixture Analysis Training Tool

    E-print Network

    Boston University, Biomedical Forensic Sciences: DNA Mixture Analysis Training Tool. NIJ Conference, 2012. Funded by the NIJ Forensic Science Training Development and Delivery Program, NIJ Grant # 2008-DN-BX-K158, awarded to the Biomedical Forensic Science Program at Boston University School of Medicine.

  2. An Automated Data Analysis Tool for Livestock Market Data

    ERIC Educational Resources Information Center

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  3. Effect of Static Analysis Tools on Software Security: Preliminary Investigation

    E-print Network

    Black, Paul E.

    Vadim Okun, W. F. Guthrie, Romain Gaucher, Paul E. Black. National Institute of Standards and Technology, Gaithersburg, MD 20899, USA ({vadim.okun, will.guthrie, romain.gaucher, paul.black}@nist.gov). Static analysis is a complement to testing for discovering defects in source code. This paper is concerned with static analysis tools ...

  4. Sparse Component Analysis: a New Tool for Data Mining

    E-print Network

    Cichocki, Andrzej

    Sparse Component Analysis: a New Tool for Data Mining. Pando Georgiev, Fabian Theis, Andrzej Cichocki, et al. Summary: In many practical problems for data mining, the data X under consideration (given as an m × N ...). Keywords: separation, clustering. ... Data mining techniques can be divided into the following classes ...

  5. Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-07-01

    This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  6. The Adversarial Route Analysis Tool: A Web Application

    SciTech Connect

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  7. Self-Whitening Algorithms for Adaptive Equalization and Deconvolution

    E-print Network

    Cichocki, Andrzej

    Self-Whitening Algorithms for Adaptive Equalization and Deconvolution. Scott C. Douglas and Andrzej Cichocki, Japan. In equalization and deconvolution tasks, the correlated nature of the input signals ... for deconvolution and equalization tasks. Multichannel extensions of the techniques are also described.

  8. Extensions of the Justen-Ramlau blind deconvolution method

    E-print Network

    Reichel, Lothar

    Extensions of the Justen-Ramlau blind deconvolution method. Tristan A. Hearn and Lothar Reichel. Blind deconvolution problems arise in many image restoration applications. Most available blind deconvolution methods are iterative. Recently, Justen and Ramlau proposed a novel non-iterative blind ...

  9. Blind Deconvolution via Lower-Bounded Logarithmic Image Priors

    E-print Network

    Sola, Rolf Haenni

    Blind Deconvolution via Lower-Bounded Logarithmic Image Priors. Daniele Perrone, Remo Diethelm, et al. ... a blind deconvolution problem where there are only two energy terms: a least-squares term for the data fidelity ... this energy formulation is sufficient to achieve the state of the art in blind deconvolution with a good ...

  10. Parallel deconvolution methods for three dimensional image restoration

    E-print Network

    Reichel, Lothar

    Parallel deconvolution methods for three dimensional image restoration. B. Lewis and L. Reichel. Restoration by deconvolution of three-dimensional images that have been contaminated ... implementations of iterative methods for image deconvolution on a distributed-memory computing cluster.

  11. FRAMELET BASED DECONVOLUTION JIAN-FENG CAI AND ZUOWEI SHEN

    E-print Network

    Shen, Zuowei

    FRAMELET BASED DECONVOLUTION. JIAN-FENG CAI AND ZUOWEI SHEN. In this paper, two framelet-based deconvolution algorithms are proposed. The basic idea of the framelet-based approach is to convert the deconvolution problem to the problem of inpainting in a frame domain by constructing a framelet system with one ...

  12. Shearlet-TGV Based Fluorescence Microscopy Image Deconvolution

    E-print Network

    Ferguson, Thomas S.

    Shearlet-TGV Based Fluorescence Microscopy Image Deconvolution. Jing Qin, Xiyu Yi, Shimon Weiss. ... of superresolution imaging methodologies, various deconvolution algorithms have been applied to fluorescence ... blind or prior-guided blind deconvolution algorithms. In this paper, we propose a novel regularization-based ...

  13. Gradient Adaptive Algorithms for Contrast-Based Blind Deconvolution

    E-print Network

    Douglas, Scott C.

    Gradient Adaptive Algorithms for Contrast-Based Blind Deconvolution. Scott C. Douglas et al. ... to the blind deconvolution task. Of particular importance in these extensions are the constraints placed on the deconvolution system transfer function. While unit-norm constrained ICA approaches can be directly applied ...

  14. QUASI-NEWTON FILTERED-ERROR AND FILTERED-REGRESSOR ALGORITHMS FOR ADAPTIVE EQUALIZATION AND DECONVOLUTION

    E-print Network

    Douglas, Scott C.

    S.C. Douglas, A. Cichocki, and S. Amari (Department of Electrical Engineering; RIKEN, Saitama 351-01, Japan). In equalization and deconvolution tasks, the correlated nature of the input ... quasi-Newton convergence locally about the optimum coefficient solution for deconvolution and equalization tasks.

  15. JOINT BLIND DECONVOLUTION AND SPECTRAL UNMIXING OF HYPERSPECTRAL IMAGES

    E-print Network

    Plemmons, Robert J.

    JOINT BLIND DECONVOLUTION AND SPECTRAL UNMIXING OF HYPERSPECTRAL IMAGES. Qiang Zhang et al. ... sensors can collect simultaneous images ranging from visible to LWIR. Multiframe blind deconvolution (MFBD) ... Among these techniques, blind deconvolution methods are often applied to jointly estimate both an object ...

  16. Fried deconvolution. Jérôme Gilles and Stanley Osher

    E-print Network

    Ferguson, Thomas S.

    Fried deconvolution. Jérôme Gilles and Stanley Osher, UCLA Department of Mathematics, 520 Portola ... the Fried kernel of the atmosphere modulation transfer function (MTF) and a framelet-based deconvolution ... simulated blur and real images. Keywords: blind image deconvolution, Fried kernel, atmospheric blur.

  17. Efficient Deconvolution and Super-Resolution Methods in Microwave Imagery

    E-print Network

    Ferguson, Thomas S.

    Efficient Deconvolution and Super-Resolution Methods in Microwave Imagery. Igor Yanovsky, Bjorn H. Lambrigtsen, Alan B. Tanner, and Luminita A. Vese. In this paper, we develop efficient deconvolution ... an efficient total variation minimization technique based on Split Bregman deconvolution that reduces image ...

  18. Self-Whitening Algorithms for Adaptive Equalization and Deconvolution

    E-print Network

    Douglas, Scott C.

    Self-Whitening Algorithms for Adaptive Equalization and Deconvolution. Scott C. Douglas, Andrzej Cichocki, et al., Saitama 351-01, Japan. In equalization and deconvolution tasks, the correlated nature ... coefficient solution for deconvolution and equalization tasks. Multichannel extensions of the techniques ...

  19. Parallel Deconvolution and Signal Compression using Adapted Wavelet Packet Bases

    E-print Network

    Guerrini, Carla

    Parallel Deconvolution and Signal Compression using Adapted Wavelet Packet Bases. Carla Guerrini et al. ... is proposed for the numerical deconvolution and compression of real signals and images. For the first problem, it is competitive with respect to the existing algorithms (deconvolution + compression) and, due to the very low ...

  20. CONVERGENCE OF THE ALTERNATING MINIMIZATION ALGORITHM FOR BLIND DECONVOLUTION

    E-print Network

    Ferguson, Thomas S.

    CONVERGENCE OF THE ALTERNATING MINIMIZATION ALGORITHM FOR BLIND DECONVOLUTION. TONY F. CHAN AND C.K. WONG. Blind deconvolution refers to the image processing task of restoring ... non-blind deconvolution problem. While the model is not convex and thus allows multiple solutions, we ...

  1. Efficient implementation of spatially-varying 3D ultrasound deconvolution

    E-print Network

    Kingsbury, Nick

    Efficient implementation of spatially-varying 3D ultrasound deconvolution. Henry Gomersall, David ... data block is formed from a set of individual B-scans. In these circumstances, non-blind deconvolution ... with distance from the transducer. These two facts make the deconvolution process time-consuming to implement ...

  2. Stochastic Deconvolution James Gregson Felix Heide Matthias Hullin Wolfgang Heidrich

    E-print Network

    Heidrich, Wolfgang

    Stochastic Deconvolution. James Gregson, Felix Heide, Matthias Hullin, Wolfgang Heidrich, The University of British Columbia. We present a novel stochastic framework for non-blind deconvolution based ... Deconvolution is straightforward to implement, produces state-of-the-art results and directly leads ...

  3. Dealing with Boundary Artifacts in MCMC-Based Deconvolution

    E-print Network

    Bardsley, John

    Dealing with Boundary Artifacts in MCMC-Based Deconvolution. Johnathan M. Bardsley et al. Many numerical methods for deconvolution problems are designed ... problem on an extended domain. Further, a Bayesian framework is constructed for the deconvolution, and we ...

  4. Target deconvolution techniques in modern phenotypic profiling Jiyoun Lee1

    E-print Network

    Bogyo, Matthew

    Target deconvolution techniques in modern phenotypic profiling. Jiyoun Lee and Matthew Bogyo. ... targets of active hits, also called 'target deconvolution', is an essential step for understanding ... and the reduced cost of whole-genome sequencing, have greatly improved the workflow of target deconvolution ...

  5. Fast Algorithms for Phase Diversity-Based Blind Deconvolution

    E-print Network

    Ferguson, Thomas S.

    Fast Algorithms for Phase Diversity-Based Blind Deconvolution. Curtis R. Vogel, Tony Chan, et al. ... that the method is remarkably robust and numerically efficient. Keywords: phase diversity, blind deconvolution, phase retrieval, quasi-Newton methods. ... Phase diversity-based blind deconvolution ...

  6. Discovery and New Frontiers Project Budget Analysis Tool

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses and compare mission scenarios to the current program budget, and to rapidly forecast the program's ability to meet its launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real-year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently-budgeted lines) could easily be adapted to other applications.
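
    A minimal sketch of the kind of roll-up described here, assuming illustrative phase splits and a flat inflation rate (neither taken from the actual tool), is:

      # Hypothetical roll-up of mission cost profiles to real-year program totals.
      def mission_profile(total_cost_fy, phases, start_year, inflation=0.03, base_year=2011):
          """phases: list of (fraction_of_total_cost, duration_in_years)."""
          yearly = {}
          year = start_year
          for fraction, duration in phases:
              per_year = total_cost_fy * fraction / duration
              for _ in range(duration):
                  # inflate the fixed-year amount to real-year dollars
                  real_year_cost = per_year * (1 + inflation) ** (year - base_year)
                  yearly[year] = yearly.get(year, 0.0) + real_year_cost
                  year += 1
          return yearly

      def program_rollup(missions):
          totals = {}
          for profile in missions:
              for year, cost in profile.items():
                  totals[year] = totals.get(year, 0.0) + cost
          return dict(sorted(totals.items()))

      # Example: two hypothetical missions compared against a program budget line.
      m1 = mission_profile(450.0, [(0.7, 4), (0.3, 3)], start_year=2012)
      m2 = mission_profile(600.0, [(0.8, 5), (0.2, 4)], start_year=2014)
      program = program_rollup([m1, m2])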

  7. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
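
    The combination of n-factor combinatorial variation with Monte Carlo filling of the remaining parameters can be sketched as follows; the parameter names, levels, and the pairwise (n = 2) choice are assumptions of the example, not the tool's configuration.

      import itertools, random

      # Hypothetical simulation parameters and candidate levels.
      params = {
          "mass":         [900.0, 1000.0, 1100.0],
          "thrust":       [0.9, 1.0, 1.1],
          "sensor_noise": [0.0, 0.01, 0.05],
          "latency_ms":   [5, 20, 50],
      }

      def pairwise_cases(params, rng=random.Random(0)):
          names = list(params)
          for a, b in itertools.combinations(names, 2):
              for va, vb in itertools.product(params[a], params[b]):
                  case = {n: rng.choice(params[n]) for n in names}  # Monte Carlo fill
                  case[a], case[b] = va, vb                         # pin the pair under test
                  yield case

      cases = list(pairwise_cases(params))
      print(len(cases), "test cases")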

  8. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  9. The Cube Analysis and Rendering Tool for Astronomy

    NASA Astrophysics Data System (ADS)

    Rosolowsky, E.; Kern, J.; Federl, P.; Jacobs, J.; Loveland, S.; Taylor, J.; Sivakoff, G.; Taylor, R.

    2015-09-01

    We present the design principles and current status of the Cube Analysis and Rendering Tool for Astronomy (CARTA). The CARTA project is designing a cube visualization tool for the Atacama Large Millimeter/submillimeter Array. CARTA will join the domain-specific software already developed for millimetre-wave interferometry with a server-side visualization solution. This connection will enable archive-hosted exploration of three-dimensional data cubes. CARTA will also provide a desktop client that is indistinguishable from the web-based version. While such a goal is ambitious for a short project, the team is focusing on a well-developed framework which can readily accommodate community code development through plugins.

  10. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  11. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    NASA Technical Reports Server (NTRS)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  12. Mass Spectrometry Tools for Analysis of Intermolecular Interactions

    PubMed Central

    Auclair, Jared R.; Somasundaran, Mohan; Green, Karin M.; Evans, James E.; Schiffer, Celia A.; Ringe, Dagmar; Petsko, Gregory A.; Agar, Jeffrey N.

    2015-01-01

    The small quantities of protein required for mass spectrometry (MS) make it a powerful tool to detect binding (protein–protein, protein–small molecule, etc.) of proteins that are difficult to express in large quantities, as is the case for many intrinsically disordered proteins. Chemical cross-linking, proteolysis, and MS analysis, combined, are a powerful tool for the identification of binding domains. Here, we present a traditional approach to determine protein–protein interaction binding sites using heavy water (18O) as a label. This technique is relatively inexpensive and can be performed on any mass spectrometer without specialized software. PMID:22821539

  13. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  14. AstroStat - A VO Tool for Statistical Analysis

    E-print Network

    Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

    2015-01-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu driven interface. Behind the scenes, all analysis is done using the public domain statistical software - R and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features - as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing power statistical analysis on ...

  15. Procrustes rotation as a diagnostic tool for projection pursuit analysis.

    PubMed

    Wentzell, Peter D; Hou, Siyuan; Silva, Carolina Santos; Wicks, Chelsi C; Pimentel, Maria Fernanda

    2015-06-01

    Projection pursuit (PP) is an effective exploratory data analysis tool because it optimizes the projection of high dimensional data using distributional characteristics rather than variance or distance metrics. The recent development of fast and simple PP algorithms based on minimization of kurtosis for clustering data has made this powerful tool more accessible, but under conditions where the sample-to-variable ratio is small, PP fails due to opportunistic overfitting of random correlations to limiting distributional targets. Therefore, some kind of variable compression or data regularization is required in these cases. However, this introduces an additional parameter whose optimization is manually time-consuming and subject to bias. The present work describes the use of Procrustes analysis as a diagnostic tool that can be used to evaluate the results of PP analysis in an efficient manner. Through Procrustes rotation, the similarity of different PP projections can be examined in an automated fashion with "Procrustes maps" to establish regions of stable projections as a function of the parameter to be optimized. The application of this diagnostic is demonstrated using principal components analysis to compress FTIR spectra from ink samples of ten different brands of pen, and also in conjunction with regularized PP for soybean disease classification. PMID:26002210
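
    A minimal sketch of the underlying comparison, using SciPy's Procrustes routine on two hypothetical projection-score matrices (the data and the idea of scanning a parameter grid are illustrative, not the paper's code), is:

      import numpy as np
      from scipy.spatial import procrustes

      def projection_similarity(scores_a, scores_b):
          """scores_*: (n_samples, n_dims) projections of the same samples."""
          _, _, disparity = procrustes(scores_a, scores_b)   # removes rotation/scale/shift
          return disparity

      # Scanning a regularization grid and mapping the disparity between neighbouring
      # settings gives a crude "Procrustes map" of stable projection regions.
      rng = np.random.default_rng(1)
      base = rng.standard_normal((60, 2))
      rotated = base @ np.array([[0.0, -1.0], [1.0, 0.0]]) + 0.01 * rng.standard_normal((60, 2))
      print(projection_similarity(base, rotated))  # near zero: same underlying structure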

  16. Systematic Omics Analysis Review (SOAR) tool to support risk assessment.

    PubMed

    McConnell, Emma R; Bell, Shannon M; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  17. Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment

    PubMed Central

    McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  18. Deep Convolutional Neural Network for Image Deconvolution

    E-print Network

    Jia, Jiaya Leo

    Deep Convolutional Neural Network for Image Deconvolution. Li Xu, Lenovo Research & Technology. ... perspective, we develop a deep convolutional neural network to capture the characteristics ... complies with an ideal linear convolution model due to camera noise, saturation, image compression, to name a few ...

  19. Euler deconvolution of GOCE gravity gradiometry data

    E-print Network

    Stuttgart, Universität

    ... data of near-global coverage. In this project we investigate the benefit of Euler deconvolution ... with high accuracy, i.e., we set e_x = e_y = e_z = 0. Rewritten in matrix-vector notation, ... the vector of increments becomes small enough to meet the accuracy threshold. The initial values ...

  20. Nonstationary sparsity-constrained seismic deconvolution

    NASA Astrophysics Data System (ADS)

    Sun, Xue-Kai; Sam, Zandong Sun; Xie, Hui-Wen

    2014-12-01

    The Robinson convolution model is mainly restricted by three inappropriate assumptions, i.e., statistically white reflectivity, minimum-phase wavelet, and stationarity. Modern reflectivity inversion methods (e.g., sparsity-constrained deconvolution) generally attempt to suppress the problems associated with the first two assumptions but often ignore that seismic traces are nonstationary signals, which undermines the basic assumption of unchanging wavelet in reflectivity inversion. Through tests on reflectivity series, we confirm the effects of nonstationarity on reflectivity estimation and the loss of significant information, especially in deep layers. To overcome the problems caused by nonstationarity, we propose a nonstationary convolutional model, and then use the attenuation curve in log spectra to detect and correct the influences of nonstationarity. We use Gabor deconvolution to handle nonstationarity and sparsity-constrained deconvolution to separate reflectivity and wavelet. The combination of the two deconvolution methods effectively handles nonstationarity and greatly reduces the problems associated with the unreasonable assumptions regarding reflectivity and wavelet. Using marine seismic data, we show that correcting nonstationarity helps recover subtle reflectivity information and enhances the characterization of details with respect to the geological record.
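
    For the stationary, sparsity-constrained part of such a workflow, a common formulation is an L1-regularized least-squares (sparse-spike) deconvolution solved by iterative soft-thresholding; the sketch below assumes a known wavelet and circular convolution and does not reproduce the Gabor/nonstationary handling of the paper.

      import numpy as np

      def soft(x, t):
          return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

      def sparse_deconv(trace, wavelet, lam=0.05, n_iter=300):
          """Assumes a known wavelet (len(wavelet) <= len(trace)) and circular convolution."""
          n = len(trace)
          w = np.pad(wavelet, (0, n - len(wavelet)))
          W = np.array([np.roll(w, k) for k in range(n)]).T   # circular convolution matrix
          L = np.linalg.norm(W, 2) ** 2                       # Lipschitz constant of the gradient
          r = np.zeros(n)
          for _ in range(n_iter):
              grad = W.T @ (W @ r - trace)
              r = soft(r - grad / L, lam / L)                 # ISTA step with soft-thresholding
          return r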

  1. Histogram deconvolution - An aid to automated classifiers

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  2. Efficient Marginal Likelihood Optimization in Blind Deconvolution

    E-print Network

    Levin, Anat

    2011-04-04

    In blind deconvolution one aims to estimate from an input blurred image y a sharp image x and an unknown blur kernel k. Recent research shows that a key to success is to consider the overall shape of the posterior distribution ...

  3. Industrial Geospatial Analysis Tool for Energy Evaluation (IGATE-E)

    SciTech Connect

    Alkadi, Nasr E; Starke, Michael R; Ma, Ookie; Nimbalkar, Sachin U; Cox, Daryl

    2013-01-01

    IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries, by industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process step energy consumption, and major energy intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimations and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

  4. A conceptual design tool for RBCC engine performance analysis

    SciTech Connect

    Olds, J.R.; Saks, G.

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

  5. A conceptual design tool for RBCC engine performance analysis

    NASA Astrophysics Data System (ADS)

    Olds, John R.; Saks, Greg

    1997-01-01

    Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework.

  6. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
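
    A toy version of the profile analysis this rests on, using Hack's stream-length (SL) gradient index on a single longitudinal profile extracted from a DEM (the threshold and the exact normalization are assumptions, and this is not the Knickpoint Finder source), is:

      import numpy as np

      def sl_anomalies(distance, elevation, threshold=2.0):
          """distance: downstream distance from the drainage head; elevation: channel elevation."""
          drop = -np.gradient(elevation, distance)            # elevation loss per unit length
          sl_local = drop * np.maximum(distance, 1e-9)        # local SL index: (dH/dL) * L
          total_drop = elevation[0] - elevation[-1]
          sl_profile = total_drop / np.log(distance[-1] / max(distance[1], 1e-9))
          ratio = sl_local / sl_profile                       # anomaly ratio along the profile
          return np.where(ratio >= threshold)[0]              # indices of candidate knickpoints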

  7. Applying AI tools to operational space environmental analysis

    NASA Technical Reports Server (NTRS)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

  8. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGESBeta

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; et al

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  9. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    E-print Network

    M. Battaglieri; B. J. Briscoe; A. Celentano; S. -U. Chung; A. D'Angelo; R. De Vita; M. Döring; J. Dudek; S. Eidelman; S. Fegan; J. Ferretti; A. Filippi; G. Fox; G. Galata; H. Garcia-Tecocoatzi; D. I. Glazier; B. Grube; C. Hanhart; M. Hoferichter; S. M. Hughes; D. G. Ireland; B. Ketzer; F. J. Klein; B. Kubis; B. Liu; P. Masjuan; V. Mathieu; B. McKinnon; R. Mitchell; F. Nerling; S. Paul; J. R. Pelaez; J. Rademacker; A. Rizzo; C. Salgado; E. Santopinto; A. V. Sarantsev; T. Sato; T. Schlüter; M. L. L. da Silva; I. Stankovic; I. Strakovsky; A. Szczepaniak; A. Vassallo; N. K. Walford; D. P. Watts; L. Zana

    2015-03-30

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  10. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K.; Watts, Daniel P.

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  11. The RUMBA software: tools for neuroimaging data analysis.

    PubMed

    Bly, Benjamin Martin; Rebbechi, Donovan; Hanson, Stephen Jose; Grasso, Giorgio

    2004-01-01

    The enormous scale and complexity of data sets in functional neuroimaging makes it crucial to have well-designed and flexible software for image processing, modeling, and statistical analysis. At present, researchers must choose between general purpose scientific computing environments (e.g., Splus and Matlab), and specialized human brain mapping packages that implement particular analysis strategies (e.g., AFNI, SPM, VoxBo, FSL or FIASCO). For the vast majority of users in Human Brain Mapping and Cognitive Neuroscience, general purpose computing environments provide an insufficient framework for a complex data-analysis regime. On the other hand, the operational particulars of more specialized neuroimaging analysis packages are difficult or impossible to modify and provide little transparency or flexibility to the user for approaches other than massively multiple comparisons based on inferential statistics derived from linear models. In order to address these problems, we have developed open-source software that allows a wide array of data analysis procedures. The RUMBA software includes programming tools that simplify the development of novel methods, and accommodates data in several standard image formats. A scripting interface, along with programming libraries, defines a number of useful analytic procedures, and provides an interface to data analysis procedures. The software also supports a graphical functional programming environment for implementing data analysis streams based on modular functional components. With these features, the RUMBA software provides researchers programmability, reusability, modular analysis tools, novel data analysis streams, and an analysis environment in which multiple approaches can be contrasted and compared. The RUMBA software retains the flexibility of general scientific computing environments while adding a framework in which both experts and novices can develop and adapt neuroimaging-specific analyses. PMID:15067169

  12. SMART (Shop floor Modeling, Analysis, and Reporting Tool) Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is found in the full documentation given to the NASA liaison. This documentation is also available on the A.R.I.S.E. Center web site, under a project directory. Only authorized users can gain access to this site.

  13. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

  14. AstroStat-A VO tool for statistical analysis

    NASA Astrophysics Data System (ADS)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analyses are done using the public domain statistical software R, and the output is returned to the user in a neatly formatted form. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser, and as an offline application. AstroStat provides an easy-to-use interface which allows both fetching data and performing powerful statistical analysis on them.

  15. Federal metering data analysis needs and existing tools

    SciTech Connect

    Henderson, Jordan W.; Fowler, Kimberly M.

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses the Federal metering data analysis needs and some existing tools.

  16. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules are discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  17. Cultural Consensus Analysis as a Tool for Clinic Improvements

    PubMed Central

    Smith, C Scott; Morris, Magdalena; Hill, William; Francovich, Chris; McMullin, Juliet; Chavez, Leo; Rhoads, Caroline

    2004-01-01

    Some problems in clinic function recur because of unexpected value differences between patients, faculty, and residents. Cultural consensus analysis (CCA) is a method used by anthropologists to identify groups with shared values. After conducting an ethnographic study and using focus groups, we developed and validated a CCA tool for use in clinics. Using this instrument, we identified distinct groups with 6 important value differences between those groups. An analysis of these value differences suggested specific and pragmatic interventions to improve clinic functioning. The instrument has also performed well in preliminary tests at another clinic. PMID:15109315

  18. Cultural consensus analysis as a tool for clinic improvements.

    PubMed

    Smith, C Scott; Morris, Magdalena; Hill, William; Francovich, Chris; McMullin, Juliet; Chavez, Leo; Rhoads, Caroline

    2004-05-01

    Some problems in clinic function recur because of unexpected value differences between patients, faculty, and residents. Cultural consensus analysis (CCA) is a method used by anthropologists to identify groups with shared values. After conducting an ethnographic study and using focus groups, we developed and validated a CCA tool for use in clinics. Using this instrument, we identified distinct groups with 6 important value differences between those groups. An analysis of these value differences suggested specific and pragmatic interventions to improve clinic functioning. The instrument has also performed well in preliminary tests at another clinic. PMID:15109315

  19. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. It operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  20. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.

  1. Design and Application of the Exploration Maintainability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

    2012-01-01

    Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
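
    To make the Monte Carlo logic described above concrete, the following sketch simulates component failures over a mission and checks whether a fixed spares manifest covers them. It is not the EMAT implementation: the exponential failure model, component names, MTBF values, and spares counts are all illustrative assumptions.

```python
import random

def simulate_mission(components, spares, duration_days, n_runs=10000):
    """Monte Carlo sketch of mission sparing adequacy.

    components: dict name -> mean time between failures (days)
    spares:     dict name -> number of spares carried
    Returns the fraction of runs in which every failure could be repaired.
    """
    successes = 0
    for _ in range(n_runs):
        ok = True
        for name, mtbf in components.items():
            # Draw failure times until the mission ends; count failures.
            t, failures = 0.0, 0
            while True:
                t += random.expovariate(1.0 / mtbf)
                if t > duration_days:
                    break
                failures += 1
            if failures > spares.get(name, 0):
                ok = False
                break
        successes += ok
    return successes / n_runs

# Hypothetical component list and spares manifest (illustrative only).
components = {"pump": 400.0, "valve": 900.0, "controller": 1500.0}
spares = {"pump": 2, "valve": 1, "controller": 1}
print(simulate_mission(components, spares, duration_days=500))
```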

  2. Deconvolution of mineral absorption bands - An improved approach

    NASA Technical Reports Server (NTRS)

    Sunshine, Jessica M.; Pieters, Carle M.; Pratt, Stephen F.

    1990-01-01

    Although visible and near IR reflectance spectra contain absorption bands that are characteristic of the composition and structure of the absorbing species, deconvolving a complex spectrum is nontrivial. An improved approach to spectral deconvolution is presented that accurately represents absorption bands as discrete mathematical distributions and resolves composite absorption features into individual absorption bands. The frequently used Gaussian model of absorption bands is shown to be inappropriate for the Fe(2+) electronic transition absorptions in pyroxene spectra. A modified Gaussian model is derived using a power law relationship of energy to average bond length. The modified Gaussian model is shown to provide an objective and consistent tool for deconvolving individual absorption bands in the more complex orthopyroxene, clinopyroxene, pyroxene mixtures, and olivine spectra.
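
    As an illustration of the band-shape idea, the sketch below contrasts a standard Gaussian with a modified Gaussian that is Gaussian in a power of the spectral variable, reflecting the power-law relation between energy and average bond length. The exponent (n = -1) and all band parameters are assumptions chosen for illustration, not values taken from the paper.

```python
import numpy as np

def gaussian(x, s, mu, sigma):
    # Standard Gaussian band in the chosen spectral variable.
    return s * np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

def modified_gaussian(x, s, mu, sigma, n=-1.0):
    # Modified Gaussian: the band is Gaussian in x**n rather than x,
    # reflecting a power-law relation between energy and average bond
    # length (n = -1 here is an assumed, illustrative choice).
    return s * np.exp(-(x ** n - mu ** n) ** 2 / (2.0 * sigma ** 2))

# Illustrative pyroxene-like absorption near 1 micron (wavelength in microns).
wavelength = np.linspace(0.7, 1.4, 500)
band_std = gaussian(wavelength, s=-0.3, mu=1.0, sigma=0.05)
band_mgm = modified_gaussian(wavelength, s=-0.3, mu=1.0, sigma=0.05)
```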

  3. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  4. VMPLOT: A versatile analysis tool for mission operations

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.

    1993-01-01

    VMPLOT is a versatile analysis tool designed by the Magellan Spacecraft Team to graphically display engineering data used to support mission operations. While there is nothing revolutionary or innovative about graphical data analysis tools, VMPLOT has some distinguishing features that set it apart from other custom or commercially available software packages. These features include the ability to utilize time in a Universal Time Coordinated (UTC) or Spacecraft Clock (SCLK) format as an enumerated data type, the ability to automatically scale both axes based on the data to be displayed (including time), the ability to combine data from different files, and the ability to utilize the program either interactively or in batch mode, thereby enhancing automation. Another important feature of VMPLOT not visible to the user is the software engineering philosophies utilized. A layered approach was used to isolate program functionality to different layers. This was done to increase program portability to different platforms and to ease maintenance and enhancements due to changing requirements. The functionality of VMPLOT's unique features is described, highlighting the algorithms that make these features possible. The software engineering philosophies used in the creation of the software tool are also summarized.

  5. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model, and also makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, applications are presented where the thermal modeling is "in the loop" with sensitivity analysis, optimization and optical performance, drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST).
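
    A minimal sketch of the kind of linear conduction network described above: nodal conductances are assembled into a matrix, a boundary temperature is imposed, and the steady-state temperatures are solved for. This is not IMOS code; the node count, conductances, heat loads, and boundary condition are hypothetical, and temperature-dependent conductances and the Newton/transient solvers are omitted.

```python
import numpy as np

# Nodes 0..3; conductances between node pairs (W/K); values are hypothetical.
edges = [(0, 1, 2.0), (1, 2, 1.5), (2, 3, 1.0), (0, 3, 0.5)]
heat_load = np.array([0.0, 5.0, 0.0, 0.0])   # applied heat at each node (W)
fixed = {0: 293.0}                            # boundary node held at 293 K

n = 4
G = np.zeros((n, n))
for i, j, g in edges:
    # Assemble the conduction matrix so that (G @ T)[i] = sum_j g_ij (T_i - T_j).
    G[i, i] += g; G[j, j] += g
    G[i, j] -= g; G[j, i] -= g

# Impose the fixed-temperature boundary condition by row replacement.
A, b = G.copy(), heat_load.astype(float).copy()
for node, temp in fixed.items():
    A[node, :] = 0.0
    A[node, node] = 1.0
    b[node] = temp

T = np.linalg.solve(A, b)
print(T)   # steady-state nodal temperatures (K)
```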

  6. Quantifying mineral abundances of complex mixtures by coupling spectral deconvolution of SWIR spectra (2.1-2.4 µm) and regression tree analysis

    USGS Publications Warehouse

    Mulder, V.L.; Plotze, Michael; de Bruin, Sytze; Schaepman, Michael E.; Mavris, C.; Kokaly, Raymond F.; Egli, Markus

    2013-01-01

    This paper presents a methodology for assessing mineral abundances of mixtures having more than two constituents using absorption features in the 2.1-2.4 µm wavelength region. In the first step, the absorption behaviour of mineral mixtures is parameterised by exponential Gaussian optimisation. Next, mineral abundances are predicted by regression tree analysis using these parameters as inputs. The approach is demonstrated on a range of prepared samples with known abundances of kaolinite, dioctahedral mica, smectite, calcite and quartz and on a set of field samples from Morocco. The latter contained varying quantities of other minerals, some of which did not have diagnostic absorption features in the 2.1-2.4 µm region. Cross validation showed that the prepared samples of kaolinite, dioctahedral mica, smectite and calcite were predicted with a root mean square error (RMSE) less than 9 wt.%. For the field samples, the RMSE was less than 8 wt.% for calcite, dioctahedral mica and kaolinite abundances. Smectite could not be well predicted, which was attributed to spectral variation of the cations within the dioctahedral layered smectites. Substitution of part of the quartz by chlorite at the prediction phase hardly affected the accuracy of the predicted mineral content; this suggests that the method is robust in handling the omission of minerals during the training phase. The degree of expression of absorption components was different between the field sample and the laboratory mixtures. This demonstrates that the method should be calibrated and trained on local samples. Our method allows the simultaneous quantification of more than two minerals within a complex mixture and thereby enhances the perspectives of spectral analysis for mineral abundances.
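
    The sketch below mimics the second step of the method, a regression tree mapping absorption-feature parameters to a mineral abundance, using scikit-learn. The feature columns, the synthetic training data, and the target mineral are stand-ins for the exponential-Gaussian-optimisation parameters and laboratory mixtures used in the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-in for EGO-derived absorption parameters per sample:
# [band depth near 2.20 um, band depth near 2.34 um, position shift, asymmetry]
X = rng.uniform(0.0, 1.0, size=(60, 4))
# Hypothetical kaolinite abundance (wt.%) loosely tied to the first feature.
y = 80.0 * X[:, 0] + 10.0 * rng.normal(size=60)

tree = DecisionTreeRegressor(max_depth=3, random_state=0)
scores = cross_val_score(tree, X, y, cv=5, scoring="neg_root_mean_squared_error")
print("cross-validated RMSE (wt.%):", -scores.mean())
tree.fit(X, y)   # final model trained on all (synthetic) samples
```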

  7. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .Net and C# version is used for development. It leverages code samples shared by the World Wind community, and COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into a multi-layered, multi-temporal spatial context.

  8. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  9. TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS

    SciTech Connect

    Da Rio, Nicola; Robberto, Massimo

    2012-12-01

    We present the Tool for Astrophysical Data Analysis (TA-DA), a new software tool aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and at allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

  10. Exploiting polypharmacology for drug target deconvolution

    PubMed Central

    Gujral, Taranjit Singh; Peshkin, Leonid; Kirschner, Marc W.

    2014-01-01

    Polypharmacology (action of drugs against multiple targets) represents a tempting avenue for new drug development; unfortunately, methods capable of exploiting the known polypharmacology of drugs for target deconvolution are lacking. Here, we present an ensemble approach using elastic net regularization combined with mRNA expression profiling and previously characterized data on a large set of kinase inhibitors to identify kinases that are important for epithelial and mesenchymal cell migration. By profiling a selected optimal set of 32 kinase inhibitors in a panel against six cell lines, we identified cell type-specific kinases that regulate cell migration. Our discovery of several informative kinases with a previously uncharacterized role in cell migration (such as Mst and Taok family of MAPK kinases in mesenchymal cells) may represent novel targets that warrant further investigation. Target deconvolution using our ensemble approach has the potential to aid in the rational design of more potent but less toxic drug combinations. PMID:24707051
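
    A minimal sketch of the regression idea behind the approach: a phenotype measured across a panel of kinase inhibitors is regressed, with elastic net regularization, on the drugs' known kinase-inhibition profiles, and kinases with large coefficients become candidate targets. The inhibition matrix and phenotype here are simulated; the paper's ensemble procedure and data are not reproduced.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(1)

n_drugs, n_kinases = 32, 300
# Hypothetical drug-by-kinase inhibition matrix (fraction of activity inhibited).
P = rng.uniform(0.0, 1.0, size=(n_drugs, n_kinases))
# Simulate a migration phenotype driven by three "true" kinases plus noise.
true_idx = [5, 42, 107]
y = P[:, true_idx] @ np.array([1.0, -0.8, 0.6]) + 0.05 * rng.normal(size=n_drugs)

model = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5, random_state=1).fit(P, y)
candidates = np.argsort(np.abs(model.coef_))[::-1][:10]
print("top candidate kinase indices:", candidates)
```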

  11. Estimating missing information by maximum likelihood deconvolution.

    PubMed

    Heintzmann, Rainer

    2007-01-01

    The ability of iteratively constrained maximum likelihood (ML) deconvolution to reconstruct out-of-band information is discussed and exemplified by simulations. The frequency dependent relative energy regain, a novel way of quantifying the reconstruction ability, is introduced. The positivity constraint of ML deconvolution allows reconstructing information outside the spatial frequency bandwidth which is set by the optical system. This is demonstrated for noise-free and noisy data. It is also shown that this property depends on the type of object under investigation. An object is constructed where no significant out-of-band reconstruction is possible. It is concluded that in practical situations the amount of possible out-of-band reconstruction depends on the agreement between reality and the model describing "typical objects" incorporated into the algorithm by appropriate penalty functions. PMID:16914319
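
    As a concrete example of iteratively constrained ML deconvolution with a positivity constraint, the sketch below runs a standard 1-D Richardson-Lucy iteration on a synthetic blurred signal. Richardson-Lucy is a common ML scheme for Poisson-like noise and serves only to illustrate the mechanism; it is not necessarily the specific algorithm or penalty functions discussed in the paper.

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=200):
    """1-D Richardson-Lucy deconvolution (ML for Poisson noise, positive estimate)."""
    estimate = np.full_like(observed, observed.mean())
    psf_flipped = psf[::-1]
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Synthetic example: two narrow peaks blurred by a Gaussian PSF plus an offset.
x = np.arange(200, dtype=float)
truth = np.zeros_like(x); truth[60] = 1.0; truth[90] = 0.5
psf = np.exp(-0.5 * (np.arange(-15, 16) / 4.0) ** 2); psf /= psf.sum()
observed = np.convolve(truth, psf, mode="same") + 1e-3
restored = richardson_lucy(observed, psf)
```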

  12. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thomson valve, turbo expander and compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis results so obtained give a clear idea of the parameter values to choose before implementation of the actual plant in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.

  13. The Smooth Decomposition as a nonlinear modal analysis tool

    NASA Astrophysics Data System (ADS)

    Bellizzi, Sergio; Sampaio, Rubens

    2015-12-01

    The Smooth Decomposition (SD) is a statistical analysis technique for finding structures in an ensemble of spatially distributed data such that the vector directions not only keep the maximum possible variance but also the motions, along the vector directions, are as smooth in time as possible. In this paper, the notion of the dual smooth modes is introduced and used in the framework of oblique projection to expand a random response of a system. The dual modes define a tool that turns the SD into an efficient modal analysis tool. The main properties of the SD are discussed and some new optimality properties of the expansion are deduced. The parameters of the SD give access to modal parameters of a linear system (mode shapes, resonance frequencies and modal energy participations). In the case of nonlinear systems, a richer picture of the evolution of the modes versus energy can be obtained by analyzing the responses under several excitation levels. This novel analysis of a nonlinear system is illustrated by an example.
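
    A minimal sketch of the smooth decomposition as it is commonly formulated: the covariance matrix of the response and the covariance matrix of its time derivative enter a generalized eigenvalue problem, whose eigenvectors are the smooth modes. The synthetic data, sensor count, and frequencies below are assumptions for illustration; the dual-mode and oblique-projection machinery of the paper is not shown.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)

# Synthetic ensemble: snapshots (rows) of a spatially distributed response.
t = np.linspace(0.0, 10.0, 2000)
shapes = np.array([[1.0, 0.8, 0.3, -0.2],
                   [0.2, -0.5, 1.0, 0.7]])          # two spatial shapes, 4 sensors
q = np.column_stack([np.sin(2 * np.pi * 1.3 * t), np.sin(2 * np.pi * 3.1 * t)])
X = q @ shapes + 0.01 * rng.normal(size=(t.size, 4))

V = np.gradient(X, t, axis=0)                        # time derivative of the field
Sx = np.cov(X, rowvar=False)                         # covariance of the response
Sv = np.cov(V, rowvar=False)                         # covariance of its derivative

# Smooth modes: maximize variance while penalizing temporal roughness,
# i.e. solve the generalized eigenproblem Sx * phi = lambda * Sv * phi.
eigvals, eigvecs = eigh(Sx, Sv)
smooth_modes = eigvecs[:, ::-1]                      # ordered by decreasing smooth value
```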

  14. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  15. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Sub-millimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  16. Blind deconvolution of images using neural networks

    NASA Astrophysics Data System (ADS)

    Steriti, Ronald J.; Fiddy, Michael A.

    1994-07-01

    In this paper we consider the blind deconvolution of an image from an unknown blurring function using a technique employing two nested Hopfield neural networks. This iterative method consists of two steps, first estimating the blurring function followed by the use of this function to estimate the original image. The successive inter-linked energy minimizations are found to converge in practice although a convergence proof has not yet been established.
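
    The two-step structure described above (estimate the blur, then use it to estimate the image) can be illustrated with a much simpler stand-in than nested Hopfield networks: alternating regularized least-squares (Wiener-like) updates in the Fourier domain on a 1-D signal, with positivity and a short-support constraint on the blur. The sketch below is that stand-in, with an assumed circular convolution model and arbitrary parameters; it is not the paper's method and, without stronger priors, such alternation can drift toward trivial solutions.

```python
import numpy as np

def alternating_blind_deconv(observed, psf_size=9, n_outer=20, eps=1e-3):
    """Toy 1-D blind deconvolution by alternating Wiener-like updates of the
    blur and the image. Positivity and a short support constraint on the blur
    stand in for the energy-function constraints of the nested-network method."""
    n = observed.size
    image = observed.copy()
    psf = np.zeros(n)
    psf[:psf_size] = 1.0 / psf_size          # flat initial guess for the blur
    O = np.fft.rfft(observed)
    for _ in range(n_outer):
        # Step 1: with the image fixed, update the blurring function.
        I = np.fft.rfft(image)
        psf = np.fft.irfft(O * np.conj(I) / (np.abs(I) ** 2 + eps), n)
        psf[psf_size:] = 0.0                  # enforce short support
        psf = np.clip(psf, 0.0, None)
        psf /= psf.sum() + 1e-12              # keep the blur normalized
        # Step 2: with the blur fixed, update the image estimate.
        H = np.fft.rfft(psf)
        image = np.fft.irfft(O * np.conj(H) / (np.abs(H) ** 2 + eps), n)
        image = np.clip(image, 0.0, None)     # positivity constraint on the image
    return image, psf

# Example: blur a sparse signal with an unknown 5-tap kernel, then try to recover both.
x = np.zeros(128); x[[20, 50, 90]] = [1.0, 0.7, 0.4]
k = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
y = np.convolve(x, k, mode="full")[:128]
img_est, psf_est = alternating_blind_deconv(y)
```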

  17. Deconvolution/identification techniques for nonnegative signals

    SciTech Connect

    Goodman, D.M.; Yu, D.R.

    1991-11-01

    Several methods for solving the nonparametric deconvolution/identification problem when the unknown is nonnegative are presented. First we consider the constrained least squares method and discuss three ways to estimate the regularization parameter: the discrepancy principle, Mallows' C_L, and generalized cross validation. Next we consider maximum entropy methods. Last, we present a new conjugate gradient algorithm. A preliminary comparison is presented; detailed Monte-Carlo experiments will be presented at the conference. 13 refs.
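
    A minimal sketch of the constrained least squares idea: deconvolution with a nonnegativity constraint and Tikhonov regularization, written as an ordinary NNLS problem on a stacked system. The Toeplitz blur, noise level, and the simple loop over regularization parameters are illustrative assumptions; the discrepancy principle, Mallows' C_L, and GCV selection rules discussed in the report are not implemented here.

```python
import numpy as np
from scipy.optimize import nnls
from scipy.linalg import toeplitz

# Forward model: y = H x + noise, with H a symmetric convolution (Toeplitz) matrix.
n = 120
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2); kernel /= kernel.sum()
col = np.zeros(n); col[:11] = kernel[10:]
row = np.zeros(n); row[:11] = kernel[10:]
H = toeplitz(col, row)

x_true = np.zeros(n); x_true[30] = 1.0; x_true[70] = 0.6        # nonnegative unknown
y = H @ x_true + 0.01 * np.random.default_rng(3).normal(size=n)

def nonneg_tikhonov(H, y, lam):
    # Minimize ||H x - y||^2 + lam * ||x||^2 subject to x >= 0,
    # by solving NNLS on the stacked system [H; sqrt(lam) I] x = [y; 0].
    A = np.vstack([H, np.sqrt(lam) * np.eye(H.shape[1])])
    b = np.concatenate([y, np.zeros(H.shape[1])])
    x, _ = nnls(A, b)
    return x

for lam in (1e-4, 1e-3, 1e-2):
    x_hat = nonneg_tikhonov(H, y, lam)
    print(lam, np.linalg.norm(H @ x_hat - y))   # residual misfit per choice of lam
```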

  18. Deconvolution of diode-laser spectra

    NASA Technical Reports Server (NTRS)

    Halsey, G. W.; Jennings, D. E.; Blass, W. E.

    1985-01-01

    A new technique has been developed for deconvolving diode-laser spectra. This technique treats Doppler broadening, collisional broadening, and instrumental effects simultaneously. This technique is superior to previous deconvolution methods in the recovery of line-strength and transition-frequency information. A section of the ethane spectrum near 12 microns is used as an example. This new approach applies to any spectroscopy in which the instrumental resolution is narrower than actual linewidths.
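
    The sketch below illustrates the forward model such a technique has to undo: Doppler (Gaussian) and collisional (Lorentzian) broadening combined as a Voigt profile, further convolved with an instrumental response. It uses scipy's voigt_profile with illustrative widths; it is a forward simulation only, not the paper's deconvolution procedure.

```python
import numpy as np
from scipy.special import voigt_profile

# Frequency grid (arbitrary units, e.g. cm^-1 offsets from line center).
nu = np.linspace(-0.5, 0.5, 2001)

sigma_doppler = 0.02      # Gaussian (Doppler) width; illustrative value
gamma_collision = 0.01    # Lorentzian (collisional) half-width; illustrative value
line = voigt_profile(nu, sigma_doppler, gamma_collision)

# Instrumental response of the diode-laser system, here a narrow Gaussian.
instr = np.exp(-0.5 * (nu / 0.005) ** 2)
instr /= instr.sum()
observed = np.convolve(line, instr, mode="same")
```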

  19. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  20. Using conjoint analysis as a program evaluation tool

    SciTech Connect

    Moe, R.; Dion, S.

    1994-11-01

    While conjoint analysis has typically been applied in utility market research to estimate penetration of utility programs, help identify optimal program design features, and measure the trade-offs customers make when evaluating utility program options, it has seldom been used as a program evaluation tool. This paper discusses the use of conjoint analysis to estimate free ridership rates in evaluations of two utility DSM programs: a residential high efficiency heat pump/central air conditioning program and a low-flow showerhead program. The two studies incorporated different approaches for data collection and data analysis. The first study used a phone-mail approach to collect the data, a ranking method for scoring, and a SAS program for analysis; the second study collected data through in-person interviews, used a rating method for scoring, and Bretton Clark software for data analysis. The paper describes the design and results of both conjoint studies and how the standard conjoint analysis outputs were used to estimate the free rider rate for each program and, for one program, predict the penetration of high-efficiency models under alternative rebate structures. It also presents the results of both analyses and compares the estimated free rider rates to those derived for the same programs using other methods.
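
    As a toy illustration of how conjoint outputs can feed a free-ridership estimate, the sketch below plugs hypothetical part-worth utilities into a binary logit share model and compares predicted purchase probabilities with and without the rebate. All utilities, rebate levels, and the ratio-based free-rider definition are assumptions for illustration, not the models used in the two evaluations.

```python
import numpy as np

# Hypothetical part-worth utilities for one respondent segment:
# intercept, high-efficiency attribute, and rebate levels ($0, $150, $300).
utilities = {"base": -0.4, "high_eff": 0.9, "rebate": {0: 0.0, 150: 0.5, 300: 0.9}}

def purchase_probability(rebate_amount):
    # Simple binary-logit share: buy the high-efficiency model vs. do nothing.
    u = utilities["base"] + utilities["high_eff"] + utilities["rebate"][rebate_amount]
    return 1.0 / (1.0 + np.exp(-u))

p_with_rebate = purchase_probability(300)
p_without_rebate = purchase_probability(0)

# Among predicted participants, the share who would have bought anyway.
free_rider_rate = p_without_rebate / p_with_rebate
print(round(free_rider_rate, 2))
```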

  1. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity. PMID:16675027
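
    A minimal sketch of the quantities named above, relative wavelet energy and a normalized wavelet entropy, computed for a synthetic 1-D signal with PyWavelets. The wavelet family, decomposition level, and windowing are arbitrary choices here and need not match those used for the EEG records.

```python
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet="db4", level=5):
    """Relative wavelet energies and normalized total wavelet entropy."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()                      # relative wavelet energy
    entropy = -np.sum(p * np.log(p)) / np.log(p.size)  # normalized Shannon entropy
    return p, entropy

# Synthetic stand-in for an EEG epoch: mixed rhythms plus noise.
t = np.linspace(0, 4, 1024)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 3 * t) \
      + 0.2 * np.random.default_rng(4).normal(size=t.size)
rel_energy, H = wavelet_entropy(eeg)
```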

  2. Integrated network analysis and effective tools in plant systems biology

    PubMed Central

    Fukushima, Atsushi; Kanaya, Shigehiko; Nishida, Kozo

    2014-01-01

    One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data that contain the genome, transcriptome, proteome, and metabolome and mathematical models are expected to integrate and expand our knowledge of complex plant metabolisms. PMID:25408696

  3. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    SciTech Connect

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  4. Reconstructing the Genomic Content of Microbiome Taxa through Shotgun Metagenomic Deconvolution

    E-print Network

    Borenstein, Elhanan

    Reconstructing the Genomic Content of Microbiome Taxa through Shotgun Metagenomic Deconvolution of annotation errors on the reconstructed genomes. We finally apply metagenomic deconvolution to samples from capture genus-specific properties. With the accumulation of metagenomic data, this deconvolution framework

  5. EFFICIENT DECONVOLUTION AND SPATIAL RESOLUTION ENHANCEMENT FROM CONTINUOUS AND OVERSAMPLED OBSERVATIONS IN MICROWAVE IMAGERY

    E-print Network

    Ferguson, Thomas S.

    EFFICIENT DECONVOLUTION AND SPATIAL RESOLUTION ENHANCEMENT FROM CONTINUOUS AND OVERSAMPLED In this paper, we develop efficient deconvolution and super- resolution methodologies and apply these techniques Bregman deconvolution that reduces image ringing while sharpening the image and preserving information

  6. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    Report describes process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

  7. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run either through a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

  8. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input stream for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation - the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design of product development process.

  9. PyRAT (python radiography analysis tool): overview

    SciTech Connect

    Armstrong, Jerawan C; Temple, Brian A; Buescher, Kevin L

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization based inversion approach with goal of identifying unknown object configurations - MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define feasible region and discrete neighbor for the MVO problem - initial data analysis + material library → a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.

  10. CRITICA: coding region identification tool invoking comparative analysis

    NASA Technical Reports Server (NTRS)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
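
    The noncomparative component described above can be illustrated in miniature: score a candidate reading frame by the summed log ratio of in-frame hexanucleotide (dicodon) frequencies estimated from coding versus background training sequences. The toy training strings, pseudocounts, and scoring are assumptions for illustration; CRITICA's actual frequency tables, iterative refinement, and comparative component are not reproduced.

```python
from collections import Counter
from math import log

def hexamer_freqs(seqs, pseudocount=1.0):
    """Estimate in-frame hexamer (dicodon) frequencies from training sequences."""
    counts = Counter()
    for s in seqs:
        for i in range(0, len(s) - 5, 3):        # hexamers starting at each codon
            counts[s[i:i + 6]] += 1
    total = sum(counts.values()) + pseudocount * 4096
    return lambda h: (counts[h] + pseudocount) / total

def dicodon_score(candidate, coding_freq, background_freq):
    """Sum of log-likelihood ratios over in-frame hexamers; > 0 favours coding."""
    score = 0.0
    for i in range(0, len(candidate) - 5, 3):
        h = candidate[i:i + 6]
        score += log(coding_freq(h) / background_freq(h))
    return score

# Toy training sets (real use would employ genome-scale, iteratively refined data).
coding_freq = hexamer_freqs(["ATGGCTGCTAAAGGTGCTGAA", "ATGAAAGCTGGTGAAGCTCTG"])
background_freq = hexamer_freqs(["TTTTTATATATTTTGGGGCCC", "CCCGGGTTTAAACCCGGGTTT"])
print(dicodon_score("ATGGCTAAAGCTGGTGAAGCT", coding_freq, background_freq))
```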

  11. The LAGRANTO Lagrangian analysis tool - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-08-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities. Trajectory starting positions can be defined easily and flexibly based on different geometrical and/or meteorological conditions, e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels. After the computation of the trajectories, a versatile selection of trajectories is offered based on single or combined criteria. These criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity, PV, greater than 2 PVU; 1 PVU = 10^-6 K m^2 kg^-1 s^-1). Full versions of this new release of LAGRANTO are available for global ECMWF and regional COSMO data, and core functionality is provided for the regional WRF and MetUM models and the global 20th Century Reanalysis data set. The paper first presents the intuitive application of LAGRANTO for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO can be used to quasi-operationally diagnose stratosphere-troposphere exchange events. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution serve to resolve the rather complex flow structure associated with orographic blocking due to the Alps, as shown in a third example. A final example illustrates the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes auxiliary tools, e.g., to visualize trajectories. A detailed user guide describes all LAGRANTO capabilities.
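
    To illustrate the flavour of criterion-based trajectory selection (e.g., "GT:PV:2"), the sketch below applies a simple greater-than/less-than test to an array of trajectory values and keeps the trajectories that satisfy it at any time step. The mini command syntax, the "at any time" convention, and the synthetic PV values are assumptions for illustration, not LAGRANTO's implementation or its full command language.

```python
import numpy as np

# Toy trajectory array: (n_traj, n_times) values of one variable, e.g. PV in PVU.
rng = np.random.default_rng(5)
pv = rng.normal(loc=1.0, scale=1.0, size=(500, 25))

def select(values, criterion):
    """Keep trajectories satisfying e.g. 'GT:2' (exceed 2 at any time step)."""
    op, threshold = criterion.split(":")
    thr = float(threshold)
    if op == "GT":
        mask = (values > thr).any(axis=1)
    elif op == "LT":
        mask = (values < thr).any(axis=1)
    else:
        raise ValueError("unsupported operator: " + op)
    return np.where(mask)[0]

stratospheric = select(pv, "GT:2")   # indices of trajectories with PV > 2 PVU
print(len(stratospheric), "of", pv.shape[0], "trajectories selected")
```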

  12. GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS

    SciTech Connect

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
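
    A minimal sketch of the radial-intensity-plot step described above: sample the image intensity on concentric circles around the galaxy centre, producing one intensity-versus-angle row per radius, from which peaks and their slopes could then be traced. The synthetic image, centre, and radii are illustrative; this is not Ganalyzer's code.

```python
import numpy as np

def radial_intensity_plot(image, center, radii, n_angles=360):
    """Sample image intensity on concentric circles around the galaxy center.
    Returns an array of shape (len(radii), n_angles): one row per radius."""
    cy, cx = center
    theta = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    plot = np.empty((len(radii), n_angles))
    for k, r in enumerate(radii):
        ys = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
        xs = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
        plot[k] = image[ys, xs]
    return plot

# Synthetic stand-in for a centred galaxy image (a smooth Gaussian disc).
yy, xx = np.mgrid[0:200, 0:200]
img = np.exp(-((yy - 100) ** 2 + (xx - 100) ** 2) / (2 * 30.0 ** 2))
rip = radial_intensity_plot(img, center=(100, 100), radii=range(10, 80, 10))
```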

  13. Discovery of protein acetylation patterns by deconvolution of peptide isomer mass spectra.

    PubMed

    Abshiru, Nebiyu; Caron-Lizotte, Olivier; Rajan, Roshan Elizabeth; Jamai, Adil; Pomies, Christelle; Verreault, Alain; Thibault, Pierre

    2015-01-01

    Protein post-translational modifications (PTMs) play important roles in the control of various biological processes including protein-protein interactions, epigenetics and cell cycle regulation. Mass spectrometry-based proteomics approaches enable comprehensive identification and quantitation of numerous types of PTMs. However, the analysis of PTMs is complicated by the presence of indistinguishable co-eluting isomeric peptides that result in composite spectra with overlapping features that prevent the identification of individual components. In this study, we present Iso-PeptidAce, a novel software tool that enables deconvolution of composite MS/MS spectra of isomeric peptides based on features associated with their characteristic fragment ion patterns. We benchmark Iso-PeptidAce using dilution series prepared from mixtures of known amounts of synthetic acetylated isomers. We also demonstrate its applicability to different biological problems such as the identification of site-specific acetylation patterns in histones bound to chromatin assembly factor-1 and profiling of histone acetylation in cells treated with different classes of HDAC inhibitors. PMID:26468920
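    The deconvolution idea described above, estimating the contribution of each isomer to a composite fragment-ion spectrum from isomer-specific reference patterns, can be posed generically as a non-negative least squares problem. The sketch below is a simplified stand-in using made-up intensities; it is not the Iso-PeptidAce algorithm.

```python
# Generic illustration: decompose a composite fragment-ion spectrum into
# contributions from known isomer reference patterns via non-negative least
# squares. Reference intensities below are hypothetical.
import numpy as np
from scipy.optimize import nnls

# Columns = reference fragment-ion patterns of two synthetic acetylated isomers,
# rows = fragment ions that distinguish them.
reference_patterns = np.array([
    [0.9, 0.1],
    [0.5, 0.5],
    [0.1, 0.8],
    [0.3, 0.6],
])

# Composite spectrum observed for a co-eluting isomer mixture
composite = 0.7 * reference_patterns[:, 0] + 0.3 * reference_patterns[:, 1]

proportions, residual = nnls(reference_patterns, composite)
proportions /= proportions.sum()
print(proportions)   # approximately [0.7, 0.3]
```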

  14. Discovery of protein acetylation patterns by deconvolution of peptide isomer mass spectra

    PubMed Central

    Abshiru, Nebiyu; Caron-Lizotte, Olivier; Rajan, Roshan Elizabeth; Jamai, Adil; Pomies, Christelle; Verreault, Alain; Thibault, Pierre

    2015-01-01

    Protein post-translational modifications (PTMs) play important roles in the control of various biological processes including protein–protein interactions, epigenetics and cell cycle regulation. Mass spectrometry-based proteomics approaches enable comprehensive identification and quantitation of numerous types of PTMs. However, the analysis of PTMs is complicated by the presence of indistinguishable co-eluting isomeric peptides that result in composite spectra with overlapping features that prevent the identification of individual components. In this study, we present Iso-PeptidAce, a novel software tool that enables deconvolution of composite MS/MS spectra of isomeric peptides based on features associated with their characteristic fragment ion patterns. We benchmark Iso-PeptidAce using dilution series prepared from mixtures of known amounts of synthetic acetylated isomers. We also demonstrate its applicability to different biological problems such as the identification of site-specific acetylation patterns in histones bound to chromatin assembly factor-1 and profiling of histone acetylation in cells treated with different classes of HDAC inhibitors. PMID:26468920

  15. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Oda, H.

    2013-12-01

    The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. The convolution process results in a smoothed measurement and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of sensor response and improve the resolution for continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that cross terms (i.e. response of pick-up coil for one axis to magnetic signal along other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution in UDECON is achieved by searching for the minimum ABIC while shifting the sensor response (to account for possible mispositioning of the sample on the tray) and a smoothness parameter in ranges defined by the user. Comparison of deconvolution results using sensor response estimated from integrated point source measurements and other methods suggests that the integrated point source estimate yields better results (smaller ABIC). The noise characteristics of magnetometer measurements and the reliability of the UDECON algorithm were tested using repeated (a total of 400 times) natural remanence measurement of a u-channel sample before and after stepwise alternating field demagnetizations. Using a series of synthetic data constructed based on a real paleomagnetic record, we demonstrate that optimised deconvolution using UDECON can greatly help reveal detailed paleomagnetic information such as excursions that may be smoothed out during pass-through measurement. Application of UDECON to the vast amount of existing and future pass-through paleomagnetic and rock magnetic measurements on sediments recovered especially through ocean drilling programs will contribute to our understanding of the geodynamo and paleo-environment by providing more detailed records of geomagnetic and environmental changes.
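    The underlying inverse problem, a pass-through measurement being the convolution of the sample's remanence with the sensor response, can be illustrated with a much simpler scheme than the ABIC-based optimisation implemented in UDECON. The sketch below uses plain Tikhonov (second-difference) regularisation with a user-chosen smoothness parameter and synthetic 1-D data; it conveys the operation, not UDECON's algorithm.

```python
# Simplified 1-D deconvolution of a pass-through measurement with a smoothness
# constraint (Tikhonov regularisation), not the ABIC minimisation of UDECON.
import numpy as np

def deconvolve(measured, response, smoothness):
    """Recover the remanence signal from measured = response (*) remanence."""
    n = len(measured)
    half = len(response) // 2
    # Convolution matrix G such that measured ~= G @ remanence
    G = np.zeros((n, n))
    for i in range(n):
        for j, r in enumerate(response):
            k = i + half - j
            if 0 <= k < n:
                G[i, k] += r
    # Second-difference operator penalising rough solutions
    D = np.diff(np.eye(n), n=2, axis=0)
    lhs = G.T @ G + smoothness * (D.T @ D)
    return np.linalg.solve(lhs, G.T @ measured)

# Toy example: a boxcar magnetisation smoothed by a broad Gaussian sensor response
true = np.zeros(200)
true[80:120] = 1.0
resp = np.exp(-np.linspace(-3, 3, 21) ** 2)
resp /= resp.sum()
measured = np.convolve(true, resp, mode="same")
recovered = deconvolve(measured, resp, smoothness=1e-4)
print("rms misfit before deconvolution:", np.sqrt(np.mean((measured - true) ** 2)))
print("rms misfit after deconvolution :", np.sqrt(np.mean((recovered - true) ** 2)))
```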

  16. ELECTRA © Launch and Re-Entry Safety Analysis Tool

    NASA Astrophysics Data System (ADS)

    Lazare, B.; Arnal, M. H.; Aussilhou, C.; Blazquez, A.; Chemama, F.

    2010-09-01

    The French Space Operation Act assigns to the National Technical Regulations the prime objective of protecting people, property, public health and the environment. In this framework, the independent technical assessment of French space operations is delegated to CNES. To perform this task, and also for its own operations, CNES needs efficient state-of-the-art tools for evaluating risks. The development of the ELECTRA© tool, undertaken in 2007, meets the requirement for precise quantification of the risks involved in the launch and re-entry of spacecraft. The ELECTRA© project draws on the proven expertise of CNES technical centers in the field of flight analysis and safety, spaceflight dynamics and the design of spacecraft. The ELECTRA© tool was specifically designed to evaluate the risks involved in the re-entry and return to Earth of all or part of a spacecraft. It will also be used for locating and visualizing nominal or accidental re-entry zones while comparing them with suitable geographic data such as population density, urban areas, and shipping lines, among others. The method chosen for ELECTRA© consists of two main steps: calculating the possible re-entry trajectories for each fragment after the spacecraft breaks up, and calculating the risks while taking into account the energy of the fragments, the population density and the protection afforded by buildings. For launch operations and active re-entry, the risk calculation will be weighted by the probability of instantaneous failure of the spacecraft and integrated over the whole trajectory. ELECTRA©'s development is now at the end of the validation phase, the last step before delivery to users. Validation has been performed in several ways: numerical verification of the risk formulation; benchmarking of the casualty area, the energy of the entering fragments and the protection afforded by housing; application of space-transportation-industry best practices for dependability evaluation; and benchmarking of world population distribution, leading to the choice of the widely used GPW V3 model. Validation was completed by numerous system tests, most of them by comparison with existing tools in operational use, for example at the European spaceport in French Guiana. The purpose of this article is to review the method and models chosen by CNES for describing physical phenomena and the results of the validation process, including comparison with other risk assessment tools.
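    The kind of figure such a tool produces can be illustrated with a back-of-the-envelope expected-casualty estimate: for each surviving fragment, the product of its casualty area, the local population density and its impact probability, reduced by a sheltering factor. All numbers and the simple sheltering model below are illustrative assumptions, not ELECTRA's models.

```python
# Toy expected-casualty estimate; numbers and sheltering model are illustrative.
def expected_casualties(fragments, population_density_km2, sheltering_factor=0.3):
    """fragments: list of dicts with casualty area (m^2) and impact probability."""
    ec = 0.0
    for frag in fragments:
        # expected number of people under the fragment's casualty area
        exposed = frag["casualty_area_m2"] * population_density_km2 * 1e-6
        ec += frag["impact_probability"] * exposed * (1.0 - sheltering_factor)
    return ec

# Three hypothetical fragments falling over a zone of 50 inhabitants per km^2
fragments = [
    {"casualty_area_m2": 8.0,  "impact_probability": 0.4},
    {"casualty_area_m2": 2.5,  "impact_probability": 0.7},
    {"casualty_area_m2": 12.0, "impact_probability": 0.1},
]
print(expected_casualties(fragments, population_density_km2=50.0))
```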

  17. Least-squares (LS) deconvolution of a series of overlapping cortical auditory evoked potentials: a simulation and experimental study

    NASA Astrophysics Data System (ADS)

    Bardy, Fabrice; Van Dun, Bram; Dillon, Harvey; Cowan, Robert

    2014-08-01

    Objective. To evaluate the viability of disentangling a series of overlapping ‘cortical auditory evoked potentials’ (CAEPs) elicited by different stimuli using least-squares (LS) deconvolution, and to assess the adaptation of CAEPs for different stimulus onset-asynchronies (SOAs). Approach. Optimal aperiodic stimulus sequences were designed by controlling the condition number of matrices associated with the LS deconvolution technique. First, theoretical considerations of LS deconvolution were assessed in simulations in which multiple artificial overlapping responses were recovered. Second, biological CAEPs were recorded in response to continuously repeated stimulus trains containing six different tone-bursts with frequencies 8, 4, 2, 1, 0.5, 0.25 kHz separated by SOAs jittered around 150 (120-185), 250 (220-285) and 650 (620-685) ms. The control condition had a fixed SOA of 1175 ms. In a second condition, using the same SOAs, trains of six stimuli were separated by a silence gap of 1600 ms. Twenty-four adults with normal hearing (<20 dB HL) were assessed. Main results. Results showed disentangling of a series of overlapping responses using LS deconvolution on simulated waveforms as well as on real EEG data. The use of rapid presentation and LS deconvolution did not, however, allow the recovered CAEPs to have a higher signal-to-noise ratio than for slowly presented stimuli. The LS deconvolution technique enables the analysis of a series of overlapping responses in EEG. Significance. LS deconvolution is a useful technique for the study of adaptation mechanisms of CAEPs for closely spaced stimuli whose characteristics change from stimulus to stimulus. High-rate presentation is necessary to develop an understanding of how the auditory system encodes natural speech or other intrinsically high-rate stimuli.
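    The LS deconvolution idea can be sketched as follows: the recorded EEG is modelled as the superposition of one unknown response per stimulus type inserted at its onset samples, and all responses are recovered jointly by least squares; the sequence-design step in the paper amounts to keeping the condition number of the design matrix small. The toy code below (synthetic responses, jittered onsets) illustrates the construction but is not the authors' implementation.

```python
# Toy least-squares deconvolution of overlapping evoked responses.
import numpy as np

def build_design_matrix(onsets_per_type, response_len, n_samples):
    """Superposition model: eeg[n] = sum over types/onsets of response[n - onset]."""
    n_types = len(onsets_per_type)
    A = np.zeros((n_samples, n_types * response_len))
    for t, onsets in enumerate(onsets_per_type):
        for onset in onsets:
            for lag in range(response_len):
                if onset + lag < n_samples:
                    A[onset + lag, t * response_len + lag] += 1.0
    return A

rng = np.random.default_rng(0)
resp_len, n_samples = 60, 1400
true_responses = [np.hanning(resp_len), -0.5 * np.hanning(resp_len)]   # two toy "CAEPs"

# Jittered SOAs keep the columns of A linearly independent (well conditioned)
base = np.arange(20, 1250, 130)
onsets = [list(base + rng.integers(-15, 16, size=base.size)),
          list(base + 45 + rng.integers(-15, 16, size=base.size))]

A = build_design_matrix(onsets, resp_len, n_samples)
print("condition number of A:", round(np.linalg.cond(A)))
eeg = A @ np.concatenate(true_responses) + 0.05 * rng.standard_normal(n_samples)
recovered, *_ = np.linalg.lstsq(A, eeg, rcond=None)
print("max recovery error, response 1:",
      np.abs(recovered[:resp_len] - true_responses[0]).max())
```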

  18. Design and Analysis Tool for External-Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2012-01-01

    A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.
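    As one example of the kind of analytic relation that low-fidelity inlet analysis draws on, the sketch below evaluates the total pressure recovery across a normal shock as a function of the upstream Mach number (a standard compressible-flow result for gamma = 1.4). This is textbook material offered for orientation, not an excerpt from SUPIN.

```python
# Total pressure recovery across a normal shock (standard textbook relation).
def normal_shock_recovery(mach, gamma=1.4):
    """Ratio of downstream to upstream total pressure across a normal shock."""
    if mach <= 1.0:
        return 1.0
    term1 = ((gamma + 1.0) * mach**2 / ((gamma - 1.0) * mach**2 + 2.0)) ** (gamma / (gamma - 1.0))
    term2 = ((gamma + 1.0) / (2.0 * gamma * mach**2 - (gamma - 1.0))) ** (1.0 / (gamma - 1.0))
    return term1 * term2

for m in (1.3, 1.6, 2.0):
    print(f"M = {m}: recovery = {normal_shock_recovery(m):.4f}")
```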

  19. Assessing Extremes Climatology Using NWS Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

    2010-12-01

    The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices’ ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will give NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers the ability to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. Field offices need standardized, scientifically sound methodology for local climate analysis (such as trend, composites, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice at the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. LCAT focuses on the local scale (as opposed to the national and global scales of CPC products). LCAT will: improve the professional competency of local office staff and their expertise in providing local information to users, thereby improving the quality of local climate services; ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, allowing improvement of CPC climate products; and allow testing of local climate variables beyond temperature averages and precipitation totals, such as the climatology of tornadoes, flash floods, storminess, and other extreme weather events, expanding the suite of NWS climate products. The LCAT development utilizes the NWS Operations and Services Improvement Process (OSIP) to document the field and user requirements, develop solutions, and prioritize resources. OSIP is a five work-stage process separated by four gate reviews. LCAT is currently at work-stage three: Research Demonstration and Solution Analysis. Gate 1 and 2 reviews identified LCAT as a high strategic priority project with a very high operational need. The Integrated Working Team, consisting of NWS field representatives, assists in tool function design and identification of LCAT operational deployment support.

  20. Spatial deconvolution of spectropolarimetric data: an application to quiet Sun magnetic elements

    NASA Astrophysics Data System (ADS)

    Quintero Noda, C.; Asensio Ramos, A.; Orozco Suárez, D.; Ruiz Cobo, B.

    2015-07-01

    Context. One of the difficulties in extracting reliable information about the thermodynamical and magnetic properties of solar plasmas from spectropolarimetric observations is the presence of light dispersed inside the instruments, known as stray light. Aims: We aim to analyze quiet Sun observations after the spatial deconvolution of the data. We examine the validity of the deconvolution process with noisy data as we analyze the physical properties of quiet Sun magnetic elements. Methods: We used a regularization method that decouples the Stokes inversion from the deconvolution process, so that large maps can be quickly inverted without much additional computational burden. We applied the method on Hinode quiet Sun spectropolarimetric data. We examined the spatial and polarimetric properties of the deconvolved profiles, comparing them with the original data. After that, we inverted the Stokes profiles using the Stokes Inversion based on Response functions (SIR) code, which allows us to obtain the optical depth dependence of the atmospheric physical parameters. Results: The deconvolution process increases the contrast of continuum images and makes the magnetic structures sharper. The deconvolved Stokes I profiles reveal the presence of the Zeeman splitting while the Stokes V profiles significantly change their amplitude. The area and amplitude asymmetries of these profiles increase in absolute value after the deconvolution process. We inverted the original Stokes profiles from a magnetic element and found that the magnetic field intensity reproduces the overall behavior of theoretical magnetic flux tubes, that is, the magnetic field lines are vertical in the center of the structure and start to fan out as we move away from the center of the magnetic element. The magnetic field vector inferred from the deconvolved Stokes profiles also mimics a magnetic flux tube but in this case we found stronger field strengths and the gradients along the line-of-sight are larger for the magnetic field intensity and for its inclination. Moreover, the discontinuity between the magnetic and non-magnetic environment in the flux tube gets sharper. Conclusions: The deconvolution process used in this paper reveals information that the smearing induced by the point spread function (PSF) of the telescope hides. Additionally, the deconvolution is done with a low computational load, making it appealing for its use on the analysis of large data sets. A copy of the IDL code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/579/A3
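    The generic operation of removing PSF smearing from a 2D map can be sketched with a Wiener-like regularised inverse filter, as below. The paper's regularisation scheme, which decouples the deconvolution from the Stokes inversion, is more involved and is not reproduced here; the synthetic "magnetic element" image is a toy example.

```python
# Generic regularised (Wiener-like) deconvolution of a 2D map with a known PSF.
import numpy as np

def wiener_deconvolve(image, psf, reg=1e-3):
    """Deconvolve `image` by `psf` (same shape, centred) in the Fourier domain."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    G = np.fft.fft2(image)
    F = np.conj(H) * G / (np.abs(H) ** 2 + reg)
    return np.real(np.fft.ifft2(F))

# Toy "continuum image": a bright element on a flat background
n = 128
yy, xx = np.mgrid[:n, :n]
truth = 1.0 + 0.5 * (np.hypot(yy - 64, xx - 64) < 4)

# Gaussian PSF, centred and normalised to unit sum
psf = np.exp(-((yy - 64) ** 2 + (xx - 64) ** 2) / (2 * 3.0 ** 2))
psf /= psf.sum()

observed = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))
restored = wiener_deconvolve(observed, psf)
print("contrast before:", observed.std() / observed.mean())
print("contrast after :", restored.std() / restored.mean())
```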

  1. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  2. Spasmodic dysphonia, perceptual and acoustic analysis: presenting new diagnostic tools.

    PubMed

    Siemons-Lühring, Denise Irene; Moerman, Mieke; Martens, Jean-Pierre; Deuster, Dirk; Müller, Frank; Dejonckere, Philippe

    2009-12-01

    In this article, we investigate whether (1) the IINFVo (Impression, Intelligibility, Noise, Fluency and Voicing) perceptual rating scale and (2) the AMPEX (Auditory Model Based Pitch Extractor) acoustical analysis are suitable for evaluating adductor spasmodic dysphonia (AdSD). Voice recordings of 12 patients were analysed. The inter-rater and intra-rater consistency showed highly significant correlations for the IINFVo rating scale, with the exception of the parameter Noise. AMPEX reliably analyses vowels (correlation between PUVF, the percentage of frames with unreliable F0, and the perceptual Voicing rating: 0.748), running speech (correlation between PVF, the percentage of voiced frames, and the perceptual Voicing rating: 0.699), and syllables. Correlations between IINFVo and AMPEX range from 0.608 to 0.818, except for Noise. This study indicates that IINFVo and AMPEX could be robust and complementary assessment tools for the evaluation of AdSD. Both tools provide valuable information about voice quality, the stability of F0 (fundamental frequency), and specific dimensions controlling the transitions between voiced and unvoiced segments. PMID:19866529

  3. Developments in microcomputer network analysis tools for transportation planning

    SciTech Connect

    Lewis, S.; McNeil, S.

    1986-10-01

    This article describes a range of available transportation network analysis tools. Recent developments have included the addition of junction-capacity-restrained versions, which are more appropriate for use in urban areas (e.g., JAM), and the development of more complete packages (e.g., NEXUS and EMME/2). Although many of the existing software tools allow for the incorporation of fixed delays at junctions, they do not attempt to realistically model the dynamic and interactive effects of delays at junctions on one another. Such effects may be typically dominant in a congested urban network, though they may be less important in an out-of-town development with spare capacity on the highway network. The fact that many U.S. developments have taken place in locations where highway capacity was available or the opportunity for major reconstruction existed may partly explain the absence of dynamic models in software developed in the United States. The increased availability of microcomputers and the decreasing cost of such processing power, as well as the fact that more U.S. site developments are occurring in highly congested situations, with limited funds for major new road building and improvements (particularly compared to anticipated demand), indicate a strong reason for the continued development and use of models such as JAM, NEXUS, and EMME/2 in the United States. Such models may encourage more effective design, build, and operation strategies to be pursued.

  4. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven along with its user-programmable features can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  5. Analysis and specification tools in relation to the APSE

    NASA Technical Reports Server (NTRS)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada based or Ada related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few chosen specification and design tools, could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  6. Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools

    SciTech Connect

    Barrand, Guy

    2002-08-20

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualization), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  7. A Freeware Java Tool for Spatial Point Analysis of Neuronal Structures

    E-print Network

    Condron, Barry

    NEWS ITEM: A Freeware Java Tool for Spatial Point Analysis of Neuronal Structures (Barry G. Condron). A freeware tool, called PAJ, has been developed. This Java-based tool takes 3D Cartesian coordinates as input and is based on previously described statistical analysis (Diggle 2003). In PAJ, data is copied

  8. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  9. Sensitivity analysis of an information fusion tool: OWA operator

    NASA Astrophysics Data System (ADS)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

    2007-04-01

    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
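    The Fuzzy Linguistic Quantifiers approach mentioned above can be illustrated with a short sketch: order weights are taken from a regular increasing monotone quantifier Q(r) = r**alpha, where alpha encodes the decision maker's optimism degree, and the OWA value is the weighted sum of the sorted criteria. The Minimal Variability weights come from a separate optimisation not reproduced here; the scores below are made-up.

```python
# OWA aggregation with order weights derived from a linguistic quantifier.
import numpy as np

def owa_weights(n, alpha):
    """Order weights w_i = Q(i/n) - Q((i-1)/n) with Q(r) = r**alpha."""
    i = np.arange(1, n + 1)
    return (i / n) ** alpha - ((i - 1) / n) ** alpha

def owa(values, alpha):
    ordered = np.sort(values)[::-1]          # criteria sorted in descending order
    return float(np.dot(owa_weights(len(values), alpha), ordered))

scores = np.array([0.9, 0.4, 0.7, 0.2])      # toy criteria scores for one alternative
for alpha in (0.5, 1.0, 2.0):                # optimistic -> neutral -> pessimistic
    print(alpha, round(owa(scores, alpha), 3))
```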

  10. Input Range Testing for the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all objects fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is VERY important to note that the tests below must be performed for both the Graphical User Interface and the script!! The examples are illustrated using a scripting perspective, because it is simpler to write up. However, the test must be performed for both interfaces to GMAT.

  11. In silico tools for the analysis of antibiotic biosynthetic pathways.

    PubMed

    Weber, Tilmann

    2014-05-01

    Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data. PMID:24631213

  12. IMPROVED TEMPORAL AND SPATIAL FOCUSING USING DECONVOLUTION: THEORETICAL, NUMERICAL AND

    E-print Network

    Snieder, Roel

    for Materials Research and Testing (BAM). For this experiment, we placed multiple sources within a concrete the improved spatial focus achieved using deconvolution by scanning around the source location with a laser showed that deconvolution was able to dramatically improve the temporal focus for a source and a receiver

  13. MULTICHANNEL BLIND SEPARATION AND DECONVOLUTION OF SOURCES WITH

    E-print Network

    Cichocki, Andrzej

    MULTICHANNEL BLIND SEPARATION AND DECONVOLUTION OF SOURCES WITH ARBITRARY DISTRIBUTIONS Scott C, RIKEN Wako-shi, Saitama 351-01 JAPAN Abstract: Blind deconvolution and separation of linearly mixed simulations show the validity and efficiency of our method to blindly extract mixtures of arbitrary

  14. BLIND SIGNAL DECONVOLUTION BY SPATIO-TEMPORAL DECORRELATION AND

    E-print Network

    Cichocki, Andrzej

    BLIND SIGNAL DECONVOLUTION BY SPATIO-TEMPORAL DECORRELATION AND DEMIXING Seungjin CHOI and Andrzej CICHOCKI … on-line adaptive multichannel blind deconvolution and separation of i.i.d. sources. Under mild conditions which consists of blind equalization and source separation. In the blind equalization stage, we employ anti

  15. BLIND DECONVOLUTION WITH MINIMUM RENYI'S ENTROPY Deniz Erdogmus1

    E-print Network

    Slatton, Clint

    BLIND DECONVOLUTION WITH MINIMUM RENYI'S ENTROPY Deniz Erdogmus1, Jose C. Principe1, Luis Vielva2. E-mail: [deniz, principe]@cnel.ufl.edu, luis@dicom.unican.es ABSTRACT Blind techniques attract attention in many areas, from communications to control systems. Blind deconvolution is a problem that has been investigated

  16. Immunoglobulin analysis tool: a novel tool for the analysis of human and mouse heavy and light chain transcripts.

    PubMed

    Rogosch, Tobias; Kerzel, Sebastian; Hoi, Kam Hon; Zhang, Zhixin; Maier, Rolf F; Ippolito, Gregory C; Zemlin, Michael

    2012-01-01

    Sequence analysis of immunoglobulin (Ig) heavy and light chain transcripts can refine categorization of B cell subpopulations and can shed light on the selective forces that act during immune responses or immune dysregulation, such as autoimmunity, allergy, and B cell malignancy. High-throughput sequencing yields Ig transcript collections of unprecedented size. The authoritative web-based IMGT/HighV-QUEST program is capable of analyzing large collections of transcripts and provides annotated output files to describe many key properties of Ig transcripts. However, additional processing of these flat files is required to create figures, or to facilitate analysis of additional features and comparisons between sequence sets. We present an easy-to-use Microsoft(®) Excel(®) based software, named Immunoglobulin Analysis Tool (IgAT), for the summary, interrogation, and further processing of IMGT/HighV-QUEST output files. IgAT generates descriptive statistics and high-quality figures for collections of murine or human Ig heavy or light chain transcripts ranging from 1 to 150,000 sequences. In addition to traditionally studied properties of Ig transcripts - such as the usage of germline gene segments, or the length and composition of the CDR-3 region - IgAT also uses published algorithms to calculate the probability of antigen selection based on somatic mutational patterns, the average hydrophobicity of the antigen-binding sites, and predictable structural properties of the CDR-H3 loop according to Shirai's H3-rules. These refined analyses provide in-depth information about the selective forces acting upon Ig repertoires and allow the statistical and graphical comparison of two or more sequence sets. IgAT is easy to use on any computer running Excel(®) 2003 or higher. Thus, IgAT is a useful tool to gain insights into the selective forces and functional properties of small to extremely large collections of Ig transcripts, thereby assisting a researcher to mine a data set to its fullest. PMID:22754554

  17. A fast approach to identification using deconvolution

    NASA Technical Reports Server (NTRS)

    Chi, C.-Y.; Mendel, J. M.

    1983-01-01

    In this paper, we propose a fast approach to impulse response and noise-variance identification for a finite-order, linear, time-invariant, single-input/single-output system, whose input driving noise is white (stationary or nonstationary) and measurement noise is stationary, white and Gaussian. Our algorithm is an iterative block component method that includes two stages, deconvolution and prediction-error identification. Experiences with our method indicate that it works well and saves about an order of magnitude in computation. Analyses and examples are given in this paper to support this claim.
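    A much simplified sketch of the deconvolution stage of such an identification scheme is shown below: given a measured white input and a noisy output of a finite-order linear system, the impulse response is estimated by least squares on the convolution model. This is a generic illustration, not the iterative block component algorithm of the paper.

```python
# Least-squares estimation of an FIR impulse response from input/output data.
import numpy as np

def estimate_impulse_response(u, y, order):
    """Least-squares estimate of an FIR impulse response of length `order`."""
    n = len(y)
    U = np.zeros((n, order))
    for lag in range(order):
        U[lag:, lag] = u[:n - lag]          # Toeplitz-like regressor matrix
    h_hat, *_ = np.linalg.lstsq(U, y, rcond=None)
    return h_hat

rng = np.random.default_rng(1)
h_true = np.array([1.0, 0.6, -0.3, 0.1])    # unknown impulse response (toy)
u = rng.standard_normal(500)                 # white driving noise
y = np.convolve(u, h_true)[:500] + 0.05 * rng.standard_normal(500)
print(np.round(estimate_impulse_response(u, y, order=4), 2))
```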

  18. EVA - An Interactive Online Tool for Extreme Value Analysis

    NASA Astrophysics Data System (ADS)

    Zingerle, C.; Buchauer, M.; Neururer, A.; Schellander, H.

    2009-09-01

    Forecasting and analysing extreme events and their impact is a duty of operational forecasters, even though such situations arise infrequently. In such situations forecasters often rely on a synopsis of different forecast models, their own experience, historical observations and intuition. Historical data in particular are usually not available with the completeness and timeliness needed in operational forecasting and warning. A forecaster needs a comprehensive overview and has no time to dig data out of a database, search for extremes and compile a rather complicated extreme value analysis. On the other hand, in the field of engineering, expertise on extreme events is often requested from a modern weather service, and in many cases the time for elaboration is limited. EVA (Extreme Value Analysis) was developed at ZAMG during METEORISK, a project among alpine weather and hydrological services dealing with meteorological and hydrological risks. The EVA system consists of two main components. The first is an effective database containing pre-processed precipitation data (rain, snow and snow height) from meteorological events of durations from 1 minute up to 15 days, measured at each station in the partner regions. The second part of the system is a set of web tools to deal with the actual extreme value analysis. Different theoretical models can be chosen to calculate annualities (return periods). The output is presented either in tabular form, showing all extreme events at a station together with the theoretically calculated return times, or graphically, where parameters such as the precipitation amount at given return times and confidence intervals are plotted together with the empirical distribution of the actual measurements. Additional plots (quantile-quantile plots, empirical and fitted theoretical distributions) allowing a more detailed assessment of the extreme value analysis can be requested. To complete the analysis of a particular extreme event, ECMWF ERA40 sea level and upper-air pressure fields and temperature distributions are available within the system. In the years since METEORISK, the EVA system has been expanded by ZAMG, adding further parameters such as wind speed and temperature. The system has recently been harmonized, so that ZAMG now has a single platform providing fast extreme value analysis for all meteorological parameters of interest. A further development is the EVA-maps application. Forecasted extreme events at station locations and actual measurements are compared to historical extreme events. Return times of the forecasted and measured events are classified and displayed in a map. A mouse-over menu offers detailed analysis of the situation at each station. EVA-maps is a powerful aid for forecasters, giving them a comprehensive overview of forecasted precipitation in relation to extreme events of the past.
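    The core computation behind such a tool can be sketched with a standard fit of a generalised extreme value (GEV) distribution to annual maxima, from which return levels for chosen return periods are read off. The data below are synthetic and the code is only a generic illustration, not EVA's implementation.

```python
# Fit a GEV distribution to synthetic annual maxima and compute return levels.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
annual_max_precip = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=80,
                                   random_state=rng)   # synthetic 1-day maxima (mm)

shape, loc, scale = genextreme.fit(annual_max_precip)

for return_period in (10, 50, 100):
    # Return level = value exceeded on average once every `return_period` years
    level = genextreme.isf(1.0 / return_period, shape, loc, scale)
    print(f"{return_period}-year return level: {level:.1f} mm")
```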

  19. Comparative Usability Study of Two Space Logistics Analysis Tools

    E-print Network

    Lee, Chaiwoo

    Future space exploration missions and campaigns will require sophisticated tools to help plan and analyze logistics. To encourage their use, space logistics tools must be usable: a design concept encompassing terms such ...

  20. jSIPRO - analysis tool for magnetic resonance spectroscopic imaging.

    PubMed

    Jiru, Filip; Skoch, Antonin; Wagnerova, Dita; Dezortova, Monika; Hajek, Milan

    2013-10-01

    Magnetic resonance spectroscopic imaging (MRSI) involves a huge number of spectra to be processed and analyzed. Several tools enabling MRSI data processing have been developed and widely used. However, the processing programs primarily focus on sophisticated spectra processing and offer limited support for the analysis of the calculated spectroscopic maps. In this paper the jSIPRO (java Spectroscopic Imaging PROcessing) program is presented, which is a java-based graphical interface enabling post-processing, viewing, analysis and result reporting of MRSI data. Interactive graphical processing as well as protocol controlled batch processing are available in jSIPRO. jSIPRO does not contain a built-in fitting program. Instead, it makes use of fitting programs from third parties and manages the data flows. Currently, automatic spectra processing using LCModel, TARQUIN and jMRUI programs are supported. Concentration and error values, fitted spectra, metabolite images and various parametric maps can be viewed for each calculated dataset. Metabolite images can be exported in the DICOM format either for archiving purposes or for the use in neurosurgery navigation systems. PMID:23870172

  1. Spectral Analysis Tool 6.2 for Windows

    NASA Technical Reports Server (NTRS)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.

  2. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  3. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. ALSSAT can perform sizing analysis of ALS subsystems that operate either dynamically or in steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.

  4. Orienting the Neighborhood: A Subdivision Energy Analysis Tool

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-01-01

    In subdivisions, house orientations are largely determined by street layout. The resulting house orientations affect energy consumption (annual and on-peak) for heating and cooling, depending on window area distributions and shading from neighboring houses. House orientations also affect energy production (annual and on-peak) from solar thermal and photovoltaic systems, depending on available roof surfaces. Therefore, house orientations fundamentally influence both energy consumption and production, and an appropriate street layout is a prerequisite for taking full advantage of energy efficiency and renewable energy opportunities. The potential influence of street layout on solar performance is often acknowledged, but solar and energy issues must compete with many other criteria and constraints that influence subdivision street layout. When only general guidelines regarding energy are available, these factors may be ignored or have limited effect. Also, typical guidelines are often not site-specific and do not account for local parameters such as climate and the time value of energy. For energy to be given its due consideration in subdivision design, energy impacts need to be accurately quantified and displayed interactively to facilitate analysis of design alternatives. This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  5. Micropollutants in urban watersheds : substance flow analysis as management tool

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (eg., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, the pollutants emitted from traffic, like copper or PAHs get to surface water during rain events often without any treatment. Pharmaceuticals resulting from human medical treatments get to surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. One other source of contamination in urban areas for these compounds are combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to get flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA) originally developed in the economic sector and later adapted to regional investigations. In this study, we propose to test the application of SFA for a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover a profound system-knowledge and many data were available, both on the sewer system and the water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results conducted on copper reveals that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for the benthic organisms. The major sources (total of 73%) of copper in receiving surface water are roofs and contact lines of trolleybuses. Thus technical solutions have to be found to manage this specific source of contamination. Application of SFA approach to four pharmaceuticals reveals that CSOs represent an important source of contamination: Between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads of Lausanne city to the Lake are due to CSOs. These results will help in defining the best management strategy to limit Lake Geneva contamination. SFA is thus a promising tool for integrated urban water management.

  6. The Watershed Deposition Tool: A Tool for Incorporating Atmospheric Deposition in Watershed Analysis

    EPA Science Inventory

    The tool for providing the linkage between air and water quality modeling needed for determining the Total Maximum Daily Load (TMDL) and for analyzing related nonpoint-source impacts on watersheds has been developed. The Watershed Deposition Tool (WDT) takes gridded output of at...

  7. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    PubMed Central

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  8. Coupled lattice neural network for blind deconvolution

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Chen, Yen-Wei; Nakao, Zensho

    1998-04-01

    In this research, we introduce an artificial neural network model, named the coupled lattice neural network, to reconstruct an original image from a degraded one in blind deconvolution, where the original image and the blurring function are both unknown. In the coupled lattice neural network, each neuron connects with its nearest neighbor neurons; the neighborhood corresponds to the weights of the neural network and is defined by a finite domain. The outputs of the neurons show the intensity distribution of the estimated original image. The weights of each neuron correspond to the estimated blur function and are the same for all neurons. The coupled lattice neural network includes two main operations: one is nearest-neighbor coupling or diffusion, the other is local nonlinear reflection and learning. First, a rule for growing the blur function is introduced. Then the coupled lattice neural network evolves the estimated original image based on the estimated blur function. Moreover, we define a growing error criterion to control the evolution of the coupled lattice neural network. When the error criterion is minimized, the coupled lattice neural network becomes stable; the outputs of the network then correspond to the reconstructed original image, and the weights correspond to the blur function. In addition, we demonstrate a method for choosing the initial state variables of the coupled lattice neural network. The new approach to blind deconvolution can recover a digital binary image successfully. Moreover, the coupled lattice neural network can also be used for the reconstruction of gray-scale images.
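    The alternating character of the scheme described above, refining an image estimate and a blur estimate in turn under a reconstruction-error criterion, can be conveyed with a generic gradient-based blind deconvolution sketch. The code below is a plain alternating gradient scheme on synthetic data, not the coupled lattice neural network itself.

```python
# Generic alternating blind deconvolution: update the image estimate and the
# blur estimate in turn so as to reduce the reconstruction error.
import numpy as np
from scipy.signal import convolve2d

def conv2(x, k):
    """2-D 'same' convolution with symmetric boundary handling."""
    return convolve2d(x, k, mode="same", boundary="symm")

def blind_deconvolve(degraded, kernel_size=5, iters=300, lr_img=0.5, lr_ker=0.05):
    img = degraded.copy()                                              # initial image estimate
    ker = np.full((kernel_size, kernel_size), 1.0 / kernel_size ** 2)  # flat blur guess
    h = kernel_size // 2
    for _ in range(iters):
        residual = conv2(img, ker) - degraded                          # reconstruction error
        # Gradient step on the image (correlation with the current blur)
        img = np.clip(img - lr_img * conv2(residual, ker[::-1, ::-1]), 0.0, 1.0)
        # Gradient step on the blur (correlate residual with shifted image)
        grad_k = np.zeros_like(ker)
        for dy in range(-h, h + 1):
            for dx in range(-h, h + 1):
                shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                grad_k[dy + h, dx + h] = np.mean(residual * shifted)
        ker = np.clip(ker - lr_ker * grad_k, 0.0, None)
        ker /= ker.sum()                                               # keep the blur normalised
    return img, ker

# Toy usage: blur a synthetic binary image, then try to recover image and blur
truth = np.zeros((64, 64))
truth[20:44, 20:44] = 1.0
blur = np.outer(np.hanning(5), np.hanning(5))
blur /= blur.sum()
degraded = conv2(truth, blur)
estimate, blur_estimate = blind_deconvolve(degraded)
```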

  9. ATHENA: the analysis tool for heritable and environmental network associations

    PubMed Central

    Holzinger, Emily R.; Dudek, Scott M.; Frase, Alex T.; Pendergrass, Sarah A.; Ritchie, Marylyn D.

    2014-01-01

    Motivation: Advancements in high-throughput technology have allowed researchers to examine the genetic etiology of complex human traits in a robust fashion. Although genome-wide association studies have identified many novel variants associated with hundreds of traits, a large proportion of the estimated trait heritability remains unexplained. One hypothesis is that the commonly used statistical techniques and study designs are not robust to the complex etiology that may underlie these human traits. This etiology could include non-linear gene × gene or gene × environment interactions. Additionally, other levels of biological regulation may play a large role in trait variability. Results: To address the need for computational tools that can explore enormous datasets to detect complex susceptibility models, we have developed a software package called the Analysis Tool for Heritable and Environmental Network Associations (ATHENA). ATHENA combines various variable filtering methods with machine learning techniques to analyze high-throughput categorical (i.e. single nucleotide polymorphisms) and quantitative (i.e. gene expression levels) predictor variables to generate multivariable models that predict either a categorical (i.e. disease status) or quantitative (i.e. cholesterol levels) outcomes. The goal of this article is to demonstrate the utility of ATHENA using simulated and biological datasets that consist of both single nucleotide polymorphisms and gene expression variables to identify complex prediction models. Importantly, this method is flexible and can be expanded to include other types of high-throughput data (i.e. RNA-seq data and biomarker measurements). Availability: ATHENA is freely available for download. The software, user manual and tutorial can be downloaded from http://ritchielab.psu.edu/ritchielab/software. Contact: marylyn.ritchie@psu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24149050

  10. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year, the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study was carried out with these students on their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose, every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, which can be used as a tool to calculate the probability of success or failure for incoming students in the following academic years. Keywords: Academic achievement, spatial analyst, GIS, Bologna.

  11. Deconvolution of Complex 1D NMR Spectra Using Objective Model Selection

    PubMed Central

    Hughes, Travis S.; Wilson, Henry D.; de Vera, Ian Mitchelle S.; Kojetin, Douglas J.

    2015-01-01

    Fluorine (19F) NMR has emerged as a useful tool for characterization of slow dynamics in 19F-labeled proteins. One-dimensional (1D) 19F NMR spectra of proteins can be broad, irregular and complex, due to exchange of probe nuclei between distinct electrostatic environments, and therefore cannot be deconvoluted and analyzed in an objective way using currently available software. We have developed a Python-based deconvolution program, decon1d, which uses Bayesian information criteria (BIC) to objectively determine which model (number of peaks) would most likely produce the experimentally obtained data. The method also allows for fitting of intermediate exchange spectra, which is not supported by current software in the absence of a specific kinetic model. In current methods, determination of the deconvolution model best supported by the data is done manually through comparison of residual error values, which can be time consuming and requires model selection by the user. In contrast, the BIC method used by decon1d provides a quantitative method for model comparison that penalizes for model complexity, helping to prevent over-fitting of the data and allowing identification of the most parsimonious model. The decon1d program is freely available as a downloadable Python script at the project website (https://github.com/hughests/decon1d/). PMID:26241959
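
    As an illustration of the BIC-based model selection described above, the sketch below fits an increasing number of Lorentzian peaks to a 1D spectrum and keeps the model with the lowest BIC. It is a simplified stand-in for decon1d (whose exact line-shape model and fitting strategy may differ); the initial-guess heuristic and the maximum number of peaks are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lorentzian(x, amp, x0, w):
        # Lorentzian line shape with half-width-at-half-maximum w
        return amp * w**2 / ((x - x0)**2 + w**2)

    def multi_peak(x, *params):
        # params is a flat tuple: (amp, x0, w) for each peak
        y = np.zeros_like(x, dtype=float)
        for i in range(0, len(params), 3):
            y += lorentzian(x, *params[i:i + 3])
        return y

    def bic_select(x, y, max_peaks=5):
        """Fit 1..max_peaks Lorentzians and return (bic, n_peaks, params) with lowest BIC."""
        n = len(x)
        best = None
        for m in range(1, max_peaks + 1):
            # crude, illustrative initial guesses spread across the spectral window
            centers = np.linspace(x.min(), x.max(), m + 2)[1:-1]
            p0 = []
            for c in centers:
                p0 += [y.max(), c, (x.max() - x.min()) / (10.0 * m)]
            try:
                popt, _ = curve_fit(multi_peak, x, y, p0=p0, maxfev=20000)
            except RuntimeError:
                continue                                   # this model failed to converge
            rss = float(np.sum((y - multi_peak(x, *popt)) ** 2))
            k = 3 * m                                      # number of fitted parameters
            bic = n * np.log(rss / n) + k * np.log(n)      # BIC for Gaussian residuals
            if best is None or bic < best[0]:
                best = (bic, m, popt)
        return best
    ```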

  12. Matrix of Response Functions for Deconvolution of Gamma-ray Spectra

    NASA Astrophysics Data System (ADS)

    Shustov, A. E.; Ulin, S. E.

    An approach for creating the matrix of response functions for the xenon gamma-ray detector is discussed. A set of gamma-ray spectra was obtained by Geant4 simulation to generate the matrix. The iterative algorithms used make it possible to deconvolve and restore the initial gamma-ray spectra. The processed spectrum contains peaks that help to identify a radioactive source and estimate its activity. Results and analysis of the deconvolution of experimental spectra are presented.

  13. Using deconvolution to improve the metrological performance of the grid method

    E-print Network

    Sur, Frédéric

    Using deconvolution to improve the metrological performance of the grid method. Michel Grédiac (michel.grediac@univ-bpclermont.fr). Abstract: The use of various deconvolution techniques to improve the metrological performance of the grid method is addressed. Six deconvolution techniques are presented and employed to restore a synthetic...

  14. ALLPASS VS. UNIT-NORM CONSTRAINTS IN CONTRAST-BASED BLIND DECONVOLUTION

    E-print Network

    Douglas, Scott C.

    ALLPASS VS. UNIT-NORM CONSTRAINTS IN CONTRAST-BASED BLIND DECONVOLUTION. Scott C. Douglas and S. [...] blind deconvolution tasks without modification. For prewhitened signals, however, an allpass [...] blind deconvolution tasks. I. INTRODUCTION. The blind deconvolution task figures prominently in many...

  15. The Starlet Transform in Astronomical Data Processing: Application to Source Detection and Image Deconvolution

    E-print Network

    Starck, Jean-Luc

    J.-L. Starck, F. Murtagh, M. Bertero. August 5, 2011. Contents include: Introduction; Standard Deconvolution; Statistical Approach to Deconvolution; The Richardson-Lucy Algorithm; Deconvolution...

  16. HPLC analysis as a tool for assessing targeted liposome composition.

    PubMed

    Oswald, Mira; Platscher, Michael; Geissler, Simon; Goepferich, Achim

    2016-01-30

    Functionalized phospholipids are indispensable materials for the design of targeted liposomes. Control over the quality and quantity of phospholipids is thereby key in the successful development and manufacture of such formulations. This was also the case for a complex liposomal preparation composed of 1,2-dioleoyl-sn-glycero-3-phosphocholine (DOPC), cholesterol (CHO), and 1,2-distearoyl-sn-glycero-3-phosphoethanolamine-N-[amino(polyethylene glycol)-2000] (DSPE-PEG2000). To this end, an RP-HPLC method was developed. Detection of the liposomal components was done via evaporative light scattering (ELS). The method was validated for linearity, precision, accuracy, sensitivity and robustness. The liposomal compounds had a non-linear quadratic response in the concentration range of 0.012-0.42 mg/ml with a correlation coefficient greater than 0.99, and the accuracy of the method was confirmed to be 95-105% of the theoretical concentration. Furthermore, degradation products from the liposomal formulation could be identified. The presented method was successfully implemented as a control tool during the preparation of functionalized liposomes. It underlined the benefit of HPLC analysis of phospholipids during liposome preparation as an easy and rapid control method for the functionalized lipid at each preparation step as well as for the quantification of all components. PMID:26570988

  17. Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique

    ERIC Educational Resources Information Center

    Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

    2005-01-01

    A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…

  18. Analysis of the influence of tool dynamics in diamond turning

    SciTech Connect

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  19. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  20. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc; Bush, Brian; Penev, Michael

    2015-05-12

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  1. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    SciTech Connect

    Melaina, Marc

    2015-04-21

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  2. Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions

    E-print Network

    Halgamuge, Malka N.

    Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions. Dung V. [...] hack tools. However, besides U3 technology, attackers also have another, more flexible alternative: portable applications, or application virtualization, which allows a wide range of hack tools to be compiled...

  3. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    ERIC Educational Resources Information Center

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  4. Blind deconvolution of two-dimensional complex data

    SciTech Connect

    Ghiglia, D.C.; Romero, L.A.

    1994-01-01

    Inspired by the work of Lane and Bates on automatic multidimensional deconvolution, the authors have developed a systematic approach and an operational code for performing the deconvolution of multiply-convolved two-dimensional complex data sets in the absence of noise. They explain, in some detail, the major algorithmic steps, where noise or numerical errors can cause problems, their approach in dealing with numerical rounding errors, and where special noise-mitigating techniques can be used toward making blind deconvolution practical. Several examples of deconvolved imagery are presented, and future research directions are noted.

  5. A similarity theory of approximate deconvolution models of turbulence

    NASA Astrophysics Data System (ADS)

    Layton, William; Neda, Monika

    2007-09-01

    We apply the phenomenology of homogeneous, isotropic turbulence to the family of approximate deconvolution models proposed by Stolz and Adams. In particular, we establish that the models themselves have an energy cascade with two asymptotically different inertial ranges. Delineation of these gives insight into the resolution requirements of using approximate deconvolution models. The approximate deconvolution model's energy balance contains both an enhanced energy dissipation and a modification to the model's kinetic energy. The modification of the model's kinetic energy induces a secondary energy cascade which accelerates scale truncation. The enhanced energy dissipation completes the scale truncation by reducing the model's micro-scale from the Kolmogorov micro-scale.
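
    For reference, the approximate deconvolution operator used in the Stolz-Adams family is typically the truncated van Cittert series (written here in generic notation, which may differ slightly from the paper's):

    \[ D_N = \sum_{n=0}^{N} (I - G)^n, \qquad D_N \bar{u} = u + \mathcal{O}(\delta^{2N+2}) \ \text{for smooth } u, \]

    where $G$ denotes the filter with averaging radius $\delta$ (the stated accuracy holds for the differential filter and smooth $u$), $\bar{u} = G u$ is the filtered velocity, and $D_N \bar{u}$ provides the high-accuracy approximation of $u$ that enters the model's closure.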

  6. Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs

    SciTech Connect

    Howard, M.; Luttman, A.; Fowler, M.

    2014-11-01

    In imaging applications that focus on quantitative analysis--such as X-ray radiography in the security sciences--it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
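
    The following is a minimal 1D sketch of the sampling idea described above (Gaussian likelihood plus an edge-preserving, total-variation-like prior, explored with random-walk Metropolis). It is not the authors' code, and the noise level, prior weight, proposal step, burn-in, and thinning choices are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 64
    true = np.zeros(n)
    true[20:40] = 1.0                                        # toy piecewise-constant signal
    kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)
    kernel /= kernel.sum()

    def blur(f):                                             # stand-in for the system blur
        return np.convolve(f, kernel, mode="same")

    sigma = 0.02                                             # assumed noise level
    data = blur(true) + sigma * rng.normal(size=n)
    lam = 20.0                                               # assumed TV prior weight

    def log_post(f):
        misfit = np.sum((blur(f) - data) ** 2) / (2.0 * sigma ** 2)
        tv = lam * np.sum(np.abs(np.diff(f)))                # edge-preserving prior
        return -(misfit + tv)

    f = data.copy()                                          # start the chain from the data
    lp = log_post(f)
    step = 0.01
    samples = []
    for it in range(20000):
        prop = f + step * rng.normal(size=n)                 # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:             # Metropolis accept/reject
            f, lp = prop, lp_prop
        if it >= 5000 and it % 10 == 0:                      # discard burn-in, then thin
            samples.append(f.copy())

    samples = np.array(samples)
    post_mean = samples.mean(axis=0)                         # deconvolved estimate
    post_std = samples.std(axis=0)                           # pointwise uncertainty
    ```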

  7. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over having one integrated tool. This tool-suite, their characteristics, and their applicability will be described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  8. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, Daniel; Kueppers, Ulrich; Castro, Jon M.; Pacheco, Jose M. R.; Dingwell, Donald B.

    2010-05-01

    The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited as they were produced by the simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are difficult to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas have been cleaned and impregnated with a two-component glue (EPOTEK 301). For approx. 10 * 10 cm, a volume of mixed glue of 20 ml was required. Using a syringe, this low-viscosity, transparent glue could be easily applied on the target area. We found that the glue permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method renders it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D-plane. In a first step, the sample surface has been scanned and analysed by means of image analysis software (Image J). After that, selected areas were investigated through thin section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  9. Impregnating unconsolidated pyroclastic sequences: A tool for detailed facies analysis

    NASA Astrophysics Data System (ADS)

    Klapper, D.; Kueppers, U.; Castro, J. M.

    2009-12-01

    The interpretation of volcanic eruptions is usually derived from direct observation and the thorough analysis of the deposits. Processes in vent-proximal areas are usually not directly accessible or are likely to be obscured. Hence, our understanding of proximal deposits is often limited as they were produced by the simultaneous events stemming from primary eruptive, transportative, and meteorological conditions. Here we present a method that permits a direct and detailed quasi in-situ investigation of loose pyroclastic units that are usually analysed in the laboratory for their 1) grain-size distribution, 2) componentry, and 3) grain morphology. As the clast assembly is altered during sampling, the genesis of a stratigraphic unit and the relative importance of the above-mentioned deposit characteristics are difficult to reconstruct. In an attempt to overcome the possible loss of information during conventional sampling techniques, we impregnated the cleaned surfaces of proximal, unconsolidated units of the 1957-58 Capelinhos eruption on Faial, Azores. During this basaltic, emergent eruption, fluxes in magma rise rate led to a repeated build-up and collapse of tuff cones and consequently to a shift between phreatomagmatic and magmatic eruptive style. The deposits are a succession of generally parallel bedded, cm- to dm-thick layers with a predominantly ashy matrix. The lapilli content varies gradually; the content of bombs is enriched in discrete layers without clear bomb sags. The sample areas have been cleaned and impregnated with a two-component glue (EPOTEK 301). For approx. 10 * 10 cm, a volume of mixed glue of 20 ml was required. This low-viscosity, transparent glue allowed for an easy application on the target area by means of a syringe and permeated the deposit as deep as 5 mm. After > 24 h, the glue was sufficiently dry to enable the sample to be laid open. This impregnation method renders it possible to cut and polish the sample and investigate grain-size distribution, componentry, and grain morphology in situ in a 2D-plane. In a first step, the sample surface has been scanned and analysed by means of image analysis software (Image J). After that, selected areas were investigated through thin section analysis. We were able to define depositional units at the (sub-)mm scale and to show the varying relative importance of 1) eruptive style, 2) transportation mode, and 3) the influence of wind and (air) humidity. The presented method is an easy and efficient tool for a detailed stratigraphic investigation of unconsolidated pyroclastic units.

  10. Experimental analysis of change detection algorithms for multitooth machine tool fault detection

    NASA Astrophysics Data System (ADS)

    Reñones, Aníbal; de Miguel, Luis J.; Perán, José R.

    2009-10-01

    This paper describes an industrial application of a fault diagnosis method for a multitooth machine tool. Different statistical approaches have been used to detect and diagnose insert breakage in multitooth tools based on the analysis of the electrical power consumption of the tool drives. Great effort has been made to obtain a robust method, able to avoid any recalibration process after, for example, a maintenance operation. From the point of view of maintenance costs, these multitooth tools are the most critical part of the machine tools used for mass production in the car industry. These tools integrate different kinds of machining operations and cutting conditions.

  11. The impact of beam deconvolution on noise properties in CMB measurements: Application to Planck LFI

    E-print Network

    Keihänen, E.; Lindholm, V.; Reinecke, M.; Suur-Uski, A.-S.

    2015-01-01

    We present an analysis of the effects of beam deconvolution on noise properties in CMB measurements. The analysis is built around the artDeco beam deconvolver code. We derive a low-resolution noise covariance matrix that describes the residual noise in deconvolution products, both in harmonic and pixel space. The matrix models the residual correlated noise that remains in time-ordered data after destriping, and the effect of deconvolution on it. To validate the results, we generate noise simulations that mimic the data from the Planck LFI instrument. A $\\chi^2$ test for the full 70 GHz covariance in multipole range $\\ell=0-50$ yields a mean reduced $\\chi^2$ of 1.0037. We compare two destriping options, full and independent destriping, when deconvolving subsets of available data. Full destriping leaves substantially less residual noise, but leaves data sets intercorrelated. We derive also a white noise covariance matrix that provides an approximation of the full noise at high multipoles, and study the properti...

  12. Deconvolution of mixed magnetism in multilayer graphene

    SciTech Connect

    Swain, Akshaya Kumar; Bahadur, Dhirendra

    2014-06-16

    Magnetic properties of graphite modified at the edges by KCl and of exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using low-field and high-field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge-state spins and the interaction between them decide the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that the ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.

  13. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

  14. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft]

    NASA Technical Reports Server (NTRS)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  15. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    SciTech Connect

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

  16. Application of the Lucy–Richardson Deconvolution Procedure to High Resolution Photoemission Spectra

    SciTech Connect

    Rameau, J.; Yang, H.-B.; Johnson, P.D.

    2010-07-01

    Angle-resolved photoemission has developed into one of the leading probes of the electronic structure and associated dynamics of condensed matter systems. As with any experimental technique the ability to resolve features in the spectra is ultimately limited by the resolution of the instrumentation used in the measurement. Previously developed for sharpening astronomical images, the Lucy-Richardson deconvolution technique proves to be a useful tool for improving the photoemission spectra obtained in modern hemispherical electron spectrometers where the photoelectron spectrum is displayed as a 2D image in energy and momentum space.
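
    For readers unfamiliar with the algorithm, the core Richardson-Lucy update is compact. Below is a generic NumPy sketch, assuming a known, normalized, nonnegative point spread function (PSF); it is a textbook implementation, not the code used in the paper.

    ```python
    import numpy as np
    from scipy.signal import convolve2d

    def richardson_lucy(data, psf, iterations=50, eps=1e-12):
        """Generic Richardson-Lucy deconvolution of a 2D map with a known PSF."""
        estimate = np.full(data.shape, data.mean(), dtype=float)   # flat starting estimate
        psf_mirror = psf[::-1, ::-1]                               # correlation = convolution with flipped PSF
        for _ in range(iterations):
            reblurred = convolve2d(estimate, psf, mode="same")
            ratio = data / (reblurred + eps)                       # compare data with current model
            estimate *= convolve2d(ratio, psf_mirror, mode="same") # multiplicative RL update
        return estimate
    ```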

  17. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate change. Moreover, Climate Wizard is not a static product, but rather a data analysis framework designed to be used for climate change impact and adaption planning, which can be expanded to include other information, such as downscaled future projections of hydrology, soil moisture, wildfire, vegetation, marine conditions, disease, and agricultural productivity. PMID:20016827

  18. Computational Tools for Parsimony Phylogenetic Analysis of Omics Data.

    PubMed

    Salazar, Jose; Amri, Hakima; Noursi, David; Abu-Asab, Mones

    2015-08-01

    High-throughput assays from genomics, proteomics, metabolomics, and next generation sequencing produce massive omics datasets that are challenging to analyze in biological or clinical contexts. Thus far, there is no publicly available program for converting quantitative omics data into input formats to be used in off-the-shelf robust phylogenetic programs. To the best of our knowledge, this is the first report on creation of two Windows-based programs, OmicsTract and SynpExtractor, to address this gap. We note, as a way of introduction and development of these programs, that one particularly useful bioinformatics inferential model is the phylogenetic cladogram. Cladograms are multidimensional tools that show the relatedness between subgroups of healthy and diseased individuals and the latter's shared aberrations; they also reveal some characteristics of a disease that would not otherwise be apparent by other analytical methods. OmicsTract and SynpExtractor were written for the respective tasks of (1) accommodating advanced phylogenetic parsimony analysis (through standard programs of MIX [from PHYLIP] and TNT), and (2) extracting shared aberrations at the cladogram nodes. OmicsTract converts comma-delimited data tables by assigning each data point a binary value ("0" for normal states and "1" for abnormal states) and then outputs the converted data tables into the proper input file formats for MIX or with embedded commands for TNT. SynpExtractor uses outfiles from MIX and TNT to extract the shared aberrations of each node of the cladogram, matching them with identifying labels from the dataset and exporting them into a comma-delimited file. Labels may be gene identifiers in gene-expression datasets or m/z values in mass spectrometry datasets. By automating these steps, OmicsTract and SynpExtractor offer a veritable opportunity for rapid and standardized phylogenetic analyses of omics data; their model can also be extended to next generation sequencing (NGS) data. We make OmicsTract and SynpExtractor publicly and freely available for non-commercial use in order to strengthen and build capacity for the phylogenetic paradigm of omics analysis. PMID:26230532
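
    To make the binarization step concrete, the sketch below maps quantitative omics values to the binary states described above ("0" normal, "1" abnormal). The cutoff rule shown (values outside two standard deviations of a control group) is a hypothetical choice for illustration; OmicsTract's actual rule and file formats are documented with the software.

    ```python
    import pandas as pd

    def binarize(cases: pd.DataFrame, controls: pd.DataFrame) -> pd.DataFrame:
        """Map each value to 0 (normal) or 1 (abnormal) relative to the controls.

        Rows are samples, columns are features (genes, m/z values, ...).
        The +/- 2 standard-deviation cutoff is a hypothetical rule for illustration.
        """
        mu, sd = controls.mean(), controls.std()
        abnormal = (cases < mu - 2 * sd) | (cases > mu + 2 * sd)
        return abnormal.astype(int)

    # Example usage: binary = binarize(expression_cases, expression_controls)
    ```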

  19. Application of the ORIGEN Fallout Analysis Tool and the DELFIC Fallout Planning Tool to National Technical Nuclear Forensics

    SciTech Connect

    Jodoin, Vincent J; Lee, Ronald W; Peplow, Douglas E.; Lefebvre, Jordan P

    2011-01-01

    The objective of this project was to provide a robust fallout analysis and planning tool for the National Technical Nuclear Forensics interagency ground sample collection team. Their application called for a fast-running, portable mission-planning tool for use in response to emerging improvised nuclear device (IND) post-detonation situations. The project met those goals by research and development of models to predict the physical, chemical, and radiological properties of fallout debris. ORNL has developed new graphical user interfaces for two existing codes, the Oak Ridge Isotope Generation (ORIGEN) code and the Defense Land Fallout Interpretive Code (DELFIC). ORIGEN is a validated, radionuclide production and decay code that has been implemented into the Fallout Analysis Tool to predict the fallout source term nuclide inventory after the detonation of an IND. DELFIC is a validated, physics-based, research reference fallout prediction software package. It has been implemented into the Fallout Planning Tool and is used to predict the fractionated isotope concentrations in fallout, particle sizes, fractionation ratios, dose rate, and integrated dose over the planned collection routes - information vital to ensure quality samples for nuclear forensic analysis while predicting dose to the sample collectors. DELFIC contains a particle activity module, which models the radiochemical fractionation of the elements in a cooling fireball as they condense into and onto particles to predict the fractionated activity size distribution for a given scenario. This provides the most detailed physics-based characterization of the fallout source term phenomenology available in an operational fallout model.

  20. Blind deconvolution for ultrasound sequences using a noninverse greedy algorithm.

    PubMed

    Chira, Liviu-Teodor; Rusu, Corneliu; Tauber, Clovis; Girault, Jean-Marc

    2013-01-01

    The blind deconvolution of ultrasound sequences in medical ultrasound imaging is still a major problem despite the efforts made. This paper presents a blind noninverse deconvolution algorithm to eliminate the blurring effect, using the envelope of the acquired radio-frequency sequences and an a priori Laplacian distribution for the deconvolved signal. The algorithm is executed in two steps. Firstly, the point spread function is automatically estimated from the measured data. Secondly, the data are reconstructed in a nonblind way using the proposed algorithm. The algorithm is a nonlinear blind deconvolution that operates as a greedy algorithm. The results on simulated signals and real images are compared with those of different state-of-the-art deconvolution methods. Our method shows good results for scatterer detection, speckle noise suppression, and execution time. PMID:24489533

  1. Comparative study of some methods in blind deconvolution 

    E-print Network

    Mbarek, Kais

    1995-01-01

    This study presents some techniques used in blind deconvolution, with emphasis on applications to digital communications. The literature contains many algorithms developed and tested in different situations, but very limited research was conducted...

  2. The discrete Kalman filtering approach for seismic signals deconvolution

    SciTech Connect

    Kurniadi, Rizal; Nurhandoko, Bagus Endar B.

    2012-06-20

    Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is the deconvolution process; deconvolution is an inverse filtering operation based on Wiener filter theory. This theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is then used to generate an estimate of the reflectivity function. The main advantages of Kalman filtering are its capability to handle continually time-varying models and its high resolution. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering process works on the reflectivity function; hence the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with the energy waveform, which is referred to as the seismic wavelet. A higher-frequency wavelet gives a smaller wavelength; graphs of these results are presented.
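
    As a concrete illustration of the "primitive deconvolution using the inverse of the wavelet" step, the sketch below performs stabilized spectral division of a trace by an assumed-known wavelet. The water-level constant is an illustrative assumption, and the Kalman filtering stage of the paper is not reproduced here.

    ```python
    import numpy as np

    def primitive_deconvolution(trace, wavelet, water_level=1e-2):
        """Stabilized spectral division of a seismic trace by a known wavelet."""
        n = len(trace)
        T = np.fft.rfft(trace, n)
        W = np.fft.rfft(wavelet, n)                       # wavelet zero-padded to the trace length
        floor = water_level * np.max(np.abs(W)) ** 2      # water level to avoid division by ~0
        R = T * np.conj(W) / np.maximum(np.abs(W) ** 2, floor)
        return np.fft.irfft(R, n)                         # estimated reflectivity series

    # Example usage: reflectivity_est = primitive_deconvolution(seismic_trace, source_wavelet)
    ```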

  3. Online Analysis of Wind and Solar Part I: Ramping Tool

    SciTech Connect

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  4. Online Analysis of Wind and Solar Part II: Transmission Tool

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  5. MIRAGE: A Management Tool for the Analysis and Deployment of Network Security Policies

    E-print Network

    Garcia-Alfaro, Joaquin

    MIRAGE: A Management Tool for the Analysis and Deployment of Network Security Policies. We present the core functionality of MIRAGE, a management tool for the analysis and deployment of network security policies into network security component configurations. In both cases, MIRAGE provides intra...

  6. Generalized WebBased Data Analysis Tool for Policy Agendas Data

    E-print Network

    Wolfgang, Paul

    The Policy Agendas web site includes a data analysis tool that permits selection of the data from... This paper describes the design of a web site that can be used to display and analyze the data collected...

  7. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    NASA Astrophysics Data System (ADS)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2014-12-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
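
    For reference, the non-uniform rational B-spline (NURBS) basis underlying such an isogeometric discretization is the standard one (generic notation, not taken from the paper):

    \[ R_{i,p}(\xi) = \frac{N_{i,p}(\xi)\, w_i}{\sum_{j=1}^{n} N_{j,p}(\xi)\, w_j}, \qquad \mathbf{C}(\xi) = \sum_{i=1}^{n} R_{i,p}(\xi)\, \mathbf{P}_i, \]

    where the $N_{i,p}$ are B-spline basis functions of degree $p$, the $w_i$ are weights, and the $\mathbf{P}_i$ are control points; in the isoparametric spirit of isogeometric analysis, the same rational basis is used to represent both the arch geometry and the displacement field.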

  8. Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)

    1999-01-01

    A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors, as embodied in one preferred embodiment in a global positioning system receiver. The method and receiver of the present invention estimate and then compensate for multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

  9. FAILURE PREDICTION AND STRESS ANALYSIS OF MICROCUTTING TOOLS 

    E-print Network

    Chittipolu, Sujeev

    2010-07-14

    Silicon-based products are limited because silicon is brittle. Products can be made from other engineering materials and need to be machined at the microscale. This research deals with predicting microtool failure by studying spindle runout and tool deflection effects...

  10. GENOME RESOURCES AND COMPARATIVE ANALYSIS TOOLS FOR CARDIOVASCULAR RESEARCH

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Disorders of the cardiovascular (CV) system are often caused by the interaction of genetic and environmental factors that jointly contribute to individual susceptibility. Genomic data and bioinformatics tools generated from genome projects, coupled with functional verification, offer novel approache...

  11. Evaluation of a Surface Exploration Traverse Analysis and Navigation Tool

    E-print Network

    Gilkey, Andrea L.

    SEXTANT is an extravehicular activity (EVA) mission planner tool developed in MATLAB, which computes the most efficient path between waypoints across a planetary surface. The traverse efficiency can be optimized around ...

  12. Novel tools for sequence and epitope analysis of glycosaminoglycans

    E-print Network

    Behr, Jonathan Robert

    2007-01-01

    Our understanding of glycosaminoglycan (GAG) biology has been limited by a lack of sensitive and efficient analytical tools designed to deal with these complex molecules. GAGs are heterogeneous and often sulfated linear ...

  13. An Integrated Traverse Planner and Analysis Tool for Planetary Exploration

    E-print Network

    Johnson, Aaron William

    Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. ...

  14. Deconvolution for three-dimensional acoustic source identification based on spherical harmonics beamforming

    NASA Astrophysics Data System (ADS)

    Chu, Zhigang; Yang, Yang; He, Yansong

    2015-05-01

    Spherical Harmonics Beamforming (SHB) with solid spherical arrays has become a particularly attractive tool for doing acoustic sources identification in cabin environments. However, it presents some intrinsic limitations, specifically poor spatial resolution and severe sidelobe contaminations. This paper focuses on overcoming these limitations effectively by deconvolution. First and foremost, a new formulation is proposed, which expresses SHB's output as a convolution of the true source strength distribution and the point spread function (PSF) defined as SHB's response to a unit-strength point source. Additionally, the typical deconvolution methods initially suggested for planar arrays, deconvolution approach for the mapping of acoustic sources (DAMAS), nonnegative least-squares (NNLS), Richardson-Lucy (RL) and CLEAN, are adapted to SHB successfully, which are capable of giving rise to highly resolved and deblurred maps. Finally, the merits of the deconvolution methods are validated and the relationships of source strength and pressure contribution reconstructed by the deconvolution methods vs. focus distance are explored both with computer simulations and experimentally. Several interesting results have emerged from this study: (1) compared with SHB, DAMAS, NNLS, RL and CLEAN all can not only improve the spatial resolution dramatically but also reduce or even eliminate the sidelobes effectively, allowing clear and unambiguous identification of single source or incoherent sources. (2) The availability of RL for coherent sources is highest, then DAMAS and NNLS, and that of CLEAN is lowest due to its failure in suppressing sidelobes. (3) Whether or not the real distance from the source to the array center equals the assumed one that is referred to as focus distance, the previous two results hold. (4) The true source strength can be recovered by dividing the reconstructed one by a coefficient that is the square of the focus distance divided by the real distance from the source to the array center. (5) The reconstructed pressure contribution is almost not affected by the focus distance, always approximating to the true one. This study will be of great significance to the accurate localization and quantification of acoustic sources in cabin environments.
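
    As one concrete example of the deconvolution step, the sketch below solves the nonnegative least-squares (NNLS) formulation mentioned above, in which the beamforming map is modeled as the PSF matrix times a nonnegative source-strength vector. Building the spherical-harmonics PSF matrix itself is outside the scope of this sketch, so `psf_matrix` and `beamform_map` are placeholders.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def nnls_deconvolution(psf_matrix: np.ndarray, beamform_map: np.ndarray) -> np.ndarray:
        """Solve min ||A q - b||_2 subject to q >= 0 for the source-strength vector q."""
        q, _residual_norm = nnls(psf_matrix, beamform_map.ravel())
        return q

    # Shapes only: psf_matrix has one row per map grid point and one column per
    # candidate source position; beamform_map is the SHB output flattened to a vector.
    ```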

  15. Joint deconvolution and classification with applications to passive acoustic underwater multipath.

    PubMed

    Anderson, Hyrum S; Gupta, Maya R

    2008-11-01

    This paper addresses the problem of classifying signals that have been corrupted by noise and unknown linear time-invariant (LTI) filtering such as multipath, given labeled uncorrupted training signals. A maximum a posteriori approach to the deconvolution and classification is considered, which produces estimates of the desired signal, the unknown channel, and the class label. For cases in which only a class label is needed, the classification accuracy can be improved by not committing to an estimate of the channel or signal. A variant of the quadratic discriminant analysis (QDA) classifier is proposed that probabilistically accounts for the unknown LTI filtering, and which avoids deconvolution. The proposed QDA classifier can work either directly on the signal or on features whose transformation by LTI filtering can be analyzed; as an example a classifier for subband-power features is derived. Results on simulated data and real Bowhead whale vocalizations show that jointly considering deconvolution with classification can dramatically improve classification performance over traditional methods over a range of signal-to-noise ratios. PMID:19045785

  16. Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.

    PubMed

    Villeneuve, Emma; Carfantan, Hervé

    2014-10-01

    Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution, due to the atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line-of-sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimate the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Monte Carlo Markov chain algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) or not (estimation). The performances of the methods are compared with classical ones, on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172
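
    To make the parametric model concrete, the sketch below fits a single Gaussian emission line in one spatial pixel and converts the fitted parameters to flux, line-of-sight velocity, and velocity dispersion via the Doppler relation. It is a simple least-squares illustration, not the MCMC posterior-mean estimators of the paper; lambda0 (the line's rest wavelength) and the initial guesses are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    C_KMS = 299792.458                                     # speed of light in km/s

    def gaussian(lam, amp, center, sigma):
        return amp * np.exp(-0.5 * ((lam - center) / sigma) ** 2)

    def fit_line(wavelength, spectrum, lambda0):
        """Fit one emission line in a single spaxel; lambda0 is the rest wavelength."""
        p0 = [spectrum.max(), wavelength[np.argmax(spectrum)], 2.0]   # crude initial guess
        (amp, center, sigma), _ = curve_fit(gaussian, wavelength, spectrum, p0=p0)
        flux = amp * abs(sigma) * np.sqrt(2.0 * np.pi)     # integrated line flux
        velocity = C_KMS * (center - lambda0) / lambda0    # line-of-sight velocity (km/s)
        dispersion = C_KMS * abs(sigma) / lambda0          # velocity dispersion (km/s)
        return flux, velocity, dispersion
    ```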

  17. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data, ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed instruction tailored to their experience level along with proper support and mentoring. This project was funded by a grant from the National Science Foundation, Grant # PHY1157078.

  18. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    SciTech Connect

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of ''what-if'' options in building design--a limited search for an optimal solution, or ''optimization''. Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  19. Translational meta-analysis tool for temporal gene expression profiles.

    PubMed

    Tusch, Guenter; Tole, Olvi

    2012-01-01

    The widespread use of microarray technology has led to highly complex datasets that often address similar or related biological questions. In translational medicine, research is often based on measurements obtained at different points in time, which the researcher interprets as a progression over time. If a biological stimulus has an effect on a particular gene that is reversed over time, this shows up, for instance, as a peak in the gene's temporal expression profile. Our program SPOT helps researchers find these patterns in large sets of microarray data. We created the software tool using open-source platforms and the Semantic Web tool Protégé-OWL. PMID:22874385
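
    SPOT's internals are not given in the record, so the following is only a minimal sketch of the pattern it describes: flagging genes whose temporal expression profile rises to an interior maximum and then reverses. The fold-change threshold, gene names, and expression values are hypothetical.

```python
import numpy as np

def has_temporal_peak(profile, min_fold=2.0):
    """Return True if the profile rises to an interior maximum that is at least
    `min_fold` above both the first and last time points (a reversal over time)."""
    profile = np.asarray(profile, dtype=float)
    k = int(np.argmax(profile))
    if k == 0 or k == len(profile) - 1:
        return False                       # monotone profiles have no interior peak
    peak = profile[k]
    return peak >= min_fold * profile[0] and peak >= min_fold * profile[-1]

# Hypothetical expression matrix: rows = genes, columns = ordered time points.
genes = {
    "geneA": [1.0, 1.2, 5.1, 4.8, 1.1],    # transient induction, reversed over time
    "geneB": [1.0, 1.5, 2.0, 2.6, 3.1],    # monotone increase
    "geneC": [3.0, 2.5, 1.0, 2.4, 3.2],    # dip rather than peak
}
peaked = [g for g, prof in genes.items() if has_temporal_peak(prof)]
print(peaked)   # -> ['geneA']
```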

  20. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  1. Tools for Education Policy Analysis [with CD-ROM].

    ERIC Educational Resources Information Center

    Mingat, Alain; Tan, Jee-Peng

    This manual contains a set of tools to assist policymakers in analyzing and revamping educational policy. Its main focus is on some economic and financial aspects of education and selected features in the arrangements for service delivery. Originally offered as a series of training workshops for World Bank staff to work with clients in the…

  2. PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis

    ERIC Educational Resources Information Center

    Waycott, Jenny; Jones, Ann; Scanlon, Eileen

    2005-01-01

    This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key…

  3. Mapping and spatiotemporal analysis tool for hydrological data: Spellmap

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Lack of data management and analyses tools is one of the major limitations to effectively evaluate and use large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...

  4. Improved Cell Typing by Charge-State Deconvolution of matrix-assisted laser desorption/ionization Mass Spectra

    SciTech Connect

    Wilkes, Jon G.; Buzantu, Dan A.; Dare, Diane J.; Dragan, Yvonne P.; Chiarelli, M. Paul; Holland, Ricky D.; Beaudoin, Michael; Heinze, Thomas M.; Nayak, Rajesh; Shvartsburg, Alexandre A.

    2006-05-30

    Robust, specific, and rapid identification of toxic strains of bacteria and viruses, to guide the mitigation of their adverse health effects and optimum implementation of other response actions, remains a major analytical challenge. This need has driven the development of methods for classification of microorganisms using mass spectrometry, particularly matrix-assisted laser desorption ionization MS (MALDI) that allows high throughput analyses with minimum sample preparation. We describe a novel approach to cell typing based on pattern recognition of MALDI spectra, which involves charge-state deconvolution in conjunction with a new correlation analysis procedure. The method is applicable to both prokaryotic and eukaryotic cells. Charge-state deconvolution improves the quantitative reproducibility of spectra because multiply-charged ions resulting from the same biomarker attaching a different number of protons are recognized and their abundances are combined. This allows a clearer distinction of bacterial strains or of cancerous and normal liver cells. Improved class distinction provided by charge-state deconvolution was demonstrated by cluster spacing on canonical variate score charts and by correlation analyses. Deconvolution may enhance detection of early disease state or therapy progress markers in various tissues analyzed by MALDI.
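
    The record does not include the authors' algorithm, but the core idea of charge-state deconvolution, mapping each (m/z, charge) peak to a neutral mass and combining the abundances of ions that originate from the same biomarker, can be sketched as follows; the tolerance and peak list are hypothetical.

```python
import numpy as np

PROTON = 1.007276  # Da

def charge_state_deconvolve(peaks, tol=0.5):
    """Combine abundances of peaks that map to the same neutral mass.

    `peaks` is a list of (mz, charge, intensity) tuples; peaks whose neutral
    masses agree within `tol` Da are merged into one entry."""
    # neutral mass of an [M + zH]^z+ ion
    neutral = [(mz * z - z * PROTON, inten) for mz, z, inten in peaks]
    neutral.sort()
    merged = []
    for mass, inten in neutral:
        if merged and abs(mass - merged[-1][0]) <= tol:
            prev_mass, prev_inten = merged[-1]
            total = prev_inten + inten
            # intensity-weighted mass of the merged cluster
            merged[-1] = ((prev_mass * prev_inten + mass * inten) / total, total)
        else:
            merged.append((mass, inten))
    return merged

# Hypothetical biomarker observed as singly and doubly charged ions.
peaks = [(5001.0, 1, 800.0), (2501.0, 2, 450.0), (7240.3, 1, 300.0)]
for mass, inten in charge_state_deconvolve(peaks):
    print(f"{mass:9.2f} Da  combined intensity {inten:7.1f}")
```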

  5. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  6. On the Use of Factor Analysis as a Research Tool.

    ERIC Educational Resources Information Center

    Benson, Jeri; Nasser, Fadia

    1998-01-01

    Discusses the conceptual/theoretical design, statistical, and reporting issues in choosing factor analysis for research. Provides questions to consider when planning, analyzing, or reporting an exploratory factor analysis study. (SK)

  7. MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis

    SciTech Connect

    Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

    2013-02-12

    MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.
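
    MultiAlign's actual clustering implementation is not shown here; the sketch below only illustrates the general idea of grouping LC-MS features across datasets by monoisotopic mass (ppm tolerance) and normalized elution time, with hypothetical tolerances and feature values.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    dataset: str
    mass: float        # monoisotopic mass [Da]
    net: float         # normalized elution time in [0, 1]
    abundance: float

def cluster_features(features, ppm_tol=10.0, net_tol=0.02):
    """Greedy single-linkage clustering of LC-MS features across datasets
    by mass (ppm) and normalized elution time."""
    clusters = []
    for f in sorted(features, key=lambda f: f.mass):
        for c in clusters:
            ref = c[0]
            if (abs(f.mass - ref.mass) / ref.mass * 1e6 <= ppm_tol
                    and abs(f.net - ref.net) <= net_tol):
                c.append(f)
                break
        else:
            clusters.append([f])
    return clusters

feats = [
    Feature("run1", 1023.5012, 0.41, 2.1e6),
    Feature("run2", 1023.5040, 0.42, 1.8e6),   # same analyte, second dataset
    Feature("run2", 1530.7721, 0.63, 9.0e5),
]
for c in cluster_features(feats):
    print([f.dataset for f in c], f"{c[0].mass:.4f}")
```

    A clustered group that appears in most datasets can then be matched against a reference database or used to build an abundance profile, as the record describes.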

  8. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    SciTech Connect

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  9. Design tools for daylighting illumination and energy analysis

    SciTech Connect

    Selkowitz, S.

    1982-07-01

    The problems and potential of using daylighting to provide illumination in building interiors are reviewed. The report describes some of the design tools now or soon to be available for incorporating daylighting into the building design process. It also describes state-of-the-art methods for analyzing the impacts daylighting can have on selection of lighting controls, lighting energy consumption, heating and cooling loads, and peak power demand.

  10. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

  11. DSD-Crasher: A Hybrid Analysis Tool for Bug Finding

    E-print Network

    Xie, Tao

    engineering community [Young 2003; Ernst 2003; Xie and Notkin 2003; Beyer et al. 2004; Csallner annotations [Leavens et al. 1998] that are used to guide our Check 'n' Crash tool [Csallner and Smaragdakis). Check 'n' Crash employs the ESC/Java static analysis tool by Flanagan et al. [2002], applies constraint

  12. Stellar Evolution and Seismic Tools for Asteroseismology Diffusive Processes in Stars and Seismic Analysis

    E-print Network

    Monteiro, Mário João

    Stellar Evolution and Seismic Tools for Asteroseismology Diffusive Processes in Stars and Seismic Analysis C.W. Straka, Y. Lebreton and M.J.P.F.G. Monteiro (eds) EAS Publications Series, 26 (2007) 155 undertaken by the Evolution and Seismic Tools Activity (ESTA) team of the CoRoT Seismology Working Group

  13. Analysis of Mutation Testing Tools Johnathan Snyder, Department of Computer Science, University of Alabama

    E-print Network

    Gray, Jeffrey G.

    Analysis of Mutation Testing Tools Johnathan Snyder, Department of Computer Science, University@crimson.ua.edu Conclusion In my research, I studied three mutation testing tools for Java: MuJava, Jumble, and PIT. All of them use byte level mutation which speeds up the time it takes to generate the mutants and run

  14. A User-Friendly Self-Similarity Analysis Tool Thomas Karagiannis, Michalis Faloutsos, Mart Molle

    E-print Network

    Molle, Mart

    and the absence of publicly available software. In this paper, we introduce SELFIS, a comprehensive tool for an in-depth LRD analysis, including several LRD estimators. In addition, SELFIS includes a powerful ap acquired SELFIS within a month of its release, which clearly demonstrates the need for such a tool

  15. HPLC - A valuable tool for analysis of lipids from conventional and unconventional sources

    Technology Transfer Automated Retrieval System (TEKTRAN)

    During the last 20 years, high performance liquid chromatography (HPLC) has been transformed from being a tool for the qualitative separation of lipids to becoming an indispensable tool for the quantitative analysis of lipids. Some of these HPLC methods have also provided valuable insights into rese...

  16. Improving space debris detection in GEO ring using image deconvolution

    NASA Astrophysics Data System (ADS)

    Núñez, Jorge; Núñez, Anna; Montojo, Francisco Javier; Condominas, Marta

    2015-07-01

    In this paper we present a method based on image deconvolution to improve the detection of space debris, mainly in the geostationary ring. Among the deconvolution methods we chose the iterative Richardson-Lucy (R-L), as the method that achieves better goals with a reasonable amount of computation. For this work, we used two sets of real 4096 × 4096 pixel test images obtained with the Telescope Fabra-ROA at Montsec (TFRM). Using the first set of data, we establish the optimal number of iterations at 7, and applying the R-L method with 7 iterations to the images, we show that the astrometric accuracy does not vary significantly while the limiting magnitude of the deconvolved images increases significantly compared to the original ones. The increase is on average about 1.0 magnitude, which means that objects up to 2.5 times fainter can be detected after deconvolution. The application of the method to the second set of test images, which includes several faint objects, shows that, after deconvolution, up to four previously undetected faint objects are detected in a single frame. Finally, we carried out a study of some economic aspects of applying the deconvolution method, showing that an important economic impact can be envisaged.
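
    A minimal sketch of the Richardson-Lucy iteration with a fixed, small number of iterations (seven, as chosen above) on a synthetic star field; the PSF width, image size, and noise level are hypothetical and the code is not the TFRM pipeline.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=7):
    """Classic Richardson-Lucy update: estimate <- estimate * ((image / (estimate*psf)) * psf_flipped)."""
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Synthetic star field blurred by a Gaussian PSF (hypothetical sizes).
rng = np.random.default_rng(1)
truth = np.zeros((64, 64))
truth[20, 30] = 500.0
truth[40, 41] = 120.0
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()
observed = np.clip(fftconvolve(truth, psf, mode="same") + rng.normal(0, 0.5, truth.shape), 0, None)

restored = richardson_lucy(observed, psf, iterations=7)
print("peak before/after:", observed.max().round(1), restored.max().round(1))
```

    Stopping after a small, fixed number of iterations is the usual practical compromise: further iterations sharpen point sources but amplify noise, which is consistent with the record's choice of seven iterations.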

  17. Spectral analysis of seismic noise induced by rivers: A new tool to monitor spatiotemporal changes in

    E-print Network

    Cattin, Rodolphe

    Spectral analysis of seismic noise induced by rivers: A new tool to monitor spatiotemporal changes. Bollinger, J. Vergne, R. Cattin, and J. L. Nábělek (2008), Spectral analysis of seismic noise induced March 2007; revised 19 September 2007; accepted 28 January 2008; published 2 May 2008. [1] Analysis

  18. A user-centered approach for designing and developing spatiotemporal crime analysis tools

    E-print Network

    Klippel, Alexander

    A user-centered approach for designing and developing spatiotemporal crime analysis tools R. E,kevin.ross,bgf111,wul132,maceachren}@psu.edu 1. Introduction to Crime Analysis and GeoVISTA CrimeViz Crime analysis of crime emphasize the importance of geography (Shaw and McKay 1942; Cohen and Felson 1979; Sampson

  19. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  20. Iterative Deconvolution of Regional Waveforms and a Double-Event Interpretation of the

    E-print Network

    Cerveny, Vlastislav

    1 Iterative Deconvolution of Regional Waveforms and a Double-Event Interpretation of the 2003-called patch inversion (Valee and Bouchon, 2004), or the iterative deconvolution of multiple point sources

  1. Fatty acid analysis tool (FAAT): An FT-ICR MS lipid analysis algorithm.

    PubMed

    Leavell, Michael D; Leary, Julie A

    2006-08-01

    Electrospray ionization mass spectrometry is becoming an established tool for the investigation of lipids. As the methods for lipid analysis become more mature and their throughput increases, computer algorithms for the interpretation of such data will become a necessity. Toward this end, an algorithm dedicated to the analysis of Fourier transform mass spectral data from lipid extracts has been developed. The algorithm, Fatty Acid Analysis Tool, termed FAAT, has been successfully used to investigate complex lipid extracts containing thousands of components, from various species of mycobacteria including M. tuberculosis and M. abscessus. FAAT is rapid, generally taking tens of seconds to interpret multiple spectra, and accessible to most users as it is implemented in Microsoft Excel Visual Basic Software. In the reduction of data, FAAT begins by scaling spectra (i.e., to account for dilution factors), identifying monoisotopic ions, and assigning isotope packets. Unique features of FAAT include the following: (1) overlapping saturated and unsaturated lipid species can be distinguished, (2) known ions are assigned from a user-defined library including species that possess methylene heterogeneity, and (3) isotopic shifts from stable isotope labeling experiments are identified and assigned (up to a user-defined maximum). In addition, abundance differences between samples grown under normal and stressed conditions can be determined. In the analysis of mycobacterial lipid extracts, FAAT has successfully identified isotopic shifts from incorporation of 15N in M. abscessus. Additionally, FAAT has been used to successfully determine differences in lipid abundances between M. tuberculosis wild-type and mutant strains. PMID:16878888
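
    FAAT itself is implemented in Microsoft Excel Visual Basic; the sketch below only illustrates one of the steps described, grouping peaks into isotope packets spaced by roughly one 13C-12C mass difference and taking the lowest-mass member as the monoisotopic ion. The tolerance and peak list are hypothetical.

```python
ISOTOPE_SPACING = 1.00336  # average 13C-12C mass difference [Da]

def isotope_packets(mz_values, charge=1, tol=0.01):
    """Group peaks into isotope packets whose members are spaced by
    ISOTOPE_SPACING / charge, and return the monoisotopic (lowest-m/z)
    member of each packet."""
    spacing = ISOTOPE_SPACING / charge
    mz_sorted = sorted(mz_values)
    packets, current = [], [mz_sorted[0]]
    for mz in mz_sorted[1:]:
        if abs(mz - current[-1] - spacing) <= tol:
            current.append(mz)
        else:
            packets.append(current)
            current = [mz]
    packets.append(current)
    return [(p[0], p) for p in packets]   # (monoisotopic m/z, full packet)

peaks = [855.534, 856.537, 857.541, 881.551, 882.554]   # hypothetical lipid peaks
for mono, packet in isotope_packets(peaks):
    print(f"monoisotopic {mono:.3f}, packet of {len(packet)} peaks")
```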

  2. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    NASA Technical Reports Server (NTRS)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  3. An integrated traverse planner and analysis tool for future lunar surface exploration

    E-print Network

    Johnson, Aaron William

    2010-01-01

    This thesis discusses the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT), a system designed to help maximize productivity, scientific return, and safety on future lunar and planetary explorations. The ...

  4. Research Article A practical map-analysis tool for detecting potential dispersal corridors*

    E-print Network

    Hoffman, Forrest M.

    -1 Research Article A practical map-analysis tool for detecting potential dispersal corridors* William W. Hargrove1,* , Forrest M. Hoffman1,2 and Rebecca A. Efroymson1 1 Environmental Sciences Division

  5. VAST: A Human-Centered, Domain-Independent Video Analysis Support Tool 

    E-print Network

    Nordt, Marlo Faye

    2011-08-08

    VAST: A Human-Centered, Domain-Independent Video Analysis Support Tool. A dissertation by Marlo Faye Nordt, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of Doctor of Philosophy, December 2008. Major subject: Computer Science.

  6. Development of a task analysis tool to facilitate user interface design

    NASA Technical Reports Server (NTRS)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  7. An agent-based tool for infrastructure interdependency policy analysis.

    SciTech Connect

    North, M. J.

    2000-12-14

    Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructure interdependencies such as those between the electric power and natural gas markets. These markets are undergoing fundamental transformations including major changes in electric generator fuel sources. Electric generators that use natural gas as a fuel source are rapidly gaining market share. These generators introduce direct interdependency between the electric power and natural gas markets. These interdependencies have been investigated using the emergent behavior of CAS model agents within the Spot Market Agent Research Tool Version 2.0 Plus Natural Gas (SMART II+).

  8. Powerful tool for design analysis of linear control systems

    SciTech Connect

    Maddux, Jr, A S

    1982-05-10

    The methods for designing linear controls for electronic or mechanical systems are well understood and have been put into practice. What has not been readily available to engineers, however, is a practical, quick and inexpensive method for analyzing these linear control (feedback) systems once they have been designed into the electronic or mechanical hardware. Now, the PET, manufactured by Commodore Business Machines (CBM), operating with several peripherals via the IEEE 488 Bus, brings to the engineer for about $4000 a complete set of office tools for analyzing these system designs.

  9. Informed constrained spherical deconvolution (iCSD).

    PubMed

    Roine, Timo; Jeurissen, Ben; Perrone, Daniele; Aelterman, Jan; Philips, Wilfried; Leemans, Alexander; Sijbers, Jan

    2015-08-01

    Diffusion-weighted (DW) magnetic resonance imaging (MRI) is a noninvasive imaging method, which can be used to investigate neural tracts in the white matter (WM) of the brain. However, the voxel sizes used in DW-MRI are relatively large, making DW-MRI prone to significant partial volume effects (PVE). These PVEs can be caused both by complex (e.g. crossing) WM fiber configurations and non-WM tissue, such as gray matter (GM) and cerebrospinal fluid. High angular resolution diffusion imaging methods have been developed to correctly characterize complex WM fiber configurations, but significant non-WM PVEs are also present in a large proportion of WM voxels. In constrained spherical deconvolution (CSD), the full fiber orientation distribution function (fODF) is deconvolved from clinically feasible DW data using a response function (RF) representing the signal of a single coherently oriented population of fibers. Non-WM PVEs cause a loss of precision in the detected fiber orientations and an emergence of false peaks in CSD, more prominently in voxels with GM PVEs. We propose a method, informed CSD (iCSD), to improve the estimation of fODFs under non-WM PVEs by modifying the RF to account for non-WM PVEs locally. In practice, the RF is modified based on tissue fractions estimated from high-resolution anatomical data. Results from simulation and in-vivo bootstrapping experiments demonstrate a significant improvement in the precision of the identified fiber orientations and in the number of false peaks detected under GM PVEs. Probabilistic whole brain tractography shows fiber density is increased in the major WM tracts and decreased in subcortical GM regions. The iCSD method significantly improves the fiber orientation estimation at the WM-GM interface, which is especially important in connectomics, where the connectivity between GM regions is analyzed. PMID:25660002

  10. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  11. Accuracy of peak deconvolution algorithms within chromatographic integrators.

    PubMed

    Papas, A N; Tougas, T P

    1990-02-01

    The soundness of present-day algorithms to deconvolve overlapping skewed peaks was investigated. From simulated studies based on the exponentially modified Gaussian model (EMG), chromatographic peak area inaccuracies for unresolved peaks are presented for the two deconvolution methods, the tangent skim and the perpendicular drop method. These inherent inaccuracies, in many cases exceeding 50%, are much greater than those calculated from ideal Gaussian profiles. Multiple linear regression (MLR) was used to build models that predict the relative error for either peak deconvolution method. MLR also provided a means for determining influential independent variables, defining the required chromatographic relationships needed for prediction. Once forecasted errors for both methods are calculated, selection of either peak deconvolution method can be made by minimum errors. These selection boundaries are contrasted to method selection criteria of present data systems' algorithms. PMID:2305954
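
    The sketch below reproduces the flavor of such a simulation: two overlapping exponentially modified Gaussian (EMG) peaks are integrated with the perpendicular-drop rule and the resulting areas are compared with the true areas. The peak parameters are hypothetical and the code is not the authors' study.

```python
import numpy as np
from scipy.stats import exponnorm

# Two overlapping exponentially modified Gaussian (EMG) peaks; all values hypothetical.
t = np.linspace(0.0, 20.0, 4000)
dt = t[1] - t[0]
peaks = [(100.0, 8.0, 0.3, 0.5),    # (true area, retention time, sigma, tau)
         (25.0, 10.0, 0.3, 0.5)]

signals = [area * exponnorm.pdf(t, K=tau / sig, loc=rt, scale=sig)
           for area, rt, sig, tau in peaks]
chromatogram = signals[0] + signals[1]

# Perpendicular-drop integration: split the fused signal at the valley minimum.
i1, i2 = np.argmax(signals[0]), np.argmax(signals[1])     # apex positions
valley = i1 + int(np.argmin(chromatogram[i1:i2]))
areas_pd = [chromatogram[:valley].sum() * dt, chromatogram[valley:].sum() * dt]

for (true_area, *_), est in zip(peaks, areas_pd):
    err = 100.0 * (est - true_area) / true_area
    print(f"true area {true_area:6.1f}   perpendicular drop {est:6.1f}   error {err:+5.1f}%")
```

    Because the perpendicular drop assigns the tail of the large peak to the small one, the small peak's area is inflated, which is the kind of inherent inaccuracy the record quantifies.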

  12. Extracting ocean surface information from altimeter returns - The deconvolution method

    NASA Technical Reports Server (NTRS)

    Rodriguez, Ernesto; Chapman, Bruce

    1989-01-01

    An evaluation of the deconvolution method for estimating ocean surface parameters from ocean altimeter waveforms is presented. It is shown that this method presents a fast, accurate way of determining the ocean surface parameters from noisy altimeter data. Three parameters may be estimated by using this method, including the altimeter-height error, the ocean-surface standard deviation, and the ocean-surface skewness. By means of a Monte Carlo experiment, an 'optimum' deconvolution algorithm and the accuracies with which the above parameters may be estimated using this algorithm are determined. Then the influence of instrument effects, such as errors in calibration and pointing-angle estimation, on the estimated parameters is examined. Finally, the deconvolution algorithm is used to estimate height and ocean-surface parameters from Seasat data.
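
    As a toy illustration of the convolutional idea behind the method (not the Seasat processing itself), the sketch below models the mean return as a known response convolved with the sea-surface height PDF, recovers the PDF with a regularized spectral division, and reads the height offset, standard deviation, and skewness from its moments; all shapes, tolerances, and noise levels are hypothetical.

```python
import numpy as np

n = 512
t = (np.arange(n) - n // 2) * 0.05                       # two-way delay axis (arbitrary units)

def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2)

response = gauss(t, 0.0, 0.30); response /= response.sum()   # known system/flat-surface response
pdf_true = gauss(t, 0.20, 0.60); pdf_true /= pdf_true.sum()  # unknown surface height PDF

rng = np.random.default_rng(2)
waveform = np.real(np.fft.ifft(np.fft.fft(response) * np.fft.fft(pdf_true)))
waveform += rng.normal(0.0, 1e-5, n)                     # receiver noise

# Regularized (Wiener-style) spectral division recovers the height PDF.
R, W = np.fft.fft(response), np.fft.fft(waveform)
eps = 1e-2
pdf_est = np.real(np.fft.ifft(W * np.conj(R) / (np.abs(R) ** 2 + eps ** 2)))
pdf_est = np.clip(pdf_est, 0.0, None)
pdf_est[pdf_est < 0.01 * pdf_est.max()] = 0.0            # suppress deconvolution ripples
pdf_est /= pdf_est.sum()

# Moments of the recovered PDF give the altimeter-relevant parameters.
mean = np.sum(t * pdf_est)                               # height (epoch) offset
std = np.sqrt(np.sum((t - mean) ** 2 * pdf_est))         # surface standard deviation
skew = np.sum((t - mean) ** 3 * pdf_est) / std ** 3      # surface skewness
print(f"offset={mean:+.3f}  sigma={std:.3f}  skewness={skew:+.3f}")
```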

  13. The direct deconvolution of X-ray spectra

    NASA Technical Reports Server (NTRS)

    Kahn, S. M.; Blissett, R. J.

    1980-01-01

    The method of deconvolution of proportional counter X-ray spectra as outlined by Blissett and Cruise is reviewed with particular emphasis on low-energy spectra. This method involves the expansion of the incident spectrum in terms of a set of orthonormal singular functions which propagate independently through the detector matrix. When applied to low-energy detectors which typically exhibit absorption features in their efficiency curves, the initial Blissett and Cruise prescription is shown to be unstable. Alternative methods for handling the efficiency are described and evaluated. In the deconvolution of steep spectra, additional distortions are shown to arise as a result of sidelobe oscillations in the effective response functions. A selective weighting procedure is thus introduced to suppress spurious features of this type. Finally, the role of filtering in the deconvolution procedure is discussed and several suitable forms for the filter are suggested.

  14. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  15. Multichannel Blind Deconvolution of the Short-Exposure Astronomical Images Filip Sroubek, Jan Flusser, Tomas Suk

    E-print Network

    Sroubek, Filip

    Multichannel Blind Deconvolution of the Short-Exposure Astronomical Images Filip Sroubek, Jan@asu.cas.cz Abstract In this paper we present a new multichannel blind deconvolution method based on so-called subspace single channel ones. In comparison with earlier multichannel blind deconvolution techniques the subspace

  16. Fast Monotonic Blind Deconvolution Algorithm for Constrained TV Based Image Restoration

    E-print Network

    Lu, Wu-Sheng

    Fast Monotonic Blind Deconvolution Algorithm for Constrained TV Based Image Restoration Haiying Liu Kong, P. R. China Abstract--A new fast monotonic blind deconvolution algorithmic method. Index Terms--Blind Deconvolution, Total Variation, Deblurring, Denoising, Bisection Technique I

  17. TOTAL VARIATION BLIND DECONVOLUTION USING A VARIATIONAL APPROACH TO PARAMETER, IMAGE, AND BLUR ESTIMATION

    E-print Network

    Granada, Universidad de

    TOTAL VARIATION BLIND DECONVOLUTION USING A VARIATIONAL APPROACH TO PARAMETER, IMAGE, AND BLUR blind deconvolution and parameter estimation utilizing a variational framework. Within a hierarchi, all ordered lexicographically. The general objective of blind deconvolution is to estimate x and H

  18. ADAPTIVE BLIND DECONVOLUTION OF LINEAR CHANNELS USING RENYI'S ENTROPY WITH PARZEN WINDOW ESTIMATION

    E-print Network

    Slatton, Clint

    ADAPTIVE BLIND DECONVOLUTION OF LINEAR CHANNELS USING RENYI'S ENTROPY WITH PARZEN WINDOW ESTIMATION of Cantabria, Santander, Spain Abstract. Blind deconvolution of linear channels is a fundamental signal, as a criterion for blind deconvolution of linear channels. Comparisons between maximum and minimum entropy

  19. BAYESIAN BLIND DECONVOLUTION FROM DIFFERENTLY EXPOSED IMAGE PAIRS S. Derin Babacan1

    E-print Network

    Granada, Universidad de

    BAYESIAN BLIND DECONVOLUTION FROM DIFFERENTLY EXPOSED IMAGE PAIRS S. Derin Babacan1 , Jingnan Wang1 but with a very high level of noise. In this paper we address this problem and present a novel blind deconvolution-- Blind deconvolution, Bayesian methods, variational distribution approximations, image stabilization

  20. Blind Deconvolution of Sparse Pulse Sequences under a Minimum Distance Constraint: A Partially

    E-print Network

    Tourneret, Jean-Yves

    1 Blind Deconvolution of Sparse Pulse Sequences under a Minimum Distance Constraint: A Partially (Fellow, IEEE), and Nicolas Dobigeon (Member, IEEE) Abstract-- For blind deconvolution of an unknown propose a Bayesian method for blind deconvolution that is based on a modified Bernoulli-Gaussian prior

  1. IMAGE RECONSTRUCTION FROM PHASED-ARRAY MRI DATA BASED ON MULTICHANNEL BLIND DECONVOLUTION

    E-print Network

    Chen, Rong-Rong

    IMAGE RECONSTRUCTION FROM PHASED-ARRAY MRI DATA BASED ON MULTICHANNEL BLIND DECONVOLUTION Huajun sensitivity functions. A new framework based on multichannel blind deconvolution (MBD) is developed for joint deconvolution, regularization, image restoration 1. INTRODUCTION MRI using phased array coils [1] have emerged

  2. The WaveD Transform in R: performs fast translation-invariant wavelet deconvolution.

    E-print Network

    Sydney, University of

    The WaveD Transform in R: performs fast translation-invariant wavelet deconvolution. Marc Raimondo of noisy convolutions to illustrate the non-linear adaptive properties of wavelet deconvolution. 1 of the WaveD transform to the deconvolution of noisy signals. The aim of deconvolution is to recover

  3. 734 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 3, MARCH 2002 Wavelet Deconvolution

    E-print Network

    Fan, Jianqing

    734 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 3, MARCH 2002 Wavelet Deconvolution deconvolution estimators can achieve the optimal rates of convergence in the Besov spaces with 2 when, then linear estimators can be improved by using thresholding wavelet deconvolution estimators which

  4. Deconvolution using natural image priors Anat Levin Rob Fergus Fredo Durand William T. Freeman

    E-print Network

    Masci, Frank

    Deconvolution using natural image priors Anat Levin Rob Fergus Frédo Durand William T. Freeman, such as the Richardson-Lucy deconvolution algorithm. However, in this note we propose alternative strategies that make it is helpful to introduce prior into the deconvolution process, and search for an x which will more or less

  5. ADAPTIVE PARAUNITARY FILTER BANKS FOR CONTRAST-BASED MULTICHANNEL BLIND DECONVOLUTION

    E-print Network

    Douglas, Scott C.

    ADAPTIVE PARAUNITARY FILTER BANKS FOR CONTRAST-BASED MULTICHANNEL BLIND DECONVOLUTION X. Sun and S In multichannel blind deconvolution, an m-dimensional vector sequence s(k) containing statistically}. Multichannel blind deconvolution is particularly useful in wireless communications employing smart antennas

  6. Impact of sensor's point spread function on land cover characterization: assessment and deconvolution

    E-print Network

    Liang, Shunlin

    and deconvolution Chengquan Huanga, *, John R.G. Townshenda,b , Shunlin Lianga , Satya N.V. Kalluria,1 , Ruth S. De deconvolution methods in that the PSF was adjusted with lower weighting factors for signals originating from neighboring pixels than those specified by the PSF model. By using this deconvolution method, the lost

  7. BLIND IMAGE DECONVOLUTION: MOTION BLUR ESTIMATION Felix Krahmer, Youzuo Lin, Bonnie McAdoo

    E-print Network

    BLIND IMAGE DECONVOLUTION: MOTION BLUR ESTIMATION By Felix Krahmer, Youzuo Lin, Bonnie McAdoo added. Finally, the best angle and length estimates are combined with existing deconvolution methods

  8. 1430 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 52, NO. 5, MAY 2004 Multichannel Blind Deconvolution of

    E-print Network

    Zhang, Liqing

    Deconvolution of Nonminimum-Phase Systems Using Filter Decomposition Liqing Zhang, Andrzej Cichocki, Member method for multichannel blind deconvolution of nonminimum-phase systems. With this approach, we present a very simple cost function for blind deconvolution of nonminimum-phase systems

  9. Statistica Sinica 17(2007), 317-340 ADAPTIVE BOXCAR DECONVOLUTION

    E-print Network

    Raimondo, Marc

    2007-01-01

    Statistica Sinica 17(2007), 317-340 ADAPTIVE BOXCAR DECONVOLUTION ON FULL LEBESGUE MEASURE SETS G Johnstone, Kerkyacharian, Picard and Raimondo (2004) have developed a wavelet deconvolution method (called: Adaptive estimation, boxcar, deconvolution, non-parametric regression, Meyer wavelet. 1. Introduction

  10. ITERATIVE DECONVOLUTION FOR AUTOMATIC BASECALLING OF THE DNA ELECTROPHORESIS TIME SERIES

    E-print Network

    Zhang, Xiao-Ping

    1 ITERATIVE DECONVOLUTION FOR AUTOMATIC BASECALLING OF THE DNA ELECTROPHORESIS TIME SERIES Xiao quality of the DNA electrophoresis time series based on an iterative deconvolution method. It recovers accuracy and read length. The core of the method is an iterative nonlinear deconvolution algorithm. Unlike

  11. A blind deconvolution method for ground based telescopes and Fizeau interferometers

    E-print Network

    Bertero, Mario

    A blind deconvolution method for ground based telescopes and Fizeau interferometers M. Prato a, , A deconvolution problems for ground-based telescopes with AO systems. Suitable constraints on object and PSF, including the Strehl ratio, are exploited. Numerical experiments on multiple images blind deconvolution

  12. QUASI-NEWTON FILTERED-ERROR AND FILTERED-REGRESSOR ALGORITHMS FOR ADAPTIVE EQUALIZATION AND DECONVOLUTION

    E-print Network

    Cichocki, Andrzej

    AND DECONVOLUTION S.C. Douglasy, A. Cichockiz, and S. Amariz yDepartment of Electrical Engineering, University, RIKEN, Saitama 351-01 JAPAN ABSTRACT In equalization and deconvolution tasks, the correlated nature-Newton convergence locally about the optimum coefficient solution for deconvolution and equalization tasks. Simulations

  13. Blind deconvolution of human brain SPECT images using a distribution mixture estimation

    E-print Network

    Mignotte, Max

    Blind deconvolution of human brain SPECT images using a distribution mixture estimation Max and then to facilitate their interpretation, we propose herein to implement a deconvolution procedure relying of the object to be deconvolved when this one is needed. In this context, we compare the deconvolution results

  14. IEEE TRANSACTIONS ON IMAGE PROCESSING, TO APPEAR. 1 Total Variation Blind Deconvolution

    E-print Network

    Ferguson, Thomas S.

    IEEE TRANSACTIONS ON IMAGE PROCESSING, TO APPEAR. 1 Total Variation Blind Deconvolution Tony F. Chan, C.K. Wong Abstract--- In this paper, we present a blind deconvolution algorithm based]. As this blind deconvolution problem is ill-posed with respect to both k and u, You and Kaveh [13] proposed

  15. Deconvolution of Sparse Signals and Images from Noisy Partial Data using MUSIC

    E-print Network

    Yagle, Andrew E.

    1 Deconvolution of Sparse Signals and Images from Noisy Partial Data using MUSIC Andrew E. Yagle Department of EECS, The University of Michigan, Ann Arbor, MI 48109-2122 Abstract-- Deconvolution has three reconstruction; Deconvolution Phone: 734-763-9810. Fax: 734-763-1503. Email: aey@eecs.umich.edu. EDICS: 2-REST. I

  16. Deconvolution of Impulse Response in Event-Related BOLD fMRI1 Gary H. Glover

    E-print Network

    Glover, Gary H.

    Deconvolution of Impulse Response in Event-Related BOLD fMRI1 Gary H. Glover Center for Advanced MR that this character was predicted with a modification to Buxton's balloon model. Wiener deconvolution was used to fully overlapping stimuli, linear deconvolution is effective when the stimuli are separated

  17. Blind Deconvolution and Toeplitzation Using Iterative Null-Space and Rank-One Projections

    E-print Network

    Yagle, Andrew E.

    1 Blind Deconvolution and Toeplitzation Using Iterative Null-Space and Rank-One Projections Andrew deconvolution if the image has compact support). We present a new reformulation of these problems in which deconvolution, Toeplitzation Phone: 734-763-9810. Fax: 734-763-1503. Email: aey@eecs.umich.edu. EDICS: 2-REST. I

  18. Fifth-order electronically non-resonant Raman scattering: two-dimensional Fourier deconvolution

    E-print Network

    Kaufman, Laura

    Fifth-order electronically non-resonant Raman scattering: two-dimensional Fourier deconvolution presents an analytical Fourier deconvolution procedure for homodyne detected electronically non-resonant pulses used in the experiment. In developing the fifth-order deconvolution procedure, an analogous

  19. TWO SPATIO-TEMPORAL DECORRELATION LEARNING ALGORITHMS AND THEIR APPLICATION TO MULTICHANNEL BLIND DECONVOLUTION

    E-print Network

    Cichocki, Andrzej

    DECONVOLUTION Seungjin CHOIy, Andrzej CICHOCKIz, Shun-ichi AMARIz y School of Electrical and Electronics-temporal decorrelation algorithms. These two algorithms are applied to multichannel blind deconvolution task. 1. INTRODUCTION Multichannel blind deconvolution (MBD) is a fundamental problem encountered

  20. IMAGE DECONVOLUTION USING A GAUSSIAN SCALE MIXTURES MODEL TO APPROXIMATE THE WAVELET SPARSENESS CONSTRAINT

    E-print Network

    Kingsbury, Nick

    IMAGE DECONVOLUTION USING A GAUSSIAN SCALE MIXTURES MODEL TO APPROXIMATE THE WAVELET SPARSENESS into the deconvolution process and thus enables the algorithm to achieve better results with fewer iterations in our to deconvolution are based on the least squares (LS) formulation using the differences between the measured

  1. Blind deconvolution of multivariate signals : a deflation approach Ph. Loubaton* ,Ph. A. Regalia**

    E-print Network

    Regalia, Phillip A.

    Blind deconvolution of multivariate signals: a deflation approach Ph. Loubaton*, Ph. A. Regalia** established (e.g. [6]) that the blind deconvolution problem makes sense as soon as y (or equivalently w ) is non Gaussian. In this paper, we deal with the previously adaptive blind deconvolution problem

  2. A Thresholded Landweber Algorithm for Wavelet-based Sparse Poisson Deconvolution

    E-print Network

    Kingsbury, Nick

    A Thresholded Landweber Algorithm for Wavelet-based Sparse Poisson Deconvolution Ganchi Zhang a new iterative deconvolution algorithm for noisy Poisson images based on wavelet sparse regularization good solution for 3D microscopy deconvolution. I. NOTATIONS In this document we use the following

  3. Wavelet deconvolution in a periodic setting Iain M. Johnstone, Gerard Kerkyacharian,

    E-print Network

    Université Pierre-et-Marie-Curie, Paris 6

    Wavelet deconvolution in a periodic setting Iain M. Johnstone, Gérard Kerkyacharian, Stanford-optimal rate of convergence for a variety of Lp loss functions. 1 De-convolution in white noise Suppose we Figure 2, are the following: Key Words and Phrases. Adaptive estimation, deconvolution, non

  4. Deconvolution imaging conditions and cross-talk suppression Travis L. Poole1

    E-print Network

    Deconvolution imaging conditions and cross-talk suppression Travis L. Poole1 , Andrew Curtis1 , Johan O. A. Robertsson2 , and Dirk-Jan van Manen3 ABSTRACT Deconvolution imaging conditions offer- correlation-based imaging conditions. In simple analytical cases, deconvolution imaging conditions also offer

  5. An Evolutionary Approach For Blind Deconvolution Of Barcode Images With Nonuniform Illumination

    E-print Network

    Dumas, Laurent

    An Evolutionary Approach For Blind Deconvolution Of Barcode Images With Nonuniform Illumination with a joint nonuniform illumination estimation and blind deconvolution for barcode signals by using of blind deconvolution of linear barcode signals. Linear barcodes are one of the oldest technology related

  6. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    SciTech Connect

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which are reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of tools targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|Speedshop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting edge systems. Work done under this project at Wisconsin can be divided into two categories, new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

  7. Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles

    PubMed Central

    Baer, DR; Gaspar, DJ; Nachimuthu, P; Techane, SD; Castner, DG

    2010-01-01

    The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties. PMID:20052578

  8. An Analysis Tool for Flight Dynamics Monte Carlo Simulations 

    E-print Network

    Restrepo, Carolina 1982-

    2011-05-20

    and analysis work to understand vehicle operating limits and identify circumstances that lead to mission failure. A Monte Carlo simulation approach that varies a wide range of physical parameters is typically used to generate thousands of test cases...

  9. TOOLS FOR COMPARATIVE ANALYSIS OF ALTERNATIVES: COMPETING OR COMPLEMENTARY PERSPECTIVES?

    EPA Science Inventory

    A third generation of environmental policymaking and risk management will increasingly impose environmental measures, which may give rise to analyzing countervailing risks. Therefore, a comprehensive analysis of these risks associated with the decision alternatives at hand will e...

  10. Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan

    SciTech Connect

    Janjusic, Tommy; Kartsaklis, Christos; Wang, Dali

    2013-01-01

    Application performance is hindered by a variety of factors but is most notably driven by the well known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key when trying to optimize performance. Understanding application performance properties is facilitated by various performance profiling tools. The scope of profiling tools varies in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex and large scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in tool for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with external tools, such as the cache simulator Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics analysis due to the emphasis on the tool's output, which consists of trace files. Moreover, the tool is dependent on the run-time system to provide the necessary infrastructure to expose low level system detail; therefore, we also discuss the theoretical benefits that could be achieved if such modules were present.

  11. Receiver function deconvolution using transdimensional hierarchical Bayesian inference

    NASA Astrophysics Data System (ADS)

    Kolb, J. M.; Lekić, V.

    2014-06-01

    Teleseismic waves can convert from shear to compressional (Sp) or compressional to shear (Ps) across impedance contrasts in the subsurface. Deconvolving the parent waveforms (P for Ps or S for Sp) from the daughter waveforms (S for Ps or P for Sp) generates receiver functions which can be used to analyse velocity structure beneath the receiver. Though a variety of deconvolution techniques have been developed, they are all adversely affected by background and signal-generated noise. In order to take into account the unknown noise characteristics, we propose a method based on transdimensional hierarchical Bayesian inference in which both the noise magnitude and noise spectral character are parameters in calculating the likelihood probability distribution. We use a reversible-jump implementation of a Markov chain Monte Carlo algorithm to find an ensemble of receiver functions whose relative fits to the data have been calculated while simultaneously inferring the values of the noise parameters. Our noise parametrization is determined from pre-event noise so that it approximates observed noise characteristics. We test the algorithm on synthetic waveforms contaminated with noise generated from a covariance matrix obtained from observed noise. We show that the method retrieves easily interpretable receiver functions even in the presence of high noise levels. We also show that we can obtain useful estimates of noise amplitude and frequency content. Analysis of the ensemble solutions produced by our method can be used to quantify the uncertainties associated with individual receiver functions as well as with individual features within them, providing an objective way for deciding which features warrant geological interpretation. This method should make possible more robust inferences on subsurface structure using receiver function analysis, especially in areas of poor data coverage or under noisy station conditions.
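
    For contrast with the hierarchical Bayesian approach described above, the sketch below shows a conventional frequency-domain (water-level) deconvolution of a parent waveform from a daughter waveform on synthetic data; it is not the paper's transdimensional method, and the water level, Gaussian-filter width, and waveforms are hypothetical.

```python
import numpy as np
from scipy.signal import find_peaks

def water_level_deconvolution(daughter, parent, dt, water_level=0.01, gauss_a=2.5):
    """Divide the daughter spectrum by the parent spectrum, stabilized by a
    water level, and low-pass the result with a Gaussian filter (width
    parameter `gauss_a` in rad/s)."""
    n = len(parent)
    freq = np.fft.rfftfreq(n, dt)
    P, D = np.fft.rfft(parent), np.fft.rfft(daughter)
    denom = np.maximum(np.abs(P) ** 2, water_level * np.abs(P).max() ** 2)
    gauss = np.exp(-((2 * np.pi * freq) ** 2) / (4 * gauss_a ** 2))
    return np.fft.irfft(D * np.conj(P) / denom * gauss, n)

# Synthetic test: the daughter equals the parent convolved with two spikes
# (direct arrival plus one converted phase), contaminated with noise.
rng = np.random.default_rng(3)
dt, n = 0.05, 1024
t = np.arange(n) * dt
parent = np.exp(-0.5 * ((t - 5.0) / 0.3) ** 2)          # simplified parent pulse
rf_true = np.zeros(n)
rf_true[int(1.0 / dt)], rf_true[int(5.0 / dt)] = 1.0, 0.4
daughter = np.real(np.fft.ifft(np.fft.fft(parent) * np.fft.fft(rf_true)))
daughter += rng.normal(0.0, 0.01, n)

rf = water_level_deconvolution(daughter, parent, dt)
peaks, _ = find_peaks(rf, height=0.3 * rf.max())
print("receiver-function peak lags:", np.round(t[peaks], 2), "s")
```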

  12. Extension of a System Level Tool for Component Level Analysis

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
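
    As a small illustration of the turbulence closure mentioned above, the sketch below evaluates a Prandtl mixing-length eddy viscosity and the resulting turbulent shear stress for a hypothetical near-wall velocity profile; it is not the network flow analysis code itself.

```python
import numpy as np

KAPPA = 0.41   # von Karman constant

def turbulent_shear_stress(y, u, rho=1000.0):
    """Prandtl mixing-length closure: l_m = kappa*y (distance from the wall),
    nu_t = l_m^2 * |du/dy|, and tau_turb = rho * nu_t * du/dy."""
    dudy = np.gradient(u, y)
    l_m = KAPPA * y
    nu_t = l_m ** 2 * np.abs(dudy)
    return rho * nu_t * dudy

# Hypothetical log-law-like near-wall velocity profile for illustration.
y = np.linspace(1e-4, 0.05, 200)          # distance from wall [m]
u_tau, nu = 0.05, 1e-6                     # friction velocity [m/s], kinematic viscosity [m^2/s]
u = u_tau * (np.log(y * u_tau / nu) / KAPPA + 5.0)

tau = turbulent_shear_stress(y, u)
print(f"turbulent shear stress near the wall: {tau[10]:.3f} Pa, "
      f"rho*u_tau^2 = {1000.0 * u_tau**2:.3f} Pa")
```

    For a log-law profile the mixing-length stress reduces to rho*u_tau^2, which the printed comparison confirms; this is only a consistency check on the closure, not on the network solver.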

  13. GIPSY 3D: Analysis, visualization and VO-Tools

    NASA Astrophysics Data System (ADS)

    Ruíz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; van der Hulst, J. M.

    2009-07-01

    The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing System (GIPSY), and new ones based on use cases elaborated in collaboration with advanced users. One of the main goals is to provide local interoperability between GIPSY (visualization and data analysis) and other VO software. The connectivity with the Virtual Observatory environment will provide general access to 3D data VO archives and services, maximizing the potential for scientific discovery.

  14. Extension of a System Level Tool for Component Level Analysis

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)

    2001-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.

  15. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  16. DEBRISK, a Tool for Re-Entry Risk Analysis

    NASA Astrophysics Data System (ADS)

    Omaly, P.; Spel, M.

    2012-01-01

    An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to assure the risk mitigation of their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. Using this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads; this altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its own birth criterion. In the simplest approach the child is born after demise of the parent object. This could be the case of an object A containing an object B which is in the interior of object A and thus not exposed to the atmosphere. Each object is defined by its shape, attitude and dimensions, its material along with the material's physical properties, and its state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring. A newborn object inherits the state vector of the parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached. The mass of melted material is computed from the excess heat and the material properties. After each step the amount of ablated material is determined using the lumped mass approach and is peeled off from the object, updating the mass and shape of the ablated object. The mass in the lumped mass equation is termed 'thermal mass' and consists of the part of the object that is exposed to the flow (so excluding the mass of the contained children). A fair number of predefined materials is implemented, along with their thermal properties. In order to allow users to modify the properties or to add new materials, user-defined materials can be used. In that case properties such as specific heat, emissivity and conductivity can either be entered as constants or made temperature dependent by entering a table. Materials can be derived from existing materials, which is useful in case only one or a few of the material properties change. The code has been developed in the Java language, benefitting from the object-oriented approach. Most methods used in DEBRISK to compute drag coefficients and heating rates are engineering methods developed in the 1950s and 1960s, which are also used in similar tools (ORSAT, SESAME, ORSAT-J, ...). The paper presents a set of comparisons with literature cases of similar tools in order to verify the implementation of those methods in the developed software.
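
    To illustrate the lumped-mass heating and ablation logic described above, the sketch below (Python, with hypothetical material values and function names; DEBRISK itself is written in Java and its exact formulation is not reproduced here) heats an object until its melting temperature is reached and then converts the excess heat into ablated mass.

```python
def ablation_step(T, m_thermal, q_dot, area, dt, mat):
    """One lumped-mass heating/ablation step, in the spirit of the object-based
    re-entry approach described above (hedged sketch, not DEBRISK's formulation).
    T          current object temperature [K]
    m_thermal  'thermal mass' exposed to the flow [kg]
    q_dot      net heating rate per unit area [W/m^2]
    area       exposed (wetted) area [m^2]
    dt         time step [s]
    mat        dict with 'cp' [J/kg/K], 'T_melt' [K], 'h_fusion' [J/kg]
    Returns updated (T, m_thermal, m_ablated)."""
    Q = q_dot * area * dt                            # heat absorbed this step [J]
    if T < mat['T_melt']:
        T_new = T + Q / (m_thermal * mat['cp'])      # heat the lump
        if T_new <= mat['T_melt']:
            return T_new, m_thermal, 0.0
        # only the heat beyond the melting point drives ablation
        Q = (T_new - mat['T_melt']) * m_thermal * mat['cp']
        T = mat['T_melt']
    m_ablated = min(m_thermal, Q / mat['h_fusion'])  # melted mass, "peeled off"
    return T, m_thermal - m_ablated, m_ablated

# Illustrative call with rough aluminium-like properties (assumed values)
aluminium = {'cp': 900.0, 'T_melt': 933.0, 'h_fusion': 3.97e5}
T, m, lost = ablation_step(T=600.0, m_thermal=5.0, q_dot=2e5,
                           area=0.3, dt=0.1, mat=aluminium)
```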

  17. Cluster analysis as a prediction tool for pregnancy outcomes.

    PubMed

    Banjari, Ines; Kenjerić, Daniela; Šolić, Krešimir; Mandić, Milena L

    2015-03-01

    Considering the specific physiological changes during gestation and thinking of pregnancy as a "critical window", classification of pregnant women in early pregnancy can be considered crucial. The paper demonstrates the use of a method based on an approach from intelligent data mining, cluster analysis. Cluster analysis is a statistical method which makes it possible to group individuals based on sets of identifying variables. The method was chosen in order to determine the possibility of classifying pregnant women in early pregnancy and to analyze unknown correlations between different variables so that certain outcomes could be predicted. 222 pregnant women from two general obstetric offices were recruited. The main focus was on characteristics of these pregnant women: their age, pre-pregnancy body mass index (BMI) and haemoglobin value. Cluster analysis achieved a 94.1% classification accuracy rate with three branches or groups of pregnant women showing statistically significant correlations with pregnancy outcomes. The results show that pregnant women of both older age and higher pre-pregnancy BMI have a significantly higher incidence of delivering a baby of higher birth weight, yet they gain significantly less weight during pregnancy. Their babies are also longer, and these women have a significantly higher probability of complications during pregnancy (gestosis) and a higher probability of induced or caesarean delivery. We can conclude that the cluster analysis method can appropriately classify pregnant women in early pregnancy to predict certain outcomes. PMID:26040101
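
    A minimal sketch of how such a grouping could be set up on the three identifying variables (age, pre-pregnancy BMI, haemoglobin): the records below are invented and scikit-learn's k-means with three clusters is used purely for illustration; the study's own clustering procedure, data and 94.1% accuracy figure are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical records: [age (years), pre-pregnancy BMI (kg/m^2), haemoglobin (g/L)]
X = np.array([
    [24, 21.5, 128], [31, 27.8, 119], [36, 30.1, 115],
    [28, 22.0, 131], [40, 29.4, 112], [22, 20.3, 134],
])

X_std = StandardScaler().fit_transform(X)                 # common scale for all variables
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_std)
print(km.labels_)                                          # cluster membership per woman
```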

  18. Correcting direction-dependent gains in the deconvolution of radio interferometric images

    E-print Network

    S. Bhatnagar; T. J. Cornwell; K. Golap; Juan M. Uson

    2008-06-03

    Astronomical imaging using aperture synthesis telescopes requires deconvolution of the point spread function as well as calibration of instrumental and atmospheric effects. In general, such effects are time-variable and vary across the field of view as well, resulting in direction-dependent (DD), time-varying gains. Most existing imaging and calibration algorithms assume that the corruptions are direction independent, preventing even moderate dynamic range full-beam, full-Stokes imaging. We present a general framework for imaging algorithms which incorporate DD errors. We describe as well an iterative deconvolution algorithm that corrects known DD errors due to the antenna power patterns and pointing errors for high dynamic range full-beam polarimetric imaging. Using simulations we demonstrate that errors due to realistic primary beams as well as antenna pointing errors will limit the dynamic range of upcoming higher sensitivity instruments and that our new algorithm can be used to correct for such errors. We have applied this algorithm to VLA 1.4 GHz observations of a field that contains two "4C" sources and have obtained Stokes-I and -V images with systematic errors that are one order of magnitude lower than those obtained with conventional imaging tools. Our simulations show that on data with no other calibration errors, the algorithm corrects pointing errors as well as errors due to known asymmetries in the antenna pattern.
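
    To make the deconvolution context concrete, here is a minimal Högbom-style CLEAN loop with a crude direction-dependent correction applied in the image plane. This is only a sketch under simplifying assumptions (a known, static antenna power pattern `gain_map`); it is not the authors' algorithm, which applies the DD corrections to the visibilities during imaging.

```python
import numpy as np

def clean_dd(dirty, psf, gain_map, loop_gain=0.1, n_iter=200, threshold=0.0):
    """Hogbom-style CLEAN with an ad-hoc direction-dependent (DD) correction:
    each component found in the residual is divided by the antenna power
    pattern at that pixel before being stored in the model.
    dirty, psf : 2-D arrays; gain_map : 2-D power pattern (> 0 everywhere)."""
    residual = dirty.astype(float)
    model = np.zeros_like(residual)
    cy, cx = psf.shape[0] // 2, psf.shape[1] // 2
    for _ in range(n_iter):
        y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[y, x]
        if abs(peak) <= threshold:
            break
        comp = loop_gain * peak
        model[y, x] += comp / gain_map[y, x]          # undo apparent DD attenuation
        # subtract the scaled PSF centred on (y, x) from the residual
        yv, xv = np.meshgrid(np.arange(residual.shape[0]) - y + cy,
                             np.arange(residual.shape[1]) - x + cx, indexing='ij')
        inside = (yv >= 0) & (yv < psf.shape[0]) & (xv >= 0) & (xv < psf.shape[1])
        residual[inside] -= comp * psf[yv[inside], xv[inside]]
    return model, residual
```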

  19. A software tool for the analysis of neuronal morphology data.

    PubMed

    Ledderose, Julia; Sención, Luis; Salgado, Humberto; Arias-Carrión, Oscar; Treviño, Mario

    2014-01-01

    Anatomy plays a fundamental role in supporting and shaping nervous system activity. The remarkable progress of computer processing power within the last two decades has enabled the generation of electronic databases of complete three-dimensional (3D) dendritic and axonal morphology for neuroanatomical studies. Several laboratories freely post their reconstructions online after publication of the results, e.g., NeuroMorpho.Org (Nat Rev Neurosci 7:318-324, 2006). These neuroanatomical archives represent a crucial resource to explore the relationship between structure and function in the brain (Front Neurosci 6:49, 2012). However, such 'Cartesian' descriptions bear little intuitive information for neuroscientists. Here, we developed a simple prototype of a MATLAB-based software tool to quantitatively describe the 3D neuronal structures from public repositories. The program imports neuronal reconstructions and quantifies statistical distributions of basic morphological parameters such as branch length, tortuosity, branch genealogy and bifurcation angles. Using these morphological distributions, our algorithm can generate a set of virtual neurons readily usable for network simulations. PMID:24529393
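
    The kind of per-branch statistics mentioned above (length, tortuosity) can be computed directly from an SWC-style reconstruction table, the format commonly used by public repositories such as NeuroMorpho.Org. The sketch below (Python, not the authors' MATLAB tool) assumes a tiny hypothetical reconstruction.

```python
import numpy as np

def swc_path_metrics(swc):
    """Simple morphology metrics from an SWC-style array with columns
    [id, type, x, y, z, radius, parent] (parent = -1 for the root).
    Returns total cable length and, for each tip, the tortuosity of the path
    back to the root (path length / straight-line distance)."""
    ids = swc[:, 0].astype(int)
    xyz = {int(i): swc[k, 2:5] for k, i in enumerate(ids)}
    par = {int(i): int(p) for i, p in zip(ids, swc[:, 6].astype(int))}
    seg_len = {i: (0.0 if par[i] == -1 else np.linalg.norm(xyz[i] - xyz[par[i]]))
               for i in xyz}
    total_length = sum(seg_len.values())
    tips = set(xyz) - {p for p in par.values() if p != -1}
    tortuosity = {}
    for t in tips:
        path, node = 0.0, t
        while par[node] != -1:               # walk back towards the root
            path += seg_len[node]
            node = par[node]
        chord = np.linalg.norm(xyz[t] - xyz[node])
        tortuosity[t] = path / chord if chord > 0 else np.nan
    return total_length, tortuosity

# Tiny illustrative reconstruction (id, type, x, y, z, radius, parent)
swc = np.array([[1, 1, 0, 0, 0, 1.0, -1],
                [2, 3, 5, 0, 0, 0.5,  1],
                [3, 3, 9, 3, 0, 0.4,  2]])
print(swc_path_metrics(swc))
```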

  20. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    NASA Technical Reports Server (NTRS)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  1. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    PubMed Central

    Grüning, Björn A.; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu). PMID:24109552

  2. Image restoration for confocal microscopy: improving the limits of deconvolution, with application to the visualization of the mammalian hearing organ.

    PubMed Central

    Boutet de Monvel, J; Le Calvez, S; Ulfendahl, M

    2001-01-01

    Deconvolution algorithms have proven very effective in conventional (wide-field) fluorescence microscopy. Their application to confocal microscopy is hampered, in biological experiments, by the presence of important levels of noise in the images and by the lack of a precise knowledge of the point spread function (PSF) of the system. We investigate the application of wavelet-based processing tools to deal with these problems, in particular wavelet denoising methods, which turn out to be very effective in application to three-dimensional confocal images. When used in combination with more classical deconvolution algorithms, these methods provide a robust and efficient restoration scheme allowing one to deal with difficult imaging conditions. To make our approach applicable in practical situations, we measured the PSF of a Biorad-MRC1024 confocal microscope under a large set of imaging conditions, including in situ acquisitions. As a specific biological application, we present several examples of restorations of three-dimensional confocal images acquired inside an intact preparation of the hearing organ. We also provide a quantitative assessment of the gain in quality achieved by wavelet-aided restorations over classical deconvolution schemes, based on a set of numerical experiments that we performed with test images. PMID:11325744
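
    As a deliberately simplified illustration of the classical deconvolution side of such a restoration scheme, the sketch below implements plain Richardson-Lucy iterations on a synthetic image blurred by a Gaussian PSF. The wavelet denoising step, the measured PSFs and the confocal data of the paper are not reproduced; all names and numbers here are assumptions.

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
    """Plain Richardson-Lucy deconvolution (a classical scheme of the kind the
    paper combines with wavelet denoising; not the authors' exact code)."""
    estimate = np.full(image.shape, image.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = convolve(estimate, psf, mode='reflect')
        ratio = image / (blurred + eps)
        estimate *= convolve(ratio, psf_mirror, mode='reflect')
    return estimate

# Synthetic 2-D test: a bright square blurred by a Gaussian PSF plus noise
rng = np.random.default_rng(0)
truth = np.zeros((64, 64)); truth[30:34, 30:34] = 10.0
psf = np.zeros((15, 15)); psf[7, 7] = 1.0
psf = gaussian_filter(psf, sigma=2.0); psf /= psf.sum()
observed = convolve(truth, psf, mode='reflect')
observed = np.clip(observed + 0.01 * rng.standard_normal(observed.shape), 0, None)
# a denoising step (e.g. wavelet shrinkage) could be applied to `observed` here,
# in the spirit of the wavelet-aided restoration described above
restored = richardson_lucy(observed, psf)
```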

  3. Analysis of spreadable cheese by Raman spectroscopy and chemometric tools.

    PubMed

    de Sá Oliveira, Kamila; de Souza Callegaro, Layce; Stephani, Rodrigo; Almeida, Mariana Ramos; de Oliveira, Luiz Fernando Cappa

    2016-03-01

    In this work, FT-Raman spectroscopy was explored to evaluate spreadable cheese samples. A partial least squares discriminant analysis was employed to identify the spreadable cheese samples containing starch. To build the models, two types of samples were used: commercial samples and samples manufactured in local industries. The method of supervised classification PLS-DA was employed to classify the samples as adulterated or without starch. Multivariate regression was performed using the partial least squares method to quantify the starch in the spreadable cheese. The limit of detection obtained for the model was 0.34% (w/w) and the limit of quantification was 1.14% (w/w). The reliability of the models was evaluated by determining the confidence interval, which was calculated using the bootstrap re-sampling technique. The results show that the classification models can be used to complement classical analysis and as screening methods. PMID:26471577
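
    A minimal sketch of the PLS-DA idea on synthetic "spectra" (invented data; the actual FT-Raman measurements, preprocessing and validation of the paper are not reproduced): PLS regression is fitted against a 0/1 class label and the prediction is thresholded to classify samples as with or without starch.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical Raman spectra: rows = samples, columns = wavenumber channels
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 300))
y = np.repeat([0, 1], 20)              # 0 = without starch, 1 = starch added
X[y == 1, 140:150] += 2.0              # fake "starch band" for illustration only

pls = PLSRegression(n_components=3).fit(X, y)      # PLS-DA = PLS on class labels
y_pred = (pls.predict(X).ravel() > 0.5).astype(int)
print((y_pred == y).mean())            # in-sample classification rate (illustrative)
```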

  4. Exploring NASA and ESA Atmospheric Data Using GIOVANNI, the Online Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory

    2007-01-01

    Giovanni, the NASA Goddard online visualization and analysis tool (http://giovanni.gsfc.nasa.gov), allows users to explore various atmospheric phenomena without learning remote sensing data formats or downloading voluminous data. Using NASA MODIS (Terra and Aqua) and ESA MERIS (ENVISAT) aerosol data as an example, we demonstrate Giovanni usage for online multi-sensor remote sensing data comparison and analysis.

  5. Unsteady, high Reynolds number validation cases for a multi-phase CFD analysis tool have been

    E-print Network

    Kunz, Robert Francis

    Unsteady, high Reynolds number validation cases for a multi-phase CFD analysis tool have been … state-of-the-art in CFD analysis of cavitation. The restrictions in range of applicability associated … and viscous effects such as flow separation. The principal interest here is in modeling high Reynolds number …

  6. Abstract Title: Image Informatics Tools for the Analysis of Retinal Images

    E-print Network

    University of California at Santa Barbara

    Image Informatics Tools for the Analysis of Retinal Images (conference presentation, Santa Barbara, CA). Keywords: 682 retinal detachment, 541 image processing, 543 imaging/image … quantitative analysis of retinal images, and to test these methods on a large retinal image database.

  7. Toward Enhancing Automated Credibility Assessment: A Model for Question Type Classification and Tools for Linguistic Analysis

    ERIC Educational Resources Information Center

    Moffitt, Kevin Christopher

    2011-01-01

    The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…

  8. Fault Tree Analysis: A Research Tool for Educational Planning. Technical Report No. 1.

    ERIC Educational Resources Information Center

    Alameda County School Dept., Hayward, CA. PACE Center.

    This ESEA Title III report describes fault tree analysis and assesses its applicability to education. Fault tree analysis is an operations research tool which is designed to increase the probability of success in any system by analyzing the most likely modes of failure that could occur. A graphic portrayal, which has the form of a tree, is…

  9. CoryneBase: Corynebacterium genomic resources and analysis tools at your fingertips.

    PubMed

    Heydari, Hamed; Siow, Cheuk Chuen; Tan, Mui Fern; Jakubovics, Nick S; Wee, Wei Yee; Mutha, Naresh V R; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

    Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence and taxonomy, and may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate ongoing research on corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes that aims to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), a Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

  10. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT): Semi-Annual Progress Report

    SciTech Connect

    Williams, D N

    2012-02-29

    This report summarizes work carried out by the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of July 1, 2011 through December 31, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. The UV-CDAT team is positioned to address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, Visualization interfaces.

  11. An integrated data analysis tool for improving measurements on the MST RFP

    SciTech Connect

    Reusch, L. M.; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J.; Franz, P.; Stephens, H. D.

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
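
    The gain from combining complementary diagnostics can be illustrated with a one-dimensional toy example: for independent Gaussian measurements of the same temperature, the posterior mean is precision-weighted and the posterior uncertainty is smaller than either input. This is only a sketch of the underlying idea, not the MST tool or its MCMC analysis, and the numbers below are invented.

```python
import numpy as np

def combine_gaussian(measurements):
    """Combine independent Gaussian measurements (value, sigma) of one quantity.
    The result is precision-weighted, so its sigma is smaller than any input's;
    a toy stand-in for the Bayesian integrated analysis described above."""
    vals = np.array([m[0] for m in measurements], dtype=float)
    sig = np.array([m[1] for m in measurements], dtype=float)
    w = 1.0 / sig**2                       # precisions
    mean = np.sum(w * vals) / np.sum(w)
    sigma = np.sqrt(1.0 / np.sum(w))
    return mean, sigma

# e.g. Te from SXR and from Thomson scattering (hypothetical numbers, in eV)
print(combine_gaussian([(820.0, 60.0), (790.0, 45.0)]))
```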

  12. Decision Analysis Tool to Compare Energy Pathways for Transportation

    SciTech Connect

    Bloyd, Cary N.; Stork, Kevin

    2011-02-01

    With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies have been proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). A prototype model, the Analytica Transportation Energy Analysis Model (ATEAM), has been developed using the Analytica decision modeling environment, visualizing the structure as a hierarchy of influence diagrams. This report summarizes the FY2010 ATEAM accomplishments.

  13. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    SciTech Connect

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
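
    The kind of levelized-cost calculation such a model performs can be sketched as follows. All inputs are hypothetical, not NETL's or the Power L-CAT's values, and the life-cycle stages, emissions and water-use accounting are omitted.

```python
def lcoe(capital, fixed_om, variable_om, fuel_price, heat_rate, cf, rate, years):
    """Levelized cost of electricity in $/MWh (hedged sketch, not the Power
    L-CAT formulation).
    capital      overnight capital cost [$/kW]
    fixed_om     fixed O&M [$/kW-yr]
    variable_om  variable O&M [$/MWh]
    fuel_price   [$/MMBtu]
    heat_rate    [Btu/kWh]
    cf           capacity factor (0-1); rate = discount rate; years = plant life"""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)  # capital recovery factor
    mwh_per_kw_yr = 8760 * cf / 1000.0                            # annual generation per kW
    return (capital * crf + fixed_om) / mwh_per_kw_yr + variable_om \
        + fuel_price * heat_rate / 1000.0

# e.g. an NGCC-like plant with purely illustrative inputs
print(round(lcoe(capital=1100, fixed_om=15, variable_om=3.5, fuel_price=4.0,
                 heat_rate=6700, cf=0.85, rate=0.08, years=30), 1))
```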

  14. Design and Analysis Tools for Concurrent Blackboard Systems

    NASA Technical Reports Server (NTRS)

    McManus, John W.

    1991-01-01

    A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them. " The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.
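
    A toy blackboard makes the structure above concrete: a shared data structure, knowledge sources with trigger conditions, and an opportunistic control loop that activates whichever sources the current blackboard contents enable. This is only an illustrative Python sketch, not the design and analysis tools discussed in the paper.

```python
class Blackboard(dict):
    """Shared global data structure visible to all knowledge sources."""

class KnowledgeSource:
    """A specialized process: reads the blackboard, computes, writes back."""
    def __init__(self, name, can_run, run):
        self.name, self.can_run, self.run = name, can_run, run

def control_loop(blackboard, sources, max_cycles=10):
    """Opportunistic control: whatever is currently on the blackboard decides
    which knowledge sources activate in each cycle."""
    for _ in range(max_cycles):
        runnable = [ks for ks in sources if ks.can_run(blackboard)]
        if not runnable:
            break
        for ks in runnable:
            ks.run(blackboard)

# Example: two cooperating knowledge sources (hypothetical tasks)
bb = Blackboard(raw=[3, 1, 2])
sources = [
    KnowledgeSource("sorter",
                    lambda b: "raw" in b and "sorted" not in b,
                    lambda b: b.__setitem__("sorted", sorted(b["raw"]))),
    KnowledgeSource("summarizer",
                    lambda b: "sorted" in b and "median" not in b,
                    lambda b: b.__setitem__("median", b["sorted"][len(b["sorted"]) // 2])),
]
control_loop(bb, sources)
print(bb)   # {'raw': [3, 1, 2], 'sorted': [1, 2, 3], 'median': 2}
```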

  15. Comprehensive Tool Wear Estimation in Finish-Machining via Multivariate Time-Series Analysis of 3-D Cutting Forces

    E-print Network

    Yao, Y. Lawrence

    Comprehensive Tool Wear Estimation in Finish-Machining via Multivariate Time-Series Analysis of 3-D Cutting Forces … by the tool wear at the minor flank and nose area. This paper describes an investigation into "comprehensive" tool wear estimation, including flank-, crater-, minor flank-, and nose-wear, based on an analysis of dynamic …

  16. IMPLEMENTING THE STANDARD SPECTRUM METHOD FOR ANALYSIS OF β-γ COINCIDENCE SPECTRA

    SciTech Connect

    Biegalski, S.; Flory, Adam E.; Schrom, Brian T.; Ely, James H.; Haas, Derek A.; Bowyer, Ted W.; Hayes, James C.

    2011-09-14

    The standard deconvolution analysis tool (SDAT) algorithms were developed and tested at the University of Texas at Austin. These algorithms utilize the standard spectrum technique for spectral analysis of β-γ coincidence spectra for nuclear explosion monitoring. Work has been conducted under this contract to implement these algorithms in a usable scientific software package with a graphical user interface. Improvements include the ability to read in PHD-formatted data, gain matching, and data visualization. New auto-calibration algorithms were developed and implemented based on 137Cs spectra for assessment of the energy vs. channel calibrations. Details on the user tool and testing are included.
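
    One common way to cast the standard spectrum technique is as a non-negative least-squares fit of the measured spectrum to a set of library ("standard") spectra. The sketch below illustrates that formulation with invented templates; it should not be read as the SDAT algorithms themselves, and all names and shapes are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def standard_spectrum_fit(measured, standards):
    """Estimate the coefficient of each standard spectrum in a measured spectrum
    using non-negative least squares (hedged sketch of the standard spectrum
    idea, not the SDAT implementation).
    measured   1-D array of counts per channel (a 2-D coincidence spectrum can
               be flattened to 1-D first)
    standards  dict name -> 1-D template spectrum, same length as `measured`"""
    names = list(standards)
    A = np.column_stack([standards[n] for n in names])
    coeffs, residual = nnls(A, measured.astype(float))
    return dict(zip(names, coeffs)), residual

# Toy templates over 64 channels (hypothetical shapes)
ch = np.arange(64)
xe133 = np.exp(-0.5 * ((ch - 20) / 4.0) ** 2)       # fake radioxenon peak
radon_bg = np.exp(-ch / 30.0)                        # fake background continuum
measured = 50 * xe133 + 120 * radon_bg + np.random.default_rng(2).poisson(2, 64)
print(standard_spectrum_fit(measured, {"133Xe": xe133, "radon background": radon_bg}))
```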

  17. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  18. ExoData: Open Exoplanet Catalogue exploration and analysis tool

    NASA Astrophysics Data System (ADS)

    Varley, Ryan

    2015-12-01

    ExoData is a python interface for accessing and exploring the Open Exoplanet Catalogue. It allows searching of planets (including alternate names) and easy navigation of hierarchy, parses spectral types and fills in missing parameters based on programmable specifications, and provides easy reference of planet parameters such as GJ1214b.ra, GJ1214b.T, and GJ1214b.R. It calculates values such as transit duration, can easily rescale units, and can be used as an input catalog for large scale simulation and analysis of planets.

  19. Stakeholder analysis: a useful tool for biobank planning.

    PubMed

    Bjugn, Roger; Casati, Bettina

    2012-06-01

    Stakeholders are individuals, groups, or organizations that are affected by or can affect a particular action undertaken by others. Biobanks relate to a number of donors, researchers, research institutions, regulatory bodies, funders, and others. These stakeholders can potentially have a strong influence upon the organization and operation of a biobank. A sound strategy for stakeholder engagement is considered essential in project management and organization theory. In this article, we review relevant stakeholder theory and demonstrate how a stakeholder analysis was undertaken in the early stage of a planned research biobank at a public hospital in Norway. PMID:24835062

  20. GIPSY 3D: Analysis, Visualization and VO Tools for Datacubes

    NASA Astrophysics Data System (ADS)

    Ruíz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; van der Hulst, J. M.

    2009-09-01

    The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing System (GIPSY), and new ones based on use cases elaborated in collaboration with advanced users. One of the main goals is to provide local interoperability between GIPSY and other VO software. The connectivity with the Virtual Observatory environment will provide general access to 3D data VO archives and services, maximizing the potential for scientific discovery.