Sample records for deconvolution analysis tool

  1. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    Microsoft Academic Search

    K. M. Foltz Biegalski; S. R. Biegalski; D. A. Haas

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β–γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray

  2. DEVELOPMENT OF THE SPECTRAL DECONVOLUTION ANALYSIS TOOL (SDAT) TO IMPROVE COUNTING STATISTICS AND DETECTION LIMITS FOR NUCLEAR EXPLOSION RADIONUCLIDE MEASUREMENTS

    Microsoft Academic Search

    S. R. Biegalski; K. M. Foltz Biegalski; D. A. Haas

    The Spectral Deconvolution Analysis Tool (SDAT) is being written to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT will utilize spectral deconvolution spectroscopy techniques to analyze both β–γ coincidence spectra for radioxenon isotopes and high-resolution High Purity Germanium (HPGe) spectra that are utilized for aerosol monitoring. Spectral deconvolution spectroscopy is an analysis method that utilizes the

  3. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
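    The record above names several deconvolution families (Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, sigma-CLEAN). As a point of reference only, the sketch below implements the maximum-likelihood member of that family for Poisson noise, commonly known as the Richardson-Lucy iteration, in NumPy rather than IDL; the 1-D test signal, PSF, and iteration count are illustrative assumptions, not code from DeConv_Tool.

```python
# Minimal Richardson-Lucy (maximum-likelihood) deconvolution sketch in 1-D.
import numpy as np

def richardson_lucy(blurred, psf, n_iter=50):
    """Iteratively estimate the unblurred signal from blurred = signal (*) psf."""
    psf = psf / psf.sum()                       # normalized point-spread function
    psf_mirror = psf[::-1]
    estimate = np.full_like(blurred, blurred.mean())
    for _ in range(n_iter):
        reblurred = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# toy usage: a two-spike signal blurred by a Gaussian PSF
x = np.zeros(64); x[20] = 5.0; x[40] = 3.0
psf = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
observed = np.convolve(x, psf / psf.sum(), mode="same")
restored = richardson_lucy(observed, psf)
```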

  4. PVT Analysis With A Deconvolution Algorithm

    SciTech Connect

    Kouzes, Richard T.

    2011-02-01

    Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis algorithm developed by Symetrica, Inc., and they have demonstrated the application of their deconvolution algorithm to PVT with very promising results. The thrust of such a deconvolution algorithm used with PVT is to allow for identification and rejection of naturally occurring radioactive material, reducing alarm rates, rather than the complete identification of all radionuclides, which is the goal of spectroscopic portal monitors. Under this condition, there could be a significant increase in sensitivity to threat materials. The advantage of this approach is an enhancement to the low cost, robust detection capability of PVT-based radiation portal monitor systems. The success of this method could provide an inexpensive upgrade path for a large number of deployed PVT-based systems to provide significantly improved capability at a much lower cost than deployment of NaI(Tl)-based systems of comparable sensitivity.

  5. Monte Carlo error analysis in x-ray spectral deconvolution

    SciTech Connect

    Shirk, D.G.; Hoffman, N.M.

    1984-01-01

    The deconvolution of spectral information from sparse x-ray data is a widely encountered problem in data analysis. An often-neglected aspect of this problem is the propagation of random error in the deconvolution process. We have developed a Monte Carlo approach that enables us to attach error bars to unfolded x-ray spectra. Our Monte Carlo error analysis has been incorporated into two specific deconvolution techniques: the first is an iterative convergent weight method; the second is a singular-value-decomposition (SVD) method. These two methods were applied to an x-ray spectral deconvolution problem having m channels of observations with n points in energy space. When m is less than n, this problem has no unique solution. We discuss the systematics of non-unique solutions and energy-dependent error bars for both methods. The Monte Carlo approach has a particular benefit in relation to the SVD method: it allows us to apply the constraint of spectral non-negativity after the SVD deconvolution rather than before. Consequently we can identify inconsistencies between different detector channels. 4 references, 6 figures.
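    A minimal sketch of the Monte Carlo idea summarized above, not the authors' code: perturb the m observed channels with synthetic noise, unfold each realization through a truncated-SVD pseudo-inverse, and take the spread over realizations as energy-dependent error bars, applying the non-negativity constraint after the SVD step as the abstract describes. The response matrix, noise model, and truncation level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 20                                    # m detector channels, n energy points
E = np.linspace(1.0, 10.0, n)
R = np.exp(-0.5 * ((E[None, :] - np.linspace(2, 9, m)[:, None]) / 1.5) ** 2)  # toy response matrix
true_spec = np.exp(-E / 3.0)
data = R @ true_spec

def unfold(d, rcond=1e-2):
    """Truncated-SVD pseudo-inverse unfolding; non-negativity applied afterwards."""
    spec = np.linalg.pinv(R, rcond=rcond) @ d
    return np.clip(spec, 0.0, None)             # constraint applied *after* the SVD step

trials = np.array([unfold(data + rng.normal(scale=0.02 * data.max(), size=m))
                   for _ in range(500)])
spec_mean, spec_err = trials.mean(axis=0), trials.std(axis=0)   # error bars per energy point
```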

  6. Chemometric Data Analysis for Deconvolution of Overlapped Ion Mobility Profiles

    NASA Astrophysics Data System (ADS)

    Zekavat, Behrooz; Solouki, Touradj

    2012-11-01

    We present the details of a data analysis approach for deconvolution of the ion mobility (IM) overlapped or unresolved species. This approach takes advantage of the ion fragmentation variations as a function of the IM arrival time. The data analysis involves the use of an in-house developed data preprocessing platform for the conversion of the original post-IM/collision-induced dissociation mass spectrometry (post-IM/CID MS) data to a Matlab compatible format for chemometric analysis. We show that principal component analysis (PCA) can be used to examine the post-IM/CID MS profiles for the presence of mobility-overlapped species. Subsequently, using an interactive self-modeling mixture analysis technique, we show how to calculate the total IM spectrum (TIMS) and CID mass spectrum for each component of the IM overlapped mixtures. Moreover, we show that PCA and IM deconvolution techniques provide complementary results to evaluate the validity of the calculated TIMS profiles. We use two binary mixtures with overlapping IM profiles, including (1) a mixture of two non-isobaric peptides (neurotensin (RRPYIL) and a hexapeptide (WHWLQL)), and (2) an isobaric sugar isomer mixture of raffinose and maltotriose, to demonstrate the applicability of the IM deconvolution.

  7. Deconvolution of variability and uncertainty in the Cassini safety analysis

    SciTech Connect

    Kampas, Frank J. [Lockheed Martin Missiles and Space, P.O. Box 8555, Philadelphia, Pennsylvania 19101 (United States)]; Loughin, Stephen [WAM Systems, 650 Loraine Street, Ardmore, Pennsylvania 19003 (United States)]

    1998-01-15

    The standard method for propagation of uncertainty in a risk analysis requires rerunning the risk calculation numerous times with model parameters chosen from their uncertainty distributions. This was not practical for the Cassini nuclear safety analysis, due to the computationally intense nature of the risk calculation. A less computationally intense procedure was developed which requires only two calculations for each accident case. The first of these is the standard 'best-estimate' calculation. In the second calculation, variables and parameters change simultaneously. The mathematical technique of deconvolution is then used to separate out an uncertainty multiplier distribution, which can be used to calculate distribution functions at various levels of confidence.

  8. An L0 sparse analysis prior for blind Poissonian image deconvolution.

    PubMed

    Gong, Xiaojin; Lai, Baisheng; Xiang, Zhiyu

    2014-02-24

    This paper proposes a new approach for blindly deconvolving images that are contaminated by Poisson noise. The proposed approach incorporates a new prior, that is the L0 sparse analysis prior, together with the total variation constraint into the maximum a posteriori (MAP) framework for deconvolution. A greedy analysis pursuit numerical scheme is exploited to solve the L0 regularized MAP problem. Experimental results show that our approach not only produces smooth results substantially suppressing artifacts and noise, but also preserves intensity changes sharply. Both quantitative and qualitative comparisons to the specialized state-of-the-art algorithms demonstrate its superiority. PMID:24663705

  9. Deconvolution of variability and uncertainty in the Cassini safety analysis

    SciTech Connect

    Kampas, F.J. [Lockheed Martin Missiles and Space, P.O. Box 8555, Philadelphia, Pennsylvania 19101 (United States); Loughin, S. [WAM Systems, 650 Loraine Street, Ardmore, Pennsylvania 19003 (United States)

    1998-01-01

    The standard method for propagation of uncertainty in a risk analysis requires rerunning the risk calculation numerous times with model parameters chosen from their uncertainty distributions. This was not practical for the Cassini nuclear safety analysis, due to the computationally intense nature of the risk calculation. A less computationally intense procedure was developed which requires only two calculations for each accident case. The first of these is the standard 'best-estimate' calculation. In the second calculation, variables and parameters change simultaneously. The mathematical technique of deconvolution is then used to separate out an uncertainty multiplier distribution, which can be used to calculate distribution functions at various levels of confidence. © 1998 American Institute of Physics.

  10. Multispectral imaging analysis: spectral deconvolution and applications in biology

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas; Ahmed, Wamiq; Bayraktar, Bulent; Rajwa, Bartek; Sturgis, Jennifer; Robinson, J. P.

    2005-03-01

    Multispectral imaging has been in use for over half a century. Owing to advances in digital photographic technology, multispectral imaging is now used in settings ranging from clinical medicine to industrial quality control. Our efforts focus on the use of multispectral imaging coupled with spectral deconvolution for measurement of endogenous tissue fluorophores and for animal tissue analysis by multispectral fluorescence, absorbance, and reflectance data. Multispectral reflectance and fluorescence images may be useful in evaluation of pathology in histological samples. For example, current hematoxylin/eosin diagnosis limits spectral analysis to shades of red and blue/grey. It is possible to extract much more information using multispectral techniques. To collect this information, a series of filters or a device such as an acousto-optical tunable filter (AOTF) or liquid-crystal filter (LCF) can be used with a CCD camera, enabling collection of images at many more wavelengths than is possible with a simple filter wheel. In multispectral data processing, the "unmixing" of reflectance or fluorescence data and the classification based upon the resulting spectra are required. In addition to multispectral techniques, extraction of topological information may be possible by reflectance deconvolution or multiple-angle imaging, which could aid in accurate diagnosis of skin lesions or isolation of specific biological components in tissue. The goal of these studies is to develop spectral signatures that will provide us with specific and verifiable tissue structure/function information. In addition, relatively complex classification techniques must be developed so that the data are of use to the end user.
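    A minimal sketch of the spectral "unmixing" step mentioned above, under the common linear-mixing assumption: each measured pixel spectrum is expressed as a non-negative combination of known endmember (fluorophore) spectra. The endmember shapes, wavelength grid, and noise level are synthetic stand-ins, not data from the paper.

```python
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(400, 700, 31)                       # nm, illustrative grid
endmembers = np.column_stack([
    np.exp(-0.5 * ((wavelengths - 520) / 25) ** 2),           # hypothetical fluorophore A
    np.exp(-0.5 * ((wavelengths - 610) / 30) ** 2),           # hypothetical fluorophore B
])
true_abundances = np.array([0.7, 0.3])
pixel = endmembers @ true_abundances + 0.01 * np.random.default_rng(1).normal(size=31)

# non-negative least squares recovers approximate abundances of each endmember
abundances, residual = nnls(endmembers, pixel)
```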

  11. Convergence analysis of blind image deconvolution via dispersion minimization

    Microsoft Academic Search

    C. Vural; W. A. Sethares

    2006-01-01

    A new non-linear adaptive filter called blind image deconvolution via dispersion minimization has recently been proposed for restoring noisy blurred images blindly. This is essentially a two-dimensional version of the constant modulus algorithm that is well known in the field of blind equalization. The two-dimensional extension has been shown capable of reconstructing noisy blurred images using partial a priori

  12. Comparative analysis of UWB deconvolution and feature-extraction algorithms for GPR landmine detection

    NASA Astrophysics Data System (ADS)

    Savelyev, Timofei G.; Sato, Motoyuki

    2004-09-01

    In this work we developed target recognition algorithms for landmine detection with ultra-wideband ground penetrating radar (UWB GPR). Due to the non-stationarity of UWB signals, their processing requires advanced techniques, namely regularized deconvolution and time-frequency or time-scale analysis. We use deconvolution to remove GPR and soil characteristics from the received signals. An efficient algorithm of deconvolution, based on a regularized Wiener inverse filter with wavelet noise level estimation, has been developed. Criteria of efficiency were stability of the signal after deconvolution, the difference between the received signal and the back-convolved signal, and computational speed. The novelty of the algorithm is noise level estimation with wavelet decomposition, which defines the noise level separately for any signal, independently of its statistics. The algorithm was compared with an iterative time-domain deconvolution algorithm based on regularization. For target recognition we apply singular value decomposition (SVD) to a time-frequency signal distribution. Here we compare the Wigner transform and continuous wavelet transform (CWT) for discriminant feature selection. The developed algorithms have been checked on the data acquired with a stepped-frequency GPR.
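    A hedged NumPy sketch of a regularized Wiener inverse filter of the kind described above, not the authors' implementation: the noise level is estimated here from the median absolute deviation of first differences as a simpler stand-in for the paper's wavelet-based estimator, and the reference waveform, target, and regularization constant are invented for illustration.

```python
import numpy as np

def wiener_deconvolve(received, reference, noise_sigma):
    """Divide spectra with a noise-dependent regularization term."""
    n = len(received)
    R = np.fft.rfft(received, n)
    H = np.fft.rfft(reference, n)
    reg = noise_sigma ** 2 * n                       # crude noise-power regularizer
    target = R * np.conj(H) / (np.abs(H) ** 2 + reg)
    return np.fft.irfft(target, n)

def estimate_noise(signal):
    """MAD of first differences as a robust noise-level proxy (stand-in for wavelets)."""
    d = np.diff(signal)
    return 1.4826 * np.median(np.abs(d - np.median(d))) / np.sqrt(2)

rng = np.random.default_rng(2)
reference = np.exp(-np.linspace(0, 5, 128)) * np.sin(np.linspace(0, 20, 128))
target_true = np.zeros(128); target_true[40] = 1.0; target_true[70] = -0.6
received = np.convolve(target_true, reference)[:128] + 0.02 * rng.normal(size=128)
deconvolved = wiener_deconvolve(received, reference, estimate_noise(received))
```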

  13. Monitoring a Building Using Deconvolution Interferometry. II: Ambient-Vibration Analysis

    E-print Network

    Snieder, Roel

    We apply deconvolution interferometry to ambient-vibration data, instead of using earthquake data, to monitor a building. The time continuity of ambient vibrations is useful for temporal monitoring. We show that, because multiple sources

  14. UV spectral deconvolution: A valuable tool for waste water quality determination

    Microsoft Academic Search

    O. Thomas; F. Theraulaz; M. Domeizel; C. Massiani

    1993-01-01

    This paper presents a new, rapid and simple method for the determination of water quality, in particular the evaluation of organic carbon, nitrate and total suspended solids in waters. Based on UV spectral deconvolution, its principle assumes that the absorbance spectrum of water can be decomposed into a small number of characteristic spectra. For a given parameter, for

  15. Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution

    NASA Technical Reports Server (NTRS)

    Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.

    1987-01-01

    The coloring effect on the acoustic emission (AE) signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, a change in AE signal characteristics can be better interpreted as the result of a change in the state of the process alone. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through the deconvolution, the frequency characteristics of AE signals generated during stretching became more distinctive and could be used more effectively as tools for process monitoring.
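    The deconvolution described above amounts to dividing the AE signal spectrum by the instrumentation transfer function. The sketch below shows one common way to stabilize that division (a "water level" on small spectral magnitudes); the impulse response, signal, and water-level value are invented placeholders, not the authors' measured system functions.

```python
import numpy as np

def deconvolve_instrumentation(ae_signal, system_impulse_response, water_level=1e-3):
    """Spectral division with a water-level floor on |H| to avoid blow-up at nulls."""
    n = len(ae_signal)
    S = np.fft.rfft(ae_signal, n)
    H = np.fft.rfft(system_impulse_response, n)
    floor = water_level * np.abs(H).max()
    H_safe = np.where(np.abs(H) < floor, floor, H)
    return np.fft.irfft(S / H_safe, n)

# toy usage: two AE bursts smeared by a synthetic instrument response
rng = np.random.default_rng(7)
impulse_response = np.exp(-np.arange(256) / 20.0) * np.cos(np.arange(256) / 3.0)
ae_true = np.zeros(256); ae_true[50] = 1.0; ae_true[180] = 0.7
recorded = np.convolve(ae_true, impulse_response)[:256] + 0.01 * rng.normal(size=256)
recovered = deconvolve_instrumentation(recorded, impulse_response)
```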

  16. CT-perfusion imaging of the human brain: advanced deconvolution analysis using circulant singular value decomposition.

    PubMed

    Wittsack, H J; Wohlschläger, A M; Ritzl, E K; Kleiser, R; Cohnen, M; Seitz, R J; Mödder, U

    2008-01-01

    According to indicator dilution theory tissue time-concentration curves have to be deconvolved with arterial input curves in order to get valid perfusion results. Our aim was to adapt and validate a deconvolution method originating from magnetic resonance techniques and apply it to the calculation of dynamic contrast enhanced computed tomography perfusion imaging. The application of a block-circulant matrix approach for singular value decomposition renders the analysis independent of tracer arrival time to improve the results. PMID:18029143

  17. Data dependent peak model based spectrum deconvolution for analysis of high resolution LC-MS data.

    PubMed

    Wei, Xiaoli; Shi, Xue; Kim, Seongho; Patrick, Jeffrey S; Binkley, Joe; Kong, Maiying; McClain, Craig; Zhang, Xiang

    2014-02-18

    A data dependent peak model (DDPM) based spectrum deconvolution method was developed for analysis of high resolution LC-MS data. To construct the selected ion chromatogram (XIC), a clustering method, the density based spatial clustering of applications with noise (DBSCAN), is applied to all m/z values of an LC-MS data set to group the m/z values into each XIC. The DBSCAN constructs XICs without the need for a user defined m/z variation window. After the XIC construction, the peaks of molecular ions in each XIC are detected using both the first and the second derivative tests, followed by an optimized chromatographic peak model selection method for peak deconvolution. A total of six chromatographic peak models are considered, including Gaussian, log-normal, Poisson, gamma, exponentially modified Gaussian, and a hybrid of exponential and Gaussian models. The abundant nonoverlapping peaks are chosen to find the optimal peak models that are both data- and retention-time-dependent. Analysis of 18 spiked-in LC-MS data sets demonstrates that the proposed DDPM spectrum deconvolution method outperforms the traditional method. On average, the DDPM approach not only detected 58 more chromatographic peaks from each of the test LC-MS data sets but also improved the retention time and peak area by 3% and 6%, respectively. PMID:24533635
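    As an illustration of the XIC-construction step described above, the sketch below clusters m/z values with scikit-learn's DBSCAN so that each cluster becomes one extracted ion chromatogram, with no user-defined m/z window. The eps/min_samples settings and the toy data are assumptions for the example, not the published DDPM parameters.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)
# toy LC-MS points: (retention time in s, m/z); three ions with small m/z jitter
mz_centers = np.array([301.1412, 415.2128, 524.2653])
points = np.vstack([
    np.column_stack([rng.uniform(100, 400, 200), c + rng.normal(0, 0.002, 200)])
    for c in mz_centers
])

labels = DBSCAN(eps=0.01, min_samples=10).fit_predict(points[:, [1]])   # cluster on m/z only
xics = {lab: points[labels == lab] for lab in set(labels) if lab != -1}  # one XIC per cluster
```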

  18. FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques

    NASA Astrophysics Data System (ADS)

    Madavarapu, Sateesh Babu

    Studies on aluminosilicate materials that can replace traditional construction materials such as ordinary Portland cement (OPC), and thereby reduce the associated environmental effects, have been an important research area for the past decades. Many properties, such as strength, have already been studied, and the primary focus is to learn about the reaction mechanism and the effect of the parameters on the formed products. The aim of this research was to explore the structural changes and reaction product analysis of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at a molecular level, but not all methods are economic and simple. To understand the mechanisms of alkali-activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used, and the effect of the parameters on the reaction products has been analyzed. To analyze complex systems like geopolymers using FTIR, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time- and temperature-dependent analyses were done on slag pastes to understand the polymerization of reactive silica in the system with time and temperature variance. For the time-dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature-dependent analysis was done by curing the samples at 60°C and 80°C. Similarly, fly ash was studied by activating with alkali hydroxides and alkali silicates. Under the same curing conditions, the fly ash samples were evaluated to analyze the effects of added silicates on alkali activation. The peak shifts in the FTIR spectra indicate changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentration of silicate monomer in the activating solution and the position of the main Si-O-T (where T is Al/Si) stretching band in the FTIR spectrum, which gives an indication of the relative changes in the Si/Al ratio. Also, the effect of the cation and silicate concentration in the activating solution is discussed using the Fourier self-deconvolution technique.

  19. Analysis of force-deconvolution methods in frequency-modulation atomic force microscopy

    PubMed Central

    Illek, Esther; Giessibl, Franz J

    2012-01-01

    Summary In frequency-modulation atomic force microscopy the direct observable is the frequency shift of an oscillating cantilever in a force field. This frequency shift is not a direct measure of the actual force, and thus, to obtain the force, deconvolution methods are necessary. Two prominent methods proposed by Sader and Jarvis (Sader–Jarvis method) and Giessibl (matrix method) are investigated with respect to the deconvolution quality. Both methods show a nontrivial dependence of the deconvolution quality on the oscillation amplitude. The matrix method exhibits spikelike features originating from a numerical artifact. By interpolation of the data, the spikelike features can be circumvented. The Sader–Jarvis method has a continuous amplitude dependence showing two minima and one maximum, which is an inherent property of the deconvolution algorithm. The optimal deconvolution depends on the ratio of the amplitude and the characteristic decay length of the force for the Sader–Jarvis method. However, the matrix method generally provides the higher deconvolution quality. PMID:22496997

  20. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  1. Susceptibility of Tmax to tracer delay on perfusion analysis: quantitative evaluation of various deconvolution algorithms using digital phantoms.

    PubMed

    Kudo, Kohsuke; Sasaki, Makoto; Østergaard, Leif; Christensen, Soren; Uwano, Ikuko; Suzuki, Masako; Ogasawara, Kuniaki; Shirato, Hiroki; Ogawa, Akira

    2011-03-01

    The time-to-maximum of the tissue residue function (T(max)) perfusion index has proven very predictive of infarct growth in large clinical trials, yet its dependency on simple tracer delays remains unknown. Here, we determine the dependency of computed tomography (CT) perfusion (CTP) T(max) estimates on tracer delay using a range of deconvolution techniques and digital phantoms. Digital phantom data sets simulating the tracer delay were created from CTP data of six healthy individuals, in which time frames of the left cerebral hemisphere were shifted forward and backward by up to ±5 seconds. These phantoms were postprocessed with three common singular value decomposition (SVD) deconvolution algorithms-standard SVD (sSVD), block-circulant SVD (bSVD), and delay-corrected SVD (dSVD)-with an arterial input function (AIF) obtained from the right middle cerebral artery (MCA). The T(max) values of the left hemisphere were compared among different tracer delays and algorithms by a region of interest-based analysis. The T(max) values by sSVD were positively correlated with 'positive shifts' but unchanged with 'negative shifts,' those by bSVD had an excellent positive linear correlation with both positive and negative shifts, and those by dSVD were relatively constant, although slightly increased with the positive shifts. The T(max) is a parameter highly dependent on tracer delays and deconvolution algorithm. PMID:20859294
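    A compact sketch of the block-circulant SVD (bSVD) deconvolution and the Tmax readout discussed above, run on synthetic curves. The toy AIF and residue function, the zero-padding, and the 10% singular-value threshold are illustrative assumptions, not the study's processing chain.

```python
import numpy as np

def bsvd_tmax(aif, tissue, dt, sv_threshold=0.1):
    """Deconvolve tissue = dt * (aif (*) residue) via a circulant SVD system; return residue and Tmax."""
    n = len(aif)
    m = 2 * n                                         # zero-pad to reduce time-aliasing
    a = np.concatenate([aif, np.zeros(n)])
    c = np.concatenate([tissue, np.zeros(n)])
    A = dt * np.array([np.roll(a, k) for k in range(m)]).T   # circulant convolution matrix
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > sv_threshold * s.max(), 1.0 / s, 0.0)
    residue = Vt.T @ (s_inv * (U.T @ c))              # deconvolved residue function
    tmax = dt * np.argmax(residue[:n])                # time-to-maximum of the residue
    return residue[:n], tmax

t = np.arange(0, 60, 1.0)                             # seconds, dt = 1 s
aif = np.exp(-((t - 15) / 4.0) ** 2)                  # toy arterial input function
residue_true = np.exp(-np.maximum(t - 3, 0) / 8.0) * (t >= 3)
tissue = np.convolve(aif, residue_true)[:len(t)]      # dt = 1 s, so no extra scale factor
residue_est, tmax = bsvd_tmax(aif, tissue, dt=1.0)    # tmax should come out near 3 s
```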

  2. Susceptibility of Tmax to tracer delay on perfusion analysis: quantitative evaluation of various deconvolution algorithms using digital phantoms

    PubMed Central

    Kudo, Kohsuke; Sasaki, Makoto; Østergaard, Leif; Christensen, Soren; Uwano, Ikuko; Suzuki, Masako; Ogasawara, Kuniaki; Shirato, Hiroki; Ogawa, Akira

    2011-01-01

    The time-to-maximum of the tissue residue function (Tmax) perfusion index has proven very predictive of infarct growth in large clinical trials, yet its dependency on simple tracer delays remains unknown. Here, we determine the dependency of computed tomography (CT) perfusion (CTP) Tmax estimates on tracer delay using a range of deconvolution techniques and digital phantoms. Digital phantom data sets simulating the tracer delay were created from CTP data of six healthy individuals, in which time frames of the left cerebral hemisphere were shifted forward and backward by up to ±5 seconds. These phantoms were postprocessed with three common singular value decomposition (SVD) deconvolution algorithms—standard SVD (sSVD), block-circulant SVD (bSVD), and delay-corrected SVD (dSVD)—with an arterial input function (AIF) obtained from the right middle cerebral artery (MCA). The Tmax values of the left hemisphere were compared among different tracer delays and algorithms by a region of interest-based analysis. The Tmax values by sSVD were positively correlated with 'positive shifts' but unchanged with 'negative shifts,' those by bSVD had an excellent positive linear correlation with both positive and negative shifts, and those by dSVD were relatively constant, although slightly increased with the positive shifts. The Tmax is a parameter highly dependent on tracer delays and deconvolution algorithm. PMID:20859294

  3. CT-perfusion imaging of the human brain: Advanced deconvolution analysis using circulant singular value decomposition

    Microsoft Academic Search

    H.-J. Wittsack; Afra M. Wohlschläger; E. K. Ritzl; R. Kleiser; M. Cohnen; R. J. Seitz; U. Mödder

    2008-01-01

    According to indicator dilution theory tissue time–concentration curves have to be deconvolved with arterial input curves in order to get valid perfusion results. Our aim was to adapt and validate a deconvolution method originating from magnetic resonance techniques and apply it to the calculation of dynamic contrast enhanced computed tomography perfusion imaging. The application of a block-circulant matrix approach for

  4. Flight Operations Analysis Tool

    NASA Technical Reports Server (NTRS)

    Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca

    2006-01-01

    Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.

  5. A System Analysis Tool

    SciTech Connect

    CAMPBELL,PHILIP L.; ESPINOZA,JUAN

    2000-06-01

    In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.
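    As a toy illustration of the question the tool answers ("if this variable's value changes, what else changes, and along what path?"), the sketch below does forward reachability over a hand-written dependency graph. The entity names and edges are invented; the real tool derives such dependencies from program slicing across multiple domains.

```python
from collections import deque

# edges mean "a change in the key can propagate to each listed entity" (hypothetical names)
depends_on = {
    "sensor_voltage": ["adc_reading"],
    "adc_reading": ["temperature_estimate"],
    "temperature_estimate": ["alarm_flag", "log_record"],
    "alarm_flag": ["operator_display"],
}

def forward_slice(start):
    """Return every entity reachable from `start`, with one propagation path each."""
    paths, queue = {start: [start]}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in depends_on.get(node, []):
            if nxt not in paths:
                paths[nxt] = paths[node] + [nxt]
                queue.append(nxt)
    return paths

print(forward_slice("sensor_voltage"))   # adc_reading, temperature_estimate, alarm_flag, ...
```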

  6. Frequency Response Analysis Tool

    SciTech Connect

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

    2014-12-31

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  7. Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis

    SciTech Connect

    Greenberg, M.; Ebel, D.S. (AMNH)

    2009-03-19

    We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of approximately 15 μm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-Ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 μm/pixel, without the use of oil-based lenses. A full textural analysis on track No.82 is presented here as well as analysis of 6 additional tracks contained within 3 keystones (No.128, No.129 and No.140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

  8. Portfolio Analysis Tool

    NASA Technical Reports Server (NTRS)

    Barth, Tim; Zapata, Edgar; Benjamin, Perakath; Graul, Mike; Jones, Doug

    2005-01-01

    Portfolio Analysis Tool (PAT) is a Web-based, client/server computer program that helps managers of multiple projects funded by different customers to make decisions regarding investments in those projects. PAT facilitates analysis on a macroscopic level, without distraction by parochial concerns or tactical details of individual projects, so that managers' decisions can reflect the broad strategy of their organization. PAT is accessible via almost any Web-browser software. Experts in specific projects can contribute to a broad database that managers can use in analyzing the costs and benefits of all projects, but do not have access for modifying criteria for analyzing projects: access for modifying criteria is limited to managers according to levels of administrative privilege. PAT affords flexibility for modifying criteria for particular "focus areas" so as to enable standardization of criteria among similar projects, thereby making it possible to improve assessments without the need to rewrite computer code or to rehire experts, and thereby further reducing the cost of maintaining and upgrading computer code. Information in the PAT database and results of PAT analyses can be incorporated into a variety of ready-made or customizable tabular or graphical displays.

  9. Independent evaluation of a commercial deconvolution reporting software for gas chromatography mass spectrometry analysis of pesticide residues in fruits and vegetables

    Microsoft Academic Search

    Hans Ragnar Norli; Agnethe Christiansen; Børge Holen

    2010-01-01

    The gas chromatography mass spectrometry (GC–MS) deconvolution reporting software (DRS) from Agilent Technologies has been evaluated for its ability as a screening tool to detect a large number of pesticides in incurred and fortified samples extracted with acetone/dichloromethane/light petroleum (Mini-Luke method). The detection of pesticides is based on fixed retention times using retention time locking (RTL) and full scan mass

  10. Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

    2011-01-01

    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60 N - 60 S) IR Dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of interest and time, single imagery, overlay of two different products, animation, a time-skip capability, and different image size outputs. Users can save an animation as a file (animated GIF) and import it into other presentation software, such as Microsoft PowerPoint. Since the tool can directly access the real data, more features and functionality can be added in the future.

  11. Java Radar Analysis Tool

    NASA Technical Reports Server (NTRS)

    Zaczek, Mariusz P.

    2005-01-01

    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  12. Signal classification Statistical analysis tools

    E-print Network

    Ravelet, Florent

    Signal classification; statistical analysis tools; data processing in fluid mechanics. Amélie Danlos, Florent Ravelet, "Experimental methods for fluid flows: an introduction." Topics include the signal-to-noise ratio and deterministic signals.

  13. Convolution-deconvolution in DIGES

    SciTech Connect

    Philippacopoulos, A.J.; Simos, N. [Brookhaven National Lab., Upton, NY (United States). Dept. of Advanced Technology

    1995-05-01

    Convolution and deconvolution operations are by all means a very important aspect of SSI (soil-structure interaction) analysis since they influence the input to the seismic analysis. This paper documents some of the convolution/deconvolution procedures which have been implemented into the DIGES code. The 1-D propagation of shear and dilatational waves in typical layered configurations involving a stack of layers overlying a rock is treated by DIGES in a similar fashion to that of available codes, e.g. CARES, SHAKE. For certain configurations, however, there is no need to perform such analyses since the corresponding solutions can be obtained in analytic form. Typical cases involve deposits which can be modeled by a uniform halfspace or simple layered halfspaces. For such cases DIGES uses closed-form solutions. These solutions are given for one- as well as two-dimensional deconvolution. The types of waves considered include P, SV and SH waves. The non-vertical incidence is given special attention since deconvolution can be defined differently depending on the problem of interest. For all wave cases considered, corresponding transfer functions are presented in closed form. Transient solutions are obtained in the frequency domain. Finally, a variety of forms are considered for representing the free field motion both in terms of deterministic as well as probabilistic representations. These include (a) acceleration time histories, (b) response spectra, (c) Fourier spectra and (d) cross-spectral densities.
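    For the simple-profile cases mentioned above, closed-form transfer functions exist; the sketch below evaluates the standard textbook transfer function for vertically incident SH waves in a single elastic layer over an elastic halfspace, F(omega) = 1/(cos(kH) + i*alpha*sin(kH)) with impedance ratio alpha = rho1*Vs1/(rho2*Vs2). This is generic site-response theory, not DIGES source code; the layer properties are invented, and deconvolving a surface record amounts to dividing its spectrum by F.

```python
import numpy as np

def sh_layer_transfer(freq_hz, h=30.0, vs1=300.0, rho1=1900.0, vs2=1500.0, rho2=2300.0):
    """Surface-to-outcrop transfer function for one elastic layer over an elastic halfspace."""
    omega = 2 * np.pi * np.asarray(freq_hz)
    alpha = (rho1 * vs1) / (rho2 * vs2)              # impedance ratio, layer over rock
    kh = omega * h / vs1                             # dimensionless wavenumber times thickness
    return 1.0 / (np.cos(kh) + 1j * alpha * np.sin(kh))

freqs = np.linspace(0.1, 20.0, 500)
F = sh_layer_transfer(freqs)
# deconvolution of a recorded surface motion to the rock outcrop: divide its spectrum by F
```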

  14. Marginal Abatement Cost Analysis Tool

    EPA Science Inventory

    The Non-CO2 Marginal Abatement Cost Analysis Tool is an extensive bottom-up engineering-economic spreadsheet model capturing the relevant cost and performance data on sectors emitting non-CO2 GHGs. The tool has 24 regions and 7 sectors and produces marginal abatement cost curves...

  15. Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis

    PubMed Central

    Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M.

    2015-01-01

    The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue, and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprise mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908
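    A hedged sketch of the regression idea behind PSEA rather than the published package: each gene's bulk expression across samples is regressed on reference signals built from cell-type marker genes, so each coefficient tracks that gene's expression in one cell type. Marker references, sample count, and values are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(6)
n_samples = 40
# reference signals: mean expression of marker genes per cell type in each sample (synthetic)
neuron_ref = rng.uniform(0.5, 1.5, n_samples)
astro_ref = rng.uniform(0.5, 1.5, n_samples)
microglia_ref = rng.uniform(0.5, 1.5, n_samples)
X = np.column_stack([np.ones(n_samples), neuron_ref, astro_ref, microglia_ref])

# bulk expression of one gene: mostly neuronal in this toy example
gene_bulk = 2.0 * neuron_ref + 0.3 * astro_ref + rng.normal(0, 0.1, n_samples)

coef, *_ = np.linalg.lstsq(X, gene_bulk, rcond=None)
# coef[1], coef[2], coef[3] approximate the gene's expression in neurons, astrocytes, microglia;
# comparing such coefficients between case and control samples flags cell-specific changes
```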

  16. Nonstandard Tools for Nonsmooth Analysis

    E-print Network

    S. S. Kutateladze

    2012-06-11

    This is an overview of the basic tools of nonsmooth analysis which are grounded on nonstandard models of set theory. By way of illustration we give a criterion for an infinitesimally optimal path of a general discrete dynamic system.

  17. Fourier self-deconvolution of the IR spectra as a tool for investigation of distinct functional groups in porous materials: Brønsted acid sites in zeolites.

    PubMed

    Vazhnova, Tanya; Lukyanov, Dmitry B

    2013-12-01

    For many decades, IR and FT-IR spectroscopy has generated valuable information about different functional groups in zeolites, metal-organic frameworks (MOFs), and other porous materials. However, this technique cannot distinguish between functional groups in different local environments. Our study demonstrates that this limitation could be overcome by using Fourier self-deconvolution of infrared spectra (FSD-IR). We apply this method to study three acidic mordenite zeolites and show (i) that these zeolites contain six distinct Brønsted acid sites (BAS) as opposed to 2-4 different BAS previously considered in literature and (ii) that the relative amounts of these BAS are different in the three zeolites examined. We then analyze possible locations of six BAS in the mordenite structure and explain a number of conflicting results in literature. On this basis, we conclude that the FSD-IR method allows direct visualization and examination of distributions of distinct BAS in zeolites, thus providing a unique research opportunity, which no other method can provide. Given the similarities in the IR analysis of different functional groups in solids, we expect that the FSD-IR method will be also instrumental in the research into other porous materials, such as solid oxides and MOFs. The latter point is illustrated by FSD of the IR spectrum of hydroxyl groups in a sample of γ-alumina. PMID:24219854
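    A minimal numerical sketch of Fourier self-deconvolution on a synthetic pair of overlapped bands: transform the spectrum, cancel the assumed Lorentzian decay exp(-2*pi*gamma*|x|) in the Fourier domain, apodize to limit noise amplification, and transform back. Band positions, the half-width gamma, and the triangular apodization are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

nu = np.linspace(3500, 3700, 2048)                    # wavenumber axis, cm^-1
gamma = 6.0                                           # assumed Lorentzian HWHM, cm^-1

def lorentz(nu0):
    return gamma ** 2 / ((nu - nu0) ** 2 + gamma ** 2)

spectrum = 1.0 * lorentz(3595) + 0.8 * lorentz(3605)  # two overlapped hydroxyl bands

interferogram = np.fft.rfft(spectrum)
x = np.arange(len(interferogram)) / (nu[-1] - nu[0])  # conjugate axis (cm), approximate
narrowing = np.exp(2 * np.pi * gamma * x)             # undo the Lorentzian decay
apodization = np.maximum(1 - x / x[len(x) // 3], 0)   # triangular cutoff against noise blow-up
deconvolved = np.fft.irfft(interferogram * narrowing * apodization, n=len(spectrum))
```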

  18. Constrained spherical deconvolution analysis of the limbic network in human, with emphasis on a direct cerebello-limbic pathway

    PubMed Central

    Arrigo, Alessandro; Mormina, Enricomaria; Anastasi, Giuseppe Pio; Gaeta, Michele; Calamuneri, Alessandro; Quartarone, Angelo; De Salvo, Simona; Bruschetta, Daniele; Rizzo, Giuseppina; Trimarchi, Fabio; Milardi, Demetrio

    2014-01-01

    The limbic system is part of an intricate network which is involved in several functions like memory and emotion. Traditionally, the role of the cerebellum was considered to be mainly associated with motor control; however, evidence is emerging for a role of the cerebellum in learning skills, emotion control, and mnemonic and behavioral processes, involving also connections with the limbic system. In 15 normal subjects we studied limbic connections by probabilistic Constrained Spherical Deconvolution (CSD) tractography. The main result of our work was to prove for the first time in the human brain the existence of a direct cerebello-limbic pathway, which was previously hypothesized but never demonstrated. We also extended our analysis to the other limbic connections including the cingulate fasciculus, inferior longitudinal fasciculus, uncinate fasciculus, anterior thalamic connections and fornix. Although these pathways have already been described in the tractographic literature, we provided reconstruction, quantitative analysis and Fractional Anisotropy (FA) right-left symmetry comparison using probabilistic CSD tractography, which is known to provide a potential improvement compared to previously used Diffusion Tensor Imaging (DTI) techniques. The demonstration of the existence of a cerebello-limbic pathway could constitute an important step in the knowledge of the anatomic substrate of non-motor cerebellar functions. Finally, the CSD statistical data about limbic connections in healthy subjects could be potentially useful in the diagnosis of pathological disorders damaging this system. PMID:25538606

  19. eCRAM computer algorithm for implementation of the charge ratio analysis method to deconvolute electrospray ionization mass spectra

    NASA Astrophysics Data System (ADS)

    Maleknia, Simin D.; Green, David C.

    2010-02-01

    A computer program (eCRAM) has been developed for automated processing of electrospray mass spectra based on the charge ratio analysis method. The eCRAM algorithm deconvolutes electrospray mass spectra solely from the ratio of mass-to-charge (m/z) values of multiply charged ions. The program first determines the ion charge by correlating the ratio of m/z values for any two (i.e., consecutive or non-consecutive) multiply charged ions to the unique ratios of two integers. The mass, and subsequently the identity of the charge carrying species, is further determined from m/z values and charge states of any two ions. For the interpretation of high-resolution electrospray mass spectra, eCRAM correlates isotopic peaks that share the same isotopic compositions. This process is also performed through charge ratio analysis after correcting the multiply charged ions to their lowest common ion charge. The application of eCRAM algorithm has been demonstrated with theoretical mass-to-charge ratios for proteins lysozyme and carbonic anhydrase, as well as experimental data for both low and high-resolution FT-ICR electrospray mass spectra of a range of proteins (ubiquitin, cytochrome c, transthyretin, lysozyme and calmodulin). This also included the simulated data for mixtures by combining experimental data for ubiquitin, cytochrome c and transthyretin.
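    A hedged sketch of the charge-ratio idea summarized above, not the eCRAM source: for two electrospray ions of the same neutral mass M carrying z1 and z2 protons, (m1 - mH)/(m2 - mH) = z2/z1, so matching that ratio to a ratio of small integers gives the charges and then M. The proton mass value, charge limit, and tolerance are assumptions for the example.

```python
from fractions import Fraction

PROTON = 1.00728  # Da, assumed mass of the charge carrier

def charges_and_mass(mz1, mz2, max_charge=50, tol=5e-4):
    """Infer (z1, z2, M) from two m/z values of the same multiply protonated species."""
    ratio = (mz1 - PROTON) / (mz2 - PROTON)
    best = Fraction(ratio).limit_denominator(max_charge)   # ratio ~ z2 / z1
    z2, z1 = best.numerator, best.denominator
    mass1 = z1 * (mz1 - PROTON)
    mass2 = z2 * (mz2 - PROTON)
    if abs(mass1 - mass2) > tol * mass1:
        raise ValueError("m/z pair is not consistent with a single species")
    return z1, z2, 0.5 * (mass1 + mass2)

# ubiquitin-like example: a species of mass ~8564.86 Da observed at charges 10+ and 9+
mz_10 = (8564.86 + 10 * PROTON) / 10
mz_9 = (8564.86 + 9 * PROTON) / 9
print(charges_and_mass(mz_10, mz_9))    # -> (10, 9, ~8564.86)
```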

  20. New spectral deconvolution algorithms for the analysis of polycyclic aromatic hydrocarbons and sulfur heterocycles by comprehensive two-dimensional gas chromatography-quadrupole mass spectrometry.

    PubMed

    Antle, Patrick M; Zeigler, Christian D; Gankin, Yuriy; Robbat, Albert

    2013-11-01

    New mass spectral deconvolution algorithms have been developed for comprehensive two-dimensional gas chromatography/quadrupole mass spectrometry (GC × GC/qMS). This paper reports the first use of spectral deconvolution of full scan quadrupole GC × GC/MS data for the quantitative analysis of polycyclic aromatic hydrocarbons (PAH) and polycyclic aromatic sulfur heterocycles (PASH) in coal tar-contaminated soil. A method employing four ions per isomer and multiple fragmentation patterns per alkylated homologue (MFPPH) is used to quantify target compounds. These results are in good agreement with GC/MS concentrations, and an examination of method precision, accuracy, selectivity, and sensitivity is discussed. MFPPH and SIM/1-ion concentration differences are also examined. PMID:24063305

  1. Bayesian deconvolution and analysis of photoelectron or any other spectra: Fermi-liquid versus marginal Fermi-liquid behavior of the 3d electrons in Ni

    NASA Astrophysics Data System (ADS)

    Gerhardt, U.; Marquardt, S.; Schroeder, N.; Weiss, S.

    1998-09-01

    We present a simple and effective iterative deconvolution of noisy experimental spectra D broadened by the spectrometer function. We show that this "iterative Bayesian deconvolution" is closely related to the more complex "Bayesian analysis," also known as the quantified maximum-entropy method. A model m of the true spectral function is needed in both cases. The Bayesian analysis is the most powerful and precise method to relate measured spectra D to the corresponding theoretical models m via the respective probabilities, but two grave conceptual problems together with two severe technical difficulties prevented widespread application. We remove these four obstacles by (i) demonstrating analytically and also by computer simulations that the most probable deconvolution â obtained as a by-product from the Bayesian analysis gets closer to the true spectral function as the quality of m increases, (ii) finding it equivalent but vastly more efficient to optimize the parameters contained in a given model m by the usual least-squares fit between D and the convolution of m prior to the Bayesian analysis instead of using the Bayesian analysis itself for that purpose, (iii) approximating the convolution by a summation over the energies of the n data points only, with the normalization of the spectrometer function chosen to minimize the errors at both edges of the spectrum, and (iv) avoiding the severe convergence problems frequently encountered in the Bayesian analysis by a simple reformulation of the corresponding system of n nonlinear equations. We also apply our version of the Bayesian analysis to angle-resolved photoelectron spectra taken at normal emission from Ni(111) close to the Fermi energy at about 12 K, using two different physical models: Compared with the marginal Fermi liquid, the Fermi-liquid line shape turns out to be about 10^4 times more probable to conform with the observed structure of the majority and minority spin peaks in the low-photon and small-binding-energy region.
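    A small sketch of step (ii) above: tune the parameters of a model m by a least-squares fit between the measured spectrum D and the convolution of m with the spectrometer function, before any Bayesian step. The Lorentzian-times-Fermi-function model, Gaussian spectrometer function, and synthetic "measurement" are illustrative assumptions, not the authors' Ni(111) analysis.

```python
import numpy as np
from scipy.optimize import least_squares

energy = np.linspace(-0.5, 0.5, 401)                      # binding-energy grid (eV)
spectrometer = np.exp(-0.5 * (energy / 0.02) ** 2)        # assumed Gaussian resolution function
spectrometer /= spectrometer.sum()

def model(params):
    """Toy model m: a Lorentzian quasiparticle peak cut off by a Fermi function."""
    pos, width, amp, kT = params
    lorentzian = amp * width ** 2 / ((energy - pos) ** 2 + width ** 2)
    fermi = 1.0 / (1.0 + np.exp(np.clip(energy / kT, -60.0, 60.0)))
    return lorentzian * fermi

def residuals(params, D):
    # compare the *convolved* model with the measured spectrum D
    return np.convolve(model(params), spectrometer, mode="same") - D

rng = np.random.default_rng(4)
D = np.convolve(model([-0.05, 0.03, 1.0, 0.001]), spectrometer, mode="same")
D += 0.01 * rng.normal(size=energy.size)                  # synthetic "measurement"
fit = least_squares(residuals, x0=[-0.1, 0.05, 0.8, 0.002], args=(D,),
                    bounds=([-0.5, 1e-3, 0.0, 1e-4], [0.5, 0.5, 10.0, 0.1]))
best_model = model(fit.x)            # this optimized m would then feed the Bayesian step
```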

  2. Data Analysis — Algorithms and Tools

    NASA Astrophysics Data System (ADS)

    Spousta, Martin

    2015-05-01

    Modeling of detector response, modeling of physics, and software tools for modeling and analysis are three of the fields discussed during the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014). This short report is a summary of track two, where the current status and progress in these fields were reported and discussed.

  3. WEAT: Web Enabled Analysis Tool

    NSDL National Science Digital Library

    Center for Disease Control

    Behavioral Risk Factor Surveillance System: The BRFSS, the world's largest telephone survey, tracks health risks in the United States. Information from the survey is used to improve the health of the American people. This tool allows users to create crosstabulations and perform logistic analysis on these data.

  4. Susceptibility of Tmax to tracer delay on perfusion analysis: quantitative evaluation of various deconvolution algorithms using digital phantoms

    Microsoft Academic Search

    Kohsuke Kudo; Makoto Sasaki; Leif Østergaard; Soren Christensen; Ikuko Uwano; Masako Suzuki; Kuniaki Ogasawara; Hiroki Shirato; Akira Ogawa

    2011-01-01

    The time-to-maximum of the tissue residue function (Tmax) perfusion index has proven very predictive of infarct growth in large clinical trials, yet its dependency on simple tracer delays remains unknown. Here, we determine the dependency of computed tomography (CT) perfusion (CTP) Tmax estimates on tracer delay using a range of deconvolution techniques and digital phantoms. Digital phantom data sets simulating

  5. Deconvolution analysis to determine relaxation time spectra of internal friction peaks

    SciTech Connect

    Cost, J.R.

    1985-01-01

    A new method for analysis of an internal friction vs. temperature peak to obtain an approximation of the spectrum of relaxation times responsible for the peak is described. This method, referred to as direct spectrum analysis (DSA), is shown to provide an accurate estimate of the distribution of relaxation times. The method is validated for various spectra, and it is shown that: (1) It provides approximations to known input spectra which replicate the position, amplitude, width and shape with good accuracy (typically 10%). (2) It does not yield approximations which have false spectral peaks.

  6. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) Create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft Excel spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) Write a program that functions as a NASTRAN pre-processor to generate an FEM code for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool they already have; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  7. Failure environment analysis tool applications

    NASA Technical Reports Server (NTRS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-01-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  8. Failure environment analysis tool applications

    NASA Astrophysics Data System (ADS)

    Pack, Ginger L.; Wadsworth, David B.

    1993-02-01

    Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within it the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT is also useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluations. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

  9. Digital deconvolution filter derived from linear discriminant analysis and application for multiphoton fluorescence microscopy.

    PubMed

    Sullivan, Shane Z; Schmitt, Paul D; Muir, Ryan D; DeWalt, Emma L; Simpson, Garth J

    2014-04-01

    A digital filter derived from linear discriminant analysis (LDA) is developed for recovering impulse responses in photon counting from a high speed photodetector (rise time of ~1 ns) and applied to remove ringing distortions from impedance mismatch in multiphoton fluorescence microscopy. Training of the digital filter was achieved by defining temporally coincident and noncoincident transients and identifying the projection within filter-space that best separated the two classes. Once trained, data analysis by digital filtering can be performed quickly. Assessment of the reliability of the approach was performed through comparisons of simulated voltage transients, in which the ground truth results were known a priori. The LDA filter was also found to recover deconvolved impulses for single photon counting from highly distorted ringing waveforms from an impedance mismatched photomultiplier tube. The LDA filter was successful in removing these ringing distortions from two-photon excited fluorescence micrographs and through data simulations was found to extend the dynamic range of photon counting by approximately 3 orders of magnitude through minimization of detector paralysis. PMID:24559143
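
    A minimal sketch of how such an LDA-derived digital filter might be trained and applied, assuming simulated transients; the window length, toy impulse shape, and detection threshold below are hypothetical stand-ins for the trained quantities described above:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      n, width = 2000, 64
      t = np.arange(width)
      impulse = np.exp(-t / 4.0) - 0.4 * np.exp(-t / 12.0)       # toy single-photon response with ringing
      coincident = rng.normal(0, 0.05, (n, width)) + impulse      # windows containing a photon
      noncoincident = rng.normal(0, 0.05, (n, width))             # noise-only windows

      X = np.vstack([coincident, noncoincident])
      y = np.concatenate([np.ones(n), np.zeros(n)])

      # The LDA weight vector is the projection that best separates the two classes;
      # applied to a raw trace by convolution it acts as a finite-impulse-response filter.
      lda = LinearDiscriminantAnalysis().fit(X, y)
      fir = lda.coef_.ravel()

      trace = rng.normal(0, 0.05, 5000)
      trace[1000:1000 + width] += impulse                         # one buried photon event
      score = np.convolve(trace, fir[::-1], mode="valid")         # sliding-window projection
      hits = np.flatnonzero(score > 0.5 * score.max())            # crude threshold for counting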

  10. Adaptive multichannel blind deconvolution using state-space models

    Microsoft Academic Search

    A. Cichocki; L. Zhang

    1999-01-01

    Independent component analysis (ICA) and related problems of blind source separation (BSS) and multichannel blind deconvolution (MBD) problems have recently gained much interest due to many applications in biomedical signal processing, wireless communications and geophysics. In this paper both linear and nonlinear state space models for blind and semi-blind deconvolution are proposed. New unsupervised adaptive learning algorithms performing extended linear
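
    Since the record above is truncated, the state-space update rules themselves are not reproduced here; the sketch below instead applies the classical natural-gradient ICA update, a simpler relative of such adaptive blind-separation algorithms, to an instantaneous (non-convolutive) mixture. The learning rate, batch size, and tanh nonlinearity are illustrative choices.

      import numpy as np

      def natural_gradient_ica(X, lr=0.01, n_iter=2000, batch=64, seed=0):
          """Estimate an unmixing matrix W for an instantaneous mixture X (n_sources x n_samples)."""
          n, T = X.shape
          W = np.eye(n)
          rng = np.random.default_rng(seed)
          for _ in range(n_iter):
              cols = rng.integers(0, T, size=batch)        # small random batch of samples
              Y = W @ X[:, cols]
              f = np.tanh(Y)                               # score function for super-Gaussian sources
              W += lr * (np.eye(n) - (f @ Y.T) / batch) @ W   # natural-gradient update
          return W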

  11. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter for the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

  12. An evaluation of deconvolution techniques in x-ray profile broadening analysis and the application of the maximum entropy method to alumina data

    SciTech Connect

    Kalceff, W.; Armstrong, N. [Univ. of Technology, Sydney (Australia); Cline, J.P. [National Inst. of Standards and Technology Gaithersburg, MD (United States)

    1995-12-31

    This paper reviews several procedures for the removal of instrumental contributions from measured x-ray diffraction profiles, including: direct convolution, unconstrained and constrained deconvolution, an iterative technique, and a maximum entropy method (MEM) which we have adapted to x-ray diffraction profile analysis. Deconvolutions using the maximum entropy approach were found to be the most robust with simulated profiles which included Poisson-distributed noise and uncertainties in the instrument profile function (IPF). The MEM procedure is illustrated by application to the analysis for domain size and microstrain carried out on the four calcined {alpha}-alumina candidate materials for Standard Reference Material (SRM) 676 (a quantitative analysis standard for I/I{sub c} determinations), along with the certified material. Williamson-Hall plots of these data were problematic with respect to interpretation of the microstrain, indicating that the line profile standard, SRM 660 (LaB{sub 6}), exhibits a small amount of strain broadening, particularly at high 2{theta} angle. The domain sizes for all but one of the test materials were much smaller than the crystallite (particle) size; indicating the presence of low angle grain boundaries. 19 refs., 3 figs., 1 tab.

  13. Behavior based network traffic analysis tool

    Microsoft Academic Search

    Sindhu Kakuru

    2011-01-01

    Pattern matching systems are mainly based on network models, which are formed from detailed analysis of user statistics and network traffic. These models are used in developing traffic analysis tools. This paper focuses on the development of a behavior analysis tool on any operating system and its use in detecting internal active/passive attacks. Many kinds of tools and firewalls are in

  14. Optimal seismic deconvolution: distributed algorithms

    Microsoft Academic Search

    Konstantinos N. Plataniotis; Sokratis K. Katsikas; Demetrios G. Lainiotis; Anastasios N. Venetsanopoulos

    1998-01-01

    Deconvolution is one of the most important aspects of seismic signal processing. The objective of the deconvolution procedure is to remove the obscuring effect of the wavelet's replicas making up the seismic trace and thereby obtain an estimate of the reflection coefficient sequence. This paper introduces a new deconvolution algorithm. Optimal distributed estimators and smoothers are utilized in the proposed
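
    The record above is truncated before the distributed estimators are described; as a textbook frequency-domain baseline for the same task (not the paper's method), a water-level deconvolution of a trace by a known wavelet can be sketched as follows, with eps an illustrative stabilization fraction:

      import numpy as np

      def water_level_deconvolution(trace, wavelet, eps=0.01):
          """Estimate a reflectivity series from a seismic trace and a known wavelet."""
          n = len(trace)
          T = np.fft.rfft(trace, n)
          W = np.fft.rfft(wavelet, n)
          power = np.abs(W) ** 2
          floor = eps * power.max()                      # "water level" guards against division by ~0
          R = T * np.conj(W) / np.maximum(power, floor)
          return np.fft.irfft(R, n)                      # estimated reflection coefficient sequence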

  15. Modeling error in Approximate Deconvolution Models

    E-print Network

    Adrian Dunca; Roger Lewandowski

    2012-10-09

    We investigate the asymptotic behaviour of the modeling error in the approximate deconvolution model in the 3D periodic case, when the order $N$ of deconvolution goes to $\infty$. We consider successively the generalised Helmholtz filters of order $p$ and the Gaussian filter. For Helmholtz filters, we estimate the rate of convergence to zero thanks to energy budgets, Gronwall's Lemma, and sharp inequalities for the Fourier coefficients of the residual stress. We next show why the same analysis does not allow one to conclude that the modeling error converges to zero in the case of the Gaussian filter, leaving this question open.
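
    For reference, the Helmholtz filter and the order-$N$ van Cittert-type deconvolution operator underlying this family of models are commonly written as (generic notation, possibly differing from the paper's):

      \overline{u} \;=\; G u \;:=\; \bigl(I - \alpha^{2}\Delta\bigr)^{-1} u,
      \qquad
      D_{N} \;:=\; \sum_{n=0}^{N} (I - G)^{n},
      \qquad
      D_{N}\,\overline{u} \;=\; u - (I - G)^{N+1} u \;\longrightarrow\; u \quad (N \to \infty),

    so the rate at which the modeling error vanishes is governed by how fast the residual term $(I-G)^{N+1}u$, and hence the residual stress, decays as $N$ grows.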

  16. Comparison of environmental TLD (thermoluminescent dosimeter) results obtained using glow curve deconvolution and region of interest analysis

    SciTech Connect

    Not Available

    1987-01-01

    We tested a Harshaw Model 4000 TLD Reader in the Sandia Environmental TLD Program. An extra set of LiF TLD-700 chips was prepared for each field location and calibration level. At the end of the first quarter, half of the TLDs were read on the Model 4000 and the other half were read on our standard Harshaw Model 2000. This presentation compares the results of the two systems. The Model 4000 results are reported for two regions of interest and for background subtraction using Harshaw Glow Curve Deconvolution Software.

  17. Extension of deconvolution algorithms for the mapping of moving acoustic sources.

    PubMed

    Fleury, Vincent; Bulté, Jean

    2011-03-01

    Several deconvolution algorithms are commonly used in aeroacoustics to estimate the power level radiated by static sources, for instance, the deconvolution approach for the mapping of acoustic sources (DAMAS), DAMAS2, CLEAN, and the CLEAN based on spatial source coherence algorithm (CLEAN-SC). However, few efficient methodologies are available for moving sources. In this paper, several deconvolution approaches are proposed to estimate the narrow-band spectra of low-Mach-number uncorrelated sources. All of them are based on a beamformer output. Because of the source motion, the beamformer output is inherently related to the source spectra over the whole frequency range, which makes the deconvolution very complex from a computational point of view. Using the conventional Doppler approximation and for limited-time analysis, the problem can be separated into multiple independent problems, each involving a single source frequency, as for static sources. DAMAS, DAMAS2, CLEAN, and CLEAN-SC are then extended to moving sources. These extensions are validated on both synthesized data and real aircraft flyover noise measurements. Performances comparable to those of the corresponding static methodologies are recovered. All these approaches constitute complementary and efficient tools for quantifying the noise level emitted by moving acoustic sources. PMID:21428506
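
    A minimal static-source CLEAN loop (not the moving-source extension proposed in the paper) on a 1-D beamformer output can be sketched as below; the loop gain, iteration count, and stopping threshold are illustrative defaults:

      import numpy as np

      def clean(dirty_map, psf, gain=0.1, n_iter=200, threshold=None):
          """dirty_map: beamformer power over a grid of steering positions.
          psf: array response to a unit point source on the same grid."""
          residual = dirty_map.astype(float).copy()
          clean_map = np.zeros_like(residual)
          centre = int(np.argmax(psf))
          if threshold is None:
              threshold = 0.01 * residual.max()
          for _ in range(n_iter):
              k = int(np.argmax(residual))
              peak = residual[k]
              if peak < threshold:
                  break
              # Subtract a scaled copy of the point spread function centred on the peak
              # (np.roll wraps at the edges, which is adequate for a sketch).
              residual -= gain * peak * np.roll(psf, k - centre) / psf[centre]
              clean_map[k] += gain * peak
          return clean_map, residual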

  18. Flow Analysis Tool White Paper

    NASA Technical Reports Server (NTRS)

    Boscia, Nichole K.

    2012-01-01

    Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

  19. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic, extensible visual programming tools. Most of the extensible packages examined incorporate a data flow paradigm.

  20. System analysis: Developing tools for the future

    SciTech Connect

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  1. Subcellular Microanatomy by 3D Deconvolution Brightfield Microscopy: Method and Analysis Using Human Chromatin in the Interphase Nucleus

    PubMed Central

    Tadrous, Paul Joseph

    2012-01-01

    Anatomy has advanced using 3-dimensional (3D) studies at macroscopic (e.g., dissection, injection moulding of vessels, radiology) and microscopic (e.g., serial section reconstruction with light and electron microscopy) levels. This paper presents the first results in human cells of a new method of subcellular 3D brightfield microscopy. Unlike traditional 3D deconvolution and confocal techniques, this method is suitable for general application to brightfield microscopy. Unlike brightfield serial sectioning, it has subcellular resolution. Results are presented of the 3D structure of chromatin in the interphase nucleus of two human cell types, hepatocyte and plasma cell. I show how the freedom to examine these structures in 3D allows greater morphological discrimination between and within cell types, and the 3D structural basis for the classical “clock-face” motif of the plasma cell nucleus is revealed. Potential for further applications is discussed. PMID:22567315

  2. Difference in tracer delay-induced effect among deconvolution algorithms in CT perfusion analysis: quantitative evaluation with digital phantoms.

    PubMed

    Kudo, Kohsuke; Sasaki, Makoto; Ogasawara, Kuniaki; Terae, Satoshi; Ehara, Shigeru; Shirato, Hiroki

    2009-04-01

    Institutional review board approval and informed consent were obtained. The purpose was to evaluate the differences in tracer delay-induced effects of various deconvolution algorithms for computed tomographic (CT) perfusion imaging by using digital phantoms created from actual source data. Three methods of singular value decomposition (SVD) were evaluated. For standard SVD (sSVD), the delays induced significant errors in cerebral blood flow and mean transit time. In contrast, for block-circulant SVD (bSVD), these values remained virtually unchanged, whereas for delay-corrected SVD (dSVD), mild changes were observed. bSVD was superior to sSVD and dSVD for avoiding the tracer delay-induced effects in CT perfusion imaging. PMID:19190251
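
    A generic sketch of block-circulant (delay-insensitive) SVD deconvolution as commonly used in perfusion analysis is shown below, assuming sampled arterial input (aif) and tissue concentration curves; the truncation fraction lam is an illustrative choice and this is not the authors' exact implementation:

      import numpy as np

      def bsvd_deconvolve(aif, tissue, dt, lam=0.1):
          """Return the scaled residue function CBF*R(t), an estimate of CBF, and Tmax."""
          n = len(aif)
          m = 2 * n                                             # zero-pad to suppress wrap-around
          a = np.concatenate([aif, np.zeros(n)])
          c = np.concatenate([tissue, np.zeros(n)])
          A = dt * np.array([np.roll(a, i) for i in range(m)]).T   # circulant convolution matrix
          U, s, Vt = np.linalg.svd(A)
          s_inv = np.where(s > lam * s.max(), 1.0 / s, 0.0)     # truncate small singular values
          k = Vt.T @ (s_inv * (U.T @ c))                        # deconvolved curve, insensitive to tracer delay
          cbf = k[:n].max()
          tmax = int(np.argmax(k[:n])) * dt
          return k[:n], cbf, tmax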

  3. Shearlet-based deconvolution.

    PubMed

    Patel, Vishal M; Easley, Glenn R; Healy, Dennis M

    2009-12-01

    In this paper, a new type of deconvolution algorithm is proposed that is based on estimating the image from a shearlet decomposition. Shearlets provide a multidirectional and multiscale decomposition that has been mathematically shown to represent distributed discontinuities such as edges better than traditional wavelets. Constructions such as curvelets and contourlets share similar properties, yet their implementations are significantly different from that of shearlets. Taking advantage of unique properties of a new M-channel implementation of the shearlet transform, we develop an algorithm that allows the approximate inversion operator to be controlled on a multiscale and multidirectional basis. A key improvement over closely related approaches such as ForWaRD is the automatic determination of the threshold values for the noise shrinkage for each scale and direction, without explicit knowledge of the noise variance, using generalized cross validation (GCV). Various tests show that this method can perform significantly better than many competitive deconvolution algorithms. PMID:19666337
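
    Automatic threshold selection by generalized cross validation can be sketched, for plain soft thresholding of transform coefficients (per scale and direction), using the standard GCV formula for shrinkage; this is a generic illustration, not the paper's shearlet-specific implementation:

      import numpy as np

      def gcv_soft_threshold(coeffs, thresholds):
          """Pick the threshold minimizing GCV(t) = N * ||c - soft(c, t)||^2 / (#zeroed)^2."""
          c = np.asarray(coeffs, dtype=float).ravel()
          n = c.size
          best_t, best_g = thresholds[0], np.inf
          for t in thresholds:
              shrunk = np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
              n_zero = np.count_nonzero(shrunk == 0.0)
              if n_zero == 0:
                  continue                                   # GCV undefined when nothing is zeroed
              g = n * np.sum((c - shrunk) ** 2) / n_zero ** 2
              if g < best_g:
                  best_t, best_g = t, g
          return best_t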

  4. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to the existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. The backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey-scale intensity is dependent on the chemistry of the particle, it is possible to map chemically similar areas, which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which is currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash that will be represented by that mineral, on a frame-by-frame basis. The slag viscosity model was improved to provide better predictions of slag viscosity and temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.

  5. Multi-mission telecom analysis tool

    NASA Technical Reports Server (NTRS)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  6. REDCAT: a residual dipolar coupling analysis tool

    Microsoft Academic Search

    Homayoun Valafar; James H. Prestegard

    2004-01-01

    Recent advancements in the utilization of residual dipolar couplings (RDCs) as a means of structure validation and elucidation have demonstrated the need for not only a more user-friendly but also a more powerful RDC analysis tool. In this paper, we introduce a software package named REsidual Dipolar Coupling Analysis Tool (REDCAT) designed to address the above issues. REDCAT is

  7. Iterative deconvolution and semiblind deconvolution methods in magnetic archaeological prospecting

    E-print Network

    Bertero, Mario

    In archaeological magnetic prospecting, most targets can be modeled by a single layer of constant burial depth. One of the most difficult problems in potential-field prospecting is to extract information about

  8. NCI Interactive Budget Analysis Tool

    Cancer.gov

    This tool provides users an interactive overview of the National Cancer Institute (NCI) budget and Fact Book data since Fiscal Year 1999. Additional historical NCI budget information can be obtained through the NCI Fact Book Collection.

  9. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Microsoft Academic Search

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-01-01

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet

  10. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves, one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  11. Toole County Secondary Data Analysis

    E-print Network

    Maxwell, Bruce D.

    Selected indicators (Toole County, Montana, nation): infarction (heart attack) prevalence 4.4%, 4.1%, and 6.0%; all-sites cancer 461.9, 455.5, and 543.2. Leading causes of death are heart disease, cancer, and CLRD* in the county and the nation, and cancer, heart disease, and CLRD* in Montana.

  12. GAIA: Graphical Astronomy and Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Draper, Peter W.; Gray, Norman; Berry, David S.; Taylor, Mark

    2014-03-01

    GAIA is an image and data-cube display and analysis tool for astronomy. It provides the usual facilities of image display tools, plus more astronomically useful ones such as aperture and optimal photometry, contouring, source detection, surface photometry, arbitrary region analysis, celestial coordinate readout, calibration and modification, grid overlays, blink comparison, defect patching and the ability to query on-line catalogues and image servers. It can also display slices from data-cubes, extract and visualize spectra as well as perform full 3D rendering. GAIA uses the Starlink software environment (ascl:1110.012) and is derived from the ESO SkyCat tool (ascl:1109.019).

  13. SPIDA — A Novel Data Analysis Tool

    Microsoft Academic Search

    D Nauck; M Spott; B Azvine

    2003-01-01

    In modern businesses, intelligent data analysis (IDA) is an important aspect of turning data into information and then into action. Data analysis has become a practical area and data analysis methods are nowadays used as tools. This approach to data analysis requires IDA platforms that support users and prevent them from making errors or from using methods in the wrong

  14. The deconvolution of afterflow affected data 

    E-print Network

    Schellenbaum, Sue Ellen

    1988-01-01

    Front matter (list of figures): the characteristic influence of wellbore storage; flowcharts illustrating the simulator and deconvolution routines; the Kucuk and Ayestaran and Thompson et al. deconvolution methods; and semi-log analyses of infinite-acting reservoirs for wellbore storage coefficients CsD from 1000 to 10E5 and skin values of 0 and 5.

  15. A network evaluation and analysis tool

    SciTech Connect

    Stoltz, L.A.; Whiteson, R.; Fasel, P.K.; Temple, R.; Dreicer, J.S.

    1993-05-01

    The rapid emergence of large heterogeneous networks, distributed systems, and massively parallel computers has resulted in economies of scale, enhanced productivity, efficient communication, resource sharing, and increased reliability, which are computationally beneficial. In addition to these benefits, networking presents technical challenges and problems with respect to maintaining and ensuring the security, design, compatibility, integrity, functionality, and management of these systems. In this paper we describe a computer security tool, the Network Evaluation and Analysis Tool (NEAT), that we have developed to address these concerns.

  16. A network evaluation and analysis tool

    SciTech Connect

    Stoltz, L.A.; Whiteson, R.; Fasel, P.K.; Temple, R.; Dreicer, J.S.

    1993-01-01

    The rapid emergence of large heterogeneous networks, distributed systems, and massively parallel computers has resulted in economies of scale, enhanced productivity, efficient communication, resource sharing, and increased reliability, which are computationally beneficial. In addition to these benefits, networking presents technical challenges and problems with respect to maintaining and ensuring the security, design, compatibility, integrity, functionality, and management of these systems. In this paper we describe a computer security tool, the Network Evaluation and Analysis Tool (NEAT), that we have developed to address these concerns.

  17. Independent evaluation of a commercial deconvolution reporting software for gas chromatography mass spectrometry analysis of pesticide residues in fruits and vegetables.

    PubMed

    Norli, Hans Ragnar; Christiansen, Agnethe; Holen, Børge

    2010-03-26

    The gas chromatography mass spectrometry (GC-MS) deconvolution reporting software (DRS) from Agilent Technologies has been evaluated for its ability as a screening tool to detect a large number of pesticides in incurred and fortified samples extracted with acetone/dichloromethane/light petroleum (Mini-Luke method). The detection of pesticides is based on fixed retention times using retention time locking (RTL) and full scan mass spectral comparison with a partly custom-built Automated Mass Spectral Deconvolution and Identification System (AMDIS) database. The GC-MS was equipped with a programmable temperature vaporising (PTV) injector system which enables more sample to be injected. In a blind study of 52 real samples a total number of 158 incurred pesticides were found. In addition to the 85 pesticides found by manual interpretation of GC-NPD/ECD chromatograms, the DRS revealed 73 more pesticides (+46%). The DRS system also shows its potential to discover pesticides which are normally not searched for (EPN in long beans from Thailand). A spiking experiment was performed on blank matrices of apple, orange and lettuce with 177 different pesticides at concentration levels of 0.02 and 0.1 mg/kg. The samples were analysed by GC-MS in full scan mode and the AMDIS match factor was used as a mass spectral quality criterion. The threshold level of the AMDIS match factor was set at 20 to eliminate most of the false positives. AMDIS match factors from 20 up to 69 are regarded only as an indication of a positive hit and must be followed by manual interpretation. Pesticides giving AMDIS match factors ≥ 70 are regarded as identified. To simplify and decrease the large amount of data generated at each concentration level, the AMDIS match factors ≥ 20 were averaged (mean AMF) for each pesticide including the commodities and their replicates. Among 177 different pesticides spiked at the 0.02 and 0.1 mg/kg levels, the percentages of mean AMF values ≥ 70 were 23% and 80%, respectively. For 531 individual detections of pesticides (177 pesticides x 3 replicates) giving AMDIS match factors ≥ 20 in apple, orange and lettuce, the detection rates at 0.02 mg/kg were 71%, 63% and 72%, respectively. For the 0.1 mg/kg level the detection rates were 89%, 85% and 89%, respectively. In real samples some manual interpretation must be performed in addition. However, screening by GC-MS/DRS is about 5-10 times faster compared to screening with GC-NPD/ECD because the time used for manual interpretation is much shorter and there is no need for re-injection on GC-MS for the identification of suspect peaks found on GC-NPD/ECD. PMID:20172528

  18. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  19. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.

  20. lmbench: Portable Tools for Performance Analysis

    Microsoft Academic Search

    Larry W. Mcvoy; Carl Staelin

    1996-01-01

    lmbench is a micro-benchmark suite designed to focus attention on the basic building blocks of many common system applications, such as databases, simulations, software development, and networking. In almost all cases, the individual tests are the result of analysis and isolation of a customer's actual performance problem. These tools can be, and currently are, used

  1. Accelerator physics analysis with interactive tools

    SciTech Connect

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  2. ADVANCED COMPOSITES REPAIR ANALYSIS TOOL (ACRAT)

    Microsoft Academic Search

    Thomas E. Mack; James Y. Song

    The Advanced Composites Repair Analysis Tool (ACRAT) has been under development for the USAF Advanced Composites Program Office under an Ogden ALC Design Engineering Program (DEP) Contractual Engineering Task (CET) Order. ACRAT is an integrated prototype software system consisting of commercial-off-the-shelf (COTS) and public domain CAE simulation codes and customized databases. The objective has been to develop Beta versions of

  3. CoastWatch Data Analysis Tool

    E-print Network

    Software design review slides for the CoastWatch Data Analysis Tool (CDAT) by Peter Hollemans, January 2008; topics include animation support for map comparisons, geographic feature support with attributes, and an overview of the CDAT software design.

  4. Paramedir: A Tool for Programmable Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  5. Integrated multidisciplinary analysis tool IMAT users' guide

    NASA Technical Reports Server (NTRS)

    Meissner, Frances T. (editor)

    1988-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

  6. Patent citation analysis: A policy analysis tool

    Microsoft Academic Search

    M. M. S. Karki

    1997-01-01

    Patent citation analysis is a recent development which uses bibliometric techniques to analyse the wealth of patent citation information. This paper describes the various facets of patent citations and patent citation studies, and their important applications. Since the construction of technology indicators is an important use of patent citations, various patent-citation-based technological indicators and their applications are also described.

  7. INVITED PAPER Blind Deconvolution of Dynamical Systems

    E-print Network

    Zhang, Liqing

    Blind separation/deconvolution of source signals observed through a mixing and/or filtering, natural or synthetic, medium has been a subject of long-standing interest; topics include an efficient algorithm for blind separation and blind deconvolution [1] and the Cumulant Fitting Procedure (CFP

  8. Integrated tools for control-system analysis

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as in the plotting section, by inputting data. There are five evaluations: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  9. Interpretation and Deconvolution of Nanodisc Native Mass Spectra

    NASA Astrophysics Data System (ADS)

    Marty, Michael T.; Zhang, Hao; Cui, Weidong; Gross, Michael L.; Sligar, Stephen G.

    2013-12-01

    Nanodiscs are a promising system for studying gas-phase and solution complexes of membrane proteins and lipids. We previously demonstrated that native electrospray ionization allows mass spectral analysis of intact Nanodisc complexes at single lipid resolution. This report details an improved theoretical framework for interpreting and deconvoluting native mass spectra of Nanodisc lipoprotein complexes. In addition to the intrinsic lipid count and charge distributions, Nanodisc mass spectra are significantly shaped by constructive overlap of adjacent charge states at integer multiples of the lipid mass. We describe the mathematical basis for this effect and develop a probability-based algorithm to deconvolute the underlying mass and charge distributions. The probability-based deconvolution algorithm is applied to a series of dimyristoylphosphatidylcholine Nanodisc native mass spectra and used to provide a quantitative picture of the lipid loss in gas-phase fragmentation.

  10. Optimal application of Morrison's iterative noise removal for deconvolution

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1986-01-01

    Morrison's iterative method of noise removal can be applied both for noise removal alone and for noise removal prior to deconvolution. The method is applied to data with noise added at various levels to determine its optimum use. The phase shift method of migration and modeling is evaluated and the results are compared to Stolt's approach. A method is introduced by which the optimum iteration number for deconvolution can be found. Statistical computer simulation is used to describe the optimum use of two convergent iterative techniques for seismic data. The Always-Convergent deconvolution technique was applied to data recorded during the quantitative analysis of materials through nondestructive evaluation (NDE), in which ultrasonic signals were used to detect flaws in substances such as composites.
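
    The record does not reproduce Morrison's update rule; as a representative of the convergent iterative family it discusses, a classical van Cittert iteration (the estimate corrected by the reblurring residual) can be sketched as below, where the relaxation factor and iteration count are illustrative and convergence depends on the usual conditions on the blurring response:

      import numpy as np

      def van_cittert(data, h, n_iter=50, relax=1.0):
          """Iterative deconvolution x <- x + relax * (data - h * x), starting from the data."""
          x = data.astype(float).copy()
          for _ in range(n_iter):
              reblurred = np.convolve(x, h, mode="same")
              x = x + relax * (data - reblurred)
          return x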

  11. PINPAS: A TOOL FOR POWER ANALYSIS OF SMARTCARDS

    E-print Network

    de Vink, Erik

    PINPAS: A TOOL FOR POWER ANALYSIS OF SMARTCARDS J. den Hartog 1 , J. Verschuren 2 , E. de Vink 3 of power analysis and other side-channel attacks on smartcards. The PINPAS tool supports the testing is discussed to illustrate the usage of the tool. Keywords: Smartcard, Power Analysis, Side-Channel Analysis

  12. Statistical deconvolution for superresolution fluorescence microscopy.

    PubMed

    Mukamel, Eran A; Babcock, Hazen; Zhuang, Xiaowei

    2012-05-16

    Superresolution microscopy techniques based on the sequential activation of fluorophores can achieve image resolution of ?10 nm but require a sparse distribution of simultaneously activated fluorophores in the field of view. Image analysis procedures for this approach typically discard data from crowded molecules with overlapping images, wasting valuable image information that is only partly degraded by overlap. A data analysis method that exploits all available fluorescence data, regardless of overlap, could increase the number of molecules processed per frame and thereby accelerate superresolution imaging speed, enabling the study of fast, dynamic biological processes. Here, we present a computational method, referred to as deconvolution-STORM (deconSTORM), which uses iterative image deconvolution in place of single- or multiemitter localization to estimate the sample. DeconSTORM approximates the maximum likelihood sample estimate under a realistic statistical model of fluorescence microscopy movies comprising numerous frames. The model incorporates Poisson-distributed photon-detection noise, the sparse spatial distribution of activated fluorophores, and temporal correlations between consecutive movie frames arising from intermittent fluorophore activation. We first quantitatively validated this approach with simulated fluorescence data and showed that deconSTORM accurately estimates superresolution images even at high densities of activated fluorophores where analysis by single- or multiemitter localization methods fails. We then applied the method to experimental data of cellular structures and demonstrated that deconSTORM enables an approximately fivefold or greater increase in imaging speed by allowing a higher density of activated fluorophores/frame. PMID:22677393
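
    The core ingredient of such estimators is iterative image deconvolution under a Poisson noise model; a plain Richardson-Lucy iteration (without the sparsity and temporal-correlation priors of deconSTORM) might look like the sketch below, which assumes a non-negative point spread function normalized to sum to one:

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(image, psf, n_iter=50):
          """Richardson-Lucy deconvolution of a 2-D image with a known, normalized PSF."""
          est = np.full_like(image, image.mean(), dtype=float)
          psf_flip = psf[::-1, ::-1]
          for _ in range(n_iter):
              blurred = fftconvolve(est, psf, mode="same")
              ratio = image / np.maximum(blurred, 1e-12)      # data / model, guarded against zeros
              est *= fftconvolve(ratio, psf_flip, mode="same")
          return est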

  13. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  14. RSAT 2015: Regulatory Sequence Analysis Tools.

    PubMed

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-07-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632

  15. Tools for Next Generation Sequencing Data Analysis

    PubMed Central

    Bodi, K.

    2011-01-01

    As NGS technology continues to improve, the amount of data generated per run grows exponentially. Unfortunately, the primary bottleneck in NGS studies is still bioinformatics analysis. Not all researchers have access to a bioinformatics core or dedicated bioinformatician. Additionally, much of the software for NGS analyses is written to run in a Unix / Linux environment. Researchers unfamiliar with the Unix command line may be unable to use these tools, or face a steep learning curve in trying to do so. Commercial packages exist, such as the CLC Genomics Workbench, DNANexus, and GenomeQuest. However, these commercial packages often incorporate proprietary algorithms to perform data analysis and may be costly. Galaxy provides a solution to this problem by incorporating popular open-source and community linux command line tools into an easy to use web-based environment. After sequence data has been uploaded and mapped, there are a variety of workflows for NGS analyses that use open-source tools. This includes peak-calling analyses for ChIP-Seq (MACS, GeneTrack indexer, Peak predictor), RNA-Seq (Tophat, Cufflinks), and finding small insertions, deletions, and SNPs using SAMtools. Any researcher can apply a workflow to his NGS data and retrieve results, without having to interact with a command line. Additionally, since Galaxy is cloud-based, expensive computing hardware for performing analyses is not needed. In this presentation we will provide an overview of two popular open source RNA-Seq analysis tools, Tophat and Cufflinks, and demonstrate how they can be used in Galaxy.

  16. 9. Analysis a. Analysis tools for dam removal

    E-print Network

    Tullos, Desiree

    Hydrodynamic, sediment transport, and physical analyses address what are frequently the main concerns associated with a dam removal due to the possible effects on infrastructure. The options for handling reservoir sediment when removing a dam are river erosion, mechanical removal, and stabilization (ASCE 1997

  17. Spatially varying regularization of deconvolution in 3D microscopy.

    PubMed

    Seo, J; Hwang, S; Lee, J-M; Park, H

    2014-08-01

    Confocal microscopy has become an essential tool to explore biospecimens in 3D. Confocal microscopy images are still degraded by out-of-focus blur and Poisson noise. Many deconvolution methods, including the Richardson-Lucy (RL) method, the Tikhonov method and the split-gradient (SG) method, have been well received. The RL deconvolution method results in enhanced image quality, especially for Poisson noise. The Tikhonov deconvolution method improves on the RL method by imposing a prior model of spatial regularization, which encourages adjacent voxels to appear similar. The SG method also contains spatial regularization and is capable of incorporating many edge-preserving priors, resulting in improved image quality. The strength of spatial regularization is fixed regardless of spatial location for the Tikhonov and SG methods. The Tikhonov and SG deconvolution methods are improved upon in this study by allowing the strength of spatial regularization to differ for different spatial locations in a given image. The novel method shows improved image quality. The method was tested on phantom data for which the ground truth and the point spread function are known. A Kullback-Leibler (KL) divergence value of 0.097 is obtained when applying spatially variable regularization to the SG method, whereas a KL value of 0.409 is obtained with the Tikhonov method. In tests on real data, for which the ground truth is unknown, the reconstructed data show improved noise characteristics while maintaining important image features such as edges. PMID:24917510
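
    One simple way to realize spatially varying regularization (a generic sketch under assumed choices, not the authors' split-gradient formulation) is gradient descent on a least-squares data term plus a smoothness penalty whose weight map is reduced near strong edges; the sigma, step size, and maximum weight below are illustrative:

      import numpy as np
      from scipy.signal import fftconvolve
      from scipy.ndimage import laplace, gaussian_gradient_magnitude

      def spatially_weighted_deconvolution(image, psf, n_iter=100, step=0.5, lam_max=0.05):
          """Minimize ||h*x - y||^2 + sum_r lam(r) |grad x(r)|^2 with a per-pixel weight lam(r)."""
          y = image.astype(float)
          edges = gaussian_gradient_magnitude(y, sigma=2.0)
          lam = lam_max * (1.0 - edges / (edges.max() + 1e-12))   # weaker smoothing near edges
          psf_flip = psf[::-1, ::-1]
          x = y.copy()
          for _ in range(n_iter):
              resid = fftconvolve(x, psf, mode="same") - y
              data_grad = fftconvolve(resid, psf_flip, mode="same")
              smooth_grad = -lam * laplace(x)        # approximate gradient of the weighted penalty
              x = np.clip(x - step * (data_grad + smooth_grad), 0.0, None)   # keep non-negative
          return x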

  18. Bayesian deconvolution method applied to experimental bidirectional transmittance distribution functions

    NASA Astrophysics Data System (ADS)

    Audenaert, Jan; Leloup, Frédéric B.; Durinck, Guy; Deconinck, Geert; Hanselaer, Peter

    2013-03-01

    Optical simulations are a common tool in the development of luminaires for lighting applications. The reliability of the virtual prototype is strongly dependent on the accuracy of the input data such as the emission characteristics of the light source and the scattering properties of the optical components (reflectors, filters and diffusers). These scattering properties are characterized by the bidirectional scatter distribution function (BSDF). Experimental determination of the BSDF of the materials is however very sensitive to the characteristics of the measuring instrument, i.e. the dimensions of the illumination spot, the detector aperture, etc. These instrumental characteristics are reflected in the instrument function. In order to eliminate the influence of the instrument function the use of a Bayesian deconvolution technique is proposed. A suitable stopping rule for the iterative deconvolution algorithm is presented. The deconvolution method is validated using Monte Carlo ray tracing software by simulating a BSDF measurement instrument and a virtual sample with a known bidirectional transmittance distribution function (BTDF). The Bayesian deconvolution technique is applied to experimental BTDF data of holographic diffusers, which exhibit a symmetrical angular broadening under normal incident irradiation. In addition, the effect of applying deconvolved experimental BTDF data on simulations of luminance maps is illustrated.

  19. Distribution System Analysis Tools for Studying High Penetration of PV

    E-print Network

    Report on a project titled "Distribution System Analysis Tools for Studying High Penetration of PV with Grid Support Features" (Electric Energy System).

  20. The CoastWatch Data Analysis Tool Peter Hollemans

    E-print Network

    Presentation on the CoastWatch Data Analysis Tool by Peter Hollemans, SES Inc. contractor for NOAA/NESDIS, October 2003.

  1. Deconvolution procedure of the UV-vis spectra. A powerful tool for the estimation of the binding of a model drug to specific solubilisation loci of bio-compatible aqueous surfactant-forming micelle

    NASA Astrophysics Data System (ADS)

    Calabrese, Ilaria; Merli, Marcello; Turco Liveri, Maria Liria

    2015-05-01

    The evolution of the UV-vis spectra of Nile Red loaded into Tween 20 micelles with pH and [Tween 20] has been analysed in a non-conventional manner by exploiting the deconvolution method. The number of buried sub-bands has been found to depend on both pH and bio-surfactant concentration, and their positions have been associated with Nile Red confined in aqueous solution and in the three micellar solubilisation sites. For the first time, by using an extended classical two-pseudo-phase model, the robust treatment of the spectrophotometric data allows the estimation of the Nile Red binding constants to the available loci. Hosting capability towards Nile Red is enhanced as the pH increases. Comparison between the binding constant values evaluated classically and those estimated by the deconvolution protocol revealed that the overall binding values match the mean values of the local binding sites very well. This result suggests that the deconvolution procedure provides more precise and reliable values, which are more representative of drug confinement.
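
    The deconvolution of a measured spectrum into sub-bands can be sketched as a nonlinear least-squares fit of a sum of Gaussian bands; the band shapes, centres, and synthetic spectrum below are hypothetical illustrations, not the paper's data or band model:

      import numpy as np
      from scipy.optimize import curve_fit

      def gaussian_bands(wl, *params):
          """Sum of Gaussian sub-bands; params = (amplitude, centre, width) per band."""
          y = np.zeros_like(wl, dtype=float)
          for amp, centre, width in zip(params[0::3], params[1::3], params[2::3]):
              y += amp * np.exp(-0.5 * ((wl - centre) / width) ** 2)
          return y

      wl = np.linspace(450, 700, 500)
      rng = np.random.default_rng(1)
      spectrum = gaussian_bands(wl, 0.30, 520, 18, 0.55, 555, 22, 0.40, 590, 20, 0.20, 630, 25)
      spectrum += rng.normal(0, 0.004, wl.size)                       # synthetic absorbance + noise

      p0 = [0.3, 515, 15, 0.5, 550, 20, 0.4, 585, 20, 0.2, 625, 20]   # initial guesses, one triple per band
      popt, _ = curve_fit(gaussian_bands, wl, spectrum, p0=p0)
      areas = np.abs(popt[0::3]) * np.abs(popt[2::3]) * np.sqrt(2 * np.pi)   # band areas for relative populations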

  2. Microfracturing and new tools improve formation analysis

    SciTech Connect

    McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))

    1992-12-07

    This paper reports on microfracturing with nitrogen, an experimental extensometer, stress profile determination from wire line logs, and temperature logging in air-filled holes, new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are usually created at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open-hole parameters for designing the main fracture treatment.

  3. SEAT: A strategic engagement analysis tool

    SciTech Connect

    Dreicer, J.; Michelsen, C.; Morgeson, D.

    1988-01-01

    The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

  4. Nonstandard Tools for Nonsmooth Analysis S. S. Kutateladze

    E-print Network

    Kutateladze, Semen Samsonovich

    Talk by S. S. Kutateladze (Sobolev Institute, Novosibirsk), June 18, 2012, on the tools that nonstandard models provide for nonsmooth analysis.

  5. Initiating a Benchmark for UML and OCL Analysis Tools

    E-print Network

    Paris-Sud XI, Université de

    UML and OCL models are handled by a variety of analysis tools having different scopes, aims and technological cornerstones. The paper sketches how a set of benchmark questions is handled by two OCL tools, USE and EMFtoCSP.

  6. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled ''Inclusion Analysis to Predict Casting Behavior'' was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. 
Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of ''technical selling'' through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with a extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant by plant direct sales program.

  7. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton (Albuquerque, NM); Phillips, Cynthia A. (Albuquerque, NM)

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
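    The following toy sketch (an illustration only, not the patented method) captures the basic idea: attack states are nodes, edge weights stand in for attacker effort, and a lowest-cost path is found with Dijkstra's algorithm. The states and weights are invented for the example.

        import heapq

        # hypothetical attack states and effort-weighted transitions
        attack_graph = {
            "outside": [("dmz_access", 2.0), ("phishing_foothold", 5.0)],
            "dmz_access": [("web_server_root", 3.0)],
            "phishing_foothold": [("workstation_user", 1.0)],
            "web_server_root": [("database_admin", 4.0)],
            "workstation_user": [("database_admin", 6.0)],
            "database_admin": [],
        }

        def cheapest_attack_path(graph, start, goal):
            """Dijkstra search for the lowest-effort path from start to goal."""
            queue = [(0.0, start, [start])]
            visited = set()
            while queue:
                cost, node, path = heapq.heappop(queue)
                if node == goal:
                    return cost, path
                if node in visited:
                    continue
                visited.add(node)
                for nxt, weight in graph[node]:
                    if nxt not in visited:
                        heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
            return float("inf"), []

        print(cheapest_attack_path(attack_graph, "outside", "database_admin"))
        # -> (9.0, ['outside', 'dmz_access', 'web_server_root', 'database_admin'])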

  8. PyRAT - python radiography analysis tool (u)

    SciTech Connect

    Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory; Armstrong, Jerawan C [Los Alamos National Laboratory

    2011-01-14

    PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on LINUX and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed-variable optimization tool to perform the optimization.

  9. An alternating minimization method for blind deconvolution from Poisson data

    NASA Astrophysics Data System (ADS)

    Prato, Marco; La Camera, Andrea; Bonettini, Silvia

    2014-10-01

    Blind deconvolution is a particularly challenging inverse problem since information on both the desired target and the acquisition system has to be inferred from the measured data. When the collected data are affected by Poisson noise, this problem is typically addressed by the minimization of the Kullback-Leibler divergence, in which the unknowns are sought in particular feasible sets depending on the a priori information provided by the specific application. If these sets are separated, then the resulting constrained minimization problem can be addressed with an inexact alternating strategy. In this paper we apply this optimization tool to the problem of reconstructing astronomical images from adaptive optics systems, and we show that the proposed approach succeeds in providing very good results in the blind deconvolution of nondense stellar clusters.
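    A minimal sketch of such an inexact alternating scheme is shown below (this is a generic illustration, not the authors' code): Richardson-Lucy-type multiplicative updates, appropriate for Poisson data, are applied in turn to the object and to the point spread function, with the PSF renormalised after each pass. Grid size and iteration counts are arbitrary assumptions.

        import numpy as np
        from numpy.fft import fft2, ifft2

        def conv2(a, b):
            # circular 2-D convolution via the FFT
            return np.real(ifft2(fft2(a) * fft2(b)))

        def rl_step(current, fixed, data):
            # one Richardson-Lucy multiplicative update of `current`, with `fixed` held constant
            blurred = np.maximum(conv2(current, fixed), 1e-12)
            adjoint_kernel = np.roll(fixed[::-1, ::-1], (1, 1), axis=(0, 1))
            return current * conv2(data / blurred, adjoint_kernel)

        def blind_deconvolve(data, n_outer=20, n_inner=5):
            obj = np.full_like(data, data.mean())
            psf = np.full_like(data, 1.0 / data.size)        # flat initial PSF
            for _ in range(n_outer):
                for _ in range(n_inner):                     # update the object, PSF fixed
                    obj = rl_step(obj, psf, data)
                for _ in range(n_inner):                     # update the PSF, object fixed
                    psf = rl_step(psf, obj, data)
                    psf = psf / psf.sum()                    # keep the PSF normalised
            return obj, psf

        # usage on synthetic data (Poisson counts of a blurred object) would look like:
        # obj_est, psf_est = blind_deconvolve(counts.astype(float))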

  10. Tools for Lighting Design and Analysis

    Microsoft Academic Search

    Gregory J. Ward

    Herein we describe some of the more useful tools and methods for applying computer technology to lighting design problems. We discuss input tools for 3D geometry, materials and photometry, simulation and rendering tools and methods, and output devices and formats. Numerous examples are given. DISCLAIMER A number of specific products and manufacturers will be mentioned in the text of these

  11. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing, as the number of rays increase the associated uncertainty decreases, but the computational expense increases. Thus, a cost benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is what is the number of thicknesses that is needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  12. ISHM Decision Analysis Tool: Operations Concept

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems,however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

  13. DERMAL ABSORPTION OF PESTICIDES CALCULATED BY DECONVOLUTION

    EPA Science Inventory

    Using published human data on skin-to-urine and blood-to-urine transfer of 12 pesticides and herbicides, the skin-to-blood transfer rates for each compound were estimated by two numerical deconvolution techniques. Regular constrained deconvolution produced an estimated upper limi...

  14. Better Logging to Improve Interactive Data Analysis Tools S. Alspaugh

    E-print Network

    Hearst, Marti

    Interactive data analysis applications have become critical tools for making sense of our world. This paper concerns data logged from interactive data analysis systems; such data is invaluable for improving our understanding of how these tools are used.

  15. Scalable analysis tools for sensitivity analysis and UQ (3160) results.

    SciTech Connect

    Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

    2009-09-01

    The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

  16. Knowledge base navigator facilitating regional analysis inter-tool communication.

    SciTech Connect

    Hampton, Jeffery Wade; Chael, Eric Paul; Hart, Darren M.; Merchant, Bion John; Chown, Matthew N.

    2004-08-01

    To make use of some portions of the National Nuclear Security Administration (NNSA) Knowledge Base (KB) for which no current operational monitoring applications were available, Sandia National Laboratories have developed a set of prototype regional analysis tools (MatSeis, EventID Tool, CodaMag Tool, PhaseMatch Tool, Dendro Tool, Infra Tool, etc.), and we continue to maintain and improve these. Individually, these tools have proven effective in addressing specific monitoring tasks, but collectively their number and variety tend to overwhelm KB users, so we developed another application - the KB Navigator - to launch the tools and facilitate their use for real monitoring tasks. The KB Navigator is a flexible, extensible java application that includes a browser for KB data content, as well as support to launch any of the regional analysis tools. In this paper, we will discuss the latest versions of KB Navigator and the regional analysis tools, with special emphasis on the new overarching inter-tool communication methodology that we have developed to make the KB Navigator and the tools function together seamlessly. We use a peer-to-peer communication model, which allows any tool to communicate with any other. The messages themselves are passed as serialized XML, and the conversion from Java to XML (and vice versa) is done using Java Architecture for XML Binding (JAXB).

  17. A Requirements Analysis for Videogame Design Support Tools

    E-print Network

    California at Santa Cruz, University of

    Unlike fields such as architecture and mechanical design, which have CAD tools to support designers in reasoning about and visualizing designs, game designers have no tools for reasoning about and visualizing game systems.

  18. Single-wafer cluster tool performance: an analysis of throughput

    Microsoft Academic Search

    T. L. Perkinson; P. K. McLarty; R. S. Gyurcsik

    1994-01-01

    Cluster tools have gained greater acceptance over the past several years, although concerns still exist over the throughput these tools can achieve. This paper presents an analysis of the relationship between process times, transport times, and maximum throughput in an individual cluster tool. Theoretical models which quantify the time required to process both an individual wafer and a lot in a

  19. Review of Performance Analysis Tools for MPI Parallel Programs

    Microsoft Academic Search

    Shirley Moore; David Cronk; Kevin London; Jack Dongarra

    In order to produce MPI applications that perform well on today’s parallel architectures, programmers need effective tools for collecting and analyzing performance data. A variety of such tools, both commercial and research, are becoming available. This paper reviews and evaluates the available cross-platform MPI performance analysis tools.

  20. Knowledge mapping: A multipurpose task analysis tool

    Microsoft Academic Search

    Timm J. Esque

    1988-01-01

    A tool was developed to increase the objectivity and accuracy of task difficulty ratings for job design. The tool, knowledge mapping, involves identifying specific types of prerequisite knowledge for a given task and then assessing the difficulty of each type. The tool was applied in a semiconductor manufacturing environment and yielded valuable information not only for job design, but also

  1. Receiver Functions from Autoregressive Deconvolution

    NASA Astrophysics Data System (ADS)

    Wu, Qingju; Li, Yonghua; Zhang, Ruiqing; Zeng, Rongsheng

    2007-12-01

    Receiver functions can be estimated by minimizing the square errors of a Wiener filter in the time domain or by spectral division in the frequency domain. To avoid the direct calculation of auto-correlation and cross-correlation coefficients in the Toeplitz equation, or of the auto-spectrum and cross-spectrum in the spectral division equation, as well as the empirical choice of a damping parameter, autoregressive deconvolution is presented to isolate the receiver function from three-component teleseismic P waveforms. The vertical component of the teleseismic P waveform is modeled by an autoregressive model, which can be predicted forward and backward, respectively. The optimum length of the autoregressive model is determined by the Akaike criterion. By minimizing the square errors of the forward and backward prediction filters, the autoregressive filter coefficients can be solved recursively, and the receiver function is estimated by a similar procedure. Both synthetic and real data tests show that autoregressive deconvolution is an effective method for isolating receiver functions from teleseismic P waveforms in the time domain.
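    The record highlights the use of the Akaike criterion to pick the autoregressive model length. A small, self-contained Python sketch of that step (least-squares AR fitting plus AIC order selection, on a synthetic series rather than teleseismic data) is given below; it is an illustration of the idea, not the authors' recursive forward/backward algorithm.

        import numpy as np

        def fit_ar(x, order):
            """Least-squares AR fit: predict x[n] from the previous `order` samples."""
            rows = [x[order - k - 1: len(x) - k - 1] for k in range(order)]
            A = np.column_stack(rows)
            b = x[order:]
            coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
            residual = b - A @ coeffs
            return coeffs, np.mean(residual ** 2)

        def best_ar_order(x, max_order=30):
            """Pick the AR model length that minimises the Akaike information criterion."""
            n = len(x)
            best_order, best_aic = 1, np.inf
            for p in range(1, max_order + 1):
                _, sigma2 = fit_ar(x, p)
                aic = n * np.log(sigma2) + 2 * p
                if aic < best_aic:
                    best_order, best_aic = p, aic
            return best_order

        # synthetic check: an AR(2) process should be assigned a low model order
        rng = np.random.default_rng(0)
        x = np.zeros(2000)
        for n in range(2, len(x)):
            x[n] = 0.75 * x[n - 1] - 0.5 * x[n - 2] + rng.normal()
        print(best_ar_order(x))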

  2. Bayesian approach based blind image deconvolution with fuzzy median filter

    NASA Astrophysics Data System (ADS)

    Mohan, S. Chandra; Rajan, K.; Srinivasan, R.

    2011-10-01

    The inverse problem associated with the reconstruction of Poisson-blurred images has attracted attention in recent years. In this paper, we propose an alternative unified approach to the blind image deconvolution problem using a fuzzy median filter as a Gibbs prior to model the nature of inter-pixel interaction for better edge-preserving reconstruction. The performance of the algorithm at various SNR levels has been studied quantitatively using PSNR, RMSE and the universal quality index (UQI). Comparative analysis with existing methods has also been carried out.

  3. TAFFYS: An Integrated Tool for Comprehensive Analysis of Genomic Aberrations in Tumor Samples

    PubMed Central

    Feng, Huanqing; Wang, Minghui

    2015-01-01

    Background: Tumor single nucleotide polymorphism (SNP) arrays are a common platform for investigating cancer genomic aberrations and the functionally important altered genes. Original SNP array signals are usually corrupted by noise and need to be deconvoluted into an absolute copy number profile by analytical methods. Unfortunately, in contrast with the popularity of tumor Affymetrix SNP arrays, the methods specifically designed for this platform are still limited. The complicated characteristics of the noise in the signals are one of the difficulties in dissecting tumor Affymetrix SNP array data, as they inevitably blur the distinction between aberrations and create an obstacle for copy number aberration (CNA) identification. Results: We propose a tool named TAFFYS for comprehensive analysis of tumor Affymetrix SNP array data. TAFFYS introduces a wavelet-based de-noising approach and a copy number-specific signal variance model for suppressing and modelling the noise in the signals. A hidden Markov model is then employed for copy number inference. Finally, using the absolute copy number profile, the statistical significance of each aberration region is calculated in terms of different aberration types, including amplification, deletion and loss of heterozygosity (LOH). The results show that the copy number-specific variance model and the wavelet de-noising algorithm fit well with Affymetrix SNP array signals, leading to more accurate estimation for diluted tumor samples (even with only 30% cancer cells) than other existing methods. Examinations also demonstrate good compatibility and extensibility across different Affymetrix SNP array platforms. Application to 35 breast tumor samples shows that TAFFYS can automatically dissect the tumor samples and reveal statistically significant aberration regions where cancer-related genes are located. Conclusions: TAFFYS provides an efficient and convenient tool for identifying copy number alteration and allelic imbalance and for assessing recurrent aberrations in tumor Affymetrix SNP array data. PMID:26111017

  4. Composable Tools For Network Discovery and Security Analysis

    E-print Network

    Vigna, Giovanni

    Existing tools for network discovery and security analysis are difficult to extend. This paper presents NetMap, a security tool for network modeling and discovery that composes individual tools to obtain the requested results. Keywords: Network Security, Network Modeling and Analysis.

  5. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  6. Space mission scenario development and performance analysis tool

    Microsoft Academic Search

    M. Kordon; J. Baker; J. Gilbert; D. Hanks

    2005-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multidisciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during

  7. ProMAT: protein microarray analysis tool

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

    2006-04-04

    Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.

  8. Homomorphic deconvolution of marine magnetic anomalies 

    E-print Network

    Jones, Leo David

    1976-01-01

    Homomorphic Deconvolution of Marine Magnetic Anomalies. A thesis by Leo David Jones, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science. Major subject: Geophysics.

  9. Time domain deconvolution of transient radar data

    NASA Astrophysics Data System (ADS)

    Rothwell, Edward J.; Sun, Weimin

    1990-04-01

    A simple technique is presented for obtaining the impulse response of a conducting radar target by deconvolving the measured response of the experimental system from the measured response of the target. The scheme uses singular value decomposition. It is shown that the ill-conditioned nature of the deconvolution process can be linked to the use of inappropriate frequency information in the representation of the unknown impulse response. Examples using measured transient radar data establish the usefulness of the scheme. The noise sensitivity of the deconvolution process can also be traced to the improper representation of the impulse response. Accurate deconvolution is demonstrated even in the presence of large amounts of random noise.
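    A hedged sketch of deconvolution by truncated singular value decomposition, in the spirit of the technique described above, is given below. The convolution-matrix construction and the relative truncation threshold are generic choices; the signals are synthetic placeholders rather than measured radar responses.

        import numpy as np

        def convolution_matrix(h, n_cols):
            """Toeplitz matrix C such that C @ x equals np.convolve(h, x)[:len(h)]."""
            m = len(h)
            C = np.zeros((m, n_cols))
            for j in range(n_cols):
                C[j:, j] = h[: m - j]
            return C

        def svd_deconvolve(system_response, measured_target, rel_cutoff=1e-2):
            """Estimate the impulse response, discarding ill-conditioned singular directions."""
            C = convolution_matrix(system_response, len(measured_target))
            U, s, Vt = np.linalg.svd(C, full_matrices=False)
            keep = s > rel_cutoff * s[0]
            s_inv = np.where(keep, 1.0 / s, 0.0)
            return Vt.T @ (s_inv * (U.T @ measured_target))

        # synthetic example: recover a sparse impulse response in the presence of noise
        t = np.arange(256)
        system = np.exp(-t / 20.0) * np.sin(t / 3.0)
        impulse = np.zeros(256)
        impulse[[10, 40]] = [1.0, -0.5]
        target = np.convolve(system, impulse)[:256] + 0.001 * np.random.randn(256)
        estimate = svd_deconvolve(system, target)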

  10. A comparison of commonly used re-entry analysis tools

    NASA Astrophysics Data System (ADS)

    Lips, Tobias; Fritsche, Bent

    2005-07-01

    Most spacecraft or rocket bodies re-entering the Earth's atmosphere, controlled or uncontrolled, do not demise completely during re-entry. Fragments of these re-entry objects survive and reach the ground where they pose a risk to people. Re-entry tools have been developed all over the world in order to calculate the destruction processes and to assess the resulting ground risk. This paper describes the NASA re-entry analysis tools DAS (Debris Assessment Software) and ORSAT (Object Re-entry Survival Analysis Tool), and the ESA tools SCARAB (Spacecraft Atmospheric Re-entry and Aero-thermal Breakup) and SESAM (Spacecraft Entry Survival Analysis Module). Results calculated with these tools are compared in order to identify the major differences. Final recommendations are given in order to improve these tools and to minimize the identified differences.

  11. Least squares deconvolution of the stellar intensity and polarization spectra

    E-print Network

    Kochukhov, O; Piskunov, N

    2010-01-01

    Least squares deconvolution (LSD) is a powerful method of extracting high-precision average line profiles from the stellar intensity and polarization spectra. Despite its common usage, the LSD method is poorly documented and has never been tested using realistic synthetic spectra. In this study we revisit the key assumptions of the LSD technique, clarify its numerical implementation, discuss possible improvements and give recommendations how to make LSD results understandable and reproducible. We also address the problem of interpretation of the moments and shapes of the LSD profiles in terms of physical parameters. We have developed an improved, multiprofile version of LSD and have extended the deconvolution procedure to linear polarization analysis taking into account anomalous Zeeman splitting of spectral lines. This code is applied to the theoretical Stokes parameter spectra. We test various methods of interpreting the mean profiles, investigating how coarse approximations of the multiline technique trans...
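    The core of LSD is a weighted linear least-squares solve for a mean profile that, replicated at every line position with the line's weight, best reproduces the observed spectrum. The sketch below illustrates that step only; line positions, weights, noise level and profile length are invented, and none of the refinements discussed in the paper (multiprofile LSD, polarization, anomalous Zeeman splitting) are included.

        import numpy as np

        def lsd_profile(residual_spectrum, noise, line_pixels, line_weights, n_profile):
            """Solve residual_spectrum ~ M @ Z for the common profile Z (weighted least squares)."""
            n_spec = len(residual_spectrum)
            M = np.zeros((n_spec, n_profile))
            for pix, w in zip(line_pixels, line_weights):
                for k in range(n_profile):
                    idx = pix + k - n_profile // 2          # place the profile around each line
                    if 0 <= idx < n_spec:
                        M[idx, k] += w
            weights = 1.0 / noise ** 2                      # inverse-variance weighting
            lhs = (M * weights[:, None]).T @ M
            rhs = M.T @ (weights * residual_spectrum)
            return np.linalg.solve(lhs, rhs)

        # synthetic residual spectrum (1 - normalised flux) built from a known profile
        rng = np.random.default_rng(1)
        n_spec, n_prof = 400, 21
        pix = rng.integers(30, 370, size=25)
        wts = rng.uniform(0.2, 1.0, size=25)
        true_profile = np.exp(-0.5 * ((np.arange(n_prof) - n_prof // 2) / 3.0) ** 2)
        clean = np.zeros(n_spec)
        for p, w in zip(pix, wts):
            clean[p - n_prof // 2: p + n_prof // 2 + 1] += w * true_profile
        noise_level = np.full(n_spec, 0.01)
        observed = clean + rng.normal(0.0, 0.01, n_spec)
        mean_profile = lsd_profile(observed, noise_level, pix, wts, n_prof)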

  12. Tools for Knowledge Analysis, Synthesis, and Sharing

    NASA Astrophysics Data System (ADS)

    Medland, Michael B.

    2007-04-01

    Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

  13. Deconvolution of dynamic mechanical networks

    E-print Network

    Michael Hinczewski; Yann von Hansen; Roland R. Netz

    2011-07-13

    Time-resolved single-molecule biophysical experiments yield data that contain a wealth of dynamic information, in addition to the equilibrium distributions derived from histograms of the time series. In typical force spectroscopic setups the molecule is connected via linkers to a read-out device, forming a mechanically coupled dynamic network. Deconvolution of equilibrium distributions, filtering out the influence of the linkers, is a straightforward and common practice. We have developed an analogous dynamic deconvolution theory for the more challenging task of extracting kinetic properties of individual components in networks of arbitrary complexity and topology. Our method determines the intrinsic linear response functions of a given molecule in the network, describing the power spectrum of conformational fluctuations. The practicality of our approach is demonstrated for the particular case of a protein linked via DNA handles to two optically trapped beads at constant stretching force, which we mimic through Brownian dynamics simulations. Each well in the protein free energy landscape (corresponding to folded, unfolded, or possibly intermediate states) will have its own characteristic equilibrium fluctuations. The associated linear response function is rich in physical content, since it depends both on the shape of the well and its diffusivity---a measure of the internal friction arising from such processes like the transient breaking and reformation of bonds in the protein structure. Starting from the autocorrelation functions of the equilibrium bead fluctuations measured in this force clamp setup, we show how an experimentalist can accurately extract the state-dependent protein diffusivity using a straightforward two-step procedure.

  14. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.

    PubMed

    Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G

    2012-05-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with its Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows complex stimulated luminescence curves to be resolved into their components and the associated luminescence parameters to be evaluated. PMID:21765155
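    In the same spirit as the spreadsheet fit described above, the sketch below resolves a synthetic glow curve into first-order (Randall-Wilkins) components with SciPy's least-squares solver rather than Excel's Solver. The heating rate, trap parameters and noise level are all hypothetical, and the kinetic model is the generic first-order expression, not necessarily the one used in the intercomparison project.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid
        from scipy.optimize import least_squares

        K_B = 8.617e-5      # Boltzmann constant in eV/K
        BETA = 1.0          # assumed linear heating rate in K/s

        def first_order_peak(T, n0, E, s):
            """Randall-Wilkins first-order TL peak for a linear heating rate."""
            rate = np.exp(-E / (K_B * T))
            depletion = cumulative_trapezoid(rate, T, initial=0.0)
            return n0 * s * rate * np.exp(-(s / BETA) * depletion)

        def glow_curve(T, params, n_peaks):
            out = np.zeros_like(T)
            for i in range(n_peaks):
                n0, E, s = params[3 * i: 3 * i + 3]
                out = out + first_order_peak(T, n0, E, s)
            return out

        T = np.linspace(300.0, 600.0, 301)                            # temperature in K
        true_params = [1e5, 1.0, 1e12, 5e4, 1.3, 1e13]                # two hypothetical peaks
        data = glow_curve(T, true_params, 2)
        data = data + np.random.normal(0.0, 0.01 * data.max(), T.size)

        guess = [8e4, 0.9, 5e11, 6e4, 1.2, 5e12]
        fit = least_squares(lambda p: glow_curve(T, p, 2) - data, guess, bounds=(0, np.inf))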

  15. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  16. FDTD simulation tools for UWB antenna analysis.

    SciTech Connect

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  17. Constrained iterations for blind deconvolution and convexity issues

    NASA Astrophysics Data System (ADS)

    Spaletta, Giulia; Caucci, Luca

    2006-12-01

    The need for image restoration arises in many applications of various scientific disciplines, such as medicine and astronomy and, in general, whenever an unknown image must be recovered from blurred and noisy data [M. Bertero, P. Boccacci, Introduction to Inverse Problems in Imaging, Institute of Physics Publishing, Philadelphia, PA, USA, 1998]. The algorithm studied in this work restores the image without the knowledge of the blur, using little a priori information and a blind inverse filter iteration. It represents a variation of the methods proposed in Kundur and Hatzinakos [A novel blind deconvolution scheme for image restoration using recursive filtering, IEEE Trans. Signal Process. 46(2) (1998) 375-390] and Ng et al. [Regularization of RIF blind image deconvolution, IEEE Trans. Image Process. 9(6) (2000) 1130-1134]. The problem of interest here is an inverse one, that cannot be solved by simple filtering since it is ill-posed. The imaging system is assumed to be linear and space-invariant: this allows a simplified relationship between unknown and observed images, described by a point spread function modeling the distortion. The blurring, though, makes the restoration ill-conditioned: regularization is therefore also needed, obtained by adding constraints to the formulation. The restoration is modeled as a constrained minimization: particular attention is given here to the analysis of the objective function and on establishing whether or not it is a convex function, whose minima can be located by classic optimization techniques and descent methods. Numerical examples are applied to simulated data and to real data derived from various applications. Comparison with the behavior of methods [D. Kundur, D. Hatzinakos, A novel blind deconvolution scheme for image restoration using recursive filtering, IEEE Trans. Signal Process. 46(2) (1998) 375-390] and [M. Ng, R.J. Plemmons, S. Qiao, Regularization of RIF Blind Image Deconvolution, IEEE Trans. Image Process. 9(6) (2000) 1130-1134] show the effectiveness of our variant.

  18. Deconvolution of immittance data: some old and new methods

    SciTech Connect

    Tuncer, Enis [ORNL; Macdonald, Ross J. [University of North Carolina

    2007-01-01

    The background and history of various deconvolution approaches are briefly summarized; different methods are compared; and available computational resources are described. These underutilized data analysis methods are valuable in both electrochemistry and immittance spectroscopy areas, and freely available computer programs are cited that provide an automatic test of the appropriateness of Kronig-Kramers transforms, a powerful nonlinear-least-squares inversion method, and a new Monte-Carlo inversion method. The important distinction, usually ignored, between discrete-point distributions and continuous ones is emphasized, and both recent parametric and non-parametric deconvolution/inversion procedures for frequency-response data are discussed and compared. Information missing in a recent parametric measurement-model deconvolution approach is pointed out and remedied, and its priority evaluated. Comparisons are presented between the standard parametric least squares inversion method and a new non-parametric Monte Carlo one that allows complicated composite distributions of relaxation times (DRT) to be accurately estimated without the uncertainty present with regularization methods. Also, detailed Monte-Carlo DRT estimates for the supercooled liquid 0.4Ca(NO3)2-0.6KNO3 (CKN) at 350 K are compared with appropriate frequency-response-model fit results. These composite models were derived from stretched-exponential Kohlrausch temporal response with the inclusion of either of two different series electrode-polarization functions.

  19. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A

  20. Zoo: A tool for traffic analysis and characterization User manual

    E-print Network

    Owezarski, Philippe

    This user manual introduces the Zoo tool for traffic analysis and characterization, designed in the French Metropolis project, which explains why Zoo speaks French; we will soon teach Zoo how to speak English.

  1. Coastal online analysis and synthesis tool 2.0 (COAST)

    Microsoft Academic Search

    R. B. Brown; A. R. Navard; B. T. Nguyen

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) geobrowser has been developed at NASA Stennis Space Center (SSC) for integration of previously disparate coastal datasets from NASA and other sources into a common desktop client tool. COAST will provide insightful new data visualization and analysis capabilities for the coastal researcher. COAST is built upon the NASA open source 3D geobrowser,

  2. Tools and techniques for failure analysis and qualification of MEMS.

    SciTech Connect

    Walraven, Jeremy Allen

    2003-07-01

    Many of the tools and techniques used to evaluate and characterize ICs can be applied to MEMS technology. In this paper we discuss various tools and techniques used to provide structural, chemical, and electrical analysis and how these data aid in qualifying MEMS technologies.

  3. ATACOBOL: A COBOL Test Coverage Analysis Tool and Its Applications

    Microsoft Academic Search

    Sam K. S. Sze; Michael R. Lyu

    2000-01-01

    A coverage testing tool, ATACOBOL (Automatic Test Analysis for COBOL), that applies data flow coverage techniques has been developed for software development on the IBM System/390 mainframe. We show that the data flow coverage criteria can identify possible problematic paths that map to the actual testing semantics required by Y2K compliance software testing. However, the mainframe environment lacks testing tools that equip

  4. Tools for Physics Analysis in CMS

    Microsoft Academic Search

    Andreas Hinzmann

    2011-01-01

    The CMS Physics Analysis Toolkit (PAT) is presented. The PAT is a high-level analysis layer enabling the development of common analysis efforts across and within physics analysis groups. It aims at fulfilling the needs of most CMS analyses, providing both ease-of-use for the beginner and flexibility for the advanced user. The main PAT concepts are described in detail and some

  5. Development of wavelet analysis tools for turbulence

    NASA Technical Reports Server (NTRS)

    Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

    1992-01-01

    Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

  6. HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL

    EPA Science Inventory

    An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

  7. Introducing an Online Cooling Tower Performance Analysis Tool 

    E-print Network

    Muller, M.R.; Muller, M.B.; Rao, P.

    2012-01-01

    to a default operating condition and forgotten. This paper will introduce a web-based cooling tower analysis tool being developed to help users understand and optimize operational efficiency. The calculations, evaluations, and models will be discussed...

  8. A Tool for Efficient Fault Tree Analysis (extended version)

    E-print Network

    Vellekoop, Michel

    DFTCalc is a tool for efficient fault tree analysis that provides (1) efficient fault tree modelling via compact representations and (2) effective analysis. Fault tree analysis (FTA) is a graphical technique that is often used in industry.

  9. Development of High Performance Fluxomics Tools for Microbial Metabolism Analysis

    E-print Network

    Subramanian, Venkat

    Topics include genome-scale metabolic model reconstruction, amino acid analysis by GC-MS and protein hydrolysis, isotopic labeling, and fluxomics software development: 13C-based pathway and flux analysis, Monod modelling, bi-level dynamic flux balance analysis, and isotopomer labeling simulation.

  10. A Flexible Data Analysis Tool for Chemical Genetic Screens

    Microsoft Academic Search

    Brian P. Kelley; Mitchell R. Lunn; David E. Root; Stephen P. Flaherty; Allison M. Martino; Brent R. Stockwell

    2004-01-01

    High-throughput assays generate immense quantities of data that require sophisticated data analysis tools. We have created a freely available software tool, SLIMS (Small Laboratory Information Management System), for chemical genetics which facilitates the collection and analysis of large-scale chemical screening data. Compound structures, physical locations, and raw data can be loaded into SLIMS. Raw data from high-throughput assays are normalized

  11. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  12. Bridging Performance Analysis Tools and Analytic Performance Modeling for HPC

    Microsoft Academic Search

    Torsten Hoefler

    Application performance is critical in high-performance computing (HPC); however, it is not considered in a systematic way in the HPC software development process. Integrated performance models could improve this situation. Advanced analytic performance modeling and performance analysis tools exist in isolation but have similar goals and could benefit mutually. We find that existing analysis tools could be extended to support

  13. BRFSS: Prevalence Data and Data Analysis Tools

    NSDL National Science Digital Library

    Center for Disease Control

    BRFSS is the nation's premier system of health-related telephone surveys that collect state data about U.S. residents regarding their health-related risk behaviors, chronic health conditions, and use of preventive services. BRFSS collects data in all 50 states as well as the District of Columbia and three U.S. territories. BRFSS completes more than 400,000 adult interviews each year, making it the largest continuously conducted health survey system in the world. These tools allow the user to perform various analyses and display the data in different ways.

  14. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  15. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  16. LCD ROOT Simulation and Analysis Tools

    E-print Network

    Masako Iwasaki; Toshinori Abe

    2001-02-07

    The North American Linear Collider Detector group has developed a simulation program package based on the ROOT system. The package consists of Fast simulation, the reconstruction of the Full simulated data, and physics analysis utilities.

  17. LCD ROOT Simulation and Analysis Tools

    SciTech Connect

    Iwasaki, Masako

    2001-02-08

    The North American Linear Collider Detector group has developed a simulation program package based on the ROOT system. The package consists of Fast simulation, the reconstruction of the Full simulated data, and physics analysis utilities.

  18. JAVA based LCD Reconstruction and Analysis Tools

    SciTech Connect

    Bower, G.

    2004-10-11

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  19. A visual tool for forensic analysis of mobile phone traffic

    Microsoft Academic Search

    Salvatore Amato Catanese; Giacomo Fiumara

    2010-01-01

    In this paper we present our tool LogAnalysis for forensic visual statistical analysis of mobile phone traffic. LogAnalysis graphically represents the relationships among mobile phone users with a node-link layout. Its aim is to explore the structure of a large graph, measure connectivity among users and give support to visual search and automatic identification of organizations. To do so, LogAnalysis

  20. Multi-order blind deconvolution algorithm with adaptive Tikhonov regularization for infrared spectroscopic data

    NASA Astrophysics Data System (ADS)

    Liu, Hai; Zhou, Mo; Zhang, Zhaoli; Shu, Jiangbo; Liu, Tingting; Zhang, Tianxu

    2015-07-01

    Infrared spectra often suffer from the common problems of band overlap and random noise. In this paper, we introduce a blind spectral deconvolution method to recover degraded infrared spectra. First, we present an analysis of the causes of the band-side artifacts found in current deconvolution methods, and model the spectral noise with multi-order derivatives inspired by that analysis. Adaptive Tikhonov regularization is employed to preserve the spectral structure and suppress the noise. An effective optimization scheme is then described that alternates between IRF estimation and the latent spectrum until convergence. Numerical experiments demonstrate the superior performance of the proposed method compared with traditional methods.
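    For reference, a generic Tikhonov-regularised deconvolution in the Fourier domain is sketched below; it is not the authors' multi-order adaptive scheme, and the Gaussian instrument response function, regularisation weight and synthetic spectrum are assumptions made for the illustration.

        import numpy as np
        from numpy.fft import fft, ifft

        def tikhonov_deconvolve(measured, irf, lam=1e-2):
            """Recover the latent spectrum from measured = irf (*) latent + noise."""
            H = fft(irf / irf.sum())
            Y = fft(measured)
            latent = ifft(np.conj(H) * Y / (np.abs(H) ** 2 + lam))
            return np.real(latent)

        x = np.linspace(0.0, 100.0, 1000)
        latent = np.exp(-((x - 40.0) / 1.5) ** 2) + 0.6 * np.exp(-((x - 45.0) / 1.5) ** 2)
        irf = np.exp(-((x - 50.0) / 4.0) ** 2)                 # broad instrument response
        measured = np.real(ifft(fft(latent) * fft(irf / irf.sum())))
        measured = measured + np.random.normal(0.0, 0.002, x.size)
        recovered = tikhonov_deconvolve(measured, irf, lam=5e-3)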

  1. GEOGRAPHIC ANALYSIS TOOL FOR HEALTH AND ENVIRONMENTAL RESEARCH (GATHER)

    EPA Science Inventory

    GATHER, Geographic Analysis Tool for Health and Environmental Research, is an online spatial data access system that provides members of the public health community and general public access to spatial data that is pertinent to the analysis and exploration of public health issues...

  2. FUTURE POWER GRID INITIATIVE Market Design Analysis Tool

    E-print Network

    Objective: Power market design plays a critical role in the outcomes related to power system reliability and market efficiency. Focus area: large-scale power grid simulation and analysis. For more information, see the FPGI website or contact gridoptics.pnnl.gov.

  3. Protocol Analysis of a Federated Search Tool: Designing for Users

    Microsoft Academic Search

    Emily Alling; Rachael Naismith

    2007-01-01

    Librarians at Springfield College conducted usability testing of Endeavor's federated search tool, ENCompass for Resource Access. The purpose of the testing was to make informed decisions prior to customizing the look and function of the software's interface in order to make the product more usable for their patrons. Protocol, or think-aloud, analysis was selected as a testing and analysis method.

  4. Cost-Benefit Analysis: Tools for Decision Making.

    ERIC Educational Resources Information Center

    Bess, Gary

    2002-01-01

    Suggests that cost-benefit analysis can be a helpful tool for assessing difficult and complex problems in child care facilities. Defines cost-benefit analysis as an approach to determine the most economical way to manage a program, describes how to analyze costs and benefits through hypothetical scenarios, and discusses some of the problems…

  5. Pin: building customized program analysis tools with dynamic instrumentation

    Microsoft Academic Search

    Chi-Keung Luk; Robert S. Cohn; Robert Muth; Harish Patil; Artur Klauser; P. Geoffrey Lowney; Steven Wallace; Vijay Janapa Reddi; Kim M. Hazelwood

    2005-01-01

    Robust and powerful software instrumentation tools are essential for program analysis tasks such as profiling, performance evaluation, and bug detection. To meet this need, we have developed a new instrumentation system called Pin to instrument executables while they are running. For efficiency, Pin uses several techniques, including inlining, register re-allocation, liveness analysis, and instruction scheduling to optimize instrumentation. This fully automated

  6. A new tool for contamination analysis

    SciTech Connect

    Meltzer, M.; Gregg, H.

    1996-06-01

    The Contamination Analysis Unit (CAU) is a sensing system that facilitates a new approach to industrial cleaning. Through use of portable mass spectrometry and various desorption techniques, the CAU provides in-process, near-real-time measurement of surface cleanliness levels. It can be of help in significantly reducing hazardous waste generation and toxic air emissions from manufacturing operations.

  7. Is citation analysis a legitimate evaluation tool?

    Microsoft Academic Search

    E. Garfield

    1979-01-01

    A comprehensive discussion on the use of citation analysis to rate scientific performance and the controversy surrounding it. The general adverse criticism that citation counts include an excessive number of negative citations (citations to incorrect results worthy of attack), self-citations (citations to the works of the citing authors), and citations to methodological papers is analyzed. Included are a discussion of

  8. Open Source Tools for Seismicity Analysis

    Microsoft Academic Search

    P. Powers

    2010-01-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a

  9. Millennial scale system impulse response of polar climates - deconvolution results between δ18O records from Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Reischmann, E.; Yang, X.; Rial, J. A.

    2013-12-01

    Deconvolution has long been used in science to recover a real input given a system's impulse response and output. In this study, we applied spectral division deconvolution to selected polar δ18O time series to investigate the possible relationship between the climates of the polar regions, i.e., the equivalent of a climate system's 'impulse response.' While the records may be the result of nonlinear processes, deconvolution remains an appropriate tool because the two polar climates are synchronized, forming a Hilbert transform pair. In order to compare records, the age models of three Greenland and four Antarctica records have been matched via a Monte Carlo method using the methane-matched pair GRIP and BYRD as a basis for the calculations. For all twelve polar pairs, various deconvolution schemes (Wiener, damped least squares, Tikhonov, Kalman filter) give consistent, quasi-periodic impulse responses of the system. Multitaper analysis reveals strong, millennial-scale, quasi-periodic oscillations in these system responses with periods in the range of 2,500 to 1,000 years. These are not symmetric, as the transfer function from north to south differs from that of south to north. However, the difference is systematic and occurs in the predominant period of the deconvolved signals. Specifically, the north-to-south transfer function is generally of longer period than the south-to-north transfer function. High-amplitude power peaks at 5.0 ky to 1.7 ky characterize the former, while the latter contains peaks at mostly shorter periods, in the range of 2.5 ky to 1.0 ky. Consistent with many observations, the deconvolved, quasi-periodic transfer functions share the predominant periodicities found in the data, some of which are likely related to solar forcing (2.5-1.0 ky), while some are probably indicative of the internal oscillations of the climate system (1.6-1.4 ky). The approximately 1.5 ky transfer function may represent the internal periodicity of the system, perhaps even related to the periodicity of the thermohaline circulation (THC). Simplified models of the polar climate fluctuations are shown to support these findings.
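    Spectral division deconvolution, as used above, divides the spectrum of one record by that of the other, with damping to keep noise from blowing up where the input spectrum is small. The following is a hedged, minimal water-level sketch on synthetic series; the damping level and the toy records are illustrative assumptions, not the authors' data or exact scheme:

    ```python
    import numpy as np

    def spectral_division(output, inp, water_level=0.05):
        """Estimate the impulse response mapping `inp` onto `output` by damped
        spectral division (a simple water-level deconvolution)."""
        n = len(output)
        Fo = np.fft.rfft(output, n)
        Fi = np.fft.rfft(inp, n)
        denom = np.abs(Fi) ** 2
        # water level: raise small spectral amplitudes to stabilize the division
        denom = np.maximum(denom, water_level * denom.max())
        T = Fo * np.conj(Fi) / denom
        return np.fft.irfft(T, n)

    # toy usage with two synthetic, band-limited records
    rng = np.random.default_rng(0)
    north = np.convolve(rng.standard_normal(2048), np.hanning(64), mode="same")
    true_response = np.exp(-np.arange(2048) / 150.0)
    south = np.convolve(north, true_response)[:2048]
    estimated_response = spectral_division(south, north)
    ```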

  10. Blind-deconvolution optical-resolution photoacoustic microscopy in vivo.

    PubMed

    Chen, Jianhua; Lin, Riqiang; Wang, Huina; Meng, Jing; Zheng, Hairong; Song, Liang

    2013-03-25

    Optical-resolution photoacoustic microscopy (OR-PAM) is becoming a vital tool for studying the microcirculation system in vivo. By increasing the numerical aperture of optical focusing, the lateral resolution of OR-PAM can be improved; however, the depth of focus and thus the imaging range will be sacrificed correspondingly. In this work, we report our development of blind-deconvolution optical-resolution photoacoustic microscopy (BD-PAM) that can provide a lateral resolution ~2-fold finer than that of conventional OR-PAM (3.04 vs. 5.78 μm), without physically increasing the system's numerical aperture. The improvement achieved with BD-PAM is demonstrated by imaging graphene nanoparticles and the microvasculature of mice ears in vivo. Our results suggest that BD-PAM may become a valuable tool for many biomedical applications that require both fine spatial resolution and extended depth of focus. PMID:23546115
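    The record does not say which blind deconvolution algorithm BD-PAM implements. As a hedged illustration of the general idea only, here is a compact alternating (blind) Richardson-Lucy sketch in which the image estimate and the point spread function are updated in turn; the kernel size and iteration counts are arbitrary assumptions:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def blind_richardson_lucy(image, psf_size=7, n_outer=10, n_inner=5):
        """Tiny blind Richardson-Lucy sketch: alternately update the latent
        image and the PSF. `image` must be a non-negative 2-D array."""
        image = np.asarray(image, dtype=float)
        eps = 1e-12
        est = np.full(image.shape, image.mean())
        psf = np.ones((psf_size, psf_size)) / psf_size ** 2
        half = psf_size // 2
        for _ in range(n_outer):
            for _ in range(n_inner):          # image update, PSF held fixed
                ratio = image / (fftconvolve(est, psf, mode="same") + eps)
                est *= fftconvolve(ratio, psf[::-1, ::-1], mode="same")
            for _ in range(n_inner):          # PSF update, image held fixed
                ratio = image / (fftconvolve(est, psf, mode="same") + eps)
                corr = fftconvolve(ratio, est[::-1, ::-1], mode="same")
                cy, cx = corr.shape[0] // 2, corr.shape[1] // 2
                psf *= corr[cy - half:cy + half + 1, cx - half:cx + half + 1]
                psf /= psf.sum() + eps
        return est, psf
    ```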

  11. Development of a climate data analysis tool (CDAT)

    SciTech Connect

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

  12. [SIGAPS, a tool for the analysis of scientific publications].

    PubMed

    Sillet, Arnauld

    2015-04-01

    The System for the Identification, Management and Analysis of Scientific Publications (SIGAPS) is essential for the funding of teaching hospitals on the basis of scientific publications. It is based on the analysis of articles indexed in Medline and is calculated by taking into account the place of the author and the ranking of the journal according to the disciplinary field. It also offers tools for the bibliometric analysis of scientific production. PMID:26043639

  13. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  14. Semi-blind nonstationary deconvolution: Joint reflectivity and Q estimation

    NASA Astrophysics Data System (ADS)

    Gholami, Ali

    2015-06-01

    Source signature deconvolution and attenuation compensation, or inverse quality factor (Q) filtering, are two challenging problems in seismic data analysis that are used to extend the temporal bandwidth of the data. Separate estimates of the wavelet and, especially, of the Earth Q model are by themselves problematic and add further uncertainties to inverse problems that are already clearly ill-conditioned. The two problems are formulated in the framework of polynomial extrapolation, and a closed-form solution is provided based on Lagrange interpolation. Analysis of the stability issue shows that the errors in the estimated results grow exponentially with both the problem size N and the inverse of Q. In order to circumvent both the instability and the uncertainty of the Q model, these problems are addressed in a unified formulation as a semi-blind nonstationary deconvolution (SeND) that decomposes the observed trace into the smallest number of nonstationary wavelets selected from a dictionary via a basis pursuit algorithm. The dictionary is constructed from the known source wavelet with different propagation times, each attenuated with a range of possible Q values. Using Horner's rule, an efficient algorithm is also provided for applying the dictionary and its adjoint. SeND is an extension of conventional sparse spike deconvolution to its nonstationary form, which provides the reflectivity and Q models simultaneously without requiring a priori Q information. In numerical tests where the wavelet and attenuation mechanism are both known, SeND estimates both the original reflectivity and the Q model with higher accuracy than conventional spectral-ratio techniques. Application of the algorithm to field data indicates a substantial improvement in temporal resolution on a seismic record.
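    The central construction above is a dictionary of time-shifted copies of the source wavelet, each attenuated by a candidate Q, with a sparse coefficient vector selecting the active atoms. The sketch below builds such a dictionary by brute force (no Horner's-rule speedup, amplitude attenuation only, no dispersion) and substitutes a plain ISTA loop for the paper's basis pursuit solver; all parameter values are illustrative assumptions:

    ```python
    import numpy as np

    def build_dictionary(wavelet, n_samples, dt, q_values):
        """Columns are the source wavelet delayed to each sample and attenuated
        with each candidate Q (constant-Q amplitude decay, no dispersion).
        Assumes len(wavelet) <= n_samples; keep n_samples small (dense matrix)."""
        freqs = np.fft.rfftfreq(n_samples, dt)
        w = np.zeros(n_samples)
        w[:len(wavelet)] = wavelet
        W = np.fft.rfft(w)
        atoms = []
        for q in q_values:
            for k in range(n_samples):
                t_k = k * dt
                spec = W * np.exp(-np.pi * freqs * t_k / q)      # attenuation
                spec = spec * np.exp(-2j * np.pi * freqs * t_k)  # delay to sample k
                atoms.append(np.fft.irfft(spec, n_samples))
        D = np.array(atoms).T
        norms = np.linalg.norm(D, axis=0)
        return D / np.where(norms > 0, norms, 1.0)

    def ista(D, y, lam=0.05, n_iter=200):
        """Plain ISTA for min_x 0.5*||D x - y||^2 + lam*||x||_1."""
        L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the gradient
        x = np.zeros(D.shape[1])
        for _ in range(n_iter):
            x = x - D.T @ (D @ x - y) / L
            x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)
        return x
    ```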

  15. Serial concept maps: tools for concept analysis.

    PubMed

    All, Anita C; Huycke, LaRae I

    2007-05-01

    Nursing theory challenges students to think abstractly and is often a difficult introduction to graduate study. Traditionally, concept analysis is useful in facilitating this abstract thinking. Concept maps are a way to visualize an individual's knowledge about a specific topic. Serial concept maps express the sequential evolution of a student's perceptions of a selected concept. Maps reveal individual differences in learning and perceptions, as well as progress in understanding the concept. Relationships are assessed and suggestions are made during serial mapping, which actively engages the students and faculty in dialogue that leads to increased understanding of the link between nursing theory and practice. Serial concept mapping lends itself well to both online and traditional classroom environments. PMID:17547345

  16. GATB: Genome Assembly & Analysis Tool Box

    PubMed Central

    Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique

    2014-01-01

    Motivation: Efficient and fast next-generation sequencing (NGS) algorithms are essential to analyze the terabytes of data generated by NGS machines. A serious bottleneck can be the design of such algorithms, as they require sophisticated data structures and advanced hardware implementation. Results: We propose an open-source library dedicated to genome assembly and analysis that speeds up the development of efficient software. The library is based on a recent optimized de Bruijn graph implementation allowing complex genomes to be processed on desktop computers using fast algorithms with low memory footprints. Availability and implementation: The GATB library is written in C++ and is available at the following Web site http://gatb.inria.fr under the A-GPL license. Contact: lavenier@irisa.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24990603

  17. Design and analysis tools for concurrent blackboard systems

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1991-01-01

    A set of blackboard system design and analysis tools that consists of a knowledge source organizer, a knowledge source input/output connectivity analyzer, and a validated blackboard system simulation model is discussed. The author presents the structure and functionality of the knowledge source input/output connectivity analyzer. An example outlining the use of the analyzer to aid in the design of a concurrent tactical decision generator for air-to-air combat is presented. The blackboard system design and analysis tools were designed for generic blackboard systems and are application independent.

  18. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year, Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weakness of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weakness, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  19. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  20. Interoperability of the analysis tools within the IMPEx project

    NASA Astrophysics Data System (ADS)

    Génot, Vincent; Khodachenko, Maxim; Kallio, Esa; Al-Ubaidi, Tarek; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Renard, Benjamin; Bourel, Natacha; Modolo, Ronan; Hess, Sébastien; André, Nicolas; Penou, Emmanuel; Topf, Florian; Alexeev, Igor; Belenkaya, Elena; Kalegaev, Vladimir; Schmidt, Walter

    2013-04-01

    The growing amount of data in planetary sciences requires adequate tools for visualisation enabling in-depth analysis. Within the FP7 IMPEx infrastructure, data will originate from heterogeneous sources: large observational databases (CDAWeb, AMDA at CDPP, ...), simulation databases for hybrid and MHD codes (FMI, LATMOS), and a planetary magnetic field models database and online services (SINP). Together with the common "time series" visualisation functionality for both in-situ and modeled data (provided by the AMDA and CLWeb tools), IMPEx will also provide immersion capabilities into the complex 3D data originating from models (provided by 3DView). The functionalities of these tools will be described. The emphasis will be put on how these tools (1) can share information (for instance Time Tables or user-composed parameters) and (2) can be operated synchronously via dynamic connections based on Virtual Observatory standards.

  1. Adaptive wavelet deconvolution for strongly mixing sequences

    E-print Network

    Paris-Sud XI, Université de

    Adaptive wavelet deconvolution for strongly mixing sequences (Christophe Chesneau). Considering the ... square error over Besov balls, we explore the performances of two wavelet estimators: a standard linear one and a hard-thresholding one. Keywords: Strongly mixing, Adaptivity, Wavelets, Hard thresholding. AMS 2000 Subject Classifications: 62G07, 62G20.
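    The record is truncated, but it concerns wavelet deconvolution estimators that use hard thresholding. As a generic, hedged illustration of that family (not Chesneau's estimator), the sketch below divides out a known blur kernel in the Fourier domain and then hard-thresholds the wavelet coefficients of the result; it assumes the PyWavelets package and a known noise level:

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed available

    def wavelet_hard_threshold_deconvolve(y, kernel, noise_sigma, wavelet="sym8"):
        """Naive deconvolve-then-denoise sketch: divide out the blur kernel in
        the Fourier domain (with a small floor) and hard-threshold the wavelet
        coefficients of the result."""
        n = len(y)
        K = np.fft.rfft(kernel, n)
        floor = 1e-3 * np.abs(K).max()
        K_safe = np.where(np.abs(K) < floor, floor, K)
        rough = np.fft.irfft(np.fft.rfft(y, n) / K_safe, n)
        coeffs = pywt.wavedec(rough, wavelet)
        thresh = noise_sigma * np.sqrt(2.0 * np.log(n))   # universal threshold
        denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="hard")
                                  for c in coeffs[1:]]
        return pywt.waverec(denoised, wavelet)[:n]
    ```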

  2. Satellite image deconvolution based on nonlocal means.

    PubMed

    Zhao, Ming; Zhang, Wei; Wang, Zhile; Hou, Qingyu

    2010-11-10

    The deconvolution of blurred and noisy satellite images is an ill-posed inverse problem, which can be regularized under the Bayesian framework by introducing an appropriate image prior. In this paper, we derive a new image prior based on the state-of-the-art nonlocal means (NLM) denoising approach under Markov random field theory. Inheriting from the NLM, the prior exploits the intrinsic high redundancy of satellite images and is able to encode the image's nonsmooth information. Using this prior, we propose an inhomogeneous deconvolution technique for satellite images, termed nonlocal means-based deconvolution (NLM-D). Moreover, in order to make our NLM-D unsupervised, we apply the L-curve approach to estimate the optimal regularization parameter. Experimentally, NLM-D demonstrates its capacity to preserve the image's nonsmooth structures (such as edges and textures) and outperforms the existing total variation-based and wavelet-based deconvolution methods in terms of both visual quality and signal-to-noise ratio performance. PMID:21068860
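    NLM-D, as described above, picks its regularization parameter with the L-curve approach. The hedged sketch below applies the same idea to a plain 1-D Tikhonov deconvolution filter rather than the NLM prior itself; the maximum-curvature heuristic and the sampled lambda grid are assumptions:

    ```python
    import numpy as np

    def l_curve_corner(y, kernel, lambdas):
        """Pick the Tikhonov weight at the corner of the L-curve (log residual
        norm vs. log solution norm), located here as the point of maximum
        discrete curvature along the sampled curve."""
        n = len(y)
        Y = np.fft.rfft(y, n)
        K = np.fft.rfft(kernel, n)
        res_norms, sol_norms = [], []
        for lam in lambdas:
            X = np.conj(K) * Y / (np.abs(K) ** 2 + lam)
            x = np.fft.irfft(X, n)
            r = np.fft.irfft(K * X, n) - y
            res_norms.append(np.linalg.norm(r))
            sol_norms.append(np.linalg.norm(x))
        rho, eta = np.log(res_norms), np.log(sol_norms)
        d1r, d1e = np.gradient(rho), np.gradient(eta)
        d2r, d2e = np.gradient(d1r), np.gradient(d1e)
        curvature = (d1r * d2e - d2r * d1e) / (d1r ** 2 + d1e ** 2) ** 1.5
        return lambdas[int(np.argmax(curvature))]
    ```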

  3. Nonstationary sparsity-constrained seismic deconvolution

    NASA Astrophysics Data System (ADS)

    Sun, Xue-Kai; Sam, Zandong Sun; Xie, Hui-Wen

    2014-12-01

    The Robinson convolution model is mainly restricted by three inappropriate assumptions, i.e., statistically white reflectivity, minimum-phase wavelet, and stationarity. Modern reflectivity inversion methods (e.g., sparsity-constrained deconvolution) generally attempt to suppress the problems associated with the first two assumptions but often ignore that seismic traces are nonstationary signals, which undermines the basic assumption of an unchanging wavelet in reflectivity inversion. Through tests on reflectivity series, we confirm the effects of nonstationarity on reflectivity estimation and the loss of significant information, especially in deep layers. To overcome the problems caused by nonstationarity, we propose a nonstationary convolutional model, and then use the attenuation curve in log spectra to detect and correct the influences of nonstationarity. We use Gabor deconvolution to handle nonstationarity and sparsity-constrained deconvolution to separate reflectivity and wavelet. The combination of the two deconvolution methods effectively handles nonstationarity and greatly reduces the problems associated with the unreasonable assumptions regarding reflectivity and wavelet. Using marine seismic data, we show that correcting nonstationarity helps recover subtle reflectivity information and enhances the characterization of details with respect to the geological record.
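    The workflow above detects nonstationarity from an attenuation curve measured in log spectra. A hedged sketch of that measurement step (not the authors' Gabor deconvolution) is to take a short-time Fourier transform and, in each window, fit the slope of the log amplitude spectrum against frequency, which under a constant-Q model is proportional to -pi*t/Q; the window length and frequency band below are arbitrary assumptions:

    ```python
    import numpy as np
    from scipy.signal import stft

    def attenuation_curve(trace, dt, fmin=10.0, fmax=60.0):
        """Time-varying attenuation proxy from the slope of the log amplitude
        spectrum in each STFT window (constant-Q model: ln|A(f,t)| is roughly
        linear in f with slope -pi*t/Q)."""
        f, t, Z = stft(trace, fs=1.0 / dt, nperseg=128)
        band = (f >= fmin) & (f <= fmax)
        slopes = []
        for k in range(Z.shape[1]):
            logamp = np.log(np.abs(Z[band, k]) + 1e-12)
            slope, _ = np.polyfit(f[band], logamp, 1)
            slopes.append(slope)
        return t, np.array(slopes)   # more negative slope -> stronger attenuation
    ```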

  4. Signal restoration through deconvolution applied to deep

    E-print Network

    Renaut, Rosemary

    In [1] we present a method of signal restoration applied to deep mantle seismic probes. The resulting restored time series facilitates more accurate and objective relative travel time estimation.

  5. Signal restoration through deconvolution applied to deep

    E-print Network

    Renaut, Rosemary

    We present a method of signal restoration to improve the signal-to-noise ratio, sharpen seismic arrival onset, and act as an empirical ... in the wave train. The resulting restored time series facilitates more accurate and objective relative travel time estimation.

  6. Time domain deconvolution of transient radar data

    Microsoft Academic Search

    E. J. Rothwell; Weimin Sun

    1990-01-01

    A simple technique is presented for obtaining the impulse response of a conducting radar target by deconvolving the measured response of the experimental system from the measured response of the target. The scheme uses singular value decomposition (SVD). It is shown that the ill-conditioned nature of the deconvolution process can be linked to the use of inappropriate frequency information in
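    The SVD-based scheme summarized above deconvolves the measured system response from the measured target response and ties the ill-conditioning to small singular values. A minimal hedged sketch of that idea, using a truncated-SVD pseudoinverse of the convolution matrix (the relative cutoff is an arbitrary assumption):

    ```python
    import numpy as np
    from scipy.linalg import toeplitz

    def tsvd_deconvolve(system_response, target_response, rel_cutoff=1e-2):
        """Estimate a target impulse response h from the measured system input s
        and the measured output y = s * h, using a truncated-SVD pseudoinverse of
        the convolution matrix to drop ill-conditioned singular values."""
        n = len(target_response)
        col = np.zeros(n)
        col[:len(system_response)] = system_response[:n]
        S = toeplitz(col, np.zeros(n))    # lower-triangular convolution matrix
        U, sing, Vt = np.linalg.svd(S)
        inv = np.zeros_like(sing)
        keep = sing > rel_cutoff * sing[0]
        inv[keep] = 1.0 / sing[keep]
        return Vt.T @ (inv * (U.T @ target_response))
    ```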

  7. Motion blurred image deconvolution with anisotropic regularization

    Microsoft Academic Search

    Faouzi Benzarti; Ezzedine Ben Braiek; Hamid Amiri

    2004-01-01

    Image restoration or deconvolution is an evolving research topic in the area of image processing and computer vision. It refers to the task of recovering a good estimate of the true image from a degraded observation. In this paper, we consider the problem of restoring an image that has been blurred by a motion blur, which occurs in many practical

  8. FPGA Analysis Tool: High-Level Flows for Low-Level Design Analysis in Reconfigurable Computing

    Microsoft Academic Search

    Krzysztof Kepa; Fearghal Morgan; Krzysztof Kosciuszkiewicz; Lars Braun; Michael Hübner; Jürgen Becker

    2009-01-01

    The growth of the reconfigurable systems community exposes diverse requirements with regard to functionality of Electronic Design Automation (EDA) tools. Those targeting reconfigurable design analysis and manipulation require low-level design tools for bitstream debugging and IP core design assurance. While tools for low-level analysis of design netlists do exist, there is a need for a low-level, open-source, extended tool

  9. Design Issues for Software Analysis and Maintenance Tools

    Microsoft Academic Search

    Dean Jin

    2005-01-01

    Software maintainers have long been acutely aware of the challenges involved in managing software change processes. Activities such as software migration, restructuring and reengineering all involve source code modification. They rely heavily on analysis and comprehension of the complex system structures and interactions that characterize both legacy and modern software systems. It is widely accepted that tools that support software

  10. Analysis tools for multicharged and multispecies ion spectra

    Microsoft Academic Search

    M. Cavenago

    2004-01-01

    A systematic analysis of the charge state distribution (CSD) of each element (or isotope) X extracted from a highly charged ion source as a function of control variables or time needs properly automated tools for the large amount of data involved and the possibility of peak superposition, due to the mass separator properties. Some key features of identification of ion

  11. The Adversarial Route Analysis Tool: A Web Application

    SciTech Connect

    Casson, William H. Jr. [Los Alamos National Laboratory

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It's a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainable, and it's simple to use without much training.

  12. An Automated Data Analysis Tool for Livestock Market Data

    ERIC Educational Resources Information Center

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  13. The Continuous Wavelet Transform, an Analysis Tool for NMR Spectroscopy

    Microsoft Academic Search

    D. Barache; J. P. Antoine; J. M. Dereppe

    1997-01-01

    The discrete wavelet transform has been used in NMR spectroscopy by several authors. We show here that the continuous wavelet transform (CWT) is also an efficient tool in that context. After reviewing briefly the analysis of spectral lines with the CWT, we discuss two applications specific to NMR, namely the removal of a large unwanted line and the rephasing of

  14. Lagrangian analysis. Modern tool of the dynamics of solids

    Microsoft Academic Search

    J. Cagnoux; P. Chartagnac; P. Hereil; M. Perez; L. Seaman

    1987-01-01

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting, are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for

  15. Recursive Frame Analysis: A Practitioner's Tool for Mapping Therapeutic Conversation

    ERIC Educational Resources Information Center

    Keeney, Hillary; Keeney, Bradford; Chenail, Ronald J.

    2012-01-01

    Recursive frame analysis (RFA), both a practical therapeutic tool and an advanced qualitative research method that maps the structure of therapeutic conversation, is introduced with a clinical case vignette. We present and illustrate a means of mapping metaphorical themes that contextualize the performance taking place in the room, recursively…

  16. Selected Tools for Risk Analysis in Logistics Processes

    NASA Astrophysics Data System (ADS)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: evaluation of significant risk groups associated with logistics process implementation, composition of integrated risk management strategies, and composition of tools for risk analysis in logistics processes.

  17. Advanced Statistical and Data Analysis Tools for Astrophysics

    NASA Technical Reports Server (NTRS)

    Kashyap, V.; Scargle, Jeffrey D. (Technical Monitor)

    2001-01-01

    The goal of the project is to obtain, derive, and develop statistical and data analysis tools that would be of use in the analyses of high-resolution, high-sensitivity data that are becoming available with new instruments. This is envisioned as a cross-disciplinary effort with a number of collaborators.

  18. Water Optimizer Suite: Tools for Decision Support and Policy Analysis

    E-print Network

    Nebraska-Lincoln, University of

    Water Optimizer is a suite of tools for decision support and policy analysis. Inputs include irrigation system options, well and pump characteristics, water supply, costs, and crop prices; irrigation options include center ... One of the strengths of Water Optimizer is that it can be easily used to evaluate ...

  19. Auto-calibrating spherical deconvolution based on ODF sparsity.

    PubMed

    Schultz, Thomas; Groeschel, Samuel

    2013-01-01

    Spherical deconvolution models the diffusion MRI signal as the convolution of a fiber orientation density function (fODF) with a single fiber response. We propose a novel calibration procedure that automatically determines this fiber response. This has three advantages: First, the user no longer needs to provide an estimate of the response. Second, we estimate a per-voxel fiber response, which is more adequate for the analysis of patient data with focal white matter degeneration. Third, parameters of the estimated response reflect diffusion properties of the white matter tissue, and can be used for quantitative analysis. Our method works by finding a tradeoff between a low fitting error and a sparse fODF. Results on simulated data demonstrate that auto-calibration successfully avoids erroneous fODF peaks that can occur with standard deconvolution, and that it resolves fiber crossings with better angular resolution than FORECAST, an alternative method. Parameter maps and tractography results corroborate applicability to clinical data. PMID:24505724

  20. Discovery and New Frontiers Project Budget Analysis Tool

    NASA Technical Reports Server (NTRS)

    Newhouse, Marilyn E.

    2011-01-01

    The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses, compare mission scenarios to the current program budget, and rapidly forecast the programs' ability to meet their launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), the mission development and operations profile by phase (percent of total mission cost and duration), the launch vehicle, and the launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real-year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect on the overall program budget of changes in future mission or launch vehicle costs, differing development profiles or operational durations of a future mission, or a replan of a current mission. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.
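    The description above amounts to spreading each mission's phase costs over time, inflating fixed-year dollars to real-year dollars, and rolling the result up across missions by year. The sketch below is only a hedged illustration of that roll-up; the Mission structure, the flat inflation rate, and the even spreading of phase costs are assumptions, not the tool's actual model:

    ```python
    from dataclasses import dataclass

    INFLATION = 0.027          # assumed flat annual inflation rate

    @dataclass
    class Mission:             # hypothetical structure, not the tool's schema
        name: str
        total_cost_fy: float   # total mission cost in fixed-year dollars
        base_year: int         # reference year of the fixed-year dollars
        phases: list           # (start_year, duration_years, fraction_of_total)

    def program_cost_by_year(missions):
        """Spread each mission's phase cost evenly over the phase duration,
        inflate to real-year dollars, and roll up across missions."""
        totals = {}
        for m in missions:
            for start, duration, fraction in m.phases:
                annual_fy = m.total_cost_fy * fraction / duration
                for k in range(duration):
                    year = start + k
                    real = annual_fy * (1.0 + INFLATION) ** (year - m.base_year)
                    totals[year] = totals.get(year, 0.0) + real
        return dict(sorted(totals.items()))

    # toy usage: two missions, each with a development and an operations phase
    missions = [
        Mission("Mission A", 450.0, 2010, [(2011, 3, 0.7), (2014, 4, 0.3)]),
        Mission("Mission B", 600.0, 2010, [(2013, 4, 0.8), (2017, 3, 0.2)]),
    ]
    print(program_cost_by_year(missions))
    ```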

  1. Analysis tools for non-radially pulsating objects

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Pollard, K. R.; Cottrell, P. L.

    2007-06-01

    At the University of Canterbury we have been developing a set of tools for the analysis of spectra of various types of non-radially pulsating objects. This set currently includes calculation of the moments, calculation of the phase across the profile, and basic binary profile fitting for the determination of orbital characteristics and for projected rotational velocity (v sin i) measurement. Recently the ability to calculate cross-correlation profiles using either specified or synthesized line lists has been added, all implemented in MATLAB. A number of observations of γ Doradus candidates are currently being used to test these tools. For information on our observing facilities see Pollard et al. (2007).
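    As a hedged illustration of the moment calculation mentioned above (the exact definitions used by the Canterbury tools are not given in the record), the sketch below computes low-order moments of a normalized line profile in velocity space:

    ```python
    import numpy as np

    def profile_moments(velocity, flux):
        """Low-order moments of a spectral line profile in velocity space,
        computed from the line depth (1 - normalized flux)."""
        depth = 1.0 - np.asarray(flux, dtype=float)
        m0 = np.trapz(depth, velocity)                   # equivalent width
        v1 = np.trapz(velocity * depth, velocity) / m0   # first moment (centroid)
        m2 = np.trapz((velocity - v1) ** 2 * depth, velocity) / m0
        m3 = np.trapz((velocity - v1) ** 3 * depth, velocity) / m0
        return m0, v1, m2, m3
    ```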

  2. fMRI analysis software tools: an evaluation framework

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Colli, Vittoria; Strocchi, Sabina; Vite, Cristina; Binaghi, Elisabetta; Conte, Leopoldo

    2011-03-01

    Performance comparison of functional Magnetic Resonance Imaging (fMRI) software tools is a very difficult task. In this paper, a framework for comparison of fMRI analysis results obtained with different software packages is proposed. An objective evaluation is possible only after pre-processing steps that normalize input data into a standard domain. Segmentation and registration algorithms are implemented in order to classify voxels as belonging to the brain or not, and to find the non-rigid transformation that best aligns the volume under inspection with a standard one. Through the definitions of intersection and union from fuzzy logic, an index was defined which quantifies the information overlap between Statistical Parametric Maps (SPMs). Direct comparison between fMRI results can only highlight differences. In order to assess the best result, an index that represents the goodness of the activation detection is required. The transformation of the activation map into a standard domain allows the use of a functional atlas for labeling the active voxels. For each functional area, the Activation Weighted Index (AWI), which identifies the mean activation level of the whole area, was defined. By means of this brief but comprehensive description, it is easy to find a metric for the objective evaluation of fMRI analysis tools. Through the first evaluation method, the situations where the SPMs are inconsistent were identified. The results of the AWI analysis suggest which tool has higher sensitivity and specificity. The proposed method seems a valid evaluation tool when applied to an adequate number of patients.
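    The two indices described above lend themselves to a short sketch: a fuzzy (minimum/maximum) overlap between two SPMs and a per-region mean activation (AWI). The paper's exact definitions are not given in the record, so the rescaling and the Jaccard-style ratio below are assumptions:

    ```python
    import numpy as np

    def fuzzy_overlap_index(spm_a, spm_b):
        """Fuzzy Jaccard-style overlap between two statistical parametric maps:
        intersection = voxelwise minimum, union = voxelwise maximum, after
        rescaling each map to [0, 1] so it reads as a fuzzy membership."""
        a = np.asarray(spm_a, dtype=float)
        b = np.asarray(spm_b, dtype=float)
        a = (a - a.min()) / (np.ptp(a) + 1e-12)
        b = (b - b.min()) / (np.ptp(b) + 1e-12)
        return np.minimum(a, b).sum() / np.maximum(a, b).sum()

    def activation_weighted_index(spm, atlas_labels):
        """Mean activation level per labeled functional area (one value per
        region, label 0 treated as background)."""
        spm = np.asarray(spm, dtype=float)
        return {int(lab): float(spm[atlas_labels == lab].mean())
                for lab in np.unique(atlas_labels) if lab != 0}
    ```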

  3. On the use of wafer positional and spatial pattern analysis to identify process marginality and to de-convolute counterintuitive experimental results

    Microsoft Academic Search

    Greg Klein; Laurence Kohler; Joseph Wiseman; Brian Dunham; Anh-Thu Tran; Stacie Brown; Masaki Shingo; I. Burki

    2007-01-01

    The use of wafer randomization and positional analysis in manufacturing is ubiquitous and well established. Wafer electrical and yield data can be traced back to specific operations in the manufacturing process with the help of wafer sequencing records. Tight process windows or complex process technologies may however require that the statistical parameter versus sequence signal be combined with other variables

  4. Rosetta CONSERT operations and data analysis preparation: simulation software tools.

    NASA Astrophysics Data System (ADS)

    Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

    2014-05-01

    The CONSERT experiment onboard Rosetta and Philae will perform tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta spacecraft to the Philae lander. The accurate analysis of travel time measurements will deliver unique knowledge of the dielectric properties of the nucleus interior. The challenging complexity of CONSERT operations requirements, combining both Rosetta and Philae, allows only a few sets of opportunities to acquire data. Thus, we need a fine analysis of the impact of the Rosetta trajectory, the Philae position, and the comet shape on CONSERT measurements, in order to make optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return from the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation that takes into account the signal polarization. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation. This allows computation on large domains relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.

  5. ROBUST 2008 Poster Section 2008 c JCMF Detecting atoms in deconvolution

    E-print Network

    Jureckova, Jana

    Detecting atoms in deconvolution (Jaroslav Pazdera, ROBUST 2008 poster). We consider the atomic deconvolution problem and propose an estimator for an atom location, giving its asymptotic ... in the ordinary deconvolution problem. In the ordinary deconvolution problem one wants

  6. Microscopy image segmentation tool: Robust image data analysis

    SciTech Connect

    Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)]

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  7. Microscopy image segmentation tool: Robust image data analysis

    NASA Astrophysics Data System (ADS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  8. Field Quality Analysis as a Tool to Monitor Magnet Production

    SciTech Connect

    Gupta, R.; Anerella, M.; Cozzolino, J.; Fisher, D.; Ghosh, A.; Jain, A.; Sampson, W.; Schmalzle, J.; Thompson, P.; Wanderer, P.; Willen, E.

    1997-10-18

    Field harmonics offer a powerful tool to examine the mechanical structure of accelerator magnets. A large deviation from the nominal values suggests a mechanical defect. Magnets with such defects are likely to have poor quench performance. Similarly, a trend suggests wear in tooling or a gradual change in the magnet assembly or in the size of a component. This paper presents the use of field quality as a tool to monitor the magnet production of the Relativistic Heavy Ion Collider (RHIC). Several examples are briefly described. Field quality analysis can also rule out a suspected geometric error if it cannot be supported by the symmetry and the magnitude of the measured harmonics.

  9. Numerical tools applied to power reactor noise analysis

    SciTech Connect

    Demaziere, C.; Pazsit, I. [Chalmers Univ. of Technology, Dept. of Nuclear Engineering, SE-412 96 Goeteborg (Sweden)

    2006-07-01

    In this paper, the development of numerical tools allowing the determination of the neutron noise in power reactors is reported. These tools give the space-dependence of the fluctuations of the neutron flux induced by fluctuating properties of the medium in the 2-group diffusion approximation and in a 2-dimensional representation of heterogeneous systems. Some applications of these tools to power reactor noise analysis are then described. These applications include the unfolding of the noise source from the resulting neutron noise measured at a few discrete locations throughout the core, the study of the space-dependence of the Decay Ratio in Boiling Water Reactors, the noise-based estimation of the Moderator Temperature Coefficient of reactivity in Pressurized Water Reactors, the modeling of shell-mode core barrel vibrations in Pressurized Water Reactors, and the investigation of the validity of the point-kinetic approximation in subcritical systems. (authors)

  10. Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment

    PubMed Central

    McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

  11. Systematic Omics Analysis Review (SOAR) tool to support risk assessment.

    PubMed

    McConnell, Emma R; Bell, Shannon M; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D

    2014-01-01

    Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884
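    As described above, SOAR scores 35 weighted questions grouped into four topics and applies a suitability cutoff per topic section. The sketch below is a hedged illustration of that scoring logic with hypothetical data structures; the real spreadsheet's questions, weights, and cutoffs are not given in the record:

    ```python
    def soar_screen(answers, weights, cutoffs):
        """Score one manuscript. `answers` maps question id -> 0/1 (criterion met),
        `weights` maps question id -> (topic, weight), `cutoffs` maps topic ->
        minimum weighted score required. Returns pass/fail per topic."""
        scores = {}
        for qid, met in answers.items():
            topic, weight = weights[qid]
            scores[topic] = scores.get(topic, 0.0) + weight * met
        return {topic: scores.get(topic, 0.0) >= cutoffs[topic] for topic in cutoffs}

    # hypothetical usage with three questions in two topic sections
    weights = {"Q1": ("test system", 2.0), "Q2": ("test system", 1.0),
               "Q3": ("microarray data", 3.0)}
    cutoffs = {"test system": 2.0, "microarray data": 3.0}
    answers = {"Q1": 1, "Q2": 0, "Q3": 1}
    print(soar_screen(answers, weights, cutoffs))
    ```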

  12. SATRAT: Staphylococcus aureus transcript regulatory network analysis tool

    PubMed Central

    Nagarajan, Vijayaraj; Elasri, Mohamed O.

    2015-01-01

    Staphylococcus aureus is a commensal organism that primarily colonizes the nose of healthy individuals. S. aureus causes a spectrum of infections that range from skin and soft-tissue infections to fatal invasive diseases. S. aureus uses a large number of virulence factors that are regulated in a coordinated fashion. The complex regulatory mechanisms have been investigated in numerous high-throughput experiments. Access to this data is critical to studying this pathogen. Previously, we developed a compilation of microarray experimental data to enable researchers to search, browse, compare, and contrast transcript profiles. We have substantially updated this database and have built a novel exploratory tool—SATRAT—the S. aureus transcript regulatory network analysis tool, based on the updated database. This tool is capable of performing deep searches using a query and generating an interactive regulatory network based on associations among the regulators of any query gene. We believe this integrated regulatory network analysis tool would help researchers explore the missing links and identify novel pathways that regulate virulence in S. aureus. Also, the data model and the network generation code used to build this resource is open sourced, enabling researchers to build similar resources for other bacterial systems. PMID:25653902

  13. Multi-Spacecraft Analysis with Generic Visualization Tools

    NASA Astrophysics Data System (ADS)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  14. Quantifying mineral abundances of complex mixtures by coupling spectral deconvolution of SWIR spectra (2.1-2.4 μm) and regression tree analysis

    USGS Publications Warehouse

    Mulder, V.L.; Plotze, Michael; de Bruin, Sytze; Schaepman, Michael E.; Mavris, C.; Kokaly, Raymond F.; Egli, Markus

    2013-01-01

    This paper presents a methodology for assessing mineral abundances of mixtures having more than two constituents using absorption features in the 2.1-2.4 μm wavelength region. In the first step, the absorption behaviour of mineral mixtures is parameterised by exponential Gaussian optimisation. Next, mineral abundances are predicted by regression tree analysis using these parameters as inputs. The approach is demonstrated on a range of prepared samples with known abundances of kaolinite, dioctahedral mica, smectite, calcite and quartz and on a set of field samples from Morocco. The latter contained varying quantities of other minerals, some of which did not have diagnostic absorption features in the 2.1-2.4 μm region. Cross validation showed that the prepared samples of kaolinite, dioctahedral mica, smectite and calcite were predicted with a root mean square error (RMSE) less than 9 wt.%. For the field samples, the RMSE was less than 8 wt.% for calcite, dioctahedral mica and kaolinite abundances. Smectite could not be well predicted, which was attributed to spectral variation of the cations within the dioctahedral layered smectites. Substitution of part of the quartz by chlorite at the prediction phase hardly affected the accuracy of the predicted mineral content; this suggests that the method is robust in handling the omission of minerals during the training phase. The degree of expression of absorption components was different between the field sample and the laboratory mixtures. This demonstrates that the method should be calibrated and trained on local samples. Our method allows the simultaneous quantification of more than two minerals within a complex mixture and thereby enhances the perspectives of spectral analysis for mineral abundances.
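    The two-step approach above (parameterize the absorption features, then predict abundances with a regression tree) can be sketched as follows. For simplicity the sketch fits plain Gaussian absorption features rather than the exponential Gaussian optimisation used in the paper, and the feature centers, initial guesses, and tree depth are assumptions:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from sklearn.tree import DecisionTreeRegressor

    def gaussian_feature(x, depth, center, width):
        """Continuum-removed reflectance with one Gaussian absorption feature."""
        return 1.0 - depth * np.exp(-0.5 * ((x - center) / width) ** 2)

    def absorption_parameters(wavelengths, reflectance, centers=(2.2, 2.35)):
        """Fit one Gaussian feature near each expected center (in micrometres)
        and return the (depth, center, width) triplets as a feature vector."""
        params = []
        for c in centers:
            popt, _ = curve_fit(gaussian_feature, wavelengths, reflectance,
                                p0=[0.1, c, 0.02], maxfev=5000)
            params.extend(popt)
        return params

    def train_abundance_model(X, y):
        """Regression tree mapping absorption parameters to mineral abundance
        (wt.%). X: one row of parameters per sample; y: known abundances."""
        return DecisionTreeRegressor(max_depth=4).fit(X, y)
    ```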

  15. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    SciTech Connect

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; Chung, Suh-Urk; D'Angelo, Annalisa; De Vita, Rafaella; Döring, Michael; Dudek, Jozef; Eidelman, S.; Fegan, Stuart; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, Derek; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, David G.; Ketzer, B.; Klein, Franz J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, Vincent; McKinnon, Brian; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, Alessandro; Salgado, Carlos [Norfolk State University; Santopinto, E.; Sarantsev, Andrey V.; Sato, Toru; Schlüter, T.; da Silva, M. L.L.; Stankovic, I.; Strakovsky, Igor [George Washington University; Szczepaniak, Adam; Vassallo, A.; Walford, Natalie K. [Catholic University; Watts, Daniel P.; Zana, Lorenzo

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  16. A Tool for Enterprise Architecture Analysis Using the PRM Formalism

    Microsoft Academic Search

    Markus Buschle; Johan Ullberg; Ulrik Franke; Robert Lagerström; Teodor Sommestad

    \\u000a Enterprise architecture advocates for model-based decision-making on enterprise-wide information system issues. In order to\\u000a provide decision-making support, enterprise architecture models should not only be descriptive but also enable analysis. This\\u000a paper presents a software tool, currently under development, for the evaluation of enterprise architecture models. In particular,\\u000a the paper focuses on how to encode scientific theories so that they can

  17. An integrated thermal management analysis tool [for aircraft

    Microsoft Academic Search

    F. Issacci; A. Telal Wassel; V. Van Griethuysen

    1996-01-01

    A computational tool, developed to perform subsystem and system level integrated thermal management assessment and design calculations, is described in this paper. The Vehicle Integrated Thermal Management Analysis Code (VITMAC) simulates the coupled thermal-fluid response of airframe/engine active cooling circuits, airframe/engine structural components, and fuel tanks subject to aeroheating and internal/engine heat loads. VITMAC simulates both the steady-state and transient

  18. Galileo: A Tool for Dynamic Fault Tree Analysis

    Microsoft Academic Search

    Joanne Bechta Dugan

    2000-01-01

    Galileo is a prototype software tool for dependability analysis of fault tolerant computer-based systems. Reliability models are specified using dynamic fault trees, which provide special constructs for modeling sequential failure modes in addition to standard combinatorial fault tree gates. Independent modules are determined automatically, and separate modules are solved combinatorially (using Binary Decision Diagrams) or using Markov methods.

  19. Stranger: An Automata-Based String Analysis Tool for PHP

    Microsoft Academic Search

    Fang Yu; Muath Alkhalaf; Tevfik Bultan

    2010-01-01

    \\u000a \\u000a Stranger is an automata-based string analysis tool for finding and eliminating string-related security vulnerabilities in PHP applications.\\u000a Stranger uses symbolic forward and backward reachability analyses to compute the possible values that the string expressions can take\\u000a during program execution. Stranger can automatically (1) prove that an application is free from specified attacks or (2) generate vulnerability signatures that\\u000a characterize all

  20. SMART (Shop floor Modeling, Analysis and Reporting Tool) Project

    NASA Technical Reports Server (NTRS)

    Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

    1999-01-01

    This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description is given in the full documentation provided to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

  1. AstroStat-A VO tool for statistical analysis

    NASA Astrophysics Data System (ADS)

    Kembhavi, A. K.; Mahabal, A. A.; Kale, T.; Jagade, S.; Vibhute, A.; Garg, P.; Vaghmare, K.; Navelkar, S.; Agrawal, T.; Chattopadhyay, A.; Nandrekar, D.; Shaikh, M.

    2015-06-01

    AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards, thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu-driven interface. Behind the scenes, all analyses are done using the public-domain statistical software R, and the output is returned to the user in a neatly formatted form. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features: as a web service that can be run using any standard browser, and as an offline application. AstroStat provides an easy-to-use interface which allows both fetching data and performing powerful statistical analysis on them.

  2. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
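
    The kind of temporal processing described above can be illustrated with a minimal sketch (not TSPT itself): quality-flagged samples in a synthetic NDVI series are masked and the gaps smoothed with a NaN-aware moving median. The data and all names below are hypothetical.

    # Minimal sketch (not TSPT): mask low-quality observations in a synthetic
    # NDVI time series and smooth it with a NaN-aware moving median.
    import numpy as np

    def smooth_ndvi(ndvi, quality_ok, window=5):
        """Replace bad samples with NaN, then apply a NaN-aware moving median."""
        series = np.where(quality_ok, ndvi, np.nan)
        half = window // 2
        out = np.empty_like(series)
        for i in range(series.size):
            lo, hi = max(0, i - half), min(series.size, i + half + 1)
            chunk = series[lo:hi]
            out[i] = np.nanmedian(chunk) if np.any(np.isfinite(chunk)) else np.nan
        return out

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.arange(92)                          # ~3 months of daily composites
        truth = 0.5 + 0.2 * np.sin(2 * np.pi * t / 92)
        ndvi = truth + rng.normal(0, 0.02, t.size)
        quality_ok = rng.random(t.size) > 0.3      # ~30% cloudy or bad retrievals
        print(smooth_ndvi(ndvi, quality_ok)[:10])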

  3. Interactive Software Fault Analysis Tool for Operational Anomaly Resolution

    NASA Technical Reports Server (NTRS)

    Chen, Ken

    2002-01-01

    Resolving software operational anomalies frequently requires a significant amount of resources for software troubleshooting activities. The time required to identify a root cause of the anomaly in the software may lead to significant timeline impacts and in some cases, may extend to compromise of mission and safety objectives. An integrated tool that supports software fault analysis based on the observed operational effects of an anomaly could significantly reduce the time required to resolve operational anomalies; increase confidence for the proposed solution; identify software paths to be re-verified during regression testing; and, as a secondary product of the analysis, identify safety critical software paths.

  4. OSS tools in a heterogeneous environment for embedded systems modelling: an analysis of adoptions of XMI

    E-print Network

    Scacchi, Walt

    This paper analyses the exchange of information between OSS tools in a heterogeneous environment for embedded systems modelling, whether in a tool chain, for legacy reasons, or otherwise, and examines adoptions of XMI interchange for supporting OSS tool adoption to complement other tools in an embedded systems context.

  5. SDAT: analysis of 131mXe with 133Xe interference

    Microsoft Academic Search

    Steven R. Biegalski; Kendra M. Foltz Biegalski; Derek A. Haas

    2009-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed at The University of Texas at Austin. SDAT utilizes a standard spectrum technique for the analysis of β–γ coincidence spectra. Testing was performed on the software to compare the standard spectrum analysis technique with a region of interest (ROI) analysis technique. Experimentally produced standard spectra and sample data were produced at
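
    As a rough illustration of the standard-spectrum idea (not SDAT code, and using synthetic 1D spectra instead of the 2D β–γ coincidence spectra the tool analyzes), a measured spectrum can be unmixed into known standard spectra with nonnegative least squares and compared against a simple region-of-interest sum:

    # Illustrative sketch only: unmix a measured spectrum into known standard
    # spectra with nonnegative least squares, and compare with an ROI sum.
    import numpy as np
    from scipy.optimize import nnls

    def gaussian(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    channels = np.arange(256)
    # Hypothetical standard spectra for two interfering isotopes plus background.
    standards = np.column_stack([
        gaussian(channels, 80, 6),                    # isotope A
        gaussian(channels, 95, 6),                    # isotope B (overlaps A)
        np.ones_like(channels, dtype=float) * 0.02,   # flat background
    ])

    true_activities = np.array([120.0, 40.0, 500.0])
    rng = np.random.default_rng(1)
    measured = rng.poisson(standards @ true_activities).astype(float)

    # "Standard spectrum" analysis: whole-spectrum nonnegative least squares.
    activities, residual = nnls(standards, measured)

    # ROI analysis: just sum counts in a window around isotope A's peak.
    roi_counts = measured[70:91].sum()

    print("fitted activities:", activities)
    print("ROI counts around isotope A:", roi_counts)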

  6. Graphical Tools for Network Meta-Analysis in STATA

    PubMed Central

    Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results. PMID:24098547

  7. Tool for bonded optical element thermal stability analysis

    NASA Astrophysics Data System (ADS)

    Klotz, Gregory L.

    2011-09-01

    An analytical tool is presented which supports the opto-mechanical design of bonded optical elements. Given the mounting requirements from the optical engineer, the alignment stability and optical stresses in bonded optics can be optimized for the adhesive and housing material properties. While a perfectly athermalized mount is desirable, it is not realistic. The tool permits evaluation of element stability and stress over the expected thermal range at nominal, or worst case, achievable assembly and manufacturing tolerances. Selection of the most appropriate mount configuration and materials, which maintain the optical engineer's design, is then possible. The tool is based on a stress-strain analysis using Hooke's Law in the worst case plane through the optic centerline. The optimal bond line is determined for the selected adhesive, housing and given optic materials using the basic athermalization equation. Since a mounting solution is expected to be driven close to an athermalized design, the stress variations are considered linearly related to strain. A review of the equation set, the tool input and output capabilities and formats and an example will be discussed.
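
    As a hedged illustration of the kind of calculation such a tool automates (not the paper's own equation set), the first-order athermal bond-gap relation often quoted in opto-mechanical design, h = r (a_m - a_g) / (a_a - a_m), can be evaluated directly; the material values below are placeholders.

    # Hedged illustration: first-order athermal radial bond gap, where r is the
    # optic radius and a_g, a_m, a_a are the CTEs of glass, housing metal and
    # adhesive. Refined forms (e.g., with Poisson-ratio terms) also exist.

    def athermal_bond_gap(r_optic_mm, cte_glass, cte_metal, cte_adhesive):
        """Radial bond thickness (mm) that nominally cancels thermal strain."""
        return r_optic_mm * (cte_metal - cte_glass) / (cte_adhesive - cte_metal)

    if __name__ == "__main__":
        # Placeholder CTEs in 1/K: BK7-like glass, aluminum housing, RTV-like adhesive.
        gap = athermal_bond_gap(r_optic_mm=25.0,
                                cte_glass=7.1e-6,
                                cte_metal=23.6e-6,
                                cte_adhesive=250e-6)
        print(f"nominal athermal bond gap: {gap:.3f} mm")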

  8. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    EPA Science Inventory

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  9. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model, and also makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, applications where the thermal modeling is "in the loop" with sensitivity analysis, optimization and optical performance drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST) are presented.
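
    A toy steady-state solve of a linear conduction network of the sort described above (assemble G from node-to-node conductances, then solve G T = q for the free nodes) might look as follows; this is an illustration, not IMOS code, and the network values are invented.

    # Toy steady-state thermal network solve: assemble the conductance matrix
    # from node-to-node links and fixed-temperature boundary nodes, then solve
    # G T = q for the free node temperatures.
    import numpy as np

    def solve_network(n_nodes, conductances, loads, fixed):
        """conductances: list of (i, j, G) links in W/K; loads: {node: W};
        fixed: {node: temperature in K}."""
        G = np.zeros((n_nodes, n_nodes))
        q = np.zeros(n_nodes)
        for i, j, g in conductances:
            G[i, i] += g; G[j, j] += g
            G[i, j] -= g; G[j, i] -= g
        for node, power in loads.items():
            q[node] += power
        free = [k for k in range(n_nodes) if k not in fixed]
        # Move the known (fixed) temperatures to the right-hand side.
        for k in free:
            for b, Tb in fixed.items():
                q[k] -= G[k, b] * Tb
        T = np.zeros(n_nodes)
        for b, Tb in fixed.items():
            T[b] = Tb
        T[free] = np.linalg.solve(G[np.ix_(free, free)], q[free])
        return T

    if __name__ == "__main__":
        # Three nodes in a chain: radiator held at 270 K, 2 W dissipated at node 2.
        T = solve_network(3,
                          conductances=[(0, 1, 0.5), (1, 2, 0.5)],
                          loads={2: 2.0},
                          fixed={0: 270.0})
        print(T)   # temperatures rise moving away from the radiator: 270, 274, 278 K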

  10. Networking Sensor Observations, Forecast Models & Data Analysis Tools

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

    2009-12-01

    This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/MAPL, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We describe the process for enabling the fire location, smoke forecast, smoke observation, and population statistics services to be registered with the GEOSS registry and made findable through the GEOSS Clearinghouse. The fusion of data sources and different web service interfaces illustrates the agility in using standard interfaces and helps define the type of input and output interfaces needed to connect models and analysis tools within sensor webs.
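
    For illustration, a WCS 1.0.0 GetCoverage request of the kind used to pull the smoke-forecast grid can be issued with standard OGC key-value parameters; the endpoint URL and coverage name below are hypothetical, not the services used in the pilot.

    # Sketch of a WCS 1.0.0 GetCoverage request. Only the standard OGC
    # key-value parameters are assumed; the URL and coverage id are made up.
    import requests

    WCS_ENDPOINT = "https://example.org/wcs"      # hypothetical service URL

    params = {
        "SERVICE": "WCS",
        "VERSION": "1.0.0",
        "REQUEST": "GetCoverage",
        "COVERAGE": "smoke_pm25_forecast",        # hypothetical coverage id
        "CRS": "EPSG:4326",
        "BBOX": "-125,30,-110,45",                # lon/lat box over the fire region
        "TIME": "2009-08-30T00:00:00Z",
        "WIDTH": "256",
        "HEIGHT": "256",
        "FORMAT": "GeoTIFF",
    }

    response = requests.get(WCS_ENDPOINT, params=params, timeout=60)
    response.raise_for_status()
    with open("smoke_forecast.tif", "wb") as f:
        f.write(response.content)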

  11. Power System Analysis Software Package (PSASP)-an integrated power system analysis tool

    Microsoft Academic Search

    Wu Zhongxi; Zhou Xiaoxin

    1998-01-01

    In this paper, an integrated power system analysis tool called Power System Analysis Software Package (PSASP) is presented. A brief description of every constituent module of PSASP is given first, then user-defined (UD) modeling function and user program interface (UPI) are introduced further. Finally, the new development of PSASP, power system analysis software platform, is illustrated through a diagram

  12. Least-squares (LS) deconvolution of a series of overlapping cortical auditory evoked potentials: a simulation and experimental study

    NASA Astrophysics Data System (ADS)

    Bardy, Fabrice; Van Dun, Bram; Dillon, Harvey; Cowan, Robert

    2014-08-01

    Objective. To evaluate the viability of disentangling a series of overlapping ‘cortical auditory evoked potentials’ (CAEPs) elicited by different stimuli using least-squares (LS) deconvolution, and to assess the adaptation of CAEPs for different stimulus onset-asynchronies (SOAs). Approach. Optimal aperiodic stimulus sequences were designed by controlling the condition number of matrices associated with the LS deconvolution technique. First, theoretical considerations of LS deconvolution were assessed in simulations in which multiple artificial overlapping responses were recovered. Second, biological CAEPs were recorded in response to continuously repeated stimulus trains containing six different tone-bursts with frequencies 8, 4, 2, 1, 0.5, 0.25 kHz separated by SOAs jittered around 150 (120-185), 250 (220-285) and 650 (620-685) ms. The control condition had a fixed SOA of 1175 ms. In a second condition, using the same SOAs, trains of six stimuli were separated by a silence gap of 1600 ms. Twenty-four adults with normal hearing (<20 dB HL) were assessed. Main results. Results showed disentangling of a series of overlapping responses using LS deconvolution on simulated waveforms as well as on real EEG data. The use of rapid presentation and LS deconvolution did not, however, allow the recovered CAEPs to have a higher signal-to-noise ratio than for slowly presented stimuli. The LS deconvolution technique enables the analysis of a series of overlapping responses in EEG. Significance. LS deconvolution is a useful technique for the study of adaptation mechanisms of CAEPs for closely spaced stimuli whose characteristics change from stimulus to stimulus. High-rate presentation is necessary to develop an understanding of how the auditory system encodes natural speech or other intrinsically high-rate stimuli.
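
    The core LS deconvolution idea can be sketched as follows (a simplified illustration, not the authors' implementation): the recording is modeled as the sum of each unknown response placed at its stimulus onsets, y = S r + noise, and r is recovered by ordinary least squares. The onsets and responses below are synthetic.

    # Least-squares deconvolution of overlapping responses: stack one onset
    # matrix per stimulus type and solve the joint least-squares problem.
    import numpy as np

    def onset_matrix(onsets, n_samples, resp_len):
        """Matrix that copies an unknown response of length resp_len to each onset."""
        S = np.zeros((n_samples, resp_len))
        for t0 in onsets:
            for j in range(resp_len):
                if t0 + j < n_samples:
                    S[t0 + j, j] += 1.0
        return S

    rng = np.random.default_rng(2)
    fs, resp_len, n_samples = 100, 60, 4000       # 100 Hz sampling, 600 ms responses

    # Two interleaved stimulus types with jittered, overlapping onsets.
    onsets_a = np.cumsum(rng.integers(30, 50, size=40))
    onsets_b = onsets_a + rng.integers(10, 25, size=40)
    onsets_a = onsets_a[onsets_a < n_samples - resp_len]
    onsets_b = onsets_b[onsets_b < n_samples - resp_len]

    t = np.arange(resp_len) / fs
    true_a = np.sin(2 * np.pi * 3 * t) * np.exp(-5 * t)
    true_b = -0.7 * np.sin(2 * np.pi * 5 * t) * np.exp(-6 * t)

    S = np.hstack([onset_matrix(onsets_a, n_samples, resp_len),
                   onset_matrix(onsets_b, n_samples, resp_len)])
    y = S @ np.concatenate([true_a, true_b]) + rng.normal(0, 0.05, n_samples)

    est, *_ = np.linalg.lstsq(S, y, rcond=None)
    est_a, est_b = est[:resp_len], est[resp_len:]
    print("max recovery error for response A:", np.abs(est_a - true_a).max())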

  13. Perceived Image Quality Improvements from the Application of Image Deconvolution to Retinal Images from an Adaptive Optics Fundus Imager

    NASA Astrophysics Data System (ADS)

    Soliz, P.; Nemeth, S. C.; Erry, G. R. G.; Otten, L. J.; Yang, S. Y.

    Aim: The objective of this project was to apply an image restoration methodology based on wavefront measurements obtained with a Shack-Hartmann sensor and to evaluate the restored image quality based on medical criteria. Methods: Implementing an adaptive optics (AO) technique, a fundus imager was used to achieve low-order correction to images of the retina. The high-order correction was provided by deconvolution. A Shack-Hartmann wavefront sensor measures aberrations. The wavefront measurement is the basis for activating a deformable mirror. Image restoration to remove remaining aberrations is achieved by direct deconvolution using the point spread function (PSF) or a blind deconvolution. The PSF is estimated using measured wavefront aberrations. Direct application of classical deconvolution methods such as inverse filtering, Wiener filtering or iterative blind deconvolution (IBD) to the AO retinal images obtained from the adaptive optical imaging system is not satisfactory because of the very large image size, difficulty in modeling the system noise, and inaccuracy in PSF estimation. Our approach combines direct and blind deconvolution to exploit available system information, avoid non-convergence, and time-consuming iterative processes. Results: The deconvolution was applied to human subject data and the resulting restored images were compared by a trained ophthalmic researcher. Qualitative analysis showed significant improvements. Neovascularization can be visualized with the adaptive optics device that cannot be resolved with the standard fundus camera. The individual nerve fiber bundles are easily resolved as are melanin structures in the choroid. Conclusion: This project demonstrated that computer-enhanced, adaptive optic images have greater detail of anatomical and pathological structures.
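
    One of the classical methods mentioned above, Wiener deconvolution with a known PSF, can be sketched in a few lines (illustrative only; this is not the combined direct/blind pipeline the authors describe, and the image and PSF are synthetic):

    # Classical frequency-domain Wiener deconvolution with a known PSF.
    import numpy as np

    def wiener_deconvolve(blurred, psf, nsr=1e-2):
        """Wiener filter; nsr is an assumed noise-to-signal power ratio."""
        H = np.fft.fft2(psf, s=blurred.shape)
        G = np.fft.fft2(blurred)
        W = np.conj(H) / (np.abs(H) ** 2 + nsr)
        return np.real(np.fft.ifft2(W * G))

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        image = np.zeros((128, 128)); image[40:90, 60:65] = 1.0   # a "vessel"
        x = np.arange(-7, 8)
        psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / (2 * 2.0 ** 2))
        psf /= psf.sum()
        blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
        blurred += rng.normal(0, 0.01, blurred.shape)
        restored = wiener_deconvolve(blurred, psf)
        print("RMS error vs truth (blurred, restored):",
              np.sqrt(np.mean((blurred - image) ** 2)),
              np.sqrt(np.mean((restored - image) ** 2)))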

  14. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  15. Coastal Online Analysis and Synthesis Tool 2.0 (COAST)

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

    2009-01-01

    The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used NASA-developed open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); a .NET and C# version is used for development. It leverages World Wind community shared code samples, and COAST 2.0 enhancement direction is based on coastal science community feedback and a needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into multi-layered, multi-temporal spatial context.

  16. TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS

    SciTech Connect

    Da Rio, Nicola [European Space Agency, Keplerlaan 1, 2200-AG Noordwijk (Netherlands); Robberto, Massimo, E-mail: ndario@rssd.esa.int [Space Telescope Science Institute, 3700 San Martin Dr., Baltimore, MD 21218 (United States)

    2012-12-01

    We present the Tool for Astrophysical Data Analysis (TA-DA), new software aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

  17. GLIDER: Free tool for imagery data visualization, analysis and mining

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Graves, S. J.; Berendes, T.; Maskey, M.; Chidambaram, C.; Hogan, P.; Gaskin, T.

    2009-12-01

    Satellite imagery can be analyzed to extract thematic information, which has increasingly been used as a source of information for making policy decisions. The uses of such thematic information can vary from military applications such as detecting assets of interest to science applications such as characterizing land-use/land cover change at local, regional and global scales. However, extracting thematic information using satellite imagery is a non-trivial task. It requires a user to preprocess the data by applying operations for radiometric and geometric corrections. The user also needs to be able to visualize the data and apply different image enhancement operations to digitally improve the images to identify subtle information that might be otherwise missed. Finally, the user needs to apply different information extraction algorithms to the imagery to obtain the thematic information. At present, there are limited tools that provide users with the capability to easily extract and exploit the information contained within the satellite imagery. This presentation will present GLIDER, a free software tool addressing this void. GLIDER provides users with an easy-to-use tool to visualize, analyze and mine satellite imagery. GLIDER allows users to visualize and analyze satellite imagery in its native sensor view, an important capability because any transformation to either a geographic coordinate system or any projected coordinate system entails spatial and intensity interpolation and hence loss of information. GLIDER allows users to perform their analysis in the native sensor view without any loss of information. GLIDER provides users with a full suite of image processing algorithms that can be used to enhance the satellite imagery. It also provides pattern recognition and data mining algorithms for information extraction. GLIDER allows its users to project satellite data and the analysis/mining results onto a globe and overlay additional data layers. Traditional analysis tools generally do not provide a good interface between visualization and analysis, especially a 3D view, and GLIDER fills this gap. This feature gives the users extremely useful spatial context for their data and analysis/mining results. This presentation will demonstrate the latest version of GLIDER and also describe its supporting documentation such as video tutorials, online resources, etc.

  18. Spatial deconvolution of spectropolarimetric data: an application to quiet Sun magnetic elements

    NASA Astrophysics Data System (ADS)

    Quintero Noda, C.; Asensio Ramos, A.; Orozco Suárez, D.; Ruiz Cobo, B.

    2015-07-01

    Context. One of the difficulties in extracting reliable information about the thermodynamical and magnetic properties of solar plasmas from spectropolarimetric observations is the presence of light dispersed inside the instruments, known as stray light. Aims: We aim to analyze quiet Sun observations after the spatial deconvolution of the data. We examine the validity of the deconvolution process with noisy data as we analyze the physical properties of quiet Sun magnetic elements. Methods: We used a regularization method that decouples the Stokes inversion from the deconvolution process, so that large maps can be quickly inverted without much additional computational burden. We applied the method on Hinode quiet Sun spectropolarimetric data. We examined the spatial and polarimetric properties of the deconvolved profiles, comparing them with the original data. After that, we inverted the Stokes profiles using the Stokes Inversion based on Response functions (SIR) code, which allows us to obtain the optical depth dependence of the atmospheric physical parameters. Results: The deconvolution process increases the contrast of continuum images and makes the magnetic structures sharper. The deconvolved Stokes I profiles reveal the presence of the Zeeman splitting while the Stokes V profiles significantly change their amplitude. The area and amplitude asymmetries of these profiles increase in absolute value after the deconvolution process. We inverted the original Stokes profiles from a magnetic element and found that the magnetic field intensity reproduces the overall behavior of theoretical magnetic flux tubes, that is, the magnetic field lines are vertical in the center of the structure and start to fan out when we move away from the center of the magnetic element. The magnetic field vector inferred from the deconvolved Stokes profiles also mimics a magnetic flux tube, but in this case we found stronger field strengths, and the gradients along the line-of-sight are larger for the magnetic field intensity and for its inclination. Moreover, the discontinuity between the magnetic and non-magnetic environment in the flux tube gets sharper. Conclusions: The deconvolution process used in this paper reveals information that the smearing induced by the point spread function (PSF) of the telescope hides. Additionally, the deconvolution is done with a low computational load, making it appealing for use in the analysis of large data sets. A copy of the IDL code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/579/A3

  19. Matrix: A statistical method and software tool for linguistic analysis through corpus comparison

    E-print Network

    Rayson, Paul

    Matrix: a statistical method and software tool for linguistic analysis through corpus comparison. A thesis submitted to Lancaster University describing the development of a new kind of method and tool (Matrix) for advancing the statistical analysis of electronic corpora.

  20. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  1. The Precision Formation Flying Integrated Analysis Tool (PFFIAT)

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

    2004-01-01

    Several space missions presently in the concept phase (e.g. Stellar Imager, Sub-millimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

  2. Cash flow analysis: a tool for the dairy farmer 

    E-print Network

    McMorrough, Mark D

    1985-01-01

    CASH FLOW ANALYSIS: A TOOL FOR THE DAIRY FARMER. A 685 Paper by Mark D. McMorrough, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of MASTER OF AGRICULTURE. Advisor: Dr... "the most severe crisis in our nation's history since the Great Depression." Everyone involved in agriculture is feeling its impact in one way or another, and dairy farmers are no exception. Since dairy farming is quite capital intensive, the high...

  3. SNooPy: Type Ia supernovae analysis tools

    NASA Astrophysics Data System (ADS)

    Burns, Christopher R.; Stritzinger, Maximilian; Phillips, M. M.; Kattner, ShiAnne; Persson, S. E.; Madore, Barry F.; Freedman, Wendy L.; Boldt, Luis; Campillay, Abdo; Contreras, Carlos; Folatelli, Gaston; Gonzalez, Sergio; Krzeminski, Wojtek; Morrell, Nidia; Salgado, Francisco; Suntzeff, Nicholas B.

    2015-05-01

    The SNooPy package (also known as SNpy), written in Python, contains tools for the analysis of Type Ia supernovae. It offers interactive plotting of light-curve data and models (and spectra), computation of reddening laws and K-corrections, LM non-linear least-squares fitting of light-curve data, and various types of spline fitting, including Dierckx and tension splines. The package also includes a SNIa lightcurve template generator in the CSP passbands, estimates of Milky Way extinction, and a module for dealing with filters and spectra.

  4. Image analysis tools and emerging algorithms for expression proteomics

    PubMed Central

    English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

    2012-01-01

    Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

  5. Ultrasound medical image deconvolution using CLEAN algorithm

    E-print Network

    Boyer, Edmond

    L.-T. Chira, J.-M. Girault, et al. The reconstruction problem of ultrasound medical images using a blind deconvolution algorithm has been recognized ... tissue scatterer number. Medical diagnostics using ultrasound has been intensively used ...
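
    The classical CLEAN idea referred to in the title, iteratively locating the strongest peak in the residual and subtracting a scaled, shifted copy of the point spread function, can be sketched generically in 1D (this is not the paper's algorithm; the data and PSF are synthetic):

    # Generic 1D CLEAN loop: accumulate the removed components in a "clean"
    # reflectivity estimate while the residual still contains strong peaks.
    import numpy as np

    def clean_1d(signal, psf, gain=0.1, n_iter=500, threshold=1e-3):
        residual = signal.astype(float).copy()
        components = np.zeros_like(residual)
        center = np.argmax(np.abs(psf))
        for _ in range(n_iter):
            peak = np.argmax(np.abs(residual))
            if abs(residual[peak]) < threshold:
                break
            amp = gain * residual[peak]
            components[peak] += amp
            # Subtract the PSF aligned so its maximum sits on the detected peak.
            for j, h in enumerate(psf):
                k = peak + j - center
                if 0 <= k < residual.size:
                    residual[k] -= amp * h
        return components, residual

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        x = np.arange(-20, 21)
        psf = np.cos(2 * np.pi * x / 8) * np.exp(-x ** 2 / 60.0)   # pulse-like PSF
        truth = np.zeros(300); truth[[90, 100, 180]] = [1.0, 0.6, 0.8]
        observed = np.convolve(truth, psf, mode="same") + rng.normal(0, 0.01, 300)
        comps, res = clean_1d(observed, psf)
        print("recovered component locations:", np.nonzero(np.abs(comps) > 0.1)[0])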

  6. Sparse Representation-based Image Deconvolution by Iterative Thresholding

    Microsoft Academic Search

    M. J. Fadili

    2007-01-01

    Image deconvolution algorithms with overcomplete sparse representations and fast iterative thresholding methods are presented. The image to be recovered is assumed to be sparsely represented in a redundant dictionary of transforms. These transforms are chosen to offer a wider range of generating atoms, allowing more flexibility in image representation and adaptivity to its morphological content. The deconvolution inverse problem
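
    A hedged sketch of iterative soft thresholding (ISTA) for deconvolution is shown below; for simplicity the sparsifying dictionary is the identity (the signal itself is sparse), whereas the paper works with redundant transform dictionaries. The signal and PSF are synthetic.

    # ISTA for 1D deconvolution: minimize 0.5*||h*x - y||^2 + lam*||x||_1
    # by gradient steps on the data term followed by soft thresholding.
    import numpy as np

    def soft_threshold(x, tau):
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def ista_deconvolve(y, psf, lam=0.05, n_iter=300):
        H = np.fft.fft(psf, n=y.size)
        step = 1.0 / np.max(np.abs(H)) ** 2       # 1 / Lipschitz constant of the data term
        x = np.zeros_like(y)
        for _ in range(n_iter):
            residual = np.real(np.fft.ifft(H * np.fft.fft(x))) - y
            grad = np.real(np.fft.ifft(np.conj(H) * np.fft.fft(residual)))
            x = soft_threshold(x - step * grad, step * lam)
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(5)
        truth = np.zeros(256); truth[[50, 60, 150]] = [1.0, -0.7, 0.5]
        psf = np.exp(-np.arange(-10, 11) ** 2 / 8.0); psf /= psf.sum()
        y = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(psf, n=256)))
        y += rng.normal(0, 0.01, 256)
        x_hat = ista_deconvolve(y, psf)
        print("support found:", np.nonzero(np.abs(x_hat) > 0.1)[0])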

  7. Blind deconvolution for high-resolution confocal scanning laser ophthalmoscopy

    Microsoft Academic Search

    B. Vohnsen; P. Artal

    2005-01-01

    We investigate the potential of image deconvolution techniques, either in combination or as a substitute for adaptive optics, in a high-resolution confocal scanning laser ophthalmoscope (SLO). After reviewing the validity of standard hypotheses and the a priori information, we implement two deconvolution algorithms to be applied to experimental retinal images recorded with our own high-resolution research SLO. Despite the important

  8. VisIt: Interactive Parallel Visualization and Graphical Analysis Tool

    NASA Astrophysics Data System (ADS)

    Department Of Energy (DOE) Advanced Simulation; Computing Initiative (ASCI)

    2011-03-01

    VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

  9. Java Analysis Tools for Element Production Calculations in Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Lingerfelt, E.; Hix, W.; Guidry, M.; Smith, M.

    2002-12-01

    We are developing a set of extendable, cross-platform tools and interfaces using Java and vector graphic technologies such as SVG and SWF to facilitate element production calculations in computational astrophysics. The Java technologies are customizable and portable, and can be utilized as stand-alone applications or distributed across a network. These tools, which have broad applications in general scientific visualization, are currently being used to explore and analyze a large library of nuclear reaction rates and visualize results of explosive nucleosynthesis calculations with compact, high quality vector graphics. The facilities for reading and plotting nuclear reaction rates and their components from a network or library permit the user to easily include new rates and compare and adjust current ones. Sophisticated visualization and graphical analysis tools offer the ability to view results in an interactive, scalable vector graphics format, which leads to a dramatic (ten-fold) reduction in visualization file sizes while maintaining high visual quality and interactive control. ORNL Physics Division is managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  10. PyRAT (python radiography analysis tool): overview

    SciTech Connect

    Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory

    2011-01-14

    PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) optimization based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method of linear inverse problems to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define the feasible region and discrete neighbors for the MVO problem - initial data analysis + material library → a priori knowledge; and (6) using the NOMAD (C++ version) software in the object.
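
    Item (4) refers to Tikhonov regularization of a linear inverse problem; a generic sketch (not PyRAT code) minimizing ||A x - b||^2 + alpha^2 ||x||^2 via the normal equations is given below, with an invented ill-conditioned operator standing in for the radiographic forward model.

    # Generic Tikhonov-regularized solution of a linear inverse problem.
    import numpy as np

    def tikhonov_solve(A, b, alpha):
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + (alpha ** 2) * np.eye(n), A.T @ b)

    if __name__ == "__main__":
        rng = np.random.default_rng(6)
        n = 80
        i = np.arange(n)
        A = np.exp(-((i[:, None] - i[None, :]) ** 2) / 10.0)   # ill-conditioned blur
        x_true = np.sin(2 * np.pi * i / n)
        b = A @ x_true + rng.normal(0, 1e-3, n)
        for alpha in (0.0, 1e-3, 1e-1):
            x = tikhonov_solve(A, b, alpha) if alpha > 0 else np.linalg.lstsq(A, b, rcond=None)[0]
            print(f"alpha={alpha:g}  error={np.linalg.norm(x - x_true):.3f}")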

  11. Stacks: an analysis tool set for population genomics

    PubMed Central

    CATCHEN, JULIAN; HOHENLOHE, PAUL A.; BASSHAM, SUSAN; AMORES, ANGEL; CRESKO, WILLIAM A.

    2014-01-01

    Massively parallel short-read sequencing technologies, coupled with powerful software platforms, are enabling investigators to analyse tens of thousands of genetic markers. This wealth of data is rapidly expanding and allowing biological questions to be addressed with unprecedented scope and precision. The sizes of the data sets are now posing significant data processing and analysis challenges. Here we describe an extension of the Stacks software package to efficiently use genotype-by-sequencing data for studies of populations of organisms. Stacks now produces core population genomic summary statistics and SNP-by-SNP statistical tests. These statistics can be analysed across a reference genome using a smoothed sliding window. Stacks also now provides several output formats for several commonly used downstream analysis packages. The expanded population genomics functions in Stacks will make it a useful tool to harness the newest generation of massively parallel genotyping data for ecological and evolutionary genetics. PMID:23701397

  12. Topological Tools For The Analysis Of Active Region Filament Stability

    NASA Astrophysics Data System (ADS)

    DeLuca, Edward E.; Savcheva, A.; van Ballegooijen, A.; Pariat, E.; Aulanier, G.; Su, Y.

    2012-05-01

    The combination of accurate NLFFF models and high resolution MHD simulations allows us to study the changes in stability of an active region filament before a CME. Our analysis strongly supports the following sequence of events leading up to the CME: first there is a build up of magnetic flux in the filament through flux cancellation beneath a developing flux rope; as the flux rope develops a hyperbolic flux tube (HFT) forms beneath the flux rope; reconnection across the HFT raises the flux rope while adding additional flux to it; the eruption is triggered when the flux rope becomes torus-unstable. The work applies topological analysis tools that have been developed over the past decade and points the way for future work on the critical problem of CME initiation in solar active regions. We will present the uses of this approach, current limitations and future prospects.

  13. Message Correlation Analysis Tool for NOvA

    NASA Astrophysics Data System (ADS)

    Lu, Qiming; Biery, Kurt A.; Kowalkowski, James B.

    2012-12-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
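
    A toy illustration of the general idea (not the Message Analyzer DSL or its implementation): match log messages against regex patterns and fire a rule when correlated messages occur within a time window. All rule names, patterns and log lines below are made up.

    # Rule-based correlation of log messages within a time window.
    import re
    from collections import deque

    RULES = [
        # (name, pattern for side A, pattern for side B, window in seconds)
        ("buffer_overrun_cascade",
         re.compile(r"buffer overflow on node (\d+)"),
         re.compile(r"event fragment dropped"),
         5.0),
    ]

    def correlate(log_lines):
        """log_lines: iterable of (timestamp_seconds, component, message)."""
        recent_a = {name: deque() for name, *_ in RULES}
        alerts = []
        for ts, component, message in log_lines:
            for name, pat_a, pat_b, window in RULES:
                if pat_a.search(message):
                    recent_a[name].append(ts)
                if pat_b.search(message):
                    # Drop stale A-side matches, then check for a correlated pair.
                    while recent_a[name] and ts - recent_a[name][0] > window:
                        recent_a[name].popleft()
                    if recent_a[name]:
                        alerts.append((name, ts))
        return alerts

    if __name__ == "__main__":
        log = [(0.0, "dcm-12", "buffer overflow on node 12"),
               (2.5, "builder", "event fragment dropped"),
               (30.0, "builder", "event fragment dropped")]
        print(correlate(log))    # only the first pair falls inside the 5 s window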

  14. The Lagrangian analysis tool LAGRANTO - version 2.0

    NASA Astrophysics Data System (ADS)

    Sprenger, M.; Wernli, H.

    2015-02-01

    Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions; e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.
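
    The selection criteria have a simple structure that is easy to mimic; the toy interpreter below applies a "GT:PV:2"-style criterion to trajectory data held in Python dictionaries (illustrative only, not LAGRANTO's own implementation).

    # Toy interpreter for "GT:PV:2"-style criteria: keep a trajectory if the
    # named variable satisfies the comparison at any point along the path.
    OPS = {
        "GT": lambda v, t: v > t,
        "LT": lambda v, t: v < t,
        "GE": lambda v, t: v >= t,
        "LE": lambda v, t: v <= t,
    }

    def select(trajectories, criterion):
        """trajectories: list of dicts mapping variable name -> list of values."""
        op_name, var, threshold = criterion.split(":")
        op, threshold = OPS[op_name], float(threshold)
        return [traj for traj in trajectories
                if any(op(value, threshold) for value in traj[var])]

    if __name__ == "__main__":
        trajs = [{"PV": [0.5, 1.2, 2.8]},    # crosses the 2 PVU surface
                 {"PV": [0.3, 0.4, 0.6]}]
        print(len(select(trajs, "GT:PV:2")))   # -> 1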

  15. CRITICA: coding region identification tool invoking comparative analysis

    NASA Technical Reports Server (NTRS)

    Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

    1999-01-01

    Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
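
    The noncomparative dicodon-bias evidence can be illustrated with a toy log-likelihood score over in-frame hexanucleotides; the frequency tables below are placeholder sequences, not CRITICA's iteratively trained ones, and the pseudocount handling is deliberately simplified.

    # Toy dicodon (hexanucleotide) bias score: log-likelihood ratio of each
    # in-frame hexamer under "coding" vs. "background" frequency tables.
    import math
    from collections import Counter

    def hexamer_counts(sequences, frame=0):
        counts = Counter()
        for seq in sequences:
            for i in range(frame, len(seq) - 5, 3):
                counts[seq[i:i + 6]] += 1
        return counts

    def dicodon_score(candidate, coding_counts, background_counts, pseudocount=1.0):
        coding_total = sum(coding_counts.values())
        background_total = sum(background_counts.values())
        score = 0.0
        for i in range(0, len(candidate) - 5, 3):
            hexamer = candidate[i:i + 6]
            p_cod = (coding_counts[hexamer] + pseudocount) / (coding_total + pseudocount)
            p_bg = (background_counts[hexamer] + pseudocount) / (background_total + pseudocount)
            score += math.log(p_cod / p_bg)
        return score

    if __name__ == "__main__":
        coding = hexamer_counts(["ATGGCTGCTAAAGCTGCTTAA", "ATGGCTAAAGCTGCTGCTTAA"])
        background = hexamer_counts(["TTTTTATATATTTTATATTTAT"])
        print(dicodon_score("ATGGCTGCTGCTAAA", coding, background))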

  16. A comparison of data envelopment analysis and ratio analysis as tools for performance assessment

    Microsoft Academic Search

    R. G. Dyson; A. Boussofiane

    1996-01-01

    This paper compares data envelopment analysis (DEA) and ratio analysis as alternative tools for assessing the performance of organisational units such as bank branches and schools. Such units typically use one or more resources to secure one or more outputs, the inputs and\\/or outputs being possibly incommensurate. The assessment of District Health Authorities in England on the provision of perinatal

  17. AIDA: An Adaptive Image Deconvolution Algorithm

    NASA Astrophysics Data System (ADS)

    Hom, Erik; Marchis, F.; Lee, T. K.; Haase, S.; Agard, D. A.; Sedat, J. W.

    2007-10-01

    We recently described an adaptive image deconvolution algorithm (AIDA) for myopic deconvolution of multi-frame and three-dimensional data acquired through astronomical and microscopic imaging [Hom et al., J. Opt. Soc. Am. A 24, 1580 (2007)]. AIDA is a reimplementation and extension of the MISTRAL method developed by Mugnier and co-workers and shown to yield object reconstructions with excellent edge preservation and photometric precision [J. Opt. Soc. Am. A 21, 1841 (2004)]. Written in Numerical Python with calls to a robust constrained conjugate gradient method, AIDA has significantly improved run times over the original MISTRAL implementation. AIDA includes a scheme to automatically balance maximum-likelihood estimation and object regularization, which significantly decreases the amount of time and effort needed to generate satisfactory reconstructions. Here, we present a gallery of results demonstrating the effectiveness of AIDA in processing planetary science images acquired using adaptive-optics systems. Offered as an open-source alternative to MISTRAL, AIDA is available for download and further development at: http://msg.ucsf.edu/AIDA. This work was supported in part by the W. M. Keck Observatory, the National Institutes of Health, NASA, the National Science Foundation Science and Technology Center for Adaptive Optics at UC-Santa Cruz, and the Howard Hughes Medical Institute.

  18. Structure preserving color deconvolution for immunohistochemistry images

    NASA Astrophysics Data System (ADS)

    Chen, Ting; Srinivas, Chukka

    2015-03-01

    Immunohistochemistry (IHC) staining is an important technique for the detection of one or more biomarkers within a single tissue section. In digital pathology applications, the correct unmixing of the tissue image into its individual constituent dyes for each biomarker is a prerequisite for accurate detection and identification of the underlying cellular structures. A popular technique thus far is the color deconvolution method proposed by Ruifrok et al. [1]. However, Ruifrok's method independently estimates the individual dye contributions at each pixel, which potentially leads to "holes and cracks" in the cells in the unmixed images. This is clearly inadequate since strong spatial dependencies exist in the tissue images, which contain rich cellular structures. In this paper, we formulate the unmixing algorithm into a least-squares framework of image patches, and propose a novel color deconvolution method which explicitly incorporates the spatial smoothness and structure continuity constraint into a neighborhood graph regularizer. An analytical closed-form solution to the cost function is derived for this algorithm for fast implementation. The algorithm is evaluated on a clinical data set containing a number of 3,3-Diaminobenzidine (DAB) and hematoxylin (HTX) stained IHC slides and demonstrates better unmixing results than the existing strategy.
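
    The per-pixel baseline that the paper improves on, Ruifrok-style unmixing in optical-density space, can be sketched as follows; the stain vectors and test image below are illustrative values, not calibrated ones from the paper.

    # Baseline per-pixel color deconvolution: convert RGB to optical density,
    # then multiply by the inverse of the stain matrix (rows = stain OD vectors).
    import numpy as np

    def color_deconvolve(rgb, stain_matrix):
        """rgb: HxWx3 float image in [0,1]; returns per-pixel stain concentrations."""
        od = -np.log10(np.clip(rgb, 1e-6, 1.0))          # optical density per channel
        return od.reshape(-1, 3) @ np.linalg.inv(stain_matrix)

    if __name__ == "__main__":
        # Rows: hematoxylin, DAB, and a residual channel (illustrative, unit-normalized).
        stains = np.array([[0.65, 0.70, 0.29],
                           [0.27, 0.57, 0.78],
                           [0.71, 0.42, 0.56]])
        stains /= np.linalg.norm(stains, axis=1, keepdims=True)
        rgb = np.full((4, 4, 3), 0.8)
        rgb[1:3, 1:3] = [0.35, 0.25, 0.45]               # a darker "nucleus" patch
        concentrations = color_deconvolve(rgb, stains)
        print(concentrations.reshape(4, 4, 3)[1, 1])     # stain amounts at one pixel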

  19. EXPERT TOOLS FOR QSAR ANALYSIS AND LEAD OPTIMIZATION

    E-print Network

    Ferreira, Márcia M. C.

    The basic tools needed to build powerful, predictive models of biological activity (or any other property) from molecular structure are provided in SYBYL's QSAR module. These include molecular field generation tools, least-squares (PLS, PCA and SIMCA) and non-linear (hierarchical clustering) analysis tools. The most powerful of these techniques can

  20. Limited-memory scaled gradient projection methods for real-time image deconvolution in microscopy

    NASA Astrophysics Data System (ADS)

    Porta, F.; Zanella, R.; Zanghirati, G.; Zanni, L.

    2015-04-01

    Gradient projection methods have given rise to effective tools for image deconvolution in several relevant areas, such as microscopy, medical imaging and astronomy. Due to the large scale of the optimization problems arising in nowadays imaging applications and to the growing request of real-time reconstructions, an interesting challenge to be faced consists in designing new acceleration techniques for the gradient schemes, able to preserve their simplicity and low computational cost of each iteration. In this work we propose an acceleration strategy for a state-of-the-art scaled gradient projection method for image deconvolution in microscopy. The acceleration idea is derived by adapting a step-length selection rule, recently introduced for limited-memory steepest descent methods in unconstrained optimization, to the special constrained optimization framework arising in image reconstruction. We describe how important issues related to the generalization of the step-length rule to the imaging optimization problem have been faced and we evaluate the improvements due to the acceleration strategy by numerical experiments on large-scale image deconvolution problems.
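
    A bare-bones projected-gradient sketch for nonnegative deconvolution illustrates the class of methods discussed; it omits the scaling and limited-memory step-length rules that are the actual subject of the paper, and uses a synthetic 1D signal.

    # Projected gradient for nonnegative deconvolution: gradient step on the
    # least-squares data term followed by projection onto x >= 0.
    import numpy as np

    def projected_gradient_deconvolve(y, psf, n_iter=400):
        H = np.fft.fft(psf, n=y.size)
        step = 1.0 / np.max(np.abs(H)) ** 2        # safe step from the Lipschitz constant
        x = np.clip(y, 0, None)
        for _ in range(n_iter):
            residual = np.real(np.fft.ifft(H * np.fft.fft(x))) - y
            grad = np.real(np.fft.ifft(np.conj(H) * np.fft.fft(residual)))
            x = np.maximum(x - step * grad, 0.0)
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        truth = np.zeros(200); truth[[40, 45, 120]] = [3.0, 1.5, 2.0]
        psf = np.exp(-np.arange(-8, 9) ** 2 / 6.0); psf /= psf.sum()
        y = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(psf, n=200)))
        y = np.clip(y + rng.normal(0, 0.01, 200), 0, None)
        x_hat = projected_gradient_deconvolve(y, psf)
        print("peak positions recovered:", np.nonzero(x_hat > 0.5)[0])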

  1. IPMP 2013--a comprehensive data analysis tool for predictive microbiology.

    PubMed

    Huang, Lihan

    2014-02-01

    Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Program (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in the literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS® or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS® or R. This work suggested that the IPMP 2013 could be used as a free alternative to SAS®, R, or other more sophisticated statistical packages for model development in predictive microbiology. PMID:24334095
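
    As an example of fitting one of the secondary models named above, the Ratkowsky square-root model sqrt(mu_max) = b (T - Tmin) can be fitted with scipy's curve_fit; the growth-rate data below are synthetic, not IPMP 2013 output.

    # Fit the Ratkowsky square-root secondary model to synthetic growth rates.
    import numpy as np
    from scipy.optimize import curve_fit

    def ratkowsky(T, b, Tmin):
        return (b * (T - Tmin)) ** 2      # model expressed on the rate scale

    temperature = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
    rng = np.random.default_rng(8)
    true_b, true_Tmin = 0.03, 4.0
    mu_max = ratkowsky(temperature, true_b, true_Tmin) + rng.normal(0, 0.002, temperature.size)

    params, cov = curve_fit(ratkowsky, temperature, mu_max, p0=(0.02, 2.0))
    b_hat, Tmin_hat = params
    print(f"b = {b_hat:.4f}, Tmin = {Tmin_hat:.2f} degC")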

  2. A Freeware Java Tool for Spatial Point Analysis of Neuronal Structures

    E-print Network

    Condron, Barry

    A freeware tool, called PAJ, has been developed for spatial point analysis of neuronal structures. This Java-based tool takes 3D Cartesian coordinates as input and is based on previously described statistical analysis (Diggle 2003). In PAJ, data is copied

  3. Multiple Lyapunov functions and other analysis tools for switched and hybrid systems

    Microsoft Academic Search

    Michael S. Branicky

    1998-01-01

    We introduce some analysis tools for switched and hybrid systems. We first present work on stability analysis. We introduce multiple Lyapunov functions as a tool for analyzing Lyapunov stability and use iterated function systems theory as a tool for Lagrange stability. We also discuss the case where the switched systems are indexed by an arbitrary compact set. Finally, we extend

  4. WHY CONVENTIONAL TOOLS FOR POLICY ANALYSIS ARE OFTEN INADEQUATE FOR PROBLEMS OF GLOBAL CHANGE

    E-print Network

    Risbey, James S.

    … of tools for quantitative policy analysis. As policy analysts have turned to the consideration of climate and other problems of global change, they have found it natural to employ such now-standard tools as utility…

  5. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  6. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, the need for software modifications when configuring it for a particular spacecraft is reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  7. Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools

    SciTech Connect

    Barrand, Guy

    2002-08-20

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualization), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of each component individually. The interfaces have been defined in Java and C++, and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.
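
    To illustrate the decoupling idea, here is a minimal sketch of what such abstract component interfaces might look like, written in Python for consistency with the other examples in this collection (AIDA itself defines the interfaces in Java and C++); the names and methods are simplified assumptions, not the actual AIDA API.

    ```python
    from abc import ABC, abstractmethod

    class IHistogram1D(ABC):
        """Minimal abstract 1-D histogram interface in the spirit of AIDA (illustrative only)."""
        @abstractmethod
        def fill(self, x: float, weight: float = 1.0) -> None: ...
        @abstractmethod
        def entries(self) -> int: ...

    class IPlotter(ABC):
        """A plotter depends only on the abstract histogram, not on any concrete implementation."""
        @abstractmethod
        def plot(self, hist: IHistogram1D) -> None: ...

    # Concrete histograms and plotters can then be swapped independently,
    # which is the low-coupling, per-component reusability the design aims for.
    ```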

  8. Input Range Testing for the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is the test input to be attempted for each field. The third type of information is the allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the graphical user interface and the script interface. The examples are illustrated from a scripting perspective because it is simpler to write up; however, the tests must be performed for both interfaces to GMAT.

  9. SINEBase: a database and tool for SINE analysis

    PubMed Central

    Vassetzky, Nikita S.; Kramerov, Dmitri A.

    2013-01-01

    SINEBase (http://sines.eimb.ru) integrates the revisited body of knowledge about short interspersed elements (SINEs). A set of formal definitions concerning SINEs was introduced. All available sequence data were screened through these definitions and the genetic elements misidentified as SINEs were discarded. As a result, 175 SINE families have been recognized in animals, flowering plants and green algae. These families were classified by the modular structure of their nucleotide sequences and the frequencies of different patterns were evaluated. These data formed the basis for the database of SINEs. The SINEBase website can be used in two ways: first, to explore the database of SINE families, and second, to analyse candidate SINE sequences using specifically developed tools. This article presents an overview of the database and the process of SINE identification and analysis. PMID:23203982

  10. Decision Analysis Tool to Compare Energy Pathways for Transportation

    SciTech Connect

    Bloyd, Cary N.

    2010-06-30

    With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies have been proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). Biomass is seen as an important domestic energy feedstock, and there are multiple pathways by which it can be linked to the transport sector. Contenders include the use of cellulosic ethanol from biomass to replace gasoline, or the use of a biomass-fueled combined-cycle electrical power generation facility in conjunction with plug-in hybrid electric vehicles (PHEVs). This paper reviews a project that is developing a scenario decision analysis tool to assist policy makers, program managers, and others to obtain a better understanding of these uncertain possibilities and how they may interact over time.

  11. System-of-Systems Technology-Portfolio-Analysis Tool

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

    2012-01-01

    Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

  12. Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning

    PubMed Central

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C.

    2014-01-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance compared with existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422
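
    A hedged sketch of the dictionary-learning ingredient using scikit-learn: atoms are learned from patches of a high-dose perfusion map and then used to sparsely encode (and thereby denoise) a low-dose map. The patch size, penalty, and sparsity level are illustrative choices, and the coupling with deconvolution-based parameter estimation from the paper is not reproduced here.

    ```python
    import numpy as np
    from sklearn.decomposition import DictionaryLearning, sparse_encode
    from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d

    def learn_dictionary(high_dose_map, n_atoms=64, patch=(8, 8)):
        """Learn a patch dictionary from a high-dose perfusion map (float-valued array assumed)."""
        patches = extract_patches_2d(high_dose_map, patch, max_patches=2000, random_state=0)
        X = patches.reshape(len(patches), -1)
        X = X - X.mean(axis=1, keepdims=True)          # remove per-patch mean
        dico = DictionaryLearning(n_components=n_atoms, alpha=1.0, max_iter=50, random_state=0)
        return dico.fit(X).components_

    def sparse_denoise(low_dose_map, D, patch=(8, 8)):
        """Sparse-code every patch of a low-dose map over D and reassemble the image."""
        patches = extract_patches_2d(low_dose_map, patch)
        X = patches.reshape(len(patches), -1)
        means = X.mean(axis=1, keepdims=True)
        codes = sparse_encode(X - means, D, algorithm="omp", n_nonzero_coefs=5)
        recon = (codes @ D + means).reshape(patches.shape)
        return reconstruct_from_patches_2d(recon, low_dose_map.shape)
    ```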

  13. Towards robust deconvolution of low-dose perfusion CT: sparse perfusion deconvolution using online dictionary learning.

    PubMed

    Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C

    2013-05-01

    Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance compared with existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422

  14. Quantitative Identification of Pesticides as Target Compounds and Unknowns by Spectral Deconvolution of Gas Chromatographic/Mass Spectrometric Data

    Microsoft Academic Search

    Andreas Hoffmann (Irish Distillers-Pernod Ricard, Ireland); Yongli Huang

    The results of gas chromatography/mass spectrometry (MS), with Ion Signature Technology, Inc. (North Smithfield, RI) quantitative deconvolution software, are discussed for pesticides identified both as target compounds by using retention and MS data and as unknowns by using only mass spectra. Target compound analysis of 32 pesticides, surrogates, and an internal standard added to lemon oil over a wide concentration…

  15. Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique

    ERIC Educational Resources Information Center

    Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.

    2005-01-01

    A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to…

  16. Lagrangian analysis. Modern tool of the dynamics of solids

    NASA Astrophysics Data System (ADS)

    Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.

    Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors emphasize that one or the other group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. The Lagrangian specificity of the required measurements is assured by the fact that a transducer enclosed within a solid material is necessarily linked in motion to the particles of the material which surround it. This Lagrangian instrumentation is described in the second chapter. The authors are concerned with the techniques considered today to be the most effective. These are, for stress: piezoresistive gauges (50 Ω and low impedance) and piezoelectric techniques (PVF2 gauges, quartz transducers); and for particle velocity: electromagnetic gauges, and VISAR and IDL Doppler laser interferometers. In each case both the physical principles and the techniques of use are set out in detail. For the most part, the authors use their own experience to describe the calibration of these instrumentation systems and to compare their characteristics: measurement range, response time, accuracy, useful recording time, detection area, and so on. These characteristics should be taken into account by the physicist when he has to choose the instrumentation systems best adapted to the Lagrangian analysis he intends to apply to any given material. The discussion at the end of chapter 2 should guide his choice both for plane and spherical one-dimensional motions. The third chapter examines to what extent the accuracy of Lagrangian analysis is affected by the accuracies of the numerical analysis methods and experimental techniques.
By means of a discussion of different cases of analysis, the authors want to make the reader aware of the different kinds of sources of error that may be encountered. This work brings up to date the state of studies on Lagrangian analysis methods, based on a wide review of bibliographical sources together with the contribution made to research in this field by the four authors themselves over the course of the last ten years.

  17. Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs

    SciTech Connect

    Howard, M. [NSTec]; Luttman, A. [NSTec]; Fowler, M. [NSTec]

    2014-11-01

    In imaging applications that focus on quantitative analysis--such as X-ray radiography in the security sciences--it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
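
    A toy 1-D sketch of the sampling idea: random-walk Metropolis over a deconvolution posterior, yielding a pointwise posterior mean and standard deviation. The Gaussian likelihood and the total-variation-like prior below stand in for the paper's normal approximation to the Poisson likelihood and its edge-enhancing prior; the step size and noise levels are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def blur(x, psf):
        return np.convolve(x, psf, mode="same")

    def log_post(x, y, psf, sigma=0.05, lam=5.0):
        """Gaussian likelihood + total-variation-like prior (a sketch, not the paper's model)."""
        resid = y - blur(x, psf)
        return -0.5 * np.sum(resid**2) / sigma**2 - lam * np.sum(np.abs(np.diff(x)))

    def metropolis_deconvolve(y, psf, n_samp=20000, step=0.02):
        x = np.asarray(y, dtype=float).copy()
        lp = log_post(x, y, psf)
        samples = []
        for k in range(n_samp):
            prop = x + step * rng.standard_normal(x.size)   # random-walk proposal
            lp_prop = log_post(prop, y, psf)
            if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept/reject
                x, lp = prop, lp_prop
            if k % 10 == 0:
                samples.append(x.copy())
        S = np.array(samples)
        return S.mean(axis=0), S.std(axis=0)   # pointwise posterior mean and uncertainty

    # usage sketch: psf = np.ones(5) / 5; mean, std = metropolis_deconvolve(observed_signal, psf)
    ```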

  18. The impact of beam deconvolution on noise properties in CMB measurements: Application to Planck LFI

    E-print Network

    Keihänen, E.; Lindholm, V.; Reinecke, M.; Suur-Uski, A.-S.

    2015-01-01

    We present an analysis of the effects of beam deconvolution on noise properties in CMB measurements. The analysis is built around the artDeco beam deconvolver code. We derive a low-resolution noise covariance matrix that describes the residual noise in deconvolution products, both in harmonic and pixel space. The matrix models the residual correlated noise that remains in time-ordered data after destriping, and the effect of deconvolution on it. To validate the results, we generate noise simulations that mimic the data from the Planck LFI instrument. A $\chi^2$ test for the full 70 GHz covariance in multipole range $\ell=0-50$ yields a mean reduced $\chi^2$ of 1.0037. We compare two destriping options, full and independent destriping, when deconvolving subsets of available data. Full destriping leaves substantially less residual noise, but leaves data sets intercorrelated. We derive also a white noise covariance matrix that provides an approximation of the full noise at high multipoles, and study the properties...

  19. Deconvolution of mixed magnetism in multilayer graphene

    SciTech Connect

    Swain, Akshaya Kumar [IITB-Monash Research Academy, Department of Metallurgical Engineering and Materials Science, IIT Bombay, Mumbai 400076 (India)]; Bahadur, Dhirendra, E-mail: dhirenb@iitb.ac.in [Department of Metallurgical Engineering and Materials Science, IIT Bombay, Mumbai 400076 (India)]

    2014-06-16

    Magnetic properties of graphite modified at the edges by KCl and of exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using low-field and high-field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge-state spins and the interaction between them decide the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that the ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.
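
    One common way to deconvolute such mixed magnetism is to fit the measured moment as the sum of a saturating ferromagnetic component and a linear paramagnetic background; the tanh line shape, units, and synthetic data below are illustrative assumptions, not the authors' analysis.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def mixed_moment(H, Ms, a, chi):
        """Saturating ferromagnetic term plus linear paramagnetic term (illustrative model)."""
        return Ms * np.tanh(H / a) + chi * H

    # H in Oe, M in emu/g (synthetic values, for demonstration only)
    H = np.linspace(-10000, 10000, 81)
    M = 0.02 * np.tanh(H / 1500.0) + 3e-7 * H \
        + 0.001 * np.random.default_rng(1).standard_normal(H.size)

    (Ms, a, chi), _ = curve_fit(mixed_moment, H, M, p0=[0.01, 1000.0, 1e-7])
    ferro = Ms * np.tanh(H / a)      # deconvoluted ferromagnetic contribution
    para = chi * H                   # deconvoluted paramagnetic contribution
    ```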

  20. Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.

    PubMed

    Caruana, Matthew V; Carvalho, Susana; Braun, David R; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W K

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  1. Micropollutants in urban watersheds : substance flow analysis as management tool

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted by traffic, like copper or PAHs, reach surface water during rain events often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to develop flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we test the application of SFA to a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources (73% in total) of copper in receiving surface water are roofs and the contact lines of trolleybuses. Thus, technical solutions have to be found to manage these specific sources of contamination. Application of the SFA approach to the four pharmaceuticals reveals that CSOs represent an important source of contamination: between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads from the city of Lausanne to the lake are due to CSOs. These results will help in defining the best management strategy to limit the contamination of Lake Geneva. SFA is thus a promising tool for integrated urban water management.

  2. NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Quealy, Angela

    1999-01-01

    A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

  3. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and to process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated either dynamically or in steady state. Using Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.

  4. Spectral Analysis Tool 6.2 for Windows

    NASA Technical Reports Server (NTRS)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.

  5. Wavelet analysis as a nonstationary plasma fluctuation diagnostic tool

    SciTech Connect

    Santoso, S.; Powers, E.J.; Ouroua, A.; Heard, J.W.; Bengtson, R.D. [Univ. of Texas, Austin, TX (United States). Fusion Research Center]

    1996-12-31

    Analysis of nonstationary plasma fluctuation data has been a long-standing challenge for the plasma diagnostic community. For this reason, in this paper the authors present and apply wavelet transforms as a new diagnostic tool to analyze nonstationary plasma fluctuation data. Unlike the Fourier transform, which represents a given signal globally without temporal resolution, the wavelet transform provides a local representation of the given signal in the time-scale domain. The fundamental concepts and multiresolution properties of wavelet transforms, along with a brief comparison with the short-time Fourier transform, are presented in this paper. The selection of a prototype wavelet, or mother wavelet, is also discussed. Digital implementation of wavelet spectral analysis, which includes time-scale power spectra and scale power spectra, is described. The efficacy of the wavelet approach is demonstrated by analyzing transient broadband electrostatic potential fluctuations inside the inversion radius of sawtoothing TEXT-U plasmas during electron cyclotron resonance heating. The potential signals are collected using a 2 MeV heavy ion beam probe.
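
    A minimal sketch of the time-scale analysis described here, computing a Morlet-wavelet scalogram (time-scale power spectrum) and its scale-averaged power for a synthetic nonstationary test signal; the sampling rate, scales, and signal are assumptions for illustration, not TEXT-U data.

    ```python
    import numpy as np
    import pywt

    fs = 1000.0                                   # sampling rate (Hz), assumed
    t = np.arange(0, 1.0, 1.0 / fs)
    # synthetic nonstationary signal: a frequency step at t = 0.5 s
    sig = np.where(t < 0.5, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 120 * t))

    scales = np.arange(1, 128)
    coefs, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1.0 / fs)

    time_scale_power = np.abs(coefs) ** 2         # time-scale power spectrum (scalogram)
    scale_power = time_scale_power.mean(axis=1)   # scale (global) power spectrum
    ```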

  6. Orienting the Neighborhood: A Subdivision Energy Analysis Tool

    SciTech Connect

    Christensen, C.; Horowitz, S.

    2008-01-01

    In subdivisions, house orientations are largely determined by street layout. The resulting house orientations affect energy consumption (annual and on-peak) for heating and cooling, depending on window area distributions and shading from neighboring houses. House orientations also affect energy production (annual and on-peak) from solar thermal and photovoltaic systems, depending on available roof surfaces. Therefore, house orientations fundamentally influence both energy consumption and production, and an appropriate street layout is a prerequisite for taking full advantage of energy efficiency and renewable energy opportunities. The potential influence of street layout on solar performance is often acknowledged, but solar and energy issues must compete with many other criteria and constraints that influence subdivision street layout. When only general guidelines regarding energy are available, these factors may be ignored or have limited effect. Also, typical guidelines are often not site-specific and do not account for local parameters such as climate and the time value of energy. For energy to be given its due consideration in subdivision design, energy impacts need to be accurately quantified and displayed interactively to facilitate analysis of design alternatives. This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

  7. Mission operations data analysis tools for Mars Observer guidance and control

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.

    1994-01-01

    Mission operations for the Mars Observer (MO) Project at the Jet Propulsion Laboratory were supported by a variety of ground data processing software and analysis tools. Some of these tools were generic to multimission spacecraft mission operations, some were specific to the MO spacecraft, and others were custom tailored to the operation and control of the Attitude and Articulation Control Subsystem (AACS). The focus of this paper is on the data analysis tools for the AACS. Four different categories of analysis tools are presented, with details offered for specific tools. Valuable experience was gained from the use of these tools and through their development. These tools formed the backbone and enhanced the efficiency of the AACS Unit in the Mission Operations Spacecraft Team. These same tools, and extensions thereof, have been adopted by Galileo mission operations, and are being designed into Cassini and other future spacecraft mission operations.

  8. Study of academic achievements using spatial analysis tools

    NASA Astrophysics Data System (ADS)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were finally registered, and a survey study on their academic achievement was carried out with these students with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two yearly sittings of that examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students were those who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was referenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, which can be used as a tool to calculate the probability of success or failure for the new incoming students in the following academic years. Keywords: Academic achievement, spatial analyst, GIS, Bologna.

  9. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping of transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis, and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. ZMAP code is open source, written in Matlab (The MathWorks), a commercial language widely used in the natural sciences. ZMAP was first published in 1994, and has continued to grow over the past 7 years. Recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.

  10. Generic Tools for Data Analysis and Visualisation Edinburgh Napier University, School of Computing,

    E-print Network

    Priss, Uta

    Generic Tools for Data Analysis and Visualisation. Uta Priss, Edinburgh Napier University, School of Computing. … in order to build generic tools for data analysis and visualisation. Its intention is to stimulate … programs, data analysis is usually accomplished by speaking to the computer and asking it to analyse some…

  11. The discrete Kalman filtering approach for seismic signals deconvolution

    SciTech Connect

    Kurniadi, Rizal; Nurhandoko, Bagus Endar B. [Department of Physics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung (Indonesia)]

    2012-06-20

    Seismic signals are a convolution of reflectivity and a seismic wavelet. One of the most important stages in seismic data processing is deconvolution; conventional deconvolution uses inverse filters based on Wiener filter theory. This theory is limited by certain modelling assumptions, which may not always be valid. The discrete form of the Kalman filter is therefore used to generate an estimate of the reflectivity function. The main advantages of Kalman filtering are its ability to handle continually time-varying models and its high resolution capability. In this work, we use a discrete Kalman filter combined with primitive deconvolution. The filtering works on the reflectivity function, so the workflow starts with primitive deconvolution using the inverse of the wavelet. The seismic signals are then obtained by convolving the filtered reflectivity function with an energy waveform referred to as the seismic wavelet. A higher wavelet frequency gives a smaller wavelength; graphs of these results are presented.
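
    A toy sketch of the Kalman-filtering step: a scalar discrete Kalman filter applied sample-by-sample to a reflectivity trace obtained from a primitive (inverse-wavelet) deconvolution. The random-walk state model and the noise variances are assumptions for illustration, not the authors' formulation.

    ```python
    import numpy as np

    def kalman_filter_reflectivity(r_primitive, q=1e-3, r_noise=1e-2):
        """Scalar Kalman filter over a primitively deconvolved reflectivity trace (sketch)."""
        n = len(r_primitive)
        x_hat = np.zeros(n)       # filtered reflectivity estimate
        x, p = 0.0, 1.0           # state and error covariance
        for k in range(n):
            # predict (random-walk state model, an assumption)
            p = p + q
            # update, using the primitive-deconvolution sample as the measurement
            K = p / (p + r_noise)
            x = x + K * (r_primitive[k] - x)
            p = (1.0 - K) * p
            x_hat[k] = x
        return x_hat

    # usage sketch: r_primitive could come from dividing the trace spectrum by the wavelet spectrum
    ```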

  12. Blind Deconvolution for Ultrasound Sequences Using a Noninverse Greedy Algorithm

    PubMed Central

    Chira, Liviu-Teodor; Rusu, Corneliu; Tauber, Clovis; Girault, Jean-Marc

    2013-01-01

    Blind deconvolution of ultrasound sequences in medical ultrasound imaging is still a major problem despite the efforts made. This paper presents a blind noninverse deconvolution algorithm to eliminate the blurring effect, using the envelope of the acquired radio-frequency sequences and an a priori Laplacian distribution for the deconvolved signal. The algorithm is executed in two steps. Firstly, the point spread function is automatically estimated from the measured data. Secondly, the data are reconstructed in a nonblind way using the proposed algorithm. The algorithm is a nonlinear blind deconvolution which works as a greedy algorithm. The results on simulated signals and real images are compared with different state-of-the-art deconvolution methods. Our method shows good results for scatterer detection, speckle noise suppression, and execution time. PMID:24489533
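
    A hedged stand-in for the two-step structure described above (PSF estimation followed by nonblind reconstruction), applied to a 1-D envelope line: a crude Gaussian PSF estimate from the envelope autocorrelation, followed by classical Richardson-Lucy iterations. Neither step is the paper's greedy noninverse algorithm; both are generic placeholders.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def estimate_gaussian_psf(envelope, size=15):
        """Crude step 1: Gaussian PSF whose width matches the envelope autocorrelation (assumption)."""
        env = np.asarray(envelope, dtype=float)
        env = env - env.mean()
        ac = np.correlate(env, env, mode="full")[env.size - 1:]
        ac = ac / (ac[0] + 1e-12)
        width = max(int(np.argmax(ac < 0.5)), 1)       # half-width at half-maximum, in samples
        x = np.arange(size) - size // 2
        psf = np.exp(-0.5 * (x / width) ** 2)
        return psf / psf.sum()

    def richardson_lucy_1d(y, psf, n_iter=30):
        """Step 2 (nonblind): classical Richardson-Lucy iterations, not the paper's greedy method."""
        y = np.asarray(y, dtype=float)
        x = np.full_like(y, y.mean() + 1e-12)
        psf_flip = psf[::-1]
        for _ in range(n_iter):
            conv = fftconvolve(x, psf, mode="same") + 1e-12
            x = x * fftconvolve(y / conv, psf_flip, mode="same")
        return x

    # usage sketch: psf = estimate_gaussian_psf(env_line); deconv = richardson_lucy_1d(env_line, psf)
    ```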

  13. Static Analysis Tools, a Practical Approach for Safety-Critical Software Verification

    NASA Astrophysics Data System (ADS)

    Lopes, R.; Vicente, D.; Silva, N.

    2009-05-01

    Static code analysis tools available today range from Lint-based syntax parsers to standards-compliance checkers to tools using more formal methods for verification. As safety-critical software complexity is increasing, these tools provide a means to ensure code quality, safety and dependability attributes. They also provide a means to introduce further automation in code analysis activities. The features presented by static code analysis tools are particularly interesting for V&V activities. In the scope of Independent Code Verification (IVE), two different static analysis tools have been used during Code Verification activities of the LISA Pathfinder onboard software in order to assess their contribution to the efficiency of the process and the quality of the results. Polyspace (The MathWorks) and FlexeLint (Gimpel) have been used as examples of high-budget and low-budget tools respectively. Several aspects have been addressed: effort has been categorised for closer analysis (e.g. setup and configuration time, execution time, analysis of the results, etc.), reported issues have been categorised according to their type, and the coverage of traditional IVE tasks by the static code analysis tools has been evaluated. Final observations have been made by analysing the previously referred subjects, namely regarding cost effectiveness, quality of results, complementarities between the results of different static code analysis tools, and the relation between automated code analysis and manual code inspection.

  14. Two-dimensional blind Bayesian deconvolution of medical ultrasound images

    Microsoft Academic Search

    R. Jirik; T. Taxt

    2008-01-01

    A new approach to 2-D blind deconvolution of ultrasonic images in a Bayesian framework is presented. The radio-frequency image data are modeled as a convolution of the point-spread function and the tissue function, with additive white noise. The deconvolution algorithm is derived from statistical assumptions about the tissue function, the point-spread function, and the noise. It is solved as an

  15. Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)

    1999-01-01

    A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors, as embodied in one preferred embodiment in a global positioning system receiver. The method and receiver of the present invention estimate and then compensate for multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

  16. Deconvolution and signal extraction in geophysics and acoustics

    NASA Astrophysics Data System (ADS)

    Sibul, Leon H.; Roan, Michael J.; Erling, Josh

    2002-11-01

    Deconvolution and signal extraction are fundamental signal processing techniques in geophysics and acoustics. An introductory overview of the standard second-order methods and of minimum entropy deconvolution is presented. Limitations of the second-order methods are discussed and the need for more general methods is established. Minimum entropy deconvolution (MED), as proposed by Wiggins in 1977, is a technique for the deconvolution of seismic signals that overcomes limitations of the second-order method of deconvolution. The unifying conceptual framework for MED, as presented in Donoho's classical paper (1981), is discussed. The basic assumption of MED is that the input signals to the forward filter are independent, identically distributed non-Gaussian random processes. The forward convolution filter "makes" its output more Gaussian, which increases its entropy. The minimization of entropy restores the original non-Gaussian input. We also give an overview of recent developments in blind deconvolution (BDC), blind source separation (BSS), and blind signal extraction (BSE). Recent research in these areas uses information theoretic (IT) criteria (entropy, mutual information, K-L divergence, etc.) as optimization objective functions. The gradients of these objective functions are nonlinear, resulting in nonlinear algorithms. Some of the recursive algorithms for nonlinear optimization are reviewed.
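
    A compact sketch of a Wiggins-style MED iteration: solve the Toeplitz normal equations for an FIR filter that maximizes the varimax (normalized kurtosis) of its output, so that the deconvolved trace becomes as spiky (non-Gaussian) as possible. The filter length, regularization, and edge handling are simplified assumptions, following the standard textbook formulation rather than any specific implementation.

    ```python
    import numpy as np
    from scipy.linalg import toeplitz, solve
    from scipy.signal import lfilter

    def med_wiggins(x, filt_len=31, n_iter=30):
        """Wiggins-style minimum entropy deconvolution (varimax maximization), a sketch."""
        x = np.asarray(x, dtype=float)
        # autocorrelation (Toeplitz) matrix of the input, lags 0 .. filt_len-1
        r = np.correlate(x, x, mode="full")[x.size - 1: x.size - 1 + filt_len]
        R = toeplitz(r) + 1e-6 * r[0] * np.eye(filt_len)   # light regularization
        f = np.zeros(filt_len)
        f[filt_len // 2] = 1.0                             # start from a delayed spike
        for _ in range(n_iter):
            y = lfilter(f, [1.0], x)                       # current deconvolved output
            # cross-correlation of y^3 with the input, one lag per tap
            # (np.roll wraps around at the edges; acceptable for a sketch)
            g = np.array([np.dot(y**3, np.roll(x, k)) for k in range(filt_len)])
            f = solve(R, g)
            f = f / (np.linalg.norm(f) + 1e-12)            # keep the filter scale bounded
        return lfilter(f, [1.0], x), f

    # usage sketch: spiky_trace, med_filter = med_wiggins(seismic_trace)
    ```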

  17. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    ERIC Educational Resources Information Center

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  18. Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions

    E-print Network

    Halgamuge, Malka N.

    Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions. Dung V … hack tools. However, besides U3 technology, attackers also have another, more flexible alternative, portable applications or application virtualization, which allows a wide range of hack tools to be compiled…

  19. Quality Assessment of Computational Techniques and Software Tools for Planar-Antenna Analysis

    Microsoft Academic Search

    A. Vasylchenko; Y. Schols; W. De Raedt; G. A. E. Vandenbosch

    2009-01-01

    The goal of this paper is a thorough investigation of the quality of the software tools widely used nowadays in the field of planar-antenna analysis and synthesis. Six simulation tools - five well-known commercial tools and one developed in-house - are compared with each other for four different planar antennas. It is crucial to point out that all possible efforts

  20. Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.

    PubMed

    Villeneuve, Emma; Carfantan, Hervé

    2014-10-01

    Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones not only in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution due to atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line of sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimating the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Markov chain Monte Carlo algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) and those that do not (estimation). The performance of the methods is compared with classical ones on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172
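
    The key structural point above is that, for a fixed line center and width, the flux enters the model linearly and can be solved in closed form inside any sampler or optimizer. The sketch below exploits that separation on a single spectrum with a simple grid over the nonlinear parameters rather than MCMC; the Gaussian line shape, the constant continuum term, and all names are assumptions, not the paper's estimators.

    ```python
    import numpy as np

    def fit_emission_line(wavelengths, spectrum, centers, widths):
        """Grid the nonlinear parameters (center, width); solve the flux linearly (sketch)."""
        best = (np.inf, None)
        for c in centers:
            for w in widths:
                g = np.exp(-0.5 * ((wavelengths - c) / w) ** 2)       # unit-amplitude line profile
                A = np.column_stack([g, np.ones_like(g)])             # flux + constant continuum
                coef, *_ = np.linalg.lstsq(A, spectrum, rcond=None)   # closed-form linear solve
                resid = spectrum - A @ coef
                sse = float(resid @ resid)
                if sse < best[0]:
                    # relative velocity would follow from the fitted center via the Doppler relation
                    best = (sse, dict(flux=coef[0], center=c, sigma=w, continuum=coef[1]))
        return best[1]
    ```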

  1. Fast, Automated Implementation of Temporally Precise Blind Deconvolution of Multiphasic Excitatory Postsynaptic Currents

    PubMed Central

    Andor-Ardó, Daniel; Keen, Erica C.; Hudspeth, A. J.; Magnasco, Marcelo O.

    2012-01-01

    Records of excitatory postsynaptic currents (EPSCs) are often complex, with overlapping signals that display a large range of amplitudes. Statistical analysis of the kinetics and amplitudes of such complex EPSCs is nonetheless essential to the understanding of transmitter release. We therefore developed a maximum-likelihood blind deconvolution algorithm to detect exocytotic events in complex EPSC records. The algorithm is capable of characterizing the kinetics of the prototypical EPSC as well as delineating individual release events at higher temporal resolution than other extant methods. The approach also accommodates data with low signal-to-noise ratios and those with substantial overlaps between events. We demonstrated the algorithm’s efficacy on paired whole-cell electrode recordings and synthetic data of high complexity. Using the algorithm to align EPSCs, we characterized their kinetics in a parameter-free way. Combining this approach with maximum-entropy deconvolution, we were able to identify independent release events in complex records at a temporal resolution of less than 250 µs. We determined that the increase in total postsynaptic current associated with depolarization of the presynaptic cell stems primarily from an increase in the rate of EPSCs rather than an increase in their amplitude. Finally, we found that fluctuations owing to postsynaptic receptor kinetics and experimental noise, as well as the model dependence of the deconvolution process, explain our inability to observe quantized peaks in histograms of EPSC amplitudes from physiological recordings. PMID:22761670
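
    As a simple, generic illustration of deconvolution-based event detection in such recordings (not the paper's maximum-likelihood blind deconvolution), the sketch below Wiener-deconvolves a current trace with an assumed prototypical EPSC kernel and picks peaks above a noise-based threshold; the SNR constant, threshold, and minimum event interval are assumptions.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def detect_events(trace, kernel, fs, snr=10.0, min_interval_ms=0.25):
        """Wiener-style deconvolution of an EPSC trace by a template kernel, then peak picking."""
        n = len(trace)
        K = np.fft.rfft(kernel, n)
        T = np.fft.rfft(trace, n)
        # regularized spectral division (Wiener-like), avoiding blow-up where the kernel is small
        deconv = np.fft.irfft(T * np.conj(K) / (np.abs(K) ** 2 + (1.0 / snr) ** 2), n)
        thresh = deconv.mean() + 4.0 * deconv.std()            # assumed detection threshold
        min_dist = max(int(fs * min_interval_ms / 1000.0), 1)  # refractory distance in samples
        peaks, _ = find_peaks(deconv, height=thresh, distance=min_dist)
        return peaks / fs                                      # event times in seconds
    ```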

  2. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferable to one integrated tool. This tool suite, its characteristics, and its applicability are described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  3. Experimental analysis of change detection algorithms for multitooth machine tool fault detection

    NASA Astrophysics Data System (ADS)

    Reñones, Aníbal; de Miguel, Luis J.; Perán, José R.

    2009-10-01

    This paper describes an industrial application of a fault diagnosis method for a multitooth machine tool. Different statistical approaches have been used to detect and diagnose insert breakage in multitooth tools based on the analysis of the electrical power consumption of the tool drives. Great effort has been made to obtain a robust method able to avoid any recalibration process after, for example, a maintenance operation. From the point of view of maintenance costs, these multitooth tools are the most critical part of the machine tools used for mass production in the car industry. These tools integrate different kinds of machining operations and cutting conditions.

  4. Analysis of the influence of tool dynamics in diamond turning

    SciTech Connect

    Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

    1988-12-01

    This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

  5. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical tool box allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools. 
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.

  6. Timed Petri nets in modeling and analysis of cluster tools

    Microsoft Academic Search

    Wlodzimierz M. Zuberek

    2001-01-01

    Timed Petri nets are used as models of cluster tools representing the flow of wafers through the chambers of the tool as well as sequences of actions performed by the robotic transporter. Since the durations of all activities are also represented in the model, performance characteristics can be derived from the model for steady-state, as well as transient behaviors. The

  7. An Analysis of Teacher Selection Tools in Pennsylvania

    ERIC Educational Resources Information Center

    Vitale, Tracy L.

    2009-01-01

    The purpose of this study was to examine teacher screening and selection tools currently being utilized by public school districts in Pennsylvania and to compare these tools to the research on qualities of effective teachers. The researcher developed four research questions that guided her study. The Pennsylvania Association of School Personnel…

  8. DETECTING ISOLATED SPECTRUM OF TRANSFER AND KOOPMAN OPERATORS WITH FOURIER ANALYSIS TOOLS

    E-print Network

    Froyland, Gary

    Fourier analysis tools are used to identify the isolated spectrum and large-scale structures of transfer and Koopman operators alluded to in the ansatz. This harmonic analysis

  9. A Library of Cortical Morphology Analysis Tools to Study Development, Aging and Genetics of Cerebral Cortex

    Microsoft Academic Search

    Peter Kochunov; William Rogers; Jean-Francois Mangin; Jack Lancaster

    Sharing of analysis techniques and tools is among the main driving forces of modern neuroscience. We describe a library of tools developed to quantify global and regional differences in cortical anatomy in high resolution structural MR images. This library is distributed as a plug-in application for popular structural analysis software, BrainVisa (BV). It contains tools to measure global and regional

  10. DEVELOPMENT OF AN ANALYSIS TOOL FOR THE DESIGN OF BONDED COMPOSITE REPAIRS

    Microsoft Academic Search

    R. J. C. Creemers

    As part of the programme, a design and analysis tool for bonded composite repairs has been developed. The repair design tool runs on a normal PC under Microsoft Office Excel, which is easily accessible for most people. A wide variety of joint designs, including external patches and scarf repairs, can be specified via a simple-to-use input interface. The analysis

  11. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    NASA Technical Reports Server (NTRS)

    Dove, Edwin; Hughes, Steve

    2007-01-01

    The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

  12. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.
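
    The calibration step described above, in which PEST estimates hydraulic conductivities by minimizing squared differences between simulated and measured flows while regularization limits the variance within a lithology, can be illustrated with a small, hedged sketch. The forward model, layer count and weights below are hypothetical stand-ins, not MODFLOW or PEST themselves; only the structure of the regularized least-squares objective is shown.

        # Hedged sketch of a regularized least-squares calibration of the kind PEST
        # performs for AnalyzeHOLE. The forward model is a hypothetical stand-in,
        # not MODFLOW; only the structure of the objective is illustrated.
        import numpy as np
        from scipy.optimize import least_squares

        measured_flows = np.array([0.0, 0.8, 1.5, 3.1, 3.2])   # hypothetical flow log (L/s)

        def simulated_flows(log10_K):
            """Stand-in forward model: cumulative inflow proportional to layer K."""
            K = 10.0 ** np.asarray(log10_K)
            return np.cumsum(K) / K.sum() * measured_flows[-1]

        def residuals(log10_K, reg_weight=1.0):
            misfit = simulated_flows(log10_K) - measured_flows
            # Tikhonov-style smoothing penalty: analogous to limiting the variance
            # of hydraulic conductivity within a lithology with regularization.
            smoothness = reg_weight * np.diff(log10_K)
            return np.concatenate([misfit, smoothness])

        result = least_squares(residuals, x0=np.zeros(measured_flows.size))
        print("estimated log10(K) per layer:", np.round(result.x, 3))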

  13. Online Analysis of Wind and Solar Part I: Ramping Tool

    SciTech Connect

    Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

  14. Online Analysis of Wind and Solar Part II: Transmission Tool

    SciTech Connect

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  15. Applied Climate-Change Analysis: The Climate Wizard Tool

    PubMed Central

    Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

    2009-01-01

    Background Although the message of “global climate change” is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951–2002 occurred in northern hemisphere countries (especially during January–April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0–20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070–2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate change. Moreover, Climate Wizard is not a static product, but rather a data analysis framework designed to be used for climate change impact and adaptation planning, which can be expanded to include other information, such as downscaled future projections of hydrology, soil moisture, wildfire, vegetation, marine conditions, disease, and agricultural productivity. PMID:20016827
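
    As a rough illustration of the quantile ensemble analysis mentioned above (median and spread of projected change across 16 GCMs), the following hedged sketch summarizes synthetic projections with NumPy percentiles. The numbers are placeholders, not Climate Wizard output or real GCM data.

        # Hedged sketch of an ensemble quantile summary: the median (and spread) of
        # projected temperature change across 16 GCMs. Synthetic numbers only.
        import numpy as np

        rng = np.random.default_rng(0)
        n_models, n_cells = 16, 1000                              # 16 GCMs, hypothetical grid cells
        delta_T = rng.normal(2.5, 0.8, size=(n_models, n_cells))  # degC change, 2070-2099 minus baseline

        # Quantiles across the model axis give the ensemble spread for each cell.
        q10, median, q90 = np.percentile(delta_T, [10, 50, 90], axis=0)

        print("area-mean median projected warming: %.2f degC" % median.mean())
        print("area-mean 10th-90th percentile range: %.2f to %.2f degC" % (q10.mean(), q90.mean()))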

  16. Risø-R-1359(EN) Fractography analysis of tool samples

    E-print Network

    Topics covered include high-speed tool steels, heat treatment, the effect of alloying elements, powder metallurgy, and third-generation P/M steels, together with finite element modelling (FEM) (see for example [3]), material selection, and heat treatment

  17. Novel tools for sequence and epitope analysis of glycosaminoglycans

    E-print Network

    Behr, Jonathan Robert

    2007-01-01

    Our understanding of glycosaminoglycan (GAG) biology has been limited by a lack of sensitive and efficient analytical tools designed to deal with these complex molecules. GAGs are heterogeneous and often sulfated linear ...

  18. An Integrated Traverse Planner and Analysis Tool for Planetary Exploration

    E-print Network

    Johnson, Aaron William

    Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. ...

  19. Cemented carbide cutting tool: Laser processing and thermal stress analysis

    Microsoft Academic Search

    B. S. Yilbas; A. F. M. Arif; C. Karatas; M. Ahsan

    2007-01-01

    Laser treatment of cemented carbide tool surface consisting of W, C, TiC, TaC is examined and thermal stress developed due to temperature gradients in the laser treated region is predicted numerically. Temperature rise in the substrate material is computed numerically using the Fourier heating model. Experiment is carried out to treat the tool surfaces using a CO2 laser while SEM,

  20. Improving space debris detection in GEO ring using image deconvolution

    NASA Astrophysics Data System (ADS)

    Núñez, Jorge; Núñez, Anna; Montojo, Francisco Javier; Condominas, Marta

    2015-07-01

    In this paper we present a method based on image deconvolution to improve the detection of space debris, mainly in the geostationary ring. Among the deconvolution methods we chose the iterative Richardson-Lucy (R-L), as the method that achieves better goals with a reasonable amount of computation. For this work, we used two sets of real 4096 × 4096 pixel test images obtained with the Telescope Fabra-ROA at Montsec (TFRM). Using the first set of data, we establish the optimal number of iterations in 7, and applying the R-L method with 7 iterations to the images, we show that the astrometric accuracy does not vary significantly while the limiting magnitude of the deconvolved images increases significantly compared to the original ones. The increase is in average about 1.0 magnitude, which means that objects up to 2.5 times fainter can be detected after deconvolution. The application of the method to the second set of test images, which includes several faint objects, shows that, after deconvolution, up to four previously undetected faint objects are detected in a single frame. Finally, we carried out a study of some economic aspects of applying the deconvolution method, showing that an important economic impact can be envisaged.
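
    The iterative Richardson-Lucy scheme chosen above, stopped at 7 iterations, has a compact form. The following is a minimal, hedged NumPy sketch assuming a known PSF and synthetic data; it is not the TFRM processing pipeline.

        # Minimal Richardson-Lucy sketch with a known, synthetic PSF; illustrative
        # only, not the TFRM processing pipeline described above.
        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(image, psf, n_iter=7, eps=1e-12):
            """Iterative R-L deconvolution; 7 iterations as suggested in the paper."""
            psf = psf / psf.sum()
            psf_mirror = psf[::-1, ::-1]
            estimate = np.full_like(image, image.mean())
            for _ in range(n_iter):
                blurred = fftconvolve(estimate, psf, mode="same")
                ratio = image / (blurred + eps)
                estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            return estimate

        # Tiny synthetic example: a point source blurred by a Gaussian PSF.
        y, x = np.mgrid[-7:8, -7:8]
        psf = np.exp(-(x**2 + y**2) / (2.0 * 2.0**2))
        truth = np.zeros((64, 64)); truth[32, 32] = 100.0
        observed = fftconvolve(truth, psf / psf.sum(), mode="same")
        restored = richardson_lucy(observed, psf, n_iter=7)
        print("brightest pixel after deconvolution:", np.unravel_index(restored.argmax(), restored.shape))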

  1. Suspected-target pesticide screening using gas chromatography-quadrupole time-of-flight mass spectrometry with high resolution deconvolution and retention index/mass spectrum library.

    PubMed

    Zhang, Fang; Wang, Haoyang; Zhang, Li; Zhang, Jing; Fan, Ruojing; Yu, Chongtian; Wang, Wenwen; Guo, Yinlong

    2014-10-01

    A strategy for suspected-target screening of pesticide residues in complicated matrices was exploited using gas chromatography in combination with hybrid quadrupole time-of-flight mass spectrometry (GC-QTOF MS). The screening workflow followed three key steps: initial detection, preliminary identification, and final confirmation. The initial detection of components in a matrix was done by a high resolution mass spectrum deconvolution; the preliminary identification of suspected pesticides was based on a special retention index/mass spectrum (RI/MS) library that contained both the first-stage mass spectra (MS(1) spectra) and retention indices; and the final confirmation was accomplished by accurate mass measurements of representative ions with their response ratios from the MS(1) spectra or representative product ions from the second-stage mass spectra (MS(2) spectra). To evaluate the applicability of the workflow in real samples, three matrices of apple, spinach, and scallion, each spiked with 165 test pesticides in a set of concentrations, were selected as the models. The results showed that the use of high-resolution TOF enabled effective extraction of spectra from noisy chromatograms based on a narrow mass window (5 mDa), with suspected-target compounds identified by the similarity match of deconvoluted full mass spectra and filtering of linear RIs. On average, over 74% of pesticides at 50 ng/mL could be identified using deconvolution and the RI/MS library. Over 80% of pesticides at 5 ng/mL or lower concentrations could be confirmed in each matrix using at least two representative ions with their response ratios from the MS(1) spectra. In addition, the application of product ion spectra was capable of confirming suspected pesticides with specificity for some pesticides in complicated matrices. In conclusion, GC-QTOF MS combined with the RI/MS library seems to be one of the most efficient tools for the analysis of suspected-target pesticide residues in complicated matrices. PMID:25059143
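
    Two steps of the screening workflow, linear retention-index filtering and the 5 mDa accurate-mass check, can be sketched as below. The alkane calibration, library entry and tolerances are hypothetical placeholders, not the authors' RI/MS library.

        # Hedged sketch of two screening steps: linear retention-index (RI) filtering
        # and an accurate-mass check within a 5 mDa window. The alkane calibration,
        # library entry and tolerances are illustrative placeholders.
        import numpy as np

        # Retention times (min) of n-alkane RI standards C10..C30 (hypothetical values).
        alkane_rt = np.array([4.2, 6.0, 7.9, 9.7, 11.4, 13.0, 14.5, 15.9, 17.2, 18.4, 19.5])
        alkane_ri = np.arange(1000, 3100, 200)

        def linear_ri(rt):
            """Linear retention index by interpolation between bracketing n-alkanes."""
            return float(np.interp(rt, alkane_rt, alkane_ri))

        def matches(candidate, library_entry, ri_tol=10.0, mass_tol=0.005):
            """Accept a deconvoluted hit only if RI and accurate mass both agree."""
            ri_ok = abs(linear_ri(candidate["rt"]) - library_entry["ri"]) <= ri_tol
            mass_ok = abs(candidate["mz"] - library_entry["mz"]) <= mass_tol   # 5 mDa window
            return ri_ok and mass_ok

        hit = {"rt": 10.6, "mz": 314.1879}            # deconvoluted component (hypothetical)
        pesticide_x = {"ri": 1708.0, "mz": 314.1869}  # hypothetical library entry
        print(matches(hit, pesticide_x))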

  2. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed instruction tailored to their experience level along with proper support and mentoring. This project was funded by a grant from the National Science Foundation, Grant # PHY1157078.
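
    One of the classroom tasks above, aperture photometry of a star, reduces to summing background-subtracted counts in an aperture and converting them to an instrumental magnitude. The hedged sketch below uses a synthetic frame and plain NumPy; it is not AstroImageJ, DS9, IRAF or DAOphot code.

        # Hedged sketch of simple aperture photometry on a synthetic frame: sum the
        # background-subtracted counts inside a circular aperture and convert them
        # to an instrumental magnitude. Not AstroImageJ/DS9/IRAF/DAOphot code.
        import numpy as np

        def aperture_magnitude(image, x0, y0, r_ap=5.0, r_in=8.0, r_out=12.0):
            yy, xx = np.indices(image.shape)
            r = np.hypot(xx - x0, yy - y0)
            sky = np.median(image[(r >= r_in) & (r < r_out)])   # local background from an annulus
            flux = np.sum(image[r <= r_ap] - sky)                # background-subtracted counts
            return -2.5 * np.log10(flux)                         # instrumental magnitude

        # Synthetic frame: flat sky plus one Gaussian star at the center.
        yy, xx = np.indices((100, 100))
        frame = 200.0 + 5000.0 * np.exp(-((xx - 50) ** 2 + (yy - 50) ** 2) / (2 * 2.5 ** 2))
        print("instrumental magnitude: %.2f" % aperture_magnitude(frame, 50, 50))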

  3. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation creation tools that obtain stresses and strains (Shear and peel) in adhesively bonded joints. For a given adhesively bonded joint Finite Element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for design of adhesively bonded joints, can be performed very quickly.
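
    The post-processing idea described above, converting spring forces and deformations to adhesive stresses and strains and sorting them in descending order, can be sketched as follows. The element forces, tributary areas and bondline thickness are hypothetical; this is not the NASA tool or any particular FE code's file format.

        # Hedged sketch of the post-processing step: convert spring forces and
        # deformations to adhesive stresses and strains, then sort in descending
        # order. Element data, tributary areas and bond thickness are hypothetical.
        import numpy as np

        spring_ids     = np.array([101, 102, 103, 104])
        shear_force    = np.array([85.0, 120.0, 40.0, 95.0])   # N, from the FE output (hypothetical)
        peel_force     = np.array([12.0, 30.0, 5.0, 18.0])     # N
        shear_deform   = np.array([4e-3, 6e-3, 2e-3, 5e-3])    # mm
        tributary_area = np.full(4, 25.0)                       # mm^2 of adhesive per spring
        bond_thickness = 0.2                                    # mm

        shear_stress = shear_force / tributary_area              # MPa
        peel_stress  = peel_force / tributary_area               # MPa
        shear_strain = shear_deform / bond_thickness              # engineering shear strain

        order = np.argsort(shear_stress)[::-1]                    # descending, as in the tools
        for sid, tau, sigma, gamma in zip(spring_ids[order], shear_stress[order],
                                          peel_stress[order], shear_strain[order]):
            print(f"spring {sid}: shear {tau:.2f} MPa, peel {sigma:.2f} MPa, strain {gamma:.3f}")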

  4. An Evaluation of Visual and Textual Network Analysis Tools

    SciTech Connect

    Goodall, John R [ORNL

    2011-01-01

    User testing is an integral component of user-centered design, but has only rarely been applied to visualization for cyber security applications. This article presents the results of a comparative evaluation between a visualization-based application and a more traditional, table-based application for analyzing computer network packet captures. We conducted this evaluation as part of the user-centered design process. Participants performed both structured, well-defined tasks and exploratory, open-ended tasks with both tools. We measured accuracy and efficiency for the well-defined tasks, counted the number of insights for the exploratory tasks, and recorded user perceptions for each tool. The results of this evaluation demonstrated that users performed significantly more accurately on the well-defined tasks, discovered a higher number of insights, and demonstrated a clear preference for the visualization tool. The study design presented may be useful for future researchers performing user testing on visualization for cyber security applications.

  5. Translational meta-analysis tool for temporal gene expression profiles.

    PubMed

    Tusch, Guenter; Tole, Olvi

    2012-01-01

    Widespread use of microarray technology has led to highly complex datasets that often address similar or related biological questions. In translational medicine, research is often based on measurements that have been obtained at different points in time, and the researcher looks at them as a progression over time. If a biological stimulus shows an effect on a particular gene that is reversed over time, this would show, for instance, as a peak in the gene's temporal expression profile. Our program SPOT helps researchers find these patterns in large sets of microarray data. We created the software tool using open-source platforms and the Semantic Web tool Protégé-OWL. PMID:22874385
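
    The pattern SPOT searches for, a peak in a gene's temporal expression profile, can be illustrated with a plain SciPy peak test. This hedged sketch is not the SPOT/Protégé-OWL implementation, and the time points and profiles are synthetic.

        # Hedged sketch of the pattern SPOT looks for: a peak in a gene's temporal
        # expression profile (an effect that reverses over time). Plain SciPy
        # illustration, not the SPOT / Protege-OWL implementation.
        import numpy as np
        from scipy.signal import find_peaks

        time_points = np.array([0, 2, 6, 12, 24, 48])            # hours (hypothetical design)
        profiles = {
            "geneA": np.array([1.0, 2.5, 4.8, 3.0, 1.2, 0.9]),   # up then back down: a peak
            "geneB": np.array([1.0, 1.3, 1.8, 2.4, 3.1, 3.9]),   # monotone increase: no peak
        }

        for gene, expr in profiles.items():
            peaks, props = find_peaks(expr, prominence=1.0)       # require a clear reversal
            if peaks.size:
                print(f"{gene}: peak at {time_points[peaks[0]]} h "
                      f"(prominence {props['prominences'][0]:.1f})")
            else:
                print(f"{gene}: no peak detected")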

  6. Accuracy of peak deconvolution algorithms within chromatographic integrators.

    PubMed

    Papas, A N; Tougas, T P

    1990-02-01

    The soundness of present-day algorithms to deconvolve overlapping skewed peaks was investigated. From simulated studies based on the exponentially modified Gaussian model (EMG), chromatographic peak area inaccuracies for unresolved peaks are presented for the two deconvolution methods, the tangent skim and the perpendicular drop method. These inherent inaccuracies, in many cases exceeding 50%, are much greater than those calculated from ideal Gaussian profiles. Multiple linear regression (MLR) was used to build models that predict the relative error for either peak deconvolution method. MLR also provided a means for determining influential independent variables, defining the required chromatographic relationships needed for prediction. Once forecasted errors for both methods are calculated, selection of either peak deconvolution method can be made by minimum errors. These selection boundaries are contrasted to method selection criteria of present data systems' algorithms. PMID:2305954
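
    The inaccuracy discussed above can be reproduced qualitatively with a hedged sketch: simulate two overlapping exponentially modified Gaussian (EMG) peaks with known areas and integrate them with a perpendicular drop at the valley. The peak parameters are illustrative, and the tangent-skim variant is omitted.

        # Hedged sketch: perpendicular-drop integration of two simulated, overlapping
        # exponentially modified Gaussian (EMG) peaks. Parameters are illustrative;
        # because the true areas are known, the splitting error is directly visible.
        import numpy as np
        from scipy.special import erfc

        def emg(t, area, mu, sigma, tau):
            """Exponentially modified Gaussian peak with the given total area."""
            arg = (mu - t) / tau + sigma**2 / (2.0 * tau**2)
            z = (mu - t) / (np.sqrt(2) * sigma) + sigma / (np.sqrt(2) * tau)
            return (area / (2.0 * tau)) * np.exp(arg) * erfc(z)

        t = np.linspace(0.0, 20.0, 4001)
        peak1 = emg(t, area=1.0, mu=8.0, sigma=0.3, tau=0.8)     # large, tailing peak
        peak2 = emg(t, area=0.2, mu=10.0, sigma=0.3, tau=0.8)    # small peak on the tail
        signal = peak1 + peak2

        # Perpendicular drop: split the fused peak at the valley between the apexes.
        apex1, apex2 = np.argmax(peak1), np.argmax(peak2)
        valley = apex1 + np.argmin(signal[apex1:apex2])
        dt = t[1] - t[0]
        area1 = signal[:valley].sum() * dt
        area2 = signal[valley:].sum() * dt
        print(f"perpendicular drop: {area1:.3f} vs true 1.0, {area2:.3f} vs true 0.2")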

  7. Accuracy of peak deconvolution algorithms within chromatographic integrators

    SciTech Connect

    Papas, A.N. (Food Drug Administration, Winchester, MA (USA)); Tougas, T.P. (Univ. of Lowell, MA (USA))

    1990-02-01

    The soundness of present-day algorithms to deconvolve overlapping skewed peaks was investigated. From simulated studies based on the exponentially modified Gaussian model (EMG), chromatographic peak area inaccuracies for unresolved peaks are presented for the two deconvolution methods, the tangent skim and the perpendicular drop method. These inherent inaccuracies, in many cases exceeding 50%, are much greater than those calculated from ideal Gaussian profiles. Multiple linear regression (MLR) was used to build models that predict the relative error for either peak deconvolution method. MLR also provided a means for determining influential independent variables, defining the required chromatographic relationships needed for prediction. Once forecasted errors for both methods are calculated, selection of either peak deconvolution method can be made by minimum errors. These selection boundaries are contrasted to method selection criteria of present data systems algorithms.

  8. On-line algorithms for blind deconvolution of multichannel linear time-invariant systems

    Microsoft Academic Search

    Y. Inoue; T. Sato

    1997-01-01

    Blind deconvolution and blind equalization have been important and interesting topics in diverse fields including data communication, image processing and geophysical data processing. Inouye and Habe (1995) proposed a constrained multistage criterion for attaining blind deconvolution of multichannel linear time-invariant (LTI) systems. In this paper, based on their constrained criterion, we present an iterative algorithm for solving the blind deconvolution problem

  9. Multiscale Analysis of Surface Topography from Single Point Incremental Forming using an Acetal Tool

    NASA Astrophysics Data System (ADS)

    Ham, M.; Powers, B. M.; Loiselle, J.

    2014-03-01

    Single point incremental forming (SPIF) is a sheet metal manufacturing process that forms a part by incrementally applying point loads to the material to achieve the desired deformations and final part geometry. This paper investigates the differences in surface topography between a carbide tool and an acetal-tipped tool. Area-scale analysis is performed on the confocal areal surface measurements per ASME B46. The objective of this paper is to determine at which scales surfaces formed by two different tool materials can be differentiated. It is found that the surfaces in contact with the acetal forming tool have greater relative areas at all scales greater than 5 × 10^4 µm^2 than the surfaces in contact with the carbide tools. The surfaces not in contact with the tools during forming, also referred to as the free surface, are unaffected by the tool material.

  10. A requirements analysis for videogame design support tools

    Microsoft Academic Search

    Mark J. Nelson; Michael Mateas

    2009-01-01

    Designing videogames involves weaving together systems of rules, called game mechanics, which support and structure compelling player experiences. Thus a significant portion of game design involves reasoning about the effects of different potential game mechanics on player experience. Unlike some design fields, such as architecture and mechanical design, that have CAD tools to support designers

  11. Numerical tools applied to power reactor noise analysis

    Microsoft Academic Search

    Christophe Demazière; Imre Pázsit

    2009-01-01

    In order to be able to calculate the space- and frequency-dependent neutron noise in real inhomogeneous systems in two-group theory, a code was developed for the calculation of the Green's function (dynamic transfer function) of such systems. This paper reports on the development as well as the test and application of the numerical tools employed. The code that was developed

  12. Residual Stress Field Analysis and Prediction in Nitrided Tool Steel

    Microsoft Academic Search

    B. Podgornik; V. Leskovšek; M. Kovačič; J. Vižintin

    2011-01-01

    Residual stresses are present in engineering components as an unintended consequence of manufacturing processes, but they are also deliberately introduced to beneficial effect during surface engineering procedures. Plasma nitriding is a process of particular importance for forming tools and dies, giving significant advantages in wear and fatigue resistance through the generation of near-surface compressive residual stresses. A precise knowledge of

  13. Clinical decision support tools: analysis of online drug information databases

    Microsoft Academic Search

    Kevin A Clauson; Wallace A Marsh; Hyla H Polen; Matthew J Seamon; Blanca I Ortiz

    2007-01-01

    BACKGROUND: Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information

  14. Assessment of Fourier Tools for Cancellous Bone Structure Analysis

    Microsoft Academic Search

    Tammy M. Cleek; Murk J. Bottema; Nicola L. Fazzalari; Karen J. Reynolds

    The usefulness of Fourier analyses as a tool for determining key parameters of cancellous bone structure is investigated. The autocorrelation function is used to determine measures of preferred orientation of trabeculae and anisotropy. Peaks in the power spectrum are used to determine average trabecular strut spacing. Good agreement and high correlations were observed when these frequency domain measurements were compared

  15. Deconvolution of NICISS profiles involving elements of similar masses

    NASA Astrophysics Data System (ADS)

    Ridings, Christiaan; Andersson, Gunther G.

    2014-12-01

    Neutral impact collision ion scattering spectroscopy uses the backscattering of projectiles to determine the concentration depth profiles of elements in soft matter systems. As such, the measured profiles are the actual concentration depth profiles convoluted with the inelastic energy loss distribution of the projectile. The inelastic loss distribution depends on the element from which the projectile is backscattered. In the case that two elements of similar masses are detected, their profiles can overlap within these energy loss distributions. In this case the standard deconvolution procedure used must be modified to exclude the energy loss straggling of the projectiles in the bulk to adequately deconvolute the profiles of two elements at the same time.

  16. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    PubMed Central

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-01-01

    Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data are often entered in or saved in the MS-Excel format, but this software lacks genetics- and epidemiology-related functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and converts them into other formats. It is free software. PMID:19852806

  17. VEA-bility Security Metric: A Network Security Analysis Tool Melanie Tupper

    E-print Network

    Zincir-Heywood, Nur

    Based on our findings, we conclude that the VEA-bility metric can be used to help configure a secure network from the given options. These tools are important to network administrators as they strive to provide secure, yet

  18. A TOOL TO SUPPORT INTERACTION AND COLLABORATION ANALYSIS OF LEARNING ACTIVITIES

    Microsoft Academic Search

    Nikolaos Avouris; Vassilis Komis; Giorgos Fiotakis; Meletis Margaritis; Nikos Tselios

    An increasing amount of data is collected today during studies in which students and educators are engaged in learning activities using information technology and other tools. These data are indispensable for analysis and evaluation of learning activities, for evaluation of new tools and for students' meta-cognitive activities. The data can take various forms, including video and audio recordings, log files

  19. Experimental modal analysis and vibration monitoring of cutting tool support structure

    Microsoft Academic Search

    Jongkil Lee; Dae-Hwan Kim

    1995-01-01

    The objective of this research is to determine how a cutting tool vibrates when chatter occurs and how this motion is related to the acoustic emission signal. Modal analysis impact tests were conducted to obtain the actual natural frequencies and mode shapes of the Cutting Tool Support Structure (CTSS) system. Cutting tests were also conducted to determine the phase relationship

  20. A comparative analysis of DEA as a discrete alternative multiple criteria decision tool

    Microsoft Academic Search

    Joseph Sarkis

    2000-01-01

    The application of Data Envelopment Analysis (DEA) as a discrete alternative multiple criteria decision making (MCDM) tool has been gaining more attention in the literature. In this literature, DEA has been applied as an MCDM tool and compared analytically to other MCDM models and their structures, especially those that are based on multiple objective linear programming approaches. In this paper,

  1. A new energy analysis tool for ground source heat pump systems

    Microsoft Academic Search

    A. Michopoulos; N. Kyriakis

    2009-01-01

    A new tool, suitable for energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1, 2h etc.) and the calculation time required is very short. The

  2. Compiling Dynamic Fault Trees into Dynamic Bayesian Nets for Reliability Analysis: the RADYBAN tool

    Microsoft Academic Search

    Luigi Portinale; Andrea Bobbio; Daniele Codetta-Raiteri; Stefania Montani

    2007-01-01

    In this paper, we present Radyban (Reliability Analysis with DYnamic BAyesian Networks), a software tool which allows systems modeled by means of Dynamic Fault Trees (DFT) to be analyzed by relying on automatic conversion into Dynamic Bayesian Networks (DBN). The tool aims at providing a familiar interface to reliability engineers, by allowing them to model the system to be analyzed with

  3. Radyban: a tool for reliability analysis of Dynamic Fault Trees through conversion into Dynamic Bayesian Networks

    Microsoft Academic Search

    S. Montani; L. Portinale; A. Bobbio; D. Codetta Raiteri

    2008-01-01

    In this paper, we present RADYBAN (Reliability Analysis with DYnamic BAyesian Networks), a software tool which allows a dynamic fault tree to be analyzed by relying on its conversion into a dynamic Bayesian network. The tool implements a modular algorithm for automatically translating a dynamic fault tree into the corresponding dynamic Bayesian network and exploits classical algorithms for inference on dynamic

  4. International Workshop on Analysis Tools and Methodologies for Embedded and Real-time

    E-print Network

    Lipari, Giuseppe

    Research in the field of real-time and embedded systems would greatly benefit from the availability of analysis tools and methodologies, which motivated the 1st International Workshop on Analysis Tools and Methodologies for Embedded and Real-time Systems (WATERS

  5. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  6. Javelin Diagrams: A Graphical Tool for Probabilistic Sensitivity Analysis

    Microsoft Academic Search

    James C. Felli; Gordon B. Hazen

    2004-01-01

    Abstract In order to demonstrate posthoc robustness of decision problems to parameter estimates, analysts may conduct a probabilistic sensitivity analysis, assigning distributions to uncertain parameters and computing the probability of decision change. In contrast to classical threshold proximity methods of sensitivity analysis, no appealing graphical methods are available to present the results of a probabilistic sensitivity analysis. Here we introduce

  7. A comparison of commonly used re-entry analysis tools

    Microsoft Academic Search

    Tobias Lips; Bent Fritsche

    2005-01-01

    Most spacecraft or rocket bodies re-entering the Earth's atmosphere, controlled or uncontrolled, do not demise completely during re-entry. Fragments of these re-entry objects survive and reach the ground where they pose a risk to people. Re-entry tools have been developed all over the world in order to calculate the destruction processes and to assess the resulting ground risk. This paper

  8. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).

  9. Online Tool for Analysis of Denaturing Gradient Gel Electrophoresis Profiles

    Microsoft Academic Search

    Florian Huber; Peter Peduzzi

    2004-01-01

    We present an online tool (EquiBands, http://www.univie.ac.at/IECB/limno/equibands/EquiBands.html) that quantifies the matching of two bands considered to be the same in different samples, even when samples are applied to different denaturing gradient gel electrophoresis gels. With an environmental example we demonstrate the procedure for the classification of two bands of different samples with the help of EquiBands. In denaturing gradient

  10. Analysis tools for the calibration and commissioning of the AOF

    NASA Astrophysics Data System (ADS)

    Garcia-Rissmann, Aurea; Kolb, Johann; Le Louarn, Miska; Madec, Pierre-Yves; Muller, Nicolas

    2013-12-01

    The Adaptive Optics Facility (AOF) is an AO-oriented upgrade envisaged to be implemented at the UT4 in Paranal in 2013-2014, and which could serve as a test case for the E-ELT. Counting on the largest Deformable Secondary Mirror ever built (1170 actuators) and on four off-axis Na laser launch telescopes, the AOF will operate in distinct modes (GLAO, LTAO, SCAO), in accordance with the instruments attached to the 2 telescope Nasmyth ports (GALACSI+MUSE, GRAAL+HAWK-I) and to the Cassegrain port (ERIS). Tools are under development to allow fast testing of important parameters for these systems during commissioning and for subsequent assessment of telemetry data. These concern the determination of turbulence parameters and Cn2 profiling, measurement of Strehl and ensquared energies, misregistration calculation, bandwidth & overall performance, etc. Our tools are presented as Graphical User Interfaces developed in the Matlab environment, and will be able to grab, through a dedicated server, data saved in SPARTA standards. We present here the tools developed to date and discuss details of what can be obtained from the AOF, based on simulations.

  11. Process-oriented evaluation of user interactions in integrated system analysis tools

    E-print Network

    Lee, Chaiwoo

    When computer-based tools are used for analysis of complex systems, the design of user interactions and interfaces becomes an essential part of development that determines the overall quality. The objective of this study ...

  12. The Ribosomal Database Project: improved alignments and new tools for rRNA analysis.

    PubMed

    Cole, J R; Wang, Q; Cardenas, E; Fish, J; Chai, B; Farris, R J; Kulam-Syed-Mohideen, A S; McGarrell, D M; Marsh, T; Garrity, G M; Tiedje, J M

    2009-01-01

    The Ribosomal Database Project (RDP) provides researchers with quality-controlled bacterial and archaeal small subunit rRNA alignments and analysis tools. An improved alignment strategy uses the Infernal secondary structure aware aligner to provide a more consistent higher quality alignment and faster processing of user sequences. Substantial new analysis features include a new Pyrosequencing Pipeline that provides tools to support analysis of ultra high-throughput rRNA sequencing data. This pipeline offers a collection of tools that automate the data processing and simplify the computationally intensive analysis of large sequencing libraries. In addition, a new Taxomatic visualization tool allows rapid visualization of taxonomic inconsistencies and suggests corrections, and a new class Assignment Generator provides instructors with a lesson plan and individualized teaching materials. Details about RDP data and analytical functions can be found at http://rdp.cme.msu.edu/. PMID:19004872

  13. An integrated traverse planner and analysis tool for future lunar surface exploration

    E-print Network

    Johnson, Aaron William

    2010-01-01

    This thesis discusses the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT), a system designed to help maximize productivity, scientific return, and safety on future lunar and planetary explorations. The ...

  14. A simple classification tool for single-trial analysis of ERP components

    E-print Network

    Bandt, Christoph

    Corresponding author: Prof. Christoph Bandt (bandt@uni-greifswald.de). Abstract: Event-related potentials

  15. Participatory Analysis of Synchronous Collaborative Problem Solving using the OCAF methodology and tools

    Microsoft Academic Search

    Nikos Avouris; Angeligue Dimitracopoulou; Vassilis Komis; Meletis Margaritis

    2003-01-01

    This interactive event aims at introducing the participants to the analysis of collaborative problem-solving activities, in which they will first be involved themselves, using the OCAF (Object-oriented Collaboration Analysis Framework) methodology and collaboration analysis tools. The event is planned to take place in a computer laboratory and evolves in three phases: during the first stage, the participants

  16. Geometric diffusions as a tool for harmonic analysis and structure definition of data: Multiscale methods

    E-print Network

    ...of the space. It can be thought of as global Fourier analysis. The multiscale analysis proposed here

  17. Tools and algorithms to advance interactive intrusion analysis via Machine Learning and Information Retrieval

    E-print Network

    We approach the intrusion analysis of log data from the perspectives of Machine Learning and Information Retrieval, with the aim of improving the analyst's efficiency. In doing so, we attempt to translate intrusion analysis problems into the language

  18. Automated image analysis as a tool to quantify the colour and composition of rainbow trout

    E-print Network

    Manne, Fredrik

    Automated image analysis is presented as a tool to quantify the colour and composition of cutlets in rainbow trout. The proposed automated image analysis methods were tested on a total of 983 trout cutlets. Keywords: image analysis; rainbow trout; cutlet

  19. HiTRACE-Web: an online tool for robust analysis of high-throughput capillary electrophoresis

    E-print Network

    Das, Rhiju

    To analyze large-scale high-throughput capillary electrophoresis data, we previously proposed a suite of efficient analysis software named HiTRACE (High Throughput Robust Analysis of Capillary Electrophoresis). HiTRACE has been

  20. Kriging the Fields: a New Statistical Tool for Wave Propagation Analysis

    E-print Network

    Libre de Bruxelles, Université

    The method proposed here is to apply the same kriging technique to electromagnetic wave propagation analysis, for performance analysis or electromagnetic compatibility studies. Such studies generally focus either on the local field

  1. Using a WCET Analysis Tool in Real-Time Systems Education

    Microsoft Academic Search

    Samuel Petersson; Andreas Ermedahl; Anders Pettersson; Daniel Sundmark; Niklas Holsti

    2005-01-01

    To reach more widespread use, WCET analysis tools need to be a standard part of the education of embedded systems developers. Many real-time courses in academia use Lego Mindstorms, an off-the-shelf kit of Lego bricks for building and controlling small prototype robots. We describe work on porting the Bound-T WCET analysis tool to the Lego Mindstorms microprocessor; the Renesas

  2. Development of a task analysis tool to facilitate user interface design

    NASA Technical Reports Server (NTRS)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  3. VAST: A Human-Centered, Domain-Independent Video Analysis Support Tool

    E-print Network

    Nordt, Marlo Faye

    2011-08-08

    A dissertation by Marlo Faye Nordt, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of Doctor of Philosophy, December 2008. Major subject: Computer Science.

  4. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  5. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  6. Deconvolution-Based CT and MR Brain Perfusion Measurement: Theoretical Model Revisited and Practical Implementation Details

    PubMed Central

    Fieselmann, Andreas; Kowarschik, Markus; Ganguly, Arundhuti; Hornegger, Joachim; Fahrig, Rebecca

    2011-01-01

    Deconvolution-based analysis of CT and MR brain perfusion data is widely used in clinical practice and it is still a topic of ongoing research activities. In this paper, we present a comprehensive derivation and explanation of the underlying physiological model for intravascular tracer systems. We also discuss practical details that are needed to properly implement algorithms for perfusion analysis. Our description of the practical computer implementation is focused on the most frequently employed algebraic deconvolution methods based on the singular value decomposition. In particular, we further discuss the need for regularization in order to obtain physiologically reasonable results. We include an overview of relevant preprocessing steps and provide numerous references to the literature. We cover both CT and MR brain perfusion imaging in this paper because they share many common aspects. The combination of both the theoretical as well as the practical aspects of perfusion analysis explicitly emphasizes the simplifications to the underlying physiological model that are necessary in order to apply it to measured data acquired with current CT and MR scanners. PMID:21904538
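
    The algebraic deconvolution the paper focuses on, singular value decomposition with regularization, can be sketched in a few lines: build the AIF convolution matrix, suppress small singular values, and recover the flow-scaled residue function. The curves and the truncation threshold below are synthetic and illustrative, not the paper's specific regularization choice.

        # Hedged sketch of truncated-SVD deconvolution for perfusion data: build the
        # AIF convolution matrix, suppress small singular values, and read CBF off
        # the peak of the recovered residue function. Synthetic curves; the 10%
        # truncation threshold is illustrative, not the paper's regularization choice.
        import numpy as np
        from scipy.linalg import toeplitz

        dt = 1.0                                        # s, sampling interval
        t = np.arange(0, 60, dt)
        aif = np.exp(-((t - 15.0) / 4.0) ** 2)          # synthetic arterial input function
        residue_true = np.exp(-t / 8.0)                 # exponential residue function (MTT = 8 s)
        cbf_true = 0.01                                 # arbitrary flow units
        tissue = cbf_true * dt * np.convolve(aif, residue_true)[: t.size]
        tissue += 1e-4 * np.random.default_rng(1).normal(size=t.size)   # measurement noise

        A = dt * toeplitz(aif, np.zeros_like(aif))      # lower-triangular convolution matrix
        U, s, Vt = np.linalg.svd(A)
        s_inv = np.where(s > 0.1 * s.max(), 1.0 / s, 0.0)   # truncate small singular values
        k = Vt.T @ (s_inv * (U.T @ tissue))                 # k(t) = CBF * R(t)

        print(f"estimated CBF: {k.max():.4f} (true value {cbf_true})")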

  7. SOLAR ARRAY VERIFICATION AND ANALYSIS TOOL (SAVANT) ANALYSIS OF THE MICROELECTRONICS AND PHOTONICS TESTBED (MPTB) SPACE SOLAR CELL DATA

    Microsoft Academic Search

    R. J. Walters; G. P. Summers; S. R. Messenger; T. L. Morton

    The United States Naval Research Laboratory (NRL), in collaboration with the Ohio Aerospace Institute (OAI) and NASA Glenn Research Center (GRC), has developed an improved space solar cell radiation response analysis capability and produced a computer modeling tool which implements the analysis. This was accomplished, in part, through analysis of data taken from the solar panel that powers the Micro-electronics

  8. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  9. Time-varying wavelet estimation and deconvolution by kurtosis maximization

    E-print Network

    van der Baan, Mirko

    Kurtosis maximization can reveal the phase of a seismic wavelet. It is robust enough to detect time-varying phase changes. Phase deconvolution can be achieved using time-varying Wiener filtering. Time-varying wavelet extraction

  10. Robust wavelet estimation and blind deconvolution of noisy surface seismics

    E-print Network

    van der Baan, Mirko

    ...if the bandwidth of the seismic wavelet is narrow to very narrow; that is, if the wavelet bandwidth is similar to its principal frequency. The main problem is to estimate the phase of the wavelet with sufficient

  11. Non-iterative wavelet-based deconvolution for sparse aperture system

    NASA Astrophysics Data System (ADS)

    Xu, Wenhai; Zhao, Ming; Li, Hongshu

    2013-05-01

    Optical sparse aperture imaging is a promising technology to obtain high resolution with a significant reduction in size and weight, achieved by minimizing the total light collection area. However, as the collection area decreases, the OTF is also greatly attenuated, and thus the direct imaging quality of a sparse aperture system is very poor. In this paper, we focus on post-processing methods for sparse aperture systems, and propose a non-iterative wavelet-based deconvolution algorithm. The algorithm is performed by adaptively denoising the Fourier-based deconvolution results on the wavelet basis. We set up a Golay-3 sparse-aperture imaging system, where imaging and deconvolution experiments on natural scenes are performed. The experiments demonstrate that the proposed method greatly improves the imaging quality of the Golay-3 sparse-aperture system, and produces satisfactory visual quality. Furthermore, our experimental results also indicate that the sparse aperture system has the potential to reach higher resolution with the help of better post-processing deconvolution techniques.
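
    The general recipe named above, a non-iterative Fourier-domain deconvolution followed by wavelet-domain denoising, is sketched below with a Wiener-like inverse filter and soft thresholding via PyWavelets. This is a hedged illustration with a synthetic Gaussian PSF, not the authors' adaptive scheme or a Golay-3 OTF.

        # Hedged sketch of the general recipe: a regularized Fourier-domain inverse
        # followed by wavelet-domain soft thresholding. Synthetic Gaussian PSF and
        # scene; not the authors' adaptive scheme or a Golay-3 OTF.
        import numpy as np
        import pywt
        from scipy.signal import fftconvolve

        rng = np.random.default_rng(0)
        truth = np.zeros((128, 128)); truth[40:90, 40:90] = 1.0       # synthetic scene
        y, x = np.mgrid[-7:8, -7:8]
        psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2)); psf /= psf.sum()
        observed = fftconvolve(truth, psf, mode="same") + 0.01 * rng.normal(size=truth.shape)

        # Step 1: Wiener-like Fourier inverse; the small constant keeps the weak
        # parts of the OTF from amplifying noise.
        psf_pad = np.zeros_like(observed)
        psf_pad[:15, :15] = psf
        psf_pad = np.roll(psf_pad, (-7, -7), axis=(0, 1))             # center PSF at the origin
        H = np.fft.fft2(psf_pad)
        G = np.conj(H) / (np.abs(H) ** 2 + 1e-2)
        fourier_estimate = np.real(np.fft.ifft2(np.fft.fft2(observed) * G))

        # Step 2: soft-threshold the wavelet detail coefficients of the estimate.
        coeffs = pywt.wavedec2(fourier_estimate, "db4", level=3)
        thr = 0.05
        denoised = [coeffs[0]] + [tuple(pywt.threshold(d, thr, mode="soft") for d in level)
                                  for level in coeffs[1:]]
        restored = pywt.waverec2(denoised, "db4")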

  12. IMAGE DECONVOLUTION BY STEIN BLOCK THRESHOLDING, C. Chesneau

    E-print Network

    Paris-Sud XI, Université de

    The approach combines Stein block thresholding with the Vaguelette-Wavelet Decomposition: the observed image is first denoised using wavelet-domain Stein block thresholding, and the convolution operator is then inverted. Deconvolution is a classical problem in image processing, and there is an extensive statistical literature on wavelet-based deconvolution

  13. Adaptive parameter selection for block wavelet-thresholding deconvolution

    E-print Network

    Boyer, Edmond

    Abstract: In this paper, we propose a data-driven block thresholding procedure for wavelet-based deconvolution. For image denoising problems, a Stein risk estimator has been proposed in Peyré et al. [2011

  14. Accuracy of peak deconvolution algorithms within chromatographic integrators

    Microsoft Academic Search

    Andrew N. Papas; Terrence P. Tougas

    1990-01-01

    The soundness of present-day algorithms to deconvolve overlapping skewed peaks was investigated. From simulated studies based on the exponentially modified Gaussian model (EMG), chromatographic peak area inaccuracies for unresolved peaks are presented for the two deconvolution methods, the tangent skim and the perpendicular drop method. These inherent inaccuracies, in many cases exceeding 50%, are much greater than those calculated from

  15. IMPROVED TEMPORAL AND SPATIAL FOCUSING USING DECONVOLUTION: THEORETICAL, NUMERICAL AND

    E-print Network

    Snieder, Roel

    Focusing scattered waveforms to a point in both time and space, ideally to a delta function, is of interest in a variety of fields such as medicine, communications, nondestructive evaluation (NDE), and seismology. In practice, the improved spatial focus achieved using deconvolution is demonstrated by scanning around the source location with a laser.

  16. Generalized Gaussian Model for Multi Channel Image Deconvolution

    Microsoft Academic Search

    M. El-Sayed Waheed

    2006-01-01

    In multichannel blind deconvolution (MBD), the goal is to calculate possibly scaled and delayed estimates of the source signals from their convolutive mixtures, using approximate knowledge of the source characteristics only. Nearly all of the solutions to MBD proposed so far require the source signals to be pair-wise statistically independent and temporally correlated. In practice, this can

  17. Deconvolution of adaptive optics retinal images Julian C. Christou

    E-print Network

    Deconvolution can improve the contrast of adaptive optics retinal images; in this work we demonstrate that quantitative information can also be recovered. High-resolution imaging of the retina is achieved by using adaptive optics (AO). The wave-front correction is not perfect, however.

  18. Radial homomorphic deconvolution of B-mode medical ultrasound images

    Microsoft Academic Search

    T. Taxt

    1994-01-01

    Describes how homomorphic deconvolution can be used to improve the radial resolution of in vitro and in vivo medical ultrasound images. Each of the recorded radiofrequency ultrasound beams used to form the image was considered as a finite depth sequence of length N, and was weighted with the same exponential depth sequence to create at least some minimum phase sequences.

  19. Deconvolution Estimation of Nerve and Muscle Conduction Velocity Distribution

    E-print Network

    Gonzalez Cueto, Jose

    The Nerve Evoked Response and the Muscle Voluntary Response are described. The nerve estimates have lower variance than the muscle estimates; however, a filtering technique helps to reduce the variance of the muscle estimates.

  20. STOCHASTIC DECONVOLUTION OVER GROUPS FOR INVERSE PROBLEMS IN IMAGING

    E-print Network

    Yazici, Birsen

    Our approach is based on group representation theory and the concept of group stationarity. We formulate a minimum mean square solution to the deconvolution problem in the presence of nonstationary measurement noise. Our approach incorporates a priori information about the noise and the unknown signal into the inversion problem.

  1. Phase unwrapping for 2-D blind deconvolution of ultrasound images

    Microsoft Academic Search

    Oleg V. Michailovich; Dan Adam

    2004-01-01

    In most approaches to the problem of two-dimensional homomorphic deconvolution of ultrasound images, the estimation of a corresponding point-spread function (PSF) is necessarily the first stage in the process of image restoration. This estimation is usually performed in the Fourier domain by either successive or simultaneous estimation of the amplitude and phase of the Fourier transform (FT) of the PSF.

  2. Machine learning deconvolution filter kernels for image restoration

    NASA Astrophysics Data System (ADS)

    Mainali, Pradip; Wittebrood, Rimmert

    2015-03-01

    In this paper, we propose a novel algorithm to recover a sharp image from its corrupted form by deconvolution. The algorithm learns the deconvolution process by learning deconvolution filter kernels for a set of learnt basic pixel patterns. The algorithm consists of an offline learning stage and an online filtering stage. In the one-time offline learning stage, the algorithm learns a dictionary of local pixel-patch characteristics as the basic pixel patterns from a large number of natural images in the training database. The deconvolution filter coefficients for each pixel pattern are then optimized using the source and corrupted image pairs in the training database. In the online stage, the algorithm only needs to find the nearest matching pixel pattern in the dictionary for each pixel and filter it using the filter optimized for the corresponding pixel pattern. Experimental results on natural images show that our method achieves state-of-the-art results on image deblurring. The proposed approach can be applied to recover sharp images in applications such as cameras, HD/UHD TV, and document scanning systems.
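
    The sketch below is a heavily simplified, hypothetical rendering of the offline-learning / online-filtering idea: local pixel patterns of a blurred image are clustered and one linear deconvolution kernel is fitted per pattern by least squares. Patch size, cluster count and the synthetic Gaussian blur are assumptions; the published method's dictionary and training set are not reproduced.

        # Simplified sketch, not the paper's algorithm: per-pattern learnt kernels.
        import numpy as np
        from scipy.ndimage import gaussian_filter
        from sklearn.cluster import KMeans

        def extract_patches(img, k=5):
            """All k x k patches (as rows) plus the coordinates of their centres."""
            r = k // 2
            rows, cols = img.shape
            patches, centres = [], []
            for i in range(r, rows - r):
                for j in range(r, cols - r):
                    patches.append(img[i - r:i + r + 1, j - r:j + r + 1].ravel())
                    centres.append((i, j))
            return np.array(patches), centres

        rng = np.random.default_rng(1)
        sharp = rng.random((96, 96))                      # stand-in training image
        blurred = gaussian_filter(sharp, sigma=1.2)       # known synthetic degradation

        # offline stage: learn a dictionary of patterns and one kernel per pattern
        X, centres = extract_patches(blurred)
        targets = np.array([sharp[i, j] for i, j in centres])
        km = KMeans(n_clusters=16, n_init=4, random_state=0).fit(X)
        kernels = {}
        for c in range(km.n_clusters):
            idx = km.labels_ == c
            # least-squares filter mapping a blurred patch to the sharp centre pixel
            kernels[c], *_ = np.linalg.lstsq(X[idx], targets[idx], rcond=None)

        # online stage: match each pixel's pattern and apply its learnt kernel
        labels = km.predict(X)
        restored = blurred.copy()
        for patch, (i, j), lab in zip(X, centres, labels):
            restored[i, j] = patch @ kernels[lab]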

  3. Estimation of colored plant noise using Kalman filter based deconvolution

    Microsoft Academic Search

    M.-H. Yoon; T. V. Ramabadran

    1991-01-01

    In many deconvolution problems, the signal to be estimated is modeled as the input to a known plant and assumed white. There are, however, situations in which this signal is not white. A simple iterative scheme for estimating colored sequences is presented. In this scheme, the colored plant noise is modeled as the output of a shaping filter excited by

  4. Identification of RC networks by deconvolution: chances and limits

    Microsoft Academic Search

    V. Szekely

    1998-01-01

    This paper deals with the identification of RC networks from their time- or frequency-domain responses. A new method is presented based on a recent approach of the network description where all response functions are calculated by convolution integrals. The identification is carried out by deconvolution (NID method). This paper discusses the practical details of the method. Special attention is paid

  5. MULTICHANNEL BLIND SEPARATION AND DECONVOLUTION OF SOURCES WITH

    E-print Network

    Cichocki, Andrzej

    Such methods use higher-order statistical information about the source signals for the blind signal separation task [1]-[12]. We consider multichannel blind separation and deconvolution of arbitrary non-Gaussian sources. Our technique monitors the statistics of each of the outputs of the separator.

  6. MULTICHANNEL BLIND SEPARATION AND DECONVOLUTION OF SOURCES WITH

    E-print Network

    Douglas, Scott C.

    Such methods use higher-order statistical information about the source signals to iteratively adjust the separator [1]--[12]. We consider multichannel blind separation and deconvolution of arbitrary non-Gaussian sources. Our technique monitors the statistics of each of the outputs of the separator.

  7. TOTAL VARIATION SEMI-BLIND DECONVOLUTION USING SHOCK FILTERS

    E-print Network

    We present a semi-blind method for image deconvolution that combines total variation regularization with shock filters. Results indicate the method is robust for both black and non-black background images while reducing

  8. New regularization scheme for blind color image deconvolution

    NASA Astrophysics Data System (ADS)

    Chen, Li; He, Yu; Yap, Kim-Hui

    2011-01-01

    This paper proposes a new regularization scheme to address blind color image deconvolution. Color images generally have a significant correlation among the red, green, and blue channels. Conventional blind monochromatic deconvolution algorithms handle each color image channel independently, thereby ignoring the interchannel correlation present in the color images. In view of this, a unified regularization scheme for the image is developed to recover the edges of color images and reduce color artifacts. In addition, by using the color image properties, a spectral-based regularization operator is adopted to impose constraints on the blurs. Further, this paper proposes a reinforcement regularization framework that integrates a soft parametric learning term in addressing blind color image deconvolution. A blur modeling scheme is developed to evaluate the relevance of manifold parametric blur structures, and the information is integrated into the deconvolution scheme. An optimization procedure called alternating minimization is then employed to iteratively minimize the image- and blur-domain cost functions. Experimental results show that the method is able to achieve satisfactorily restored color images under different blurring conditions.
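
    For orientation, the following is a minimal sketch of an alternating-minimization loop for blind deconvolution on a single grey channel, alternating Tikhonov-regularized Fourier-domain updates of the image and the blur. The colour-channel coupling, spectral blur regularizer and parametric blur learning of the paper are not reproduced, and the regularization weights are assumed values.

        # Sketch of alternating minimization for blind deconvolution (grey channel only).
        import numpy as np

        def am_blind_deconvolution(y, psf_size=15, iters=30, lam=1e-2, gam=1e-1):
            """Alternate Fourier-domain Tikhonov updates of the image x and blur h."""
            Y = np.fft.fft2(y)
            x = y.copy()                               # initial image estimate
            h = np.zeros(y.shape)                      # initial blur: small flat kernel
            h[:psf_size, :psf_size] = 1.0 / psf_size**2
            for _ in range(iters):
                H = np.fft.fft2(h)
                X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)    # image update
                x = np.clip(np.real(np.fft.ifft2(X)), 0, None)
                X = np.fft.fft2(x)
                H = np.conj(X) * Y / (np.abs(X) ** 2 + gam)    # blur update
                h = np.clip(np.real(np.fft.ifft2(H)), 0, None)
                h /= h.sum() + 1e-12                           # keep the PSF normalized
            return x, h

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            truth = np.zeros((64, 64)); truth[20:44, 20:44] = 1.0
            kern = np.zeros((64, 64)); kern[:5, :5] = 1.0 / 25.0
            blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(kern)))
            blurred += 0.01 * rng.standard_normal(blurred.shape)
            x_est, h_est = am_blind_deconvolution(blurred)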

  9. Detection of gravity field source boundaries using deconvolution method

    NASA Astrophysics Data System (ADS)

    Zuo, Boxin; Hu, Xiangyun; Liang, Yin; Han, Qi

    2014-12-01

    Complications arise in the interpretation of gravity fields because of interference from systematic degradations, such as boundary blurring and distortion. The major sources of these degradations are the various systematic errors that inevitably occur during gravity field data acquisition, discretization and geophysical forward modelling. To address this problem, we evaluate a deconvolution method that aims to detect the clear horizontal boundaries of anomalous sources by the suppression of systematic errors. A convolution-based multilayer projection model, based on the classical 3-D gravity field forward model, is derived to model the systematic error degradation. Our deconvolution algorithm is specifically designed based on this multilayer projection model, in which three types of systematic error are defined. The degradations of the different systematic errors are considered in the deconvolution algorithm. As the primary source of degradation, the convolution-based systematic error is the main object of the multilayer projection model. Both the random systematic error and the projection systematic error are shown to form an integral part of the multilayer projection model, and the mixed norm regularization method and the primal-dual optimization method are therefore employed to control these errors and stabilize the deconvolution solution. We herein analyse the parameter identification and convergence of the proposed algorithms, and synthetic and field data sets are both used to illustrate their effectiveness. Additional synthetic examples are specifically designed to analyse the effects of the projection systematic error, which is caused by the uncertainty associated with the estimation of the impulse response function.

  10. RELIABILITY ANALYSIS OF BGA PACKAGES - A TOOL FOR DESIGN ENGINEERS

    Microsoft Academic Search

    James W. Jones; Jason Li; Yuichi Yamazaki; Jidong Yang; Masaki Shiratori; Qiang Yu

    An innovative design and analysis procedure has been developed by The MacNeal-Schwendler Corporation (MSC) that enables the design engineer to perform analysis of Ball Grid Array (BGA) packages in an order of magnitude less time than was previously required by experienced analysts. This procedure captures the analytical expertise of the experienced analyst and makes it available to the design engineer

  11. Negotiation Process Analysis: A Research and Training Tool.

    ERIC Educational Resources Information Center

    Williams, Timothy

    This paper proposes the use of interaction process analysis to study negotiation behaviors. Following a review of current literature in the field, the paper presents a theoretical framework for the analysis of both labor/management and social negotiation processes. Central to the framework described are two systems of activities that together…

  12. De-convoluting mixed crude oil in Prudhoe Bay Field, North Slope, Alaska

    USGS Publications Warehouse

    Peters, K.E.; Scott, Ramos L.; Zumberge, J.E.; Valin, Z.C.; Bird, K.J.

    2008-01-01

    Seventy-four crude oil samples from the Barrow arch on the North Slope of Alaska were studied to assess the relative volumetric contributions from different source rocks to the giant Prudhoe Bay Field. We applied alternating least squares to concentration data (ALS-C) for 46 biomarkers in the range C19-C35 to de-convolute mixtures of oil generated from carbonate rich Triassic Shublik Formation and clay rich Jurassic Kingak Shale and Cretaceous Hue Shale-gamma ray zone (Hue-GRZ) source rocks. ALS-C results for 23 oil samples from the prolific Ivishak Formation reservoir of the Prudhoe Bay Field indicate approximately equal contributions from Shublik Formation and Hue-GRZ source rocks (37% each), less from the Kingak Shale (26%), and little or no contribution from other source rocks. These results differ from published interpretations that most oil in the Prudhoe Bay Field originated from the Shublik Formation source rock. With few exceptions, the relative contribution of oil from the Shublik Formation decreases, while that from the Hue-GRZ increases in reservoirs along the Barrow arch from Point Barrow in the northwest to Point Thomson in the southeast (~250 miles or 400 km). The Shublik contribution also decreases to a lesser degree between fault blocks within the Ivishak pool from west to east across the Prudhoe Bay Field. ALS-C provides a robust means to calculate the relative amounts of two or more oil types in a mixture. Furthermore, ALS-C does not require that pure end member oils be identified prior to analysis or that laboratory mixtures of these oils be prepared to evaluate mixing. ALS-C of biomarkers reliably de-convolutes mixtures because the concentrations of compounds in mixtures vary as linear functions of the amount of each oil type. ALS of biomarker ratios (ALS-R) cannot be used to de-convolute mixtures because compound ratios vary as nonlinear functions of the amount of each oil type.
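
    The linear-mixing idea behind ALS-C can be illustrated with a small alternating-least-squares factorization: a sample-by-biomarker concentration matrix is factored into non-negative mixing fractions and end-member profiles. The synthetic data and three-source assumption below are placeholders; this is not the implementation used in the study.

        # Sketch only: alternating least squares on synthetic concentration data.
        import numpy as np

        def als_unmix(D, n_sources=3, iters=500, seed=0):
            rng = np.random.default_rng(seed)
            n_samples, n_species = D.shape
            C = rng.random((n_samples, n_sources))
            C /= C.sum(axis=1, keepdims=True)            # mixing fractions sum to one
            for _ in range(iters):
                S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)   # end members
                C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 1e-9, None)
                C /= C.sum(axis=1, keepdims=True)        # renormalize mixing fractions
            return C, S

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            true_S = rng.random((3, 46)) * 100           # 3 oil families, 46 biomarkers
            true_C = rng.dirichlet(np.ones(3), size=74)  # 74 oil samples
            D = true_C @ true_S + rng.normal(0, 0.5, size=(74, 46))
            C_est, S_est = als_unmix(D)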

  13. Energy life-cycle analysis modeling and decision support tool

    SciTech Connect

    Hoza, M.; White, M.E.

    1993-06-01

    As one of DOE's five multi-program national laboratories, Pacific Northwest Laboratory (PNL) develops and deploys technology for national missions in energy and the environment. The Energy Information Systems Group, within the Laboratory's Computer Sciences Department, focuses on the development of the computational and data communications infrastructure and automated tools for the Transmission and Distribution energy sector and for advanced process engineering applications. The energy industry is being forced to operate in new ways and under new constraints. It is in a reactive mode, reacting to policies and politics, and to economics and environmental pressures. The transmission and distribution sectors are being forced to find new ways to maximize the use of their existing infrastructure, increase energy efficiency, and minimize environmental impacts, while continuing to meet the demands of an ever increasing population. The creation of a sustainable energy future will be a challenge for both the soft and hard sciences. It will require that we as creators of our future be bold in the way we think about our energy future and aggressive in its development. The development of tools to help bring about a sustainable future will not be simple either. The development of ELCAM, for example, represents a stretch for the computational sciences as well as for each of the domain sciences such as economics, which will have to be team members.

  14. Recognition of Protozoa and Metazoa using image analysis tools, discriminant analysis, neural networks and decision trees.

    PubMed

    Ginoris, Y P; Amaral, A L; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2007-07-01

    Protozoa and metazoa are considered good indicators of treatment quality in activated sludge systems because these organisms are fairly sensitive to physical, chemical and operational processes. Therefore, it is possible to establish close relationships between the predominance of certain species or groups of species and several operational parameters of the plant, such as the biotic indices, namely the Sludge Biotic Index (SBI). This procedure requires the identification, classification and enumeration of the different species, which is usually achieved manually and therefore requires both time and expertise. Digital image analysis combined with multivariate statistical techniques has proved to be a useful tool for classifying and quantifying organisms in an automatic, non-subjective way. This work presents a semi-automatic image analysis procedure for protozoa and metazoa recognition developed in the Matlab language. The obtained morphological descriptors were analyzed using discriminant analysis, neural network and decision tree multivariate statistical techniques to identify and classify each protozoan or metazoan. The procedure was quite adequate for distinguishing between the non-sessile protozoa classes and also for the metazoa classes, with high values for overall species recognition with the exception of sessile protozoa. In terms of wastewater condition assessment, the obtained results were found to be suitable for predicting these conditions. Finally, the discriminant analysis and neural network results were found to be quite similar, whereas the decision tree technique was less appropriate. PMID:17605996
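
    As an illustration only (not the paper's data or descriptors), the snippet below trains the three classifier families mentioned above on synthetic "morphological descriptor" vectors with scikit-learn and compares cross-validated accuracies.

        # Illustrative comparison of the three classifier families on synthetic data.
        from sklearn.datasets import make_classification
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neural_network import MLPClassifier
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_score

        # stand-in for measured descriptors of several protozoa/metazoa classes
        X, y = make_classification(n_samples=600, n_features=20, n_informative=10,
                                   n_classes=5, n_clusters_per_class=1, random_state=0)

        models = {
            "discriminant analysis": LinearDiscriminantAnalysis(),
            "neural network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                            random_state=0),
            "decision tree": DecisionTreeClassifier(random_state=0),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=5)
            print(f"{name:22s} accuracy = {scores.mean():.2f} +/- {scores.std():.2f}")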

  15. IMPLEMENTING THE STANDARD SPECTRUM METHOD FOR ANALYSIS OF β-γ COINCIDENCE SPECTRA

    SciTech Connect

    Biegalski, S.; Flory, Adam E.; Schrom, Brian T.; Ely, James H.; Haas, Derek A.; Bowyer, Ted W.; Hayes, James C.

    2011-09-14

    The standard deconvolution analysis tool (SDAT) algorithms were developed and tested at the University of Texas at Austin. These algorithms utilize the standard spectrum technique for spectral analysis of β-γ coincidence spectra for nuclear explosion monitoring. Work has been conducted under this contract to implement these algorithms into a usable scientific software package with a graphical user interface. Improvements include the ability to read in PHD formatted data, gain matching, and data visualization. New auto-calibration algorithms were developed and implemented based on 137Cs spectra for assessment of the energy vs. channel calibrations. Details on the user tool and testing are included.
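
    A minimal sketch of the standard-spectrum idea is shown below: a measured spectrum is expressed as a non-negative combination of library standard spectra via non-negative least squares. The Gaussian standards and activities are synthetic placeholders, not radioxenon responses, and this is not the SDAT code itself.

        # Sketch of a standard-spectrum fit with non-negative least squares.
        import numpy as np
        from scipy.optimize import nnls

        channels = np.arange(256)

        def gaussian(mu, sigma):
            return np.exp(-0.5 * ((channels - mu) / sigma) ** 2)

        # hypothetical standard spectra for three components plus a flat background
        standards = np.column_stack([
            gaussian(60, 8),                          # component A
            gaussian(120, 10),                        # component B
            gaussian(200, 12),                        # component C
            np.ones_like(channels, dtype=float),      # background
        ])

        true_activities = np.array([500.0, 120.0, 40.0, 5.0])
        measured = np.random.default_rng(0).poisson(standards @ true_activities)

        activities, residual = nnls(standards, measured.astype(float))
        print("fitted activities:", np.round(activities, 1))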

  16. Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles

    SciTech Connect

    Baer, Donald R.; Gaspar, Daniel J.; Nachimuthu, Ponnusamy; Techane, Sirnegeda D.; Castner, David G.

    2010-02-01

    The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties.

  17. Technical advance: autofluorescence as a tool for myeloid cell analysis.

    PubMed

    Mitchell, Andrew J; Pradel, Lydie C; Chasson, Lionel; Van Rooijen, Nico; Grau, Georges E; Hunt, Nicholas H; Chimini, Giovanna

    2010-09-01

    Cellular AF is usually considered a hindrance to flow cytometric analysis. Here, we incorporate AF into analysis of complex mixtures of leukocytes. Using a mouse model, we examined cellular AF at multiple excitation and emission wavelengths, and populations with discrete patterns were gated and examined for surface marker expression. In the spleen, all major myeloid populations were identified. In particular, the approach allowed simultaneous characterization of RPM and resident monocytes. When monocytes and RPM were compared, RPM exhibited a phenotype that was consistent with involvement in physiological processes, including expression of genes involved in lipid and iron metabolism. The presence of large amounts of stored ferric iron within RPM enabled purification of these cells using a magnetic-based approach. When adapted for use on leukocytes isolated from a range of other organs, incorporation of AF into analysis allowed identification and isolation of biologically important myeloid populations, including subsets that were not readily identifiable by conventional cytometric analysis. PMID:20534703

  18. PRAAD: Preprocessing and Analysis Tool for Arabic Ancient Documents

    Microsoft Academic Search

    Wafa Boussellaa; Abderrazak Zahour; Bruno Taconet; Adel Alimi; Abdellatif Benabdelhafid

    2007-01-01

    This paper presents the new system PRAAD for preprocessing and analysis of Arabic historical documents. It is composed of two important parts: pre-processing and analysis of ancient documents. After digitization, the color or greyscale images of ancient documents are distorted by the presence of strong background artefacts such as scan optical blur and noise, show-through and bleed-through effects, and spots.

  19. Simultaneous deconvolution of the bivariate distribution of molecular weight and chemical composition of polyolefins made with ziegler-natta catalysts.

    PubMed

    Alghyamah, Abdulaziz A; Soares, João B P

    2009-02-18

    Polyolefins made with Ziegler-Natta catalysts have non-uniform distributions of molecular weight (MWD) and chemical composition (CCD). The MWD is usually measured by high-temperature gel permeation chromatography (GPC) and the CCD by either temperature rising elution fractionation (TREF) or crystallization analysis fractionation (CRYSTAF). A mathematical model is needed to quantify the information provided by these analytical techniques and to relate it to the presence of multiple site types on Ziegler-Natta catalysts. We developed a robust computer algorithm to deconvolute the MWD and CCD of polyolefins simultaneously using Flory's most probable distribution and the cumulative CCD component of Stockmayer's distribution, which includes the soluble fraction commonly present in linear low-density polyethylene (LLDPE) resins and have applied this procedure for the first time to several industrial LLDPE resins. The deconvolution results are reproducible and consistent with theoretical expectations. PMID:21706614
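
    To illustrate the Flory part of such a deconvolution, the sketch below fits a synthetic molecular weight distribution with a sum of Flory most-probable components, one per assumed site type, using nonlinear least squares. The site parameters and weights are invented, and the Stockmayer/CCD part of the published algorithm is omitted.

        # Sketch only: deconvolution of a synthetic MWD into Flory components.
        import numpy as np
        from scipy.optimize import least_squares

        logM = np.linspace(2.5, 6.5, 200)

        def flory(logM, Mn):
            """Weight-fraction Flory most-probable distribution on a log10(M) axis."""
            M = 10.0 ** logM
            tau = 1.0 / Mn
            return np.log(10.0) * (tau ** 2) * (M ** 2) * np.exp(-tau * M)

        # synthetic "measured" MWD built from three assumed site types
        true_Mn = np.array([8e3, 4e4, 2e5])
        true_w = np.array([0.25, 0.45, 0.30])
        mwd = sum(w * flory(logM, mn) for w, mn in zip(true_w, true_Mn))
        mwd += np.random.default_rng(0).normal(0, 0.002, logM.size)

        def residuals(p, n_sites=3):
            w, logMn = np.abs(p[:n_sites]), p[n_sites:]
            model = sum(wi * flory(logM, 10.0 ** lm) for wi, lm in zip(w, logMn))
            return model - mwd

        p0 = np.concatenate([np.full(3, 1.0 / 3.0), [3.5, 4.5, 5.5]])   # initial guess
        fit = least_squares(residuals, p0)
        weights = np.abs(fit.x[:3]) / np.abs(fit.x[:3]).sum()
        print("site weights:", np.round(weights, 2), " Mn:", np.round(10 ** fit.x[3:], 0))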

  20. OPE The Campus Safety and Security Data Analysis Cutting Tool

    NSDL National Science Digital Library

    Provided by the Office of Postsecondary Education (OPE) of the US Department of Education, this searchable database allows users to browse records of reported criminal offenses at over 6000 colleges and universities. The database contains records for 1997-99 and may be browsed by region, state, city, type of institution, instructional program, and number of students. Users can also simply type in the name of a specific institution. Initial entries include basic contact information and links to statistics for criminal offenses, hate offenses, and arrests. Each entry page also links to the relevant page at the National Center for Education Statistics IPEDS COOL (College Opportunities On-Line) website (reviewed in the March 31, 2000 Scout Report), a tool for comparison shopping between different colleges and universities.

  1. Bayesian deconvolution of mass and ion mobility spectra: from binary interactions to polydisperse ensembles.

    PubMed

    Marty, Michael T; Baldwin, Andrew J; Marklund, Erik G; Hochberg, Georg K A; Benesch, Justin L P; Robinson, Carol V

    2015-04-21

    Interpretation of mass spectra is challenging because they report a ratio of two physical quantities, mass and charge, which may each have multiple components that overlap in m/z. Previous approaches to disentangling the two have focused on peak assignment or fitting. However, the former struggle with complex spectra, and the latter are generally computationally intensive and may require substantial manual intervention. We propose a new data analysis approach that employs a Bayesian framework to separate the mass and charge dimensions. On the basis of this approach, we developed UniDec (Universal Deconvolution), software that provides a rapid, robust, and flexible deconvolution of mass spectra and ion mobility-mass spectra with minimal user intervention. Incorporation of the charge-state distribution in the Bayesian prior probabilities provides separation of the m/z spectrum into its physical mass and charge components. We have evaluated our approach using systems of increasing complexity, enabling us to deduce lipid binding to membrane proteins, to probe the dynamics of subunit exchange reactions, and to characterize polydispersity in both protein assemblies and lipoprotein Nanodiscs. The general utility of our approach will greatly facilitate analysis of ion mobility and mass spectra. PMID:25799115

  2. The enhancement of fault detection and diagnosis in rolling element bearings using minimum entropy deconvolution combined with spectral kurtosis

    NASA Astrophysics Data System (ADS)

    Sawalhi, N.; Randall, R. B.; Endo, H.

    2007-08-01

    Spectral kurtosis (SK) represents a valuable tool for extracting transients buried in noise, which makes it very powerful for the diagnostics of rolling element bearings. However, a high value of SK requires that the individual transients are separated, which in turn means that if their repetition rate is high their damping must be sufficiently high that each dies away before the appearance of the next. This paper presents an algorithm for enhancing the surveillance capability of SK by using the minimum entropy deconvolution (MED) technique. The MED technique effectively deconvolves the effect of the transmission path and clarifies the impulses, even where they are not separated in the original signal. The paper illustrates these issues by analysing signals taken from a high-speed test rig, which contained a bearing with a spalled inner race. The results show that the use of the MED technique dramatically sharpens the pulses originating from the impacts of the balls with the spall and increases the kurtosis values to a level that reflects the severity of the fault. Moreover, when the algorithm was tested on signals taken from a gearbox for a bearing with a spalled outer race, it shows that each of the impulses originating from the impacts is made up of two parts (corresponding to entry into and exit from the spall). This agrees well with the literature but is often difficult to observe without the use of the MED technique. The use of the MED along with SK analysis also greatly improves the results of envelope analysis for making a complete diagnosis of the fault and trending its progression.
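
    The minimum entropy deconvolution step can be sketched with a Wiggins-style iteration that designs an FIR filter maximizing the kurtosis of its output. The signal model (periodic impacts through a decaying transmission path), filter length and iteration count below are assumptions, and the spectral-kurtosis stage of the paper is not included.

        # Sketch of minimum entropy deconvolution (Wiggins-style fixed-point iteration).
        import numpy as np
        from scipy.linalg import toeplitz

        def med_filter(x, flen=30, iters=30, eps=1e-6):
            """Iteratively find an FIR filter that maximizes the kurtosis of its output."""
            N = len(x)
            r = np.array([np.dot(x[l:], x[:N - l]) for l in range(flen)])
            R = toeplitz(r) + eps * np.eye(flen)          # autocorrelation matrix of x
            f = np.zeros(flen); f[flen // 2] = 1.0        # start from a delayed spike
            for _ in range(iters):
                y = np.convolve(x, f, mode="full")[:N]
                b = np.array([np.dot(y[l:] ** 3, x[:N - l]) for l in range(flen)])
                f = np.linalg.solve(R, b)
                f /= np.sqrt(np.dot(f, f)) + 1e-12        # keep the filter normalized
            return f, np.convolve(x, f, mode="full")[:N]

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            impulses = np.zeros(2048); impulses[::128] = 1.0        # bearing-like impacts
            path = np.exp(-np.arange(64) / 8.0) * np.cos(0.6 * np.arange(64))
            x = np.convolve(impulses, path, mode="full")[:2048]
            x += 0.05 * rng.standard_normal(2048)                   # transmission + noise
            f, y = med_filter(x)
            kurt = lambda s: np.mean(s ** 4) / np.mean(s ** 2) ** 2
            print(f"kurtosis before: {kurt(x):.1f}  after MED: {kurt(y):.1f}")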

  3. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  4. A pitfall in the reconstruction of fibre ODFs using spherical deconvolution of diffusion MRI data

    PubMed Central

    Parker, G.D.; Marshall, D.; Rosin, P.L.; Drage, N.; Richmond, S.; Jones, D.K.

    2013-01-01

    Diffusion weighted (DW) MRI facilitates non-invasive quantification of tissue microstructure and, in combination with appropriate signal processing, three-dimensional estimates of fibrous orientation. In recent years, attention has shifted from the diffusion tensor model, which assumes a unimodal Gaussian diffusion displacement profile to recover fibre orientation (with various well-documented limitations), towards more complex high angular resolution diffusion imaging (HARDI) analysis techniques. Spherical deconvolution (SD) approaches assume that the fibre orientation density function (fODF) within a voxel can be obtained by deconvolving a ‘common’ single fibre response function from the observed set of DW signals. In practice, this common response function is not known a priori and thus an estimated fibre response must be used. Here the establishment of this single-fibre response function is referred to as ‘calibration’. This work examines the vulnerability of two different SD approaches to inappropriate response function calibration: (1) constrained spherical harmonic deconvolution (CSHD)—a technique that exploits spherical harmonic basis sets and (2) damped Richardson–Lucy (dRL) deconvolution—a technique based on the standard Richardson–Lucy deconvolution. Through simulations, the impact of a discrepancy between the calibrated diffusion profiles and the observed (‘Target’) DW-signals in both single and crossing-fibre configurations was investigated. The results show that CSHD produces spurious fODF peaks (consistent with well known ringing artefacts) as the discrepancy between calibration and target response increases, while dRL demonstrates a lower over-all sensitivity to miscalibration (with a calibration response function for a highly anisotropic fibre being optimal). However, dRL demonstrates a reduced ability to resolve low anisotropy crossing-fibres compared to CSHD. It is concluded that the range and spatial-distribution of expected single-fibre anisotropies within an image must be carefully considered to ensure selection of the appropriate algorithm, parameters and calibration. Failure to choose the calibration response function carefully may severely impact the quality of any resultant tractography. PMID:23085109
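
    For orientation, the damped Richardson-Lucy variant discussed above builds on the classical Richardson-Lucy iteration, sketched here in one dimension with a known kernel; the damping term and the spherical-deconvolution machinery of the paper are omitted, and the peak positions and kernel width are invented.

        # Sketch of the classical Richardson-Lucy iteration in 1-D with a known kernel.
        import numpy as np

        def richardson_lucy(y, h, iters=100, eps=1e-12):
            """Multiplicative Richardson-Lucy update for non-negative data y."""
            x = np.full_like(y, y.mean())          # flat non-negative starting estimate
            h_flip = h[::-1]
            for _ in range(iters):
                blurred = np.convolve(x, h, mode="same")
                ratio = y / (blurred + eps)
                x *= np.convolve(ratio, h_flip, mode="same")
            return x

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            truth = np.zeros(256); truth[[60, 70, 150]] = [1.0, 0.6, 0.8]   # sharp peaks
            t = np.arange(-15, 16)
            h = np.exp(-0.5 * (t / 4.0) ** 2); h /= h.sum()                 # known kernel
            y = np.convolve(truth, h, mode="same") + 0.001 * rng.random(256)
            x_est = richardson_lucy(y, h)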

  5. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    NASA Technical Reports Server (NTRS)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  6. Distributed X-Ray Data and Distributed Analysis Tools

    NASA Astrophysics Data System (ADS)

    Plummer, D.; Schachter, J.; Elvis, M.; Garcia, M.; Conroy, M.

    X-ray data and analysis packages can now be distributed directly to the desks of the user community using CDROMs and portable, familiar analysis packages (like IRAF/PROS). The Einstein IPC Slew Survey is an example of a complete X-ray data set distributed via CDROMs and is the first to use the new FITS standard for photon event lists (BINTABLE). Users can analyze the Slew data directly off the CDROM using PROS. As an example, we present a recipe for producing a radial profile of the Cygnus Loop using PROS and the Slew Survey CDROM data. CDROMs of the complete Einstein IPC and HRI archive data sets will soon be distributed in BINTABLE format allowing similar analysis with those data.

  7. Droplet microfluidics--a tool for single-cell analysis.

    PubMed

    Joensson, Haakan N; Andersson Svahn, Helene

    2012-12-01

    Droplet microfluidics allows the isolation of single cells and reagents in monodisperse picoliter liquid capsules and manipulations at a throughput of thousands of droplets per second. These qualities allow many of the challenges in single-cell analysis to be overcome. Monodispersity enables quantitative control of solute concentrations, while encapsulation in droplets provides an isolated compartment for the single cell and its immediate environment. The high throughput allows the processing and analysis of the tens of thousands to millions of cells that must be analyzed to accurately describe a heterogeneous cell population so as to find rare cell types or access sufficient biological space to find hits in a directed evolution experiment. The low volumes of the droplets make very large screens economically viable. This Review gives an overview of the current state of single-cell analysis involving droplet microfluidics and offers examples where droplet microfluidics can further biological understanding. PMID:23180509

  8. Extension of a System Level Tool for Component Level Analysis

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul

    2002-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.

  9. Extension of a System Level Tool for Component Level Analysis

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)

    2001-01-01

    This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.

  10. An integrated data analysis tool for improving measurements on the MST RFP

    SciTech Connect

    Reusch, L. M., E-mail: lmmcguire@wisc.edu; Galante, M. E.; Johnson, J. R.; McGarry, M. B.; Den Hartog, D. J. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Franz, P. [Consorzio RFX, EURATOM-ENEA Association, Padova (Italy); Stephens, H. D. [Physics Department, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Pierce College Fort Steilacoom, Lakewood, Washington 98498 (United States)

    2014-11-15

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.
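
    A toy version of the integrated-data-analysis idea: when both diagnostics are modelled with independent Gaussian likelihoods, the combined posterior is the inverse-variance-weighted estimate and is more precise than either input. The temperatures and uncertainties below are invented; the real tool uses full forward models and MCMC rather than this closed-form special case.

        # Toy illustration: combining two independent Gaussian temperature estimates.
        import numpy as np

        def combine_gaussian(means, sigmas):
            """Posterior mean/std for a flat prior and independent Gaussian likelihoods."""
            w = 1.0 / np.asarray(sigmas, dtype=float) ** 2     # inverse-variance weights
            mean = np.sum(w * np.asarray(means)) / np.sum(w)
            return mean, 1.0 / np.sqrt(np.sum(w))

        te_ts, sigma_ts = 310.0, 25.0     # hypothetical Thomson scattering estimate (eV)
        te_sxr, sigma_sxr = 290.0, 20.0   # hypothetical double-foil SXR estimate (eV)

        te, sigma = combine_gaussian([te_ts, te_sxr], [sigma_ts, sigma_sxr])
        print(f"combined Te = {te:.0f} +/- {sigma:.0f} eV")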

  11. An integrated data analysis tool for improving measurements on the MST RFP

    NASA Astrophysics Data System (ADS)

    Reusch, L. M.; Galante, M. E.; Franz, P.; Johnson, J. R.; McGarry, M. B.; Stephens, H. D.; Den Hartog, D. J.

    2014-11-01

    Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.

  12. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    E-print Network

    Paris-Sud XI, Université de

    Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation. An important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic

  13. CoryneBase: Corynebacterium genomic resources and analysis tools at your fingertips.

    PubMed

    Heydari, Hamed; Siow, Cheuk Chuen; Tan, Mui Fern; Jakubovics, Nick S; Wee, Wei Yee; Mutha, Naresh V R; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

    Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide better understanding of their biology, phylogeny, virulence and taxonomy that may lead to the discoveries of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes aimed to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers the access of a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

  14. A Collaborative Analysis Tool for Visualisation and Interaction with Spatial Data

    E-print Network

    Taylor, Hamish

    A collaborative virtual environment system is described that is designed to support location-independent shared analysis of spatial data and urban planning proposals. The system seeks to extend the physical

  15. An independent component analysis based tool for exploring functional connections in the brain

    E-print Network

    Washington at Seattle, University of

    A tool for investigating functional connectivity in the brain is presented. Independent component analysis (ICA) is used as a measure of voxel similarity, which allows the user to find and view statistically independent maps of correlated

  16. IMPACT: Generic household-level databases and diagnostics tools for integrated crop-livestock systems analysis

    Microsoft Academic Search

    M. Herrero; E. González-Estrada; P. K. Thornton; C. Quirós; M. M. Waithaka; R. Ruiz; G. Hoogenboom

    2007-01-01

    We outline the need for generic crop-livestock systems databases and data standards for comprehensive systems analysis in developing countries. We also indicate the type of data that such databases should contain and review how they can be collected. We describe IMPACT, a database and analysis tool that we have developed that goes some way to meeting the demands that may

  17. Abstract Title: Image Informatics Tools for the Analysis of Retinal Images

    E-print Network

    California at Santa Barbara, University of

    The aim is to develop image informatics tools for the processing and quantitative analysis of retinal images, and to test these methods on a large retinal image database.

  18. BGP Eye: A New Visualization Tool for Real-time Detection and Analysis of BGP Anomalies

    E-print Network

    Chuah, Chen-Nee

    BGP Eye performs real-time analysis of BGP anomalies through a hierarchical deep-dive approach. First, BGP updates are clustered to obtain BGP events that are more representative of an anomaly.

  19. BGP eye: a new visualization tool for real-time detection and analysis of BGP anomalies

    Microsoft Academic Search

    Soon Tee Teoh; Supranamaya Ranjan; Antonio Nucci; Chen-nee Chuah

    2006-01-01

    Owing to the inter-domain aspects of BGP routing, it is difficult to correlate information across multiple domains in order to analyze the root cause of the routing outages. This paper presents BGP Eye, a tool for visualization-aided root-cause analysis of BGP anomalies. In contrast to previous approaches, BGP Eye performs real-time analysis of BGP anomalies through a

  20. Review Plant Analysis as a Diagnostic Tool for Evaluating Nutritional Requirements of Bananas

    Microsoft Academic Search

    N. MEMON; K. S. MEMON

    Plant analysis has been considered a very promising tool to assess nutritional requirements of plants for cost effective and environment friendly agriculture. Diagnosing nutritional status of bananas through plant analysis not only provides the basis of correct fertilizer requirement of the crop but also guides towards the nutritional requirements of future crops. The total contents of nutrients in leaves, and

  1. FREEWAY PERFORMANCE MEASUREMENT SYSTEM (PeMS): AN OPERATIONAL ANALYSIS TOOL

    E-print Network

    Varaiya, Pravin

    PeMS is a freeway performance measurement system for all of California. It processes 2 GB of data per day and can be used to calibrate simulation models. The paper describes the use of PeMS in conducting operational analysis.

  2. BULLWHIP EFFECT AND SUPPLY CHAIN MODELLING AND ANALYSIS USING CPN TOOLS

    E-print Network

    van der Aalst, Wil

    Earlier authors applied supply chain analysis and demonstrated their model using a case study from the food industry. Supply chains in the food industry are also modeled in [1], where the authors propose a supply chain management

  3. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

    The existing RocketWeb™ Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb™ system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  4. Generalized Aliasing as a Basis for Program Analysis Tools

    E-print Network

    Ajax can prove the safety of more than 50% of the downcast instructions in some real-life Java programs. The work describes experience in building an analysis system for off-the-shelf Java applications and suggests some possible avenues for future work.

  5. ADVANCED HIGH LIFT CFD ANALYSIS AND DESIGN TOOLS

    Microsoft Academic Search

    R. Rudnik; S. Wallin

    2004-01-01

    This contribution highlights development, validation, and application of European CFD codes for high lift aerodynamics. Major contributions to the status of CFD have been achieved in EC funded European projects, such as EUROLIFT and HiAer. Concerning the flow field analysis, progress in numerical methods as well as computer resources resulted in the ability of treating complete 3D transport aircraft high

  6. Automated simultaneous analysis phylogenetics (ASAP): an enabling tool for phlyogenomics

    Microsoft Academic Search

    Indra Neil Sarkar; Mary G. Egan; Gloria M. Coruzzi; Ernest K. Lee; Robert Desalle

    2008-01-01

    Background: The availability of sequences from whole genomes to reconstruct the tree of life has the potential to enable the development of phylogenomic hypotheses in ways that have not been before possible. A significant bottleneck in the analysis of genomic-scale views of the tree of life is the time required for manual curation of genomic data into multi-gene phylogenetic matrices.

  7. HIV drug resistance analysis tool based on process algebra

    Microsoft Academic Search

    Luciano Vieira De Araújo; Ester C. Sabino; João Eduardo Ferreira

    2008-01-01

    The increasing number of drugs used in HIV patient treatment and the mutations associated with drug resistance make the inference of drug resistance a complex task that demands computational systems. Furthermore, the software development/update can generate an extra level of complexity in the process of drug resistance analysis. An alternative to handle the complexity of drug resistance and software development is

  8. Life Cycle Analysis as a Tool of Pollution Prevention

    Microsoft Academic Search

    Ranya ElSayed; Elsayed A. Shalaby; Mohamed Abdel Karim

    2004-01-01

    This study aimed to present a clear definition of life cycle analysis, LCA, and bring perspective to practical applications of LCA. A comparison was made between Mazout and natural gas as a fuel oil in electrical power station. It was also clarified which of the two systems (Mazout\\/natural gas) had least environmental impact. The amounts of Mazout, equivalent to natural

  9. The Scyther Tool: Verification, Falsification, and Analysis of Security Protocols

    Microsoft Academic Search

    Cas J. F. Cremers

    2008-01-01

    With the rise of the Internet and other open networks, a large number of security protocols have been developed and deployed in order to provide secure communication. The analysis of such security protocols has turned out to be extremely difficult for humans, as witnessed by the fact that many protocols were found to be flawed after deployment. This has driven

  10. Market research for requirements analysis using linguistic tools

    Microsoft Academic Search

    Luisa Mich; Mariangela Franch; Pierluigi Novi Inverardi

    2004-01-01

    Numerous studies in recent months have proposed the use of linguistic instruments to support requirements analysis. There are two main reasons for this: (i) the progress made in natural language processing, (ii) the need to provide the developers of software systems with support in the early phases of requirements definition and conceptual modelling. This paper presents the results of an

  11. Comparative genomic hybridization: Tools for high resolution analysis

    Microsoft Academic Search

    R. D. Knapp; R. Antonacci; B. Haddad

    1994-01-01

    Comparative genomic hybridization (CGH) is a powerful FISH-based technique that allows detection and mapping of genome imbalances using genomic DNA as probe. Limitations are resolution (limited by the use of metaphase chromosomes as target and by the high noise generated by the technique), sensitivity (imbalances as large as 10 Mb or more may remain undetected), and cumbersome analysis. We have

  12. Cluster analysis as a prediction tool for pregnancy outcomes.

    PubMed

    Banjari, Ines; Kenjerić, Daniela; Šolić, Krešimir; Mandić, Milena L

    2015-03-01

    Considering the specific physiological changes during gestation and thinking of pregnancy as a "critical window", classification of pregnant women in early pregnancy can be considered crucial. The paper demonstrates the use of a method based on an approach from intelligent data mining, cluster analysis. Cluster analysis is a statistical method which makes it possible to group individuals based on sets of identifying variables. The method was chosen in order to determine the possibility of classifying pregnant women in early pregnancy and to analyze unknown correlations between different variables so that certain outcomes could be predicted. 222 pregnant women from two general obstetric offices were recruited. The analysis focused on the characteristics of these pregnant women: their age, pre-pregnancy body mass index (BMI) and haemoglobin value. Cluster analysis yielded a 94.1% classification accuracy rate with three branches or groups of pregnant women showing statistically significant correlations with pregnancy outcomes. The results show that pregnant women of both older age and higher pre-pregnancy BMI have a significantly higher incidence of delivering babies of higher birth weight but gain significantly less weight during pregnancy. Their babies are also longer, and these women have a significantly higher probability of complications during pregnancy (gestosis) and a higher probability of induced or caesarean delivery. We can conclude that the cluster analysis method can appropriately classify pregnant women in early pregnancy to predict certain outcomes. PMID:26040101
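
    A hedged sketch of the clustering step: records described by age, pre-pregnancy BMI and haemoglobin are standardized and grouped into three clusters with scikit-learn. The data below are randomly generated stand-ins, so no association with outcomes can be read from them.

        # Sketch of k-means clustering on synthetic early-pregnancy records.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        n = 222
        data = np.column_stack([
            rng.normal(29, 5, n),       # age (years)
            rng.normal(24, 4, n),       # pre-pregnancy BMI (kg/m^2)
            rng.normal(125, 10, n),     # haemoglobin (g/L)
        ])

        X = StandardScaler().fit_transform(data)
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
        for c in range(3):
            centre = data[labels == c].mean(axis=0)
            print(f"cluster {c}: n={np.sum(labels == c):3d}  "
                  f"age={centre[0]:.1f}  BMI={centre[1]:.1f}  Hb={centre[2]:.0f}")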

  13. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including, real-time simulations, immersive systems, collaborative engineering environment, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  14. Leaf analysis as an exploratory tool in mineralogy

    NASA Astrophysics Data System (ADS)

    Mirzai, A. A.; McKee, J. S. C.; Yeo, Y. H.; Gallop, D.; Medved, J.

    1990-04-01

    PIXE analysis has been used for more than a decade at the University of Manitoba to determine trace-element concentrations in a wide variety of materials including minerals. Detailed analysis of the elemental composition of leaves is of interest because the macronutrient and micronutrient elements present in plant tissues are already well known from chemical studies. In the present work samples from species Betula populifolia and Picea glauca were irradiated at an incident proton energy of 40 MeV to determine possible additional trace-element concentrations due to migration from mineral deposits present underground. In addition to known nutrient elements, other elements such as Rb, Sr, Cd and Ba were readily detected. In some samples the presence of Pt was also identified.

  15. TA-DA: a Tool for Astrophysical Data Analysis

    E-print Network

    Da Rio, Nicola

    2012-01-01

    We present TA-DA, a new software tool aimed at greatly simplifying and improving the analysis of stellar photometric data in comparison with theoretical models, and allowing the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams, to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled IDL widget-based application; its graphical user interface makes it considerably user-friendly. In this paper we describe the software and its functionalities.

  16. TRACY: A tool for accelerator design and analysis

    SciTech Connect

    Nishimura, Hiroshi

    1988-06-01

    A simulation code TRACY has been developed for accelerator design and analysis. The code can be used for lattice design work, simulation of magnet misalignments, closed orbit calculations and corrections, undulator calculations, and particle tracking. TRACY has been used extensively for single particle simulations for the Advanced Light Source (ALS), a 1-2 GeV Synchrotron Radiation Source now under construction at Lawrence Berkeley Laboratory. 9 refs., 2 figs.

  17. Decision Analysis Tool to Compare Energy Pathways for Transportation

    SciTech Connect

    Bloyd, Cary N.; Stork, Kevin

    2011-02-01

    With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies are proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). A prototype model, Analytica Transportation Energy Analysis Model (ATEAM), has been developed using the Analytica decision modeling environment, visualizing the structure as a hierarchy of influence diagrams. The report summarized the FY2010 ATEAM accomplishments.

  18. Software Tool for Real Time Power Quality Disturbance Analysis and Classification

    Microsoft Academic Search

    Mohammed E. Salem; Azah Mohamed; Salina Abdul Samad; Iskandar Yahya

    2007-01-01

    Real time detection and classification of power quality disturbances is important for quick diagnosis and mitigation of such disturbances. This paper presents the development of a software tool based on MatLab for power quality disturbance analysis and classification. Prior to the development of the software tool, the disturbance signals are captured and processed in real-time using the TMS320C6711DSP starter kit.

  19. SizeUp: A Tool for Interactive Comparative Collection Analysis for Very Large Species Collections

    E-print Network

    Ozor, Andrew

    2009-11-18

    SizeUp: A Tool for Interactive Comparative Collection Analysis for Very Large Species Collections. Wide-ranging biological data: global... How do we compare and analyze large data sets, and visualize the result in a user-friendly tool? Multiple problems: no formal definition for 'quality...

  20. COMPUTATIONAL TOOLS FOR SEMI-ANALYTICAL FINITE ELEMENT STABILITY ANALYSIS OF THIN-WALLED STRUCTURES

    Microsoft Academic Search

    Michail Samofalov; Remigijus Kutas

    Computational tools implementing the semi-analytical finite elements for the linear and stability analysis of thin-walled beams are considered. The proposed method presents two-stage discretisation, where the thin-walled cross section is approximated by the semi-analytical finite elements at the first stage, while conventional longitudinal finite element discretisation is performed at the second stage. The developed computational tools for the processor allow

  1. Analysis of Directional Logging Tools in Anisotropic and Multieccentric Cylindrically-Layered Earth Formations

    Microsoft Academic Search

    Guo-Sheng Liu; Fernando L. Teixeira; Guo-Ji Zhang

    2012-01-01

    We develop a pseudoanalytical method for the analysis of directional resistivity well-logging tools consisting of multiple tilted-coil antennas in cylindrically-layered Earth formations which have anisotropic conductivities and where each layer may be eccentric to the others (multieccentric formations). The cylindrically-layered, anisotropic, and multieccentric scenarios considered here are often encountered in deviated/horizontal drilling for hydrocarbon exploration, where the logging tool may

  2. Power Systems Life Cycle Analysis Tool (Power L-CAT).

    SciTech Connect

    Andruski, Joel; Drennen, Thomas E.

    2011-01-01

    The Power Systems L-CAT is a high-level dynamic model that calculates levelized production costs and tracks environmental performance for a range of electricity generation technologies: natural gas combined cycle (using either imported (LNGCC) or domestic natural gas (NGCC)), integrated gasification combined cycle (IGCC), supercritical pulverized coal (SCPC), existing pulverized coal (EXPC), nuclear, and wind. All of the fossil fuel technologies also include an option for including carbon capture and sequestration technologies (CCS). The model allows for quick sensitivity analysis on key technical and financial assumptions, such as: capital, O&M, and fuel costs; interest rates; construction time; heat rates; taxes; depreciation; and capacity factors. The fossil fuel options are based on detailed life cycle analysis reports conducted by the National Energy Technology Laboratory (NETL). For each of these technologies, NETL's detailed LCAs include consideration of five stages associated with energy production: raw material acquisition (RMA), raw material transport (RMT), energy conversion facility (ECF), product transportation and distribution (PT&D), and end user electricity consumption. The goal of the NETL studies is to compare existing and future fossil fuel technology options using a cradle-to-grave analysis. The NETL reports consider constant dollar levelized cost of delivered electricity, total plant costs, greenhouse gas emissions, criteria air pollutants, mercury (Hg) and ammonia (NH3) emissions, water withdrawal and consumption, and land use (acreage).
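
    As an illustration of the kind of levelized-cost calculation such a model performs, the short Python sketch below computes a simple levelized cost of electricity from annualized capital, fixed O&M, and fuel costs. The formula is the standard capital-recovery-factor form, and all input values are placeholder assumptions, not Power L-CAT or NETL figures.

      # Levelized cost of electricity (LCOE) sketch; all inputs are placeholders.
      def lcoe_usd_per_mwh(capital_usd_per_kw, fixed_om_usd_per_kw_yr,
                           fuel_usd_per_mmbtu, heat_rate_btu_per_kwh,
                           capacity_factor, discount_rate, lifetime_yr):
          r, n = discount_rate, lifetime_yr
          crf = r * (1 + r) ** n / ((1 + r) ** n - 1)       # capital recovery factor
          kwh_per_kw_yr = 8760.0 * capacity_factor          # annual generation per kW
          fixed = (capital_usd_per_kw * crf + fixed_om_usd_per_kw_yr) / kwh_per_kw_yr
          fuel = fuel_usd_per_mmbtu * heat_rate_btu_per_kwh / 1.0e6
          return (fixed + fuel) * 1000.0                    # $/kWh -> $/MWh

      # Illustrative comparison of two technologies (placeholder assumptions).
      print("NGCC :", round(lcoe_usd_per_mwh(1000, 15, 4.5, 6500, 0.85, 0.08, 30), 1))
      print("Wind :", round(lcoe_usd_per_mwh(1700, 40, 0.0, 0, 0.35, 0.08, 25), 1))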

  3. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  4. Design and Analysis Tools for Concurrent Blackboard Systems

    NASA Technical Reports Server (NTRS)

    McManus, John W.

    1991-01-01

    A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them. " The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.
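
    To make the architecture concrete, the following minimal Python sketch shows a shared blackboard, two independent knowledge sources, and an opportunistic control loop; the knowledge sources and their trigger conditions are invented for illustration and are not taken from the systems described in the paper.

      # Minimal blackboard: a shared data structure, independent knowledge sources,
      # and an opportunistic control loop. Names and rules are illustrative only.
      blackboard = {"raw": [3, 1, 2], "sorted": None, "summary": None}

      def ks_sorter(bb):
          # Fires when raw data exist but have not been sorted yet.
          if bb["raw"] is not None and bb["sorted"] is None:
              bb["sorted"] = sorted(bb["raw"])
              return True
          return False

      def ks_summarizer(bb):
          # Fires when sorted data exist but no summary has been written.
          if bb["sorted"] is not None and bb["summary"] is None:
              bb["summary"] = {"min": bb["sorted"][0], "max": bb["sorted"][-1]}
              return True
          return False

      knowledge_sources = [ks_summarizer, ks_sorter]   # registration order is irrelevant

      progress = True
      while progress:                                  # opportunistic control strategy
          progress = any(ks(blackboard) for ks in knowledge_sources)

      print(blackboard)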

  5. Risk analysis tools for force protection and infrastructure/asset protection

    SciTech Connect

    Jaeger, C.D.; Duggan, R.A.; Paulus, W.K.

    1998-09-01

    The Security Systems and Technology Center at Sandia National Laboratories has for many years been involved in the development and use of vulnerability assessment and risk analysis tools. In particular, two of these tools, ASSESS and JTS, have been used extensively for Department of Energy facilities. Increasingly, Sandia has been called upon to evaluate critical assets and infrastructures, support DoD force protection activities and assist in the protection of facilities from terrorist attacks using weapons of mass destruction. Sandia is involved in many different activities related to security and force protection and is expanding its capabilities by developing new risk analysis tools to support a variety of users. One tool, in the very early stages of development, is EnSURE, Engineered Surety Using the Risk Equation. EnSURE addresses all of the risk equation and integrates the many components into a single, tool-supported process to help determine the most cost-effective ways to reduce risk. This paper will briefly discuss some of these risk analysis tools within the EnSURE framework.
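
    The risk equation referred to above is commonly written as risk = P(attack) x (1 - system effectiveness) x consequence; the sketch below evaluates that generic form for a few hypothetical upgrade options. The formula variant and all numbers are assumptions for illustration, not EnSURE internals.

      # Generic security risk equation: R = P_attack * (1 - P_effectiveness) * C.
      # All values below are illustrative assumptions, not EnSURE data.
      def risk(p_attack, p_effectiveness, consequence):
          return p_attack * (1.0 - p_effectiveness) * consequence

      options = {
          "baseline":        risk(0.10, 0.60, 100.0),
          "added detection": risk(0.10, 0.80, 100.0),
          "reduced target":  risk(0.10, 0.60, 60.0),
      }
      for name, r in sorted(options.items(), key=lambda kv: kv[1]):
          print(f"{name:15s} risk = {r:5.1f}")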

  6. Thorium concentrations in the lunar surface. V - Deconvolution of the central highlands region

    NASA Technical Reports Server (NTRS)

    Metzger, A. E.; Etchegaray-Ramirez, M. I.; Haines, E. L.

    1982-01-01

    The distribution of thorium in the lunar central highlands measured from orbit by the Apollo 16 gamma-ray spectrometer is subjected to a deconvolution analysis to yield improved spatial resolution and contrast. Use of two overlapping data fields for complete coverage also provides a demonstration of the technique's ability to model concentrations several degrees beyond the data track. Deconvolution reveals an association between Th concentration and the Kant Plateau, Descartes Mountain and Cayley plains surface formations. The Kant Plateau and Descartes Mountains model with Th less than 1 part per million, which is typical of farside highlands but is infrequently seen over any other nearside highland portions of the Apollo 15 and 16 ground tracks. It is noted that, if the Cayley plains are the result of basin-forming impact ejecta, the distribution of Th concentration with longitude supports an origin from the Imbrium basin rather than the Nectaris or Orientale basins. Nectaris basin materials are found to have a Th concentration similar to that of the Descartes Mountains, evidence that the latter may have been emplaced as Nectaris basin impact deposits.

  7. Mechanisms proposed for spectrogram correlation and transformation deconvolution in FM bat sonar

    NASA Astrophysics Data System (ADS)

    Simmons, James A.

    2005-09-01

    Big brown bats use time/frequency distributions to represent FM biosonar pulses and echoes as a consequence of reception through frequency tuned channels of the inner ear and subsequent processing by similarly tuned neural channels in the auditory pathway. Integration time is 350 μs, yet delay resolution is 2-10 μs, which must be based on detecting changes in the echo spectrum caused by interference between overlapping reflections inside the integration time. However, bats perceive not merely the echo interference spectrum but the numerical value of the delay separation from the spectrum, which requires deconvolution. Because spectrograms are the initial representation, this process is spectrogram correlation and transformation (SCAT). Proposed SCAT deconvolution mechanisms include extraction of echo envelope ripples for time-domain spectrometry, cepstral analysis of echoes, use of coherent or noncoherent reconstruction with basis functions, segmentation of onsets of overlapping replicas at moderate to long time separations, and localization of the occurrence of spectral interference ripples at specific times within dechirped spectrograms. Physiological evidence from single-unit recordings reveals a cepstral-like time-frequency process based on freqlets, both single-unit and multiunit responses reveal features which may prove to be time-domain basis functions, and multiunit responses exhibit modulations by onset and envelope ripple. [Work supported by NIH and ONR.]

  8. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    E-print Network

    Gonzalez, Adriana; Jacques, Laurent

    2015-01-01

    Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. Optics are never perfect and the non-ideal path through the telescope is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Other sources of noise (read-out, photon) also contaminate the image acquisition process. The problem of estimating both the PSF filter and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, it does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis image prior model and weak assumptions on the PSF filter's response. We use the observations from a celestial body transit where such an object can be assumed to be a black disk. Such constraints limit the interchangeabil...
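
    The alternating estimate-the-image / estimate-the-PSF structure of blind deconvolution can be sketched as follows; this naive 1-D alternating Wiener scheme is only a toy illustration of the general idea, not the wavelet-regularized method proposed in the paper, and practical blind deconvolution needs much stronger priors to be reliable.

      import numpy as np

      # Naive alternating blind deconvolution (1-D toy): alternate Wiener-style
      # updates of the scene and of the PSF. Illustrative only; practical methods
      # add strong priors (e.g. wavelet sparsity, PSF positivity and support).
      rng = np.random.default_rng(0)
      n = 256
      x_true = np.zeros(n); x_true[[60, 128, 190]] = [1.0, 0.6, 0.8]      # sparse scene
      h_true = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
      h_true /= h_true.sum()                                              # true PSF
      y = np.real(np.fft.ifft(np.fft.fft(x_true) * np.fft.fft(np.fft.ifftshift(h_true))))
      y += 0.01 * rng.standard_normal(n)                                  # observation

      h = np.exp(-0.5 * ((np.arange(n) - n // 2) / 1.5) ** 2); h /= h.sum()  # PSF guess
      H, Y, lam = np.fft.fft(np.fft.ifftshift(h)), np.fft.fft(y), 1e-2
      for _ in range(50):
          X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)      # update scene, PSF fixed
          H = np.conj(X) * Y / (np.abs(X) ** 2 + lam)      # update PSF, scene fixed
          h = np.clip(np.real(np.fft.ifft(H)), 0.0, None)  # enforce PSF positivity
          h /= h.sum()                                     # and unit flux
          H = np.fft.fft(h)
      x_hat = np.real(np.fft.ifft(X))
      print("data misfit:", np.linalg.norm(y - np.real(np.fft.ifft(X * H))))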

  9. University Economic Impact Analysis: Applying microeconomic tools and concepts

    NSDL National Science Digital Library

    Nancy Brooks

    This service-learning impact analysis project had students look in detail at the current employment and purchasing practices and policies of the University of Vermont. Unlike traditional impact analyses that attempt to calculate the total impact of an institution on the local economy, this project attempted to identify where the University could change policies and practices to increase positive local impacts both from an efficiency and equity perspective. Students worked with a 14-person advisory committee from the University, local and state government and local non-profits.

  10. GIPSY 3D: Analysis, Visualization and VO Tools for Datacubes

    NASA Astrophysics Data System (ADS)

    Ruíz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; van der Hulst, J. M.

    2009-09-01

    The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing System (GIPSY), and new ones based on use cases elaborated in collaboration with advanced users. One of the main goals is to provide local interoperability between GIPSY and other VO software. The connectivity with the Virtual Observatory environment will provide general access to 3D data VO archives and services, maximizing the potential for scientific discovery.

  11. Comparison of active-set method deconvolution and matched-filtering for derivation of an ultrasound transit time spectrum

    NASA Astrophysics Data System (ADS)

    Wille, M.-L.; Zapf, M.; Ruiter, N. V.; Gemmeke, H.; Langton, C. M.

    2015-06-01

    The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has a better accuracy (0.13 μs versus 0.18 μs standard deviations), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. A higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity.
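
    The matched-filtering step can be illustrated with a short sketch that correlates a received signal against the known chirp to locate two overlapping arrivals; the chirp parameters, path delays and noise level below are assumptions for illustration, not the study's measurement settings.

      import numpy as np

      # Matched-filter sketch: correlate the received signal with the known chirp
      # to localize two overlapping arrivals. All parameters are illustrative.
      fs = 10e6                                         # sample rate [Hz]
      t = np.arange(0, 20e-6, 1 / fs)                   # 20 us transmit chirp
      chirp = np.sin(2 * np.pi * (1e6 * t + 0.5 * 1e11 * t ** 2))   # 1 -> 3 MHz sweep

      rx = np.zeros(2 * chirp.size)                     # 40 us record window
      for delay, amp in ((5.0e-6, 1.0), (6.2e-6, 0.6)): # direct and reflected paths
          i = int(round(delay * fs))
          rx[i:i + chirp.size] += amp * chirp
      rx += 0.05 * np.random.default_rng(1).standard_normal(rx.size)

      mf = np.correlate(rx, chirp, mode="valid")        # matched-filter output
      i1 = int(np.argmax(mf))                           # strongest arrival
      mf2 = mf.copy(); mf2[max(0, i1 - 10):i1 + 10] = 0 # mask it, look for the next
      i2 = int(np.argmax(mf2))
      print("estimated transit times [us]:", sorted((i1 / fs * 1e6, i2 / fs * 1e6)))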

  12. Determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution

    NASA Technical Reports Server (NTRS)

    Ioup, George E.; Ioup, Juliette W.

    1991-01-01

    The final report for work on the determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution is presented. Papers and theses prepared during the research report period are included. Among all the research results reported, note should be made of the specific investigation of the determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution. A methodology was developed to determine design and operation parameters for error minimization when deconvolution is included in data analysis. An error surface is plotted versus the signal-to-noise ratio (SNR) and all parameters of interest. Instrumental characteristics will determine a curve in this space. The SNR and parameter values which give the projection from the curve to the surface, corresponding to the smallest value for the error, are the optimum values. These values are constrained by the curve and so will not necessarily correspond to an absolute minimum in the error surface.
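
    A schematic version of the described procedure is sketched below: a deconvolution error is tabulated over a grid of SNR and one instrument parameter, the instrument-imposed curve through that space is evaluated, and the minimum along the curve is taken as the operating point. The error model and the SNR-versus-parameter relation are invented placeholders.

      import numpy as np

      # Error-surface sketch: tabulate a deconvolution error over SNR and one
      # instrument parameter, then minimize along the curve the instrument allows.
      # The error model and the SNR-versus-width relation are invented placeholders.
      snr = np.linspace(5, 100, 200)                  # achievable signal-to-noise ratios
      width = np.linspace(0.1, 2.0, 200)              # instrument resolution parameter
      S, W = np.meshgrid(snr, width, indexing="ij")
      error = W ** 2 + 50.0 / (S * W)                 # blur term + noise-amplification term

      i, j = np.unravel_index(np.argmin(error), error.shape)
      print(f"unconstrained surface minimum: width={W[i, j]:.2f}, SNR={S[i, j]:.1f}")

      # Instrument constraint: a wider aperture gathers more signal, so SNR rises
      # with width; this defines the curve whose projection onto the surface is used.
      snr_of_width = 5 + 45 * width
      err_on_curve = width ** 2 + 50.0 / (snr_of_width * width)
      k = int(np.argmin(err_on_curve))
      print(f"optimum along instrument curve: width={width[k]:.2f}, SNR={snr_of_width[k]:.1f}")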

  13. Comparison of active-set method deconvolution and matched-filtering for derivation of an ultrasound transit time spectrum.

    PubMed

    Wille, M-L; Zapf, M; Ruiter, N V; Gemmeke, H; Langton, C M

    2015-06-21

    The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has a better accuracy (0.13 μs versus 0.18 μs standard deviations), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. A higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity. PMID:26047163

  14. Dynamic analysis tool development for advanced geometry wind turbine blades

    NASA Astrophysics Data System (ADS)

    Larwood, Scott Michael

    This dissertation describes work to develop a dynamic analysis code for swept wind turbine blades. Because of their aeroelastic behavior, swept blades offer the potential to increase energy capture and lower fatigue loads. This work was an outgrowth of a United States Department of Energy contract on swept blades, for which the author used the Adams(TM) dynamics software. The author based the new code on the National Renewable Energy Laboratory's FAST code. The new code would allow for lower cost analysis and faster computation times for swept blades compared to Adams. The FAST revisions included the geometry and mode shapes required for the bending and twisting motion of the swept blade. The author also developed a finite-element program to determine mode shapes for the swept blade. The author verified the new code with Adams. The comparisons were favorable; however, the Adams model exhibited more twist. The differences may be attributed to differences in modeling approach. The author attempted to validate the code with field test data; however, uncertainties in the test wind speed and the turbine controller made comparison difficult. The author used the new code to perform preliminary designs of swept rotors for 1.5 MW and 3.0 MW wind turbines. The designs showed a 5% increase in annual energy production and a decrease in flap-bending fatigue over the baseline straight-blade designs.

  15. Strengthened IAEA Safeguards-Imagery Analysis: Geospatial Tools for Nonproliferation Analysis

    SciTech Connect

    Pabian, Frank V [Los Alamos National Laboratory]

    2012-08-14

    This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation relevant activities, facilities, and programs. Foremost of the geospatial tools are the 'Digital Virtual Globes' (i.e., GoogleEarth, Virtual Earth, etc.) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of states site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open-source', such virtual globes can also provide a new, essentially free, means to conduct broad area search for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a.: 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated infrastructure. The digital globes also provide highly accurate terrain mapping for better geospatial context and allow detailed 3-D perspectives of all sites or areas of interest. 3-D modeling software (i.e., Google's SketchUp6 newly available in 2007) when used in conjunction with these digital globes can significantly enhance individual building characterization and visualization (including interiors), allowing for better assessments including walk-arounds or fly-arounds and perhaps better decision making on multiple levels (e.g., the best placement for International Atomic Energy Agency (IAEA) video monitoring cameras).

  16. Automated analysis tools for reducing spacecraft telemetry data

    SciTech Connect

    Voss, T.J.

    1993-04-26

    A practical description is presented of the methods used to reduce spacecraft telemetry data using a hierarchical toolkit of software programs developed for a UNIX environment. A project requiring the design, implementation and test flight of small, lightweight spacecraft was recently conducted. This spacecraft development required hundreds of tests and integrations of subsystems on several special purpose testbeds, with each test creating large amounts of telemetered data. This paper focuses on the automated analysis and reduction of data which is telemetered from one of the key subsystems, the Probe. A typical telemetry stream from a testbed run averaged 50 Megabytes of raw data, containing over 1600 system variables. The large telemetry file (raw data) sent from the Probe was decoded and decomposed into a large number of smaller Break Out Files (BOFs) containing variables with timestamps, and image files.

  17. Raman optical activity: a tool for protein structure analysis.

    PubMed

    Zhu, Fujiang; Isaacs, Neil W; Hecht, Lutz; Barron, Laurence D

    2005-10-01

    On account of its sensitivity to chirality, Raman optical activity (ROA), measured here as the intensity of a small, circularly polarized component in the scattered light using unpolarized incident light, is a powerful probe of protein structure and behavior. Protein ROA spectra provide information on secondary and tertiary structures of polypeptide backbones, backbone hydration, and side chain conformations, and on structural elements present in unfolded states. This article describes the ROA technique and presents ROA spectra, recorded with a commercial instrument of novel design, of a selection of proteins to demonstrate how ROA may be used to readily distinguish between the main classes of protein structure. A principal component analysis illustrates how the many structure-sensitive bands in protein ROA spectra are favorable for applying pattern recognition techniques to determine structural relationships between different proteins. PMID:16216573

  18. 2D image fuzzy deconvolution and scattering centre detection

    Microsoft Academic Search

    Luigi Giubbolini; Paul Pazandak

    2010-01-01

    A new innovative technique based on fuzzy deconvolution for scattering centre detection (F-SCD) is proposed together with its implementation in FPGA for real-time deployment in UAV and automotive collision avoidance application. F-SCD emulates the human interpretation of radar images using fuzzy measurement of features of the radar Point Spread Function (PSF) differently from other classic detection techniques. The first stage

  19. Frequency-domain blind deconvolution based on mutual information rate

    Microsoft Academic Search

    Anthony Larue; Jérôme I. Mars; Christian Jutten

    2006-01-01

    In this paper, a new blind single-input single-output (SISO) deconvolution method based on the minimization of the mutual information rate of the deconvolved output is proposed. The method works in the frequency domain and requires estimation of the signal probability density function. Thus, the algorithm uses higher order statistics (except for Gaussian source) and allows non-minimum-phase filter estimation.

  20. Robust Speech Dereverberation Using Multichannel Blind Deconvolution With Spectral Subtraction

    Microsoft Academic Search

    Ken'ichi Furuya; Akitoshi Kataoka

    2007-01-01

    A robust dereverberation method is presented for speech enhancement in a situation requiring adaptation where a speaker shifts his/her head under reverberant conditions causing the impulse responses to change frequently. We combine correlation-based blind deconvolution with modified spectral subtraction to improve the quality of inverse-filtered speech degraded by the estimation error of inverse filters obtained in practice. Our method computes

  1. Seismic interferometry by multidimensional deconvolution without wavefield separation

    NASA Astrophysics Data System (ADS)

    Ravasi, Matteo; Meles, Giovanni; Curtis, Andrew; Rawlinson, Zara; Yikuo, Liu

    2015-07-01

    Seismic interferometry comprises a suite of methods to redatum recorded wavefields to those that would have been recorded if different sources (so-called virtual sources) had been activated. Seismic interferometry by cross-correlation has been formulated using either two-way (for full wavefields) or one-way (for directionally decomposed wavefields) representation theorems. To obtain improved Green's function estimates, the cross-correlation result can be deconvolved by a quantity that identifies the smearing of the virtual source in space and time, the so-called point-spread function. This type of interferometry, known as interferometry by multidimensional deconvolution (MDD), has so far been applied only to one-way directionally decomposed fields, requiring accurate wavefield decomposition from dual (e.g. pressure and velocity) recordings. Here we propose a form of interferometry by multidimensional deconvolution that uses full wavefields with two-way representations, and simultaneously invert for pressure and (normal) velocity Green's functions, rather than only velocity responses as for its one-way counterpart. Tests on synthetic data show that two-way MDD improves on results of interferometry by cross-correlation, and generally produces estimates of similar quality to those obtained by one-way MDD, suggesting that the preliminary decomposition into up- and downgoing components of the pressure field is not required if pressure and velocity data are jointly used in the deconvolution. We also show that constraints on the directionality of the Green's functions sought can be added directly into the MDD inversion process to further improve two-way multidimensional deconvolution. Finally, as a by-product of having pressure and particle velocity measurements, we adapt one- and two-way representation theorems to convert any particle velocity receiver into its corresponding virtual dipole/gradient source by means of MDD. Thus data recorded from standard monopolar (e.g. marine) pressure sources can be converted into data from dipolar (derivative) sources at no extra acquisition cost.
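
    A heavily simplified, single-channel frequency-domain analogue of the deconvolution step is sketched below: the cross-correlation result is divided by the point-spread function with damping. Real MDD inverts a matrix over receiver positions and wavefield components; this scalar toy only illustrates the role of the point-spread function.

      import numpy as np

      # Scalar sketch of deconvolution interferometry: divide the cross-correlation
      # C by the point-spread function P in the frequency domain, with damping.
      # Real MDD inverts a matrix over receivers; this 1-D toy is illustrative only.
      rng = np.random.default_rng(2)
      n = 512
      g_true = np.zeros(n); g_true[[40, 90]] = [1.0, -0.5]            # target response
      p = np.exp(-0.5 * ((np.arange(n) - 8) / 3.0) ** 2)              # point-spread function
      c = np.real(np.fft.ifft(np.fft.fft(g_true) * np.fft.fft(p)))    # correlation result
      c += 0.01 * rng.standard_normal(n)

      P, C = np.fft.fft(p), np.fft.fft(c)
      eps = 1e-2 * np.max(np.abs(P)) ** 2                             # damping term
      g_est = np.real(np.fft.ifft(C * np.conj(P) / (np.abs(P) ** 2 + eps)))
      print("recovered arrivals at samples:", sorted(np.argsort(np.abs(g_est))[-2:].tolist()))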

  2. JULIDE: A Software Tool for 3D Reconstruction and Statistical Analysis of Autoradiographic Mouse Brain Sections

    PubMed Central

    Ribes, Delphine; Parafita, Julia; Charrier, Rémi; Magara, Fulvio; Magistretti, Pierre J.; Thiran, Jean-Philippe

    2010-01-01

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool. PMID:21124830

  3. Object-oriented parser-based finite element analysis tool interface

    NASA Astrophysics Data System (ADS)

    Koo, Donald; Peak, Russell S.; Fulton, Robert E.

    1999-08-01

    To better integrate engineering design and analysis, the multi-representation architecture (MRA) and related methodology have been developed to represent the information transformations between CAD and CAE models. The MRA consists of four representations for increased modularity and flexibility. As one of the representations, solution method models (SMMs) are object-oriented wrappings of tool-specific inputs and outputs that enable highly automated operation of general purpose analysis tools. Outer contexts in the MRA create SMMs from product data and map SMM results back into product-specific terms.

  4. Adaptive wavelet-based deconvolution method for remote sensing imaging.

    PubMed

    Zhang, Wei; Zhao, Ming; Wang, Zhile

    2009-08-20

    Fourier-based deconvolution (FoD) techniques, such as modulation transfer function compensation, are commonly employed in remote sensing. However, the noise is strongly amplified by FoD and is colored, thus producing poor visual quality. We propose an adaptive wavelet-based deconvolution algorithm for remote sensing called wavelet denoise after Laplacian-regularized deconvolution (WDALRD) to overcome the colored noise and to preserve the textures of the restored image. This algorithm adaptively denoises the FoD result on a wavelet basis. The term "adaptive" means that the wavelet-based denoising procedure requires no parameter to be estimated or empirically set, and thus the inhomogeneous Laplacian prior and the Jeffreys hyperprior are proposed. Maximum a posteriori estimation based on such a prior and hyperprior leads us to an adaptive and efficient nonlinear thresholding estimator, and therefore WDALRD is computationally inexpensive and fast. Experimentally, textures and edges of the restored image are well preserved and sharp, while the homogeneous regions remain noise free, so WDALRD gives satisfactory visual quality. PMID:19696869
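
    The "deconvolve, then wavelet-denoise" structure can be sketched in one dimension as follows, using a fixed soft threshold in place of the adaptive Jeffreys-hyperprior estimator described in the paper; the signal, PSF and regularization weight are illustrative assumptions.

      import numpy as np
      import pywt

      # Sketch of Fourier deconvolution followed by wavelet denoising (1-D analogue).
      # A fixed soft threshold replaces the paper's adaptive estimator.
      rng = np.random.default_rng(3)
      n = 1024
      x = np.cumsum(rng.standard_normal(n)) / 10.0                   # smooth "scene"
      psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 4.0) ** 2); psf /= psf.sum()
      H = np.fft.fft(np.fft.ifftshift(psf))
      y = np.real(np.fft.ifft(np.fft.fft(x) * H)) + 0.02 * rng.standard_normal(n)

      # Laplacian-regularized Fourier deconvolution (noise is amplified and colored).
      lap = np.abs(2 - 2 * np.cos(2 * np.pi * np.fft.fftfreq(n))) ** 2
      x_fod = np.real(np.fft.ifft(np.conj(H) * np.fft.fft(y) / (np.abs(H) ** 2 + 1e-3 * lap)))

      # Wavelet soft-thresholding of the deconvolved result.
      coeffs = pywt.wavedec(x_fod, "db4", level=5)
      thr = 3.0 * np.median(np.abs(coeffs[-1])) / 0.6745             # robust noise estimate
      coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
      x_hat = pywt.waverec(coeffs, "db4")[:n]
      print("rms error, FoD vs wavelet-denoised:",
            round(float(np.std(x_fod - x)), 4), round(float(np.std(x_hat - x)), 4))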

  5. Extraction of pencil beam kernels by the deconvolution method

    SciTech Connect

    Chui, C.; Mohan, R.

    1988-03-01

    A method has been developed to extract pencil beam kernels from measured broad beam profiles. In theory, the convolution of a symmetric kernel with a step function will yield a function that is symmetric about the inflection point. Conversely, by deconvolution, the kernel may be extracted from a measured distribution. In practice, however, due to the uncertainties and errors associated with the measurements and due to the singularities produced in the fast Fourier transforms employed in the deconvolution process, the kernels thus obtained and the dose distributions calculated therefrom, often exhibit erratic fluctuations. We propose a method that transforms measured profiles to new, modified distributions so that they satisfy the theoretical symmetry condition. The resultant kernel from the deconvolution is then free of fluctuations. We applied this method to compute photon and electron dose distributions at various depths in water and electron fluence distributions in air. The agreement between measured and computed profiles is within 1% in dose or 1 mm in distance in high dose gradient regions.
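
    A toy version of the kernel-extraction idea is sketched below: a 1-D broad-beam profile is formed by convolving a symmetric kernel with an ideal field aperture, and the kernel is recovered by damped Fourier deconvolution. The Tikhonov damping stands in for the authors' symmetrization of the measured profile, and the field size and kernel shape are illustrative.

      import numpy as np

      # Toy kernel extraction: broad-beam profile = kernel (x) field aperture, then
      # recover the kernel by damped Fourier deconvolution. Illustrative only.
      n, w = 512, 201                                    # samples, broad-beam field width
      x = np.arange(n) - n // 2
      kernel_true = np.exp(-np.abs(x) / 6.0); kernel_true /= kernel_true.sum()
      field = (np.abs(x) <= w // 2).astype(float)        # ideal broad-beam aperture
      profile = np.real(np.fft.ifft(np.fft.fft(field) *
                                    np.fft.fft(np.fft.ifftshift(kernel_true))))
      profile += 0.002 * np.random.default_rng(4).standard_normal(n)   # measurement noise

      F, P = np.fft.fft(field), np.fft.fft(profile)
      eps = 1e-4 * np.max(np.abs(F)) ** 2                # Tikhonov damping
      kernel_est = np.fft.fftshift(np.real(np.fft.ifft(P * np.conj(F) /
                                                       (np.abs(F) ** 2 + eps))))
      print("recovered kernel peak offset from centre:", int(np.argmax(kernel_est)) - n // 2)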

  6. Joint deconvolution and interpolation of remote sensing data

    NASA Astrophysics Data System (ADS)

    Kane, Jonathan A.; Rodi, William

    2004-02-01

    We present a method for the simultaneous deconvolution and interpolation of remote sensing data in a single joint inverse problem. Joint inversion allows sparsely sampled data to improve deconvolution results and, conversely, allows large-scale blurred data to improve the interpolation of sampled data. Geostatistical interpolation and geostatistically damped deconvolution are special cases of such a joint inverse problem. Our method is posed in the Bayesian framework and requires the definition of likelihood functions for each data set involved, as well as a prior model of the parameter field of interest. The solution of such a problem is the posterior probability distribution. We present an algorithm for finding the maximum of this distribution. The particular application we apply our algorithm to is the fusion of digital elevation model and global positioning system data sets. The former data set is a larger-scale blurred image of topography, while the latter represents point samples of the same field. A synthetic data set is constructed to first show the performance of the method. Real data is then inverted.
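
    The joint use of blurred, densely sampled data and sparse point samples can be sketched as a stacked, damped least-squares problem, which is a simplified stand-in for the Bayesian maximum a posteriori estimate described above; the 1-D field, blur kernel, sampling pattern and weights are all illustrative assumptions.

      import numpy as np

      # Joint inversion sketch: one 1-D field observed as (a) a blurred dense profile
      # and (b) sparse point samples, recovered by stacked, damped least squares.
      rng = np.random.default_rng(5)
      n = 200
      m_true = np.sin(np.arange(n) / 12.0) + rng.standard_normal(n).cumsum() / 30.0

      kernel = np.exp(-0.5 * (np.arange(-10, 11) / 4.0) ** 2); kernel /= kernel.sum()
      B = np.array([np.convolve(np.eye(n)[i], kernel, mode="same") for i in range(n)]).T
      idx = np.sort(rng.choice(n, size=15, replace=False))     # sparse sample locations
      S = np.eye(n)[idx]                                       # sampling operator
      D = np.diff(np.eye(n), axis=0)                           # smoothness (first difference)

      y_blur = B @ m_true + 0.02 * rng.standard_normal(n)      # dense, blurred data
      y_pts = m_true[idx] + 0.01 * rng.standard_normal(idx.size)   # precise point data

      w1, w2, w3 = 1 / 0.02, 1 / 0.01, 2.0                     # data and prior weights
      A = np.vstack([w1 * B, w2 * S, w3 * D])
      b = np.concatenate([w1 * y_blur, w2 * y_pts, np.zeros(n - 1)])
      m_hat = np.linalg.lstsq(A, b, rcond=None)[0]
      print("rms error of joint estimate:",
            round(float(np.sqrt(np.mean((m_hat - m_true) ** 2))), 3))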

  7. Estimating backscatter spectra after deconvolution with Kalman smoothing

    NASA Astrophysics Data System (ADS)

    Guenter, Armin I.

    2001-05-01

    In quantitative tissue characterization, obtaining processed ultrasonic echoes that have a direct relationship to the local tissue response (backscatter spectrum) and that are free from systemic depth-dependent effects, such as diffraction, is essential. In general practice today, these unwanted distortions are eliminated by dividing short time power spectra. However, this method has its drawbacks; noise is not taken into account, and shorter time gates lead to an increasing bias within the relative spectra. To overcome these methodological issues, I propose a different approach as follows. Entire deconvolved A-scans are estimated by a Kalman smoothing deconvolution algorithm. These then serve as a basis for estimating the relative backscatter spectra. In addition, due to the principle of the deconvolution algorithm, it is possible to suppress additive noise to some degree. To examine the properties of the method proposed, this paper presents an analytical expression for the power spectrum of the deconvolved signals obtained by Kalman Smoothing. This result is then compared to the expectations of relative short time power spectra. Simulations demonstrate the behavior of the deconvolution method in a non-stationary environment.

  8. Time series analysis as a tool for karst water management

    NASA Astrophysics Data System (ADS)

    Fournier, Matthieu; Massei, Nicolas; Duran, Léa

    2015-04-01

    Karst hydrosystems are well known for their vulnerability to turbidity due to their complex and unique characteristics, which make them very different from other aquifers. Moreover, many parameters can affect their functioning. This makes the characterization of their vulnerability difficult and requires the use of statistical analyses. Time series analyses of turbidity, electrical conductivity and water discharge datasets, such as correlation and spectral analyses, have proven useful in improving our understanding of karst systems. However, the loss of information on time localization is a major drawback of those Fourier spectral methods; this problem has been overcome by the development of wavelet analysis (continuous or discrete) for hydrosystems, offering the possibility to better characterize the complex modalities of variation inherent to non-stationary processes. Nevertheless, with the wavelet transform the signal is decomposed into several continuous wavelet components, which may not hold for the local-time processes frequently observed in karst aquifers. More recently, a new approach associating empirical mode decomposition and the Hilbert transform was presented for hydrosystems. It allows an orthogonal decomposition of the analyzed signal and provides a more accurate estimation of changing variability scales across time for highly transient signals. This study aims to identify the natural and anthropogenic parameters which control turbidity released at a well used for drinking water supply. The well is located in the chalk karst aquifer near the Seine river, 40 km from the Seine estuary in the western Paris Basin. At this location, tidal variations greatly affect the level of the water in the Seine. Continuous wavelet analysis of the turbidity dataset has been used to decompose turbidity release at the well into three components: i) the rain event periods, ii) the pumping periods and iii) the tidal range of the Seine river. Time-domain reconstruction by inverse wavelet transform allows the assessment of the variance explained by each component. Then, empirical mode decomposition and the Hilbert transform highlight the highly transient signal at the karst well due to the combined effect of pumping and tidal range on the turbidity signal. Finally, univariate clustering has been used to identify turbidity origins and their periods of occurrence at the scale of the hydrologic year. These results demonstrate the impact of tidal range on turbidity release at the well and enable improved water resource management for the well owner.
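
    A minimal sketch of the kind of continuous wavelet analysis described above is given below, applied to a synthetic turbidity-like record containing tidal, pumping and rain-event components; the synthetic signal, the Morlet wavelet choice and the scale range are assumptions for illustration.

      import numpy as np
      import pywt

      # Continuous wavelet sketch on a synthetic turbidity-like record containing
      # tidal, pumping and rain-event components. All signal parameters are invented.
      dt_hours = 1.0
      t = np.arange(0, 24 * 60, dt_hours)                       # ~60 days, hourly samples
      tide = 0.4 * np.sin(2 * np.pi * t / 12.4)                 # semi-diurnal tide
      pumping = 0.6 * ((t % 24) < 8)                            # daily pumping cycles
      rain = np.zeros_like(t)
      rain[np.random.default_rng(6).choice(t.size, 5, replace=False)] = 3.0
      turbidity = tide + pumping + np.convolve(rain, np.exp(-np.arange(48) / 12.0), "same")

      scales = np.arange(2, 128)
      coefs, freqs = pywt.cwt(turbidity, scales, "morl", sampling_period=dt_hours)
      power = np.abs(coefs) ** 2                                # scale-by-time power
      dominant_period = 1.0 / freqs[int(np.argmax(power.mean(axis=1)))]
      print(f"dominant period in the record: ~{dominant_period:.1f} hours")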

  9. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    PubMed

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers. PMID:19450233

  10. A software tool for 3D dose verification and analysis

    NASA Astrophysics Data System (ADS)

    Sa'd, M. Al; Graham, J.; Liney, G. P.

    2013-06-01

    The main recent developments in radiotherapy have focused on improved treatment techniques in order to generate further significant improvements in patient prognosis. There is now an internationally recognised need to improve 3D verification of highly conformal radiotherapy treatments. This is because of the very high dose gradients used in modern treatment techniques, which can result in a small error in the spatial dose distribution leading to a serious complication. In order to gain the full benefits of using 3D dosimetric technologies (such as gel dosimetry), it is vital to use 3D evaluation methods and algorithms. We present in this paper a software solution that provides a comprehensive 3D dose evaluation and analysis. The software is applied to gel dosimetry, which is based on magnetic resonance imaging (MRI) as a read-out method. The software can also be used to compare any two dose distributions, such as two distributions planned using different methods of treatment planning systems, or different dose calculation algorithms.

  11. Climbing the CO Ladder: An Automated CO Excitation Analysis Tool

    NASA Astrophysics Data System (ADS)

    Rosenberg, Marissa; Van der Werf, P.; Loenen, E.; Israel, F. P.

    2013-01-01

    In order to determine the physical properties of the ISM in galactic nuclei, we fit the 12CO excitation ladder with a large grid of Photon-Dominated Region (PDR) and X-ray Dominated Region (XDR) models. Since the HerCULES open time key program is gathering many extragalactic SPIRE spectra with multiple 12CO transitions, we developed an automated process to analyze the excitation components in these galaxies. The program finds the chi-squared minimized fit of up to three PDR/XDR components, as well as graphically displaying the chi-squared values for the other models, in order to analyze the degeneracies of the best fit models. A test case to validate our method is the merger Arp 299, which is composed of three sources. Source A (IC 694) harbors a dust enshrouded AGN, Source B (NGC 3690) is a starburst nucleus, and Source C is a small compact galaxy to the northwest of the merging galaxies. We find that our method is successful at automatically fitting their CO ladders, as well as distinguishing between XDR and PDR excitation, especially for the highest J transitions. We also find that the analysis of the chi-squared distribution is essential to interpreting the best-fit models and determining their validity.
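
    The chi-squared grid search at the heart of such a fitting tool can be sketched as follows; the toy "model grid" below is a made-up parametric ladder, used only to illustrate the minimization and the delta-chi-squared degeneracy check, not an actual PDR/XDR model set.

      import numpy as np

      # Chi-squared grid-search sketch over a toy model grid of CO ladders; the model
      # form, noise level and grid ranges are invented for illustration only.
      j_up = np.arange(1, 14)                                    # CO J=1-0 ... J=13-12
      def model_ladder(t_ex, norm):                              # toy excitation ladder
          return norm * j_up ** 2 * np.exp(-2.77 * j_up * (j_up + 1) / t_ex)

      rng = np.random.default_rng(7)
      obs = model_ladder(55.0, 1.0) * (1 + 0.1 * rng.standard_normal(j_up.size))
      sigma = 0.1 * np.abs(obs) + 1e-3                           # assumed uncertainties

      t_grid = np.linspace(20, 150, 131)
      n_grid = np.linspace(0.3, 3.0, 109)
      chi2 = np.array([[np.sum(((obs - model_ladder(t, a)) / sigma) ** 2)
                        for a in n_grid] for t in t_grid])
      i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
      print(f"best fit: T_ex={t_grid[i]:.0f} K, norm={n_grid[j]:.2f}, chi2={chi2[i, j]:.1f}")
      print("grid points within delta-chi2 < 2.3 of the minimum:",
            int(np.sum(chi2 < chi2[i, j] + 2.3)))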

  12. Differential Power Analysis as a digital forensic tool.

    PubMed

    Souvignet, T; Frinken, J

    2013-07-10

    Electronic payment fraud is considered a serious international crime by Europol. An important part of this fraud comes from payment card data skimming. This type of fraud consists of an illegal acquisition of payment card details when a user is withdrawing cash at an automated teller machine (ATM) or paying at a point of sale (POS). Modern skimming devices, also known as skimmers, use secure crypto-algorithms (e.g. Advanced Encryption Standard (AES)) to protect skimmed data stored within their memory. In order to provide digital evidence in criminal cases involving skimmers, law enforcement agencies (LEAs) must retrieve the plaintext skimmed data, generally without having knowledge of the secret key. This article proposes an alternative to the current solution at the Bundeskriminalamt (BKA) to reveal the secret key. The proposed solution is non-invasive, based on Power Analysis Attack (PAA). This article first describes the structure and the behaviour of an AES skimmer, followed by the proposal of the full operational PAA process, from power measurements to attack computation. Finally, it presents results obtained in several cases, explaining the latest improvements and providing some ideas for further developments. PMID:23623248

  13. Funtools: Fits Users Need Tools for Quick, Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Mandel, Eric; Brederkamp, Joe (Technical Monitor)

    2001-01-01

    The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

  14. Bayesian networks as a tool for epidemiological systems analysis

    NASA Astrophysics Data System (ADS)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but to additionally, and empirically, separate these into those directly and indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.

  15. Metrics and Measures for Intelligence Analysis Task Difficulty (First Annual Conference on Intelligence Analysis Methods and Tools, May 2005, PNWD-SA-6904)

    E-print Network

    Abstract: Evaluating the effectiveness of tools in assessing task difficulty. 1. Introduction: An active area of research is the design and development of tools

  16. Data analysis techniques: a tool for cumulative exposure assessment.

    PubMed

    Lalloué, Benoît; Monnez, Jean-Marie; Padilla, Cindy; Kihal, Wahida; Zmirou-Navier, Denis; Deguen, Séverine

    2015-01-01

    Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g. green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, with a view to assessing the environmental burden experienced by populations. We illustrate this approach in a large French metropolitan area. The study was carried out in the Great Lyon area (France, 1.2 M inhabitants) at the census Block Group (BG) scale. We used as environmental indicators ambient air NO2 annual concentrations, noise levels and proximity to green spaces, to industrial plants, to polluted sites and to road traffic. They were synthesized using Multiple Factor Analysis (MFA), a data-driven technique without a priori modeling, followed by a hierarchical clustering to create BG classes. The first components of the MFA explained, respectively, 30, 14, 11 and 9% of the total variance. Clustering into five classes grouped the BGs as follows: (1) a particular type of large BGs without population; (2) BGs of green residential areas, with less negative exposures than average; (3) BGs of residential areas near midtown; (4) BGs close to industries; and (5) midtown urban BGs, with higher negative exposures than average and less green space. Other numbers of classes were tested in order to assess a variety of clusterings. We present an approach using statistical factor and cluster analysis techniques, which seem overlooked for assessing cumulative exposure in complex environmental settings. Although it cannot be applied directly for risk or health effect assessment, the resulting index can help to identify hot spots of cumulative exposure, to prioritize urban policies or to compare the environmental burden across study areas in an epidemiological framework. PMID:25248936
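
    The factor-analysis-then-clustering workflow can be sketched with standard tools, here using PCA as a simplified stand-in for Multiple Factor Analysis and random placeholder data for the block-group indicators; the number of components and classes are assumptions for illustration.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import AgglomerativeClustering

      # Factor-analysis-then-clustering sketch, with PCA standing in for MFA and
      # random placeholder data for six block-group exposure indicators.
      rng = np.random.default_rng(8)
      X = rng.standard_normal((500, 6))                  # 500 block groups, 6 indicators
      X[:, 1] += 0.8 * X[:, 0]                           # induce some correlation

      Z = StandardScaler().fit_transform(X)              # standardize the indicators
      scores = PCA(n_components=4).fit_transform(Z)      # synthetic exposure factors
      labels = AgglomerativeClustering(n_clusters=5).fit_predict(scores)
      for k in range(5):
          members = labels == k
          print(f"class {k}: {int(members.sum()):3d} block groups, "
                f"mean factor-1 score {scores[members, 0].mean():+.2f}")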

  17. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    SciTech Connect

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  18. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    NASA Technical Reports Server (NTRS)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data were from several KSC tests already available in the public literature, as well as data from NIST and other highly respectable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.

  19. The digital penalized LMS deconvolution method for TPC X-ray polarimeter signal processing

    NASA Astrophysics Data System (ADS)

    He, L.; Deng, Z.; Li, H.; Liu, Y. N.; Feng, H.

    2015-04-01

    This article presents the Digital Penalized LMS (Least Mean Square) deconvolution method for processing the output signal of the readout electronics of a TPC X-ray polarimeter. The deconvolution filter is used to recover the high-frequency component of the detector signal, which is lost due to the limited bandwidth of the readout electronics. The DPLMS deconvolution method does not need to know the transfer function of the readout electronics system in advance and can restrain the deconvolution noise by using a noise constraint. In this paper, this method is applied to process simulation data generated by GEANT4, and the resulting photoelectron angular resolution of an X-ray polarimeter is presented.
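
    The idea of training a deconvolution filter with an LMS-type rule can be sketched as follows, where a normalized LMS update learns an inverse FIR filter that maps the band-limited readout waveform back toward the fast detector pulses; the pulse model, electronics response, filter length and step size are assumptions, and this sketch is not the DPLMS algorithm itself.

      import numpy as np

      # Normalized-LMS training of an inverse (deconvolution) FIR filter that maps a
      # band-limited readout waveform back toward fast detector pulses. Illustrative.
      rng = np.random.default_rng(9)
      n = 20000
      s = (rng.random(n) < 0.01) * rng.exponential(1.0, n)        # sparse detector pulses
      h = np.exp(-np.arange(40) / 8.0); h /= h.sum()              # slow readout response
      y = np.convolve(s, h, mode="full")[:n] + 0.01 * rng.standard_normal(n)

      taps, delay, mu = 64, 32, 0.5
      w = np.zeros(taps)
      for k in range(taps, n):
          xk = y[k - taps:k][::-1]                                # most recent sample first
          e = s[k - delay] - w @ xk                               # desired = delayed input
          w += mu * e * xk / (xk @ xk + 1e-8)                     # normalized LMS update

      s_hat = np.convolve(y, w, mode="full")[delay - 1:delay - 1 + n]
      print("correlation of recovered and true pulse trains:",
            round(float(np.corrcoef(s, s_hat)[0, 1]), 3))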

  20. TeloTool: a new tool for telomere length measurement from terminal restriction fragment analysis with improved probe intensity correction

    PubMed Central

    Göhring, Janett; Fulcher, Nick; Jacak, Jaroslaw; Riha, Karel

    2014-01-01

    Telomeres comprise the protective caps of natural chromosome ends and function in the suppression of DNA damage signaling and cellular senescence. Therefore, techniques used to determine telomere length are important in a number of studies, ranging from those investigating telomeric structure to effects on human disease. Terminal restriction fragment (TRF) analysis has long been shown to be one of the most accurate methods for quantification of absolute telomere length and range in a number of species. As this technique centers on standard Southern blotting, telomeric DNA is observed on resulting autoradiograms as a heterogeneous smear. Methods to accurately determine telomere length from telomeric smears have proven problematic, and no reliable technique has been suggested to obtain mean telomere length values. Here, we present TeloTool, a new program allowing thorough statistical analysis of TRF data. Using this new method, a number of methodological biases are removed from previously described techniques, including assumptions based on probe intensity corrections. This program provides a standardized means for quick and reliable extraction of quantitative data from TRF autoradiograms; its wide application will allow accurate comparison between datasets generated in different laboratories. PMID:24366880

  1. Ultrametric networks: a new tool for phylogenetic analysis

    PubMed Central

    2013-01-01

    Background The large majority of optimization problems related to the inference of distance-based trees used in phylogenetic analysis and classification is known to be intractable. One noted exception is found within the realm of ultrametric distances. The introduction of ultrametric trees in phylogeny was inspired by a model of evolution driven by the postulate of a molecular clock, now dismissed, whereby phylogeny could be represented by a weighted tree in which the sum of the weights of the edges separating any given leaf from the root is the same for all leaves. Both molecular clocks and rooted ultrametric trees fell out of fashion as credible representations of evolutionary change. At the same time, ultrametric dendrograms have shown good potential for purposes of classification in so far as they have proven to provide good approximations for additive trees. Most of these approximations are still intractable, but the problem of finding the nearest ultrametric distance matrix to a given distance matrix with respect to the L∞ distance has been long known to be solvable in polynomial time, the solution being incarnated in any minimum spanning tree for the weighted graph subtending to the matrix. Results This paper expands this subdominant ultrametric perspective by studying ultrametric networks, consisting of the collection of all edges involved in some minimum spanning tree. It is shown that, for a graph with n vertices, the construction of such a network can be carried out by a simple algorithm in optimal time O(n²), which is faster by a factor of n than the direct adaptation of the classical O(n³) paradigm by Warshall for computing the transitive closure of a graph. This algorithm, called UltraNet, will be shown to be easily adapted to compute relaxed networks and to support the introduction of artificial points to reduce the maximum distance between vertices in a pair. Finally, a few experiments will be discussed to demonstrate the applicability of subdominant ultrametric networks. Availability http://www.dei.unipd.it/~ciompin/main/Ultranet/Ultranet.html PMID:23497437
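
    The subdominant-ultrametric construction mentioned above can be sketched directly: the ultrametric distance between two points is the largest edge weight on the path joining them in a minimum spanning tree. The brute-force path scan below is for clarity only and does not reflect the optimal O(n²) UltraNet algorithm; the random point set is a placeholder.

      import numpy as np
      from scipy.sparse import csr_matrix
      from scipy.sparse.csgraph import minimum_spanning_tree

      # Subdominant-ultrametric sketch: u(i, j) is the largest edge weight on the MST
      # path between i and j. Brute-force path scan for clarity; data are placeholders.
      rng = np.random.default_rng(10)
      pts = rng.random((8, 2))
      d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))   # distance matrix

      mst = minimum_spanning_tree(csr_matrix(d)).toarray()
      mst = np.maximum(mst, mst.T)                                      # symmetrize edges

      n = d.shape[0]
      ultra = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              stack, seen = [(i, 0.0)], {i}          # DFS carrying the max edge so far
              while stack:
                  node, mx = stack.pop()
                  if node == j:
                      ultra[i, j] = ultra[j, i] = mx
                      break
                  for k in np.nonzero(mst[node])[0]:
                      if k not in seen:
                          seen.add(int(k))
                          stack.append((int(k), max(mx, mst[node, k])))
      print("ultrametric never exceeds the input distances:",
            bool(np.all(ultra <= d + 1e-12)))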

  2. Raman molecular chemical imaging: 3D Raman using deconvolution

    Microsoft Academic Search

    John S. Maier; Patrick J. Treado

    2004-01-01

    Chemical imaging is a powerful technique combining molecular spectroscopy and digital imaging for rapid, non-invasive and reagentless analysis of materials, including biological cells and tissues. Raman chemical imaging is suited to the characterization of molecular composition and structure of biomaterials at submicron spatial resolution (< 250 nm). As a result, Raman imaging has potential as a routine tool for the
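
    The record above is truncated; as a generic illustration of the deconvolution step on which depth-resolved chemical imaging relies (not necessarily the authors' method), the sketch below applies a few Richardson-Lucy iterations with a known point-spread function; the image and PSF are synthetic placeholders.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=20):
    """Basic Richardson-Lucy deconvolution of a 2D image."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Synthetic test: a small bright square blurred by a Gaussian PSF.
x, y = np.meshgrid(np.arange(-7, 8), np.arange(-7, 8))
psf = np.exp(-(x**2 + y**2) / 8.0)
truth = np.zeros((64, 64))
truth[30:34, 30:34] = 1.0
blurred = fftconvolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf)
```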

  3. AngioLab--a software tool for morphological analysis and endovascular treatment planning of intracranial aneurysms.

    PubMed

    Larrabide, Ignacio; Villa-Uriol, Maria-Cruz; Cárdenes, Rubén; Barbarito, Valeria; Carotenuto, Luigi; Geers, Arjan J; Morales, Hernán G; Pozo, José M; Mazzeo, Marco D; Bogunović, Hrvoje; Omedas, Pedro; Riccobene, Chiara; Macho, Juan M; Frangi, Alejandro F

    2012-11-01

    Determining whether and how an intracranial aneurysm should be treated is a tough decision that clinicians face every day. Emerging computational tools could help clinicians analyze clinical data and make these decisions. AngioLab is a single graphical user interface, developed on top of the open source framework GIMIAS, that integrates some of the latest image analysis and computational modeling tools for intracranial aneurysms. Two workflows are available: Advanced Morphological Analysis (AMA) and Endovascular Treatment Planning (ETP). AngioLab has been evaluated by a total of 62 clinicians, who considered the information provided by AngioLab relevant and meaningful. They acknowledged the emerging need for this type of tool and the potential impact it might have on the clinical decision-making process. PMID:22749086

  4. PROVAT: a tool for Voronoi tessellation analysis of protein structures and complexes

    Microsoft Academic Search

    Swanand P. Gore; David F. Burke; Tom L. Blundell

    2005-01-01

    Summary: Voronoi tessellation has proved to be a useful tool in protein structure analysis. We have developed PROVAT, a versatile public-domain software package that enables computation and visualization of Voronoi tessellations of proteins and protein complexes. It is a set of Python scripts that integrate freely available specialized software (Qhull, PyMOL, etc.) into a pipeline. The calculation component of
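
    As a minimal sketch of the tessellation step (not PROVAT's pipeline, which wraps Qhull and PyMOL through Python scripts), SciPy's Qhull bindings can compute which points share a Voronoi face and therefore count as neighbours; the coordinates below are placeholders standing in for atom positions.

```python
import numpy as np
from scipy.spatial import Voronoi

# Placeholder coordinates standing in for atom positions (x, y, z).
coords = np.random.default_rng(0).uniform(0.0, 20.0, size=(50, 3))

vor = Voronoi(coords)

# Two points are Voronoi neighbours if their cells share a ridge (face).
contacts = {tuple(sorted(pair)) for pair in vor.ridge_points}
print(f"{len(contacts)} neighbour pairs among {len(coords)} points")
```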

  5. Developing a Training Tool for Intraoperative Mitral Valve Analysis

    E-print Network

    Tenenholtz, Neil A.; Howe, Robert D.

    The mitral valve is one of the four valves of the human heart, serving as a passive check valve. [...] To produce such a fast simulation, a mass-spring approximation of a finite element model
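
    As an illustrative sketch of the mass-spring idea mentioned above (not the authors' simulator), the snippet below advances a short chain of point masses connected by springs using semi-implicit Euler integration; all constants are placeholders.

```python
import numpy as np

# Placeholder constants for a 1D chain of masses connected by springs.
n, mass, k, rest, damping, dt = 5, 0.01, 50.0, 1.0, 0.2, 1e-3

pos = np.arange(n, dtype=float) * rest
pos[-1] += 0.3                      # displace the last node to start motion
vel = np.zeros(n)

for _ in range(1000):
    force = np.zeros(n)
    stretch = np.diff(pos) - rest   # spring extensions between neighbours
    force[:-1] += k * stretch       # spring pulls the left node rightwards
    force[1:] -= k * stretch        # ...and the right node leftwards
    force -= damping * vel
    force[0] = 0.0                  # first node is fixed (boundary condition)
    vel += dt * force / mass        # semi-implicit Euler integration
    pos += dt * vel
    vel[0] = 0.0

print(pos)
```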

  6. An analysis tool for the assessment of student participation and implementation dynamics in online discussion forums

    Microsoft Academic Search

    Mark Pendergast

    2006-01-01

    In order to conduct a successful online forum, it is necessary to have a collection of proven provocative discussion topics, a sound technique to implement them, and a consistent way to assess student participation. In this paper I present an analysis tool that helps me evaluate individual student participation, the viability of the forum cases, and their success/failure of their

  7. Towards spatial data quality information analysis tools for experts

    E-print Network

    This paper presents the design of a tool that can manage heterogeneous, multi-granularity and context-sensitive spatial data quality indicators that help experts assess fitness for use. Keywords: Spatial data quality; Fitness for use; Visualization; Indicators; Spatial OLAP

  8. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and has led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494
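
    As a toy illustration of the variant lists these tools produce and consume (not any specific surveyed program), the sketch below tallies SNVs versus indels in a VCF file; the file path is a placeholder.

```python
from collections import Counter

def tally_variants(vcf_path):
    """Count SNVs and indels in a VCF file."""
    counts = Counter()
    with open(vcf_path) as vcf:
        for line in vcf:
            if line.startswith("#"):          # skip header lines
                continue
            fields = line.rstrip("\n").split("\t")
            ref, alts = fields[3], fields[4].split(",")
            for alt in alts:
                kind = "SNV" if len(ref) == 1 and len(alt) == 1 else "indel"
                counts[kind] += 1
    return counts

# Placeholder path; any VCF produced by a variant caller should work.
print(tally_variants("sample_variants.vcf"))
```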

  9. TumorTrace: an automated image analysis tool developed in MATLAB

    E-print Network

    Yavuz, Deniz

    The TumorTrace program is an automated image analysis tool, developed in MATLAB (The MathWorks), for quantifying morphology, protein expression and movement. Measurements are obtained by applying the one-pixel outline to the desired image channel and averaging the 8-connected neighborhood surrounding each pixel. It takes as input multiple image channels, either single images
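
    As a rough sketch of the outline-sampling operation described (written in Python rather than MATLAB, and not TumorTrace's own code), the snippet below extracts a one-pixel outline from a binary object mask and averages the 3x3 neighbourhood (each pixel plus its 8-connected neighbours) of an intensity channel along that outline; the mask and channel are synthetic.

```python
import numpy as np
from scipy import ndimage

def outline_intensity(mask, channel):
    """Mean of 3x3 neighbourhood averages along a one-pixel object outline."""
    mask = mask.astype(bool)
    outline = mask & ~ndimage.binary_erosion(mask)     # one-pixel boundary
    local_mean = ndimage.uniform_filter(channel.astype(float), size=3)
    return local_mean[outline].mean()

# Synthetic placeholder data: a filled disc and a noisy intensity channel.
yy, xx = np.mgrid[:128, :128]
mask = (yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2
rng = np.random.default_rng(0)
channel = mask * 100.0 + rng.normal(0.0, 5.0, mask.shape)
print(outline_intensity(mask, channel))
```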

  10. Application of Frameworks in the Analysis and (Re)design of Interactive Visual Learning Tools

    ERIC Educational Resources Information Center

    Liang, Hai-Ning; Sedig, Kamran

    2009-01-01

    Interactive visual learning tools (IVLTs) are software environments that encode and display information visually and allow learners to interact with the visual information. This article examines the application and utility of frameworks in the analysis and design of IVLTs at the micro level. Frameworks play an important role in any design. They…

  11. Integrated Tools for the Simulation Analysis of Peer-To-Peer Backup Systems

    E-print Network

    Paris-Sud XI, Université de

    Theoretically, backup systems using a peer-to-peer architecture can reach better [...] than centralized architectures. To evaluate the performance and estimate the resource usage of peer-to-peer backup systems, it is important

  12. Workshop on tools for program development and analysis in computational science

    Microsoft Academic Search

    Christof Klausecker; Arndt Bode; Andreas Knüpfer; Dieter Kranzlmüller; Jie Tao; Jens Volkert; Roland Wismüller

    2010-01-01

    The use of supercomputing technology, parallel and distributed processing, and sophisticated algorithms is of major importance for computational scientists. Yet, the scientists’ goals are to solve their challenging scientific problems, not the software engineering tasks associated with them. For that reason, computational science and engineering must be able to rely on dedicated support from program development and analysis tools. The

  13. REGULARIZATION TOOLS: A Matlab package for analysis and solution of discrete ill-posed problems

    Microsoft Academic Search

    Per Christian Hansen

    1994-01-01

    The package REGULARIZATION TOOLS consists of 54 Matlab routines for analysis and solution of discrete ill-posed problems, i.e., systems of linear equations whose coefficient matrix has the properties that its condition number is very large, and its singular values decay gradually to zero. Such problems typically arise in connection with discretization of Fredholm integral equations of the first kind, and
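
    As a small NumPy analogue of the core idea behind such packages (not a port of the Matlab routines), the sketch below solves a discrete ill-posed system by Tikhonov regularization through the SVD, damping components associated with small singular values; the test matrix and regularization parameter are placeholders.

```python
import numpy as np

def tikhonov_svd(A, b, lam):
    """Tikhonov-regularized solution of A x = b via the SVD.

    Filter factors s_i^2 / (s_i^2 + lam^2) damp components belonging to
    small singular values, which otherwise amplify noise in b.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeffs = (s / (s**2 + lam**2)) * (U.T @ b)
    return Vt.T @ coeffs

# Placeholder ill-conditioned system: a Hilbert-like matrix with noisy data.
n = 12
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
b = A @ x_true + 1e-4 * np.random.default_rng(0).standard_normal(n)
print(tikhonov_svd(A, b, lam=1e-3))
```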

  14. Urban goods movement (UGM) analysis as a tool for urban planning

    E-print Network

    Boyer, Edmond

    For several years, urban planning has been a major stake in the sustainable development of cities. Each decision taken for urban

  15. Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by the Environmental Protection Agency to evaluate greenhouse gas emissions and fuel efficiency of light-duty vehicles. It is a physics-based, forward-looking, full vehicle computer simulator, which is cap...

  16. Spectral analysis of seismic noise induced by rivers: A new tool to monitor spatiotemporal changes in

    E-print Network

    Demouchy, Sylvie

    The study area is the deeply incised channel of the Trisuli River, a major trans-Himalayan river. The seasonal increase in ambient noise coincides with the strong monsoon rainfall and a period
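
    As a generic sketch of the spectral estimate this kind of monitoring rests on (not the authors' processing chain), the snippet below computes a Welch power spectral density for a noise trace and sums the power in a chosen band; the trace, sampling rate and band edges are placeholders.

```python
import numpy as np
from scipy.signal import welch

fs = 100.0                                   # placeholder sampling rate, Hz
rng = np.random.default_rng(0)
trace = rng.standard_normal(int(3600 * fs))  # placeholder 1-hour noise trace

# Welch estimate of the power spectral density.
freqs, psd = welch(trace, fs=fs, nperseg=4096)

# Sum PSD over a hypothetical river-noise band (e.g. 1-10 Hz).
band = (freqs >= 1.0) & (freqs <= 10.0)
band_power = psd[band].sum() * (freqs[1] - freqs[0])
print(f"1-10 Hz power: {band_power:.3e}")
```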

  17. Combining multivariate statistical analysis with geographic information systems mapping: a tool for delineating groundwater contamination

    Microsoft Academic Search

    Silas E. Mathes; Todd C. Rasmussen

    2006-01-01

    Multivariate Statistical Analysis (MSA) has successfully been coupled with geographic information system (GIS) mapping tools to delineate zones of aquifer contamination potential. While delineating contaminants is key to site remediation, it is often compromised by a poor understanding of hydrogeologic conditions, and by uncertainties in contaminant observations. MSA provides improved estimates of contamination potential by augmenting observed contaminant concentrations with
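
    As a toy illustration of the multivariate step (not the authors' workflow), the sketch below runs a principal component analysis on a hypothetical matrix of well-chemistry measurements with NumPy; the resulting component scores are the kind of values that could then be mapped in a GIS.

```python
import numpy as np

# Hypothetical matrix: rows are monitoring wells, columns are analytes
# (e.g. nitrate, chloride, TCE), in arbitrary units.
rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(30, 5))

# Standardize, then compute principal components via the SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                 # well coordinates in component space
explained = s**2 / np.sum(s**2)   # variance fraction per component

print("explained variance ratios:", np.round(explained, 3))
print("first-component scores for mapping:", np.round(scores[:, 0], 2))
```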

  18. Türkçe İçin Genişletilebilir Sıklık Analiz Programı (An Extendible Frequency Analysis Tool for Turkish)

    Microsoft Academic Search

    Melek OKTAY; Atakan KURT; Mehmet KARA

    The analysis of Turkish texts is significant for the Turkish language, literature, and a wide spectrum of other areas. Counting language structures manually is a complicated task, so a computer application that processes and analyzes Turkish text documents or document sets (corpora) is beneficial. In this paper, a text processing and analysis tool is developed that analyzes texts and computes
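
    As a minimal sketch of the counting step such a tool automates (not the authors' program, and ignoring Turkish-specific morphology and casing), the snippet below tallies word frequencies with Python's standard library; the sample text is a placeholder.

```python
import re
from collections import Counter

def word_frequencies(text):
    """Case-insensitive word counts using Unicode word characters."""
    words = re.findall(r"\w+", text.lower())
    return Counter(words)

# Placeholder text; in practice this would be read from a document or corpus.
sample = "Kedi evde. Kedi bahçede değil, kedi evde."
for word, count in word_frequencies(sample).most_common(5):
    print(word, count)
```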

  19. Visual DSD: A design and analysis tool for DNA strand displacement systems

    Microsoft Academic Search

    Matthew R. Lakin; Simon Youssef; Stephen Emmott; Andrew Phillips

    2011-01-01

    The Visual DSD (DNA Strand Displacement) tool allows rapid prototyping and analysis of computational devices implemented using DNA strand displacement, in a convenient web-based graphical interface. It is an implementation of the DSD programming language and compiler described by Lakin et al. (2011) with additional features such as support for polymers of unbounded length. It also supports stochastic and deterministic
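
    As a toy illustration of the stochastic simulation such tools support (not the DSD language or its compiler), the sketch below runs a Gillespie simulation of a single irreversible displacement reaction, Input + Gate -> Output + Waste, with a hypothetical rate constant.

```python
import random

def gillespie_displacement(n_input, n_gate, k, t_end):
    """Stochastic trajectory of Input + Gate -> Output + Waste (rate k)."""
    t, output = 0.0, 0
    trajectory = [(t, output)]
    while t < t_end and n_input > 0 and n_gate > 0:
        propensity = k * n_input * n_gate
        t += random.expovariate(propensity)        # waiting time to next event
        n_input, n_gate, output = n_input - 1, n_gate - 1, output + 1
        trajectory.append((t, output))
    return trajectory

random.seed(0)
for time, out in gillespie_displacement(50, 50, k=1e-3, t_end=100.0)[:5]:
    print(f"t={time:.2f}s  output={out}")
```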

  20. Vidock: a Tool for Impact Analysis of Aspect Weaving on Test Cases

    E-print Network

    Paris-Sud XI, Université de

    Modifying a program through aspect weaving has an impact on its existing behaviors. If test cases exist for the program, it is necessary to identify the subset of test cases that trigger the behavior impacted by the aspect. This subset
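
    As a schematic sketch of the selection step described (not Vidock's actual analysis), the snippet below intersects per-test coverage with the set of methods matched by an aspect's pointcuts to obtain the impacted test subset; all names and data are hypothetical.

```python
# Hypothetical coverage map: test name -> methods it executes.
coverage = {
    "test_checkout": {"Cart.total", "Cart.add", "Payment.charge"},
    "test_add_item": {"Cart.add"},
    "test_refund":   {"Payment.refund"},
}

# Hypothetical set of methods matched by the aspect's pointcuts (i.e. the
# join points where advice is woven).
advised_methods = {"Payment.charge", "Payment.refund"}

def impacted_tests(coverage, advised):
    """Tests whose executed methods intersect the advised join points."""
    return sorted(t for t, methods in coverage.items() if methods & advised)

print(impacted_tests(coverage, advised_methods))
# -> ['test_checkout', 'test_refund']
```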