
PVT Analysis With A Deconvolution Algorithm  

SciTech Connect

Polyvinyl Toluene (PVT) plastic scintillator is the most common gamma ray detector material used for large systems when only gross counting is needed because of its low cost, robustness, and relative sensitivity. PVT does provide some energy information about the incident photons, as has been demonstrated through the development of Energy Windowing analysis. There is a more sophisticated energy analysis algorithm developed by Symetrica, Inc., and they have demonstrated the application of their deconvolution algorithm to PVT with very promising results. The thrust of such a deconvolution algorithm used with PVT is to allow for identification and rejection of naturally occurring radioactive material, reducing alarm rates, rather than the complete identification of all radionuclides, which is the goal of spectroscopic portal monitors. Under this condition, there could be a significant increase in sensitivity to threat materials. The advantage of this approach is an enhancement to the low cost, robust detection capability of PVT-based radiation portal monitor systems. The success of this method could provide an inexpensive upgrade path for a large number of deployed PVT-based systems to provide significantly improved capability at a much lower cost than deployment of NaI(Tl)-based systems of comparable sensitivity.

Kouzes, Richard T.



Figure 1. Software flowchart of the CAM-CM algorithm. CAM-CM: A Signal Deconvolution Tool for In Vivo Dynamic Contrast-Enhanced Imaging  

E-print Network

Figure 1. Software flowchart of the CAM-CM algorithm. CAM-CM: A Signal Deconvolution Tool ... mixed tissue heterogeneity. CAM-CM (Convex Analysis of Mixtures - Compartment Modeling) ... signal deconvolution parameters. CAM-CM can dissect complex tissues into regions with differential tracer kinetics at pixel

Wang, Ge


AIRY-LN: an ad-hoc numerical tool for deconvolution of images from the LBT instrument LINC-NIRVANA  

E-print Network

of the collection of various numerical packages that will support the ``Science'' of LINC-NIRVANA ... AIRY-LN: an ad-hoc numerical tool for deconvolution of images from the LBT instrument LINC-NIRVANA ... the already existing software package AIRY (a set of IDL-based modules developed within the CAOS ``system

Bertero, Mario


An L0 sparse analysis prior for blind Poissonian image deconvolution.  


This paper proposes a new approach for blindly deconvolving images that are contaminated by Poisson noise. The proposed approach incorporates a new prior, the L0 sparse analysis prior, together with the total variation constraint into the maximum a posteriori (MAP) framework for deconvolution. A greedy analysis pursuit numerical scheme is exploited to solve the L0-regularized MAP problem. Experimental results show that our approach not only produces smooth results, substantially suppressing artifacts and noise, but also sharply preserves intensity changes. Both quantitative and qualitative comparisons to specialized state-of-the-art algorithms demonstrate its superiority. PMID:24663705

Gong, Xiaojin; Lai, Baisheng; Xiang, Zhiyu



Iterative deconvolution and semiblind deconvolution methods in magnetic archaeological prospecting  

E-print Network

Iterative deconvolution and semiblind deconvolution methods in magnetic archaeological prospecting. In archaeological magnetic prospecting, most targets can be modeled by a single layer of constant burial depth ... Exploiting image deconvolution tools, two iterative reconstruction methods are applied to minimize

Bertero, Mario


Multispectral imaging analysis: spectral deconvolution and applications in biology  

NASA Astrophysics Data System (ADS)

Multispectral imaging has been in use for over half a century. Owing to advances in digital photographic technology, multispectral imaging is now used in settings ranging from clinical medicine to industrial quality control. Our efforts focus on the use of multispectral imaging coupled with spectral deconvolution for measurement of endogenous tissue fluorophores and for animal tissue analysis by multispectral fluorescence, absorbance, and reflectance data. Multispectral reflectance and fluorescence images may be useful in evaluation of pathology in histological samples. For example, current hematoxylin/eosin diagnosis limits spectral analysis to shades of red and blue/grey. It is possible to extract much more information using multispectral techniques. To collect this information, a series of filters or a device such as an acousto-optical tunable filter (AOTF) or liquid-crystal filter (LCF) can be used with a CCD camera, enabling collection of images at many more wavelengths than is possible with a simple filter wheel. In multispectral data processing, the "unmixing" of reflectance or fluorescence data and the classification based upon the resulting spectra are required. In addition to multispectral techniques, extraction of topological information may be possible by reflectance deconvolution or multiple-angle imaging, which could aid in accurate diagnosis of skin lesions or isolation of specific biological components in tissue. The goal of these studies is to develop spectral signatures that will provide us with specific and verifiable tissue structure/function information. In addition, relatively complex classification techniques must be developed so that the data are of use to the end user.
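The "unmixing" step described above is, at its core, a linear least-squares problem: each pixel's measured spectrum is modeled as a weighted sum of known endmember spectra. A minimal sketch, using synthetic Gaussian stand-ins for real fluorophore spectra (nothing here is taken from the authors' system):

```python
import numpy as np

# Two synthetic "fluorophore" emission spectra sampled at 31 wavelengths.
wl = np.linspace(400.0, 700.0, 31)
endmembers = np.stack([
    np.exp(-0.5 * ((wl - 480.0) / 25.0) ** 2),   # stand-in for fluorophore A
    np.exp(-0.5 * ((wl - 620.0) / 30.0) ** 2),   # stand-in for fluorophore B
], axis=1)                                        # shape (31, 2)

# Four pixels with known abundances of the two fluorophores.
true_abund = np.array([[0.9, 0.1],
                       [0.2, 0.8],
                       [0.5, 0.5],
                       [0.0, 1.0]])
pixels = true_abund @ endmembers.T                # measured spectra, shape (4, 31)

# Unmix all pixels at once with the endmember pseudo-inverse.
unmixed = pixels @ np.linalg.pinv(endmembers).T   # recovered abundances, (4, 2)
```

In practice a nonnegativity constraint is usually added, but the pseudo-inverse form shows the essential linear-algebra step.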

Leavesley, Silas; Ahmed, Wamiq; Bayraktar, Bulent; Rajwa, Bartek; Sturgis, Jennifer; Robinson, J. P.



Relative Extents of Preformation and Neoformation in Tree Shoots: Analysis by a Deconvolution Method  

PubMed Central

Background and Aims Neoformation is the process by which organs not preformed in a bud are developed on a growing shoot, generally after preformation extension. The study of neoformation in trees has been hindered due to methodological reasons. The present report is aimed at assessing the relative importance of preformation and neoformation in the development of shoots of woody species. Methods A deconvolution method was applied to estimate the distribution of the number of neoformed organs for eight data sets corresponding to four Nothofagus species and a Juglans hybrid. Key Results The number of preformed organs was higher and less variable than the number of neoformed organs. Neoformation contributed more than preformation to explain full-size differences between shoots developed in different positions within the architecture of each tree species. Conclusions Differences between the distributions of the numbers of preformed and neoformed organs may be explained by alluding to the duration of differentiation and extension for each of these groups of organs. The deconvolution of distributions is a useful tool for the analysis of neoformation and shoot structure in trees. PMID:16899472




Richardson-Lucy deconvolution as a general tool for combining images with complementary strengths.  


We use Richardson-Lucy (RL) deconvolution to combine multiple images of a simulated object into a single image in the context of modern fluorescence microscopy techniques. RL deconvolution can merge images with very different point-spread functions, such as in multiview light-sheet microscopes [1,2], while preserving the best resolution information present in each image. We show that RL deconvolution is also easily applied to merge high-resolution, high-noise images with low-resolution, low-noise images, relevant when complementing conventional microscopy with localization microscopy. We also use RL deconvolution to merge images produced by different simulated illumination patterns, relevant to structured illumination microscopy (SIM) [3,4] and image scanning microscopy (ISM). The quality of our ISM reconstructions is at least as good as reconstructions using standard inversion algorithms for ISM data, but our method follows a simpler recipe that requires no mathematical insight. Finally, we apply RL deconvolution to merge a series of ten images with varying signal and resolution levels. This combination is relevant to gated stimulated-emission depletion (STED) microscopy, and shows that merges of high-quality images are possible even in cases for which a non-iterative inversion algorithm is unknown. PMID:24436314
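The RL update itself is compact: each iteration multiplies the current estimate by the back-projected ratio of observed to predicted data. A minimal single-image, 1-D sketch (the multiview merge in the paper generalizes this to several PSFs; this example is illustrative only, not the authors' code):

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50):
    """1-D Richardson-Lucy deconvolution (illustrative sketch)."""
    psf = psf / psf.sum()                      # PSF must integrate to 1
    psf_flipped = psf[::-1]                    # adjoint of the blur operator
    estimate = np.full(len(observed), observed.mean())
    for _ in range(iterations):
        predicted = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(predicted, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Toy example: two spikes blurred by a Gaussian PSF.
x = np.zeros(64)
x[20], x[40] = 1.0, 0.5
t = np.arange(-8, 9)
psf = np.exp(-t**2 / 8.0)
psf /= psf.sum()
blurred = np.convolve(x, psf, mode="same")
restored = richardson_lucy(blurred, psf, iterations=200)
```

The multiplicative update keeps the estimate nonnegative, which is why RL suits photon-limited (Poisson) data.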

Ingaramo, Maria; York, Andrew G; Hoogendoorn, Eelco; Postma, Marten; Shroff, Hari; Patterson, George H



Isotope pattern deconvolution as a rising tool for isotope tracer studies in environmental research  

NASA Astrophysics Data System (ADS)

During the last decade, stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers caused by the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. The induced change in the natural isotopic composition of an element allows, among other things, for studying the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag of particular biological samples. Owing to the broad potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges; Sr, for example, is present in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantity of enriched isotope tracer incorporated into the natural sample matrix or the degree of impurities and species interconversion (e.g. from sample preparation). 
Here, the potential of IPD for environmental tracer studies is critically discussed, with special emphasis on evaluating different data processing strategies using the example of enriched stable Sr isotopes [1]. The key analytical parameters, such as blank (Kr, Sr and Rb), variation of the natural Sr isotopic composition in the sample, mass bias, interferences (Rb) and total combined uncertainty, are considered. A full metrological protocol for data processing using IPD is presented, based on data gained during two transgenerational marking studies of fish, in which the transfer of a Sr isotope double spike (84Sr and 86Sr) from female spawners of common carp (Cyprinus carpio L.) and brown trout (Salmo trutta f.f.) [2] to the centre of the otoliths of their offspring was studied by (LA)-MC-ICP-MS. [1] J. Irrgeher, A. Zitek, M. Cervicek and T. Prohaska, J. Anal. At. Spectrom., 2014, 29, 193-200. [2] A. Zitek, J. Irrgeher, M. Kletzl, T. Weismann and T. Prohaska, Fish. Manage. Ecol., 2013, 20, 654-361.
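The core of IPD as described, multiple linear regression on isotope patterns, can be sketched in a few lines. The natural Sr abundances below are approximate; the spike composition and the mixing fractions are invented for illustration:

```python
import numpy as np

# Isotope patterns (relative abundances) over 84Sr, 86Sr, 87Sr, 88Sr.
natural = np.array([0.0056, 0.0986, 0.0700, 0.8258])  # approx. natural Sr
spike   = np.array([0.48,   0.50,   0.01,   0.01])    # invented 84Sr/86Sr double spike
A = np.column_stack([natural, spike])

# Simulated spiked sample: 70 % natural Sr, 30 % tracer.
observed = 0.7 * natural + 0.3 * spike

# IPD: multiple linear regression recovers the two molar fractions
# without knowing how much tracer ended up in the sample.
fractions, *_ = np.linalg.lstsq(A, observed, rcond=None)
```

With more isotopes than sources, the system is overdetermined and the least-squares residual serves as a consistency check on the assumed patterns.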

Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas



Objective determination of the water level in frequency-domain deconvolution for receiver function analysis  

NASA Astrophysics Data System (ADS)

Deconvolution is the central operation carried out in teleseismic receiver function (RF) analysis. It transforms the recorded teleseismic signal into the Earth's impulse response by effectively removing the source and instrument responses from this signal. The operation can be carried out either in the time domain or in the frequency domain. Time-domain deconvolution is generally more computationally intensive, but it allows for automatic convergence towards a stable solution (i.e., an RF devoid of ringing) for noisy data. Frequency-domain deconvolution is faster to compute, but it often requires user input to find the optimal regularization/water-level parameter that yields a stable solution. In this study, we investigate ways to objectively determine the optimal water level parameter for frequency-domain deconvolution of teleseismic RFs. Using synthetic and field data, we compare various optimization schemes with L-curves that provide a tradeoff between the root-mean-square error, L2-norm, signal sparseness and spectral flatness of the computed RF. We find that maximising the spectral flatness of the computed RF is the best way to find the optimum water level. Applications to field data from central and northern Norway illustrate the viability of this objective optimization scheme. The resulting RF profiles show clear signals from the Moho (with relief associated with the central Scandes) as well as from the 410 and 660 km-discontinuities below Norway.
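Water-level deconvolution itself is a one-line stabilized spectral division; the paper's contribution is choosing the water level objectively. A minimal sketch with a fixed, hand-picked water level (function and signals are illustrative, not the authors' code):

```python
import numpy as np

def water_level_deconvolve(numerator, denominator, water_level=0.01):
    """Spectral division with small denominator power floored ("water level")."""
    n = len(numerator)
    num_f = np.fft.rfft(numerator, n)
    den_f = np.fft.rfft(denominator, n)
    power = (den_f * np.conj(den_f)).real
    # Fill spectral holes up to a fraction of the peak power before dividing.
    stabilized = np.maximum(power, water_level * power.max())
    return np.fft.irfft(num_f * np.conj(den_f) / stabilized, n)

# Toy receiver-function setup: the "receiver" trace is the source wavelet
# delayed by 10 samples, so the deconvolution should return a pulse at lag 10.
src = np.exp(-0.5 * (np.arange(128) - 30.0) ** 2 / 4.0)
rec = np.roll(src, 10)
rf = water_level_deconvolve(rec, src, water_level=0.001)
```

Raising the water level damps ringing at the cost of resolution, which is exactly the trade-off the L-curve schemes in the abstract navigate.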

Halpaap, Felix; Spieker, Kathrin; Rondenay, Stéphane



Punch stretching process monitoring using acoustic emission signal analysis. II - Application of frequency domain deconvolution  

NASA Technical Reports Server (NTRS)

The coloring effect on the acoustic emission (AE) signal due to the frequency response of the data acquisition/processing instrumentation may bias the interpretation of AE signal characteristics. In this paper, a frequency domain deconvolution technique, which involves the identification of the instrumentation transfer functions and multiplication of the AE signal spectrum by the inverse of these system functions, has been carried out. In this way, changes in AE signal characteristics can be better interpreted as resulting only from changes in the state of the process. The punch stretching process was used as an example to demonstrate the application of the technique. Results showed that, through deconvolution, the frequency characteristics of AE signals generated during stretching became more distinctive and can be used more effectively as tools for process monitoring.

Liang, Steven Y.; Dornfeld, David A.; Nickerson, Jackson A.



Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function  

SciTech Connect

A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog ((99m)Tc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided for by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing were able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.
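Once the hepatic impulse response has been deconvolved, the mean transit time is simply its first moment. A toy sketch (curve and time base are invented, not the study's data):

```python
import numpy as np

# The mean transit time (MTT) is the first moment of the deconvolved
# hepatic impulse response h(t). Both curve and time base are synthetic.
t = np.arange(0.0, 60.0, 0.5)          # minutes
h = np.exp(-t / 12.0)                  # toy impulse response
mtt = (t * h).sum() / h.sum()          # first moment, in minutes
```

For this toy exponential truncated at 60 min the first moment comes out a little under the 12-minute decay constant, as expected.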

Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.



A further analysis for the minimum-variance deconvolution filter performance  

NASA Astrophysics Data System (ADS)

Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.
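The limiting behaviour described in this abstract can be made concrete with the Wiener-type frequency response that minimum-variance deconvolution shares. This is a sketch with generic symbols (channel $V$, signal spectrum $S_x$, noise spectrum $S_n$), not reproduced from the paper:

```latex
H(\omega)=\frac{V^{*}(\omega)\,S_x(\omega)}{\lvert V(\omega)\rvert^{2}S_x(\omega)+S_n(\omega)}
\;\longrightarrow\;
\begin{cases}
\dfrac{1}{V(\omega)} & \text{as } S_x/S_n\to\infty \quad\text{(inverse/whitening filter)}\\[6pt]
\dfrac{V^{*}(\omega)\,S_x(\omega)}{S_n(\omega)} & \text{as } S_x/S_n\to 0 \quad\text{(matched-filter form)}
\end{cases}
```

Dividing numerator and denominator by $S_n$ makes both limits immediate.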

Chi, Chong-Yung





Lexical analysis tool  

Microsoft Academic Search

This paper provides an algorithm for constructing a lexical analysis tool, by different means than the UNIX Lex tool. The input is a keywords table, describing the target language's keywords, keysymbols, and their semantics, instead of using regular expressions to do so. The output is a lexical analyzer for the specific programming language. The tool can also be used as a
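A keyword-table-driven lexer of the kind described can be sketched in a few lines; the tables and token names below are invented for illustration, not taken from the paper:

```python
# Language described by tables rather than regular expressions.
KEYWORDS = {"if", "else", "while", "return"}
KEYSYMBOLS = {"+", "-", "*", "/", "=", "(", ")", ";"}

def tokenize(source):
    """Scan source text into (kind, text) tokens using the tables above."""
    tokens, i = [], 0
    while i < len(source):
        ch = source[i]
        if ch.isspace():
            i += 1
        elif ch in KEYSYMBOLS:
            tokens.append(("SYMBOL", ch))
            i += 1
        elif ch.isalpha():
            j = i
            while j < len(source) and source[j].isalnum():
                j += 1
            word = source[i:j]
            tokens.append(("KEYWORD" if word in KEYWORDS else "IDENT", word))
            i = j
        elif ch.isdigit():
            j = i
            while j < len(source) and source[j].isdigit():
                j += 1
            tokens.append(("NUMBER", source[i:j]))
            i = j
        else:
            raise ValueError(f"unexpected character {ch!r}")
    return tokens
```

Changing the target language means editing the two tables, not rewriting scanner logic, which is the approach's selling point over regex-generated lexers.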

Isaiah Pinchas Kantorovitz



eCRAM computer algorithm for implementation of the charge ratio analysis method to deconvolute electrospray ionization mass spectra  

Microsoft Academic Search

A computer program (eCRAM) has been developed for automated processing of electrospray mass spectra based on the charge ratio analysis method. The eCRAM algorithm deconvolutes electrospray mass spectra solely from the ratio of mass-to-charge (m/z) values of multiply charged ions. The program first determines the ion charge by correlating the ratio of m/z values for any two (i.e., consecutive or
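The charge-ratio idea can be illustrated with two consecutive peaks of one protonated molecule: the charge follows from the ratio of proton-corrected m/z differences, and the neutral mass from either peak. A sketch with synthetic values (not the eCRAM implementation):

```python
M_PROTON = 1.00728  # proton mass, Da

def charge_and_mass(mz_hi, mz_lo):
    """Charge and neutral mass from two consecutive peaks of one molecule.

    mz_hi is the peak at charge z+1, mz_lo the peak at charge z (mz_hi < mz_lo).
    For protonated ions, (mz_hi - mH) / (mz_lo - mz_hi) equals z exactly.
    """
    z = round((mz_hi - M_PROTON) / (mz_lo - mz_hi))
    mass = z * (mz_lo - M_PROTON)
    return z, mass

# Synthetic peaks for a 10 kDa protein observed at charges 11+ and 10+:
mz11 = (10000.0 + 11 * M_PROTON) / 11
mz10 = (10000.0 + 10 * M_PROTON) / 10
z, mass = charge_and_mass(mz11, mz10)
```

Because only m/z ratios enter, no trial-and-error over candidate charges is needed, which is the point of the charge ratio analysis method.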

Simin D. Maleknia; David C. Green



Physics analysis tools  

SciTech Connect

There are many tools used in analysis in High Energy Physics (HEP). They range from low level tools such as a programming language to high level such as a detector simulation package. This paper will discuss some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages. The categories are also classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper will consider what analysis tools are available today, and what one might expect in the future. In each stage, the integration of the tools with other stages and the portability of the tool will be analyzed.

Kunz, P.F.



Error analysis of tumor blood flow measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis.  


We performed error analysis of tumor blood flow (TBF) measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis, based on computer simulations. For analysis, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) consisting of gamma-variate functions using an adiabatic approximation to the tissue homogeneity model under various plasma flow (F(p)), mean capillary transit time (T(c)), permeability-surface area product (PS) and signal-to-noise ratio (SNR) values. Deconvolution analyses based on truncated singular value decomposition with a fixed threshold value (TSVD-F), with an adaptive threshold value (TSVD-A) and with the threshold value determined by generalized cross validation (TSVD-G) were used to estimate F(p) values from the simulated concentration-time curves in the VOI and AIF. First, we investigated the relationship between the optimal threshold value and SNR in TSVD-F, and then derived the equation describing the relationship between the threshold value and SNR for TSVD-A. Second, we investigated the dependences of the estimated F(p) values on T(c), PS, the total duration for data acquisition and the shape of AIF. Although TSVD-F with a threshold value of 0.025, TSVD-A with the threshold value determined by the equation derived in this study and TSVD-G could estimate the F(p) values in a similar manner, the standard deviation of the estimates was the smallest and largest for TSVD-A and TSVD-G, respectively. PS did not largely affect the estimates, while T(c) did in all methods. Increasing the total duration significantly improved the variations in the estimates in all methods. TSVD-G was most sensitive to the shape of AIF, especially when the total duration was short. 
In conclusion, this study will be useful for understanding the reliability and limitation of model-independent deconvolution analysis when applied to TBF measurement using an extravascular contrast agent. PMID:17473352
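The TSVD-F scheme described, discarding singular values below a fixed fraction of the largest before inverting the convolution, can be sketched as follows. The 0.025 threshold follows the text; the AIF, residue function and all other values are synthetic:

```python
import numpy as np

def tsvd_deconvolve(tissue, aif, dt, threshold=0.025):
    """Deconvolution by truncated SVD with a fixed threshold (TSVD-F style)."""
    n = len(tissue)
    # Lower-triangular (causal Toeplitz) convolution matrix built from the AIF.
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.zeros_like(s)
    keep = s >= threshold * s[0]       # drop singular values < 2.5 % of max
    s_inv[keep] = 1.0 / s[keep]
    return Vt.T @ (s_inv * (U.T @ tissue))

# Synthetic tissue curve: C(t) = Fp * (AIF convolved with residue function).
dt, n = 1.0, 40
t = np.arange(n) * dt
aif = t * np.exp(-t / 3.0)             # gamma-variate-like arterial input
flow = 0.5                             # "true" plasma flow Fp
residue = np.exp(-t / 8.0)
tissue = dt * np.convolve(aif, flow * residue)[:n]
estimate = tsvd_deconvolve(tissue, aif, dt)   # its peak approximates Fp
```

The truncation is what trades noise amplification against bias; the adaptive and generalized-cross-validation variants in the abstract differ only in how the kept set is chosen.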

Murase, Kenya; Miyazaki, Shohei



Analysis of a deconvolution-based information retrieval algorithm in X-ray grating-based phase-contrast imaging  

NASA Astrophysics Data System (ADS)

Grating-based X-ray phase-contrast imaging is a promising imaging modality to increase soft tissue contrast in comparison to conventional attenuation-based radiography. Complementary and otherwise inaccessible information is provided by the dark-field image, which shows the sub-pixel size granularity of the measured object. This could especially turn out to be useful in mammography, where tumourous tissue is associated with the presence of tiny microcalcifications. In addition to the well-established image reconstruction process, an analysis method was introduced by Modregger [1], which is based on deconvolution of the underlying scattering distribution within a single pixel, revealing information about the sample. Subsequently, the different contrast modalities can be calculated from the scattering distribution. The method has already proved to deliver additional information in the higher moments of the scattering distribution and possibly reaches better image quality with respect to an increased contrast-to-noise ratio. Several measurements were carried out using melamine foams as phantoms. We analysed the dependency of the deconvolution-based method with respect to the dark-field image on different parameters such as dose, number of iterations of the iterative deconvolution algorithm and dark-field signal. A disagreement was found in the reconstructed dark-field values between the FFT method and the iterative method. Usage of the resulting characteristics might be helpful in future applications.

Horn, Florian; Bayer, Florian; Pelzer, Georg; Rieger, Jens; Ritter, André; Weber, Thomas; Zang, Andrea; Michel, Thilo; Anton, Gisela



Extended Testability Analysis Tool  

NASA Technical Reports Server (NTRS)

The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

Melcher, Kevin; Maul, William A.; Fulton, Christopher



Characterization of attenuated proestrous luteinizing hormone surges in middle-aged rats by deconvolution analysis.  


Reproductive aging in female rats is associated with attenuated preovulatory LH surges. In this study, detailed analyses of the episodic characteristics of the proestrous LH surge were conducted in young and middle-aged regularly cyclic rats. On proestrus, blood samples were withdrawn at 3-min intervals for 6 h and analyzed for LH concentrations by RIA in triplicate. Deconvolution analysis of immunoreactive LH concentrations revealed that there was no difference in the detectable LH secretory burst frequency between young and middle-aged rats. However, in middle-aged rats with an attenuated LH surge on proestrus, the mass of LH secreted per burst and the maximal rate of LH secretion per burst were only one fourth (p < 0.01) of those in young and middle-aged rats with normal LH surges. Furthermore, middle-aged rats with attenuated LH surges had a 4-fold decrease (p < 0.01) in the maximal rate of LH secretion per burst compared to young and middle-aged females with normal LH surges. The apparent half-life of endogenous LH was similar among the 3 groups. The attenuated LH surges of middle-aged rats were related specifically to a decrease in LH burst amplitude with no change in pulse frequency. The orderliness of moment-to-moment LH release as quantified by the regularity statistic, approximate entropy, was comparable in the 3 groups. Our findings of a markedly decreased amount of LH released per burst and preserved orderliness of the LH release process strongly suggest that a deficient GnRH drive and/or reduced responsivity to the GnRH signal, rather than altered timing of the signal, accounts for the age-related decline in reproductive function in female rats as presaged by an attenuated proestrous LH surge in middle age. PMID:9828195

Matt, D W; Gilson, M P; Sales, T E; Krieg, R J; Kerbeshian, M C; Veldhuis, J D; Evans, W S



A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis  

Microsoft Academic Search

DNA methylation is an indispensible epigenetic modification required for regulating the expression of mammalian genomes. Immunoprecipitation-based methods for DNA methylome analysis are rapidly shifting the bottleneck in this field from data generation to data analysis, necessitating the development of better analytical tools. In particular, an inability to estimate absolute methylation levels remains a major analytical difficulty associated with immunoprecipitation-based DNA

Daniel J Turner; Paul Flicek; Heng Li; Eugene Kulesha; Stefan Gräf; Nathan Johnson; Javier Herrero; Eleni M Tomazou; Natalie P Thorne; Liselotte Bäckdahl; Marlis Herberth; Kevin L Howe; David K Jackson; Marcos M Miretti; John C Marioni; Ewan Birney; Tim J P Hubbard; Richard Durbin; Simon Tavaré; Thomas A Down; Vardhman K Rakyan; Stephan Beck



Swift Science Analysis Tools  

NASA Astrophysics Data System (ADS)

Swift is an autonomous, multiwavelength observatory selected by NASA to study gamma-ray bursts (GRBs) and their afterglows. Its Burst Alert Telescope (BAT) is a large coded mask instrument that will image GRBs in the 15 to 150 keV band. The X-ray Telescope (XRT) focuses X-rays in the 0.2 to 10 keV band onto CCDs, and the co-aligned Ultra-Violet/Optical Telescope (UVOT) has filters and grisms for low-resolution spectroscopy. The Swift team is developing mission-specific tools for processing the telemetry into FITS files and for calibrating and selecting the data for further analysis with such mission-independent tools as XIMAGE and XSPEC. The FTOOLS-based suite of tools will be released to the community before launch with additional updates after launch. Documentation for the tools and standard recipes for their use will be available on the Swift Science Center (SSC) Web site, and the SSC will provide user assistance with an e-mail help desk. After the verification phase of the mission, all data will be available to the community as soon as it is processed in the Swift Data Center (SDC). Once all the data for an observation is available, the data will be transferred to the HEASARC and data centers in England and Italy. The data can then be searched and accessed using standard tools such as Browse. Before this transfer the quick-look data will be available on an ftp site at the SDC. The SSC will also provide documentation and simulation tools in support of the Swift Guest Investigator program.

Marshall, F. E.; Swift Team



Neutron multiplicity analysis tool  

SciTech Connect

I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. 
This program was developed to help speed the analysis of Monte Carlo neutron transport simulation (MCNP) data; it requires only the count-rate data to calculate the mass of material using INCC's analysis methods, rather than the full neutron multiplicity distribution required to run the analysis in INCC. This paper describes what is implemented within EXCOM, including the methods used, how the program corrects for deadtime, and how uncertainty is calculated. It also describes how to use EXCOM within Excel.
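As a concrete illustration of the deadtime correction and background subtraction steps named above, here is a minimal sketch assuming a non-paralyzable deadtime model; the model choice and all numeric values are illustrative assumptions, not EXCOM's actual implementation.

```python
# Illustrative sketch of the correction chain described in the abstract:
# deadtime correction followed by background subtraction. The
# non-paralyzable deadtime model used here is an assumption; EXCOM's
# actual deadtime treatment is more elaborate.

def correct_deadtime(measured_rate, tau):
    """Non-paralyzable model: true = measured / (1 - measured * tau)."""
    return measured_rate / (1.0 - measured_rate * tau)

def net_rate(measured_rate, background_rate, tau):
    """Deadtime-correct the gross rate, then subtract background."""
    return correct_deadtime(measured_rate, tau) - background_rate

# Example: 50 kHz gross singles rate, 100 ns deadtime, 2 kHz background
gross = 50_000.0   # counts/s
tau = 100e-9       # s
bkg = 2_000.0      # counts/s
print(net_rate(gross, bkg, tau))
```

The correction is applied to the gross rate before subtraction because the background counts also occupy detector deadtime.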

Stewart, Scott L [Los Alamos National Laboratory]



A discriminant based charge deconvolution analysis pipeline for protein profiling of whole cell extracts using liquid chromatography-electrospray ionization-quadrupole time-of-flight mass spectrometry.  


A discriminant based charge deconvolution analysis pipeline is proposed. The molecular weight determination (MoWeD) charge deconvolution method was applied directly to the discrimination rules obtained by the fuzzy rule-building expert system (FuRES) pattern classifier. This approach was demonstrated with synthetic electrospray ionization mass spectra. Identification of tentative protein biomarkers in bacterial cell extracts of Salmonella enterica serovar Typhimurium strains A1 and A19 by liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS) was also demonstrated. The data analysis time was reduced by applying this approach. In addition, this method was less affected by noise and baseline drift. PMID:21530796
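The charge-deconvolution step recovers a protein's molecular weight from its multiply charged ESI peaks. A minimal sketch of the underlying arithmetic, assuming singly protonated adducts and two adjacent charge states; this is illustrative only, not the published MoWeD algorithm:

```python
# Sketch of ESI charge deconvolution: two adjacent charge states z and z+1
# of the same protein produce peaks p1 > p2 in m/z. Solving
#   p1 = (M + z*mp)/z  and  p2 = (M + (z+1)*mp)/(z+1)
# for z gives z = (p2 - mp)/(p1 - p2), and then M = z*(p1 - mp).
# Illustrative only; the MoWeD method is more sophisticated.

PROTON = 1.00728  # proton mass, Da

def deconvolve_pair(p1, p2):
    """Infer (charge of p1, neutral mass) from adjacent charge-state peaks."""
    z = round((p2 - PROTON) / (p1 - p2))
    mass = z * (p1 - PROTON)
    return z, mass

# Synthetic peaks for a 10 kDa protein at z = 9 and z = 10
M = 10000.0
p1 = (M + 9 * PROTON) / 9    # peak at charge 9 (higher m/z)
p2 = (M + 10 * PROTON) / 10  # peak at charge 10 (lower m/z)
print(deconvolve_pair(p1, p2))
```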

Lu, Weiying; Callahan, John H; Fry, Frederick S; Andrzejewski, Denis; Musser, Steven M; Harrington, Peter de B



Geodetic Strain Analysis Tool  

NASA Technical Reports Server (NTRS)

A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of strain fields from the GPS and strain-meter data that sample them; to build understanding of the strengths and weaknesses of strain-field derivation from continuous GPS (CGPS) and other geodetic data in a variety of tectonic settings; to converge on the "best practices" strain-derivation strategy for the Solid Earth Science ESDR System (SESES) project, given the CGPS station distribution in the western U.S.; and to provide SESES users with a scientific and educational tool for exploring the strain field on their own with user-defined parameters.
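The four data products listed above follow directly from the 2D strain-rate tensor. A minimal sketch under the assumption of a uniform velocity gradient (the tool itself derives gradients from scattered GPS velocities; the function name and values are illustrative):

```python
import math

# Sketch: the four products named in the abstract (maximum shear, dilatation,
# shear angle, principal components) computed from a 2D velocity-gradient
# tensor. A uniform gradient is assumed for the example.

def strain_products(dux_dx, dux_dy, duy_dx, duy_dy):
    exx, eyy = dux_dx, duy_dy
    exy = 0.5 * (dux_dy + duy_dx)          # symmetric strain-rate tensor
    dilatation = exx + eyy                  # trace
    max_shear = math.hypot(0.5 * (exx - eyy), exy)
    shear_angle = 0.5 * math.atan2(2 * exy, exx - eyy)
    mean = 0.5 * (exx + eyy)
    principal = (mean + max_shear, mean - max_shear)  # eigenvalues
    return dilatation, max_shear, shear_angle, principal

# Pure shear: extension along x, equal contraction along y
print(strain_products(1.0, 0.0, 0.0, -1.0))
```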

Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan



Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis  

SciTech Connect

We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of ~15 µm. It has been our goal to perform a total nondestructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-Ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 µm/pixel, without the use of oil-based lenses. A full textural analysis on track No. 82 is presented here, as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

Greenberg, M.; Ebel, D.S. (AMNH)



Java Radar Analysis Tool  

NASA Technical Reports Server (NTRS)

Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

Zaczek, Mariusz P.



One approach for doublet deconvolution to improve reliability in spectra analysis for in vivo lead measurement.  


Calculation of lead concentration from K-series X-ray fluorescence studies uses a robust normalization technique based on the amplitude or area of the elastic signal. Parameter estimation of the elastic signal can be affected by the overlap of the Kβ2 line, especially for concentrations greater than 40 ppm, where the Kβ2 amplitude can be greater than 1% of the elastic signal. We tested the combination of estimation by the method of least moduli and doublet deconvolution. We found that the estimation of the area of the elastic signal is more robust to changes in the low-energy end of the region of interest with the combined method than with least-squares estimation and singlet processing. We recommend use of the combined method for creation of calibration curves at concentrations greater than or equal to 40 ppm. PMID:11225706
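The method of least moduli minimizes the sum of absolute residuals, which is what confers the robustness noted above. A toy sketch for a single amplitude parameter, with synthetic data and a made-up peak shape (the published method fits a full doublet model):

```python
# Toy sketch of "method of least moduli" (L1) amplitude estimation.
# For a fixed peak shape g, f(a) = sum(|y_i - a*g_i|) is convex in a,
# so a ternary search brackets the minimizer. Synthetic data only.

def l1_amplitude(y, g, lo=0.0, hi=1000.0, iters=200):
    """Minimize sum(|y - a*g|) over amplitude a by ternary search."""
    def cost(a):
        return sum(abs(yi - a * gi) for yi, gi in zip(y, g))
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if cost(m1) <= cost(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

# Gaussian-ish peak shape, with one gross outlier added to the data
g = [0.1, 0.5, 1.0, 0.5, 0.1]
true_a = 40.0
y = [true_a * gi for gi in g]
y[0] += 25.0  # outlier that would bias a least-squares fit
print(l1_amplitude(y, g))
```

Because the L1 objective down-weights the single contaminated channel, the amplitude estimate is unchanged by the outlier, unlike a least-squares fit.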

Kondrashov, V S; Rothenberg, S J



LPA1,LPA2. Deconvolution Program  

SciTech Connect

The program is suitable for many applications in applied mathematics, experimental physics, signal-analysis systems, and engineering, e.g., spectrum deconvolution, signal analysis, and system-property analysis.

Ping-An, L.; Jiang-Lai, Y. [Beijing Normal University, Beijing (China)]



Genome Organization Analysis Tool  

E-print Network

point mutations of genes have been the basis of comparative sequence analyses, most often used as a tool for phylogenetic and evolutionary studies. However, point mutations are of limited utility when comparing either slowly evolving genes and genomes or rapidly evolving genes and genomes such as some pathogenic viral genomes. Even when the nucleotide sequences of

Aaron Kaluszka; Cynthia Gibas




Analysis/Design Tool  

NASA Technical Reports Server (NTRS)

Excelerator II, developed by INTERSOLV, Inc., provides a complete environment for rules-based expert systems. The software incorporates NASA's C Language Integrated Production System (CLIPS), a shell for constructing expert systems. Excelerator II provides complex verification and transformation routines based on matching that is simple and inexpensive. Excelerator II was sold to SELECT Software Tools in June 1997 and is now called SELECT Excelerator. SELECT has assumed full support and maintenance for the product line.



Marginal Abatement Cost Analysis Tool  

EPA Science Inventory

The Non-CO2 Marginal Abatement Cost Analysis Tool is an extensive bottom-up engineering-economic spreadsheet model capturing the relevant cost and performance data on sectors emitting non-CO2 GHGs. The tool has 24 regions and 7 sectors and produces marginal abatement cost curves...
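A marginal abatement cost curve is built by sorting mitigation options by cost per ton abated and accumulating their abatement potential. A minimal sketch with invented options (the EPA tool uses detailed sector- and region-specific engineering-economic data):

```python
# Sketch of building a marginal abatement cost (MAC) curve: sort options by
# $/tCO2e abated, then accumulate abatement potential. Numbers invented.

def mac_curve(options):
    """options: list of (name, cost_per_ton, abatement_mt).
    Returns (name, cost_per_ton, cumulative_abatement_mt) in cost order."""
    curve, cumulative = [], 0.0
    for name, cost, amount in sorted(options, key=lambda o: o[1]):
        cumulative += amount
        curve.append((name, cost, cumulative))
    return curve

options = [
    ("landfill CH4 capture", 5.0, 30.0),
    ("coal-mine CH4 recovery", -2.0, 10.0),  # negative cost = net savings
    ("HFC substitution", 20.0, 50.0),
]
for step in mac_curve(options):
    print(step)
```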


Nonstandard Tools for Nonsmooth Analysis  

E-print Network

This is an overview of the basic tools of nonsmooth analysis which are grounded on nonstandard models of set theory. By way of illustration we give a criterion for an infinitesimally optimal path of a general discrete dynamic system.

S. S. Kutateladze



Information Gathering and Analysis Tools  

NSDL National Science Digital Library

The National Center for Environmental Decision-making Research aims "to improv[e] environmental decision making" at regional, state, and local levels. Administered by the Joint Institute for Energy and Environment in Knoxville, Tennessee, NCEDR offers many decision-making resources, most prominently, tools for information gathering and analysis. Users may select from eight categories of tool use, from Identifying Values to Post-Decision Assessment. Within each category, subcategories offer information and tools on economic market assessment, ecological relationships, and other topics. Additional Links and commentary on Strengths & Weaknesses (of tools), Communicating the Results, Looking Ahead, and Key Sources round out the site.

National Center for Environmental Decision-making Research


Fourier self-deconvolution of the IR spectra as a tool for investigation of distinct functional groups in porous materials: Brønsted acid sites in zeolites.  


For many decades, IR and FT-IR spectroscopy has generated valuable information about different functional groups in zeolites, metal-organic frameworks (MOFs), and other porous materials. However, this technique cannot distinguish between functional groups in different local environments. Our study demonstrates that this limitation could be overcome by using Fourier self-deconvolution of infrared spectra (FSD-IR). We apply this method to study three acidic mordenite zeolites and show (i) that these zeolites contain six distinct Brønsted acid sites (BAS) as opposed to 2-4 different BAS previously considered in literature and (ii) that the relative amounts of these BAS are different in the three zeolites examined. We then analyze possible locations of six BAS in the mordenite structure and explain a number of conflicting results in literature. On this basis, we conclude that the FSD-IR method allows direct visualization and examination of distributions of distinct BAS in zeolites, thus providing a unique research opportunity, which no other method can provide. Given the similarities in the IR analysis of different functional groups in solids, we expect that the FSD-IR method will be also instrumental in the research into other porous materials, such as solid oxides and MOFs. The latter point is illustrated by FSD of the IR spectrum of hydroxyl groups in a sample of γ-alumina. PMID:24219854
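In outline, Fourier self-deconvolution removes the intrinsic Lorentzian line shape in the Fourier (interferogram) domain. A sketch of the standard relation in the usual Kauppinen-style notation; details such as the apodization window W(x) vary by implementation:

```latex
% A Lorentzian line of half-width \sigma has interferogram decay
% e^{-2\pi\sigma|x|}; self-deconvolution divides this out, with an
% apodization window W(x) limiting noise amplification:
S'(\nu) \;=\; \mathcal{F}^{-1}\!\left\{\, W(x)\, e^{2\pi\sigma|x|}\,
              \mathcal{F}\{S(\nu)\} \,\right\}
```

The result S'(ν) retains band positions and integrated intensities while the effective line widths are reduced, which is what lets overlapping acid-site bands be counted.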

Vazhnova, Tanya; Lukyanov, Dmitry B



A new spectral deconvolution - selected ion monitoring method for the analysis of alkylated polycyclic aromatic hydrocarbons in complex mixtures.  


A new gas chromatography/mass spectrometry (GC/MS) method is proffered for the analysis of polycyclic aromatic hydrocarbons (PAH) and their alkylated homologs in complex samples. Recent work elucidated the fragmentation pathways of alkylated PAH, concluding that multiple fragmentation patterns per homolog (MFPPH) are needed to correctly identify all isomers. Programming the MS in selected ion monitoring (SIM) mode to detect homolog-specific MFPPH ions delivers the selectivity and sensitivity that the conventional SIM and/or full scan mass spectrometry methods fail to provide. New spectral deconvolution software eliminates the practice of assigning alkylated homolog peaks via pattern recognition within laboratory-defined retention windows. Findings show that SIM/molecular-ion detection of C1-C4 PAH, now the standard, yields concentrations that differ from SIM/MFPPH results by thousands of percent for some homologs. The SIM/MFPPH methodology is also amenable to the analysis of polycyclic aromatic sulfur heterocycles (PASH) and their alkylated homologs, since many PASH have the same m/z ions as those of PAH and, thus, are false positives in SIM/1-ion PAH detection methods. PMID:24840423

Robbat, Albert; Wilton, Nicholas M



Automated Deconvolution of Overlapped Ion Mobility Profiles  

NASA Astrophysics Data System (ADS)

Presence of unresolved ion mobility (IM) profiles limits the efficient utilization of IM mass spectrometry (IM-MS) systems for isomer differentiation. Here, we introduce an automated ion mobility deconvolution (AIMD) computer software for streamlined deconvolution of overlapped IM-MS profiles. AIMD is based on a previously reported post-IM/collision-induced dissociation (CID) deconvolution approach [J. Am. Soc. Mass Spectrom. 23, 1873 (2012)] and, unlike the previously reported manual approach, it does not require resampling of post-IM/CID data. A novel data preprocessing approach is utilized to improve the accuracy and efficiency of the deconvolution process. Results from AIMD analysis of overlapped IM profiles of data from (1) Waters Synapt G1 for a binary mixture of isomeric peptides (amino acid sequences: GRGDS and SDGRG) and (2) Waters Synapt G2-S for a binary mixture of isomeric trisaccharides (raffinose and isomaltotriose) are presented.
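The core of a post-IM/CID deconvolution is linear unmixing: an overlapped profile is modeled as a weighted sum of component contributions identified through their fragment-ion signatures. A minimal sketch for two components; the signatures and mixture below are synthetic, and AIMD's preprocessing and model are more involved:

```python
# Sketch: express an overlapped observation y as x1*s1 + x2*s2, where s1 and
# s2 are fragment-ion signatures of two isomers, and solve the two-unknown
# least-squares problem via the normal equations. All numbers synthetic.

def unmix_two(s1, s2, y):
    a = sum(v * v for v in s1)
    b = sum(u * v for u, v in zip(s1, s2))
    d = sum(v * v for v in s2)
    e = sum(u * v for u, v in zip(s1, y))
    f = sum(u * v for u, v in zip(s2, y))
    det = a * d - b * b
    return (d * e - b * f) / det, (a * f - b * e) / det

# Made-up fragment signatures standing in for two isomeric peptides
s1 = [1.0, 0.2, 0.0, 0.5]
s2 = [0.0, 0.6, 1.0, 0.1]
y = [0.7 * u + 0.3 * v for u, v in zip(s1, s2)]  # 70/30 mixture
print(unmix_two(s1, s2, y))
```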

Brantley, Matthew; Zekavat, Behrooz; Harper, Brett; Mason, Rachel; Solouki, Touradj



Analysis Tools for Fusion Simulations  

NASA Astrophysics Data System (ADS)

In this talk, we highlight two analysis tools for evaluating fusion simulations. The first tool is for interactively exploring the topology of the magnetic field using a Poincaré map. Unlike traditional Poincaré maps that rely on a dense set of puncture points to form a contiguous representation of the magnetic surface, we use a sparse set of connected puncture points. The puncture points are connected based on a rational approximation of the safety factor. The resulting analysis not only allows for the visualization of magnetic surfaces using a minimal number of puncture points but also identifies features such as magnetic islands. The second tool is for performing query-based analysis on simulations utilizing particles. To assist in the analysis of simulation codes that utilize millions to billions of particles, we have developed analysis tools that combine parallel-coordinates plots with accelerated index searches. Parallel-coordinates plots allow one to identify trends within multivariate data, while accelerated index searches allow one to quickly perform range-based queries on a large number of multivariate entries.
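The rational approximation of the safety factor that orders sparse puncture points can be computed from continued-fraction convergents. A small standalone sketch (not the authors' code):

```python
from fractions import Fraction
import math

# Sketch: successive continued-fraction convergents p/q of a safety factor.
# On a surface with q ~ p/q_den, every q_den-th puncture point is a near
# neighbor, which is what allows sparse puncture points to be connected.

def convergents(x, n):
    """First n continued-fraction convergents of x, via the standard recurrence."""
    p0, q0, p1, q1 = 0, 1, 1, 0   # seeds for the p/q recurrence
    out = []
    for _ in range(n):
        a = math.floor(x)
        p0, q0, p1, q1 = p1, q1, a * p1 + p0, a * q1 + q0
        out.append(Fraction(p1, q1))
        if x == a:
            break
        x = 1.0 / (x - a)
    return out

# Golden-ratio safety factor: convergents are ratios of Fibonacci numbers
print(convergents((1 + math.sqrt(5)) / 2, 6))
```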

Sanderson, Allen; Kruger, Scott; Breslau, Joshua; Ethier, Stephane



Bayesian deconvolution and analysis of photoelectron or any other spectra: Fermi-liquid versus marginal Fermi-liquid behavior of the 3d electrons in Ni  

NASA Astrophysics Data System (ADS)

We present a simple and effective iterative deconvolution of noisy experimental spectra D broadened by the spectrometer function. We show that this "iterative Bayesian deconvolution" is closely related to the more complex "Bayesian analysis," also known as the quantified maximum-entropy method. A model m of the true spectral function is needed in both cases. The Bayesian analysis is the most powerful and precise method to relate measured spectra D to the corresponding theoretical models m via the respective probabilities, but two grave conceptual problems together with two severe technical difficulties prevented widespread application. We remove these four obstacles by (i) demonstrating analytically and also by computer simulations that the most probable deconvolution â obtained as a by-product from the Bayesian analysis gets closer to the true spectral function as the quality of m increases, (ii) finding it equivalent but vastly more efficient to optimize the parameters contained in a given model m by the usual least-squares fit between D and the convolution of m prior to the Bayesian analysis instead of using the Bayesian analysis itself for that purpose, (iii) approximating the convolution by a summation over the energies of the n data points only, with the normalization of the spectrometer function chosen to minimize the errors at both edges of the spectrum, and (iv) avoiding the severe convergence problems frequently encountered in the Bayesian analysis by a simple reformulation of the corresponding system of n nonlinear equations. 
We also apply our version of the Bayesian analysis to angle-resolved photoelectron spectra taken at normal emission from Ni(111) close to the Fermi energy at about 12 K, using two different physical models: Compared with the marginal Fermi liquid, the Fermi-liquid line shape turns out to be about 10^4 times more probable to conform with the observed structure of the majority and minority spin peaks in the low-photon and small-binding-energy region.
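The iterative deconvolution described above is closely related to Richardson-Lucy iteration, in which the estimate is repeatedly multiplied by a back-projected ratio of data to model. A 1D toy sketch with a small synthetic kernel; the paper's actual scheme treats the spectrum edges and the model m more carefully:

```python
# 1D toy sketch of iterative Bayesian (Richardson-Lucy-style) deconvolution:
#   a <- a * correlate(D / conv(a, s), s)
# with a normalized spectrometer function s. Synthetic data only.

def conv_same(a, s):
    c, n = len(s) // 2, len(a)
    return [sum(s[k] * a[i - k + c] for k in range(len(s))
                if 0 <= i - k + c < n) for i in range(n)]

def corr_same(r, s):
    c, n = len(s) // 2, len(r)
    return [sum(s[k] * r[j + k - c] for k in range(len(s))
                if 0 <= j + k - c < n) for j in range(n)]

def richardson_lucy(D, s, iters=100):
    a = [sum(D) / len(D)] * len(D)       # flat starting estimate
    for _ in range(iters):
        model = conv_same(a, s)
        ratio = [d / m if m > 0 else 0.0 for d, m in zip(D, model)]
        a = [ai * ci for ai, ci in zip(a, corr_same(ratio, s))]
    return a

s = [0.25, 0.5, 0.25]                    # normalized broadening kernel
truth = [0.0] * 15
truth[7] = 10.0                          # a single sharp line
D = conv_same(truth, s)                  # broadened "measurement"
est = richardson_lucy(D, s)
print(max(range(len(est)), key=lambda i: est[i]))
```

With noiseless synthetic data the iteration progressively re-concentrates the broadened line at its true position.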

Gerhardt, U.; Marquardt, S.; Schroeder, N.; Weiss, S.



WEAT: Web Enabled Analysis Tool  

NSDL National Science Digital Library

Behavioral Risk Factor Surveillance System The BRFSS, the world's largest telephone survey, tracks health risks in the United States. Information from the survey is used to improve the health of the American people. This tool allows users to create cross-tabulations and perform logistic analysis on these data.

Centers for Disease Control and Prevention


EPR spectrum deconvolution and dose assessment of fossil tooth enamel using maximum likelihood common factor analysis  

Microsoft Academic Search

In order to determine the components which give rise to the EPR spectrum around g = 2 we have applied Maximum Likelihood Common Factor Analysis (MLCFA) on the EPR spectra of enamel sample 1126 which has previously been analysed by continuous wave and pulsed EPR as well as EPR microscopy. MLCFA yielded agreeing results on three sets of X-band spectra

G. Vanhaelewyn; F. Callens; R. Grün



Grid Stiffened Structure Analysis Tool  

NASA Technical Reports Server (NTRS)

The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft Excel spreadsheet, that was developed originally at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM code for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.



Common Bolted Joint Analysis Tool  

NASA Technical Reports Server (NTRS)

Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
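Two of the elementary calculations such a tool steps through can be sketched directly: assembly torque from the common short-form relation T = K·d·F, and a simple margin of safety. The nut factor, factor of safety, and loads below are illustrative assumptions, not comBAT's built-in values.

```python
# Sketch of two staple bolted-joint calculations: assembly torque from the
# short-form relation T = K * d * F_preload, and a margin of safety
# MS = allowable / (applied * FS) - 1. All values illustrative.

def assembly_torque(nut_factor, diameter_m, preload_n):
    """Short-form torque-preload relation, in N*m."""
    return nut_factor * diameter_m * preload_n

def margin_of_safety(allowable, applied, fs=1.4):
    """Margin of safety against an allowable load, with factor of safety fs."""
    return allowable / (applied * fs) - 1.0

# 6.35 mm (1/4 in) bolt, nut factor K = 0.2, 10 kN preload
print(assembly_torque(0.2, 0.00635, 10_000.0))   # ~12.7 N*m
print(margin_of_safety(allowable=28_000.0, applied=10_000.0))
```

A positive margin indicates the joint member carries its applied load with the assumed factor of safety; zero is the acceptance threshold.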

Imtiaz, Kauser



Shot Planning and Analysis Tools  

SciTech Connect

Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z



Wavespace-Based Coherent Deconvolution  

NASA Technical Reports Server (NTRS)

Array deconvolution is commonly used in aeroacoustic analysis to remove the influence of a microphone array's point spread function from a conventional beamforming map. Unfortunately, the majority of deconvolution algorithms assume that the acoustic sources in a measurement are incoherent, which can be problematic for some aeroacoustic phenomena with coherent, spatially-distributed characteristics. While several algorithms have been proposed to handle coherent sources, some are computationally intractable for many problems while others require restrictive assumptions about the source field. Newer generalized inverse techniques hold promise, but are still under investigation for general use. An alternate coherent deconvolution method is proposed based on a wavespace transformation of the array data. Wavespace analysis offers advantages over curved-wave array processing, such as providing an explicit shift-invariance in the convolution of the array sampling function with the acoustic wave field. However, usage of the wavespace transformation assumes the acoustic wave field is accurately approximated as a superposition of plane wave fields, regardless of true wavefront curvature. The wavespace technique leverages Fourier transforms to quickly evaluate a shift-invariant convolution. The method is derived for and applied to ideal incoherent and coherent plane wave fields to demonstrate its ability to determine magnitude and relative phase of multiple coherent sources. Multi-scale processing is explored as a means of accelerating solution convergence. A case with a spherical wave front is evaluated. Finally, a trailing edge noise experiment case is considered. Results show the method successfully deconvolves incoherent, partially-coherent, and coherent plane wave fields to a degree necessary for quantitative evaluation. Curved wave front cases warrant further investigation. A potential extension to nearfield beamforming is proposed.
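The computational advantage cited above rests on the convolution theorem: a shift-invariant convolution becomes a pointwise product after a Fourier transform. A small self-contained demonstration with a plain O(n²) DFT (illustrative only; practical implementations use FFTs and 2D wavenumber grids):

```python
import cmath

# Demonstration of the shift-invariant convolution the wavespace method
# exploits: circular convolution computed directly matches the inverse DFT
# of the product of DFTs. Plain O(n^2) DFT for clarity.

def dft(x, sign=-1):
    """Forward DFT (sign=-1) or inverse DFT with 1/n scaling (sign=+1)."""
    n = len(x)
    out = [sum(xk * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k, xk in enumerate(x)) for j in range(n)]
    return out if sign == -1 else [v / n for v in out]

def circular_convolve(a, b):
    n = len(a)
    return [sum(a[k] * b[(i - k) % n] for k in range(n)) for i in range(n)]

a = [1.0, 2.0, 0.0, -1.0]
b = [0.5, 0.0, 0.25, 0.0]
direct = circular_convolve(a, b)
via_dft = dft([fa * fb for fa, fb in zip(dft(a), dft(b))], sign=+1)
print([round(v.real, 6) for v in via_dft])
```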

Bahr, Christopher J.; Cattafesta, Louis N., III



Modeling error in Approximate Deconvolution Models  

E-print Network

We investigate the asymptotic behaviour of the modeling error in the approximate deconvolution model in the 3D periodic case, when the order $N$ of deconvolution goes to $\infty$. We consider successively the generalised Helmholtz filters of order $p$ and the Gaussian filter. For Helmholtz filters, we estimate the rate of convergence to zero thanks to energy budgets, Gronwall's Lemma, and sharp inequalities for the Fourier coefficients of the residual stress. We next show why the same analysis does not allow us to conclude that the modeling error converges to zero in the case of the Gaussian filter, leaving this question open.
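For reference, the objects involved: the order-p generalised Helmholtz filter and the van Cittert deconvolution operator of order N used in approximate deconvolution models. Standard notation, given here as a sketch rather than a derivation from the paper:

```latex
% Generalised Helmholtz filter of order p, on Fourier modes k
% (\delta = filter radius):
\widehat{\overline{u}}(k) \;=\; \frac{\widehat{u}(k)}{\,1 + \delta^{2p}\,\lvert k\rvert^{2p}\,},
% van Cittert approximate deconvolution operator of order N:
D_N \;=\; \sum_{n=0}^{N} \left(I - G\right)^{n},
\qquad u \;\approx\; D_N\,\overline{u}.
```

Each added term in $D_N$ restores more of the small scales damped by the filter, which is why the modeling error is studied as $N \to \infty$.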

Adrian Dunca; Roger Lewandowski



Automatic interpretation of magnetic data based on Euler deconvolution with unprescribed structural index  

NASA Astrophysics Data System (ADS)

A tool for fully automatic magnetic data interpretation, solving Euler's homogeneity equation with unprescribed structural index and for a linear background in each moving window, is presented here. The implemented Euler deconvolution algorithm is based on the properties of the differential similarity transformation, which decouples the coordinates and the structural index of the singular point and the parameters of the linear background field. Since the deconvolution algorithm resolves the singular point locations well, this allows the application of a two stage clustering technique, focusing the estimated singular point coordinates and structural indices, followed by a statistical analysis of the final solutions. The automatic technique was tested on simple and complex 3D model magnetic anomalies. Finally, the technique was applied to real magnetic anomaly data from the Burgas region and the adjoining Black Sea shelf of Bulgaria. The tool consists of two main functions, written in Matlab v.5.3, requiring Matlab's SPLINE and STATISTICS toolkits.
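Euler's homogeneity equation, extended with the background term whose parameters the differential similarity transformation decouples, takes the standard form (conventional notation; the paper's exact formulation may differ in details):

```latex
% Euler's homogeneity equation for a field T measured at (x, y, z), with
% singular point (x_0, y_0, z_0), structural index N, and background B:
(x - x_0)\,\frac{\partial T}{\partial x}
+ (y - y_0)\,\frac{\partial T}{\partial y}
+ (z - z_0)\,\frac{\partial T}{\partial z}
\;=\; N\,\bigl(B - T\bigr)
```

In each moving window the unknowns $(x_0, y_0, z_0)$, $N$, and the background parameters are estimated from this overdetermined linear system.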

Gerovska, Daniela; Araúzo-Bravo, Marcos J.



Survey of visualization and analysis tools  

NASA Technical Reports Server (NTRS)

A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and generic visual-programming tools that are extensible. Most of the extensible packages examined incorporate a data-flow paradigm.

Meyer, P. J.



Tool for Magnetic Analysis Package  

NSDL National Science Digital Library

The FIT-MART Launcher package is a self-contained file for simulating systems of interacting quantum magnetic moments (spins). The interactions are modeled using the Heisenberg model, calculations are carried out by numerically diagonalizing the matrix representation of the Heisenberg Hamiltonian, and several types of plots are generated to describe various aspects of the model. The FIT-MART package is a Fully Integrated Tool for Magnetic Analysis in Research & Teaching (hence the acronym), which provides a very simple interface for defining complex quantum spin models, carrying out complex calculations, and visualizing the results using several graphical representations. These representations include plots of the energy spectrum as well as plots of the magnetization and magnetic susceptibility as a function of temperature and magnetic field. The FIT-MART package is an Open Source Physics package written to help students as well as researchers who are studying magnetism. It is distributed as a ready-to-run (compiled) Java archive. Double-clicking the osp_fit_mart.jar file will run the package if Java is installed. In future versions of this package, curricular materials will be included to help students to learn about magnetism, and automated fitting routines will be included to help researchers quickly and easily model experimental data.
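The kind of calculation FIT-MART automates can be illustrated with the smallest case: two spin-1/2 moments coupled by the Heisenberg Hamiltonian H = J S₁·S₂, whose 4×4 matrix splits into a singlet at −3J/4 and a triplet at J/4. A pure-Python sketch (illustrative; FIT-MART diagonalizes much larger Hamiltonians numerically):

```python
# Smallest FIT-MART-style calculation: two spin-1/2 moments with Heisenberg
# coupling H = J * S1.S2 in the basis |uu>, |ud>, |du>, |dd> (hbar = 1).
# The 4x4 matrix is block diagonal, so the middle 2x2 block can be
# diagonalized analytically.

def heisenberg_two_spin(J):
    """Matrix of H = J * (Sz1*Sz2 + (S1+ S2- + S1- S2+) / 2)."""
    return [[ J / 4, 0.0,    0.0,    0.0],
            [0.0,   -J / 4,  J / 2,  0.0],
            [0.0,    J / 2, -J / 4,  0.0],
            [0.0,    0.0,    0.0,    J / 4]]

def eigenvalues(J):
    """Eigenvalues of the two-spin Hamiltonian, sorted ascending."""
    # |uu> and |dd> are eigenstates with energy J/4; the middle block
    # [[-J/4, J/2], [J/2, -J/4]] has eigenvalues -J/4 +/- J/2.
    return sorted([J / 4, J / 4, -J / 4 + J / 2, -J / 4 - J / 2])

# Antiferromagnetic J > 0: singlet ground state at -3J/4, triplet at J/4
print(eigenvalues(1.0))
```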

Engelhardt, Larry; Rainey, Cameron



Application in Alzheimer's Disease Early Detection Deconvolution  

E-print Network

Slide outline: Application in Alzheimer's Disease Early Detection; Deconvolution; Regularized Least ...; ... Extraction from PET Images; Conclusions and Future Work.

Renaut, Rosemary


General Mission Analysis Tool (GMAT) Mathematical Specifications  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

Hughes, Steve



NCI Interactive Budget Analysis Tool

This tool provides users an interactive overview of the National Cancer Institute (NCI) budget and Fact Book data since Fiscal Year 1999. Additional historical NCI budget information can be obtained through the NCI Fact Book Collection.


Windprofiler optimization using digital deconvolution procedures  

NASA Astrophysics Data System (ADS)

Digital improvements to data acquisition procedures used for windprofiler radars have the potential for improving the height coverage at optimum resolution, and permit improved height resolution. A few newer systems already use this capability. Real-time deconvolution procedures offer even further optimization, and this has not been effectively employed in recent years. In this paper we demonstrate the advantages of combining these features, with particular emphasis on the advantages of real-time deconvolution. Using several multi-core CPUs, we have been able to achieve speeds of up to 40 GHz from a standard commercial motherboard, allowing data to be digitized and processed without the need for any type of hardware except for a transmitter (and associated drivers), a receiver and a digitizer. No Digital Signal Processor chips are needed, allowing great flexibility with analysis algorithms. By using deconvolution procedures, we have then been able to not only optimize height resolution, but also have been able to make advances in dealing with spectral contaminants like ground echoes and other near-zero-Hz spectral contamination. Our results also demonstrate the ability to produce fine-resolution measurements, revealing small-scale structures within the backscattered echoes that were previously not possible to see. Resolutions of 30 m are possible for VHF radars. Furthermore, our deconvolution technique allows the removal of range-aliasing effects in real time, a major bonus in many instances. Results are shown using new radars in Canada and Costa Rica.
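The core deconvolution step can be illustrated with a toy example. The sketch below recovers a transmitted sequence from its convolution with a known causal pulse by forward substitution; it is a minimal illustration of the principle only, not the authors' real-time algorithm, and the sequences and pulse shape are invented for the example.

```python
def convolve(x, h):
    """Discrete linear convolution of sequence x with kernel h."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def deconvolve(y, h):
    """Recover x from y = h * x by forward substitution,
    assuming a causal kernel with h[0] != 0 and noise-free data."""
    n = len(y) - len(h) + 1
    x = []
    for i in range(n):
        acc = y[i]
        for k in range(1, min(i + 1, len(h))):
            acc -= h[k] * x[i - k]
        x.append(acc / h[0])
    return x

# Convolve a known sequence with a pulse, then recover it exactly.
x = [1.0, 2.0, 0.5, -1.0]
h = [2.0, 1.0, 0.5]
y = convolve(x, h)
print(deconvolve(y, h))  # → [1.0, 2.0, 0.5, -1.0]
```

In practice noise makes this direct inversion unstable, which is why regularized or iterative schemes are used on real radar data.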

Hocking, W. K.; Hocking, A.; Hocking, D. G.; Garbanzo-Salas, M.



Fast Holographic Deconvolution: a new technique for precision radio interferometry  

E-print Network

We introduce the Fast Holographic Deconvolution method for analyzing interferometric radio data. Our new method is an extension of A-projection/software-holography/forward modeling analysis techniques and shares their precision deconvolution and widefield polarimetry, while being significantly faster than current implementations that use full direction-dependent antenna gains. Using data from the MWA 32 antenna prototype, we demonstrate the effectiveness and precision of our new algorithm. Fast Holographic Deconvolution may be particularly important for upcoming 21 cm cosmology observations of the Epoch of Reionization and Dark Energy where foreground subtraction is intimately related to the precision of the data reduction.

Sullivan, Ian; Hazelton, Bryna; Arcus, Wayne; Barnes, David; Bernardi, Gianni; Briggs, Frank; Bowman, Judd D; Bunton, John; Cappallo, Roger; Corey, Brian; Deshpande, Avinash; deSouza, Ludi; Emrich, David; Gaensler, B M; Goeke, Robert; Greenhill, Lincoln; Herne, David; Hewitt, Jacqueline; Johnston-Hollitt, Melanie; Kaplan, David; Kasper, Justin; Kincaid, Barton; Koenig, Ronald; Kratzenberg, Eric; Lonsdale, Colin; Lynch, Mervyn; McWhirter, Russell; Mitchell, Daniel; Morgan, Edward; Oberoi, Divya; Ord, Stephen; Pathikulangara, Joseph; Prabu, Thiagaraj; Remillard, Ron; Rogers, Alan; Roshi, Anish; Salah, Joseph; Sault, Robert; Shankar, Udaya; Srivani, K; Stevens, Jamie; Subrahmanyan, Ravi; Tingay, Steven; Wayth, Randall; Waterson, Mark; Webster, Rachel; Whitney, Alan; Williams, Andrew; Williams, Chris; Wyithe, Stuart



Analysis of Ten Reverse Engineering Tools  

NASA Astrophysics Data System (ADS)

Reverse engineering tools can be used to satisfy the information needs of software maintainers. Tool support is especially essential when maintaining large-scale legacy systems. Reverse engineering tools offer various capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

Koskinen, Jussi; Lehmonen, Tero


Statistical Tools for Forensic Analysis of Toolmarks  

SciTech Connect

Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a "degree of association" between matches of similarly produced toolmarks. The basis for statistical method development relies on "discriminating criteria" that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser



Stochastic Simulation Tool for Aerospace Structural Analysis  

NASA Technical Reports Server (NTRS)

Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

Knight, Norman F.; Moore, David F.



NEAT: Nebular Empirical Analysis Tool  

NASA Astrophysics Data System (ADS)

NEAT is a fully automated code which carries out a complete analysis of lists of emission lines to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally to estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances.
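The Monte Carlo propagation NEAT describes can be sketched in a few lines: perturb each measured input by its Gaussian uncertainty, recompute the derived quantity, and summarize the resulting distribution. The flux values and the ratio-type derived quantity below are illustrative placeholders, not taken from NEAT.

```python
import random
import statistics

def propagate(flux_a, err_a, flux_b, err_b, n=20000, seed=42):
    """Propagate Gaussian flux uncertainties into a derived flux
    ratio by resampling: perturb inputs, recompute, summarize."""
    rng = random.Random(seed)
    samples = [
        rng.gauss(flux_a, err_a) / rng.gauss(flux_b, err_b)
        for _ in range(n)
    ]
    return statistics.mean(samples), statistics.stdev(samples)

# A ratio of two measured line fluxes with 5% and 4% uncertainties.
mean, sigma = propagate(100.0, 5.0, 50.0, 2.0)
print(round(mean, 2), round(sigma, 2))  # ratio near 2 with propagated scatter
```

The same resampling loop works unchanged for any downstream quantity (temperature, density, abundance) that is a function of the perturbed inputs.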

Wesson, R.; Stock, D.; Scicluna, P.



Deconvolution using a neural network  

SciTech Connect

Viewing one-dimensional deconvolution as a matrix inversion problem, we compare a neural-network backpropagation matrix inverse with LMS and pseudo-inverse methods. This is largely an exercise in understanding how our neural network code works. 1 ref.

Lehman, S.K.



Tools for Basic Statistical Analysis  

NASA Technical Reports Server (NTRS)

Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curvefit data to the linear equation y=f(x) and will do an ANOVA to check its significance.
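The "Normal Distribution Estimates" calculation described above (finding the data value corresponding to a cumulative probability, given a sample mean and standard deviation) can be sketched with Python's standard library; the inputs below are arbitrary examples, not values from the toolset.

```python
from statistics import NormalDist

def value_at_probability(mean, std, p):
    """Return the data value whose cumulative probability is p
    for a Normal(mean, std) distribution (the inverse CDF)."""
    return NormalDist(mu=mean, sigma=std).inv_cdf(p)

# The median equals the mean; higher percentiles sit above it.
print(value_at_probability(10.0, 2.0, 0.5))  # → 10.0
```

The same `NormalDist` class also supports the forward direction (`cdf`) and fitting from two (value, probability) points, mirroring the spreadsheet's "Normal Distribution from two Data Points" program.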

Luz, Paul L.



The Galileo Fault Tree Analysis Tool  

Microsoft Academic Search

We present Galileo, a dynamic fault tree modeling and analysis tool that combines the innovative DIFTree analysis methodology with a rich user interface built using package-oriented programming. DIFTree integrates binary decision diagram and Markov methods under the common notation of dynamic fault trees, allowing the user to exploit the benefits of both techniques while avoiding the need

Kevin J. Sullivan; Joanne Bechta Dugan; David Coppit



The Vampir Performance Analysis Tool-Set  

Microsoft Academic Search

This paper presents the Vampir tool-set for performance analysis of parallel applications. It consists of the run-time measurement system VampirTrace and the visualization tools Vampir and VampirServer. It describes the major features and outlines the underlying implementation that is necessary to provide low overhead and good scalability. Furthermore, it gives a short overview about the development history and future work

Andreas Knüpfer; Holger Brunst; Jens Doleschal; Matthias Jurenz; Matthias Lieber; Holger Mickler; Matthias S. Müller; Wolfgang E. Nagel



A network evaluation and analysis tool  

SciTech Connect

The rapid emergence of large heterogeneous networks, distributed systems, and massively parallel computers has resulted in economies of scale, enhanced productivity, efficient communication, resource sharing, and increased reliability, which are computationally beneficial. In addition to these benefits, networking presents technical challenges and problems with respect to maintaining and ensuring the security, design, compatibility, integrity, functionality, and management of these systems. In this paper we describe a computer security tool, Network Evaluation and Analysis Tool (NEAT), that we have developed to address these concerns.

Stoltz, L.A.; Whiteson, R.; Fasel, P.K.; Temple, R.; Dreicer, J.S.




Built Environment Energy Analysis Tool Overview (Presentation)  

SciTech Connect

This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

Porter, C.



Genomic sequence analysis tools: a user's guide.  


The wealth of information from various genome sequencing projects provides the biologist with a new perspective from which to analyze, and design experiments with, mammalian systems. The complexity of the information, however, requires new software tools, and numerous such tools are now available. Which type and which specific system is most effective depends, in part, upon how much sequence is to be analyzed and with what level of experimental support. Here we survey a number of mammalian genomic sequence analysis systems with respect to the data they provide and the ease of their use. The hope is to aid the experimental biologist in choosing the most appropriate tool for their analyses. PMID:11226611

Fortna, A; Gardiner, K



Performance Analysis of GYRO: A Tool Evaluation  

SciTech Connect

The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.



Design and Analysis Tools for Supersonic Inlets  

NASA Technical Reports Server (NTRS)

Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

Slater, John W.; Folk, Thomas C.



CRAB: Distributed analysis tool for CMS  

NASA Astrophysics Data System (ADS)

CMS has a distributed computing model, based on a hierarchy of tiered regional computing centers and adopts a data driven model for the end user analysis. This model foresees that jobs are submitted to the analysis resources where data are hosted. The increasing complexity of the whole computing infrastructure makes the simple analysis work flow more and more complicated for the end user. CMS has developed and deployed a dedicated tool named CRAB (CMS Remote Analysis Builder) in order to guarantee the physicists an efficient access to the distributed data whilst hiding the underlying complexity. This tool is used by CMS to enable the running of physics analysis jobs in a transparent manner over data distributed across sites. It factorizes out the interaction with the underlying batch farms, grid infrastructure and CMS data management tools, allowing the user to deal only with a simple and intuitive interface. We present the CRAB architecture, as well as the current status and lessons learnt in deploying this tool for use by the CMS collaboration. We also present the future development of the CRAB system.

Sala, Leonardo; CMS Collaboration



Mars Reconnaissance Orbiter Uplink Analysis Tool  

NASA Technical Reports Server (NTRS)

This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.

Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline



Interactive Machine Learning Tools for Data Analysis  

NASA Astrophysics Data System (ADS)

In this work we propose a scientific data exploration methodology and software environment that permits both data clustering/labeling and visualization. The approach is based on an elaboration pipeline, going from data importing to cluster analysis and assessment, with each stage supported by dedicated visualization and interaction tools. Supported techniques include a stability based procedure for the algorithmic parameters estimation (i.e. number of centers in K-means or map dimension in Self-Organizing Maps); a set of models realizing the fuzzy membership analysis and allowing users to select sub-clusters according to a given stability threshold; a tool for studying cluster reliability based on resampling; a tool for the interactive hierarchical agglomeration of clusters. Finally, a novel technique allowing the exploration and visualization of the space of clustering solutions is introduced. All implemented techniques are supported through appealing visualizations in a highly interactive environment.

Tagliaferri, Roberto; Iorio, Francesco; Napolitano, Francesco; Raiconi, Giancarlo; Miele, Gennaro




Advanced Composites Repair Analysis Tool (ACRAT)

Microsoft Academic Search

The Advanced Composites Repair Analysis Tool (ACRAT) has been under development for the USAF Advanced Composites Program Office under an Ogden ALC Design Engineering Program (DEP) Contractual Engineering Task (CET) Order. ACRAT is an integrated prototype software system consisting of commercial-off-the-shelf (COTS) and public domain CAE simulation codes and customized databases. The objective has been to develop Beta versions of

Thomas E. Mack; James Y. Song


Virtual tools for teaching electrocardiographic rhythm analysis.  


Electrocardiographic (ECG) rhythm analysis is inadequately taught in training programs, resulting in undertrained physicians reading a large percentage of the 40 million ECGs recorded annually. The effective use of the simple tools (calipers, ruler, and magnifier) required for crucial measurements and comparisons of intervals requires considerable time for interactive instruction and is difficult to teach in the classroom. The ECGViewer (Blaufuss Medical Multimedia Laboratories, Palo Alto, Calif) program was developed using virtual tools, easily manipulated by computer mouse, that can be used to analyze archived scanned ECGs on computer screens and in classroom projection. Trainees manipulate the on-screen tools from their seats by wireless mouse while the instructor makes corrections with a second mouse, in clear view of the trainees. An on-screen ladder diagram may be constructed by the trainee and critiqued by the instructor. The ECGViewer program has been successfully used and well received by trainees at the medical school, residency, and subspecialty fellowship levels. PMID:16387064

Criley, John Michael; Nelson, William P



Multiscale CLEAN Deconvolution of Radio Synthesis Images  

Microsoft Academic Search

Radio synthesis imaging is dependent upon deconvolution algorithms to counteract the sparse sampling of the Fourier plane. These deconvolution algorithms find an estimate of the true sky brightness from the necessarily incomplete sampled visibility data. The most widely used radio synthesis deconvolution method is the CLEAN algorithm of Hogbom. This algorithm works extremely well for collections of point sources and
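The Högbom CLEAN iteration referred to above can be sketched in one dimension: find the residual peak, subtract a scaled, shifted copy of the point spread function (PSF), and accumulate the component. This toy sketch ignores the multiscale extension that is the subject of the paper, and the gain, threshold, and PSF values are illustrative.

```python
def hogbom_clean(dirty, psf, gain=0.1, niter=500, threshold=1e-3):
    """Minimal 1-D Hogbom CLEAN: repeatedly find the residual peak,
    subtract a scaled, shifted copy of the PSF, and record the
    component. `psf` is assumed peak-normalized and centered."""
    residual = list(dirty)
    components = [0.0] * len(dirty)
    half = len(psf) // 2
    for _ in range(niter):
        peak = max(range(len(residual)), key=lambda i: abs(residual[i]))
        if abs(residual[peak]) < threshold:
            break  # residual is down in the noise; stop cleaning
        flux = gain * residual[peak]
        components[peak] += flux
        for j, p in enumerate(psf):
            k = peak + j - half
            if 0 <= k < len(residual):
                residual[k] -= flux * p
    return components, residual

# A single point source of flux 1.0 at index 5, observed through a
# triangular PSF, should be recovered at the same location.
psf = [0.5, 1.0, 0.5]
dirty = [0.0] * 11
for j, p in enumerate(psf):
    dirty[5 + j - 1] += 1.0 * p
components, residual = hogbom_clean(dirty, psf)
print(max(range(11), key=lambda i: components[i]))  # → 5
```

Real interferometric CLEAN works on 2-D images with shift-variant effects, which is exactly the regime where holographic and A-projection extensions matter.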

Tim J. Cornwell



Deconvolution of gas chromatographic data  

NASA Technical Reports Server (NTRS)

The use of deconvolution methods on gas chromatographic data to obtain an accurate determination of the relative amounts of each material present by mathematically separating the merged peaks is discussed. Data were obtained on a gas chromatograph with a flame ionization detector. Chromatograms of five xylenes with differing degrees of separation were generated by varying the column temperature at selected rates. The merged peaks were then successfully separated by deconvolution. The concept of function continuation in the frequency domain was introduced in striving to reach the theoretical limit of accuracy, but proved to be only partially successful.

Howard, S.; Rayborn, G. H.



Integrated tools for control-system analysis  

NASA Technical Reports Server (NTRS)

The basic functions embedded within a user friendly software package (MATRIXx) are used to provide a high level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.



Modeling of pharmacokinetic systems using stochastic deconvolution.  


In environments where complete mechanistic knowledge of the system dynamics is not available, a synergy of first-principle concepts, stochastic methods and statistical approaches can provide an efficient, accurate, and insightful strategy for model development. In this work, a system of ordinary differential equations describing system pharmacokinetics (PK) was coupled to a Wiener process for tracking the absorption rate coefficient, and was embedded in a nonlinear mixed effects population PK formalism. The procedure is referred to as "stochastic deconvolution" and it is proposed as a diagnostic tool to inform on a mapping function between the fraction of the drug absorbed and the fraction of the drug dissolved when applying one-stage methods to in vitro-in vivo correlation modeling. The goal of this work was to show that stochastic deconvolution can infer an a priori specified absorption profile given dense observational (simulated) data. The results demonstrate that the mathematical model is able to accurately reproduce the simulated data in scenarios where solution strategies for linear, time-invariant systems would assuredly fail. To this end, PK systems that are representative of Michaelis-Menten kinetics and enterohepatic circulation were investigated. Furthermore, the solution times are manageable using a modest computer hardware platform. PMID:24174399
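The coupling of a compartment ODE system to a Wiener process can be illustrated with a simple Euler-Maruyama simulation of a hypothetical one-compartment model whose absorption rate coefficient drifts stochastically. All parameter values below are illustrative, and this is a conceptual sketch, not the authors' population PK implementation.

```python
import math
import random

def simulate_pk(ka0=1.0, ke=0.2, sigma=0.1, dose=100.0,
                t_end=12.0, dt=0.01, seed=1):
    """Euler-Maruyama simulation of a one-compartment model in which
    the log absorption rate coefficient follows a Wiener process:
      dA_gut/dt  = -ka(t) * A_gut
      dA_body/dt =  ka(t) * A_gut - ke * A_body
      d(log ka)  =  sigma * dW
    Returns a list of (time, amount-in-body) points."""
    rng = random.Random(seed)
    log_ka = math.log(ka0)
    a_gut, a_body = dose, 0.0
    t, trace = 0.0, []
    while t < t_end:
        ka = math.exp(log_ka)
        a_gut += -ka * a_gut * dt
        a_body += (ka * a_gut - ke * a_body) * dt
        log_ka += sigma * rng.gauss(0.0, math.sqrt(dt))
        t += dt
        trace.append((t, a_body))
    return trace

trace = simulate_pk()
peak_time = max(trace, key=lambda p: p[1])[0]
print(round(peak_time, 1))  # amount in body peaks a few hours post-dose
```

Embedding such a stochastic absorption term inside a mixed-effects estimation framework is what allows the deconvolution step to infer time-varying absorption from concentration data.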

Kakhi, Maziar; Chittenden, Jason



Interpretation and Deconvolution of Nanodisc Native Mass Spectra  

NASA Astrophysics Data System (ADS)

Nanodiscs are a promising system for studying gas-phase and solution complexes of membrane proteins and lipids. We previously demonstrated that native electrospray ionization allows mass spectral analysis of intact Nanodisc complexes at single lipid resolution. This report details an improved theoretical framework for interpreting and deconvoluting native mass spectra of Nanodisc lipoprotein complexes. In addition to the intrinsic lipid count and charge distributions, Nanodisc mass spectra are significantly shaped by constructive overlap of adjacent charge states at integer multiples of the lipid mass. We describe the mathematical basis for this effect and develop a probability-based algorithm to deconvolute the underlying mass and charge distributions. The probability-based deconvolution algorithm is applied to a series of dimyristoylphosphatidylcholine Nanodisc native mass spectra and used to provide a quantitative picture of the lipid loss in gas-phase fragmentation.

Marty, Michael T.; Zhang, Hao; Cui, Weidong; Gross, Michael L.; Sligar, Stephen G.



Optimal application of Morrison's iterative noise removal for deconvolution  

NASA Technical Reports Server (NTRS)

Morrison's iterative method of noise removal can be applied both for noise removal alone and for noise removal prior to deconvolution. The method is applied to data with noise added at various levels to determine its optimum use. The phase shift method of migration and modeling is evaluated and the results are compared to Stolt's approach. A method is introduced by which the optimum iteration number for deconvolution can be found. Statistical computer simulation is used to describe the optimum use of two convergent iterative techniques for seismic data. The Always-Convergent deconvolution technique was applied to data recorded during the quantitative analysis of materials through Nondestructive Evaluation (NDE), in which ultrasonic signals were used to detect flaws in substances such as composites.

Ioup, George E.; Ioup, Juliette W.



Deconvolution using the complex cepstrum  

SciTech Connect

The theory, description, and implementation of a generalized linear filtering system for the nonlinear filtering of convolved signals are presented. A detailed look at the problems and requirements associated with the deconvolution of signal components is undertaken. Related properties are also developed. A synthetic example is shown and is followed by an application using real seismic data. 29 figures.

Riley, H B



Tools for Next Generation Sequencing Data Analysis  

PubMed Central

As NGS technology continues to improve, the amount of data generated per run grows exponentially. Unfortunately, the primary bottleneck in NGS studies is still bioinformatics analysis. Not all researchers have access to a bioinformatics core or dedicated bioinformatician. Additionally, much of the software for NGS analyses is written to run in a Unix/Linux environment. Researchers unfamiliar with the Unix command line may be unable to use these tools, or face a steep learning curve in trying to do so. Commercial packages exist, such as the CLC Genomics Workbench, DNANexus, and GenomeQuest. However, these commercial packages often incorporate proprietary algorithms to perform data analysis and may be costly. Galaxy provides a solution to this problem by incorporating popular open-source and community Linux command-line tools into an easy-to-use web-based environment. After sequence data has been uploaded and mapped, there are a variety of workflows for NGS analyses that use open-source tools. This includes peak-calling analyses for ChIP-Seq (MACS, GeneTrack indexer, Peak predictor), RNA-Seq (Tophat, Cufflinks), and finding small insertions, deletions, and SNPs using SAMtools. Any researcher can apply a workflow to their NGS data and retrieve results, without having to interact with a command line. Additionally, since Galaxy is cloud-based, expensive computing hardware for performing analyses is not needed. In this presentation we will provide an overview of two popular open source RNA-Seq analysis tools, Tophat and Cufflinks, and demonstrate how they can be used in Galaxy.

Bodi, K.



Interval estimate with probabilistic background constraints in deconvolution  

NASA Astrophysics Data System (ADS)

We present in this article the use of probabilistic background constraints in astronomical image deconvolution to approach the solution as an interval estimate. We elaborate our objective, an interval estimate of the unknown object from observed data, and our approach, a Monte Carlo experiment with analysis of marginal distributions of image values. One-dimensional observation and deconvolution using the proposed approach are simulated. Confidence intervals revealing the uncertainties due to the background constraint are calculated, and significance levels for sources retrieved from restored images are provided.

Huo, Zhuo-xi; Zhou, Jian-feng



Desktop Analysis Reporting Tool (DART) User's Guide  

SciTech Connect

The Desktop Analysis Reporting Tool (DART) is a software package that allows a user to easily view and analyze radiation portal monitor (RPM) daily files that span long periods. DART gives users the capability to determine the state of health of a monitor, troubleshoot and diagnose problems, and view data in various time frames to perform trend analysis. In short, it converts the data strings written in the daily files into meaningful tables and plots. DART is an application-based program that was designed to maximize the benefit of a centralized data repository while distributing the workload to individual desktop machines. This networked approach requires a more complex database manager (SQL Server); however, SQL Server is not currently provided with the DART Installation Disk. SQL Express is sufficient for local data analysis and requires the installation of SQL Express and DART on each machine intended for analysis.

Lousteau, Angela L [ORNL; Alcala, Scott [ORNL



A comparative study of social network analysis tools  

E-print Network

A comparative study of social network analysis tools. David Combe, Christine Largeron, et al. Surveys the need for social network mining and social network analysis (SNA) methods and tools, and compares tools that implement algorithms dedicated to social network analysis. Keywords: Social Network Analysis

Paris-Sud XI, Université de


Analysis tools for dam removal  

E-print Network

The primary options for managing reservoir sediment when removing a dam are river erosion, mechanical removal, and stabilization (ASCE 1997). Covers hydrodynamic, sediment transport, and physical modeling of sediment erosion from the reservoir and subsequent deposition downstream.

Tullos, Desiree


REDCAT: a residual dipolar coupling analysis tool  

NASA Astrophysics Data System (ADS)

Recent advancements in the utilization of residual dipolar couplings (RDCs) as a means of structure validation and elucidation have demonstrated the need for not only a more user-friendly but also a more powerful RDC analysis tool. In this paper, we introduce a software package named REsidual Dipolar Coupling Analysis Tool (REDCAT) designed to address these issues. REDCAT is a user-friendly program whose graphical user interface, developed in Tcl/Tk, is highly portable. The computational engine behind this GUI is written in C/C++, so its computational performance is excellent. The modular implementation of REDCAT's algorithms, which separates the computational engine from the graphical engine, allows for flexible and easy command-line interaction; this feature can be utilized for the design of automated data-analysis sessions. The software package is also portable to Linux clusters for high-throughput applications. In addition to basic utilities to solve for order tensors and back-calculate couplings from a given order tensor and proposed structure, a number of improved algorithms have been incorporated. These include proper sampling of the null space (when the system of linear equations is under-determined), more sophisticated filters for invalid order-tensor identification, error analysis for the identification of problematic measurements, and simulation of the effects of dynamic averaging processes.
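At the core of any RDC order-tensor solver is a linear system: each coupling is linear in the five independent elements of the traceless, symmetric Saupe order tensor. A sketch of that system (synthetic bond vectors and tensor values, with normal equations standing in for the SVD and null-space sampling REDCAT actually uses) might look like:

```python
import random

# For a unit bond vector (x, y, z), the scaled RDC is
#   D = Sxx*(x^2 - z^2) + Syy*(y^2 - z^2) + 2*Sxy*x*y + 2*Sxz*x*z + 2*Syz*y*z
def row(v):
    x, y, z = v
    return [x*x - z*z, y*y - z*z, 2*x*y, 2*x*z, 2*y*z]

def solve(A, b):
    """Least squares via normal equations and Gaussian elimination (5x5)."""
    m = len(A[0])
    N = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(m)]
         for i in range(m)]
    c = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(m)]
    for i in range(m):                        # elimination with pivoting
        p = max(range(i, m), key=lambda r: abs(N[r][i]))
        N[i], N[p] = N[p], N[i]; c[i], c[p] = c[p], c[i]
        for r in range(i + 1, m):
            f = N[r][i] / N[i][i]
            N[r] = [N[r][j] - f * N[i][j] for j in range(m)]
            c[r] -= f * c[i]
    s = [0.0] * m
    for i in reversed(range(m)):              # back substitution
        s[i] = (c[i] - sum(N[i][j] * s[j] for j in range(i + 1, m))) / N[i][i]
    return s

random.seed(1)
S_true = [3e-4, -1e-4, 2e-4, -5e-5, 1e-4]     # hypothetical order tensor
vecs = []
while len(vecs) < 20:                         # random unit bond vectors
    v = [random.gauss(0, 1) for _ in range(3)]
    norm = sum(t * t for t in v) ** 0.5
    vecs.append([t / norm for t in v])
A = [row(v) for v in vecs]
D = [sum(a * s for a, s in zip(r, S_true)) for r in A]   # synthetic RDCs
S_fit = solve(A, D)
```

With noiseless synthetic couplings the fit recovers the tensor exactly; REDCAT's null-space sampling matters precisely when there are fewer than five independent measurements and this system becomes under-determined.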

Valafar, Homayoun; Prestegard, James H.






BBAT: Bunch and bucket analysis tool  

SciTech Connect

BBAT was written to meet the need for an interactive graphical tool to explore longitudinal phase space. It is designed for quickly testing new ideas or tricks, and is especially suitable for machine physicists and operations staff, both in the control room during machine studies and off-line for data analysis. The heart of the package is a set of C routines that do the number crunching. The graphics layer is built with the Tcl/Tk scripting language and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT handles a single RF system. For a double RF system, one can use Dr. BBAT, which stands for Double RF Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat-bunch creation.
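The longitudinal phase space such a tool explores is governed by a simple kick-drift iteration. A toy single-RF sketch (the parameter `k` lumps RF voltage, harmonic number, and slip factor into one arbitrary illustrative value; this is not BBAT's code) shows the qualitative distinction the bucket defines, bounded motion inside versus drifting phase outside:

```python
import math

k = 0.02                                   # illustrative lumped rf parameter

def track(phi, d, turns=2000):
    """One particle through the toy single-RF kick-drift map."""
    path = []
    for _ in range(turns):
        d = d - k * math.sin(phi)          # rf kick to the energy error
        phi = phi + d                      # phase slip ("drift")
        path.append((phi, d))
    return path

inside = track(0.5, 0.0)                   # small-amplitude synchrotron motion
outside = track(0.0, 0.35)                 # above the bucket half-height ~2*sqrt(k)
max_phi_inside = max(abs(p) for p, _ in inside)
final_phi_outside = outside[-1][0]         # phase keeps advancing, turn after turn
```

The trapped particle oscillates with bounded phase, while the untrapped one circulates indefinitely; the separatrix between the two regimes is the "bucket" boundary.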

Deng, D.P.



Blind deconvolution of impulsive signals using a modified Sato algorithm  

Microsoft Academic Search

Many blind deconvolution algorithms have been designed to extract digital communications signals corrupted by inter-symbol interference. Such algorithms generally fail when applied to signals with impulsive characteristics, such as acoustic signals. In this paper, we provide a theoretical analysis and explanation as to why Bussgang-type algorithms are generally unsuitable for deconvolving impulsive signals. We then propose a novel modification of

Heinz Mathis; Scott C. Douglas



Enhancement of Local Climate Analysis Tool  

NASA Astrophysics Data System (ADS)

The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of artificial intelligence to connect human and computer perceptions of how data and scientific techniques are applied, while multiprocessing simultaneous users' tasks. Future development includes expanding the types of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System reanalysis (CFSR) data and NOAA model output data, including output from the National Multi-Model Ensemble Prediction System (NMME) and longer-term projection models. We will also describe plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advances and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).

Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.



Web-based pre-Analysis Tools  

E-print Network

The project consists in the initial development of a web based and cloud computing services to allow students and researches to perform fast and very useful cut-based pre-analysis on a browser, using real data and official Monte-Carlo simulations (MC). Several tools are considered: ROOT files filter, JavaScript Multivariable Cross-Filter, JavaScript ROOT browser and JavaScript Scatter-Matrix Libraries. Preliminary but satisfactory results have been deployed online for test and future upgrades.

Moskalets, Tetiana



Microfracturing and new tools improve formation analysis  

SciTech Connect

This paper reports on microfracturing with nitrogen, an experimental extensometer, stress-profile determination from wireline logs, and temperature logging in air-filled holes, new tools and techniques that add resolution to Devonian shale gas well analysis. Microfracturing creates small fractures by injecting small amounts of fluid at very low rates. Microfracs are usually created at several different depths to determine stress variation as a function of depth and rock type. To obtain an oriented core containing the fracture, the formation is microfractured during drilling. These tests are critical in establishing basic open-hole parameters for designing the main fracture treatment.

McMechan, D.E.; Venditto, J.J.; Heemstra, T. (New England River Basins Commission, Boston, MA (United States). Power and Environment Committee); Simpson, G. (Halliburton Logging Services, Houston, TX (United States)); Friend, L.L.; Rothman, E. (Columbia Natural Resources Inc., Charleston, WV (United States))



GIS-based hydrogeochemical analysis tools (QUIMET)  

NASA Astrophysics Data System (ADS)

A software platform (QUIMET) was developed to improve the sorting, analysis, calculation, visualization, and interpretation of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The analysis tools cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (Salinity, Schoeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows performing a complete statistical analysis of the data, including descriptive univariate and bivariate statistics, the latter including the generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.

Velasco, V.; Tubau, I.; Vázquez-Suñé, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernández-García, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.



SEAT: A strategic engagement analysis tool  

SciTech Connect

The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

Dreicer, J.; Michelsen, C.; Morgeson, D.



AIDA: Adaptive Image Deconvolution Algorithm  

NASA Astrophysics Data System (ADS)

AIDA is an implementation and extension of the MISTRAL myopic deconvolution method developed by Mugnier et al. (2004) (see J. Opt. Soc. Am. A 21:1841-1854). The MISTRAL approach has been shown to yield object reconstructions with excellent edge preservation and photometric precision when used to process astronomical images. AIDA improves upon the original MISTRAL implementation. AIDA, written in Python, can deconvolve multiple frame data and three-dimensional image stacks encountered in adaptive optics and light microscopic imaging.

Hom, Erik; Haase, Sebastian; Marchis, Franck



Two-kernel image deconvolution.  


A method for deconvolution of the true image of an object from two recorded images is proposed. The two images must be made by an imaging system with two different but interconnected kernels. The method is formulated as a system of Fredholm equations of the first kind, reduced to a single functional equation in Fourier space. The kernels of the system and the true image of the object are found from the same recorded images. PMID:24216949
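The benefit of a second kernel is easiest to see in Fourier space, where each convolution becomes a product. The sketch below assumes the kernels are known and combines the two observations in a least-squares sense; the paper's actual reduction to a single functional equation, which also recovers the kernels, differs. All signals are toy values:

```python
import cmath

n = 16

def dft(x, sign=-1):
    # naive DFT; sign=+1 with the 1/n factor gives the inverse transform
    out = [sum(x[t] * cmath.exp(sign * 2j * cmath.pi * k * t / n)
               for t in range(n)) for k in range(n)]
    return out if sign == -1 else [v / n for v in out]

def cconv(x, h):
    # circular convolution, so the DFT product relation is exact
    return [sum(x[(i - j) % n] * h[j] for j in range(n)) for i in range(n)]

f = [0.0] * n
f[5], f[9] = 1.0, 2.0                      # true image: two point sources
h1 = [0.5, 0.3, 0.2] + [0.0] * (n - 3)     # two different blur kernels
h2 = [0.2, 0.2, 0.2, 0.2, 0.2] + [0.0] * (n - 5)
g1, g2 = cconv(f, h1), cconv(f, h2)        # the two recorded images

F1, F2 = dft(g1), dft(g2)
H1, H2 = dft(h1), dft(h2)
# least-squares combination of the two observations, frequency by frequency
F = [(H1[k].conjugate() * F1[k] + H2[k].conjugate() * F2[k]) /
     (abs(H1[k]) ** 2 + abs(H2[k]) ** 2) for k in range(n)]
f_rec = [v.real for v in dft(F, sign=+1)]
```

Frequencies where one kernel's transfer function vanishes are still recoverable as long as the other kernel does not vanish there too, which is the basic advantage of a two-kernel system.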

Gorelik, V



Analysis of Machining Stability for a Parallel Machine Tool  

Microsoft Academic Search

Machine tool chatter is a self-excited vibration generated by chip thickness variation. It severely degrades the quality of the machined surface. The incidence of chatter is greatly affected by the dynamic characteristics of machine tool structure. This article extends chatter stability analysis to a machine tool equipped with a parallel mechanism. The vibration model of a parallel machine tool is

D. Hong; S. Kim; W. C. Choi; J.-B. Song



The Analysis of Stone Tool Procurement, Production, and Maintenance  

E-print Network

The Analysis of Stone Tool Procurement, Production, and Maintenance. William Andrefsky Jr. Analysts of stone tools and their production debris have made significant progress in understanding the relationship between stone tools and human organizational strategies. Stone tools are understood to be morphologically

Kohler, Tim A.


Approximate deconvolution large eddy simulation of a barotropic ocean circulation model  

NASA Astrophysics Data System (ADS)

We investigate a new large eddy simulation closure modeling strategy for two-dimensional turbulent geophysical flows. This closure modeling approach utilizes approximate deconvolution, which is based solely on mathematical approximations and does not employ additional phenomenological arguments. The new approximate deconvolution model is tested in the numerical simulation of the wind-driven circulation in a shallow ocean basin, a standard prototype of more realistic ocean dynamics. The model employs the barotropic vorticity equation driven by a symmetric double-gyre wind forcing, which yields a four-gyre circulation in the time mean. The approximate deconvolution model yields the correct four-gyre circulation structure predicted by a direct numerical simulation, but on a coarser mesh and at a fraction of the computational cost. This first step in the numerical assessment of the new model shows that approximate deconvolution could be a viable tool for under-resolved computations in the large eddy simulation of more realistic turbulent geophysical flows.
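The approximate deconvolution at the heart of such closures is often the truncated van Cittert series: given a filtered field u-bar = G(u), approximate the unfiltered field by u* = sum over k = 0..N of (I - G)^k applied to u-bar. A one-dimensional sketch (the filter, grid, and field below are illustrative choices, not the paper's setup):

```python
import math

n = 64
# a smooth field with a low and a moderately high wavenumber component
u = [math.sin(2 * math.pi * i / n) + 0.3 * math.sin(2 * math.pi * 6 * i / n)
     for i in range(n)]

def G(v):
    # periodic three-point (1, 2, 1)/4 filter
    return [(v[(i - 1) % n] + 2 * v[i] + v[(i + 1) % n]) / 4 for i in range(n)]

def deconvolve(vbar, N=5):
    """Truncated van Cittert series: u* = sum_{k<=N} (I - G)^k vbar."""
    term, total = vbar[:], vbar[:]
    for _ in range(N):
        gt = G(term)
        term = [term[i] - gt[i] for i in range(n)]   # apply (I - G) again
        total = [total[i] + term[i] for i in range(n)]
    return total

ubar = G(u)
ustar = deconvolve(ubar)
err_bar = max(abs(ubar[i] - u[i]) for i in range(n))    # filtering error
err_star = max(abs(ustar[i] - u[i]) for i in range(n))  # after deconvolution
```

Because each mode's filter gain is between 0 and 1, the series converges per mode, and a handful of terms recovers most of the attenuated content; this is the purely mathematical approximation the abstract contrasts with phenomenological closures.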

Staples, Anne; San, Omer



Orbit Analysis Tools Software Version 1 for Windows User's Guide.  

National Technical Information Service (NTIS)

The Orbit Analysis Tools Software (OATS) is a mission planning and analysis tool for earth-orbiting satellites. OATS evolved from a collection of software tools developed by the Astrodynamics and Space Applications Office of the Naval Center for Space Tec...

J. W. Middour, A. S. Hope, J. L. Cox, R. K. Llewellyn



Data Analysis Tools for NSTX-U Physics Meeting  

E-print Network

Data Analysis Tools for NSTX-U. Bill Davis, Stan Kaye. Physics Meeting, B-318, Aug. 26, 2013. Overview: web tools in depth; overlaying in different ways; browsing fast camera data; EFIT movies and EFITviewer.

Princeton Plasma Physics Laboratory


Multi-Mission Power Analysis Tool  

NASA Technical Reports Server (NTRS)

Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.

Broderick, Daniel



Simplified building energy analysis tool for architects  

NASA Astrophysics Data System (ADS)

Energy Modeler is an energy software program designed to study the relative change in energy use (heating, cooling, and lighting loads) across different architectural design schemes. This research focuses on developing a tool to improve the energy efficiency of the built environment, and studied the impact of different architectural design responses in two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects to perceive, quickly visualize, and compare the energy performance and savings of different design schemes. The buildings in which we live and work have a great impact on our natural environment, and energy savings and consumption reductions in buildings are among the best routes to environmental sustainability, since they reduce the depletion of the world's fossil fuels (oil, natural gas, coal, etc.). When architects set about designing an environmentally responsive building for an owner or the public, they often lack the energy-based information and design tools to tell them whether building loads and energy consumption are responsive to the modifications they have made. Buildings are dynamic in nature and change over time, with many design variables involved, so architects need energy-based rules and tools to assist them in the design process. Energy-efficient design for sustainable solutions requires attention throughout the design process and is closely tied to architectural decisions; early involvement is the only guaranteed way of properly considering fundamental building design issues related to site, form, and exposure. The research presents the methodology and process that lead to the discussion of the research findings.
The innovative work is to make these tools applicable at the earliest stage of design, where a more informed analysis of possible alternatives can yield the most benefit and the greatest cost savings, both economic and environmental. This is where computer modeling and simulation can really lead to better, more energy-efficient buildings. Both apply to the internal environment and human comfort, and to the environmental impact of the surroundings.

Chaisuparasmikul, Pongsak


PyRAT - python radiography analysis tool (u)  

SciTech Connect

PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on Linux and Windows platforms. It is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process, and it uses the NOMAD mixed-variable optimization tool to perform the optimization.

Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory; Armstrong, Jerawan C [Los Alamos National Laboratory



Built Environment Analysis Tool: April 2013  

SciTech Connect

This documentation describes the development of the tool, which was created to evaluate the effects of built environment scenarios on transportation energy use and greenhouse gas (GHG) emissions. It also provides guidance on how to apply the tool.

Porter, C.



Analysis of Ten Reverse Engineering Tools  

Microsoft Academic Search

Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools provide various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization

Jussi Koskinen; Tero Lehmonen



ISHM Decision Analysis Tool: Operations Concept  

NASA Technical Reports Server (NTRS)

The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or the crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages, and the combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems, however, it is not always possible to explain a set of alarms with a single root cause; rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew with relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. This paper describes both of these aspects.



Numerical Uncertainty Quantification for Radiation Analysis Tools  

NASA Technical Reports Server (NTRS)

Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. This paper addresses two sources of uncertainty in geometric discretization that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. The first source is ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases, so a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source results from the interpolation over the dose-versus-depth curves needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
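The ray-count trade-off can be illustrated with a toy Monte Carlo (the smooth "thickness" function and all numbers are made up, standing in for a real vehicle geometry): the statistical uncertainty of the directionally averaged thickness falls roughly as one over the square root of the number of rays, while cost grows linearly.

```python
import math
import random

random.seed(42)

def thickness(u):
    # illustrative shield thickness seen along a direction with polar cosine u
    return 2.0 + math.cos(3 * u)

def estimate(n_rays):
    # Monte Carlo estimate of the directionally averaged thickness
    return sum(thickness(random.uniform(-1, 1)) for _ in range(n_rays)) / n_rays

def spread(n_rays, trials=200):
    # empirical standard deviation of the estimator at a given ray count
    est = [estimate(n_rays) for _ in range(trials)]
    mu = sum(est) / trials
    return (sum((e - mu) ** 2 for e in est) / trials) ** 0.5

s_small, s_large = spread(100), spread(10000)   # 100x the rays, ~10x tighter
```

Plotting `spread` against ray count (or the analogous convergence study over shield-thickness grids) gives exactly the cost-versus-uncertainty curve the abstract describes optimizing.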

Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha



Deconvolution of fluorescence decays and estimation errors  

NASA Astrophysics Data System (ADS)

The non-iterative Prony's method is discussed for the deconvolution of multi-exponential fluorescence decays. The performance of this algorithm for a two-exponential decay process is evaluated, using a Monte Carlo simulation, in terms of the estimation errors caused by signal noise. The results show that the performance of Prony's method can be greatly improved by selecting an optimized observation window width and a few algorithm-related parameters. A comparison between Prony's method and the Marquardt least-squared-error algorithm is also made, showing that the performance of the former is close to that of the latter with a 98% reduction in running time. The application of Prony's algorithm in real-time, quasi-distributed temperature sensor systems is discussed, and experimental results are presented to justify the use of Prony's method in practical double-exponential fluorescence decay analysis.
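For the two-exponential case, the non-iterative core of Prony's method fits in a few lines. With uniform sampling, y_k = c1*r1^k + c2*r2^k obeys the linear recursion y_{k+2} = a1*y_{k+1} + a0*y_k, so two linear equations give (a1, a0) and the roots of the associated quadratic give the decay rates. A noiseless sketch with made-up constants (real use adds least-squares over many samples to handle noise):

```python
import math

dt, tau1, tau2 = 0.1, 1.0, 0.25            # illustrative lifetimes (seconds)
c1, c2 = 3.0, 1.5
y = [c1 * math.exp(-k * dt / tau1) + c2 * math.exp(-k * dt / tau2)
     for k in range(8)]                    # sampled double-exponential decay

# solve [[y1, y0], [y2, y1]] @ [a1, a0]^T = [y2, y3]^T by Cramer's rule
det = y[1] * y[1] - y[0] * y[2]
a1 = (y[2] * y[1] - y[0] * y[3]) / det
a0 = (y[1] * y[3] - y[2] * y[2]) / det

# roots of z^2 - a1*z - a0 are r = exp(-dt/tau) for each component
disc = math.sqrt(a1 * a1 + 4 * a0)
r_fast, r_slow = (a1 - disc) / 2, (a1 + disc) / 2
tau_fast = -dt / math.log(r_fast)
tau_slow = -dt / math.log(r_slow)
```

Everything is a direct linear solve plus a quadratic root, which is why the abstract reports run times two orders of magnitude below iterative Marquardt fitting.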

Zhang, Zhiyi; Sun, Tong; Grattan, Kenneth T. V.; Palmer, Andrew W.



Spacecraft Electrical Power System (EPS) generic analysis tools and techniques  

NASA Technical Reports Server (NTRS)

An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described, and the EPS analysis tools are surveyed.

Morris, Gladys M.; Sheppard, Mark A.



Comparative evaluation of software for deconvolution of metabolomics data based on GC-TOF-MS  

Microsoft Academic Search

Traditional options for deconvolution of data from gas chromatography-mass spectrometry (GC-MS) experiments have mostly been confined to semi-automated methods, which cannot compete with the high-throughput, rapid analysis required in metabolomics. In the present study, data sets acquired using GC with time-of-flight MS (GC-TOF-MS) were processed using three different deconvolution software packages (LECO ChromaTOF, AMDIS, and SpectralWorks AnalyzerPro). We paid attention to

Warwick B. Dunn; Hailin Shen; Douglas B. Kell



Principal components analysis as a tool for Quaternary paleoclimatic research  

SciTech Connect

Nine small lakes on southeast Baffin Island, NWT, Canada, were cored, and the sediments retrieved were analyzed for sediment size and composition, magnetic susceptibility, sediment geochemistry, organic matter content, and carbon isotopic composition. Age control was obtained from 85 AMS radiocarbon dates. In total, 1,847 measurements were made on twelve cores. The size of the data set precluded visual analysis of the trends within each variable's data set, so the paleoenvironmental signal was deconvoluted using principal components analysis and regression. Principal components analysis was carried out on the entire data set to determine which variables caused most of the variance within the overall signal; three principal components axes (PCAs) could account for 79% of the total variance. For each PCA, the most closely correlated variable was chosen (sand content, total organic matter content, and sedimentation rate), and for each lake core this variable was regressed against time. Residuals from the regression trend were then derived and normalized to a Z score, and the Z scores for each variable were plotted against age. Within 500-year timeslots, the median residual Z score was determined, giving a stepped record of residuals throughout the Holocene and indicating periods of significant environmental change within the lakes' watersheds. Comparison with previously obtained pollen and diatom records from the same area showed similarity and also illustrated important local differences.
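The regression-residual step of this procedure (with the PCA variable selection omitted) can be sketched with made-up ages and values: regress a proxy variable against age, normalize the residuals to Z scores, and take the median Z inside 500-year bins, so that an interval departing from the long-term trend stands out.

```python
import statistics

ages = [100 * i for i in range(30)]                     # years BP, synthetic
vals = [0.01 * a + (1.5 if 1000 <= a < 1500 else 0.0)   # trend plus an "event"
        for a in ages]

# ordinary least-squares line of value against age
n = len(ages)
mean_a, mean_v = sum(ages) / n, sum(vals) / n
slope = (sum((a - mean_a) * (v - mean_v) for a, v in zip(ages, vals))
         / sum((a - mean_a) ** 2 for a in ages))
intercept = mean_v - slope * mean_a

resid = [v - (intercept + slope * a) for a, v in zip(ages, vals)]
sd = statistics.pstdev(resid)
z = [r / sd for r in resid]                             # residual Z scores

bins = {}
for a, zz in zip(ages, z):                              # 500-year timeslots
    bins.setdefault(a // 500, []).append(zz)
median_z = {b: statistics.median(v) for b, v in bins.items()}
```

The bin containing the synthetic "event" (1000-1400 years BP) carries the largest median residual Z, which is the stepped-record signal of environmental change the authors look for.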

Miller, R.J.O. (Univ. of Colorado, Boulder, CO (United States). Dept. of Geological Sciences)



ProMAT: protein microarray analysis tool  

SciTech Connect

Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.

White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.



Bayesian approach based blind image deconvolution with fuzzy median filter  

NASA Astrophysics Data System (ADS)

The inverse problem associated with reconstruction of Poisson blurred images has attracted attention in recent years. In this paper, we propose an alternative unified approach to blind image deconvolution problem using fuzzy median filter as Gibbs prior to model the nature of inter pixel interaction for better edge preserving reconstruction. The performance of the algorithm at various SNR levels has been studied quantitatively using PSNR, RMSE and universal quality index (UQI). Comparative analysis with existing methods has also been carried out.

Mohan, S. Chandra; Rajan, K.; Srinivasan, R.



Interactive Graphics Tools for Analysis of MOLA and Other Data  

NASA Technical Reports Server (NTRS)

We have developed several interactive analysis tools, based on the IDL programming language, for the analysis of Mars Orbiter Laser Altimeter (MOLA) profile and gridded data; they are available to the general community.

Frey, H.; Roark, J.; Sakimoto, S.



A Multidimensional Analysis Tool for Visualizing Online Interactions  

ERIC Educational Resources Information Center

This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their

Kim, Minjeong; Lee, Eunchul



The CoastWatch Data Analysis Tool Peter Hollemans  

E-print Network

The CoastWatch Data Analysis Tool. Peter Hollemans, SES Inc., Contractor for NOAA/NESDIS. October 2003. Preferences for each open file: initial data enhancement range, color palette, date and time.


Distribution System Analysis Tools for Studying High Penetration of PV  

E-print Network

Distribution System Analysis Tools for Studying High Penetration of PV with Grid Support Features. In order to understand and analyze the impact of high penetration of inverter-interfaced PV


General Mission Analysis Tool (GMAT) User's Guide (Draft)  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about configuring objects/resources, object fields (quick look-up tables), and commands and events.

Hughes, Steven P.



A coverage analysis tool for the effectiveness of software testing  

Microsoft Academic Search

This paper describes the software testing and analysis tool, ATAC (Automatic Test Analysis for C), developed as a research instrument to measure the effectiveness of testing data. It is also a tool to facilitate the design and evaluation of test cases during software development. To demonstrate the capability and applicability of ATAC, the authors obtained 12 program versions of a

Michael R. Lyu; J. R. Horgan; Saul London



Deconvolution of wellbore pressure and flow rate  

SciTech Connect

Determination of the influence function of a well/reservoir system from the deconvolution of wellbore flow rate and pressure is presented. Deconvolution is fundamental and is particularly applicable to system identification. A variety of different deconvolution algorithms are presented. The simplest algorithm is a direct method that works well for data without measurement noise but that fails in the presence of even small amounts of noise. The authors show, however, that a modified algorithm that imposes constraints on the solution set works well, even with significant measurement errors.

Kuchuk, F.J. (Schlumberger-Doll Research Center, Ridgefield, CT (USA)); Carter, R.G. (National Aeronautics and Space Administration, Hampton, VA (USA). Langley Research Center); Ayestaran, L. (Schlumberger Technical Services, Dubai (AE))
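The abstract does not spell out the authors' constraint scheme, but the stabilizing effect of constraints can be sketched on a synthetic example (not the authors' algorithm; all names and values below are illustrative). Pressure is modeled as the convolution of a known flow-rate history with an unknown, non-negative influence function, and the non-negativity constraint is imposed via non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic well test: pressure = convolution of flow rate with an
# unknown, non-negative influence function (illustrative values only).
rng = np.random.default_rng(0)
n = 50
g_true = np.exp(-np.arange(n) / 10.0)        # assumed influence function
q = rng.uniform(0.5, 1.5, n)                 # flow-rate history

# Lower-triangular convolution matrix: p[i] = sum_{j<=i} q[i-j] * g[j]
Q = np.array([[q[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)])
p = Q @ g_true + 0.05 * rng.standard_normal(n)   # noisy pressure record

g_direct = np.linalg.lstsq(Q, p, rcond=None)[0]  # direct solve: noise-sensitive
g_con, _ = nnls(Q, p)                            # constrained solve: g >= 0
```

The direct solve tends to amplify measurement noise, while restricting the estimate to the physically admissible non-negative set tends to keep it well behaved, which mirrors the qualitative point made in the abstract.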



Fuzzy logic components for iterative deconvolution systems  

NASA Astrophysics Data System (ADS)

Deconvolution systems rely heavily on expert knowledge and would benefit from approaches that capture this expert knowledge. Fuzzy logic is an approach that is used to capture expert knowledge rules and produce outputs that range in degree. This paper describes a fuzzy deconvolution system that integrates traditional Richardson-Lucy deconvolution with fuzzy components. The system is intended for restoration of 3D widefield images taken under conditions of refractive index mismatch. The system uses a fuzzy rule set for calculating sample refractive index, a fuzzy median filter for inter-iteration noise reduction, and a fuzzy rule set for stopping criteria.

Northan, Brian M.
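For context, the traditional Richardson-Lucy update that the fuzzy components are built around can be sketched in one dimension (a generic textbook version, not the authors' code; the signal and PSF are synthetic):

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=100):
    """Plain 1D Richardson-Lucy iteration (multiplicative, non-negative)."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1]                     # mirrored PSF for the correction step
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        conv = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(conv, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_flip, mode="same")
    return estimate

# Synthetic example: two point sources blurred by a Gaussian PSF
x = np.zeros(64)
x[20], x[40] = 1.0, 0.5
psf = np.exp(-0.5 * (np.arange(-5, 6) / 1.5) ** 2)
blurred = np.convolve(x, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf)
```

In the paper's system, the fixed iteration count would be replaced by the fuzzy stopping rule, and a fuzzy median filter would run between iterations.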



An Integrated Tool for System Analysis of Sample Return Vehicles  

NASA Technical Reports Server (NTRS)

The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.



Vibroseis deconvolution: A comparison of pre and post correlation vibroseis deconvolution data in real noisy data  

NASA Astrophysics Data System (ADS)

Vibroseis is a source used commonly for inland seismic exploration. This non-destructive source is often used in urban areas with strong environmental noise. The main goal of seismic data processing is to increase the signal-to-noise ratio, and deconvolution is a decisive step toward that goal. Vibroseis seismic data do not meet the basic minimum-phase assumption required for spiking and predictive deconvolution; therefore various techniques, such as phase shifting, are applied to the data so that deconvolution of vibroseis data can be performed successfully. This work analyzes the application of deconvolution techniques before and after cross-correlation on a real data set acquired for high resolution prospection of deep aquifers. In particular, we compare pre-correlation spiking and predictive deconvolution with Wiener filtering and with post-correlation time-variant spectral whitening deconvolution. The main result is that at small offsets, post-cross-correlation spectral whitening deconvolution and pre-correlation spiking deconvolution yield comparable results, while for large offsets the best result is obtained by applying a pre-cross-correlation predictive deconvolution.

Baradello, Luca; Accaino, Flavio
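A minimal, time-invariant sketch of spectral whitening deconvolution (the paper uses a time-variant version, which this does not attempt): divide the spectrum by a smoothed copy of its own amplitude, keeping the phase. The trace, wavelet, and smoothing length below are illustrative.

```python
import numpy as np

def spectral_whiten(trace, smooth=11, eps=1e-8):
    """Divide out a smoothed version of the amplitude spectrum, keeping phase."""
    spec = np.fft.rfft(trace)
    amp = np.abs(spec)
    kernel = np.ones(smooth) / smooth
    amp_smooth = np.convolve(amp, kernel, mode="same")  # running-mean smoothing
    return np.fft.irfft(spec / (amp_smooth + eps), n=len(trace))

# Colored synthetic trace: white noise convolved with a decaying wavelet
rng = np.random.default_rng(1)
noise = rng.standard_normal(512)
wavelet = np.exp(-np.arange(64) / 8.0)
trace = np.convolve(noise, wavelet)[:512]

white = spectral_whiten(trace)

def band_ratio(x):
    """Low-band vs mid-band mean spectral amplitude (flatness indicator)."""
    a = np.abs(np.fft.rfft(x))
    return a[1:20].mean() / a[120:220].mean()
```

After whitening, the low-frequency dominance imposed by the wavelet is largely removed, so the band ratio moves toward 1.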



Model analysis tools in the Virtual Model Repository (VMR)  

NASA Astrophysics Data System (ADS)

The Virtual Model Repository (VMR) provides scientific analysis tools for a wide variety of numerical models of the Earth's magnetosphere. Data discovery, visualization tools and data/model comparisons are provided in a consistent and intuitive format. A large collection of numerical model runs are available to analyze, including the large Earth magnetosphere event run library at the CCMC and many runs from the University of Michigan. Relevant data useful for data/model comparisons is found using various APIs and included in many of the visualization tools. Recent additions to the VMR include a comprehensive suite of tools for analysis of the Global Ionosphere Thermosphere Model (GITM).

De Zeeuw, D.; Ridley, A. J.



Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool  

NASA Technical Reports Server (NTRS)

This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation (also provided with the ETA Tool software release package) that were used to generate the reports presented in the manual.

Maul, William A.; Fulton, Christopher E.



The role and selection of the filter function in Fourier self-deconvolution revisited.  


Overlapped bands often appear in applications of infrared spectroscopy, for instance in the analysis of the amide I band of proteins. Fourier self-deconvolution (FSD) is a popular band-narrowing mathematical method, allowing for the resolution of overlapped bands. The filter function used in FSD plays a significant role in the factor by which the deconvolved bands are actually narrowed (the effective narrowing), as well as in the final signal-to-noise degradation induced by FSD. Moreover, the filter function determines, to a good extent, the band shape of the deconvolved bands. For instance, the intensity of the harmful side-lobe oscillations that appear in over-deconvolution depends strongly on the filter function used. In the present paper we characterized the resulting band shape, effective narrowing, and signal-to-noise degradation in infra-, self-, and over-deconvolution conditions for several filter functions: Triangle, Bessel, Hanning, Gaussian, Sinc², and Triangle². We also introduced and characterized new filters based on the modification of the Blackman filter. Our conclusion is that the Bessel filter (in infra-, self-, and mild over-deconvolution), the newly introduced BL3 filter (in self- and mild/moderate over-deconvolution), and the Gaussian filter (in moderate/strong over-deconvolution) are the most suitable filter functions to be used in FSD. PMID:19589217

Lorenz-Fonfría, Víctor A; Padrós, Esteve
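A compact numerical sketch of FSD on a single synthetic Lorentzian band, using the triangle filter discussed above (band parameters and cutoff are illustrative; in practice the cutoff trades narrowing against noise amplification): multiply the Fourier transform of the spectrum by the inverse of the Lorentzian decay, apodize with the filter, and transform back.

```python
import numpy as np

n, nu0, gamma = 1024, 512, 10.0
nu = np.arange(n)
spectrum = gamma**2 / ((nu - nu0) ** 2 + gamma**2)   # Lorentzian band, FWHM = 2*gamma

k = np.arange(n // 2 + 1)
decon = np.exp(2 * np.pi * gamma * k / n)            # cancels the Lorentzian decay in the Fourier domain
k_cut = 80
filt = np.clip(1.0 - k / k_cut, 0.0, None)           # triangle filter (apodization)

narrowed = np.fft.irfft(np.fft.rfft(spectrum) * decon * filt, n=n)

def fwhm_points(y):
    """Width at half maximum, counted in sample points."""
    return int(np.sum(y > 0.5 * y.max()))
```

The deconvolved band is narrower than the original while its position is preserved; with the triangle filter the resulting band shape is Fejér-kernel-like, which is one concrete instance of the filter-dependent band shapes the paper characterizes.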



Perfusion Quantification Using Gaussian Process Deconvolution  

E-print Network

Perfusion quantification using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residue function. The first perfusion measurements using DSC-MRI were performed by Villringer et al.


IC failure analysis: techniques and tools for quality reliability improvement  

Microsoft Academic Search

The role of failure analysis is discussed. Failure analysis techniques and tools, including electrical measurements, optical microscopy, thermal imaging analysis, electron beam techniques, light emission microscopy, ion beam techniques, and scanning probe microscopy, are reviewed. Opportunities for advances in the field of IC failure analysis are considered




CCMR: Wound Dressing Tool and Wound Analysis  

NSDL National Science Digital Library

The goal of our project is to develop a Wound Dressing Tool (WDT) that, in addition to extracting overabundant chemicals as the VAC system does, allows for variable rates of mass transfer and gives clinicians a way to monitor the fluid chemical composition of the wound bed during the healing and treatment processes.

Men, Shannon



Tools for Knowledge Analysis, Synthesis, and Sharing  

ERIC Educational Resources Information Center

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own

Medland, Michael B.



Benefits of image deconvolution in CCD imagery  

NASA Astrophysics Data System (ADS)

We show how wavelet-based image deconvolution can provide to wide-field CCD telescopes an increase in limiting magnitude of about 0.6 mag in R and significant deblending improvement. Astrometric accuracy is not distorted; therefore, the feasibility of the technique for astrometric projects is validated. We apply the deconvolution process to a set of images from the recently refurbished Baker-Nunn camera at Rothney Astrophysical Observatory.

Fors, O.; Merino, M.; Otazu, X.; Cardinal, R.; Núñez, J.; Hildebrand, A.



Minimum entropy deconvolution and blind equalisation  

NASA Technical Reports Server (NTRS)

Relationships between minimum entropy deconvolution, developed primarily for geophysics applications, and blind equalization are pointed out. It is seen that a large class of existing blind equalization algorithms are directly related to the scale-invariant cost functions used in minimum entropy deconvolution. Thus the extensive analyses of these cost functions can be directly applied to blind equalization, including the important asymptotic results of Donoho.

Satorius, E. H.; Mulligan, J. J.
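The scale-invariant cost function at the heart of minimum entropy deconvolution is Wiggins' varimax norm, which rewards spiky (sparse) outputs over dense Gaussian-like ones. A small sketch on synthetic signals:

```python
import numpy as np

def varimax(y):
    """Wiggins' scale-invariant varimax norm: sum(y^4) / sum(y^2)^2."""
    y2 = y ** 2
    return (y2 ** 2).sum() / y2.sum() ** 2

spiky = np.zeros(100)
spiky[10], spiky[60] = 3.0, -2.0              # sparse, reflectivity-like series
rng = np.random.default_rng(1)
smooth = rng.standard_normal(100)             # dense, Gaussian-like series
```

Blind equalization cost functions of the kurtosis-maximizing family share exactly this scale invariance, which is the connection the abstract points out.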



Deconvolution of isobaric interferences in mass spectra  

Microsoft Academic Search

The concept of isobar deconvolution using the mass domain and signal intensity based domains is described. The intensity domain-based approach employs the reconstruction of the observed isotope pattern from the isolated patterns of the isobaric species. The quantitative information is adjusted with the use of the least squares algorithm. The mass domain-based approach employs signal deconvolution by forming Gaussian components

Juris Meija; Joseph A. Caruso
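The intensity-domain least-squares adjustment can be sketched as an ordinary linear unmixing problem: the observed isotope pattern is a weighted sum of the pure patterns of the isobaric species, and the weights are recovered by least squares. The patterns below are made up for illustration.

```python
import numpy as np

# Hypothetical isotope patterns (one column per isobaric species) and an
# observed pattern that is an unknown mixture of the two, plus small noise.
patterns = np.array([
    [0.60, 0.05],
    [0.30, 0.55],
    [0.10, 0.40],
])
true_amounts = np.array([2.0, 3.0])
observed = patterns @ true_amounts + np.array([0.01, -0.02, 0.015])

# Least-squares adjustment of the species amounts
amounts, *_ = np.linalg.lstsq(patterns, observed, rcond=None)
```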



Extrinsic Geometrical Methods for Neural Blind Deconvolution  

NASA Astrophysics Data System (ADS)

The present contribution discusses a Riemannian-gradient-based algorithm and a projection-based learning algorithm over a curved parameter space for single-neuron learning. We consider the `blind deconvolution' signal processing problem. The learning rule naturally arises from a criterion-function minimization over the unitary hyper-sphere setting. We consider the blind deconvolution performances of the two algorithms as well as their computational burden and numerical features.

Fiori, Simone



Failure Environment Analysis Tool (FEAT)
NASA Technical Reports Server (NTRS)

The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. 
Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. 
It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A

Pack, G.



A CDMA Data Measurement and Analysis Tool  

Microsoft Academic Search

We present DQoS (Data Quality-of-Service), a tool for measuring and analyzing CDMA data services. For a wireless connection, DQoS collects traces at its two endpoints and at multiple layers including application, IP, TCP, PPP and physical layers. It then synchronizes these traces and aggregates the information to produce graphical plots that illustrate throughput, latencies, protocol (e.g. TCP) behavior and bad

M. Andrews; L. Zhang



Development of wavelet analysis tools for turbulence  

NASA Technical Reports Server (NTRS)

Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.



Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.  


This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with the Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows the analysis of complex stimulated luminescence curves into their components and the evaluation of the associated luminescence parameters. PMID:21765155

Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G
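Outside Excel, the same Solver-style decomposition can be reproduced with any nonlinear least-squares routine. This sketch fits two Gaussian components (a simplification; real glow-curve analysis uses kinetic peak shapes, not Gaussians) to a synthetic curve, with a rough initial guess playing the role of Solver's starting values:

```python
import numpy as np
from scipy.optimize import curve_fit

def two_peaks(t, a1, c1, w1, a2, c2, w2):
    """Sum of two Gaussian components (stand-in for glow-curve peaks)."""
    g = lambda a, c, w: a * np.exp(-0.5 * ((t - c) / w) ** 2)
    return g(a1, c1, w1) + g(a2, c2, w2)

t = np.linspace(0, 100, 400)
true_params = (5, 35, 6, 3, 60, 9)
rng = np.random.default_rng(2)
curve = two_peaks(t, *true_params) + 0.05 * rng.standard_normal(t.size)

p0 = (4, 30, 5, 4, 65, 10)       # rough initial guess, as one would give Solver
popt, _ = curve_fit(two_peaks, t, curve, p0=p0)
```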



Automated Security Protocol Analysis With the AVISPA Tool  

Microsoft Academic Search

The AVISPA Tool is a push-button tool for the Automated Validation of Internet Security Protocols and Applications. It provides a modular and expressive formal language for specifying protocols and their security properties, and integrates different back-ends that implement a variety of automatic protocol analysis techniques. Experimental results, carried out on a large library of Internet security protocols, indicate that the

Luca Viganò



Multiple Parametric Circuit Analysis Tool for Detectability Estimation  

Microsoft Academic Search

In this paper a software tool is presented which is capable of producing testability metrics for analog and mixed-signal circuits. These metrics are obtained by performing probabilistic analysis techniques. The presented tool acts as a shell utilizing the power of state of the art external schematic and simulation programs and offers to the user a graphical interface for circuit design

Michael G. Dimopoulos; Dimitris K. Papakostas; Dimitrios K. Konstantinou; Alexios D. Spyronasios; Alkis A. Hatzopoulos



Zoo: A tool for traffic analysis and characterization User manual  

E-print Network

This paper introduces the Zoo tool that has been designed in the French Metropolis project, explaining why Zoo speaks French; we will soon teach Zoo how to speak English.

Owezarski, Philippe


RPA: Design Tool for Liquid Rocket Engine Analysis  

Microsoft Academic Search

RPA (Rocket Propulsion Analysis) is a design tool for the performance prediction of liquid-propellant rocket engines. RPA is written in Java and can be used under any operating system that has the Java Runtime Environment installed (e.g., Mac OS X, Sun Solaris, MS Windows, any Linux, etc.). The tool can be used either as a standalone GUI application

Alexander Ponomarenko


Quasar: A New Tool for Concurrent Ada Programs Analysis  

E-print Network

We present a new tool, Quasar, which is based on ASIS and which makes full use of the concept of patterns. We demonstrate the usefulness of Quasar by analyzing several variations of a non-trivial concurrent program.

Evangelista, Sami


A Robust Deconvolution Method based on Transdimensional Hierarchical Bayesian Inference  

NASA Astrophysics Data System (ADS)

Analysis of P-S and S-P conversions allows us to map receiver side crustal and lithospheric structure. This analysis often involves deconvolution of the parent wave field from the scattered wave field as a means of suppressing source-side complexity. A variety of deconvolution techniques exist including damped spectral division, Wiener filtering, iterative time-domain deconvolution, and the multitaper method. All of these techniques require estimates of noise characteristics as input parameters. We present a deconvolution method based on transdimensional Hierarchical Bayesian inference in which both noise magnitude and noise correlation are used as parameters in calculating the likelihood probability distribution. Because the noise for P-S and S-P conversion analysis in terms of receiver functions is a combination of both background noise - which is relatively easy to characterize - and signal-generated noise - which is much more difficult to quantify - we treat measurement errors as an unknown quantity, characterized by a probability density function whose mean and variance are model parameters. This transdimensional Hierarchical Bayesian approach has been successfully used previously in the inversion of receiver functions in terms of shear and compressional wave speeds of an unknown number of layers [1]. In our method we used a Markov chain Monte Carlo (MCMC) algorithm to find the receiver function that best fits the data while accurately assessing the noise parameters. In order to parameterize the receiver function we model the receiver function as an unknown number of Gaussians of unknown amplitude and width. The algorithm takes multiple steps before calculating the acceptance probability of a new model, in order to avoid getting trapped in local misfit minima. 
Using both observed and synthetic data, we show that the MCMC deconvolution method can accurately obtain a receiver function as well as an estimate of the noise parameters given the parent and daughter components. Furthermore, we demonstrate that this new approach is far less susceptible to generating spurious features even at high noise levels. Finally, the method yields not only the most-likely receiver function, but also quantifies its full uncertainty. [1] Bodin, T., M. Sambridge, H. Tkalčić, P. Arroucau, K. Gallagher, and N. Rawlinson (2012), Transdimensional inversion of receiver functions and surface wave dispersion, J. Geophys. Res., 117, B02301

Kolb, J.; Lekic, V.
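The full transdimensional, hierarchical sampler is beyond a short example, but the core idea (random-walk Metropolis over pulse parameters, with the noise level itself treated as an unknown model parameter) can be sketched for a single Gaussian pulse. Everything here is synthetic and fixed-dimension; it is not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(-5, 5, 200)

def pulse(amp, mu, width):
    return amp * np.exp(-0.5 * ((t - mu) / width) ** 2)

sigma_true = 0.1
data = pulse(1.0, 0.5, 0.8) + sigma_true * rng.standard_normal(t.size)

def log_like(theta):
    """Gaussian log-likelihood with the noise level as a model parameter."""
    amp, mu, width, log_sigma = theta
    sigma = np.exp(log_sigma)
    resid = data - pulse(amp, mu, width)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 - t.size * log_sigma

theta = np.array([0.8, 0.2, 1.0, np.log(0.3)])   # rough starting guess
step = np.array([0.02, 0.02, 0.02, 0.05])        # random-walk proposal widths
ll = log_like(theta)
samples = []
for _ in range(30000):
    proposal = theta + step * rng.standard_normal(4)
    ll_prop = log_like(proposal)
    if np.log(rng.uniform()) < ll_prop - ll:     # Metropolis acceptance
        theta, ll = proposal, ll_prop
    samples.append(theta.copy())
post = np.array(samples[15000:])                 # discard burn-in
```

The posterior samples recover both the pulse position and the noise level, illustrating how sampling the noise parameters jointly with the signal avoids having to fix them in advance.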



Lightweight object oriented structure analysis: Tools for building tools to analyze molecular dynamics simulations.  


LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. 2014 Wiley Periodicals, Inc. PMID:25327784

Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan



The environment power system analysis tool development program  

NASA Technical Reports Server (NTRS)

The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.



Visualizing patent statistics by means of social network analysis tools  

Microsoft Academic Search

The present paper reviews the literature on social network analysis with applications to bibliometric data, and in particular, patent information. Several approaches of network analysis are conducted in the field of optoelectronics to exemplify the power of network analysis tools. Cooperation networks between inventors and applicants are illustrated, emphasizing bibliometric measures such as activity, citation frequency, etc. as well as

Christian Sternitzke; Adam Bartkowski; Reinhard Schramm



An Analysis Tool for Flight Dynamics Monte Carlo Simulations  

E-print Network

[Front-matter excerpt from the thesis.] Tables include: Aerodynamic Flutter Variables for the Analysis of Failure Regions; Aerodynamic Flutter Ranking of Variable Combinations; Ascent Abort Individual Variables; Ascent Abort Monte Carlo Results. Figures include: Spacecraft Design and Analysis Cycle; Analysis Tool for Flight Dynamics Simulations.

Restrepo, Carolina 1982-



Value Analysis: A Tool for Community Colleges.  

ERIC Educational Resources Information Center

Adoption of a value analysis program is proposed to aid colleges in identifying and implementing educationally sound labor-saving devices and procedures, enabling them to meet more students' needs at less cost with no quality reduction and a minimum of staff resistance. Value analysis is defined as a method for studying how well a product does

White, Rita A.


TERPRED: A Dynamic Structural Data Analysis Tool  

PubMed Central

Computational protein structure prediction mainly involves the main-chain prediction and the side-chain conformation determination. In this research, we developed a new structural bioinformatics tool, TERPRED, for generating dynamic protein side-chain rotamer libraries. Compared with current various rotamer sampling methods, our work is unique in that it provides a method to generate a rotamer library dynamically based on small sequence fragments of a target protein. The Rotamer Generator provides a means for existing side-chain sampling methods using static pre-existing rotamer libraries, to sample from dynamic target-dependent libraries. Also, existing side-chain packing algorithms that require large rotamer libraries for optimal performance, could possibly utilize smaller, target-relevant libraries for improved speed. PMID:25302339

Walker, Karl; Cramer, Carole L.; Jennings, Steven F.; Huang, Xiuzhen



A 3D image analysis tool for SPECT imaging  

NASA Astrophysics Data System (ADS)

We have developed semi-automated and fully-automated tools for the analysis of 3D single-photon emission computed tomography (SPECT) images. The focus is on the efficient boundary delineation of complex 3D structures that enables accurate measurement of their structural and physiologic properties. We employ intensity based thresholding algorithms for interactive and semi-automated analysis. We also explore fuzzy-connectedness concepts for fully automating the segmentation process. We apply the proposed tools to SPECT image data capturing variation of gastric accommodation and emptying. These image analysis tools were developed within the framework of a noninvasive scintigraphic test to measure simultaneously both gastric emptying and gastric volume after ingestion of a solid or a liquid meal. The clinical focus of the particular analysis was to probe associations between gastric accommodation/emptying and functional dyspepsia. Employing the proposed tools, we outline effectively the complex three dimensional gastric boundaries shown in the 3D SPECT images. We also perform accurate volume calculations in order to quantitatively assess the gastric mass variation. This analysis was performed both with the semi-automated and fully-automated tools. The results were validated against manual segmentation performed by a human expert. We believe that the development of an automated segmentation tool for SPECT imaging of the gastric volume variability will allow for other new applications of SPECT imaging where there is a need to evaluate complex organ function or tumor masses.

Kontos, Despina; Wang, Qiang; Megalooikonomou, Vasileios; Maurer, Alan H.; Knight, Linda C.; Kantor, Steve; Fisher, Robert S.; Simonian, Hrair P.; Parkman, Henry P.
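The interactive, intensity-based thresholding step can be sketched on a synthetic 3D volume (the fuzzy-connectedness machinery is not attempted here; the "organ" below is a bright ball in a dim background):

```python
import numpy as np

def threshold_segment(volume, frac=0.5):
    """Boolean mask of voxels brighter than frac * global maximum."""
    return volume > frac * volume.max()

# Synthetic "organ": a bright ball inside a dim background
z, y, x = np.mgrid[0:32, 0:32, 0:32]
ball = (z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2 <= 8 ** 2
volume = 0.1 + 0.9 * ball.astype(float)

mask = threshold_segment(volume, 0.5)
volume_voxels = int(mask.sum())   # simple volume estimate in voxels
```

Counting voxels inside the delineated boundary is the same kind of volume calculation the abstract describes for assessing gastric volume variation, just on a toy geometry.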



Tool for Rapid Analysis of Monte Carlo Simulations  

NASA Technical Reports Server (NTRS)

Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
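The paper does not publish the tool's internals; the sketch below shows only the generic idea of pointing an analyst at the variables that drive failures, by ranking the shift in mean between failing and passing Monte Carlo runs (all data synthetic, variable names illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n_runs = 5000
inputs = rng.standard_normal((n_runs, 3))     # dispersed design variables (columns)

# Make failures depend mostly on variable 0 exceeding a threshold
fail = (inputs[:, 0] + 0.1 * rng.standard_normal(n_runs)) > 1.5

# Rank variables by the shift in mean between failing and passing runs
shift = np.abs(inputs[fail].mean(axis=0) - inputs[~fail].mean(axis=0))
driving = int(np.argmax(shift))
```

A real tool would use more discriminating statistics and run in parallel over much larger data sets, but the output is the same in spirit: a ranking that directs the analyst to the region of the input space that causes a given failure type.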



Tool for Rapid Analysis of Monte Carlo Simulations  

NASA Technical Reports Server (NTRS)

Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version, of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.



Decision Analysis Tools for Volcano Observatories  

Microsoft Academic Search

Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that

T. H. Hincks; W. Aspinall; G. Woo



Competitive intelligence process and tools for intelligence analysis  

Microsoft Academic Search

Purpose: The purpose of this survey research is twofold. First, to study and report the process that is commonly used to create and maintain a competitive intelligence (CI) program in organizations. And second, to provide an analysis of several emergent text mining, web mining and visualization-based CI tools, which are specific to collection and analysis of intelligence. Design/methodology/approach:

Ranjit Bose



Pin: building customized program analysis tools with dynamic instrumentation  

Microsoft Academic Search

Robust and powerful software instrumentation tools are essential for program analysis tasks such as profiling, performance evaluation, and bug detection. To meet this need, we have developed a new instrumentation system called Pin to instrument executables while they are running. For efficiency, Pin uses several techniques, including inlining, register re-allocation, liveness analysis, and instruction scheduling to optimize instrumentation. This fully automated

Chi-Keung Luk; Robert S. Cohn; Robert Muth; Harish Patil; Artur Klauser; P. Geoffrey Lowney; Steven Wallace; Vijay Janapa Reddi; Kim M. Hazelwood



Applying engineering feedback analysis tools to climate dynamics  

Microsoft Academic Search

The application of feedback analysis tools from engineering control theory to problems in climate dynamics is discussed through two examples. First, the feedback coupling between the thermohaline circulation and wind-driven circulation in the North Atlantic Ocean is analyzed with a relatively simple model, in order to better understand the coupled system dynamics. The simulation behavior is compared with analysis using

Douglas G. MacMynowski; Eli Tziperman



Cost-Benefit Analysis: Tools for Decision Making.  

ERIC Educational Resources Information Center

Suggests that cost-benefit analysis can be a helpful tool for assessing difficult and complex problems in child care facilities. Defines cost-benefit analysis as an approach to determine the most economical way to manage a program, describes how to analyze costs and benefits through hypothetical scenarios, and discusses some of the problems

Bess, Gary



Maximum likelihood deconvolution: a new perspective  

SciTech Connect

Maximum-likelihood deconvolution can be presented from at least two very different points of view. Unfortunately, in most journal articles, it is couched in the mystique of state-variable models and estimation theory, both of which are generally quite foreign to geophysical signal processors. This paper explains maximum-likelihood deconvolution using the well-known convolutional model and some relatively simple ideas from optimization theory. Both of these areas should be well known to geophysical signal processors. Although it is straightforward to develop the theory of maximum-likelihood deconvolution using the convolutional model and optimization theory, this approach does not lead to practical computational algorithms. Recursive algorithms must be used; they are orders of magnitude faster than the batch algorithms that are associated with the convolutional model.

Mendel, J.M.
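As a concrete anchor for the convolutional model mentioned above, the following sketch sets up z = w * r + n as a Toeplitz system and solves it by batch least squares, which coincides with the maximum-likelihood estimate under Gaussian noise. The wavelet, spike positions, and noise level are all illustrative, and this is the slow batch formulation the abstract contrasts with recursive algorithms.

```python
import numpy as np

def convolution_matrix(w, n):
    """Build the Toeplitz matrix W so that W @ r == np.convolve(w, r)[:n]."""
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(max(0, i - len(w) + 1), i + 1):
            W[i, j] = w[i - j]
    return W

rng = np.random.default_rng(0)
n = 64
r = np.zeros(n)
r[[10, 30, 45]] = [1.0, -0.7, 0.5]         # sparse reflectivity sequence
w = np.array([1.0, 0.5])                   # assumed known, minimum-phase wavelet
W = convolution_matrix(w, n)
z = W @ r + 0.01 * rng.standard_normal(n)  # observed trace with Gaussian noise

# With Gaussian noise, the maximum-likelihood estimate of r under the
# convolutional model z = W r + n is simply the least-squares solution.
r_hat = np.linalg.lstsq(W, z, rcond=None)[0]
```

Forming and solving the full n-by-n system is exactly the O(n^3) batch cost that motivates the recursive algorithms the abstract advocates.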



Deconvolution of images with periodic striping noise  

NASA Astrophysics Data System (ADS)

In this paper a new deconvolution algorithm is presented for images contaminated by periodic stripes. Inspired by the 2-D power spectrum distribution of periodic stripes in the frequency domain, we construct a novel regularized inverse filter that suppresses the amplification of striping noise in the Fourier inversion step and removes most of the stripes; mirror-wavelet denoising then removes the remaining colored noise. In simulations with striped images, this algorithm outperforms traditional mirror-wavelet based deconvolution in terms of both visual effect and SNR, at the expense of a slightly heavier computational load. The same regularized inverse filter idea can also be used to improve other deconvolution algorithms, such as wavelet packets and Wiener filters, when they are applied to images affected by periodic stripes.

Wang, Zuoguan; Xu, Wujun; Fu, Yutian
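The regularized inverse filter described above can be sketched as follows: a Tikhonov-style weight is raised on the spectral ridge where row-periodic stripes concentrate, so the Fourier inversion does not amplify them. The PSF, stripe pattern, and parameter values are all hypothetical, and the paper's mirror-wavelet denoising step is omitted.

```python
import numpy as np

def regularized_inverse(img, otf, lam=1e-3, stripe_damp=1e4):
    """Frequency-domain deconvolution with extra damping on stripe frequencies."""
    G = np.fft.fft2(img)
    ny, nx = img.shape
    u = np.fft.fftfreq(nx)[None, :]          # horizontal frequency
    v = np.fft.fftfreq(ny)[:, None]          # vertical frequency
    # Stripes that vary with row only put their energy on the u = 0 column
    # of the spectrum; damp it, except very low |v| where scene structure lives.
    on_ridge = (np.abs(u) < 1e-12) & (np.abs(v) > 3.0 / ny)
    weight = lam * (1.0 + stripe_damp * on_ridge)
    F = np.conj(otf) * G / (np.abs(otf) ** 2 + weight)
    return np.real(np.fft.ifft2(F))

# Synthetic scene: smooth blob + periodic horizontal stripes, mild vertical blur.
ny = nx = 64
y, x = np.mgrid[0:ny, 0:nx]
scene = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
stripes = 0.3 * np.sin(2 * np.pi * y / 8.0)
psf = np.zeros((ny, nx))
psf[0, 0], psf[1, 0], psf[-1, 0] = 0.6, 0.2, 0.2
otf = np.fft.fft2(psf)
observed = np.real(np.fft.ifft2(otf * np.fft.fft2(scene + stripes)))

restored = regularized_inverse(observed, otf)
err_restored = np.abs(restored - scene).mean()
err_observed = np.abs(observed - scene).mean()
```

The design choice mirrors the abstract: ordinary Tikhonov inversion everywhere, with a much heavier penalty only where the stripe energy is known to sit.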



Estimation of in vivo percutaneous absorption of emedastine from bile excretion data using a deconvolution method.  


In vivo percutaneous absorption of emedastine difumarate was investigated in rats and compared with rat skin in vitro. Since emedastine entering the systemic circulation is mostly excreted in bile, we first came up with the method of collecting bile with a minimal skin incision. In vivo skin permeation of the drug was estimated from biliary excretion data by deconvolution analysis. Prior to applying deconvolution analysis, it was confirmed that biliary excretion of emedastine was linear against its dose. When the in vivo permeation profile estimated by deconvolution was compared with the in vitro profile, the lag time for permeation was significantly shorter in vivo than in vitro, whereas the skin permeability coefficient was almost the same. If we presume a two-layer diffusion model, then this finding may primarily be due to the shorter diffusion length of the dermis. PMID:16272750

Harada, Shoichi; Yamashita, Fumiyoshi; Hashida, Mitsuru



Development of a climate data analysis tool (CDAT)  

SciTech Connect

The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

Marlais, S.M.



Simple Payback: The Wrong Tool for Energy Project Analysis?  

E-print Network

Simple Payback: The Wrong Tool for Energy Project Analysis? Christopher Russell, New Business Development Manager, Science Applications International Corporation. ABSTRACT: Industrial decision-makers everywhere depend on "payback" as a way ... -risk, or it will pay to reduce that volume of consumption. The save-or-buy criterion is the decision tool for making that choice. Christopher Russell is the new business development manager for Science Applications International Corporation's Energy...

Russell, C.



Pin: building customized program analysis tools with dynamic instrumentation  

Microsoft Academic Search

Robust and powerful software instrumentation tools are essential for program analysis tasks such as profiling, performance evaluation, and bug detection. To meet this need, we have developed a new instrumentation system called Pin. Our goals are to provide easy-to-use, portable, transparent, and efficient instrumentation. Instrumentation tools (called Pintools) are written in C/C++ using Pin's rich API. Pin follows the model

Chi-Keung Luk; Robert Cohn; Robert Muth; Harish Patil; Artur Klauser; Geoff Lowney; Steven Wallace; Vijay Janapa Reddi; Kim Hazelwood



Radar Interferometry Time Series Analysis and Tools  

NASA Astrophysics Data System (ADS)

We consider the use of several multi-interferogram analysis techniques for identifying transient ground motions. Our approaches range from specialized InSAR processing for persistent scatterer and small baseline subset methods to the post-processing of geocoded displacement maps using a linear inversion-singular value decomposition solution procedure. To better understand these approaches, we have simulated sets of interferograms spanning several deformation phenomena, including localized subsidence bowls with constant velocity and seasonal deformation fluctuations. We will present results and insights from the application of these time series analysis techniques to several land subsidence study sites with varying deformation and environmental conditions, e.g., arid Phoenix and coastal Houston-Galveston metropolitan areas and rural Texas sink holes. We consistently find that the time invested in implementing, applying and comparing multiple InSAR time series approaches for a given study site is rewarded with a deeper understanding of the techniques and deformation phenomena. To this end, and with support from NSF, we are preparing a first-version of an InSAR post-processing toolkit to be released to the InSAR science community. These studies form a baseline of results to compare against the higher spatial and temporal sampling anticipated from TerraSAR-X as well as the trade-off between spatial coverage and resolution when relying on ScanSAR interferometry.

Buckley, S. M.
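The post-processing approach described above (linear inversion of an interferogram network with a singular value decomposition) can be illustrated for a single pixel. The acquisition dates, interferogram pairs, and displacements below are made up; `np.linalg.pinv` performs the SVD-based minimum-norm least-squares solve.

```python
import numpy as np

# Acquisition times and a hypothetical true displacement history (one pixel).
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])            # years
true_disp = np.array([0.0, 1.0, 2.0, 3.5, 5.0])    # mm, zero at first date

# Small-baseline interferogram network: each pair is (earlier, later) dates.
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
obs = np.array([true_disp[j] - true_disp[i] for i, j in pairs])

# Design matrix maps incremental velocities between dates to interferograms.
m = len(t) - 1
A = np.zeros((len(pairs), m))
for row, (i, j) in enumerate(pairs):
    for k in range(i, j):
        A[row, k] = t[k + 1] - t[k]                # interval durations

# SVD-based pseudoinverse gives the minimum-norm least-squares velocities,
# which are then integrated back into a displacement time series.
v = np.linalg.pinv(A) @ obs
disp = np.concatenate(([0.0], np.cumsum(v * np.diff(t))))
```

With a connected network the inversion recovers the displacement history exactly; with disconnected subsets, the SVD's minimum-norm property is what ties the solution together.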



A Modern Tool for Classical Plant Growth Analysis  

PubMed Central

We present an all-inclusive software tool for dealing with the essential core of mathematical and statistical calculations in plant growth analysis. The tool calculates up to six of the most fundamental growth parameters according to a purely classical approach across one harvest-interval. All of the estimates carry standard errors and 95% confidence limits. The tool is written in Microsoft Excel 2000 and is available free of charge for use in teaching and research from the article's supplementary data. PMID:12324272




Color Deconvolution and Support Vector Machines  

NASA Astrophysics Data System (ADS)

Methods for machine learning (support vector machines) and image processing (color deconvolution) are combined in this paper for the purpose of separating colors in images of documents. After determining the background color, samples from the image that are representative of the colors to be separated are mapped to a feature space. Given the clusters of samples of either color, the support vector machine (SVM) method is used to find an optimal separating line between the clusters in feature space. Deconvolution image processing parameters are determined from the separating line. A number of examples of applications in forensic casework are presented.

Berger, Charles E. H.; Veenman, Cor J.
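A minimal sketch of the core step described above: finding a separating line between two color clusters in feature space with a linear SVM. The solver here is a from-scratch Pegasos-style SGD stand-in (the paper does not specify its solver), and the two "color" clusters are synthetic.

```python
import numpy as np

def linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Train a linear SVM (hinge loss + L2 penalty) with Pegasos-style SGD.
    Returns weights w and bias b of the separating line w @ x + b = 0."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    step = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            step += 1
            eta = 1.0 / (lam * step)               # decaying step size
            margin = y[i] * (X[i] @ w + b)
            w *= (1 - eta * lam)                   # L2 shrinkage
            if margin < 1:                         # hinge-loss subgradient
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# Two hypothetical color clusters in a 2-D chromaticity feature space.
rng = np.random.default_rng(1)
blue = rng.normal([0.2, 0.7], 0.05, size=(100, 2))
black = rng.normal([0.6, 0.3], 0.05, size=(100, 2))
X = np.vstack([blue, black])
y = np.concatenate([np.ones(100), -np.ones(100)])

w, b = linear_svm(X, y)
pred = np.sign(X @ w + b)
accuracy = (pred == y).mean()
```

In the paper's pipeline the fitted line (w, b) would then be translated into color deconvolution parameters; here it is only checked against the training clusters.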


Deconvolution by thresholding in mirror wavelet bases.  


The deconvolution of signals is studied with thresholding estimators that decompose signals in an orthonormal basis and threshold the resulting coefficients. A general criterion is established to choose the orthonormal basis in order to minimize the estimation risk. Wavelet bases are highly sub-optimal to restore signals and images blurred by a low-pass filter whose transfer function vanishes at high frequencies. A new orthonormal basis called mirror wavelet basis is constructed to minimize the risk for such deconvolutions. An application to the restoration of satellite images is shown. PMID:18237922

Kalifa, Jérôme; Mallat, Stéphane; Rougé, Bernard



GATB: Genome Assembly & Analysis Tool Box  

PubMed Central

Motivation: Efficient and fast next-generation sequencing (NGS) algorithms are essential to analyze the terabytes of data generated by the NGS machines. A serious bottleneck can be the design of such algorithms, as they require sophisticated data structures and advanced hardware implementation. Results: We propose an open-source library dedicated to genome assembly and analysis to speed up the process of developing efficient software. The library is based on a recent optimized de Bruijn graph implementation allowing complex genomes to be processed on desktop computers using fast algorithms with low memory footprints. Availability and implementation: The GATB library is written in C++ and is available at the following Web site under the A-GPL license. Contact: Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24990603

Drezen, Erwan; Rizk, Guillaume; Chikhi, Rayan; Deltel, Charles; Lemaitre, Claire; Peterlongo, Pierre; Lavenier, Dominique



Physics analysis tools for beauty physics in ATLAS  

NASA Astrophysics Data System (ADS)

The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons will represent an alternative approach to this research. The analysis of complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools, and finally the association of Monte Carlo truth information with simulated data during software validation, which is an important part of the development of the physics analysis tools.

Anastopoulos, C.; B-Thacker, E.; Catmore, J.; Dallison, S.; Derue, F.; Epp, B.; Jussel, P.; Kaczmarska, A.; Mora, L. d.; Radziewski, H. v.; Řezníček, P.; Stahl, T.



Vulnerability assessment using two complementary analysis tools  

SciTech Connect

To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weakness of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weakness, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

Paulus, W.K.



Software Construction and Analysis Tools for Future Space Missions  

NASA Technical Reports Server (NTRS)

NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

Lowry, Michael R.; Clancy, Daniel (Technical Monitor)



Blind seismic deconvolution using variational Bayesian method  

NASA Astrophysics Data System (ADS)

Blind seismic deconvolution, which comprises estimation of both the seismic wavelet and the reflectivity sequence, is a strongly ill-posed problem. The reflectivity sequence is modeled as a Bernoulli-Gaussian (BG) process, depending on four parameters (noise variance, high and low reflector variances, and reflector density). These parameters need to be estimated from the seismic record, which is the convolution of the reflectivity sequence and the seismic wavelet. In this paper, we propose a variational Bayesian method for blind seismic deconvolution which can determine both the reflectivity sequence and the seismic wavelet. The connection between variational Bayesian blind deconvolution and the minimization of the Kullback-Leibler divergence of two probability distributions is also established. Gamma and beta distributions are used as priors for the unknown parameters (hyperparameters), and we show how these distributions can be inferred in actual situations. The proposed algorithms are tested by simulation and compared to existing blind deconvolution methods. The results show that the variational Bayesian method has better agreement with the actual values.

Yanqin, Li; Guoshan, Zhang
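The Bernoulli-Gaussian forward model the abstract describes can be sketched directly: reflector locations are Bernoulli draws at a given density, amplitudes are Gaussian with high or low variance, and the record is the convolution with a wavelet plus noise. The Ricker wavelet and all parameter values are assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# The four BG model parameters: reflector density, the two reflector
# variances, and the noise variance (illustrative values).
n, density = 200, 0.1
var_high, var_low, var_noise = 1.0, 1e-4, 1e-3

spikes = rng.random(n) < density                    # Bernoulli reflector locations
std = np.where(spikes, np.sqrt(var_high), np.sqrt(var_low))
reflectivity = std * rng.standard_normal(n)         # Gaussian amplitudes

# A Ricker-type seismic wavelet (an assumed shape for the sketch).
tau = np.arange(-20, 21) * 0.004                    # seconds
f0 = 30.0                                           # Hz
wavelet = (1 - 2 * (np.pi * f0 * tau) ** 2) * np.exp(-(np.pi * f0 * tau) ** 2)

# Seismic record: convolution of reflectivity and wavelet, plus noise.
record = np.convolve(reflectivity, wavelet, mode="same")
record += np.sqrt(var_noise) * rng.standard_normal(n)
```

Blind deconvolution inverts this generative model: given only `record`, infer `wavelet`, `reflectivity`, and the four parameters.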



Euler deconvolution in satellite geodesy Matthias Roth  

E-print Network

A singular value decomposition of the design matrix yields the solution vector. Euler deconvolution is based on Euler's homogeneous function theorem; for gravity gradients we can get

Stuttgart, Universität


Satellite Image Deconvolution Using Complex Wavelet Packets  

Microsoft Academic Search

The deconvolution of blurred and noisy satellite images is an ill-posed inverse problem. Donoho has proposed to deconvolve the image without regularization and to denoise the result in a wavelet basis by thresholding the transformed coefficients. We have developed a new filtering method, consisting of using a complex wavelet packet basis. Herein, the thresholding functions associated

André Jalobeanu; Laure Blanc-Féraud; Josiane Zerubia



Bussgang blind deconvolution for impulsive signals  

Microsoft Academic Search

Many blind deconvolution algorithms have been designed to extract digital communications signals corrupted by intersymbol interference (ISI). Such algorithms generally fail when applied to signals with impulsive characteristics, such as acoustic signals. While it is possible to stabilize such procedures in many cases by imposing unit-norm constraints on the adaptive equalizer coefficient vector, these modifications require costly divide and square-root

Heinz Mathis; Scott C. Douglas



ICI-WaRM Regional Analysis of Frequency Tool (ICI-RAFT)  

E-print Network

ICI-WaRM Regional Analysis of Frequency Tool (ICI-RAFT), by Jason Giovannettone & Michael Wright. ICI-WaRM's Regional Analysis of Frequency Tool (ICI-RAFT) walks the user through the methodology outlined

US Army Corps of Engineers


COCPIT: a tool for integrated care pathway variance analysis.  


Electronic Health Record (EHR) data has the potential to track patients' journeys through healthcare systems. Many of those journeys are supposed to follow Integrated Care Pathways (ICPs) built on evidence-based guidelines. An ICP for a particular condition sets out "what should happen", whereas the EHR records "what did happen". Variance analysis is the process by which the difference between expected and actual care is identified. By performing variance analysis over multiple patients, patterns of deviation from idealised care are revealed. The use of ICP variance analysis, however, is not as widespread as it could be in healthcare quality improvement processes; we argue that this is due to the difficulty of combining the required specialist knowledge and skills from different disciplines. COCPIT (Collaborative Online Care Pathway Investigation Tool) was developed to overcome this difficulty and provides clinicians and health service managers with a web-based tool for care pathway variance analysis. PMID:22874343

Ainsworth, John; Buchan, Iain
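The expected-vs-actual comparison at the heart of variance analysis can be sketched as below: each patient's recorded events are checked against the expected ICP steps, and missing steps are counted across a cohort to reveal systematic deviations. The pathway step names and cohort are invented; a real ICP and EHR feed would be far richer.

```python
from collections import Counter

# Hypothetical idealised pathway for some condition.
EXPECTED_PATHWAY = ["referral", "assessment", "treatment", "review", "discharge"]

def variances(ehr_events, expected=EXPECTED_PATHWAY):
    """Return the expected steps missing from one patient's record."""
    seen = set(ehr_events)
    return [step for step in expected if step not in seen]

def variance_pattern(cohort):
    """Aggregate missing-step counts over many patients, revealing
    patterns of deviation from idealised care."""
    counts = Counter()
    for events in cohort:
        counts.update(variances(events))
    return counts

cohort = [
    ["referral", "assessment", "treatment", "discharge"],            # no review
    ["referral", "treatment", "review", "discharge"],                # no assessment
    ["referral", "assessment", "treatment", "review", "discharge"],  # complete
]
pattern = variance_pattern(cohort)
```

A real tool would also handle out-of-order and extra events, but the "what should happen" vs "what did happen" comparison is the same set difference at its core.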



The CoastWatch Data Analysis Tool: Status and  

E-print Network

The CoastWatch Data Analysis Tool: Status and Future. Peter Hollemans, Terrenus Earth Sciences, Consultant for NOAA/NESDIS, August 2008. Talk outline: About CDAT; Current Release; Future Releases.


Kinetic Visualizations: A New Class of Tools for Intelligence Analysis  

Microsoft Academic Search

Intelligence analysis requires detecting and exploiting patterns hidden in complex data. When the critical aspects of a data set can be effectively visually presented, displays become powerful tools by harnessing the pattern-recognition capabilities of human vision. To this end, shape, color, and interactive techniques are widely utilized in intelligence displays. Unfortunately, the volume and complexity of

Robert J. Bobrow; Aaron Helsinger


Explaining Soar: Analysis of Existing Tools and User Information Requirements  

E-print Network

Explaining Soar: Analysis of Existing Tools and User Information Requirements. Isaac G. Councill. ... with respect to the context of a simulation domain, and "internal", the structure and cognitive processes ... requirements of an example population, data collected from a usability study of the TacAir-Soar Situation

Ritter, Frank


Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint  

SciTech Connect

This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

Christensen, C.; Horowitz, S.



Water Optimizer Suite: Tools for Decision Support and Policy Analysis  

E-print Network

Water Optimizer Suite: Tools for Decision Support and Policy Analysis. Water Optimizer is a suite ... allocation policies which reduce the amount of irrigation water applied to a crop are small for the first ... various management strategies or water policy consequences for alternative circumstances. We have

Nebraska-Lincoln, University of


Chianti: a tool for change impact analysis of java programs  

Microsoft Academic Search

This paper reports on the design and implementation of Chianti, a change impact analysis tool for Java that is implemented in the context of the Eclipse environment. Chianti analyzes two versions of an application and decomposes their difference into a set of atomic changes. Change impact is then reported in terms of affected (regression or unit) tests whose execution

Xiaoxia Ren; Fenil Shah; Frank Tip; Barbara G. Ryder; Ophelia Chesley



General Purpose Textual Sentiment Analysis and Emotion Detection Tools  

E-print Network

in many domains: opinion mining, prediction, feedbacks, etc. However, building a general purpose tool ... 1.2 Difficulties and existing approaches: a difficult aspect of sentiment analysis is ... and the holder, as in [11]. Phenomenon: negation. Example: "it's not good"; "no one thinks it is good".

Paris-Sud XI, Université de


Football analysis using spatio-temporal tools Joachim Gudmundsson  

E-print Network

Football analysis using spatio-temporal tools Joachim Gudmundsson University of Sydney and NICTA, Australia ABSTRACT Analysing a football match is without doubt an important task specifically for analysing the performance of football players and teams. The aim, functionality

Wolle, Thomas


Discovery and New Frontiers Project Budget Analysis Tool  

NASA Technical Reports Server (NTRS)

The Discovery and New Frontiers (D&NF) programs are multi-project, uncoupled programs that currently comprise 13 missions in phases A through F. The ability to fly frequent science missions to explore the solar system is the primary measure of program success. The program office uses a Budget Analysis Tool to perform "what-if" analyses and compare mission scenarios to the current program budget, and rapidly forecast the program's ability to meet its launch rate requirements. The tool allows the user to specify the total mission cost (fixed year), mission development and operations profile by phase (percent total mission cost and duration), launch vehicle, and launch date for multiple missions. The tool automatically applies inflation and rolls up the total program costs (in real-year dollars) for comparison against the available program budget. Thus, the tool allows the user to rapidly and easily explore a variety of launch rates and analyze the effect of changes in future mission or launch vehicle costs, the differing development profiles or operational durations of a future mission, or a replan of a current mission on the overall program budget. Because the tool also reports average monthly costs for the specified mission profile, the development or operations cost profile can easily be validated against program experience for similar missions. While specifically designed for predicting overall program budgets for programs that develop and operate multiple missions concurrently, the basic concept of the tool (rolling up multiple, independently budgeted lines) could easily be adapted to other applications.

Newhouse, Marilyn E.
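The roll-up logic described above can be sketched as follows: each mission has a fixed-year cost, a phase profile (fraction of cost, duration), and a start date; costs are inflated to real-year dollars and summed per year across missions. The inflation rate, phase profiles, and mission costs are all invented, and inflation is applied as a simple compound factor.

```python
INFLATION = 1.03          # assumed flat annual inflation rate
BASE_YEAR = 2020          # fixed-year dollars are quoted in this year

def mission_profile(total_cost_fy, phases, start_year):
    """Spread a fixed-year mission cost across years, in real-year dollars.
    phases: list of (fraction_of_total_cost, duration_in_years)."""
    costs = {}
    year = start_year
    for fraction, duration in phases:
        annual = total_cost_fy * fraction / duration
        for y in range(year, year + duration):
            # inflate fixed-year dollars to real-year dollars
            costs[y] = costs.get(y, 0.0) + annual * INFLATION ** (y - BASE_YEAR)
        year += duration
    return costs

def program_rollup(missions):
    """Sum real-year costs of independently budgeted mission lines."""
    total = {}
    for cost, phases, start in missions:
        for y, c in mission_profile(cost, phases, start).items():
            total[y] = total.get(y, 0.0) + c
    return dict(sorted(total.items()))

missions = [
    (450.0, [(0.7, 3), (0.3, 2)], 2021),   # 70% development, 30% operations
    (600.0, [(0.8, 4), (0.2, 3)], 2023),
]
budget = program_rollup(missions)          # yearly totals vs. available budget
```

Comparing `budget` year by year against an available-budget series is the "what-if" step; changing a start date or cost and re-running is all a scenario takes.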



Approximate deconvolution large eddy simulation of a barotropic ocean circulation model  

NASA Astrophysics Data System (ADS)

A new large eddy simulation closure modeling strategy is put forth for two-dimensional turbulent geophysical flows. This closure modeling approach utilizes approximate deconvolution, which is based solely on mathematical approximations and does not employ additional phenomenological arguments in the model. The new approximate deconvolution model is tested in the numerical simulation of the wind-driven circulation in a shallow ocean basin, a standard prototype of more realistic ocean dynamics. The model employs the barotropic vorticity equation driven by a symmetric double-gyre wind forcing, which yields a four-gyre circulation in the time mean. The approximate deconvolution model yields the correct four-gyre circulation structure predicted by a direct numerical simulation, but on a coarser mesh and at a fraction of the computational cost. This first step in the numerical assessment of the new model shows that approximate deconvolution could represent a viable tool for under-resolved computations in the large eddy simulation of more realistic turbulent geophysical flows.

San, Omer; Staples, Anne E.; Wang, Zhu; Iliescu, Traian
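The approximate deconvolution operator itself is compact enough to sketch: the van Cittert form approximates the inverse of the filter G by the truncated series sum over k of (I - G)^k applied to the filtered field. The discrete filter and the test field below are illustrative choices, not the paper's setup.

```python
import numpy as np

def filt(u, kernel=np.array([0.25, 0.5, 0.25])):
    """Discrete low-pass filter G (periodic), standing in for the LES filter."""
    return np.convolve(np.pad(u, 1, mode="wrap"), kernel, mode="valid")

def approx_deconvolve(u_bar, order=3):
    """Van Cittert approximate deconvolution:
    u* = sum_{k=0}^{order} (I - G)^k applied to the filtered field u_bar."""
    u_star = np.zeros_like(u_bar)
    term = u_bar.copy()
    for _ in range(order + 1):
        u_star += term
        term = term - filt(term)        # next (I - G)^k term
    return u_star

# Test field: two resolved Fourier modes on a periodic grid.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(4 * x)
u_bar = filt(u)                          # filtered (resolved) field
u_star = approx_deconvolve(u_bar, order=5)

err_filtered = np.abs(u_bar - u).max()   # loss due to filtering
err_deconv = np.abs(u_star - u).max()    # loss after deconvolution
```

The series is purely mathematical, which is the abstract's point: no eddy-viscosity or other phenomenological term enters the model.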



NSDL National Science Digital Library

The goal of the lesson is not for students to learn what the simple machines are, even though this is an underlying theme. Students will approach the lesson in a much more open-minded fashion. They will discuss tools and how they function. This will naturally lead to acknowledgment of how tools make our lives easier. By categorizing everyday items, students will come to understand the natural functions of tools. This base of knowledge will lead into exercises and discussions about how complex machines are a conglomerate of simpler tools and motions, as well as how tools have changed and become more sophisticated throughout history. At the end of the lesson to reemphasize the importance of tools in human society, students will write a paper in which they imagine a world without a particular tool.

Science Netlinks



Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study  

NASA Technical Reports Server (NTRS)

An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

Flores, Melissa; Malin, Jane T.



Omiga: a PC-based sequence analysis tool.  


Computer-based sequence analysis, notation, and manipulation are a necessity for all molecular biologists working with any but the most simple DNA sequences. As sequence data become increasingly available, tools that can be used to manipulate and annotate individual sequences and sequence elements will become an even more vital implement in the molecular biologist's arsenal. The Omiga DNA and Protein Sequence Analysis Software tool, version 2.0, provides an effective and comprehensive tool for the analysis of both nucleic acid and protein sequences that runs on a standard PC available in every molecular biology laboratory. Omiga allows the import of sequences in several common formats. Upon importing sequences and assigning them to various projects, Omiga allows the user to produce, analyze, and edit sequence alignments. Sequences may also be queried for the presence of restriction sites, sequence motifs, and other sequence features, all of which can be added into the notations accompanying each sequence. This newest version of Omiga also allows for sequencing and polymerase chain reaction (PCR) primer prediction, a functionality missing in earlier versions. Finally, Omiga allows rapid searches for putative coding regions, and Basic Local Alignment Search Tool (BLAST) queries against public databases at the National Center for Biotechnology Information (NCBI). PMID:11697223

Kramer, J A



fMRI analysis software tools: an evaluation framework  

NASA Astrophysics Data System (ADS)

Performance comparison of functional Magnetic Resonance Imaging (fMRI) software tools is a very difficult task. In this paper, a framework for comparison of fMRI analysis results obtained with different software packages is proposed. An objective evaluation is possible only after pre-processing steps that normalize input data in a standard domain. Segmentation and registration algorithms are implemented in order to classify voxels as belonging to the brain or not, and to find the non-rigid transformation that best aligns the volume under inspection with a standard one. Through the definitions of intersection and union from fuzzy logic, an index was defined which quantifies information overlap between Statistical Parametrical Maps (SPMs). Direct comparison between fMRI results can only highlight differences. In order to assess the best result, an index that represents the goodness of the activation detection is required. The transformation of the activation map into a standard domain allows the use of a functional atlas for labeling the active voxels. For each functional area, the Activation Weighted Index (AWI), which identifies the mean activation level of the whole area, was defined. By means of this brief but comprehensive description, it is easy to find a metric for the objective evaluation of fMRI analysis tools. Through the first evaluation method, the situations where the SPMs are inconsistent were identified. The result of the AWI analysis suggests which tool has higher sensitivity and specificity. The proposed method seems a valid evaluation tool when applied to an adequate number of patients.

Pedoia, Valentina; Colli, Vittoria; Strocchi, Sabina; Vite, Cristina; Binaghi, Elisabetta; Conte, Leopoldo



Tool Support for Parametric Analysis of Large Software Simulation Systems  

NASA Technical Reports Server (NTRS)

The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
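The n-factor combinatorial idea above (here for n = 2, i.e., pairwise coverage) can be sketched with a greedy random generator: cover every 2-way parameter-value combination with far fewer cases than the full cartesian product. This is a generic illustration, not the actual tool built around Trick; all names are assumptions.

```python
import itertools
import random

def pairwise_suite(params, candidates_per_step=50, seed=0):
    """Greedy generation of a test suite covering all 2-way combinations.

    params: dict mapping parameter name -> list of values.
    Returns a list of dicts (test cases) that together cover every pair
    of parameter values.
    """
    rng = random.Random(seed)
    names = sorted(params)
    uncovered = set()
    for p, q in itertools.combinations(names, 2):
        for vp in params[p]:
            for vq in params[q]:
                uncovered.add((p, vp, q, vq))
    suite = []
    while uncovered:
        best, best_gain = None, -1
        for _ in range(candidates_per_step):
            case = {n: rng.choice(params[n]) for n in names}
            gain = sum((p, case[p], q, case[q]) in uncovered
                       for p, q in itertools.combinations(names, 2))
            if gain > best_gain:
                best, best_gain = case, gain
        if best_gain > 0:  # only keep cases that cover new pairs
            suite.append(best)
            for p, q in itertools.combinations(names, 2):
                uncovered.discard((p, best[p], q, best[q]))
    return suite
```

For large simulations the same idea scales to n-way coverage; the Monte Carlo candidates play the role of the "advanced Monte Carlo generation" the abstract pairs with the combinatorial variations.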

Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony



Application of regularized Richardson-Lucy algorithm for deconvolution of confocal microscopy images.  


Although confocal microscopes have considerably smaller contribution of out-of-focus light than widefield microscopes, the confocal images can still be enhanced mathematically if the optical and data acquisition effects are accounted for. For that, several deconvolution algorithms have been proposed. As a practical solution, maximum-likelihood algorithms with regularization have been used. However, the choice of regularization parameters is often unknown although it has considerable effect on the result of deconvolution process. The aims of this work were: to find good estimates of deconvolution parameters; and to develop an open source software package that would allow testing different deconvolution algorithms and that would be easy to use in practice. Here, Richardson-Lucy algorithm has been implemented together with the total variation regularization in an open source software package IOCBio Microscope. The influence of total variation regularization on deconvolution process is determined by one parameter. We derived a formula to estimate this regularization parameter automatically from the images as the algorithm progresses. To assess the effectiveness of this algorithm, synthetic images were composed on the basis of confocal images of rat cardiomyocytes. From the analysis of deconvolved results, we have determined under which conditions our estimation of total variation regularization parameter gives good results. The estimated total variation regularization parameter can be monitored during deconvolution process and used as a stopping criterion. An inverse relation between the optimal regularization parameter and the peak signal-to-noise ratio of an image is shown. Finally, we demonstrate the use of the developed software by deconvolving images of rat cardiomyocytes with stained mitochondria and sarcolemma obtained by confocal and widefield microscopes. PMID:21323670
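A 1-D sketch of Richardson-Lucy iteration with a total-variation factor is given below. The IOCBio Microscope package works in 3-D with measured PSFs and estimates the regularization parameter automatically; here the TV term enters as the common multiplicative correction 1/(1 − λ·div(∇u/|∇u|)), with λ fixed by hand. Everything beyond that structure (names, parameter values) is illustrative.

```python
import numpy as np

def richardson_lucy_tv(data, psf, iterations=100, lam=0.002, eps=1e-12):
    """1-D Richardson-Lucy deconvolution with a total-variation factor.

    lam is the single TV regularization parameter the abstract refers to;
    the paper derives an automatic estimate for it, which is not done here.
    """
    psf_flip = psf[::-1]
    est = np.full_like(data, data.mean())
    for _ in range(iterations):
        conv = np.convolve(est, psf, mode="same")
        ratio = data / np.maximum(conv, eps)
        grad = np.gradient(est)
        div = np.gradient(grad / np.maximum(np.abs(grad), eps))
        # Standard RL update times the TV regularization factor.
        est = est * np.convolve(ratio, psf_flip, mode="same") \
              / np.maximum(1.0 - lam * div, eps)
    return est
```

Monitoring how the effective λ evolves across iterations is what the authors propose as a stopping criterion.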

Laasmaa, M; Vendelin, M; Peterson, P



Virtual tool mark generation for efficient striation analysis.  


This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within 5-10 degrees. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners. PMID:24502818
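The projection-and-compare idea can be sketched as follows: bin the tip's point cloud along the cross-mark axis, take the extreme height per bin as the virtual mark profile, and score profiles by their best normalized cross-correlation over small lags. Note this correlation score is a simple stand-in for the Chumbley et al. likelihood statistic actually used in the study; all names are illustrative.

```python
import numpy as np

def virtual_mark(tip_points, n_bins=50):
    """Project a 3-D tip point cloud (columns x, y, z) along the travel
    direction y: the extreme z in each x-bin forms the virtual mark edge."""
    x, z = tip_points[:, 0], tip_points[:, 2]
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    profile = np.empty(n_bins)
    for i in range(n_bins):
        profile[i] = z[idx == i].max()  # the edge that cuts the plate
    return profile

def best_correlation(a, b, max_lag=5):
    """Max normalized cross-correlation of two equal-length profiles over
    small relative shifts (stand-in for the likelihood comparison)."""
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        aa = a[max(0, lag):len(a) + min(0, lag)]
        bb = b[max(0, -lag):len(b) + min(0, -lag)]
        aa = aa - aa.mean()
        bb = bb - bb.mean()
        denom = np.sqrt((aa**2).sum() * (bb**2).sum())
        if denom > 0:
            best = max(best, float((aa * bb).sum() / denom))
    return best
```

Searching over simulated marking angles (regenerating the projection per angle) is how the method estimates the angle of the physical mark.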

Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James



Microscopy image segmentation tool: robust image data analysis.  


We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy. PMID:24689586
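The core ROI segmentation step — threshold, then label connected regions — can be sketched without any imaging library. This is a generic illustration of the technique, not MIST's algorithm; MIST's segmentation is more robust than a single global threshold.

```python
import numpy as np

def label_rois(image, threshold):
    """Threshold an image and label 4-connected regions of interest.

    Returns (labels, count): an integer label array (0 = background)
    and the number of ROIs found. Uses an iterative flood fill so it
    needs only NumPy and the standard library.
    """
    mask = np.asarray(image) > threshold
    labels = np.zeros(mask.shape, dtype=int)
    rows, cols = mask.shape
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and labels[r, c] == 0:
                current += 1
                stack = [(r, c)]
                while stack:
                    i, j = stack.pop()
                    if 0 <= i < rows and 0 <= j < cols \
                            and mask[i, j] and labels[i, j] == 0:
                        labels[i, j] = current
                        stack.extend([(i + 1, j), (i - 1, j),
                                      (i, j + 1), (i, j - 1)])
    return labels, current
```

Per-ROI statistics (area, centroid, intensity) then follow directly from the label array.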

Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K



Detecting atoms in deconvolution (ROBUST 2008 Poster Section, © JCMF)  

E-print Network

We consider the atomic deconvolution problem and propose an estimator for an atom location, together with its asymptotics, in the ordinary deconvolution problem. ATOMIC DECONVOLUTION: In the ordinary deconvolution problem one wants

Jureckova, Jana


Industrial Geospatial Analysis Tool for Energy Evaluation (IGATE-E)  

SciTech Connect

IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of zip code using bottom up approaches. Within each zip code, the current version of the tool estimates electrical energy consumption of manufacturing industries by industry type using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work include adding modules for the predictions of fuel energy consumption streams, manufacturing process steps energy consumption, and major energy intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state level energy estimations and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.

Alkadi, Nasr E [ORNL]; Starke, Michael R [ORNL]; Ma, Ookie [DOE EERE]; Nimbalkar, Sachin U [ORNL]; Cox, Daryl [ORNL]



Efficient deconvolution of noisy periodic interference signals.  


The interference signal formed by combining two coherent light beams carries information on the path difference between the beams. When the path difference is a periodic function of time, as, for example, when one beam is reflected from a vibrating surface and the other from a fixed surface, the interference signal is periodic with the same period as the vibrating surface. Bessel functions provide an elegant and efficient means for deconvoluting such periodic interference signals, thus making it possible to obtain the displacement of the moving surface with nanometer resolution. Here we describe the mathematical basis for the signal deconvolution and employ this technique to obtain the amplitude of miniature capillary waves on water as a test case. PMID:16604773
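The Bessel-function trick can be made concrete: for a path modulation δ·sin(ωt), the odd harmonics of the interference signal have magnitudes proportional to J1(δ), J3(δ), ..., so the ratio of the first to third harmonic fixes δ independently of the static phase. The sketch below (a generic illustration, not the paper's exact procedure) recovers δ by root-finding on that ratio, using a pure-Python series for Jn.

```python
import math

def bessel_j(n, x, terms=30):
    """Series expansion of the Bessel function J_n(x); adequate for |x| < ~5."""
    total = 0.0
    for k in range(terms):
        total += (-1)**k / (math.factorial(k) * math.factorial(k + n)) \
                 * (x / 2.0) ** (2 * k + n)
    return total

def amplitude_from_harmonics(h1, h3, lo=0.05, hi=3.0, tol=1e-10):
    """Recover modulation depth delta (radians) from measured first/third
    harmonic magnitudes, which are proportional to J1(delta), J3(delta).
    Solves J1(x) * h3 - J3(x) * h1 = 0 by bisection on [lo, hi]."""
    f = lambda x: bessel_j(1, x) * h3 - bessel_j(3, x) * h1
    a, b = lo, hi
    while b - a > tol:
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    return 0.5 * (a + b)
```

With δ in hand, the surface displacement amplitude is δ·λ/(4π) for a reflection geometry, which is how nanometer resolution follows from the optical wavelength.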

Behroozi, Feredoon; Behroozi, Peter S



Exploiting polypharmacology for drug target deconvolution  

PubMed Central

Polypharmacology (action of drugs against multiple targets) represents a tempting avenue for new drug development; unfortunately, methods capable of exploiting the known polypharmacology of drugs for target deconvolution are lacking. Here, we present an ensemble approach using elastic net regularization combined with mRNA expression profiling and previously characterized data on a large set of kinase inhibitors to identify kinases that are important for epithelial and mesenchymal cell migration. By profiling a selected optimal set of 32 kinase inhibitors in a panel against six cell lines, we identified cell type-specific kinases that regulate cell migration. Our discovery of several informative kinases with a previously uncharacterized role in cell migration (such as Mst and Taok family of MAPK kinases in mesenchymal cells) may represent novel targets that warrant further investigation. Target deconvolution using our ensemble approach has the potential to aid in the rational design of more potent but less toxic drug combinations. PMID:24707051
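Elastic net regularization, the statistical engine of the ensemble approach above, can be sketched with a small coordinate-descent solver on synthetic data. This is not the authors' pipeline (which couples the regression to mRNA profiles and kinase inhibition constants); the objective below follows the common scikit-learn parameterization, and all data are synthetic.

```python
import numpy as np

def elastic_net(X, y, alpha=0.1, l1_ratio=0.9, iterations=200):
    """Coordinate descent for
        1/(2n)||y - Xb||^2 + alpha*(l1_ratio*||b||_1
                                    + (1 - l1_ratio)/2*||b||_2^2).
    The L1 part drives most coefficients to zero (target selection);
    the L2 part stabilizes correlated predictors."""
    n, p = X.shape
    b = np.zeros(p)
    resid = y.copy()
    col_sq = (X**2).sum(axis=0) / n
    for _ in range(iterations):
        for j in range(p):
            rho = X[:, j] @ (resid + X[:, j] * b[j]) / n
            new = np.sign(rho) * max(abs(rho) - alpha * l1_ratio, 0.0) \
                  / (col_sq[j] + alpha * (1.0 - l1_ratio))
            resid += X[:, j] * (b[j] - new)  # keep residual consistent
            b[j] = new
    return b
```

In the target-deconvolution setting the columns of X would be per-kinase inhibition profiles and y a phenotype readout; nonzero coefficients nominate the kinases driving the phenotype.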

Gujral, Taranjit Singh; Peshkin, Leonid; Kirschner, Marc W.



Applying AI tools to operational space environmental analysis  

NASA Technical Reports Server (NTRS)

The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and predicts future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTM's) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTM's using a graphical specification language. The ability to manipulate TTM's in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain specific event analysis abstractions. 
The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.

Krajnak, Mike; Jesse, Lisa; Mucks, John



Inversion of marine magnetic anomalies by deconvolution  

E-print Network

Using a large block size, it is demonstrated that the deconvolution technique can retrieve the equivalent source function when the anomaly due to each source block is clearly defined. The model profiles used were 115 samples in length, with a sample... magnetization structure used to calculate the anomaly. The ability of the inversion procedure to recover short-wavelength source blocks is limited by two factors: the sample interval of the magnetic...

Harry, Dennis Lee



Beta spectra deconvolution for liquid scintillation counting  

Microsoft Academic Search

This study presents the first results of a deconvolution method for Liquid Scintillation complex spectra. The method has been developed by means of the software MATLAB and is based on the utilization of Fourier Transforms. Its main target is to obtain a fast calculation procedure capable of unfolding complex spectra without requiring any preliminary knowledge of the peak shapes of
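Fourier-transform unfolding of a measured spectrum reduces to division in the frequency domain, with a small damping term to keep measurement noise from blowing up where the response spectrum is weak. The sketch below is a generic Wiener-style illustration of that idea, not the MATLAB procedure of the paper; names and the regularization constant are assumptions.

```python
import numpy as np

def fourier_deconvolve(measured, response, reg=1e-3):
    """Deconvolve a measured spectrum by the detector response in the
    Fourier domain with Wiener-style damping:
        M = S * R  ->  S_hat = F^-1[ F(M) conj(F(R)) / (|F(R)|^2 + reg) ].
    reg > 0 bounds the amplification of noise at frequencies where
    |F(R)| is small."""
    F_m = np.fft.fft(measured)
    F_r = np.fft.fft(response, n=len(measured))
    F_s = F_m * np.conj(F_r) / (np.abs(F_r)**2 + reg)
    return np.real(np.fft.ifft(F_s))
```

The attraction for beta spectra is speed: one forward and one inverse FFT per spectrum, with no peak-shape model required.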

Romolo Remetti; Alessandro Sessa



An integrated thermal management analysis tool [for aircraft  

Microsoft Academic Search

A computational tool, developed to perform subsystem and system level integrated thermal management assessment and design calculations, is described in this paper. The Vehicle Integrated Thermal Management Analysis Code (VITMAC) simulates the coupled thermal-fluid response of airframe/engine active cooling circuits, airframe/engine structural components, and fuel tanks subject to aeroheating and internal/engine heat loads. VITMAC simulates both the steady-state and transient

F. Issacci; A. Telal Wassel; V. Van Griethuysen



Galileo: A Tool for Dynamic Fault Tree Analysis  

Microsoft Academic Search

Galileo is a prototype software tool for dependability analysis of fault tolerant computer-based systems. Reliability models are specified using dynamic fault trees, which provide special constructs for modeling sequential failure modes in addition to standard combinatorial fault tree gates. Independent modules are determined automatically, and separate modules are solved combinatorially (using Binary Decision Diagrams) or using Markov methods.
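The combinatorial half of the split can be sketched directly: static AND/OR gates over independent basic events evaluate by simple probability algebra, while the dynamic gates (priority-AND, spares, sequence enforcers) are the part that needs Markov methods, as the abstract notes. The tree encoding below is an illustration, not Galileo's input format.

```python
def fault_tree_probability(node):
    """Evaluate the top-event probability of a *static* fault tree.

    Trees are nested tuples: ("basic", p), ("and", [children]),
    ("or", [children]). Basic events are assumed independent; dynamic
    gates are out of scope and would require a Markov model.
    """
    kind = node[0]
    if kind == "basic":
        return node[1]
    child_probs = [fault_tree_probability(c) for c in node[1]]
    if kind == "and":                    # all children fail
        p = 1.0
        for q in child_probs:
            p *= q
        return p
    if kind == "or":                     # at least one child fails
        p = 1.0
        for q in child_probs:
            p *= (1.0 - q)
        return 1.0 - p
    raise ValueError("unknown gate: " + str(kind))
```

For example, OR(A=0.1, AND(B=0.2, C=0.5)) gives 1 − 0.9·0.9 = 0.19.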

Joanne Bechta Dugan



Stranger: An Automata-Based String Analysis Tool for PHP  

Microsoft Academic Search

Stranger is an automata-based string analysis tool for finding and eliminating string-related security vulnerabilities in PHP applications. Stranger uses symbolic forward and backward reachability analyses to compute the possible values that the string expressions can take during program execution. Stranger can automatically (1) prove that an application is free from specified attacks or (2) generate vulnerability signatures that characterize all

Fang Yu; Muath Alkhalaf; Tevfik Bultan



Future trends in data archiving and analysis tools  

NASA Astrophysics Data System (ADS)

Data archiving is a pluri-disciplinary activity and, as far as possible, generic architecture should be used. In this context the conceptual "Open Archival Information System" developed by CCSDS has recently become an ISO standard. For the particular discipline of plasma physics it remains an open question as to what should be archived, calibrated or uncalibrated data: both have their advantages and disadvantages. In any case, data centres should deliver calibrated data to the end user, and in a format which he can readily use. It is important to be able to find data easily, wherever it may be; standardisation of the data description would simplify interoperability, that is, the searching of multiple data centres via a single operation. Many data analysis tools now exist, and several offer a large selection of generic applications thanks to the use of an internal data format. The idea of a common data format for all these systems seems utopian. Thus there is an urgent need to define a common data format which can be used by data centres to deliver data, and by analysis systems to accept external data and to communicate data between each other. The archiving of analysis tools is problematic, but the best tools will undoubtedly continue to evolve as long as they have users.

Harvey, C.; Huc, C.


Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool  

NASA Technical Reports Server (NTRS)

The Time Series Product Tool (TSPT) is software, developed in MATLAB , which creates and displays high signal-to- noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. 
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
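One classic temporal-processing step for beating cloud noise, in the spirit of the multi-satellite fusion described above, is maximum-value compositing: clouds depress vegetation-index values, so the per-pixel maximum over a window of Terra/Aqua observations suppresses them. This sketch is a generic illustration, not TSPT's MATLAB implementation; cloudy or bad observations are assumed to be flagged as NaN.

```python
import numpy as np

def max_value_composite(stack):
    """Maximum-value composite over a (time, rows, cols) NDVI stack.

    NaN marks cloudy/bad observations. Iterating frame by frame avoids
    the all-NaN-slice warning that np.nanmax would raise."""
    stack = np.asarray(stack, dtype=float)
    composite = np.full(stack.shape[1:], np.nan)
    for frame in stack:
        composite = np.where(np.isnan(composite), frame,
                             np.where(np.isnan(frame), composite,
                                      np.maximum(composite, frame)))
    return composite
```

Pixels cloudy in every frame stay NaN, signaling that the compositing window was too short for that location.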

McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall



Interactive Software Fault Analysis Tool for Operational Anomaly Resolution  

NASA Technical Reports Server (NTRS)

Resolving software operational anomalies frequently requires a significant amount of resources for software troubleshooting activities. The time required to identify a root cause of the anomaly in the software may lead to significant timeline impacts and in some cases, may extend to compromise of mission and safety objectives. An integrated tool that supports software fault analysis based on the observed operational effects of an anomaly could significantly reduce the time required to resolve operational anomalies; increase confidence for the proposed solution; identify software paths to be re-verified during regression testing; and, as a secondary product of the analysis, identify safety critical software paths.

Chen, Ken



ISAC: A tool for aeroservoelastic modeling and analysis  

NASA Technical Reports Server (NTRS)

The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

Adams, William M., Jr.; Hoadley, Sherwood Tiffany



Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.  

EPA Science Inventory

The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...


Tool for bonded optical element thermal stability analysis  

NASA Astrophysics Data System (ADS)

An analytical tool is presented which supports the opto-mechanical design of bonded optical elements. Given the mounting requirements from the optical engineer, the alignment stability and optical stresses in bonded optics can be optimized for the adhesive and housing material properties. While a perfectly athermalized mount is desirable, it is not realistic. The tool permits evaluation of element stability and stress over the expected thermal range at nominal, or worst case, achievable assembly and manufacturing tolerances. Selection of the most appropriate mount configuration and materials, which maintain the optical engineer's design, is then possible. The tool is based on a stress-strain analysis using Hooke's Law in the worst case plane through the optic centerline. The optimal bond line is determined for the selected adhesive, housing and given optic materials using the basic athermalization equation. Since a mounting solution is expected to be driven close to an athermalized design, the stress variations are considered linearly related to strain. A review of the equation set, the tool's input and output capabilities and formats, and an example are discussed.
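The "basic athermalization equation" is not reproduced in the abstract. A commonly used form, derivable by equating the radial growth of the cell bore (r + h) with that of the optic plus the bond layer, is h = r(α_m − α_g)/(α_a − α_m); the sketch below uses that form as an assumption, with illustrative material values (BK7-like glass in an aluminum cell with a soft adhesive).

```python
def athermal_bond_thickness(r_optic, alpha_glass, alpha_metal, alpha_adhesive):
    """Radial bond thickness h keeping a circular optic centered over
    temperature.

    Setting (r + h)*alpha_metal = r*alpha_glass + h*alpha_adhesive
    (equal radial growth of bore vs. optic + bond) and solving for h:
        h = r * (alpha_m - alpha_g) / (alpha_a - alpha_m)
    Units: r in mm, CTEs in 1/K; requires alpha_a > alpha_m > alpha_g.
    """
    return r_optic * (alpha_metal - alpha_glass) / (alpha_adhesive - alpha_metal)

# Illustrative values (assumed, not from the paper):
h = athermal_bond_thickness(r_optic=25.0,
                            alpha_glass=7.1e-6,    # BK7-like
                            alpha_metal=23.6e-6,   # aluminum
                            alpha_adhesive=102e-6) # soft RTV-like
```

Because real assemblies cannot hit this h exactly, the tool's job is to evaluate stress and decenter at the achievable tolerances around it.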

Klotz, Gregory L.



Integrated Modeling Tools for Thermal Analysis and Applications  

NASA Technical Reports Server (NTRS)

Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along their own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. 
The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model, and also makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, applications where the thermal modeling is "in the loop" with sensitivity analysis, optimization and optical performance, drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST), are presented.
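In the linear case, solving a conduction network at steady state reduces to one linear solve on the conduction Laplacian, with fixed-temperature boundary nodes moved to the right-hand side. The sketch below illustrates that structure only; it is not IMOS code, and temperature-dependent conductances would require the Newton iteration the abstract describes.

```python
import numpy as np

def steady_state_temperatures(n_nodes, conductances, fixed, loads=None):
    """Steady state of a linear thermal conduction network.

    conductances: list of (i, j, G) links [W/K];
    fixed: {node: temperature [K]} boundary conditions;
    loads: {node: heat input [W]}.
    Builds the conduction Laplacian K and solves K T = Q on free nodes.
    """
    K = np.zeros((n_nodes, n_nodes))
    for i, j, g in conductances:
        K[i, i] += g
        K[j, j] += g
        K[i, j] -= g
        K[j, i] -= g
    Q = np.zeros(n_nodes)
    for node, q in (loads or {}).items():
        Q[node] += q
    fixed_nodes = list(fixed)
    fixed_temps = np.array([fixed[n] for n in fixed_nodes], dtype=float)
    free = [n for n in range(n_nodes) if n not in fixed]
    T = np.zeros(n_nodes)
    T[fixed_nodes] = fixed_temps
    # Move the known-temperature terms to the right-hand side.
    rhs = Q[free] - K[np.ix_(free, fixed_nodes)] @ fixed_temps
    T[free] = np.linalg.solve(K[np.ix_(free, free)], rhs)
    return T
```

Since each structural node maps to a thermal node, the resulting T can be fed straight back into the structural model as a thermal load.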

Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis



Networking Sensor Observations, Forecast Models & Data Analysis Tools  

NASA Astrophysics Data System (ADS)

This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of the wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted area of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. 
We describe the process for enabling the fire location, smoke forecast, smoke observation, and population statistics services to be registered with the GEOSS registry and made findable through the GEOSS Clearinghouse. The fusion of data sources and different web service interfaces illustrates the agility of using standard interfaces and helps define the type of input and output interfaces needed to connect models and analysis tools within sensor webs.

Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.



Colossal Tooling Design: 3D Simulation for Ergonomic Analysis  

NASA Technical Reports Server (NTRS)

The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid



PERISCOPE: An Online-Based Distributed Performance Analysis Tool  

NASA Astrophysics Data System (ADS)

This paper presents PERISCOPE - an online distributed performance analysis tool that searches for a wide range of performance bottlenecks in parallel applications. It consists of a set of agents that capture and analyze application and hardware-related properties in an autonomous fashion. The paper focuses on the Periscope design, the different search methodologies, and the steps involved to do an online performance analysis. A new graphical user-friendly interface based on Eclipse is introduced. Through the use of this new easy-to-use graphical interface, remote execution, selection of the type of analysis, and the inspection of the found properties can be performed in an intuitive and easy way. In addition, a real-world application, namely the GENE code, a grand challenge problem of plasma physics, is analyzed using Periscope. The results are illustrated in terms of found properties and scalability issues.

Benedict, Shajulin; Petkov, Ventsislav; Gerndt, Michael



TA-DA: A Tool for Astrophysical Data Analysis  

SciTech Connect

We present the Tool for Astrophysical Data Analysis (TA-DA), a new software aimed to greatly simplify and improve the analysis of stellar photometric data in comparison with theoretical models, and allow the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.
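The Bayesian derivation of stellar parameters from multi-band photometry reduces, in its simplest form, to a chi-square posterior over a grid of synthetic magnitudes. The sketch below shows that structure with a one-parameter grid; it is a generic illustration, not TA-DA's IDL implementation, and the grid values in the test are invented.

```python
import numpy as np

def posterior_over_grid(grid_mags, obs_mags, obs_err, prior=None):
    """Posterior over a grid of model SEDs given observed magnitudes.

    grid_mags: (n_models, n_bands) synthetic photometry;
    obs_mags, obs_err: (n_bands,) observations and 1-sigma errors.
    Returns the normalized posterior  p_i ∝ prior_i * exp(-chi2_i / 2).
    """
    resid = (np.asarray(grid_mags) - obs_mags) / obs_err
    log_post = -0.5 * (resid**2).sum(axis=1)
    if prior is not None:
        log_post = log_post + np.log(prior)
    log_post -= log_post.max()          # numerical stability before exp
    post = np.exp(log_post)
    return post / post.sum()
```

In a full tool the grid axes would be stellar parameters (and extinction), with posteriors marginalized over nuisance dimensions rather than simply maximized.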

Da Rio, Nicola [European Space Agency, Keplerlaan 1, 2200-AG Noordwijk (Netherlands)]; Robberto, Massimo [Space Telescope Science Institute, 3700 San Martin Dr., Baltimore, MD 21218 (United States)]



Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution  

NASA Astrophysics Data System (ADS)

The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with remanence of the continuous sample. The convolution process results in smoothed measurement and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of sensor response and improve the resolution for continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that cross terms (i.e. response of the pick-up coil for one axis to magnetic signal along other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. 
Optimized deconvolution in UDECON is achieved by searching for the minimum ABIC while shifting the sensor response (to account for possible mispositioning of the sample on the tray) and varying a smoothness parameter within ranges defined by the user. Comparison of deconvolution results using the sensor response estimated from integrated point source measurements and from other methods suggests that the integrated point source estimate yields better results (smaller ABIC). The noise characteristics of magnetometer measurements and the reliability of the UDECON algorithm were tested using repeated (a total of 400) natural remanence measurements of a u-channel sample before and after stepwise alternating field demagnetization. Using a series of synthetic data constructed from a real paleomagnetic record, we demonstrate that optimized deconvolution using UDECON can greatly help reveal detailed paleomagnetic information, such as excursions, that may be smoothed out during pass-through measurement. Application of UDECON to the vast amount of existing and future pass-through paleomagnetic and rock magnetic measurements on sediments, recovered especially through ocean drilling programs, will contribute to our understanding of the geodynamo and paleo-environment by providing more detailed records of geomagnetic and environmental changes.
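The smoothness-prior ABIC idea described above can be illustrated with a short sketch: deconvolve a measured profile by regularized least squares, scanning a smoothness hyperparameter and keeping the value that minimizes a simplified ABIC. The sensor response, test signal, and the exact ABIC constants below are illustrative assumptions, not the UDECON implementation.

```python
import numpy as np

def deconvolve_abic(measured, response, smoothness_grid):
    """Regularized deconvolution, choosing the smoothness hyperparameter u2
    by minimizing a simplified ABIC (smoothness-prior form). Illustrative
    sketch only -- not the UDECON implementation."""
    n = len(measured)
    # Convolution (design) matrix G for 'same'-mode convolution.
    G = np.zeros((n, n))
    half = len(response) // 2
    for j in range(n):
        for k, r in enumerate(response):
            if 0 <= j + k - half < n:
                G[j + k - half, j] = r
    D = np.diff(np.eye(n), n=2, axis=0)        # second-difference roughness
    best = None
    for u2 in smoothness_grid:
        A = G.T @ G + u2 * (D.T @ D)
        x = np.linalg.solve(A, G.T @ measured)
        rss = np.sum((measured - G @ x) ** 2) + u2 * np.sum((D @ x) ** 2)
        _, logdet = np.linalg.slogdet(A)
        abic = n * np.log(rss) - D.shape[0] * np.log(u2) + logdet
        if best is None or abic < best[0]:
            best = (abic, u2, x)
    return best  # (minimum ABIC, chosen smoothness, deconvolved signal)

# Usage: sharpen a step signal smoothed by a symmetric sensor response.
rng = np.random.default_rng(0)
true = np.zeros(80); true[30:50] = 1.0
resp = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
meas = np.convolve(true, resp, mode="same") + 0.01 * rng.standard_normal(80)
abic, u2, est = deconvolve_abic(meas, resp, np.logspace(-6, 0, 13))
```

The grid search over both a response shift and the smoothness parameter, as the abstract describes, would simply add an outer loop over candidate shifts of `resp`.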

Xuan, C.; Oda, H.



Operations other than war: Requirements for analysis tools research report  

SciTech Connect

This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

Hartley, D.S. III



Least-squares (LS) deconvolution of a series of overlapping cortical auditory evoked potentials: a simulation and experimental study  

NASA Astrophysics Data System (ADS)

Objective. To evaluate the viability of disentangling a series of overlapping cortical auditory evoked potentials (CAEPs) elicited by different stimuli using least-squares (LS) deconvolution, and to assess the adaptation of CAEPs for different stimulus onset asynchronies (SOAs). Approach. Optimal aperiodic stimulus sequences were designed by controlling the condition number of the matrices associated with the LS deconvolution technique. First, theoretical considerations of LS deconvolution were assessed in simulations in which multiple artificial overlapping responses were recovered. Second, biological CAEPs were recorded in response to continuously repeated stimulus trains containing six different tone-bursts with frequencies of 8, 4, 2, 1, 0.5, and 0.25 kHz, separated by SOAs jittered around 150 (120-185), 250 (220-285) and 650 (620-685) ms. The control condition had a fixed SOA of 1175 ms. In a second condition, using the same SOAs, trains of six stimuli were separated by a silence gap of 1600 ms. Twenty-four adults with normal hearing (<20 dB HL) were assessed. Main results. Results showed disentangling of a series of overlapping responses using LS deconvolution on simulated waveforms as well as on real EEG data. The use of rapid presentation and LS deconvolution did not, however, allow the recovered CAEPs to have a higher signal-to-noise ratio than for slowly presented stimuli. The LS deconvolution technique enables the analysis of a series of overlapping responses in EEG. Significance. LS deconvolution is a useful technique for the study of adaptation mechanisms of CAEPs for closely spaced stimuli whose characteristics change from stimulus to stimulus. High-rate presentation is necessary to develop an understanding of how the auditory system encodes natural speech or other intrinsically high-rate stimuli.
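The design-matrix idea behind LS deconvolution can be sketched as follows: each stimulus type contributes a block of columns whose entries mark sample lags after its onsets, the condition number of the stacked matrix indicates whether the onset sequence makes the overlapping responses separable, and an ordinary least-squares solve disentangles them. The onset times, response shapes, and sizes below are invented for illustration.

```python
import numpy as np

def ls_deconvolve(eeg, onsets, resp_len):
    """Recover per-stimulus evoked responses from an overlapping recording.
    Each column of the design matrix has 1's at one sample lag following
    every onset of one stimulus type; an illustrative LS-deconvolution sketch."""
    n = len(eeg)
    cols = []
    for stim_onsets in onsets:              # one onset list per stimulus type
        block = np.zeros((n, resp_len))
        for t in stim_onsets:
            for lag in range(resp_len):
                if t + lag < n:
                    block[t + lag, lag] = 1.0
        cols.append(block)
    X = np.hstack(cols)
    cond = np.linalg.cond(X)                # well-chosen SOAs keep this small
    coef, *_ = np.linalg.lstsq(X, eeg, rcond=None)
    return coef.reshape(len(onsets), resp_len), cond

# Usage: two overlapping artificial responses are disentangled.
rng = np.random.default_rng(1)
r1 = np.sin(np.linspace(0, np.pi, 20)); r2 = -0.5 * r1
onsets = [list(range(0, 600, 55)), list(range(25, 600, 65))]  # staggered SOAs
eeg = np.zeros(640)
for t in onsets[0]: eeg[t:t + 20] += r1
for t in onsets[1]: eeg[t:t + 20] += r2
eeg += 0.01 * rng.standard_normal(640)
rec, cond = ls_deconvolve(eeg, onsets, 20)
```

Designing "optimal aperiodic sequences," as in the paper, amounts to choosing the onset lists so that `cond` stays low.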

Bardy, Fabrice; Van Dun, Bram; Dillon, Harvey; Cowan, Robert



Integrated network analysis and effective tools in plant systems biology  

PubMed Central

One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. The integration of multi-omics data, spanning the genome, transcriptome, proteome, and metabolome, with mathematical models is expected to expand our knowledge of complex plant metabolism. PMID:25408696

Fukushima, Atsushi; Kanaya, Shigehiko; Nishida, Kozo



TRIAC: A code for track measurements using image analysis tools  

NASA Astrophysics Data System (ADS)

A computer program named TRIAC, written in MATLAB, has been developed for track recognition and track parameter measurement from images of the Solid State Nuclear Track Detectors CR39. Using image analysis tools, the program counts the number of tracks for dosimetry purposes and classifies the tracks according to their radii for the spectrometry of alpha-particles. Comparisons of manual scanning counts with those output by the automatic system are presented for detectors exposed to a radon-rich environment. The system was also tested for its ability to differentiate tracks recorded by alpha-particles of different energies.
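The counting-and-classification step described above can be sketched with a few lines of image analysis: threshold the scanned image, label connected components, and estimate a radius for each track from its pixel area. The synthetic image and threshold are illustrative assumptions, not TRIAC's actual MATLAB pipeline (the sketch uses Python with SciPy).

```python
import numpy as np
from scipy import ndimage

def count_tracks(image, threshold):
    """Count etched tracks and estimate their radii by connected-component
    labelling; a simplified sketch of automated track scanning."""
    mask = image > threshold
    labels, n_tracks = ndimage.label(mask)
    radii = []
    for i in range(1, n_tracks + 1):
        area = np.sum(labels == i)
        radii.append(np.sqrt(area / np.pi))   # radius of the equal-area circle
    return n_tracks, radii

# Usage: two synthetic circular "tracks" of different sizes.
yy, xx = np.mgrid[0:64, 0:64]
img = ((xx - 16) ** 2 + (yy - 16) ** 2 < 25).astype(float)    # ~radius-5 track
img += ((xx - 45) ** 2 + (yy - 45) ** 2 < 100).astype(float)  # ~radius-10 track
n, radii = count_tracks(img, 0.5)
```

Binning the estimated radii into a histogram then gives the alpha-spectrometry classification the abstract mentions.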

Patiris, D. L.; Blekas, K.; Ioannides, K. G.



LEXER: A tool for lexical analysis of program input  

SciTech Connect

LEXER is a useful tool for lexical analysis. It is designed to give an application programmer the ability to write code that will quickly parse commands to an interactive program. It is also useful in parsing character data stored in a file. This is done by lexically analyzing the input character string and placing its components and related information into arrays stored in common blocks. The code is written in FORTRAN which conforms to the ANSI Standard FORTRAN 77 in all but a few carefully documented areas. 2 refs.

Kephart, E.M.; Selleck, C.B.



A Design and Performance Analysis Tool for Superconducting RF Systems  

NASA Astrophysics Data System (ADS)

Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall plug power efficiency. Typical examples are CEBAF at Jefferson Lab and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper we describe a set of tools, to be used with MATLAB and SIMULINK, which allow the user to analyse the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The rf-control-relevant electrical and mechanical characteristics of the cavity are described in the form of time-varying state space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback, several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feedforward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and the TESLA Linac rf systems using these tools are presented.
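The flavor of such a simulation can be conveyed by a deliberately tiny baseband model: treat the cavity as a first-order low-pass system driven by a generator, add a constant perturbation, and compare open-loop operation against proportional feedback plus feedforward. All numbers (bandwidth, gain, disturbance) are invented for illustration and have no connection to the actual CEBAF or TESLA parameters.

```python
import numpy as np

def simulate_cavity(steps, dt, w_half, gain, setpoint, disturbance):
    """Toy baseband cavity model dV/dt = -w_half*V + w_half*(drive + d),
    with feedforward plus proportional feedback on the accelerating
    voltage. An illustrative sketch only, not the paper's tool set."""
    V = 0.0
    out = []
    for k in range(steps):
        drive = setpoint + gain * (setpoint - V)   # feedforward + P feedback
        V += dt * (-w_half * V + w_half * (drive + disturbance[k]))
        out.append(V)
    return np.array(out)

# Usage: a constant perturbation is suppressed by roughly (1 + gain).
dt, w_half = 1e-6, 2 * np.pi * 200.0     # 1 us steps, 200 Hz half-bandwidth
dist = np.full(20000, 0.1)               # constant field perturbation
open_loop = simulate_cavity(20000, dt, w_half, 0.0, 1.0, dist)
closed_loop = simulate_cavity(20000, dt, w_half, 50.0, 1.0, dist)
```

The full state-space models the abstract describes add detuning dynamics, saturation, and noise, but the closed-loop error suppression shown here is the basic effect being analyzed.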

Schilcher, Th.; Simrock, S. N.; Merminga, L.; Wang, D. X.



A fast approach to identification using deconvolution  

NASA Technical Reports Server (NTRS)

In this paper, we propose a fast approach to impulse response and noise-variance identification for a finite-order, linear, time-invariant, single-input/single-output system, whose input driving noise is white (stationary or nonstationary) and measurement noise is stationary, white and Gaussian. Our algorithm is an iterative block component method that includes two stages, deconvolution and prediction-error identification. Experiences with our method indicate that it works well and saves about an order of magnitude in computation. Analyses and examples are given in this paper to support this claim.

Chi, C.-Y.; Mendel, J. M.



Recovery of Dynamic PET Regions via Simultaneous Segmentation and Deconvolution  

E-print Network

We present a method for the simultaneous segmentation and deconvolution of dynamic PET images. By incorporating the PSF of the imaging system into our segmentation model, we compensate for the partial volume effect. We show improved segmentation results, and outperform two state-of-the-art dynamic PET segmentation methods.

Möller, Torsten


Ultrasound medical image deconvolution using CLEAN algorithm  

E-print Network

The reconstruction problem of ultrasound medical images using a blind deconvolution algorithm has been recognized, accounting for parameters such as noise or diffusive effects in tissues which produce the speckle noise. We intend

Boyer, Edmond


On the Optimal Rates of Convergence for Nonparametric Deconvolution Problems  

Microsoft Academic Search

Deconvolution problems arise in a variety of situations in statistics. An interesting problem is to estimate the density $f$ of a random variable $X$ based on $n$ i.i.d. observations from $Y = X + \varepsilon$, where $\varepsilon$ is a measurement error with a known distribution. In this paper, the effect of errors in variables on nonparametric deconvolution is examined. Insights
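The estimator at the heart of such problems, the deconvoluting kernel density estimator, divides the empirical characteristic function of $Y$ by the known error characteristic function over a bandwidth-limited frequency window and inverts the result. Below is an illustrative numerical sketch for Gaussian error using a sinc (Fourier-cutoff) kernel; the sample size, bandwidth, and grids are arbitrary choices, not taken from the paper.

```python
import numpy as np

def deconvolution_kde(y, noise_sd, grid, h):
    """Deconvoluting kernel density estimate of f_X from samples of
    Y = X + eps with known Gaussian error (illustrative sketch). The sinc
    kernel restricts the division to frequencies |t| <= 1/h."""
    t = np.linspace(-1.0 / h, 1.0 / h, 801)
    # Empirical characteristic function of Y at each frequency t.
    ecf = np.mean(np.exp(1j * np.outer(t, y)), axis=1)
    # Divide out the (real, positive) Gaussian error characteristic function.
    phi_eps = np.exp(-0.5 * (noise_sd * t) ** 2)
    integrand = ecf / phi_eps
    dt = t[1] - t[0]
    f = np.array([np.real(np.sum(np.exp(-1j * t * x) * integrand)) * dt
                  for x in grid]) / (2 * np.pi)
    return np.maximum(f, 0.0)     # clip small negative inversion artifacts

# Usage: recover a N(0,1) density from observations with N(0, 0.5^2) error.
rng = np.random.default_rng(2)
x = rng.standard_normal(4000)
y = x + 0.5 * rng.standard_normal(4000)
grid = np.linspace(-4, 4, 81)
f_hat = deconvolution_kde(y, 0.5, grid, h=0.4)
```

The rates-of-convergence question the paper studies is visible here: the smoother the error density, the faster `phi_eps` decays, and the harder the division becomes.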

Jianqing Fan



Mineral abundance determination: Quantitative deconvolution of thermal emission spectra  

Microsoft Academic Search

A linear retrieval (spectral deconvolution) algorithm is developed and applied to high-resolution laboratory infrared spectra of particulate mixtures and their end-members. The purpose is to place constraints on, and test the viability of, linear spectral deconvolution of high-resolution emission spectra. The effects of addition of noise, data reproducibility, particle size variation, an increasing number of minerals in the mixtures,
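Linear spectral deconvolution of this kind amounts to a constrained least-squares unmixing problem: find non-negative end-member abundances whose weighted sum of spectra best reproduces the measured mixture spectrum. The sketch below uses synthetic Gaussian "bands" and non-negative least squares; it illustrates the general technique, not the authors' retrieval code.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(mixture, endmembers):
    """Linear spectral deconvolution: solve for non-negative end-member
    abundances that best fit the measured spectrum (least-squares sketch;
    abundances renormalized to sum to 1)."""
    A = np.column_stack(endmembers)      # one column per end-member spectrum
    coef, resid = nnls(A, mixture)
    return coef / coef.sum(), resid

# Usage: a 60/40 two-mineral mixture is recovered from its spectrum.
wav = np.linspace(0, 1, 200)
em1 = np.exp(-((wav - 0.3) / 0.05) ** 2)   # synthetic end-member bands
em2 = np.exp(-((wav - 0.7) / 0.08) ** 2)
mix = 0.6 * em1 + 0.4 * em2
abund, _ = unmix(mix, [em1, em2])
```

The robustness questions the abstract raises (noise, particle size, number of end-members) correspond to how the residual and the conditioning of `A` degrade as those factors change.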

Michael S. Ramsey; Philip R. Christensen



Removing Grid Effect of 3-D Euler Deconvolution using Rotating Coordinate  

NASA Astrophysics Data System (ADS)

A new method for removing the grid effect of Euler deconvolution is suggested. Gridding is an inevitable step in calculating Euler deconvolution, but in the process of gridding and calculating derivatives using the FFT, errors such as the Gibbs phenomenon, edge effects, and the circular convolution problem are generated. Solutions (locations and depths of anomalous bodies) are calculated at all window locations; as a result, many undesirable, scattered solutions are obtained, which can appear meaningless. In this paper, a rotating-coordinate technique is used to remove the grid effect of Euler deconvolution. Once Euler deconvolution has been calculated for the locations and depths of anomalous bodies at all window positions, the coordinate system is rotated, in 15-degree increments up to 90 degrees, and the locations and depths are re-calculated. The process for removing the grid effect is as follows: (1) rotate the gridded potential data from 0 to 90 degrees; (2) calculate the conventional 3-D Euler deconvolution for each data set; (3) re-rotate the Euler solutions (locations and depths) to the original coordinate system; (4) collocate the solutions and remove any solution that does not appear in the others. A total of 7 solution sets is achieved from these procedures, and the common locations and depths that appear across the solution sets are accepted. The effectiveness of the rotating technique is evaluated using a rectangular prism model 1 km thick at 5 km depth below the ground surface. Random noise was also added to verify the rotating technique. When noise is added, the depth accuracy becomes lower, but the locations are still well estimated. This method is also applied to the tectonic interpretation of Eastern Asia, including Korea, China, and Japan, using GRACE and CHAMP satellite gravity and magnetic data
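The windowed solve at the center of Euler deconvolution is a small linear least-squares problem. Euler's homogeneity equation, $(x-x_0)\partial T/\partial x + (y-y_0)\partial T/\partial y + (z-z_0)\partial T/\partial z = N(B-T)$, is rearranged so that the unknowns $(x_0, y_0, z_0, B)$ appear linearly. The sketch below recovers a synthetic pole-like point source with structural index N = 1; the field and geometry are invented for illustration, and the rotation/collocation bookkeeping of the paper is omitted.

```python
import numpy as np

def euler_window(x, y, z, T, Tx, Ty, Tz, N):
    """Least-squares solve of Euler's homogeneity equation in one window:
    (x-x0)Tx + (y-y0)Ty + (z-z0)Tz = N(B - T), rearranged so the source
    position (x0, y0, z0) and background field B enter linearly."""
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + N * T
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol  # x0, y0, z0, B

# Usage: synthetic point source T = 1/r buried at (5, 5) and 3 km depth,
# observed with exact analytic gradients on the z = 0 plane.
gx, gy = np.meshgrid(np.linspace(0.0, 10.0, 11), np.linspace(0.0, 10.0, 11))
x = gx.ravel(); y = gy.ravel(); z = np.zeros_like(x)
dx, dy, dz = x - 5.0, y - 5.0, z - 3.0
r = np.sqrt(dx ** 2 + dy ** 2 + dz ** 2)
T = 1.0 / r
Tx, Ty, Tz = -dx / r ** 3, -dy / r ** 3, -dz / r ** 3
x0, y0, z0, B = euler_window(x, y, z, T, Tx, Ty, Tz, N=1)
```

On real gridded data the gradients come from FFT derivatives, which is exactly where the grid-effect errors the abstract discusses enter.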

Hwang, J.; Yu, S.; Kim, C.; Min, K.; Kim, J.



Nonlinear deconvolution with deblending: a new analyzing technique for spectroscopy  

NASA Astrophysics Data System (ADS)

Context: Spectroscopy data in general often deal with an entanglement of spectral line properties, especially in the case of blended line profiles, independently of how high the quality of the data may be. In stellar spectroscopy and spectropolarimetry, where atomic transition parameters are usually known, the use of multi-line techniques to increase the signal-to-noise ratio of observations has become common practice. These methods extract an average line profile by means of either least squares deconvolution (LSD) or principal component analysis (PCA). However, only a few methods account for the blending of line profiles, and when they do, they assume that line profiles add linearly. Aims: We abandon the simplification of linear line-adding for Stokes I and present a novel approach that accounts for the nonlinearity in blended profiles, also illuminating the process of a reasonable deconvolution of a spectrum. Only the combination of these two enables us to treat spectral line variables independently, constituting our method of nonlinear deconvolution with deblending (NDD). The improved interpretation of a common line profile compensates for the additional expense in calculation time, especially when it comes to the application to (Zeeman) Doppler imaging (ZDI). Methods: By examining how absorption lines of different depths blend with each other and describing the effects of line-adding in a mathematically simple, yet physically meaningful way, we discover how it is possible to express a total line depth as a (nonlinear) combination of contributing individual components. Thus, we disentangle blended line profiles and underlying parameters in a truthful manner and strongly increase the reliability of the common line patterns retrieved. 
Results: By comparing different versions of LSD with our NDD technique applied to simulated atomic and molecular intensity spectra, we are able to illustrate the improvements provided by our method to the interpretation of the recovered mean line profiles. As a consequence, it is possible for the first time to retrieve an intrinsic line pattern from a molecular band, offering the opportunity to fully include molecular lines in an NDD-based ZDI. However, we also show that strong line broadening prevents a unique solution for heavily blended lines, such as in molecular bandheads.

Sennhauser, C.; Berdyugina, S. V.; Fluri, D. M.



General Mission Analysis Tool (GMAT) Architectural Specification. Draft  

NASA Technical Reports Server (NTRS)

Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross-platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculate relevant parameters during this propagation, and maneuver individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform-agnostic environment. 
The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

Hughes, Steven P.; Conway, Darrel, J.



PyRAT (python radiography analysis tool): overview  

SciTech Connect

PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) a hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) use of Python libraries for radiographic image processing and analysis; (4) use of the Tikhonov regularization method for the linear inverse problem to recover partial information about object configurations; (5) use of a priori knowledge of problem solutions to define the feasible region and discrete neighborhood for the MVO problem - initial data analysis + material library yields a priori knowledge; and (6) use of the NOMAD (C++ version) software.

Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory



Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool  

SciTech Connect

This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

Unwin, Stephen D.; Seiple, Timothy E.



Ganalyzer: A Tool for Automatic Galaxy Image Analysis  
SciTech Connect

We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download, and the data used in the experiment are also available.
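The radial intensity plot measurement can be sketched numerically: sample the image intensity around circles of increasing radius, record the brightest angle at each radius, and fit the slope of peak angle versus radius, which stays near zero for a featureless galaxy and is clearly nonzero along a spiral arm. The synthetic spiral image below is an invented stand-in for a real galaxy image, not Ganalyzer's own pipeline.

```python
import numpy as np

def spirality_slope(image, cx, cy, radii, n_theta=360):
    """Sketch of a radial-intensity-plot measurement: for each radius,
    sample intensity around the circle, take the brightest angle, then
    fit the slope of peak angle vs. radius."""
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    peaks = []
    for r in radii:
        xs = np.clip((cx + r * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
        ys = np.clip((cy + r * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
        peaks.append(theta[np.argmax(image[ys, xs])])
    peaks = np.unwrap(np.array(peaks))
    return np.polyfit(radii, peaks, 1)[0]

# Usage: a synthetic spiral arm (peak angle ~ 0.05 * radius) is drawn as a
# thickened curve, and the fitted slope comes out clearly nonzero.
img = np.zeros((128, 128))
for r in np.arange(10, 50, 0.5):
    ang = 0.05 * r
    py, px = int(64 + r * np.sin(ang)), int(64 + r * np.cos(ang))
    img[py - 1:py + 2, px - 1:px + 2] = 1.0
slope = spirality_slope(img, 64, 64, np.arange(10, 50))
```

A non-spiral test image (e.g., a filled disk) would yield a slope indistinguishable from zero, which is the morphological separation the tool exploits.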

Shamir, Lior, E-mail: [Department of Computer Science, Lawrence Technological University, 21000 West Ten Mile Road, Southfield, MI 48075 (United States)



The Climate Data Analysis Tools (CDAT): Scientific Discovery Made Easy  

NASA Astrophysics Data System (ADS)

In recent years, the amount of data available to climate scientists has grown exponentially. Whether we look at the increasing number of organizations providing data, the finer resolutions of climate models, or the escalating number of experiments and realizations for those experiments, every aspect of climate research leads to an unprecedented growth in the volume of data to analyze. The recent success and visibility of the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) is boosting the demand to unprecedented levels and keeping the numbers increasing. Meanwhile, the technology available for scientists to analyze the data has remained largely unchanged since the early days. One tool, however, has proven itself flexible enough not only to follow the trend of escalating demand, but also to be ahead of the game: the Climate Data Analysis Tools (CDAT) from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). While providing the cutting-edge technology necessary to distribute the IPCC AR4 data via the Earth System Grid, PCMDI has continuously evolved CDAT to handle new grids and higher definitions, and to provide new diagnostics. In the near future, in time for AR5, PCMDI will use CDAT for state-of-the-art remote data analysis in a grid computing environment.

Doutriaux, C. M.; Williams, D. N.; Drach, R. S.; McCoy, R. B.; Mlaker, V.



Psychometric analysis of the Perceived Value of Certification Tool.  


The 18-item Perceived Value of Certification Tool (PVCT) was developed to support a multiphased research initiative related to assessing certification value among perioperative nurses. This article addresses the reliability and validity evaluation of the PVCT in a study of three samples of perioperative nurses: certificants (n = 954), noncertificants (n = 675), and administrators (n = 694). Factor analysis identified two-factor solutions for noncertificants and certificants. A three-factor solution for administrators was not clear; a two-factor solution was more interpretable. Explained variance ranged between 56.1% and 60.8% for the two factors of intrinsic and extrinsic value of certification. Confirmatory factor analysis model fit statistics for the two-factor model showed an acceptable fit of the data to the model. Internal consistency reliability (coefficient alpha) for the total PVCT ranged between .93 and .95 for the three samples. Coefficient alphas ranged from .92 to .94 for the intrinsic value and from .84 to .86 for the extrinsic value among samples. Responses to the PVCT were also shown to adequately and correctly classify 76.9% of certificants and 48.2% of noncertificants. Overall, the PVCT is a valid and reliable tool for measuring the perceived value of certification. PMID:16873048

Sechrist, Karen R; Berlin, Linda E



XQCAT: eXtra Quark Combined Analysis Tool  

E-print Network

XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed at determining exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package is subject to future updates to include further searches by both the ATLAS and CMS collaborations. The input for the code is a text file in which the masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis considered individually and, when possible, in statistical combination.

Barducci, D; Buchkremer, M; Marrouche, J; Moretti, S; Panizzi, L



Coupled lattice neural network for blind deconvolution  

NASA Astrophysics Data System (ADS)

In this research, we introduce an artificial neural network model, named the coupled lattice neural network, to reconstruct an original image from a degraded one by blind deconvolution, where neither the original image nor the blurring function is known. In the coupled lattice neural network, each neuron connects with its nearest-neighbor neurons; the neighborhood corresponds to the weights of the neural network and is defined by a finite domain. The outputs of the neurons show the intensity distribution of an estimated original image. The weights of each neuron correspond to an estimated blur function and are the same for all neurons. The coupled lattice neural network includes two main operations: one is a nearest-neighbor coupling or diffusion, the other is a local nonlinear reflection and learning. First, a rule for growing the blur function is introduced. The coupled lattice neural network then evolves an estimated original image based on the estimated blur function. Moreover, we define a growing error criterion to control the evolution of the coupled lattice neural network. When the error criterion is minimized, the coupled neural network becomes stable; the outputs of the neural network then correspond to the reconstructed original image, and the weights are the blur function. In addition, we demonstrate a method for choosing the initial state variables of the coupled lattice neural network. The new approach to blind deconvolution can successfully recover a digital binary image. Moreover, the coupled lattice neural network can be used in the reconstruction of a gray-scale image.
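A minimal way to see the structure of a blind deconvolution problem like this one, where the image and the blur must be estimated together, is alternating least squares: with the blur fixed, the signal estimate is a regularized linear solve; with the signal fixed, the blur estimate is another linear solve; and the regularized objective never increases. This generic sketch stands in for, and is not, the coupled lattice neural network of the paper.

```python
import numpy as np

def conv_matrix(kernel, n):
    """'Same'-mode convolution with `kernel` expressed as an n-by-n matrix."""
    C = np.zeros((n, n))
    half = len(kernel) // 2
    for j in range(n):
        for k, v in enumerate(kernel):
            if 0 <= j + k - half < n:
                C[j + k - half, j] = v
    return C

def blind_deconvolve(y, kernel_len, iters=15, lam=1e-3):
    """Alternating least-squares blind deconvolution sketch: re-estimate
    the signal with the blur fixed, then the blur with the signal fixed.
    Each half-step exactly minimizes the objective in one variable, so
    ||y - h*x||^2 + lam*||x||^2 is non-increasing."""
    n = len(y)
    half = kernel_len // 2
    h = np.zeros(kernel_len)
    h[half] = 1.0                         # start from an identity blur
    x = y.copy()
    objectives = []
    for _ in range(iters):
        H = conv_matrix(h, n)             # x-step: exact ridge solve
        x = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)
        X = np.zeros((n, kernel_len))     # h-step: the model is linear in h too
        for k in range(kernel_len):
            for i in range(n):
                if 0 <= i - k + half < n:
                    X[i, k] = x[i - k + half]
        h, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - conv_matrix(h, n) @ x
        objectives.append(np.sum(resid ** 2) + lam * np.sum(x ** 2))
    return x, h, objectives

# Usage: a sparse binary signal blurred by an unknown 3-tap kernel.
rng = np.random.default_rng(3)
true_x = (rng.random(60) > 0.8).astype(float)
true_h = np.array([0.25, 0.5, 0.25])
y = conv_matrix(true_h, 60) @ true_x + 0.01 * rng.standard_normal(60)
x_hat, h_hat, objs = blind_deconvolve(y, 3)
```

The monotone objective plays the role of the paper's "growing error criterion": iteration stops once it no longer decreases.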

Wang, Ning; Chen, Yen-Wei; Nakao, Zensho



An online database for plant image analysis software tools  

PubMed Central

Background Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is best suited for their research. Results We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website presents each software package in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. Conclusions The database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work. PMID:24107223



Spectral Analysis As A Tool To Investigate Phosphorus Transport  

NASA Astrophysics Data System (ADS)

The transport of phosphorus (P), a major source of pollution in freshwater ecosystems, is directly linked to the flow paths and travel time of water through a catchment. The objective of this research is to understand P movement for improved management of drinking water resources. Here we used spectral analysis to show the differences in long-term P transport trends between an active agricultural watershed (160 ha) and an abandoned agricultural forested watershed (85 ha) in the Catskill Mountains, NY, specifically in Delaware County. The watersheds were close to each other, 6.4 km apart, so that hydro-meteorological differences were small. The results suggest interesting shifts in P transport behavior when historically fertilized land is abandoned and allowed to revert to forest. Spectral analysis, a frequency-domain time series analysis method, has been successfully used to analyze long-term time series data, quantify travel time distributions, and measure the watershed-scale retardation factor for reactive solutes (Kirchner et al. 2000, Nature 403:524-527; Kirchner et al. 2001, J. Hydrol. 254:82-101). We found that spectral analysis is a useful tool for analyzing long-term time series of water and chemical flux records, for understanding ecosystem responses to disturbance, and for providing an in-depth view of the long-term effects of changing agricultural and natural resource management practices.
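The basic operation in this kind of spectral analysis is comparing power-spectrum slopes of input and output time series on a log-log plot: a catchment that damps high-frequency variability turns a flat (white) input spectrum into a steep (reddened) output spectrum. A sketch, with invented series standing in for hydro-chemical records:

```python
import numpy as np

def power_spectrum_slope(series, dt):
    """Fit the log-log slope of the power spectrum of a time series, the
    quantity this style of spectral analysis compares between catchment
    inputs and outputs (illustrative sketch)."""
    series = series - np.mean(series)
    freqs = np.fft.rfftfreq(len(series), dt)[1:]      # drop the DC bin
    power = np.abs(np.fft.rfft(series))[1:] ** 2
    slope, intercept = np.polyfit(np.log(freqs), np.log(power), 1)
    return slope

# Usage: white noise has a roughly flat spectrum (slope near 0), while its
# running sum is strongly reddened (slope near -2).
rng = np.random.default_rng(4)
white = rng.standard_normal(4096)
red = np.cumsum(white)
s_white = power_spectrum_slope(white, 1.0)
s_red = power_spectrum_slope(red, 1.0)
```

Comparing such slopes between the rainfall input and the P concentration output of each watershed is what reveals the transport (travel-time) differences discussed above.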

Taylor, J. C.; Walter, M. T.; Bishop, P.; Steenhuis, T. S.



Design and Analysis Tool for External-Compression Supersonic Inlets  

NASA Technical Reports Server (NTRS)

A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.

Slater, John W.



First Annual Conference on Intelligence Analysis Methods and Tools, May 2005 PNNL-SA-44274 Top Ten Needs for Intelligence Analysis Tool Development  

E-print Network

Top Ten Needs for Intelligence Analysis Tool Development. Richard V. Badalamente and Frank L. Greitzer, Battelle. ... to generate ideas about future enhancements to software systems designed to aid intelligence analysts


LogTool: A Flexible, Publicly Available Data Analysis Tool Providing Graphical Analysis, Extraction of Data Subsets and Daytyping for Multiple Data Formats  

E-print Network


Qualmann, R. L.; Goudge, P.; Baker, M.



CHRONOS's Paleontological-Stratigraphic Interval Construction and Analysis Tool (PSICAT)  

NASA Astrophysics Data System (ADS)

The Paleontological-Stratigraphic Interval Construction and Analysis Tool (PSICAT) is a Java-based graphical editing tool for creating and viewing stratigraphic column diagrams from drill cores and outcrops. It is customized to the task of working with stratigraphic columns and captures data digitally as you draw and edit the diagram. The data and diagrams are captured in open formats, and integration with the CHRONOS system will allow the user to easily upload their data and diagrams into CHRONOS. Because the data and diagrams are stored in CHRONOS, they will be accessible to anyone, anywhere, at any time. PSICAT is designed with a modular, plug-in-based architecture that will allow it to support a wide variety of functionality, tasks, and geoscientific communities. PSICAT is currently being developed for use by the ANDRILL project on their upcoming drilling expeditions in Antarctica, but a general community version will also be available. PSICAT will allow unprecedented communication between Antarctica-based scientists and shore-based scientists, potentially allowing shore-based scientists to interact in almost real time with on-ice operations and data collection.

Reed, J. A.; Cervato, C.; Fielding, C. R.; Fils, D.



VisIt: Interactive Parallel Visualization and Graphical Analysis Tool  

NASA Astrophysics Data System (ADS)

VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. See the table below for more details about the tool's features. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

Department Of Energy (DOE) Advanced Simulation; Computing Initiative (ASCI)



3D Scanning Technology as a Standard Archaeological Tool for Pottery Analysis: Practice and Theory  

E-print Network

The authors present 3D scanning technology as a practical and reliable tool in archaeological research. Keywords: 3D pottery analysis. The use of 3D scanning as a practical tool to accompany and serve archaeological projects had not previously reached beyond its embryonic stage.


A Simulation Tool for Analysis of Alternative Paradigms for the New Electricity Business  

E-print Network

This paper presents preliminary results on the development of a simulation tool to perform this analysis. Since power systems are ultimately governed by the laws of physics, the heart of the tool is a power system simulation engine.


Combinatorial tools for the analysis of transcriptional regulation  

SciTech Connect

In this paper, we discuss virtual experiments for the study of major regulatory processes such as translation, signaling, or transcription pathways. An essential part of these processes is the formation of protein clusters held together by a small number of binding domains that can be shared by many different proteins. Analysis of these clusters is complicated by the vast number of different arrangements of proteins that can trigger a specific reaction. We propose combinatorial tools that can help predict the effects on the rate of transcription of either changes in transcription factor concentrations or the introduction of chimeras combining domains not usually present on a protein. 15 refs., 5 figs., 3 tabs.

Bergeron, A.; Gaul, E.; Bergeron, D. [Universite du Quebec a Montreal (Canada)
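The combinatorial flavor of the record above can be illustrated with a toy enumeration (all protein and domain names below are hypothetical, not from the paper): count the distinct ways an ordered set of binding domains can be filled by proteins that each recognize only certain domain types.

```python
from itertools import product

# Hypothetical illustration: each protein lists the binding-domain types it
# can occupy on a scaffold.
proteins = {
    "TF_A": {"d1", "d2"},
    "TF_B": {"d2"},
    "TF_C": {"d1", "d3"},
}

def arrangements(domain_seq):
    """Enumerate all assignments of proteins to an ordered list of domains."""
    choices = [
        [p for p, doms in proteins.items() if d in doms]
        for d in domain_seq
    ]
    return list(product(*choices))

combos = arrangements(["d1", "d2", "d3"])
print(len(combos))  # number of distinct protein arrangements on the scaffold
```

Even three proteins over three domains produce several arrangements; real clusters with shared domains grow combinatorially, which is the motivation the abstract gives for dedicated counting tools.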



Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning  

PubMed Central

Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance compared with existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422

Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C.






Lagrangian analysis. Modern tool of the dynamics of solids  

NASA Astrophysics Data System (ADS)

Explosive metal-working, material synthesis under shock loading, terminal ballistics, and explosive rock-blasting are some of the civil and military fields of activity that call for a wider knowledge about the behavior of materials subjected to strong dynamic pressures. It is in these fields that Lagrangian analysis methods, the subject of this work, prove to be a useful investigative tool for the physicist. Lagrangian analysis was developed around 1970 by Fowles and Williams. The idea is based on the integration of the conservation equations of mechanics using stress or particle velocity records obtained by means of transducers placed in the path of a stress wave. In this way, all the kinematical and mechanical quantities contained in the conservation equations are obtained. In the first chapter the authors introduce the mathematical tools used to analyze plane and spherical one-dimensional motions. For plane motion, they describe the mathematical analysis methods pertinent to the three regimes of wave propagation encountered: the non-attenuating unsteady wave, the simple wave, and the attenuating unsteady wave. In each of these regimes, cases are treated for which either stress or particle velocity records are initially available. The authors emphasize that either group of data (stress or particle velocity) is sufficient to integrate the conservation equations in the case of plane motion, whereas both groups of data are necessary in the case of spherical motion. However, in spite of this additional difficulty, Lagrangian analysis of the spherical motion remains particularly interesting for the physicist because it allows access to the behavior of the material under deformation processes other than that imposed by plane one-dimensional motion. The methods expounded in the first chapter are based on Lagrangian measurement of particle velocity and stress in relation to time in a material compressed by a plane or spherical dilatational wave. 
The Lagrangian specificity of the required measurements is assured by the fact that a transducer enclosed within a solid material is necessarily linked in motion to the particles of the material which surround it. This Lagrangian instrumentation is described in the second chapter. The authors are concerned with the techniques considered today to be the most effective. These are, for stress: piezoresistive gauges (50 Ω and low impedance) and piezoelectric techniques (PVF2 gauges, quartz transducers); and for particle velocity: electromagnetic gauges, and VISAR and IDL Doppler laser interferometers. In each case both the physical principles and the techniques of use are set out in detail. For the most part, the authors draw on their own experience to describe the calibration of these instrumentation systems and to compare their characteristics: measurement range, response time, accuracy, useful recording time, detection area, and so on. These characteristics should be taken into account by the physicist when choosing the instrumentation systems best adapted to the Lagrangian analysis to be applied to any given material. The discussion at the end of chapter 2 should guide this choice both for plane and spherical one-dimensional motions. The third chapter examines to what extent the accuracy of Lagrangian analysis is affected by the accuracy of the numerical analysis methods and experimental techniques. By means of a discussion of different cases of analysis, the authors make the reader aware of the different sources of error that may be encountered. This work brings up to date the state of studies on Lagrangian analysis methods, based on a wide review of bibliographical sources together with the contribution made to research in this field by the four authors themselves over the last ten years. 

Cagnoux, J.; Chartagnac, P.; Hereil, P.; Perez, M.; Seaman, L.
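The core integration step described above can be sketched numerically (a minimal sketch with assumed density, gauge depths, and synthetic velocity records, not the authors' method): in Lagrangian coordinate h, momentum conservation reads rho0 * du/dt = -d(sigma)/dh, so stress histories at successive gauge depths follow from particle-velocity records once the stress at one gauge is known.

```python
import numpy as np

rho0 = 2700.0                      # initial density [kg/m^3], assumed
h = np.array([0.0, 1e-3, 2e-3])    # Lagrangian gauge depths [m], assumed
t = np.linspace(0.0, 1e-6, 200)    # common time base [s]

# Synthetic particle-velocity records: a ramp wave arriving later at
# deeper gauges (wave speed assumed).
c = 5000.0
u = np.array([np.clip((t - hi / c) * 1e9, 0.0, 100.0) for hi in h])

dudt = np.gradient(u, t, axis=1)               # du/dt at each gauge
sigma0 = np.zeros_like(t)                      # stress at first gauge (given)
# Integrate -rho0 * du/dt over h, trapezoid rule between adjacent gauges.
sigma = sigma0 - rho0 * np.cumsum(
    np.concatenate([np.zeros((1, t.size)),
                    0.5 * (dudt[1:] + dudt[:-1]) * np.diff(h)[:, None]]),
    axis=0)
print(sigma.shape)  # one reconstructed stress history per gauge depth
```

Real analyses must also handle the simple-wave and attenuating regimes the chapter distinguishes; this sketch covers only the direct finite-difference integration between gauges.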


Expectation Maximization for Joint Deconvolution and Statistics Estimation  

NASA Astrophysics Data System (ADS)

Biomedical ultrasound image quality is limited by the blurring of tissue reflectivity introduced by the transducer Point Spread Function (PSF). Deconvolution techniques can be used to recover the pure tissue response, otherwise called the reflectivity function. Typically, deconvolution methods are developed solely to improve visual image quality. In this work we present an Expectation Maximization (EM) framework for US image deconvolution in which the local statistical description of the tissue reflectivity is restored as well, so that features extracted from the deconvolved frame can theoretically be used for classification purposes.

Alessandrini, M.; Palladini, A.; De Marchi, L.; Speciale, N.
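For context, the baseline such EM frameworks improve upon can be sketched as a plain Wiener deconvolution of a 1D RF line (this is an illustrative baseline with an assumed Gaussian PSF and SNR, not the authors' EM method):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
reflectivity = np.zeros(n)
reflectivity[[40, 90, 160]] = [1.0, -0.7, 0.5]   # sparse scatterers (toy)

psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)  # assumed PSF
psf /= psf.sum()
H = np.fft.fft(np.roll(psf, -n // 2))            # zero-phase PSF spectrum
rf = np.real(np.fft.ifft(np.fft.fft(reflectivity) * H))
rf += 0.01 * rng.normal(size=n)                  # measurement noise

snr = 100.0                                      # assumed SNR power ratio
wiener = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
estimate = np.real(np.fft.ifft(np.fft.fft(rf) * wiener))
peak = int(np.argmax(np.abs(estimate)))
print(peak)  # position of the strongest recovered scatterer
```

Unlike this fixed-filter baseline, the EM approach of the record jointly estimates the reflectivity and its local statistics, which is what makes the deconvolved frame usable for classification.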


Quantitative deconvolution of human thermal infrared emittance.  


The bioheat transfer models conventionally employed in the etiology of human thermal infrared (TIR) emittance rely upon two assumptions: universal graybody emissivity and significant transmission of heat from subsurface tissue layers. In this work, a series of clinical and laboratory experiments was designed and carried out to conclusively evaluate the validity of these two assumptions. Results obtained from objective analyses of TIR images of human facial and tibial regions demonstrated significant variations in spectral thermophysical properties at different anatomic locations on the human body. The limited validity of the two assumptions signifies the need for quantitative deconvolution of human TIR emittance in clinical, psychophysiological and critical applications. A novel approach to joint inversion of the bioheat transfer model is also introduced, leveraging the deterministic temperature dependency of the proton resonance frequency in low-lipid human soft tissue to characterize the relationship between subsurface 3D tissue temperature profiles and the corresponding TIR emittance. PMID:23086533

Arthur, D T J; Khan, M M



Deconvolution of mixed magnetism in multilayer graphene  

NASA Astrophysics Data System (ADS)

Magnetic properties of graphite modified at the edges by KCl and of exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using low-field/high-field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge-state spins and the interaction between them decide the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that the ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.

Swain, Akshaya Kumar; Bahadur, Dhirendra
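The kind of component separation the record describes can be illustrated with a toy fit (assumed two-component model with synthetic data and a fixed saturation field; not the authors' analysis): deconvolute a measured magnetization curve M(H) into a linear paramagnetic term and a saturating ferromagnetic term by linear least squares.

```python
import numpy as np

# Assumed model: M(H) = chi*H + Ms*tanh(H/H0), with H0 fixed so the fit
# for chi and Ms is linear.
H = np.linspace(-2.0, 2.0, 101)              # applied field [T]
chi_true, Ms_true, H0 = 0.3, 1.2, 0.25       # assumed "true" parameters
rng = np.random.default_rng(2)
M = chi_true * H + Ms_true * np.tanh(H / H0) + 0.01 * rng.normal(size=H.size)

A = np.column_stack([H, np.tanh(H / H0)])    # design matrix
(chi_fit, Ms_fit), *_ = np.linalg.lstsq(A, M, rcond=None)
print(round(chi_fit, 2), round(Ms_fit, 2))   # recovered chi and Ms
```

In practice the saturation field would be fit as well (a nonlinear problem), and temperature-dependent loops would be fit jointly to separate the paramagnetic and ferromagnetic contributions.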



Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools  

PubMed Central

Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.



Y0: An innovative tool for spatial data analysis  

NASA Astrophysics Data System (ADS)

This paper describes an advanced analysis and visualization tool, called Y0 (pronounced ``Why not?!''), that has been developed to directly support the scientific process for earth and space science research. Y0 aids the scientific research process by enabling the user to formulate algorithms and models within an integrated environment, and then interactively explore the solution space with the aid of appropriate visualizations. Y0 has been designed to provide strong support for both quantitative analysis and rich visualization. The user's algorithm or model is defined in terms of algebraic formulas in cells on worksheets, in a similar fashion to spreadsheet programs. Y0 is specifically designed to provide the data types and rich function set necessary for effective analysis and manipulation of remote sensing data. This includes various types of arrays, geometric objects, and objects for representing geographic coordinate system mappings. Visualization of results is tailored to the needs of remote sensing, with straightforward methods of composing, comparing, and animating imagery and graphical information, with reference to geographical coordinate systems. Y0 is based on advanced object-oriented technology. It is implemented in C++ for use in Unix environments, with a user interface based on the X window system. Y0 has been delivered under contract to Unidata, a group which provides data and software support to atmospheric researchers in universities affiliated with UCAR. This paper will explore the key concepts in Y0, describe its utility for remote sensing analysis and visualization, and will give a specific example of its application to the problem of measuring glacier flow rates from Landsat imagery.

Wilson, Jeremy C.



Generalized Analysis Tools for Multi-Spacecraft Missions  

NASA Astrophysics Data System (ADS)

Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 1990s to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On the one hand, the approach using methods of least squares has the advantage of applying to any number of spacecraft [1] but is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2] but appears limited to clusters of four spacecraft. Moreover, the barycentric approach allows one to derive theoretical formulas for errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. Weights given to spacecraft allow one to minimize the influence of a spacecraft if its location or the quality of its data is not appropriate, or simply to extract subsets of spacecraft from the cluster. Estimators presented in [2] are generalized within this new frame, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 
307-322, ISSI SR-001, 1998. [2] Chanteur, G.: Spatial Interpolation for Four Spacecraft: Theory, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 371-393, ISSI SR-001, 1998. [3] Chanteur, G.: Accuracy of field gradient estimations by Cluster: Explanation of its dependency upon elongation and planarity of the tetrahedron, pp. 265-268, ESA SP-449, 2000. [4] Vogt, J., Paschmann, G., and Chanteur, G.: Reciprocal Vectors, pp. 33-46, ISSI SR-008, 2008.

Chanteur, G. M.
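The four-spacecraft barycentric estimator the abstract builds on can be sketched directly (standard reciprocal vectors in the sense of Chanteur's ISSI chapter; positions and the test field below are arbitrary): the gradient of a field sampled at four non-coplanar points is estimated as the sum of reciprocal vectors weighted by the samples, and is exact for a linear field.

```python
import numpy as np

r = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.1, 0.0],
              [0.2, 1.0, 0.0],
              [0.1, 0.3, 1.0]])           # spacecraft positions (arbitrary)

def reciprocal_vectors(r):
    """k_a = (r_cb x r_db) / (r_ab . (r_cb x r_db)) for each vertex a."""
    k = np.zeros_like(r)
    for a in range(4):
        b, c, d = [i for i in range(4) if i != a]
        cross = np.cross(r[c] - r[b], r[d] - r[b])
        k[a] = cross / np.dot(r[a] - r[b], cross)
    return k

k = reciprocal_vectors(r)
g_true = np.array([3.0, -1.0, 0.5])        # linear test field f(x) = g . x
f = r @ g_true                             # field sampled at the spacecraft
g_est = k.T @ f                            # barycentric gradient estimate
print(np.allclose(g_est, g_true))
```

The generalization in the record replaces these four reciprocal vectors with weighted analogues for an arbitrary number of spacecraft.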



Stiffness Analysis of Machine Tools Using Finite Element Method  

Microsoft Academic Search

Modern machining processes require machine tools to work accurately and dynamically. This leads to the necessity for a method which can analyze the stiffness of machine tools. In this paper, a single module method and a hybrid modeling method for analyzing the stiffness of machine tools are presented. Techniques include building suitable finite element models, determining equivalent loads, simulating the

Yu, Lianqing; Wang, Liping



Micropollutants in urban watersheds : substance flow analysis as management tool  

NASA Astrophysics Data System (ADS)

Micropollutants released by cities into water are of increasing concern, as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted from traffic, like copper or PAHs, reach surface water during rain events, often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to develop flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we test the application of SFA to a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results obtained for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. 
This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources (73% in total) of copper in receiving surface water are roofs and the contact lines of trolleybuses. Thus technical solutions have to be found to manage these specific sources of contamination. Application of the SFA approach to four pharmaceuticals reveals that CSOs represent an important source of contamination: between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads from the city of Lausanne to the lake are due to CSOs. These results will help in defining the best management strategy to limit the contamination of Lake Geneva. SFA is thus a promising tool for integrated urban water management.

Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.
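The bookkeeping behind such an SFA can be sketched in a few lines (the CSO shares below come from the record; the total annual loads are hypothetical placeholders): each substance's annual load to the lake is split into a CSO pathway and a treated-effluent pathway.

```python
# Toy substance-flow split for two of the studied pharmaceuticals.
totals = {"carbamazepine": 10.0, "ibuprofen": 25.0}       # kg/yr, assumed
cso_share = {"carbamazepine": 0.14, "ibuprofen": 0.61}    # from the record

cso_load = {s: totals[s] * cso_share[s] for s in totals}
wwtp_load = {s: totals[s] - cso_load[s] for s in totals}
for s in totals:
    print(f"{s}: {cso_load[s]:.2f} kg/yr via CSO, {wwtp_load[s]:.2f} kg/yr via WWTP")
```

A full SFA adds further compartments (roads, roofs, stormwater, sediments) and closes the mass balance over all of them, but the arithmetic per flow is exactly this simple.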



Orienting the Neighborhood: A Subdivision Energy Analysis Tool  

SciTech Connect

In subdivisions, house orientations are largely determined by street layout. The resulting house orientations affect energy consumption (annual and on-peak) for heating and cooling, depending on window area distributions and shading from neighboring houses. House orientations also affect energy production (annual and on-peak) from solar thermal and photovoltaic systems, depending on available roof surfaces. Therefore, house orientations fundamentally influence both energy consumption and production, and an appropriate street layout is a prerequisite for taking full advantage of energy efficiency and renewable energy opportunities. The potential influence of street layout on solar performance is often acknowledged, but solar and energy issues must compete with many other criteria and constraints that influence subdivision street layout. When only general guidelines regarding energy are available, these factors may be ignored or have limited effect. Also, typical guidelines are often not site-specific and do not account for local parameters such as climate and the time value of energy. For energy to be given its due consideration in subdivision design, energy impacts need to be accurately quantified and displayed interactively to facilitate analysis of design alternatives. This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

Christensen, C.; Horowitz, S.



Advanced tools for astronomical time series and image analysis  

NASA Astrophysics Data System (ADS)

The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.

Scargle, Jeffrey D.


jSIPRO - analysis tool for magnetic resonance spectroscopic imaging.  


Magnetic resonance spectroscopic imaging (MRSI) involves a huge number of spectra to be processed and analyzed. Several tools enabling MRSI data processing have been developed and widely used. However, the processing programs primarily focus on sophisticated spectra processing and offer limited support for the analysis of the calculated spectroscopic maps. In this paper the jSIPRO (java Spectroscopic Imaging PROcessing) program is presented, which is a java-based graphical interface enabling post-processing, viewing, analysis and result reporting of MRSI data. Interactive graphical processing as well as protocol controlled batch processing are available in jSIPRO. jSIPRO does not contain a built-in fitting program. Instead, it makes use of fitting programs from third parties and manages the data flows. Currently, automatic spectra processing using LCModel, TARQUIN and jMRUI programs are supported. Concentration and error values, fitted spectra, metabolite images and various parametric maps can be viewed for each calculated dataset. Metabolite images can be exported in the DICOM format either for archiving purposes or for the use in neurosurgery navigation systems. PMID:23870172

Jiru, Filip; Skoch, Antonin; Wagnerova, Dita; Dezortova, Monika; Hajek, Milan



Spectrophotometric Determination of the Dissociation Constant of an Acid-Base Indicator Using a Mathematical Deconvolution Technique  

ERIC Educational Resources Information Center

A laboratory experiment reinforces the concept of acid-base equilibria while introducing a common application of spectrophotometry and can easily be completed within a standard four-hour laboratory period. It provides students with an opportunity to use advanced data analysis techniques like data smoothing and spectral deconvolution to

Alter, Krystyn P.; Molloy, John L.; Niemeyer, Emily D.
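The data analysis the record alludes to can be sketched with synthetic spectra (assumed two-species model and made-up pure spectra, not the authors' procedure): at each pH the indicator spectrum is a linear mixture of the pure acid (HIn) and base (In-) spectra, least squares recovers the base fraction, and Henderson-Hasselbalch then gives the pKa.

```python
import numpy as np

wl = np.linspace(400, 700, 61)                     # wavelength grid [nm]
spec_HIn = np.exp(-0.5 * ((wl - 450) / 25) ** 2)   # assumed pure spectra
spec_In = np.exp(-0.5 * ((wl - 600) / 25) ** 2)

pKa_true = 4.8
pH = np.array([3.5, 4.0, 4.5, 5.0, 5.5, 6.0])
frac_base = 1.0 / (1.0 + 10 ** (pKa_true - pH))    # Henderson-Hasselbalch
mixed = np.outer(1 - frac_base, spec_HIn) + np.outer(frac_base, spec_In)

# Spectral deconvolution: fit each measured spectrum as a mix of the two
# pure-species spectra, then average pKa = pH - log10(f / (1 - f)).
A = np.column_stack([spec_HIn, spec_In])
fractions = np.linalg.lstsq(A, mixed.T, rcond=None)[0][1]
pKa_est = np.mean(pH - np.log10(fractions / (1 - fractions)))
print(round(pKa_est, 2))
```

With noiseless synthetic data the recovered pKa matches the assumed value exactly; in the actual experiment, smoothing and deconvolution of the measured absorbance spectra precede this fit.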



Study of academic achievements using spatial analysis tools  

NASA Astrophysics Data System (ADS)

In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study was carried out on their academic achievement with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These were students who had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. 
This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used to calculate the probability of success or failure for incoming students in the following academic years. Keywords: Academic achievement, spatial analyst, GIS, Bologna.

González, C.; Velilla, C.; Sánchez-Girón, V.



Mission operations data analysis tools for Mars Observer guidance and control  

NASA Technical Reports Server (NTRS)

Mission operations for the Mars Observer (MO) Project at the Jet Propulsion Laboratory were supported by a variety of ground data processing software and analysis tools. Some of these tools were generic to multimission spacecraft mission operations, some were specific to the MO spacecraft, and others were custom tailored to the operation and control of the Attitude and Articulation Control Subsystem (AACS). The focus of this paper is on the data analysis tools for the AACS. Four different categories of analysis tools are presented; with details offered for specific tools. Valuable experience was gained from the use of these tools and through their development. These tools formed the backbone and enhanced the efficiency of the AACS Unit in the Mission Operations Spacecraft Team. These same tools, and extensions thereof, have been adopted by the Galileo mission operations, and are being designed into Cassini and other future spacecraft mission operations.

Kan, Edwin P.




E-print Network

Cichocki, Lab. for Advanced Brain Signal Processing, RIKEN Brain Science Institute, Wako-shi, Saitama, 351 ... deconvolution by Salam et al. [11, 12], Zhang et al. [13, 14], and Cichocki et al. [15]-[16]. In the state space

Vialatte, François


Comparative study of some methods in blind deconvolution  

E-print Network

This study presents some techniques used in Blind Deconvolution with emphasis on applications to digital communications. The literature contains many algorithms developed and tested in different situations, but very limited research was conducted...

Mbarek, Kais
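One standard technique such a comparative study of blind deconvolution for digital communications would cover is the constant modulus algorithm (CMA); here is a toy sketch (all parameters assumed, not taken from the thesis) equalizing QPSK symbols passed through a short FIR channel without any training sequence.

```python
import numpy as np

rng = np.random.default_rng(3)
symbols = (rng.choice([-1, 1], 4000) + 1j * rng.choice([-1, 1], 4000)) / np.sqrt(2)
channel = np.array([1.0, 0.4, -0.2])               # assumed channel taps
received = np.convolve(symbols, channel, mode="full")[: symbols.size]

n_taps, mu = 11, 1e-3                              # equalizer length, step size
w = np.zeros(n_taps, dtype=complex)
w[n_taps // 2] = 1.0                               # center-spike initialization
for i in range(n_taps, received.size):
    x = received[i - n_taps : i][::-1]             # regressor, most recent first
    y = w @ x
    e = y * (np.abs(y) ** 2 - 1.0)                 # CM error (R2 = 1 for QPSK)
    w -= mu * e * np.conj(x)                       # stochastic gradient step
out = np.convolve(received, w)[: received.size]
# After adaptation, the equalized output modulus is closer to 1 than the
# received signal's modulus.
print(np.mean(np.abs(np.abs(out[2000:]) - 1.0))
      < np.mean(np.abs(np.abs(received[2000:]) - 1.0)))
```

CMA exploits only the constant-modulus property of the constellation, which is exactly the "blind" aspect the record emphasizes: no training data is required.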



Cascade Neural Networks for Multichannel Blind Deconvolution Seungjin CHOI 1  

E-print Network

Source signals are recovered from their unknown convolutive mixtures. The cascade neural network is proposed, where each module consists ... Keywords: Hebbian/anti-Hebbian learning rule, multichannel blind deconvolution/equalization, neural networks, unsupervised learning algorithms

Cichocki, Andrzej


Application of the Lucy-Richardson Deconvolution Procedure to High Resolution Photoemission Spectra  

SciTech Connect

Angle-resolved photoemission has developed into one of the leading probes of the electronic structure and associated dynamics of condensed matter systems. As with any experimental technique the ability to resolve features in the spectra is ultimately limited by the resolution of the instrumentation used in the measurement. Previously developed for sharpening astronomical images, the Lucy-Richardson deconvolution technique proves to be a useful tool for improving the photoemission spectra obtained in modern hemispherical electron spectrometers where the photoelectron spectrum is displayed as a 2D image in energy and momentum space.

Rameau, J.; Yang, H.-B.; Johnson, P.D.
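The Lucy-Richardson procedure mentioned above is a simple multiplicative iteration. A minimal 1-D sketch is shown below, assuming a known, normalized point-spread function; this is a generic illustration, not the authors' actual implementation:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50):
    """1-D Richardson-Lucy deconvolution with a known point-spread function.

    Multiplicative updates keep the estimate non-negative, which suits
    count-like data such as photoemission intensities.
    """
    psf = psf / psf.sum()                 # normalize the PSF
    psf_mirror = psf[::-1]                # adjoint of the convolution
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate
```

Applied to a peak blurred by a Gaussian instrument response, the iteration progressively re-concentrates intensity at the true peak position.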



Genome-tools: a flexible package for genome sequence analysis.  


Genome-tools is a Perl module, a set of programs, and a user interface that facilitates access to genome sequence information. The package is flexible, extensible, and designed to be accessible and useful to both nonprogrammers and programmers. Any relatively well-annotated genome available with standard GenBank genome files may be used with genome-tools. A simple Web-based front end permits searching any available genome with an intuitive interface. Flexible design choices also make it simple to handle revised versions of genome annotation files as they change. In addition, programmers can develop cross-genomic tools and analyses with minimal additional overhead by combining genome-tools modules with newly written modules. Genome-tools runs on any computer platform for which Perl is available, including Unix, Microsoft Windows, and Mac OS. By simplifying the access to large amounts of genomic data, genome-tools may be especially useful for molecular biologists looking at newly sequenced genomes, for which few informatics tools are available. The genome-tools Web interface is accessible at, and the source code is available at PMID:12503321

Lee, William; Chen, Swaine L



Developing a high-quality software tool for fault tree analysis  

Microsoft Academic Search

Sophisticated dependability analysis techniques are being developed in academia and research labs, but few have gained wide acceptance in industry. To be valuable, such techniques must be supported by usable, dependable software tools. We present our approach to addressing these issues in developing a dynamic fault tree analysis tool called Galileo. Galileo is designed to support efficient system-level analysis by

Joanne Bechta Dugan; Kevin J. Sullivan; David Coppit



Mineral abundance determination: Quantitative deconvolution of thermal emission spectra  

Microsoft Academic Search

A linear retrieval (spectral deconvolution) algorithm is developed and applied to high-resolution laboratory infrared spectra of particulate mixtures and their end-members. The purpose is to place constraints on, and test the viability of, linear spectral deconvolution of high-resolution emission spectra. The effects of addition of noise, data reproducibility, particle size variation, an increasing number of minerals in the mixtures, and

Michael S. Ramsey; Philip R. Christensen
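A linear retrieval of this kind treats a measured spectrum as a weighted sum of end-member spectra and solves for the weights. The sketch below uses plain least squares with a clip-and-renormalize step standing in for a properly constrained inversion; it illustrates the idea, not the authors' algorithm:

```python
import numpy as np

def unmix(mixed, endmembers):
    """Estimate end-member abundances from a mixed spectrum.

    endmembers: (n_bands, n_endmembers) matrix of laboratory spectra.
    Returns abundances clipped to be non-negative and summing to one
    (the closure constraint on mineral abundances).
    """
    coeffs, *_ = np.linalg.lstsq(endmembers, mixed, rcond=None)
    coeffs = np.clip(coeffs, 0.0, None)
    return coeffs / coeffs.sum()
```

For a noiseless synthetic mixture the least-squares solution recovers the mixing fractions exactly; noise, particle-size effects, and more end-members degrade the retrieval, which is what the study quantifies.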



Methods and Apparatus for Reducing Multipath Signal Error Using Deconvolution  

NASA Technical Reports Server (NTRS)

A deconvolution approach to adaptive signal processing has been applied to the elimination of signal multipath errors, as embodied in one preferred embodiment in a global positioning system receiver. The method and receiver of the present invention estimate and then compensate for multipath effects in a comprehensive manner. Application of deconvolution, along with other adaptive identification and estimation techniques, results in a completely novel GPS (Global Positioning System) receiver architecture.

Kumar, Rajendra (Inventor); Lau, Kenneth H. (Inventor)



Analysis of the influence of tool dynamics in diamond turning  

SciTech Connect

This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.



CCAT: Combinatorial Code Analysis Tool for transcriptional regulation.  


Combinatorial interplay among transcription factors (TFs) is an important mechanism by which transcriptional regulatory specificity is achieved. However, despite the increasing number of TFs for which either binding specificities or genome-wide occupancy data are known, knowledge about cooperativity between TFs remains limited. To address this, we developed a computational framework for predicting genome-wide co-binding between TFs (CCAT, Combinatorial Code Analysis Tool), and applied it to Drosophila melanogaster to uncover cooperativity among TFs during embryo development. Using publicly available TF binding specificity data and DNaseI chromatin accessibility data, we first predicted genome-wide binding sites for 324 TFs across five stages of D. melanogaster embryo development. We then applied CCAT in each of these developmental stages, and identified from 19 to 58 pairs of TFs in each stage whose predicted binding sites are significantly co-localized. We found that nearby binding sites for pairs of TFs predicted to cooperate were enriched in regions bound in relevant ChIP experiments, and were more evolutionarily conserved than other pairs. Further, we found that TFs tend to be co-localized with other TFs in a dynamic manner across developmental stages. All generated data as well as source code for our front-to-end pipeline are available at PMID:24366875

Jiang, Peng; Singh, Mona
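A standard way to score whether two TFs' predicted binding sites are significantly co-localized is a hypergeometric test over genomic windows. The sketch below is a generic version of that idea, not necessarily the exact statistic CCAT computes:

```python
from math import comb

def colocalization_pvalue(n_windows, sites_a, sites_b, overlap):
    """Upper-tail hypergeometric probability of seeing at least `overlap`
    windows bound by both factors if their sites were placed independently."""
    total = comb(n_windows, sites_b)
    return sum(
        comb(sites_a, k) * comb(n_windows - sites_a, sites_b - k)
        for k in range(overlap, min(sites_a, sites_b) + 1)
    ) / total
```

A small p-value indicates more co-bound windows than independent placement would explain, flagging the TF pair as a candidate for cooperative binding.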



Thermal Management Tools for Propulsion System Trade Studies and Analysis  

NASA Technical Reports Server (NTRS)

Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

McCarthy, Kevin; Hodge, Ernie






SEM analysis as a diagnostic tool for photovoltaic cell degradation  

NASA Astrophysics Data System (ADS)

The importance of scanning electron microscopy (SEM) analysis as a diagnostic tool for analyzing the degradation of a polycrystalline photovoltaic cell has been studied. The main aim of this study is to characterize the surface morphology of hot spot regions of (degraded) cells in photovoltaic solar cells. In recent years, production of hetero- and multi-junction solar cells has experienced tremendous growth as compared to conventional silicon (Si) solar cells. Thin film photovoltaic solar cells generally are more prone to exhibiting defects and associated degradation modes. To improve the lifetime of these cells and modules, it is imperative to fully understand the cause and effect of defects and degradation modes. The objective of this paper is to diagnose the observed degradation in polycrystalline silicon cells using scanning electron microscopy (SEM). In this study, poly-Si cells were characterized before and after reverse biasing; the reverse biasing was done to evaluate the cells' susceptibility to leakage currents and hotspot formation. After reverse biasing, some cells were found to exhibit hotspots, as confirmed by infrared thermography. The surface morphology of these hotspots re

Osayemwenre, Gilbert; Meyer, E. L.



New Access and Analysis Tools for Voyager LECP Data  

NASA Astrophysics Data System (ADS)

The Low Energy Charged Particle (LECP) instruments on the Voyager 1 and 2 spacecraft have been returning unique scientific measurements since launching in 1977, most notably observations from the historic tour of the giant planets. As these spacecraft continue on their exit trajectories from the solar system, they have become an interstellar mission, probing the boundary between the heliosphere and the interstellar cloud and continuing to make exciting discoveries. As the mission changed from one focused on discrete encounters to an open-ended search for heliospheric boundaries and transitory disturbances, the positions and timing of which are not known, the data processing needs have changed. Open data policies and the push to draw data under the umbrella of emerging Virtual Observatories have added a data sharing component that was not a part of the original mission plans. We present our work in utilizing new, reusable software analysis tools to access legacy data in a way that leverages pre-existing data analysis techniques. We took an existing Applied Physics Laboratory application, Mission Independent Data Layer (MIDL) -- developed originally under a NASA Applied Information Systems Research Program (AISRP) and subsequently used with data from Geotail, Cassini, IMP-8, ACE, Messenger, and New Horizons -- and applied it to Voyager data. We use the MIDL codebase to automatically generate standard data products such as daily summary plots and associated tabulated data that increase our ability to monitor the heliospheric environment on a regular basis. These data products will be publicly available and updated automatically, and can be analyzed by the community using the ultra-portable MIDL software launched from the data distribution website. The currently available LECP data will also be described with SPASE metadata and incorporated into the emerging Virtual Energetic Particle Observatory (VEPO).

Brown, L. E.; Hill, M. E.; Decker, R. B.; Cooper, J. F.; Krimigis, S. M.; Vandegriff, J. D.



A measuring tool for tree-rings analysis  

NASA Astrophysics Data System (ADS)

A special tool has been created for annual tree-ring width measurement and analysis. It consists of a professional scanner, a computer system, and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: productivity gain, the possibility of archiving the results of the measurements at any stage of the processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including those that are difficult to process because of a complex wood structure (inhomogeneity of growth in different directions; missing, light, and false rings; etc.). This software can analyze pictures made with optical scanners, analog or digital cameras. The software was written in C++ and is compatible with modern operating systems such as Windows. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring widths as a function of the year is displayed on screen during the analysis, and it can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The created complex is universal in application, which will allow its use for solving different problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed on the basis of samples collected at the Kola Peninsula (northwestern Russia).

Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena



The analysis of crow population dynamics as a surveillance tool.  


West Nile virus (WNV) infection, a zoonotic disease for which birds act as a reservoir, first appeared in North America in August 1999. It was first reported in Quebec in 2002. The Quebec surveillance system for WNV has several components, including the surveillance of mortality in corvid populations, which include the American crow (Corvus brachyrhynchos). The main objectives of this study are to better understand the population dynamics of this species in Quebec and to evaluate the impact of WNV on these dynamics. We obtained observation data for living crows in this province for the period 1990-2005 and then conducted a spectral analysis of these data. To study changes in crow population dynamics, the analysis was carried out before and after the appearance of WNV, and space was divided into two different areas (urban and non-urban). Our results show the importance of cycles with periods of less than 1 year in non-urban areas and cycles with periods of greater than 1 year in urban areas in the normal population dynamics of the species. We obtained expected fluctuations in bird densities using an algorithm derived from spectral decomposition. When we compared these predictions with data observed after 2002, we found marked perturbations in population dynamics beginning in 2003 and lasting up to 2005. In the discussion, we present various hypotheses based on the behaviour of the American crow to explain the normal population dynamics observed in this species and the effect of type of area (urban versus non-urban). We also discuss how the predictive algorithm could be used as a disease surveillance tool and as a measure of the impact of a disease on wild fauna. PMID:19811623

Ludwig, A; Bigras-Poulin, M; Michel, P
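The spectral decomposition step described above amounts to finding the dominant periodicities in a population time series. A minimal periodogram sketch of that idea (a generic illustration, not the authors' exact algorithm):

```python
import numpy as np

def dominant_period(series, dt=1.0):
    """Return the period (in units of dt) of the strongest cycle
    in a mean-removed time series, via the FFT periodogram."""
    x = np.asarray(series, float)
    x = x - x.mean()                          # detrend by removing the mean
    spectrum = np.abs(np.fft.rfft(x)) ** 2    # periodogram
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = 1 + np.argmax(spectrum[1:])           # skip the zero-frequency bin
    return 1.0 / freqs[k]
```

For monthly crow counts, a dominant period near 12 months would correspond to the sub-annual cycles reported for non-urban areas, while periods above 12 months match the urban pattern.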



Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions  

E-print Network

... hack tools. However, besides U3 technology, attackers also have another, more flexible alternative: portable applications, or application virtualization, which allows a wide range of hack tools to be compiled ...

Halgamuge, Malka N.


Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.  


Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution, due to the atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line-of-sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimate the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Monte Carlo Markov chain algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) or not (estimation). The performances of the methods are compared with classical ones, on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172

Villeneuve, Emma; Carfantan, Hervé
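Estimating the amplitude, center, and width of a spectral emission line in a Bayesian framework, as described above, can be sketched with a bare-bones Metropolis sampler. This is a flat-prior toy version for a single pixel; the paper's actual estimators exploit the linearity in the flux parameters and account for spatial blurring, which this sketch ignores:

```python
import numpy as np

def fit_emission_line(wavelengths, flux, noise_sigma, n_steps=20000, seed=0):
    """Posterior-mean (amplitude, center, width) of one Gaussian emission
    line via a Metropolis random walk with flat priors on the valid region."""
    rng = np.random.default_rng(seed)

    def log_like(theta):
        a, c, w = theta
        if a <= 0 or w <= 0:
            return -np.inf                    # reject invalid parameters
        model = a * np.exp(-0.5 * ((wavelengths - c) / w) ** 2)
        return -0.5 * np.sum((flux - model) ** 2) / noise_sigma ** 2

    # Start at a crude data-driven guess.
    theta = np.array([flux.max(), wavelengths[np.argmax(flux)], 1.0])
    log_p = log_like(theta)
    samples = []
    for _ in range(n_steps):
        proposal = theta + rng.normal(scale=0.05, size=3)
        log_p_new = log_like(proposal)
        if np.log(rng.random()) < log_p_new - log_p:   # Metropolis accept
            theta, log_p = proposal, log_p_new
        samples.append(theta.copy())
    return np.mean(samples[n_steps // 2:], axis=0)     # discard burn-in
```

Via the Doppler effect, the recovered center and width map to the line-of-sight velocity and velocity dispersion for that pixel.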



Experimental analysis of change detection algorithms for multitooth machine tool fault detection  

NASA Astrophysics Data System (ADS)

This paper describes an industrial application of a fault diagnosis method for a multitooth machine tool. Different statistical approaches have been used to detect and diagnose insert breakage in multitooth tools based on the analysis of the electrical power consumption of the tool drives. Great effort has been made to obtain a robust method able to avoid any re-calibration process after, for example, a maintenance operation. From the point of view of maintenance costs, these multitooth tools are the most critical part of the machine tools used for mass production in the car industry. These tools integrate different kinds of machining operations and cutting conditions.

Reñones, Aníbal; de Miguel, Luis J.; Perán, José R.



Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools  

NASA Technical Reports Server (NTRS)

A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over having one integrated tool. This tool-suite, their characteristics, and their applicability will be described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.



General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft  

NASA Technical Reports Server (NTRS)

The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.

Dove, Edwin; Hughes, Steve



MTpy - Python Tools for Magnetotelluric Data Processing and Analysis  

NASA Astrophysics Data System (ADS)

We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, the data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software, adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to provide a simplification of the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or Obspy. It contains sub-packages and modules for the various tasks within the standard work-flow of MT data processing and interpretation. In order to allow the inclusion of already existing and well established software, MTpy does not only provide pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework, which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at the same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.

Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes



AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool  

USGS Publications Warehouse

Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH.
An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

Halford, Keith



Tools for protein posttranslational modifications analysis: FAK, a case study.  


Recent advances in mass spectrometry have resulted in an exponential increase in the annotation of posttranslational modifications (PTMs). In the Swiss-Prot Knowledgebase alone, 89,931 instances of 27 characterized PTM types have been reported experimentally. A single protein can be dynamically modified during its lifetime for regulation of its function. Considering that a PTM can occur at different levels, and given the number of different PTMs described, the number of possibilities for a single protein is unthinkable. Narrowing the study to a single PTM can be rather unmerited considering that most proteins are heavily modified. Crosstalk between PTMs is now plentifully reported in the literature. The example of the amino acids serine and threonine on one hand and lysine on the other, as targets of different modifications, demands a more global analysis of a protein. Besides direct competition for the same amino acid, a PTM can directly or indirectly influence other PTMs in the same protein molecule by, for example, steric hindrance due to close proximity between the modifications, or creation of a binding site, such as an SH2 binding domain, for protein recruitment and further modifications. Given the complexity of PTMs, a number of tools have been developed to archive, analyze, and visualize modifications. VISUALPROT is presented here to demonstrate the usefulness of visualizing all annotated protein features, such as amino acid content, domains, amino acid modification sites, and single amino acid polymorphisms, in a single image. The VISUALPROT application is demonstrated for the protein focal adhesion kinase (FAK) as an example. FAK is a highly phosphorylated cytoplasmic tyrosine kinase comprising different domains and regions. FAK is crucial for integrating signals from integrins and receptor tyrosine kinases in processes such as cell survival, proliferation, and motility. PMID:23666734

Fonseca, Catarina; Voabil, Paula; Carvalho, Ana Sofia; Matthiesen, Rune



Affinity-based target deconvolution of safranal  

PubMed Central

Background and purpose of the study: Affinity-based target deconvolution is an emerging method for the identification of interactions between drugs/drug candidates and cellular proteins, and helps to predict potential activities and side effects of a given compound. In the present study, we hypothesized that a part of safranal's pharmacological effects, as one of the major constituents of Crocus sativus L., relies on its physical interaction with target proteins. Methods: An affinity chromatography solid support was prepared by covalent attachment of safranal to agarose beads. After passing tissue lysate through the column, safranal-bound proteins were isolated and separated on SDS-PAGE or two-dimensional gel electrophoresis. Proteins were identified using MALDI-TOF/TOF mass spectrometry and Mascot software. Results and major conclusion: Data showed that safranal physically binds to beta actin, cytochrome b-c1 complex subunit 1, trifunctional enzyme subunit beta, and ATP synthase subunits alpha and beta. These interactions may explain part of safranal's pharmacological effects. However, the phenotypic and/or biological relevance of these interactions remains to be elucidated by future pharmacological studies. PMID:23514587



Adaptive deconvolution using a SAW storage correlator  

NASA Astrophysics Data System (ADS)

A new analog adaptive filter for deconvolving distorted signals is described. The filter uses a storage correlator which implements a clipped version of the least mean squared algorithm and uses a special iterative technique to achieve fast convergence. The new filter has a potential bandwidth of 100 MHz and would eventually handle pulsed signals of 10-microsecond width. For signals with time-bandwidth products of less than 100, the adaptation time is less than 1 ms, which allows operation in real time for most applications, including resolution of radar signals in a cluttered environment, removal of echoes from television signals, deconvolution of distorted signals in nondestructive evaluation, and also in telephony. The filter is particularly suited for radar and communications, as it processes signals directly in the VHF range. Two experiments related to ghost suppression of a pulse and to the field of NDE are described in this paper. The results are in good agreement with computer simulations and show a ghost suppression of 15 dB for the first example and a sidelobe suppression of 8 dB for a transducer signal. The adaptation time is less than 450 microseconds.

Bowers, J. E.; Kino, G. S.; Behar, D.; Olaisen, H.
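The clipped LMS rule described above uses only the signs of the error and the input, which is what makes it cheap enough for an analog storage-correlator implementation. A minimal digital sketch of that update rule, applied to identifying an echo channel (an illustration of the algorithm, not the SAW hardware):

```python
import numpy as np

def sign_sign_lms(x, d, n_taps=8, mu=0.005):
    """Adaptive FIR identification with the clipped (sign-sign) LMS rule:
    each tap moves by +/- mu according to sign(error) * sign(input)."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # current and past input samples
        y[n] = w @ u                        # filter output
        e = d[n] - y[n]                     # error against desired signal
        w += mu * np.sign(e) * np.sign(u)   # clipped LMS update
    return w, y
```

Driven with a broadband input and a desired signal containing a delayed, attenuated echo, the taps converge to the channel impulse response, which is the basis for the ghost-suppression experiment described above.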



Online Analysis of Wind and Solar Part I: Ramping Tool  

SciTech Connect

To facilitate wider penetration of renewable resources without compromising system reliability, given concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa



Accepted in Tools for Automatic Program AnalysiS: TAPAS 2010, Springer (ENTCS)  

E-print Network

Feautrier and Gonnord ... practical tool for software ...

Gonnord, Laure


PerfCenter: A Methodology and Tool for Performance Analysis of Application Hosting  

E-print Network

... between servers. While tools and methodologies for such analysis have been proposed earlier, our approach ... models network links between LANs and the contention at those links due to messages exchanged ...

Apte, Varsha



Microsoft Academic Search

As part of the programme, a design and analysis tool for bonded composite repairs has been developed. The repair design tool runs on a normal PC under Microsoft Office Excel, which is easily accessible for most people. A wide variety of joint designs, including external patches and scarf repairs, can be specified via a simple-to-use input interface. The analysis

R. J. C. Creemers


Novel tools for sequence and epitope analysis of glycosaminoglycans  

E-print Network

Our understanding of glycosaminoglycan (GAG) biology has been limited by a lack of sensitive and efficient analytical tools designed to deal with these complex molecules. GAGs are heterogeneous and often sulfated linear ...

Behr, Jonathan Robert



Ris-R-1359(EN) Fractography analysis of tool samples  

E-print Network

Forging is a much-used process in the automotive industry for production of axles and valve houses ... strains, which lead to dimensional changes and sudden fracture of the tool [2]. In order to minimize



E-print Network

Silicon-based products are limited because silicon is brittle. Products can be made from other engineering materials and need to be machined in microscale. This research deals with predicting microtool failure by studying spindle runout and tool deflection effects...

Chittipolu, Sujeev



Evaluation of a Surface Exploration Traverse Analysis and Navigation Tool  

E-print Network

SEXTANT is an extravehicular activity (EVA) mission planner tool developed in MATLAB, which computes the most efficient path between waypoints across a planetary surface. The traverse efficiency can be optimized around ...

Gilkey, Andrea L.


An Integrated Traverse Planner and Analysis Tool for Planetary Exploration  

E-print Network

Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. ...

Johnson, Aaron William


Tools for Scalable Parallel Program Analysis - Vampir NG and DeWiz  

Microsoft Academic Search

Large-scale high-performance computing systems pose a tough obstacle for today's program analysis tools. Their demands in computational performance and memory capacity for processing program analysis data exceed the capabilities of standard workstations and traditional analysis tools. The sophisticated approaches of Vampir NG (VNG) and the Debugging Wizard DeWiz intend to provide novel ideas for scalable parallel program analysis. While

Holger Brunst; Dieter Kranzlmüller; Wolfgang E. Nagel



A computational tool for ionosonde CADI's ionogram analysis  

NASA Astrophysics Data System (ADS)

The purpose of this work is to present a new computational tool for the analysis of ionograms generated with a Canadian Advanced Digital Ionosonde (CADI). This new tool uses the fuzzy relation paradigm to identify the F trace and from it extract the parameters foF2, h'F, and hpF2. The tool was extensively tested with ionosondes that operate at low latitudes and near the equatorial region. The ionograms used in this work were recorded at São José dos Campos (23.2 S, 45.9 W; dip latitude 17.6 S) and Palmas (10.2 S, 48.2 W; dip latitude 5.5 S). The automatically extracted ionospheric parameters were compared with those obtained manually and good agreement was found. The developed tool will greatly expedite and standardize ionogram processing. Therefore, this new tool will facilitate the exchange of information among the many groups that operate ionosondes of the CADI type, and will be very helpful for space weather purposes.

Pillat, Valdir Gil; Guimarães, Lamartine Nogueira Frutuoso; Fagundes, Paulo Roberto; da Silva, José Demísio Simões



Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).  


This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve throughput, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result, there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings. PMID:18406925

Rath, Frank



Automation Tools for Finite Element Analysis of Adhesively Bonded Joints  

NASA Technical Reports Server (NTRS)

This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
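The workflow the abstract describes (read spring forces from the model output, convert them to adhesive stresses, sort in descending order) can be sketched as follows. This is an illustrative reconstruction, not the tools' actual interface: the record layout, function name, and the uniform tributary area per spring are all assumptions.

```python
def rank_adhesive_stresses(spring_records, tributary_area):
    """Convert spring forces to adhesive stresses and sort them in
    descending order of peel-stress magnitude.

    spring_records: list of (grid_id, shear_force, peel_force) tuples,
                    as might be parsed from a finite element output file
                    (hypothetical format).
    tributary_area: adhesive area represented by each spring; assumed
                    uniform here for simplicity.
    """
    ranked = [(gid, shear / tributary_area, peel / tributary_area)
              for gid, shear, peel in spring_records]
    # Sort by peel-stress magnitude, descending; peel stress typically
    # governs bonded-joint failure.
    ranked.sort(key=lambda r: abs(r[2]), reverse=True)
    return ranked
```

A ranked list like this is what makes quick trade-off studies possible: rerun the model, re-rank, and compare the worst-case adhesive stresses between designs.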

Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)



Automated Multivariate Optimization Tool for Energy Analysis: Preprint  

SciTech Connect

Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.



OutbreakTools: A new platform for disease outbreak analysis using the R software  

PubMed Central

The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

Jombart, Thibaut; Aanensen, David M.; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A.; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil



Second generation sequencing allows for mtDNA mixture deconvolution and high resolution detection of heteroplasmy  

PubMed Central

Aim To use parallel array pyrosequencing to deconvolute mixtures of mitochondrial DNA (mtDNA) sequence and provide high resolution analysis of mtDNA heteroplasmy. Methods The hypervariable segment 1 (HV1) of the mtDNA control region was analyzed from 30 individuals using the 454 GS Junior instrument. Mock mixtures were used to evaluate the system's ability to deconvolute mixtures and to reliably detect heteroplasmy, including heteroplasmic differences between 5 family members of the same maternal lineage. Amplicon sequencing was performed on polymerase chain reaction (PCR) products generated with primers that included multiplex identifiers (MID) and adaptors for pyrosequencing. Data analysis was performed using NextGENe software. The analysis of an autosomal short tandem repeat (STR) locus (D18S51) and a Y-STR locus (DYS389 I/II) was performed simultaneously with a portion of HV1 to illustrate that multiplexing can encompass different markers of forensic interest. Results Mixtures, including heteroplasmic variants, can be detected routinely down to a component ratio of 1:250 (20 minor variant copies with a coverage rate of 5000 sequences) and can be readily detected down to 1:1000 (0.1%) with expanded coverage. Amplicon sequences from D18S51, DYS389 I/II, and the second half of HV1 were successfully partitioned and analyzed. Conclusions The ability to routinely deconvolute mtDNA mixtures down to a level of 1:250 allows for high resolution analysis of mtDNA heteroplasmy, and for differentiation of individuals from the same maternal lineage. The pyrosequencing approach results in poor resolution of homopolymeric sequences, and PCR/sequencing artifacts require a filtering mechanism similar to that for STR stutter and spectral bleed-through. In addition, chimeric sequences from jumping PCR must be addressed to make the method operational. PMID:21674826

Holland, Mitchell M.; McQuillan, Megan R.; O'Hanlon, Katherine A.



Software tool for automated analysis of conceptual data model  

Microsoft Academic Search

This paper describes the implementation of a software system for analyzing data model correctness. The proposed system is based on the integration of an automated reasoning system with CASE tool output, with the aim of automating the process of data model evaluation. The system is based on transformation of the XML form of a conceptual data model to predicate logic form and merging with data

Z. Kazi; B. Radulovic




EPA Science Inventory

Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...


Super-resolution thermal ghost imaging based on deconvolution  

NASA Astrophysics Data System (ADS)

The resolution of classical imaging is limited by the Rayleigh diffraction limit, whereas ghost imaging can overcome this diffraction limit and enhance the resolution. In this paper, we propose a super-resolution thermal ghost imaging scheme based on deconvolution, which can further improve the imaging resolution. Because the traditional thermal ghost imaging result is the convolution between the original image of the object and the correlation function, we can use a deconvolution algorithm to reduce the effect of convolution and enhance the imaging resolution. In fact, the correlation function of the ghost imaging system is just the point spread function (PSF) of a classical imaging system. However, the PSF is hard to obtain in a classical imaging system, but easy to obtain in a ghost imaging system. So a deconvolution algorithm can be easily applied in the ghost imaging system, and the imaging resolution can increase 2-3 times in practice.
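The abstract does not say which deconvolution algorithm is used; as an illustration of the general idea (undoing a convolution with a known PSF, here playing the role of the measured correlation function), the following is a minimal 1-D Richardson-Lucy deconvolution in pure Python. All names are ours and the scheme is a generic sketch, not the paper's method:

```python
def convolve(signal, psf):
    # Full linear convolution, cropped to "same" length centred on the signal.
    n, m = len(signal), len(psf)
    full = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, p in enumerate(psf):
            full[i + j] += s * p
    start = (m - 1) // 2
    return full[start:start + n]

def richardson_lucy(observed, psf, iterations=200):
    # Iterative deconvolution: multiply the current estimate by the
    # back-projected ratio of observed to re-blurred data.
    psf_mirror = psf[::-1]
    estimate = [1.0] * len(observed)
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate
```

Deconvolving a blurred point source with the PSF that blurred it progressively re-concentrates the intensity at the original location, which is exactly the resolution-enhancement effect the abstract describes.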

Chen, Zhipeng; Shi, Jianhong; Li, Yuan; Li, Qing; Zeng, Guihua



Spectral semi-blind deconvolution with least trimmed squares regularization  

NASA Astrophysics Data System (ADS)

A spectral semi-blind deconvolution with least trimmed squares regularization (SBD-LTS) is proposed to improve spectral resolution. First, the regularization term for the spectrum data is modeled in the form of least trimmed squares, which helps to preserve peak details better. Then the regularization term for the PSF is modeled as an L1-norm to enhance the stability of kernel estimation. The cost function of SBD-LTS is formulated and the numerical solution processes are deduced for deconvolving the spectra and estimating the PSF. The deconvolution results for simulated infrared spectra demonstrate that the proposed SBD-LTS can recover the spectrum effectively and estimate the PSF accurately, and has merit in preserving details, especially in the presence of noise. The deconvolution result for an experimental Raman spectrum indicates that SBD-LTS can resolve the spectrum and improve the resolution effectively.
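The abstract does not give the cost function explicitly; a plausible form consistent with its description (a data-fidelity term, a least-trimmed-squares regularizer on the spectrum, and an L1 regularizer on the PSF) would be the following, where the symbols and weights are our notation, not the paper's:

```latex
J(x, h) \;=\; \tfrac{1}{2}\,\lVert h * x - y \rVert_2^2
\;+\; \lambda_1 \sum_{i=1}^{k} r_{(i)}^2(x)
\;+\; \lambda_2 \lVert h \rVert_1
```

Here $y$ is the measured spectrum, $x$ the latent spectrum, $h$ the PSF, and $r_{(1)} \le \dots \le r_{(k)}$ are the $k$ smallest ordered residuals of the spectrum regularizer. Because only the $k$ smallest residuals enter the penalty, the largest residuals (typically at sharp peaks) are "trimmed" out, which is how an LTS term can preserve peak details that a plain least-squares regularizer would smooth away.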

Deng, Lizhen; Zhu, Hu



PERT: A Method for Expression Deconvolution of Human Blood Samples from Varied Microenvironmental and  

E-print Network

PERT: a method for expression deconvolution of human blood samples from varied microenvironmental and developmental conditions. ... deconvolution methods to predict cell frequencies within heterogeneous human blood samples that were collected ... The composition of heterogeneous samples can be predicted using an expression deconvolution algorithm to decompose

Zandstra, Peter W.


DAnTE: a statistical tool for quantitative analysis of -omics data  

Microsoft Academic Search

Summary: DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges associated with quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting

Ashoka D. Polpitiya; Wei-jun Qian; Navdeep Jaitly; Vladislav A. Petyuk; Joshua N. Adkins; David G. Camp II; Gordon A. Anderson; Richard D. Smith



Application of the CO2-PENS risk analysis tool to the Rock Springs Uplift, Wyoming  

Microsoft Academic Search

We describe a preliminary application of the CO2-PENS performance and risk analysis tool to a planned geologic CO2 sequestration demonstration project in the Rock Springs Uplift (RSU), located in southwestern Wyoming. We use data from the RSU to populate CO2-PENS, an evolving system-level modeling tool developed at Los Alamos National Laboratory. This tool has been designed to generate performance and

Philip H. Stauffer; Rajesh J. Pawar; Ronald C. Surdam; Zunsheng Jiao; Hailin Deng; Bruce C. Lettelier; Hari S. Viswanathan; Dean L. Sanzo; Gordon N. Keating



Receiver function deconvolution using transdimensional hierarchical Bayesian inference  

NASA Astrophysics Data System (ADS)

Teleseismic waves can convert from shear to compressional (Sp) or compressional to shear (Ps) across impedance contrasts in the subsurface. Deconvolving the parent waveforms (P for Ps or S for Sp) from the daughter waveforms (S for Ps or P for Sp) generates receiver functions which can be used to analyse velocity structure beneath the receiver. Though a variety of deconvolution techniques have been developed, they are all adversely affected by background and signal-generated noise. In order to take into account the unknown noise characteristics, we propose a method based on transdimensional hierarchical Bayesian inference in which both the noise magnitude and noise spectral character are parameters in calculating the likelihood probability distribution. We use a reversible-jump implementation of a Markov chain Monte Carlo algorithm to find an ensemble of receiver functions whose relative fits to the data have been calculated while simultaneously inferring the values of the noise parameters. Our noise parametrization is determined from pre-event noise so that it approximates observed noise characteristics. We test the algorithm on synthetic waveforms contaminated with noise generated from a covariance matrix obtained from observed noise. We show that the method retrieves easily interpretable receiver functions even in the presence of high noise levels. We also show that we can obtain useful estimates of noise amplitude and frequency content. Analysis of the ensemble solutions produced by our method can be used to quantify the uncertainties associated with individual receiver functions as well as with individual features within them, providing an objective way for deciding which features warrant geological interpretation. This method should make possible more robust inferences on subsurface structure using receiver function analysis, especially in areas of poor data coverage or under noisy station conditions.
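The reversible-jump, pre-event-noise machinery described in the abstract is far richer than can be shown here, but its core idea, treating the noise magnitude as a sampled parameter rather than a fixed constant (hierarchical Bayes), can be illustrated with a minimal Metropolis sampler for a constant signal in Gaussian noise. All names, priors, and tuning constants below are our assumptions for the sketch:

```python
import math
import random

def log_likelihood(data, mu, sigma):
    # Gaussian log-likelihood with the noise level sigma treated as an
    # unknown (hierarchical) parameter; additive constants are dropped
    # since only likelihood ratios matter in Metropolis sampling.
    n = len(data)
    ss = sum((d - mu) ** 2 for d in data)
    return -n * math.log(sigma) - ss / (2 * sigma ** 2)

def hierarchical_mh(data, steps=6000, seed=0):
    # Random-walk Metropolis over (mu, sigma) jointly, so the posterior
    # on the signal parameter mu automatically accounts for uncertainty
    # in the noise magnitude sigma.
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0
    ll = log_likelihood(data, mu, sigma)
    samples = []
    for _ in range(steps):
        mu_p = mu + rng.gauss(0, 0.2)
        # Reflect sigma at zero; reflection keeps the proposal symmetric.
        sigma_p = max(abs(sigma + rng.gauss(0, 0.1)), 1e-6)
        ll_p = log_likelihood(data, mu_p, sigma_p)
        if rng.random() < math.exp(min(0.0, ll_p - ll)):
            mu, sigma, ll = mu_p, sigma_p, ll_p
        samples.append((mu, sigma))
    return samples
```

The ensemble of (mu, sigma) samples plays the same role as the paper's ensemble of receiver functions with inferred noise parameters: spread in the mu samples quantifies uncertainty in the estimate given that sigma itself had to be learned from the data.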

Kolb, J. M.; Lekić, V.



Analysis tools for the calibration and commissioning of the AOF  

NASA Astrophysics Data System (ADS)

The Adaptive Optics Facility (AOF) is an AO-oriented upgrade envisaged to be implemented at UT4 in Paranal in 2013-2014, which could serve as a test case for the E-ELT. Counting on the largest Deformable Secondary Mirror ever built (1170 actuators) and on four off-axis Na laser launch telescopes, the AOF will operate in distinct modes (GLAO, LTAO, SCAO), in accordance with the instruments attached to the two telescope Nasmyth ports (GALACSI+MUSE, GRAAL+HAWK-I) and to the Cassegrain port (ERIS). Tools are under development to allow fast testing of important parameters for these systems during commissioning and for posterior assessment of telemetry data. These concern the determination of turbulence parameters and Cn2 profiling, measurement of Strehl and ensquared energies, misregistration calculation, bandwidth & overall performance, etc. Our tools are presented as Graphical User Interfaces developed in the Matlab environment, and will be able to grab, through a dedicated server, data saved in SPARTA standards. We present here the tools developed to date and discuss details of what can be obtained from the AOF, based on simulations.

Garcia-Rissmann, Aurea; Kolb, Johann; Le Louarn, Miska; Madec, Pierre-Yves; Muller, Nicolas



MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis  

SciTech Connect

MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.
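The central operation the abstract describes, clustering mass and LC elution features across datasets, can be illustrated with a toy greedy matcher. The tolerances, data layout, and greedy strategy are our assumptions for illustration, not MultiAlign's actual algorithm or defaults:

```python
def match_features(features_a, features_b, mass_tol=0.01, elution_tol=0.5):
    """Greedily pair (mass, elution_time) features across two datasets.

    A pair matches when both the mass difference and the elution-time
    difference fall within the given tolerances; each feature in the
    second dataset is used at most once.
    """
    matches = []
    used = set()
    for i, (mass_a, time_a) in enumerate(features_a):
        for j, (mass_b, time_b) in enumerate(features_b):
            if j in used:
                continue
            if abs(mass_a - mass_b) <= mass_tol and abs(time_a - time_b) <= elution_tol:
                matches.append((i, j))
                used.add(j)
                break
    return matches
```

Features matched this way across many runs form the clusters that can then be looked up in a reference database or used to build abundance profiles, as the abstract outlines.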

Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.



Sub-diffraction limit differentiation of single fluorophores using Single Molecule Image Deconvolution (SMID)  

NASA Astrophysics Data System (ADS)

In order to better understand biological systems, researchers demand new techniques and improvements in single molecule differentiation. We present a unique approach utilizing an analysis of the standard deviation of the Gaussian point spread function of single immobile fluorescent molecules. This technique, Single Molecule Image Deconvolution (SMID), is applicable to standard TIRF instrumentation and standard fluorophores. We demonstrate the method by measuring the separation of two Cy3 molecules attached to the ends of short double-stranded DNA immobilized on a surface without photobleaching. Preliminary results and further applications will be presented.
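The key observable in this approach is the width (standard deviation) of the Gaussian point spread function of each spot. A 1-D moment-based sketch of that measurement is below; SMID itself fits the PSF to real TIRF images, so the sampling grid, function names, and moment method here are illustrative assumptions:

```python
import math

def spot_std(positions, intensities):
    # Intensity-weighted standard deviation of a sampled spot profile,
    # a simple estimate of the Gaussian PSF width.
    total = sum(intensities)
    mean = sum(p * i for p, i in zip(positions, intensities)) / total
    var = sum(i * (p - mean) ** 2 for p, i in zip(positions, intensities)) / total
    return math.sqrt(var)

def gaussian(p, center, sigma=1.0):
    # Idealized emitter profile used to build test spots.
    return math.exp(-((p - center) ** 2) / (2 * sigma ** 2))
```

A single emitter reproduces the PSF width, while two unresolved emitters separated by a distance d broaden the apparent spot to roughly sqrt(sigma^2 + d^2/4). Measuring that broadening is how a SMID-style analysis can distinguish one molecule from two below the diffraction limit without photobleaching.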

Decenzo, Shawn H.; Desantis, Michael C.; Wang, Y. M.



Collaborative Mapping and Analysis Tools for Biological Spatio-Temporal Databases  

Microsoft Academic Search

The purpose of this project is to deliver fast and collaborative image processing to the biology laboratory bench. This will be achieved in the context of spatial data mapping, reconstruction, analysis and database query by developing GRID-enabled tools based on the existing tools developed as part of the Mouse Atlas Program at the MRC Human Genetics Unit. The Mouse Atlas

Richard Baldock; Mehran Sharghi



Compiling Dynamic Fault Trees into Dynamic Bayesian Nets for Reliability Analysis: the RADYBAN tool  

Microsoft Academic Search

In this paper, we present Radyban (Reliability Analysis with DYnamic BAyesian Networks), a software tool which allows one to analyze systems modeled by means of Dynamic Fault Trees (DFT), relying on automatic conversion into Dynamic Bayesian Networks (DBN). The tool aims at providing a familiar interface to reliability engineers, by allowing them to model the system to be analyzed with

Luigi Portinale; Andrea Bobbio; Daniele Codetta-Raiteri; Stefania Montani



A new energy analysis tool for ground source heat pump systems  

Microsoft Academic Search

A new tool, suitable for energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in the Matlab environment. The time step of the simulation can be freely chosen by the user (e.g. 1 h, 2 h, etc.) and the calculation time required is very short. The

A. Michopoulos; N. Kyriakis



BugMaps-Granger: A Tool for Causality Analysis between Source Code Metrics and Bugs  

E-print Network

visualizations for causal analysis of bugs. We also provide a case study in order to evaluate the tool. ... to evaluate whether past changes to a given time series of source code metrics can be used to forecast changes in a time series of defects. Our tool extracts source code versions from version control platforms

Paris-Sud XI, Université de


3D scanning technology as a standard archaeological tool for pottery analysis: practice and theory  

E-print Network

3D scanning technology ... its applications as a practical tool to accompany and serve archaeological projects did not reach ...

Smilansky, Uzy


Applying observations of work activity in designing prototype data analysis tools  

SciTech Connect

Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments for specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but initiated in-depth discussions about users' work, tools, technology, and requirements.

Springmeyer, R.R.



Process-oriented evaluation of user interactions in integrated system analysis tools  

E-print Network

When computer-based tools are used for analysis of complex systems, the design of user interactions and interfaces becomes an essential part of development that determines the overall quality. The objective of this study ...

Lee, Chaiwoo


Using the WorldCat Collection Analysis Tool: Experiences From the University of Kansas Libraries  

E-print Network

In March of 2009, the University of Kansas (KU) Libraries began a year-long subscription to OCLCs WorldCat Collection Analysis (WCA) tool, which was recommend by the Associate Dean of Technical Services and the Assistant ...

Monroe-Gulick, Amalia; Currie, Lea



Matrix: A statistical method and software tool for linguistic analysis through corpus comparison  

Microsoft Academic Search

Abstract Matrix: A statistical method and software tool for linguistic analysis through corpus comparison A thesis submitted to Lancaster University for the degree of Ph D in Computer Science Paul Edward Rayson, B Sc September 2002

P. Rayson



An integrated traverse planner and analysis tool for future lunar surface exploration  

E-print Network

This thesis discusses the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT), a system designed to help maximize productivity, scientific return, and safety on future lunar and planetary explorations,. The ...

Johnson, Aaron William



Communication Subsystem Synthesis and Analysis Tool using Bus Architecture Generation and  

E-print Network

that various constraints, like bus length, topology complexity, potential for communication conflicts over time function models bus length, bus topology complexity, potential for communication conflicts over timeCommunication Subsystem Synthesis and Analysis Tool using Bus Architecture Generation

Doboli, Simona


Powerful tool for design analysis of linear control systems  

SciTech Connect

The methods for designing linear controls for electronic or mechanical systems have long been understood and put into practice. What has not been readily available to engineers, however, is a practical, quick and inexpensive method for analyzing these linear control (feedback) systems once they have been designed into the electronic or mechanical hardware. Now the PET, manufactured by Commodore Business Machines (CBM), operating with several peripherals via the IEEE 488 bus, brings the engineer, for about $4000, a complete set of office tools for analyzing these system designs.

Maddux, Jr, A S



Treemaps as a Tool for Social Network Analysis* CASOS Technical Report  

E-print Network

Terrill L. Frantz & Kathleen ... treemaps are also useful in the social network analysis setting. A treemap represents hierarchical ... Keywords: social network analysis, large datasets, data


Tools of the Trade: Region of interest analysis for fMRI  

E-print Network

A common approach to the analysis of fMRI data involves ... analysis between fMRI and behavioral data in our laboratory failed to uncover any activation in several ...

Poldrack, Russ


Automated image analysis as a tool to quantify the colour and composition of rainbow trout  

E-print Network

Automated image analysis as a tool to quantify the colour and composition of rainbow trout cutlets ... in rainbow trout. The proposed automated image analysis methods were tested on a total of 983 trout cutlets. © 2006 Elsevier B.V. All rights reserved. Keywords: Image analysis; Rainbow trout; Cutlet

Manne, Fredrik


Socio-economic analysis: a tool for assessing the potential of nanotechnologies  

E-print Network

Jean-Marc Brignon. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation ... For nanotechnologies, SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCA are increasingly

Paris-Sud XI, Université de


ClariNet: a noise analysis tool for deep submicron design  

Microsoft Academic Search

Coupled noise analysis has become a critical issue for deep-submicron, high performance design. In this paper, we present ClariNet, an industrial noise analysis tool, which was developed to efficiently analyze large, high performance processor designs. We present the overall approach and tool flow of ClariNet and discuss three critical large-processor design issues which have received limited discussion in the past.

Rafi Levy; David Blaauw; Gabi Braca; Aurobindo Dasgupta; Amir Grinshpon; Chanlee Oh; Boaz Orshav; Supamas Sirichotiyakul; Vladimir Zolotov



PROVAT - a versatile tool for Voronoi tessellation analysis of protein structures and complexes  

E-print Network

BMC Bioinformatics poster presentation. Swanand P Gore, David F Burke and Tom L Blundell, Department of Biochemistry, University of Cambridge, UK. Voronoi tessellation has proved to be a useful tool in protein structure analysis. But a...



Image restoration for confocal microscopy: improving the limits of deconvolution, with application to the visualization of the mammalian hearing organ.  

PubMed Central

Deconvolution algorithms have proven very effective in conventional (wide-field) fluorescence microscopy. Their application to confocal microscopy is hampered, in biological experiments, by the presence of important levels of noise in the images and by the lack of a precise knowledge of the point spread function (PSF) of the system. We investigate the application of wavelet-based processing tools to deal with these problems, in particular wavelet denoising methods, which turn out to be very effective in application to three-dimensional confocal images. When used in combination with more classical deconvolution algorithms, these methods provide a robust and efficient restoration scheme allowing one to deal with difficult imaging conditions. To make our approach applicable in practical situations, we measured the PSF of a Biorad-MRC1024 confocal microscope under a large set of imaging conditions, including in situ acquisitions. As a specific biological application, we present several examples of restorations of three-dimensional confocal images acquired inside an intact preparation of the hearing organ. We also provide a quantitative assessment of the gain in quality achieved by wavelet-aided restorations over classical deconvolution schemes, based on a set of numerical experiments that we performed with test images. PMID:11325744

Boutet de Monvel, J; Le Calvez, S; Ulfendahl, M



Mesh-based spherical deconvolution: a flexible approach to reconstruction of non-negative fiber orientation distributions.  


Diffusion-weighted MRI has enabled the imaging of white matter architecture in vivo. Fiber orientations have classically been assumed to lie along the major eigenvector of the diffusion tensor, but this approach has well-characterized shortcomings in voxels containing multiple fiber populations. Recently proposed methods for recovery of fiber orientation via spherical deconvolution utilize a spherical harmonics framework and are susceptible to noise, yielding physically-invalid results even when additional measures are taken to minimize such artifacts. In this work, we reformulate the spherical deconvolution problem onto a discrete spherical mesh. We demonstrate how this formulation enables the estimation of fiber orientation distributions which strictly satisfy the physical constraints of realness, symmetry, and non-negativity. Moreover, we analyze the influence of the flexible regularization parameters included in our formulation for tuning the smoothness of the resultant fiber orientation distribution (FOD). We show that the method is robust and reliable by reconstructing known crossing fiber anatomy in multiple subjects. Finally, we provide a software tool for computing the FOD using our new formulation in hopes of simplifying and encouraging the adoption of spherical deconvolution techniques. PMID:20206705
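The central constraint in this formulation, non-negativity of the fiber orientation distribution on a discrete mesh, makes the reconstruction a non-negative least squares problem. The following is a minimal projected-gradient sketch of that idea in pure Python; it is not the authors' solver, omits their smoothness regularization, and all names and step sizes are our assumptions:

```python
def matvec(A, x):
    # Dense matrix-vector product for small illustrative systems.
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def nonneg_deconvolve(A, b, iterations=2000, step=0.1):
    """Projected gradient descent for min ||A x - b||^2 subject to x >= 0.

    A: response matrix mapping mesh FOD weights to measured signal
       (here just a small dense matrix for illustration).
    b: measured data vector.
    The projection max(0, .) after each gradient step is what enforces
    the physical non-negativity constraint on the solution.
    """
    n = len(A[0])
    x = [0.0] * n
    At = list(map(list, zip(*A)))
    for _ in range(iterations):
        residual = [ax - bi for ax, bi in zip(matvec(A, x), b)]
        grad = matvec(At, residual)
        x = [max(0.0, xi - step * gi) for xi, gi in zip(x, grad)]
    return x
```

The step size must be small relative to the largest eigenvalue of A^T A for convergence; real spherical-deconvolution solvers use far larger systems and specialized constrained solvers, but the projection idea is the same.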

Patel, Vishal; Shi, Yonggang; Thompson, Paul M; Toga, Arthur W
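As a rough illustration of deconvolution on a discrete direction mesh with a hard non-negativity constraint, the sketch below poses the problem as non-negative least squares (`scipy.optimize.nnls`) over directions on a circle rather than a sphere. The Watson-like response kernel, the `kappa` and `lam` parameters, and the `fit_fod` helper are illustrative assumptions, not the authors' formulation.

```python
import numpy as np
from scipy.optimize import nnls

def fit_fod(signal, dirs_meas, dirs_mesh, kappa=10.0, lam=0.1):
    """Non-negative FOD estimate on a discrete direction mesh (sketch).

    A[i, j] is the single-fiber response of mesh direction j observed
    along measurement direction i; the ridge weight lam plays the role
    of the smoothness-tuning regularization parameter.
    """
    cos2 = (dirs_meas @ dirs_mesh.T) ** 2          # antipodally symmetric
    A = np.exp(kappa * (cos2 - 1.0))               # Watson-like response kernel
    m = dirs_mesh.shape[0]
    A_reg = np.vstack([A, lam * np.eye(m)])        # stacked ridge penalty
    b_reg = np.concatenate([signal, np.zeros(m)])
    fod, _ = nnls(A_reg, b_reg)                    # non-negativity by construction
    return fod
```

Because NNLS enforces the constraint at the solver level, the recovered distribution is non-negative by construction rather than by post-hoc clipping.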



Analysis, Diagnosis, and Short-Range Forecast Tools  

NSDL National Science Digital Library

This lesson is divided into three sections. The first section discusses the importance of analysis and diagnosis in evaluating NWP in the forecast process. In section two, we discuss a methodology for dealing with discrepancies of both the official forecast and NWP relative to the analysis and diagnosis. The third section shows a representative example of the methodology.




The analysis of heart rate variability (HRV) signals is an important tool for studying the autonomic nervous system  

E-print Network

The analysis of heart rate variability (HRV) signals is an important tool for studying the autonomic nervous system and how its balance varies with time. This work presents a tool for time-frequency analysis of heart rate variability (HRV), which was developed in Matlab 6.

Carvalho, João Luiz


Energy life-cycle analysis modeling and decision support tool  

SciTech Connect

As one of DOE's five multi-program national laboratories, Pacific Northwest Laboratory (PNL) develops and deploys technology for national missions in energy and the environment. The Energy Information Systems Group, within the Laboratory's Computer Sciences Department, focuses on the development of the computational and data communications infrastructure and automated tools for the Transmission and Distribution energy sector and for advanced process engineering applications. The energy industry is being forced to operate in new ways and under new constraints. It is in a reactive mode, reacting to policies and politics, and to economics and environmental pressures. The transmission and distribution sectors are being forced to find new ways to maximize the use of their existing infrastructure, increase energy efficiency, and minimize environmental impacts, while continuing to meet the demands of an ever-increasing population. The creation of a sustainable energy future will be a challenge for both the soft and hard sciences. It will require that we as creators of our future be bold in the way we think about our energy future and aggressive in its development. The development of tools to help bring about a sustainable future will not be simple either. The development of ELCAM, for example, represents a stretch for the computational sciences as well as for each of the domain sciences such as economics, which will have to be team members.

Hoza, M.; White, M.E.



High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.  


The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

Simonyan, Vahan; Mazumder, Raja




PROBABILISTIC BLIND DECONVOLUTION OF NON-STATIONARY SOURCES  

E-print Network

We consider a mixture of colored noise signals with additive white noise and derive a time-domain EM algorithm for its blind deconvolution. The method is based on second-order statistics and is attractive for its relative simplicity of implementation.

Olsson, Rasmus Kongsgaard; Hansen, Lars Kai


Non-iterative wavelet-based deconvolution for sparse aperture system  

NASA Astrophysics Data System (ADS)

Optical sparse aperture imaging is a promising technology for obtaining high resolution with a significant reduction in size and weight, achieved by minimizing the total light collection area. However, as the collection area decreases, the system OTF is also greatly attenuated, so the direct imaging quality of a sparse aperture system is very poor. In this paper, we focus on post-processing methods for sparse aperture systems and propose a non-iterative wavelet-based deconvolution algorithm. The algorithm adaptively denoises the Fourier-based deconvolution result on a wavelet basis. We set up a Golay-3 sparse-aperture imaging system, with which imaging and deconvolution experiments on natural scenes were performed. The experiments demonstrate that the proposed method greatly improves the imaging quality of the Golay-3 sparse-aperture system and produces satisfactory visual quality. Furthermore, our experimental results also indicate that sparse aperture systems have the potential to reach higher resolution with the help of better post-processing deconvolution techniques.

Xu, Wenhai; Zhao, Ming; Li, Hongshu
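The non-iterative core of such a scheme, a regularized Fourier-domain inverse filter, can be sketched in a few lines (a 1-D Tikhonov-style illustration; the function name and `eps` parameter are assumptions, and the paper's wavelet denoising stage would follow this step):

```python
import numpy as np

def fourier_deconvolve(blurred, psf, eps=1e-2):
    """Non-iterative Tikhonov-regularized inverse filter (1-D sketch).

    eps damps the frequencies where the sparse-aperture OTF is strongly
    attenuated, which a plain inverse filter would amplify into noise.
    """
    otf = np.fft.fft(psf, blurred.size)
    filt = np.conj(otf) / (np.abs(otf) ** 2 + eps)
    return np.real(np.fft.ifft(np.fft.fft(blurred) * filt))
```

Because the filter is applied in a single pass, the cost is one forward and one inverse FFT, which is what makes the overall approach non-iterative.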



DRAFT Deconvolution of Neutron Depth Profiling Spectra with Entropic Priors  

E-print Network

An iterative algorithm to deconvolve neutron depth profiling spectra, which encode the depth distribution of impurities in a sample, is derived using entropic priors. General guidelines are given for computational experiments, to be designed in close contact with the experimental work. (Draft dated March 18, 1992.)

Rodriguez, Carlos


Deconvolution of adaptive optics retinal images Julian C. Christou  

E-print Network

Adaptive optics (AO) improves retinal imaging by correcting wave-front aberrations, but the correction is not perfect, and deconvolution can further improve the contrast of the adaptive optics images. In this work we demonstrate that quantitative information can also be recovered by deconvolution.

Christou, Julian C.

Spent Nuclear Fuel Characterization Through Neutron Flux Deconvolution  

Microsoft Academic Search

A method to determine the composition of spent fuel through spectral deconvolution of the neutron flux emitted from the fuel is proposed. Recently developed GaAs(¹⁰B) semiconductor detector arrays are used. The results of Monte Carlo simulations of the detector responses, illustrating the feasibility of the spectral unfolding technique for spent fuel characterization, are presented.

Michael R. Hartman; John C. Lee




Blind deconvolution using an improved L0 sparse representation  

NASA Astrophysics Data System (ADS)

In this paper, we present a method for single-image blind deconvolution. Many common blind deconvolution methods need to first generate a salient image, whereas this paper presents a novel L0 sparse expression to solve the ill-posed problem directly. It needs no filtering of the blurred image as a restoration step and can use the gradient information as a fidelity term during optimization. The key to the blind deconvolution problem is to estimate an accurate kernel. First, based on an L2 sparse expression using the gradient operator as a prior, the kernel is estimated roughly and efficiently in the frequency domain. We adopt a multi-scale scheme that estimates the blur kernel from coarser to finer levels. After the estimation of each level's kernel, the L0 sparse representation is employed as the fidelity term during restoration. After derivation, the L0 norm can be approximately converted to a sum term and an L1 norm term, which can be addressed by the Split-Bregman method. The final restored image is obtained using the estimated blur kernel and the TV deconvolution model. Experimental results show that the proposed method is fast and reconstructs the kernel accurately, especially when the blur is motion blur, defocus blur, or a superposition of the two. The restored image is of higher quality than that of several state-of-the-art algorithms.

Ye, Pengzhao; Feng, Huajun; Li, Qi; Xu, Zhihai; Chen, Yueting
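The coarse frequency-domain kernel estimate with an L2 prior on gradients has a closed form. Below is a hypothetical 1-D sketch (circular differences stand in for image gradients; `estimate_kernel` and `gamma` are assumed names, and the multi-scale loop and L0/Split-Bregman restoration stage are omitted):

```python
import numpy as np

def estimate_kernel(sharp, blurred, ksize, gamma=1e-2):
    """Closed-form blur-kernel estimate with an L2 prior on gradients.

    Minimizes ||grad(sharp) (*) k - grad(blurred)||^2 + gamma ||k||^2
    in the frequency domain ((*) is circular convolution), then crops,
    clips and normalizes the kernel.
    """
    gx = sharp - np.roll(sharp, 1)       # circular finite differences
    gy = blurred - np.roll(blurred, 1)
    Gx, Gy = np.fft.fft(gx), np.fft.fft(gy)
    K = np.conj(Gx) * Gy / (np.abs(Gx) ** 2 + gamma)
    k = np.real(np.fft.ifft(K))[:ksize]
    k = np.maximum(k, 0.0)               # blur kernels are non-negative
    return k / k.sum()
```

In a full pipeline this estimate would be refined at each scale before the TV-based final restoration.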



A globally convergent approach for blind MIMO adaptive deconvolution  

Microsoft Academic Search

We discuss the blind deconvolution of multiple-input/multiple-output (MIMO) linear convolutional mixtures and propose a set of hierarchical criteria motivated by the maximum entropy principle. The proposed criteria are based on the constant-modulus (CM) criterion in order to guarantee that all minima achieve perfect restoration of the different sources. The approach is moreover robust to errors in channel order estimation.

Azzdine Touzni; Inbar Fijalkow; Michael G. Larimore; John R. Treichler



Blind deconvolution of ultrasonic traces accounting for pulse variance  

Microsoft Academic Search

The ability of pulse-echo measurements to resolve closely spaced reflectors is limited by the duration of the ultrasonic pulse. Resolution can be improved by deconvolution, but this often fails because frequency selective attenuation introduces unknown changes in the pulse shape. In this paper we propose a maximum a posteriori algorithm for simultaneous estimation of a time varying pulse and high-resolution

Kjetil F. Kaaresen; E. Bolviken



Detection of gravity field source boundaries using deconvolution method  

NASA Astrophysics Data System (ADS)

Complications arise in the interpretation of gravity fields because of interference from systematic degradations, such as boundary blurring and distortion. The major sources of these degradations are the various systematic errors that inevitably occur during gravity field data acquisition, discretization and geophysical forward modelling. To address this problem, we evaluate a deconvolution method that aims to detect the clear horizontal boundaries of anomalous sources by suppressing systematic errors. A convolution-based multilayer projection model, based on the classical 3-D gravity field forward model, is derived to model the systematic error degradation. Our deconvolution algorithm is specifically designed around this multilayer projection model, in which three types of systematic error are defined. The degradations of the different systematic errors are considered in the deconvolution algorithm. As the primary source of degradation, the convolution-based systematic error is the main object of the multilayer projection model. Both the random systematic error and the projection systematic error are shown to form an integral part of the multilayer projection model, and the mixed-norm regularization method and the primal-dual optimization method are therefore employed to control these errors and stabilize the deconvolution solution. We analyse the parameter identification and convergence of the proposed algorithms, and synthetic and field data sets are both used to illustrate their effectiveness. Additional synthetic examples are specifically designed to analyse the effects of the projection systematic error, which is caused by the uncertainty associated with the estimation of the impulse response function.

Zuo, Boxin; Hu, Xiangyun; Liang, Yin; Han, Qi



Breast image feature learning with adaptive deconvolutional networks  

NASA Astrophysics Data System (ADS)

Feature extraction is a critical component of medical image analysis. Many computer-aided diagnosis approaches employ hand-designed, heuristic lesion extracted features. An alternative approach is to learn features directly from images. In this preliminary study, we explored the use of Adaptive Deconvolutional Networks (ADN) for learning high-level features in diagnostic breast mass lesion images, with potential application to computer-aided diagnosis (CADx) and content-based image retrieval (CBIR). ADNs (Zeiler et al., 2011) are recently-proposed unsupervised, generative hierarchical models that decompose images via convolutional sparse coding and max pooling. We trained the ADNs to learn multiple layers of representation for two breast image data sets on two different modalities (739 full-field digital mammography (FFDM) and 2393 ultrasound images). Feature map calculations were accelerated by use of GPUs. Following Zeiler et al., we applied the Spatial Pyramid Matching (SPM) kernel (Lazebnik et al., 2006) on the inferred feature maps and combined this with a linear support vector machine (SVM) classifier for the task of binary classification between cancer and non-cancer breast mass lesions. Non-linear, local structure preserving dimension reduction, Elastic Embedding (Carreira-Perpiñán, 2010), was then used to visualize the SPM kernel output in 2D and qualitatively inspect the image relationships learned. Performance was found to be competitive with current CADx schemes that use human-designed features, e.g., achieving a 0.632+ bootstrap AUC (by case) of 0.83 [0.78, 0.89] for an ultrasound image set (1125 cases).

Jamieson, Andrew R.; Drukker, Karen; Giger, Maryellen L.



Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)  

NASA Technical Reports Server (NTRS)

The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.



Deconvolution-Based CT and MR Brain Perfusion Measurement: Theoretical Model Revisited and Practical Implementation Details  

PubMed Central

Deconvolution-based analysis of CT and MR brain perfusion data is widely used in clinical practice and it is still a topic of ongoing research activities. In this paper, we present a comprehensive derivation and explanation of the underlying physiological model for intravascular tracer systems. We also discuss practical details that are needed to properly implement algorithms for perfusion analysis. Our description of the practical computer implementation is focused on the most frequently employed algebraic deconvolution methods based on the singular value decomposition. In particular, we further discuss the need for regularization in order to obtain physiologically reasonable results. We include an overview of relevant preprocessing steps and provide numerous references to the literature. We cover both CT and MR brain perfusion imaging in this paper because they share many common aspects. The combination of both the theoretical as well as the practical aspects of perfusion analysis explicitly emphasizes the simplifications to the underlying physiological model that are necessary in order to apply it to measured data acquired with current CT and MR scanners. PMID:21904538

Fieselmann, Andreas; Kowarschik, Markus; Ganguly, Arundhuti; Hornegger, Joachim; Fahrig, Rebecca
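The singular-value-decomposition approach with truncation regularization, as discussed above, can be sketched for a 1-D tissue curve. The Toeplitz construction and the truncation of small singular values follow the standard sSVD formulation; the function name and default 10% threshold are illustrative assumptions.

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, threshold=0.1):
    """Truncated-SVD deconvolution of a tissue curve by the AIF (sketch).

    Builds the lower-triangular Toeplitz convolution matrix of the AIF,
    zeroes singular values below threshold * s_max (the regularization
    step discussed above), and returns the scaled residue function
    k(t) = CBF * R(t).
    """
    n = len(aif)
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > threshold * s.max(), 1.0 / s, 0.0)  # truncation
    return Vt.T @ (s_inv * (U.T @ tissue))
```

Without the truncation the small singular values would amplify measurement noise into physiologically unreasonable residue functions, which is exactly the need for regularization the paper emphasizes.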



Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles  

PubMed Central

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties. PMID:20052578

Baer, DR; Gaspar, DJ; Nachimuthu, P; Techane, SD; Castner, DG



Introducing an Online Cooling Tower Performance Analysis Tool  

E-print Network

The tool is presented in detail to highlight important design considerations and issues, including how the Merkel theory, psychrometric properties, tower types, and historical weather data are incorporated into the analysis.

Muller, M.R.; Muller, M.B.; Rao, P.



Sequential Analysis: A Tool for Monitoring Program Delivery.  

ERIC Educational Resources Information Center

The sensitivity and simplicity of Wald's sequential analysis test in monitoring a preventive health care program are discussed. Data exemplifying the usefulness and expedience of employing sequential methods are presented. (Author/GK)

Howe, Holly L.; Hoff, Margaret B.
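Wald's sequential test can be sketched for a Bernoulli monitoring setting, e.g. tracking the rate of missed appointments or adverse events in a program: the cumulative log-likelihood ratio is compared against thresholds derived from the target error rates, so monitoring can stop as soon as the evidence is sufficient. The helper name and the 0.05 error rates below are illustrative assumptions.

```python
import math

def sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a Bernoulli rate (sketch).

    p0/p1 are the event rates under H0/H1; alpha and beta are the target
    type I and type II error rates that set the decision thresholds.
    """
    upper = math.log((1 - beta) / alpha)      # cross above: accept H1
    lower = math.log(beta / (1 - alpha))      # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(observations, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(observations)
```

The appeal for program monitoring is exactly the sensitivity and simplicity the abstract notes: decisions are reached with far fewer observations, on average, than a fixed-sample test of the same error rates.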




Tools for Comparative Analysis of Alternatives: Competing or Complementary Perspectives?  

EPA Science Inventory

A third generation of environmental policymaking and risk management will increasingly impose environmental measures, which may give rise to analyzing countervailing risks. Therefore, a comprehensive analysis of the risks associated with the decision alternatives at hand will aid decision-makers in prioritizing alternatives that effectively reduce both target and countervailing risks.


OPE The Campus Safety and Security Data Analysis Cutting Tool  

NSDL National Science Digital Library

Provided by the Office of Postsecondary Education (OPE) of the US Department of Education, this searchable database allows users to browse records of reported criminal offenses at over 6000 colleges and universities. The database contains records for 1997-99 and may be browsed by region, state, city, type of institution, instructional program, and number of students. Users can also simply type in the name of a specific institution. Initial entries include basic contact information and links to statistics for criminal offenses, hate offenses, and arrests. Each entry page also links to the relevant page at the National Center for Education Statistics IPEDS COOL (College Opportunities On-Line) website (reviewed in the March 31, 2000 Scout Report), a tool for comparison shopping between different colleges and universities.


National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion  

NASA Technical Reports Server (NTRS)

Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the object-oriented paradigm (C++, CORBA), and the software development process used was also based on the object-oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.

Follen, G.; Naiman, C.; Evans, A.



PRAAD: Preprocessing and Analysis Tool for Arabic Ancient Documents  

Microsoft Academic Search

This paper presents the new system PRAAD for preprocessing and analysis of Arabic historical documents. It is composed of two important parts: preprocessing and analysis of ancient documents. After digitization, the color or greyscale images of ancient documents are distorted by the presence of strong background artefacts, such as scan optical blur and noise, show-through and bleed-through effects, and spots.

Wafa Boussellaa; Abderrazak Zahour; Bruno Taconet; Adel Alimi; Abdellatif Benabdelhafid



Blind deconvolution of 3D transmitted light brightfield micrographs.  


The blind deconvolution algorithm for 3D transmitted light brightfield (TLB) microscopy, published previously [Holmes et al. Handbook of Biological Confocal Microscopy (1995)], is summarized with example images. The main emphasis of this paper is to discuss more thoroughly the importance and usefulness of this method and to provide more detailed evidence, some being quantitative, of its necessity. Samples of horseradish peroxidase (HRP)-stained pyramidal neurones were prepared and evaluated for the ability to see fine structures clearly, including the dendrites and spines. It is demonstrated that the appearance of fine spine structure, and means of identifying spine categories, is made possible by using blind deconvolution. A comparison of images of the same sample from reflected light confocal microscopy, which is the conventional light microscopic way of viewing the 3D structure of these HRP-stained samples, shows that the blind deconvolution method is far superior for clearly showing the structure with less distortion and better resolution of the spines. The main significance of this research is that it is now possible to obtain clear images of 3D structure by light microscopy of absorbing stains. This is important because the TLB microscope is probably the most widely used modality in the life-science laboratory, yet, until now, there has been no reliable means for it to provide visualization of 3D structure clearly. The main importance of the blind deconvolution approach is that it obviates the need to measure the point spread function of the optical system, so that it now becomes realistic to provide a 3D light microscopic deconvolution method that can be pervasively used by microscopists. PMID:11106952

Holmes, T J; O'Connor, N J
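One common way to dispense with a measured PSF is to alternate Richardson-Lucy updates between the object and the PSF (an Ayers-Dainty-style scheme, shown here as a hypothetical 1-D sketch, not necessarily the authors' exact algorithm):

```python
import numpy as np

def blind_rl(observed, psf_size, n_outer=20, n_inner=5):
    """Alternating blind Richardson-Lucy updates (1-D sketch).

    The object and the PSF are updated in turn with RL iterations, so no
    measured point spread function is required.
    """
    n = observed.size
    obj = np.full(n, observed.mean())
    psf = np.zeros(n)
    psf[:psf_size] = 1.0 / psf_size          # flat initial PSF guess

    def rl_step(est, other):
        otf = np.fft.fft(other)
        conv = np.real(np.fft.ifft(np.fft.fft(est) * otf))
        ratio = observed / np.maximum(conv, 1e-12)
        est = est * np.real(np.fft.ifft(np.fft.fft(ratio) * np.conj(otf)))
        return np.maximum(est, 0.0)

    for _ in range(n_outer):
        for _ in range(n_inner):
            psf = rl_step(psf, obj)          # refine the PSF given the object
        psf /= psf.sum()                     # keep the PSF normalized
        for _ in range(n_inner):
            obj = rl_step(obj, psf)          # refine the object given the PSF
    return obj, psf
```

The multiplicative updates keep both estimates non-negative, and the object update conserves the total flux of the observed data, two of the properties that make RL-type blind schemes well behaved in practice.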



DEBRISK, a Tool for Re-Entry Risk Analysis  

NASA Astrophysics Data System (ADS)

An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to assure the risk mitigation of their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. In this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads; this altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its own birth criterion. In the simplest approach the child is born after demise of the parent object. This could be the case of an object A containing an object B which is in the interior of object A and thus not exposed to the atmosphere. Each object is defined by: its shape, attitude and dimensions; its materials along with their physical properties; and its state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring. A newly born object inherits the state vector of its parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached. The mass of melted material is computed from the excess heat and the material properties. 
After each step the amount of ablated material is determined using the lumped-mass approach and is peeled off from the object, updating the mass and shape of the ablated object. The mass in the lumped-mass equation is termed 'thermal mass' and consists of the part of the object that is exposed to the flow (thus excluding the mass of the contained children). A fair number of predefined materials are implemented, along with their thermal properties. In order to allow users to modify the properties or to add new materials, user-defined materials can be used. In that case properties such as specific heat, emissivity and conductivity can either be entered as constants or made temperature dependent by entering a table. Materials can be derived from existing ones, which is useful in case only one or a few of the material properties change. The code has been developed in the Java language, benefiting from the object-oriented approach. Most methods used in DEBRISK to compute drag coefficients and heating rates are engineering methods developed in the 1950s and 1960s, which are used as well in similar tools (ORSAT, SESAME, ORSAT-J, ...). The paper presents a set of comparisons with literature cases of similar tools in order to verify the implementation of those methods in the developed software.

Omaly, P.; Spel, M.
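The lumped-mass heating and ablation bookkeeping described above can be sketched as a single update step (a simplification under assumed constant material properties; the names and parameters are illustrative, not DEBRISK's API):

```python
def ablate(mass, area, cp, t_melt, h_melt, q_flux, dt, temp):
    """One lumped-mass heating/ablation step (object-based re-entry sketch).

    Below the melting point all absorbed heat raises the temperature of
    the thermal mass; beyond it, the excess heat melts material, which
    is peeled off. Returns the updated (mass, temperature).

    mass [kg], area [m^2], cp [J/(kg K)], t_melt [K], h_melt [J/kg],
    q_flux [W/m^2], dt [s], temp [K].
    """
    heat = q_flux * area * dt                     # absorbed heat this step [J]
    needed = mass * cp * (t_melt - temp)          # heat to reach melting point
    if heat <= needed:
        return mass, temp + heat / (mass * cp)    # pure heating, no ablation
    melted = min(mass, (heat - needed) / h_melt)  # mass ablated this step
    return mass - melted, t_melt                  # temperature capped at melt
```

In a full simulation this step would be called along the 3DOF trajectory, with `q_flux` recomputed each step from the object's shape, attitude and flight conditions.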



Extension of a System Level Tool for Component Level Analysis  

NASA Technical Reports Server (NTRS)

This paper presents an extension of a numerical algorithm in a network flow analysis code to perform multi-dimensional flow calculations. The one-dimensional momentum equation in the network flow analysis code has been extended to include momentum transport due to shear stress and the transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear-driven flow in a rectangular cavity) are presented as benchmarks for the verification of the numerical scheme.

Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)
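Of the three benchmarks, Poiseuille flow is the simplest to reproduce: the sketch below solves the 1-D viscous balance mu * d2u/dy2 = dp/dx with no-slip walls by central finite differences and compares it against the analytic parabola (an illustration of the benchmark itself, not the network flow code):

```python
import numpy as np

def poiseuille_fd(n, height, mu, dpdx):
    """Finite-difference Poiseuille channel flow, a classic benchmark for
    verifying shear-stress terms in a flow solver.

    Solves mu * d2u/dy2 = dpdx with no-slip walls u(0) = u(height) = 0
    on n interior grid points; returns the interior velocities.
    """
    dy = height / (n + 1)
    # tridiagonal second-difference operator for the interior points
    A = (mu / dy**2) * (np.diag(-2.0 * np.ones(n))
                        + np.diag(np.ones(n - 1), 1)
                        + np.diag(np.ones(n - 1), -1))
    return np.linalg.solve(A, dpdx * np.ones(n))

def poiseuille_exact(y, height, mu, dpdx):
    """Analytic parabolic profile for the same channel."""
    return (-dpdx / (2.0 * mu)) * y * (height - y)
```

Because the exact solution is quadratic, the second-order central difference reproduces it at the grid nodes to machine precision, which makes this a sharp verification test.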




Development of a User Interface for a Regression Analysis Software Tool  

NASA Technical Reports Server (NTRS)

An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

Ulbrich, Norbert Manfred; Volden, Thomas R.



Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology  

PubMed Central

The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large-scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of effector proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism-scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so they may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed. PMID:24109552

Grüning, Björn A.; Paszkiewicz, Konrad; Pritchard, Leighton



Tools for Comparative Analysis of Alternatives: Competing or Complementary Perspectives?  

Microsoft Academic Search

A third generation of environmental policy making and risk management will increasingly im pose environmental measures, which may give rise to analyzing countervailing risks. Therefore, a comprehensive analysis of all risks associated with the decision alternatives will aid decision- makers in prioritizing alternatives that effectively reduce both target and countervailing risks. Starting with the metaphor of the ripples caused by

Patrick Hofstetter; Jane C. Bare; James K. Hammitt; Patricia A. Murphy; Glenn E. Rice



A geospatial tool for wildfire threat analysis in central Texas  

NASA Astrophysics Data System (ADS)

Wildland fires in the United States are not always confined to wilderness areas. The growth of population centers and housing developments in wilderness areas has blurred the boundaries between rural and urban. This merger of human development and natural landscape is known in the wildland fire community as the wildland urban interface or WUI, and it is within this interface that many wildland fires increasingly occur. As wildland fire intrusions in the WUI increase so too does the need for tools to assess potential impact to valuable assets contained within the interface. This study presents a methodology that combines real-time weather data, a wildland fire behavior model, satellite remote sensing and geospatial data in a geographic information system to assess potential risk to human developments and natural resources within the Austin metropolitan area and surrounding ten counties of central Texas. The methodology uses readily available digital databases and satellite images within Texas, in combination with an industry standard fire behavior model to assist emergency and natural resource managers assess potential impacts from wildland fire. Results of the study will promote prevention of WUI fire disasters, facilitate watershed and habitat protection, and help direct efforts in post wildland fire mitigation and restoration.

Hunter, Bruce Allan


A software tool for the analysis of neuronal morphology data  

PubMed Central

Anatomy plays a fundamental role in supporting and shaping nervous system activity. The remarkable progress of computer processing power within the last two decades has enabled the generation of electronic databases of complete three-dimensional (3D) dendritic and axonal morphology for neuroanatomical studies. Several laboratories are freely posting their reconstructions online after result publication at NeuroMorpho.Org (Nat Rev Neurosci 7:318-324, 2006). These neuroanatomical archives represent a crucial resource to explore the relationship between structure and function in the brain (Front Neurosci 6:49, 2012). However, such 'Cartesian' descriptions bear little intuitive information for neuroscientists. Here, we developed a simple prototype of a MATLAB-based software tool to quantitatively describe the 3D neuronal structures from public repositories. The program imports neuronal reconstructions and quantifies statistical distributions of basic morphological parameters such as branch length, tortuosity, branch's genealogy and bifurcation angles. Using these morphological distributions, our algorithm can generate a set of virtual neurons readily usable for network simulations. PMID:24529393




Ledderose, Julia; Sención, Luis; Salgado, Humberto; Arias-Carrión, Oscar; Treviño, Mario
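The branch-level parameters named in the abstract can be computed directly from a reconstruction's 3D points. A minimal sketch follows; this is not the authors' MATLAB tool, and the array layout and function names are illustrative assumptions:

```python
import numpy as np

def branch_length(points):
    """Sum of Euclidean segment lengths along a 3D branch polyline (N x 3)."""
    diffs = np.diff(points, axis=0)
    return float(np.sum(np.linalg.norm(diffs, axis=1)))

def tortuosity(points):
    """Path length divided by straight-line distance between branch endpoints."""
    chord = np.linalg.norm(points[-1] - points[0])
    return branch_length(points) / chord

# Example: a right-angle branch in the x-y plane
branch = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
print(branch_length(branch))  # 2.0
print(tortuosity(branch))     # 2 / sqrt(2) ≈ 1.414
```

Branch genealogy and bifurcation angles reduce to similar bookkeeping over the reconstruction's parent-child links.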



Deconvolution of Teleseismic P-Waves Using the SV Autocorrelation Method with Application to the P-Wave Structure Beneath the Hi-CLIMB Array in Tibet  

Microsoft Academic Search

The analysis of seismic receiver functions has become an effective approach for determining crust and upper mantle structure. In the traditional receiver function method, the vertical component is used to deconvolve the radial component, where the vertical component is assumed to be equivalent to the source wavelet. However, in the approach of Dasgupta and Nowack (2006), the deconvolution of three-component

S. Roy; R. L. Nowack



On-line concentration measurements in wastewater using nonlinear deconvolution and partial least squares of spectrophotometric data.  


A procedure for the on-line measurement of concentrations of toxins in wastewater using spectrophotometric data is proposed. The complete absorbance spectrum of a wastewater sample is used to predict the concentrations of all possible substances within. Two techniques are examined. In nonlinear spectral deconvolution, the spectrum is decomposed as a linear combination of base spectra and the coefficients of this deconvolution are used to nonlinearly estimate the concentrations. Under partial least squares analysis, the concentrations are directly estimated as a linear combination of the measured spectrum data. Both techniques show good results for estimating the kinetics of samples taken during the reaction phase in a laboratory anaerobic/aerobic SBR used for p-nitrophenol biodegradation. PMID:16722098

Vargas, A; Buitrón, G
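The linear step of the spectral deconvolution described above, decomposing a measured absorbance spectrum into coefficients over a set of base spectra, can be sketched with ordinary least squares. The base spectra and sample values below are illustrative assumptions, and the paper's subsequent nonlinear mapping from coefficients to concentrations is omitted:

```python
import numpy as np

# Two hypothetical base absorbance spectra sampled at 5 wavelengths
base = np.array([[1.0, 0.8, 0.4, 0.1, 0.0],
                 [0.0, 0.2, 0.6, 0.9, 1.0]]).T  # shape (wavelengths, components)

true_coeffs = np.array([0.7, 0.3])
measured = base @ true_coeffs  # noiseless synthetic mixture spectrum

# Deconvolution: least-squares fit of the spectrum to the base spectra
coeffs, *_ = np.linalg.lstsq(base, measured, rcond=None)
print(np.round(coeffs, 3))  # → [0.7 0.3]
```

In practice a nonnegativity constraint on the coefficients is often added, since concentrations cannot be negative.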



The enhancement of fault detection and diagnosis in rolling element bearings using minimum entropy deconvolution combined with spectral kurtosis  

NASA Astrophysics Data System (ADS)

Spectral kurtosis (SK) represents a valuable tool for extracting transients buried in noise, which makes it very powerful for the diagnostics of rolling element bearings. However, a high value of SK requires that the individual transients are separated, which in turn means that if their repetition rate is high their damping must be sufficiently high that each dies away before the appearance of the next. This paper presents an algorithm for enhancing the surveillance capability of SK by using the minimum entropy deconvolution (MED) technique. The MED technique effectively deconvolves the effect of the transmission path and clarifies the impulses, even where they are not separated in the original signal. The paper illustrates these issues by analysing signals taken from a high-speed test rig, which contained a bearing with a spalled inner race. The results show that the use of the MED technique dramatically sharpens the pulses originating from the impacts of the balls with the spall and increases the kurtosis values to a level that reflects the severity of the fault. Moreover, when the algorithm was tested on signals taken from a gearbox for a bearing with a spalled outer race, it shows that each of the impulses originating from the impacts is made up of two parts (corresponding to entry into and exit from the spall). This agrees well with the literature but is often difficult to observe without the use of the MED technique. The use of the MED along with SK analysis also greatly improves the results of envelope analysis for making a complete diagnosis of the fault and trending its progression.

Sawalhi, N.; Randall, R. B.; Endo, H.
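A minimal frame-based estimate of spectral kurtosis illustrates why the impulses matter: stationary Gaussian noise gives SK near zero, while repetitive transients raise it sharply. This sketch omits the MED prewhitening step, and the window choice, frame length, and impulse spacing are illustrative assumptions on a synthetic signal:

```python
import numpy as np

def spectral_kurtosis(x, nperseg=64):
    """Frame-based spectral kurtosis: <|X|^4> / <|X|^2>^2 - 2 per frequency bin."""
    nframes = len(x) // nperseg
    frames = x[:nframes * nperseg].reshape(nframes, nperseg)
    X = np.fft.rfft(frames * np.hanning(nperseg), axis=1)
    p2 = np.mean(np.abs(X) ** 2, axis=0)
    p4 = np.mean(np.abs(X) ** 4, axis=0)
    return p4 / p2 ** 2 - 2.0

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
impulsive = noise.copy()
impulsive[100::512] += 50.0  # periodic impacts, as from a spalled race

# Transients concentrated in a few frames push SK well above the ~0 noise level
print(spectral_kurtosis(impulsive).mean() > spectral_kurtosis(noise).mean())  # True
```

MED's role in the paper is to sharpen those impulses before this statistic is computed, so that closely spaced transients do not merge.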



Leaf analysis as an exploratory tool in mineralogy  

NASA Astrophysics Data System (ADS)

PIXE analysis has been used for more than a decade at the University of Manitoba to determine trace-element concentrations in a wide variety of materials including minerals. Detailed analysis of the elemental composition of leaves is of interest because the macronutrient and micronutrient elements present in plant tissues are already well known from chemical studies. In the present work samples from species Betula populifolia and Picea glauca were irradiated at an incident proton energy of 40 MeV to determine possible additional trace-element concentrations due to migration from mineral deposits present underground. In addition to known nutrient elements, other elements such as Rb, Sr, Cd and Ba were readily detected. In some samples the presence of Pt was also identified.

Mirzai, A. A.; McKee, J. S. C.; Yeo, Y. H.; Gallop, D.; Medved, J.



ADVISOR: a systems analysis tool for advanced vehicle modeling  

Microsoft Academic Search

This paper provides an overview of the US Department of Energy's (DOE's) Advanced Vehicle Simulator (ADVISOR), written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance,

T. Markel; A. Brooker; T. Hendricks; V. Johnson; K. Kelly; B. Kramer; M. O'Keefe; S. Sprik; K. Wipke



Wind Turbine Modeling: Comparison of Advanced Tools for Transient Analysis  

Microsoft Academic Search

This paper presents a comparative study between two commercial programs considering transient analysis of custom power devices based on voltage source converters. The programs investigated were SABER and PSCAD/EMTDC. PSCAD/EMTDC and SABER models of a multi-megawatt wind turbine - the Gamesa G80 - have been implemented and the obtained results are presented in this paper. Both models take into account the

J. A. Fuentes; A. Molina; F. Ruz; E. Gomez; F. Jimenez



Design and Analysis Tools for Concurrent Blackboard Systems  

NASA Technical Reports Server (NTRS)

A blackboard system consists of a set of knowledge sources, a blackboard data structure, and a control strategy used to activate the knowledge sources. The blackboard model of problem solving is best described by Dr. H. Penny Nii of the Stanford University AI Laboratory: "A Blackboard System can be viewed as a collection of intelligent agents who are gathered around a blackboard, looking at pieces of information written on it, thinking about the current state of the solution, and writing their conclusions on the blackboard as they generate them." The blackboard is a centralized global data structure, often partitioned in a hierarchical manner, used to represent the problem domain. The blackboard is also used to allow inter-knowledge source communication and acts as a shared memory visible to all of the knowledge sources. A knowledge source is a highly specialized, highly independent process that takes inputs from the blackboard data structure, performs a computation, and places the results of the computation in the blackboard data structure. This design allows for an opportunistic control strategy. The opportunistic problem-solving technique allows a knowledge source to contribute towards the solution of the current problem without knowing which of the other knowledge sources will use the information. The use of opportunistic problem-solving allows the data transfers on the blackboard to determine which processes are active at a given time. Designing and developing blackboard systems is a difficult process. The designer is trying to balance several conflicting goals and achieve a high degree of concurrent knowledge source execution while maintaining both knowledge and semantic consistency on the blackboard. Blackboard systems have not attained their apparent potential because there are no established tools or methods to guide in their construction or analyze their performance.

McManus, John W.



A fully automated trabecular bone structural analysis tool based on T2* -weighted magnetic resonance imaging.  


One major source affecting the precision of bone structure analysis in quantitative magnetic resonance imaging (qMRI) is inter- and intraoperator variability, inherent in delineating and tracing regions of interest along longitudinal studies. In this paper an automated analysis tool, featuring bone marrow segmentation, region of interest generation, and characterization of cancellous bone of articular joints is presented. In evaluation studies conducted at the knee joint the novel analysis tool significantly decreased the standard error of measurement and improved the sensitivity in detecting minor structural changes. It further eliminated the need of time-consuming user interaction, and thereby increasing reproducibility. PMID:21862288

Kraiger, Markus; Martirosian, Petros; Opriessnig, Peter; Eibofner, Frank; Rempp, Hansjoerg; Hofer, Michael; Schick, Fritz; Stollberger, Rudolf



An integrated data analysis tool for improving measurements on the MST RFP  

NASA Astrophysics Data System (ADS)

Many plasma diagnostics contain complementary information. For example, the double-foil soft x-ray system (SXR) and the Thomson Scattering diagnostic (TS) on the Madison Symmetric Torus both measure electron temperature. The complementary information from these diagnostics can be combined using a systematic method based on integrated data analysis techniques, leading to more accurate and sensitive results. An integrated data analysis tool based on Bayesian probability theory was able to estimate electron temperatures that are consistent with both the SXR and TS diagnostics and more precise than either. A Markov Chain Monte Carlo analysis to increase the flexibility of the tool was implemented and benchmarked against a grid search method.

Reusch, L. M.; Galante, M. E.; Franz, P.; Johnson, J. R.; McGarry, M. B.; Stephens, H. D.; Den Hartog, D. J.
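For independent Gaussian likelihoods, the Bayesian combination of two temperature diagnostics reduces to inverse-variance weighting, which shows why the integrated estimate is more precise than either input. A minimal sketch with hypothetical numbers, not MST data:

```python
import numpy as np

def combine(means, sigmas):
    """Posterior mean and std for independent Gaussian measurements of one quantity."""
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(w * np.asarray(means, dtype=float)) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

# Hypothetical Te estimates (eV): Thomson scattering and double-foil SXR
te, sigma = combine([800.0, 860.0], [40.0, 30.0])
print(round(te, 1), round(sigma, 1))  # → 838.4 24.0
```

The combined uncertainty (24 eV) is smaller than either input uncertainty, the basic payoff of integrated data analysis; the paper's Markov Chain Monte Carlo machinery generalizes this to non-Gaussian, physics-constrained models.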



Using Microsoft PowerPoint as an Astronomical Image Analysis Tool  

Microsoft Academic Search

Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers

Bernhard Beck-Winchatz



International Workshop on Analysis Tools and Methodologies for Embedded and Real-time  

E-print Network

This workshop solicits original contributions on methods and tools for real-time and embedded systems analysis, simulation, and timing analysis, including algorithms for generating random task sets, the use of different application traces in simulation, and statistical approaches to simulation model validation.

Lipari, Giuseppe


CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips  

PubMed Central

Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide better understanding of their biology, phylogeny, virulence and taxonomy that may lead to the discoveries of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes that aims to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), the Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at PMID:24466021

Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh



Exploring NASA and ESA Atmospheric Data Using GIOVANNI, the Online Visualization and Analysis Tool  

NASA Technical Reports Server (NTRS)

Giovanni, the NASA Goddard online visualization and analysis tool, allows users to explore various atmospheric phenomena without learning remote sensing data formats and downloading voluminous data. Using NASA MODIS (Terra and Aqua) and ESA MERIS (ENVISAT) aerosol data as an example, we demonstrate Giovanni usage for online multi-sensor remote sensing data comparison and analysis.

Leptoukh, Gregory



PLFaultCAT: A Product-Line Software Fault Tree Analysis Tool  

Microsoft Academic Search

Industry currently employs a product line approach to software development and deployment as a means to enhance quality while reducing development cost and time. This effort has created a climate where safety-critical software product lines are being developed without the full range of accompanying safety analysis tools available to software engineers. Software Fault Tree Analysis (SFTA) is a technique that

Josh Dehlinger; Robyn R. Lutz



Fault Tree Analysis: A Research Tool for Educational Planning. Technical Report No. 1.  

ERIC Educational Resources Information Center

This ESEA Title III report describes fault tree analysis and assesses its applicability to education. Fault tree analysis is an operations research tool which is designed to increase the probability of success in any system by analyzing the most likely modes of failure that could occur. A graphic portrayal, which has the form of a tree, is

Alameda County School Dept., Hayward, CA. PACE Center.


Methodology, Metrics and Measures for Testing and Evaluation of Intelligence Analysis Tools  

E-print Network

PNWD-3550: Methodology, Metrics and Measures for Testing and Evaluation of Intelligence Analysis Tools. 1. Introduction. The intelligence community, its stakeholders, and the research community have been seeking technology-based solutions to reduce the analyst


Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet  

NASA Technical Reports Server (NTRS)

The existing RocketWeb(TradeMark) Internet Analysis System provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

Muss, J. A.; Johnson, C. W.; Gotchy, M. B.



University Economic Impact Analysis: Applying microeconomic tools and concepts  

NSDL National Science Digital Library

This service-learning impact analysis project had students look in detail at the current employment and purchasing practices and policies of the University of Vermont. Unlike traditional impact analyses that attempt to calculate the total impact of an institution on the local economy, this project attempted to identify where the University could change policies and practices to increase positive local impacts both from an efficiency and equity perspective. Students worked with a 14-person advisory committee from the University, local and state government and local non-profits.

Brooks, Nancy


Stakeholder analysis: a useful tool for biobank planning.  


Stakeholders are individuals, groups, or organizations that are affected by or can affect a particular action undertaken by others. Biobanks relate to a number of donors, researchers, research institutions, regulatory bodies, funders, and others. These stakeholders can potentially have a strong influence upon the organization and operation of a biobank. A sound strategy for stakeholder engagement is considered essential in project management and organization theory. In this article, we review relevant stakeholder theory and demonstrate how a stakeholder analysis was undertaken in the early stage of a planned research biobank at a public hospital in Norway. PMID:24835062

Bjugn, Roger; Casati, Bettina



New Geant4 based simulation tools for space radiation shielding and effects analysis  

NASA Astrophysics Data System (ADS)

We present here a set of tools for space applications based on the Geant4 simulation toolkit, developed for radiation shielding analysis as part of the European Space Agency (ESA) activities in the Geant4 collaboration. The Sector Shielding Analysis Tool (SSAT) and the Materials and Geometry Association (MGA) utility will first be described. An overview of the main features of the MUlti-LAyered Shielding SImulation Software tool (MULASSIS) will follow. The tool specifically addresses shielding optimization and effects analysis. A Java interface allows the use of MULASSIS by the space community over the World Wide Web, integrated in the widely used SPENVIS package. The analysis of the particle transport output automatically provides radiation fluence, ionising and NIEL dose, and effects analysis. ESA is currently funding the porting of these tools to a low-cost parallel processor facility using GRID technology under the ESA SpaceGRID initiative. Other present and future Geant4 projects related to the study of space environment effects on spacecraft will be presented.

Santin, G.; Nieminen, P.; Evans, H.; Daly, E.; Lei, F.; Truscott, P. R.; Dyer, C. S.; Quaghebeur, B.; Heynderickx, D.



SizeUp: A Tool for Interactive Comparative Collection Analysis for Very Large Species Collections  

E-print Network

SizeUp: A Tool for Interactive Comparative Collection Analysis for Very Large Species Collections. Andrew Ozor. Wide Ranging Biological Data - Global... How do we compare and analyze large data sets, and visualize the result in a user friendly tool? Multiple Problems - No formal definition for 'quality...

Ozor, Andrew



Surgem: Next Generation CAD Tools for Interactive Patient Specific Surgical Planning and Hemodynamic Analysis  

Microsoft Academic Search

The first version of an anatomy editing/surgical planning tool targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel shape editing concepts and human-shape interaction (HSI) technologies have been combined to facilitate interactive shape alterations and grid generation. At a surgery planning phase, these tools are applied to design and evaluate possible modifications of patient-specific anatomies of

Jarek Rossignac; Kerem Pekkan; Brian Whited; Kirk Kanter; Ajit Yoganathan; Wallace H. Coulter


Advancing lighting and daylighting simulation: The transition from analysis to design aid tools  

SciTech Connect

This paper explores three significant software development requirements for making the transition from stand-alone lighting simulation/analysis tools to simulation-based design aid tools. These requirements include specialized lighting simulation engines, facilitated methods for creating detailed simulatable building descriptions, and automated techniques for providing lighting design guidance. Initial computer implementations meant to address each of these requirements are discussed to further elaborate these requirements and to illustrate work-in-progress.

Hitchcock, R.J.



A systems analysis tool for construction and demolition wastes management.  


Managing construction and demolition (C&D) wastes has challenged many municipalities with diminishing waste disposal capacity. Facing such challenges, the Massachusetts Department of Environmental Protection proposed a policy restricting the landfill disposal of certain C&D waste materials, if unprocessed. This research studies the potential economic impact of such a restriction on construction contractors and C&D waste processors. A spreadsheet-based systems analysis model has been developed to assist in the cost-benefit evaluation of various C&D waste management scenarios. The model, developed based on the mass balance principle, is designed to track a C&D waste stream through the various stages of a waste management system, i.e. generation, source separation, processing, recycling, and final disposal. This model, by incorporating the material flow data with the cost/revenue data associated with each management activity, can then provide an economic analysis for a proposed C&D waste management scenario. A case study illustrating the application of this model for Massachusetts is also presented. PMID:15567664

Wang, James Y; Touran, Ali; Christoforou, Christoforos; Fadlalla, Hatim
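The mass-balance principle behind the model can be sketched in a few lines: waste tracked through generation, source separation, processing, and disposal must sum to the generated total, and each stage carries a cost or revenue. All rates and prices below are hypothetical, not the Massachusetts case-study values:

```python
# Minimal mass-balance sketch of a C&D waste stream (all numbers hypothetical)
generated = 100.0            # tons of mixed C&D waste
source_separated = 0.30      # fraction diverted to clean recycling at the site
processing_recovery = 0.60   # fraction of processed waste recovered as recyclate

recycled_at_source = generated * source_separated
to_processor = generated - recycled_at_source
recovered = to_processor * processing_recovery
landfilled = to_processor - recovered

# Mass balance: every ton generated is recycled, recovered, or landfilled
assert abs(recycled_at_source + recovered + landfilled - generated) < 1e-9

tip_fee, processing_fee, recyclate_price = 80.0, 45.0, 12.0  # $/ton, hypothetical
net_cost = (landfilled * tip_fee + to_processor * processing_fee
            - (recycled_at_source + recovered) * recyclate_price)
print(round(net_cost, 2))  # → 4526.0
```

Varying the separation and recovery fractions in such a model is what lets contractors and processors compare scenarios under a disposal restriction.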



DAnTE: a statistical tool for quantitative analysis of omics data  

SciTech Connect

DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.

Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep; Petyuk, Vladislav A.; Adkins, Joshua N.; Camp, David G.; Anderson, Gordon A.; Smith, Richard D.
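Two of the listed features, normalization and missing-value imputation, can be illustrated on a toy abundance matrix. Median centering and row-mean imputation here are generic stand-ins, not DAnTE's specific methods (the tool offers several selectable ones), and the numbers are invented:

```python
import numpy as np

# Hypothetical log-abundance matrix: rows = peptides, cols = samples (NaN = missing)
data = np.array([[10.0, 11.0, 10.5],
                 [12.0, np.nan, 12.5],
                 [ 9.0, 10.0,  9.5]])

# Median centering: align each sample's median to the overall median
col_medians = np.nanmedian(data, axis=0)
normalized = data - col_medians + np.nanmedian(data)

# Simple missing-value imputation: replace NaNs with the peptide's row mean
row_means = np.nanmean(normalized, axis=1)
imputed = np.where(np.isnan(normalized), row_means[:, None], normalized)
print(np.isnan(imputed).any())  # False
```

After steps like these, downstream peptide-to-protein rollup and ANOVA operate on a complete, comparably scaled matrix.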



Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide  

NASA Technical Reports Server (NTRS)

The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

Csank, Jeffrey T.; Zinnecker, Alicia M.




SciTech Connect

The standard deconvolution analysis tool (SDAT) algorithms were developed and tested at the University of Texas at Austin. These algorithms utilize the standard spectrum technique for spectral analysis of {beta}-{gamma} coincidence spectra for nuclear explosion monitoring. Work has been conducted under this contract to implement these algorithms into a usable scientific software package with a graphical user interface. Improvements include the ability to read in PHD-formatted data, gain matching, and data visualization. New auto-calibration algorithms were developed and implemented based on 137Cs spectra for assessment of the energy vs. channel calibrations. Details on the user tool and testing are included.

Biegalski, S.; Flory, Adam E.; Schrom, Brian T.; Ely, James H.; Haas, Derek A.; Bowyer, Ted W.; Hayes, James C.
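A 137Cs-based auto-calibration can be sketched as locating the 661.7 keV photopeak and converting its channel to a gain. This is a generic illustration on a synthetic spectrum, not the SDAT algorithm itself, which must also handle gain matching and realistic peak finding:

```python
import numpy as np

CS137_KEV = 661.7  # 137Cs photopeak energy

def gain_from_cs137(spectrum):
    """keV-per-channel gain, assuming the largest peak is the 137Cs photopeak."""
    peak_channel = int(np.argmax(spectrum))
    return CS137_KEV / peak_channel

# Synthetic 1024-channel spectrum: falling continuum plus a photopeak at channel 331
channels = np.arange(1024)
spectrum = 200.0 * np.exp(-channels / 300.0)
spectrum += 500.0 * np.exp(-0.5 * ((channels - 331) / 8.0) ** 2)

print(round(gain_from_cs137(spectrum), 3))  # ≈ 2.0 keV/channel
```

A production calibration would fit the peak centroid rather than take a raw argmax, and would reject spectra where the continuum dominates.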



Multilayered Analysis of Teacher-Student Interactions: Concepts and Perspectives Guiding Video Analysis with Tattoo, the Analytic Transcription Tool  

Microsoft Academic Search

This article describes the development of a video analysis software tool designed to make explicit and open the process of systematic analysis of video material on teaching-learning interactions. The need for an efficient and transparent way of transcribing and analysing video materials was brought forth in a sequence of studies of interaction in music education in Sweden, where spoken language,

Tore West



Automated analysis tools for reducing spacecraft telemetry data  

SciTech Connect

A practical description is presented of the methods used to reduce spacecraft telemetry data using a hierarchical toolkit of software programs developed for a UNIX environment. A project requiring the design, implementation and test flight of small, lightweight spacecraft was recently conducted. This spacecraft development required hundreds of tests and integrations of subsystems on several special purpose testbeds, with each test creating large amounts of telemetered data. This paper focuses on the automated analysis and reduction of data which is telemetered from one of the key subsystems, the Probe. A typical telemetry stream from a testbed run averaged 50 Megabytes of raw data, containing over 1600 system variables. The large telemetry file (raw data) sent from the Probe was decoded and decomposed into a large number of smaller Break Out Files (BOFs) containing variables with timestamps, and image files.

Voss, T.J.



Deconvolution of neural dynamics from fMRI data using a spatiotemporal hemodynamic response function.  


Functional magnetic resonance imaging (fMRI) is a powerful and broadly used means of non-invasively mapping human brain activity. However fMRI is an indirect measure that rests upon a mapping from neuronal activity to the blood oxygen level dependent (BOLD) signal via hemodynamic effects. The quality of estimated neuronal activity hinges on the validity of the hemodynamic model employed. Recent work has demonstrated that the hemodynamic response has non-separable spatiotemporal dynamics, a key property that is not implemented in existing fMRI analysis frameworks. Here both simulated and empirical data are used to demonstrate that using a physiologically based model of the spatiotemporal hemodynamic response function (stHRF) results in a quantitative improvement of the estimated neuronal response relative to unphysical space-time separable forms. To achieve this, an integrated spatial and temporal deconvolution is established using a recently developed stHRF. Simulated data allows the variation of key parameters such as noise and the spatial complexity of the neuronal drive, while knowing the neuronal input. The results demonstrate that the use of a spatiotemporally integrated HRF can avoid "ghost" neuronal responses that can otherwise be falsely inferred. Applying the spatiotemporal deconvolution to high resolution fMRI data allows the recovery of neuronal responses that are consistent with independent electrophysiological measures. PMID:24632091

Aquino, K M; Robinson, P A; Schira, M M; Breakspear, M
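The temporal half of such a deconvolution is often written as regularized division in the frequency domain. The sketch below is a standard 1-D Wiener deconvolution with a synthetic, gamma-like HRF and a single synthetic event; it deliberately ignores the paper's central point, the spatial coupling of the hemodynamic response:

```python
import numpy as np

def wiener_deconvolve(bold, hrf, noise_power=1e-2):
    """Estimate neuronal drive from a BOLD time series by regularized FFT division."""
    n = len(bold)
    H = np.fft.rfft(hrf, n)
    B = np.fft.rfft(bold)
    return np.fft.irfft(B * np.conj(H) / (np.abs(H) ** 2 + noise_power), n)

t = np.arange(0, 30, 0.5)
hrf = (t / 6.0) ** 2 * np.exp(-t / 3.0)  # crude gamma-like HRF, illustrative only
neural = np.zeros(200)
neural[40] = 1.0                         # single brief neuronal event
bold = np.convolve(neural, hrf)[:200]    # forward hemodynamic model

est = wiener_deconvolve(bold, hrf)
print(int(np.argmax(est)))  # recovers the event near sample 40
```

The spatiotemporal approach in the paper replaces this separable 1-D kernel with a coupled space-time HRF, which is what suppresses the "ghost" responses.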



Isotope pattern deconvolution-tandem mass spectrometry for the determination and confirmation of diclofenac in wastewaters.  


Isotope dilution mass spectrometry (IDMS) based on isotope pattern deconvolution (IPD) has been applied here to MS/MS (QqQ) in order to carry out the quantification and confirmation of organic compounds in complex matrix water samples without the use of a methodological IDMS calibration graph. In this alternative approach, the isotope composition of the spiked sample is measured after fragmentation by SRM and deconvoluted into its constituting components (molar fractions of natural abundance and labeled compound) by multiple linear regression (IPD). The procedure has been evaluated for the determination of the pharmaceutical diclofenac in effluent and influent urban wastewaters and fortified surface waters by UHPLC (ESI) MS/MS using diclofenac-d(4) as labeled compound. Calculations were performed acquiring a part and the whole fragment cluster ion, achieving in all cases recoveries within 90-110% and coefficients of variation below 5% for all water samples tested. In addition, potential false negatives arising from the presence of diclofenac-d(2) impurities in the labeled compound were avoided when the proposed approach was used instead of the most usual IDMS calibration procedure. The number of SRM transitions measured was minimized to three to make possible the application of this alternative technique in routine multi-residue analysis. PMID:23410629

Castillo, Angel; Gracia-Lor, Emma; Roig-Navarro, Antoni Francesc; Vicente Sancho, Juan; Rodríguez-González, Pablo; García Alonso, J Ignacio
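The IPD step, deconvoluting a measured isotope cluster into molar fractions of natural and labeled compound by multiple linear regression, can be sketched as follows. The isotope patterns used here are simplified illustrative numbers, not diclofenac's actual isotopologue abundances:

```python
import numpy as np

# Simplified isotope patterns over 5 cluster masses (columns: natural, labeled);
# these abundances are invented for illustration only.
patterns = np.array([[0.70, 0.00],
                     [0.20, 0.00],
                     [0.10, 0.05],
                     [0.00, 0.75],
                     [0.00, 0.20]])

x_true = np.array([0.4, 0.6])   # molar fractions: natural vs. labeled compound
measured = patterns @ x_true    # noiseless synthetic spiked-sample cluster

# IPD: multiple linear regression of the measured cluster onto the pure patterns
x_hat, *_ = np.linalg.lstsq(patterns, measured, rcond=None)
fractions = x_hat / x_hat.sum()
print(np.round(fractions, 3))  # → [0.4 0.6]
```

Because the sample amount follows directly from the recovered molar fractions and the known spike, no methodological calibration graph is needed, which is the practical advantage the abstract describes.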



Deconvolution from wave front sensing using the frozen flow hypothesis.  


Deconvolution from wave front sensing (DWFS) is an image-reconstruction technique for compensating the image degradation due to atmospheric turbulence. DWFS requires the simultaneous recording of high cadence short-exposure images and wave-front sensor (WFS) data. A deconvolution algorithm is then used to estimate both the target object and the wave front phases from the images, subject to constraints imposed by the WFS data and a model of the optical system. Here we show that by capturing the inherent temporal correlations present in the consecutive wave fronts, using the frozen flow hypothesis (FFH) during the modeling, high-quality object estimates may be recovered in much worse conditions than when the correlations are ignored. PMID:21369013

Jefferies, Stuart M; Hart, Michael



Deconvolution approaches applied to space-time radar data  

NASA Astrophysics Data System (ADS)

This paper considers the operation of multi-element radar arrays in the context of Airborne Early Warning applications. A 2D convolution model is proposed to represent the transformation of data determined by the existence of targets characterized by a given relative velocity and located at a certain angle, into a corresponding Azimuth-Doppler Spectrum. The feasibility of this interpretation is demonstrated by matching the spectrum generated through convolution with the one resulting from software simulation of the same target conditions. Two methods of discrete 2D deconvolution are explored to invert the process, obtaining an estimate of the target characteristics from simulated Azimuth-Doppler Spectra. The advantages and disadvantages of the methods are reported, and the possibility of using deconvolution to preferentially retrieve target components over clutter interference is presented.
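The 2D convolution model above can be inverted with a regularised Fourier division; the sketch below uses a hypothetical separable kernel and a single point target rather than the paper's simulated clutter-plus-target spectra:

```python
import numpy as np

# Hypothetical point-target scene in (azimuth, Doppler) coordinates and an
# illustrative separable system kernel; both stand in for the paper's
# simulated spectra.
scene = np.zeros((32, 32))
scene[10, 20] = 1.0                      # one target: given angle / velocity
kernel = np.outer(np.hanning(7), np.hanning(7))
kernel /= kernel.sum()

# Forward model: observed spectrum = 2D (circular) convolution of the scene.
K = np.fft.fft2(kernel, scene.shape)
observed = np.real(np.fft.ifft2(np.fft.fft2(scene) * K))

# Regularised 2D deconvolution; eps guards the division at frequencies
# where the kernel response is near zero (naive inversion blows up there).
eps = 1e-3
estimate = np.real(np.fft.ifft2(np.fft.fft2(observed) * np.conj(K)
                                / (np.abs(K) ** 2 + eps)))

peak = np.unravel_index(np.argmax(estimate), estimate.shape)
```

The recovered peak sits back at the target's (azimuth, Doppler) cell; the choice of eps trades sharpness of the estimate against noise amplification.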

Barreto, Armando B.; Granadillo, Luis R.; Diner, E. E.



CCAT mount control using de-convolution for fast scans  

NASA Astrophysics Data System (ADS)

CCAT will be a 25-meter telescope for submillimeter wavelength astronomy located at an altitude of 5600 meters on Cerro Chajnantor in northern Chile. This paper presents an overview of the preliminary mount control design. A finite element model of the structure has been developed and is used to determine the dynamics relevant for mount control. Controller strategies are presented that are designed to meet challenging wind rejection and fast scan requirements. Conventional inner loops are used for encoder-based control. Offset requirements are satisfied using innovative command shaping with feedforward and a two-command path structure. The fast scan requirement is satisfied using a new approach based on a de-convolution filter. The de-convolution filter uses an estimate of the closed loop response obtained from test signals. Wind jitter requirements remain a challenge and additional sensors such as accelerometers and wind pressure sensors may be needed.
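The de-convolution filter idea above (pre-shaping the scan command with an inverse of the identified closed-loop response) can be sketched in a few lines; the FIR response h below is a made-up stand-in for an estimate obtained from test signals:

```python
import numpy as np

# Made-up FIR stand-in for the closed-loop response identified from test
# signals (for CCAT this would come from measured encoder-loop responses).
h = np.array([0.25, 0.5, 0.25])

n = 256
t = np.arange(n)
desired = np.sin(2 * np.pi * 4 * t / n)           # fast scan trajectory

# De-convolution filter: shape the command so that (command * h) ~ desired.
H = np.fft.rfft(h, n)
eps = 1e-6                                        # regularise near-zero gains
command = np.fft.irfft(np.fft.rfft(desired) * np.conj(H)
                       / (np.abs(H) ** 2 + eps), n)

# Driving the modelled loop with the shaped command recovers the scan.
achieved = np.fft.irfft(np.fft.rfft(command) * H, n)
err = np.max(np.abs(achieved - desired))
```

The regularisation term eps keeps the inverse bounded where the closed-loop gain rolls off, which is the practical difference between this and a naive inverse filter.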

Thompson, Peter M.; Padin, Stephen



Funtools: Fits Users Need Tools for Quick, Quantitative Analysis  

NASA Technical Reports Server (NTRS)

The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.

Mandel, Eric; Brederkamp, Joe (Technical Monitor)



Deconvolution of light-scattering patterns by observing intensity fluctuations.  


Results are presented on the statistical fluctuations occurring in a forward-light-scattering experiment to determine the particle size distribution. A sample of glass beads was measured using a Malvern 2600D instrument and analyzed with a proposed deconvolution procedure that incorporates the observed intensity fluctuations. This procedure yields a qualitative improvement of the solution, provides error intervals, and offers a better means for model discrimination. PMID:20717284
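A minimal version of the fluctuation-weighted inversion: weight each detector ring by the inverse of its observed variance before solving for the size distribution. The kernel, size classes and noise level below are illustrative, and the nonnegativity constraint used in practice is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical forward model: rows = detector rings, columns = size classes.
# K[i, j] is the intensity ring i receives from unit concentration of size
# class j (a smooth illustrative kernel, not a real Mie scattering matrix).
n_rings, n_sizes = 30, 8
i = np.arange(n_rings)[:, None]
j = np.arange(n_sizes)[None, :]
K = np.exp(-0.5 * ((i - 4 * j - 2) / 2.0) ** 2)

true_psd = np.array([0.0, 0.2, 1.0, 0.5, 0.0, 0.3, 0.0, 0.0])
mean_signal = K @ true_psd

# Repeated sweeps give both the mean signal and its fluctuation statistics.
sweeps = mean_signal + 0.02 * rng.standard_normal((400, n_rings))
y = sweeps.mean(axis=0)
w = 1.0 / sweeps.var(axis=0)              # weight = inverse observed variance

# Fluctuation-weighted least squares (nonnegativity omitted for brevity).
Aw = K * np.sqrt(w)[:, None]
yw = y * np.sqrt(w)
psd, *_ = np.linalg.lstsq(Aw, yw, rcond=None)

# Error intervals on the solution from the weighted normal equations.
cov = np.linalg.inv(Aw.T @ Aw)
stderr = np.sqrt(np.diag(cov))
```

Incorporating the observed variances is what yields the error intervals mentioned in the abstract, rather than a bare point estimate.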

Boxman, A; Merkus, H G; Verheijen, P J; Scarlett, B



Determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution  

NASA Technical Reports Server (NTRS)

This final report covers work on the determination of design and operation parameters for upper atmospheric research instrumentation to yield optimum resolution with deconvolution. Papers and theses prepared during the reporting period are included. The principal result is a methodology for determining design and operation parameters that minimize error when deconvolution is included in the data analysis. An error surface is plotted versus the signal-to-noise ratio (SNR) and all parameters of interest. Instrumental characteristics determine a curve in this space. The SNR and parameter values at the point on the curve with the smallest projected error are the optimum values. Because these values are constrained to the curve, they will not necessarily correspond to an absolute minimum of the error surface.
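The error-surface methodology can be illustrated numerically: build an error surface over SNR and a design parameter, trace the instrument's constraint curve across it, and take the smallest error along that curve. Both the surface and the constraint below are invented for illustration:

```python
import numpy as np

# Illustrative error surface: deconvolution error as a function of SNR and a
# design parameter (here a notional slit width). The functional form is
# invented; in practice the surface comes from simulating the instrument
# and the deconvolution step.
snr = np.linspace(5, 50, 100)
width = np.linspace(0.1, 2.0, 80)
S, W = np.meshgrid(snr, width, indexing="ij")
error_surface = 1.0 / S + W ** 2 + 0.5 / (S * W)

# Hypothetical instrumental constraint: SNR grows with slit width (a wider
# slit admits more light), tracing a curve across the surface.
snr_of_width = 5 + 20 * width

# Project the curve onto the surface and take the smallest error along it.
curve_error = np.array([
    error_surface[np.argmin(np.abs(snr - s)), k]
    for k, s in enumerate(snr_of_width)
])
best_k = int(np.argmin(curve_error))
best_width = float(width[best_k])
```

The constrained optimum (best_width) lands at an interior point of the curve and, as the report notes, need not coincide with the global minimum of the unconstrained surface.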

Ioup, George E.; Ioup, Juliette W.



The application of a time-domain deconvolution technique for identification of experimental acoustic-emission signals  

NASA Technical Reports Server (NTRS)

A method is presented for the signature analysis of pulses by reconstructing in the time domain the shape of the pulse prior to its passing through the measurement system. This deconvolution technique is first evaluated using an idealized system and analytical pulse models and is shown to provide improved results. An experimental situation is then treated; system-component models are developed for the digitizer, tape recorder, filter, transducer, and mechanical structure. Accommodating both calibration results and manufacturer's data while providing stable mathematical models entails considerable effort: some 30 parameters must be identified to model this system, which is still a substantial approximation, albeit of very high order. Experimental pulses generated by a ball drop, spark discharge, and a tearing crack are then deconvoluted 'back through' the system as modeled, using this technique. The results are compared and indicate (a) that consistent shapes may be expected from a given type of source and (b) that some sources can be identified with greater clarity using the deconvolution approach.

Houghton, J. R.; Townsend, M. A.; Packman, P. F.



Principal component analysis: A tool for processing hyperspectral infrared data  

NASA Astrophysics Data System (ADS)

During the last decades, new instruments have been designed and built to improve observations of atmospheric temperature, water vapor, and winds. In the area of infrared remote sensing, new technologies will enable the next generation of instruments, like the Geostationary Imaging Fourier Transform Spectrometer (GIFTS), to collect high spectral and spatial resolution data with very high data rates. If not properly compressed, those data rates will exceed the capacity of the current operational downlink technology and will require expensive data systems to process the data on the ground. This dissertation focuses on establishing the compression and inversion procedures to reduce the volume of data with minimal information loss and with beneficial effects on the accuracy of the retrieved atmospheric variables. To take full advantage of the large number of channels available and the high correlation between them, Principal Component Analysis (PCA) has been chosen as the basis for the compression procedure. By separating the atmospheric signal from the random component of the instrument noise, a PCA-based Compression algorithm (PCC) leads to high values of the compression ratio with an overall improvement of the signal-to-noise ratio. The results obtained by applying PCA to both simulated and real data show that it represents a key component in processing high spectral resolution data. PCA can be used to reduce the volume of data to be inverted for the retrieval of the atmospheric variables and to improve the signal-to-noise ratio. Both the data volume and the noise reduction have been demonstrated to be beneficial for the retrieval process, increasing the accuracy and the efficiency of the clear- and cloudy-sky inversion algorithms. Moreover, PCA has proved useful in lowering the post-processing costs and improving the quality of the retrieved variables, the ultimate products of interest for the scientific community.
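The core of a PCA-based compression (PCC) step is a truncated SVD: keep the leading principal components as the compressed representation and discard the trailing, mostly-noise components. A small synthetic sketch (invented smooth components, not GIFTS data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for a hyperspectral data set: 500 spectra of 200
# channels built from three invented smooth components plus detector noise.
channels = np.linspace(0.0, 1.0, 200)
components = np.stack([np.sin(2 * np.pi * k * channels) for k in (1, 2, 3)])
clean = rng.standard_normal((500, 3)) @ components
noisy = clean + 0.1 * rng.standard_normal(clean.shape)

# PCA compression: keep only the leading principal components.
mean = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
k = 3                                   # retained components (200 -> 3)
scores = U[:, :k] * s[:k]               # compressed representation
reconstructed = scores @ Vt[:k] + mean

# The discarded trailing components carry mostly random noise, so the
# reconstruction is closer to the clean signal than the raw data are.
err_raw = np.sqrt(np.mean((noisy - clean) ** 2))
err_pcc = np.sqrt(np.mean((reconstructed - clean) ** 2))
```

Here the compression ratio per spectrum is roughly 200/3, and the reconstruction error relative to the clean signal drops below the raw measurement error, mirroring the signal-to-noise improvement the dissertation describes.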

Antonelli, Paolo



Estimating backscatter spectra after deconvolution with Kalman smoothing  

NASA Astrophysics Data System (ADS)

In quantitative tissue characterization, obtaining processed ultrasonic echoes that relate directly to the local tissue response (backscatter spectrum) and are free from systematic depth-dependent effects, such as diffraction, is essential. In general practice today, these unwanted distortions are eliminated by dividing short-time power spectra. However, this method has its drawbacks: noise is not taken into account, and shorter time gates lead to an increasing bias within the relative spectra. To overcome these methodological issues, I propose a different approach as follows. Entire deconvolved A-scans are estimated by a Kalman smoothing deconvolution algorithm. These then serve as a basis for estimating the relative backscatter spectra. In addition, due to the principle of the deconvolution algorithm, it is possible to suppress additive noise to some degree. To examine the properties of the proposed method, this paper presents an analytical expression for the power spectrum of the deconvolved signals obtained by Kalman smoothing. This result is then compared to the expectations of relative short-time power spectra. Simulations demonstrate the behavior of the deconvolution method in a non-stationary environment.
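The estimator at the heart of such an approach is a Kalman filter followed by a backward (Rauch-Tung-Striebel) smoothing pass. The scalar random-walk sketch below shows only that filter/smoother core, without the ultrasound pulse model of the full deconvolution:

```python
import numpy as np

rng = np.random.default_rng(6)

# Minimal random-walk Kalman filter plus RTS smoother; q is the process
# (state) noise variance, r the measurement noise variance.
def kalman_rts(y, q, r):
    n = y.size
    xf = np.zeros(n); pf = np.zeros(n)      # filtered mean / variance
    xp = np.zeros(n); pp = np.zeros(n)      # predicted mean / variance
    x, p = 0.0, 1.0
    for t in range(n):
        xp[t], pp[t] = x, p + q             # predict (random-walk model)
        k = pp[t] / (pp[t] + r)             # Kalman gain
        x = xp[t] + k * (y[t] - xp[t])      # measurement update
        p = (1 - k) * pp[t]
        xf[t], pf[t] = x, p
    xs = xf.copy()
    for t in range(n - 2, -1, -1):          # backward (RTS) smoothing pass
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
    return xs

truth = np.cumsum(0.05 * rng.standard_normal(500))
noisy = truth + 0.5 * rng.standard_normal(500)
smoothed = kalman_rts(noisy, q=0.05 ** 2, r=0.5 ** 2)
```

The smoother uses future as well as past samples at every instant, which is what lets the full algorithm suppress additive noise along the whole deconvolved A-scan.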

Guenter, Armin I.



Adaptive wavelet-based deconvolution method for remote sensing imaging.  


Fourier-based deconvolution (FoD) techniques, such as modulation transfer function compensation, are commonly employed in remote sensing. However, the noise is strongly amplified by FoD and is colored, thus producing poor visual quality. We propose an adaptive wavelet-based deconvolution algorithm for remote sensing called wavelet denoise after Laplacian-regularized deconvolution (WDALRD) to overcome the colored noise and to preserve the textures of the restored image. This algorithm adaptively denoises the FoD result on a wavelet basis. The term "adaptive" means that the wavelet-based denoising procedure requires no parameter to be estimated or empirically set; to this end, the inhomogeneous Laplacian prior and the Jeffreys hyperprior are proposed. Maximum a posteriori estimation based on such a prior and hyperprior leads us to an adaptive and efficient nonlinear thresholding estimator, and therefore WDALRD is computationally inexpensive and fast. Experimentally, textures and edges of the restored image are well preserved and sharp, while the homogeneous regions remain noise free, so WDALRD gives satisfactory visual quality. PMID:19696869
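The wavelet-denoise step can be sketched with a single-level Haar transform and soft thresholding; in WDALRD the threshold follows adaptively from the prior and hyperprior, whereas here it is fixed by hand on an invented 1-D signal:

```python
import numpy as np

rng = np.random.default_rng(3)

# One level of an orthonormal Haar transform as a stand-in for the wavelet
# basis used in the paper.
def haar(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2)     # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)     # detail coefficients
    return a, d

def ihaar(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

# Stand-in for one row of a FoD result: piecewise-constant scene + noise.
signal = np.repeat([0.0, 4.0, 1.0, 3.0], 64)
noisy = signal + 0.5 * rng.standard_normal(signal.size)

# Soft-threshold the detail coefficients. In WDALRD the threshold is derived
# adaptively from the prior/hyperprior; here it is simply set by hand.
a, d = haar(noisy)
t = 1.0
d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)
denoised = ihaar(a, d)
```

Soft thresholding shrinks small (noise-dominated) detail coefficients to zero while leaving the piecewise-constant structure in the approximation band, which is why edges survive the denoising.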

Zhang, Wei; Zhao, Ming; Wang, Zhile



Semantic integration of gene expression analysis tools and data sources using software connectors  

PubMed Central

Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. 
Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380



An open-source deconvolution software package for 3-D quantitative fluorescence microscopy imaging  

PubMed Central

Summary Deconvolution techniques have been widely used for restoring the 3-D quantitative information of an unknown specimen observed using a wide-field fluorescence microscope. Deconv, an open-source deconvolution software package, was developed for 3-D quantitative fluorescence microscopy imaging and was released under the GNU General Public License. Deconv provides numerical routines for simulation of a 3-D point spread function and deconvolution routines implementing three constrained iterative deconvolution algorithms: one based on a Poisson noise model and two others based on a Gaussian noise model. These algorithms are presented and evaluated using synthetic images and experimentally obtained microscope images, and the use of the library is explained. Deconv allows users to assess the utility of these deconvolution algorithms and to determine which are suited for a particular imaging application. The design of Deconv makes it easy for deconvolution capabilities to be incorporated into existing imaging applications. PMID:19941558
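One of the three algorithm families mentioned, constrained iteration under a Poisson noise model, corresponds to the classic Richardson-Lucy update. A 1-D numpy sketch with a hypothetical Gaussian PSF (Deconv itself operates on 3-D stacks):

```python
import numpy as np

# 1-D sketch of constrained iterative deconvolution under a Poisson noise
# model (the Richardson-Lucy update); Deconv's actual routines are 3-D.
def richardson_lucy(image, psf, iterations=50):
    psf_mirror = psf[::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)   # guard empty bins
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate                     # nonnegative by construction

# Two point sources blurred by a hypothetical Gaussian PSF.
x = np.zeros(64)
x[20], x[40] = 5.0, 3.0
psf = np.exp(-0.5 * (np.arange(-6, 7) / 2.0) ** 2)
psf /= psf.sum()
blurred = np.convolve(x, psf, mode="same")

restored = richardson_lucy(blurred, psf)
```

The multiplicative update enforces the nonnegativity constraint automatically, which is the "constrained" aspect of the iteration.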




Maximum correlated Kurtosis deconvolution and application on gear tooth chip fault detection  

NASA Astrophysics Data System (ADS)

In this paper a new deconvolution method is presented for the detection of gear and bearing faults from vibration data. The proposed maximum correlated kurtosis deconvolution method takes advantage of the periodic nature of the faults as well as the impulse-like vibration behaviour associated with most types of faults. The results are compared to the standard minimum entropy deconvolution method on both simulated and experimental data. The experimental data are from a gearbox with a gear chip fault, and the results are compared between healthy and faulty vibrations. The results indicate that the proposed maximum correlated kurtosis deconvolution method performs considerably better than the traditional minimum entropy deconvolution method, often several times better at fault detection. In addition to this improved performance, deconvolution of separate fault periods is possible, allowing for concurrent fault detection. Finally, an online implementation is proposed and shown to perform well and be computationally achievable on a personal computer.
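The figure of merit behind the method can be sketched directly: the first-shift correlated kurtosis rewards signals that are both impulsive and periodic at the candidate fault period. The vibration signal below is synthetic, with impulses every 50 samples:

```python
import numpy as np

rng = np.random.default_rng(4)

# First-shift correlated kurtosis for a candidate fault period T (samples):
# large when the signal is impulsive AND the impulses repeat with period T.
def correlated_kurtosis(y, T):
    y = np.asarray(y, dtype=float)
    return np.sum((y[T:] * y[:-T]) ** 2) / np.sum(y ** 2) ** 2

# Synthetic vibration: impulses every 50 samples buried in noise.
n, T_true = 1000, 50
vib = 0.3 * rng.standard_normal(n)
vib[::T_true] += 5.0

# Scan candidate periods; the correlated kurtosis peaks at the fault period.
periods = np.arange(20, 100)
ck = np.array([correlated_kurtosis(vib, int(p)) for p in periods])
detected = int(periods[np.argmax(ck)])
```

In the full method this criterion is maximised over the coefficients of a deconvolution filter rather than merely evaluated on the raw signal, and scanning separate candidate periods is what enables concurrent detection of multiple faults.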

McDonald, Geoff L.; Zhao, Qing; Zuo, Ming J.



An automated technique for detailed µ-FTIR mapping of diamond and spectral deconvolution  

NASA Astrophysics Data System (ADS)

Since the original classification of diamonds based upon their absorption in the one-phonon region of the mid-infrared (IR) range was first introduced, a vast amount of research has been carried out in this field. The result today is that IR analysis has become the principal tool for classifying diamonds based upon the concentration and aggregation state of nitrogen, the most common impurity found within their crystal lattice. These studies have shown that diamonds can contain a large range of nitrogen, from nominally nitrogen-free, i.e. below detection limits (termed Type II), to nitrogen rich (termed Type I) with up to 5000 ppm. It has also been shown that the nitrogen concentration, aggregation and distribution in an individual stone can be either homogeneous or heterogeneous. Nitrogen has been shown to reside within diamond in three different aggregation states. The first is in the form of single substitutional nitrogen atoms, known as C centres. Diamonds that contain nitrogen only in this form are termed Type Ib. The second aggregation state is pairs of nitrogen atoms forming A centres (termed Type IaA diamonds) and the final state is four nitrogen atoms tetrahedrally arranged around a vacancy, forming a B centre (termed Type IaB). The sequence of aggregation has been shown to progress from C centres to A centres to B centres and is a function of time and temperature. As such it is a commonly used tool in the geological study of diamonds to gauge their mantle residence time / temperature history. The first step in the sequence is thought to occur relatively quickly in geological terms; the vast age of most diamonds therefore makes Type Ib samples rare in cratonic diamond deposits. The second step takes considerably more time, meaning that the A to B centre conversion may not always continue through to completion. Diamonds containing a mixture of both A and B centres are therefore commonly termed Type IaAB. 
IR analysis of diamond is also capable of identifying other commonly found defects and impurities, whether intrinsic defects like platelets, extrinsic defects like hydrogen or boron atoms, or inclusions of minerals or fluids. Recent technological developments in the field of spectroscopy allow detailed µ-FTIR analysis to be performed rapidly in an automated fashion. The Nicolet iN10 microscope has an integrated design that maximises signal throughput and allows spectra to be collected with greater efficiency than is possible with conventional µ-FTIR spectrometer-microscope systems. Combining this with a computer-controlled x-y stage allows the automated measuring of several thousand spectra in only a few hours. This affords us the ability to record 2D IR maps of diamond plates with minimal effort, but has created the need for an automated technique to process the large quantities of IR spectra and obtain quantitative data from them. We will present new software routines that can process large batches of IR spectra, including baselining, conversion to absorption coefficient, and deconvolution to identify and quantify the various nitrogen components. Possible sources of error in each step of the process will be highlighted so that the data produced can be critically assessed. The end result will be the production of various false-colour 2D maps that show the distribution of nitrogen concentrations and aggregation states, as well as other identifiable components.
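The deconvolution step of such batch processing can be sketched as a linear fit of reference component shapes plus a baseline to each spectrum; the A- and B-centre band shapes and positions below are rough illustrative Gaussians, not calibrated references:

```python
import numpy as np

# Invented one-phonon-region component shapes (cm^-1); real processing
# would use tabulated A-, B-centre (and platelet) reference spectra.
wavenumber = np.linspace(1000, 1400, 400)
A_centre = np.exp(-0.5 * ((wavenumber - 1282) / 15.0) ** 2)
B_centre = np.exp(-0.5 * ((wavenumber - 1175) / 20.0) ** 2)

# A measured absorption-coefficient spectrum with a sloping baseline.
spectrum = 3.0 * A_centre + 1.5 * B_centre + 0.001 * wavenumber

# Deconvolution step: fit the components and a linear baseline together.
design = np.column_stack([A_centre, B_centre, wavenumber,
                          np.ones_like(wavenumber)])
coeffs, *_ = np.linalg.lstsq(design, spectrum, rcond=None)
a_amp, b_amp = coeffs[0], coeffs[1]

# Aggregation state by component strength; real quantification applies
# calibration factors converting band amplitude to nitrogen ppm.
pct_b = 100.0 * b_amp / (a_amp + b_amp)
```

Run per pixel over the x-y grid, the fitted amplitudes are exactly what populates the false-colour 2D maps of concentration and aggregation state.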

Howell, Dan; Griffin, Bill; O'Neill, Craig; O'Reilly, Suzanne; Pearson, Norman; Handley, Heather



Video Analysis and Modeling Tool for Physics Education: A Workshop for Redesigning Pedagogy  

NSDL National Science Digital Library

This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables and empowers teachers to be learners so that we can be leaders in our teaching practice. Through this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experiences with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.

Wee, Loo K.; Leong, Lee T.



Video Analysis and Modeling Tool for Physics Education: A workshop for Redesigning Pedagogy  

E-print Network

This workshop aims to demonstrate how the Tracker Video Analysis and Modeling Tool engages, enables and empowers teachers to be learners so that we can be leaders in our teaching practice. Through this workshop, the kinematics of a falling ball and of projectile motion are explored using video analysis and, later, video modeling. We hope to lead and inspire other teachers by facilitating their experiences with this ICT-enabled video modeling pedagogy (Brown, 2008) and free tool for facilitating student-centered active learning, thus motivating students to be more self-directed.

Wee, Loo Kang



Ultrametric networks: a new tool for phylogenetic analysis  

PubMed Central

Background The large majority of optimization problems related to the inference of distance-based trees used in phylogenetic analysis and classification are known to be intractable. One noted exception is found within the realm of ultrametric distances. The introduction of ultrametric trees in phylogeny was inspired by a model of evolution driven by the postulate of a molecular clock, now dismissed, whereby phylogeny could be represented by a weighted tree in which the sum of the weights of the edges separating any given leaf from the root is the same for all leaves. Both molecular clocks and rooted ultrametric trees fell out of fashion as credible representations of evolutionary change. At the same time, ultrametric dendrograms have shown good potential for purposes of classification in so far as they have proven to provide good approximations for additive trees. Most of these approximations are still intractable, but the problem of finding the nearest ultrametric distance matrix to a given distance matrix with respect to the L∞ distance has long been known to be solvable in polynomial time, the solution being incarnated in any minimum spanning tree for the weighted graph subtending the matrix. Results This paper expands this subdominant ultrametric perspective by studying ultrametric networks, consisting of the collection of all edges involved in some minimum spanning tree. It is shown that, for a graph with n vertices, the construction of such a network can be carried out by a simple algorithm in optimal time O(n²), which is faster by a factor of n than the direct adaptation of the classical O(n³) paradigm by Warshall for computing the transitive closure of a graph. This algorithm, called UltraNet, will be shown to be easily adapted to compute relaxed networks and to support the introduction of artificial points to reduce the maximum distance between vertices in a pair. 
Finally, a few experiments will be discussed to demonstrate the applicability of subdominant ultrametric networks. Availability PMID:23497437
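The subdominant ultrametric the paper starts from replaces every pairwise distance by the minimax path cost. The direct O(n³) Warshall-style adaptation mentioned above (the baseline that UltraNet's O(n²) algorithm improves on) takes only a few lines:

```python
import numpy as np

# Subdominant ultrametric: replace each pairwise distance by the minimax
# path cost. This is the direct O(n^3) Warshall-style adaptation that the
# paper's O(n^2) MST-based UltraNet improves on.
def subdominant_ultrametric(d):
    u = np.array(d, dtype=float)
    n = u.shape[0]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                u[i, j] = min(u[i, j], max(u[i, k], u[k, j]))
    return u

# Toy distance matrix on four taxa.
d = np.array([[0, 2, 9, 9],
              [2, 0, 4, 9],
              [9, 4, 0, 3],
              [9, 9, 3, 0]], dtype=float)

u = subdominant_ultrametric(d)
# Every triple now satisfies the strong (ultrametric) triangle inequality
# u[i, j] <= max(u[i, k], u[k, j]).
```

The minimax cost of any path equals the largest edge on the path between the two vertices in a minimum spanning tree, which is the connection to MSTs the abstract refers to.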



New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.  


Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn-surface invasiveness, complicating the ability to investigate duration-related textural characterisation. PMID:25041833

Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K



Web-based tools for modelling and analysis of multivariate data: California ozone pollution activity  

PubMed Central

This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting and statistical inference on these data are presented. All components of this case study (data, tools, activity) are freely available online at: Several types of exploratory (motion charts, box-and-whisker plots, spider charts) and quantitative (inference, regression, analysis of variance (ANOVA)) data analyses tools are demonstrated. Two specific human health related questions (temporal and geographic effects of ozone pollution) are discussed as motivational challenges. PMID:24465054

Dinov, Ivo D.; Christou, Nicolas



The development of a visualization tool for displaying analysis and test results  

SciTech Connect

The evaluation and certification of packages for transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages have been certified using a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data.

Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S. [Sandia National Labs., Albuquerque, NM (United States). Transportation System Development Dept.; Knight, R.D. [Geo-Centers, Inc., Albuquerque, NM (United States); Wix, S.D. [GRAM, Inc., Albuquerque, NM (United States)



Reconstructing the Genomic Content of Microbiome Taxa through Shotgun Metagenomic Deconvolution  

PubMed Central

Metagenomics has transformed our understanding of the microbial world, allowing researchers to bypass the need to isolate and culture individual taxa and to directly characterize both the taxonomic and gene compositions of environmental samples. However, associating the genes found in a metagenomic sample with the specific taxa of origin remains a critical challenge. Existing binning methods, based on nucleotide composition or alignment to reference genomes, allow only a coarse-grained classification and rely heavily on the availability of sequenced genomes from closely related taxa. Here, we introduce a novel computational framework, integrating variation in gene abundances across multiple samples with taxonomic abundance data to deconvolve metagenomic samples into taxa-specific gene profiles and to reconstruct the genomic content of community members. This assembly-free method is not subject to the various factors limiting previously described methods of metagenomic binning or metagenomic assembly and represents a fundamentally different approach to metagenomic-based genome reconstruction. An implementation of this framework is available at We first describe the mathematical foundations of our framework and discuss considerations for implementing its various components. We demonstrate the ability of this framework to accurately deconvolve a set of metagenomic samples and to recover the gene content of individual taxa using synthetic metagenomic samples. We specifically characterize determinants of prediction accuracy and examine the impact of annotation errors on the reconstructed genomes. We finally apply metagenomic deconvolution to samples from the Human Microbiome Project, successfully reconstructing genus-level genomic content of various microbial genera, based solely on variation in gene count. These reconstructed genera are shown to correctly capture genus-specific properties. 
With the accumulation of metagenomic data, this deconvolution framework provides an essential tool for characterizing microbial taxa never before seen, laying the foundation for addressing fundamental questions concerning the taxa comprising diverse microbial communities. PMID:24146609
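The deconvolution idea above (combining per-sample taxonomic abundances with gene-abundance variation across samples) reduces, in its simplest noiseless form, to per-gene linear regression. A toy sketch with invented abundances and copy numbers:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy setting: 12 samples, 3 taxa, 6 genes. Each taxon carries each gene in
# a fixed copy number; observed gene abundance is weighted by taxon abundance.
taxa_abund = rng.uniform(0.1, 1.0, size=(12, 3))        # samples x taxa
gene_copies = np.array([[1, 0, 2],
                        [0, 1, 0],
                        [3, 0, 0],
                        [1, 1, 1],
                        [0, 2, 0],
                        [0, 0, 1]], dtype=float)        # genes x taxa

gene_abund = taxa_abund @ gene_copies.T                 # samples x genes

# Deconvolution: regress each gene's abundance across samples on the
# taxonomic abundances to recover its per-taxon copy numbers.
recovered = np.linalg.lstsq(taxa_abund, gene_abund, rcond=None)[0].T
```

With noise-free counts and more samples than taxa the copy numbers are recovered exactly; the published framework additionally handles noise, normalization and annotation error, which this sketch omits.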

Carr, Rogan; Shen-Orr, Shai S.; Borenstein, Elhanan



Spectral analysis of seismic noise induced by rivers: A new tool to monitor spatiotemporal changes in  

E-print Network

Spectral analysis of seismic noise induced by rivers: a new tool to monitor spatiotemporal changes... the deeply incised channel of the Trisuli River, a major trans-Himalayan river. The early summer increase... Seasonal increase in ambient noise coincides with the strong monsoon rainfall and a period...

Demouchy, Sylvie



Microsoft Academic Search

A new modeling and simulation tool, the Virtual Test Bed (VTB), is introduced. The VTB is a software environment that has been developed for design, analysis, and virtual prototyping of large-scale multi-technical systems. It has the capability of integrating models, which have been created in a variety of languages such as SPICE, ACSL, and SABER, into one simulation

Levent U. Gökdere; Charles W. Brice; Roger A. Dougal


SPACE: a suite of tools for protein structure prediction and analysis based on complementarity and environment  

Microsoft Academic Search

We describe a suite of SPACE tools for analysis and prediction of structures of biomolecules and their complexes. LPC/CSU software provides a common definition of inter-atomic contacts and complementarity of contacting surfaces to analyze protein structure and complexes. In the current version of LPC/CSU, analyses of water molecules and nucleic acids have been added, together with improved

Vladimir Sobolev; Eran Eyal; Sergey Gerzon; Vladimir Potapov; Mariana Babor; Jaime Prilusky; Marvin Edelman



Historical Analysis, a Valuable Tool in Community-Based Environmental Protection  

Microsoft Academic Search

A historical analysis of the ecological consequences of development can be a valuable tool in community-based environmental protection. These studies can engage the public in environmental issues and lead to informed decision making. Historical studies provide an understanding of how current ecological conditions arose, provide information to identify past pollutant inputs, identify modification or loss of habitat, help identify changes

Carol E Pesch; Jonathan Garber






A Collaborative Analysis Tool for Visualisation and Interaction with Spatial Data  

E-print Network

A collaborative virtual environment system is described that is designed to support location-based analysis of spatial data. Visual assessment, interactive exploration, collaboration and information sharing are integrated into the system.

Taylor, Hamish


Analysis of Java Client/Server and Web Programming Tools for Development of Educational Systems.  

ERIC Educational Resources Information Center

This paper provides an analysis of old and new programming tools for development of client/server programs, particularly World Wide Web-based programs. The focus is on development of educational systems that use interactive shared workspaces to provide portable and expandable solutions. The paper begins with a short description of relevant terms.

Muldner, Tomasz


The Synchrosqueezing algorithm: a robust analysis tool for signals with time-varying spectrum  

Microsoft Academic Search

We analyze the Synchrosqueezing transform, a consistent and invertible time-frequency analysis tool that can identify and extract oscillating components (of time-varying frequency and amplitude) from regularly sampled time series. We first describe a fast algorithm implementing the transform. Second, we show Synchrosqueezing is robust to bounded perturbations of the signal. This stability property extends the applicability of Synchrosqueezing to the

Eugene Brevdo; Neven S. Fuckar; Gaurav Thakur; Hau-Tieng Wu
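The abstract above only sketches what Synchrosqueezing does. As an illustration of the core idea, reassigning STFT energy to an instantaneous-frequency estimate, here is a minimal toy implementation of my own. It uses a phase-vocoder frequency estimate rather than the authors' algorithm, and the window and hop parameters are arbitrary.

```python
import numpy as np

def stft(x, fs, n_win=256, hop=32):
    """Hann-windowed short-time Fourier transform of a real signal."""
    win = np.hanning(n_win)
    n_frames = (len(x) - n_win) // hop + 1
    frames = np.stack([x[i*hop:i*hop + n_win] * win for i in range(n_frames)])
    S = np.fft.rfft(frames, axis=1).T            # shape (n_freq, n_frames)
    freqs = np.fft.rfftfreq(n_win, 1.0 / fs)
    return S, freqs

def synchrosqueeze(x, fs, n_win=256, hop=32):
    """Toy synchrosqueezing: move each STFT coefficient's magnitude to the
    frequency bin nearest its instantaneous-frequency estimate."""
    S, freqs = stft(x, fs, n_win, hop)
    dt = hop / fs
    # phase advance between consecutive frames, per bin
    phase_adv = np.angle(S[:, 1:] * np.conj(S[:, :-1]))
    # deviation from the advance a pure tone at the bin centre would produce
    dev = phase_adv - 2 * np.pi * freqs[:, None] * dt
    dev = (dev + np.pi) % (2 * np.pi) - np.pi    # wrap to (-pi, pi]
    inst_f = freqs[:, None] + dev / (2 * np.pi * dt)
    mag = np.abs(S[:, :-1])
    df = freqs[1] - freqs[0]
    target = np.rint(inst_f / df).astype(int)    # destination bin per coefficient
    Tx = np.zeros_like(mag)
    rows, cols = np.nonzero((target >= 0) & (target < len(freqs)))
    np.add.at(Tx, (target[rows, cols], cols), mag[rows, cols])
    return Tx, freqs

# usage: energy of a 50 Hz tone concentrates at the bin nearest 50 Hz
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
Tx, freqs = synchrosqueeze(np.sin(2 * np.pi * 50 * t), fs)
peak_hz = freqs[np.argmax(Tx.sum(axis=1))]
```

Because spectral leakage around the tone shares the same instantaneous-frequency estimate, the smeared STFT ridge collapses into a sharp line, which is the sharpening effect the abstract alludes to.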



Machine Tool Accuracy Analysis M.A.Sc. Candidate: Ricky Chan  

E-print Network

CNC machining is an essential part of almost all manufacturing industries. Machine accuracy is critical where free-formed surfaces are made for esthetic and aerodynamic purposes; machining free-formed surfaces requires intricate

Bone, Gary


Evaluating Static Analysis Tools for Detecting Buffer Overflows in C Code Kendra June Kratkiewicz  

E-print Network

Evaluating Static Analysis Tools for Detecting Buffer Overflows in C Code Kendra June Kratkiewicz A Thesis in the Field of Information Technology for the Degree of Master of Liberal Arts in Extension Studies Harvard University March 2005 (corrected May 2005) This work was sponsored by the United States



Bullwhip Effect and Supply Chain Modelling and Analysis Using CPN Tools

E-print Network

A Supply Chain (SC) includes all the participants and processes involved in the satisfaction of customer demand. Distorted demand information causes the so-called Bullwhip Effect, in which fluctuations in orders increase as they move up the chain.

van der Aalst, Wil
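The Bullwhip Effect described in this record is easy to reproduce numerically. The sketch below is a hypothetical toy model of my own (not the paper's CPN model): each stage forecasts demand with a p-period moving average and follows a standard order-up-to policy with lead time L, a setup known to amplify order variance stage by stage.

```python
import numpy as np

def simulate_bullwhip(n_stages=4, T=5000, lead=2, p=4, seed=0):
    """Order-variance amplification along a serial supply chain.

    Each stage sees the downstream stage's orders as its own demand and
    places orders O_t = D_t + (lead+1) * (m_t - m_{t-1}), where m_t is a
    p-period moving-average forecast (an order-up-to policy). Orders are
    clipped at zero since negative orders are not allowed."""
    rng = np.random.default_rng(seed)
    series = [rng.normal(10.0, 1.0, T)]        # end-customer demand
    for _ in range(n_stages):
        d = series[-1]
        orders = d.copy()                      # warm-up: pass demand through
        for t in range(p + 1, T):
            m_now = d[t - p:t].mean()          # current forecast
            m_prev = d[t - p - 1:t - 1].mean() # previous forecast
            orders[t] = max(0.0, d[t] + (lead + 1) * (m_now - m_prev))
        series.append(orders)
    # variance of each stage's order stream, warm-up excluded
    return [s[p + 1:].var() for s in series]

variances = simulate_bullwhip()
```

Printing `variances` shows the variance growing monotonically from the end customer up the chain, which is exactly the amplification the abstract describes.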


Test data will be used to validate advanced turbine design and analysis tools.  

E-print Network

NREL signed a Cooperative Research and Development Agreement with Alstom in 2010 to conduct certification testing in 2011. Tests to be conducted by NREL include a power quality test to finalize


Natural Language Processing (NLP) tools for the analysis of incident and accident reports  

E-print Network

In this project, we use NLP methods to facilitate experience feedback in the field of civil aviation safety. In this paper, we present how NLP methods based on the extraction of textual information from the Air France ASR

Paris-Sud XI, Université de


A very efficient tool for the structural analysis of hypersonic vehicles under high temperature aspects  

Microsoft Academic Search

This paper presents a finite element tool which has been adapted to the requirements of integrated thermal and mechanical analysis of structures. Nonlinear and transient heat transfer as well as nonlinearities of the mechanical system are considered, and the associated basic equations are given. To increase the efficiency of this finite element approach, adaptive grids and sophisticated linear and nonlinear solution

M. Haupt; H. Kossira; M. Kracht; J. Pleitner



Simulation Analysis of Dispatching Rules for Automated Material Handling Systems and Processing Tools in Semiconductor Fabs  

E-print Network

A critical aspect of semiconductor manufacturing is the design and analysis of material handling and production systems, where dispatching rules for automated material handling systems and processing tools can have a significant impact. (Julie Christopher, Michael E. Kuhl, Karl Hirschman; Rochester Institute of Technology)

Kuhl, Michael E.


Web-Based Tools for Modelling and Analysis of Multivariate Data: California Ozone Pollution Activity  

ERIC Educational Resources Information Center

This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting

Dinov, Ivo D.; Christou, Nicolas



Security API analysis with the spi-calculus and the ProVerif tool  

E-print Network

Technical report, copyright © 2008. The report covers notations, HSMs and APIs: tamper evidence and tamper resistance, the role of HSMs and cryptoprocessors, and API

Bencsáth, Boldizsár


Iraqi Insurgents' Use of YouTube as a Strategic Communication Tool: An Exploratory Content Analysis  

Microsoft Academic Search

This dissertation study is a baseline investigation into Iraqi insurgents' use of YouTube as a strategic communication tool. The study utilized a content analysis of videos posted from October 28, 2008 to December 1, 2008 for the search term "Iraqi resistance" on YouTube that met stated criteria. Overall framing devices and themes found in the collection of videos were examined. While

Rheanna R. Rutledge



SCOR template: A simulation-based dynamic supply chain analysis tool  

Microsoft Academic Search

The supply chain operations reference (SCOR) model is developed and maintained by the Supply Chain Council (SCC). The SCOR model is a reference model that can be used to map, benchmark, and improve supply chain operations. The SCOR template is a simulation-based analysis tool, developed to capture the dynamics of supply chain operations. The first version of the SCOR

Fredrik Persson



Disaster SitRep -A Vertical Search Engine and Information Analysis Tool in Disaster Management Domain  

E-print Network

In disaster management, getting the right information to the right people at the right time is critical. The domain has strong needs for heterogeneous information integration, including multimedia data like images and videos. However, information management and processing in disaster management

Chen, Shu-Ching


A survey of tools for variant analysis of next-generation genome sequencing data.  


Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and has led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R; Zschocke, Johannes; Trajanoski, Zlatko
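To make the "variant identification" step of the workflow above concrete, here is a deliberately naive pileup-and-majority-vote caller. It is an illustration only: the production callers surveyed in papers like this use probabilistic models, local realignment and base-quality scores. All function names, thresholds and the gapless-alignment assumption here are made up for the sketch.

```python
from collections import Counter

def call_variants(reference, aligned_reads, min_depth=3, min_frac=0.6):
    """Toy germline variant caller over a pileup.

    aligned_reads: list of (start_pos, read_seq) pairs with exact, gapless
    alignment to `reference`. A variant is reported when a position has at
    least `min_depth` coverage and a non-reference base wins the majority
    vote with frequency >= `min_frac`."""
    pileup = [Counter() for _ in reference]
    for start, seq in aligned_reads:
        for i, base in enumerate(seq):
            pos = start + i
            if 0 <= pos < len(reference):
                pileup[pos][base] += 1
    variants = []
    for pos, counts in enumerate(pileup):
        depth = sum(counts.values())
        if depth < min_depth:
            continue                      # too shallow to call
        base, n = counts.most_common(1)[0]
        if base != reference[pos] and n / depth >= min_frac:
            variants.append((pos, reference[pos], base, n / depth))
    return variants

# usage: four reads all support a G->T substitution at position 2
ref = "ACGTACGT"
reads = [(0, "ACTT"), (1, "CTTA"), (2, "TTAC"), (0, "ACTTAC")]
variants = call_variants(ref, reads)
```

Positions covered by fewer than `min_depth` reads are skipped, which mirrors (very loosely) the depth filters real pipelines apply.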



Using a Strategic Planning Tool as a Framework for Case Analysis  

ERIC Educational Resources Information Center

In this article, the authors describe how they use a strategic planning tool known as SWOT as a framework for case analysis, using it to analyze the strengths, weaknesses, opportunities, and threats of a public works project intended to enhance regional economic development in Tempe, Arizona. Students consider the project in light of a variety of

Lai, Christine A.; Rivera, Julio C., Jr.



Urban goods movement (UGM) analysis as a tool for urban planning  

E-print Network

Mathieu Gardrat, Jesus Gonzalez (track F: Transport, Land Use and Sustainability). In recent years, urban planning has become a major stake for the sustainable development of cities. Each decision taken for urban

Boyer, Edmond



Fractal Analysis Tools for Characterizing the Colorimetric Organization of Digital Images

E-print Network

Keywords: color image, color histogram, fractal, self-similarity, capacity dimension, correlation dimension. The paper presents algorithms which can characterize fractal organizations in the support and population of their three

Chapeau-Blondeau, François
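As a sketch of the "capacity dimension" named in the keywords above, here is a generic box-counting estimator for point sets in the unit cube (e.g., an image's normalized RGB triplets). This is the textbook box-counting method, not the authors' specific algorithms; the dyadic scales are an arbitrary choice.

```python
import numpy as np

def capacity_dimension(points, n_scales=6):
    """Box-counting (capacity) dimension estimate for points in [0, 1)^d.

    Counts occupied boxes N(s) at dyadic box edge lengths s and fits the
    slope of log N(s) against log(1/s)."""
    pts = np.asarray(points, dtype=float)
    sizes = 2.0 ** -np.arange(1, n_scales + 1)   # edge lengths 1/2 .. 1/64
    counts = [len(np.unique(np.floor(pts / s), axis=0)) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# usage: a line in the plane has dimension ~1, a filled square ~2
rng = np.random.default_rng(1)
t = rng.random(100_000)
d_line = capacity_dimension(np.column_stack([t, t]))
d_square = capacity_dimension(rng.random((100_000, 2)))
```

Applied to the 3D color histogram of an image, the same routine would quantify how the occupied colors fill RGB space, which is the kind of colorimetric self-similarity the abstract refers to.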


DRIVE Analysis Tool Generates Custom Vehicle Drive Cycles Based on Real-World Data (Fact Sheet)  

SciTech Connect

This fact sheet from the National Renewable Energy Laboratory describes the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) analysis tool. DRIVE uses GPS and controller area network data to characterize vehicle operation and produce custom vehicle drive cycles, analyzing thousands of hours of data in a matter of minutes.

Not Available
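The fact sheet does not describe DRIVE's internals. As a hypothetical illustration of how a custom drive cycle can be distilled from logged speed data, the sketch below segments a 1 Hz speed trace into "microtrips" (runs of motion between idles) and picks the one whose summary statistics best match the whole log. The function names, the idle threshold and the (mean, max) statistics are my own choices, not NREL's method.

```python
import numpy as np

def microtrips(speed, idle_thresh=0.5):
    """Split a 1 Hz speed trace (m/s) into microtrips (segments between idles)."""
    speed = np.asarray(speed, dtype=float)
    moving = speed > idle_thresh
    trips, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                       # motion begins
        elif not m and start is not None:
            trips.append(speed[start:i])    # motion ends at an idle
            start = None
    if start is not None:
        trips.append(speed[start:])         # trace ended while moving
    return trips

def representative_trip(trips):
    """Pick the microtrip whose (mean, max) speed lies closest to the
    average (mean, max) over all microtrips."""
    stats = np.array([[t.mean(), t.max()] for t in trips])
    target = stats.mean(axis=0)
    return trips[int(np.argmin(np.linalg.norm(stats - target, axis=1)))]

# usage: three cruises at 5, 10 and 15 m/s separated by idling
trace = np.concatenate([[0]*5, [5]*10, [0]*5, [10]*10, [0]*5, [15]*10, [0]*5])
trips = microtrips(trace)
rep = representative_trip(trips)
```

Stitching together a handful of such representative microtrips is one simple way to obtain a short synthetic cycle that reflects real-world operation.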



Conversational Analysis as an Analytical Tool for Face-to-Face and Online Conversations  

ERIC Educational Resources Information Center

Some learning scientists are beginning to investigate social and cultural aspects of learning by examining the interactions between a learner and the environment as well as with other people in the learning environment. This article proposes conversational analysis (CA) as a tool to analyze interactions between learners and instructors in

Tan, Seng-Chee; Tan, Aik-Ling






Using the Virginia Cooperative Extension Climate Analysis Web Tool to Develop a Corn Planting Strategy  

E-print Network

By using the Virginia Cooperative Extension Climate Analysis Web Tool, growers can identify the date after which corn planting is likely to result in a successful stand, rather than relying on memories of past planting success or failure to develop a planting schedule.

Liskiewicz, Maciej


MIRAGE: A Management Tool for the Analysis and Deployment of Network Security Policies  

Microsoft Academic Search

We present the core functionality of MIRAGE, a management tool for the analysis and deployment of configuration policies over network security components, such as firewalls, intrusion detection systems, and VPN routers. We review the two main functionalities embedded in our current prototype: (1) a bottom-up analysis of already deployed network security configurations and (2) a top-down refinement of global policies

Joaquín García-Alfaro; Frédéric Cuppens; Nora Cuppens-Boulahia; Stere Preda



Tool Development for Analysis of WCDMA Radio Measurements and Investigation of EcNo and RSCP values before Drop Call  

Microsoft Academic Search

In this work, a tool was developed for the post-processing analysis of drive test measurement files. The tool collects all the necessary information from measurement files and organizes it into a database. It plots the Received Signal Code Power (RSCP) measurements on a map along the route of the car, using a color code.



The ipc/HYDRA Tool Chain for the Analysis of PEPA Models Jeremy T. Bradley William J. Knottenbelt  

E-print Network

Performance metrics are used to specify service level agreements (SLAs) and benchmarks. HYDRA is a tool that can produce both steady-state and transient results; we present the ipc/HYDRA tool chain, which can

Imperial College, London


Ribosomal Database Project: data and tools for high throughput rRNA analysis  

PubMed Central

Ribosomal Database Project (RDP) provides the research community with aligned and annotated rRNA gene sequence data, along with tools to allow researchers to analyze their own rRNA gene sequences in the RDP framework. RDP data and tools are utilized in fields as diverse as human health, microbial ecology, environmental microbiology, nucleic acid chemistry, taxonomy and phylogenetics. In addition to aligned and annotated collections of bacterial and archaeal small subunit rRNA genes, RDP now includes a collection of fungal large subunit rRNA genes. RDP tools, including Classifier and Aligner, have been updated to work with this new fungal collection. The use of high-throughput sequencing to characterize environmental microbial populations has exploded in the past several years, and as sequence technologies have improved, the sizes of environmental datasets have increased. With release 11, RDP is providing an expanded set of tools to facilitate analysis of high-throughput data, including both single-stranded and paired-end reads. In addition, most tools are now available as open source packages for download and local use by researchers with high-volume needs or who would like to develop custom analysis pipelines. PMID:24288368

Cole, James R.; Wang, Qiong; Fish, Jordan A.; Chai, Benli; McGarrell, Donna M.; Sun, Yanni; Brown, C. Titus; Porras-Alfaro, Andrea; Kuske, Cheryl R.; Tiedje, James M.



Mobility analysis tool based on the fundamental principle of conservation of energy.  

SciTech Connect

In the past decade, a great deal of effort has been focused on research and development of versatile robotic ground vehicles without understanding their performance in particular operating environments. As the use of robotic ground vehicles for intelligence applications increases, understanding their mobility becomes critical to increasing the probability of successful operations. This paper describes a framework based on conservation of energy to predict the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility, defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. The mobility analysis tool described here, a graphical user interface application developed at Sandia National Laboratories, Albuquerque, NM, is at an initial stage of development. In the future, the tool will be expanded to include all vehicle and terrain types.

Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert
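The abstract's core idea, comparing traction capability against energy loss at the vehicle-terrain interface, can be illustrated with a quasi-static toy model. This is my own simplification (friction-limited traction mu*N, rolling resistance c_rr*N on a rigid slope), not the Sandia tool's terramechanics; all names and numbers below are illustrative.

```python
import math

def net_drawbar_force(weight_n, mu, c_rr, slope_deg):
    """Net force (N) left after grade and rolling losses on a uniform slope.

    Since force equals energy per metre travelled, a positive value means
    traction capability exceeds energy loss and steady motion is possible."""
    th = math.radians(slope_deg)
    traction = mu * weight_n * math.cos(th)                   # traction capability
    losses = weight_n * (math.sin(th) + c_rr * math.cos(th))  # grade + rolling loss
    return traction - losses

def max_gradeable_slope_deg(mu, c_rr):
    """Steepest steady-state climbable slope; the balance above gives
    tan(theta_max) = mu - c_rr."""
    return math.degrees(math.atan(mu - c_rr)) if mu > c_rr else 0.0

# usage: a 200 kg robot with mu = 0.8 and c_rr = 0.1
weight = 200 * 9.81
limit = max_gradeable_slope_deg(0.8, 0.1)
```

At `limit` the net drawbar force is exactly zero; below it the surplus force is what remains for acceleration or towing, which is the sense in which mobility here reduces to an energy balance.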



A Library of Cortical Morphology Analysis Tools to Study Development, Aging and Genetics of Cerebral Cortex  

PubMed Central

Sharing of analysis techniques and tools is among the main driving forces of modern neuroscience. We describe a library of tools developed to quantify global and regional differences in cortical anatomy in high resolution structural MR images. This library is distributed as a plug-in application for popular structural analysis software, BrainVisa (BV). It contains tools to measure global and regional gyrification, gray matter thickness and sulcal and gyral white matter spans. We provide a description of each tool and examples for several case studies to demonstrate their use. These examples show how the BV library was used to study cortical folding process during anten