These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

DECONV-TOOL: An IDL based deconvolution software package  

NASA Technical Reports Server (NTRS)

There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.

Varosi, F.; Landsman, W. B.

1992-01-01
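
For Poisson-noise data, the Maximum Likelihood deconvolution named in the record above corresponds to the classic Richardson-Lucy iteration. The NumPy/SciPy sketch below only illustrates that iteration (DeConv_Tool itself is written in IDL); the PSF handling, starting estimate, and iteration count are assumptions for illustration.

    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(blurred, psf, n_iter=50):
        """Maximum-likelihood (Richardson-Lucy) deconvolution for Poisson noise."""
        psf = psf / psf.sum()                 # normalized point-spread function
        psf_mirror = psf[::-1, ::-1]          # flipped PSF for the correction step
        estimate = np.full_like(blurred, blurred.mean(), dtype=float)
        for _ in range(n_iter):
            reblurred = fftconvolve(estimate, psf, mode="same")
            ratio = blurred / np.maximum(reblurred, 1e-12)   # avoid division by zero
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate

Residual statistics such as the chi-square of the residuals can be evaluated after each iteration to monitor convergence, in the spirit of the statistics tracked by DeConv_Tool.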

2

AIRY-LN: an ad-hoc numerical tool for deconvolution of images from the LBT instrument LINC-NIRVANA

E-print Network

AIRY-LN: an ad-hoc numerical tool for deconvolution of images from the LBT instrument LINC-NIRVANA. ABSTRACT: LINC-NIRVANA (LN) is the German-Italian Fizeau beam combiner for the Large Binocular Telescope. Keywords: Blind deconvolution; PSF extraction; Large Binocular Telescope; LINC-NIRVANA; Software; AIRY-LN


Bertero, Mario

3

Importance of FTIR Spectra Deconvolution for the Analysis of Amorphous Calcium Phosphates  

NASA Astrophysics Data System (ADS)

This work will consider diffuse reflectance Fourier transform infrared spectroscopy (FTIR-DRIFT) for collecting the spectra and deconvolution to identify changes in bonding as a means of more powerful detection. Spectra were recorded from amorphous calcium phosphate synthesized by wet precipitation, and from bone. FTIR-DRIFT was used to study the chemical environments of PO4, CO3 and amide. Deconvolution of spectra separated overlapping bands in the ν4 PO4, ν2 CO3, ν3 CO3 and amide regions, allowing a more detailed analysis of changes at the atomic level. Amorphous calcium phosphate dried at 80 °C, despite showing an X-ray diffraction amorphous structure, displayed carbonate in positions resembling a carbonated hydroxyapatite. Additional peaks were designated as A1 type, A2 type or B type. Deconvolution allowed the separation of CO3 positions in bone from amide peaks. FTIR-DRIFT spectrometry in combination with deconvolution offers an advanced tool for qualitative and quantitative determination of CO3, PO4 and HPO4 and shows promise to measure the degree of order.

Brangule, Agnese; Agris Gross, Karlis

2015-03-01

4

Multispectral imaging analysis: spectral deconvolution and applications in biology  

NASA Astrophysics Data System (ADS)

Multispectral imaging has been in use for over half a century. Owing to advances in digital photographic technology, multispectral imaging is now used in settings ranging from clinical medicine to industrial quality control. Our efforts focus on the use of multispectral imaging coupled with spectral deconvolution for measurement of endogenous tissue fluorophores and for animal tissue analysis by multispectral fluorescence, absorbance, and reflectance data. Multispectral reflectance and fluorescence images may be useful in evaluation of pathology in histological samples. For example, current hematoxylin/eosin diagnosis limits spectral analysis to shades of red and blue/grey. It is possible to extract much more information using multispectral techniques. To collect this information, a series of filters or a device such as an acousto-optical tunable filter (AOTF) or liquid-crystal filter (LCF) can be used with a CCD camera, enabling collection of images at many more wavelengths than is possible with a simple filter wheel. In multispectral data processing, the "unmixing" of reflectance or fluorescence data and the analysis and classification based upon these spectra are required for any classification. In addition to multispectral techniques, extraction of topological information may be possible by reflectance deconvolution or multiple-angle imaging, which could aid in accurate diagnosis of skin lesions or isolation of specific biological components in tissue. The goal of these studies is to develop spectral signatures that will provide us with specific and verifiable tissue structure/function information. In addition, relatively complex classification techniques must be developed so that the data are of use to the end user.

Leavesley, Silas; Ahmed, Wamiq; Bayraktar, Bulent; Rajwa, Bartek; Sturgis, Jennifer; Robinson, J. P.

2005-03-01

5

Monitoring a Building Using Deconvolution Interferometry. II: Ambient-Vibration Analysis  

E-print Network

Monitoring a Building Using Deconvolution Interferometry. II: Ambient-Vibration Analysis by Nori [...]. We apply deconvolution interferometry to ambient-vibration data, instead of using earthquake data, to monitor a building. The time continuity of ambient vibrations is useful for temporal monitoring. We show that, because multiple sources

Snieder, Roel
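
Deconvolution interferometry of building vibrations is commonly formulated as a regularized spectral division of the motion recorded at one floor by the motion at a reference floor. The sketch below is a generic water-level implementation of that idea, not code from the cited study; the input records, sampling interval and water-level fraction are placeholders.

    import numpy as np

    def deconvolution_interferometry(u_floor, u_ref, dt, water_level=0.01):
        """Estimate the impulse response between a reference floor and another
        floor by regularized spectral division (water-level deconvolution)."""
        n = len(u_ref)
        U_floor = np.fft.rfft(u_floor, n)
        U_ref = np.fft.rfft(u_ref, n)
        power = np.abs(U_ref) ** 2
        eps = water_level * power.max()        # water-level regularization term
        D = U_floor * np.conj(U_ref) / (power + eps)
        return np.fft.irfft(D, n)              # band-limited impulse response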

6

Bayesian Analysis (2006) 1, Number 2, pp. 189-236. Deconvolution in High-Energy Astrophysics

E-print Network

Bayesian Analysis (2006) 1, Number 2, pp. 189-236. Deconvolution in High-Energy Astrophysics. In recent years, there has been an avalanche of new data in observational high-energy astrophysics [...] inferential problems in high-energy astrophysics. We emphasize fully model-based statistical inference; we

van Dyk, David

7

Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research  

NASA Astrophysics Data System (ADS)

During the last decade stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers caused by the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. The induced change in the natural isotopic composition of an element allows, among other things, for studying the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag for particular biological samples. Due to the vast potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, e.g. because Sr is present in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer incorporated into the natural sample matrix, the degree of impurities, or species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer studies is critically discussed, with special emphasis on evaluating different data processing strategies using the example of enriched stable Sr isotopes [1]. The analytical key parameters such as blank (Kr, Sr and Rb), variation of the natural Sr isotopic composition in the sample, mass bias, interferences (Rb) and total combined uncertainty are considered. A full metrological protocol for data processing using IPD is presented based on data gained during two transgenerational marking studies of fish, where the transfer of a Sr isotope double spike (84Sr and 86Sr) from female spawners of common carp (Cyprinus carpio L.) and brown trout (Salmo trutta f.f.) [2] to the centre of the otoliths of their offspring was studied by (LA)-MC-ICP-MS. [1] J. Irrgeher, A. Zitek, M. Cervicek and T. Prohaska, J. Anal. At. Spectrom., 2014, 29, 193-200. [2] A. Zitek, J. Irrgeher, M. Kletzl, T. Weismann and T. Prohaska, Fish. Manage. Ecol., 2013, 20, 654-361.

Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

2014-05-01
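
Isotope pattern deconvolution as described above reduces to a multiple linear regression: the measured isotope abundances are modeled as a linear combination of the natural pattern and the enriched-spike patterns, and the molar fractions follow from a least-squares fit. A minimal sketch is given below; the natural Sr abundances are standard reference values, while the spike compositions and the measured pattern are invented placeholders, not data from the record.

    import numpy as np

    # Columns: reference isotope patterns (abundances summing to ~1) of the
    # natural element and of each enriched spike (placeholder spike values).
    A = np.array([
        # natural   84Sr spike   86Sr spike
        [0.0056,    0.9980,      0.0010],   # 84Sr
        [0.0986,    0.0010,      0.9970],   # 86Sr
        [0.0700,    0.0005,      0.0010],   # 87Sr
        [0.8258,    0.0005,      0.0010],   # 88Sr
    ])

    measured = np.array([0.10, 0.20, 0.06, 0.64])   # hypothetical spiked sample

    # Isotope pattern deconvolution: least-squares estimate of the contributions
    x, residuals, rank, _ = np.linalg.lstsq(A, measured, rcond=None)
    fractions = x / x.sum()                          # normalize to molar fractions
    print(fractions)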

8

Intrinsic fluorescence spectroscopy of glutamate dehydrogenase: Integrated behavior and deconvolution analysis  

NASA Astrophysics Data System (ADS)

In this paper, we present a deconvolution method aimed at spectrally resolving the broad fluorescence spectra of proteins, namely, of the enzyme bovine liver glutamate dehydrogenase (GDH). The analytical procedure is based on the deconvolution of the emission spectra into three distinct Gaussian fluorescing bands Gj. The relative changes of the Gj parameters are directly related to the conformational changes of the enzyme, and provide interesting information about the fluorescence dynamics of the individual emitting contributions. Our deconvolution method results in an excellent fitting of all the spectra obtained with GDH in a number of experimental conditions (various conformational states of the protein) and describes very well the dynamics of a variety of phenomena, such as the dependence of hexamers association on protein concentration, the dynamics of thermal denaturation, and the interaction process between the enzyme and external quenchers. The investigation was carried out by means of different optical experiments, i.e., native enzyme fluorescence, thermal-induced unfolding, and fluorescence quenching studies, utilizing both the analysis of the average behavior of the enzyme and the proposed deconvolution approach.

Pompa, P. P.; Cingolani, R.; Rinaldi, R.

2003-07-01
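
The deconvolution described above amounts to fitting the emission spectrum with a sum of three Gaussian bands and tracking the fitted band parameters across experimental conditions. A minimal SciPy sketch under that assumption follows; the wavelength grid, band positions and noise level are invented for illustration and are not taken from the record.

    import numpy as np
    from scipy.optimize import curve_fit

    def three_gaussians(wl, *p):
        """Sum of three Gaussian bands; p = (amplitude, center, width) x 3."""
        total = np.zeros_like(wl, dtype=float)
        for a, c, w in zip(p[0::3], p[1::3], p[2::3]):
            total += a * np.exp(-0.5 * ((wl - c) / w) ** 2)
        return total

    # Hypothetical emission spectrum (wavelengths in nm, arbitrary intensities)
    wl = np.linspace(300, 450, 300)
    spectrum = three_gaussians(wl, 1.0, 330, 12, 0.8, 350, 15, 0.4, 380, 20)
    spectrum += np.random.default_rng(0).normal(0, 0.01, wl.size)

    # Initial guesses for the three bands (amplitude, center, width)
    p0 = [1.0, 325, 10, 1.0, 355, 10, 0.5, 385, 15]
    popt, pcov = curve_fit(three_gaussians, wl, spectrum, p0=p0)
    print(popt.reshape(3, 3))   # fitted (amplitude, center, width) per band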

9

Quantitative scintigraphy with deconvolutional analysis for the dynamic measurement of hepatic function  

SciTech Connect

A mathematical technique known as deconvolutional analysis was used to provide a critical and previously missing element in the computations required to quantitate hepatic function scintigraphically. This computer-assisted technique allowed for the determination of the time required, in minutes, for a labeled bilirubin analog (99mTc-disofenin) to enter the liver via blood and exit via bile. This interval was referred to as the mean transit time (MTT). The critical process provided for by deconvolution is the mathematical simulation of a bolus injection of tracer directly into the afferent blood supply of the liver. The raw data required for this simulation are obtained from the intravenous injection of labeled disofenin, a member of the HIDA family of radiopharmaceuticals. In this study, we perform experiments which document that the simulation process itself is accurate. We then calculate the MTT under a variety of experimental conditions involving progressive hepatic ischemia/reperfusion injury and correlate these results with the results of simultaneously performed BSP determinations and hepatic histology. The experimental group with the most pronounced histologic findings (necrosis, vacuolization, disorganization of hepatic cords) also had the most prolonged MTT and BSP half-life. However, both quantitative imaging and BSP testing are able to identify milder degrees of hepatic ischemic injury not reflected in the histologic evaluation. Quantitative imaging with deconvolutional analysis is a technique easily adaptable to the standard nuclear medicine minicomputer. It provides rapid results and appears to be a sensitive monitor of hepatic functional disturbances resulting from ischemia and reperfusion.

Tagge, E.P.; Campbell, D.A. Jr.; Reichle, R.; Averill, D.R. Jr.; Merion, R.M.; Dafoe, D.C.; Turcotte, J.G.; Juni, J.E.

1987-06-01

10

A further analysis for the minimum-variance deconvolution filter performance  

NASA Technical Reports Server (NTRS)

Chi and Mendel (1984) analyzed the performance of minimum-variance deconvolution (MVD). In this correspondence, a further analysis of the performance of the MVD filter is presented. It is shown that the MVD filter performs like an inverse filter and a whitening filter as SNR goes to infinity, and like a matched filter as SNR goes to zero. The estimation error of the MVD filter is colored noise, but it becomes white when SNR goes to zero. This analysis also connects the error power-spectral density of the MVD filter with the spectrum of the causal-prediction error filter.

Chi, Chong-Yung

1987-01-01
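
The limiting behavior described above can be illustrated with a Wiener-type frequency response for the MVD filter of a white input sequence, H = SNR*conj(V) / (SNR*|V|^2 + 1): as SNR grows it tends to the inverse (whitening) filter 1/V, and as SNR vanishes it tends to a scaled matched filter conj(V). This simplified form assumes white input and noise spectra, and the wavelet below is an arbitrary example, not one from the paper.

    import numpy as np

    def mvd_filter_response(v_freq, snr):
        """Wiener-type minimum-variance deconvolution filter response for a
        white input sequence: H = SNR*conj(V) / (SNR*|V|^2 + 1)."""
        return snr * np.conj(v_freq) / (snr * np.abs(v_freq) ** 2 + 1.0)

    wavelet = np.array([0.2, 1.0, -0.6, 0.1])      # arbitrary example wavelet
    V = np.fft.fft(wavelet, 256)

    H_high = mvd_filter_response(V, snr=1e6)       # high SNR: ~ 1/V (inverse/whitening)
    H_low = mvd_filter_response(V, snr=1e-6)       # low SNR: ~ SNR*conj(V) (matched filter)
    print(np.max(np.abs(H_high * V - 1.0)))        # ~0 confirms the inverse-filter limit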

11

FTIR Analysis of Alkali Activated Slag and Fly Ash Using Deconvolution Techniques  

NASA Astrophysics Data System (ADS)

Studies on aluminosilicate materials as replacements for traditional construction materials such as ordinary Portland cement (OPC), in order to reduce the associated environmental effects, have been an important research area for the past decades. Many properties like strength have already been studied and the primary focus is to learn about the reaction mechanism and the effect of the parameters on the formed products. The aim of this research was to explore the structural changes and reaction product analysis of geopolymers (slag and fly ash) using Fourier transform infrared spectroscopy (FTIR) and deconvolution techniques. Spectroscopic techniques give valuable information at a molecular level, but not all methods are economic and simple. To understand the mechanisms of alkali activated aluminosilicate materials, attenuated total reflectance (ATR) FTIR has been used, and the effect of the parameters on the reaction products has been analyzed. To analyze complex systems like geopolymers using FTIR, deconvolution techniques help to obtain the properties of a particular peak attributed to a certain molecular vibration. Time and temperature dependent analyses were done on slag pastes to understand the polymerization of reactive silica in the system with time and temperature variance. For the time dependent analysis, slag was activated with sodium and potassium silicates using two different 'n' values and three different silica modulus [Ms = SiO2/M2O] values. The temperature dependent analysis was done by curing the samples at 60 °C and 80 °C. Similarly, fly ash was studied by activating with alkali hydroxides and alkali silicates. Under the same curing conditions the fly ash samples were evaluated to analyze the effects of added silicates for alkali activation. The peak shifts in the FTIR spectra reflect changes in the structural nature of the matrix and can be identified using the deconvolution technique. A strong correlation is found between the concentration of silicate monomer in the activating solution and the position of the main Si-O-T (where T is Al/Si) stretching band in the FTIR spectrum, which gives an indication of the relative changes in the Si/Al ratio. Also, the effect of the cation and silicate concentration in the activating solution has been discussed using the Fourier self-deconvolution technique.

Madavarapu, Sateesh Babu

12

OEXP Analysis Tools Workshop  

NASA Technical Reports Server (NTRS)

This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

1988-01-01

13

Investigation of the CLEAN deconvolution method for use with Late Time Response analysis of multiple objects  

NASA Astrophysics Data System (ADS)

This paper investigates the application of the CLEAN non-linear deconvolution method to Late Time Response (LTR) analysis for detecting multiple objects in Concealed Threat Detection (CTD). When an Ultra-Wide Band (UWB) frequency radar signal is used to illuminate a conductive target, surface currents are induced upon the object which in turn give rise to LTR signals. These signals are re-radiated from the target, and results from a number of targets are presented. The experiment was performed using a double-ridged horn antenna in a pseudo-monostatic arrangement. A vector network analyser (VNA) has been used to provide the UWB Frequency Modulated Continuous Wave (FMCW) radar signal. The distance between the transmitting antenna and the target objects has been kept at 1 metre for all the experiments performed, and the power level at the VNA was set to 0 dBm. The targets in the experimental setup are suspended in air in a laboratory environment. Matlab has been used in post-processing to perform linear and non-linear deconvolution of the signal. The Wiener filter, Fast Fourier Transform (FFT) and Continuous Wavelet Transform (CWT) are used to process the return signals and extract the LTR features from the noise clutter. A Generalized Pencil-of-Function (GPOF) method was then used to extract the complex poles of the signal. Artificial Neural Networks (ANN) and Linear Discriminant Analysis (LDA) have been used to classify the data.

Hutchinson, Simon; Taylor, Christopher T.; Fernando, Michael; Andrews, David; Bowring, Nicholas

2014-10-01
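
The CLEAN method referenced above iteratively locates the strongest peak in the "dirty" response, subtracts a scaled and shifted copy of the known point-spread (or wavelet) response at that location, and accumulates the removed peaks as a sparse model. The 1D sketch below is a generic textbook implementation, not the processing chain used in the paper; the gain, threshold and iteration limit are placeholders.

    import numpy as np

    def clean_1d(dirty, psf, gain=0.1, threshold=1e-3, max_iter=500):
        """Simple 1D CLEAN: repeatedly remove a scaled, shifted copy of the PSF
        at the strongest residual peak, accumulating delta components in a model."""
        residual = np.asarray(dirty, dtype=float).copy()
        model = np.zeros_like(residual)
        center = int(np.argmax(np.abs(psf)))          # reference sample of the PSF
        for _ in range(max_iter):
            peak = int(np.argmax(np.abs(residual)))
            if np.abs(residual[peak]) < threshold:
                break
            amp = gain * residual[peak]
            model[peak] += amp
            # align the PSF with the detected peak and subtract it
            idx = np.arange(len(psf)) + (peak - center)
            valid = (idx >= 0) & (idx < len(residual))
            residual[idx[valid]] -= amp * psf[valid]
        return model, residual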

14

Blind deconvolution via independent component analysis for thin-pavement thickness estimation using GPR  

Microsoft Academic Search

Blind deconvolution of sparse spikes is a well-known problem in the fields of seismic exploration and ultrasonic nondestructive testing. In measuring thin layer thickness of asphalt pavements using GPR, a similar problem arises; the sparse reflectivity series representing the layered structure of the pavement, convolved with the radar wavelet, results in masking of closely spaced reflections. A successful deconvolution retrieves the

Khaled CHAHINE; Vincent BALTAZART; Xavier Drobert; Yide WANG

2009-01-01

15

MORESANE: MOdel REconstruction by Synthesis-ANalysis Estimators. A sparse deconvolution algorithm for radio interferometric imaging  

NASA Astrophysics Data System (ADS)

Context. Recent years have seen huge developments of radio telescopes and a tremendous increase in their capabilities (sensitivity, angular and spectral resolution, field of view, etc.). Such systems make designing more sophisticated techniques mandatory not only for transporting, storing, and processing this new generation of radio interferometric data, but also for restoring the astrophysical information contained in such data. Aims. In this paper we present a new radio deconvolution algorithm named MORESANE and its application to fully realistic simulated data of MeerKAT, one of the SKA precursors. This method has been designed for the difficult case of restoring diffuse astronomical sources that are faint in brightness, complex in morphology, and possibly buried in the dirty beam's side lobes of bright radio sources in the field. Methods. MORESANE is a greedy algorithm that combines complementary types of sparse recovery methods in order to reconstruct the most appropriate sky model from observed radio visibilities. A synthesis approach is used for reconstructing images, in which the synthesis atoms representing the unknown sources are learned using analysis priors. We applied this new deconvolution method to fully realistic simulations of the radio observations of a galaxy cluster and of an HII region in M 31. Results. We show that MORESANE is able to efficiently reconstruct images composed of a wide variety of sources (compact point-like objects, extended tailed radio galaxies, low-surface brightness emission) from radio interferometric data. Comparisons with the state of the art algorithms indicate that MORESANE provides competitive results in terms of both the total flux/surface brightness conservation and fidelity of the reconstructed model. MORESANE seems particularly well suited to recovering diffuse and extended sources, as well as bright and compact radio sources known to be hosted in galaxy clusters.

Dabbech, A.; Ferrari, C.; Mary, D.; Slezak, E.; Smirnov, O.; Kenyon, J. S.

2015-04-01

16

Extended Testability Analysis Tool  

NASA Technical Reports Server (NTRS)

The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

Melcher, Kevin; Maul, William A.; Fulton, Christopher

2012-01-01

17

Error analysis of tumor blood flow measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis.  

PubMed

We performed error analysis of tumor blood flow (TBF) measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis, based on computer simulations. For analysis, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) consisting of gamma-variate functions using an adiabatic approximation to the tissue homogeneity model under various plasma flow (F(p)), mean capillary transit time (T(c)), permeability-surface area product (PS) and signal-to-noise ratio (SNR) values. Deconvolution analyses based on truncated singular value decomposition with a fixed threshold value (TSVD-F), with an adaptive threshold value (TSVD-A) and with the threshold value determined by generalized cross validation (TSVD-G) were used to estimate F(p) values from the simulated concentration-time curves in the VOI and AIF. First, we investigated the relationship between the optimal threshold value and SNR in TSVD-F, and then derived the equation describing the relationship between the threshold value and SNR for TSVD-A. Second, we investigated the dependences of the estimated F(p) values on T(c), PS, the total duration for data acquisition and the shape of AIF. Although TSVD-F with a threshold value of 0.025, TSVD-A with the threshold value determined by the equation derived in this study and TSVD-G could estimate the F(p) values in a similar manner, the standard deviation of the estimates was the smallest and largest for TSVD-A and TSVD-G, respectively. PS did not largely affect the estimates, while T(c) did in all methods. Increasing the total duration significantly improved the variations in the estimates in all methods. TSVD-G was most sensitive to the shape of AIF, especially when the total duration was short. In conclusion, this study will be useful for understanding the reliability and limitation of model-independent deconvolution analysis when applied to TBF measurement using an extravascular contrast agent. PMID:17473352

Murase, Kenya; Miyazaki, Shohei

2007-05-21
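
In this kind of model-independent analysis, the tissue curve is the discrete convolution of the AIF with a tissue impulse response whose peak estimates plasma flow, and TSVD solves that ill-conditioned linear system by discarding singular values below a threshold (the fixed variant in the record uses 0.025 of the largest). The sketch below is a generic illustration of TSVD deconvolution, not the authors' simulation code; sampling and units are left abstract.

    import numpy as np

    def tsvd_deconvolution(aif, tissue, dt, threshold=0.025):
        """Truncated-SVD deconvolution of a tissue concentration curve by the
        arterial input function; the peak of the recovered impulse response
        serves as the plasma-flow estimate."""
        n = len(aif)
        # causal (lower-triangular) convolution matrix built from the AIF
        A = np.zeros((n, n))
        for i in range(n):
            A[i, :i + 1] = aif[i::-1]
        A *= dt
        U, s, Vt = np.linalg.svd(A)
        s_inv = np.where(s >= threshold * s.max(), 1.0 / s, 0.0)  # drop small values
        R = Vt.T @ (s_inv * (U.T @ tissue))
        return R, R.max()       # impulse response and plasma-flow estimate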

18

Error analysis of tumor blood flow measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis  

NASA Astrophysics Data System (ADS)

We performed error analysis of tumor blood flow (TBF) measurement using dynamic contrast-enhanced data and model-independent deconvolution analysis, based on computer simulations. For analysis, we generated a time-dependent concentration of the contrast agent in the volume of interest (VOI) from the arterial input function (AIF) consisting of gamma-variate functions using an adiabatic approximation to the tissue homogeneity model under various plasma flow (Fp), mean capillary transit time (Tc), permeability-surface area product (PS) and signal-to-noise ratio (SNR) values. Deconvolution analyses based on truncated singular value decomposition with a fixed threshold value (TSVD-F), with an adaptive threshold value (TSVD-A) and with the threshold value determined by generalized cross validation (TSVD-G) were used to estimate Fp values from the simulated concentration-time curves in the VOI and AIF. First, we investigated the relationship between the optimal threshold value and SNR in TSVD-F, and then derived the equation describing the relationship between the threshold value and SNR for TSVD-A. Second, we investigated the dependences of the estimated Fp values on Tc, PS, the total duration for data acquisition and the shape of AIF. Although TSVD-F with a threshold value of 0.025, TSVD-A with the threshold value determined by the equation derived in this study and TSVD-G could estimate the Fp values in a similar manner, the standard deviation of the estimates was the smallest and largest for TSVD-A and TSVD-G, respectively. PS did not largely affect the estimates, while Tc did in all methods. Increasing the total duration significantly improved the variations in the estimates in all methods. TSVD-G was most sensitive to the shape of AIF, especially when the total duration was short. In conclusion, this study will be useful for understanding the reliability and limitation of model-independent deconvolution analysis when applied to TBF measurement using an extravascular contrast agent.

Murase, Kenya; Miyazaki, Shohei

2007-05-01

19

Analysis of a deconvolution-based information retrieval algorithm in X-ray grating-based phase-contrast imaging  

NASA Astrophysics Data System (ADS)

Grating-based X-ray phase-contrast imaging is a promising imaging modality to increase soft tissue contrast in comparison to conventional attenuation-based radiography. Complementary and otherwise inaccessible information is provided by the dark-field image, which shows the sub-pixel size granularity of the measured object. This could especially turn out to be useful in mammography, where tumourous tissue is connected with the presence of supertiny microcalcifications. In addition to the well-established image reconstruction process, an analysis method was introduced by Modregger [1], which is based on deconvolution of the underlying scattering distribution within a single pixel revealing information about the sample. Subsequently, the different contrast modalities can be calculated with the scattering distribution. The method already proved to deliver additional information in the higher moments of the scattering distribution and possibly reaches better image quality with respect to an increased contrast-to-noise ratio. Several measurements were carried out using melamine foams as phantoms. We analysed the dependency of the deconvolution-based method with respect to the dark-field image on different parameters such as dose, number of iterations of the iterative deconvolution algorithm and dark-field signal. A disagreement was found in the reconstructed dark-field values between the FFT method and the iterative method. Usage of the resulting characteristics might be helpful in future applications.

Horn, Florian; Bayer, Florian; Pelzer, Georg; Rieger, Jens; Ritter, André; Weber, Thomas; Zang, Andrea; Michel, Thilo; Anton, Gisela

2014-03-01

20

Numerical analysis of Leray-Tikhonov deconvolution models of fluid motion  

Microsoft Academic Search

This report develops and studies a new family of NSE-regularizations, Tikhonov-Leray Regularization with Time Relaxation Models. This new family of turbulence models is based on a modification (consistent with the large scales) of Tikhonov-Lavrentiev regularization. With this approach, we obtain an approximation of the unfiltered solution by one filtering step. We introduce the modified Tikhonov deconvolution operator and

Iuliana Stanculescu; Carolina C. Manica

2010-01-01

21

Characterization of attenuated proestrous luteinizing hormone surges in middle-aged rats by deconvolution analysis.  

PubMed

Reproductive aging in female rats is associated with attenuated preovulatory LH surges. In this study, detailed analyses of the episodic characteristics of the proestrous LH surge were conducted in young and middle-aged regularly cyclic rats. On proestrus, blood samples were withdrawn at 3-min intervals for 6 h and analyzed for LH concentrations by RIA in triplicate. Deconvolution analysis of immunoreactive LH concentrations revealed that there was no difference in the detectable LH secretory burst frequency between young and middle-aged rats. However, in middle-aged rats with an attenuated LH surge on proestrus, the mass of LH secreted per burst and the maximal rate of LH secretion per burst were only one fourth (p < 0.01) of those in young and middle-aged rats with normal LH surges. Furthermore, middle-aged rats with attenuated LH surges had a 4-fold decrease (p < 0.01) in the maximal rate of LH secretion per burst compared to young and middle-aged females with normal LH surges. The apparent half-life of endogenous LH was similar among the 3 groups. The attenuated LH surges of middle-aged rats were related specifically to a decrease in LH burst amplitude with no change in pulse frequency. The orderliness of moment-to-moment LH release as quantified by the regularity statistic, approximate entropy, was comparable in the 3 groups. Our findings of a markedly decreased amount of LH released per burst and preserved orderliness of the LH release process strongly suggest that a deficient GnRH drive and/or reduced responsivity to the GnRH signal, rather than altered timing of the signal, accounts for the age-related decline in reproductive function in female rats as presaged by an attenuated proestrous LH surge in middle age. PMID:9828195

Matt, D W; Gilson, M P; Sales, T E; Krieg, R J; Kerbeshian, M C; Veldhuis, J D; Evans, W S

1998-12-01

22

A System Analysis Tool  

SciTech Connect

In this paper we describe a tool for analyzing systems. The analysis is based on program slicing. It answers the following question for the software: if the value of a particular variable changes, what other variable values also change, and what is the path in between? Program slicing was developed based on intra-procedure control and data flow. It has been expanded commercially to inter-procedure flow. However, we extend slicing to collections of programs and non-program entities, which we term multi-domain systems. The value of our tool is that an analyst can model the entirety of a system, not just the software, and we believe that this makes for a significant increase in power. We are building a prototype system.

CAMPBELL,PHILIP L.; ESPINOZA,JUAN

2000-06-01

23

Frequency Response Analysis Tool  

SciTech Connect

Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report describes the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and the Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating the NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.

Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.

2014-12-31

24

Neutron multiplicity analysis tool  

SciTech Connect

I describe the capabilities of the EXCOM (EXcel based COincidence and Multiplicity) calculation tool which is used to analyze experimental data or simulated neutron multiplicity data. The input to the program is the count-rate data (including the multiplicity distribution) for a measurement, the isotopic composition of the sample and relevant dates. The program carries out deadtime correction and background subtraction and then performs a number of analyses. These are: passive calibration curve, known alpha and multiplicity analysis. The latter is done with both the point model and with the weighted point model. In the current application EXCOM carries out the rapid analysis of Monte Carlo calculated quantities and allows the user to determine the magnitude of sample perturbations that lead to systematic errors. Neutron multiplicity counting is an assay method used in the analysis of plutonium for safeguards applications. It is widely used in nuclear material accountancy by international (IAEA) and national inspectors. The method uses the measurement of the correlations in a pulse train to extract information on the spontaneous fission rate in the presence of neutrons from (α,n) reactions and induced fission. The measurement is relatively simple to perform and gives results very quickly (≤ 1 hour). By contrast, destructive analysis techniques are extremely costly and time consuming (several days). By improving the achievable accuracy of neutron multiplicity counting, a nondestructive analysis technique, it could be possible to reduce the use of destructive analysis measurements required in safeguards applications. The accuracy of a neutron multiplicity measurement can be affected by a number of variables such as density, isotopic composition, chemical composition and moisture in the material. In order to determine the magnitude of these effects on the measured plutonium mass a calculational tool, EXCOM, has been produced using VBA within Excel. This program was developed to help speed the analysis of Monte Carlo neutron transport simulation (MCNP) data, and only requires the count-rate data to calculate the mass of material using INCC's analysis methods instead of the full neutron multiplicity distribution required to run analysis in INCC. This paper describes what is implemented within EXCOM, including the methods used, how the program corrects for deadtime, and how uncertainty is calculated. This paper also describes how to use EXCOM within Excel.

Stewart, Scott L. [Los Alamos National Laboratory]

2010-01-01

25

Geodetic Strain Analysis Tool  

NASA Technical Reports Server (NTRS)

A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of strain fields from the GPS and strain meter data that sample them, to aid understanding of the strengths and weaknesses of strain field derivation from continuous GPS (CGPS) and other geodetic data in a variety of tectonic settings, to converge on the "best practices" strain derivation strategy for the Solid Earth Science ESDR System (SESES) project given the CGPS station distribution in the western U.S., and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.

Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan

2011-01-01

26

Extension of the performance of Laplace deconvolution in the analysis of fluorescence decay curves.  

PubMed Central

The original Laplace deconvolution of luminescence data, obtained with pulsed systems, is reviewed. The system of equations from which the luminescence parameters can be determined is generalized for the case that describes the relaxation by a sum of exponentials. Artifacts such as scatter and time-shift can be taken into account. A modification of the original method that eliminates the iterative procedure in the estimation of the cut-off correction is suggested. This modified Laplace method is no longer restricted to the cases where the cut-off error is rather small and the exciting flash has a low tail. The possibility of the combination of several discrete experiments in a single Laplace deconvolution, without introducing new parameters or normalization factors, is shown. The merits of this combination method are demonstrated on a time-resolved depolarization experiment. PMID:6626677

Ameloot, M; Hendrickx, H

1983-01-01

27

Draper Station Analysis Tool  

NASA Technical Reports Server (NTRS)

Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.

Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip

2011-01-01

28

A discriminant based charge deconvolution analysis pipeline for protein profiling of whole cell extracts using liquid chromatography-electrospray ionization-quadrupole time-of-flight mass spectrometry.  

PubMed

A discriminant based charge deconvolution analysis pipeline is proposed. The molecular weight determination (MoWeD) charge deconvolution method was applied directly to the discrimination rules obtained by the fuzzy rule-building expert system (FuRES) pattern classifier. This approach was demonstrated with synthetic electrospray ionization-mass spectra. Identification of the tentative protein biomarkers by bacterial cell extracts of Salmonella enterica serovar typhimurium strains A1 and A19 by liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS) was also demonstrated. The data analysis time was reduced by applying this approach. In addition, this method was less affected by noise and baseline drift. PMID:21530796

Lu, Weiying; Callahan, John H; Fry, Frederick S; Andrzejewski, Denis; Musser, Steven M; Harrington, Peter de B

2011-05-30

29

Independent component analysis (ICA) algorithms for improved spectral deconvolution of overlapped signals in 1H NMR analysis: application to foods and related products.  

PubMed

The major challenge facing NMR spectroscopic mixture analysis is the overlapping of signals and the arising impossibility to easily recover the structures for identification of the individual components and to integrate separated signals for quantification. In this paper, various independent component analysis (ICA) algorithms [mutual information least dependent component analysis (MILCA); stochastic non-negative ICA (SNICA); joint approximate diagonalization of eigenmatrices (JADE); and robust, accurate, direct ICA algorithm (RADICAL)] as well as deconvolution methods [simple-to-use-interactive self-modeling mixture analysis (SIMPLISMA) and multivariate curve resolution-alternating least squares (MCR-ALS)] are applied for simultaneous (1)H NMR spectroscopic determination of organic substances in complex mixtures. Among others, we studied constituents of the following matrices: honey, soft drinks, and liquids used in electronic cigarettes. Good quality spectral resolution of up to eight-component mixtures was achieved (correlation coefficients between resolved and experimental spectra were not less than 0.90). In general, the relative errors in the recovered concentrations were below 12%. SIMPLISMA and MILCA algorithms were found to be preferable for NMR spectra deconvolution and showed similar performance. The proposed method was used for analysis of authentic samples. The resolved ICA concentrations match well with the results of reference gas chromatography-mass spectrometry as well as the MCR-ALS algorithm used for comparison. ICA deconvolution considerably improves the application range of direct NMR spectroscopy for analysis of complex mixtures. PMID:24604756

Monakhova, Yulia B; Tsikin, Alexey M; Kuballa, Thomas; Lachenmeier, Dirk W; Mushtakova, Svetlana P

2014-05-01
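
ICA-based spectral deconvolution, as used in the record above, treats the recorded 1H NMR spectra of several mixtures as linear combinations of statistically independent component spectra and recovers those components without prior knowledge of the concentrations. The sketch below uses scikit-learn's FastICA, a different ICA algorithm from the ones compared in the paper (MILCA, SNICA, JADE, RADICAL), purely for illustration; the spectra and mixing ratios are synthetic placeholders.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(1)

    # Hypothetical pure-component spectra (two analytes over 500 chemical-shift bins)
    ppm = np.linspace(0, 10, 500)
    s1 = np.exp(-0.5 * ((ppm - 2.0) / 0.05) ** 2) + np.exp(-0.5 * ((ppm - 5.1) / 0.05) ** 2)
    s2 = np.exp(-0.5 * ((ppm - 2.1) / 0.05) ** 2) + np.exp(-0.5 * ((ppm - 7.3) / 0.05) ** 2)
    S = np.vstack([s1, s2])

    # Several mixture spectra with different (unknown) concentration ratios plus noise
    A = np.array([[1.0, 0.4], [0.6, 1.0], [0.8, 0.8], [0.3, 1.2]])
    X = A @ S + rng.normal(0, 0.005, (4, ppm.size))

    # Recover independent component spectra from the mixture spectra
    ica = FastICA(n_components=2, random_state=0)
    components = ica.fit_transform(X.T).T    # estimated pure spectra (up to scale/sign)
    mixing = ica.mixing_                     # relative contributions per mixture
    print(components.shape, mixing.shape)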

30

Nondestructive 3D confocal laser imaging with deconvolution of seven whole stardust tracks with complementary XRF and quantitative analysis  

SciTech Connect

We present a nondestructive 3D system for analysis of whole Stardust tracks, using a combination of Laser Confocal Scanning Microscopy and synchrotron XRF. 3D deconvolution is used for optical corrections, and results of quantitative analyses of several tracks are presented. The Stardust mission to comet Wild 2 trapped many cometary and ISM particles in aerogel, leaving behind 'tracks' of melted silica aerogel on both sides of the collector. Collected particles and their tracks range in size from submicron to millimeter scale. Interstellar dust collected on the obverse of the aerogel collector is thought to have an average track length of approximately 15 µm. It has been our goal to perform a total non-destructive 3D textural and XRF chemical analysis on both types of tracks. To that end, we use a combination of Laser Confocal Scanning Microscopy (LCSM) and X-Ray Fluorescence (XRF) spectrometry. Utilized properly, the combination of 3D optical data and chemical data provides total nondestructive characterization of full tracks, prior to flattening or other destructive analysis methods. Our LCSM techniques allow imaging at 0.075 µm/pixel, without the use of oil-based lenses. A full textural analysis on track No. 82 is presented here, as well as analysis of 6 additional tracks contained within 3 keystones (No. 128, No. 129 and No. 140). We present a method of removing the axial distortion inherent in LCSM images, by means of a computational 3D deconvolution algorithm, and present some preliminary experiments with computed point spread functions. The combination of 3D LCSM data and XRF data provides invaluable information, while preserving the integrity of the samples for further analysis. It is imperative that these samples, the first extraterrestrial solids returned since the Apollo era, be fully mapped nondestructively in 3D, to preserve the maximum amount of information prior to other, destructive analysis.

Greenberg, M.; Ebel, D.S. (AMNH)

2009-03-19

31

Hurricane Data Analysis Tool  

NASA Technical Reports Server (NTRS)

In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT; URL: http://disc.gsfc.nasa.gov/HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users have full access to terabytes of data and can generate 2-D or time-series plots and animation without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km Global (60°N-60°S) IR Dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km Global (60°N-60°S) IR Dataset, is one of the TRMM ancillary datasets. These are globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information to TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of interest and time, single imagery, overlay of two different products, animation, a time skip capability and different image size outputs. Users can save an animation as a file (animated gif) and import it into other presentation software, such as Microsoft PowerPoint. Since the tool can directly access the real data, more features and functionality can be added in the future.

Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory

2011-01-01

32

Java Radar Analysis Tool  

NASA Technical Reports Server (NTRS)

Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

Zaczek, Mariusz P.

2005-01-01

33

FSSC Science Tools: Pulsar Analysis  

NASA Technical Reports Server (NTRS)

This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

Thompson, Dave

2010-01-01

34

Marginal Abatement Cost Analysis Tool  

EPA Science Inventory

The Non-CO2 Marginal Abatement Cost Analysis Tool is an extensive bottom-up engineering-economic spreadsheet model capturing the relevant cost and performance data on sectors emitting non-CO2 GHGs. The tool has 24 regions and 7 sectors and produces marginal abatement cost curves...

35

Convolution-deconvolution in DIGES  

SciTech Connect

Convolution and deconvolution operations are an important aspect of SSI analysis since they influence the input to the seismic analysis. This paper documents some of the convolution/deconvolution procedures which have been implemented into the DIGES code. The 1-D propagation of shear and dilatational waves in typical layered configurations involving a stack of layers overlying a rock is treated by DIGES in a similar fashion to that of available codes, e.g. CARES, SHAKE. For certain configurations, however, there is no need to perform such analyses since the corresponding solutions can be obtained in analytic form. Typical cases involve deposits which can be modeled by a uniform halfspace or simple layered halfspaces. For such cases DIGES uses closed-form solutions. These solutions are given for one- as well as two-dimensional deconvolution. The types of waves considered include P, SV and SH waves. Non-vertical incidence is given special attention since deconvolution can be defined differently depending on the problem of interest. For all wave cases considered, corresponding transfer functions are presented in closed form. Transient solutions are obtained in the frequency domain. Finally, a variety of forms are considered for representing the free field motion in terms of both deterministic and probabilistic representations. These include (a) acceleration time histories, (b) response spectra, (c) Fourier spectra and (d) cross-spectral densities.

Philippacopoulos, A.J.; Simos, N. [Brookhaven National Lab., Upton, NY (United States). Dept. of Advanced Technology]

1995-05-01
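
For the simple case of a uniform damped layer over a rigid base, 1-D deconvolution of a vertically propagating SH wave has a textbook closed form: the surface-to-base transfer function is T(w) = 1/cos(w*H/Vs*), with a complex shear velocity Vs* accounting for damping, so the base motion follows from dividing the surface spectrum by T. The sketch below illustrates only that generic closed form, not the DIGES implementation; the layer thickness, shear velocity and damping are placeholders.

    import numpy as np

    def deconvolve_to_base(surface_acc, dt, H, Vs, damping=0.05):
        """Closed-form 1-D deconvolution of a vertically propagating SH wave
        through a uniform damped layer of thickness H over a rigid base,
        using T(w) = 1 / cos(w*H/Vs*) with Vs* = Vs*sqrt(1 + 2i*damping)."""
        n = len(surface_acc)
        w = 2 * np.pi * np.fft.rfftfreq(n, dt)
        Vs_complex = Vs * np.sqrt(1 + 2j * damping)
        T = 1.0 / np.cos(w * H / Vs_complex)         # layer transfer function
        base_spectrum = np.fft.rfft(surface_acc) / T  # deconvolve surface motion
        return np.fft.irfft(base_spectrum, n)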

36

Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis  

PubMed Central

The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue, and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell-specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprise mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908

Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M.

2015-01-01
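
At its core, PSEA regresses each gene's bulk-tissue expression across samples on cell-type reference signals built from marker genes, and the fitted coefficients track cell-type-specific expression. The sketch below is a schematic of that regression step only, with random placeholder data; it is not the published PSEA implementation.

    import numpy as np

    def psea_like_regression(bulk_gene, reference_signals):
        """Regress one gene's bulk-tissue expression across samples on cell-type
        reference signals (e.g. averaged marker-gene expression for neurons,
        astrocytes, oligodendrocytes, microglia); the coefficients serve as
        cell-type-specific expression estimates, in the spirit of PSEA."""
        n_samples = len(bulk_gene)
        X = np.column_stack([np.ones(n_samples), reference_signals])  # intercept + signals
        coefs, *_ = np.linalg.lstsq(X, bulk_gene, rcond=None)
        return coefs   # [intercept, neuron, astrocyte, oligodendrocyte, microglia]

    # Hypothetical data: 20 post mortem samples, 4 cell-type reference signals
    rng = np.random.default_rng(0)
    ref = rng.uniform(0.5, 1.5, size=(20, 4))
    bulk = 2.0 * ref[:, 0] + 0.5 * ref[:, 2] + rng.normal(0, 0.05, 20)  # mostly neuronal
    print(psea_like_regression(bulk, ref))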

37

Computational deconvolution of genome wide expression data from Parkinson's and Huntington's disease brain tissues using population-specific expression analysis.  

PubMed

The characterization of molecular changes in diseased tissues gives insight into pathophysiological mechanisms and is important for therapeutic development. Genome-wide gene expression analysis has proven valuable for identifying biological processes in neurodegenerative diseases using post mortem human brain tissue, and numerous datasets are publicly available. However, many studies utilize heterogeneous tissue samples consisting of multiple cell types, all of which contribute to global gene expression values, confounding biological interpretation of the data. In particular, changes in numbers of neuronal and glial cells occurring in neurodegeneration confound transcriptomic analyses, particularly in human brain tissues where sample availability and controls are limited. To identify cell-specific gene expression changes in neurodegenerative disease, we have applied our recently published computational deconvolution method, population specific expression analysis (PSEA). PSEA estimates cell-type-specific expression values using reference expression measures, which in the case of brain tissue comprise mRNAs with cell-type-specific expression in neurons, astrocytes, oligodendrocytes and microglia. As an exercise in PSEA implementation and hypothesis development regarding neurodegenerative diseases, we applied PSEA to Parkinson's and Huntington's disease (PD, HD) datasets. Genes identified as differentially expressed in substantia nigra pars compacta neurons by PSEA were validated using external laser capture microdissection data. Network analysis and Annotation Clustering (DAVID) identified molecular processes implicated by differential gene expression in specific cell types. The results of these analyses provided new insights into the implementation of PSEA in brain tissues and additional refinement of molecular signatures in human HD and PD. PMID:25620908

Capurro, Alberto; Bodea, Liviu-Gabriel; Schaefer, Patrick; Luthi-Carter, Ruth; Perreau, Victoria M

2014-01-01
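At its core, the PSEA method summarized in the record above regresses the bulk expression of each gene on cell-type reference signals built from marker genes, so that the fitted coefficients approximate cell-type-specific expression. The following minimal Python sketch illustrates only that regression step on simulated data; it is not the published PSEA implementation, and all names and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 40

# Hypothetical reference signals: average expression of marker genes for
# each cell type, one value per tissue sample.
refs = {
    "neuron": rng.normal(10.0, 1.0, n_samples),
    "astrocyte": rng.normal(8.0, 1.0, n_samples),
    "oligodendrocyte": rng.normal(6.0, 1.0, n_samples),
    "microglia": rng.normal(4.0, 1.0, n_samples),
}
X = np.column_stack([np.ones(n_samples)] + list(refs.values()))

# Hypothetical bulk expression of one gene, driven mostly by neurons.
y = 2.0 + 0.8 * refs["neuron"] + rng.normal(0.0, 0.5, n_samples)

# Ordinary least squares: each coefficient estimates the expression of the
# gene attributable to the corresponding cell-type reference signal.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept"] + list(refs), beta):
    print(f"{name:>16s}: {b:+.3f}")
```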

38

VCAT: Visual Crosswalk Analysis Tool  

SciTech Connect

VCAT is a knowledge modeling and analysis tool. It was synthesized from ideas in functional analysis, business process modeling, and complex network science. VCAT discovers synergies by analyzing natural language descriptions. Specifically, it creates visual analytic perspectives that capture intended organization structures, then overlays the serendipitous relationships that point to potential synergies within an organization or across multiple organizations.

Cleland, Timothy J. [Los Alamos National Laboratory; Forslund, David W. [Los Alamos National Laboratory; Cleland, Catherine A. [Los Alamos National Laboratory

2012-08-31

39

Constrained spherical deconvolution analysis of the limbic network in human, with emphasis on a direct cerebello-limbic pathway  

PubMed Central

The limbic system is part of an intricate network involved in several functions such as memory and emotion. Traditionally, the role of the cerebellum was considered to be mainly associated with motor control; however, evidence is accumulating for a role of the cerebellum in learning, emotional control, and mnemonic and behavioral processes, which also involve connections with the limbic system. In 15 normal subjects we studied limbic connections by probabilistic Constrained Spherical Deconvolution (CSD) tractography. The main result of our work was to demonstrate for the first time in the human brain the existence of a direct cerebello-limbic pathway, which had previously been hypothesized but never demonstrated. We also extended our analysis to the other limbic connections, including the cingulate fasciculus, inferior longitudinal fasciculus, uncinate fasciculus, anterior thalamic connections and fornix. Although these pathways have already been described in the tractography literature, we provide reconstruction, quantitative analysis and a Fractional Anisotropy (FA) right-left symmetry comparison using probabilistic CSD tractography, which is known to provide a potential improvement over previously used Diffusion Tensor Imaging (DTI) techniques. The demonstration of the existence of the cerebello-limbic pathway could constitute an important step in the knowledge of the anatomic substrate of non-motor cerebellar functions. Finally, the CSD statistical data about limbic connections in healthy subjects could potentially be useful in the diagnosis of pathological disorders damaging this system. PMID:25538606

Arrigo, Alessandro; Mormina, Enricomaria; Anastasi, Giuseppe Pio; Gaeta, Michele; Calamuneri, Alessandro; Quartarone, Angelo; De Salvo, Simona; Bruschetta, Daniele; Rizzo, Giuseppina; Trimarchi, Fabio; Milardi, Demetrio

2014-01-01

40

WEAT: Web Enabled Analysis Tool  

NSDL National Science Digital Library

Behavioral Risk Factor Surveillance System: The BRFSS, the world's largest telephone survey, tracks health risks in the United States. Information from the survey is used to improve the health of the American people. This tool allows users to create cross-tabulations and perform logistic analysis on these data.

Center for Disease Control

41

Evaluation of process tools in systems analysis  

Microsoft Academic Search

Process tools are used during Systems Analysis to describe the process logic of bubbles in Data Flow Diagrams. We conducted two experiments to determine the relative merits of three process tools: Structured English from the textual tool category; Decision Tables from the tabular tool category; and Nassi-Shneiderman Charts from the graphical tool category. We measured three performance types: tool-based comprehension to find understandability

Narasimhaiah Gorla; Hao-Che Pu; Walter O Rom

1995-01-01

42

Comprehensive analysis of yeast metabolite GC x GC-TOFMS data: combining discovery-mode and deconvolution chemometric software.  

PubMed

The first extensive study of yeast metabolite GC x GC-TOFMS data from cells grown under fermenting (R) and respiring (DR) conditions is reported. In this study, recently developed chemometric software for use with three-dimensional instrumentation data was implemented, using a statistically based Fisher ratio method. The Fisher ratio method is fully automated and will rapidly reduce the data to pinpoint two-dimensional chromatographic peaks differentiating sample types while utilizing all the mass channels. The effect of lowering the Fisher ratio threshold on peak identification was studied. At the lowest threshold (just above the noise level), 73 metabolite peaks were identified, nearly three-fold greater than the number of previously reported metabolite peaks identified (26). In addition to the 73 identified metabolites, 81 unknown metabolites were also located. A Parallel Factor Analysis graphical user interface (PARAFAC GUI) was applied to selected mass channels to obtain a concentration ratio for each metabolite under the two growth conditions. Of the 73 known metabolites identified by the Fisher ratio method, 54 were statistically changing at the 95% confidence limit between the DR and R conditions according to the rigorous Student's t-test. PARAFAC determined the concentration ratio and provided a fully deconvoluted (i.e. mathematically resolved) mass spectrum for each of the metabolites. The combination of the Fisher ratio method with the PARAFAC GUI provides high-throughput software for discovery-based metabolomics research, and is novel for GC x GC-TOFMS data due to the use of the entire data set in the analysis (640 MB x 70 runs, double precision floating point). PMID:17646875

Mohler, Rachel E; Dombek, Kenneth M; Hoggard, Jamin C; Pierce, Karisa M; Young, Elton T; Synovec, Robert E

2007-08-01
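The discovery step described in the record above ranks every signal channel by its Fisher ratio, the between-class variance divided by the pooled within-class variance across the two growth conditions. Below is a minimal, hypothetical two-class sketch of that calculation in Python; the simulated data are illustrative and the sketch does not reproduce the published chemometric software.

```python
import numpy as np

rng = np.random.default_rng(1)
n_per_class, n_vars = 35, 500      # e.g. replicate runs per condition, signal variables

# Hypothetical feature matrices for the two growth conditions; only
# variable 10 truly differs between the classes.
class_r = rng.normal(0.0, 1.0, (n_per_class, n_vars))
class_dr = rng.normal(0.0, 1.0, (n_per_class, n_vars))
class_dr[:, 10] += 3.0

def fisher_ratio(a, b):
    """Between-class variance over pooled within-class variance, per variable."""
    grand = np.vstack([a, b]).mean(axis=0)
    n_a, n_b = len(a), len(b)
    between = n_a * (a.mean(0) - grand) ** 2 + n_b * (b.mean(0) - grand) ** 2  # k - 1 = 1
    within = ((n_a - 1) * a.var(0, ddof=1) + (n_b - 1) * b.var(0, ddof=1)) / (n_a + n_b - 2)
    return between / within

ratios = fisher_ratio(class_r, class_dr)
print("top-ranked variable:", int(ratios.argmax()),
      "Fisher ratio:", round(float(ratios.max()), 1))
```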

43

Failure environment analysis tool applications  

NASA Technical Reports Server (NTRS)

Understanding risks and avoiding failure are daily concerns for the women and men of NASA. Although NASA's mission propels us to push the limits of technology, and though the risks are considerable, the NASA community has instilled within itself the determination to preserve the integrity of the systems upon which our mission and our employees' lives and well-being depend. One of the ways this is being done is by expanding and improving the tools used to perform risk assessment. The Failure Environment Analysis Tool (FEAT) was developed to help engineers and analysts more thoroughly and reliably conduct risk assessment and failure analysis. FEAT accomplishes this by providing answers to questions regarding what might have caused a particular failure, or, conversely, what effect the occurrence of a failure might have on an entire system. Additionally, FEAT can determine what common causes could have resulted in other combinations of failures. FEAT will even help determine the vulnerability of a system to failures, in light of reduced capability. FEAT also is useful in training personnel who must develop an understanding of particular systems. FEAT facilitates training on system behavior by providing an automated environment in which to conduct 'what-if' evaluation. These types of analyses make FEAT a valuable tool for engineers and operations personnel in the design, analysis, and operation of NASA space systems.

Pack, Ginger L.; Wadsworth, David B.

1993-01-01

44

Dynamic Hurricane Data Analysis Tool  

NASA Technical Reports Server (NTRS)

A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter: the mean, standard deviation, median, minimum, and maximum. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models and identify what types of measurements the next generation of instruments will need to collect.

Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

2009-01-01

45

Library Optimization in EDXRF Spectral Deconvolution for Multi-element Analysis of Ambient Aerosols  

EPA Science Inventory

In multi-element analysis of atmospheric aerosols, attempts are made to fit overlapping elemental spectral lines for many elements that may be undetectable in samples due to low concentrations. Fitting with many library reference spectra has the unwanted effect of raising the an...

46

Shot Planning and Analysis Tools  

SciTech Connect

Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high overall operational efficiency of the National Ignition Facility (NIF) by combining near- and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project comprises two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface for selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z

2011-07-25

47

General Mission Analysis Tool (GMAT)  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development. The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system licensed under the NASA Open Source Agreement: free for anyone to use in development of new mission concepts or to improve current missions, freely available in source code form for enhancement or further technology development.

Hughes, Steven P.

2007-01-01

48

Global spatial deconvolution of Lunar Prospector Th abundances  

Microsoft Academic Search

We have completed the first global spatial deconvolution analysis of planetary gamma-ray data for lunar Th abundances as measured by the Lunar Prospector Gamma-ray Spectrometer. We tested two different spatial deconvolution techniques, Jansson's method and the Pixon method, and determined that the Pixon method provides superior performance. The final deconvolved map results in a spatial resolution improvement of

D. J. Lawrence; R. C. Puetter; R. C. Elphic; W. C. Feldman; J. J. Hagerty; T. H. Prettyman; P. D. Spudis

2007-01-01

49

Flow Analysis Tool White Paper  

NASA Technical Reports Server (NTRS)

Faster networks are continually being built to accommodate larger data transfers. While it is intuitive to think that implementing faster networks will result in higher throughput rates, this is often not the case. There are many elements involved in data transfer, many of which are beyond the scope of the network itself. Although networks may get bigger and support faster technologies, the presence of other legacy components, such as older application software or kernel parameters, can often cause bottlenecks. Engineers must be able to identify when data flows are reaching a bottleneck that is not imposed by the network and then troubleshoot it using the tools available to them. The current best practice is to collect as much information as possible on the network traffic flows so that analysis is quick and easy. Unfortunately, no single method of collecting this information can sufficiently capture the whole end-to-end picture. This becomes even more of a hurdle when large, multi-user systems are involved. In order to capture all the necessary information, multiple data sources are required. This paper presents a method for developing a flow analysis tool to effectively collect network flow data from multiple sources and provide that information to engineers in a clear, concise way for analysis. The purpose of this method is to collect enough information to quickly (and automatically) identify poorly performing flows along with the cause of the problem. The method involves the development of a set of database tables that can be populated with flow data from multiple sources, along with an easy-to-use, web-based front-end interface to help network engineers access, organize, analyze, and manage all the information.

Boscia, Nichole K.

2012-01-01

50

Unsupervised Blind Deconvolution  

NASA Astrophysics Data System (ADS)

"Blind" deconvolution is rarely executed blindly. All available methods have parameters which the user fine-tunes until the most visually-appealing reconstruction is achieved. The "art" of deconvolution is to find constraints which allow for the best estimate of an object to be recovered, but in practice these parameterized constraints often reduce deconvolution to the struggle of trial and error. In the course of AFOSR-sponsored activities we are developing a general maximum a posteriori framework for the problem of imaging through atmospheric turbulence, with the emphasis on multi-frame blind deconvolution. Our aim is to develop deconvolution strategy which is reference-less, i.e. no calibration PSF is required, extendable to longer exposures, and applicable to imaging with adaptive optics. In the first part of the project the focus has been on developing a new theory of statistics of images taken through turbulence, both with-, and without adaptive optics. Images and their Fourier transforms have been described as random phasor sums, their fluctuations controlled by wavefront "cells" and moments of the phase. The models were validated using simulations and real data from the 3.5m telescope at the Starfire Optical Range in New Mexico. Another important ingredient of the new framework is the capability to estimate the average PSF automatically from the target observations. A general approach, applicable to any type of object, has been proposed. Here use is made of an object-cancelling transformation of the image sequence. This transformation yields information about the atmospheric PSF. Currently, the PSF estimation module and the theoretical constraints on PSF variability are being incorporated into multi-frame blind deconvolution. In preliminary simulation tests we obtained significantly sharper images with respect to the starting observations and PSF estimates which closely track the input kernels. Thanks to access to the SOR 3.5m telescope we are now testing our deconvolution approach on images of real, extended objects. Adaptive-optics-assisted I-band observations at SOR rarely exceed Strehl ratios of 15% and therefore, in many cases, deconvolution post adaptive optics would be necessary for object identification.

Baena-Galle, R.; Kann, L.; Mugnier, L.; Gudimetla, R.; Johnson, R.; Gladysz, S.

2013-09-01

51

Wavespace-Based Coherent Deconvolution  

NASA Technical Reports Server (NTRS)

Array deconvolution is commonly used in aeroacoustic analysis to remove the influence of a microphone array's point spread function from a conventional beamforming map. Unfortunately, the majority of deconvolution algorithms assume that the acoustic sources in a measurement are incoherent, which can be problematic for some aeroacoustic phenomena with coherent, spatially-distributed characteristics. While several algorithms have been proposed to handle coherent sources, some are computationally intractable for many problems while others require restrictive assumptions about the source field. Newer generalized inverse techniques hold promise, but are still under investigation for general use. An alternate coherent deconvolution method is proposed based on a wavespace transformation of the array data. Wavespace analysis offers advantages over curved-wave array processing, such as providing an explicit shift-invariance in the convolution of the array sampling function with the acoustic wave field. However, usage of the wavespace transformation assumes the acoustic wave field is accurately approximated as a superposition of plane wave fields, regardless of true wavefront curvature. The wavespace technique leverages Fourier transforms to quickly evaluate a shift-invariant convolution. The method is derived for and applied to ideal incoherent and coherent plane wave fields to demonstrate its ability to determine magnitude and relative phase of multiple coherent sources. Multi-scale processing is explored as a means of accelerating solution convergence. A case with a spherical wave front is evaluated. Finally, a trailing edge noise experiment case is considered. Results show the method successfully deconvolves incoherent, partially-coherent, and coherent plane wave fields to a degree necessary for quantitative evaluation. Curved wave front cases warrant further investigation. A potential extension to nearfield beamforming is proposed.

Bahr, Christopher J.; Cattafesta, Louis N., III

2012-01-01
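The wavespace formulation above rests on the property that a shift-invariant convolution of the wave field with the array sampling function becomes a pointwise product after a Fourier transform, which is what enables the fast evaluation. The short sketch below demonstrates that property on a hypothetical one-dimensional example; it illustrates only the convolution theorem, not the full coherent deconvolution algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256
field = rng.normal(size=n)                         # stand-in for a sampled wave field
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 2.0) ** 2)
psf /= psf.sum()                                   # stand-in array sampling function
g = np.fft.ifftshift(psf)                          # kernel centred at index 0

# Shift-invariant (circular) convolution evaluated as a product in wavespace.
fast = np.real(np.fft.ifft(np.fft.fft(field) * np.fft.fft(g)))

# The same convolution evaluated directly, term by term.
slow = np.array([sum(field[m] * g[(k - m) % n] for m in range(n)) for k in range(n)])
print("max |fast - slow| =", float(np.abs(fast - slow).max()))   # ~1e-15
```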

52

Survey of visualization and analysis tools  

NASA Technical Reports Server (NTRS)

A large number of visualization and analysis tools are commercially available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

Meyer, P. J.

1994-01-01

53

Explicit deconvolution of wellbore storage distorted well test data  

E-print Network

The analysis/interpretation of wellbore storage distorted pressure transient test data remains one of the most significant challenges in well test analysis. Deconvolution (i.e., the "conversion" of a variable-rate distorted pressure profile...

Bahabanian, Olivier

2007-04-25

54

Blind image deconvolution  

Microsoft Academic Search

The goal of image restoration is to reconstruct the original scene from a degraded observation. This recovery process is critical to many image processing applications. Although classical linear image restoration has been thoroughly studied, the more difficult problem of blind image restoration has numerous research possibilities. We introduce the problem of blind deconvolution for images, provide an overview of the

DEEPA KUNDUR; D. Hatzinakos

1996-01-01

55

Multi-Wiener SURE-LET deconvolution.  

PubMed

In this paper, we propose a novel deconvolution algorithm based on the minimization of a regularized Stein's unbiased risk estimate (SURE), which is a good estimate of the mean squared error. We linearly parametrize the deconvolution process by using multiple Wiener filters as elementary functions, followed by undecimated Haar-wavelet thresholding. Due to the quadratic nature of SURE and the linear parametrization, the deconvolution problem finally boils down to solving a linear system of equations, which is very fast and exact. The linear coefficients, i.e., the solution of the linear system of equations, constitute the best approximation of the optimal processing on the Wiener-Haar-threshold basis that we consider. In addition, the proposed multi-Wiener SURE-LET approach is applicable for both periodic and symmetric boundary conditions, and can thus be used in various practical scenarios. The very competitive (both in computation time and quality) results show that the proposed algorithm, which can be interpreted as a kind of nonlinear Wiener processing, can be used as a basic tool for building more sophisticated deconvolution algorithms. PMID:23335668

Xue, Feng; Luisier, Florian; Blu, Thierry

2013-05-01

56

ANALYSIS OF INTERFACE STRESSES IN CUTTING TOOL  

Microsoft Academic Search

In this paper Filon's transformed equations have been applied successfully to evaluate the chip-tool interface stresses of a cutting tool from photo-elastic data. This method is shown to be advantageous over the shear difference method, particularly for negative rake tools. The analysis was carried out to study the effect of rake angle and depth of cut on the

R. NAGARAJAN; J. S. RAO

1967-01-01

57

ADVANCED POWER SYSTEMS ANALYSIS TOOLS  

SciTech Connect

The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence, and the heterogeneous and homogeneous interaction of the organically associated elements, must be considered as they are applied to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to the existing EERC spreadsheet application included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the acquired SEM image. The image analysis capability allows a backscattered electron image to be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is dependent on the chemistry of the particle, it is possible to map chemically similar areas which can also be related to the viscosity of that compound at temperature. A second method was also developed to determine the elements associated with the organic matrix of the coals, which are currently determined by chemical fractionation. Mineral compositions and mineral densities can be determined for both included and excluded minerals, as well as the fraction of the ash that will be represented by that mineral on a frame-by-frame basis. The slag viscosity model was improved to provide better predictions of slag viscosity and temperature of critical viscosity for representative Powder River Basin subbituminous and lignite coals.

Robert R. Jensen; Steven A. Benson; Jason D. Laumb

2001-08-31

58

NCI Interactive Budget Analysis Tool  

Cancer.gov

This tool provides users an interactive overview of the National Cancer Institute (NCI) budget and Fact Book data since Fiscal Year 1999. Additional historical NCI budget information can be obtained through the NCI Fact Book Collection.

59

Total variation blind deconvolution  

Microsoft Academic Search

We present a blind deconvolution algorithm based on the total variational (TV) minimization method proposed by Acar and Vogel (1994). The motivation for regularizing with the TV norm is that it is extremely effective for recovering edges of images as well as some blurring functions, e.g., motion blur and out-of-focus blur. An alternating minimization (AM) implicit iterative scheme is devised

Tony F. Chan; Chiu-Kwong Wong

1998-01-01

60

Windprofiler optimization using digital deconvolution procedures  

NASA Astrophysics Data System (ADS)

Digital improvements to data acquisition procedures used for windprofiler radars have the potential to improve height coverage at optimum resolution and to permit improved height resolution. A few newer systems already use this capability. Real-time deconvolution procedures offer even further optimization, and this has not been effectively employed in recent years. In this paper we demonstrate the advantages of combining these features, with particular emphasis on the advantages of real-time deconvolution. Using several multi-core CPUs, we have been able to achieve speeds of up to 40 GHz from a standard commercial motherboard, allowing data to be digitized and processed without the need for any type of hardware except for a transmitter (and associated drivers), a receiver and a digitizer. No Digital Signal Processor chips are needed, allowing great flexibility with analysis algorithms. By using deconvolution procedures, we have been able not only to optimize height resolution, but also to make advances in dealing with spectral contaminants like ground echoes and other near-zero-Hz spectral contamination. Our results also demonstrate the ability to produce fine-resolution measurements, revealing small-scale structures within the backscattered echoes that were previously not possible to see. Resolutions of 30 m are possible for VHF radars. Furthermore, our deconvolution technique allows the removal of range-aliasing effects in real time, a major bonus in many instances. Results are shown using new radars in Canada and Costa Rica.

Hocking, W. K.; Hocking, A.; Hocking, D. G.; Garbanzo-Salas, M.

2014-10-01

61

IMPAIR: massively parallel deconvolution on the GPU  

NASA Astrophysics Data System (ADS)

The IMPAIR software is a high throughput image deconvolution tool for processing large out-of-core datasets of images, varying from large images with spatially varying PSFs to large numbers of images with spatially invariant PSFs. IMPAIR implements a parallel version of the tried and tested Richardson-Lucy deconvolution algorithm regularised via a custom wavelet thresholding library. It exploits the inherently parallel nature of the convolution operation to achieve quality results on consumer grade hardware: through the NVIDIA Tesla GPU implementation, the multi-core OpenMP implementation, and the cluster computing MPI implementation of the software. IMPAIR aims to address the problem of parallel processing in both top-down and bottom-up approaches: by managing the input data at the image level, and by managing the execution at the instruction level. These combined techniques will lead to a scalable solution with minimal resource consumption and maximal load balancing. IMPAIR is being developed as both a stand-alone tool for image processing, and as a library which can be embedded into non-parallel code to transparently provide parallel high throughput deconvolution.

Sherry, Michael; Shearer, Andy

2013-02-01
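For reference, the Richardson-Lucy iteration that IMPAIR parallelizes multiplies the current estimate by the back-projected ratio of the observed data to the reblurred estimate. The sketch below is a minimal single-threaded, one-dimensional version on simulated data, with circular FFT convolution standing in for the spatially varying PSF handling, wavelet regularization, and GPU/MPI execution of the actual tool.

```python
import numpy as np

def fft_convolve(x, kernel):
    """Circular convolution via the FFT (kernel centred at index 0)."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(kernel)))

def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
    """Plain Richardson-Lucy iteration for a non-negative signal."""
    estimate = np.full_like(observed, observed.mean())
    psf_adjoint = np.roll(psf[::-1], 1)            # time-reversed kernel
    for _ in range(n_iter):
        blurred = fft_convolve(estimate, psf)
        ratio = observed / np.maximum(blurred, eps)
        estimate = estimate * fft_convolve(ratio, psf_adjoint)
    return estimate

n = 128
truth = np.zeros(n)
truth[40] = 1.0                                    # a sharp feature...
truth[60:90] = 0.2                                 # ...next to a broad plateau
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
psf = np.fft.ifftshift(psf / psf.sum())            # unit-sum kernel centred at index 0

observed = fft_convolve(truth, psf)
restored = richardson_lucy(observed, psf)
print("peak height blurred vs restored:",
      round(float(observed.max()), 3), round(float(restored.max()), 3))
```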

62

Target deconvolution techniques in modern phenotypic profiling  

PubMed Central

The past decade has seen rapid growth in the use of diverse compound libraries in classical phenotypic screens to identify modulators of a given process. The subsequent process of identifying the molecular targets of active hits, also called target deconvolution, is an essential step for understanding compound mechanism of action and for using the identified hits as tools for further dissection of a given biological process. Recent advances in omics technologies, coupled with in silico approaches and the reduced cost of whole genome sequencing, have greatly improved the workflow of target deconvolution and have contributed to a renaissance of modern phenotypic profiling. In this review, we will outline how both new and old techniques are being used in the difficult process of target identification and validation as well as discuss some of the ongoing challenges remaining for phenotypic screening. PMID:23337810

Lee, Jiyoun; Bogyo, Matthew

2013-01-01

63

Thermomechanical stress analysis of superplastic forming tools  

Microsoft Academic Search

A thermomechanical stress analysis of a superplastic forming (SPF) tool is performed by means of the finite element simulation of the whole forming process. The distributions of residual stress and distortion within the tool are investigated in order to evaluate the damage effects of thermomechanical loading. The effect of cyclic loading is related to the fact that residual stress and

C. Y. Gao; P. Lours; G. Bernhart

2005-01-01

64

Rapid gas chromatographic analysis of less abundant compounds in distilled spirits by direct injection with ethanol-water venting and mass spectrometric data deconvolution.  

PubMed

The principal trace secondary compounds common to fermentation-derived distilled spirits can be rapidly quantified by directly injecting 5 μL of spirit, without sample preparation, onto a narrow-bore 0.15 mm internal diameter capillary column. The ethanol-water is removed in an initial solvent venting step using a programmed temperature vapourization injector, followed by splitless transfer of the target analytes to the column. The larger injection facilitates trace analysis, and ethanol-water removal extends column lifetime. Problems of coelution between analytes or with the sample matrix were surmounted by using mass spectral deconvolution software for quantification. All operations in the analysis, from injection with solvent venting to data reduction, are fully automated for unattended sequential sample analysis. The synergy of the various contributory steps combines to offer an effective novel solution for this analysis. Applications include quantification of low-ppm amounts of acids and esters and sub-ppm profiling of trace compounds from both the raw material malt and ageing in wood barrels. PMID:19959175

Macnamara, Kevin; Lee, Michelle; Robbat, Albert

2010-01-01

65

Statistical Tools for Forensic Analysis of Toolmarks  

SciTech Connect

Recovery and comparison of toolmarks, footprint impressions, and fractured surfaces connected to a crime scene are of great importance in forensic science. The purpose of this project is to provide statistical tools for the validation of the proposition that particular manufacturing processes produce marks on the work-product (or tool) that are substantially different from tool to tool. The approach to validation involves the collection of digital images of toolmarks produced by various tool manufacturing methods on produced work-products and the development of statistical methods for data reduction and analysis of the images. The developed statistical methods provide a means to objectively calculate a ''degree of association'' between matches of similarly produced toolmarks. The basis for statistical method development relies on ''discriminating criteria'' that examiners use to identify features and spatial relationships in their analysis of forensic samples. The developed data reduction algorithms utilize the same rules used by examiners for classification and association of toolmarks.

David Baldwin; Max Morris; Stan Bajic; Zhigang Zhou; James Kreiser

2004-04-22

66

Sequential deconvolution input reconstruction  

NASA Astrophysics Data System (ADS)

The reconstruction of inputs from measured outputs is examined. It is shown that the rank deficiency that arises in deconvolving non-collocated arrangements is associated with a kernel that is non-zero only over the part of the time axis where delay from wave propagation prevents uniqueness. Input deconvolution, therefore, follows in the same manner for collocated and non-collocated scenarios, collocation being the special case where the prediction lag can be zero. This paper illustrates that deconvolution carried out on a sliding window is a conditionally stable process, and the condition for stability is derived. Examination of the Cramer-Rao Lower Bound of the inputs in frequency shows that the inference model should be formulated such that the spectra of the inputs to be reconstructed, and of the realized measurement noise, are within the model bandwidth. An expression for the error in the reconstructed input as a function of the noise sequence is developed and is used to control the regularization, when regularization is needed. The paper brings attention to the fact that finite-dimensional models cannot display true dead time and that failure to recognize this has led to algorithms that, in general, propose to violate the physical constraints.

Bernal, Dionisio; Ussia, Alessia

2015-01-01

67

Liquid chromatography with diode array detection combined with spectral deconvolution for the analysis of some diterpene esters in Arabica coffee brew.  

PubMed

In this manuscript, the separation of kahweol and cafestol esters from Arabica coffee brews was investigated using liquid chromatography with a diode array detector. When detected in conjunction, cafestol and kahweol esters were eluted together, but, after optimization, the kahweol esters could be selectively detected by setting the wavelength at 290 nm to allow their quantification. Such an approach was not possible for the cafestol esters, and spectral deconvolution was used to obtain deconvoluted chromatograms. In each of those chromatograms, the four esters were baseline separated, allowing for the quantification of the eight targeted compounds. Because kahweol esters could be quantified either using the chromatogram obtained by setting the wavelength at 290 nm or using the deconvoluted chromatogram, those compounds were used to compare the analytical performances. Slightly better limits of detection were obtained using the deconvoluted chromatogram. Identical concentrations were found in a real sample with both approaches. The peak areas in the deconvoluted chromatograms were repeatable (intraday repeatability of 0.8%, interday repeatability of 1.0%). This work demonstrates the accuracy of spectral deconvolution when using liquid chromatography to mathematically separate coeluting compounds using the full spectra recorded by a diode array detector. PMID:25521818

Erny, Guillaume L; Moeenfard, Marzieh; Alves, Arminda

2015-02-01
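The spectral deconvolution step described above amounts to modelling each recorded diode-array spectrum as a linear mixture of the component spectra and solving for the component contributions at every time point, which yields one deconvoluted chromatogram per compound. The following sketch shows that idea with hypothetical spectra and non-negative least squares; it is not the algorithm used in the study, and all spectra, retention profiles and noise levels are invented.

```python
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(220, 320, 101)          # nm
gauss = lambda mu, sig: np.exp(-0.5 * ((wavelengths - mu) / sig) ** 2)

# Hypothetical reference spectra of two co-eluting esters (columns of S).
S = np.column_stack([gauss(255, 12), gauss(290, 10)])

# Hypothetical elution profiles and the measured data matrix (time x wavelength).
t = np.linspace(0.0, 1.0, 200)
profiles = np.column_stack([np.exp(-0.5 * ((t - 0.45) / 0.05) ** 2),
                            0.6 * np.exp(-0.5 * ((t - 0.50) / 0.05) ** 2)])
D = profiles @ S.T + np.random.default_rng(3).normal(0.0, 0.01, (len(t), len(wavelengths)))

# "Deconvoluted" chromatograms: at each time point, solve S @ c ~= spectrum
# for the non-negative component contributions c.
deconvoluted = np.array([nnls(S, row)[0] for row in D])
print("recovered peak areas:", deconvoluted.sum(axis=0).round(1))
```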

68

Stochastic Simulation Tool for Aerospace Structural Analysis  

NASA Technical Reports Server (NTRS)

Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

Knight, Norman F.; Moore, David F.

2006-01-01

69

The Galileo Fault Tree Analysis Tool  

Microsoft Academic Search

We present Galileo, a dynamic fault tree modeling and analysis tool that combines the innovative DIFTree analysis methodology with a rich user interface built using package-oriented programming. DIFTree integrates binary decision diagram and Markov methods under the common notation of dynamic fault trees, allowing the user to exploit the benefits of both techniques while avoiding the need

Kevin J. Sullivan; Joanne Bechta Dugan; David Coppit

1999-01-01

70

A network evaluation and analysis tool  

SciTech Connect

The rapid emergence of large heterogeneous networks, distributed systems, and massively parallel computers has resulted in economies of scale, enhanced productivity, efficient communication, resource sharing, and increased reliability, which are computationally beneficial. In addition to these benefits, networking presents technical challenges and problems with respect to maintaining and ensuring the security, design, compatibility, integrity, functionality, and management of these systems. In this paper we describe a computer security tool, Network Evaluation and Analysis Tool (NEAT), that we have developed to address these concerns.

Stoltz, L.A.; Whiteson, R.; Fasel, P.K.; Temple, R.; Dreicer, J.S.

1993-01-01

71

A network evaluation and analysis tool  

SciTech Connect

The rapid emergence of large heterogeneous networks, distributed systems, and massively parallel computers has resulted in economies of scale, enhanced productivity, efficient communication, resource sharing, and increased reliability, which are computationally beneficial. In addition to these benefits, networking presents technical challenges and problems with respect to maintaining and ensuring the security, design, compatibility, integrity, functionality, and management of these systems. In this paper we describe a computer security tool, Network Evaluation and Analysis Tool (NEAT), that we have developed to address these concerns.

Stoltz, L.A.; Whiteson, R.; Fasel, P.K.; Temple, R.; Dreicer, J.S.

1993-05-01

72

Built Environment Energy Analysis Tool Overview (Presentation)  

SciTech Connect

This presentation provides an overview of the Built Environment Energy Analysis Tool, which is designed to assess impacts of future land use/built environment patterns on transportation-related energy use and greenhouse gas (GHG) emissions. The tool can be used to evaluate a range of population distribution and urban design scenarios for 2030 and 2050. This tool was produced as part of the Transportation Energy Futures (TEF) project, a Department of Energy-sponsored multi-agency project initiated to pinpoint underexplored strategies for abating GHGs and reducing petroleum dependence related to transportation.

Porter, C.

2013-04-01

73

Accuracy of deconvolution analysis based on singular value decomposition for quantification of cerebral blood flow using dynamic susceptibility contrast-enhanced magnetic resonance imaging.  

PubMed

Deconvolution analysis (DA) based on singular value decomposition (SVD) has been widely accepted for quantification of cerebral blood flow (CBF) using dynamic susceptibility contrast-enhanced magnetic resonance imaging (DSC-MRI). When using this method, the elements in the diagonal matrix obtained by SVD are set to zero when they are smaller than the threshold value given beforehand. In the present study, we investigated the effect of the threshold value on the accuracy of the CBF values obtained by this method using computer simulations. We also investigated the threshold value giving the CBF closest to the assumed value (optimal threshold value) under various conditions. The CBF values obtained by this method largely depended on the threshold value. Both the mean and the standard deviation of the estimated CBF values decreased with increasing threshold value. The optimal threshold value decreased with increasing signal-to-noise ratio and CBF, and increased with increasing cerebral blood volume. Although delay and dispersion in the arterial input function also affected the relationship between the estimated CBF and threshold values, the optimal threshold value tended to be nearly constant. In conclusion, our results suggest that the threshold value should be carefully considered when quantifying CBF in terms of absolute values using DSC-MRI for DA based on SVD. We believe that this study will be helpful in selecting the threshold value in SVD. PMID:11768497

Murase, K; Shinohara, M; Yamazaki, Y

2001-12-01
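The method evaluated above deconvolves the tissue concentration curve by inverting a convolution matrix built from the arterial input function via SVD, zeroing the diagonal (singular) values that fall below a chosen threshold. A minimal sketch with hypothetical curves and units is given below; note how the result depends on the threshold fraction, which is the effect studied in the record.

```python
import numpy as np

dt = 1.0                                          # s, hypothetical sampling interval
t = np.arange(0, 60, dt)

# Hypothetical arterial input function (gamma-variate-like) and tissue curve.
aif = t ** 3 * np.exp(-t / 1.5)
aif /= aif.max()
residue_true = np.exp(-t / 4.0)                   # flow-scaled residue function
tissue = dt * np.convolve(aif, residue_true)[: len(t)]
tissue += np.random.default_rng(4).normal(0.0, 0.01, len(t))

# Lower-triangular convolution matrix built from the AIF.
A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(len(t))]
                   for i in range(len(t))])

# SVD with a threshold on the diagonal matrix: singular values below a
# chosen fraction of the largest one are set to zero before inversion.
U, s, Vt = np.linalg.svd(A)
threshold = 0.2                                   # fraction of the largest singular value
s_inv = np.where(s > threshold * s[0], 1.0 / s, 0.0)
residue_est = Vt.T @ (s_inv * (U.T @ tissue))
print("peak of estimated residue (proportional to CBF):",
      round(float(residue_est.max()), 3))
```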

74

Performance Analysis of GYRO: A Tool Evaluation  

SciTech Connect

The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

2005-06-26

75

Photogrammetry Tool for Forensic Analysis  

NASA Technical Reports Server (NTRS)

A system allows crime scene and accident scene investigators the ability to acquire visual scene data using cameras for processing at a later time. This system uses a COTS digital camera, a photogrammetry calibration cube, and 3D photogrammetry processing software. In a previous instrument developed by NASA, the laser scaling device made use of parallel laser beams to provide a photogrammetry solution in 2D. This device and associated software work well under certain conditions. In order to make use of a full 3D photogrammetry system, a different approach was needed. When using multiple cubes, whose locations relative to each other are unknown, a procedure that would merge the data from each cube would be as follows: 1. One marks a reference point on cube 1, then marks points on cube 2 as unknowns. This locates cube 2 in cube 1's coordinate system. 2. One marks reference points on cube 2, then marks points on cube 1 as unknowns. This locates cube 1 in cube 2's coordinate system. 3. This procedure is continued for all combinations of cubes. 4. The coordinates of all of the found coordinate systems are then merged into a single global coordinate system. In order to achieve maximum accuracy, measurements are done in one of two ways, depending on scale: when measuring the size of objects, the coordinate system corresponding to the nearest cube is used, or when measuring the location of objects relative to a global coordinate system, a merged coordinate system is used. Presently, traffic accident analysis is time-consuming and not very accurate. Using cubes with differential GPS would give absolute positions of cubes in the accident area, so that individual cubes would provide local photogrammetry calibration to objects near a cube.

Lane, John

2012-01-01

76

Design and Analysis Tools for Supersonic Inlets  

NASA Technical Reports Server (NTRS)

Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

Slater, John W.; Folk, Thomas C.

2009-01-01

77

UMMPerfusion: an open source software tool towards quantitative MRI perfusion analysis in clinical routine.  

PubMed

To develop a generic Open Source MRI perfusion analysis tool for quantitative parameter mapping to be used in a clinical workflow and methods for quality management of perfusion data. We implemented a classic, pixel-by-pixel deconvolution approach to quantify T1-weighted contrast-enhanced dynamic MR imaging (DCE-MRI) perfusion data as an OsiriX plug-in. It features parallel computing capabilities and an automated reporting scheme for quality management. Furthermore, by our implementation design, it could be easily extendable to other perfusion algorithms. Obtained results are saved as DICOM objects and directly added to the patient study. The plug-in was evaluated on ten MR perfusion data sets of the prostate and a calibration data set by comparing obtained parametric maps (plasma flow, volume of distribution, and mean transit time) to a widely used reference implementation in IDL. For all data, parametric maps could be calculated and the plug-in worked correctly and stable. On average, a deviation of 0.032 ± 0.02 ml/100 ml/min for the plasma flow, 0.004 ± 0.0007 ml/100 ml for the volume of distribution, and 0.037 ± 0.03 s for the mean transit time between our implementation and a reference implementation was observed. By using computer hardware with eight CPU cores, calculation time could be reduced by a factor of 2.5. We developed successfully an Open Source OsiriX plug-in for T1-DCE-MRI perfusion analysis in a routine quality managed clinical environment. Using model-free deconvolution, it allows for perfusion analysis in various clinical applications. By our plug-in, information about measured physiological processes can be obtained and transferred into clinical practice. PMID:22832894

Zöllner, Frank G; Weisser, Gerald; Reich, Marcel; Kaiser, Sven; Schoenberg, Stefan O; Sourbron, Steven P; Schad, Lothar R

2013-04-01

78

Mars Reconnaissance Orbiter Uplink Analysis Tool  

NASA Technical Reports Server (NTRS)

This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, this analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via a Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions also can use the tools to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and the window keep-out times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.

Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

2008-01-01

79

Integrated Drill Core Data Analysis Tools  

NASA Astrophysics Data System (ADS)

Data management in scientific drilling programs such as IODP, ICDP, and ANDRILL is applied to capture drilling and science data during an expedition and for long-term data storage and dissemination. Currently, data management tools are linked directly with capture and visualization applications to allow for both a two-way flow of data between the database and the applications, and an integrated data environment. The new system has meanwhile been tested by recent IODP and ICDP projects. The components comprise the Expedition Drilling Information System (ExpeditionDIS), used for data acquisition; PSICAT, the Paleontological Stratigraphic Interval Construction and Analysis Tool, for graphical editing and viewing of core description diagrams; and Corelyzer, part of CoreWall, for scalable, extensible visualization, developed to enhance the study of geological cores. This interoperable configuration of tools provides an excellent all-in-one toolbox for core analysis.

Conze, Ronald; Reed, Josh; Chen, Yu-Chung; Krysiak, Frank

2010-05-01

80

The CO5BOLD analysis tool.  

NASA Astrophysics Data System (ADS)

The interactive IDL-based CO5BOLD Analysis Tool (CAT) was developed to facilitate an easy and quick analysis of numerical simulation data produced with the 2D/3D radiation magnetohydrodynamics code CO5BOLD. The basic mode of operation is the display and analysis of cross-sections through a model either as 2D slices or 1D graphs. A wide range of physical quantities can be selected. Further features include the export of models into VAPOR format or the output of images and animations. A short overview including scientific analysis examples is given.

Wedemeyer, S.

81

Accelerator physics analysis with interactive tools  

SciTech Connect

Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

Holt, J.A.; Michelotti, L.

1993-05-01

82

Errata: Response Analysis and Error Diagnosis Tools.  

ERIC Educational Resources Information Center

This guide to ERRATA, a set of HyperCard-based tools for response analysis and error diagnosis in language testing, is intended as a user manual and general reference and designed to be used with the software (not included here). It has three parts. The first is a brief survey of computational techniques available for dealing with student test

Hart, Robert S.

83

lmbench: Portable Tools for Performance Analysis  

Microsoft Academic Search

lmbench is a micro-benchmark suite designed to focus attention on the basic building blocks of many common system applications, such as databases, simulations, software development, and networking. In almost all cases, the individual tests are the result of analysis and isolation of a customer's actual performance problem. These tools can be, and currently are, used

Larry W. Mcvoy; Carl Staelin

1996-01-01

84

Integrated multidisciplinary analysis tool IMAT users' guide  

NASA Technical Reports Server (NTRS)

The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.

Meissner, Frances T. (editor)

1988-01-01

85

Paramedir: A Tool for Programmable Performance Analysis  

NASA Technical Reports Server (NTRS)

Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

2004-01-01

86

The Galileo Fault Tree Analysis Tool  

E-print Network

We present Galileo, a dynamic fault tree modeling and analysis tool that combines the innovative DIFTree analysis methodology with a rich user interface built using package-oriented programming. DIFTree transparently integrates binary decision diagram and Markov methods under the common notation of dynamic fault trees, allowing the user to exploit the benefits of both techniques while avoiding the need to learn additional notations and methodologies. Package-oriented programming (POP) is a novel software architectural style, in which large-scale software packages are used as components, exploiting their rich functionality and familiarity to users. Galileo can be obtained for free under license for evaluation, and can be downloaded from the World-Wide Web. Keywords: Galileo, dynamic fault tree, DIFtree, software tools, component-based, package-oriented

Joanne Bechta Dugan; Kevin J. Sullivan; David Coppit; Kevin J. Sullivan; Joanne Bechta Dugan

1999-01-01

87

Challenges Facing Design and Analysis Tools  

NASA Technical Reports Server (NTRS)

The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.

Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)

2001-01-01

88

Industrial Geospatial Analysis Tool for Energy Evaluation  

E-print Network

The tool performs statistical analysis of multiple databases to estimate manufacturing plants' energy consumption for over 300,000 manufacturers across the U.S. and provides geospatial interlinking to Google Earth using MATLAB-based mapping tools, following a "bottom-up" approach for the statistical model. The remaining diagram labels indicate a computational module and a geospatial module (Google Earth) that output geospatial coordinates and an energy-intensity indicator (ELI, MWh/$) per ZIP code.

Alkadi, N.; Starke, M.; Ma, O.; Nimbalkar, S.; Cox, D.; Dowling, K.; Johnson, B.; Khan, S.

2013-01-01

89

Tectonic Analysis of Esh El-Mallaha Area, Gulf of Suez Using Euler Deconvolution for Aeromagnetic Data  

NASA Astrophysics Data System (ADS)

Esh El-Mallaha area is located on the western coast of the Gulf of Suez, which is considered the main source of hydrocarbon resources in Egypt. The main exploration problem of the Gulf of Suez (and the surrounding areas) is the presence of Pre-Miocene salt, which masks seismic energy; as a result, the seismic method is usually unable to provide information about the subsurface structure. A solution may be found in potential field methods such as magnetics, which are highly sensitive to the basement and are not affected by salt. Herein, aeromagnetic data over the Esh El-Mallaha area have been interpreted to provide a new look at the subsurface structure and tectonics of the area. This interpretation includes the application of the Euler method, which is considered an efficient tool in magnetic interpretation. Comparing the results of the Euler method with the available geologic data (wells, geologic maps), the method facilitates the identification of new faults as well as the mapping of faults already known from geologic information. Generally, the area is characterized by two basin structures trending NW-SE (parallel to the Gulf of Suez). These two basins are separated by a high topographic feature (the Esh El-Mallaha range) and are bounded by faults that are most probably of normal type.
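As an illustration of the kind of computation Euler's method involves, the sketch below solves Euler's homogeneity equation by least squares over a single window of magnetic observations. It is a generic textbook formulation, not the authors' implementation; the structural index N, the coordinate arrays, and the pre-computed field gradients are all assumed inputs.

```python
import numpy as np

def euler_window(x, y, z, T, Tx, Ty, Tz, N=1.0):
    """Solve Euler's homogeneity equation,
        (x - x0)*Tx + (y - y0)*Ty + (z - z0)*Tz = N*(B - T),
    by least squares over one window of observations.

    x, y, z    : observation coordinates (1-D arrays of equal length)
    T          : total-field anomaly values in the window
    Tx, Ty, Tz : field gradients along x, y, z (assumed pre-computed)
    N          : assumed structural index (e.g. 1 for a dike-like source)

    Returns (x0, y0, z0, B): estimated source location and regional field.
    """
    # Rearranged form: x0*Tx + y0*Ty + z0*Tz + N*B = x*Tx + y*Ty + z*Tz + N*T
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + N * T
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    x0, y0, z0, B = sol
    return x0, y0, z0, B
```

In practice the window is slid across the grid and only well-conditioned solutions are kept, but that bookkeeping is omitted here.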

Aboud, E.; Ushijima, K.

2004-05-01

90

Tools for Next Generation Sequencing Data Analysis  

PubMed Central

As NGS technology continues to improve, the amount of data generated per run grows exponentially. Unfortunately, the primary bottleneck in NGS studies is still bioinformatics analysis. Not all researchers have access to a bioinformatics core or dedicated bioinformatician. Additionally, much of the software for NGS analyses is written to run in a Unix / Linux environment. Researchers unfamiliar with the Unix command line may be unable to use these tools, or face a steep learning curve in trying to do so. Commercial packages exist, such as the CLC Genomics Workbench, DNANexus, and GenomeQuest. However, these commercial packages often incorporate proprietary algorithms to perform data analysis and may be costly. Galaxy provides a solution to this problem by incorporating popular open-source and community linux command line tools into an easy to use web-based environment. After sequence data has been uploaded and mapped, there are a variety of workflows for NGS analyses that use open-source tools. This includes peak-calling analyses for ChIP-Seq (MACS, GeneTrack indexer, Peak predictor), RNA-Seq (Tophat, Cufflinks), and finding small insertions, deletions, and SNPs using SAMtools. Any researcher can apply a workflow to his NGS data and retrieve results, without having to interact with a command line. Additionally, since Galaxy is cloud-based, expensive computing hardware for performing analyses is not needed. In this presentation we will provide an overview of two popular open source RNA-Seq analysis tools, Tophat and Cufflinks, and demonstrate how they can be used in Galaxy.

Bodi, K.

2011-01-01

91

ATOM: a system for building customized program analysis tools  

Microsoft Academic Search

ATOM (Analysis Tools with OM) is a single framework for building a wide range of customized program analysis tools. It provides the common infrastructure present in all code-instrumenting tools; this is the difficult and time-consuming part. The user simply defines the tool-specific details in instrumentation and analysis routines. Building a basic block counting tool like Pixie with ATOM requires only

Amitabh Srivastava; Alan Eustace

1994-01-01

92

9. Analysis a. Analysis tools for dam removal  

E-print Network

9. Analysis; a. Analysis tools for dam removal; v. Hydrodynamic, sediment transport and physical models. Sediment transport and related effects are frequently the main concerns associated with a dam removal due to the possible effects on infrastructure; the main options for handling reservoir sediment when removing a dam are river erosion, mechanical removal, and stabilization (ASCE 1997).

Tullos, Desiree

93

Modeling of pharmacokinetic systems using stochastic deconvolution.  

PubMed

In environments where complete mechanistic knowledge of the system dynamics is not available, a synergy of first-principle concepts, stochastic methods and statistical approaches can provide an efficient, accurate, and insightful strategy for model development. In this work, a system of ordinary differential equations describing system pharmacokinetics (PK) was coupled to a Wiener process for tracking the absorption rate coefficient, and was embedded in a nonlinear mixed effects population PK formalism. The procedure is referred to as "stochastic deconvolution" and it is proposed as a diagnostic tool to inform on a mapping function between the fraction of the drug absorbed and the fraction of the drug dissolved when applying one-stage methods to in vitro-in vivo correlation modeling. The goal of this work was to show that stochastic deconvolution can infer an a priori specified absorption profile given dense observational (simulated) data. The results demonstrate that the mathematical model is able to accurately reproduce the simulated data in scenarios where solution strategies for linear, time-invariant systems would assuredly fail. To this end, PK systems that are representative of Michaelis-Menten kinetics and enterohepatic circulation were investigated. Furthermore, the solution times are manageable using a modest computer hardware platform. PMID:24174399
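To make the idea concrete, here is a minimal forward simulation (not the paper's population-PK estimation machinery) of a one-compartment model whose absorption rate coefficient drifts as a Wiener process in log space, stepped with Euler-Maruyama. All parameter values and variable names are illustrative assumptions.

```python
import numpy as np

def simulate_stochastic_absorption(ka0=1.0, ke=0.2, V=10.0, sigma=0.3,
                                   dose=100.0, t_end=24.0, dt=0.01, seed=0):
    """Euler-Maruyama simulation of a one-compartment PK model whose
    absorption rate coefficient ka(t) follows a Wiener process in log space
    (keeping ka positive). Parameter values are illustrative only."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n + 1)
    A = np.empty(n + 1)   # drug amount remaining at the absorption site
    C = np.empty(n + 1)   # plasma concentration
    ka = np.empty(n + 1)  # time-varying absorption rate coefficient
    A[0], C[0], ka[0] = dose, 0.0, ka0
    log_ka = np.log(ka0)
    for i in range(n):
        # Wiener increment on log ka
        log_ka += sigma * np.sqrt(dt) * rng.standard_normal()
        ka[i + 1] = np.exp(log_ka)
        # first-order absorption and elimination (explicit Euler step)
        A[i + 1] = A[i] - ka[i] * A[i] * dt
        C[i + 1] = C[i] + (ka[i] * A[i] / V - ke * C[i]) * dt
    return t, A, C, ka
```

In the paper's setting this kind of state model sits inside a nonlinear mixed-effects framework and ka(t) is inferred from data rather than simulated forward.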

Kakhi, Maziar; Chittenden, Jason

2013-12-01

94

DISCRETE FUNCTIONAL ANALYSIS TOOLS FOR DISCONTINUOUS GALERKIN METHODS WITH APPLICATION  

E-print Network

Discrete functional analysis tools are used to prove the convergence of Discontinuous Galerkin approximations, among them a conservative formulation based on a nonstandard modification of the pressure.

Paris-Sud XI, Université de

95

Blind Deconvolution via Sequential Imputations  

Microsoft Academic Search

The sequential imputation procedure is applied to adaptively and sequentially reconstruct discrete input signals that are blurred by an unknown linear moving average channel and contaminated by additive Gaussian noises, a problem known as blind deconvolution in digital communication. A rejuvenation procedure for improving the efficiency of sequential imputation is introduced and theoretically justified. The proposed method does not require

Jun S. Liu; Rong Chen

1995-01-01

96

Deconvolution using the complex cepstrum  

SciTech Connect

The theory, description, and implementation of a generalized linear filtering system for the nonlinear filtering of convolved signals are presented. A detailed look at the problems and requirements associated with the deconvolution of signal components is undertaken. Related properties are also developed. A synthetic example is shown and is followed by an application using real seismic data. 29 figures.
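For readers unfamiliar with the technique, the following sketch computes a basic complex cepstrum with NumPy and keeps only the low-quefrency part, which is the usual route to separating a smooth wavelet from a spiky reflectivity series. It is a bare-bones illustration; practical implementations also remove the linear phase component and apply windowing, both omitted here.

```python
import numpy as np

def complex_cepstrum(x):
    """Complex cepstrum: inverse FFT of log magnitude plus unwrapped phase.
    Minimal version; the linear-phase term is not removed."""
    X = np.fft.fft(x)
    log_spectrum = np.log(np.abs(X) + 1e-12) + 1j * np.unwrap(np.angle(X))
    return np.fft.ifft(log_spectrum).real

def low_quefrency_lifter(cep, cutoff):
    """Keep only the first (and symmetric last) 'cutoff' quefrency samples,
    a crude separation of the slowly varying, wavelet-like component."""
    out = np.zeros_like(cep)
    out[:cutoff] = cep[:cutoff]
    if cutoff > 1:
        out[-(cutoff - 1):] = cep[-(cutoff - 1):]
    return out
```

Exponentiating the FFT of the liftered cepstrum and inverse-transforming recovers an estimate of the separated component.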

Riley, H B

1980-12-01

97

Data Analysis with Graphical Models: Software Tools  

NASA Technical Reports Server (NTRS)

Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

Buntine, Wray L.

1994-01-01

98

Enhancement of Local Climate Analysis Tool  

NASA Astrophysics Data System (ADS)

The National Oceanographic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level such as time series analysis, trend analysis, compositing, correlation and regression techniques, with others to be incorporated as needed. LCAT uses principles of Artificial Intelligence in connecting human and computer perceptions on application of data and scientific techniques in multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).

Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

2012-12-01

99

GIS-based hydrogeochemical analysis tools (QUIMET)  

NASA Astrophysics Data System (ADS)

A software platform (QUIMET) was developed to improve the sorting, analysis, calculations, visualizations, and interpretations of hydrogeochemical data in a GIS environment. QUIMET is composed of a geospatial database plus a set of tools specially designed for graphical and statistical analysis of hydrogeochemical data. The geospatial database has been designed to include organic and inorganic chemical records, as well as relevant physical parameters (temperature, Eh, electrical conductivity). The instruments for analysis cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data. They include, among others, chemical time-series analysis, ionic balance calculations, correlation of chemical parameters, and calculation of various common hydrogeochemical diagrams (salinity, Schoeller-Berkaloff, Piper, and Stiff). The GIS platform allows the generation of maps of the spatial distribution of parameters and diagrams. Moreover, it allows a complete statistical analysis of the data to be performed, including descriptive statistics and univariate and bivariate analysis, the latter including generation of correlation matrices and graphics. Finally, QUIMET offers interoperability with other external platforms. The platform is illustrated with a geochemical data set from the city of Badalona, located on the Mediterranean coast in NE Spain.
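Of the calculations listed, the ionic (charge) balance is the simplest to show; the snippet below applies the standard formula to concentrations expressed in meq/L. It is a generic hydrogeochemistry calculation, not code taken from QUIMET, and the example ion values are invented.

```python
def charge_balance_error(cations_meq, anions_meq):
    """Charge-balance error (%) from major-ion concentrations in meq/L.
    cations_meq / anions_meq: dicts mapping ion name -> concentration."""
    s_cat = sum(cations_meq.values())
    s_an = sum(anions_meq.values())
    return 100.0 * (s_cat - s_an) / (s_cat + s_an)

# Made-up sample; an error within roughly +/-5 % is usually considered acceptable.
cbe = charge_balance_error(
    {"Ca": 4.2, "Mg": 1.9, "Na": 3.1, "K": 0.1},
    {"HCO3": 5.0, "Cl": 2.8, "SO4": 1.3},
)
```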

Velasco, V.; Tubau, I.; Vázquez-Suñé, E.; Gogu, R.; Gaitanaru, D.; Alcaraz, M.; Serrano-Juan, A.; Fernández-Garcia, D.; Garrido, T.; Fraile, J.; Sanchez-Vila, X.

2014-09-01

100

A speciation methodology to study the contributions of humic-like and fulvic-like acids to the mobilization of metals from compost using size exclusion chromatography-ultraviolet absorption-inductively coupled plasma mass spectrometry and deconvolution analysis.  

PubMed

High performance size-exclusion chromatography (HP-SEC) with UV absorption for organic matter detection and inductively coupled plasma mass spectrometry (ICP-MS) for elemental detection have been used to study the mobilization of metals from compost as a function of pH and the molecular mass of their complexes with dissolved organic matter (DOM). Due to its heterogeneous nature, organic matter mobilized from compost shows a continuous distribution of molecular masses in the range studied (up to 80 kDa). In order to differentiate between the contributions of humic and fulvic acids (FA) to the organic matter mobilized in the pH range 5-10, their UV absorption chromatographic profiles have been deconvoluted with respect to the adjusted Gaussian profiles of the humic and fulvic acids isolated from compost. Results show a preponderant contribution of fulvic acids at low pH values and an increasing percentage of humic acids (HA) mobilized at basic pH (up to 49% of total DOM at pH 10). A similar deconvolution procedure has been applied to the ICP-MS chromatograms of selected metals (Co, Cu, Pb and Bi). In general, both fulvic and humic acids contribute to the mobilization of divalent transition metals, such as copper or cobalt, whereas bismuth and lead are preferentially associated with humic acids. Non-humic substances (HS) also contribute to the mobilization of cations, especially at acidic pH values. These conclusions have been extended to different elements based on deconvolution analysis results at pH 7. PMID:18068764
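The deconvolution step described above amounts to fitting Gaussian sub-bands to the chromatographic traces; a minimal two-peak version with SciPy is sketched below. The peak shapes, the two-component assumption, and the initial-guess convention are simplifications, not the authors' exact procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(t, a1, mu1, s1, a2, mu2, s2):
    """Sum of two Gaussian peaks (e.g. humic-like and fulvic-like contributions)."""
    g1 = a1 * np.exp(-0.5 * ((t - mu1) / s1) ** 2)
    g2 = a2 * np.exp(-0.5 * ((t - mu2) / s2) ** 2)
    return g1 + g2

def deconvolve_chromatogram(t, signal, p0):
    """Least-squares fit of two Gaussian sub-bands to a UV or ICP-MS trace.
    p0 is an initial guess (a1, mu1, s1, a2, mu2, s2), e.g. taken from the
    isolated humic and fulvic acid standards. Returns the fitted parameters
    and the area under each peak."""
    popt, _ = curve_fit(two_gaussians, t, signal, p0=p0)
    a1, mu1, s1, a2, mu2, s2 = popt
    area1 = a1 * s1 * np.sqrt(2 * np.pi)
    area2 = a2 * s2 * np.sqrt(2 * np.pi)
    return popt, area1, area2
```

The relative peak areas are then read as the relative contributions of the two organic-matter fractions at a given pH.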

Laborda, Francisco; Bolea, Eduardo; Górriz, María P; Martín-Ruiz, María P; Ruiz-Beguería, Sergio; Castillo, Juan R

2008-01-01

101

SEAT: A strategic engagement analysis tool  

SciTech Connect

The Strategic Engagement Analysis Tool (SEAT) is a prototype of an expert system knowledge-based discrete event simulation. SEAT realistically represents the interrelationships between the eight major subsystems in the strategic targeting and assault domain. Some of the subsystems employ run-time cognitive decision making and reasoning capabilities to represent human tactical and operational strategy decisions. SEAT's goal is to allow analysts to conduct sensitivity analysis and to determine cause-effect relationships. An intelligent interface mechanism is provided to aid the analyst in scenario creation. The interface was designed to provide on-line documentation, support for model input, logic control, and data validation prior to simulation execution. 4 refs., 3 figs.

Dreicer, J.; Michelsen, C.; Morgeson, D.

1988-01-01

102

Deconvolution procedure of the UV-vis spectra. A powerful tool for the estimation of the binding of a model drug to specific solubilisation loci of bio-compatible aqueous surfactant-forming micelle.  

PubMed

The evolution of the UV-vis spectra of Nile Red loaded into Tween 20 micelles with pH and [Tween 20] has been analysed in a non-conventional manner by exploiting the deconvolution method. The number of buried sub-bands has been found to depend on both pH and bio-surfactant concentration, and their positions have been associated with Nile Red confined in aqueous solution and in the three micellar solubilisation sites. For the first time, by using an extended classical two-pseudo-phase model, the robust treatment of the spectrophotometric data allows the estimation of the Nile Red binding constant to the available loci. Hosting capability towards Nile Red is enhanced as the pH increases. Comparison between the binding constant values evaluated classically and those estimated by the deconvolution protocol revealed that the overall binding values match the mean values of the local binding sites. This result suggests that the deconvolution procedure provides more precise and reliable values, which are more representative of drug confinement. PMID:25703359

Calabrese, Ilaria; Merli, Marcello; Turco Liveri, Maria Liria

2015-05-01

103

Automated Steel Cleanliness Analysis Tool (ASCAT)  

SciTech Connect

The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled ''Inclusion Analysis to Predict Casting Behavior'' was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. 
Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of "technical selling" through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with an extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant-by-plant direct sales program.

Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

2005-12-30

104

DEVELOPMENT OF AN ERGONOMICS SCREENING TOOL FOR MULTITASK JOB ANALYSIS  

Microsoft Academic Search

Currently, most ergonomic analysis tools are intended for evaluation of mono-task jobs only. Also, many ergonomic analysis tools require a significant investment of time, money, and/or training to assess jobs. The field of ergonomics is in need of a tool that can quickly analyze jobs and categorize them based on an assessment of risk: jobs not needing further review (probably

Sharon Davis

105

Defining Digital Forensic Examination and Analysis Tools Using Abstraction Layers  

Microsoft Academic Search

This paper uses the theory of abstraction layers to describe the purpose and goals of digital forensic analysis tools. Using abstraction layers, we identify where tools can introduce errors and provide requirements that the tools must follow. Categories of forensic analysis types are also defined based on the abstraction layers. Abstraction layers are not a new concept, but their usage

Brian Carrier

2002-01-01

106

Method and tool for network vulnerability analysis  

DOEpatents

A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
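A core operation on such a graph is finding the cheapest path from an attacker's entry state to a goal state over effort-weighted edges. The sketch below does this with a plain Dijkstra search; it is an illustration of the idea only and does not reproduce the patent's attack templates, attacker profiles, or epsilon-optimal path-set identification.

```python
import heapq

def min_effort_path(edges, start, goal):
    """Dijkstra search over an attack graph whose edge weights model attacker effort.
    edges: dict mapping node -> list of (next_node, effort) pairs.
    Returns (total_effort, path) for the cheapest path, or (inf, []) if none exists."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            # reconstruct the path by walking the predecessor map backwards
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        for nxt, effort in edges.get(node, []):
            nd = d + effort
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(heap, (nd, nxt))
    return float("inf"), []
```

Other edge weightings (likelihood of success, time to succeed) slot into the same search by changing what the edge weight encodes.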

Swiler, Laura Painton (Albuquerque, NM); Phillips, Cynthia A. (Albuquerque, NM)

2006-03-14

107

Multi-Mission Power Analysis Tool  

NASA Technical Reports Server (NTRS)

Multi-Mission Power Analysis Tool (MMPAT) Version 2 simulates spacecraft power generation, use, and storage in order to support spacecraft design, mission planning, and spacecraft operations. It can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. A user-friendly GUI (graphical user interface) makes it easy to use. Multiple deployments allow use on the desktop, in batch mode, or as a callable library. It includes detailed models of solar arrays, radioisotope thermoelectric generators, nickel-hydrogen and lithium-ion batteries, and various load types. There is built-in flexibility through user-designed state models and table-driven parameters.

Broderick, Daniel

2011-01-01

108

Simplified building energy analysis tool for architects  

NASA Astrophysics Data System (ADS)

Energy Modeler is an energy software program designed to study the relative change of energy uses (heating, cooling, and lighting loads) in different architectural design schemes. This research focuses on developing a tool to improve energy efficiency of the built environment. The research studied the impact of different architectural design responses for two distinct global climates: temperate and tropical climatic zones. This energy-based interfacing program is intended to help architects, engineers, educators, students, building designers, major consumers of architectural services, and other professionals whose work interfaces with that of architects to perceive, quickly visualize, and compare the energy performance and savings of different design schemes. The buildings in which we live or work have a great impact on our natural environment. Energy savings and consumption reductions in our buildings are probably among the best contributions to environmental sustainability, since they reduce the depletion of the world's fossil fuels (oil, natural gas, coal, etc.). Architects, when they set about designing an environmentally responsive building for an owner or the public, often lack the energy-based information and design tools to tell them whether the building loads and energy consumption are responsive to the modifications they make. Buildings are dynamic in nature and changeable over time, with many design variables involved. Architects really need energy-based rules or tools to assist them in the design process. Energy-efficient design for sustainable solutions requires attention throughout the design process and is closely related to architectural solutions. Early involvement is the only guaranteed way of properly considering fundamental building design issues related to building site, form and exposure. The research presents the methodology and process, which leads to the discussion of the research findings. The innovative work is to make these tools applicable to the earliest stage of design, where more informed analysis of possible alternatives could yield the most benefit and the greatest cost savings, both economic and environmental. This is where computer modeling and simulation can really lead to better, more energy-efficient buildings. Both apply to the internal environment and human comfort, and to the environmental impact of the surroundings.

Chaisuparasmikul, Pongsak

109

Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)  

SciTech Connect

NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.
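At the heart of a cash-flow analysis is discounting a cost-and-revenue stream to a net present value; a minimal version is shown below. The station cost, revenue, horizon, and discount rate are made-up numbers, not NREL model inputs.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at time zero, one entry per year."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical station: $1.2M capital cost, $250k/yr net revenue for 10 years, 8% discount rate.
station_cash_flows = [-1_200_000] + [250_000] * 10
print(round(npv(0.08, station_cash_flows)))
```

Financing terms and incentives enter as additional entries in the cash-flow list before discounting.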

Melaina, M.; Penev, M.

2012-09-01

110

PyRAT - python radiography analysis tool (u)  

SciTech Connect

PyRAT is a radiography analysis tool used to reconstruct images of unknown 1-D objects. The tool is written in Python and developed for use on Linux and Windows platforms. The tool is capable of performing nonlinear inversions of the images with minimal manual interaction in the optimization process. The tool utilizes the NOMAD mixed-variable optimization tool to perform the optimization.

Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory; Armstrong, Jerawan C [Los Alamos National Laboratory

2011-01-14

111

Built Environment Analysis Tool: April 2013  

SciTech Connect

This documentation describes the tool development. It was created to evaluate the effects of built environment scenarios on transportation energy and greenhouse gas (GHG) emissions. This documentation also provides guidance on how to apply the tool.

Porter, C.

2013-05-01

112

ISHM Decision Analysis Tool: Operations Concept  

NASA Technical Reports Server (NTRS)

The state-of-the-practice Shuttle caution and warning system warns the crew of conditions that may create a hazard to orbiter operations and/or crew. Depending on the severity of the alarm, the crew is alerted with a combination of sirens, tones, annunciator lights, or fault messages. The combination of anomalies (and hence alarms) indicates the problem. Even with much training, determining what problem a particular combination represents is not trivial. In many situations, an automated diagnosis system can help the crew more easily determine an underlying root cause. Due to limitations of diagnosis systems,however, it is not always possible to explain a set of alarms with a single root cause. Rather, the system generates a set of hypotheses that the crew can select from. The ISHM Decision Analysis Tool (IDAT) assists with this task. It presents the crew relevant information that could help them resolve the ambiguity of multiple root causes and determine a method for mitigating the problem. IDAT follows graphical user interface design guidelines and incorporates a decision analysis system. I describe both of these aspects.

2006-01-01

113

Numerical Uncertainty Quantification for Radiation Analysis Tools  

NASA Technical Reports Server (NTRS)

Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing, as the number of rays increase the associated uncertainty decreases, but the computational expense increases. Thus, a cost benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is what is the number of thicknesses that is needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
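The second uncertainty source lends itself to a simple numerical experiment: interpolate the dose-depth curve from progressively more shield thicknesses and watch the error converge. The sketch below is a generic convergence test on a synthetic attenuation curve, not the project's V&V code.

```python
import numpy as np

def interpolation_error(depths, doses, n_thicknesses):
    """Maximum relative error from rebuilding a dose-vs-depth curve by linear
    interpolation through only n_thicknesses sample points.
    depths/doses form a dense reference curve (depth assumed increasing)."""
    coarse_depth = np.linspace(depths.min(), depths.max(), n_thicknesses)
    coarse_dose = np.interp(coarse_depth, depths, doses)
    rebuilt = np.interp(depths, coarse_depth, coarse_dose)
    return np.max(np.abs(rebuilt - doses) / np.maximum(np.abs(doses), 1e-30))

# Synthetic exponential-attenuation curve; increase n until the error plateaus.
depth = np.linspace(0.0, 50.0, 2001)
dose = np.exp(-depth / 12.0)
for n in (5, 10, 20, 40, 80):
    print(n, interpolation_error(depth, dose, n))
```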

Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

2007-01-01

114

Solar Array Verification Analysis Tool (SAVANT) Developed  

NASA Technical Reports Server (NTRS)

Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center s Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

1999-01-01

115

The Analysis of Stone Tool Procurement, Production, and Maintenance  

E-print Network

with the needs of tool users. It also has become apparent to researchers that interpretations of lithic analysis of volumes on lithic analysis has multiplied rapidly over the past decade or so (Andrefsky 2001a, 2005, 2008aThe Analysis of Stone Tool Procurement, Production, and Maintenance William Andrefsky Jr. Published

Kohler, Tim A.

116

Tools for Decision Analysis: Analysis of Risky Decisions  

NSDL National Science Digital Library

This site offers a decision making procedure for solving complex problems step by step. It presents the decision-analysis process for both public and private decision-making, using different decision criteria, different types of information, and information of varying quality. It describes the elements in the analysis of decision alternatives and choices, as well as the goals and objectives that guide decision-making. The key issues related to a decision-maker's preferences regarding alternatives, criteria for choice, and choice modes, together with the risk assessment tools are also presented.

117

Scalable analysis tools for sensitivity analysis and UQ (3160) results.  

SciTech Connect

The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

Karelitz, David B.; Ice, Lisa G.; Thompson, David C.; Bennett, Janine C.; Fabian, Nathan; Scott, W. Alan; Moreland, Kenneth D.

2009-09-01

118

Emerging requirements for multi-modal annotation and analysis tools  

Microsoft Academic Search

We review existing capabilities of multi-modal annotation and analysis tools by presenting a survey of seven representative tools, and providing a sample annotation using one system. We discuss emerging requirements including handling electronic ink, eye-gaze tracking, and other time-based considerations. We briefly review aspects of empirically evaluating tool effectiveness and suggest that multimodal interfaces in future analytical tools may be

Tony Bigbee; Dan Loehr; Lisa Harper

2001-01-01

119

Interactive Graphics Tools for Analysis of MOLA and Other Data  

NASA Technical Reports Server (NTRS)

We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

Frey, H.; Roark, J.; Sakimoto, S.

2000-01-01

120

Integrated Turbopump Thermo-Mechanical Design and Analysis Tools  

NASA Astrophysics Data System (ADS)

This viewgraph presentation provides information on the thermo-mechanical design and analysis tools used to control the steady and transient thermo-mechanical effects which drive life, reliability, and cost. The thermo-mechanical analysis tools provide upfront design capability by effectively leveraging existing component design tools to analyze and control: fits, clearance, preload; cooling requirements; stress levels, LCF (low cycle fatigue) limits, and HCF (high cycle fatigue) margin.

Platt, Mike

2002-07-01

121

Electrical Analysis Tool Suite for Industrial Energy Audits

E-print Network

Electrical Analysis Tool Suite for Industrial Energy Audits. Texas A&M University, Industrial Assessment Center, Franco Morelli, May 2014 (ESL-IE-14-05-39, Proceedings of the Thirty-Sixth Industrial Energy Technology Conference, New Orleans, LA, May 20-23, 2014). Agenda: currently available tools; analysis tools for demand visualization, weather disaggregation, demand aberrations, photovoltaic optimization, and demand scheduling; automatic report generation; demonstration.

Morelli, F.

2014-01-01

122

ProMAT: protein microarray analysis tool  

SciTech Connect

Summary: ProMAT is a software tool for statistically analyzing data from ELISA microarray experiments. The software estimates standard curves, sample protein concentrations and their uncertainties for multiple assays. ProMAT generates a set of comprehensive figures for assessing results and diagnosing process quality. The tool is available for Windows or Mac, and is distributed as open-source Java and R code. Availability: ProMAT is available at http://www.pnl.gov/statistics/ProMAT. ProMAT requires Java version 1.5.0 and R version 1.9.1 (or more recent versions) which are distributed with the tool.
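ELISA standard-curve estimation of the kind ProMAT automates is commonly done with a four-parameter logistic (4PL) fit; a small SciPy sketch follows. The 4PL form, the initial guesses, and the inversion helper are generic assumptions and may differ from the model ProMAT actually fits.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic curve commonly used for ELISA standard curves."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

def fit_standard_curve(conc, intensity):
    """Fit the 4PL model to standard concentrations (arrays, assumed positive)
    vs. measured intensities. Returns the fitted parameters and a function
    that back-calculates a sample concentration from its intensity."""
    p0 = [intensity.min(), intensity.max(), np.median(conc), 1.0]
    popt, _ = curve_fit(four_pl, conc, intensity, p0=p0, maxfev=10000)
    bottom, top, ec50, hill = popt

    def invert(y):
        # algebraic inversion of the 4PL curve
        return ec50 * ((top - bottom) / (y - bottom) - 1.0) ** (1.0 / hill)

    return popt, invert
```

Uncertainty estimates, as ProMAT reports, would come from the fit covariance or from resampling, which is not shown here.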

White, Amanda M.; Daly, Don S.; Varnum, Susan M.; Anderson, Kevin K.; Bollinger, Nikki; Zangar, Richard C.

2006-04-04

123

A Multidimensional Analysis Tool for Visualizing Online Interactions  

ERIC Educational Resources Information Center

This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their

Kim, Minjeong; Lee, Eunchul

2012-01-01

124

General Mission Analysis Tool (GMAT) User's Guide (Draft)  

NASA Technical Reports Server (NTRS)

The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

Hughes, Steven P.

2007-01-01

125

Tools for Knowledge Analysis, Synthesis, and Sharing  

NASA Astrophysics Data System (ADS)

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own literacy by helping them to interact with the learning context. These tools include peer-group skills as well as strategies to analyze text and to indicate comprehension by way of text summaries and concept maps. Even with these tools, more appears to be needed. Disparate backgrounds and languages interfere with the comprehension and the sharing of knowledge. To meet this need, two new tools are proposed. The first tool fractures language ontologically, giving all learners who use it a language to talk about what has, and what has not, been uttered in text or talk about the world. The second fractures language epistemologically, giving those involved in working with text or on the world around them a way to talk about what they have done and what remains to be done. Together, these tools operate as a two-tiered representation of knowledge. This representation promotes both an individual meta-cognitive and a social meta-cognitive approach to what is known and to what is not known, both ontologically and epistemologically. Two hypotheses guide the presentation: If the tools are taught during early childhood, children will be prepared to master science and technology content. If the tools are used by both students and those who design and deliver instruction, the learning of such content will be accelerated.

Medland, Michael B.

2007-04-01

126

An Integrated Tool for System Analysis of Sample Return Vehicles  

NASA Technical Reports Server (NTRS)

The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

2012-01-01

127

Bayesian approach based blind image deconvolution with fuzzy median filter  

NASA Astrophysics Data System (ADS)

The inverse problem associated with the reconstruction of Poisson-blurred images has attracted attention in recent years. In this paper, we propose an alternative unified approach to the blind image deconvolution problem using a fuzzy median filter as a Gibbs prior to model the nature of inter-pixel interaction for better edge-preserving reconstruction. The performance of the algorithm at various SNR levels has been studied quantitatively using PSNR, RMSE and the universal quality index (UQI). Comparative analysis with existing methods has also been carried out.
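For reference, the two simpler quality metrics mentioned are easy to state in code; the universal quality index is omitted here. This is a generic implementation with an assumed 8-bit peak value, not the authors' evaluation script.

```python
import numpy as np

def rmse(reference, restored):
    """Root-mean-square error between a reference image and its restoration."""
    diff = reference.astype(float) - restored.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def psnr(reference, restored, peak=255.0):
    """Peak signal-to-noise ratio in dB (peak=255 assumes 8-bit images)."""
    return 20.0 * np.log10(peak / rmse(reference, restored))
```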

Mohan, S. Chandra; Rajan, K.; Srinivasan, R.

2011-10-01

128

Deconvolution of immittance data: Some old and new methods  

Microsoft Academic Search

The background and history of various deconvolution approaches are briefly summarized; different methods are compared; and available computational resources are described. These underutilized data analysis methods are valuable in both electrochemistry and immittance spectroscopy areas, and freely available computer programs are cited that provide an automatic test of the appropriateness of KronigKramers transforms, a powerful non-linear-least-squares inversion method, and a

J. Ross Macdonald; Enis Tuncer

2007-01-01

129

FDTD simulation tools for UWB antenna analysis.  

SciTech Connect

This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
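For orientation, the essential structure of any FDTD solver is the leapfrog update of interleaved E and H fields; a free-space 1-D Cartesian version with a soft Gaussian source is sketched below. The paper's solver instead uses spherical-coordinate update equations derived for the conical antenna, so this is only a structural analogue with assumed, normalized parameters.

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=500, src_cell=100):
    """Bare-bones 1-D FDTD (Yee) update loop in free space, normalized units,
    with an additive (soft) Gaussian-pulse source and no absorbing boundaries."""
    ez = np.zeros(n_cells)      # electric field at cell edges
    hy = np.zeros(n_cells - 1)  # magnetic field, staggered half a cell
    courant = 0.5               # Courant number (c*dt/dx)
    for n in range(n_steps):
        # update H from the spatial difference of E
        hy += courant * (ez[1:] - ez[:-1])
        # update E (interior cells) from the spatial difference of H
        ez[1:-1] += courant * (hy[1:] - hy[:-1])
        # soft Gaussian pulse source
        ez[src_cell] += np.exp(-0.5 * ((n - 40) / 12.0) ** 2)
    return ez
```

A CW excitation is obtained by replacing the Gaussian pulse with a sinusoid of fixed frequency.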

Brocato, Robert Wesley

2005-02-01

130

FDTD simulation tools for UWB antenna analysis.  

SciTech Connect

This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

Brocato, Robert Wesley

2004-12-01

131

Tools for Knowledge Analysis, Synthesis, and Sharing  

ERIC Educational Resources Information Center

Change and complexity are creating a need for increasing levels of literacy in science and technology. Presently, we are beginning to provide students with clear contexts in which to learn, including clearly written text, visual displays and maps, and more effective instruction. We are also beginning to give students tools that promote their own

Medland, Michael B.

2007-01-01

132

CCMR: Wound Dressing Tool and Wound Analysis  

NSDL National Science Digital Library

The goal of our project is to develop a Wound Dressing Tool (WDT) that in addition to extracting overabundant chemicals like the VAC system does, can also allow for variable rates of mass transfer as well as a way for clinicians to monitor the fluid chemical composition of the wound bed during the healing and treatment processes.

Men, Shannon

2005-08-17

133

Homomorphic deconvolution of marine magnetic anomalies  

E-print Network

Homomorphic Deconvolution of Marine Magnetic Anomalies. A thesis by Leo David Jones, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirement for the degree of Master of Science, December 1975; approved as to style and content December 1976. Major subject: Geophysics.

Jones, Leo David

1976-01-01

134

Fuzzy logic components for iterative deconvolution systems  

NASA Astrophysics Data System (ADS)

Deconvolution systems rely heavily on expert knowledge and would benefit from approaches that capture this expert knowledge. Fuzzy logic is an approach that is used to capture expert knowledge rules and produce outputs that range in degree. This paper describes a fuzzy-deconvolution-system that integrates traditional Richardson-Lucy deconvolution with fuzzy components. The system is intended for restoration of 3D widefield images taken under conditions of refractive index mismatch. The system uses a fuzzy rule set for calculating sample refractive index, a fuzzy median filter for inter-iteration noise reduction, and a fuzzy rule set for stopping criteria.
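Since the system builds on Richardson-Lucy iteration, a plain RL loop is sketched below, with an ordinary (non-fuzzy) median filter applied every few iterations as a crude stand-in for the fuzzy inter-iteration noise reduction. The fuzzy rule sets for refractive index and stopping criteria are not reproduced, and all parameter values shown are assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import median_filter

def richardson_lucy(image, psf, n_iter=25, smooth_every=5):
    """Richardson-Lucy deconvolution of a 2-D image with a 2-D PSF, with an
    occasional median filter between iterations for noise suppression."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean(), dtype=float)
    for i in range(n_iter):
        # classic multiplicative RL update
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        # simple inter-iteration smoothing (the paper uses a fuzzy median filter)
        if smooth_every and (i + 1) % smooth_every == 0:
            estimate = median_filter(estimate, size=3)
    return estimate
```

In the fuzzy version, both the amount of smoothing and the decision to stop iterating are driven by rule sets rather than fixed counts.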

Northan, Brian M.

2013-02-01

135

FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)  

NASA Technical Reports Server (NTRS)

The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic. 
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A
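The forward-propagation question FEAT answers ("what happens if these failures occur?") reduces, in its simplest form, to reachability over the digraph. The sketch below is a toy breadth-first pass over an assumed edge dictionary, not FEAT's transitive-closure preprocessing or its bidirectional gate handling.

```python
from collections import deque

def propagate_failures(edges, failed_nodes):
    """Forward-propagate a set of failure events through a digraph model.
    edges: dict mapping node -> list of downstream nodes it affects.
    Returns the set of all nodes reachable from the initial failures."""
    affected = set(failed_nodes)
    queue = deque(failed_nodes)
    while queue:
        node = queue.popleft()
        for downstream in edges.get(node, []):
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return affected
```

Cause identification runs the same idea backwards over reversed edges, which is where the precomputed transitive closure pays off.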

Pack, G.

1994-01-01

136

Deconvolution of dynamic mechanical networks  

PubMed Central

Time-resolved single-molecule biophysical experiments yield data that contain a wealth of dynamic information, in addition to the equilibrium distributions derived from histograms of the time series. In typical force spectroscopic setups the molecule is connected via linkers to a readout device, forming a mechanically coupled dynamic network. Deconvolution of equilibrium distributions, filtering out the influence of the linkers, is a straightforward and common practice. We have developed an analogous dynamic deconvolution theory for the more challenging task of extracting kinetic properties of individual components in networks of arbitrary complexity and topology. Our method determines the intrinsic linear response functions of a given object in the network, describing the power spectrum of conformational fluctuations. The practicality of our approach is demonstrated for the particular case of a protein linked via DNA handles to two optically trapped beads at constant stretching force, which we mimic through Brownian dynamics simulations. Each well in the protein free energy landscape (corresponding to folded, unfolded, or possibly intermediate states) will have its own characteristic equilibrium fluctuations. The associated linear response function is rich in physical content, because it depends both on the shape of the well and its diffusivitya measure of the internal friction arising from such processes as the transient breaking and reformation of bonds in the protein structure. Starting from the autocorrelation functions of the equilibrium bead fluctuations measured in this force clamp setup, we show how an experimentalist can accurately extract the state-dependent protein diffusivity using a straightforward two-step procedure. PMID:21118989

Hinczewski, Michael; von Hansen, Yann; Netz, Roland R.

2010-01-01

137

Aristotle: A System for Development of Program Analysis Based Tools  

E-print Network

Aristotle provides program analysis information, and supports the development of software engineering tools. Aristotle's front end consists of parsers that gather control flow, local dataflow and symbol table information for procedural language programs. We implemented a parser for C by incorporating analysis routines into the GNU C parser; a C ++ parser is being implemented using similar techniques. Aristotle tools use the data provided by the parsers to perform a variety of tasks, such as dataflow and control dependence analysis, dataflow testing, graph construction and graph viewing. Most of Aristotle's components function on single procedures and entire programs. Parsers and tools use database handler routines to store information in, and retrieve it from, a central database. A user interface provides interactive menu-driven access to tools, and users can view results textually or graphically. Many tools can also be invoked directly from applications programs, which facilitates ...

Mary Jean Harrold; Loren Larsen; John Lloyd; David Nedved; Melanie Page; Gregg Rothermel; Manvinder Singh; Michael Smith

1995-01-01

138

Vibroseis deconvolution: A comparison of pre and post correlation vibroseis deconvolution data in real noisy data  

NASA Astrophysics Data System (ADS)

Vibroseis is a source used commonly for inland seismic exploration. This non-destructive source is often used in urban areas with strong environmental noise. The main goal of seismic data processing is to increase the signal/noise ratio where a determinant step is deconvolution. Vibroseis seismic data do not meet the basic minimum-phase assumption for the application of spiking and predictive deconvolution, therefore various techniques, such as phase shift, are applied to the data, to be able to successfully perform deconvolution of vibroseis data. This work analyzes the application of deconvolution techniques before and after cross-correlation on a real data set acquired for high resolution prospection of deep aquifers. In particular, we compare pre-correlation spiking and predictive deconvolution with Wiener filtering and with post-correlation time variant spectral whitening deconvolution. The main result is that at small offsets, post cross-correlation spectral whitening deconvolution and pre-correlation spiking deconvolution yield comparable results, while for large offsets the best result is obtained by applying a pre-cross-correlation predictive deconvolution.
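Of the methods compared, spiking (Wiener-Levinson) deconvolution is the easiest to sketch: design a least-squares inverse filter from the trace autocorrelation and convolve it back with the trace. The filter length and prewhitening percentage below are typical illustrative values, not those used in the study.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def spiking_decon(trace, n_filter=80, prewhitening=0.001):
    """Wiener-Levinson spiking deconvolution of a single trace.
    The inverse filter is designed from the one-sided autocorrelation (with a
    small fraction of prewhitening added to the zero lag) and applied by
    convolution. Output amplitude is arbitrary up to a scale factor."""
    full = np.correlate(trace, trace, mode="full")
    zero_lag = len(trace) - 1
    r = full[zero_lag:zero_lag + n_filter].astype(float)
    r[0] *= 1.0 + prewhitening          # stabilize the Toeplitz system
    g = np.zeros(n_filter)
    g[0] = 1.0                          # desired output: a zero-delay spike
    f = solve_toeplitz(r, g)            # Levinson-type solve of R f = g
    return np.convolve(trace, f, mode="same")
```

Predictive deconvolution uses the same machinery with the desired output shifted by a prediction lag, and spectral whitening works instead on band-limited amplitude normalization in the frequency domain.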

Baradello, Luca; Accaino, Flavio

2013-05-01

139

Development of wavelet analysis tools for turbulence  

NASA Technical Reports Server (NTRS)

Presented here is the general framework and the initial results of a joint effort to derive novel research tools and easy to use software to analyze and model turbulence and transition. Given here is a brief review of the issues, a summary of some basic properties of wavelets, and preliminary results. Technical aspects of the implementation, the physical conclusions reached at this time, and current developments are discussed.

Bertelrud, A.; Erlebacher, G.; Dussouillez, PH.; Liandrat, M. P.; Liandrat, J.; Bailly, F. Moret; Tchamitchian, PH.

1992-01-01

140

Spatial Analysis and Modeling Tool (SAMT): 1. Structure and possibilities  

Microsoft Academic Search

This paper presents an overview of the integrated Spatial Analysis and Modeling Tool (SAMT). SAMT has been designed to integrate models from dieren t sciences such as economics and ecology. This is necessary for the evaluation of sustain abil- ity of landscapes. Traditional geographic information systems and modeling systems have specic limitations. To overcome such limitations the new tool has

Ralf Wieland; Marion Voss; Xenia Holtmann; Wilfried Mirschel; Igbekele A. Ajibefun

2006-01-01

141

Understanding and evaluating blind deconvolution algorithms  

E-print Network

Blind deconvolution is the recovery of a sharp version of a blurred image when the blur kernel is unknown. Recent algorithms have afforded dramatic progress, yet many aspects of the problem remain challenging and hard to ...

Durand, Fredo

142

Understanding and evaluating blind deconvolution algorithms  

E-print Network

Blind deconvolution is the recovery of a sharp version of a blurred image when the blur kernel is unknown. Recent algorithms have afforded dramatic progress, yet many aspects of the problem remain challenging and hard to ...

Freeman, William

2009-03-31

143

HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL  

EPA Science Inventory

An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

144

Introducing an Online Cooling Tower Performance Analysis Tool  

E-print Network

Introducing an Online Cooling Tower Performance Analysis Tool. Michael B. Muller, Mechanical Engineer, Rutgers University, Piscataway, NJ; Michael R. Muller, Professor of Mechanical Engineering, Rutgers University, Piscataway, NJ; Prakash Rao, Ph...

Muller, M.R.; Muller, M.B.; Rao, P.

2012-01-01

145

Three-Dimensional Imaging by Deconvolution Microscopy  

Microsoft Academic Search

Deconvolution is a computational method used to reduce out-of-focus fluorescence in three-dimensional (3D) microscope images. It can be applied in principle to any type of microscope image but has most often been used to improve images from conventional fluorescence microscopes. Compared to other forms of 3D light microscopy, like confocal microscopy, the advantage of deconvolution microscopy is that it can

James G. McNally; Tatiana Karpova; John Cooper; José Angel Conchello

1999-01-01
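
For readers unfamiliar with the computation behind deconvolution microscopy, a generic Richardson-Lucy iteration (one of the most common deconvolution schemes for fluorescence images) is sketched below. This is a textbook illustration, not code from the cited review; the 2D update shown generalizes directly to 3D image stacks.

```python
# Generic Richardson-Lucy deconvolution sketch (illustrative, 2D case).
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
    """Iteratively estimate the unblurred image given a known PSF."""
    estimate = np.full(image.shape, image.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]                 # flipped PSF for the correction step
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, eps)  # data / current blurred estimate
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Tiny demonstration with a synthetic Gaussian PSF.
x = np.arange(-7, 8)
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 8.0)
psf /= psf.sum()
blurry = fftconvolve(np.random.default_rng(1).random((64, 64)), psf, mode="same")
restored = richardson_lucy(blurry, psf)
print(restored.shape)
```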

146

Inversion of marine magnetic anomalies by deconvolution  

E-print Network

INVERSION OF MARINE MAGNETIC ANOMALIES BY DECONVOLUTION. A Thesis by DENNIS LEE HARRY, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE, December 1983. Major Subject: Geophysics.

Harry, Dennis Lee

1983-01-01

147

Validating and Verifying a New Thermal-Hydraulic Analysis Tool  

Microsoft Academic Search

The Idaho National Engineering and Environmental Laboratory (INEEL) has developed a new analysis tool by coupling the Fluent computational fluid dynamics (CFD) code to the RELAP5-3D/ATHENA advanced thermal-hydraulic analysis code. This tool enables researchers to perform detailed, three-dimensional analyses using Fluent's CFD capability while the boundary conditions required by the Fluent calculation are provided by the balance-of-system model created

Richard R. Schultz; Walter L. Weaver; Abderrafi M. Ougouag; William A. Wieselquist

2002-01-01

148

BRFSS: Prevalence Data and Data Analysis Tools  

NSDL National Science Digital Library

BRFSS is the nation's premier system of health-related telephone surveys that collect state data about U.S. residents regarding their health-related risk behaviors, chronic health conditions, and use of preventive services. BRFSS collects data in all 50 states as well as the District of Columbia and three U.S. territories. BRFSS completes more than 400,000 adult interviews each year, making it the largest continuously conducted health survey system in the world. These tools allow the user to perform various analyses and display the data in different ways.

Centers for Disease Control and Prevention

149

Deconvolution of multi-peak ICESat/GLAS waveforms  

NASA Astrophysics Data System (ADS)

Although primarily designed for cryosphere studies, data from ICESat/GLAS currently provide the only source of global vegetation height mapping. The objective of this research is to examine the methodological techniques and accuracy of lidar waveform analysis for 3D vertical structure using ICESat/GLAS. This research will investigate the ranging techniques and methods (deconvolution and decomposition) for discriminating various features or reflecting surfaces within each returned waveform. The returned waveform energy detected by the digitizer is a function of the scattering elements within the energy path and the impulse response of the system. By knowing the impulse response of the system, this signal can be removed from the return waveform to improve separability between targets along the laser path. Heights derived from the deconvolution methodology will be assessed against heights derived from Gaussian decomposition of the returned waveform. Here, the assumption is that the return waveform is a modeled composite of Gaussian distributions from multiple scatterers falling along the laser path. This is currently the technique (up to 6 Gaussians) implemented in ICESat/GLAS processing. The White Sands Missile Range (WSMR) Space Harbor area in New Mexico is used as a precision calibration and validation site for ICESat, with experiments operated and maintained by the University of Texas at Austin Center for Space Research (UTCSR). The returned waveforms from an array of corner-cube reflectors placed on poles of known heights located at the WSMR will be used to evaluate the deconvolution and decomposition results.
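
The Gaussian-decomposition approach mentioned above can be illustrated with a small least-squares fit of a sum of Gaussians to a synthetic waveform. The sketch below uses scipy.optimize.curve_fit with invented peak positions and amplitudes; it is not the operational GLAS fitting code.

```python
# Fit a sum of Gaussians to a synthetic lidar-style waveform (illustrative only).
import numpy as np
from scipy.optimize import curve_fit

def gaussian_mixture(t, *params):
    """Sum of Gaussians: params = (A1, mu1, sigma1, A2, mu2, sigma2, ...)."""
    y = np.zeros_like(t, dtype=float)
    for A, mu, sigma in zip(params[0::3], params[1::3], params[2::3]):
        y += A * np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    return y

# Synthetic two-peak waveform (e.g., a canopy return followed by a ground return).
t = np.linspace(0, 100, 500)
truth = gaussian_mixture(t, 1.0, 35.0, 4.0, 0.6, 70.0, 3.0)
waveform = truth + 0.02 * np.random.default_rng(0).standard_normal(t.size)

# Initial guesses for (amplitude, centre, width) of each component.
p0 = [0.8, 30.0, 5.0, 0.5, 65.0, 5.0]
popt, _ = curve_fit(gaussian_mixture, t, waveform, p0=p0)
print(popt.reshape(-1, 3))   # one row per fitted Gaussian
```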

Neuenschwander, A.; Urban, T. J.; Webb, C. E.; Schutz, B.

2007-12-01

150

JAVA based LCD Reconstruction and Analysis Tools  

SciTech Connect

We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

Bower, G.

2004-10-11

151

A new tool for contamination analysis  

SciTech Connect

The Contamination Analysis Unit (CAU) is a sensing system that facilitates a new approach to industrial cleaning. Through use of portable mass spectrometry and various desorption techniques, the CAU provides in-process, near-real-time measurement of surface cleanliness levels. It can be of help in significantly reducing hazardous waste generation and toxic air emissions from manufacturing operations.

Meltzer, M.; Gregg, H.

1996-06-01

152

Pin: building customized program analysis tools with dynamic instrumentation  

Microsoft Academic Search

Robust and powerful software instrumentation tools are essential for program analysis tasks such as profiling, performance evaluation, and bug detection. To meet this need, we have developed a new instrumentation system called Pin to instrument executables while they are running. For efficiency, Pin uses several techniques, including inlining, register re-allocation, liveness analysis, and instruction scheduling to optimize instrumentation. This fully automated

Chi-Keung Luk; Robert S. Cohn; Robert Muth; Harish Patil; Artur Klauser; P. Geoffrey Lowney; Steven Wallace; Vijay Janapa Reddi; Kim M. Hazelwood

2005-01-01

153

MAK, a computational tool kit for automated MITE analysis  

Microsoft Academic Search

Miniature inverted repeat transposable elements (MITEs) are ubiquitous and numerous in higher eukaryotic genomes. Analysis of MITE families is laborious and time consuming, especially when multiple MITE families are involved in the study. Based on the structural characteristics of MITEs and genetic principles for transposable elements (TEs), we have developed a computational tool kit named MITE analysis kit (MAK) to

Guojun Yang; Timothy C. Hall

2003-01-01

154

GEOGRAPHIC ANALYSIS TOOL FOR HEALTH AND ENVIRONMENTAL RESEARCH (GATHER)  

EPA Science Inventory

GATHER, Geographic Analysis Tool for Health and Environmental Research, is an online spatial data access system that provides members of the public health community and general public access to spatial data that is pertinent to the analysis and exploration of public health issues...

155

Automated Scalability Analysis Tools for Message Passing Parallel Programs  

NASA Technical Reports Server (NTRS)

In order to develop scalable parallel applications, a number of programming decisions have to be made during the development of the program. Performance tools that help in making these decisions are few, if they exist at all. Traditionally, performance tools have focused on exposing performance bottlenecks of small-scale executions of the program. However, it is common knowledge that programs that perform exceptionally well on small processor configurations, more often than not, perform poorly when executed on larger processor configurations. Hence, new tools that predict the execution characteristics of scaled-up programs are an essential part of an application developer's toolkit. In this paper we discuss important issues that need to be considered in order to build useful scalability analysis tools for parallel programs. We introduce a simple tool that automatically extracts scalability characteristics of a class of deterministic parallel programs. We show, with the help of a number of results on the Intel iPSC/860, that predictions are within reasonable bounds.

Sarukkai, Sekhar R.; Mehra, Pankaj; Tucker, Deanne (Technical Monitor)

1994-01-01

156

Genomics Assisted Ancestry Deconvolution in Grape  

PubMed Central

The genus Vitis (the grapevine) is a group of highly diverse, diploid woody perennial vines consisting of approximately 60 species from across the northern hemisphere. It is the world's most valuable horticultural crop with ~8 million hectares planted, most of which is processed into wine. To gain insights into the use of wild Vitis species during the past century of interspecific grape breeding and to provide a foundation for marker-assisted breeding programmes, we present a principal components analysis (PCA) based ancestry estimation method to calculate admixture proportions of hybrid grapes in the United States Department of Agriculture grape germplasm collection using genome-wide polymorphism data. We find that grape breeders have backcrossed to both the domesticated V. vinifera and wild Vitis species and that reasonably accurate genome-wide ancestry estimation can be performed on interspecific Vitis hybrids using a panel of fewer than 50 ancestry informative markers (AIMs). We compare measures of ancestry informativeness used in selecting SNP panels for two-way admixture estimation, and verify the accuracy of our method on simulated populations of admixed offspring. Our method of ancestry deconvolution provides a first step towards selection at the seed or seedling stage for desirable admixture profiles, which will facilitate marker-assisted breeding that aims to introgress traits from wild Vitis species while retaining the desirable characteristics of elite V. vinifera cultivars. PMID:24244717
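
A much-simplified sketch of PCA-based two-way admixture estimation is shown below: hybrids are projected onto the first principal component of the two reference panels and placed linearly between the reference centroids. Function and variable names are invented for illustration, and the study's actual pipeline is more elaborate.

```python
# Simplified two-way admixture estimation via PCA projection (illustrative only).
import numpy as np

def admixture_proportions(geno_ref_a, geno_ref_b, geno_hybrids):
    """Estimate the proportion of ancestry from population A for each hybrid.

    Genotype matrices are samples x SNPs coded 0/1/2.
    """
    combined = np.vstack([geno_ref_a, geno_ref_b])
    mean = combined.mean(axis=0)
    centered = combined - mean

    # First principal component of the two reference panels via SVD.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc1 = vt[0]

    # Scores of reference individuals and hybrids along PC1.
    score_a = (geno_ref_a - mean) @ pc1
    score_b = (geno_ref_b - mean) @ pc1
    score_h = (geno_hybrids - mean) @ pc1

    # Linear interpolation between the two reference centroids, clipped to [0, 1].
    a, b = score_a.mean(), score_b.mean()
    return np.clip((score_h - b) / (a - b), 0.0, 1.0)
```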

Sawler, Jason; Reisch, Bruce; Aradhya, Mallikarjuna K.; Prins, Bernard; Zhong, Gan-Yuan; Schwaninger, Heidi; Simon, Charles; Buckler, Edward; Myles, Sean

2013-01-01

157

Database tools for enhanced analysis of TMX-U data  

SciTech Connect

A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers.

Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

1986-03-06

158

Diamond-turning tool setting by interferogram analysis  

SciTech Connect

A method was developed to establish a numerically controlled tool path with respect to the work spindle centerline. Particularly adapted to the diamond turning of optics, this method is based upon interferogram analysis and is applicable to the establishment of the work spindle centerline relative to the tool path for any center-turned optic having a well-defined vertex radius of curvature. The application reported is for an f/2 concave spherical mirror.

Rasnick, W.H.; Yoder, R.C.

1980-10-22

159

Aristotle: a system for development of program analysis based tools  

Microsoft Academic Search

Aristotle provides program analysis information, and supports the development of software engineering tools. Aristotle's front end consists of parsers that gather control flow, local dataflow and symbol table information for procedural language programs. We implemented a parser for C by incorporating analysis routines into the GNU C parser; a C++ parser is being implemented using similar techniques. Aristotle tools use the data provided by the parsers to perform a

Mary Jean Harrold; Loren Larsen; John Lloyd; David Nedved; Melanie Page; Gregg Rothermel; Manvinder Singh; Michael Smith

1995-01-01

160

Development of a climate data analysis tool (CDAT)  

SciTech Connect

The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations, and graphically displaying the results. This computer software will meet the demanding need of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.

Marlais, S.M.

1997-09-01

161

RCytoscape: tools for exploratory network analysis  

PubMed Central

Background Biomolecular pathways and networks are dynamic and complex, and the perturbations to them which cause disease are often multiple, heterogeneous and contingent. Pathway and network visualizations, rendered on a computer or published on paper, however, tend to be static, lacking in detail, and ill-equipped to explore the variety and quantities of data available today, and the complex causes we seek to understand. Results RCytoscape integrates R (an open-ended programming environment rich in statistical power and data-handling facilities) and Cytoscape (powerful network visualization and analysis software). RCytoscape extends Cytoscape's functionality beyond what is possible with the Cytoscape graphical user interface. To illustrate the power of RCytoscape, a portion of the Glioblastoma multiforme (GBM) data set from the Cancer Genome Atlas (TCGA) is examined. Network visualization reveals previously unreported patterns in the data suggesting heterogeneous signaling mechanisms active in GBM Proneural tumors, with possible clinical relevance. Conclusions Progress in bioinformatics and computational biology depends upon exploratory and confirmatory data analysis, upon inference, and upon modeling. These activities will eventually permit the prediction and control of complex biological systems. Network visualizations -- molecular maps -- created from an open-ended programming environment rich in statistical power and data-handling facilities, such as RCytoscape, will play an essential role in this progression. PMID:23837656

2013-01-01

162

Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions  

NASA Technical Reports Server (NTRS)

Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

2002-01-01

163

Mutation surveyor: an in silico tool for sequencing analysis.  

PubMed

DNA sequencing is widely used for DNA diagnostics and functional studies of genes of interest. With significantly increased sequencing outputs, manual reading of sequence results can impede an efficient and accurate analysis. Mutation Surveyor is a useful in silico tool developed by SoftGenetics that assists the detection of sequence variations within Sanger sequencing traces. This tool can process up to 400 lanes of data at a time with high accuracy and sensitivity. It can effectively detect SNPs and indels in their homozygous or heterozygous states as well as mosaicism. In this chapter, we describe the general application of Mutation Surveyor for DNA sequencing analysis and its unique features. PMID:21780000

Dong, Chongmei; Yu, Bing

2011-01-01

164

A SIMILARITY THEORY OF APPROXIMATE DECONVOLUTION MODELS OF TURBULENCE  

Microsoft Academic Search

We apply the phenomenology of homogeneous, isotropic turbulence to the family of approximate deconvolution models proposed by Stolz and Adams. In particular, we establish that the models themselves have an energy cascade with two asymptotically different inertial ranges. Delineation of these gives insight into the resolution requirements of using approximate deconvolution models. The approximate deconvolution model's energy balance contains both

Monika Neda

165

Computational Tools for the Secondary Analysis of Metabolomics Experiments  

PubMed Central

Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large amount of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software is discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther reaching biological conclusions than ever before. PMID:24688685

Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

2013-01-01

166

Software Construction and Analysis Tools for Future Space Missions  

NASA Technical Reports Server (NTRS)

NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: 1) Program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification. 2) Model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

2002-01-01

167

Interoperability of the analysis tools within the IMPEx project  

NASA Astrophysics Data System (ADS)

The growing amount of data in planetary sciences requires adequate tools for visualisation enabling in-depth analysis. Within the FP7 IMPEx infrastructure data will originate from heterogeneous sources: large observational databases (CDAWeb, AMDA at CDPP, ...), simulation databases for hybrid and MHD codes (FMI, LATMOS), planetary magnetic field models database and online services (SINP). Together with the common "time series" visualisation functionality for both in-situ and modeled data (provided by AMDA and CLWeb tools), IMPEx will also provide immersion capabilities into the complex 3D data originating from models (provided by 3DView). The functionalities of these tools will be described. The emphasis will be put on how these tools 1/ can share information (for instance Time Tables or user composed parameters) and 2/ be operated synchronously via dynamic connections based on Virtual Observatory standards.

Génot, Vincent; Khodachenko, Maxim; Kallio, Esa; Al-Ubaidi, Tarek; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Renard, Benjamin; Bourel, Natacha; Modolo, Ronan; Hess, Sébastien; André, Nicolas; Penou, Emmanuel; Topf, Florian; Alexeev, Igor; Belenkaya, Elena; Kalegaev, Vladimir; Schmidt, Walter

2013-04-01

168

Millennial scale system impulse response of polar climates - deconvolution results between δ18O records from Greenland and Antarctica  

NASA Astrophysics Data System (ADS)

Deconvolution has long been used in science to recover real input given a system's impulse response and output. In this study, we applied spectral division deconvolution to select polar δ18O time series to investigate the possible relationship between the climates of the Polar Regions, i.e. the equivalent of a climate system's 'impulse response.' While the records may be the result of nonlinear processes, deconvolution remains an appropriate tool because the two polar climates are synchronized, forming a Hilbert transform pair. In order to compare records, the age models of three Greenland and four Antarctica records have been matched via a Monte Carlo method using the methane-matched pair GRIP and BYRD as a basis for the calculations. For all twelve polar pairs, various deconvolution schemes (Wiener, Damped Least Squares, Tikhonov, Kalman filter) give consistent, quasi-periodic, impulse responses of the system. Multitaper analysis reveals strong, millennia scale, quasi-periodic oscillations in these system responses with a range of 2,500 to 1,000 years. These are not symmetric, as the transfer function from north to south differs from that of south to north. However, the difference is systematic and occurs in the predominant period of the deconvolved signals. Specifically, the north to south transfer function is generally of longer period than the south to north transfer function. High amplitude power peaks at 5.0 ky to 1.7 ky characterize the former, while the latter contains peaks at mostly short periods, with a range of 2.5 ky to 1.0 ky. Consistent with many observations, the deconvolved, quasi-periodic, transfer functions share the predominant periodicities found in the data, some of which are likely related to solar forcing (2.5-1.0 ky), while some are probably indicative of the internal oscillations of the climate system (1.6-1.4 ky). The approximately 1.5 ky transfer function may represent the internal periodicity of the system, perhaps even related to the periodicity of the thermohaline circulation (THC). Simplified models of the polar climate fluctuations are shown to support these findings.
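
Of the deconvolution schemes named above, spectral division with a damping (water-level) floor is the simplest to illustrate. The sketch below is a generic regularized spectral division with an arbitrary water-level parameter; it is not the study's actual processing chain.

```python
# Water-level regularized spectral-division deconvolution (illustrative only).
import numpy as np

def spectral_division_deconvolution(output, inp, water_level=0.05):
    """Estimate the impulse response h such that output ≈ inp * h (convolution).

    Regularized spectral division: H = O conj(I) / max(|I|^2, wl * max|I|^2).
    """
    n = len(output) + len(inp) - 1
    O = np.fft.rfft(output, n)
    I = np.fft.rfft(inp, n)
    power = np.abs(I) ** 2
    floor = water_level * power.max()
    H = O * np.conj(I) / np.maximum(power, floor)
    return np.fft.irfft(H, n)
```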

Reischmann, E.; Yang, X.; Rial, J. A.

2013-12-01

169

Single frame blind image deconvolution by non-negative sparse matrix factorization  

NASA Astrophysics Data System (ADS)

A novel approach to single frame multichannel blind image deconvolution has recently been formulated as a non-negative matrix factorization problem with sparseness constraints imposed on the unknown mixing vector that accounts for the case of a non-sparse source image. Unlike most blind image deconvolution algorithms, the novel approach assumed no a priori knowledge about the blurring kernel and original image. Our contributions in this paper are: (i) we have formulated a generalized non-negative matrix factorization approach to blind image deconvolution with sparseness constraints imposed on either the unknown mixing vector or the unknown source image; (ii) criteria are established to distinguish whether the unknown source image is sparse or not, as well as to estimate an appropriate sparseness constraint from the degraded image itself, thus making the proposed approach completely unsupervised; (iii) an extensive experimental performance evaluation of the non-negative matrix factorization algorithm is presented on images degraded by the blur caused by the photon sieve, out-of-focus blur with sparse and non-sparse images, and blur caused by atmospheric turbulence. The algorithm is compared with state-of-the-art single frame blind image deconvolution algorithms such as the blind Richardson-Lucy algorithm and a single frame multichannel independent component analysis based algorithm, and with non-blind image restoration algorithms such as the multiplicative algebraic restoration technique and Van Cittert algorithms. It has been experimentally demonstrated that the proposed algorithm outperforms the mentioned non-blind and blind image deconvolution methods.
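
The factorization machinery underlying this approach can be illustrated with the classical multiplicative-update rules for non-negative matrix factorization. The sketch below omits the sparseness constraints and deconvolution-specific structure described in the abstract and is only a generic NMF example.

```python
# Generic NMF via multiplicative updates (Frobenius objective); illustrative only.
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Factor a non-negative matrix V ≈ W @ H with multiplicative updates."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update coefficients
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update basis
    return W, H

# Example on a random non-negative matrix.
V = np.random.default_rng(1).random((40, 30))
W, H = nmf_multiplicative(V, rank=5)
print(np.linalg.norm(V - W @ H))
```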

Kopriva, Ivica; Garrood, Dennis J.; Borjanović, Vesna

2006-10-01

170

Mass++: A Visualization and Analysis Tool for Mass Spectrometry.  

PubMed

We have developed Mass++, a plug-in style visualization and analysis tool for mass spectrometry. Its plug-in style enables users to customize it and to develop original functions. Mass++ has several kinds of plug-ins, including rich viewers and analysis methods for proteomics and metabolomics. Plug-ins for supporting vendors' raw data are currently available; hence, Mass++ can read several data formats. Mass++ is both a desktop tool and a software development platform. Original functions can be developed without editing the Mass++ source code. Here, we present this tool's capability to rapidly analyze MS data and develop functions by providing examples of label-free quantitation and implementing plug-ins or scripts. Mass++ is freely available at http://www.first-ms3d.jp/english/ . PMID:24965016

Tanaka, Satoshi; Fujita, Yuichiro; Parry, Howell E; Yoshizawa, Akiyasu C; Morimoto, Kentaro; Murase, Masaki; Yamada, Yoshihiro; Yao, Jingwen; Utsunomiya, Shin-Ichi; Kajihara, Shigeki; Fukuda, Mitsuru; Ikawa, Masayuki; Tabata, Tsuyoshi; Takahashi, Kentaro; Aoshima, Ken; Nihei, Yoshito; Nishioka, Takaaki; Oda, Yoshiya; Tanaka, Koichi

2014-07-01

171

A simulation tool for the analysis of high speed flows  

Microsoft Academic Search

The availability of an efficient, low-cost numerical simulator is essential in the design of fluid dynamic systems, both to achieve a deep understanding of the flow field and to allow a quick and economical optimization of the technical characteristics of industrial devices. With the aim of developing an adequate simulation tool for the analysis of the fluid dynamical field in

G Bella; M Burroni; M. M Cerimele; F Pistella

1999-01-01

172

Effect of Static Analysis Tools on Software Security: Preliminary Investigation  

E-print Network

[...]: Software/Program Verification; K.6.5 [Management of Computing and Information Systems]: Security. Because security defects are introduced at the source code level [10], coding errors are a critical problem, and static analysis is a complement to testing to discover defects in source code. This paper is concerned with static analysis tools

Black, Paul E.

173

Orienting the Neighborhood: A Subdivision Energy Analysis Tool; Preprint  

SciTech Connect

This paper describes a new computerized Subdivision Energy Analysis Tool being developed to allow users to interactively design subdivision street layouts while receiving feedback about energy impacts based on user-specified building design variants and availability of roof surfaces for photovoltaic and solar water heating systems.

Christensen, C.; Horowitz, S.

2008-07-01

174

International comparative analysis of building regulations: an analytical tool  

Microsoft Academic Search

Purpose: The purpose of this paper is to introduce a tool for the international comparative analysis of regulatory regimes in the field of building regulation. Design/methodology/approach: On the basis of a heuristic model drawn from regulatory literature, a typology of building regulatory regimes is introduced. Each type is illustrated with a number of real-life examples from North America,

Jeroen van der Heijden

2009-01-01

175

Recurrence time statistics: Versatile tools for genomic DNA sequence analysis  

E-print Network

Recurrence time statistics: Versatile tools for genomic DNA sequence analysis. One of the more important structures in a DNA sequence is repeat-related. Often they have to be masked before protein coding regions along a DNA sequence are to be identified or redundant

Gao, Jianbo

176

A disassembly user requirements analysis method, tools and validated examples  

Microsoft Academic Search

The objective of this research was to create an object component oriented requirements analysis software tool that enables enterprises to analyze and design all critical aspects of their demanufacturing processes over the web and/or their company intranets. Our solution provides an analytical method for evaluating various disassembly line manager and operator user requirements, based on appropriate engineering solutions, including the

Paul G. Ranky; Reggie J. Caudill; Ketan Limaye; Najeeb Alli; Satishkumar ChamyVelumani; Apoorva Bhatia; Manasi Lonkar

2002-01-01

177

The Adversarial Route Analysis Tool: A Web Application  

SciTech Connect

The Adversarial Route Analysis Tool is a type of Google Maps for adversaries. It is a web-based geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.

Casson, William H. Jr. [Los Alamos National Laboratory]

2012-08-02

178

An Online Image Analysis Tool for Science Education  

ERIC Educational Resources Information Center

This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and

Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

2008-01-01

179

DARE-COTS. A domain analysis support tool  

Microsoft Academic Search

DARE-COTS (Domain Analysis Research Environment for Commercial Off-The-Shelf software) is a CASE tool that supports domain analysis-the activity of identifying and documenting the commonalities and variabilities in related software systems. DARE-COTS supports the capture of domain information from experts, documents and code in a domain. Captured domain information is stored in a domain book that typically contains a generic architecture

William Frakes; Ruben Prieto-Diaz; Christopher Fox

1997-01-01

180

Cirrus: A Protocol Analysis Tool Cirrus: an automated protocol analysis tool  

E-print Network

Pittsburgh, PA 15213, U.S.A. Morgan-Kaufman, 1987, pp. 205-217. Cirrus is a tool for protocol

VanLehn, Kurt

181

The effects of error magnitude and bandwidth selection for deconvolution with unknown error distribution  

PubMed Central

The error distribution is generally unknown in deconvolution problems with real applications. A separate independent experiment is thus often conducted to collect the additional noise data in those studies. In this paper, we study the nonparametric deconvolution estimation from a contaminated sample coupled with an additional noise sample. A ridge-based kernel deconvolution estimator is proposed and its asymptotic properties are investigated depending on the error magnitude. We then present a data-driven bandwidth selection algorithm combining the bootstrap method and the idea of simulation extrapolation. The finite sample performance of the proposed methods and the effects of error magnitude are evaluated through simulation studies. A real data analysis for a gene Illumina BeadArray study is performed to illustrate the use of the proposed methods. PMID:22754269
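
A bare-bones version of a ridged deconvolution density estimator is sketched below: the empirical characteristic function of the independent noise sample is bounded away from zero before Fourier inversion. The kernel choice, ridge rule, and grids are illustrative assumptions, not the estimator of the cited paper.

```python
# Ridged kernel deconvolution density estimate of X from W = X + U (illustrative).
import numpy as np

def deconvolution_density(w_sample, u_sample, x_grid, bandwidth, ridge=0.1):
    """Estimate the density of X using an independent noise sample for the error
    characteristic function, ridged away from zero."""
    # Frequency grid; phi_K(ht) = (1 - (ht)^2)^3 on |ht| <= 1 is a common smooth
    # kernel characteristic function with compact support.
    t = np.linspace(-1.0 / bandwidth, 1.0 / bandwidth, 2001)
    phi_K = np.clip(1.0 - (bandwidth * t) ** 2, 0.0, None) ** 3

    # Empirical characteristic functions of the contaminated and noise samples.
    phi_W = np.exp(1j * np.outer(t, w_sample)).mean(axis=1)
    phi_U = np.exp(1j * np.outer(t, u_sample)).mean(axis=1)

    # Ridge: keep the phase of phi_U but never let its modulus fall below `ridge`.
    modulus = np.maximum(np.abs(phi_U), 1e-12)
    denom = phi_U * np.maximum(modulus, ridge) / modulus

    # Fourier inversion on the x grid (simple Riemann sum), truncated at zero.
    integrand = phi_K * phi_W / denom
    waves = np.exp(-1j * np.outer(x_grid, t))
    dt = t[1] - t[0]
    f_hat = (waves * integrand).sum(axis=1).real * dt / (2.0 * np.pi)
    return np.clip(f_hat, 0.0, None)
```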

Wang, Xiao-Feng; Ye, Deping

2012-01-01

182

A Survey of Tool Use and Analysis of its Implications for the Design of Robotic Tool Lloyd Williams  

E-print Network

The use of tools has long been pointed to as an indicator of intelligent behavior. ... is often indicative of intelligence guiding changes in behavior. The use of tools multiplies ... a specific analysis of technologies that could be employed to enable robots to use tools. Ethological

Young, R. Michael

183

Blind deconvolution through digital signal processing  

Microsoft Academic Search

This paper addresses the problem of deconvolving two signals when both are unknown. The authors call this problem blind deconvolution. The discussion develops two related solutions which can be applied through digital signal processing in certain practical cases. The case of reverberated and resonated sound forms the center of the development. The specific problem of restoring old acoustic recordings provides

T. M. Cannon; R. B. Ingebretsen

1975-01-01

184

Efficient Marginal Likelihood Optimization in Blind Deconvolution  

E-print Network

In blind deconvolution one aims to estimate from an input blurred image y a sharp image x and an unknown blur kernel k. Recent research shows that a key to success is to consider the overall shape of the posterior distribution ...

Levin, Anat

2011-04-04

185

Super-exponential methods for blind deconvolution  

Microsoft Academic Search

A class of iterative methods for solving the blind deconvolution problem, i.e. for recovering the input of an unknown, possibly nonminimum-phase linear system by observation of its output, is presented. These methods are universal, do not require prior knowledge of the input distribution, are computationally efficient and statistically stable, and converge to the desired solution regardless of initialization at a

Ofir Shalvi; Ehud Weinstein

1993-01-01

186

Iterative blind deconvolution method and its applications  

Microsoft Academic Search

A simple iterative technique has been developed for blind deconvolution of two convolved functions. The method is described, and a number of results obtained from a computational implementation are presented. Some further possible applications are indicated. The convolution c(x) of two functions, f(x) and g(x), can be expressed mathematically by the integral equation
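
The abstract breaks off at the integral equation; for reference, the standard convolution integral it refers to can be written as:

```latex
c(x) \;=\; \int_{-\infty}^{\infty} f(t)\, g(x - t)\, \mathrm{d}t
```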

G. R. Ayers; J. C. Dainty

1988-01-01

187

K-41 optimised approximate deconvolution models  

Microsoft Academic Search

If the Navier-Stokes equations are averaged with a local, spatial convolution-type filter, ū = gδ ∗ u, the resulting system is not closed due to the filtered nonlinear term (the average of uu). An approximate deconvolution operator D is a bounded linear operator which satisfies

William Layton; Iuliana Stanculescu

2007-01-01

188

Chebychev optimized approximate deconvolution models of turbulence  

Microsoft Academic Search

If the Navier-Stokes equations are averaged with a local, spatial convolution-type filter, ū = gδ ∗ u, the resulting system is not closed due to the filtered nonlinear term (the average of uu). An approximate deconvolution operator D is a bounded linear operator which satisfies u = D(ū) + O(δ^(2N+2)),

William J. Layton; Iuliana Stanculescu

2009-01-01
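
The operator most commonly analyzed in this family of models is the van Cittert construction D_N = (I - G)^0 + (I - G)^1 + ... + (I - G)^N, where G is the filter. A small numerical sketch is given below; the Gaussian filter stands in for the generic convolution filter and the parameters are arbitrary, so this is an illustration of the construction, not code from the cited papers.

```python
# Van Cittert approximate deconvolution D_N(u_bar) = sum_{n=0..N} (I - G)^n u_bar.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def approximate_deconvolution(u_bar, N, sigma):
    term = u_bar.copy()          # (I - G)^0 u_bar
    result = u_bar.copy()
    for _ in range(N):
        term = term - gaussian_filter1d(term, sigma)   # apply (I - G) once more
        result += term
    return result

# Away from the boundaries, D_N(u_bar) should be much closer to u than u_bar is.
u = np.sin(np.linspace(0.0, 4.0 * np.pi, 400))
u_bar = gaussian_filter1d(u, 5.0)
print(np.max(np.abs(u_bar - u)),
      np.max(np.abs(approximate_deconvolution(u_bar, 3, 5.0) - u)))
```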

189

Deconvolution based on the curvelet transform  

Microsoft Academic Search

This paper describes a new deconvolution algorithm, based on both the wavelet transform and the curvelet transform. It extends previous results which were obtained for the denoising problem. Using these two different transformations in the same algorithm allows us to optimally detect at the same time isotropic features, well represented by the wavelet transform, and edges better represented

Jean-luc Starck; Mai K. Nguyen; Fionn Murtagh

2003-01-01

190

Tools  

NSDL National Science Digital Library

The goal of the lesson is not for students to learn what the simple machines are, even though this is an underlying theme. Students will approach the lesson in a much more open-minded fashion. They will discuss tools and how they function. This will naturally lead to acknowledgment of how tools make our lives easier. By categorizing everyday items, students will come to understand the natural functions of tools. This base of knowledge will lead into exercises and discussions about how complex machines are a conglomerate of simpler tools and motions, as well as how tools have changed and become more sophisticated throughout history. At the end of the lesson to reemphasize the importance of tools in human society, students will write a paper in which they imagine a world without a particular tool.

Science Netlinks

2005-06-13

191

Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)  

NASA Technical Reports Server (NTRS)

The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

2005-01-01

192

Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study  

NASA Technical Reports Server (NTRS)

An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

Flores, Melissa; Malin, Jane T.

2013-01-01

193

A dataflow analysis tool for parallel processing of algorithms  

NASA Technical Reports Server (NTRS)

A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
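
As a small example of the kind of graph analysis described, the sketch below computes the critical-path length of a weighted dataflow DAG, a standard lower bound on any schedule length. The task set, timings, and function name are invented for the example; this is not part of the tool itself.

```python
# Critical-path (longest path) analysis of a task dataflow DAG; illustrative only.
from collections import defaultdict

def critical_path_length(tasks, edges):
    """tasks: {name: execution_time}; edges: list of (u, v) precedence pairs.
    Returns the critical-path time, a lower bound on any schedule length."""
    succ = defaultdict(list)
    indeg = defaultdict(int)
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1

    # Topological order (Kahn's algorithm).
    order, ready = [], [t for t in tasks if indeg[t] == 0]
    while ready:
        u = ready.pop()
        order.append(u)
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)

    # Longest path through the DAG weighted by task times.
    finish = {t: tasks[t] for t in tasks}
    for u in order:
        for v in succ[u]:
            finish[v] = max(finish[v], finish[u] + tasks[v])
    return max(finish.values())

# Example: four tasks in a diamond-shaped dataflow graph.
tasks = {"a": 2.0, "b": 3.0, "c": 1.0, "d": 2.0}
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]
print(critical_path_length(tasks, edges))   # 7.0 along a -> b -> d
```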

Jones, Robert L., III

1993-01-01

194

Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs  

NASA Technical Reports Server (NTRS)

The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

Min, James B.

2005-01-01

195

Tool Support for Parametric Analysis of Large Software Simulation Systems  

NASA Technical Reports Server (NTRS)

The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
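
The combination of Monte Carlo sampling with n-factor combinatorial variation can be illustrated for n = 2 by the sketch below. Parameter names, ranges, and counts are invented for the example, and the tool's actual generation strategy (and its model-based test cases) is more sophisticated.

```python
# Monte Carlo samples plus 2-factor extreme-value combinations; illustrative only.
import itertools
import random

def generate_cases(params, nominal, n_random=20, seed=0):
    """params: {name: (low, high)}; nominal: {name: value}."""
    rng = random.Random(seed)
    cases = []

    # Monte Carlo samples over the full envelope.
    for _ in range(n_random):
        cases.append({p: rng.uniform(lo, hi) for p, (lo, hi) in params.items()})

    # 2-factor combinations: every pair of parameters at every pair of extremes,
    # all remaining parameters held at nominal values.
    for p1, p2 in itertools.combinations(params, 2):
        for v1, v2 in itertools.product(params[p1], params[p2]):
            case = dict(nominal)
            case[p1], case[p2] = v1, v2
            cases.append(case)
    return cases

params = {"mass": (400.0, 600.0), "cd": (2.0, 2.4), "thrust": (900.0, 1100.0)}
nominal = {"mass": 500.0, "cd": 2.2, "thrust": 1000.0}
print(len(generate_cases(params, nominal)))   # 20 random + 3 pairs x 4 = 32 cases
```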

Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

2008-01-01

196

Rosetta CONSERT operations and data analysis preparation: simulation software tools.  

NASA Astrophysics Data System (ADS)

The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae Lander. The accurate analysis of travel time measurements will deliver unique knowledge of the nucleus interior dielectric properties. The challenging complexity of CONSERT operations requirements, combining both Rosetta and Philae, allows only a small set of opportunities to acquire data. Thus, we need a fine analysis of the impact of Rosetta trajectory, Philae position and comet shape on CONSERT measurements, in order to take optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performances and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return of the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation, taking into account the signal polarization. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation. This allows computation on big domains relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.

Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek

2014-05-01

197

Phenostat: visualization and statistical tool for analysis of phenotyping data.  

PubMed

The effective extraction of information from multidimensional data sets derived from phenotyping experiments is a growing challenge in biology. Data visualization tools are important resources that can aid in exploratory data analysis of complex data sets. Phenotyping experiments of model organisms produce data sets in which a large number of phenotypic measures are collected for each individual in a group. A critical initial step in the analysis of such multidimensional data sets is the exploratory analysis of data distribution and correlation. To facilitate the rapid visualization and exploratory analysis of multidimensional complex trait data, we have developed a user-friendly, web-based software tool called Phenostat. Phenostat is composed of a dynamic graphical environment that allows the user to inspect the distribution of multiple variables in a data set simultaneously. Individuals can be selected by directly clicking on the graphs and thus displaying their identity, highlighting corresponding values in all graphs, allowing their inclusion or exclusion from the analysis. Statistical analysis is provided by R package functions. Phenostat is particularly suited for rapid distribution and correlation analysis of subsets of data. An analysis of behavioral and physiologic data stemming from a large mouse phenotyping experiment using Phenostat reveals previously unsuspected correlations. Phenostat is freely available to academic institutions and nonprofit organizations and can be used from our website at: (http://www.bioinfo.embl.it/phenostat/). PMID:17674099

Reuveni, Eli; Carola, Valeria; Banchaabouchi, Mumna Al; Rosenthal, Nadia; Hancock, John M; Gross, Cornelius

2007-09-01

198

Microscopy image segmentation tool: Robust image data analysis  

SciTech Connect

We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)]

2014-03-15

199

Field Quality Analysis as a Tool to Monitor Magnet Production  

SciTech Connect

Field harmonics offer a powerful tool to examine the mechanical structure of accelerator magnets. A large deviation from the nominal values suggests a mechanical defect. Magnets with such defects are likely to have a poor quench performance. Similarly, a trend suggests wear in tooling or a gradual change in the magnet assembly or in the size of a component. This paper presents the use of field quality as a tool to monitor the magnet production of the Relativistic Heavy Ion Collider (RHIC). Several examples are briefly described. Field quality analysis can also rule out a suspected geometric error if it cannot be supported by the symmetry and the magnitude of the measured harmonics.

Gupta, R.; Anerella, M.; Cozzolino, J.; Fisher, D.; Ghosh, A.; Jain, A.; Sampson, W.; Schmalzle, J.; Thompson, P.; Wanderer, P.; Willen, E.

1997-10-18

200

Virtual tool mark generation for efficient striation analysis.  

PubMed

This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within 5-10 degrees. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners. PMID:24502818
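
The core geometric idea, projecting the tip geometry along the direction of tool travel to obtain a virtual mark profile, can be sketched as follows. The binning scheme, coordinate convention, and function name are assumptions for illustration, not the study's implementation.

```python
# Project a 3D tip point cloud along the travel direction to get a mark profile.
import numpy as np

def virtual_mark_profile(points, travel_dir, n_bins=200):
    """points: (N, 3) array of tip-surface coordinates (x lateral, y travel, z depth).
    Returns bin centres and the deepest z per lateral bin (the virtual mark edge)."""
    d = np.asarray(travel_dir, dtype=float)
    d /= np.linalg.norm(d)

    # Remove the component of each point along the travel direction (projection).
    proj = points - np.outer(points @ d, d)

    # Bin laterally (x) and keep the deepest z in each bin: the mark's cross-section.
    x, z = proj[:, 0], proj[:, 2]
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    profile = np.full(n_bins, np.nan)
    for i in range(n_bins):
        sel = idx == i
        if sel.any():
            profile[i] = z[sel].min()      # deepest point cuts the mark (assumed sign)
    return 0.5 * (edges[:-1] + edges[1:]), profile
```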

Ekstrand, Laura; Zhang, Song; Grieve, Taylor; Chumbley, L Scott; Kreiser, M James

2014-07-01

201

Systematic Omics Analysis Review (SOAR) Tool to Support Risk Assessment  

PubMed Central

Environmental health risk assessors are challenged to understand and incorporate new data streams as the field of toxicology continues to adopt new molecular and systems biology technologies. Systematic screening reviews can help risk assessors and assessment teams determine which studies to consider for inclusion in a human health assessment. A tool for systematic reviews should be standardized and transparent in order to consistently determine which studies meet minimum quality criteria prior to performing in-depth analyses of the data. The Systematic Omics Analysis Review (SOAR) tool is focused on assisting risk assessment support teams in performing systematic reviews of transcriptomic studies. SOAR is a spreadsheet tool of 35 objective questions developed by domain experts, focused on transcriptomic microarray studies, and including four main topics: test system, test substance, experimental design, and microarray data. The tool will be used as a guide to identify studies that meet basic published quality criteria, such as those defined by the Minimum Information About a Microarray Experiment standard and the Toxicological Data Reliability Assessment Tool. Seven scientists were recruited to test the tool by using it to independently rate 15 published manuscripts that study chemical exposures with microarrays. Using their feedback, questions were weighted based on importance of the information and a suitability cutoff was set for each of the four topic sections. The final validation resulted in 100% agreement between the users on four separate manuscripts, showing that the SOAR tool may be used to facilitate the standardized and transparent screening of microarray literature for environmental human health risk assessment. PMID:25531884

McConnell, Emma R.; Bell, Shannon M.; Cote, Ila; Wang, Rong-Lin; Perkins, Edward J.; Garcia-Reyero, Natàlia; Gong, Ping; Burgoon, Lyle D.

2014-01-01

202

Comparative analysis of statistical tools to identify recruitment-environment relationships and forecast  

E-print Network

Comparative analysis of statistical tools to identify recruitment-environment relationships. ...-W., and Macklin, S. A. 2005. Comparative analysis of statistical tools to identify recruitment-environment relationships ... beyond the capabilities of traditional statistical analysis paradigms. This study examines the utility

203

AstroStat - A VO Tool for Statistical Analysis  

E-print Network

AstroStat is an easy-to-use tool for performing statistical analysis on data. It has been designed to be compatible with Virtual Observatory (VO) standards thus enabling it to become an integral part of the currently available collection of VO tools. A user can load data in a variety of formats into AstroStat and perform various statistical tests using a menu driven interface. Behind the scenes, all analysis is done using the public domain statistical software - R and the output returned is presented in a neatly formatted form to the user. The analyses performable include exploratory tests, visualizations, distribution fitting, correlation & causation, hypothesis testing, multivariate analysis and clustering. The tool is available in two versions with identical interface and features - as a web service that can be run using any standard browser and as an offline application. AstroStat will provide an easy-to-use interface which can allow for both fetching data and performing powerful statistical analysis on ...

Kembhavi, Ajit K; Kale, Tejas; Jagade, Santosh; Vibhute, Ajay; Garg, Prerak; Vaghmare, Kaustubh; Navelkar, Sharmad; Agrawal, Tushar; Nandrekar, Deoyani; Shaikh, Mohasin

2015-01-01

204

Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)  

NASA Astrophysics Data System (ADS)

Forecasting technology capabilities requires a tool and a process for capturing state-of-the-art technology metrics and estimates for future metrics. A decision support tool, known as the Advanced Technology Lifecycle Analysis System (ATLAS), contains a Technology Tool Box (TTB) database designed to accomplish this goal. Sections of this database correspond to a Work Breakdown Structure (WBS) developed by NASA's Exploration Systems Research and Technology (ESRT) Program. These sections cover the waterfront of technologies required for human and robotic space exploration. Records in each section include technology performance, operations, and programmatic metrics. Timeframes in the database provide metric values for the state of the art (Timeframe 0) and forecasts for timeframes that correspond to spiral development milestones in NASA's Exploration Systems Mission Directorate (ESMD) development strategy. Collecting and vetting data for the TTB will involve technologists from across the agency, the aerospace industry and academia. Technologists will have opportunities to submit technology metrics and forecasts to the TTB development team. Semi-annual forums will facilitate discussions about the basis of forecast estimates. As the tool and process mature, the TTB will serve as a powerful communication and decision support tool for the ESRT program.

Doyle, Monica M.; O'Neil, Daniel A.; Christensen, Carissa B.

2005-02-01

205

SATRAT: Staphylococcus aureus transcript regulatory network analysis tool  

PubMed Central

Staphylococcus aureus is a commensal organism that primarily colonizes the nose of healthy individuals. S. aureus causes a spectrum of infections that range from skin and soft-tissue infections to fatal invasive diseases. S. aureus uses a large number of virulence factors that are regulated in a coordinated fashion. The complex regulatory mechanisms have been investigated in numerous high-throughput experiments. Access to this data is critical to studying this pathogen. Previously, we developed a compilation of microarray experimental data to enable researchers to search, browse, compare, and contrast transcript profiles. We have substantially updated this database and have built a novel exploratory tool, SATRAT, the S. aureus transcript regulatory network analysis tool, based on the updated database. This tool is capable of performing deep searches using a query and generating an interactive regulatory network based on associations among the regulators of any query gene. We believe this integrated regulatory network analysis tool would help researchers explore the missing links and identify novel pathways that regulate virulence in S. aureus. Also, the data model and the network generation code used to build this resource are open sourced, enabling researchers to build similar resources for other bacterial systems. PMID:25653902

Nagarajan, Vijayaraj; Elasri, Mohamed O.

2015-01-01

206

A conceptual design tool for RBCC engine performance analysis  

SciTech Connect

Future reusable launch vehicles will depend on new propulsion technologies to lower system operational costs while maintaining adequate performance. Recently, a number of vehicle systems utilizing rocket-based combined-cycle (RBCC) propulsion have been proposed as possible low-cost space launch solutions. Vehicles using RBCC propulsion have the potential to combine the best aspects of airbreathing propulsion (high average Isp) with the best aspects of rocket propulsion (high propellant bulk density and engine T/W). Proper conceptual assessment of each proposed vehicle will require computer-based tools that allow for quick and cheap, yet sufficiently accurate disciplinary analyses. At Georgia Tech, a spreadsheet-based tool has been developed that uses quasi-1D flow analysis with component efficiencies to parametrically model RBCC engine performance in ejector, fan-ramjet, ramjet and pure rocket modes. The technique is similar to an earlier RBCC modeling technique developed by the Marquardt Corporation in the mid-1960's. For a given sea-level static thrust requirement, the current tool generates engine weight and size data, as well as Isp and thrust data vs. altitude and Mach number. The latter is output in tabular form for use in a trajectory optimization program. This paper reviews the current state of the RBCC analysis tool and the effort to upgrade it from a Microsoft Excel spreadsheet to a design-oriented UNIX program in C suitable for integration into a multidisciplinary design optimization (MDO) framework. © 1997 American Institute of Physics.

Olds, J.R.; Saks, G. [Aerospace Systems Design Laboratory, School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332-0150 (United States)

1997-01-01

207

Application of regularized Richardson-Lucy algorithm for deconvolution of confocal microscopy images

PubMed Central

Although confocal microscopes have considerably smaller contribution of out-of-focus light than widefield microscopes, the confocal images can still be enhanced mathematically if the optical and data acquisition effects are accounted for. For that, several deconvolution algorithms have been proposed. As a practical solution, maximum-likelihood algorithms with regularization have been used. However, the choice of regularization parameters is often unknown although it has considerable effect on the result of deconvolution process. The aims of this work were: to find good estimates of deconvolution parameters; and to develop an open source software package that would allow testing different deconvolution algorithms and that would be easy to use in practice. Here, the Richardson-Lucy algorithm has been implemented together with the total variation regularization in an open source software package IOCBio Microscope. The influence of total variation regularization on deconvolution process is determined by one parameter. We derived a formula to estimate this regularization parameter automatically from the images as the algorithm progresses. To assess the effectiveness of this algorithm, synthetic images were composed on the basis of confocal images of rat cardiomyocytes. From the analysis of deconvolved results, we have determined under which conditions our estimation of total variation regularization parameter gives good results. The estimated total variation regularization parameter can be monitored during deconvolution process and used as a stopping criterion. An inverse relation between the optimal regularization parameter and the peak signal-to-noise ratio of an image is shown. Finally, we demonstrate the use of the developed software by deconvolving images of rat cardiomyocytes with stained mitochondria and sarcolemma obtained by confocal and widefield microscopes. PMID:21323670
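
For orientation, here is a minimal one-dimensional sketch of Richardson-Lucy iteration with total variation regularization; the fixed regularization parameter, synthetic signal, and Gaussian PSF are assumptions made for illustration, whereas the IOCBio Microscope package described above estimates the regularization parameter automatically as the iterations progress.

    # Hedged 1-D sketch: Richardson-Lucy deconvolution with a total-variation term.
    # lambda_tv is fixed here; the real package estimates it from the data.
    import numpy as np

    def convolve(x, psf):
        return np.convolve(x, psf, mode="same")

    def rl_tv(observed, psf, lambda_tv=0.002, iterations=50, eps=1e-12):
        psf_mirror = psf[::-1]
        estimate = np.full_like(observed, observed.mean())
        for _ in range(iterations):
            blurred = convolve(estimate, psf) + eps
            correction = convolve(observed / blurred, psf_mirror)
            grad = np.gradient(estimate)
            tv_div = np.gradient(grad / (np.abs(grad) + eps))  # div(grad x / |grad x|)
            estimate = estimate * correction / (1.0 - lambda_tv * tv_div)
            estimate = np.clip(estimate, 0, None)              # keep intensities non-negative
        return estimate

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        truth = np.zeros(200)
        truth[60:80] = 5.0
        truth[120:125] = 10.0
        psf = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
        psf /= psf.sum()
        observed = rng.poisson(convolve(truth, psf) + 0.1).astype(float)
        restored = rl_tv(observed, psf)
        print("residual RMS:", np.sqrt(np.mean((convolve(restored, psf) - observed) ** 2)))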

Laasmaa, M; Vendelin, M; Peterson, P

2011-01-01

208

Validating and Verifying a New Thermal-Hydraulic Analysis Tool  

SciTech Connect

The Idaho National Engineering and Environmental Laboratory (INEEL) has developed a new analysis tool by coupling the Fluent computational fluid dynamics (CFD) code to the RELAP5-3D©/ATHENA advanced thermal-hydraulic analysis code. This tool enables researchers to perform detailed, three-dimensional analyses using Fluent's CFD capability while the boundary conditions required by the Fluent calculation are provided by the balance-of-system model created using RELAP5-3D©/ATHENA. Both steady-state and transient calculations can be performed, using many working fluids and point- to three-dimensional neutronics. A general description of the techniques used to couple the codes is given. The validation and verification (V and V) matrix is outlined. V and V is presently ongoing. (authors)

Schultz, Richard R.; Weaver, Walter L.; Ougouag, Abderrafi M. [INEEL - Idaho National Engineering and Environmental Laboratory, Idaho Falls, ID 83415 (United States); Wieselquist, William A. [North Carolina State University, 700 Hillsborough St, Raleigh, NC 27606 (United States)

2002-07-01

209

Analysis Tools for Next-Generation Hadron Spectroscopy Experiments  

E-print Network

The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

M. Battaglieri; B. J. Briscoe; A. Celentano; S. -U. Chung; A. D'Angelo; R. De Vita; M. Döring; J. Dudek; S. Eidelman; S. Fegan; J. Ferretti; G. Fox; G. Galata; H. Garcia-Tecocoatzi; D. I. Glazier; B. Grube; C. Hanhart; M. Hoferichter; S. M. Hughes; D. G. Ireland; B. Ketzer; F. J. Klein; B. Kubis; B. Liu; P. Masjuan; V. Mathieu; B. McKinnon; R. Mitchell; F. Nerling; S. Paul; J. R. Pelaez; J. Rademacker; A. Rizzo; C. Salgado; E. Santopinto; A. V. Sarantsev; T. Sato; T. Schlüter; M. L. L. da Silva; I. Stankovic; I. Strakovsky; A. Szczepaniak; A. Vassallo; N. K. Walford; D. P. Watts; L. Zana

2014-12-19

210

SMART (Shop floor Modeling, Analysis and Reporting Tool Project  

NASA Technical Reports Server (NTRS)

This document summarizes the design and prototype of the Shop floor Modeling, Analysis, and Reporting Tool (S.M.A.R.T.). A detailed description of it is found in the full documentation given to the NASA liaison. This documentation is also found on the A.R.I.S.E. Center web site, under a protected directory. Only authorized users can gain access to this site.

Centeno, Martha A.; Garcia, Maretys L.; Mendoza, Alicia C.; Molina, Louis A.; Correa, Daisy; Wint, Steve; Doice, Gregorie; Reyes, M. Florencia

1999-01-01

211

Phenostat: visualization and statistical tool for analysis of phenotyping data  

Microsoft Academic Search

The effective extraction of information from multidimensional data sets derived from phenotyping experiments is a growing challenge in biology. Data visualization tools are important resources that can aid in exploratory data analysis of complex data sets. Phenotyping experiments of model organisms produce data sets in which a large number of phenotypic measures are collected for each individual in a group.

Eli Reuveni; Valeria Carola; Mumna Al Banchaabouchi; Nadia Rosenthal; John M. Hancock; Cornelius Gross

2007-01-01

212

Scenario Analysis in an Automated Tool for Requirements Engineering  

Microsoft Academic Search

This paper presents an automated tool for scenario-driven requirements engineering where scenario analysis plays the central role. It is shown that a scenario can be described by three views of data flow, entity relationship and state transition models by slight extensions of classic data flow, entity relationship and state transition diagrams. The notions of consistency and completeness of a

Hong Zhu; Lingzi Jin

2000-01-01

213

Stranger: An Automata-Based String Analysis Tool for PHP  

Microsoft Academic Search

Stranger is an automata-based string analysis tool for finding and eliminating string-related security vulnerabilities in PHP applications. Stranger uses symbolic forward and backward reachability analyses to compute the possible values that the string expressions can take during program execution. Stranger can automatically (1) prove that an application is free from specified attacks or (2) generate vulnerability signatures that characterize all

Fang Yu; Muath Alkhalaf; Tevfik Bultan

2010-01-01

214

Sound methods and effective tools for engineering modeling and analysis  

Microsoft Academic Search

Modeling and analysis is indispensable in engineering. To be safe and effective, a modeling method requires a language with a validated semantics; feature-rich, easy-to-use, dependable tools; and low engineering costs. Today we lack adequate means to develop such methods. We present a partial solution combining two techniques: formal methods for language design, and package-oriented programming for function and usability at

David Coppit; Kevin J. Sullivan

2003-01-01

215

On the next generation of reliability analysis tools  

NASA Technical Reports Server (NTRS)

The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.

Babcock, Philip S., IV; Leong, Frank; Gai, Eli

1987-01-01

216

Aerospace Power Systems Design and Analysis (APSDA) Tool  

NASA Technical Reports Server (NTRS)

The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool provides a user interface and operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

Truong, Long V.

1998-01-01

217

Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool  

NASA Technical Reports Server (NTRS)

The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify a variety of plant phenomena and improve monitoring capabilities.
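
The sketch below gives a much-simplified picture of the multi-satellite fusion and temporal processing described above: daily NDVI stacks from two sensors are fused, cloud-flagged pixels are masked, and a maximum-value composite is taken over an N-day window. The arrays, flags, and compositing rule are illustrative assumptions, not the actual TSPT algorithms.

    # Hedged sketch: fuse two satellites per day, mask clouds, then take the
    # maximum NDVI over each compositing window (all data here are synthetic).
    import numpy as np

    def max_value_composite(ndvi_a, ndvi_b, cloud_a, cloud_b, window=8):
        """ndvi_*: (days, rows, cols) arrays; cloud_*: boolean masks (True = cloudy)."""
        a = np.where(cloud_a, np.nan, ndvi_a)
        b = np.where(cloud_b, np.nan, ndvi_b)
        daily = np.fmax(a, b)                            # per-day fusion of the two satellites
        composites = []
        for start in range(0, daily.shape[0], window):
            block = daily[start:start + window]
            composites.append(np.nanmax(block, axis=0))  # greenest clear view in the window
        return np.stack(composites)                      # NaN only where every view was cloudy

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        shape = (16, 50, 50)                             # 16 days of 50x50 pixels per satellite
        ndvi_a, ndvi_b = rng.uniform(0, 1, shape), rng.uniform(0, 1, shape)
        cloud_a, cloud_b = rng.random(shape) < 0.3, rng.random(shape) < 0.3
        comp = max_value_composite(ndvi_a, ndvi_b, cloud_a, cloud_b)
        print("composite stack shape:", comp.shape)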

McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

2008-01-01

218

Interactive Software Fault Analysis Tool for Operational Anomaly Resolution  

NASA Technical Reports Server (NTRS)

Resolving software operational anomalies frequently requires a significant amount of resources for software troubleshooting activities. The time required to identify a root cause of the anomaly in the software may lead to significant timeline impacts and in some cases, may extend to compromise of mission and safety objectives. An integrated tool that supports software fault analysis based on the observed operational effects of an anomaly could significantly reduce the time required to resolve operational anomalies; increase confidence for the proposed solution; identify software paths to be re-verified during regression testing; and, as a secondary product of the analysis, identify safety critical software paths.

Chen, Ken

2002-01-01

219

ISAC: A tool for aeroservoelastic modeling and analysis  

NASA Technical Reports Server (NTRS)

The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

Adams, William M., Jr.; Hoadley, Sherwood Tiffany

1993-01-01

220

Detecting atoms in deconvolution (ROBUST 2008 poster)

E-print Network

Jaroslav Pazdera: ... the atomic deconvolution problem, and we propose the estimator for an atom location and give its asymptotics ... In the ordinary deconvolution problem one wants

Jureckova, Jana

221

Design and Application of the Exploration Maintainability Analysis Tool  

NASA Technical Reports Server (NTRS)

Conducting human exploration missions beyond Low Earth Orbit (LEO) will present unique challenges in the areas of supportability and maintainability. The durations of proposed missions can be relatively long and re-supply of logistics, including maintenance and repair items, will be limited or non-existent. In addition, mass and volume constraints in the transportation system will limit the total amount of logistics that can be flown along with the crew. These constraints will require that new strategies be developed with regards to how spacecraft systems are designed and maintained. NASA is currently developing Design Reference Missions (DRMs) as an initial step in defining future human missions. These DRMs establish destinations and concepts of operation for future missions, and begin to define technology and capability requirements. Because of the unique supportability challenges, historical supportability data and models are not directly applicable for establishing requirements for beyond LEO missions. However, supportability requirements could have a major impact on the development of the DRMs. The mass, volume, and crew resources required to support the mission could all be first order drivers in the design of missions, elements, and operations. Therefore, there is a need for enhanced analysis capabilities to more accurately establish mass, volume, and time requirements for supporting beyond LEO missions. Additionally, as new technologies and operations are proposed to reduce these requirements, it is necessary to have accurate tools to evaluate the efficacy of those approaches. In order to improve the analysis of supportability requirements for beyond LEO missions, the Space Missions Analysis Branch at the NASA Langley Research Center is developing the Exploration Maintainability Analysis Tool (EMAT). This tool is a probabilistic simulator that evaluates the need for repair and maintenance activities during space missions and the logistics and crew requirements to support those activities. Using a Monte Carlo approach, the tool simulates potential failures in defined systems, based on established component reliabilities, and then evaluates the capability of the crew to repair those failures given a defined store of spares and maintenance items. Statistical analysis of Monte Carlo runs provides probabilistic estimates of overall mission safety and reliability. This paper will describe the operation of the EMAT, including historical data sources used to populate the model, simulation processes, and outputs. Analysis results are provided for a candidate exploration system, including baseline estimates of required sparing mass and volume. Sensitivity analysis regarding the effectiveness of proposed strategies to reduce mass and volume requirements and improve mission reliability is included in these results.
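
A minimal sketch of the Monte Carlo logic described above: component failures are sampled from exponential reliability models over the mission duration, spares are consumed on each failure, and the fraction of trials in which the spares allotment suffices estimates supportability. The component list, failure rates, and sufficiency criterion are invented for illustration and are not EMAT data or its actual models.

    # Hedged sketch of a spares-sufficiency Monte Carlo; all numbers are made up.
    import random

    COMPONENTS = [  # (name, mean time between failures [days], spares carried)
        ("pump", 400.0, 2),
        ("fan", 250.0, 3),
        ("controller", 900.0, 1),
    ]

    def one_mission(duration_days):
        for name, mtbf, spares in COMPONENTS:
            t, failures = 0.0, 0
            while True:
                t += random.expovariate(1.0 / mtbf)   # time to next failure of this item
                if t > duration_days:
                    break
                failures += 1
            if failures > spares:
                return False                           # ran out of spares for this component
        return True

    def probability_of_sufficiency(duration_days=500.0, trials=20000):
        return sum(one_mission(duration_days) for _ in range(trials)) / trials

    if __name__ == "__main__":
        random.seed(42)
        print("P(spares sufficient):", probability_of_sufficiency())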

Stromgren, Chel; Terry, Michelle; Crillo, William; Goodliff, Kandyce; Maxwell, Andrew

2012-01-01

222

A tool framework for static and dynamic analysis of object-oriented software with templates  

Microsoft Academic Search

The developers of high-performance scientific applications often work in complex computing environments that place heavy demands on program analysis tools. The developers need tools that interoperate, are portable across machine architectures, and provide source-level feedback. In this paper, we describe a tool framework, the Program Database Toolkit (PDT), that supports the development of program analysis tools meeting these requirements. PDT

Kathleen A. Lindlan; Janice E. Cuny; Allen D. Malony; Sameer Shende; Forschungszentrum Juelich; Reid Rivenburgh; Craig Edward Rasmussen; Bernd Mohr

2000-01-01

223

Architecture of Approximate Deconvolution Models of Turbulence  

Microsoft Academic Search

This report presents the mathematical foundation of approximate deconvolution LES models together with the model phenomenology downstream of the theory. This mathematical foundation now begins to be complete for the incompressible Navier-Stokes equations. It is built upon averaging, deconvolving and addressing closure so as to obtain the physically correct energy and helicity balances in the LES model. We show how

A. Labovschii; W. Layton; C. Manica; M. Neda; L. Rebholz; I. Stanculescu; C. Trenchea

224

Blind Poissonian images deconvolution with framelet regularization.  

PubMed

We propose a maximum a posteriori blind Poissonian images deconvolution approach with framelet regularization for the image and total variation (TV) regularization for the point spread function. Compared with the TV based methods, our algorithm not only suppresses noise effectively but also recovers edges and detailed information. Moreover, the split Bregman method is exploited to solve the resulting minimization problem. Comparative results on both simulated and real images are reported. PMID:23455078
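
For orientation only, the sketch below shows a classical alternating (blind) Richardson-Lucy scheme under a Poisson noise model, estimating the image and the point spread function in turn; it deliberately omits the framelet and total variation regularization and the split Bregman solver that the paper actually proposes, so it is a simplified stand-in rather than the published algorithm.

    # Hedged 1-D sketch of blind Richardson-Lucy: alternate multiplicative updates
    # of the image and the PSF; no framelet/TV regularization is included.
    import numpy as np

    def conv(a, b):
        return np.convolve(a, b, mode="same")

    def blind_rl(observed, psf_size=21, outer=20, inner=5, eps=1e-12):
        image = np.full_like(observed, observed.mean(), dtype=float)
        psf = np.ones(psf_size) / psf_size
        half = psf_size // 2
        for _ in range(outer):
            for _ in range(inner):                    # Richardson-Lucy update of the image
                ratio = observed / (conv(image, psf) + eps)
                image = image * conv(ratio, psf[::-1])
            for _ in range(inner):                    # symmetric update of the PSF
                ratio = observed / (conv(image, psf) + eps)
                corr = np.correlate(ratio, image, mode="full")
                centre = corr.size // 2
                psf = psf * corr[centre - half:centre + half + 1]
                psf = np.clip(psf, 0.0, None)
                psf = psf / (psf.sum() + eps)         # keep the PSF non-negative, unit sum
        return image, psf

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        truth = np.zeros(256)
        truth[100:110] = 8.0
        truth[180] = 20.0
        true_psf = np.exp(-0.5 * (np.arange(-10, 11) / 2.5) ** 2)
        true_psf /= true_psf.sum()
        observed = rng.poisson(conv(truth, true_psf) + 0.2).astype(float)
        est_image, est_psf = blind_rl(observed)
        print("estimated PSF peak index:", int(np.argmax(est_psf)))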

Fang, Houzhang; Yan, Luxin; Liu, Hai; Chang, Yi

2013-02-15

225

Deconvolution of diode-laser spectra  

NASA Technical Reports Server (NTRS)

A new technique has been developed for deconvolving diode-laser spectra. This technique treats Doppler broadening, collisional broadening, and instrumental effects simultaneously. This technique is superior to previous deconvolution methods in the recovery of line-strength and transition-frequency information. A section of the ethane spectrum near 12 microns is used as an example. This new approach applies to any spectroscopy in which the instrumental resolution is narrower than actual linewidths.

Halsey, G. W.; Jennings, D. E.; Blass, W. E.

1985-01-01

226

Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)  

NASA Technical Reports Server (NTRS)

An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architecture and studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, subsystem familiarity on the part of the intended user group and in the analysis of results is assumed. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

2007-01-01

227

A new bioinformatics analysis tools framework at EMBL-EBI

PubMed Central

The EMBL-EBI provides access to various mainstream sequence analysis applications. These include sequence similarity search services such as BLAST, FASTA, InterProScan and multiple sequence alignment tools such as ClustalW, T-Coffee and MUSCLE. Through the sequence similarity search services, the users can search mainstream sequence databases such as EMBL-Bank and UniProt, and more than 2000 completed genomes and proteomes. We present here a new framework aimed at both novice as well as expert users that exposes novel methods of obtaining annotations and visualizing sequence analysis results through one uniform and consistent interface. These services are available over the web and via Web Services interfaces for users who require systematic access or want to interface with customized pipelines and workflows using common programming languages. The framework features novel result visualizations and integration of domain and functional predictions for protein database searches. It is available at http://www.ebi.ac.uk/Tools/sss for sequence similarity searches and at http://www.ebi.ac.uk/Tools/msa for multiple sequence alignments. PMID:20439314

Goujon, Mickael; McWilliam, Hamish; Li, Weizhong; Valentin, Franck; Squizzato, Silvano; Paern, Juri; Lopez, Rodrigo

2010-01-01

228

Networking Sensor Observations, Forecast Models & Data Analysis Tools  

NASA Astrophysics Data System (ADS)

This presentation explores the interaction between sensor webs and forecast models and data analysis processes within service oriented architectures (SOA). Earth observation data from surface monitors and satellite sensors and output from earth science models are increasingly available through open interfaces that adhere to web standards, such as the OGC Web Coverage Service (WCS), OGC Sensor Observation Service (SOS), OGC Web Processing Service (WPS), SOAP-Web Services Description Language (WSDL), or RESTful web services. We examine the implementation of these standards from the perspective of forecast models and analysis tools. Interoperable interfaces for model inputs, outputs, and settings are defined with the purpose of connecting them with data access services in service oriented frameworks. We review current best practices in modular modeling, such as OpenMI and ESMF/Mapl, and examine the applicability of those practices to service oriented sensor webs. In particular, we apply sensor-model-analysis interfaces within the context of a wildfire smoke analysis and forecasting scenario used in the recent GEOSS Architecture Implementation Pilot. Fire locations derived from satellites and surface observations and reconciled through a US Forest Service SOAP web service are used to initialize a CALPUFF smoke forecast model. The results of the smoke forecast model are served through an OGC WCS interface that is accessed from an analysis tool that extracts areas of high particulate matter concentrations and a data comparison tool that compares the forecasted smoke with Unattended Aerial System (UAS) collected imagery and satellite-derived aerosol indices. An OGC WPS that calculates population statistics based on polygon areas is used with the extracted areas of high particulate matter to derive information on the population expected to be impacted by smoke from the wildfires. We described the process for enabling the fire location, smoke forecast, smoke observation, and population statistics services to be registered with the GEOSS registry and made findable through the GEOSS Clearinghouse. The fusion of data sources and different web service interfaces illustrates the agility in using standard interfaces and helps define the type of input and output interfaces needed to connect models and analysis tools within sensor webs.

Falke, S. R.; Roberts, G.; Sullivan, D.; Dibner, P. C.; Husar, R. B.

2009-12-01

229

Integrated Modeling Tools for Thermal Analysis and Applications  

NASA Technical Reports Server (NTRS)

Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along their own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model, and also makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, applications where the thermal modeling is "in the loop" with sensitivity analysis, optimization and optical performance drawn from our experiences with the Space Interferometry Mission (SIM), and the Next Generation Space Telescope (NGST) are presented.
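
As a toy illustration of the conduction-network part of the capability described above, the sketch below assembles node-to-node conductances into a matrix, imposes one fixed-temperature boundary node, and solves for the remaining nodal temperatures. The three-node network, conductance values, and heat loads are made up, and the real IMOS solvers additionally handle temperature-dependent properties and transients.

    # Hedged sketch: steady-state linear conduction network, G * T = Q, with one
    # node held at a fixed sink temperature. All values are illustrative.
    import numpy as np

    links = [(0, 1, 0.5), (1, 2, 0.8), (0, 2, 0.2)]   # conductances [W/K] between node pairs
    heat_load = np.array([5.0, 0.0, 0.0])             # applied heat [W] at each node
    boundary = {2: 280.0}                             # node 2 held at 280 K (sink)

    n = 3
    G = np.zeros((n, n))
    for i, j, g in links:                             # assemble the conduction matrix
        G[i, i] += g
        G[j, j] += g
        G[i, j] -= g
        G[j, i] -= g

    Q = heat_load.copy()
    for node, temp in boundary.items():               # impose fixed-temperature condition
        G[node, :] = 0.0
        G[node, node] = 1.0
        Q[node] = temp

    T = np.linalg.solve(G, Q)
    print("node temperatures [K]:", np.round(T, 2))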

Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

1999-01-01

230

Colossal Tooling Design: 3D Simulation for Ergonomic Analysis  

NASA Technical Reports Server (NTRS)

The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

2003-01-01

231

Coastal Online Analysis and Synthesis Tool 2.0 (COAST)  

NASA Technical Reports Server (NTRS)

The Coastal Online Assessment and Synthesis Tool (COAST) 3D geobrowser has been developed to integrate disparate coastal datasets from NASA and other sources into a desktop tool that provides new data visualization and analysis capabilities for coastal researchers, managers, and residents. It is built upon the widely used, NASA-developed, open source World Wind geobrowser from NASA Ames (Patrick Hogan et al.); the .NET and C# version is used for development. It leverages code samples shared by the World Wind community, and the COAST 2.0 enhancement direction is based on coastal science community feedback and needs assessment (GOMA). The main objective is to empower the user to bring more user-meaningful data into multi-layered, multi-temporal spatial context.

Brown, Richard B.; Navard, Andrew R.; Nguyen, Beth T.

2009-01-01

232

Seismic analysis and learning tools for high school students  

NASA Astrophysics Data System (ADS)

Easy access and processing of geophysical data is still a challenge in modern geosciences. Many existing tools require a long learning time before a user can master them effectively, and moreover their PC interfaces need to be updated to modern GUI (Graphical User Interface) standards. Thus most of the existing tools are not suitable for educational purposes. New high-level programming languages like Java offer opportunities to develop user-friendly tools for geophysical data access and processing. We present our software tool for statistical analysis of earthquake catalogs, named "Seismotectonics". It was developed as part of the SEIS/SCHOOL/NORWAY project and is used both for electronic learning and as a practical tool for high school students. Having a user-friendly interface, it can be easily operated by students while providing the most important features available in professional software. Based on Java technology, it runs on most computers and operating systems. The application includes a geological map of Norway (Mosar, 2002) and a Fennoscandian earthquake catalog covering the period 1375-2001. The software provides features for histogram and fault plane solution plotting and space-time filtering of a catalog or its subset; extra features useful for educational purposes, such as earthquake animation, are also available. We present two studies based on this package: i) catalog searches for explosions and frosting events, which have remarkable time distribution patterns, and ii) isolation of large historical earthquakes whose magnitudes may be positively biased. In the latter case, the Lurøy earthquake of 31 Aug. 1819, claimed to be the largest in NW Europe, was found by the high school students not to be the largest.

Boulaenko, M. E.; Husebye, E. S.

2003-04-01

233

A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra  

NASA Astrophysics Data System (ADS)

A quantitative evaluation of various deconvolution methods and their applications in processing plasma emitted spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, Richardson-Lucy's method, the maximum a posteriori method and Gold's method. The evaluation criteria include minimization of the sum of squared errors and the sum of squared relative error of parameters, and their rate of convergence. After comparing deconvolved results using these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when considering all the criteria above. The applications to the actual plasma spectra obtained from the EAST tokamak with these methods are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that the effects of instruments can be satisfactorily eliminated and clear spectra are recovered.
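
As an illustration of one of the methods compared above, the sketch below applies Gold's multiplicative ratio iteration to a synthetic spectrum and tracks the sum of squared errors used as an evaluation criterion; the test profile and instrument function are invented for illustration and are not the EAST tokamak data.

    # Hedged 1-D sketch of Gold's ratio iteration with a sum-of-squared-errors history.
    import numpy as np

    def conv(a, b):
        return np.convolve(a, b, mode="same")

    def gold(observed, kernel, iterations=200, eps=1e-12):
        estimate = observed.copy()
        sse_history = []
        for _ in range(iterations):
            reblurred = conv(estimate, kernel) + eps
            estimate = estimate * observed / reblurred      # Gold's multiplicative update
            sse_history.append(np.sum((conv(estimate, kernel) - observed) ** 2))
        return estimate, sse_history

    if __name__ == "__main__":
        x = np.linspace(-10, 10, 400)
        truth = np.exp(-((x - 2) / 0.3) ** 2) + 0.6 * np.exp(-((x + 1) / 0.2) ** 2)
        kx = np.linspace(-3, 3, 61)
        kernel = np.exp(-(kx / 1.0) ** 2)
        kernel /= kernel.sum()
        observed = conv(truth, kernel)                      # noise-free broadened spectrum
        estimate, sse = gold(observed, kernel)
        print("final sum of squared errors:", sse[-1])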

Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

2013-06-01

234

POPBAM: Tools for Evolutionary Analysis of Short Read Sequence Alignments  

PubMed Central

Background While many bioinformatics tools currently exist for assembling and discovering variants from next-generation sequence data, there are very few tools available for performing evolutionary analyses from these data. Evolutionary and population genomics studies hold great promise for providing valuable insights into natural selection, the effect of mutations on phenotypes, and the origin of species. Thus, there is a need for an extensible and flexible computational tool that can function into a growing number of evolutionary bioinformatics pipelines. Results This paper describes the POPBAM software, which is a comprehensive set of computational tools for evolutionary analysis of whole-genome alignments consisting of multiple individuals, from multiple populations or species. POPBAM works directly from BAM-formatted assembly files, calls variant sites, and calculates a variety of commonly used evolutionary sequence statistics. POPBAM is designed primarily to perform analyses in sliding windows across chromosomes or scaffolds. POPBAM accurately measures nucleotide diversity, population divergence, linkage disequilibrium, and the frequency spectrum of mutations from two or more populations. POPBAM can also produce phylogenetic trees of all samples in a BAM file. Finally, I demonstrate that the implementation of POPBAM is both fast and memory-efficient, and also can feasibly scale to the analysis of large BAM files with many individuals and populations. Software: The POPBAM program is written in C/C++ and is available from http://dgarriga.github.io/POPBAM. The program has few dependencies and can be built on a variety of Linux platforms. The program is open-source and users are encouraged to participate in the development of this resource. PMID:24027417
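
To illustrate one of the statistics mentioned above, the sketch below computes average pairwise nucleotide diversity (pi) in sliding windows over a toy alignment; POPBAM computes this and several other statistics directly from BAM files, which this simplified example does not attempt.

    # Hedged sketch: per-site pairwise nucleotide diversity in sliding windows
    # over a made-up alignment of equal-length sequences.
    from itertools import combinations

    def pairwise_diff(a, b):
        return sum(1 for x, y in zip(a, b) if x != y and x != "N" and y != "N")

    def sliding_pi(sequences, window=10, step=5):
        n = len(sequences)
        length = len(sequences[0])
        n_pairs = n * (n - 1) / 2
        results = []
        for start in range(0, length - window + 1, step):
            total = sum(pairwise_diff(s1[start:start + window], s2[start:start + window])
                        for s1, s2 in combinations(sequences, 2))
            results.append((start, total / (n_pairs * window)))   # per-site diversity
        return results

    if __name__ == "__main__":
        alignment = [
            "ACGTACGTACGTTACGGACGTACGTACGTA",
            "ACGTACGAACGTTACGGACGTACGTACCTA",
            "ACGAACGTACGTTACGGACGTACGAACGTA",
            "ACGTACGTACGTTTCGGACGTACGTACGTA",
        ]
        for start, pi in sliding_pi(alignment):
            print(f"window starting at {start}: pi = {pi:.3f}")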

Garrigan, Daniel

2013-01-01

235

Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.  

ERIC Educational Resources Information Center

This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow

Carlson, David H.

1986-01-01

236

Development of unified plotting tools for GA transport analysis  

NASA Astrophysics Data System (ADS)

A collection of python classes for the TGYRO suite of codes (NEO, GYRO, TGYRO, TGLF) has been developed that provide both the expert user with conceptually simple access to all code output data, and the casual end user with simple command-line control of plotting. The user base for these transport analysis codes continues to grow, raising the urgency of modernizing and unifying the plotting tools used for post-simulation analysis. Simultaneously, there is a push toward larger-scale fusion modeling underscoring the need for a revised, modernized approach to data management and analysis. The TGYRO suite is currently in use at all major fusion laboratories worldwide, and allows the user to make steady-state profile predictions for existing devices and future reactors, and simultaneously to carry out fundamental research on plasma transport (both collisional and turbulent).
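
In the same spirit, a minimal sketch of a Python class with simple command-line plotting is shown below; the two-column ASCII profile format, file name handling, and options are hypothetical stand-ins and do not reflect the actual TGYRO/GYRO output layout or the classes described above.

    # Hedged sketch: load a (hypothetical) two-column profile file and plot it
    # from the command line. Requires numpy and matplotlib.
    import argparse
    import numpy as np
    import matplotlib.pyplot as plt

    class ProfileOutput:
        def __init__(self, path):
            data = np.loadtxt(path)                 # columns assumed: radius, value
            self.radius, self.value = data[:, 0], data[:, 1]

        def plot(self, ylabel="profile", save=None):
            plt.plot(self.radius, self.value, "-o")
            plt.xlabel("r/a")
            plt.ylabel(ylabel)
            plt.tight_layout()
            if save:
                plt.savefig(save)
            else:
                plt.show()

    if __name__ == "__main__":
        parser = argparse.ArgumentParser(description="Quick-look plot of a profile file")
        parser.add_argument("path", help="two-column ASCII file: radius, value")
        parser.add_argument("--ylabel", default="profile")
        parser.add_argument("--save", default=None, help="write an image instead of showing")
        args = parser.parse_args()
        ProfileOutput(args.path).plot(ylabel=args.ylabel, save=args.save)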

Buuck, M.; Candy, J.

2011-11-01

237

TA-DA: A TOOL FOR ASTROPHYSICAL DATA ANALYSIS  

SciTech Connect

We present the Tool for Astrophysical Data Analysis (TA-DA), a new software aimed to greatly simplify and improve the analysis of stellar photometric data in comparison with theoretical models, and allow the derivation of stellar parameters from multi-band photometry. Its flexibility allows one to address a number of such problems: from the interpolation of stellar models, or sets of stellar physical parameters in general, to the computation of synthetic photometry in arbitrary filters or units; from the analysis of observed color-magnitude diagrams to a Bayesian derivation of stellar parameters (and extinction) based on multi-band data. TA-DA is available as a pre-compiled Interactive Data Language widget-based application; its graphical user interface makes it considerably user-friendly. In this paper, we describe the software and its functionalities.

Da Rio, Nicola [European Space Agency, Keplerlaan 1, 2200-AG Noordwijk (Netherlands); Robberto, Massimo, E-mail: ndario@rssd.esa.int [Space Telescope Science Institute, 3700 San Martin Dr., Baltimore, MD 21218 (United States)

2012-12-01

238

Array-conditioned deconvolution of multiple component teleseismic recordings  

E-print Network

We investigate the applicability of an array-conditioned deconvolution technique, developed for analyzing borehole seismic exploration data, to teleseismic receiver functions and data preprocessing steps for scattered ...

Chen, C. -W.

2010-01-01

239

The Precision Formation Flying Integrated Analysis Tool (PFFIAT)  

NASA Technical Reports Server (NTRS)

Several space missions presently in the concept phase (e.g. Stellar Imager, Sub-millimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.

Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor

2004-01-01

240

CFD Methods and Tools for Multi-Element Airfoil Analysis  

NASA Technical Reports Server (NTRS)

This lecture will discuss the computational tools currently available for high-lift multi-element airfoil analysis. It will present an overview of a number of different numerical approaches, their current capabilities, shortcomings, and computational costs. The lecture will be limited to viscous methods, including inviscid/boundary layer coupling methods, and incompressible and compressible Reynolds-averaged Navier-Stokes methods. Both structured and unstructured grid generation approaches will be presented. Two different structured grid procedures are outlined, one which uses multi-block patched grids, the other uses overset chimera grids. Turbulence and transition modeling will be discussed.

Rogers, Stuart E.; George, Michael W. (Technical Monitor)

1995-01-01

241

Gamma test analysis tools for non-linear time series

E-print Network

Gamma test analysis tools for non-linear time series, by Samuel Edward Kemp, Department of Computing, August 2006 (thesis front-matter extract).

Jones, Antonia J.

242

Image analysis tools and emerging algorithms for expression proteomics  

PubMed Central

Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput shotgun proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user's and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614

English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.

2012-01-01

243

SAVANT: Solar Array Verification and Analysis Tool Demonstrated  

NASA Technical Reports Server (NTRS)

The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

Chock, Ricaurte

2000-01-01

244

VisIt: Interactive Parallel Visualization and Graphical Analysis Tool  

NASA Astrophysics Data System (ADS)

VisIt is a free interactive parallel visualization and graphical analysis tool for viewing scientific data on Unix and PC platforms. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images for presentations. VisIt contains a rich set of visualization features so that you can view your data in a variety of ways. It can be used to visualize scalar and vector fields defined on two- and three-dimensional (2D and 3D) structured and unstructured meshes. VisIt was designed to handle very large data set sizes in the terascale range and yet can also handle small data sets in the kilobyte range. See the table below for more details about the tool's features. VisIt was developed by the Department of Energy (DOE) Advanced Simulation and Computing Initiative (ASCI) to visualize and analyze the results of terascale simulations. It was developed as a framework for adding custom capabilities and rapidly deploying new visualization technologies. Although the primary driving force behind the development of VisIt was for visualizing terascale data, it is also well suited for visualizing data from typical simulations on desktop systems.

Department Of Energy (DOE) Advanced Simulation; Computing Initiative (ASCI)

2011-03-01

245

Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation  

SciTech Connect

The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

Neubauer, J.

2014-12-01

246

BLIND DECONVOLUTION AND DEBLURRING IN IMAGE ANALYSIS  

E-print Network

Supported in part by an ARC grant, a grant from the National Security Agency of the USA, and a grant from the National Science Foundation of the USA. Photographic images, whether recorded by digital or analogue means, have ... a point-spread function would usually be one that has relatively sharp boundaries, such as those in a photographic test pattern.

Qiu, Peihua

247

BLIND DECONVOLUTION AND DEBLURRING IN IMAGE ANALYSIS  

E-print Network

Supported in part by an ARC grant, a grant from the National Security Agency of the USA, and a grant from the National Science Foundation of the USA. Photographic images, whether recorded by digital or analogue means ... in a photographic test pattern. We introduce a difference-based approximation, D say, to the derivative

Qiu, Peihua

248

Tools for integrated sequence-structure analysis with UCSF Chimera  

PubMed Central

Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is available for Microsoft Windows, Apple Mac OS X, Linux, and other platforms from . PMID:16836757

Meng, Elaine C; Pettersen, Eric F; Couch, Gregory S; Huang, Conrad C; Ferrin, Thomas E

2006-01-01

249

Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool  

SciTech Connect

Report describes process and methodology associated with a rapid prototype tool for integrating project risk analysis and health & safety risk analysis for decontamination and decommissioning projects.

Unwin, Stephen D.; Seiple, Timothy E.

2009-05-28

250

Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution  

NASA Astrophysics Data System (ADS)

The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. The convolution process results in smoothed measurements and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of sensor response and improve the resolution for continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that cross terms (i.e. response of the pick-up coil for one axis to magnetic signal along other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution in UDECON is achieved by searching for the minimum ABIC while shifting the sensor response (to account for possible mispositioning of the sample on the tray) and a smoothness parameter in ranges defined by the user. Comparison of deconvolution results using sensor response estimated from integrated point source measurements and other methods suggests that the integrated point source estimate yields better results (smaller ABIC). The noise characteristics of magnetometer measurements and the reliability of the UDECON algorithm were tested using repeated (a total of 400 times) natural remanence measurements of a u-channel sample before and after stepwise alternating field demagnetizations. Using a series of synthetic data constructed based on a real paleomagnetic record, we demonstrate that optimized deconvolution using UDECON can greatly help reveal detailed paleomagnetic information such as excursions that may be smoothed out during pass-through measurement. Application of UDECON to the vast amount of existing and future pass-through paleomagnetic and rock magnetic measurements on sediments recovered especially through ocean drilling programs will contribute to our understanding of the geodynamo and paleo-environment by providing more detailed records of geomagnetic and environmental changes.
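To make the optimisation step concrete, the sketch below shows a smoothness-regularized deconvolution of a single-axis pass-through measurement. It is a minimal illustration, not UDECON: the function and variable names (convolution_matrix, measured, response, lam) are hypothetical, the sensor response is treated as a single-axis kernel with the cross terms ignored, and a generalized cross-validation score stands in for the ABIC criterion that UDECON actually minimizes.

    import numpy as np

    def convolution_matrix(response, n):
        # Dense matrix G such that G @ signal models the smoothing of a
        # continuous sample by a single-axis sensor response (sketch only).
        half = len(response) // 2
        G = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                k = i - j + half
                if 0 <= k < len(response):
                    G[i, j] = response[k]
        return G

    def deconvolve(measured, response, lam):
        # Minimize ||G s - m||^2 + lam * ||D2 s||^2, where D2 is a
        # second-difference operator enforcing smoothness of the restored signal.
        n = len(measured)
        G = convolution_matrix(response, n)
        D2 = np.diff(np.eye(n), n=2, axis=0)
        return np.linalg.solve(G.T @ G + lam * D2.T @ D2, G.T @ measured)

    def gcv_score(measured, response, lam):
        # Generalized cross-validation score, used here only as a stand-in
        # for the ABIC minimization performed by the actual tool.
        n = len(measured)
        G = convolution_matrix(response, n)
        D2 = np.diff(np.eye(n), n=2, axis=0)
        H = G @ np.linalg.solve(G.T @ G + lam * D2.T @ D2, G.T)
        resid = measured - H @ measured
        return n * np.sum(resid**2) / (n - np.trace(H))**2

    # Hypothetical use: scan the smoothness parameter and keep the best value.
    # lams = np.logspace(-4, 2, 25)
    # best = min(lams, key=lambda l: gcv_score(measured, response, l))
    # restored = deconvolve(measured, response, best)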

Xuan, C.; Oda, H.

2013-12-01

251

General Mission Analysis Tool (GMAT) Architectural Specification. Draft  

NASA Technical Reports Server (NTRS)

Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low level modeling features to large scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

Hughes, Steven P.; Conway, Darrel, J.

2007-01-01

252

Mechanical System Analysis/Design Tool (MSAT) Quick Guide  

NASA Technical Reports Server (NTRS)

MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design stage of the product development process.

Lee, HauHua; Kolb, Mark; Madelone, Jack

1998-01-01

253

Orbit Analysis Tools Software user's manual, version 1  

NASA Astrophysics Data System (ADS)

In the course of our work in mission planning and analysis we have developed a set of computer programs that address many of the questions commonly asked by designers when planning a new satellite system and by managers wishing to assess the performance of an existing system. The Orbit Analysis Tools Software (OATS) is an organization of this collection of computer programs unified by a single graphical user interface. The graphical and tabular output from OATS may be printed directly from the program or cut and pasted via the clipboard into any other Macintosh application program. The FaceIt utility is used to establish the interface between the FORTRAN code and the Macintosh Toolbox.

Hope, Alan S.; Middour, Jay

1993-04-01

254

Message Correlation Analysis Tool for NOvA  

NASA Astrophysics Data System (ADS)

A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) system of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.
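As a rough illustration of the filtering-and-triggering idea described above, the sketch below applies regular-expression patterns to a stream of log messages and fires a rule when enough matches accumulate in a sliding window. The class names, rule fields, and example messages are all hypothetical; the real Message Analyzer uses its own domain specific language and C++ plugins, neither of which is reproduced here.

    import re
    from collections import defaultdict, deque
    from dataclasses import dataclass

    @dataclass
    class Rule:
        name: str        # label reported when the rule fires
        pattern: str     # regex applied to each log message
        threshold: int   # matches within the window needed to fire
        window: int      # number of most recent messages considered

    class CorrelationEngine:
        # Minimal sketch of a log-correlation engine; not the NOvA tool or its DSL.
        def __init__(self, rules):
            self.rules = [(rule, re.compile(rule.pattern)) for rule in rules]
            self.history = defaultdict(deque)

        def process(self, message):
            fired = []
            for rule, regex in self.rules:
                hits = self.history[rule.name]
                hits.append(1 if regex.search(message) else 0)
                if len(hits) > rule.window:
                    hits.popleft()
                if sum(hits) >= rule.threshold:
                    fired.append(rule.name)
            return fired

    # Hypothetical example: flag repeated buffer overflows reported by DAQ nodes.
    engine = CorrelationEngine([Rule("buffer-overflow-storm", r"buffer overflow", 3, 100)])
    for line in ["dcm-01: buffer overflow", "dcm-02: ok",
                 "dcm-01: buffer overflow", "dcm-03: buffer overflow"]:
        if engine.process(line):
            print("trigger:", line)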

Lu, Qiming; Biery, Kurt A.; Kowalkowski, James B.

2012-12-01

255

Net energy analysis: Powerful tool for selecting electric power options  

NASA Astrophysics Data System (ADS)

A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

Baron, S.

256

PyRAT (python radiography analysis tool): overview  

SciTech Connect

PyRAT was developed as a quantitative tool for robustly characterizing objects from radiographs to solve problems such as the hybrid nonlinear inverse problem. The optimization software library that was used is the nonsmooth optimization by MADS algorithm (NOMAD). Some of PyRAT's features are: (1) the hybrid nonlinear inverse problem with calculated x-ray spectrum and detector response; (2) an optimization-based inversion approach with the goal of identifying unknown object configurations - the MVO problem; (3) using functionalities of Python libraries for radiographic image processing and analysis; (4) using the Tikhonov regularization method for the linear inverse problem to recover partial information of object configurations; (5) using a priori knowledge of problem solutions to define the feasible region and discrete neighbor for the MVO problem (initial data analysis + material library → a priori knowledge); and (6) using the NOMAD (C++ version) software in the object.
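Feature (4) above refers to Tikhonov regularization of a linear inverse problem. The fragment below is a generic illustration of that step, not PyRAT code; the blur model, noise level, and variable names are invented for the example.

    import numpy as np

    def tikhonov_solve(A, b, lam):
        # Solve min ||A x - b||^2 + lam * ||x||^2 via the regularized normal equations.
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

    # Hypothetical example: recover a smooth profile from a blurred, noisy projection.
    rng = np.random.default_rng(0)
    x_true = np.exp(-np.linspace(-3.0, 3.0, 80) ** 2)
    A = np.exp(-0.5 * (np.subtract.outer(np.arange(80), np.arange(80)) / 4.0) ** 2)
    b = A @ x_true + 0.01 * rng.standard_normal(80)
    x_hat = tikhonov_solve(A, b, lam=1e-2)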

Armstrong, Jerawan C [Los Alamos National Laboratory; Temple, Brian A [Los Alamos National Laboratory; Buescher, Kevin L [Los Alamos National Laboratory

2011-01-14

257

The Lagrangian analysis tool LAGRANTO - version 2.0  

NASA Astrophysics Data System (ADS)

Lagrangian trajectories are widely used in the atmospheric sciences, for instance to identify flow structures in extratropical cyclones (e.g., warm conveyor belts) and long-range transport pathways of moisture and trace substances. Here a new version of the Lagrangian analysis tool LAGRANTO (Wernli and Davies, 1997) is introduced, which offers considerably enhanced functionalities: (i) trajectory starting positions can be described easily based on different geometrical and/or meteorological conditions; e.g., equidistantly spaced within a prescribed region and on a stack of pressure (or isentropic) levels; (ii) a versatile selection of trajectories is offered based on single or combined criteria; these criteria are passed to LAGRANTO with a simple command language (e.g., "GT:PV:2" readily translates into a selection of all trajectories with potential vorticity (PV) greater than 2 PVU); and (iii) full versions are available for global ECMWF and regional COSMO data; core functionality is also provided for the regional WRF and UM models, and for the global 20th Century Reanalysis data set. The intuitive application of LAGRANTO is first presented for the identification of a warm conveyor belt in the North Atlantic. A further case study then shows how LAGRANTO is used to quasi-operationally diagnose stratosphere-troposphere exchange events over Europe. Whereas these examples rely on the ECMWF version, the COSMO version and input fields with 7 km horizontal resolution are needed to adequately resolve the rather complex flow structure associated with orographic blocking due to the Alps. Finally, an example of backward trajectories presents the tool's application in source-receptor analysis studies. The new distribution of LAGRANTO is publicly available and includes simple tools, e.g., to visualize and merge trajectories. Furthermore, a detailed user guide exists, which describes all LAGRANTO capabilities.

Sprenger, M.; Wernli, H.

2015-02-01

258

CRITICA: coding region identification tool invoking comparative analysis  

NASA Technical Reports Server (NTRS)

Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).

Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)

1999-01-01

259

XQCAT: eXtra Quark Combined Analysis Tool  

E-print Network

XQCAT (eXtra Quark Combined Analysis Tool) is a tool aimed to determine exclusion Confidence Levels (eCLs) for scenarios of new physics characterised by the presence of one or multiple heavy extra quarks (XQ) which interact through Yukawa couplings with any of the Standard Model (SM) quarks. The code uses a database of efficiencies for pre-simulated processes of Quantum Chromo-Dynamics (QCD) pair production and on-shell decays of extra quarks. In the version 1.0 of XQCAT the efficiencies have been computed for a set of seven publicly available search results by the CMS experiment, and the package is subject to future updates to include further searches by both ATLAS and CMS collaborations. The input for the code is a text file in which masses, branching ratios (BRs) and dominant chirality of the couplings of the new quarks are provided. The output of the code is the eCL of the test point for each implemented experimental analysis considered individually and, when possible, in statistical combination.

D. Barducci; A. Belyaev; M. Buchkremer; J. Marrouche; S. Moretti; L. Panizzi

2014-09-10

260

GANALYZER: A TOOL FOR AUTOMATIC GALAXY IMAGE ANALYSIS  

SciTech Connect

We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze approximately 10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
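The following fragment sketches the first two steps described above: separating galaxy pixels from the background and generating a radial intensity profile about the intensity centroid. It is an illustration only, not Ganalyzer source code, and the thresholding and binning choices are assumptions.

    import numpy as np

    def radial_intensity_profile(image, n_bins=50):
        # Subtract a median background, locate the galaxy centre as the
        # intensity centroid, then average pixel intensities in radial bins.
        foreground = np.clip(image - np.median(image), 0, None)
        ys, xs = np.indices(image.shape)
        total = foreground.sum()
        cy = (ys * foreground).sum() / total
        cx = (xs * foreground).sum() / total
        r = np.hypot(ys - cy, xs - cx)
        edges = np.linspace(0.0, r.max(), n_bins + 1)
        idx = np.digitize(r.ravel(), edges) - 1
        sums = np.bincount(idx, weights=foreground.ravel(), minlength=n_bins + 1)
        counts = np.bincount(idx, minlength=n_bins + 1)
        return edges[:n_bins], sums[:n_bins] / np.maximum(counts[:n_bins], 1)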

Shamir, Lior, E-mail: lshamir@mtu.edu [Department of Computer Science, Lawrence Technological University, 21000 West Ten Mile Road, Southfield, MI 48075 (United States)

2011-08-01

261

Least-squares (LS) deconvolution of a series of overlapping cortical auditory evoked potentials: a simulation and experimental study  

NASA Astrophysics Data System (ADS)

Objective. To evaluate the viability of disentangling a series of overlapping cortical auditory evoked potentials (CAEPs) elicited by different stimuli using least-squares (LS) deconvolution, and to assess the adaptation of CAEPs for different stimulus onset-asynchronies (SOAs). Approach. Optimal aperiodic stimulus sequences were designed by controlling the condition number of matrices associated with the LS deconvolution technique. First, theoretical considerations of LS deconvolution were assessed in simulations in which multiple artificial overlapping responses were recovered. Second, biological CAEPs were recorded in response to continuously repeated stimulus trains containing six different tone-bursts with frequencies 8, 4, 2, 1, 0.5, 0.25 kHz separated by SOAs jittered around 150 (120-185), 250 (220-285) and 650 (620-685) ms. The control condition had a fixed SOA of 1175 ms. In a second condition, using the same SOAs, trains of six stimuli were separated by a silence gap of 1600 ms. Twenty-four adults with normal hearing (<20 dB HL) were assessed. Main results. Results showed disentangling of a series of overlapping responses using LS deconvolution on simulated waveforms as well as on real EEG data. The use of rapid presentation and LS deconvolution did not however, allow the recovered CAEPs to have a higher signal-to-noise ratio than for slowly presented stimuli. The LS deconvolution technique enables the analysis of a series of overlapping responses in EEG. Significance. LS deconvolution is a useful technique for the study of adaptation mechanisms of CAEPs for closely spaced stimuli whose characteristics change from stimulus to stimulus. High-rate presentation is necessary to develop an understanding of how the auditory system encodes natural speech or other intrinsically high-rate stimuli.
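The core of the LS deconvolution described above is an overdetermined linear system built from the stimulus onset times; the condition number of that system is what the optimal aperiodic sequences control. The sketch below recovers one response per stimulus type from a continuous recording. The variable names (eeg, onsets, stim_ids) are hypothetical, and the model ignores filtering and artifact rejection.

    import numpy as np

    def ls_deconvolve(eeg, onsets, stim_ids, n_stim, resp_len):
        # Design matrix: each stimulus adds a shifted copy of its (unknown)
        # response, so overlapping CAEPs are separated by least squares.
        n = len(eeg)
        X = np.zeros((n, n_stim * resp_len))
        for onset, s in zip(onsets, stim_ids):
            for k in range(resp_len):
                if onset + k < n:
                    X[onset + k, s * resp_len + k] += 1.0
        cond = np.linalg.cond(X)  # jittered SOAs keep this number small
        beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)
        return beta.reshape(n_stim, resp_len), cond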

Bardy, Fabrice; Van Dun, Bram; Dillon, Harvey; Cowan, Robert

2014-08-01

262

Damage analysis of extrusion tools made from the austenitic hot work tool steel Böhler W750  

Microsoft Academic Search

During hot extrusion of copper alloys, extrusion tools have to withstand cyclic thermal and mechanical loads. To enhance the service life of the tools, materials with high temperature strength are designed and an optimised process control is applied. To characterise the tool damage evolution during service and to improve process guiding, modelling and simulation are appropriate means. …

Friedrich Krumphals; Thomas Wlanis; Rainer Sievert; Volker Wieser; Christof Sommitsch

2011-01-01

263

Strategies for the deconvolution of hypertelescope images  

NASA Astrophysics Data System (ADS)

Aims: We study the possibility of deconvolving hypertelescope images and propose a procedure that can be used provided that the densification factor is small enough to make the process reversible. Methods: We present the simulation of hypertelescope images for an array of cophased densified apertures. We distinguish between two types of aperture densification, one called FAD (full aperture densification) corresponding to Labeyrie's original technique, and the other FSD (full spectrum densification) corresponding to a densification factor twice as low. Images are compared to the Fizeau mode. A single image of the observed object is obtained in the hypertelescope modes, while in the Fizeau mode the response produces an ensemble of replicas of the object. Simulations are performed for noiseless images and in a photodetection regime. Assuming first that the point spread function (PSF) does not change much over the object extent, we use two classical techniques to deconvolve the images, namely the Richardson-Lucy and image space reconstruction algorithms. Results: Both algorithms fail to achieve satisfying results. We interpret this as meaning that it is inappropriate to deconvolve a relation that is not a convolution, even if the variation in the PSF is very small across the object extent. We propose instead the application of a redilution to the densified image prior to its deconvolution, i.e. to recover an image similar to the Fizeau observation. This inverse operation is possible only when the rate of densification is no more than in the FSD case. This being done, the deconvolution algorithms become efficient. The deconvolution brings together the replicas into a single high-quality image of the object. This is heuristically explained as an inpainting of the Fourier plane. This procedure makes it possible to obtain improved images while retaining the benefits of hypertelescopes for image acquisition consisting of detectors with a small number of pixels.
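For reference, the Richardson-Lucy iteration used in the comparison above has the familiar multiplicative form sketched below. This is the textbook algorithm, not the authors' implementation; the initialization, stopping rule, and 2-D image/PSF assumption are arbitrary choices for the example.

    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
        # Multiplicative Richardson-Lucy updates for a shift-invariant 2-D PSF.
        estimate = np.full(observed.shape, observed.mean(), dtype=float)
        psf_mirror = psf[::-1, ::-1]  # adjoint of convolution with the PSF
        for _ in range(n_iter):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = observed / (blurred + eps)
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate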

Aime, C.; Lantri, H.; Diet, M.; Carlotti, A.

2012-07-01

264

An online database for plant image analysis software tools  

PubMed Central

Background Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of central repository, it is challenging for researchers to identify the software that is best suited for their research. Results We present an online, manually curated, database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software in a uniform and concise manner enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. Conclusions The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work. PMID:24107223

2013-01-01

265

Validating Whole Slide Digital Morphometric Analysis as a Microscopy Tool.  

PubMed

Whole slide imaging (WSI) can be used to quantify multiple responses within tissue sections during histological analysis. Feature Analysis on Consecutive Tissue Sections (FACTS) allows the investigator to perform digital morphometric analysis (DMA) within specified regions of interest (ROI) across multiple serial sections at faster rates when compared with manual morphometry methods. Using FACTS in conjunction with WSI is a powerful analysis tool, which allows DMA to target specific ROI across multiple tissue sections stained for different biomarkers. DMA may serve as an appropriate alternative to classic, manual, histologic morphometric measures, which have historically relied on the selection of high-powered fields of views and manual scoring (e.g., a gold standard). In the current study, existing preserved samples were used to determine if DMA would provide similar results to manual counting methods. Rodent hearts (n=14, left ventricles) were stained with Masson's trichrome, and reacted for cluster of differentiation 68 (CD-68). This study found no statistical significant difference between a classic, manual method and the use of digital algorithms to perform the similar counts (p=0.38). DMA offers researchers the ability to accurately evaluate morphological characteristics in a reproducible fashion without investigator bias and with higher throughput. PMID:25399639

Diller, Robert B; Kellar, Robert S

2014-11-17

266

Design and Analysis Tool for External-Compression Supersonic Inlets  

NASA Technical Reports Server (NTRS)

A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The inlet types available include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick analysis. SUPIN provides inlet geometry in the form of coordinates and surface grids useable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies and CFD analyses were performed to verify some of the analysis results.

Slater, John W.

2012-01-01

267

Blind iterative deconvolution of binary star images  

E-print Network

The technique of Blind Iterative De-convolution (BID) was used to remove the atmospherically induced point spread function (PSF) from short exposure images of two binary stars, HR 5138 and HR 5747, obtained at the Cassegrain focus of the 2.34 meter Vainu Bappu Telescope (VBT), situated at the Vainu Bappu Observatory (VBO), Kavalur. The position angles and separations of the binary components were seen to be consistent with results of the auto-correlation technique, while the Fourier phases of the reconstructed images were consistent with published observations of the binary orbits.

S. K. Saha; P. Venkatakrishnan

1999-10-21

268

First Annual Conference on Intelligence Analysis Methods and Tools, May 2005 PNNL-SA-44274 Top Ten Needs for Intelligence Analysis Tool Development  

E-print Network

… to generate ideas about future enhancements to software systems designed to aid intelligence analysts …

269

Recovery of Dynamic PET Regions via Simultaneous Segmentation and Deconvolution  

E-print Network

… simultaneous segmentation and deconvolution of dynamic PET images. By incorporating the PSF of the imaging system into our segmentation model … effect. We show improved segmentation results, and outperform two state-of-the-art dynamic PET …

Möller, Torsten

270

Simultaneous Total Variation Image Inpainting and Blind Deconvolution  

Microsoft Academic Search

We propose a total variation based model for simultaneous image inpainting and blind deconvolution. We demonstrate that the tasks are inherently coupled together and that solving them individually will lead to poor results. The main advantages of our model are that (i) boundary conditions for deconvolution required near the interface between observed and occluded regions are naturally generated through …

Tony F. Chan; Andy M. Yip; Frederick E. Park

2004-01-01

271

Removing boundary artifacts for real-time iterated shrinkage deconvolution.  

PubMed

We propose a solution to the problem of boundary artifacts appearing in several recently published fast deblurring algorithms based on iterated shrinkage thresholding in a sparse domain and Fourier domain deconvolution. Our approach adapts an idea proposed by Reeves for deconvolution by the Wiener filter. The time of computation less than doubles. PMID:22106148

Sorel, Michal

2012-04-01

272

On the Optimal Rates of Convergence for Nonparametric Deconvolution Problems  

Microsoft Academic Search

Deconvolution problems arise in a variety of situations in statistics. An interesting problem is to estimate the density $f$ of a random variable $X$ based on $n$ i.i.d. observations from $Y = X + \varepsilon$, where $\varepsilon$ is a measurement error with a known distribution. In this paper, the effect of errors in variables of nonparametric deconvolution is examined. Insights …
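For context, a standard deconvoluting kernel density estimator for this errors-in-variables setting (stated here from the general literature, not necessarily the exact estimator analyzed in the paper) is

\[
\hat f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-itx}\,
\frac{\varphi_K(th)\,\hat\varphi_Y(t)}{\varphi_\varepsilon(t)}\,dt,
\qquad
\hat\varphi_Y(t) = \frac{1}{n}\sum_{j=1}^{n} e^{itY_j},
\]

where $\varphi_K$ is the Fourier transform of a kernel $K$, $h$ is a bandwidth, and $\varphi_\varepsilon$ is the known characteristic function of the measurement error; the smoothness of $\varphi_\varepsilon$ governs the attainable convergence rates.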

Jianqing Fan

1991-01-01

273

IPMP 2013--a comprehensive data analysis tool for predictive microbiology.  

PubMed

Predictive microbiology is an area of applied research in food science that uses mathematical models to predict the changes in the population of pathogenic or spoilage microorganisms in foods exposed to complex environmental changes during processing, transportation, distribution, and storage. It finds applications in shelf-life prediction and risk assessments of foods. The objective of this research was to describe the performance of a new user-friendly comprehensive data analysis tool, the Integrated Pathogen Modeling Model (IPMP 2013), recently developed by the USDA Agricultural Research Service. This tool allows users, without detailed programming knowledge, to analyze experimental kinetic data and fit the data to known mathematical models commonly used in predictive microbiology. Data curves previously published in literature were used to test the models in IPMP 2013. The accuracies of the data analysis and models derived from IPMP 2013 were compared in parallel to commercial or open-source statistical packages, such as SAS or R. Several models were analyzed and compared, including a three-parameter logistic model for growth curves without lag phases, reduced Huang and Baranyi models for growth curves without stationary phases, growth models for complete growth curves (Huang, Baranyi, and re-parameterized Gompertz models), survival models (linear, re-parameterized Gompertz, and Weibull models), and secondary models (Ratkowsky square-root, Huang square-root, Cardinal, and Arrhenius-type models). The comparative analysis suggests that the results from IPMP 2013 were equivalent to those obtained from SAS or R. This work suggested that the IPMP 2013 could be used as a free alternative to SAS, R, or other more sophisticated statistical packages for model development in predictive microbiology. PMID:24334095
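As a small illustration of the kind of fitting such a tool automates, the sketch below fits a three-parameter logistic growth curve (one common parameterization; IPMP 2013 may use a different one) to hypothetical log-count data using open-source tools.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic3(t, y0, ymax, mu):
        # Three-parameter logistic growth (no lag phase), in log10 counts.
        return ymax / (1.0 + (ymax / y0 - 1.0) * np.exp(-mu * t))

    # Hypothetical growth data: time (h) and log10 CFU/g.
    t = np.array([0, 2, 4, 6, 8, 10, 12, 16, 20, 24], dtype=float)
    y = np.array([3.1, 3.6, 4.4, 5.3, 6.2, 7.0, 7.6, 8.2, 8.4, 8.5])
    (y0_hat, ymax_hat, mu_hat), cov = curve_fit(logistic3, t, y, p0=[3.0, 8.5, 0.5])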

Huang, Lihan

2014-02-01

274

Assessing Extremes Climatology Using NWS Local Climate Analysis Tool  

NASA Astrophysics Data System (ADS)

The Local Climate Analysis Tool (LCAT) is evolving out of a need to support and enhance the National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) field offices' ability to access, manipulate, and interpret local climate data and characterize climate variability and change impacts. LCAT will enable NWS Regional Headquarters, Weather Forecast Offices, Weather Service Offices, and River Forecast Centers to conduct regional and local climate studies using station and reanalysis gridded data and various statistical techniques for climate analysis. The analysis results will be used for climate services to guide local decision makers in weather and climate sensitive actions and to deliver information to the general public. Field offices need standardized, scientifically sound methodology for local climate analysis (such as trend, composites, and principal statistical and time-series analysis) that is comprehensive, accessible, and efficient, with the potential to expand with growing NOAA Climate Services needs. The methodology for climate analyses is practiced by the NWS Climate Prediction Center (CPC), NOAA National Climatic Data Center, and NOAA Earth System Research Laboratory, as well as NWS field office staff. LCAT will extend this practice at the local level, allowing it to become both widespread and standardized, and thus improve NWS climate services capabilities. LCAT's focus is on the local scale (as opposed to the national and global scales of CPC products). The LCAT will: (1) improve the professional competency of local office staff and their expertise in providing local information to their users, thereby improving the quality of local climate services; (2) ensure adequate local input to CPC products that depend on local information, such as the U.S. Drought Monitor, allowing improvement of CPC climate products; and (3) allow testing of local climate variables beyond temperature averages and precipitation totals, such as the climatology of tornadoes, flash floods, storminess, and extreme weather events, expanding the suite of NWS climate products. The LCAT development utilizes the NWS Operations and Services Improvement Process (OSIP) to document the field and user requirements, develop solutions, and prioritize resources. OSIP is a five work-stage process separated by four gate reviews. LCAT is currently at work-stage three: Research Demonstration and Solution Analysis. Gate 1 and 2 reviews identified LCAT as a high strategic priority project with a very high operational need. The Integrated Working Team, consisting of NWS field representatives, assists in tool function design and identification of LCAT operational deployment support.

Timofeyeva, M. M.; Hollingshead, A.; Hilderbrand, D.; Mayes, B.; Hartley, T.; Kempf McGavock, N. M.; Lau, E.; Olenic, E. A.; Motta, B.; Bunge, R.; Brown, L. E.; Fritsch, F.

2010-12-01

275

The Analysis of Stone Tool Procurement, Production, and Maintenance  

Microsoft Academic Search

Researchers who analyze stone tools and their production debris have made significant progress in understanding the relationship between stone tools and human organizational strategies. Stone tools are understood to be morphologically dynamic throughout their use-lives; the ever-changing morphology of stone tools is intimately associated with the needs of tool users. It also has become apparent to researchers that interpretations of …

William Andrefsky Jr

2009-01-01

276

A Freeware Java Tool for Spatial Point Analysis of Neuronal Structures  

E-print Network

… a freeware tool, called PAJ, has been developed. This Java-based tool takes 3D Cartesian coordinates as input … and is based on previously described statistical analysis (Diggle 2003). In PAJ, data is copied …

Condron, Barry

277

Multiple Lyapunov functions and other analysis tools for switched and hybrid systems  

Microsoft Academic Search

We introduce some analysis tools for switched and hybrid systems. We first present work on stability analysis. We introduce multiple Lyapunov functions as a tool for analyzing Lyapunov stability and use iterated function systems theory as a tool for Lagrange stability. We also discuss the case where the switched systems are indexed by an arbitrary compact set. Finally, we extend
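A commonly cited form of the multiple-Lyapunov-function condition (paraphrased from the general switched-systems literature, not quoted from the paper) is: given candidate functions $V_i$ for the subsystems $\dot x = f_i(x)$, Lyapunov stability follows if each $V_{\sigma(t)}$ is non-increasing while its mode is active and the values of each $V_i$ at the successive times mode $i$ is switched in form a non-increasing sequence,

\[
\dot V_{\sigma(t)}\big(x(t)\big) \le 0 \quad \text{between switching times},
\qquad
V_i\big(x(t_{i,k+1})\big) \le V_i\big(x(t_{i,k})\big),
\]

where $t_{i,k}$ denotes the $k$-th time at which mode $i$ becomes active.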

Michael S. Branicky

1998-01-01

278

Verification and Validation of the General Mission Analysis Tool (GMAT)  

NASA Technical Reports Server (NTRS)

This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

2014-01-01

279

Multi-Mission Power Analysis Tool (MMPAT) Version 3  

NASA Technical Reports Server (NTRS)

The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, the need for software modifications when configuring it for a particular spacecraft can be reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

Wood, Eric G.; Chang, George W.; Chen, Fannie C.

2012-01-01

280

In silico tools for the analysis of antibiotic biosynthetic pathways.  

PubMed

Natural products of bacteria and fungi are the most important source for antimicrobial drug leads. For decades, such compounds were exclusively found by chemical/bioactivity-guided screening approaches. The rapid progress in sequencing technologies only recently allowed the development of novel screening methods based on the genome sequences of potential producing organisms. The basic principle of such genome mining approaches is to identify genes, which are involved in the biosynthesis of such molecules, and to predict the products of the identified pathways. Thus, bioinformatics methods and tools are crucial for genome mining. In this review, a comprehensive overview is given on programs and databases for the identification and analysis of antibiotic biosynthesis gene clusters in genomic data. PMID:24631213

Weber, Tilmann

2014-05-01

281

Color infrared (CIR) photography: A tool for environmental analysis  

NASA Technical Reports Server (NTRS)

Research carried out under NASA auspices suggests that in the future remote sensors may play an important role in monitoring our environment. One medium, color infrared photography, appears to have immediate utility. Its capability to identify, measure the acreage of, and monitor the health of agricultural and woodland resources has been demonstrated, as has its capability to identify the sources and extent of certain types of water pollution. CIR is also beginning to demonstrate considerable potential as a tool for urban analysis. The great value of CIR is that it can provide these data quickly and inexpensively, and for that reason will be preferred to more complex multispectral systems by budget-conscious administrators.

Lindgren, D. T.

1971-01-01

282

Input Range Testing for the General Mission Analysis Tool (GMAT)  

NASA Technical Reports Server (NTRS)

This document contains a test plan for testing input values to the General Mission Analysis Tool (GMAT). The plan includes four primary types of information, which rigorously define all tests that should be performed to validate that GMAT will accept allowable inputs and deny disallowed inputs. The first is a complete list of all allowed object fields in GMAT. The second type of information is test input to be attempted for each field. The third type of information is allowable input values for all object fields in GMAT. The final piece of information is how GMAT should respond to both valid and invalid information. It is very important to note that the tests below must be performed for both the Graphical User Interface and the script. The examples are illustrated using a scripting perspective because it is simpler to write up. However, the tests must be performed for both interfaces to GMAT.

Hughes, Steven P.

2007-01-01

283

Combinatorial tools for the analysis of transcriptional regulation  

SciTech Connect

In this paper, we discuss virtual experiments for the study of major regulatory processes such as translation, signalization or transcription pathways. An essential part of these processes is the formation of protein clusters held together by a small number of binding domains that can be shared by many different proteins. Analysis of these clusters is complicated by the vast number of different arrangements of proteins that can trigger a specific reaction. We propose combinatorial tools that can help predict the effects on the rate of transcription of either changes in transcriptional factors concentration, or due to the introduction of chimeras combining domains not usually present on a protein. 15 refs., 5 figs., 3 tabs.

Bergeron, A.; Gaul, E.; Bergeron, D. [Universite du Quebec a Montreal (Canada)

1996-12-31

284

Validation of tool mark analysis of cut costal cartilage.  

PubMed

This study was designed to establish the potential error rate associated with the generally accepted method of tool mark analysis of cut marks in costal cartilage. Three knives with different blade types were used to make experimental cut marks in costal cartilage of pigs. Each cut surface was cast, and each cast was examined by three analysts working independently. The presence of striations, regularity of striations, and presence of a primary and secondary striation pattern were recorded for each cast. The distance between each striation was measured. The results showed that striations were not consistently impressed on the cut surface by the blade's cutting edge. Also, blade type classification by the presence or absence of striations led to a 65% misclassification rate. Use of the classification tree and cross-validation methods and inclusion of the mean interstriation distance decreased the error rate to c. 50%. PMID:22081951

Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

2012-03-01

285

Decision Analysis Tool to Compare Energy Pathways for Transportation  

SciTech Connect

With the goals of reducing greenhouse gas emissions, oil imports, and energy costs, a wide variety of automotive technologies are proposed to replace the traditional gasoline-powered internal combustion engine (g-ICE). Biomass is seen as an important domestic energy feedstock, and there are multiple pathways in which it can be linked to the transport sector. Contenders include the use of cellulosic ethanol from biomass to replace gasoline or the use of a biomass-fueled combined cycle electrical power generation facility in conjunction with plug-in hybrid electric vehicles (PHEVs). This paper reviews a project that is developing a scenario decision analysis tool to assist policy makers, program managers, and others to obtain a better understanding of these uncertain possibilities and how they may interact over time.

Bloyd, Cary N.

2010-06-30

286

The use of current risk analysis tools evaluated towards preventing external domino accidents  

Microsoft Academic Search

Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour of identifying and evaluating all possible risks. The diversity in risk analysis procedures is such that there are many appropriate techniques for any circumstance and the choice has become more …

G. L. L. Reniers; W. Dullaert; B. J. M. Ale

2005-01-01

287

System-of-Systems Technology-Portfolio-Analysis Tool  

NASA Technical Reports Server (NTRS)

Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.

O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne

2012-01-01

288

Perfusion quantification using Gaussian process deconvolution.  

PubMed

The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated as a constraint in the method. The GPD method, which automatically estimates the noise level in each voxel, has the advantage that model parameters are optimized automatically. The GPD is compared to singular value decomposition (SVD) using a common threshold for the singular values, and to SVD using a threshold optimized according to the noise level in each voxel. The comparison is carried out using artificial data as well as data from healthy volunteers. It is shown that GPD is comparable to SVD with a variable optimized threshold when determining the maximum of the IRF, which is directly related to the perfusion. GPD provides a better estimate of the entire IRF. As the signal-to-noise ratio (SNR) increases or the time resolution of the measurements increases, GPD is shown to be superior to SVD. This is also found for large distribution volumes. PMID:12210944
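For comparison, the SVD baseline described above can be sketched as a truncated-SVD deconvolution of the tissue concentration curve against the arterial input function. The variable names (conc, aif, dt) and the relative threshold are illustrative; the GPD method itself is not reproduced here.

    import numpy as np

    def svd_deconvolve(conc, aif, dt, rel_threshold=0.2):
        # Lower-triangular convolution matrix built from the arterial input
        # function; singular values below the threshold are discarded.
        n = len(conc)
        A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                           for i in range(n)])
        U, s, Vt = np.linalg.svd(A)
        s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
        irf = Vt.T @ (s_inv * (U.T @ conc))  # flow-scaled residue function
        return irf, irf.max()                # perfusion relates to the IRF maximum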

Andersen, I K; Szymkowiak, A; Rasmussen, C E; Hanson, L G; Marstrand, J R; Larsson, H B W; Hansen, L K

2002-08-01

289

Abstract Interfaces for Data Analysis - Component Architecture for Data Analysis Tools  

SciTech Connect

The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualization), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organization, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimizing re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. This paper gives an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

Barrand, Guy

2002-08-20

290

TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0  

NASA Technical Reports Server (NTRS)

The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25 inch 360K MS-DOS format diskette. TARGET was developed in 1991.

Ortiz, C. J.

1994-01-01

291

The Watershed Deposition Tool: A Tool for Incorporating Atmospheric Deposition in Watershed Analysis  

EPA Science Inventory

The tool for providing the linkage between air and water quality modeling needed for determining the Total Maximum Daily Load (TMDL) and for analyzing related nonpoint-source impacts on watersheds has been developed. The Watershed Deposition Tool (WDT) takes gridded output of at...

292

Limited-memory scaled gradient projection methods for real-time image deconvolution in microscopy  

NASA Astrophysics Data System (ADS)

Gradient projection methods have given rise to effective tools for image deconvolution in several relevant areas, such as microscopy, medical imaging and astronomy. Due to the large scale of the optimization problems arising in nowadays imaging applications and to the growing request of real-time reconstructions, an interesting challenge to be faced consists in designing new acceleration techniques for the gradient schemes, able to preserve their simplicity and low computational cost of each iteration. In this work we propose an acceleration strategy for a state-of-the-art scaled gradient projection method for image deconvolution in microscopy. The acceleration idea is derived by adapting a step-length selection rule, recently introduced for limited-memory steepest descent methods in unconstrained optimization, to the special constrained optimization framework arising in image reconstruction. We describe how important issues related to the generalization of the step-length rule to the imaging optimization problem have been faced and we evaluate the improvements due to the acceleration strategy by numerical experiments on large-scale image deconvolution problems.
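To indicate the class of methods being accelerated, the sketch below runs a plain (unscaled) gradient projection for nonnegative least-squares deconvolution with a Barzilai-Borwein step length. It is not the authors' algorithm or their new step-length rule; the PSF handling, initialization, and iteration count are arbitrary choices for the example.

    import numpy as np
    from scipy.signal import fftconvolve

    def gradient_projection_deconv(observed, psf, n_iter=100):
        # Projected gradient for min ||psf * x - observed||^2 subject to x >= 0.
        def A(z):  return fftconvolve(z, psf, mode="same")
        def AT(z): return fftconvolve(z, psf[::-1, ::-1], mode="same")
        x = np.clip(observed.astype(float), 0, None)
        grad = AT(A(x) - observed)
        step = 1e-3
        for _ in range(n_iter):
            x_new = np.clip(x - step * grad, 0, None)   # projection onto x >= 0
            grad_new = AT(A(x_new) - observed)
            s, y = x_new - x, grad_new - grad
            sy = np.vdot(s, y)
            step = np.vdot(s, s) / sy if sy > 0 else step  # BB1 step length
            x, grad = x_new, grad_new
        return x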

Porta, F.; Zanella, R.; Zanghirati, G.; Zanni, L.

2015-04-01

293

Comparative Usability Study of Two Space Logistics Analysis Tools  

E-print Network

Future space exploration missions and campaigns will require sophisticated tools to help plan and analyze logistics. To encourage their use, space logistics tools must be usable: a design concept encompassing terms such ...

Lee, Chairwoo

294

Quantifying traces of tool use: a novel morphometric analysis of damage patterns on percussive tools.  

PubMed

Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

Caruana, Matthew V; Carvalho, Susana; Braun, David R; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W K

2014-01-01

295

Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools  

PubMed Central

Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

2014-01-01

296

Generalized Analysis Tools for Multi-Spacecraft Missions  

NASA Astrophysics Data System (ADS)

Analysis tools for multi-spacecraft missions like CLUSTER or MMS have been designed since the end of the 90's to estimate gradients of fields or to characterize discontinuities crossed by a cluster of spacecraft. Different approaches have been presented and discussed in the book "Analysis Methods for Multi-Spacecraft Data" published as Scientific Report 001 of the International Space Science Institute in Bern, Switzerland (G. Paschmann and P. Daly Eds., 1998). On one hand, the least-squares approach has the advantage of applying to any number of spacecraft [1], but it is not convenient for analytical computation, especially when considering the error analysis. On the other hand, the barycentric approach is powerful, as it provides simple analytical formulas involving the reciprocal vectors of the tetrahedron [2], but it appears limited to clusters of four spacecraft. Moreover, the barycentric approach makes it possible to derive theoretical formulas for the errors affecting the estimators built from the reciprocal vectors [2,3,4]. Following a first generalization of reciprocal vectors proposed by Vogt et al. [4], and despite the present lack of projects with more than four spacecraft, we present generalized reciprocal vectors for a cluster made of any number of spacecraft: each spacecraft is given a positive or null weight. The non-coplanarity of at least four spacecraft with strictly positive weights is a necessary and sufficient condition for this analysis to be enabled. The weights make it possible to minimize the influence of a spacecraft whose location or data quality is not appropriate, or simply to extract subsets of spacecraft from the cluster. Estimators presented in [2] are generalized within this new framework, except for the error analysis, which is still under investigation. References [1] Harvey, C. C.: Spatial Gradients and the Volumetric Tensor, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 307-322, ISSI SR-001, 1998. [2] Chanteur, G.: Spatial Interpolation for Four Spacecraft: Theory, in: Analysis Methods for Multi-Spacecraft Data, G. Paschmann and P. Daly (eds.), pp. 371-393, ISSI SR-001, 1998. [3] Chanteur, G.: Accuracy of field gradient estimations by Cluster: Explanation of its dependency upon elongation and planarity of the tetrahedron, pp. 265-268, ESA SP-449, 2000. [4] Vogt, J., Paschmann, G., and Chanteur, G.: Reciprocal Vectors, pp. 33-46, ISSI SR-008, 2008.
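
For the four-spacecraft case covered by the barycentric approach, the reciprocal vectors and the resulting linear gradient estimator can be sketched as follows. The positions and field values below are synthetic, and the weighted generalization to more than four spacecraft described in the record is not reproduced here.

    import numpy as np

    def reciprocal_vectors(positions):
        """Reciprocal (barycentric) vectors of a 4-spacecraft tetrahedron.

        positions: (4, 3) array of spacecraft position vectors.
        Returns a (4, 3) array q such that sum_a f(r_a) * q[a] recovers the
        gradient exactly for any linear scalar field f.
        """
        q = np.zeros((4, 3))
        for a in range(4):
            b, c, d = [i for i in range(4) if i != a]
            # normal to the face opposite spacecraft a, scaled so q[a].(r_a - r_b) = 1
            n = np.cross(positions[c] - positions[b], positions[d] - positions[b])
            q[a] = n / np.dot(n, positions[a] - positions[b])
        return q

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        r = rng.standard_normal((4, 3))          # a non-coplanar configuration
        g_true = np.array([2.0, -1.0, 0.5])      # gradient of a synthetic linear field
        f = 3.0 + r @ g_true                     # field sampled at the four spacecraft
        q = reciprocal_vectors(r)
        g_est = q.T @ f                          # barycentric gradient estimator
        print("estimated gradient:", g_est, "error:", np.linalg.norm(g_est - g_true))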

Chanteur, G. M.

2011-12-01

297

Machine tool condition monitoring using workpiece surface texture analysis  

Microsoft Academic Search

Tool wear affects the surface roughness dramatically. There is a very close correspondence between the geometrical features imposed on the tool by wear and micro-fracture and the geometry imparted by the tool on to the workpiece surface. Since a machined surface is the negative replica of the shape of the cutting tool, and reflects the volumetric changes in cutting-edge

Ashraf A. Kassim; M. A. Mannan; Ma Jing

2000-01-01

298

Enhancing the Curation of Botanical Data Using Text Analysis Tools  

E-print Network

An evaluation study provides a comparison of this tool with the traditional methods currently used. The tool and interface are both applicable to many uses, but in order to produce valid usability results the likely users of such a tool were involved in evaluating the interface.

Edinburgh, University of

299

NCC: A Multidisciplinary Design/Analysis Tool for Combustion Systems  

NASA Technical Reports Server (NTRS)

A multi-disciplinary design/analysis tool for combustion systems is critical for optimizing the low-emission, high-performance combustor design process. Based on discussions between NASA Lewis Research Center and the jet engine companies, an industry-government team was formed in early 1995 to develop the National Combustion Code (NCC), which is an integrated system of computer codes for the design and analysis of combustion systems. NCC has advanced features that address the need to meet designer's requirements such as "assured accuracy", "fast turnaround", and "acceptable cost". The NCC development team is comprised of Allison Engine Company (Allison), CFD Research Corporation (CFDRC), GE Aircraft Engines (GEAE), NASA Lewis Research Center (LeRC), and Pratt & Whitney (P&W). This development team operates under the guidance of the NCC steering committee. The "unstructured mesh" capability and "parallel computing" are fundamental features of NCC from its inception. The NCC system is composed of a set of "elements" which includes grid generator, main flow solver, turbulence module, turbulence and chemistry interaction module, chemistry module, spray module, radiation heat transfer module, data visualization module, and a post-processor for evaluating engine performance parameters. Each element may have contributions from several team members. Such a multi-source multi-element system needs to be integrated in a way that facilitates inter-module data communication, flexibility in module selection, and ease of integration.

Liu, Nan-Suey; Quealy, Angela

1999-01-01

300

Tool for Sizing Analysis of the Advanced Life Support System  

NASA Technical Reports Server (NTRS)

Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems whether they are operated dynamically or at steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.

Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

2005-01-01

301

New tools for the analysis and design of building envelopes  

SciTech Connect

We describe the integrated development of PowerDOE, a new version of the DOE-2 building energy analysis program, and the Building Design Advisor (BDA), a multimedia-based design tool that assists building designers with the concurrent consideration of multiple design solutions with respect to multiple design criteria. PowerDOE has a Windows-based Graphical User Interface (GUI) that makes it easier to use than DOE-2, while retaining DOE-2's calculation power and accuracy. BDA, with a similar GUI, is designed to link to multiple analytical models and databases. In its first release it is linked to PowerDOE and a Daylighting Analysis Module, as well as to a Case Studies Database and a Schematic Graphic Editor. These allow building designers to set performance goals and address key building envelope parameters from the initial, schematic phases of building design to the detailed specification of building components and systems required by PowerDOE. The consideration of the thermal performance of building envelopes through PowerDOE and BDA is integrated with non-thermal envelope performance aspects, such as daylighting, as well as with the performance of non-envelope building components and systems, such as electric lighting and HVAC. Future versions of BDA will support links to CAD and electronic product catalogs, as well as provide context-dependent design advice to improve performance.

Papamichael, K.; Winkelmann, F.C.; Buhl, W.F.; Chauvet, H. [and others

1994-08-01

302

Spectral Analysis Tool 6.2 for Windows  

NASA Technical Reports Server (NTRS)

Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.

Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

2006-01-01

303

Study of academic achievements using spatial analysis tools  

NASA Astrophysics Data System (ADS)

In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Higher Education Area. These degrees are: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering, and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study was carried out with these students about their academic achievement, with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal addresses. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated with their respective record. Following this procedure a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, which can be used to calculate the probability of success or failure for incoming students in the following academic years. Keywords: Academic achievement, spatial analyst, GIS, Bologna.

González, C.; Velilla, C.; Sánchez-Girón, V.

2012-04-01

304

Towards robust deconvolution of low-dose perfusion CT: sparse perfusion deconvolution using online dictionary learning.  

PubMed

Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance to existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422
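
The dictionary-learning regularization is the contribution of this record and is not reproduced here; the sketch below only illustrates the underlying deconvolution step that such methods build on, a Tikhonov-regularized SVD inversion of the arterial-input-function convolution matrix. The gamma-variate arterial input function, exponential residue function, and all parameter values are assumptions made for illustration.

    import numpy as np

    def conv_matrix(aif, dt):
        """Lower-triangular (Toeplitz) convolution matrix built from the AIF."""
        n = len(aif)
        A = np.zeros((n, n))
        for i in range(n):
            A[i, :i + 1] = aif[i::-1]
        return A * dt

    def svd_deconvolve(tissue, aif, dt, lam=0.1):
        """Tikhonov-regularised SVD deconvolution; returns the flow-scaled residue."""
        A = conv_matrix(aif, dt)
        U, s, Vt = np.linalg.svd(A)
        filt = s / (s**2 + (lam * s.max())**2)     # regularised inverse of the singular values
        return Vt.T @ (filt * (U.T @ tissue))      # k(t) = CBF * R(t)

    if __name__ == "__main__":
        dt = 1.0
        t = np.arange(0, 60, dt)
        aif = 6.0 * (t / 8.0)**3 * np.exp(-t / 8.0)        # assumed gamma-variate arterial input
        cbf_true, mtt = 0.6, 4.0
        residue = np.exp(-t / mtt)                         # assumed exponential residue function
        tissue = cbf_true * dt * np.convolve(aif, residue)[:len(t)]
        tissue += 0.02 * np.random.default_rng(0).standard_normal(len(t))
        k = svd_deconvolve(tissue, aif, dt)
        print("estimated CBF (max of flow-scaled residue):", k.max(), "true:", cbf_true)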

Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C

2013-05-01

305

Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning  

PubMed Central

Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by the current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameters estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance than existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain. PMID:23542422

Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C.

2014-01-01

306

An Exploratory Analysis Tool for a Long-Term Video from a Stationary Camera  

E-print Network

The tool supports exploratory analysis of a long-term video from a stationary camera and consists of three key methods (spatial ...). This exploratory approach enables

Tanaka, Jiro

307

User Guide for the Advance Planning Risk Analysis Tool for Transportation Projects  

E-print Network

Report 0-5478-P1 is the user guide for the Advance Planning Risk Analysis (APRA) computer tool (program), which was developed to help participants in the advance planning of transportation projects.

Texas at Austin, University of

308

RISK MANAGEMENT AND RISK ANALYSIS-BASED DECISION TOOLS FOR ATTACKS ON  

E-print Network

Risk Management and Risk Analysis-Based Decision Tools for Attacks on Electric Power. Simonoff, J., et al. Center for Risk and Economic Analysis of Terrorism Events, University of Southern California, Los Angeles (usc.edu/create). Report #04-004 (draft).

Wang, Hai

309

Deconvolution of infrared spectra beyond the Doppler limit  

NASA Astrophysics Data System (ADS)

It is shown that the deconvolution method of Van Cittert can be used reliably to enhance the effective spectral resolution by a factor of about 3 with data that exhibit a high SNR and in which base line variations have been eliminated. Deconvolution of a Doppler-limited spectrum of C6H6 measured on a difference-frequency laser system yielded linewidths of about 1.2 x 10^-3 cm^-1 (compared with the Doppler width of 3.6 x 10^-3 cm^-1 at 203 K). Extensive reliability tests of the deconvolution technique have been performed.
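
The Van Cittert iteration itself is short enough to sketch. The synthetic doublet, Gaussian instrument function, and noise level below are illustrative assumptions and not the C6H6 measurements of the record.

    import numpy as np

    def van_cittert(y, kernel, n_iter=50, relax=1.0):
        """Classical Van Cittert iteration: x_{k+1} = x_k + relax * (y - kernel * x_k)."""
        x = y.copy()
        for _ in range(n_iter):
            x = x + relax * (y - np.convolve(x, kernel, mode="same"))
        return x

    if __name__ == "__main__":
        # synthetic Doppler-broadened doublet (assumed line shapes)
        v = np.linspace(-1.0, 1.0, 801)
        gauss = lambda c, w: np.exp(-0.5 * ((v - c) / w)**2)
        truth = gauss(-0.05, 0.01) + 0.7 * gauss(0.05, 0.01)
        instr = np.exp(-0.5 * (np.linspace(-0.15, 0.15, 121) / 0.03)**2)
        instr /= instr.sum()                          # normalised instrument function
        observed = np.convolve(truth, instr, mode="same")
        observed += 1e-3 * np.random.default_rng(0).standard_normal(v.size)   # high SNR
        restored = van_cittert(observed, instr, n_iter=100)
        print("peak enhancement factor:", restored.max() / observed.max())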

Pliva, J.; Pine, A. S.; Willson, P. D.

1980-06-01

310

Deconvolution of infrared spectra beyond the Doppler limit.  

PubMed

It is shown that the deconvolution method of Van Cittert can be used reliably to enhance the effective spectral resolution by a factor of ~3 with data that exhibit a high SNR (~10^3) and in which base line variations have been eliminated. Deconvolution of a Doppler-limited spectrum of C6H6 measured on a difference-frequency laser system yielded linewidths of ~1.2 x 10^-3 cm^-1 (compared with the Doppler width of 3.6 x 10^-3 cm^-1 at 203 K). Extensive reliability tests of the deconvolution technique have been performed. PMID:20221131

Pliva, J; Pine, A S; Willson, P D

1980-06-01

311

A Virtual Environment Task-Analysis Tool for the Creation of Virtual Art Exhibits  

Microsoft Academic Search

This paper describes the creation of a hypothetical virtual art exhibit using a virtual environment task analysis tool. The Virtual Environment Task Analysis Tool (VETAT-ART) is a paper-and-pencil tool developed to provide structure and guidance to the needs-analysis process that is essential to the development of lifelike virtual exhibits. To illustrate its potential usefulness, VETAT-ART is applied to the design

Anne Parent

1999-01-01

312

Detailed descriptions of new proof-of-concept Bluetooth security analysis tools and new security attacks  

Microsoft Academic Search

This report describes the details of two new proof-of-concept Bluetooth security analysis tools and two new attacks against Bluetooth security. On-Line PIN Cracking script is a security analysis tool for on-line Bluetooth device PIN cracking. Brute-Force BD ADDR Scanning script is a security analysis tool for brute-force discovery of the addresses of Bluetooth devices that want to be private. Scripts

Keijo M. J. Haataja

2005-01-01

313

Quantitative deconvolution of human thermal infrared emittance.  

PubMed

The bioheat transfer models conventionally employed in the etiology of human thermal infrared (TIR) emittance rely upon two assumptions: universal graybody emissivity and significant transmission of heat from subsurface tissue layers. In this work, a series of clinical and laboratory experiments were designed and carried out to conclusively evaluate the validity of the two assumptions. Results obtained from the objective analyses of TIR images of human facial and tibial regions demonstrated significant variations in spectral thermophysical properties at different anatomic locations on the human body. The limited validity of the two assumptions signifies the need for quantitative deconvolution of human TIR emittance in clinical, psychophysiological and critical applications. A novel approach to joint inversion of the bioheat transfer model is also introduced, leveraging the deterministic temperature dependency of proton resonance frequency in low-lipid human soft tissue for characterizing the relationship between subsurface 3D tissue temperature profiles and corresponding TIR emittance. PMID:23086533

Arthur, D T J; Khan, M M

2013-01-01

314

Deconvolution of mixed magnetism in multilayer graphene  

SciTech Connect

Magnetic properties of graphite modified at the edges by KCl and exfoliated graphite in the form of twisted multilayered graphene (<4 layers) are analyzed to understand the evolution of magnetic behavior in the absence of any magnetic impurities. The mixed magnetism in multilayer graphene is deconvoluted using Low field-high field hysteresis loops at different temperatures. In addition to temperature and the applied magnetic field, the density of edge state spins and the interaction between them decides the nature of the magnetic state. By virtue of magnetometry and electron spin resonance studies, we demonstrate that ferromagnetism is intrinsic and is due to the interactions among various paramagnetic centers. The strength of these magnetic correlations can be controlled by modifying the structure.
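
One common way to deconvolute such mixed magnetic signals, not necessarily the procedure used by these authors, is to fit the measured moment as the sum of a linear paramagnetic term and a saturating ferromagnetic term. The sketch below uses an anhysteretic tanh model, so coercivity is ignored, and all parameter values are assumed.

    import numpy as np
    from scipy.optimize import curve_fit

    def mixed_moment(H, chi, Ms, H0):
        """Paramagnetic (linear) plus ferromagnetic (saturating) contribution."""
        return chi * H + Ms * np.tanh(H / H0)

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        H = np.linspace(-5.0, 5.0, 201)                  # applied field, assumed range and units
        M = mixed_moment(H, chi=0.02, Ms=0.5, H0=0.8)    # synthetic mixed signal
        M += 0.005 * rng.standard_normal(H.size)         # measurement noise
        (chi, Ms, H0), _ = curve_fit(mixed_moment, H, M, p0=[0.01, 0.1, 1.0])
        M_ferro = M - chi * H                            # ferromagnetic part after subtracting the linear term
        print(f"paramagnetic slope chi = {chi:.3f}, ferromagnetic Ms = {Ms:.3f}, H0 = {H0:.3f}")
        print("saturation of separated ferromagnetic part:", np.max(np.abs(M_ferro)))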

Swain, Akshaya Kumar [IITB-Monash Research Academy, Department of Metallurgical Engineering and Materials Science, IIT Bombay, Mumbai 400076 (India); Bahadur, Dhirendra, E-mail: dhirenb@iitb.ac.in [Department of Metallurgical Engineering and Materials Science, IIT Bombay, Mumbai 400076 (India)

2014-06-16

315

Ejecta distribution patterns at Meteor Crater, Arizona: On the applicability of lithologic end-member deconvolution for spaceborne thermal infrared data of Earth and Mars

E-print Network

Discusses the analysis that can be performed on thermal infrared data currently being returned from Earth orbit. Index terms: 6225 Planetology: Solar System Objects: Mars. Keywords: Meteor Crater, thermal infrared, end-member deconvolution.

Ramsey, Michael

316

Static Analysis Tools, a Practical Approach for Safety-Critical Software Verification  

NASA Astrophysics Data System (ADS)

Static code analysis tools available today range from Lint-based syntax parsers to standards' compliance checkers to tools using more formal methods for verification. As safety-critical software complexity is increasing, these tools provide a means to ensure code quality, safety and dependability attributes. They also provide a means to introduce further automation in code analysis activities. The features presented by static code analysis tools are particularly interesting for V&V activities. In the scope of Independent Code Verification (IVE), two different static analysis tools have been used during Code Verification activities of the LISA Pathfinder onboard software in order to assess their contribution to the efficiency of the process and quality of the results. Polyspace (The MathWorks) and FlexeLint (Gimpel) tools have been used as examples of high-budget and low-budget tools respectively. Several aspects have been addressed: effort has been categorised for closer analysis (e.g. setup and configuration time, execution time, analysis of the results, etc.), reported issues have been categorised according to their type, and the coverage of traditional IVE tasks by the static code analysis tools has been evaluated. Final observations were made by analysing the aspects referred to above, namely cost effectiveness, quality of results, complementarities between the results of different static code analysis tools, and the relation between automated code analysis and manual code inspection.

Lopes, R.; Vicente, D.; Silva, N.

2009-05-01

317

Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs  

SciTech Connect

In imaging applications that focus on quantitative analysis, such as X-ray radiography in the security sciences, it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
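
A toy version of posterior sampling for deconvolution with pointwise uncertainty can be sketched in one dimension. The Gaussian likelihood, quadratic smoothness prior (standing in for the total-variation-based prior of the record), single-site random-walk Metropolis updates, and all parameter values are assumptions made for illustration.

    import numpy as np

    def log_post(x, y, kernel, sigma2, tau):
        """Gaussian likelihood with a quadratic smoothness prior (stand-in for TV)."""
        resid = y - np.convolve(x, kernel, mode="same")
        rough = np.diff(x)
        return -0.5 * resid @ resid / sigma2 - 0.5 * tau * rough @ rough

    def metropolis_deconv(y, kernel, sigma2=1e-4, tau=5.0, sweeps=3000, step=0.05, seed=0):
        rng = np.random.default_rng(seed)
        x = y.copy()
        lp = log_post(x, y, kernel, sigma2, tau)
        samples = []
        for s in range(sweeps):
            for i in range(len(x)):                     # single-site random-walk updates
                prop = x.copy()
                prop[i] += step * rng.standard_normal()
                lp_prop = log_post(prop, y, kernel, sigma2, tau)
                if np.log(rng.random()) < lp_prop - lp:
                    x, lp = prop, lp_prop
            if s >= sweeps // 2:                        # keep the second half as posterior draws
                samples.append(x.copy())
        return np.array(samples)

    if __name__ == "__main__":
        truth = np.zeros(40)
        truth[15:25] = 1.0
        kernel = np.ones(5) / 5.0
        rng = np.random.default_rng(1)
        y = np.convolve(truth, kernel, mode="same") + 0.01 * rng.standard_normal(40)
        draws = metropolis_deconv(y, kernel)
        mean, std = draws.mean(axis=0), draws.std(axis=0)   # pointwise uncertainty from the samples
        print("max pointwise posterior std:", std.max())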

Howard, M. [NSTec]; Luttman, A. [NSTec]; Fowler, M. [NSTec]

2014-11-01

318

Analysis of the influence of tool dynamics in diamond turning  

SciTech Connect

This report describes the progress in defining the role of machine and interface dynamics on the surface finish in diamond turning. It contains a review of literature from conventional and diamond machining processes relating tool dynamics, material interactions and tool wear to surface finish. Data from experimental measurements of tool/work piece interface dynamics are presented as well as machine dynamics for the DTM at the Center.

Fawcett, S.C.; Luttrell, D.E.; Keltie, R.F.

1988-12-01

319

Thermal Management Tools for Propulsion System Trade Studies and Analysis  

NASA Technical Reports Server (NTRS)

Energy-related subsystems in modern aircraft are more tightly coupled with less design margin. These subsystems include thermal management subsystems, vehicle electric power generation and distribution, aircraft engines, and flight control. Tighter coupling, lower design margins, and higher system complexity all make preliminary trade studies difficult. A suite of thermal management analysis tools has been developed to facilitate trade studies during preliminary design of air-vehicle propulsion systems. Simulink blocksets (from MathWorks) for developing quasi-steady-state and transient system models of aircraft thermal management systems and related energy systems have been developed. These blocksets extend the Simulink modeling environment in the thermal sciences and aircraft systems disciplines. The blocksets include blocks for modeling aircraft system heat loads, heat exchangers, pumps, reservoirs, fuel tanks, and other components at varying levels of model fidelity. The blocksets have been applied in a first-principles, physics-based modeling and simulation architecture for rapid prototyping of aircraft thermal management and related systems. They have been applied in representative modern aircraft thermal management system studies. The modeling and simulation architecture has also been used to conduct trade studies in a vehicle level model that incorporates coupling effects among the aircraft mission, engine cycle, fuel, and multi-phase heat-transfer materials.

McCarthy, Kevin; Hodge, Ernie

2011-01-01

320

ADVISOR: a systems analysis tool for advanced vehicle modeling  

NASA Astrophysics Data System (ADS)

This paper provides an overview of the Advanced Vehicle Simulator (ADVISOR), the US Department of Energy's (DOE's) vehicle simulation tool written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance, and the emissions of vehicles that use alternative technologies including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power sources) configurations. It excels at quantifying the relative change that can be expected due to the implementation of technology compared to a baseline scenario. ADVISOR's capabilities and limitations are presented and the power source models that are included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia in August 2001.

Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

321

The Effectiveness of Automated Static Analysis Tools for Fault Detection and Refactoring Prediction  

E-print Network

... of faults that caused failures, and refactoring modifications. The results show that fewer than 3% of the detected faults correspond to the coding concerns reported by the ASA tools. ASA tools were more

Bieman, James M.

322

Threat Analysis of Portable Hack Tools from USB Storage Devices and Protection Solutions  

E-print Network

... hack tools. However, besides U3 technology, attackers also have another, more flexible alternative: portable applications or application virtualization, which allows a wide range of hack tools to be compiled

Halgamuge, Malka N.

323

Computer-aided circuit analysis tools for RFIC simulation: algorithms, features, and limitations  

Microsoft Academic Search

The design of the radio frequency (RF) section in a communication integrated circuit (IC) is a challenging problem. Although several computer-aided analysis tools are available for RFIC design, they are not effectively used, because there is a lack of understanding about their features and limitations. These tools provide fast simulation of RFICs. However, no single tool delivers a complete solution

Kartikeya Mayaram; David C. Lee; Shahriar Moinian; David A. Rich; Jaijeet Roychowdhury

2000-01-01

324

New Access and Analysis Tools for Voyager LECP Data  

NASA Astrophysics Data System (ADS)

The Low Energy Charged Particle (LECP) instruments on the Voyager 1 and 2 spacecraft have been returning unique scientific measurements since launching in 1977, most notably observations from the historic tour of the giant planets. As these spacecraft continue on their exit trajectories from the Solar system they have become an interstellar mission and have begun to probe the boundary between the heliosphere and the interstellar cloud and continue to make exciting discoveries. As the mission changed from one focused on discrete encounters to an open ended search for heliospheric boundaries and transitory disturbances, the positions and timing of which are not known, the data processing needs have changed. Open data policies and the push to draw data under the umbrella of emerging Virtual Observatories have added a data sharing component that was not a part of the original mission plans. We present our work in utilizing new, reusable software analysis tools to access legacy data in a way that leverages pre-existing data analysis techniques. We took an existing Applied Physics Laboratory application, Mission Independent Data Layer (MIDL) -- developed originally under a NASA Applied Information Research Program (AISRP) and subsequently used with data from Geotail, Cassini, IMP-8, ACE, Messenger, and New Horizons -- and applied it to Voyager data. We use the MIDL codebase to automatically generate standard data products such as daily summary plots and associated tabulated data that increase our ability to monitor the heliospheric environment on a regular basis. These data products will be publicly available and updated automatically and can be analyzed by the community using the ultra portable MIDL software launched from the data distribution website. The currently available LECP data will also be described with SPASE metadata and incorporated into the emerging Virtual Energetic Particle Observatory (VEPO).

Brown, L. E.; Hill, M. E.; Decker, R. B.; Cooper, J. F.; Krimigis, S. M.; Vandegriff, J. D.

2008-12-01

325

The analysis of crow population dynamics as a surveillance tool.  

PubMed

West Nile virus (WNV) infection, a zoonotic disease for which birds act as a reservoir, first appeared in North America in August 1999. It was first reported in Quebec in 2002. The Quebec surveillance system for WNV has several components, including the surveillance of mortality in corvid populations, which includes the American crow (Corvus brachyrhynchos). The main objectives of this study are to better understand the population dynamics of this species in Quebec and to evaluate the impact of WNV on these dynamics. We obtained observation data for living crows in this province for the period of 1990-2005 and then conducted a spectral analysis of these data. To study changes in crow population dynamics, the analysis was carried out before and after the appearance of WNV and space was divided in two different areas (urban and non-urban). Our results show the importance of cycles with periods of less than 1 year in non-urban areas and cycles with periods of greater than 1 year in urban areas in the normal population dynamics of the species. We obtained expected fluctuations in bird densities using an algorithm derived from spectral decomposition. When we compared these predictions with data observed after 2002, we found marked perturbations in population dynamics beginning in 2003 and lasting up to 2005. In the discussion, we present various hypotheses based on the behaviour of the American crow to explain the normal population dynamics observed in this species and the effect of type of area (urban versus non-urban). We also discuss how the predictive algorithm could be used as a disease surveillance tool and as a measure of the impact of a disease on wild fauna. PMID:19811623
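
The kind of spectral decomposition described here, extracting dominant cycles and using them to predict expected fluctuations, can be sketched as follows. The monthly count series, the number of retained components, and the anomaly logic are illustrative assumptions, not the Quebec crow data or the authors' algorithm.

    import numpy as np

    def dominant_reconstruction(series, n_components=3):
        """Keep the strongest Fourier components (plus the mean) as the expected cycle."""
        f = np.fft.rfft(series - series.mean())
        keep = np.argsort(np.abs(f))[-n_components:]     # indices of the strongest cycles
        f_filtered = np.zeros_like(f)
        f_filtered[keep] = f[keep]
        return series.mean() + np.fft.irfft(f_filtered, n=len(series))

    if __name__ == "__main__":
        months = np.arange(120)                          # ten years of monthly observations (synthetic)
        rng = np.random.default_rng(3)
        counts = (100 + 40 * np.sin(2 * np.pi * months / 12)    # annual cycle
                      + 15 * np.sin(2 * np.pi * months / 6)     # sub-annual cycle
                      + 10 * rng.standard_normal(months.size))
        expected = dominant_reconstruction(counts, n_components=3)
        anomaly = counts - expected                      # departures that might flag a perturbation
        print("largest anomaly relative to expected fluctuations:", anomaly.max())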

Ludwig, A; Bigras-Poulin, M; Michel, P

2009-12-01

326

A measuring tool for tree-rings analysis  

NASA Astrophysics Data System (ADS)

A special tool has been created for the measurement and analysis of annual tree-ring widths. It consists of a professional scanner, a computer system, and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison to manual measurement systems it offers a number of advantages: a gain in productivity, the possibility of archiving the results of the measurements at any stage of processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including those that are difficult to process because of a complex wood structure (inhomogeneous growth in different directions; missing, light, and false rings, etc.). This software can analyze pictures made with optical scanners and analog or digital cameras. The software was written in C++ and is compatible with modern Windows operating systems. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring widths as a function of year is displayed on screen during the analysis, and it can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The complex is universal in application, which will allow its use for a variety of problems in biology and ecology. With the help of this complex, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed from samples collected on the Kola Peninsula (northwestern Russia).

Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

2013-04-01

327

Experimental analysis of change detection algorithms for multitooth machine tool fault detection  

NASA Astrophysics Data System (ADS)

This paper describes an industrial application of a fault diagnosis method for a multitooth machine tool. Different statistical approaches have been used to detect and diagnose insert breakage in multitooth tools based on the analysis of the electrical power consumption of the tool drives. Great effort has been made to obtain a robust method able to avoid any re-calibration process after, for example, a maintenance operation. From the point of view of maintenance costs, these multitooth tools are the most critical part of the machine tools used for mass production in the car industry. These tools integrate different kinds of machining operations and cutting conditions.
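
As one example of the statistical change-detection approaches mentioned in this record (not necessarily the ones the authors used), a two-sided CUSUM statistic applied to the drive power signal looks like the sketch below; the power levels, slack, and threshold are assumed values.

    import numpy as np

    def cusum(signal, target, slack, threshold):
        """Two-sided CUSUM; returns the index of the first detected shift (or -1)."""
        pos = neg = 0.0
        for i, x in enumerate(signal):
            pos = max(0.0, pos + (x - target - slack))
            neg = max(0.0, neg + (target - x - slack))
            if pos > threshold or neg > threshold:
                return i
        return -1

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        healthy = 10.0 + 0.2 * rng.standard_normal(300)   # drive power, arbitrary units
        faulty = 10.6 + 0.2 * rng.standard_normal(100)    # step change after an insert breaks
        power = np.concatenate([healthy, faulty])
        idx = cusum(power, target=10.0, slack=0.1, threshold=2.0)
        print("change detected at sample:", idx)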

Reñones, Aníbal; de Miguel, Luis J.; Perán, José R.

2009-10-01

328

Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools  

NASA Technical Reports Server (NTRS)

A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over having one integrated tool. This tool-suite, their characteristics, and their applicability will be described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

2006-01-01

329

Social dataset analysis and mapping tools for Risk Perception: resilience, people preparation and communication tools  

NASA Astrophysics Data System (ADS)

Perception has been identified as a resource and part of the resilience of a community to disasters. Risk perception, if present, may determine the potential damage a household or community experiences. Different levels of risk perception and preparedness can directly influence people's susceptibility and the way they might react in case of an emergency caused by natural hazards. In spite of the profuse literature about risk perception, works that spatially portray this feature are scarce. The spatial relationship to danger or hazard is being recognised as an important factor of the risk equation; it can be used as a powerful tool either for better knowledge or for operational reasons (e.g. management of preventive information). Risk perception and people's awareness, when displayed in a spatial format, can be useful for several actors in the risk management arena. Local authorities and civil protection can better address educational activities to increase the preparation of particularly vulnerable groups or clusters of households within a community. It can also be useful for emergency personnel in order to optimally direct the actions in case of an emergency. In the framework of a Marie Curie research project, a Community-Based Early Warning System (CBEWS) has been developed in the Mountain Community of Valtellina di Tirano, northern Italy. This community has been continuously exposed to different mass movements and floods; in particular, a large event in 1987 affected a large portion of the valley and left 58 dead. The current emergency plan for the study area is built around a real-time, highly detailed decision support system. This emergency plan contains detailed instructions for the rapid deployment of civil protection and other emergency personnel in case of emergency, for previously defined risk scenarios. Especially in case of a large event, where timely reaction is crucial for reducing casualties, it is important for those in charge of emergency management to know in advance the different levels of risk perception and preparedness existing among several sectors of the population. Knowing where the most vulnerable population is located may optimize the use of resources, better direct the initial efforts and organize the evacuation and attention procedures. As part of the CBEWS, a comprehensive survey was applied in the study area to measure, among other features, the levels of risk perception, preparation and information received about natural hazards. After a statistical and direct analysis of the complete social dataset recorded, a spatial distribution of the information is currently in progress. Based on boundary features (municipalities and sub-districts) from the Italian Institute of Statistics (ISTAT), a local-scale background has been obtained (individual addresses are not accessible under privacy rules, so the local district ID within each municipality is the level of detail used), and the spatial location of the surveyed population has been completed. The geometric component has been defined, and it is now possible to create a local distribution of the social parameters derived from the perception questionnaire results. The raw information and socio-statistical analyses offer different views and "visual concepts" of risk perception. For this reason a complete GeoDB is being developed for the organization of the dataset. From a technical point of view, the data-sharing environment is based on a fully open-source web-service environment, offering a hand-crafted, user-friendly interface to this kind of information. The final aim is to offer different views of the dataset, using the same scale prototype and hierarchical data structure, to provide and compare the spatial distribution of risk perception at the most detailed level.

Peters-Guarin, Graciela; Garcia, Carolina; Frigerio, Simone

2010-05-01

330

Efficient implementation of spatially-varying 3D ultrasound deconvolution  

E-print Network

1 Efficient implementation of spatially-varying 3D ultrasound deconvolution Henry Gomersall, David Hodgson, Richard Prager, Nick Kingsbury, Graham Treece, Andrew Gee Abstract--There are sometimes occasions

Kingsbury, Nick

331

Tool Chatter Monitoring in Turning Operations Using Wavelet Analysis of Ultrasound Waves  

Microsoft Academic Search

This paper presents a new method for tool chatter monitoring using the wavelet analysis of ultrasound waves. Ultrasound waves are pulsed through the cutting tool towards the nose and are reflected back off the cutting edge. Fluctuating states of contact and non-contact between the tool insert and the workpiece, which are generated as a result of tool chatter, affect the

J. H. Lange; N. H. Abu-Zahra

2002-01-01

332

Deconvolution of infrared spectra beyond the Doppler limit  

Microsoft Academic Search

It is shown that the deconvolution method of Van Cittert can be used reliably to enhance the effective spectral resolution by a factor of about 3 with data that exhibit a high SNR and in which base line variations have been eliminated. Deconvolution of a Doppler-limited spectrum of C6H6 measured on a difference-frequency laser system yielded linewidths of about 1.2

J. Pliva; A. S. Pine; P. D. Willson

1980-01-01

333

FAST GEM WAVELET-BASED IMAGE DECONVOLUTION ALGORITHM  

Microsoft Academic Search

The paper proposes a new wavelet-based Bayesian approach to image deconvolution, under the space-invariant blur and additive white Gaussian noise assumptions. Image deconvolution exploits the well-known sparsity of the wavelet coefficients, described by heavy-tailed priors. The present approach admits any prior given by a linear (finite or infinite) combination of Gaussian densities. To compute the maximum a

M. B. Dias

2003-01-01

334

MTpy - Python Tools for Magnetotelluric Data Processing and Analysis  

NASA Astrophysics Data System (ADS)

We present the Python package MTpy, which provides functions for the processing, analysis, and handling of magnetotelluric (MT) data sets. MT is a relatively immature and not widely applied geophysical method in comparison to other geophysical techniques such as seismology. As a result, the data processing within the academic MT community is not thoroughly standardised and is often based on a loose collection of software, adapted to the respective local specifications. We have developed MTpy to overcome problems that arise from missing standards, and to provide a simplification of the general handling of MT data. MTpy is written in Python, and the open-source code is freely available from a GitHub repository. The setup follows the modular approach of successful geoscience software packages such as GMT or Obspy. It contains sub-packages and modules for the various tasks within the standard work-flow of MT data processing and interpretation. In order to allow the inclusion of already existing and well established software, MTpy does not only provide pure Python classes and functions, but also wrapping command-line scripts to run standalone tools, e.g. modelling and inversion codes. Our aim is to provide a flexible framework, which is open for future dynamic extensions. MTpy has the potential to promote the standardisation of processing procedures and at same time be a versatile supplement for existing algorithms. Here, we introduce the concept and structure of MTpy, and we illustrate the workflow of MT data processing, interpretation, and visualisation utilising MTpy on example data sets collected over different regions of Australia and the USA.

Krieger, Lars; Peacock, Jared; Thiel, Stephan; Inverarity, Kent; Kirkby, Alison; Robertson, Kate; Soeffky, Paul; Didana, Yohannes

2014-05-01

335

AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool  

USGS Publications Warehouse

Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

Halford, Keith

2009-01-01

336

AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool  

SciTech Connect

Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

Keith J. Halford

2009-10-01

337

Online Analysis of Wind and Solar Part I: Ramping Tool  

SciTech Connect

To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. This tool predicts and displays additional capacity and ramping requirements caused by uncertainties in forecasts of loads and renewable generation. The tool is currently operational in the CAISO operations center. This is one of two final reports on the project.

Etingov, Pavel V.; Ma, Jian; Makarov, Yuri V.; Subbarao, Krishnappa

2012-01-31

338

Online Analysis of Wind and Solar Part II: Transmission Tool  

SciTech Connect

To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian; Subbarao, Krishnappa

2012-01-31

339

Security EvaBio: An Analysis Tool for the Security Evaluation of Biometric Authentication Systems  

E-print Network

... and vulnerabilities of biometric systems is given in Section 4. The security analysis tool is then presented. Biometric systems present several drawbacks that may significantly decrease their utility. Nowadays, several

Paris-Sud XI, Université de

340

DMT-TAFM: a data mining tool for technical analysis of futures market  

Microsoft Academic Search

Technical analysis of financial markets describes many patterns of market behavior. For practical use, all these descriptions need to be adjusted for each particular trading session. In this paper, we develop a data mining tool for technical analysis of the futures markets (DMT-TAFM), which dynamically generates rules based on the notion of the price pattern similarity. The tool consists of

Vladimir Stepanov; Archana Sathaye

2002-01-01

341

SEE: a tool for the visualization and analysis of rodent exploratory behavior  

E-print Network

... for the exploration of exploratory behavior. The raw data for SEE are a time series of the animal's coordinates. ... behavior creates a need for a visualization and analysis tool that will highlight regularities and help

Golani, Ilan

342

Abstract Title: Image Informatics Tools for the Analysis of Retinal Images  

E-print Network

Center for Bioimage Informatics, Dept. of Electrical and Computer Engineering, University of California Santa Barbara. Image analysis: non-clinical. Purpose: To develop software tools that can be used for the enhancement

California at Santa Barbara, University of

343

DEVELOPMENT OF AN ANALYSIS TOOL FOR THE DESIGN OF BONDED COMPOSITE REPAIRS  

Microsoft Academic Search

As part of the programme a design and analysis tool for bonded composite repairs has been developed. The repair design tool runs on a normal PC under Microsoft Office Excel, which is easily accessible for most people. A wide variety of joint designs, including external patches and scarf repairs, can be specified via a simple-to-use input interface. The analysis

R. J. C. Creemers

344

Applied Climate-Change Analysis: The Climate Wizard Tool  

PubMed Central

Background Although the message of global climate change is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has and is projected to change within specific geographic areas throughout the world. Methodology/Principal Findings To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50°N during February-March to 10°N during August-September. Precipitation decreases occurred most commonly in countries between 0-20°N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. Conclusions/Significance The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally-specific analyses of climate change. Moreover, Climate Wizard is not a static product, but rather a data analysis framework designed to be used for climate change impact and adaptation planning, which can be expanded to include other information, such as downscaled future projections of hydrology, soil moisture, wildfire, vegetation, marine conditions, disease, and agricultural productivity. PMID:20016827
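
A quantile ensemble analysis of the sort described here reduces, for each region, to computing percentiles of the projected change across the GCMs. The sketch below uses synthetic values for 16 models and assumed percentile levels, not Climate Wizard data or code.

    import numpy as np

    # synthetic projected temperature changes (deg C) for 16 GCMs x 5 regions (assumed values)
    rng = np.random.default_rng(5)
    changes = 2.5 + 0.8 * rng.standard_normal((16, 5))

    # ensemble quantiles per region, the core of a quantile ensemble analysis
    low, median, high = np.percentile(changes, [10, 50, 90], axis=0)
    for r in range(changes.shape[1]):
        print(f"region {r}: 10th/50th/90th percentile change = "
              f"{low[r]:.2f} / {median[r]:.2f} / {high[r]:.2f} deg C")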

Girvetz, Evan H.; Zganjar, Chris; Raber, George T.; Maurer, Edwin P.; Kareiva, Peter; Lawler, Joshua J.

2009-01-01

345

Nonlinear deconvolution of hyperspectral data with MCMC for studying the kinematics of galaxies.  

PubMed

Hyperspectral imaging has been an area of active research in image processing and analysis for more than 10 years, mainly for remote sensing applications. Astronomical ground-based hyperspectral imagers offer new challenges to the community, which differ from the previous ones in the nature of the observed objects, but also in the quality of the data, with a low signal-to-noise ratio and a low resolution, due to the atmospheric turbulence. In this paper, we focus on a deconvolution problem specific to hyperspectral astronomical data, to improve the study of the kinematics of galaxies. The aim is to estimate the flux, the relative velocity, and the velocity dispersion, integrated along the line-of-sight, for each spatial pixel of an observed galaxy. Thanks to the Doppler effect, this is equivalent to estimate the amplitude, center, and width of spectral emission lines, in a small spectral range, for every spatial pixel of the hyperspectral data. We consider a parametric model for the spectral lines and propose to compute the posterior mean estimators, in a Bayesian framework, using Monte Carlo Markov chain algorithms. Various estimation schemes are proposed for this nonlinear deconvolution problem, taking advantage of the linearity of the model with respect to the flux parameters. We differentiate between methods taking into account the spatial blurring of the data (deconvolution) or not (estimation). The performances of the methods are compared with classical ones, on two simulated data sets. It is shown that the proposed deconvolution method significantly improves the resolution of the estimated kinematic parameters. PMID:25073172
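
For a single spatial pixel, the per-line estimation step (without the spatial deconvolution that is the record's main contribution) can be sketched as a random-walk Metropolis fit of a Gaussian emission line. The wavelength axis, rest wavelength, noise level, and proposal scales below are assumptions chosen for illustration.

    import numpy as np

    def gaussian_line(wave, amp, center, width):
        return amp * np.exp(-0.5 * ((wave - center) / width) ** 2)

    def mh_fit_line(wave, spec, sigma, n_iter=20000, seed=0):
        """Random-walk Metropolis over (amplitude, centre, width); returns posterior means."""
        rng = np.random.default_rng(seed)
        theta = np.array([spec.max(), wave[np.argmax(spec)], 2.0])   # crude initialisation
        scale = np.array([0.05, 0.05, 0.05])                         # assumed proposal scales
        def loglike(p):
            if p[0] <= 0 or p[2] <= 0:                               # positivity of amplitude and width
                return -np.inf
            r = spec - gaussian_line(wave, *p)
            return -0.5 * np.sum(r**2) / sigma**2
        lp = loglike(theta)
        chain = np.empty((n_iter, 3))
        for k in range(n_iter):
            prop = theta + scale * rng.standard_normal(3)
            lp_prop = loglike(prop)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain[k] = theta
        return chain[n_iter // 2:].mean(axis=0)                      # posterior means after burn-in

    if __name__ == "__main__":
        wave = np.linspace(6550.0, 6580.0, 60)                       # wavelength axis in Angstrom (synthetic)
        rng = np.random.default_rng(1)
        spec = gaussian_line(wave, 1.0, 6565.0, 1.8) + 0.05 * rng.standard_normal(wave.size)
        amp, center, width = mh_fit_line(wave, spec, sigma=0.05)
        velocity = (center / 6562.8 - 1.0) * 299792.458              # Doppler shift relative to H-alpha rest wavelength
        print(f"amp={amp:.2f}, centre={center:.2f} A, width={width:.2f} A, velocity={velocity:.0f} km/s")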

Villeneuve, Emma; Carfantan, Hervé

2014-10-01
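
The estimation described in the record above reduces, per spatial pixel, to fitting the amplitude, center, and width of an emission line. A minimal Metropolis-Hastings illustration of that kind of fit on one noisy simulated spectrum is sketched below in Python; it is a generic sketch, not the authors' estimator, and the spectrum, noise level, priors, and step sizes are all assumptions.

# Metropolis-Hastings fit of a single Gaussian emission line (amplitude, center,
# width) to one simulated noisy spectrum. Illustrative stand-in only.
import numpy as np

rng = np.random.default_rng(0)

# Simulated spectrum: one Gaussian emission line plus white noise.
wavelength = np.linspace(-10.0, 10.0, 200)
true_amp, true_center, true_width = 5.0, 1.5, 2.0
line = true_amp * np.exp(-0.5 * ((wavelength - true_center) / true_width) ** 2)
data = line + rng.normal(0.0, 0.5, wavelength.size)

def log_likelihood(theta, sigma_noise=0.5):
    amp, center, width = theta
    if amp <= 0.0 or width <= 0.0:          # flat priors with positivity constraints
        return -np.inf
    model = amp * np.exp(-0.5 * ((wavelength - center) / width) ** 2)
    return -0.5 * np.sum(((data - model) / sigma_noise) ** 2)

def metropolis(n_steps=20000, step=(0.2, 0.1, 0.1)):
    theta = np.array([1.0, 0.0, 1.0])       # crude starting point
    logp = log_likelihood(theta)
    chain = np.empty((n_steps, 3))
    for i in range(n_steps):
        proposal = theta + np.asarray(step) * rng.normal(size=3)
        logp_new = log_likelihood(proposal)
        if np.log(rng.uniform()) < logp_new - logp:   # Metropolis accept/reject
            theta, logp = proposal, logp_new
        chain[i] = theta
    return chain[n_steps // 2:]              # discard the first half as burn-in

samples = metropolis()
print("posterior means (amp, center, width):", samples.mean(axis=0).round(3))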

346

An Integrated Traverse Planner and Analysis Tool for Planetary Exploration  

E-print Network

Future planetary explorations will require surface traverses of unprecedented frequency, length, and duration. As a result, there is need for exploration support tools to maximize productivity, scientific return, and safety. ...

Johnson, Aaron William

347

Evaluation of a Surface Exploration Traverse Analysis and Navigation Tool  

E-print Network

SEXTANT is an extravehicular activity (EVA) mission planner tool developed in MATLAB, which computes the most efficient path between waypoints across a planetary surface. The traverse efficiency can be optimized around ...

Gilkey, Andrea L.

348

Novel tools for sequence and epitope analysis of glycosaminoglycans  

E-print Network

Our understanding of glycosaminoglycan (GAG) biology has been limited by a lack of sensitive and efficient analytical tools designed to deal with these complex molecules. GAGs are heterogeneous and often sulfated linear ...

Behr, Jonathan Robert

2007-01-01

349

GENOME RESOURCES AND COMPARATIVE ANALYSIS TOOLS FOR CARDIOVASCULAR RESEARCH  

Technology Transfer Automated Retrieval System (TEKTRAN)

Disorders of the cardiovascular (CV) system are often caused by the interaction of genetic and environmental factors that jointly contribute to individual susceptibility. Genomic data and bioinformatics tools generated from genome projects, coupled with functional verification, offer novel approache...

350

Development of Harmonic Excitation technique for Machine Tool stability analysis  

Microsoft Academic Search

The project described in this thesis was to establish the instrumentation and technique for analysing the stability of machine tools against chatter by harmonic excitation. To test out the technique, two sets of experiments were performed on centre lathes: 1) comparison of cutting stability with four different types of boring bars, and 2) comparison of cutting stability of a tool oriented in

Michael King-Chun Lau

1973-01-01

351

pathFinder: A Static Network Analysis Tool for Pharmacological Analysis of Signal Transduction Pathways  

NSDL National Science Digital Library

The study of signal transduction is becoming a de facto part of the analysis of gene expression and protein profiling techniques. Many online tools are used to cluster genes in various ways or to assign gene products to signal transduction pathways. Among these, pathFinder is a unique tool that can find signal transduction pathways between first, second, or nth messengers and their targets within the cell. pathFinder can identify qualitatively all possible signal transduction pathways connecting any starting component and target within a database of two-component pathways (directional dyads). One or more intermediate pathway components can be excluded to simulate the use of pharmacological inhibitors or genetic deletion (knockout). Missing elements in a pathway connecting the activator or initiator and target can also be inferred from a null pathway result. The value of this static network analysis tool is illustrated by the prediction from pathFinder analysis of a novel cyclic AMP-dependent, protein kinase A-independent signaling pathway in neuroendocrine cells, which has been experimentally confirmed.

Babru B. Samal (NIH; National Institute of Mental Health--Intramural Research Programs (NIMH-IRP) Bioinformatics Core REV)

2008-08-05
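
The search idea behind pathFinder, as summarized above, is enumeration of directed paths through a database of directional dyads with optional exclusions. A generic Python sketch of that idea follows; it is not pathFinder's implementation, and the toy dyad list and node names are hypothetical.

# Enumerate all simple directed paths between a starting messenger and a target
# through a set of directional dyads, with optional exclusions to mimic
# pharmacological inhibition or genetic knockout. Node names are hypothetical.
from collections import defaultdict

dyads = [("cAMP", "PKA"), ("cAMP", "Epac"), ("Epac", "Rap1"),
         ("PKA", "CREB"), ("Rap1", "ERK"), ("ERK", "CREB")]

def build_graph(edges):
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    return graph

def all_paths(graph, start, target, excluded=frozenset()):
    """Depth-first enumeration of all simple paths from start to target."""
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == target:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt in excluded or nxt in path:   # skip knockouts and cycles
                continue
            stack.append((nxt, path + [nxt]))
    return paths

graph = build_graph(dyads)
print(all_paths(graph, "cAMP", "CREB"))                      # all routes
print(all_paths(graph, "cAMP", "CREB", excluded={"PKA"}))    # PKA "knockout"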

352

pathFinder: A Static Network Analysis Tool for Pharmacological Analysis of Signal Transduction Pathways  

PubMed Central

The study of signal transduction is becoming a de facto part of the analysis of gene expression and protein profiling techniques. Many online tools are used to cluster genes in various ways or to assign gene products to signal transduction pathways. Among these, pathFinder is a unique tool that can find signal transduction pathways between first, second, or nth messengers and their targets within the cell. pathFinder can identify qualitatively all possible signal transduction pathways connecting any starting component and target within a database of two-component pathways (directional dyads). One or more intermediate pathway components can be excluded to simulate the use of pharmacological inhibitors or genetic deletion (knockout). Missing elements in a pathway connecting the activator or initiator and target can also be inferred from a null pathway result. The value of this static network analysis tool is illustrated by the prediction from pathFinder analysis of a novel cyclic AMP-dependent, protein kinase A-independent signaling pathway in neuroendocrine cells, which has been experimentally confirmed. PMID:18682604

Samal, Babru B.; Eiden, Lee E.

2009-01-01

353

Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches  

NASA Astrophysics Data System (ADS)

We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.

Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

2014-12-01

354

Tools-4-Metatool (T4M): online suite of web-tools to process stoichiometric network analysis data from Metatool.  

PubMed

Tools-4-Metatool (T4M) is a suite of web-tools, implemented in PERL, which analyses, parses, and manipulates files related to Metatool. Its main goal is to assist the work with Metatool. T4M has two major sets of tools: Analysis and Compare. Analysis visualizes the results of Metatool (convex basis, elementary flux modes, and enzyme subsets) and facilitates the study of metabolic networks. It is composed of five tools: MDigraph, MetaMatrix, CBGraph, EMGraph, and SortEM. Compare was developed to compare different Metatool results from different networks. This set consists of Compara and ComparaSub, which compare network subsets and provide outputs in different formats, and ComparaEM, which seeks identical elementary modes in two metabolic networks. The suite T4M also includes one script that generates Metatool input: CBasis2Metatool, based on a Metatool output file that is filtered by a list of convex basis metabolites. Finally, the utility CheckMIn checks the consistency of the Metatool input file. T4M is available at http://solea.quim.ucm.es/t4m. PMID:21554926

Xavier, Daniela; Vázquez, Sara; Higuera, Clara; Morán, Federico; Montero, Francisco

2011-08-01

355

Irises: A practical tool for image-based analysis of cellular DNA content.  

PubMed

The DNA content of nuclei is a valuable measure of cell cycle status. Irises is a software tool to facilitate systematic in situ determination of DNA content for cell cycle analysis at single-nucleus resolution within complex tissues. We demonstrate the utility of the tool with analysis of DNA content in germline nuclei of C. elegans. Compared with results obtained by manual analysis, we find the tool greatly facilitates analysis by improving speed at least 5-fold while maintaining accuracy. The source code and instruction manual (including installation for both Mac and PC) are provided. PMID:25254149

Vogel, Julia L Moore; Michaelson, David; Santella, Anthony; Hubbard, E Jane Albert; Bao, Zhirong

2014-01-01

356

Teaching Advanced Data Analysis Tools to High School Astronomy Students  

NASA Astrophysics Data System (ADS)

A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed instruction tailored to their experience level along with proper support and mentoring. This project was funded by a grant from the National Science Foundation, Grant # PHY1157078.

Black, David V.; Herring, Julie; Hintz, Eric G.

2015-01-01

357

Affinity-based target deconvolution of safranal  

PubMed Central

Background and the purpose of the study Affinity-based target deconvolution is an emerging method for the identification of interactions between drugs/drug candidates and cellular proteins, and helps to predict potential activities and side effects of a given compound. In the present study, we hypothesized that part of the pharmacological effects of safranal, one of the major constituents of Crocus sativus L., relies on its physical interaction with target proteins. Methods Affinity chromatography solid support was prepared by covalent attachment of safranal to agarose beads. After passing tissue lysate through the column, safranal-bound proteins were isolated and separated on SDS-PAGE or two-dimensional gel electrophoresis. Proteins were identified using MALDI-TOF/TOF mass spectrometry and Mascot software. Results and major conclusion Data showed that safranal physically binds to beta actin, cytochrome b-c1 complex sub-unit 1, trifunctional enzyme sub-unit beta and ATP synthase sub-unit alpha and beta. These interactions may explain part of safranal's pharmacological effects. However, phenotypic and/or biological relevance of these interactions remains to be elucidated by future pharmacological studies. PMID:23514587

2013-01-01

358

OutbreakTools: a new platform for disease outbreak analysis using the R software.  

PubMed

The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

Jombart, Thibaut; Aanensen, David M; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

2014-06-01

359

OutbreakTools: A new platform for disease outbreak analysis using the R software  

PubMed Central

The investigation of infectious disease outbreaks relies on the analysis of increasingly complex and diverse data, which offer new prospects for gaining insights into disease transmission processes and informing public health policies. However, the potential of such data can only be harnessed using a number of different, complementary approaches and tools, and a unified platform for the analysis of disease outbreaks is still lacking. In this paper, we present the new R package OutbreakTools, which aims to provide a basis for outbreak data management and analysis in R. OutbreakTools is developed by a community of epidemiologists, statisticians, modellers and bioinformaticians, and implements classes and methods for storing, handling and visualizing outbreak data. It includes real and simulated outbreak datasets. Together with a number of tools for infectious disease epidemiology recently made available in R, OutbreakTools contributes to the emergence of a new, free and open-source platform for the analysis of disease outbreaks. PMID:24928667

Jombart, Thibaut; Aanensen, David M.; Baguelin, Marc; Birrell, Paul; Cauchemez, Simon; Camacho, Anton; Colijn, Caroline; Collins, Caitlin; Cori, Anne; Didelot, Xavier; Fraser, Christophe; Frost, Simon; Hens, Niel; Hugues, Joseph; Höhle, Michael; Opatowski, Lulla; Rambaut, Andrew; Ratmann, Oliver; Soubeyrand, Samuel; Suchard, Marc A.; Wallinga, Jacco; Ypma, Rolf; Ferguson, Neil

2014-01-01

360

Automation Tools for Finite Element Analysis of Adhesively Bonded Joints  

NASA Technical Reports Server (NTRS)

This article presents two new automation creation tools that obtain stresses and strains (Shear and peel) in adhesively bonded joints. For a given adhesively bonded joint Finite Element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for design of adhesively bonded joints, can be performed very quickly.

Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

2002-01-01

361

Research on dynamic tools integration model for sea battlefield environmental analysis  

Microsoft Academic Search

A dynamic analysis tools integration model based on geo-workflow is presented, through the analysis of the sea battlefield environmental analysis process. Geo-workflow is a new technology to manage spatial processes, but its methodology, especially the scheduling of data flow, is still weak. In the paper, the traditional workflow model is extended and a modeling method of environmental analysis data flow is

Zhe Gan; Xiaoan Tang; Boning Ma; Huan Li; Hong Chen

2008-01-01

362

Deconvolution of dynamic dual photon microscopy images of cerebral microvasculature to assess the hemodynamic status of the brain  

NASA Astrophysics Data System (ADS)

Assessing the hemodynamic status of the brain and its variations in response to stimulations is required to understand the local cerebral circulatory mechanisms. Dynamic contrast enhanced imaging of cerebral microvasculature provides information that can be used in understanding the physiology of cerebral diseases. Bolus tracking is used to extract characteristic parameters that quantify local cerebral blood flow. However, post-processing of the data is needed to segment the field of view (FOV) and to perform deconvolution to remove the effects of the input bolus profile and the path it travels to reach the imaging window. Finding the arterial input function (AIF) and dealing with the ill-posedness of the deconvolution system are the main challenges of this process. We propose using ICA to segment the FOV and to extract a local AIF as well as the venous output function that is required for deconvolution. This also helps to stabilize the system, as ICA suppresses noise efficiently. Tikhonov regularization (with L-curve analysis to find the best regularization parameter) is used to make the system stable. In-vivo dynamic 2PLSM images of a rat brain in two conditions (when the animal is at rest and when it is stimulated) are used in this study. The experimental and simulation studies provided promising results that demonstrate the feasibility and importance of performing deconvolution.

Mehrabian, Hatef; Lindvere, Liis; Stefanovic, Bojana; Martel, Anne L.

2011-03-01
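
As a rough illustration of the regularization step mentioned in the abstract above, the following Python sketch performs Tikhonov-regularized deconvolution of a simulated tissue curve by a simulated AIF and sweeps the regularization parameter to trace an L-curve. It is not the authors' pipeline; the AIF shape, residue function, noise level, and lambda range are assumptions.

# Tikhonov-regularized deconvolution of a simulated tissue curve by a simulated AIF.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 60, 1.0)                 # time samples (s); 1 s spacing assumed
aif = (t / 5.0) * np.exp(-t / 5.0)        # simulated gamma-variate-like AIF
residue = np.exp(-t / 12.0)               # simulated tissue residue function

# Lower-triangular convolution matrix A so that tissue = A @ residue.
n = t.size
A = np.zeros((n, n))
for i in range(n):
    A[i, : i + 1] = aif[i::-1]
tissue = A @ residue + rng.normal(0.0, 0.01, n)

def tikhonov(A, b, lam):
    """Solve min ||A x - b||^2 + lam^2 ||x||^2 via the normal equations."""
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(A.shape[1]), A.T @ b)

# L-curve sweep: residual norm vs. solution norm over a range of lambdas; the
# "corner" (region of maximum curvature) is the usual choice of parameter.
for lam in np.logspace(-3, 0, 7):
    x = tikhonov(A, tissue, lam)
    print(f"lam={lam:.3g}  residual={np.linalg.norm(A @ x - tissue):.4f}  "
          f"solution_norm={np.linalg.norm(x):.4f}")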

363

Mapping and spatiotemporal analysis tool for hydrological data: Spellmap  

Technology Transfer Automated Retrieval System (TEKTRAN)

Lack of data management and analysis tools is one of the major limitations to effectively evaluate and use large datasets of high-resolution atmospheric, surface, and subsurface observations. High spatial and temporal resolution datasets better represent the spatiotemporal variability of hydrologica...

364

Usability tool for analysis of web designs using mouse tracks  

Microsoft Academic Search

This paper presents MouseTrack as a web logging system that tracks mouse movements on websites. The system includes a visualization tool that displays the mouse cursor path followed by website visitors. It helps web site administrators run usability tests and analyze the collected data. Practitioners can track any existing webpage by simply entering its URL. This paper includes a design

Ernesto Arroyo; Ted Selker; Willy Wei

2006-01-01

365

Risø-R-1359(EN) Fractography analysis of tool samples  

E-print Network

The three steels have the same nominal composition of alloying elements, and the failure in both types of samples is examined. The report also covers high-speed tool steels, heat treatment, the effect of alloying elements, powder metallurgy, finite element modelling (FEM) (see for example [3]), and material selection.

366

A new perspective and analysis for regenerative machine tool chatter  

Microsoft Academic Search

The paper contains a practical perspective on regenerative machine tool chatter. Chatter is a well known phenomenon, occurrence of which is undesired in manufacturing. Aggressive machining conditions, in the sense of removing more metal rapidly, usually cause chatter. In most cases, these conditions can be determined a priori to the operation. A chatter stability study and its reasoning based on

Nejat Olgac; Martin Hosek

1998-01-01

367

A requirements analysis for videogame design support tools  

Microsoft Academic Search

Designing videogames involves weaving together systems of rules, called game mechanics, which support and structure compelling player experiences. Thus a significant portion of game design involves reasoning about the effects of different potential game mechanics on player experience. Unlike some design fields, such as architecture and mechanical design, that have CAD tools to support designers

Mark J. Nelson; Michael Mateas

2009-01-01

368

Regional energy planning through SWOT analysis and strategic planning tools  

Microsoft Academic Search

Strategic planning processes, which are commonly used as a tool for region development and territorial structuring, can be harnessed by politicians and public administrations, at the local level, to redesign the regional energy system and encourage renewable energy development and environmental preservation. In this sense, the province of Jaén, a southern Spanish region whose economy is mainly based on olive

J. Terrados; G. Almonacid; L. Hontoria

2007-01-01

369

Protected marine reserves as fisheries management tools: a bioeconomic analysis  

Microsoft Academic Search

This paper develops a dynamic computational bioeconomic model with the objective of assessing protected marine reserves as fisheries management tools. Data on the North East Atlantic cod stock are used to determine the bioeconomically optimal size of a marine reserve for the Barents Sea cod fishery, as a function of the net transfer rate between the protected and unprotected areas

Ussif Rashid Sumaila

1998-01-01

370

PDAs as Lifelong Learning Tools: An Activity Theory Based Analysis  

ERIC Educational Resources Information Center

This paper describes the use of an activity theory (AT) framework to analyze the ways that distance part-time learners and mobile workers adapted and appropriated mobile devices for their activities and in turn how their use of these new tools changed the ways that they carried out their learning or their work. It is argued that there are two key

Waycott, Jenny; Jones, Ann; Scanlon, Eileen

2005-01-01

371

INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION  

EPA Science Inventory

Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

372

Tools for Education Policy Analysis [with CD-ROM].  

ERIC Educational Resources Information Center

This manual contains a set of tools to assist policymakers in analyzing and revamping educational policy. Its main focus is on some economic and financial aspects of education and selected features in the arrangements for service delivery. Originally offered as a series of training workshops for World Bank staff to work with clients in the

Mingat, Alain; Tan, Jee-Peng

373

FUTURE POWER GRID INITIATIVE Market Design Analysis Tool  

E-print Network

The Future Power Grid Initiative (FPGI) will deliver next-generation concepts and tools for grid operation and planning and ensure a more secure, efficient and reliable future grid. Building on the Electricity Infrastructure Operations Center (EIOC), the Pacific Northwest National Laboratory's (PNNL) national electric

374

An advanced image analysis tool for the quantification and characterization of breast cancer in microscopy images.  

PubMed

The paper presents an advanced image analysis tool for the accurate and fast characterization and quantification of cancer and apoptotic cells in microscopy images. The proposed tool utilizes adaptive thresholding and a Support Vector Machines classifier. The segmentation results are enhanced through a Majority Voting and a Watershed technique, while an object labeling algorithm has been developed for the fast and accurate validation of the recognized cells. Expert pathologists evaluated the tool and the reported results are satisfying and reproducible. PMID:25681102

Goudas, Theodosios; Maglogiannis, Ilias

2015-03-01
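
A minimal scikit-image sketch of the segmentation stages named in the abstract above (adaptive thresholding followed by watershed splitting of touching objects) is given below; the SVM classification step is only indicated through per-object features. The synthetic image and every parameter value are illustrative assumptions, not the published tool.

# Adaptive thresholding plus watershed splitting on a synthetic image.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_local
from skimage.feature import peak_local_max
from skimage.segmentation import watershed
from skimage.measure import regionprops

# Synthetic image: three bright blobs (two of them touching) on a dark background.
rng = np.random.default_rng(2)
yy, xx = np.mgrid[:128, :128]
image = np.zeros((128, 128))
for cy, cx in [(40, 40), (50, 52), (90, 80)]:
    image += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 6.0 ** 2))
image += rng.normal(0.0, 0.02, image.shape)

# Adaptive (local) thresholding, then watershed splitting of touching objects.
binary = image > threshold_local(image, block_size=35, offset=-0.05)
distance = ndi.distance_transform_edt(binary)
coords = peak_local_max(distance, min_distance=5, labels=binary)
markers = np.zeros_like(distance, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
labels = watershed(-distance, markers, mask=binary)

# Per-object features of the kind that could feed a classifier (e.g. an sklearn
# SVC trained on expert-labelled examples) to separate cell classes.
for region in regionprops(labels):
    print(region.label, region.area, round(region.eccentricity, 3))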

375

Applications of custom developed object based analysis tool: Precipitation in Pacific, Tropical cyclones precipitation, Hail areas  

NASA Astrophysics Data System (ADS)

In the last few years an object-based analysis software tool was developed at the University of Ljubljana in collaboration with the National Center for Atmospheric Research (NCAR). The tool was originally based on ideas of the Method for Object-Based Diagnostic Evaluation (MODE) developed by NCAR but has since evolved and changed considerably and is now available as a separate free software package. The software is called the Forward in Time object analysis tool (FiT tool). The software was used to analyze numerous datasets, mainly focusing on precipitation. A climatology of satellite and model precipitation in the low- and mid-latitude Pacific Ocean was performed by identifying and tracking individual precipitation systems and estimating their lifespan, movement and size. A global climatology of tropical cyclone precipitation was performed using satellite data, and tracking and analysis of areas with hail in Slovenia was performed using radar data. The tool will be presented along with some results of applications.

Skok, Gregor; Rakovec, Jože; Strajnar, Benedikt; Bacmeister, Julio; Tribbia, Joe

2014-05-01

376

CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS  

EPA Science Inventory

Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

377

Forensic Analysis of Windows Hosts Using UNIX-based Tools  

SciTech Connect

Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

Cory Altheide

2004-07-19

378

MultiAlign: a multiple LC-MS analysis tool for targeted omics analysis  

SciTech Connect

MultiAlign is a free software tool that aligns multiple liquid chromatography-mass spectrometry datasets to one another by clustering mass and LC elution features across datasets. Applicable to both label-free proteomics and metabolomics comparative analyses, the software can be operated in several modes. Clustered features can be matched to a reference database to identify analytes, used to generate abundance profiles, linked to tandem mass spectra based on parent precursor masses, and culled for targeted liquid chromatography-tandem mass spectrometric analysis. MultiAlign is also capable of tandem mass spectral clustering to describe proteome structure and find similarity in subsequent sample runs.

Lamarche, Brian L.; Crowell, Kevin L.; Jaitly, Navdeep; Petyuk, Vladislav A.; Shah, Anuj R.; Polpitiya, Ashoka D.; Sandoval, John D.; Kiebel, Gary R.; Monroe, Matthew E.; Callister, Stephen J.; Metz, Thomas O.; Anderson, Gordon A.; Smith, Richard D.

2013-02-12

379

The Mission Planning Lab: A Visualization and Analysis Tool  

NASA Technical Reports Server (NTRS)

Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called the Mission Planning Lab (MPL).

Daugherty, Sarah C.; Cervantes, Benjamin W.

2009-01-01

380

Glow Discharge as a Tool for Surface and Interface Analysis  

Microsoft Academic Search

This review article focuses on the analytical capabilities of glow discharge optical emission spectrometry (GD-OES) and mass spectrometry (GD-MS) to perform compositional depth profiling (GD-CDP). The properties of the Grimm-type glow discharge as well as basic processes of sputtering are described, and their influence on the GD as a surface and interface analytical tool is discussed. A series of examples

Thomas Nelis; Jozsef Pallosi

2006-01-01

381

An educational tool for load flow analysis with electromechanical interface  

SciTech Connect

The paper describes a new combination of hardware and software being developed at Georgia Tech as a learning tool for power system analysis; the hardware provides a number of adjustable parameters (potentiometers). The manual parameter adjustment is acquired by a personal computer, which runs the software and complements the parameters from a database. A variety of simple tests and small projects may be derived from such a hardware/software configuration. Some of them are described in the text.

McCall, S.; Begovic, M.; Meliopoulos, A.P.S. [Georgia Inst. of Tech., Atlanta, GA (United States). School of Electrical and Computer Engineering]

1994-12-31

382

Stress Analysis in Machining with the Use of Sapphire Tools  

Microsoft Academic Search

Stress birefringence in sapphire tools has been used to determine the stress boundary conditions in machining. Steel and brass specimens were machined orthogonally at speeds of up to 75 m min⁻¹ at a maximum feed rate of 0.381 mm per revolution to study the effect of speed and feed rate on stress distributions. The shear-difference method was used to calculate

A. Bagchi; P. K. Wright

1987-01-01

383

Analysis of Mutation Testing Tools Johnathan Snyder, Department of Computer Science, University of Alabama  

E-print Network

Conclusion: In my research, I studied three mutation testing tools for Java: MuJava, Jumble, and PIT. All of them use byte-level mutation, which speeds up the time it takes to generate the mutants and run

Gray, Jeffrey G.

384

Data stream management system: Tools for live stream handling & their application on trivial network analysis problems  

Microsoft Academic Search

This paper presents handling and analysis of network packets using data stream management system tool TelegraphCQ. The number of tools for analyzing data traffic in the Internet is continuously increasing, because there is an increasing need in many different application domains. The high volume data that flows within a network requires one to rethink the fundamental architecture of a DBMS

Nadeem Akhtar; Mohammed A Qadeer; Faraz Khan; Faridul Haque

2008-01-01

385

LC-TOOL-2003-015 Java Physics Generator and Analysis Modules  

E-print Network

Michael T. Ronan, LBNL, Berkeley, CA. Physics event generators are used in defining a common generator interface package. Portable libraries provide high-level OO study tools. Complete physics generation, parallel detector simulations

386

A method of approximate tool wear analysis in cold roll forming  

Microsoft Academic Search

A method of approximate tool wear evaluation is proposed for cold roll forming (CRF). The method uses simple assumptions for approximate tool wear analysis and allows estimation of roll profile change caused by wear. Boundary conditions are obtained from the model based on a new relaxation method. The strip material obeys the rigid-perfect plastic model. A roll-strip sliding velocity

Alexander S. Galakhar; Paul A. Meehan; William J. T. Daniel; Shi Chao Ding

387

VMCAnalytic: Developing a Collaborative Video Analysis Tool for Education Faculty and Practicing Educators  

Microsoft Academic Search

This paper describes the genesis, design and prototype development of the VMCAnalytic, a repository-based video annotation and analysis tool for education. The VMCAnalytic is a flexible, extensible analytic tool that is unique in its integration into an open source repository architecture to transform a resource discovery environment into an interactive collaborative where practicing teachers and faculty researchers can analyze and

Grace Agnew; Chad M. Mills; Carolyn A. Maher

2010-01-01

388

Filtering capabilities and convergence of the Van-Cittert deconvolution technique  

Microsoft Academic Search

The authors demonstrate the application of the Van-Cittert technique in iterative frequency-domain deconvolution. The technique is shown to have built-in filtering capabilities which can be used successfully to produce optimum deconvolution estimates. The Bennia-Riad optimization criterion for iterative deconvolution is jointly used with the Van-Cittert technique to optimize the number of iterations required to achieve acceptable (optimum) deconvolution results. An

Abdelhak Bennia; Sedki M. Riad

1992-01-01
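
The basic Van-Cittert iteration referred to above is compact enough to sketch directly: f_{k+1} = f_k + beta * (g - h * f_k), where g is the measured waveform and h the known impulse response. The Python sketch below uses a fixed iteration count in place of the Bennia-Riad stopping criterion; the signals, beta, and iteration count are assumptions.

# Van-Cittert iterative deconvolution of a smeared rectangular pulse.
import numpy as np

rng = np.random.default_rng(3)
n = 256
t = np.arange(n)
truth = np.zeros(n)
truth[80:90] = 1.0                                   # rectangular "true" input
h = np.exp(-t / 10.0)
h /= h.sum()                                         # smearing impulse response
g = np.convolve(truth, h)[:n] + rng.normal(0.0, 0.005, n)

def van_cittert(g, h, beta=1.0, iterations=50):
    """Iterate f <- f + beta * (g - h * f), starting from the measurement g."""
    f = g.copy()
    for _ in range(iterations):
        residual = g - np.convolve(f, h)[: len(g)]
        f = f + beta * residual                      # relaxation update
    return f

estimate = van_cittert(g, h)
print("rms error of raw measurement :", np.sqrt(np.mean((g - truth) ** 2)).round(4))
print("rms error after deconvolution:", np.sqrt(np.mean((estimate - truth) ** 2)).round(4))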

389

Accuracy of peak deconvolution algorithms within chromatographic integrators  

SciTech Connect

The soundness of present-day algorithms to deconvolve overlapping skewed peaks was investigated. From simulated studies based on the exponentially modified Gaussian model (EMG), chromatographic peak area inaccuracies for unresolved peaks are presented for the two deconvolution methods, the tangent skim and the perpendicular drop method. These inherent inaccuracies, in many cases exceeding 50%, are much greater than those calculated from ideal Gaussian profiles. Multiple linear regression (MLR) was used to build models that predict the relative error for either peak deconvolution method. MLR also provided a means for determining influential independent variables, defining the required chromatographic relationships needed for prediction. Once forecasted errors for both methods are calculated, selection of either peak deconvolution method can be made by minimum errors. These selection boundaries are contrasted to method selection criteria of present data systems algorithms.

Papas, A.N. (Food and Drug Administration, Winchester, MA (USA)); Tougas, T.P. (Univ. of Lowell, MA (USA))

1990-02-01
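
The two rules compared in the study above are easy to illustrate numerically: the perpendicular drop splits the fused peak at the valley, while the tangent skim integrates only what lies above a line skimmed from the valley to the end of the small rider peak. The Python sketch below uses Gaussian peaks for brevity, whereas the study is based on EMG profiles; peak positions, widths, and heights are arbitrary.

# Perpendicular drop vs. tangent skim area estimates for a fused peak pair.
import numpy as np

t = np.linspace(0.0, 20.0, 4000)
dt = t[1] - t[0]
main = 10.0 * np.exp(-0.5 * ((t - 8.0) / 0.8) ** 2)     # large parent peak
rider = 1.0 * np.exp(-0.5 * ((t - 10.5) / 0.8) ** 2)    # small rider peak
signal = main + rider

# Valley between the two apexes.
i1, i2 = int(np.argmax(main)), int(np.argmax(rider))
valley = i1 + int(np.argmin(signal[i1:i2]))

# Perpendicular drop: everything after the valley is assigned to the rider peak.
area_drop = signal[valley:].sum() * dt

# Tangent skim: subtract a straight baseline drawn from the valley to the end of
# the trace and keep only what lies above it.
baseline = np.interp(t[valley:], [t[valley], t[-1]], [signal[valley], signal[-1]])
area_skim = np.clip(signal[valley:] - baseline, 0.0, None).sum() * dt

true_area = rider.sum() * dt
print(f"true rider area    : {true_area:.3f}")
print(f"perpendicular drop : {area_drop:.3f}")
print(f"tangent skim       : {area_skim:.3f}")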

390

Deconvolution of spectral line profiles: solution of the inversion problem  

NASA Astrophysics Data System (ADS)

We present a method for the deconvolution of spectral line profiles - the subtraction of the apparatus function. This inversion problem requires solution of the Fredholm integral equation of the first kind. For this purpose we suggest the use of B-splines. The efficiency of the approach is demonstrated using test examples as well as the deconvolution of the Hβ spectral line profile, measured by means of the Fabry-Perot interferometer. The Hβ line was emitted by a low temperature plasma in an RF discharge burning in water vapour at reduced pressure. This deconvolution method is compared with the standard least squares method, where the initial profile is described by the Voigt function.

Brablec, A.; Trunec, D.; Šťastný, F.

1999-08-01

391

Spectral semi-blind deconvolution with least trimmed squares regularization  

NASA Astrophysics Data System (ADS)

A spectral semi-blind deconvolution with least trimmed squares regularization (SBD-LTS) is proposed to improve spectral resolution. Firstly, the regularization term for the spectrum data is modeled in the form of least trimmed squares, which helps to preserve the peak details better. Then the regularization term for the PSF is modeled as the L1-norm to enhance the stability of the kernel estimation. The cost function of SBD-LTS is formulated and the numerical solution processes are deduced for deconvolving the spectra and estimating the PSF. The deconvolution results of simulated infrared spectra demonstrate that the proposed SBD-LTS can recover the spectrum effectively and estimate the PSF accurately, and has the merit of preserving the details, especially in the case of noise. The deconvolution result of an experimental Raman spectrum indicates that SBD-LTS can resolve the spectrum and improve the resolution effectively.

Deng, Lizhen; Zhu, Hu

2014-11-01

392

The deconvolution of differential scanning calorimetry unfolding transitions.  

PubMed

This paper is a review of a process for deconvolution of unfolding thermal transitions measured by differential scanning calorimetry. The mathematical background is presented along with illustrations of how the unfolding data is processed to resolve the number of sequential transitions needed to describe an unfolding mechanism and to determine thermodynamic properties of the intermediate states. Examples of data obtained for a simple two-state unfolding of a G-quadruplex DNA structure derived from the basic human telomere sequence, (TTAGGG)4TT are used to present some of the basic issues in treating the DSC data. A more complex unfolding mechanism is also presented that requires deconvolution of a multistate transition, the unfolding of a related human telomere structure, (TTAGGG)12 TT. The intent of the discussion is to show the steps in deconvolution, and to present the data at each step to help clarify how the information is derived from the various mathematical manipulations. PMID:25498005

Spink, Charles H

2015-04-01
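
The simplest case discussed above, a two-state transition, follows the standard van't Hoff model: K(T) = exp[-(ΔH/R)(1/T - 1/Tm)], fraction unfolded f = K/(1+K), and excess heat capacity Cp_ex = ΔH·df/dT = ΔH²K/(RT²(1+K)²). The Python sketch below evaluates these expressions for illustrative parameter values, not for the telomere data analyzed in the paper.

# Two-state van't Hoff model for a DSC unfolding transition (illustrative values).
import numpy as np

R = 8.314               # gas constant, J mol^-1 K^-1
dH = 200e3              # van't Hoff enthalpy, J mol^-1 (assumed)
Tm = 340.0              # transition midpoint, K (assumed)

T = np.linspace(300.0, 380.0, 801)
K = np.exp(-(dH / R) * (1.0 / T - 1.0 / Tm))          # unfolding equilibrium constant
f_unfolded = K / (1.0 + K)
cp_excess = dH**2 * K / (R * T**2 * (1.0 + K) ** 2)   # excess heat capacity, J mol^-1 K^-1

# For a true two-state transition the area under Cp_ex recovers the enthalpy;
# multi-state unfolding is deconvolved as a sum of such components.
area = np.sum(cp_excess[:-1] * np.diff(T))
print(f"fraction unfolded at Tm: {np.interp(Tm, T, f_unfolded):.2f}")
print(f"peak at T = {T[np.argmax(cp_excess)]:.1f} K, area = {area / 1e3:.0f} kJ/mol")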

393

A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.  

PubMed

Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab was developed. The established tool for selective inline quantification was successfully applied for peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. PMID:24522836

Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

2014-07-01
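
The calibration idea behind the methodology above (mapping detector spectra to individual protein concentrations with partial least squares regression) can be sketched with scikit-learn as follows. The pure-component spectra, concentration ranges, and noise level are simulated assumptions; this is not the published Matlab tool.

# PLS calibration mapping simulated absorbance spectra to protein concentrations.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
wavelengths = np.linspace(240, 320, 81)

def component(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Three "pure component" spectra standing in for lysozyme, ribonuclease A, cytochrome c.
pure = np.stack([component(255, 12), component(278, 10), component(300, 15)])

# Calibration set: random concentration mixtures plus measurement noise.
C_train = rng.uniform(0.0, 2.0, size=(60, 3))            # g/L, assumed range
X_train = C_train @ pure + rng.normal(0, 0.002, (60, wavelengths.size))

pls = PLSRegression(n_components=3)
pls.fit(X_train, C_train)

# "Inline" prediction for a new mixed spectrum, e.g. one detector scan during elution.
c_true = np.array([[0.8, 0.3, 1.1]])
x_new = c_true @ pure + rng.normal(0, 0.002, (1, wavelengths.size))
print("true :", c_true.ravel())
print("pred :", pls.predict(x_new).ravel().round(3))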

394

Demonstration of the Rapid Assessment Tool: Analysis of Water Supply Conditions in the Harlingen Irrigation District  

E-print Network

RAT (Rapid Assessment Tool), currently under development, is a combination of surveys, data collection, mapping and limited direct measurement designed to provide a quick and cost-effective analysis of the conditions of the water distribution...

Leigh, E.; Fipps, G.

395

An integrated traverse planner and analysis tool for future lunar surface exploration  

E-print Network

This thesis discusses the Surface Exploration Traverse Analysis and Navigation Tool (SEXTANT), a system designed to help maximize productivity, scientific return, and safety on future lunar and planetary explorations,. The ...

Johnson, Aaron William

2010-01-01

396

Matrix: A statistical method and software tool for linguistic analysis through corpus comparison  

Microsoft Academic Search

A thesis submitted to Lancaster University for the degree of PhD in Computer Science, by Paul Edward Rayson, BSc, September 2002.

P. Rayson

2003-01-01

397

Disaster SitRep -A Vertical Search Engine and Information Analysis Tool in Disaster Management Domain  

E-print Network

...information delivering platform, the process of collecting, integrating, and analyzing disaster related information from diverse channels becomes more difficult and challenging. Further, information from multiple

Chen, Shu-Ching

398

Image analysis as a tool to facilitate selective breeding of quality traits in rainbow trout  

Microsoft Academic Search

A quantitative genetic analysis was performed to assess the suitability of automated image analysis of cutlets as a selection tool to genetically improve flesh composition and colour in large rainbow trout. Fish were reared on two diets with different lipid and protein content to assess the robustness of the image analysis method across different nutritional environments, and the strength of

A. Kause; L. H. Stien; K. Rungruangsak-Torrissen; O. Ritola; K. Ruohonen; A. Kiessling

2008-01-01

399

METABONOMICS AS A CLINICAL TOOL OF ANALYSIS, LC-MS APPROACH  

Microsoft Academic Search

Metabolic differences between test and control groups (i.e., metabonomics) are routinely determined by using multivariate analysis for data obtained commonly from NMR, GC-MS and LC-MS. Multivariate analysis (e.g., principal component analysis, PCA) is commonly used to extract potential metabolites responsible for clinical observations. Metabonomics applied to the clinical field is challenging because physiological variabilities such as gender, age, race, etc. might

Muhammed Alzweiri; David Watson; John Parkinson

2012-01-01

400

Development of a task analysis tool to facilitate user interface design  

NASA Technical Reports Server (NTRS)

A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

Scholtz, Jean C.

1992-01-01

401

Powerful tool for design analysis of linear control systems  

SciTech Connect

The methods for designing linear controls for electronic or mechanical systems have been understood and put to practice. What has not been readily available to engineers, however, is a practical, quick and inexpensive method for analyzing these linear control (feedback) systems once they have been designed into the electronic or mechanical hardware. Now, the PET, manufactured by Commodore Business Machines (CBM), operating with several peripherals via the IEEE 488 Bus, brings to the engineer for about $4000 a complete set of office tools for analyzing these system designs.

Maddux, Jr, A S

1982-05-10

402

Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools  

NASA Technical Reports Server (NTRS)

A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

Orr, Stanley A.; Narducci, Robert P.

2009-01-01

403

High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis  

PubMed Central

The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

Simonyan, Vahan; Mazumder, Raja

2014-01-01

404

Analysis, Diagnosis, and Short-Range Forecast Tools  

NSDL National Science Digital Library

This lesson is divided into three sections. The first section discusses the importance of analysis and diagnosis in evaluating NWP in the forecast process. In section two, we discuss a methodology for dealing with discrepancies between both the official forecast and NWP compared to analysis and diagnosis. The third section shows a representative example of the methodology.

COMET

2010-09-15

405

Javelin Diagrams: A Graphical Tool for Probabilistic Sensitivity Analysis  

E-print Network

of a probabilistic sensitivity analysis. Here we introduce an analog of tornado diagrams for probabilistic sensitivity analysis, which we call javelin diagrams. Javelin diagrams are graphical augmentations of tornado diagrams. For instance, Clemen (1996, Ch. 5) illustrates how tornado diagrams (e.g., Howard 1988) can be used to determine

Hazen, Gordon

406

Core Curriculum Analysis: A Tool for Educational Design  

ERIC Educational Resources Information Center

This paper examines the outcome of a dimensional core curriculum analysis. The analysis process was an integral part of an educational development project, which aimed to compact and clarify the curricula of the degree programmes. The task was also in line with the harmonising of the degree structures as part of the Bologna process within higher

Levander, Lena M.; Mikkola, Minna

2009-01-01

407

Pathway-based analysis tools for complex diseases: a review.  

PubMed

Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which have rapidly accumulated in biomedical fields. This article is a comprehensive review of pathway-based analysis methods: powerful methods with the potential to uncover the biological depths of complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced, and then a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases. PMID:25462153

Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

2014-10-01

408

Receiver function deconvolution using transdimensional hierarchical Bayesian inference  

NASA Astrophysics Data System (ADS)

Teleseismic waves can convert from shear to compressional (Sp) or compressional to shear (Ps) across impedance contrasts in the subsurface. Deconvolving the parent waveforms (P for Ps or S for Sp) from the daughter waveforms (S for Ps or P for Sp) generates receiver functions which can be used to analyse velocity structure beneath the receiver. Though a variety of deconvolution techniques have been developed, they are all adversely affected by background and signal-generated noise. In order to take into account the unknown noise characteristics, we propose a method based on transdimensional hierarchical Bayesian inference in which both the noise magnitude and noise spectral character are parameters in calculating the likelihood probability distribution. We use a reversible-jump implementation of a Markov chain Monte Carlo algorithm to find an ensemble of receiver functions whose relative fits to the data have been calculated while simultaneously inferring the values of the noise parameters. Our noise parametrization is determined from pre-event noise so that it approximates observed noise characteristics. We test the algorithm on synthetic waveforms contaminated with noise generated from a covariance matrix obtained from observed noise. We show that the method retrieves easily interpretable receiver functions even in the presence of high noise levels. We also show that we can obtain useful estimates of noise amplitude and frequency content. Analysis of the ensemble solutions produced by our method can be used to quantify the uncertainties associated with individual receiver functions as well as with individual features within them, providing an objective way for deciding which features warrant geological interpretation. This method should make possible more robust inferences on subsurface structure using receiver function analysis, especially in areas of poor data coverage or under noisy station conditions.

Kolb, J. M.; Lekić, V.

2014-06-01
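
For context, the classical alternative that Bayesian schemes such as the one above aim to improve on is a simple frequency-domain water-level deconvolution of the daughter waveform by the parent. The Python sketch below applies it to synthetic waveforms; the pulse shapes, arrival times, noise level, and water level are assumptions, and it is not the authors' transdimensional algorithm.

# Water-level frequency-domain deconvolution of a synthetic daughter waveform.
import numpy as np

rng = np.random.default_rng(5)
n, dt = 1024, 0.05                                # samples, sampling interval (s)
t = np.arange(n) * dt

parent = np.exp(-((t - 5.0) / 0.3) ** 2)          # synthetic parent (P) pulse
rf_true = np.zeros(n)
rf_true[0], rf_true[int(4.0 / dt)] = 1.0, 0.4     # direct arrival + Ps conversion
daughter = np.convolve(parent, rf_true)[:n] + rng.normal(0.0, 0.01, n)

def water_level_deconv(daughter, parent, level=1e-2):
    """Spectral division stabilised by a water level on the denominator."""
    D = np.fft.rfft(daughter)
    P = np.fft.rfft(parent)
    denom = (P * np.conj(P)).real
    denom = np.maximum(denom, level * denom.max())
    return np.fft.irfft(D * np.conj(P) / denom, n=len(daughter))

rf_est = water_level_deconv(daughter, parent)
late = rf_est.copy()
late[: int(1.0 / dt)] = 0.0                       # ignore the direct arrival
print("estimated Ps delay:", round(t[np.argmax(late)], 2), "s")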

409

Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)  

NASA Technical Reports Server (NTRS)

The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

O'Neil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

2005-01-01

410

Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop  

SciTech Connect

Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the necessary understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as pipelines of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible set of tools with a multi-user interface. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

Miller, Barton

2014-06-30

411

Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan  

SciTech Connect

Application performance is hindered by a variety of factors, but most notably by the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key if we are trying to optimize performance. Understanding application performance properties is facilitated with various performance profiling tools. The scope of profiling tools varies in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex and large scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in tool for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with an external tool such as a cache simulator, Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics analysis due to the emphasis on the tool's output, which consists of trace files. Moreover, the tool is dependent on the run-time system to provide the necessary infrastructure to expose low-level system detail; therefore, we also discuss any theoretical benefits that can be achieved if such modules were present.

Janjusic, Tommy [ORNL; Kartsaklis, Christos [ORNL; Wang, Dali [ORNL

2013-01-01

412

Visualization tools for vorticity transport analysis in incompressible flow.  

PubMed

Vortices are undesirable in many applications while indispensable in others. It is therefore of common interest to understand their mechanisms of creation. This paper aims at analyzing the transport of vorticity inside incompressible flow. The analysis is based on the vorticity equation and is performed along pathlines which are typically started in the upstream direction from vortex regions. Different methods for the quantitative and explorative analysis of vorticity transport are presented and applied to CFD simulations of water turbines. Simulation quality is accounted for by including the errors of meshing and convergence in the analysis and visualization. The obtained results are discussed and interpretations with respect to engineering questions are given. PMID:17080821
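For reference, the vorticity transport equation for incompressible flow that such an analysis builds on takes the standard form below (written in general notation; the paper's own notation may differ):

```latex
\frac{D\boldsymbol{\omega}}{Dt}
  \;=\;
  \frac{\partial\boldsymbol{\omega}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\,\boldsymbol{\omega}
  \;=\;
  (\boldsymbol{\omega}\cdot\nabla)\,\mathbf{u}
  + \nu\,\nabla^{2}\boldsymbol{\omega},
\qquad
\boldsymbol{\omega} = \nabla\times\mathbf{u}
```

The right-hand side splits the transport into vortex stretching/tilting and viscous diffusion; evaluating these terms along pathlines is what vorticity transport analysis refers to here.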

Sadlo, Filip; Peikert, Ronald; Sick, Mirjam

2006-01-01

413

Application of Surface Chemical Analysis Tools for Characterization of Nanoparticles  

PubMed Central

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES); X-ray photoelectron spectroscopy (XPS); time of flight secondary ion mass spectrometry (TOF-SIMS); low energy ion scattering (LEIS); and scanning probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic force microscopy (AFM), are briefly summarized. Examples describing the characterization of engineered nanoparticles are provided. Specific analysis considerations and issues associated with using surface analysis methods for the characterization of nanoparticles are discussed and summarized, along with the impact that shape instability, environmentally induced changes, deliberate and accidental coating, etc., have on nanoparticle properties. PMID:20052578

Baer, DR; Gaspar, DJ; Nachimuthu, P; Techane, SD; Castner, DG

2010-01-01

414

TOOLS FOR COMPARATIVE ANALYSIS OF ALTERNATIVES: COMPETING OR COMPLEMENTARY PERSPECTIVES?  

EPA Science Inventory

A third generation of environmental policymaking and risk management will increasingly impose environmental measures, which may give rise to analyzing countervailing risks. Therefore, a comprehensive analysis of these risks associated with the decision alternatives at hand will e...

415

PRAAD: Preprocessing and Analysis Tool for Arabic Ancient Documents  

Microsoft Academic Search

This paper presents the new system PRAAD for preprocessing and analysis of Arabic historical documents. It is composed of two important parts: pre-processing and analysis of ancient documents. After digitization, the color or greyscale ancient document images are distorted by the presence of strong background artefacts such as scan optical blur and noise, show-through and bleed-through effects, and spots. In

Wafa Boussellaa; Abderrazak Zahour; Bruno Taconet; Adel Alimi; Abdellatif Benabdelhafid

2007-01-01

416

CANDLE: A Tool for Efficient Analysis of CAN Control Systems  

Microsoft Academic Search

Construction and analysis of formal system models is increasingly accepted as a valuable technique in the rigorous development of real-time control systems. The effectiveness of modelling with timed automata and analysis via model-checking has been demonstrated often in practice. An obstacle to greater industrial uptake of this approach is the low-level character of the language of timed automata which

David Kendall

417

OPE The Campus Safety and Security Data Analysis Cutting Tool  

NSDL National Science Digital Library

Provided by the Office of Postsecondary Education (OPE) of the US Department of Education, this searchable database allows users to browse records of reported criminal offenses at over 6000 colleges and universities. The database contains records for 1997-99 and may be browsed by region, state, city, type of institution, instructional program, and number of students. Users can also simply type in the name of a specific institution. Initial entries include basic contact information and links to statistics for criminal offenses, hate offenses, and arrests. Each entry page also links to the relevant page at the National Center for Education Statistics IPEDS COOL (College Opportunities On-Line) website (reviewed in the March 31, 2000 Scout Report), a tool for comparison shopping between different colleges and universities.

418

Image restoration for confocal microscopy: improving the limits of deconvolution, with application to the visualization of the mammalian hearing organ.  

PubMed Central

Deconvolution algorithms have proven very effective in conventional (wide-field) fluorescence microscopy. Their application to confocal microscopy is hampered, in biological experiments, by the presence of significant levels of noise in the images and by the lack of a precise knowledge of the point spread function (PSF) of the system. We investigate the application of wavelet-based processing tools to deal with these problems, in particular wavelet denoising methods, which turn out to be very effective in application to three-dimensional confocal images. When used in combination with more classical deconvolution algorithms, these methods provide a robust and efficient restoration scheme allowing one to deal with difficult imaging conditions. To make our approach applicable in practical situations, we measured the PSF of a Biorad-MRC1024 confocal microscope under a large set of imaging conditions, including in situ acquisitions. As a specific biological application, we present several examples of restorations of three-dimensional confocal images acquired inside an intact preparation of the hearing organ. We also provide a quantitative assessment of the gain in quality achieved by wavelet-aided restorations over classical deconvolution schemes, based on a set of numerical experiments that we performed with test images. PMID:11325744
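The combination described above, wavelet denoising followed by a classical deconvolution step, can be sketched generically. This is not the authors' code; the PSF, wavelet choice ('db4'), threshold, and iteration count are placeholder assumptions, and a 2-D example stands in for the 3-D confocal case.

```python
# Minimal sketch: wavelet soft-threshold denoising followed by Richardson-Lucy
# deconvolution with a known PSF. All numerical settings are illustrative.
import numpy as np
import pywt
from scipy.signal import fftconvolve


def wavelet_denoise(img, wavelet="db4", level=2, threshold=0.05):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedecn(img, wavelet, level=level)
    denoised = [coeffs[0]]  # keep approximation coefficients untouched
    for detail in coeffs[1:]:
        denoised.append({k: pywt.threshold(v, threshold, mode="soft")
                         for k, v in detail.items()})
    return pywt.waverecn(denoised, wavelet)[:img.shape[0], :img.shape[1]]


def richardson_lucy(img, psf, n_iter=30, eps=1e-12):
    """Classical Richardson-Lucy iteration with a known PSF."""
    estimate = np.full_like(img, img.mean())
    psf_flip = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = img / np.maximum(blurred, eps)
        estimate *= fftconvolve(ratio, psf_flip, mode="same")
    return estimate


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = np.zeros((64, 64)); truth[20:28, 30:38] = 1.0
    x, y = np.mgrid[-7:8, -7:8]
    psf = np.exp(-(x**2 + y**2) / 8.0); psf /= psf.sum()
    noisy = fftconvolve(truth, psf, mode="same") + 0.02 * rng.standard_normal((64, 64))
    restored = richardson_lucy(np.clip(wavelet_denoise(noisy), 0, None), psf)
    print(restored.shape, float(restored.max()))
```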

Boutet de Monvel, J; Le Calvez, S; Ulfendahl, M

2001-01-01

419

GCAFIT: A new tool for glow curve analysis in thermoluminescence nanodosimetry  

NASA Astrophysics Data System (ADS)

Glow curve analysis is widely used for dosimetric studies and applications. Therefore, a new computer program, GCAFIT, for deconvoluting first-order kinetics thermoluminescence (TL) glow curves and evaluating the activation energy for each glow peak in the glow curve has been developed using the MATLAB technical computing language. A non-linear function describing a single glow peak is fitted to experimental points using the Levenberg-Marquardt least-squares method. The developed GCAFIT software was used to analyze the glow curves of TLD-100, TLD-600, and TLD-700 nanodosimeters. The activation energy E obtained by the developed GCAFIT software was compared with that obtained by the peak shape methods of Grossweiner, Lushchik, and Halperin-Braner. The frequency factor S for each glow peak was also calculated. The standard deviations are discussed in each case and compared with those of other investigators. The results show that GCAFIT is capable of accurately analyzing first-order TL glow curves. Unlike other software programs, the developed GCAFIT software does not require activation energy as an input datum; in contrast, activation energy for each glow peak is given in the output data. The resolution of the experimental glow curve influences the results obtained by the GCAFIT software; as the resolution increases, the results obtained by the GCAFIT software become more accurate. The values of activation energy obtained by the developed GCAFIT software are in good agreement with those obtained by the peak shape methods. The agreement with the Halperin-Braner and Lushchik methods is better than with that of Grossweiner. High E and S values for peak 5 were observed; we believe that these values are not real because peak 5 may in fact consist of two or three unresolved peaks. We therefore treated E and S for peak 5 as an effective activation energy, Eeff, and an effective frequency factor, Seff. The temperature value for peak 5 was also treated as an effective quantity, Tm,eff.
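A first-order (Randall-Wilkins) glow peak can be fitted with a Levenberg-Marquardt least-squares routine in just a few lines; the sketch below is a generic illustration of that step, not the GCAFIT code, and the kinetic parameters, heating rate, and initial guesses are assumptions.

```python
# Hedged sketch: fit a single first-order (Randall-Wilkins) TL glow peak
# I(T) = n0 * s * exp(-E/kT) * exp(-(s/beta) * integral_T0^T exp(-E/kT') dT')
# with a Levenberg-Marquardt least-squares fit. Values are illustrative.
import numpy as np
from scipy.integrate import cumulative_trapezoid
from scipy.optimize import curve_fit

K_B = 8.617e-5   # Boltzmann constant, eV/K
BETA = 1.0       # assumed linear heating rate, K/s


def first_order_peak(T, n0, E, log10_s):
    """Randall-Wilkins first-order glow peak evaluated on a sorted T grid."""
    s = 10.0 ** log10_s
    boltz = np.exp(-E / (K_B * T))
    integral = cumulative_trapezoid(boltz, T, initial=0.0)
    return n0 * s * boltz * np.exp(-(s / BETA) * integral)


if __name__ == "__main__":
    T = np.linspace(300.0, 550.0, 500)
    rng = np.random.default_rng(1)
    true = first_order_peak(T, 1e5, 1.1, 12.0)
    data = true + 0.01 * true.max() * rng.standard_normal(T.size)

    popt, _ = curve_fit(first_order_peak, T, data,
                        p0=(8e4, 1.0, 11.5), method="lm", maxfev=20000)
    n0_fit, E_fit, log10_s_fit = popt
    print(f"E = {E_fit:.3f} eV, s = {10**log10_s_fit:.2e} 1/s")
```

A full glow curve would be modeled as a sum of several such peaks, with E and s reported per peak as in the abstract.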

Abd El-Hafez, A. I.; Yasin, M. N.; Sadek, A. M.

2011-05-01

420

Dynamite analysis by Raman spectroscopy as a unique analytical tool.  

PubMed

Apart from being powerful explosives, dynamites are complex samples that are intricate to analyze. These mixtures of compounds of diverse chemical nature present a challenge to the analyst, and as a result, several analytical techniques currently need to be applied for their analysis. Taking into account that at present there are almost no methods for dynamite analysis in the literature, it is crucial to develop analytical methods that can be applied to these samples. This study introduces the use of Raman spectroscopy to analyze dynamites. Two different dynamites made up of ethylene glycol dinitrate and ammonium nitrate, among other minor components, were analyzed by Raman spectroscopy. First, confocal Raman spectroscopy allowed the identification of different components easily distinguished by eye (ammonium nitrate, ethylene glycol dinitrate, and sawdust). Then, Raman mapping was used to show the distribution of the main components throughout the dynamite mass. Finally, several minor components were identified after flocculation (nitrocellulose) or precipitation (sawdust, CaCO3, and flour). The results obtained demonstrate the huge potential of this technique for the analysis of such a complex and challenging sample. PMID:23360417

López-López, María; Ferrando, Jose Luis; García-Ruiz, Carmen

2013-03-01

421

Development of a User Interface for a Regression Analysis Software Tool  

NASA Technical Reports Server (NTRS)

An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

Ulbrich, Norbert Manfred; Volden, Thomas R.

2010-01-01

422

DEBRISK, a Tool for Re-Entry Risk Analysis  

NASA Astrophysics Data System (ADS)

An act of the French parliament, adopted in 2008, requires satellite constructors to evaluate end-of-life operations in order to ensure risk mitigation for their satellites. One important element in this evaluation is the estimation of the mass and impact energy of the satellite debris after atmospheric re-entry. For this purpose, CNES has developed the tool DEBRISK, which allows the operator to simulate the re-entry phase and to study the demise altitudes or impact energy of the individual fragments of the original satellite. DEBRISK is based on the so-called object-based approach. Using this approach, a breakup altitude is assumed at which the satellite disintegrates due to the pressure loads. This altitude is typically around 78 km. After breakup, the satellite structure is modelled by a parent-child approach, where each child has its own birth criterion. In the simplest approach the child is born after demise of the parent object. This could be the case of an object A containing an object B which is in the interior of object A and thus not exposed to the atmosphere. Each object is defined by: - its shape, attitude and dimensions, - the material along with its physical properties, - the state and velocity vectors. The shape, attitude and dimensions define the aerodynamic drag of the object, which is input to the 3DOF trajectory modelling. The aerodynamic mass used in the equation of motion is defined as the sum of the object's own mass and the mass of the object's offspring. A newly born object inherits the state vector of the parent object. The shape, attitude and dimensions also define the heating rates experienced by the object. The heating rate is integrated in time up to the point where the melting temperature is reached. The mass of melted material is computed from the excess heat and the material properties. After each step the amount of ablated material is determined using the lumped mass approach and is peeled off from the object, updating the mass and shape of the ablated object. The mass in the lumped mass equation is termed 'thermal mass' and consists of the part of the object that is exposed to the flow (so excluding the mass of the contained children). A fair number of predefined materials are implemented, along with their thermal properties. In order to allow users to modify the properties or to add new materials, user-defined materials can be used. In that case properties such as specific heat, emissivity and conductivity can either be entered as constants or as temperature dependent by entering a table. Materials can also be derived from existing materials, which is useful in case only one or a few of the material properties change. The code has been developed in the Java language, benefiting from the object-oriented approach. Most methods used in DEBRISK to compute drag coefficients and heating rates are based on engineering methods developed in the 1950s and 1960s, which are used as well in similar tools (ORSAT, SESAME, ORSAT-J, ...). The paper presents a set of comparisons with literature cases of similar tools in order to verify the implementation of those methods in the developed software.
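The lumped-mass heating and ablation step described above can be sketched in a few lines. The heat-flux history, material constants, and time step below are illustrative assumptions, not DEBRISK values, and the model is deliberately reduced to the "heat to melt, then peel off mass" logic.

```python
# Hedged sketch of a lumped-mass heating/ablation step: integrate the object's
# temperature until melt, then convert excess heat into ablated mass.
import numpy as np

# assumed aluminium-like material properties (placeholders)
CP = 900.0        # specific heat, J/(kg K)
T_MELT = 870.0    # melting temperature, K
H_FUSION = 3.9e5  # heat of fusion, J/kg


def lumped_mass_step(mass, temp, q_dot, area, dt):
    """Advance one time step; returns (new_mass, new_temperature)."""
    heat_in = q_dot * area * dt              # absorbed heat this step, J
    if temp < T_MELT:
        new_temp = temp + heat_in / (mass * CP)
        if new_temp <= T_MELT:
            return mass, new_temp
        # only the excess beyond T_MELT goes into melting
        heat_in = (new_temp - T_MELT) * mass * CP
        temp = T_MELT
    melted = min(mass, heat_in / H_FUSION)   # ablated ("peeled off") mass, kg
    return mass - melted, temp


if __name__ == "__main__":
    mass, temp = 2.0, 300.0                  # kg, K
    for t in np.arange(0.0, 60.0, 0.1):      # 60 s of an assumed heat pulse
        q_dot = 4e5 * np.exp(-((t - 30.0) / 10.0) ** 2)  # W/m^2, placeholder
        mass, temp = lumped_mass_step(mass, temp, q_dot, area=0.05, dt=0.1)
        if mass <= 0.0:
            print(f"object fully demised at t = {t:.1f} s")
            break
    else:
        print(f"surviving mass: {mass:.3f} kg at T = {temp:.0f} K")
```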

Omaly, P.; Spel, M.

2012-01-01

423

Extension of a System Level Tool for Component Level Analysis  

NASA Technical Reports Server (NTRS)

This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.
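The turbulence closure mentioned above, Prandtl's mixing-length hypothesis, models the turbulent shear stress that is added to the one-dimensional momentum equation. In its standard form (general notation, not necessarily the paper's):

```latex
\tau_{\mathrm{turb}} \;=\; \rho\, l_m^{2}\,
\left|\frac{\partial u}{\partial y}\right|\frac{\partial u}{\partial y},
\qquad l_m = \kappa\, y
```

where l_m is the mixing length, kappa the von Karman constant, and y the distance from the wall; this extra shear term, together with the transverse velocity component, is what extends the one-dimensional network momentum equation toward multi-dimensional flow.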

Majumdar, Alok; Schallhorn, Paul

2002-01-01

424

Extension of a System Level Tool for Component Level Analysis  

NASA Technical Reports Server (NTRS)

This paper presents an extension of a numerical algorithm for network flow analysis code to perform multi-dimensional flow calculation. The one dimensional momentum equation in network flow analysis code has been extended to include momentum transport due to shear stress and transverse component of velocity. Both laminar and turbulent flows are considered. Turbulence is represented by Prandtl's mixing length hypothesis. Three classical examples (Poiseuille flow, Couette flow, and shear driven flow in a rectangular cavity) are presented as benchmark for the verification of the numerical scheme.

Majumdar, Alok; Schallhorn, Paul; McConnaughey, Paul K. (Technical Monitor)

2001-01-01

425

Application of surface chemical analysis tools for characterization of nanoparticles  

Microsoft Academic Search

The important role that surface chemical analysis methods can and should play in the characterization of nanoparticles is described. The types of information that can be obtained from analysis of nanoparticles using Auger electron spectroscopy (AES), X-ray photoelectron spectroscopy (XPS), time-of-flight secondary-ion mass spectrometry (TOF-SIMS), low-energy ion scattering (LEIS), and scanning-probe microscopy (SPM), including scanning tunneling microscopy (STM) and atomic

Donald R. Baer; Daniel J. Gaspar; Ponnusamy Nachimuthu; Sirnegeda D. Techane; David G. Castner

2010-01-01

426

GIPSY 3D: Analysis, visualization and VO-Tools  

NASA Astrophysics Data System (ADS)

The scientific goals of the AMIGA project are based on the analysis of a significant amount of spectroscopic 3D data. In order to perform this work we present an initiative to develop a new VO compliant package, including present core applications and tasks offered by the Groningen Image Processing System (GIPSY), and new ones based on use cases elaborated in collaboration with advanced users. One of the main goals is to provide local interoperability between GIPSY (visualization and data analysis) and other VO software. The connectivity with the Virtual Observatory environment will provide general access to 3D data VO archives and services, maximizing the potential for scientific discovery.

Ruíz, J. E.; Santander-Vela, J. D.; Espigares, V.; Verdes-Montenegro, L.; van der Hulst, J. M.

2009-07-01

427

The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis  

NASA Technical Reports Server (NTRS)

A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

Burks, Jason Eric; Sperow, Ken

2015-01-01

428

Deconvolution of astronomical images using SOR with adaptive relaxation.  

PubMed

We address the potential performance of the successive overrelaxation technique (SOR) in image deconvolution, focusing our attention on the restoration of astronomical images distorted by atmospheric turbulence. SOR is the classical Gauss-Seidel iteration, supplemented with relaxation. As indicated by earlier work, the convergence properties of SOR, and its ultimate performance in the deconvolution of blurred and noisy images, can be made competitive with other iterative techniques, including conjugate gradients, by a proper choice of the relaxation parameter. The question of how to choose the relaxation parameter, however, remained open, and in practical work one had to rely on experimentation. In this paper, using constructive (rather than exact) arguments, we suggest a simple strategy for choosing the relaxation parameter and for updating its value in consecutive iterations to optimize the performance of the SOR algorithm (and its positivity-constrained version, +SOR) at finite iteration counts. We suggest an extension of the algorithm to the notoriously difficult problem of "blind" deconvolution, where both the true object and the point-spread function have to be recovered from the blurred image. We report the results of numerical inversions with artificial and real data, where the algorithm is compared with techniques based on conjugate gradients. In all of our experiments +SOR provides the highest quality results. In addition +SOR is found to be able to detect moderately small changes in the true object between separate data frames: an important quality for multi-frame blind deconvolution where stationarity of the object is a necessity. PMID:21747506
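For readers unfamiliar with SOR, the plain iteration it builds on looks as follows for a linear system A x = b, as would arise for example from the damped normal equations of a deconvolution problem. The adaptive relaxation-parameter strategy proposed in the paper is not reproduced here; omega is held fixed, which is an assumption of this sketch.

```python
# Generic sketch of SOR (Gauss-Seidel + relaxation) for A x = b, e.g. the
# Tikhonov-damped normal equations H^T H x = H^T y of a deconvolution problem.
import numpy as np


def sor(A, b, omega=1.5, n_iter=200, nonneg=False):
    """Plain SOR sweeps; nonneg=True gives a crude positivity-constrained (+SOR) variant."""
    x = np.zeros_like(b)
    for _ in range(n_iter):
        for i in range(len(b)):
            sigma = A[i] @ x - A[i, i] * x[i]      # off-diagonal contribution
            gs = (b[i] - sigma) / A[i, i]          # Gauss-Seidel value
            x[i] = (1.0 - omega) * x[i] + omega * gs
            if nonneg:
                x[i] = max(x[i], 0.0)
        # the residual could be monitored here to adapt omega between sweeps
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = rng.standard_normal((80, 40))
    x_true = np.abs(rng.standard_normal(40))
    y = H @ x_true + 0.01 * rng.standard_normal(80)
    A, b = H.T @ H + 1e-3 * np.eye(40), H.T @ y    # damped normal equations
    x_hat = sor(A, b, omega=1.2, nonneg=True)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

For a symmetric positive-definite system such as this one, SOR converges for any 0 < omega < 2; the paper's contribution concerns how to pick and update omega to get good restorations in few iterations.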

Vorontsov, S V; Strakhov, V N; Jefferies, S M; Borelli, K J

2011-07-01

429

Simultaneous Total Variation Image Inpainting and Blind Deconvolution  

E-print Network

Simultaneous Total Variation Image Inpainting and Blind Deconvolution Tony F. Chan Andy M. Yip opposed to inpainting blurry images). As a result, ringing effects due to imposing improper boundary conditions and errors due to imperfection of inpainting blurry images are reduced. More- over, our model can

Chan, Tony F.

430

Deconvolution of adaptive optics retinal images Julian C. Christou  

E-print Network

Deconvolution of adaptive optics retinal images Julian C. Christou Center for Adaptive Optics the contrast of the adaptive optics images. In this work we demonstrate that quantitative information is also by using adaptive optics1 (AO). The wave-front correction is not perfect, however. Although a diffraction

431

Detection of gravity field source boundaries using deconvolution method  

NASA Astrophysics Data System (ADS)

Complications arise in the interpretation of gravity fields because of interference from systematic degradations, such as boundary blurring and distortion. The major sources of these degradations are the various systematic errors that inevitably occur during gravity field data acquisition, discretization and geophysical forward modelling. To address this problem, we evaluate a deconvolution method that aims to detect the clear horizontal boundaries of anomalous sources by suppressing systematic errors. A convolution-based multilayer projection model, based on the classical 3-D gravity field forward model, is derived to model the systematic error degradation. Our deconvolution algorithm is specifically designed based on this multilayer projection model, in which three types of systematic error are defined. The degradations of the different systematic errors are considered in the deconvolution algorithm. As the primary source of degradation, the convolution-based systematic error is the main object of the multilayer projection model. Both the random systematic error and the projection systematic error are shown to form an integral part of the multilayer projection model, and the mixed-norm regularization method and the primal-dual optimization method are therefore employed to control these errors and stabilize the deconvolution solution. We analyse the parameter identification and convergence of the proposed algorithms, and synthetic and field data sets are both used to illustrate their effectiveness. Additional synthetic examples are specifically designed to analyse the effects of the projection systematic error, which is caused by the uncertainty associated with the estimation of the impulse response function.
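The abstract does not reproduce the model's equations. One plausible generic form for a convolution-based multilayer forward model with a mixed-norm regularized inversion, stated here purely as an illustrative assumption and not as the paper's exact formulation, is:

```latex
d(x,y) \;=\; \sum_{l=1}^{L} \big(k_l \ast m_l\big)(x,y) \;+\; \varepsilon(x,y),
\qquad
\hat{m} \;=\; \arg\min_{m}\;
\tfrac{1}{2}\Big\| d - \sum_{l} k_l \ast m_l \Big\|_2^2
\;+\; \lambda \sum_{l} \| m_l \|_{2,1}
```

where m_l is the source distribution of layer l, k_l the corresponding impulse-response kernel, epsilon collects the random and projection errors, and the mixed l_{2,1} norm stands in for the mixed-norm regularizer mentioned above; a primal-dual scheme is a natural solver for such non-smooth objectives.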

Zuo, Boxin; Hu, Xiangyun; Liang, Yin; Han, Qi

2014-12-01

432

Image Deconvolution of the Radio Ring PKS 1830-211  

Microsoft Academic Search

New high-quality Keck and ESO images of PKS 1830-211 are presented. By applying a powerful new deconvolution algorithm to these optical and infrared data, both images of the flat spectrum core of the radio source have been identified. An extended source is also detected in the optical images which is consistent with the expected location of the lensing galaxy. The

F. Courbin; C. Lidman; B. L. Frye; P. Magain; T. J. Broadhurst; M. A. Pahre; S. G. Djorgovski

1998-01-01

433

Deconvolution of sea state parameters from altimeter waveforms  

NASA Technical Reports Server (NTRS)

The paper reports on an on-going effort at the JPL to estimate the accuracy of ocean state parameters which have been obtained from the specular point probability density function (pdf) of the ocean surface. This pdf is obtained by the deconvolution of the return waveform of oceanographic altimeters such as Seasat, Geosat, or Topex.

Rodriguez, Ernesto; Chapman, Bruce; Chi, Chong-Yung; Liu, Eric

1987-01-01

434

Laser-cytofluorescence microscopic image restoration by iterative deconvolution  

Microsoft Academic Search

The aim of our study is to improve the performance of an optical microscope by an image deconvolution technique so that it approaches that of a confocal microscope in the field of laser cytofluorescence. The fluorescence of antigen lymphocyte sites marked by rhodamine is induced by a laser (λ = 543 nm) or a mercury vapor lamp. A set of scanned fluorescence images

Chengqi Xu; Eric Maire; Serge Jacquey

1995-01-01

435

A quantitative evaluation of various iterative deconvolution algorithms  

Microsoft Academic Search

Various iterative deconvolution algorithms are evaluated that are commonly used to restore degraded chromatographic or spectroscopic peak data. The evaluation criteria include RMS errors, relative errors in peak areas, peak area variances, and rate of convergence. The iterative algorithms to be evaluated include Van Cittert's method, Van Cittert's method with constraint operators, relaxation based methods, and Gold's ratio method. The

P. B. Crilly

1991-01-01

436

Deconvolution in Random Effects Models via Normal Mixtures  

E-print Network

[Fragment of the thesis table of contents; recoverable section headings include: Bandwidth Selection; Distribution Function Estimation; Optimality Results; Deconvolution in Location Random Effects Model; Discretizing the Metric; Estimating Mixing Proportions.]

Litton, Nathaniel A.

2010-10-12

437

Antenna Pattern Synthesis and Deconvolution of Microwave Radiometer Imaging Data  

E-print Network

Antenna Pattern Synthesis and Deconvolution of Microwave Radiometer Imaging Data C. T. Swift, M. A. Introduction A microwave radiometer measures the actual thermal emission smoothed by the radiation power radiometers. The first conically scanned space borne radiometer system for earth remote sensing was the SMMR

Reising, Steven C.

438

Real-time image deconvolution on the GPU  

NASA Astrophysics Data System (ADS)

Two-dimensional image deconvolution is an important and well-studied problem with applications to image deblurring and restoration. Most of the best deconvolution algorithms use natural image statistics that act as priors to regularize the problem. Recently, Krishnan and Fergus provide a fast deconvolution algorithm that yields results comparable to the current state of the art. They use a hyper-Laplacian image prior to regularize the problem. The resulting optimization problem is solved using alternating minimization in conjunction with a half-quadratic penalty function. In this paper, we provide an efficient CUDA implementation of their algorithm on the GPU. Our implementation leverages many well-known CUDA optimization techniques, as well as several others that have a significant impact on this particular algorithm. We discuss each of these, as well as make a few observations regarding the CUFFT library. Our experiments were run on an Nvidia GeForce GTX 260. For a single channel image of size 710 x 470, we obtain over 40 fps, while on a larger image of size 1900 x 1266, we get almost 6 fps (without counting disk I/O). In addition to linear performance, we believe ours is the first implementation to perform deconvolutions at video rates. Our running times also demonstrate that our GPU implementation is over 27 times faster than the original CPU implementation.
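For context, the half-quadratic splitting referenced here has the following general form (written in generic notation that may differ in detail from the Krishnan-Fergus paper):

```latex
\min_{x,\,w}\;
\frac{\lambda}{2}\,\| k \ast x - y \|_2^2
\;+\; \sum_{j}\sum_{i} |w_{j,i}|^{\alpha}
\;+\; \frac{\beta}{2}\sum_{j}\| \nabla_j x - w_j \|_2^2 ,
\qquad 0 < \alpha < 1
```

The algorithm alternates between the w-subproblem, which decouples into per-pixel one-dimensional shrinkages (analytic for alpha = 1/2 and 2/3), and the x-subproblem, which is quadratic and solved in closed form with FFTs, while beta is increased over the outer iterations; both steps are embarrassingly parallel, which is what a CUDA implementation with CUFFT exploits.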

Klosowski, James T.; Krishnan, Shankar

2011-01-01

439

MUTUAL INFORMATION APPROACH TO BLIND SEPARATION-DECONVOLUTION  

E-print Network

information for the blind separation and deconvolution of convolutive mixtures. Specifically, K observed separate the sources up to a filtering, since replacing each of the sequences {Y1(t of sources as sensors, for simplicity in the case of instantaneous mixtures [1]. Here, we shall adopt

Pham, Dinh-Tuan

440

Contrast Enhancement of Luminescence Images via Point-Spread Deconvolution  

E-print Network

Contrast Enhancement of Luminescence Images via Point-Spread Deconvolution Daniel Walter1 , Anyao luminescence imaging system. It is found that an experimental definition of the point-spread function allows reducing the luminescence signal and increasing the relative noise level. An implementation of point

441

Blind deconvolution of ultrasonic signals in nondestructive testing applications  

Microsoft Academic Search

Advanced nondestructive testing techniques use a laser to generate ultrasonic waves at the surface of a test material. An air-coupled transducer receives the ultrasound that is the convolution of the signal leaving the test material and the distortion function. Blind deconvolution methods are applied to estimate the signal leaving the material

A. K. Nandi; D. Mampel; B. Roscher

1997-01-01

442

Breast image feature learning with adaptive deconvolutional networks  

NASA Astrophysics Data System (ADS)

Feature extraction is a critical component of medical image analysis. Many computer-aided diagnosis approaches employ hand-designed, heuristic lesion extracted features. An alternative approach is to learn features directly from images. In this preliminary study, we explored the use of Adaptive Deconvolutional Networks (ADN) for learning high-level features in diagnostic breast mass lesion images with potential application to computer-aided diagnosis (CADx) and content-based image retrieval (CBIR). ADNs (Zeiler et al., 2011) are recently proposed unsupervised, generative hierarchical models that decompose images via convolution sparse coding and max pooling. We trained the ADNs to learn multiple layers of representation for two breast image data sets on two different modalities (739 full field digital mammography (FFDM) and 2393 ultrasound images). Feature map calculations were accelerated by use of GPUs. Following Zeiler et al., we applied the Spatial Pyramid Matching (SPM) kernel (Lazebnik et al., 2006) on the inferred feature maps and combined this with a linear support vector machine (SVM) classifier for the task of binary classification between cancer and non-cancer breast mass lesions. Non-linear, local structure preserving dimension reduction, Elastic Embedding (Carreira-Perpiñán, 2010), was then used to visualize the SPM kernel output in 2D and qualitatively inspect image relationships learned. Performance was found to be competitive with current CADx schemes that use human-designed features, e.g., achieving a 0.632+ bootstrap AUC (by case) of 0.83 [0.78, 0.89] for an ultrasound image set (1125 cases).
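The final classification stage described above, spatial pyramid pooling of learned feature maps followed by a linear SVM, can be sketched generically. The ADN feature learning itself is not reproduced; the feature maps below are random stand-ins with an injected class signal, and the pyramid levels and SVM settings are placeholder choices rather than the study's configuration.

```python
# Hedged sketch: spatial pyramid (max) pooling of a (C, H, W) feature map
# followed by a linear SVM, as a simplified stand-in for the SPM + SVM stage.
import numpy as np
from sklearn.svm import LinearSVC


def spatial_pyramid_pool(feature_map, levels=(1, 2, 4)):
    """Max-pool a (C, H, W) feature map over an L x L grid for each pyramid level."""
    C, H, W = feature_map.shape
    pooled = []
    for L in levels:
        for i in range(L):
            for j in range(L):
                cell = feature_map[:, i*H//L:(i+1)*H//L, j*W//L:(j+1)*W//L]
                pooled.append(cell.max(axis=(1, 2)))
    return np.concatenate(pooled)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    maps = rng.random((200, 16, 32, 32))     # 200 synthetic "lesion" feature maps
    labels = np.array([0] * 100 + [1] * 100)
    maps[100:, 0] += 0.2                     # give class 1 a stronger channel-0 response
    X = np.stack([spatial_pyramid_pool(m) for m in maps])
    idx = rng.permutation(200)
    X, y = X[idx], labels[idx]
    clf = LinearSVC(C=1.0, max_iter=5000).fit(X[:150], y[:150])
    print("held-out accuracy:", clf.score(X[150:], y[150:]))
```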

Jamieson, Andrew R.; Drukker, Karen; Giger, Maryellen L.

2012-03-01

443

Discriminating adenocarcinoma from normal colonic mucosa through deconvolution of Raman spectra  

NASA Astrophysics Data System (ADS)

In this work, we considered the feasibility of Raman spectroscopy for discriminating between adenocarcinomatous and normal mucosal formalin-fixed colonic tissues. Unlike earlier studies in colorectal cancer, a spectral deconvolution model was implemented to derive spectral information. Eleven samples of human colon were used, and 55 spectra were analyzed. Each spectrum was resolved into 25 bands from 975 to 1720 cm-1, where modes of proteins, lipids, and nucleic acids are observed. From a comparative study of band intensities, those presenting the largest differences between tissue types were correlated to biochemical assignments. Results from the fitting procedure were further used as inputs for linear discriminant analysis, where combinations of band intensities and intensity ratios were tested, yielding accuracies of up to 81%. This analysis yields objective discriminating parameters after fitting optimization. The bands with higher diagnostic relevance detected by spectral deconvolution make it possible to confine the study to selected spectral regions instead of broader ranges. A critical view of the limitations of this approach is presented, along with a comparison of our results to earlier ones obtained in fresh colonic tissues. This enabled us to assess the effect of formalin fixation in colonic tissues and determine its relevance in the present analysis.
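The two-stage analysis described above, resolving each spectrum into bands by least-squares fitting and then feeding the fitted band intensities to linear discriminant analysis, can be sketched generically. The band shapes (Lorentzian), positions, and synthetic data below are illustrative assumptions only and do not reproduce the 25-band model of the paper.

```python
# Hedged sketch: fit a small number of Lorentzian bands to each spectrum, then
# classify the fitted band intensities with linear discriminant analysis.
import numpy as np
from scipy.optimize import curve_fit
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def lorentzian(x, amp, center, width):
    return amp * width**2 / ((x - center)**2 + width**2)


def two_bands(x, a1, c1, w1, a2, c2, w2):
    return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)


def band_intensities(x, spectrum):
    """Fit two bands and return their amplitudes as features."""
    p0 = (1.0, 1450.0, 15.0, 1.0, 1660.0, 15.0)   # assumed band positions
    popt, _ = curve_fit(two_bands, x, spectrum, p0=p0, maxfev=10000)
    return popt[0], popt[3]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(1400.0, 1720.0, 400)
    spectra, labels = [], []
    for label in (0, 1):                           # 0: "normal", 1: "tumor" (synthetic)
        for _ in range(20):
            a1, a2 = (1.0, 0.6) if label == 0 else (0.7, 1.1)
            s = two_bands(x, a1 + 0.05*rng.standard_normal(), 1450, 14,
                             a2 + 0.05*rng.standard_normal(), 1660, 16)
            spectra.append(s + 0.02 * rng.standard_normal(x.size))
            labels.append(label)
    X = np.array([band_intensities(x, s) for s in spectra])
    lda = LinearDiscriminantAnalysis().fit(X, labels)
    print("training accuracy:", lda.score(X, labels))
```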

Cambraia Lopes, Patricia; Moreira, Joaquim Agostinho; Almeida, Abilio; Esteves, Artur; Gregora, Ivan; Ledinsky, Martin; Lopes, Jose Machado; Henrique, Rui; Oliveira, Albino

2011-12-01

444

Deconvolution-Based CT and MR Brain Perfusion Measurement: Theoretical Model Revisited and Practical Implementation Details  

PubMed Central

Deconvolution-based analysis of CT and MR brain perfusion data is widely used in clinical practice and it is still a topic of ongoing research activities. In this paper, we present a comprehensive derivation and explanation of the underlying physiological model for intravascular tracer systems. We also discuss practical details that are needed to properly implement algorithms for perfusion analysis. Our description of the practical computer implementation is focused on the most frequently employed algebraic deconvolution methods based on the singular value decomposition. In particular, we further discuss the need for regularization in order to obtain physiologically reasonable results. We include an overview of relevant preprocessing steps and provide numerous references to the literature. We cover both CT and MR brain perfusion imaging in this paper because they share many common aspects. The combination of both the theoretical as well as the practical aspects of perfusion analysis explicitly emphasizes the simplifications to the underlying physiological model that are necessary in order to apply it to measured data acquired with current CT and MR scanners. PMID:21904538
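The singular-value-decomposition deconvolution that the paper's implementation discussion centers on can be illustrated with a small sketch: build the convolution matrix from the arterial input function (AIF), invert it with a singular-value threshold, and read flow off the recovered residue function. The gamma-variate AIF, time step, exponential residue model, and threshold below are placeholder choices, not values from the paper.

```python
# Hedged sketch of truncated-SVD deconvolution for perfusion analysis.
import numpy as np


def svd_deconvolve(aif, tissue, dt, rel_threshold=0.2):
    """Return the flow-scaled residue function k(t) such that C = A @ k."""
    n = len(aif)
    # lower-triangular (causal) convolution matrix built from the AIF
    A = dt * np.array([[aif[i - j] if i >= j else 0.0
                        for j in range(n)] for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)  # regularization
    return Vt.T @ (s_inv * (U.T @ tissue))


if __name__ == "__main__":
    dt, t = 1.0, np.arange(0, 60.0, 1.0)
    aif = np.where(t > 5, (t - 5) ** 3 * np.exp(-(t - 5) / 1.5), 0.0)  # gamma-variate AIF
    cbf_true, mtt = 0.01, 4.0                                           # arbitrary units
    residue_true = cbf_true * np.exp(-t / mtt)                          # exponential residue
    tissue = dt * np.convolve(aif, residue_true)[: len(t)]              # simulated tissue curve
    k = svd_deconvolve(aif, tissue, dt)
    print("recovered flow (max of residue):", k.max(), "true:", cbf_true)
```

In practice the relative singular-value threshold acts as the regularization parameter discussed in the abstract; too low a threshold lets noise-dominated components oscillate, too high a threshold over-smooths the residue function.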

Fieselmann, Andreas; Kowarschik, Markus; Ganguly, Arundhuti; Hornegger, Joachim; Fahrig, Rebecca

2011-01-01

445

Deconvolution-Based CT and MR Brain Perfusion Measurement: Theoretical Model Revisited and Practical Implementation Details.  

PubMed

Deconvolution-based analysis of CT and MR brain perfusion data is widely used in clinical practice and it is still a topic of ongoing research activities. In this paper, we present a comprehensive derivation and explanation of the underlying physiological model for intravascular tracer systems. We also discuss practical details that are needed to properly implement algorithms for perfusion analysis. Our description of the practical computer implementation is focused on the most frequently employed algebraic deconvolution methods based on the singular value decomposition. In particular, we further discuss the need for regularization in order to obtain physiologically reasonable results. We include an overview of relevant preprocessing steps and provide numerous references to the literature. We cover both CT and MR brain perfusion imaging in this paper because they share many common aspects. The combination of both the theoretical as well as the practical aspects of perfusion analysis explicitly emphasizes the simplifications to the underlying physiological model that are necessary in order to apply it to measured data acquired with current CT and MR scanners. PMID:21904538

Fieselmann, Andreas; Kowarschik, Markus; Ganguly, Arundhuti; Hornegger, Joachim; Fahrig, Rebecca

2011-01-01

446

Market research for requirements analysis using linguistic tools  

Microsoft Academic Search

Numerous studies in recent months have proposed the use of linguistic instruments to support requirements analysis. There are two main reasons for this: (i) the progress made in natural language processing, (ii) the need to provide the developers of software systems with support in the early phases of requirements definition and conceptual modelling. This paper presents the results of an