Sample records for spectral analysis software

  1. Spectral Knowledge (SK-UTALCA): Software for Exploratory Analysis of High-Resolution Spectral Reflectance Data on Plant Breeding

    PubMed Central

    Lobos, Gustavo A.; Poblete-Echeverría, Carlos

    2017-01-01

    This article describes public, free software that provides efficient exploratory analysis of high-resolution spectral reflectance data. Spectral reflectance data can suffer from problems such as poor signal-to-noise ratios in various wavebands or invalid measurements due to changes in incoming solar radiation or operator fatigue leading to poor orientation of sensors. Thus, exploratory data analysis is essential to identify appropriate data for further analyses. This software overcomes the problem that analysis tools such as Excel are cumbersome to use for the high number of wavelengths and samples typically acquired in these studies. The software, Spectral Knowledge (SK-UTALCA), was initially developed for plant breeding, but it is also suitable for other studies such as precision agriculture, crop protection, ecophysiology, plant nutrition, and soil fertility. Various spectral reflectance indices (SRIs) are often used to relate crop characteristics to spectral data, and the software is loaded with 255 SRIs which can be applied quickly to the data. This article describes the architecture and functions of SK-UTALCA and the features of the data that led to the development of each of its modules. PMID:28119705

  2. Spectral Knowledge (SK-UTALCA): Software for Exploratory Analysis of High-Resolution Spectral Reflectance Data on Plant Breeding.

    PubMed

    Lobos, Gustavo A; Poblete-Echeverría, Carlos

    2016-01-01

    This article describes public, free software that provides efficient exploratory analysis of high-resolution spectral reflectance data. Spectral reflectance data can suffer from problems such as poor signal-to-noise ratios in various wavebands or invalid measurements due to changes in incoming solar radiation or operator fatigue leading to poor orientation of sensors. Thus, exploratory data analysis is essential to identify appropriate data for further analyses. This software overcomes the problem that analysis tools such as Excel are cumbersome to use for the high number of wavelengths and samples typically acquired in these studies. The software, Spectral Knowledge (SK-UTALCA), was initially developed for plant breeding, but it is also suitable for other studies such as precision agriculture, crop protection, ecophysiology, plant nutrition, and soil fertility. Various spectral reflectance indices (SRIs) are often used to relate crop characteristics to spectral data, and the software is loaded with 255 SRIs which can be applied quickly to the data. This article describes the architecture and functions of SK-UTALCA and the features of the data that led to the development of each of its modules.

  3. [Analysis of software for identifying spectral line of laser-induced breakdown spectroscopy based on LabVIEW].

    PubMed

    Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang

    2012-03-01

    Self-designed software for identifying spectral lines in laser-induced breakdown spectroscopy (LIBS) is introduced. Built with LabVIEW, the software can smooth spectral lines and pick peaks using second-difference and threshold methods. Characteristic spectra of several elements are matched against the NIST database, enabling automatic spectral line identification and qualitative analysis of the basic composition of a sample. The software analyzes spectra conveniently and rapidly and will be a useful tool for LIBS.
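
    As a rough illustration of the second-difference-plus-threshold peak picking described above (a generic Python sketch, not the authors' LabVIEW implementation; the window length, polynomial order, and threshold factor are arbitrary choices):

      import numpy as np
      from scipy.signal import savgol_filter

      def pick_peaks(wavelengths, intensities, window=11, poly=3, k=3.0):
          # Smooth the raw spectrum, then use the second difference (curvature)
          # and an amplitude threshold to select candidate emission lines.
          smooth = savgol_filter(np.asarray(intensities, float), window, poly)
          d2 = np.diff(smooth, n=2)                     # second difference
          noise = np.std(intensities - smooth)          # crude noise estimate
          peaks = []
          for i in range(1, len(d2) - 1):
              j = i + 1                                 # index into the original arrays
              is_max = smooth[j] > smooth[j - 1] and smooth[j] >= smooth[j + 1]
              if is_max and d2[i] < -k * noise and smooth[j] > k * noise:
                  peaks.append((wavelengths[j], smooth[j]))
          return peaks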

  4. Spectral analysis software improves confidence in plant and soil water stable isotope analyses performed by isotope ratio infrared spectroscopy (IRIS).

    PubMed

    West, A G; Goldsmith, G R; Matimati, I; Dawson, T E

    2011-08-30

    Previous studies have demonstrated the potential for large errors to occur when analyzing waters containing organic contaminants using isotope ratio infrared spectroscopy (IRIS). In an attempt to address this problem, IRIS manufacturers now provide post-processing spectral analysis software capable of identifying samples with the types of spectral interference that compromise their stable isotope analysis. Here we report two independent tests of this post-processing spectral analysis software on two IRIS systems, OA-ICOS (Los Gatos Research Inc.) and WS-CRDS (Picarro Inc.). Following a similar methodology to a previous study, we cryogenically extracted plant leaf water and soil water and measured the δ(2)H and δ(18)O values of identical samples by isotope ratio mass spectrometry (IRMS) and IRIS. As an additional test, we analyzed plant stem waters and tap waters by IRMS and IRIS in an independent laboratory. For all tests we assumed that the IRMS value represented the "true" value against which we could compare the stable isotope results from the IRIS methods. Samples showing significant deviations from the IRMS value (>2σ) were considered to be contaminated and representative of spectral interference in the IRIS measurement. Over the two studies, 83% of plant species were considered contaminated on OA-ICOS and 58% on WS-CRDS. Post-analysis, the spectra were analyzed using the manufacturer's spectral analysis software in order to see if the software correctly identified contaminated samples. In our tests the software performed well, identifying all the samples with major errors. However, some false negatives indicate that user evaluation and testing of the software are necessary. Repeat sampling of plants showed considerable variation in the discrepancies between IRIS and IRMS. As such, we recommend that spectral analysis of IRIS data be incorporated into standard post-processing routines. Furthermore, we suggest that the results from spectral analysis be included when reporting stable isotope data from IRIS. Copyright © 2011 John Wiley & Sons, Ltd.
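
    A minimal sketch of the screening criterion described above: samples whose IRIS value differs from the paired IRMS value by more than 2σ are flagged as likely contaminated. The variable names and the way σ is supplied (estimated from uncontaminated reference waters) are assumptions for illustration, not the authors' workflow.

      import numpy as np

      def flag_contaminated(delta_irms, delta_iris, sigma_ref):
          # sigma_ref: measurement spread established from clean reference waters (assumed given).
          diff = np.asarray(delta_iris, float) - np.asarray(delta_irms, float)
          return np.abs(diff) > 2.0 * sigma_ref

      # Example: flag leaf-water d18O analyses deviating by more than 2 * 0.3 permil.
      flags = flag_contaminated([-5.1, -4.8, -6.0], [-5.0, -3.2, -6.1], sigma_ref=0.3)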

  5. Spatio-temporally resolved spectral measurements of laser-produced plasma and semiautomated spectral measurement-control and analysis software

    NASA Astrophysics Data System (ADS)

    Cao, S. Q.; Su, M. G.; Min, Q.; Sun, D. X.; O'Sullivan, G.; Dong, C. Z.

    2018-02-01

    A spatio-temporally resolved spectral measurement system of highly charged ions from laser-produced plasmas is presented. Corresponding semiautomated computer software for measurement control and spectral analysis has been written to achieve the best synchronicity possible among the instruments. This avoids the tedious comparative processes between experimental and theoretical results. To demonstrate the capabilities of this system, a series of spatio-temporally resolved experiments of laser-produced Al plasmas have been performed and applied to benchmark the software. The system is a useful tool for studying the spectral structures of highly charged ions and for evaluating the spatio-temporal evolution of laser-produced plasmas.

  6. The U. S. Geological Survey, Digital Spectral Library: Version 1 (0.2 to 3.0um)

    USGS Publications Warehouse

    Clark, Roger N.; Swayze, Gregg A.; Gallagher, Andrea J.; King, Trude V.V.; Calvin, Wendy M.

    1993-01-01

    We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 498 spectra of 444 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 um. The spectral resolution (Full Width Half Maximum) of the reflectance data is <= 4 nm in the visible (0.2-0.8 um) and <= 10 nm in the NIR (0.8-2.35 um). All spectra were corrected to absolute reflectance using an NIST Halon standard. Library management software lets users search on parameters (e.g. chemical formulae, chemical analyses, purity of samples, mineral groups, etc.) as well as spectral features. Minerals from the borate, carbonate, chloride, element, halide, hydroxide, nitrate, oxide, phosphate, sulfate, sulfide, sulfosalt, and silicate (cyclosilicate, inosilicate, nesosilicate, phyllosilicate, sorosilicate, and tectosilicate) classes are represented. X-ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, a kaolinite crystallinity series, a kaolinite-smectite series, a zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses. The library and software are available as a series of U.S.G.S. Open File reports. PC user software is available to convert the binary data to ASCII files (a separate U.S.G.S. open file report). Additionally, the binary data files are online at the U.S.G.S. in Denver for anonymous ftp access by users on the Internet. The library search software enables a user to search on documentation parameters as well as spectral features. The analysis system includes general spectral analysis routines, plotting packages, radiative transfer software for computing intimate mixtures, routines to derive optical constants from reflectance spectra, tools to analyze spectral features, and the capability to access imaging spectrometer data cubes for spectral analysis. Users may build customized libraries (at specific wavelengths and spectral resolution) for their own instruments using the library software. We are currently extending spectral coverage to 150 um. The libraries (original and convolved) will be made available in the future on a CD-ROM.

  7. The software and algorithms for hyperspectral data processing

    NASA Astrophysics Data System (ADS)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    Hyperspectral remote sensing is widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. The package has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure consisting of a set of independent plugins. Each plugin is compiled as a Qt plugin and packaged as a Windows dynamic-link library (DLL). Plugins can be categorized by data reading type, data visualization (3D, 2D, 1D), and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct smoothing by moving average, Savitzky-Golay smoothing, RGB correction, histogram transformation, and atmospheric correction. The software provides two engineering techniques developed by the authors for the atmospheric correction problem: an iterative method that refines spectral albedo parameters using libRadtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with similar spectra from spectral libraries and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Further advantages include fast and low-memory hypercube manipulation, a user-friendly interface, modularity, and expandability.
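
    The smoothing functions mentioned above (moving average and Savitzky-Golay) are standard operations; a small Python sketch of both, applied to a single hyperspectral pixel spectrum, is shown below. It is not taken from the C++/Qt package, and the window sizes are arbitrary.

      import numpy as np
      from scipy.signal import savgol_filter

      def moving_average(spectrum, width=5):
          # Direct moving-average smoothing of one pixel spectrum.
          kernel = np.ones(width) / width
          return np.convolve(np.asarray(spectrum, float), kernel, mode="same")

      def savgol_smooth(spectrum, window=9, order=2):
          # Savitzky-Golay smoothing preserves band shapes better than a plain average.
          return savgol_filter(np.asarray(spectrum, float), window, order)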

  8. Software development for the analysis of heartbeat sounds with LabVIEW in diagnosis of cardiovascular disease.

    PubMed

    Topal, Taner; Polat, Hüseyin; Güler, Inan

    2008-10-01

    In this paper, time-frequency spectral analysis software (Heart Sound Analyzer) for the computer-aided analysis of cardiac sounds has been developed with LabVIEW. The software modules reveal important information about cardiovascular disorders and can also help general physicians reach more accurate and reliable diagnoses at early stages. Heart Sound Analyzer (HSA) software can compensate for the shortage of expert doctors and assist them in rural as well as urban clinics and hospitals. HSA has two main blocks: data acquisition and preprocessing, and time-frequency spectral analysis. The heart sounds are first acquired using a modified stethoscope which has an electret microphone in it. Then, the signals are analysed using time-frequency/scale spectral analysis techniques such as the STFT, the Wigner-Ville distribution, and wavelet transforms. The HSA modules have been tested with real heart sounds from 35 volunteers and proved to be quite efficient and robust while dealing with a large variety of pathological conditions.
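
    As an illustration of the time-frequency step, the sketch below computes an STFT spectrogram of a synthetic heart-sound-like signal with SciPy; the sampling rate and signal model are invented for the example, and the Wigner-Ville and wavelet stages are not shown.

      import numpy as np
      from scipy.signal import stft

      fs = 4000                                          # assumed sampling rate (Hz)
      t = np.arange(0, 2.0, 1.0 / fs)
      # Crude stand-in for periodic heart sounds: short 60 Hz bursts every 0.8 s.
      signal = np.sin(2 * np.pi * 60 * t) * np.exp(-((t % 0.8) / 0.05) ** 2)

      f, times, Z = stft(signal, fs=fs, nperseg=256, noverlap=192)
      power = np.abs(Z) ** 2                             # time-frequency power map to inspect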

  9. Software algorithm and hardware design for real-time implementation of new spectral estimator

    PubMed Central

    2014-01-01

    Background Real-time spectral analyzers can be difficult to implement for PC computer-based systems because of the potential for high computational cost, and algorithm complexity. In this work a new spectral estimator (NSE) is developed for real-time analysis, and compared with the discrete Fourier transform (DFT). Method Clinical data in the form of 216 fractionated atrial electrogram sequences were used as inputs. The sample rate for acquisition was 977 Hz, or approximately 1 millisecond between digital samples. Real-time NSE power spectra were generated for 16,384 consecutive data points. The same data sequences were used for spectral calculation using a radix-2 implementation of the DFT. The NSE algorithm was also developed for implementation as a real-time spectral analyzer electronic circuit board. Results The average interval for a single real-time spectral calculation in software was 3.29 μs for NSE versus 504.5 μs for DFT. Thus for real-time spectral analysis, the NSE algorithm is approximately 150× faster than the DFT. Over a 1 millisecond sampling period, the NSE algorithm had the capability to spectrally analyze a maximum of 303 data channels, while the DFT algorithm could only analyze a single channel. Moreover, for the 8 second sequences, the NSE spectral resolution in the 3-12 Hz range was 0.037 Hz while the DFT spectral resolution was only 0.122 Hz. The NSE was also found to be implementable as a standalone spectral analyzer board using approximately 26 integrated circuits at a cost of approximately $500. The software files used for analysis are included as a supplement, please see the Additional files 1 and 2. Conclusions The NSE real-time algorithm has low computational cost and complexity, and is implementable in both software and hardware for 1 millisecond updates of multichannel spectra. The algorithm may be helpful to guide radiofrequency catheter ablation in real time. PMID:24886214
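
    The NSE algorithm itself is not given in this abstract, so only the DFT side of the resolution comparison is illustrated below; the 8192-point radix-2 length is an assumption chosen to reproduce the quoted ~0.122 Hz DFT bin width for 8-second sequences.

      import numpy as np

      # DFT frequency resolution is the reciprocal of the analyzed window length:
      # delta_f = 1 / (N * dt).  With N = 8192 (radix-2, ~8 s of data) and dt ~= 1 ms,
      # delta_f ~= 0.122 Hz, matching the DFT resolution quoted in the abstract.
      dt = 1.0e-3                    # ~1 ms sample spacing (977 Hz acquisition)
      n_fft = 8192                   # assumed radix-2 length for the 8 s sequences
      print("DFT bin width:", 1.0 / (n_fft * dt), "Hz")

      x = np.random.randn(n_fft)
      spectrum = np.abs(np.fft.rfft(x)) ** 2     # one radix-2 power spectrum of a test sequence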

  10. Processing Raman Spectra of High-Pressure Hydrogen Flames

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet; Kojima, Jun

    2006-01-01

    The Raman Code automates the analysis of laser-Raman-spectroscopy data for diagnosis of combustion at high pressure. On the basis of the theory of molecular spectroscopy, the software calculates the rovibrational and pure rotational Raman spectra of H2, O2, N2, and H2O in hydrogen/air flames at given temperatures and pressures. Given a set of Raman spectral data from measurements on a given flame and results from the aforementioned calculations, the software calculates the thermodynamic temperature and number densities of the aforementioned species. The software accounts for collisional spectral-line-broadening effects at pressures up to 60 bar (6 MPa). The line-broadening effects increase with pressure and thereby complicate the analysis. The software also corrects for spectral interference ("cross-talk") among the various chemical species. In the absence of such correction, the cross-talk is a significant source of error in temperatures and number densities. This is the first known comprehensive computer code that, when used in conjunction with a spectral calibration database, can process Raman-scattering spectral data from high-pressure hydrogen/air flames to obtain temperatures accurate to within 10 K and chemical-species number densities accurate to within 2 percent.

  11. The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1992-01-01

    The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.

  12. Spectral mapping tools from the earth sciences applied to spectral microscopy data.

    PubMed

    Harris, A Thomas

    2006-08-01

    Spectral imaging, originating from the field of earth remote sensing, is a powerful tool that is being increasingly used in a wide variety of applications for material identification. Several workers have used techniques like linear spectral unmixing (LSU) to discriminate materials in images derived from spectral microscopy. However, many spectral analysis algorithms rely on assumptions that are often violated in microscopy applications. This study explores algorithms originally developed as improvements on early earth imaging techniques that can be easily translated for use with spectral microscopy. To best demonstrate the application of earth remote sensing spectral analysis tools to spectral microscopy data, earth imaging software was used to analyze data acquired with a Leica confocal microscope with mechanical spectral scanning. For this study, spectral training signatures (often referred to as endmembers) were selected with the ENVI (ITT Visual Information Solutions, Boulder, CO) "spectral hourglass" processing flow, a series of tools that use the spectrally over-determined nature of hyperspectral data to find the most spectrally pure (or spectrally unique) pixels within the data set. This set of endmember signatures was then used in the full range of mapping algorithms available in ENVI to determine locations, and in some cases subpixel abundances of endmembers. Mapping and abundance images showed a broad agreement between the spectral analysis algorithms, supported through visual assessment of output classification images and through statistical analysis of the distribution of pixels within each endmember class. The powerful spectral analysis algorithms available in COTS software, the result of decades of research in earth imaging, are easily translated to new sources of spectral data. Although the scale between earth imagery and spectral microscopy is radically different, the problem is the same: mapping material locations and abundances based on unique spectral signatures. (c) 2006 International Society for Analytical Cytology.
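
    For readers unfamiliar with linear spectral unmixing, the sketch below shows the basic per-pixel operation using a non-negative least-squares solver; it is a generic illustration, not the ENVI implementation, and the endmember matrix is assumed to be supplied by the user (e.g. from an endmember-selection step).

      import numpy as np
      from scipy.optimize import nnls

      def unmix_pixel(pixel, endmembers):
          # pixel: (n_bands,) spectrum; endmembers: (n_bands, n_endmembers) signature matrix.
          abundances, _ = nnls(endmembers, np.asarray(pixel, float))
          total = abundances.sum()
          return abundances / total if total > 0 else abundances  # optional sum-to-one scaling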

  13. Software Defined Network Monitoring Scheme Using Spectral Graph Theory and Phantom Nodes

    DTIC Science & Technology

    2014-09-01

    networks is the emergence of software-defined networking (SDN) [1]. SDN has existed for the... Chapter III for network monitoring. A. SOFTWARE DEFINED NETWORKS. SDNs provide a new and innovative method to simplify network hardware by logically... and R. Giladi, "Performance analysis of software-defined networking (SDN)," in Proc. of IEEE 21st International Symposium on Modeling, Analysis

  14. Reducing beam hardening effects and metal artefacts in spectral CT using Medipix3RX

    NASA Astrophysics Data System (ADS)

    Rajendran, K.; Walsh, M. F.; de Ruiter, N. J. A.; Chernoglazov, A. I.; Panta, R. K.; Butler, A. P. H.; Butler, P. H.; Bell, S. T.; Anderson, N. G.; Woodfield, T. B. F.; Tredinnick, S. J.; Healy, J. L.; Bateman, C. J.; Aamir, R.; Doesburg, R. M. N.; Renaud, P. F.; Gieseg, S. P.; Smithies, D. J.; Mohr, J. L.; Mandalika, V. B. H.; Opie, A. M. T.; Cook, N. J.; Ronaldson, J. P.; Nik, S. J.; Atharifard, A.; Clyne, M.; Bones, P. J.; Bartneck, C.; Grasset, R.; Schleich, N.; Billinghurst, M.

    2014-03-01

    This paper discusses methods for reducing beam hardening effects and metal artefacts using spectral x-ray information in biomaterial samples. A small-animal spectral scanner was operated in the 15 to 80 keV x-ray energy range for this study. We use the photon-processing features of a CdTe-Medipix3RX ASIC in charge summing mode to reduce beam hardening and associated artefacts. We present spectral data collected for metal alloy samples, its analysis using algebraic 3D reconstruction software and volume visualisation using a custom volume rendering software. The cupping effect and streak artefacts are quantified in the spectral datasets. The results show reduction in beam hardening effects and metal artefacts in the narrow high energy range acquired using the spectroscopic detector. A post-reconstruction comparison between CdTe-Medipix3RX and Si-Medipix3.1 is discussed. The raw data and processed data are made available (http://hdl.handle.net/10092/8851) for testing with other software routines.

  15. EEGgui: a program used to detect electroencephalogram anomalies after traumatic brain injury.

    PubMed

    Sick, Justin; Bray, Eric; Bregy, Amade; Dietrich, W Dalton; Bramlett, Helen M; Sick, Thomas

    2013-05-21

    Identifying and quantifying pathological changes in brain electrical activity is important for investigations of brain injury and neurological disease. An example is the development of epilepsy, a secondary consequence of traumatic brain injury. While certain epileptiform events can be identified visually from electroencephalographic (EEG) or electrocorticographic (ECoG) records, quantification of these pathological events has proved to be more difficult. In this study we developed MATLAB-based software to assist detection of pathological brain electrical activity following traumatic brain injury (TBI) and present our MATLAB code used for the analysis of the ECoG. The software was developed using MATLAB and features of the open-access EEGLAB. EEGgui is a graphical user interface in the MATLAB programming platform that allows scientists who are not proficient in computer programming to perform a number of elaborate analyses on ECoG signals. The different analyses include power spectral density (PSD), short-time Fourier analysis and spectral entropy (SE). ECoG records used for demonstration of this software were derived from rats that had undergone traumatic brain injury one year earlier. The software provided in this report provides a graphical user interface for displaying ECoG activity and calculating normalized power density, using the fast Fourier transform, for the major brain wave frequency bands (delta, theta, alpha, beta1, beta2 and gamma). The software further detects events in which power density for these frequency bands exceeds normal ECoG by more than 4 standard deviations. We found that epileptic events could be identified and distinguished from a variety of ECoG phenomena associated with normal changes in behavior. We further found that analysis of spectral entropy was less effective in distinguishing epileptic from normal changes in ECoG activity. The software presented here is a successful modification of EEGLAB in the MATLAB environment that allows detection of epileptiform ECoG signals in animals after TBI. The code allows import of large EEG or ECoG data records as standard text files and uses the fast Fourier transform as a basis for detection of abnormal events. The software can also be used to monitor injury-induced changes in spectral entropy if required. We hope that the software will be useful for other investigators in the field of traumatic brain injury and will stimulate future advances in the quantitative analysis of brain electrical activity after neurological injury or disease.
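
    A minimal sketch of the detection idea described above, band power from an FFT-based spectral estimate plus a 4-standard-deviation threshold; the band limits and the epoching scheme are assumptions, and this is not the EEGgui code.

      import numpy as np
      from scipy.signal import welch

      BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
               "beta1": (13, 20), "beta2": (20, 30), "gamma": (30, 50)}   # assumed limits (Hz)

      def band_powers(epoch, fs):
          # Welch (FFT-based) power spectral density of one epoch, integrated per band.
          f, pxx = welch(epoch, fs=fs, nperseg=min(len(epoch), 1024))
          return {name: np.trapz(pxx[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
                  for name, (lo, hi) in BANDS.items()}

      def flag_events(band_power_per_epoch, n_sd=4.0):
          # Flag epochs whose band power exceeds the record mean by more than n_sd SDs.
          p = np.asarray(band_power_per_epoch, float)
          return p > p.mean() + n_sd * p.std()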

  16. Generation of a Database of Laboratory Laser-Induced Breakdown Spectroscopy (LIBS) Spectra and Associated Analysis Software

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Clegg, S. M.; Graff, T.; Morris, R. V.; Laura, J.

    2015-06-01

    We describe plans to generate a database of LIBS spectra of planetary analog materials and develop free, open-source software to enable the planetary community to analyze LIBS (and other spectral) data.

  17. SpecViz: Interactive Spectral Data Analysis

    NASA Astrophysics Data System (ADS)

    Earl, Nicholas Michael; STScI

    2016-06-01

    The astronomical community is about to enter a new generation of scientific enterprise. With next-generation instrumentation and advanced capabilities, the need has arisen to equip astronomers with the necessary tools to deal with large, multi-faceted data. The Space Telescope Science Institute has initiated a data analysis forum for the creation, development, and maintenance of software tools for the interpretation of these new data sets. SpecViz is an interactive 1-D spectral visualization and analysis application built with Python in an open-source development environment. A user-friendly GUI allows for a fast, interactive approach to spectral analysis. SpecViz supports handling of unique and instrument-specific data and incorporates advanced spectral unit handling and conversions in a flexible, high-performance interactive plotting environment. Active spectral feature analysis is possible through interactive measurement and statistical tools. It can be used to build wide-band SEDs, with the capability of combining or overplotting data products from various instruments. SpecViz offers advanced toolsets for filtering and detrending spectral lines; identifying, isolating, and manipulating spectral features; as well as utilizing spectral templates for renormalizing data in an interactive way. SpecViz also includes a flexible model-fitting toolset that allows for multi-component models, as well as custom models, to be used with various fitting and decomposition routines. SpecViz also features robust extension via custom data loaders and connection to the central communication system underneath the interface for more advanced control. Integration with Jupyter notebooks via connection to the active IPython kernel allows SpecViz to be used within a user's normal workflow without demanding that the user drastically alter their method of data analysis. In addition, SpecViz allows the interactive analysis of multi-object spectroscopy in the same straightforward, consistent way. Through the development of such tools, STScI hopes to unify astronomical data analysis software for JWST and other instruments, allowing for efficient, reliable, and consistent scientific results.

  18. Development of spectral analysis math models and software program and spectral analyzer, digital converter interface equipment design

    NASA Technical Reports Server (NTRS)

    Hayden, W. L.; Robinson, L. H.

    1972-01-01

    Spectral analysis of angle-modulated communication systems is studied by: (1) performing a literature survey of candidate power spectrum computational techniques, determining the computational requirements, and formulating a mathematical model satisfying these requirements; (2) implementing the model on a UNIVAC 1230 digital computer as the Spectral Analysis Program (SAP); and (3) developing the hardware specifications for a data acquisition system which will acquire an input modulating signal for SAP. The SAP computational technique uses an extended fast Fourier transform and represents a generalized approach for simple and complex modulating signals.

  19. NDVI and Panchromatic Image Correlation Using Texture Analysis

    DTIC Science & Technology

    2010-03-01

    Figure 5. Spectral reflectance of vegetation and soil from 0.4 to 1.1 µm (From Perry... should help the classification methods to be able to classify kelp. Figure 5. Spectral reflectance of vegetation and soil from 0.4 to 1.1 µm... 1988). Image processing software for imaging spectrometry analysis. Remote Sensing of Environment, 24: 201-210. Perry, C., & Lautenschlager, L. F

  20. PRISM: Processing routines in IDL for spectroscopic measurements (installation manual and user's guide, version 1.0)

    USGS Publications Warehouse

    Kokaly, Raymond F.

    2011-01-01

    This report describes procedures for installing and using the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software. PRISM provides a framework to conduct spectroscopic analysis of measurements made using laboratory, field, airborne, and space-based spectrometers. Using PRISM functions, the user can compare the spectra of materials of unknown composition with reference spectra of known materials. This spectroscopic analysis allows the composition of the material to be identified and characterized. Among its other functions, PRISM contains routines for the storage of spectra in database files, import/export of ENVI spectral libraries, importation of field spectra, correction of spectra to absolute reflectance, arithmetic operations on spectra, interactive continuum removal and comparison of spectral features, correction of imaging spectrometer data to ground-calibrated reflectance, and identification and mapping of materials using spectral feature-based analysis of reflectance data. This report provides step-by-step instructions for installing the PRISM software and running its functions.
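
    Continuum removal, one of the PRISM functions listed above, is a standard operation: divide the reflectance spectrum by its upper convex hull so that absorption features can be compared on a common scale. The sketch below is a generic implementation, not PRISM's IDL code, and assumes wavelengths are sorted in increasing order.

      import numpy as np

      def continuum_removed(wavelengths, reflectance):
          # Divide a reflectance spectrum by its upper convex hull (continuum removal).
          w = np.asarray(wavelengths, float)
          r = np.asarray(reflectance, float)
          hull = [0]                                    # indices of the upper hull, left to right
          for i in range(1, len(w)):
              while len(hull) >= 2:
                  x1, y1 = w[hull[-2]], r[hull[-2]]
                  x2, y2 = w[hull[-1]], r[hull[-1]]
                  # Drop the last hull point if it falls below the chord to the new point.
                  if (y2 - y1) * (w[i] - x1) <= (r[i] - y1) * (x2 - x1):
                      hull.pop()
                  else:
                      break
              hull.append(i)
          continuum = np.interp(w, w[hull], r[hull])    # piecewise-linear upper hull
          return r / continuum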

  1. Easily extensible unix software for spectral analysis, display, modification, and synthesis of musical sounds

    NASA Astrophysics Data System (ADS)

    Beauchamp, James W.

    2002-11-01

    Software has been developed which enables users to perform time-varying spectral analysis of individual musical tones, or successions of them, and to perform further processing of the data. The package, called sndan, is freely available in source code, uses EPS graphics for display, and is written in ANSI C for ease of code modification and extension. Two analyzers, a fixed-filter-bank phase vocoder ("pvan") and a frequency-tracking analyzer ("mqan"), constitute the analysis front end of the package. While pvan's output consists of continuous amplitudes and frequencies of harmonics, mqan produces disjoint "tracks." However, another program extracts a fundamental frequency and separates harmonics from the tracks, resulting in a continuous harmonic output. "monan" is a program used to display harmonic data in a variety of formats, perform various spectral modifications, and perform additive resynthesis of the harmonic partials, including possible pitch-shifting and time-scaling. Sounds can also be synthesized according to a musical score using a companion synthesis language, Music 4C. Several other programs in the sndan suite can be used for specialized tasks, such as signal display and editing. Applications of the software include producing specialized sounds for music compositions or psychoacoustic experiments, or serving as a basis for developing new synthesis algorithms.

  2. Specdata: Automated Analysis Software for Broadband Spectra

    NASA Astrophysics Data System (ADS)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open-source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving them considerable flexibility to assign, manipulate, interpret and export their analysis. The automated, and key, component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies...), or to those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines have been removed, are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
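
    The core of the automated assignment step, matching measured line frequencies to catalogued transitions within a tolerance, can be sketched as follows; the data structures and the 0.05 MHz tolerance are illustrative assumptions, not SPECdata internals.

      import numpy as np

      def assign_lines(peak_freqs, catalog_freqs, catalog_labels, tol=0.05):
          # Assign each measured peak to the nearest catalogued transition within tol (MHz).
          cat = np.asarray(catalog_freqs, float)
          assignments = {}
          for f in peak_freqs:
              i = int(np.argmin(np.abs(cat - f)))
              if abs(cat[i] - f) <= tol:
                  assignments[f] = catalog_labels[i]
          return assignments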

  3. DoE Phase II SBIR: Spectrally-Assisted Vehicle Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villeneuve, Pierre V.

    2013-02-28

    The goal of this Phase II SBIR is to develop a prototype software package to demonstrate spectrally-aided vehicle tracking performance. The primary application is to demonstrate improved target vehicle tracking performance in complex environments where traditional spatial tracker systems may show reduced performance. Example scenarios in Figure 1 include (a) the target vehicle obscured by a large structure for an extended period of time, or (b) the target engaging in extreme maneuvers amongst other civilian vehicles. The target information derived from spatial processing is unable to differentiate between the green and the red vehicle. Spectral signature exploitation enables comparison of new candidate targets with existing track signatures. The ambiguity in this confusing scenario is resolved by folding spectral analysis results into the target nomination and association processes. Figure 3 shows a number of example spectral signatures from a variety of natural and man-made materials. The work performed over the two-year effort was divided into three general areas: algorithm refinement, software prototype development, and prototype performance demonstration. The tasks performed under this Phase II to accomplish the program goals were as follows: 1. Acquire relevant vehicle target datasets to support the prototype. 2. Refine algorithms for target spectral feature exploitation. 3. Implement a prototype multi-hypothesis target tracking software package. 4. Demonstrate and quantify tracking performance using relevant data.

  4. M.S.L.A.P. Modular Spectral Line Analysis Program documentation

    NASA Technical Reports Server (NTRS)

    Joseph, Charles L.; Jenkins, Edward B.

    1991-01-01

    MSLAP is software for analyzing spectra, providing the basic structure to identify spectral features, to make quantitative measurements of these features, and to store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis in which the computer carries most of the computational and data-organizational burden and the investigator is responsible only for judgement decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.
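
    The quantities MSLAP measures have simple discrete forms; a sketch, assuming the continuum has already been fitted and the line is in absorption, is given below. It is a generic calculation, not the MSLAP code.

      import numpy as np

      def line_moments(wavelength, flux, continuum):
          # Zeroth moment (equivalent width), first moment (centroid) and second moment
          # (variance) of an absorption-line profile, integrated over the line depth.
          w = np.asarray(wavelength, float)
          depth = 1.0 - np.asarray(flux, float) / np.asarray(continuum, float)
          eq_width = np.trapz(depth, w)                          # zeroth moment
          centroid = np.trapz(w * depth, w) / eq_width           # first moment
          variance = np.trapz((w - centroid) ** 2 * depth, w) / eq_width
          return eq_width, centroid, variance

      def apparent_optical_depth(flux, continuum):
          # Apparent optical depth across the profile, tau(lambda) = -ln(F / F_continuum).
          return -np.log(np.asarray(flux, float) / np.asarray(continuum, float))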

  5. Defining and Detecting Complex Peak Relationships in Mass Spectral Data: The Mz.unity Algorithm.

    PubMed

    Mahieu, Nathaniel G; Spalding, Jonathan L; Gelman, Susan J; Patti, Gary J

    2016-09-20

    Analysis of a single analyte by mass spectrometry can result in the detection of more than 100 degenerate peaks. These degenerate peaks complicate spectral interpretation and are challenging to annotate. In mass spectrometry-based metabolomics, this degeneracy leads to inflated false discovery rates, data sets containing an order of magnitude more features than analytes, and an inefficient use of resources during data analysis. Although software has been introduced to annotate spectral degeneracy, current approaches are unable to represent several important classes of peak relationships. These include heterodimers and higher complex adducts, distal fragments, relationships between peaks in different polarities, and complex adducts between features and background peaks. Here we outline sources of peak degeneracy in mass spectra that are not annotated by current approaches and introduce a software package called mz.unity to detect these relationships in accurate mass data. Using mz.unity, we find that data sets contain many more complex relationships than we anticipated. Examples include the adduct of glutamate and nicotinamide adenine dinucleotide (NAD), fragments of NAD detected in the same or opposite polarities, and the adduct of glutamate and a background peak. Further, the complex relationships we identify show that several assumptions commonly made when interpreting mass spectral degeneracy do not hold in general. These contributions provide new tools and insight to aid in the annotation of complex spectral relationships and provide a foundation for improved data set identification. Mz.unity is an R package and is freely available at https://github.com/nathaniel-mahieu/mz.unity as well as our laboratory Web site http://pattilab.wustl.edu/software/ .
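
    A toy version of the relationship search, pairing peaks whose m/z difference matches a known adduct or neutral-loss mass, is sketched below; the table of differences is a small illustrative subset, and the code is not the mz.unity algorithm, which handles far more general relationships (heterodimers, cross-polarity links, background adducts).

      import numpy as np
      from itertools import combinations

      # A few common mass differences (Da); illustrative subset only.
      DIFFS = {"Na-for-H adduct": 21.981944, "K-for-H adduct": 37.955882,
               "water loss": 18.010565, "ammonia loss": 17.026549}

      def related_peaks(mzs, tol=0.002):
          # Report peak index pairs whose m/z difference matches a known relationship.
          hits = []
          for (i, a), (j, b) in combinations(enumerate(mzs), 2):
              d = abs(a - b)
              for name, delta in DIFFS.items():
                  if abs(d - delta) <= tol:
                      hits.append((i, j, name))
          return hits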

  6. The igmspec database of public spectra probing the intergalactic medium

    NASA Astrophysics Data System (ADS)

    Prochaska, J. X.

    2017-04-01

    We describe v02 of igmspec, a database of publicly available ultraviolet, optical, and near-infrared spectra that probe the intergalactic medium (IGM). This database, a child of the specdb repository in the specdb github organization, comprises 403 277 unique sources and 434 686 spectra obtained with the world's greatest observatories. All of these data are distributed in a single ≈ 25GB HDF5 file maintained at the University of California Observatories and the University of California, Santa Cruz. The specdb software package includes Python scripts and modules for searching the source catalog and spectral datasets, and software links to the linetools package for spectral analysis. The repository also includes software to generate private spectral datasets that are compliant with International Virtual Observatory Alliance (IVOA) protocols and a Python-based interface for IVOA Simple Spectral Access queries. Future versions of igmspec will ingest other sources (e.g. gamma-ray burst afterglows) and other surveys as they become publicly available. The overall goal is to include every spectrum that effectively probes the IGM. Future databases of specdb may include publicly available galaxy spectra (exgalspec) and published supernovae spectra (snspec). The community is encouraged to join the effort on github: https://github.com/specdb.

  7. Rocket experiments for spectral estimation of electron density fine structure in the auroral and equatorial ionosphere and preliminary results

    NASA Technical Reports Server (NTRS)

    Tomei, B. A.; Smith, L. G.

    1986-01-01

    Sounding rockets equipped to monitor electron density and its fine structure were launched into the auroral and equatorial ionosphere in 1980 and 1983, respectively. The measurement electronics are based on the Langmuir probe and are described in detail. An approach to the spectral analysis of the density irregularities is addressed and a software algorithm implementing the approach is given. Preliminary results of the analysis are presented.

  8. CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.

    PubMed

    Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi

    2015-10-26

    Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated general good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained of different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching. CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
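
    To illustrate the underlying idea, estimating a mean orientation from the 2-D Fourier power spectrum, a compact sketch is shown below. It uses a power-weighted circular mean over doubled angles and is a simplification for orientation only, not the CytoSpectre implementation.

      import numpy as np

      def dominant_orientation(image):
          # The power spectrum of an oriented texture is elongated perpendicular to the
          # structures, so the structure orientation is the spectral orientation + 90 deg.
          img = np.asarray(image, float)
          img = img - img.mean()
          power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
          ny, nx = img.shape
          fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
          fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
          angle = np.arctan2(fy, fx)                     # angle of each spectral sample
          power[ny // 2, nx // 2] = 0.0                  # ignore the DC term
          # Power-weighted circular mean over orientations (period pi, hence doubled angles).
          c = (power * np.cos(2 * angle)).sum()
          s = (power * np.sin(2 * angle)).sum()
          spectral_angle = 0.5 * np.arctan2(s, c)
          return spectral_angle + np.pi / 2              # rotate back to structure orientation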

  9. Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics

    DOE PAGES

    Scott, S. D.; Mumgaard, R. T.

    2016-07-20

    A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (MSE) diagnostics. The software supports multi-spectral line-polarization MSE diagnostics which simultaneously measure emission at the MSE σ and π lines as well as at two "background" wavelengths that are displaced from the MSE spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the MSE photo-elastic modulators (PEMs) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to PEM retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the MSE diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.

  10. Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, S. D.; Mumgaard, R. T.

    A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (MSE) diagnostics. The software supports multi-spectral line-polarization MSE diagnostics which simultaneously measure emission at the MSE σ and π lines as well as at two "background" wavelengths that are displaced from the MSE spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the MSE photo-elastic modulators (PEMs) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to PEM retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the MSE diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.

  11. Automatic analysis of nuclear-magnetic-resonance-spectroscopy clinical research data

    NASA Astrophysics Data System (ADS)

    Scott, Katherine N.; Wilson, David C.; Bruner, Angela P.; Lyles, Teresa A.; Underhill, Brandon; Geiser, Edward A.; Ballinger, J. Ray; Scott, James D.; Stopka, Christine B.

    1998-03-01

    A major problem of P-31 nuclear magnetic resonance spectroscopy (MRS) in vivo applications is that when large data sets are acquired, the time invested in data reduction and analysis with currently available technologies may totally overshadow the time required for data acquisition. An example is our MRS monitoring of exercise therapy for patients with peripheral vascular disease. In these studies, the spectral acquisition requires 90 minutes per patient study, whereas data analysis and reduction requires 6-8 hours. Our laboratory currently uses the proprietary software SA/GE developed by General Electric; other software packages have similar limitations. When data analysis takes this long, the researcher does not have the rapid feedback required to ascertain the quality of the data acquired or the result of the study. This is highly undesirable even in a research environment, but becomes intolerable in the clinical setting. The purpose of this report is to outline progress towards the development of an automated method for eliminating the spectral-analysis burden on the researcher working in the clinical setting.

  12. Multispectral scanner system parameter study and analysis software system description, volume 2

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.

    1978-01-01

    The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), the flexibility and versatility of which were superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators which evaluated the system performance. The spatial path consisted of satellite and/or aircraft data, a data correlation analyzer, the scanner IFOV, and a random noise model. The output of the spatial path was fed into the analytic classification and accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation. The output of the spectral path was fed into the stratified posterior performance estimator.

  13. Determining the von Mises stress power spectral density for frequency domain fatigue analysis including out-of-phase stress components

    NASA Astrophysics Data System (ADS)

    Bonte, M. H. A.; de Boer, A.; Liebregts, R.

    2007-04-01

    This paper provides a new formula to take into account phase differences in the determination of an equivalent von Mises stress power spectral density (PSD) from multiple random inputs. The obtained von Mises PSD can subsequently be used for fatigue analysis. The formula was derived for use in the commercial vehicle business and was implemented in combination with Finite Element software to predict and analyse fatigue failure in the frequency domain.
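
    The paper's new formula is not reproduced here, but the widely used trace formulation (after Preumont and co-workers) shows how the cross-spectral terms carry the phase information between stress components; the sketch below assumes a plane stress state [sigma_x, sigma_y, tau_xy] and is offered only for orientation.

      import numpy as np

      # Von Mises quadratic operator for a plane stress vector [s_x, s_y, t_xy].
      Q = np.array([[ 1.0, -0.5, 0.0],
                    [-0.5,  1.0, 0.0],
                    [ 0.0,  0.0, 3.0]])

      def von_mises_psd(S):
          # S: (n_freq, 3, 3) complex cross-PSD matrix of the stress components per frequency.
          # The off-diagonal (cross) spectra encode the phase differences between components.
          return np.real(np.einsum("ij,fji->f", Q, S))   # trace(Q @ S(f)) at each frequency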

  14. Comparison of existing digital image analysis systems for the analysis of Thematic Mapper data

    NASA Technical Reports Server (NTRS)

    Likens, W. C.; Wrigley, R. C.

    1984-01-01

    Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.

  15. Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists.

  16. Validation of luminescent source reconstruction using spectrally resolved bioluminescence images

    NASA Astrophysics Data System (ADS)

    Virostko, John M.; Powers, Alvin C.; Jansen, E. D.

    2008-02-01

    This study examines the accuracy of the Living Image® Software 3D Analysis Package (Xenogen, Alameda, CA) in reconstruction of light source depth and intensity. Constant intensity light sources were placed in an optically homogeneous medium (chicken breast). Spectrally filtered images were taken at 560, 580, 600, 620, 640, and 660 nanometers. The Living Image® Software 3D Analysis Package was employed to reconstruct source depth and intensity using these spectrally filtered images. For sources shallower than the mean free path of light there was proportionally higher inaccuracy in reconstruction. For sources deeper than the mean free path, the average error in depth and intensity reconstruction was less than 4% and 12%, respectively. The ability to distinguish multiple sources decreased with increasing source depth and typically required a spatial separation of twice the depth. The constant intensity light sources were also implanted in mice to examine the effect of optical inhomogeneity. The reconstruction accuracy suffered in inhomogeneous tissue with accuracy influenced by the choice of optical properties used in reconstruction.

  17. ORBS: A reduction software for SITELLE and SpiOMM data

    NASA Astrophysics Data System (ADS)

    Martin, Thomas

    2014-09-01

    ORBS merges, corrects, transforms and calibrates interferometric data cubes and produces a spectral cube of the observed region for analysis. It is fully automatic data reduction software for use with SITELLE (installed at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont Mégantic); these imaging Fourier transform spectrometers obtain a hyperspectral data cube which samples a 12 arc-minute field of view into 4 million visible spectra. ORBS is highly parallelized; its core classes (ORB) have been designed to be used in a suite of software packages for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS).

  18. BATSE spectroscopy analysis system

    NASA Technical Reports Server (NTRS)

    Schaefer, Bradley E.; Bansal, Sandhia; Basu, Anju; Brisco, Phil; Cline, Thomas L.; Friend, Elliott; Laubenthal, Nancy; Panduranga, E. S.; Parkar, Nuru; Rust, Brad

    1992-01-01

    The Burst and Transient Source Experiment (BATSE) Spectroscopy Analysis System (BSAS) is the software system which is the primary tool for the analysis of spectral data from BATSE. As such, Guest Investigators and the community as a whole need to know its basic properties and characteristics. Described here are the characteristics of the BATSE spectroscopy detectors and the BSAS.

  19. A CLOUDY/XSPEC Interface

    NASA Technical Reports Server (NTRS)

    Porter, R. L.; Ferland, G. J.; Kraemer, S. B.; Armentrout, B. K.; Arnaud, K. A.; Turner, T. J.

    2007-01-01

    We discuss new functionality of the spectral simulation code CLOUDY which allows the user to calculate grids with one or more initial parameters varied and formats the predicted spectra in the standard FITS format. These files can then be imported into the x-ray spectral analysis software XSPEC and used as theoretical models for observations. We present and verify a test case. Finally, we consider a few observations and discuss our results.
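
    XSPEC reads such FITS table models through its standard atable component. A minimal PyXspec sketch is shown below; the file names cloudy_grid.fits and source.pha are placeholders, and the model expression is only one example of how a CLOUDY grid might be combined with absorption.

      # Minimal PyXspec sketch: fit a CLOUDY-generated FITS table model to a spectrum.
      # "cloudy_grid.fits" and "source.pha" are placeholder file names.
      import xspec

      xspec.AllData("source.pha")                            # load the observed spectrum
      model = xspec.Model("phabs*atable{cloudy_grid.fits}")  # absorbed table model
      xspec.Fit.statMethod = "cstat"
      xspec.Fit.perform()
      model.show()                                           # print best-fit parameters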

  20. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676

  1. [Application of AOTF in spectral analysis. 1. Hardware and software designs for the self-constructed visible AOTF spectrophotometer].

    PubMed

    He, Jia-yao; Peng, Rong-fei; Zhang, Zhan-xia

    2002-02-01

    A self-constructed visible spectrophotometer using an acousto-optic tunable filter (AOTF) as the dispersing element is described. Two different AOTFs (one from The Institute for Silicate (Shanghai, China) and the other from Brimrose (USA)) are tested. The software, written in Visual C++ and operated on a Windows 98 platform, is an application with a dual database and multiple windows. Four independent windows, namely scanning, quantitative, calibration and result, are incorporated. The Fourier self-deconvolution algorithm is also incorporated to improve the spectral resolution. The wavelengths are calibrated using the polynomial curve fitting method. The spectra and calibration curves of soluble aniline blue and phenol red are presented to show the feasibility of the constructed spectrophotometer.
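
    The polynomial wavelength calibration mentioned above can be sketched in a few lines; the calibration pairs below are invented for illustration only.

      import numpy as np

      # Hypothetical calibration points: AOTF RF drive frequency (MHz) versus the
      # known wavelengths (nm) of calibration lamp lines.
      rf_mhz     = np.array([ 60.0,  75.0,  90.0, 105.0, 120.0])
      wavelength = np.array([700.0, 620.0, 560.0, 510.0, 470.0])

      # Fit a low-order polynomial mapping drive frequency to wavelength.
      coeffs = np.polyfit(rf_mhz, wavelength, deg=3)
      calib = np.poly1d(coeffs)

      print(calib(82.5))  # predicted wavelength (nm) for an arbitrary RF setting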

  2. The measurement and analysis of leaf spectral reflectance of two stands of loblolly pine populations

    NASA Technical Reports Server (NTRS)

    Paul, Anthony D.

    1993-01-01

    My research was conducted under the mentorship of Dr. Jeff Luvall. I worked at Marshall from June 1 through August 6, 1993. My proposal is titled 'The Measurement and Analysis of Leaf Spectral Reflectance of Two Stands of Loblolly Pine Populations.' The populations for this study were chosen from a larger population of 31 families managed by the International Forest Seed Company, Odenville, Alabama. The technology for mobile ground-based spectral detection is new, and therefore the majority of the time, June 2 through July 9, was spent learning the techniques of the Spectrometer 2 spectroradiometer used in gathering spectral information. The activities included in the learning process were as follows: calibration of the equipment, programming the associated computer for data management, operation of the spectral devices, and input and output of data. From July 12 through August 3 the time was spent learning the 'STATGRAP' computer software. This software will be used in the analysis of the data retrieved by the Spectrometer 2 spectroradiometer. Dr. Greg Carter, at Stennis, a colleague of Dr. Luvall, has been conducting similar work with different instruments and procedures and has agreed to host us for a training session on data gathering and analysis. This visit, which had previously been planned for July 9, 1993, but had to be postponed because of schedule conflicts, is now confirmed for August 18-22, 1993. This trip to Stennis will provide the knowledge for conducting the field operations in my study, i.e., gathering of data and file conversions.

  3. Performance enhancement of optical code-division multiple-access systems using transposed modified Walsh code

    NASA Astrophysics Data System (ADS)

    Sikder, Somali; Ghosh, Shila

    2018-02-01

    This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.

  4. Advanced processing and simulation of MRS data using the FID appliance (FID-A)-An open source, MATLAB-based toolkit.

    PubMed

    Simpson, Robin; Devenyi, Gabriel A; Jezzard, Peter; Hennessy, T Jay; Near, Jamie

    2017-01-01

    To introduce a new toolkit for simulation and processing of magnetic resonance spectroscopy (MRS) data, and to demonstrate some of its novel features. The FID appliance (FID-A) is an open-source, MATLAB-based software toolkit for simulation and processing of MRS data. The software is designed specifically for processing data with multiple dimensions (eg, multiple radiofrequency channels, averages, spectral editing dimensions). It is equipped with functions for importing data in the formats of most major MRI vendors (eg, Siemens, Philips, GE, Agilent) and for exporting data into the formats of several common processing software packages (eg, LCModel, jMRUI, Tarquin). This paper introduces the FID-A software toolkit and uses examples to demonstrate its novel features, namely 1) the use of a spectral registration algorithm to carry out useful processing routines automatically, 2) automatic detection and removal of motion-corrupted scans, and 3) the ability to perform several major aspects of the MRS computational workflow from a single piece of software. This latter feature is illustrated through both high-level processing of in vivo GABA-edited MEGA-PRESS MRS data and detailed quantum mechanical simulations to generate an accurate LCModel basis set for analysis of the same data. All of the described processing steps resulted in a marked improvement in spectral quality compared with unprocessed data. Fitting of MEGA-PRESS data using a customized basis set resulted in improved fitting accuracy compared with a generic MEGA-PRESS basis set. The FID-A software toolkit enables high-level processing of MRS data and accurate simulation of in vivo MRS experiments. Magn Reson Med 77:23-33, 2017. © 2015 Wiley Periodicals, Inc.
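
    The spectral registration idea is to align each transient to a reference by fitting a small frequency and phase offset in the time domain before averaging. Below is a minimal generic sketch, not the FID-A implementation; it assumes complex FIDs sampled with a common dwell time.

      import numpy as np
      from scipy.optimize import least_squares

      def register_transient(fid, ref, dwell_time):
          """Align a complex FID to a reference by fitting a frequency (Hz) and phase (rad) shift."""
          t = np.arange(fid.size) * dwell_time

          def residual(params):
              f, phi = params
              shifted = fid * np.exp(1j * (2 * np.pi * f * t + phi))
              diff = shifted - ref
              return np.concatenate([diff.real, diff.imag])

          fit = least_squares(residual, x0=[0.0, 0.0])
          f, phi = fit.x
          return fid * np.exp(1j * (2 * np.pi * f * t + phi)), f, phi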

  5. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry

    DOE PAGES

    Van Berkel, Gary J.; Kertesz, Vilmos

    2016-11-15

    An “Open Access”-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. As a result, using the developed software platform near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching.
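
    The spectral contrast angle treats two spectra, binned onto a common m/z axis, as vectors and measures the angle between them; smaller angles indicate more similar spectra. A brief sketch:

      import numpy as np

      def spectral_contrast_angle(a, b):
          """Angle (radians) between two intensity vectors binned on a common m/z axis."""
          a = np.asarray(a, dtype=float)
          b = np.asarray(b, dtype=float)
          cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return np.arccos(np.clip(cos_theta, -1.0, 1.0))

      # Classification assigns an unknown spectrum to the library entry with the smallest angle.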

  6. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Berkel, Gary J.; Kertesz, Vilmos

    An “Open Access”-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. As a result, using the developed software platform near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching.

  7. The US Geological Survey, digital spectral reflectance library: version 1: 0.2 to 3.0 microns

    NASA Technical Reports Server (NTRS)

    Clark, Roger N.; Swayze, Gregg A.; King, Trude V. V.; Gallagher, Andrea J.; Calvin, Wendy M.

    1993-01-01

    We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 500 spectra of 447 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 microns. The spectral resolution (Full Width Half Maximum) of the reflectance data is less than or equal to 4 nm in the visible (0.2-0.8 microns) and less than or equal to 10 nm in the NIR (0.8-2.35 microns). All spectra were corrected to absolute reflectance using an NBS Halon standard. Library management software lets users search on parameters (e.g. chemical formulae, chemical analyses, purity of samples, mineral groups, etc.) as well as spectral features. Minerals from sulfide, oxide, hydroxide, halide, carbonate, nitrate, borate, phosphate, and silicate groups are represented. X-ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, kaolinite crystallinity series, kaolinite-smectite series, zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses.

  8. Yaxx: Yet another X-ray extractor

    NASA Astrophysics Data System (ADS)

    Aldcroft, Tom

    2013-06-01

    Yaxx is a Perl script that facilitates batch data processing using Perl open source software and commonly available software such as CIAO/Sherpa, S-lang, SAS, and FTOOLS. For Chandra and XMM analysis it includes automated spectral extraction, fitting, and report generation. Yaxx can be run without climbing an extensive learning curve; even so, yaxx is highly configurable and can be customized to support complex analysis. yaxx uses template files and takes full advantage of the unique Sherpa / S-lang environment to make much of the processing user configurable. Although originally developed with an emphasis on X-ray data analysis, yaxx evolved to be a general-purpose pipeline scripting package.

  9. Development of Data Processing Software for NBI Spectroscopic Analysis System

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong

    2015-04-01

    A set of data processing software for NBI spectroscopic data is presented in this paper. For more systematic management and querying, these data are managed uniformly by the NBI data server. The data processing software offers functions for uploading original and analytic beam spectral data to the data server manually and automatically, querying and downloading all the NBI data, and handling local LZO data. The software suite is composed of a server program and a client program. The server software is written in C/C++ under a CentOS development environment. The client software is developed on a VC 6.0 platform, which offers a convenient operator interface. Network communication between the server and the client is based on TCP. With the help of this software, the NBI spectroscopic analysis system achieves unattended automatic operation, and the clear interface also makes it much more convenient to provide beam intensity distribution data and beam power data to operators for operational decision-making. supported by National Natural Science Foundation of China (No. 11075183), the Chinese Academy of Sciences Knowledge Innovation

  10. Lessons Learned From Developing A Streaming Data Framework for Scientific Analysis

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Allan, Mark; Curry, Charles

    2003-01-01

    We describe the development and usage of a streaming data analysis software framework. The framework is used for three different applications: Earth science hyper-spectral imaging analysis, Electromyograph pattern detection, and Electroencephalogram state determination. In each application the framework was used to answer a series of science questions which evolved with each subsequent answer. This evolution is summarized in the form of lessons learned.

  11. An MS-DOS-based program for analyzing plutonium gamma-ray spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W.D.; Buckley, W.M.

    1989-09-07

    A plutonium gamma-ray analysis system that operates on MS-DOS-based computers has been developed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra for plutonium isotopics. The program, titled IAEAPU, consists of three separate applications: a data-transfer application for transferring spectral data from a CICERO multichannel analyzer to a binary data file, a data-analysis application to analyze plutonium gamma-ray spectra for plutonium isotopic ratios and weight percents of total plutonium, and a data-quality assurance application to check spectral data for proper data-acquisition setup and performance. Volume 3 contains the software listings for these applications.

  12. Computational mass spectrometry for small molecules

    PubMed Central

    2013-01-01

    The identification of small molecules from mass spectrometry (MS) data remains a major challenge in the interpretation of MS data. This review covers the computational aspects of identifying small molecules, from the identification of a compound searching a reference spectral library, to the structural elucidation of unknowns. In detail, we describe the basic principles and pitfalls of searching mass spectral reference libraries. Determining the molecular formula of the compound can serve as a basis for subsequent structural elucidation; consequently, we cover different methods for molecular formula identification, focussing on isotope pattern analysis. We then discuss automated methods to deal with mass spectra of compounds that are not present in spectral libraries, and provide an insight into de novo analysis of fragmentation spectra using fragmentation trees. In addition, this review shortly covers the reconstruction of metabolic networks using MS data. Finally, we list available software for different steps of the analysis pipeline. PMID:23453222
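
    As a toy illustration of molecular formula determination from an accurate monoisotopic mass, the sketch below brute-forces CHNOS compositions within a ppm tolerance; practical tools add isotope-pattern scoring and chemical plausibility filters on top of this.

      from itertools import product

      # Monoisotopic masses of the elements considered.
      MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052,
              "O": 15.9949146221, "S": 31.97207069}

      def candidate_formulas(target_mass, tol_ppm=5.0, max_counts=(40, 80, 10, 20, 4)):
          """Enumerate CHNOS formulas whose monoisotopic mass matches target_mass."""
          tol = target_mass * tol_ppm * 1e-6
          hits = []
          for c, h, n, o, s in product(*(range(m + 1) for m in max_counts)):
              mass = (c * MASS["C"] + h * MASS["H"] + n * MASS["N"]
                      + o * MASS["O"] + s * MASS["S"])
              if abs(mass - target_mass) <= tol:
                  hits.append((f"C{c}H{h}N{n}O{o}S{s}", mass))
          return hits

      print(candidate_formulas(180.06339))  # includes C6H12O6 (glucose)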

  13. Spectral Analysis Tool 6.2 for Windows

    NASA Technical Reports Server (NTRS)

    Morgan, Feiming; Sue, Miles; Peng, Ted; Tan, Harry; Liang, Robert; Kinman, Peter

    2006-01-01

    Spectral Analysis Tool 6.2 is the latest version of a computer program that assists in analysis of interference between radio signals of the types most commonly used in Earth/spacecraft radio communications. [An earlier version was reported in Software for Analyzing Earth/Spacecraft Radio Interference (NPO-20422), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 52.] SAT 6.2 calculates signal spectra, bandwidths, and interference effects for several families of modulation schemes. Several types of filters can be modeled, and the program calculates and displays signal spectra after filtering by any of the modeled filters. The program accommodates two simultaneous signals: a desired signal and an interferer. The interference-to-signal power ratio can be calculated for the filtered desired and interfering signals. Bandwidth-occupancy and link-budget calculators are included for the user's convenience. SAT 6.2 has a new software structure and provides a new user interface that is both intuitive and convenient. SAT 6.2 incorporates multi-tasking, multi-threaded execution, virtual memory management, and a dynamic link library. SAT 6.2 is designed for use on 32-bit computers employing Microsoft Windows operating systems.

  14. Spectral pattern classification in lidar data for rock identification in outcrops.

    PubMed

    Campos Inocencio, Leonardo; Veronez, Mauricio Roberto; Wohnrath Tognoli, Francisco Manoel; de Souza, Marcelo Kehl; da Silva, Reginaldo Macedônio; Gonzaga, Luiz; Blum Silveira, César Leonardo

    2014-01-01

    The present study aimed to develop and implement a method for detection and classification of spectral signatures in point clouds obtained from a terrestrial laser scanner, in order to identify the presence of different rocks in outcrops and to generate a digital outcrop model. To achieve this objective, software based on cluster analysis, named K-Clouds, was created through a partnership between UNISINOS and the company V3D. The tool first analyzes and interprets a histogram of the point cloud of the outcrop; the user then indicates a number of classes, which is used to process the intensity return values. This classified information can then be interpreted by geologists to better understand and identify the rocks present in the outcrop. Beyond the detection of different rocks, the method was also able to detect small changes in the physical-chemical characteristics of the rocks, such as those caused by weathering or compositional changes.
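
    A minimal sketch of the underlying idea, not of K-Clouds itself: cluster the one-dimensional intensity return values into a user-chosen number of classes with k-means.

      import numpy as np
      from sklearn.cluster import KMeans

      def classify_intensities(intensity, n_classes):
          """Cluster 1-D lidar intensity returns into n_classes groups."""
          intensity = np.asarray(intensity, dtype=float).reshape(-1, 1)
          km = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit(intensity)
          return km.labels_, km.cluster_centers_.ravel()

      # Example: three synthetic intensity populations, e.g. two lithologies plus weathered surfaces.
      rng = np.random.default_rng(0)
      fake = np.concatenate([rng.normal(m, 5.0, 1000) for m in (40.0, 90.0, 150.0)])
      labels, centers = classify_intensities(fake, 3)
      print(np.sort(centers))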

  15. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos

    2017-02-15

    An "Open Access"-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in case of the inks and oils, respectively, using leave-one-out cross-validation. This work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  16. 3-D interactive visualisation tools for H I spectral line imaging

    NASA Astrophysics Data System (ADS)

    van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.

    2017-06-01

    Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.

  17. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely at the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of the theory of spectroscopy and imaging spectroscopy through 'hands-on' activity. It is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool can visualize spectral signatures from the USGS spectral library as well as additional spectra collected at the EPIF, such as those of dunes in southern Israel and Turkmenistan. For researchers and educators, the tool allows locally collected samples to be loaded for further analysis.
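
    Of the listed algorithms, linear unmixing models each measured spectrum as a non-negative combination of endmember spectra. A minimal sketch using non-negative least squares is shown below; the tool's exact implementation is not described, so this is only illustrative.

      import numpy as np
      from scipy.optimize import nnls

      def unmix(spectrum, endmembers):
          """Estimate non-negative abundances so that endmembers @ x approximates spectrum.

          spectrum   : (n_bands,) measured reflectance
          endmembers : (n_bands, n_endmembers) library spectra as columns
          """
          abundances, residual = nnls(endmembers, spectrum)
          total = abundances.sum()
          # Optionally renormalize to fractional abundances that sum to one.
          return (abundances / total if total > 0 else abundances), residual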

  18. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques obtained over multiple spatial and spectral scales including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.

  19. Elemental misinterpretation in automated analysis of LIBS spectra.

    PubMed

    Hübert, Waldemar; Ankerhold, Georg

    2011-07-01

    In this work, the Stark effect is shown to be mainly responsible for incorrect elemental assignment by automated laser-induced breakdown spectroscopy (LIBS) software solutions. Due to broadening and shift of an elemental emission line affected by the Stark effect, its measured spectral position might interfere with the line positions of several other elements. The micro-plasma is generated by focusing a frequency-doubled 200 mJ pulsed Nd:YAG laser on an aluminum target, and additionally on a brass sample, in air at atmospheric pressure. After laser pulse excitation, we measured the temporal evolution of the Al(II) ion line at 281.6 nm (4s(1)S-3p(1)P) during the decay of the laser-induced plasma. Depending on laser pulse power, the center of the measured line is red-shifted by 130 pm (490 GHz) with respect to the exact line position. In this case, the well-known spectral line positions of two moderate and strong lines of other elements coincide with the actual shifted position of the Al(II) line. Consequently, time-resolved software analysis can lead to an elemental misinterpretation. To avoid misinterpretation of LIBS spectra in automated analysis software for a given LIBS system, we recommend using larger gate delays, incorporating Stark broadening parameters, and using a tolerance range that is non-symmetric around the measured line center. These suggestions may help to improve time-resolved LIBS software, promising a smaller probability of incorrect elemental identification and making LIBS more attractive for industrial applications.
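
    The reported shift can be checked directly from the relation delta_nu = c * delta_lambda / lambda^2, and the recommended non-symmetric tolerance can be expressed as a red-biased matching window. A short sketch follows; the threshold values are illustrative only.

      import numpy as np

      C = 2.998e8  # speed of light, m/s

      def shift_in_ghz(delta_lambda_pm, wavelength_nm):
          """Convert a wavelength shift (pm) at a given line (nm) into a frequency shift (GHz)."""
          dl = delta_lambda_pm * 1e-12
          lam = wavelength_nm * 1e-9
          return C * dl / lam**2 / 1e9

      print(shift_in_ghz(130, 281.6))  # ~490 GHz, matching the Al(II) example

      def line_matches(measured_nm, reference_nm, blue_tol_pm=20.0, red_tol_pm=150.0):
          """Asymmetric tolerance: allow a wider red-shifted window for Stark-shifted lines."""
          delta_pm = (measured_nm - reference_nm) * 1e3
          return -blue_tol_pm <= delta_pm <= red_tol_pm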

  20. Spectral reflectance properties (0.4-2.5 um) of secondary Fe-oxide, Fe-hydroxide, and Fe-sulfate-hydrate minerals associated with sulfide-bearing mine waste

    USGS Publications Warehouse

    Crowley, J.K.; Williams, D.E.; Hammarstrom, J.M.; Piatak, N.; Mars, J.C.; Chou, I-Ming

    2006-01-01

    Fifteen Fe-oxide, Fe-hydroxide, and Fe-sulphate-hydrate mineral species commonly associated with sulphide-bearing mine wastes were characterized using X-ray powder diffraction and scanning electron microscopy. Diffuse reflectance spectra of the samples show diagnostic absorption features related to electronic processes involving ferric and/or ferrous iron, and to vibrational processes involving water and hydroxyl ions. Such spectral features enable field- and remote-sensing-based studies of the mineral distributions. Because secondary minerals are sensitive indicators of pH, Eh, relative humidity, and other environmental conditions, spectral mapping of these minerals promises to have important applications in mine waste remediation studies. This report releases digital (ascii) spectra (spectral_data_files.zip) of the fifteen mineral samples to facilitate use of the data with spectral libraries and spectral analysis software. The spectral data are provided in a two-column format listing wavelength (in micrometers) and reflectance, respectively.
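
    Because each released spectrum is plain two-column ascii (wavelength in micrometers, reflectance), reading it into analysis software is straightforward; the file name below is a placeholder.

      import numpy as np
      import matplotlib.pyplot as plt

      # Placeholder file name; each released file lists wavelength (um) and reflectance.
      wavelength_um, reflectance = np.loadtxt("jarosite_sample.txt", unpack=True)

      plt.plot(wavelength_um, reflectance)
      plt.xlabel("Wavelength (micrometers)")
      plt.ylabel("Reflectance")
      plt.title("Secondary Fe-mineral reflectance spectrum")
      plt.show()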

  1. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of a threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign sources of naturally occurring radioactive material (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite the strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm, called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.
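
    As a schematic illustration of the spectral-comparison-ratio idea only (not the NSCRAD, GADRAS, or WAVRAD implementations), one can compare the fractions of counts falling in a few energy windows against their background expectations and flag large deviations.

      import numpy as np

      def window_ratio_anomaly(spectrum, background, windows, n_sigma=4.0):
          """Flag a spectrum whose inter-window count fractions deviate from background.

          spectrum, background : count arrays on the same energy binning
          windows              : list of (lo_bin, hi_bin) index pairs
          """
          s = np.array([spectrum[lo:hi].sum() for lo, hi in windows], dtype=float)
          b = np.array([background[lo:hi].sum() for lo, hi in windows], dtype=float)
          frac_s = s / s.sum()
          frac_b = b / b.sum()
          # Approximate Poisson uncertainty on the observed window fractions.
          sigma = np.sqrt(np.maximum(s, 1.0)) / s.sum()
          z = (frac_s - frac_b) / sigma
          return bool(np.any(np.abs(z) > n_sigma)), z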

  2. Sports Stars: Analyzing the Performance of Astronomers at Visualization-based Discovery

    NASA Astrophysics Data System (ADS)

    Fluke, C. J.; Parrington, L.; Hegarty, S.; MacMahon, C.; Morgan, S.; Hassan, A. H.; Kilborn, V. A.

    2017-05-01

    In this data-rich era of astronomy, there is a growing reliance on automated techniques to discover new knowledge. The role of the astronomer may change from being a discoverer to being a confirmer. But what do astronomers actually look at when they distinguish between “sources” and “noise?” What are the differences between novice and expert astronomers when it comes to visual-based discovery? Can we identify elite talent or coach astronomers to maximize their potential for discovery? By looking to the field of sports performance analysis, we consider an established, domain-wide approach, where the expertise of the viewer (i.e., a member of the coaching team) plays a crucial role in identifying and determining the subtle features of gameplay that provide a winning advantage. As an initial case study, we investigate whether the SportsCode performance analysis software can be used to understand and document how an experienced H I astronomer makes discoveries in spectral data cubes. We find that the process of timeline-based coding can be applied to spectral cube data by mapping spectral channels to frames within a movie. SportsCode provides a range of easy-to-use methods for annotation, including feature-based codes and labels, text annotations associated with codes, and image-based drawing. The outputs, including instance movies that are uniquely associated with coded events, provide the basis for a training program or team-based analysis that could be used in unison with discipline-specific analysis software. In this coordinated approach to visualization and analysis, SportsCode can act as a visual notebook, recording the insight and decisions in partnership with established analysis methods. Alternatively, in situ annotation and coding of features would be a valuable addition to existing and future visualization and analysis packages.

  3. A new scoring function for top-down spectral deconvolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kou, Qiang; Wu, Si; Liu, Xiaowen

    2014-12-18

    Background: Top-down mass spectrometry plays an important role in intact protein identification and characterization. Top-down mass spectra are more complex than bottom-up mass spectra because they often contain many isotopomer envelopes from highly charged ions, which may overlap with one another. As a result, spectral deconvolution, which converts a complex top-down mass spectrum into a monoisotopic mass list, is a key step in top-down spectral interpretation. Results: In this paper, we propose a new scoring function, L-score, for evaluating isotopomer envelopes. By combining L-score with MS-Deconv, a new software tool, MS-Deconv+, was developed for top-down spectral deconvolution. Experimental results showed that MS-Deconv+ outperformed existing software tools in top-down spectral deconvolution. Conclusions: L-score shows high discriminative ability in identification of isotopomer envelopes. Using L-score, MS-Deconv+ reports many correct monoisotopic masses missed by other software tools, which are valuable for proteoform identification and characterization.

  4. SandiaMRCR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-01-05

    SandiaMCR was developed to identify pure components and their concentrations from spectral data. This software efficiently implements multivariate curve resolution-alternating least squares (MCR-ALS), principal component analysis (PCA), and singular value decomposition (SVD). Version 3.37 also includes PARAFAC-ALS and Tucker-1 (for trilinear analysis) algorithms. The alternating least squares methods can be used to determine the composition with no, or incomplete, prior information on the constituents and their concentrations. It allows the specification of numerous preprocessing, initialization, data selection and compression options for the efficient processing of large data sets. The software includes numerous options, including the definition of equality and non-negativity constraints to realistically restrict the solution set, various normalization or weighting options based on the statistics of the data, several initialization choices, and data compression. The software has been designed to provide a practicing spectroscopist the tools required to routinely analyze data in a reasonable time and without requiring expert intervention.
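
    The core of MCR-ALS alternates two constrained least-squares solves so that the data matrix D (samples x channels) is approximated by C @ S.T with non-negative concentrations C and spectra S. A bare-bones sketch, not the SandiaMRCR implementation:

      import numpy as np
      from scipy.optimize import nnls

      def mcr_als(D, S0, n_iter=50):
          """Bare-bones MCR-ALS: D (samples x channels) ~ C @ S.T with C >= 0 and S >= 0.

          S0 : initial guess of the pure-component spectra, shape (channels, components).
          """
          S = S0.copy()
          for _ in range(n_iter):
              # Solve for concentrations, one sample (row of D) at a time.
              C = np.array([nnls(S, row)[0] for row in D])
              # Solve for spectra, one channel (column of D) at a time.
              S = np.array([nnls(C, col)[0] for col in D.T])
          return C, S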

  5. Multi-species Identification of Polymorphic Peptide Variants via Propagation in Spectral Networks*

    PubMed Central

    Bandeira, Nuno

    2016-01-01

    Peptide and protein identification remains challenging in organisms with poorly annotated or rapidly evolving genomes, as are commonly encountered in environmental or biofuels research. Such limitations render tandem mass spectrometry (MS/MS) database search algorithms ineffective as they lack corresponding sequences required for peptide-spectrum matching. We address this challenge with the spectral networks approach to (1) match spectra of orthologous peptides across multiple related species and then (2) propagate peptide annotations from identified to unidentified spectra. We here present algorithms to assess the statistical significance of spectral alignments (Align-GF), reduce the impurity in spectral networks, and accurately estimate the error rate in propagated identifications. Analyzing three related Cyanothece species, a model organism for biohydrogen production, spectral networks identified peptides from highly divergent sequences from networks with dozens of variant peptides, including thousands of peptides in species lacking a sequenced genome. Our analysis further detected the presence of many novel putative peptides even in genomically characterized species, thus suggesting the possibility of gaps in our understanding of their proteomic and genomic expression. A web-based pipeline for spectral networks analysis is available at http://proteomics.ucsd.edu/software. PMID:27609420

  6. SimPhospho: a software tool enabling confident phosphosite assignment.

    PubMed

    Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L

    2018-03-27

    Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with a goal to improve reliable phosphoproteome identification and reporting. The presented program can be easily used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with detailed description of data analysis using SimPhospho as well as test data can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.

  7. Directional Unfolded Source Term (DUST) for Compton Cameras.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  8. Android application and REST server system for quasar spectrum presentation and analysis

    NASA Astrophysics Data System (ADS)

    Wasiewicz, P.; Pietralik, K.; Hryniewicz, K.

    2017-08-01

    This paper describes the implementation of a system consisting of a mobile application and a RESTful-architecture server intended for the analysis and presentation of quasar spectra. It also describes the characteristics of quasars and their significance to the scientific community, the source used for acquiring the spectral data of astronomical objects, and the software solutions employed, and presents Cloud Computing aspects and various possible deployment configurations.

  9. Identification of triacylglycerol using automated annotation of high resolution multistage mass spectral trees.

    PubMed

    Wang, Xiupin; Peng, Qingzhi; Li, Peiwu; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen; Zhang, Liangxiao

    2016-10-12

    The high complexity of identifying non-target triacylglycerols (TAGs) is a major challenge in lipidomics analysis. To identify non-target TAGs, accurate MS(n) spectrometry, which generates so-called ion trees, is a powerful tool. In this paper, we present a technique for efficient structural elucidation of TAGs from MS(n) spectral trees produced by LTQ Orbitrap MS(n), implemented as an open source software package called TIT. The TIT software supports automatic annotation of non-target TAGs on MS(n) ion trees from a self-built fragment ion database. This database includes 19108 simulated TAG molecules from random combinations of fatty acids and 500582 corresponding self-built multistage fragment ions (MS ≤ 3). Our software identifies TAGs using a "stage-by-stage elimination" strategy. By utilizing the MS(1) accurate mass and the referenced Kendrick mass defect (RKMD), the TIT software can discriminate unique elemental composition candidates. The regiospecific isomers of fatty acyl chains are distinguished using MS(2) and MS(3) fragment spectra. We applied the algorithm to a selection of 45 TAG standards and demonstrated that the molecular ions could be 100% correctly assigned. The TIT software could therefore be applied to TAG identification in complex biological samples such as mouse plasma extracts. Copyright © 2016 Elsevier B.V. All rights reserved.
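
    The Kendrick mass defect underlying the RKMD filter can be computed directly. A short sketch on the CH2 scale is given below; the referenced variant additionally offsets the defect relative to a chosen reference lipid, and the example masses are purely illustrative.

      CH2_NOMINAL = 14.00000
      CH2_EXACT = 14.01565

      def kendrick_mass_defect(mass):
          """Kendrick mass defect on the CH2 scale for a measured monoisotopic mass."""
          kendrick_mass = mass * CH2_NOMINAL / CH2_EXACT
          return round(kendrick_mass) - kendrick_mass

      # Members of a homologous lipid series (differing only by CH2 units)
      # share nearly the same Kendrick mass defect.
      for mass in (848.7702, 862.7859, 876.8015):  # illustrative TAG-like masses
          print(mass, round(kendrick_mass_defect(mass), 4))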

  10. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  11. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation of complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with a better dynamic range.
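
    Length-normalized spectral counting of the kind described is often expressed as a normalized spectral abundance factor (NSAF); the sketch below shows that generic calculation, not necessarily freeQuant's exact formula.

      def nsaf(spectral_counts, lengths):
          """Normalized spectral abundance factor per protein.

          spectral_counts : dict mapping protein -> MS/MS spectral count
          lengths         : dict mapping protein -> sequence length (residues)
          """
          saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
          total = sum(saf.values())
          return {p: v / total for p, v in saf.items()}

      print(nsaf({"P1": 120, "P2": 30}, {"P1": 600, "P2": 150}))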

  12. The spectra program library: A PC based system for gamma-ray spectra analysis and INAA data reduction

    USGS Publications Warehouse

    Baedecker, P.A.; Grossman, J.N.

    1995-01-01

    A PC-based system has been developed for the analysis of gamma-ray spectra and for the complete reduction of data from INAA experiments, including software to average the results from multiple lines and multiple countings and to produce a final report of analysis. Graphics algorithms may be called for the analysis of complex spectral features, to compare the data from alternate photopeaks and to evaluate detector performance during a given counting cycle. A database of results for control samples can be used to prepare quality control charts to evaluate long-term precision and to search for systematic variations in data on reference samples as a function of time. The entire software library can be accessed through a user-friendly menu interface with internal help.

  13. An Excel‐based implementation of the spectral method of action potential alternans analysis

    PubMed Central

    Pearman, Charles M.

    2014-01-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
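
    In the spectral method, a beat-aligned series of an AP measurement (for example, the amplitude at a fixed repolarization phase) is Fourier transformed; alternans appears as power at 0.5 cycles/beat and is compared with an adjacent noise band to form a k-score. A minimal sketch, with an illustrative noise band, is shown below (it is not the Excel/VBA implementation described above).

      import numpy as np

      def alternans_kscore(beat_series, noise_band=(0.43, 0.46)):
          """Spectral-method alternans statistic for a beat-aligned series of AP measurements."""
          x = np.asarray(beat_series, dtype=float)
          x = x - x.mean()
          power = np.abs(np.fft.rfft(x)) ** 2 / x.size
          freq = np.fft.rfftfreq(x.size, d=1.0)  # cycles per beat
          alternans_power = power[np.argmin(np.abs(freq - 0.5))]
          noise = power[(freq >= noise_band[0]) & (freq <= noise_band[1])]
          k_score = (alternans_power - noise.mean()) / noise.std()
          return k_score, alternans_power - noise.mean()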

  14. Spectral and Spatial Coherent Emission of Thermal Radiation from Metal-Semiconductor Nanostructures

    DTIC Science & Technology

    2012-03-01

    Only fragments of this record survive: the work used the Rigorous Coupled Wave Analysis (RCWA) numerical technique and Computer Simulation Technology (CST) electromagnetic modeling software to model two structures; the remaining text consists of acknowledgement and table-of-contents fragments (e.g., Appendix B, Supplemental IR-VASE Measurements and Modeling).

  15. SaaS Platform for Time Series Data Handling

    NASA Astrophysics Data System (ADS)

    Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail

    2018-02-01

    The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.

  16. Evolutionary Computing Methods for Spectral Retrieval

    NASA Technical Reports Server (NTRS)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
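
    A stripped-down simulated annealing loop of the kind described is sketched below, minimizing a fitness function that measures the dissimilarity between an observed spectrum and the synthetic spectrum from a forward model; the forward model, step size, and cooling schedule are placeholders.

      import numpy as np

      rng = np.random.default_rng(1)

      def fitness(params, observed, forward_model):
          """Sum-of-squares dissimilarity between observed and synthetic spectra."""
          return np.sum((observed - forward_model(params)) ** 2)

      def simulated_annealing(observed, forward_model, x0, steps=5000, t0=1.0, scale=0.05):
          x = np.array(x0, dtype=float)
          fx = fitness(x, observed, forward_model)
          best, fbest = x.copy(), fx
          for i in range(steps):
              temperature = t0 * (1.0 - i / steps) + 1e-9
              candidate = x + rng.normal(0.0, scale, size=x.size)
              fc = fitness(candidate, observed, forward_model)
              # Accept downhill moves always, uphill moves with Boltzmann probability.
              if fc < fx or rng.random() < np.exp((fx - fc) / temperature):
                  x, fx = candidate, fc
                  if fx < fbest:
                      best, fbest = x.copy(), fx
          return best, fbest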

  17. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  18. Ground-Based Correction of Remote-Sensing Spectral Imagery

    NASA Technical Reports Server (NTRS)

    Alder-Golden, Steven M.; Rochford, Peter; Matthew, Michael; Berk, Alexander

    2007-01-01

    Software has been developed for an improved method of correcting for the atmospheric optical effects (primarily, effects of aerosols and water vapor) in spectral images of the surface of the Earth acquired by airborne and spaceborne remote-sensing instruments. In this method, the variables needed for the corrections are extracted from the readings of a radiometer located on the ground in the vicinity of the scene of interest. The software includes algorithms that analyze measurement data acquired from a shadow-band radiometer. These algorithms are based on a prior radiation transport software model, called MODTRAN, that has been developed through several versions up to what are now known as MODTRAN4 and MODTRAN5. These components have been integrated with a user-friendly Interactive Data Language (IDL) front end and an advanced version of MODTRAN4. Software tools for handling general data formats, performing a Langley-type calibration, and generating an output file of retrieved atmospheric parameters for use in another atmospheric-correction computer program known as FLAASH have also been incorporated into the present software. Concomitantly with the software described thus far, a version of FLAASH has been developed that utilizes the retrieved atmospheric parameters to process spectral image data.
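
    A Langley-type calibration fits ln(V) against airmass over a clear, stable period; the intercept gives the top-of-atmosphere signal V0 and the slope gives the optical depth. A minimal sketch with invented readings:

      import numpy as np

      def langley_fit(airmass, voltage):
          """Fit ln(V) = ln(V0) - tau * m and return (V0, tau)."""
          slope, intercept = np.polyfit(np.asarray(airmass, dtype=float),
                                        np.log(np.asarray(voltage, dtype=float)), 1)
          return np.exp(intercept), -slope

      # Illustrative clear-morning readings at several solar airmasses.
      m = np.array([1.5, 2.0, 2.5, 3.0, 4.0])
      v = 1.8 * np.exp(-0.21 * m)
      print(langley_fit(m, v))  # approximately (1.8, 0.21)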

  19. Multi range spectral feature fitting for hyperspectral imagery in extracting oilseed rape planting area

    NASA Astrophysics Data System (ADS)

    Pan, Zhuokun; Huang, Jingfeng; Wang, Fumin

    2013-12-01

    Spectral feature fitting (SFF) is a commonly used strategy for hyperspectral imagery analysis to discriminate ground targets. Compared to other image analysis techniques, SFF does not always secure higher accuracy in extracting image information. Multi range spectral feature fitting (MRSFF) from ENVI software allows the user to focus on the spectral features of interest to yield better performance. Thus, spectral wavelength ranges and their corresponding weights must be determined. The purpose of this article is to demonstrate the performance of MRSFF in oilseed rape planting area extraction. A practical method for defining the weighted values, the variance coefficient weight method, was proposed to establish the weighting criterion. Oilseed rape field canopy spectra from the whole growth stage were collected prior to investigating its phenological varieties; oilseed rape endmember spectra were extracted from the Hyperion image as identifying samples to be used in analyzing the oilseed rape field. Wavelength range divisions were determined by the difference between field-measured spectra and image spectra, and image spectral variance coefficient weights for each wavelength range were calculated corresponding to field-measured spectra from the closest date. By using MRSFF, wavelength ranges were classified to characterize the target's spectral features without compromising the integrity of the spectral profile. The analysis was substantially successful in extracting oilseed rape planting areas (RMSE ≤ 0.06), and the RMSE histogram indicated a superior result compared to a conventional SFF. Accuracy assessment was based on the mapping result compared with spectral angle mapping (SAM) and the normalized difference vegetation index (NDVI). The MRSFF yielded a robust, convincing result and, therefore, may further the use of hyperspectral imagery in precision agriculture.
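
    The weighted multi-range fitting idea can be illustrated with the simplified sketch below: each user-chosen wavelength window gets its own least-squares scaling of the reference spectrum, and the per-window RMSE values are combined through caller-supplied weights. The weights here are stand-ins for the variance-coefficient weights; the paper's exact formulation is not reproduced.

```python
import numpy as np

def range_fit_rmse(image_spectrum, reference_spectrum, ranges, weights):
    """Weighted multi-range spectral feature fit (simplified illustration).

    ranges  -- list of (start, stop) index pairs defining wavelength windows
    weights -- one weight per window (e.g. variance-coefficient weights)
    Returns the weighted RMSE between the scaled reference and the pixel spectrum.
    """
    rmses = []
    for start, stop in ranges:
        ref = reference_spectrum[start:stop]
        obs = image_spectrum[start:stop]
        scale = np.dot(ref, obs) / np.dot(ref, ref)   # least-squares scale factor
        rmses.append(np.sqrt(np.mean((obs - scale * ref) ** 2)))
    weights = np.asarray(weights, dtype=float)
    return float(np.dot(weights, rmses) / weights.sum())
```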

  20. From ultraviolet to Prussian blue: a spectral response for the cyanotype process and a safe educational activity to explain UV exposure for all ages.

    PubMed

    Turner, J; Parisi, A V; Downs, N; Lynch, M

    2014-12-01

    Engaging students and the public in understanding UV radiation and its effects is achievable using a real-time experiment that incorporates blueprint paper, an "educational toy" providing a safe and easy demonstration of the cyanotype chemical process. The cyanotype process is driven by exposure to UV radiation. The blueprint paper was investigated not only as a way of engaging public outreach discussions about UV radiation, but also as a practical introduction to the measurement of UV radiation exposure and, as a consequence, to digital image analysis. Print methods, dose response, spectral response and dark response were investigated. Two methods of image analysis for dose-response calculation are provided using easy-to-access software, and two methods of pixel-count analysis were used to determine spectral response characteristics. Variation in the manufacture of the blueprint paper product introduces some variance between measurements. Most importantly, as a result of this investigation, a preliminary spectral response range for the radiation required to produce the cyanotype reaction, until now unknown, is presented here.

  1. On-Site Inspection RadioIsotopic Spectroscopy (Osiris) System Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caffrey, Gus J.; Egger, Ann E.; Krebs, Kenneth M.

    2015-09-01

    We have designed and tested hardware and software for the acquisition and analysis of high-resolution gamma-ray spectra during on-site inspections under the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The On-Site Inspection RadioIsotopic Spectroscopy—Osiris—software filters the spectral data to display only radioisotopic information relevant to CTBT on-site inspections, e.g., 132I. A set of over 100 fission-product spectra was employed for Osiris testing. These spectra were measured, where possible, or generated by modeling. The synthetic test spectral compositions include non-nuclear-explosion scenarios, e.g., a severe nuclear reactor accident, and nuclear-explosion scenarios such as a vented underground nuclear test. Comparing its computer-based analyses to expert visual analyses of the test spectra, Osiris correctly identifies CTBT-relevant fission product isotopes at the 95% level or better. The Osiris gamma-ray spectrometer is a mechanically-cooled, battery-powered ORTEC Transpec-100, chosen to avoid the need for liquid nitrogen during on-site inspections. The spectrometer was used successfully during the recent 2014 CTBT Integrated Field Exercise in Jordan. The spectrometer is controlled and the spectral data analyzed by a Panasonic Toughbook notebook computer. To date, software development has been the main focus of the Osiris project. In FY2016-17, we plan to modify the Osiris hardware, integrate the Osiris software and hardware, and conduct rigorous field tests to ensure that the Osiris system will function correctly during CTBT on-site inspections. The planned development will raise Osiris to technology readiness level TRL-8, transfer the Osiris technology to a commercial manufacturer, and demonstrate Osiris to potential CTBT on-site inspectors.

  2. Analytical Utility of Mass Spectral Binning in Proteomic Experiments by SPectral Immonium Ion Detection (SPIID)*

    PubMed Central

    Kelstrup, Christian D.; Frese, Christian; Heck, Albert J. R.; Olsen, Jesper V.; Nielsen, Michael L.

    2014-01-01

    Unambiguous identification of tandem mass spectra is a cornerstone in mass-spectrometry-based proteomics. As the study of post-translational modifications (PTMs) by means of shotgun proteomics progresses in depth and coverage, the ability to correctly identify PTM-bearing peptides is essential, increasing the demand for advanced data interpretation. Several PTMs are known to generate unique fragment ions during tandem mass spectrometry, the so-called diagnostic ions, which unequivocally identify a given mass spectrum as related to a specific PTM. Although such ions offer tremendous analytical advantages, algorithms to decipher MS/MS spectra for the presence of diagnostic ions in an unbiased manner are currently lacking. Here, we present a systematic spectral-pattern-based approach for the discovery of diagnostic ions and new fragmentation mechanisms in shotgun proteomics datasets. The developed software tool is designed to analyze large sets of high-resolution peptide fragmentation spectra independent of the fragmentation method, instrument type, or protease employed. To benchmark the software tool, we analyzed large higher-energy collisional activation dissociation datasets of samples containing phosphorylation, ubiquitylation, SUMOylation, formylation, and lysine acetylation. Using the developed software tool, we were able to identify known diagnostic ions by comparing histograms of modified and unmodified peptide spectra. Because the investigated tandem mass spectra data were acquired with high mass accuracy, unambiguous interpretation and determination of the chemical composition for the majority of detected fragment ions was feasible. Collectively we present a freely available software tool that allows for comprehensive and automatic analysis of analogous product ions in tandem mass spectra and systematic mapping of fragmentation mechanisms related to common amino acids. PMID:24895383
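
    One simple way to search for diagnostic ions in the spirit described above is to pool fragment m/z values from modified and unmodified peptide spectra, histogram both sets on a common grid, and flag bins that are strongly enriched in the modified set. The sketch below is a generic illustration with hypothetical thresholds, not the SPIID implementation.

```python
import numpy as np

def candidate_diagnostic_ions(modified_mz, unmodified_mz, bin_width=0.01, min_ratio=5.0):
    """Flag fragment m/z bins strongly enriched in modified-peptide spectra.

    modified_mz / unmodified_mz -- 1-D numpy arrays of fragment ion m/z values
    pooled from many MS/MS spectra of modified and unmodified peptides.
    """
    lo = min(modified_mz.min(), unmodified_mz.min())
    hi = max(modified_mz.max(), unmodified_mz.max())
    edges = np.arange(lo, hi + bin_width, bin_width)
    mod_counts, _ = np.histogram(modified_mz, bins=edges)
    unmod_counts, _ = np.histogram(unmodified_mz, bins=edges)
    # Normalize to frequencies and compare bin-by-bin (pseudo-count avoids /0).
    mod_freq = mod_counts / max(len(modified_mz), 1)
    unmod_freq = (unmod_counts + 1) / max(len(unmodified_mz), 1)
    ratio = mod_freq / unmod_freq
    hits = np.flatnonzero(ratio >= min_ratio)
    return [(0.5 * (edges[i] + edges[i + 1]), ratio[i]) for i in hits]
```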

  3. Systematic toxicological analysis: computer-assisted identification of poisons in biological materials.

    PubMed

    Stimpfl, Th; Demuth, W; Varmuza, K; Vycudilik, W

    2003-06-05

    New software was developed to improve the chances for identification of a "general unknown" in complex biological materials. To achieve this goal, the total ion current chromatogram was simplified by filtering the acquired mass spectra via an automated subtraction procedure, which removed mass spectra originating from the sample matrix, as well as interfering substances from the extraction procedure. It could be shown that this tool emphasizes mass spectra of exceptional compounds, and therefore provides the forensic toxicologist with further evidence, even in cases where mass spectral data of the unknown compound are not available in "standard" spectral libraries.

  4. The Tetracorder user guide: version 4.4

    USGS Publications Warehouse

    Livo, Keith Eric; Clark, Roger N.

    2014-01-01

    Imaging spectroscopy mapping software assists in the identification and mapping of materials based on their chemical properties as expressed in spectral measurements of a planet including the solid or liquid surface or atmosphere. Such software can be used to analyze field, aircraft, or spacecraft data; remote sensing datasets; or laboratory spectra. Tetracorder is a set of software algorithms commanded through an expert system to identify materials based on their spectra (Clark and others, 2003). Tetracorder also can be used in traditional remote sensing analyses, because some of the algorithms are a version of a matched filter. Thus, depending on the instructions fed to the Tetracorder system, results can range from simple matched filter output, to spectral feature fitting, to full identification of surface materials (within the limits of the spectral signatures of materials over the spectral range and resolution of the imaging spectroscopy data). A basic understanding of spectroscopy by the user is required for developing an optimum mapping strategy and assessing the results.

  5. Bandwidth scalable, coherent transmitter based on the parallel synthesis of multiple spectral slices using optical arbitrary waveform generation.

    PubMed

    Geisler, David J; Fontaine, Nicolas K; Scott, Ryan P; He, Tingting; Paraschis, Loukas; Gerstel, Ori; Heritage, Jonathan P; Yoo, S J B

    2011-04-25

    We demonstrate an optical transmitter based on dynamic optical arbitrary waveform generation (OAWG) which is capable of creating high-bandwidth (THz) data waveforms in any modulation format using the parallel synthesis of multiple coherent spectral slices. As an initial demonstration, the transmitter uses only 5.5 GHz of electrical bandwidth and two 10-GHz-wide spectral slices to create 100-ns duration, 20-GHz optical waveforms in various modulation formats including differential phase-shift keying (DPSK), quaternary phase-shift keying (QPSK), and eight phase-shift keying (8PSK) with only changes in software. The experimentally generated waveforms showed clear eye openings and separated constellation points when measured using a real-time digital coherent receiver. Bit-error-rate (BER) performance analysis resulted in a BER < 9.8 × 10⁻⁶ for DPSK and QPSK waveforms. Additionally, we experimentally demonstrate three-slice, 4-ns long waveforms that highlight the bandwidth scalable nature of the optical transmitter. The various generated waveforms show that the key transmitter properties (i.e., packet length, modulation format, data rate, and modulation filter shape) are software definable, and that the optical transmitter is capable of acting as a flexible bandwidth transmitter.

  6. The MPI-Mainz UV/VIS Spectral Atlas of Gaseous Molecules of Atmospheric Interest

    NASA Astrophysics Data System (ADS)

    Sander, Rolf; Keller-Rudek, Hannelore; Moortgat, Geert; Sörensen, Rüdiger

    2014-05-01

    Measurements from satellites can be used to obtain global concentration maps of atmospheric trace constituents. Critical parameters needed in the analysis of the satellite data are the absorption cross sections of the observed molecules. Here, we present the MPI-Mainz UV/VIS Spectral Atlas, which is a large collection of more than 5000 absorption cross section and quantum yield data files in the ultraviolet and visible (UV/VIS) wavelength region for gaseous molecules and radicals primarily of atmospheric interest. The data files contain results of individual measurements, covering almost a century of research. To compare and visualize the data sets, multicoloured graphical representations have been created. The Spectral Atlas is available on the internet at http://www.uv-vis-spectral-atlas-mainz.org. It has been completely overhauled and now appears with improved browse and search options, based on PostgreSQL, Django and Python database software. The web pages are continuously updated.

  7. Using fragmentation trees and mass spectral trees for identifying unknown compounds in metabolomics.

    PubMed

    Vaniya, Arpana; Fiehn, Oliver

    2015-06-01

    Identification of unknown metabolites is the bottleneck in advancing metabolomics, leaving interpretation of metabolomics results ambiguous. The chemical diversity of metabolism is vast, making structure identification arduous and time consuming. Currently, comprehensive analysis of mass spectra in metabolomics is limited to library matching, but tandem mass spectral libraries are small compared to the large number of compounds found in the biosphere, including xenobiotics. Resolving this bottleneck requires richer data acquisition and better computational tools. Multi-stage mass spectrometry (MSn) trees show promise to aid in this regard. Fragmentation trees explore the fragmentation process, generate fragmentation rules and aid in sub-structure identification, while mass spectral trees delineate the dependencies in multi-stage MS of collision-induced dissociations. This review covers advancements over the past 10 years as a tool for metabolite identification, including algorithms, software and databases used to build and to implement fragmentation trees and mass spectral annotations.

  8. Parallel Computing for the Computed-Tomography Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon

    2008-01-01

    This software computes the tomographic reconstruction of spatial-spectral data from raw detector images of the Computed-Tomography Imaging Spectrometer (CTIS), which enables transient-level, multi-spectral imaging by capturing spatial and spectral information in a single snapshot.

  9. Rapid identification and classification of Listeria spp. and serotype assignment of Listeria monocytogenes using fourier transform-infrared spectroscopy and artificial neural network analysis

    USDA-ARS?s Scientific Manuscript database

    The use of Fourier Transform-Infrared Spectroscopy (FT-IR) in conjunction with Artificial Neural Network software, NeuroDeveloper™ was examined for the rapid identification and classification of Listeria species and serotyping of Listeria monocytogenes. A spectral library was created for 245 strains...

  10. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for the analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on a standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework and computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Analysis of seismic stability of large-sized tank VST-20000 with software package ANSYS

    NASA Astrophysics Data System (ADS)

    Tarasenko, A. A.; Chepur, P. V.; Gruchenkova, A. A.

    2018-05-01

    The work is devoted to the study of the seismic stability of the vertical steel tank VST-20000, with due consideration of the response of the “foundation-tank-liquid” system, conducted on the basis of the finite element method, modal analysis and linear spectral theory. The calculations are performed for a tank model with a high degree of detail in the metallic structures: the shell, a fixed roof, a bottom, and a reinforcing ring.

  12. Software for Simulation of Hyperspectral Images

    NASA Technical Reports Server (NTRS)

    Richtsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.

    2002-01-01

    A package of software generates simulated hyperspectral images for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport as well as surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, 'ground truth' is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces and the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for and a supplement to field validation data.

  13. Inexpensive Audio Activities: Earbud-based Sound Experiments

    NASA Astrophysics Data System (ADS)

    Allen, Joshua; Boucher, Alex; Meggison, Dean; Hruby, Kate; Vesenka, James

    2016-11-01

    Inexpensive alternatives to a number of classic introductory physics sound laboratories are presented including interference phenomena, resonance conditions, and frequency shifts. These can be created using earbuds, economical supplies such as Giant Pixie Stix® wrappers, and free software available for PCs and mobile devices. We describe two interference laboratories (beat frequency and two-speaker interference) and two resonance laboratories (quarter- and half-wavelength). Lastly, a Doppler laboratory using rotating earbuds is explained. The audio signal captured by all experiments is analyzed on free spectral analysis software and many of the experiments incorporate the unifying theme of measuring the speed of sound in air.

  14. Quantifying the Impact of Nanoparticle Coatings and Non-uniformities on XPS Analysis: Gold/silver Core-shell Nanoparticles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yung-Chen Andrew; Engelhard, Mark H.; Baer, Donald R.

    2016-03-07

    Spectral modeling of photoelectrons can serve as a valuable tool when combined with X-ray photoelectron spectroscopy (XPS) analysis. Herein, a new version of the NIST Simulation of Electron Spectra for Surface Analysis (SESSA 2.0) software, capable of directly simulating spherical multilayer NPs, was applied to model citrate-stabilized Au/Ag-core/shell nanoparticles (NPs). The NPs were characterized using XPS and scanning transmission electron microscopy (STEM) to determine the composition and morphology of the NPs. The Au/Ag-core/shell NPs were observed to be polydispersed in size, non-circular, and to contain off-centered Au-cores. Using the average NP dimensions determined from STEM analysis, SESSA spectral modeling indicated that washed Au/Ag-core shell NPs were stabilized with a 0.8 nm l…

  15. Quadratic Blind Linear Unmixing: A Graphical User Interface for Tissue Characterization

    PubMed Central

    Gutierrez-Navarro, O.; Campos-Delgado, D.U.; Arce-Santana, E. R.; Jo, Javier A.

    2016-01-01

    Spectral unmixing is the process of breaking down data from a sample into its basic components and their abundances. Previous work has been focused on blind unmixing of multi-spectral fluorescence lifetime imaging microscopy (m-FLIM) datasets under a linear mixture model and quadratic approximations. This method provides a fast linear decomposition and can work without a limit on the maximum number of components or end-members. Hence, this work presents an interactive software tool that implements our blind end-member and abundance extraction (BEAE) and quadratic blind linear unmixing (QBLU) algorithms in Matlab. The options and capabilities of our proposed software are described in detail. When the number of components is known, our software can estimate the constitutive end-members and their abundances. When no prior knowledge is available, the software can provide a completely blind solution to estimate the number of components, the end-members and their abundances. The characterization of three case studies validates the performance of the new software: ex-vivo human coronary arteries, human breast cancer cell samples, and in-vivo hamster oral mucosa. The software is freely available on a webpage hosted by one of the developing institutions, and provides the user with a quick, easy-to-use and efficient tool for multi/hyper-spectral data decomposition. PMID:26589467
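
    As a rough illustration of the linear mixture model underlying such unmixing, the sketch below estimates non-negative abundances with SciPy's nnls as a stand-in; it does not reproduce the BEAE or QBLU algorithms themselves.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(spectrum, endmembers):
    """Estimate non-negative abundances for a linear mixture model.

    spectrum   -- measured spectrum, shape (n_channels,)
    endmembers -- matrix of end-member spectra, shape (n_channels, n_components)
    Returns abundances normalized to sum to one.
    """
    abundances, _ = nnls(endmembers, spectrum)
    total = abundances.sum()
    return abundances / total if total > 0 else abundances

# Usage: two synthetic end-members mixed 70/30.
E = np.column_stack([np.linspace(0, 1, 64), np.linspace(1, 0, 64)])
y = E @ np.array([0.7, 0.3])
print(unmix(y, E))   # ~ [0.7, 0.3]
```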

  16. Quadratic blind linear unmixing: A graphical user interface for tissue characterization.

    PubMed

    Gutierrez-Navarro, O; Campos-Delgado, D U; Arce-Santana, E R; Jo, Javier A

    2016-02-01

    Spectral unmixing is the process of breaking down data from a sample into its basic components and their abundances. Previous work has been focused on blind unmixing of multi-spectral fluorescence lifetime imaging microscopy (m-FLIM) datasets under a linear mixture model and quadratic approximations. This method provides a fast linear decomposition and can work without a limit on the maximum number of components or end-members. Hence, this work presents an interactive software tool that implements our blind end-member and abundance extraction (BEAE) and quadratic blind linear unmixing (QBLU) algorithms in Matlab. The options and capabilities of our proposed software are described in detail. When the number of components is known, our software can estimate the constitutive end-members and their abundances. When no prior knowledge is available, the software can provide a completely blind solution to estimate the number of components, the end-members and their abundances. The characterization of three case studies validates the performance of the new software: ex-vivo human coronary arteries, human breast cancer cell samples, and in-vivo hamster oral mucosa. The software is freely available on a webpage hosted by one of the developing institutions, and provides the user with a quick, easy-to-use and efficient tool for multi/hyper-spectral data decomposition. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. The GRIDView Visualization Package

    NASA Astrophysics Data System (ADS)

    Kent, B. R.

    2011-07-01

    Large three-dimensional data cubes, catalogs, and spectral line archives are increasingly important elements of the data discovery process in astronomy. Visualization of large data volumes is of vital importance for the success of large spectral line surveys. Examples of data reduction utilizing the GRIDView software package are shown. The package allows users to manipulate data cubes, extract spectral profiles, and measure line properties. The package and included graphical user interfaces (GUIs) are designed with pipeline infrastructure in mind. The software has been used with great success analyzing spectral line and continuum data sets obtained from large radio survey collaborations. The tools are also important for multi-wavelength cross-correlation studies and incorporate Virtual Observatory client applications for overlaying database information in real time as cubes are examined by users.
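
    Extracting a spectral profile from a data cube and measuring basic line properties, as described above, can be sketched in a few lines; the axis ordering (channel, y, x) and the moment-based line measures are assumptions for illustration, not GRIDView's API.

```python
import numpy as np

def extract_profile(cube, x, y, box=1):
    """Average spectral profile over a small spatial box of a 3-D data cube.

    cube -- array with axes (channel, y, x), as is common for radio data cubes
    box  -- half-width of the spatial averaging box in pixels
    """
    ys = slice(max(y - box, 0), y + box + 1)
    xs = slice(max(x - box, 0), x + box + 1)
    return cube[:, ys, xs].mean(axis=(1, 2))

def line_properties(velocity, profile):
    """Integrated flux and intensity-weighted centroid of a spectral line."""
    dv = np.abs(np.diff(velocity)).mean()          # assume near-uniform channel width
    flux = profile.sum() * dv
    centroid = (profile * velocity).sum() / profile.sum()
    return flux, centroid
```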

  18. [Application of AOTF in spectral analysis. 3. Application of AOTF in atomic emission spectral analysis].

    PubMed

    Chen, Ze-yong; Peng, Rong-fei; Zhang, Zhan-xia

    2002-06-01

    An atomic emission spectrometer based on an acousto-optic tunable filter (AOTF) was constructed in-house and used to evaluate its practical utility in atomic emission analysis. The AOTF used was of model TEAF5-0.36-0.52-S (Brimrose, USA) and the frequency of the direct digital RF synthesizer ranges from 100 MHz to 200 MHz. An ICP and a PMT were used as the light source and detector, respectively. The software, written in Visual C++ and running on the Windows 98 platform, is a utility program system with two data banks and multiple windows. The wavelength calibration was performed with 14 emission lines of Ca, Y, Li, Eu, Sr and Ba using a tenth-order polynomial line-fitting method. The absolute error of the peak position was less than 0.1 nm, and the peak deviation was only 0.04 nm as the PMT voltage varied from 337.5 V to 412.5 V. The scanning emission spectra and the calibration curves of Ba, Y, Eu, Sc and Sr are presented. Their average correlation coefficient was 0.9991 and their detection limits were in the range of 0.051 to 0.97 μg·mL⁻¹. The detection limit can be improved under optimized operating conditions. However, the spectral resolution is only 2.1 nm at the wavelength of 488 nm. Evidently, this poor spectral resolution would restrict the application of AOTF in atomic emission spectral analysis, unless an enhancing technique is integrated into it.
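
    The wavelength calibration step described above, fitting a polynomial between AOTF drive frequency and diffracted wavelength for a set of known emission lines, might look like the following sketch; the polynomial degree and function names are illustrative only.

```python
import numpy as np

def fit_wavelength_calibration(rf_frequency_mhz, known_wavelength_nm, degree=3):
    """Fit a polynomial mapping AOTF drive frequency to diffracted wavelength.

    The degree is a free choice here (the paper used a high-order polynomial
    over 14 reference emission lines). Returns a callable wavelength(f).
    """
    coeffs = np.polyfit(rf_frequency_mhz, known_wavelength_nm, degree)
    return np.poly1d(coeffs)

def calibration_error(model, rf_frequency_mhz, known_wavelength_nm):
    """Absolute error of the fitted peak positions, in nm."""
    return np.abs(model(rf_frequency_mhz) - known_wavelength_nm)
```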

  19. An Excel-based implementation of the spectral method of action potential alternans analysis.

    PubMed

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.
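
    A bare-bones version of the spectral method, applied to a beat-to-beat series such as APD per beat, is sketched below: the power at 0.5 cycles/beat is compared against a spectral noise band. The noise-band limits and scoring are illustrative assumptions, not the parameters of the Excel/VBA tool.

```python
import numpy as np

def alternans_spectrum(beat_series):
    """Spectral analysis of a beat-to-beat series (e.g. APD per beat).

    Returns frequencies in cycles/beat, the power spectrum, and a k-score
    comparing power at 0.5 cycles/beat against a spectral noise band.
    """
    x = np.asarray(beat_series, dtype=float)
    assert len(x) % 2 == 0, "use an even number of beats so the last bin is 0.5 cycles/beat"
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0)           # d = 1 beat -> cycles per beat
    alternans_power = power[-1]                      # 0.5 cycles/beat is the last bin
    noise_band = power[(freqs >= 0.33) & (freqs < 0.48)]
    k_score = (alternans_power - noise_band.mean()) / noise_band.std()
    return freqs, power, k_score
```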

  20. Orbiter Flying Qualities (OFQ) Workstation user's guide

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Parseghian, Zareh; Hogue, Jeffrey R.

    1988-01-01

    This project was devoted to the development of a software package, called the Orbiter Flying Qualities (OFQ) Workstation, for working with the OFQ Archives which are specially selected sets of space shuttle entry flight data relevant to flight control and flying qualities. The basic approach to creation of the workstation software was to federate and extend commercial software products to create a low cost package that operates on personal computers. Provision was made to link the workstation to large computers, but the OFQ Archive files were also converted to personal computer diskettes and can be stored on workstation hard disk drives. The primary element of the workstation developed in the project is the Interactive Data Handler (IDH) which allows the user to select data subsets from the archives and pass them to specialized analysis programs. The IDH was developed as an application in a relational database management system product. The specialized analysis programs linked to the workstation include a spreadsheet program, FREDA for spectral analysis, MFP for frequency domain system identification, and NIPIP for pilot-vehicle system parameter identification. The workstation also includes capability for ensemble analysis over groups of missions.

  1. A new approach to aid the characterisation and identification of metabolites of a model drug; partial isotope enrichment combined with novel formula elucidation software.

    PubMed

    Hobby, Kirsten; Gallagher, Richard T; Caldwell, Patrick; Wilson, Ian D

    2009-01-01

    This work describes the identification of 'isotopically enriched' metabolites of 4-cyanoaniline using the unique features of the software package 'Spectral Simplicity'. The software is capable of creating the theoretical mass spectra for partially isotope-enriched compounds, and subsequently performing an elemental composition analysis to give the elemental formula for the 'isotopically enriched' metabolite. A novel mass spectral correlation method, called 'FuzzyFit', was employed. 'FuzzyFit' utilises the expected experimental distribution of errors in both mass accuracy and isotope pattern and enables discrimination between statistically probable and improbable candidate formulae. The software correctly determined the molecular formulae of ten previously described metabolites of 4-cyanoaniline confirming the technique of partial isotope enrichment can produce results analogous to standard methodologies. Six previously unknown species were also identified, based on the presence of the unique 'designer' isotope ratio. Three of the unknowns were tentatively identified as N-acetylglutamine, O-methyl-N acetylglucuronide and a putative fatty acid conjugate. The discovery of a significant number of unknown species of a model drug with a comprehensive history of investigation highlights the potential for enhancement to the analytical process by the use of 'designer' isotope ratio compounds. The 'FuzzyFit' methodology significantly aided the elucidation of candidate formulae, by provision of a vastly simplified candidate formula data set. Copyright (c) 2008 John Wiley & Sons, Ltd.

  2. Fast interactive elastic registration of 12-bit multi-spectral images with subvoxel accuracy using display hardware

    NASA Astrophysics Data System (ADS)

    Noordmans, Herke Jan; de Roode, Rowland; Verdaasdonk, Rudolf

    2007-03-01

    Multi-spectral images of human tissue taken in-vivo often contain image alignment problems as patients have difficulty in retaining their posture during the acquisition time of 20 seconds. Previous attempts to correct motion errors used image registration software developed for MR or CT data, but these algorithms proved too slow and error-prone for practical use with multi-spectral images. A new software package has been developed which allows the user to play a decisive role in the registration process: the user can monitor the progress of the registration continuously and steer it in the right direction when it starts to fail. The software efficiently exploits videocard hardware to gain speed and to provide a perfect subvoxel correspondence between registration field and display. An 8-bit graphics card was used to efficiently register and resample 12-bit images using the hardware interpolation modes present on the card. To show the feasibility of this new registration process, the software was applied in clinical practice evaluating the dosimetry for psoriasis and KTP laser treatment. The microscopic differences between images of normal skin and skin exposed to UV light showed that an affine registration step including zooming and slanting is critical for a subsequent elastic match to have success. The combination of user-interactive registration software with optimal use of the potential of PC video card hardware greatly improves the speed of multi-spectral image registration.

  3. Fast interactive registration tool for reproducible multi-spectral imaging for wound healing and treatment evaluation

    NASA Astrophysics Data System (ADS)

    Noordmans, Herke J.; de Roode, Rowland; Verdaasdonk, Rudolf

    2007-02-01

    Multi-spectral images of human tissue taken in-vivo often contain image alignment problems as patients have difficulty in retaining their posture during the acquisition time of 20 seconds. Previous attempts to correct motion errors used image registration software developed for MR or CT data, but these algorithms proved too slow and error-prone for practical use with multi-spectral images. A new software package has been developed which allows the user to play a decisive role in the registration process: the user can monitor the progress of the registration continuously and steer it in the right direction when it starts to fail. The software efficiently exploits videocard hardware to gain speed and to provide a perfect subvoxel correspondence between registration field and display. An 8-bit graphics card was used to efficiently register and resample 12-bit images using the hardware interpolation modes present on the card. To show the feasibility of this new registration process, the software was applied in clinical practice evaluating the dosimetry for psoriasis and KTP laser treatment. The microscopic differences between images of normal skin and skin exposed to UV light showed that an affine registration step including zooming and slanting is critical for a subsequent elastic match to have success. The combination of user-interactive registration software with optimal use of the potential of PC video card hardware greatly improves the speed of multi-spectral image registration.

  4. Visualization techniques to aid in the analysis of multispectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.

  5. Type practical application in spectral analysis, combining Labview and open source software

    NASA Astrophysics Data System (ADS)

    Chioncel, C. P.; Anghel Drugarin, C. V.

    2018-01-01

    The paper presents the possibility of interconnecting LabVIEW, with its varied capabilities, and Scilab, one of the successful free MATLAB clones. The interconnection between the two is made possible through the LabVIEW-to-Scilab gateway. This tool can be applied in virtual as well as real laboratories, and also represents genuine assistance for self-learning.

  6. The Python Spectral Analysis Tool (PySAT): A Powerful, Flexible, Preprocessing and Machine Learning Library and Interface

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.

    2017-12-01

    Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. The tool is available at https://github.com/USGS-Astrogeology/PySAT_Point_Spectra_GUI. [1] Clegg, S.M., et al. (2017) Spectrochim Acta B. 129, 64-85. [2] Gaddis, L. et al. (2017) 3rd Planetary Data Workshop, #1986. [3] http://scikit-learn.org/ [4] Anderson, R.B., et al. (2017) Spectrochim. Acta B. 129, 49-57.
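
    Since PySAT builds on scikit-learn, the kind of regression it automates can be sketched directly with that library; the example below is a generic PLS calibration with hypothetical inputs and is not PySAT's own interface.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def train_pls(spectra, concentrations, n_components=8):
    """Fit a PLS model mapping spectra (n_samples x n_channels) to composition."""
    model = PLSRegression(n_components=n_components)
    model.fit(spectra, concentrations)
    return model

def cv_rmse(spectra, concentrations, n_components=8, folds=5):
    """Cross-validated RMSE, a typical score for comparing preprocessing choices."""
    model = PLSRegression(n_components=n_components)
    pred = cross_val_predict(model, spectra, concentrations, cv=folds)
    truth = np.asarray(concentrations).ravel()
    return float(np.sqrt(np.mean((pred.ravel() - truth) ** 2)))
```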

  7. Novel hyperspectral prediction method and apparatus

    NASA Astrophysics Data System (ADS)

    Kemeny, Gabor J.; Crothers, Natalie A.; Groth, Gard A.; Speck, Kathy A.; Marbach, Ralf

    2009-05-01

    Both the power and the challenge of hyperspectral technologies lie in the very large amount of data produced by spectral cameras. While off-line methodologies allow the collection of gigabytes of data, extended data analysis sessions are required to convert the data into useful information. In contrast, real-time monitoring, such as on-line process control, requires that compression of spectral data and analysis occur at a sustained full camera data rate. Efficient, high-speed practical methods for calibration and prediction are therefore sought to optimize the value of hyperspectral imaging. A novel method of matched filtering known as science-based multivariate calibration (SBC) was developed for hyperspectral calibration. Classical (MLR) and inverse (PLS, PCR) methods are combined by spectroscopically measuring the spectral "signal" and by statistically estimating the spectral "noise." The accuracy of the inverse model is thus combined with the easy interpretability of the classical model. The SBC method is optimized for hyperspectral data in the Hyper-Cal™ software used for the present work. The prediction algorithms can then be downloaded into a dedicated FPGA-based High-Speed Prediction Engine™ module. Spectral pretreatments and calibration coefficients are stored on interchangeable SD memory cards, and predicted compositions are produced on a USB interface at real-time camera output rates. Applications include minerals, pharmaceuticals, food processing and remote sensing.
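
    The matched-filter form that underlies this style of calibration can be sketched generically as b = S⁻¹g / (gᵀS⁻¹g), where g is the measured spectral signal and S the estimated spectral noise covariance. The code below shows that generic filter, not the proprietary SBC/Hyper-Cal implementation.

```python
import numpy as np

def matched_filter_vector(signal, noise_covariance, ridge=1e-6):
    """Regression vector b = S^-1 g / (g^T S^-1 g) for a known spectral signal g
    and an estimated spectral noise covariance S (a small ridge stabilizes the inverse)."""
    n = len(signal)
    S = noise_covariance + ridge * np.trace(noise_covariance) / n * np.eye(n)
    s_inv_g = np.linalg.solve(S, signal)
    return s_inv_g / (signal @ s_inv_g)

def predict(spectra, b, mean_spectrum):
    """Apply the filter to mean-centered spectra; cheap enough for per-pixel use."""
    return (spectra - mean_spectrum) @ b
```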

  8. Soft x-ray scattering facility at the Advanced Light Source with real-time data processing and analysis

    NASA Astrophysics Data System (ADS)

    Gann, E.; Young, A. T.; Collins, B. A.; Yan, H.; Nasiatka, J.; Padmore, H. A.; Ade, H.; Hexemer, A.; Wang, C.

    2012-04-01

    We present the development and characterization of a dedicated resonant soft x-ray scattering facility. Capable of operation over a wide energy range, the beamline and endstation are primarily used for scattering from soft matter systems around the carbon K-edge (~285 eV). We describe the specialized design of the instrument and characteristics of the beamline. Operational characteristics of immediate interest to users such as polarization control, degree of higher harmonic spectral contamination, and detector noise are delineated. Of special interest is the development of a higher harmonic rejection system that improves the spectral purity of the x-ray beam. Special software and a user-friendly interface have been implemented to allow real-time data processing and preliminary data analysis simultaneous with data acquisition.

  9. A portable platform to collect and review behavioral data simultaneously with neurophysiological signals.

    PubMed

    Tianxiao Jiang; Siddiqui, Hasan; Ray, Shruti; Asman, Priscella; Ozturk, Musa; Ince, Nuri F

    2017-07-01

    This paper presents a portable platform to collect and review behavioral data simultaneously with neurophysiological signals. The system comprises four parts: a sensor data acquisition interface, a socket server for real-time data streaming, a Simulink system for real-time processing and an offline data review and analysis toolbox. A low-cost microcontroller is used to acquire data from external sensors such as an accelerometer and a hand dynamometer. The microcontroller transfers the data either directly over USB or wirelessly through a Bluetooth module to a data server written in C++ for MS Windows OS. The data server also interfaces with a digital glove and captures HD video from a webcam. The acquired sensor data are streamed over the User Datagram Protocol (UDP) to other applications such as Simulink/Matlab for real-time analysis and recording. Neurophysiological signals such as electroencephalography (EEG), electrocorticography (ECoG) and local field potential (LFP) recordings can be collected simultaneously in Simulink and fused with behavioral data. In addition, we developed customized Matlab Graphical User Interface (GUI) software to review, annotate and analyze the data offline. The software provides a fast, user-friendly data visualization environment with a synchronized video playback feature. The software is also capable of reviewing long-term neural recordings. Other featured functions such as fast preprocessing with multithreaded filters, annotation, montage selection, power spectral density (PSD) estimation, time-frequency maps and spatial spectral maps are also implemented.
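
    The UDP streaming pattern described above can be sketched as follows; the port number and packet layout are invented for illustration and do not describe the platform's actual protocol.

```python
import socket
import struct
import time

SENSOR_PORT = 9870            # hypothetical port; matched on the receiving side

def stream_sensor_sample(sock, accel_xyz, grip_force, host="127.0.0.1"):
    """Send one time-stamped sensor sample as a small UDP datagram."""
    payload = struct.pack("<d3dd", time.time(), *accel_xyz, grip_force)
    sock.sendto(payload, (host, SENSOR_PORT))

def receive_samples(timeout_s=1.0):
    """Blocking receiver that unpacks datagrams into (t, ax, ay, az, force)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", SENSOR_PORT))
    sock.settimeout(timeout_s)
    try:
        while True:
            data, _ = sock.recvfrom(1024)
            yield struct.unpack("<d3dd", data)
    except socket.timeout:
        return
    finally:
        sock.close()

# Usage (sender side):
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
stream_sensor_sample(sender, (0.01, -0.02, 9.81), 12.5)
```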

  10. Characterizing Cyclostationary Features of Digital Modulated Signals with Empirical Measurements using Spectral Correlation Function

    DTIC Science & Technology

    2011-06-01

    Thesis by Mujun Song, Captain, ROKA (AFIT/GCE/ENG/11-09, Air Force Institute of Technology, Air University). Cyclostationary features of digitally modulated signals are characterized through empirical measurements of the spectral correlation function; an Agilent E4438C ESG Vector Signal Generator and a Universal Software Radio Peripheral 2 (USRP2), a Software Defined Radio (SDR), are used for the measurements.

  11. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  12. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  13. The Utility of Free Software for Gravity and Magnetic Advanced Data Processing

    NASA Astrophysics Data System (ADS)

    Grandis, Hendra; Dahrin, Darharta

    2017-04-01

    The lack of computational tools, i.e. software, often hinders the proper teaching and application of geophysical data processing in academic institutions in Indonesia. Although there are academic licensing options for commercial software, such options are still well beyond the financial capability of some academic institutions. Academic community members (both lecturers and students) are supposed to be creative and resourceful to overcome such a situation. Therefore, the ability to write computer programs or codes is a necessity. However, there are also many computer programs and even software packages that are freely available on the internet. Generally, the utility of the freely distributed software is limited to demonstration or to visualizing and exchanging data. The paper discusses the utility of Geosoft’s Oasis Montaj Viewer along with USGS GX programs that are available for free. Useful advanced gravity and magnetic data processing (e.g., gradient calculation, spectral analysis, etc.) can be performed “correctly” without any approximation that sometimes leads to dubious results and interpretation.
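
    As an example of the kind of spectral analysis mentioned above, a radially averaged power spectrum of a gridded anomaly can be used for Spector-Grant style depth estimation; assuming wavenumbers in radians per distance unit, the log-spectrum slope over a chosen band is approximately -2h. The sketch below is a generic illustration under that assumption, not one of the GX programs.

```python
import numpy as np

def radial_power_spectrum(grid, dx, n_bins=40):
    """Radially averaged power spectrum of a gridded gravity/magnetic anomaly.

    Returns wavenumbers (radians per distance unit) and mean log power per bin.
    """
    ny, nx = grid.shape
    power = np.abs(np.fft.fft2(grid - grid.mean())) ** 2
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    kr = np.hypot(*np.meshgrid(kx, ky))
    bins = np.linspace(kr[kr > 0].min(), kr.max(), n_bins)
    idx = np.digitize(kr.ravel(), bins)
    k_mean, logp = [], []
    for i in range(1, len(bins)):
        sel = idx == i
        if sel.any():
            k_mean.append(kr.ravel()[sel].mean())
            logp.append(np.log(power.ravel()[sel].mean()))
    return np.array(k_mean), np.array(logp)

def ensemble_depth(k, log_power, k_min, k_max):
    """Spector-Grant style estimate: ln P ~ const - 2*h*k, so h = -slope/2."""
    sel = (k >= k_min) & (k <= k_max)
    slope, _ = np.polyfit(k[sel], log_power[sel], 1)
    return -slope / 2.0
```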

  14. Co-simulation coupling spectral/finite elements for 3D soil/structure interaction problems

    NASA Astrophysics Data System (ADS)

    Zuchowski, Loïc; Brun, Michael; De Martin, Florent

    2018-05-01

    The coupling between an implicit finite element (FE) code and an explicit spectral element (SE) code has been explored for solving elastic wave propagation in soil/structure interaction problems. The coupling approach is based on domain decomposition methods in transient dynamics. The spatial coupling at the interface is managed by a standard mortar coupling approach, whereas the time integration is handled by a hybrid asynchronous time integrator. External coupling software, handling the interface problem, has been set up in order to couple the FE software Code_Aster with the SE software EFISPEC3D.

  15. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of 'Absorbance-Wavenumber-Retention time' data is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and the concentration of caffeine are discussed for two steps of the data treatment.

  16. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1993-01-01

    This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will be an advanced nonlinear signal analysis topographical mapping system (ATMS), a nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbopump families.

  17. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
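
    A standard-deviation homogeneity criterion of the kind described can be sketched as a moving-block statistic over successive spectra; the block size and array layout below are assumptions for illustration, not the paper's exact routine.

```python
import numpy as np

def moving_block_sd(spectra, block=5):
    """Mean standard deviation between consecutive NIR spectra in a moving block.

    spectra -- array of shape (n_time_points, n_wavelengths), one spectrum per
    blender revolution or time step. The statistic falls toward a plateau as
    the blend approaches homogeneity.
    """
    values = []
    for start in range(len(spectra) - block + 1):
        window = spectra[start:start + block]
        values.append(window.std(axis=0, ddof=1).mean())   # SD per wavelength, averaged
    return np.array(values)
```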

  18. Spectral Dynamics Inc., ships hybrid, 316-channel data acquisition system to Sandia Labs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Douglas

    2003-09-01

    Spectral Dynamics announced the shipment of a 316-channel data acquisition system. The system was custom designed for the Light Initiated High Explosive (LIHE) facility at Sandia Labs in Albuquerque, New Mexico by the Spectral Dynamics Advanced Research Products Group. This Spectral Dynamics data acquisition system was tailored to meet the unique LIHE environmental and testing requirements utilizing Spectral Dynamics commercial off the shelf (COTS) Jaguar and VIDAS products supplemented by SD Alliance partners' (COTS) products. 'This system is just the beginning of our cutting edge merged technology solutions,' stated Mark Remelman, Manager for the Spectral Dynamics Advanced Research Products Group. 'This Hybrid system has 316 channels of data acquisition capability, comprised of 102.4 kHz direct to disk acquisition and 2.5 MHz, 200 MHz and 500 MHz RAM-based capabilities. In addition it incorporates the advanced bridge conditioning and dynamic configuration capabilities offered by Spectral Dynamics new Smart Interface Panel System (SIPS™).' After acceptance testing, Tony King, the Instrumentation Engineer facilitating the project for the Sandia LIHE group, commented: 'The LIHE staff was very impressed with the design, construction, attention to detail and overall performance of the instrumentation system'. This system combines VIDAS, a leading edge fourth generation SD-VXI hardware and field-proven software system from SD's Advanced Research Products Group, with SD's Jaguar, a multiple Acquisition Control Peripheral (ACP) system that allows expansion to hundreds of channels without sacrificing signal processing performance. Jaguar incorporates dedicated throughput disks for each ACP, providing time streaming to disk at up to the maximum sample rate. Spectral Dynamics, Inc. is a leading worldwide supplier of systems and software for advanced computer-automated data acquisition, vibration testing, structural dynamics, explosive shock, high-speed transient capture, acoustic analysis, monitoring, measurement, control and backup. Spectral Dynamics products are used for research, design verification, product testing and process improvement by manufacturers of all types of electrical, electronic and mechanical products, as well as by universities and government-funded agencies. The Advanced Research Products Group is the newest addition to the Spectral Dynamics family. Their newest VXI data acquisition hardware pushes the envelope on capabilities and embodies the same rock solid design methodologies, which have always differentiated Spectral Dynamics from its competition.

  19. Ocean Color Measurements from Landsat-8 OLI using SeaDAS

    NASA Technical Reports Server (NTRS)

    Franz, Bryan Alden; Bailey, Sean W.; Kuring, Norman; Werdell, P. Jeremy

    2014-01-01

    The Operational Land Imager (OLI) is a multi-spectral radiometer hosted on the recently launched Landsat-8 satellite. OLI includes a suite of relatively narrow spectral bands at 30-meter spatial resolution in the visible to shortwave infrared that make it a potential tool for ocean color radiometry: measurement of the reflected spectral radiance upwelling from beneath the ocean surface that carries information on the biogeochemical constituents of the upper ocean euphotic zone. To evaluate the potential of OLI to measure ocean color, processing support was implemented in SeaDAS, which is an open-source software package distributed by NASA for processing, analysis, and display of ocean remote sensing measurements from a variety of satellite-based multi-spectral radiometers. Here we describe the implementation of OLI processing capabilities within SeaDAS, including support for various methods of atmospheric correction to remove the effects of atmospheric scattering and absorption and retrieve the spectral remote-sensing reflectance (Rrs; sr⁻¹). The quality of the retrieved Rrs imagery will be assessed, as will the derived water column constituents such as the concentration of the phytoplankton pigment chlorophyll a.

  20. Simulation of Hyperspectral Images

    NASA Technical Reports Server (NTRS)

    Richsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.

    2004-01-01

    A software package generates simulated hyperspectral imagery for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport, as well as reflections from surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, "ground truth" is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces, as well as the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for, and a supplement to, field validation data.

  1. Measuring Glial Metabolism in Repetitive Brain Trauma and Alzheimer’s Disease

    DTIC Science & Technology

    2016-09-01

    Six methods: Singular value decomposition (SVD), wavelet, sliding window, sliding window with Gaussian weighting, spline and spectral improvements...comparison of a range of different denoising methods for dynamic MRS. Six denoising methods were considered: Singular value decomposition (SVD), wavelet...project by improving the software required for the data analysis by developing six different denoising methods. He also assisted with the testing

  2. Rapid Prototyping of Hyperspectral Image Analysis Algorithms for Improved Invasive Species Decision Support Tools

    NASA Astrophysics Data System (ADS)

    Bruce, L. M.; Ball, J. E.; Evangilista, P.; Stohlgren, T. J.

    2006-12-01

    Nonnative invasive species adversely impact ecosystems, causing loss of native plant diversity, species extinction, and impairment of wildlife habitats. As a result, over the past decade federal and state agencies and nongovernmental organizations have begun to work more closely together to address the management of invasive species. In 2005, approximately 500 million dollars was budgeted by U.S. Federal Agencies for the management of invasive species. Despite extensive expenditures, most of the methods used to detect and quantify the distribution of these invaders are ad hoc, at best. Likewise, decisions on the type of management techniques to be used or evaluation of the success of these methods are typically non-systematic. More efficient methods to detect or predict the occurrence of these species, as well as the incorporation of this knowledge into decision support systems, are greatly needed. In this project, rapid prototyping capabilities (RPC) are utilized for an invasive species application. More precisely, our recently developed analysis techniques for hyperspectral imagery are being prototyped for inclusion in the national Invasive Species Forecasting System (ISFS). The current ecological forecasting tools in ISFS will be compared to our hyperspectral-based invasives prediction algorithms to determine if/how the newer algorithms enhance the performance of ISFS. The PIs have researched the use of remotely sensed multispectral and hyperspectral reflectance data for the detection of invasive vegetative species. As a result, the PI has designed, implemented, and benchmarked various target detection systems that utilize remotely sensed data. These systems have been designed to make decisions based on a variety of remotely sensed data, including high spectral/spatial resolution hyperspectral signatures (1000's of spectral bands, such as those measured using ASD handheld devices), moderate spectral/spatial resolution hyperspectral images (100's of spectral bands, such as Hyperion imagery), and low spectral/spatial resolution images (such as MODIS imagery). These algorithms include hyperspectral exploitation methods such as stepwise-LDA band selection, optimized spectral band grouping, and stepwise PCA component selection. The PIs have extensive experience with combining these recently developed methods with conventional classifiers to form an end-to-end automated target recognition (ATR) system for detecting invasive species. The outputs of these systems can be invasive prediction maps, as well as quantitative accuracy assessments like confusion matrices, user accuracies, and producer accuracies. For all of these research endeavors, the PIs have developed numerous advanced signal and image processing methodologies, as well as a suite of associated software modules. However, the use of the prototype software modules has been primarily contained to Mississippi State University. The project described in this presentation and paper will enable future systematic inclusion of these software modules into a DSS with national scope.
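
    The hyperspectral exploitation methods mentioned above (band selection, band grouping, PCA component selection feeding a conventional classifier) follow a common pattern: reduce the dimensionality of the pixel spectra, then classify. The sketch below is not the authors' ATR system; it is a minimal, hypothetical Python illustration of that pattern using scikit-learn, with synthetic spectra standing in for real imagery.

      # Minimal sketch (synthetic data, not the authors' ATR system): reduce
      # hyperspectral pixel spectra with PCA, then classify with LDA.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 200))      # 500 pixels x 200 spectral bands (synthetic)
      y = rng.integers(0, 2, size=500)     # 1 = invasive species present, 0 = absent

      clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
      scores = cross_val_score(clf, X, y, cv=5)
      print("mean cross-validated accuracy:", scores.mean())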

  3. Automated glycopeptide analysis—review of current state and future directions

    PubMed Central

    Dallas, David C.; Martin, William F.; Hua, Serenus

    2013-01-01

    Glycosylation of proteins is involved in immune defense, cell–cell adhesion, cellular recognition and pathogen binding and is one of the most common and complex post-translational modifications. Science is still struggling to assign detailed mechanisms and functions to this form of conjugation. Even the structural analysis of glycoproteins—glycoproteomics—remains in its infancy due to the scarcity of high-throughput analytical platforms capable of determining glycopeptide composition and structure, especially platforms for complex biological mixtures. Glycopeptide composition and structure can be determined with high mass-accuracy mass spectrometry, particularly when combined with chromatographic separation, but the sheer volume of generated data necessitates computational software for interpretation. This review discusses the current state of glycopeptide assignment software—advances made to date and issues that remain to be addressed. The various software and algorithms developed so far provide important insights into glycoproteomics. However, there is currently no freely available software that can analyze spectral data in batch and unambiguously determine glycopeptide compositions for N- and O-linked glycopeptides from relevant biological sources such as human milk and serum. Few programs are capable of aiding in structural determination of the glycan component. To significantly advance the field of glycoproteomics, analytical software and algorithms are required that: (i) solve for both N- and O-linked glycopeptide compositions, structures and glycosites in biological mixtures; (ii) are high-throughput and process data in batches; (iii) can interpret mass spectral data from a variety of sources and (iv) are open source and freely available. PMID:22843980

  4. ASERA: A Spectrum Eye Recognition Assistant

    NASA Astrophysics Data System (ADS)

    Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao; Zhao, Yongheng

    2018-04-01

    ASERA, A Spectrum Eye Recognition Assistant, aids in quasar spectral recognition and redshift measurement and can also be used to recognize various types of spectra of stars, galaxies and AGNs (Active Galactic Nuclei). This interactive software allows users to visualize observed spectra, superimpose template spectra from the Sloan Digital Sky Survey (SDSS), and interactively access related spectral line information. ASERA is an efficient and user-friendly semi-automated toolkit for the accurate classification of spectra observed by LAMOST (the Large Sky Area Multi-object Fiber Spectroscopic Telescope) and is available as a standalone Java application and as a Java applet. The software offers several functions, including wavelength and flux scale settings, zoom in and out, redshift estimation, and spectral line identification.
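
    The redshift-estimation step described above amounts to shifting a rest-frame template until its features line up with the observed spectrum. A minimal sketch of that operation (hypothetical data, not ASERA's Java code) in Python:

      # lambda_observed = lambda_rest * (1 + z): shift a template to redshift z.
      import numpy as np

      def redshift_template(wave_rest, flux_rest, z):
          """Return the template wavelengths shifted to redshift z (flux rescaling omitted)."""
          return wave_rest * (1.0 + z), flux_rest

      # Hypothetical template with a single emission line at 4000 Angstrom.
      wave_rest = np.linspace(3000, 7000, 4000)
      flux_rest = np.exp(-0.5 * ((wave_rest - 4000.0) / 5.0) ** 2)

      wave_obs, flux_obs = redshift_template(wave_rest, flux_rest, z=0.35)
      print("line observed near", wave_obs[np.argmax(flux_obs)], "Angstrom")  # ~5400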

  5. COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA

    PubMed Central

    Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.

    2011-01-01

    Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
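
    Peptide false discovery rate analysis of the kind listed above is commonly done with the target-decoy strategy: spectra are searched against real and decoy sequences, and the FDR at a score threshold is estimated as the ratio of decoy to target matches above it. A minimal, hypothetical sketch of that estimate (not COMPASS code):

      # FDR(threshold) ~= #decoy PSMs / #target PSMs above the score threshold.
      def peptide_fdr(psms, threshold):
          """psms: iterable of (score, is_decoy) tuples."""
          targets = sum(1 for s, d in psms if s >= threshold and not d)
          decoys = sum(1 for s, d in psms if s >= threshold and d)
          return decoys / targets if targets else 0.0

      psms = [(12.0, False), (11.5, False), (9.8, True), (9.1, False), (7.2, True)]
      print(peptide_fdr(psms, threshold=9.0))  # 1 decoy / 3 targets ~ 0.33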

  6. Soft x-ray scattering facility at the Advanced Light Source with real-time data processing and analysis.

    PubMed

    Gann, E; Young, A T; Collins, B A; Yan, H; Nasiatka, J; Padmore, H A; Ade, H; Hexemer, A; Wang, C

    2012-04-01

    We present the development and characterization of a dedicated resonant soft x-ray scattering facility. Capable of operation over a wide energy range, the beamline and endstation are primarily used for scattering from soft matter systems around the carbon K-edge (∼285 eV). We describe the specialized design of the instrument and characteristics of the beamline. Operational characteristics of immediate interest to users such as polarization control, degree of higher harmonic spectral contamination, and detector noise are delineated. Of special interest is the development of a higher harmonic rejection system that improves the spectral purity of the x-ray beam. Special software and a user-friendly interface have been implemented to allow real-time data processing and preliminary data analysis simultaneous with data acquisition. © 2012 American Institute of Physics

  7. Center for Efficient Exascale Discretizations Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolev, Tzanio; Dobrev, Veselin; Tomov, Vladimir

    The CEED Software suite is a collection of generally applicable software tools focusing on the following computational motifs: PDE discretizations on unstructured meshes, high-order finite element and spectral element methods, and unstructured adaptive mesh refinement. All of this software is being developed as part of CEED, a co-design Center for Efficient Exascale Discretizations, within DOE's Exascale Computing Project (ECP) program.

  8. NRL Hyperspectral Imagery Trafficability Tool (HITT): Software andSpectral-Geotechnical Look-up Tables for Estimation and Mapping of Soil Bearing Strength from Hyperspectral Imagery

    DTIC Science & Technology

    2012-09-28

    spectral-geotechnical libraries and models developed during remote sensing and calibration/validation campaigns conducted by NRL and collaborating institutions in four...2010; Bachmann, Fry, et al., 2012a). The NRL HITT tool is a model for how we develop and validate software, and the future development of tools by

  9. Using MountainsMap (Digital Surf) surface analysis software as an analysis tool for x-ray mirror optical metrology data

    NASA Astrophysics Data System (ADS)

    Duffy, Alan; Yates, Brian; Takacs, Peter

    2012-09-01

    The Optical Metrology Facility at the Canadian Light Source (CLS) has recently purchased MountainsMap surface analysis software from Digital Surf and we report here our experiences with this package and its usefulness as a tool for examining metrology data of synchrotron x-ray mirrors. The package has a number of operators that are useful for determining surface roughness and slope error, including compliance with ISO standards (viz. ISO 4287 and ISO 25178). The software is extensible with MATLAB scripts, either by loading an m-file or by a user-written script. This makes it possible to apply a custom operator to measurement data sets. Using this feature we have applied the simple six-line MATLAB code for the direct least squares fitting of ellipses developed by Fitzgibbon et al. to investigate the residual slope error of elliptical mirrors upon the removal of the best-fit ellipse. The software includes support for many instruments (e.g., Zygo, MicroMap, etc.) and can import ASCII data (e.g., LTP data). The stitching module allows the user to assemble overlapping images and we report on our experiences with this feature applied to MicroMap surface roughness data. The power spectral density function was determined for the stitched and unstitched data and compared.
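
    For reference, the six-line direct least-squares ellipse fit of Fitzgibbon et al. ports readily to other environments. The following is a hypothetical NumPy transcription of that published algorithm (not the MATLAB script actually loaded into MountainsMap), selecting the conic whose coefficients satisfy the ellipse condition 4ac - b^2 > 0:

      import numpy as np

      def fit_ellipse(x, y):
          """Direct least-squares ellipse fit; returns conic coefficients [a, b, c, d, e, f]."""
          D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
          S = D.T @ D                      # scatter matrix
          C = np.zeros((6, 6))             # constraint matrix enforcing 4ac - b^2 = 1
          C[0, 2] = C[2, 0] = 2.0
          C[1, 1] = -1.0
          eigval, eigvec = np.linalg.eig(np.linalg.solve(S, C))
          for i in range(6):
              a, b, c = eigvec[0, i].real, eigvec[1, i].real, eigvec[2, i].real
              if np.isfinite(eigval[i]) and 4 * a * c - b * b > 0:
                  return eigvec[:, i].real
          raise ValueError("no ellipse solution found")

      t = np.linspace(0, 2 * np.pi, 200)                 # hypothetical noisy ellipse
      x = 3.0 * np.cos(t) + 0.01 * np.random.randn(t.size)
      y = 1.5 * np.sin(t) + 0.01 * np.random.randn(t.size)
      print(fit_ellipse(x, y))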

  10. Development of an automated scanning monochromator for sensitivity calibration of the MUSTANG instrument

    NASA Astrophysics Data System (ADS)

    Rivers, Thane D.

    1992-06-01

    An automated scanning monochromator was developed using an Acton Research Corporation (ARC) monochromator, an Ealing photomultiplier tube, and a Macintosh PC in conjunction with LabVIEW software. The LabVIEW Virtual Instrument written to operate the ARC monochromator is a mouse-driven, user-friendly program developed for automated spectral data measurements. The resolution and sensitivity of the automated scanning monochromator system were determined experimentally. The automated monochromator was then used for spectral measurements of a platinum lamp. Additionally, the reflectivity curve for a BaSO4-coated screen has been measured. Reflectivity measurements indicate a large discrepancy with expected results. Further analysis of the reflectivity experiment is required for conclusive results.

  11. HSI-Find: A Visualization and Search Service for Terascale Spectral Image Catalogs

    NASA Astrophysics Data System (ADS)

    Thompson, D. R.; Smith, A. T.; Castano, R.; Palmer, E. E.; Xing, Z.

    2013-12-01

    Imaging spectrometers are remote sensing instruments commonly deployed on aircraft and spacecraft. They provide surface reflectance in hundreds of wavelength channels, creating data cubes known as hyperspectral images. They provide rich compositional information, making them powerful tools for planetary and terrestrial science. These data products can be challenging to interpret because they contain datapoints numbering in the thousands (Dawn VIR) or millions (AVIRIS-C). Cross-image studies or exploratory searches involving more than one scene are rare; data volumes are often tens of GB per image and typical consumer-grade computers cannot store more than a handful of images in RAM. Visualizing the information in a single scene is challenging since the human eye can only distinguish three color channels out of the hundreds available. To date, analysis has been performed mostly on single images using purpose-built software tools that require extensive training and commercial licenses. The HSIFind software suite provides a scalable distributed solution to the problem of visualizing and searching large catalogs of spectral image data. It consists of a RESTful web service that communicates with a javascript-based browser client. The software provides basic visualization through an intuitive visual interface, allowing users with minimal training to explore the images or view selected spectra. Users can accumulate a library of spectra from one or more images and use these to search for similar materials. The result appears as an intensity map showing the extent of a spectral feature in a scene. Continuum removal can isolate diagnostic absorption features. The server-side mapping algorithm uses an efficient matched filter algorithm that can process a megapixel image cube in just a few seconds. This enables real-time interaction, leading to a new way of interacting with the data: the user can launch a search with a single mouse click and see the resulting map in seconds. This allows the user to quickly explore each image, ascertain the main units of surface material, localize outliers, and develop an understanding of the various materials' spectral characteristics. The HSIFind software suite is currently in beta testing at the Planetary Science Institute and a process is underway to release it under an open source license to the broader community. We believe it will benefit instrument operations during remote planetary exploration, where tactical mission decisions demand rapid analysis of each new dataset. The approach also holds potential for public spectral catalogs where its shallow learning curve and portability can make these datasets accessible to a much wider range of researchers. Acknowledgements: The HSIFind project acknowledges the NASA Advanced MultiMission Operating System (AMMOS) and the Multimission Ground Support Services (MGSS). E. Palmer is with the Planetary Science Institute, Tucson, AZ. Other authors are with the Jet Propulsion Laboratory, Pasadena, CA. This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology under a contract with the National Aeronautics and Space Administration. Copyright 2013, California Institute of Technology.
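
    The server-side matched filter mentioned above has a standard closed form: score each pixel spectrum against the target using the scene background mean and covariance. A minimal NumPy sketch of that classical filter (an assumption about the general technique, not HSIFind's implementation):

      import numpy as np

      def matched_filter(cube, target):
          """cube: (rows, cols, bands) reflectance; target: (bands,) spectrum."""
          rows, cols, bands = cube.shape
          X = cube.reshape(-1, bands)
          mu = X.mean(axis=0)
          cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(bands)  # regularized background covariance
          w = np.linalg.solve(cov, target - mu)
          w /= (target - mu) @ w                                # normalize so a pure target scores 1
          return ((X - mu) @ w).reshape(rows, cols)

      cube = np.random.rand(50, 60, 30)          # hypothetical 50x60 scene with 30 bands
      target = np.random.rand(30)
      print(matched_filter(cube, target).shape)  # (50, 60) score map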

  12. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    PubMed

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  13. Spectrally And Temporally Resolved Low-Light Level Video Microscopy

    NASA Astrophysics Data System (ADS)

    Wampler, John E.; Furukawa, Ruth; Fechheimer, Marcus

    1989-12-01

    The IDG low-light video microscope system was designed to aid studies of localization of subcellular luminescence sources and stimulus/response coupling in single living cells using luminescent probes. Much of the motivation for design of this instrument system came from the pioneering efforts of Dr. Reynolds (Reynolds, Q. Rev. Biophys. 5, 295-347; Reynolds and Taylor, Bioscience 30, 586-592), who showed the value of intensified video camera systems for detection and localization of fluorescence and bioluminescence signals from biological tissues. Our instrument system has essentially two roles: 1) localization and quantitation of very weak bioluminescence signals and 2) quantitation of intracellular environmental characteristics such as pH and calcium ion concentrations using fluorescent and bioluminescent probes. The instrument system exhibits over a million-fold operating range, allowing visualization and enhancement of quantum-limited images with quantum-limited response, spectral analysis of fluorescence signals, and transmitted light imaging. The computer control of the system implements rapid switching between light regimes, spatially resolved spectral scanning, and digital data processing for spectral shape analysis and for detailed analysis of the statistical distribution of single cell measurements. The system design and software algorithms used by the system are summarized. These design criteria are illustrated with examples taken from studies of bioluminescence, applications of bioluminescence to study developmental processes and gene expression in single living cells, and applications of fluorescent probes to study stimulus/response coupling in living cells.

  14. Coherence Study of Geomagnetic Fluctuations in Frequency Range .04 - 0.6 HZ between Remote Land Sites.

    DTIC Science & Technology

    1983-12-01

    ...data collection system at two separated land sites, to modify and adapt previously developed software for data analysis and to obtain spectral...the sources that produce these fluctuations. The data collection sites were separated by a distance of 40 km (see Appendix A). One site was at La Mesa

  15. Vibrational Spectral Studies of Gemfibrozil

    NASA Astrophysics Data System (ADS)

    Benitta, T. Asenath; Balendiran, G. K.; James, C.

    2008-11-01

    The Fourier Transform Raman and infrared spectra of the crystallized drug molecule 5-(2,5-Dimethylphenoxy)-2,2-dimethylpentanoic acid (Gemfibrozil) have been recorded and analyzed. Quantum chemical computational methods have been employed using the Gaussian 03 software package, based on the Hartree-Fock method, for theoretically modeling the grown molecule. The optimized geometry and vibrational frequencies have been predicted. Observed vibrational modes have been assigned with the aid of normal coordinate analysis.

  16. Processing TES Level-1B Data

    NASA Technical Reports Server (NTRS)

    DeBaca, Richard C.; Sarkissian, Edwin; Madatyan, Mariyetta; Shepard, Douglas; Gluck, Scott; Apolinski, Mark; McDuffie, James; Tremblay, Dennis

    2006-01-01

    TES L1B Subsystem is a computer program that performs several functions for the Tropospheric Emission Spectrometer (TES). The term "L1B" (an abbreviation of "level 1B") refers to data, specific to the TES, on radiometrically calibrated spectral radiances and their corresponding noise equivalent spectral radiances (NESRs), plus ancillary geolocation, quality, and engineering data. The functions performed by TES L1B Subsystem include shear analysis, monitoring of signal levels, detection of ice build-up, and phase correction and radiometric and spectral calibration of TES target data. Also, the program computes NESRs for target spectra, writes scientific TES level-1B data to hierarchical-data-format (HDF) files for public distribution, computes brightness temperatures, and quantifies interpixel signal variability for the purpose of first-order cloud and heterogeneous land screening by the level-2 software summarized in the immediately following article. This program uses an in-house-developed algorithm, called "NUSRT," to correct instrument line-shape factors.

  17. MS2Analyzer: A Software for Small Molecule Substructure Annotations from Accurate Tandem Mass Spectra

    PubMed Central

    2015-01-01

    Systematic analysis and interpretation of the large number of tandem mass spectra (MS/MS) obtained in metabolomics experiments is a bottleneck in discovery-driven research. MS/MS mass spectral libraries are small compared to all known small molecule structures and are often not freely available. MS2Analyzer was therefore developed to enable user-defined searches of thousands of spectra for mass spectral features such as neutral losses, m/z differences, and product and precursor ions from MS/MS spectra in MSP/MGF files. The software is freely available at http://fiehnlab.ucdavis.edu/projects/MS2Analyzer/. As the reference query set, 147 literature-reported neutral losses and their corresponding substructures were collected. This set was tested for accuracy of linking neutral loss analysis to substructure annotations using 19 329 accurate mass tandem mass spectra of structurally known compounds from the NIST11 MS/MS library. Validation studies showed that annotations based on 13 typical neutral losses, such as acetylations, cysteine conjugates, or glycosylations, correctly identify the associated substructures in 92.1 ± 6.4% of cases, while the absence of mass spectral features does not necessarily imply the absence of such substructures. Use of this tool has been successfully demonstrated for complex lipids in microalgae. PMID:25263576
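
    The underlying neutral-loss check is simple: a spectrum is flagged when it contains a product ion whose m/z sits within a tolerance of the precursor m/z minus the loss mass (for singly charged ions). A minimal, hypothetical Python sketch of that rule (not MS2Analyzer's query engine):

      def has_neutral_loss(precursor_mz, product_mzs, loss_mass, tol=0.01):
          """True if any product ion matches precursor_mz - loss_mass within tol."""
          expected = precursor_mz - loss_mass
          return any(abs(mz - expected) <= tol for mz in product_mzs)

      # Hypothetical spectrum; 162.0528 is the anhydrohexose loss typical of glycosylation.
      spectrum = {"precursor_mz": 500.25, "product_mzs": [120.08, 338.20, 443.21]}
      print(has_neutral_loss(spectrum["precursor_mz"], spectrum["product_mzs"], 162.0528))  # True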

  18. AtomPy: an open atomic-data curation environment

    NASA Astrophysics Data System (ADS)

    Bautista, Manuel; Mendoza, Claudio; Boswell, Josiah S; Ajoku, Chukwuemeka

    2014-06-01

    We present a cloud-computing environment for atomic data curation, networking among atomic data providers and users, teaching-and-learning, and interfacing with spectral modeling software. The system is based on Google-Drive Sheets, Pandas (Python Data Analysis Library) DataFrames, and IPython Notebooks for open community-driven curation of atomic data for scientific and technological applications. The atomic model for each ionic species is contained in a multi-sheet Google-Drive workbook, where the atomic parameters from all known public sources are progressively stored. Metadata (provenance, community discussion, etc.) accompanying every entry in the database are stored through Notebooks. Education tools on the physics of atomic processes as well as their relevance to plasma and spectral modeling are based on IPython Notebooks that integrate written material, images, videos, and active computer-tool workflows. Data processing workflows and collaborative software developments are encouraged and managed through the GitHub social network. Relevant issues this platform intends to address are: (i) data quality by allowing open access to both data producers and users in order to attain completeness, accuracy, consistency, provenance and currentness; (ii) comparisons of different datasets to facilitate accuracy assessment; (iii) downloading to local data structures (i.e. Pandas DataFrames) for further manipulation and analysis by prospective users; and (iv) data preservation by avoiding the discard of outdated sets.

  19. The Trial Software version for DEMETER power spectrum files visualization and mapping

    NASA Astrophysics Data System (ADS)

    Lozbin, Anatoliy; Inchin, Alexander; Shpadi, Maxim

    2010-05-01

    In the frame of the creation of Kazakhstan's Scientific Space System for earthquake-precursor research, the hardware and software of the DEMETER satellite were investigated. The DEMETER data processing software is based on the SWAN package under the IDL Virtual Machine and offers many features, but it lacks an important tool for spectrogram analysis: space-time visualization of power spectrum files from the electromagnetic instruments ICE and IMSC. To address this problem we have developed software which is offered for use. DeSS (DEMETER Spectrogram Software) is software for visualization, analysis and mapping of power spectrum data from the electromagnetic instruments ICE and IMSC. Its primary goal is to give the researcher a friendly tool for the analysis of electromagnetic data from the DEMETER satellite for earthquake-precursor and other ionospheric event research. The input data for the DeSS software are power spectrum files: the power spectrum of one component of the electric field in the VLF range (APID 1132); the power spectrum of one component of the electric field in the HF range (APID 1134); and the power spectrum of one component of the magnetic field in the VLF range (APID 1137). The main features and operations of the software are: various time and frequency filtration; visualization of the time dependence of signal intensity at a fixed frequency; spectral density visualization for a fixed frequency range; spectrogram autosizing and smoothing; the information at each point of the spectrogram (time, frequency and intensity); the spectrum information in a separate window consisting of 4 blocks; and data mapping with a 6-range scale. On the map the following information can be browsed: the satellite orbit; the conjugate point at the satellite altitude; the north conjugate point at an altitude of 110 km; and the south conjugate point at an altitude of 110 km. This is only a trial software version to help researchers, and we are always ready to collaborate with scientists on software improvement. References: 1. D. Lagoutte, J.Y. Brochot, D. de Carvalho, L. Madrias and M. Parrot. DEMETER Microsatellite. Scientific Mission Center. Data product description. DMT-SP-9-CM-6054-LPC. 2. D. Lagoutte, J.Y. Brochot, P. Latremoliere. SWAN - Software for Waveform Analysis. LPCE/NI/003.E - Part 1 (User's guide), Part 2 (Analysis tools), Part 3 (User's project interface).

  20. Albedos and spectral signatures determination and it connection to geological processes: Simile between Earth and other solar system bodies

    NASA Astrophysics Data System (ADS)

    Suarez, J.; Ochoa, L.; Saavedra, F.

    2017-07-01

    Remote sensing has always been the best investigation tool for planetary sciences. In this research, surface albedo data, electromagnetic spectra and satellite imagery have been used to understand glacier dynamics on some bodies of the solar system and how they relate to their compositions and associated geological processes; this methodology is very common in icy-moon studies. Using analytic software, albedo maps and geomorphological analyses were produced that allow interpretation of the different types of ice in the glaciers and their interaction with other materials; almost all of the images were worked in the visible and infrared ranges of the spectrum. Spectral data were later used to connect the reflectance with the chemical and rheological properties of the compounds studied. It has been concluded that albedo analysis is an effective tool to differentiate materials on the bodies' surfaces, but the application of spectral data is necessary to know the exact compounds of the glaciers and to gain a better understanding of the icy bodies.

  1. An integrated toolbox for processing and analysis of remote sensing data of inland and coastal waters - atmospheric correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.

  2. MaRiMba: a software application for spectral library-based MRM transition list assembly.

    PubMed

    Sherwood, Carly A; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B

    2009-10-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture.

  3. Volpe SuperFar V6.0 Software and Support Documentation; Letter Report V324-FB48B3-LR3

    DOT National Transportation Integrated Search

    2017-09-29

    This Letter Report serves to deliver the third external release version of the USDOT Volpe Center's SuperFAR Spectral Aircraft Noise Processing Software (Version 6.0). Earlier versions of the software were delivered to FAA in February 2015 and Marc...

  4. Radio Astronomy Software Defined Receiver Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vacaliuc, Bogdan; Leech, Marcus; Oxley, Paul

    The paper describes a Radio Astronomy Software Defined Receiver (RASDR) that is currently under development. RASDR is targeted for use by amateurs and small institutions where cost is a primary consideration. The receiver will operate from HF through 2.8 GHz. Front-end components such as preamps, block down-converters and pre-select bandpass filters are outside the scope of this development and will be provided by the user. The receiver includes RF amplifiers and attenuators, synthesized LOs, quadrature down converters, dual 8 bit ADCs and a Signal Processor that provides firmware processing of the digital bit stream. RASDR will interface to a user's PC via a USB or higher speed Ethernet LAN connection. The PC will run software that provides processing of the bit stream, a graphical user interface, as well as data analysis and storage. Software should support MAC OS, Windows and Linux platforms and will focus on such radio astronomy applications as total power measurements, pulsar detection, and spectral line studies.

  5. Image Classification Workflow Using Machine Learning Methods

    NASA Astrophysics Data System (ADS)

    Christoffersen, M. S.; Roser, M.; Valadez-Vergara, R.; Fernández-Vega, J. A.; Pierce, S. A.; Arora, R.

    2016-12-01

    Recent increases in the availability and quality of remote sensing datasets have fueled an increasing number of scientifically significant discoveries based on land use classification and land use change analysis. However, much of the software made to work with remote sensing data products, specifically multispectral images, is commercial and often prohibitively expensive. The free to use solutions that are currently available come bundled up as small parts of much larger programs that are very susceptible to bugs and difficult to install and configure. What is needed is a compact, easy to use set of tools to perform land use analysis on multispectral images. To address this need, we have developed software using the Python programming language with the sole function of land use classification and land use change analysis. We chose Python to develop our software because it is relatively readable, has a large body of relevant third party libraries such as GDAL and Spectral Python, and is free to install and use on Windows, Linux, and Macintosh operating systems. In order to test our classification software, we performed a K-means unsupervised classification, Gaussian Maximum Likelihood supervised classification, and a Mahalanobis Distance based supervised classification. The images used for testing were three Landsat rasters of Austin, Texas with a spatial resolution of 60 meters for the years of 1984 and 1999, and 30 meters for the year 2015. The testing dataset was easily downloaded using the Earth Explorer application produced by the USGS. The software should be able to perform classification based on any set of multispectral rasters with little to no modification. Our software makes the ease of land use classification using commercial software available without an expensive license.
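
    A minimal sketch of the unsupervised branch of such a workflow (a synthetic array in place of a Landsat raster read with GDAL, and scikit-learn's K-means rather than the authors' own code):

      import numpy as np
      from sklearn.cluster import KMeans

      bands, rows, cols = 6, 100, 100
      image = np.random.rand(bands, rows, cols)        # stand-in for a multispectral raster
      pixels = image.reshape(bands, -1).T              # (n_pixels, n_bands)

      kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(pixels)
      class_map = kmeans.labels_.reshape(rows, cols)   # per-pixel land-use class labels
      print(np.bincount(class_map.ravel()))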

  6. Q-3D: Imaging Spectroscopy of Quasar Hosts with JWST Analyzed with a Powerful New PSF Decomposition and Spectral Analysis Package

    NASA Astrophysics Data System (ADS)

    Wylezalek, Dominika; Veilleux, Sylvain; Zakamska, Nadia; Barrera-Ballesteros, J.; Luetzgendorf, N.; Nesvadba, N.; Rupke, D.; Sun, A.

    2017-11-01

    In the last few years, optical and near-IR IFU observations from the ground have revolutionized extragalactic astronomy. The unprecedented infrared sensitivity, spatial resolution, and spectral coverage of the JWST IFUs will ensure high demand from the community. For a wide range of extragalactic phenomena (e.g. quasars, starbursts, supernovae, gamma ray bursts, tidal disruption events) and beyond (e.g. nebulae, debris disks around bright stars), PSF contamination will be an issue when studying the underlying extended emission. We propose to provide the community with a PSF decomposition and spectral analysis package for high dynamic range JWST IFU observations allowing the user to create science-ready maps of relevant spectral features. Luminous quasars, with their bright central source (quasar) and extended emission (host galaxy), are excellent test cases for this software. Quasars are also of high scientific interest in their own right as they are widely considered to be the main driver in regulating massive galaxy growth. JWST will revolutionize our understanding of black hole-galaxy co-evolution by allowing us to probe the stellar, gas, and dust components of nearby and distant galaxies, spatially and spectrally. We propose to use the IFU capabilities of NIRSpec and MIRI to study the impact of three carefully selected luminous quasars on their hosts. Our program will provide (1) a scientific dataset of broad interest that will serve as a pathfinder for JWST science investigations in IFU mode and (2) a powerful new data analysis tool that will enable frontier science for a wide swath of astrophysical research.

  7. Dispersed Fringe Sensing Analysis - DFSA

    NASA Technical Reports Server (NTRS)

    Sigrist, Norbert; Shi, Fang; Redding, David C.; Basinger, Scott A.; Ohara, Catherine M.; Seo, Byoung-Joon; Bikkannavar, Siddarayappa A.; Spechler, Joshua A.

    2012-01-01

    Dispersed Fringe Sensing (DFS) is a technique for measuring and phasing segmented telescope mirrors using a dispersed broadband light image. DFS is capable of breaking the monochromatic light ambiguity, measuring absolute piston errors between segments of large segmented primary mirrors to tens of nanometers accuracy over a range of 100 micrometers or more. The DFSA software tool analyzes DFS images to extract DFS encoded segment piston errors, which can be used to measure piston distances between primary mirror segments of ground and space telescopes. This information is necessary to control mirror segments to establish a smooth, continuous primary figure needed to achieve high optical quality. The DFSA tool is versatile, allowing precise piston measurements from a variety of different optical configurations. DFSA technology may be used for measuring wavefront pistons from sub-apertures defined by adjacent segments (such as the Keck Telescope), or from separated sub-apertures used for testing large optical systems (such as sub-aperture wavefront testing for large primary mirrors using auto-collimating flats). An experimental demonstration of the coarse-phasing technology with verification of DFSA was performed at the Keck Telescope. DFSA includes image processing, wavelength and source spectral calibration, fringe extraction line determination, dispersed fringe analysis, and wavefront piston sign determination. The code is robust against internal optical system aberrations and against spectral variations of the source. In addition to the DFSA tool, the software package contains a simple but sophisticated MATLAB model to generate dispersed fringe images of optical system configurations in order to quickly estimate the coarse phasing performance given the optical and operational design requirements. Combining MATLAB (a high-level language and interactive environment developed by MathWorks), MACOS (JPL's software package for Modeling and Analysis for Controlled Optical Systems), and DFSA provides a unique optical development, modeling and analysis package to study current and future approaches to coarse phasing controlled segmented optical systems.

  8. Radar investigation of asteroids

    NASA Technical Reports Server (NTRS)

    Ostro, S. J.

    1981-01-01

    Software to support all stages of asteroid radar observation and data analysis is developed. First-order analysis of all data in hand is complete. Estimates of radar cross sections, circular polarization ratios, and limb-to-limb echo spectral bandwidths for asteroids 7 Iris, 16 Psyche, 97 Klotho, 1862 Apollo, and 1915 Quetzalcoatl are reported. Radar observations of two previously unobserved asteroids were conducted. An Aten asteroid, 2100 Ra-Shalom, with the smallest known semimajor axis (0.83 AU), was detected. Preliminary data reduction indicates a circular polarization ratio comparable to those of Apollo, Quetzalcoatl, and Toro.

  9. The ISO SWS on-line system

    NASA Technical Reports Server (NTRS)

    Roelfsema, P. R.; Kester, D. J. M.; Wesselius, P. R.; Wieprech, E.; Sym, N.

    1992-01-01

    The software which is currently being developed for the Short Wavelength Spectrometer (SWS) of the Infrared Space Observatory (ISO) is described. The spectrometer has a wide range of capabilities in the 2-45 micron infrared band. SWS contains two independent gratings, one for the long and one for the short wavelength section of the band. With the gratings a spectral resolution of approximately 1000 to approximately 2500 can be obtained. The instrument also contains two Fabry-Pérot interferometers yielding a resolution between approximately 1000 and approximately 20000. Software is currently being developed for the acquisition, calibration, and analysis of SWS data. The software is firstly required to run in a pipeline mode without human interaction, to process data as they are received from the telescope. However, both for testing and calibration of the instrument as well as for evaluation of the planned operating procedures the software should also be suitable for interactive use. Thirdly, the same software will be used for long term characterization of the instrument. The software must work properly within the environment designed by the European Space Agency (ESA) for the spacecraft operations. As a result strict constraints are put on I/O devices, throughput etc.

  10. HYDRA Hyperspectral Data Research Application Tom Rink and Tom Whittaker

    NASA Astrophysics Data System (ADS)

    Rink, T.; Whittaker, T.

    2005-12-01

    HYDRA is a freely available, easy to install tool for visualization and analysis of large local or remote hyper/multi-spectral datasets. HYDRA is implemented on top of the open source VisAD Java library via Jython - the Java implementation of the user friendly Python programming language. VisAD provides data integration, through its generalized data model, user-display interaction and display rendering. Jython has an easy to read, concise, scripting-like, syntax which eases software development. HYDRA allows data sharing of large datasets through its support of the OpenDAP and OpenADDE server-client protocols. The users can explore and interrogate data, and subset in physical and/or spectral space to isolate key areas of interest for further analysis without having to download an entire dataset. It also has an extensible data input architecture to recognize new instruments and understand different local file formats, currently NetCDF and HDF4 are supported.

  11. IRIS: a novel spectral imaging system for the analysis of cultural heritage objects

    NASA Astrophysics Data System (ADS)

    Papadakis, V. M.; Orphanos, Y.; Kogou, S.; Melessanaki, K.; Pouli, P.; Fotakis, C.

    2011-06-01

    A new portable spectral imaging system is herein presented, capable of acquiring high-resolution images (2 Mpixel) ranging from 380 nm up to 950 nm. The system consists of a digital color CCD camera, 15 interference filters covering the full sensitivity range of the detector, and a robust filter changing system. The acquisition software has been developed in the "LabVIEW" programming language, allowing easy handling and modification by end-users. The system has been tested and evaluated on a series of objects of Cultural Heritage (CH) value including paintings, encrusted stonework, ceramics etc. This paper aims to present the system, as well as its application and advantages in the analysis of artworks, with emphasis on the detailed compositional and structural information of layered surfaces based on reflection and fluorescence spectroscopy. Specific examples will be presented and discussed on the basis of system improvements.

  12. Rocketdyne automated dynamics data analysis and management system

    NASA Technical Reports Server (NTRS)

    Tarn, Robert B.

    1988-01-01

    An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, Fast Fourier Transformation analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the database are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.
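
    Power spectral density estimation of the kind listed above is commonly done with Welch's averaged-periodogram method. A minimal, hypothetical SciPy sketch (parameters invented for illustration, not Rocketdyne's software):

      import numpy as np
      from scipy.signal import welch

      fs = 20480.0                                  # hypothetical sample rate, Hz
      t = np.arange(0, 2.0, 1.0 / fs)
      x = np.sin(2 * np.pi * 600.0 * t) + 0.5 * np.random.randn(t.size)  # synthetic transducer record

      freqs, psd = welch(x, fs=fs, nperseg=4096)
      print("spectral peak near", freqs[np.argmax(psd)], "Hz")  # ~600 Hz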

  13. Processing methods for differential analysis of LC/MS profile data

    PubMed Central

    Katajamaa, Mikko; Orešič, Matej

    2005-01-01

    Background Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. Results We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. Conclusion The software is freely available under the GNU General Public License and it can be obtained from the project web page at: http://mzmine.sourceforge.net/. PMID:16026613

  14. Processing methods for differential analysis of LC/MS profile data.

    PubMed

    Katajamaa, Mikko; Oresic, Matej

    2005-07-18

    Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. The software is freely available under the GNU General Public License and it can be obtained from the project web page at: http://mzmine.sourceforge.net/.
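
    Peak detection of the kind performed by such toolboxes can be illustrated with a simple prominence-based picker on a synthetic spectrum (a hypothetical sketch using scipy.signal.find_peaks, not MZmine's recursive algorithm):

      import numpy as np
      from scipy.signal import find_peaks

      mz = np.linspace(100, 1000, 9000)
      intensity = (1e5 * np.exp(-0.5 * ((mz - 400.2) / 0.5) ** 2)
                   + 4e4 * np.exp(-0.5 * ((mz - 712.4) / 0.5) ** 2)
                   + 50.0 * np.abs(np.random.randn(mz.size)))       # two peaks plus noise

      idx, props = find_peaks(intensity, height=1e3, prominence=1e3)
      for i in idx:
          print(f"peak at m/z {mz[i]:.2f}, intensity {intensity[i]:.0f}")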

  15. Comprehensive analysis of NMR data using advanced line shape fitting.

    PubMed

    Niklasson, Markus; Otten, Renee; Ahlner, Alexandra; Andresen, Cecilia; Schlagnitweit, Judith; Petzold, Katja; Lundström, Patrik

    2017-10-01

    NMR spectroscopy is uniquely suited for atomic resolution studies of biomolecules such as proteins, nucleic acids and metabolites, since detailed information on structure and dynamics are encoded in positions and line shapes of peaks in NMR spectra. Unfortunately, accurate determination of these parameters is often complicated and time consuming, in part due to the need for different software at the various analysis steps and for validating the results. Here, we present an integrated, cross-platform and open-source software that is significantly more versatile than the typical line shape fitting application. The software is a completely redesigned version of PINT ( https://pint-nmr.github.io/PINT/ ). It features a graphical user interface and includes functionality for peak picking, editing of peak lists and line shape fitting. In addition, the obtained peak intensities can be used directly to extract, for instance, relaxation rates, heteronuclear NOE values and exchange parameters. In contrast to most available software the entire process from spectral visualization to preparation of publication-ready figures is done solely using PINT and often within minutes, thereby, increasing productivity for users of all experience levels. Unique to the software are also the outstanding tools for evaluating the quality of the fitting results and extensive, but easy-to-use, customization of the fitting protocol and graphical output. In this communication, we describe the features of the new version of PINT and benchmark its performance.
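
    Line shape fitting of a single well-resolved peak reduces to non-linear least squares against a model such as a Lorentzian. A minimal, hypothetical SciPy sketch (not PINT's fitting engine, which also handles multidimensional spectra and overlapped peaks):

      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzian(x, amplitude, center, width):
          return amplitude * width**2 / ((x - center) ** 2 + width**2)

      x = np.linspace(-5, 5, 400)                                  # e.g. offset in ppm
      y = lorentzian(x, 1.0, 0.3, 0.4) + 0.02 * np.random.randn(x.size)

      popt, pcov = curve_fit(lorentzian, x, y, p0=[1.0, 0.0, 0.5])
      print("fitted center and width:", popt[1], popt[2])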

  16. An approach to the analysis of SDSS spectroscopic outliers based on self-organizing maps. Designing the outlier analysis software package for the next Gaia survey

    NASA Astrophysics Data System (ADS)

    Fustes, D.; Manteiga, M.; Dafonte, C.; Arcay, B.; Ulla, A.; Smith, K.; Borrachero, R.; Sordo, R.

    2013-11-01

    Aims: A new method applied to the segmentation and further analysis of the outliers resulting from the classification of astronomical objects in large databases is discussed. The method is being used in the framework of the Gaia satellite Data Processing and Analysis Consortium (DPAC) activities to prepare automated software tools that will be used to derive basic astrophysical information that is to be included in the final Gaia archive. Methods: Our algorithm has been tested by means of simulated Gaia spectrophotometry, which is based on SDSS observations and theoretical spectral libraries covering a wide sample of astronomical objects. Self-organizing map networks are used to organize the information in clusters of objects, as homogeneously as possible according to their spectral energy distributions, and to project them onto a 2D grid where the data structure can be visualized. Results: We demonstrate the usefulness of the method by analyzing the spectra that were rejected by the SDSS spectroscopic classification pipeline and thus classified as "UNKNOWN". First, our method can help distinguish between astrophysical objects and instrumental artifacts. Additionally, the application of our algorithm to SDSS objects of unknown nature has allowed us to identify classes of objects with similar astrophysical natures. In addition, the method allows for the potential discovery of hundreds of new objects, such as white dwarfs and quasars. Therefore, the proposed method is shown to be very promising for data exploration and knowledge discovery in very large astronomical databases, such as the archive from the upcoming Gaia mission.
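
    The clustering step can be prototyped with an off-the-shelf self-organizing map. The sketch below uses the third-party minisom package as a stand-in for the authors' own SOM code, with random vectors in place of Gaia/SDSS spectrophotometry:

      import numpy as np
      from minisom import MiniSom

      spectra = np.random.rand(1000, 40)        # hypothetical normalized spectral energy distributions
      som = MiniSom(10, 10, input_len=40, sigma=1.5, learning_rate=0.5, random_seed=0)
      som.train_random(spectra, num_iteration=5000)

      # Assign each outlier spectrum to its best-matching unit on the 2D grid.
      winners = [som.winner(s) for s in spectra]
      print("first five best-matching units:", winners[:5])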

  17. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represent a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git ), and a live demo is available at http://democlmsvault.tyerslab.com/ .

  18. EOS MLS Level 1B Data Processing Software. Version 3

    NASA Technical Reports Server (NTRS)

    Perun, Vincent S.; Jarnot, Robert F.; Wagner, Paul A.; Cofield, Richard E., IV; Nguyen, Honghanh T.; Vuu, Christina

    2011-01-01

    This software is an improvement on Version 2, which was described in EOS MLS Level 1B Data Processing, Version 2.2, NASA Tech Briefs, Vol. 33, No. 5 (May 2009), p. 34. It accepts the EOS MLS Level 0 science/engineering data and the EOS Aura spacecraft ephemeris/attitude data, and produces calibrated instrument radiances and associated engineering and diagnostic data. This version makes the code more robust, improves calibration, provides more diagnostic outputs, defines the Galactic core more finely, and fixes the equator crossing. The Level 1 processing software manages several different tasks. It qualifies each data quantity using instrument configuration and checksum data, as well as data transmission quality flags. Statistical tests are applied for data quality and reasonableness. The instrument engineering data (e.g., voltages, currents, temperatures, and encoder angles) is calibrated by the software, and the filter channel space reference measurements are interpolated onto the times of each limb measurement, with the interpolated values then differenced from the measurements. Filter channel calibration target measurements are interpolated onto the times of each limb measurement and are used to compute radiometric gain. The total signal power measured by each digital autocorrelator spectrometer (DACS) is determined and analyzed for each data integration. The software converts each DACS data integration from an autocorrelation measurement in the time domain into a spectral measurement in the frequency domain, and separately estimates the spectrally smoothly varying and spectrally averaged components of the limb-port signal arising from antenna emission and scattering effects. Limb radiances are also calibrated.
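    A minimal sketch of two of the steps described above, under assumed array shapes and a generic two-point calibration scheme (this is not the MLS Level 1B code): converting a DACS autocorrelation into a frequency-domain spectrum, and computing radiometric gain from space and calibration-target views interpolated onto the limb measurement times:

      import numpy as np

      def autocorr_to_spectrum(acf):
          """Wiener-Khinchin: the power spectrum is the Fourier transform of the
          (real, symmetric) autocorrelation function measured by the DACS."""
          return np.abs(np.fft.rfft(acf))

      def calibrate_limb(limb_counts, limb_t, space_counts, space_t,
                         target_counts, target_t, t_target, t_space):
          """Interpolate space and target views onto the limb times, then convert
          counts to radiance with a two-point (gain) calibration (assumed linear)."""
          space = np.interp(limb_t, space_t, space_counts)      # zero-radiance reference
          target = np.interp(limb_t, target_t, target_counts)   # known warm reference
          gain = (target - space) / (t_target - t_space)        # counts per kelvin
          return (limb_counts - space) / gain                   # calibrated limb radiance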

  19. Marathon: An Open Source Software Library for the Analysis of Markov-Chain Monte Carlo Algorithms

    PubMed Central

    Rechner, Steffen; Berger, Annabell

    2016-01-01

    We present the software library marathon, which is designed to support the analysis of sampling algorithms that are based on the Markov-Chain Monte Carlo principle. The main application of this library is the computation of properties of so-called state graphs, which represent the structure of Markov chains. We demonstrate applications and the usefulness of marathon by investigating the quality of several bounding methods on four well-known Markov chains for sampling perfect matchings and bipartite graphs. In a set of experiments, we compute the total mixing time and several of its bounds for a large number of input instances. We find that the upper bound gained by the famous canonical path method is often several orders of magnitude larger than the total mixing time and deteriorates with growing input size. In contrast, the spectral bound is found to be a precise approximation of the total mixing time. PMID:26824442
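    For a reversible, ergodic chain with transition matrix P and stationary distribution π, the spectral bound mentioned above relates the mixing time to the second-largest eigenvalue modulus λ*. A small sketch (not marathon itself, which is a C++ library) evaluating the standard bound t_mix(ε) ≤ ln(1/(ε·π_min))/(1−λ*) for an explicit state graph:

      import numpy as np

      def spectral_mixing_bound(P, eps=0.25):
          """Upper-bound the total mixing time of a reversible chain from its spectrum."""
          evals, evecs = np.linalg.eig(P.T)
          i = np.argmin(np.abs(evals - 1.0))        # stationary distribution: left eigenvector for eigenvalue 1
          pi = np.real(evecs[:, i])
          pi = pi / pi.sum()
          lam = np.sort(np.abs(np.real(np.linalg.eigvals(P))))[-2]   # second-largest eigenvalue modulus
          return np.log(1.0 / (eps * pi.min())) / (1.0 - lam)

      # toy example: lazy random walk on a 4-cycle
      P = np.array([[0.5, 0.25, 0.0, 0.25],
                    [0.25, 0.5, 0.25, 0.0],
                    [0.0, 0.25, 0.5, 0.25],
                    [0.25, 0.0, 0.25, 0.5]])
      print(spectral_mixing_bound(P))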

  20. An efficient approach to integrated MeV ion imaging.

    PubMed

    Nikbakht, T; Kakuee, O; Solé, V A; Vosuoghi, Y; Lamehi-Rachti, M

    2018-03-01

    An ionoluminescence (IL) spectral imaging system, in addition to the common MeV ion imaging facilities such as µ-PIXE and µ-RBS, has been implemented at the Van de Graaff laboratory of Tehran. Versatile processing software is required to handle the large amount of data concurrently collected in µ-IL and common MeV ion imaging measurements through the respective methodologies. The open-source freeware PyMca, with image processing and multivariate analysis capabilities, is employed to simultaneously process common MeV ion imaging and µ-IL data. Herein, the program was adapted to support the OM_DAQ listmode data format. The performance of the µ-IL data acquisition system is confirmed through a case study. Moreover, the capabilities of the software for simultaneous analysis of µ-PIXE and µ-RBS experimental data are presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. On the Interpretation of the Fermi-GBM Transient Observed in Coincidence with LIGO Gravitational-wave Event GW150914

    NASA Astrophysics Data System (ADS)

    Connaughton, V.; Burns, E.; Goldstein, A.; Blackburn, L.; Briggs, M. S.; Christensen, N.; Hui, C. M.; Kocevski, D.; Littenberg, T.; McEnery, J. E.; Racusin, J.; Shawhan, P.; Veitch, J.; Wilson-Hodge, C. A.; Bhat, P. N.; Bissaldi, E.; Cleveland, W.; Giles, M. M.; Gibby, M. H.; von Kienlin, A.; Kippen, R. M.; McBreen, S.; Meegan, C. A.; Paciesas, W. S.; Preece, R. D.; Roberts, O. J.; Stanbro, M.; Veres, P.

    2018-01-01

    The weak transient detected by the Fermi Gamma-ray Burst Monitor (GBM) 0.4 s after GW150914 has generated much speculation regarding its possible association with the black hole binary merger. Investigation of the GBM data by Connaughton et al. revealed a source location consistent with GW150914 and a spectrum consistent with a weak, short gamma-ray burst. Greiner et al. present an alternative technique for fitting background-limited data in the low-count regime, and call into question the spectral analysis and the significance of the detection of GW150914-GBM presented in Connaughton et al. The spectral analysis of Connaughton et al. is not subject to the limitations of the low-count regime noted by Greiner et al. We find Greiner et al. used an inconsistent source position and did not follow the steps taken in Connaughton et al. to mitigate the statistical shortcomings of their software when analyzing this weak event. We use the approach of Greiner et al. to verify that our original spectral analysis is not biased. The detection significance of GW150914-GBM is established empirically, with a false-alarm rate (FAR) of ∼10⁻⁴ Hz. A post-trials false-alarm probability (FAP) of 2.2 × 10⁻³ (2.9σ) of this transient being associated with GW150914 is based on the proximity in time to the gravitational-wave event of a transient with that FAR. The FAR and the FAP are unaffected by the spectral analysis that is the focus of Greiner et al.

  2. An Open-Source Standard T-Wave Alternans Detector for Benchmarking.

    PubMed

    Khaustov, A; Nemati, S; Clifford, Gd

    2008-09-14

    We describe an open-source algorithm suite for T-Wave Alternans (TWA) detection and quantification. The software consists of Matlab implementations of the widely used Spectral Method and the Modified Moving Average (MMA) method, with libraries to read both WFDB and ASCII data under Windows and Linux. The software suite can run in batch mode or with a provided graphical user interface to aid waveform exploration. Our software suite was calibrated using an open-source TWA model, described in a partner paper [1] by Clifford and Sameni. For the PhysioNet/CinC Challenge 2008 we obtained a score of 0.881 for the Spectral Method and 0.400 for the MMA method. However, our objective was not to provide the best TWA detector, but rather a basis for detailed discussion of algorithms.
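    A hedged sketch of the classic Spectral Method for TWA (not the authors' Matlab code; window boundaries and beat counts are illustrative): beat-aligned ST-T samples are Fourier-transformed across beats, and the alternans power is read at 0.5 cycles/beat relative to a nearby noise band:

      import numpy as np

      def twa_spectral_method(beats):
          """beats: (n_beats, n_samples) matrix of aligned ST-T segments.
          Returns a k-score (alternans ratio) averaged over the segment."""
          n_beats = beats.shape[0]
          spec = np.abs(np.fft.rfft(beats - beats.mean(axis=0), axis=0)) ** 2
          freqs = np.fft.rfftfreq(n_beats, d=1.0)                # cycles per beat
          alt = spec[np.argmin(np.abs(freqs - 0.5))]             # power at 0.5 cycles/beat
          noise = spec[(freqs >= 0.43) & (freqs <= 0.48)]        # reference noise window (approximate)
          k = (alt - noise.mean(axis=0)) / noise.std(axis=0)
          return k.mean()

      # hypothetical usage: 128 beats, 100 samples per ST-T segment
      print(twa_spectral_method(np.random.randn(128, 100)))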

  3. Dispersive heterodyne probing method for laser frequency stabilization based on spectral hole burning in rare-earth doped crystals.

    PubMed

    Gobron, O; Jung, K; Galland, N; Predehl, K; Le Targat, R; Ferrier, A; Goldner, P; Seidelin, S; Le Coq, Y

    2017-06-26

    Frequency-locking a laser to a spectral hole in rare-earth doped crystals at cryogenic temperature has been shown to be a promising alternative to the use of high-finesse Fabry-Perot cavities when seeking a laser with very high short-term stability (M. J. Thorpe et al., Nature Photonics 5, 688 (2011)). We demonstrate here a novel technique for achieving such stabilization, based on generating a heterodyne beat-note between a master laser and a slave laser whose dephasing, caused by propagation near a spectral hole, generates the error signal of the frequency lock. The master laser is far detuned from the center of the inhomogeneous absorption profile and therefore exhibits only limited interaction with the crystal despite a potentially high optical power. The demodulation and frequency corrections are generated digitally with a hardware and software implementation based on a field-programmable gate array and a Software Defined Radio platform, making it straightforward to address several frequency channels (spectral holes) in parallel.

  4. Spectral, optical, thermal, Hirshfeld analysis and computational calculations of a new organic proton transfer crystal, 1H-benzo[d][1,2,3]triazol-3-ium-3,5-dinitrobenzoate

    NASA Astrophysics Data System (ADS)

    Sathya, K.; Dhamodharan, P.; Dhandapani, M.

    2018-05-01

    A molecular complex, 1H-benzo[d][1,2,3]triazol-3-ium-3,5-dinitrobenzoate (BTDB), was synthesized, crystallized and characterized by CHN analysis and 1H and 13C NMR spectral studies. The crystal is transparent over the entire visible region, as evidenced by the UV-Vis-NIR spectrum. TG/DTA analysis shows that BTDB is stable up to 150 °C. Single-crystal XRD analysis was carried out to ascertain the molecular structure; BTDB crystallizes in the monoclinic system with space group P21/n. Computational studies, including optimization of the molecular geometry, natural bond orbital (NBO) analysis, Mulliken population analysis and HOMO-LUMO analysis, were performed with the Gaussian 09 software by the B3LYP method at the 6-311G(d,p) level. Hirshfeld surfaces and 2D fingerprint plots revealed that O⋯H, H⋯H and O⋯C interactions are the most prevalent. The first-order hyperpolarizability (β) of BTDB is 44 times greater than that of urea. The results show that BTDB may be used for various opto-electronic applications.

  5. Astronomical data analysis software and systems I; Proceedings of the 1st Annual Conference, Tucson, AZ, Nov. 6-8, 1991

    NASA Technical Reports Server (NTRS)

    Worrall, Diana M. (Editor); Biemesderfer, Chris (Editor); Barnes, Jeannette (Editor)

    1992-01-01

    Consideration is given to a definition of a distribution format for X-ray data, the Einstein on-line system, the NASA/IPAC extragalactic database, Cosmic Background Explorer (COBE) astronomical databases, the ADAM software environment, the Groningen Image Processing System, the search for a common data model for astronomical data analysis systems, deconvolution for real and synthetic apertures, pitfalls in image reconstruction, a direct method for spectral and image restoration, and a description of a Poisson imagery super-resolution algorithm. Also discussed are multivariate statistics on HI and IRAS images, faint-object classification using neural networks, a matched filter for improving the SNR of radio maps, automated aperture photometry of CCD images, an interactive graphics interpreter, the ROSAT extreme ultraviolet sky survey, a quantitative study of optimal extraction, an automated analysis of spectra, applications of synthetic photometry, an algorithm for extra-solar planet system detection, and data reduction facilities for the William Herschel Telescope.

  6. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) on a linear or log scale, and also an evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along time. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and principal component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, computation of eigenvalues/eigenvectors, QR-solve/Eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux, and has also been partially implemented under MS-Windows. Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
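    STK itself is a C/C++ application; the following Python lines only illustrate, with synthetic data, the PSD-via-Fourier-transform step described above (sampling rate and signal content are assumptions):

      import numpy as np

      fs = 100.0                                   # sampling rate in Hz (assumed)
      t = np.arange(0, 600, 1 / fs)
      x = np.sin(2 * np.pi * 0.2 * t) + 0.5 * np.random.randn(t.size)   # synthetic seismogram

      # one-sided power spectral density via the Fourier transform (windowed periodogram)
      X = np.fft.rfft(x * np.hanning(x.size))
      psd = (np.abs(X) ** 2) / (fs * x.size)
      freq = np.fft.rfftfreq(x.size, d=1 / fs)
      print(freq[np.argmax(psd[1:]) + 1])          # dominant frequency, ~0.2 Hz
      print(10 * np.log10(psd.max()))              # peak PSD on a log (dB) scale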

  7. Application of Novel Software Algorithms to Spectral-Domain Optical Coherence Tomography for Automated Detection of Diabetic Retinopathy.

    PubMed

    Adhi, Mehreen; Semy, Salim K; Stein, David W; Potter, Daniel M; Kuklinski, Walter S; Sleeper, Harry A; Duker, Jay S; Waheed, Nadia K

    2016-05-01

    To present novel software algorithms applied to spectral-domain optical coherence tomography (SD-OCT) for automated detection of diabetic retinopathy (DR). Thirty-one diabetic patients (44 eyes) and 18 healthy, nondiabetic controls (20 eyes) who underwent volumetric SD-OCT imaging and fundus photography were retrospectively identified. A retina specialist independently graded DR stage. Trained automated software generated a retinal thickness score signifying macular edema and a cluster score signifying microaneurysms and/or hard exudates for each volumetric SD-OCT scan. Of 44 diabetic eyes, 38 had DR and six did not. Leave-one-out cross-validation using a linear discriminant at a missed-detection/false-alarm ratio of 3.00 yielded a software sensitivity of 92% and specificity of 69% for DR detection compared to clinical assessment. Novel software algorithms applied to commercially available SD-OCT can successfully detect DR and may have potential as a viable screening tool for DR in the future. [Ophthalmic Surg Lasers Imaging Retina. 2016;47:410-417.]. Copyright 2016, SLACK Incorporated.
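    A hedged sketch of the validation strategy described above (leave-one-out cross-validation with a linear discriminant), using scikit-learn; the features, labels, and decision threshold here are hypothetical and the actual software is proprietary:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import LeaveOneOut, cross_val_predict

      # hypothetical per-eye features: [retinal thickness score, cluster score]
      X = np.random.rand(64, 2)
      y = np.random.randint(0, 2, 64)        # 1 = diabetic retinopathy, 0 = control

      pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
      sensitivity = (pred[y == 1] == 1).mean()
      specificity = (pred[y == 0] == 0).mean()
      print(sensitivity, specificity)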

  8. Development of a Computer Architecture to Support the Optical Plume Anomaly Detection (OPAD) System

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1996-01-01

    The NASA OPAD spectrometer system relies heavily on extensive software which repetitively extracts spectral information from the engine plume and reports the amounts of metals present in the plume. The development of this software is at a sufficiently advanced stage that it can be used in actual engine tests to provide valuable data on engine operation and health. This activity will continue and, in addition, the OPAD system is planned to be used in flight aboard space vehicles. The two implementations, test-stand and in-flight, may have some differing requirements. For example, the data stored during a test-stand experiment are much more extensive than in the in-flight case. In both cases, though, the majority of the requirements are similar. New data from the spectrograph are generated every 0.5 s or faster. All processing must be completed within this period to maintain real-time performance. Every 0.5 s, the OPAD system must report the amounts of specific metals within the engine plume, given the spectral data. At present, the software in the OPAD system performs this function by solving the inverse problem. It uses powerful physics-based computational models (the SPECTRA code), which receive amounts of metals as inputs to produce the spectral data that would have been observed, had the same metal amounts been present in the engine plume. During the experiment, for every spectrum that is observed, an initial approximation is performed using neural networks to establish an initial metal composition which approximates the real one as accurately as possible. Then, using optimization techniques, the SPECTRA code is repetitively used to produce a fit to the data, by adjusting the metal input amounts until the produced spectrum matches the observed one to within a given level of tolerance. This iterative solution to the original problem of determining the metal composition in the plume requires a relatively long time to execute on a modern single-processor workstation, and therefore real-time operation is currently not possible. The number of iterations required to fit each spectral sample may vary; yet, the OPAD system must be designed to maintain real-time performance in all cases. Although faster single-processor workstations are available for execution of the fitting and SPECTRA software, this option is unattractive due to the excessive cost associated with very fast workstations and also because such hardware is not easily expandable to accommodate future versions of the software which may require more processing power. Initial research has already demonstrated that the OPAD software can take advantage of a parallel computer architecture to achieve the necessary speedup. Current work has improved the software by converting it into a form which is easily parallelizable. Timing experiments have been performed to establish the computational complexity and execution speed of major components of the software. This work provides the foundation for future work which will create a fully parallel version of the software executing on a shared-memory multiprocessor system.
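    The fitting loop described above can be sketched as follows, with a toy forward model standing in for the SPECTRA code and a constant start vector standing in for the neural-network initial guess (both are assumptions; this is not the OPAD software):

      import numpy as np
      from scipy.optimize import least_squares

      def forward_model(metal_amounts, wavelengths):
          """Placeholder for the physics-based SPECTRA code: maps metal amounts
          to a predicted plume spectrum (here, a toy sum of Gaussian lines)."""
          centers = np.linspace(wavelengths.min(), wavelengths.max(), metal_amounts.size)
          return sum(a * np.exp(-0.5 * ((wavelengths - c) / 2.0) ** 2)
                     for a, c in zip(metal_amounts, centers))

      wavelengths = np.linspace(300.0, 800.0, 500)
      observed = forward_model(np.array([1.0, 0.3, 0.7]), wavelengths)  # synthetic "observation"

      x0 = np.full(3, 0.5)                       # initial guess (would come from the neural net)
      fit = least_squares(lambda x: forward_model(x, wavelengths) - observed,
                          x0, bounds=(0, np.inf))
      print(fit.x)                               # recovered metal amounts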

  9. Spectral Graph Theory Analysis of Software-Defined Networks to Improve Performance and Security

    DTIC Science & Technology

    2015-09-01

    ... listed with its associated IP address. 3. Hardware Components: The hardware in the test bed included HP switches and Raspberry Pis. Two types [...] discernible difference between the two types. The hosts in the network are Raspberry Pis [58], which are small, inexpensive computers with 10/100 [...]. The Pis ran one of four operating systems: Raspbian, ArchLinux, Kali, and Windows 10. All of the Raspberry Pis were configured with Iperf [59].

  10. Airborne hyperspectral remote sensing in Italy

    NASA Astrophysics Data System (ADS)

    Bianchi, Remo; Marino, Carlo M.; Pignatti, Stefano

    1994-12-01

    The Italian National Research Council (CNR), in the framework of its `Strategic Project for Climate and Environment in Southern Italy', established a new laboratory for airborne hyperspectral imaging devoted to environmental problems. Since the end of June 1994, the LARA (Laboratorio Aereo per Ricerche Ambientali -- Airborne Laboratory for Environmental Studies) Project has been fully operational, providing hyperspectral data to the national and international scientific community by means of deployments of its CASA-212 aircraft carrying the Daedalus AA5000 MIVIS (multispectral infrared and visible imaging spectrometer) system. MIVIS is a modular instrument consisting of 102 spectral channels that use independent optical sensors, simultaneously sampled and recorded onto a compact computer-compatible magnetic tape medium with a data capacity of 10.2 Gbytes. To support the preprocessing and production pipeline for the large hyperspectral data sets, CNR installed in Pomezia, a town close to Rome, a ground-based computer system with software designed to handle MIVIS data. The software (MIDAS, Multispectral Interactive Data Analysis System), besides managing data production, gives users a powerful and highly extensible hyperspectral analysis system. The Pomezia ground station is designed to maintain and check MIVIS instrument performance through the evaluation of data quality (spectral accuracy, signal-to-noise performance, signal variations, etc.), and to produce, archive, and distribute MIVIS data in the form of geometrically and radiometrically corrected data sets on low-cost, easy-access CC media.

  11. A digital algorithm for spectral deconvolution with noise filtering and peak picking: NOFIPP-DECON

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.; Settle, G. L.; Knight, R. D.

    1975-01-01

    Noise-filtering, peak-picking deconvolution software incorporates multiple convoluted convolute integers and multiparameter optimization pattern search. The two theories are described and three aspects of the software package are discussed in detail. Noise-filtering deconvolution was applied to a number of experimental cases ranging from noisy, nondispersive X-ray analyzer data to very noisy photoelectric polarimeter data. Comparisons were made with published infrared data, and a man-machine interactive language has evolved for assisting in very difficult cases. A modified version of the program is being used for routine preprocessing of mass spectral and gas chromatographic data.
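    The "convolute integers" referenced above are the integer coefficients of Savitzky-Golay-type smoothing and differentiating filters; a rough modern analogue of noise filtering followed by peak picking (not the NOFIPP-DECON code itself, and with made-up test data) is:

      import numpy as np
      from scipy.signal import savgol_filter, find_peaks

      x = np.linspace(0, 10, 2000)
      spectrum = (np.exp(-0.5 * ((x - 3) / 0.1) ** 2)
                  + 0.6 * np.exp(-0.5 * ((x - 6) / 0.15) ** 2)
                  + 0.05 * np.random.randn(x.size))          # noisy two-peak test spectrum

      smoothed = savgol_filter(spectrum, window_length=31, polyorder=3)   # noise filtering
      peaks, props = find_peaks(smoothed, height=0.1, prominence=0.05)    # peak picking
      print(x[peaks], props["peak_heights"])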

  12. Comparative shotgun proteomics using spectral count data and quasi-likelihood modeling.

    PubMed

    Li, Ming; Gray, William; Zhang, Haixia; Chung, Christine H; Billheimer, Dean; Yarbrough, Wendell G; Liebler, Daniel C; Shyr, Yu; Slebos, Robbert J C

    2010-08-06

    Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography-tandem mass spectrometry (LC-MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher's Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in protein spectral counts reflect the underlying biology of the samples.
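    A hedged sketch of the quasi-likelihood modeling idea (a Poisson GLM with an estimated dispersion, fitted with statsmodels); the counts, group labels, and per-sample totals below are hypothetical and this is not the QuasiTel code:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # hypothetical spectral counts for one protein across samples from two groups
      df = pd.DataFrame({
          "count": [12, 15, 9, 14, 30, 28, 35, 26],
          "group": [0, 0, 0, 0, 1, 1, 1, 1],
          "total": [5e4, 4.8e4, 5.2e4, 5.1e4, 5.0e4, 4.9e4, 5.3e4, 5.0e4],  # total spectra per sample
      })

      X = sm.add_constant(df["group"])
      model = sm.GLM(df["count"], X, family=sm.families.Poisson(),
                     offset=np.log(df["total"]))     # normalize by total spectra per sample
      res = model.fit(scale="X2")                    # quasi-likelihood: Pearson chi-square dispersion
      print(res.summary())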

  13. Comparative Shotgun Proteomics Using Spectral Count Data and Quasi-Likelihood Modeling

    PubMed Central

    2010-01-01

    Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography−tandem mass spectrometry (LC−MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher’s Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography−multiple reaction monitoring mass spectrometry (LC−MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in protein spectral counts reflect the underlying biology of the samples. PMID:20586475

  14. Cloud parallel processing of tandem mass spectrometry based proteomics data.

    PubMed

    Mohammed, Yassene; Mostovenko, Ekaterina; Henneman, Alex A; Marissen, Rob J; Deelder, André M; Palmblad, Magnus

    2012-10-05

    Data analysis in mass spectrometry based proteomics struggles to keep pace with the advances in instrumentation and the increasing rate of data acquisition. Analyzing these data involves multiple steps requiring diverse software, using different algorithms and data formats. The speed and performance of mass spectral search engines are continuously improving, although not necessarily fast enough to meet the challenges posed by the volume of acquired data. Improving and parallelizing the search algorithms is one possibility; data decomposition presents another, simpler strategy for introducing parallelism. We describe a general method for parallelizing identification of tandem mass spectra using data decomposition that keeps the search engine intact and wraps the parallelization around it. We introduce two algorithms for decomposing mzXML files and recomposing the resulting pepXML files. This makes the approach applicable to different search engines, including those relying on sequence databases and those searching spectral libraries. We use cloud computing to deliver the computational power and scientific workflow engines to interface and automate the different processing steps. We show how to leverage these technologies to achieve faster data analysis in proteomics and present three scientific workflows for parallel database as well as spectral library search using our data decomposition programs, X!Tandem, and SpectraST.
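    The data-decomposition strategy can be illustrated with a generic sketch (the real workflows split mzXML and merge pepXML with dedicated programs; the split, search, and merge functions below are stand-ins):

      from multiprocessing import Pool

      def split(spectra, n_chunks):
          """Decompose the input into roughly equal chunks, one per worker."""
          k, m = divmod(len(spectra), n_chunks)
          return [spectra[i * k + min(i, m):(i + 1) * k + min(i + 1, m)] for i in range(n_chunks)]

      def search_chunk(chunk):
          """Placeholder for running the unmodified search engine on one chunk."""
          return [f"identification for {s}" for s in chunk]

      if __name__ == "__main__":
          spectra = [f"spectrum_{i}" for i in range(1000)]       # stand-in for mzXML scans
          with Pool(processes=4) as pool:
              partial_results = pool.map(search_chunk, split(spectra, 4))
          merged = [r for part in partial_results for r in part]  # recomposition step
          print(len(merged))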

  15. NMRPipe: a multidimensional spectral processing system based on UNIX pipes.

    PubMed

    Delaglio, F; Grzesiek, S; Vuister, G W; Zhu, G; Pfeifer, J; Bax, A

    1995-11-01

    The NMRPipe system is a UNIX software environment of processing, graphics, and analysis tools designed to meet current routine and research-oriented multidimensional processing requirements, and to anticipate and accommodate future demands and developments. The system is based on UNIX pipes, which allow programs running simultaneously to exchange streams of data under user control. In an NMRPipe processing scheme, a stream of spectral data flows through a pipeline of processing programs, each of which performs one component of the overall scheme, such as Fourier transformation or linear prediction. Complete multidimensional processing schemes are constructed as simple UNIX shell scripts. The processing modules themselves maintain and exploit accurate records of data sizes, detection modes, and calibration information in all dimensions, so that schemes can be constructed without the need to explicitly define or anticipate data sizes or storage details of real and imaginary channels during processing. The asynchronous pipeline scheme provides other substantial advantages, including high flexibility, favorable processing speeds, choice of both all-in-memory and disk-bound processing, easy adaptation to different data formats, simpler software development and maintenance, and the ability to distribute processing tasks on multi-CPU computers and computer networks.
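    The pipe-based design can be mimicked conceptually in Python with generators, each stage consuming and emitting a stream of 1D vectors; this is only an analogy with made-up processing stages, not NMRPipe itself, which chains compiled programs through UNIX pipes:

      import numpy as np

      def read_fids(n, npts):
          """Source stage: emit synthetic free-induction decays, one per 'pipe read'."""
          for _ in range(n):
              yield np.exp(-np.arange(npts) / 50.0) * np.exp(2j * np.pi * 0.1 * np.arange(npts))

      def apodize(stream, lb=5.0, npts=256):
          window = np.exp(-np.arange(npts) * lb / npts)
          for fid in stream:
              yield fid * window                      # window-function stage

      def fourier(stream):
          for fid in stream:
              yield np.fft.fftshift(np.fft.fft(fid))  # Fourier-transform stage

      # compose the "pipeline": read | apodize | fourier
      for spectrum in fourier(apodize(read_fids(3, 256))):
          print(np.argmax(np.abs(spectrum)))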

  16. MaRiMba: A Software Application for Spectral Library-Based MRM Transition List Assembly

    PubMed Central

    Sherwood, Carly A.; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K.; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W.; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B.

    2009-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture. PMID:19603829

  17. Detection And Mapping (DAM) package. Volume 4B: Software System Manual, part 2

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    Computer programs, graphic devices, and an integrated set of manual procedures designed for efficient production of precisely registered and formatted maps from digital data are presented. The software can be used on any Univac 1100 series computer. The software includes pre-defined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3.

  18. McIDAS-V: Advanced Visualization for 3D Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Rink, T.; Achtor, T. H.

    2010-12-01

    McIDAS-V is a Java-based, open-source, freely available software package for analysis and visualization of geophysical data. Its advanced capabilities provide very interactive 4D displays, including 3D volumetric rendering and fast sub-manifold slicing, linked to an abstract mathematical data model with built-in metadata for units, coordinate system transforms and sampling topology. A Jython interface provides user-defined analysis and computation in terms of the internal data model. These powerful capabilities to integrate data, analysis and visualization are being applied to hyperspectral sounding retrievals, e.g., AIRS and IASI, of moisture and cloud density to interrogate and analyze their 3D structure, as well as to validate them against instruments such as CALIPSO, CloudSat and MODIS. The object-oriented framework design allows for specialized extensions for novel displays and new sources of data. Community-defined CF conventions for gridded data are understood by the software, and such data can be immediately imported into the application. This presentation will show examples of how McIDAS-V is used in 3-dimensional data analysis, display and evaluation.

  19. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    NASA Astrophysics Data System (ADS)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data, with an aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and provides analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width at half-maximum, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels and uncertainties and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open-source, Python-based 1D spectral interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and to manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user-created plugins that add new functionality. This work was supported in part by HST AR #13919, HST GO #14268, and HST AR #14560.
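    As a small illustration of the kind of Voigt-profile fitting and non-parametric measurement mentioned above (not Spectacle's API; the wavelength grid and line parameters are made up), using astropy modeling:

      import numpy as np
      from astropy.modeling import models, fitting

      wave = np.linspace(1210.0, 1222.0, 600)                       # hypothetical wavelength grid (Angstrom)
      true = models.Voigt1D(x_0=1215.67, amplitude_L=0.8, fwhm_L=0.3, fwhm_G=0.5)
      flux = 1.0 - true(wave) + np.random.normal(0, 0.02, wave.size)  # normalized absorption spectrum

      # fit the absorption line as continuum minus a Voigt profile
      init = models.Const1D(1.0) - models.Voigt1D(x_0=1215.5, amplitude_L=0.5, fwhm_L=0.2, fwhm_G=0.4)
      fit = fitting.LevMarLSQFitter()(init, wave, flux)

      # non-parametric equivalent width: integral of (1 - normalized flux)
      ew = np.trapz(1.0 - flux, wave)
      print(fit, ew)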

  20. A data base and analysis program for shuttle main engine dynamic pressure measurements. Appendix F: Data base plots for SSME tests 750-120 through 750-200

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1986-01-01

    A dynamic pressure data base and data base management system developed to characterize the Space Shuttle Main Engine (SSME) dynamic pressure environment is presented. The data base represents dynamic pressure measurements obtained during single engine hot firing tests of the SSME. Software is provided to permit statistical evaluation of selected measurements under specified operating conditions. An interpolation scheme is also included to estimate spectral trends with SSME power level.

  1. Spectral analysis of aeromagnetic profiles for depth estimation principles, software, and practical application

    USGS Publications Warehouse

    Sadek, H.S.; Rashad, S.M.; Blank, H.R.

    1984-01-01

    If proper account is taken of the constraints of the method, it is capable of providing depth estimates to within an accuracy of about 10 percent under suitable circumstances. The estimates are unaffected by source magnetization and are relatively insensitive to assumptions as to source shape or distribution. The validity of the method is demonstrated by analyses of synthetic profiles and profiles recorded over Harrat Rahat, Saudi Arabia, and Diyur, Egypt, where source depths have been proved by drilling.
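    A hedged sketch of the underlying depth estimate (not the authors' program): for an ensemble of sources at depth h, the power spectrum of the field falls off roughly as exp(-2hk), so h can be read from the slope of ln P versus wavenumber k (in radians per unit distance) over a suitable band. The station spacing, band limits, and synthetic test below are assumptions.

      import numpy as np

      def spectral_depth(profile, dx, k_min=0.2, k_max=2.0):
          """Estimate an average source depth from the slope of the log power spectrum
          of a single profile; dx is the station spacing."""
          idx = np.arange(profile.size)
          prof = profile - np.polyval(np.polyfit(idx, profile, 1), idx)   # detrend
          power = np.abs(np.fft.rfft(prof * np.hanning(prof.size))) ** 2
          k = 2 * np.pi * np.fft.rfftfreq(prof.size, d=dx)                 # radian wavenumber
          band = (k > k_min) & (k < k_max)
          slope = np.polyfit(k[band], np.log(power[band]), 1)[0]
          return -slope / 2.0                                              # since ln P ~ const - 2*h*k

      # synthetic test: white source spectrum attenuated as if observed at height h_true = 1.5
      n, dx, h_true = 2048, 0.25, 1.5
      k = 2 * np.pi * np.fft.rfftfreq(n, d=dx)
      src = np.fft.rfft(np.random.randn(n))
      profile = np.fft.irfft(src * np.exp(-h_true * k), n)
      print(spectral_depth(profile, dx))            # expect a value near h_true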

  2. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    NASA Astrophysics Data System (ADS)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can result in several software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potentialities of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first one (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second approach (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and thereafter seven classes were considered for OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.

  3. Determination of tailored filter sets to create rayfiles including spatial and angular resolved spectral information.

    PubMed

    Rotscholl, Ingo; Trampert, Klaus; Krüger, Udo; Perner, Martin; Schmidt, Franz; Neumann, Cornelius

    2015-11-16

    To simulate and optimize optical designs with respect to perceived color and homogeneity in commercial ray tracing software, realistic light source models are needed. Spectral rayfiles provide angularly and spatially varying spectral information. We propose a spectral reconstruction method for creating spectral rayfiles from a minimum of time-consuming goniophotometric near-field measurements with optical filters. Our discussion focuses on selecting the ideal optical filter combination for any arbitrary spectrum out of a given filter set by considering measurement uncertainties with Monte Carlo simulations. We minimize the simulation time by a preselection of all filter combinations, which is based on a factorial design.

  4. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Liang, Zhidan; McGuinness, Kenneth N.; Crespo, Alejandro; Zhong, Wendy

    2018-05-01

    Disulfide bond formation is critical for maintaining the structural stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides. A thorough and in-depth understanding of how disulfide-linked peptides fragment in a mass spectrometer is key to developing software to interpret the tandem mass spectra of these peptides. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a newly developed computer algorithm based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved.

  5. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Liang, Zhidan; McGuinness, Kenneth N.; Crespo, Alejandro; Zhong, Wendy

    2018-01-01

    Disulfide bond formation is critical for maintaining the structural stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides. A thorough and in-depth understanding of how disulfide-linked peptides fragment in a mass spectrometer is key to developing software to interpret the tandem mass spectra of these peptides. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a newly developed computer algorithm based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved.

  6. Inclusion of TCAF model in XSPEC to study accretion flow dynamics around black hole candidates

    NASA Astrophysics Data System (ADS)

    Debnath, Dipak; Chakrabarti, Sandip Kumar; Mondal, Santanu

    Spectral and temporal properties of black hole candidates can be well understood with the Chakrabarti-Titarchuk solution of the two-component advective flow (TCAF). This model requires two accretion rates, namely the Keplerian disk accretion rate and the sub-Keplerian halo accretion rate, the latter being composed of a low angular momentum flow which may or may not develop a shock. In this solution, the relevant parameter is the relative importance of the halo rate (which creates the Compton cloud region) with respect to the Keplerian disk rate (the soft photon source). Though this model has been used earlier to manually fit data of several black hole candidates quite satisfactorily, for the first time we are able to create a user-friendly version by implementing an additive table model FITS file in GSFC/NASA's spectral analysis software package XSPEC. This enables any user to extract physical parameters of accretion flows, such as the two accretion rates, shock location, shock strength, etc., for any black hole candidate. Most importantly, unlike any other theoretical model, we show that TCAF is capable of predicting timing properties from spectral fits, since in TCAF a shock is responsible for deciding spectral slopes as well as QPO frequencies.
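    Once such a table model exists, loading it follows the standard XSPEC additive-table syntax; a hedged sketch using PyXspec, where the file name tcaf.fits, the absorption component, and the parameter listing are assumptions not taken from the paper:

      from xspec import Model

      # load an absorbed additive table model from a (hypothetical) TCAF FITS table
      m = Model("tbabs*atable{tcaf.fits}")
      # per the abstract, the table parameters would include the two accretion rates,
      # the shock location and shock strength, plus a normalization
      m.show()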

  7. Expanding the scope and applicability of laser-based spectroscopy to studies of ecohydrology by removing organic contaminants in natural water

    NASA Astrophysics Data System (ADS)

    Dennis, K. J.; Rees-Owen, R. L.; Brooks, P. D.; Carter, J.; Dawson, T. E.

    2014-12-01

    The ability to measure the stable isotopic composition of plant and soil waters, surface waters and ambient atmospheric vapor is essential to understanding an ecosystem's water budget, including how water cycles between the air, plants and the subsurface. With the advent of laser-based spectroscopy, e.g., Cavity Ring-Down Spectroscopy (CRDS), the isotopic analysis of waters has become increasingly cost-effective and prevalent, with precision comparable to conventional isotope ratio mass spectrometry methods. However, early work [1,2] demonstrated that the accuracy of isotopic analysis by laser-absorption techniques can be compromised by spectral interference from organic compounds, in particular methanol and ethanol [1], which are prevalent in ecologically derived waters. Here we present results from the Picarro Micro-Combustion Module (MCM), which destructively removes these interfering organic species from the analyzed water vapor stream by oxidizing them to CO2 and H2O. Analyzed samples include simulated plant water solutions, waters doped with varying concentrations of potentially problematic organic compounds, and actual plant water extracts. We find that the median offset between IRMS and a Picarro L2130-i outfitted with an MCM is less than 0.5 ‰ for δ18O and less than 1 ‰ for δD. In parallel with the destruction of organic contaminants, a software tool can also be used to assess the probability of spectral interference. This software performs a statistical analysis of spectral fit parameters, e.g., the shift in the spectral baseline, and compares unknown samples to clean standard waters. In general, the most common primary metabolites present in plant materials include the light organic acids, e.g., benzoic and formic acid. At low concentrations (0.1 and 1%), formic acid does not appear to interfere with the resolved absorption spectra of H2O, HDO and H218O. Similar tests will be conducted for benzoic acid. Conversely, although methanol and ethanol are present only in trace amounts in plants, these alcohols can cause large interferences even at low concentrations (1% and 0.025% for ethanol and methanol, respectively). Using these results, we will propose guidance on when CRDS analysis of ecologically derived waters performs best. [1] Brand et al. (2009), RCM, 23. [2] West et al. (2010), RCM, 24.

  8. Space Missions: Long Term Preservation of IDL-based Software using GDL

    NASA Astrophysics Data System (ADS)

    Coulais, A.; Schellens, M.; Arabas, S.; Lenoir, M.; Noreskal, L.; Erard, S.

    2012-09-01

    GNU Data Language (GDL) is a free software clone of IDL, an interactive language widely used in Astronomy and space missions for decades. Proprietary status, license restrictions, price, sustainability and continuity of support for particular platforms are recurrent concerns in the Astronomy community, especially for space missions, which require long-term support. In this paper, we describe the key features of GDL and the main achievements from recent development work. We illustrate the maturity of GDL by presenting two examples of application: reading spectral cubes in PDS format and use of the HEALPix library. These examples support the main argument of the paper: that GDL has reached a level of maturity and usability ensuring long-term preservation of analysis capabilities for numerous ground experiments and space missions based on IDL.

  9. Assessment of a spectral domain OCT segmentation software in a retrospective cohort study of exudative AMD patients.

    PubMed

    Tilleul, Julien; Querques, Giuseppe; Canoui-Poitrine, Florence; Leveziel, Nicolas; Souied, Eric H

    2013-01-01

    To assess the ability of the Spectralis optical coherence tomography (OCT) segmentation software to identify the inner limiting membrane and Bruch's membrane in exudative age-related macular degeneration (AMD) patients. Thirty-eight eyes of 38 treatment-naive exudative AMD patients were retrospectively included. They all had a complete ophthalmologic examination including Spectralis OCT at baseline and at months 1 and 2. Reliability of the segmentation software was assessed by 2 ophthalmologists and was defined as good if both the inner limiting membrane and Bruch's membrane were correctly drawn. A total of 38 patient charts were reviewed (114 scans). The inner limiting membrane was correctly drawn by the segmentation software in 114/114 spectral-domain OCT scans (100%). Conversely, Bruch's membrane was correctly drawn in 59/114 scans (51.8%). The software was less reliable in locating Bruch's membrane in cases of pigment epithelium detachment (PED) than without PED (42.5 vs. 73.5%, respectively; p = 0.049), but its reliability was not associated with subretinal fluid (SRF) or cystoid macular edema (CME) (p = 0.55 and p = 0.10, respectively). Segmentation of the inner limiting membrane was consistently reliable, but Bruch's membrane segmentation was poorly reliable using the automatic Spectralis segmentation software. Based on this software, evaluation of retinal thickness may be incorrect, particularly in cases of PED, which is an important parameter that is not included when measuring retinal thickness. Copyright © 2012 S. Karger AG, Basel.

  10. Synthesis, spectral characterization and density functional theory exploration of 1-(quinolin-3-yl)piperidin-2-ol

    NASA Astrophysics Data System (ADS)

    Suresh, M.; Syed Ali Padusha, M.; Bharanidharan, S.; Saleem, H.; Dhandapani, A.; Manivarman, S.

    2015-06-01

    The experimental and theoretical vibrational frequencies of a newly synthesized compound, namely 1-(quinolin-3-yl)piperidin-2-ol (QPPO), are analyzed. The experimental FT-IR (4000-400 cm-1) and FT-Raman (4000-100 cm-1) spectra of the molecule in the solid phase have been recorded. The optimized molecular structure and vibrational assignments of QPPO have been investigated experimentally and theoretically using the Gaussian03W software package. The stability of the molecule arising from hyper-conjugative interaction and charge delocalization has been analyzed using NBO analysis. The first-order hyperpolarizability (β0) is calculated to assess its potential for non-linear optics. The gauge-including atomic orbital (GIAO) method was used to calculate 1H NMR chemical shifts, which were compared with experimental data. Electronic properties such as the UV-Visible spectrum and HOMO-LUMO energies are reported. The energy gap shows that charge transfer occurs within the molecule. Thermodynamic parameters of the title compound were calculated at various temperatures.

  11. Near-infrared hyperspectral imaging for quality analysis of agricultural and food products

    NASA Astrophysics Data System (ADS)

    Singh, C. B.; Jayas, D. S.; Paliwal, J.; White, N. D. G.

    2010-04-01

    Agricultural and food processing industries are always looking to implement real-time quality monitoring techniques as part of good manufacturing practices (GMPs) to ensure the high quality and safety of their products. Near-infrared (NIR) hyperspectral imaging is gaining popularity as a powerful non-destructive tool for quality analysis of several agricultural and food products. This technique has the ability to analyse spectral data in a spatially resolved manner (i.e., each pixel in the image has its own spectrum) by applying both conventional image processing and the chemometric tools used in spectral analyses. The hyperspectral imaging technique has demonstrated potential in detecting defects and contaminants in meats, fruits, cereals, and processed food products. This paper discusses the methodology of hyperspectral imaging in terms of hardware, software, calibration, data acquisition and compression, and development of prediction and classification algorithms, and it presents a thorough review of the current applications of hyperspectral imaging in the analyses of agricultural and food products.

  12. Interpreting forest and grassland biome productivity utilizing nested scales of image resolution and biogeographical analysis

    NASA Technical Reports Server (NTRS)

    Iverson, L. R.; Olson, J. S.; Risser, P. G.; Treworgy, C.; Frank, T.; Cook, E.; Ke, Y.

    1986-01-01

    Data acquisition, initial site characterization, the image and geographic information methods available, and brief first-year evaluations for NASA's Thematic Mapper (TM) working group are presented. The TM and other spectral data are examined in order to relate local, intensive ecosystem research findings to estimates of carbon cycling rates over wide geographic regions. The effort is to span environments ranging from dry to moist climates and from good to poor site quality using the TM capability, with and without the inclusion of geographic information system (GIS) data, and thus to interpret the local spatial pattern of factors conditioning biomass or productivity. Twenty-eight TM data sets were acquired, archived, and evaluated. The ERDAS image processing and GIS system was installed on the microcomputer (PC-AT) and its capabilities are being investigated. The TM coverage of seven study areas was exported via ELAS software on the Prime to the ERDAS system. Statistical analysis procedures to be used on the spectral data are being identified.

  13. Bioimpedance Harmonic Analysis as a Diagnostic Tool to Assess Regional Circulation and Neural Activity

    NASA Astrophysics Data System (ADS)

    Mudraya, I. S.; Revenko, S. V.; Khodyreva, L. A.; Markosyan, T. G.; Dudareva, A. A.; Ibragimov, A. R.; Romich, V. V.; Kirpatovsky, V. I.

    2013-04-01

    The novel technique based on harmonic analysis of bioimpedance microvariations, using an original hardware and software complex incorporating a high-resolution impedance converter, was used to assess neural activity and circulation in the human urinary bladder and penis in patients with pelvic pain, erectile dysfunction, and overactive bladder. The therapeutic effects of shock wave therapy and Botulinum toxin detrusor injections were evaluated quantitatively according to the spectral peaks at the low 0.1 Hz frequency (M, for Mayer wave) and at the respiratory (R) and cardiac (C) rhythms with their harmonics. Enhanced baseline regional neural activity, identified from the M and R peaks, was found to be presumably sympathetic in pelvic pain patients and parasympathetic in patients with overactive bladder. Total pulsatile activity and pulsatile resonances found in the bladder as well as in the penile spectrum characterised regional circulation and vascular tone. The abnormal spectral parameters characteristic of the patients with genitourinary diseases shifted toward the norm in cases of efficient therapy. Bioimpedance harmonic analysis thus appears to be a potent tool for assessing regional peculiarities of circulatory and autonomic nervous activity in the course of patient treatment.

  14. Processing Satellite Imagery To Detect Waste Tire Piles

    NASA Technical Reports Server (NTRS)

    Skiles, Joseph; Schmidt, Cynthia; Wuinlan, Becky; Huybrechts, Catherine

    2007-01-01

    A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method's development. The methodology combines commercially available image-processing and georeferencing software to develop a model that specifically distinguishes between tire piles and other objects. The methodology reduces the time that must be spent to initially survey a region for tire sites, thereby increasing the time inspectors and managers have available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in spectral characteristics associated with terrain and climate. The task of identifying tire piles in satellite imagery is uniquely challenging because of their low reflectance levels: tires tend to be spectrally confused with shadows and deep water, both of which reflect little light to satellite-borne imaging systems. In this methodology, the challenge is met, in part, by use of software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model included incorporation of lessons learned in previous research on the detection and mapping of tire piles by use of manual/visual and/or computational analysis of aerial and satellite imagery. The TIRe model is a computational model for identifying tire piles and discriminating between tire piles and other objects. The input to the TIRe model is the georeferenced but otherwise raw satellite spectral imagery of a geographic region to be surveyed. The TIRe model identifies the darkest objects in the images and, on the basis of spatial and spectral image characteristics, discriminates against other dark objects, which can include vegetation, some bodies of water, and dark soils. The TIRe model can identify piles of as few as 100 tires. The output of the TIRe model is a binary mask showing areas containing suspected tire piles and spectrally similar features. This mask is overlaid on the original satellite imagery and examined by a trained image analyst, who strives to further discriminate against non-tire objects that the TIRe model tentatively identified as tire piles. After the analyst has made adjustments, the mask is used to create a synoptic, geographically accurate tire-pile survey map, which can be overlaid with a road map and/or any other map or set of georeferenced data, according to a customer's preferences.

  15. Calibration of the ROSAT HRI Spectral Response

    NASA Technical Reports Server (NTRS)

    Prestwich, Andrea H.; Silverman, John; McDowell, Jonathan; Callanan, Paul; Snowden, Steve

    2000-01-01

    The ROSAT High Resolution Imager has a limited (2-band) spectral response. This spectral capability can give X-ray hardness ratios on spatial scales of 5 arcseconds. The spectral response of the center of the detector was calibrated before the launch of ROSAT, but the gain decreases with time and also is a function of position on the detector. To complicate matters further, the satellite is 'wobbled', possibly moving a source across several spatial gain states. These difficulties have prevented the spectral response of the ROSAT High Resolution Imager (HRI) from being used for scientific measurements. We have used Bright Earth data and in-flight calibration sources to map the spatial and temporal gain changes, and written software which will allow ROSAT users to generate a calibrated XSPEC (an X-ray spectral fitting package) response matrix and hence determine a calibrated hardness ratio. In this report, we describe the calibration procedure and show how to obtain a response matrix. In Section 2 we give an overview of the calibration procedure, and in Section 3 we give a summary of HRI spatial and temporal gain variations. Section 4 describes the routines used to determine the gain distribution of a source. In Sections 5 and 6, we describe in detail how the Bright Earth database and calibration sources are used to derive a corrected response matrix for a given observation. Finally, Section 7 describes how to use the software.
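    For orientation, the hardness ratio that the calibrated response matrix ultimately feeds is just a normalized difference of band counts; the sketch below uses made-up soft/hard counts and simple Poisson error propagation, and is not part of the authors' software.

```python
import math

def hardness_ratio(soft, hard):
    """HR = (H - S) / (H + S) with sqrt(N) counting errors propagated (illustrative only)."""
    total = soft + hard
    hr = (hard - soft) / total
    # Propagate Poisson uncertainties sigma_H = sqrt(H), sigma_S = sqrt(S) through the ratio.
    err = 2.0 * math.sqrt(hard * soft**2 + soft * hard**2) / total**2
    return hr, err

print(hardness_ratio(soft=120.0, hard=310.0))
```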

  16. High resolution ultrasonic spectroscopy system for nondestructive evaluation

    NASA Technical Reports Server (NTRS)

    Chen, C. H.

    1991-01-01

    With increased demand for high-resolution ultrasonic evaluation, computer-based systems or workstations become essential. The ultrasonic spectroscopy method of nondestructive evaluation (NDE) was used to develop a high-resolution ultrasonic inspection system supported by modern signal processing, pattern recognition, and neural network technologies. The basic system, which was completed, consists of a 386/20 MHz PC (IBM AT compatible), a pulser/receiver, a digital oscilloscope with serial and parallel communications to the computer, an immersion tank with motor control of X-Y axis movement, and the supporting software package, IUNDE, for interactive ultrasonic evaluation. Although the hardware components are commercially available, the software development is entirely original. By integrating signal processing, pattern recognition, maximum entropy spectral analysis, and artificial neural network functions into the system, many NDE tasks can be performed. The high-resolution graphics capability provides visualization of complex NDE problems. The phase 3 efforts involve intensive marketing of the software package and collaborative work with industrial sectors.

  17. Improvement of short cut numerical method for determination of periods of free oscillations for basins with irregular geometry and bathymetry

    NASA Astrophysics Data System (ADS)

    Chernov, Anton; Kurkin, Andrey; Pelinovsky, Efim; Yalciner, Ahmet; Zaytsev, Andrey

    2010-05-01

    A short cut numerical method for evaluation of the modes of free oscillations of basins with irregular geometry and bathymetry was presented by Yalciner and Pelinovsky (2007). In the method, a single wave is input to the basin as an initial impulse. The resulting agitation in the basin is computed with a numerical method solving the nonlinear form of the long-wave equations. The time histories of water-surface fluctuations at different locations due to propagation of the waves generated by the initial impulse are stored and analyzed by the fast Fourier transform (FFT) technique, and energy spectrum curves for each location are obtained. The frequencies of each mode of free oscillations are determined from the peaks of the spectrum curves. Several features have been added to this method and are discussed here: 1. Instead of a small number of gauges installed manually in the studied area, the information from the numerical simulation is now recorded on a regular net of "simulation" gauges placed everywhere on the sea surface at depths below the "coast" level, with a fixed preset distance between gauges. The spectral analysis of the wave records is performed with the Welch periodogram method instead of a simple FFT, so it is possible to obtain a spectral power estimate of the wave process and to determine confidence intervals for the spectral peaks. 2. After the power spectral estimation, the common peak of the studied seiche can be found; the mean spectral amplitudes at this peak are then calculated numerically by Simpson integration for all gauges in the basin, and a map of the spatial distribution of mean spectral amplitudes can be plotted. The spatial distribution helps to study the structure of the seiche and to determine affected dangerous areas. 3. A nested-grid module was developed in NAMI-DANCE, a software package for solving the nonlinear shallow-water equations. This is a very important feature for studying phenomena at different scales (ocean - sea - bay - harbor). The newly developed software was tested for the Mediterranean, Sea of Okhotsk and South China Sea regions. This software can be useful in local tsunami mapping and in studies of tsunami propagation in the coastal zone. References: Yalciner A.C., Pelinovsky E. A short cut numerical method for determination of periods of free oscillations for basins with irregular geometry and bathymetry // Ocean Engineering. V. 34. 2007. P. 747-757.
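    A minimal sketch of the spectral step described above, using SciPy's Welch estimator on a synthetic gauge record and Simpson integration over a peak band; the sampling interval, segment length, and band limits are placeholder values, not those used with NAMI-DANCE.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import simpson

# Synthetic water-level record at one "simulation" gauge: a 12-minute seiche plus noise.
dt = 5.0                                  # sampling interval, s (assumed)
t = np.arange(0, 6 * 3600, dt)
eta = 0.3 * np.sin(2 * np.pi * t / 720.0) + 0.05 * np.random.randn(t.size)

# Welch periodogram: averaging over segments gives a spectral power estimate
# with confidence bounds, instead of a single raw FFT.
f, pxx = welch(eta, fs=1.0 / dt, nperseg=2048)

# Mean spectral amplitude in a band around the common seiche peak (Simpson integration).
band = (f > 1.0 / 900.0) & (f < 1.0 / 600.0)
peak_energy = simpson(pxx[band], x=f[band])
print(f"peak frequency: {f[np.argmax(pxx)]:.5f} Hz, band energy: {peak_energy:.4e}")
```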

  18. Diagnosis of glaucoma and detection of glaucoma progression using spectral domain optical coherence tomography.

    PubMed

    Grewal, Dilraj S; Tanna, Angelo P

    2013-03-01

    With the rapid adoption of spectral domain optical coherence tomography (SDOCT) in clinical practice and the recent advances in software technology, there is a need for a review of the literature on glaucoma detection and progression analysis algorithms designed for the commercially available instruments. Peripapillary retinal nerve fiber layer (RNFL) thickness and macular thickness, including segmental macular thickness calculation algorithms, have been demonstrated to be repeatable and reproducible, and have a high degree of diagnostic sensitivity and specificity in discriminating between healthy and glaucomatous eyes across the glaucoma continuum. Newer software capabilities such as glaucoma progression detection algorithms provide an objective analysis of longitudinally obtained structural data that enhances our ability to detect glaucomatous progression. RNFL measurements obtained with SDOCT appear more sensitive than time domain OCT (TDOCT) for glaucoma progression detection; however, agreement with the assessments of visual field progression is poor. Over the last few years, several studies have been performed to assess the diagnostic performance of SDOCT structural imaging and its validity in assessing glaucoma progression. Most evidence suggests that SDOCT performs similarly to TDOCT for glaucoma diagnosis; however, SDOCT may be superior for the detection of early stage disease. With respect to progression detection, SDOCT represents an important technological advance because of its improved resolution and repeatability. Advancements in RNFL thickness quantification, segmental macular thickness calculation and progression detection algorithms, when used correctly, may help to improve our ability to diagnose and manage glaucoma.

  19. DigitSeis: A New Digitization Software and its Application to the Harvard-Adam Dziewoński Observatory Collection

    NASA Astrophysics Data System (ADS)

    Bogiatzis, P.; Altoé, I. L.; Karamitrou, A.; Ishii, M.; Ishii, H.

    2015-12-01

    DigitSeis is a new open-source, interactive digitization software written in MATLAB that converts digital raster images of analog seismograms into readily usable, discretized time series using image processing algorithms. DigitSeis automatically identifies and corrects for various geometrical distortions of seismogram images that are introduced by the original recording, storage, and scanning procedures. With human supervision, the software further identifies and classifies important features such as time marks and notes, corrects time-mark offsets from the main trace, and digitizes the combined trace with an analysis that yields timing as accurate as possible. Although a large effort has been made to minimize the required human input, DigitSeis provides interactive tools for challenging situations such as trace crossings and stains on the paper. The effectiveness of the software is demonstrated with the digitization of seismograms that are over half a century old from the Harvard-Adam Dziewoński observatory, which is still in operation as part of the Global Seismographic Network (station code HRV and network code IU). The spectral analysis of the digitized time series shows no spurious features that might be related to the occurrence of minute and hour marks. The digitized records also display signals associated with significant earthquakes, and a comparison of their spectrograms with modern recordings reveals similarities in the background noise.
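    DigitSeis itself is written in MATLAB and handles the geometric corrections, time marks, and trace crossings described above; purely to illustrate the basic digitization idea, the sketch below (in Python, with assumed image conventions) extracts, for each image column, the intensity-weighted centre of the dark trace.

```python
import numpy as np

def digitize_trace(img, dark_thresh=0.5):
    """Convert a grayscale scan (rows x cols, 0 = black ink, 1 = white paper) of a
    single trace into an amplitude-versus-sample series (toy raster digitization)."""
    rows = np.arange(img.shape[0])
    darkness = np.clip(dark_thresh - img, 0.0, None)   # ink weight per pixel
    weight = darkness.sum(axis=0)
    # Intensity-weighted centroid of the ink in each column; NaN where no ink is found.
    centre = np.where(weight > 0,
                      (darkness * rows[:, None]).sum(axis=0) / np.maximum(weight, 1e-12),
                      np.nan)
    return centre - np.nanmean(centre)                  # amplitude relative to the trace baseline
```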

  20. Demonstrating Sound with Music Production Software

    ERIC Educational Resources Information Center

    Keeports, David

    2010-01-01

    Readily available software designed for the production of music can be adapted easily to the physics classroom. Programs such as Apple's GarageBand access large libraries of recorded sound waves that can be heard and displayed both before and after alterations. Tools such as real-time spectral analysers, digital effects, and audio file editors…

  1. Illumination analysis of LAPAN's IR micro bolometer

    NASA Astrophysics Data System (ADS)

    Bustanul, A.; Irwan, P.; Andi M., T.

    2016-10-01

    For the past two years we have been conducting research on an IR micro bolometer, which LAPAN intends to fly as one of the payloads on its next micro satellite project, either LAPAN A4 or LAPAN A5. Because of limited experience with the subject, the work was initiated with a spectral radiance analysis tuned to catastrophe sources in Indonesia, mainly wildfires (forest fires) and active volcanoes. Based on the resulting appropriate spectral radiance wavelength, 3.8-4 μm, and the field of view (FOV), we then proceeded to the next stage, optical analysis. Focusing on illumination, the analysis was carried out using the Zemax software. Optical path interference and stray light were the two main concerns throughout the work; they also serve to evaluate the performance optimization of the illumination analysis of our optical design. The resulting graphs show that the design performance is close to diffraction-limited and that the geometrical image blur produced by LAPAN's IR micro bolometer lenses lies within the pixel area. Therefore, the optical design performance is relatively good and will produce images of high quality. In this paper, the illumination analysis and process of LAPAN's Infra Red (IR) Micro Bolometer is presented.

  2. Hydrogen bonded 2-methyl-1H-imidazol-3-ium 3,5-dinitrobenzoate 3,5-dinitrobenzoic acid, a new optical crystal: Evaluation of properties by structural, spectral, quantum chemical calculations, Z-scan and Hirshfeld studies

    NASA Astrophysics Data System (ADS)

    Sathya, K.; Dhamodharan, P.; Dhandapani, M.

    2018-03-01

    A new hydrogen-bonded proton transfer complex, 2-methyl imidazolium 3,5-dinitrobenzoate 3,5-dinitrobenzoic acid (MIDB), was synthesized by the reaction of 2-methyl imidazole with 3,5-dinitrobenzoic acid (1:2) in methanol solvent at room temperature. The crystals were subjected to FT-IR spectral analysis to confirm the functional groups of the new compound. Single-crystal XRD analysis reveals that MIDB belongs to the monoclinic system with the P21/c space group. The asymmetric unit consists of one 2-methyl imidazolium cation, one 3,5-dinitrobenzoate anion and one uncharged 3,5-dinitrobenzoic acid moiety. Experimental NMR spectroscopic data and theoretically calculated NMR data correlate very well, establishing the exact carbon skeleton and hydrogen environment in the molecular structure of MIDB. The thermal stability of the compound was investigated by thermogravimetry and differential thermal analysis (TG-DTA). Computational studies such as optimization of the molecular geometry, natural bond orbital (NBO) analysis, Mulliken population analysis and HOMO-LUMO analysis were performed using the Gaussian 09 software with the B3LYP method at the 6-31G basis set level. The calculated first-order hyperpolarizability (β) of MIDB from the computational studies is 4.1752 × 10^-30 esu, which is 32 times greater than that of urea. UV-Vis-NIR spectral studies revealed that MIDB has a large optical transparency window. The optical nonlinearities of MIDB have been investigated by the Z-scan technique with He-Ne laser radiation at a wavelength of 632.8 nm. Hirshfeld analysis indicates that O⋯H/H⋯O interactions are the dominant interactions, confirming an extensive hydrogen-bond network in the molecular structure.

  3. An overview of the CILBO spectral observation program

    NASA Astrophysics Data System (ADS)

    Rudawska, R.; Zender, J.; Koschny, D.

    2016-01-01

    Video equipment can easily be fitted with a spectral grating to obtain spectral information from meteors; therefore, in recent years spectroscopic observations of meteors have become quite popular. The Meteor Research Group (MRG) of the European Space Agency has also been working on upgrading the analysis of meteor spectra, operating an image-intensified camera with an objective grating (ICC8). ICC8 is located at the Tenerife station of the double-station camera setup CILBO (Canary Island Long-Baseline Observatory). The pipeline software processes the data with the standard calibration procedure (dark current, flat field, and lens distortion corrections). Using the position of a meteor recorded by the ICC7 camera (zero order), the position of the first-order spectrum as a function of wavelength is computed. Moreover, thanks to the double-station meteor observations carried out by ICC7 (Tenerife) and ICC9 (La Palma), the trajectory of a meteor and its orbit are determined; merged with the simultaneous measurement of the meteor spectrum from ICC8, this allows us to identify the source of the meteoroid. Here, we report preliminary results from a sample of meteor spectra collected by the CILBO ICC8 camera since 2012.

  4. [Integration design and diffraction characteristics analysis of prism-grating-prism].

    PubMed

    He, Tian-Bo; Bayanheshig; Li, Wen-Hao; Kong, Peng; Tang, Yu-Guo

    2014-01-01

    The prism-grating-prism (PGP) module is an important dispersing component in hyperspectral imagers. In order to effectively predict the distribution of diffraction efficiency of the whole PGP component and its diffraction characteristics before fabrication, a method for integrated PGP design is proposed. From the point of view of volume phase holographic grating (VPHG) design, and taking into account the restrictive correlations between the various parameters of the prisms and the grating, we developed analysis software for calculating the diffraction efficiency of the whole PGP. Furthermore, the effects of the structural parameters of the prisms and grating on the PGP's diffraction characteristics were studied in detail. In particular, we discussed the Bragg wavelength shift behaviour of the grating, and a broadband PGP spectral component with high diffraction efficiency was designed for imaging spectrometers. The simulation results indicate that the spectral bandwidth of the PGP becomes narrower as the dispersion coefficient of the prism 1 material decreases; Bragg wavelength shift broadens the bandwidth of the VPHG both spectrally and angularly; higher angular selectivity is desirable when selecting the prism 1 material, and the component can easily be tuned to achieve a spectral bandwidth suitable for an imaging PGP spectrograph; the vertex angle of prism 1 and the film thickness and relative permittivity modulation of the grating have a significant impact on the distribution of the PGP's diffraction efficiency, so precise control is necessary during fabrication. The diffraction efficiency of the whole PGP component designed by this method is no less than 50% in the wavelength range from 400 to 1000 nm; the specific design parameters given in this paper have reference value for PGP fabrication.

  5. An evaluation of remote sensing technologies for the detection of fugitive contamination at selected Superfund hazardous waste sites in Pennsylvania

    USGS Publications Warehouse

    Slonecker, E. Terrence; Fisher, Gary B.

    2014-01-01

    This evaluation was conducted to assess the potential for using both traditional remote sensing, such as aerial imagery, and emerging remote sensing technology, such as hyperspectral imaging, as tools for postclosure monitoring of selected hazardous waste sites. Sixteen deleted Superfund (SF) National Priorities List (NPL) sites in Pennsylvania were imaged with a Civil Air Patrol (CAP) Airborne Real-Time Cueing Hyperspectral Enhanced Reconnaissance (ARCHER) sensor between 2009 and 2012. Deleted sites are those sites that have been remediated and removed from the NPL. The imagery was processed to radiance and atmospherically corrected to relative reflectance with standard routines in the Environment for Visualizing Imagery (ENVI; ITT-VIS, Boulder, Colorado) software. Standard routines for anomaly detection, endmember collection, vegetation stress, and spectral analysis were applied.

  6. The Seismic Tool-Kit (STK): An Open Source Software For Learning the Basis of Signal Processing and Seismology.

    NASA Astrophysics Data System (ADS)

    Reymond, D.

    2016-12-01

    We present an open-source software project (GNU public license), named STK (Seismic Tool-Kit), dedicated mainly to learning signal processing and seismology. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is proposed). The passage to the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the power spectral density (PSD) on a linear or log scale, and also the evolutive time-frequency representation (or sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Secondly, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. STK is developed in C/C++, mainly under Linux, and has also been partially implemented under MS-Windows. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and it has been used as a practical support for teaching the basics of signal processing. Useful links: http://sourceforge.net/projects/seismic-toolkit/ and http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
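    As an illustration of the kind of polarization analysis mentioned above (not STK's actual C/C++ implementation), the covariance matrix of a three-component window can be eigendecomposed to obtain rectilinearity and the dominant particle-motion direction; the sketch below assumes equally sampled components of equal length.

```python
import numpy as np

def polarization(z, n, e):
    """Window-based polarization attributes from 3-component data (illustrative sketch).

    Returns rectilinearity (0..1) and the eigenvector of the largest eigenvalue,
    i.e. the dominant particle-motion direction.
    """
    X = np.vstack([np.asarray(c, dtype=float) - np.mean(c) for c in (z, n, e)])
    cov = X @ X.T / X.shape[1]
    w, v = np.linalg.eigh(cov)                 # eigenvalues in ascending order
    l1, l2, l3 = w[2], w[1], w[0]
    rectilinearity = 1.0 - (l2 + l3) / (2.0 * l1)
    return rectilinearity, v[:, 2]
```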

  7. Direct glycan structure determination of intact N-linked glycopeptides by low-energy collision-induced dissociation tandem mass spectrometry and predicted spectral library searching.

    PubMed

    Pai, Pei-Jing; Hu, Yingwei; Lam, Henry

    2016-08-31

    Intact glycopeptide MS analysis to reveal site-specific protein glycosylation is an important frontier of proteomics. However, computational tools for analyzing MS/MS spectra of intact glycopeptides are still limited and not well-integrated into existing workflows. In this work, a new computational tool which combines the spectral library building/searching tool, SpectraST (Lam et al. Nat. Methods 2008, 5, 873-875), and the glycopeptide fragmentation prediction tool, MassAnalyzer (Zhang et al. Anal. Chem. 2010, 82, 10194-10202), for intact glycopeptide analysis has been developed. Specifically, this tool enables the determination of the glycan structure directly from low-energy collision-induced dissociation (CID) spectra of intact glycopeptides. Given a list of possible glycopeptide sequences as input, a sample-specific spectral library of MassAnalyzer-predicted spectra is built using SpectraST. Glycan identification from CID spectra is achieved by spectral library searching against this library, in which both m/z and intensity information of the possible fragmentation ions are taken into consideration for improved accuracy. We validated our method using a standard glycoprotein, human transferrin, and evaluated its potential to be used in site-specific glycosylation profiling of glycoprotein datasets from LC-MS/MS. In addition, we further applied our method to reveal, for the first time, the site-specific N-glycosylation profile of recombinant human acetylcholinesterase expressed in HEK293 cells. For maximum usability, SpectraST is developed as part of the Trans-Proteomic Pipeline (TPP), a freely available and open-source software suite for MS data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
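    The scoring inside SpectraST is more elaborate, but the essence of spectral library searching — comparing both m/z and intensity of a query CID spectrum against a predicted library spectrum — reduces to a normalized dot product on binned, square-root-scaled intensities; the bin width and peak lists below are illustrative assumptions.

```python
import numpy as np

def bin_spectrum(mz, intensity, mz_max=2000.0, bin_width=1.0):
    """Bin a peak list onto a fixed m/z grid (square-root intensity scaling is common)."""
    vec = np.zeros(int(mz_max / bin_width) + 1)
    idx = np.minimum((np.asarray(mz) / bin_width).astype(int), vec.size - 1)
    np.add.at(vec, idx, np.sqrt(intensity))
    return vec

def dot_score(query, library):
    """Normalized dot product between two binned spectra: 0 (no match) to 1 (identical)."""
    q, l = bin_spectrum(*query), bin_spectrum(*library)
    denom = np.linalg.norm(q) * np.linalg.norm(l)
    return float(q @ l / denom) if denom else 0.0

# Toy example: a query CID spectrum scored against one predicted library entry.
query = ([204.09, 366.14, 528.19], [1200.0, 800.0, 350.0])
library = ([204.09, 366.14, 690.24], [1000.0, 900.0, 400.0])
print(dot_score(query, library))
```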

  8. Focus: a robust workflow for one-dimensional NMR spectral analysis.

    PubMed

    Alonso, Arnald; Rodríguez, Miguel A; Vinaixa, Maria; Tortosa, Raül; Correig, Xavier; Julià, Antonio; Marsal, Sara

    2014-01-21

    One-dimensional (1)H NMR represents one of the most commonly used analytical techniques in metabolomic studies. The increase in the number of samples analyzed as well as the technical improvements in instrumentation and spectral acquisition demand increasingly accurate and efficient high-throughput data processing workflows. We present FOCUS, an integrated and innovative methodology that provides a complete data analysis workflow for one-dimensional NMR-based metabolomics. This tool allows users to easily obtain an NMR peak feature matrix ready for chemometric analysis as well as metabolite identification scores for each peak, which greatly simplify the biological interpretation of the results. The algorithm development has focused on solving the critical difficulties that appear at each data processing step and that can dramatically affect the quality of the results. Along with method integration, simplicity has been one of the main objectives in FOCUS development, requiring very little user input to perform accurate peak alignment, peak picking, and metabolite identification. The new spectral alignment algorithm, RUNAS, allows peak alignment with no need for a reference spectrum and therefore reduces the bias introduced by other alignment approaches. Spectral alignment has been tested against previous methodologies, obtaining substantial improvements in the case of moderately or highly unaligned spectra. Metabolite identification has also been significantly improved by matching positional and correlation peak patterns against a reference metabolite panel. Furthermore, the complete workflow has been tested using NMR data sets from 60 human urine samples and 120 aqueous liver extracts, reaching a successful identification of 42 metabolites from the two data sets. The open-source software implementation of this methodology is available at http://www.urr.cat/FOCUS.
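    FOCUS implements its own alignment (RUNAS) and identification algorithms, which are not reproduced here; purely as a reminder of what the peak-picking step consumes and produces, the sketch below picks candidate peaks from a 1D spectrum with SciPy using placeholder prominence and width thresholds.

```python
import numpy as np
from scipy.signal import find_peaks

def pick_peaks(ppm, spectrum, min_prominence=0.02, min_width=3):
    """Return chemical shifts and heights of candidate peaks in a 1D NMR spectrum.

    min_prominence is a fraction of the strongest signal; both thresholds are assumed values.
    """
    ppm = np.asarray(ppm, dtype=float)
    spec = np.asarray(spectrum, dtype=float)
    idx, _ = find_peaks(spec, prominence=min_prominence * spec.max(), width=min_width)
    return ppm[idx], spec[idx]
```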

  9. Analysis of sulfates on low molecular weight heparin using mass spectrometry: structural characterization of enoxaparin.

    PubMed

    Gupta, Rohitesh; Ponnusamy, Moorthy P

    2018-05-31

    Structural characterization of low molecular weight heparin (LMWH) is critical to meet biosimilarity standards. In this context, the review focuses on structural analysis of labile sulfates attached to the side-groups of LMWH using mass spectrometry. A comprehensive review of this topic will help readers to identify key strategies for tackling the problem related to sulfate loss. At the same time, various mass spectrometry techniques are presented to facilitate compositional analysis of LMWH, mainly enoxaparin. Areas covered: This review summarizes findings on mass spectrometry application for LMWH, including modulation of sulfates, using enzymology and sample preparation approaches. Furthermore, popular open-source software packages for automated spectral data interpretation are also discussed. Successful use of LC/MS can decipher structural composition for LMWH and help evaluate their sameness or biosimilarity with the innovator molecule. Overall, the literature has been searched using PubMed by typing various search queries such as 'enoxaparin', 'mass spectrometry', 'low molecular weight heparin', 'structural characterization', etc. Expert commentary: This section highlights clinically relevant areas that need improvement to achieve satisfactory commercialization of LMWHs. It also primarily emphasizes the advancements in instrumentation related to mass spectrometry, and discusses building automated software for data interpretation and analysis.

  10. ORBS, ORCS, OACS, a Software Suite for Data Reduction and Analysis of the Hyperspectral Imagers SITELLE and SpIOMM

    NASA Astrophysics Data System (ADS)

    Martin, T.; Drissen, L.; Joncas, G.

    2015-09-01

    SITELLE (installed in 2015 at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont-Mégantic) are the first Imaging Fourier Transform Spectrometers (IFTS) capable of obtaining a hyperspectral data cube which samples a 12 arcminute field of view into four million visible spectra. The result of each observation is made up of two interferometric data cubes which need to be merged, corrected, transformed and calibrated in order to get a spectral cube of the observed region ready to be analysed. ORBS is a fully automatic data reduction software that has been entirely designed for this purpose. The data size (up to 68 Gb for the larger science cases) and the computational needs have been challenging, and the highly parallelized, object-oriented architecture of ORBS reflects the solutions adopted, which have made it possible to process 68 Gb of raw data in less than 11 hours using 8 cores and 22.6 Gb of RAM. It is based on a core framework (ORB) that has been designed to support the whole software suite for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS). They all aim to provide a strong basis for the creation and development of specialized analysis modules that could benefit the scientific community working with SITELLE and SpIOMM.
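    ORBS implements the full instrument model; the essential transform at its heart — turning each pixel's interferogram into a spectrum — is a Fourier transform of the mean-subtracted, apodized interferogram, as in the toy sketch below with an assumed optical-path-difference step (real IFTS reduction also needs phase, wavelength, and flux calibration).

```python
import numpy as np

def interferogram_to_spectrum(interferogram, step_cm=1.0e-4, apodize=True):
    """Toy FTS inversion: interferogram sampled at a constant OPD step (cm) -> spectrum.

    Returns wavenumbers (cm^-1) and spectral amplitudes.
    """
    y = np.asarray(interferogram, dtype=float)
    y = y - y.mean()                                 # remove the DC (continuum) term
    if apodize:
        y = y * np.hanning(y.size)                   # taper to limit ringing
    spectrum = np.abs(np.fft.rfft(y))
    wavenumber = np.fft.rfftfreq(y.size, d=step_cm)  # cycles per cm
    return wavenumber, spectrum
```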

  11. Automated Big Data Analysis in Bottom-up and Targeted Proteomics

    PubMed Central

    van der Plas-Duivesteijn, Suzanne; Domański, Dominik; Smith, Derek; Borchers, Christoph; Palmblad, Magnus; Mohamme, Yassene

    2014-01-01

    Similar to other data-intensive sciences, analyzing mass spectrometry-based proteomics data involves multiple steps and diverse software using different algorithms and data formats and sizes. Besides the distributed and evolving nature of the data in online repositories, another challenge is that scientists have to deal with the many steps of analysis pipelines. Documented data processing is also becoming an essential part of the overall reproducibility of the results. Thanks to different e-Science initiatives, scientific workflow engines have become a means for automated, sharable and reproducible data processing. While these are designed as general tools, they can be employed to solve different challenges that we face in handling our Big Data. Here we present three use cases: improving the performance of different spectral search engines by decomposing input data and recomposing the resulting files, building spectral libraries from more than 20 million spectra, and integrating information from multiple resources to select the most appropriate peptides for targeted proteomics analyses. The three use cases demonstrate different challenges in proteomics data analysis. In the first, we integrate local and cloud processing resources in order to obtain better performance, resulting in more than a 30-fold speed improvement. By treating the search engines as legacy software, our solution is applicable to multiple search algorithms. The second use case is an example of automated processing of many data files of different sizes and locations, starting with raw data and ending with the final, ready-to-use library. This demonstrates the robustness and fault tolerance required when dealing with huge amounts of data stored in multiple files. The third use case demonstrates retrieval and integration of information and data from multiple online repositories. In addition to the diversity of data formats and Web interfaces, this use case also illustrates how to deal with incomplete data.

  12. Spectral CT with metal artifacts reduction software for improvement of tumor visibility in the vicinity of gold fiducial markers.

    PubMed

    Brook, Olga R; Gourtsoyianni, Sofia; Brook, Alexander; Mahadevan, Anand; Wilcox, Carol; Raptopoulos, Vassilios

    2012-06-01

    To evaluate spectral computed tomography (CT) with metal artifacts reduction software (MARS) for reduction of metal artifacts associated with gold fiducial seeds. Thirteen consecutive patients with 37 fiducial seeds implanted for radiation therapy of abdominal lesions were included in this HIPAA-compliant, institutional review board-approved prospective study. Six patients were women (46%) and seven were men (54%). The mean age was 61.1 years (median, 58 years; range, 29-78 years). Spectral imaging was used for arterial phase CT. Images were reconstructed with and without MARS in axial, coronal, and sagittal planes. Two radiologists independently reviewed reconstructions and selected the best image, graded the visibility of the tumor, and assessed the amount of artifacts in all planes. A linear-weighted κ statistic and Wilcoxon signed-rank test were used to assess interobserver variability. Histogram analysis with the Kolmogorov-Smirnov test was used for objective evaluation of artifacts reduction. Fiducial seeds were placed in pancreas (n = 5), liver (n = 7), periportal lymph nodes (n = 1), and gallbladder bed (n = 1). MARS-reconstructed images received a better grade than those with standard reconstruction in 60% and 65% of patients by the first and second radiologist, respectively. Tumor visibility was graded higher with standard versus MARS reconstruction (grade, 3.7 ± 1.0 vs 2.8 ± 1.1; P = .001). Reduction of blooming was noted on MARS-reconstructed images (P = .01). Amount of artifacts, for both any and near field, was significantly smaller on sagittal and coronal MARS-reconstructed images than on standard reconstructions (P < .001 for all comparisons). Far-field artifacts were more prominent on axial MARS-reconstructed images than on standard reconstructions (P < .01). Linear-weighted κ statistic showed moderate to perfect agreement between radiologists. CT number distribution was narrower with MARS than with standard reconstruction in 35 of 37 patients (P < .001). Spectral CT with use of MARS improved tumor visibility in the vicinity of gold fiducial seeds.
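    The histogram comparison reported above is a standard two-sample test; the sketch below, with made-up CT-number samples, shows the kind of objective Kolmogorov-Smirnov check involved (a narrower MARS distribution against a wider standard-reconstruction distribution).

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
# Hypothetical CT numbers (HU) sampled in a region of interest near a fiducial seed:
hu_standard = rng.normal(60, 180, 500)   # wide distribution: streaks and blooming
hu_mars = rng.normal(55, 70, 500)        # narrower distribution after artifact reduction

stat, p = ks_2samp(hu_standard, hu_mars)
print(f"KS statistic = {stat:.3f}, p = {p:.2e}")   # small p: the distributions differ
```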

  13. Topics In Chemical Instrumentation: Fourier Transformations for Chemists Part I. Introduction to the Fourier Transform.

    ERIC Educational Resources Information Center

    Glasser, L.

    1987-01-01

    This paper explores how Fourier Transform (FT) mimics spectral transformation, how this property can be exploited to advantage in spectroscopy, and how the FT can be used in data treatment. A table displays a number of important FT serial/spectral pairs related by Fourier Transformations. A bibliography and listing of computer software related to…
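    The classic serial/spectral pair — a damped cosine in time and a line of finite width in frequency — can be reproduced numerically in a few lines; the frequency and damping constant below are arbitrary illustrative values, not taken from the article.

```python
import numpy as np

# Free-induction-decay-like serial signal: a damped cosine sampled for 1 s.
n, dt = 4096, 1.0 / 4096
t = np.arange(n) * dt
serial = np.cos(2 * np.pi * 440.0 * t) * np.exp(-t / 0.1)

# Its Fourier transform is the spectral partner: a line at 440 Hz whose width
# is set by the damping constant.
spectrum = np.abs(np.fft.rfft(serial))
freqs = np.fft.rfftfreq(n, d=dt)
print(f"peak at {freqs[np.argmax(spectrum)]:.1f} Hz")
```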

  14. Airborne Multi-Spectral Minefield Survey

    DTIC Science & Technology

    2005-05-01

    Swedish Defence Research Agency), GEOSPACE (Austria), GTD (Ingenieria de Sistemas y Software Industrial, Spain), IMEC (Interuniversity MicroElectronic... Airborne Multi-Spectral Minefield Survey, Dirk-Jan de Lange, Eric den... actions is the severe lack of baseline information. To respond to this in a rapid way, cost-efficient data acquisition methods are a key issue.

  15. Observatory software for the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Vermeulen, Tom; Isani, Sidik; Withington, Kanoa; Ho, Kevin; Szeto, Kei; Murowinski, Rick

    2016-07-01

    The Canada-France-Hawaii Telescope is currently in the conceptual design phase to redevelop its facility into the new Maunakea Spectroscopic Explorer (MSE). MSE is designed to be the largest non-ELT optical/NIR astronomical telescope, and will be a fully dedicated facility for multi-object spectroscopy over a broad range of spectral resolutions. This paper outlines the software and control architecture envisioned for the new facility. The architecture will be designed around much of the existing software infrastructure currently used at CFHT as well as the latest proven open-source software. CFHT plans to minimize risk and development time by leveraging existing technology.

  16. Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1

    NASA Technical Reports Server (NTRS)

    Schlosser, E. H.

    1980-01-01

    The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.

  17. Reference software implementation for GIFTS ground data processing

    NASA Astrophysics Data System (ADS)

    Garcia, R. K.; Howell, H. B.; Knuteson, R. O.; Martin, G. D.; Olson, E. R.; Smuga-Otto, M. J.

    2006-08-01

    Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. An open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS), is presented. This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.

  18. Quantum chemical studies on structural, vibrational, nonlinear optical properties and chemical reactivity of indigo carmine dye

    NASA Astrophysics Data System (ADS)

    El-Mansy, M. A. M.

    2017-08-01

    Structural and vibrational spectroscopic studies were performed on indigo carmine (IC) isomers using FT-IR spectral analysis along with the DFT/B3LYP method, utilizing the Gaussian 09 software. The GaussView 5 program was employed to perform a detailed interpretation of the vibrational spectra. Simulation of the infrared spectra led to excellent overall agreement with the observed spectral patterns. Mulliken population analysis of atomic charges, MEP, HOMO-LUMO, NLO, the first-order hyperpolarizability and thermodynamic properties were examined by the DFT/B3LYP method at the SDD basis set level. Density of states (DOS) spectra were calculated using GaussSum 3 at the same level of theory. Molecular modeling showed that DOS spectra are so far the most significant tool for differentiating between the two IC isomers. Moreover, the IC cis-isomer shows extended applicability for manufacturing both NLO and photovoltaic devices such as solar cells.

  19. Calibration of the ROSAT HRI Spectral Response

    NASA Technical Reports Server (NTRS)

    Prestwich, Andrea

    1998-01-01

    The ROSAT High Resolution Imager has a limited (2-band) spectral response. This spectral capability can give X-ray hardness ratios on spatial scales of 5 arcseconds. The spectral response of the center of the detector was calibrated before the launch of ROSAT, but the gain decreases with time and also is a function of position on the detector. To complicate matters further, the satellite is "wobbled", possibly moving a source across several spatial gain states. These difficulties have prevented the spectral response of the ROSAT HRI from being used for scientific measurements. We have used Bright Earth data and in-flight calibration sources to map the spatial and temporal gain changes, and written software which will allow ROSAT users to generate a calibrated XSPEC response matrix and hence determine a calibrated hardness ratio. In this report, we describe the calibration procedure and show how to obtain a response matrix. In Section 2 we give an overview of the calibration procedure, and in Section 3 we give a summary of HRI spatial and temporal gain variations. Section 4 describes the routines used to determine the gain distribution of a source. In Sections 5 and 6, we describe in detail how the Bright Earth database and calibration sources are used to derive a corrected response matrix for a given observation. Finally, Section 7 describes how to use the software.

  20. A framelet-based iterative maximum-likelihood reconstruction algorithm for spectral CT

    NASA Astrophysics Data System (ADS)

    Wang, Yingmei; Wang, Ge; Mao, Shuwei; Cong, Wenxiang; Ji, Zhilong; Cai, Jian-Feng; Ye, Yangbo

    2016-11-01

    Standard computed tomography (CT) cannot reproduce spectral information of an object. Hardware solutions include dual-energy CT, which scans the object twice at different x-ray energy levels, and energy-discriminative detectors, which can separate lower and higher energy levels from a single x-ray scan. In this paper, we propose a software solution and give an iterative algorithm that reconstructs an image with spectral information from just one scan with a standard energy-integrating detector. The spectral information obtained can be used to produce color CT images, spectral curves of the attenuation coefficient μ(r,E) at points inside the object, and photoelectric images, which are all valuable imaging tools in cancer diagnosis. Our software solution requires no change to the hardware of a CT machine. With the Shepp-Logan phantom, we have found that although the photoelectric and Compton components were not perfectly reconstructed, their composite effect was very accurately reconstructed as compared to the ground truth and the dual-energy CT counterpart. This means that our proposed method has an intrinsic benefit in beam hardening correction and metal artifact reduction. The algorithm is based on a nonlinear polychromatic acquisition model for x-ray CT. The key technique is a sparse representation of iterations in a framelet system. Convergence of the algorithm is studied. This is believed to be the first application of framelet imaging tools to a nonlinear inverse problem.

  1. Convolutional neural networks for vibrational spectroscopic data analysis.

    PubMed

    Acquarelli, Jacopo; van Laarhoven, Twan; Gerretzen, Jan; Tran, Thanh N; Buydens, Lutgarde M C; Marchiori, Elena

    2017-02-15

    In this work we show that convolutional neural networks (CNNs) can be efficiently used to classify vibrational spectroscopic data and identify important spectral regions. CNNs are the current state-of-the-art in image classification and speech recognition and can learn interpretable representations of the data. These characteristics make CNNs a good candidate for reducing the need for preprocessing and for highlighting important spectral regions, both of which are crucial steps in the analysis of vibrational spectroscopic data. Chemometric analysis of vibrational spectroscopic data often relies on preprocessing methods involving baseline correction, scatter correction and noise removal, which are applied to the spectra prior to model building. Preprocessing is a critical step because, even in simple problems, using 'reasonable' preprocessing methods may decrease the performance of the final model. We develop a new CNN-based method and provide accompanying, publicly available software. It is based on a simple CNN architecture with a single convolutional layer (a so-called shallow CNN). Our method outperforms standard classification algorithms used in chemometrics (e.g. PLS) in terms of accuracy when applied to non-preprocessed test data (86% average accuracy compared to the 62% achieved by PLS), and it achieves better performance even on preprocessed test data (96% average accuracy compared to the 89% achieved by PLS). For interpretability purposes, our method includes a procedure for finding important spectral regions, thereby facilitating qualitative interpretation of results. Copyright © 2016 Elsevier B.V. All rights reserved.
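    The authors' exact architecture and hyperparameters are not reproduced here; the sketch below is a generic "shallow CNN" for 1D spectra in PyTorch — a single convolutional layer followed by a linear classifier — with assumed filter count, kernel size, and pooling.

```python
import torch
import torch.nn as nn

class ShallowSpectralCNN(nn.Module):
    """Single-convolutional-layer classifier for 1D spectra (illustrative hyperparameters)."""
    def __init__(self, n_classes, n_filters=16, kernel_size=21):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, n_filters, kernel_size, padding=kernel_size // 2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(32),           # coarse pooling over the spectral axis
        )
        self.fc = nn.Linear(n_filters * 32, n_classes)

    def forward(self, x):                       # x: (batch, n_spectral_points)
        h = self.conv(x.unsqueeze(1))           # add the single "channel" dimension
        return self.fc(h.flatten(1))

# Toy forward pass: a batch of 8 spectra with 1000 points, classified into 3 classes.
model = ShallowSpectralCNN(n_classes=3)
logits = model(torch.randn(8, 1000))
print(logits.shape)                             # torch.Size([8, 3])
```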

  2. Structural characterization and DFT study of a new optical crystal: 2-amino-3-methylpyridinium-3,5-dinitrobenzoate

    NASA Astrophysics Data System (ADS)

    Sathya, K.; Dhamodharan, P.; Dhandapani, M.

    2018-05-01

    A new proton transfer complex was synthesized by the reaction of 2-amino-3-methyl pyridine with 3,5-dinitrobenzoic acid in methanol solvent at room temperature. The chemical composition and stoichiometry of the synthesized complex, 2-amino-3-methylpyridinium 3,5-dinitrobenzoate (AMPDB), were verified by CHN analysis. The AMPDB crystals were subjected to FT-IR spectral analysis to confirm the functional groups in the compound. UV-Vis-NIR spectral studies revealed that AMPDB has a large optical transparency window. Single-crystal XRD analysis reveals that AMPDB belongs to the monoclinic system with the P21/c space group. NMR spectroscopic data indicate the exact carbon skeleton and hydrogen environment in the molecular structure of AMPDB. The thermal stability of the compound was investigated by thermogravimetry (TG). Computational studies such as optimisation of the molecular geometry, natural bond orbital (NBO) analysis, Mulliken population analysis and HOMO-LUMO analysis were performed using the Gaussian 09 software with the B3LYP method at the 6-311G(d,p) basis set level. The first-order hyperpolarizability (β) value is 37 times greater than that of urea. The optical nonlinearities of AMPDB have been investigated by the Z-scan technique with He-Ne laser radiation at a wavelength of 632.8 nm. Hirshfeld analysis indicates that O⋯H/H⋯O interactions are the dominant interactions, confirming an extensive hydrogen-bond network.

  3. Are There Hidden Supernovae?

    NASA Technical Reports Server (NTRS)

    Bregman, Jesse; Harker, David; Dunham, E.; Rank, David; Temi, Pasquale

    1997-01-01

    Ames Research Center and UCSC have been working on the development of a mid-IR camera for the KAO in order to search for extragalactic supernovae. The development of the camera and its associated data reduction software has been successfully completed. Spectral imaging of the Orion Bar at 6.2 and 7.8 microns demonstrates the derotation and data reduction software that was developed.

  4. Spectrophotometer-Integrating-Sphere System for Computing Solar Absorptance

    NASA Technical Reports Server (NTRS)

    Witte, William G., Jr.; Slemp, Wayne S.; Perry, John E., Jr.

    1991-01-01

    A commercially available ultraviolet, visible, near-infrared spectrophotometer was modified to utilize an 8-inch-diameter modified Edwards-type integrating sphere. Software was written so that the reflectance spectra could be used to obtain solar absorptance values for 1-inch-diameter specimens. A description of the system, the spectral reflectance, and the software for calculating solar absorptance from reflectance data are presented.

  5. Interpreting forest and grassland biome productivity utilizing nested scales of image resolution and biogeographical analysis

    NASA Technical Reports Server (NTRS)

    Iverson, L. R.; Cook, E. A.; Graham, R. L.; Olson, J. S.; Frank, T.; Ke, Y.; Treworgy, C.; Risser, P. G.

    1986-01-01

    Several hardware, software, and data collection problems encountered were overcome. The Geographic Information System (GIS) data from other systems were converted to ERDAS format for incorporation with the image data. Statistical analysis of the relationship between spectral values and productivity is being pursued. Several project sites, including Jackson, Pope, Boulder, the Smokies, and Huntington Forest, are emerging as the most intensively studied areas, primarily due to the availability of data and time. Progress with data acquisition and quality checking, more details on the experimental sites, and brief summaries of research results and future plans are discussed. Material on personnel, collaborators, facilities, site background, and meetings and publications of the investigators is included.

  6. Design and construction of an Offner spectrometer based on geometrical analysis of ring fields.

    PubMed

    Kim, Seo Hyun; Kong, Hong Jin; Lee, Jong Ung; Lee, Jun Ho; Lee, Jai Hoon

    2014-08-01

    A method to obtain an aberration-corrected Offner spectrometer without ray obstruction is proposed. A new, more efficient spectrometer optics design is suggested in order to increase its spectral resolution. The derivation of a new ring equation to eliminate ray obstruction is based on geometrical analysis of the ring fields for various numerical apertures. The analytical design applying this equation was demonstrated using the optical design software Code V in order to manufacture a spectrometer working at wavelengths of 900-1700 nm. The simulation results show that the new concept offers an analytical initial design requiring the least calculation time. The simulated spectrometer exhibited a modulation transfer function over 80% at the Nyquist frequency, root-mean-square spot diameters under 8.6 μm, and a spectral resolution of 3.2 nm. The final design and realization of a high-resolution Offner spectrometer were demonstrated based on the simulation results. The equation and analytical design procedure shown here can be applied to most Offner systems regardless of the wavelength range.

  7. An excitation wavelength-scanning spectral imaging system for preclinical imaging

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas; Jiang, Yanan; Patsekin, Valery; Rajwa, Bartek; Robinson, J. Paul

    2008-02-01

    Small-animal fluorescence imaging is a rapidly growing field, driven by applications in cancer detection and pharmaceutical therapies. However, the practical use of this imaging technology is limited by image-quality issues related to autofluorescence background from animal tissues, as well as attenuation of the fluorescence signal due to scatter and absorption. To combat these problems, spectral imaging and analysis techniques are being employed to separate the fluorescence signal from background autofluorescence. To date, these technologies have focused on detecting the fluorescence emission spectrum at a fixed excitation wavelength. We present an alternative to this technique, an imaging spectrometer that detects the fluorescence excitation spectrum at a fixed emission wavelength. The advantages of this approach include increased available information for discrimination of fluorescent dyes, decreased optical radiation dose to the animal, and the ability to scan a continuous wavelength range instead of discrete wavelength sampling. This excitation-scanning imager utilizes an acousto-optic tunable filter (AOTF), with supporting optics, to scan the excitation spectrum. Advanced image acquisition and analysis software has also been developed for classification and unmixing of the spectral image sets. Filtering has been implemented in a single-pass configuration with a bandwidth (full width at half maximum) of 16 nm at a 550 nm central diffracted wavelength. We have characterized AOTF filtering over a wide range of incident light angles, much wider than has been previously reported in the literature, and we show how changes in incident light angle can be used to attenuate AOTF side lobes and alter bandwidth. A new parameter, the in-band to out-of-band ratio, was defined to assess the quality of the filtered excitation light. Additional parameters were measured to allow objective characterization of the AOTF and the imager as a whole. This is necessary for comparing the excitation-scanning imager to other spectral and fluorescence imaging technologies. The effectiveness of the hyperspectral imager was tested by imaging and analysis of mice with injected fluorescent dyes. Finally, a discussion of the optimization of spectral fluorescence imagers is given, relating the effects of filter quality to the fluorescence images collected and the analysis outcome.

  8. Linear: A Novel Algorithm for Reconstructing Slitless Spectroscopy from HST/WFC3

    NASA Astrophysics Data System (ADS)

    Ryan, R. E., Jr.; Casertano, S.; Pirzkal, N.

    2018-03-01

    We present a grism extraction package (LINEAR) designed to reconstruct 1D spectra from a collection of slitless spectroscopic images, ideally taken at a variety of orientations, dispersion directions, and/or dither positions. Our approach is to enumerate every transformation between all direct image positions (i.e., a potential source) and the collection of grism images at all relevant wavelengths. This leads to solving a large, sparse system of linear equations, which we invert using the standard LSQR algorithm. We implement a number of color and geometric corrections (such as flat field, pixel-area map, source morphology, and spectral bandwidth), but assume many effects have been calibrated out (such as basic reductions, background subtraction, and astrometric refinement). We demonstrate the power of our approach with several Monte Carlo simulations and the analysis of archival data. The simulations include astrometric and photometric uncertainties, sky-background estimation, and signal-to-noise calculations. The data are G141 observations of the Hubble Ultra-Deep Field obtained with the Wide-Field Camera 3, and they show the power of our formalism by improving the spectral resolution without sacrificing the signal-to-noise (a tradeoff that is often made by current approaches). Additionally, our approach naturally accounts for source contamination, which is only handled heuristically by present software. We conclude with a discussion of various observations where our approach will provide much improved 1D spectra, such as crowded fields (star or galaxy clusters), spatially resolved spectroscopy, or surveys with strict completeness requirements. At present our software is heavily geared toward Wide-Field Camera 3 IR; however, we plan to extend the codebase to additional instruments.
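    Stripped to its algebraic core, the extraction described above is a large sparse least-squares problem A f = b solved with LSQR; the sketch below uses a random placeholder matrix rather than a real grism model, simply to show the SciPy call pattern.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

n_pixels, n_wavelengths = 5000, 300

# Placeholder sparse design matrix: each grism pixel sees only a handful of wavelengths.
A = sparse_random(n_pixels, n_wavelengths, density=0.01, format="csr", random_state=1)
f_true = np.exp(-0.5 * ((np.arange(n_wavelengths) - 150) / 30.0) ** 2)   # fake 1D spectrum
b = A @ f_true + 0.01 * np.random.default_rng(1).standard_normal(n_pixels)

# LSQR inverts the large, sparse system without ever forming A^T A explicitly.
f_hat = lsqr(A, b, damp=1e-3)[0]
print(f"relative error: {np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true):.3f}")
```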

  9. Brain metabolic pattern analysis using a magnetic resonance spectra classification software in experimental stroke.

    PubMed

    Jiménez-Xarrié, Elena; Davila, Myriam; Candiota, Ana Paula; Delgado-Mederos, Raquel; Ortega-Martorell, Sandra; Julià-Sapé, Margarida; Arús, Carles; Martí-Fàbregas, Joan

    2017-01-13

    Magnetic resonance spectroscopy (MRS) provides non-invasive information about the metabolic pattern of the brain parenchyma in vivo. The SpectraClassifier software performs MRS pattern-recognition by determining the spectral features (metabolites) which can be used objectively to classify spectra. Our aim was to develop an Infarct Evolution Classifier and a Brain Regions Classifier in a rat model of focal ischemic stroke using SpectraClassifier. A total of 164 single-voxel proton spectra obtained with a 7 Tesla magnet at an echo time of 12 ms from non-infarcted parenchyma, subventricular zones and infarcted parenchyma were analyzed with SpectraClassifier ( http://gabrmn.uab.es/?q=sc ). The spectra corresponded to Sprague-Dawley rats (healthy rats, n = 7) and stroke rats at day 1 post-stroke (acute phase, n = 6 rats) and at days 7 ± 1 post-stroke (subacute phase, n = 14). In the Infarct Evolution Classifier, spectral features contributed by lactate + mobile lipids (1.33 ppm), total creatine (3.05 ppm) and mobile lipids (0.85 ppm) distinguished among non-infarcted parenchyma (100% sensitivity and 100% specificity), acute phase of infarct (100% sensitivity and 95% specificity) and subacute phase of infarct (78% sensitivity and 100% specificity). In the Brain Regions Classifier, spectral features contributed by myoinositol (3.62 ppm) and total creatine (3.04/3.05 ppm) distinguished among infarcted parenchyma (100% sensitivity and 98% specificity), non-infarcted parenchyma (84% sensitivity and 84% specificity) and subventricular zones (76% sensitivity and 93% specificity). SpectraClassifier identified candidate biomarkers for infarct evolution (mobile lipids accumulation) and different brain regions (myoinositol content).

  10. The volatile compound BinBase mass spectral database.

    PubMed

    Skogerson, Kirsten; Wohlgemuth, Gert; Barupal, Dinesh K; Fiehn, Oliver

    2011-08-04

    Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement.

  11. The volatile compound BinBase mass spectral database

    PubMed Central

    2011-01-01

    Background Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. Description The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). Conclusions The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement. PMID:21816034

  12. Design of Instrument Control Software for Solar Vector Magnetograph at Udaipur Solar Observatory

    NASA Astrophysics Data System (ADS)

    Gosain, Sanjay; Venkatakrishnan, P.; Venugopalan, K.

    2004-04-01

    A magnetograph is an instrument which makes measurement of solar magnetic field by measuring Zeeman induced polarization in solar spectral lines. In a typical filter based magnetograph there are three main modules namely, polarimeter, narrow-band spectrometer (filter), and imager(CCD camera). For a successful operation of magnetograph it is essential that these modules work in synchronization with each other. Here, we describe the design of instrument control system implemented for the Solar Vector Magnetograph under development at Udaipur Solar Observatory. The control software is written in Visual Basic and exploits the Component Object Model (COM) components for a fast and flexible application development. The user can interact with the instrument modules through a Graphical User Interface (GUI) and can program the sequence of magnetograph operations. The integration of Interactive Data Language (IDL) ActiveX components in the interface provides a powerful tool for online visualization, analysis and processing of images.

  13. Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Domik, Gitta; Alam, Salim; Pinkney, Paul

    1992-01-01

    This report describes our project activities for the period Sep. 1991 - Oct. 1992. Our activities included stabilizing the software system STAR, porting STAR to IDL/widgets (improved user interface), targeting new visualization techniques for multi-dimensional data visualization (emphasizing 3D visualization), and exploring leading-edge 3D interface devices. During the past project year we emphasized high-end visualization techniques, by exploring new tools offered by state-of-the-art visualization software (such as AVS and IDL/widgets), by experimenting with tools still under research at the Department of Computer Science (e.g., use of glyphs for multidimensional data visualization), and by researching current 3D input/output devices as they could be used to explore 3D astrophysical data. As always, any project activity is driven by the need to interpret astrophysical data more effectively.

  14. A computationally efficient software application for calculating vibration from underground railways

    NASA Astrophysics Data System (ADS)

    Hussein, M. F. M.; Hunt, H. E. M.

    2009-08-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports about the software with a focus on its latest version and the plans for future developments. The software calculates the Power Spectral Density of vibration due to a moving train on floating-slab track with track irregularity described by typical values of spectra for tracks with good, average and bad conditions. The latest version accounts for a tunnel embedded in a half space by employing a toolbox developed at K.U. Leuven which calculates Green's functions for a multi-layered half-space.

  15. Modeling SMAP Spacecraft Attitude Control Estimation Error Using Signal Generation Model

    NASA Technical Reports Server (NTRS)

    Rizvi, Farheen

    2016-01-01

    Two ground simulation software packages are used to model the SMAP spacecraft dynamics. The CAST software uses a higher fidelity model than the ADAMS software. The ADAMS software models the spacecraft plant, controller and actuator models, and assumes a perfect sensor and estimator model. In this simulation study, the spacecraft dynamics results from the ADAMS software are used because the CAST software is unavailable. The main source of spacecraft dynamics error in the higher fidelity CAST software is due to the estimation error. A signal generation model is developed to capture the effect of this estimation error in the overall spacecraft dynamics. Then, this signal generation model is included in the ADAMS software spacecraft dynamics estimate such that the results are similar to CAST. This signal generation model has characteristics (mean, variance and power spectral density) similar to those of the true CAST estimation error. In this way, ADAMS software can still be used while capturing the higher fidelity spacecraft dynamics modeling from CAST software.
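    One common way to build such a signal generation model is to shape white noise to a target power spectral density and then rescale it to the target mean and variance; the sketch below illustrates that approach with an invented first-order target spectrum, not the actual SMAP/CAST estimation-error statistics.

    ```python
    # Minimal sketch: synthesize a signal whose mean, variance and PSD approximate
    # a target. The low-pass target PSD and statistics below are placeholders.
    import numpy as np

    fs, n = 100.0, 2 ** 14                           # sample rate [Hz], samples
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    target_psd = 1.0 / (1.0 + (freqs / 0.5) ** 2)    # placeholder shape, 0.5 Hz corner
    target_mean, target_var = 0.002, 1.0e-6          # placeholder statistics

    # Shape white Gaussian noise in the frequency domain by sqrt(target PSD).
    rng = np.random.default_rng(1)
    white = rng.standard_normal(n)
    shaped = np.fft.irfft(np.fft.rfft(white) * np.sqrt(target_psd), n=n)

    # Rescale to the requested mean and variance.
    shaped = (shaped - shaped.mean()) / shaped.std()
    signal = target_mean + np.sqrt(target_var) * shaped
    print("mean:", signal.mean(), "variance:", signal.var())
    ```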

  16. Metal artefact reduction in gemstone spectral imaging dual-energy CT with and without metal artefact reduction software.

    PubMed

    Lee, Young Han; Park, Kwan Kyu; Song, Ho-Taek; Kim, Sungjun; Suh, Jin-Suck

    2012-06-01

    To assess the usefulness of gemstone spectral imaging (GSI) dual-energy CT (DECT) with/without metal artefact reduction software (MARs). The DECTs were performed using fast kV-switching GSI between 80 and 140 kV. The CT data were retro-reconstructed with/without MARs, by different displayed fields-of-view (DFOV), and with synthesised monochromatic energy in the range 40-140 keV. A phantom study of size and CT numbers was performed in a titanium plate and a stainless steel plate. A clinical study was performed in 26 patients with metallic hardware. All images were retrospectively reviewed in terms of the visualisation of periprosthetic regions and the severity of beam-hardening artefacts by using a five-point scale. The GSI-MARs reconstruction can markedly reduce the metal-related artefacts, and the image quality was affected by the prosthesis composition and DFOV. The spectral CT numbers of the prosthesis and periprosthetic regions showed different patterns on stainless steel and titanium plates. Dual-energy CT with GSI-MARs can reduce metal-related artefacts and improve the delineation of the prosthesis and periprosthetic region. We should be cautious when using GSI-MARs because the image quality was affected by the prosthesis composition, energy (in keV) and DFOV. The metallic composition and size should be considered in metallic imaging with GSI-MARs reconstruction. • Metal-related artefacts can be troublesome on musculoskeletal computed tomography (CT). • Gemstone spectral imaging (GSI) with dual-energy CT (DECT) offers a novel solution. • GSI and metallic artefact reduction software (GSI-MAR) can markedly reduce these artefacts. • However, image quality is influenced by the prosthesis composition and other parameters. • We should be aware of potential overcorrection when using GSI-MARs.

  17. EVALUATION OF PATCHY ATROPHY SECONDARY TO HIGH MYOPIA BY SEMIAUTOMATED SOFTWARE FOR FUNDUS AUTOFLUORESCENCE ANALYSIS.

    PubMed

    Miere, Alexandra; Capuano, Vittorio; Serra, Rita; Jung, Camille; Souied, Eric; Querques, Giuseppe

    2017-05-31

    To evaluate the progression of patchy atrophy in high myopia using semiautomated software for fundus autofluorescence (FAF) analysis. The medical records and multimodal imaging of 21 consecutive highly myopic patients with macular chorioretinal patchy atrophy (PA) were retrospectively analyzed. All patients underwent repeated fundus autofluorescence and spectral domain optical coherence tomography over at least 12 months. Color fundus photography was also performed in a subset of patients. Total atrophy area was measured on FAF images using Region Finder semiautomated software embedded in Spectralis (Heidelberg Engineering, Heidelberg, Germany) at baseline and during follow-up visits. Region Finder was compared with manually measured PA on FAF images. Twenty-two eyes of 21 patients (14 women, 7 men; mean age 62.8 ± 13.0 years, range 32-84 years) were included. Mean PA area using Region Finder was 2.77 ± 2.91 (SD) mm² at baseline, 3.12 ± 2.68 mm² at Month 6, 3.43 ± 2.68 mm² at Month 12, and 3.73 ± 2.74 mm² at Month 18 (overall P < 0.005); this corresponds to a PA progression rate of 0.821 mm²/year. Atrophy progression was significantly greater among eyes with larger PA compared with smaller baseline PA at Months 6, 12, and 18. There was no statistically significant difference between semiautomated Region Finder PA area and manually measured PA area on FAF images. Fundus autofluorescence analysis by Region Finder semiautomated software provides accurate measurements of lesion area and allows us to quantify the progression of PA in high myopia. In our series, PA enlarged significantly over at least 12 months, and its progression seemed to be related to the lesion size at baseline.

  18. NASA Tech Briefs, March 2012

    NASA Technical Reports Server (NTRS)

    2012-01-01

    The topics include: 1) Spectral Profiler Probe for In Situ Snow Grain Size and Composition Stratigraphy; 2) Portable Fourier Transform Spectroscopy for Analysis of Surface Contamination and Quality Control; 3) In Situ Geochemical Analysis and Age Dating of Rocks Using Laser Ablation-Miniature Mass Spectrometer; 4) Physics Mining of Multi-Source Data Sets; 5) Photogrammetry Tool for Forensic Analysis; 6) Connect Global Positioning System RF Module; 7) Simple Cell Balance Circuit; 8) Miniature EVA Software Defined Radio; 9) Remotely Accessible Testbed for Software Defined Radio Development; 10) System-of-Systems Technology-Portfolio-Analysis Tool; 11) VESGEN Software for Mapping and Quantification of Vascular Regulators; 12) Constructing a Database From Multiple 2D Images for Camera Pose Estimation and Robot Localization; 13) Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology; 14) 3D Visualization for Phoenix Mars Lander Science Operations; 15) RxGen General Optical Model Prescription Generator; 16) Carbon Nanotube Bonding Strength Enhancement Using Metal Wicking Process; 17) Multi-Layer Far-Infrared Component Technology; 18) Germanium Lift-Off Masks for Thin Metal Film Patterning; 19) Sealing Materials for Use in Vacuum at High Temperatures; 20) Radiation Shielding System Using a Composite of Carbon Nanotubes Loaded With Electropolymers; 21) Nano Sponges for Drug Delivery and Medicinal Applications; 22) Molecular Technique to Understand Deep Microbial Diversity; 23) Methods and Compositions Based on Culturing Microorganisms in Low Sedimental Fluid Shear Conditions; 24) Secure Peer-to-Peer Networks for Scientific Information Sharing; 25) Multiplexer/Demultiplexer Loading Tool (MDMLT); 26) High-Rate Data-Capture for an Airborne Lidar System; 27) Wavefront Sensing Analysis of Grazing Incidence Optical Systems; 28) Foam-on-Tile Damage Model; 29) Instrument Package Manipulation Through the Generation and Use of an Attenuated-Fluent Gas Fold; 30) Multicolor Detectors for Ultrasensitive Long-Wave Imaging Cameras; 31) Lunar Reconnaissance Orbiter (LRO) Command and Data Handling Flight Electronics Subsystem; and 32) Electro-Optic Segment-Segment Sensors for Radio and Optical Telescopes.

  19. Design and evaluation of a THz time domain imaging system using standard optical design software.

    PubMed

    Brückner, Claudia; Pradarutti, Boris; Müller, Ralf; Riehemann, Stefan; Notni, Gunther; Tünnermann, Andreas

    2008-09-20

    A terahertz (THz) time domain imaging system is analyzed and optimized with standard optical design software (ZEMAX). Special requirements for the illumination optics and imaging optics are presented. In the optimized system, off-axis parabolic mirrors and lenses are combined. The system has a numerical aperture of 0.4 and is diffraction limited for field points up to 4 mm and wavelengths down to 750 µm. ZEONEX is used as the lens material. Higher aspherical coefficients are used for correction of spherical aberration and reduction of lens thickness. The lenses were manufactured by ultraprecision machining. For optimization of the system, ray tracing and wave-optical methods were combined. We show how the ZEMAX Gaussian beam analysis tool can be used to evaluate illumination optics. The resolution of the THz system was tested with a wire and a slit target, line gratings of different period, and a Siemens star. The behavior of the temporal line spread function can be modeled with the polychromatic coherent line spread function feature in ZEMAX. The spectral and temporal resolutions of the line gratings are compared with the respective modulation transfer function of ZEMAX. For maximum resolution, the system has to be diffraction limited down to the smallest wavelength of the spectrum of the THz pulse. Then, the resolution on time domain analysis of the pulse maximum can be estimated with the spectral resolution of the center of gravity wavelength. The system resolution near the optical axis on time domain analysis of the pulse maximum is 1 line pair/mm with an intensity contrast of 0.22. The Siemens star is used for estimation of the resolution of the whole system. An eight channel electro-optic sampling system was used for detection. The resolution on time domain analysis of the pulse maximum of all eight channels could be determined with the Siemens star to be 0.7 line pairs/mm.

  20. CONNJUR R: An annotation strategy for fostering reproducibility in bio-NMR: protein spectral assignment

    PubMed Central

    Fenwick, Matthew; Hoch, Jeffrey C.; Ulrich, Eldon; Gryk, Michael R.

    2015-01-01

    Reproducibility is a cornerstone of the scientific method, essential for validation of results by independent laboratories and the sine qua non of scientific progress. A key step toward reproducibility of biomolecular NMR studies was the establishment of public data repositories (PDB and BMRB). Nevertheless, bio-NMR studies routinely fall short of the requirement for reproducibility that all the data needed to reproduce the results are published. A key limitation is that considerable metadata goes unpublished, notably manual interventions that are typically applied during the assignment of multidimensional NMR spectra. A general solution to this problem has been elusive, in part because of the wide range of approaches and software packages employed in the analysis of protein NMR spectra. Here we describe an approach for capturing missing metadata during the assignment of protein NMR spectra that can be generalized to arbitrary workflows, different software packages, other biomolecules, or other stages of data analysis in bio-NMR. We also present extensions to the NMR-STAR data dictionary that enable machine archival and retrieval of the “missing” metadata. PMID:26253947

  1. SOFIA: a flexible source finder for 3D spectral line data

    NASA Astrophysics Data System (ADS)

    Serra, Paolo; Westmeier, Tobias; Giese, Nadine; Jurek, Russell; Flöer, Lars; Popping, Attila; Winkel, Benjamin; van der Hulst, Thijs; Meyer, Martin; Koribalski, Bärbel S.; Staveley-Smith, Lister; Courtois, Hélène

    2015-04-01

    We introduce SOFIA, a flexible software application for the detection and parametrization of sources in 3D spectral line data sets. SOFIA combines for the first time in a single piece of software a set of new source-finding and parametrization algorithms developed on the way to future H I surveys with ASKAP (WALLABY, DINGO) and APERTIF. It is designed to enable the general use of these new algorithms by the community on a broad range of data sets. The key advantages of SOFIA are the ability to: search for line emission on multiple scales to detect 3D sources in a complete and reliable way, taking into account noise level variations and the presence of artefacts in a data cube; estimate the reliability of individual detections; look for signal in arbitrarily large data cubes using a catalogue of 3D coordinates as a prior; provide a wide range of source parameters and output products which facilitate further analysis by the user. We highlight the modularity of SOFIA, which makes it a flexible package allowing users to select and apply only the algorithms useful for their data and science questions. This modularity makes it also possible to easily expand SOFIA in order to include additional methods as they become available. The full SOFIA distribution, including a dedicated graphical user interface, is publicly available for download.

  2. mzStudio: A Dynamic Digital Canvas for User-Driven Interrogation of Mass Spectrometry Data.

    PubMed

    Ficarro, Scott B; Alexander, William M; Marto, Jarrod A

    2017-08-01

    Although not yet truly 'comprehensive', modern mass spectrometry-based experiments can generate quantitative data for a meaningful fraction of the human proteome. Importantly for large-scale protein expression analysis, robust data pipelines are in place for identification of un-modified peptide sequences and aggregation of these data to protein-level quantification. However, interoperable software tools that enable scientists to computationally explore and document novel hypotheses for peptide sequence, modification status, or fragmentation behavior are not well-developed. Here, we introduce mzStudio, an open-source Python module built on our multiplierz project. This desktop application provides a highly-interactive graphical user interface (GUI) through which scientists can examine and annotate spectral features, re-search existing PSMs to test different modifications or new spectral matching algorithms, share results with colleagues, integrate other domain-specific software tools, and finally create publication-quality graphics. mzStudio leverages our common application programming interface (mzAPI) for access to native data files from multiple instrument platforms, including ion trap, quadrupole time-of-flight, Orbitrap, matrix-assisted laser desorption ionization, and triple quadrupole mass spectrometers and is compatible with several popular search engines including Mascot, Proteome Discoverer, X!Tandem, and Comet. The mzStudio toolkit enables researchers to create a digital provenance of data analytics and other evidence that support specific peptide sequence assignments.

  3. Ultra high energy resolution focusing monochromator for inelastic X-ray scattering spectrometer

    DOE PAGES

    Suvorov, Alexey; Cunsolo, Alessandro; Chubar, Oleg; ...

    2015-11-25

    Further development of a focusing monochromator concept for X-ray energy resolution of 0.1 meV and below is presented. Theoretical analysis of several optical layouts based on this concept was supported by numerical simulations performed in the “Synchrotron Radiation Workshop” software package using the physical-optics approach and careful modeling of partially-coherent synchrotron (undulator) radiation. Along with the energy resolution, the spectral shape of the energy resolution function was investigated. We show that under certain conditions the decay of the resolution function tails can be faster than that of the Gaussian function.

  4. Advanced Optimal Extraction for the Spitzer/IRS

    NASA Astrophysics Data System (ADS)

    Lebouteiller, V.; Bernard-Salas, J.; Sloan, G. C.; Barry, D. J.

    2010-02-01

    We present new advances in the spectral extraction of pointlike sources adapted to the Infrared Spectrograph (IRS) on board the Spitzer Space Telescope. For the first time, we created a supersampled point-spread function of the low-resolution modules. We describe how to use the point-spread function to perform optimal extraction of a single source and of multiple sources within the slit. We also examine the case of the optimal extraction of one or several sources with a complex background. The new algorithms are gathered in a plug-in called AdOpt which is part of the SMART data analysis software.

  5. A data base and analysis program for shuttle main engine dynamic pressure measurements. Appendix B: Data base plots for SSME tests 901-290 through 901-414

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1986-01-01

    A dynamic pressure data base and data base management system developed to characterize the Space Shuttle Main Engine (SSME) dynamic pressure environment is described. The data base represents dynamic pressure measurements obtained during single engine hot firing tests of the SSME. Software is provided to permit statistical evaluation of selected measurements under specified operating conditions. An interpolation scheme is also included to estimate spectral trends with SSME power level. Flow dynamic environments in high performance rocket engines are discussed.

  6. A data base and analysis program for shuttle main engine dynamic pressure measurements. Appendix C: Data base plots for SSME tests 902-214 through 902-314

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1986-01-01

    A dynamic pressure data base and data base management system developed to characterize the Space Shuttle Main Engine (SSME) dynamic pressure environment is reported. The data base represents dynamic pressure measurements obtained during single engine hot firing tests of the SSME. Software is provided to permit statistical evaluation of selected measurements under specified operating conditions. An interpolation scheme is included to estimate spectral trends with SSME power level. Flow dynamic environments in high performance rocket engines are described.

  7. Computation of full energy peak efficiency for nuclear power plant radioactive plume using remote scintillation gamma-ray spectrometry.

    PubMed

    Grozdov, D S; Kolotov, V P; Lavrukhin, Yu E

    2016-04-01

    A method of full energy peak efficiency estimation in the space around scintillation detector, including the presence of a collimator, has been developed. It is based on a mathematical convolution of the experimental results with the following data extrapolation. The efficiency data showed the average uncertainty less than 10%. Software to calculate integral efficiency for nuclear power plant plume was elaborated. The paper also provides results of nuclear power plant plume height estimation by analysis of the spectral data. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Making Sense of Remotely Sensed Ultra-Spectral Infrared Data

    NASA Technical Reports Server (NTRS)

    2001-01-01

    NASA's Jet Propulsion Laboratory (JPL), Pasadena, California, Earth Observing System (EOS) programs, the Deep Space Network (DSN), and various Department of Defense (DOD) technology demonstration programs, combined their technical expertise to develop SEASCRAPE, a software program that obtains data when thermal infrared radiation passes through the Earth's atmosphere and reaches a sensor. Licensed by the California Institute of Technology (Caltech), SEASCRAPE automatically inverts complex infrared data and makes it possible to obtain estimates of the state of the atmosphere along the ray path. Former JPL staff members created a small entrepreneurial firm, Remote Sensing Analysis Systems, Inc., of Altadena, California, to commercialize the product. The founders believed that a commercial version of the software was needed for future U.S. government missions and the commercial monitoring of pollution. With the inversion capability of this software and remote sensing instrumentation, it is possible to monitor pollution sources from safe and secure distances on a noninterfering, noncooperative basis. The software, now known as SEASCRAPE_Plus, allows the user to determine the presence of pollution products, their location and their abundance along the ray path. The technology has been cleared by the Department of Commerce for export, and is currently used by numerous research and engineering organizations around the world.

  9. WUVS simulator: detectability of spectral lines with the WSO-UV spectrographs

    NASA Astrophysics Data System (ADS)

    Marcos-Arenal, Pablo; de Castro, Ana I. Gómez; Abarca, Belén Perea; Sachkov, Mikhail

    2017-04-01

    The World Space Observatory Ultraviolet telescope is equipped with high-dispersion (resolving power of 55,000) spectrographs working in the 1150 to 3100 Å spectral range. To evaluate the impact of the design on the scientific objectives of the mission, a simulation software tool has been developed. This simulator builds on the development made for the PLATO space mission and it is designed to generate synthetic time-series of images by including models of all important noise sources. We describe its design and performance. Moreover, its application to the detectability of important spectral features for star formation and exoplanetary research is addressed.

  10. Spectral Characterization of Analog Samples in Anticipation of OSIRIS-REx's Arrival at Bennu

    NASA Technical Reports Server (NTRS)

    Donaldson Hanna, K. L.; Schrader, D. L.; Bowles, N. E.; Clark, B. E.; Cloutis, E. A.; Connolly, H. C., Jr.; Hamilton, V. E.; Keller, L. P.; Lauretta, D. S.; Lim, L. F.

    2017-01-01

    NASA's Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) mission successfully launched on September 8th, 2016. During its rendezvous with near-Earth asteroid (101955) Bennu beginning in 2018, OSIRIS-REx will characterize the asteroid's physical, mineralogical, and chemical properties in an effort to globally map the properties of Bennu, a primitive carbonaceous asteroid, and choose a sampling location. In preparation for these observations, analog samples were spectrally characterized across visible, near- and thermal-infrared wavelengths and were used in initial tests on mineral-phase-detection and abundance-determination software algorithms.

  11. An Integrated Nonlinear Analysis library - (INA) for solar system plasma turbulence

    NASA Astrophysics Data System (ADS)

    Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras

    2014-05-01

    We present an integrated software library dedicated to the analysis of time series recorded in space and adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) Analysis, the Wavelet and Intermittency Analysis and the Probability Density Functions (PDF) analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the Scalogram, the Local Intermittency Measure (LIM) or the Flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the Flatness parameter and enables fast comparison with standard PDF profiles like, for instance, the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.
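    As a minimal illustration of two diagnostics of the kind listed above, the sketch below computes a Welch PSD and a scale-dependent flatness (kurtosis of increments) for a synthetic series; the signal, scales and parameters are placeholders and the code is not INA's implementation.

    ```python
    # Hedged sketch: Welch PSD plus scale-dependent flatness of a synthetic series.
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(2)
    x = np.cumsum(rng.standard_normal(2 ** 15))      # toy random-walk series
    f, psd = welch(x, fs=1.0, nperseg=4096)          # fs = 1 sample/s (placeholder)

    def flatness(series, scales):
        """Kurtosis <dx^4>/<dx^2>^2 of increments dx(tau) at each scale tau."""
        out = []
        for tau in scales:
            dx = series[tau:] - series[:-tau]
            out.append(np.mean(dx ** 4) / np.mean(dx ** 2) ** 2)
        return np.array(out)

    scales = [2, 4, 8, 16, 32, 64, 128]
    print("flatness per scale:", flatness(x, scales))   # stays near 3 for a Gaussian process
    ```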

  12. Onboard Processor for Compressing HSI Data

    NASA Technical Reports Server (NTRS)

    Cook, Sid; Harsanyi, Joe; Day, John H. (Technical Monitor)

    2002-01-01

    With EO-1 Hyperion and MightySat in orbit NASA and the DoD are showing their continued commitment to hyperspectral imaging (HSI). As HSI sensor technology continues to mature, the ever-increasing amounts of sensor data generated will result in a need for more cost effective communication and data handling systems. Lockheed Martin, with considerable experience in spacecraft design and developing special purpose onboard processors, has teamed with Applied Signal & Image Technology (ASIT), who has an extensive heritage in HSI, to develop a real-time and intelligent onboard processing (OBP) system to reduce HSI sensor downlink requirements. Our goal is to reduce the downlink requirement by a factor greater than 100, while retaining the necessary spectral fidelity of the sensor data needed to satisfy the many science, military, and intelligence goals of these systems. Our initial spectral compression experiments leverage commercial-off-the-shelf (COTS) spectral exploitation algorithms for segmentation, material identification and spectral compression that ASIT has developed. ASIT will also support the modification and integration of this COTS software into the OBP. Other commercially available COTS software for spatial compression will also be employed as part of the overall compression processing sequence. Over the next year elements of a high-performance reconfigurable OBP will be developed to implement proven preprocessing steps that distill the HSI data stream in both spectral and spatial dimensions. The system will intelligently reduce the volume of data that must be stored, transmitted to the ground, and processed while minimizing the loss of information.

  13. Cut-off characterisation of energy spectra of bright fermi sources: Current instrument limits and future possibilities

    NASA Astrophysics Data System (ADS)

    Romoli, C.; Taylor, A. M.; Aharonian, F.

    2017-02-01

    In this paper some of the brightest GeV sources observed by the Fermi-LAT were analysed, focusing on their spectral cut-off region. The sources chosen for this investigation were the brightest blazar flares of 3C 454.3 and 3C 279 and the Vela pulsar with a reanalysis with the latest Fermi-LAT software. For the study of the spectral cut-off we first explored the Vela pulsar spectrum, whose statistics in the time interval of the 3FGL catalog allowed strong constraints to be obtained on the parameters. We subsequently performed a new analysis of the flaring blazar SEDs. For these sources we obtained constraints on the cut-off parameters under the assumption that their underlying spectral distribution is described by a power-law with a stretched exponential cut-off. We then highlighted the significant potential improvements on such constraints by observations with next generation ground based Cherenkov telescopes, represented in our study by the Cherenkov Telescope Array (CTA). Adopting currently available simulations for this future observatory, we demonstrate the considerable improvement in cut-off constraints achievable by observations with this new instrument when compared with that achievable by satellite observations.
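    One common parametrization of a power law with a stretched exponential cut-off, of the kind assumed above, is dN/dE = N0 (E/E0)^(-Gamma) exp[-(E/E_cut)^beta]; the sketch below simply evaluates this form with placeholder parameters, not fitted Fermi-LAT or CTA values.

    ```python
    # Power law with a stretched exponential cut-off (placeholder parameters).
    import numpy as np

    def dnde(E, N0=1e-9, E0=1.0, gamma=2.0, E_cut=3.0, beta=0.7):
        """Differential flux dN/dE in arbitrary units; E, E0 and E_cut in GeV."""
        return N0 * (E / E0) ** (-gamma) * np.exp(-(E / E_cut) ** beta)

    E = np.logspace(-1, 2, 50)            # 0.1 GeV to 100 GeV
    sed = E ** 2 * dnde(E)                # E^2 dN/dE, the usual SED representation
    print(sed[:5])
    ```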

  14. Synthetic Scene Generation of the Stennis V and V Target Range for the Calibration of Remote Sensing Systems

    NASA Technical Reports Server (NTRS)

    Cao, Chang-Yong; Blonski, Slawomir; Ryan, Robert; Gasser, Jerry; Zanoni, Vicki

    1999-01-01

    The verification and validation (V&V) target range developed at Stennis Space Center is a useful test site for the calibration of remote sensing systems. In this paper, we present a simple algorithm for generating synthetic radiance scenes or digital models of this target range. The radiation propagation for the target in the solar reflective and thermal infrared spectral regions is modeled using the atmospheric radiative transfer code MODTRAN 4. The at-sensor, in-band radiance and spectral radiance for a given sensor at a given altitude is predicted. Software is developed to generate scenes with different spatial and spectral resolutions using the simulated at-sensor radiance values. The radiometric accuracy of the simulation is evaluated by comparing simulated with AVIRIS acquired radiance values. The results show that in general there is a good match between AVIRIS sensor measured and MODTRAN predicted radiance values for the target despite the fact that some anomalies exist. Synthetic scenes provide a cost-effective way for in-flight validation of the spatial and radiometric accuracy of the data. Other applications include mission planning, sensor simulation, and trade-off analysis in sensor design.
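    The in-band radiance for a given sensor is typically obtained by weighting the spectral radiance with the band's relative spectral response and normalizing; the sketch below shows that band-averaging step with synthetic curves standing in for MODTRAN output and for a real sensor response.

    ```python
    # Band-averaged (in-band) radiance: integral(L * RSR) / integral(RSR).
    # The radiance and response curves are synthetic placeholders, not MODTRAN output.
    import numpy as np

    wavelength = np.linspace(0.4, 2.5, 2101)                        # micrometres
    L = 100.0 * np.exp(-((wavelength - 0.55) ** 2) / 0.2)           # toy spectral radiance
    rsr = np.exp(-((wavelength - 0.86) ** 2) / (2 * 0.02 ** 2))     # toy NIR band response

    # Uniform wavelength grid, so the grid spacing cancels and plain sums suffice.
    L_band = (L * rsr).sum() / rsr.sum()
    print("in-band radiance (arbitrary units):", L_band)
    ```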

  15. Simulating the WFIRST coronagraph integral field spectrograph

    NASA Astrophysics Data System (ADS)

    Rizzo, Maxime J.; Groff, Tyler D.; Zimmermann, Neil T.; Gong, Qian; Mandell, Avi M.; Saxena, Prabal; McElwain, Michael W.; Roberge, Aki; Krist, John; Riggs, A. J. Eldorado; Cady, Eric J.; Mejia Prada, Camilo; Brandt, Timothy; Douglas, Ewan; Cahoy, Kerri

    2017-09-01

    A primary goal of direct imaging techniques is to spectrally characterize the atmospheres of planets around other stars at extremely high contrast levels. To achieve this goal, coronagraphic instruments have favored integral field spectrographs (IFS) as the science cameras to disperse the entire search area at once and obtain spectra at each location, since the planet position is not known a priori. These spectrographs are useful against confusion from speckles and background objects, and can also help in the speckle subtraction and wavefront control stages of the coronagraphic observation. We present a software package, the Coronagraph and Rapid Imaging Spectrograph in Python (crispy) to simulate the IFS of the WFIRST Coronagraph Instrument (CGI). The software propagates input science cubes using spatially and spectrally resolved coronagraphic focal plane cubes, transforms them into IFS detector maps and ultimately reconstructs the spatio-spectral input scene as a 3D datacube. Simulated IFS cubes can be used to test data extraction techniques, refine sensitivity analyses and carry out design trade studies of the flight CGI-IFS instrument. crispy is a publicly available Python package and can be adapted to other IFS designs.

  16. Design of direct-vision cyclo-olefin-polymer double Amici prism for spectral imaging.

    PubMed

    Wang, Lei; Shao, Zhengzheng; Tang, Wusheng; Liu, Jiying; Nie, Qianwen; Jia, Hui; Dai, Suian; Zhu, Jubo; Li, Xiujian

    2017-10-20

    A direct-vision Amici prism is a desirable dispersion element for spectrometers and spectral imaging systems. In this paper, we focus on designing a direct-vision cyclo-olefin-polymer double Amici prism for spectral imaging systems. We illustrate a designed structure, E48R/N-SF4/E48R, from which we obtain 13 deg of dispersion across the visible spectrum, equivalent to a 700 line pairs/mm grating. We construct a simulative spectral imaging system with the designed direct-vision cyclo-olefin-polymer double Amici prism in optical design software and compare its imaging performance to that of a glass double Amici prism in the same system. The results of spot-size RMS demonstrate that the plastic prism can serve as well as its glass counterpart and offers better spectral resolution.

  17. Raman spectral post-processing for oral tissue discrimination – a step for an automatized diagnostic system

    PubMed Central

    Carvalho, Luis Felipe C. S.; Nogueira, Marcelo Saito; Neto, Lázaro P. M.; Bhattacharjee, Tanmoy T.; Martin, Airton A.

    2017-01-01

    Most oral injuries are diagnosed by histopathological analysis of a biopsy, which is an invasive procedure and does not give immediate results. On the other hand, Raman spectroscopy is a real time and minimally invasive analytical tool with potential for the diagnosis of diseases. The potential for diagnostics can be improved by data post-processing. Hence, this study aims to evaluate the performance of preprocessing steps and multivariate analysis methods for the classification of normal tissues and pathological oral lesion spectra. A total of 80 spectra acquired from normal and abnormal tissues using optical fiber Raman-based spectroscopy (OFRS) were subjected to PCA preprocessing in the z-scored data set, and the KNN (K-nearest neighbors), J48 (unpruned C4.5 decision tree), RBF (radial basis function), RF (random forest), and MLP (multilayer perceptron) classifiers at WEKA software (Waikato environment for knowledge analysis), after area normalization or maximum intensity normalization. Our results suggest the best classification was achieved by using maximum intensity normalization followed by MLP. Based on these results, software for automated analysis can be generated and validated using larger data sets. This would aid quick comprehension of spectroscopic data and easy diagnosis by medical practitioners in clinical settings. PMID:29188115
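    As an analogous sketch of the reported pipeline (maximum intensity normalization, z-scoring, PCA, then an MLP classifier), the example below substitutes scikit-learn for WEKA and synthetic spectra for the 80 OFRS spectra; parameter choices are illustrative only.

    ```python
    # Analogous pipeline sketch (scikit-learn in place of WEKA) on synthetic spectra.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    n_per_class, n_points = 40, 600
    normal = rng.normal(1.0, 0.1, (n_per_class, n_points))
    lesion = rng.normal(1.0, 0.1, (n_per_class, n_points))
    lesion[:, 300:320] += 0.5                        # toy spectral difference
    X = np.vstack([normal, lesion])
    y = np.repeat([0, 1], n_per_class)

    X = X / X.max(axis=1, keepdims=True)             # maximum intensity normalization

    model = make_pipeline(StandardScaler(),          # z-scoring
                          PCA(n_components=10),
                          MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
    print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
    ```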

  18. Raman spectral post-processing for oral tissue discrimination - a step for an automatized diagnostic system.

    PubMed

    Carvalho, Luis Felipe C S; Nogueira, Marcelo Saito; Neto, Lázaro P M; Bhattacharjee, Tanmoy T; Martin, Airton A

    2017-11-01

    Most oral injuries are diagnosed by histopathological analysis of a biopsy, which is an invasive procedure and does not give immediate results. On the other hand, Raman spectroscopy is a real time and minimally invasive analytical tool with potential for the diagnosis of diseases. The potential for diagnostics can be improved by data post-processing. Hence, this study aims to evaluate the performance of preprocessing steps and multivariate analysis methods for the classification of normal tissues and pathological oral lesion spectra. A total of 80 spectra acquired from normal and abnormal tissues using optical fiber Raman-based spectroscopy (OFRS) were subjected to PCA preprocessing in the z-scored data set, and the KNN (K-nearest neighbors), J48 (unpruned C4.5 decision tree), RBF (radial basis function), RF (random forest), and MLP (multilayer perceptron) classifiers at WEKA software (Waikato environment for knowledge analysis), after area normalization or maximum intensity normalization. Our results suggest the best classification was achieved by using maximum intensity normalization followed by MLP. Based on these results, software for automated analysis can be generated and validated using larger data sets. This would aid quick comprehension of spectroscopic data and easy diagnosis by medical practitioners in clinical settings.

  19. Towards real-time non contact spatial resolved oxygenation monitoring using a multi spectral filter array camera in various light conditions

    NASA Astrophysics Data System (ADS)

    Bauer, Jacob R.; van Beekum, Karlijn; Klaessens, John; Noordmans, Herke Jan; Boer, Christa; Hardeberg, Jon Y.; Verdaasdonk, Rudolf M.

    2018-02-01

    Non-contact, spatially resolved oxygenation measurements remain an open challenge in the biomedical field and in non-contact patient monitoring. Although point measurements are the clinical standard to this day, resolving regional differences in oxygenation would improve the quality and safety of care. Recent developments in spectral imaging have resulted in spectral filter array (SFA) cameras. These provide the means to acquire spatial spectral videos in real time and allow a spatial approach to spectroscopy. In this study, the performance of a 25-channel near-infrared SFA camera was studied to obtain spatial oxygenation maps of hands during an occlusion of the left upper arm in 7 healthy volunteers. For comparison, a clinical oxygenation monitoring system, INVOS, was used as a reference. In the case of the NIR SFA camera, oxygenation curves were derived from 2-3 wavelength bands with custom fast analysis software using a basic algorithm. Dynamic oxygenation changes were determined with the NIR SFA camera and the INVOS system at different regional locations of the occluded versus non-occluded hands and were found to be in good agreement. To increase the signal-to-noise ratio, the algorithm and image acquisition were optimised. The measurements were robust to different illumination conditions with NIR light sources. This study shows that imaging of relative oxygenation changes over larger body areas is potentially possible in real time.
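    The record does not spell out the "basic algorithm"; one standard choice for two NIR bands is the modified Beer-Lambert law, sketched below with placeholder extinction coefficients and toy reflectance values rather than calibrated values for the 25-band SFA camera.

    ```python
    # Hedged two-wavelength modified Beer-Lambert sketch; all numbers are placeholders.
    import numpy as np

    # Placeholder extinction coefficients at two NIR bands (~760 nm, ~850 nm):
    eps = np.array([[1.5, 0.6],    # Hb   (deoxy) at band 1, band 2
                    [0.7, 1.1]])   # HbO2 (oxy)   at band 1, band 2

    def concentration_changes(R, R0):
        """Relative changes in [Hb] and [HbO2] from reflectance R vs. baseline R0."""
        dA = -np.log(R / R0)                   # attenuation change per band
        return np.linalg.solve(eps.T, dA)      # solves eps^T @ [dHb, dHbO2] = dA

    R0 = np.array([0.40, 0.42])                # baseline reflectance (toy values)
    R = np.array([0.36, 0.41])                 # during occlusion (toy values)
    d_hb, d_hbo2 = concentration_changes(R, R0)
    print("delta Hb  :", d_hb)                 # rises during occlusion in this toy case
    print("delta HbO2:", d_hbo2)               # falls during occlusion in this toy case
    ```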

  20. Swift captures the spectrally evolving prompt emission of GRB070616

    NASA Astrophysics Data System (ADS)

    Starling, R. L. C.; O'Brien, P. T.; Willingale, R.; Page, K. L.; Osborne, J. P.; de Pasquale, M.; Nakagawa, Y. E.; Kuin, N. P. M.; Onda, K.; Norris, J. P.; Ukwatta, T. N.; Kodaka, N.; Burrows, D. N.; Kennea, J. A.; Page, M. J.; Perri, M.; Markwardt, C. B.

    2008-02-01

    The origins of gamma-ray burst (GRB) prompt emission are currently not well understood and in this context long, well-observed events are particularly important to study. We present the case of GRB070616, analysing the exceptionally long-duration multipeaked prompt emission, and later afterglow, captured by all the instruments on-board Swift and by Suzaku Wide-Band All-Sky Monitor (WAM). The high-energy light curve remained generally flat for several hundred seconds before going into a steep decline. Spectral evolution from hard to soft is clearly taking place throughout the prompt emission, beginning at 285s after the trigger and extending to 1200s. We track the movement of the spectral peak energy, whilst observing a softening of the low-energy spectral slope. The steep decline in flux may be caused by a combination of this strong spectral evolution and the curvature effect. We investigate origins for the spectral evolution, ruling out a superposition of two power laws and considering instead an additional component dominant during the late prompt emission. We also discuss origins for the early optical emission and the physics of the afterglow. The case of GRB070616 clearly demonstrates that both broad-band coverage and good time resolution are crucial to pin down the origins of the complex prompt emission in GRBs. This paper is dedicated to the memory of Dr Francesca Tamburelli who died during its production. Francesca played a fundamental role within the team which is in charge of the development of the Swift X-Ray Telescope (XRT) data analysis software at the Italian Space Agency's Science Data Centre in Frascati. She is sadly missed. E-mail: rlcs1@star.le.ac.uk

  1. A novel model for examining recovery of phonation after vocal nerve damage.

    PubMed

    Bhama, Prabhat K; Hillel, Allen D; Merati, Albert L; Perkel, David J

    2011-05-01

    Recurrent laryngeal nerve injury remains a dominant clinical issue in laryngology. To date, no animal model of laryngeal reinnervation has offered an outcome measure that can reflect the degree of recovery based on vocal function. We present an avian model system for studying recovery of learned vocalizations after nerve injury. Prospective animal study. Digital recordings of bird song were made from 11 adult male zebra finches; nine birds underwent bilateral crushing of the nerve supplying the vocal organ, and two birds underwent sham surgery. Songs from all the birds were then recorded regularly and analyzed based on temporal and spectral characteristics using computer software. Indices were calculated to indicate the degree of similarity between preoperative and postoperative song. Nerve crush caused audible differences in song quality and significant drops (P<0.05) in measured spectral and, to a lesser degree, temporal indices. Spectral indices recovered significantly (mean=43.0%; standard deviation [SD]=40.7; P<0.02), and there was an insignificant trend toward recovery of temporal index (mean=28.0%; SD=41.4; P=0.0771). In five of the nine (56%) birds, there was a greater than 50% recovery of spectral indices within a 4-week period. Two birds exhibited substantially less recovery of spectral indices and two birds had a persistent decline in spectral indices. Recovery of temporal index was highly variable as well, ranging from persistent further declines of 45.1% to recovery of 87%. Neither sham bird exhibited significant (P>0.05) differences in song after nerve crush. The songbird model system allows functional analysis of learned vocalization after surgical damage to vocal nerves. Copyright © 2011 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  2. Combined use of ESI-QqTOF-MS and ESI-QqTOF-MS/MS with mass-spectral library search for qualitative analysis of drugs.

    PubMed

    Pavlic, Marion; Libiseller, Kathrin; Oberacher, Herbert

    2006-09-01

    The potential of the combined use of ESI-QqTOF-MS and ESI-QqTOF-MS/MS with mass-spectral library search for the identification of therapeutic and illicit drugs has been evaluated. Reserpine was used for standardizing experimental conditions and for characterization of the performance of the applied mass spectrometric system. Experiments revealed that because of the mass accuracy, the stability of calibration, and the reproducibility of fragmentation, the QqTOF mass spectrometer is an appropriate platform for establishment of a tandem-mass-spectral library. Three-hundred and nineteen substances were used as reference samples to build the spectral library. For each reference compound, product-ion spectra were acquired at ten different collision-energy values between 5 eV and 50 eV. For identification of unknown compounds, a library search algorithm was developed. The closeness of matching between a measured product-ion spectrum and a spectrum stored in the library was characterized by a value called "match probability", which took into account the number of matched fragment ions, the number of fragment ions observed in the two spectra, and the sum of the intensity differences calculated for matching fragments. A large value for the match probability indicated a close match between the measured and the reference spectrum. A unique feature of the library search algorithm-an implemented spectral purification option-enables characterization of multi-contributor fragment-ion spectra. With the aid of this software feature, substances comprising only 1.0% of the total amount of binary mixtures were unequivocally assigned, in addition to the isobaric main contributors. The spectral library was successfully applied to the characterization of 39 forensic casework samples.
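    The exact "match probability" formula is not reproduced in this record; the sketch below combines the same ingredients it names (matched fragment count, fragment counts in both spectra, summed intensity differences of matching fragments) into an illustrative score with an arbitrary weighting.

    ```python
    # Illustrative spectral match score; the weighting is an assumption, not the
    # published "match probability".
    def match_score(query, reference, mz_tol=0.02):
        """query/reference: lists of (m/z, relative intensity on a 0-100 scale)."""
        matched, intensity_diff, used = 0, 0.0, set()
        for mz_q, i_q in query:
            for j, (mz_r, i_r) in enumerate(reference):
                if j not in used and abs(mz_q - mz_r) <= mz_tol:
                    matched += 1
                    intensity_diff += abs(i_q - i_r)
                    used.add(j)
                    break
        if matched == 0:
            return 0.0
        coverage = 2.0 * matched / (len(query) + len(reference))
        closeness = 1.0 - intensity_diff / (100.0 * matched)
        return coverage * max(closeness, 0.0)

    query = [(195.1, 100), (167.0, 45), (138.1, 20)]
    reference = [(195.09, 95), (167.02, 50), (138.12, 25), (110.0, 5)]
    print(round(match_score(query, reference), 3))
    ```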

  3. Mapped minerals at Questa, New Mexico, using airborne visible-infrared imaging spectrometer (AVIRIS) data -- Preliminary report

    USGS Publications Warehouse

    Livo, K. Eric; Clark, Roger N.

    2002-01-01

    This preliminary study for the First Quarterly Report has spectrally mapped hydrothermally altered minerals useful for assessing the water quality of the Red River. Airborne Visible-Infrared Imaging Spectrometer (AVIRIS) data was analyzed to characterize mined and unmined ground at Questa, New Mexico. AVIRIS data covers the Red River drainage north of the river, from the town of Questa on the west to east of the town of Red River. The data was calibrated and analyzed using U.S. Geological Survey custom software and a spectral mineral library. AVIRIS data was tested for spectral features that matched similar features in the spectral mineral library. Goodness-of-fit and band-depth were calculated for each comparison of spectral features and used to identify surface mineralogy. Mineral distribution, mineral associations, and AVIRIS pixel spectra were examined. Mineral maps show the distribution of iron hydroxides, iron sulfates, clays, micas, carbonates, and other minerals. Initial results show a system of alteration suites that overprint each other. Quartz-sericite-pyrite (QSP) alteration grading out to propylitic alteration (epidote and calcite) was identified at the Questa Mine (molybdenum porphyry) and a similar alteration pattern was mapped at the landslide ("scar") areas. Supergene weathering overprints the altered rock, as shown by jarosite, kaolinite, and gypsum. In the spectral analysis, hydrothermally altered ground appears to be more extensive at the unmined Goat Hill Gulch and the mined ground than at the "scars" to the east. Though the "scars" have similar overall altered mineral suites, there are differences between the "scars" in sericite, kaolinite, jarosite, gypsum, and calcite abundance. Fieldwork has verified the results at the central unmined "scar" areas.

  4. Spatially explicit spectral analysis of point clouds and geospatial data

    USGS Publications Warehouse

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described, and its functionality illustrated with an example of high-resolution bathymetric point cloud data collected with a multibeam echosounder.
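    As a minimal sketch of spatially explicit spectral analysis (not PySESA itself), the example below detrends a gridded elevation patch, computes its 2D power spectrum and reports an RMS roughness together with the dominant roughness wavelength; grid size, spacing and the synthetic surface are placeholders.

    ```python
    # Hedged sketch: frequency-domain roughness statistics for a synthetic gridded patch.
    import numpy as np

    rng = np.random.default_rng(4)
    n, dx = 128, 0.1                              # grid size and spacing [m] (placeholders)
    x = np.arange(n) * dx
    X, Y = np.meshgrid(x, x)
    z = 0.05 * np.sin(2 * np.pi * X / 1.6) + 0.01 * rng.standard_normal((n, n))

    z = z - z.mean()                              # detrend (mean removal only, for brevity)
    rms_roughness = z.std()

    spec = np.abs(np.fft.fftshift(np.fft.fft2(z))) ** 2
    fx = np.fft.fftshift(np.fft.fftfreq(n, d=dx))
    FX, FY = np.meshgrid(fx, fx)
    FR = np.hypot(FX, FY)

    spec[FR == 0] = 0.0                           # ignore the DC component
    peak = np.unravel_index(np.argmax(spec), spec.shape)
    dominant_wavelength = 1.0 / FR[peak]

    print("RMS roughness [m]      :", rms_roughness)
    print("dominant wavelength [m]:", dominant_wavelength)
    ```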

  5. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*

    PubMed Central

    Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-01-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314

  6. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. Main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters, but for optically complex waters development of more advanced retrieval methods is required.

  7. Real-Time Data Display

    NASA Technical Reports Server (NTRS)

    Pedings, Marc

    2007-01-01

    RT-Display is a MATLAB-based data acquisition environment designed to use a variety of commercial off-the-shelf (COTS) hardware to digitize analog signals to a standard data format usable by other post-acquisition data analysis tools. This software presents the acquired data in real time using a variety of signal-processing algorithms. The acquired data is stored in a standard Operator Interactive Signal Processing Software (OISPS) data-formatted file. RT-Display is primarily configured to use the Agilent VXI (or equivalent) data acquisition boards used in such systems as MIDDAS (Multi-channel Integrated Dynamic Data Acquisition System). The software is generalized and deployable in almost any testing environment, without limitations or proprietary configuration for a specific test program or project. With the Agilent hardware configured and in place, users can start the program and, in one step, immediately begin digitizing multiple channels of data. Once the acquisition is completed, data is converted into a common binary format that also can be translated to specific formats used by external analysis software, such as OISPS and PC-Signal (product of AI Signal Research Inc.). RT-Display at the time of this reporting was certified on Agilent hardware capable of acquisition up to 196,608 samples per second. Data signals are presented to the user on-screen simultaneously for 16 channels. Each channel can be viewed individually, with a maximum capability of 160 signal channels (depending on hardware configuration). Current signal presentations include: time data, fast Fourier transforms (FFT), and power spectral density plots (PSD). Additional processing algorithms can be easily incorporated into this environment.
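    The signal presentations listed above can be reproduced in a few lines for a synthetic channel; the sketch below (plain Python/SciPy rather than the MATLAB/Agilent VXI implementation) computes the time trace, FFT magnitude and Welch power spectral density.

    ```python
    # Hedged sketch of the listed displays for one synthetic channel.
    import numpy as np
    from scipy.signal import welch

    fs = 196608.0                                 # samples per second (from the text above)
    t = np.arange(0, 0.1, 1.0 / fs)               # 0.1 s time trace
    channel = (np.sin(2 * np.pi * 5000.0 * t)
               + 0.1 * np.random.default_rng(5).standard_normal(t.size))

    spectrum = np.abs(np.fft.rfft(channel)) / t.size        # FFT magnitude
    freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
    f_psd, psd = welch(channel, fs=fs, nperseg=4096)        # power spectral density

    print("peak FFT bin at %.0f Hz" % freqs[np.argmax(spectrum[1:]) + 1])
    print("peak PSD bin at %.0f Hz" % f_psd[np.argmax(psd)])
    ```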

  8. In search of Nemesis

    NASA Technical Reports Server (NTRS)

    Carlson, S.; Culler, T.; Muller, R. A.; Tetreault, M.; Perlmutter, S.

    1994-01-01

    The parallax of all stars of visual magnitude greater than about 6.5 has already been measured. If Nemesis is a main-sequence star 1 parsec away, this requires Nemesis's mass to be less than about 0.4 solar masses. If it were less than about 0.05 solar masses its gravity would be too weak to trigger a comet storm. If Nemesis is on the main sequence, this mass range requires it to be a red dwarf. A red dwarf companion would probably have been missed by standard astronomical surveys. Nearby stars are usually found because they are bright or have high proper motion. However, Nemesis's proper motion would now be 0.01 arcsec/yr, and if it is a red dwarf its magnitude is about 10 - too dim to attract attention. Unfortunately, standard four-color photometry does not distinguish between red dwarfs and giants. So although surveys such as the Dearborn Red Star Catalog list stars by magnitude and spectral type, they do not identify the dwarfs. Every star of the correct spectral type and magnitude must be scrutinized. Our candidate list is a hybrid; candidate red stars are identified in the astrometrically poor Dearborn Red Star Catalog and their positions are corrected using the Hubble Guide Star Catalog. When errors in the Dearborn catalog make it impossible to identify the corresponding Hubble star, the fields are split so that we have one centering on each possible candidate. We are currently scrutinizing 3098 fields, which we believe contain all possible red dwarf candidates in the northern hemisphere. Since our last report the analysis and database software has been completely rebuilt to take advantage of updated hardware, to make the data more accessible, and to implement improved methods of data analysis. The software is now completed and we are eliminating stars every clear night.

  9. Platform-independent and label-free quantitation of proteomic data using MS1 extracted ion chromatograms in skyline: application to protein acetylation and phosphorylation.

    PubMed

    Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; Maccoss, Michael J; Gibson, Bradford W

    2012-05-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.

  10. Platform-independent and Label-free Quantitation of Proteomic Data Using MS1 Extracted Ion Chromatograms in Skyline

    PubMed Central

    Schilling, Birgit; Rardin, Matthew J.; MacLean, Brendan X.; Zawadzka, Anna M.; Frewen, Barbara E.; Cusack, Michael P.; Sorensen, Dylan J.; Bereman, Michael S.; Jing, Enxuan; Wu, Christine C.; Verdin, Eric; Kahn, C. Ronald; MacCoss, Michael J.; Gibson, Bradford W.

    2012-01-01

    Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models. PMID:22454539

  11. Development and implementation of software systems for imaging spectroscopy

    USGS Publications Warehouse

    Boardman, J.W.; Clark, R.N.; Mazer, A.S.; Biehl, L.L.; Kruse, F.A.; Torson, J.; Staenz, K.

    2006-01-01

    Specialized software systems have played a crucial role throughout the twenty-five year course of the development of the new technology of imaging spectroscopy, or hyperspectral remote sensing. By their very nature, hyperspectral data place unique and demanding requirements on the computer software used to visualize, analyze, process and interpret them. Often described as a marriage of the two technologies of reflectance spectroscopy and airborne/spaceborne remote sensing, imaging spectroscopy, in fact, produces data sets with unique qualities, unlike previous remote sensing or spectrometer data. Because of these unique spatial and spectral properties hyperspectral data are not readily processed or exploited with legacy software systems inherited from either of the two parent fields of study. This paper provides brief reviews of seven important software systems developed specifically for imaging spectroscopy.

  12. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    NASA Technical Reports Server (NTRS)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.

  13. [Fast determination of induction period of motor gasoline using Fourier transform attenuated total reflection infrared spectroscopy].

    PubMed

    Liu, Ya-Fei; Yuan, Hong-Fu; Song, Chun-Feng; Xie, Jin-Chun; Li, Xiao-Yu; Yan, De-Lin

    2014-11-01

    A new method is proposed for the fast determination of the induction period of gasoline using Fourier transform attenuated total reflection infrared spectroscopy (ATR-FTIR). A dedicated analysis system providing spectral measurement, data processing, display, and storage was designed and integrated from a Fourier transform infrared spectrometer module and chemometric software. The sample presentation accessory, which offers a constant optical path and convenient sample injection and cleaning, consists of a nine-reflection attenuated total reflectance (ATR) crystal of zinc selenide (ZnSe) coated with a diamond film and a stainless steel lid with a sealing device. The influence of the number of spectral scans and of repeated sample loadings on the spectral signal-to-noise ratio was studied; the optimum number of scans is 15 and the optimum number of sample loadings is 4. Sixty-four different gasoline samples were collected from the Beijing-Tianjin area and their induction period values were determined as reference data by the standard method GB/T 8018-87. The infrared spectra of these samples were collected under the operating conditions mentioned above using the dedicated fast analysis system. Spectra were pretreated using mean centering and a first derivative to reduce the influence of spectral noise and baseline shift. A PLS calibration model for the induction period was established by correlating the known induction period values of the samples with their spectra. The correlation coefficient (R²), standard error of calibration (SEC), and standard error of prediction (SEP) of the model are 0.897, 68.3 minutes, and 91.9 minutes, respectively. The relative deviation of the model for gasoline induction period prediction is less than 5%, which meets the repeatability tolerance requirements of the GB method. The new method is simple and fast, requiring no more than 3 minutes per sample, and is therefore feasible for fast determination of the gasoline induction period and of practical value for fuel quality evaluation.
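
    A minimal sketch of the described workflow (first-derivative and mean-centering pretreatment followed by PLS calibration) is shown below, using synthetic stand-in spectra rather than the 64 gasoline samples; the derivative window length and the number of PLS components are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# X: ATR-FTIR spectra (n_samples x n_wavenumbers), y: induction period in minutes.
# Synthetic stand-ins; real data would come from the gasoline samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 800))
y = rng.uniform(200, 900, size=64)

# Pretreatment: first derivative (Savitzky-Golay) followed by mean centering
X_d1 = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)
X_c = X_d1 - X_d1.mean(axis=0)

X_cal, X_val, y_cal, y_val = train_test_split(X_c, y, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=6).fit(X_cal, y_cal)

sec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
sep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
print(f"SEC = {sec:.1f} min, SEP = {sep:.1f} min")
```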

  14. Labview Interface Concepts Used in NASA Scientific Investigations and Virtual Instruments

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Parker, Bradford H.; Rapchun, David A.; Jones, Hollis H.; Cao, Wei

    2001-01-01

    This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more unique features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.

  15. The well-tuned blues: the role of structural colours as optical signals in the species recognition of a local butterfly fauna (Lepidoptera: Lycaenidae: Polyommatinae)

    PubMed Central

    Bálint, Zsolt; Kertész, Krisztián; Piszter, Gábor; Vértesy, Zofia; Biró, László P.

    2012-01-01

    The photonic nanoarchitectures responsible for the blue colour of the males of nine polyommatine butterfly species living in the same site were investigated structurally by electron microscopy and spectrally by reflectance spectroscopy. Optical characterization was carried out on 110 exemplars. The structural data extracted by dedicated software and the spectral data extracted by standard software were inputted into an artificial neural network software to test the specificity of the structural and optical characteristics. It was found that both the structural and the spectral data allow species identification with an accuracy better than 90 per cent. The reflectance data were further analysed using a colour representation diagram built in a manner analogous to that of the human Commission Internationale de l'Eclairage diagram, but the additional blue visual pigment of lycaenid butterflies was taken into account. It was found that this butterfly-specific colour representation diagram yielded a much clearer distinction of the position of the investigated species compared with previous calculations using the human colour space. The specific colours of the investigated species were correlated with the 285 flight-period data points extracted from museum collections. The species with somewhat similar colours fly in distinct periods of the year such that the blue colours are well tuned for safe mate/competitor recognition. This allows for the creation of an effective pre-zygotic isolation mechanism for closely related synchronic and syntopic species. PMID:22319114

  16. Study on temperature measurement of gas turbine blade based on analysis of error caused by the reflected radiation and emission angle

    NASA Astrophysics Data System (ADS)

    Li, Dong; Feng, Chi; Gao, Shan; Chen, Liwei; Daniel, Ketui

    2018-06-01

    Accurate measurement of gas turbine blade temperature is of great significance for blade health monitoring. An important method for measuring this temperature is the use of a radiation pyrometer. In this research, the pyrometer error caused by radiation reflected from the surfaces surrounding the target and by the emission angle of the target was analyzed. Important parameters for this analysis were the view factor between interacting surfaces, the spectral directional emissivity, the pyrometer operating wavelength, and the surface temperature distribution on the blades and vanes. The interacting surfaces of the rotor blade and vane models were discretized into triangular surface elements, and a contour integral was used to calculate the view factor between the surface elements. Spectral directional emissivities were obtained from an experimental setup using Ni-based alloy samples. A pyrometer operating wavelength of 1.6 μm was chosen. Computational fluid dynamics software was used to simulate the temperature distribution of the rotor blade and the guide vane based on actual gas turbine input parameters. Results of this analysis show that the temperature error introduced by reflected radiation and emission angle ranges from -23 K to 49 K.
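
    The way reflected radiation biases a single-wavelength pyrometer can be illustrated with Planck's law; the sketch below uses placeholder temperatures, emissivity, and view factor, not the CFD or experimental values from the study.

```python
import numpy as np

# Radiation constants for spectral radiance in W m^-2 sr^-1 um^-1
C1 = 1.191042e8    # 2*h*c^2, in W um^4 m^-2 sr^-1
C2 = 1.438777e4    # h*c/k, in um K

def planck(lam_um, T):
    """Blackbody spectral radiance at wavelength lam_um (um) and temperature T (K)."""
    return C1 / (lam_um**5 * (np.exp(C2 / (lam_um * T)) - 1.0))

def inverse_planck(lam_um, L):
    """Temperature a blackbody would need to emit spectral radiance L."""
    return C2 / (lam_um * np.log(1.0 + C1 / (lam_um**5 * L)))

lam = 1.6              # pyrometer operating wavelength, um
T_blade = 1100.0       # true blade surface temperature, K (placeholder)
T_vane = 1250.0        # surrounding vane temperature, K   (placeholder)
eps = 0.85             # spectral directional emissivity    (placeholder)
F = 0.30               # view factor blade element -> vane  (placeholder)

# Radiance reaching the pyrometer: own emission plus reflected vane radiation
L_meas = eps * planck(lam, T_blade) + (1.0 - eps) * F * planck(lam, T_vane)

# A pyrometer that corrects for emissivity but ignores reflection infers:
T_apparent = inverse_planck(lam, L_meas / eps)
print(f"temperature error = {T_apparent - T_blade:+.1f} K")
```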

  17. Spectrally formulated user-defined element in conventional finite element environment for wave motion analysis in 2-D composite structures

    NASA Astrophysics Data System (ADS)

    Khalili, Ashkan; Jha, Ratneshwar; Samaratunga, Dulip

    2016-11-01

    Wave propagation analysis in 2-D composite structures is performed efficiently and accurately through the formulation of a User-Defined Element (UEL) based on the wavelet spectral finite element (WSFE) method. The WSFE method is based on the first-order shear deformation theory which yields accurate results for wave motion at high frequencies. The 2-D WSFE model is highly efficient computationally and provides a direct relationship between system input and output in the frequency domain. The UEL is formulated and implemented in Abaqus (commercial finite element software) for wave propagation analysis in 2-D composite structures with complexities. Frequency domain formulation of WSFE leads to complex valued parameters, which are decoupled into real and imaginary parts and presented to Abaqus as real values. The final solution is obtained by forming a complex value using the real number solutions given by Abaqus. Five numerical examples are presented in this article, namely undamaged plate, impacted plate, plate with ply drop, folded plate and plate with stiffener. Wave motions predicted by the developed UEL correlate very well with Abaqus simulations. The results also show that the UEL largely retains computational efficiency of the WSFE method and extends its ability to model complex features.
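
    The decoupling of complex-valued frequency-domain quantities into real and imaginary parts for a real-valued solver can be illustrated with the standard block embedding below; this is a minimal sketch, not the authors' UEL implementation.

```python
import numpy as np

def solve_complex_as_real(K, f):
    """Solve K u = f for complex K, f using only real arithmetic.

    The complex system is embedded as
        [ Re(K) -Im(K) ] [ Re(u) ]   [ Re(f) ]
        [ Im(K)  Re(K) ] [ Im(u) ] = [ Im(f) ]
    which mirrors how complex frequency-domain quantities can be presented to
    a real-valued finite element solver and recombined afterwards.
    """
    n = K.shape[0]
    K_real = np.block([[K.real, -K.imag],
                       [K.imag,  K.real]])
    f_real = np.concatenate([f.real, f.imag])
    u_real = np.linalg.solve(K_real, f_real)
    return u_real[:n] + 1j * u_real[n:]

# Quick check against a direct complex solve
rng = np.random.default_rng(1)
K = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
f = rng.normal(size=4) + 1j * rng.normal(size=4)
assert np.allclose(solve_complex_as_real(K, f), np.linalg.solve(K, f))
```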

  18. Hybrid Interferometric/Dispersive Atomic Spectroscopy For Nuclear Materials Analysis

    NASA Astrophysics Data System (ADS)

    Morgan, Phyllis K.

    Laser-induced breakdown spectroscopy (LIBS) is an optical emission spectroscopy technique that holds promise for detection and rapid analysis of elements relevant for nuclear safeguards and nonproliferation, including the measurement of isotope ratios. One important application of LIBS is the measurement of uranium enrichment (235U/238U), which requires high spectral resolution (e.g., 25 pm for the 424.437 nm U II line). Measuring uranium enrichment is important in nuclear nonproliferation and safeguards because uranium highly enriched in the 235U isotope can be used to construct nuclear weapons. High-resolution dispersive spectrometers necessary for such measurements are typically bulky and expensive. A hybrid interferometric/dispersive spectrometer prototype, which consists of an inexpensive, compact Fabry-Perot etalon integrated with a low to moderate resolution Czerny-Turner spectrometer, was assembled for making high-resolution measurements of nuclear materials in a laboratory setting. To more fully take advantage of this low-cost, compact hybrid spectrometer, a mathematical reconstruction technique was developed to accurately reconstruct relative line strengths from complex spectral patterns with high resolution. Measurement of the mercury 313.1555/313.1844 nm doublet from a mercury-argon lamp yielded a spectral line intensity ratio of 0.682, which agrees well with an independent measurement by an echelle spectrometer and previously reported values. The hybrid instrument was used in LIBS measurements and achieved the resolution needed for isotopic selectivity of LIBS of uranium in ambient air. The samples used were a natural uranium foil (0.7% of 235U) and a uranium foil highly enriched in 235U to 93%. Both samples were provided by Penn State University's Breazeale Nuclear Reactor. The enrichment of the uranium foils was verified using a high-purity germanium detector and dedicated software for multi-group spectral analysis. Uranium spectral line widths of ~10 pm were measured at a center wavelength of 424.437 nm, clearly discriminating the natural from the highly enriched uranium at that wavelength. The 424.167 nm isotope shift (~6 pm), limited by spectral broadening, was only partially resolved but still discernible. This instrument and reconstruction method could enable the design of significantly smaller, portable high-resolution instruments with isotopic specificity, benefiting nuclear safeguards, treaty verification, nuclear forensics, and a variety of other spectroscopic applications.

  19. A robust pseudo-inverse spectral filter applied to the Earth Radiation Budget Experiment (ERBE) scanning channels

    NASA Technical Reports Server (NTRS)

    Avis, L. M.; Green, R. N.; Suttles, J. T.; Gupta, S. K.

    1984-01-01

    Computer simulations of a least squares estimator operating on the ERBE scanning channels are discussed. The estimator is designed to minimize the errors produced by nonideal spectral response to spectrally varying and uncertain radiant input. The three ERBE scanning channels cover a shortwave band, a longwave band, and a "total" band, from which the pseudo-inverse spectral filter estimates the radiance components in the shortwave and longwave bands. The radiance estimator draws on instantaneous field of view (IFOV) scene-type information supplied by another algorithm of the ERBE software, and on a priori probabilistic models of the responses of the scanning channels to the IFOV scene types for a given Sun-scene-spacecraft geometry. It is found that the pseudo-inverse spectral filter is stable, tolerant of errors in scene identification and in channel response modeling, and, in the absence of such errors, yields minimum variance and essentially unbiased radiance estimates.
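
    A minimal sketch of the underlying idea, estimating two spectral radiance components from three broadband channel measurements with a pseudo-inverse, is given below; the response matrix and radiances are illustrative placeholders, and the a priori scene-type weighting described above is omitted.

```python
import numpy as np

# Rows: shortwave, longwave, total channels; columns: true shortwave and
# longwave radiance components. Entries are illustrative sensitivities only.
S = np.array([[0.95, 0.05],
              [0.02, 0.88],
              [0.97, 0.93]])

true_radiance = np.array([250.0, 180.0])          # W m^-2 sr^-1 (placeholders)
measurements = S @ true_radiance + np.random.normal(0.0, 1.0, size=3)

# Pseudo-inverse (least squares) estimate of the shortwave and longwave radiances
S_pinv = np.linalg.pinv(S)
estimate = S_pinv @ measurements
print(estimate)
```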

  20. The MPI-Mainz UV/VIS Spectral Atlas of Gaseous Molecules of Atmospheric Interest

    NASA Astrophysics Data System (ADS)

    Keller-Rudek, H.; Moortgat, G. K.; Sander, R.; Sörensen, R.

    2013-08-01

    We present the MPI-Mainz UV/VIS Spectral Atlas, which is a large collection of absorption cross sections and quantum yields in the ultraviolet and visible (UV/VIS) wavelength region for gaseous molecules and radicals primarily of atmospheric interest. The data files contain results of individual measurements, covering research of almost a whole century. To compare and visualize the data sets, multicoloured graphical representations have been created. The Spectral Atlas is available on the internet at http://www.uv-vis-spectral-atlas-mainz.org. It now appears with improved browse and search options, based on new database software. In addition to the web pages, which are continuously updated, a frozen version of the data is available under the doi:10.5281/zenodo.6951.

  1. Acoustic Database for Turbofan Engine Core-Noise Sources. I; Volume

    NASA Technical Reports Server (NTRS)

    Gordon, Grant

    2015-01-01

    In this program, a database of dynamic temperature and dynamic pressure measurements was acquired inside the core of a TECH977 turbofan engine to support investigations of indirect combustion noise. Dynamic temperature and pressure measurements were recorded for engine gas dynamics up to temperatures of 3100 degrees Fahrenheit and transient responses as high as 1000 hertz. These measurements were made at the entrance of the high pressure turbine (HPT) and at the entrance and exit of the low pressure turbine (LPT). Measurements were made at two circumferential clocking positions. In the combustor and inter-turbine duct (ITD), measurements were made at two axial locations to enable the exploration of time delays. The dynamic temperature measurements were made using dual thin-wire thermocouple probes. The dynamic pressure measurements were made using semi-infinite probes. Prior to the engine test, a series of bench, oven, and combustor rig tests were conducted to characterize the performance of the dual-wire temperature probes and to define and characterize the data acquisition systems. A measurement solution for acquiring dynamic temperature and pressure data on the engine was defined. A suite of hardware modifications was designed to incorporate the dynamic temperature and pressure instrumentation into the TECH977 engine. In particular, a probe actuation system was developed to protect the delicate temperature probes during engine startup and transients in order to maximize sensor life. A set of temperature probes was procured and the TECH977 engine was assembled with the suite of new and modified hardware. The engine was tested at four steady state operating speeds, with repeats. Dynamic pressure and temperature data were acquired at each condition for at least one minute. At the two highest power settings, temperature data could not be obtained at the forward probe locations since the mean temperatures exceeded the capability of the probes. The temperature data were processed using software that accounts for the effects of convective and conductive heat transfer. The software was developed under previous NASA sponsored programs. Compensated temperature spectra and compensated time histories corresponding to the dynamic temperature of the gas stream were generated. Auto-spectral and cross-spectral analyses of the data were performed to investigate spectral features, acoustic circumferential mode content, signal coherence, and time delays. The dynamic temperature data exhibit a wideband and fairly flat spectral content. The temperature spectra do not change substantially with operating speed. The pressure spectra in the combustor and ITD exhibit generally similar shapes and amplitudes, making it difficult to identify any features that suggest the presence of indirect combustion noise. Cross-spectral analysis reveals a strong correlation between pressure and temperature fluctuations in the ITD, but little correlation between temperature fluctuations at the entrance of the HPT and pressure fluctuations downstream of it. Temperature fluctuations at the entrance of the low pressure turbine were an order of magnitude smaller than those at the entrance to the high pressure turbine. Time delay analysis of the temperature fluctuations in the combustor was inconclusive, perhaps due to the substantial mixing that occurs between the upstream and downstream locations. Time delay analysis of the temperature fluctuations in the ITD indicates that they convect at the mean flow speed.
Analysis of the data did not reveal any convincing indications of the presence of indirect combustion noise. However, this analysis has been preliminary and additional exploration of the data is recommended including the use of more sophisticated signal processing to explore subtle issues that have been revealed but which are not yet fully understood or explained.
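
    The kind of cross-spectral and time-delay analysis described can be sketched as follows on synthetic signals; the sample rate, delay, and noise level are placeholders, not values from the TECH977 test.

```python
import numpy as np
from scipy import signal

fs = 8192                                  # sample rate, Hz (placeholder)
t = np.arange(0, 8.0, 1 / fs)
delay_s = 0.012                            # assumed convective delay between probes
d = int(delay_s * fs)

noise = np.random.randn(t.size + d)
upstream = noise[d:]                                        # upstream probe signal
downstream = noise[:t.size] + 0.3 * np.random.randn(t.size) # same signal, d samples later

# Cross-spectral density and magnitude-squared coherence (Welch-averaged)
f, Pxy = signal.csd(upstream, downstream, fs=fs, nperseg=1024)
f, Cxy = signal.coherence(upstream, downstream, fs=fs, nperseg=1024)

# Time delay from the peak of the cross-correlation (positive lag: downstream lags)
xcorr = signal.correlate(downstream, upstream, mode="full")
lags = signal.correlation_lags(downstream.size, upstream.size, mode="full")
print("estimated delay:", lags[np.argmax(xcorr)] / fs, "s")
```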

  2. Error Analysis of Indirect Broadband Monitoring of Multilayer Optical Coatings using Computer Simulations

    NASA Astrophysics Data System (ADS)

    Semenov, Z. V.; Labusov, V. A.

    2017-11-01

    Results of studying the errors of indirect monitoring by means of computer simulations are reported. The monitoring method is based on measuring spectra of reflection from additional monitoring substrates in a wide spectral range. Special software (Deposition Control Simulator) is developed, which allows one to estimate the influence of the monitoring system parameters (noise of the photodetector array, operating spectral range of the spectrometer and errors of its calibration in terms of wavelengths, drift of the radiation source intensity, and errors in the refractive index of deposited materials) on the random and systematic errors of deposited layer thickness measurements. The direct and inverse problems of multilayer coatings are solved using the OptiReOpt library. Curves of the random and systematic errors of measurements of the deposited layer thickness as functions of the layer thickness are presented for various values of the system parameters. Recommendations are given on using the indirect monitoring method for the purpose of reducing the layer thickness measurement error.
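
    The forward problem, computing the reflection spectrum of a deposited stack on a monitoring substrate, can be sketched with the standard characteristic-matrix method at normal incidence (non-absorbing layers assumed); this is not the OptiReOpt library.

```python
import numpy as np

def reflectance(wavelengths_nm, n_layers, d_layers_nm, n_inc=1.0, n_sub=1.52):
    """Normal-incidence reflectance of a dielectric stack via characteristic matrices.

    n_layers    : refractive index of each layer (outermost first)
    d_layers_nm : physical thickness of each layer in nm
    """
    R = np.empty_like(wavelengths_nm, dtype=float)
    for i, lam in enumerate(wavelengths_nm):
        M = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers_nm):
            delta = 2.0 * np.pi * n * d / lam
            M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
        B, C = M @ np.array([1.0, n_sub])
        r = (n_inc * B - C) / (n_inc * B + C)
        R[i] = np.abs(r) ** 2
    return R

# Quarter-wave high/low stack at 550 nm as an illustration
lam0 = 550.0
n_stack = [2.35, 1.46] * 4
d_stack = [lam0 / (4 * n) for n in n_stack]
wl = np.linspace(400, 900, 251)
print(reflectance(wl, n_stack, d_stack).max())
```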

  3. Influence of the spectral distribution of light on the characteristics of photovoltaic panel. Comparison between simulation and experimental

    NASA Astrophysics Data System (ADS)

    Chadel, Meriem; Bouzaki, Mohammed Moustafa; Chadel, Asma; Petit, Pierre; Sawicki, Jean-Paul; Aillerie, Michel; Benyoucef, Boumediene

    2017-02-01

    We present and analyze experimental results obtained with a laboratory setup based on hardware and smart instrumentation for the complete study of the performance of PV panels, using an artificial radiation source (halogen lamps) for illumination. Combined with an accurate analysis, this experimental procedure allows the determination of effective performance under standard conditions, thanks to a simulation process originally developed in the Matlab software environment. The uniformity of the irradiated surface was checked by simulation of the light field. We studied the response of standard commercial photovoltaic panels under illumination whose spectrum was measured by a spectrometer for two sources, halogen lamps and sunlight. We then pay special attention to the influence of the spectral distribution of light on the characteristics of the photovoltaic panel, studied as a function of temperature and illumination level through dedicated measurements of the open-circuit voltage and short-circuit current.
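
    The reason the source spectrum matters can be sketched with the usual weighting of spectral irradiance by the panel's spectral response; the blackbody source spectra and the silicon-like response below are crude placeholders.

```python
import numpy as np

wl = np.linspace(300, 1200, 901)                      # nm

def blackbody_irradiance(wl_nm, T):
    """Relative blackbody spectral irradiance (arbitrary units)."""
    lam = wl_nm * 1e-9
    return 1.0 / (lam**5 * (np.exp(0.0143877 / (lam * T)) - 1.0))

# Crude stand-ins: sunlight ~ 5800 K, halogen lamp ~ 3200 K, and a silicon-like
# spectral response that rises with wavelength and cuts off near 1100 nm.
sun = blackbody_irradiance(wl, 5800.0)
halogen = blackbody_irradiance(wl, 3200.0)
sr = np.clip((wl - 350.0) / 600.0, 0.0, 1.0) * (wl < 1100.0)

def relative_isc(irradiance, spectral_response, wl_nm):
    # Isc is proportional to the integral of spectral irradiance times spectral
    # response; normalize sources to equal broadband irradiance first so that
    # only the spectral shape differs.
    e = irradiance / np.trapz(irradiance, wl_nm)
    return np.trapz(e * spectral_response, wl_nm)

print(relative_isc(sun, sr, wl) / relative_isc(halogen, sr, wl))
```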

  4. Analysis of scanner data for crop inventories

    NASA Technical Reports Server (NTRS)

    Horvath, R. (Principal Investigator); Cicone, R. C.; Kauth, R. J.; Malila, W. A.

    1981-01-01

    Progress and technical issues are reported in the development of corn/soybeans area estimation procedures for use on data from South America, with particular emphasis on Argentina. Aspects related to the supporting research section of the AgRISTARS Project discussed include: (1) multisegment corn/soybean estimation; (2) through the season separability of corn and soybeans within the U.S. corn belt; (3) TTS estimation; (4) insights derived from the baseline corn and soybean procedure; (5) small fields research; and (6) simulating the spectral appearance of wheat as a function of its growth and development. To assist the foreign commodity production forecasting, the performance of the baseline corn/soybean procedure was analyzed and the procedure modified. Fundamental limitations were found in the existing guidelines for discriminating these two crops. The temporal and spectral characteristics of corn and soybeans must be determined because other crops grow with them in Argentina. The state of software technology is assessed and the use of profile techniques for estimation is considered.

  5. Data Mining Meets HCI: Making Sense of Large Graphs

    DTIC Science & Technology

    2012-07-01

    graph algorithms, won the Open Source Software World Challenge, Silver Award. We have released Pegasus as free, open-source software, downloaded by ... METIS [77], spectral clustering [108], and the parameter-free “Cross-associations” (CA) [26]. Belief Propagation can also be used for clustering, as ... number of tools have been developed to support “landscape” views of information. These include WebBook and WebForager [23], which use a book metaphor

  6. Titanbrowse: a new paradigm for access, visualization and analysis of hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Penteado, Paulo F.

    2016-10-01

    Currently there are archives and tools to explore remote sensing imaging, but these lack some functionality needed for hyperspectral imagers: 1) Querying and serving only whole datacubes is not enough, since in each cube there is typically a large variation in observation geometry over the spatial pixels. Thus, often the most useful unit for selecting observations of interest is not a whole cube but rather a single spectrum. 2) Pixel-specific geometric data included in the standard pipelines is calculated at only one point per pixel. Particularly for selections of pixels from many different cubes, or observations near the limb, it is necessary to know the actual extent of each pixel. 3) Database queries need not only metadata, but also the spectral data. For instance, one query might look for atypical values of some band, or atypical relations between bands, denoting spectral features (such as ratios or differences between bands). 4) There is the need to evaluate arbitrary, dynamically-defined, complex functions of the data (beyond just simple arithmetic operations), both for selection in the queries, and for visualization, to interactively tune the queries to the observations of interest. 5) Making the most useful query for some analysis often requires interactive visualization integrated with data selection and processing, because the user needs to explore how different functions of the data vary over the observations without having to download data and import it into visualization software. 6) Complementary to interactive use, an API allowing programmatic access to the system is needed for systematic data analyses. 7) Direct access to calibrated and georeferenced data is needed, without the need to download data and software and learn to process it. We present titanbrowse, a database, exploration and visualization system for Cassini VIMS observations of Titan, designed to fulfill the aforementioned needs. While it originally ran on data on the user's computer, we are now developing an online version, so that users do not need to download software and data. The server, which we maintain, processes the queries and communicates the results to the client run by the user. http://ppenteado.net/titanbrowse.

  7. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source or sense of unified style or methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive for new researchers entering the field, and remain an obstacle for established groups hoping to contribute in a comparable manner to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundations of these software tools and libraries exist within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  8. Synthesis of Multispectral Bands from Hyperspectral Data: Validation Based on Images Acquired by AVIRIS, Hyperion, ALI, and ETM+

    NASA Technical Reports Server (NTRS)

    Blonksi, Slawomir; Gasser, Gerald; Russell, Jeffrey; Ryan, Robert; Terrie, Greg; Zanoni, Vicki

    2001-01-01

    Multispectral data requirements for Earth science applications are not always rigorously studied before a new remote sensing system is designed. A study of the spatial resolution, spectral bandpasses, and radiometric sensitivity requirements of real-world applications would focus the design on providing maximum benefits to the end-user community. To support systematic studies of multispectral data requirements, the Applications Research Toolbox (ART) has been developed at NASA's Stennis Space Center. The ART software allows users to create and assess simulated datasets while varying a wide range of system parameters. The simulations are based on data acquired by existing multispectral and hyperspectral instruments. The produced datasets can be further evaluated for specific end-user applications. Spectral synthesis of multispectral images from hyperspectral data is a key part of the ART software. In this process, hyperspectral image cubes are transformed into multispectral imagery without changes in spatial sampling and resolution. The transformation algorithm takes into account spectral responses of both the synthesized, broad, multispectral bands and the utilized, narrow, hyperspectral bands. To validate the spectral synthesis algorithm, simulated multispectral images are compared with images collected near-coincidentally by the Landsat 7 ETM+ and the EO-1 ALI instruments. Hyperspectral images acquired with the airborne AVIRIS instrument and with the Hyperion instrument onboard the EO-1 satellite were used as input data to the presented simulations.
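
    A simplified version of such a spectral synthesis step, a response-weighted average of the narrow bands, is sketched below; it approximates the hyperspectral bands as delta functions at their center wavelengths and is not the ART implementation.

```python
import numpy as np

def synthesize_band(hyper_cube, hyper_wl, broad_rsr_wl, broad_rsr):
    """Simulate one broad multispectral band from a hyperspectral cube.

    hyper_cube : array (rows, cols, n_hyper_bands) of radiance or reflectance
    hyper_wl   : center wavelengths of the hyperspectral bands
    broad_rsr_wl, broad_rsr : relative spectral response of the broad band
    """
    # Resample the broad band's response onto the hyperspectral band centers
    w = np.interp(hyper_wl, broad_rsr_wl, broad_rsr, left=0.0, right=0.0)
    w = w / w.sum()                       # normalize the weights
    return np.tensordot(hyper_cube, w, axes=([2], [0]))

# Toy example: a 10 nm-sampled cube synthesized into a broad red band
hyper_wl = np.arange(400, 1001, 10.0)
cube = np.random.rand(5, 5, hyper_wl.size)
rsr_wl = np.array([620.0, 630.0, 680.0, 690.0])
rsr = np.array([0.0, 1.0, 1.0, 0.0])
red = synthesize_band(cube, hyper_wl, rsr_wl, rsr)
print(red.shape)                          # (5, 5)
```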

  9. An automatic detection software for differential reflection spectroscopy

    NASA Astrophysics Data System (ADS)

    Yuksel, Seniha Esen; Dubroca, Thierry; Hummel, Rolf E.; Gader, Paul D.

    2012-06-01

    Recent terrorist attacks have created a need for a large-scale explosive detector. Our group has developed differential reflection spectroscopy, which can detect explosive residue on surfaces such as parcels, cargo and luggage. In short, broadband ultraviolet and visible light is shone onto a material (such as a parcel) moving on a conveyor belt. Upon reflection off the surface, the light intensity is recorded with a spectrograph (a spectrometer in combination with a CCD camera). This reflected light intensity is then subtracted from and normalized with the next data point collected, resulting in differential reflection spectra in the 200-500 nm range. Explosives show spectral fingerprints at specific wavelengths; for example, the spectrum of 2,4,6-trinitrotoluene (TNT) shows an absorption edge at 420 nm. Additionally, we have developed automated software that detects the characteristic features of explosives. One of the biggest challenges for the algorithm is to reach a practical limit of detection. In this study, we introduce our automatic detection software, which is a combination of principal component analysis and support vector machines. Finally, we present the sensitivity and selectivity response of our algorithm as a function of the amount of explosive detected on a given surface.
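
    A generic PCA-plus-SVM pipeline of the kind described can be sketched with scikit-learn as follows; the synthetic spectra and the number of retained components are assumptions, not the authors' trained detector.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic differential reflection spectra, 200-500 nm: "contaminated" spectra
# carry a weak absorption edge near 420 nm, "clean" spectra do not.
rng = np.random.default_rng(0)
wl = np.linspace(200, 500, 301)
n = 200
clean = rng.normal(0.0, 1.0, size=(n, wl.size))
edge = 0.5 / (1.0 + np.exp(-(wl - 420.0)))          # sigmoid absorption edge
contaminated = clean + edge
X = np.vstack([clean, contaminated])
y = np.concatenate([np.zeros(n), np.ones(n)])

clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```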

  10. Mass spectrometer output file format mzML.

    PubMed

    Deutsch, Eric W

    2010-01-01

    Mass spectrometry is an important technique for analyzing proteins and other biomolecular compounds in biological samples. Each of the vendors of these mass spectrometers uses a different proprietary binary output file format, which has hindered data sharing and the development of open source software for downstream analysis. The solution has been to develop, with the full participation of academic researchers as well as software and hardware vendors, an open XML-based format for encoding mass spectrometer output files, and then to write software to use this format for archiving, sharing, and processing. This chapter presents the various components and information available for this format, mzML. In addition to the XML schema that defines the file structure, a controlled vocabulary provides clear terms and definitions for the spectral metadata, and a semantic validation rules mapping file allows the mzML semantic validator to insure that an mzML document complies with one of several levels of requirements. Complete documentation and example files insure that the format may be uniformly implemented. At the time of release, there already existed several implementations of the format and vendors have committed to supporting the format in their products.

  11. Optimization of the coherence function estimation for multi-core central processing unit

    NASA Astrophysics Data System (ADS)

    Cheremnov, A. G.; Faerman, V. A.; Avramchuk, V. S.

    2017-02-01

    The paper considers the use of parallel processing on a multi-core central processing unit for optimization of the coherence function evaluation arising in digital signal processing. The coherence function, along with other methods of spectral analysis, is commonly used for vibration diagnosis of rotating machinery and its particular nodes. An algorithm is given for evaluating the function for signals represented by digital samples. The algorithm is analyzed with respect to its software implementation and computational problems. Optimization measures are described, including algorithmic, architectural and compiler optimizations, and their results are assessed for multi-core processors from different manufacturers. The speed-up of parallel execution with respect to sequential execution was studied, and results are presented for Intel Core i7-4720HQ and AMD FX-9590 processors. The results show comparatively high efficiency of the optimization measures taken. In particular, acceleration indicators and average CPU utilization were significantly improved, showing a high degree of parallelism in the constructed computation. The developed software underwent state registration and will be used as part of a software and hardware solution for rotating machinery fault diagnosis and pipeline leak location with the acoustic correlation method.
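
    A simple way to exploit multiple cores for this task is to distribute independent channel-pair coherence estimates across processes, as sketched below; this illustrates the computation only and is not the authors' optimized implementation.

```python
import numpy as np
from itertools import combinations
from multiprocessing import Pool
from scipy.signal import coherence

FS = 10_000            # sampling rate, Hz (placeholder)

def pair_coherence(args):
    """Welch-averaged magnitude-squared coherence for one channel pair."""
    x, y = args
    f, Cxy = coherence(x, y, fs=FS, nperseg=2048)
    return f, Cxy

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    channels = rng.normal(size=(8, FS * 10))        # 8 synthetic vibration channels
    pairs = [(channels[i], channels[j]) for i, j in combinations(range(8), 2)]

    # Distribute the independent pair computations over the available cores
    with Pool() as pool:
        results = pool.map(pair_coherence, pairs)
    print(len(results), "coherence spectra computed")
```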

  12. Classification of communication signals of the little brown bat

    NASA Astrophysics Data System (ADS)

    Melendez, Karla V.; Jones, Douglas L.; Feng, Albert S.

    2005-09-01

    Little brown bats, Myotis lucifugus, are known for their ability to echolocate and utilize their echolocation system to navigate, locate, and identify prey. Their echolocation signals have been characterized in detail, but their communication signals are poorly understood despite their widespread use during social interactions. The goal of this study was to characterize the communication signals of little brown bats. Sound recordings were made overnight on five individual bats (housed separately from a large group of captive bats) for 7 nights, using a Pettersson D240x ultrasound bat detector and a Nagra ARES-BB digital recorder. The spectral and temporal characteristics of recorded sounds were first analyzed using BATSOUND software from Pettersson. Sounds were first classified by visual observation of the calls' temporal pattern and spectral composition, and later using an automatic classification scheme based on multivariate statistical parameters in MATLAB. Human- and machine-based analysis revealed five discrete classes of the bats' communication signals: downward frequency-modulated calls, constant frequency calls, broadband noise bursts, broadband chirps, and broadband click trains. Future studies will focus on analysis of the calls' spectrotemporal modulations to discriminate any subclasses that may exist. [Research supported by Grant R01-DC-04998 from the National Institute for Deafness and Communication Disorders.]

  13. Hyperspectral imaging applied to complex particulate solids systems

    NASA Astrophysics Data System (ADS)

    Bonifazi, Giuseppe; Serranti, Silvia

    2008-04-01

    HyperSpectral Imaging (HSI) is based on the utilization of an integrated hardware and software (HW&SW) platform embedding conventional imaging and spectroscopy to attain both spatial and spectral information from an object. Although HSI was originally developed for remote sensing, it has recently emerged as a powerful process analytical tool for non-destructive analysis in many research and industrial sectors. The possibility of applying on-line HSI-based techniques to identify and quantify specific characteristics of particulate solid systems is presented and critically evaluated. The originally developed HSI-based logic can be profitably applied to develop fast, reliable and low-cost strategies for: i) quality control of particulate products that must comply with specific chemical, physical and biological constraints, ii) performance evaluation of manufacturing strategies related to processing chains and/or real-time tuning of operative variables and iii) classification-sorting actions aimed at recognizing and separating different particulate solid products. Case studies related to recent advances in the application of HSI to different industrial sectors, such as agriculture, food, pharmaceuticals, and solid waste handling and recycling, and addressed to specific goals such as contaminant detection, defect identification, constituent analysis and quality evaluation, are described, based on the authors' originally developed applications.

  14. SCPS: a fast implementation of a spectral method for detecting protein families on a genome-wide scale.

    PubMed

    Nepusz, Tamás; Sasidharan, Rajkumar; Paccanaro, Alberto

    2010-03-09

    An important problem in genomics is the automatic inference of groups of homologous proteins from pairwise sequence similarities. Several approaches have been proposed for this task which are "local" in the sense that they assign a protein to a cluster based only on the distances between that protein and the other proteins in the set. It was shown recently that global methods such as spectral clustering have better performance on a wide variety of datasets. However, currently available implementations of spectral clustering methods mostly consist of a few loosely coupled Matlab scripts that assume a fair amount of familiarity with Matlab programming and hence they are inaccessible for large parts of the research community. SCPS (Spectral Clustering of Protein Sequences) is an efficient and user-friendly implementation of a spectral method for inferring protein families. The method uses only pairwise sequence similarities, and is therefore practical when only sequence information is available. SCPS was tested on difficult sets of proteins whose relationships were extracted from the SCOP database, and its results were extensively compared with those obtained using other popular protein clustering algorithms such as TribeMCL, hierarchical clustering and connected component analysis. We show that SCPS is able to identify many of the family/superfamily relationships correctly and that the quality of the obtained clusters as indicated by their F-scores is consistently better than all the other methods we compared it with. We also demonstrate the scalability of SCPS by clustering the entire SCOP database (14,183 sequences) and the complete genome of the yeast Saccharomyces cerevisiae (6,690 sequences). Besides the spectral method, SCPS also implements connected component analysis and hierarchical clustering, it integrates TribeMCL, it provides different cluster quality tools, it can extract human-readable protein descriptions using GI numbers from NCBI, it interfaces with external tools such as BLAST and Cytoscape, and it can produce publication-quality graphical representations of the clusters obtained, thus constituting a comprehensive and effective tool for practical research in computational biology. Source code and precompiled executables for Windows, Linux and Mac OS X are freely available at http://www.paccanarolab.org/software/scps.
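
    The core operation, spectral clustering on a precomputed pairwise similarity matrix, can be sketched with scikit-learn as follows; the toy similarity matrix stands in for BLAST-derived similarities, and this is not SCPS itself.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Toy pairwise similarity matrix for 6 "proteins": two blocks of 3 with high
# within-block similarity (standing in for sequence-derived similarities).
S = np.array([
    [1.0, 0.9, 0.8, 0.1, 0.0, 0.1],
    [0.9, 1.0, 0.85, 0.0, 0.1, 0.0],
    [0.8, 0.85, 1.0, 0.1, 0.0, 0.1],
    [0.1, 0.0, 0.1, 1.0, 0.9, 0.8],
    [0.0, 0.1, 0.0, 0.9, 1.0, 0.9],
    [0.1, 0.0, 0.1, 0.8, 0.9, 1.0],
])

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(S)
print(labels)      # two families, e.g. [0 0 0 1 1 1]
```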

  15. Development of a low-cost, 11 µm spectral domain optical coherence tomography surface profilometry prototype

    NASA Astrophysics Data System (ADS)

    Suliali, Nyasha J.; Baricholo, Peter; Neethling, Pieter H.; Rohwer, Erich G.

    2017-06-01

    A spectral-domain Optical Coherence Tomography (OCT) surface profilometry prototype has been developed for the purpose of surface metrology of optical elements. The prototype consists of a light source, spectral interferometer, sample fixture and software currently running on Microsoft® Windows platforms. In this system, a broadband light emitting diode beam is focused into a Michelson interferometer with a plane mirror as its sample fixture. At the interferometer output, spectral interferograms of broadband sources were measured using a Czerny-Turner mount monochromator with a 2048-element complementary metal oxide semiconductor linear array as the detector. The software performs importation and interpolation of interferometer spectra to pre-condition the data for image computation. One dimensional axial OCT images were computed by Fourier transformation of the measured spectra. A first reflection surface profilometry (FRSP) algorithm was then formulated to perform imaging of step-function-surfaced samples. The algorithm re-constructs two dimensional colour-scaled slice images by concatenation of 21 and 13 axial scans to form a 10 mm and 3.0 mm slice respectively. Measured spectral interferograms, computed interference fringe signals and depth reflectivity profiles were comparable to simulations and correlated to displacements of a single reflector linearly translated about the arm null-mismatch point. Surface profile images of a double-step-function-surfaced sample, embedded with inclination and crack detail were plotted with an axial resolution of 11 μm. The surface shape, defects and misalignment relative to the incident beam were detected to the order of a micron, confirming high resolution of the developed system as compared to electro-mechanical surface profilometry techniques.
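
    The central processing step, resampling the spectral interferogram to uniform wavenumber and Fourier transforming it into a depth reflectivity profile (A-scan), is sketched below for a synthetic single reflector; the wavelength range and path mismatch are placeholders, not the prototype's parameters.

```python
import numpy as np

# Synthetic spectral interferogram for a single reflector at path mismatch z0
n_px = 2048
wl = np.linspace(820e-9, 880e-9, n_px)            # detector wavelength axis, m
k = 2 * np.pi / wl                                # nonuniform wavenumber axis
z0 = 200e-6                                       # path mismatch, m
source = np.exp(-((wl - 850e-9) / 15e-9) ** 2)    # broadband LED envelope
interferogram = source * (1.0 + 0.5 * np.cos(2 * k * z0))

# Resample to a uniform wavenumber grid before the FFT
k_uniform = np.linspace(k.min(), k.max(), n_px)
spec = np.interp(k_uniform, k[::-1], interferogram[::-1])

# Remove DC, apodize, transform: the peak position gives the reflector depth
ascan = np.abs(np.fft.rfft((spec - spec.mean()) * np.hanning(n_px)))
dz = np.pi / (k_uniform.max() - k_uniform.min())  # depth sampling interval
depth = np.arange(ascan.size) * dz
peak = depth[np.argmax(ascan[1:]) + 1]
print(f"peak at {peak * 1e6:.0f} um (expected ~{z0 * 1e6:.0f} um)")
```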

  16. HydroClimATe: hydrologic and climatic analysis toolkit

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.

    2014-01-01

    The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
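
    One of the listed methods, singular spectrum analysis, is sketched below on a synthetic series: build the trajectory matrix, take its SVD, and reconstruct selected components by anti-diagonal averaging. This is an illustration only, not HydroClimATe code.

```python
import numpy as np

def ssa_reconstruct(x, window, components):
    """Reconstruct selected singular spectrum analysis components of series x."""
    n = x.size
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Keep the chosen elementary matrices and average anti-diagonals
    X_sel = (U[:, components] * s[components]) @ Vt[components, :]
    recon = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            recon[i + j] += X_sel[i, j]
            counts[i + j] += 1
    return recon / counts

# Toy groundwater-level-like series: decadal oscillation plus noise
t = np.arange(600)                       # months
series = np.sin(2 * np.pi * t / 120) + 0.5 * np.random.randn(t.size)
smooth = ssa_reconstruct(series, window=60, components=[0, 1])
print(np.corrcoef(smooth, np.sin(2 * np.pi * t / 120))[0, 1])
```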

  17. The initial design of LAPAN's IR micro bolometer using mission analysis process

    NASA Astrophysics Data System (ADS)

    Bustanul, A.; Irwan, P.; M. T., Andi; Firman, B.

    2016-11-01

    As a new player in the infrared (IR) sector, an uncooled, small, and lightweight IR micro bolometer has been chosen as one of the payloads for LAPAN's next micro satellite project. Driven by the desire to create our own IR micro bolometer, a mission analysis design procedure has been applied. After tracing all possible missions, Planck's and Wien's laws for a black body, the temperature responsivity (TR), and the sub-pixel response were utilized to determine the appropriate spectral radiance. The 3.8-4 μm wavelength range was found suitable for detecting wild fires (forest fires) and active volcanoes, two major problems faced by Indonesia. To strengthen and broaden the result, an iteration process was used throughout. The analysis then continued by calculating the ground pixel size, pixel IFOV, swath width, and focal length; the required ground resolution is at least 400 m. The further procedure covered integrated optical design, in which we combined optical design software (Zemax) with mechanical analysis software for structural and thermal analysis, such as Nastran and Thermal Desktop / Sinda Fluint. The integration process was intended to produce a high-performance optical system for our IR micro bolometer that can be used in an extreme environment. The results of all these analyses, both as graphs and as measurements, show that the initial design of LAPAN's IR micro bolometer meets the determined requirements, although further evaluation (iteration) is needed. This paper describes the initial design of LAPAN's IR micro bolometer using the mission analysis process.
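
    The radiometric and geometric sizing mentioned (Planck band radiance in the 3.8-4 μm band and ground pixel size from altitude and IFOV) can be sketched as follows; the altitude, pixel pitch, and focal length are placeholder values, not LAPAN's design figures.

```python
import numpy as np

C1 = 1.191042e8    # W um^4 m^-2 sr^-1
C2 = 1.438777e4    # um K

def planck_radiance(lam_um, T):
    """Blackbody spectral radiance (W m^-2 sr^-1 um^-1)."""
    return C1 / (lam_um**5 * (np.exp(C2 / (lam_um * T)) - 1.0))

def band_radiance(T, lam_lo=3.8, lam_hi=4.0, n=200):
    lam = np.linspace(lam_lo, lam_hi, n)
    return np.trapz(planck_radiance(lam, T), lam)

# In-band radiance contrast between normal ground (~300 K) and a fire (~600 K)
print(band_radiance(600.0) / band_radiance(300.0))

# Ground pixel size from orbit altitude and a per-pixel IFOV
pixel_pitch_m = 35e-6            # typical microbolometer pitch (placeholder)
focal_length_m = 0.044           # placeholder optics
altitude_m = 500e3               # placeholder micro-satellite orbit
ifov_rad = pixel_pitch_m / focal_length_m
print("ground pixel size:", altitude_m * ifov_rad, "m")   # ~400 m
```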

  18. Sum and mean. Standard programs for activation analysis.

    PubMed

    Lindstrom, R M

    1994-01-01

    Two computer programs in use for over a decade in the Nuclear Methods Group at NIST illustrate the utility of standard software: programs widely available and widely used, in which (ideally) well-tested public algorithms produce results that are well understood, and thereby capable of comparison, within the community of users. Sum interactively computes the position, net area, and uncertainty of the area of spectral peaks, and can give better results than automatic peak search programs when peaks are very small, very large, or unusually shaped. Mean combines unequal measurements of a single quantity, tests for consistency, and obtains the weighted mean and six measures of its uncertainty.
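
    The calculation performed by a program like Mean, an inverse-variance weighted mean with a chi-square consistency test, can be sketched as follows; the two uncertainty measures shown are only a subset of the several mentioned above.

```python
import numpy as np
from scipy.stats import chi2

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean with internal and external uncertainties
    and a chi-square consistency test."""
    x = np.asarray(values, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(w * x) / np.sum(w)
    internal = np.sqrt(1.0 / np.sum(w))                 # from quoted sigmas only
    chisq = np.sum(w * (x - mean) ** 2)
    dof = x.size - 1
    external = internal * np.sqrt(chisq / dof)          # scaled by observed scatter
    p_value = chi2.sf(chisq, dof)                       # consistency test
    return mean, internal, external, chisq, p_value

# Four measurements of the same quantity (arbitrary units)
print(weighted_mean([10.2, 9.8, 10.5, 10.1], [0.2, 0.3, 0.25, 0.2]))
```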

  19. The Chandra X-ray Observatory: An Astronomical Facility Available to the World

    NASA Technical Reports Server (NTRS)

    Smith, Randall K.

    2006-01-01

    The Chandra X-ray Observatory, one of NASA's "Great Observatories," provides high angular and spectral resolution X-ray data which is freely available to all. In this review I describe the instruments on Chandra along with their current calibration, as well as the Chandra proposal system, the freely-available Chandra analysis software package CIAO, and the Chandra archive. As Chandra is in its 6th year of operation, the archive already contains calibrated observations of a large range of X-ray sources. The Chandra X-ray Center is committed to assisting astronomers from any country who wish to use data from the archive or propose for observations.

  20. Observations of candidate oscillating eclipsing binaries and two newly discovered pulsating variables

    NASA Astrophysics Data System (ADS)

    Liakos, A.; Niarchos, P.

    2009-03-01

    CCD observations of 24 eclipsing binary systems with spectral types ranging between A0 and F0, candidates for containing pulsating components, were obtained. Appropriate exposure times in one or more photometric filters were used so that short-periodic pulsations could be detected. Their light curves were analyzed using the Period04 software in order to search for pulsational behaviour. Two new variable stars, namely GSC 2673-1583 and GSC 3641-0359, were discovered as by-products during the observations of eclipsing variables. The Fourier analysis of the observations of each star, the dominant pulsation frequencies and the derived frequency spectra are also presented.

  1. ALDAS user's manual

    NASA Technical Reports Server (NTRS)

    Watts, Michael E.

    1991-01-01

    The Acoustic Laboratory Data Acquisition System (ALDAS) is an inexpensive, transportable means to digitize and analyze data. The system is based on the Macintosh II family of computers, with internal analog-to-digital boards providing four channels of simultaneous data acquisition at rates up to 50,000 samples/sec. The ALDAS software package, written for use with rotorcraft acoustics, performs automatic acoustic calibration of channels, data display, two types of cycle averaging, and spectral amplitude analysis. The program can use data obtained from internal analog-to-digital conversion, or discrete external data imported in ASCII format. All aspects of ALDAS can be improved as new hardware becomes available and new features are introduced into the code.

  2. Whistle register: a preliminary investigation by HSDI visualization and acoustics on female cases

    NASA Astrophysics Data System (ADS)

    Di Corcia, Antonio; Fussi, Franco

    2012-02-01

    In this study we investigated the laryngeal behaviors involved in the vocal production of the highest female vocal ranges: the Flute in M3 Register, the Whistle Register, and a register newly formulated by us, the Hiss Register. Observations were carried out with stroboscopy and High Speed Digital Imaging, and with spectrographic and psycho-acoustic analysis by means of a software system having a wide spectral range (0-20,000 Hz). Results indicate that at the highest pitches vocal fold vibration is absent or significantly reduced and glottic contact is incomplete. These acoustic forms of extreme pitch production comprised intra-harmonic noise and overtones within the 10 to 18 kHz range.

  3. Width of anterior chamber angle determined by OCT, and correlation to refraction and age in a German working population: the MIPH Eye&Health Study.

    PubMed

    Vossmerbaeumer, Urs; Schuster, Alexander K; Fischer, Joachim E

    2013-12-01

    Optical coherence tomography (OCT) of the anterior segment allows quantitative analysis of the geometry of the chamber angle. We performed bilateral spectral-domain OCT measurements in healthy, emmetropic, hyperopic, and myopic subjects to establish correlations between the width of the angle, the refraction, and intraocular pressure of the test persons. Out of 4,617 eyes (2,309 subjects), those with refractive errors of < -4 or > +3 diopters were identified by objective refraction measurement (KR-8800 Kerato-Refractometer, Topcon Inc., Japan) and examined using the anterior segment mode of a spectral-domain 3D OCT-2000 (Topcon Inc., Japan). Non-contact tonometry was performed (CT-80, Topcon Inc., Japan). One hundred and eight eyes of 54 emmetropic subjects (± 0.5 dpt) served as the reference group. Previous ocular surgery was an exclusion criterion in all groups. Width of the chamber angle was determined using semi-automated software tools, and statistical analysis of the data (Pearson correlation, ANOVA with post-hoc test and Bonferroni correction, regression analysis) was performed using SPSS software (SPSS 19.0, Chicago, IL, USA). Six hundred and sixty-eight eyes of 398 persons (292 male, 96 female) were included in the study. Mean hyperopic refraction was +4.24 (+3 to +7.75) dpt, mean myopic refraction was -5.86 (-4 to -11.75) dpt. Valid chamber angle OCT measurements could be obtained from 50 (69.4 %) hyperopic and 400 (71.4 %) myopic eyes meeting the inclusion criteria. The mean width of the chamber angle was determined as 31.8° (range: 13.5 to 45.6, SD 7.49) in the hyperopic group, 40.8° (range: 19.3 to 66.0, SD 8.1) in the myopic group, and 36.3° (range: 21.1 to 51.8, SD 6.8) in the emmetropic reference group. Correlation was highly significant (p < 0.001) between refractive error and the aperture of the chamber angle as measured from OCT. The association of the intraocular pressure and the refraction was also highly significant (p < 0.001) for the three groups. The spectral-domain OCT yielded measurements that could be used for digital analysis of the chamber angle geometry. Our results highlight the correlation between refraction and aperture of the angle in hyperopia and myopia as determined by the 3D OCT-2000: hyperopia is associated with a narrower chamber angle, myopia with a wider aperture of the angle.

  4. NCAR global model topography generation software for unstructured grids

    NASA Astrophysics Data System (ADS)

    Lauritzen, P. H.; Bacmeister, J. T.; Callaghan, P. F.; Taylor, M. A.

    2015-06-01

    It is the purpose of this paper to document the NCAR global model topography generation software for unstructured grids. Given a model grid, the software computes the fraction of the grid box covered by land, the gridbox mean elevation, and associated sub-grid scale variances commonly used for gravity wave and turbulent mountain stress parameterizations. The software supports regular latitude-longitude grids as well as unstructured grids; e.g. icosahedral, Voronoi, cubed-sphere and variable resolution grids. As an example application and in the spirit of documenting model development, exploratory simulations illustrating the impacts of topographic smoothing with the NCAR-DOE CESM (Community Earth System Model) CAM5.2-SE (Community Atmosphere Model version 5.2 - Spectral Elements dynamical core) are shown.
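
    For each target grid cell the quantities listed above reduce to averages over the high-resolution source topography falling inside the cell. The sketch below shows that binning step for a regular latitude-longitude target grid only; the actual software also handles unstructured grids and proper area weighting, which are omitted here.

        import numpy as np

        def bin_topography(lat_src, lon_src, elev_src, landmask_src, lat_edges, lon_edges):
            """Grid-box land fraction, mean elevation and sub-grid elevation variance on a
            regular lat-lon target grid (simplified sketch, no area weighting)."""
            nlat, nlon = len(lat_edges) - 1, len(lon_edges) - 1
            land_frac = np.zeros((nlat, nlon))
            mean_elev = np.zeros((nlat, nlon))
            var_elev = np.zeros((nlat, nlon))
            i = np.digitize(lat_src, lat_edges) - 1    # target-cell row of each source point
            j = np.digitize(lon_src, lon_edges) - 1    # target-cell column of each source point
            for a in range(nlat):
                for b in range(nlon):
                    sel = (i == a) & (j == b)
                    if sel.any():
                        land_frac[a, b] = landmask_src[sel].mean()
                        mean_elev[a, b] = elev_src[sel].mean()
                        var_elev[a, b] = elev_src[sel].var()
            return land_frac, mean_elev, var_elev

        # Tiny synthetic usage: random source points binned onto a 1-degree target grid
        rng = np.random.default_rng(0)
        lat, lon = rng.uniform(-5, 5, 10000), rng.uniform(0, 10, 10000)
        elev, land = rng.uniform(0, 3000, 10000), (rng.random(10000) > 0.3).astype(float)
        lf, me, ve = bin_topography(lat, lon, elev, land, np.arange(-5, 6), np.arange(0, 11))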

  5. An assessment of AVIRIS data for hydrothermal alteration mapping in the Goldfield Mining District, Nevada

    NASA Technical Reports Server (NTRS)

    Carrere, Veronique; Abrams, Michael J.

    1988-01-01

    Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data were acquired over the Goldfield Mining District, Nevada, in September 1987. Goldfield is one of the group of large epithermal precious metal deposits in Tertiary volcanic rocks, associated with silicic volcanism and caldera formation. Hydrothermal alteration consists of silicification along fractures, advanced argillic and argillic zones further away from veins, and more widespread propylitic zones. An evaluation of AVIRIS data quality was performed. Faults in the data, related to engineering problems and a different behavior of the instrument while on board the U2, were encountered. Consequently, a decision was made to use raw data and correct them only for dark current variations and detector read-out delays. New software was written to that effect. Atmospheric correction was performed using the flat field correction technique. Analysis of the data was then performed to extract spectral information, mainly concentrating on the 2 to 2.45 micron window, as the alteration minerals of interest have their distinctive spectral reflectance features in this region. Principally kaolinite and alunite spectra were clearly obtained. Mapping of the different minerals and alteration zones was attempted using ratios and clustering techniques. Poor signal-to-noise performance of the instrument and the lack of appropriate software prevented the production of an alteration map of the area. Spectra extracted locally from the AVIRIS data were checked in the field by collecting representative samples of the outcrops.
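
    The flat field correction mentioned above divides every pixel spectrum by the mean spectrum of a bright, spectrally bland area of the scene, so that atmospheric and instrument features cancel and relative absorption features (such as the 2.0-2.45 micron clay and alunite bands) remain. A minimal version of that operation, with an invented region of interest, might look like this:

        import numpy as np

        def flat_field_correct(cube, flat_rows, flat_cols):
            """Divide each pixel spectrum by the mean spectrum of a bright, spectrally flat
            area of the scene; cube has shape (rows, cols, bands)."""
            flat = cube[flat_rows[0]:flat_rows[1], flat_cols[0]:flat_cols[1], :].mean(axis=(0, 1))
            flat = np.where(flat > 0, flat, 1.0)        # guard against divide-by-zero bands
            return cube / flat

        # Usage with a synthetic cube; the bright "flat field" region chosen here is hypothetical
        cube = np.random.rand(100, 100, 224) + 1.0
        relative_reflectance = flat_field_correct(cube, (10, 30), (10, 30))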

  6. Managing distributed software development in the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Plante, Raymond L.; Boneventura, Nina; Busko, Ivo; Cresitello-Dittmar, Mark; D'Abrusco, Raffaele; Doe, Stephen; Ebert, Rick; Laurino, Omar; Pevunova, Olga; Refsdal, Brian; Thomas, Brian

    2012-09-01

    The U.S. Virtual Astronomical Observatory (VAO) is a product-driven organization that provides new scientific research capabilities to the astronomical community. Software development for the VAO follows a lightweight framework that guides development of science applications and infrastructure. Challenges to be overcome include distributed development teams, part-time efforts, and highly constrained schedules. We describe the process we followed to conquer these challenges while developing Iris, the VAO application for analysis of 1-D astronomical spectral energy distributions (SEDs). Iris was successfully built and released in less than a year with a team distributed across four institutions. The project followed existing International Virtual Observatory Alliance inter-operability standards for spectral data and contributed a SED library as a by-product of the project. We emphasize lessons learned that will be folded into future development efforts. In our experience, a well-defined process that provides guidelines to ensure the project is cohesive and stays on track is key to success. Internal product deliveries with a planned test and feedback loop are critical. Release candidates are measured against use cases established early in the process, and provide the opportunity to assess priorities and make course corrections during development. Also key is the participation of a stakeholder such as a lead scientist who manages the technical questions, advises on priorities, and is actively involved as a lead tester. Finally, frequent scheduled communications (for example a bi-weekly tele-conference) assure issues are resolved quickly and the team is working toward a common vision.

  7. Zachary D. Barker: Final DHS HS-STEM Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barker, Z D

    Working at Lawrence Livermore National Laboratory (LLNL) this summer has provided a very unique and special experience for me. I feel that the research opportunities given to me have allowed me to significantly benefit my research group, the laboratory, the Department of Homeland Security, and the Department of Energy. The researchers in the Single Particle Aerosol Mass Spectrometry (SPAMS) group were very welcoming and clearly wanted me to get the most out of my time in Livermore. I feel that my research partner, Veena Venkatachalam of MIT, and I have been extremely productive in meeting our research goals throughout this summer, and have learned much about working in research at a national laboratory such as Lawrence Livermore. I have learned much about the technical aspects of research while working at LLNL; however, I have also gained important experience and insight into how research groups at national laboratories function. I believe that this internship has given me valuable knowledge and experience which will certainly help my transition to graduate study and a career in engineering. My work with Veena Venkatachalam in the SPAMS group this summer has focused on two major projects. Initially, we were tasked with an analysis of data collected by the group this past spring in a large public environment. The SPAMS instrument was deployed for over two months, collecting information on many of the ambient air particles circulating through the area. Our analysis of the particle data collected during this deployment concerned several aspects, including finding groups, or clusters, of particles that seemed to appear more during certain times of day, analyzing the mass spectral data of clusters and comparing them with mass spectral data of known substances, and comparing the real-time detection capability of the SPAMS instrument with that of a commercially available biological detection instrument. This analysis was performed in support of a group report to the Department of Homeland Security on the results of the deployment. The analysis of the deployment data revealed some interesting applications of the SPAMS instrument to homeland security situations. Using software developed in-house by SPAMS group member Dr. Paul Steele, Veena and I were able to cluster a subset of data over a certain timeframe (ranging from a single hour to an entire week). The software used makes clusters based on the mass spectral characteristics of each particle in the data set, as well as other parameters. By looking more closely at the characteristics of individual clusters, including the mass spectra, conclusions could be made about what these particles are. This was achieved partially through examination and discussion of the mass spectral data with the members of the SPAMS group, as well as through comparison with known mass spectra collected from substances tested in the laboratory. In many cases, broad conclusions could be drawn about the identity of a cluster of particles.

  18. Nektar++: An open-source spectral/hp element framework

    NASA Astrophysics Data System (ADS)

    Cantwell, C. D.; Moxey, D.; Comerford, A.; Bolis, A.; Rocco, G.; Mengaldo, G.; De Grazia, D.; Yakovlev, S.; Lombard, J.-E.; Ekelschot, D.; Jordi, B.; Xu, H.; Mohamied, Y.; Eskilsson, C.; Nelson, B.; Vos, P.; Biotto, C.; Kirby, R. M.; Sherwin, S. J.

    2015-07-01

    Nektar++ is an open-source software framework designed to support the development of high-performance scalable solvers for partial differential equations using the spectral/hp element method. High-order methods are gaining prominence in several engineering and biomedical applications due to their improved accuracy over low-order techniques at reduced computational cost for a given number of degrees of freedom. However, their proliferation is often limited by their complexity, which makes these methods challenging to implement and use. Nektar++ is an initiative to overcome this limitation by encapsulating the mathematical complexities of the underlying method within an efficient C++ framework, making the techniques more accessible to the broader scientific and industrial communities. The software supports a variety of discretisation techniques and implementation strategies, supporting methods research as well as application-focused computation, and the multi-layered structure of the framework allows the user to embrace as much or as little of the complexity as they need. The libraries capture the mathematical constructs of spectral/hp element methods, while the associated collection of pre-written PDE solvers provides out-of-the-box application-level functionality and a template for users who wish to develop solutions for addressing questions in their own scientific domains.

  9. From pixel to voxel: a deeper view of biological tissue by 3D mass spectral imaging

    PubMed Central

    Ye, Hui; Greer, Tyler; Li, Lingjun

    2011-01-01

    Three dimensional mass spectral imaging (3D MSI) is an exciting field that grants the ability to study a broad mass range of molecular species ranging from small molecules to large proteins by creating lateral and vertical distribution maps of select compounds. Although the general premise behind 3D MSI is simple, factors such as choice of ionization method, sample handling, software considerations and many others must be taken into account for the successful design of a 3D MSI experiment. This review provides a brief overview of ionization methods, sample preparation, software types and technological advancements driving 3D MSI research of a wide range of low- to high-mass analytes. Future perspectives in this field are also provided, concluding that continued development of this powerful analytical tool promises ever-growing applications in the biomedical field. PMID:21320052

  10. WDR-PK-AK-018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollister, R

    2009-08-26

    Method - CES SOP-HW-P556 'Field and Bulk Gamma Analysis'. Detector - High-purity germanium, 40% relative efficiency. Calibration - The detector was calibrated on February 8, 2006 using a NIST-traceable sealed source, and the calibration was verified using an independent sealed source. Count Time and Geometry - The sample was counted for 20 minutes at 72 inches from the detector. A lead collimator was used to limit the field-of-view to the region of the sample. The drum was rotated 180 degrees halfway through the count time. Date and Location of Scans - June 1, 2006 in Building 235 Room 1136. Spectral Analysis - Spectra were analyzed with ORTEC GammaVision software. Matrix and geometry corrections were calculated using ORTEC Isotopic software. A background spectrum was measured at the counting location. No man-made radioactivity was observed in the background. Results were determined from the sample spectra without background subtraction. Minimum detectable activities were calculated by the Nureg 4.16 method. Results - Detected Pu-238, Pu-239, Am-241 and Am-243.

  11. Conceptual design of the CZMIL data processing system (DPS): algorithms and software for fusing lidar, hyperspectral data, and digital images

    NASA Astrophysics Data System (ADS)

    Park, Joong Yong; Tuell, Grady

    2010-04-01

    The Data Processing System (DPS) of the Coastal Zone Mapping and Imaging Lidar (CZMIL) has been designed to automatically produce a number of novel environmental products through the fusion of Lidar, spectrometer, and camera data in a single software package. These new products significantly transcend use of the system as a bathymeter, and support use of CZMIL as a complete coastal and benthic mapping tool. The DPS provides a spinning globe capability for accessing data files; automated generation of combined topographic and bathymetric point clouds; a fully-integrated manual editor and data analysis tool; automated generation of orthophoto mosaics; automated generation of reflectance data cubes from the imaging spectrometer; a coupled air-ocean spectral optimization model producing images of chlorophyll and CDOM concentrations; and a fusion based capability to produce images and classifications of the shallow water seafloor. Adopting a multitasking approach, we expect to achieve computation of the point clouds, DEMs, and reflectance images at a 1:1 processing to acquisition ratio.

  12. THE DiskMass SURVEY. III. STELLAR KINEMATICS VIA CROSS-CORRELATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westfall, Kyle B.; Bershady, Matthew A.; Verheijen, Marc A. W., E-mail: westfall@astro.rug.nl, E-mail: mab@astro.wisc.edu, E-mail: verheyen@astro.rug.nl

    2011-03-15

    We describe a new cross-correlation (CC) approach used by our survey to derive stellar kinematics from galaxy-continuum spectroscopy. This approach adopts the formal error analysis derived by Statler, but properly handles spectral masks. Thus, we address the primary concerns regarding application of the CC method to censored data, while maintaining its primary advantage by consolidating kinematic and template-mismatch information toward different regions of the CC function. We identify a systematic error in the nominal CC method of approximately 10% in velocity dispersion incurred by a mistreatment of detector-censored data, which is eliminated by our new method. We derive our approach from first principles, and we use Monte Carlo simulations to demonstrate its efficacy. An identical set of Monte Carlo simulations performed using the well-established penalized-pixel-fitting code of Cappellari and Emsellem compares favorably with the results from our newly implemented software. Finally, we provide a practical demonstration of this software by extracting stellar kinematics from SparsePak spectra of UGC 6918.

  13. LIQUID: an open-source software for identifying lipids in LC-MS/MS-based lipidomics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyle, Jennifer E.; Crowell, Kevin L.; Casey, Cameron P.

    2017-01-31

    We introduce an open-source software, LIQUID, for semi-automated processing and visualization of LC-MS/MS based lipidomics data. LIQUID provides users with the capability to process high throughput data and contains a customizable target library and scoring model per project needs. The graphical user interface provides visualization of multiple lines of spectral evidence for each lipid identification, allowing rapid examination of data for making confident identifications of lipid molecular species.

  14. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
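
    The station baselines described here are built from many power spectral density estimates of continuous data, summarized as percentiles (the probability-density-function approach of McNamara et al.). The snippet below is a simplified stand-in using Welch PSD estimates of consecutive windows; it omits PQLX's instrument-response removal and octave-band binning, and the sampling rate and window choices are illustrative.

        import numpy as np
        from scipy.signal import welch

        def noise_baseline(trace, fs, win_s=3600, n_segments=24, percentiles=(5, 50, 95)):
            """Percentile PSD baseline from consecutive windows of a seismic trace
            (simplified; no instrument-response removal or octave-band smoothing)."""
            nwin = int(win_s * fs)
            psds = []
            for k in range(min(n_segments, len(trace) // nwin)):
                seg = trace[k * nwin:(k + 1) * nwin]
                f, pxx = welch(seg, fs=fs, nperseg=4096)
                psds.append(10 * np.log10(pxx[1:]))        # dB; drop the zero-frequency bin
            psds = np.array(psds)
            return f[1:], {p: np.percentile(psds, p, axis=0) for p in percentiles}

        fs = 20.0                                           # Hz, hypothetical channel rate
        trace = np.random.normal(size=int(fs * 3600 * 6))   # 6 hours of synthetic noise
        freqs, baseline = noise_baseline(trace, fs, n_segments=6)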

  15. View_SPECPR: Software for Plotting Spectra (Installation Manual and User's Guide, Version 1.2)

    USGS Publications Warehouse

    Kokaly, Raymond F.

    2008-01-01

    This document describes procedures for installing and using the 'View_SPECPR' software system to plot spectra stored in SPECPR (SPECtrum Processing Routines) files. The View_SPECPR software is comprised of programs written in IDL (Interactive Data Language) that run within the ENVI (ENvironment for Visualizing Images) image processing system. SPECPR files are used by earth-remote-sensing scientists and planetary scientists for storing spectra collected by laboratory, field, and remote sensing instruments. A widely distributed SPECPR file is the U.S. Geological Survey (USGS) spectral library that contains thousands of spectra of minerals, vegetation, and man-made materials (Clark and others, 2007). SPECPR files contain reflectance data and associated wavelength and spectral resolution data, as well as meta-data on the time and date of collection and spectrometer settings. Furthermore, the SPECPR file automatically tracks changes to data records through its 'history' fields. For more details on the format and content of SPECPR files, see Clark (1993). For more details on ENVI, see ITT (2008). This program has been updated using an ENVI 4.5/IDL7.0 full license operating on a Windows XP operating system and requires the installation of the iTools components of IDL7.0; however, this program should work with full licenses on UNIX/LINUX systems. This software has not been tested with ENVI licenses on Windows Vista or Apple Operating Systems.

  16. Digital data from shuttle photography: The effects of platform variables

    NASA Technical Reports Server (NTRS)

    Davis, Bruce E.

    1987-01-01

    Two major criticisms of using Shuttle hand held photography as an Earth science sensor are that it is nondigital, nonquantitative and that it has inconsistent platform characteristics, e.g., variable look angles, especially as compared to remote sensing satellites such as LANDSAT and SPOT. However, these criticisms are assumptions and have not been systematically investigated. The spectral effects of off-nadir views of hand held photography from the Shuttle and their role in interpretation of lava flow morphology on the island of Hawaii are studied. Digitization of photography at JSC and use of LIPS image analysis software in obtaining data is discussed. Preliminary interpretative results of one flow are given. Most of the time was spent in developing procedures and overcoming equipment problems. Preliminary data are satisfactory for detailed analysis.

  17. Design and Construction of a Field Capable Snapshot Hyperspectral Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Arik, Glenda H.

    2005-01-01

    The computed-tomography imaging spectrometer (CTIS) is a device which captures the spatial and spectral content of a rapidly evolving scene in a single image frame. The most recent CTIS design is optically all-reflective and uses as its dispersive device a state-of-the-art reflective computer-generated hologram (CGH). This project focuses on the instrument's transition from laboratory to field. This design will enable the CTIS to withstand a harsh desert environment. The system is modeled in optical design software using a tolerance analysis. The tolerances guide the design of the athermal mount and component parts. The parts are assembled into a working mount shell where the performance of the mounts is tested for thermal integrity. An interferometric analysis of the reflective CGH is also performed.

  18. Algorithm 971: An Implementation of a Randomized Algorithm for Principal Component Analysis

    PubMed Central

    LI, HUAMIN; LINDERMAN, GEORGE C.; SZLAM, ARTHUR; STANTON, KELLY P.; KLUGER, YUVAL; TYGERT, MARK

    2017-01-01

    Recent years have witnessed intense development of randomized methods for low-rank approximation. These methods target principal component analysis and the calculation of truncated singular value decompositions. The present article presents an essentially black-box, foolproof implementation for Mathworks’ MATLAB, a popular software platform for numerical computation. As illustrated via several tests, the randomized algorithms for low-rank approximation outperform or at least match the classical deterministic techniques (such as Lanczos iterations run to convergence) in basically all respects: accuracy, computational efficiency (both speed and memory usage), ease-of-use, parallelizability, and reliability. However, the classical procedures remain the methods of choice for estimating spectral norms and are far superior for calculating the least singular values and corresponding singular vectors (or singular subspaces). PMID:28983138
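
    The randomized low-rank approximation the article refers to can be summarized in a few lines: project the matrix onto a small random subspace, orthonormalize, and take an exact SVD of the much smaller projected matrix. The following is a generic textbook-style sketch of that idea in Python, not the Algorithm 971 MATLAB implementation.

        import numpy as np

        def randomized_svd(A, k, n_oversample=10, n_power_iter=2, seed=0):
            """Rank-k truncated SVD via random projection with a few power iterations
            (generic sketch of the randomized approach, not Algorithm 971 itself)."""
            rng = np.random.default_rng(seed)
            m, n = A.shape
            Omega = rng.standard_normal((n, k + n_oversample))   # random test matrix
            Y = A @ Omega
            for _ in range(n_power_iter):                        # power iterations sharpen the spectrum
                Y = A @ (A.T @ Y)
            Q, _ = np.linalg.qr(Y)                               # orthonormal basis for the range of A
            B = Q.T @ A                                          # small (k + p) x n matrix
            Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
            U = Q @ Ub
            return U[:, :k], s[:k], Vt[:k, :]

        A = np.random.default_rng(1).standard_normal((2000, 500))
        U, s, Vt = randomized_svd(A, k=10)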

  19. Analysis of acoustic emission during abrasive waterjet machining of sheet metals

    NASA Astrophysics Data System (ADS)

    Mokhtar, Nazrin; Gebremariam, MA; Zohari, H.; Azhari, Azmir

    2018-04-01

    The present paper reports on the analysis of acoustic emission (AE) produced during the abrasive waterjet (AWJ) machining process. This paper focuses on the relationship between AE and the surface quality of sheet metals. The changes in acoustic emission signals, recorded by means of power spectral density (PSD) via the covariance method, in relation to the surface quality of the cut are discussed. The test was made using two materials for comparison, namely aluminium 6061 and stainless steel 304, with five different feed rates. The acoustic emission data were captured with LabVIEW and later processed using MATLAB software. The results show that the AE spectra correlate with the different feed rates and surface qualities. It can be concluded that AE is capable of monitoring changes in feed rate and surface quality.
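
    The PSD estimation step can be illustrated with a parametric autoregressive estimator. The sketch below uses the Yule-Walker (autocorrelation) variant as a stand-in for the covariance-method estimator used in the paper, with an invented AE sampling rate and test signal.

        import numpy as np
        from scipy.linalg import solve_toeplitz

        def ar_psd(x, order=20, nfft=1024, fs=1.0):
            """Parametric AR power spectral density via the Yule-Walker equations
            (stand-in for the covariance-method estimator mentioned in the paper)."""
            x = np.asarray(x, float) - np.mean(x)
            n = len(x)
            r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])  # autocorrelation
            a = solve_toeplitz(r[:order], r[1:order + 1])      # AR coefficients
            sigma2 = r[0] - np.dot(a, r[1:order + 1])          # driving-noise variance
            freqs = np.linspace(0, fs / 2, nfft)
            z = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(1, order + 1)))
            return freqs, sigma2 / (fs * np.abs(1 - z @ a) ** 2)

        fs = 1e6                                               # hypothetical AE sampling rate
        t = np.arange(0, 0.01, 1 / fs)
        signal = np.sin(2 * np.pi * 150e3 * t) + 0.5 * np.random.randn(t.size)
        f, psd = ar_psd(signal, order=30, fs=fs)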

  20. Investigation of dynamic noise affecting geodynamics information in a tethered subsatellite

    NASA Technical Reports Server (NTRS)

    Gullahorn, G. E.

    1984-01-01

    The effects of a tethered satellite system's internal dynamics on the subsatellite were calculated including both overall motions (libration and attitude oscillations) and internal tether oscillations. The SKYHOOK tether simulation program was modified to operate with atmospheric density variations and to output quantities of interest. Techniques and software for analyzing the results were developed including noise spectral analysis. A program was begun for computing a stable configuration of a tether system subject to air drag. These configurations will be of use as initial conditions for SKYHOOK and, through linearized analysis, directly for stability and dynamical studies. A case study in which the subsatellite traverses an atmospheric density enhancement confirmed some theoretical calculations, and pointed out some aspects of the interaction with the tether system dynamics.

  1. Evaluation of appropriate sensor specifications for space based ballistic missile detection

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin; Wendelstein, Norbert

    2012-10-01

    The detection and tracking of ballistic missiles (BMs) during launch or cloud break using satellite-based electro-optical (EO) sensors is a promising possibility for pre-instructing early warning and fire control radars. However, the successful detection of a BM depends on the applied infrared (IR) channel, as emission and reflection of threat and background vary in different spectral (IR) bands and for different observation scenarios. In addition, the spatial resolution of the satellite-based system also conditions the signal-to-clutter ratio (SCR) and therefore the predictability of the flight path. Generally available satellite images provide data in spectral bands which are suitable for remote sensing applications and earth surface observations. However, in the field of BM early warning, these bands are not of interest, making the simulation of background data essential. The paper focuses on the analysis of IR bands suitable for missile detection by trading off the suppression of background signature against threat signal strength. This comprises a radiometric overview of the background radiation in different spectral bands for different climates and seasons, as well as for various cloud types and covers. A brief investigation of the BM signature and its trajectory within a threat scenario is presented. Moreover, the influence on the SCR of different observation scenarios and varying spatial resolution is pointed out. The paper also introduces the software used for simulating natural background spectral radiance images, MATISSE ("Advanced Modeling of the Earth for Environment and Scenes Simulation") by ONERA [1].

  2. Regularities And Irregularities Of The Stark Parameters For Single Ionized Noble Gases

    NASA Astrophysics Data System (ADS)

    Peláez, R. J.; Djurovic, S.; Cirišan, M.; Aparicio, J. A.; Mar, S.

    2010-07-01

    Spectroscopy of ionized noble gases is of great importance for laboratory and astrophysical plasmas. Generally, spectra of inert gases are important for many areas of physics, for example laser physics, fusion diagnostics, photoelectron spectroscopy, collision physics, and astrophysics. Stark halfwidths as well as shifts of spectral lines are usually employed for plasma diagnostic purposes. For example, atomic data for argon, krypton, and xenon will be useful for the spectral diagnostics of ITER. In addition, software used for stellar atmosphere simulation, like TMAP and SMART, requires a large amount of atomic and spectroscopic data. Availability of these parameters will be useful for the further development of stellar atmosphere and evolution models. Stark parameter data for spectral lines can also be useful for verification of theoretical calculations and investigation of regularities and systematic trends of these parameters within a multiplet, supermultiplet, or transition array. In recent years, different trends and regularities of Stark parameters (halfwidths and shifts of spectral lines) have been analyzed. Conditions related to the atomic structure of the element, as well as plasma conditions, are responsible for regular or irregular behavior of the Stark parameters. The absence of very close perturbing levels makes Ne II a good candidate for analysis of the regularities. The other two elements considered, Kr II and Xe II, have complex spectra with strong perturbations, and in some cases irregularities in the Stark parameters appear. In this work we analyze the influence of the perturbations on Stark parameters within the multiplets.

  3. Recording and interpretation/analysis of tilt signals with five ASKANIA borehole tiltmeters at the KTB.

    PubMed

    Gebauer, André; Jahr, Thomas; Jentzsch, Gerhard

    2007-05-01

    In June 2003, a large-scale injection experiment started at the Continental Deep Drilling site (KTB) in Germany. A tiltmeter array was installed which consisted of five high-resolution borehole tiltmeters of the ASKANIA type, also equipped with three-dimensional seismometers. Over the next 11 months, 86,000 m³ were injected into the KTB pilot borehole at a depth of 4000 m. The average injection rate was approximately 200 l/min. The research objective was to observe and analyze deformation of the upper crust caused by the injection at the kilometer scale. A new data acquisition system was developed by the Geo-Research Center Potsdam (GFZ) to master the expected huge amount of seismic and tilt data. Furthermore, it was necessary to develop new preprocessing software, called PREANALYSE, for long-period time series. This software includes several useful functions, such as step and spike correction, interpolation, filtering, and spectral analysis. This worldwide unique installation offers an excellent opportunity to separate signals due to the injection from signals due to the environment by correlating the data of the five stations with the groundwater table and meteorological data.
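
    The spike correction included in such preprocessing can be sketched as a running-median outlier test followed by interpolation. The snippet below is a simplified illustration of that idea, not the PREANALYSE code, and the synthetic tilt record and thresholds are placeholders.

        import numpy as np

        def despike(series, window=25, n_mad=6.0):
            """Flag points deviating from the running median by more than n_mad running MADs
            and replace them by linear interpolation (simplified preprocessing sketch)."""
            x = np.asarray(series, float).copy()
            pad = window // 2
            padded = np.pad(x, pad, mode="edge")
            med = np.array([np.median(padded[i:i + window]) for i in range(len(x))])
            mad = np.array([np.median(np.abs(padded[i:i + window] - med[i])) for i in range(len(x))])
            spikes = np.abs(x - med) > n_mad * (mad + 1e-12)
            good = ~spikes
            x[spikes] = np.interp(np.flatnonzero(spikes), np.flatnonzero(good), x[good])
            return x, spikes

        # Synthetic one-day tilt record at 1-minute sampling with two spikes (illustrative values)
        t = np.arange(0, 86400, 60.0)
        tilt = 50 * np.sin(2 * np.pi * t / 44714) + 0.5 * np.random.randn(t.size)
        tilt[300], tilt[900] = 500.0, -400.0
        clean, flagged = despike(tilt)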

  4. Algorithms and Results of Eye Tissues Differentiation Based on RF Ultrasound

    PubMed Central

    Jurkonis, R.; Janušauskas, A.; Marozas, V.; Jegelevičius, D.; Daukantas, S.; Patašius, M.; Paunksnis, A.; Lukoševičius, A.

    2012-01-01

    Algorithms and software were developed for the analysis of B-scan ultrasonic signals acquired from a commercial diagnostic ultrasound system. The algorithms process raw ultrasonic signals in the backscattered spectrum domain, which is obtained using two time-frequency methods: the short-time Fourier and Hilbert-Huang transformations. The signals from selected regions of eye tissues are characterized by the following parameters: B-scan envelope amplitude, approximated spectral slope, approximated spectral intercept, mean instantaneous frequency, mean instantaneous bandwidth, and parameters of the Nakagami distribution characterizing the Hilbert-Huang transformation output. The backscattered ultrasound signal parameters characterizing intraocular and orbit tissues were processed by a decision tree data mining algorithm. The pilot trial showed that the applied methods are able to correctly classify signals from corpus vitreum blood, extraocular muscle, and orbit tissues. In 26 cases of ocular tissue classification into the classes of corpus vitreum blood, extraocular muscle, and orbit tissue, one error occurred. In this pilot classification, the spectral intercept and the Nakagami parameter for the instantaneous frequency distribution of the 1st intrinsic mode function were found to be specific for corpus vitreum blood, orbit, and extraocular muscle tissues. We conclude that ultrasound data should be further collected in a clinical database to establish the background for a decision support system for noninvasive ocular tissue differentiation. PMID:22654643
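
    Two of the listed parameters, the approximated spectral slope and intercept, come from a linear fit to the backscatter power spectrum (in dB) over the usable bandwidth. A generic sketch of that step, with invented sampling rate and bandwidth, follows; the Nakagami and Hilbert-Huang parameters are not reproduced here.

        import numpy as np

        def spectral_slope_intercept(rf_segment, fs, band=(5e6, 15e6)):
            """Linear fit (dB vs MHz) to the backscatter power spectrum of an RF segment,
            returning the approximated spectral slope (dB/MHz) and intercept (dB)."""
            spec = np.abs(np.fft.rfft(rf_segment * np.hanning(len(rf_segment)))) ** 2
            freqs = np.fft.rfftfreq(len(rf_segment), d=1 / fs)
            sel = (freqs >= band[0]) & (freqs <= band[1])
            slope, intercept = np.polyfit(freqs[sel] / 1e6, 10 * np.log10(spec[sel]), 1)
            return slope, intercept

        fs = 50e6                                  # hypothetical RF sampling rate
        t = np.arange(0, 2e-6, 1 / fs)
        rf = np.sin(2 * np.pi * 10e6 * t) * np.exp(-((t - 1e-6) / 0.4e-6) ** 2) \
             + 0.01 * np.random.randn(t.size)
        print(spectral_slope_intercept(rf, fs))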

  5. Portable, stand-off spectral imaging camera for detection of effluents and residues

    NASA Astrophysics Data System (ADS)

    Goldstein, Neil; St. Peter, Benjamin; Grot, Jonathan; Kogan, Michael; Fox, Marsha; Vujkovic-Cvijin, Pajo; Penny, Ryan; Cline, Jason

    2015-06-01

    A new, compact and portable spectral imaging camera, employing a MEMS-based encoded imaging approach, has been built and demonstrated for detection of hazardous contaminants including gaseous effluents and solid-liquid residues on surfaces. The camera is called the Thermal infrared Reconfigurable Analysis Camera for Effluents and Residues (TRACER). TRACER operates in the long wave infrared and has the potential to detect a wide variety of materials with characteristic spectral signatures in that region. The 30 lb. camera is tripod mounted and battery powered. A touch screen control panel provides a simple user interface for most operations. The MEMS spatial light modulator is a Texas Instruments Digital Micromirror Device with custom electronics and firmware control. Simultaneous 1D-spatial and 1D-spectral dimensions are collected, with the second spatial dimension obtained by scanning the internal spectrometer slit. The sensor can be configured to collect data in several modes including full hyperspectral imagery using Hadamard multiplexing, panchromatic thermal imagery, and chemical-specific contrast imagery, switched with simple user commands. Matched filters and other analog filters can be generated internally on-the-fly and applied in hardware, substantially reducing detection time and improving SNR over HSI software processing, while reducing storage requirements. Results of preliminary instrument evaluation and measurements of flame exhaust are presented.

  6. Forest Watch: a K-12 Outreach Program to Engage Young Students in Authentic, Hands-On Science

    NASA Astrophysics Data System (ADS)

    Rock, B. N.; Gagnon, M.

    2008-12-01

    The Forest Watch Program is a K-12 science outreach program developed at the University of New Hampshire (UNH) in 1991. The program engages pre-college teachers and their students in assisting researchers at UNH in the assessment of the state-of-health of white pine (Pinus strobus), a known bio-indicator species for exposure to elevated levels of ground-level ozone. Students participate in three types of activities: 1. selection, collection, and analysis of needle samples from five permanently tagged white pine trees near their school; 2. study of needles in their classroom and sending a set of needles to UNH for spectral analysis; and 3. analysis of remote sensing data (Landsat TM) provided of their local area using freeware software (MultiSpec). Student-derived foliar symptomology, needle length, needle retention, and tree biometrics, plus the spectral indices, allow UNH researchers to characterize annual variations in tree state-of-health, and to correlate that state-of-health with annual summer ozone levels collected by the EPA and state environmental monitoring networks. The results suggest that regional air quality and state-of-health of trees has improved since 1991. Annual student data and the yearly spectral variations, for the same trees, suggest that white pine health has improved dramatically since 1997/8. This improvement in tree health corresponds with improved regional air quality. An evaluation of student data reliability has been conducted and suggests that the DBH measurements are a most reliable indicator of tree growth. Student data are more reliable if multiple sets of measurements are made and averaged together, compared with single sets of measurements. Based on both student data and spectral analysis of student-provided branch samples, the greatest damage (chlorosis) occurs in trees located along the seacoast areas. Participation in Forest Watch introduces students to the scientific method via an authentic research program. The program is designed in partnership with participating teachers, and thus meets New England state science and mathematics curricula for K-12 education. Student participation in Forest Watch has resulted in an improved understanding of inter-annual white pine state-of-health response to improved air quality across the New England region.

  7. Characterizing pigments with hyperspectral imaging variable false-color composites

    NASA Astrophysics Data System (ADS)

    Hayem-Ghez, Anita; Ravaud, Elisabeth; Boust, Clotilde; Bastian, Gilles; Menu, Michel; Brodie-Linder, Nancy

    2015-11-01

    Hyperspectral imaging has been used for pigment characterization on paintings for the last 10 years. It is a noninvasive technique, which mixes the power of spectrophotometry and that of imaging technologies. We have access to a visible and near-infrared hyperspectral camera, ranging from 400 to 1000 nm in 80-160 spectral bands. In order to treat the large amount of data that this imaging technique generates, one can use statistical tools such as principal component analysis (PCA). To conduct the characterization of pigments, researchers mostly use PCA, convex geometry algorithms and the comparison of resulting clusters to database spectra with a specific tolerance (like the Spectral Angle Mapper tool in the dedicated software ENVI). Our approach originates from false-color photography and aims at providing a simple tool to identify pigments thanks to imaging spectroscopy. It can be considered a quick first analysis to see the principal pigments of a painting, before using a more complete multivariate statistical tool. We study pigment spectra for each kind of hue (blue, green, red, and yellow) to identify the wavelengths maximizing spectral differences. The case of red pigments is most interesting because our methodology can discriminate the red pigments very well—even red lakes, which are always difficult to identify. For the yellow and blue categories, the method represents clear progress over IRFC photography for pigment discrimination. We apply our methodology to study the pigments on a painting by Eustache Le Sueur, a French painter of the seventeenth century. We compare the results to other noninvasive analyses such as X-ray fluorescence and optical microscopy. Finally, we draw conclusions about the advantages and limits of the variable false-color image method using hyperspectral imaging.
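
    The Spectral Angle Mapper comparison mentioned above treats each pixel spectrum and each reference spectrum as vectors and assigns a pixel to the reference with the smallest angle, provided that angle is below a tolerance. A minimal version of that matching step, with random stand-in data, is sketched below.

        import numpy as np

        def spectral_angle_map(cube, references, tolerance_rad=0.1):
            """Classify each pixel of a (rows, cols, bands) cube by the reference spectrum
            with the smallest spectral angle; -1 where no angle is below the tolerance."""
            pixels = cube.reshape(-1, cube.shape[-1])
            pixels = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
            refs = references / np.linalg.norm(references, axis=1, keepdims=True)
            angles = np.arccos(np.clip(pixels @ refs.T, -1.0, 1.0))   # (n_pixels, n_refs)
            best = np.argmin(angles, axis=1)
            best[np.min(angles, axis=1) > tolerance_rad] = -1
            return best.reshape(cube.shape[:2])

        # Hypothetical usage with a small library of reference pigment spectra (n_refs, bands)
        cube = np.random.rand(50, 50, 120)
        library = np.random.rand(4, 120)
        classes = spectral_angle_map(cube, library, tolerance_rad=0.3)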

  8. Terrain-analysis procedures for modeling radar backscatter

    USGS Publications Warehouse

    Schaber, Gerald G.; Pike, Richard J.; Berlin, Graydon Lennis

    1978-01-01

    The collection and analysis of detailed information on the surface of natural terrain are important aspects of radar-backscattering modeling. Radar is especially sensitive to surface-relief changes at the millimeter-to-decimeter scale for conventional K-band (~1-cm wavelength) to L-band (~25-cm wavelength) radar systems. Surface roughness statistics that characterize these changes in detail have been generated by a comprehensive set of seven programmed calculations for radar-backscatter modeling from sets of field measurements. The seven programs are 1) formatting of data in readable form for the subsequent topographic analysis programs; 2) relief analysis; 3) power spectral analysis; 4) power spectrum plots; 5) slope angle between slope reversals; 6) slope angle against slope interval plots; and 7) base length slope angle and curvature. This complete Fortran IV software package, 'Terrain Analysis', is here presented for the first time. It was originally developed a decade ago for investigations of lunar morphology and surface trafficability for the Apollo Lunar Roving Vehicle.
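
    Two of the seven calculations, the power spectrum of a relief profile and slope angle as a function of base length, can be sketched as follows for a single evenly sampled height profile. This is a modern stand-in with invented sample spacing, not the original Fortran IV 'Terrain Analysis' package.

        import numpy as np
        from scipy.signal import periodogram

        def profile_statistics(heights, dx, base_lengths=(0.01, 0.1, 1.0)):
            """Roughness statistics of an evenly sampled terrain profile: relief power
            spectrum and RMS slope angle (degrees) at several base lengths."""
            f, psd = periodogram(heights - np.mean(heights), fs=1.0 / dx)   # cycles per metre
            slopes = {}
            for L in base_lengths:
                step = max(1, int(round(L / dx)))
                dz = heights[step:] - heights[:-step]
                slopes[L] = np.degrees(np.arctan(np.sqrt(np.mean((dz / (step * dx)) ** 2))))
            return f, psd, slopes

        dx = 0.005                                       # 5 mm sample spacing (hypothetical)
        x = np.arange(0, 10, dx)
        profile = 0.02 * np.sin(2 * np.pi * x / 0.5) + 0.003 * np.random.randn(x.size)
        f, psd, rms_slopes = profile_statistics(profile, dx)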

  9. Further development of the dynamic gas temperature measurement system. Volume 2: Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Stocks, Dana R.

    1986-01-01

    The Dynamic Gas Temperature Measurement System compensation software accepts digitized data from two different diameter thermocouples and computes a compensated frequency response spectrum for one of the thermocouples. Detailed discussions of the physical system, analytical model, and computer software are presented in this volume and in Volume 1 of this report under Task 3. Computer program software restrictions and test cases are also presented. Compensated and uncompensated data may be presented in either the time or frequency domain. Time domain data are presented as instantaneous temperature vs time. Frequency domain data may be presented in several forms such as power spectral density vs frequency.

  10. SigmaPlot 2000, Version 6.00, SPSS Inc. Computer Software Test Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HURLBUT, S.T.

    2000-10-24

    SigmaPlot is a vendor software product used in conjunction with the supercritical fluid extraction Fourier transform infrared spectrometer (SFE-FTIR) system. This product converts the raw spectral data to useful area numbers. SigmaPlot will be used in conjunction with procedure ZA-565-301, "Determination of Moisture by Supercritical Fluid Extraction and Infrared Detection." This test plan will be performed in conjunction with or prior to HNF-6936, "HA-53 Supercritical Fluid Extraction System Acceptance Test Plan", to perform analyses for water. The test will ensure that the software can be installed properly and will manipulate the analytical data correctly.

  11. Active-passive data fusion algorithms for seafloor imaging and classification from CZMIL data

    NASA Astrophysics Data System (ADS)

    Park, Joong Yong; Ramnath, Vinod; Feygels, Viktor; Kim, Minsu; Mathur, Abhinav; Aitken, Jennifer; Tuell, Grady

    2010-04-01

    CZMIL will simultaneously acquire lidar and passive spectral data. These data will be fused to produce enhanced seafloor reflectance images from each sensor, and combined at a higher level to achieve seafloor classification. In the DPS software, the lidar data will first be processed to solve for depth, attenuation, and reflectance. The depth measurements will then be used to constrain the spectral optimization of the passive spectral data, and the resulting water column estimates will be used recursively to improve the estimates of seafloor reflectance from the lidar. Finally, the resulting seafloor reflectance cube will be combined with texture metrics estimated from the seafloor topography to produce classifications of the seafloor.

  12. Improvement in Recursive Hierarchical Segmentation of Data

    NASA Technical Reports Server (NTRS)

    Tilton, James C.

    2006-01-01

    A further modification has been made in the algorithm and implementing software reported in Modified Recursive Hierarchical Segmentation of Data (GSC-14681-1), NASA Tech Briefs, Vol. 30, No. 6 (June 2006), page 51. That software performs recursive hierarchical segmentation of data having spatial characteristics (e.g., spectral-image data). The output of a prior version of the software contained artifacts, including spurious segmentation-image regions bounded by processing-window edges. The modification for suppressing the artifacts, mentioned in the cited article, was addition of a subroutine that analyzes data in the vicinities of seams to find pairs of regions that tend to lie adjacent to each other on opposite sides of the seams. Within each such pair, pixels in one region that are more similar to pixels in the other region are reassigned to the other region. The present modification provides for a parameter ranging from 0 to 1 for controlling the relative priority of merges between spatially adjacent and spatially non-adjacent regions. At 1, spatially-adjacent-/spatially-non-adjacent-region merges have equal priority. At 0, only spatially-adjacent-region merges (no spectral clustering) are allowed. Between 0 and 1, spatially-adjacent-region merges have priority over spatially-non-adjacent ones.

  13. Platform-Independent Cirrus and Spectralis Thickness Measurements in Eyes with Diabetic Macular Edema Using Fully Automated Software

    PubMed Central

    Willoughby, Alex S.; Chiu, Stephanie J.; Silverman, Rachel K.; Farsiu, Sina; Bailey, Clare; Wiley, Henry E.; Ferris, Frederick L.; Jaffe, Glenn J.

    2017-01-01

    Purpose We determine whether the automated segmentation software, Duke Optical Coherence Tomography Retinal Analysis Program (DOCTRAP), can measure, in a platform-independent manner, retinal thickness on Cirrus and Spectralis spectral domain optical coherence tomography (SD-OCT) images in eyes with diabetic macular edema (DME) under treatment in a clinical trial. Methods Automatic segmentation software was used to segment the internal limiting membrane (ILM), inner retinal pigment epithelium (RPE), and Bruch's membrane (BM) in SD-OCT images acquired by Cirrus and Spectralis commercial systems, from the same eye, on the same day during a clinical interventional DME trial. Mean retinal thickness differences were compared across commercial and DOCTRAP platforms using intraclass correlation (ICC) and Bland-Altman plots. Results The mean 1 mm central subfield thickness difference (standard error [SE]) comparing segmentation of Spectralis images with DOCTRAP versus HEYEX was 0.7 (0.3) μm (0.2 pixels). The corresponding values comparing segmentation of Cirrus images with DOCTRAP versus Cirrus software was 2.2 (0.7) μm. The mean 1 mm central subfield thickness difference (SE) comparing segmentation of Cirrus and Spectralis scan pairs with DOCTRAP using BM as the outer retinal boundary was −2.3 (0.9) μm compared to 2.8 (0.9) μm with inner RPE as the outer boundary. Conclusions DOCTRAP segmentation of Cirrus and Spectralis images produces validated thickness measurements that are very similar to each other, and very similar to the values generated by the corresponding commercial software in eyes with treated DME. Translational Relevance This software enables automatic total retinal thickness measurements across two OCT platforms, a process that is impractical to perform manually. PMID:28180033

  14. Use of timesat to estimate phenological parameters in Northwestern Patagonia

    NASA Astrophysics Data System (ADS)

    Oddi, Facundo; Minotti, Priscilla; Ghermandi, Luciana; Lasaponara, Rosa

    2015-04-01

    In a global change context, ecosystems are under high pressure and ecological science plays a key role in the monitoring and assessment of natural resources. For effective resource management, it is useful to develop knowledge of ecosystem functioning from a spatio-temporal perspective. Satellite imagery periodically captures the spectral response of the earth, and remote sensing has been widely utilized as a classification and change-detection tool, making it possible to evaluate intra- and inter-annual plant dynamics. Vegetation spectral indices (e.g., NDVI) are particularly suitable for studying spatio-temporal processes related to plant phenology, and remote sensing specific software, such as TIMESAT, has been developed to carry out time series analysis of spectral indices. We applied the TIMESAT software to a 25-year series of bi-monthly NDVI composites (240 images covering the period 1982-2006) from the NOAA-AVHRR sensor (8 x 8 km) to assess plant phenology over 900,000 ha of shrubby grasslands in northwestern Patagonia, Argentina. The study area corresponds to a Mediterranean environment and is part of a gradient defined by a sharp west-east drop in the precipitation regime (600 mm to 280 mm). We fitted the NDVI time series to double logistic functions by least-squares methods, evaluating three seasonality parameters: a) start of the growing season, b) growing season length, and c) NDVI seasonal integral. According to the models fitted by TIMESAT, the average start of the growing season was in the second half of September (± 10 days), with the latest onsets in the east (drier areas). The average growing season length was 180 days (± 15 days) without a clear spatial trend. The NDVI seasonal integral showed a clear decreasing trend in the west-east direction, following the precipitation gradient. The temporal and spatial information reveals important patterns of ecological interest, which can be of great value for environmental monitoring. In this work we also show how TIMESAT can be used to characterize plant phenology at the regional scale.
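
    The double-logistic fitting that TIMESAT applies to each NDVI series, from which start of season, season length, and the seasonal integral are then read off, can be sketched with a generic least-squares fit. The model form below follows common TIMESAT usage, but the parameter values, the 20%-of-amplitude season threshold, and the synthetic data are illustrative only.

        import numpy as np
        from scipy.optimize import curve_fit

        def double_logistic(t, base, amp, t_start, rate_up, t_end, rate_down):
            """Double logistic seasonal curve commonly fitted to NDVI time series."""
            return base + amp * (1 / (1 + np.exp(-rate_up * (t - t_start)))
                                 - 1 / (1 + np.exp(-rate_down * (t - t_end))))

        doy = np.arange(0, 365, 15.0)                              # ~bi-monthly composites
        ndvi = double_logistic(doy, 0.2, 0.5, 100, 0.08, 260, 0.06) \
               + 0.02 * np.random.randn(doy.size)

        p0 = [0.2, 0.5, 90, 0.05, 250, 0.05]                       # rough initial guesses
        params, _ = curve_fit(double_logistic, doy, ndvi, p0=p0, maxfev=10000)
        fitted = double_logistic(doy, *params)

        season = fitted > params[0] + 0.2 * params[1]               # 20%-of-amplitude threshold
        start_doy, end_doy = doy[season][0], doy[season][-1]
        seasonal_integral = np.sum(fitted[season] - params[0]) * (doy[1] - doy[0])
        print(f"start DOY {start_doy:.0f}, length {end_doy - start_doy:.0f} d, "
              f"integral {seasonal_integral:.1f}")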

  15. Seismic design parameters - A user guide

    USGS Publications Warehouse

    Leyendecker, E.V.; Frankel, A.D.; Rukstales, K.S.

    2001-01-01

    The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings (1997 NEHRP Provisions) introduced a seismic design procedure that is based on the explicit use of spectral response acceleration rather than the traditional peak ground acceleration and/or peak ground velocity or zone factors. The spectral response accelerations are obtained from spectral response acceleration maps accompanying the report. Maps are available for the United States and a number of U.S. territories. Since 1997, additional codes and standards have also adopted seismic design approaches based on the same procedure used in the NEHRP Provisions and the accompanying maps. The design documents using the 1997 NEHRP Provisions procedure may be divided into three categories: (1) Design of New Construction, (2) Design and Evaluation of Existing Construction, and (3) Design of Residential Construction. A CD-ROM has been prepared for use in conjunction with the design documents in each of these three categories. The spectral accelerations obtained using the software on the CD are the same as those that would be obtained by using the maps accompanying the design documents. The software has been prepared to operate on a personal computer using a Windows (Microsoft Corporation) operating environment and a point-and-click interface. The user can obtain the spectral acceleration values that would be obtained by use of the maps accompanying the design documents, include site factors appropriate for the Site Class provided by the user, calculate a response spectrum that includes the site factor, and plot a response spectrum. Sites may be located by providing the latitude-longitude or zip code for all areas covered by the maps. All of the maps used in the various documents are also included on the CD-ROM.
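
    The calculation the CD-ROM software performs, applying site factors to the mapped spectral accelerations and building the design response spectrum, follows the standard relations of the 1997 NEHRP Provisions procedure. The sketch below reproduces those relations for illustration; the mapped values and site coefficients shown are placeholders, not values taken from the maps or tables.

        import numpy as np

        def design_spectrum(Ss, S1, Fa, Fv, periods):
            """General design response spectrum per the 1997 NEHRP Provisions procedure:
            site-adjusted spectral accelerations and the piecewise spectral shape."""
            SMS, SM1 = Fa * Ss, Fv * S1              # site-adjusted MCE spectral accelerations
            SDS, SD1 = (2 / 3) * SMS, (2 / 3) * SM1  # design-level values
            Ts = SD1 / SDS
            T0 = 0.2 * Ts
            Sa = np.where(periods < T0, SDS * (0.4 + 0.6 * periods / T0),
                 np.where(periods <= Ts, SDS, SD1 / periods))
            return SDS, SD1, Sa

        # Hypothetical mapped values and site coefficients (placeholders only)
        T = np.linspace(0.01, 4.0, 400)
        SDS, SD1, Sa = design_spectrum(Ss=1.5, S1=0.6, Fa=1.0, Fv=1.5, periods=T)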

  16. RSpec: New Real-time Spectroscopy Software Enhances High School and College Learning

    NASA Astrophysics Data System (ADS)

    Field, Tom

    2011-01-01

    Nothing beats hands-on experience! Students often have a more profound learning experience in a hands-on laboratory than in a classroom. However, development of inquiry-based curricula for teaching spectroscopy has been thwarted by the absence of affordable equipment. There is now a software program that brings the excitement of real-time spectroscopy into the lab. It eliminates the processing delays that accompany conventional after-the-fact data analysis -- delays that often result in sagging enthusiasm and loss of interest in young, active minds. RSpec is the ideal software for high school or undergraduate physics classes. It is a state-of-the-art, multi-threaded software program that allows students to observe spectral profile graphs and their colorful synthesized spectra in real-time video. Using an off-the-shelf webcam, DSLR, cooled-CCD or even a cell phone camera, students can now gain hands-on experience in gathering, calibrating, and identifying spectra. Light sources can include the sun, bright night-time astronomical objects, or gas tubes. Students can even build their own spectroscopes using inexpensive diffraction "rainbow” glasses. For more advanced students, the addition of an inexpensive slitless diffraction grating allows the study of even more exciting objects. With a modest 8” telescope, students can use a simple webcam to classify star types, and to detect such exciting phenomena as Neptune's methane-absorption lines, M42's emission lines, and even, believe it or not, the redshift of 3C 273. These adventures are possible even under light-polluted urban skies. RSpec is also an excellent program for amateur astronomers who want to transition from visual CCD imaging to actual scientific data collection and analysis. As the developer of this software, I worked with both teachers and experienced spectroscopists to ensure that it would bring a compelling experience to your students. The response to real-time, colorful data has been very enthusiastic both in the classroom and in public outreach.
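
    The calibration step students perform, mapping pixel position along the dispersed image to wavelength, is typically a linear or low-order polynomial fit through two or more identified reference lines. The sketch below is a generic version of that pixel-to-wavelength calibration with hypothetical reference lines; it is not the RSpec implementation.

        import numpy as np

        def calibrate_wavelength(ref_pixels, ref_wavelengths, n_pixels, order=1):
            """Fit a polynomial dispersion solution through identified reference lines and
            return the wavelength of every pixel column (simple sketch, not RSpec itself)."""
            coeffs = np.polyfit(ref_pixels, ref_wavelengths, order)
            return np.polyval(coeffs, np.arange(n_pixels))

        # Hypothetical example: two hydrogen Balmer lines identified at two pixel positions
        wavelengths = calibrate_wavelength([212, 415], [4861.0, 6563.0], n_pixels=640)
        profile = np.random.rand(640)        # stand-in for the extracted spectral profile
        print(f"dispersion ~ {wavelengths[1] - wavelengths[0]:.2f} Angstrom/pixel")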

  17. Disseminating Metaproteomic Informatics Capabilities and Knowledge Using the Galaxy-P Framework

    PubMed Central

    Easterly, Caleb; Gruening, Bjoern; Johnson, James; Kolmeder, Carolin A.; Kumar, Praveen; May, Damon; Mehta, Subina; Mesuere, Bart; Brown, Zachary; Elias, Joshua E.; Hervey, W. Judson; McGowan, Thomas; Muth, Thilo; Rudney, Joel; Griffin, Timothy J.

    2018-01-01

    The impact of microbial communities, also known as the microbiome, on human health and the environment is receiving increased attention. Studying translated gene products (proteins) and comparing metaproteomic profiles may elucidate how microbiomes respond to specific environmental stimuli, and interact with host organisms. Characterizing proteins expressed by a complex microbiome and interpreting their functional signature requires sophisticated informatics tools and workflows tailored to metaproteomics. Additionally, there is a need to disseminate these informatics resources to researchers undertaking metaproteomic studies, who could use them to make new and important discoveries in microbiome research. The Galaxy for proteomics platform (Galaxy-P) offers an open source, web-based bioinformatics platform for disseminating metaproteomics software and workflows. Within this platform, we have developed easily-accessible and documented metaproteomic software tools and workflows aimed at training researchers in their operation and disseminating the tools for more widespread use. The modular workflows encompass the core requirements of metaproteomic informatics: (a) database generation; (b) peptide spectral matching; (c) taxonomic analysis and (d) functional analysis. Much of the software available via the Galaxy-P platform was selected, packaged and deployed through an online metaproteomics “Contribution Fest“ undertaken by a unique consortium of expert software developers and users from the metaproteomics research community, who have co-authored this manuscript. These resources are documented on GitHub and freely available through the Galaxy Toolshed, as well as a publicly accessible metaproteomics gateway Galaxy instance. These documented workflows are well suited for the training of novice metaproteomics researchers, through online resources such as the Galaxy Training Network, as well as hands-on training workshops. Here, we describe the metaproteomics tools available within these Galaxy-based resources, as well as the process by which they were selected and implemented in our community-based work. We hope this description will increase access to and utilization of metaproteomics tools, as well as offer a framework for continued community-based development and dissemination of cutting edge metaproteomics software. PMID:29385081

  18. Disseminating Metaproteomic Informatics Capabilities and Knowledge Using the Galaxy-P Framework.

    PubMed

    Blank, Clemens; Easterly, Caleb; Gruening, Bjoern; Johnson, James; Kolmeder, Carolin A; Kumar, Praveen; May, Damon; Mehta, Subina; Mesuere, Bart; Brown, Zachary; Elias, Joshua E; Hervey, W Judson; McGowan, Thomas; Muth, Thilo; Nunn, Brook; Rudney, Joel; Tanca, Alessandro; Griffin, Timothy J; Jagtap, Pratik D

    2018-01-31

    The impact of microbial communities, also known as the microbiome, on human health and the environment is receiving increased attention. Studying translated gene products (proteins) and comparing metaproteomic profiles may elucidate how microbiomes respond to specific environmental stimuli, and interact with host organisms. Characterizing proteins expressed by a complex microbiome and interpreting their functional signature requires sophisticated informatics tools and workflows tailored to metaproteomics. Additionally, there is a need to disseminate these informatics resources to researchers undertaking metaproteomic studies, who could use them to make new and important discoveries in microbiome research. The Galaxy for proteomics platform (Galaxy-P) offers an open source, web-based bioinformatics platform for disseminating metaproteomics software and workflows. Within this platform, we have developed easily-accessible and documented metaproteomic software tools and workflows aimed at training researchers in their operation and disseminating the tools for more widespread use. The modular workflows encompass the core requirements of metaproteomic informatics: (a) database generation; (b) peptide spectral matching; (c) taxonomic analysis and (d) functional analysis. Much of the software available via the Galaxy-P platform was selected, packaged and deployed through an online metaproteomics "Contribution Fest" undertaken by a unique consortium of expert software developers and users from the metaproteomics research community, who have co-authored this manuscript. These resources are documented on GitHub and freely available through the Galaxy Toolshed, as well as a publicly accessible metaproteomics gateway Galaxy instance. These documented workflows are well suited for the training of novice metaproteomics researchers, through online resources such as the Galaxy Training Network, as well as hands-on training workshops. Here, we describe the metaproteomics tools available within these Galaxy-based resources, as well as the process by which they were selected and implemented in our community-based work. We hope this description will increase access to and utilization of metaproteomics tools, as well as offer a framework for continued community-based development and dissemination of cutting edge metaproteomics software.

  19. Development of a CCD based solar speckle imaging system

    NASA Astrophysics Data System (ADS)

    Nisenson, Peter; Stachnik, Robert V.; Noyes, Robert W.

    1986-02-01

    A program to develop software and hardware for the purpose of obtaining high angular resolution images of the solar surface is described. The program included the procurement of a Charge Coupled Devices imaging system; an extensive laboratory and remote site testing of the camera system; the development of a software package for speckle image reconstruction which was eventually installed and tested at the Sacramento Peak Observatory; and experiments of the CCD system (coupled to an image intensifier) for low light level, narrow spectral band solar imaging.

  20. Automated Method of Frequency Determination in Software Metric Data Through the Use of the Multiple Signal Classification (MUSIC) Algorithm

    DTIC Science & Technology

    1998-06-26

    METHOD OF FREQUENCY DETERMINATION IN SOFTWARE METRIC DATA THROUGH THE USE OF THE MULTIPLE SIGNAL CLASSIFICATION (MUSIC) ALGORITHM. STATEMENT OF ... graph showing the estimated power spectral density (PSD) generated by the multiple signal classification (MUSIC) algorithm from the data set used ... implemented in this module; however, it is preferred to use the Multiple Signal Classification (MUSIC) algorithm. The MUSIC algorithm is
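    The record above describes the MUSIC estimator only at a high level; the following is a minimal, generic MUSIC pseudospectrum sketch in Python (NumPy only) to illustrate the technique, not the patent's implementation. The test signal, model order, and frequency grid are illustrative assumptions.

```python
import numpy as np

def music_pseudospectrum(x, n_sinusoids, n_lags=32, n_grid=2048):
    """MUSIC pseudospectrum of a 1-D signal x over normalized frequencies 0..0.5."""
    # Sample autocorrelation matrix from overlapping snapshots.
    snapshots = np.array([x[i:i + n_lags] for i in range(len(x) - n_lags)])
    R = snapshots.conj().T @ snapshots / snapshots.shape[0]
    # Smallest eigenvalues of R span the noise subspace (2 dimensions per real sinusoid).
    eigvals, eigvecs = np.linalg.eigh(R)
    noise_subspace = eigvecs[:, : n_lags - 2 * n_sinusoids]
    freqs = np.linspace(0.0, 0.5, n_grid)
    pseudo = np.empty(n_grid)
    for k, f in enumerate(freqs):
        steering = np.exp(-2j * np.pi * f * np.arange(n_lags))
        proj = noise_subspace.conj().T @ steering
        # Peaks appear where the steering vector is orthogonal to the noise subspace.
        pseudo[k] = 1.0 / np.real(proj.conj() @ proj)
    return freqs, pseudo

# Two sinusoids in noise as a stand-in for a periodic software-metric series.
rng = np.random.default_rng(0)
n = np.arange(512)
x = np.sin(2 * np.pi * 0.10 * n) + 0.5 * np.sin(2 * np.pi * 0.23 * n) + 0.3 * rng.standard_normal(512)
freqs, pseudo = music_pseudospectrum(x, n_sinusoids=2)
print(freqs[np.argmax(pseudo)])  # close to one of the injected frequencies (0.10 or 0.23)
```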

  1. Characteristics of AVIRIS Band Measurements in Desert Agroecosystems in the Area of Blythe, California. 1; Studies of Cotton Spectra

    NASA Technical Reports Server (NTRS)

    Hanna, Safwat H. Shakir

    2001-01-01

    Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data from Blythe, California, were acquired in June 1997 to study agricultural spectra from different crops and to identify crops in other areas with similar environmental factors and similar spectral properties. The main objectives of this study are: (1) to compare the spectral and radiometric characteristics of AVIRIS data from agricultural crops with ground spectra measured by a FieldSpec ASD spectrometer; (2) to explore the use of AVIRIS spectral images for identifying agricultural crops; (3) to study the spectral expression of environmental factors on selected crops; and (4) to build a spectral library for the crops that were studied. A long-term goal is to extend the spectral library for different vegetation or crops in different stages of growth. To support our study, on July 18 and 19, 2000, we collected spectra using the FieldSpec spectrometer from selected fields with different crops in the Blythe area of California (longitude 114 deg 33.28 W and latitude 33 deg 25.42 N to longitude 114 deg 44.53 W and latitude 33 deg 39.77 N). These crops were cotton in different stages of growth, varieties of grass, pure or mixed, Sudan grass, Bermuda grass, Teff grass, and alfalfa. Some of the fields were treated with different types of irrigation (i.e., wet to dry conditions). Additional parameters were studied such as the soil water content (WC), pH, and organic matter (OM). The results of this study showed that for crops known to be similar, there is a significant correlation between the spectra that were collected by AVIRIS in 1997 and spectra measured by the FieldSpec (registered) spectrometer in 2000. This correlation allowed development of a spectral library to be used in ENVI-IDL analysis software. This library was used successfully to identify different crops. Furthermore, the IDL algorithms for Spectral Angle Mapper (SAM) classification, spectral feature fitting (SFF), and spectral binary encoding (SPE) showed excellent agreement between the predicted and the actual crop type (an 85-90% match). Further use of AVIRIS images can be of value for crop identification or crop yield estimation for commercial use.
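    The abstract mentions Spectral Angle Mapper (SAM) classification against a spectral library; the sketch below is a minimal NumPy illustration of that idea, not the ENVI-IDL implementation used in the study. The library and pixel spectra are random placeholders rather than AVIRIS or FieldSpec data.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(pixels, library):
    """Assign each pixel (row) to the library spectrum with the smallest angle."""
    angles = np.array([[spectral_angle(p, ref) for ref in library] for p in pixels])
    return angles.argmin(axis=1), angles.min(axis=1)

rng = np.random.default_rng(1)
library = rng.random((4, 224))                                   # 4 crop endmembers, 224 bands
pixels = library[[0, 2, 3]] * 0.9 + 0.02 * rng.random((3, 224))  # scaled, slightly noisy pixels
labels, angles = sam_classify(pixels, library)
print(labels)  # [0 2 3]; the angle metric is insensitive to the 0.9 illumination scaling
```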

  2. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    NASA Astrophysics Data System (ADS)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal spectral data at 1 s time intervals, which represents data collected by a mobile system operating in a dynamic radiation background environment; and one that represents static measurements with a foreground spectrum (background plus source) and a background spectrum. These data include controlled variations in both Source Related Factors (nuclide, nuclide combinations, activities, distances, collection times, shielding configurations, and background spectra) and Detector Related Factors (currently only gain shifts, but resolution changes and non-linear energy calibration errors will be added soon). The software tools will allow the developer to evaluate the performance impact of each of these factors. Although this first implementation is somewhat limited in scope, considering only NaI-based detection systems and two application domains, it is hoped that (with community feedback) a wider range of detector types and applications will be included in the future. This article describes the methods used for dataset creation, the software validation/performance measurement tools, the performance metrics used, and examples of baseline performance.

  3. Feasibility and demonstration of a cloud-based RIID analysis system

    NASA Astrophysics Data System (ADS)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.

  4. The MPI-Mainz UV/VIS Spectral Atlas of Gaseous Molecules of Atmospheric Interest

    NASA Astrophysics Data System (ADS)

    Keller-Rudek, H.; Moortgat, G. K.; Sander, R.; Sörensen, R.

    2013-12-01

    We present the MPI-Mainz UV/VIS Spectral Atlas of Gaseous Molecules, which is a large collection of absorption cross sections and quantum yields in the ultraviolet and visible (UV/VIS) wavelength region for gaseous molecules and radicals primarily of atmospheric interest. The data files contain results of individual measurements, covering research of almost a whole century. To compare and visualize the data sets, multicoloured graphical representations have been created. The MPI-Mainz UV/VIS Spectral Atlas is available on the Internet at http://www.uv-vis-spectral-atlas-mainz.org. It now appears with improved browse and search options, based on new database software. In addition to the Web pages, which are continuously updated, a frozen version of the data is available under the doi:10.5281/zenodo.6951.

  5. Leaf area index retrieval using Hyperion EO-1 data-based vegetation indices in Himalayan forest system

    NASA Astrophysics Data System (ADS)

    Singh, Dharmendra; Singh, Sarnam

    2016-04-01

    The present study retrieves Leaf Area Index (LAI) in a Himalayan forest system using vegetation indices developed from Hyperion EO-1 hyperspectral data. Hemispherical photographs were captured in March and April 2012 at 40 locations, covering moist tropical Sal forest, subtropical Bauhinia and pine forest, and temperate Oak forest, and were analysed using the open-source GLA software. LAI in the study region ranged from 0.076 m2/m2 to 6.00 m2/m2. These LAI values were used to develop spectral models with the FLAASH-corrected Hyperion measurements. The normalized difference vegetation index (NDVI) was computed from the spectral reflectance values of all possible combinations of the 170 atmospherically corrected channels. The R2 ranged from a low of 0.0 to a high of 0.837, obtained for the band combination of the 640 nm and 670 nm spectral regions. The spectral model obtained was spectral reflectance (y) = 0.02 × LAI (x) - 0.0407.
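    As a hedged illustration of the band-combination search described above (not the study's own code), the sketch below computes an NDVI-style index for every band pair, regresses each index against LAI, and reports the pair with the highest R2. The reflectance and LAI arrays are synthetic placeholders, not Hyperion or field data.

```python
import numpy as np

def best_ndvi_band_pair(reflectance, lai):
    """reflectance: (n_samples, n_bands); lai: (n_samples,). Returns (i, j, r2, slope, intercept)."""
    n_bands = reflectance.shape[1]
    best = (None, None, -np.inf, None, None)
    for i in range(n_bands):
        for j in range(i + 1, n_bands):
            index = (reflectance[:, i] - reflectance[:, j]) / (reflectance[:, i] + reflectance[:, j] + 1e-12)
            slope, intercept = np.polyfit(lai, index, 1)   # index modelled as a linear function of LAI
            r2 = np.corrcoef(lai, index)[0, 1] ** 2
            if r2 > best[2]:
                best = (i, j, r2, slope, intercept)
    return best

rng = np.random.default_rng(2)
lai = rng.uniform(0.1, 6.0, 40)             # 40 plots, as in the study
refl = rng.uniform(0.05, 0.5, (40, 20))     # 20 stand-in bands
refl[:, 5] += 0.04 * lai                    # make one band weakly LAI-sensitive
i, j, r2, slope, intercept = best_ndvi_band_pair(refl, lai)
print(f"best pair: bands {i},{j}  R2={r2:.3f}  model: index = {slope:.4f}*LAI + {intercept:.4f}")
```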

  6. Overview of open resources to support automated structure verification and elucidation

    EPA Science Inventory

    Cheminformatics methods form an essential basis for providing analytical scientists with access to data, algorithms and workflows. There are an increasing number of free online databases (compound databases, spectral libraries, data repositories) and a rich collection of software...

  7. Improving the spectral resolution of flat-field concave grating miniature spectrometers by dividing a wide spectral band into two narrow ones.

    PubMed

    Zhou, Qian; Pang, Jinchao; Li, Xinghui; Ni, Kai; Tian, Rui

    2015-11-10

    In this study, a new flat-field concave grating miniature spectrometer is proposed with improved resolution across a wide spectral band. A mirror is added to a conventional concave grating spectrometer and placed near the existing detector array, allowing a wide spectral band to be divided into two adjacent subspectral bands. One of these bands is directly detected by the detector, and the other is indirectly analyzed by the same detector after being reflected by the mirror. These two subspectral bands share the same entrance slit, concave grating, and detector, which allows for a compact size, while maintaining an improved spectral resolution across the entire spectral band. The positions of the mirror and other parameters of the spectrometer are designed by a computer procedure and the optical design software ZEMAX. Simulation results show that the resolution of this kind of flat-field concave grating miniature spectrometer is better than 1.6 nm across a spectral band of 700 nm. Experiments based on three laser sources reveal that the measured resolutions are comparable to the simulated ones, with a maximum relative error between them of less than 19%.

  8. Spectrum-based method to generate good decoy libraries for spectral library searching in peptide identifications.

    PubMed

    Cheng, Chia-Ying; Tsai, Chia-Feng; Chen, Yu-Ju; Sung, Ting-Yi; Hsu, Wen-Lian

    2013-05-03

    As spectral library searching has received increasing attention for peptide identification, constructing good decoy spectra from the target spectra is the key to correctly estimating the false discovery rate in searching against the concatenated target-decoy spectral library. Several methods have been proposed to construct decoy spectral libraries. Most of them construct decoy peptide sequences and then generate theoretical spectra accordingly. In this paper, we propose a method, called precursor-swap, which constructs decoy spectral libraries directly at the "spectrum level" without generating decoy peptide sequences, by swapping the precursors of two spectra selected according to a very simple rule. Our spectrum-based method does not require additional efforts to deal with ion types (e.g., a, b or c ions), fragmentation mechanisms (e.g., CID or ETD), or unannotated peaks, but preserves many spectral properties. The precursor-swap method is evaluated on different spectral libraries and the obtained decoy ratios show that it is comparable to other methods. Notably, it is efficient in time and memory usage for constructing decoy libraries. A software tool called Precursor-Swap-Decoy-Generation (PSDG) is publicly available for download at http://ms.iis.sinica.edu.tw/PSDG/.
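    The following is a minimal sketch of the precursor-swap idea as described in the abstract: decoy entries are built by exchanging the precursor m/z values of paired spectra while their fragment peaks are left untouched. The pairing rule used here (pair spectra whose precursors differ by more than a tolerance) is an illustrative assumption, not the paper's own selection rule.

```python
from dataclasses import dataclass, replace
from typing import List, Tuple

@dataclass(frozen=True)
class LibrarySpectrum:
    peptide: str
    precursor_mz: float
    peaks: Tuple[Tuple[float, float], ...]   # (m/z, intensity) pairs, kept unchanged

def precursor_swap(library: List[LibrarySpectrum], min_delta: float = 20.0) -> List[LibrarySpectrum]:
    """Return decoy spectra whose precursor m/z values have been swapped pairwise."""
    ordered = sorted(library, key=lambda s: s.precursor_mz)
    decoys, used = [], set()
    for a in range(len(ordered)):
        if a in used:
            continue
        for b in range(a + 1, len(ordered)):
            if b not in used and abs(ordered[b].precursor_mz - ordered[a].precursor_mz) >= min_delta:
                sa, sb = ordered[a], ordered[b]
                decoys.append(replace(sa, peptide="DECOY_" + sa.peptide, precursor_mz=sb.precursor_mz))
                decoys.append(replace(sb, peptide="DECOY_" + sb.peptide, precursor_mz=sa.precursor_mz))
                used.update({a, b})
                break
    return decoys

lib = [LibrarySpectrum("PEPTIDEA", 450.7, ((175.1, 30.0), (504.3, 100.0))),
       LibrarySpectrum("PEPTIDEB", 512.3, ((147.1, 55.0), (632.4, 80.0)))]
print([(d.peptide, d.precursor_mz) for d in precursor_swap(lib)])
```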

  9. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    PubMed

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing values increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
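    To make the censoring idea concrete, the sketch below fits a log-normal model to peak intensities by maximum likelihood, letting values below a detection limit contribute through the cumulative distribution rather than being discarded. It is a simplified illustration in Python, not the paper's R implementation, and the data are synthetic.

```python
import numpy as np
from scipy import optimize, stats

def fit_lognormal_left_censored(intensities, detection_limit):
    """Return (mu, sigma) of log-intensity; censored values contribute P(X < limit)."""
    observed = intensities[intensities >= detection_limit]
    n_censored = np.sum(intensities < detection_limit)
    log_obs, log_lim = np.log(observed), np.log(detection_limit)

    def neg_log_likelihood(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        ll = np.sum(stats.norm.logpdf(log_obs, mu, sigma))        # fully observed peaks
        ll += n_censored * stats.norm.logcdf(log_lim, mu, sigma)  # left-censored peaks
        return -ll

    result = optimize.minimize(neg_log_likelihood, x0=[log_obs.mean(), 0.0], method="Nelder-Mead")
    mu, log_sigma = result.x
    return mu, np.exp(log_sigma)

rng = np.random.default_rng(3)
intensities = np.exp(rng.normal(loc=10.0, scale=1.0, size=500))   # synthetic peak intensities
limit = np.exp(9.0)                                               # ~16% of values fall below this
mu, sigma = fit_lognormal_left_censored(intensities, limit)
print(f"estimated mu={mu:.2f}, sigma={sigma:.2f} (truth: 10.0, 1.0)")
```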

  10. Evaluation of illumination system uniformity for wide-field biomedical hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Sawyer, Travis W.; Luthman, A. Siri; Bohndiek, Sarah E.

    2017-04-01

    Hyperspectral imaging (HSI) systems collect both spatial (morphological) and spectral (chemical) information from a sample. HSI can provide sensitive analysis for biological and medical applications, for example, simultaneously measuring reflectance and fluorescence properties of a tissue, which together with structural information could improve early cancer detection and tumour characterisation. Illumination uniformity is a critical pre-condition for quantitative data extraction from an HSI system. Non-uniformity can cause glare, specular reflection and unwanted shading, which negatively impact statistical analysis procedures used to extract abundance of different chemical species. Here, we model and evaluate several illumination systems frequently used in wide-field biomedical imaging to test their potential for HSI. We use the software LightTools and FRED. The analysed systems include: a fibre ring light; a light emitting diode (LED) ring; and a diffuse scattering dome. Each system is characterised for spectral, spatial, and angular uniformity, as well as transfer efficiency. Furthermore, an approach to measure uniformity using the Kullback-Leibler divergence (KLD) is introduced. The KLD is generalisable to arbitrary illumination shapes, making it an attractive approach for characterising illumination distributions. Although the systems are quite comparable in their spatial and spectral uniformity, the most uniform angular distribution is achieved using a diffuse scattering dome, yielding a contrast of 0.503 and average deviation of 0.303 over a ±60° field of view with a 3.9% model error in the angular domain. Our results suggest that conventional illumination sources can be applied in HSI, but in the case of low light levels, bespoke illumination sources may offer improved performance.
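    The abstract introduces a Kullback-Leibler divergence (KLD) uniformity metric; the sketch below is a minimal version of that idea, comparing a normalized irradiance map against an ideal uniform distribution on the same grid. The Gaussian-falloff map is a synthetic placeholder, not output from the LightTools or FRED models.

```python
import numpy as np

def kld_uniformity(irradiance, eps=1e-12):
    """KL divergence (nats) of a normalized irradiance map from a uniform map; 0 = perfectly uniform."""
    p = irradiance.ravel().astype(float)
    p = p / p.sum()                               # measured distribution
    q = np.full_like(p, 1.0 / p.size)             # uniform reference
    return float(np.sum(p * np.log((p + eps) / q)))

y, x = np.mgrid[-1:1:200j, -1:1:200j]
flat = np.ones_like(x)                            # ideal flat field
vignetted = np.exp(-(x**2 + y**2) / 0.8)          # centre-weighted illumination
print(f"flat field KLD: {kld_uniformity(flat):.4f}")       # ~0
print(f"vignetted KLD:  {kld_uniformity(vignetted):.4f}")  # > 0
```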

  11. Analysis of Human Plasma Metabolites across Different Liquid Chromatography - Mass Spectrometry Platforms: Cross-platform Transferable Chemical Signatures

    PubMed Central

    Telu, Kelly H.; Yan, Xinjian; Wallace, William E.; Stein, Stephen E.; Simón-Manso, Yamil

    2016-01-01

    RATIONALE The metabolite profiling of a NIST plasma Standard Reference Material (SRM 1950) on different LC-MS platforms showed significant differences. Although these findings suggest caution when interpreting metabolomics results, the degree of overlap of both profiles allowed us to use tandem mass spectral libraries of recurrent spectra to evaluate to what extent these results are transferable across platforms and to develop cross-platform chemical signatures. METHODS Non-targeted global metabolite profiles of SRM 1950 were obtained on different LC-MS platforms using reversed phase chromatography and different chromatographic scales (nano, conventional and UHPLC). The data processing and the metabolite differential analysis were carried out using publicly available (XCMS), proprietary (Mass Profiler Professional) and in-house software (NIST pipeline). RESULTS Repeatability and intermediate precision showed that the non-targeted SRM 1950 profiling was highly reproducible when working on the same platform (RSD < 2%); however, substantial differences were found in the LC-MS patterns originating on different platforms or even using different chromatographic scales (conventional HPLC, UHPLC and nanoLC) on the same platform. A substantial degree of overlap (common molecular features) was also found. A procedure to generate consistent chemical signatures using tandem mass spectral libraries of recurrent spectra is proposed. CONCLUSIONS Different platforms rendered significantly different metabolite profiles, but the results were highly reproducible when working within one platform. Tandem mass spectral libraries of recurrent spectra are proposed to evaluate the degree of transferability of chemical signatures generated on different platforms. Chemical signatures based on our procedure are most likely cross-platform transferable. PMID:26842580

  12. Dereplication of Natural Products Using GC-TOF Mass Spectrometry: Improved Metabolite Identification by Spectral Deconvolution Ratio Analysis.

    PubMed

    Carnevale Neto, Fausto; Pilon, Alan C; Selegato, Denise M; Freire, Rafael T; Gu, Haiwei; Raftery, Daniel; Lopes, Norberto P; Castro-Gamboa, Ian

    2016-01-01

    Dereplication based on hyphenated techniques has been extensively applied in plant metabolomics, thereby avoiding re-isolation of known natural products. However, due to the complex nature of biological samples and their large concentration range, dereplication requires the use of chemometric tools to comprehensively extract information from the acquired data. In this work we developed a reliable GC-MS-based method for the identification of non-targeted plant metabolites by combining the Ratio Analysis of Mass Spectrometry deconvolution tool (RAMSY) with Automated Mass Spectral Deconvolution and Identification System software (AMDIS). Plant species from Solanaceae, Chrysobalanaceae and Euphorbiaceae were selected as model systems due to their molecular diversity, ethnopharmacological potential, and economic value. The samples were analyzed by GC-MS after methoximation and silylation reactions. Dereplication was initiated with the use of a factorial design of experiments to determine the best AMDIS configuration for each sample, considering linear retention indices and mass spectral data. A heuristic factor (CDF, compound detection factor) was developed and applied to the AMDIS results in order to decrease the false-positive rates. Despite the enhancement in deconvolution and peak identification, the empirical AMDIS method was not able to fully deconvolute all GC-peaks, leading to low MF values and/or missing metabolites. RAMSY was applied as a complementary deconvolution method to AMDIS to peaks exhibiting substantial overlap, resulting in recovery of low-intensity co-eluted ions. The results from this combination of optimized AMDIS with RAMSY attested to the ability of this approach as an improved dereplication method for complex biological samples such as plant extracts.

  13. Dereplication of Natural Products Using GC-TOF Mass Spectrometry: Improved Metabolite Identification by Spectral Deconvolution Ratio Analysis

    PubMed Central

    Carnevale Neto, Fausto; Pilon, Alan C.; Selegato, Denise M.; Freire, Rafael T.; Gu, Haiwei; Raftery, Daniel; Lopes, Norberto P.; Castro-Gamboa, Ian

    2016-01-01

    Dereplication based on hyphenated techniques has been extensively applied in plant metabolomics, thereby avoiding re-isolation of known natural products. However, due to the complex nature of biological samples and their large concentration range, dereplication requires the use of chemometric tools to comprehensively extract information from the acquired data. In this work we developed a reliable GC-MS-based method for the identification of non-targeted plant metabolites by combining the Ratio Analysis of Mass Spectrometry deconvolution tool (RAMSY) with Automated Mass Spectral Deconvolution and Identification System software (AMDIS). Plant species from Solanaceae, Chrysobalanaceae and Euphorbiaceae were selected as model systems due to their molecular diversity, ethnopharmacological potential, and economic value. The samples were analyzed by GC-MS after methoximation and silylation reactions. Dereplication was initiated with the use of a factorial design of experiments to determine the best AMDIS configuration for each sample, considering linear retention indices and mass spectral data. A heuristic factor (CDF, compound detection factor) was developed and applied to the AMDIS results in order to decrease the false-positive rates. Despite the enhancement in deconvolution and peak identification, the empirical AMDIS method was not able to fully deconvolute all GC-peaks, leading to low MF values and/or missing metabolites. RAMSY was applied as a complementary deconvolution method to AMDIS to peaks exhibiting substantial overlap, resulting in recovery of low-intensity co-eluted ions. The results from this combination of optimized AMDIS with RAMSY attested to the ability of this approach as an improved dereplication method for complex biological samples such as plant extracts. PMID:27747213

  14. SOSPEX, an interactive tool to explore SOFIA spectral cubes

    NASA Astrophysics Data System (ADS)

    Fadda, Dario; Chambers, Edward T.

    2018-01-01

    We present SOSPEX (SOFIA SPectral EXplorer), an interactive tool to visualize and analyze spectral cubes obtained with the FIFI-LS and GREAT instruments onboard the SOFIA Infrared Observatory. This software package is written in Python 3 and is available either through GitHub or Anaconda. Through this GUI it is possible to explore directly the spectral cubes produced by the SOFIA pipeline and archived in the SOFIA Science Archive. Spectral cubes are visualized showing their spatial and spectral dimensions in two different windows. By selecting a part of the spectrum, the flux from the corresponding slice of the cube is visualized in the spatial window. On the other hand, it is possible to define apertures on the spatial window to show the corresponding spectral energy distribution in the spectral window. Flux isocontours can be overlaid on external images in the spatial window, while line names, atmospheric transmission, or external spectra can be overplotted on the spectral window. Atmospheric models with specific parameters can be retrieved, compared to the spectra, and applied to the uncorrected FIFI-LS cubes in cases where the standard values give unsatisfactory results. Subcubes can be selected and saved as FITS files by cropping or cutting the original cubes. Lines and continuum can be fitted in the spectral window, saving the results in JSON files which can be reloaded later. Finally, in the case of spatially extended observations, it is possible to compute spectral moments as a function of position to obtain velocity dispersion maps or velocity diagrams.
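    As a small illustration of the moment computation mentioned at the end of the abstract (a generic sketch, not SOSPEX code), the snippet below collapses a (velocity, y, x) cube into integrated-intensity, mean-velocity, and velocity-dispersion maps. The cube is a synthetic Gaussian line with an imposed velocity gradient rather than FIFI-LS or GREAT data.

```python
import numpy as np

def moment_maps(cube, velocities):
    """cube: (n_chan, ny, nx); velocities: (n_chan,) in km/s. Returns (mom0, mom1, mom2)."""
    v = velocities[:, None, None]
    dv = velocities[1] - velocities[0]
    mom0 = cube.sum(axis=0) * dv                                      # integrated intensity
    mom1 = (cube * v).sum(axis=0) * dv / mom0                         # intensity-weighted velocity
    mom2 = np.sqrt((cube * (v - mom1) ** 2).sum(axis=0) * dv / mom0)  # velocity dispersion
    return mom0, mom1, mom2

velocities = np.linspace(-200.0, 200.0, 81)
ny = nx = 32
centre = np.linspace(-50.0, 50.0, nx)[None, :] * np.ones((ny, 1))     # simple velocity gradient
cube = np.exp(-((velocities[:, None, None] - centre) ** 2) / (2 * 30.0**2))
mom0, mom1, mom2 = moment_maps(cube, velocities)
print(mom1[0, 0], mom1[0, -1])   # recovers the imposed -50 / +50 km/s gradient
print(mom2.mean())               # ~30 km/s dispersion
```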

  15. The spectral cell method in nonlinear earthquake modeling

    NASA Astrophysics Data System (ADS)

    Giraldo, Daniel; Restrepo, Doriam

    2017-12-01

    This study examines the applicability of the spectral cell method (SCM) to compute the nonlinear earthquake response of complex basins. SCM combines fictitious-domain concepts with the spectral version of the finite element method to solve the wave equations in heterogeneous geophysical domains. Nonlinear behavior is considered by implementing the Mohr-Coulomb and Drucker-Prager yielding criteria. We illustrate the performance of SCM with numerical examples of nonlinear basins exhibiting physically and computationally challenging conditions. The numerical experiments are benchmarked with results from overkill solutions and from MIDAS GTS NX, a finite element software for geotechnical applications. Our findings show good agreement between the two sets of results. Traditional spectral element implementations allow points per wavelength as low as PPW = 4.5 for high-order polynomials. Our findings show that in the presence of nonlinearity, high-order polynomials (p ≥ 3) require mesh resolutions of PPW ≥ 10 to ensure displacement errors below 10%.

  16. International Symposium on Numerical Methods in Engineering, 5th, Ecole Polytechnique Federale de Lausanne, Switzerland, Sept. 11-15, 1989, Proceedings. Volumes 1 & 2

    NASA Astrophysics Data System (ADS)

    Gruber, Ralph; Periaux, Jaques; Shaw, Richard Paul

    Recent advances in computational mechanics are discussed in reviews and reports. Topics addressed include spectral superpositions on finite elements for shear banding problems, strain-based finite plasticity, numerical simulation of hypersonic viscous continuum flow, constitutive laws in solid mechanics, dynamics problems, fracture mechanics and damage tolerance, composite plates and shells, contact and friction, metal forming and solidification, coupling problems, and adaptive FEMs. Consideration is given to chemical flows, convection problems, free boundaries and artificial boundary conditions, domain-decomposition and multigrid methods, combustion and thermal analysis, wave propagation, mixed and hybrid FEMs, integral-equation methods, optimization, software engineering, and vector and parallel computing.

  17. Synthesis, crystal structure investigation, spectroscopic characterizations and DFT computations on a novel 1-(2-chloro-4-phenylquinolin-3-yl)ethanone

    NASA Astrophysics Data System (ADS)

    Murugavel, S.; Stephen, C. S. Jacob Prasanna; Subashini, R.; Reddy, H. Raveendranatha; AnanthaKrishnan, Dhanabalan

    2016-10-01

    The title compound 1-(2-chloro-4-phenylquinolin-3-yl)ethanone (CPQE) was synthesised effectively by chlorination of 3-acetyl-4-phenylquinolin-2(1H)-one (APQ) using POCl3 reagent. Structural and vibrational spectroscopic studies were performed by utilizing single crystal X-ray diffraction, FTIR and NMR spectral analysis along with the DFT method utilizing GAUSSIAN 03 software. The VEDA program has been employed to perform a detailed interpretation of the vibrational spectra. Mulliken population analyses of atomic charges, MEP, HOMO-LUMO, NBO, global chemical reactivity descriptors and thermodynamic properties have been examined by the (DFT/B3LYP) method at the 6-311G(d,p) basis set level.

  18. An EPIC Tale of the Quiescent Particle Background

    NASA Technical Reports Server (NTRS)

    Snowden, S.L.; Kuntz, K.D.

    2017-01-01

    Extended Source Analysis Software Use Based Empirical Investigation: (1) Builds quiescent particle background (QPB) spectra and images for observations of extended sources that fill (or mostly fill) the FOV, i.e., annular background subtraction won't work. (2) Uses a combination of Filter Wheel Closed (FWC) and corner data to capture the spectral, spatial, and temporal variation of the quiescent particle background. New Work: (1) Improved understanding of the QPB (aided by adding a whole lot of data since 2008). (2) Significantly improved statistics (did I mention a LOT more data?). (3) Better characterization and identification of anomalous states. (4) Builds backgrounds for some anomalous states. (5) New efficient method for non-anomalous states.

  19. USGS Digital Spectral Library splib06a

    USGS Publications Warehouse

    Clark, Roger N.; Swayze, Gregg A.; Wise, Richard A.; Livo, K. Eric; Hoefen, Todd M.; Kokaly, Raymond F.; Sutley, Stephen J.

    2007-01-01

    Introduction We have assembled a digital reflectance spectral library that covers the wavelength range from the ultraviolet to far infrared along with sample documentation. The library includes samples of minerals, rocks, soils, physically constructed as well as mathematically computed mixtures, plants, vegetation communities, microorganisms, and man-made materials. The samples and spectra collected were assembled for the purpose of using spectral features for the remote detection of these and similar materials. Analysis of spectroscopic data from laboratory, aircraft, and spacecraft instrumentation requires a knowledge base. The spectral library discussed here forms a knowledge base for the spectroscopy of minerals and related materials of importance to a variety of research programs being conducted at the U.S. Geological Survey. Much of this library grew out of the need for spectra to support imaging spectroscopy studies of the Earth and planets. Imaging spectrometers, such as the National Aeronautics and Space Administration (NASA) Airborne Visible/Infra Red Imaging Spectrometer (AVIRIS) or the NASA Cassini Visual and Infrared Mapping Spectrometer (VIMS) which is currently orbiting Saturn, have narrow bandwidths in many contiguous spectral channels that permit accurate definition of absorption features in spectra from a variety of materials. Identification of materials from such data requires a comprehensive spectral library of minerals, vegetation, man-made materials, and other subjects in the scene. Our research involves the use of the spectral library to identify the components in a spectrum of an unknown. Therefore, the quality of the library must be very good. However, the quality required in a spectral library to successfully perform an investigation depends on the scientific questions to be answered and the type of algorithms to be used. For example, to map a mineral using imaging spectroscopy and the mapping algorithm of Clark and others (1990a, 2003b), one simply needs a diagnostic absorption band. The mapping system uses continuum-removed reference spectral features fitted to features in observed spectra. Spectral features for such algorithms can be obtained from a spectrum of a sample containing large amounts of contaminants, including those that add other spectral features, as long as the shape of the diagnostic feature of interest is not modified. If, however, the data are needed for radiative transfer models to derive mineral abundances from reflectance spectra, then completely uncontaminated spectra are required. This library contains spectra that span a range of quality, with purity indicators to flag spectra for (or against) particular uses. Acquiring spectral measurements and performing sample characterizations for this library has taken about 15 person-years of effort. Software to manage the library and provide scientific analysis capability is provided (Clark, 1980, 1993). A personal computer (PC) reader for the library is also available (Livo and others, 1993). The program reads specpr binary files (Clark, 1980, 1993) and plots spectra. Another program that reads the specpr format is written in IDL (Kokaly, 2005). In our view, an ideal spectral library consists of samples covering a very wide range of materials, has large wavelength range with very high precision, and has enough sample analyses and documentation to establish the quality of the spectra. Time and available resources limit what can be achieved. 
Ideally, for each mineral, the sample analysis would include X-ray diffraction (XRD), electron microprobe (EM) or X-ray fluorescence (XRF), and petrographic microscopic analyses. For some minerals, such as iron oxides, additional analyses such as Mossbauer would be helpful. We have found that to make the basic spectral measurements, provide XRD, EM or XRF analyses, and microscopic analyses, document the results, and complete an entry of one spectral library sample, all takes about
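    The continuum-removed feature matching described above can be illustrated with a minimal continuum-removal sketch: the upper convex hull of a reflectance spectrum serves as the continuum, and dividing by it isolates absorption-band shapes. The synthetic spectrum below is a placeholder, not a splib06a entry, and the hull construction is a generic monotone-chain routine rather than USGS code.

```python
import numpy as np

def continuum_removed(wavelengths, reflectance):
    """Divide a spectrum by its upper convex hull (the continuum)."""
    points = list(zip(wavelengths, reflectance))
    hull = []
    for p in points:  # forward pass of Andrew's monotone chain, upper hull only
        while len(hull) >= 2 and ((hull[-1][0] - hull[-2][0]) * (p[1] - hull[-2][1]) -
                                  (hull[-1][1] - hull[-2][1]) * (p[0] - hull[-2][0])) >= 0:
            hull.pop()
        hull.append(p)
    hx, hy = zip(*hull)
    continuum = np.interp(wavelengths, hx, hy)
    return reflectance / continuum

wl = np.linspace(0.4, 2.5, 211)                                                   # micrometres
spectrum = 0.3 + 0.2 * wl - 0.15 * np.exp(-((wl - 2.2) ** 2) / (2 * 0.05 ** 2))   # slope + one band
cr = continuum_removed(wl, spectrum)
print(cr.max())                    # ~1.0 on the continuum
print(wl[cr.argmin()], cr.min())   # band centre near 2.2 um, band depth below 1
```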

  20. Using GIS servers and interactive maps in spectral data sharing and administration: Case study of Ahvaz Spectral Geodatabase Platform (ASGP)

    NASA Astrophysics Data System (ADS)

    Karami, Mojtaba; Rangzan, Kazem; Saberi, Azim

    2013-10-01

    With the emergence of airborne and spaceborne hyperspectral sensors, spectroscopic measurements are gaining more importance in remote sensing. Therefore, the amount of available spectral reference data is constantly increasing. This rapid increase is often accompanied by poor data management, which ultimately leads to the isolation of data on disk storage. Spectral data without a precise description of the target, methods, environment, and sampling geometry cannot be used by other researchers. Moreover, existing spectral data (even when accompanied by good documentation) often become virtually invisible or unreachable for researchers. Providing documentation and a data-sharing framework for spectral data, in which researchers are able to search for or share spectral data and documentation, would improve the data lifetime. Relational Database Management Systems (RDBMS) are the main candidates for spectral data management, and their efficiency has been proven by many studies and applications to date. In this study, a new approach to spectral data administration is presented based on the spatial identity of spectral samples. This method benefits from the scalability and performance of an RDBMS for the storage of spectral data, but uses GIS servers to provide users with interactive maps as an interface to the system. The spectral files, photographs and descriptive data are treated as attributes of a geospatial object. A spectral processing unit is responsible for evaluating metadata quality and performing routine spectral processing tasks for newly added data. As a result, using a web browser, users can visually examine data availability and/or search for data based on the descriptive attributes associated with it. The proposed system is scalable and, besides giving users a good sense of what data are available in the database, it facilitates the use of spectral reference data in producing geoinformation.

  1. Universal dispersion model for characterization of optical thin films over wide spectral range: Application to magnesium fluoride

    NASA Astrophysics Data System (ADS)

    Franta, Daniel; Nečas, David; Giglia, Angelo; Franta, Pavel; Ohlídal, Ivan

    2017-11-01

    Optical characterization of magnesium fluoride thin films is performed in a wide spectral range from the far infrared to the extreme ultraviolet (0.01-45 eV) utilizing the universal dispersion model. Two film defects, i.e., random roughness of the upper boundary and a defect transition layer at the lower boundary, are taken into account. An extension of the universal dispersion model, consisting in expressing the excitonic contributions as linear combinations of Gaussian and truncated Lorentzian terms, is introduced. The spectral dependencies of the optical constants are presented in graphical form and by the complete set of dispersion parameters, which allows generating tabulated optical constants with the required range and step using a simple utility in the newAD2 software package.

  2. Analog of electromagnetically induced transparency at terahertz frequency based on a bilayer-double-H-metamaterial

    NASA Astrophysics Data System (ADS)

    Wang, Yue'e.; Li, Zhi; Hu, Fangrong

    2018-01-01

    We designed a bilayer-double-H metamaterial (BDHM), composed of two metal layers and two dielectric layers, that mimics the spectral response of electromagnetically induced transparency (EIT) at terahertz frequencies. By changing the incident angle, the BDHM exhibits an EIT-like spectral response. The tunable spectral performance and the modulation mechanism of the transparency peak are theoretically investigated using full-wave electromagnetic simulation software. The physical mechanism of the EIT-like effect is based on the constructive and destructive interference between the induced electric dipoles. Our work provides a new way to realize the EIT-like effect simply by changing the incident angle on the metamaterial. The potential applications include tunable filters, sensors, attenuators, switches, and so on.

  3. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    PubMed

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

    Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, and dietary supplementation offers therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages which employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6%±8.6). Results suggest that software packages that employ the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
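    As a hedged illustration of the peak-fitting style of quantification contrasted above (not the software packages compared in the study), the sketch below fits a Lorentzian line to a simulated creatine resonance near 3.03 ppm and uses its analytic area as a concentration estimate; all line parameters are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(ppm, amplitude, centre, width, baseline):
    return amplitude * (width**2 / ((ppm - centre)**2 + width**2)) + baseline

ppm = np.linspace(2.7, 3.3, 600)
rng = np.random.default_rng(4)
truth = lorentzian(ppm, amplitude=1.0, centre=3.03, width=0.02, baseline=0.05)
spectrum = truth + 0.02 * rng.standard_normal(ppm.size)

popt, _ = curve_fit(lorentzian, ppm, spectrum, p0=[0.8, 3.0, 0.03, 0.0])  # rough initial guess
amplitude, centre, width, baseline = popt
area = np.pi * amplitude * width            # analytic area of the fitted Lorentzian
print(f"fitted centre {centre:.3f} ppm, area {area:.4f} (proportional to creatine signal)")
```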

  4. Identification of seedling cabbages and weeds using hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Target detection is one of the research focuses for precision chemical application. This study developed a method to identify seedling cabbages and weeds using hyperspectral imaging. In processing the image data, with ENVI software, after dimension reduction, noise reduction, de-correlation for h...

  5. Detection of soil erosion with Thematic Mapper (TM) satellite data within Pinyon-Juniper woodlands

    NASA Technical Reports Server (NTRS)

    Price, Kevin Paul

    1987-01-01

    Pinyon-Juniper woodlands dominate approximately 24.3 million hectares (60 million acres) in the western United States. The overall objective was to test the sensitivity of the LANDSAT Thematic Mapper (TM) spectral data for detecting varying degrees of soil erosion within the Pinyon-Juniper woodlands. A second objective was to assess the potential of the spectral data for assigning the Universal Soil Loss Equation (USLE) crop management (C) factor values to varying cover types within the woodland. Thematic Mapper digital data for June 2, 1984 on channels 2, 3, 4, and 5 were used. Digital data analysis was performed using the ELAS software package. Best results were achieved using CLUS, an unsupervised clustering algorithm. Fifteen of the 40 Pinyon-Juniper signatures were identified as being relatively pure Pinyon-Juniper woodland. Final analysis resulted in the grouping of the 15 signatures into three major groups. Ten study sites were selected from each of the three groups and located on the ground. At each site the following field measurements were taken: percent tree canopy and percent understory cover, soil texture, total soil loss, and soil erosion rate estimates. A technique for measuring soil erosion within Pinyon-Juniper woodlands was developed. A theoretical model of site degradation after Pinyon-Juniper invasion is presented.

  6. A method of noise reduction in heterodyne interferometric vibration metrology by combining auto-correlation analysis and spectral filtering

    NASA Astrophysics Data System (ADS)

    Hao, Hongliang; Xiao, Wen; Chen, Zonghui; Ma, Lan; Pan, Feng

    2018-01-01

    Heterodyne interferometric vibration metrology is a useful technique for dynamic displacement and velocity measurement as it can provide a synchronous full-field output signal. With the advent of cost-effective, high-speed real-time signal processing systems and software, processing of the complex signals encountered in interferometry has become more feasible. However, due to the coherent nature of the laser sources, the sequence of heterodyne interferograms is corrupted by a mixture of coherent speckle and incoherent additive noise, which can severely degrade the accuracy of the demodulated signal and the optical display. In this paper, a new heterodyne interferometric demodulation method combining auto-correlation analysis and spectral filtering is described, leading to an expression for the dynamic displacement and velocity of the object under test that is significantly more accurate in both the amplitude and frequency of the vibrating waveform. We present a mathematical model of the signals obtained from interferograms that contain both vibration information of the measured objects and the noise. A simulation of the signal demodulation process is presented and used to investigate the noise from the system and external factors. The experimental results show excellent agreement with measurements from a commercial Laser Doppler Velocimetry (LDV) system.

  7. Chemometric aided NIR portable instrument for rapid assessment of medicine quality.

    PubMed

    Zontov, Y V; Balyklova, K S; Titova, A V; Rodionova, O Ye; Pomerantsev, A L

    2016-11-30

    The progress in instrumentation technology has led to miniaturization of NIR instruments. Fast systems that contain no moving parts were developed to be used in the field, warehouses, drugstores, etc. At the same time, in general these portable/handheld spectrometers have a lower spectral resolution and a narrower spectral region than stationary ones. Vendors of portable instruments supply their equipment with special software for spectra processing, which aims at simplifying the analyst's work to the highest degree possible. Often such software is not fully capable of solving complex problems. In application to a real-world problem of counterfeit drug detection we demonstrate that even impaired spectral data do carry information sufficient for drug authentication. The chemometrics aided approach helps to extract this information and thus to extend the applicability of miniaturized NIR instruments. MicroPhazir-RX NIR spectrometer is used as an example of a portable instrument. The data driven soft independent modeling of class analogy (DD-SIMCA) method is employed for data processing. A representative set of tablets of a calcium channel blocker from 6 different manufacturers is used to illustrate the proposed approach. It is shown that the DD-SIMCA approach yields a better result than the basic method provided by the instrument vendor. Copyright © 2016 Elsevier B.V. All rights reserved.
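    The snippet below is a simplified one-class, SIMCA-style sketch (PCA on the target class plus a residual threshold), meant only to illustrate the class-modelling idea behind the approach described above; it is not the DD-SIMCA algorithm itself, and the spectra are synthetic.

```python
import numpy as np

class SimpleSimca:
    """One-class model: accept samples whose PCA reconstruction residual is small."""
    def __init__(self, n_components=3, quantile=0.95):
        self.n_components, self.quantile = n_components, quantile

    def fit(self, X):
        self.mean_ = X.mean(axis=0)
        _, _, vt = np.linalg.svd(X - self.mean_, full_matrices=False)
        self.components_ = vt[: self.n_components]          # target-class subspace
        self.threshold_ = np.quantile(self._residual_ss(X), self.quantile)
        return self

    def _residual_ss(self, X):
        Xc = X - self.mean_
        reconstructed = Xc @ self.components_.T @ self.components_
        return np.sum((Xc - reconstructed) ** 2, axis=1)

    def is_member(self, X):
        return self._residual_ss(X) <= self.threshold_

rng = np.random.default_rng(5)
wavelengths = np.linspace(1600, 2400, 100)                   # nm, a plausible handheld NIR range
genuine = np.exp(-((wavelengths - 2100) / 150) ** 2) + 0.01 * rng.standard_normal((60, 100))
suspect = np.exp(-((wavelengths - 1900) / 150) ** 2) + 0.01 * rng.standard_normal((5, 100))
model = SimpleSimca().fit(genuine)
print(model.is_member(genuine[:5]))   # mostly True: accepted as the target class
print(model.is_member(suspect))       # False for all: flagged as outside the target class
```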

  8. PROS: An IRAF based system for analysis of x ray data

    NASA Technical Reports Server (NTRS)

    Conroy, M. A.; Deponte, J.; Moran, J. F.; Orszak, J. S.; Roberts, W. P.; Schmidt, D.

    1992-01-01

    PROS is an IRAF based software package for the reduction and analysis of x-ray data. The use of a standard, portable, integrated environment provides for both multi-frequency and multi-mission analysis. The analysis of x-ray data differs from optical analysis due to the nature of the x-ray data and its acquisition during constantly varying conditions. The scarcity of data, the low signal-to-noise ratio and the large gaps in exposure time make data screening and masking an important part of the analysis. PROS was developed to support the analysis of data from the ROSAT and Einstein missions but many of the tasks have been used on data from other missions. IRAF/PROS provides a complete end-to-end system for x-ray data analysis: (1) a set of tools for importing and exporting data via FITS format -- in particular, IRAF provides a specialized event-list format, QPOE, that is compatible with its IMAGE (2-D array) format; (2) a powerful set of IRAF system capabilities for both temporal and spatial event filtering; (3) full set of imaging and graphics tasks; (4) specialized packages for scientific analysis such as spatial, spectral and timing analysis -- these consist of both general and mission specific tasks; and (5) complete system support including ftp and magnetic tape releases, electronic and conventional mail hotline support, electronic mail distribution of solutions to frequently asked questions and current known bugs. We will discuss the philosophy, architecture and development environment used by PROS to generate a portable, multimission software environment. PROS is available on all platforms that support IRAF, including Sun/Unix, VAX/VMS, HP, and Decstations. It is available on request at no charge.

  9. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    PubMed

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  10. An in-line micro-pyrolysis system to remove contaminating organic species for precise and accurate water isotope analysis by spectroscopic techniques

    NASA Astrophysics Data System (ADS)

    Panetta, R. J.; Hsiao, G.

    2011-12-01

    Trace levels of organic contaminants such as short alcohols and terpenoids have been shown to cause spectral interference in water isotope analysis by spectroscopic techniques. The result is degraded precision and accuracy in both δD and δ18O for samples such as beverages, plant extracts or slightly contaminated waters. An initial approach offered by manufacturers is post-processing software that analyzes spectral features to identify and flag contaminated samples. However, it is impossible for this software to accurately reconstruct the water isotope signature, thus it is primarily a metric for data quality. Here, we describe a novel in-line pyrolysis system (Micro-Pyrolysis Technology, MPT) placed just prior to the inlet of a cavity ring-down spectroscopy (CRDS) analyzer that effectively removes interfering organic molecules without altering the isotope values of the water. Following injection of the water sample, N2 carrier gas passes the sample through a micro-pyrolysis tube heated with multiple high temperature elements in an oxygen-free environment. The temperature is maintained above the thermal decomposition threshold of most organic compounds (≤ 900 °C), but well below that of water (~2000 °C). The main products of the pyrolysis reaction are non-interfering species such as elemental carbon and H2 gas. To test the efficacy and applicability of the system, waters of known isotopic composition were spiked with varying amounts of common interfering alcohols (methanol, ethanol, propanol, hexanol, trans-2-hexenol, cis-3-hexanol up to 5 % v/v) and common soluble plant terpenoids (carveol, linalool, geraniol, prenol). Spiked samples with no treatment to remove the organics show strong interfering absorption peaks that adversely affect the δD and δ18O values. However, with the MPT in place, all interfering absorption peaks are removed and the water absorption spectrum is fully restored. As a consequence, the δD and δ18O values also return to their original values, demonstrating effective removal of interfering species with no isotopic fractionation during the pyrolysis. Quantitative tests of spiked water show that the MPT is most effective at removing interferences up to 1% v/v. This level is typical for plant extracts and interstitial waters, i.e., the majority of natural samples that suffer from spectral interference.

  11. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
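    A minimal sketch of the spectra-to-concentration step this methodology relies on (assuming scikit-learn's PLSRegression; the spectra below are synthetic Gaussian bands, not real diode-array data): train partial least squares regression on absorbance spectra of known mixtures, then predict the concentrations of co-eluting proteins from a new spectrum.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(6)
wavelengths = np.linspace(240, 340, 101)                     # nm
pure = np.stack([np.exp(-((wavelengths - c) / 12.0) ** 2) for c in (270, 280, 290)])  # 3 pseudo-components

# Calibration set: random mixtures of the three components plus measurement noise.
C_train = rng.uniform(0.0, 2.0, (80, 3))                     # concentrations (arbitrary units)
A_train = C_train @ pure + 0.01 * rng.standard_normal((80, 101))

pls = PLSRegression(n_components=3)
pls.fit(A_train, C_train)

# "Inline" prediction for a new co-eluting mixture.
c_true = np.array([[0.8, 1.2, 0.3]])
a_new = c_true @ pure + 0.01 * rng.standard_normal((1, 101))
print(pls.predict(a_new).round(2), "vs true", c_true)
```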

  12. The Lightcurve Legacy of COS and STIS

    NASA Astrophysics Data System (ADS)

    Ely, Justin

    2014-10-01

    The Cosmic Origins Spectrograph (COS) and Space Telescope Imaging Spectrograph (STIS) have been advancing astronomy with high quality spectroscopic products for years, and in the case of STIS, more than a decade. Though already incredibly productive, there remains an untapped potential of discovery in the data of these instruments. Due to their detector designs, both of these instruments can operate in a mode where each individual photon's arrival time is recorded and saved. Though this TIME-TAG ability is typically utilized to provide second-by-second calibrations to the final spectral data, this mode can also be exploited to re-examine the data in the time domain, turning spectra into lightcurves. With the appropriate knowledge and software, the time-resolved spectra can instead be extracted into photometric lightcurves with high temporal and spectral resolution. We propose here to expand our current software tool into a community-ready pipeline and to deliver a collection of high-level science lightcurves for the entire COS and STIS TIME-TAG archives. By providing this tool and data archive to the community we will lower the barrier to time domain research with these two instruments. This will demonstrate to the community not only the potential contained in re-analysis of existing datasets, but also the exquisite time-series capabilities of the instruments available for future cycles. The enabling and demonstration of this so far underutilized technique should be done now. At a time when HST and its UV capabilities are nearing their end, it's vital that all possible avenues for exploration are made readily available to the scientific community.
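
    A hypothetical sketch of the core TIME-TAG reduction step, binning photon arrival times into a count-rate lightcurve; the bin size, wavelength cut, and function name are assumptions for illustration, not the proposed COS/STIS pipeline.

```python
# Hypothetical sketch: turn TIME-TAG photon events into a lightcurve by binning arrival times.
import numpy as np

def events_to_lightcurve(arrival_times_s, wavelengths_A, bin_s=1.0, wave_range=None):
    """Bin photon arrival times into a count rate per time bin, optionally over a wavelength cut."""
    t = np.asarray(arrival_times_s, dtype=float)
    if wave_range is not None:
        lo, hi = wave_range
        t = t[(wavelengths_A >= lo) & (wavelengths_A <= hi)]
    edges = np.arange(t.min(), t.max() + bin_s, bin_s)
    counts, _ = np.histogram(t, bins=edges)
    midpoints = 0.5 * (edges[:-1] + edges[1:])
    return midpoints, counts / bin_s          # count rate in each bin

# Example with synthetic events standing in for a TIME-TAG exposure:
rng = np.random.default_rng(1)
times = np.sort(rng.uniform(0, 600, size=20000))   # 600 s exposure
waves = rng.uniform(1150, 1750, size=times.size)   # FUV wavelengths in Angstrom
t_mid, rate = events_to_lightcurve(times, waves, bin_s=5.0, wave_range=(1200, 1250))
```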

  13. NMR of (133)Cs(+) in stretched hydrogels: One-dimensional, z- and NOESY spectra, and probing the ion's environment in erythrocytes.

    PubMed

    Kuchel, Philip W; Shishmarev, Dmitry; Puckeridge, Max; Levitt, Malcolm H; Naumann, Christoph; Chapman, Bogdan E

    2015-12-01

    (133)Cs nuclear magnetic resonance (NMR) spectroscopy was conducted on (133)Cs(+) in gelatin hydrogels that were either relaxed or stretched. Stretching generated a septet from this spin-7/2 nucleus, and its nuclear magnetic relaxation was studied via z-spectra, and two-dimensional nuclear Overhauser (NOESY) spectroscopy. Various spectral features were well simulated by using Mathematica and the software package SpinDynamica. Spectra of CsCl in suspensions of human erythrocytes embedded in gelatin gel showed separation of the resonances from the cation inside and outside the cells. Upon stretching the sample, the extracellular (133)Cs(+) signal split into a septet, while the intracellular peak was unchanged, revealing different alignment/ordering properties of the environment inside and around the cells. Differential interference contrast light microscopy confirmed that the cells were stretched when the overall sample was elongated. Analysis of the various spectral features of (133)Cs(+) reported here opens up applications of this K(+) congener for studies of cation-handling by metabolically-active cells and tissues in aligned states. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Comparative study of the efficiency of computed univariate and multivariate methods for the estimation of the binary mixture of clotrimazole and dexamethasone using two different spectral regions

    NASA Astrophysics Data System (ADS)

    Fayez, Yasmin Mohammed; Tawakkol, Shereen Mostafa; Fahmy, Nesma Mahmoud; Lotfy, Hayam Mahmoud; Shehata, Mostafa Abdel-Aty

    2018-04-01

    Three methods of analysis, all requiring computational procedures in the Matlab® software, are conducted. The first is the univariate mean centering method, which eliminates the interfering signal of one component at a selected wavelength, leaving the measured amplitude to represent the component of interest only. The other two are the multivariate methods PLS and PCR, which depend on a large number of variables to extract the maximum amount of information required to determine the component of interest in the presence of the other. Accurate and precise results are obtained from the three methods for determining clotrimazole in the linearity ranges 1-12 μg/mL and 75-550 μg/mL with dexamethasone acetate at 2-20 μg/mL in synthetic mixtures and pharmaceutical formulation, using two different spectral regions, 205-240 nm and 233-278 nm. The results obtained are compared statistically to each other and to the official methods.
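
    A generic sketch of one of the multivariate approaches (principal component regression) using scikit-learn on synthetic spectra; the concentration ranges are taken from the abstract, but the band shapes, noise level, number of components and all names are illustrative assumptions, not the authors' Matlab® implementation.

```python
# Generic PCR sketch for a two-component mixture with synthetic Gaussian bands.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
wavelengths = np.arange(205, 241)                 # one of the reported spectral regions, nm

def band(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

pure = np.vstack([band(215, 8), band(230, 10)])   # stand-in pure spectra of the two analytes

# Calibration mixtures with known concentrations (ug/mL), within the reported ranges.
conc = rng.uniform([1, 2], [12, 20], size=(30, 2))
absorbance = conc @ pure + 0.002 * rng.normal(size=(30, wavelengths.size))

pcr = make_pipeline(PCA(n_components=4), LinearRegression())
pcr.fit(absorbance, conc)

unknown = np.array([[6.0, 10.0]]) @ pure
print(pcr.predict(unknown))                       # close to [6.0, 10.0]
```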

  15. Density functional theory analysis and molecular docking evaluation of 1-(2, 5-dichloro-4-sulfophenyl)-3-methyl-5-pyrazolone as COX2 inhibitor against inflammatory diseases

    NASA Astrophysics Data System (ADS)

    Kavitha, T.; Velraj, G.

    2017-08-01

    The molecular structure of 1-(2, 5-Dichloro-4-Sulfophenyl)-3-Methyl-5-Pyrazolone (DSMP) was optimized using the DFT/B3LYP/6-31++G(d,p) level and its corresponding experimental as well as theoretical FT-IR, FT-Raman vibrational frequencies and UV-Vis spectral analysis were carried out. The vibrational assignments and total energy distributions of each vibration were presented with the aid of the Veda 4xx software. The molecular electrostatic potential, HOMO-LUMO energies, global and local reactivity descriptors and natural bond orbitals were analyzed in order to find the most probable reactive sites of the molecule, and it was found that the DSMP molecule possesses enhanced nucleophilic activity. A commonly known COX2 inhibitor, celecoxib (CXB), was also found to exhibit similar reactivity properties, and hence DSMP was also expected to inhibit COX enzymes. In order to detect the COX inhibition nature of DSMP, molecular docking analysis was carried out with the help of the Autodock software. For that, the optimized structure was in turn used for docking DSMP with COX enzymes. The binding energy scores and inhibitory constant values reveal that the DSMP molecule possesses good binding affinity and a low inhibition constant towards the COX2 enzyme, and hence it can be used as an anti-inflammatory drug after the necessary biological tests are carried out.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt R.

    BASTet is an advanced software library written in Python. BASTet serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data, ii) storage of derived analysis data, iii) provenance of analyses, and iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces that enable developers to directly integrate their analyses with OpenMSI's web-based viewing infrastructure without having to know OpenMSI in detail. BASTet also provides numerous helper classes and tools to assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and provide description methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.

  17. Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Jong, Jen-Yi

    1986-01-01

    An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.

  18. UFO (UnFold Operator) user guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kissel, L.; Biggs, F.; Marking, T.R.

    UFO is a collection of interactive utility programs for estimating unknown functions of one variable using a wide-ranging class of information as input, for miscellaneous data-analysis applications, for performing feasibility studies, and for supplementing our other software. Inverse problems, which include spectral unfolds, inverse heat-transfer problems, time-domain deconvolution, and unusual or difficult curve-fit problems, are classes of applications for which UFO is well suited. Extensive use of B-splines and (X,Y)-datasets is made to represent functions. The (X,Y)-dataset representation is unique in that it is not restricted to equally-spaced data. This feature is used, for example, in a table-generating algorithm that evaluates a function to a user-specified interpolation accuracy while minimizing the number of points stored in the corresponding dataset. UFO offers a variety of miscellaneous data-analysis options such as plotting, comparing, transforming, scaling, integrating; and adding, subtracting, multiplying, and dividing functions together. These options are often needed as intermediate steps in analyzing and solving difficult inverse problems, but they also find frequent use in other applications. Statistical options are available to calculate goodness-of-fit to measurements, specify error bands on solutions, give confidence limits on calculated quantities, and to point out the statistical consequences of operations such as smoothing. UFO is designed to do feasibility studies on a variety of engineering measurements. It is also tailored to supplement our Test Analysis and Design codes, SRAD Test-Data Archive software, and Digital Signal Analysis routines.
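
    A rough illustration of the kind of accuracy-driven table generation described above: points are added only where linear interpolation misses the target function by more than a tolerance. This recursive-bisection sketch is our own construction and is not the UFO algorithm.

```python
# Accuracy-driven table generation: refine only where linear interpolation is too coarse.
import numpy as np

def build_table(f, a, b, tol=1e-3):
    """Return x, y arrays such that piecewise-linear interpolation of f is within ~tol."""
    def refine(x0, x1):
        xm = 0.5 * (x0 + x1)
        interp = 0.5 * (f(x0) + f(x1))          # linear estimate at the midpoint
        if abs(f(xm) - interp) <= tol:
            return [x0]                         # segment is fine; keep only the left endpoint
        return refine(x0, xm) + refine(xm, x1)  # otherwise subdivide both halves
    xs = np.array(refine(a, b) + [b])
    return xs, f(xs)

x, y = build_table(np.exp, 0.0, 5.0, tol=1e-3)
print(len(x))   # far fewer points than a uniform grid of comparable accuracy would need
```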

  19. Field Utilization and Analysis of AIS 128-channel Imagery Using Microcomputers: Application to Yerington, Nevada Field Area

    NASA Technical Reports Server (NTRS)

    Lyon, R. J. P.; Lanz, K.

    1985-01-01

    Geologists in exploration need to be able to determine the mineral composition of a given outcrop, and then proceed to another in order to carry out the process of geologic mapping. Since April 1984 researchers have been developing a portable microcomputer-based imaging system (with a grey-scale of 16 shades of amber), which was demonstrated during the November 1984 GSA field trip in the field at Yerington, NV. A color version of the same technology was recently demonstrated. The portable computer selected is a COLBY 10-Megabyte, hard disk-equipped repackaged IBM/XT, which operates on either 110/220 VAC or on 12 VDC from the cigarette lighter in a field vehicle. A COMPAQ PLUS or an IBM Portable will also work on modified software. The underlying concept is that the atmospheric transmission and surface albedo/slope terms are multiplicative, relating the spectral irradiance to the spectral color of the surface materials. Thus, the spectral color of a pixel remains after averaged log-albedo and log-irradiance have been estimated. All these steps can be carried out on the COLBY microcomputer, using 80 image lines of the 128-channel, 12-bit imagery. Results are shown for such an 80-line segment, showing the identification of an O-H bearing mineral group (of slightly varying specific characters) on the flight line.

  20. New radio meteor detecting and logging software

    NASA Astrophysics Data System (ADS)

    Kaufmann, Wolfgang

    2017-08-01

    A new piece of software, "Meteor Logger", for the radio observation of meteors is described. It analyses an incoming audio stream in the frequency domain to detect a radio meteor signal on the basis of its signature, instead of applying an amplitude threshold. To this end, the distribution of the three frequencies with the highest spectral power is considered over time (the 3f method). An auto-notch algorithm is developed to prevent radio meteor signal detection from being jammed by an interference line that may be present. The results of an exemplary logging session are discussed.
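
    An illustrative sketch of the 3f idea: for each audio frame, take the three strongest spectral bins and flag frames where they cluster tightly, as expected for a narrowband meteor echo. The frame length, sample rate, and spread threshold are assumptions, not Meteor Logger's actual parameters.

```python
# Toy 3f-style detector: flag frames whose three strongest frequencies sit close together.
import numpy as np

def detect_3f(audio, fs=48000, frame=4096, spread_hz=40.0):
    hits = []
    window = np.hanning(frame)
    freqs = np.fft.rfftfreq(frame, d=1.0 / fs)
    for start in range(0, len(audio) - frame, frame):
        spectrum = np.abs(np.fft.rfft(audio[start:start + frame] * window))
        top3 = freqs[np.argsort(spectrum)[-3:]]       # three highest-power frequencies
        if top3.max() - top3.min() < spread_hz:       # signature: they cluster tightly
            hits.append((start / fs, float(np.median(top3))))
    return hits                                       # (time in s, approximate echo frequency)

# Synthetic test: noise with a brief 1 kHz tone standing in for a meteor echo.
rng = np.random.default_rng(3)
t = np.arange(0, 5.0, 1 / 48000)
audio = 0.1 * rng.normal(size=t.size)
burst = (t > 2.0) & (t < 2.3)
audio[burst] += np.sin(2 * np.pi * 1000 * t[burst])
print(detect_3f(audio))
```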

  1. Practical uses of SPFIT

    NASA Astrophysics Data System (ADS)

    Drouin, Brian J.

    2017-10-01

    Over twenty-five years ago, Herb Pickett introduced his quantum-mechanical fitting programs to the spectroscopic community. The utility and flexibility of the software has enabled a whole generation of spectroscopists to analyze both simple and complex spectra without having to write and compile their own code. Last year Stewart Novick provided a primer for the coming generation of users. This follow-on work will serve as a guide to intermediate and advanced usage of the software. It is meant to be used in concert with the online documentation as well as the spectral line catalog archive.

  2. Determination of the vinyl fluoride line intensities by TDL spectroscopy: the object oriented approach of Visual Line Shape Fitting Program to line profile analysis

    NASA Astrophysics Data System (ADS)

    Tasinato, Nicola; Pietropolli Charmet, Andrea; Stoppa, Paolo; Giorgianni, Santi

    2010-03-01

    In this work the self-broadening coefficients and the integrated line intensities for a number of ro-vibrational transitions of vinyl fluoride have been determined for the first time by means of TDL spectroscopy. The spectra recorded in the atmospheric window around 8.7 µm appear very crowded with a density of about 90 lines per cm-1. In order to fit these spectral features, new fitting software has been implemented. The program, which is designed for laser spectroscopy, can fit many lines simultaneously on the basis of different theoretical profiles (Doppler, Lorentz, Voigt, Galatry and Nelkin-Ghatak). Details of the object-oriented implementation of the application are given. The reliability of the program is demonstrated by determining the line parameters of some ro-vibrational lines of sulphur dioxide in the ν1 band region around 9 µm. Then the software is used for the line profile analysis of vinyl fluoride. The experimental line shapes show deviations from the Voigt profile, which can be well modelled by using a Dicke narrowed line shape function. This leads to the determination of the self-narrowing coefficient within the framework of the strong collision model.
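
    A hedged sketch of the simplest case handled by such software, fitting a single Voigt profile with SciPy; the real program fits many lines simultaneously with several profile models (Doppler, Lorentz, Voigt, Galatry, Nelkin-Ghatak), and all numbers below are made up.

```python
# Single-line Voigt fit with SciPy (illustrative values only).
import numpy as np
from scipy.special import voigt_profile
from scipy.optimize import curve_fit

def voigt_line(nu, area, center, sigma, gamma, baseline):
    """Absorbance of one line: area-normalized Voigt plus a flat baseline."""
    return area * voigt_profile(nu - center, sigma, gamma) + baseline

# Synthetic spectrum around an assumed line center (wavenumber offset in cm-1).
nu = np.linspace(-0.05, 0.05, 400)
true = voigt_line(nu, 0.02, 0.0, 0.004, 0.006, 0.01)
noisy = true + 0.0005 * np.random.default_rng(4).normal(size=nu.size)

popt, pcov = curve_fit(voigt_line, nu, noisy, p0=[0.01, 0.0, 0.003, 0.003, 0.0])
area, center, sigma, gamma, baseline = popt   # integrated intensity and width parameters
```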

  3. Commercial Earth Observation

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Through the Earth Observation Commercial Applications Program (EOCAP) at Stennis Space Center, Applied Analysis, Inc. developed a new tool for analyzing remotely sensed data. The Applied Analysis Spectral Analytical Process (AASAP) detects or classifies objects smaller than a pixel and removes the background. This significantly enhances the discrimination among surface features in imagery. ERDAS, Inc. offers the system as a modular addition to its ERDAS IMAGINE software package for remote sensing applications. EOCAP is a government/industry cooperative program designed to encourage commercial applications of remote sensing. Projects can run three years or more and funding is shared by NASA and the private sector participant. Through the Earth Observation Commercial Applications Program (EOCAP), Ocean and Coastal Environmental Sensing (OCENS) developed SeaStation for marine users. SeaStation is a low-cost, portable, shipboard satellite groundstation integrated with vessel catch and product monitoring software. Linked to the Global Positioning System, SeaStation provides real time relationships between vessel position and data such as sea surface temperature, weather conditions and ice edge location. This allows the user to increase fishing productivity and improve vessel safety.

  4. imzML: Imaging Mass Spectrometry Markup Language: A common data format for mass spectrometry imaging.

    PubMed

    Römpp, Andreas; Schramm, Thorsten; Hester, Alfons; Klinkert, Ivo; Both, Jean-Pierre; Heeren, Ron M A; Stöckli, Markus; Spengler, Bernhard

    2011-01-01

    Imaging mass spectrometry is the method of scanning a sample of interest and generating an "image" of the intensity distribution of a specific analyte. The data sets consist of a large number of mass spectra which are usually acquired with identical settings. Existing data formats are not sufficient to describe an MS imaging experiment completely. The data format imzML was developed to allow the flexible and efficient exchange of MS imaging data between different instruments and data analysis software. For this purpose, the MS imaging data is divided into two separate files. The mass spectral data is stored in a binary file to ensure efficient storage. All metadata (e.g., instrumental parameters, sample details) are stored in an XML file which is based on the standard data format mzML developed by HUPO-PSI. The original mzML controlled vocabulary was extended to include specific parameters of imaging mass spectrometry (such as x/y position and spatial resolution). The two files (XML and binary) are connected by offset values in the XML file and are unambiguously linked by a universally unique identifier. The resulting datasets are comparable in size to the raw data and the separate metadata file allows flexible handling of large datasets. Several imaging MS software tools already support imzML. This allows choosing from a (growing) number of processing tools. One is no longer limited to proprietary software, but is able to use the processing software which is best suited for a specific question or application. On the other hand, measurements from different instruments can be compared within one software application using identical settings for data processing. All necessary information for evaluating and implementing imzML can be found at http://www.imzML.org .
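
    A short sketch of reading an imzML dataset with the community pyimzML parser; the library choice and file name are our assumptions, not part of the imzML specification, which only defines the XML/binary pairing described above.

```python
# Sketch: read an imzML dataset (XML metadata + linked binary file) with pyimzML.
from pyimzml.ImzMLParser import ImzMLParser

parser = ImzMLParser("example.imzML")   # placeholder file name; also opens the paired .ibd binary

# Each spectrum's m/z and intensity arrays are located through the offset values stored in
# the XML and read from the binary file; the shared UUID guards against mismatched files.
for idx, (x, y, z) in enumerate(parser.coordinates[:5]):
    mzs, intensities = parser.getspectrum(idx)
    print(f"pixel ({x}, {y}): {len(mzs)} data points")
```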

  5. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.

  6. Presentation Of The Small Baseline NSBAS Processing Chain On A Case Example: The ETNA Deformation Monitoring From 2003 to 2010 Using ENVISAT Data

    NASA Astrophysics Data System (ADS)

    Doin, Marie-Pierre; Lodge, Felicity; Guillaso, Stephane; Jolivet, Romain; Lasserre, Cecile; Ducret, Gabriel; Grandin, Raphael; Pathier, Erwan; Pinel, Virginie

    2012-01-01

    We assemble a processing chain that handles InSAR computation from raw data to time series analysis. A large part of the chain (from raw data to geocoded unwrapped interferograms) is based on ROI PAC modules (Rosen et al., 2004), with original routines rearranged and combined with new routines to process in series and in a common radar geometry all SAR images and interferograms. A new feature of the software is the range-dependent spectral filtering to improve coherence in interferograms with long spatial baselines. Additional components include a module to estimate and remove digital elevation model errors before unwrapping, a module to mitigate the effects of the atmospheric phase delay and remove residual orbit errors, and a module to construct the phase change time series from small baseline interferograms (Berardino et al. 2002). This paper describes the main elements of the processing chain and presents an example of application of the software using a data set from the ENVISAT mission covering the Etna volcano.
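
    A conceptual sketch of the small-baseline time-series inversion step (after Berardino et al., 2002) for a single pixel; the dates, interferogram pairs, and phase values are invented, and this is not the NSBAS implementation.

```python
# Invert a small-baseline interferogram network for one pixel's phase history.
import numpy as np

dates = [0, 35, 70, 105, 140]                              # acquisition times in days
pairs = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4)]   # small-baseline interferogram pairs
phase_ifg = np.array([0.3, 0.7, 0.4, 0.9, 1.1, 0.6])       # unwrapped phase of one pixel (rad)

# Design matrix: each interferogram is phase(date j) - phase(date i); phase(date 0) fixed to 0.
G = np.zeros((len(pairs), len(dates) - 1))
for row, (i, j) in enumerate(pairs):
    G[row, j - 1] = 1.0
    if i > 0:
        G[row, i - 1] = -1.0

phi, *_ = np.linalg.lstsq(G, phase_ifg, rcond=None)
time_series = np.concatenate([[0.0], phi])                 # phase change relative to first date
```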

  7. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knio, Omar M.

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.

  8. Transforming the Geocomputational Battlespace Framework with HDF5

    DTIC Science & Technology

    2010-08-01

    layout level, dataset arrays can be stored in chunks or tiles, enabling fast subsetting of large datasets, including compressed datasets. HDF software... Image Base (CIB) image of the AOI: an orthophoto made from rectified grayscale aerial images b. An IKONOS satellite image made up of 3 spectral

  9. Virtual rough samples to test 3D nanometer-scale scanning electron microscopy stereo photogrammetry.

    PubMed

    Villarrubia, J S; Tondare, V N; Vladár, A E

    2016-01-01

    The combination of scanning electron microscopy for high spatial resolution, images from multiple angles to provide 3D information, and commercially available stereo photogrammetry software for 3D reconstruction offers promise for nanometer-scale dimensional metrology in 3D. A method is described to test 3D photogrammetry software by the use of virtual samples: mathematical samples from which simulated images are made for use as inputs to the software under test. The virtual sample is constructed by wrapping a rough skin with any desired power spectral density around a smooth near-trapezoidal line with rounded top corners. Reconstruction is performed with images simulated from different angular viewpoints. The software's reconstructed 3D model is then compared to the known geometry of the virtual sample. Three commercial photogrammetry software packages were tested. Two of them produced results for line height and width that were within approximately 1 nm of the correct values. All of the packages exhibited some difficulty in reconstructing details of the surface roughness.
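
    A sketch of how a rough skin with a prescribed power spectral density can be synthesized by assigning random phases in the frequency domain; the power-law PSD, RMS value, and sampling below are illustrative assumptions and do not reproduce the paper's virtual samples.

```python
# Synthesize a 1D random roughness profile whose PSD follows a chosen power law.
import numpy as np

def rough_profile(n=1024, dx_nm=1.0, exponent=-2.0, rms_nm=1.0, seed=0):
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=dx_nm)
    amplitude = np.zeros_like(freqs)
    amplitude[1:] = freqs[1:] ** (exponent / 2.0)         # amplitude ~ sqrt(PSD)
    phases = rng.uniform(0, 2 * np.pi, size=freqs.size)   # random phases
    profile = np.fft.irfft(amplitude * np.exp(1j * phases), n=n)
    profile *= rms_nm / profile.std()                     # rescale to the desired RMS roughness
    return profile

skin = rough_profile()   # could then be wrapped around a smooth trapezoidal line model
```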

  10. Software-Defined Architectures for Spectrally Efficient Cognitive Networking in Extreme Environments

    NASA Astrophysics Data System (ADS)

    Sklivanitis, Georgios

    The objective of this dissertation is the design, development, and experimental evaluation of novel algorithms and reconfigurable radio architectures for spectrally efficient cognitive networking in terrestrial, airborne, and underwater environments. Next-generation wireless communication architectures and networking protocols that maximize spectrum utilization efficiency in congested/contested or low-spectral availability (extreme) communication environments can enable a rich body of applications with unprecedented societal impact. In recent years, underwater wireless networks have attracted significant attention for military and commercial applications including oceanographic data collection, disaster prevention, tactical surveillance, offshore exploration, and pollution monitoring. Unmanned aerial systems that are autonomously networked and fully mobile can assist humans in extreme or difficult-to-reach environments and provide cost-effective wireless connectivity for devices without infrastructure coverage. Cognitive radio (CR) has emerged as a promising technology to maximize spectral efficiency in dynamically changing communication environments by adaptively reconfiguring radio communication parameters. At the same time, the fast developing technology of software-defined radio (SDR) platforms has enabled hardware realization of cognitive radio algorithms for opportunistic spectrum access. However, existing algorithmic designs and protocols for shared spectrum access do not effectively capture the interdependencies between radio parameters at the physical (PHY), medium-access control (MAC), and network (NET) layers of the network protocol stack. In addition, existing off-the-shelf radio platforms and SDR programmable architectures are far from fulfilling runtime adaptation and reconfiguration across PHY, MAC, and NET layers. Spectrum allocation in cognitive networks with multi-hop communication requirements depends on the location, network traffic load, and interference profile at each network node. As a result, the development and implementation of algorithms and cross-layer reconfigurable radio platforms that can jointly treat space, time, and frequency as a unified resource to be dynamically optimized according to inter- and intra-network interference constraints is of fundamental importance. In the next chapters, we present novel algorithmic and software/hardware implementation developments toward the deployment of spectrally efficient terrestrial, airborne, and underwater wireless networks. In Chapter 1 we review the state-of-art in commercially available SDR platforms, describe their software and hardware capabilities, and classify them based on their ability to enable rapid prototyping and advance experimental research in wireless networks. Chapter 2 discusses system design and implementation details toward real-time evaluation of a software-radio platform for all-spectrum cognitive channelization in the presence of narrowband or wideband primary stations. All-spectrum channelization is achieved by designing maximum signal-to-interference-plus-noise ratio (SINR) waveforms that span the whole continuum of the device-accessible spectrum, while satisfying peak power and interference temperature (IT) constraints for the secondary and primary users, respectively. 
In Chapter 3, we introduce the concept of all-spectrum channelization based on max-SINR optimized sparse-binary waveforms, we propose optimal and suboptimal waveform design algorithms, and evaluate their SINR and bit-error-rate (BER) performance in an SDR testbed. Chapter 4 considers the problem of channel estimation with minimal pilot signaling in multi-cell multi-user multi-input multi-output (MIMO) systems with very large antenna arrays at the base station, and proposes a least-squares (LS)-type algorithm that iteratively extracts channel and data estimates from a short record of data measurements. Our algorithmic developments toward spectrally-efficient cognitive networking through joint optimization of channel access code-waveforms and routes in a multi-hop network are described in Chapter 5. Algorithmic designs are software optimized on heterogeneous multi-core general-purpose processor (GPP)-based SDR architectures by leveraging a novel software-radio framework that offers self-optimization and real-time adaptation capabilities at the PHY, MAC, and NET layers of the network protocol stack. Our system design approach is experimentally validated under realistic conditions in a large-scale hybrid ground-air testbed deployment. Chapter 6 reviews the state-of-art in software and hardware platforms for underwater wireless networking and proposes a software-defined acoustic modem prototype that enables (i) cognitive reconfiguration of PHY/MAC parameters, and (ii) cross-technology communication adaptation. The proposed modem design is evaluated in terms of effective communication data rate in both water tank and lake testbed setups. In Chapter 7, we present a novel receiver configuration for code-waveform-based multiple-access underwater communications. The proposed receiver is fully reconfigurable and executes (i) all-spectrum cognitive channelization, and (ii) combined synchronization, channel estimation, and demodulation. Experimental evaluation in terms of SINR and BER show that all-spectrum channelization is a powerful proposition for underwater communications. At the same time, the proposed receiver design can significantly enhance bandwidth utilization. Finally, in Chapter 8, we focus on challenging practical issues that arise in underwater acoustic sensor network setups where co-located multi-antenna sensor deployment is not feasible due to power, computation, and hardware limitations, and design, implement, and evaluate an underwater receiver structure that accounts for multiple carrier frequency and timing offsets in virtual (distributed) MIMO underwater systems.
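
    A toy numerical sketch of the max-SINR filtering idea that recurs in the channelization chapters: for a given signature waveform and interference-plus-noise covariance, the SINR-optimal linear receiver is proportional to the inverse covariance applied to the signature. All dimensions and power levels are invented.

```python
# Max-SINR receive filtering for a code-waveform user (toy numbers, real-valued model).
import numpy as np

rng = np.random.default_rng(5)
L = 16                                              # code-waveform length (chips)
s = rng.choice([-1.0, 1.0], size=L)                 # binary signature of the user of interest
s /= np.linalg.norm(s)

# Interference-plus-noise covariance from two interferers plus white noise.
interferers = rng.choice([-1.0, 1.0], size=(2, L))
R = sum(4.0 * np.outer(v, v) for v in interferers) + 0.1 * np.eye(L)

w = np.linalg.solve(R, s)                           # max-SINR receive filter (up to scale)
sinr = (np.abs(w @ s) ** 2) / (w @ R @ w)           # output SINR for unit signal power
print(f"output SINR: {10 * np.log10(sinr):.1f} dB")
```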

  11. Imaging acoustic vibrations in an ear model using spectrally encoded interferometry

    NASA Astrophysics Data System (ADS)

    Grechin, Sveta; Yelin, Dvir

    2018-01-01

    Imaging vibrational patterns of the tympanic membrane would allow an accurate measurement of its mechanical properties and provide early diagnosis of various hearing disorders. Various optical technologies have been suggested to address this challenge and demonstrated in vitro using point scanning and full-field interferometry. Spectrally encoded imaging has been previously demonstrated capable of imaging tissue acoustic vibrations with high spatial resolution, including two-dimensional phase and amplitude mapping. In this work, we demonstrate a compact optical apparatus for imaging acoustic vibrations that could be incorporated into a commercially available digital otoscope. By transmitting harmonic sound waves through the otoscope insufflation port and analyzing the spectral interferograms using custom-built software, we demonstrate high-resolution vibration imaging of a circular rubber membrane within an ear model.

  12. A COST EFFECTIVE MULTI-SPECTRAL SCANNER FOR NATURAL GAS DETECTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yudaya Sivathanu; Jongmook Lim; Vinoo Narayanan

    The objective of this project is to design, fabricate and field demonstrate a cost effective, multi-spectral scanner for natural gas leak detection in transmission and distribution pipelines. During the first six months of the project, the design for a laboratory version of the multispectral scanner was completed. The optical, mechanical, and electronic design for the scanner was completed. The optical design was analyzed using Zeemax Optical Design software and found to provide sufficiently resolved performance for the scanner. The electronic design was evaluated using a bread board and very high signal to noise ratios were obtained. Fabrication of a laboratory version of the multi-spectral scanner is currently in progress. A technology status report and a research management plan were also completed during the same period.

  13. Curatr: a web application for creating, curating and sharing a mass spectral library.

    PubMed

    Palmer, Andrew; Phapale, Prasad; Fay, Dominik; Alexandrov, Theodore

    2018-04-15

    We have developed a web application, curatr, for the rapid generation of high quality mass spectral fragmentation libraries from liquid-chromatography mass spectrometry datasets. Curatr handles datasets from single or multiplexed standards and extracts chromatographic profiles and potential fragmentation spectra for multiple adducts. An intuitive interface helps users to select high quality spectra that are stored along with searchable molecular information, the provenance of each standard and experimental metadata. Curatr supports exports to several standard formats for use with third party software or submission to repositories. We demonstrate the use of curatr to generate the EMBL Metabolomics Core Facility spectral library http://curatr.mcf.embl.de. Source code and example data are at http://github.com/alexandrovteam/curatr/. palmer@embl.de. Supplementary data are available at Bioinformatics online.

  14. Sharpening of the VNIR and SWIR Bands of the Wide Band Spectral Imager Onboard Tiangong-II Imagery Using the Selected Bands

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Li, X.; Liu, G.; Huang, C.; Li, H.; Guan, X.

    2018-04-01

    The Tiangong-II space lab was launched at the Jiuquan Satellite Launch Center of China on September 15, 2016. The Wide Band Spectral Imager (WBSI) onboard the Tiangong-II has 14 visible and near-infrared (VNIR) spectral bands covering the range 403-990 nm and two shortwave infrared (SWIR) bands covering the ranges 1230-1250 nm and 1628-1652 nm, respectively. In this paper a band-selection approach is proposed that considers the closest spectral similarities between the VNIR bands, with 100 m spatial resolution, and the SWIR bands, with 200 m spatial resolution. An evaluation of the Gram-Schmidt transform (GS) sharpening technique embedded in the ENVI software is presented, based on four different types of low-resolution pan band. The experimental results indicated that when the VNIR band with the higher correlation coefficient (CC) with respect to the raw SWIR band was selected, more texture information was injected into the corresponding sharpened SWIR band image, while the other sharpened SWIR band image preserved spectral and texture characteristics similar to those of the raw SWIR band image.

  15. Assessment of AVIRIS data from vegetated sites in the Owens Valley, California

    NASA Technical Reports Server (NTRS)

    Rock, B. N.; Elvidge, Christopher D.; Defeo, N. J.

    1988-01-01

    Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were acquired from the Bishop, CA area, located at the northern end of the Owens Valley, on July 30, 1987. Radiometrically-corrected AVIRIS data were flat-field corrected, and spectral curves produced and analyzed for pixels taken from both native and cultivated vegetation sites, using the JPL SPAM software program and PC-based spreadsheet programs. Analyses focused on the chlorophyll well and red edge portions of the spectral curves. Results include the following: AVIRIS spectral data are acquired at sufficient spectral resolution to allow detection of blue shifts of both the chlorophyll well and red edge in moisture-stressed vegetation when compared with non-stressed vegetation; a normalization of selected parameters (chlorophyll well and near infrared shoulder) may be used to emphasize the shift in red edge position; and the presence of the red edge in AVIRIS spectral curves may be useful in detecting small amounts (20 to 30 pct cover) of semi-arid and arid vegetation ground cover. A discussion of possible causes of AVIRIS red edge shifts in response to stress is presented.
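
    A simplified sketch of locating the red edge position as the wavelength of maximum first derivative between roughly 680 and 750 nm; the synthetic spectra, sampling, and stress-induced blue shift below are illustrative and are not derived from the AVIRIS data.

```python
# Locate the red edge as the wavelength of maximum reflectance slope in the 680-750 nm window.
import numpy as np

def red_edge_position(wavelengths_nm, reflectance):
    mask = (wavelengths_nm >= 680) & (wavelengths_nm <= 750)
    w, r = wavelengths_nm[mask], reflectance[mask]
    slope = np.gradient(r, w)
    return w[np.argmax(slope)]

# Synthetic healthy vs. stressed canopy spectra (logistic red edge, shifted for stress).
w = np.arange(400, 900, 10.0)
healthy = 0.05 + 0.45 / (1 + np.exp(-(w - 720) / 10))
stressed = 0.05 + 0.40 / (1 + np.exp(-(w - 710) / 10))   # blue-shifted red edge
print(red_edge_position(w, healthy), red_edge_position(w, stressed))
```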

  16. Analysis of human plasma metabolites across different liquid chromatography/mass spectrometry platforms: Cross-platform transferable chemical signatures.

    PubMed

    Telu, Kelly H; Yan, Xinjian; Wallace, William E; Stein, Stephen E; Simón-Manso, Yamil

    2016-03-15

    The metabolite profiling of a NIST plasma Standard Reference Material (SRM 1950) on different liquid chromatography/mass spectrometry (LC/MS) platforms showed significant differences. Although these findings suggest caution when interpreting metabolomics results, the degree of overlap of both profiles allowed us to use tandem mass spectral libraries of recurrent spectra to evaluate to what extent these results are transferable across platforms and to develop cross-platform chemical signatures. Non-targeted global metabolite profiles of SRM 1950 were obtained on different LC/MS platforms using reversed-phase chromatography and different chromatographic scales (conventional HPLC, UHPLC and nanoLC). The data processing and the metabolite differential analysis were carried out using publicly available (XCMS), proprietary (Mass Profiler Professional) and in-house software (NIST pipeline). Repeatability and intermediate precision showed that the non-targeted SRM 1950 profiling was highly reproducible when working on the same platform (relative standard deviation (RSD) <2%); however, substantial differences were found in the LC/MS patterns originating on different platforms or even using different chromatographic scales (conventional HPLC, UHPLC and nanoLC) on the same platform. A substantial degree of overlap (common molecular features) was also found. A procedure to generate consistent chemical signatures using tandem mass spectral libraries of recurrent spectra is proposed. Different platforms rendered significantly different metabolite profiles, but the results were highly reproducible when working within one platform. Tandem mass spectral libraries of recurrent spectra are proposed to evaluate the degree of transferability of chemical signatures generated on different platforms. Chemical signatures based on our procedure are most likely cross-platform transferable. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  17. Advanced Ecosystem Mapping Techniques for Large Arctic Study Domains Using Calibrated High-Resolution Imagery

    NASA Astrophysics Data System (ADS)

    Macander, M. J.; Frost, G. V., Jr.

    2015-12-01

    Regional-scale mapping of vegetation and other ecosystem properties has traditionally relied on medium-resolution remote sensing such as Landsat (30 m) and MODIS (250 m). Yet, the burgeoning availability of high-resolution (<=2 m) imagery and ongoing advances in computing power and analysis tools raise the prospect of performing ecosystem mapping at fine spatial scales over large study domains. Here we demonstrate cutting-edge mapping approaches over a ~35,000 km² study area on Alaska's North Slope using calibrated and atmospherically-corrected mosaics of high-resolution WorldView-2 and GeoEye-1 imagery: (1) an a priori spectral approach incorporating the Satellite Imagery Automatic Mapper (SIAM) algorithms; (2) image segmentation techniques; and (3) texture metrics. The SIAM spectral approach classifies radiometrically-calibrated imagery to general vegetation density categories and non-vegetated classes. The SIAM classes were developed globally and their applicability in arctic tundra environments has not been previously evaluated. Image segmentation, or object-based image analysis, automatically partitions high-resolution imagery into homogeneous image regions that can then be analyzed based on spectral, textural, and contextual information. We applied eCognition software to delineate waterbodies and vegetation classes, in combination with other techniques. Texture metrics were evaluated to determine the feasibility of using high-resolution imagery to algorithmically characterize periglacial surface forms (e.g., ice-wedge polygons), which are an important physical characteristic of permafrost-dominated regions but which cannot be distinguished by medium-resolution remote sensing. These advanced mapping techniques provide products that can supply essential information supporting a broad range of ecosystem science and land-use planning applications in northern Alaska and elsewhere in the circumpolar Arctic.

  18. Radial Profiles of PKS 0745-191 Galaxy Cluster with XMM-Newton X-Ray Observations

    NASA Astrophysics Data System (ADS)

    Tumer, A.; Ezer, C.; Ercan, E.

    2017-10-01

    Since clusters of galaxies are the largest comprehensive samples of the universe, they provide essential information on physical mechanisms ranging from the most basic to the most complex, such as nucleosynthesis and supernova events. Some of this information is provided by the X-ray emission data from the Intra-Cluster Medium (ICM), which contains hot dilute gas. A recent archived observation of the X-ray spectrum of the cool-core galaxy cluster PKS 0745-191 provided by XMM-Newton is subjected to data analysis using the ESAS package. Following spectral analysis with the Xspec spectral fitting software, we present the radial profiles of temperature and abundance from the core to 0.5 R_500 of the brightest distant cluster (z ~ 0.102) PKS 0745-191. Using the deprojected spectra, the radial distributions of pressure and entropy in the aforementioned region are also presented.

  19. Specter: linear deconvolution for targeted analysis of data-independent acquisition mass spectrometry proteomics.

    PubMed

    Peckner, Ryan; Myers, Samuel A; Jacome, Alvaro Sebastian Vaca; Egertson, Jarrett D; Abelin, Jennifer G; MacCoss, Michael J; Carr, Steven A; Jaffe, Jacob D

    2018-05-01

    Mass spectrometry with data-independent acquisition (DIA) is a promising method to improve the comprehensiveness and reproducibility of targeted and discovery proteomics, in theory by systematically measuring all peptide precursors in a biological sample. However, the analytical challenges involved in discriminating between peptides with similar sequences in convoluted spectra have limited its applicability in important cases, such as the detection of single-nucleotide polymorphisms (SNPs) and alternative site localizations in phosphoproteomics data. We report Specter (https://github.com/rpeckner-broad/Specter), an open-source software tool that uses linear algebra to deconvolute DIA mixture spectra directly through comparison to a spectral library, thus circumventing the problems associated with typical fragment-correlation-based approaches. We validate the sensitivity of Specter and its performance relative to that of other methods, and show that Specter is able to successfully analyze cases involving highly similar peptides that are typically challenging for DIA analysis methods.

  20. Effect of microwave argon plasma on the glycosidic and hydrogen bonding system of cotton cellulose.

    PubMed

    Prabhu, S; Vaideki, K; Anitha, S

    2017-01-20

    Cotton fabric was processed with microwave (Ar) plasma to alter its hydrophilicity. The process parameters, namely microwave power, process gas pressure and processing time, were optimized using the Box-Behnken method available in the Design Expert software. It was observed that certain combinations of process parameters improved existing hydrophilicity while the other combinations decreased it. ATR-FTIR spectral analysis was used to identify the strain induced in inter-chain, intra-chain, and inter-sheet hydrogen bonds and the glycosidic covalent bond due to plasma treatment. X-ray diffraction (XRD) studies were used to analyze the effect of plasma on unit cell parameters and degree of crystallinity. Fabric surface etching was identified using FESEM analysis. Thus, it can be concluded that the increase/decrease in the hydrophilicity of the plasma treated fabric was due to these structural and physical changes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Airborne multicamera system for geo-spatial applications

    NASA Astrophysics Data System (ADS)

    Bachnak, Rafic; Kulkarni, Rahul R.; Lyle, Stacey; Steidley, Carl W.

    2003-08-01

    Airborne remote sensing has many applications that include vegetation detection, oceanography, marine biology, geographical information systems, and environmental coastal science analysis. Remotely sensed images, for example, can be used to study the aftermath of episodic events such as the hurricanes and floods that occur year round in the coastal bend area of Corpus Christi. This paper describes an Airborne Multi-Spectral Imaging System that uses digital cameras to provide high resolution at very high rates. The software is based on Delphi 5.0 and IC Imaging Control's ActiveX controls. Both time and the GPS coordinates are recorded. Three successful test flights have been conducted so far. The paper presents flight test results and discusses the issues being addressed to fully develop the system.

  2. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
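
    A loose sketch of the classical least squares half of the hybrid idea: a measured spectrum is regressed onto known pure-component spectra, with an extra spectral-shape column added for an effect absent from the original calibration. The Gaussian spectra and drift shape are synthetic assumptions, and this does not reproduce the patented procedure.

```python
# Classical least squares with an added spectral shape for an uncalibrated effect.
import numpy as np

wavelengths = np.linspace(1000, 2500, 300)
def gauss(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

pure_a, pure_b = gauss(1400, 60), gauss(2000, 80)      # calibrated components
drift_shape = np.linspace(0, 1, wavelengths.size)      # added shape, e.g. a baseline drift

# Measured mixture: 2 parts A, 1 part B, plus some drift and noise.
measured = 2.0 * pure_a + 1.0 * pure_b + 0.3 * drift_shape \
           + 0.005 * np.random.default_rng(6).normal(size=wavelengths.size)

design = np.column_stack([pure_a, pure_b, drift_shape])
coeffs, *_ = np.linalg.lstsq(design, measured, rcond=None)
print(coeffs)   # roughly [2.0, 1.0, 0.3]; dropping the drift column would bias A and B
```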

  3. Design and experiment of spectrometer based on scanning micro-grating integrating with angle sensor

    NASA Astrophysics Data System (ADS)

    Biao, Luo; Wen, Zhi-yu

    2014-01-01

    A compact, low-cost, high-speed, non-destructive testing NIR (near infrared) spectrometer optical system based on a MOEMS grating device is developed. The MOEMS grating works as the prismatic element and wavelength scanning element in our optical system and enables the design of compact grating spectrometers capable of acquiring full spectra using a single detector element. This MOEMS grating is driven by electromagnetic force and is integrated with an angle sensor used to monitor the deflection angle while the grating is working. Compared with traditional spectral systems, this is a new structure with a single detector that works at high frequency. Based on the characteristics of the MOEMS grating, the structure of the spectrometer system is proposed. After the parameters of the optical path were calculated, ZEMAX optical software was used to simulate the system. According to the ZEMAX output file of the 3D model, the prototype was rapidly designed in SolidWorks and fabricated. Designed for a wavelength range between 800 nm and 1500 nm, the spectrometer optical system features a spectral resolution of 16 nm within a volume of 97 mm × 81.7 mm × 81 mm. To reduce the modulation effect of the sinusoidal rotation, the spectral intensity at different wavelengths should be compensated by a software method in the future. The system satisfies the demand for an NIR micro-spectrometer with a single detector.

  4. Dual ant colony operational modal analysis parameter estimation method

    NASA Astrophysics Data System (ADS)

    Sitarz, Piotr; Powałka, Bartosz

    2018-01-01

    Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated in the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in time while others in frequency domain. The former use correlation functions, the latter spectral density functions. However, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the interval of estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.

  5. Radio-science performance analysis software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
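
    An illustrative computation of the Allan deviation mentioned above, using the simple non-overlapping estimator on synthetic fractional-frequency residuals; the averaging times and data are assumptions, and this is not the STBLTY implementation.

```python
# Non-overlapping Allan deviation of fractional-frequency data.
import numpy as np

def allan_deviation(y, tau0_s, m):
    """Allan deviation for averaging factor m (tau = m * tau0), non-overlapping estimator."""
    n_blocks = len(y) // m
    block_means = np.asarray(y[:n_blocks * m]).reshape(n_blocks, m).mean(axis=1)
    diffs = np.diff(block_means)
    return np.sqrt(0.5 * np.mean(diffs ** 2)), m * tau0_s

# White-frequency-noise test data sampled once per second.
y = 1e-12 * np.random.default_rng(7).normal(size=100000)
for m in (1, 10, 100, 1000):
    adev, tau = allan_deviation(y, 1.0, m)
    print(f"tau = {tau:7.0f} s   sigma_y = {adev:.2e}")   # should fall roughly as 1/sqrt(tau)
```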

  6. Radio-Science Performance Analysis Software

    NASA Astrophysics Data System (ADS)

    Morabito, D. D.; Asmar, S. W.

    1994-10-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussion on operating the program set on Galileo and Ulysses data will be presented.

  7. Radio-science performance analysis software

    NASA Technical Reports Server (NTRS)

    Morabito, D. D.; Asmar, S. W.

    1995-01-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.

  8. First Retrieval of Surface Lambert Albedos From Mars Reconnaissance Orbiter CRISM Data

    NASA Astrophysics Data System (ADS)

    McGuire, P. C.; Arvidson, R. E.; Murchie, S. L.; Wolff, M. J.; Smith, M. D.; Martin, T. Z.; Milliken, R. E.; Mustard, J. F.; Pelkey, S. M.; Lichtenberg, K. A.; Cavender, P. J.; Humm, D. C.; Titus, T. N.; Malaret, E. R.

    2006-12-01

    We have developed a pipeline-processing software system to convert radiance-on-sensor for each of 72 out of 544 CRISM spectral bands used in global mapping to the corresponding surface Lambert albedo, accounting for atmospheric, thermal, and photoclinometric effects. We will present and interpret first results from this software system for the retrieval of Lambert albedos from CRISM data. For the multispectral mapping modes, these pipeline-processed 72 spectral bands constitute all of the available bands, for wavelengths from 0.362-3.920 μm, at 100-200 m/pixel spatial resolution, and ~0.006 μm spectral resolution. For the hyperspectral targeted modes, these pipeline-processed 72 spectral bands are only a selection of all of the 544 spectral bands, but at a resolution of 15-38 m/pixel. The pipeline processing for both types of observing modes (multispectral and hyperspectral) will use climatology, based on data from MGS/TES, in order to estimate ice- and dust-aerosol optical depths, prior to the atmospheric correction with lookup tables based upon radiative-transport calculations via DISORT. There is one DISORT atmospheric-correction lookup table for converting radiance-on-sensor to Lambert albedo for each of the 72 spectral bands. The measurements of the Emission Phase Function (EPF) during targeting will not be employed in this pipeline processing system. We are developing a separate system for extracting more accurate aerosol optical depths and surface scattering properties. This separate system will use direct calls (instead of lookup tables) to the DISORT code for all 544 bands, and it will use the EPF data directly, bootstrapping from the climatology data for the aerosol optical depths. The pipeline processing will thermally correct the albedos for the spectral bands above ~ 2.6 μm, by a choice between 4 different techniques for determining surface temperature: 1) climatology, 2) empirical estimation of the albedo at 3.9 μm from the measured albedo at 2.5 μm, 3) a physical thermal model (PTM) based upon maps of thermal inertia from TES and coarse-resolution surface slopes (SS) from MOLA, and 4) a photoclinometric extension to the PTM that uses CRISM albedos at 0.41 μm to compute the SS at CRISM spatial resolution. For the thermal correction, we expect that each of these 4 different techniques will be valuable for some fraction of the observations.

  9. Development of Translational Methods in Spectral Analysis of Human Infant Crying and Rat Pup Ultrasonic Vocalizations for Early Neurobehavioral Assessment

    PubMed Central

    Zeskind, Philip Sanford; McMurray, Matthew S.; Garber, Kristin A.; Neuspiel, Juliana M.; Cox, Elizabeth T.; Grewen, Karen M.; Mayes, Linda C.; Johns, Josephine M.

    2011-01-01

    The purpose of this article is to describe the development of translational methods by which spectrum analysis of human infant crying and rat pup ultrasonic vocalizations (USVs) can be used to assess potentially adverse effects of various prenatal conditions on early neurobehavioral development. The study of human infant crying has resulted in a rich set of measures that has long been used to assess early neurobehavioral insult due to non-optimal prenatal environments, even among seemingly healthy newborn and young infants. In another domain of study, the analysis of rat pup USVs has been conducted via paradigms that allow for better experimental control over correlated prenatal conditions that may confound findings and conclusions regarding the effects of specific prenatal experiences. The development of translational methods by which cry vocalizations of both species can be analyzed may provide the opportunity for findings from the two approaches of inquiry to inform one another through their respective strengths. To this end, we present an enhanced taxonomy of a novel set of common measures of cry vocalizations of both human infants and rat pups based on a conceptual framework that emphasizes infant crying as a graded and dynamic acoustic signal. This set includes latency to vocalization onset, duration and repetition rate of expiratory components, duration of inter-vocalization intervals, and spectral features of the sound, including the frequency and amplitude of the fundamental and dominant frequencies. We also present a new set of classifications of rat pup USV waveforms that include qualitative shifts in fundamental frequency, similar to the qualitative shifts in fundamental frequency that have previously been related to insults to neurobehavioral integrity in human infants. Challenges to the development of translational analyses, including the use of different terminologies, methods of recording, and spectral analyses, are discussed, as well as descriptions of automated processes, software solutions, and pitfalls. PMID:22028695

  10. Platform for Postprocessing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don

    2008-01-01

    Taking advantage of the similarities that exist among all waveform-based non-destructive evaluation (NDE) methods, a common software platform has been developed containing multiple signal- and image-processing techniques for waveforms and images. The NASA NDE Signal and Image Processing software has been developed using the latest versions of LabVIEW and its associated Advanced Signal Processing and Vision Toolkits. The software is usable on a PC with Windows XP and Windows Vista. The software has been designed with a commercial-grade interface in which two main windows, the Waveform Window and the Image Window, are displayed if the user chooses a waveform file to display. Within these two main windows, most actions are chosen through logically conceived run-time menus. The Waveform Window has plots for both the raw time-domain waves and their frequency-domain transformations (fast Fourier transform and power spectral density). The Image Window shows the C-scan image formed from information of the time-domain waveform (such as peak amplitude) or its frequency-domain transformation at each scan location. The user also has the ability to open an image, a series of images, or a simple set of X-Y paired data in text format. Each of the Waveform and Image Windows contains menus from which to perform many user actions. An option exists to use raw waves obtained directly from a scan, or waves after deconvolution if the system wave response is provided. Two types of deconvolution, time-based subtraction or inverse filtering, can be performed to arrive at a deconvolved wave set. Additionally, the menu on the Waveform Window allows preprocessing of waveforms prior to image formation, scaling and display of waveforms, formation of different types of images (including non-standard types such as velocity), gating of portions of waves prior to image formation, and several other miscellaneous and specialized operations. The menu available on the Image Window allows many further image-processing and analysis operations, some of which are found in commercially available image-processing software programs (such as Adobe Photoshop), and some that are not (removing outliers, B-scan information, region-of-interest analysis, line profiles, and precision feature measurements).
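
    Of the two deconvolution styles mentioned, the inverse filter is the easier to sketch. The following is a generic, regularized frequency-domain inverse filter (not the NASA software's implementation); the water-level parameter is an assumption used to keep the division stable.

```python
import numpy as np

def inverse_filter_deconvolve(wave, system_response, water_level=0.05):
    """Deconvolve a measured waveform by the system wave response with a
    regularized inverse filter: W_dec(f) = W(f) conj(R(f)) / max(|R|^2, eps)."""
    n = len(wave)
    W = np.fft.rfft(wave, n)
    R = np.fft.rfft(system_response, n)
    power = np.abs(R) ** 2
    eps = water_level * power.max()   # water level prevents division blow-up
    dec = W * np.conj(R) / np.maximum(power, eps)
    return np.fft.irfft(dec, n)
```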

  11. Active multispectral reflection fingerprinting of persistent chemical agents

    NASA Astrophysics Data System (ADS)

    Tholl, H. D.; Münzhuber, F.; Kunz, J.; Raab, M.; Rattunde, M.; Hugger, S.; Gutty, F.; Grisard, A.; Larat, C.; Papillon, D.; Schwarz, M.; Lallier, E.; Kastek, M.; Piatkowski, T.; Brygo, F.; Awanzino, C.; Wilsenack, F.; Lorenzen, A.

    2017-10-01

    Remote detection of toxic chemicals of very low vapour pressure deposited on surfaces in the form of liquid films, droplets, or powder is a capability that is needed to protect operators and equipment in chemical warfare scenarios and in industrial environments. Infrared spectroscopy is a suitable means to support this requirement. Available instruments based on passive emission spectroscopy have difficulties in discriminating the infrared emission spectrum of the surface background from that of the contamination. Separation of background and contamination is eased by illuminating the surface with a spectrally tunable light source and by analyzing the reflectivity spectrum. The project AMURFOCAL (Active Multispectral Reflection Fingerprinting of Persistent Chemical Agents) addresses the stand-off detection and identification of chemical warfare agents (CWAs) with amplified quantum cascade laser technology in the long-wave infrared spectral range. The project was conducted under the Joint Investment Programme (JIP) on CBRN protection funded through the European Defence Agency (EDA). The AMURFOCAL instrument comprises a spectrally narrow, tunable light source with a broadband infrared detector and chemometric data analysis software. The light source combines an external-cavity quantum cascade laser (EC-QCL) with an optical parametric amplifier (OPA) to boost the peak output power of a short laser pulse tunable over the infrared fingerprint region. The laser beam is focused onto a target at a distance between 10 and 20 m. A 3D data cube is registered by tuning the wavelength of the laser emission while recording the received signal scattered off the target using a multi-element infrared detector. A particular chemical is identified through the extraction of its characteristic spectral fingerprint from the measured data. The paper describes the AMURFOCAL instrument, its functional units, and its principles of operation.

  12. DEEP WIDEBAND SINGLE POINTINGS AND MOSAICS IN RADIO INTERFEROMETRY: HOW ACCURATELY DO WE RECONSTRUCT INTENSITIES AND SPECTRAL INDICES OF FAINT SOURCES?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rau, U.; Bhatnagar, S.; Owen, F. N., E-mail: rurvashi@nrao.edu

    Many deep wideband wide-field radio interferometric surveys are being designed to accurately measure intensities, spectral indices, and polarization properties of faint source populations. In this paper, we compare various wideband imaging methods to evaluate the accuracy to which intensities and spectral indices of sources close to the confusion limit can be reconstructed. We simulated a wideband single-pointing (C-array, L-Band (1–2 GHz)) and 46-pointing mosaic (D-array, C-Band (4–8 GHz)) JVLA observation using a realistic brightness distribution ranging from 1 μJy to 100 mJy and time-, frequency-, polarization-, and direction-dependent instrumental effects. The main results from these comparisons are (a) errors in the reconstructed intensities and spectral indices are larger for weaker sources even in the absence of simulated noise, (b) errors are systematically lower for joint reconstruction methods (such as Multi-Term Multi-Frequency-Synthesis (MT-MFS)) along with A-Projection for accurate primary beam correction, and (c) use of MT-MFS for image reconstruction eliminates Clean bias (which is present otherwise). Auxiliary tests include solutions for deficiencies of data partitioning methods (e.g., the use of masks to remove clean bias and hybrid methods to remove sidelobes from sources left un-deconvolved), the effect of sources not at pixel centers, and the consequences of various other numerical approximations within software implementations. This paper also demonstrates the level of detail at which such simulations must be done in order to reflect reality, to enable one to systematically identify specific reasons for every trend that is observed, and to estimate scientifically defensible imaging performance metrics and the associated computational complexity of the algorithms/analysis procedures.

  13. Terahertz Spectroscopy and Solid-State Density Functional Theory Calculations of Cyanobenzaldehyde Isomers.

    PubMed

    Dash, Jyotirmayee; Ray, Shaumik; Nallappan, Kathirvel; Kaware, Vaibhav; Basutkar, Nitin; Gonnade, Rajesh G; Ambade, Ashootosh V; Joshi, Kavita; Pesala, Bala

    2015-07-23

    Spectral signatures in the terahertz (THz) frequency region are mainly due to bulk vibrations of the molecules. These resonances are highly sensitive to the relative position of atoms in a molecule as well as the crystal packing arrangement. To understand the variation of THz resonances, THz spectra (2-10 THz) of three structural isomers: 2-, 3-, and 4-cyanobenzaldehyde have been studied. THz spectra obtained from Fourier transform infrared (FTIR) spectrometry of these isomers show that the resonances are distinctly different especially below 5 THz. For understanding the intermolecular interactions due to hydrogen bonds, four molecule cluster simulations of each of the isomers have been carried out using the B3LYP density functional with the 6-31G(d,p) basis set in Gaussian09 software and the compliance constants are obtained. However, to understand the exact reason behind the observed resonances, simulation of each isomer considering the full crystal structure is essential. The crystal structure of each isomer has been determined using X-ray diffraction (XRD) analysis for carrying out crystal structure simulations. Density functional theory (DFT) simulations using CRYSTAL14 software, utilizing the hybrid density functional B3LYP, have been carried out to understand the vibrational modes. The bond lengths and bond angles from the optimized structures are compared with the XRD results in terms of root-mean-square-deviation (RMSD) values. Very low RMSD values confirm the overall accuracy of the results. The simulations are able to predict most of the spectral features exhibited by the isomers. The results show that low frequency modes (<3 THz) are mediated through hydrogen bonds and are dominated by intermolecular vibrations.

  14. Retinal oxygen saturation evaluation by multi-spectral fundus imaging

    NASA Astrophysics Data System (ADS)

    Khoobehi, Bahram; Ning, Jinfeng; Puissegur, Elise; Bordeaux, Kimberly; Balasubramanian, Madhusudhanan; Beach, James

    2007-03-01

    Purpose: To develop a multi-spectral method to measure oxygen saturation of the retina in the human eye. Methods: Five Cynomolgus monkeys with normal eyes were anesthetized with intramuscular ketamine/xylazine and intravenous pentobarbital. Multi-spectral fundus imaging was performed in five monkeys with a commercial fundus camera equipped with a liquid crystal tuned filter in the illumination light path and a 16-bit digital camera. Recording parameters were controlled with software written specifically for the application. Seven images at successively longer oxygen-sensing wavelengths were recorded within 4 seconds. Individual images for each wavelength were captured in less than 100 msec of flash illumination. Slightly misaligned images of separate wavelengths due to slight eye motion were registered and corrected by translational and rotational image registration prior to analysis. Numerical values of relative oxygen saturation of retinal arteries and veins and the underlying tissue in between the artery/vein pairs were evaluated by an algorithm previously described, but which is now corrected for blood volume from averaged pixels (n > 1000). Color saturation maps were constructed by applying the algorithm at each image pixel using a Matlab script. Results: Both the numerical values of relative oxygen saturation and the saturation maps correspond to the physiological condition, that is, in a normal retina, the artery is more saturated than the tissue and the tissue is more saturated than the vein. With the multi-spectral fundus camera and proper registration of the multi-wavelength images, we were able to determine oxygen saturation in the primate retinal structures on a tolerable time scale which is applicable to human subjects. Conclusions: Seven wavelength multi-spectral imagery can be used to measure oxygen saturation in retinal artery, vein, and tissue (microcirculation). This technique is safe and can be used to monitor oxygen uptake in humans. This work is original and is not under consideration for publication elsewhere.
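
    The authors' blood-volume-corrected multi-wavelength algorithm is not reproduced in the abstract. The sketch below only illustrates the simpler, classical two-wavelength optical-density-ratio approach to building a pixel-wise relative saturation map from registered images; the calibration constants a and b are hypothetical and must come from calibration data.

```python
import numpy as np

def saturation_map(img_sensitive, img_isosbestic, bg_sensitive, bg_isosbestic,
                   a=1.0, b=1.0):
    """Pixel-wise relative oxygen saturation from two registered fundus images:
    one at an oxygen-sensitive wavelength, one near an isosbestic wavelength.
    bg_* are matching background (reference) images; a and b are hypothetical
    calibration constants for a linear mapping."""
    od_sens = -np.log10(np.clip(img_sensitive / bg_sensitive, 1e-6, None))
    od_iso = -np.log10(np.clip(img_isosbestic / bg_isosbestic, 1e-6, None))
    odr = od_sens / np.clip(od_iso, 1e-6, None)   # optical density ratio
    return a - b * odr                            # relative saturation map
```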

  15. Solar Spectral Irradiance Variability of Some Chromospheric Emission Lines Through the Solar Activity Cycles 21-23

    NASA Astrophysics Data System (ADS)

    Göker, Ü. D.; Gigolashvili, M. Sh.; Kapanadze, N.

    2017-06-01

    A study of variations of the solar spectral irradiance (SSI) in the wavelength range 121.5 nm-300.5 nm for the period 1981-2009 is presented. We used data for ultraviolet (UV) spectral lines and the international sunspot number (ISSN) from the interactive data centers SME (NSSDC), UARS (GDAAC), SORCE (LISIRD), and SIDC, respectively. We reduced these data using the MATLAB software package. In this respect, we revealed negative correlations of the intensities of UV (289.5 nm-300.5 nm) spectral lines originating in the solar chromosphere with the ISSN index during the unusually prolonged minimum between solar activity cycles (SACs) 23 and 24. We also compared our results with the variations of solar activity indices obtained by ground-based telescopes. We found that plage regions decreased while facular areas increased in SAC 23. The decrease in plage regions is seen in small sunspot groups (SGs); in contrast, these regions in large SGs are comparable to previous SACs or even larger, as is also seen in facular areas. Nevertheless, the negative correlations between the ISSN and SSI data indicate that these variations are closely connected with the classes of sunspots/SGs, faculae, and plage regions. Finally, we applied time series analysis to the spectral lines corresponding to the wavelengths 121.5 nm-300.5 nm and made comparisons with the ISSN data. We found an unexpected increase in the 298.5 nm line of the Fe II ion. The variability of the Fe II 298.5 nm line is closely connected with the facular areas and plage regions, and the sizes of these solar surface features also play an important role in the SSI variability. We therefore compared the connection between the sizes of faculae and plage regions, sunspots/SGs, chemical elements, and SSI variability. Our future work will be a theoretical study of this connection and the development of a corresponding model.

  16. Paleo-productivity changes revealed by spectral analysis performed on coccoliths assemblages

    NASA Astrophysics Data System (ADS)

    Palumbo, Eliana; Ornella Amore, Filomena; Perugia, Carmen

    2010-05-01

    Several climate changes occurred over geological time at different time scales. Spectral analyses performed on paleo-climate data suggest that these cyclicities recur irregularly in the time domain. Paleo-climate oscillations occur at high or low frequencies due to the oscillation of the major orbital parameters (characterized by low frequencies and long periods) and some minor high-frequency events. In recent years, analyses in the frequency domain have also been performed on coccolith assemblages. Coccolithophores are a phytoplankton group living today at all latitudes within the photic zone (0-200 m depth) (Winter and Siesser, 1994). They are sensitive indicators of environmental conditions because they directly depend on temperature, salinity, and nutrients as well as the availability of sunlight (McIntyre and Bé, 1967; Giraudeau et al., 1993; Winter and Siesser, 1994; Baumann and Freitag, 2004). Therefore coccolithophores respond quickly to fluctuations in climate as well as to changes in surface-water conditions (Baumann and Freitag, 2004). Coccoliths can thus be used as paleo-climate data because of their capacity to record and amplify climatic change signals. In addition, primary productivity depends on the amount of insolation received by the Earth's surface. In this study, solar insolation has been calculated in terms of intensity and energy, in order to compare them with maximum productivity. Precession controls the intensity of insolation, while the energy is controlled by obliquity; the intensity depends on the duration of the insolation, while the energy is connected to the amount of insolation (Berger, 1978; Loutre et al., 2004; Huybers, 2006). In this study, spectral analyses have been performed on coccolith data, identifying high- and low-frequency content in the productivity signals. Auto-spectral and cross-spectral analyses have been performed in Matlab using several available functions plus a new function written to evaluate cross-wavelet power spectra. Auto-spectral analysis describes the distribution of variance contained in each single signal over frequency or wavelength, while cross-spectral analysis correlates two time series in the frequency domain (Trauth, 2009). We have performed spectral analyses using the complex Fourier transform and the short-time Fourier transform. Both transforms lose any time information when transforming the signal from the time to the frequency domain (Jenkins and Watt, 1968), so they do not allow us to identify when an event occurred in the past. In order to overcome this limit, we have also applied wavelet analysis, which represents the frequency content of a signal over time and thus allows us to visualize when an event occurred in the time domain (Torrence and Compo, 1998; Prokoph and El Bilali, 2008; Grinsted et al., 2004). Moreover, we have performed simple cross-correlation and cross-spectral analyses between different proxy groups to discover their possible correlations in the time and frequency domains. References: Berger, A., 1978. J. Atmos. Sci., 35 (12): 2362-2367. Baumann, K.-H., and Freitag, T., 2004. Marine Micropaleontology 52: 195-215. Giraudeau, J., Monteiro, P.M.S., Nikodemus, K., 1993. Mar. Micropalaeontol. 22: 93-110. Grinsted, A., Moore, J. C., and Jevrejeva, S., 2004. Nonlinear Processes in Geophysics 11: 561-566. Huybers, P., 2006. Science 313: 508-511. Jenkins, G. M., and Watt, D. G., 1968. Holden Day, 410 p., Oakland. Loutre, M. F., Paillard, D., Vimeux, F., and Cortijo, E., 2004. Earth Planet. Sci. Lett. 221: 1-14. McIntyre, A., and Bé, A.H.W., 1967. Deep-Sea Res. 14: 561-597. Prokoph, A., and El Bilali, H., 2008. Math Geosciences 40: 575-586. Torrence, C., and Compo, G. P., 1998. Bulletin of the American Meteorological Society 79: 61-78. Trauth, M.H., 2009. Springer, 288 p. Winter, A., and Siesser, W., 1994. Cambridge University Press, 242 p.
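
    The cross-spectral step described above (the authors' Matlab functions are not shown) can be sketched with SciPy on two evenly resampled proxy series; the synthetic series below stand in for real coccolith and insolation records, and an obliquity-like 41 kyr period is assumed purely for illustration.

```python
import numpy as np
from scipy import signal

dt = 1.0                                  # sampling interval, kyr
t = np.arange(0, 1024) * dt
rng = np.random.default_rng(1)
# Placeholder proxies sharing a 41 kyr component with a phase offset.
x = np.sin(2 * np.pi * t / 41.0) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * t / 41.0 + 0.8) + 0.5 * rng.standard_normal(t.size)

f, Pxy = signal.csd(x, y, fs=1.0 / dt, nperseg=256)        # cross power spectral density
f, Cxy = signal.coherence(x, y, fs=1.0 / dt, nperseg=256)  # squared coherence
phase = np.angle(Pxy)                                      # phase lag per frequency
# Coherence should peak near 1/41 ~ 0.024 cycles per kyr for this example.
print(f[np.argmax(Cxy)])
```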

  17. Clustering analysis of line indices for LAMOST spectra with AstroStat

    NASA Astrophysics Data System (ADS)

    Chen, Shu-Xin; Sun, Wei-Min; Yan, Qi

    2018-06-01

    The application of data mining to astronomical surveys, such as the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) survey, provides an effective approach to automatically analyze a large amount of complex survey data. Unsupervised clustering can help astronomers find associations and outliers in a big data set. In this paper, we employ the k-means method to perform clustering on the line indices of LAMOST spectra with the powerful software AstroStat. Implementing the line-index approach for analyzing astronomical spectra is an effective way to extract spectral features from low-resolution spectra, which can represent the main spectral characteristics of stars. A total of 144 340 line indices for A-type stars is analyzed through calculating their intra- and inter-class distances between pairs of stars. For the intra-class distance, we use the Mahalanobis distance to explore the degree of clustering of each class, while for outlier detection we define a local outlier factor for each spectrum. AstroStat furnishes a set of visualization tools for illustrating the analysis results. Checking the spectra detected as outliers, we find that most of them are problematic data and only a few correspond to rare astronomical objects. We show two examples of these outliers, a spectrum with an abnormal continuum and a spectrum with emission lines. Our work demonstrates that line-index clustering is a good method for examining data quality and identifying rare objects.
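
    AstroStat itself is not shown here; a roughly equivalent workflow (k-means clustering, per-cluster Mahalanobis spread, and a local-outlier-factor pass) can be sketched with scikit-learn and NumPy. The line-index matrix below is a random placeholder, and the cluster count is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import LocalOutlierFactor

# X: one row of line indices per spectrum (placeholder random data).
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 12))

kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

# Intra-class spread: mean Mahalanobis distance of members to their own centroid.
for k in range(kmeans.n_clusters):
    members = X[labels == k]
    cov_inv = np.linalg.pinv(np.cov(members, rowvar=False))
    d = members - members.mean(axis=0)
    maha = np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))
    print(k, members.shape[0], maha.mean())

# Outlier detection: -1 marks candidate outliers for visual inspection.
lof_flags = LocalOutlierFactor(n_neighbors=20).fit_predict(X)
```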

  18. Radiative transfer modeling through terrestrial atmosphere and ocean accounting for inelastic processes: Software package SCIATRAN

    NASA Astrophysics Data System (ADS)

    Rozanov, V. V.; Dinter, T.; Rozanov, A. V.; Wolanin, A.; Bracher, A.; Burrows, J. P.

    2017-06-01

    SCIATRAN is a comprehensive software package designed to model radiative transfer processes in the terrestrial atmosphere and ocean in the spectral range from the ultraviolet to the thermal infrared (0.18-40 μm). It accounts for multiple scattering processes, polarization, thermal emission, and ocean-atmosphere coupling. The main goal of this paper is to present a recently developed version of SCIATRAN that accurately takes into account inelastic radiative processes in both the atmosphere and the ocean. In the scalar version of the coupled ocean-atmosphere radiative transfer solver presented by Rozanov et al. [61] we have implemented the simulation of rotational Raman scattering, vibrational Raman scattering, and chlorophyll and colored dissolved organic matter fluorescence. In this paper we discuss and explain the numerical methods used in SCIATRAN to solve the scalar radiative transfer equation including trans-spectral processes, and demonstrate how selected radiative transfer problems are solved using the SCIATRAN package. In addition, we present selected comparisons of SCIATRAN simulations with published benchmark results, independent radiative transfer models, and various measurements from satellite, ground-based, and ship-borne instruments. The extended SCIATRAN software package, along with a detailed User's Guide, is made available to scientists and students who are undertaking their own research, typically at universities, via the web page of the Institute of Environmental Physics (IUP), University of Bremen: http://www.iup.physik.uni-bremen.de.

  19. SNSEDextend: SuperNova Spectral Energy Distributions extrapolation toolkit

    NASA Astrophysics Data System (ADS)

    Pierel, Justin D. R.; Rodney, Steven A.; Avelino, Arturo; Bianco, Federica; Foley, Ryan J.; Friedman, Andrew; Hicken, Malcolm; Hounsell, Rebekah; Jha, Saurabh W.; Kessler, Richard; Kirshner, Robert; Mandel, Kaisey; Narayan, Gautham; Filippenko, Alexei V.; Scolnic, Daniel; Strolger, Louis-Gregory

    2018-05-01

    SNSEDextend extrapolates core-collapse and Type Ia Spectral Energy Distributions (SEDs) into the UV and IR for use in simulations and photometric classifications. The user provides a library of existing SED templates (such as those in the authors' SN SED Repository) along with new photometric constraints in the UV and/or NIR wavelength ranges. The software then extends the existing template SEDs so their colors match the input data at all phases. SNSEDextend can also extend the SALT2 spectral time-series model for Type Ia SN for a "first-order" extrapolation of the SALT2 model components, suitable for use in survey simulations and photometric classification tools; as the code does not do a rigorous re-training of the SALT2 model, the results should not be relied on for precision applications such as light curve fitting for cosmology.

  20. In Silico Identification Software (ISIS): A Machine Learning Approach to Tandem Mass Spectral Identification of Lipids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kangas, Lars J.; Metz, Thomas O.; Isaac, Georgis

    2012-05-15

    Liquid chromatography-mass spectrometry-based metabolomics has gained importance in the life sciences, yet it is not supported by software tools for high-throughput identification of metabolites based on their fragmentation spectra. An algorithm (ISIS: in silico identification software) and its implementation are presented and show great promise in generating in silico spectra of lipids for the purpose of structural identification. Instead of using chemical reaction rate equations or rules-based fragmentation libraries, the algorithm uses machine learning to find accurate bond cleavage rates in a mass spectrometer employing collision-induced dissociation tandem mass spectrometry. A preliminary test of the algorithm with 45 lipids from a subset of lipid classes shows both high sensitivity and specificity.

  1. An HF and lower VHF spectrum assessment system exploiting instantaneously wideband capture

    NASA Astrophysics Data System (ADS)

    Barnes, Rod I.; Singh, Malkiat; Earl, Fred

    2017-09-01

    We report on a spectral environment evaluation and recording (SEER) system, for instantaneously wideband spectral capture and characterization in the HF and lower VHF band, utilizing a direct digital receiver coupled to a data recorder. The system is designed to contend with a wide variety of electromagnetic environments and to provide accurately calibrated spectral characterization and display from very short (ms) to synoptic scales. The system incorporates a novel RF front end involving automated gain and equalization filter selection which provides an analogue frequency-dependent gain characteristic that mitigates the high dynamic range found across the HF and lower VHF spectrum. The system accurately calibrates its own internal noise and automatically subtracts this from low variance, external spectral estimates, further extending the dynamic range over which robust characterization is possible. Laboratory and field experiments demonstrate that the implementation of these concepts has been effective. Sensitivity to varying antenna load impedance of the internal noise reduction process has been examined. Examples of software algorithms to provide extraction and visualization of spectral behavior over narrowband, wideband, short, and synoptic scales are provided. Application in HF noise spectral density monitoring, spectral signal strength assessment, and electromagnetic interference detection is possible with examples provided. The instantaneously full bandwidth collection provides some innovative applications, and this is demonstrated by the collection of discrete lightning emissions, which form fast ionograms called "flashagrams" in power-delay-frequency plots.
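
    The internal-noise subtraction described above can be sketched generically (this is not the SEER algorithm): estimate the power spectral density of an antenna-connected capture and of a terminated-input calibration capture with Welch's method, then subtract the calibrated internal-noise estimate and clip at zero. The sample rate and noise levels below are placeholders.

```python
import numpy as np
from scipy import signal

fs = 30e6                                    # placeholder sample rate
rng = np.random.default_rng(2)
external = rng.standard_normal(2**18)        # stand-in for antenna-connected capture
internal = 0.3 * rng.standard_normal(2**18)  # stand-in for terminated-input capture

f, p_ext = signal.welch(external, fs=fs, nperseg=4096)
f, p_int = signal.welch(internal, fs=fs, nperseg=4096)

# Subtract the calibrated internal-noise estimate; clip to avoid negative PSD bins.
p_corrected = np.clip(p_ext - p_int, 0.0, None)
```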

  2. (LMRG): Microscope Resolution, Objective Quality, Spectral Accuracy and Spectral Un-mixing

    PubMed Central

    Bayles, Carol J.; Cole, Richard W.; Eason, Brady; Girard, Anne-Marie; Jinadasa, Tushare; Martin, Karen; McNamara, George; Opansky, Cynthia; Schulz, Katherine; Thibault, Marc; Brown, Claire M.

    2012-01-01

    The second study by the LMRG focuses on measuring confocal laser scanning microscope (CLSM) resolution, objective lens quality, spectral imaging accuracy and spectral un-mixing. Affordable test samples for each aspect of the study were designed, prepared and sent to 116 labs from 23 countries across the globe. Detailed protocols were designed for the three tests and customized for most of the major confocal instruments being used by the study participants. One protocol developed for measuring resolution and objective quality was recently published in Nature Protocols (Cole, R. W., T. Jinadasa, et al. (2011). Nature Protocols 6(12): 1929–1941). The first study involved 3D imaging of sub-resolution fluorescent microspheres to determine the microscope point spread function. Results of the resolution studies as well as point spread function quality (i.e. objective lens quality) from 140 different objective lenses will be presented. The second study of spectral accuracy looked at the reflection of the laser excitation lines into the spectral detection in order to determine the accuracy of these systems to report back the accurate laser emission wavelengths. Results will be presented from 42 different spectral confocal systems. Finally, samples with double orange beads (orange core and orange coating) were imaged spectrally and the imaging software was used to un-mix fluorescence signals from the two orange dyes. Results from 26 different confocal systems will be summarized. Time will be left to discuss possibilities for the next LMRG study.

  3. Computational Modeling to Limit the Impact Displays and Indicator Lights Have on Habitable Volume Operational Lighting Constraints

    NASA Technical Reports Server (NTRS)

    Clark, T. A.; Brainard, G.; Salazar, G.; Johnston, S.; Schwing, B.; Litaker, H.; Kolomenski, A.; Venus, D.; Tran, K.; Hanifin, J.; hide

    2017-01-01

    NASA has demonstrated an interest in improving astronaut health and performance through the installation of a new lighting countermeasure on the International Space Station. The Solid State Lighting Assembly (SSLA) system is designed to positively influence astronaut health by providing a daily change in light spectrum to improve circadian entrainment. Unfortunately, existing NASA standards and requirements define ambient light level requirements for crew sleep and other tasks, yet the number of light-emitting diode (LED) indicators and displays within a habitable volume is currently uncontrolled. Because each of these light sources has its own unique spectral properties, the additive lighting environment ends up becoming something different from what was planned or researched. Restricting the use of displays and indicators is not a solution because these systems provide beneficial feedback to the crew. The research team for this grant used computer-based computational modeling and real-world lighting mockups to document the impact that light sources other than the ambient lighting system contribute to the ambient spectral lighting environment. In particular, the team focused on understanding the impacts of long-term tasks located in front of avionics or computer displays. The team also wanted to understand options for mitigating changes to the ambient light spectrum in the interest of maintaining the performance of a lighting countermeasure. The project utilized a variety of physical and computer-based simulations to determine direct relationships between system implementation and light spectrum. Using real-world data, computer models were built in the commercially available optics analysis software Zemax Optics Studio. The team also built a mockup test facility that had the same volume and configuration as one of the Zemax models. The team collected over 1200 spectral irradiance measurements, each representing a different configuration of the mockup. Analysis of the data showed a measurable impact on the ambient light spectrum and that straightforward design techniques exist that can hold the ambient light spectrum closer to the planned spectral operating environment at the observer's eye point. The following observations should be considered when designing an operational environment that is dominated by computer displays. The more light that is directed into the observer's field of view, the greater the impact on the various human factors issues that depend on spectral shape and intensity. Because viewing angle plays a large part in the amount of light flux reaching the crewmember's retina, beam shape combined with light-source location is an important factor in determining the probable incident flux on the observer from any combination of light sources. Computer graphics design and display lumen output are major factors influencing the amount of spectrally intense light projected into the environment and in the viewer's direction. Adjustable white-point display software was useful only if the predominant background color was white and matched the ambient lighting system's color. Display graphics that used a predominantly black background had the least influence on unplanned spectral energy projected into the environment.
    Percent reflectance makes a difference in the total energy reflected back into an environment, and within certain architectural geometries, reflectance can be used to control how much of a light spectrum is allowed to persist in the environment. The data showed that room volume and distance from significant light sources influence the total spectrum in a room. Smaller environments had a homogenizing effect on the total light spectrum, whereas light from multiple sources in larger environments was less mixed. The findings indicated above should be considered when making recommendations for practice or standards for architectural systems. The ambient lighting system, surface reflectance, and display and indicator implementation all factor into the users' spectral environment. A variety of low-cost solutions exist to mitigate the impact of light from non-architectural lighting systems, and there is much potential for system automation and for integration of display systems with the ambient environment. The team believes that proper planning can be used to avoid integration problems, and that human-in-the-loop evaluations, real-world test and measurement, and computer modeling can be used to determine how changes to a process, display graphics, and architecture will help maintain the planned spectral operating environment.

  4. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis, or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats, and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for seamless interoperability of software analysis tools across platform, geographical, and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible through our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  5. Biomarker Discovery Using New Metabolomics Software for Automated Processing of High Resolution LC-MS Data

    PubMed Central

    Hnatyshyn, S.; Reily, M.; Shipkova, P.; McClure, T.; Sanders, M.; Peake, D.

    2011-01-01

    Robust biomarkers of target engagement and efficacy are required in different stages of drug discovery. Liquid chromatography coupled to high resolution mass spectrometry provides the sensitivity, accuracy, and wide dynamic range required for identification of endogenous metabolites in biological matrices, and LC-MS is a widely used tool for biomarker identification and validation. Typical high resolution LC-MS profiles from biological samples may contain more than a million mass spectral peaks corresponding to several thousand endogenous metabolites. Reduction of the total number of peaks, component identification, and statistical comparison across sample groups remain difficult and time-consuming challenges. Blood samples from four groups of rats (male vs. female, fully satiated and food deprived) were analyzed using high resolution accurate mass (HRAM) LC-MS. All samples were separated using a 15 minute reversed-phase C18 LC gradient and analyzed in both positive and negative ion modes. Data were acquired using 15K resolution and 5 ppm mass measurement accuracy. The entire data set was analyzed using software developed in collaboration between Bristol-Myers Squibb and Thermo Fisher Scientific to determine the metabolic effects of food deprivation on rats. Metabolomic LC-MS data files are extraordinarily complex, and appropriate reduction of the number of spectral peaks via identification of related peaks and background removal is essential. A single component such as hippuric acid generates more than 20 related peaks, including isotopic clusters, adducts, and dimers. Plasma and urine may contain 500-1500 unique quantifiable metabolites. Noise-filtering approaches, including blank subtraction, were used to reduce the number of irrelevant peaks. By grouping related signals such as isotopic peaks and alkali adducts, data processing was greatly simplified, reducing the total number of components by 10-fold. The software processes 48 samples in under 60 minutes. Principal Component Analysis showed substantial differences in endogenous metabolite levels between the animal groups. Annotation of components was accomplished by searching the ChemSpider database. Tentative assignments made using accurate mass need further verification by comparison with the retention times of authentic standards.
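
    The grouping of related signals can be illustrated with a toy sketch (not the BMS/Thermo software): peaks that co-elute and are spaced by roughly one 13C mass difference are collected into isotopic clusters. The m/z and retention-time tolerances are assumptions, and the input is expected to be sorted by m/z.

```python
C13_SPACING = 1.00336   # mass difference of a 13C isotopologue, Da

def group_isotope_peaks(peaks, mz_tol=0.005, rt_tol=0.05):
    """Group (mz, rt, intensity) tuples, sorted by m/z, into isotopic clusters.
    Returns a list of index lists, one list per cluster."""
    used, groups = set(), []
    for i, (mz_i, rt_i, _) in enumerate(peaks):
        if i in used:
            continue
        group = [i]
        used.add(i)
        expected = mz_i + C13_SPACING
        for j in range(i + 1, len(peaks)):
            mz_j, rt_j, _ = peaks[j]
            if mz_j > expected + mz_tol:
                break
            if abs(mz_j - expected) <= mz_tol and abs(rt_j - rt_i) <= rt_tol:
                group.append(j)
                used.add(j)
                expected += C13_SPACING   # look for the next isotope up
        groups.append(group)
    return groups
```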

  6. Virtual Astronomy: The Legacy of the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Hanisch, Robert J.; Berriman, G. B.; Lazio, J.; Szalay, A. S.; Fabbiano, G.; Plante, R. L.; McGlynn, T. A.; Evans, J.; Emery Bunn, S.; Claro, M.; VAO Project Team

    2014-01-01

    Over the past ten years, the Virtual Astronomical Observatory (VAO, http://usvao.org) and its predecessor, the National Virtual Observatory (NVO), have developed and operated a software infrastructure consisting of standards and protocols for data and science software applications. The Virtual Observatory (VO) makes it possible to develop robust software for the discovery, access, and analysis of astronomical data. Every major publicly funded research organization in the US and worldwide has deployed at least some components of the VO infrastructure; tens of thousands of VO-enabled queries for data are invoked daily against catalog, image, and spectral data collections; and groups within the community have developed tools and applications building upon the VO infrastructure. Further, NVO and VAO have helped ensure access to data internationally by co-founding the International Virtual Observatory Alliance (IVOA, http://ivoa.net). The products of the VAO are being archived in a publicly accessible repository. Several science tools developed by the VAO will continue to be supported by the organizations that developed them: the Iris spectral energy distribution package (SAO), the Data Discovery Tool (STScI/MAST, HEASARC), and the scalable cross-comparison service (IPAC). The final year of VAO is focused on development of the data access protocol for data cubes, creation of Python language bindings to VO services, and deployment of a cloud-like data storage service that links to VO data discovery tools (SciDrive). We encourage the community to make use of these tools and services, to extend and improve them, and to carry on with the vision for virtual astronomy: astronomical research enabled by easy access to distributed data and computational resources. Funding for VAO development and operations has been provided jointly by NSF and NASA since May 2010. NSF funding will end in September 2014, though with the possibility of competitive solicitations for VO-based tool development. NASA intends to maintain core VO services such as the resource registry (the index of VO-accessible data collections), monitoring services, and a website as part of the remit of HEASARC, IPAC (IRSA, NED), and MAST.

  7. Apparatus and system for multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2003-06-24

    An apparatus and system for determining the properties of a sample from measured spectral data collected from the sample by performing a method of multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used by a spectrum analyzer to process X-ray spectral data generated by a spectral analysis system that can include a Scanning Electron Microscope (SEM) with an Energy Dispersive Detector and Pulse Height Analyzer.
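
    The core factorization D ≈ CS^T can be sketched with a non-negativity-constrained alternating least-squares loop; this is a minimal illustration of the general technique, not the patented method, and it omits the weighting/unweighting steps described above.

```python
import numpy as np
from scipy.optimize import nnls

def constrained_als(D, n_components, n_iter=50, seed=0):
    """Factor D (pixels x channels) into C (pixels x k) and S (channels x k)
    with non-negativity constraints, so that D ~= C @ S.T."""
    rng = np.random.default_rng(seed)
    n_pix, n_chan = D.shape
    S = rng.random((n_chan, n_components))
    C = np.zeros((n_pix, n_components))
    for _ in range(n_iter):
        # Solve D[i, :] ~= S @ C[i, :] for each pixel (row of C), C >= 0.
        for i in range(n_pix):
            C[i], _ = nnls(S, D[i])
        # Solve D[:, j] ~= C @ S[j, :] for each channel (row of S), S >= 0.
        for j in range(n_chan):
            S[j], _ = nnls(C, D[:, j])
    return C, S
```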

  8. A custom multi-modal sensor suite and data analysis pipeline for aerial field phenotyping

    NASA Astrophysics Data System (ADS)

    Bartlett, Paul W.; Coblenz, Lauren; Sherwin, Gary; Stambler, Adam; van der Meer, Andries

    2017-05-01

    Our group has developed a custom, multi-modal sensor suite and data analysis pipeline to phenotype crops in the field using unpiloted aircraft systems (UAS). This approach to high-throughput field phenotyping is part of a research initiative intended to markedly accelerate the breeding process for refined energy sorghum varieties. To date, single-rotor and multirotor helicopters, roughly 14 kg in total weight, are being employed to provide sensor coverage over multiple hectare-sized fields in tens of minutes. The quick, autonomous operations allow for complete field coverage at consistent plant and lighting conditions, with low operating costs. The sensor suite collects data simultaneously from six sensors and registers it for fusion and analysis. High-resolution color imagery targets color and geometric phenotypes, along with lidar measurements. Long-wave infrared imagery targets temperature phenomena and plant stress. Hyperspectral visible and near-infrared imagery targets phenotypes such as biomass and chlorophyll content, as well as novel, predictive spectral signatures. Onboard spectrometers and careful laboratory and in-field calibration techniques aim to increase the physical validity of the sensor data throughout and across growing seasons. Off-line processing of the data creates basic products such as image maps and digital elevation models. Derived data products include phenotype charts, statistics, and trends. The outcome of this work is a set of commercially available phenotyping technologies, including sensor suites, a fully integrated phenotyping UAS, and data analysis software. Effort is also underway to transition these technologies to farm management users by way of streamlined, lower-cost sensor packages and intuitive software interfaces.

  9. Preliminary study on the differentiation between parapelvic cyst and hydronephrosis with non-calculous using only pre-contrast dual-energy spectral CT scans

    PubMed Central

    Han, Dong; Ma, Guangming; Wei, Lequn; Ren, Chenglong; Zhou, Jieli; Shen, Chen

    2017-01-01

    Objective: To investigate the value of using the quantitative parameters from only the pre-contrast dual-energy spectral CT imaging for distinguishing between parapelvic cyst and hydronephrosis with non-calculous (HNC). Methods: This retrospective study was approved by the institutional review board. 28 patients with parapelvic cyst and 24 patients with HNC who underwent standard pre-contrast and multiphase contrast-enhanced dual-energy spectral CT imaging were retrospectively identified. The parapelvic cyst and HNC were identified using the contrast-enhanced scans, and their CT number in the 70-keV monochromatic images, effective atomic number (Zeff), iodine concentration (IC) and water concentration in the pre-contrast images were measured. The slope of the spectral curve (λ) was calculated. The difference in the measurements between parapelvic cyst and HNC was statistically analyzed using SPSS® v. 19.0 (IBM Corp., New York, NY; formerly SPSS Inc., Chicago, IL) statistical software. Receiver-operating characteristic analysis was performed to assess the diagnostic performance. Results: The CT numbers in the 70-keV images, Zeff and IC values were statistically different between parapelvic cyst and HNC (all p < 0.05). The sensitivity, specificity and accuracy of these parameters for distinguishing between parapelvic cyst and HNC were 89.2%, 73.3% and 82.1%; 86.5%, 43.3% and 67.2%; 91.9%, 40.0% and 68.7%; and 64.9%, 73.3% and 83.6%, respectively, and the combined specificity was 92.9%. There was no statistical difference in λ between the two groups (p > 0.05). Conclusion: The quantitative parameters obtained in the pre-contrast dual-energy spectral CT imaging may be used to differentiate between parapelvic cyst and HNC. Advances in knowledge: The pre-contrast dual-energy spectral CT scans may be used to screen parapelvic cysts for patients who are asymptomatic, thereby avoiding contrast-enhanced CT or CT urography examination for these patients to reduce ionizing radiation dose and contrast dose. PMID:28281789

  10. Spectral monitoring of toluene and ethanol in gasoline blends using Fourier-Transform Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Ortega Clavero, Valentin; Weber, Andreas; Schröder, Werner; Curticapean, Dan; Meyrueis, Patrick; Javahiraly, Nicolas

    2013-04-01

    The combination of fossil-derived fuels with ethanol and methanol has acquired relevance and attention in several countries in recent years. This trend is strongly affected by market prices, constant geopolitical events, new sustainability policies, new laws and regulations, etc. Besides bio-fuels, these blends also include different additives as anti-knock agents and octane enhancers. Some of the chemical compounds in these additives may have harmful properties for both the environment and public health (besides inherent properties such as volatility). We present detailed Raman spectral information from toluene (C7H8) and ethanol (C2H6O) contained in samples of E10 gasoline-ethanol blends. The spectral information has been extracted using a robust, high-resolution Fourier-transform Raman spectrometer (FT-Raman) prototype. This spectral information has also been compared with Raman spectra from pure additives and with standard Raman lines in order to validate its frequency accuracy. The spectral information is presented in the range of 0 cm-1 to 3500 cm-1 with a resolution of 1.66 cm-1. This allows resolving closely adjacent Raman lines such as the ones observed around 1003 cm-1 and 1030 cm-1 (characteristic lines of toluene). The Raman spectra obtained show a reduced frequency deviation when compared to standard Raman spectra from different calibration materials. The FT-Raman spectrometer prototype used for the analysis consists basically of a Michelson interferometer and a self-designed photon counter cooled on a Peltier element arrangement. The light coupling is achieved with conventional 62.5/125 μm multi-mode fibers. This FT-Raman setup is able to extract high-resolution, frequency-precise Raman spectra from the additives in the fuels analyzed. The proposed prototype has no additional complex hardware components or costly software modules. The mechanical and thermal disturbances affecting the FT-Raman system are mathematically compensated by accurately extracting the optical path information of the Michelson interferometer. This is accomplished by generating an additional interference pattern with a λ = 632.8 nm helium-neon (HeNe) laser. This enables the FT-Raman system to perform reliable and clean spectral measurements of the materials under observation.

  11. [Identification of Dendrobium varieties by Fourier transform infrared spectroscopy combined with spectral retrieval].

    PubMed

    Liu, Fei; Wang, Yuan-zhong; Deng, Xing-yan; Jin, Hang; Yang, Chun-yan

    2014-06-01

    The infrared spectra of stems from 165 trees of 23 Dendrobium varieties were obtained by Fourier transform infrared spectroscopy. The spectra show that all the samples are similar and that the main component of Dendrobium stems is cellulose. Using the spectral software Omnic 8.0, three spectral databases were constructed: Lib01 consists of the average spectra of the first four trees of each variety, while Lib02 and Lib03 were constructed from the first-derivative and second-derivative spectra of the average spectra, respectively. Correlation search, square difference retrieval, and square differential difference retrieval were performed against Lib01 in the specified range of 1 800-500 cm(-1), yielding correct rates of 92.7%, 74.5% and 92.7%, respectively. The square differential difference retrieval of the first-derivative and second-derivative spectra was carried out with Lib02 and Lib03 in the same range of 1 800-500 cm(-1), and shows correct rates of 93.9% for the former and 90.3% for the latter. The results show that the first-derivative spectral retrieval with the square differential difference algorithm is more suitable for discerning Dendrobium varieties, and that FTIR combined with spectral retrieval can identify different varieties of Dendrobium; the correlation retrieval, the square difference retrieval, and the first-derivative and second-derivative spectral retrievals in the specified spectral range are effective and simple ways of distinguishing different varieties of Dendrobium.
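
    The retrieval metrics named above can be illustrated generically (Omnic's implementations are not public here): rank library spectra against a query, over a fixed wavenumber window, by a correlation score and by a squared-difference score. Both spectra sets are assumed to share the same wavenumber grid.

```python
import numpy as np

def search_library(query, library, names):
    """Rank library spectra (rows on the same wavenumber grid as query)
    by correlation and by sum of squared differences."""
    q = (query - query.mean()) / query.std()
    corr, ssd = [], []
    for spec in library:
        s = (spec - spec.mean()) / spec.std()
        corr.append(np.mean(q * s))               # correlation-style score
        ssd.append(np.sum((query - spec) ** 2))   # squared-difference score
    best_corr = names[int(np.argmax(corr))]
    best_ssd = names[int(np.argmin(ssd))]
    return best_corr, best_ssd
```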

  12. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
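
    In the spirit of the hybrid approach described above (this is a toy sketch, not the patented method), a classical-least-squares prediction step can be augmented with extra spectral shapes (for example a drift shape) that were absent from the original calibration, so their contribution is fit as a nuisance term rather than biasing the calibrated components.

```python
import numpy as np

def augmented_cls_predict(spectrum, K_calibrated, extra_shapes):
    """Estimate component amounts from one spectrum by least squares, after
    augmenting the calibrated pure-component spectra K (components x channels)
    with additional spectral shapes not present at calibration time."""
    K_aug = np.vstack([K_calibrated, extra_shapes])   # new shapes as extra rows
    coeffs, *_ = np.linalg.lstsq(K_aug.T, spectrum, rcond=None)
    n_cal = K_calibrated.shape[0]
    return coeffs[:n_cal], coeffs[n_cal:]   # calibrated amounts, nuisance amplitudes
```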

  13. Analysis of spectrally resolved autofluorescence images by support vector machines

    NASA Astrophysics Data System (ADS)

    Mateasik, A.; Chorvat, D.; Chorvatova, A.

    2013-02-01

    Spectral analysis of autofluorescence images of isolated cardiac cells was performed to evaluate and classify the metabolic state of the cells with respect to their responses to metabolic modulators. The classification was done using a machine learning approach based on a support vector machine with a set of automatically calculated features from the recorded spectral profile of the spectral autofluorescence images. This classification method was compared with the classical approach, in which the individual spectral components contributing to cell autofluorescence were estimated by spectral analysis, namely by blind source separation using non-negative matrix factorization. Comparison of both methods showed that machine learning can effectively classify the spectrally resolved autofluorescence images without the need for detailed knowledge about the sources of autofluorescence and their spectral properties.
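
    A generic version of the SVM classification step (not the authors' feature set or kernel choice) can be sketched with scikit-learn; the feature matrix and labels below are random placeholders standing in for per-cell spectral-profile features and metabolic-state classes.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row of spectral-profile features per cell; y: metabolic-state label.
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 16))
y = rng.integers(0, 3, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0, gamma='scale'))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```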

  14. High-Resolution Spectroscopic Database for the NASA Earth Observing System Program

    NASA Technical Reports Server (NTRS)

    Rothman, Laurence

    2003-01-01

    The purpose of this project is to develop and enhance the HITRAN molecular spectroscopic database and associated software to support the observational programs of the Earth Observing System (EOS). In particular, the focus is on the EOS projects: the Atmospheric Infrared Sounder (AIRS), the High-Resolution Dynamics Limb Sounder (HIRDLS), Measurements of Pollution in the Troposphere (MOPITT), the Tropospheric Emission Spectrometer (TES), and the Stratospheric Aerosol and Gas Experiment (SAGE III). The HITRAN program is also involved in the Ozone Monitoring Experiment (OMI). The data requirements of these programs in terms of spectroscopy are varied with respect to constituents being observed, required remote-sensing parameters, and spectral coverage. A general requisite is for additional spectral parameters and improvements to existing molecular bands sufficient for the simulation of the observations leading to retrieval of the atmospheric state. In addition, cross-section data for heavier molecular species must be expanded and made amenable to modeling in remote sensing. The effort in the project also includes developing software and distribution to make access, manipulation, and use of HITRAN functional to the EOS program.

  15. High Resolution Spectroscopic Database for the NASA Earth Observing System Program

    NASA Technical Reports Server (NTRS)

    Rothman, Laurence

    2004-01-01

    The purpose of this project has been to develop and enhance the HITRAN molecular spectroscopic database and associated software to support the observational programs of the Earth Observing System (EOS). Emphasis has been on the EOS projects: the Atmospheric Infrared Sounder (AIRS), the High-Resolution Dynamics Limb Sounder (HIRDLS), Measurements of Pollution in the Troposphere (MOPITT), the Tropospheric Emission Spectrometer (TES), and the Stratospheric Aerosol and Gas Experiment (SAGE III). The HITRAN program is also involved in the Ozone Monitoring Experiment (OMI). The data requirements of these programs in terms of spectroscopy are varied with respect to constituents being observed, required remote-sensing parameters, and spectral coverage. A general requisite is for additional spectral parameters and improvements to existing molecular bands sufficient for the simulation of the observations leading to retrieval of the atmospheric state. In addition, cross-section data for heavier molecular species must be expanded and made amenable to modeling in remote sensing. The effort in the project also includes developing software and distribution to make access, manipulation, and use of HITRAN functional to the EOS program.

  16. High-Resolution Spectroscopic Database for the NASA Earth Observing System Program

    NASA Technical Reports Server (NTRS)

    Rothman, Laurence S.

    2004-01-01

    The purpose of this project is to develop and enhance the HITRAN molecular spectroscopic database and associated software to support the observational programs of the Earth Observing System (EOS). In particular, the focus is on the EOS projects: the Atmospheric Infrared Sounder (AIRS), the High-Resolution Dynamics Limb Sounder (HIRDLS), Measurements of Pollution in the Troposphere (MOPITT), the Tropospheric Emission Spectrometer (TES), and the Stratospheric Aerosol and Gas Experiment (SAGE III). The HITRAN program is also involved in the Ozone Monitoring Experiment (OMI). The data requirements of these programs in terms of spectroscopy are varied with respect to constituents being observed, required remote-sensing parameters, and spectral coverage. A general requisite is for additional spectral parameters and improvements to existing molecular bands sufficient for the simulation of the observations leading to retrieval of the atmospheric state. In addition, cross-section data for heavier molecular species must be expanded and made amenable to modeling in remote sensing. The effort in the project also includes developing software and distribution to make access, manipulation, and use of HITRAN functional to the EOS program.

  17. ICER-3D Hyperspectral Image Compression Software

    NASA Technical Reports Server (NTRS)

    Xie, Hua; Kiely, Aaron; Klimesh, Matthew; Aranki, Nazeeh

    2010-01-01

    Software has been developed to implement the ICER-3D algorithm. ICER-3D effects progressive, three-dimensional (3D), wavelet-based compression of hyperspectral images. If a compressed data stream is truncated, the progressive nature of the algorithm enables reconstruction of hyperspectral data at fidelity commensurate with the given data volume. The ICER-3D software is capable of providing either lossless or lossy compression, and incorporates an error-containment scheme to limit the effects of data loss during transmission. The compression algorithm, which was derived from the ICER image compression algorithm, includes wavelet-transform, context-modeling, and entropy coding subalgorithms. The 3D wavelet decomposition structure used by ICER-3D exploits correlations in all three dimensions of sets of hyperspectral image data, while facilitating elimination of spectral ringing artifacts, using a technique summarized in "Improving 3D Wavelet-Based Compression of Spectral Images" (NPO-41381), NASA Tech Briefs, Vol. 33, No. 3 (March 2009), page 7a. Correlation is further exploited by a context-modeling subalgorithm, which exploits spectral dependencies in the wavelet-transformed hyperspectral data, using an algorithm that is summarized in "Context Modeler for Wavelet Compression of Hyperspectral Images" (NPO-43239), which follows this article. An important feature of ICER-3D is a scheme for limiting the adverse effects of loss of data during transmission. In this scheme, as in the similar scheme used by ICER, the spatial-frequency domain is partitioned into rectangular error-containment regions. In ICER-3D, the partitions extend through all the wavelength bands. The data in each partition are compressed independently of those in the other partitions, so that loss or corruption of data from any partition does not affect the other partitions. Furthermore, because compression is progressive within each partition, when data are lost, any data from that partition received prior to the loss can be used to reconstruct that partition at lower fidelity. By virtue of the compression improvement it achieves relative to previous means of onboard data compression, this software enables (1) increased return of hyperspectral scientific data in the presence of limits on the rates of transmission of data from spacecraft to Earth via radio communication links and/or (2) reduction in spacecraft radio-communication power and/or cost through reduction in the amounts of data required to be downlinked and stored onboard prior to downlink. The software is also suitable for compressing hyperspectral images for ground storage or archival purposes.
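
    ICER-3D itself is flight software and is not reproduced here; purely as a minimal illustration of the error-containment idea described above (rectangular spatial partitions that extend through all wavelength bands and are processed independently), the following NumPy sketch uses a toy cube and shows that losing one partition leaves the others untouched. All sizes and values are placeholders.

```python
import numpy as np

def split_into_partitions(cube, tile):
    """Split a (bands, rows, cols) hyperspectral cube into spatial tiles.

    Each tile spans all wavelength bands, mirroring the idea that
    error-containment regions extend through the full spectral dimension.
    """
    bands, rows, cols = cube.shape
    parts = {}
    for r0 in range(0, rows, tile):
        for c0 in range(0, cols, tile):
            parts[(r0, c0)] = cube[:, r0:r0 + tile, c0:c0 + tile].copy()
    return parts

def reassemble(parts, shape):
    """Rebuild the cube; a missing or corrupted tile only leaves a local hole."""
    cube = np.full(shape, np.nan)
    for (r0, c0), block in parts.items():
        _, h, w = block.shape
        cube[:, r0:r0 + h, c0:c0 + w] = block
    return cube

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cube = rng.random((8, 64, 64))          # toy cube: 8 bands, 64x64 pixels
    parts = split_into_partitions(cube, 16)
    parts.pop((16, 32))                     # simulate loss of one partition in transit
    recon = reassemble(parts, cube.shape)
    lost = np.isnan(recon).sum() / recon.size
    print(f"fraction of cube affected by the lost partition: {lost:.3f}")
```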

  18. Methyl green and nitrotetrazolium blue chloride co-expression in colon tissue: A hyperspectral microscopic imaging analysis

    NASA Astrophysics Data System (ADS)

    Li, Qingli; Liu, Hongying; Wang, Yiting; Sun, Zhen; Guo, Fangmin; Zhu, Jianzhong

    2014-12-01

    Histological observation of dual-stained colon sections is usually performed by visual observation under a light microscope, or by viewing on a computer screen with the assistance of image processing software in both research and clinical settings. These traditional methods are usually not sufficient to reliably differentiate spatially overlapping chromogens generated by different dyes. Hyperspectral microscopic imaging technology offers a solution for these constraints as the hyperspectral microscopic images contain information that allows differentiation between spatially co-located chromogens with similar but distinct spectra. In this paper, a hyperspectral microscopic imaging (HMI) system is used to identify methyl green and nitrotetrazolium blue chloride in dual-stained colon sections. Hyperspectral microscopic images are captured and the normalized score algorithm is proposed to identify the stains and generate the co-expression results. Experimental results show that the proposed normalized score algorithm can generate more accurate co-localization results than the spectral angle mapper algorithm. The hyperspectral microscopic imaging technology can enhance the visualization of dual-stained colon sections and improve the contrast and legibility of each stain using its spectral signature, which is helpful for pathologists performing histological analyses.

  19. Hyperspectral proximal sensing of Salix Alba trees in the Sacco river valley (Latium, Italy).

    PubMed

    Moroni, Monica; Lupo, Emanuela; Cenedese, Antonio

    2013-10-29

    Recent developments in hardware and software have increased the possibilities and reduced the costs of hyperspectral proximal sensing. Through the analysis of high resolution spectroscopic measurements at the laboratory or field scales, this monitoring technique is suitable for quantitative estimates of biochemical and biophysical variables related to the physiological state of vegetation. Two systems for hyperspectral imaging have been designed and developed at DICEA-Sapienza University of Rome, one based on the use of spectrometers, the other on tunable interference filters. Both systems provide a high spectral and spatial resolution with low weight, power consumption and cost. This paper describes the set-up of the tunable filter platform and its application to the investigation of the environmental status of the region crossed by the Sacco river (Latium, Italy). This was achieved by analyzing the spectral response given by tree samples, with roots partly or wholly submerged in the river, located upstream and downstream of an industrial area affected by contamination. The acquired data are represented as reflectance indices as well as reflectance values. Broadband and narrowband indices based on pigment content and carotenoids vs. chlorophyll content suggest that tree samples located upstream of the contaminated area are 'healthier' than those downstream.

  20. Functional Connectivity Changes in Resting-State EEG as Potential Biomarker for Amyotrophic Lateral Sclerosis.

    PubMed

    Iyer, Parameswaran Mahadeva; Egan, Catriona; Pinto-Grau, Marta; Burke, Tom; Elamin, Marwa; Nasseroleslami, Bahman; Pender, Niall; Lalor, Edmund C; Hardiman, Orla

    2015-01-01

    Amyotrophic Lateral Sclerosis (ALS) is heterogeneous and overlaps with frontotemporal dementia. Spectral EEG can predict damage in structural and functional networks in frontotemporal dementia but has never been applied to ALS. 18 incident ALS patients with normal cognition and 17 age-matched controls underwent 128-channel EEG and neuropsychology assessment. The EEG data were analyzed using FieldTrip software in MATLAB to calculate simple connectivity measures and scalp network measures. sLORETA was used in nodal analysis for source localization and the same methods were applied as above to calculate nodal network measures. Graph theory measures were used to assess network integrity. Cross-spectral density in the alpha band was higher in patients. In ALS patients, increased degree values of the network nodes were noted in the central and frontal regions in the theta band across seven of the different connectivity maps (p<0.0005). Among patients, the clustering coefficient in alpha and gamma bands was increased in all regions of the scalp and connectivity was significantly increased (p=0.02). The nodal network showed increased assortativity in the alpha band in the patient group. The Clustering Coefficient in Partial Directed Connectivity (PDC) showed significantly higher values for patients in alpha, beta, gamma, theta and delta frequencies (p=0.05). There is increased connectivity in the fronto-central regions of the scalp and areas corresponding to Salience and Default Mode network in ALS, suggesting a pathologic disruption of neuronal networking in early disease states. Spectral EEG has potential utility as a biomarker in ALS.
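
    The authors' analysis used FieldTrip and sLORETA in MATLAB; the sketch below is a generic Python stand-in, not their pipeline, for one of the simpler quantities involved: alpha-band magnitude-squared coherence between channels and the resulting weighted node degree. The channel count, band limits, and noise data are placeholders.

```python
import numpy as np
from scipy.signal import csd, welch

def alpha_band_coherence(eeg, fs, band=(8.0, 13.0)):
    """Magnitude-squared coherence between all channel pairs, averaged over a band.

    eeg: array of shape (n_channels, n_samples). Returns an (n_channels, n_channels)
    connectivity matrix; a simple weighted node degree is its row sum.
    """
    n_ch = eeg.shape[0]
    conn = np.zeros((n_ch, n_ch))
    freqs, _ = welch(eeg[0], fs=fs, nperseg=fs * 2)
    psd = [welch(eeg[i], fs=fs, nperseg=fs * 2)[1] for i in range(n_ch)]
    sel = (freqs >= band[0]) & (freqs <= band[1])
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            _, sxy = csd(eeg[i], eeg[j], fs=fs, nperseg=fs * 2)
            coh = np.abs(sxy) ** 2 / (psd[i] * psd[j])   # Welch coherence estimate
            conn[i, j] = conn[j, i] = coh[sel].mean()
    return conn

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    fs, x = 256, rng.standard_normal((4, 256 * 30))      # 4 channels, 30 s of noise
    conn = alpha_band_coherence(x, fs)
    degree = conn.sum(axis=1)                            # weighted node degree
    print(np.round(degree, 3))
```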

  1. ESIprot: a universal tool for charge state determination and molecular weight calculation of proteins from electrospray ionization mass spectrometry data.

    PubMed

    Winkler, Robert

    2010-02-01

    Electrospray ionization (ESI) ion trap mass spectrometers with relatively low resolution are frequently used for the analysis of natural products and peptides. Although ESI spectra of multiply charged protein molecules also can be measured on this type of device, only average spectra are produced for the majority of naturally occurring proteins. Evaluating such ESI protein spectra would provide valuable information about the native state of investigated proteins. However, no suitable and freely available software could be found which allows the charge state determination and molecular weight calculation of single proteins from average ESI-MS data. Therefore, an algorithm based on standard deviation optimization (scatter minimization) was implemented for the analysis of protein ESI-MS data. The resulting software ESIprot was tested with ESI-MS data of six intact reference proteins between 12.4 and 66.7 kDa. In all cases, the correct charge states could be determined. The obtained absolute mass errors were in a range between -0.2 and 1.2 Da, and the relative errors were below 30 ppm. The possible mass accuracy allows for valid conclusions about the actual condition of proteins. Moreover, the ESIprot algorithm demonstrates an extraordinary robustness and allows spectral interpretation from as little as two peaks, given sufficient quality of the provided m/z data, without the necessity for peak intensity data. ESIprot is independent from the raw data format and the computer platform, making it a versatile tool for mass spectrometrists. The program code was released under the open-source GPLv3 license to support future developments of mass spectrometry software. Copyright 2010 John Wiley & Sons, Ltd.
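
    ESIprot's GPLv3 source is the authoritative implementation; purely as an illustration of the scatter-minimization idea described above, a minimal sketch (assuming consecutive charge states and simple proton adducts) might look like the following. The peak values in the example are synthetic.

```python
import numpy as np

PROTON = 1.007276  # proton mass in Da

def assign_charges(mz_peaks, z_max=60):
    """Assign consecutive charge states to a series of multiply charged peaks.

    mz_peaks: m/z values of adjacent peaks of one protein, sorted ascending.
    For each candidate charge of the first (lowest m/z, highest charge) peak,
    deconvoluted masses M = z * (m/z - PROTON) are computed for the series
    z, z-1, z-2, ...; the candidate minimizing the standard deviation of the
    masses (the "scatter") wins, and the mean mass is reported.
    """
    mz = np.sort(np.asarray(mz_peaks, dtype=float))
    n = len(mz)
    best = (np.inf, None, None)
    for z_first in range(n, z_max + 1):
        charges = z_first - np.arange(n)          # descending charges with ascending m/z
        masses = charges * (mz - PROTON)
        scatter = masses.std()
        if scatter < best[0]:
            best = (scatter, charges, masses.mean())
    return best

if __name__ == "__main__":
    # synthetic peaks for a 14,305 Da protein carrying 10..14 protons
    true_mass = 14305.0
    peaks = [(true_mass + z * PROTON) / z for z in range(14, 9, -1)]
    scatter, charges, mass = assign_charges(peaks)
    print(charges, round(mass, 1), round(scatter, 3))
```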

  2. Training the Next Generation in Space Situational Awareness Research

    NASA Astrophysics Data System (ADS)

    Colpo, D.; Reddy, V.; Arora, S.; Tucker, S.; Jeffries, L.; May, D.; Bronson, R.; Hunten, E.

    Traditional academic SSA research has relied on commercial off-the-shelf (COTS) systems for collecting metric and lightcurve data. COTS systems have several advantages over a custom-built system including cost, easy integration, technical support and short deployment timescales. We at the University of Arizona took an alternative approach to develop a sensor system for space object characterization. Five engineering students designed and built two 0.6-meter F/4 electro-optical (EO) systems for collecting lightcurve and spectral data. All the design and fabrication work was carried out over the course of two semesters as part of their senior design project that is mandatory for the completion of their bachelor's degree in engineering. The students designed over 200 individual parts using three-dimensional modeling software (SolidWorks), and conducted detailed optical design analysis using raytracing software (ZEMAX), with oversight and advice from a faculty sponsor and Starizona, a local small business in Tucson. The components of the design were verified by test, analysis, inspection, or demonstration, per the process that the University of Arizona requires for each of its design projects. Methods to complete this project include mechanical FEA, optical testing methods (Foucault Knife Edge Test and Couder Mask Test), tests to verify the function of the thermometers, and a final pointing model test. A surprise outcome of our exercise is that the entire cost of the design and fabrication of these two EO systems was significantly lower than a COTS alternative. With careful planning and coordination, we were also able to reduce the deployment times to those of a commercial system. Our experience shows that development of hardware and software for SSA research could be accomplished in an academic environment that would enable the training of the next generation with active support from local small businesses.

  3. Bi-Force: large-scale bicluster editing and its application to gene expression data biclustering

    PubMed Central

    Sun, Peng; Speicher, Nora K.; Röttger, Richard; Guo, Jiong; Baumbach, Jan

    2014-01-01

    The explosion of biological data has dramatically reformed today's biological research. The need to integrate and analyze high-dimensional biological data on a large scale is driving the development of novel bioinformatics approaches. Biclustering, also known as ‘simultaneous clustering’ or ‘co-clustering’, has been successfully utilized to discover local patterns in gene expression data and similar biomedical data types. Here, we contribute a new heuristic: ‘Bi-Force’. It is based on the weighted bicluster editing model and performs biclustering on arbitrary sets of biological entities, given any kind of pairwise similarities. We first evaluated the power of Bi-Force to solve dedicated bicluster editing problems by comparing Bi-Force with two existing algorithms in the BiCluE software package. We then followed a biclustering evaluation protocol in a recent review paper from Eren et al. (2013) (A comparative analysis of biclustering algorithms for gene expression data. Brief. Bioinform., 14:279–292.) and compared Bi-Force against eight existing tools: FABIA, QUBIC, Cheng and Church, Plaid, BiMax, Spectral, xMOTIFs and ISA. To this end, a suite of synthetic datasets as well as nine large gene expression datasets from Gene Expression Omnibus were analyzed. All resulting biclusters were subsequently investigated by Gene Ontology enrichment analysis to evaluate their biological relevance. The distinct theoretical foundation of Bi-Force (bicluster editing) is more powerful than strict biclustering. We thus outperformed existing tools with Bi-Force at least when following the evaluation protocols from Eren et al. Bi-Force is implemented in Java and integrated into the open source software package of BiCluE. The software as well as all used datasets are publicly available at http://biclue.mpi-inf.mpg.de. PMID:24682815

  4. Single Point vs. Mapping Approach for Spectral Cytopathology (SCP)

    PubMed Central

    Schubert, Jennifer M.; Mazur, Antonella I.; Bird, Benjamin; Miljković, Miloš; Diem, Max

    2011-01-01

    In this paper we describe the advantages of collecting infrared microspectral data in imaging mode as opposed to point mode. Imaging data are processed using the PapMap algorithm, which co-adds pixel spectra that have been scrutinized for R-Mie scattering effects as well as other constraints. The signal-to-noise quality of PapMap spectra will be compared to point spectra for oral mucosa cells deposited onto low-e slides. Also, the effects of software atmospheric correction will be discussed. Combined with the PapMap algorithm, data collection in imaging mode proves to be a superior method for spectral cytopathology. PMID:20449833

  5. Multiplane and Spectrally-Resolved Single Molecule Localization Microscopy with Industrial Grade CMOS cameras.

    PubMed

    Babcock, Hazen P

    2018-01-29

    This work explores the use of industrial grade CMOS cameras for single molecule localization microscopy (SMLM). We show that industrial grade CMOS cameras approach the performance of scientific grade CMOS cameras at a fraction of the cost. This makes it more economically feasible to construct high-performance imaging systems with multiple cameras that are capable of a diversity of applications. In particular, we demonstrate the use of industrial CMOS cameras for biplane, multiplane and spectrally resolved SMLM. We also provide open-source software for simultaneous control of multiple CMOS cameras and for the reduction of the acquired movies to super-resolution images.

  6. A synthetic method of solar spectrum based on LED

    NASA Astrophysics Data System (ADS)

    Wang, Ji-qiang; Su, Shi; Zhang, Guo-yu; Zhang, Jian

    2017-10-01

    A method for synthesizing the solar spectrum, based on the spectral characteristics of the solar spectrum and of LEDs and on the principle of arbitrary spectral synthesis, was studied using 14 kinds of LEDs with different central wavelengths. The LED and solar spectrum data were first selected with Origin software; the total number of LEDs for each central band was then calculated from the transformation relation between brightness and illumination and by least-squares curve fitting in MATLAB. Finally, the spectral curve of the AM1.5 standard solar spectrum was obtained. The results met the technical requirements of matching the solar spectrum to within ±20% and of a solar constant >0.5.
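
    The authors worked in Origin and MATLAB; as a hedged Python stand-in for the fitting step, non-negative least squares can weight a set of LED emission curves so that their sum approximates a target spectrum. The Gaussian LED profiles and the interpolated target below are synthetic placeholders, not the AM1.5 data used in the paper.

```python
import numpy as np
from scipy.optimize import nnls

def fit_led_weights(led_spectra, target):
    """Find non-negative LED weights whose weighted sum best matches a target spectrum.

    led_spectra: (n_wavelengths, n_leds) array, one column per LED emission curve.
    target: (n_wavelengths,) array, e.g. a reference spectrum resampled onto the
    same wavelength grid. Returns the weights and the synthesized spectrum.
    """
    weights, _ = nnls(led_spectra, target)
    return weights, led_spectra @ weights

if __name__ == "__main__":
    wl = np.linspace(400.0, 900.0, 501)                          # nm grid
    centers = np.linspace(420.0, 880.0, 14)                      # 14 LED center wavelengths
    leds = np.exp(-0.5 * ((wl[:, None] - centers) / 15.0) ** 2)  # Gaussian LED profiles
    target = np.interp(wl, [400, 650, 900], [0.8, 1.0, 0.6])     # stand-in for AM1.5
    w, synth = fit_led_weights(leds, target)
    mismatch = np.max(np.abs(synth - target) / target)
    print("worst-case relative mismatch:", round(float(mismatch), 3))
```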

  7. Features in the spectra of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Stanek, Krzysztof Z.; Paczynski, Bohdan; Goodman, Jeremy

    1993-01-01

    Gravitational lensing of cosmological gamma-ray bursts by objects in the mass range of about 10^17 to 10^20 g (femtolensing) may introduce complicated interference patterns that might be interpreted as absorption or emission lines in the bursts' spectra. This phenomenon, if detected, may be used as a unique probe of dark matter in the universe. The BATSE spectral data should allow one to detect such spectral features or to put significant upper limits on the cosmic density of a dark matter component that may be in the femtolensing range. Software to generate theoretical spectra has been developed, and it is accessible over the computer network with anonymous ftp.

  8. A new method for mapping the Oran region (Algeria) using multispectral remote sensing

    NASA Astrophysics Data System (ADS)

    Laoufi, Fatiha; Belbachir, Ahmed-Hafid; Benabadji, Noureddine; Zanoun, Abdelouahab

    2011-10-01

    We have mapped the region of Oran, Algeria, using multispectral remote sensing with different resolutions. For the identification of objects on the ground using their spectral signatures, two methods were applied to images from SPOT, LANDSAT, IRS-1 C and ASTER. The first one, called the Base Rule method (BR method), is based on a set of rules that must be met by each pixel in the different reflectance-calibrated bands before it is assigned to a given class. The construction of these rules is based on the spectral profiles of common classes in the scene studied. The second one, called the Spectral Angle Mapper method (SAM method), is based on the direct calculation of the spectral angle between the target vector, representing the spectral profile of the desired class, and the pixel vector, whose components are the digital counts in the different bands of the reflectance-calibrated image. This new method was performed using PCSATWIN software developed by our own laboratory LAAR. After collecting a library of spectral signatures from multiple libraries, a detailed study of the principles and physical processes that can influence the spectral signature was conducted. The final goal is to establish the range of variation of the spectral profile of a well-defined class and therefore to obtain precise bases for spectral rules. From the results we have obtained, we find that the supervised classification of these pixels by the BR method derived from spectral signatures reduces the uncertainty associated with identifying objects by significantly enhancing the percentage of correct classification with very distinct classes.
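
    The PCSATWIN implementation is not public, but the spectral angle itself is a standard calculation; a minimal NumPy sketch of the SAM step (with a toy 4-band target and an assumed angle threshold) is shown below.

```python
import numpy as np

def spectral_angle(pixels, target):
    """Spectral angle (radians) between each pixel spectrum and a target spectrum.

    pixels: (n_pixels, n_bands) array of calibrated reflectances or counts.
    target: (n_bands,) reference spectrum of the class of interest.
    Smaller angles mean closer spectral shape, independent of overall brightness.
    """
    p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    t = target / np.linalg.norm(target)
    cos = np.clip(p @ t, -1.0, 1.0)
    return np.arccos(cos)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    target = np.array([0.10, 0.15, 0.35, 0.60])      # toy 4-band class profile
    scene = rng.uniform(0.0, 1.0, size=(1000, 4))
    angles = spectral_angle(scene, target)
    labels = angles < 0.10                            # threshold in radians (assumed)
    print("pixels matched:", int(labels.sum()))
```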

  9. Quantitative subpixel spectral detection of targets in multispectral images. [terrestrial and planetary surfaces

    NASA Technical Reports Server (NTRS)

    Sabol, Donald E., Jr.; Adams, John B.; Smith, Milton O.

    1992-01-01

    The conditions that affect the spectral detection of target materials at the subpixel scale are examined. Two levels of spectral mixture analysis for determining threshold detection limits of target materials in a spectral mixture are presented, the cases where the target is detected as: (1) a component of a spectral mixture (continuum threshold analysis) and (2) residuals (residual threshold analysis). The results of these two analyses are compared under various measurement conditions. The examples illustrate the general approach that can be used for evaluating the spectral detectability of terrestrial and planetary targets at the subpixel scale.
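
    As a rough illustration of the mixture-analysis reasoning above (not the authors' exact procedure), the sketch below unmixes a pixel against a reduced endmember set and inspects the residual: when the target material is omitted from the endmembers, its subpixel presence shows up as an elevated residual. The endmember spectra and mixing proportions are invented for the example.

```python
import numpy as np

def unmix(pixels, endmembers):
    """Estimate endmember fractions and residuals for each pixel spectrum.

    pixels: (n_pixels, n_bands); endmembers: (n_endmembers, n_bands).
    Uses unconstrained least squares as the simplest form of linear mixture
    analysis; the residual RMS indicates how well the model explains a pixel.
    """
    A = endmembers.T                                     # (bands, endmembers)
    fractions, _, _, _ = np.linalg.lstsq(A, pixels.T, rcond=None)
    model = (A @ fractions).T
    residual_rms = np.sqrt(((pixels - model) ** 2).mean(axis=1))
    return fractions.T, residual_rms

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    soil = np.array([0.30, 0.35, 0.40, 0.45])
    veg = np.array([0.05, 0.08, 0.45, 0.50])
    target = np.array([0.60, 0.10, 0.20, 0.15])          # hypothetical target material
    mix = 0.7 * soil + 0.25 * veg + 0.05 * target + rng.normal(0, 0.005, 4)
    frac, rms = unmix(mix[None, :], np.vstack([soil, veg]))
    # a raised residual hints at an unmodeled (target) component in the pixel
    print("fractions:", np.round(frac, 3), "residual RMS:", round(float(rms[0]), 4))
```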

  10. Harness That S.O.B.: Distributing Remote Sensing Analysis in a Small Office/Business

    NASA Astrophysics Data System (ADS)

    Kramer, J.; Combe, J.; McCord, T. B.

    2009-12-01

    Researchers in a small office/business (SOB) operate with limited funding, equipment, and software availability. To mitigate these issues, we developed a distributed computing framework that: 1) leverages open source software to implement functionality otherwise reliant on proprietary software and 2) harnesses the unused power of (semi-)idle office computers with mixed operating systems (OSes). This abstract outlines some reasons for the effort, its conceptual basis and implementation, and provides brief speedup results. The Multiple-Endmember Linear Spectral Unmixing Model (MELSUM)1 processes remote-sensing (hyper-)spectral images. The algorithm is computationally expensive, sometimes taking a full week or more for a 1 million pixel/100 wavelength image. Analysis of pixels is independent, so a large benefit can be gained from parallel processing techniques. Job concurrency is limited by the number of active processing units. MELSUM was originally written in the Interactive Data Language (IDL). Despite its multi-threading capabilities, an IDL instance executes on a single machine, and so concurrency is limited by the machine's number of central processing units (CPUs). Network distribution can access more CPUs to provide a greater speedup, while also taking advantage of (often) underutilized extant equipment. Appropriately integrating open source software magnifies the impact by avoiding the purchase of additional licenses. Our method of distribution breaks into four conceptual parts: 1) the top- or task-level user interface; 2) a mid-level program that manages hosts and jobs, called the distribution server; 3) a low-level executable for individual pixel calculations; and 4) a control program to synchronize sequential sub-tasks. Each part is a separate OS process, passing information via shell commands and/or temporary files. While the control and low-level executables are short-lived, the top-level program and distribution server run (at least) for the entirety of a task. While any language that supports "spawning" of OS processes can serve as the top-level interface, our solution, d-MELSUM, has been integrated with the IDL code. Doing so extracts the core calculation from IDL, but otherwise preserves IDL features and functionality. The distribution server is an extension of ADE2 mobile robot software, written in Java. Network connections rely on a secure shell (SSH) implementation, whether natively available (e.g., Linux or OS X) or user installed (e.g., OpenSSH available via Cygwin on Windows). Both the low-level and control programs are relatively small C++ programs (~54K, or 1500 lines, total) that were developed in-house, and use GNU's g++ compiler. The low-level code also relies on Linear Algebra PACKage (LAPACK) libraries for pixel calculations. Despite performance being contingent on data size, CPU speed, and network communication rate and latency to some degree, results have generally demonstrated a time reduction of a factor proportional to the number of open connections (one per CPU). For example, the task mentioned above requiring a week to process took 18 hours with d-MELSUM, using 10 CPUs on 2 computers. 1 J.-Ph Combe, et al., PSS 56, 2008. 2 J. Kramer and M. Scheutz, IROS2006, 2006.
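
    d-MELSUM distributes jobs over SSH to a Java distribution server and C++ workers; that infrastructure is not reproduced here. Because per-pixel unmixing is embarrassingly parallel, the same speedup logic can be sketched on a single machine with Python's multiprocessing, shown below with toy endmembers, a flattened random image, and an assumed worker count.

```python
import numpy as np
from multiprocessing import Pool

# Endmember matrix shared by all workers (bands x endmembers); values are toy data.
ENDMEMBERS = np.array([[0.30, 0.05],
                       [0.35, 0.08],
                       [0.40, 0.45],
                       [0.45, 0.50]])

def unmix_chunk(chunk):
    """Least-squares unmixing of one block of pixel spectra (pixels x bands)."""
    sol, _, _, _ = np.linalg.lstsq(ENDMEMBERS, chunk.T, rcond=None)
    return sol.T

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    image = rng.random((100_000, 4))                 # flattened toy image
    chunks = np.array_split(image, 16)               # one job per chunk
    with Pool(processes=4) as pool:                  # 4 local CPUs instead of a network
        results = pool.map(unmix_chunk, chunks)
    fractions = np.vstack(results)
    print(fractions.shape)
```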

  11. ROBOCAL: Gamma-ray isotopic hardware/software interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurd, J.R.; Bonner, C.A.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, presently being developed at the Los Alamos National Laboratory, is a full-scale prototypical robotic system for remotely performing calorimetric and gamma-ray isotopics measurements of nuclear materials. It features a fully automated vertical stacker-retriever for storing and retrieving packaged nuclear materials from a multi-drawer system, and a fully automated, uniquely integrated gantry robot for programmable selection and transfer of nuclear materials to calorimetric and gamma-ray isotopic measurement stations. Since ROBOCAL is to require almost no operator intervention, a mechanical control system is required in addition to a totally automated assay system. The assay system must be a completely integrated data acquisition and isotopic analysis package fully capable of performing state-of-the-art homogeneous and heterogeneous analyses on many varied matrices. The TRIFID assay system being discussed at this conference by J. G. Fleissner of the Rocky Flats Plant has been adopted because of its many automated features. These include: MCA/ADC setup and acquisition; spectral storage and analysis utilizing an expert system formalism; report generation with internal measurement control printout; user friendly screens and menus. The mechanical control portion consists primarily of two detector platforms and a sample platform, each with independent movement. Some minor modifications and additions are needed with TRIFID to interface the assay and mechanical portions with the CimRoc 4000 software controlling the robot. 6 refs., 5 figs., 3 tabs.

  12. Experimental Raman and IR spectral and theoretical studies of vibrational spectrum and molecular structure of Pantothenic acid (vitamin B5)

    NASA Astrophysics Data System (ADS)

    Srivastava, Mayuri; Singh, N. P.; Yadav, R. A.

    2014-08-01

    Vibrational spectrum of Pantothenic acid has been investigated using experimental IR and Raman spectroscopies and density functional theory methods available with the Gaussian 09 software. Vibrational assignments of the observed IR and Raman bands have been proposed in light of the results obtained from computations. In order to assign the observed IR and Raman frequencies the potential energy distributions (PEDs) have also been computed using GAR2PED software. Optimized geometrical parameters suggest that the overall symmetry of the molecule is C1. The molecule is found to possess eight conformations. Conformational analysis was carried out to obtain the most stable configuration of the molecule. In the present paper the vibrational features of the lowest energy conformer C-I have been studied. The two methyl groups have slightly distorted symmetries from C3V. The acidic O-H bond is found to be the smallest one. To investigate molecular stability and bond strength we have used natural bond orbital analysis (NBO). Charge transfer occurring in the molecule has been shown by the calculated highest occupied molecular orbital-lowest unoccupied molecular orbital (HOMO-LUMO) energies. The mapping of the electron density iso-surface with the electrostatic potential (ESP) has been carried out to obtain information about the size, shape, charge density distribution and sites of chemical reactivity of the molecule.

  13. Development and analysis of spectroscopic learning tools and the light and spectroscopy concept inventory for introductory college astronomy

    NASA Astrophysics Data System (ADS)

    Bardar, Erin M.

    Electromagnetic radiation is the fundamental carrier of astronomical information. Spectral features serve as the fingerprints of the universe, revealing many important properties of objects in the cosmos such as temperature, elemental compositions, and relative motion. Because of its importance to astronomical research, the nature of light and the electromagnetic spectrum is by far the most universally covered topic in astronomy education. Yet, to the surprise and disappointment of instructors, many students struggle to understand underlying fundamental concepts related to light and spectroscopic phenomena. This dissertation describes research into introductory college astronomy students' understanding of light and spectroscopy concepts, through the development and analysis of both instructional materials and an assessment instrument. The purpose of this research was two-fold: (1) to develop a novel suite of spectroscopic learning tools that enhance student understanding of light and spectroscopy and (2) to design and validate a Light and Spectroscopy Concept Inventory (LSCI) with the sensitivity to distinguish the relative effectiveness of various teaching interventions within the context of introductory college astronomy. Through a systematic investigation that included multiple rounds of clinical interviews, open-ended written surveys, and multiple-choice testing, introductory college astronomy students' commonly held misconceptions and reasoning difficulties were explored for concepts relating to: (1) The nature of the electromagnetic spectrum, including the interrelationships of wavelength, frequency, energy, and speed; (2) interpretation of Doppler shift; (3) properties of blackbody radiation; and (4) the connection between spectral features and underlying physical processes. These difficulties guided the development of instructional materials including six unique "homelab" exercises, a binocular spectrometer, a spectral analysis software tool, and the 26-question Light and Spectroscopy Concept Inventory (LSCI). In the fall of 2005, a multi-institution field-test of the LSCI was conducted with student examinees from 14 course sections at 11 colleges and universities employing various instructional techniques. Through statistical analysis, the inventory was proven to be a reliable (Cronbach's alpha = 0.77) and valid assessment instrument that was able to illustrate statistically significant learning gains (p < 0.05) for most course sections, with students utilizing our suite of instructional materials exhibiting among the highest performance gains (Effect Size = 1.31).
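
    The reported reliability statistic is a standard quantity that can be reproduced from raw item scores; the sketch below computes Cronbach's alpha for a simulated 26-item dichotomous test (the simulated responses are placeholders, not the field-test data).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_examinees, n_items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    ability = rng.normal(size=(200, 1))                       # latent examinee ability
    items = (ability + rng.normal(scale=1.2, size=(200, 26)) > 0).astype(int)
    print(round(float(cronbach_alpha(items)), 2))             # 26 dichotomous items
```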

  14. Predictive spectroscopy and chemical imaging based on novel optical systems

    NASA Astrophysics Data System (ADS)

    Nelson, Matthew Paul

    1998-10-01

    This thesis describes two futuristic optical systems designed to surpass contemporary spectroscopic methods for predictive spectroscopy and chemical imaging. These systems offer advantages over current techniques in a number of ways, including lower cost, enhanced portability, shorter analysis time, and improved S/N. First, a novel optical approach to predicting chemical and physical properties based on principal component analysis (PCA) is proposed and evaluated. A regression vector produced by PCA is designed into the structure of a set of paired optical filters. Light passing through the paired filters produces an analog detector signal directly proportional to the chemical/physical property for which the regression vector was designed. Second, a novel optical system is described which takes a single-shot approach to chemical imaging with high spectroscopic resolution using a dimension-reduction fiber-optic array. Images are focused onto a two-dimensional matrix of optical fibers which are drawn into a linear distal array with specific ordering. The distal end is imaged with a spectrograph equipped with an ICCD camera for spectral analysis. Software is used to extract the spatial/spectral information contained in the ICCD images and deconvolute them into wavelength-specific reconstructed images or position-specific spectra which span a multi-wavelength space. This thesis includes a description of the fabrication of two dimension-reduction arrays as well as an evaluation of the system for spatial and spectral resolution, throughput, image brightness, resolving power, depth of focus, and channel cross-talk. PCA is performed on the images by treating rows of the ICCD images as spectra and plotting the scores of each PC as a function of reconstruction position. In addition, iterative target transformation factor analysis (ITTFA) is performed on the spectroscopic images to generate "true" chemical maps of samples. Univariate zero-order images, univariate first-order spectroscopic images, bivariate first-order spectroscopic images, and multivariate first-order spectroscopic images of the temporal development of laser-induced plumes are presented and interpreted. Reconstructed chemical images generated using bivariate and trivariate wavelength techniques, bimodal and trimodal PCA methods, and bimodal and trimodal ITTFA approaches are also included.

  15. Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.

    PubMed

    Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R

    2017-02-01

    Spectrum imaging techniques, gaining simultaneously structural (image) and spectroscopic data, require appropriate and careful processing to extract information from the dataset. In this article we introduce a MATLAB based software that uses three dimensional data (EEL/CL spectrum image in dm3 format (Gatan Inc.'s DigitalMicrograph®)) as input. A graphical user interface enables a fast and easy mapping of spectrally dependent images and position-dependent spectra. First, data processing such as background subtraction, deconvolution and denoising; second, multiple display options including an EEL/CL moviemaker; and third, the applicability to a large number of data sets with a small workload make this program an interesting tool to visualize otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. [Features of adaptive responses in right-handers and left-handers, and their relationship to the functional activity of the brain].

    PubMed

    Barkar, A A; Markina, L D

    2014-01-01

    The article considers the relationship between the adaptation state of the organism and features of the bioelectric activity of the brain in right-handers and left-handers. Practically healthy persons of both genders, 23-45 years of age, with chronic stress disorder were examined. Adaptation status was evaluated with the computer software "Anti-stress"; features of bioelectric brain activity were detected by means of spectral and coherence EEG analysis; and the character of motor and sensory asymmetries was determined. The obtained data showed that the response of the organism to stimuli of varying strength is systemic and is manifested at different levels, and that adaptation status and bioelectrical activity have distinctive features in right-handers and left-handers.

  17. STRAP PTM: Software Tool for Rapid Annotation and Differential Comparison of Protein Post-Translational Modifications.

    PubMed

    Spencer, Jean L; Bhatia, Vivek N; Whelan, Stephen A; Costello, Catherine E; McComb, Mark E

    2013-12-01

    The identification of protein post-translational modifications (PTMs) is an increasingly important component of proteomics and biomarker discovery, but very few tools exist for performing fast and easy characterization of global PTM changes and differential comparison of PTMs across groups of data obtained from liquid chromatography-tandem mass spectrometry experiments. STRAP PTM (Software Tool for Rapid Annotation of Proteins: Post-Translational Modification edition) is a program that was developed to facilitate the characterization of PTMs using spectral counting and a novel scoring algorithm to accelerate the identification of differential PTMs from complex data sets. The software facilitates multi-sample comparison by collating, scoring, and ranking PTMs and by summarizing data visually. The freely available software (beta release) installs on a PC and processes data in protXML format obtained from files parsed through the Trans-Proteomic Pipeline. The easy-to-use interface allows examination of results at protein, peptide, and PTM levels, and the overall design offers tremendous flexibility that provides proteomics insight beyond simple assignment and counting.
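
    STRAP PTM's scoring algorithm is specific to the tool and is not reproduced here; as a generic illustration of differential comparison by spectral counting, the sketch below normalizes PTM spectral counts by each sample's total counts and reports a log2 fold change. All counts are hypothetical.

```python
import numpy as np

def normalized_ptm_counts(ptm_counts, total_counts):
    """Normalize PTM spectral counts by each sample's total spectral counts."""
    return {s: ptm_counts[s] / total_counts[s] for s in ptm_counts}

def log2_fold_change(norm, a, b, pseudo=1e-6):
    """Log2 ratio of normalized PTM counts between two samples (with a small pseudocount)."""
    return float(np.log2((norm[a] + pseudo) / (norm[b] + pseudo)))

if __name__ == "__main__":
    # hypothetical counts: spectra matching phosphorylated peptides of one protein
    ptm = {"control": 12, "treated": 31}
    totals = {"control": 20_000, "treated": 24_000}
    norm = normalized_ptm_counts(ptm, totals)
    print(round(log2_fold_change(norm, "treated", "control"), 2))
```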

  18. Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix

    2005-01-01

    To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid response observations from high resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate, and image phenomena such as wildfires, volcanoes, floods and ice breakup with high-resolution instruments. The software that plans, schedules and controls the various satellite assets is used to form ad hoc constellations, which enable collaborative autonomous image collections triggered by transient phenomena. This software is both flight and ground based, works in concert to run all of the required assets cohesively, and includes model-based, artificial intelligence software.

  19. Spectral Characterization of Analog Samples in Anticipation of OSIRIS-REx's Arrival at Bennu

    NASA Technical Reports Server (NTRS)

    Donaldson Hanna, K. L.; Schrader, D. L.; Bowles, N. E.; Clark, B. E.; Cloutis, E. A.; Connolly, H. C., Jr.; Hamilton, V. E.; Keller, L. P.; Lauretta, D. S.; Lim, L. F.

    2017-01-01

    NASA's Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) mission successfully launched on September 8th, 2016. During its rendezvous with near-Earth asteroid (101955) Bennu beginning in 2018, OSIRIS-REx will characterize the asteroid's physical, mineralogical, and chemical properties in an effort to globally map the properties of Bennu, a primitive carbonaceous asteroid, and choose a sampling location [e.g. 1]. In preparation for these observations, we spectrally characterized a suite of analog samples across visible, near- and thermal-infrared wavelengths and used these in initial tests of phase detection and abundance determination software algorithms. Here we present the thermal infrared laboratory measurements of the analog sample suite measured under asteroid-like conditions, which are relevant to the interpretation of spectroscopic observations by the OSIRIS-REx Thermal Emission Spectrometer (OTES) [2, 3]. This suite of laboratory measurements of asteroid analogs under asteroid-like conditions is the first of their kind.

  20. Remote sensing investigations of fugitive soil arsenic and its effects on vegetation reflectance

    NASA Astrophysics Data System (ADS)

    Slonecker, E. Terrence

    2007-12-01

    Three different remote sensing technologies were evaluated in support of the remediation of fugitive arsenic and other hazardous waste-related risks to human and ecological health at the Spring Valley Formerly Used Defense Site in northwest Washington, D.C., an area of widespread soil arsenic contamination as a result of World War I research and development of chemical weapons. The first evaluation involved the value of information derived from the interpretation of historical aerial photographs. Historical aerial photographs dating back as far as 1918 provided a wealth of information about chemical weapons testing, storage, handling and disposal of these hazardous materials. When analyzed by a trained photo-analyst, the 1918 aerial photographs resulted in 42 features of potential interest. When compared with current remedial activities and known areas of contamination, 33 of 42 or 78.5% of the features were spatially correlated with current areas of contamination or remedial activity. The second investigation involved the phytoremediation of arsenic through the use of Pteris ferns and the evaluation of the spectral properties of these ferns. Three hundred ferns were grown in controlled laboratory conditions in soils amended with five levels (0, 20, 50, 100 and 200 parts per million) of sodium arsenate. After 20 weeks, the Pteris ferns were shown to have an average uptake concentration of over 4,000 parts per million each. Additionally, statistical analysis of the spectral signature from each fern showed that the frond arsenic concentration could be reasonably predicted with a linear model when the concentration was equal to or greater than 500 parts per million. Third, hyperspectral imagery of Spring Valley was obtained and analyzed with a suite of spectral analysis software tools. Results showed the grasses growing in areas of known high soil arsenic could be identified and mapped at an approximate 85% level of accuracy when the hyperspectral image was processed with a linear spectral unmixing algorithm and mapped with a maximum likelihood classifier. The information provided by these various remote sensing technologies presents a non-contact and potentially important alternative to the information needs of the hazardous waste remediation process, and is an important area for future environmental research.

  1. A 3D Joint Simulation Platform for Multiband: A Case Study in the Huailai Soybean and Maize Field

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Qinhuo, L.; Du, Y.; Huang, H.

    2016-12-01

    Canopy radiation and scattering signals contain abundant vegetation information. One can quantitatively retrieve biophysical parameters by building canopy radiation and scattering models and inverting them. Joint simulation of 3D models for different spectral (frequency) domains may produce complementary advantages and improve the precision. However, most current models are based on one or two spectral bands (e.g., visible and thermal infrared bands, or visible and microwave bands). This manuscript establishes a 3D radiation and scattering simulation system that can simulate the BRDF, DBT, and backscattering coefficient based on the same structural description. The system couples the radiosity graphic model, the Thermal RGM model, and the coherent microwave model by Yang Du for VIS/NIR, TIR, and MW, respectively. The models simulating the leaf spectral characteristics, component temperatures and dielectric properties were also coupled into the joint simulation system to convert the various parameters into fewer but more unified parameters. As a demonstration of our system, we applied the established system to simulate a mixed field with soybeans and maize based on the Huailai experiment data from August 2014. With the help of Xfrog software, we remodeled soybean and maize in ".obj" and ".mtl" format. We extracted the structure information of the soybean and maize by statistics of the ".obj" files. We did simulations on the red, NIR, TIR, C and L bands. The simulation results were validated by the multi-angular observation data of the Huailai experiment. Also, the spatial distribution (horizontal and vertical), leaf area index (LAI), leaf angle distribution (LAD), vegetation water content (VWC) and the incident observation geometry were analyzed in detail. Validated against the experiment data, the multiband simulations performed quite well. Because the crops were planted in regular rows and the maize and soybeans had different heights, LAI, LAD and VWC, we did a sensitivity analysis by changing one of these parameters while fixing the others. The analysis showed that the parameters influence the radiation and scattering signals of the different spectral (frequency) domains to varying degrees.

  2. Survey for δ Sct components in eclipsing binaries and new correlations between pulsation frequency and fundamental stellar characteristics

    NASA Astrophysics Data System (ADS)

    Liakos, A.; Niarchos, P.; Soydugan, E.; Zasche, P.

    2012-05-01

    CCD observations of 68 eclipsing binary systems, candidates for containing δ Scuti components, were obtained. Their light curves are analysed using the PERIOD04 software for possible pulsational behaviour. For the systems QY Aql, CZ Aqr, TY Cap, WY Cet, UW Cyg, HL Dra, HZ Dra, AU Lac, CL Lyn and IO UMa, complete light curves were observed due to the detection of a pulsating component. All of them, except QY Aql and IO UMa, are analysed with modern astronomical software in order to determine their geometrical and pulsational characteristics. Spectroscopic observations of WY Cet and UW Cyg were used to estimate the spectral class of their primary components, while for HZ Dra radial velocities of its primary were measured. O - C diagram analysis was performed for the cases showing peculiar orbital period variations, namely CZ Aqr, TY Cap, WY Cet and UW Cyg, with the aim of obtaining a comprehensive picture of these systems. An updated catalogue of 74 close binaries including a δ Scuti companion is presented. Moreover, a connection between orbital and pulsation periods, as well as a correlation between evolutionary status and dominant pulsation frequency for these systems, is discussed.
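
    The pulsation search in the paper uses PERIOD04; as a rough stand-in, a Lomb-Scargle periodogram can locate the dominant frequency in an unevenly sampled residual light curve. The frequencies, amplitudes, and sampling in the sketch below are synthetic.

```python
import numpy as np
from scipy.signal import lombscargle

def dominant_frequency(times, mags, f_min=5.0, f_max=80.0, n_freq=20000):
    """Locate the strongest pulsation frequency (cycles/day) in a residual light curve."""
    freqs = np.linspace(f_min, f_max, n_freq)          # candidate frequencies, c/d
    ang = 2.0 * np.pi * freqs                          # lombscargle expects angular freq
    power = lombscargle(times, mags - mags.mean(), ang)
    return freqs[np.argmax(power)], power.max()

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    t = np.sort(rng.uniform(0.0, 5.0, 600))            # 5 nights of unevenly spaced data
    f_true = 23.4                                      # delta Scuti-like frequency, c/d
    y = 0.01 * np.sin(2 * np.pi * f_true * t) + rng.normal(0, 0.003, t.size)
    f_best, p = dominant_frequency(t, y)
    print(round(float(f_best), 2), round(float(p), 2))
```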

  3. Meteor44 Video Meteor Photometry

    NASA Technical Reports Server (NTRS)

    Swift, Wesley R.; Suggs, Robert M.; Cooke, William J.

    2004-01-01

    Meteor44 is a software system developed at MSFC for the calibration and analysis of video meteor data. The dynamic range of the (8-bit) video data is extended by approximately 4 magnitudes for both meteors and stellar images using saturation compensation. Camera and lens specific saturation compensation coefficients are derived from artificial variable star laboratory measurements. Saturation compensation significantly increases the number of meteors with measured intensity and improves the estimation of meteoroid mass distribution. Astrometry is automated to determine each image's plate coefficient using appropriate star catalogs. The images are simultaneously intensity calibrated from the contained stars to determine the photon sensitivity and the saturation level referenced above the atmosphere. The camera's spectral response is used to compensate for stellar color index and typical meteor spectra in order to report meteor light curves in traditional visual magnitude units. Recent efforts include improved camera calibration procedures, long focal length "streak" meteor photometry and two-station track determination. Meteor44 has been used to analyze data from the 2001, 2002 and 2003 MSFC Leonid observational campaigns as well as several lesser showers. The software is interactive and can be demonstrated using data from recent Leonid campaigns.
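
    Meteor44's calibration chain (saturation compensation, color-index correction, automated astrometry) is more involved than can be shown here; the sketch below illustrates only the basic photometric step of deriving a zero point from catalog stars in a frame and applying it to a meteor's integrated counts, using hypothetical numbers.

```python
import numpy as np

def zero_point(instrumental_flux, catalog_mag):
    """Photometric zero point from field stars: catalog_mag = -2.5*log10(flux) + ZP."""
    inst_mag = -2.5 * np.log10(np.asarray(instrumental_flux, dtype=float))
    return float(np.median(np.asarray(catalog_mag) - inst_mag))

def to_visual_mag(flux, zp):
    """Convert background-subtracted integrated counts to calibrated magnitudes."""
    return -2.5 * np.log10(flux) + zp

if __name__ == "__main__":
    star_counts = [52000.0, 21000.0, 8300.0]     # hypothetical stellar counts in a frame
    star_mags = [6.1, 7.1, 8.1]                  # corresponding catalog magnitudes
    zp = zero_point(star_counts, star_mags)
    meteor_counts = 140000.0                     # saturation-compensated meteor counts
    print(round(zp, 2), round(to_visual_mag(meteor_counts, zp), 1))
```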

  4. Quantitative Multispectral Analysis Of Discrete Subcellular Particles By Digital Imaging Fluorescence Microscopy (DIFM)

    NASA Astrophysics Data System (ADS)

    Dorey, C. K.; Ebenstein, David B.

    1988-10-01

    Subcellular localization of multiple biochemical markers is readily achieved through their characteristic autofluorescence or through use of appropriately labelled antibodies. Recent development of specific probes has permitted elegant studies of calcium and pH in living cells. However, each of these methods measured fluorescence at one wavelength; precise quantitation of multiple fluorophores at individual sites within a cell has not been possible. Using DIFM, we have achieved spectral analysis of discrete subcellular particles 1-2 μm in diameter. The fluorescence emission is broken into narrow bands by an interference monochromator and visualized through the combined use of a silicon intensified target (SIT) camera, a microcomputer-based frame grabber with 8-bit resolution, and a color video monitor. Image acquisition, processing, analysis and display are under software control. The digitized image can be corrected for the spectral distortions induced by the wavelength dependent sensitivity of the camera, and the displayed image can be enhanced or presented in pseudocolor to facilitate discrimination of variation in pixel intensity of individual particles. For rapid comparison of the fluorophore composition of granules, a ratio image is produced by dividing the image captured at one wavelength by that captured at another. In the resultant ratio image, a granule which has a fluorophore composition different from the majority is selectively colored. This powerful system has been utilized to obtain spectra of endogenous autofluorescent compounds in discrete cellular organelles of human retinal pigment epithelium, and to measure immunohistochemically labelled components of the extracellular matrix associated with the human optic nerve.
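
    The ratio-image step described above is straightforward to express in NumPy; the sketch below (with assumed relative sensitivities and Poisson toy images) divides one narrow-band image by another after correcting for the camera's wavelength-dependent sensitivity, so that granules with an atypical fluorophore composition stand out as ratio outliers.

```python
import numpy as np

def ratio_image(img_a, img_b, sens_a=1.0, sens_b=1.0, eps=1e-6):
    """Ratio of two narrow-band fluorescence images after sensitivity correction.

    sens_a / sens_b are the camera's relative sensitivities at the two wavelengths
    (assumed known from calibration); eps avoids division by zero in dark pixels.
    """
    a = np.asarray(img_a, dtype=float) / sens_a
    b = np.asarray(img_b, dtype=float) / sens_b
    return a / (b + eps)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    band_550 = rng.poisson(200, size=(64, 64)).astype(float)   # toy image at 550 nm
    band_600 = rng.poisson(180, size=(64, 64)).astype(float)   # toy image at 600 nm
    r = ratio_image(band_550, band_600, sens_a=0.9, sens_b=1.1)
    # granules whose fluorophore composition differs stand out as ratio outliers
    outliers = np.abs(r - np.median(r)) > 3 * r.std()
    print(int(outliers.sum()), "outlier pixels")
```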

  5. PolarBRDF: A general purpose Python package for visualization and quantitative analysis of multi-angular remote sensing measurements

    NASA Astrophysics Data System (ADS)

    Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh

    2016-11-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
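
    PolarBRDF itself is the authors' Python package; independently of it, the kind of polar visualization described can be sketched with matplotlib as below, using a synthetic reflectance grid with a hotspot-like bump in the backscatter direction (all angles and values are invented for the example).

```python
import numpy as np
import matplotlib.pyplot as plt

def polar_brdf_plot(reflectance, view_zenith_deg, rel_azimuth_deg, outfile="brdf.png"):
    """Polar plot of reflectance vs. view zenith (radius) and relative azimuth (angle)."""
    az = np.radians(rel_azimuth_deg)
    th, r = np.meshgrid(az, view_zenith_deg, indexing="ij")
    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    mesh = ax.pcolormesh(th, r, reflectance, shading="auto")
    fig.colorbar(mesh, label="reflectance")
    fig.savefig(outfile, dpi=150)

if __name__ == "__main__":
    zen = np.arange(0, 61, 5)                         # view zenith angles, degrees
    azi = np.arange(0, 361, 10)                       # relative azimuth angles, degrees
    zz, aa = np.meshgrid(zen, azi, indexing="xy")
    # synthetic BRDF with a hotspot-like bump in the backscatter direction
    refl = 0.25 + 0.15 * np.exp(-(((aa - 180) / 40) ** 2 + ((zz - 30) / 20) ** 2))
    polar_brdf_plot(refl, zen, azi)
```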

  6. PolarBRDF: A general purpose Python package for visualization and quantitative analysis of multi-angular remote sensing measurements

    NASA Astrophysics Data System (ADS)

    Poudyal, R.; Singh, M.; Gautam, R.; Gatebe, C. K.

    2016-12-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR)- http://car.gsfc.nasa.gov/. Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.

  7. Polarbrdf: A General Purpose Python Package for Visualization Quantitative Analysis of Multi-Angular Remote Sensing Measurements

    NASA Technical Reports Server (NTRS)

    Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh

    2016-01-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wild fire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.

  8. Digital photogrammetric analysis of the IMP camera images: Mapping the Mars Pathfinder landing site in three dimensions

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Hare, T.; Dorrer, E.; Cook, D.; Becker, K.; Thompson, K.; Redding, B.; Blue, J.; Galuszka, D.; Lee, E.M.; Gaddis, L.R.; Johnson, J. R.; Soderblom, L.A.; Ward, A.W.; Smith, P.H.; Britt, D.T.

    1999-01-01

    This paper describes our photogrammetric analysis of the Imager for Mars Pathfinder data, part of a broader program of mapping the Mars Pathfinder landing site in support of geoscience investigations. This analysis, carried out primarily with a commercial digital photogrammetric system, supported by our in-house Integrated Software for Imagers and Spectrometers (ISIS), consists of three steps: (1) geometric control: simultaneous solution for refined estimates of camera positions and pointing plus three-dimensional (3-D) coordinates of ~10^3 features sitewide, based on the measured image coordinates of those features; (2) topographic modeling: identification of ~3 x 10^5 closely spaced points in the images and calculation (based on camera parameters from step 1) of their 3-D coordinates, yielding digital terrain models (DTMs); and (3) geometric manipulation of the data: combination of the DTMs from different stereo pairs into a sitewide model, and reprojection of image data to remove parallax between the different spectral filters in the two cameras and to provide an undistorted planimetric view of the site. These processes are described in detail and example products are shown. Plans for combining the photogrammetrically derived topographic data with spectrophotometry are also described. These include photometric modeling using surface orientations from the DTM to study surface microtextures and improve the accuracy of spectral measurements, and photoclinometry to refine the DTM to single-pixel resolution where photometric properties are sufficiently uniform. Finally, the inclusion of rover images in a joint photogrammetric analysis with IMP images is described. This challenging task will provide coverage of areas hidden to the IMP, but accurate ranging of distant features can be achieved only if the lander is also visible in the rover image used. Copyright 1999 by the American Geophysical Union.

  9. Quantitative Analysis of Immunohistochemistry in Melanoma Tumors

    PubMed Central

    Lilyquist, Jenna; White, Kirsten Anne Meyer; Lee, Rebecca J.; Philips, Genevieve K.; Hughes, Christopher R.; Torres, Salina M.

    2017-01-01

    Abstract Identification of positive staining is often qualitative and subjective. This is particularly troublesome in pigmented melanoma lesions, because melanin is difficult to distinguish from the brown stain resulting from immunohistochemistry (IHC) using horseradish peroxidase developed with 3,3′-Diaminobenzidine (HRP-DAB). We sought to identify and quantify positive staining, particularly in melanoma lesions. We visualized G-protein coupled estrogen receptor (GPER) expression developed with HRP-DAB and counterstained with Azure B (stains melanin) in melanoma tissue sections (n = 3). Matched sections (n = 3), along with 22 unmatched sections, were stained only with Azure B as a control. Breast tissue (n = 1) was used as a positive HRP-DAB control. Images of the stained tissues were generated using a Nuance Spectral Imaging Camera. Analysis of the images was performed using the Nuance Spectral Imaging software and SlideBook. Data were analyzed using a Kruskal–Wallis one-way analysis of variance (ANOVA). We showed that a pigmented melanoma tissue doubly stained with anti-GPER HRP-DAB and Azure B can be unmixed using spectra derived from a matched, Azure B-only section, and an anti-GPER HRP-DAB control. We unmixed each of the melanoma lesions using each of the Azure B spectra, evaluated the mean intensity of positive staining, and examined the distribution of the mean intensities (P = .73; Kruskal–Wallis). These results suggest that this method does not require a matched Azure B-only stained control tissue for every melanoma lesion, allowing precious tissues to be conserved for other studies. Importantly, this quantification method reduces the subjectivity of protein expression analysis, and provides a valuable tool for accurate evaluation, particularly for pigmented tissues. PMID:28403073
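
    The abstract does not detail the unmixing algorithm inside the Nuance software; as a generic, hedged illustration of the linear-unmixing step it describes (separating an Azure B component from an HRP-DAB component), the sketch below solves a non-negative least-squares problem against two assumed endmember spectra.

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical endmember spectra (columns): Azure B-only and HRP-DAB control,
        # sampled at the same wavelengths as the imaged pixel spectrum.
        wavelengths = np.linspace(420, 720, 31)
        azure_b = np.exp(-((wavelengths - 630) / 40.0) ** 2)   # toy spectra
        dab     = np.exp(-((wavelengths - 460) / 60.0) ** 2)
        E = np.column_stack([azure_b, dab])                    # endmember matrix

        pixel = 0.7 * azure_b + 0.3 * dab + 0.01 * np.random.rand(31)

        # Non-negative least squares: pixel ≈ E @ abundances, abundances >= 0
        abundances, residual = nnls(E, pixel)
        print("Azure B / DAB abundances:", abundances)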

  10. Tamarisk Mapping and Monitoring Using High Resolution Satellite Imagery

    Treesearch

    Jason W. San Souci; John T. Doyle

    2006-01-01

    QuickBird high resolution multispectral satellite imagery (60 cm GSD, 4 spectral bands) and calibrated products from DigitalGlobe’s AgroWatch program were used as inputs to Visual Learning System’s Feature Analyst automated feature extraction software to map localized occurrences of pervasive and aggressive Tamarisk (Tamarix ramosissima), an invasive...

  11. USE OF ROUGH SETS AND SPECTRAL DATA FOR BUILDING PREDICTIVE MODELS OF REACTION RATE CONSTANTS

    EPA Science Inventory

    A model for predicting the log of the rate constants for alkaline hydrolysis of organic esters has been developed with the use of gas-phase min-infrared library spectra and a rule-building software system based on the mathematical theory of rough sets. A diverse set of 41 esters ...

  12. NicoLase—An open-source diode laser combiner, fiber launch, and sequencing controller for fluorescence microscopy

    PubMed Central

    Walsh, James; Böcking, Till; Gaus, Katharina

    2017-01-01

    Modern fluorescence microscopy requires software-controlled illumination sources with high power across a wide range of wavelengths. Diode lasers meet the power requirements and combining multiple units into a single fiber launch expands their capability across the required spectral range. We present the NicoLase, an open-source diode laser combiner, fiber launch, and software sequence controller for fluorescence microscopy and super-resolution microscopy applications. Two configurations are described, giving four or six output wavelengths and one or two single-mode fiber outputs, with all CAD files, machinist drawings, and controller source code openly available. PMID:28301563

  13. Spectroscopy Made Easy: A New Tool for Fitting Observations with Synthetic Spectra

    NASA Technical Reports Server (NTRS)

    Valenti, J. A.; Piskunov, N.

    1996-01-01

    We describe a new software package that may be used to determine stellar and atomic parameters by matching observed spectra with synthetic spectra generated from parameterized atmospheres. A nonlinear least squares algorithm is used to solve for any subset of allowed parameters, which include atomic data (log gf and van der Waals damping constants), model atmosphere specifications (T_eff, log g), elemental abundances, and radial, turbulent, and rotational velocities. LTE synthesis software handles discontiguous spectral intervals and complex atomic blends. As a demonstration, we fit 26 Fe I lines in the NSO Solar Atlas (Kurucz et al.), determining various solar and atomic parameters.
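
    SME itself couples an LTE synthesis engine to a nonlinear least-squares solver; the sketch below only mimics that fitting loop with scipy, substituting a toy Gaussian-line "synthesis" function for the real spectrum synthesizer, so every name and number in it is illustrative rather than part of SME.

        import numpy as np
        from scipy.optimize import least_squares

        wave = np.linspace(6000.0, 6005.0, 200)          # wavelength grid (Angstrom)

        def synth(params, wave):
            # Stand-in for a real LTE synthesis code: one Gaussian absorption line
            # whose depth and width play the role of abundance and broadening.
            depth, center, width = params
            return 1.0 - depth * np.exp(-0.5 * ((wave - center) / width) ** 2)

        observed = synth([0.6, 6002.5, 0.08], wave) + np.random.normal(0, 0.01, wave.size)

        def residuals(params):
            return synth(params, wave) - observed

        fit = least_squares(residuals, x0=[0.4, 6002.0, 0.1])
        print("best-fit parameters:", fit.x)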

  14. Spectral reflectance properties of major objects in desert oasis: a case study of the Weigan-Kuqa river delta oasis in Xinjiang, China.

    PubMed

    Zhang, Fei; Tiyip, Tashpolat; Ding, Jianli; Sawut, Mamat; Tashpolat, Nigara; Kung, Hsiangte; Han, Guihong; Gui, Dongwei

    2012-08-01

    Remote sensing applications rely increasingly on the spectral characteristics of ground objects. To further investigate spectral reflectance characteristics in arid areas, this study was performed in the typical delta oasis of the Weigan and Kuqa rivers, north of the Tarim Basin. Data were collected from geo-targets at multiple sites under various field conditions. Spectral data were collected for different soil types, including saline-alkaline soil, silt sandy soil, cotton field, and others; for vegetation such as Alhagi sparsifolia, Phragmites australis, Tamarix, and Halostachys caspica; and for water bodies. Next, the data were processed to remove high-frequency noise, and the spectral curves were smoothed with the moving average method. Derivative spectra were generated after eliminating environmental background noise in order to distinguish overlapping features in the original spectra. After continuum removal, the spectral curves highlighted both absorbance and reflectance features. The spectral information of each ground object is essential for fully utilizing multispectral remote sensing data, which requires a representative spectral library. In this study, using ENVI 4.5 software, a preliminary spectral library of surface features was constructed from the data surveyed in the study area. This library can support remote sensing activities such as feature investigation, vegetation classification, and environmental monitoring in the delta oasis region. Future work will focus on sharing and standardizing the criteria for a professional spectral library and on expanding and promoting the use of spectral databases.
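
    A minimal sketch of the three preprocessing steps listed above, moving-average smoothing, derivative spectra and continuum removal (here via an upper convex hull), applied to a synthetic reflectance spectrum; this is a generic illustration, not the ENVI implementation used in the study.

        import numpy as np
        from scipy.spatial import ConvexHull

        def moving_average(y, window=5):
            kernel = np.ones(window) / window
            return np.convolve(y, kernel, mode="same")

        def continuum_removed(wl, refl):
            # Divide by the upper convex hull of the spectrum (classic continuum removal).
            hull = ConvexHull(np.column_stack([wl, refl]))
            v = np.sort(hull.vertices)
            chord = np.interp(wl, [wl[0], wl[-1]], [refl[0], refl[-1]])
            upper = [i for i in v if refl[i] >= chord[i]]    # vertices on the upper chain
            continuum = np.interp(wl, wl[upper], refl[upper])
            return refl / continuum

        wl = np.linspace(400, 2400, 500)                     # nm, toy spectrum
        refl = 0.4 + 0.1 * np.sin(wl / 300) - 0.15 * np.exp(-((wl - 2200) / 60) ** 2)
        smooth = moving_average(refl)                        # moving-average smoothing
        deriv = np.gradient(smooth, wl)                      # first-derivative spectrum
        cr = continuum_removed(wl, smooth)                   # continuum-removed spectrum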

  15. Stellar spectral classification of previously unclassified stars GSC 4461-698 and GSC 4466-870

    NASA Astrophysics Data System (ADS)

    Grau, Darren Moser

    Stellar spectral classification is one of the first efforts undertaken to begin defining the physical characteristics of stars. However, many stars lack even this basic information, which is the foundation for later research to constrain stellar effective temperatures, masses, radial velocities, the number of stars in the system, and age. This research obtained visible-λ stellar spectra via the testing and commissioning of a Santa Barbara Instruments Group (SBIG) Self-Guiding Spectrograph (SGS) at the UND Observatory. Utilizing a 16-inch-aperture telescope on Internet Observatory #3, the SGS obtained spectra of GSC 4461-698 and GSC 4466-870 in the low-resolution mode using an 18-µm wide slit with dispersion of 4.3 Å/pixel, resolution of 8 Å, and a spectral range from 3800-7500 Å. Observational protocols include automatic bias/dark frame subtraction for each stellar spectrum obtained. This was followed by spectral averaging to obtain a combined spectrum for each star observed. Image calibration and spectral averaging was performed using the software programs, Maxim DL, Image J, Microsoft Excel, and Winmk. A wavelength calibration process was used to obtain spectra of an Hg/Ne source that allowed the conversion of spectrograph channels into wavelengths. Stellar emission and absorption lines, such as those for hydrogen (H) and helium (He), were identified, extracted, and rectified. Each average spectrum was compared to the MK stellar spectral standards to determine an initial spectral classification for each star. The hope is that successful completion of this project will allow long-term stellar spectral observations to begin at the UND Observatory.
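
    A hedged sketch of the wavelength-calibration step described above: a low-order polynomial dispersion solution mapping spectrograph channels to wavelengths is fitted to identified Hg/Ne lines; the channel positions below are hypothetical, chosen only to be roughly consistent with the quoted 4.3 Å/pixel dispersion.

        import numpy as np

        # Hypothetical (channel, wavelength) pairs for identified Hg/Ne calibration lines.
        channels    = np.array([130, 386, 477, 605, 728])
        wavelengths = np.array([4358.3, 5460.7, 5852.5, 6402.2, 6929.5])  # Angstrom

        # Low-order polynomial dispersion solution (linear is often adequate here).
        coeffs = np.polyfit(channels, wavelengths, deg=2)
        to_wavelength = np.poly1d(coeffs)

        pixel_axis = np.arange(0, 765)
        wavelength_axis = to_wavelength(pixel_axis)     # apply to every spectrum channel
        print("dispersion near centre: %.2f A/pixel" % np.gradient(wavelength_axis)[380])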

  16. A differential spectral responsivity measurement system constructed for determining the spectral responsivity of single- and triple-junction photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Sametoglu, Ferhat; Celikel, Oguz; Witt, Florian

    2017-10-01

    A differential spectral responsivity (DSR) measurement system has been designed and constructed at the National Metrology Institute of Turkey (TUBITAK UME) to determine the spectral responsivity (SR) of single- and multi-junction photovoltaic devices (solar cells). The DSR setup contains a broadband bias light source composed of a solar simulator based on a 1000 W Xe-arc lamp with an AM-1.5 filter and a 250 W quartz-tungsten-halogen lamp, LED-based bias light sources, a DC voltage bias circuit, and a probe-beam optical power tracking and correction circuit controlled by an ADuC847 microcontroller card running embedded C software, all designed and constructed at TUBITAK UME under this project. Using the constructed DSR measurement system, SR calibrations of solar cells, including the monolithic triple-junction GaInP/GaInAs/Ge cell and its corresponding component cells, have been performed within the EURAMET Joint Research Project SolCell.

  17. A technique for measuring the quality of an elliptically bent pentaerythritol [PET(002)] crystal

    DOE PAGES

    Haugh, M. J.; Jacoby, K. D.; Barrios, M. A.; ...

    2016-08-23

    Here, we present a technique for determining the X-ray spectral quality from each region of an elliptically curved PET(002) crystal. The investigative technique utilizes the shape of the crystal rocking curve, which changes significantly as the radius of curvature changes. This unique quality information enables the spectroscopist to verify where in the spectral range the spectrometer performance is satisfactory and where there are regions that would show spectral distortion. A collection of rocking curve measurements for elliptically curved PET(002) has been built up in our X-ray laboratory. The multi-lamellar model from the XOP software has been used as a guide and corrections were applied to the model based upon measurements. But, the measurement of RI at small radius of curvature shows an anomalous behavior; the multi-lamellar model fails to show this behavior. The effect of this anomalous RI behavior on an X-ray spectrometer calibration is calculated. It is compared to the multi-lamellar model calculation, which is completely inadequate for predicting RI for this range of curvature and spectral energies.

  18. A technique for measuring the quality of an elliptically bent pentaerythritol [PET(002)] crystal

    NASA Astrophysics Data System (ADS)

    Haugh, M. J.; Jacoby, K. D.; Barrios, M. A.; Thorn, D.; Emig, J. A.; Schneider, M. B.

    2016-11-01

    We present a technique for determining the X-ray spectral quality from each region of an elliptically curved PET(002) crystal. The investigative technique utilizes the shape of the crystal rocking curve which changes significantly as the radius of curvature changes. This unique quality information enables the spectroscopist to verify where in the spectral range that the spectrometer performance is satisfactory and where there are regions that would show spectral distortion. A collection of rocking curve measurements for elliptically curved PET(002) has been built up in our X-ray laboratory. The multi-lamellar model from the XOP software has been used as a guide and corrections were applied to the model based upon measurements. But, the measurement of RI at small radius of curvature shows an anomalous behavior; the multi-lamellar model fails to show this behavior. The effect of this anomalous RI behavior on an X-ray spectrometer calibration is calculated. It is compared to the multi-lamellar model calculation which is completely inadequate for predicting RI for this range of curvature and spectral energies.

  19. Possibility of successive SRXFA use along with chemical-spectral methods for palladium analysis in geological samples

    NASA Astrophysics Data System (ADS)

    Kislov, E. V.; Kulikov, A. A.; Kulikova, A. B.

    1989-10-01

    Samples of basic-ultrabasic rocks and Ni-Cu ores from the Ioko-Dovyren and Chaya massifs were analysed by SRXFA and by a chemical-spectral method. SRXFA is well suited to quantitative noble-metal analysis of ore-free rocks. Combining SRXFA with chemical-spectral analysis has good prospects: after a large number of samples has been analysed by SRXFA, the samples showing the minimal and maximal results can be selected for the chemical-spectral method.

  20. Spectral Properties and Dynamics of Gold Nanorods Revealed by EMCCD Based Spectral-Phasor Method

    PubMed Central

    Chen, Hongtao; Digman, Michelle A.

    2015-01-01

    Gold nanorods (NRs) with tunable plasmon-resonant absorption in the near-infrared region have considerable advantages over organic fluorophores as imaging agents. However, the luminescence spectral properties of NRs have not been fully explored at the single particle level in bulk due to a lack of proper analytic tools. Here we present a global spectral phasor analysis method which allows investigation of NR spectra at the single-particle level, together with their statistical behavior and spatial information, during imaging. The wide phasor distribution obtained by the spectral phasor analysis indicates that the spectra of NRs differ from particle to particle. NRs with different spectra can be identified graphically in corresponding spatial images with high spectral resolution. Furthermore, spectral behaviors of NRs under different imaging conditions, e.g. different excitation powers and wavelengths, were carefully examined by our laser-scanning multiphoton microscope with spectral imaging capability. Our results show that the spectral phasor method is an easy and efficient tool in hyper-spectral imaging analysis for unravelling subtle changes of the emission spectrum. Moreover, we applied this method to study the spectral dynamics of NRs during direct optical trapping and by optothermal trapping. Interestingly, spectral shifts were observed in both trapping phenomena. PMID:25684346
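
    The spectral phasor transform maps each pixel's emission spectrum to a pair of coordinates (G, S) given by the normalized first-harmonic cosine and sine sums over the spectral channels. The sketch below is a minimal numpy version of that transform on a toy image cube, not the authors' implementation.

        import numpy as np

        def spectral_phasor(spectra, harmonic=1):
            # spectra: array (..., n_channels) of emission intensities.
            # Returns the G and S phasor coordinates for each spectrum.
            n = spectra.shape[-1]
            phase = 2.0 * np.pi * harmonic * np.arange(n) / n
            total = spectra.sum(axis=-1)
            g = (spectra * np.cos(phase)).sum(axis=-1) / total
            s = (spectra * np.sin(phase)).sum(axis=-1) / total
            return g, s

        # Toy image cube: 64 x 64 pixels, 32 spectral channels.
        cube = np.random.poisson(50, size=(64, 64, 32)).astype(float)
        G, S = spectral_phasor(cube)
        print(G.shape, S.shape)   # (64, 64) each; plot G vs S for the phasor histogram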

  1. Using multi-date satellite imagery to monitor invasive grass species distribution in post-wildfire landscapes: An iterative, adaptable approach that employs open-source data and software

    USGS Publications Warehouse

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Kumar, Sunil; Swallow, Aaron; Luizza, Matthew; Chignell, Steve

    2017-01-01

    Among the most pressing concerns of land managers in post-wildfire landscapes are the establishment and spread of invasive species. Land managers need accurate maps of invasive species cover for targeted management post-disturbance that are easily transferable across space and time. In this study, we sought to develop an iterative, replicable methodology based on limited invasive species occurrence data, freely available remotely sensed data, and open source software to predict the distribution of Bromus tectorum (cheatgrass) in a post-wildfire landscape. We developed four species distribution models using eight spectral indices derived from five months of Landsat 8 Operational Land Imager (OLI) data in 2014. These months corresponded to both cheatgrass growing period and time of field data collection in the study area. The four models were improved using an iterative approach in which a threshold for cover was established, and all models had high sensitivity values when tested on an independent dataset. We also quantified the area at highest risk for invasion in future seasons given 2014 distribution, topographic covariates, and seed dispersal limitations. These models demonstrate the effectiveness of using derived multi-date spectral indices as proxies for species occurrence on the landscape, the importance of selecting thresholds for invasive species cover to evaluate ecological risk in species distribution models, and the applicability of Landsat 8 OLI and the Software for Assisted Habitat Modeling for targeted invasive species management.
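
    The models are driven by spectral indices derived from Landsat 8 OLI bands; as one hedged example, the sketch below computes NDVI from the OLI red (band 4) and near-infrared (band 5) reflectance arrays, with placeholder rasters standing in for the real imagery.

        import numpy as np

        def ndvi(red, nir):
            # Normalized Difference Vegetation Index from Landsat 8 OLI band 4 (red)
            # and band 5 (NIR) reflectance arrays of identical shape.
            red = red.astype(float)
            nir = nir.astype(float)
            denom = nir + red
            with np.errstate(divide="ignore", invalid="ignore"):
                out = np.where(denom != 0, (nir - red) / denom, np.nan)
            return out

        # Placeholder reflectance rasters (e.g. read with rasterio in a real workflow).
        red = np.random.uniform(0.02, 0.30, size=(100, 100))
        nir = np.random.uniform(0.10, 0.50, size=(100, 100))
        index = ndvi(red, nir)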

  2. Using multi-date satellite imagery to monitor invasive grass species distribution in post-wildfire landscapes: An iterative, adaptable approach that employs open-source data and software

    NASA Astrophysics Data System (ADS)

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Kumar, Sunil; Swallow, Aaron; Luizza, Matthew W.; Chignell, Stephen M.

    2017-07-01

    Among the most pressing concerns of land managers in post-wildfire landscapes are the establishment and spread of invasive species. Land managers need accurate maps of invasive species cover for targeted management post-disturbance that are easily transferable across space and time. In this study, we sought to develop an iterative, replicable methodology based on limited invasive species occurrence data, freely available remotely sensed data, and open source software to predict the distribution of Bromus tectorum (cheatgrass) in a post-wildfire landscape. We developed four species distribution models using eight spectral indices derived from five months of Landsat 8 Operational Land Imager (OLI) data in 2014. These months corresponded to both cheatgrass growing period and time of field data collection in the study area. The four models were improved using an iterative approach in which a threshold for cover was established, and all models had high sensitivity values when tested on an independent dataset. We also quantified the area at highest risk for invasion in future seasons given 2014 distribution, topographic covariates, and seed dispersal limitations. These models demonstrate the effectiveness of using derived multi-date spectral indices as proxies for species occurrence on the landscape, the importance of selecting thresholds for invasive species cover to evaluate ecological risk in species distribution models, and the applicability of Landsat 8 OLI and the Software for Assisted Habitat Modeling for targeted invasive species management.

  3. Hyperspectral analysis of seagrass in Redfish Bay, Texas

    NASA Astrophysics Data System (ADS)

    Wood, John S.

    Remote sensing using multi- and hyperspectral imaging and analysis has been used in resource management for quite some time, and for a variety of purposes. In the studies to follow, hyperspectral imagery of Redfish Bay is used to discriminate between species of seagrasses found below the water surface. Water attenuates and reflects light and energy from the electromagnetic spectrum, and as a result, subsurface analysis can be more complex than that performed in the terrestrial world. In the following studies, an iterative process is developed, using ENVI image processing software and ArcGIS software. Band selection was based on recommendations developed empirically in conjunction with ongoing research into depth corrections, which were applied to the imagery bands (a default depth of 65 cm was used). Polygons generated, classified and aggregated within ENVI are reclassified in ArcGIS using field site data that was randomly selected for that purpose. After the first iteration, polygons that remain classified as 'Mixed' are subjected to another iteration of classification in ENVI, then brought into ArcGIS and reclassified. Finally, when that classification scheme is exhausted, a supervised classification is performed, using a 'Maximum Likelihood' classification technique, which assigned the remaining polygons to the classification that was most like the training polygons, by digital number value. Producer's Accuracy by classification ranged from 23.33 % for the 'MixedMono' class to 66.67% for the 'Bare' class; User's Accuracy by classification ranged from 22.58% for the 'MixedMono' class to 69.57% for the 'Bare' classification. An overall accuracy of 37.93% was achieved. Producers and Users Accuracies for Halodule were 29% and 39%, respectively; for Thalassia, they were 46% and 40%. Cohen's Kappa Coefficient was calculated at .2988. We then returned to the field and collected spectral signatures of monotypic stands of seagrass at varying depths and at three sensor levels: above the water surface, just below the air/water interface, and at the canopy position, when it differed from the subsurface position. Analysis of plots of these spectral curves, after applying depth corrections and Multiplicative Scatter Correction, indicates that there are detectable spectral differences between Halodule and Thalassia species at all three positions. Further analysis indicated that only above-surface spectral signals could reliably be used to discriminate between species, because there was an overlap of the standard deviations in the other two positions. A recommendation for wavelengths that would produce increased accuracy in hyperspectral image analysis was made, based on areas where there is a significant amount of difference between the mean spectral signatures, and no overlap of the standard deviations in our samples. The original hyperspectral imagery was reprocessed, using the bands recommended from the research above (approximately 535, 600, 620, 638, and 656 nm). A depth raster was developed from various available sources, which was resampled and reclassified to reflect values for water absorption and water scattering, which were then applied to each band using the depth correction algorithm. Processing followed the iterative classification methods described above. Accuracy for this round of processing improved; overall accuracy increased from 38% to 57%. 
    Improvements were noted in Producer's Accuracy, with the 'Bare' classification increasing from 67% to 73%, Halodule increasing from 29% to 63%, Thalassia increasing slightly, from 46% to 50%, and 'MixedMono' improving from 23% to 42%. User's Accuracy also improved, with the 'Bare' class increasing from 69% to 70%, Halodule increasing from 39% to 67%, Thalassia increasing from 40% to 7%, and 'MixedMono' increasing from 22.5% to 35%. A very recent report shows the mean percent cover of seagrasses in Redfish Bay and Corpus Christi Bay combined for all species at 68.6%, and individually by species: Halodule 39.8%, Thalassia 23.7%, Syringodium 4%, Ruppia 1% and Halophila 0.1%. Our study classifies 15% as 'Bare', 23% Halodule, 18% Thalassia, and 2% Ruppia. In addition, we classify 5% as 'Mixed', 22% as 'MixedMono', 12% as 'Bare/Halodule Mix', and 3% 'Bare/Thalassia Mix'. Aggregating the 'Bare' and 'Bare/species' classes would equate to approximately 30%, very close to what this new study produces. Other classes are quite similar, when considering that their study includes no 'Mixed' classifications. This series of research studies illustrates the application and utility of hyperspectral imagery and associated processing to mapping shallow benthic habitats. It also demonstrates that the technology is rapidly changing and adapting, which will lead to even further increases in accuracy. Future studies with hyperspectral imaging should include extensive spectral field collection, and the application of a depth correction.
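
    One of the preprocessing steps mentioned above is Multiplicative Scatter Correction; the sketch below is a minimal, generic MSC implementation (regress each spectrum against the mean spectrum and remove the fitted offset and slope), not the exact routine used in the study.

        import numpy as np

        def msc(spectra, reference=None):
            # Multiplicative Scatter Correction.
            # spectra: (n_samples, n_wavelengths).  Each spectrum x is regressed as
            # x = a + b * reference; the corrected spectrum is (x - a) / b.
            ref = spectra.mean(axis=0) if reference is None else reference
            corrected = np.empty_like(spectra, dtype=float)
            for i, x in enumerate(spectra):
                b, a = np.polyfit(ref, x, deg=1)       # slope, intercept
                corrected[i] = (x - a) / b
            return corrected, ref

        spectra = np.random.rand(20, 150) * np.linspace(1.0, 1.5, 20)[:, None]
        corrected, ref = msc(spectra)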

  4. Evaluation of the BreastSimulator software platform for breast tomography

    NASA Astrophysics Data System (ADS)

    Mettivier, G.; Bliznakova, K.; Sechopoulos, I.; Boone, J. M.; Di Lillo, F.; Sarno, A.; Castriconi, R.; Russo, P.

    2017-08-01

    The aim of this work was the evaluation of the software BreastSimulator, a breast x-ray imaging simulation software, as a tool for the creation of 3D uncompressed breast digital models and for the simulation and the optimization of computed tomography (CT) scanners dedicated to the breast. Eight 3D digital breast phantoms were created with glandular fractions in the range 10%-35%. The models are characterised by different sizes and realistic modelled anatomical features. X-ray CT projections were simulated for a dedicated cone-beam CT scanner and reconstructed with the FDK algorithm. X-ray projection images were simulated for 5 mono-energetic (27, 32, 35, 43 and 51 keV) and 3 poly-energetic x-ray spectra typically employed in current CT scanners dedicated to the breast (49, 60, or 80 kVp). Clinical CT images acquired from two different clinical breast CT scanners were used for comparison purposes. The quantitative evaluation included calculation of the power-law exponent, β, from simulated and real breast tomograms, based on the power spectrum fitted with a function of the spatial frequency, f, of the form S(f) = α/f^β. The breast models were validated by comparison against clinical breast CT and published data. We found that the calculated β coefficients were close to those of clinical CT data from a dedicated breast CT scanner and to reported data in the literature. In evaluating the software package BreastSimulator to generate breast models suitable for use with breast CT imaging, we found that the breast phantoms produced with the software tool can reproduce the anatomical structure of real breasts, as evaluated by calculating the β exponent from the power spectral analysis of simulated images. As such, this research tool might contribute considerably to the further development, testing and optimisation of breast CT imaging techniques.
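
    A hedged sketch of how a β exponent of the form S(f) = α/f^β can be estimated from an image: radially average the 2D power spectrum and fit a straight line in log-log space. The synthetic image and frequency range below are placeholders, not the BreastSimulator pipeline.

        import numpy as np

        def radial_power_spectrum(img):
            F = np.fft.fftshift(np.fft.fft2(img))
            psd2d = np.abs(F) ** 2
            cy, cx = np.array(img.shape) // 2
            y, x = np.indices(img.shape)
            r = np.hypot(y - cy, x - cx).astype(int)
            radial = np.bincount(r.ravel(), psd2d.ravel()) / np.bincount(r.ravel())
            freqs = np.arange(radial.size)
            return freqs[1:], radial[1:]            # drop the DC term

        img = np.random.rand(256, 256)              # stand-in for a reconstructed slice
        f, S = radial_power_spectrum(img)
        mask = (f > 2) & (f < 100)                  # fit over a mid-frequency range
        slope, intercept = np.polyfit(np.log(f[mask]), np.log(S[mask]), 1)
        beta = -slope                               # S(f) = alpha / f**beta
        print("beta =", beta)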

  5. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    NASA Technical Reports Server (NTRS)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.

  6. Testing of a simplified LED based vis/NIR system for rapid ripeness evaluation of white grape (Vitis vinifera L.) for Franciacorta wine.

    PubMed

    Giovenzana, Valentina; Civelli, Raffaele; Beghi, Roberto; Oberti, Roberto; Guidetti, Riccardo

    2015-11-01

    The aim of this work was to test a simplified optical prototype for a rapid estimation of the ripening parameters of white grape for Franciacorta wine directly in the field. Spectral acquisition based on reflectance at four wavelengths (630, 690, 750 and 850 nm) was proposed. The integration of a simple processing algorithm in the microcontroller software would allow real-time values of spectral reflectance to be visualized. Non-destructive analyses were carried out on 95 grape bunches for a total of 475 berries. Samplings were performed weekly during the last ripening stages. Optical measurements were carried out using both the simplified system and a portable commercial vis/NIR spectrophotometer as a reference instrument for performance comparison. Chemometric analyses were performed in order to extract the maximum useful information from optical data. Principal component analysis (PCA) was performed for a preliminary evaluation of the data. Correlations between the optical data matrix and ripening parameters (total soluble solids content, SSC; titratable acidity, TA) were carried out using partial least squares (PLS) regression for spectra and using multiple linear regression (MLR) for data from the simplified device. Classification analyses were also performed with the aim of discriminating ripe from unripe samples. PCA, MLR and classification analyses show the effectiveness of the simplified system in separating samples among different sampling dates and in discriminating ripe from unripe samples. Finally, simple equations for SSC and TA prediction were calculated. Copyright © 2015 Elsevier B.V. All rights reserved.
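
    A minimal sketch of the MLR step described above, predicting soluble solids content from the four-band reflectance values with scikit-learn; the reflectance and Brix values are simulated placeholders rather than the study's data.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Placeholder reflectance at 630, 690, 750 and 850 nm for 475 berries,
        # plus simulated soluble solids content (degrees Brix).
        X = rng.uniform(0.05, 0.6, size=(475, 4))
        ssc = 20 - 12 * X[:, 1] + 6 * X[:, 3] + rng.normal(0, 0.8, 475)

        X_tr, X_te, y_tr, y_te = train_test_split(X, ssc, test_size=0.25, random_state=0)
        mlr = LinearRegression().fit(X_tr, y_tr)
        print("R^2 on held-out berries:", round(mlr.score(X_te, y_te), 3))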

  7. Finding fossils in new ways: an artificial neural network approach to predicting the location of productive fossil localities.

    PubMed

    Anemone, Robert; Emerson, Charles; Conroy, Glenn

    2011-01-01

    Chance and serendipity have long played a role in the location of productive fossil localities by vertebrate paleontologists and paleoanthropologists. We offer an alternative approach, informed by methods borrowed from the geographic information sciences and using recent advances in computer science, to more efficiently predict where fossil localities might be found. Our model uses an artificial neural network (ANN) that is trained to recognize the spectral characteristics of known productive localities and other land cover classes, such as forest, wetlands, and scrubland, within a study area based on the analysis of remotely sensed (RS) imagery. Using these spectral signatures, the model then classifies other pixels throughout the study area. The results of the neural network classification can be examined and further manipulated within a geographic information systems (GIS) software package. While we have developed and tested this model on fossil mammal localities in deposits of Paleocene and Eocene age in the Great Divide Basin of southwestern Wyoming, a similar analytical approach can be easily applied to fossil-bearing sedimentary deposits of any age in any part of the world. We suggest that new analytical tools and methods of the geographic sciences, including remote sensing and geographic information systems, are poised to greatly enrich paleoanthropological investigations, and that these new methods should be embraced by field workers in the search for, and geospatial analysis of, fossil primates and hominins. Copyright © 2011 Wiley-Liss, Inc.
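
    As a hedged illustration of the approach (not the authors' ANN software), the sketch below trains a small multilayer perceptron on labelled pixel spectra and then classifies every pixel of a scene; band counts, class labels and data are placeholders.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)
        # Placeholder training pixels: 6 spectral bands, classes such as
        # productive locality, forest, wetland, scrubland.
        X_train = rng.random((400, 6))
        y_train = rng.integers(0, 4, 400)

        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                          random_state=0))
        clf.fit(X_train, y_train)

        scene = rng.random((500, 500, 6))              # whole-scene pixel spectra
        labels = clf.predict(scene.reshape(-1, 6)).reshape(500, 500)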

  8. Benefit of the Use of GCxGC/MS Profiles for 1D GC/MS Data Treatment Illustrated by the Analysis of Pyrolysis Products from East Asian Handmade Papers

    NASA Astrophysics Data System (ADS)

    Han, Bin; Lob, Silvia; Sablier, Michel

    2018-06-01

    In this study, we report the use of pyrolysis-GCxGC/MS profiles for an optimized treatment of data issued from pyrolysis-GC/MS combined with the automatic deconvolution software Automated Mass Spectral Deconvolution and Identification System (AMDIS). The method was illustrated by the characterization of marker compounds of East Asian handmade papers through the examination of pyrolysis-GCxGC/MS data to get information which was used for manually identifying low concentrated and co-eluting compounds in 1D GC/MS data. The results showed that the merits of a higher separation power for co-eluting compounds and a better sensitivity for low concentration compounds offered by a GCxGC system can be used effectively for AMDIS 1D GC/MS data treatment: (i) the compound distribution in pyrolysis-GCxGC/MS profiles can be used as "peak finder" for manual check of low concentration and co-eluting compound identification in 1D GC/MS data, and (ii) pyrolysis-GCxGC/MS profiles can provide better quality mass spectra with observed higher match factors in the AMDIS automatic match process. The combination of 2D profile with AMDIS was shown to contribute efficiently to a better characterization of compound profiles in the chromatograms obtained by 1D analysis in focusing on the mass spectral identification. [Figure not available: see fulltext.

  9. ExoData: A Python package to handle large exoplanet catalogue data

    NASA Astrophysics Data System (ADS)

    Varley, Ryan

    2016-10-01

    Exoplanet science often involves using the system parameters of real exoplanets for tasks such as simulations, fitting routines, and target selection for proposals. Several exoplanet catalogues are already well established but often lack a version history and code friendly interfaces. Software that bridges the barrier between the catalogues and code enables users to improve the specific repeatability of results by facilitating the retrieval of exact system parameters used in articles results along with unifying the equations and software used. As exoplanet science moves towards large data, gone are the days where researchers can recall the current population from memory. An interface able to query the population now becomes invaluable for target selection and population analysis. ExoData is a Python interface and exploratory analysis tool for the Open Exoplanet Catalogue. It allows the loading of exoplanet systems into Python as objects (Planet, Star, Binary, etc.) from which common orbital and system equations can be calculated and measured parameters retrieved. This allows researchers to use tested code of the common equations they require (with units) and provides a large science input catalogue of planets for easy plotting and use in research. Advanced querying of targets is possible using the database and Python programming language. ExoData is also able to parse spectral types and fill in missing parameters according to programmable specifications and equations. Examples of use cases are integration of equations into data reduction pipelines, selecting planets for observing proposals and as an input catalogue to large scale simulation and analysis of planets. ExoData is a Python package available freely on GitHub.

  10. Non destructive defect detection by spectral density analysis.

    PubMed

    Krejcar, Ondrej; Frischer, Robert

    2011-01-01

    The potential nondestructive diagnostics of solid objects is discussed in this article. The whole process is accomplished by consecutive steps involving software analysis of the vibration power spectrum (eventually acoustic emissions) created during the normal operation of the diagnosed device or under unexpected situations. Another option is to create an artificial pulse, which can help us to determine the actual state of the diagnosed device. The main idea of this method is based on the analysis of the current power spectrum density of the received signal and its postprocessing in the Matlab environment with a following sample comparison in the Statistica software environment. The last step, which is comparison of samples, is the most important, because it is possible to determine the status of the examined object at a given time. Nowadays samples are compared only visually, but this method can't produce good results. Further the presented filter can choose relevant data from a huge group of data, which originate from applying FFT (Fast Fourier Transform). On the other hand, using this approach they can be subjected to analysis with the assistance of a neural network. If correct and high-quality starting data are provided to the initial network, we are able to analyze other samples and state in which condition a certain object is. The success rate of this approximation, based on our testing of the solution, is now 85.7%. With further improvement of the filter, it could be even greater. Finally it is possible to detect defective conditions or upcoming limiting states of examined objects/materials by using only one device which contains HW and SW parts. This kind of detection can provide significant financial savings in certain cases (such as continuous casting of iron where it could save hundreds of thousands of USD).
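
    A hedged sketch of the first processing step described, estimating the power spectral density of a vibration signal, using scipy's Welch estimator in place of the authors' Matlab routine; the sampling rate and test signal are assumptions.

        import numpy as np
        from scipy.signal import welch

        fs = 10_000.0                                   # sampling rate, Hz (assumed)
        t = np.arange(0, 2.0, 1.0 / fs)
        # Toy vibration signal: two structural resonances plus broadband noise.
        signal = (np.sin(2 * np.pi * 180 * t) + 0.4 * np.sin(2 * np.pi * 1250 * t)
                  + 0.2 * np.random.randn(t.size))

        freqs, psd = welch(signal, fs=fs, nperseg=4096)
        peak = freqs[np.argmax(psd)]
        print("dominant component near %.0f Hz" % peak)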

  11. Portable Multispectral Colorimeter for Metallic Ion Detection and Classification

    PubMed Central

    Jaimes, Ruth F. V. V.; Borysow, Walter; Gomes, Osmar F.; Salcedo, Walter J.

    2017-01-01

    This work describes a portable device system, proposed and developed to detect and classify different metallic ions, aimed at hydrological monitoring of systems such as rivers, lakes and groundwater. Considering the system features, a portable colorimetric system was developed using a multispectral optoelectronic sensor. All the technology for quantification and classification of metallic ions using optoelectronic multispectral sensors was fully integrated in embedded FPGA (Field Programmable Gate Array) hardware and software based on virtual instrumentation (NI LabVIEW®). The system acts as an indicative colorimeter using the chromogenic reagent 1-(2-pyridylazo)-2-naphthol (PAN). Signal processing and pattern analysis using linear discriminant analysis gave excellent results in the detection and classification of Pb(II), Cd(II), Zn(II), Cu(II), Fe(III) and Ni(II) ions, with almost the same level of performance as that obtained from high-spectral-resolution ultraviolet and visible (UV-VIS) spectrophotometers. PMID:28788082
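
    A generic, hedged sketch of the classification stage, linear discriminant analysis on multichannel sensor responses to assign an ion class, using scikit-learn with simulated data in place of the instrument output.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)
        ions = ["Pb", "Cd", "Zn", "Cu", "Fe", "Ni"]
        # Placeholder: 8-channel multispectral responses of PAN complexes, 60 samples/ion.
        X = np.vstack([rng.normal(loc=i, scale=0.4, size=(60, 8)) for i in range(len(ions))])
        y = np.repeat(ions, 60)

        lda = LinearDiscriminantAnalysis()
        scores = cross_val_score(lda, X, y, cv=5)
        print("cross-validated accuracy: %.2f" % scores.mean())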

  12. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.

  13. Portable Multispectral Colorimeter for Metallic Ion Detection and Classification.

    PubMed

    Braga, Mauro S; Jaimes, Ruth F V V; Borysow, Walter; Gomes, Osmar F; Salcedo, Walter J

    2017-07-28

    This work describes a portable device system, proposed and developed to detect and classify different metallic ions, aimed at hydrological monitoring of systems such as rivers, lakes and groundwater. Considering the system features, a portable colorimetric system was developed using a multispectral optoelectronic sensor. All the technology for quantification and classification of metallic ions using optoelectronic multispectral sensors was fully integrated in embedded FPGA (Field Programmable Gate Array) hardware and software based on virtual instrumentation (NI LabVIEW®). The system acts as an indicative colorimeter using the chromogenic reagent 1-(2-pyridylazo)-2-naphthol (PAN). Signal processing and pattern analysis using linear discriminant analysis gave excellent results in the detection and classification of Pb(II), Cd(II), Zn(II), Cu(II), Fe(III) and Ni(II) ions, with almost the same level of performance as that obtained from high-spectral-resolution ultraviolet and visible (UV-VIS) spectrophotometers.

  14. XAssist: A System for the Automation of X-ray Astrophysics Analysis

    NASA Astrophysics Data System (ADS)

    Ptak, A.

    2004-08-01

    XAssist is a NASA AISR-funded project for the automation of X-ray astrophysics. It is capable of data reprocessing, source detection, and preliminary spatial, temporal and spectral analysis for each source with sufficient counts. The bulk of the system is written in Python, which in turn drives underlying software (CIAO for Chandra data, etc.). Future work will include a GUI (mainly for beginners and status monitoring) and the exposure of at least some functionality as web services. The latter will help XAssist to eventually become part of the VO, making advanced queries possible, such as determining the X-ray fluxes of counterparts to HST or SDSS sources (including the use of unpublished X-ray data), and add the ability of ``on-the-fly'' X-ray processing. Pipelines are running on Chandra and XMM-Newton observations of galaxies to demonstrate XAssist's capabilities, and the results are available online (in real time) at http://www.xassist.org. XAssist itself as well as various associated projects are available for download.

  15. A CAMAC based real-time noise analysis system for nuclear reactors

    NASA Astrophysics Data System (ADS)

    Ciftcioglu, Özer

    1987-05-01

    A CAMAC based real-time noise analysis system was designed for the TRIGA MARK II nuclear reactor at the Institute for Nuclear Energy, Istanbul. The input analog signals obtained from the radiation detectors are introduced to the system through CAMAC interface. The signals converted into digital form are processed by a PDP-11 computer. The fast data processing based on auto/cross power spectral density computations is carried out by means of assembly written FFT algorithms in real-time and the spectra obtained are displayed on a CAMAC driven display system as an additional monitoring device. The system has the advantage of being software programmable and controlled by a CAMAC system so that it is operated under program control for reactor surveillance, anomaly detection and diagnosis. The system can also be used for the identification of nonstationary operational characteristics of the reactor in long term by comparing the noise power spectra with the corresponding reference noise patterns prepared in advance.
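
    A hedged sketch of the auto/cross power spectral density computation such a system performs, written with scipy rather than assembly FFT routines on a PDP-11; the sampling rate and detector signals below are invented for illustration.

        import numpy as np
        from scipy.signal import welch, csd

        fs = 500.0                                     # sampling rate, Hz (assumed)
        t = np.arange(0, 20.0, 1.0 / fs)
        common = np.sin(2 * np.pi * 7.3 * t)           # shared reactor noise component
        det1 = common + 0.5 * np.random.randn(t.size)  # two detector channels
        det2 = 0.8 * common + 0.5 * np.random.randn(t.size)

        f, apsd1 = welch(det1, fs=fs, nperseg=2048)    # auto power spectral density
        f, cpsd = csd(det1, det2, fs=fs, nperseg=2048) # cross power spectral density
        peak = f[np.argmax(np.abs(cpsd))]
        print("strongest common component near %.1f Hz" % peak)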

  16. Evaluation of Thymus vulgaris plant extract as an eco-friendly corrosion inhibitor for stainless steel 304 in acidic solution by means of electrochemical impedance spectroscopy, electrochemical noise analysis and density functional theory.

    PubMed

    Ehsani, A; Mahjani, M G; Hosseini, M; Safari, R; Moshrefi, R; Mohammad Shiri, H

    2017-03-15

    The inhibition performance of Thymus vulgaris plant leaf extract (thyme) as an environmentally friendly (green) inhibitor for the corrosion protection of stainless steel (SS) type 304 in 1.0 mol L⁻¹ HCl solution was studied by potentiodynamic polarization, electrochemical impedance spectroscopy (EIS) and electrochemical noise (EN) measurement techniques. The EN data were analyzed with the FFT technique to obtain spectral power density plots. The calculations were performed with MATLAB 2014a software. Geometry optimization and calculation of the structural and electronic properties of the inhibitor molecular system were carried out at the UB3LYP/6-311++G** level. Moreover, the results obtained from electrochemical noise analysis were compared with potentiodynamic polarization and electrochemical impedance spectroscopy. All of the techniques used showed a positive effect of the green inhibitor that increased with inhibitor concentration. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Hyperspectral imaging simulation of object under sea-sky background

    NASA Astrophysics Data System (ADS)

    Wang, Biao; Lin, Jia-xuan; Gao, Wei; Yue, Hui

    2016-10-01

    Remote sensing image simulation plays an important role in spaceborne/airborne payload demonstration and algorithm development. Hyperspectral imaging is valuable in marine monitoring and in search and rescue. To meet the demand for spectral imaging of objects in complex sea scenes, a physics-based method for simulating spectral images of objects under a sea-sky background is proposed. By developing an imaging simulation model that accounts for the object, background, atmospheric conditions and sensor, it is possible to examine how wind speed, atmospheric conditions and other environmental factors affect spectral image quality in complex sea scenes. Firstly, the sea scattering model is established based on the Phillips sea spectrum model, rough surface scattering theory and the volume scattering characteristics of water. Measured bidirectional reflectance distribution function (BRDF) data of objects are fitted to a statistical model. MODTRAN software is used to obtain the solar illumination on the sea, the sky brightness, the atmospheric transmittance from sea to sensor and the atmosphere-backscattered radiance, and a Monte Carlo ray tracing method is used to calculate the composite scattering of the sea surface and object and the spectral image. Finally, the object spectrum is acquired by space transformation, radiometric degradation and the addition of noise. The model connects the spectral image with the environmental, object and sensor parameters, providing a tool for payload demonstration and algorithm development.
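
    The surface model starts from the Phillips wave spectrum; the sketch below implements the commonly quoted form of that spectrum (as used in ocean-wave simulation), which may differ in detail from the exact formulation the authors implemented.

        import numpy as np

        def phillips_spectrum(kx, ky, wind_speed=10.0, wind_dir=(1.0, 0.0), A=1e-3, g=9.81):
            # Commonly used Phillips spectrum
            # P(k) = A * exp(-1/(kL)^2) * |k_hat . w_hat|^2 / k^4, with L = V^2 / g.
            k = np.hypot(kx, ky)
            k = np.where(k == 0, 1e-8, k)              # avoid division by zero at k = 0
            L = wind_speed ** 2 / g
            w = np.asarray(wind_dir, dtype=float)
            w /= np.linalg.norm(w)
            cos_factor = (kx * w[0] + ky * w[1]) / k
            return A * np.exp(-1.0 / (k * L) ** 2) / k ** 4 * cos_factor ** 2

        kx, ky = np.meshgrid(np.linspace(-2, 2, 256), np.linspace(-2, 2, 256))
        P = phillips_spectrum(kx, ky)                  # used to synthesise sea height fields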

  18. Land surface temperature measurements from EOS MODIS data

    NASA Technical Reports Server (NTRS)

    Wan, Zhengming

    1995-01-01

    A significant progress has been made in TIR instrumentation which is required to establish the spectral BRDF/emissivity knowledge base of land-surface materials and to validate the land-surface temperature (LST) algorithms. The SIBRE (spectral Infrared Bidirectional Reflectance and Emissivity) system and a TIR system for measuring spectral directional-hemispherical emissivity have been completed and tested successfully. Optical properties and performance features of key components (including spectrometer, and TIR source) of these systems have been characterized by integrated use of local standards (blackbody and reference plates). The stabilization of the spectrometer performance was improved by a custom designed and built liquid cooling system. Methods and procedures for measuring spectral TIR BRDF and directional-hemispheric emissivity with these two systems have been verified in sample measurements. These TIR instruments have been used in the laboratory and the field, giving very promising results. The measured spectral emissivities of water surface are very close to the calculated values based on well established water refractive index values in published papers. Preliminary results show that the TIR instruments can be used for validation of the MODIS LST algorithm in homogeneous test sites. The beta-3 version of the MODIS LST software is being prepared for its delivery scheduled in the early second half of this year.

  19. Spectral compression algorithms for the analysis of very large multivariate images

    DOEpatents

    Keenan, Michael R.

    2007-10-16

    A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
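
    A hedged sketch of the kind of factored representation the patent builds on: a truncated SVD/PCA of the (pixels × channels) data matrix so that downstream analysis can operate on a few significant factors instead of the full cube. This is a generic illustration, not the patented block algorithm.

        import numpy as np

        cube = np.random.rand(128, 128, 200)            # toy multivariate image (y, x, channels)
        X = cube.reshape(-1, cube.shape[-1])            # pixels as rows, spectra as columns
        mean = X.mean(axis=0)

        # Truncated SVD of the mean-centred data: keep only the most significant factors.
        U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
        n_factors = 10
        scores = U[:, :n_factors] * s[:n_factors]       # spatial factors (pixels x n_factors)
        loadings = Vt[:n_factors]                       # spectral factors (n_factors x channels)

        # Downstream analysis can work on scores/loadings; reconstruct if needed:
        approx = scores @ loadings + mean
        ratio = X.size / (scores.size + loadings.size + mean.size)
        print("compression ratio: %.1f" % ratio)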

  20. A Steady-State Kalman Predictor-Based Filtering Strategy for Non-Overlapping Sub-Band Spectral Estimation

    PubMed Central

    Li, Zenghui; Xu, Bin; Yang, Jian; Song, Jianshe

    2015-01-01

    This paper focuses on suppressing spectral overlap for sub-band spectral estimation, with which we can greatly decrease the computational complexity of existing spectral estimation algorithms, such as nonlinear least squares spectral analysis and non-quadratic regularized sparse representation. Firstly, our study shows that the nominal ability of the high-order analysis filter to suppress spectral overlap is greatly weakened when filtering a finite-length sequence, because many meaningless zeros are used as samples in convolution operations. Next, an extrapolation-based filtering strategy is proposed to produce a series of estimates as the substitutions of the zeros and to recover the suppression ability. Meanwhile, a steady-state Kalman predictor is applied to perform a linearly-optimal extrapolation. Finally, several typical methods for spectral analysis are applied to demonstrate the effectiveness of the proposed strategy. PMID:25609038
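
    A hedged sketch of the general idea: extrapolate a finite sequence with a steady-state Kalman predictor (here under an assumed AR(2) signal model) so that the analysis filter is not fed meaningless zeros at the boundary. The model parameters are illustrative and this is not the authors' exact formulation.

        import numpy as np

        # Assumed AR(2) signal model y_k = a1*y_{k-1} + a2*y_{k-2} + w_k in state-space form.
        a1, a2 = 1.6, -0.8
        F = np.array([[a1, a2], [1.0, 0.0]])
        H = np.array([[1.0, 0.0]])
        Q = np.diag([0.05, 0.0])                 # process noise covariance
        R = np.array([[0.1]])                    # measurement noise covariance

        # Steady-state error covariance by iterating the Riccati recursion.
        P = np.eye(2)
        for _ in range(500):
            S = H @ P @ H.T + R
            P = F @ (P - P @ H.T @ np.linalg.inv(S) @ H @ P) @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # steady-state Kalman gain

        def extrapolate(seq, n_extra):
            # Run the steady-state predictor over seq, then predict n_extra samples
            # beyond its end to replace the zeros otherwise used in convolution.
            x = np.array([[seq[0]], [seq[0]]])
            for y in seq:
                x = F @ x                                   # time update (prediction)
                x = x + K @ (np.array([[y]]) - H @ x)       # measurement update
            preds = []
            for _ in range(n_extra):
                x = F @ x                                   # pure prediction, no update
                preds.append(float(x[0, 0]))
            return np.array(preds)

        seq = np.sin(0.3 * np.arange(100)) + 0.1 * np.random.randn(100)
        tail = extrapolate(seq, 20)                         # padded samples for the filter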
