Distributed Unmixing of Hyperspectral Data with Sparsity Constraint
NASA Astrophysics Data System (ADS)
Khoshsokhan, S.; Rajabi, R.; Zayyani, H.
2017-09-01
Spectral unmixing (SU) is a data processing problem in hyperspectral remote sensing. The significant challenge in SU is how to accurately identify the endmembers and their weights. For estimating the signature and fractional abundance matrices in this blind problem, nonnegative matrix factorization (NMF) and its extensions are widely used. One constraint commonly added to NMF is a sparsity constraint, regularized by the L1/2 norm. In this paper, a new algorithm based on distributed optimization is used for spectral unmixing. In the proposed algorithm, a network of single-node clusters is employed, and each pixel of the hyperspectral image is considered a node in this network. The distributed unmixing with sparsity constraint is optimized with a diffusion LMS strategy, and the update equations for the fractional abundance and signature matrices are then obtained. Simulation results based on defined performance metrics illustrate the advantage of the proposed algorithm in spectral unmixing of hyperspectral data compared with other methods. The results show that the AAD and SAD of the proposed approach improve by about 6 and 27 percent, respectively, over distributed unmixing at SNR = 25 dB.
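As background for the sparsity term mentioned above, the sketch below shows a plain, centralized L1/2-sparsity-constrained NMF with multiplicative updates. It is not the paper's distributed diffusion-LMS algorithm; the update form, parameter names, and toy data are assumptions for illustration only.

```python
import numpy as np

def l_half_nmf(X, p, lam=0.1, n_iter=200, eps=1e-9):
    """Centralized L1/2-sparsity-constrained NMF baseline (assumed sketch).

    X : (bands, pixels) nonnegative data matrix
    p : number of endmembers
    Returns the signature matrix A (bands, p) and abundances S (p, pixels).
    """
    rng = np.random.default_rng(0)
    bands, pixels = X.shape
    A = rng.random((bands, p)) + eps
    S = rng.random((p, pixels)) + eps
    for _ in range(n_iter):
        # multiplicative update for the signature matrix
        A *= (X @ S.T) / (A @ S @ S.T + eps)
        # multiplicative update for abundances with an L1/2 sparsity term
        S *= (A.T @ X) / (A.T @ A @ S + 0.5 * lam * S ** (-0.5) + eps)
    return A, S

# toy usage: 3 endmembers mixed over 100 synthetic pixels
true_A = np.abs(np.random.default_rng(1).normal(size=(50, 3)))
true_S = np.random.default_rng(2).dirichlet(np.ones(3), size=100).T
A_hat, S_hat = l_half_nmf(true_A @ true_S, p=3)
```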
A method of minimum volume simplex analysis constrained unmixing for hyperspectral image
NASA Astrophysics Data System (ADS)
Zou, Jinlin; Lan, Jinhui; Zeng, Yiliang; Wu, Hongtao
2017-07-01
The signal recorded by a low-resolution hyperspectral remote sensor from a given pixel, even setting aside the effects of complex terrain, is a mixture of substances. To improve the accuracy of classification and sub-pixel object detection, hyperspectral unmixing (HU) has become a frontier topic in remote sensing. Geometry-based unmixing algorithms have become popular because hyperspectral images possess abundant spectral information and the mixing model is easy to understand. However, most of these algorithms rely on the pure pixel assumption, and since the nonlinear mixing model is complex, it is hard to obtain the optimal endmembers, especially for highly mixed spectral data. To provide a simple but accurate method, we propose a minimum volume simplex analysis constrained (MVSAC) unmixing algorithm. The proposed approach combines the algebraic constraints inherent to the convex minimum-volume formulation with a soft abundance constraint. By taking the abundance fractions into account, we can obtain the pure endmember set and the corresponding abundance fractions, so the final unmixing result is closer to reality and more accurate. We illustrate the performance of the proposed algorithm in unmixing simulated and real hyperspectral data, and the results indicate that the proposed method can obtain the distinct signatures correctly without redundant endmembers and yields much better performance than pure-pixel-based algorithms.
Sensitive Dual Color In Vivo Bioluminescence Imaging Using a New Red Codon Optimized Firefly Luciferase and a Green Click Beetle Luciferase
Laura...
2011-04-01
...20 nm). Spectral unmixing algorithms were applied to the images, where good separation of signals was observed. Furthermore, HEK293 cells that... spectral emissions using a suitable spectral unmixing algorithm. This new D-luciferin-dependent reporter gene couplet opens up the possibility in the future...
Spectral Unmixing With Multiple Dictionaries
NASA Astrophysics Data System (ADS)
Cohen, Jeremy E.; Gillis, Nicolas
2018-02-01
Spectral unmixing aims at recovering the spectral signatures of materials, called endmembers, mixed in a hyperspectral or multispectral image, along with their abundances. A typical assumption is that the image contains one pure pixel per endmember, in which case spectral unmixing reduces to identifying these pixels. Many fully automated methods have been proposed in recent years, but little work has been done to allow users to select areas where pure pixels are present, either manually or using a segmentation algorithm. Additionally, in a non-blind approach, several spectral libraries may be available rather than a single one, with a fixed number (or an upper or lower bound) of endmembers to choose from each. In this paper, we propose a multiple-dictionary constrained low-rank matrix approximation model that addresses these two problems. We propose an algorithm to compute this model, dubbed M2PALS, and its performance is discussed on both synthetic and real hyperspectral images.
Spectral Unmixing Based Construction of Lunar Mineral Abundance Maps
NASA Astrophysics Data System (ADS)
Bernhardt, V.; Grumpe, A.; Wöhler, C.
2017-07-01
In this study we apply a nonlinear spectral unmixing algorithm to a nearly global lunar spectral reflectance mosaic derived from hyper-spectral image data acquired by the Moon Mineralogy Mapper (M3) instrument. Corrections for topographic effects and for thermal emission were performed. A set of 19 laboratory-based reflectance spectra of lunar samples published by the Lunar Soil Characterization Consortium (LSCC) were used as a catalog of potential endmember spectra. For a given spectrum, the multi-population population-based incremental learning (MPBIL) algorithm was used to determine the subset of endmembers actually contained in it. However, as the MPBIL algorithm is computationally expensive, it cannot be applied to all pixels of the reflectance mosaic. Hence, the reflectance mosaic was clustered into a set of 64 prototype spectra, and the MPBIL algorithm was applied to each prototype spectrum. Each pixel of the mosaic was assigned to the most similar prototype, and the set of endmembers previously determined for that prototype was used for pixel-wise nonlinear spectral unmixing using the Hapke model, implemented as linear unmixing of the single-scattering albedo spectrum. This procedure yields maps of the fractional abundances of the 19 endmembers. Based on the known modal abundances of a variety of mineral species in the LSCC samples, a conversion from endmember abundances to mineral abundances was performed. We present maps of the fractional abundances of plagioclase, pyroxene and olivine and compare our results with previously published lunar mineral abundance maps.
Supervised nonlinear spectral unmixing using a postnonlinear mixing model for hyperspectral imagery.
Altmann, Yoann; Halimi, Abderrahim; Dobigeon, Nicolas; Tourneret, Jean-Yves
2012-06-01
This paper presents a nonlinear mixing model for hyperspectral image unmixing. The proposed model assumes that the pixel reflectances are nonlinear functions of pure spectral components contaminated by an additive white Gaussian noise. These nonlinear functions are approximated using polynomial functions leading to a polynomial postnonlinear mixing model. A Bayesian algorithm and optimization methods are proposed to estimate the parameters involved in the model. The performance of the unmixing strategies is evaluated by simulations conducted on synthetic and real data.
Spectral unmixing of multi-color tissue specific in vivo fluorescence in mice
NASA Astrophysics Data System (ADS)
Zacharakis, Giannis; Favicchio, Rosy; Garofalakis, Anikitos; Psycharakis, Stylianos; Mamalaki, Clio; Ripoll, Jorge
2007-07-01
Fluorescence Molecular Tomography (FMT) has emerged as a powerful tool for monitoring biological functions in vivo in small animals. It provides the means to determine volumetric images of fluorescent protein concentration by applying the principles of diffuse optical tomography. Using different probes tagged to different proteins or cells, different biological functions and pathways can be simultaneously imaged in the same subject. In this work we present a spectral unmixing algorithm capable of separating signal from different probes when combined with the tomographic imaging modality. We show results of two-color imaging when the algorithm is applied to separate fluorescence activity originating from phantoms containing two different fluorophores, namely CFSE and SNARF, with well separated emission spectra, as well as DsRed- and GFP-fused cells in F5-b10 transgenic mice in vivo. The same algorithm can furthermore be applied to tissue-specific spectroscopy data. Spectral analysis of a variety of organs from control, DsRed and GFP F5/B10 transgenic mice showed that fluorophore detection by optical systems is highly tissue-dependent. Spectral data collected from different organs can provide useful insight into experimental parameter optimisation (choice of filters, fluorophores, excitation wavelengths), and spectral unmixing can be applied to measure the tissue dependency, thereby taking into account localized fluorophore efficiency. In summary, tissue spectral unmixing can be used as a criterion for choosing the most appropriate tissue targets as well as fluorescent markers for specific applications.
NASA Astrophysics Data System (ADS)
Martin, Gabriel; Gonzalez-Ruiz, Vicente; Plaza, Antonio; Ortiz, Juan P.; Garcia, Inmaculada
2010-07-01
Lossy hyperspectral image compression has received considerable interest in recent years due to the extremely high dimensionality of the data. However, the impact of lossy compression on spectral unmixing techniques has not been widely studied. These techniques characterize mixed pixels (resulting from insufficient spatial resolution) in terms of a suitable combination of spectrally pure substances (called endmembers) weighted by their estimated fractional abundances. This paper focuses on the impact of JPEG2000-based lossy compression of hyperspectral images on the quality of the endmembers extracted by different algorithms. The three considered algorithms are the orthogonal subspace projection (OSP), which uses only spectral information, and the automatic morphological endmember extraction (AMEE) and spatial spectral endmember extraction (SSEE), which integrate both spatial and spectral information in the search for endmembers. The impact of compression on the abundance estimates derived from the endmembers obtained by the different methods is also assessed. Experiments are conducted using a hyperspectral data set collected by the NASA Jet Propulsion Laboratory over the Cuprite mining district in Nevada. The experimental results are quantitatively analyzed using reference information available from the U.S. Geological Survey, resulting in recommendations to specialists interested in applying endmember extraction and unmixing algorithms to compressed hyperspectral data.
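A common way to quantify how much compression degrades extracted endmembers is the spectral angle between each original endmember and its post-compression counterpart. The small sketch below computes that metric; the function names and the pairing of endmembers by column are assumptions for illustration, not the paper's exact evaluation protocol.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two endmember spectra."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def mean_sad(E_ref, E_cmp):
    """Mean spectral angle distance between matched columns of the reference
    endmembers and the endmembers extracted from compressed data."""
    return float(np.mean([spectral_angle(E_ref[:, k], E_cmp[:, k])
                          for k in range(E_ref.shape[1])]))
```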
Multi-objective based spectral unmixing for hyperspectral images
NASA Astrophysics Data System (ADS)
Xu, Xia; Shi, Zhenwei
2017-02-01
Sparse hyperspectral unmixing assumes that each observed pixel can be expressed as a linear combination of a few pure spectra from an a priori library. Sparse unmixing is challenging, since it is usually formulated as an NP-hard l0-norm optimization problem. Existing methods usually relax the original l0 norm. However, the relaxation may introduce sensitive weighting parameters and additional approximation error. In this paper, we propose a novel multi-objective algorithm to solve the sparse unmixing problem without any relaxation. We transform sparse unmixing into a multi-objective optimization problem with two correlated objectives: minimizing the reconstruction error and controlling the endmember sparsity. To improve the efficiency of the multi-objective optimization, a population-based random flipping strategy is designed. Moreover, we theoretically prove that the proposed method is able to recover a guaranteed approximate solution from the spectral library within a limited number of iterations. The proposed method deals with the l0 norm directly via binary coding of the spectral signatures in the library. Experiments on both synthetic and real hyperspectral datasets demonstrate the effectiveness of the proposed method.
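To make the bi-objective formulation concrete, the toy sketch below encodes a library selection as a binary vector, scores it by reconstruction error and by the number of active spectra, and keeps a non-dominated archive while flipping random bits. The population handling, flip rate, and archive rules are simplifications assumed for illustration, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import nnls

def objectives(z, A, y):
    """Return (reconstruction error, sparsity) for a binary selection z over library A."""
    idx = np.flatnonzero(z)
    if idx.size == 0:
        return np.linalg.norm(y), 0
    x, _ = nnls(A[:, idx], y)
    return np.linalg.norm(A[:, idx] @ x - y), int(idx.size)

def dominates(f, g):
    """Pareto dominance for minimization of both objectives."""
    return all(fi <= gi for fi, gi in zip(f, g)) and any(fi < gi for fi, gi in zip(f, g))

def mo_sparse_unmix(A, y, pop=30, iters=200, seed=0):
    """Toy multi-objective search: evolve binary selections by random bit flips
    and keep a non-dominated archive (error vs. number of active spectra)."""
    rng = np.random.default_rng(seed)
    m = A.shape[1]
    population = [rng.random(m) < 0.05 for _ in range(pop)]
    archive = []  # list of (selection, objectives) pairs, kept non-dominated
    for _ in range(iters):
        for z in population:
            child = z.copy()
            child[rng.integers(m)] ^= True          # flip one random bit
            f = objectives(child, A, y)
            if not any(dominates(fa, f) for _, fa in archive):
                archive = [(za, fa) for za, fa in archive if not dominates(f, fa)]
                archive.append((child, f))
        population = [za for za, _ in archive[-pop:]] or population
    return archive
```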
Robust Spectral Unmixing of Sparse Multispectral Lidar Waveforms using Gamma Markov Random Fields
Altmann, Yoann; Maccarone, Aurora; McCarthy, Aongus; ...
2017-05-10
This paper presents a new Bayesian spectral unmixing algorithm to analyse remote scenes sensed via sparse multispectral Lidar measurements. To a first approximation, in the presence of a target, each Lidar waveform consists of a main peak whose position depends on the target distance and whose amplitude depends on the wavelength of the laser source considered (i.e., on the target reflectivity). In addition, these temporal responses are usually assumed to be corrupted by Poisson noise in the low photon count regime. When considering multiple wavelengths, it becomes possible to use spectral information to identify and quantify the main materials in the scene, in addition to estimating the Lidar-based range profiles. Due to its anomaly detection capability, the proposed hierarchical Bayesian model, coupled with an efficient Markov chain Monte Carlo algorithm, allows robust estimation of depth images together with abundance and outlier maps associated with the observed 3D scene. The proposed methodology is illustrated via experiments conducted with real multispectral Lidar data acquired in a controlled environment. The results demonstrate the possibility to unmix spectral responses constructed from extremely sparse photon counts (less than 10 photons per pixel and band).
Spectral Unmixing Analysis of Time Series Landsat 8 Images
NASA Astrophysics Data System (ADS)
Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.
2018-05-01
Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Using the temporal information can provide improved unmixing performance compared to independent image analyses. Moreover, different land cover types may exhibit different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used to initialize the endmembers. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the current endmembers. Then, each endmember is updated as the mean of its "purified" pixels, i.e., the residual of each mixed pixel after excluding the contributions of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework yields the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than a "separate unmixing" approach.
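The alternating structure described above can be sketched as follows. This is a simplified reading of the abstract: random-pixel initialization stands in for VCA, and the "purified pixel" endmember update is implemented as a weighted average (which reduces to an HALS-style column update), so the names and details are assumptions rather than the authors' exact procedure.

```python
import numpy as np
from scipy.optimize import nnls

def kp_means_unmix(Y, p, n_iter=20, seed=0):
    """Simplified K-P-Means-style loop (assumed reading of the abstract).

    Y : (bands, pixels) data matrix; p : number of endmembers.
    Alternates NNLS abundance estimation with an endmember update computed
    from 'purified' pixels (pixel minus contributions of the other endmembers)."""
    rng = np.random.default_rng(seed)
    bands, n = Y.shape
    E = Y[:, rng.choice(n, p, replace=False)].astype(float)   # stand-in for VCA init
    S = np.zeros((p, n))
    for _ in range(n_iter):
        # abundance step: per-pixel nonnegative least squares
        for i in range(n):
            S[:, i], _ = nnls(E, Y[:, i])
        # endmember step: average the purified pixels for each endmember
        for k in range(p):
            purified = Y - E @ S + np.outer(E[:, k], S[k])     # remove nondominant contributions
            w = S[k]
            if w.sum() > 0:
                E[:, k] = np.clip(purified @ w / (w @ w + 1e-12), 0, None)
    return E, S
```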
NASA Astrophysics Data System (ADS)
Xu, Xia; Shi, Zhenwei; Pan, Bin
2018-07-01
Sparse unmixing aims at recovering pure materials from hyperspectral images and estimating their abundance fractions. Sparse unmixing is essentially an ℓ0 problem, which is NP-hard, so a relaxation is often used. In this paper, we attempt to deal with the ℓ0 problem directly via a multi-objective method, which is a non-convex approach. The characteristics of hyperspectral images are integrated into the proposed method, leading to a new spectral and multi-objective based sparse unmixing method (SMoSU). In order to solve the ℓ0-norm optimization problem, the spectral library is encoded in a binary vector, and a bit-wise flipping strategy is used to generate new individuals during the evolution. However, a multi-objective method usually produces a number of non-dominated solutions, while sparse unmixing requires a single solution, so making the final decision is challenging. To handle this problem, we integrate the spectral characteristics of hyperspectral images into SMoSU. By considering the spectral correlation in hyperspectral data, we improve the Tchebycheff decomposition function in SMoSU via a new regularization term. This regularization term enforces individual divergence during the evolution of SMoSU. In this way, the diversity and convergence of the population are further balanced, which benefits the concentration of individuals. In the experiments, three synthetic datasets and one real-world dataset are used to analyse the effectiveness of SMoSU, and several state-of-the-art sparse unmixing algorithms are compared.
Collewet, Guylaine; Moussaoui, Saïd; Deligny, Cécile; Lucas, Tiphaine; Idier, Jérôme
2018-06-01
Multi-tissue partial volume estimation in MRI images is investigated from a viewpoint related to spectral unmixing as used in hyperspectral imaging. The main contribution of this paper is twofold. It first proposes a theoretical analysis of the statistical optimality conditions of the proportion estimation problem, which in the context of multi-contrast MRI data acquisition allows the imaging sequence parameters to be set appropriately. Second, an efficient proportion quantification algorithm is proposed, based on the minimisation of a penalised least-squares criterion incorporating a regularity constraint on the spatial distribution of the proportions. The resulting developments are further discussed using empirical simulations. The practical usefulness of the spectral unmixing approach for partial volume quantification in MRI is illustrated through an application to food analysis, on the proving of a Danish pastry. Copyright © 2018 Elsevier Inc. All rights reserved.
Unsupervised Unmixing of Hyperspectral Images Accounting for Endmember Variability.
Halimi, Abderrahim; Dobigeon, Nicolas; Tourneret, Jean-Yves
2015-12-01
This paper presents an unsupervised Bayesian algorithm for hyperspectral image unmixing, accounting for endmember variability. The pixels are modeled by a linear combination of endmembers weighted by their corresponding abundances. However, the endmembers are assumed random to consider their variability in the image. An additive noise is also considered in the proposed model, generalizing the normal compositional model. The proposed algorithm exploits the whole image to benefit from both spectral and spatial information. It estimates both the mean and the covariance matrix of each endmember in the image. This allows the behavior of each material to be analyzed and its variability to be quantified in the scene. A spatial segmentation is also obtained based on the estimated abundances. In order to estimate the parameters associated with the proposed Bayesian model, we propose to use a Hamiltonian Monte Carlo algorithm. The performance of the resulting unmixing strategy is evaluated through simulations conducted on both synthetic and real data.
NASA Astrophysics Data System (ADS)
Pu, Huangsheng; Zhang, Guanglei; He, Wei; Liu, Fei; Guang, Huizhi; Zhang, Yue; Bai, Jing; Luo, Jianwen
2014-09-01
It is a challenging problem to resolve and identify drug (or non-specific fluorophore) distribution throughout the whole body of small animals in vivo. In this article, an algorithm of unmixing multispectral fluorescence tomography (MFT) images based on independent component analysis (ICA) is proposed to solve this problem. ICA is used to unmix the data matrix assembled by the reconstruction results from MFT. Then the independent components (ICs) that represent spatial structures and the corresponding spectrum courses (SCs) which are associated with spectral variations can be obtained. By combining the ICs with SCs, the recovered MFT images can be generated and fluorophore concentration can be calculated. Simulation studies, phantom experiments and animal experiments with different concentration contrasts and spectrum combinations are performed to test the performance of the proposed algorithm. Results demonstrate that the proposed algorithm can not only provide the spatial information of fluorophores, but also recover the actual reconstruction of MFT images.
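The data-matrix unmixing step described above can be illustrated with an off-the-shelf ICA, as in the sketch below: the rows of the data matrix are the reconstructed volumes at each wavelength, the recovered sources approximate the per-fluorophore spatial maps (ICs), and the mixing matrix columns approximate the spectrum courses (SCs). The shapes, parameter values, and use of scikit-learn's FastICA are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import FastICA

def unmix_mft_stack(recons, n_fluors=2, seed=0):
    """Assumed sketch of ICA-based unmixing of multispectral tomography results.

    recons : (n_wavelengths, n_voxels) array, one reconstructed volume per band,
             flattened into the rows of the data matrix.
    Returns (ICs, SCs): spatial independent components and spectral courses."""
    ica = FastICA(n_components=n_fluors, random_state=seed, max_iter=1000)
    ICs = ica.fit_transform(recons.T).T        # (n_fluors, n_voxels) spatial structures
    SCs = ica.mixing_                          # (n_wavelengths, n_fluors) spectrum courses
    return ICs, SCs
```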
Method for hyperspectral imagery exploitation and pixel spectral unmixing
NASA Technical Reports Server (NTRS)
Lin, Ching-Fang (Inventor)
2003-01-01
An efficient hybrid approach to exploit hyperspectral imagery and unmix spectral pixels. This hybrid approach uses a genetic algorithm to solve for the abundance vector of the first pixel of a hyperspectral image cube. This abundance vector is used as the initial state in a robust filter to derive the abundance estimate for the next pixel. Using a Kalman filter, the abundance estimate for a pixel can be obtained in a single iteration, which is much faster than the genetic algorithm. The output of the robust filter is fed to the genetic algorithm again to derive an accurate abundance estimate for the current pixel. Using the robust filter solution as the starting point of the genetic algorithm speeds up its evolution. After obtaining the accurate abundance estimate, the procedure moves to the next pixel and uses the output of the genetic algorithm as the previous state estimate to derive the abundance estimate for this pixel with the robust filter, and again uses the genetic algorithm to refine the abundance estimate efficiently based on the robust filter solution. This iteration continues until all pixels in the hyperspectral image cube have been processed.
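The filtering half of this pixel-to-pixel propagation can be sketched with a standard Kalman update under an identity state transition, as below. The genetic-algorithm refinement step is omitted, the noise covariances are placeholders, and the abundances are not constrained to be nonnegative or sum to one, so this is only an assumed simplification of the patented scheme.

```python
import numpy as np

def sequential_abundance_filter(Y, E, q=1e-3, r=1e-2):
    """Sketch of the filter half of the hybrid scheme: propagate each pixel's
    abundance estimate to the next pixel with a Kalman-style update.

    Y : (bands, pixels) image cube unfolded pixel by pixel
    E : (bands, p) endmember signature matrix"""
    bands, n = Y.shape
    p = E.shape[1]
    a = np.full(p, 1.0 / p)              # would be the GA solution for the first pixel
    P = np.eye(p)
    Q, R = q * np.eye(p), r * np.eye(bands)
    A = np.zeros((p, n))
    for i in range(n):
        P_pred = P + Q                                        # predict with identity dynamics
        S = E @ P_pred @ E.T + R                              # innovation covariance
        K = P_pred @ E.T @ np.linalg.solve(S, np.eye(bands))  # Kalman gain
        a = a + K @ (Y[:, i] - E @ a)                         # correct with the current pixel
        P = (np.eye(p) - K @ E) @ P_pred
        A[:, i] = a
    return A
```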
NASA Technical Reports Server (NTRS)
Swayze, Gregg A.; Clark, Roger N.
1995-01-01
The rapid development of sophisticated imaging spectrometers and resulting flood of imaging spectrometry data has prompted a rapid parallel development of spectral-information extraction technology. Even though these extraction techniques have evolved along different lines (band-shape fitting, endmember unmixing, near-infrared analysis, neural-network fitting, and expert systems to name a few), all are limited by the spectrometer's signal to noise (S/N) and spectral resolution in producing useful information. This study grew from a need to quantitatively determine what effects these parameters have on our ability to differentiate between mineral absorption features using a band-shape fitting algorithm. We chose to evaluate the AVIRIS, HYDICE, MIVIS, GERIS, VIMS, NIMS, and ASTER instruments because they collect data over wide S/N and spectral-resolution ranges. The study evaluates the performance of the Tricorder algorithm, in differentiating between mineral spectra in the 0.4-2.5 micrometer spectral region. The strength of the Tricorder algorithm is in its ability to produce an easily understood comparison of band shape that can concentrate on small relevant portions of the spectra, giving it an advantage over most unmixing schemes, and in that it need not spend large amounts of time reoptimizing each time a new mineral component is added to its reference library, as is the case with neural-network schemes. We believe the flexibility of the Tricorder algorithm is unparalleled among spectral-extraction techniques and that the results from this study, although dealing with minerals, will have direct applications to spectral identification in other disciplines.
Generation, Validation, and Application of Abundance Map Reference Data for Spectral Unmixing
NASA Astrophysics Data System (ADS)
Williams, McKay D.
Reference data ("ground truth") maps traditionally have been used to assess the accuracy of imaging spectrometer classification algorithms. However, these reference data can be prohibitively expensive to produce, often do not include sub-pixel abundance estimates necessary to assess spectral unmixing algorithms, and lack published validation reports. Our research proposes methodologies to efficiently generate, validate, and apply abundance map reference data (AMRD) to airborne remote sensing scenes. We generated scene-wide AMRD for three different remote sensing scenes using our remotely sensed reference data (RSRD) technique, which spatially aggregates unmixing results from fine scale imagery (e.g., 1-m Ground Sample Distance (GSD)) to co-located coarse scale imagery (e.g., 10-m GSD or larger). We validated the accuracy of this methodology by estimating AMRD in 51 randomly-selected 10 m x 10 m plots, using seven independent methods and observers, including field surveys by two observers, imagery analysis by two observers, and RSRD using three algorithms. Results indicated statistically-significant differences between all versions of AMRD, suggesting that all forms of reference data need to be validated. Given these significant differences between the independent versions of AMRD, we proposed that the mean of all (MOA) versions of reference data for each plot and class were most likely to represent true abundances. We then compared each version of AMRD to MOA. Best case accuracy was achieved by a version of imagery analysis, which had a mean coverage area error of 2.0%, with a standard deviation of 5.6%. One of the RSRD algorithms was nearly as accurate, achieving a mean error of 3.0%, with a standard deviation of 6.3%, showing the potential of RSRD-based AMRD generation. Application of validated AMRD to specific coarse scale imagery involved three main parts: 1) spatial alignment of coarse and fine scale imagery, 2) aggregation of fine scale abundances to produce coarse scale imagery-specific AMRD, and 3) demonstration of comparisons between coarse scale unmixing abundances and AMRD. Spatial alignment was performed using our scene-wide spectral comparison (SWSC) algorithm, which aligned imagery with accuracy approaching the distance of a single fine scale pixel. We compared simple rectangular aggregation to coarse sensor point spread function (PSF) aggregation, and found that the PSF approach returned lower error, but that rectangular aggregation more accurately estimated true abundances at ground level. We demonstrated various metrics for comparing unmixing results to AMRD, including mean absolute error (MAE) and linear regression (LR). We additionally introduced reference data mean adjusted MAE (MA-MAE), and reference data confidence interval adjusted MAE (CIA-MAE), which account for known error in the reference data itself. MA-MAE analysis indicated that fully constrained linear unmixing of coarse scale imagery across all three scenes returned an error of 10.83% per class and pixel, with regression analysis yielding a slope = 0.85, intercept = 0.04, and R2 = 0.81. Our reference data research has demonstrated a viable methodology to efficiently generate, validate, and apply AMRD to specific examples of airborne remote sensing imagery, thereby enabling direct quantitative assessment of spectral unmixing performance.
M-estimation for robust sparse unmixing of hyperspectral images
NASA Astrophysics Data System (ADS)
Toomik, Maria; Lu, Shijian; Nelson, James D. B.
2016-10-01
Hyperspectral unmixing methods often use a conventional least squares based lasso, which assumes that the data follow the Gaussian distribution. The normality assumption is an approximation which is generally invalid for real imagery data. We consider a robust (non-Gaussian) approach to sparse spectral unmixing of remotely sensed imagery which reduces the sensitivity of the estimator to outliers and relaxes the linearity assumption. The method combines several appropriate penalties. We propose to use an lp norm with 0 < p < 1 in the sparse regression problem, which induces more sparsity in the results but makes the problem non-convex. On the other hand, the problem, though non-convex, can be solved quite straightforwardly with an extensible algorithm based on iteratively reweighted least squares. To deal with the huge size of modern spectral libraries we introduce a library reduction step, similar to the multiple signal classification (MUSIC) array processing algorithm, which not only speeds up unmixing but also yields superior results. In the hyperspectral setting we extend the traditional least squares method to the robust heavy-tailed case and propose a generalised M-lasso solution. M-estimation replaces the Gaussian likelihood with a fixed function ρ(e) that restrains outliers. The M-estimate function reduces the effect of errors with large amplitudes or even assigns the outliers zero weights. Our experimental results on real hyperspectral data show that noise with large amplitudes (outliers) often exists in the data. The ability to mitigate the influence of such outliers therefore offers greater robustness. Qualitative hyperspectral unmixing results on real hyperspectral image data corroborate the efficacy of the proposed method.
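A minimal IRLS-style iteration that combines a Huber residual weight (the M-estimation part) with an lp reweighting of the coefficients (0 < p < 1) might look like the sketch below. The specific weight functions, the ridge-style solve, and the nonnegativity clip are assumptions for illustration, not the paper's exact M-lasso.

```python
import numpy as np

def robust_lp_unmix(A, y, lam=1e-2, p=0.5, huber_c=1.0, n_iter=50, eps=1e-8):
    """Assumed IRLS sketch of robust sparse unmixing for one pixel y over library A:
    Huber weights tame large residuals, lp reweighting promotes sparsity of x."""
    m = A.shape[1]
    x = np.full(m, 1.0 / m)
    for _ in range(n_iter):
        e = y - A @ x
        w = np.where(np.abs(e) <= huber_c, 1.0, huber_c / (np.abs(e) + eps))  # Huber residual weights
        v = p / (np.abs(x) ** (2 - p) + eps)                                  # lp reweighting of coefficients
        AtW = A.T * w                                                         # A^T diag(w)
        x = np.linalg.solve(AtW @ A + lam * np.diag(v), AtW @ y)
        x = np.clip(x, 0, None)                                               # keep abundances nonnegative
    return x
```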
Quadratic Blind Linear Unmixing: A Graphical User Interface for Tissue Characterization
Gutierrez-Navarro, O.; Campos-Delgado, D.U.; Arce-Santana, E. R.; Jo, Javier A.
2016-01-01
Spectral unmixing is the process of breaking down data from a sample into its basic components and their abundances. Previous work has been focused on blind unmixing of multi-spectral fluorescence lifetime imaging microscopy (m-FLIM) datasets under a linear mixture model and quadratic approximations. This method provides a fast linear decomposition and can work without a limitation in the maximum number of components or end-members. Hence this work presents an interactive software which implements our blind end-member and abundance extraction (BEAE) and quadratic blind linear unmixing (QBLU) algorithms in Matlab. The options and capabilities of our proposed software are described in detail. When the number of components is known, our software can estimate the constitutive end-members and their abundances. When no prior knowledge is available, the software can provide a completely blind solution to estimate the number of components, the end-members and their abundances. The characterization of three case studies validates the performance of the new software: ex-vivo human coronary arteries, human breast cancer cell samples, and in-vivo hamster oral mucosa. The software is freely available in a hosted webpage by one of the developing institutions, and allows the user a quick, easy-to-use and efficient tool for multi/hyper-spectral data decomposition. PMID:26589467
Endmember extraction from hyperspectral image based on discrete firefly algorithm (EE-DFA)
NASA Astrophysics Data System (ADS)
Zhang, Chengye; Qin, Qiming; Zhang, Tianyuan; Sun, Yuanheng; Chen, Chao
2017-04-01
This study proposes a novel method to extract endmembers from hyperspectral images based on the discrete firefly algorithm (EE-DFA). Endmembers are the input of many spectral unmixing algorithms. Hence, in this paper, endmember extraction from a hyperspectral image is regarded as a combinatorial optimization problem aimed at the best spectral unmixing results, which is solved by the discrete firefly algorithm. Two series of experiments were conducted, on synthetic hyperspectral datasets with different SNR and on the AVIRIS Cuprite dataset, respectively. The experimental results were compared with the endmembers extracted by four popular methods: sequential maximum angle convex cone (SMACC), N-FINDR, vertex component analysis (VCA), and minimum volume constrained nonnegative matrix factorization (MVC-NMF). Moreover, the effect of the parameters of the proposed method was tested on both the synthetic datasets and the AVIRIS Cuprite dataset, and a recommended parameter setting is proposed. The results demonstrate that the proposed EE-DFA method performs better than the existing popular methods and is robust under different SNR conditions.
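Framing endmember extraction as a combinatorial search requires a fitness for each candidate set of image pixels. The sketch below scores a candidate by the NNLS reconstruction error of the whole image and wraps it in a plain random search as a stand-in for the firefly moves; both the fitness choice and the search loop are assumptions for illustration, not the EE-DFA update rules.

```python
import numpy as np
from scipy.optimize import nnls

def unmixing_fitness(Y, candidate_idx):
    """Fitness of a candidate endmember set (pixel indices) for a combinatorial
    search: RMSE of reconstructing every pixel by NNLS on the selected spectra."""
    E = Y[:, candidate_idx]                       # candidate endmembers are image pixels
    err = 0.0
    for i in range(Y.shape[1]):
        a, _ = nnls(E, Y[:, i])
        err += np.sum((E @ a - Y[:, i]) ** 2)
    return np.sqrt(err / Y.size)

def random_search(Y, p, trials=200, seed=0):
    """Toy random search over candidate index sets (stand-in for the firefly moves)."""
    rng = np.random.default_rng(seed)
    best_idx, best_fit = None, np.inf
    for _ in range(trials):
        idx = rng.choice(Y.shape[1], size=p, replace=False)
        fit = unmixing_fitness(Y, idx)
        if fit < best_fit:
            best_idx, best_fit = idx, fit
    return best_idx, best_fit
```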
CHAMP: a locally adaptive unmixing-based hyperspectral anomaly detection algorithm
NASA Astrophysics Data System (ADS)
Crist, Eric P.; Thelen, Brian J.; Carrara, David A.
1998-10-01
Anomaly detection offers a means to identify potentially important objects in a scene without prior knowledge of their spectral signatures. As such, this approach is less sensitive to variations in target class composition, atmospheric and illumination conditions, and sensor gain settings than a spectral matched filter or similar algorithm would be. The best existing anomaly detectors generally fall into one of two categories: those based on local Gaussian statistics, and those based on linear mixing models. Unmixing-based approaches better represent the real distribution of data in a scene, but are typically derived and applied on a global or scene-wide basis. Locally adaptive approaches allow detection of more subtle anomalies by accommodating the spatial non-homogeneity of background classes in a typical scene, but provide a poorer representation of the true underlying background distribution. The CHAMP algorithm combines the best attributes of both approaches, applying a linear-mixing-model approach in a spatially adaptive manner. The algorithm itself, and test results on simulated and actual hyperspectral image data, are presented in this paper.
Nonlinear hyperspectral unmixing based on sparse non-negative matrix factorization
NASA Astrophysics Data System (ADS)
Li, Jing; Li, Xiaorun; Zhao, Liaoying
2016-01-01
Hyperspectral unmixing aims at extracting pure material spectra, along with their corresponding proportions, from a mixed pixel. Because they model the distribution of real materials more accurately, nonlinear mixing models (non-LMMs) are usually considered to perform better than LMMs in complicated scenarios. In past years, numerous nonlinear models have been successfully applied to hyperspectral unmixing. However, most non-LMMs only consider the sum-to-one or positivity constraints, while the widespread sparsity of real material mixtures cannot be ignored: a pixel is usually composed of the spectral signatures of only a few materials from the full pure-pixel set. Thus, in this paper, a smooth sparsity constraint is incorporated into the state-of-the-art Fan nonlinear model to exploit the sparsity inherent in nonlinear mixing and use it to enhance unmixing performance. This sparsity-constrained Fan model is solved with nonnegative matrix factorization. The algorithm was applied to synthetic and real hyperspectral data and showed its advantage over competing algorithms in the experiments.
Spectral unmixing of urban land cover using a generic library approach
NASA Astrophysics Data System (ADS)
Degerickx, Jeroen; Iordache, Marian-Daniel; Okujeni, Akpona; Hermy, Martin; van der Linden, Sebastian; Somers, Ben
2016-10-01
Remote sensing based land cover classification in urban areas generally requires the use of subpixel classification algorithms to take into account the high spatial heterogeneity. These spectral unmixing techniques often rely on spectral libraries, i.e. collections of pure material spectra (endmembers, EM), which ideally cover the large EM variability typically present in urban scenes. Despite the advent of several (semi-) automated EM detection algorithms, the collection of such image-specific libraries remains a tedious and time-consuming task. As an alternative, we suggest the use of a generic urban EM library, containing material spectra under varying conditions, acquired from different locations and sensors. This approach requires an efficient EM selection technique, capable of only selecting those spectra relevant for a specific image. In this paper, we evaluate and compare the potential of different existing library pruning algorithms (Iterative Endmember Selection and MUSIC) using simulated hyperspectral (APEX) data of the Brussels metropolitan area. In addition, we develop a new hybrid EM selection method which is shown to be highly efficient in dealing with both image-specific and generic libraries, subsequently yielding more robust land cover classification results compared to existing methods. Future research will include further optimization of the proposed algorithm and additional tests on both simulated and real hyperspectral data.
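Library pruning of the MUSIC kind mentioned above can be sketched by projecting each library spectrum onto the image's signal subspace and keeping the spectra with the smallest residuals. The subspace dimension, the normalization, and the keep count in the code below are assumptions for illustration, not the hybrid selection method proposed in the paper.

```python
import numpy as np

def music_prune(Y, D, subspace_dim, keep):
    """MUSIC-style pruning of a generic spectral library (assumed sketch).

    Y : (bands, pixels) image data; D : (bands, n_lib) spectral library.
    Rank library spectra by their residual against the image signal subspace."""
    U, _, _ = np.linalg.svd(Y, full_matrices=False)
    Us = U[:, :subspace_dim]                       # signal subspace of the image
    proj = Us @ (Us.T @ D)                         # projection of each library spectrum
    resid = np.linalg.norm(D - proj, axis=0) / (np.linalg.norm(D, axis=0) + 1e-12)
    order = np.argsort(resid)                      # small residual = relevant spectrum
    return D[:, order[:keep]], order[:keep]
```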
Accuracy assessment of linear spectral mixture model due to terrain undulation
NASA Astrophysics Data System (ADS)
Wang, Tianxing; Chen, Songlin; Ma, Ya
2008-12-01
Mixed spectra are common in remote sensing due to the limitations of spatial resolution and the heterogeneity of the land surface. During the past 30 years, many subpixel models have been developed to investigate the information within mixed pixels. The linear spectral mixture model (LSMM) is a simple and general subpixel model. LSMM, also known as spectral mixture analysis, is a widely used procedure to determine the proportions of endmembers (constituent materials) within a pixel based on the endmembers' spectral characteristics. The unmixing accuracy of LSMM is restricted by a variety of factors, but research on LSMM has mostly focused on assessing the nonlinear effects of the model itself and on techniques used to select endmembers; unfortunately, environmental conditions of the study area that can affect the unmixing accuracy, such as atmospheric scattering and terrain undulation, have not been studied. This paper focuses on the accuracy uncertainty of LSMM resulting from terrain undulation. An ASTER dataset was chosen and the C terrain correction algorithm was applied to it. On this basis, fractional abundances for different cover types were extracted from both pre- and post-C terrain illumination corrected ASTER data using LSMM. Regression analyses and an IKONOS image were used to assess the unmixing accuracy. Results showed that terrain undulation can dramatically constrain the application of LSMM in mountainous areas. Specifically, for vegetation abundances, an improvement in unmixing accuracy of 17.6% (regression against NDVI) and 18.6% (regression against MVI) in R2 was achieved by removing terrain undulation. This study thus indicates in a quantitative way that effective removal or minimization of terrain illumination effects is essential when applying LSMM, and it provides a new example for LSMM applications in mountainous areas. In addition, the methods employed in this study can be used to evaluate different terrain correction algorithms in further studies.
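For reference, the LSMM inversion itself (nonnegative abundances that approximately sum to one) is often computed per pixel with the classical row-augmentation trick fed to nonnegative least squares, as in the sketch below. The weighting constant and function names are assumptions, and this is the generic fully constrained least squares step, not this paper's processing chain.

```python
import numpy as np
from scipy.optimize import nnls

def fcls(E, y, delta=1e3):
    """Fully constrained LSMM inversion for one pixel: nonnegative abundances
    that approximately sum to one, via row augmentation plus NNLS.

    E : (bands, p) endmember matrix, y : (bands,) pixel spectrum."""
    # append a heavily weighted row of ones so the NNLS solution nearly sums to one
    E_aug = np.vstack([E, delta * np.ones((1, E.shape[1]))])
    y_aug = np.append(y, delta)
    a, _ = nnls(E_aug, y_aug)
    return a
```

A larger delta enforces the sum-to-one constraint more strictly at the cost of a slightly worse spectral fit, so in practice it is chosen a few orders of magnitude above the reflectance scale.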
NASA Astrophysics Data System (ADS)
Mikheeva, Anna; Moiseev, Pavel
2017-04-01
In mountain territories climate change affects forest productivity and growth, which results in the tree line advancing and the forest density increasing. These changes pose new challenges for forest managers whose responsibilities include forest resources inventory, monitoring and protection of ecosystems, and assessment of forest vulnerability. These activities require a range of sources of information, including exact areas of forested land, forest densities and species abundances. Picea obovata, the dominant tree species in the South-Ural State Natural Reserve, Russia, has regenerated, propagated and increased its relative cover during the recent 70 years. A remarkable shift of the upper limit of Picea obovata by 60-80 m upslope was registered by repeat photography, especially on gentle slopes. The stands of Picea obovata are monitored by Reserve inspectors on test plots to ensure that forests maintain or improve their productivity; these studies also include projective cover measurements. However, it is impossible to cover the entire territory of the Reserve with detailed field observations. Remote sensing data from Terra ASTER imagery provide valuable information for large territories (a scene covers an area of 60 x 60 km) and can be used for quantitative mapping of forest and non-forest vegetation at regional scale (the spatial resolution is 15-30 m for the visible and infrared bands). A case study of estimating Picea obovata abundance was conducted for forest and forest-tundra sites of the Zigalga Range, using 9-band ASTER multispectral imagery of 23.08.2007, field data and a spectral unmixing algorithm. This type of algorithm derives an object and its abundance from a mixed pixel of multispectral imagery, which can then be converted to the object's projective cover. Atmospheric correction was applied to the imagery prior to spectral unmixing, and then pure spectra of Picea obovata were extracted from the image at 10 points and averaged. These points are located in the Zigalga Range and were visited in summer 2016. We used the Mixture-Tuned Matched Filtering (MTMF) algorithm, a non-linear subpixel classification technique which allows separation of a spectral mixture containing unknown objects and derivation of only the known ones. The results of the spectral unmixing classification were abundance maps of Picea obovata. The values were statistically screened (only abundances with high probability of presence and low probability of absence were retained) and then constrained to the interval [0, 1]. Verification of the maps was carried out at sites in the Iremel Mountains on the same ASTER image, where the projective cover of Picea obovata was measured in the field at 147 points. The correlation coefficient between the spectral unmixing abundances and the field-measured abundances was 0.7; this moderate value is due to the low sensitivity of the algorithm to abundances below 0.25. The proposed method provides a tool for defining the Picea obovata boundaries more accurately than per-pixel automatic classification and for locating new spruce islands in the mixed tree line environment. The abundances can be obtained for large areas with minimal field work, which makes this approach cost-effective in providing timely information to nature reserve managers for adapting forest management actions to climate change.
Band selection using forward feature selection algorithm for citrus Huanglongbing disease detection
USDA-ARS?s Scientific Manuscript database
This study attempted to classify spectrally similar data – obtained from aerial images of healthy citrus plants and the citrus greening disease (Huanglongbing) infected plants - using small differences without un-mixing the endmember components and therefore without the need for endmember library. H...
Automating spectral unmixing of AVIRIS data using convex geometry concepts
NASA Technical Reports Server (NTRS)
Boardman, Joseph W.
1993-01-01
Spectral mixture analysis, or unmixing, has proven to be a useful tool in the semi-quantitative interpretation of AVIRIS data. Using a linear mixing model and a set of hypothesized endmember spectra, unmixing seeks to estimate the fractional abundance patterns of the various materials occurring within the imaged area. However, the validity and accuracy of the unmixing rest heavily on the 'user-supplied' set of endmember spectra. Current methods for endmember determination are the weak link in the unmixing chain.
Context Dependent Spectral Unmixing
2014-08-01
the target sizes). The targets were made of 100% cotton fabric and were emplaced so that there would be representatives of each color type completely...method for simplex-based endmember extraction algorithm," IEEE Transactions on Geoscience and Remote Sensing, vol. 44, no. 10, pp. 2804-2819, 2006. [68
Estimating forest species abundance through linear unmixing of CHRIS/PROBA imagery
NASA Astrophysics Data System (ADS)
Stagakis, Stavros; Vanikiotis, Theofilos; Sykioti, Olga
2016-09-01
The advancing technology of hyperspectral remote sensing offers the opportunity of accurate land cover characterization of complex natural environments. In this study, a linear spectral unmixing algorithm that incorporates a novel hierarchical Bayesian approach (BI-ICE) was applied on two spatially and temporally adjacent CHRIS/PROBA images over a forest in North Pindos National Park (Epirus, Greece). The scope is to investigate the potential of this algorithm to discriminate two different forest species (i.e. beech - Fagus sylvatica, pine - Pinus nigra) and produce accurate species-specific abundance maps. The unmixing results were evaluated in uniformly distributed plots across the test site using measured fractions of each species derived by very high resolution aerial orthophotos. Landsat-8 images were also used to produce a conventional discrete-type classification map of the test site. This map was used to define the exact borders of the test site and compare the thematic information of the two mapping approaches (discrete vs abundance mapping). The required ground truth information, regarding training and validation of the applied mapping methodologies, was collected during a field campaign across the study site. Abundance estimates reached very good overall accuracy (R2 = 0.98, RMSE = 0.06). The most significant source of error in our results was due to the shadowing effects that were very intense in some areas of the test site due to the low solar elevation during CHRIS acquisitions. It is also demonstrated that the two mapping approaches are in accordance across pure and dense forest areas, but the conventional classification map fails to describe the natural spatial gradients of each species and the actual species mixture across the test site. Overall, the BI-ICE algorithm presented increased potential to unmix challenging objects with high spectral similarity, such as different vegetation species, under real and not optimum acquisition conditions. Its full potential remains to be investigated in further and more complex study sites in view of the upcoming satellite hyperspectral missions.
NASA Astrophysics Data System (ADS)
Benhalouche, Fatima Zohra; Karoui, Moussa Sofiane; Deville, Yannick; Ouamri, Abdelaziz
2017-04-01
This paper proposes three multisharpening approaches to enhance the spatial resolution of urban hyperspectral remote sensing images. These approaches, related to linear-quadratic spectral unmixing techniques, use a linear-quadratic nonnegative matrix factorization (NMF) multiplicative algorithm. These methods begin by unmixing the observable high-spectral/low-spatial resolution hyperspectral and high-spatial/low-spectral resolution multispectral images. The obtained high-spectral/high-spatial resolution features are then recombined, according to the linear-quadratic mixing model, to obtain an unobservable multisharpened high-spectral/high-spatial resolution hyperspectral image. In the first designed approach, hyperspectral and multispectral variables are independently optimized, once they have been coherently initialized. These variables are alternately updated in the second designed approach. In the third approach, the considered hyperspectral and multispectral variables are jointly updated. Experiments, using synthetic and real data, are conducted to assess the efficiency, in spatial and spectral domains, of the designed approaches and of linear NMF-based approaches from the literature. Experimental results show that the designed methods globally yield very satisfactory spectral and spatial fidelities for the multisharpened hyperspectral data. They also prove that these methods significantly outperform the used literature approaches.
NASA Astrophysics Data System (ADS)
Behrooz, Ali; Vasquez, Kristine O.; Waterman, Peter; Meganck, Jeff; Peterson, Jeffrey D.; Miller, Peter; Kempner, Joshua
2017-02-01
Intraoperative resection of tumors currently relies upon the surgeon's ability to visually locate and palpate tumor nodules. Undetected residual malignant tissue often results in the need for additional treatment or surgical intervention. The Solaris platform is a multispectral open-air fluorescence imaging system designed for translational fluorescence-guided surgery. Solaris supports video-rate imaging in four fixed fluorescence channels ranging from visible to near infrared, and a multispectral channel equipped with a liquid crystal tunable filter (LCTF) for multispectral image acquisition (520-620 nm). Identification of tumor margins using reagents emitting in the visible spectrum (400-650 nm), such as fluorescein isothiocyanate (FITC), presents challenges given the presence of auto-fluorescence from tissue and food in the gastrointestinal (GI) tract. To overcome this, Solaris acquires LCTF-based multispectral images, and by applying an automated spectral unmixing algorithm to the data, separates reagent fluorescence from tissue and food auto-fluorescence. The unmixing algorithm uses vertex component analysis to automatically extract the primary pure spectra, and resolves the reagent fluorescent signal using non-negative least squares. For validation, intraoperative in vivo studies were carried out in tumor-bearing rodents injected with FITC-dextran reagent that primarily resides in malignant tissue 24 hours post injection. In the absence of unmixing, fluorescence from tumors is not distinguishable from that of surrounding tissue. Upon spectral unmixing, the FITC-labeled malignant regions become well defined and detectable. The results of these studies substantiate the multispectral power of Solaris in resolving FITC-based agent signal in deep tumor masses, under ambient and surgical light, and enhancing the ability to surgically resect them.
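The non-negative least squares step described above can be sketched per pixel as below, with the pure spectra assumed given (in the platform they would come from vertex component analysis or a reference library). The array shapes and function names are assumptions for illustration, not the Solaris implementation.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_multispectral_cube(cube, pure_spectra):
    """Per-pixel NNLS unmixing of a multispectral fluorescence cube (assumed sketch).

    cube : (rows, cols, n_bands) image stack across the tunable-filter bands
    pure_spectra : (n_bands, n_components), e.g. one column for the reagent
                   and others for tissue/food autofluorescence."""
    rows, cols, bands = cube.shape
    n_comp = pure_spectra.shape[1]
    abundance = np.zeros((rows, cols, n_comp))
    for r in range(rows):
        for c in range(cols):
            abundance[r, c], _ = nnls(pure_spectra, cube[r, c])
    return abundance   # abundance[..., 0] could then be displayed as the reagent map
```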
Quantitative detection of settled dust over green canopy
NASA Astrophysics Data System (ADS)
Brook, Anna
2016-04-01
The main task of environmental and geoscience applications is the efficient and accurate quantitative classification of earth surfaces and spatial phenomena. In the past decade, there has been significant interest in employing hyperspectral unmixing in order to retrieve accurate quantitative information latent in hyperspectral imagery. Recently, ground-truth and laboratory-measured spectral signatures, combined with advanced algorithms, have been proposed as a new path toward solving the unmixing problem of hyperspectral imagery in a semi-supervised fashion. This paper suggests that the sensitivity of sparse unmixing techniques provides an ideal approach to extract and identify dust settled over green vegetation canopy using hyperspectral airborne data. Atmospheric dust transports a variety of chemicals, some of which pose a risk to the ecosystem and human health (Kaskaoutis et al., 2008). Many studies deal with the impact of dust on particulate matter (PM) and atmospheric pollution. Considering the potential impact of industrial pollutants, one of the most important considerations is the fact that suspended PM can have both a physical and a chemical impact on plants, soils, and water bodies. Not only can the particles covering surfaces cause physical distortion, but particles of diverse origin and different chemistries can also serve as chemical stressors and cause irreversible damage. Sediment dust load in an indoor environment can be spectrally assessed using reflectance spectroscopy (Chudnovsky and Ben-Dor, 2009). Small amounts of particulate pollution that may carry the signature of a forthcoming environmental hazard are of key interest when considering the effects of pollution. In its most basic distribution dynamics, dust consists of suspended particulate matter in a fine state of subdivision that is raised and carried by wind. In this context, it is increasingly important first to understand the distribution dynamics of pollutants, and subsequently to develop dedicated tools and measures to control and monitor pollutants in the free environment. The earliest effect of settled polluted dust particles is not always reflected in poor condition of vegetation or soils, or in any visible damage. In most cases, there is a rather long accumulation process that progresses from a polluted condition to a long-term environmental hazard. Although experiments with pollutant analog powders under controlled conditions have tended to confirm the findings from field studies (Brook, 2014), a major criticism of all these experiments is their short duration. The resulting conclusion is that it is difficult, if not impossible, to determine the implications of long-term exposure to realistic concentrations of pollutants from such short-term studies. Hyperspectral remote sensing (HRS) has become a common tool for environmental and geoscience applications. HRS has opened new opportunities for exploring a wide range of materials and evaluating a variety of natural processes due to its detailed, specific, and extensive spectral and spatial information. Hyperspectral unmixing (HU) is the technique of inferring the category types that constitute a mixed pixel and their mixing ratios (Keshava and Mustard, 2002). In general, the task of unmixing is to decompose the reflectance spectrum of each pixel into a set of endmembers, or principal combined spectra, and their corresponding abundances (Bioucas-Dias et al., 2012).
This study suggests that the sensitivity of sparse unmixing techniques provides an ideal approach to extract and identify dust settled over green vegetation canopy using hyperspectral airborne data. Among the available techniques, this study presents results of seven linear and non-linear unmixing algorithms: 1) Non-negative Matrix Factorization (NMF), 2) L1 sparsity-constrained NMF (L1-NMF), 3) L1/2 sparsity-constrained NMF (L1/2-NMF), 4) Graph regularized NMF (G-NMF), 5) Structured Sparse NMF (SS-NMF), 6) Alternating Least Squares (ALS), and 7) Lin's Projected Gradient (LPG). The performance is evaluated on real hyperspectral imagery via detailed experimental assessment. The study showed that state-of-the-art solutions provide content-adapted sparse representations in certain compression tasks. The NMF algorithm estimates endmembers that are used to remove spurious information; if computationally feasible, it should include interaction terms to make the model more flexible. The optimal NMF algorithms, such as ALS and LPG, are assumed to be the simplest methods that achieve the minimum error on the test set. In summary, this work shows that sediment dust can be assessed using airborne HSI data, making it a potentially powerful tool for environmental studies. References: Keshava, N., & Mustard, J. (2002). Spectral unmixing. IEEE Signal Processing Magazine, 19(1), 44-57. Chudnovsky, A., & Ben-Dor, E. (2009). Reflectance spectroscopy as a tool for settled dust monitoring in office environment. International Journal of Environment and Waste Management, 4(1), 32-49. Brook, A. (2014). Quantitative detection of settled dust over green canopy using sparse unmixing of airborne hyperspectral data. IEEE WHISPERS 6th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, Switzerland, 4-8. Bioucas-Dias, J. M., et al. (2012). Hyperspectral unmixing overview: Geometrical, statistical, and sparse regression-based approaches. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 5(2), 354-379.
[Orthogonal Vector Projection Algorithm for Spectral Unmixing].
Song, Mei-ping; Xu, Xing-wei; Chang, Chein-I; An, Ju-bai; Yao, Li
2015-12-01
Spectral unmixing is an important part of hyperspectral technologies and is essential for material quantification in hyperspectral imagery. Most linear unmixing algorithms require matrix multiplication together with matrix inversion or matrix determinant computation. These operations are difficult to program and especially hard to realize in hardware. At the same time, the computational cost of the algorithms increases significantly as the number of endmembers grows. Here, based on the traditional Orthogonal Subspace Projection algorithm, a new method called Orthogonal Vector Projection is proposed using the orthogonality principle. It simplifies the process by avoiding matrix multiplication and inversion. It first computes, for each endmember spectrum, the final orthogonal vector via the Gram-Schmidt process. These orthogonal vectors are then used as projection vectors for the pixel signature. The unconstrained abundance is obtained directly by projecting the signature onto the projection vectors and computing the ratio of the projected vector length to the orthogonal vector length. Compared with the Orthogonal Subspace Projection and Least Squares Error algorithms, this method does not need matrix inversion, which is computationally costly and hard to implement in hardware. It completes the orthogonalization process through repeated vector operations, making it easy to apply in both parallel computation and hardware. The reasonableness of the algorithm is shown through its relationship with the Orthogonal Subspace Projection and Least Squares Error algorithms, and its computational complexity, compared with those two algorithms, is the lowest. Finally, experimental results on synthetic and real images provide further evidence of the effectiveness of the method.
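A minimal sketch of the projection step described above, assuming an endmember matrix E and a pixel spectrum x; np.linalg.qr stands in for the explicit Gram-Schmidt recursion, and the function name is illustrative rather than taken from the paper.

```python
import numpy as np

def ovp_abundances(E, x):
    """Sketch of orthogonal-vector-projection abundance estimation.

    E : (bands, p) endmember matrix;  x : (bands,) pixel spectrum.
    For each endmember, the component orthogonal to the span of the
    remaining endmembers is formed; the unconstrained abundance is the
    ratio of the pixel's projection onto that vector to the endmember's
    own projection, so no matrix inversion is required.
    """
    E = np.asarray(E, dtype=float)
    x = np.asarray(x, dtype=float)
    p = E.shape[1]
    a = np.zeros(p)
    for i in range(p):
        others = np.delete(E, i, axis=1)
        Q, _ = np.linalg.qr(others)           # orthonormal basis of the other endmembers
        q = E[:, i] - Q @ (Q.T @ E[:, i])     # part of e_i orthogonal to the others
        a[i] = (q @ x) / (q @ E[:, i])        # ratio of projected lengths
    return a
```

Because q is perpendicular to all the other endmembers, the ratio of the two projections reproduces the unconstrained least-squares abundance without any matrix inversion.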
Pigments identification of paintings using subspace distance unmixing algorithm
NASA Astrophysics Data System (ADS)
Li, Bin; Lyu, Shuqiang; Zhang, Dafeng; Dong, Qinghao
2018-04-01
In the digital protection of cultural relics, identification of the pigment mixtures on the surface of a painting has been a research focus for many years. In this paper, a hyperspectral unmixing algorithm, subspace distance unmixing, is introduced to solve the problem of recognizing pigment mixtures in paintings. First, mixtures of different pigments are prepared and their reflectance spectra are measured with a spectrometer. The factors affecting the unmixing accuracy of the pigment mixtures are then discussed. The unmixing results of two cases, with and without the rice paper and its underlay included as endmembers, are compared. The experimental results show that the algorithm is able to unmix the pigments effectively and that the unmixing accuracy can be improved by considering the influence of the spectra of the rice paper and the underlying material.
Evaluation of algorithm methods for fluorescence spectra of cancerous and normal human tissues
NASA Astrophysics Data System (ADS)
Pu, Yang; Wang, Wubao; Alfano, Robert R.
2016-03-01
This paper focuses on algorithms that unravel fluorescence spectra by unmixing methods to distinguish cancerous from normal human tissues using measured fluorescence spectroscopy. The biochemical or morphologic changes that cause fluorescence spectral variations appear earlier than those detectable by the histological approach; therefore, fluorescence spectroscopy holds great promise as a clinical tool for diagnosing early-stage carcinomas and other diseases in vivo. The method can further identify tissue biomarkers by decomposing the spectral contributions of different fluorescent molecules of interest. In this work, we investigate the performance of blind source unmixing methods (backward model) and spectral fitting approaches (forward model) in decomposing the contributions of key fluorescent molecules from the tissue mixture background when a selected excitation wavelength is applied. Pairs of adenocarcinoma and normal tissues, confirmed by a pathologist, were excited at a selected wavelength of 340 nm. The emission spectra of resected fresh tissue were used to evaluate the relative changes of collagen, reduced nicotinamide adenine dinucleotide (NADH), and flavin by various spectral unmixing methods. Two categories of algorithms, forward methods and blind source separation [such as Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Nonnegative Matrix Factorization (NMF)], are introduced and evaluated. The purpose of the spectral analysis is to discard the redundant information that conceals the difference between these two types of tissue while keeping their diagnostic significance. The predictions made by the different methods were compared to the gold standard of histopathology. The results indicate that key fluorophores within tissue, e.g., tryptophan, collagen, NADH, and flavin, show differences in relative content between human cancerous and normal tissues. Sensitivity, specificity, and the receiver operating characteristic (ROC) are finally employed as criteria to evaluate the efficacy of these methods in cancer detection. The underlying physical and biological basis for these optical approaches is discussed with examples. This ex vivo preliminary trial demonstrates that the different criteria from the different methods can distinguish carcinoma from normal tissues with good sensitivity and specificity; among them, ICA appears to be the superior method in prediction accuracy.
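The backward (blind) decompositions mentioned above can be prototyped with off-the-shelf tools; the sketch below uses placeholder emission spectra and an assumed three-component model (nominally collagen, NADH, and flavin) and is not the authors' pipeline.

```python
import numpy as np
from sklearn.decomposition import NMF, FastICA

# Placeholder emission spectra: (n_samples, n_wavelengths), non-negative.
spectra = np.abs(np.random.default_rng(0).normal(size=(40, 512)))

# Backward (blind) decomposition into three basis spectra; assigning the
# components to specific fluorophores still needs expert review.
nmf = NMF(n_components=3, init="nndsvda", max_iter=500)
weights_nmf = nmf.fit_transform(spectra)   # per-tissue-sample contributions
basis_nmf = nmf.components_                # recovered basis emission spectra

ica = FastICA(n_components=3, random_state=0)
sources_ica = ica.fit_transform(spectra.T) # statistically independent spectral components
```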
Sparsely-sampled hyperspectral stimulated Raman scattering microscopy: a theoretical investigation
NASA Astrophysics Data System (ADS)
Lin, Haonan; Liao, Chien-Sheng; Wang, Pu; Huang, Kai-Chih; Bouman, Charles A.; Kong, Nan; Cheng, Ji-Xin
2017-02-01
A hyperspectral image corresponds to a data cube with two spatial dimensions and one spectral dimension. Through linear unmixing, hyperspectral images can be decomposed into spectral signatures of pure components as well as their concentration maps. Owing to this distinct advantage in component identification, hyperspectral imaging has become a rapidly emerging platform for engineering better medicine and expediting scientific discovery. Among various hyperspectral imaging techniques, hyperspectral stimulated Raman scattering (HSRS) microscopy acquires data in a pixel-by-pixel scanning manner. Nevertheless, the current image acquisition speed for HSRS is insufficient to capture the dynamics of freely moving subjects. Instead of reducing the pixel dwell time to achieve a speed-up, which would inevitably decrease the signal-to-noise ratio (SNR), we propose to reduce the total number of sampled pixels. The locations of the sampled pixels are carefully engineered with a triangular-wave Lissajous trajectory. A model-based image inpainting algorithm then recovers the complete data for linear unmixing. Simulation results show that with careful selection of the trajectory, a fill rate as low as 10% is sufficient to generate accurate linear unmixing results. The proposed framework applies to any hyperspectral beam-scanning imaging platform that demands high acquisition speed.
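A small sketch of how a triangular-wave Lissajous sampling mask and its fill rate might be generated; the grid size, scan frequencies, and sample count are arbitrary assumptions, and the inpainting and unmixing steps are omitted.

```python
import numpy as np

def lissajous_mask(nx, ny, fx, fy, n_samples):
    """Binary sampling mask for a triangular-wave Lissajous trajectory (sketch)."""
    t = np.linspace(0.0, 1.0, n_samples)

    def tri(f):
        # symmetric triangle wave in [-1, 1] at frequency f
        return 2.0 * np.abs(2.0 * (f * t % 1.0) - 1.0) - 1.0

    x = np.clip(np.round((tri(fx) + 1.0) / 2.0 * (nx - 1)).astype(int), 0, nx - 1)
    y = np.clip(np.round((tri(fy) + 1.0) / 2.0 * (ny - 1)).astype(int), 0, ny - 1)
    mask = np.zeros((ny, nx), dtype=bool)
    mask[y, x] = True                      # mark every visited pixel
    return mask, mask.mean()               # mask and achieved fill rate

mask, fill_rate = lissajous_mask(256, 256, fx=13, fy=17, n_samples=20000)
```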
Comparing performance of standard and iterative linear unmixing methods for hyperspectral signatures
NASA Astrophysics Data System (ADS)
Gault, Travis R.; Jansen, Melissa E.; DeCoster, Mallory E.; Jansing, E. David; Rodriguez, Benjamin M.
2016-05-01
Linear unmixing is a method of decomposing a mixed signature to determine the component materials that are present in a sensor's field of view, along with the abundances at which they occur. Linear unmixing assumes that energy from the materials in the field of view is mixed in a linear fashion across the spectrum of interest. Traditional unmixing methods can take advantage of adjacent pixels in the decomposition algorithm, but this is not the case for point sensors. This paper explores several iterative and non-iterative methods for linear unmixing and examines their effectiveness at identifying the individual signatures that make up simulated single-pixel mixed signatures, along with their corresponding abundances. The major hurdle addressed in the proposed method is that no neighboring pixel information is available for the spectral signature of interest. Testing is performed using two collections of spectral signatures from the Johns Hopkins University Applied Physics Laboratory's Signatures Database software (SigDB): a hand-selected small dataset of 25 distinct signatures, drawn from a larger dataset of approximately 1600 pure visible/near-infrared/short-wave-infrared (VIS/NIR/SWIR) spectra, and the full larger dataset. Simulated spectra are created with three- and four-material mixtures randomly drawn from a dataset originating from SigDB, where the abundance of one material is swept in 10% increments from 10% to 90%, with the abundances of the other materials equally divided amongst the remainder. For the smaller dataset of 25 signatures, all combinations of three or four materials are used to create simulated spectra, from which the accuracy of the materials returned, as well as the correctness of the abundances, is compared to the inputs. The experiment is expanded to include the signatures from the larger dataset of almost 1600 signatures, evaluated using a Monte Carlo scheme with 5000 draws of three or four materials to create the simulated mixed signatures. The spectral similarity of the inputs to the output component signatures is calculated using the spectral angle mapper. Results show that iterative methods significantly outperform the traditional methods under the given test conditions.
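A brief sketch of the mixture simulation and spectral-angle comparison described above, with placeholder library spectra; the sweep of one abundance from 10% to 90% with the remainder split equally mirrors the stated protocol, but the actual SigDB signatures are not reproduced here.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle mapper value (radians) between two signatures."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(c, -1.0, 1.0))

rng = np.random.default_rng(1)
library = rng.random((3, 200))             # placeholder pure spectra (3 materials x 200 bands)

for a1 in np.arange(0.1, 1.0, 0.1):
    abundances = np.array([a1, (1.0 - a1) / 2.0, (1.0 - a1) / 2.0])
    mixed = abundances @ library           # linearly mixed single-pixel signature
    angles = [spectral_angle(mixed, s) for s in library]   # similarity to each input
```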
Mapping target signatures via partial unmixing of AVIRIS data
NASA Technical Reports Server (NTRS)
Boardman, Joseph W.; Kruse, Fred A.; Green, Robert O.
1995-01-01
A complete spectral unmixing of a complicated AVIRIS scene may not always be possible or even desired. High quality data of spectrally complex areas are very high dimensional and are consequently difficult to fully unravel. Partial unmixing provides a method of solving only that fraction of the data inversion problem that directly relates to the specific goals of the investigation. Many applications of imaging spectrometry can be cast in the form of the following question: 'Are my target signatures present in the scene, and if so, how much of each target material is present in each pixel?' This is a partial unmixing problem. The number of unmixing endmembers is one greater than the number of spectrally defined target materials. The one additional endmember can be thought of as the composite of all the other scene materials, or 'everything else'. Several workers have proposed partial unmixing schemes for imaging spectrometry data, but each has significant limitations for operational application. The low probability detection methods described by Farrand and Harsanyi and the foreground-background method of Smith et al are both examples of such partial unmixing strategies. The new method presented here builds on these innovative analysis concepts, combining their different positive attributes while attempting to circumvent their limitations. This new method partially unmixes AVIRIS data, mapping apparent target abundances, in the presence of an arbitrary and unknown spectrally mixed background. It permits the target materials to be present in abundances that drive significant portions of the scene covariance. Furthermore it does not require a priori knowledge of the background material spectral signatures. The challenge is to find the proper projection of the data that hides the background variance while simultaneously maximizing the variance amongst the targets.
A novel edge-preserving nonnegative matrix factorization method for spectral unmixing
NASA Astrophysics Data System (ADS)
Bao, Wenxing; Ma, Ruishi
2015-12-01
Spectral unmixing is one of the key techniques for identifying and classifying materials in hyperspectral image processing. A novel robust spectral unmixing method based on nonnegative matrix factorization (NMF) is presented in this paper. An edge-preserving function is used as a hypersurface cost function in minimizing the nonnegative matrix factorization. To minimize the hypersurface cost function, updating functions are constructed for the endmember signature matrix and the abundance fractions, respectively, and the two are updated alternately. For evaluation purposes, both synthetic and real data are used. The synthetic data are generated from endmembers in the USGS digital spectral library, and the AVIRIS Cuprite dataset is used as real data. The spectral angle distance (SAD) and abundance angle distance (AAD) are used to assess the performance of the proposed method. The experimental results show that this method obtains better results and higher accuracy for spectral unmixing than existing methods.
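The SAD and AAD figures of merit used above can be computed as plain vector angles; the sketch below assumes the estimated endmembers and abundances have already been matched to the reference ordering, and the function names are illustrative.

```python
import numpy as np

def _angle(u, v):
    """Angle in radians between two vectors."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

def sad(E_ref, E_est):
    """Mean spectral angle distance over matched endmember columns."""
    return np.mean([_angle(E_ref[:, k], E_est[:, k]) for k in range(E_ref.shape[1])])

def aad(A_ref, A_est):
    """Mean abundance angle distance over matched abundance rows."""
    return np.mean([_angle(A_ref[k], A_est[k]) for k in range(A_ref.shape[0])])
```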
NASA Astrophysics Data System (ADS)
Wright, L.; Coddington, O.; Pilewskie, P.
2017-12-01
Hyperspectral instruments are a growing class of Earth observing sensors designed to improve remote sensing capabilities beyond discrete multi-band sensors by providing tens to hundreds of continuous spectral channels. Improved spectral resolution, range and radiometric accuracy allow the collection of large amounts of spectral data, facilitating thorough characterization of both atmospheric and surface properties. We describe the development of an Informed Non-Negative Matrix Factorization (INMF) spectral unmixing method to exploit this spectral information and separate atmospheric and surface signals based on their physical sources. INMF offers marked benefits over other commonly employed techniques including non-negativity, which avoids physically impossible results; and adaptability, which tailors the method to hyperspectral source separation. The INMF algorithm is adapted to separate contributions from physically distinct sources using constraints on spectral and spatial variability, and library spectra to improve the initial guess. Using this INMF algorithm we decompose hyperspectral imagery from the NASA Hyperspectral Imager for the Coastal Ocean (HICO), with a focus on separating surface and atmospheric signal contributions. HICO's coastal ocean focus provides a dataset with a wide range of atmospheric and surface conditions. These include atmospheres with varying aerosol optical thicknesses and cloud cover. HICO images also provide a range of surface conditions including deep ocean regions, with only minor contributions from the ocean surfaces; and more complex shallow coastal regions with contributions from the seafloor or suspended sediments. We provide extensive comparison of INMF decomposition results against independent measurements of physical properties. These include comparison against traditional model-based retrievals of water-leaving, aerosol, and molecular scattering radiances and other satellite products, such as aerosol optical thickness from the Moderate Resolution Imaging Spectroradiometer (MODIS).
On the Use of FOSS4G in Land Cover Fraction Estimation with Unmixing Algorithms
NASA Astrophysics Data System (ADS)
Kumar, U.; Milesi, C.; Raja, K.; Ganguly, S.; Wang, W.; Zhang, G.; Nemani, R. R.
2014-12-01
The popularity and usage of FOSS4G (FOSS for Geoinformatics) have increased drastically in the last two decades, with benefits that facilitate spatial data analysis, image processing, graphics and map production, spatial modeling, and visualization. The objective of this paper is to use FOSS4G to implement and perform a quantitative analysis of three different unmixing algorithms, Constrained Least-Squares (CLS), Unconstrained Least-Squares, and Orthogonal Subspace Projection, to estimate land cover (LC) fractions from RS data. The LC fractions obtained by unmixing mixed pixels represent the mixture of more than one class per pixel, yielding more accurate LC abundance estimates. The algorithms were implemented in the C++ programming language with the OpenCV package (http://opencv.org/) and the boost C++ libraries (www.boost.org) in the NASA Earth Exchange at the NASA Advanced Supercomputing Facility. GRASS GIS was used for visualization of results, and statistical analysis was carried out in R in a Linux system environment. A set of global endmembers for substrate, vegetation, and dark objects was used to unmix the data with the three algorithms, and the results were compared with Singular Value Decomposition unmixed outputs available in the ENVI image processing software. First, computer-simulated data of different signal-to-noise ratios were used to evaluate the algorithms. The second set of experiments was carried out with a spectrally diverse collection of 11 Landsat-5 scenes (acquired in 2008) for an agricultural setup in Fresno, California, where the ground data were collected on the specific dates when the satellite passed over the site. Finally, in the third set of experiments, a pair of coincident clear-sky Landsat and WorldView-2 images for an urbanized area of San Francisco was used to assess the algorithms. Validation of the results using descriptive statistics, correlation coefficient (cc), RMSE, boxplots, and the bivariate distribution function indicated that with the computer-simulated data, CLS was better than the other techniques. With the real-world data of an agricultural landscape, CLS was superior to the other techniques, with a mean absolute error for all four methods close to 7.3%. For the urban setup, CLS demonstrated the highest average cc of 0.64 and the lowest average RMSE of 0.19 over all the endmembers.
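A minimal sketch of the constrained least-squares step for one pixel, using NNLS for non-negativity and a heavily weighted row of ones as a soft sum-to-one constraint; the endmember matrix, the weight delta, and the function name are assumptions, not the paper's C++/OpenCV implementation.

```python
import numpy as np
from scipy.optimize import nnls

def cls_fractions(endmembers, pixel, delta=1e3):
    """Constrained least-squares land-cover fractions for one pixel (sketch).

    endmembers : (bands, k) matrix of global endmember spectra
                 (e.g. substrate, vegetation, dark object)
    pixel      : (bands,) mixed reflectance
    NNLS enforces non-negativity; a heavily weighted row of ones (delta)
    imposes the sum-to-one constraint softly.
    """
    bands, k = endmembers.shape
    A = np.vstack([endmembers, delta * np.ones((1, k))])
    b = np.append(pixel, delta)
    fractions, _ = nnls(A, b)
    return fractions
```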
Kannan, R; Ievlev, A V; Laanait, N; Ziatdinov, M A; Vasudevan, R K; Jesse, S; Kalinin, S V
2018-01-01
Many spectral responses in materials science, physics, and chemistry experiments can be characterized as resulting from the superposition of a number of more basic individual spectra. In this context, unmixing is defined as the problem of determining the individual spectra, given measurements of multiple spectra that are spatially resolved across samples, as well as the determination of the corresponding abundance maps indicating the local weighting of each individual spectrum. Matrix factorization is a popular linear unmixing technique that considers that the mixture model between the individual spectra and the spatial maps is linear. Here, we present a tutorial paper targeted at domain scientists to introduce linear unmixing techniques, to facilitate greater understanding of spectroscopic imaging data. We detail a matrix factorization framework that can incorporate different domain information through various parameters of the matrix factorization method. We demonstrate many domain-specific examples to explain the expressivity of the matrix factorization framework and show how the appropriate use of domain-specific constraints such as non-negativity and sum-to-one abundance result in physically meaningful spectral decompositions that are more readily interpretable. Our aim is not only to explain the off-the-shelf available tools, but to add additional constraints when ready-made algorithms are unavailable for the task. All examples use the scalable open source implementation from https://github.com/ramkikannan/nmflibrary that can run from small laptops to supercomputers, creating a user-wide platform for rapid dissemination and adoption across scientific disciplines.
A hyperspectral image projector for hyperspectral imagers
NASA Astrophysics Data System (ADS)
Rice, Joseph P.; Brown, Steven W.; Neira, Jorge E.; Bousquet, Robert R.
2007-04-01
We have developed and demonstrated a Hyperspectral Image Projector (HIP) intended for system-level validation testing of hyperspectral imagers, including the instrument and any associated spectral unmixing algorithms. HIP, based on the same digital micromirror arrays used in commercial digital light processing (DLP*) displays, is capable of projecting any combination of many different arbitrarily programmable basis spectra into each image pixel at up to video frame rates. We use a scheme whereby one micromirror array is used to produce light having the spectra of endmembers (i.e. vegetation, water, minerals, etc.), and a second micromirror array, optically in series with the first, projects any combination of these arbitrarily-programmable spectra into the pixels of a 1024 x 768 element spatial image, thereby producing temporally-integrated images having spectrally mixed pixels. HIP goes beyond conventional DLP projectors in that each spatial pixel can have an arbitrary spectrum, not just arbitrary color. As such, the resulting spectral and spatial content of the projected image can simulate realistic scenes that a hyperspectral imager will measure during its use. Also, the spectral radiance of the projected scenes can be measured with a calibrated spectroradiometer, such that the spectral radiance projected into each pixel of the hyperspectral imager can be accurately known. Use of such projected scenes in a controlled laboratory setting would alleviate expensive field testing of instruments, allow better separation of environmental effects from instrument effects, and enable system-level performance testing and validation of hyperspectral imagers as used with analysis algorithms. For example, known mixtures of relevant endmember spectra could be projected into arbitrary spatial pixels in a hyperspectral imager, enabling tests of how well a full system, consisting of the instrument + calibration + analysis algorithm, performs in unmixing (i.e. de-convolving) the spectra in all pixels. We discuss here the performance of a visible prototype HIP. The technology is readily extendable to the ultraviolet and infrared spectral ranges, and the scenes can be static or dynamic.
Quantitative detection of settled coal dust over green canopy
NASA Astrophysics Data System (ADS)
Brook, Anna; Sahar, Nir
2017-04-01
The main task of environmental and geoscience applications is the efficient and accurate quantitative classification of earth surfaces and spatial phenomena. In the past decade, there has been significant interest in employing spectral unmixing to retrieve accurate quantitative information latent in in situ data. Recently, ground-truth and laboratory-measured spectral signatures promoted by advanced algorithms have been proposed as a new path toward solving the unmixing problem in a semi-supervised fashion. This study presents a practical implementation of field spectroscopy as a quantitative tool to detect settled coal dust over green canopy in a free/open environment. Coal dust is a fine powdered form of coal, created by the crushing, grinding, and pulverizing of coal. Because of the brittle nature of coal, coal dust can be created during transportation or by mechanically handling coal. Coal dust, categorized at silt-clay particle size, is of particular concern due to heavy metals (lead, mercury, nickel, tin, cadmium, antimony, arsenic, and isotopes of thorium and strontium), which are toxic even at low concentrations. This hazard poses a risk to both the environment and public health. It has been identified by medical scientists around the world as causing a range of diseases and health problems, mainly heart and respiratory diseases such as asthma and lung cancer. This is because the fine, invisible coal dust particles (less than 2.5 microns) lodge in the lungs for long periods and are not naturally expelled, so long-term exposure increases the risk of health problems. Numerous studies report that the data needed to study the geographic distribution of very fine coal dust (smaller than PM2.5) and the related health impacts of coal exports are not being collected. Sediment dust load in an indoor environment can be spectrally assessed using reflectance spectroscopy (Chudnovsky and Ben-Dor, 2009). Small amounts of particulate pollution that may carry a signature of a forthcoming environmental hazard are of key interest when considering the effects of pollution. According to the most basic distribution dynamics, dust consists of suspended particulate matter in a fine state of subdivision that is raised and carried by wind. In this context, it is increasingly important to first understand the distribution dynamics of pollutants, and subsequently to develop dedicated tools and measures to control and monitor pollutants in the free environment. The earliest effect of settled polluted dust particles is not always reflected in poor condition of vegetation or soils, or in any visible damage. In most cases, there is a rather long accumulation process that graduates from a polluted condition to a long-term environmental and health-related hazard. Although experiments conducted with pollutant analog powders under controlled conditions have tended to confirm the findings from field studies (Brook, 2014; Brook and Ben-Dor, 2016; Brook, 2016), a major criticism of all these experiments is their short duration. The resulting conclusion is that it is difficult, if not impossible, to determine the implications of long-term exposure to realistic concentrations of pollutants from such short-term studies. In general, the task of unmixing is to decompose the reflectance spectrum into a set of endmembers, or principal combined spectra, and their corresponding abundances (Bioucas-Dias et al., 2012).
This study suggests that the sensitivity of sparse unmixing techniques provides an ideal approach to extract and identify coal dust settled over green vegetation canopy using in situ spectral data collected by a portable spectrometer. The optimal NMF algorithms, such as ALS and LPG, are assumed to be the simplest methods that achieve the minimum error. The suggested practical approach includes the following stages: 1. in situ spectral measurements; 2. near-real-time spectral data analysis; 3. estimated concentration of coal dust reported in mg/sq m. Stage 2 is completed by: 1. unmixing the green canopy and the settled dust to extract only the coal dust fraction; 2. converting the spectral feature of coal dust to a concentration via a PLSR spectral model. The PLSR model was trained and validated in the laboratory using spectra across the MIR (FTIR reflectance) and NIR regions together with XRD analysis. The obtained RMSE was satisfactory for both spectral regions. It was therefore concluded that field spectroscopy can be used for this purpose and can provide fully quantitative measures of settled coal dust. This approach (both the spectrometer and the algorithm) has now been accepted as a practical operational tool for environmental monitoring near the Orot Rabin power station in Hadera and will be used by the Sharon-Carmel Districts Municipal Association for Environmental Protection, Israel, as a regulatory tool. In summary, this work shows that coal dust can be assessed using in situ spectroscopy, making it a potentially powerful tool for environmental studies. References Chudnovsky, A., & Ben-Dor, E. (2009). Reflectance spectroscopy as a tool for settled dust monitoring in office environment. International Journal of Environment and Waste Management, 4(1), 32-49. Brook, A. (2014). Quantitative Detection of Settled Dust over Green Canopy using Sparse Unmixing of Airborne Hyperspectral Data. IEEE-Whispers 6th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, 2014, Switzerland, 4-8. Brook, A., & Ben-Dor, E. (2016). Quantitative detection of settled dust over green canopy using sparse unmixing of airborne hyperspectral data. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 9(2), 884-897. Brook, A. (2016). Quantitative Detection and Long-Term Monitoring of Settled Dust Using Semisupervised Learning for Spectral Data. Water, Air, & Soil Pollution, 227(3), 1-9. Bioucas-Dias, J.M., Plaza, A., Dobigeon, N., Parente, M., Du, Q., Gader, P., & Chanussot, J. (2012). Hyperspectral unmixing overview: Geometrical, statistical, and sparse regression-based approaches. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 5(2), 354-379. Keshava, N., & Mustard, J. (2002). Spectral unmixing. IEEE Signal Process. Mag., 19(1), 44-57.
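Stage 2's conversion from spectra to dust load can be prototyped with an ordinary PLSR fit; the sketch below uses synthetic placeholder spectra and loads, an assumed eight latent components, and scikit-learn rather than the authors' laboratory-calibrated model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Placeholder laboratory data: reflectance spectra and XRD-derived coal-dust
# loads (mg per square metre); sizes, names, and component count are assumptions.
rng = np.random.default_rng(0)
spectra = rng.random((120, 600))
load_mg_m2 = rng.uniform(0.0, 50.0, size=120)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, load_mg_m2, random_state=0)
pls = PLSRegression(n_components=8)
pls.fit(X_tr, y_tr)
rmse = np.sqrt(mean_squared_error(y_te, pls.predict(X_te).ravel()))
```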
Fast sparse Raman spectral unmixing for chemical fingerprinting and quantification
NASA Astrophysics Data System (ADS)
Yaghoobi, Mehrdad; Wu, Di; Clewes, Rhea J.; Davies, Mike E.
2016-10-01
Raman spectroscopy is a well-established spectroscopic method for the detection of condensed-phase chemicals. It is based on light scattered when a target material is exposed to a narrowband laser beam. The information generated enables presumptive identification by measuring correlation with library spectra. Whilst this approach is successful in identifying the chemical content of single-component samples, it is more difficult to apply to spectral mixtures. The capability of handling spectral mixtures is crucial for defence and security applications, as hazardous materials may be present as mixtures due to degradation, interferents or precursors. A novel method for spectral unmixing is proposed here. Most modern decomposition techniques are based on the sparse decomposition of the mixture and the application of extra constraints to preserve the sum of concentrations. These methods have often been proposed for passive spectroscopy, where spectral baseline correction is not required. Most successful methods are computationally expensive, e.g. convex optimisation and Bayesian approaches. We present a novel low-complexity sparsity-based method to decompose the spectra using a reference library of spectra. It can be implemented on a hand-held spectrometer in near real time. The algorithm is based on iteratively subtracting the contribution of selected spectra and updating the contribution of each spectrum. The core algorithm is called fast non-negative orthogonal matching pursuit, which has been proposed by the authors in the context of non-negative sparse representations. The iteration terminates when the maximum number of expected chemicals has been found or the residual spectrum has negligible energy, i.e. on the order of the noise level. A backtracking step removes the least contributing spectrum from the list of detected chemicals and reports it as an alternative component. This feature is particularly useful in the detection of chemicals with small contributions, which are normally not detected. The proposed algorithm is easily reconfigurable to include new library entries and optional preferential threat searches in the presence of predetermined threat indicators. Under Ministry of Defence funding, we have demonstrated the algorithm for fingerprinting and rough quantification of the concentration of chemical mixtures using a set of reference spectral mixtures. In our experiments, the algorithm successfully detected chemicals with concentrations below 10 percent. The running time of the algorithm is on the order of one second using a single core of a desktop computer.
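A rough sketch in the spirit of the greedy non-negative pursuit described above; it refits the active set with NNLS at each step rather than using the authors' fast update, and the library normalisation, tolerance, and component limit are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def greedy_nn_unmix(library, spectrum, max_components=3, tol=1e-3):
    """Greedy non-negative pursuit over a reference library (sketch).

    library  : (bands, n_refs) reference spectra, columns assumed unit-normalised
    spectrum : (bands,) measured mixture
    """
    residual = spectrum.astype(float).copy()
    active, c = [], np.zeros(0)
    coeffs = np.zeros(library.shape[1])
    for _ in range(max_components):
        corr = library.T @ residual
        corr[active] = -np.inf                       # do not reselect active entries
        k = int(np.argmax(corr))
        if corr[k] <= 0.0:
            break
        active.append(k)
        c, _ = nnls(library[:, active], spectrum)    # refit concentrations on the active set
        residual = spectrum - library[:, active] @ c
        if np.dot(residual, residual) < tol * np.dot(spectrum, spectrum):
            break                                    # residual energy near the noise level
    coeffs[active] = c
    return coeffs
```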
Pixel decomposition for tracking in low resolution videos
NASA Astrophysics Data System (ADS)
Govinda, Vivekanand; Ralph, Jason F.; Spencer, Joseph W.; Goulermas, John Y.; Yang, Lihua; Abbas, Alaa M.
2008-04-01
This paper describes a novel set of algorithms that allows indoor activity to be monitored using data from very low resolution imagers and other non-intrusive sensors. The objects are not resolved but activity may still be determined. This allows the use of such technology in sensitive environments where privacy must be maintained. Spectral un-mixing algorithms from remote sensing were adapted for this environment. These algorithms allow the fractional contributions from different colours within each pixel to be estimated and this is used to assist in the detection and monitoring of small objects or sub-pixel motion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Mary K.
The Koobi Fora Formation in northwestern Kenya has yielded more hominin fossils dated between 2.1 and 1.2 Ma than any other location on Earth. This research was undertaken to discover the spectral signatures of a portion of the Koobi Fora Formation using imagery from the DOE's Multispectral Thermal Imager (MTI) satellite. Creation of a digital geologic map from MTI imagery was a secondary goal of this research. MTI is unique amongst multispectral satellites in that it co-collects data from 15 spectral bands ranging from the visible to the thermal infrared, with a ground sample distance of 5 meters per pixel in the visible and 20 meters in the infrared. The map was created in two stages. The first was to correct the base MTI image using spatial accuracy assessment points collected in the field. The second was to mosaic various MTI images together to create the final Koobi Fora map. Absolute spatial accuracy of the final map product is 73 meters. The geologic classification of the Koobi Fora MTI map also took place in two stages. The field work stage involved locating outcrops of different lithologies within the Koobi Fora Formation. Field descriptions of these outcrops were made and their locations recorded. During the second stage, a linear spectral unmixing algorithm was applied to the MTI mosaic. In order to train the linear spectral unmixing algorithm, regions of interest representing four different classes of geologic material (tuff, alluvium, carbonate, and basalt), as well as a vegetation class, were defined within the MTI mosaic. The regions of interest were based upon the aforementioned field data as well as overlays of geologic maps from the 1976 Iowa State mapping project. Pure spectra were generated for each class from the regions of interest, and the unmixing algorithm then classified each pixel according to the relative percentage of classes found within the pixel, based upon the pure spectra values. A total of four unique combinations of geologic classes were analyzed using the algorithm. The tuffs within the Koobi Fora Formation were defined with 100% accuracy using a combination of pure spectra from the basalt, vegetation, and tuff classes.
Spectral mapping tools from the earth sciences applied to spectral microscopy data.
Harris, A Thomas
2006-08-01
Spectral imaging, originating from the field of earth remote sensing, is a powerful tool that is being increasingly used in a wide variety of applications for material identification. Several workers have used techniques like linear spectral unmixing (LSU) to discriminate materials in images derived from spectral microscopy. However, many spectral analysis algorithms rely on assumptions that are often violated in microscopy applications. This study explores algorithms originally developed as improvements on early earth imaging techniques that can be easily translated for use with spectral microscopy. To best demonstrate the application of earth remote sensing spectral analysis tools to spectral microscopy data, earth imaging software was used to analyze data acquired with a Leica confocal microscope with mechanical spectral scanning. For this study, spectral training signatures (often referred to as endmembers) were selected with the ENVI (ITT Visual Information Solutions, Boulder, CO) "spectral hourglass" processing flow, a series of tools that use the spectrally over-determined nature of hyperspectral data to find the most spectrally pure (or spectrally unique) pixels within the data set. This set of endmember signatures was then used in the full range of mapping algorithms available in ENVI to determine locations, and in some cases subpixel abundances of endmembers. Mapping and abundance images showed a broad agreement between the spectral analysis algorithms, supported through visual assessment of output classification images and through statistical analysis of the distribution of pixels within each endmember class. The powerful spectral analysis algorithms available in COTS software, the result of decades of research in earth imaging, are easily translated to new sources of spectral data. Although the scale between earth imagery and spectral microscopy is radically different, the problem is the same: mapping material locations and abundances based on unique spectral signatures. (c) 2006 International Society for Analytical Cytology.
USDA-ARS?s Scientific Manuscript database
This study evaluated linear spectral unmixing (LSU), mixture tuned matched filtering (MTMF) and support vector machine (SVM) techniques for detecting and mapping giant reed (Arundo donax L.), an invasive weed that presents a severe threat to agroecosystems and riparian areas throughout the southern ...
Improving Automated Endmember Identification for Linear Unmixing of HyspIRI Spectral Data.
NASA Astrophysics Data System (ADS)
Gader, P.
2016-12-01
The size of data sets produced by imaging spectrometers is increasing rapidly, and there is already a processing bottleneck. Part of the reason for this bottleneck is the need for expert input using interactive software tools. This process can be very time consuming and laborious but is currently crucial to ensuring the quality of the analysis. Automated algorithms can mitigate this problem. Although it is unlikely that processing systems can become completely automated, there is an urgent need to increase the level of automation. Spectral unmixing is a key component of processing HyspIRI data. Algorithms such as MESMA have been demonstrated to achieve good results but require careful, expert construction of endmember libraries. Unfortunately, many endmembers found by automated endmember-finding algorithms are deemed unsuitable by experts because they are not physically reasonable. Endmembers that are not physically reasonable can nonetheless achieve very low errors between the linear mixing model using those endmembers and the original data; therefore, this error is not a reasonable way to resolve the problem of "non-physical" endmembers. There are many potential approaches for resolving these issues, including the use of Bayesian priors, but very little attention has been given to this problem. The study reported on here considers a modification of the Sparsity Promoting Iterated Constrained Endmember (SPICE) algorithm. SPICE finds endmembers and abundances and estimates the number of endmembers. The SPICE algorithm seeks to minimize a quadratic objective function with respect to endmembers E and fractions P. The modified SPICE algorithm, which we refer to as SPICED, is obtained by adding a term D to the objective function. The term D pressures the algorithm to minimize the sum of the squared differences between each endmember and a weighted sum of the data. By appropriately modifying the weights, the endmembers are pushed towards a subset of the data, with the potential for becoming exactly equal to data points. The algorithm has been applied to spectral data, and the differences between the endmembers produced by SPICE and SPICED were recorded. The results so far are that the endmembers found by SPICED are approximately 25% closer to the data, with indistinguishable reconstruction error compared to those found using SPICE.
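A schematic form of the modified objective, written only to make the verbal description concrete; the symbols, the placeholder terms Γ and Λ for the original SPICE volume and sparsity penalties, and the weight γ are assumptions rather than the authors' notation.

```latex
\[
J(\mathbf{E},\mathbf{P}) =
\sum_{n}\Big\|\mathbf{x}_n-\textstyle\sum_{k}p_{nk}\,\mathbf{e}_k\Big\|^2
+ \Gamma(\mathbf{E}) + \Lambda(\mathbf{P})
+ \underbrace{\gamma\sum_{k}\Big\|\mathbf{e}_k-\textstyle\sum_{n}w_{kn}\,\mathbf{x}_n\Big\|^2}_{D}
\]
```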
Generating High-Temporal and Spatial Resolution TIR Image Data
NASA Astrophysics Data System (ADS)
Herrero-Huerta, M.; Lagüela, S.; Alfieri, S. M.; Menenti, M.
2017-09-01
Remote sensing imagery used to monitor global biophysical dynamics requires thermal infrared data at high temporal and spatial resolution because of the rapid development of crops during the growing season and the fragmentation of most agricultural landscapes. Conversely, no single sensor meets these combined requirements. Data fusion approaches offer an alternative that exploits observations from multiple sensors, providing data sets with better properties. A novel spatio-temporal data fusion model based on constrained algorithms, denoted the multisensor multiresolution technique (MMT), was developed and applied to generate synthetic TIR image data at both high temporal and high spatial resolution. First, an adaptive radiance model based on spectral unmixing analysis is applied: TIR radiance data at the top of atmosphere (TOA) collected daily by MODIS at 1 km and every 16 days by Landsat TIRS, resampled to 30 m resolution, are used to generate synthetic daily TOA radiance images at 30 m spatial resolution. The next step consists of unmixing the 30 m (now lower resolution) images using information about their pixel land-cover composition from co-registered images at higher spatial resolution. In our case study, the synthesized TIR data were unmixed to the Sentinel-2 MSI with 10 m resolution. The constrained unmixing preserves all the available radiometric information of the 30 m images and involves optimizing the number of land-cover classes and the size of the moving window used for spatial unmixing. Results are still being evaluated, with particular attention to the quality of the data streams required to apply our approach.
Mapping tropical rainforest canopies using multi-temporal spaceborne imaging spectroscopy
NASA Astrophysics Data System (ADS)
Somers, Ben; Asner, Gregory P.
2013-10-01
The use of imaging spectroscopy for floristic mapping of forests is complicated by the spectral similarity among coexisting species. Here we evaluated an alternative spectral unmixing strategy combining a time series of EO-1 Hyperion images and an automated feature selection strategy in MESMA. Instead of using the same spectral subset to unmix each image pixel, our modified approach allows the spectral subsets to vary on a per-pixel basis, such that each pixel is evaluated using a spectral subset tuned towards maximal separability of its specific endmember class combination or species mixture. The potential of the new approach for floristic mapping of tree species in Hawaiian rainforests was quantitatively demonstrated using both simulated and actual hyperspectral image time series. With a Cohen's Kappa coefficient of 0.65, our approach provided a more accurate tree species map than MESMA (Kappa = 0.54). In addition, through the selection of spectral subsets, our approach was about 90% faster than MESMA. The flexible or adaptive use of band sets in spectral unmixing thus provides an interesting avenue for addressing spectral similarities in complex vegetation canopies.
Unmixing Space Object’s Moderate Resolution Spectra
2013-09-01
In the visible, the non-resolved spectral signature of a space object is modeled as a linear mixture of spectral reflectance signatures and recovered as a result of spectral unmixing. In the associated objective, the first term expresses the Euclidean distance (l2) between the observed data and the forward model, and the second term involves the l1 norm.
Spectral unmixing of hyperspectral data to map bauxite deposits
NASA Astrophysics Data System (ADS)
Shanmugam, Sanjeevi; Abhishekh, P. V.
2006-12-01
This paper presents a study of the potential of remote sensing for bauxite exploration in the Kolli Hills of Tamilnadu state, southern India. An ASTER image (acquired in the VNIR and SWIR regions) has been used in conjunction with the SRTM DEM in this study. A new approach of spectral unmixing of the ASTER image data delineated areas rich in alumina. Various geological and geomorphological parameters that control bauxite formation were also derived from the ASTER image. All this information, when integrated, showed that there are 16 cappings (including the existing mines) that satisfy most of the conditions favouring bauxitization in the Kolli Hills. The study concludes that spectral unmixing of hyperspectral satellite data in the VNIR and SWIR regions may be combined with terrain parameters to obtain accurate information about bauxite deposits, including their quality.
NASA Astrophysics Data System (ADS)
Y Yang, M.; Wang, J.; Zhang, Q.
2017-07-01
Vegetation coverage is one of the most important indicators of ecological environment change, and is also an effective index for the assessment of land degradation and desertification. Dry-hot valley regions have sparse surface vegetation, and the spectral information of the vegetation in such regions is usually weakly represented in remote sensing, so there are considerable limitations in applying the commonly used vegetation index method to calculate vegetation coverage in dry-hot valley regions. Therefore, in this paper, the Alternating Angle Minimum (AAM) algorithm, a deterministic model, is adopted for endmember selection and pixel unmixing of MODIS imagery in order to extract vegetation coverage, and an accuracy test is carried out using a Landsat TM image from the same period. The results show that in dry-hot valley regions with sparse vegetation, the AAM model has a high unmixing accuracy and the extracted vegetation coverage is close to the actual situation, so it is promising to apply the AAM model to the extraction of vegetation coverage in dry-hot valley regions.
NASA Astrophysics Data System (ADS)
Goodman, James Ansell
My research focuses on the development and application of hyperspectral remote sensing as a valuable component in the assessment and management of coral ecosystems. Remote sensing provides an important quantitative ability to investigate the spatial dynamics of coral health and evaluate the impacts of local, regional and global change on this important natural resource. Furthermore, advances in detector capabilities and analysis methods, particularly with respect to hyperspectral remote sensing, are also increasing the accuracy and level of effectiveness of the resulting data products. Using imagery of Kaneohe Bay and French Frigate Shoals in the Hawaiian Islands, acquired in 2000 by NASA's Airborne Visible InfraRed Imaging Spectrometer (AVIRIS), I developed, applied and evaluated algorithms for analyzing coral reefs using hyperspectral remote sensing data. Research included developing methods for acquiring in situ underwater reflectance, collecting spectral measurements of the dominant bottom components in Kaneohe Bay, applying atmospheric correction and sunglint removal algorithms, employing a semianalytical optimization model to derive bathymetry and aquatic optical properties, and developing a linear unmixing approach for deriving bottom composition. Additionally, algorithm development focused on using fundamental scientific principles to facilitate the portability of methods to diverse geographic locations and across variable environmental conditions. Assessments of this methodology compared favorably with available field measurements and habitat information, and the overall analysis demonstrated the capacity to derive information on water properties, bathymetry and habitat composition. Thus, results illustrated a successful approach for extracting environmental information and habitat composition from a coral reef environment using hyperspectral remote sensing.
NASA Technical Reports Server (NTRS)
Abercromby, Kira J.; Rapp, Jason; Bedard, Donald; Seitzer, Patrick; Cardona, Tommaso; Cowardin, Heather; Barker, Ed; Lederer, Susan
2013-01-01
The Constrained Linear Least Squares model is generally more accurate than the "human-in-the-loop" approach. However, a human in the loop can remove materials that make no physical sense. The speed of the model in determining a "first cut" at the material ID makes it a viable option for spectral unmixing of debris objects.
NASA Astrophysics Data System (ADS)
Senthil Kumar, A.; Keerthi, V.; Manjunath, A. S.; Werff, Harald van der; Meer, Freek van der
2010-08-01
Classification of hyperspectral images has been receiving considerable attention, with many new applications reported from commercial and military sectors. Hyperspectral images are composed of a large number of spectral channels and have the potential to deliver a great deal of information about a remotely sensed scene. However, in addition to high dimensionality, hyperspectral image classification is compounded by the coarse ground pixel size of the sensor, required to obtain an adequate sensor signal-to-noise ratio within a fine spectral passband. This results in multiple ground features jointly occupying a single pixel. Spectral mixture analysis typically begins with pixel classification using spectral matching techniques, followed by the use of spectral unmixing algorithms for estimating the abundance values of endmembers in the pixel. The spectral matching techniques are analogous to supervised pattern recognition approaches and try to estimate the similarity between the spectral signatures of the pixel and a reference target. In this paper, we propose a spectral matching approach that combines two schemes: the variable interval spectral average (VISA) method and the spectral curve matching (SCM) method. The VISA method helps to detect transient spectral features at different scales of spectral windows, while the SCM method finds a match between these features of the pixel and one of the library spectra by least-squares fitting. We also compare the performance of the combined algorithm with other spectral matching techniques using simulated and AVIRIS hyperspectral data sets. Our results indicate that the proposed combination technique exhibits stronger performance than the other methods in the classification of both pure and mixed-class pixels simultaneously.
SMV⊥: Simplex of maximal volume based upon the Gram-Schmidt process
NASA Astrophysics Data System (ADS)
Salazar-Vazquez, Jairo; Mendez-Vazquez, Andres
2015-10-01
In recent years, different algorithms for Hyperspectral Image (HI) analysis have been introduced. The high spectral resolution of these images allows the development of algorithms for target detection, material mapping, and material identification, with applications in agriculture, security and defense, industry, etc. Therefore, from the computer science point of view, there is a fertile field of research for improving and developing algorithms in HI analysis. In some applications, the spectral pixels of a HI can be classified using laboratory spectral signatures. Nevertheless, for many others, there is not enough available prior information or spectral signatures, making any analysis a difficult task. One of the most popular algorithms for HI analysis is N-FINDR, because it is easy to understand and provides a way to unmix the original HI into the respective material compositions. However, N-FINDR is computationally expensive and its performance depends on a random initialization process. This paper proposes a novel idea to reduce the complexity of N-FINDR by implementing a bottom-up approach based on an observation from linear algebra and the use of the Gram-Schmidt process. The Simplex of Maximal Volume Perpendicular (SMV⊥) algorithm is therefore proposed for fast endmember extraction in hyperspectral imagery. This novel algorithm has complexity O(n) with respect to the number of pixels. In addition, the evidence shows that SMV⊥ finds a larger simplex volume and has lower computational time complexity than other popular algorithms on synthetic and real scenarios.
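The linear-algebra observation behind SMV⊥ can be illustrated by computing a simplex volume from Gram-Schmidt residual norms; the sketch below uses a QR factorization for the orthogonalization and is only meant to show why the volume can be grown incrementally, not to reproduce the full algorithm.

```python
import math
import numpy as np

def simplex_volume(vertices):
    """Volume of the simplex spanned by p + 1 endmember vertices (sketch).

    vertices : (p + 1, bands) array. The edge vectors are orthogonalised by a
    QR factorisation (the Gram-Schmidt process in matrix form); the product of
    the residual norms divided by p! gives the simplex volume.
    """
    edges = (vertices[1:] - vertices[0]).T          # (bands, p) edge vectors
    _, R = np.linalg.qr(edges)
    p = edges.shape[1]
    return np.prod(np.abs(np.diag(R))) / math.factorial(p)
```

Appending a candidate pixel multiplies the underlying parallelotope volume by the norm of the pixel's component orthogonal to the current simplex, which is what allows a bottom-up, per-pixel search.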
Mezzanotte, Laura; Que, Ivo; Kaijzel, Eric; Branchini, Bruce; Roda, Aldo; Löwik, Clemens
2011-04-22
Despite a plethora of bioluminescent reporter genes being cloned and used for cell assays and molecular imaging purposes, the simultaneous monitoring of multiple events in small animals is still challenging. This is partly attributable to the lack of optimization of cell reporter gene expression as well as too much spectral overlap of the color-coupled reporter genes. A new red emitting codon-optimized luciferase reporter gene mutant of Photinus pyralis, Ppy RE8, has been developed and used in combination with the green click beetle luciferase, CBG99. Human embryonic kidney cells (HEK293) were transfected with vectors that expressed red Ppy RE8 and green CBG99 luciferases. Populations of red and green emitting cells were mixed in different ratios. After addition of the shared single substrate, D-luciferin, bioluminescent (BL) signals were imaged with an ultrasensitive cooled CCD camera using a series of band pass filters (20 nm). Spectral unmixing algorithms were applied to the images where good separation of signals was observed. Furthermore, HEK293 cells that expressed the two luciferases were injected at different depth in the animals. Spectrally-separate images and quantification of the dual BL signals in a mixed population of cells was achieved when cells were either injected subcutaneously or directly into the prostate. We report here the re-engineering of different luciferase genes for in vitro and in vivo dual color imaging applications to address the technical issues of using dual luciferases for imaging. In respect to previously used dual assays, our study demonstrated enhanced sensitivity combined with spatially separate BL spectral emissions using a suitable spectral unmixing algorithm. This new D-luciferin-dependent reporter gene couplet opens up the possibility in the future for more accurate quantitative gene expression studies in vivo by simultaneously monitoring two events in real time.
Retrieving the hydrous minerals on Mars by sparse unmixing and the Hapke model using MRO/CRISM data
NASA Astrophysics Data System (ADS)
Lin, Honglei; Zhang, Xia
2017-05-01
The hydrous minerals on Mars preserve records of potential past aqueous activity. Quantitative information regarding mineralogical composition would enable a better understanding of the formation processes of these hydrous minerals, and provide unique insights into ancient habitable environments and the geological evolution of Mars. The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) has the advantage of both a high spatial and spectral resolution, which makes it suitable for the quantitative analysis of minerals on Mars. However, few studies have attempted to quantitatively retrieve the mineralogical composition of hydrous minerals on Mars using visible-infrared (VISIR) hyperspectral data due to their distribution characteristics (relatively low concentrations, located primarily in Noachian terrain, and unclear or unknown background minerals) and limitations of the spectral unmixing algorithms. In this study, we developed a modified sparse unmixing (MSU) method, combining the Hapke model with sparse unmixing. The MSU method considers the nonlinear mixed effects of minerals and avoids the difficulty of determining the spectra and number of endmembers from the image. The proposed method was tested successfully using laboratory mixture spectra and an Airborne Visible Infrared Imaging Spectrometer (AVIRIS) image of the Cuprite site (Nevada, USA). Then it was applied to CRISM hyperspectral images over Gale crater. Areas of hydrous mineral distribution were first identified by spectral features of water and hydroxyl absorption. The MSU method was performed on these areas, and the abundances were retrieved. The results indicated that the hydrous minerals consisted mostly of hydrous silicates, with abundances of up to 35%, as well as hydrous sulfates, with abundances ≤10%. Several main subclasses of hydrous minerals (e.g., Fe/Mg phyllosilicate, prehnite, and kieserite) were retrieved. Among these, Fe/Mg- phyllosilicate was the most abundant, with abundances ranging up to almost 30%, followed by prehnite and kieserite, with abundances lower than 15%. Our results are consistent with related research and in situ analyses of data from the rover Curiosity; thus, our method has the potential to be widely used for quantitative mineralogical mapping at the global scale of the surface of Mars.
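The library-driven, sparsity-promoting step can be sketched with a non-negative lasso acting on one spectrum; the regularization weight and the assumption that any Hapke-style single-scattering-albedo conversion has already been applied are placeholders, so this shows only the linear portion of the MSU idea.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_unmix(library, pixel, alpha=1e-3):
    """Library-based sparse unmixing of a single spectrum (linear sketch only).

    library : (bands, n_lib) laboratory endmember spectra
    pixel   : (bands,) observed spectrum, assumed already converted to the
              quantity in which mixing is (approximately) linear
    """
    model = Lasso(alpha=alpha, positive=True, fit_intercept=False, max_iter=10000)
    model.fit(library, pixel)
    a = model.coef_
    return a / a.sum() if a.sum() > 0 else a   # normalise abundances to sum to one
```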
NASA Astrophysics Data System (ADS)
Su, Yuanchao; Sun, Xu; Gao, Lianru; Li, Jun; Zhang, Bing
2016-10-01
Endmember extraction is a key step in hyperspectral unmixing. A new framework is proposed for hyperspectral endmember extraction. The proposed approach is based on swarm intelligence (SI) algorithms, which are discretized here because pixels in a hyperspectral image are naturally defined within a discrete space. Moreover, a "distance" factor is introduced into the objective function to limit the number of endmembers, which is generally small in real scenarios, whereas traditional SI algorithms tend to produce superabundant spectral signatures that often belong to the same classes. Three endmember extraction methods are proposed based on the artificial bee colony, ant colony optimization, and particle swarm optimization algorithms. Experiments with both simulated and real hyperspectral images indicate that the proposed framework can improve the accuracy of endmember extraction.
NASA Astrophysics Data System (ADS)
Oommen, T.; Chatterjee, S.
2017-12-01
NASA and the Indian Space Research Organization (ISRO) are generating Earth surface feature data using the Airborne Visible/Infrared Imaging Spectrometer-Next Generation (AVIRIS-NG) within the 380 to 2500 nm spectral range. This research focuses on the utilization of such data to better understand the mineral potential in India and to demonstrate the application of spectral data in rock type discrimination and mapping for mineral exploration using automated mapping techniques. The primary focus area of this research is the Hutti-Maski greenstone belt, located in Karnataka, India. The AVIRIS-NG data was integrated with field-analyzed data (laboratory-scale compositional analysis, mineralogy, and a spectral library) to characterize minerals and rock types. An expert system was developed to produce mineral maps from AVIRIS-NG data automatically. The ground truth data from the study areas was obtained from the existing literature and from collaborators in India. A Bayesian spectral unmixing algorithm was applied to the AVIRIS-NG data for endmember selection. The classification maps of the minerals and rock types were developed using a support vector machine algorithm. The ground truth data was used to verify the mineral maps.
A fast fully constrained geometric unmixing of hyperspectral images
NASA Astrophysics Data System (ADS)
Zhou, Xin; Li, Xiao-run; Cui, Jian-tao; Zhao, Liao-ying; Zheng, Jun-peng
2014-11-01
A great challenge in hyperspectral image analysis is decomposing a mixed pixel into a collection of endmembers and their corresponding abundance fractions. This paper presents an improved implementation of the Barycentric Coordinate approach to unmix hyperspectral images, integrated with the Most-Negative Remove Projection method to meet the abundance sum-to-one constraint (ASC) and abundance non-negativity constraint (ANC). The original Barycentric Coordinate approach interprets the unmixing problem as a simplex volume ratio problem, which is solved by calculating the determinants of two augmented matrices: one consists of all the endmembers, and the other consists of the pixel to be unmixed together with all the endmembers except the one corresponding to the abundance being estimated. In this paper, we first modify the Barycentric Coordinate approach by bringing in the Matrix Determinant Lemma to simplify the unmixing process, so that the calculation contains only linear matrix and vector operations; the per-pixel determinant calculations required by the original algorithm are thus avoided. At the end of this step, the estimated abundances meet the ASC. Then, the Most-Negative Remove Projection method is used to make the abundance fractions meet the full constraints. The algorithm is demonstrated on both synthetic and real images. The resulting algorithm yields abundance maps similar to those obtained by FCLS, while the runtime is improved owing to its computational simplicity.
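A minimal sketch of the abundance step that the barycentric (simplex volume ratio) coordinates reduce to: a least-squares fit with an appended sum-to-one row. The matrix determinant lemma speed-up and the most-negative remove projection of the paper are not reproduced here, and the endmembers and pixel are toy values.

```python
# Hedged sketch of sum-to-one-constrained abundance estimation; not the paper's
# accelerated implementation. E and y are illustrative toy values.
import numpy as np

def barycentric_abundances(E, y):
    """E: (bands, p) endmembers, y: (bands,) pixel. Returns abundances whose sum
    is driven to one; values may be negative for pixels outside the simplex."""
    p = E.shape[1]
    A = np.vstack([E, np.ones((1, p))])    # append the sum-to-one constraint row
    b = np.append(y, 1.0)
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    return a

E = np.array([[0.2, 0.8, 0.5],
              [0.4, 0.6, 0.1],
              [0.9, 0.3, 0.7]])
y = 0.7 * E[:, 0] + 0.3 * E[:, 1]          # a pixel inside the simplex
print(barycentric_abundances(E, y))        # approximately [0.7, 0.3, 0.0]
```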
Spectral imaging perspective on cytomics.
Levenson, Richard M
2006-07-01
Cytomics involves the analysis of cellular morphology and molecular phenotypes, with reference to tissue architecture and to additional metadata. To this end, a variety of imaging and nonimaging technologies need to be integrated. Spectral imaging is proposed as a tool that can simplify and enrich the extraction of morphological and molecular information. Simple-to-use instrumentation is available that mounts on standard microscopes and can generate spectral image datasets with excellent spatial and spectral resolution; these can be exploited by sophisticated analysis tools. This report focuses on brightfield microscopy-based approaches. Cytological and histological samples were stained using nonspecific standard stains (Giemsa; hematoxylin and eosin (H&E)) or immunohistochemical (IHC) techniques employing three chromogens plus a hematoxylin counterstain. The samples were imaged using the Nuance system, a commercially available, liquid-crystal tunable-filter-based multispectral imaging platform. The resulting data sets were analyzed using spectral unmixing algorithms and/or learn-by-example classification tools. Spectral unmixing of Giemsa-stained guinea-pig blood films readily classified the major blood elements. Machine-learning classifiers were also successful at the same task, as well as in distinguishing normal from malignant regions in a colon-cancer example, and in delineating regions of inflammation in an H&E-stained kidney sample. In an example of a multiplexed IHC sample, brown, red, and blue chromogens were isolated into separate images without crosstalk or interference from the (also blue) hematoxylin counterstain. Cytomics requires both accurate architectural segmentation and multiplexed molecular imaging to associate molecular phenotypes with relevant cellular and tissue compartments. Multispectral imaging can assist in both these tasks, and conveys new utility to brightfield-based microscopy approaches. Copyright 2006 International Society for Analytical Cytology.
Nanohole-array-based device for 2D snapshot multispectral imaging
Najiminaini, Mohamadreza; Vasefi, Fartash; Kaminska, Bozena; Carson, Jeffrey J. L.
2013-01-01
We present a two-dimensional (2D) snapshot multispectral imager that utilizes the optical transmission characteristics of nanohole arrays (NHAs) in a gold film to resolve a mixture of input colors into multiple spectral bands. The multispectral device consists of blocks of NHAs, wherein each NHA has a unique periodicity that results in transmission resonances and minima in the visible and near-infrared regions. The multispectral device was illuminated over a wide spectral range, and the transmission was spectrally unmixed using a least-squares estimation algorithm. A NHA-based multispectral imaging system was built and tested in both reflection and transmission modes. The NHA-based multispectral imager was capable of extracting 2D multispectral images representative of four independent bands within the spectral range of 662 nm to 832 nm for a variety of targets. The multispectral device can potentially be integrated into a variety of imaging sensor systems. PMID:24005065
NASA Astrophysics Data System (ADS)
Kemper, Thomas; Sommer, Stefan
2004-10-01
Field and airborne hyperspectral data was used to map residual contamination after a mining accident by applying spectral mixture modelling. The test case was the Aznalcollar Mine (Southern Spain) accident, where heavy-metal-bearing sludge from a tailings pond was distributed over large areas of the Guadiamar flood plain. Although the sludge and the contaminated topsoils have been removed mechanically in the whole affected area, a high abundance of pyritic material still remained on the ground. During dedicated field campaigns in two subsequent years, soil samples were collected for geochemical and spectral laboratory analysis, and spectral field measurements were carried out in parallel to data acquisition with the HyMap sensor. A Variable Multiple Endmember Spectral Mixture Analysis (VMESMA) tool providing multiple-endmember unmixing was used, aiming to estimate the quantities and distribution of the remaining tailings material. A spectrally based zonal partition of the area was introduced to allow the application of different submodels to the selected areas. Based on an iterative feedback process, the unmixing performance could be improved in each stage until an optimum level was reached. The sludge abundances obtained by unmixing the hyperspectral data were confirmed by the field observations and chemical measurements of samples taken in the area. The semi-quantitative sludge abundances of residual pyritic material could be transformed into quantitative information for an assessment of acidification risk and of the distribution of residual heavy metal contamination, based on an artificial mixture experiment. The unmixing of the second-year images allowed identification of secondary minerals of pyrite as indicators of pyrite oxidation and associated acidification.
Assessing FRET using Spectral Techniques
Leavesley, Silas J.; Britain, Andrea L.; Cichon, Lauren K.; Nikolaev, Viacheslav O.; Rich, Thomas C.
2015-01-01
Förster resonance energy transfer (FRET) techniques have proven invaluable for probing the complex nature of protein–protein interactions, protein folding, and intracellular signaling events. These techniques have traditionally been implemented with the use of one or more fluorescence band-pass filters, either as fluorescence microscopy filter cubes, or as dichroic mirrors and band-pass filters in flow cytometry. In addition, new approaches for measuring FRET, such as fluorescence lifetime and acceptor photobleaching, have been developed. Hyperspectral techniques for imaging and flow cytometry have also shown promise for performing FRET measurements. In this study, we have compared traditional (filter-based) FRET approaches to three spectral-based approaches: the ratio of acceptor-to-donor peak emission, linear spectral unmixing, and linear spectral unmixing with a correction for direct acceptor excitation. All methods are estimates of FRET efficiency, except for one-filter set and three-filter set FRET indices, which are included for consistency with prior literature. In the first part of this study, spectrofluorimetric data were collected from a CFP–Epac–YFP FRET probe that has been used for intracellular cAMP measurements. All comparisons were performed using the same spectrofluorimetric datasets as input data, to provide a relevant comparison. Linear spectral unmixing resulted in measurements with the lowest coefficient of variation (0.10) as well as accurate fits using the Hill equation. FRET efficiency methods produced coefficients of variation of less than 0.20, while FRET indices produced coefficients of variation greater than 8.00. These results demonstrate that spectral FRET measurements provide improved response over standard, filter-based measurements. Using spectral approaches, single-cell measurements were conducted through hyperspectral confocal microscopy, linear unmixing, and cell segmentation with quantitative image analysis. Results from these studies confirmed that spectral imaging is effective for measuring subcellular, time-dependent FRET dynamics and that additional fluorescent signals can be readily separated from FRET signals, enabling multilabel studies of molecular interactions. PMID:23929684
(LMRG): Microscope Resolution, Objective Quality, Spectral Accuracy and Spectral Un-mixing
Bayles, Carol J.; Cole, Richard W.; Eason, Brady; Girard, Anne-Marie; Jinadasa, Tushare; Martin, Karen; McNamara, George; Opansky, Cynthia; Schulz, Katherine; Thibault, Marc; Brown, Claire M.
2012-01-01
The second study by the LMRG focuses on measuring confocal laser scanning microscope (CLSM) resolution, objective lens quality, spectral imaging accuracy and spectral un-mixing. Affordable test samples for each aspect of the study were designed, prepared and sent to 116 labs from 23 countries across the globe. Detailed protocols were designed for the three tests and customized for most of the major confocal instruments being used by the study participants. One protocol developed for measuring resolution and objective quality was recently published in Nature Protocols (Cole, R. W., T. Jinadasa, et al. (2011). Nature Protocols 6(12): 1929–1941). The first test involved 3D imaging of sub-resolution fluorescent microspheres to determine the microscope point spread function. Results of the resolution studies as well as point spread function quality (i.e. objective lens quality) from 140 different objective lenses will be presented. The second test, on spectral accuracy, looked at the reflection of the laser excitation lines into the spectral detection in order to determine how accurately these systems report the laser emission wavelengths. Results will be presented from 42 different spectral confocal systems. Finally, samples with double orange beads (orange core and orange coating) were imaged spectrally and the imaging software was used to un-mix fluorescence signals from the two orange dyes. Results from 26 different confocal systems will be summarized. Time will be left to discuss possibilities for the next LMRG study.
Effects of band selection on endmember extraction for forestry applications
NASA Astrophysics Data System (ADS)
Karathanassi, Vassilia; Andreou, Charoula; Andronis, Vassilis; Kolokoussis, Polychronis
2014-10-01
In spectral unmixing theory, data reduction techniques play an important role, as hyperspectral imagery contains an immense amount of data, posing many challenging problems such as data storage, computational efficiency, and the so-called "curse of dimensionality". Feature extraction and feature selection are the two main approaches for dimensionality reduction. Feature extraction techniques reduce the dimensionality of hyperspectral data by applying transforms to the data. Feature selection techniques retain the physical meaning of the data by selecting a set of bands from the input hyperspectral dataset that mainly contain the information needed for spectral unmixing. Although feature selection techniques are well known for their dimensionality reduction potential, they are rarely used in the unmixing process. The majority of the existing state-of-the-art dimensionality reduction methods set criteria on the spectral information derived from the whole wavelength range in order to define the optimum spectral subspace. These criteria are not associated with any particular application but with the data statistics, such as correlation and entropy values. However, each application is associated with specific land cover materials, whose spectral characteristics present variations in specific wavelengths. In forestry, for example, many applications focus on tree leaves, in which specific pigments such as chlorophyll, xanthophyll, etc. determine the wavelengths where tree species, diseases, etc., can be detected. For such applications, when the unmixing process is applied, the tree species, diseases, etc., are considered the endmembers of interest. This paper focuses on investigating the effects of band selection on endmember extraction by exploiting the information of the vegetation absorbance spectral zones. More precisely, it explores whether endmember extraction can be optimized when specific sets of initial bands related to leaf spectral characteristics are selected. Experiments comprise the application of well-known signal subspace estimation and endmember extraction methods on hyperspectral imagery of a forest area. Evaluation of the extracted endmembers showed that more forest species can be extracted as endmembers using the selected bands.
Estimation of tissue optical parameters with hyperspectral imaging and spectral unmixing
NASA Astrophysics Data System (ADS)
Lu, Guolan; Qin, Xulei; Wang, Dongsheng; Chen, Zhuo G.; Fei, Baowei
2015-03-01
Early detection of oral cancer and its curable precursors can improve patient survival and quality of life. Hyperspectral imaging (HSI) holds the potential for noninvasive early detection of oral cancer. The quantification of tissue chromophores by spectral unmixing of hyperspectral images could provide insights for evaluating cancer progression. In this study, non-negative matrix factorization has been applied for decomposing hyperspectral images into physiologically meaningful chromophore concentration maps. The approach has been validated by computer-simulated hyperspectral images and in vivo tumor hyperspectral images from a head and neck cancer animal model.
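An illustrative sketch of the general idea named in this abstract, NMF-based decomposition of a hyperspectral cube into component spectra and concentration maps, run on synthetic data with scikit-learn; the cube size, component count, and initialization are assumptions, not the study's settings.

```python
# Hedged sketch: NMF of a synthetic hyperspectral cube into "chromophore"-like
# spectra and concentration maps. All sizes and settings are illustrative.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
rows, cols, bands, n_components = 40, 40, 60, 3

# Simulate a non-negative cube as a mixture of three smooth component spectra.
true_spectra = np.abs(rng.normal(size=(n_components, bands))).cumsum(axis=1)
true_maps = rng.random((rows * cols, n_components))
cube = true_maps @ true_spectra + 0.01 * rng.random((rows * cols, bands))

model = NMF(n_components=n_components, init="nndsvda", max_iter=500)
maps = model.fit_transform(cube)           # (pixels, components): concentration maps
spectra = model.components_                # (components, bands): recovered spectra
print(maps.reshape(rows, cols, n_components).shape)    # (40, 40, 3)
```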
GPU implementation of the simplex identification via split augmented Lagrangian
NASA Astrophysics Data System (ADS)
Sevilla, Jorge; Nascimento, José M. P.
2015-10-01
Hyperspectral imaging can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems of hyperspectral data analysis is the presence of mixed pixels, due to the low spatial resolution of such images. This means that several spectrally pure signatures (endmembers) are combined into the same mixed pixel. Linear spectral unmixing follows an unsupervised approach which aims at inferring pure spectral signatures and their material fractions at each pixel of the scene. The huge data volumes acquired by such sensors put stringent requirements on processing and unmixing methods. This paper proposes an efficient implementation of an unsupervised linear unmixing method on GPUs using CUDA. The method finds the smallest simplex by solving a sequence of nonsmooth convex subproblems, using variable splitting to obtain a constrained formulation and then applying an augmented Lagrangian technique. The parallel implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced accesses to memory. The results presented herein indicate that the GPU implementation can significantly accelerate the method's execution over big datasets while maintaining the method's accuracy.
A novel highly parallel algorithm for linearly unmixing hyperspectral images
NASA Astrophysics Data System (ADS)
Guerra, Raúl; López, Sebastián.; Callico, Gustavo M.; López, Jose F.; Sarmiento, Roberto
2014-10-01
Endmember extraction and abundance calculation represent critical steps within the process of linearly unmixing a given hyperspectral image, for two main reasons. The first is the need to compute a set of accurate endmembers in order to obtain confident abundance maps. The second is the huge number of operations involved in these time-consuming processes. This work proposes an algorithm that estimates the endmembers of the hyperspectral image under analysis and its abundances at the same time. The main advantages of this algorithm are its high degree of parallelization and the mathematical simplicity of the operations implemented. The algorithm estimates the endmembers as virtual pixels. In particular, it performs gradient descent to iteratively refine the endmembers and the abundances, reducing the mean square error according to the linear unmixing model. Some mathematical restrictions must be added so that the method converges to a unique and realistic solution; owing to the nature of the algorithm, these restrictions can be easily implemented. The results obtained with synthetic images demonstrate the good behavior of the proposed algorithm. Moreover, the results obtained with the well-known Cuprite dataset also corroborate the benefits of our proposal.
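A hedged sketch, not the paper's exact scheme: joint gradient descent on the linear-mixing reconstruction error for endmembers E and abundances A, with a simple clip-and-renormalize projection standing in for the restrictions the abstract mentions; all dimensions, the step size, and the initialization are illustrative assumptions.

```python
# Hedged sketch of joint endmember/abundance refinement by gradient descent on
# the linear unmixing reconstruction error; sizes and step size are illustrative.
import numpy as np

def joint_unmix(Y, p, n_iter=2000, lr=0.05, seed=0):
    """Y: (bands, pixels) image matrix; p: number of endmembers."""
    rng = np.random.default_rng(seed)
    bands, pixels = Y.shape
    E = Y[:, rng.choice(pixels, p, replace=False)].astype(float)  # init from pixels
    A = np.full((p, pixels), 1.0 / p)                             # uniform abundances
    for _ in range(n_iter):
        R = E @ A - Y                        # residual of the linear mixing model
        E -= lr * (R @ A.T) / pixels         # gradient step on the endmembers
        A -= lr * (E.T @ R) / bands          # gradient step on the abundances
        A = np.clip(A, 0.0, None)            # non-negativity
        A /= A.sum(axis=0, keepdims=True) + 1e-12   # sum-to-one
    return E, A

rng = np.random.default_rng(1)
E_true = rng.random((50, 3))
A_true = rng.dirichlet(np.ones(3), size=100).T
E_est, A_est = joint_unmix(E_true @ A_true, p=3)
print(np.abs(E_est @ A_est - E_true @ A_true).mean())   # small mean reconstruction error
```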
NASA Astrophysics Data System (ADS)
Liu, Xi; Zhou, Mei; Qiu, Song; Sun, Li; Liu, Hongying; Li, Qingli; Wang, Yiting
2017-12-01
Red blood cell counting, as a routine examination, plays an important role in medical diagnoses. Although automated hematology analyzers are widely used, manual microscopic examination by a hematologist or pathologist is still unavoidable, which is time-consuming and error-prone. This paper proposes a fully automatic red blood cell counting method that is based on microscopic hyperspectral imaging of blood smears and combines spatial and spectral information to achieve high precision. The acquired hyperspectral image data of the blood smear in the visible and near-infrared spectral range are first preprocessed, and then a quadratic blind linear unmixing algorithm is used to obtain endmember abundance images. Based on mathematical morphological operations and an adaptive Otsu's method, a binarization process is performed on the abundance images. Finally, the connected component labeling algorithm with magnification-based parameter setting is applied to automatically select the binary images of red blood cell cytoplasm. Experimental results show that the proposed method performs well and has potential for clinical applications.
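A hedged sketch of the post-unmixing steps named above, Otsu binarization of an abundance image followed by connected-component counting, applied here to a synthetic abundance image; the morphological cleanup and sizes are illustrative and do not reproduce the paper's magnification-based parameter setting.

```python
# Hedged sketch: Otsu thresholding + connected-component labeling on a synthetic
# abundance image; parameters are illustrative, not the paper's settings.
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu
from skimage.morphology import binary_opening, disk

rng = np.random.default_rng(0)
abundance = rng.random((128, 128)) * 0.2                 # low background abundance
yy, xx = np.mgrid[0:128, 0:128]
for cy, cx in [(30, 30), (60, 90), (100, 50)]:           # three synthetic "cells"
    abundance[(yy - cy) ** 2 + (xx - cx) ** 2 < 8 ** 2] = 0.9

binary = abundance > threshold_otsu(abundance)           # Otsu binarization
binary = binary_opening(binary, disk(2))                 # remove small speckle
labels, n_cells = ndimage.label(binary)                  # connected-component labeling
print(n_cells)                                           # -> 3
```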
NASA Astrophysics Data System (ADS)
Yang, Jian; He, Yuhong
2017-02-01
Quantifying impervious surfaces in urban and suburban areas is a key step toward a sustainable urban planning and management strategy. With the availability of fine-scale remote sensing imagery, automated mapping of impervious surfaces has attracted growing attention. However, the vast majority of existing studies have selected pixel-based and object-based methods for impervious surface mapping, with few adopting sub-pixel analysis of high spatial resolution imagery. This research makes use of a vegetation-bright impervious-dark impervious linear spectral mixture model to characterize urban and suburban surface components. A WorldView-3 image acquired on May 9th, 2015 is analyzed for its potential in automated unmixing of meaningful surface materials for two urban subsets and one suburban subset in Toronto, ON, Canada. Given the wide distribution of shadows in urban areas, the linear spectral unmixing is implemented in non-shadowed and shadowed areas separately for the two urban subsets. The results indicate that the accuracy of impervious surface mapping in suburban areas reaches up to 86.99%, much higher than the accuracies in urban areas (80.03% and 79.67%). Despite its merits in mapping accuracy and automation, the application of our proposed vegetation-bright impervious-dark impervious model to map impervious surfaces is limited by the absence of a soil component. To further extend the operational transferability of our proposed method, especially for areas where plenty of bare soil exists during urbanization or reclamation, it is still necessary to mask out bare soils by automated classification prior to the implementation of linear spectral unmixing.
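A hedged sketch of a fully constrained three-endmember (vegetation, bright impervious, dark impervious) unmixing of a single pixel, using the common trick of appending a heavily weighted sum-to-one row to a non-negative least-squares problem; the endmember spectra below are invented placeholders, not WorldView-3 values.

```python
# Hedged sketch of fully constrained linear spectral unmixing for one pixel;
# endmember spectra and band count are illustrative placeholders.
import numpy as np
from scipy.optimize import nnls

def fcls(E, y, delta=1e3):
    """E: (bands, 3) endmembers, y: (bands,) pixel reflectance.
    Returns non-negative abundances whose sum is driven toward one."""
    A = np.vstack([E, delta * np.ones((1, E.shape[1]))])   # weighted sum-to-one row
    b = np.append(y, delta)
    abundances, _ = nnls(A, b)
    return abundances

E = np.array([[0.05, 0.30, 0.08],     # 4 toy bands x (veg, bright imp., dark imp.)
              [0.08, 0.32, 0.09],
              [0.45, 0.35, 0.10],
              [0.50, 0.38, 0.12]])
y = 0.2 * E[:, 0] + 0.5 * E[:, 1] + 0.3 * E[:, 2]
print(fcls(E, y))                      # approximately [0.2, 0.5, 0.3]
```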
NASA Astrophysics Data System (ADS)
Plaza, Antonio; Chang, Chein-I.; Plaza, Javier; Valencia, David
2006-05-01
The incorporation of hyperspectral sensors aboard airborne/satellite platforms is currently producing a nearly continual stream of multidimensional image data, and this high data volume has quickly introduced new processing challenges. The price paid for the wealth of spatial and spectral information available from hyperspectral sensors is the enormous amount of data that they generate. Several applications exist, however, where having the desired information calculated quickly enough for practical use is highly desirable. High computing performance of algorithm analysis is particularly important in homeland defense and security applications, in which swift decisions often involve detection of (sub-pixel) military targets (including hostile weaponry, camouflage, concealment, and decoys) or chemical/biological agents. In order to speed up the computational performance of hyperspectral imaging algorithms, this paper develops several fast parallel data processing techniques. These cover four classes of algorithms: (1) unsupervised classification, (2) spectral unmixing, (3) automatic target recognition, and (4) onboard data compression. A massively parallel Beowulf cluster (Thunderhead) at NASA's Goddard Space Flight Center in Maryland is used to measure the parallel performance of the proposed algorithms. In order to explore the viability of developing onboard, real-time hyperspectral data compression algorithms, a Xilinx Virtex-II field programmable gate array (FPGA) is also used in the experiments. Our quantitative and comparative assessment of parallel techniques and strategies may help image analysts in the selection of parallel hyperspectral algorithms for specific applications.
Scaling dimensions in spectroscopy of soil and vegetation
NASA Astrophysics Data System (ADS)
Malenovský, Zbyněk; Bartholomeus, Harm M.; Acerbi-Junior, Fausto W.; Schopfer, Jürg T.; Painter, Thomas H.; Epema, Gerrit F.; Bregt, Arnold K.
2007-05-01
The paper revises and clarifies definitions of the term scale and scaling conversions for imaging spectroscopy of soil and vegetation. We demonstrate a new four-dimensional scale concept that includes not only spatial but also the spectral, directional and temporal components. Three scaling remote sensing techniques are reviewed: (1) radiative transfer, (2) spectral (un)mixing, and (3) data fusion. Relevant case studies are given in the context of their up- and/or down-scaling abilities over the soil/vegetation surfaces and a multi-source approach is proposed for their integration. Radiative transfer (RT) models are described to show their capacity for spatial, spectral up-scaling, and directional down-scaling within a heterogeneous environment. Spectral information and spectral derivatives, like vegetation indices (e.g. TCARI/OSAVI), can be scaled and even tested by their means. Radiative transfer of an experimental Norway spruce ( Picea abies (L.) Karst.) research plot in the Czech Republic was simulated by the Discrete Anisotropic Radiative Transfer (DART) model to prove relevance of the correct object optical properties scaled up to image data at two different spatial resolutions. Interconnection of the successive modelling levels in vegetation is shown. A future development in measurement and simulation of the leaf directional spectral properties is discussed. We describe linear and/or non-linear spectral mixing techniques and unmixing methods that demonstrate spatial down-scaling. Relevance of proper selection or acquisition of the spectral endmembers using spectral libraries, field measurements, and pure pixels of the hyperspectral image is highlighted. An extensive list of advanced unmixing techniques, a particular example of unmixing a reflective optics system imaging spectrometer (ROSIS) image from Spain, and examples of other mixture applications give insight into the present status of scaling capabilities. Simultaneous spatial and temporal down-scaling by means of a data fusion technique is described. A demonstrative example is given for the moderate resolution imaging spectroradiometer (MODIS) and LANDSAT Thematic Mapper (TM) data from Brazil. Corresponding spectral bands of both sensors were fused via a pyramidal wavelet transform in Fourier space. New spectral and temporal information of the resultant image can be used for thematic classification or qualitative mapping. All three described scaling techniques can be integrated as the relevant methodological steps within a complex multi-source approach. We present this concept of combining numerous optical remote sensing data and methods to generate inputs for ecosystem process models.
Spectral unmixing of agents on surfaces for the Joint Contaminated Surface Detector (JCSD)
NASA Astrophysics Data System (ADS)
Slamani, Mohamed-Adel; Chyba, Thomas H.; LaValley, Howard; Emge, Darren
2007-09-01
ITT Corporation, Advanced Engineering and Sciences Division, is currently developing the Joint Contaminated Surface Detector (JCSD) technology under an Advanced Concept Technology Demonstration (ACTD) managed jointly by the U.S. Army Research, Development, and Engineering Command (RDECOM) and the Joint Project Manager for Nuclear, Biological, and Chemical Contamination Avoidance for incorporation on the Army's future reconnaissance vehicles. This paper describes the design of the chemical agent identification (ID) algorithm associated with JCSD. The algorithm detects target chemicals mixed with surface and interferent signatures. Simulated data sets were generated from real instrument measurements to support a matrix of parameters based on a Design of Experiments (DOE) approach. Decisions based on receiver operating characteristic (ROC) curves and area-under-the-curve (AUC) measures were used to down-select among several ID algorithms. Results from the top-performing algorithms were then combined via a fusion approach to converge toward optimal detection and false-alarm rates. This paper describes the process associated with the algorithm design and provides an illustrative example.
NASA Astrophysics Data System (ADS)
Lin, H.; Zhang, X.; Wu, X.; Tarnas, J. D.; Mustard, J. F.
2018-04-01
Quantitative analysis of hydrated minerals from hyperspectral remote sensing data is fundamental for understanding Martian geologic processes. Because of the difficulty of selecting endmembers from hyperspectral images, sparse unmixing algorithms have been proposed for application to CRISM data on Mars. However, this becomes challenging when the endmember library grows dramatically. Here, we propose a new methodology termed Target Transformation Constrained Sparse Unmixing (TTCSU) to accurately detect hydrous minerals on Mars. A new version of the target transformation technique proposed in our recent work was used to obtain potential detections from CRISM data. Sparse unmixing constrained with these detections as prior information was applied to CRISM single-scattering albedo images, which were calculated using a Hapke radiative transfer model. This methodology increases the success rate of the automatic endmember selection of sparse unmixing and yields more accurate abundances. Well-analyzed CRISM images of southwest Melas Chasma were used to validate our methodology in this study. The sulfate jarosite was detected in southwest Melas Chasma; its distribution is consistent with previous work and its abundance is comparable. More validations will be done in our future work.
NASA Astrophysics Data System (ADS)
Wu, Yuanfeng; Gao, Lianru; Zhang, Bing; Zhao, Haina; Li, Jun
2014-01-01
We present a parallel implementation of the optimized maximum noise fraction (G-OMNF) transform algorithm for feature extraction of hyperspectral images on commodity graphics processing units (GPUs). The proposed approach exploited the algorithm's data-level concurrency and optimized the computing flow. We first defined a three-dimensional grid, in which each thread calculates a sub-block of data, to easily facilitate the spatial and spectral neighborhood data searches in noise estimation, which is one of the most important steps involved in OMNF. Then, we optimized the processing flow and computed the noise covariance matrix before computing the image covariance matrix to reduce the original hyperspectral image data transmission. These optimization strategies can greatly improve the computing efficiency and can be applied to other feature extraction algorithms. The proposed parallel feature extraction algorithm was implemented on an Nvidia Tesla GPU using the compute unified device architecture and the basic linear algebra subroutines library. Through experiments on several real hyperspectral images, our GPU parallel implementation provides a significant speedup of the algorithm compared with the CPU implementation, especially for highly data-parallelizable and arithmetically intensive algorithm parts, such as noise estimation. In order to further evaluate the effectiveness of G-OMNF, we used two different applications, spectral unmixing and classification, for evaluation. Considering the sensor scanning rate and the data acquisition time, the proposed parallel implementation met the on-board real-time feature extraction requirement.
Grégori, Gérald; Rajwa, Bartek; Patsekin, Valery; Jones, James; Furuki, Motohiro; Yamamoto, Masanobu; Paul Robinson, J
2014-01-01
Hyperspectral cytometry is an emerging technology for single-cell analysis that combines ultrafast optical spectroscopy and flow cytometry. Spectral cytometry systems utilize diffraction gratings or prism-based monochromators to disperse fluorescence signals from multiple labels (organic dyes, nanoparticles, or fluorescent proteins) present in each analyzed bioparticle onto linear detector arrays such as multianode photomultipliers or charge-coupled device sensors. The resultant data, consisting of a series of spectra characterizing every analyzed cell, are not compensated by employing the traditional cytometry approach, but rather are spectrally unmixed utilizing algorithms such as constrained Poisson regression or non-negative matrix factorization. Although implementations of spectral cytometry were envisioned as early as the 1980s, only recently has the development of highly sensitive photomultiplier tube arrays led to the design and construction of functional prototypes and subsequently to the introduction of commercially available systems. This chapter summarizes the historical efforts and work in the field of spectral cytometry performed at Purdue University Cytometry Laboratories and describes the technology developed by Sony Corporation that resulted in the release of the first commercial spectral cytometry system, the Sony SP6800. A brief introduction to spectral data analysis is also provided, with emphasis on the differences between traditional polychromatic and spectral cytometry approaches.
Multiphoton spectral analysis of benzo[a]pyrene uptake and metabolism in a rat liver cell line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhoumi, Rola, E-mail: rmouneimne@cvm.tamu.edu; Mouneimne, Youssef; Ramos, Ernesto
2011-05-15
Dynamic analysis of the uptake and metabolism of polycyclic aromatic hydrocarbons (PAHs) and their metabolites within live cells in real time has the potential to provide novel insights into genotoxic and non-genotoxic mechanisms of cellular injury caused by PAHs. The present work, combining the use of metabolite spectra generated from metabolite standards using multiphoton spectral analysis and an 'advanced unmixing process', identifies and quantifies the uptake, partitioning, and metabolite formation of one of the most important PAHs (benzo[a]pyrene, BaP) in viable cultured rat liver cells over a period of 24 h. The application of the advanced unmixing process resulted in the simultaneous identification of 8 metabolites in live cells at any single time. The accuracy of this unmixing process was verified using specific microsomal epoxide hydrolase inhibitors, glucuronidation and sulfation inhibitors, as well as several mixtures of metabolite standards. Our findings show that two-photon microscopy imaging surpasses conventional fluorescence imaging techniques and that the unmixing process is a mathematical technique applicable to the analysis of BaP metabolites in living cells, especially for analysis of changes in the ultimate carcinogen benzo[a]pyrene-r-7,t-8-dihydrodiol-t-9,10-epoxide. Therefore, the combination of two-photon acquisition with the unmixing process should provide important insights into the cellular and molecular mechanisms by which BaP and other PAHs alter cellular homeostasis.
Terahertz spectral unmixing based method for identifying gastric cancer
NASA Astrophysics Data System (ADS)
Cao, Yuqi; Huang, Pingjie; Li, Xian; Ge, Weiting; Hou, Dibo; Zhang, Guangxin
2018-02-01
At present, many researchers are exploring biological tissue inspection using terahertz time-domain spectroscopy (THz-TDS) techniques. In this study, based on a modified hard modeling factor analysis method, terahertz spectral unmixing was applied to investigate the relationships between the absorption spectra in THz-TDS and certain biomarkers of gastric cancer in order to systematically identify gastric cancer. A probability distribution and box plot were used to extract the distinctive peaks that indicate carcinogenesis, and the corresponding weight distributions were used to discriminate the tissue types. The results of this work indicate that terahertz techniques have the potential to detect different levels of cancer, including benign tumors and polyps.
NASA Astrophysics Data System (ADS)
Ma, Yehao; Li, Xian; Huang, Pingjie; Hou, Dibo; Wang, Qiang; Zhang, Guangxin
2017-04-01
In many situations the THz spectroscopic data observed from complex samples represent the integrated result of several interrelated variables or feature components acting together. The actual information contained in the original data might be overlapping and there is a necessity to investigate various approaches for model reduction and data unmixing. The development and use of low-rank approximate nonnegative matrix factorization (NMF) and smooth constraint NMF (CNMF) algorithms for feature components extraction and identification in the fields of terahertz time domain spectroscopy (THz-TDS) data analysis are presented. The evolution and convergence properties of NMF and CNMF methods based on sparseness, independence and smoothness constraints for the resulting nonnegative matrix factors are discussed. For general NMF, its cost function is nonconvex and the result is usually susceptible to initialization and noise corruption, and may fall into local minima and lead to unstable decomposition. To reduce these drawbacks, smoothness constraint is introduced to enhance the performance of NMF. The proposed algorithms are evaluated by several THz-TDS data decomposition experiments including a binary system and a ternary system simulating some applications such as medicine tablet inspection. Results show that CNMF is more capable of finding optimal solutions and more robust for random initialization in contrast to NMF. The investigated method is promising for THz data resolution contributing to unknown mixture identification.
Rotational Spectral Unmixing of Exoplanets: Degeneracies between Surface Colors and Geography
NASA Astrophysics Data System (ADS)
Fujii, Yuka; Lustig-Yaeger, Jacob; Cowan, Nicolas B.
2017-11-01
Unmixing the disk-integrated spectra of exoplanets provides hints about heterogeneous surfaces that we cannot directly resolve in the foreseeable future. It is particularly important for terrestrial planets with diverse surface compositions like Earth. Although previous work on unmixing the spectra of Earth from disk-integrated multi-band light curves appeared successful, we point out a mathematical degeneracy between the surface colors and their spatial distributions. Nevertheless, useful constraints on the spectral shape of individual surface types may be obtained from the premise that albedo is everywhere between 0 and 1. We demonstrate the degeneracy and the possible constraints using both mock data based on a toy model of Earth, as well as real observations of Earth. Despite the severe degeneracy, we are still able to recover an approximate albedo spectrum for an ocean. In general, we find that surfaces are easier to identify when they cover a large fraction of the planet and when their spectra approach zero or unity in certain bands.
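To make the albedo constraint concrete, the sketch below unmixes toy multi-band light curves with every surface albedo bounded to [0, 1]; it assumes the per-phase geography (coverage fractions) is known, which sidesteps the color-geography degeneracy that the abstract highlights, so it illustrates only the bounded-unmixing step.

```python
# Hedged sketch: bounded (0-1 albedo) unmixing of toy light curves with known
# surface coverage fractions; all values are illustrative toy numbers.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
n_phases, n_bands, n_surfaces = 24, 5, 2

coverage = rng.dirichlet(np.ones(n_surfaces), size=n_phases)   # (phases, surfaces)
true_albedo = np.array([[0.05, 0.06, 0.08, 0.10, 0.30],        # "ocean"-like spectrum
                        [0.35, 0.40, 0.45, 0.50, 0.55]])       # "land"-like spectrum
light_curves = coverage @ true_albedo                          # (phases, bands)

# Solve band by band for the surface albedos, constrained to lie in [0, 1].
est = np.column_stack([
    lsq_linear(coverage, light_curves[:, b], bounds=(0.0, 1.0)).x
    for b in range(n_bands)
])
print(np.round(est, 3))    # rows: recovered albedo spectra of the two surfaces
```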
NASA Astrophysics Data System (ADS)
Salvatore, M. R.; Goudge, T. A.; Bramble, M. S.; Edwards, C. S.; Bandfield, J. L.; Amador, E. S.; Mustard, J. F.; Christensen, P. R.
2018-02-01
We investigated the area to the northwest of the Isidis impact basin (hereafter referred to as "NW Isidis") using thermal infrared emission datasets to characterize and quantify bulk surface mineralogy throughout this region. This area is home to Jezero crater and the watershed associated with its two deltaic deposits, in addition to NE Syrtis and the strong and diverse visible/near-infrared spectral signatures observed in well-exposed stratigraphic sections. The spectral signatures throughout this region show a diversity of primary and secondary surface mineralogies, including olivine, pyroxene, smectite clays, sulfates, and carbonates. While previous thermal infrared investigations have sought to characterize individual mineral groups within this region, none have systematically assessed bulk surface mineralogy and related these observations to visible/near-infrared studies. We utilize an iterative spectral unmixing method to statistically evaluate our linear thermal infrared spectral unmixing models to derive surface mineralogy. All relevant primary and secondary phases identified in visible/near-infrared studies are included in the unmixing models and their modeled spectral contributions are discussed in detail. While the stratigraphy and compositional diversity observed in visible/near-infrared spectra are much better exposed and more diverse than most other regions of Mars, our thermal infrared analyses suggest the dominance of basaltic compositions with less observed variability in the amount and diversity of alteration phases. These results help to constrain the mineralogical context of these previously reported visible/near-infrared spectral identifications. The results are also discussed in the context of future in situ investigations, as the NW Isidis region has long been promoted as a region of paleoenvironmental interest on Mars.
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
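A minimal sketch of spectral phasor coordinates computed directly from an RGB image, treating the three channels as a coarse three-point spectrum; this is an illustrative simplification of the phasor analysis mentioned above, not the authors' implementation.

```python
# Hedged sketch: spectral phasor coordinates from RGB channels; an assumed,
# simplified form of the phasor approach, not the paper's code.
import numpy as np

def rgb_spectral_phasor(img):
    """img: (H, W, 3) array. Returns per-pixel (g, s): the cosine and sine
    projections of the first Fourier harmonic of the normalized channel values."""
    spec = img.astype(float)
    spec = spec / (spec.sum(axis=-1, keepdims=True) + 1e-12)   # normalize each pixel
    k = np.arange(3)
    g = (spec * np.cos(2 * np.pi * k / 3)).sum(axis=-1)
    s = (spec * np.sin(2 * np.pi * k / 3)).sum(axis=-1)
    return g, s

# Pure red, green and blue pixels land at three distinct phasor positions.
img = np.array([[[1, 0, 0], [0, 1, 0], [0, 0, 1]]], dtype=float)
g, s = rgb_spectral_phasor(img)
print(np.round(g, 3), np.round(s, 3))
```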
NASA Astrophysics Data System (ADS)
OMEGA Science Team; Combe, J.-Ph.; Le Mouélic, S.; Sotin, C.; Gendrin, A.; Mustard, J. F.; Le Deit, L.; Launeau, P.; Bibring, J.-P.; Gondet, B.; Langevin, Y.; Pinet, P.; OMEGA Science Team
2008-05-01
The mineralogical composition of the Martian surface is investigated by a Multiple-Endmember Linear Spectral Unmixing Model (MELSUM) applied to the Observatoire pour la Minéralogie, l'Eau, les Glaces et l'Activité (OMEGA) imaging spectrometer onboard Mars Express. OMEGA has fully covered the surface of the red planet at medium to low resolution (2-4 km per pixel). Several areas have been imaged at a resolution of up to 300 m per pixel. One difficulty in the data processing is to extract the mineralogical composition, since rocks are mixtures of several components. MELSUM is an algorithm that selects the best linear combination of spectra among the families of minerals available in a reference library. The best fit of the observed spectrum on each pixel is calculated by the same unmixing equation used in the classical Spectral Mixture Analysis (SMA). This study shows the importance of the choice of the input library, which contains in our case 24 laboratory spectra (endmembers) of minerals that cover the diversity of the mineral families that may be found on the Martian surface. The analysis is restricted to the 1.0-2.5 μm wavelength range. Grain size variations and atmospheric scattering by aerosols induce changes in overall albedo level and continuum slopes. Synthetic flat and pure slope spectra have therefore been included in the input endmember spectral library in order to take these effects into account. The selection process for the endmembers is a systematic exploration of the whole set of combinations of four components plus the straight-line spectra. When negative coefficients occur, the results are discarded. This strategy is successfully tested on the terrestrial Cuprite site (Nevada, USA), for which extensive ground observations exist. It is then applied to different areas on Mars including Syrtis Major, Aram Chaos and Olympia Undae near the North Polar Cap. MELSUM on Syrtis Major reveals a region dominated by mafic minerals, with the oldest crustal regions composed of a mixture of low-calcium pyroxenes (LCPs; orthopyroxenes, OPx) and high-calcium pyroxenes (HCPs; clinopyroxenes, CPx). The Syrtis volcanic edifice appears depleted in LCP (OPx) and enriched in HCP (CPx), which is consistent with materials produced at a lower degree of partial melting and at an age younger than the surrounding crust. Strong olivine signatures are found between the two calderas Nili Patera and Meroe Patera and in Nili Fossae. A strong signature of iron oxides is found within Aram Chaos, with a spatial distribution also consistent with thermal emission spectrometer (TES) observations. Gypsum is unambiguously detected in the northern polar region, in agreement with the study of Langevin et al. [2005. Sulfates in the north polar region of Mars detected by OMEGA/Mars Express. Science 307(5715), 1584-1586]. Our results show that linear spectral unmixing provides good first-order results in a variety of mineralogical contexts, and can therefore confidently be used on a wider scale to analyze the complete archive of OMEGA data.
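A hedged sketch of a MELSUM-style selection step for one pixel: every combination of four library endmembers is fitted by least squares together with flat and slope spectra, fits with negative mineral coefficients are discarded, and the best remaining fit is kept; the tiny random "library" stands in for laboratory endmember spectra, and the details do not reproduce the authors' implementation.

```python
# Hedged sketch of an exhaustive MELSUM-style combination search; the library
# and mixture are toy values, not OMEGA data or the authors' endmember set.
from itertools import combinations
import numpy as np

def melsum_pixel(library, y, wavelengths, n_pick=4):
    flat = np.ones_like(wavelengths)                                   # flat spectrum
    slope = (wavelengths - wavelengths.min()) / np.ptp(wavelengths)    # pure slope spectrum
    best = (np.inf, None, None)                                        # (rmse, combo, coefs)
    for combo in combinations(range(library.shape[1]), n_pick):
        A = np.column_stack([library[:, list(combo)], flat, slope])
        coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
        if np.any(coefs[:n_pick] < -1e-9):          # discard negative mineral weights
            continue
        rmse = np.sqrt(np.mean((A @ coefs - y) ** 2))
        if rmse < best[0]:
            best = (rmse, combo, coefs)
    return best

rng = np.random.default_rng(0)
wl = np.linspace(1.0, 2.5, 30)                      # 1.0-2.5 micron range
library = rng.random((30, 8))                       # 8 toy "mineral" spectra
y = 0.5 * library[:, 1] + 0.3 * library[:, 4] + 0.1 # mixture plus a flat offset
print(melsum_pixel(library, y, wl)[1])              # winning combination contains 1 and 4
```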
Multispectral Live-Cell Imaging.
Cohen, Sarah; Valm, Alex M; Lippincott-Schwartz, Jennifer
2018-06-01
Fluorescent proteins and vital dyes are invaluable tools for studying dynamic processes within living cells. However, the ability to distinguish more than a few different fluorescent reporters in a single sample is limited by the spectral overlap of available fluorophores. Here, we present a protocol for imaging live cells labeled with six fluorophores simultaneously. A confocal microscope with a spectral detector is used to acquire images, and linear unmixing algorithms are applied to identify the fluorophores present in each pixel of the image. We describe the application of this method to visualize the dynamics of six different organelles, and to quantify the contacts between organelles. However, this method can be used to image any molecule amenable to tagging with a fluorescent probe. Thus, multispectral live-cell imaging is a powerful tool for systems-level analysis of cellular organization and dynamics. Copyright © 2018 John Wiley & Sons, Inc.
Snapshot Hyperspectral Volumetric Microscopy
NASA Astrophysics Data System (ADS)
Wu, Jiamin; Xiong, Bo; Lin, Xing; He, Jijun; Suo, Jinli; Dai, Qionghai
2016-04-01
The comprehensive analysis of biological specimens brings about the demand for capturing the spatial, temporal and spectral dimensions of visual information together. However, such high-dimensional video acquisition faces major challenges in developing large data throughput and effective multiplexing techniques. Here, we report snapshot hyperspectral volumetric microscopy, which computationally reconstructs hyperspectral profiles for high-resolution volumes of ~1000 μm × 1000 μm × 500 μm at video rate by a novel four-dimensional (4D) deconvolution algorithm. We validated the proposed approach with both numerical simulations for quantitative evaluation and various real experimental results on the prototype system. Different applications such as biological component analysis in bright field and spectral unmixing of multiple fluorescence are demonstrated. The experiments on moving fluorescent beads and GFP-labelled Drosophila larvae indicate the great potential of our method for observing multiple fluorescent markers in dynamic specimens.
Postfire soil burn severity mapping with hyperspectral image unmixing
Robichaud, P.R.; Lewis, S.A.; Laes, D.Y.M.; Hudak, A.T.; Kokaly, R.F.; Zamudio, J.A.
2007-01-01
Burn severity is mapped after wildfires to evaluate immediate and long-term fire effects on the landscape. Remotely sensed hyperspectral imagery has the potential to provide important information about fine-scale ground cover components that are indicative of burn severity after large wildland fires. Airborne hyperspectral imagery and ground data were collected after the 2002 Hayman Fire in Colorado to assess the application of high resolution imagery for burn severity mapping and to compare it to standard burn severity mapping methods. Mixture Tuned Matched Filtering (MTMF), a partial spectral unmixing algorithm, was used to identify the spectral abundance of ash, soil, and scorched and green vegetation in the burned area. The overall performance of the MTMF for predicting the ground cover components was satisfactory (r2 = 0.21 to 0.48) based on a comparison to fractional ash, soil, and vegetation cover measured on ground validation plots. The relationship between Landsat-derived differenced Normalized Burn Ratio (dNBR) values and the ground data was also evaluated (r2 = 0.20 to 0.58) and found to be comparable to the MTMF. However, the quantitative information provided by the fine-scale hyperspectral imagery makes it possible to more accurately assess the effects of the fire on the soil surface by identifying discrete ground cover characteristics. These surface effects, especially soil and ash cover and the lack of any remaining vegetative cover, directly relate to potential postfire watershed response processes. © 2006 Elsevier Inc. All rights reserved.
Spec Tool: an online education and research resource
NASA Astrophysics Data System (ADS)
Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.
2016-06-01
Education and public outreach (EPO) activities related to remote sensing, space, planetary and geophysics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geoscientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better, hands-on understanding of spectroscopy and imaging spectroscopy. It is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool can visualize spectral signatures from the USGS spectral library as well as additional spectra collected at the EPIF, such as those of dunes in southern Israel and Turkmenistan. For researchers and educators, the tool also allows locally collected samples to be loaded for further analysis.
Emission spectra profiling of fluorescent proteins in living plant cells
2013-01-01
Background Fluorescence imaging at high spectral resolution allows the simultaneous recording of multiple fluorophores without switching optical filters, which is especially useful for time-lapse analysis of living cells. The collected emission spectra can be used to distinguish fluorophores by a computational analysis called linear unmixing. The availability of accurate reference spectra for different fluorophores is crucial for this type of analysis. The reference spectra used by plant cell biologists are in most cases derived from the analysis of fluorescent proteins in solution or produced in animal cells, although these spectra are influenced by both the cellular environment and the components of the optical system. For instance, plant cells contain various autofluorescent compounds, such as cell wall polymers and chlorophyll, that affect the spectral detection of some fluorophores. Therefore, it is important to acquire both reference and experimental spectra under the same biological conditions and through the same imaging systems. Results Entry clones (pENTR) of fluorescent proteins (FPs) were constructed in order to create C- or N-terminal protein fusions with the MultiSite Gateway recombination technology. The emission spectra for eight FPs, fused C-terminally to the A- or B-type cyclin-dependent kinases (CDKA;1 and CDKB1;1) and transiently expressed in epidermal cells of tobacco (Nicotiana benthamiana), were determined by using the Olympus FluoView™ FV1000 Confocal Laser Scanning Microscope. These experimental spectra were then used in unmixing experiments in order to separate the emission of fluorophores with overlapping spectral properties in living plant cells. Conclusions Spectral imaging and linear unmixing have a great potential for efficient multicolor detection in living plant cells. The emission spectra for eight of the most commonly used FPs were obtained in epidermal cells of tobacco leaves and used in unmixing experiments. The generated set of FP Gateway entry vectors represents a valuable resource for plant cell biologists. PMID:23552272
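As a hedged illustration of the linear-unmixing step this protocol relies on (not the authors' pipeline), the sketch below fits a measured emission spectrum as a nonnegative combination of reference spectra; the Gaussian reference spectra, channel grid and weights are synthetic placeholders.

```python
# Hedged sketch: a measured emission spectrum is modelled as a nonnegative combination
# of reference spectra recorded under the same imaging conditions. Synthetic data only.
import numpy as np
from scipy.optimize import nnls

channels = np.linspace(450, 650, 32)                     # detection channels (nm)

def gaussian(peak, width=25.0):
    return np.exp(-0.5 * ((channels - peak) / width) ** 2)

# Reference spectra for three hypothetical fluorophores (columns of R).
R = np.column_stack([gaussian(480), gaussian(510), gaussian(560)])

# A pixel containing 20% / 50% / 30% of the three fluorophores, plus noise.
true_weights = np.array([0.2, 0.5, 0.3])
pixel = R @ true_weights + np.random.default_rng(1).normal(0, 0.01, channels.size)

weights, residual = nnls(R, pixel)                       # nonnegative least squares
print(weights.round(2), round(residual, 3))              # weights close to the true mix
```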
NASA Astrophysics Data System (ADS)
Benhalouche, Fatima Zohra; Karoui, Moussa Sofiane; Deville, Yannick; Ouamri, Abdelaziz
2015-10-01
In this paper, a new Spectral-Unmixing-based approach, using Nonnegative Matrix Factorization (NMF), is proposed to locally multi-sharpen hyperspectral data by integrating a Digital Surface Model (DSM) obtained from LIDAR data. In this new approach, the nature of the local mixing model is detected by using the local variance of the object elevations. The hyper/multispectral images are explored using small zones. In each zone, the variance of the object elevations is calculated from the DSM data in this zone. This variance is compared to a threshold value and the adequate linear/linear-quadratic spectral unmixing technique is used in the considered zone to independently unmix hyperspectral and multispectral data, using an adequate linear/linear-quadratic NMF-based approach. The spectral and spatial information thus extracted from the hyperspectral and multispectral images, respectively, is then recombined in the considered zone, according to the selected mixing model. Experiments based on synthetic hyper/multispectral data are carried out to evaluate the performance of the proposed multi-sharpening approach and literature linear/linear-quadratic approaches used on the whole hyper/multispectral data. In these experiments, real DSM data are used to generate synthetic data containing linear and linear-quadratic mixed pixel zones. The DSM data are also used for locally detecting the nature of the mixing model in the proposed approach. Globally, the proposed approach yields good spatial and spectral fidelities for the multi-sharpened data and significantly outperforms the literature methods used.
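A minimal sketch of the zone-wise model selection described above, assuming an illustrative zone size and variance threshold: the DSM variance inside each zone decides whether a linear or a linear-quadratic mixing model is assumed there. The NMF unmixing itself is not reproduced.

```python
# Hedged sketch: per-zone choice between a linear and a linear-quadratic mixing model,
# driven by the variance of DSM elevations. Zone size and threshold are illustrative.
import numpy as np

def select_mixing_models(dsm, zone=8, var_threshold=4.0):
    """dsm: 2-D array of object elevations (m). Returns a per-zone label grid."""
    rows, cols = dsm.shape
    labels = np.empty((rows // zone, cols // zone), dtype=object)
    for i in range(labels.shape[0]):
        for j in range(labels.shape[1]):
            block = dsm[i*zone:(i+1)*zone, j*zone:(j+1)*zone]
            labels[i, j] = "linear" if block.var() < var_threshold else "linear-quadratic"
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    flat = rng.normal(100.0, 0.5, (16, 16))      # flat terrain: little elevation variance
    urban = rng.normal(100.0, 6.0, (16, 16))     # built-up terrain: strong variance
    print(select_mixing_models(np.hstack([flat, urban])))
```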
NASA Astrophysics Data System (ADS)
Varatharajan, I.; D'Amore, M.; Maturilli, A.; Helbert, J.; Hiesinger, H.
2018-04-01
A machine learning approach to spectral unmixing of emissivity spectra of Mercury is carried out using an endmember spectral library measured under simulated daytime surface conditions of Mercury. The study supports the MERTIS payload onboard the ESA/JAXA BepiColombo mission.
Solving for the Surface: An Automated Approach to THEMIS Atmospheric Correction
NASA Astrophysics Data System (ADS)
Ryan, A. J.; Salvatore, M. R.; Smith, R.; Edwards, C. S.; Christensen, P. R.
2013-12-01
Here we present the initial results of an automated atmospheric correction algorithm for the Thermal Emission Imaging System (THEMIS) instrument, whereby high spectral resolution Thermal Emission Spectrometer (TES) data are queried to generate numerous atmospheric opacity values for each THEMIS infrared image. While the pioneering methods of Bandfield et al. [2004] also used TES spectra to atmospherically correct THEMIS data, the algorithm presented here is a significant improvement because of the reduced dependency on user-defined inputs for individual images. Additionally, this technique is particularly useful for correcting THEMIS images that have captured a range of atmospheric conditions and/or surface elevations, issues that have been difficult to correct for using previous techniques. Thermal infrared observations of the Martian surface can be used to determine the spatial distribution and relative abundance of many common rock-forming minerals. This information is essential to understanding the planet's geologic and climatic history. However, the Martian atmosphere also has absorptions in the thermal infrared which complicate the interpretation of infrared measurements obtained from orbit. TES has sufficient spectral resolution (143 bands at 10 cm-1 sampling) to linearly unmix and remove atmospheric spectral end-members from the acquired spectra. THEMIS has the benefit of higher spatial resolution (~100 m/pixel vs. 3x5 km/TES-pixel) but has lower spectral resolution (8 surface sensitive spectral bands). As such, it is not possible to isolate the surface component by unmixing the atmospheric contribution from the THEMIS spectra, as is done with TES. Bandfield et al. [2004] developed a technique using atmospherically corrected TES spectra as tie-points for constant radiance offset correction and surface emissivity retrieval. This technique is the primary method used to correct THEMIS but is highly susceptible to inconsistent results if great care in the selection of TES spectra is not exercised. Our algorithm implements a newly populated TES database that was created using PostgreSQL/PostGIS geospatial database. TES pixels that meet user-defined quality criteria and that intersect a THEMIS observation of interest may be quickly retrieved using this new database. The THEMIS correction process [Bandfield et al. 2004] is then run using all TES pixels that pass an additional set of TES-THEMIS relational quality checks. The result is a spatially correlated set of atmospheric opacity values, determined from the difference between each atmospherically corrected TES pixel and the overlapping portion of the THEMIS image. The dust and ice contributions to the atmospheric opacity are estimated using known dust and ice spectral dependencies [Smith et al. 2003]. These opacity values may be used to determine atmospheric variation across the scene, from which topography- and temperature-scaled atmospheric contribution may be calculated and removed. References: Bandfield, JL et al. [2004], JGR 109, E10008. Smith, MD et al. [2003], JGR 108, E11, 5115.
Pisharady, Pramod Kumar; Sotiropoulos, Stamatios N; Duarte-Carvajalino, Julio M; Sapiro, Guillermo; Lenglet, Christophe
2018-02-15
We present a sparse Bayesian unmixing algorithm BusineX: Bayesian Unmixing for Sparse Inference-based Estimation of Fiber Crossings (X), for estimation of white matter fiber parameters from compressed (under-sampled) diffusion MRI (dMRI) data. BusineX combines compressive sensing with linear unmixing and introduces sparsity to the previously proposed multiresolution data fusion algorithm RubiX, resulting in a method for improved reconstruction, especially from data with a lower number of diffusion gradients. We formulate the estimation of fiber parameters as a sparse signal recovery problem and propose a linear unmixing framework with sparse Bayesian learning for the recovery of sparse signals, the fiber orientations and volume fractions. The data are modeled using a parametric spherical deconvolution approach and represented using a dictionary created with the exponential decay components along different possible diffusion directions. Volume fractions of fibers along these directions define the dictionary weights. The proposed sparse inference, which is based on the dictionary representation, considers the sparsity of fiber populations and exploits the spatial redundancy in data representation, thereby facilitating inference from under-sampled q-space. The algorithm improves parameter estimation from dMRI through data-dependent local learning of hyperparameters, at each voxel and for each possible fiber orientation, that moderate the strength of priors governing the parameter variances. Experimental results on synthetic and in-vivo data show improved accuracy with a lower uncertainty in fiber parameter estimates. BusineX resolves a higher number of second and third fiber crossings. For under-sampled data, the algorithm is also shown to produce more reliable estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
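The following is a hedged stand-in rather than the BusineX algorithm itself: the paper uses sparse Bayesian learning, but the core idea of recovering a few nonnegative volume fractions over a dictionary of decay components can be illustrated with a positivity-constrained Lasso on a random placeholder dictionary.

```python
# Hedged illustration (not the authors' method): sparse, nonnegative recovery of a few
# dictionary weights from an under-sampled measurement, via a positive Lasso.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n_meas, n_atoms = 40, 200                        # measurements vs. candidate orientations
D = rng.normal(size=(n_meas, n_atoms))           # placeholder dictionary (a real one would
                                                 # hold decay profiles along orientations)
truth = np.zeros(n_atoms)
truth[[17, 93]] = [0.6, 0.4]                     # two crossing fibers with volume fractions
signal = D @ truth + rng.normal(0, 0.01, n_meas)

model = Lasso(alpha=0.01, positive=True, fit_intercept=False, max_iter=10000)
model.fit(D, signal)
support = np.flatnonzero(model.coef_ > 0.05)
print(support, model.coef_[support].round(2))    # nonzero weights concentrate on 17 and 93
```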
Spectral mineral mapping for characterization of subtle geothermal prospects using ASTER data
NASA Astrophysics Data System (ADS)
Abubakar, A. J.; Hashim, M.; Pour, A. B.
2017-05-01
In this study, the performance of ASTER data is evaluated for mapping subtle geothermal prospects in an unexplored tropical region containing a number of thermal springs. The study employed a simple decorrelation stretch with specific absorptions to highlight possible alteration zones of interest related to geothermal (GT) systems. Hydrothermal alteration minerals are subsequently mapped using Spectral Angle Mapper (SAM) and Linear Spectral Unmixing (LSU) algorithms to target representative minerals such as clays, carbonates and Al-OH minerals as indicators of GT activity. The results were validated through a field GPS survey, rock sampling and laboratory analysis using the latest SmartLab X-ray diffractometer technology. The study indicates that ASTER broadband satellite data could be used to map subtle GT prospects with the aid of in-situ verification. However, it also shows that ASTER could not discriminate between individual mineral species, especially for clays, using the SWIR bands. Subsequent studies will examine both ASTER and Hyperion hyperspectral data in the same area, as this could have significant implications for GT resource detection in unmapped, aseismic and inaccessible tropical regions using available spaceborne data.
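A hedged sketch of the Spectral Angle Mapper rule used in the study: each pixel is assigned to the reference mineral whose spectrum subtends the smallest angle. The "clay-like" and "carbonate-like" reference spectra below are synthetic placeholders, not library spectra.

```python
# Hedged sketch of SAM classification with synthetic SWIR reference spectra.
import numpy as np

def spectral_angle(x, r):
    """Angle (radians) between a pixel spectrum x and a reference spectrum r."""
    cos = np.dot(x, r) / (np.linalg.norm(x) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(pixel, references):
    angles = {name: spectral_angle(pixel, r) for name, r in references.items()}
    return min(angles, key=angles.get), angles

bands = np.linspace(2.0, 2.5, 60)                          # SWIR wavelengths (µm)
references = {
    "clay-like":      1.0 - 0.4 * np.exp(-((bands - 2.20) / 0.02) ** 2),
    "carbonate-like": 1.0 - 0.4 * np.exp(-((bands - 2.33) / 0.02) ** 2),
}
pixel = 0.8 * references["clay-like"] + 0.01 * np.random.default_rng(4).normal(size=bands.size)
print(sam_classify(pixel, references)[0])                  # -> "clay-like"
```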
NASA Astrophysics Data System (ADS)
Traganos, D.; Cerra, D.; Reinartz, P.
2017-05-01
Seagrasses are one of the most productive and widespread yet threatened coastal ecosystems on Earth. Despite their importance, they are declining due to various threats, which are mainly anthropogenic. Lack of data on their distribution hinders any effort to rectify this decline through effective detection, mapping and monitoring. Remote sensing can mitigate this data gap by allowing retrospective quantitative assessment of seagrass beds over large and remote areas. In this paper, we evaluate the quantitative application of Planet high-resolution imagery for the detection of seagrasses in the Thermaikos Gulf, NW Aegean Sea, Greece. The low Signal-to-Noise Ratio (SNR), which characterizes spectral bands at shorter wavelengths, prompts the application of Unmixing-Based Denoising (UBD) as a pre-processing step for seagrass detection. A total of 15 spectral-temporal patterns is extracted from a Planet image time series to restore the corrupted blue and green bands in the processed Planet image. Subsequently, we implement Lyzenga's empirical water column correction and Support Vector Machines (SVM) to evaluate the quantitative benefits of denoising. Denoising aids detection of the Posidonia oceanica seagrass species by increasing its producer and user accuracy by 31.7% and 10.4%, respectively, with a corresponding increase in its Kappa value from 0.3 to 0.48. In the near future, our objective is to improve accuracies in seagrass detection by applying more sophisticated, analytical water column correction algorithms to Planet imagery, developing time- and cost-effective monitoring of seagrass distribution that will in turn enable the effective management and conservation of these highly valuable and productive ecosystems.
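A minimal sketch of Lyzenga's empirical water-column correction mentioned above, following the textbook depth-invariant-index formulation and assuming synthetic radiances: log-transformed bands over a uniform bottom at varying depth are combined using the estimated ratio of effective attenuation coefficients.

```python
# Hedged sketch of a depth-invariant index (Lyzenga-style correction); synthetic data.
import numpy as np

def depth_invariant_index(blue, green, deep_blue, deep_green, sand_mask):
    """blue/green: radiance bands; deep_*: deep-water means; sand_mask: pixels of a
    uniform bottom at varying depth, used to estimate the attenuation ratio."""
    xb = np.log(np.clip(blue - deep_blue, 1e-6, None))
    xg = np.log(np.clip(green - deep_green, 1e-6, None))
    var_b, var_g = xb[sand_mask].var(), xg[sand_mask].var()
    cov = np.cov(xb[sand_mask], xg[sand_mask])[0, 1]
    a = (var_b - var_g) / (2.0 * cov)
    k_ratio = a + np.sqrt(a * a + 1.0)           # estimated k_blue / k_green
    return xb - k_ratio * xg

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    depth = rng.uniform(1, 10, 1000)
    blue = 0.05 + 0.5 * np.exp(-2 * 0.08 * depth) + rng.normal(0, 0.002, 1000)
    green = 0.04 + 0.5 * np.exp(-2 * 0.12 * depth) + rng.normal(0, 0.002, 1000)
    idx = depth_invariant_index(blue, green, 0.05, 0.04, np.ones(1000, dtype=bool))
    print(idx.std().round(3))   # much smaller spread than the depth-driven raw log bands
```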
A FPGA implementation for linearly unmixing a hyperspectral image using OpenCL
NASA Astrophysics Data System (ADS)
Guerra, Raúl; López, Sebastián.; Sarmiento, Roberto
2017-10-01
Hyperspectral imaging systems provide images in which single pixels have information from across the electromagnetic spectrum of the scene under analysis. These systems divide the spectrum into many contiguous channels, which may even lie outside the visible part of the spectrum. The main advantage of hyperspectral imaging technology is that certain objects leave unique fingerprints in the electromagnetic spectrum, known as spectral signatures, which make it possible to distinguish between materials that may look the same in a traditional RGB image. Accordingly, the most important hyperspectral imaging applications are related to distinguishing or identifying materials in a particular scene. In hyperspectral imaging applications under real-time constraints, the huge amount of information provided by the hyperspectral sensors has to be rapidly processed and analysed. For such purposes, parallel hardware devices, such as Field Programmable Gate Arrays (FPGAs), are typically used. However, developing hardware applications typically requires expertise in the specific targeted device, as well as in the tools and methodologies that can be used to implement the desired algorithms on that device. In this scenario, the Open Computing Language (OpenCL) emerges as a very interesting solution in which a single high-level synthesis design language can be used to efficiently develop applications on multiple, different hardware devices. In this work, the Fast Algorithm for Linearly Unmixing Hyperspectral Images (FUN) has been implemented on a Bitware Stratix V Altera FPGA using OpenCL. The obtained results demonstrate the suitability of OpenCL as a viable design methodology for quickly creating efficient FPGA designs for real-time hyperspectral imaging applications.
A cross-comparison of field, spectral, and lidar estimates of forest canopy cover
Alistair M. S. Smith; Michael J. Falkowski; Andrew T. Hudak; Jeffrey S. Evans; Andrew P. Robinson; Caiti M. Steele
2010-01-01
A common challenge when comparing forest canopy cover and similar metrics across different ecosystems is that there are many field- and landscape-level measurement methods. This research conducts a cross-comparison and evaluation of forest canopy cover metrics produced using unmixing of reflective spectral satellite data, light detection and ranging (lidar) data, and...
NASA Astrophysics Data System (ADS)
Liu, Zhaoxin; Zhao, Liaoying; Li, Xiaorun; Chen, Shuhan
2018-04-01
Owing to the limitation of the spatial resolution of the imaging sensor and the variability of ground surfaces, mixed pixels are widespread in hyperspectral imagery. The traditional subpixel mapping algorithms treat all mixed pixels as boundary-mixed pixels while ignoring the existence of linear subpixels. To address this problem, this paper proposes a new subpixel mapping method based on linear subpixel feature detection and object optimization. Firstly, the fraction value of each class is obtained by spectral unmixing. Secondly, linear subpixel features are pre-determined based on the hyperspectral characteristics and linear subpixel feature detection, and the remaining mixed pixels are identified based on maximum linearization index analysis. The classes of linear subpixels are determined by using a template matching method. Finally, the whole subpixel mapping result is iteratively optimized by a binary particle swarm optimization algorithm. The performance of the proposed subpixel mapping method is evaluated via experiments based on simulated and real hyperspectral data sets. The experimental results demonstrate that the proposed method can improve the accuracy of subpixel mapping.
Spatial-spectral preprocessing for endmember extraction on GPU's
NASA Astrophysics Data System (ADS)
Jimenez, Luis I.; Plaza, Javier; Plaza, Antonio; Li, Jun
2016-10-01
Spectral unmixing is focused on the identification of spectrally pure signatures, called endmembers, and their corresponding abundances in each pixel of a hyperspectral image. Mainly focused on the spectral information contained in the hyperspectral images, endmember extraction techniques have recently included spatial information to achieve more accurate results. Several algorithms have been developed for automatic or semi-automatic identification of endmembers using spatial and spectral information, including the spectral-spatial endmember extraction (SSEE) method where, within a preprocessing step in the technique, both sources of information are extracted from the hyperspectral image and equally used for this purpose. Previous works have implemented the SSEE technique in four main steps: 1) local eigenvector calculation in each sub-region into which the original hyperspectral image is divided; 2) computation of the maximum and minimum projections of all eigenvectors over the entire hyperspectral image in order to obtain a set of candidate pixels; 3) expansion and averaging of the signatures of the candidate set; 4) ranking based on the spectral angle distance (SAD). The result of this method is a list of candidate signatures from which the endmembers can be extracted using various spectral-based techniques, such as orthogonal subspace projection (OSP), vertex component analysis (VCA) or N-FINDR. Considering the large volume of data and the complexity of the calculations, there is a need for efficient implementations. Latest-generation hardware accelerators such as commodity graphics processing units (GPUs) offer a good chance for improving the computational performance in this context. In this paper, we develop two different implementations of the SSEE algorithm using GPUs. Both are based on the eigenvector computation within each sub-region of the first step, one using the singular value decomposition (SVD) and another one using principal component analysis (PCA). Based on our experiments with hyperspectral data sets, high computational performance is observed in both cases.
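A hedged CPU sketch of the first two SSEE steps summarized above (local eigenvectors per sub-region via SVD, then global max/min projections to collect candidate pixels); the block size and random cube are illustrative, and the GPU kernels described in the paper are not reproduced.

```python
# Hedged sketch of SSEE steps 1-2: per-sub-region SVD, then extrema of global projections.
import numpy as np

def ssee_candidates(cube, block=16, n_vec=3):
    """cube: (rows, cols, bands) hyperspectral image. Returns candidate pixel indices."""
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands)
    flat_c = flat - flat.mean(axis=0)
    candidates = set()
    for r0 in range(0, rows, block):
        for c0 in range(0, cols, block):
            sub = cube[r0:r0+block, c0:c0+block].reshape(-1, bands)
            sub = sub - sub.mean(axis=0)
            _, _, vt = np.linalg.svd(sub, full_matrices=False)   # step 1: local eigenvectors
            proj = flat_c @ vt[:n_vec].T                         # step 2: global projections
            candidates.update(proj.argmax(axis=0))               # extreme pixels become
            candidates.update(proj.argmin(axis=0))               # endmember candidates
    return sorted(candidates)

if __name__ == "__main__":
    cube = np.random.default_rng(6).random((32, 32, 50))
    print(len(ssee_candidates(cube)))       # a small candidate set out of 1024 pixels
```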
NASA Technical Reports Server (NTRS)
Lederer, Susan
2017-01-01
NASA's ODPO has recently collected data of unresolved objects at GEO with the 3.8m UKIRT infrared telescope on Mauna Kea and the 1.3m MCAT visible telescope on Ascension Island. Analyses of SWIR data of rocket bodies and HS-376 solar-panel-covered buses demonstrate the uniqueness of spectral signatures. Data of 3 classes of rocket bodies show similarities within a given class, but distinct differences from one class to another, suggesting that infrared reflectance spectra could effectively be used toward characterizing and constraining potential parent bodies of uncorrelated targets (UCTs). The Optical Measurements Center (OMC) at NASA JSC is designed to collect photometric signatures in the laboratory that can be used for comparison with telescopic data. NASA also has a spectral database of spacecraft materials for use with spectral unmixing models. Spectral unmixing of the HS-376 bus data demonstrates how absorption features and slopes can be used to constrain material characteristics of debris. Broadband photometry likewise can be compared with MCAT data of non-resolved debris images. Similar studies have been applied to IDCSP satellites to demonstrate how color-color photometry can be compared with lab data to constrain bulk material signatures of spacecraft and debris.
NASA Astrophysics Data System (ADS)
Guillemot, Mathilde; Midahuen, Rony; Archeny, Delpine; Fulchiron, Corine; Montvernay, Regis; Perrin, Guillaume; Leroux, Denis F.
2016-04-01
BioMérieux is automating the microbiology laboratory in order to reduce cost (less manpower and fewer consumables), to improve performance (increased sensitivity, machine algorithms) and to gain traceability through optimization of the clinical laboratory workflow. In this study, we evaluate the potential of Hyperspectral imaging (HSI) as a substitute for human visual observation when performing the task of microbiological culture interpretation. Microbial colonies from 19 strains subcategorized into 6 chromogenic classes were analyzed after a 24h growth on a chromogenic culture medium (chromID® CPS Elite, bioMérieux, France). The HSI analysis was performed in the VNIR region (400-900 nm) using a linescan configuration. Using algorithms relying on Linear Spectral Unmixing, and using exclusively Diffuse Reflectance Spectra (DRS) as input data, we report interclass classification accuracies of 100% with a fully automatable approach and no use of morphological information. In order to eventually simplify the instrument, the performance of degraded DRS was also evaluated using only the 14 most discriminant spectral channels (a model for a multispectral approach) or 3 channels (a model of an RGB image). The overall classification performance remains unchanged for our multispectral model but is degraded for the predicted RGB model, hinting that a multispectral solution might provide the answer for improved colony recognition.
NASA Astrophysics Data System (ADS)
Wendt, L.; Gross, C.; McGuire, P. C.; Combe, J.-P.; Neukum, G.
2009-04-01
Juventae Chasma, just north of Valles Marineris on Mars, contains several light-toned deposits (LTD), one of which is labelled mound B. Based on IR data from the imaging spectrometer OMEGA on Mars Express, [1] suggested kieserite for the lower part and gypsum for the upper part of the mound. In this study, we analyzed NIR data from the Compact Reconnaissance Imaging Spectrometer CRISM on MRO with the Multiple-Endmember Linear Spectral Unmixing Model MELSUM developed by Combe et al. [2]. We used CRISM data product FRT00009C0A from 1 to 2.6 µm. A novel, time-dependent volcano-scan technique [3] was applied to remove absorption bands related to CO2 much more effectively than the volcano-scan technique [4] that has been applied to CRISM and OMEGA data so far. In classic SMA, a solution for the measured spectrum is calculated by a linear combination of all input spectra (which may come from a spectral library or from the image itself) at once. This can lead to negative coefficients, which have no physical meaning. MELSUM avoids this by calculating a solution for each possible combination of a subset of the reference spectra, with the maximum number of library spectra in the subset defined by the user. The solution with the lowest residual to the input spectrum is returned. We first used MELSUM as a similarity measure within the image by using averaged spectra from the image itself as input to MELSUM. This showed that three spectral units are enough to describe the variability in the data to first order: a lower, light-toned unit, an upper light-toned unit and a dark-toned unit. We then chose 34 laboratory spectra of sulfates, mafic minerals and iron oxides plus a spectrum for H2O ice as reference spectra for the unmixing of averaged spectra for each of these spectral regions. The best fit for the dark material was a combination of olivine, pyroxene and ice (present as cloud in the atmosphere and not on the surface). In agreement with [5], the lower unit was best modeled by a mix of the monohydrated sulfates szomolnokite and kieserite plus olivine and ice. The upper unit fits best with a combination of romerite and rozenite (two polyhydrated iron sulfates), olivine and ice. Gypsum is not present. The excellent fit between modeled and measured spectra demonstrates the effectiveness of MELSUM as a tool to analyze hyperspectral data from CRISM. This research has been supported by the Helmholtz Association through the research alliance "Planetary Evolution and Life" and the German Space Agency under the Mars Express programme. References: [1] Gendrin, A. et al. (2005), Science, 307(5751), 1587-1591. [2] Combe, J.-P. et al. (2008), PSS, 56, 951-975. [3] McGuire et al. (2009, in preparation), "A new volcano-scan algorithm for atmospheric correction of CRISM and OMEGA spectral data". [4] Langevin et al. (2005), Science, 307(5715), 1584-1586. [5] Bishop, J. L. et al. (2008), LPSC XXXIX, #1391.
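A hedged sketch of the MELSUM search strategy described above: every combination of at most a user-set number of library spectra is fitted with nonnegative coefficients, and the combination with the lowest residual is kept. The Gaussian "library" spectra below are placeholders, not laboratory spectra.

```python
# Hedged sketch of a MELSUM-like exhaustive subset search with nonnegative coefficients.
import numpy as np
from itertools import combinations
from scipy.optimize import nnls

def melsum_like_fit(spectrum, library, max_members=3):
    """spectrum: (bands,); library: dict name -> (bands,) reference spectrum."""
    names = list(library)
    best = (np.inf, None, None)
    for k in range(1, max_members + 1):
        for subset in combinations(names, k):
            A = np.column_stack([library[n] for n in subset])
            coeffs, resid = nnls(A, spectrum)        # nonnegative coefficients only
            if resid < best[0]:
                best = (resid, subset, coeffs)
    return best

if __name__ == "__main__":
    bands = np.linspace(1.0, 2.6, 80)
    lib = {n: 1 - a * np.exp(-((bands - c) / 0.05) ** 2)
           for n, a, c in [("kieserite", .3, 2.1), ("szomolnokite", .3, 2.4),
                           ("olivine", .2, 1.05), ("gypsum", .35, 1.75)]}
    mix = 0.5 * lib["kieserite"] + 0.3 * lib["olivine"]
    resid, subset, coeffs = melsum_like_fit(mix, lib, max_members=2)
    print(subset, coeffs.round(2))                   # lowest-residual combination wins
```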
NASA Astrophysics Data System (ADS)
Arnold, Thomas; De Biasio, Martin; Leitner, Raimund
2015-06-01
Two problems are addressed in this paper: (i) fluorescent marker-based and (ii) marker-free discrimination between healthy and cancerous human tissues. For both applications the performance of hyper-spectral methods is quantified. Fluorescent marker-based tissue classification uses a number of fluorescent markers to dye specific parts of a human cell. The challenge is that the emission spectra of the fluorescent dyes overlap considerably. They are, furthermore, disturbed by the inherent auto-fluorescence of human tissue. This results in ambiguities and decreased image contrast, causing difficulties for the treatment decision. The higher spectral resolution introduced by tunable-filter-based spectral imaging in combination with spectral unmixing techniques results in an improvement of the image contrast and therefore more reliable information for the physician to make the treatment decision. Marker-free tissue classification is based solely on the subtle spectral features of human tissue without the use of artificial markers. The challenge in this case is that the spectral differences between healthy and cancerous tissues are subtle and embedded in intra- and inter-patient variations of these features. The contributions of this paper are (i) the evaluation of hyper-spectral imaging in combination with spectral unmixing techniques for fluorescence marker-based tissue classification, and (ii) the evaluation of spectral imaging for marker-free intra-surgery tissue classification. Within this paper, we consider real hyper-spectral fluorescence and endoscopy data sets to emphasize the practical capability of the proposed methods. It is shown that the combination of spectral imaging with multivariate statistical methods can improve the sensitivity and specificity of the detection and the staging of cancerous tissues compared to standard procedures.
Watson, K.; Rowan, L.C.; Bowers, T.L.; Anton-Pacheco, C.; Gumiel, P.; Miller, S.H.
1996-01-01
Airborne thermal-infrared multispectral scanner (TIMS) data of the Iron Hill carbonatite-alkalic igneous rock complex in south-central Colorado are analyzed using a new spectral emissivity ratio algorithm and confirmed by field examination using existing 1:24 000-scale geologic maps and petrographic studies. Color composite images show that the alkalic rocks could be clearly identified and that differences existed among alkalic rocks in several parts of the complex. An unsupervised classification algorithm defines four alkalic rock classes within the complex: biotitic pyroxenite, uncompahgrite, augitic pyroxenite, and fenite + nepheline syenite. Felsic rock classes defined in the surrounding country rock are an extensive class consisting of tuff, granite, and felsite, a less extensive class of granite and felsite, and quartzite. The general composition of the classes can be determined from comparisons of the TIMS spectra with laboratory spectra. Carbonatite rocks are not classified, and we attribute that to the fact that dolomite, the predominant carbonate mineral in the complex, has a spectral feature that falls between TIMS channels 5 and 6. Mineralogical variability in the fenitized granite contributed to the nonuniform pattern of the fenite-nepheline syenite class. The biotitic pyroxenite, which resulted from alteration of the pyroxenite, is spatially associated with, and appears to be related to, narrow carbonatite dikes and sills. Results from a linear unmixing algorithm suggest that the detected spatial extent of the two mixed felsic rock classes was sensitive to the amount of vegetation cover. These results illustrate that spectral thermal infrared data can be processed to yield compositional information that can be a cost-effective tool to target mineral exploration, particularly in igneous terranes.
Quantifying the Components of Impervious Surfaces
Tilley, Janet S.; Slonecker, E. Terrence
2006-01-01
This study's objectives were to (1) determine the relative contribution of the individual components of impervious surfaces by collecting digital information from high-resolution imagery (1-meter or better); and (2) determine which of the more advanced techniques, such as spectral unmixing or the application of coefficients to land use or land cover data, was the most suitable method for State and local governments as well as Federal agencies to efficiently measure imperviousness in any given watershed or area of interest. The components of impervious surfaces, combined across all the watersheds and time periods from objective one, were: buildings 29.2 percent, roads 28.3 percent, and parking lots 24.6 percent, with the remaining three components - driveways, sidewalks, and other (any features not contained within the first five) - totaling 14 percent. For objective two, spectral unmixing techniques will ultimately be the most efficient method of determining imperviousness, but they are not yet accurate enough: it is critical to achieve accuracy within 10 percent of the truth, which the method did not consistently accomplish in this study. Of the three coefficient-application techniques tested, land use coefficient application was not practical, while if the last two methods, coefficients applied to land cover data, were merged, their end results could be within 5 percent or better of the truth. Until the spectral unmixing technique has been further refined, land cover coefficients should be used; they offer quick results, though not current ones, as they were developed for the 1992 National Land Characteristics Data.
NASA Astrophysics Data System (ADS)
Roy, Ankita
2007-12-01
This research uses hyperspectral imaging to recognize targets through spatial and spectral matching and spectral unmixing of data ranging from remote sensing to medical imaging kernels for clinical studies, based on hyperspectral data sets generated using the VFTHSI [Visible Fourier Transform Hyperspectral Imager], whose high-resolution Si detector makes the analysis achievable. The research may be broadly classified into (I) a Physically Motivated Correlation Formalism (PMCF), which places both spatial and spectral data on an equivalent mathematical footing in the context of a specific kernel; (II) an application to RF plasma species detection during the carbon nanotube growing process; and (III) hyperspectral analysis for assessing the density and distribution of retinopathies such as age-related macular degeneration (ARMD), with error estimation enabling the early recognition of ARMD, which is treated as an ill-conditioned inverse imaging problem. The broad statistical scope of this research is twofold: target recognition problems and spectral unmixing problems. All processes involve experimental and computational analysis of hyperspectral data sets acquired with an instrument based on the principle of a Sagnac interferometer, calibrated to obtain high SNR levels. PMCF computes spectral/spatial/cross moments, answers the question of how optimally the entire hypercube should be sampled, and finds how many spatial-spectral pixels are required for a particular target recognition. Spectral analysis of RF plasma radicals, typically methane and argon plasmas, using the VFTHSI has enabled better process monitoring during growth of vertically aligned multi-walled carbon nanotubes by instantly registering temporal changes in chemical composition or density, which is key since a significant correlation can be found between plasma state and structural properties. A vital focus of this dissertation is medical hyperspectral imaging applied to retinopathies such as age-related macular degeneration, with targets taken using a fundus imager, which is akin to the VFTHSI. Detection of the constituent components in the diseased hyper-pigmentation area is also computed. The target or reflectance matrix is treated as a highly ill-conditioned spectral unmixing problem, to which methodologies such as inverse techniques, principal component analysis (PCA) and receiver operating curves (ROC) are applied for precise spectral recognition of the affected area. The region containing ARMD was easily distinguishable in the spectral mesh plots over the entire band-pass area. Once the location was detected, the PMCF coefficients were calculated by cross-correlating a target of normal oxygenated retina with the deoxygenated one. The ROCs generated using PMCF show 30% higher detection probability, with improved accuracy, than ROCs based on the Spectral Angle Mapper (SAM). By spectral unmixing methods, the important endmembers/carotenoids of the MD pigment were found to be xanthophyll and lutein, while beta-carotene, which showed a negative correlation in the unconstrained inverse problem, is a supplement given to ARMD patients to prevent the disease and does not occur in the eye. The literature also shows degeneration of meso-zeaxanthin. Ophthalmologists may assert the presence of ARMD and commence the diagnosis process if the xanthophyll pigment has degenerated by 89.9% and the lutein has decayed by almost 80%, as deduced computationally.
This research takes the investigation a step further in the continuing effort to improve clinical findings by correlating them with the microanatomy of the diseased fovea, and it shows promise for early detection of this disease.
Parallel-multiplexed excitation light-sheet microscopy (Conference Presentation)
NASA Astrophysics Data System (ADS)
Xu, Dongli; Zhou, Weibin; Peng, Leilei
2017-02-01
Laser scanning light-sheet imaging allows fast 3D imaging of live samples with minimal bleaching and photo-toxicity. Existing light-sheet techniques have very limited capability for multi-label imaging. Hyper-spectral imaging is needed to unmix commonly used fluorescent proteins with large spectral overlaps. However, the challenge is how to perform hyper-spectral imaging without sacrificing imaging speed, so that dynamic and complex events can be captured live. We report wavelength-encoded structured illumination light sheet imaging (λ-SIM light-sheet), a novel light-sheet technique that is capable of parallel multiplexing in multiple excitation-emission spectral channels. λ-SIM light-sheet captures images of all possible excitation-emission channels in true parallel. It does not require compromising the imaging speed and is capable of distinguishing labels by both excitation and emission spectral properties, which facilitates unmixing fluorescent labels with overlapping spectral peaks and will allow more labels to be used together. We build a hyper-spectral light-sheet microscope that combines λ-SIM with an extended field of view through Bessel beam illumination. The system has a 250-micron-wide field of view and confocal-level resolution. The microscope, equipped with multiple laser lines and an unlimited number of spectral channels, can potentially image up to 6 commonly used fluorescent proteins from blue to red. Results from in vivo imaging of live zebrafish embryos expressing various genetic markers and sensors will be shown. Hyper-spectral images from λ-SIM light-sheet will allow multiplexed and dynamic functional imaging in live tissue and animals.
Unmixing the Materials and Mechanics Contributions in Non-resolved Object Signatures
2008-09-01
A factorization technique is used to extract the temporal variation of material abundances from hyperspectral or multi-spectral time-resolved signatures. A Fourier analysis of the temporal variation of material abundance provides ... approximately one hundred wavelengths in the visible spectrum. The frame rate for the instrument was not large enough to collect time-resolved data.
NASA Astrophysics Data System (ADS)
Masalmah, Yahya M.; Vélez-Reyes, Miguel
2007-04-01
The authors proposed in previous papers the use of the constrained Positive Matrix Factorization (cPMF) to perform unsupervised unmixing of hyperspectral imagery. Two iterative algorithms were proposed to compute the cPMF, based on the Gauss-Seidel and penalty approaches to solving optimization problems. Results presented in previous papers have shown the potential of the proposed method to perform unsupervised unmixing in HYPERION and AVIRIS imagery. The performance of iterative methods is highly dependent on the initialization scheme. The initialization can affect the convergence speed, whether or not a global minimum is found, and whether or not spectra with physical relevance are retrieved as endmembers. In this paper, different initializations using random selection, longest-norm pixels, and standard endmember selection routines are studied and compared using simulated and real data.
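A hedged sketch of the initialization comparison discussed above. The cPMF solvers (Gauss-Seidel and penalty) are not reproduced; a plain multiplicative-update NMF stands in, so only the effect of the starting endmember matrix (random pixels versus longest-norm pixels) is illustrated.

```python
# Hedged sketch: comparing two initialization schemes for an NMF-style unmixing.
import numpy as np

def nmf(X, S0, A0, iters=200, eps=1e-9):
    """Minimal Lee-Seung multiplicative updates: X ~ S @ A, all factors nonnegative."""
    S, A = S0.copy(), A0.copy()
    for _ in range(iters):
        A *= (S.T @ X) / (S.T @ S @ A + eps)
        S *= (X @ A.T) / (S @ A @ A.T + eps)
    return S, A, np.linalg.norm(X - S @ A)

rng = np.random.default_rng(7)
bands, pixels, k = 50, 400, 4
X = np.abs(rng.normal(size=(bands, k))) @ rng.dirichlet(np.ones(k), pixels).T

A0 = np.full((k, pixels), 1.0 / k)                                 # uniform abundances
random_init = X[:, rng.choice(pixels, k, replace=False)]           # random pixels
longest_norm = X[:, np.argsort(np.linalg.norm(X, axis=0))[-k:]]    # largest-norm pixels

for name, S0 in [("random", random_init), ("longest-norm", longest_norm)]:
    print(name, nmf(X, S0, A0)[2].round(3))       # final reconstruction error per scheme
```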
Multispectral image restoration of historical documents based on LAAMs and mathematical morphology
NASA Astrophysics Data System (ADS)
Lechuga-S., Edwin; Valdiviezo-N., Juan C.; Urcid, Gonzalo
2014-09-01
This research introduces an automatic technique designed for the digital restoration of the damaged parts in historical documents. For this purpose an imaging spectrometer is used to acquire a set of images in the wavelength interval from 400 to 1000 nm. Assuming the presence of linearly mixed spectral pixels registered in the multispectral image, our technique uses two lattice autoassociative memories to extract the set of pure pigments composing a given document. Through a spectral unmixing analysis, our method produces fractional abundance maps indicating the distribution of each pigment in the scene. These maps are then used to locate cracks and holes in the document under study. The restoration process is performed by the application of a region filling algorithm, based on morphological dilation, followed by a color interpolation to restore the original appearance of the filled areas. This procedure has been successfully applied to the analysis and restoration of three multispectral data sets: two corresponding to artificially superimposed scripts and one real data set acquired from a Mexican pre-Hispanic codex, whose restoration results are presented.
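A minimal sketch of the dilation-based region filling mentioned above, assuming a synthetic binary boundary mask: starting from a seed inside a hole, the region grows by conditional dilation until it stops changing, after which the filled pixels could be handed to the colour interpolation step.

```python
# Hedged sketch of morphological region filling by iterative conditional dilation.
import numpy as np
from scipy.ndimage import binary_dilation

def fill_region(boundary_mask, seed):
    """boundary_mask: True where the boundary/ink is; seed: (row, col) inside a hole."""
    filled = np.zeros_like(boundary_mask)
    filled[seed] = True
    while True:
        grown = binary_dilation(filled) & ~boundary_mask   # X_k = dilate(X_{k-1}) ∩ A^c
        if np.array_equal(grown, filled):
            return filled | boundary_mask                  # filled hole plus its boundary
        filled = grown

if __name__ == "__main__":
    mask = np.zeros((9, 9), dtype=bool)
    mask[2:7, 2] = mask[2:7, 6] = mask[2, 2:7] = mask[6, 2:7] = True   # a closed square
    print(fill_region(mask, (4, 4)).astype(int))
```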
Subpixel target detection and enhancement in hyperspectral images
NASA Astrophysics Data System (ADS)
Tiwari, K. C.; Arora, M.; Singh, D.
2011-06-01
Hyperspectral data, due to the higher information content afforded by higher spectral resolution, are increasingly being used for various remote sensing applications, including information extraction at the subpixel level. There is, however, usually a lack of matching fine-spatial-resolution data, particularly for target detection applications. Thus, there always exists a tradeoff between the spectral and spatial resolutions due to considerations of the type of application, its cost and other associated analytical and computational complexities. Typically, whenever an object (manmade, natural or any ground cover class, called a target, endmember, component or class) is spectrally but not spatially resolved, mixed pixels result in the image. Thus, numerous disparate manmade and/or natural substances may occur inside such mixed pixels, giving rise to mixed pixel classification or subpixel target detection problems. Various spectral unmixing models such as Linear Mixture Modeling (LMM) are in vogue to recover the components of a mixed pixel. Spectral unmixing outputs both the endmember spectra and their corresponding abundance fractions inside the pixel. It does not, however, provide the spatial distribution of these abundance fractions within a pixel. This limits the applicability of hyperspectral data for subpixel target detection. In this paper, a new inverse Euclidean distance based super-resolution mapping method is presented that achieves subpixel target detection in hyperspectral images by adjusting the spatial distribution of abundance fractions within a pixel. Results obtained at different resolutions indicate that super-resolution mapping may effectively aid subpixel target detection.
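A hedged, illustrative reading of the inverse-Euclidean-distance idea described above (the paper's exact formulation may differ): within a coarse mixed pixel, sub-pixels that are closer, in inverse-distance-weighted terms, to target-rich neighbours receive the target class first. The scale, neighbourhood and data below are made up.

```python
# Hedged sketch: distribute a coarse pixel's target fraction to sub-pixels by
# inverse-Euclidean-distance attraction toward target-rich neighbouring pixels.
import numpy as np

def superresolve_pixel(fractions, center, scale=4):
    """fractions: coarse abundance map of the target class; center: (row, col) of the
    pixel to sharpen. Returns a (scale, scale) boolean sub-pixel map."""
    r, c = center
    n_target = int(round(fractions[r, c] * scale * scale))
    attraction = np.zeros((scale, scale))
    for i in range(scale):
        for j in range(scale):
            sub = np.array([r + (i + 0.5) / scale, c + (j + 0.5) / scale])
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if (dr, dc) == (0, 0):
                        continue
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < fractions.shape[0] and 0 <= nc < fractions.shape[1]:
                        dist = np.linalg.norm(sub - (np.array([nr, nc]) + 0.5))
                        attraction[i, j] += fractions[nr, nc] / dist
    order = np.argsort(attraction.ravel())[::-1][:n_target]
    out = np.zeros(scale * scale, dtype=bool)
    out[order] = True
    return out.reshape(scale, scale)

if __name__ == "__main__":
    frac = np.array([[1.0, 0.5, 0.0],
                     [1.0, 0.5, 0.0],
                     [1.0, 0.5, 0.0]])
    print(superresolve_pixel(frac, (1, 1)).astype(int))   # the target half hugs the left edge
```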
Monitoring intracellular oxidative events using dynamic spectral unmixing microscopy
There is increasing interest in using live-cell imaging to monitor not just individual intracellular endpoints, but to investigate the interplay between multiple molecular events as they unfold in real time within the cell. A major impediment to simultaneous acquisition of multip...
NASA Astrophysics Data System (ADS)
Wamser, Kyle
Hyperspectral imagery and the corresponding ability to conduct analysis below the pixel level have tremendous potential to aid in landcover monitoring. During large ecosystem restoration projects, being able to monitor specific aspects of the recovery over large and often inaccessible areas under constrained finances is a major challenge. The Civil Air Patrol's Airborne Real-time Cueing Hyperspectral Enhanced Reconnaissance (ARCHER) system can provide hyperspectral data in most parts of the United States at relatively low cost. Although designed specifically for use in locating downed aircraft, the imagery holds the potential to identify specific aspects of landcover at far greater fidelity than traditional multispectral means. The goals of this research were to improve the use of ARCHER hyperspectral imagery to classify sub-canopy and open-area vegetation in coniferous forests located in the Southern Rockies, and to determine how much fidelity might be lost from a baseline of 1 meter spatial resolution resampled to 2 and 5 meter pixel size to simulate higher-altitude collection. Based on analysis comparing linear spectral unmixing with a traditional supervised classification, the linear spectral unmixing proved to be statistically superior. More importantly, however, linear spectral unmixing provided additional sub-pixel information that was unavailable using other techniques. The second goal, determining fidelity loss as a function of spatial resolution, was more difficult to assess due to how the data are represented. Furthermore, the 2 and 5 meter imagery were obtained by resampling the 1 meter imagery and therefore may not be representative of the quality of actual 2 or 5 meter imagery. Ultimately, the information derived from this research may be useful in better utilizing hyperspectral imagery to conduct forest monitoring and assessment.
NASA Astrophysics Data System (ADS)
Qie, G.; Wang, G.; Wang, M.
2016-12-01
Mixed pixels and shadows due to buildings in urban areas impede accurate estimation and mapping of city vegetation carbon density. In most previous studies these factors are ignored, which results in underestimation of city vegetation carbon density. In this study we present an integrated methodology to improve the accuracy of mapping city vegetation carbon density. Firstly, we applied a linear shadow removal analysis (LSRA) to remotely sensed Landsat 8 images to reduce the shadow effects on carbon estimation. Secondly, we integrated a linear spectral unmixing analysis (LSUA) with a linear stepwise regression (LSR), a logistic model-based stepwise regression (LMSR) and k-Nearest Neighbors (kNN), and utilized and compared the integrated models on shadow-removed images to map vegetation carbon density. This methodology was examined in Shenzhen City of Southeast China. A data set from a total of 175 sample plots measured in 2013 and 2014 was used to train the models. The independent variables statistically significantly contributing to improving the fit of the models to the data and reducing the sum of squared errors were selected from a total of 608 variables derived from different image band combinations and transformations. The vegetation fraction from LSUA was then added into the models as an important independent variable. The estimates obtained were evaluated using a cross-validation method. Our results showed that higher accuracies were obtained from the integrated models compared with the ones using traditional methods that ignore the effects of mixed pixels and shadows. This study indicates that the integrated method has great potential for improving the accuracy of urban vegetation carbon density estimation. Key words: Urban vegetation carbon, shadow, spectral unmixing, spatial modeling, Landsat 8 images
Improving alpine-region spectral unmixing with optimal-fit snow endmembers
NASA Technical Reports Server (NTRS)
Painter, Thomas H.; Roberts, Dar A.; Green, Robert O.; Dozier, Jeff
1995-01-01
Surface albedo and snow-covered-area (SCA) are crucial inputs to the hydrologic and climatologic modeling of alpine and seasonally snow-covered areas. Because the spectral albedo and thermal regime of pure snow depend on grain size, areal distribution of snow grain size is required. Remote sensing has been shown to be an effective (and necessary) means of deriving maps of grain size distribution and snow-covered-area. Developed here is a technique whereby maps of grain size distribution improve estimates of SCA from spectral mixture analysis with AVIRIS data.
NASA Astrophysics Data System (ADS)
Echtler, Helmut; Segl, Karl; Dickerhof, Corinna; Chabrillat, Sabine; Kaufmann, Hermann J.
2003-03-01
The ESF-LSF 1997 flight campaign conducted by the German Aerospace Center (DLR) recorded several transects across the island of Naxos using the airborne hyperspectral scanner DAIS. The geological targets cover all major litho-tectonic units of a metamorphic dome with the transition of metamorphic zonations from the outer meta-sedimentary greenschist envelope to the gneissic amphibolite facies and migmatitic core. Mineral identification of alternating marble-dolomite sequences and interlayered schists bearing muscovite and biotite has been accomplished using the airborne hyperspectral DAIS 7915 sensor. Data have been noise filtered based on maximum noise fraction (MNF) and fast Fourier transform (FFT) and converted from radiance to reflectance. For mineral identification, constrained linear spectral unmixing and spectral angle mapper (SAM) algorithms were tested. Due to their unsatisfactory results, a new approach was developed that combines linear mixture modeling and spectral feature fitting. This approach provides more detailed and accurate information. Results are discussed in comparison with detailed geological mapping and additional information. Calcites are clearly separated from dolomites as well as from the mica-schist sequences through a good resolution of the mineral muscovite. An outstanding result is the very good resolution of the chlorite/mica (muscovite, biotite) transition, which defines a metamorphic isograd.
Ball, David A; Lux, Matthew W; Graef, Russell R; Peterson, Matthew W; Valenti, Jane D; Dileo, John; Peccoud, Jean
2010-01-01
The concept of co-design is common in engineering, where it is necessary, for example, to determine the optimal partitioning between hardware and software of the implementation of a system's features. Here we propose to adapt co-design methodologies for synthetic biology. As a test case, we have designed an environmental sensing device that detects the presence of three chemicals, and returns an output only if at least two of the three chemicals are present. We show that the logical operations can be implemented in three different design domains: (1) the transcriptional domain using synthetically designed hybrid promoters, (2) the protein domain using bi-molecular fluorescence complementation, and (3) the fluorescence domain using spectral unmixing and relying on electronic processing. We discuss how these heterogeneous design strategies could be formalized to develop co-design algorithms capable of identifying optimal designs meeting user specifications.
NASA Technical Reports Server (NTRS)
Ramsey, Michael S.; Christensen, Philip R.
1992-01-01
Accurate interpretation of thermal infrared data depends upon the understanding and removal of complicating effects. These effects may include physical mixing of various mineralogies and particle sizes, atmospheric absorption and emission, surficial coatings, geometry effects, and differential surface temperatures. The focus is the examination of the linear spectral mixing of individual mineral or endmember spectra. Linear addition of spectra, for particles larger than the wavelength, allows for a straightforward method of deconvolving the observed spectra, predicting a volume percent of each endmember. The 'forward analysis' of linear mixing (comparing the spectra of physical mixtures to numerical mixtures) has received much attention. The reverse approach of un-mixing thermal emission spectra was examined with remotely sensed data, but no laboratory verification exists. Understanding of the effects of spectral mixing on high-resolution laboratory spectra allows for the extrapolation to lower-resolution, and often more complicated, remotely gathered data. Thermal Infrared Multispectral Scanner (TIMS) data for Meteor Crater, Arizona were acquired in Sep. 1987. The spectral un-mixing of these data gives a unique test of the laboratory results. Meteor Crater (1.2 km in diameter and 180 m deep) is located in north-central Arizona, west of Canyon Diablo. The arid environment, paucity of vegetation, and low relief make the region ideal for remote data acquisition. Within the horizontal sedimentary sequence that forms the upper Colorado Plateau, the oldest unit sampled by the impact crater was the Permian Coconino Sandstone. A thin bed of the Toroweap Formation, also of Permian age, conformably overlies the Coconino. Above the Toroweap lies the Permian Kaibab Limestone which, in turn, is covered by a thin veneer of the Moenkopi Formation. The Moenkopi is Triassic in age and has two distinct sub-units in the vicinity of the crater. The lower Wupatki member is a fine-grained sandstone, while the upper Moqui member is a fissile siltstone. Ejecta from these units are preserved as inverted stratigraphy up to 2 crater radii from the rim. The mineralogical contrast between the units, the relative lack of post-emplacement erosion, and limited ejecta mixing provide a unique site to apply the un-mixing model. Selection of the aforementioned units as endmembers reveals distinct patterns in the ejecta of the crater.
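A hedged sketch of the linear mixing relation examined above: for particles larger than the wavelength, a mixed emissivity spectrum is modelled as an areal-fraction-weighted sum of endmember spectra, so fractions can be recovered by a constrained inversion. The endmember spectra below are synthetic stand-ins, not TIMS or laboratory spectra.

```python
# Hedged sketch: forward linear mixing of emissivity spectra and its inversion with a
# sum-to-one constraint appended as an extra, heavily weighted row.
import numpy as np

bands = np.linspace(8.0, 12.0, 40)                     # thermal-IR wavelengths (µm)
def fake_emissivity(center, depth):
    return 1.0 - depth * np.exp(-((bands - center) / 0.3) ** 2)

E = np.column_stack([fake_emissivity(9.2, 0.25),       # "sandstone-like" stand-in
                     fake_emissivity(11.3, 0.15),      # "carbonate-like" stand-in
                     fake_emissivity(9.8, 0.20)])      # "siltstone-like" stand-in
true_f = np.array([0.6, 0.3, 0.1])
mixed = E @ true_f                                     # forward linear mixing

A = np.vstack([E, 1e3 * np.ones(3)])                   # sum-to-one as a weighted equation
b = np.append(mixed, 1e3)
fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
print(fractions.round(3))                              # recovers ~ [0.6, 0.3, 0.1]
```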
Unmixing of spectral components affecting AVIRIS imagery of Tampa Bay
NASA Astrophysics Data System (ADS)
Carder, Kendall L.; Lee, Z. P.; Chen, Robert F.; Davis, Curtiss O.
1993-09-01
According to Kirk's as well as Morel and Gentili's Monte Carlo simulations, the popular simple expression, R ≈ 0.33 bb/a, relating subsurface irradiance reflectance (R) to the ratio of the backscattering coefficient (bb) to the absorption coefficient (a), is not valid for bb/a > 0.25. This means that it may no longer be valid for values of remote-sensing reflectance (the above-surface ratio of water-leaving radiance to downwelling irradiance) where Rrs > 0.01. Since there has been no simple Rrs expression developed for very turbid waters, we developed one based in part on Monte Carlo simulations and empirical adjustments to an Rrs model and applied it to rather turbid coastal waters near Tampa Bay to evaluate its utility for unmixing the optical components affecting the water-leaving radiance. With the high spectral (10 nm) and spatial (20 m²) resolution of Airborne Visible-InfraRed Imaging Spectrometer (AVIRIS) data, the water depth and bottom type were deduced using the model for shallow waters. This research demonstrates the need for further work to improve interpretations of scenes with highly variable turbid waters, and it emphasizes the utility of high spectral-resolution data such as those from AVIRIS for better understanding complicated coastal environments such as the west Florida shelf.
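A small numerical illustration of the simple reflectance relation quoted above, R ≈ 0.33 bb/a, together with the bb/a ≤ 0.25 range outside which the abstract notes it breaks down; the coefficient values below are made up, not taken from the paper.

```python
# Hedged illustration of R ≈ 0.33 * b_b / a and its stated validity limit (bb/a <= 0.25).
import numpy as np

def simple_irradiance_reflectance(bb, a):
    """Subsurface irradiance reflectance from backscattering bb and absorption a (1/m)."""
    ratio = np.asarray(bb) / np.asarray(a)
    if np.any(ratio > 0.25):
        print("warning: bb/a > 0.25 for some inputs; the linear form is not valid there")
    return 0.33 * ratio

# Clear vs. very turbid water at a green wavelength (made-up coefficients).
print(simple_irradiance_reflectance(bb=[0.005, 0.12], a=[0.10, 0.30]).round(3))
```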
Hyperspectral imaging applied to microbial categorization in an automated microbiology workflow
NASA Astrophysics Data System (ADS)
Leroux, Denis F.; Midahuen, Rony; Perrin, Guillaume; Pescatore, Jeremie; Imbaud, Pierre
2015-07-01
Hyperspectral imaging (HSI) is being evaluated as a pre-selection tool to categorize and localize populations of microbial colonies directly on their culture medium, in order to facilitate the microbiology workflow downstream of the incubation step. The categorization criteria were here limited to the diffuse radiance spectra acquired mostly in the visible region between 400 and 900 nm. Although the diffuse radiance signal is much broader than the one acquired using vibrational techniques such as Raman and IR, and is limited to chromophores absorbing in the visible region, it can be acquired very quickly, allowing hyperspectral imaging of large objects (i.e., Petri dishes) with throughputs that are compatible with the needs of a clinical laboratory workflow. Moreover, additional cost reduction could possibly be achieved using application-specific multispectral systems. Furthermore, recent research has shown that good power of discrimination, at the species level, can be achieved at least for a limited number of species. In our work, we test different culture media, with and without strong light absorption in the visible region, and report categorization results obtained when selecting end-member spectra according to a multi-parametric study (colonies, agar type). Results of categorization (e.g., at the species level) are presented using two types of supervised categorization algorithms, depending on whether they deliver subpixel fractional abundance information (Linear Spectral Unmixing type) or not (Spectral Angle Mapping (SAM) and Euclidean Distance (ED) type). Interestingly, the performance of the two classes of algorithms is dramatically different, a trend that is not always observed. An interpretation is proposed on the basis of the agar interference and the spectral purity of the end-member spectra.
NASA Astrophysics Data System (ADS)
Gu, Lingjia; Ren, Ruizhi; Zhao, Kai; Li, Xiaofeng
2014-01-01
The precision of snow parameter retrieval is unsatisfactory for current practical demands. The primary reason is the problem of mixed pixels caused by the low spatial resolution of satellite passive microwave data. A snow passive microwave unmixing method is proposed in this paper, based on land cover type data and the antenna gain function of passive microwaves. The land cover of Northeast China is partitioned into grass, farmland, bare soil, forest, and water body types. The component brightness temperatures (CBT), i.e., the unmixed data, at 1 km resolution are obtained using the proposed unmixing method. The snow depths determined from the CBT with three snow depth retrieval algorithms are validated through field measurements taken in forest and farmland areas of Northeast China in January 2012 and 2013. The results show that the overall retrieval precision of the snow depth is improved by 17% in farmland areas and 10% in forest areas when using the CBT in comparison with the mixed pixels. The snow cover results based on the CBT are compared with existing MODIS snow cover products. The results demonstrate that more snow cover information can be obtained, with up to 86% accuracy.
NASA Astrophysics Data System (ADS)
Jawin, E. R.; Head, J. W., III; Cannon, K.
2017-12-01
The Aristarchus pyroclastic deposit in central Oceanus Procellarum is understood to have formed in a gas-rich explosive volcanic eruption, and has been observed to contain abundant volcanic glass. However, the interpreted color (and therefore composition) of the glass has been debated. In addition, previous analyses of the pyroclastic deposit have been performed using lower resolution data than are currently available. In this work, a nonlinear spectral unmixing model was applied to Moon Mineralogy Mapper (M3) data of the Aristarchus plateau to investigate the detailed mineralogic and crystalline nature of the Aristarchus pyroclastic deposit, using spectra of laboratory endmembers including a suite of volcanic glasses returned from the Apollo 15 and 17 missions (green, orange, black beads), as well as synthetic lunar glasses (orange, green, red, yellow). Preliminary results of the M3 unmixing model suggest that spectra of the pyroclastic deposit can be modeled by a mixture composed predominantly of a featureless endmember approximating space weathering and a smaller component of glass. The modeled spectra were most accurate with a synthetic orange glass endmember, relative to the other glasses analyzed in this work. The results confirm that there is a detectable component of glass in the Aristarchus pyroclastic deposit which may be similar to the high-Ti orange glass seen in other regional pyroclastic deposits, with only minimal contributions from other crystalline minerals. The presence of volcanic glass in the pyroclastic deposit, together with the low abundance of crystalline material, supports the model that the Aristarchus pyroclastic deposit formed in a long-duration, Hawaiian-style fire-fountain eruption. The lack of a significant detection of devitrified black beads in the spectral modeling results (such beads were observed at the Apollo 17 landing site in the Taurus-Littrow pyroclastic deposit) suggests that the optical density of the eruptive plume remained low throughout the eruption.
Continental Spatio-Temporal Data Analysis with Linear Spectral Mixture Model Using FOSS
NASA Technical Reports Server (NTRS)
Kumar, Uttam; Nemani, Ramakrishna; Ganguly, Sangram; Milesi, Cristina; Raja, Kumar; Wang, Weile; Votava, Petr; Michaelis, Andrew
2015-01-01
This work demonstrates the development and implementation of a Fully Constrained Least Squares (FCLS) unmixing model developed in the C++ programming language with the OpenCV package and Boost C++ libraries in the NASA Earth Exchange (NEX). Visualization of the results is supported by GRASS GIS and statistical analysis is carried out in R in a Linux system environment. FCLS was first tested on computer-simulated data with Gaussian noise at various signal-to-noise ratios, and on Landsat data of an agricultural scenario and an urban environment, using a set of global endmembers of substrate (soils, sediments, rocks, and non-photosynthetic vegetation), vegetation (green photosynthetic plants) and dark objects (absorptive substrate materials, clear water, deep shadows, etc.). For the agricultural scenario, a spectrally diverse collection of 11 scenes of Level 1 terrain-corrected, cloud-free Landsat-5 TM data of Fresno, California, USA were unmixed and the results were validated with the corresponding ground data. To study an urbanized landscape, a clear-sky Landsat-5 TM scene was unmixed and validated with coincident WorldView-2 abundance maps (of 2 m spatial resolution) for an area of San Francisco, California, USA. The results were evaluated using descriptive statistics, correlation coefficient, RMSE, probability of success, boxplots and the bivariate distribution function. Finally, FCLS was used for sub-pixel land cover analysis of the monthly WELD (Web-Enabled Landsat Data) repository of North America from 2008 to 2011. The abundance maps, in conjunction with DMSP-OLS nighttime lights data, were used to extract urban land cover features and analyze their spatio-temporal growth.
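For readers unfamiliar with FCLS, the sketch below shows a common way of enforcing the two abundance constraints (non-negativity and sum-to-one): a weighted sum-to-one row is appended to the endmember matrix and the augmented system is solved with non-negative least squares. It is a minimal NumPy/SciPy sketch of the general technique, not the C++/OpenCV implementation described above, and the endmember spectra are random placeholders.

```python
import numpy as np
from scipy.optimize import nnls

def fcls(E, x, delta=10.0):
    """Fully constrained least squares: abundances >= 0 and (approximately) sum to 1.

    Appends a sum-to-one row weighted by `delta` and solves with NNLS;
    larger delta enforces the sum-to-one constraint more strictly.
    E: (bands, endmembers), x: (bands,)
    """
    E_aug = np.vstack([E, delta * np.ones((1, E.shape[1]))])
    x_aug = np.append(x, delta)
    a, _ = nnls(E_aug, x_aug)
    return a

# Hypothetical substrate / vegetation / dark-object endmembers for 6 bands.
rng = np.random.default_rng(2)
E = rng.random((6, 3))
true = np.array([0.2, 0.7, 0.1])
x = E @ true + 0.01 * rng.standard_normal(6)
print(np.round(fcls(E, x), 3))   # should be close to [0.2, 0.7, 0.1]
```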
Continental Spatio-temporal Data Analysis with Linear Spectral Mixture Model using FOSS
NASA Astrophysics Data System (ADS)
Kumar, U.; Nemani, R. R.; Ganguly, S.; Milesi, C.; Raja, K. S.; Wang, W.; Votava, P.; Michaelis, A.
2015-12-01
This work demonstrates the development and implementation of a Fully Constrained Least Squares (FCLS) unmixing model developed in the C++ programming language with the OpenCV package and Boost C++ libraries in the NASA Earth Exchange (NEX). Visualization of the results is supported by GRASS GIS and statistical analysis is carried out in R in a Linux system environment. FCLS was first tested on computer-simulated data with Gaussian noise at various signal-to-noise ratios, and on Landsat data of an agricultural scenario and an urban environment, using a set of global endmembers of substrate (soils, sediments, rocks, and non-photosynthetic vegetation), vegetation (green photosynthetic plants) and dark objects (absorptive substrate materials, clear water, deep shadows, etc.). For the agricultural scenario, a spectrally diverse collection of 11 scenes of Level 1 terrain-corrected, cloud-free Landsat-5 TM data of Fresno, California, USA were unmixed and the results were validated with the corresponding ground data. To study an urbanized landscape, a clear-sky Landsat-5 TM scene was unmixed and validated with coincident WorldView-2 abundance maps (of 2 m spatial resolution) for an area of San Francisco, California, USA. The results were evaluated using descriptive statistics, correlation coefficient, RMSE, probability of success, boxplots and the bivariate distribution function. Finally, FCLS was used for sub-pixel land cover analysis of the monthly WELD (Web-Enabled Landsat Data) repository of North America from 2008 to 2011. The abundance maps, in conjunction with DMSP-OLS nighttime lights data, were used to extract urban land cover features and analyze their spatio-temporal growth.
Biomass and health based forest cover delineation using spectral un-mixing
Mohan Tiruveedhula; Joseph Fan; Ravi R. Sadasivuni; Surya S. Durbha; David L. Evans
2009-01-01
Remote sensing is a well-suited source of information on various forest characteristics such as forest cover type, leaf area, biomass, and health. The use of appropriate layers helps to quantify the variables of interest. For example, normalized difference vegetation index (NDVI) and greenness help explain variability in biomass as well as health of forests....
Near-infrared fluorescent proteins for multicolor in vivo imaging
Shcherbakova, Daria M.; Verkhusha, Vladislav V.
2013-01-01
Near-infrared fluorescent proteins are in high demand for in vivo imaging. We developed four spectrally distinct fluorescent proteins, iRFP670, iRFP682, iRFP702, and iRFP720, from bacterial phytochromes. iRFPs exhibit high brightness in mammalian cells and tissues and are suitable for long-term studies. iRFP670 and iRFP720 enable two-color imaging in living cells and mice using standard approaches. Five iRFPs including previously engineered iRFP713 allow multicolor imaging in living mice with spectral unmixing. PMID:23770755
Unmixing-Based Denoising as a Pre-Processing Step for Coral Reef Analysis
NASA Astrophysics Data System (ADS)
Cerra, D.; Traganos, D.; Gege, P.; Reinartz, P.
2017-05-01
Coral reefs, among the world's most biodiverse and productive submerged habitats, have faced several mass bleaching events due to climate change during the past 35 years. In the course of this century, global warming and ocean acidification are expected to cause corals to become increasingly rare on reef systems. This will result in a sharp decrease in the biodiversity of reef communities and carbonate reef structures. Coral reefs may be mapped, characterized and monitored through remote sensing. Hyperspectral images are particularly well suited to coral monitoring: their very rich spectral information provides strong discriminating power to characterize a target of interest and to separate healthy corals from bleached ones. Being submerged habitats, coral reef systems are difficult to analyse in airborne or satellite images, as the relevant information is conveyed in bands in the blue range, which exhibit a lower signal-to-noise ratio (SNR) than other spectral ranges; furthermore, water absorbs most of the incident solar radiation, further decreasing the SNR. Derivative features, which are important in coral analysis, are strongly affected by the noise in the relevant spectral bands, justifying the need for new denoising techniques able to preserve local spatial and spectral features. In this paper, Unmixing-based Denoising (UBD) is used to enable analysis of a hyperspectral image acquired over a coral reef system in the Red Sea based on derivative features. UBD reconstructs the dataset pixel by pixel with reduced noise effects, by constraining each spectrum to a linear combination of reference spectra, exploiting the high dimensionality of hyperspectral datasets. Results show clear enhancements with respect to traditional denoising methods based on spatial and spectral smoothing, facilitating the coral detection task.
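A minimal sketch of the pixel-wise reconstruction idea behind UBD is given below: each spectrum is re-expressed as a non-negative linear combination of a small set of reference spectra and replaced by that reconstruction, which suppresses band-wise noise while keeping the spectral shape. How the reference spectra are chosen, and any further refinements in the paper, are not reproduced here; the cube and references are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

def unmixing_based_denoise(cube, refs):
    """Rebuild each pixel as a non-negative combination of reference spectra.

    cube: (rows, cols, bands), refs: (n_refs, bands). Reference selection is
    left to the caller; this is a sketch of the general idea, not the paper's code.
    """
    rows, cols, _ = cube.shape
    out = np.empty_like(cube)
    for i in range(rows):
        for j in range(cols):
            w, _ = nnls(refs.T, cube[i, j])
            out[i, j] = refs.T @ w
    return out

rng = np.random.default_rng(3)
refs = rng.random((5, 40))                        # hypothetical reference spectra
clean = refs.T @ rng.dirichlet(np.ones(5), 9).T   # 9 mixed pixels, 40 bands
cube = (clean.T + 0.05 * rng.standard_normal((9, 40))).reshape(3, 3, 40)
print(np.abs(unmixing_based_denoise(cube, refs) - cube).mean())  # average change per band
```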
NASA Astrophysics Data System (ADS)
Pour, Amin Beiranvand; Hashim, Mazlan
2012-02-01
This study investigates the application of spectral image processing methods to ASTER data for mapping hydrothermal alteration zones associated with porphyry copper mineralization and related host rock. The study area is located in the southeastern segment of the Urumieh-Dokhtar Volcanic Belt of Iran. This area has been selected because it is a potential zone for exploration of new porphyry copper deposits. Spectral transform approaches, namely principal component analysis, band ratios and minimum noise fraction, were used for mapping hydrothermally altered rocks and lithological units at regional scale. Spectral mapping methods, including spectral angle mapper, linear spectral unmixing, matched filtering and mixture tuned matched filtering, were applied to differentiate hydrothermal alteration zones associated with porphyry copper mineralization, such as the phyllic, argillic and propylitic mineral assemblages. The spectral transform methods enhanced hydrothermally altered rocks associated with the known porphyry copper deposits and newly identified prospects using the shortwave infrared (SWIR) bands of ASTER. These methods allowed the discrimination of quartz-rich igneous rocks from the magmatic background and of the boundary between igneous and sedimentary rocks using the thermal infrared (TIR) bands of ASTER at regional scale. The spectral mapping methods distinguished the sericitically and argillically altered rocks (the phyllic and argillic alteration zones) that are surrounded by discontinuous to extensive zones of propylitized rocks (the propylitic alteration zone) using the SWIR bands of ASTER at both regional and district scales. The linear spectral unmixing method is best suited for distinguishing the hydrothermal alteration zone of highest economic potential (the phyllic zone) and its mineral assemblages using the SWIR bands of ASTER. The results proved effective and are in accordance with the results of field surveying, spectral reflectance measurements and X-ray diffraction (XRD) analysis. In conclusion, the image processing methods used can provide cost-effective information to discover possible locations of porphyry copper and epithermal gold mineralization prior to detailed and costly ground investigations. The extraction of spectral information from ASTER data can produce comprehensive and accurate information for copper and gold resource investigations around the world, including those yet to be discovered.
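Of the spectral mapping methods listed above, matched filtering has a compact closed form that is easy to sketch: each pixel is whitened with the background covariance and projected onto the whitened target spectrum. The snippet below is a generic textbook formulation, not the specific software implementation used in the study; the background and target spectra are synthetic placeholders.

```python
import numpy as np

def matched_filter(pixels, target):
    """Classical matched-filter score: background-whitened projection onto the target.

    pixels: (n, bands), target: (bands,). Scores near 1 indicate pixels resembling
    the target spectrum; scores near 0 indicate background.
    """
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False) + 1e-6 * np.eye(pixels.shape[1])  # regularized
    cov_inv = np.linalg.inv(cov)
    d = target - mean
    return (pixels - mean) @ cov_inv @ d / (d @ cov_inv @ d)

rng = np.random.default_rng(4)
background = rng.normal(0.3, 0.05, size=(200, 9))   # hypothetical 9-band SWIR pixels
target = np.linspace(0.2, 0.6, 9)                   # hypothetical phyllic-zone spectrum
mixed = 0.5 * background[:5] + 0.5 * target         # pixels containing 50% target
print(np.round(matched_filter(np.vstack([background, mixed]), target)[-5:], 2))
```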
High spatial resolution spectral unmixing for mapping ash species across a complex urban environment
Jennifer Pontius; Ryan P. Hanavan; Richard A. Hallett; Bruce D. Cook; Lawrence A. Corp
2017-01-01
Ash (Fraxinus L.) species are currently threatened by the emerald ash borer (EAB; Agrilus planipennis Fairmaire) across a growing area in the eastern US. Accurate mapping of ash species is required to monitor the host resource, predict EAB spread and better understand the short- and long-term effects of EAB on the ash resource...
Van de Voorde, Tim; Vlaeminck, Jeroen; Canters, Frank
2008-01-01
Urban growth and its related environmental problems call for sustainable urban management policies to safeguard the quality of urban environments. Vegetation plays an important part in this, as it provides ecological, social, health and economic benefits to a city's inhabitants. Remotely sensed data are of great value for monitoring urban green, and despite the clear advantages of contemporary high resolution images, the benefits of medium resolution data should not be discarded. The objective of this research was to estimate fractional vegetation cover from a Landsat ETM+ image with sub-pixel classification, and to compare the accuracies obtained with multiple stepwise regression analysis, linear spectral unmixing and multi-layer perceptrons (MLP) at the level of meaningful urban spatial entities. Despite the small, but nevertheless statistically significant, differences at pixel level between the alternative approaches, the spatial patterns of vegetation cover and estimation error are clearly distinct at neighbourhood level. At this spatially aggregated level, a simple regression model appears to attain sufficient accuracy. For mapping at a spatially more detailed level, the MLP seems to be the most appropriate choice. Brightness normalisation only appeared to affect the linear models, especially the linear spectral unmixing. PMID:27879914
Single cell analysis using surface enhanced Raman scattering (SERS) tags
Nolan, John P.; Duggan, Erika; Liu, Er; Condello, Danilo; Dave, Isha; Stoner, Samuel A.
2013-01-01
Fluorescence is a mainstay of bioanalytical methods, offering sensitive and quantitative reporting, often in multiplexed or multiparameter assays. Perhaps the best example of the latter is flow cytometry, where instruments equipped with multiple lasers and detectors allow measurement of 15 or more different fluorophores simultaneously, but increases beyond this number are limited by the relatively broad emission spectra. Surface enhanced Raman scattering (SERS) from metal nanoparticles can produce signal intensities that rival fluorescence, but with narrower spectral features that allow a greater degree of multiplexing. We are developing nanoparticle SERS tags as well as Raman flow cytometers for multiparameter single cell analysis of suspension or adherent cells. SERS tags are based on plasmonically active nanoparticles (gold nanorods) whose plasmon resonance can be tuned to give optimal SERS signals at a desired excitation wavelength. Raman resonant compounds are adsorbed on the nanoparticles to confer a unique spectral fingerprint on each SERS tag, which are then encapsulated in a polymer coating for conjugation to antibodies or other targeting molecules. Raman flow cytometry employs a high resolution spectral flow cytometer capable of measuring the complete SERS spectra, as well as conventional flow cytometry measurements, from thousands of individual cells per minute. Automated spectral unmixing algorithms extract the contributions of each SERS tag from each cell to generate high content, multiparameter single cell population data. SERS-based cytometry is a powerful complement to conventional fluorescence-based cytometry. The narrow spectral features of the SERS signal enable more distinct probes to be measured in a smaller region of the optical spectrum with a single laser and detector, allowing for higher levels of multiplexing and multiparameter analysis. PMID:22498143
Spectroscopy as a diagnostic tool for urban soil
NASA Astrophysics Data System (ADS)
Brook, Anna; Kopel, Daniella; Wittenberg, Lea
2015-04-01
Anthropogenic urban soils are the foundation of the urban green infrastructure; the quality of the green network is only as good as each of its patches. In the early days of pedology, urban soil was recognized mainly with respect to contamination and the risks for human health, but in studies performed since the 1970s the importance of urban soil for urban ecology has become increasingly significant (Gómez-Baggethun and Barton 2013). Urban soils are highly disturbed land created by the process of urbanization. The dominant agent in the creation of urban soils is human activity, which modifies the natural soil through mixing, filling or contamination of land surfaces so as to create a layer of urban soil which can be more than 50 cm thick (Pavao-Zuckerman 2008). The objective of this study is to determine the extent to which field spectroscopy methods can be used to extend the knowledge of urban soil features and components. The majority of studies on urban soils concentrate on identifying and mapping pollution, mostly heavy metals. In this study a top-down analysis is developed: a simple and intuitive spectral-feature approach for detecting the presence of minerals, organic matter and pollutants in mixed soil samples. The developed method uses spectral activity (SA) detection in a structured hierarchical approach to quickly and, more importantly, correctly identify dominant spectral features. The method draws on multiple in-production tools, including continuum removal normalization guided by polynomial generalization, and spectral-likelihood algorithms: orthogonal subspace projection (OSP) and iterative spectral mixture analysis (ISMA) were compared to feature-likelihood methods (Li et al. 2014). Results of the proposed top-down unmixing method suggest that the analysis is very fast, owing to the simplified hierarchy which avoids the steep learning curve associated with unmixing algorithms, and showed that the most abundant component was coarse organic matter (12%), followed by concrete dust, plastic crumbs, other man-made materials, clay and other minerals. The major part of the mineralogical composition was dominated by montmorillonite and kaolinite, as is expected for the Mount Carmel soils. Pyroxene and olivine are also typical of the mineralogy of Mount Carmel, where there are several known magmatic eruption areas of scoria and basalt. There is a high frequency of actinolite (Ca2(Mg,Fe)5(Si8O22)(OH)2), of the amphibole family (2.5%), which is typical of metamorphic rocks that are not found in the Mount Carmel region. Some of the minerals found in the analysis are of marine origin, such as syngenite (K2Ca(SO4)2(H2O)) and blodite (Na2Mg(SO4)2·4(H2O)), as the area formed under the Mediterranean Sea and is still influenced by it. None of the endmembers was detected only once; the lowest frequency was four detections, for cadmium cyanide (Cd(CN)2) and andalusite (Al2SiO5). The soil pH, measured electrometrically, and the particle size distribution, measured by laser diffraction, indicate no large differences among the samples in particle size distribution or pH; nor are these values significantly different from the expected ones, except for the OM percentage, which is significantly higher in most samples. The suggested method was very effective for tracing man-made substances: we could find concrete and asphalt, plastics and synthetic polymers after they had been assimilated, broken down and decomposed into soil particles.
Because the top-down unmixing method does not limit the substances that can be characterized, we could also detect unexpected materials and contaminants. Gómez-Baggethun, Erik and David N. Barton. 2013. "Classifying and Valuing Ecosystem Services for Urban Planning." Ecological Economics 86: 235-245. Pavao-Zuckerman, M. A. 2008. "The Nature of Urban Soils and their Role in Ecological Restoration in Cities." Restoration Ecology 16 (4): 642-649. Li, Lijun, Peter E. Holm, Helle Marcussen, and Hans Christian Bruun Hansen. 2014. "Release of Cadmium, Copper and Lead from Urban Soils of Copenhagen." Environmental Pollution 187: 90-97.
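One of the in-production tools named above, continuum removal normalization, is simple enough to sketch: the spectrum is divided by its upper convex hull so that absorption-feature depths can be compared across samples. The snippet below is a generic implementation of that idea, not the authors' tool, and the synthetic spectrum (with a single clay-like absorption near 2200 nm) is an illustrative placeholder.

```python
import numpy as np

def continuum_removal(wavelengths, reflectance):
    """Divide a spectrum by its upper convex hull (continuum removal)."""
    pts = list(zip(wavelengths, reflectance))
    hull = [pts[0]]
    for p in pts[1:]:
        # keep only points that preserve a convex (downward-turning) upper hull
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            if (x2 - x1) * (p[1] - y1) - (p[0] - x1) * (y2 - y1) >= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    hx, hy = zip(*hull)
    continuum = np.interp(wavelengths, hx, hy)
    return reflectance / continuum

wl = np.linspace(400, 2500, 211)
spec = 0.4 + 0.0001 * (wl - 400) - 0.15 * np.exp(-((wl - 2200) / 60.0) ** 2)  # synthetic clay feature
cr = continuum_removal(wl, spec)
print(round(float(cr.min()), 3))  # depth of the absorption feature after normalization
```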
NIR & MIR spectroscopy as an effective tool for detecting urban influences on soils
NASA Astrophysics Data System (ADS)
Brook, Anna; Kopel, Daniella; Wittenberg, Lea
2016-04-01
Soil supports ecosystem functions and services and sustains ecosystems and biodiversity, yet in today's urbanizing world soil as a resource is in constant danger. Human society takes for granted the services provided by the open green patches allocated within and near cities, with little consideration of the ramifications of urban development for those areas. Urban ecology recognizes the need to study, identify and monitor the soils of cities - urban soils. The definitions of these soils are mainly descriptive, since urban soils are not subjected to the same pedological processes as natural soils. The main objective of this paper is to characterize urban soils in open, green, undisturbed patches by their mineralogical composition. This goal was achieved using field and laboratory spectroscopy across the visible, near-infrared and shortwave infrared regions, and laboratory thermal mid-infrared spectroscopy. The majority of studies on urban soils concentrate on identifying and mapping pollution, mostly heavy metals. In this study a top-down analysis (a simple and intuitive spectral-feature approach for detecting the presence of minerals, organic matter and pollutants in mixed soil samples) is applied. This method uses spectral activity (SA) detection in a structured hierarchical approach to quickly and, more importantly, correctly identify dominant spectral features. The applied method draws on multiple in-production tools, including continuum removal normalization guided by polynomial generalization, and spectral-likelihood algorithms: orthogonal subspace projection (OSP) and iterative spectral mixture analysis (ISMA) were compared to feature-likelihood methods. A total of 70 soil samples were collected at different locations: in remnant areas within the city (edge and core), on the borders of neighbourhoods (edge), in the fringe zone, and at two locations in the protected park. The park samples were taken at locations more than 100 m from roads or direct anthropogenic disturbances. The samples were collected outside the setback of the residential areas (edge), and the fringe samples were taken away from the edge, where construction debris or waste was no longer visible - approximately 18-50 m down the slopes. The samples were taken from the upper layer of the soils, after coarse organic or trash residues were removed. A soil sample drill, 5 cm in diameter and 10 cm deep, was used to collect up to 100 ml sample caps. The samples were air-dried, sifted through a 2 mm sieve to remove large particles and rock fragments, and ground to <200 nm for spectral analysis across 400-2500 nm and laboratory mid-IR analysis. Ratios between the spectral features of the soils' aliphatic and aromatic groups and those of calcite or hydroxyls were calculated to estimate the total organic matter, following the method proposed by Dlapa et al. (2014), based on ratio indices of aliphatic hydrocarbons (3000-2800 cm-1) to the calcite mineral (peak area at 875 cm-1 central wavelength) and of carboxyl/aromatic groups (1800-1200 cm-1) to the calcite mineral, for soil total carbon estimation. Results of the proposed top-down unmixing method suggest that the analysis is very fast, owing to the simplified hierarchy which avoids the steep learning curve associated with unmixing algorithms, and showed that the most abundant component found in all the samples taken within city boundaries was organic matter. In the "organic matter" category we grouped all forms of vegetation endmembers, including coarse vegetation and organic carbon.
The second most abundant component was concrete, followed by plastic and bricks. We found traces of concrete in all the urban study samples, even samples taken as far as 150 m from the edge of the patches. In the park soils we found a low diversity of materials and only two identifications of anthropogenic substances. The soil pH, measured electrometrically, and the particle size distribution, measured by laser diffraction, indicate no difference among the samples in particle size distribution or pH; nor are these values significantly different from the expected ones, except for the OM percentage. The suggested method was very effective for tracing man-made substances: we could find concrete and asphalt, plastics and synthetic polymers after they had been assimilated, broken down and decomposed into soil particles. Because the top-down unmixing method does not limit the substances that can be characterized, we could also detect unexpected materials and contaminants.
NASA Astrophysics Data System (ADS)
Zhou, Qiang
Over the past two decades, non-native species have rapidly become established within grassland communities as a result of human migration and commerce. Invasive species such as smooth brome grass (Bromus inermis) and Kentucky bluegrass (Poa pratensis) seriously threaten the conservation of native grasslands. This study aims to discriminate native grasslands from planted hayfields and conservation areas dominated by introduced grasses using hyperspectral imagery. Hyperspectral imagery from the Hyperion sensor on EO-1 was acquired in late spring and late summer of 2009 and 2010. Field spectra for widely distributed species, as well as for smooth brome grass and Kentucky bluegrass, were collected from the study sites throughout the growing season. The imagery was processed with an unmixing algorithm to estimate fractional cover of green vegetation, dry vegetation and bare soil. As the spectra differ significantly through the growing season, spectral libraries for the most common species were built for both the early and the late growing season. After testing multiple methods, the Adaptive Coherence Estimator (ACE) was used for spectral matching analysis between the imagery and the spectral libraries. Due in part to spectral similarity among key species, the results of the spectral matching analysis were not definitive. Additional indices, "Level of Dominance" and "Band Variance", were calculated to measure the predominance of spectral signatures in any area. A texture co-occurrence analysis was also performed on both the "Level of Dominance" and "Band Variance" indices to extract spatial characteristics. The results suggest that, compared with disturbed areas, native prairie tends to have generally lower "Level of Dominance" and "Band Variance" as well as lower spatial dissimilarity. A final decision tree model was created to predict the presence of native or introduced grassland. The model was more effective for identification of Mixed Native Grassland than for grassland dominated by a single species. The discrimination of native and introduced grassland was limited by the similarity of spectral signatures between forb-dominated native grasslands and brome-grass stands. However, saline native grasslands were distinguishable from brome grass.
NASA Technical Reports Server (NTRS)
Abercromby, Kira J.; Rapp, Jason; Bedard, Donald; Seitzer, Patrick; Cardona, Tommaso; Cowardin, Heather; Barker, Ed; Lederer, Susan
2013-01-01
Spectral reflectance data through the visible regime were collected at Las Campanas Observatory in Chile using an imaging spectrograph on one of the twin 6.5-m Magellan telescopes. The data were obtained on 1-2 May 2012 on the 'Landon Clay' telescope with the LDSS3 (Low Dispersion Survey Spectrograph 3). Five pieces of Geosynchronous Orbit (GEO) or near-GEO debris were identified and observed with an exposure time of 30 seconds on average. In addition, laboratory spectral reflectance data were collected using an Analytical Spectral Device (ASD) field spectrometer at California Polytechnic State University (Cal Poly) in San Luis Obispo on several common spacecraft materials including solar cells, circuit boards, various Kapton materials used for multi-layer insulation, and various paints. The remotely collected data and the laboratory-acquired data were then incorporated in a newly developed model that uses a constrained least squares method to unmix the spectrum into specific material components. The results of this model are compared to the previous method of a human-in-the-loop (considered here the traditional method) that identifies possible material components by varying the materials and percentages until a spectral match is obtained. The traditional model was found to match the remotely collected spectral data after it had been divided by the continuum to remove the space weathering effects, or a reddening of the materials. The constrained least-squares model also used the de-reddened spectra as inputs and the results were consistent with those obtained through the traditional method. For comparison, a first-order examination of including reddening effects into the constrained least-squares model will be explored and comparisons to the remotely collected data will be examined. The identification of each object's suspected material component will be discussed herein.
NASA Technical Reports Server (NTRS)
Rapp, Jason; Abercromby, Kira J.; Bedard, Donald; Seitzer, Patrick; Cardona, Tommaso; Cowardin, Heather; Barker, Ed; Lederer, Susan
2012-01-01
Spectral reflectance data through the visible regime were collected at Las Campanas Observatory in Chile using an imaging spectrograph on one of the twin 6.5-m Magellan telescopes. The data were obtained on 1-2 May 2012 on the 'Landon Clay' telescope with the LDSS3 (Low Dispersion Survey Spectrograph 3). Five pieces of Geosynchronous Orbit (GEO) or near-GEO debris were identified and observed with an exposure time of 30 seconds on average. In addition, laboratory spectral reflectance data were collected using an Analytical Spectral Device (ASD) field spectrometer at California Polytechnic State University in San Luis Obispo on several common spacecraft materials including solar cells, circuit boards, various Kapton materials used for multi-layer insulation, and various paints. The remotely collected data and the laboratory-acquired data were then incorporated in a newly developed model that uses a constrained least squares method to unmix the spectrum into specific material components. The results of this model are compared to the previous method of a human-in-the-loop (considered here the traditional method) that identifies possible material components by varying the materials and percentages until a spectral match is obtained. The traditional model was found to match the remotely collected spectral data after it had been divided by the continuum to remove the space weathering effects, or a "reddening" of the materials. The constrained least-squares model also used the de-reddened spectra as inputs and the results were consistent with those obtained through the traditional method. For comparison, a first-order examination of including reddening effects into the constrained least-squares model will be explored and comparisons to the remotely collected data will be examined. The identification of each object's suspected material component will be discussed herein.
A. M. S. Smith; L. B. Lenilte; A. T. Hudak; P. Morgan
2007-01-01
The Differenced Normalized Burn Ratio (deltaNBR) is widely used to map post-fire effects in North America from multispectral satellite imagery, but has not been rigorously validated across the great diversity in vegetation types. The importance of these maps to fire rehabilitation crews highlights the need for continued assessment of alternative remote sensing...
NASA Astrophysics Data System (ADS)
Bharti, Rishikesh; Ramakrishnan, D.; Singh, K. D.
2014-02-01
This study investigated the potential of Moon Mineralogy Mapper (M3) data for studying compositional variation in the near-, far-side transition zone of the lunar surface. For this purpose, the radiance values of the M3 data were corrected for illumination and emission related effects and converted to apparent reflectance. Dimensionality of the calibrated reflectance image cube was reduced using Independent Component Analysis (ICA) and endmembers were extracted by using the Pixel Purity Index (PPI) algorithm. The selected endmembers were linearly unmixed and resolved for mineralogy using United States Geological Survey (USGS) library spectra of minerals. These mineralogically resolved endmembers were used to map the compositional variability within, and outside, craters using the Spectral Angle Mapper (SAM) algorithm. Cross validation for certain litho types was attempted using band ratios such as Optical Maturity (OMAT), Color Ratio Composite and Integrated Band Depth ratio (IBD). The identified lithologies for highland and basin areas match well with published works and strongly support depth related magmatic differentiation. The prevalence of pigeonite-basalt, pigeonite-norite and pyroxenite in crater peaks and floors is unique to the investigated area and is attributed to local, lateral variability in magma composition due to pressure, temperature, and rate of cooling.
NASA Astrophysics Data System (ADS)
Tavakoli, Behnoosh; Chen, Ying; Guo, Xiaoyu; Kang, Hyun Jae; Pomper, Martin; Boctor, Emad M.
2015-03-01
Targeted contrast agents can improve the sensitivity of imaging systems for cancer detection and treatment monitoring. In order to accurately detect contrast agent concentrations from photoacoustic images, we developed a decomposition algorithm to separate the photoacoustic absorption spectrum into components from individual absorbers. In this study, we evaluated novel prostate-specific membrane antigen (PSMA) targeted agents for imaging prostate cancer. Three agents were synthesized by conjugating a PSMA-targeting urea with the optical dyes ICG, IRDye800CW and ATTO740, respectively. In our preliminary PA study, dyes were injected into a thin-walled plastic tube embedded in a water tank. The tube was illuminated with pulsed laser light using a tunable Q-switched Nd:YAG laser. PA signals, along with B-mode ultrasound images, were detected with a diagnostic ultrasound probe in orthogonal mode. PA spectra of each dye at concentrations of 0.5 to 20 μM were estimated using the maximum PA signal extracted from images obtained at illumination wavelengths of 700-850 nm. Subsequently, we developed a nonnegative linear least-squares optimization method with localized regularization to solve the spectral unmixing problem. The algorithm was tested by imaging mixtures of those dyes. The concentration of each dye was estimated with about 20% error on average for almost all mixtures, despite the small separation between the dyes' spectra.
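The unmixing step described above reduces, at its core, to a non-negative least-squares fit of the measured photoacoustic spectrum against known dye spectra. The sketch below shows that core step only; the localized regularization mentioned in the abstract is not reproduced, and the dye spectra, concentrations and noise level are synthetic stand-ins.

```python
import numpy as np
from scipy.optimize import nnls

# Decompose a measured photoacoustic spectrum into contributions from known dyes.
wavelengths = np.arange(700, 851, 10)                 # 700-850 nm illumination
rng = np.random.default_rng(5)
dye_spectra = np.abs(rng.normal(1.0, 0.4, (len(wavelengths), 3)))  # stand-ins for ICG, IRDye800CW, ATTO740
true_conc = np.array([2.0, 0.5, 1.0])                 # arbitrary concentration units
measured = dye_spectra @ true_conc + 0.05 * rng.standard_normal(len(wavelengths))

est_conc, residual = nnls(dye_spectra, measured)
print(np.round(est_conc, 2), round(float(residual), 3))  # estimated concentrations and fit residual
```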
NASA Technical Reports Server (NTRS)
Kumar, Uttam; Nemani, Ramakrishna R.; Ganguly, Sangram; Kalia, Subodh; Michaelis, Andrew
2017-01-01
In this work, we use a Fully Constrained Least Squares Subpixel Learning Algorithm to unmix global WELD (Web Enabled Landsat Data) to obtain fractions or abundances of the substrate (S), vegetation (V) and dark object (D) classes. Because of the sheer volume of data and compute needs, we leveraged the NASA Earth Exchange (NEX) high performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into 4 classes, namely forest, farmland, water and urban areas (with NPP-VIIRS - National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite - nighttime lights data) over California, USA using a Random Forest classifier. Validation of these land cover maps with NLCD (National Land Cover Database) 2011 products and NAFD (North American Forest Dynamics) static forest cover maps showed that an overall classification accuracy of over 91 percent was achieved, which is a 6 percent improvement of unmixing-based classification relative to per-pixel-based classification. As such, abundance maps continue to offer a useful alternative to classification maps derived from high-spatial-resolution data for forest inventory analysis, multi-class mapping for eco-climatic models and applications, fast multi-temporal trend analysis and for societal and policy-relevant applications needed at the watershed scale.
NASA Astrophysics Data System (ADS)
Ganguly, S.; Kumar, U.; Nemani, R. R.; Kalia, S.; Michaelis, A.
2017-12-01
In this work, we use a Fully Constrained Least Squares Subpixel Learning Algorithm to unmix global WELD (Web Enabled Landsat Data) to obtain fractions or abundances of the substrate (S), vegetation (V) and dark object (D) classes. Because of the sheer volume of data and compute needs, we leveraged the NASA Earth Exchange (NEX) high performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into 4 classes, namely forest, farmland, water and urban areas (with NPP-VIIRS - National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite - nighttime lights data) over California, USA using a Random Forest classifier. Validation of these land cover maps with NLCD (National Land Cover Database) 2011 products and NAFD (North American Forest Dynamics) static forest cover maps showed that an overall classification accuracy of over 91% was achieved, which is a 6% improvement of unmixing-based classification relative to per-pixel-based classification. As such, abundance maps continue to offer a useful alternative to classification maps derived from high-spatial-resolution data for forest inventory analysis, multi-class mapping for eco-climatic models and applications, fast multi-temporal trend analysis and for societal and policy-relevant applications needed at the watershed scale.
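The characterization step described above (abundances in, land cover classes out) can be sketched with a standard random forest classifier. This is a generic scikit-learn illustration under assumed inputs (per-pixel S, V, D abundances plus a nighttime-lights feature, with placeholder training labels), not the NEX pipeline itself.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Classify per-pixel S/V/D abundances plus a nightlights feature into
# forest, farmland, water and urban. Training data are random placeholders,
# not the NLCD/NAFD reference data used in the study.
rng = np.random.default_rng(6)
X_train = rng.random((500, 4))            # columns: S, V, D abundance + nightlights
y_train = rng.integers(0, 4, 500)         # 0=forest, 1=farmland, 2=water, 3=urban
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

X_pixels = rng.random((10, 4))            # abundances for 10 hypothetical pixels
print(clf.predict(X_pixels))              # predicted land cover class per pixel
```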
Validating the LASSO algorithm by unmixing spectral signatures in multicolor phantoms
NASA Astrophysics Data System (ADS)
Samarov, Daniel V.; Clarke, Matthew; Lee, Ji Yoon; Allen, David; Litorja, Maritoni; Hwang, Jeeseong
2012-03-01
As hyperspectral imaging (HSI) sees increased implementation in the biological and medical fields, it becomes increasingly important that the algorithms being used to analyze the corresponding output be validated. While certainly important under any circumstance, as this technology begins to see a transition from benchtop to bedside, ensuring that the measurements being given to medical professionals are accurate and reproducible is critical. In order to address these issues, work has been done in generating a collection of datasets which could act as a test bed for algorithm validation. Using a microarray spot printer, a collection of three food color dyes, acid red 1 (AR), brilliant blue R (BBR) and erioglaucine (EG), are mixed together at different concentrations in varying proportions at different locations on a microarray chip. With the concentration and mixture proportions known at each location, using HSI an algorithm should in principle, based on estimates of abundances, be able to determine the concentrations and proportions of each dye at each location on the chip. These types of data are particularly important in the context of medical measurements as the resulting estimated abundances will be used to make critical decisions which can have a serious impact on an individual's health. In this paper we present a novel algorithm for processing and analyzing HSI data based on the LASSO algorithm (similar to "basis pursuit"). The LASSO is a statistical method for simultaneously performing model estimation and variable selection. In the context of estimating abundances in an HSI scene, these so-called "sparse" representations provided by the LASSO are appropriate, as not every pixel will be expected to contain every endmember. The algorithm we present takes the general framework of the LASSO algorithm a step further and incorporates the rich spatial information which is available in HSI to further improve the estimates of abundance. We show our algorithm's improvement over the standard LASSO using the dye mixture data as the test bed.
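As a point of reference for the sparse estimation idea discussed above, the snippet below fits non-negative LASSO abundances over a small dye "dictionary" with scikit-learn. It sketches only the standard LASSO baseline, not the spatially informed extension the paper proposes; the dye spectra, mixture and regularization weight are assumed placeholders.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Sparse, non-negative abundance estimation over a dye dictionary.
rng = np.random.default_rng(7)
bands = 60
dictionary = np.abs(rng.normal(1.0, 0.5, (bands, 3)))      # stand-ins for AR, BBR, EG endmembers
true_abund = np.array([0.6, 0.0, 0.4])                      # only two dyes present at this spot
y = dictionary @ true_abund + 0.01 * rng.standard_normal(bands)

lasso = Lasso(alpha=0.01, positive=True, max_iter=10000).fit(dictionary, y)
print(np.round(lasso.coef_, 3))    # sparse estimate; near-zero weight for the absent dye
```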
Mohan P. Tiruveedhula; Joseph Fan; Ravi R. Sadasivuni; Surya S. Durbha; David L. Evans
2010-01-01
The accumulation of small diameter trees (SDTs) is becoming a nationwide concern. Forest management practices such as fire suppression and selective cutting of high grade timber have contributed to an overabundance of SDTs in many areas. Alternative value-added utilization of SDTs (for composite wood products and biofuels) has prompted the need to estimate their...
Spectral Unmixing Applied to Desert Soils for the Detection of Sub-Pixel Disturbances
2012-09-01
...and Glazner, 1997). Rocks underlying Panum Crater consist of the granitic and metamorphic batholith associated with the Sierra Nevada. On top of this... ...technology can be used to detect and characterize surface disturbance both literally (visually) and non-literally (analytically). Non-literal approaches...
NASA Astrophysics Data System (ADS)
Dong, Biqin; Almassalha, Luay Matthew; Urban, Ben E.; Nguyen, The-Quyen; Khuon, Satya; Chew, Teng-Leong; Backman, Vadim; Sun, Cheng; Zhang, Hao F.
2017-02-01
Distinguishing minute differences in spectroscopic signatures is crucial for revealing the fluorescence heterogeneity among fluorophores to achieve a high molecular specificity. Here we report spectroscopic photon localization microscopy (SPLM), a newly developed far-field spectroscopic imaging technique, to achieve nanoscopic resolution based on the principle of single-molecule localization microscopy while simultaneously uncovering the inherent molecular spectroscopic information associated with each stochastic event (Dong et al., Nature Communications 2016, in press). In SPLM, by using a slit-less monochromator, both the zero-order and the first-order diffractions from a grating were recorded simultaneously by an electron multiplying charge-coupled device to reveal the spatial distribution and the associated emission spectra of individual stochastic radiation events, respectively. As a result, the origins of photon emissions from different molecules can be identified according to their spectral differences with sub-nm spectral resolution, even when the molecules are within close proximity. With the newly developed algorithms including background subtraction and spectral overlap unmixing, we established and tested a method which can significantly extend the fundamental spatial resolution limit of single molecule localization microscopy by molecular discrimination through spectral regression. Taking advantage of this unique capability, we demonstrated improvement in spatial resolution of PALM/STORM up to ten fold with selected fluorophores. This technique can be readily adopted by other research groups to greatly enhance the optical resolution of single molecule localization microscopy without the need to modify their existing staining methods and protocols. This new resolving capability can potentially provide new insights into biological phenomena and enable significant research progress to be made in the life sciences.
NASA Astrophysics Data System (ADS)
Varatharajan, I.; D'Amore, M.; Maturilli, A.; Helbert, J.; Hiesinger, H.
2017-12-01
The Mercury Radiometer and Thermal Imaging Spectrometer (MERTIS) payload of the ESA/JAXA BepiColombo mission to Mercury will map thermal emissivity over the wavelength range 7-14 μm at a spatial resolution of 500 m/pixel [1]. Mercury was also imaged in the same wavelength range using Boston University's Mid-Infrared Spectrometer and Imager (MIRSI) mounted on the NASA Infrared Telescope Facility (IRTF) on Mauna Kea, Hawaii, with a minimum spatial coverage of 400-600 km per spectrum, which blends all rock, mineral, and soil types [2]. The study [2] therefore used the quantitative deconvolution algorithm developed by [3] for spectral unmixing of this composite thermal emissivity spectrum from the telescope into the respective areal fractions of endmember spectra; however, the thermal emissivity of the endmembers used in [2] consists of inverted reflectance measurements (Kirchhoff's law) of various samples measured at room temperature and pressure. For over a decade, the Planetary Spectroscopy Laboratory (PSL) at the Institute of Planetary Research (PF) of the German Aerospace Center (DLR) has provided thermal emissivity measurements under controlled and simulated Mercury surface conditions, taking emissivity measurements at temperatures from 100-500°C under vacuum in support of the MERTIS payload. The measured thermal emissivity endmember spectral library therefore includes major silicates such as bytownite, anorthoclase, synthetic glass, olivine, enstatite, and nepheline basanite; rocks such as komatiite and tektite; the Johnson Space Center lunar simulant (1A); and synthetic powdered sulfides, which include MgS, FeS, CaS, CrS, TiS, NaS, and MnS. Using such a specialized endmember spectral library created under Mercury's conditions significantly increases the accuracy of the deconvolution model results. In this study, we revisited the available telescope spectra and redeveloped the algorithm of [3] using only the endmember spectral library created at PSL, for unbiased model accuracy with RMS values of 0.03-0.04. Currently, the calibration of the telescope spectra is being investigated, and the results will be presented at AGU. References: [1] Hiesinger, H. and J. Helbert (2010) PSS, 58(1-2): 144-165. [2] Sprague, A.L. et al. (2009) PSS, 57, 364-383. [3] Ramsey and Christiansen (1998) JGR, 103, 577-596.
NASA Astrophysics Data System (ADS)
Fedrigo, Melissa; Newnham, Glenn J.; Coops, Nicholas C.; Culvenor, Darius S.; Bolton, Douglas K.; Nitschke, Craig R.
2018-02-01
Light detection and ranging (lidar) data have been increasingly used for forest classification due to their ability to penetrate the forest canopy and provide detail about the structure of the lower strata. In this study we demonstrate forest classification approaches using airborne lidar data as inputs to random forest and linear unmixing classification algorithms. Our results demonstrated that both random forest and linear unmixing models identified a distribution of rainforest and eucalypt stands that was comparable to existing ecological vegetation class (EVC) maps based primarily on manual interpretation of high resolution aerial imagery. Rainforest stands were also identified in the region that have not previously been identified in the EVC maps. The transition between stand types was better characterised by the random forest modelling approach. In contrast, the linear unmixing model placed greater emphasis on field plots selected as endmembers, which may not have captured the variability in stand structure within a single stand type. The random forest model had the highest overall accuracy (84%) and Cohen's kappa coefficient (0.62). However, the classification accuracy was only marginally better than that of linear unmixing. The random forest model was applied to a region in the Central Highlands of south-eastern Australia to produce maps of stand type probability, including areas of transition (the 'ecotone') between rainforest and eucalypt forest. The resulting map provided a detailed delineation of forest classes, which specifically recognised the coalescing of stand types at the landscape scale. This represents a key step towards mapping the structural and spatial complexity of these ecosystems, which is important for both their management and conservation.
NASA Astrophysics Data System (ADS)
Li, Qingli; Peng, Hui; Wang, Jing; Wang, Yiting; Guo, Fangmin
2015-11-01
A direct spatial and spectral observation of CdSe and CdSe/CdS quantum dots (QDs) as probes in live cells is performed by using a custom molecular hyperspectral imaging (MHI) system. Water-soluble CdSe and CdSe/CdS QDs are synthesized in aqueous solution under the assistance of high-intensity ultrasonic irradiation and incubated with colon cancer cells for bioimaging. Unlike the traditional fluorescence microscopy methods, MHI technology can identify QD probes according to their spectral signatures and generate coexpression and stain titer maps by a clustering method. The experimental results show that the MHI method has potential to unmix biomarkers by their spectral information, which opens up a pathway of optical multiplexing with many different QD probes.
NASA Astrophysics Data System (ADS)
Howard, A. M.; Bernardes, S.; Nibbelink, N.; Biondi, L.; Presotto, A.; Fragaszy, D. M.; Madden, M.
2012-07-01
Movement patterns of bearded capuchin monkeys (Cebus (Sapajus) libidinosus) in northeastern Brazil are likely impacted by environmental features such as elevation, vegetation density, or vegetation type. Habitat preferences of these monkeys provide insights regarding the impact of environmental features on species ecology and the degree to which they incorporate these features in movement decisions. In order to evaluate environmental features influencing movement patterns and predict areas suitable for movement, we employed a maximum entropy modelling approach, using observation points along capuchin monkey daily routes as species presence points. We combined these presence points with spatial data on important environmental features from remotely sensed data on land cover and topography. A spectral mixing analysis procedure was used to generate fraction images that represent green vegetation, shade and soil of the study area. A Landsat Thematic Mapper scene of the area of study was geometrically and atmospherically corrected and used as input in a Minimum Noise Fraction (MNF) procedure and a linear spectral unmixing approach was used to generate the fraction images. These fraction images and elevation were the environmental layer inputs for our logistic MaxEnt model of capuchin movement. Our models' predictive power (test AUC) was 0.775. Areas of high elevation (>450 m) showed low probabilities of presence, and percent green vegetation was the greatest overall contributor to model AUC. This work has implications for predicting daily movement patterns of capuchins in our field site, as suitability values from our model may relate to habitat preference and facility of movement.
Practical considerations in experimental computational sensing
NASA Astrophysics Data System (ADS)
Poon, Phillip K.
Computational sensing has demonstrated the ability to ameliorate or eliminate many trade-offs in traditional sensors. Rather than attempting to form a perfect image, then sampling at the Nyquist rate, and reconstructing the signal of interest prior to post-processing, the computational sensor attempts to utilize a priori knowledge and active or passive coding of the signal of interest, combined with a variety of algorithms, to overcome the trade-offs or to improve various task-specific metrics. While it is a powerful approach to radically new sensor architectures, published research tends to focus on architecture concepts and positive results. Little attention is given to the practical issues faced when implementing computational sensing prototypes. I will discuss the various practical challenges that I encountered while developing three separate applications of computational sensors. The first is a compressive-sensing-based object tracking camera, the SCOUT, which exploits the sparsity of motion between consecutive frames while using no moving parts to create a pseudo-random shift-variant point-spread function. The second is a spectral imaging camera, the AFSSI-C, which uses a modified version of Principal Component Analysis with a Bayesian strategy to adaptively design spectral filters for direct spectral classification using a digital micro-mirror device (DMD) based architecture. The third demonstrates two separate architectures to perform spectral unmixing, using an adaptive algorithm or a hybrid technique using Maximum Noise Fraction and random filter selection, from a liquid-crystal-on-silicon based computational spectral imager, the LCSI. All of these applications demonstrate a variety of challenges that have been addressed or continue to challenge the computational sensing community. One issue is calibration, since many computational sensors require an inversion step and, in the case of compressive sensing, lack redundancy in the measurement data. Another issue is over-multiplexing: as more light is collected per sample, the finite dynamic range and quantization resolution can begin to degrade the recovery of the relevant information. A priori knowledge of the sparsity and/or other statistics of the signal or noise is often used by computational sensors to outperform their isomorphic counterparts. This is demonstrated in all three of the sensors I have developed. These challenges and others will be discussed using a case-study approach through these three applications.
MAX UnMix: A web application for unmixing magnetic coercivity distributions
NASA Astrophysics Data System (ADS)
Maxbauer, Daniel P.; Feinberg, Joshua M.; Fox, David L.
2016-10-01
It is common in the fields of rock and environmental magnetism to unmix magnetic mineral components using statistical methods that decompose various types of magnetization curves (e.g., acquisition, demagnetization, or backfield). A number of programs developed over the past decade are frequently used by the rock magnetic community; however, many of these programs are either outdated or have obstacles inhibiting their usability. MAX UnMix is a web application (available online at http://www.irm.umn.edu/maxunmix), built using the shiny package for R studio, that can be used for unmixing coercivity distributions derived from magnetization curves. Here, we describe in detail the statistical model underpinning the MAX UnMix web application and discuss the program's functionality. MAX UnMix is an improvement over previous unmixing programs in that it is designed to be user friendly, runs as an independent website, and is platform independent.
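To give a flavour of what unmixing a coercivity distribution means in practice, the sketch below fits a two-component mixture to a synthetic coercivity spectrum in log-field space with SciPy. MAX UnMix itself fits skew-normal components through a web interface; the symmetric Gaussian components, parameter values and data used here are simplified, assumed stand-ins, not the program's model.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_component_model(logB, a1, m1, s1, a2, m2, s2):
    """Sum of two Gaussian components in log10(coercivity) space (simplified stand-in)."""
    g = lambda a, m, s: a * np.exp(-0.5 * ((logB - m) / s) ** 2)
    return g(a1, m1, s1) + g(a2, m2, s2)

# Synthetic coercivity distribution (derivative of an acquisition curve);
# the values are illustrative, not real rock-magnetic data.
logB = np.linspace(0.5, 3.0, 120)                 # log10(field in mT)
true = two_component_model(logB, 1.0, 1.4, 0.25, 0.6, 2.2, 0.30)
rng = np.random.default_rng(8)
noisy = true + 0.02 * rng.standard_normal(logB.size)

p0 = [0.8, 1.3, 0.3, 0.5, 2.1, 0.3]               # rough initial guesses
popt, _ = curve_fit(two_component_model, logB, noisy, p0=p0)
print(np.round(popt, 2))                          # recovered component parameters
```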
Spectrally resolved visualization of fluorescent dyes permeating into skin
NASA Astrophysics Data System (ADS)
Maeder, Ulf; Bergmann, Thorsten; Beer, Sebastian; Burg, Jan Michael; Schmidts, Thomas; Runkel, Frank; Fiebich, Martin
2012-03-01
We present a spectrally resolved confocal imaging approach to qualitatively assess the overall uptake and the penetration depth of fluorescent dyes into biological tissue. We use a confocal microscope with a spectral resolution of 5 nm to measure porcine skin tissue after performing a Franz diffusion experiment with a submicron emulsion enriched with the fluorescent dye Nile Red. The evaluation uses linear unmixing of the dye and tissue autofluorescence spectra. The results are combined with a manual segmentation of the skin's epidermis and dermis layers to assess the penetration behavior in addition to the overall uptake. The diffusion experiments, performed for 3 h and 24 h, show a 3-fold increased dye uptake in the epidermis and dermis for the 24 h samples. As the method is based on spectral information, it does not face the problem of superimposed dye and tissue spectra and is therefore more precise than intensity-based evaluation methods.
Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath
2009-01-01
Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3 – 5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81–92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697
Arctic lead detection using a waveform unmixing algorithm from CryoSat-2 data
NASA Astrophysics Data System (ADS)
Lee, S.; Im, J.
2016-12-01
Arctic areas consist of ice floes, leads, and polynyas. While leads and polynyas account for only a small part of the Arctic Ocean, they play a key role in exchanging heat flux, moisture, and momentum between the atmosphere and ocean in wintertime because of their huge temperature difference. In this study, a linear waveform unmixing approach was proposed to detect lead fraction. CryoSat-2 waveforms for pure leads, sea ice, and ocean were used as end-members based on visual interpretation of MODIS images coincident with CryoSat-2 data. The unmixing model produced lead, sea ice, and ocean abundances, and a threshold (> 0.7) was applied to make a binary classification between lead and sea ice. The unmixing model produced better results than the existing models in the literature, which are based on simple thresholding approaches. The results were also comparable with our previous research using machine learning based models (i.e., decision trees and random forest). A monthly lead fraction was calculated by dividing the number of detected leads by the total number of measurements. The lead fraction around the Beaufort Sea and Fram Strait was high due to the anti-cyclonic rotation of the Beaufort Gyre and the outflow of sea ice to the Atlantic. The lead fraction maps produced in this study matched well with monthly lead fraction maps in the literature. The areas with thin sea ice identified from our previous research correspond to the high lead fraction areas in the present study. Furthermore, sea ice roughness from the ASCAT scatterometer was compared to a lead fraction map to examine the relationship between surface roughness and lead distribution.
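The per-waveform unmixing and threshold step described in this abstract amounts to solving a small linear system and testing the lead abundance. A minimal sketch under assumed inputs (the end-member matrix E would come from the visually selected CryoSat-2 waveforms; the non-negativity constraint and the normalisation to fractions are assumptions, while the 0.7 threshold is quoted from the abstract):

```python
import numpy as np
from scipy.optimize import nnls

def classify_waveform(waveform, E, lead_col=0, threshold=0.7):
    """Unmix one CryoSat-2 waveform into end-member abundances.

    waveform : (n_bins,) observed echo
    E        : (n_bins, 3) end-member waveforms [lead, sea ice, ocean]
    Returns the abundance vector and a binary lead flag.
    """
    abundances, _ = nnls(E, waveform)        # non-negative least squares
    abundances /= abundances.sum() + 1e-12   # normalise to fractions
    return abundances, abundances[lead_col] > threshold

def monthly_lead_fraction(lead_flags):
    """Lead fraction = number of detected leads / total measurements."""
    lead_flags = np.asarray(lead_flags, dtype=bool)
    return lead_flags.sum() / lead_flags.size
```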
Gao, L.; Hagen, N.; Tkaczyk, T. S.
2012-01-01
We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and accuracy of measured fluorophores' emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the use of the proposed full-spectrum imaging technique may result in a threefold improvement in signal dynamic range compared to that achievable with filter-based imaging. PMID:22356127
Hyperspectral light sheet microscopy
NASA Astrophysics Data System (ADS)
Jahr, Wiebke; Schmid, Benjamin; Schmied, Christopher; Fahrbach, Florian O.; Huisken, Jan
2015-09-01
To study the development and interactions of cells and tissues, multiple fluorescent markers need to be imaged efficiently in a single living organism. Instead of acquiring individual colours sequentially with filters, we created a platform based on line-scanning light sheet microscopy to record the entire spectrum for each pixel in a three-dimensional volume. We evaluated data sets with varying spectral sampling and determined the optimal channel width to be around 5 nm. With the help of these data sets, we show that our setup outperforms filter-based approaches with regard to image quality and discrimination of fluorophores. By spectral unmixing we resolved overlapping fluorophores with up to nanometre resolution and removed autofluorescence in zebrafish and fruit fly embryos.
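Spectral unmixing of this kind reduces, per pixel, to fitting reference emission spectra (optionally including an autofluorescence spectrum) to the measured spectrum by least squares. A minimal sketch under assumed inputs (reference spectra resampled to the instrument's roughly 5 nm channels; names and the non-negativity clipping are illustrative rather than the authors' exact pipeline):

```python
import numpy as np

def unmix_fluorophores(spectrum, references):
    """Least-squares unmixing of one voxel's emission spectrum.

    spectrum   : (n_channels,) measured spectrum for one voxel
    references : (n_channels, k) reference emission spectra, one column
                 per fluorophore (an autofluorescence column may be added)
    Returns non-negative relative contributions of each reference.
    """
    weights, *_ = np.linalg.lstsq(references, spectrum, rcond=None)
    return np.clip(weights, 0.0, None)
```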
NASA Astrophysics Data System (ADS)
Schmalz, M.; Ritter, G.; Key, R.
Accurate and computationally efficient spectral signature classification is a crucial step in the nonimaging detection and recognition of spaceborne objects. In classical hyperspectral recognition applications using linear mixing models, signature classification accuracy depends on accurate spectral endmember discrimination [1]. If the endmembers cannot be classified correctly, then the signatures cannot be classified correctly, and object recognition from hyperspectral data will be inaccurate. In practice, the number of endmembers accurately classified often depends linearly on the number of inputs. This can lead to potentially severe classification errors in the presence of noise or densely interleaved signatures. In this paper, we present a comparison of emerging technologies for nonimaging spectral signature classification based on a highly accurate, efficient search engine called Tabular Nearest Neighbor Encoding (TNE) [3,4] and a neural network technology called Morphological Neural Networks (MNNs) [5]. Based on prior results, TNE can optimize its classifier performance to track input nonergodicities, as well as yield measures of confidence or caution for evaluation of classification results. Unlike neural networks, TNE does not have a hidden intermediate data structure (e.g., the neural net weight matrix). Instead, TNE generates and exploits a user-accessible data structure called the agreement map (AM), which can be manipulated by Boolean logic operations to effect accurate classifier refinement algorithms. The open architecture and programmability of TNE's agreement map processing allows a TNE programmer or user to determine classification accuracy, as well as characterize in detail the signatures for which TNE did not obtain classification matches, and why such mismatches occurred. In this study, we compare TNE and MNN based endmember classification, using performance metrics such as probability of correct classification (Pd) and rate of false detections (Rfa). As proof of principle, we analyze classification of multiple closely spaced signatures from a NASA database of space material signatures. Additional analysis pertains to computational complexity and noise sensitivity, which are superior to Bayesian techniques based on classical neural networks. [1] Winter, M.E. "Fast autonomous spectral end-member determination in hyperspectral data," in Proceedings of the 13th International Conference On Applied Geologic Remote Sensing, Vancouver, B.C., Canada, pp. 337-44 (1999). [2] N. Keshava, "A survey of spectral unmixing algorithms," Lincoln Laboratory Journal 14:55-78 (2003). [3] Key, G., M.S. Schmalz, F.M. Caimi, and G.X. Ritter. "Performance analysis of tabular nearest neighbor encoding algorithm for joint compression and ATR", in Proceedings SPIE 3814:115-126 (1999). [4] Schmalz, M.S. and G. Key. "Algorithms for hyperspectral signature classification in unresolved object detection using tabular nearest neighbor encoding" in Proceedings of the 2007 AMOS Conference, Maui HI (2007). [5] Ritter, G.X., G. Urcid, and M.S. Schmalz. "Autonomous single-pass endmember approximation using lattice auto-associative memories", Neurocomputing (Elsevier), accepted (June 2008).
NASA Astrophysics Data System (ADS)
Higgins, M. A.; Asner, G. P.; Perez, E.; Elespuru, N.; Alonso, A.
2014-03-01
Tropical forests vary substantially in aboveground properties such as canopy height, canopy structure, and plant species composition, corresponding to underlying variations in soils and geology. Forest properties are often difficult to detect and map in the field, however, due to the remoteness and inaccessibility of these forests. Spectral mixture analysis of Landsat imagery allows mapping of photosynthetic and nonphotosynthetic vegetation quantities (PV and NPV), corresponding to biophysical properties such as canopy openness, forest productivity, and disturbance. Spectral unmixing has been used for applications ranging from deforestation monitoring to identifying burn scars from past fires, but little is known about variations in PV and NPV in intact rainforest. Here we use spectral unmixing of Landsat imagery to map PV and NPV in northern Amazonia, and to test their relationship to soils and plant species composition. To do this we sampled 117 sites crossing a geological boundary in northwestern Amazonia for soil cation concentrations and plant species composition. We then used the Carnegie Landsat Analysis System to map PV and NPV for these sites from multiple dates of Landsat imagery. We found that soil cation concentrations and plant species composition consistently explain a majority of the variation in remotely sensed PV and NPV values. After combining PV and NPV into a single variable (PV-NPV), we determined that the influence of soil properties on canopy properties was inseparable from the influence of plant species composition. In all cases, patterns in PV and NPV corresponded to underlying geological patterns. Our findings suggest that geology and soils regulate canopy PV and NPV values in intact tropical forest, possibly through changes in plant species composition.
Unmixing techniques for better segmentation of urban zones, roads, and open pit mines
NASA Astrophysics Data System (ADS)
Nikolov, Hristo; Borisova, Denitsa; Petkov, Doyno
2010-10-01
In this paper the linear unmixing method is applied to the classification of manmade objects, namely urbanized zones, roads, etc. The idea is to exploit more fully the possibilities offered by multispectral imagers of mid spatial resolution, in this case the TM/ETM+ instruments. In this research, unmixing is used to find consistent regression dependencies between multispectral data and data gathered by in-situ and airborne sensors. Correct identification of the mixed pixels is a key element, since the subsequent segmentation that forms the shape of an artificial feature can then be determined much more reliably. This especially holds true for objects with a relatively narrow structure, for example two-lane roads, for which the pixel size is larger than the object itself. We have combined ground spectrometry of asphalt, Landsat images of the region of interest, and in-situ asphalt measurements in order to delineate the narrow roads. The reflectance of paving stones made from granite is the highest among the measured materials, which also holds for open and stone pits. The potential for mapping is not limited to mid-resolution Landsat data, but extends to data of higher spatial resolution (as fine as 0.5 m). In this research the spectral and directional reflection properties of asphalt and concrete surfaces were measured and compared to those of paving stones made from different rocks. The in-situ measurements, which play a key role, were obtained using the Thematically Oriented Multichannel Spectrometer (TOMS) designed at STIL-BAS.
NASA Astrophysics Data System (ADS)
Feng, Jilu; Rogge, Derek; Rivard, Benoit
2018-02-01
This study investigates using Airborne Hyperspectral Imaging Systems (AISA) visible and short-wave infrared (SWIR) and Spatially Enhanced Broadband Array Spectrograph System (SEBASS) longwave infrared (LWIR) imagery (2 and 4 m spatial resolution, respectively), independently and in combination, to produce detailed lithologic maps in a subarctic region (Cape Smith Belt, Nunavik, Canada) where regionally metamorphosed lower greenschist mafic, ultramafic and sedimentary rocks are exposed in the presence of lichen coatings. We make use of continuous wavelet analysis (CWA) to improve the radiometric quality of the imagery through the minimization of random noise and the enhancement of spectral features, the minimization of residual errors in the ISAC radiometric correction and target temperature estimation in the case of the LWIR data, the minimization of line-to-line residual calibration effects that lead to inconsistencies in data mosaics, and the reduction in variability of the spectral continuum introduced by variable illumination and topography. The use of CWA also provides a platform to directly combine the wavelet-scale spectral profiles of the SWIR and LWIR after applying a scalar correction factor to the LWIR such that the dynamic ranges of the two data sets have equal weight. This is possible using CWA because the datasets are normalized to zero mean, allowing spectra from different spectral regions to be adjoined. Lithologic maps are generated using an iterative spectral unmixing approach with image spectral endmembers extracted from the SWIR and LWIR imagery based on locations defined from previous work in the study area and field mapping information. Unmixing results for the independent SWIR and LWIR data and for the combined data show clear benefits to using the CWA-combined imagery. The analysis showed that SWIR and LWIR imagery highlight similar regions and spatial distributions for the three ultramafic units (dunite, peridotite, pyroxenite). However, significant differences are observed for quartz-rich sediments, with the SWIR overestimating the distribution of these rocks whereas the LWIR provided more consistent results compared with existing maps. Both SWIR and LWIR imagery were impacted by the pervasive lichen coatings on the mafic rocks (basalts and gabbros), although the SWIR provided better results than the LWIR. Limitations observed for the independent data sets were removed using the combined spectral data, resulting in all geologically meaningful units being mapped correctly in comparison with existing geological maps.
NASA Astrophysics Data System (ADS)
Sun, Yuansheng; Booker, Cynthia F.; Day, Richard N.; Periasamy, Ammasi
2009-02-01
Förster resonance energy transfer (FRET) methodology has been used for over 30 years to localize protein-protein interactions in living specimens. The cloning and modification of various visible fluorescent proteins (FPs) has generated a variety of new probes that can be used as FRET pairs to investigate protein associations in living cells. However, the spectral cross-talk between FRET donor and acceptor channels has been a major limitation to FRET microscopy. Many investigators have developed different ways to eliminate the bleedthrough signals in the FRET channel for one donor and one acceptor. We developed a novel FRET microscopy method for studying interactions among three chromophores: three-color FRET microscopy. We generated a genetic construct that directly links the three FPs - monomeric teal FP (mTFP), Venus and tandem dimer Tomato (tdTomato) - and demonstrated the occurrence of mutually dependent energy transfers among the three FPs. When expressed in cells and excited with the 458 nm laser line, the mTFP-Venus-tdTomato fusion proteins yielded parallel (mTFP to Venus and mTFP to tdTomato) and sequential (mTFP to Venus and then to tdTomato) energy transfer signals. To quantify the FRET signals in the three-FP system in a single living cell, we developed an algorithm to remove all the spectral cross-talk components and to separate different FRET signals at the same emission channel, using the laser scanning spectral imaging and linear unmixing techniques on the Zeiss 510 META system. Our results were confirmed with fluorescence lifetime measurements and using acceptor photobleaching FRET microscopy.
NASA Technical Reports Server (NTRS)
Warner, Amanda Susan
2002-01-01
The High Plains is an economically important and climatologically sensitive region of the United States and Canada. The High Plains contain 100,000 sq km of Holocene sand dunes and sand sheets that are currently stabilized by natural vegetation. Droughts and the larger threat of global warming are climate phenomena that could cause depletion of natural vegetation and make this region susceptible to sand dune reactivation. This thesis is part of a larger study that is assessing the effect of climate variability on the natural vegetation that covers the High Plains using Landsat 5 and Landsat 7 data. The question this thesis addresses is how fractional vegetation cover can be mapped with the Landsat instruments using linear spectral mixture analysis, and to what accuracy. The method discussed in this thesis made use of a high spatial and spectral resolution sensor called AVIRIS (Airborne Visible and Infrared Imaging Spectrometer) and field measurements to test vegetation mapping in three Landsat 7 sub-scenes. Near-simultaneous AVIRIS images near Ft. Morgan, Colorado and near Logan, New Mexico were acquired on July 10, 1999 and September 30, 1999, respectively. The AVIRIS flights preceded the Landsat 7 overpasses by approximately one hour. These data provided the opportunity to test spectral mixture algorithms with AVIRIS and to use these data to constrain the multispectral mixed pixels of Landsat 7. The comparisons of mixture analysis between the two instruments showed that AVIRIS endmembers can be used to unmix Landsat 7 data with good estimates of soil cover, and reasonable estimates of non-photosynthetic vegetation and green vegetation. Landsat 7 derived image endmembers correlate with AVIRIS fractions, but the error is relatively large and does not give a precise estimate of cover.
Transcutaneous Raman Spectroscopy of Bone
NASA Astrophysics Data System (ADS)
Maher, Jason R.
Clinical diagnoses of bone health and fracture risk typically rely upon measurements of bone density or structure, but the strength of a bone is also dependent upon its chemical composition. One technology that has been used extensively in ex vivo, exposed-bone studies to measure the chemical composition of bone is Raman spectroscopy. This spectroscopic technique provides chemical information about a sample by probing its molecular vibrations. In the case of bone tissue, Raman spectra provide chemical information about both the inorganic mineral and organic matrix components, which each contribute to bone strength. To explore the relationship between bone strength and chemical composition, our laboratory has contributed to ex vivo, exposed-bone animal studies of rheumatoid arthritis, glucocorticoid-induced osteoporosis, and prolonged lead exposure. All of these studies suggest that Raman-based predictions of biomechanical strength may be more accurate than those produced by the clinically-used parameter of bone mineral density. The utility of Raman spectroscopy in ex vivo, exposed-bone studies has inspired attempts to perform bone spectroscopy transcutaneously. Although the results are promising, further advancements are necessary to make non-invasive, in vivo measurements of bone that are of sufficient quality to generate accurate predictions of fracture risk. In order to separate the signals from bone and soft tissue that contribute to a transcutaneous measurement, we developed an overconstrained extraction algorithm that is based upon fitting with spectral libraries derived from separately-acquired measurements of the underlying tissue components. This approach allows for accurate spectral unmixing despite the fact that similar chemical components (e.g., type I collagen) are present in both soft tissue and bone and was applied to experimental data in order to transcutaneously detect, to our knowledge for the first time, age- and disease-related spectral differences in murine bone.
NASA Astrophysics Data System (ADS)
Ganguly, S.; Kumar, U.; Nemani, R. R.; Kalia, S.; Michaelis, A.
2016-12-01
In this work, we use a Fully Constrained Least Squares Subpixel Learning Algorithm to unmix global WELD (Web Enabled Landsat Data) to obtain fractions or abundances of substrate (S), vegetation (V) and dark object (D) classes. Because of the sheer volume of data and the compute needs, we leveraged the NASA Earth Exchange (NEX) high performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into 4 classes, namely forest, farmland, water and urban areas (with NPP-VIIRS - National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite - nighttime lights data) over California, USA using a Random Forest classifier. Validation of these land cover maps with NLCD (National Land Cover Database) 2011 products and NAFD (North American Forest Dynamics) static forest cover maps showed that an overall classification accuracy of over 91% was achieved, which is a 6% improvement in unmixing-based classification relative to per-pixel based classification. As such, abundance maps continue to offer a useful alternative to classification maps derived from high-spatial resolution data for forest inventory analysis, multi-class mapping for eco-climatic models and applications, fast multi-temporal trend analysis, and societal and policy-relevant applications needed at the watershed scale.
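Fully constrained least squares (FCLS) unmixing imposes both non-negativity and sum-to-one constraints on the abundances. A minimal per-pixel sketch, assuming a fixed end-member matrix and enforcing the sum-to-one constraint softly via a heavily weighted appended row (a common practical trick, not necessarily the NEX implementation described above; all names are illustrative):

```python
import numpy as np
from scipy.optimize import lsq_linear

def fcls_unmix(pixel, E, sum_weight=1e3):
    """Fully constrained least squares unmixing of one pixel.

    pixel : (n_bands,) reflectance vector
    E     : (n_bands, k) end-member matrix, e.g. substrate/vegetation/dark
    Non-negativity comes from the bounds; the sum-to-one constraint is
    enforced softly by appending a heavily weighted row of ones.
    """
    A = np.vstack([E, sum_weight * np.ones((1, E.shape[1]))])
    b = np.concatenate([pixel, [sum_weight]])
    res = lsq_linear(A, b, bounds=(0.0, 1.0))
    return res.x
```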
Unmixing AVHRR Imagery to Assess Clearcuts and Forest Regrowth in Oregon
NASA Technical Reports Server (NTRS)
Hlavka, Christine A.; Spanner, Michael A.
1995-01-01
Advanced Very High Resolution Radiometer imagery provides frequent and low-cost coverage of the earth, but its coarse spatial resolution (approx. 1.1 km by 1.1 km) does not lend itself to standard techniques of automated categorization of land cover classes because the pixels are generally mixed; that is, the extent of the pixel includes several land use/cover classes. Unmixing procedures were developed to extract land use/cover class signatures from mixed pixels, using Landsat Thematic Mapper data as a source for the training set, and to estimate fractions of class coverage within pixels. Application of these unmixing procedures to mapping forest clearcuts and regrowth in Oregon indicated that unmixing is a promising approach for mapping major trends in land cover with AVHRR bands 1 and 2. Including thermal bands by unmixing AVHRR bands 1-4 did not lead to significant improvements in accuracy, but experiments with unmixing these four bands did indicate that use of weighted least squares techniques might lead to improvements in other applications of unmixing.
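The closing remark about weighted least squares can be made concrete: per-band weights (for instance inverse noise variances; the choice of weights here is an assumption, not taken from the paper) simply rescale the rows of the mixing system before an ordinary least-squares solve.

```python
import numpy as np

def weighted_unmix(pixel, E, band_weights):
    """Weighted least-squares unmixing of one coarse-resolution pixel.

    pixel        : (n_bands,) observed values (e.g. AVHRR bands 1-4)
    E            : (n_bands, k) class signatures from the training set
    band_weights : (n_bands,) larger values give a band more influence
    Returns estimated class-coverage fractions (unconstrained).
    """
    w = np.sqrt(np.asarray(band_weights, dtype=float))
    fractions, *_ = np.linalg.lstsq(E * w[:, None], pixel * w, rcond=None)
    return fractions
```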
NASA Astrophysics Data System (ADS)
Cui, Qian; Shi, Jiancheng; Xu, Yuanliu
2011-12-01
Water is a basic need for human society and a determining factor of ecosystem stability as well. There are many lakes on the Tibetan Plateau, which can lead to floods and mudslides when the water area expands sharply. At present, water area is extracted from TM or SPOT data because of their high spatial resolution; however, their temporal resolution is insufficient. MODIS data have high temporal resolution and broad coverage, so they are a valuable resource for detecting changes in water area. Because of its low spatial resolution, mixed pixels are common. In this paper, four spectral libraries are built using the MOD09A1 product; based on these, water bodies are extracted at the sub-pixel level utilizing Multiple Endmember Spectral Mixture Analysis (MESMA) applied to the MODIS daily reflectance data MOD09GA. The unmixed result is compared with contemporaneous TM data, and it is shown that this method has high accuracy.
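MESMA differs from simple mixture analysis in that each pixel is tested against many candidate end-member combinations drawn from the spectral libraries, and the best-fitting model is kept. A minimal sketch under assumed inputs (one candidate library per class; an exhaustive search over combinations, which is one plausible configuration rather than the authors' exact setup):

```python
import numpy as np
from itertools import product

def mesma_pixel(pixel, libraries):
    """Pick the end-member combination with the lowest RMSE for one pixel.

    pixel     : (n_bands,) MOD09GA reflectance
    libraries : list of arrays, one per class; each array is
                (n_candidates, n_bands) of candidate end-member spectra
    Returns (fractions, rmse, chosen_candidate_indices).
    """
    best = None
    for idx in product(*[range(len(lib)) for lib in libraries]):
        E = np.column_stack([lib[i] for lib, i in zip(libraries, idx)])
        f, *_ = np.linalg.lstsq(E, pixel, rcond=None)      # model fit
        rmse = np.sqrt(np.mean((E @ f - pixel) ** 2))      # model error
        if best is None or rmse < best[1]:
            best = (f, rmse, idx)
    return best
```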
NASA Astrophysics Data System (ADS)
Leverington, D. W.
2008-12-01
The use of remote-sensing techniques in the discrimination of rock and soil classes in northern regions can help support a diverse range of activities including environmental characterization, mineral exploration, and the study of Quaternary paleoenvironments. Images of low spectral resolution can commonly be used in the mapping of lithological classes possessing distinct spectral characteristics, but hyperspectral databases offer greater potential for discrimination of materials distinguished by more subtle reflectance properties. Orbiting sensors offer an especially flexible and cost-effective means for acquisition of data to workers unable to conduct airborne surveys. In an effort to better constrain the utility of hyperspectral datasets in northern research, this study undertook to investigate the effectiveness of EO-1 Hyperion data in the discrimination and mapping of surface classes at a study area on Melville Island, Nunavut. Bedrock units in the immediate study area consist of late-Paleozoic clastic and carbonate sequences of the Sverdrup Basin. Weathered and frost-shattered felsenmeer, predominantly taking the form of boulder- to pebble-sized clasts that have accumulated in place and that mantle parent bedrock units, is the most common surface material in the study area. Hyperion data were converted from at-sensor radiance to reflectance, and were then linearly unmixed on the basis of end-member spectra measured from field samples. Hyperion unmixing results effectively portray the general fractional cover of six end members, although the fraction images of several materials contain background values that in some areas overestimate surface exposure. The best separated end members include the snow, green vegetation, and red-weathering sandstone classes, whereas the classes most negatively affected by elevated fraction values include the mudstone, limestone, and 'other' sandstone classes. Local overestimates of fractional cover are likely related to the shared lithological and weathering characteristics of several clastic and carbonate units, and may also be related to the lower radiometric precision characteristic of Hyperion data. Despite these issues, the databases generated in this study successfully provide useful complementary information to that provided by maps of local bedrock geology.
Noise estimation for hyperspectral imagery using spectral unmixing and synthesis
NASA Astrophysics Data System (ADS)
Demirkesen, C.; Leloglu, Ugur M.
2014-10-01
Most hyperspectral image (HSI) processing algorithms assume a signal to noise ratio model in their formulation, which makes them dependent on accurate noise estimation. Many techniques have been proposed to estimate the noise. A very comprehensive comparative study on the subject was done by Gao et al. [1]. In a nutshell, most techniques are based on the idea of calculating standard deviation from assumed-to-be homogenous regions in the image. Some of these algorithms work on a regular grid parameterized with a window size w, while others make use of image segmentation in order to obtain homogenous regions. This study focuses not only on the statistics of the noise but also on the estimation of the noise itself. A noise estimation technique motivated by a recent HSI de-noising approach [2] is proposed in this study. The de-noising algorithm is based on estimation of the end-members and their fractional abundances using the non-negative least squares method. The end-members are extracted using the well-known simplex volume optimization technique called NFINDR after manual selection of the number of end-members, and the image is reconstructed using the estimated end-members and abundances. Actually, image de-noising and noise estimation are two sides of the same coin: once we de-noise an image, we can estimate the noise by calculating the difference of the de-noised image and the original noisy image. In this study, the noise is estimated as described above. To assess the accuracy of this method, the methodology in [1] is followed, i.e., synthetic images are created by mixing end-member spectra and noise. Since the best performing method for noise estimation was spectral and spatial de-correlation (SSDC), originally proposed in [3], the proposed method is compared to SSDC. The results of the experiments conducted with synthetic HSIs suggest that the proposed noise estimation strategy outperforms the existing techniques in terms of mean and standard deviation of absolute error of the estimated noise. Finally, it is shown that the proposed technique demonstrates robust behavior with respect to changes in its single parameter, namely the number of end-members.
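The unmix-reconstruct-subtract cycle described above can be written compactly. The sketch below assumes the end-member matrix has already been extracted (e.g. by NFINDR, which is not re-implemented here) and uses per-pixel non-negative least squares for the abundances:

```python
import numpy as np
from scipy.optimize import nnls

def estimate_noise(cube, endmembers):
    """Estimate noise as the residual of an unmix-and-synthesise cycle.

    cube       : (rows, cols, n_bands) hyperspectral image
    endmembers : (n_bands, k) end-member spectra (e.g. from NFINDR)
    Returns a noise cube with the same shape as the input.
    """
    rows, cols, n_bands = cube.shape
    pixels = cube.reshape(-1, n_bands).astype(float)
    recon = np.empty_like(pixels)
    for i, p in enumerate(pixels):
        a, _ = nnls(endmembers, p)      # fractional abundances
        recon[i] = endmembers @ a       # synthesised (de-noised) pixel
    return (pixels - recon).reshape(rows, cols, n_bands)
```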
The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data
NASA Technical Reports Server (NTRS)
Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.
1992-01-01
The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.
NASA Astrophysics Data System (ADS)
Amaral, Cibele H.; Roberts, Dar A.; Almeida, Teodoro I. R.; Souza Filho, Carlos R.
2015-10-01
Biological invasion substantially contributes to the increasing extinction rates of native vegetative species. The remote detection and mapping of invasive species is critical for environmental monitoring. This study aims to assess the performance of a Multiple Endmember Spectral Mixture Analysis (MESMA) applied to imaging spectroscopy data for mapping Dendrocalamus sp. (bamboo) and Pinus elliottii L. (slash pine), which are invasive plant species, in a Brazilian neotropical landscape within the tropical Brazilian savanna biome. The work also investigates the spectral mixture between these exotic species and the native woody formations, including woodland savanna, submontane and alluvial seasonal semideciduous forests (SSF). Visible to Shortwave Infrared (VSWIR) imaging spectroscopy data at one-meter spatial resolution were atmospherically corrected and subset into the different spectral ranges (VIS-NIR1: 530-919 nm; and NIR2-SWIR: 1141-2352 nm). The data were further normalized via continuum removal (CR). Multiple endmember selection methods, including Interactive Endmember Selection (IES), Endmember average root mean square error (EAR), Minimum average spectral angle (MASA) and Count-based (CoB) (collectively called EMC), were employed to create endmember libraries for the targeted vegetation classes. The performance of the MESMA was assessed at the pixel and crown scales. Statistically significant differences (α = 0.05) were observed between overall accuracies that were obtained at various spectral ranges. The infrared region (IR) was critical for detecting the vegetation classes using spectral data. The invasive species endmembers exhibited spectral patterns in the IR that were not observed in the native formations. Bamboo was characterized as having a high green vegetation (GV) fraction, lower non-photosynthetic vegetation (NPV) and a low shade fraction, while pine exhibited higher NPV and shade fractions. The invasive species showed a statistically significant larger number of spectra erroneously assigned to the woodland savanna class versus the alluvial and submontane SSF classes. Consequently, the invasive species tended to be overestimated, especially in the woodland savanna. Bamboo was best classified using the VSWIR(CR) data with the EMC endmember selection method (User's accuracy and Producer's accuracy = 98.11% and 72.22%, respectively). Pine was best classified using NIR2-SWIR(CR) data with the IES selected endmembers (97.06% and 62.26%, respectively). The results obtained during the two-endmember modeling were fully translated into the three-endmember unmixed images. The sub-pixel invasive species abundance analysis showed that MESMA performs well when unmixing at the pixel scale and for mapping invasive species fractions in a complex neotropical environment, at pixel and crown scales with 1-m spatial resolution data.
Rowan, L.C.
1998-01-01
The advanced spaceborne thermal emission and reflection (ASTER) radiometer was designed to record reflected energy in nine channels with 15 or 30 m resolution, including stereoscopic images, and emitted energy in five channels with 90 m resolution from the NASA Earth Observing System AM1 platform. A simulated ASTER data set was produced for the Iron Hill, Colorado, study area by resampling calibrated, registered airborne visible/infrared imaging spectrometer (AVIRIS) data, and thermal infrared multispectral scanner (TIMS) data to the appropriate spatial and spectral parameters. A digital elevation model was obtained to simulate ASTER-derived topographic data. The main lithologic units in the area are granitic rocks and felsite into which a carbonatite stock and associated alkalic igneous rocks were intruded; these rocks are locally covered by Jurassic sandstone, Tertiary rhyolitic tuff, and colluvial deposits. Several methods were evaluated for mapping the main lithologic units, including the unsupervised classification and spectral curve-matching techniques. In the five thermal-infrared (TIR) channels, comparison of the results of linear spectral unmixing and unsupervised classification with published geologic maps showed that the main lithologic units were mapped, but large areas with moderate to dense tree cover were not mapped in the TIR data. Compared to TIMS data, simulated ASTER data permitted slightly less discrimination in the mafic alkalic rock series, and carbonatite was not mapped in the TIMS nor in the simulated ASTER TIR data. In the nine visible and near-infrared channels, unsupervised classification did not yield useful results, but both the spectral linear unmixing and the matched filter techniques produced useful results, including mapping calcitic and dolomitic carbonatite exposures, travertine in hot spring deposits, kaolinite in argillized sandstone and tuff, and muscovite in sericitized granite and felsite, as well as commonly occurring illite/muscovite. However, the distinction made in AVIRIS data between calcite and dolomite was not consistently feasible in the simulated ASTER data. Comparison of the lithologic information produced by spectral analysis of the simulated ASTER data to a photogeologic interpretation of a simulated ASTER color image illustrates the high potential of spectral analysis of ASTER data to geologic interpretation. This paper is not subject to U.S. copyright. Published in 1998 by the American Geophysical Union.
Different techniques of multispectral data analysis for vegetation fraction retrieval
NASA Astrophysics Data System (ADS)
Kancheva, Rumiana; Georgiev, Georgi
2012-07-01
Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the soil-vegetation system spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
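For the simplest of the listed techniques, two-end-member linear unmixing of a vegetation index, the canopy cover fraction follows directly from the bare-soil and full-cover reference values. A sketch (the default reference values below are placeholders to be calibrated against site data, not values from the paper):

```python
import numpy as np

def vegetation_fraction(ndvi, ndvi_soil=0.15, ndvi_veg=0.85):
    """Two-end-member linear unmixing of NDVI into a cover fraction.

    Assumes the pixel NDVI is a linear mix of a bare-soil and a
    full-canopy end member; reference values are site-specific.
    """
    frac = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return np.clip(frac, 0.0, 1.0)
```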
Suppression of vegetation in LANDSAT ETM+ remote sensing images
NASA Astrophysics Data System (ADS)
Yu, Le; Porwal, Alok; Holden, Eun-Jung; Dentith, Michael
2010-05-01
Vegetation cover is an impediment to the interpretation of multispectral remote sensing images for geological applications, especially in densely vegetated terrains. In order to enhance the underlying geological information in such terrains, it is desirable to suppress the reflectance component of vegetation. One form of spectral unmixing that has been successfully used for vegetation reflectance suppression in multispectral images is called "forced invariance". It is based on segregating components of the reflectance spectrum that are invariant with respect to a specific spectral index such as the NDVI. The forced invariance method uses algorithms such as software defoliation. However, the outputs of software defoliation are single channel data, which are not amenable to geological interpretations. Crippen and Blom (2001) proposed a new forced invariance algorithm that utilizes band statistics, rather than band ratios. The authors demonstrated the effectiveness of their algorithm on a LANDSAT TM scene from Nevada, USA, especially in open canopy areas in mixed and semi-arid terrains. In this presentation, we report the results of our experimentation with this algorithm on a densely to sparsely vegetated Landsat ETM+ scene. We selected a scene (Path 119, Row 39) acquired on 18th July, 2004. Two study areas located around the city of Hangzhou, eastern China were tested. One of them covers uninhabited hilly terrain characterized by low rugged topography, with parts of the hills densely vegetated; the other covers both inhabited urban areas and uninhabited hilly terrain, which is densely vegetated. Crippen and Blom's algorithm is implemented in the following sequential steps: (1) dark pixel correction; (2) vegetation index calculation; (3) estimation of the statistical relationship between the vegetation index and the digital number (DN) values for each band; (4) calculation of a smooth best-fit curve for the above relationships; and finally, (5) selection of a target average DN value and scaling of all pixels at each vegetation index level by an amount that shifts the curve to the target DN. The main drawback of their algorithm is severe distortion of the DN values of non-vegetated areas; a suggested solution is masking outliers such as cloud, water, etc. We therefore extend this algorithm by masking non-vegetated areas. Our algorithm comprises the following three steps: (1) masking of barren or sparsely vegetated areas using a threshold based on a vegetation index calculated after atmospheric correction (dark pixel correction and ATCOR were compared) in order to conserve their original spectral information through the subsequent processing; (2) applying Crippen and Blom's forced invariance algorithm to suppress the spectral response of vegetation only in vegetated areas; and (3) combining the processed vegetated areas with the masked barren or sparsely vegetated areas, followed by histogram equalization to eliminate the differences in color scales between these two types of areas and enhance the integrated image. The output images of both study areas showed significant improvement over the original images in terms of suppression of vegetation reflectance and enhancement of the underlying geological information. The processed images show clear banding, probably associated with lithological variations in the underlying rock formations. The colors of non-vegetated pixels are distorted in the unmasked results, whereas the same pixels in the masked results show regions of higher contrast. We conclude that the algorithm offers an effective way to enhance geological information in LANDSAT TM/ETM+ images of terrains with significant vegetation cover. It is also applicable to other multispectral satellite data that have bands in similar wavelength regions. In addition, an application of this method to hyperspectral data may be possible as long as it can provide the vegetation band ratios.
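The five sequential steps of the forced-invariance procedure translate almost directly into per-band code. A simplified sketch (binned means stand in for the paper's statistical relationship, a moving average stands in for its smooth best-fit curve, and the dark-pixel correction is assumed to have been applied beforehand):

```python
import numpy as np

def forced_invariance_band(dn, vi, target_dn=None, n_bins=100):
    """Suppress vegetation in one band via forced invariance.

    dn : (n_pixels,) digital numbers of one band (dark-pixel corrected)
    vi : (n_pixels,) vegetation index (e.g. NDVI) of the same pixels
    """
    # (3) mean DN as a function of the vegetation index level
    bins = np.linspace(vi.min(), vi.max(), n_bins + 1)
    idx = np.clip(np.digitize(vi, bins) - 1, 0, n_bins - 1)
    curve = np.array([dn[idx == b].mean() if np.any(idx == b) else np.nan
                      for b in range(n_bins)])
    # fill empty bins by interpolation
    valid = ~np.isnan(curve)
    curve[~valid] = np.interp(np.flatnonzero(~valid),
                              np.flatnonzero(valid), curve[valid])
    # (4) smooth best-fit curve (simple moving average as a stand-in)
    smooth = np.convolve(curve, np.ones(5) / 5.0, mode="same")
    # (5) scale each pixel so the curve at its VI level shifts to the target DN
    if target_dn is None:
        target_dn = dn.mean()
    return dn * (target_dn / smooth[idx])
```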
Pisharady, Pramod Kumar; Sotiropoulos, Stamatios N; Sapiro, Guillermo; Lenglet, Christophe
2017-09-01
We propose a sparse Bayesian learning algorithm for improved estimation of white matter fiber parameters from compressed (under-sampled q-space) multi-shell diffusion MRI data. The multi-shell data are represented in a dictionary form using a non-monoexponential decay model of diffusion, based on a continuous gamma distribution of diffusivities. The fiber volume fractions with predefined orientations, which are the unknown parameters, form the dictionary weights. These unknown parameters are estimated with a linear un-mixing framework, using a sparse Bayesian learning algorithm. A localized learning of hyperparameters at each voxel and for each possible fiber orientation improves the parameter estimation. Our experiments using synthetic data from the ISBI 2012 HARDI reconstruction challenge and in-vivo data from the Human Connectome Project demonstrate the improvements.
NASA Astrophysics Data System (ADS)
Zimmermann, Robert; Brandmeier, Melanie; Andreani, Louis; Gloaguen, Richard
2015-04-01
Remote sensing data can provide valuable information about ore deposits and their alteration zones at the surface level. High spectral and spatial resolution of the data is essential for detailed mapping of mineral abundances and related structures. Carbonatites are well known for hosting economic enrichments in REE, Ta, Nb and P (Jones et al. 2013). This makes them a preferential exploration target for those critical elements. In this study we show how combining geomorphic, textural and spectral data improves classification results. We selected a site with a well-known occurrence in northern Namibia: the Epembe dyke. For analysis, LANDSAT 8, SRTM and airborne hyperspectral (HyMap) data were chosen. The overlapping data allow a multi-scale and multi-resolution approach. Results from the data analysis were validated during fieldwork in 2014. Data were corrected for atmospheric and geometric effects. Image classification, mineral mapping and tectonic geomorphology allow a refinement of the geological map by lithological mapping in a second step. Detailed mineral abundance maps were computed using spectral unmixing techniques. These techniques are well suited to map abundances of carbonate minerals, but not to discriminate the carbonatite itself from surrounding rocks with similar spectral signatures. Thus, geometric indices were calculated using tectonic geomorphology and textures. For this purpose the TecDEM toolbox (Shahzad & Gloaguen 2011) was applied to the SRTM data for geomorphic analysis. Textural indices (e.g. uniformity, entropy, angular second moment) were derived from HyMap and SRTM by a grey-level co-occurrence matrix (Clausi 2002). The carbonatite in the study area is ridge-forming and shows a narrow linear feature in the textural bands. Spectral and geometric information were combined using Kohonen Self-Organizing Maps (SOM) for unsupervised clustering. The resulting class spectra were visually compared and interpreted. Classes with similar signatures were merged according to geological context. The major conclusions are: 1. Carbonate minerals can be mapped using spectral unmixing techniques. 2. Carbonatites are associated with specific geometric patterns. 3. The combination of spectral and geometric information improves classification results and reduces misclassification. References Clausi, D. A. (2002): An analysis of co-occurrence texture statistics as a function of grey-level quantization. - Canadian Journal of Remote Sensing, 28 (1), 45-62 Jones, A. P., Genge, M. and Carmody, L. (2013): Carbonate Melts and Carbonatites. - Reviews in Mineralogy & Geochemistry, 75, 289-322 Shahzad, F. & Gloaguen, R. (2011): TecDEM: A MATLAB based toolbox for tectonic geomorphology, Part 2: Surface dynamics and basin analysis. - Computers and Geosciences, 37 (2), 261-271
Widjaja, Effendi; Garland, Marc
2008-02-01
Raman microscopy was used in mapping mode to collect more than 1000 spectra in a 100 microm x 100 microm area of a commercial stamp. Band-target entropy minimization (BTEM) was then employed to unmix the mixture spectra in order to extract the pure component spectra of the samples. Three pure component spectral patterns with good signal-to-noise ratios were recovered, and their spatial distributions were determined. The three pure component spectral patterns were then identified as copper phthalocyanine blue, calcite-like material, and a yellow organic dye material by comparison to known spectral libraries. The present investigation, consisting of (1) advanced curve resolution (blind-source separation) followed by (2) spectral database matching, readily suggests extensions to authenticity and counterfeit studies of other types of commercial objects. The presence or absence of specific observable components forms the basis for assessment. The present spectral analysis (BTEM) is applicable to highly overlapping spectral information. Since a priori information such as the number of components present and spectral libraries is not needed in BTEM, and since minor signals arising from trace components can be reconstructed, this analysis offers a robust approach to a wide variety of material problems involving authenticity and counterfeit issues.
The underlying philosophy of Unmix is to let the data speak for itself. Unmix seeks to solve the general mixture problem where the data are assumed to be a linear combination of an unknown number of sources of unknown composition, which contribute an unknown amount to each sample...
NASA Technical Reports Server (NTRS)
Cetin, Haluk
1999-01-01
The purpose of this project was to establish a new hyperspectral remote sensing laboratory at the Mid-America Remote Sensing Center (MARC), dedicated to in situ and laboratory measurements of environmental samples and to the manipulation, analysis, and storage of remotely sensed data for environmental monitoring and research in ecological modeling using hyperspectral remote sensing at MARC, one of three research facilities of the Center of Reservoir Research at Murray State University (MSU), a Kentucky Commonwealth Center of Excellence. The equipment purchased, a FieldSpec FR portable spectroradiometer and peripherals, and ENVI hyperspectral data processing software, allowed MARC to provide hands-on experience, education, and training for the students of the Department of Geosciences in quantitative remote sensing using hyperspectral data, Geographic Information Systems (GIS), digital image processing (DIP), and computer, geological and geophysical mapping; to provide field support to the researchers and students collecting in situ and laboratory measurements of environmental data; to create a spectral library of the cover types; and to establish a World Wide Web server to provide the spectral library to other academic, state, and Federal institutions. Much of the research will soon be published in scientific journals. A World Wide Web page has been created at the web site of MARC. Results of this project are grouped in two categories: education and research accomplishments. The Principal Investigator (PI) modified remote sensing and DIP courses to introduce students to in situ field spectra and laboratory remote sensing studies for environmental monitoring in the region by using the new equipment in the courses. The PI collected in situ measurements using the spectroradiometer for the ER-2 mission to Puerto Rico project for the Moderate Resolution Imaging Spectroradiometer (MODIS) Airborne Simulator (MAS). Currently MARC is mapping water quality in Kentucky Lake and vegetation in the Land Between the Lakes (LBL) using Landsat-TM data. A Landsat-TM scene of the same day was obtained to relate ground measurements to the satellite data. A spectral library has been created for overstory species in LBL. Some of the methods, such as NPDF and IDFD techniques for spectral unmixing and reduction of the effects of shadows in classifications, comparison of hyperspectral classification techniques, and spectral nonlinear and linear unmixing techniques, are being tested using the laboratory.
NASA Astrophysics Data System (ADS)
Favicchio, Rosy; Zacharakis, Giannis; Oikonomaki, Katerina; Zacharopoulos, Athanasios; Mamalaki, Clio; Ripoll, Jorge
2012-07-01
Detection of multiple fluorophores in conditions of low signal represents a limiting factor for the application of in vivo optical imaging techniques in immunology where fluorescent labels report for different functional characteristics. A noninvasive in vivo Multi-Spectral Normalized Epifluorescence Laser scanning (M-SNELS) method was developed for the simultaneous and quantitative detection of multiple fluorophores in low signal to noise ratios and used to follow T-cell activation and clonal expansion. Colocalized DsRed- and GFP-labeled T cells were followed in tandem during the mounting of an immune response. Spectral unmixing was used to distinguish the overlapping fluorescent emissions representative of the two distinct cell populations and longitudinal data reported the discrete pattern of antigen-driven proliferation. Retrieved values were validated both in vitro and in vivo with flow cytometry and significant correlation between all methodologies was achieved. Noninvasive M-SNELS successfully quantified two colocalized fluorescent populations and provides a valid alternative imaging approach to traditional invasive methods for detecting T cell dynamics.
Hyperspectral fluorescence imaging with multi wavelength LED excitation
NASA Astrophysics Data System (ADS)
Luthman, A. Siri; Dumitru, Sebastian; Quirós-Gonzalez, Isabel; Bohndiek, Sarah E.
2016-04-01
Hyperspectral imaging (HSI) can combine morphological and molecular information, yielding potential for real-time and high throughput multiplexed fluorescent contrast agent imaging. Multiplexed readout from targets, such as cell surface receptors overexpressed in cancer cells, could improve both sensitivity and specificity of tumor identification. There remains, however, a need for compact and cost effective implementations of the technology. We have implemented a low-cost wide-field multiplexed fluorescence imaging system, which combines LED excitation at 590, 655 and 740 nm with a compact commercial solid state HSI system operating in the range 600 - 1000 nm. A key challenge for using reflectance-based HSI is the separation of contrast agent fluorescence from the reflectance of the excitation light. Here, we illustrate how it is possible to address this challenge in software, using two offline reflectance removal methods, prior to least-squares spectral unmixing. We made a quantitative comparison of the methods using data acquired from dilutions of contrast agents prepared in well-plates. We then established the capability of our HSI system for non-invasive in vivo fluorescence imaging in small animals using the optimal reflectance removal method. The HSI presented here enables quantitative unmixing of at least four fluorescent contrast agents (Alexa Fluor 610, 647, 700 and 750) simultaneously in living mice. A successful unmixing of the four fluorescent contrast agents was possible both using the pure contrast agents and with mixtures. The system could in principle also be applied to imaging of ex vivo tissue or intraoperative imaging in a clinical setting. These data suggest a promising approach for developing clinical applications of HSI based on multiplexed fluorescence contrast agent imaging.
Hyperspectral processing in graphical processing units
NASA Astrophysics Data System (ADS)
Winter, Michael E.; Winter, Edwin M.
2011-06-01
With the advent of the commercial 3D video card in the mid 1990s, we have seen an order of magnitude performance increase with each generation of new video cards. While these cards were designed primarily for visualization and video games, it became apparent after a short while that they could be used for scientific purposes. These Graphical Processing Units (GPUs) are rapidly being incorporated into data processing tasks usually reserved for general purpose computers. It has been found that many image processing problems scale well to modern GPU systems. We have implemented four popular hyperspectral processing algorithms (N-FINDR, linear unmixing, Principal Components, and the RX anomaly detection algorithm). These algorithms show an across the board speedup of at least a factor of 10, with some special cases showing extreme speedups of a hundred times or more.
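Of the four algorithms listed, the RX anomaly detector is the most compact to state: it is the Mahalanobis distance of each pixel spectrum from the scene background statistics. A minimal CPU reference sketch (the GPU kernels described in the paper are not reproduced; variable names are illustrative):

```python
import numpy as np

def rx_anomaly(cube, eps=1e-6):
    """Global RX anomaly score for every pixel of a hyperspectral cube.

    cube : (rows, cols, n_bands)
    Returns a (rows, cols) image of squared Mahalanobis distances from
    the scene mean; large values flag spectral anomalies.
    """
    rows, cols, n_bands = cube.shape
    X = cube.reshape(-1, n_bands).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + eps * np.eye(n_bands)   # regularised
    Xc = X - mu
    scores = np.einsum("ij,jk,ik->i", Xc, np.linalg.inv(cov), Xc)
    return scores.reshape(rows, cols)
```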
Spectrally Resolved Fiber Photometry for Multi-component Analysis of Brain Circuits.
Meng, Chengbo; Zhou, Jingheng; Papaneri, Amy; Peddada, Teja; Xu, Karen; Cui, Guohong
2018-04-25
To achieve simultaneous measurement of multiple cellular events in molecularly defined groups of neurons in vivo, we designed a spectrometer-based fiber photometry system that allows for spectral unmixing of multiple fluorescence signals recorded from deep brain structures in behaving animals. Using green and red Ca2+ indicators differentially expressed in striatal direct- and indirect-pathway neurons, we were able to simultaneously monitor the neural activity in these two pathways in freely moving animals. We found that the activities were highly synchronized between the direct and indirect pathways within one hemisphere and were desynchronized between the two hemispheres. We further analyzed the relationship between the movement patterns and the magnitude of activation in direct- and indirect-pathway neurons and found that the striatal direct and indirect pathways coordinately control the dynamics and fate of movement. Published by Elsevier Inc.
Shallow sea-floor reflectance and water depth derived by unmixing multispectral imagery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bierwirth, P.N.; Lee, T.J.; Burne, R.V.
1993-03-01
A major problem for mapping shallow water zones by the analysis of remotely sensed data is that contrast effects due to water depth obscure and distort the spectral nature of the substrate. This paper outlines a new method which unmixes the exponential influence of depth in each pixel by employing a mathematical constraint. This leaves a multispectral residual which represents relative substrate reflectance. Inputs to the process are the raw multispectral data and water attenuation coefficients derived by the co-analysis of known bathymetry and remotely sensed data. Outputs are substrate-reflectance images corresponding to the input bands and a greyscale depth image. The method has been applied in the analysis of Landsat TM data at Hamelin Pool in Shark Bay, Western Australia. Algorithm-derived substrate-reflectance images for Landsat TM bands 1, 2, and 3, combined in color, represent the optimum enhancement for mapping or classifying substrate types. As a result, this color image successfully delineated features that were obscured in the raw data, such as the distributions of sea-grasses, microbial mats, and sandy areas.
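A hedged sketch in the spirit of the approach described above: each band is modelled as L_i = R_i * exp(-2 * k_i * z), and a mathematical constraint on the log-reflectance residuals (here, that they sum to a fixed value) closes the otherwise under-determined per-pixel system. The exact constraint and radiometric terms used in the paper may differ; k would come from co-analysis of known bathymetry, and deep-water and atmospheric corrections are omitted.

# Separate a depth estimate from relative substrate reflectance for one pixel.
import numpy as np

def unmix_depth(pixel_radiance, k, residual_sum=0.0):
    """pixel_radiance: (n_bands,) water-leaving radiance, k: (n_bands,) attenuation coefficients."""
    log_L = np.log(pixel_radiance)
    # ln L_i = ln R_i - 2 k_i z, closed with sum_i ln R_i = residual_sum
    z = (residual_sum - log_L.sum()) / (2.0 * k.sum())
    log_R = log_L + 2.0 * k * z
    return z, np.exp(log_R)        # depth estimate and relative substrate reflectance per band

depth, substrate = unmix_depth(np.array([0.12, 0.09, 0.05]), np.array([0.06, 0.10, 0.35]))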
Piqueras, Sara; Bedia, Carmen; Beleites, Claudia; Krafft, Christoph; Popp, Jürgen; Maeder, Marcel; Tauler, Romà; de Juan, Anna
2018-06-05
Data fusion of different imaging techniques allows a comprehensive description of chemical and biological systems. Yet, joining images acquired with different spectroscopic platforms is complex because of the different sample orientation and image spatial resolution. Whereas matching sample orientation is often solved by performing suitable affine transformations of rotation, translation, and scaling among images, the main difficulty in image fusion is preserving the spatial detail of the highest spatial resolution image during multitechnique image analysis. In this work, a special variant of the unmixing algorithm Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) for incomplete multisets is proposed to provide a solution for this kind of problem. This algorithm allows analyzing simultaneously images collected with different spectroscopic platforms without losing spatial resolution and ensuring spatial coherence among the images treated. The incomplete multiset structure concatenates images of the two platforms at the lowest spatial resolution with the image acquired with the highest spatial resolution. As a result, the constituents of the sample analyzed are defined by a single set of distribution maps, common to all platforms used and with the highest spatial resolution, and their related extended spectral signatures, covering the signals provided by each of the fused techniques. We demonstrate the potential of the new variant of MCR-ALS for multitechnique analysis on three case studies: (i) a model example of MIR and Raman images of pharmaceutical mixture, (ii) FT-IR and Raman images of palatine tonsil tissue, and (iii) mass spectrometry and Raman images of bean tissue.
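A minimal sketch of the core MCR-ALS iteration (D ≈ C Sᵀ with non-negativity constraints on both factors), to give a sense of the alternating least-squares engine behind the incomplete-multiset variant described above. The handling of incomplete multisets, image resampling, and the other constraints used in the paper are omitted; array names are illustrative.

# Basic MCR-ALS: alternate non-negative updates of distribution maps C and spectra S.
import numpy as np
from scipy.optimize import nnls

def mcr_als(D, S_init, n_iter=50):
    """D: (n_pixels, n_channels) unfolded image, S_init: (n_channels, n_components)."""
    S = S_init.copy()
    for _ in range(n_iter):
        # Update distribution maps row by row under C >= 0.
        C = np.array([nnls(S, d)[0] for d in D])
        # Update spectral profiles channel by channel under S >= 0.
        S = np.array([nnls(C, d)[0] for d in D.T])
        # Normalise each spectrum to unit maximum to remove the scale ambiguity.
        S /= np.maximum(S.max(axis=0, keepdims=True), 1e-12)
    return C, S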
Mapping Environmental Contaminants at Ray Mine, AZ
NASA Technical Reports Server (NTRS)
McCubbin, Ian; Lang, Harold
2000-01-01
Airborne Visible and InfraRed Imaging Spectrometer (AVIRIS) data was collected over Ray Mine as part of a demonstration project for the Environmental Protection Agency (EPA) through the Advanced Measurement Initiative (AMI). The overall goal of AMI is to accelerate adoption and application of advanced measurement technologies for cost-effective environmental monitoring. The site was selected to demonstrate the benefit to EPA in using advanced remote sensing technologies for the detection of environmental contaminants due to the mineral extraction industry. The role of the Jet Propulsion Laboratory in this pilot study is to provide data as well as to perform calibration, data analysis, and validation of the AVIRIS results. EPA is also interested in developing protocols that use commercial software to perform such work on other high-priority EPA sites. Reflectance retrieval was performed using outputs generated by the MODTRAN radiative transfer model and field spectra collected for the purpose of calibration. We present advanced applications of the ENVI software package using n-Dimensional Partial Unmixing to identify image-derived endmembers that best match target material reference spectra from multiple spectral libraries. Upon identification of the image endmembers, the Mixture Tuned Matched Filter algorithm was applied to map the endmembers within each scene. Using this technique, it was possible to map four different mineral classes that are associated with mine-generated acid waste.
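A hedged sketch of the matched-filter score that underlies Mixture Tuned Matched Filtering; the "mixture tuning" infeasibility measure that flags false positives is omitted, so this is only the core filter, not the ENVI implementation. The target would be an image-derived endmember matched to a library spectrum.

# Matched filter: unit response at the target spectrum, minimal response to background.
import numpy as np

def matched_filter(X, target):
    """X: (n_pixels, n_bands) image spectra, target: (n_bands,) endmember spectrum.
    Returns per-pixel abundance-like scores (~1 at pure target pixels, ~0 in background)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    cov_inv = np.linalg.pinv(Xc.T @ Xc / (X.shape[0] - 1))
    d = target - mu
    w = cov_inv @ d / (d @ cov_inv @ d)      # filter normalised to unit response at the target
    return Xc @ w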
Automatic sub-pixel coastline extraction based on spectral mixture analysis using EO-1 Hyperion data
NASA Astrophysics Data System (ADS)
Hong, Zhonghua; Li, Xuesu; Han, Yanling; Zhang, Yun; Wang, Jing; Zhou, Ruyan; Hu, Kening
2018-06-01
Many megacities (such as Shanghai) are located in coastal areas; therefore, coastline monitoring is critical for urban security and urban development sustainability. A shoreline is defined as the intersection between coastal land and a water surface and features seawater edge movements as tides rise and fall. Remote sensing techniques have increasingly been used for coastline extraction; however, traditional hard classification methods are performed only at the pixel level, and extracting subpixel accuracy using soft classification methods is both challenging and time-consuming due to the complex features in coastal regions. This paper presents an automatic sub-pixel coastline extraction method (ASPCE) for hyperspectral satellite imagery that performs coastline extraction based on spectral mixture analysis and thus achieves higher accuracy. The ASPCE method consists of three main components: 1) A Water-Vegetation-Impervious-Soil (W-V-I-S) model is first presented to detect mixed W-V-I-S pixels and determine the endmember spectra in coastal regions; 2) The linear spectral unmixing technique based on Fully Constrained Least Squares (FCLS) is applied to the mixed W-V-I-S pixels to estimate seawater abundance; and 3) The spatial attraction model is used to extract the coastline. We tested this new method using EO-1 images from three coastal regions in China: the South China Sea, the East China Sea, and the Bohai Sea. The results showed that the method is accurate and robust. Root mean square error (RMSE) was utilized to evaluate the accuracy by calculating the distance differences between the extracted coastline and the digitized coastline. The classifier's performance was compared with that of Multiple Endmember Spectral Mixture Analysis (MESMA), Mixture Tuned Matched Filtering (MTMF), Sequential Maximum Angle Convex Cone (SMACC), Constrained Energy Minimization (CEM), and the classical Normalized Difference Water Index (NDWI). The results from the three test sites indicated that the proposed ASPCE method extracted coastlines more efficiently than did the compared methods, and its coastline extraction accuracy corresponded closely to the digitized coastline, with 0.39 pixels, 0.40 pixels, and 0.35 pixels in the three test regions, showing that the ASPCE method achieves an accuracy below 12.0 m (0.40 pixels). Moreover, in the quantitative accuracy assessment for the three test sites, the ASPCE method shows the best performance in coastline extraction, achieving 0.35 pixels at the Bohai Sea test site. Therefore, the proposed ASPCE method can extract coastlines more accurately than can hard classification methods or other spectral unmixing methods.
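A minimal sketch of Fully Constrained Least Squares (FCLS) unmixing as used in step 2 above: abundances are forced to be non-negative and to sum to one. It uses the standard trick of appending a weighted sum-to-one row to the endmember matrix before solving with NNLS; the weight delta controls how strictly the constraint is enforced and is an assumed, illustrative value.

# FCLS: non-negative, sum-to-one abundance estimation for one pixel.
import numpy as np
from scipy.optimize import nnls

def fcls(pixel, endmembers, delta=1e3):
    """pixel: (n_bands,), endmembers: (n_bands, n_endmembers) -> (n_endmembers,) abundances."""
    E_aug = np.vstack([endmembers, delta * np.ones(endmembers.shape[1])])
    y_aug = np.append(pixel, delta)
    abundances, _ = nnls(E_aug, y_aug)
    return abundances

# Usage: seawater abundance of a mixed W-V-I-S pixel would be abundances[water_index].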
MAX UnMix: Introducing a new web application for unmixing magnetic coercivity distributions
NASA Astrophysics Data System (ADS)
Feinberg, J. M.; Maxbauer, D.; Fox, D. L.
2016-12-01
Magnetic minerals are present in a wide variety of natural systems and are often indicative of the natural or anthropogenic processes that led to their deposition, formation, or transformation. Unmixing the contribution of magnetic components to bulk field-dependent magnetization curves has become increasingly common in environmental and rock magnetic studies and has enhanced our ability to fingerprint the magnetic signatures of magnetic minerals with distinct compositions, grain sizes, and origins. A variety of programs have been developed over the past two decades to allow researchers to deconvolve field-dependent magnetization curves for these purposes; however, many of these programs are either outdated or have obstacles that inhibit their usability. MAX UnMix is a new web application (available online at http://www.irm.umn.edu/maxunmix) built using the 'shiny' package for RStudio that can be used to process coercivity distributions derived from magnetization curves (acquisition, demagnetization, or backfield data) via an online user interface. Here, we use example datasets from lake sediments and paleosols to present details of the MAX UnMix model and the program's functionality. MAX UnMix is designed to be accessible and user friendly, and should serve as a useful resource for future research.
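A simplified sketch of unmixing a coercivity distribution into components. MAX UnMix itself fits skew-normal curves through an interactive interface; as a stand-in here, a sum of two Gaussian components in log10(field) space is fitted with non-linear least squares. The synthetic data, field units, and starting guesses are all illustrative.

# Fit a two-component model to the gradient of a magnetization acquisition curve.
import numpy as np
from scipy.optimize import curve_fit

def two_component_model(logB, a1, m1, s1, a2, m2, s2):
    g = lambda a, m, s: a * np.exp(-0.5 * ((logB - m) / s) ** 2)
    return g(a1, m1, s1) + g(a2, m2, s2)

# logB: log10 of applied field (mT); dMdB: coercivity distribution (dM/dlogB).
logB = np.linspace(0.5, 3.0, 120)
dMdB = (0.8 * np.exp(-0.5 * ((logB - 1.5) / 0.25) ** 2)
        + 0.3 * np.exp(-0.5 * ((logB - 2.3) / 0.20) ** 2)
        + 0.01 * np.random.default_rng(1).standard_normal(logB.size))
p0 = [0.7, 1.4, 0.3, 0.2, 2.2, 0.3]                 # rough starting guesses
params, _ = curve_fit(two_component_model, logB, dMdB, p0=p0)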
NASA Technical Reports Server (NTRS)
Ramsey, Michael S.; Howard, Douglas A.; Christensen, Philip R.; Lancaster, Nicholas
1993-01-01
Mineral identification and mapping of alluvial material using thermal infrared (TIR) remote sensing is extremely useful for tracking sediment transport, assessing the degree of weathering and locating sediment sources. As a result of the linear relation between a mineral's percentage in a given area (image pixel) and the depth of its diagnostic spectral features, TIR spectra can be deconvolved in order to ascertain mineralogic percentages. Typical complications such as vegetation, particle size and thermal shadowing are minimized upon examination of dunes. Actively saltating dunes contain little to no vegetation, are very well sorted and lack the thermal shadows that arise from rocky terrain. The primary focus of this work was to use the Kelso Dunes as a test location for an accuracy analysis of temperature/emissivity separation and linear unmixing algorithms. Accurate determination of ground temperature and component discrimination will become key products of future ASTER data. A decorrelation stretch of the TIMS image showed clear color variations within the active dunes. Samples collected from these color units were analyzed for mineralogy and grain size, and separated into endmembers. This analysis not only revealed that the dunes contained significant mineralogic variation, but also that they were more immature (low quartz percentage) than previously reported. Unmixing of the TIMS data using the primary mineral endmembers produced unique variations within the dunes and may indicate near, rather than far, source locales for the dunes. The Kelso Dunes lie in the eastern Mojave Desert, California, approximately 95 km west of the Colorado River. The primary dune field is contained within a topographic basin bounded by the Providence and Granite Mountains, with the active region marked by three northeast-trending linear ridges. Although active, the dunes appear to lie at an opposing regional wind boundary which produces little net movement of the crests. Previous studies have estimated that the dunes range from 70% to 90% quartz, mainly derived from a source 40 km to the west. The dune field is assumed to have formed in a much more arid climate than present, with the age of the deposit estimated at greater than 100,000 years.
Quantitative Analysis of Immunohistochemistry in Melanoma Tumors
Lilyquist, Jenna; White, Kirsten Anne Meyer; Lee, Rebecca J.; Philips, Genevieve K.; Hughes, Christopher R.; Torres, Salina M.
2017-01-01
Identification of positive staining is often qualitative and subjective. This is particularly troublesome in pigmented melanoma lesions, because melanin is difficult to distinguish from the brown stain resulting from immunohistochemistry (IHC) using horseradish peroxidase developed with 3,3′-Diaminobenzidine (HRP-DAB). We sought to identify and quantify positive staining, particularly in melanoma lesions. We visualized G-protein coupled estrogen receptor (GPER) expression developed with HRP-DAB and counterstained with Azure B (stains melanin) in melanoma tissue sections (n = 3). Matched sections (n = 3), along with 22 unmatched sections, were stained only with Azure B as a control. Breast tissue (n = 1) was used as a positive HRP-DAB control. Images of the stained tissues were generated using a Nuance Spectral Imaging Camera. Analysis of the images was performed using the Nuance Spectral Imaging software and SlideBook. Data were analyzed using a Kruskal–Wallis one-way analysis of variance (ANOVA). We showed that a pigmented melanoma tissue doubly stained with anti-GPER HRP-DAB and Azure B can be unmixed using spectra derived from a matched, Azure B-only section, and an anti-GPER HRP-DAB control. We unmixed each of the melanoma lesions using each of the Azure B spectra, evaluated the mean intensity of positive staining, and examined the distribution of the mean intensities (P = .73; Kruskal–Wallis). These results suggest that this method does not require a matched Azure B-only stained control tissue for every melanoma lesion, allowing precious tissues to be conserved for other studies. Importantly, this quantification method reduces the subjectivity of protein expression analysis, and provides a valuable tool for accurate evaluation, particularly for pigmented tissues. PMID:28403073
Remote sensing investigations of fugitive soil arsenic and its effects on vegetation reflectance
NASA Astrophysics Data System (ADS)
Slonecker, E. Terrence
2007-12-01
Three different remote sensing technologies were evaluated in support of the remediation of fugitive arsenic and other hazardous waste-related risks to human and ecological health at the Spring Valley Formerly Used Defense Site in northwest Washington, D.C., an area of widespread soil arsenic contamination as a result of World War I research and development of chemical weapons. The first evaluation involved the value of information derived from the interpretation of historical aerial photographs. Historical aerial photographs dating back as far as 1918 provided a wealth of information about chemical weapons testing, storage, handling and disposal of these hazardous materials. When analyzed by a trained photo-analyst, the 1918 aerial photographs resulted in 42 features of potential interest. When compared with current remedial activities and known areas of contamination, 33 of 42 or 78.5% of the features were spatially correlated with current areas of contamination or remedial activity. The second investigation involved the phytoremediation of arsenic through the use of Pteris ferns and the evaluation of the spectral properties of these ferns. Three hundred ferns were grown in controlled laboratory conditions in soils amended with five levels (0, 20, 50, 100 and 200 parts per million) of sodium arsenate. After 20 weeks, the Pteris ferns were shown to have an average uptake concentration of over 4,000 parts per million each. Additionally, statistical analysis of the spectral signature from each fern showed that the frond arsenic concentration could be reasonably predicted with a linear model when the concentration was equal to or greater than 500 parts per million. Third, hyperspectral imagery of Spring Valley was obtained and analyzed with a suite of spectral analysis software tools. Results showed that grasses growing in areas of known high soil arsenic could be identified and mapped at an approximately 85% level of accuracy when the hyperspectral image was processed with a linear spectral unmixing algorithm and mapped with a maximum likelihood classifier. The information provided by these various remote sensing technologies presents a non-contact and potentially important alternative to the information needs of the hazardous waste remediation process, and is an important area for future environmental research.
Linear spectral unmixing to monitor crop growth in typical organic and inorganic amended arid soil
NASA Astrophysics Data System (ADS)
El Battay, A.; Mahmoudi, H.
2016-06-01
The soils of the GCC countries are dominantly sandy, which is typical of arid regions such as the Arabian Peninsula. Such soils are low in nutrients and have a poor water-holding capacity associated with a high infiltration rate. Soil amendments may rehabilitate these soils by restoring essential soil properties and hence enable site revegetation and revitalization for crop production, especially in a region where food security is a priority. In this study, two inorganic amendments, AustraHort and Zeoplant pellet, and one locally produced organic compost were tested as soil amendments at the experimental field of the International Center for Biosaline Agriculture in Dubai, UAE. The main objective was to assess the ability of remote sensing to monitor the growth of a crop, in this case Okra (Abelmoschus esculentus), against a background of amended soil. Three biomass spectral vegetation indices were used, namely NDVI, TDVI and SAVI. Pure spectral signatures of the soil and the three amendments were collected using a field spectroradiometer, in addition to the spectral signatures of Okra at two growing stages (vegetative and flowering) in the field, with a mixed F.O.V. of the plant and amended soil, during March and May 2015. The spectral signatures were all collected using the FieldSpec® HandHeld 2 (HH2) in the spectral range 325-1075 nm over 12 plots. A set of 4 plots was assigned to each of the three amendments as follows: three replicates of a 1.5 by 1.5 meter plot with 3 kg/m2 of each amendment and 54 plants, plus one plot as a control; all plots were given irrigation treatments at 100% based on ETc. Spectra collected over the plots were inverted in the range of 400-900 nm via a linear mixture model using pure soil and amendment spectral signatures as reference. Field pictures were used to determine the vegetation fraction (in terms of area of the F.O.V.). Hence, the Okra spectral signatures were isolated for all plots with the three types of amendments. The three vegetation indices were then calculated and compared for the vegetation fraction of the entire F.O.V. The key outcome of this analysis was that a considerable bias was induced when using a mixed F.O.V. In fact, the compost, as an organic soil amendment containing dead vegetation, affects the sensitivity of the three vegetation indices in the vegetative stage of Okra similarly when compared to the AustraHort- and Zeoplant pellet-amended plots. However, the TDVI is very sensitive to vegetation presence even with unmixed crop spectra. The AustraHort amendment led to a better status of Okra both in March and May, with values of 0.19 and 0.28, respectively. Bias induced by some soil amendments can be misleading when upscaling spectral information to satellite imagery with low spectral and spatial resolutions. The results obtained are encouraging for further use of spectral information for crop monitoring in soils containing amendments.
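A hedged sketch of isolating the crop spectrum from a mixed field-of-view under a simple two-component linear mixture assumption (vegetation plus amended-soil background), followed by NDVI and SAVI computed from the unmixed spectrum. The band positions, the vegetation fraction value, and the function names are illustrative; the study also used TDVI and photo-derived vegetation fractions.

# Invert a two-endmember linear mixture, then compute vegetation indices.
import numpy as np

def isolate_vegetation(mixed, background, veg_fraction):
    """mixed, background: (n_bands,) reflectance spectra; veg_fraction in (0, 1]."""
    return (mixed - (1.0 - veg_fraction) * background) / veg_fraction

def ndvi(red, nir):
    return (nir - red) / (nir + red)

def savi(red, nir, L=0.5):
    return (1 + L) * (nir - red) / (nir + red + L)

wavelengths = np.linspace(400, 900, 251)
red_idx = np.argmin(np.abs(wavelengths - 670))
nir_idx = np.argmin(np.abs(wavelengths - 800))
# veg = isolate_vegetation(mixed_spectrum, amended_soil_spectrum, 0.4)
# print(ndvi(veg[red_idx], veg[nir_idx]), savi(veg[red_idx], veg[nir_idx]))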
Overhead longwave infrared hyperspectral material identification using radiometric models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zelinski, M. E.
Material detection algorithms used in hyperspectral data processing are computationally efficient but can produce relatively high numbers of false positives. Material identification performed as a secondary processing step on detected pixels can help separate true and false positives. This paper presents a material identification processing chain for longwave infrared hyperspectral data of solid materials collected from airborne platforms. The algorithms utilize unwhitened radiance data and an iterative algorithm that determines the temperature, humidity, and ozone of the atmospheric profile. Pixel unmixing is done using constrained linear regression and Bayesian Information Criteria for model selection. The resulting product includes an optimal atmospheric profile and full radiance material model that includes material temperature, abundance values, and several fit statistics. A logistic regression method utilizing all model parameters to improve identification is also presented. This paper details the processing chain and provides justification for the algorithms used. Several examples are provided using modeled data at different noise levels.
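A minimal sketch of the model-selection idea described above: candidate material combinations are fitted to a pixel with non-negativity-constrained linear regression, and the Bayesian Information Criterion selects the most parsimonious model. The radiance, temperature, and atmospheric modelling of the actual processing chain is omitted, and the library structure is assumed for illustration.

# BIC-based selection among constrained linear unmixing models for one pixel.
import itertools
import numpy as np
from scipy.optimize import nnls

def bic_select(pixel, library, max_materials=2):
    """pixel: (n_bands,); library: dict of name -> (n_bands,) spectrum."""
    n = pixel.size
    best = (np.inf, None, None)
    names = list(library)
    for k in range(1, max_materials + 1):
        for combo in itertools.combinations(names, k):
            A = np.stack([library[m] for m in combo], axis=1)
            coeffs, rnorm = nnls(A, pixel)                         # non-negative fit
            bic = n * np.log(max(rnorm**2 / n, 1e-30)) + k * np.log(n)
            if bic < best[0]:
                best = (bic, combo, coeffs)
    return best   # (BIC value, selected materials, abundance-like coefficients)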
Resolving Mixed Algal Species in Hyperspectral Images
Mehrubeoglu, Mehrube; Teng, Ming Y.; Zimba, Paul V.
2014-01-01
We investigated a lab-based hyperspectral imaging system's response to pure (single) and mixed (two) algal cultures containing known algae types and volumetric combinations to characterize the system's performance. The spectral response to volumetric changes in single algal cultures and in combinations with known ratios was tested. Constrained linear spectral unmixing was applied to extract the algal content of the mixtures based on abundances that produced the lowest root mean square error. Percent prediction error was computed as the difference between actual percent volumetric content and abundances at minimum RMS error. The best prediction errors were 0.4%, 0.4% and 6.3% for the mixed spectra from three independent experiments. The worst prediction errors were 5.6%, 5.4% and 13.4% for the same order of experiments. Additionally, Beer-Lambert's law was utilized to relate transmittance to different volumes of pure algal suspensions, demonstrating linear logarithmic trends for optical property measurements. PMID:24451451
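A small sketch of the Beer-Lambert relationship mentioned above: absorbance (the negative log10 of transmittance) is expected to vary linearly with the amount of algal suspension, so a straight-line fit of absorbance against volume checks the linear logarithmic trend. The volumes and transmittance values are made-up numbers, not data from the paper.

# Linear fit of absorbance versus suspension volume (Beer-Lambert check).
import numpy as np

volumes = np.array([1.0, 2.0, 3.0, 4.0, 5.0])             # ml of algal suspension
transmittance = np.array([0.79, 0.63, 0.50, 0.40, 0.32])  # at one wavelength
absorbance = -np.log10(transmittance)
slope, intercept = np.polyfit(volumes, absorbance, 1)
r = np.corrcoef(volumes, absorbance)[0, 1]
print(f"slope={slope:.3f}, intercept={intercept:.3f}, r={r:.3f}")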
NASA Astrophysics Data System (ADS)
Jeong, Jeong-Won; Kim, Tae-Seong; Shin, Dae-Chul; Do, Synho; Marmarelis, Vasilis Z.
2004-04-01
Recently it was shown that soft tissue can be differentiated with spectral unmixing and detection methods that utilize multi-band information obtained from a High-Resolution Ultrasonic Transmission Tomography (HUTT) system. In this study, we focus on tissue differentiation using the spectral target detection method based on Constrained Energy Minimization (CEM). We have developed a new tissue differentiation method called "CEM filter bank". Statistical inference on the output of each CEM filter of a filter bank is used to make a decision based on the maximum statistical significance rather than the magnitude of each CEM filter output. We validate this method through 3-D inter/intra-phantom soft tissue classification where target profiles obtained from an arbitrary single slice are used for differentiation in multiple tomographic slices. Also, spectral coherence between target and object profiles of an identical tissue at different slices and phantoms is evaluated by conventional cross-correlation analysis. The performance of the proposed classifier is assessed using Receiver Operating Characteristic (ROC) analysis. Finally, we apply our method to classify tiny structures inside a beef kidney, such as Styrofoam balls (~1 mm), chicken tissue (~5 mm), and vessel-duct structures.
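A minimal sketch of the Constrained Energy Minimization (CEM) filter on which the "CEM filter bank" is built: the filter passes the target spectral profile with unit gain while minimising the output energy from everything else. The statistical-inference step applied to the filter-bank outputs in the paper is not reproduced here, and the array names are illustrative.

# CEM filter: unit gain on the target profile, minimum output energy elsewhere.
import numpy as np

def cem_filter(X, target):
    """X: (n_samples, n_bands) multi-band observations, target: (n_bands,) spectral profile."""
    R = X.T @ X / X.shape[0]                  # sample correlation matrix
    R_inv = np.linalg.pinv(R)
    w = R_inv @ target / (target @ R_inv @ target)
    return X @ w                              # per-sample detector output

# A filter bank is simply one CEM filter per candidate tissue profile:
# outputs = np.stack([cem_filter(X, t) for t in target_profiles], axis=1)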
NASA Astrophysics Data System (ADS)
Pan, Yifan; Zhang, Xianfeng; Tian, Jie; Jin, Xu; Luo, Lun; Yang, Ke
2017-01-01
Asphalt road reflectance spectra change as pavement ages. This provides the possibility for remote sensing to be used to monitor changes in asphalt pavement condition. However, the relatively narrow geometry of roads and the relatively coarse spatial resolution of remotely sensed imagery result in mixtures between pavement and adjacent land covers (e.g., vegetation, buildings, and soil), increasing uncertainties in spectral analysis. To overcome this problem, multiple endmember spectral mixture analysis (MESMA) was used in this study to map the asphalt pavement condition using Worldview-2 satellite imagery. Based on extensive field investigation and in situ measurements, aged asphalt pavements were categorized into four stages: preliminarily aged, moderately aged, heavily aged, and distressed. The spectral characteristics of the first three stages were further analyzed, and a MESMA unmixing analysis was conducted to map these three kinds of pavement conditions from the Worldview-2 image. The results showed that the road pavement conditions could be detected well and mapped with an overall accuracy of 81.71% and a Kappa coefficient of 0.77. Finally, a quantitative assessment of the pavement conditions for each road segment in this study area was conducted to inform road maintenance management.
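A hedged sketch of the MESMA idea used above: each pixel is tested against several candidate two-endmember models (one pavement-condition spectrum plus a shade or background endmember), and the model with the lowest RMSE and physically plausible fractions wins. The endmember library structure, fraction bounds, and RMSE threshold are placeholders, not the study's settings.

# Per-pixel MESMA-style model selection over a small endmember library.
import numpy as np

def mesma_pixel(pixel, libraries, shade, max_rmse=0.025):
    """pixel: (n_bands,); libraries: dict of class -> list of (n_bands,) spectra; shade: (n_bands,)."""
    best = (np.inf, None, None)
    for cls, spectra in libraries.items():
        for em in spectra:
            A = np.stack([em, shade], axis=1)
            f, *_ = np.linalg.lstsq(A, pixel, rcond=None)
            if np.all(f >= -0.05) and np.all(f <= 1.05):     # loose fraction constraint
                rmse = np.sqrt(np.mean((A @ f - pixel) ** 2))
                if rmse < best[0]:
                    best = (rmse, cls, f)
    return best if best[0] <= max_rmse else (best[0], "unclassified", None)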
Dictionary-Based Tensor Canonical Polyadic Decomposition
NASA Astrophysics Data System (ADS)
Cohen, Jeremy Emile; Gillis, Nicolas
2018-04-01
To ensure interpretability of extracted sources in tensor decomposition, we introduce in this paper a dictionary-based tensor canonical polyadic decomposition which enforces one factor to belong exactly to a known dictionary. A new formulation of sparse coding is proposed which enables dictionary-based canonical polyadic decomposition of high-dimensional tensors. The benefits of using a dictionary in tensor decomposition models are explored both in terms of parameter identifiability and estimation accuracy. The performance of the proposed algorithms is evaluated on the decomposition of simulated data and the unmixing of hyperspectral images.
Fluorescence hyperspectral imaging (fHSI) using a spectrally resolved detector array
Luthman, Anna Siri; Dumitru, Sebastian; Quiros‐Gonzalez, Isabel; Joseph, James
2017-01-01
The ability to resolve multiple fluorescent emissions from different biological targets in video-rate applications, such as endoscopy and intraoperative imaging, has traditionally been limited by the use of filter-based imaging systems. Hyperspectral imaging (HSI) facilitates the detection of both spatial and spectral information in a single data acquisition; however, instrumentation for HSI is typically complex, bulky and expensive. We sought to overcome these limitations using a novel, robust and low-cost HSI camera based on a spectrally resolved detector array (SRDA). We integrated this HSI camera into a wide-field reflectance-based imaging system operating in the near-infrared range to assess the suitability for in vivo imaging of exogenous fluorescent contrast agents. Using this fluorescence HSI (fHSI) system, we were able to accurately resolve the presence and concentration of at least 7 fluorescent dyes in solution. We also demonstrate high spectral unmixing precision, signal linearity with dye concentration and at depth in tissue-mimicking phantoms, and delineate 4 fluorescent dyes in vivo. Our approach, including statistical background removal, could be directly generalised to broader spectral ranges, for example, to resolve tissue reflectance or autofluorescence, and in future be tailored to video-rate applications requiring snapshot HSI data acquisition. PMID:28485130
A comparison of spectral mixture analysis and NDVI for ascertaining ecological variables
NASA Technical Reports Server (NTRS)
Wessman, Carol A.; Bateson, C. Ann; Curtiss, Brian; Benning, Tracy L.
1993-01-01
In this study, we compare the performance of spectral mixture analysis to the Normalized Difference Vegetation Index (NDVI) in detecting change in a grassland across topographically induced nutrient gradients and different management schemes. The Konza Prairie Research Natural Area, Kansas, is a relatively homogeneous tallgrass prairie in which change in vegetation productivity occurs with respect to topographic positions in each watershed. The area is the site of long-term studies of the influence of fire and grazing on tallgrass production and was the site of the First ISLSCP (International Satellite Land Surface Climatology Project) Field Experiment (FIFE) from 1987 to 1989. Vegetation indices such as NDVI are commonly used with imagery collected in few (less than 10) spectral bands. However, the use of only two bands (e.g. NDVI) does not adequately account for the complex mix of signals making up most surface reflectance. Influences from background spectral variation and spatial heterogeneity may confound the direct relationship with biological or biophysical variables. High-dimensional multispectral data allow for the application of techniques such as derivative analysis and spectral curve fitting, thereby increasing the probability of successfully modeling the reflectance from mixed surfaces. The higher number of bands permits unmixing of a greater number of surface components, separating the vegetation signal for further analyses relevant to biological variables.
Andrews, John T.; Eberl, D.D.
2012-01-01
Along the margins of areas such as Greenland and Baffin Bay, sediment composition reflects a complex mixture of sources associated with the transport of sediment in sea ice, icebergs, melt-water and turbidite plumes. Similar situations arise in many contexts associated with sediment transport and with the mixing of sediments from different source areas. The question is: can contributions from discrete sediment (bedrock) sources be distinguished in a mixed sediment by using mineralogy, and, if so, how accurately? To solve this problem, four end-member source sediments were mixed in various proportions to form eleven artificial mixtures. Two of the end-member sediments are felsic, and the other two have more mafic compositions. End member and mixed sediment mineralogies were measured for the <2 mm sediment fractions by quantitative X-ray diffraction (qXRD). The proportions of source sediments in the mixtures then were calculated using an Excel macro program named SedUnMix, and the results were evaluated to determine the robustness of the algorithm. The program permits the unmixing of up to six end members, each of which can be represented by up to 5 alternative compositions, so as to better simulate variability within each source region. The results indicate that we can track the relative percentages of the four end members in the mixtures. We recommend, prior to applying the technique to down-core or to other provenance problems, that a suite of known, artificial mixtures of sediments from probable source areas be prepared, scanned, analyzed for quantitative mineralogy, and then analyzed by SedUnMix to check the sensitivity of the method for each specific unmixing problem.
Parallel ICA and its hardware implementation in hyperspectral image analysis
NASA Astrophysics Data System (ADS)
Du, Hongtao; Qi, Hairong; Peterson, Gregory D.
2004-04-01
Advances in hyperspectral imaging have dramatically boosted remote sensing applications by providing abundant information using hundreds of contiguous spectral bands. However, the high volume of information also results in an excessive computational burden. Since most materials have specific characteristics only at certain bands, much of this information is redundant. This property of hyperspectral images has motivated many researchers to study various dimensionality reduction algorithms, including Projection Pursuit (PP), Principal Component Analysis (PCA), wavelet transform, and Independent Component Analysis (ICA), where ICA is one of the most popular techniques. It searches for a linear or nonlinear transformation which minimizes the statistical dependence between spectral bands. Through this process, ICA can eliminate superfluous information but retain practical information, given only the observations of hyperspectral images. One hurdle in applying ICA to hyperspectral image (HSI) analysis, however, is its long computation time, especially for high-volume hyperspectral data sets. Even the most efficient method, FastICA, is a very time-consuming process. In this paper, we present a parallel ICA (pICA) algorithm derived from FastICA. During the unmixing process, pICA divides the estimation of the weight matrix into sub-processes which can be conducted in parallel on multiple processors. The decorrelation process is decomposed into internal decorrelation and external decorrelation, which perform weight vector decorrelations within individual processors and between cooperative processors, respectively. To further improve the performance of pICA, we seek hardware solutions for the implementation of pICA. Until now, there have been very few hardware designs for ICA-related processes due to the complicated and iterative computation. This paper discusses capacity limitations of FPGA implementations of pICA in HSI analysis. An Application-Specific Integrated Circuit (ASIC) synthesis is designed for pICA-based dimensionality reduction in HSI analysis. The pICA design is implemented using standard-height cells and aimed at a TSMC 0.18 micron process. During the synthesis procedure, three ICA-related reconfigurable components are developed for reuse and retargeting purposes. Preliminary results show that the standard-height-cell-based ASIC synthesis provides an effective solution for pICA and ICA-related processes in HSI analysis.
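A compact sketch of the FastICA fixed-point iteration that pICA parallelises: after whitening, the weight matrix is updated with a non-linearity and then symmetrically decorrelated. In pICA these weight updates and the decorrelation are split across processors (internal vs. external decorrelation); that partitioning is not shown here, and the function is a generic FastICA, not the authors' implementation.

# FastICA with whitening, tanh non-linearity, and symmetric decorrelation.
import numpy as np

def fastica(X, n_components, n_iter=200, seed=0):
    """X: (n_bands, n_pixels) data -> (unmixing matrix in band space, independent components)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    d = np.clip(d, 1e-12, None)
    K = (E / np.sqrt(d)).T[-n_components:]        # whitening matrix, strongest components kept
    Z = K @ X                                      # whitened data
    W = np.random.default_rng(seed).standard_normal((n_components, n_components))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        W_new = G @ Z.T / Z.shape[1] - np.diag((1 - G**2).mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W_new)
        W = u @ vt                                 # symmetric decorrelation of all weight vectors
    return W @ K, W @ Z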
Desertification Assessment and Monitoring Based on Remote Sensing
NASA Astrophysics Data System (ADS)
Gao, Z.; del Barrio, G.; Li, X.
2016-08-01
The objective of Dragon 3 Project 10367 is the development of techniques for desertification assessment and monitoring in China using remote sensing data in combination with climate and environment-related data. The main achievements acquired during the last two years can be summarized as follows: (1) Photosynthetic vegetation (PV) and non-photosynthetic vegetation (NPV) were estimated in the Otindag sandy land by comparing the pixel-invariant (Spectral Mixture Analysis, SMA) and pixel-variable (Multi-Endmember Spectral Mixture Analysis, MESMA; Automated Monte Carlo Unmixing Analysis, AutoMCU) methods, based on GF-1 data and a field-measured spectral library. (2) Based on GF-1 data, SMA was applied to estimate vegetation cover and detect transitional sandy land in Zhenglan Banner, Inner Mongolia, China. (3) By defining a new indicator, Moisture-responded NPP (MNPP), a new method for identifying degraded lands was put forward, and land degradation in Xilin Gol League, Inner Mongolia Autonomous Region, China, was preliminarily assessed. (4) The 2dRUE proved to be a good indicator of land degradation, based on which the land degradation status within the general potential extent of desertification in China (PEDC) was assessed.
Electrotransport-induced unmixing and decomposition of ternary oxides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chun, Jakyu; Yoo, Han-Ill, E-mail: hiyoo@snu.ac.kr; Martin, Manfred
A general expectation is that in a uniform oxygen activity atmosphere, cation electrotransport induces a ternary or higher oxide, e.g., AB(1+ξ)O(3+δ), to kinetically unmix unless the electrochemical mobilities of, say, the A²⁺ and B⁴⁺ cations are identically equal, and eventually to decompose into the component oxides AO and BO₂ once the extent of unmixing exceeds the stability range of its nonmolecularity ξ. It has, however, earlier been reported [Yoo et al., Appl. Phys. Lett. 92, 252103 (2008)] that even a massive cation electrotransport induces BaTiO₃ to neither unmix nor decompose, even at a voltage far exceeding the so-called decomposition voltage U_d, a measure of the standard formation free energy of the oxide (|ΔG_f°| = nFU_d). Here, we report that, as expected, NiTiO₃ unmixes at any voltage and even decomposes if the applied voltage exceeds what appears to be a threshold value larger than U_d. We demonstrate experimentally that the electrochemical mobilities of Ni²⁺ and Ti⁴⁺ must necessarily be unequal for unmixing. We also show theoretically that equal cation mobilities appear to be a sufficient condition for BaTiO₃ only, for a thermodynamic reason.
Hyperspectral Image Classification using a Self-Organizing Map
NASA Technical Reports Server (NTRS)
Martinez, P.; Gualtieri, J. A.; Aguilar, P. L.; Perez, R. M.; Linaje, M.; Preciado, J. C.; Plaza, A.
2001-01-01
The use of hyperspectral data to determine the abundance of constituents in a certain portion of the Earth's surface relies on the capability of imaging spectrometers to provide a large amount of information at each pixel of a certain scene. Today, hyperspectral imaging sensors are capable of generating unprecedented volumes of radiometric data. The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), for example, routinely produces image cubes with 224 spectral bands. This undoubtedly opens a wide range of new possibilities, but the analysis of such a massive amount of information is not an easy task. In fact, most of the existing algorithms devoted to analyzing multispectral images are not applicable in the hyperspectral domain, because of the size and high dimensionality of the images. The application of neural networks to perform unsupervised classification of hyperspectral data has been tested by several authors and also by us in some previous work. We have also focused on analyzing the intrinsic capability of neural networks to parallelize the whole hyperspectral unmixing process. The results shown in this work indicate that neural network models are able to find clusters of closely related hyperspectral signatures, and thus can be used as a powerful tool to achieve the desired classification. The present work discusses the possibility of using a Self Organizing neural network to perform unsupervised classification of hyperspectral images. In sections 3 and 4, the topology of the proposed neural network and the training algorithm are respectively described. Section 5 provides the results we have obtained after applying the proposed methodology to real hyperspectral data, described in section 2. Different parameters in the learning stage have been modified in order to obtain a detailed description of their influence on the final results. Finally, in section 6 we provide the conclusions at which we have arrived.
Arctic Tundra Vegetation Functional Types Based on Photosynthetic Physiology and Optical Properties
NASA Technical Reports Server (NTRS)
Huemmrich, Karl Fred; Gamon, John A.; Tweedie, Craig E.; Campbell, Petya K. Entcheva; Landis, David R.; Middleton, Elizabeth M.
2013-01-01
Non-vascular plants (lichens and mosses) are significant components of tundra landscapes and may respond to climate change differently from vascular plants, affecting ecosystem carbon balance. Remote sensing provides critical tools for monitoring plant cover types, as optical signals provide a way to scale from plot measurements to regional estimates of biophysical properties, for which spatial-temporal patterns may be analyzed. Gas exchange measurements were collected for pure patches of key vegetation functional types (lichens, mosses, and vascular plants) in sedge tundra at Barrow, AK. These functional types were found to have three significantly different values of light use efficiency (LUE): 0.013 ± 0.0002, 0.0018 ± 0.0002, and 0.0012 ± 0.0001 mol C mol⁻¹ absorbed quanta for vascular plants, mosses and lichens, respectively. Discriminant analysis of the spectral reflectance of these patches identified five spectral bands that separated each of these vegetation functional types as well as nongreen material (bare soil, standing water, and dead leaves). These results were tested along a 100 m transect where midsummer spectral reflectance and vegetation coverage were measured at one-meter intervals. Along the transect, area-averaged canopy LUE estimated from coverage fractions of the three functional types varied widely, even over short distances. The patch-level statistical discriminant functions applied to in situ hyperspectral reflectance data collected along the transect successfully unmixed cover fractions of the vegetation functional types. The unmixing functions, developed from the transect data, were applied to 30 m spatial resolution Earth Observing-1 Hyperion imaging spectrometer data to examine variability in distribution of the vegetation functional types for an area near Barrow, AK. Spatial variability of LUE was derived from the observed functional type distributions. Across this landscape, a fivefold variation in tundra LUE was observed. LUE calculated from the functional type cover fractions was also correlated to a spectral vegetation index developed to detect vegetation chlorophyll content. The concurrence of these alternate methods suggests that hyperspectral remote sensing can distinguish functionally distinct vegetation types and can be used to develop regional estimates of photosynthetic LUE in tundra landscapes.
NASA Astrophysics Data System (ADS)
Ghaffarian, Saman; Ghaffarian, Salar
2014-11-01
This paper proposes an improved FastICA model, named Purposive FastICA (PFICA), initialized by a simple color space transformation and a novel masking approach, to automatically detect buildings from high-resolution Google Earth imagery. ICA and FastICA algorithms are defined as Blind Source Separation (BSS) techniques for unmixing source signals using reference data sets. In order to overcome the limitations of the ICA and FastICA algorithms and make them purposeful, we developed a novel method involving three main steps: 1) improving the FastICA algorithm using a Moore-Penrose pseudo-inverse matrix model; 2) automated seeding of the PFICA algorithm based on the LUV color space and proposed simple rules to split the image into three regions (shadow + vegetation, bare soil + roads, and buildings, respectively); 3) masking out the final building detection results from the PFICA outputs using the K-means clustering algorithm with two clusters and conducting simple morphological operations to remove noise. Evaluation of the results illustrates that buildings detected from dense and suburban districts with diverse characteristics and color combinations using our proposed method have 88.6% and 85.5% overall pixel-based and object-based precision performances, respectively.
NASA Astrophysics Data System (ADS)
Diao, Chunyuan
In today's big data era, the increasing availability of satellite and airborne platforms at various spatial and temporal scales creates unprecedented opportunities to understand the complex and dynamic systems (e.g., plant invasion). Time series remote sensing is becoming more and more important to monitor the earth system dynamics and interactions. To date, most of the time series remote sensing studies have been conducted with the images acquired at coarse spatial scale, due to their relatively high temporal resolution. The construction of time series at fine spatial scale, however, is limited to few or discrete images acquired within or across years. The objective of this research is to advance the time series remote sensing at fine spatial scale, particularly to shift from discrete time series remote sensing to continuous time series remote sensing. The objective will be achieved through the following aims: 1) Advance intra-annual time series remote sensing under the pure-pixel assumption; 2) Advance intra-annual time series remote sensing under the mixed-pixel assumption; 3) Advance inter-annual time series remote sensing in monitoring the land surface dynamics; and 4) Advance the species distribution model with time series remote sensing. Taking invasive saltcedar as an example, four methods (i.e., phenological time series remote sensing model, temporal partial unmixing method, multiyear spectral angle clustering model, and time series remote sensing-based spatially explicit species distribution model) were developed to achieve the objectives. Results indicated that the phenological time series remote sensing model could effectively map saltcedar distributions through characterizing the seasonal phenological dynamics of plant species throughout the year. The proposed temporal partial unmixing method, compared to conventional unmixing methods, could more accurately estimate saltcedar abundance within a pixel by exploiting the adequate temporal signatures of saltcedar. The multiyear spectral angle clustering model could guide the selection of the most representative remotely sensed image for repetitive saltcedar mapping over space and time. Through incorporating spatial autocorrelation, the species distribution model developed in the study could identify the suitable habitats of saltcedar at a fine spatial scale and locate appropriate areas at high risk of saltcedar infestation. Among 10 environmental variables, the distance to the river and the phenological attributes summarized by the time series remote sensing were regarded as the most important. These methods developed in the study provide new perspectives on how the continuous time series can be leveraged under various conditions to investigate the plant invasion dynamics.
Estimating urban vegetation fraction across 25 cities in pan-Pacific using Landsat time series data
NASA Astrophysics Data System (ADS)
Lu, Yuhao; Coops, Nicholas C.; Hermosilla, Txomin
2017-04-01
Urbanization globally is consistently reshaping the natural landscape to accommodate the growing human population. Urban vegetation plays a key role in moderating environmental impacts caused by urbanization and is critically important for local economic, social and cultural development. The differing patterns of human population growth and the varying urban structures and development stages result in highly varied spatial and temporal vegetation patterns, particularly in the pan-Pacific region, which has some of the fastest urbanization rates globally. Yet spatially explicit temporal information on the amount and change of urban vegetation is rarely documented, particularly in less developed nations. Remote sensing offers an exceptional data source and a unique perspective to map urban vegetation and change due to its consistency and ubiquitous nature. In this research, we assess the vegetation fractions of 25 cities across 12 pan-Pacific countries using annual gap-free Landsat surface reflectance products acquired from 1984 to 2012, using sub-pixel spectral unmixing approaches. Vegetation change trends were then analyzed using Mann-Kendall statistics and Theil-Sen slope estimators. Unmixing results successfully mapped urban vegetation for pixels located in urban parks, forested mountainous regions, and agricultural land (correlation coefficients ranging from 0.66 to 0.77). The greatest vegetation loss from 1984 to 2012 was found in Shanghai, Tianjin, and Dalian in China. In contrast, cities including Vancouver (Canada) and Seattle (USA) showed stable vegetation trends through time. Using temporal trend analysis, our results suggest that it is possible to reduce noise and outliers caused by phenological changes, particularly in cropland, using dense new Landsat time series approaches. We conclude that simple yet effective approaches of unmixing Landsat time series data to assess spatial and temporal changes in urban vegetation at regional scales can provide critical information for urban planners and anthropogenic studies globally.
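A minimal sketch of the per-pixel trend analysis used above: the Mann-Kendall S statistic (with a normal approximation and no tie correction) flags monotonic change in an annual vegetation-fraction series, and the Theil-Sen estimator gives the median slope. The example series is synthetic and the function name is illustrative.

# Mann-Kendall test plus Theil-Sen slope for one pixel's yearly vegetation fractions.
import itertools
import numpy as np
from scipy.stats import norm

def mann_kendall_theil_sen(values):
    """values: (n_years,) vegetation fractions in time order -> (p_value, median slope)."""
    n = len(values)
    pairs = list(itertools.combinations(range(n), 2))
    s = sum(np.sign(values[j] - values[i]) for i, j in pairs)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    slopes = [(values[j] - values[i]) / (j - i) for i, j in pairs]
    return p, float(np.median(slopes))

p_value, slope = mann_kendall_theil_sen(np.array([0.42, 0.40, 0.41, 0.37, 0.35, 0.33]))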
Ji, Cuicui; Jia, Yonghong; Gao, Zhihai; Wei, Huaidong; Li, Xiaosong
2017-01-01
Desert vegetation plays significant roles in securing the ecological integrity of oasis ecosystems in western China. Timely monitoring of photosynthetic/non-photosynthetic desert vegetation cover is necessary to guide management practices on land desertification and research into the mechanisms driving vegetation recession. In this study, nonlinear spectral mixture effects for photosynthetic/non-photosynthetic vegetation cover estimates are investigated by comparing the performance of linear and nonlinear spectral mixture models with different endmembers applied to field spectral measurements of two types of typical desert vegetation, namely, Nitraria shrubs and Haloxylon. The main results were as follows. (1) The correct selection of endmembers is important for improving the accuracy of vegetation cover estimates, and in particular, shadow endmembers cannot be neglected. (2) For both the Nitraria shrubs and Haloxylon, the Kernel-based Nonlinear Spectral Mixture Model (KNSMM) with nonlinear parameters was the best unmixing model. In consideration of the computational complexity and accuracy requirements, the Linear Spectral Mixture Model (LSMM) could be adopted for Nitraria shrub plots, but this will result in significant errors for the Haloxylon plots since the nonlinear spectral mixture effects were more obvious for this vegetation type. (3) The vegetation canopy structure (planophile or erectophile) determines the strength of the nonlinear spectral mixture effects. Therefore, for both Nitraria shrubs and Haloxylon, nonlinear spectral mixing effects between the photosynthetic/non-photosynthetic vegetation and the bare soil do exist, and their strength depends on the three-dimensional structure of the vegetation canopy. The choice between linear and nonlinear spectral mixture models thus comes down to a trade-off between computational complexity and accuracy requirements.
Unmix 6.0 Model for environmental data analyses
Unmix Model is a mathematical receptor model developed by EPA scientists that provides scientific support for the development and review of the air and water quality standards, exposure research, and environmental forensics.
Schaaf, Tory M.; Peterson, Kurt C.; Grant, Benjamin D.; Bawaskar, Prachi; Yuen, Samantha; Li, Ji; Muretta, Joseph M.; Gillispie, Gregory D.; Thomas, David D.
2017-01-01
A robust high-throughput screening (HTS) strategy has been developed to discover small-molecule effectors targeting the sarco/endoplasmic reticulum calcium ATPase (SERCA), based on a fluorescence microplate reader that records both the nanosecond decay waveform (lifetime mode) and the complete emission spectrum (spectral mode), with high precision and speed. This spectral unmixing plate reader (SUPR) was used to screen libraries of small molecules with a fluorescence resonance energy transfer (FRET) biosensor expressed in living cells. Ligand binding was detected by FRET associated with structural rearrangements of green (GFP, donor) and red (RFP, acceptor) fluorescent proteins fused to the cardiac-specific SERCA2a isoform. The results demonstrate accurate quantitation of FRET along with high precision of hit identification. Fluorescence lifetime analysis resolved SERCA’s distinct structural states, providing a method to classify small-molecule chemotypes on the basis of their structural effect on the target. The spectral analysis was also applied to flag interference by fluorescent compounds. FRET hits were further evaluated for functional effects on SERCA’s ATPase activity via both a coupled-enzyme assay and a FRET-based calcium sensor. Concentration-response curves indicated excellent correlation between FRET and function. These complementary spectral and lifetime FRET detection methods offer an attractive combination of precision, speed, and resolution for HTS. PMID:27899691
EPA Unmix 6.0 Fundamentals & User Guide
Unmix seeks to solve the general mixture problem where the data are assumed to be a linear combination of an unknown number of sources of unknown composition, which contribute an unknown amount to each sample.
A clustering algorithm for sample data based on environmental pollution characteristics
NASA Astrophysics Data System (ADS)
Chen, Mei; Wang, Pengfei; Chen, Qiang; Wu, Jiadong; Chen, Xiaoyun
2015-04-01
Environmental pollution has become an issue of serious international concern in recent years. Among the receptor-oriented pollution models, CMB, PMF, UNMIX, and PCA are widely used as source apportionment models. To improve the accuracy of source apportionment and classify the sample data for these models, this study proposes an easy-to-use, high-dimensional EPC algorithm that not only organizes all of the sample data into different groups according to the similarities in pollution characteristics such as pollution sources and concentrations but also simultaneously detects outliers. The main clustering process consists of selecting the first unlabelled point as the cluster centre, then assigning each data point in the sample dataset to its most similar cluster centre according to both the user-defined threshold and the value of similarity function in each iteration, and finally modifying the clusters using a method similar to k-Means. The validity and accuracy of the algorithm are tested using both real and synthetic datasets, which makes the EPC algorithm practical and effective for appropriately classifying sample data for source apportionment models and helpful for better understanding and interpreting the sources of pollution.
Analysis of multispectral and hyperspectral longwave infrared (LWIR) data for geologic mapping
NASA Astrophysics Data System (ADS)
Kruse, Fred A.; McDowell, Meryl
2015-05-01
Multispectral MODIS/ASTER Airborne Simulator (MASTER) data and Hyperspectral Thermal Emission Spectrometer (HyTES) data covering the 8 - 12 μm spectral range (longwave infrared or LWIR) were analyzed for an area near Mountain Pass, California. Decorrelation stretched images were initially used to highlight spectral differences between geologic materials. Both datasets were atmospherically corrected using the ISAC method, and the Normalized Emissivity approach was used to separate temperature and emissivity. The MASTER data had 10 LWIR spectral bands and approximately 35-meter spatial resolution and covered a larger area than the HyTES data, which were collected with 256 narrow (approximately 17nm-wide) spectral bands at approximately 2.3-meter spatial resolution. Spectra for key spatially-coherent, spectrally-determined geologic units for overlap areas were overlain and visually compared to determine similarities and differences. Endmember spectra were extracted from both datasets using n-dimensional scatterplotting and compared to emissivity spectral libraries for identification. Endmember distributions and abundances were then mapped using Mixture-Tuned Matched Filtering (MTMF), a partial unmixing approach. Multispectral results demonstrate separation of silica-rich vs non-silicate materials, with distinct mapping of carbonate areas and general correspondence to the regional geology. Hyperspectral results illustrate refined mapping of silicates with distinction between similar units based on the position, character, and shape of high resolution emission minima near 9 μm. Calcite and dolomite were separated, identified, and mapped using HyTES based on a shift of the main carbonate emissivity minimum from approximately 11.3 to 11.2 μm respectively. Both datasets demonstrate the utility of LWIR spectral remote sensing for geologic mapping.
Electrokinetic instability micromixing.
Oddy, M H; Santiago, J G; Mikkelsen, J C
2001-12-15
We have developed an electrokinetic process to rapidly stir micro- and nanoliter volume solutions for microfluidic bioanalytical applications. We rapidly stir microflow streams by initiating a flow instability, which we have observed in sinusoidally oscillating, electroosmotic channel flows. As the effect occurs within an oscillating electroosmotic flow, we refer to it here as an electrokinetic instability (EKI). The rapid stretching and folding of material lines associated with this instability can be used to stir fluid streams with Reynolds numbers of order unity, based on channel depth and rms electroosmotic velocity. This paper presents a preliminary description of the EKI and the design and fabrication of two micromixing devices capable of rapidly stirring two fluid streams using this flow phenomenon. A high-resolution CCD camera is used to record the stirring and diffusion of fluorescein from an initially unmixed configuration. Integration of fluorescence intensity over measurement volumes (voxels) provides a measure of the degree to which two streams are mixed to within the length scales of the voxels. Ensemble-averaged probability density functions and power spectra of the instantaneous spatial intensity profiles are used to quantify the mixing processes. Two-dimensional spectral bandwidths of the mixing images are initially anisotropic for the unmixed configuration, broaden as the stirring associated with the EKI rapidly stretches and folds material lines (adding high spatial frequencies to the concentration field), and then narrow to a relatively isotropic spectrum at the well-mixed conditions.
Enhancement of overwritten text in the Archimedes Palimpsest
NASA Astrophysics Data System (ADS)
Knox, Keith T.
2008-02-01
The Archimedes Palimpsest is a thousand-year-old overwritten parchment manuscript, containing several treatises by Archimedes. Eight hundred years ago, it was erased, overwritten and bound into a prayer book. In the middle of the twentieth century, a few pages were painted over with forged Byzantine icons. Today, a team of imagers, scholars and conservators is recovering and interpreting the erased Archimedes writings. Two different methods have been used to reveal the erased undertext. Spectral information is obtained by illuminating the manuscript with narrow-band light from the ultraviolet, through the visible wavebands and into the near-infrared wavelengths. Characters are extracted by combining pairs of spectral bands or by spectral unmixing techniques adapted from remote sensing. Lastly, since all of the text was written with iron gall ink, X-ray fluorescence has been used to expose the ink underneath the painted icons. This paper describes the use of color to enhance the erased text in the processed images and to make it visible to the scholars. Special pseudocolor techniques have been developed that significantly increase the contrast of the erased text and make it readable by the scholars despite the presence of the obscuring, overlaid text.
Raman spectroscopic imaging as complementary tool for histopathologic assessment of brain tumors
NASA Astrophysics Data System (ADS)
Krafft, Christoph; Bergner, Norbert; Romeike, Bernd; Reichart, Rupert; Kalff, Rolf; Geiger, Kathrin; Kirsch, Matthias; Schackert, Gabriele; Popp, Jürgen
2012-02-01
Raman spectroscopy enables label-free assessment of brain tissues and tumors based on their biochemical composition. Combination of the Raman spectra with the lateral information allows grading of tumors, determining the primary tumor of brain metastases and delineating tumor margins, even during surgery after coupling with fiber optic probes. This contribution presents exemplary Raman spectra and images collected from low grade and high grade regions of astrocytic gliomas and brain metastases. A region of interest in dried tissue sections encompassed slightly increased cell density. Spectral unmixing by vertex component analysis (VCA) and N-FINDR resolved cell nuclei in score plots and revealed the spectral contributions of nucleic acids, cholesterol, cholesterol ester and proteins in endmember signatures. The results correlated with the histopathological analysis after staining the specimens with hematoxylin and eosin. For a region of interest in non-dried, buffer-immersed tissue sections, image processing was not affected by drying artifacts such as denaturation of biomolecules and crystallization of cholesterol. Consequently, the results correspond better to in vivo situations. Raman spectroscopic imaging of a brain metastasis from renal cell carcinoma showed an endmember with spectral contributions of glycogen, which can be considered a marker for this primary tumor.
Spectral compression algorithms for the analysis of very large multivariate images
Keenan, Michael R.
2007-10-16
A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
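As an illustration of the factored representation described above, the sketch below compresses an image cube with a truncated PCA and reconstructs it from the retained factors; the function names and the use of a plain SVD are assumptions, not the patented block algorithm.

    import numpy as np

    def pca_compress(cube, n_factors):
        """Compress an (rows, cols, channels) image cube to `n_factors`
        spectral factors via PCA, returning scores, loadings and channel means."""
        r, c, b = cube.shape
        X = cube.reshape(-1, b).astype(float)
        mean = X.mean(axis=0)
        # SVD of the mean-centred data gives the principal components directly
        U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
        scores = U[:, :n_factors] * s[:n_factors]     # spatial factors
        loadings = Vt[:n_factors]                     # spectral factors
        return scores.reshape(r, c, n_factors), loadings, mean

    def pca_reconstruct(scores, loadings, mean):
        r, c, k = scores.shape
        X = scores.reshape(-1, k) @ loadings + mean
        return X.reshape(r, c, loadings.shape[1])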
Huang, Kuixian; Luo, Xingzhang
2018-01-01
The purpose of this study is to recognize the contamination characteristics of trace metals in soils and apportion their potential sources in Northern China, to provide a scientific basis for soil environment management and pollution control. A data set of 12 metal elements in surface soil samples was collected. The enrichment factor and geoaccumulation index were used to identify the general geochemical characteristics of trace metals in the soils. The UNMIX and positive matrix factorization (PMF) models were comparatively applied to apportion their potential sources. Furthermore, geostatistical tools were used to study the spatial distribution of pollution characteristics and to identify the regions affected by the sources derived from the apportionment models. The soils were contaminated by Cd, Hg, Pb and Zn to varying degrees. Industrial activities, agricultural activities and natural sources were identified as the potential sources determining the contents of trace metals in soils, with contributions of 24.8%–24.9%, 33.3%–37.2% and 38.0%–41.8%, respectively. The slightly different results obtained from UNMIX and PMF might be caused by the estimations of uncertainty and the different algorithms within the models. PMID:29474412
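The enrichment factor and geoaccumulation index used above follow standard definitions, sketched below; the choice of reference element and the background concentrations in the example are assumptions for illustration only.

    import numpy as np

    def enrichment_factor(c_sample, c_ref_sample, c_background, c_ref_background):
        """EF = (C_i / C_ref)_sample / (C_i / C_ref)_background.
        The reference element (e.g. Al or Fe) is an assumed choice here."""
        return (c_sample / c_ref_sample) / (c_background / c_ref_background)

    def geoaccumulation_index(c_sample, c_background):
        """Igeo = log2(C_n / (1.5 * B_n)); the factor 1.5 absorbs natural
        background fluctuation."""
        return np.log2(c_sample / (1.5 * c_background))

    # Example: Cd at 0.6 mg/kg against a 0.2 mg/kg background, with Al as the
    # (hypothetical) reference element.
    ef = enrichment_factor(0.6, 70000.0, 0.2, 80000.0)
    igeo = geoaccumulation_index(0.6, 0.2)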
Lattice Independent Component Analysis for Mobile Robot Localization
NASA Astrophysics Data System (ADS)
Villaverde, Ivan; Fernandez-Gauna, Borja; Zulueta, Ekaitz
This paper introduces an approach to appearance-based mobile robot localization using Lattice Independent Component Analysis (LICA). The Endmember Induction Heuristic Algorithm (EIHA) is used to select a set of Strong Lattice Independent (SLI) vectors, which can be assumed to be affine independent and are therefore candidates to be the endmembers of the data. Selected endmembers are used to compute the linear unmixing of the robot's acquired images. The resulting mixing coefficients are used as feature vectors for view recognition through classification. We show on a sample path experiment that our approach can recognize the robot's location, and we compare the results with Independent Component Analysis (ICA).
NASA Astrophysics Data System (ADS)
Pu, Yang; Wang, Wubao; Tang, Guichen; Budansky, Yury; Sharonov, Mikhail; Xu, Min; Achilefu, Samuel; Eastham, James A.; Alfano, Robert R.
2012-01-01
A portable near infrared scanning polarization imaging unit with an optical fiber-based rectal probe, namely the Photonic Finger, was designed and developed to locate the 3D position of abnormal prostate sites inside normal prostate tissue. An inverse algorithm, Optical Tomography using Independent Component Analysis (OPTICA), was improved specifically to unmix the signal from targets (cancerous tissue) embedded in a turbid medium (normal tissue) in the backscattering imaging geometry. The Photonic Finger combined with OPTICA was tested to characterize different targets inside different tissue media, including cancerous prostate tissue embedded in a large piece of normal tissue.
NASA Astrophysics Data System (ADS)
Ma, Xiaoke; Wang, Bingbo; Yu, Liang
2018-01-01
Community detection is fundamental for revealing the structure-functionality relationship in complex networks, and it involves two issues: the quantitative function for community and the algorithms to discover communities. Despite significant research on either of them, few attempts have been made to establish the connection between the two issues. To attack this problem, a generalized quantification function is proposed for community in weighted networks, which provides a framework that unifies several well-known measures. Then, we prove that the trace optimization of the proposed measure is equivalent to the objective functions of algorithms such as nonnegative matrix factorization, kernel k-means and spectral clustering. This serves as the theoretical foundation for designing algorithms for community detection. On the second issue, a semi-supervised spectral clustering algorithm is developed by exploiting the equivalence relation and combining nonnegative matrix factorization with spectral clustering. Different from traditional semi-supervised algorithms, the partial supervision is integrated into the objective of the spectral algorithm. Finally, through extensive experiments on both artificial and real-world networks, we demonstrate that the proposed method improves the accuracy of traditional spectral algorithms in community detection.
Linear unmixing of multidate hyperspectral imagery for crop yield estimation
USDA-ARS?s Scientific Manuscript database
In this paper, we have evaluated an unsupervised unmixing approach, vertex component analysis (VCA), for the application of crop yield estimation. The results show that abundance maps of the vegetation extracted by the approach are strongly correlated to the yield data (the correlation coefficients ...
Spectral matching technology for light-emitting diode-based jaundice photodynamic therapy device
NASA Astrophysics Data System (ADS)
Gan, Ru-ting; Guo, Zhen-ning; Lin, Jie-ben
2015-02-01
The objective of this paper is to obtain the spectrum of a light-emitting diode (LED)-based jaundice photodynamic therapy device (JPTD); the in vivo bilirubin absorption spectrum was regarded as the target spectrum. According to spectral constructing theory, a simple genetic algorithm was first proposed in this study as the spectral matching algorithm. The optimal combination ratios of the LEDs were obtained, and the required number of LEDs was then calculated. Meanwhile, the algorithm was compared with existing spectral matching algorithms. The results show that this algorithm runs faster with higher efficiency, the switching time consumed is 2.06 s, and the fitted spectrum is very similar to the target spectrum, with a 98.15% matching degree. Thus, a blue LED-based JPTD can replace the traditional blue fluorescent tube, and the spectral matching technology put forward here can be applied to light source spectral matching for jaundice photodynamic therapy and other medical phototherapy.
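The matching step amounts to expressing the target spectrum as a weighted sum of LED emission line shapes. The sketch below uses nonnegative least squares as a simple stand-in for the genetic algorithm reported in the paper; the matching-degree formula is one plausible definition, not necessarily the one used by the authors.

    import numpy as np
    from scipy.optimize import nnls

    def match_led_spectrum(led_spectra, target):
        """Fit a target spectrum as a nonnegative weighted sum of LED line
        shapes. led_spectra: (n_wavelengths, n_leds); target: (n_wavelengths,)."""
        weights, _ = nnls(led_spectra, target)
        fit = led_spectra @ weights
        # one simple way to quantify the matching degree (an assumption)
        matching_degree = 1.0 - np.linalg.norm(fit - target) / np.linalg.norm(target)
        return weights, fit, matching_degree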
Excitation-scanning hyperspectral imaging as a means to discriminate various tissues types
NASA Astrophysics Data System (ADS)
Deal, Joshua; Favreau, Peter F.; Lopez, Carmen; Lall, Malvika; Weber, David S.; Rich, Thomas C.; Leavesley, Silas J.
2017-02-01
Little is currently known about the fluorescence excitation spectra of disparate tissues and how these spectra change with pathological state. Current imaging diagnostic techniques have limited capacity to investigate fluorescence excitation spectral characteristics. This study utilized excitation-scanning hyperspectral imaging to perform a comprehensive assessment of fluorescence spectral signatures of various tissues. Immediately following tissue harvest, a custom inverted microscope (TE-2000, Nikon Instruments) with a Xe arc lamp and thin-film tunable filter array (VersaChrome, Semrock, Inc.) was used to acquire hyperspectral image data from each sample. Scans utilized excitation wavelengths from 340 nm to 550 nm in 5 nm increments. Hyperspectral images were analyzed with custom Matlab scripts including linear spectral unmixing (LSU), principal component analysis (PCA), and Gaussian mixture modeling (GMM). Spectra were examined for potential characteristic features such as consistent intensity peaks at specific wavelengths or intensity ratios among significant wavelengths. The resultant spectral features were conserved among tissues of similar molecular composition. Additionally, excitation spectra appear to be a mixture of pure endmembers with commonalities across tissues of varied molecular composition, potentially identifiable through GMM. These results suggest the presence of common autofluorescent molecules in most tissues and that excitation-scanning hyperspectral imaging may serve as an approach for characterizing tissue composition as well as pathologic state. Future work will test the feasibility of excitation-scanning hyperspectral imaging as a contrast mode for discriminating normal and pathological tissues.
Chen, X.; Vierling, Lee; Rowell, E.; DeFelice, Tom
2004-01-01
Structural and functional analyses of ecosystems benefit when high accuracy vegetation coverages can be derived over large areas. In this study, we utilize IKONOS, Landsat 7 ETM+, and airborne scanning light detection and ranging (lidar) to quantify coniferous forest and understory grass coverages in a ponderosa pine (Pinus ponderosa) dominated ecosystem in the Black Hills of South Dakota. Linear spectral mixture analyses of IKONOS and ETM+ data were used to isolate spectral endmembers (bare soil, understory grass, and tree/shade) and calculate their subpixel fractional coverages. We then compared these endmember cover estimates to similar cover estimates derived from lidar data and field measures. The IKONOS-derived tree/shade fraction was significantly correlated with the field-measured canopy effective leaf area index (LAIe) (r2=0.55, p<0.001) and with the lidar-derived estimate of tree occurrence (r2=0.79, p<0.001). The enhanced vegetation index (EVI) calculated from IKONOS imagery showed a negative correlation with the field measured tree canopy effective LAI and lidar tree cover response (r2=0.30, r=−0.55 and r2=0.41, r=−0.64, respectively; p<0.001) and further analyses indicate a strong linear relationship between EVI and the IKONOS-derived grass fraction (r2=0.99, p<0.001). We also found that using EVI resulted in better agreement with the subpixel vegetation fractions in this ecosystem than using the normalized difference vegetation index (NDVI). Coarsening the IKONOS data to 30 m resolution imagery revealed a stronger relationship with lidar tree measures (r2=0.77, p<0.001) than at 4 m resolution (r2=0.58, p<0.001). Unmixed tree/shade fractions derived from 30 m resolution ETM+ imagery also showed a significant correlation with the lidar data (r2=0.66, p<0.001). These results demonstrate the power of using high resolution lidar data to validate spectral unmixing results of satellite imagery, and indicate that IKONOS data and Landsat 7 ETM+ data both can serve to make the important distinction between tree/shade coverage and exposed understory grass coverage during peak summertime greenness in a ponderosa pine forest ecosystem.
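For reference, the two vegetation indices compared in the study are computed as below; the EVI coefficients shown are the commonly used MODIS constants and may differ from the exact values applied to the IKONOS imagery.

    import numpy as np

    def ndvi(nir, red):
        """NDVI = (NIR - Red) / (NIR + Red), computed on reflectance bands."""
        return (nir - red) / (nir + red + 1e-12)

    def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
        """EVI = G * (NIR - Red) / (NIR + C1*Red - C2*Blue + L).
        Coefficients are the standard MODIS constants (an assumption here)."""
        return G * (nir - red) / (nir + C1 * red - C2 * blue + L)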
SOURCE APPORTIONMENT OF PHOENIX PM2.5 AEROSOL WITH THE UNMIX RECEPTOR MODEL
The multivariate receptor model Unmix has been used to analyze a 3-yr PM2.5 ambient aerosol data set collected in Phoenix, AZ, beginning in 1995. The analysis generated source profiles and overall percentage source contribution estimates (SCE) for five source categories: ga...
NASA Technical Reports Server (NTRS)
Smith, Michael D.; Bandfield, Joshua L.; Christensen, Philip R.
2000-01-01
We present two algorithms for the separation of spectral features caused by atmospheric and surface components in Thermal Emission Spectrometer (TES) data. One algorithm uses radiative transfer and successive least squares fitting to find spectral shapes first for atmospheric dust, then for water-ice aerosols, and then, finally, for surface emissivity. A second independent algorithm uses a combination of factor analysis, target transformation, and deconvolution to simultaneously find dust, water ice, and surface emissivity spectral shapes. Both algorithms have been applied to TES spectra, and both find very similar atmospheric and surface spectral shapes. For TES spectra taken during aerobraking and science phasing periods in nadir-geometry these two algorithms give meaningful and usable surface emissivity spectra that can be used for mineralogical identification.
Software algorithm and hardware design for real-time implementation of new spectral estimator
2014-01-01
Background: Real-time spectral analyzers can be difficult to implement for PC computer-based systems because of the potential for high computational cost and algorithm complexity. In this work a new spectral estimator (NSE) is developed for real-time analysis, and compared with the discrete Fourier transform (DFT). Method: Clinical data in the form of 216 fractionated atrial electrogram sequences were used as inputs. The sample rate for acquisition was 977 Hz, or approximately 1 millisecond between digital samples. Real-time NSE power spectra were generated for 16,384 consecutive data points. The same data sequences were used for spectral calculation using a radix-2 implementation of the DFT. The NSE algorithm was also developed for implementation as a real-time spectral analyzer electronic circuit board. Results: The average interval for a single real-time spectral calculation in software was 3.29 μs for NSE versus 504.5 μs for DFT. Thus for real-time spectral analysis, the NSE algorithm is approximately 150× faster than the DFT. Over a 1 millisecond sampling period, the NSE algorithm had the capability to spectrally analyze a maximum of 303 data channels, while the DFT algorithm could only analyze a single channel. Moreover, for the 8 second sequences, the NSE spectral resolution in the 3-12 Hz range was 0.037 Hz while the DFT spectral resolution was only 0.122 Hz. The NSE was also found to be implementable as a standalone spectral analyzer board using approximately 26 integrated circuits at a cost of approximately $500. The software files used for analysis are included as a supplement; please see Additional files 1 and 2. Conclusions: The NSE real-time algorithm has low computational cost and complexity, and is implementable in both software and hardware for 1 millisecond updates of multichannel spectra. The algorithm may be helpful to guide radiofrequency catheter ablation in real time. PMID:24886214
NASA Astrophysics Data System (ADS)
MacLeod, Neil A.; Weidmann, Damien
2016-05-01
High sensitivity detection, identification and quantification of chemicals in a stand-off configuration is a highly sought after capability across the security and defense sector. Specific applications include assessing the presence of explosive related materials, poisonous or toxic chemical agents, and narcotics. Real world field deployment of an operational stand-off system is challenging due to stringent requirements: high detection sensitivity, stand-off ranges from centimeters to hundreds of meters, eye-safe invisible light, near real-time response and a wide chemical versatility encompassing both vapor and condensed phase chemicals. Additionally, field deployment requires a compact, rugged, power efficient, and cost-effective design. To address these demanding requirements, we have developed the concept of the Active Coherent Laser Spectrometer (ACLaS), which can also be described as a middle infrared hyperspectral coherent lidar. Combined with robust spectral unmixing algorithms, inherited from retrievals of information from high-resolution spectral data generated by satellite-based spectrometers, ACLaS has been demonstrated to fulfil the above-mentioned needs. ACLaS prototypes have so far been developed using quantum cascade lasers (QCL) and interband cascade lasers (ICL) to exploit the fast frequency tuning capability of these solid state sources. Using distributed feedback (DFB) QCL, demonstration and performance analysis were carried out on narrow-band absorbing chemicals (N2O, H2O, H2O2, CH4, C2H2 and C2H6) at stand-off distances up to 50 m using realistic non-cooperative targets such as wood, painted metal, and bricks. Using a more widely tunable external cavity QCL, ACLaS has also been demonstrated on broadband absorbing chemicals (dichloroethane, HFC134a, ethylene glycol dinitrate and 4-nitroacetanilide solid) and on complex samples mixing narrow-band and broadband absorbers together in a realistic atmospheric background.
Lunar and Planetary Science XXXV: Mars: Remote Sensing and Terrestrial Analogs
NASA Technical Reports Server (NTRS)
2004-01-01
The session "Mars: Remote Sensing and Terrestrial Analogs" included the following:Physical Meaning of the Hapke Parameter for Macroscopic Roughness: Experimental Determination for Planetary Regolith Surface Analogs and Numerical Approach; Near-Infrared Spectra of Martian Pyroxene Separates: First Results from Mars Spectroscopy Consortium; Anomalous Spectra of High-Ca Pyroxenes: Correlation Between Ir and M ssbauer Patterns; THEMIS-IR Emissivity Spectrum of a Large Dark Streak near Olympus Mons; Geomorphologic/Thermophysical Mapping of the Athabasca Region, Mars, Using THEMIS Infrared Imaging; Mars Thermal Inertia from THEMIS Data; Multispectral Analysis Methods for Mapping Aqueous Mineral Depostis in Proposed Paleolake Basins on Mars Using THEMIS Data; Joint Analysis of Mars Odyssey THEMIS Visible and Infrared Images: A Magic Airbrush for Qualitative and Quantitative Morphology; Analysis of Mars Thermal Emission Spectrometer Data Using Large Mineral Reference Libraries ; Negative Abundance : A Problem in Compositional Modeling of Hyperspectral Images; Mars-LAB: First Remote Sensing Data of Mineralogy Exposed at Small Mars-Analog Craters, Nevada Test Site; A Tool for the 2003 Rover Mini-TES: Downwelling Radiance Compensation Using Integrated Line-Sight Sky Measurements; Learning About Mars Geology Using Thermal Infrared Spectral Imaging: Orbiter and Rover Perspectives; Classifying Terrestrial Volcanic Alteration Processes and Defining Alteration Processes they Represent on Mars; Cemented Volcanic Soils, Martian Spectra and Implications for the Martian Climate; Palagonitic Mars: A Basalt Centric View of Surface Composition and Aqueous Alteration; Combining a Non Linear Unmixing Model and the Tetracorder Algorithm: Application to the ISM Dataset; Spectral Reflectance Properties of Some Basaltic Weathering Products; Morphometric LIDAR Analysis of Amboy Crater, California: Application to MOLA Analysis of Analog Features on Mars; Airborne Radar Study of Soil Moisture at a Mars Analog Site: Tohachi Wash/Little Colorado River; and Antarctic Dry Valleys: Modification of Rocks and Soils and Implications for Mars The Arkaroola Mars Analogue Region, South Australia.
Mapping Tropical Forest Change in the Greater Marañón and Ucayali regions of Peru using CLASlite
NASA Astrophysics Data System (ADS)
Perez-Leiva, P.; Knapp, D. E.; Clark, J. K.; Asner, G. P.
2012-12-01
The Carnegie Landsat Analysis System-lite (CLASlite) was used to map and monitor tropical forest change in two large tropical watersheds in Peru: Greater Marañón and Ucayali. CLASlite uses radiometric and atmospheric correction algorithms as well as Automated Monte Carlo Unmixing (AutoMCU) to obtain consistent fractional land cover per pixel at high spatial resolution. Fractional land cover is automatically extracted from universal spectral libraries, which allow for differentiation between live photosynthetic vegetation (PV), non-photosynthetic vegetation (NPV) and bare substrate (S). Fractional cover information is directly translated to maps of forest cover based on the physical characteristics of the forest canopy. Rates of deforestation and disturbance are estimated through analysis of change in fractional land cover over time. The Greater Marañón and Ucayali watersheds were studied over the period 1985 to 2012, through analysis of 1900 multi-spectral images from Landsat 4, 5 and 7. These images were processed and analyzed using CLASlite to obtain fractional cover and forest cover information for each year within the period. Annualization of the collected maps provided detailed information on the gross rates of disturbance and deforestation throughout the region. Further, net deforestation and disturbance maps were used to show the general forest change in these watersheds over the past 25 years. We found that deforestation accounts for just ~50% of the total forest losses, and that forest disturbance (degradation) is critically important to consider when making forest change estimates associated with losses in habitat and carbon in the region. These results also provide spatially-detailed, temporally-specific information on forest change for nearly three decades. Information provided by this study will assist decision-makers in Peru to improve their regional environmental management. The results, unprecedented in spatial and temporal scope, are another example showing the fidelity of tropical deforestation and forest degradation monitoring made routine using the CLASlite system.
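A sketch of the general Monte Carlo unmixing idea behind AutoMCU, under the assumption that endmember spectra are drawn repeatedly from PV, NPV and bare-substrate libraries and each draw is unmixed by ordinary least squares; this illustrates the concept only and is not the CLASlite implementation.

    import numpy as np

    def monte_carlo_unmix(pixel, pv_lib, npv_lib, soil_lib, n_draws=50, rng=None):
        """Repeatedly draw one PV, NPV and bare-substrate spectrum from the
        libraries (each library: (n_spectra, n_bands)), solve the linear
        mixture by least squares, and report the mean and spread of fractions."""
        rng = np.random.default_rng(rng)
        fractions = []
        for _ in range(n_draws):
            E = np.column_stack([
                pv_lib[rng.integers(len(pv_lib))],
                npv_lib[rng.integers(len(npv_lib))],
                soil_lib[rng.integers(len(soil_lib))],
            ])
            f, *_ = np.linalg.lstsq(E, pixel, rcond=None)
            f = np.clip(f, 0, None)
            fractions.append(f / (f.sum() + 1e-12))   # renormalize to sum to one
        fractions = np.array(fractions)
        return fractions.mean(axis=0), fractions.std(axis=0)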
Encoding Strategy Changes and Spacing Effects in the Free Recall of Unmixed Lists
ERIC Educational Resources Information Center
Delaney, P.F.; Knowles, M.E.
2005-01-01
Memory for repeated items often improves when repetitions are separated by other items, a phenomenon called the spacing effect. In two experiments, we explored the complex interaction between study strategies, serial position, and spacing effects. When people studied several unmixed lists, they initially used mainly rote rehearsal, but some people…
Spectral Learning for Supervised Topic Models.
Ren, Yong; Wang, Yining; Zhu, Jun
2018-03-01
Supervised topic models simultaneously model the latent topic structure of large collections of documents and a response variable associated with each document. Existing inference methods are based on variational approximation or Monte Carlo sampling, which often suffer from the local-minimum problem. Spectral methods have been applied to learn unsupervised topic models, such as latent Dirichlet allocation (LDA), with provable guarantees. This paper investigates the possibility of applying spectral methods to recover the parameters of supervised LDA (sLDA). We first present a two-stage spectral method, which recovers the parameters of LDA followed by a power update method to recover the regression model parameters. Then, we further present a single-phase spectral algorithm to jointly recover the topic distribution matrix as well as the regression weights. Our spectral algorithms are provably correct and computationally efficient. We prove a sample complexity bound for each algorithm and subsequently derive a sufficient condition for the identifiability of sLDA. Thorough experiments on synthetic and real-world datasets verify the theory and demonstrate the practical effectiveness of the spectral algorithms. In fact, our results on a large-scale review rating dataset demonstrate that our single-phase spectral algorithm alone achieves comparable or even better performance than state-of-the-art methods, while previous work on spectral methods has rarely reported such promising performance.
Hyperspectral feature mapping classification based on mathematical morphology
NASA Astrophysics Data System (ADS)
Liu, Chang; Li, Junwei; Wang, Guangping; Wu, Jingli
2016-03-01
This paper proposes a hyperspectral feature mapping classification algorithm based on mathematical morphology. Without prior information such as a spectral library, the spectral and spatial information can be used to realize hyperspectral feature mapping classification. Mathematical morphological erosion and dilation operations are performed to extract endmembers, and the spectral feature mapping algorithm is then used to carry out hyperspectral image classification. A hyperspectral image collected by AVIRIS is used to evaluate the proposed algorithm, which is compared with the minimum Euclidean distance mapping algorithm, the minimum Mahalanobis distance mapping algorithm, the SAM algorithm and the binary encoding mapping algorithm. The experimental results show that the proposed algorithm performs better than the other algorithms under the same conditions and achieves higher classification accuracy.
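The spectral angle mapper referred to above assigns each pixel to the endmember with the smallest angle between the two spectra. A compact sketch follows, with the endmember matrix assumed to come from the morphological extraction step; the function name is illustrative.

    import numpy as np

    def sam_classify(cube, endmembers):
        """Spectral angle mapper: label each pixel with the endmember giving
        the smallest angle arccos(x.r / (|x||r|)).
        cube: (rows, cols, bands); endmembers: (n_endmembers, bands)."""
        r, c, b = cube.shape
        X = cube.reshape(-1, b).astype(float)
        E = endmembers.astype(float)
        cosang = (X @ E.T) / (
            np.linalg.norm(X, axis=1, keepdims=True) *
            np.linalg.norm(E, axis=1) + 1e-12)
        angles = np.arccos(np.clip(cosang, -1.0, 1.0))
        return angles.argmin(axis=1).reshape(r, c)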
Nakazawa, Yoshihisa; Takeda, Tsuyoshi; Suzuki, Nobuaki; Hayashi, Tatsushi; Harada, Yoko; Bamba, Takeshi; Kobayashi, Akio
2013-09-01
A microscopic technique combining spectral confocal laser scanning microscopy with a lipophilic fluorescent dye, Nile red, which emits trans-polyisoprene-specific fluorescence, was developed, and unmixed images of synthesized trans-polyisoprene in situ in Eucommia ulmoides were successfully obtained. The images showed that trans-polyisoprene was initially synthesized as granules in non-articulated laticifers that changed shape to fibers during laticifer maturation. Non-articulated laticifers develop from single laticiferous cells, which are differentiated from surrounding parenchyma cells in the cambium. Therefore, these observations suggested that trans-polyisoprene biosynthesis first started in laticifer cells as granules and that the granules then accumulated and fused in the inner space of the laticifers over time. Finally, the laticifers were filled with the synthesized trans-polyisoprene, which formed a fibrous structure fitting the laticifers' shape. Both trans- and cis-polyisoprene are among the most important polymers naturally produced by plants, and this microscopic technique combined with histological study should provide useful information in the fields of plant histology, bioindustry and phytochemistry.
Multispectral open-air intraoperative fluorescence imaging.
Behrooz, Ali; Waterman, Peter; Vasquez, Kristine O; Meganck, Jeff; Peterson, Jeffrey D; Faqir, Ilias; Kempner, Joshua
2017-08-01
Intraoperative fluorescence imaging informs decisions regarding surgical margins by detecting and localizing signals from fluorescent reporters, labeling targets such as malignant tissues. This guidance reduces the likelihood of undetected malignant tissue remaining after resection, eliminating the need for additional treatment or surgery. The primary challenges in performing open-air intraoperative fluorescence imaging come from the weak intensity of the fluorescence signal in the presence of strong surgical and ambient illumination, and the auto-fluorescence of non-target components, such as tissue, especially in the visible spectral window (400-650 nm). In this work, a multispectral open-air fluorescence imaging system is presented for translational image-guided intraoperative applications, which overcomes these challenges. The system is capable of imaging weak fluorescence signals with nanomolar sensitivity in the presence of surgical illumination. This is done using synchronized fluorescence excitation and image acquisition with real-time background subtraction. Additionally, the system uses a liquid crystal tunable filter for acquisition of multispectral images that are used to spectrally unmix target fluorescence from non-target auto-fluorescence. Results are validated by preclinical studies on murine models and translational canine oncology models.
Arsenovic, Paul T; Bathula, Kranthidhar; Conway, Daniel E
2017-04-11
The LINC complex has been hypothesized to be the critical structure that mediates the transfer of mechanical forces from the cytoskeleton to the nucleus. Nesprin-2G is a key component of the LINC complex that connects the actin cytoskeleton to membrane proteins (SUN domain proteins) in the perinuclear space. These membrane proteins connect to lamins inside the nucleus. Recently, a Förster Resonance Energy Transfer (FRET)-force probe was cloned into mini-Nesprin-2G (Nesprin-TS (tension sensor)) and used to measure tension across Nesprin-2G in live NIH3T3 fibroblasts. This paper describes the process of using Nesprin-TS to measure LINC complex forces in NIH3T3 fibroblasts. To extract FRET information from Nesprin-TS, an outline of how to spectrally unmix raw spectral images into acceptor and donor fluorescent channels is also presented. Using open-source software (ImageJ), images are pre-processed and transformed into ratiometric images. Finally, FRET data of Nesprin-TS is presented, along with strategies for how to compare data across different experimental groups.
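A minimal sketch of the ratiometric step, assuming the spectrally unmixed acceptor and donor images are already available as arrays; the background level and donor floor are placeholder values, not settings from the published protocol.

    import numpy as np

    def fret_ratio_image(acceptor, donor, background=0.0, donor_floor=1.0):
        """Background-subtract the unmixed acceptor and donor channels and
        divide pixel-wise; pixels with too little donor signal are set to NaN."""
        a = np.clip(acceptor.astype(float) - background, 0.0, None)
        d = donor.astype(float) - background
        ratio = np.full(a.shape, np.nan)
        mask = d > donor_floor            # keep only pixels with usable donor signal
        ratio[mask] = a[mask] / d[mask]
        return ratio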
Tunable thin-film optical filters for hyperspectral microscopy
NASA Astrophysics Data System (ADS)
Favreau, Peter F.; Rich, Thomas C.; Prabhat, Prashant; Leavesley, Silas J.
2013-02-01
Hyperspectral imaging was originally developed for use in remote sensing applications. More recently, it has been applied to biological imaging systems, such as fluorescence microscopes. The ability to distinguish molecules based on spectral differences has been especially advantageous for identifying fluorophores in highly autofluorescent tissues. A key component of hyperspectral imaging systems is wavelength filtering. Each filtering technology used for hyperspectral imaging has corresponding advantages and disadvantages. Recently, a new optical filtering technology has been developed that uses multi-layered thin-film optical filters that can be rotated, with respect to incident light, to control the center wavelength of the pass-band. Compared to the majority of tunable filter technologies, these filters have superior optical performance including greater than 90% transmission, steep spectral edges and high out-of-band blocking. Hence, tunable thin-film optical filters present optical characteristics that may make them well-suited for many biological spectral imaging applications. An array of tunable thin-film filters was implemented on an inverted fluorescence microscope (TE 2000, Nikon Instruments) to cover the full visible wavelength range. Images of a previously published model, GFP-expressing endothelial cells in the lung, were acquired using a charge-coupled device camera (Rolera EM-C2, Q-Imaging). This model sample presents fluorescently-labeled cells in a highly autofluorescent environment. Linear unmixing of hyperspectral images indicates that thin-film tunable filters provide equivalent spectral discrimination to our previous acousto-optic tunable filter-based approach, with increased signal-to-noise characteristics. Hence, tunable multi-layered thin film optical filters may provide greatly improved spectral filtering characteristics and therefore enable wider acceptance of hyperspectral widefield microscopy.
Spectral Diffusion: An Algorithm for Robust Material Decomposition of Spectral CT Data
Clark, Darin P.; Badea, Cristian T.
2014-01-01
Clinical successes with dual energy CT, aggressive development of energy discriminating x-ray detectors, and novel, target-specific, nanoparticle contrast agents promise to establish spectral CT as a powerful functional imaging modality. Common to all of these applications is the need for a material decomposition algorithm which is robust in the presence of noise. Here, we develop such an algorithm which uses spectrally joint, piece-wise constant kernel regression and the split Bregman method to iteratively solve for a material decomposition which is gradient sparse, quantitatively accurate, and minimally biased. We call this algorithm spectral diffusion because it integrates structural information from multiple spectral channels and their corresponding material decompositions within the framework of diffusion-like denoising algorithms (e.g. anisotropic diffusion, total variation, bilateral filtration). Using a 3D, digital bar phantom and a material sensitivity matrix calibrated for use with a polychromatic x-ray source, we quantify the limits of detectability (CNR = 5) afforded by spectral diffusion in the triple-energy material decomposition of iodine (3.1 mg/mL), gold (0.9 mg/mL), and gadolinium (2.9 mg/mL) concentrations. We then apply spectral diffusion to the in vivo separation of these three materials in the mouse kidneys, liver, and spleen. PMID:25296173
A wavelet and least square filter based spatial-spectral denoising approach of hyperspectral imagery
NASA Astrophysics Data System (ADS)
Li, Ting; Chen, Xiao-Mei; Chen, Gang; Xue, Bo; Ni, Guo-Qiang
2009-11-01
Noise reduction is a crucial step in hyperspectral imagery pre-processing. Owing to sensor characteristics, the noise of hyperspectral imagery is present in both the spatial and spectral domains. However, most prevailing denoising techniques process the imagery in only one specific domain and do not exploit the multi-domain nature of hyperspectral imagery. In this paper, a new spatial-spectral noise reduction algorithm is proposed, based on wavelet analysis and least squares filtering techniques. First, in the spatial domain, a new stationary wavelet shrinking algorithm with an improved threshold function is used to adjust the noise level band by band. This algorithm uses BayesShrink for threshold estimation and amends the traditional soft-threshold function by adding shape tuning parameters. Compared with the soft or hard threshold functions, the improved one, which is first-order differentiable and has a smooth transition region between noise and signal, preserves more image edge detail and weakens pseudo-Gibbs artifacts. Then, in the spectral domain, a cubic Savitzky-Golay filter based on the least squares method is used to remove spectral noise and artificial noise that may have been introduced during the spatial denoising. With the filter window width selected appropriately according to prior knowledge, this algorithm smooths the spectral curve effectively. The performance of the new algorithm is evaluated on a set of Hyperion images acquired in 2007. The results show that the new spatial-spectral denoising algorithm provides a more significant signal-to-noise-ratio improvement than traditional spatial-only or spectral-only methods, while better preserving local spectral absorption features.
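A rough sketch of the two-stage idea, assuming a plain (non-stationary) wavelet transform, the standard BayesShrink soft threshold rather than the paper's modified threshold function, and a cubic Savitzky-Golay filter along the spectral axis; the wavelet, level and window settings are illustrative.

    import numpy as np
    import pywt
    from scipy.signal import savgol_filter

    def denoise_band(band, wavelet="db4", level=2):
        """Per-band wavelet shrinkage with a BayesShrink-style soft threshold."""
        coeffs = pywt.wavedec2(band, wavelet, level=level)
        # noise sigma from the finest-scale diagonal detail coefficients
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        out = [coeffs[0]]
        for details in coeffs[1:]:
            shrunk = []
            for d in details:
                sigma_x = np.sqrt(max(d.var() - sigma**2, 1e-12))
                thr = sigma**2 / sigma_x              # BayesShrink threshold
                shrunk.append(pywt.threshold(d, thr, mode="soft"))
            out.append(tuple(shrunk))
        rec = pywt.waverec2(out, wavelet)
        return rec[:band.shape[0], :band.shape[1]]    # crop padding for odd sizes

    def denoise_cube(cube, window=7, polyorder=3):
        """Spatial wavelet denoising band by band, then a cubic Savitzky-Golay
        filter along the spectral axis of a (rows, cols, bands) cube."""
        spatial = np.stack([denoise_band(cube[:, :, k])
                            for k in range(cube.shape[2])], axis=2)
        return savgol_filter(spatial, window_length=window,
                             polyorder=polyorder, axis=2)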
Multi scales based sparse matrix spectral clustering image segmentation
NASA Astrophysics Data System (ADS)
Liu, Zhongmin; Chen, Zhicai; Li, Zhanming; Hu, Wenjin
2018-04-01
In image segmentation, spectral clustering algorithms have to adopt an appropriate scaling parameter to calculate the similarity matrix between pixels, which may have a great impact on the clustering result. Moreover, when the number of data instances is large, the computational complexity and memory use of the algorithm greatly increase. To solve these two problems, we propose a new spectral clustering image segmentation algorithm based on multiple scales and a sparse matrix. We first devise a new feature extraction method, then extract image features at different scales, and finally use the feature information to construct a sparse similarity matrix, which improves the operation efficiency. Compared with the traditional spectral clustering algorithm, image segmentation experiments show that our algorithm achieves better accuracy and robustness.
Using dark current data to estimate AVIRIS noise covariance and improve spectral analyses
NASA Technical Reports Server (NTRS)
Boardman, Joseph W.
1995-01-01
Starting in 1994, all AVIRIS data distributions include a new product useful for quantification and modeling of the noise in the reported radiance data. The 'postcal' file contains approximately 100 lines of dark current data collected at the end of each data acquisition run. In essence this is a regular spectral-image cube, with 614 samples, 100 lines and 224 channels, collected with a closed shutter. Since there is no incident radiance signal, the recorded DNs measure only the DC signal level and the noise in the system. Similar dark current measurements, made at the end of each line, are used, with a 100-line moving average, to remove the DC signal offset. Therefore, the pixel-by-pixel fluctuations about the mean of this dark current image provide an excellent model for the additive noise that is present in AVIRIS reported radiance data. The 61,400 dark current spectra can be used to calculate the noise levels in each channel and the noise covariance matrix. Both of these noise parameters should be used to improve spectral processing techniques. Some processing techniques, such as spectral curve fitting, will benefit from a robust estimate of the channel-dependent noise levels. Other techniques, such as automated unmixing and classification, will be improved by the stable and scene-independent noise covariance estimate. Future imaging spectrometry systems should have a similar ability to record dark current data, permitting this noise characterization and modeling.
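Estimating the channel noise levels and noise covariance from the 'postcal' dark-current cube reduces to simple statistics over the 61,400 dark spectra. A sketch, assuming the cube is supplied as a (lines, samples, channels) array:

    import numpy as np

    def noise_statistics(dark_cube):
        """Per-channel noise levels and the noise covariance matrix from a
        dark-current cube shaped (lines, samples, channels)."""
        spectra = dark_cube.reshape(-1, dark_cube.shape[-1]).astype(float)
        fluctuations = spectra - spectra.mean(axis=0)   # remove the DC level per channel
        noise_std = fluctuations.std(axis=0, ddof=1)    # channel-dependent noise level
        noise_cov = np.cov(fluctuations, rowvar=False)  # (channels, channels) covariance
        return noise_std, noise_cov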
Fast Constrained Spectral Clustering and Cluster Ensemble with Random Projection
Liu, Wenfen
2017-01-01
Constrained spectral clustering (CSC) can greatly improve clustering accuracy by incorporating constraint information into spectral clustering, and it has therefore received wide academic attention. In this paper, we propose a fast CSC algorithm that encodes landmark-based graph construction into a new CSC model and applies random sampling to decrease the data size after spectral embedding. Compared with the original model, the new algorithm gives similar results asymptotically as its model size increases; compared with the most efficient CSC algorithm known, the new algorithm runs faster and suits a wider range of data sets. Meanwhile, a scalable semisupervised cluster ensemble algorithm is also proposed by combining our fast CSC algorithm with dimensionality reduction via random projection in the process of spectral ensemble clustering. We demonstrate through theoretical analysis and empirical results that the new cluster ensemble algorithm has advantages in terms of efficiency and effectiveness. Furthermore, the approximate preservation of clustering accuracy under random projection, proved in the consensus clustering stage, also holds for weighted k-means clustering and thus gives a theoretical guarantee for this special kind of k-means clustering in which each point has its own weight. PMID:29312447
Unmixing the SNCs: Chemical, Isotopic, and Petrologic Components of the Martian Meteorites
NASA Technical Reports Server (NTRS)
2002-01-01
This volume contains abstracts that have been accepted for presentation at the conference on Unmixing the SNCs: Chemical, Isotopic, and Petrologic Components of Martian Meteorites, September 11-12, 2002, in Houston, Texas. Administration and publications support for this meeting were provided by the staff of the Publications and Program Services Department at the Lunar and Planetary Institute.
Unsupervised Bayesian linear unmixing of gene expression microarrays.
Bazot, Cécile; Dobigeon, Nicolas; Tourneret, Jean-Yves; Zaas, Aimee K; Ginsburg, Geoffrey S; Hero, Alfred O
2013-03-19
This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high dimensional assays like gene expression microarrays. The basis for uBLU is a Bayesian model for the data samples which are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. The particularity of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters. Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real time-evolving gene expression dataset from a recent viral challenge study in which individuals have been inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here. The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with clinical symptom scores collected during the study. Using a constrained model allows recovery of all the inflammatory genes in a single factor.
Yang, Pao-Keng
2012-05-01
We present a noniterative algorithm to reliably reconstruct the spectral reflectance from discrete reflectance values measured by using multicolor light emitting diodes (LEDs) as probing light sources. The proposed algorithm estimates the spectral reflectance by a linear combination of product functions of the detector's responsivity function and the LEDs' line-shape functions. After introducing suitable correction, the resulting spectral reflectance was found to be free from the spectral-broadening effect due to the finite bandwidth of LED. We analyzed the data for a real sample and found that spectral reflectance with enhanced resolution gives a more accurate prediction in the color measurement.
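A simplified reading of the reconstruction: model the reflectance as a linear combination of product functions (detector responsivity times each LED line shape) and solve for the coefficients from the measured detector values. The discretization and Gram-matrix formulation below are assumptions; the published method additionally applies a correction for the LEDs' finite bandwidth that is not reproduced here.

    import numpy as np

    def reconstruct_reflectance(measurements, led_lineshapes, responsivity, wavelengths):
        """measurements: (n_leds,) detector readings, one per LED;
        led_lineshapes: (n_wavelengths, n_leds); responsivity: (n_wavelengths,)."""
        phi = responsivity[:, None] * led_lineshapes    # basis phi_j = s * L_j
        dlam = np.gradient(wavelengths)                 # quadrature weights
        # measurement model: m_i = sum_k phi_i(lambda_k) R(lambda_k) dlam_k,
        # with R modelled as a linear combination of the phi_j
        G = (phi * dlam[:, None]).T @ phi               # Gram matrix of the basis
        coeffs, *_ = np.linalg.lstsq(G, measurements, rcond=None)
        return phi @ coeffs                             # reconstructed R(lambda)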
Implementation of spectral clustering on microarray data of carcinoma using k-means algorithm
NASA Astrophysics Data System (ADS)
Frisca, Bustamam, Alhadi; Siswantining, Titin
2017-03-01
Clustering is a data analysis method that aims to group data with similar characteristics. Spectral clustering is one of the most popular modern clustering algorithms. As an effective clustering technique, the spectral clustering method emerged from concepts in spectral graph theory. Spectral clustering needs a partitioning algorithm; available methods include PAM, SOM, fuzzy c-means, and k-means. Based on research by Capital and Choudhury in 2013, the k-means algorithm with Euclidean distance provides better accuracy than the PAM algorithm, so in this paper we use k-means as our partitioning algorithm. The major advantage of spectral clustering is in reducing data dimension, which in this case serves to reduce the dimension of a large microarray dataset. A microarray is a small chip made of a glass plate containing thousands or even tens of thousands of kinds of genes in DNA fragments derived from duplicated cDNA. Microarray data are widely used to detect cancer, for example carcinoma, in which cancer cells express abnormalities in their genes. The purpose of this research is to group data with high similarity in the same cluster and data with low similarity in different clusters. This research uses carcinoma microarray data with 7457 genes. The result of partitioning using the k-means algorithm is two clusters.
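A compact sketch of spectral clustering with k-means as the partitioning step, as used in the study; the Gaussian affinity, the normalized Laplacian and the parameter values are common choices assumed here rather than details taken from the paper.

    import numpy as np
    from sklearn.cluster import KMeans

    def spectral_clustering(X, n_clusters, gamma=1.0):
        """Gaussian affinity, normalized Laplacian, leading eigenvectors,
        then k-means on the embedded rows. X: (n_samples, n_features)."""
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T        # pairwise squared distances
        W = np.exp(-gamma * np.clip(d2, 0, None))
        np.fill_diagonal(W, 0.0)
        d = W.sum(axis=1)
        D_inv_sqrt = 1.0 / np.sqrt(d + 1e-12)
        L_sym = np.eye(len(X)) - (D_inv_sqrt[:, None] * W * D_inv_sqrt[None, :])
        vals, vecs = np.linalg.eigh(L_sym)
        U = vecs[:, :n_clusters]                            # smallest eigenvectors
        U = U / (np.linalg.norm(U, axis=1, keepdims=True) + 1e-12)
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(U)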
Reconstructing Spectral Scenes Using Statistical Estimation to Enhance Space Situational Awareness
2006-12-01
simultaneously spatially and spectrally deblur the images collected from ASIS. The algorithms are based on proven estimation theories and do not...collected with any system using a filtering technology known as Electronic Tunable Filters (ETFs). Previous methods to deblur spectral images collected...spectrally deblurring than the previously investigated methods. This algorithm expands on a method used for increasing the spectral resolution in gamma-ray
Evaluating an image-fusion algorithm with synthetic-image-generation tools
NASA Astrophysics Data System (ADS)
Gross, Harry N.; Schott, John R.
1996-06-01
An algorithm that combines spectral mixing and nonlinear optimization is used to fuse multiresolution images. Image fusion merges images of different spatial and spectral resolutions to create a high spatial resolution multispectral combination. High spectral resolution allows identification of materials in the scene, while high spatial resolution locates those materials. In this algorithm, conventional spectral mixing estimates the percentage of each material (called endmembers) within each low resolution pixel. Three spectral mixing models are compared; unconstrained, partially constrained, and fully constrained. In the partially constrained application, the endmember fractions are required to sum to one. In the fully constrained application, all fractions are additionally required to lie between zero and one. While negative fractions seem inappropriate, they can arise from random spectral realizations of the materials. In the second part of the algorithm, the low resolution fractions are used as inputs to a constrained nonlinear optimization that calculates the endmember fractions for the high resolution pixels. The constraints mirror the low resolution constraints and maintain consistency with the low resolution fraction results. The algorithm can use one or more higher resolution sharpening images to locate the endmembers to high spatial accuracy. The algorithm was evaluated with synthetic image generation (SIG) tools. A SIG developed image can be used to control the various error sources that are likely to impair the algorithm performance. These error sources include atmospheric effects, mismodeled spectral endmembers, and variability in topography and illumination. By controlling the introduction of these errors, the robustness of the algorithm can be studied and improved upon. The motivation for this research is to take advantage of the next generation of multi/hyperspectral sensors. Although the hyperspectral images will be of modest to low resolution, fusing them with high resolution sharpening images will produce a higher spatial resolution land cover or material map.
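The three mixing variants compared above differ only in the constraints placed on the endmember fractions. The sketch below enforces the sum-to-one rule as a heavily weighted extra equation and the [0, 1] bounds with a bounded least-squares solver; this is one common implementation choice, not necessarily the formulation used in the paper.

    import numpy as np
    from scipy.optimize import lsq_linear

    def unmix(pixel, E, mode="full", w=1e3):
        """Linear spectral mixing in three flavours: unconstrained, partially
        constrained (fractions sum to one) and fully constrained (additionally
        bounded to [0, 1]). E: (bands, n_endmembers); pixel: (bands,)."""
        if mode == "unconstrained":
            f, *_ = np.linalg.lstsq(E, pixel, rcond=None)
            return f
        # append w * (sum of fractions) = w as an extra, heavily weighted row
        A = np.vstack([E, w * np.ones(E.shape[1])])
        b = np.append(pixel, w)
        if mode == "partial":
            f, *_ = np.linalg.lstsq(A, b, rcond=None)
            return f
        return lsq_linear(A, b, bounds=(0.0, 1.0)).x      # fully constrained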
NASA Astrophysics Data System (ADS)
Abramovich, N. S.; Kovalev, A. A.; Plyuta, V. Y.
1986-02-01
A computer algorithm has been developed to classify the spectral bands of natural scenes on Earth according to their optical characteristics. The algorithm is written in FORTRAN-IV and can be used in spectral data processing programs requiring small data loads. The spectral classifications of several different types of green vegetation canopies are given in order to illustrate the effectiveness of the algorithm.
Red Blood Cell Count Automation Using Microscopic Hyperspectral Imaging Technology.
Li, Qingli; Zhou, Mei; Liu, Hongying; Wang, Yiting; Guo, Fangmin
2015-12-01
Red blood cell counts have been proven to be one of the most frequently performed blood tests and are valuable for early diagnosis of some diseases. This paper describes an automated red blood cell counting method based on microscopic hyperspectral imaging technology. Unlike light microscopy-based red blood cell counting methods, a combined spatial and spectral algorithm is proposed to identify red blood cells by integrating active contour models and automated two-dimensional k-means with the spectral angle mapper algorithm. Experimental results show that the proposed algorithm performs better than spatial-only algorithms because it can jointly use the spatial and spectral information of blood cells.
Hyperspectral Super-Resolution of Locally Low Rank Images From Complementary Multisource Data.
Veganzones, Miguel A; Simoes, Miguel; Licciardi, Giorgio; Yokoya, Naoto; Bioucas-Dias, Jose M; Chanussot, Jocelyn
2016-01-01
Remote sensing hyperspectral images (HSIs) are quite often low rank, in the sense that the data belong to a low dimensional subspace/manifold. This has been recently exploited for the fusion of low spatial resolution HSI with high spatial resolution multispectral images in order to obtain super-resolution HSI. Most approaches adopt an unmixing or a matrix factorization perspective. The derived methods have led to state-of-the-art results when the spectral information lies in a low-dimensional subspace/manifold. However, if the subspace/manifold dimensionality spanned by the complete data set is large, i.e., larger than the number of multispectral bands, the performance of these methods decreases, mainly because the underlying sparse regression problem is severely ill-posed. In this paper, we propose a local approach to cope with this difficulty. Fundamentally, we exploit the fact that real world HSIs are locally low rank, that is, pixels acquired from a given spatial neighborhood span a very low-dimensional subspace/manifold, i.e., lower than or equal to the number of multispectral bands. Thus, we propose to partition the image into patches and solve the data fusion problem independently for each patch. This way, in each patch the subspace/manifold dimensionality is low enough that the problem is no longer ill-posed. We propose two alternative approaches to define the hyperspectral super-resolution through local dictionary learning using endmember induction algorithms. We also explore two alternatives to define the local regions, using sliding windows and binary partition trees. The effectiveness of the proposed approaches is illustrated with synthetic and semi-real data.
NASA Astrophysics Data System (ADS)
Guerschman, J. P.; Scarth, P.; McVicar, T.; Malthus, T. J.; Stewart, J.; Rickards, J.; Trevithick, R.; Renzullo, L. J.
2013-12-01
Vegetation fractional cover is a key indicator for land management monitoring, both in pastoral and agricultural settings. Maintaining adequate vegetation cover protects the soil from the effects of water and wind erosion and also ensures that carbon is returned to the soil through decomposition. Monitoring vegetation fractional cover across large areas and continuously in time needs good remote sensing techniques underpinned by high quality ground data to calibrate and validate algorithms. In this study we used Landsat and MODIS reflectance data together with field measurements from 1476 observations across Australia to produce estimates of vegetation fractional cover using a linear unmixing technique. Specifically, we aimed at separating fractions of photosynthetic vegetation (PV), non-photosynthetic vegetation (NPV) and bare soil (B). We used Landsat reflectance averaged over a 3x3 pixel window, representing the area actually measured on the ground, and also a 'degraded' Landsat reflectance averaged over a 40x40 pixel window to simulate the effect of a coarser sensor. Using these two Landsat reflectances we quantified the heterogeneity of each site. We used data from two MODIS-derived reflectance products: the Nadir BRDF-Adjusted surface Reflectance product (MCD43A4) and the MODIS 8-day surface reflectance (MOD09A1). We derived endmembers from the data and estimated fractional cover using a linear unmixing technique. Log transforms and band interaction terms were added to account for non-linearities in the spectral mixing. For each reflectance source we investigated whether the residuals were correlated with site heterogeneity, soil colour, soil moisture and land cover type. As expected, the best model was obtained when Landsat data for a small region around each site was used. We obtained root mean square error (RMSE) values of 0.134, 0.175 and 0.153 for PV, NPV and B respectively. When we degraded the Landsat data to an area of ~1 km2 around each site the model performance decreased to RMSE of 0.142, 0.181 and 0.166 for PV, NPV and B. Using MODIS reflectance data (from the MCD43A4 and MOD09A1 products) we obtained similar results as when using the 'degraded' Landsat reflectance, with no significant differences between them. Model performance decreased (i.e. RMSE increased) with site heterogeneity when coarse resolution reflectance data was used. We did not find any evidence of soil colour or moisture influence on model performance. We speculate that the unmixing models may be insensitive to soil colour and/or that the soil moisture in the top few millimetres of soil, which influences reflectance in optical sensors, is decoupled from the soil moisture in the top layer (i.e. a few cm) as measured by passive microwave sensors or estimated by models. The models tended to overestimate PV in cropping areas, possibly due to a strong red/near-infrared signal in homogeneous crops which do not have a high green cover. This study sets the basis for an operational Landsat/MODIS combined product which would benefit users with varying requirements for spatial resolution, temporal resolution and latency, and could potentially be applied to other regions of the world.
Data compressive paradigm for multispectral sensing using tunable DWELL mid-infrared detectors.
Jang, Woo-Yong; Hayat, Majeed M; Godoy, Sebastián E; Bender, Steven C; Zarkesh-Ha, Payman; Krishna, Sanjay
2011-09-26
While quantum dots-in-a-well (DWELL) infrared photodetectors have the feature that their spectral responses can be shifted continuously by varying the applied bias, the width of the spectral response at any applied bias is not sufficiently narrow for use in multispectral sensing without the aid of spectral filters. To achieve higher spectral resolutions without using physical spectral filters, algorithms have been developed for post-processing the DWELL's bias-dependent photocurrents resulting from probing an object of interest repeatedly over a wide range of applied biases. At the heart of these algorithms is the ability to approximate an arbitrary spectral filter, which we desire the DWELL-algorithm combination to mimic, by forming a weighted superposition of the DWELL's non-orthogonal spectral responses over a range of applied biases. However, these algorithms assume availability of abundant DWELL data over a large number of applied biases (>30), leading to large overall acquisition times in proportion with the number of biases. This paper reports a new multispectral sensing algorithm to substantially compress the number of necessary bias values subject to a prescribed performance level across multiple sensing applications. The algorithm identifies a minimal set of biases to be used in sensing only the relevant spectral information for remote-sensing applications of interest. Experimental results on target spectrometry and classification demonstrate a reduction in the number of required biases by a factor of 7 (e.g., from 30 to 4). The tradeoff between performance and bias compression is thoroughly investigated. © 2011 Optical Society of America
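The core of the post-processing idea, approximating a desired spectral filter by a weighted superposition of the DWELL's bias-dependent responses, reduces to a least-squares fit for the weights. The sketch below illustrates that step only, with made-up Gaussian responses and target; selecting the minimal set of biases, which is the paper's contribution, is not shown.

    import numpy as np

    def superposition_weights(responses, target):
        # Least-squares weights w so that responses @ w approximates the
        # desired filter. responses : (wavelengths, num_biases) matrix of
        # bias-dependent spectral responses; target : desired filter shape.
        w, *_ = np.linalg.lstsq(responses, target, rcond=None)
        return w

    # Toy example: three broad, overlapping bias responses approximating
    # a narrower synthetic target band (wavelength range assumed).
    wl = np.linspace(3.0, 10.0, 200)
    responses = np.stack([np.exp(-((wl - c) / 1.5)**2) for c in (4, 6, 8)], axis=1)
    target = np.exp(-((wl - 6.0) / 0.7)**2)
    print(np.round(superposition_weights(responses, target), 3))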
NASA Astrophysics Data System (ADS)
Wu, Zhejun; Kudenov, Michael W.
2017-05-01
This paper presents a reconstruction algorithm for the Spatial-Spectral Multiplexing (SSM) optical system. The goal of this algorithm is to recover the three-dimensional spatial and spectral information of a scene, given that a one-dimensional spectrometer array is used to sample the pupil of the spatial-spectral modulator. The challenge of the reconstruction is that the non-parametric representation of the three-dimensional spatial and spectral object requires a large number of variables, thus leading to an underdetermined linear system that is hard to uniquely recover. We propose to reparameterize the spectrum using B-spline functions to reduce the number of unknown variables. Our reconstruction algorithm then solves the improved linear system via a least-squares optimization of such B-spline coefficients with additional spatial smoothness regularization. The ground truth object and the optical model for the measurement matrix are simulated with both spatial and spectral assumptions according to a realistic field of view. In order to test the robustness of the algorithm, we add Poisson noise to the measurement and test on both two-dimensional and three-dimensional spatial and spectral scenes. Our analysis shows that the root mean square error of the recovered results is within 5.15%.
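A minimal sketch of the reparameterization idea follows: an underdetermined system y = A x becomes well posed once the unknown spectrum is written as x = B c with a B-spline basis B, and a second-difference smoothness penalty is added to the least-squares problem. The matrices, knot placement, and regularization weight are illustrative assumptions, not the authors' optical model.

    import numpy as np
    from scipy.interpolate import BSpline

    def bspline_basis(wavelengths, knots, degree=3):
        # Evaluate a B-spline basis matrix (samples x coefficients).
        n_coef = len(knots) - degree - 1
        basis = np.zeros((len(wavelengths), n_coef))
        for j in range(n_coef):
            c = np.zeros(n_coef)
            c[j] = 1.0
            basis[:, j] = BSpline(knots, c, degree)(wavelengths)
        return basis

    rng = np.random.default_rng(1)
    wl = np.linspace(400, 700, 120)
    knots = np.concatenate([[400] * 3, np.linspace(400, 700, 8), [700] * 3])
    B = bspline_basis(wl, knots)                  # 120 samples, 10 coefficients
    A = rng.standard_normal((40, 120))            # only 40 measurements
    x_true = B @ rng.standard_normal(B.shape[1])
    y = A @ x_true
    D = np.diff(np.eye(B.shape[1]), 2, axis=0)    # second-difference operator
    lam = 1e-2                                    # smoothness weight (assumed)
    M = np.vstack([A @ B, np.sqrt(lam) * D])
    rhs = np.concatenate([y, np.zeros(D.shape[0])])
    c_hat, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    print(np.linalg.norm(B @ c_hat - x_true) / np.linalg.norm(x_true))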
Lichten, Catherine A; White, Rachel; Clark, Ivan B N; Swain, Peter S
2014-02-03
To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may be to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour.
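At its core the correction amounts to expressing each well's emission scan as a linear combination of a fluorescent-protein reference spectrum and an autofluorescence reference, then solving for the two amounts. The sketch below shows only that unmixing step on toy spectra; the error estimation and per-cell normalization described above are not included.

    import numpy as np

    def unmix_plate_reader(scan, gfp_ref, autofluor_ref):
        # Split a well's emission scan into fluorescent-protein and
        # autofluorescence contributions by ordinary least squares.
        basis = np.column_stack([gfp_ref, autofluor_ref])
        coeffs, *_ = np.linalg.lstsq(basis, scan, rcond=None)
        return coeffs  # [gfp_amount, autofluorescence_amount]

    # Toy spectra on a hypothetical emission-wavelength grid: the scan
    # is a 60/40 mix of the two reference shapes.
    gfp_ref = np.array([0.1, 0.6, 1.0, 0.5, 0.2])
    auto_ref = np.array([0.4, 0.5, 0.4, 0.3, 0.3])
    scan = 0.6 * gfp_ref + 0.4 * auto_ref
    print(unmix_plate_reader(scan, gfp_ref, auto_ref))  # ~[0.6, 0.4]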
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cowan, Nicolas B.; Strait, Talia E., E-mail: n-cowan@northwestern.edu
Planned missions will spatially resolve temperate terrestrial planets from their host star. Although reflected light from such a planet encodes information about its surface, it has not been shown how to establish surface characteristics of a planet without assuming known surfaces to begin with. We present a reanalysis of disk-integrated, time-resolved, multiband photometry of Earth obtained by the Deep Impact spacecraft as part of the EPOXI Mission of Opportunity. We extract reflectance spectra of clouds, ocean, and land without a priori knowledge of the numbers or colors of these surfaces. We show that the inverse problem of extracting surface spectra from such data is a novel and extreme instance of spectral unmixing, a well-studied problem in remote sensing. Principal component analysis is used to determine an appropriate number of model surfaces with which to interpret the data. Shrink-wrapping a simplex to the color excursions of the planet yields a conservative estimate of the planet's endmember spectra. The resulting surface maps are unphysical, however, requiring negative or larger-than-unity surface coverage at certain locations. Our "rotational unmixing" supersedes the endmember analysis by simultaneously solving for the surface spectra and their geographical distributions on the planet, under the assumption of diffuse reflection and known viewing geometry. We use a Markov Chain Monte Carlo to determine best-fit parameters and their uncertainties. The resulting albedo spectra are similar to clouds, ocean, and land seen through a Rayleigh-scattering atmosphere. This study suggests that future direct-imaging efforts could identify and map unknown surfaces and clouds on exoplanets.
2014-01-01
Background To connect gene expression with cellular physiology, we need to follow levels of proteins over time. Experiments typically use variants of Green Fluorescent Protein (GFP), and time-series measurements require specialist expertise if single cells are to be followed. Fluorescence plate readers, however, a standard in many laboratories, can in principle provide similar data, albeit at a mean, population level. Nevertheless, extracting the average fluorescence per cell is challenging because autofluorescence can be substantial. Results Here we propose a general method for correcting plate reader measurements of fluorescent proteins that uses spectral unmixing and determines both the fluorescence per cell and the errors on that fluorescence. Combined with strain collections, such as the GFP fusion collection for budding yeast, our methodology allows quantitative measurements of protein levels of up to hundreds of genes and therefore provides complementary data to high throughput studies of transcription. We illustrate the method by following the induction of the GAL genes in Saccharomyces cerevisiae for over 20 hours in different sugars and argue that the order of appearance of the Leloir enzymes may be to reduce build-up of the toxic intermediate galactose-1-phosphate. Further, we quantify protein levels of over 40 genes, again over 20 hours, after cells experience a change in carbon source (from glycerol to glucose). Conclusions Our methodology is sensitive, scalable, and should be applicable to other organisms. By allowing quantitative measurements on a per cell basis over tens of hours and over hundreds of genes, it should increase our understanding of the dynamic changes that drive cellular behaviour. PMID:24495318
Determining Reflectance Spectra of Surfaces and Clouds on Exoplanets
NASA Astrophysics Data System (ADS)
Cowan, Nicolas B.; Strait, Talia E.
2013-03-01
Planned missions will spatially resolve temperate terrestrial planets from their host star. Although reflected light from such a planet encodes information about its surface, it has not been shown how to establish surface characteristics of a planet without assuming known surfaces to begin with. We present a reanalysis of disk-integrated, time-resolved, multiband photometry of Earth obtained by the Deep Impact spacecraft as part of the EPOXI Mission of Opportunity. We extract reflectance spectra of clouds, ocean, and land without a priori knowledge of the numbers or colors of these surfaces. We show that the inverse problem of extracting surface spectra from such data is a novel and extreme instance of spectral unmixing, a well-studied problem in remote sensing. Principal component analysis is used to determine an appropriate number of model surfaces with which to interpret the data. Shrink-wrapping a simplex to the color excursions of the planet yields a conservative estimate of the planet's endmember spectra. The resulting surface maps are unphysical, however, requiring negative or larger-than-unity surface coverage at certain locations. Our "rotational unmixing" supersedes the endmember analysis by simultaneously solving for the surface spectra and their geographical distributions on the planet, under the assumption of diffuse reflection and known viewing geometry. We use a Markov Chain Monte Carlo to determine best-fit parameters and their uncertainties. The resulting albedo spectra are similar to clouds, ocean, and land seen through a Rayleigh-scattering atmosphere. This study suggests that future direct-imaging efforts could identify and map unknown surfaces and clouds on exoplanets.
An improved feature extraction algorithm based on KAZE for multi-spectral image
NASA Astrophysics Data System (ADS)
Yang, Jianping; Li, Jun
2018-02-01
Multi-spectral images contain abundant spectral information and are widely used in fields such as resource exploration, meteorological observation and modern military applications. Image preprocessing, such as feature extraction and matching, is indispensable when dealing with multi-spectral remote sensing images. Although feature matching algorithms based on a linear scale space, such as SIFT and SURF, are robust, their local accuracy cannot be guaranteed. Therefore, this paper proposes an improved KAZE algorithm, which is based on a nonlinear scale space, to increase the number of features and to enhance the matching rate by using an adjusted-cosine similarity measure. Experimental results show that the number of features and the matching rate of the improved KAZE are remarkably higher than those of the original KAZE algorithm.
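One common reading of the adjusted-cosine measure is a mean-centered cosine similarity between descriptors; a hedged sketch of descriptor matching with such a measure follows. The descriptors, threshold, and greedy matching strategy are illustrative assumptions rather than the paper's exact procedure.

    import numpy as np

    def adjusted_cosine(d1, d2):
        # Mean-centered cosine similarity, less sensitive to the intensity
        # offsets that are common between spectral bands.
        a, b = d1 - d1.mean(), d2 - d2.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def match(desc1, desc2, thresh=0.9):
        # Greedy matching: pair each descriptor in desc1 with its most
        # similar descriptor in desc2 if the similarity exceeds thresh.
        pairs = []
        for i, d in enumerate(desc1):
            sims = [adjusted_cosine(d, e) for e in desc2]
            j = int(np.argmax(sims))
            if sims[j] > thresh:
                pairs.append((i, j))
        return pairs

    desc1 = np.random.rand(5, 64)
    desc2 = desc1[[2, 0, 4]] + 0.01 * np.random.randn(3, 64)
    print(match(desc1, desc2))  # expected pairs near (2,0), (0,1), (4,2)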
Joint demosaicking and zooming using moderate spectral correlation and consistent edge map
NASA Astrophysics Data System (ADS)
Zhou, Dengwen; Dong, Weiming; Chen, Wengang
2014-07-01
The recently published joint demosaicking and zooming algorithms for single-sensor digital cameras all overfit the popular Kodak test images, which have been found to have higher spectral correlation than typical color images. Their performance may therefore degrade significantly on other datasets, such as the McMaster test images, which have weak spectral correlation. A new joint demosaicking and zooming algorithm is proposed for the Bayer color filter array (CFA) pattern, in which the edge direction information (edge map) extracted from the raw CFA data is consistently used in demosaicking and zooming. It also moderately utilizes the spectral correlation between color planes. The experimental results confirm that the proposed algorithm produces an excellent performance on both the Kodak and McMaster datasets in terms of both subjective and objective measures. Our algorithm also has high computational efficiency. It provides a better tradeoff among adaptability, performance, and computational cost compared to the existing algorithms.
Disaggregating tree and grass phenology in tropical savannas
NASA Astrophysics Data System (ADS)
Zhou, Qiang
Savannas are mixed tree-grass systems and, as one of the world's largest biomes, represent an important component of the Earth system affecting water and energy balances, carbon sequestration and biodiversity as well as supporting large human populations. Savanna vegetation structure and its distribution, however, may change because of major anthropogenic disturbances from climate change, wildfire, agriculture, and livestock production. The overstory and understory may have different water use strategies and different nutrient requirements, and may respond differently to fire and climate variation. Accurate measurement of the spatial distribution and structure of the overstory and understory is essential for understanding the savanna ecosystem. This project developed a workflow for separating the dynamics of the overstory and understory fractional cover in savannas at the continental scale (Australia, South America, and Africa). Previous studies have successfully separated the phenology of Australian savanna vegetation into persistent and seasonal greenness using time series decomposition, and into fractions of photosynthetic vegetation (PV), non-photosynthetic vegetation (NPV) and bare soil (BS) using linear unmixing. This study combined these methods to separate the understory and overstory signal in both the green and senescent phenological stages using remotely sensed imagery from the MODIS (MODerate resolution Imaging Spectroradiometer) sensor. The methods and parameters were adjusted based on the vegetation variation. The workflow was first tested at the Australian site. Here the PV estimates for overstory and understory showed the best performance; however, NPV estimates exhibited spatial variation in validation relationships. At the South American site (Cerrado), an additional method based on frequency unmixing was developed to separate green vegetation components with similar phenology. When the decomposition and frequency methods were compared, the frequency method was better for extracting the green tree phenology, but the original decomposition method was better for retrieval of understory grass phenology. Both methods, however, were less accurate in the Cerrado than in Australia due to intermingling and intergrading of grass and small woody components. Since African savanna trees are predominantly deciduous, the frequency method was combined with the linear unmixing of fractional cover to attempt to separate the relatively similar phenology of deciduous trees and seasonal grasses. The results for Africa revealed limitations associated with both methods. There was spatial and seasonal variation in the spectral indices used to unmix fractional cover, resulting in poor validation for NPV in particular. The frequency analysis revealed significant phase variation indicative of different phenology, but these phases could not be clearly ascribed to separate grass and tree components. Overall, the findings indicate that, given site-specific variation in vegetation structure and composition, the MODIS pixel resolution, and the simple vegetation index approach used, the workflow was not robust across the different savanna biomes. The approach showed generally better performance for estimating the PV fraction and separating green phenology, but there were major inconsistencies, errors and biases in the estimation of NPV and BS outside of the Australian savanna environment.
NASA Astrophysics Data System (ADS)
Doha, E.; Bhrawy, A.
2006-06-01
It is well known that spectral methods (tau, Galerkin, collocation) have a condition number of O(N^4), where N is the number of retained modes of the polynomial approximations. This paper presents some efficient spectral algorithms, which have a condition number of O(N^2), based on the Jacobi-Galerkin methods for second-order elliptic equations in one and two space variables. The key to the efficiency of these algorithms is to construct appropriate base functions, which lead to systems with specially structured matrices that can be efficiently inverted. The complexities of the algorithms are a small multiple of N^(d+1) operations for a d-dimensional domain with (N-1)^d unknowns, while the convergence rates of the algorithms are exponential for problems with smooth solutions.
Characterizing Drought Impacted Soils in the San Joaquin Valley of California Using Remote Sensing
NASA Astrophysics Data System (ADS)
Wahab, L. M.; Miller, D.; Roberts, D. A.
2017-12-01
California's San Joaquin Valley is an extremely agriculturally productive region of the country, and understanding the state of soils in this region is an important factor in maintaining this high productivity. In this study, we quantified changing soil cover during the drought and analyzed spatial changes in salinity, organic matter, and moisture using unique soil spectral characteristics. We used data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) acquired during Hyperspectral Infrared Imager (HyspIRI) campaign flights in 2013 and 2014 over the San Joaquin Valley. A mixture model was applied to both images that identified non-photosynthetic vegetation, green vegetation, and soil cover fractions through image endmembers of each of these three classes. We optimized the spectral library used to identify these classes with Iterative Endmember Selection (IES), and the images were unmixed using Multiple Endmember Spectral Mixture Analysis (MESMA). Maps of soil electrical conductivity, organic matter, soil saturated moisture, and field moisture were generated for the San Joaquin Valley based on indices developed by Ben-Dor et al. [2002]. Representative polygons were chosen to quantify changes between years. Maps of spectrally distinct soils were also generated for 2013 and 2014, in order to determine the spatial distribution of these soil types as well as their temporal dynamics between years. We estimated that soil cover increased by 16% from 2013 to 2014. Six spectrally distinct soil types were identified for the region, and it was determined that the distribution of these soil types was not constant for most areas between 2013 and 2014. Changes in soil pH, electrical conductivity, and soil moisture were strongly tied in the region between 2013 and 2014.
[Source apportionment of soil heavy metals in Jiapigou goldmine based on the UNMIX model].
Ai, Jian-chao; Wang, Ning; Yang, Jing
2014-09-01
This paper determines the concentrations of 16 metal elements in soil samples collected in the Jiapigou goldmine area in the upper reaches of the Songhua River. The UNMIX model, recommended by the US EPA for source apportionment, was applied in this study, and Cd, Hg, Pb and Ag concentration contour maps were generated using the Kriging interpolation method to verify the results. The main conclusions of this study are: (1) the concentrations of Cd, Hg, Pb and Ag exceeded the Jilin Province soil background values and were obviously enriched in the soil samples; (2) the UNMIX model resolved four pollution sources: source 1 represents human activities (transportation, ore mining and garbage) and contributes 39.1%; source 2 represents the weathering of rocks and biological effects and contributes 13.87%; source 3 is a combined source of soil parent material and chemical fertilizer and contributes 23.93%; source 4 represents iron ore mining and transportation and contributes 22.89%; (3) the UNMIX model results are in accordance with the survey of local land-use types, human activities, and the Cd, Hg and Pb content distributions.
Filtered gradient reconstruction algorithm for compressive spectral imaging
NASA Astrophysics Data System (ADS)
Mejia, Yuri; Arguello, Henry
2017-04-01
Compressive sensing matrices are traditionally based on random Gaussian and Bernoulli entries. Nevertheless, they are subject to physical constraints, and their structure rarely follows a dense matrix distribution; this is the case for the matrix related to compressive spectral imaging (CSI). The CSI matrix represents the integration of coded and shifted versions of the spectral bands. A spectral image can be recovered from CSI measurements by using iterative algorithms for linear inverse problems that minimize an objective function including a quadratic error term combined with a sparsity regularization term. However, current algorithms are slow because they do not exploit the structure and sparse characteristics of the CSI matrices. A gradient-based CSI reconstruction algorithm, which introduces a filtering step in each iteration of a conventional CSI reconstruction algorithm and yields improved image quality, is proposed. Motivated by the structure of the CSI matrix, Φ, this algorithm modifies the iterative solution such that it is forced to converge to a filtered version of the residual Φ^T y, where y is the compressive measurement vector. We show that the filter-based algorithm converges to better quality results than the unfiltered version. Simulation results highlight the relative performance gain over the existing iterative algorithms.
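A toy version of the filtered-gradient idea is sketched below: ordinary gradient descent on the data-fidelity term, with a small smoothing filter applied to the estimate at every iteration and soft-thresholding standing in for the sparsity term. The dense random sensing matrix and all parameter values are placeholders; a real CSI matrix is structured and sparse, which is precisely what the published algorithm exploits.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def filtered_gradient_recon(Phi, y, shape, iters=100, step=1e-3, lam=1e-3):
        # Gradient descent on ||Phi x - y||^2 with a smoothing "filter step"
        # each iteration and soft-thresholding as a simple sparsity proxy.
        x = np.zeros(Phi.shape[1])
        for _ in range(iters):
            grad = Phi.T @ (Phi @ x - y)
            x = x - step * grad
            x = uniform_filter(x.reshape(shape), size=3).ravel()  # filter step
            x = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)     # soft threshold
        return x.reshape(shape)

    rng = np.random.default_rng(0)
    shape = (16, 16)
    x_true = np.zeros(shape)
    x_true[4:8, 4:8] = 1.0
    Phi = rng.standard_normal((120, x_true.size))
    y = Phi @ x_true.ravel()
    x_hat = filtered_gradient_recon(Phi, y, shape)
    print(np.round(np.abs(x_hat - x_true).mean(), 3))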
Multiway spectral community detection in networks
NASA Astrophysics Data System (ADS)
Zhang, Xiao; Newman, M. E. J.
2015-11-01
One of the most widely used methods for community detection in networks is the maximization of the quality function known as modularity. Of the many maximization techniques that have been used in this context, some of the most conceptually attractive are the spectral methods, which are based on the eigenvectors of the modularity matrix. Spectral algorithms have, however, been limited, by and large, to the division of networks into only two or three communities, with divisions into more than three being achieved by repeated two-way division. Here we present a spectral algorithm that can directly divide a network into any number of communities. The algorithm makes use of a mapping from modularity maximization to a vector partitioning problem, combined with a fast heuristic for vector partitioning. We compare the performance of this spectral algorithm with previous approaches and find it to give superior results, particularly in cases where community sizes are unbalanced. We also give demonstrative applications of the algorithm to two real-world networks and find that it produces results in good agreement with expectations for the networks studied.
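A simplified sketch of multiway spectral partitioning with the modularity matrix follows: the leading eigenvectors define vertex vectors, which are then grouped into k communities. Here k-means stands in for the paper's vector-partitioning heuristic, so this is an illustration of the setup rather than the published algorithm.

    import numpy as np
    from scipy.cluster.vq import kmeans2

    def spectral_communities(A, k):
        # Divide a network into k communities using the k-1 leading
        # eigenvectors of the modularity matrix; k-means clusters the
        # resulting vertex vectors.
        degrees = A.sum(axis=1)
        m = degrees.sum() / 2.0
        B = A - np.outer(degrees, degrees) / (2.0 * m)   # modularity matrix
        vals, vecs = np.linalg.eigh(B)
        vertex_vectors = vecs[:, np.argsort(vals)[::-1][:k - 1]]
        _, labels = kmeans2(vertex_vectors, k, minit='++')
        return labels

    # Two obvious 4-node cliques joined by a single edge.
    A = np.zeros((8, 8))
    A[:4, :4] = 1
    A[4:, 4:] = 1
    np.fill_diagonal(A, 0)
    A[3, 4] = A[4, 3] = 1
    print(spectral_communities(A, 2))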
NASA Technical Reports Server (NTRS)
Matic, Roy M.; Mosley, Judith I.
1994-01-01
Future space-based, remote sensing systems will have data transmission requirements that exceed available downlinks, necessitating the use of lossy compression techniques for multispectral data. In this paper, we describe several algorithms for lossy compression of multispectral data which combine spectral decorrelation techniques with an adaptive, wavelet-based, image compression algorithm to exploit both spectral and spatial correlation. We compare the performance of several different spectral decorrelation techniques including wavelet transformation in the spectral dimension. The performance of each technique is evaluated at compression ratios ranging from 4:1 to 16:1. Performance measures used are visual examination, conventional distortion measures, and multispectral classification results. We also introduce a family of distortion metrics that are designed to quantify and predict the effect of compression artifacts on multispectral classification of the reconstructed data.
Estimation of Biochemical Constituents From Fresh, Green Leaves By Spectrum Matching Techniques
NASA Technical Reports Server (NTRS)
Goetz, A. F. H.; Gao, B. C.; Wessman, C. A.; Bowman, W. D.
1990-01-01
Estimation of biochemical constituents in vegetation such as lignin, cellulose, starch, sugar and protein by remote sensing methods is an important goal in ecological research. The spectral reflectances of dried leaves exhibit diagnostic absorption features which can be used to estimate the abundance of important constituents. Lignin and nitrogen concentrations have been obtained from canopies by use of imaging spectrometry and multiple linear regression techniques. The difficulty in identifying individual spectra of leaf constituents in the region beyond 1 micrometer is that liquid water contained in the leaf dominates the spectral reflectance of leaves in this region. By use of spectrum matching techniques, originally used to quantify whole column water abundance in the atmosphere and equivalent liquid water thickness in leaves, we have been able to remove the liquid water contribution to the spectrum. The residual spectra resemble spectra for cellulose in the 1.1 micrometer region, lignin in the 1.7 micrometer region, and starch in the 2.0-2.3 micrometer region. In the entire 1.0-2.3 micrometer region each of the major constituents contributes to the spectrum. Quantitative estimates will require using unmixing techniques on the residual spectra.
NASA Technical Reports Server (NTRS)
Kruse, Fred A.; Dwyer, John L.
1993-01-01
The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different techniques for calibrating AVIRIS data to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
Tensor Spectral Clustering for Partitioning Higher-order Network Structures.
Benson, Austin R; Gleich, David F; Leskovec, Jure
2015-01-01
Spectral graph theory-based methods represent an important class of tools for studying the structure of networks. Spectral methods are based on a first-order Markov chain derived from a random walk on the graph and thus they cannot take advantage of important higher-order network substructures such as triangles, cycles, and feed-forward loops. Here we propose a Tensor Spectral Clustering (TSC) algorithm that allows for modeling higher-order network structures in a graph partitioning framework. Our TSC algorithm allows the user to specify which higher-order network structures (cycles, feed-forward loops, etc.) should be preserved by the network clustering. Higher-order network structures of interest are represented using a tensor, which we then partition by developing a multilinear spectral method. Our framework can be applied to discovering layered flows in networks as well as graph anomaly detection, which we illustrate on synthetic networks. In directed networks, a higher-order structure of particular interest is the directed 3-cycle, which captures feedback loops in networks. We demonstrate that our TSC algorithm produces large partitions that cut fewer directed 3-cycles than standard spectral clustering algorithms.
Tensor Spectral Clustering for Partitioning Higher-order Network Structures
Benson, Austin R.; Gleich, David F.; Leskovec, Jure
2016-01-01
Spectral graph theory-based methods represent an important class of tools for studying the structure of networks. Spectral methods are based on a first-order Markov chain derived from a random walk on the graph and thus they cannot take advantage of important higher-order network substructures such as triangles, cycles, and feed-forward loops. Here we propose a Tensor Spectral Clustering (TSC) algorithm that allows for modeling higher-order network structures in a graph partitioning framework. Our TSC algorithm allows the user to specify which higher-order network structures (cycles, feed-forward loops, etc.) should be preserved by the network clustering. Higher-order network structures of interest are represented using a tensor, which we then partition by developing a multilinear spectral method. Our framework can be applied to discovering layered flows in networks as well as graph anomaly detection, which we illustrate on synthetic networks. In directed networks, a higher-order structure of particular interest is the directed 3-cycle, which captures feedback loops in networks. We demonstrate that our TSC algorithm produces large partitions that cut fewer directed 3-cycles than standard spectral clustering algorithms. PMID:27812399
NASA Astrophysics Data System (ADS)
Nolte, Lena; Antonopoulos, Georgios C.; Heisterkamp, Alexander; Ripken, Tammo; Meyer, Heiko
2018-02-01
Scanning laser optical tomography (SLOT) is a 3D imaging technique, based on the principle of computed tomography, that can visualize samples up to several centimeters in size. Intrinsic contrast mechanisms such as absorption, scattering and autofluorescence provide information about the 3D architecture and composition of the sample. Another valuable intrinsic contrast mechanism is second harmonic generation (SHG), which is generated in noncentrosymmetric materials and commonly used to image collagen in biological samples. The angular dependence of the SHG signal, however, produces artifacts in reconstructed optical tomography datasets (OPT, SLOT), impairing successful use of this intrinsic contrast mechanism. We investigate these artifacts by simulation and experiment and propose an elimination procedure that enables successful reconstruction of SHG-SLOT data. Nevertheless, in many cases specific labeling of certain structures is necessary to make them visible. Using multiple dyes in one sample can lead to crosstalk between the different channels and reduce image contrast; autofluorescence of the sample itself can also contribute to this. By using multispectral imaging in combination with spectral unmixing techniques, this loss can be compensated. This requires either a spectrally resolved detection path or spectrally resolved excitation. We therefore integrated a white-light supercontinuum source into our SLOT setup, enabling spectral selection of the excitation beam, and extended the detection path to a four-channel setup. This enables the detection of three fluorescence channels and one absorption channel in parallel, and increases the contrast in the reconstructed 3D images significantly.
Tetrachromacy of human vision: spectral channels and primary colors
NASA Astrophysics Data System (ADS)
Gavrik, Vitali V.
2002-06-01
Full-color imaging requires four channels since, in contrast to a colorimeter, it can add no primary to the matched scene colors themselves. An ideal imaging channel should have the same spectral sensitivity of scene recording as a retinal receptor and evoke the same primary color sensation. The alternating matching functions of a triad of real primaries are inconsistent with the three cones but explicable by two pairs of independent opponent receptors with their alternating blue-yellow and green-red chromatic axes in the color space. Much other controversy surrounding the trichromatic approach can also be explained by the recently proposed intra-receptor processes in the photopic rod and cone, respectively. Each of their four primary sensations, unmixed around 465, 495, 575, and 650 nm, is evoked within a different spectral region. The current trichromatic photographic systems have been found separately to approximate the blue and red receptors, as well as their spectral opponency against the respective yellow and blue-green receptors simulated with a single middle-wave imaging channel. The channel sensitivities are delimited by the neutral points of rod and cone and cannot simulate the necessary overlap of non-opponent channels needed to properly render some mixed colors. The yellow and cyan positive dyes closely control the brightness of the blue and red sensations, respectively. The red and blue dyes that would respectively control the yellow and blue-green sensations on brightness scales are replaced by a magenta dye, which controls them together. Accurate rendering of naturally saturated metameric colors, as well as problematic blue-green, purple-red, and low-illumination colors, requires replacing the hybrid 'green' channel with blue-green and yellow channels.
Analysis of Forest Foliage Using a Multivariate Mixture Model
NASA Technical Reports Server (NTRS)
Hlavka, C. A.; Peterson, David L.; Johnson, L. F.; Ganapol, B.
1997-01-01
Data comprising wet chemical measurements and near infrared spectra of ground leaf samples were analyzed to test a multivariate regression technique for estimating component spectra which is based on a linear mixture model for absorbance. The resulting unmixed spectra for carbohydrates, lignin, and protein resemble the spectra of extracted plant starches, cellulose, lignin, and protein. The unmixed protein spectrum has prominent absorption features at wavelengths which have been associated with nitrogen bonds.
Combining spatial and spectral information to improve crop/weed discrimination algorithms
NASA Astrophysics Data System (ADS)
Yan, L.; Jones, G.; Villette, S.; Paoli, J. N.; Gée, C.
2012-01-01
Reduction of herbicide spraying is an important key to improving weed management both environmentally and economically. To achieve this, remote sensors such as imaging systems are commonly used to detect weed plants. We developed spatial algorithms that detect the crop rows to discriminate crop from weeds. These algorithms have been thoroughly tested and provide robust and accurate results without a learning process, but their detection is limited to inter-row areas. Crop/weed discrimination using spectral information is able to detect intra-row weeds but generally needs a prior learning process. We propose a method based on spatial and spectral information to enhance the discrimination and overcome the limitations of both algorithms. The classification from the spatial algorithm is used to build the training set for the spectral discrimination method. With this approach we are able to improve the range of weed detection to the entire field (inter- and intra-row). To test the efficiency of these algorithms, a relevant database of virtual images derived from the SimAField model has been used and combined with the LOPEX93 spectral database. The developed method is evaluated and compared with the initial method in this paper and shows an important enhancement, from 86% weed detection to more than 95%.
Spectral band selection for classification of soil organic matter content
NASA Technical Reports Server (NTRS)
Henderson, Tracey L.; Szilagyi, Andrea; Baumgardner, Marion F.; Chen, Chih-Chien Thomas; Landgrebe, David A.
1989-01-01
This paper describes the spectral-band-selection (SBS) algorithm of Chen and Landgrebe (1987, 1988, and 1989) and uses the algorithm to classify the organic matter content in the earth's surface soil. The effectiveness of the algorithm was evaluated by comparing the results of classifying soil organic matter using SBS bands with those obtained using Landsat MSS bands and TM bands, showing that the algorithm was successful in finding important spectral bands for classification of organic matter content. Using the calculated bands, the probabilities of correct classification for climate-stratified data were found to range from 0.910 to 0.980.
Spectral CT metal artifact reduction with an optimization-based reconstruction algorithm
NASA Astrophysics Data System (ADS)
Gilat Schmidt, Taly; Barber, Rina F.; Sidky, Emil Y.
2017-03-01
Metal objects cause artifacts in computed tomography (CT) images. This work investigated the feasibility of a spectral CT method to reduce metal artifacts. Spectral CT acquisition combined with optimization-based reconstruction is proposed to reduce artifacts by modeling the physical effects that cause metal artifacts and by providing the flexibility to selectively remove corrupted spectral measurements in the spectral-sinogram space. The proposed Constrained 'One-Step' Spectral CT Image Reconstruction (cOSSCIR) algorithm directly estimates the basis material maps while enforcing convex constraints. The incorporation of constraints on the reconstructed basis material maps is expected to mitigate undersampling effects that occur when corrupted data is excluded from reconstruction. The feasibility of the cOSSCIR algorithm to reduce metal artifacts was investigated through simulations of a pelvis phantom. The cOSSCIR algorithm was investigated with and without the use of a third basis material representing metal. The effects of excluding data corrupted by metal were also investigated. The results demonstrated that the proposed cOSSCIR algorithm reduced metal artifacts and improved CT number accuracy. For example, CT number error in a bright shading artifact region was reduced from 403 HU in the reference filtered backprojection reconstruction to 33 HU using the proposed algorithm in simulation. In the dark shading regions, the error was reduced from 1141 HU to 25 HU. Of the investigated approaches, decomposing the data into three basis material maps and excluding the corrupted data demonstrated the greatest reduction in metal artifacts.
NASA Astrophysics Data System (ADS)
Ren, Ruizhi; Gu, Lingjia; Fu, Haoyang; Sun, Chenglin
2017-04-01
An effective super-resolution (SR) algorithm is proposed for actual spectral remote sensing images based on sparse representation and wavelet preprocessing. The proposed SR algorithm mainly consists of dictionary training and image reconstruction. Wavelet preprocessing is used to establish four subbands, i.e., low frequency, horizontal, vertical, and diagonal high frequency, for an input image. As compared to the traditional approaches involving the direct training of image patches, the proposed approach focuses on the training of features derived from these four subbands. The proposed algorithm is verified using different spectral remote sensing images, e.g., moderate-resolution imaging spectroradiometer (MODIS) images with different bands, and the latest Chinese Jilin-1 satellite images with high spatial resolution. According to the visual experimental results obtained from the MODIS remote sensing data, the SR images using the proposed SR algorithm are superior to those using a conventional bicubic interpolation algorithm or traditional SR algorithms without preprocessing. Fusion algorithms, e.g., standard intensity-hue-saturation, principal component analysis, wavelet transform, and the proposed SR algorithms are utilized to merge the multispectral and panchromatic images acquired by the Jilin-1 satellite. The effectiveness of the proposed SR algorithm is assessed by parameters such as peak signal-to-noise ratio, structural similarity index, correlation coefficient, root-mean-square error, relative dimensionless global error in synthesis, relative average spectral error, spectral angle mapper, and the quality index Q4, and its performance is better than that of the standard image fusion algorithms.
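The wavelet preprocessing step, splitting an input image into a low-frequency subband and horizontal, vertical, and diagonal high-frequency subbands, can be sketched as below, assuming the PyWavelets package and a Haar wavelet; the subsequent dictionary training and sparse reconstruction are not shown.

    import numpy as np
    import pywt  # PyWavelets, assumed available

    def wavelet_subband_features(image, wavelet='haar'):
        # First-level 2D wavelet decomposition into the four subbands that
        # would feed the dictionary training described above.
        cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
        return {'low': cA, 'horizontal': cH, 'vertical': cV, 'diagonal': cD}

    img = np.random.rand(64, 64)
    bands = wavelet_subband_features(img)
    print({name: band.shape for name, band in bands.items()})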
FIVQ algorithm for interference hyper-spectral image compression
NASA Astrophysics Data System (ADS)
Wen, Jia; Ma, Caiwen; Zhao, Junsuo
2014-07-01
Based on the improved vector quantization (IVQ) algorithm [1] proposed in 2012, this paper proposes a further improved vector quantization (FIVQ) algorithm for LASIS (Large Aperture Static Imaging Spectrometer) interference hyper-spectral image compression. To get better image quality, the IVQ algorithm takes both the mean values and the VQ indices as the encoding rules. Although the IVQ algorithm can improve both the bit rate and the image quality, it can still be further improved to obtain a much lower bit rate for the LASIS interference pattern, whose special optical characteristics arise from the pushing and sweeping in the LASIS imaging principle. In the proposed FIVQ algorithm, the neighborhood of each encoding block of the interference pattern image that uses the mean-value rule is checked to determine whether the neighboring blocks have the same mean value as the current processing block. Experiments show that the proposed FIVQ algorithm achieves a lower bit rate than the IVQ algorithm for LASIS interference hyper-spectral sequences.
Exploring Planetary Analogs With an Ultracompact Near-Infrared Reflectance Instrument
NASA Astrophysics Data System (ADS)
Sobron, P.; Wang, A.
2017-12-01
Orbital reflectance spectrometers provide unique measurements of mineralogical features globally and repeatedly on planets and moons of our solar system. Mounted on landed spacecraft, reflectance sensors enable fine-scale investigations and can provide ground truth analyses to assess the validity of spectral remote sensing. We have developed a miniaturized, field-ready, active source NIR (1.14-4.76 μm) reflectance spectrometer (WIR). WIR enables in-situ, near real-time identification of water (structural or adsorbed), carbonates, sulfates, hydrated silicates, as well as C-H & N-H bonds in organic species. WIR is suited for lander/rover deployment in two modes: 1) In Traverse Survey Mode WIR is integrated into a rover wheel and performs nonstop synchronized data collection with every revolution of the wheel; large numbers of data points can be collected during a rover traverse, informing the spatial distribution of mineral phases; 2) In Point-Check Mode WIR is mounted on a robotic arm of a rover/lander and deployed on selected targets at planetary surfaces, or installed inside an analytical lab where samples from a drill/scoop are delivered for detailed analysis. Over the past 10 years we have deployed WIR in planetary analog settings, including hydrothermal springs in Svalbard (Norway) and High Andes (Chile); Arctic volcanoes in Svalbard; Arctic springs and permafrost in Axel Heiberg (Canada); Antarctic ice-covered lakes; saline playas in hyperarid deserts in the Tibetan Plateau (China) and the Atacama; high elevation ore deposits in the Andes and the Abitibi gold belt region (Canada); lava tubes in California; and acidic waters in Rio Tinto (Spain). We have recorded in-situ NIR reflectance spectra from these analogues and used improved spectral unmixing algorithms to determine the mineralogical composition at these sites. We have observed minerals consistent with sedimentary, mineralogical, morphological, and geochemical processes, some of which have been observed/predicted on other planets. In select cases, WIR data has provided critical ground truthing for remote sensing mineralogical investigations. At the Meeting, we will discuss our in-situ WIR analyses and path forward towards developing a flight version of WIR.
Algorithms for Spectral Decomposition with Applications to Optical Plume Anomaly Detection
NASA Technical Reports Server (NTRS)
Srivastava, Ashok N.; Matthews, Bryan; Das, Santanu
2008-01-01
The analysis of spectral signals for features that represent physical phenomenon is ubiquitous in the science and engineering communities. There are two main approaches that can be taken to extract relevant features from these high-dimensional data streams. The first set of approaches relies on extracting features using a physics-based paradigm where the underlying physical mechanism that generates the spectra is used to infer the most important features in the data stream. We focus on a complementary methodology that uses a data-driven technique that is informed by the underlying physics but also has the ability to adapt to unmodeled system attributes and dynamics. We discuss the following four algorithms: Spectral Decomposition Algorithm (SDA), Non-Negative Matrix Factorization (NMF), Independent Component Analysis (ICA) and Principal Components Analysis (PCA) and compare their performance on a spectral emulator which we use to generate artificial data with known statistical properties. This spectral emulator mimics the real-world phenomena arising from the plume of the space shuttle main engine and can be used to validate the results that arise from various spectral decomposition algorithms and is very useful for situations where real-world systems have very low probabilities of fault or failure. Our results indicate that methods like SDA and NMF provide a straightforward way of incorporating prior physical knowledge while NMF with a tuning mechanism can give superior performance on some tests. We demonstrate these algorithms to detect potential system-health issues on data from a spectral emulator with tunable health parameters.
Speech enhancement based on modified phase-opponency detectors
NASA Astrophysics Data System (ADS)
Deshmukh, Om D.; Espy-Wilson, Carol Y.
2005-09-01
A speech enhancement algorithm based on a neural model was presented by Deshmukh et al. [149th Meeting of the Acoustical Society of America, 2005]. The algorithm consists of a bank of Modified Phase Opponency (MPO) filter pairs tuned to different center frequencies. This algorithm is able to enhance salient spectral features in speech signals even at low signal-to-noise ratios. However, the algorithm introduces musical noise and sometimes misses a spectral peak that is close in frequency to a stronger spectral peak. Refinements in the design of the MPO filters were recently made that take advantage of the falling spectrum of the speech signal in sonorant regions. The modified set of filters leads to better separation of the noise and speech signals, and more accurate enhancement of spectral peaks. The improvements also lead to a significant reduction in musical noise. Continuity algorithms based on the properties of speech signals are used to further reduce the musical noise effect. The efficiency of the proposed method in enhancing the speech signal when the level of the background noise is fluctuating will be demonstrated. The performance of the improved speech enhancement method will be compared with various spectral subtraction-based methods. [Work supported by NSF BCS0236707.]
Impact of JPEG2000 compression on spatial-spectral endmember extraction from hyperspectral data
NASA Astrophysics Data System (ADS)
Martín, Gabriel; Ruiz, V. G.; Plaza, Antonio; Ortiz, Juan P.; García, Inmaculada
2009-08-01
Hyperspectral image compression has received considerable interest in recent years. However, an important issue that has not been investigated in the past is the impact of lossy compression on spectral mixture analysis applications, which characterize mixed pixels in terms of a suitable combination of spectrally pure substances (called endmembers) weighted by their estimated fractional abundances. In this paper, we specifically investigate the impact of JPEG2000 compression of hyperspectral images on the quality of the endmembers extracted by algorithms that incorporate both the spectral and the spatial information (useful for incorporating contextual information in the spectral endmember search). The two considered algorithms are the automatic morphological endmember extraction (AMEE) and the spatial-spectral endmember extraction (SSEE) techniques. Experimental results are conducted using a well-known data set collected by AVIRIS over the Cuprite mining district in Nevada, with detailed ground-truth information available from the U.S. Geological Survey. Our experiments reveal some interesting findings that may be useful to specialists applying spatial-spectral endmember extraction algorithms to compressed hyperspectral imagery.
Long-term retention as a function of word concreteness under conditions of free recall.
Postman, L; Burns, S
1974-07-01
Acquisition and long-term retention of concrete (C) and abstract (A) words were investigated under conditions of multiple-trial free recall. Both unmixed and mixed lists were used in original learning. Retention was tested either 1 min or 1 week after attainment of the learning criterion. Acquisition was faster and retention was higher for C than for A words. These differences were more pronounced for mixed than for unmixed lists.
Jonasson, U; Jonasson, B; Saldeen, T
1999-07-26
In Sweden, the frequency of fatal poisoning by dextropropoxyphene (DXP) ingestion is consistently high. There are seven preparations containing DXP on the Swedish market; in three of them DXP is the sole analgesic ingredient, while four of them are combinations of analgesics. In an attempt to assess the death rate attributable to each DXP preparation on the basis of toxicological analyses, altogether 834 cases of dextropropoxyphene-related death over a 5-year period (1992-1996) in Sweden have been reviewed. The ratio between the number of fatal poisonings and the prescribed defined daily doses per 1000 inhabitants during a 12-month period (DDD) was determined. The highest ratio, 27, was attributed to unmixed preparations. The ratio for DXP + paracetamol-related deaths was 6.3, and for DXP + phenazone, 6.4, while the lowest ratio, 2, was found among the DXP + chlorzoxazone cases. The unmixed preparations, representing 26% of all DXP prescriptions during the study years, were implicated in 62% of the DXP fatalities, a considerable over-representation. Unmixed preparations, with their higher content of DXP, may be more attractive for many consumers because of their narcotic (euphoric) effects rather than for any analgesic superiority. Another possibility is that unmixed preparations may erroneously have been regarded as safer than preparations combined with paracetamol, as poisonings with compounds containing DXP + paracetamol have been the most frequently reported, probably due to their predominance on the market.
What Stroop tasks can tell us about selective attention from childhood to adulthood.
Wright, Barlow C
2017-08-01
A rich body of research concerns the causes of Stroop effects and the applications of Stroop tasks. However, several questions remain. We included assessment of errors with children and adults (N = 316), who sat either a task wherein each block employed only trials of one type (unmixed task) or one where every block comprised a mix of congruent, neutral, and incongruent trials (mixed task). Children responded more slowly than adults and made more errors on each task. Contrary to some previous studies, interference (the difference between the neutral and incongruent conditions) showed no reaction time (RT) differences by group or task, although there were differences in errors. By contrast, facilitation (the difference between the neutral and congruent conditions) was greater in children than adults, and greater on the unmixed task than the mixed task. After considering a number of theoretical accounts, we settle on the inadvertent word-reading hypothesis, whereby facilitation stems from children and the unmixed task promoting inadvertent reading, particularly in the congruent condition. Stability of interference RT is explained by fixed semantic differences between the neutral and incongruent conditions, for children versus adults and for the unmixed versus mixed task. We conclude that utilizing the two tasks together may reveal more about how attention is affected in other groups. © 2016 The Authors. British Journal of Psychology published by John Wiley & Sons Ltd on behalf of the British Psychological Society.
Multi-pass encoding of hyperspectral imagery with spectral quality control
NASA Astrophysics Data System (ADS)
Wasson, Steven; Walker, William
2015-05-01
Multi-pass encoding is a technique employed in the field of video compression that maximizes the quality of an encoded video sequence within the constraints of a specified bit rate. This paper presents research where multi-pass encoding is extended to the field of hyperspectral image compression. Unlike video, which is primarily intended to be viewed by a human observer, hyperspectral imagery is processed by computational algorithms that generally attempt to classify the pixel spectra within the imagery. As such, these algorithms are more sensitive to distortion in the spectral dimension of the image than they are to perceptual distortion in the spatial dimension. The compression algorithm developed for this research, which uses the Karhunen-Loeve transform for spectral decorrelation followed by a modified H.264/Advanced Video Coding (AVC) encoder, maintains a user-specified spectral quality level while maximizing the compression ratio throughout the encoding process. The compression performance may be considered near-lossless in certain scenarios. For qualitative purposes, this paper presents the performance of the compression algorithm for several Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Hyperion datasets using spectral angle as the spectral quality assessment function. Specifically, the compression performance is illustrated in the form of rate-distortion curves that plot spectral angle versus bits per pixel per band (bpppb).
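The spectral-quality control loop needs a quality assessment function; a sketch of the mean spectral angle between an original and a reconstructed cube follows. The cube is synthetic, and the multi-pass rate-control logic that would consume this metric is only indicated in a comment.

    import numpy as np

    def mean_spectral_angle(original, reconstructed):
        # Mean per-pixel spectral angle (radians) between an original and a
        # reconstructed cube of shape (rows, cols, bands).
        o = original.reshape(-1, original.shape[-1])
        r = reconstructed.reshape(-1, reconstructed.shape[-1])
        cos = np.sum(o * r, axis=1) / (
            np.linalg.norm(o, axis=1) * np.linalg.norm(r, axis=1) + 1e-12)
        return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))

    # A multi-pass loop would tighten the quantization step until this
    # metric falls below the user-specified spectral-quality target.
    cube = np.random.rand(8, 8, 50)
    degraded = cube + 0.01 * np.random.randn(8, 8, 50)
    print(mean_spectral_angle(cube, degraded))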
Accuracy Improvement for Light-Emitting-Diode-Based Colorimeter by Iterative Algorithm
NASA Astrophysics Data System (ADS)
Yang, Pao-Keng
2011-09-01
We present a simple algorithm, combining an interpolating method with an iterative calculation, to enhance the resolution of spectral reflectance by removing the spectral broadening effect due to the finite bandwidth of the light-emitting diode (LED) from it. The proposed algorithm can be used to improve the accuracy of a reflective colorimeter using multicolor LEDs as probing light sources and is also applicable to the case when the probing LEDs have different bandwidths in different spectral ranges, to which the powerful deconvolution method cannot be applied.
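A minimal sketch of one possible interpolate-and-iterate scheme of this kind is shown below; it is not the authors' exact algorithm, and the LED profile matrix and function names are assumed for illustration. Each LED measurement is modelled as a weighted average of the true reflectance over that LED's band, and the residual at the LED centre wavelengths is repeatedly interpolated back onto the fine wavelength grid.

```python
import numpy as np

def debroaden_reflectance(centers, measured, wl, led_profiles, n_iter=30):
    """
    Iteratively remove LED spectral broadening from a coarse reflectance measurement.
    centers      : central wavelengths of the probing LEDs (increasing, length n_led)
    measured     : reflectance values measured with each LED (length n_led)
    wl           : fine wavelength grid (length n_wl)
    led_profiles : (n_led, n_wl) array, each row an LED emission spectrum on `wl`
    """
    weights = led_profiles / led_profiles.sum(axis=1, keepdims=True)
    estimate = np.interp(wl, centers, measured)            # initial guess by interpolation
    for _ in range(n_iter):
        simulated = weights @ estimate                     # what each LED would "see"
        correction = np.interp(wl, centers, measured - simulated)
        estimate = np.clip(estimate + correction, 0.0, None)  # keep reflectance non-negative
    return estimate
```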
NASA Astrophysics Data System (ADS)
Cheng, Liantao; Zhang, Fenghui; Kang, Xiaoyu; Wang, Lang
2018-05-01
In evolutionary population synthesis (EPS) models, we need to convert stellar evolutionary parameters into spectra via interpolation in a stellar spectral library. For theoretical stellar spectral libraries, the spectrum grid is homogeneous on the effective-temperature and gravity plane for a given metallicity. It is relatively easy to derive stellar spectra. For empirical stellar spectral libraries, stellar parameters are irregularly distributed and the interpolation algorithm is relatively complicated. In those EPS models that use empirical stellar spectral libraries, different algorithms are used and the codes are often not released. Moreover, these algorithms are often complicated. In this work, based on a radial basis function (RBF) network, we present a new spectrum interpolation algorithm and its code. Compared with the other interpolation algorithms that are used in EPS models, it can be easily understood and is highly efficient in terms of computation. The code is written in MATLAB scripts and can be used on any computer system. Using it, we can obtain the interpolated spectra from a library or a combination of libraries. We apply this algorithm to several stellar spectral libraries (such as MILES, ELODIE-3.1 and STELIB-3.2) and give the integrated spectral energy distributions (ISEDs) of stellar populations (with ages from 1 Myr to 14 Gyr) by combining them with Yunnan-III isochrones. Our results show that the differences caused by the adoption of different EPS model components are less than 0.2 dex. All data about the stellar population ISEDs in this work and the RBF spectrum interpolation code can be obtained by request from the first author or downloaded from http://www1.ynao.ac.cn/˜zhangfh.
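A minimal sketch of RBF-based spectrum interpolation, written in Python rather than the authors' MATLAB and using Gaussian basis functions with placeholder parameter names, might look like this:

```python
import numpy as np

def rbf_train(params, spectra, epsilon=1.0):
    """
    Fit Gaussian RBF weights so that spectra ~ Phi @ W.
    params  : (n_stars, 3) array of stellar parameters, e.g. (log Teff, log g, [Fe/H])
    spectra : (n_stars, n_wl) array of library spectra
    """
    d2 = ((params[:, None, :] - params[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-epsilon * d2)                       # (n_stars, n_stars) design matrix
    weights = np.linalg.solve(phi + 1e-8 * np.eye(len(params)), spectra)
    return weights

def rbf_predict(params, weights, query, epsilon=1.0):
    """Interpolate a spectrum at new stellar parameters `query` (shape (3,))."""
    d2 = ((params - query[None, :]) ** 2).sum(-1)
    phi = np.exp(-epsilon * d2)                       # (n_stars,)
    return phi @ weights
```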
A Spectral Algorithm for Envelope Reduction of Sparse Matrices
NASA Technical Reports Server (NTRS)
Barnard, Stephen T.; Pothen, Alex; Simon, Horst D.
1993-01-01
The problem of reordering a sparse symmetric matrix to reduce its envelope size is considered. A new spectral algorithm for computing an envelope-reducing reordering is obtained by associating a Laplacian matrix with the given matrix and then sorting the components of a specified eigenvector of the Laplacian. This Laplacian eigenvector solves a continuous relaxation of a discrete problem related to envelope minimization called the minimum 2-sum problem. The permutation vector computed by the spectral algorithm is a closest permutation vector to the specified Laplacian eigenvector. Numerical results show that the new reordering algorithm usually computes smaller envelope sizes than those obtained from the current standard algorithms such as Gibbs-Poole-Stockmeyer (GPS) or SPARSPAK reverse Cuthill-McKee (RCM), in some cases reducing the envelope by more than a factor of two.
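A compact sketch of the spectral reordering idea, assuming a small dense symmetric matrix and a connected adjacency graph, is given below; a production implementation would of course use a sparse Lanczos or multilevel eigensolver rather than a full dense decomposition.

```python
import numpy as np

def spectral_reordering(A):
    """
    Envelope-reducing permutation for a symmetric matrix A (given dense here):
    build the Laplacian of the nonzero structure, take the eigenvector of the
    second-smallest eigenvalue (the Fiedler vector), and sort its components.
    """
    adj = (A != 0).astype(float)
    np.fill_diagonal(adj, 0.0)
    lap = np.diag(adj.sum(axis=1)) - adj
    vals, vecs = np.linalg.eigh(lap)
    fiedler = vecs[:, 1]                 # eigenvector of the second-smallest eigenvalue
    return np.argsort(fiedler)           # permutation of row/column indices
```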
NASA Technical Reports Server (NTRS)
Phinney, D. E. (Principal Investigator)
1980-01-01
An algorithm for estimating spectral crop calendar shifts of spring small grains was applied to 1978 spring wheat fields. The algorithm provides estimates of the date of peak spectral response by maximizing the cross correlation between a reference profile and the observed multitemporal pattern of Kauth-Thomas greenness for a field. A methodology was developed for estimation of crop development stage from the date of peak spectral response. Evaluation studies showed that the algorithm provided stable estimates with no geographical bias. Crop development stage estimates had a root mean square error near 10 days. The algorithm was recommended for comparative testing against other models which are candidates for use in AgRISTARS experiments.
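A hedged sketch of the peak-date estimation step, assuming a multitemporal greenness series and a reference profile sampled on its own day axis, could look like the following; the shift search and names are illustrative, not the operational algorithm.

```python
import numpy as np

def peak_greenness_date(days, greenness, ref_days, ref_profile, shifts):
    """
    Estimate the spectral crop calendar shift by maximizing the cross correlation
    between a reference greenness profile and the observed multitemporal profile.
    `shifts` is an array of candidate shifts in days.
    """
    best_shift, best_corr = None, -np.inf
    for s in shifts:
        ref_on_obs = np.interp(days, ref_days + s, ref_profile)   # shifted reference
        c = np.corrcoef(greenness, ref_on_obs)[0, 1]
        if c > best_corr:
            best_shift, best_corr = s, c
    # Date of peak spectral response = reference peak date plus the best shift
    return ref_days[np.argmax(ref_profile)] + best_shift
```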
NASA Technical Reports Server (NTRS)
Gulick, V. C.; Morris, R. L.; Bishop, J.; Gazis, P.; Alena, R.; Sierhuis, M.
2002-01-01
We are developing science analyses algorithms to interface with a Geologist's Field Assistant device to allow robotic or human remote explorers to better sense their surroundings during limited surface excursions. Our algorithms will interpret spectral and imaging data obtained by various sensors. Additional information is contained in the original extended abstract.
Radiative absorption enhancement of dust mixed with anthropogenic pollution over East Asia
NASA Astrophysics Data System (ADS)
Tian, Pengfei; Zhang, Lei; Ma, Jianmin; Tang, Kai; Xu, Lili; Wang, Yuan; Cao, Xianjie; Liang, Jiening; Ji, Yuemeng; Jiang, Jonathan H.; Yung, Yuk L.; Zhang, Renyi
2018-06-01
The particle mixing state plays a significant yet poorly quantified role in aerosol radiative forcing, especially for the mixing of dust (mineral absorbing) and anthropogenic pollution (black carbon absorbing) over East Asia. We have investigated the absorption enhancement of mixed-type aerosols over East Asia by using the Aerosol Robotic Network observations and radiative transfer model calculations. The mixed-type aerosols exhibit significantly stronger absorption than the corresponding unmixed dust and anthropogenic aerosols, as revealed in the spectral behavior of absorbing aerosol optical depth, single scattering albedo, and imaginary refractive index. The aerosol radiative efficiencies for the dust, mixed-type, and anthropogenic aerosols are -101.0, -112.9, and -98.3 Wm-2 τ-1 at the bottom of the atmosphere (BOA); -42.3, -22.5, and -39.8 Wm-2 τ-1 at the top of the atmosphere (TOA); and 58.7, 90.3, and 58.5 Wm-2 τ-1 in the atmosphere (ATM), respectively. The BOA cooling and ATM heating efficiencies of the mixed-type aerosols are significantly higher than those of the unmixed aerosol types over the East Asia region, resulting in atmospheric stabilization. In addition, the mixed-type aerosols correspond to a lower TOA cooling efficiency, indicating that the cooling effect by the corresponding individual aerosol components is partially counteracted. We conclude that the interaction between dust and anthropogenic pollution not only represents a viable aerosol formation pathway but also results in unfavorable dispersion conditions, both exacerbating the regional air pollution in East Asia. Our results highlight the necessity to accurately account for the mixing state of aerosols in atmospheric models over East Asia in order to better understand the formation mechanism for regional air pollution and to assess its impacts on human health, weather, and climate.
Bhargava, Rohit; Perlman, Rebecca Schwartz; Fernandez, Daniel C; Levin, Ira W; Bartick, Edward G
2009-08-01
Current latent print and trace evidence collecting technologies are usually invasive and can be destructive to the original deposits. We describe a non-invasive vibrational spectroscopic approach for imaging latent fingerprints that are overlaid on top of one another or that may contain trace evidence that needs to be distinguished from the print. Because of the variation in the chemical composition distribution within the fingerprint, we demonstrate that linear unmixing applied to the spectral content of the data can be used to provide images that reveal superimposed fingerprints. In addition, we demonstrate that the chemical composition of the trace evidence located in the region of the print can potentially be identified by its infrared spectrum. Thus, trace evidence found at a crime scene that previously could not be directly related to an individual now has the potential to be directly related by its presence in the individual-identifying fingerprints.
Linear mixing model applied to AVHRR LAC data
NASA Technical Reports Server (NTRS)
Holben, Brent N.; Shimabukuro, Yosio E.
1993-01-01
A linear mixing model was applied to coarse spatial resolution data from the NOAA Advanced Very High Resolution Radiometer. The reflective component of the 3.55 - 3.93 microns channel was extracted and used with the two reflective channels 0.58 - 0.68 microns and 0.725 - 1.1 microns to run a Constrained Least Squares model to generate vegetation, soil, and shade fraction images for an area in the western region of Brazil. Landsat Thematic Mapper data covering the Emas National Park region were used for estimating the spectral response of the mixture components and for evaluating the mixing model results. The fraction images were compared with an unsupervised classification derived from Landsat TM data acquired on the same day. The relationship between the fraction images and normalized difference vegetation index images shows the potential of the unmixing techniques when using coarse resolution data for global studies.
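A minimal per-pixel sketch of constrained least-squares unmixing, with non-negativity handled by NNLS and the sum-to-one constraint imposed softly through a weighted extra row, is shown below; the matrix shapes and weighting are assumptions for illustration, not the study's implementation.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(endmembers, pixel, sum_weight=100.0):
    """
    Fully constrained least-squares unmixing of one pixel.
    endmembers : (n_bands, n_components) matrix of component spectra
                 (e.g. vegetation, soil, shade)
    pixel      : (n_bands,) observed reflectance
    The sum-to-one constraint is imposed softly by appending a heavily
    weighted row of ones; non-negativity is handled by NNLS.
    """
    A = np.vstack([endmembers, sum_weight * np.ones(endmembers.shape[1])])
    b = np.append(pixel, sum_weight)
    fractions, _ = nnls(A, b)
    return fractions
```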
NASA Astrophysics Data System (ADS)
Jiang, Kaili; Zhu, Jun; Tang, Bin
2017-12-01
Periodic nonuniform sampling occurs in many applications, and the Nyquist folding receiver (NYFR) is an efficient, low complexity, and broadband spectrum sensing architecture. In this paper, we first derive that the radio frequency (RF) sample clock function of the NYFR is periodic nonuniform. Then, the classical results of periodic nonuniform sampling are applied to the NYFR. We extend the spectral reconstruction algorithm of the time-series decomposition model to the subsampling case by using the spectrum characteristics of the NYFR. The subsampling case is common in broadband spectrum surveillance. Finally, we use a large-bandwidth LFM signal as an example to verify the proposed algorithm and compare the spectral reconstruction algorithm with the orthogonal matching pursuit (OMP) algorithm.
Comparison of three methods for materials identification and mapping with imaging spectroscopy
NASA Technical Reports Server (NTRS)
Clark, Roger N.; Swayze, Gregg; Boardman, Joe; Kruse, Fred
1993-01-01
We are comparing three methods of mapping analysis tools for imaging spectroscopy data. The purpose of this comparison is to understand the advantages and disadvantages of each algorithm so others would be better able to choose the best algorithm or combination of algorithms for a particular problem. The three algorithms are: (1) the spectral-feature modified-least-squares mapping algorithm of Clark et al. (1990, 1991): the programs mbandmap and tricorder; (2) the Spectral Angle Mapper algorithm (Boardman, 1993) found in the CU CSES SIPS package; and (3) the Expert System of Kruse et al. (1993). The comparison uses a ground-calibrated 1990 AVIRIS scene of 400 by 410 pixels over Cuprite, Nevada. Along with the test data set is a spectral library of 38 minerals. Each algorithm is tested with the same AVIRIS data set and spectral library. Field work has confirmed the presence of many of these minerals in the AVIRIS scene (Swayze et al. 1992).
A spectral, quasi-cylindrical and dispersion-free Particle-In-Cell algorithm
Lehe, Remi; Kirchen, Manuel; Andriyash, Igor A.; ...
2016-02-17
We propose a spectral Particle-In-Cell (PIC) algorithm that is based on the combination of a Hankel transform and a Fourier transform. For physical problems that have close-to-cylindrical symmetry, this algorithm can be much faster than full 3D PIC algorithms. In addition, unlike standard finite-difference PIC codes, the proposed algorithm is free of spurious numerical dispersion in vacuum. This algorithm is benchmarked in several situations that are of interest for laser-plasma interactions. These benchmarks show that it avoids a number of numerical artifacts that would otherwise affect the physics in a standard PIC algorithm, including the zero-order numerical Cherenkov effect.
NASA Astrophysics Data System (ADS)
Edgett, Kenneth S.
1996-10-01
INTRODUCTION: On Earth, aeolian sand dunes are used as tools of scientific inquiry. Holocene and Pleistocene dunes preserve clues about Quaternary climate variations and human activities ranging from Ice Age hunting practices to Twentieth Century warfare. Modern dunes contain the sedimentary textures and structures necessary for interpreting ancient sandstones, and they provide natural laboratories for investigation of aeolian physics and desertification processes. The dunes of Mars can likewise be used as scientific tools. Dunes provide relatively dust-free surfaces. From a remote sensing perspective, martian dunes have much potential for providing clues about surface mineralogy and the interaction between the surface and atmosphere. Such information can in turn provide insights regarding crust composition, volcanic evolution, present and past climate events, and perhaps weathering rates. The Mars Global Surveyor Thermal Emission Spectrometer (TES) is expected to reach the planet in September 1997. TES will provide 6 to 50 micrometer spectra of the martian surface at ground resolutions of 3 to 9 km. Sandy aeolian environments on Mars might provide key information about bedrock composition. To prepare for the TES investigation, I have been examining a thermal infrared image of a Mars-composition analog dune field in Christmas Lake Valley, Oregon.
COMPOSITION AND GEOLOGIC SETTING: The "Shifting Sand Dunes" dune field is located at the eastern end of Christmas Lake Valley, in what was once the Pleistocene Fort Rock Lake [1]. Much of the sand that makes up the Shifting Sand Dunes dune field is reworked Mt. Mazama airfall from its terminal eruption 6,800 years ago, plus material deflated from the lake bed [1, 2]. The main constituents of the dunes are volcanic glass and devitrified glass fragments, plagioclase crystals, basalt lithic fragments, aggregates of silt and clay-size volcanic ash, pyroxenes, opaque oxide minerals (mostly magnetite), and trace occurrences of fossil fragments and other minerals [3].
THERMAL INFRARED IMAGE: The thermal infrared image used in this study was obtained by the NASA Ames Research Center C-130 Earth Resources airborne Thermal Infrared Multispectral Scanner (TIMS) on 21 September 1991. The image has 6 spectral bands between 8 and 12 micrometers and a ground resolution of 9 m/pixel. The raw image was converted to calibrated radiance, from which normalized emittance was computed for each of the six bands, following the method of Realmuto [4]. Atmospheric effects were corrected using an empirical method described by Edgett and Anderson [5]. The resulting 6-band image provides quantitative determination of the surface emissivity. Dune spectra in the image match spectra obtained in our laboratory using samples collected from the field area [3, 5].
ACTIVE DUNES, INACTIVE DUNES, AND INTERDUNE AREAS FROM EMISSIVITY VARIATION: This study shows that in a modern dune field, the location of active dunes, interdune surfaces, and inactive dunes can be mapped using emissivity in the thermal infrared band that shows the most spectral variation [6]. In this case, TIMS band 3 (9.2 micrometers) had the most variation, although the entire emissivity range was only from 0.89 to 1.0. Active dunes had the lowest emissivities (0.89 to 0.91), inactive dunes were distinguished by higher emissivities (0.94 to 1.0), and interdune surfaces had intermediate values (0.90 to 0.95). These emissivity variations result from differences in particle size, as inactive dunes tend to have finer-grained silt and dust on them.
LINEAR UNMIXING USING IMAGE ENDMEMBERS: Quantitative estimates of thermal infrared spectral emissivity are ideally suited to unmixing analysis. For grains larger than the wavelength (e.g., dune sand), a linear unmixing approach provides geologically useful results [7]. In the present study, image endmembers were selected for a preliminary unmixing study: (1) "regular sand," which contains nearly 50% plagioclase and nearly 20% volcanic glass; (2) "dark sand," which consists mainly of basalt clasts (> 25%) and glass (> 30%); (3) "mud chips," which are volcanic ash aggregates broken into sand-sized pieces; (4) sagebrush and grass; and (5) thick vegetation, such as an alfalfa farm near the dunes. The most important result of this preliminary unmixing work is an image that shows the distribution of ash aggregates and "dark sand," both of which vary throughout the dune field as a function of proximity to the source. The volcanic ash aggregates, in particular, are locally eroded from a layer that caps the Pleistocene lake beds that underlie the dunes [3].
SUMMARY: This study highlights the use of thermal infrared spectra to map local contributions of sand to a dune field, and to distinguish active versus inactive dune fields. Mapping of local contributions to active dune fields on Mars using TES or other multispectral images has potential to provide indications of local bedrock composition.
REFERENCES: [1] Allison, I. S. (1979) Oregon Dept. Geol. Minl. Res. Spec. Pap. 7. [2] Dole, H. M. (1942) M.S. Thesis, Oregon State, Corvallis, OR. [3] Edgett, K. S. (1994) in Ph.D. Diss., pp. 145-201, Arizona State, Tempe, AZ. [4] Realmuto, V. J. (1990) in JPL Publ. 90-55, pp. 31-35. [5] Edgett, K. S., and D. L. Anderson (1995) in JPL Publ. 95-1, v. 2, pp. 9-12. [6] Edgett, K. S. et al. (1995) in JPL Publ. 95-1, v. 2, pp. 13-16. [7] Ramsey, M. S. (1996) Ph.D. Diss., Arizona State, Tempe, AZ.
Forest Cover Mapping in Iskandar Malaysia Using Satellite Data
NASA Astrophysics Data System (ADS)
Kanniah, K. D.; Mohd Najib, N. E.; Vu, T. T.
2016-09-01
Malaysia has the third largest loss of forest cover in the world. Therefore, timely information on forest cover is required to help the government ensure that the remaining forest resources are managed in a sustainable manner. This study aims to map and detect changes of forest cover (deforestation and disturbance) in the Iskandar Malaysia region in the south of Peninsular Malaysia between the years 1990 and 2010 using Landsat satellite images. The Carnegie Landsat Analysis System-Lite (CLASlite) programme was used to classify forest cover using Landsat images. This software is able to mask out clouds, cloud shadows, terrain shadows, and water bodies and to atmospherically correct the images using the 6S radiative transfer model. An Automated Monte Carlo Unmixing technique embedded in CLASlite was used to unmix each Landsat pixel into fractions of photosynthetic vegetation (PV), non-photosynthetic vegetation (NPV) and soil surface (S). Forest and non-forest areas were produced from the fractional cover images using appropriate threshold values of PV, NPV and S. CLASlite was found to be able to classify forest cover in Iskandar Malaysia with a difference of only 14% (1990) and 5% (2010) compared with the forest land use map produced by the Department of Agriculture, Malaysia. Nevertheless, the automated CLASlite software used in this study was unable to exclude other vegetation types, especially rubber and oil palm, which have reflectance similar to forest. In this study, rubber and oil palm were discriminated from forest manually using land use maps. Therefore, the CLASlite algorithm needs further adjustment to exclude these vegetation types and classify only forest cover.
Validating SWE reconstruction using Airborne Snow Observatory measurements in the Sierra Nevada
NASA Astrophysics Data System (ADS)
Bair, N.; Rittger, K.; Davis, R. E.; Dozier, J.
2015-12-01
The Airborne Snow Observatory (ASO) program offers high resolution estimates of snow water equivalent (SWE) in several small basins across California during the melt season. Primarily, water managers use this information to model snowmelt runoff into reservoirs. Another, and potentially more impactful, use of ASO SWE measurements is in validating and improving satellite-based SWE estimates, which can be used in austere regions with no ground-based snow or water measurements, such as Afghanistan's Hindu Kush. Using the entire ASO dataset to date (2013-2015), which is mostly from the Upper Tuolumne basin but also includes measurements from 2015 in the Kings, Rush Creek, Merced, and Mammoth Lakes basins, we compare ASO measurements to those from a SWE reconstruction method. Briefly, SWE reconstruction involves downscaling energy balance forcings to compute potential melt energy, then using satellite-derived estimates of fractional snow covered area (fSCA) to estimate snowmelt from potential melt. The snowpack can then be built in reverse, given a remotely sensed date of snow disappearance (fSCA = 0). Our model has improvements over previous iterations in that it: uses the full energy balance (compared to a modified degree-day) approach, models bulk and surface snow temperatures, accounts for ephemeral snow, and uses a remotely sensed snow albedo adjusted for impurities. To check that ASO provides accurate snow measurements, we compare fSCA derived from ASO snow depth at 3 m resolution with fSCA from a spectral unmixing algorithm for Landsat at 30 m, and with binary SCA estimates from GeoEye at 0.5 m derived from supervised classification. To conclude, we document how our reconstruction model has evolved over the years and provide specific examples where improvements have been made using ASO and other verification sources.
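The reverse-accumulation idea can be sketched for a single pixel as follows, under the simplifying assumption that daily melt equals potential melt scaled by fSCA; the variable names and units are illustrative, not the authors' model.

```python
import numpy as np

def reconstruct_swe(potential_melt, fsca):
    """
    Retrospective SWE reconstruction for one pixel.
    potential_melt : daily potential melt (mm w.e.) from downscaled energy balance
    fsca           : daily fractional snow-covered area (0-1), equal to 0 after
                     the remotely sensed date of snow disappearance
    Daily melt is potential melt scaled by fSCA; SWE at the start of day t is
    the total melt that occurs from day t until disappearance, so the snowpack
    is built in reverse.
    """
    daily_melt = potential_melt * fsca
    swe = np.cumsum(daily_melt[::-1])[::-1]   # SWE at the start of each day
    return swe
```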
A Subsystem Test Bed for Chinese Spectral Radioheliograph
NASA Astrophysics Data System (ADS)
Zhao, An; Yan, Yihua; Wang, Wei
2014-11-01
The Chinese Spectral Radioheliograph (CSRH) is a solar-dedicated radio interferometric array that will produce high spatial resolution, high temporal resolution, and high spectral resolution images of the Sun simultaneously in the decimetre and centimetre wave ranges. Digital processing of the intermediate frequency (IF) signal is an important part of a radio telescope. This paper describes a flexible, high-speed digital down conversion (DDC) system for the CSRH that applies complex mixing, parallel filtering, and extraction (decimation) algorithms to process the IF signal, and incorporates canonic-signed-digit coding and a bit-plane method to improve program efficiency. The DDC system is intended to be a subsystem test bed for simulation and testing for CSRH. Software algorithms for simulation and FPGA-based hardware-description algorithms were written; they use fewer hardware resources while achieving high performance, such as processing a high-speed data stream (1 GHz) with 10 MHz spectral resolution. An experiment with the test bed is illustrated using geostationary satellite data observed on March 20, 2014. Because the algorithms on the FPGA are easy to alter, the data can be recomputed with different digital signal processing algorithms in order to select the optimum algorithm.
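A simple software model of the DDC chain (complex mixing, low-pass filtering, and extraction), ignoring the FPGA-specific parallelization and coding optimizations described above, might look like this sketch; the parameters are placeholders.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def ddc(if_samples, fs, f_center, decimation, n_taps=128):
    """
    Software model of a digital down converter: complex mixing to baseband,
    FIR low-pass filtering, and decimation (extraction).
    if_samples : real-valued IF samples at rate fs
    f_center   : IF centre frequency to bring to baseband (Hz)
    decimation : integer decimation factor
    """
    n = np.arange(len(if_samples))
    mixed = if_samples * np.exp(-2j * np.pi * f_center * n / fs)     # complex mixing
    lp = firwin(n_taps, cutoff=0.8 * (fs / decimation) / 2, fs=fs)   # anti-alias low-pass
    filtered = lfilter(lp, 1.0, mixed)
    return filtered[::decimation]                                     # extraction
```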
A real-time spectral mapper as an emerging diagnostic technology in biomedical sciences.
Epitropou, George; Kavvadias, Vassilis; Iliou, Dimitris; Stathopoulos, Efstathios; Balas, Costas
2013-01-01
Real time spectral imaging and mapping at video rates can have tremendous impact not only on diagnostic sciences but also on fundamental physiological problems. We report the first real-time spectral mapper based on the combination of snap-shot spectral imaging and spectral estimation algorithms. Performance evaluation revealed that six-band imaging combined with the Wiener algorithm provided high estimation accuracy, with error levels lying within the experimental noise. The high accuracy is accompanied by spectral mapping that is faster by three orders of magnitude than scanning spectral systems. This new technology is intended to enable spectral mapping at nearly video rates in all kinds of dynamic bio-optical effects as well as in applications where the target-probe relative position is randomly and rapidly changing.
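A hedged sketch of Wiener spectral estimation from a small number of bands is shown below; the band responsivities, training spectra, and noise level are assumed inputs, and this is the textbook estimator rather than the authors' exact implementation.

```python
import numpy as np

def wiener_estimator(responses, training_spectra, noise_var=1e-4):
    """
    Build a Wiener spectral-estimation matrix W so that spectrum ~ W @ band_values.
    responses        : (n_bands, n_wl) spectral responsivities of the imaging bands
    training_spectra : (n_samples, n_wl) representative reflectance spectra
    """
    Rs = np.cov(training_spectra, rowvar=False)              # spectral prior covariance
    Rn = noise_var * np.eye(responses.shape[0])              # sensor noise covariance
    W = Rs @ responses.T @ np.linalg.inv(responses @ Rs @ responses.T + Rn)
    return W

# usage: spectrum_estimate = W @ six_band_pixel
```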
An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method
NASA Astrophysics Data System (ADS)
Tang, J.
2012-01-01
The multiple signal classification (MUSIC) algorithm is introduced for estimating the variation periods of BL Lac objects. The principle of the MUSIC spectral analysis method and a theoretical analysis of its frequency resolution using analog signals are included. From the literature, we have collected extensive observational data of the BL Lac object S5 0716+714 in the V, R, and I bands from 1994 to 2008. The light variation periods of S5 0716+714 are obtained by means of the MUSIC spectral analysis method and the periodogram spectral analysis method. There exist two major periods, (3.33±0.08) years and (1.24±0.01) years, for all bands. The period estimates from the MUSIC spectral analysis method are compared with those from the periodogram spectral analysis method. MUSIC is a super-resolution algorithm that works with short data records and can be used to detect the variation periods of weak signals.
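A basic MUSIC pseudospectrum sketch is given below; it assumes the light curve has first been resampled onto a uniform time grid, and the snapshot length and frequency grid are illustrative choices rather than the authors' settings.

```python
import numpy as np

def music_pseudospectrum(x, n_sources, m, freqs, dt):
    """
    MUSIC pseudospectrum of a (uniformly resampled) light curve x.
    n_sources : assumed number of real sinusoidal components
    m         : correlation-matrix order (must satisfy m > 2 * n_sources)
    freqs     : trial frequencies (cycles per time unit)
    dt        : sampling interval
    """
    x = x - x.mean()
    # Estimate the m x m correlation matrix from overlapping snapshots
    snapshots = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = snapshots.T @ snapshots / len(snapshots)
    vals, vecs = np.linalg.eigh(R)                # eigenvalues in ascending order
    noise_sub = vecs[:, : m - 2 * n_sources]      # smallest eigenvalues -> noise subspace
    p = []
    for f in freqs:
        a = np.exp(2j * np.pi * f * dt * np.arange(m))          # steering vector
        denom = np.linalg.norm(noise_sub.conj().T @ a) ** 2
        p.append(1.0 / denom)
    return np.array(p)    # peaks mark candidate periods (1 / f)
```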
NASA Astrophysics Data System (ADS)
Yadav, Deepti; Arora, M. K.; Tiwari, K. C.; Ghosh, J. K.
2016-04-01
Hyperspectral imaging is a powerful tool in the field of remote sensing and has been used for many applications such as mineral detection, detection of landmines, and target detection. Major issues in target detection using HSI are spectral variability, noise, the small size of targets, huge data dimensions, high computation cost, and complex backgrounds. Many popular detection algorithms do not work for difficult targets (e.g., small or camouflaged targets) and may produce high false alarm rates. Thus, target/background discrimination is a key issue, and analyzing a target's behaviour in realistic environments is therefore crucial for the accurate interpretation of hyperspectral imagery. Using standard libraries to study a target's spectral behaviour has the limitation that the targets are measured under environmental conditions different from those of the application. This study uses spectral data of the same targets that were present during collection of the HSI image. This paper analyzes target spectra so that each target can be spectrally distinguished from a mixture of spectral data. An artificial neural network (ANN) has been used to identify the spectral range for reducing the data, and its efficacy for improving target detection is then verified. The ANN results propose discriminating band ranges for the targets; these ranges were further used to perform target detection with four popular spectral matching target detection algorithms. The results of the algorithms were then analyzed using ROC curves to evaluate the effectiveness of the ANN-suggested ranges, relative to the full spectrum, for detecting the desired targets. In addition, a comparative assessment of the algorithms is also performed using ROC.
A complex guided spectral transform Lanczos method for studying quantum resonance states
Yu, Hua-Gen
2014-12-28
A complex guided spectral transform Lanczos (cGSTL) algorithm is proposed to compute both bound and resonance states, including energies, widths and wavefunctions. The algorithm comprises two layers of complex-symmetric Lanczos iterations. A short inner layer iteration produces a set of complex formally orthogonal Lanczos (cFOL) polynomials. They are used to span the guided spectral transform function determined by a retarded Green operator. An outer layer iteration is then carried out with the transform function to compute the eigen-pairs of the system. The guided spectral transform function is designed to have the same wavefunctions as the eigenstates of the original Hamiltonian in the spectral range of interest. Therefore the energies and/or widths of bound or resonance states can be easily computed with their wavefunctions or by using a root-searching method from the guided spectral transform surface. The new cGSTL algorithm is applied to bound and resonance states of HO₂, and compared to previous calculations.
Collection of endmembers and their separability for spectral unmixing in rangeland applications
NASA Astrophysics Data System (ADS)
Rolfson, David
Rangelands are an important resource to Alberta. Due to their size, mapping rangeland features is difficult. However, the use of aerial and satellite data for mapping has increased the area that can be studied at one time. The recent success in applying hyperspectral data to vegetation mapping has shown promise in rangeland classification. However, classification mapping of hyperspectral data requires existing data for input into classification algorithms. The research reported in this thesis focused on acquiring a seasonal inventory of in-situ reflectance spectra of rangeland plant species (endmembers) and comparing them to evaluate their separability as an indicator of their suitability for hyperspectral image classification analysis. The goals of this research also included determining the separability of species endmembers at different times of the growing season. In 2008, reflectance spectra were collected for three shrub species (Artemisia cana, Symphoricarpos occidentalis, and Rosa acicularis), five rangeland grass species native to southern Alberta (Koeleria gracilis, Stipa comata, Bouteloua gracilis, Agropyron smithii, Festuca idahoensis) and one invasive grass species (Agropyron cristatum). A spectral library, built using the SPECCHIO spectral database software, was populated using these spectroradiometric measurements with a focus on vegetation spectra. Average endmembers of plant spectra acquired during the peak of sample greenness were compared using three separability measures: normalized Euclidean distance (NED), correlation separability measure (CSM), and the Modified Spectral Angle Mapper (MSAM). These measures were used to establish the degree to which the species were separable. Results were normalized to values between 0 and 1, and values above the established thresholds indicate that the species were not separable. The endmembers for Agropyron cristatum, Agropyron smithii, and Rosa acicularis were not separable using CSM (threshold = 0.992) or MSAM (threshold = 0.970). NED (threshold = 0.950) was best able to separate species endmembers. Using reflectance data collected throughout the summer and fall, species endmembers obtained within two-week periods were analyzed using NED to plot their separability. As expected, separability of sample species changed as they progressed through their individual phenological patterns. Spectra collected at different solar zenith angles were compared to see if they affected the separability measures. Sample species endmembers were generally separable using NED during the periods in which they were measured and compared. However, Koeleria gracilis and Festuca idahoensis endmembers were inseparable from June to mid-August when measurements were taken at solar zenith angles between 25°-30° and 45°-60°. By contrast, between 30° and 45°, Bouteloua gracilis and Festuca idahoensis endmembers, normally separable at other solar zenith angles, became spectrally similar during the same sampling period. Findings suggest that the choice of separability measures is an important factor when analyzing hyperspectral data. The differences observed in the separability results over time also suggest that the consideration of phenological patterns in planning data acquisition for rangeland classification mapping has a high level of importance.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Alexandrov, B.
2014-12-01
The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. Source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones record the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones records a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site. We identify the sources as barometric pressure and water-supply pumping effects and estimate their impacts. We also estimate the location of the water-supply pumping wells based on the available data. The possible applications of the NMFk algorithm are not limited to hydrology problems; NMFk can be applied to any problem where temporal system behavior is observed at multiple locations and an unknown number of physical sources are causing these fluctuations.
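A rough NMFk-style sketch, using off-the-shelf NMF and k-means with a silhouette score to judge the reproducibility of candidate source counts, is shown below; the selection criterion and parameters are simplifications of the published method, not the LANL code.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def nmfk_estimate_sources(X, r_range, n_restarts=10):
    """
    For each candidate number of sources r (r >= 2), run NMF from several random
    starts, cluster the resulting source signatures with k-means, and use the
    silhouette score of the clustering to judge how reproducible that r is.
    X : (n_locations, n_times) non-negative matrix of observed transients
    """
    scores = {}
    for r in r_range:
        signatures = []
        for seed in range(n_restarts):
            model = NMF(n_components=r, init="random", random_state=seed, max_iter=500)
            W = model.fit_transform(X)           # mixing matrix (n_locations, r)
            H = model.components_                # source transients (r, n_times)
            signatures.append(H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12))
        S = np.vstack(signatures)                # all source signatures from all restarts
        labels = KMeans(n_clusters=r, n_init=10, random_state=0).fit_predict(S)
        scores[r] = silhouette_score(S, labels)
    best_r = max(scores, key=scores.get)
    return best_r, scores
```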
NASA Astrophysics Data System (ADS)
Shecter, Liat; Oiknine, Yaniv; August, Isaac; Stern, Adrian
2017-09-01
Recently we presented a Compressive Sensing Miniature Ultra-spectral Imaging System (CS-MUSI). This system consists of a single Liquid Crystal (LC) phase retarder as a spectral modulator and a gray scale sensor array to capture a multiplexed signal of the imaged scene. By designing the LC spectral modulator in compliance with the Compressive Sensing (CS) guidelines and applying appropriate algorithms, we demonstrated reconstruction of spectral (hyper/ultra) datacubes from an order of magnitude fewer samples than taken by conventional sensors. The LC modulator is designed to have an effective width of a few tens of micrometers; therefore, it is prone to imperfections and spatial nonuniformity. In this work, we present a study of this nonuniformity and a mathematical algorithm that allows the inference of the spectral transmission over the entire cell area from only a few calibration measurements.
NASA Astrophysics Data System (ADS)
Polak, Mark L.; Hall, Jeffrey L.; Herr, Kenneth C.
1995-08-01
We present a ratioing algorithm for quantitative analysis of the passive Fourier-transform infrared spectrum of a chemical plume. We show that the transmission of a near-field plume is given by tau_plume = (L_obsd - L_bb-plume)/(L_bkgd - L_bb-plume), where tau_plume is the frequency-dependent transmission of the plume, L_obsd is the spectral radiance of the scene that contains the plume, L_bkgd is the spectral radiance of the same scene without the plume, and L_bb-plume is the spectral radiance of a blackbody at the plume temperature. The algorithm simultaneously achieves background removal, elimination of the spectrometer internal signature, and quantification of the plume spectral transmission. It has applications to both real-time processing for plume visualization and quantitative measurements of plume column densities. The plume temperature (L_bb-plume), which is not always precisely known, can have a profound effect on the quantitative interpretation of the algorithm and is discussed in detail. Finally, we provide an illustrative example of the use of the algorithm on a trichloroethylene and acetone plume.
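The ratioing step can be sketched directly from the stated formula; the Planck-radiance helper below assumes SI wavelength units, so in practice the blackbody term must be evaluated in the same radiance units as the measured spectra, and the function names are illustrative.

```python
import numpy as np

def planck_radiance(wavelength_m, temperature_k):
    """Blackbody spectral radiance B(lambda, T) in W sr^-1 m^-3 (SI wavelength units)."""
    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / wavelength_m**5) / np.expm1(h * c / (wavelength_m * kB * temperature_k))

def plume_transmission(L_obsd, L_bkgd, wavelength_m, plume_temp_k):
    """tau_plume = (L_obsd - L_bb_plume) / (L_bkgd - L_bb_plume), per spectral channel."""
    L_bb = planck_radiance(wavelength_m, plume_temp_k)
    return (L_obsd - L_bb) / (L_bkgd - L_bb)
```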
Unmixed fuel processors and methods for using the same
Kulkarni, Parag Prakash; Cui, Zhe
2010-08-24
Disclosed herein are unmixed fuel processors and methods for using the same. In one embodiment, an unmixed fuel processor comprises: an oxidation reactor comprising an oxidation portion and a gasifier, a CO.sub.2 acceptor reactor, and a regeneration reactor. The oxidation portion comprises an air inlet, effluent outlet, and an oxygen transfer material. The gasifier comprises a solid hydrocarbon fuel inlet, a solids outlet, and a syngas outlet. The CO.sub.2 acceptor reactor comprises a water inlet, a hydrogen outlet, and a CO.sub.2 sorbent, and is configured to receive syngas from the gasifier. The regeneration reactor comprises a water inlet and a CO.sub.2 stream outlet. The regeneration reactor is configured to receive spent CO.sub.2 adsorption material from the gasification reactor and to return regenerated CO.sub.2 adsorption material to the gasification reactor, and configured to receive oxidized oxygen transfer material from the oxidation reactor and to return reduced oxygen transfer material to the oxidation reactor.
Super-Nyquist shaping and processing technologies for high-spectral-efficiency optical systems
NASA Astrophysics Data System (ADS)
Jia, Zhensheng; Chien, Hung-Chang; Zhang, Junwen; Dong, Ze; Cai, Yi; Yu, Jianjun
2013-12-01
Implementations of super-Nyquist pulse generation, either digitally using a digital-to-analog converter (DAC) or with an optical filter at the transmitter side, are introduced. Three corresponding receiver-side signal processing algorithms are presented and compared for high spectral-efficiency (SE) optical systems employing spectral prefiltering. These algorithms are designed to mitigate the inter-symbol-interference (ISI) and inter-channel-interference (ICI) impairments caused by the bandwidth constraint, and include a 1-tap constant modulus algorithm (CMA) with 3-tap maximum likelihood sequence estimation (MLSE), a regular CMA and digital filter with 2-tap MLSE, and a constant multi-modulus algorithm (CMMA) with 2-tap MLSE. The principles and prefiltering tolerance are given through numerical and experimental results.
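A minimal 1-tap CMA sketch (one of the receiver algorithms listed above, without the MLSE stage) is shown below; the step size and target radius are placeholder values.

```python
import numpy as np

def cma_1tap(symbols, mu=1e-3, radius=1.0):
    """
    1-tap constant modulus algorithm: a single complex coefficient is adapted to
    drive the output modulus toward `radius`.
    symbols : received complex samples at one sample per symbol
    """
    w = 1.0 + 0.0j
    out = np.empty_like(symbols, dtype=complex)
    for k, x in enumerate(symbols):
        y = w * x
        err = (np.abs(y) ** 2 - radius ** 2) * y      # CMA error term
        w -= mu * err * np.conj(x)                    # stochastic gradient update
        out[k] = y
    return out
```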
A Spectral Algorithm for Solving the Relativistic Vlasov-Maxwell Equations
NASA Technical Reports Server (NTRS)
Shebalin, John V.
2001-01-01
A spectral method algorithm is developed for the numerical solution of the full six-dimensional Vlasov-Maxwell system of equations. Here, the focus is on the electron distribution function, with positive ions providing a constant background. The algorithm consists of a Jacobi polynomial-spherical harmonic formulation in velocity space and a trigonometric formulation in position space. A transform procedure is used to evaluate nonlinear terms. The algorithm is suitable for performing moderate resolution simulations on currently available supercomputers for both scientific and engineering applications.
NASA Technical Reports Server (NTRS)
Clark, Roger N.; Swayze, Gregg A.
1995-01-01
One of the challenges of Imaging Spectroscopy is the identification, mapping and abundance determination of materials, whether mineral, vegetable, or liquid, given enough spectral range, spectral resolution, signal to noise, and spatial resolution. Many materials show diagnostic absorption features in the visual and near infrared region (0.4 to 2.5 micrometers) of the spectrum. This region is covered by the modern imaging spectrometers such as AVIRIS. The challenge is to identify the materials from absorption bands in their spectra, and determine what specific analyses must be done to derive particular parameters of interest, ranging from simply identifying its presence to deriving its abundance, or determining specific chemistry of the material. Recently, a new analysis algorithm was developed that uses a digital spectral library of known materials and a fast, modified-least-squares method of determining if a single spectral feature for a given material is present. Clark et al. made another advance in the mapping algorithm: simultaneously mapping multiple minerals using multiple spectral features. This was done by a modified-least-squares fit of spectral features, from data in a digital spectral library, to corresponding spectral features in the image data. This version has now been superseded by a more comprehensive spectral analysis system called Tricorder.
Algorithms for Solvents and Spectral Factors of Matrix Polynomials
1981-01-01
Shieh, Leang S.; Tsay, Yih T.; Coleman, Norman P.
A generalized Newton method, based on the contracted gradient of a matrix polynomial, is derived for solving the right (left) solvents and spectral factors of matrix polynomials. Two methods of selecting initial estimates for rapid convergence of the newly developed numerical method are proposed. Also, new algorithms for solving complete sets of the right solvents and spectral factors are presented.
Combining Visible and Infrared Spectral Tests for Dust Identification
NASA Technical Reports Server (NTRS)
Zhou, Yaping; Levy, Robert; Kleidman, Richard; Remer, Lorraine; Mattoo, Shana
2016-01-01
The MODIS Dark Target aerosol algorithm over Ocean (DT-O) uses spectral reflectance in the visible, near-IR and SWIR wavelengths to determine aerosol optical depth (AOD) and Angstrom Exponent (AE). Even though DT-O does have "dust-like" models to choose from, dust is not identified a priori before inversion. The "dust-like" models are not true "dust models" as they are spherical and do not have enough absorption at short wavelengths, so retrieved AOD and AE for dusty regions tend to be biased. The inference of "dust" is based on postprocessing criteria for AOD and AE by users. Dust aerosol has known spectral signatures in the near-UV (deep blue), visible, and thermal infrared (TIR) wavelength regions. Multiple dust detection algorithms have been developed over the years with varying detection capabilities. Here, we test a few of these dust detection algorithms to determine whether they can be useful to help inform the choices made by the DT-O algorithm. We evaluate the following methods: the multichannel imager (MCI) algorithm, which uses spectral threshold tests in the 0.47, 0.64, 0.86, 1.38, 2.26, 3.9, 11.0, and 12.0 micrometer channels and a spatial uniformity test [Zhao et al., 2010], and the NOAA dust aerosol index (DAI), which uses spectral contrast in the blue channels (412 nm and 440 nm) [Ciren and Kundragunta, 2014]. The MCI is already included as tests within the "Wisconsin" (MOD35) cloud mask algorithm.
Statistical analysis and machine learning algorithms for optical biopsy
NASA Astrophysics Data System (ADS)
Wu, Binlin; Liu, Cheng-hui; Boydston-White, Susie; Beckman, Hugh; Sriramoju, Vidyasagar; Sordillo, Laura; Zhang, Chunyuan; Zhang, Lin; Shi, Lingyan; Smith, Jason; Bailin, Jacob; Alfano, Robert R.
2018-02-01
Analyzing spectral or imaging data collected with various optical biopsy methods is often difficult due to the complexity of the underlying biology. Robust methods that can utilize the spectral or imaging data and detect the characteristic spectral or spatial signatures of different tissue types are challenging to develop but highly desired. In this study, we used various machine learning algorithms to analyze a spectral dataset acquired from normal and cancerous human skin tissue samples using resonance Raman spectroscopy with 532 nm excitation. The algorithms, including principal component analysis, nonnegative matrix factorization, and an autoencoder artificial neural network, are used to reduce the dimension of the dataset and detect features. A support vector machine with a linear kernel is used to classify the normal tissue and cancerous tissue samples. The efficacies of the methods are compared.
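A compact sketch of one such pipeline (PCA for dimension reduction followed by a linear-kernel SVM, evaluated by cross-validation) is given below; the preprocessing, component count, and scoring are illustrative choices, not the study's exact protocol.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def classify_spectra(spectra, labels, n_components=10):
    """
    Reduce Raman spectra with PCA and classify normal vs. cancerous tissue
    with a linear-kernel SVM, scoring by 5-fold cross-validation.
    spectra : (n_samples, n_wavenumbers) array
    labels  : (n_samples,) array of 0/1 class labels
    """
    model = make_pipeline(StandardScaler(), PCA(n_components=n_components),
                          SVC(kernel="linear"))
    scores = cross_val_score(model, spectra, labels, cv=5)
    return scores.mean(), model.fit(spectra, labels)
```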
Efficient geometric rectification techniques for spectral analysis algorithm
NASA Technical Reports Server (NTRS)
Chang, C. Y.; Pang, S. S.; Curlander, J. C.
1992-01-01
The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented along iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
An Extended Spectral-Spatial Classification Approach for Hyperspectral Data
NASA Astrophysics Data System (ADS)
Akbari, D.
2017-11-01
In this paper, an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of hyperspectral data: (1) unsupervised feature extraction methods, including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction, including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); (3) a genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both the SVM and watershed segmentation algorithms. To evaluate the proposed approach, the Pavia University hyperspectral data is tested. Experimental results show that the proposed approach using GA achieves an overall accuracy approximately 8% higher than the original MSF-based algorithm.
Detection of illicit substances in fingerprints by infrared spectral imaging.
Ng, Ping Hei Ronnie; Walker, Sarah; Tahtouh, Mark; Reedy, Brian
2009-08-01
FTIR and Raman spectral imaging can be used to simultaneously image a latent fingerprint and detect exogenous substances deposited within it. These substances might include drugs of abuse or traces of explosives or gunshot residue. In this work, spectral searching algorithms were tested for their efficacy in finding targeted substances deposited within fingerprints. "Reverse" library searching, where a large number of possibly poor-quality spectra from a spectral image are searched against a small number of high-quality reference spectra, poses problems for common search algorithms as they are usually implemented. Out of a range of algorithms which included conventional Euclidean distance searching, the spectral angle mapper (SAM) and correlation algorithms gave the best results when used with second-derivative image and reference spectra. All methods tested gave poorer performances with first derivative and undifferentiated spectra. In a search against a caffeine reference, the SAM and correlation methods were able to correctly rank a set of 40 confirmed but poor-quality caffeine spectra at the top of a dataset which also contained 4,096 spectra from an image of an uncontaminated latent fingerprint. These methods also successfully and individually detected aspirin, diazepam and caffeine that had been deposited together in another fingerprint, and they did not indicate any of these substances as a match in a search for another substance which was known not to be present. The SAM was used to successfully locate explosive components in fingerprints deposited on silicon windows. The potential of other spectral searching algorithms used in the field of remote sensing is considered, and the applicability of the methods tested in this work to other modes of spectral imaging is discussed.
NASA Astrophysics Data System (ADS)
Lazcano, R.; Madroñal, D.; Fabelo, H.; Ortega, S.; Salvador, R.; Callicó, G. M.; Juárez, E.; Sanz, C.
2017-10-01
Hyperspectral Imaging (HI) assembles high resolution spectral information from hundreds of narrow bands across the electromagnetic spectrum, thus generating 3D data cubes in which each spatial pixel gathers the spectral reflectance information of the scene. As a result, each image is composed of large volumes of data, which turns its processing into a challenge, as performance requirements have been continuously tightened. For instance, new HI applications demand real-time responses. Hence, parallel processing becomes a necessity to achieve this requirement, so the intrinsic parallelism of the algorithms must be exploited. In this paper, a spatial-spectral classification approach has been implemented using a dataflow language known as RVC-CAL. This language represents a system as a set of functional units, and its main advantage is that it simplifies the parallelization process by mapping the different blocks over different processing units. The spatial-spectral classification approach aims at refining the classification results previously obtained by using a K-Nearest Neighbors (KNN) filtering process, in which both the pixel spectral value and the spatial coordinates are considered. To do so, KNN needs two inputs: a one-band representation of the hyperspectral image and the classification results provided by a pixel-wise classifier. Thus, the spatial-spectral classification algorithm is divided into three different stages: a Principal Component Analysis (PCA) algorithm for computing the one-band representation of the image, a Support Vector Machine (SVM) classifier, and the KNN-based filtering algorithm. The parallelization of these algorithms shows promising results in terms of computational time, as mapping them over different cores yields a speedup of 2.69x when using 3 cores. Consequently, experimental results demonstrate that real-time processing of hyperspectral images is achievable.
MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions
NASA Astrophysics Data System (ADS)
Novosad, Philip; Reader, Andrew J.
2016-06-01
Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [18F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. Furthermore, we demonstrate that a joint spectral/kernel model can also be used for effective post-reconstruction denoising, through the use of an EM-like image-space algorithm. Finally, we applied the proposed algorithm to reconstruction of real high-resolution dynamic [11C]SCH23390 data, showing promising results.
Anatomy-Based Algorithms for Detecting Oral Cancer Using Reflectance and Fluorescence Spectroscopy
McGee, Sasha; Mardirossian, Vartan; Elackattu, Alphi; Mirkovic, Jelena; Pistey, Robert; Gallagher, George; Kabani, Sadru; Yu, Chung-Chieh; Wang, Zimmern; Badizadegan, Kamran; Grillone, Gregory; Feld, Michael S.
2010-01-01
Objectives We used reflectance and fluorescence spectroscopy to noninvasively and quantitatively distinguish benign from dysplastic/malignant oral lesions. We designed diagnostic algorithms to account for differences in the spectral properties among anatomic sites (gingiva, buccal mucosa, etc). Methods In vivo reflectance and fluorescence spectra were collected from 71 patients with oral lesions. The tissue was then biopsied and the specimen evaluated by histopathology. Quantitative parameters related to tissue morphology and biochemistry were extracted from the spectra. Diagnostic algorithms specific for combinations of sites with similar spectral properties were developed. Results Discrimination of benign from dysplastic/malignant lesions was most successful when algorithms were designed for individual sites (area under the receiver operator characteristic curve [ROC-AUC], 0.75 for the lateral surface of the tongue) and was least accurate when all sites were combined (ROC-AUC, 0.60). The combination of sites with similar spectral properties (floor of mouth and lateral surface of the tongue) yielded an ROC-AUC of 0.71. Conclusions Accurate spectroscopic detection of oral disease must account for spectral variations among anatomic sites. Anatomy-based algorithms for single sites or combinations of sites demonstrated good diagnostic performance in distinguishing benign lesions from dysplastic/malignant lesions and consistently performed better than algorithms developed for all sites combined. PMID:19999369
Broadband Gerchberg-Saxton algorithm for freeform diffractive spectral filter design.
Vorndran, Shelby; Russo, Juan M; Wu, Yuechen; Pelaez, Silvana Ayala; Kostuk, Raymond K
2015-11-30
A multi-wavelength expansion of the Gerchberg-Saxton (GS) algorithm is developed to design and optimize a surface relief Diffractive Optical Element (DOE). The DOE simultaneously diffracts distinct wavelength bands into separate target regions. A description of the algorithm is provided, and parameters that affect filter performance are examined. Performance is based on the spectral power collected within specified regions on a receiver plane. The modified GS algorithm is used to design spectrum splitting optics for CdSe and Si photovoltaic (PV) cells. The DOE has average optical efficiency of 87.5% over the spectral bands of interest (400-710 nm and 710-1100 nm). Simulated PV conversion efficiency is 37.7%, which is 29.3% higher than the efficiency of the better performing PV cell without spectrum splitting optics.
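As a point of reference for the broadband extension described above, the following is a minimal single-wavelength Gerchberg-Saxton loop in Python, assuming far-field propagation is modelled by an FFT; the multi-wavelength version in the paper additionally cycles over wavelength bands and their separate target regions, which is not reproduced here. All names and sizes are illustrative.

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, n_iter=100):
    """Single-wavelength Gerchberg-Saxton phase retrieval: find a DOE phase such
    that the far-field intensity (modelled by an FFT) approximates target_amp**2."""
    phase = 2 * np.pi * np.random.default_rng(0).random(source_amp.shape)
    for _ in range(n_iter):
        field = source_amp * np.exp(1j * phase)           # DOE plane: enforce source amplitude
        far = np.fft.fftshift(np.fft.fft2(field))         # propagate to the receiver plane
        far = target_amp * np.exp(1j * np.angle(far))     # receiver plane: enforce target amplitude
        back = np.fft.ifft2(np.fft.ifftshift(far))        # propagate back to the DOE plane
        phase = np.angle(back)                            # keep only the phase
    return phase

# toy usage: uniform illumination, off-axis rectangular target region
N = 128
source = np.ones((N, N))
target = np.zeros((N, N)); target[40:60, 80:100] = 1.0
doe_phase = gerchberg_saxton(source, target)
```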
NASA Astrophysics Data System (ADS)
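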
Toadere, Florin
2017-12-01
A spectral image processing algorithm is presented that allows the scene to be illuminated with different illuminants and the scene's reflectance to be reconstructed. A color-checker spectral image is used as the scene, and CIE A (warm light, 2700 K), D65 (cold light, 6500 K) and Cree TW Series LED T8 (4000 K) illuminants are employed for scene illumination. The illuminants used in the simulations have different spectra and, as a result, the colors of the scene change under each of them. The influence of the illuminants on the reconstruction of the scene's reflectance is estimated. Demonstrative images and reflectance curves illustrating the operation of the algorithm are shown.
NASA Astrophysics Data System (ADS)
Yang, Tao; Peng, Jing-xiao; Ho, Ho-pui; Song, Chun-yuan; Huang, Xiao-li; Zhu, Yong-yuan; Li, Xing-ao; Huang, Wei
2018-01-01
By using a preaggregated silver nanoparticle monolayer film and an infrared sensor card, we demonstrate a miniature spectrometer design that covers a broad wavelength range from visible to infrared with high spectral resolution. The spectral contents of an incident probe beam are reconstructed by solving a matrix equation with a smoothing simulated annealing algorithm. The proposed spectrometer offers significant advantages over current instruments that are based on Fourier transform and grating dispersion, in terms of size, resolution, spectral range, cost and reliability. The spectrometer contains three components, which are used for dispersion, frequency conversion and detection. The disordered silver nanoparticles in the dispersion component reduce the fabrication complexity. The infrared sensor card in the conversion component broadens the operational spectral range of the system into the visible and infrared bands. Since the CCD used in the detection component provides a very large number of intensity measurements, the final spectrum can be reconstructed with high resolution. To make the algorithm for solving the matrix equation suitable for reconstructing both broadband and narrowband signals, we have adopted a smoothing step based on a simulated annealing algorithm, which improves the accuracy of the spectral reconstruction.
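The reconstruction step described above amounts to solving a linear system relating detector readings to the unknown spectrum while enforcing smoothness. The sketch below swaps the paper's smoothing simulated-annealing step for a simpler second-difference (Tikhonov) penalty purely for illustration; the response matrix, spectrum and regularisation weight are hypothetical.

```python
import numpy as np

def reconstruct_spectrum(R, y, lam=1e-2):
    """Reconstruct a spectrum s from detector readings y = R @ s by least squares
    with a second-difference smoothness penalty (a Tikhonov stand-in for the
    paper's smoothing simulated-annealing step)."""
    n = R.shape[1]
    D = np.diff(np.eye(n), n=2, axis=0)                  # second-difference (roughness) operator
    A = R.T @ R + lam * D.T @ D
    return np.linalg.solve(A, R.T @ y)

# toy usage: 200 random broadband response functions sampling a 300-point spectrum
rng = np.random.default_rng(1)
R = np.abs(rng.normal(size=(200, 300)))
true = np.exp(-0.5 * ((np.arange(300) - 150) / 20.0) ** 2)
y = R @ true + 0.01 * rng.normal(size=200)
est = reconstruct_spectrum(R, y)
```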
Speech Enhancement, Gain, and Noise Spectrum Adaptation Using Approximate Bayesian Estimation
Hao, Jiucang; Attias, Hagai; Nagarajan, Srikantan; Lee, Te-Won; Sejnowski, Terrence J.
2010-01-01
This paper presents a new approximate Bayesian estimator for enhancing a noisy speech signal. The speech model is assumed to be a Gaussian mixture model (GMM) in the log-spectral domain. This is in contrast to most current models in the frequency domain. Exact signal estimation is a computationally intractable problem. We derive three approximations to enhance the efficiency of signal estimation. The Gaussian approximation transforms the log-spectral domain GMM into the frequency domain using a minimal Kullback–Leibler (KL) divergence criterion. The frequency domain Laplace method computes the maximum a posteriori (MAP) estimator for the spectral amplitude. Correspondingly, the log-spectral domain Laplace method computes the MAP estimator for the log-spectral amplitude. Further, the gain and noise spectrum adaptation are implemented using the expectation–maximization (EM) algorithm within the GMM under the Gaussian approximation. The proposed algorithms are evaluated by applying them to enhance speech corrupted by speech-shaped noise (SSN). The experimental results demonstrate that the proposed algorithms offer improved signal-to-noise ratio, lower word recognition error rate, and less spectral distortion. PMID:20428253
NASA Astrophysics Data System (ADS)
Padma, S.; Sanjeevi, S.
2014-12-01
This paper proposes a novel hyperspectral matching algorithm that integrates the stochastic Jeffries-Matusita measure (JM) and the deterministic Spectral Angle Mapper (SAM) to accurately map the species and associated landcover types of the mangroves of the east coast of India using hyperspectral satellite images. The JM-SAM algorithm combines a qualitative distance measure (JM) with a quantitative angle measure (SAM). The spectral capabilities of both measures are orthogonally projected using the tangent and sine functions to yield the combined algorithm. The developed JM-SAM algorithm is implemented to discriminate the mangrove species and the landcover classes of the Pichavaram (Tamil Nadu), Muthupet (Tamil Nadu) and Bhitarkanika (Odisha) mangrove forests along the eastern Indian coast using Hyperion image datasets that contain 242 bands. The developed algorithm is extended in a supervised framework for accurate classification of the Hyperion image. The pixel-level matching performance of the developed algorithm is assessed by the Relative Spectral Discriminatory Probability (RSDPB) and Relative Spectral Discriminatory Entropy (RSDE) measures. From the values of RSDPB and RSDE, it is inferred that the hybrid JM-SAM matching measure results in improved discriminability of the mangrove species and the associated landcover types compared with the individual SAM and JM algorithms. This performance is reflected in the classification accuracies of the species and landcover map of the Pichavaram mangrove ecosystem. The JM-SAM (TAN) matching algorithm yielded accuracies better than the SAM and JM measures by an average of 13.49% and 7.21%, respectively, followed by JM-SAM (SIN) at 12.06% and 5.78%. Similarly, for Muthupet, JM-SAM (TAN) yielded higher accuracy than the SAM and JM measures by an average of 12.5% and 9.72%, respectively, followed by JM-SAM (SIN) at 8.34% and 5.55%. For Bhitarkanika, the combined JM-SAM (TAN) and (SIN) measures improved the performance of individual SAM by 16.1% and 15%, and of JM by 10.3% and 9.2%, respectively.
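For illustration, one common way to form such a hybrid measure is sketched below: the JM term treats the two spectra as discrete probability distributions, and the SAM angle enters through a tangent or sine factor. The exact normalisation and weighting used by the authors may differ, so this should be read as a plausible sketch rather than their formulation.

```python
import numpy as np

def sam_angle(x, y):
    """Spectral Angle Mapper: angle (radians) between two spectra."""
    c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))

def jm_distance(x, y, eps=1e-12):
    """Jeffries-Matusita distance, treating each spectrum as a discrete
    probability distribution (one common convention for pixel-to-pixel matching)."""
    p, q = x / (x.sum() + eps), y / (y.sum() + eps)
    bhatt = -np.log(np.sum(np.sqrt(p * q)) + eps)        # Bhattacharyya distance
    return 2.0 * (1.0 - np.exp(-bhatt))

def jm_sam(x, y, mode="tan"):
    """Hybrid JM-SAM score; larger values indicate more dissimilar spectra."""
    a = sam_angle(x, y)
    return jm_distance(x, y) * (np.tan(a) if mode == "tan" else np.sin(a))

# toy usage with two 50-band spectra
rng = np.random.default_rng(2)
s1, s2 = np.abs(rng.normal(size=50)), np.abs(rng.normal(size=50))
print(jm_sam(s1, s2, "tan"), jm_sam(s1, s2, "sin"))
```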
Baumann, Tobias; Schmitt, Franz-Josef; Pelzer, Almut; Spiering, Vivian Jeanette; Freiherr von Sass, Georg Johannes; Friedrich, Thomas; Budisa, Nediljko
2018-04-27
Fluorescent proteins are fundamental tools for the life sciences, in particular for fluorescence microscopy of living cells. While wild-type and engineered variants of the green fluorescent protein from Aequorea victoria (avGFP) as well as homologs from other species already cover large parts of the optical spectrum, a spectral gap remains in the near-infrared region, for which avGFP-based fluorophores are not available. Red-shifted fluorescent protein (FP) variants would substantially expand the toolkit for spectral unmixing of multiple molecular species, but the naturally occurring red-shifted FPs derived from corals or sea anemones have lower fluorescence quantum yield and inferior photo-stability compared to the avGFP variants. Further manipulation and possible expansion of the chromophore's conjugated system towards the far-red spectral region is also limited by the repertoire of 20 canonical amino acids prescribed by the genetic code. To overcome these limitations, synthetic biology can achieve further spectral red-shifting via insertion of non-canonical amino acids into the chromophore triad. We describe the application of SPI to engineer avGFP variants with novel spectral properties. Protein expression is performed in a tryptophan-auxotrophic E. coli strain and by supplementing growth media with suitable indole precursors. Inside the cells, these precursors are converted to the corresponding tryptophan analogs and incorporated into proteins by the ribosomal machinery in response to UGG codons. The replacement of Trp-66 in the enhanced "cyan" variant of avGFP (ECFP) by an electron-donating 4-aminotryptophan results in GdFP featuring a 108 nm Stokes shift and a strongly red-shifted emission maximum (574 nm), while being thermodynamically more stable than its predecessor ECFP. Residue-specific incorporation of the non-canonical amino acid is analyzed by mass spectrometry. The spectroscopic properties of GdFP are characterized by time-resolved fluorescence spectroscopy as one of the valuable applications of genetically encoded FPs in life sciences.
Wide-band array signal processing via spectral smoothing
NASA Technical Reports Server (NTRS)
Xu, Guanghan; Kailath, Thomas
1989-01-01
A novel algorithm for the estimation of direction-of-arrivals (DOA) of multiple wide-band sources via spectral smoothing is presented. The proposed algorithm does not require an initial DOA estimate or a specific signal model. The advantages of replacing the MUSIC search with an ESPRIT search are discussed.
Selecting algorithms, sensors, and linear bases for optimum spectral recovery of skylight.
López-Alvarez, Miguel A; Hernández-Andrés, Javier; Valero, Eva M; Romero, Javier
2007-04-01
In a previous work [Appl. Opt. 44, 5688 (2005)] we found the optimum sensors for a planned multispectral system for measuring skylight in the presence of noise by adapting a linear spectral recovery algorithm proposed by Maloney and Wandell [J. Opt. Soc. Am. A 3, 29 (1986)]. Here we continue along these lines by simulating the responses of three to five Gaussian sensors and recovering spectral information from noise-affected sensor data by trying out four different estimation algorithms, three different sizes for the training set of spectra, and various linear bases. We attempt to find the optimum combination of sensors, recovery method, linear basis, and matrix size to recover the best skylight spectral power distributions from colorimetric and spectral (in the visible range) points of view. We show how all these parameters play an important role in the practical design of a real multispectral system and how to obtain several relevant conclusions from simulating the behavior of sensors in the presence of noise.
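A minimal sketch of the Maloney-Wandell-style linear recovery that underlies this kind of study is given below: skylight spectra are assumed to lie in a low-dimensional linear basis, and the basis weights are estimated from the (noisy) sensor responses by least squares. Sensor shapes, basis and noise level are illustrative assumptions, not the configurations evaluated in the paper.

```python
import numpy as np

def recover_spectrum(responses, sensors, basis):
    """Maloney-Wandell-style linear recovery: estimate basis weights w from sensor
    responses r = sensors.T @ (basis @ w), then return the reconstructed spectrum.
    sensors: (n_wavelengths, n_sensors); basis: (n_wavelengths, n_basis)."""
    M = sensors.T @ basis                       # maps basis weights to sensor responses
    w, *_ = np.linalg.lstsq(M, responses, rcond=None)
    return basis @ w

# toy usage: 3 Gaussian sensors, 3-vector polynomial basis, 61 wavelengths (400-700 nm)
wl = np.linspace(400, 700, 61)
def gauss(mu, sigma): return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)
sensors = np.stack([gauss(450, 30), gauss(550, 30), gauss(650, 30)], axis=1)
basis = np.stack([np.ones_like(wl), (wl - 550) / 150, ((wl - 550) / 150) ** 2], axis=1)
true = basis @ np.array([1.0, 0.3, -0.2])
r = sensors.T @ true + 0.001 * np.random.default_rng(3).normal(size=3)
est = recover_spectrum(r, sensors, basis)
```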
DOE Office of Scientific and Technical Information (OSTI.GOV)
I. W. Ginsberg
Multiresolutional decompositions known as spectral fingerprints are often used to extract spectral features from multispectral/hyperspectral data. In this study, the authors investigate the use of wavelet-based algorithms for generating spectral fingerprints. The wavelet-based algorithms are compared to the currently used method, traditional convolution with first-derivative Gaussian filters. The comparison analysis consists of two parts: (a) the computational expense of the new method is compared with the computational costs of the current method, and (b) the outputs of the wavelet-based methods are compared with those of the current method to determine any practical differences in the resulting spectral fingerprints. The results show that the wavelet-based algorithms can greatly reduce the computational expense of generating spectral fingerprints, while practically no differences exist in the resulting fingerprints. The analysis is conducted on a database of hyperspectral signatures, namely, Hyperspectral Digital Image Collection Experiment (HYDICE) signatures. The reduction in computational expense is by a factor of about 30, and the average Euclidean distance between resulting fingerprints is on the order of 0.02.
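For reference, the baseline method mentioned above, convolution with first-derivative Gaussian filters at several scales, can be sketched as follows; a wavelet-based variant would replace the filter bank with a discrete wavelet (packet) decomposition. The scale values are illustrative, not those used in the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def dog_fingerprint(spectrum, scales=(1, 2, 4, 8, 16)):
    """Baseline spectral fingerprint: first-derivative-of-Gaussian responses of a
    spectrum at several scales, stacked into one feature matrix (scales x bands)."""
    return np.stack([gaussian_filter1d(spectrum, sigma=s, order=1) for s in scales])

# toy usage on a synthetic 200-band signature
x = np.cumsum(np.random.default_rng(4).normal(size=200))
fp = dog_fingerprint(x)
print(fp.shape)   # (5, 200)
```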
NASA Astrophysics Data System (ADS)
Ming, Mei-Jun; Xu, Long-Kun; Wang, Fan; Bi, Ting-Jun; Li, Xiang-Yuan
2017-07-01
In this work, a matrix form of numerical algorithm for spectral shift is presented based on the novel nonequilibrium solvation model that is established by introducing the constrained equilibrium manipulation. This form is convenient for the development of codes for numerical solution. By means of the integral equation formulation polarizable continuum model (IEF-PCM), a subroutine has been implemented to compute spectral shift numerically. Here, the spectral shifts of absorption spectra for several popular chromophores, N,N-diethyl-p-nitroaniline (DEPNA), methylenecyclopropene (MCP), acrolein (ACL) and p-nitroaniline (PNA) were investigated in different solvents with various polarities. The computed spectral shifts can explain the available experimental findings reasonably. Discussions were made on the contributions of solute geometry distortion, electrostatic polarization and other non-electrostatic interactions to spectral shift.
Malonza, I M; Tyndall, M W; Ndinya-Achola, J O; Maclean, I; Omar, S; MacDonald, K S; Perriens, J; Orle, K; Plummer, F A; Ronald, A R; Moses, S
1999-12-01
A randomized, double-blind, placebo-controlled clinical trial was conducted in Nairobi, Kenya, to compare single-dose ciprofloxacin with a 7-day course of erythromycin for the treatment of chancroid. In all, 208 men and 37 women presenting with genital ulcers clinically compatible with chancroid were enrolled. Ulcer etiology was determined using culture techniques for chancroid, serology for syphilis, and a multiplex polymerase chain reaction for chancroid, syphilis, and herpes simplex virus (HSV). Ulcer etiology was 31% unmixed chancroid, 23% unmixed syphilis, 16% unmixed HSV, 15% mixed etiology, and 15% unknown. For 111 participants with chancroid, cure rates were 92% with ciprofloxacin and 91% with erythromycin. For all study participants, the treatment failure rate was 15%, mostly related to ulcer etiologies of HSV infection or syphilis, and treatment failure was 3 times more frequent in human immunodeficiency virus-infected subjects than in others, mostly owing to HSV infection. Ciprofloxacin is an effective single-dose treatment for chancroid, but current recommendations for empiric therapy of genital ulcers may result in high treatment failure due to HSV infection.
Visual enhancement of unmixed multispectral imagery using adaptive smoothing
Lemeshewsky, G.P.; Rahman, Z.-U.; Schowengerdt, R.A.; Reichenbach, S.E.
2004-01-01
Adaptive smoothing (AS) has been previously proposed as a method to smooth uniform regions of an image, retain contrast edges, and enhance edge boundaries. The method is an implementation of the anisotropic diffusion process which results in a gray scale image. This paper discusses modifications to the AS method for application to multi-band data which results in a color segmented image. The process was used to visually enhance the three most distinct abundance fraction images produced by the Lagrange constraint neural network learning-based unmixing of Landsat 7 Enhanced Thematic Mapper Plus multispectral sensor data. A mutual information-based method was applied to select the three most distinct fraction images for subsequent visualization as a red, green, and blue composite. A reported image restoration technique (partial restoration) was applied to the multispectral data to reduce unmixing error, although evaluation of the performance of this technique was beyond the scope of this paper. The modified smoothing process resulted in a color segmented image with homogeneous regions separated by sharpened, coregistered multiband edges. There was improved class separation with the segmented image, which has importance to subsequent operations involving data classification.
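Adaptive smoothing is described above as an implementation of anisotropic diffusion; a minimal grayscale Perona-Malik sketch is shown below for orientation. The paper's multi-band, edge-sharpening modification is not reproduced, and the conduction and step parameters are hypothetical.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, gamma=0.2):
    """Minimal Perona-Malik anisotropic diffusion on a 2-D grayscale image:
    smooths homogeneous regions while preserving contrast edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four compass neighbours
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        # edge-stopping conduction coefficients (small across strong edges)
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

# toy usage: noisy step edge
rng = np.random.default_rng(5)
img = np.zeros((64, 64)); img[:, 32:] = 1.0
smoothed = perona_malik(img + 0.1 * rng.normal(size=img.shape))
```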
NASA Astrophysics Data System (ADS)
Kal, S.; Kasko, I.; Ryssel, H.
1995-10-01
The influence of ion-beam mixing on ultra-thin cobalt silicide (CoSi2) formation was investigated by characterizing ion-beam mixed and unmixed CoSi2 films. A Ge+ ion implantation through the Co film prior to silicidation causes interface mixing of the cobalt film with the silicon substrate and results in improved silicide-to-silicon interface roughness. Rapid thermal annealing was used to form Ge+ ion mixed and unmixed thin CoSi2 layers from a 10 nm sputter-deposited Co film. The silicide films were characterized by secondary neutral mass spectroscopy, x-ray diffraction, transmission electron microscopy (TEM), Rutherford backscattering, and sheet resistance measurements. The experimental results indicate that the final rapid thermal annealing temperature should not exceed 800°C for thin (<50 nm) CoSi2 preparation. A comparison of the plan-view and cross-section TEM micrographs of the ion-beam mixed and unmixed CoSi2 films reveals that Ge+ ion mixing (45 keV, 1 × 10^15 cm^-2) produces a homogeneous silicide with a smooth silicide-to-silicon interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couturier, Laurent, E-mail: laurent.couturier55@ho
The fine microstructure obtained by unmixing of a solid solution, either by classical precipitation or spinodal decomposition, is often characterized either by small angle scattering or atom probe tomography. This article shows that a common data analysis framework can be used to analyze data obtained from these two techniques. An example of the application of this common analysis is given for characterization of the unmixing of the Fe-Cr matrix of a 15-5 PH stainless steel during long-term ageing at 350 °C and 400 °C. A direct comparison of the Cr composition fluctuation amplitudes and characteristic lengths obtained with both techniques is made, showing a quantitative agreement for the fluctuation amplitudes. The origin of the discrepancy remaining for the characteristic lengths is discussed. - Highlights: • Common analysis framework for atom probe tomography and small angle scattering • Comparison of the same microstructural characteristics obtained using both techniques • Good correlation of Cr composition fluctuation amplitudes from both techniques • Good correlation of Cr composition fluctuation amplitudes with the classic V parameter.
Fang, Jieming; Zhang, Da; Wilcox, Carol; Heidinger, Benedikt; Raptopoulos, Vassilios; Brook, Alexander; Brook, Olga R
2017-03-01
The aim of this study was to assess single energy metal artifact reduction (SEMAR) and spectral energy metal artifact reduction (MARS) algorithms in reducing artifacts generated by different metal implants. A phantom was scanned with and without SEMAR (Aquilion One, Toshiba) and MARS (Discovery CT750 HD, GE), with various metal implants. Images were evaluated objectively by measuring the standard deviation in regions of interest and subjectively by two independent reviewers grading on a scale of 0 (no artifact) to 4 (severe artifact). Reviewers also graded new artifacts introduced by the metal artifact reduction algorithms. SEMAR and MARS significantly decreased the variability of the density measurement adjacent to the metal implant, with a median SD (standard deviation of the density measurement) of 52.1 HU without SEMAR vs. 12.3 HU with SEMAR, p < 0.001. The median SD without MARS of 63.1 HU decreased to 25.9 HU with MARS, p < 0.001. The median SD with SEMAR was significantly lower than the median SD with MARS (p = 0.0011). SEMAR improved subjective image quality, with a reduction in overall artifact grading from 3.2 ± 0.7 to 1.4 ± 0.9, p < 0.001. The improvement of overall image quality by MARS did not reach statistical significance (3.2 ± 0.6 to 2.6 ± 0.8, p = 0.088). MARS introduced significant new artifacts (2.4 ± 1.0), whereas SEMAR introduced minimal new artifacts (0.4 ± 0.7), p < 0.001. CT iterative reconstruction algorithms with single and spectral energy are both effective in reducing metal artifacts. The single energy-based algorithm provides better overall image quality than the spectral CT-based algorithm. The spectral metal artifact reduction algorithm introduces mild to moderate artifacts in the far field.
Trace gas detection in hyperspectral imagery using the wavelet packet subspace
NASA Astrophysics Data System (ADS)
Salvador, Mark A. Z.
This dissertation describes research into a new remote sensing method to detect trace gases in hyperspectral and ultra-spectral data. This new method is based on the wavelet packet transform. It attempts to improve both the computational tractability and the detection of trace gases in airborne and spaceborne spectral imagery. Atmospheric trace gas research supports various Earth science disciplines including climatology, vulcanology, pollution monitoring, natural disasters, and intelligence and military applications. Hyperspectral and ultra-spectral data significantly increase the data glut of existing Earth science data sets. Spaceborne spectral data in particular significantly increase spectral resolution while providing daily global collections of the Earth. Application of the wavelet packet transform to the spectral space of hyperspectral and ultra-spectral imagery data potentially improves remote sensing detection algorithms. It also facilitates the parallelization of these methods for high-performance computing. This research pursues two science goals: (1) developing a new spectral imagery detection algorithm, and (2) facilitating the parallelization of trace gas detection in spectral imagery data.
Low complexity feature extraction for classification of harmonic signals
NASA Astrophysics Data System (ADS)
William, Peter E.
In this dissertation, feature extraction algorithms have been developed for extraction of characteristic features from harmonic signals. The common theme for all developed algorithms is the simplicity in generating a significant set of features directly from the time domain harmonic signal. The features are a time domain representation of the composite, yet sparse, harmonic signature in the spectral domain. The algorithms are adequate for low-power unattended sensors which perform sensing, feature extraction, and classification in a standalone scenario. The first algorithm generates the characteristic features using only the duration between successive zero-crossing intervals. The second algorithm estimates the harmonics' amplitudes of the harmonic structure employing a simplified least squares method without the need to estimate the true harmonic parameters of the source signal. The third algorithm, resulting from a collaborative effort with Daniel White at the DSP Lab, University of Nebraska-Lincoln, presents an analog front end approach that utilizes a multichannel analog projection and integration to extract the sparse spectral features from the analog time domain signal. Classification is performed using a multilayer feedforward neural network. Evaluation of the proposed feature extraction algorithms for classification through the processing of several acoustic and vibration data sets (including military vehicles and rotating electric machines) with comparison to spectral features shows that, for harmonic signals, time domain features are simpler to extract and provide equivalent or improved reliability over the spectral features in both the detection probabilities and false alarm rate.
Matched-filter algorithm for subpixel spectral detection in hyperspectral image data
NASA Astrophysics Data System (ADS)
Borough, Howard C.
1991-11-01
Hyperspectral imagery, spatial imagery with associated wavelength data for every pixel, offers a significant potential for improved detection and identification of certain classes of targets. The ability to make spectral identifications of objects which only partially fill a single pixel (due to range or small size) is of considerable interest. Multiband imagery such as Landsat's 5- and 7-band imagery has demonstrated significant utility in the past. Hyperspectral imaging systems with hundreds of spectral bands offer improved performance. To explore the application of different subpixel spectral detection algorithms, a synthesized set of hyperspectral image data (hypercubes) was generated utilizing NASA earth resources and other spectral data. The data was modified using LOWTRAN 7 to model the illumination, atmospheric contributions, attenuations and viewing geometry to represent a nadir view from 10,000 ft. altitude. The base hypercube (HC) represented 16 by 21 spatial pixels with 101 wavelength samples from 0.5 to 2.5 micrometers for each pixel. Insertions were made into the base data to provide random location, random pixel percentage, and random material. Fifteen different hypercubes were generated for blind testing of candidate algorithms. An algorithm utilizing a matched filter in the spectral dimension proved surprisingly good, yielding 100% detection for pixels filled greater than 40% with a standard camouflage paint, and a 50% probability of detection for pixels filled 20% with the paint, with no false alarms. The false alarm rate as a function of the number of spectral bands in the range from 101 to 12 bands was measured and found to increase from zero to 50%, illustrating the value of a large number of spectral bands. This test was on imagery without system noise; the next step is to incorporate typical system noise sources.
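A classic spectral matched filter of the kind evaluated above can be sketched in a few lines: each pixel spectrum is scored against the target signature after whitening by the scene's background mean and covariance. The exact detector used in the study is not specified here, so treat this as a generic stand-in with illustrative dimensions.

```python
import numpy as np

def matched_filter_scores(cube, target):
    """Classic spectral matched filter: score each pixel spectrum against a target
    signature, whitened by the scene's background mean and covariance.
    cube: (n_pixels, n_bands); target: (n_bands,)."""
    mu = cube.mean(axis=0)
    cov = np.cov(cube, rowvar=False) + 1e-6 * np.eye(cube.shape[1])   # regularised covariance
    w = np.linalg.solve(cov, target - mu)                             # filter vector
    w /= (target - mu) @ w                                            # normalise so a full target scores ~1
    return (cube - mu) @ w

# toy usage: 16x21 pixels, 101 bands, one pixel 40% filled with the target
rng = np.random.default_rng(6)
cube = rng.normal(size=(16 * 21, 101))
target = rng.normal(size=101)
cube[0] = 0.6 * cube[0] + 0.4 * target
scores = matched_filter_scores(cube, target)
```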
Adiabatic Quantum Search in Open Systems.
Wild, Dominik S; Gopalakrishnan, Sarang; Knap, Michael; Yao, Norman Y; Lukin, Mikhail D
2016-10-07
Adiabatic quantum algorithms represent a promising approach to universal quantum computation. In isolated systems, a key limitation to such algorithms is the presence of avoided level crossings, where gaps become extremely small. In open quantum systems, the fundamental robustness of adiabatic algorithms remains unresolved. Here, we study the dynamics near an avoided level crossing associated with the adiabatic quantum search algorithm, when the system is coupled to a generic environment. At zero temperature, we find that the algorithm remains scalable provided the noise spectral density of the environment decays sufficiently fast at low frequencies. By contrast, higher order scattering processes render the algorithm inefficient at any finite temperature regardless of the spectral density, implying that no quantum speedup can be achieved. Extensions and implications for other adiabatic quantum algorithms will be discussed.
Compression of multispectral Landsat imagery using the Embedded Zerotree Wavelet (EZW) algorithm
NASA Technical Reports Server (NTRS)
Shapiro, Jerome M.; Martucci, Stephen A.; Czigler, Martin
1994-01-01
The Embedded Zerotree Wavelet (EZW) algorithm has proven to be an extremely efficient and flexible compression algorithm for low bit rate image coding. The embedding algorithm attempts to order the bits in the bit stream in numerical importance and thus a given code contains all lower rate encodings of the same algorithm. Therefore, precise bit rate control is achievable and a target rate or distortion metric can be met exactly. Furthermore, the technique is fully image adaptive. An algorithm for multispectral image compression which combines the spectral redundancy removal properties of the image-dependent Karhunen-Loeve Transform (KLT) with the efficiency, controllability, and adaptivity of the embedded zerotree wavelet algorithm is presented. Results are shown which illustrate the advantage of jointly encoding spectral components using the KLT and EZW.
NASA Astrophysics Data System (ADS)
Witharana, Chandi; LaRue, Michelle A.; Lynch, Heather J.
2016-03-01
Remote sensing is a rapidly developing tool for mapping the abundance and distribution of Antarctic wildlife. While both panchromatic and multispectral imagery have been used in this context, image fusion techniques have received little attention. We tasked seven widely-used fusion algorithms: Ehlers fusion, hyperspherical color space fusion, high-pass fusion, principal component analysis (PCA) fusion, Gram-Schmidt fusion, University of New Brunswick fusion, and wavelet-PCA fusion to resolution-enhance a series of single-date QuickBird-2 and Worldview-2 image scenes comprising penguin guano, seals, and vegetation. Fused images were assessed for spectral and spatial fidelity using a variety of quantitative quality indicators and visual inspection methods. Our visual evaluation selected the high-pass fusion algorithm and the University of New Brunswick fusion algorithm as best for manual wildlife detection, while the quantitative assessment suggested the Gram-Schmidt fusion algorithm and the University of New Brunswick fusion algorithm as best for automated classification. The hyperspherical color space fusion algorithm exhibited mediocre results in terms of spectral and spatial fidelities. The PCA fusion algorithm showed spatial superiority at the expense of spectral inconsistencies. The Ehlers fusion algorithm and the wavelet-PCA algorithm showed the weakest performances. As remote sensing becomes a more routine method of surveying Antarctic wildlife, these benchmarks will provide guidance for image fusion and pave the way for more standardized products for specific types of wildlife surveys.
A three-dimensional spectral algorithm for simulations of transition and turbulence
NASA Technical Reports Server (NTRS)
Zang, T. A.; Hussaini, M. Y.
1985-01-01
A spectral algorithm for simulating three dimensional, incompressible, parallel shear flows is described. It applies to the channel, to the parallel boundary layer, and to other shear flows with one wall bounded and two periodic directions. Representative applications to the channel and to the heated boundary layer are presented.
Land, P E; Haigh, J D
1997-12-20
In algorithms for the atmospheric correction of visible and near-IR satellite observations of the Earth's surface, it is generally assumed that the spectral variation of aerosol optical depth is characterized by an Angström power law or similar dependence. In an iterative fitting algorithm for atmospheric correction of ocean color imagery over case 2 waters, this assumption leads to an inability to retrieve the aerosol type and to the attribution to aerosol spectral variations of spectral effects actually caused by the water contents. An improvement to this algorithm is described in which the spectral variation of optical depth is calculated as a function of aerosol type and relative humidity, and an attempt is made to retrieve the relative humidity in addition to aerosol type. The aerosol is treated as a mixture of aerosol components (e.g., soot), rather than of aerosol types (e.g., urban). We demonstrate the improvement over the previous method by using simulated case 1 and case 2 sea-viewing wide field-of-view sensor data, although the retrieval of relative humidity was not successful.
Spectral Anonymization of Data
Lasko, Thomas A.; Vinterbo, Staal A.
2011-01-01
The goal of data anonymization is to allow the release of scientifically useful data in a form that protects the privacy of its subjects. This requires more than simply removing personal identifiers from the data, because an attacker can still use auxiliary information to infer sensitive individual information. Additional perturbation is necessary to prevent these inferences, and the challenge is to perturb the data in a way that preserves its analytic utility. No existing anonymization algorithm provides both perfect privacy protection and perfect analytic utility. We make the new observation that anonymization algorithms are not required to operate in the original vector-space basis of the data, and many algorithms can be improved by operating in a judiciously chosen alternate basis. A spectral basis derived from the data’s eigenvectors is one that can provide substantial improvement. We introduce the term spectral anonymization to refer to an algorithm that uses a spectral basis for anonymization, and we give two illustrative examples. We also propose new measures of privacy protection that are more general and more informative than existing measures, and a principled reference standard with which to define adequate privacy protection. PMID:21373375
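A toy illustration of the central idea, perturbing in a spectral (eigenvector) basis rather than the original vector-space basis, is sketched below. The specific perturbation mechanisms and privacy measures proposed in the paper are not reproduced; simple Gaussian noise on the principal-component scores stands in for them, and all parameters are hypothetical.

```python
import numpy as np

def spectral_anonymize(X, noise_scale=0.1, rng=None):
    """Illustrative spectral anonymization: rotate the data into its eigenvector
    (principal-component) basis, perturb the scores with Gaussian noise scaled to
    each component's spread, and rotate back."""
    rng = rng or np.random.default_rng(0)
    mu = X.mean(axis=0)
    Xc = X - mu
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)    # rows of Vt form the spectral basis
    scores = Xc @ Vt.T                                    # coordinates in the spectral basis
    scores += noise_scale * scores.std(axis=0) * rng.normal(size=scores.shape)
    return scores @ Vt + mu                               # back to the original basis

# toy usage on a 500 x 8 data matrix with correlated columns
X = np.random.default_rng(7).normal(size=(500, 8)) @ np.random.default_rng(8).normal(size=(8, 8))
X_anon = spectral_anonymize(X)
```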
Hazardous gas detection for FTIR-based hyperspectral imaging system using DNN and CNN
NASA Astrophysics Data System (ADS)
Kim, Yong Chan; Yu, Hyeong-Geun; Lee, Jae-Hoon; Park, Dong-Jo; Nam, Hyun-Woo
2017-10-01
Recently, a hyperspectral imaging system (HIS) with a Fourier Transform InfraRed (FTIR) spectrometer has been widely used due to its strengths in detecting gaseous fumes. Even though numerous algorithms for detecting gaseous fumes have already been studied, it is still difficult to detect target gases properly because of atmospheric interference substances and unclear characteristics of low concentration gases. In this paper, we propose detection algorithms for classifying hazardous gases using a deep neural network (DNN) and a convolutional neural network (CNN). In both the DNN and CNN, spectral signal preprocessing, e.g., offset, noise, and baseline removal, are carried out. In the DNN algorithm, the preprocessed spectral signals are used as feature maps of the DNN with five layers, and it is trained by a stochastic gradient descent (SGD) algorithm (50 batch size) and dropout regularization (0.7 ratio). In the CNN algorithm, preprocessed spectral signals are trained with 1 × 3 convolution layers and 1 × 2 max-pooling layers. As a result, the proposed algorithms improve the classification accuracy rate by 1.5% over the existing support vector machine (SVM) algorithm for detecting and classifying hazardous gases.
Spectral multigrid methods for the solution of homogeneous turbulence problems
NASA Technical Reports Server (NTRS)
Erlebacher, G.; Zang, T. A.; Hussaini, M. Y.
1987-01-01
New three-dimensional spectral multigrid algorithms are analyzed and implemented to solve the variable coefficient Helmholtz equation. Periodicity is assumed in all three directions which leads to a Fourier collocation representation. Convergence rates are theoretically predicted and confirmed through numerical tests. Residual averaging results in a spectral radius of 0.2 for the variable coefficient Poisson equation. In general, non-stationary Richardson must be used for the Helmholtz equation. The algorithms developed are applied to the large-eddy simulation of incompressible isotropic turbulence.
Toward Improved Hyperspectral Analysis in Semiarid Systems
NASA Astrophysics Data System (ADS)
Glenn, N. F.; Mitchell, J.
2012-12-01
Idaho State University's Boise Center Aerospace Laboratory (BCAL) has processed and applied hyperspectral data for a variety of biophysical sciences in semiarid systems over the past 10 years. HyMap hyperspectral data have been used in most of these studies, along with AVIRIS, CASI, and PIKA-II data. Our studies began with the detection of individual weed species, such as leafy spurge, corroborated with extensive field analysis, including spectrometer data. Early contributions to the field of hyperspectral analysis included the use of time-series datasets and classification threshold methods for target detection, and subpixel analysis for characterizing weed invasions and post-fire vegetation and soil conditions. Subsequent studies optimized subpixel unmixing performance using spectral subsetting and vegetation abundance investigations. More recent studies have extended the application of hyperspectral data from individual plant species detection to identification of biochemical constituents. We demonstrated field and airborne hyperspectral nitrogen absorption in sagebrush using combinations of data reduction and spectral transformation techniques (i.e., continuum removal, derivative analysis, partial least squares regression). In spite of these and many other successful demonstrations, gaps still exist in effective species-level discrimination due to the high complexity of soil and nonlinear mixing in semiarid shrubland. BCAL studies are currently focusing on complementing narrowband vegetation indices with LiDAR (light detection and ranging, both airborne and ground-based) derivatives to improve vegetation cover predictions. Future combinations of LiDAR and hyperspectral data will involve exploring the full-range spectral information and serve as an integral step in scaling shrub biomass estimates from plot to landscape and regional scales.
Arctic Tundra Vegetation Functional Types Based on Photosynthetic Physiology and Optical Properties
NASA Technical Reports Server (NTRS)
Huemmrich, Karl F.; Gamon, John; Tweedie, Craig; Campbell, Petya P. K.; Landis, David; Middleton, Elizabeth
2012-01-01
Climate change in tundra regions may alter vegetation species composition and ecosystem carbon balance. Remote sensing provides critical tools for monitoring these changes as optical signals provide a way to scale from plot measurements to regional patterns. Gas exchange measurements of pure patches of key vegetation functional types (lichens, mosses, and vascular plants) in sedge tundra at Barrow, AK, show three significantly different values of light use efficiency (LUE) with values of 0.013+/-0.001, 0.0018+/-0.0002, and 0.0012+/-0.0001 mol C/mol absorbed quanta for vascular plants, mosses and lichens, respectively. Further, discriminant analysis of patch reflectance identifies five spectral bands that can separate each vegetation functional type as well as nongreen material (bare soil, standing water, and dead leaves). These results were tested along a 100 m transect where midsummer spectral reflectance and vegetation coverage were measured at one meter intervals. Area-averaged canopy LUE estimated from coverage fractions of the three functional types varied widely, even over short distances. Patch-level statistical discriminant functions applied to in situ hyperspectral reflectance successfully unmixed cover fractions of the vegetation functional types. These functions, developed from the tram data, were applied to 30 m spatial resolution Earth Observing-1 Hyperion imaging spectrometer data to examine regional variability in distribution of the vegetation functional types and from those distributions, the variability of LUE. Across the landscape, there was a fivefold variation in tundra LUE that was correlated to a spectral vegetation index developed to detect vegetation chlorophyll content.
NASA Astrophysics Data System (ADS)
Das, Ranabir; Kumar, Anil
2004-10-01
Quantum information processing has been effectively demonstrated on a small number of qubits by nuclear magnetic resonance. An important subroutine in any computing is the readout of the output. "Spectral implementation", originally suggested by Z. L. Madi, R. Bruschweiler, and R. R. Ernst [J. Chem. Phys. 109, 10603 (1999)], provides an elegant method of readout with the use of an extra "observer" qubit. At the end of computation, detection of the observer qubit provides the output via the multiplet structure of its spectrum. In spectral implementation by a two-dimensional experiment, the observer qubit retains the memory of the input state during computation, thereby providing correlated information on input and output in the same spectrum. Spectral implementation of Grover's search algorithm, approximate quantum counting, a modified version of the Bernstein-Vazirani problem, and Hogg's algorithm are demonstrated here in three- and four-qubit systems.
NASA Astrophysics Data System (ADS)
Chang, Bingguo; Chen, Xiaofei
2018-05-01
Ultrasonography is an important examination for the diagnosis of chronic liver disease. The physician assesses liver indicators and infers the patient's condition from the textual description in the ultrasound report. With the rapid increase in the volume of ultrasound reports, the workload for professional physicians who must manually interpret ultrasound results increases significantly. In this paper, we apply the spectral clustering method to the descriptions in ultrasound reports and automatically generate the ultrasound diagnosis by machine learning. A total of 110 ultrasound examination reports of chronic liver disease were selected as test samples, and the results obtained with spectral clustering were validated and compared with those of the k-means clustering algorithm. The results show that the accuracy of spectral clustering is 92.73%, higher than that of the k-means clustering algorithm, providing powerful ultrasound-assisted diagnosis for patients with chronic liver disease.
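A minimal sketch of this kind of pipeline is shown below: report descriptions are turned into TF-IDF vectors, a cosine-similarity affinity matrix is built, and spectral clustering is run on it. The example strings, feature choices and parameters are hypothetical and do not reflect the authors' preprocessing.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.cluster import SpectralClustering

# hypothetical report descriptions standing in for the 110 real ones
reports = [
    "liver surface smooth, parenchymal echo fine and uniform",
    "liver surface irregular, coarse echotexture, nodular pattern",
    "increased echogenicity consistent with fatty infiltration",
    "coarse echotexture with portal vein dilation",
]

tfidf = TfidfVectorizer().fit_transform(reports)          # bag-of-words TF-IDF features
affinity = cosine_similarity(tfidf)                       # pairwise similarity graph
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(affinity)
print(labels)
```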
Zhang, Shang; Dong, Yuhan; Fu, Hongyan; Huang, Shao-Lun; Zhang, Lin
2018-02-22
The miniaturization of spectrometer can broaden the application area of spectrometry, which has huge academic and industrial value. Among various miniaturization approaches, filter-based miniaturization is a promising implementation by utilizing broadband filters with distinct transmission functions. Mathematically, filter-based spectral reconstruction can be modeled as solving a system of linear equations. In this paper, we propose an algorithm of spectral reconstruction based on sparse optimization and dictionary learning. To verify the feasibility of the reconstruction algorithm, we design and implement a simple prototype of a filter-based miniature spectrometer. The experimental results demonstrate that sparse optimization is well applicable to spectral reconstruction whether the spectra are directly sparse or not. As for the non-directly sparse spectra, their sparsity can be enhanced by dictionary learning. In conclusion, the proposed approach has a bright application prospect in fabricating a practical miniature spectrometer.
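The sparse-optimization step described above can be sketched as follows: with filter responses y = F s and a spectrum assumed sparse in a dictionary D (s = D a), the coefficients a are recovered with an L1-penalised fit. The dictionary, filter matrix and penalty weight below are illustrative assumptions, and the dictionary-learning stage is not reproduced.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_reconstruct(y, F, D, alpha=1e-3):
    """Filter-based spectral reconstruction: measurements y = F @ s, spectrum
    s = D @ a with a sparse in dictionary D; recover a with an L1 penalty."""
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    lasso.fit(F @ D, y)
    return D @ lasso.coef_

# toy usage: 40 broadband filters, 200-point spectrum, Gaussian-atom dictionary
rng = np.random.default_rng(9)
wl = np.arange(200)
D = np.stack([np.exp(-0.5 * ((wl - c) / 6.0) ** 2) for c in range(0, 200, 4)], axis=1)
F = np.abs(rng.normal(size=(40, 200)))                    # filter transmission functions
true = D[:, [10, 30]] @ np.array([1.0, 0.6])              # spectrum sparse in the dictionary
y = F @ true + 0.01 * rng.normal(size=40)
est = sparse_reconstruct(y, F, D)
```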
Surface emissivity and temperature retrieval for a hyperspectral sensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borel, C.C.
1998-12-01
With the growing use of hyper-spectral imagers such as AVIRIS in the visible and short-wave infrared, there is hope of using such instruments in the mid-wave and thermal IR (TIR) some day. The author believes that this will make it possible to move beyond present temperature-emissivity separation algorithms by using methods that take advantage of the many channels available in hyper-spectral imagers. A simple fact exploited by the novel algorithm is that a typical surface emissivity spectrum is rather smooth compared with the spectral features introduced by the atmosphere. Thus, an iterative solution technique can be devised which retrieves emissivity spectra based on spectral smoothness. To make the emissivities realistic, atmospheric parameters are varied using approximations, look-up tables derived from a radiative transfer code, and spectral libraries. One such iterative algorithm solves the radiative transfer equation for the radiance at the sensor for the unknown emissivity and uses the blackbody temperature computed in an atmospheric window as an initial guess for the unknown surface temperature. By varying the surface temperature over a small range, a series of emissivity spectra is calculated, and the one with the smoothest characteristic is chosen. The algorithm was tested on synthetic data using MODTRAN and the Salisbury emissivity database.
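The core smoothness criterion can be illustrated with a deliberately simplified sketch: assuming the atmospheric path radiance and transmittance have already been removed, emissivity candidates are computed for a grid of trial temperatures and the spectrally smoothest one is kept. The Planck helper, band range and roughness measure are assumptions, and the full algorithm's atmospheric look-up tables are omitted.

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

def planck(wl_um, T):
    """Blackbody spectral radiance, W m^-2 sr^-1 um^-1, wavelength in microns."""
    wl = wl_um * 1e-6
    return (2 * H * C**2 / wl**5) / (np.exp(H * C / (wl * KB * T)) - 1.0) * 1e-6

def smoothest_emissivity(wl_um, L_surface, T_grid):
    """Simplified smoothness-based TES: for each trial temperature compute
    emissivity = L_surface / B(T) and keep the spectrally smoothest candidate.
    Atmospheric terms are assumed to have been removed already."""
    best = None
    for T in T_grid:
        eps = L_surface / planck(wl_um, T)
        roughness = np.sum(np.diff(eps, n=2) ** 2)        # penalise spectral wiggles
        if best is None or roughness < best[0]:
            best = (roughness, T, eps)
    return best[1], best[2]                               # retrieved temperature, emissivity

# toy usage: spectrally flat 0.97-emissivity surface at 300 K, 8-12 um band
wl = np.linspace(8, 12, 80)
L = 0.97 * planck(wl, 300.0)
T_hat, eps_hat = smoothest_emissivity(wl, L, np.arange(290.0, 310.0, 0.25))
```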
Alternative techniques for high-resolution spectral estimation of spectrally encoded endoscopy
NASA Astrophysics Data System (ADS)
Mousavi, Mahta; Duan, Lian; Javidi, Tara; Ellerbee, Audrey K.
2015-09-01
Spectrally encoded endoscopy (SEE) is a minimally invasive optical imaging modality capable of fast confocal imaging of internal tissue structures. Modern SEE systems use coherent sources to image deep within the tissue and data are processed similarly to optical coherence tomography (OCT); however, standard processing of SEE data via the Fast Fourier Transform (FFT) leads to degradation of the axial resolution as the bandwidth of the source shrinks, resulting in a well-known trade-off between speed and axial resolution. Recognizing the limitation of the FFT as a general spectral estimation algorithm that only takes into account samples collected by the detector, in this work we investigate alternative high-resolution spectral estimation algorithms that exploit information such as sparsity and the general region position of the bulk sample to improve the axial resolution of processed SEE data. We validate the performance of these algorithms using both MATLAB simulations and analysis of experimental results generated from a home-built OCT system to simulate an SEE system with variable scan rates. Our results open a new door towards using non-FFT algorithms to generate higher quality (i.e., higher resolution) SEE images at correspondingly fast scan rates, resulting in systems that are more accurate and more comfortable for patients due to the reduced imaging time.
NASA Technical Reports Server (NTRS)
Gao, Bo-Cai; Montes, Marcos J.; Davis, Curtiss O.
2003-01-01
This SIMBIOS contract supports several activities over its three-year time-span. These include certain computational aspects of atmospheric correction, including the modification of our hyperspectral atmospheric correction algorithm Tafkaa for various multi-spectral instruments, such as SeaWiFS, MODIS, and GLI. Additionally, since absorbing aerosols are becoming common in many coastal areas, we are making the model calculations to incorporate various absorbing aerosol models into tables used by our Tafkaa atmospheric correction algorithm. Finally, we have developed the algorithms to use MODIS data to characterize thin cirrus effects on aerosol retrieval.
Peng, Tao; Bonamy, Ghislain M C; Glory-Afshar, Estelle; Rines, Daniel R; Chanda, Sumit K; Murphy, Robert F
2010-02-16
Many proteins or other biological macromolecules are localized to more than one subcellular structure. The fraction of a protein in different cellular compartments is often measured by colocalization with organelle-specific fluorescent markers, requiring availability of fluorescent probes for each compartment and acquisition of images for each in conjunction with the macromolecule of interest. Alternatively, tailored algorithms allow finding particular regions in images and quantifying the amount of fluorescence they contain. Unfortunately, this approach requires extensive hand-tuning of algorithms and is often cell type-dependent. Here we describe a machine-learning approach for estimating the amount of fluorescent signal in different subcellular compartments without hand tuning, requiring only the acquisition of separate training images of markers for each compartment. In testing on images of cells stained with mixtures of probes for different organelles, we achieved a 93% correlation between estimated and expected amounts of probes in each compartment. We also demonstrated that the method can be used to quantify drug-dependent protein translocations. The method enables automated and unbiased determination of the distributions of protein across cellular compartments, and will significantly improve imaging-based high-throughput assays and facilitate proteome-scale localization efforts.
Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A
2014-05-19
Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identification of the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions, based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide a quantitative description of tissue biochemical composition.
Spectral methods to detect surface mines
NASA Astrophysics Data System (ADS)
Winter, Edwin M.; Schatten Silvious, Miranda
2008-04-01
Over the past five years, advances have been made in the spectral detection of surface mines under minefield detection programs at the U. S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD). The problem of detecting surface land mines ranges from the relatively simple, the detection of large anti-vehicle mines on bare soil, to the very difficult, the detection of anti-personnel mines in thick vegetation. While spatial and spectral approaches can be applied to the detection of surface mines, spatial-only detection requires many pixels-on-target such that the mine is actually imaged and shape-based features can be exploited. This method is unreliable in vegetated areas because only part of the mine may be exposed, while spectral detection is possible without the mine being resolved. At NVESD, hyperspectral and multi-spectral sensors throughout the reflection and thermal spectral regimes have been applied to the mine detection problem. Data has been collected on mines in forest and desert regions and algorithms have been developed both to detect the mines as anomalies and to detect the mines based on their spectral signature. In addition to the detection of individual mines, algorithms have been developed to exploit the similarities of mines in a minefield to improve their detection probability. In this paper, the types of spectral data collected over the past five years will be summarized along with the advances in algorithm development.
Li, Qingli; Zhang, Jingfa; Wang, Yiting; Xu, Guoteng
2009-12-01
A molecular spectral imaging system has been developed based on microscopy and spectral imaging technology. The system is capable of acquiring molecular spectral images from 400 nm to 800 nm with 2 nm wavelength increments. The basic principles, instrumental systems, and system calibration method as well as its applications for the calculation of the stain-uptake by tissues are introduced. As a case study, the system is used for determining the pathogenesis of diabetic retinopathy and evaluating the therapeutic effects of erythropoietin. Some molecular spectral images of retinal sections of normal, diabetic, and treated rats were collected and analyzed. The typical transmittance curves of positive spots stained for albumin and advanced glycation end products are retrieved from molecular spectral data with the spectral response calibration algorithm. To explore and evaluate the protective effect of erythropoietin (EPO) on retinal albumin leakage of streptozotocin-induced diabetic rats, an algorithm based on Beer-Lambert's law is presented. The algorithm can assess the uptake by histologic retinal sections of stains used in quantitative pathology to label albumin leakage and advanced glycation end products formation. Experimental results show that the system is helpful for the ophthalmologist to reveal the pathogenesis of diabetic retinopathy and explore the protective effect of erythropoietin on retinal cells of diabetic rats. It also highlights the potential of molecular spectral imaging technology to provide more effective and reliable diagnostic criteria in pathology.
Demosaicking for full motion video 9-band SWIR sensor
NASA Astrophysics Data System (ADS)
Kanaev, Andrey V.; Rawhouser, Marjorie; Kutteruf, Mary R.; Yetzbacher, Michael K.; DePrenger, Michael J.; Novak, Kyle M.; Miller, Corey A.; Miller, Christopher W.
2014-05-01
Short wave infrared (SWIR) spectral imaging systems are vital for Intelligence, Surveillance, and Reconnaissance (ISR) applications because of their abilities to autonomously detect targets and classify materials. Typically, spectral imagers are incapable of providing Full Motion Video (FMV) because of their reliance on line scanning. We enable FMV capability for a SWIR multi-spectral camera by creating a repeating pattern of 3x3 spectral filters on a staring focal plane array (FPA). In this paper we present the imagery from an FMV SWIR camera with nine discrete bands and discuss image processing algorithms necessary for its operation. The main task of image processing in this case is demosaicking of the spectral bands, i.e., reconstructing full spectral images with the original FPA resolution from spatially subsampled and incomplete spectral data acquired with the chosen filter array pattern. To the best of the authors' knowledge, demosaicking algorithms for nine or more equally sampled bands have not been reported before. Moreover, all existing algorithms developed for demosaicking visible color filter arrays with fewer than nine colors assume either certain relationships between the visible colors, which are not valid for SWIR imaging, or the presence of one color band with a higher sampling rate compared to the rest of the bands, which does not conform to our spectral filter pattern. We will discuss and present results for two novel approaches to demosaicking: interpolation using multi-band edge information and application of multi-frame super-resolution to single-frame resolution enhancement of multi-spectral spatially multiplexed images.
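As a baseline against which edge-guided or super-resolution demosaicking would be compared, the sketch below fills in each of the nine bands of a 3x3 spectral filter array by normalised convolution (local weighted averaging of the pixels that carry that band). The pattern, kernel size and image size are illustrative; this is not the algorithm developed in the paper.

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_9band(raw, pattern):
    """Baseline demosaicking of a 3x3 spectral filter array: for each of the 9
    bands, keep the pixels carrying that band and fill the rest by normalised
    convolution. raw: (H, W) mosaic; pattern: (3, 3) array of band ids 0..8."""
    H, W = raw.shape
    band_of = np.tile(pattern, (H // 3 + 1, W // 3 + 1))[:H, :W]   # band id of every pixel
    kernel = np.ones((5, 5))
    cube = np.zeros((9, H, W))
    for b in range(9):
        mask = (band_of == b).astype(float)
        num = convolve2d(raw * mask, kernel, mode="same", boundary="symm")
        den = convolve2d(mask, kernel, mode="same", boundary="symm")
        cube[b] = num / np.maximum(den, 1e-12)            # local average of band-b samples
    return cube

# toy usage on a synthetic 90x90 mosaic
pattern = np.arange(9).reshape(3, 3)
raw = np.random.default_rng(10).random((90, 90))
cube = demosaic_9band(raw, pattern)
```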
GIFTS SM EDU Level 1B Algorithms
NASA Technical Reports Server (NTRS)
Tian, Jialin; Gazarik, Michael J.; Reisse, Robert A.; Johnson, David G.
2007-01-01
The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high-resolution spectral imager designed to measure infrared (IR) radiances using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three focal plane arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes the GIFTS SM EDU Level 1B algorithms involved in the calibration. The GIFTS Level 1B calibration procedures can be subdivided into four blocks. In the first block, the measured raw interferograms are first corrected for the detector nonlinearity distortion, followed by the complex filtering and decimation procedure. In the second block, a phase correction algorithm is applied to the filtered and decimated complex interferograms. The resulting imaginary part of the spectrum contains only the noise component of the uncorrected spectrum. Additional random noise reduction can be accomplished by applying a spectral smoothing routine to the phase-corrected spectrum. The phase correction and spectral smoothing operations are performed on a set of interferogram scans for both ambient and hot blackbody references. To continue with the calibration, we compute the spectral responsivity based on the previous results, from which the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. We can now estimate the noise equivalent spectral radiance (NESR) from the calibrated ABB and HBB spectra. The correction schemes that compensate for the fore-optics offsets and off-axis effects are also implemented. In the third block, we develop an efficient method of generating pixel performance assessments. In addition, a random pixel selection scheme is designed based on the pixel performance evaluation. Finally, in the fourth block, the single pixel algorithms are applied to the entire FPA.
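One step of the chain described above, the two-point radiometric calibration against the ambient and hot blackbody references, can be sketched as follows, assuming a linear detector response per spectral channel. The nonlinearity correction, phase correction, spectral smoothing and off-axis corrections are not reproduced, and the Planck helper, temperatures and wavenumber grid are illustrative.

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

def planck_wn(nu_cm, T):
    """Blackbody spectral radiance per wavenumber, W m^-2 sr^-1 (cm^-1)^-1."""
    nu = nu_cm * 100.0                                    # cm^-1 to m^-1
    return 100.0 * 2 * H * C**2 * nu**3 / (np.exp(H * C * nu / (KB * T)) - 1.0)

def two_point_calibrate(C_scene, C_abb, C_hbb, nu_cm, T_abb, T_hbb):
    """Two-point (ambient/hot blackbody) radiometric calibration: estimate the
    spectral responsivity from the two references, then calibrate the scene."""
    B_abb, B_hbb = planck_wn(nu_cm, T_abb), planck_wn(nu_cm, T_hbb)
    resp = (C_hbb - C_abb) / (B_hbb - B_abb)              # counts per radiance unit
    return np.real((C_scene - C_abb) / resp) + B_abb      # calibrated scene radiance

# toy usage: 700-1300 cm^-1, simulated linear instrument with gain g and offset o
nu = np.linspace(700, 1300, 500)
g, o = 2.5e3, 1.0
scene_true = planck_wn(nu, 285.0)
C_abb, C_hbb = g * planck_wn(nu, 300.0) + o, g * planck_wn(nu, 340.0) + o
C_scene = g * scene_true + o
print(np.allclose(two_point_calibrate(C_scene, C_abb, C_hbb, nu, 300.0, 340.0), scene_true))
```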
A Framework of Hyperspectral Image Compression using Neural Networks
Masalmah, Yahya M.; Martínez Nieves, Christian; Rivera Soto, Rafael; ...
2015-01-01
Hyperspectral image analysis has gained great attention due to its wide range of applications. Hyperspectral images provide a vast amount of information about underlying objects in an image by using a large range of the electromagnetic spectrum for each pixel. However, since the same image is taken multiple times using distinct electromagnetic bands, the size of such images tends to be significant, which leads to greater processing requirements. The aim of this paper is to present a proposed framework for image compression and to study the possible effects of spatial compression on the quality of unmixing results. Image compression allows us to reduce the dimensionality of an image while still preserving most of the original information, which could lead to faster image processing. Lastly, this paper presents preliminary results of different training techniques used in an Artificial Neural Network (ANN) based compression algorithm.
Simulated altitude exposure assessment by hyperspectral imaging
NASA Astrophysics Data System (ADS)
Calin, Mihaela Antonina; Macovei, Adrian; Miclos, Sorin; Parasca, Sorin Viorel; Savastru, Roxana; Hristea, Razvan
2017-05-01
Testing the human body's reaction to hypoxia (including the one generated by high altitude) is important in aeronautic medicine. This paper presents a method of monitoring blood oxygenation during experimental hypoxia using hyperspectral imaging (HSI) and a spectral unmixing model based on a modified Beer-Lambert law. A total of 20 healthy volunteers (males) aged 25 to 60 years were included in this study. A line-scan HSI system was used to acquire images of the faces of the subjects. The method generated oxyhemoglobin and deoxyhemoglobin distribution maps from the foreheads of the subjects at 5 and 10 min of hypoxia and after recovery in a high oxygen breathing mixture. The method also generated oxygen saturation maps that were validated using pulse oximetry. An interesting pattern of desaturation on the forehead was discovered during the study, showing one of the advantages of using HSI for skin oxygenation monitoring in hypoxic conditions. This could bring new insight into the physiological response to high altitude and may become a step forward in air crew testing.
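To make the unmixing model concrete, the sketch below shows a least-squares fit of a modified Beer-Lambert model at one pixel, yielding oxy-/deoxy-hemoglobin estimates and an oxygen saturation value. It is a minimal illustration under the usual linear assumptions, not the authors' processing chain; the extinction-coefficient inputs, the constant-offset term, and the function name are assumptions.

```python
import numpy as np

def unmix_hemoglobin(attenuation, eps_hbo2, eps_hb, pathlength=1.0):
    """Least-squares unmixing of oxy-/deoxy-hemoglobin from a modified Beer-Lambert model.

    attenuation : (n_wavelengths,) measured absorbance, e.g. A = -log10(R / R_reference)
    eps_hbo2, eps_hb : (n_wavelengths,) extinction coefficients of HbO2 and Hb
    Model: A(lambda) ~ pathlength * (eps_HbO2 * c_HbO2 + eps_Hb * c_Hb) + offset,
    solved for the two concentrations and a wavelength-independent offset.
    """
    E = np.column_stack([eps_hbo2 * pathlength,
                         eps_hb * pathlength,
                         np.ones_like(eps_hb)])
    coeffs, *_ = np.linalg.lstsq(E, attenuation, rcond=None)
    c_hbo2, c_hb, _offset = coeffs
    sto2 = c_hbo2 / (c_hbo2 + c_hb)          # tissue oxygen saturation estimate
    return c_hbo2, c_hb, sto2
```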
Near-infrared imaging spectroscopy for counterfeit drug detection
NASA Astrophysics Data System (ADS)
Arnold, Thomas; De Biasio, Martin; Leitner, Raimund
2011-06-01
Pharmaceutical counterfeiting is a significant issue in the healthcare community as well as for the pharmaceutical industry worldwide. The use of counterfeit medicines can result in treatment failure or even death. A rapid screening technique such as near infrared (NIR) spectroscopy could aid in the search for and identification of counterfeit drugs. This work presents a comparison of two laboratory NIR imaging systems and the chemometric analysis of the acquired spectroscopic image data. The first imaging system utilizes a NIR liquid crystal tuneable filter and is designed for the investigation of stationary objects. The second imaging system utilizes a NIR imaging spectrograph and is designed for the fast analysis of moving objects on a conveyor belt. Several drugs in the form of tablets and capsules were analyzed. Spectral unmixing techniques were applied to the mixed reflectance spectra to identify constituent parts of the investigated drugs. The results show that NIR spectroscopic imaging can be used for contact-less detection and identification of a variety of counterfeit drugs.
Mid Term Progress Report: Desertification Assessment and Monitoring in China Based on Remote Sensing
NASA Astrophysics Data System (ADS)
Gao, Zhihai; del Barrio, Gabriel; Li, Xiaosong; Wang, Bengyu; Puigdefabregas, Juan; Sanjuan, Maria E.; Bai, Lina; Wu, Junjun; Sun, Bin; Li, Changlong
2014-11-01
The objective of Dragon 3 Project 10367 is the development of techniques for desertification assessment and monitoring in China using remote sensing data in combination with climate and environment-related data. The main achievements acquired since 2012 could be summarized as follows: (1) Photosynthetic vegetation (PV) and non-photosynthetic vegetation (NPV) fractions were retrieved separately using the Auto Monte Carlo Unmixing technique (AutoMCU), based on BJ-1 data and a field-measured spectral library. (2) The accuracy of sandy land classification was as high as 81.52% when the object-oriented method and Support Vector Machine (SVM) classifiers were used. (3) A new monthly net primary productivity (NPP) dataset from 2002 to 2010 for the whole of China was established with Envisat-MERIS fraction of absorbed photosynthetically active radiation (FPAR) data. (4) The 2dRUE proved to be a good indicator for land degradation, based on which land degradation status in the general potential extent of desertification in China (PEDC) was assessed preliminarily.
NASA Technical Reports Server (NTRS)
Asner, Gregory P.; Heidebrecht, Kathleen B.
2001-01-01
Remote sensing of vegetation cover and condition is critically needed to understand the impacts of land use and climate variability in arid and semi-arid regions. However, remote sensing of vegetation change in these environments is difficult for several reasons. First, individual plant canopies are typically small and do not reach the spatial scale of typical Landsat-like satellite image pixels. Second, the phenological status and subsequent dry carbon (or non-photosynthetic) fraction of plant canopies varies dramatically in both space and time throughout arid and semi-arid regions. Detection of only the 'green' part of the vegetation using a metric such as the normalized difference vegetation index (NDVI) thus yields limited information on the presence and condition of plants in these ecosystems. Monitoring of both photosynthetic vegetation (PV) and non-photosynthetic vegetation (NPV) is needed to understand a range of ecosystem characteristics including vegetation presence, cover and abundance, physiological and biogeochemical functioning, drought severity, fire fuel load, disturbance events and recovery from disturbance.
NASA Astrophysics Data System (ADS)
Zhao, H.; Hao, Y.; Liu, X.; Hou, M.; Zhao, X.
2018-04-01
Hyperspectral remote sensing is a completely non-invasive technology for the measurement of cultural relics, and has been successfully applied in the identification and analysis of pigments in Chinese historical paintings. Although mixing of pigments is very common in Chinese historical paintings, quantitative analysis of mixed pigments in ancient paintings remains unsolved. In this research, we took two typical mineral pigments, vermilion and stone yellow, as examples, made precisely mixed samples from these two pigments, and measured their spectra in the laboratory. For the mixed spectra, both the fully constrained least squares (FCLS) method and derivative of ratio spectroscopy (DRS) were applied. Experimental results showed that the mixed spectra of vermilion and stone yellow had strong nonlinear mixing characteristics, but at some bands linear unmixing could also achieve satisfactory results. DRS using strongly linear bands reached much higher accuracy than FCLS using the full set of bands.
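For readers unfamiliar with FCLS, the sketch below shows one standard way to compute fully constrained (non-negative, sum-to-one) abundances, using the well-known trick of appending a heavily weighted sum-to-one row and solving with NNLS. It is a generic FCLS sketch under a linear mixing assumption, not the authors' implementation; the endmember-matrix layout and the weight delta are illustrative.

```python
import numpy as np
from scipy.optimize import nnls

def fcls_unmix(spectrum, endmembers, delta=1e3):
    """Fully constrained least-squares (FCLS) abundance estimation.

    spectrum   : (n_bands,) mixed reflectance measurement
    endmembers : (n_bands, n_endmembers) pure-pigment spectra (e.g. vermilion, stone yellow)
    Sum-to-one is enforced approximately by appending a heavily weighted row of
    ones; non-negativity comes from NNLS.
    """
    E = np.vstack([endmembers, delta * np.ones(endmembers.shape[1])])
    y = np.append(spectrum, delta)
    abundances, _residual = nnls(E, y)
    return abundances
```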
Villiger, Martin; Zhang, Ellen Ziyi; Nadkarni, Seemantini K.; Oh, Wang-Yuhl; Vakoc, Benjamin J.; Bouma, Brett E.
2013-01-01
Polarization mode dispersion (PMD) has been recognized as a significant barrier to sensitive and reproducible birefringence measurements with fiber-based, polarization-sensitive optical coherence tomography systems. Here, we present a signal processing strategy that reconstructs the local retardation robustly in the presence of system PMD. The algorithm uses a spectral binning approach to limit the detrimental impact of system PMD and benefits from the final averaging of the PMD-corrected retardation vectors of the spectral bins. The algorithm was validated with numerical simulations and experimental measurements of a rubber phantom. When applied to the imaging of human cadaveric coronary arteries, the algorithm was found to yield a substantial improvement in the reconstructed birefringence maps. PMID:23938487
Convex Accelerated Maximum Entropy Reconstruction
Worley, Bradley
2016-01-01
Maximum entropy (MaxEnt) spectral reconstruction methods provide a powerful framework for spectral estimation of nonuniformly sampled datasets. Many methods exist within this framework, usually defined based on the magnitude of a Lagrange multiplier in the MaxEnt objective function. An algorithm is presented here that utilizes accelerated first-order convex optimization techniques to rapidly and reliably reconstruct nonuniformly sampled NMR datasets using the principle of maximum entropy. This algorithm – called CAMERA for Convex Accelerated Maximum Entropy Reconstruction Algorithm – is a new approach to spectral reconstruction that exhibits fast, tunable convergence in both constant-aim and constant-lambda modes. A high-performance, open source NMR data processing tool is described that implements CAMERA, and brief comparisons to existing reconstruction methods are made on several example spectra. PMID:26894476
NASA Astrophysics Data System (ADS)
Xie, ChengJun; Xu, Lin
2008-03-01
This paper presents an algorithm based on a mixing transform with wave-band grouping to eliminate spectral redundancy. The algorithm adapts to differences in correlation between different spectral-band images, and it still works well when the number of bands is not a power of 2. Spectral redundancy is removed using a non-boundary-extension CDF(2,2) DWT together with a subtraction mixing transform, spatial redundancy is removed with the CDF(2,2) DWT, and SPIHT+CABAC is employed for compression coding; experiments show that satisfactory lossless compression can be achieved. Using the hyperspectral image Canal from the American JPL laboratory as the test data set, when the band number is not a power of 2 the lossless compression results of this algorithm are much better than those obtained by JPEG-LS, WinZip, ARJ, DPCM, the research achievements of a research team of the Chinese Academy of Sciences, Minimum Spanning Tree, and Near Minimum Spanning Tree; on average, the compression ratio of this algorithm exceeds those of the above algorithms by 41%, 37%, 35%, 29%, 16%, 10%, and 8%, respectively. When the band number is a power of 2, for 128 frames of the image Canal, groupings of 8, 16, and 32 bands were tested; considering factors such as compression storage complexity, the type of wave band, and the compression effect, we suggest using 8 bands per group to achieve a better compression effect. The algorithm also has advantages in operation speed and ease of hardware realization.
NASA Astrophysics Data System (ADS)
Zhou, Xiran; Liu, Jun; Liu, Shuguang; Cao, Lei; Zhou, Qiming; Huang, Huawen
2014-02-01
High spatial resolution and spectral fidelity are basic standards for evaluating an image fusion algorithm. Numerous fusion methods for remote sensing images have been developed. Some of these methods are based on the intensity-hue-saturation (IHS) transform and the generalized IHS (GIHS), which may cause serious spectral distortion. Spectral distortion in the GIHS is proven to result from changes in saturation during fusion. Therefore, reducing such changes can achieve high spectral fidelity. A GIHS-based spectral preservation fusion method that can theoretically reduce spectral distortion is proposed in this study. The proposed algorithm consists of two steps. The first step is spectral modulation (SM), which uses the Gaussian function to extract spatial details and conduct SM of multispectral (MS) images. This method yields a desirable visual effect without requiring histogram matching between the panchromatic image and the intensity of the MS image. The second step uses the Gaussian convolution function to restore lost edge details during SM. The proposed method is proven effective and shown to provide better results compared with other GIHS-based methods.
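To illustrate the GIHS idea that underlies the proposed method, the sketch below injects Gaussian-extracted spatial detail from the Pan image equally into every MS band. It is only a generic GIHS-style baseline, not the authors' two-step spectral-modulation and edge-restoration scheme; the sigma value and the clipping are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gihs_fuse(ms, pan, sigma=2.0):
    """Generic GIHS-style detail injection (not the authors' exact SM scheme).

    ms  : (H, W, B) multispectral image, already upsampled to the Pan grid
    pan : (H, W) panchromatic image
    Spatial detail is taken as the difference between the Pan image and its
    Gaussian low-pass version, then added equally to every MS band (GIHS).
    """
    detail = pan - gaussian_filter(pan, sigma)   # high-frequency spatial detail
    fused = ms + detail[:, :, None]              # inject the same detail into each band
    return np.clip(fused, 0.0, None)
```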
Method to analyze remotely sensed spectral data
Stork, Christopher L [Albuquerque, NM]; Van Benthem, Mark H [Middletown, DE]
2009-02-17
A fast and rigorous multivariate curve resolution (MCR) algorithm is applied to remotely sensed spectral data. The algorithm is applicable in the solar-reflective spectral region, comprising the visible to the shortwave infrared (ranging from approximately 0.4 to 2.5 μm), midwave infrared, and thermal emission spectral region, comprising the thermal infrared (ranging from approximately 8 to 15 μm). For example, employing minimal a priori knowledge, notably non-negativity constraints on the extracted endmember profiles and a constant abundance constraint for the atmospheric upwelling component, MCR can be used to successfully compensate thermal infrared hyperspectral images for atmospheric upwelling and, thereby, transmittance effects. Further, MCR can accurately estimate the relative spectral absorption coefficients and thermal contrast distribution of a gas plume component near the minimum detectable quantity.
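As a reference point for how MCR works in general, the sketch below alternates least-squares updates of abundances and endmember spectra with non-negativity enforced by clipping. It is a bare-bones MCR-ALS sketch, not the patented fast algorithm, and it omits the constant-abundance atmospheric constraint; the random initialization and iteration count are illustrative.

```python
import numpy as np

def mcr_als(Y, n_components, n_iter=50, seed=0):
    """Minimal multivariate curve resolution via alternating least squares.

    Y : (n_pixels, n_bands) spectral data matrix, modeled as Y ~ C @ S.T with
    non-negative abundances C and non-negative endmember spectra S.
    Non-negativity is crudely enforced by clipping after each update.
    """
    rng = np.random.default_rng(seed)
    S = rng.random((Y.shape[1], n_components))
    for _ in range(n_iter):
        C = np.clip(Y @ S @ np.linalg.pinv(S.T @ S), 0.0, None)      # abundances given spectra
        S = np.clip(Y.T @ C @ np.linalg.pinv(C.T @ C), 0.0, None)    # spectra given abundances
    return C, S
```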
Rocchini, Duccio
2009-01-01
Measuring heterogeneity in satellite imagery is an important task. Most measures of spectral diversity have been based on Shannon information theory. However, this approach does not inherently address different scales, ranging from local (hereafter referred to as alpha diversity) to global scales (gamma diversity). The aim of this paper is to propose a method for measuring spectral heterogeneity at multiple scales based on rarefaction curves. An algorithmic solution of rarefaction applied to image pixel values (Digital Numbers, DNs) is provided and discussed. PMID:22389600
NASA Technical Reports Server (NTRS)
Clark, Roger N.; Swayze, Gregg A.; Gallagher, Andrea
1992-01-01
The sedimentary sections exposed in the Canyonlands and Arches National Parks region of Utah (generally referred to as 'Canyonlands') consist of sandstones, shales, limestones, and conglomerates. Reflectance spectra of weathered surfaces of rocks from these areas show two components: (1) variations in spectrally detectable mineralogy, and (2) variations in the relative ratios of the absorption bands between minerals. Both types of information can be used together to map each major lithology, and the Clark spectral-feature mapping algorithm is applied to this task.
Leibig, Christian; Wachtler, Thomas; Zeck, Günther
2016-09-15
Unsupervised identification of action potentials in multi-channel extracellular recordings, in particular from high-density microelectrode arrays with thousands of sensors, is an unresolved problem. While independent component analysis (ICA) achieves rapid unsupervised sorting, it ignores the convolutive structure of extracellular data, thus limiting the unmixing to a subset of neurons. Here we present a spike sorting algorithm based on convolutive ICA (cICA) to retrieve a larger number of accurately sorted neurons than with instantaneous ICA while accounting for signal overlaps. Spike sorting was applied to datasets with varying signal-to-noise ratios (SNR: 3-12) and 27% spike overlaps, sampled at either 11.5 or 23 kHz on 4365 electrodes. We demonstrate how the instantaneity assumption in ICA-based algorithms has to be relaxed in order to improve the spike sorting performance for high-density microelectrode array recordings. Reformulating the convolutive mixture as an instantaneous mixture by modeling several delayed samples jointly is necessary to increase the signal-to-noise ratio. Our results emphasize that different cICA algorithms are not equivalent. Spike sorting performance was assessed with ground-truth data generated from experimentally derived templates. The presented spike sorter was able to extract ≈90% of the true spike trains with an error rate below 2%. It was superior to two alternative (c)ICA methods (≈80% accurately sorted neurons) and comparable to a supervised sorting. Our new algorithm represents a fast solution to overcome the current bottleneck in spike sorting of large datasets generated by simultaneous recording with thousands of electrodes. Copyright © 2016 Elsevier B.V. All rights reserved.
Fusion of spectral models for dynamic modeling of sEMG and skeletal muscle force.
Potluri, Chandrasekhar; Anugolu, Madhavi; Chiu, Steve; Urfer, Alex; Schoen, Marco P; Naidu, D Subbaram
2012-01-01
In this paper, we present a method of combining spectral models using a Kullback Information Criterion (KIC) data fusion algorithm. Surface Electromyographic (sEMG) signals and their corresponding skeletal muscle force signals are acquired from three sensors and pre-processed using a Half-Gaussian filter and a Chebyshev Type-II filter, respectively. Spectral models - Spectral Analysis (SPA), Empirical Transfer Function Estimate (ETFE), and Spectral Analysis with Frequency Dependent Resolution (SPFRD) - are extracted from the sEMG signals as input, with skeletal muscle force as the output signal. These signals are then employed in a System Identification (SI) routine to establish the dynamic models relating the input and output. After the individual models are extracted, they are fused by a probability-based KIC fusion algorithm. The results show that the SPFRD spectral models perform better than the SPA and ETFE models in modeling the frequency content of the sEMG/skeletal muscle force data.
NASA Astrophysics Data System (ADS)
Schott, John R.; Brown, Scott D.; Raqueno, Rolando V.; Gross, Harry N.; Robinson, Gary
1999-01-01
The need for robust image data sets for algorithm development and testing has prompted the consideration of synthetic imagery as a supplement to real imagery. The unique ability of synthetic image generation (SIG) tools to supply per-pixel truth allows algorithm writers to test difficult scenarios that would require expensive collection and instrumentation efforts. In addition, SIG data products can supply the user with 'actual' truth measurements of the entire image area that are not subject to measurement error, thereby allowing the user to more accurately evaluate the performance of their algorithm. Advanced algorithms place a high demand on synthetic imagery to reproduce both the spectro-radiometric and spatial character observed in real imagery. This paper describes a synthetic image generation model that strives to include the radiometric processes that affect spectral image formation and capture. In particular, it addresses recent advances in SIG modeling that attempt to capture the spatial/spectral correlation inherent in real images. The model is capable of simultaneously generating imagery from a wide range of sensors, allowing it to generate daylight, low-light-level and thermal image inputs for broadband, multi- and hyper-spectral exploitation algorithms.
Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images
NASA Astrophysics Data System (ADS)
Awumah, Anna; Mahanti, Prasun; Robinson, Mark
2016-10-01
Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion for planetary images are rare, although image fusion is well-known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm results in a high-spatial-quality product while the Wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-Wavelet image fusion algorithm when applied to LROC MS images. The hybrid method provides the best HRMS product - both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and John L. Van Genderen. "Review article: multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Yun. "Understanding image fusion." Photogramm. Eng. Remote Sens. 70.6 (2004): 657-661. [3] Mahanti, Prasun, et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." Archives, XXIII ISPRS Congress Archives (2016).
Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Neal, Maxwell; Cashmere, David J; Germain, Anne; Reifman, Jaques
2018-02-01
Electroencephalography (EEG) recordings during sleep are often contaminated by muscle and ocular artefacts, which can affect the results of spectral power analyses significantly. However, the extent to which these artefacts affect EEG spectral power across different sleep states has not been quantified explicitly. Consequently, the effectiveness of automated artefact-rejection algorithms in minimizing these effects has not been characterized fully. To address these issues, we analysed standard 10-channel EEG recordings from 20 subjects during one night of sleep. We compared their spectral power when the recordings were contaminated by artefacts and after we removed them by visual inspection or by using automated artefact-rejection algorithms. During both rapid eye movement (REM) and non-REM (NREM) sleep, muscle artefacts contaminated no more than 5% of the EEG data across all channels. However, they corrupted delta, beta and gamma power levels substantially by up to 126, 171 and 938%, respectively, relative to the power level computed from artefact-free data. Although ocular artefacts were infrequent during NREM sleep, they affected up to 16% of the frontal and temporal EEG channels during REM sleep, primarily corrupting delta power by up to 33%. For both REM and NREM sleep, the automated artefact-rejection algorithms matched power levels to within ~10% of the artefact-free power level for each EEG channel and frequency band. In summary, although muscle and ocular artefacts affect only a small fraction of EEG data, they affect EEG spectral power significantly. This suggests the importance of using artefact-rejection algorithms before analysing EEG data. © 2017 European Sleep Research Society.
Modified fuzzy c-means applied to a Bragg grating-based spectral imager for material clustering
NASA Astrophysics Data System (ADS)
Rodríguez, Aida; Nieves, Juan Luis; Valero, Eva; Garrote, Estíbaliz; Hernández-Andrés, Javier; Romero, Javier
2012-01-01
We have modified the fuzzy c-means algorithm for an application related to the segmentation of hyperspectral images. The classical fuzzy c-means algorithm uses the Euclidean distance for computing sample membership to each cluster. We have introduced a different distance metric, the Spectral Similarity Value (SSV), in order to have a more suitable similarity measure for reflectance information. The SSV distance metric considers both magnitude difference (through the Euclidean distance) and spectral shape (through the Pearson correlation). Experiments confirmed that the introduction of this metric improves the quality of hyperspectral image segmentation, creating spectrally denser clusters and increasing the number of correctly classified pixels.
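The sketch below shows one common formulation of the SSV metric, combining a band-averaged Euclidean term with a correlation term. The exact normalization used in the paper may differ, so this is only an illustration of the idea; the function name is an assumption.

```python
import numpy as np

def ssv_distance(x, v):
    """Spectral Similarity Value between a pixel spectrum x and a cluster centre v.

    One common formulation: SSV = sqrt(d_e^2 + (1 - r^2)), where d_e is a
    band-averaged Euclidean distance and r is the Pearson correlation between
    the two spectra.
    """
    d_e = np.sqrt(np.mean((x - v) ** 2))      # magnitude difference
    r = np.corrcoef(x, v)[0, 1]               # spectral-shape similarity
    return np.sqrt(d_e ** 2 + (1.0 - r ** 2))
```

Inside fuzzy c-means, a distance of this kind simply replaces the Euclidean term in the membership and cluster-centre updates.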
GIFTS SM EDU Radiometric and Spectral Calibrations
NASA Technical Reports Server (NTRS)
Tian, J.; Reisse, R. A.; Johnson, D. G.; Gazarik, J. J.
2007-01-01
The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiance using a Fourier transform spectrometer (FTS). The GIFTS instrument gathers measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes the processing algorithms involved in the calibration. The calibration procedures can be subdivided into three categories: the pre-calibration stage, the calibration stage, and finally, the post-calibration stage. Detailed derivations for each stage are presented in this paper.
NASA Astrophysics Data System (ADS)
Mahbub, Saabah B.; Succer, Peter; Gosnell, Martin E.; Anwaer, Ayad G.; Herbert, Benjamin; Vesey, Graham; Goldys, Ewa M.
2016-03-01
Extracting biochemical information from tissue autofluorescence is a promising approach to non-invasively monitor disease treatments at a cellular level, without using any external biomarkers. Our recently developed unsupervised hyperspectral unmixing by Dependent Component Analysis (DECA) provides robust and detailed metabolic information with proper account of intrinsic cellular heterogeneity. Moreover, this method is compatible with established methods of fluorescent biomarker labelling. Recently, adipose-derived stem cell (ADSC)-based therapies have been introduced for treating different diseases in animals and humans. ADSCs have shown promise in regenerative treatments for osteoarthritis and other bone and joint disorders. One of the mechanisms of their action is an anti-inflammatory effect within osteoarthritic joints, which aids the regeneration of cartilage. These therapeutic effects are known to be driven by secretions of different cytokines from the ADSCs. We have been using these hyperspectral unmixing techniques to study, in vitro, the effects of ADSC-derived cytokine-rich secretions on a cartilage chip in both human and bovine samples. Studying the metabolic effects of different cytokine treatments on different cartilage layers makes it possible to compare the merits of those treatments for repairing cartilage.
UNMIX Methods Applied to Characterize Sources of Volatile Organic Compounds in Toronto, Ontario
Porada, Eugeniusz; Szyszkowicz, Mieczysław
2016-01-01
UNMIX, a receptor modeling routine from the U.S. Environmental Protection Agency (EPA), was used to model volatile organic compound (VOC) receptors at four urban sites in Toronto, Ontario. VOC ambient concentration data acquired in 2000–2009 for 175 VOC species at four air quality monitoring stations were analyzed. UNMIX, by performing multiple modeling attempts upon varying VOC menus—while rejecting the results that were not reliable—allowed for discriminating sources by their most consistent chemical characteristics. The method assessed occurrences of VOCs in sources typical of the urban environment (traffic, evaporative emissions of fuels, banks of fugitive inert gases), in industrial point sources (plastic-, polymer-, and metalworking manufactures), and in secondary sources (releases from water, sediments, and contaminated urban soil). The robust modeling used here produces chemical profiles of putative VOC sources that, if combined with known environmental fates of VOCs, can be used to assign physical sources' shares of VOC emissions into the atmosphere. This in turn provides a means of assessing the impact of environmental policies on one hand, and industrial activities on the other, on VOC air pollution. PMID:29051416
NASA Technical Reports Server (NTRS)
Hewes, C. R.; Brodersen, R. W.; De Wit, M.; Buss, D. D.
1976-01-01
Charge-coupled devices (CCDs) are ideally suited for performing sampled-data transversal filtering operations in the analog domain. Two algorithms have been identified for performing spectral analysis in which the bulk of the computation can be performed in a CCD transversal filter; the chirp z-transform and the prime transform. CCD implementation of both these transform algorithms is presented together with performance data and applications.
NASA Astrophysics Data System (ADS)
Ye, Fa-wang; Liu, De-chang
2008-12-01
Practices of sandstone-type uranium exploration in recent years in China indicate that uranium mineralization alteration information is of great importance for selecting a new uranium target or for prospecting in the outer area of a known uranium ore district. Taking the BASHIBULAKE uranium ore district as a case study, this paper presents the technical approach and methods for extracting the reduced alteration information caused by oil and gas in the BASHIBULAKE ore district using ASTER data. First, the regional geological setting and research status of the BASHIBULAKE uranium ore district are introduced briefly. Then, the spectral characteristics of altered and unaltered sandstone in the BASHIBULAKE ore district are analyzed in depth. Based on the spectral analysis, two technical approaches to extracting the remotely sensed reduced alteration information are proposed, and an unmixing method is used to process the ASTER data to extract the reduced alteration information in the BASHIBULAKE ore district. From the enhanced images, three remote sensing anomaly zones are discovered, and their geological and prospecting significance is further confirmed by taking advantage of the multiple SWIR bands of the ASTER data. Finally, the distribution and intensity of the reduced alteration information in the Cretaceous system and its relationship with the genesis of the uranium deposit are discussed, and specific suggestions for uranium prospecting in the outer area of the BASHIBULAKE ore district are proposed.
A framelet-based iterative maximum-likelihood reconstruction algorithm for spectral CT
NASA Astrophysics Data System (ADS)
Wang, Yingmei; Wang, Ge; Mao, Shuwei; Cong, Wenxiang; Ji, Zhilong; Cai, Jian-Feng; Ye, Yangbo
2016-11-01
Standard computed tomography (CT) cannot reproduce spectral information of an object. Hardware solutions include dual-energy CT which scans the object twice in different x-ray energy levels, and energy-discriminative detectors which can separate lower and higher energy levels from a single x-ray scan. In this paper, we propose a software solution and give an iterative algorithm that reconstructs an image with spectral information from just one scan with a standard energy-integrating detector. The spectral information obtained can be used to produce color CT images, spectral curves of the attenuation coefficient μ (r,E) at points inside the object, and photoelectric images, which are all valuable imaging tools in cancerous diagnosis. Our software solution requires no change on hardware of a CT machine. With the Shepp-Logan phantom, we have found that although the photoelectric and Compton components were not perfectly reconstructed, their composite effect was very accurately reconstructed as compared to the ground truth and the dual-energy CT counterpart. This means that our proposed method has an intrinsic benefit in beam hardening correction and metal artifact reduction. The algorithm is based on a nonlinear polychromatic acquisition model for x-ray CT. The key technique is a sparse representation of iterations in a framelet system. Convergence of the algorithm is studied. This is believed to be the first application of framelet imaging tools to a nonlinear inverse problem.
Yu, Zhicong; Leng, Shuai; Li, Zhoubo; McCollough, Cynthia H.
2016-01-01
Photon-counting computed tomography (PCCT) is an emerging imaging technique that enables multi-energy imaging with only a single scan acquisition. To enable multi-energy imaging, the detected photons corresponding to the full x-ray spectrum are divided into several subgroups of bin data that correspond to narrower energy windows. Consequently, noise in each energy bin increases compared to the full-spectrum data. This work proposes an iterative reconstruction algorithm for noise suppression in the narrower energy bins used in PCCT imaging. The algorithm is based on the framework of prior image constrained compressed sensing (PICCS) and is called spectral PICCS; it uses the full-spectrum image reconstructed using conventional filtered back-projection as the prior image. The spectral PICCS algorithm is implemented using a constrained optimization scheme with adaptive iterative step sizes such that only two tuning parameters are required in most cases. The algorithm was first evaluated using computer simulations, and then validated by both physical phantoms and in-vivo swine studies using a research PCCT system. Results from both computer-simulation and experimental studies showed substantial image noise reduction in narrow energy bins (43~73%) without sacrificing CT number accuracy or spatial resolution. PMID:27551878
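For context, spectral PICCS builds on the general PICCS objective, which trades off sparsity of the bin image against sparsity of its difference from the full-spectrum prior, subject to consistency with the bin sinogram. A common formulation is shown below, with sparsifying transform Psi, prior image x_p, system matrix A, and bin-b data y_b; the adaptive step-size scheme described in the paper is not captured by this expression.

```latex
\min_{x_b}\;\; \alpha \,\bigl\| \Psi \,(x_b - x_{\mathrm{p}}) \bigr\|_1
            + (1-\alpha)\,\bigl\| \Psi \, x_b \bigr\|_1
\quad \text{subject to} \quad \| A x_b - y_b \|_2^2 \le \varepsilon
```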
Evaluation of Algorithms for Compressing Hyperspectral Data
NASA Technical Reports Server (NTRS)
Cook, Sid; Harsanyi, Joseph; Faber, Vance
2003-01-01
With EO-1 Hyperion in orbit NASA is showing their continued commitment to hyperspectral imaging (HSI). As HSI sensor technology continues to mature, the ever-increasing amounts of sensor data generated will result in a need for more cost effective communication and data handling systems. Lockheed Martin, with considerable experience in spacecraft design and developing special purpose onboard processors, has teamed with Applied Signal & Image Technology (ASIT), who has an extensive heritage in HSI spectral compression and Mapping Science (MSI) for JPEG 2000 spatial compression expertise, to develop a real-time and intelligent onboard processing (OBP) system to reduce HSI sensor downlink requirements. Our goal is to reduce the downlink requirement by a factor > 100, while retaining the necessary spectral and spatial fidelity of the sensor data needed to satisfy the many science, military, and intelligence goals of these systems. Our compression algorithms leverage commercial-off-the-shelf (COTS) spectral and spatial exploitation algorithms. We are currently in the process of evaluating these compression algorithms using statistical analysis and NASA scientists. We are also developing special purpose processors for executing these algorithms onboard a spacecraft.
DOA estimation of noncircular signals for coprime linear array via locally reduced-dimensional Capon
NASA Astrophysics Data System (ADS)
Zhai, Hui; Zhang, Xiaofei; Zheng, Wang
2018-05-01
We investigate the issue of direction of arrival (DOA) estimation of noncircular signals for a coprime linear array (CLA). The noncircular property enhances the degrees of freedom and improves angle estimation performance, but it leads to a more complex angle ambiguity problem. To eliminate ambiguity, we theoretically prove that the actual DOAs of noncircular signals can be uniquely estimated by finding the coincident results from the two decomposed subarrays based on coprimeness. We propose a locally reduced-dimensional (RD) Capon algorithm for DOA estimation of noncircular signals for CLA. The RD processing is used in the proposed algorithm to avoid a two-dimensional (2D) spectral peak search, and coprimeness is employed to avoid the global spectral peak search. The proposed algorithm requires only a one-dimensional local spectral peak search, and it has very low computational complexity. Furthermore, the proposed algorithm needs no prior knowledge of the number of sources. We also derive the Cramér-Rao bound of DOA estimation of noncircular signals in CLA. Numerical simulation results demonstrate the effectiveness and superiority of the algorithm.
Spectral correction algorithm for multispectral CdTe x-ray detectors
NASA Astrophysics Data System (ADS)
Christensen, Erik D.; Kehres, Jan; Gu, Yun; Feidenhans'l, Robert; Olsen, Ulrik L.
2017-09-01
Compared to the dual energy scintillator detectors widely used today, pixelated multispectral X-ray detectors show the potential to improve material identification in various radiography and tomography applications used for industrial and security purposes. However, detector effects, such as charge sharing and photon pileup, distort the measured spectra in high flux pixelated multispectral detectors. These effects significantly reduce the detectors' capabilities to be used for material identification, which requires accurate spectral measurements. We have developed a semi analytical computational algorithm for multispectral CdTe X-ray detectors which corrects the measured spectra for severe spectral distortions caused by the detector. The algorithm is developed for the Multix ME100 CdTe X-ray detector, but could potentially be adapted for any pixelated multispectral CdTe detector. The calibration of the algorithm is based on simple attenuation measurements of commercially available materials using standard laboratory sources, making the algorithm applicable in any X-ray setup. The validation of the algorithm has been done using experimental data acquired with both standard lab equipment and synchrotron radiation. The experiments show that the algorithm is fast, reliable even at X-ray flux up to 5 Mph/s/mm2, and greatly improves the accuracy of the measured X-ray spectra, making the algorithm very useful for both security and industrial applications where multispectral detectors are used.
Spectral dispersion and fringe detection in IOTA
NASA Technical Reports Server (NTRS)
Traub, W. A.; Lacasse, M. G.; Carleton, N. P.
1990-01-01
Pupil plane beam combination, spectral dispersion, detection, and fringe tracking are discussed for the IOTA interferometer. A new spectrometer design is presented in which the angular dispersion with respect to wavenumber is nearly constant. The dispersing element is a type of grism, a series combination of grating and prism, in which the constant parts of the dispersion add, but the slopes cancel. This grism is optimized for the display of channelled spectra. The dispersed fringes can be tracked by a matched-filter photon-counting correlator algorithm. This algorithm requires very few arithmetic operations per detected photon, making it well-suited for real-time fringe tracking. The algorithm is able to adapt to different stellar spectral types, intensity levels, and atmospheric time constants. The results of numerical experiments are reported.
GIFTS SM EDU Data Processing and Algorithms
NASA Technical Reports Server (NTRS)
Tian, Jialin; Johnson, David G.; Reisse, Robert A.; Gazarik, Michael J.
2007-01-01
The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) Sensor Module (SM) Engineering Demonstration Unit (EDU) is a high resolution spectral imager designed to measure infrared (IR) radiances using a Fourier transform spectrometer (FTS). The GIFTS instrument employs three Focal Plane Arrays (FPAs), which gather measurements across the long-wave IR (LWIR), short/mid-wave IR (SMWIR), and visible spectral bands. The raw interferogram measurements are radiometrically and spectrally calibrated to produce radiance spectra, which are further processed to obtain atmospheric profiles via retrieval algorithms. This paper describes the processing algorithms involved in the calibration stage. The calibration procedures can be subdivided into three stages. In the pre-calibration stage, a phase correction algorithm is applied to the decimated and filtered complex interferogram. The resulting imaginary part of the spectrum contains only the noise component of the uncorrected spectrum. Additional random noise reduction can be accomplished by applying a spectral smoothing routine to the phase-corrected blackbody reference spectra. In the radiometric calibration stage, we first compute the spectral responsivity based on the previous results, from which, the calibrated ambient blackbody (ABB), hot blackbody (HBB), and scene spectra can be obtained. During the post-processing stage, we estimate the noise equivalent spectral radiance (NESR) from the calibrated ABB and HBB spectra. We then implement a correction scheme that compensates for the effect of fore-optics offsets. Finally, for off-axis pixels, the FPA off-axis effects correction is performed. To estimate the performance of the entire FPA, we developed an efficient method of generating pixel performance assessments. In addition, a random pixel selection scheme is designed based on the pixel performance evaluation.
NASA Astrophysics Data System (ADS)
Ghrefat, Habes A.; Goodell, Philip C.
2011-08-01
The goal of this research is to map land cover patterns and to detect changes that occurred at Alkali Flat and Lake Lucero, White Sands, using multispectral Landsat 7 Enhanced Thematic Mapper Plus (ETM+), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), and Advanced Land Imager (ALI) data, and hyperspectral Hyperion and Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data. The other objectives of this study were: (1) to evaluate the information dimensionality limits of Landsat 7 ETM+, ASTER, ALI, Hyperion, and AVIRIS data with respect to signal-to-noise ratio and spectral resolution, (2) to determine the spatial distribution and fractional abundances of land cover endmembers, and (3) to check ground correspondence with satellite data. A better understanding of the spatial and spectral resolution of these sensors, the optimum spectral bands and their information content, appropriate image processing methods, spectral signatures of land cover classes, and atmospheric effects is needed to improve our ability to detect and map minerals from space. Image spectra were validated using samples collected from various localities across Alkali Flat and Lake Lucero. These samples were measured in the laboratory using VNIR-SWIR (0.4-2.5 μm) spectra and the X-ray Diffraction (XRD) method. Dry gypsum deposits, wet gypsum deposits, standing water, green vegetation, and clastic alluvial sediments dominated by mixtures of ferric iron (ferricrete) and calcite were identified in the study area using Minimum Noise Fraction (MNF), Pixel Purity Index (PPI), and n-D visualization. The results of MNF confirm that AVIRIS and Hyperion data have higher information dimensionality thresholds, exceeding the number of available bands of Landsat 7 ETM+, ASTER, and ALI data. ASTER and ALI data can be a reasonable alternative to AVIRIS and Hyperion data for the purpose of monitoring land cover, hydrology and sedimentation in the basin. The spectral unmixing analysis and dimensionality eigenanalysis across the various datasets helped to identify the optimum spatial, spectral, temporal and radiometric resolution sensor characteristics for remote sensing-based monitoring of seasonal land cover, surface water, groundwater, and alluvial sediment input changes within the basin. The results demonstrated good agreement between ground truth data, XRD analysis of the samples, and the results of the Matched Filtering (MF) mapping method.
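Since Matched Filtering is used here for the final mapping, the sketch below shows the classical matched-filter score used in hyperspectral target/endmember mapping (background mean and covariance whitening, unit response at the target spectrum). It is a generic sketch, not the specific MF workflow of this study; the variable names and the flattened-cube layout are assumptions.

```python
import numpy as np

def matched_filter_scores(cube, target):
    """Classical matched-filter abundance scores for one target spectrum.

    cube   : (n_pixels, n_bands) reflectance data (flattened image)
    target : (n_bands,) target/endmember spectrum
    Returns one score per pixel (roughly 0 at the background mean, 1 for a pure target).
    """
    mu = cube.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(cube, rowvar=False))
    d = target - mu
    w = cov_inv @ d / (d @ cov_inv @ d)      # matched-filter vector
    return (cube - mu) @ w
```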
Zhang, Hong-guang; Lu, Jian-gang
2016-02-01
To overcome the problems of significant differences among samples and of nonlinearity between sample properties and spectra in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method was first used to obtain the net analyte signal of the calibration samples and of the unknown samples; then the Euclidean distance between the net analyte signal of each unknown sample and those of the calibration samples was calculated and used as a similarity index. According to this similarity index, a local calibration set was individually selected for each unknown sample. Finally, a local PLS regression model was built on the local calibration set of each unknown sample. The proposed method was applied to a set of near-infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and of a conventional local regression algorithm based on spectral Euclidean distance.
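The selection-then-regression loop described above can be sketched as follows, assuming the net analyte signals have already been computed for all spectra. The neighbourhood size, the number of PLS components, and the use of scikit-learn's PLSRegression are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def local_pls_predict(nas_cal, X_cal, y_cal, nas_unknown, X_unknown,
                      n_local=50, n_components=5):
    """Local PLS regression with calibration samples selected by NAS distance.

    nas_cal / nas_unknown : net analyte signals of calibration / unknown spectra
                            (assumed to be computed beforehand by a NAS projection)
    For every unknown sample, the n_local nearest calibration samples (Euclidean
    distance between net analyte signals) form its own calibration set, on which
    a dedicated PLS model is fitted.
    """
    predictions = []
    for nas_u, x_u in zip(nas_unknown, X_unknown):
        dist = np.linalg.norm(nas_cal - nas_u, axis=1)     # similarity index
        idx = np.argsort(dist)[:n_local]                   # local calibration set
        pls = PLSRegression(n_components=n_components)
        pls.fit(X_cal[idx], y_cal[idx])
        predictions.append(pls.predict(x_u.reshape(1, -1)).ravel()[0])
    return np.array(predictions)
```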
Saliency detection algorithm based on LSC-RC
NASA Astrophysics Data System (ADS)
Wu, Wei; Tian, Weiye; Wang, Ding; Luo, Xin; Wu, Yingfei; Zhang, Yu
2018-02-01
Image saliency refers to the most important region of an image, the region that attracts human visual attention. Preferentially allocating computational resources to this significant region is of great value for image analysis and synthesis, since it improves region detection. As a preprocessing step for other tasks in the image processing field, saliency detection is widely applied in image retrieval and image segmentation. Among existing approaches, super-pixel-based significance detection using linear spectral clustering (LSC) has achieved good results. The significance detection algorithm proposed in this paper improves on the regional contrast (RC) method by replacing its region-formation step with super-pixel blocks produced by linear spectral clustering. After combining it with a recent deep learning method, the accuracy of salient region detection improves considerably. Finally, the superiority and feasibility of the super-pixel segmentation detection algorithm based on linear spectral clustering are demonstrated by comparative tests.
NASA Astrophysics Data System (ADS)
Dafu, Shen; Leihong, Zhang; Dong, Liang; Bei, Li; Yi, Kang
2017-07-01
The purpose of this study is to improve reconstruction precision and to better reproduce the surface colors of spectral images. A new spectral reflectance reconstruction algorithm based on iterative thresholding combined with a weighted principal component space is presented in this paper, with the principal components weighted by visual features serving as the sparse basis. Different numbers of color cards are selected as the training samples, a multispectral image is the testing sample, and the color differences of the reconstructions are compared. The channel response values are obtained with a Mega Vision high-accuracy, multi-channel imaging system. The results show that spectral reconstruction based on the weighted principal component space outperforms that based on the traditional principal component space. Therefore, the color difference obtained using the compressive-sensing algorithm with weighted principal component analysis is smaller than that obtained using the algorithm with traditional principal component analysis, and better reconstructed color consistency with human vision is achieved.
Nasirudin, Radin A.; Mei, Kai; Panchev, Petar; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Fiebich, Martin; Noël, Peter B.
2015-01-01
Purpose The exciting prospect of Spectral CT (SCT) using photon-counting detectors (PCD) will lead to new techniques in computed tomography (CT) that take advantage of the additional spectral information provided. We introduce a method to reduce metal artifact in X-ray tomography by incorporating knowledge obtained from SCT into a statistical iterative reconstruction scheme. We call our method Spectral-driven Iterative Reconstruction (SPIR). Method The proposed algorithm consists of two main components: material decomposition and penalized maximum likelihood iterative reconstruction. In this study, the spectral data acquisitions with an energy-resolving PCD were simulated using a Monte-Carlo simulator based on EGSnrc C++ class library. A jaw phantom with a dental implant made of gold was used as an object in this study. A total of three dental implant shapes were simulated separately to test the influence of prior knowledge on the overall performance of the algorithm. The generated projection data was first decomposed into three basis functions: photoelectric absorption, Compton scattering and attenuation of gold. A pseudo-monochromatic sinogram was calculated and used as input in the reconstruction, while the spatial information of the gold implant was used as a prior. The results from the algorithm were assessed and benchmarked with state-of-the-art reconstruction methods. Results Decomposition results illustrate that gold implant of any shape can be distinguished from other components of the phantom. Additionally, the result from the penalized maximum likelihood iterative reconstruction shows that artifacts are significantly reduced in SPIR reconstructed slices in comparison to other known techniques, while at the same time details around the implant are preserved. Quantitatively, the SPIR algorithm best reflects the true attenuation value in comparison to other algorithms. Conclusion It is demonstrated that the combination of the additional information from Spectral CT and statistical reconstruction can significantly improve image quality, especially streaking artifacts caused by the presence of materials with high atomic numbers. PMID:25955019
Handwritten text line segmentation by spectral clustering
NASA Astrophysics Data System (ADS)
Han, Xuecheng; Yao, Hui; Zhong, Guoqiang
2017-02-01
Since handwritten text lines are generally skewed and not clearly separated, text line segmentation of handwritten document images is still a challenging problem. In this paper, we propose a novel text line segmentation algorithm based on spectral clustering. Given a handwritten document image, we first convert it to a binary image and then compute the adjacency matrix of the pixel points. We apply spectral clustering on this similarity matrix and use the orthogonal k-means clustering algorithm to group the text lines. Experiments on a Chinese handwritten document database (HIT-MW) demonstrate the effectiveness of the proposed method.
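As a toy illustration of the clustering step, the sketch below applies off-the-shelf spectral clustering to the coordinates of foreground pixels, assuming the number of text lines is known. It is not the paper's pipeline (which builds its own adjacency matrix and uses orthogonal k-means); the nearest-neighbour affinity, the subsampling, and the parameter values are assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def segment_text_lines(binary_img, n_lines, max_pixels=5000, seed=0):
    """Toy text-line segmentation by spectral clustering of foreground pixels.

    binary_img : 2-D boolean array, True for ink pixels
    n_lines    : assumed number of text lines in the page
    Returns the (possibly subsampled) pixel coordinates and their line labels.
    """
    ys, xs = np.nonzero(binary_img)
    pts = np.column_stack([ys, xs]).astype(float)
    if len(pts) > max_pixels:                      # subsample for tractability
        rng = np.random.default_rng(seed)
        pts = pts[rng.choice(len(pts), max_pixels, replace=False)]
    labels = SpectralClustering(n_clusters=n_lines, affinity='nearest_neighbors',
                                n_neighbors=10, random_state=seed).fit_predict(pts)
    return pts, labels
```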
Device, Algorithm and Integrated Modeling Research for Performance-Drive Multi-Modal Optical Sensors
2012-12-17
• ... to feature-aided tracking using spectral information.
• A novel technique for spectral waveband selection was developed and used as part of ... of spectral information using the tunable single-pixel spectrometer concept.
• A database was developed of spectral reflectance measurements ... exploring the utility of spectral and polarimetric information to help with the vehicle tracking application, through the use of both ...
Semi-automated scoring of triple-probe FISH in human sperm using confocal microscopy.
Branch, Francesca; Nguyen, GiaLinh; Porter, Nicholas; Young, Heather A; Martenies, Sheena E; McCray, Nathan; Deloid, Glen; Popratiloff, Anastas; Perry, Melissa J
2017-09-01
Structural and numerical sperm chromosomal aberrations result from abnormal meiosis and are directly linked to infertility. Any live births that arise from aneuploid conceptuses can result in syndromes such as Klinefelter, Turner, XYY and Edwards. Multi-probe fluorescence in situ hybridization (FISH) is commonly used to study sperm aneuploidy; however, manual FISH scoring in sperm samples is labor-intensive and introduces errors. Automated scoring methods are continuously evolving. One challenging aspect of optimizing automated sperm FISH scoring has been the overlap in excitation and emission of the fluorescent probes used to enumerate the chromosomes of interest. Our objective was to demonstrate the feasibility of combining confocal microscopy and spectral imaging with high-throughput methods for accurately measuring sperm aneuploidy. Our approach used confocal microscopy to analyze numerical chromosomal abnormalities in human sperm using enhanced slide preparation and rigorous semi-automated scoring methods. FISH for chromosomes X, Y, and 18 was conducted to determine sex chromosome disomy in sperm nuclei. Online spectral linear unmixing was applied for effective separation of four fluorochromes while decreasing data acquisition time. Semi-automated image processing, segmentation, classification, and scoring were performed on 10 slides using custom image processing and analysis software, and the results were compared with manual methods. No significant differences in disomy frequencies were seen between the semi-automated and manual methods. Samples treated with pepsin were observed to have reduced background autofluorescence and a more uniform distribution of cells. These results demonstrate that semi-automated methods using spectral imaging on a confocal platform are a feasible approach for analyzing numerical chromosomal aberrations in sperm, and are comparable to manual methods. © 2017 International Society for Advancement of Cytometry.
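The linear spectral unmixing mentioned above can be illustrated per pixel as a non-negative least-squares fit of reference emission spectra to the detected multi-channel signal. The sketch below is only a conceptual stand-in for the instrument's online unmixing, not its vendor implementation; the reference-spectrum matrix layout and the function name are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_fluorochromes(pixel_spectrum, reference_spectra):
    """Per-pixel linear spectral unmixing of overlapping fluorochrome emissions.

    pixel_spectrum    : (n_channels,) detected intensities across spectral detector channels
    reference_spectra : (n_channels, n_fluorochromes) measured single-label emission spectra
    Returns the non-negative contribution of each fluorochrome (autofluorescence
    can be handled by adding it as an extra reference column).
    """
    contributions, _residual = nnls(reference_spectra, pixel_spectrum)
    return contributions
```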
NASA Astrophysics Data System (ADS)
Kruse, Fred A.
2015-05-01
Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and spatially coincident Hyperspectral Thermal Emission Spectrometer (HyTES) data were used to map geology and alteration for a site in northern Death Valley, California and Nevada, USA. AVIRIS data, with 224 bands at 10 nm spectral resolution over the range 0.4-2.5 μm and 3-meter spatial resolution, were converted to reflectance using an atmospheric model. HyTES data, with 256 bands at approximately 17 nm spectral resolution covering the 8-12 μm range at 4-meter spatial resolution, were converted to emissivity using a longwave infrared (LWIR) radiative transfer atmospheric compensation model and a normalized temperature-emissivity separation approach. Key spectral endmembers were separately extracted for each wavelength region and identified, and the predominant material at each pixel was mapped for each range using Mixture-Tuned Matched Filtering (MTMF), a partial unmixing approach. AVIRIS mapped iron oxides, clays, mica, silicification (hydrothermal alteration), and the difference between calcite and dolomite. HyTES separated and mapped several igneous phases (not possible using AVIRIS) and silicification, and validated the separation of calcite from dolomite. Comparison of the material maps from the different modes, however, reveals complex overlap, indicating that multiple materials/processes exist in many areas. Combined and integrated analyses were performed to compare individual results and more completely characterize occurrences of multiple materials. Three approaches were used: 1) integrated full-range analysis, 2) combined multimode classification, and 3) directed combined analysis in geologic context. Results illustrate that, together, these two datasets provide an improved picture of the distribution of geologic units and subsequent alteration.
Interpreting spectral unmixing coefficients: From spectral weights to mass fractions
NASA Astrophysics Data System (ADS)
Grumpe, Arne; Mengewein, Natascha; Rommel, Daniela; Mall, Urs; Wöhler, Christian
2018-01-01
It is well known that many common planetary minerals exhibit prominent absorption features. Consequently, the analysis of spectral reflectance measurements has become a major tool of remote sensing. Quantifying mineral abundances, however, is not a trivial task. The interaction between the incident light rays and particulate surfaces, e.g., the lunar regolith, leads to a non-linear relationship between the reflectance spectra of the pure minerals, the so-called "endmembers", and the surface's reflectance spectrum. It is, however, possible to transform the non-linear reflectance mixture into a linear mixture of single-scattering albedos of the Hapke model. The abundances obtained by inverting the linear single-scattering albedo mixture may be interpreted as volume fractions weighted by the endmembers' extinction coefficients. Commonly, identical extinction coefficients are assumed for all endmembers and the obtained volume fractions are converted to mass fractions using either measured or assumed densities. In theory, the proposed method may cover different grain sizes if each grain size range of a mineral is treated as a distinct endmember. Here, we present a method to transform the mixing coefficients to mass fractions for arbitrary combinations of extinction coefficients and densities. The required parameters are computed from reflectance measurements of well-defined endmember mixtures. Consequently, additional measurements, e.g., of the endmember density, are no longer required. We evaluate the method based on laboratory measurements and various results presented in the literature. It is shown that the procedure transforms the mixing coefficients to mass fractions with an accuracy comparable to carefully calibrated laboratory measurements without additional knowledge. For our laboratory measurements, the square root of the mean squared error is less than 4.82 wt%. In addition, the method corrects for systematic effects originating from mixtures of endmembers with highly varying albedo, e.g., plagioclase and pyroxene.
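The conversion described above can be sketched in a few lines of Python. The following is a minimal illustration, not the authors' implementation: it unmixes a single-scattering-albedo spectrum with non-negative least squares and then rescales the mixing coefficients into mass fractions under the common simplification that each coefficient scales with the endmember's geometric cross section per unit mass, i.e. with 1/(density × grain size); the densities and grain sizes are assumed inputs.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_albedo(w_mixed, w_endmembers):
    """Linear unmixing in single-scattering-albedo space.

    w_mixed:      (n_bands,) albedo spectrum of the mixture
    w_endmembers: (n_bands, n_end) albedo spectra of the pure endmembers
    Returns non-negative mixing coefficients normalised to sum to one.
    """
    f, _ = nnls(w_endmembers, w_mixed)
    return f / f.sum()

def mixing_to_mass_fractions(f, density, grain_size):
    """Convert albedo-mixing coefficients to mass fractions.

    Assumes the mixing coefficient of endmember i scales with its relative
    geometric cross section, f_i ~ m_i / (rho_i * D_i), so that
    m_i ~ f_i * rho_i * D_i (then normalised).  density and grain_size are
    per-endmember arrays with illustrative values.
    """
    m = f * np.asarray(density) * np.asarray(grain_size)
    return m / m.sum()
```

Setting all densities and grain sizes equal recovers the usual "identical extinction coefficient" assumption mentioned in the abstract.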
Hyperspectral Remote Sensing of Terrestrial Ecosystem Productivity from ISS
NASA Astrophysics Data System (ADS)
Huemmrich, K. F.; Campbell, P. K. E.; Gao, B. C.; Flanagan, L. B.; Goulden, M.
2017-12-01
Data from the Hyperspectral Imager for Coastal Ocean (HICO), mounted on the International Space Station (ISS), were used to develop and test algorithms for remotely retrieving ecosystem productivity. The ISS orbit introduces both limitations and opportunities for observing ecosystem dynamics. Twenty-six HICO images were used from four study sites representing different vegetation types: grasslands, shrubland, and forest. Gross ecosystem production (GEP) data from eddy covariance were matched with HICO-derived spectra. Multiple algorithms were successful in relating spectral reflectance to GEP, including: spectral vegetation indices (SVI), SVI in a light use efficiency model framework, spectral shape characteristics through spectral derivatives and absorption feature analysis, and statistical models leading to Multiband Hyperspectral Indices (MHI) from stepwise regressions and Partial Least Squares Regression (PLSR). Algorithms were able to achieve r2 better than 0.7 for both GEP at the overpass time and daily GEP. These algorithms were successful using a diverse set of observations combining data from multiple years, multiple times during the growing season, different times of day, different view angles, and different vegetation types. The demonstrated robustness of the algorithms over these conditions provides some confidence in mapping spatial patterns of GEP, describing variability within fields as well as regional patterns, based only on spectral reflectance information. The ISS orbit provides periods with multiple observations collected at different times of day within a few days. Diurnal GEP patterns were estimated by comparing half-hourly average GEP from the flux tower against HICO estimates of GEP (r2 = 0.87) when morning, midday, and afternoon observations were available to average fluxes over the time period.
NASA Astrophysics Data System (ADS)
Harris, Jennifer; Grindrod, Peter
2017-04-01
At present, martian meteorites represent the only samples of Mars available for study in terrestrial laboratories. However, these samples have never been definitively tied to source locations on Mars, meaning that the fundamental geological context is missing. The goal of this work is to link the bulk mineralogical analyses of martian meteorites to the surface geology of Mars through spectral mixture analysis of hyperspectral imagery. Hapke radiative transfer modelling has been shown to provide accurate (within 5-10% absolute error) mineral abundance values from laboratory-derived hyperspectral measurements of binary [1] and ternary [2] mixtures of plagioclase, pyroxene and olivine. These three minerals form the vast bulk of the SNC meteorites [3] and the bedrock of the Amazonian provinces on Mars that are inferred, on the basis of isotopic age dating, to be the source regions for these meteorites. Spectral unmixing through the Hapke model could be used to quantitatively analyse the martian surface and pinpoint the exact craters from which the SNC meteorites originated. However, the Hapke model is complex with numerous variables, many of which are determinable in laboratory conditions but not from remote measurements of a planetary surface. Using binary and ternary spectral mixtures and martian meteorite spectra from the RELAB spectral library, the accuracy of Hapke abundance estimation is investigated under increasing constraints and simplifications designed to simulate CRISM data. Constraints and simplifications include reduced spectral resolution, additional noise, unknown endmembers and unknown particle physical characteristics. CRISM operates at two spectral resolutions: the Full Resolution Targeted (FRT) mode, with which it has imaged approximately 2% of the martian surface, and the lower spectral resolution MultiSpectral Survey mode (MSP), with which it has covered the vast majority of the surface. On resampling the RELAB spectral mixtures to these two wavelength ranges, it was found that the Hapke abundance results at the lower spectral resolution were just as accurate (within 7% absolute error) as at the higher resolution. Further results, taking into account additional noise from both instrument and atmospheric sources, the potential presence of minor amounts of accessory minerals, and the selection of appropriate spectral endmembers where the exact endmembers present are unknown, will be presented. References [1] Mustard, J. F., Pieters, C. M., Quantitative abundance estimates from bidirectional reflectance measurements, Journal of Geophysical Research, Vol. 92, B4, E617-E626, 1987. [2] Li, S., Milliken, R. E., Estimating the modal mineralogy of eucrite and diogenite meteorites using visible-near infrared reflectance spectroscopy, Meteoritics and Planetary Science, Vol. 50, 11, 1821-1850, 2015. [3] Hutchinson, R., Meteorites: A petrologic, chemical and isotopic synthesis, Cambridge University Press, 2004.
Spatial-Spectral Approaches to Edge Detection in Hyperspectral Remote Sensing
NASA Astrophysics Data System (ADS)
Cox, Cary M.
This dissertation advances geoinformation science at the intersection of hyperspectral remote sensing and edge detection methods. A relatively new phenomenology among its remote sensing peers, hyperspectral imagery (HSI) comprises only about 7% of all remote sensing research - there are five times as many radar-focused peer-reviewed journal articles as hyperspectral-focused ones. Similarly, edge detection studies comprise only about 8% of image processing research, most of which is dedicated to image processing techniques most closely associated with end results, such as image classification and feature extraction. Given the centrality of edge detection to mapping, that most important of geographic functions, improving the collective understanding of hyperspectral imagery edge detection methods constitutes a research objective aligned to the heart of the geoinformation sciences. Consequently, this dissertation endeavors to narrow the HSI edge detection research gap by advancing three HSI edge detection methods designed to leverage HSI's unique chemical identification capabilities in pursuit of generating accurate, high-quality edge planes. The Di Zenzo-based gradient edge detection algorithm, an innovative version of the Resmini HySPADE edge detection algorithm and a level set-based edge detection algorithm are tested against 15 traditional and non-traditional HSI datasets spanning a range of HSI data configurations, spectral resolutions, spatial resolutions, bandpasses and applications. This study empirically measures algorithm performance against Dr. John Canny's six criteria for a good edge operator: false positives, false negatives, localization, single-point response, robustness to noise and unbroken edges. The end state is a suite of spatial-spectral edge detection algorithms that produce satisfactory edge results against a range of hyperspectral data types applicable to a diverse set of earth remote sensing applications. This work also explores the concept of an edge within hyperspectral space, the relative importance of spatial and spectral resolutions as they pertain to HSI edge detection, and how effectively compressed HSI data improve edge detection results. The HSI edge detection experiments yielded valuable insights into the algorithms' strengths, weaknesses and optimal alignment to remote sensing applications. The gradient-based edge operator produced strong edge planes across a range of evaluation measures and applications, particularly with respect to false negatives, unbroken edges, urban mapping, vegetation mapping and oil spill mapping applications. False positives and uncompressed HSI data presented occasional challenges to the algorithm. The HySPADE edge operator produced satisfactory results with respect to localization, single-point response, oil spill mapping and trace chemical detection, and was challenged by false positives, declining spectral resolution and vegetation mapping applications. The level set edge detector produced high-quality edge planes for most tests and demonstrated strong performance with respect to false positives, single-point response, oil spill mapping and mineral mapping. False negatives were a regular challenge for the level set edge detection algorithm.
Finally, HSI data optimized for spectral information compression and noise were shown to improve edge detection performance across all three algorithms, while the gradient-based algorithm and HySPADE demonstrated significant robustness to declining spectral and spatial resolutions.
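As an illustration of the gradient-based approach referenced above, the sketch below implements a generic Di Zenzo-style multiband edge strength (the square root of the largest eigenvalue of the per-pixel structure tensor built from band-wise gradients). It is a simplified stand-in, not the dissertation's exact operator, and the threshold is left to the user.

```python
import numpy as np

def di_zenzo_edge_strength(cube):
    """Di Zenzo-style multiband gradient magnitude.

    cube: (rows, cols, bands) hyperspectral image.
    Returns sqrt of the largest eigenvalue of the 2x2 structure tensor
    assembled from per-band spatial gradients.
    """
    gy, gx = np.gradient(cube.astype(float), axis=(0, 1))
    gxx = np.sum(gx * gx, axis=2)
    gyy = np.sum(gy * gy, axis=2)
    gxy = np.sum(gx * gy, axis=2)
    # Largest eigenvalue of [[gxx, gxy], [gxy, gyy]] at each pixel.
    trace = gxx + gyy
    diff = np.sqrt((gxx - gyy) ** 2 + 4.0 * gxy ** 2)
    lam_max = 0.5 * (trace + diff)
    return np.sqrt(lam_max)

# edges = di_zenzo_edge_strength(hsi_cube) > threshold  # threshold chosen per scene
```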
NASA Astrophysics Data System (ADS)
Gambacorta, A.; Nalli, N. R.; Tan, C.; Iturbide-Sanchez, F.; Wilson, M.; Zhang, K.; Xiong, X.; Barnet, C. D.; Sun, B.; Zhou, L.; Wheeler, A.; Reale, A.; Goldberg, M.
2017-12-01
The NOAA Unique Combined Atmospheric Processing System (NUCAPS) is the NOAA operational algorithm to retrieve thermodynamic and composition variables from hyperspectral thermal sounders such as CrIS, IASI and AIRS. The combined use of microwave sounders, such as ATMS, AMSU and MHS, enables full sounding of the atmospheric column under all-sky conditions. NUCAPS retrieval products are accessible in near real time (about a 1.5-hour delay) through the NOAA Comprehensive Large Array-data Stewardship System (CLASS). Since February 2015, NUCAPS retrievals have also been accessible via Direct Broadcast, with an unprecedented low latency of less than 0.5 hours. NUCAPS builds on a long-term, multi-agency investment in algorithm research and development. The uniqueness of this algorithm consists in a number of features that are key to providing highly accurate and stable atmospheric retrievals, suitable for real-time weather and air quality applications. Firstly, maximizing the use of the information content present in hyperspectral thermal measurements forms the foundation of the NUCAPS retrieval algorithm. Secondly, NUCAPS has a modular, namelist-driven design: it can process multiple hyperspectral infrared sounders (on Aqua, NPP, MetOp and the JPSS series) by means of the same retrieval software executable and underlying spectroscopy. Finally, a cloud-clearing algorithm and a synergistic use of microwave radiance measurements enable full vertical sounding of the atmosphere under all-sky regimes. As we transition toward improved hyperspectral missions, assessing retrieval skill and consistency across multiple platforms becomes a priority for real-time user applications. The focus of this presentation is a general introduction to the recent improvements in the delivery of the NUCAPS full spectral resolution upgrade and an overview of the lessons learned from the 2017 Hazardous Weather Testbed Spring Experiment. Test cases will be shown on the use of NPP and MetOp NUCAPS under pre-convective, capping inversion and dry-layer intrusion events.
GOME Total Ozone and Calibration Error Derived Using Version 8 TOMS Algorithm
NASA Technical Reports Server (NTRS)
Gleason, J.; Wellemeyer, C.; Qin, W.; Ahn, C.; Gopalan, A.; Bhartia, P.
2003-01-01
The Global Ozone Monitoring Experiment (GOME) is a hyperspectral satellite instrument measuring the ultraviolet backscatter at relatively high spectral resolution. GOME radiances have been slit-averaged to emulate measurements of the Total Ozone Mapping Spectrometer (TOMS) made at discrete wavelengths and processed using the new TOMS Version 8 ozone algorithm. Compared to Differential Optical Absorption Spectroscopy (DOAS) techniques based on local structure in the Huggins bands, the TOMS algorithm uses differential absorption between a pair of wavelengths that includes the local structure as well as the background continuum. This makes the TOMS algorithm more sensitive to ozone, but it also makes the algorithm more sensitive to instrument calibration errors. While calibration adjustments are not needed for fitting techniques like the DOAS approach employed in GOME algorithms, some adjustment is necessary when applying the TOMS algorithm to GOME. Using spectral discrimination at near-ultraviolet wavelength channels unabsorbed by ozone, the GOME wavelength-dependent calibration drift is estimated and then checked using pair justification. In addition, the day-one calibration offset is estimated based on the residuals of the Version 8 TOMS algorithm. The estimated drift in the 2b detector of GOME is small through the first four years and then increases rapidly to +5% in normalized radiance at 331 nm relative to 385 nm by mid-2000. The 1b detector appears to be quite well behaved throughout this time period.
Wang, Wei; Song, Wei-Guo; Liu, Shi-Xing; Zhang, Yong-Ming; Zheng, Hong-Yang; Tian, Wei
2011-04-01
An improved method for cloud detection combining K-means clustering and a multi-spectral threshold approach is described. On the basis of landmark spectrum analysis, MODIS data are first categorized into two major classes by the K-means method. The first class includes clouds, smoke and snow, and the second class includes vegetation, water and land. A multi-spectral threshold detection is then applied to the first class to eliminate interference such as smoke and snow. The method is tested with MODIS data at different times under different underlying surface conditions. Visual assessment of the algorithm's performance shows that it can effectively detect small areas of cloud pixels and exclude interference from the underlying surface, which provides a good foundation for a subsequent fire detection approach.
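A minimal sketch of the two-stage idea (K-means pre-classification followed by multi-spectral thresholds) is given below. The band choices and threshold values are illustrative assumptions, not the calibrated MODIS thresholds used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def cloud_mask(refl_vis, refl_nir, bt_11um):
    """Two-stage cloud detection sketch: K-means pre-classification followed
    by multi-spectral thresholds (band choices and values are illustrative)."""
    feats = np.column_stack([refl_vis.ravel(), refl_nir.ravel(), bt_11um.ravel()])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
    labels = labels.reshape(refl_vis.shape)

    # The "bright" cluster (clouds, smoke, snow) is taken to be the one with
    # the higher mean visible reflectance.
    bright = int(refl_vis[labels == 1].mean() > refl_vis[labels == 0].mean())
    candidate = labels == bright

    # Multi-spectral thresholds to reject smoke/snow within the candidate class:
    # clouds are bright in VIS and NIR and cold at 11 um (values illustrative).
    return candidate & (refl_vis > 0.3) & (refl_nir > 0.25) & (bt_11um < 285.0)
```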
Constrained spectral clustering under a local proximity structure assumption
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri; Xu, Qianjun; des Jardins, Marie
2005-01-01
This work focuses on incorporating pairwise constraints into a spectral clustering algorithm. A new constrained spectral clustering method is proposed, as well as an active constraint acquisition technique and a heuristic for parameter selection. We demonstrate that our constrained spectral clustering method, CSC, works well when the data exhibits what we term local proximity structure.
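The generic flavour of constraint incorporation can be sketched as follows: encode must-link and cannot-link pairs directly into the affinity matrix and then run standard spectral clustering on it. This is a simplified illustration of the general idea, not the CSC algorithm itself.

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel

def constrained_spectral_clustering(X, must_link, cannot_link, n_clusters=2, gamma=1.0):
    """Raise affinity for must-link pairs, zero it for cannot-link pairs,
    then run standard spectral clustering on the modified affinity matrix."""
    A = rbf_kernel(X, gamma=gamma)
    for i, j in must_link:
        A[i, j] = A[j, i] = 1.0
    for i, j in cannot_link:
        A[i, j] = A[j, i] = 0.0
    model = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                               assign_labels="kmeans", random_state=0)
    return model.fit_predict(A)
```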
An efficient quantum algorithm for spectral estimation
NASA Astrophysics Data System (ADS)
Steffens, Adrian; Rebentrost, Patrick; Marvian, Iman; Eisert, Jens; Lloyd, Seth
2017-03-01
We develop an efficient quantum implementation of an important signal processing algorithm for line spectral estimation: the matrix pencil method, which determines the frequencies and damping factors of signals consisting of finite sums of exponentially damped sinusoids. Our algorithm provides a quantum speedup in a natural regime where the sampling rate is much higher than the number of sinusoid components. Along the way, we develop techniques that are expected to be useful for other quantum algorithms as well—consecutive phase estimations to efficiently make products of asymmetric low rank matrices classically accessible and an alternative method to efficiently exponentiate non-Hermitian matrices. Our algorithm features an efficient quantum-classical division of labor: the time-critical steps are implemented in quantum superposition, while an interjacent step, requiring much fewer parameters, can operate classically. We show that frequencies and damping factors can be obtained in time logarithmic in the number of sampling points, exponentially faster than known classical algorithms.
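For reference, the classical (non-quantum) matrix pencil method that the quantum algorithm accelerates can be sketched as follows; a production implementation would additionally truncate the Hankel matrix by SVD to suppress noise.

```python
import numpy as np

def matrix_pencil(x, dt, n_modes, pencil=None):
    """Classical matrix pencil method for line spectral estimation.

    x       : samples of a finite sum of exponentially damped sinusoids
    dt      : sampling interval
    n_modes : number of exponential components to recover
    Returns (frequencies, damping_factors).
    """
    x = np.asarray(x)
    n = len(x)
    L = pencil if pencil is not None else n // 3          # pencil parameter
    # Hankel data matrix and its two shifted sub-matrices.
    Y = np.array([x[i:i + L + 1] for i in range(n - L)])
    Y0, Y1 = Y[:, :-1], Y[:, 1:]
    # Signal poles appear among the eigenvalues of pinv(Y0) @ Y1.
    z = np.linalg.eigvals(np.linalg.pinv(Y0) @ Y1)
    z = z[np.argsort(-np.abs(z))][:n_modes]               # keep dominant poles
    freqs = np.angle(z) / (2.0 * np.pi * dt)
    damping = np.log(np.abs(z)) / dt
    return freqs, damping
```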
NASA Astrophysics Data System (ADS)
Mechlem, Korbinian; Ehn, Sebastian; Sellerer, Thorsten; Pfeiffer, Franz; Noël, Peter B.
2017-03-01
In spectral computed tomography (spectral CT), the additional information about the energy dependence of attenuation coefficients can be exploited to generate material selective images. These images have found applications in various areas such as artifact reduction, quantitative imaging or clinical diagnosis. However, significant noise amplification on material decomposed images remains a fundamental problem of spectral CT. Most spectral CT algorithms separate the process of material decomposition and image reconstruction. Separating these steps is suboptimal because the full statistical information contained in the spectral tomographic measurements cannot be exploited. Statistical iterative reconstruction (SIR) techniques provide an alternative, mathematically elegant approach to obtaining material selective images with improved tradeoffs between noise and resolution. Furthermore, image reconstruction and material decomposition can be performed jointly. This is accomplished by a forward model which directly connects the (expected) spectral projection measurements and the material selective images. To obtain this forward model, detailed knowledge of the different photon energy spectra and the detector response was assumed in previous work. However, accurately determining the spectrum is often difficult in practice. In this work, a new algorithm for statistical iterative material decomposition is presented. It uses a semi-empirical forward model which relies on simple calibration measurements. Furthermore, an efficient optimization algorithm based on separable surrogate functions is employed. This partially negates one of the major shortcomings of SIR, namely high computational cost and long reconstruction times. Numerical simulations and real experiments show strongly improved image quality and reduced statistical bias compared to projection-based material decomposition.
NASA Astrophysics Data System (ADS)
Zhang, Xing; Wen, Gongjian
2015-10-01
Anomaly detection (AD) is becoming increasingly important in hyperspectral imagery analysis, with many practical applications. The local orthogonal subspace projection (LOSP) detector is a popular anomaly detector which exploits local endmembers/eigenvectors around the pixel under test (PUT) to construct a background subspace. However, this subspace only takes advantage of the spectral information, while the spatial correlation of the background clutter is neglected, which makes the anomaly detection result sensitive to the accuracy of the estimated subspace. In this paper, a local three-dimensional orthogonal subspace projection (3D-LOSP) algorithm is proposed. Firstly, using both spectral and spatial information jointly, three directional background subspaces are created along the image height direction, the image width direction and the spectral direction, respectively. Then, the three corresponding orthogonal subspaces are calculated. After that, each vector of the local cube along the three directions is projected onto the corresponding orthogonal subspace. Finally, a composite score is computed from the three directional operators. In 3D-LOSP, anomalies are redefined as targets that are not only spectrally different from the background but also spatially distinct. Thanks to the addition of spatial information, the robustness of the anomaly detection result is greatly improved by the proposed 3D-LOSP algorithm. It is noteworthy that the proposed algorithm is an extension of LOSP, and this idea can inspire many other spectral-based anomaly detection methods. Experiments with real hyperspectral images have demonstrated the stability of the detection results.
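The core operator in this family of detectors is an orthogonal subspace projection. The sketch below shows only the spectral-direction operator; 3D-LOSP as described above would apply analogous projections along the height and width directions of the local cube and combine the three scores.

```python
import numpy as np

def osp_anomaly_score(pixel, background):
    """Orthogonal subspace projection (OSP) anomaly score.

    pixel      : (n_bands,) spectrum of the pixel under test
    background : (n_bands, k) matrix whose columns span the local background
                 subspace (e.g. leading eigenvectors of neighbouring pixels)
    Returns the residual energy after projecting out the background subspace.
    """
    B = background
    # I - B (B^T B)^-1 B^T, computed via the pseudoinverse for robustness.
    P_perp = np.eye(B.shape[0]) - B @ np.linalg.pinv(B)
    residual = P_perp @ pixel
    return float(residual @ residual)
```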
2007-03-01
Quadrature QPSK Quadrature Phase-Shift Keying RV Random Variable SHAC Single-Hop-Observation Auto-Correlation SINR Signal-to-Interference... The fast Fourier transform (FFT) accumulation method and the strip spectral correlation algorithm subdivide the support region in the bi-frequency... diamond shapes, while the strip spectral correlation algorithm subdivides the region into strips. Each strip covers a number of the FFT accumulation
Designing a practical system for spectral imaging of skylight.
López-Alvarez, Miguel A; Hernández-Andrés, Javier; Romero, Javier; Lee, Raymond L
2005-09-20
In earlier work [J. Opt. Soc. Am. A 21, 13-23 (2004)], we showed that a combination of linear models and optimum Gaussian sensors obtained by an exhaustive search can recover daylight spectra reliably from broadband sensor data. Thus our algorithm and sensors could be used to design an accurate, relatively inexpensive system for spectral imaging of daylight. Here we improve our simulation of the multispectral system by (1) considering the different kinds of noise inherent in electronic devices such as charge-coupled devices (CCDs) or complementary metal-oxide-semiconductor (CMOS) sensors and (2) extending our research to a different kind of natural illumination, skylight. Because exhaustive searches are computationally expensive, here we switch to a simulated annealing algorithm to define the optimum sensors for recovering skylight spectra. The annealing algorithm requires us to minimize a single cost function, and so we develop one that calculates both the spectral and colorimetric similarity of any pair of skylight spectra. We show that the simulated annealing algorithm yields results similar to the exhaustive search but with much less computational effort. Our technique lets us study the properties of optimum sensors in the presence of noise, one side effect of which is that adding more sensors may not improve the spectral recovery.
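A generic simulated annealing loop of the kind used to search for optimum Gaussian sensors might look like the sketch below; the cost function (combining spectral and colorimetric error) and the cooling schedule are assumptions, not the paper's exact choices.

```python
import numpy as np

def simulated_annealing(cost, x0, step=0.05, t0=1.0, cooling=0.995, n_iter=20000, rng=None):
    """Generic simulated annealing minimiser (sketch).

    cost : callable mapping a parameter vector (e.g. Gaussian sensor centres
           and widths) to a scalar combining spectral and colorimetric error
    x0   : initial parameter vector
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, float)
    fx = cost(x)
    best_x, best_f = x.copy(), fx
    t = t0
    for _ in range(n_iter):
        cand = x + step * rng.standard_normal(x.shape)   # random perturbation
        fc = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < np.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        t *= cooling                                      # geometric cooling
    return best_x, best_f
```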
NASA Technical Reports Server (NTRS)
Arduini, R. F.; Aherron, R. M.; Samms, R. W.
1984-01-01
A computational model of the deterministic and stochastic processes involved in multispectral remote sensing was designed to evaluate the performance of sensor systems and data processing algorithms for spectral feature classification. Accuracy in distinguishing between categories of surfaces or between specific types is developed as a means to compare sensor systems and data processing algorithms. The model allows studies to be made of the effects of variability of the atmosphere and of surface reflectance, as well as the effects of channel selection and sensor noise. Examples of these effects are shown.
An excitation wavelength-scanning spectral imaging system for preclinical imaging
NASA Astrophysics Data System (ADS)
Leavesley, Silas; Jiang, Yanan; Patsekin, Valery; Rajwa, Bartek; Robinson, J. Paul
2008-02-01
Small-animal fluorescence imaging is a rapidly growing field, driven by applications in cancer detection and pharmaceutical therapies. However, the practical use of this imaging technology is limited by image-quality issues related to autofluorescence background from animal tissues, as well as attenuation of the fluorescence signal due to scatter and absorption. To combat these problems, spectral imaging and analysis techniques are being employed to separate the fluorescence signal from background autofluorescence. To date, these technologies have focused on detecting the fluorescence emission spectrum at a fixed excitation wavelength. We present an alternative to this technique, an imaging spectrometer that detects the fluorescence excitation spectrum at a fixed emission wavelength. The advantages of this approach include increased available information for discrimination of fluorescent dyes, decreased optical radiation dose to the animal, and ability to scan a continuous wavelength range instead of discrete wavelength sampling. This excitation-scanning imager utilizes an acousto-optic tunable filter (AOTF), with supporting optics, to scan the excitation spectrum. Advanced image acquisition and analysis software has also been developed for classification and unmixing of the spectral image sets. Filtering has been implemented in a single-pass configuration with a bandwidth (full width at half maximum) of 16nm at 550nm central diffracted wavelength. We have characterized AOTF filtering over a wide range of incident light angles, much wider than has been previously reported in the literature, and we show how changes in incident light angle can be used to attenuate AOTF side lobes and alter bandwidth. A new parameter, in-band to out-of-band ratio, was defined to assess the quality of the filtered excitation light. Additional parameters were measured to allow objective characterization of the AOTF and the imager as a whole. This is necessary for comparing the excitation-scanning imager to other spectral and fluorescence imaging technologies. The effectiveness of the hyperspectral imager was tested by imaging and analysis of mice with injected fluorescent dyes. Finally, a discussion of the optimization of spectral fluorescence imagers is given, relating the effects of filter quality on fluorescence images collected and the analysis outcome.
Preconditioned Mixed Spectral Element Methods for Elasticity and Stokes Problems
NASA Technical Reports Server (NTRS)
Pavarino, Luca F.
1996-01-01
Preconditioned iterative methods for the indefinite systems obtained by discretizing the linear elasticity and Stokes problems with mixed spectral elements in three dimensions are introduced and analyzed. The resulting stiffness matrices have the structure of saddle point problems with a penalty term, which is associated with the Poisson ratio for elasticity problems or with stabilization techniques for Stokes problems. The main results of this paper show that the convergence rate of the resulting algorithms is independent of the penalty parameter, the number of spectral elements Nu and mildly dependent on the spectral degree eta via the inf-sup constant. The preconditioners proposed for the whole indefinite system are block-diagonal and block-triangular. Numerical experiments presented in the final section show that these algorithms are a practical and efficient strategy for the iterative solution of the indefinite problems arising from mixed spectral element discretizations of elliptic systems.
Wavelet compression techniques for hyperspectral data
NASA Technical Reports Server (NTRS)
Evans, Bruce; Ringer, Brian; Yeates, Mathew
1994-01-01
Hyperspectral sensors are electro-optic sensors which typically operate in visible and near infrared bands. Their characteristic property is the ability to resolve a relatively large number (i.e., tens to hundreds) of contiguous spectral bands to produce a detailed profile of the electromagnetic spectrum. In contrast, multispectral sensors measure relatively few non-contiguous spectral bands. Like multispectral sensors, hyperspectral sensors are often also imaging sensors, measuring spectra over an array of spatial resolution cells. The data produced may thus be viewed as a three dimensional array of samples in which two dimensions correspond to spatial position and the third to wavelength. Because they multiply the already large storage/transmission bandwidth requirements of conventional digital images, hyperspectral sensors generate formidable torrents of data. Their fine spectral resolution typically results in high redundancy in the spectral dimension, so that hyperspectral data sets are excellent candidates for compression. Although there have been a number of studies of compression algorithms for multispectral data, we are not aware of any published results for hyperspectral data. Three algorithms for hyperspectral data compression are compared. They were selected as representatives of three major approaches for extending conventional lossy image compression techniques to hyperspectral data. The simplest approach treats the data as an ensemble of images and compresses each image independently, ignoring the correlation between spectral bands. The second approach transforms the data to decorrelate the spectral bands, and then compresses the transformed data as a set of independent images. The third approach directly generalizes two-dimensional transform coding by applying a three-dimensional transform as part of the usual transform-quantize-entropy code procedure. The algorithms studied all use the discrete wavelet transform. In the first two cases, a wavelet transform coder was used for the two-dimensional compression. The third case used a three dimensional extension of this same algorithm.
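Two of the three approaches compared above (band-by-band 2-D DWT and a full 3-D DWT) can be sketched with PyWavelets as follows. For brevity the sketch replaces the quantize-and-entropy-code stage with simple coefficient thresholding, so it illustrates the transforms rather than the exact coders studied; the wavelet choice and retained-coefficient fraction are illustrative.

```python
import numpy as np
import pywt

def compress_bandwise(cube, wavelet="db4", level=3, keep=0.05):
    """Approach 1: compress each spectral band independently with a 2-D DWT,
    keeping only the largest `keep` fraction of coefficients (lossy sketch)."""
    out = np.empty(cube.shape, dtype=float)
    for b in range(cube.shape[2]):
        coeffs = pywt.wavedec2(cube[:, :, b], wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        arr[np.abs(arr) < np.quantile(np.abs(arr), 1.0 - keep)] = 0.0
        rec = pywt.waverec2(
            pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet)
        out[:, :, b] = rec[: cube.shape[0], : cube.shape[1]]
    return out

def compress_3d(cube, wavelet="db4", level=2, keep=0.05):
    """Approach 3: a single 3-D DWT over the whole (row, col, band) cube."""
    coeffs = pywt.wavedecn(cube.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    arr[np.abs(arr) < np.quantile(np.abs(arr), 1.0 - keep)] = 0.0
    rec = pywt.waverecn(
        pywt.array_to_coeffs(arr, slices, output_format="wavedecn"), wavelet)
    return rec[tuple(slice(s) for s in cube.shape)]
```

The second approach described in the abstract (spectral decorrelation followed by per-image coding) would insert a decorrelating transform along the band axis before the 2-D step.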
NASA Astrophysics Data System (ADS)
Bassani, C.; Cavalli, R. M.; Fasulli, L.; Palombo, A.; Pascucci, S.; Santini, F.; Pignatti, S.
2009-04-01
The application of remote sensing data for detecting subsurface structures is becoming a remarkable tool for archaeological observations, to be combined with near surface geophysics [1, 2]. Indeed, different satellite and airborne sensors have been used for archaeological applications, such as the identification of spectral anomalies (i.e. marks) related to buried remnants within archaeological sites, and the management and protection of archaeological sites [3, 5]. The dominant factors that affect the spectral detectability of marks related to man-made archaeological structures are: (1) the spectral contrast between the target and background materials, (2) the proportion of the target on the surface (relative to the background), (3) the characteristics of the imaging system being used (i.e. bands, instrument noise and pixel size), and (4) the conditions under which the surface is being imaged (i.e. illumination and atmospheric conditions) [4]. In this context, only a few airborne hyperspectral sensors have been applied to cultural heritage studies, among them AVIRIS (Airborne Visible/Infrared Imaging Spectrometer), CASI (Compact Airborne Spectrographic Imager), HyMAP (Hyperspectral MAPping) and MIVIS (Multispectral Infrared and Visible Imaging Spectrometer). The application of high spatial/spectral resolution imagery therefore raises the question of the trade-off between high spectral and high spatial resolution imagery for archaeological applications, and of which spectral region is optimal for the detection of subsurface structures. This paper points out the spectral information most useful for evaluating image capability in terms of spectral anomaly detection of subsurface archaeological structures in different land cover contexts. In this study, we assess the capability of MIVIS and CASI reflectances and of ATM and MIVIS emissivities (Table 1) for subsurface archaeological prospection in different sites of the Arpi archaeological area (southern Italy). We identify, for the selected sites, three main land covers overlying the buried structures: (a) photosynthetic vegetation (i.e. green low vegetation), (b) non-photosynthetic vegetation (i.e. yellow, dry low vegetation), and (c) dry bare soil. Afterwards, we analyse the spectral regions showing an inherent potential for archaeological detection as a function of the land cover characteristics. The classified land cover units have been used in a spectral mixture analysis to assess the land cover fractional abundance overlying the buried structures (i.e. the mark-background system). The classification and unmixing results for the CASI, MIVIS and ATM remote sensing data showed good agreement both in the land cover units and in the identification of the subsurface structures. The integrated analysis of the unmixing results for the three sensors allowed us to establish that, for land cover characterized by green and dry vegetation (occurrence higher than 75%), the visible and near infrared (VNIR) spectral regions better enhance the buried man-made structures. In particular, if the structures are covered by more than 75% vegetation, the two most promising wavelengths for their detection are the chlorophyll peak at 0.56 μm (visible region) and the red edge region (0.67 to 0.72 μm; NIR region).
This result confirms that the variation induced by the subsurface structures (e.g., stone walls, tile concentrations, pavements near the surface, road networks) on natural vegetation growth and/or colour (i.e., through different stress factors) is primarily detectable by the chlorophyll peak and the red edge region used for vegetation stress detection. Whereas, if dry soils cover the structures (occurrence higher than 75%), both the VNIR and thermal infrared (TIR) regions are suitable to detect the subsurface structures. This work demonstrates that airborne reflectance and emissivity data, even at different spatial/spectral resolutions and acquisition times, represent an effective and rapid tool to detect subsurface structures within different land cover contexts. In conclusion, this study reveals that airborne multi/hyperspectral image processing can be an effective and cost-efficient tool for a preliminary analysis of areas hosting large cultural heritage assets, prioritising and localizing the sites where near surface geophysics surveys should be applied.

Table 1. Characteristics of airborne sensors used for the Arpi test area.
Sensor | Spectral region (channels) | Spectral resolution (μm) | Spectral range (μm) | Spatial resolution (m) | IFOV (deg)
ATM | VIS-NIR-SWIR-TIR (tot. 12 ch.) | variable from 24 to 3100 | 0.42 - 1150 | 2 | 0.143
CASI | VNIR (48 ch.) | 0.01 | 0.40-0.94 | 2 | 0.115
MIVIS | VNIR (28 ch.) | 0.02 (VIS), 0.05 (NIR) | 0.43-0.83 (VIS), 1.15-1.55 (NIR) | 6-7 | 0.115
MIVIS | SWIR (64 ch.) | 0.09 | 1.983-2.478 | |
MIVIS | TIR (10 ch.) | 0.34-0.54 | 8.180-12.700 | |

References
[1] Beck, A., Philip, G., Abdulkarim, M. and Donoghue, D., 2007. Evaluation of Corona and Ikonos high resolution satellite imagery for archaeological prospection in western Syria. Antiquity, 81: 161-175.
[2] Altaweel, M., 2005. The Use of ASTER Satellite Imagery in Archaeological Contexts. Archaeological Prospection, 12: 151-166.
[3] Cavalli, R.M.; Colosi, F.; Palombo, A.; Pignatti, S.; Poscolieri, M. Remote hyperspectral imagery as a support to archaeological prospection. J. of Cultural Heritage 2007, 8, 272-283.
[4] Kucukkaya, A.G. Photogrammetry and remote sensing in archaeology. J. Quant. Spectrosc. Radiat. Transfer 2004, 97(1-3), 83-97.
[5] Rowlands, A.; Sarris, A. Detection of exposed and subsurface archaeological remains using multi-sensor remote sensing. J. of Archaeological Science 2007, 34, 795-803.
NASA Astrophysics Data System (ADS)
Xie, ChengJun; Xu, Lin
2008-03-01
This paper presents a new algorithm based on mixed transforms to eliminate redundancy: SHIRCT and a subtraction mixing transform are used to eliminate spectral redundancy, and a 2D-CDF(2,2) DWT is used to eliminate spatial redundancy. This transform is convenient for hardware realization, since it can be fully implemented with add and shift operations, and its redundancy elimination is better than that of the (1D+2D) CDF(2,2) DWT. An improved SPIHT+CABAC hybrid compression coding algorithm is then used to implement the compression coding. The experimental results show that in lossless image compression applications the effect of this method is slightly better than the result acquired using (1D+2D) CDF(2,2) DWT + improved SPIHT + CABAC, and it is much better than the results acquired by JPEG-LS, WinZip, ARJ, DPCM, the research achievements of a research team of the Chinese Academy of Sciences, NMST and MST. Using the hyper-spectral image Canal from the American JPL laboratory as the data set for the lossless compression test, on average the compression ratio of this algorithm exceeds the above algorithms by 42%, 37%, 35%, 30%, 16%, 13% and 11%, respectively.
Mandarin Chinese Tone Identification in Cochlear Implants: Predictions from Acoustic Models
Morton, Kenneth D.; Torrione, Peter A.; Throckmorton, Chandra S.; Collins, Leslie M.
2015-01-01
It has been established that current cochlear implants do not supply adequate spectral information for perception of tonal languages. Comprehension of a tonal language, such as Mandarin Chinese, requires recognition of lexical tones. New strategies of cochlear stimulation such as variable stimulation rate and current steering may provide the means of delivering more spectral information and thus may provide the auditory fine structure required for tone recognition. Several cochlear implant signal processing strategies are examined in this study, the continuous interleaved sampling (CIS) algorithm, the frequency amplitude modulation encoding (FAME) algorithm, and the multiple carrier frequency algorithm (MCFA). These strategies provide different types and amounts of spectral information. Pattern recognition techniques can be applied to data from Mandarin Chinese tone recognition tasks using acoustic models as a means of testing the abilities of these algorithms to transmit the changes in fundamental frequency indicative of the four lexical tones. The ability of processed Mandarin Chinese tones to be correctly classified may predict trends in the effectiveness of different signal processing algorithms in cochlear implants. The proposed techniques can predict trends in performance of the signal processing techniques in quiet conditions but fail to do so in noise. PMID:18706497
NASA Technical Reports Server (NTRS)
Lustman, L.
1984-01-01
An outline of spectral methods for partial differential equations is presented. The basic spectral algorithm is defined, collocation methods are emphasized, and the main advantage of the method, the infinite order of accuracy in problems with smooth solutions, is discussed. Examples of theoretical numerical analysis of spectral calculations are presented, along with an application of spectral methods to transonic flow. The full potential transonic equation is among the best understood nonlinear equations.
Radiation anomaly detection algorithms for field-acquired gamma energy spectra
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen
2015-08-01
The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite the strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and the relative merits of these different algorithms will be discussed and demonstrated.
Restoration of MRI Data for Field Nonuniformities using High Order Neighborhood Statistics
Hadjidemetriou, Stathis; Studholme, Colin; Mueller, Susanne; Weiner, Michael; Schuff, Norbert
2007-01-01
MRI at high magnetic fields (> 3.0 T) is complicated by strong inhomogeneous radio-frequency fields, sometimes termed the "bias field". These lead to nonuniformity of image intensity, greatly complicating further analysis such as registration and segmentation. Existing methods for bias field correction are effective for 1.5 T or 3.0 T MRI, but are not completely satisfactory for higher field data. This paper develops an effective bias field correction for high field MRI based on the assumption that the nonuniformity is smoothly varying in space. The nonuniformity is quantified and unmixed using high order neighborhood statistics of intensity co-occurrences, computed within spherical windows of limited size over the entire image. The restoration is iterative and makes use of a novel stable stopping criterion that depends on the scaled entropy of the co-occurrence statistics, which is a non-monotonic function of the iterations: the Shannon entropy of the co-occurrence statistics normalized to the effective dynamic range of the image. The algorithm restores whole head data, is robust to the intense nonuniformities present in high field acquisitions, and is robust to variations in anatomy. This algorithm significantly improves bias field correction in comparison to N3 on phantom 1.5 T head data and high field 4 T human head data. PMID:18193095
Novel Spectral Representations and Sparsity-Driven Algorithms for Shape Modeling and Analysis
NASA Astrophysics Data System (ADS)
Zhong, Ming
In this dissertation, we focus on extending classical spectral shape analysis by incorporating spectral graph wavelets and sparsity-seeking algorithms. Defined with the graph Laplacian eigenbasis, spectral graph wavelets are localized both in the vertex domain and in the graph spectral domain, and thus are very effective in describing local geometry. With a rich dictionary of elementary vectors and suitable sparsity constraints, a real-life signal can often be well approximated by a very sparse coefficient representation. The many successful applications of sparse signal representation in computer vision and image processing inspire us to explore the idea of employing sparse modeling techniques with dictionaries of spectral bases to solve various shape modeling problems. Conventional spectral mesh compression uses the eigenfunctions of the mesh Laplacian as shape bases, which are highly inefficient in representing local geometry. To ameliorate this, we advocate an innovative approach to 3D mesh compression that uses spectral graph wavelets as a dictionary to encode mesh geometry. The spectral graph wavelets are locally defined at individual vertices and can better capture local shape information than the Laplacian eigenbasis. The multi-scale SGWs form a redundant dictionary as a shape basis, so we formulate the compression of 3D shape as a sparse approximation problem that can be readily handled by greedy pursuit algorithms. Surface inpainting refers to the completion or recovery of missing shape geometry based on the shape information that is currently available. We devise a new surface inpainting algorithm founded upon the theory and techniques of sparse signal recovery. Instead of estimating the missing geometry directly, our novel method is to find a low-dimensional representation which describes the entire original shape. More specifically, we find that, for many shapes, the vertex coordinate function can be well approximated by a very sparse coefficient representation with respect to the dictionary comprising its Laplacian eigenbasis, and it is then possible to recover this sparse representation from partial measurements of the original shape. Taking advantage of the sparsity cue, we advocate a novel variational approach for surface inpainting, integrating data fidelity constraints on the shape domain with coefficient sparsity constraints on the transformed domain. Because of the powerful properties of the Laplacian eigenbasis, the inpainting results of our method tend to be globally coherent with the remaining shape. Informative and discriminative feature descriptors are vital in qualitative and quantitative shape analysis for a large variety of graphics applications. We advocate novel strategies to define generalized, user-specified features on shapes. Our new region descriptors are primarily built upon the coefficients of spectral graph wavelets, which are both multi-scale and multi-level in nature, consisting of both local and global information. Based on our novel spectral feature descriptor, we developed a user-specified feature detection framework and a tensor-based shape matching algorithm. Through various experiments, we demonstrate the competitive performance of our proposed methods and the great potential of spectral bases and sparsity-driven methods for shape modeling.
NASA Astrophysics Data System (ADS)
Deng, Junjun; Zhang, Yanru; Qiu, Yuqing; Zhang, Hongliang; Du, Wenjiao; Xu, Lingling; Hong, Youwei; Chen, Yanting; Chen, Jinsheng
2018-04-01
Source apportionment of fine particulate matter (PM2.5) was conducted at the Lin'an Regional Atmospheric Background Station (LA) in the Yangtze River Delta (YRD) region of China from July 2014 to April 2015 with three receptor models: principal component analysis combined with multiple linear regression (PCA-MLR), UNMIX and Positive Matrix Factorization (PMF). The model performance, source identification and source contributions of the three models were analyzed and inter-compared. Good correlations between the reconstructed and measured concentrations of PM2.5 and its major chemical species were obtained for all models. PMF resolved almost all of the PM2.5 mass, while PCA-MLR and UNMIX explained about 80%. Five, four and seven sources were identified by PCA-MLR, UNMIX and PMF, respectively. Combustion, a secondary source, a marine source, dust and industrial activities were identified by all three receptor models. The combustion and secondary sources were the major contributors, together accounting for over 60% of PM2.5. The PMF model performed better at separating the different combustion sources. These findings improve the understanding of PM2.5 sources in background regions.
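A simplified, uncertainty-free analogue of the factor-analysis receptor models above can be written with plain non-negative matrix factorization; unlike PMF it does not weight residuals by measurement uncertainties, so it is only a sketch of the general idea.

```python
import numpy as np
from sklearn.decomposition import NMF

def nmf_source_apportionment(X, n_sources):
    """Factorise a non-negative samples x species concentration matrix X into
    source contributions G (samples x sources) and source profiles F
    (sources x species).  A simplified stand-in for PMF/UNMIX-type models."""
    model = NMF(n_components=n_sources, init="nndsvda", max_iter=2000, random_state=0)
    G = model.fit_transform(X)          # time series of source contributions
    F = model.components_               # chemical profile of each source
    # Normalise each profile to sum to one and fold the scale into G,
    # so G carries the mass contribution of each source to each sample.
    scale = F.sum(axis=1, keepdims=True)
    return G * scale.T, F / scale
```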
Speech enhancement on smartphone voice recording
NASA Astrophysics Data System (ADS)
Tris Atmaja, Bagus; Nur Farid, Mifta; Arifianto, Dhany
2016-11-01
Speech enhancement is a challenging task in audio signal processing: enhancing the quality of a targeted speech signal while suppressing other noise. Speech enhancement algorithms have evolved rapidly, from spectral subtraction and Wiener filtering, through spectral amplitude MMSE estimators, to non-negative matrix factorization (NMF). The smartphone, as a revolutionary device, is now used in all aspects of life, including journalism, both personally and professionally. Although many smartphones have two microphones (main and rear), only the main microphone is widely used for voice recording, which is why the NMF algorithm is widely used for speech enhancement in this setting. This paper evaluates speech enhancement on smartphone voice recordings using the algorithms mentioned previously. We also extend the NMF algorithm to Kullback-Leibler NMF with supervised separation. The last algorithm shows improved results compared with the others, as evaluated by spectrograms and PESQ scores.
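Supervised Kullback-Leibler NMF separation of the kind mentioned above can be sketched with multiplicative updates in which the dictionary is held fixed; the speech and noise dictionaries are assumed to have been learned beforehand from training spectrograms.

```python
import numpy as np

def kl_nmf_activations(V, W, n_iter=200, eps=1e-10):
    """Multiplicative updates for the activations H of V ~ W @ H under the
    Kullback-Leibler divergence, with the dictionary W held fixed."""
    rng = np.random.default_rng(0)
    H = rng.random((W.shape[1], V.shape[1])) + eps
    ones = np.ones_like(V)
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)
    return H

def separate_speech(V_mix, W_speech, W_noise):
    """V_mix is the magnitude spectrogram of the noisy recording; W_speech and
    W_noise are pre-trained dictionaries.  Returns a Wiener-like soft-mask
    estimate of the clean speech spectrogram."""
    W = np.hstack([W_speech, W_noise])
    H = kl_nmf_activations(V_mix, W)
    k = W_speech.shape[1]
    V_speech = W_speech @ H[:k]
    V_noise = W_noise @ H[k:]
    return V_mix * V_speech / (V_speech + V_noise + 1e-10)
```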
Jacobi spectral Galerkin method for elliptic Neumann problems
NASA Astrophysics Data System (ADS)
Doha, E.; Bhrawy, A.; Abd-Elhameed, W.
2009-01-01
This paper is concerned with fast spectral-Galerkin Jacobi algorithms for solving one- and two-dimensional elliptic equations with homogeneous and nonhomogeneous Neumann boundary conditions. The paper extends the algorithms proposed by Shen (SIAM J Sci Comput 15:1489-1505, 1994) and Auteri et al. (J Comput Phys 185:427-444, 2003), based on Legendre polynomials, to Jacobi polynomials with arbitrary α and β. The key to the efficiency of our algorithms is to construct appropriate basis functions with zero slope at the endpoints, which lead to systems with sparse matrices for the discrete variational formulations. The direct solution algorithm developed for the homogeneous Neumann problem in two dimensions relies upon a tensor product process. Nonhomogeneous Neumann data are accounted for by means of a lifting. Numerical results indicating the high accuracy and effectiveness of these algorithms are presented.
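Schematically (for the Legendre-type special case α = β = 0, and omitting the Jacobi-weight details), the construction rests on a 1-D Neumann model problem and compact combinations of orthogonal polynomials with zero slope at the endpoints; the coefficients a_k, b_k below stand in for those derived in the paper and are not reproduced here.

```latex
% Schematic 1-D Neumann model problem and compact basis (Legendre-type case).
\begin{aligned}
& -u''(x) + \gamma\,u(x) = f(x), \quad x \in (-1,1), \qquad u'(\pm 1) = 0, \\
& u_N(x) = \sum_{k} c_k\,\phi_k(x), \qquad
  \phi_k = J_k^{(\alpha,\beta)} + a_k\,J_{k+1}^{(\alpha,\beta)} + b_k\,J_{k+2}^{(\alpha,\beta)},
  \qquad \phi_k'(\pm 1) = 0, \\
& \text{Galerkin system:}\quad (S + \gamma M)\,\mathbf{c} = \mathbf{f}, \qquad
  S_{jk} = \int_{-1}^{1} \phi_j'(x)\,\phi_k'(x)\,dx, \quad
  M_{jk} = \int_{-1}^{1} \phi_j(x)\,\phi_k(x)\,dx .
\end{aligned}
```

The three-term combinations make S and M sparse, which is what the fast direct (tensor product) solvers exploit.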
Optimizing interconnections to maximize the spectral radius of interdependent networks
NASA Astrophysics Data System (ADS)
Chen, Huashan; Zhao, Xiuyan; Liu, Feng; Xu, Shouhuai; Lu, Wenlian
2017-03-01
The spectral radius (i.e., the largest eigenvalue) of the adjacency matrices of complex networks is an important quantity that governs the behavior of many dynamic processes on the networks, such as synchronization and epidemics. Studies in the literature focused on bounding this quantity. In this paper, we investigate how to maximize the spectral radius of interdependent networks by optimally linking k internetwork connections (or interconnections for short). We derive formulas for the estimation of the spectral radius of interdependent networks and employ these results to develop a suite of algorithms that are applicable to different parameter regimes. In particular, a simple algorithm is to link the k nodes with the largest k eigenvector centralities in one network to the node in the other network with a certain property related to both networks. We demonstrate the applicability of our algorithms via extensive simulations. We discuss the physical implications of the results, including how the optimal interconnections can more effectively decrease the threshold of epidemic spreading in the susceptible-infected-susceptible model and the threshold of synchronization of coupled Kuramoto oscillators.
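The simple algorithm mentioned above can be sketched with NetworkX as follows. The choice of the node in the second network is an assumption here (its own eigenvector-centrality maximiser), standing in for the paper's exact criterion.

```python
import networkx as nx

def add_interconnections(G_a, G_b, k):
    """Heuristic sketch: connect the k nodes of network A with the largest
    eigenvector centralities to a single well-connected node of network B."""
    ec_a = nx.eigenvector_centrality_numpy(G_a)
    ec_b = nx.eigenvector_centrality_numpy(G_b)
    top_a = sorted(ec_a, key=ec_a.get, reverse=True)[:k]
    hub_b = max(ec_b, key=ec_b.get)
    # Build the interdependent network on a disjoint union of the node sets.
    H = nx.union(G_a, G_b, rename=("a-", "b-"))
    for v in top_a:
        H.add_edge(f"a-{v}", f"b-{hub_b}")
    return H

# The resulting spectral radius can be checked via the adjacency spectrum,
# e.g. max(abs(w) for w in nx.adjacency_spectrum(H)).
```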
An analysis of spectral envelope-reduction via quadratic assignment problems
NASA Technical Reports Server (NTRS)
George, Alan; Pothen, Alex
1994-01-01
A new spectral algorithm for reordering a sparse symmetric matrix to reduce its envelope size is described. The ordering is computed by associating a Laplacian matrix with the given matrix and then sorting the components of a specified eigenvector of the Laplacian. In this paper, we provide an analysis of the spectral envelope reduction algorithm. We describe the related 1- and 2-sum problems; the former is related to the envelope size, while the latter is related to an upper bound on the work involved in an envelope Cholesky factorization scheme. We formulate the latter two problems as quadratic assignment problems and then study the 2-sum problem in more detail. We obtain lower bounds on the 2-sum by considering a projected quadratic assignment problem, and then show that finding a permutation matrix closest to an orthogonal matrix attaining one of the lower bounds justifies the spectral envelope reduction algorithm. The lower bound on the 2-sum is seen to be tight for reasonably 'uniform' finite element meshes. We also obtain asymptotically tight lower bounds on the envelope size for certain classes of meshes.
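The spectral reordering itself is compact: associate a graph Laplacian with the sparsity pattern of the matrix and sort the entries of its Fiedler vector. The sketch below is a generic illustration of that step, not the paper's full analysis.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import eigsh

def spectral_envelope_ordering(A):
    """Return a permutation that orders the rows/columns of the sparse
    symmetric matrix A by the components of the Fiedler vector of the
    Laplacian associated with A's sparsity pattern."""
    A = sp.csr_matrix(A)
    S = A.copy().astype(float)
    S.data[:] = 1.0                 # keep only the sparsity pattern
    S.setdiag(0.0)
    S.eliminate_zeros()
    L = laplacian(S)
    # Two smallest eigenpairs; the second eigenvector is the Fiedler vector.
    _, vecs = eigsh(L, k=2, which="SM")
    return np.argsort(vecs[:, 1])

# perm = spectral_envelope_ordering(A); A_reordered = A[perm][:, perm]
```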
Multitaper scan-free spectrum estimation using a rotational shear interferometer.
Lepage, Kyle; Thomson, David J; Kraut, Shawn; Brady, David J
2006-05-01
Multitaper methods for scan-free spectrum estimation using a rotational shear interferometer are investigated. Before source spectra can be estimated, the sources must be detected. A source detection algorithm based upon the multitaper F-test is proposed. The algorithm is simulated with additive, white Gaussian detector noise. A source with a signal-to-noise ratio (SNR) of 0.71 is detected 2.9 degrees from a source with an SNR of 70.1, at a significance level of 10^-4, approximately 4 orders of magnitude more significant than the source detection obtained with a standard detection algorithm. Interpolation and the use of prewhitening filters are investigated in the context of rotational shear interferometer (RSI) source spectra estimation. Finally, a multitaper spectrum estimator is proposed, simulated, and compared with untapered estimates. The multitaper estimate is found via simulation to distinguish a spectral feature with an SNR of 1.6 near a large spectral feature; this feature is not distinguished by the untapered spectrum estimate. The findings are consistent with the strong capability of the multitaper estimate to reduce out-of-band spectral leakage.
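A basic multitaper spectrum estimate of the kind underlying these results can be sketched with SciPy's DPSS tapers; the harmonic F-test used for source detection builds on the same tapered eigencoefficients but is not reproduced here.

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs=1.0, NW=4.0, K=None):
    """Average the periodograms of the data tapered by the first K discrete
    prolate spheroidal sequences (DPSS) to reduce out-of-band leakage."""
    x = np.asarray(x, float)
    n = len(x)
    K = K if K is not None else int(2 * NW) - 1
    tapers = dpss(n, NW, Kmax=K)                  # (K, n) orthonormal tapers
    eigcoeffs = np.fft.rfft(tapers * x, axis=1)   # tapered "eigencoefficients"
    psd = np.mean(np.abs(eigcoeffs) ** 2, axis=0) / fs
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd
```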
Yu, Shuang; Liu, Guo-hai; Xia, Rong-sheng; Jiang, Hui
2016-01-01
In order to achieve rapid monitoring of the process state of solid state fermentation (SSF), this study attempted qualitative identification of the process state of SSF of feed protein by Fourier transform near infrared (FT-NIR) spectroscopy. Specifically, FT-NIR spectroscopy combined with an Adaboost-SRDA-NN integrated learning algorithm was used to accurately and rapidly monitor chemical and physical changes in SSF of feed protein without the need for chemical analysis. Firstly, the raw spectra of all 140 fermentation samples were collected with a Fourier transform near infrared spectrometer (Antaris II) and preprocessed with the standard normal variate transformation (SNV) spectral preprocessing algorithm. Thereafter, the characteristic information of the preprocessed spectra was extracted by spectral regression discriminant analysis (SRDA). Finally, the nearest neighbors (NN) algorithm was selected as the basic classifier, and a state recognition model was built to identify the different fermentation samples in the validation set. The experimental results showed that the SRDA-NN model performed better than two other NN models developed from feature information obtained by principal component analysis (PCA) and linear discriminant analysis (LDA); the correct recognition rate of the SRDA-NN model reached 94.28% in the validation set. To further improve the recognition accuracy of the final model, an Adaboost-SRDA-NN ensemble learning algorithm was proposed by integrating the Adaboost and SRDA-NN methods, and the presented algorithm was used to construct the online monitoring model of the process state of SSF of feed protein. The experimental results showed that the prediction performance of the SRDA-NN model was further enhanced by the Adaboost lifting algorithm, and the correct recognition rate of the Adaboost-SRDA-NN model reached 100% in the validation set. The overall results demonstrate that the SRDA algorithm can effectively extract spectral feature information and reduce the spectral dimension during calibration of qualitative NIR spectroscopy models. In addition, the Adaboost lifting algorithm can improve the classification accuracy of the final model. The results obtained in this work can provide a research foundation for developing online monitoring instruments for the SSF process.
Application of hierarchical Bayesian unmixing models in river sediment source apportionment
NASA Astrophysics Data System (ADS)
Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Kuzyk, Zou Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pascal; Semmens, Brice
2016-04-01
Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MIXSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixtures' to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling process, (3) deriving and using informative priors in sediment fingerprinting context and (4) transparency of the process and replication of model results by other users.
Resolution Study of a Hyperspectral Sensor using Computed Tomography in the Presence of Noise
2012-06-14
diffraction efficiency is dependent on wavelength. Compared to techniques developed by later work, simple algebraic reconstruction techniques were used...spectral dimension, using computed tomography (CT) techniques with only a finite number of diverse images. CTHIS requires a reconstruction algorithm in...many frames are needed to reconstruct the spectral cube of a simple object using a theoretical lower bound. In this research a new algorithm is derived
Two-dimensional imaging of gas temperature and concentration based on hyperspectral tomography
NASA Astrophysics Data System (ADS)
Xin, Ming-yuan; Jin, Xing; Wang, Guang-yu; Song, Junling
2016-10-01
Two-dimensional imaging of gas temperature and concentration is realized by hyperspectral tomography, which exploits multi-wavelength absorption spectral information so that imaging can be accomplished with a small number of projections and viewing angles. A temperature and concentration model is established to simulate the combustion conditions, and a total of 10 near-infrared absorption features of H2O are used. An improved simulated annealing algorithm that adjusts the search step is employed as the main search algorithm for the tomography. The stability of the algorithm is tested by adding random errors to the absorption area information, and the results are compared with reconstructions from the algebraic reconstruction technique, which uses only 2 spectral features for imaging. The results show that the two methods perform equivalently at low noise levels, but at high noise levels hyperspectral tomography proves more stable.
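A minimal sketch of simulated annealing with an adaptive search step, the kind of optimizer named above, applied to a generic least-squares data-fit term standing in for the tomographic objective; the cooling schedule, step-adaptation rule and toy problem are assumptions, not the authors' settings.

```python
# Simulated annealing with acceptance-rate-driven step adaptation.
import numpy as np

def anneal(objective, x0, step=0.5, T0=1.0, cooling=0.995, iters=20000, seed=0):
    rng = np.random.default_rng(seed)
    x, fx, T = x0.copy(), objective(x0), T0
    accepted = 0
    for k in range(1, iters + 1):
        cand = x + rng.normal(scale=step, size=x.shape)
        fc = objective(cand)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc
            accepted += 1
        T *= cooling
        if k % 200 == 0:                       # adapt step to keep a moderate
            rate = accepted / 200.0            # acceptance rate, then reset counter
            step *= 1.1 if rate > 0.4 else 0.9
            accepted = 0
    return x, fx

# Toy use: recover x from noisy "projections" b = A @ x_true + noise.
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 20)); x_true = rng.uniform(0, 1, 20)
b = A @ x_true + rng.normal(scale=0.01, size=40)
x_hat, err = anneal(lambda x: np.sum((A @ x - b) ** 2), x0=np.zeros(20))
```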
Model-based spectral estimation of Doppler signals using parallel genetic algorithms.
Solano González, J; Rodríguez Vázquez, K; García Nocetti, D F
2000-05-01
Conventional spectral analysis methods use a fast Fourier transform (FFT) on consecutive or overlapping windowed data segments. For Doppler ultrasound signals, this approach suffers from inadequate frequency resolution due to the limited time-segment duration and the non-stationary character of the signals. Parametric or model-based estimators can give significant improvements in time-frequency resolution at the expense of higher computational complexity. This work describes an approach that implements, in real time, a parametric spectral estimation method using genetic algorithms (GAs) to find the optimum set of parameters for the adaptive filter that minimises the error function. The aim is to reduce the computational complexity of the conventional algorithm by exploiting the simplicity and parallel characteristics of GAs. This allows the implementation of higher-order filters, increasing the spectral resolution and opening a greater scope for using more complex methods.
Automated computation of autonomous spectral submanifolds for nonlinear modal analysis
NASA Astrophysics Data System (ADS)
Ponsioen, Sten; Pedergnana, Tiemo; Haller, George
2018-04-01
We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.
Overlapping communities detection based on spectral analysis of line graphs
NASA Astrophysics Data System (ADS)
Gui, Chun; Zhang, Ruisheng; Hu, Rongjing; Huang, Guoming; Wei, Jiaxuan
2018-05-01
Communities in networks often overlap, with one vertex belonging to several clusters. Meanwhile, many networks show hierarchical structure, in which communities are recursively grouped into a hierarchical organization. In order to obtain overlapping communities from a global hierarchy of vertices, a new algorithm (named SAoLG) is proposed to build the hierarchical organization while detecting the overlap of community structure. SAoLG applies spectral analysis to line graphs to unify the overlapping and hierarchical structure of the communities. To avoid the limitations of absolute distances such as the Euclidean distance, SAoLG employs the angular distance to compute the similarity between vertices. Furthermore, we introduce a slightly improved partition density to evaluate the quality of community structure and use it to obtain more reasonable community numbers. The proposed SAoLG algorithm achieves a balance between overlap and hierarchy by applying spectral analysis to edge community detection. Experimental results on one standard network and six real-world networks show that SAoLG achieves higher modularity and more reasonable community numbers than Ahn's algorithm and the classical CPM and GN algorithms.
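A minimal sketch of the line-graph idea above: spectrally cluster the edges of a network via its line graph, then map edge clusters back to vertices so that vertices whose edges fall in different clusters become overlapping members. The example graph and cluster count are assumptions; the paper's angular-distance similarity and improved partition density are not reproduced here.

```python
# Overlapping communities from spectral clustering of a line graph.
import networkx as nx
import numpy as np
from sklearn.cluster import SpectralClustering

G = nx.karate_club_graph()                    # stand-in network
L = nx.line_graph(G)                          # nodes of L are the edges of G
edges = list(L.nodes())
A = nx.to_numpy_array(L, nodelist=edges)      # adjacency of the line graph

labels = SpectralClustering(n_clusters=4, affinity="precomputed",
                            random_state=0).fit_predict(A)

# Map edge communities back to (possibly overlapping) vertex communities.
membership = {v: set() for v in G.nodes()}
for (u, v), c in zip(edges, labels):
    membership[u].add(c)
    membership[v].add(c)
overlapping = {v: cs for v, cs in membership.items() if len(cs) > 1}
```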
NASA Astrophysics Data System (ADS)
Huo, Yanfeng; Duan, Minzheng; Tian, Wenshou; Min, Qilong
2015-08-01
A differential optical absorption spectroscopy (DOAS)-like algorithm is developed to retrieve the column-averaged dry-air mole fraction of carbon dioxide (XCO2) from ground-based hyperspectral measurements of the direct solar beam. Unlike the spectral fitting method, which minimizes the difference between observed and simulated spectra, the ratios of multiple channel pairs (one weak and one strong absorption channel) are used to retrieve XCO2 from measurements in the shortwave infrared (SWIR) band. Based on sensitivity tests, a super channel pair is carefully selected to reduce the effects of solar lines, water vapor, air temperature, pressure, instrument noise, and frequency shift on retrieval errors. The new algorithm reduces computational cost, and the retrievals are less sensitive to temperature and H2O uncertainty than those of the spectral fitting method. Multi-day Total Carbon Column Observing Network (TCCON) measurements under clear-sky conditions at two sites (Tsukuba and Bremen) are used to derive XCO2 for algorithm evaluation and validation. The DOAS-like results agree very well with those of the TCCON algorithm after correction of an airmass-dependent bias.
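A minimal sketch of the channel-pair ratio idea, assuming simple Beer-Lambert direct-beam transmittances: the log-ratio of a weak and a strong absorption channel is linear in the absorber column times the airmass, so the column follows from a single division. The cross sections, airmass and synthetic measurement are illustrative assumptions, not TCCON or paper values.

```python
# DOAS-like two-channel column retrieval under a Beer-Lambert assumption.
import numpy as np

sigma_weak, sigma_strong = 2.0e-23, 9.0e-23   # assumed cross sections [cm^2/molec]
airmass = 2.0                                 # assumed geometric airmass factor

def simulate_pair(column, i0_weak=1.0, i0_strong=1.0):
    """Direct-beam intensities at the two channels for a given column [molec/cm^2]."""
    return (i0_weak * np.exp(-sigma_weak * column * airmass),
            i0_strong * np.exp(-sigma_strong * column * airmass))

def retrieve_column(i_weak, i_strong, i0_weak=1.0, i0_strong=1.0):
    """Invert the log-ratio; solar and instrument terms common to both channels cancel."""
    log_ratio = np.log((i_weak / i0_weak) / (i_strong / i0_strong))
    return log_ratio / ((sigma_strong - sigma_weak) * airmass)

true_col = 8.0e21
col_hat = retrieve_column(*simulate_pair(true_col))   # recovers ~8.0e21
```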
Exploratory Item Classification Via Spectral Graph Clustering
Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2017-01-01
Large-scale assessments are supported by a large item pool. An important task in test development is to assign items into scales that measure different characteristics of individuals, and a popular approach is cluster analysis of items. Classical methods in cluster analysis, such as the hierarchical clustering, K-means method, and latent-class analysis, often induce a high computational overhead and have difficulty handling missing data, especially in the presence of high-dimensional responses. In this article, the authors propose a spectral clustering algorithm for exploratory item cluster analysis. The method is computationally efficient, effective for data with missing or incomplete responses, easy to implement, and often outperforms traditional clustering algorithms in the context of high dimensionality. The spectral clustering algorithm is based on graph theory, a branch of mathematics that studies the properties of graphs. The algorithm first constructs a graph of items, characterizing the similarity structure among items. It then extracts item clusters based on the graphical structure, grouping similar items together. The proposed method is evaluated through simulations and an application to the revised Eysenck Personality Questionnaire. PMID:29033476
Fast algorithm for bilinear transforms in optics
NASA Astrophysics Data System (ADS)
Ostrovsky, Andrey S.; Martinez-Niconoff, Gabriel C.; Ramos Romero, Obdulio; Cortes, Liliana
2000-10-01
A fast algorithm for calculating bilinear transforms in optical systems is proposed. The algorithm is based on the coherent-mode representation of the cross-spectral density function of the illumination. The algorithm is computationally efficient when the illumination is partially coherent. Numerical examples are studied and compared with theoretical results.
Clark, Roger N.; Swayze, Gregg A.; Livo, K. Eric; Kokaly, Raymond F.; Sutley, Steve J.; Dalton, J. Brad; McDougal, Robert R.; Gent, Carol A.
2003-01-01
Imaging spectroscopy is a tool that can be used to spectrally identify and spatially map materials based on their specific chemical bonds. Spectroscopic analysis requires significantly more sophistication than has been employed in conventional broadband remote sensing analysis. We describe a new system that is effective at material identification and mapping: a set of algorithms within an expert system decision‐making framework that we call Tetracorder. The expertise in the system has been derived from scientific knowledge of spectral identification. The expert system rules are implemented in a decision tree where multiple algorithms are applied to spectral analysis, additional expert rules and algorithms can be applied based on initial results, and more decisions are made until spectral analysis is complete. Because certain spectral features are indicative of specific chemical bonds in materials, the system can accurately identify and map those materials. In this paper we describe the framework of the decision making process used for spectral identification, describe specific spectral feature analysis algorithms, and give examples of what analyses and types of maps are possible with imaging spectroscopy data. We also present the expert system rules that describe which diagnostic spectral features are used in the decision making process for a set of spectra of minerals and other common materials. We demonstrate the applications of Tetracorder to identify and map surface minerals, to detect sources of acid rock drainage, and to map vegetation species, ice, melting snow, water, and water pollution, all with one set of expert system rules. Mineral mapping can aid in geologic mapping and fault detection and can provide a better understanding of weathering, mineralization, hydrothermal alteration, and other geologic processes. Environmental site assessment, such as mapping source areas of acid mine drainage, has resulted in the acceleration of site cleanup, saving millions of dollars and years in cleanup time. Imaging spectroscopy data and Tetracorder analysis can be used to study both terrestrial and planetary science problems. Imaging spectroscopy can be used to probe planetary systems, including their atmospheres, oceans, and land surfaces.
Johansen, Richard; Beck, Richard; Nowosad, Jakub; Nietch, Christopher; Xu, Min; Shu, Song; Yang, Bo; Liu, Hongxing; Emery, Erich; Reif, Molly; Harwood, Joseph; Young, Jade; Macke, Dana; Martin, Mark; Stillings, Garrett; Stumpf, Richard; Su, Haibin
2018-06-01
This study evaluated the performances of twenty-nine algorithms that use satellite-based spectral imager data to derive estimates of chlorophyll-a concentrations that, in turn, can be used as an indicator of the general status of algal cell densities and the potential for a harmful algal bloom (HAB). The performance assessment was based on making relative comparisons between two temperate inland lakes: Harsha Lake (7.99 km²) in Southwest Ohio and Taylorsville Lake (11.88 km²) in central Kentucky. Of interest was identifying algorithm-imager combinations that had high correlation with coincident chlorophyll-a surface observations for both lakes, as this suggests portability for regional HAB monitoring. The spectral data utilized to estimate surface water chlorophyll-a concentrations were derived from the airborne Compact Airborne Spectral Imager (CASI) 1500 hyperspectral imager, which was then used to derive synthetic versions of currently operational satellite-based imagers using spatial resampling and spectral binning. The synthetic data mimic the configurations of spectral imagers on current satellites in Earth's orbit, including WorldView-2/3, Sentinel-2, Landsat-8, the Moderate-resolution Imaging Spectroradiometer (MODIS), and the Medium Resolution Imaging Spectrometer (MERIS). High correlations were found between the direct measurements and the imagery-estimated chlorophyll-a concentrations at both lakes. Eleven of the twenty-nine algorithms were considered portable, with r² values greater than 0.5 for both lakes. Even though the two lakes differ in background water quality, size and shape, with Taylorsville being generally less impaired, larger, but much narrower throughout, the results support the portability of a suite of certain algorithms across multiple sensors for detecting potential algal blooms through the use of chlorophyll-a as a proxy. Furthermore, the strong performance of the Sentinel-2 algorithms is exceptionally promising, due to the recent launch of the second satellite in the constellation, which will provide higher temporal resolution for temperate inland water bodies. Additionally, scripts were written for the open-source statistical software R that automate many of the spectral data processing steps. This allows for the simultaneous consideration of numerous algorithms across multiple imagers over an expedited time frame for the near real-time monitoring required for detecting algal blooms and mitigating their adverse impacts. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Liu, Kuojuey Ray
1990-01-01
Least-squares (LS) estimation and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms on parallel processing architectures such as systolic arrays, with efficient fault-tolerant schemes, are the major concerns of this dissertation. There are four major results in this dissertation. First, we propose the systolic block Householder transformation with application to recursive least-squares minimization. It is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array. The fault diagnosis, order-degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order-degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on a triangular array and another based on a rectangular array, are presented for the multi-phase operations with fault-tolerant considerations. Eigenvectors and singular vectors can be easily obtained by using the multi-phase operations. Performance issues are also considered.
Reliable Quantitative Mineral Abundances of the Martian Surface using THEMIS
NASA Astrophysics Data System (ADS)
Smith, R. J.; Huang, J.; Ryan, A. J.; Christensen, P. R.
2013-12-01
The following presents a proof of concept that, given quality data, Thermal Emission Imaging System (THEMIS) data can be used to derive reliable quantitative mineral abundances of the Martian surface using a limited mineral library. The THEMIS instrument aboard the Mars Odyssey spacecraft is a multispectral thermal infrared imager with a spatial resolution of 100 m/pixel. The relatively high spatial resolution along with global coverage makes THEMIS datasets powerful tools for comprehensive fine-scale petrologic analyses. However, the spectral resolution of THEMIS is limited to 8 surface-sensitive bands between 6.8 and 14.0 μm with an average bandwidth of ~1 μm, which complicates atmosphere-surface separation and spectral analysis. This study utilizes the atmospheric correction methods of both Bandfield et al. [2004] and Ryan et al. [2013] joined with the iterative linear deconvolution technique pioneered by Huang et al. [in review] in order to derive fine-scale quantitative mineral abundances of the Martian surface. In general, it can be assumed that surface emissivity combines in a linear fashion in the thermal infrared (TIR) wavelengths, such that the emitted energy is proportional to the areal percentage of the minerals present. TIR spectra are unmixed using a set of linear equations involving an endmember library of lab-measured mineral spectra. The number of endmembers allowed in a spectral library is restricted to n-1 (where n is the number of spectral bands of the instrument), preserving one band for blackbody. Spectral analysis of THEMIS data is thus limited to seven endmembers. This study attempts to show that this limitation does not prohibit the derivation of meaningful spectral analyses from THEMIS data. Our study selects THEMIS stamps from a region of Mars that is well characterized in the TIR by the higher spectral resolution, lower spatial resolution Thermal Emission Spectrometer (TES) instrument (143 bands at 10 cm^-1 sampling and 3x5 km pixels). Multiple atmospheric corrections are performed for one image using the methods of Bandfield et al. [2004] and Ryan et al. [2013]. 7x7 pixel areas were selected, averaged, and compared across each atmospherically corrected image to ensure consistency. Corrections that provided reliable data were then used for spectral analyses. Linear deconvolution is performed using an iterative spectral analysis method [Huang et al., in review] that takes an endmember spectral library and creates mineral combinations based on prescribed mineral group selections. The script then performs a spectral mixture analysis on each surface spectrum using all possible mineral combinations, and reports the best modeled fit to the measured spectrum. Here we present initial results from Syrtis Planum where multiple atmospherically corrected THEMIS images were deconvolved to produce similar spectral analysis results, within the detection limit of the instrument. THEMIS mineral abundances are comparable to TES-derived abundances. References: Bandfield, J. L., et al. (2004), JGR, 109, E10008; Huang, J., et al., JGR, in review; Ryan, A. J., et al. (2013), AGU Fall Meeting.
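A minimal sketch of capped-endmember linear unmixing in the spirit described above: every combination of up to seven library endmembers is fit to the measured spectrum with non-negative least squares and the best fit is kept. The library and spectrum are random stand-ins, and this mirrors the combinatorial idea attributed to Huang et al. rather than their exact script (the blackbody band is not modelled here).

```python
# Best-subset linear spectral unmixing with non-negative least squares.
import itertools
import numpy as np
from scipy.optimize import nnls

def best_combination_unmix(spectrum, library, max_endmembers=7):
    """Return (best_indices, abundances, rms) over all endmember subsets."""
    best = (None, None, np.inf)
    n = library.shape[1]
    for k in range(1, max_endmembers + 1):
        for idx in itertools.combinations(range(n), k):
            coeffs, resid = nnls(library[:, list(idx)], spectrum)
            rms = resid / np.sqrt(len(spectrum))
            if rms < best[2]:
                best = (idx, coeffs, rms)
    return best

rng = np.random.default_rng(0)
library = rng.uniform(0.8, 1.0, size=(8, 12))          # 8 bands x 12 lab spectra
truth = 0.6 * library[:, 2] + 0.4 * library[:, 7]
idx, abundances, rms = best_combination_unmix(truth + rng.normal(0, 1e-3, 8), library)
```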
Turbulent unmixing: how marine turbulence drives patchy distributions of motile phytoplankton
NASA Astrophysics Data System (ADS)
Durham, William; Climent, Eric; Barry, Michael; de Lillo, Filippo; Boffetta, Guido; Cencini, Massimo; Stocker, Roman
2013-11-01
Centimeter-scale patchiness in the distribution of phytoplankton increases the efficacy of many important ecological interactions in the marine food web. We show that turbulent fluid motion, usually synonymous with mixing, instead triggers intense small-scale patchiness in the distribution of motile phytoplankton. We use a suite of experiments, direct numerical simulations of turbulence, and analytical tools to show that turbulent shear and acceleration directs the motility of cells towards well-defined regions of flow, increasing local cell concentrations more than ten fold. This motility-driven `unmixing' offers an explanation for why motile cells are often more patchily distributed than non-motile cells and provides a mechanistic framework to understand how turbulence, whose strength varies profoundly in marine environments, impacts ocean productivity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calder, Stuart A; Cao, Guixin; Okamoto, Satoshi
The J_eff=1/2 state is manifested in systems with large cubic crystal field splitting and spin-orbit coupling that are comparable to the on-site Coulomb interaction, U. 5d transition metal oxides host parameters in this regime, and strong evidence for this state in Sr2IrO4, and additional iridates, has been presented. All the candidates, however, deviate from the cubic crystal field required to provide an unmixed canonical J_eff=1/2 state, impacting the development of a robust model of this novel insulating and magnetic state. We present experimental and theoretical results showing not only that Ca4IrO6 hosts the state, but that it furthermore uniquely resides in the limit required for a canonical unmixed J_eff=1/2 state.
A New Algorithm for Detecting Cloud Height using OMPS/LP Measurements
NASA Technical Reports Server (NTRS)
Chen, Zhong; DeLand, Matthew; Bhartia, Pawan K.
2016-01-01
The Ozone Mapping and Profiler Suite Limb Profiler (OMPS/LP) ozone product requires the determination of cloud height for each event to establish the lower boundary of the profile for the retrieval algorithm. We have created a revised cloud detection algorithm for LP measurements that uses the spectral dependence of the vertical gradient in radiance between two wavelengths in the visible and near-IR spectral regions. This approach provides better discrimination between clouds and aerosols than results obtained using a single wavelength. Observed LP cloud height values show good agreement with coincident Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) measurements.
NASA Astrophysics Data System (ADS)
Ustra, A.; Kessouri, P.; Leite, A.; Mendonça, C. A.; Bandeira, N.
2017-12-01
Magnetic minerals in soils and rocks provide one way to study biogeochemical and paleoenvironmental processes. The ultrafine fraction of these minerals (superparamagnetic (SP) and stable single domain (SSD)) is usually investigated in environmental magnetism studies, since changes in the mineralogy, concentration, size and morphology of the magnetic grains can be related to biogeochemical processes. In this study, we use low-field frequency-dependent susceptibility (FDS) and isothermal remanent magnetization (IRM) to characterize the magnetic properties of materials in environmental magnetism. Magnetic susceptibility (MS) measurements are frequently used as a proxy for the magnetic minerals present in soils and rocks. MS is a complex function of magnetic mineralogy and grain size, as well as of the magnitude and frequency of the applied field. This work presents a method for inverting low-field FDS data. The inverted parameters can be interpreted in terms of grain size variations of magnetic particles across the SP-SSD transition. This work also presents a method for inverting IRM demagnetization curves to obtain the saturation magnetization and the individual magnetic moment for an assemblage of ultrafine SP minerals, and to estimate the concentration of magnetic carriers. IRM magnetization curves can be interpreted as resulting from distinct contributions of different mineral phases, each of which can be described by a Cumulative Log-Gaussian (CLG) distribution. Each acquisition curve provides fundamental parameters that are characteristic of the respective mineral phase. The CLG decomposition is widely used in an interpretation procedure named mineral unmixing. In this work we present an inversion method for mineral unmixing, implementing a genetic algorithm to find the parameters of the distinct components. These methodologies have been tested on synthetic models and applied to data from environmental magnetism studies. Here we apply the proposed methodologies to characterize the magnetic properties of samples from the former Brandywine MD Defense Reutilization and Marketing Office (DRMO). The results from the magnetic properties characterization will provide additional information that may assist the interpretation of the biogeophysical signatures observed at the site.
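A minimal sketch of CLG decomposition of an IRM acquisition curve: the curve is modelled as a sum of cumulative log-Gaussian components, each defined by a saturation contribution, a median acquisition field and a dispersion, and the parameters are found with an evolutionary optimizer. SciPy's differential evolution stands in for the genetic algorithm named above; the two-component synthetic curve and parameter bounds are assumptions.

```python
# Cumulative log-Gaussian (CLG) unmixing of a synthetic IRM acquisition curve.
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import norm

def clg_model(logB, params, n_components=2):
    """Sum of n log-Gaussian CDFs; params = [M, logB_half, dispersion] per component."""
    m = 0.0
    for i in range(n_components):
        M, mu, dp = params[3 * i:3 * i + 3]
        m = m + M * norm.cdf((logB - mu) / dp)
    return m

logB = np.linspace(0.5, 3.0, 60)                       # log10 of applied field (mT)
truth = clg_model(logB, [1.0, 1.4, 0.25, 0.4, 2.3, 0.20])
data = truth + np.random.default_rng(0).normal(0, 0.01, logB.size)

bounds = [(0, 2), (0.5, 3), (0.05, 0.6)] * 2           # per-component bounds
result = differential_evolution(
    lambda p: np.sum((clg_model(logB, p) - data) ** 2), bounds, seed=1)
fitted_params = result.x                               # recovered component parameters
```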
Ahn, M. H.; Han, D.; Won, H. Y.; ...
2015-02-03
For better utilization of the ground-based microwave radiometer, it is important to detect the presence of cloud in the measured data. Here, we introduce a simple and fast cloud detection algorithm that uses the optical characteristics of clouds in the infrared atmospheric window region. The new algorithm utilizes the brightness temperature (Tb) measured by an infrared radiometer installed on top of a microwave radiometer. The two-step algorithm consists of a spectral test followed by a temporal test. The measured Tb is first compared with a predicted clear-sky Tb obtained from an empirical formula as a function of surface air temperature and water vapor pressure. For the temporal test, the temporal variability of the measured Tb over one minute is compared with a dynamic threshold value representing the variability of clear-sky conditions. Data are designated as cloud-free only when both the spectral and temporal tests indicate cloud-free conditions. Overall, most of the thick and uniform clouds are successfully detected by the spectral test, while broken and fast-varying clouds are detected by the temporal test. The algorithm is validated by comparison with collocated ceilometer data for six months, from January to June 2013. The overall proportion of correctness is about 88.3% and the probability of detection is 90.8%, which is comparable with or better than those of previous similar approaches. Two thirds of the discrepancies occur when the new algorithm detects clouds while the ceilometer does not, resulting in different values of the probability of detection for different cloud-base altitudes: 93.8, 90.3, and 82.8% for low, mid, and high clouds, respectively. Finally, due to the characteristics of the spectral range, the new algorithm is found to be insensitive to the presence of inversion layers.
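A minimal sketch of the two-step screening logic described above: a spectral test against a predicted clear-sky brightness temperature followed by a temporal variability test over a one-minute window. The clear-sky predictor coefficients, thresholds and window length are placeholders, not the paper's empirical values.

```python
# Two-step (spectral + temporal) cloud screening sketch.
import numpy as np

def predicted_clear_sky_tb(t_surface_k, vapor_pressure_hpa):
    """Placeholder empirical clear-sky Tb model (illustrative coefficients only)."""
    return 0.85 * t_surface_k + 1.5 * vapor_pressure_hpa - 60.0

def is_cloud_free(tb_window, t_surface_k, vapor_pressure_hpa,
                  spectral_margin_k=10.0, temporal_threshold_k=0.3):
    """tb_window: infrared Tb samples covering roughly one minute."""
    tb_window = np.asarray(tb_window, dtype=float)
    # Spectral test: clouds raise the window-channel Tb above the clear-sky prediction.
    spectral_ok = tb_window.mean() < predicted_clear_sky_tb(
        t_surface_k, vapor_pressure_hpa) + spectral_margin_k
    # Temporal test: broken or fast-varying clouds raise short-term variability.
    temporal_ok = tb_window.std() < temporal_threshold_k
    return spectral_ok and temporal_ok
```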
A spectroscopic analysis of Martian crater central peaks: Formation of the ancient crust
NASA Astrophysics Data System (ADS)
Skok, J. R.; Mustard, J. F.; Tornabene, L. L.; Pan, C.; Rogers, D.; Murchie, S. L.
2012-11-01
The earliest formed crust on a single plate planet such as Mars should be preserved, deeply buried under subsequent surface materials. Mars' extensive cratering history would have fractured and disrupted the upper layers of this ancient crust. Large impacts occurring late in Martian geologic history would have excavated and exposed this deeply buried material. We report the compositional analysis of unaltered mafic Martian crater central peaks with high-resolution spectral data that was used to characterize the presence, distribution and composition of mafic mineralogy. Reflectance spectra of mafic outcrops are modeled with the Modified Gaussian Model (MGM) to determine cation composition of olivine and pyroxene mineral deposits. Observations show that central peaks with unaltered mafic units are only observed in four general regions of Mars. Each mafic unit exhibits spectrally unmixed outcrops of olivine or pyroxene, indicating dunite and pyroxenite dominated compositions instead of basaltic composition common throughout much of the planet. Compositional analysis shows a wide range of olivine Fo# ranging from Fo60 to Fo5. This variation is best explained by a high degree of fractionation in a slowly cooling, differentiating magma body. Pyroxene analysis shows that all the sites in the Southern Highlands are consistent with moderately Fe-rich, low-Ca pyroxene. Mineral segregation in the ancient crust could be caused by cumulate crystallization and settling in a large, potentially global, lava lake or near surface plutons driven by a hypothesized early Martian mantle overturn.
NASA Technical Reports Server (NTRS)
Yuhas, Roberta H.; Boardman, Joseph W.; Goetz, Alexander F. H.
1993-01-01
Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were acquired during three consecutive growing seasons (26 September 1989, 22 March 1990, and 7 August 1990) over an area of the High Plains east of Greeley, Colorado (40 deg 20 min N and 104 deg 16 min W). A repeat visit to assess vegetation at its peak growth was flown on 6 June 1993. This region contains extensive eolian deposits in the form of stabilized dune complexes (small scale parabolic dunes superimposed on large scale longitudinal and parabolic dunes). Due to the dunes' large scale (2-10 km) and low relief (1-5 m), the scaling and morphological relationships that contribute to the evolution of this landscape are nearly impossible to understand without the use of remote sensing. Additionally, this area and regions similarly situated could be the first to experience the effects caused by global climate change. During the past 10,000 years there were at least four periods of extensive sand activity due to climate change, followed by periods of landscape stability, as shown in the stratigraphic record of this area.
Terascale spectral element algorithms and implementations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischer, P. F.; Tufo, H. M.
1999-08-17
We describe the development and implementation of an efficient spectral element code for multimillion gridpoint simulations of incompressible flows in general two- and three-dimensional domains. We review basic and recently developed algorithmic underpinnings that have resulted in good parallel and vector performance on a broad range of architectures, including the terascale computing systems now coming online at the DOE labs. Sustained performance of 219 GFLOPS has been recently achieved on 2048 nodes of the Intel ASCI-Red machine at Sandia.
Andrews, John T.; Eberl, D.D.
2011-01-01
To better understand the glacial history of the ice sheets surrounding Baffin Bay and to provide information on sediment pathways, samples from 82 seafloor grabs and core tops, and from seven box cores were subjected to quantitative X-ray diffraction weight percent (wt.%) analysis of the 2000 m) all show an abrupt drop in calcite wt.% (post-5 cal ka BP?) following a major peak in detrital carbonate (mainly dolomite). This dolomite-rich detrital carbonate (DC) event in JR175BC06 is possibly coeval with the Younger Dryas cold event. Four possible glacial-sourced end members were employed in a compositional unmixing algorithm to gain insight into down core changes in sediment provenance at the deep central basin. Estimates of the rates of sediment accumulation in the central basin are only in the range of 2 to 4 cm/cal ka, surprisingly low given the glaciated nature of the surrounding land.
Yu, Haitong; Liu, Dong; Duan, Yuanyuan; Wang, Xiaodong
2014-04-07
Opacified aerogels are particulate thermal insulating materials in which micrometric opacifier mineral grains are surrounded by silica aerogel nanoparticles. A geometric model was developed to characterize the spectral properties of such microsize grains surrounded by much smaller particles. The model represents the material's microstructure with the spherical opacifier's spectral properties calculated using the multi-sphere T-matrix (MSTM) algorithm. The results are validated by comparing the measured reflectance of an opacified aerogel slab against the value predicted using the discrete ordinate method (DOM) based on calculated optical properties. The results suggest that the large particles embedded in the nanoparticle matrices show different scattering and absorption properties from the single scattering condition and that the MSTM and DOM algorithms are both useful for calculating the spectral and radiative properties of this particulate system.
NASA Astrophysics Data System (ADS)
Mirapeix, J.; García-Allende, P. B.; Cobo, A.; Conde, O.; López-Higuera, J. M.
2007-07-01
A new spectral processing technique designed for application to the on-line detection and classification of arc-welding defects is presented in this paper. A non-invasive fiber sensor embedded within a TIG torch collects the plasma radiation originating during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which is demonstrated to provide efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in a previous paper, giving rise to an improvement in the performance of the monitoring system.
Radionuclide identification algorithm for organic scintillator-based radiation portal monitor
NASA Astrophysics Data System (ADS)
Paff, Marc Gerrit; Di Fulvio, Angela; Clarke, Shaun D.; Pozzi, Sara A.
2017-03-01
We have developed an algorithm for on-the-fly radionuclide identification for radiation portal monitors using organic scintillation detectors. The algorithm was demonstrated on experimental data acquired with our pedestrian portal monitor on moving special nuclear material and industrial sources at a purpose-built radiation portal monitor testing facility. The experimental data also included common medical isotopes. The algorithm takes the power spectral density of the cumulative distribution function of the measured pulse height distributions and matches these to reference spectra using a spectral angle mapper. F-score analysis showed that the new algorithm exhibited significant performance improvements over previously implemented radionuclide identification algorithms for organic scintillators. Reliable on-the-fly radionuclide identification would help portal monitor operators more effectively screen out the hundreds of thousands of nuisance alarms they encounter annually due to recent nuclear-medicine patients and cargo containing naturally occurring radioactive material. Portal monitor operators could instead focus on the rare but potentially high impact incidents of nuclear and radiological material smuggling detection for which portal monitors are intended.
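A minimal sketch of the matching step described above: form a feature from the power spectral density of the cumulative distribution of a measured pulse-height spectrum and score it against reference features with a spectral angle mapper, taking the smallest angle as the identification. The reference library is an assumed input; detector calibration and alarm logic are omitted.

```python
# PSD-of-CDF features matched with a spectral angle mapper.
import numpy as np

def psd_of_cdf(pulse_height_hist):
    """Feature vector: |FFT|^2 of the normalized cumulative pulse-height distribution."""
    h = np.asarray(pulse_height_hist, dtype=float)
    cdf = np.cumsum(h) / h.sum()
    return np.abs(np.fft.rfft(cdf)) ** 2

def spectral_angle(a, b):
    """Angle between two feature vectors (radians); smaller means more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def identify(measured_hist, reference_features):
    """reference_features: dict {isotope_name: feature_vector} built the same way."""
    feat = psd_of_cdf(measured_hist)
    return min(reference_features,
               key=lambda name: spectral_angle(feat, reference_features[name]))
```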
Spectral element multigrid. Part 2: Theoretical justification
NASA Technical Reports Server (NTRS)
Maday, Yvon; Munoz, Rafael
1988-01-01
A multigrid algorithm is analyzed which is used for iteratively solving the algebraic system resulting from the approximation of a second-order problem by spectral or spectral element methods. The analysis, performed here in the one-dimensional case, justifies the good smoothing properties of the Jacobi preconditioner that was presented in Part 1 of this paper.
Onboard spectral imager data processor
NASA Astrophysics Data System (ADS)
Otten, Leonard J.; Meigs, Andrew D.; Franklin, Abraham J.; Sears, Robert D.; Robison, Mark W.; Rafert, J. Bruce; Fronterhouse, Donald C.; Grotbeck, Ronald L.
1999-10-01
Previous papers have described the concept behind the MightySat II.1 program, the satellite's Fourier Transform imaging spectrometer's optical design, the design for the spectral imaging payload, and its initial qualification testing. This paper discusses the on board data processing designed to reduce the amount of downloaded data by an order of magnitude and provide a demonstration of a smart spaceborne spectral imaging sensor. Two custom components, a spectral imager interface 6U VME card that moves data at over 30 MByte/sec, and four TI C-40 processors mounted to a second 6U VME and daughter card, are used to adapt the sensor to the spacecraft and provide the necessary high speed processing. A system architecture that offers both on board real time image processing and high-speed post data collection analysis of the spectral data has been developed. In addition to the on board processing of the raw data into a usable spectral data volume, one feature extraction technique has been incorporated. This algorithm operates on the basic interferometric data. The algorithm is integrated within the data compression process to search for uploadable feature descriptions.
Directly data processing algorithm for multi-wavelength pyrometer (MWP).
Xing, Jian; Peng, Bo; Ma, Zhao; Guo, Xin; Dai, Li; Gu, Weihong; Song, Wenlong
2017-11-27
Data processing for multi-wavelength pyrometers (MWP) is a difficult problem because the emissivity is unknown. Solutions developed so far generally assume particular mathematical relations for emissivity versus wavelength or emissivity versus temperature. Because of deviations between these hypotheses and the actual situation, the inversion results can be seriously affected. The main aim of this study is therefore a direct MWP data processing algorithm that does not need to assume a spectral emissivity model in advance. Two new MWP data processing algorithms, a Gradient Projection (GP) algorithm and an Internal Penalty Function (IPF) algorithm, neither of which requires fixing an emissivity model in advance, are proposed. The core idea is that the MWP data processing problem is transformed into a constrained optimization problem, which can then be solved by the GP or IPF algorithms. Comparison of simulation results for some typical spectral emissivity models shows that the IPF algorithm is superior to the GP algorithm in terms of accuracy and efficiency. Rocket nozzle temperature experiments show that the true temperature inversion results from the IPF algorithm agree well with the theoretical design temperature. The proposed combination of the IPF algorithm with MWP is therefore expected to provide a direct data processing algorithm that clears the unknown-emissivity obstacle for MWP.
TES Level 1 Algorithms: Interferogram Processing, Geolocation, Radiometric, and Spectral Calibration
NASA Technical Reports Server (NTRS)
Worden, Helen; Beer, Reinhard; Bowman, Kevin W.; Fisher, Brendan; Luo, Mingzhao; Rider, David; Sarkissian, Edwin; Tremblay, Denis; Zong, Jia
2006-01-01
The Tropospheric Emission Spectrometer (TES) on the Earth Observing System (EOS) Aura satellite measures the infrared radiance emitted by the Earth's surface and atmosphere using Fourier transform spectrometry. The measured interferograms are converted into geolocated, calibrated radiance spectra by the L1 (Level 1) processing, and are the inputs to L2 (Level 2) retrievals of atmospheric parameters, such as vertical profiles of trace gas abundance. We describe the algorithmic components of TES Level 1 processing, giving examples of the intermediate results and diagnostics that are necessary for creating TES L1 products. An assessment of noise-equivalent spectral radiance levels and current systematic errors is provided. As an initial validation of our spectral radiances, TES data are compared to the Atmospheric Infrared Sounder (AIRS) (on EOS Aqua), after accounting for spectral resolution differences by applying the AIRS spectral response function to the TES spectra. For the TES L1 nadir data products currently available, the agreement with AIRS is 1 K or better.
NASA Astrophysics Data System (ADS)
Ma, Suodong; Pan, Qiao; Shen, Weimin
2016-09-01
As one kind of light source simulation device, spectrally tunable light sources are able to generate outputs with specific spectral shapes and radiant intensities according to different application requirements, and they are in urgent demand in many fields of the national economy and the national defense industry. Compared with LED-type spectrally tunable light sources, one based on a DMD-convex grating Offner configuration has the advantages of high spectral resolution, strong digital controllability, and high spectrum synthesis accuracy. As the key link that allows this type of light source to achieve target spectrum outputs, the spectrum synthesis algorithm based on spectrum matching is therefore very important. An improved spectrum synthesis algorithm based on linear least squares initialization and Levenberg-Marquardt iterative optimization is proposed in this paper on the basis of an in-depth study of the spectrum matching principle. The effectiveness of the proposed method is verified by a series of simulations and experiments.
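A minimal sketch of the two-stage fit described above: solve for channel weights so that a weighted sum of per-channel basis spectra matches a target spectrum, initializing with linear least squares and refining with a Levenberg-Marquardt solver. The basis matrix and target are assumed inputs; hardware constraints such as non-negative DMD duty cycles are not modelled here.

```python
# Linear least squares initialization + Levenberg-Marquardt refinement.
import numpy as np
from scipy.optimize import least_squares

def synthesize_weights(B, target):
    """B: (wavelengths x channels) basis spectra; target: desired spectrum."""
    w0, *_ = np.linalg.lstsq(B, target, rcond=None)                 # stage 1: LLS guess
    res = least_squares(lambda w: B @ w - target, w0, method="lm")  # stage 2: LM refine
    return res.x

rng = np.random.default_rng(0)
B = rng.uniform(0, 1, size=(200, 16))        # 200 wavelength samples, 16 channels
target = B @ rng.uniform(0, 1, 16)           # synthetic achievable target spectrum
weights = synthesize_weights(B, target)
```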
Spectral CT Reconstruction with Image Sparsity and Spectral Mean
Zhang, Yi; Xi, Yan; Yang, Qingsong; Cong, Wenxiang; Zhou, Jiliu
2017-01-01
Photon-counting detectors can acquire x-ray intensity data in different energy bins. The signal-to-noise ratio of the resulting raw data in each energy bin is generally low due to the narrow bin width and quantum noise. To address this problem, here we propose an image reconstruction approach for spectral CT that simultaneously reconstructs x-ray attenuation coefficients in all the energy bins. Because the measured spectral data are highly correlated among the x-ray energy bins, the intra-image sparsity and inter-image similarity are important prior knowledge for image reconstruction. Inspired by this observation, the total variation (TV) and spectral mean (SM) measures are combined to improve the quality of the reconstructed images. For this purpose, a linear mapping function is used to minimize image differences between energy bins. The split Bregman technique is applied to perform image reconstruction. Our numerical and experimental results show that the proposed algorithms outperform competing iterative algorithms in this context. PMID:29034267
Onboard Science and Applications Algorithm for Hyperspectral Data Reduction
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Davies, Ashley G.; Silverman, Dorothy; Mandl, Daniel
2012-01-01
An onboard processing mission concept is under development for a possible Direct Broadcast capability for the HyspIRI mission, a Hyperspectral remote sensing mission under consideration for launch in the next decade. The concept would intelligently spectrally and spatially subsample the data as well as generate science products onboard to enable return of key rapid response science and applications information despite limited downlink bandwidth. This rapid data delivery concept focuses on wildfires and volcanoes as primary applications, but also has applications to vegetation, coastal flooding, dust, and snow/ice applications. Operationally, the HyspIRI team would define a set of spatial regions of interest where specific algorithms would be executed. For example, known coastal areas would have certain products or bands downlinked, ocean areas might have other bands downlinked, and during fire seasons other areas would be processed for active fire detections. Ground operations would automatically generate the mission plans specifying the highest priority tasks executable within onboard computation, setup, and data downlink constraints. The spectral bands of the TIR (thermal infrared) instrument can accurately detect the thermal signature of fires and send down alerts, as well as the thermal and VSWIR (visible to short-wave infrared) data corresponding to the active fires. Active volcanism also produces a distinctive thermal signature that can be detected onboard to enable spatial subsampling. Onboard algorithms and ground-based algorithms suitable for onboard deployment are mature. On HyspIRI, the algorithm would perform a table-driven temperature inversion from several spectral TIR bands, and then trigger downlink of the entire spectrum for each of the hot pixels identified. Ocean and coastal applications include sea surface temperature (using a small spectral subset of TIR data, but requiring considerable ancillary data), and ocean color applications to track biological activity such as harmful algal blooms. Measuring surface water extent to track flooding is another rapid response product leveraging VSWIR spectral information.
A wavelet transform algorithm for peak detection and application to powder x-ray diffraction data.
Gregoire, John M; Dale, Darren; van Dover, R Bruce
2011-01-01
Peak detection is ubiquitous in the analysis of spectral data. While many noise-filtering and peak identification algorithms have been developed, recent work [P. Du, W. Kibbe, and S. Lin, Bioinformatics 22, 2059 (2006); A. Wee, D. Grayden, Y. Zhu, K. Petkovic-Duran, and D. Smith, Electrophoresis 29, 4215 (2008)] has demonstrated that both of these tasks are efficiently performed through analysis of the wavelet transform of the data. In this paper, we present a wavelet-based peak detection algorithm with user-defined parameters that can be readily applied to any spectral data. Particular attention is given to the algorithm's resolution of overlapping peaks. The algorithm is implemented for the analysis of powder diffraction data, and successful detection of Bragg peaks is demonstrated for both low signal-to-noise data from theta-theta diffraction of nanoparticles and combinatorial x-ray diffraction data from a composition-spread thin film. These datasets have different types of background signals, which are effectively removed in the wavelet-based method, and the results demonstrate that the algorithm provides a robust method for automated peak detection.
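A minimal sketch of wavelet-based peak picking on a diffraction-like pattern using SciPy's continuous-wavelet peak finder; the synthetic pattern, ridge-width range and SNR threshold are illustrative and do not reproduce the paper's parameterization.

```python
# Wavelet-based peak detection on a synthetic powder-diffraction-like pattern.
import numpy as np
from scipy.signal import find_peaks_cwt

two_theta = np.linspace(10, 80, 3500)
pattern = 0.02 * two_theta + 2.0                     # sloping background
for center, height, width in [(28.4, 40, 0.15), (33.1, 15, 0.15), (47.5, 22, 0.2)]:
    pattern += height * np.exp(-0.5 * ((two_theta - center) / width) ** 2)
pattern += np.random.default_rng(0).normal(0, 0.8, two_theta.size)

# Widths (in samples) over which ridge lines are tracked in the CWT plane.
peak_idx = find_peaks_cwt(pattern, widths=np.arange(3, 30), min_snr=2)
peak_positions = two_theta[peak_idx]                 # detected Bragg-peak positions
```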
An Analysis of Light Periods of BL Lac Object S5 0716+714 with the MUSIC Algorithm
NASA Astrophysics Data System (ADS)
Tang, Jie
2012-07-01
The multiple signal classification (MUSIC) algorithm is introduced for the estimation of the light periods of BL Lac objects. The principle of the MUSIC algorithm is given, together with a test of its spectral resolution using a simulated signal. From the literature, we have collected a large number of effective observational data of the BL Lac object S5 0716+714 in the three optical wavebands V, R, and I from 1994 to 2008. The light periods of S5 0716+714 are obtained by means of the MUSIC algorithm and the average periodogram algorithm, respectively. It is found that there exist two major periodic components: a period of (3.33±0.08) yr and a period of (1.24±0.01) yr. A comparison of the periodicity analysis performance of the two algorithms indicates that the MUSIC algorithm requires a shorter sample length and has good spectral resolution and noise resistance, improving the accuracy of periodicity analysis when the sample length is short.
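A minimal sketch of MUSIC period estimation for an evenly sampled (or evenly resampled) light curve: build a lag-covariance matrix from sliding snapshots, separate signal and noise eigen-subspaces, and scan a pseudospectrum whose peaks mark candidate frequencies. The synthetic light curve, snapshot length and signal-subspace dimension are assumptions; real monitoring data are unevenly sampled and would need interpolation first.

```python
# MUSIC pseudospectrum for period estimation in a uniformly sampled time series.
import numpy as np

def music_pseudospectrum(x, freqs, m=40, p=4):
    """x: uniform samples; freqs: trial frequencies (cycles/sample); p: signal dims."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    snapshots = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = snapshots.T @ snapshots / snapshots.shape[0]       # lag covariance (m x m)
    eigvals, eigvecs = np.linalg.eigh(R)                   # eigenvalues ascending
    En = eigvecs[:, :m - p]                                # noise subspace
    k = np.arange(m)
    P = []
    for f in freqs:
        a = np.exp(2j * np.pi * f * k)                     # steering vector
        P.append(1.0 / np.real(np.vdot(a, En @ (En.conj().T @ a))))
    return np.array(P)

# Toy use: two periods in noise; peaks of P sit near 1/300 and 1/80 cycles/sample.
t = np.arange(3000)
x = np.sin(2 * np.pi * t / 300) + 0.5 * np.sin(2 * np.pi * t / 80)
x += np.random.default_rng(0).normal(0, 0.5, t.size)
freqs = np.linspace(1e-4, 0.05, 2000)
P = music_pseudospectrum(x, freqs)
```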
Xie, Dengfeng; Zhang, Jinshui; Zhu, Xiufang; Pan, Yaozhong; Liu, Hongli; Yuan, Zhoumiqi; Yun, Ya
2016-02-05
Remote sensing technology plays an important role in monitoring rapid changes of the Earth's surface. However, sensors that can simultaneously provide satellite images with both high temporal and high spatial resolution have not yet been designed. This paper proposes an improved spatial and temporal adaptive reflectance fusion model (STARFM) aided by an unmixing-based method (USTARFM) to generate the high spatial and temporal resolution data needed for the study of heterogeneous areas. The results showed that USTARFM had higher accuracy than the STARFM and unmixing methods in two respects: individual-band analysis and heterogeneity analysis. Taking the predicted NIR band as an example, the correlation coefficients (r) for the USTARFM, STARFM and unmixing methods were 0.96, 0.95 and 0.90, respectively (p-value < 0.001); the Root Mean Square Error (RMSE) values were 0.0245, 0.0300 and 0.0401, respectively; and the ERGAS values were 0.5416, 0.6507 and 0.8737, respectively. USTARFM showed consistently higher performance than STARFM as the degree of heterogeneity ranged from 2 to 10, highlighting that this method helps solve the data fusion problems faced when using STARFM. Additionally, the USTARFM method could help researchers achieve better performance than STARFM at a smaller window size, owing to its quantitative representation of the heterogeneous land surface.
Color analysis and image rendering of woodblock prints with oil-based ink
NASA Astrophysics Data System (ADS)
Horiuchi, Takahiko; Tanimoto, Tetsushi; Tominaga, Shoji
2012-01-01
This paper proposes a method for analyzing the color characteristics of woodblock prints made with oil-based ink and rendering realistic images based on camera data. The analysis results for woodblock prints show some characteristic features in comparison with oil paintings: 1) a woodblock print can be divided into several cluster areas, each with similar surface spectral reflectance; and 2) strong specular reflection caused by overlapping paints arises only in specific cluster areas. By considering these properties, we develop an effective rendering algorithm by modifying our previous algorithm for oil paintings. A set of surface spectral reflectances of a woodblock print is represented using only a small number of average surface spectral reflectances and the registered scaling coefficients, whereas the previous algorithm for oil paintings required high-dimensional surface spectral reflectances at all pixels. In the rendering process, in order to reproduce the strong specular reflection in specific cluster areas, we use two sets of parameters in the Torrance-Sparrow model for cluster areas with and without strong specular reflection. An experiment on a woodblock print with oil-based ink was performed to demonstrate the feasibility of the proposed method.
Models of formation and some algorithms of hyperspectral image processing
NASA Astrophysics Data System (ADS)
Achmetov, R. N.; Stratilatov, N. R.; Yudakov, A. A.; Vezenov, V. I.; Eremeev, V. V.
2014-12-01
Algorithms and information technologies for processing Earth hyperspectral imagery are presented. Several new approaches are discussed. Peculiar properties of processing the hyperspectral imagery, such as multifold signal-to-noise reduction, atmospheric distortions, access to spectral characteristics of every image point, and high dimensionality of data, were studied. Different measures of similarity between individual hyperspectral image points and the effect of additive uncorrelated noise on these measures were analyzed. It was shown that these measures are substantially affected by noise, and a new measure free of this disadvantage was proposed. The problem of detecting the observed scene object boundaries, based on comparing the spectral characteristics of image points, is considered. It was shown that contours are processed much better when spectral characteristics are used instead of energy brightness. A statistical approach to the correction of atmospheric distortions, which makes it possible to solve the stated problem based on analysis of a distorted image in contrast to analytical multiparametric models, was proposed. Several algorithms used to integrate spectral zonal images with data from other survey systems, which make it possible to image observed scene objects with a higher quality, are considered. Quality characteristics of hyperspectral data processing were proposed and studied.
Spectral-spatial classification of hyperspectral imagery with cooperative game
NASA Astrophysics Data System (ADS)
Zhao, Ji; Zhong, Yanfei; Jia, Tianyi; Wang, Xinyu; Xu, Yao; Shu, Hong; Zhang, Liangpei
2018-01-01
Spectral-spatial classification is known to be an effective way to improve classification performance by integrating spectral information and spatial cues for hyperspectral imagery. In this paper, a game-theoretic spectral-spatial classification algorithm (GTA) using a conditional random field (CRF) model is presented, in which CRF is used to model the image considering the spatial contextual information, and a cooperative game is designed to obtain the labels. The algorithm establishes a one-to-one correspondence between image classification and game theory. The pixels of the image are considered as the players, and the labels are considered as the strategies in a game. Similar to the idea of soft classification, the uncertainty is considered to build the expected energy model in the first step. The local expected energy can be quickly calculated, based on a mixed strategy for the pixels, to establish the foundation for a cooperative game. Coalitions can then be formed by the designed merge rule based on the local expected energy, so that a majority game can be performed to make a coalition decision to obtain the label of each pixel. The experimental results on three hyperspectral data sets demonstrate the effectiveness of the proposed classification algorithm.
Interference graph-based dynamic frequency reuse in optical attocell networks
NASA Astrophysics Data System (ADS)
Liu, Huanlin; Xia, Peijie; Chen, Yong; Wu, Lan
2017-11-01
Indoor optical attocell networks may achieve higher capacity than radio frequency (RF) or infrared (IR)-based wireless systems. They are proposed as a special type of visible light communication (VLC) system using light-emitting diodes (LEDs). However, the system spectral efficiency may be severely degraded by inter-cell interference (ICI), particularly in dense deployment scenarios. To address these issues, we construct the spectral interference graph for an indoor optical attocell network and propose the Dynamic Frequency Reuse (DFR) and Weighted Dynamic Frequency Reuse (W-DFR) algorithms to decrease ICI and improve spectral efficiency. The interference graph lets LEDs transmit data without interference and determines the minimum number of sub-bands needed for frequency reuse. The DFR algorithm then reuses the system frequency equally across service-providing cells to mitigate spectrum interference, while the W-DFR algorithm reuses the system frequency using a bandwidth weight (BW) defined from the number of served users. Numerical results show that both of the proposed schemes can effectively improve the average spectral efficiency (ASE) of the system. Improvement of the user data rate is also obtained, as shown by analysis of its cumulative distribution function (CDF).
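A minimal sketch of the interference-graph step described above, assuming a toy adjacency: interfering LEDs are linked in a graph and a greedy coloring assigns sub-bands so that adjacent cells never share one, with the number of colors giving the sub-band count this heuristic needs. The paper's user-count weighting (W-DFR) is not reproduced here.

```python
# Greedy coloring of an interference graph to assign frequency sub-bands.
import networkx as nx

interference = nx.Graph()
interference.add_edges_from([
    ("LED1", "LED2"), ("LED2", "LED3"), ("LED1", "LED3"),   # a dense cluster
    ("LED3", "LED4"), ("LED4", "LED5"),                     # a sparser chain
])

coloring = nx.coloring.greedy_color(interference, strategy="largest_first")
num_subbands = len(set(coloring.values()))
# Cells assigned the same color can safely reuse the same frequency sub-band.
subband_of = {led: f"subband_{c}" for led, c in coloring.items()}
```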
NASA Astrophysics Data System (ADS)
Andrianov, M. N.; Kostenko, V. I.; Likhachev, S. F.
2018-01-01
Algorithms for achieving a practical increase in the data transmission rate on the spacecraft-to-ground tracking station link are considered. The increase is achieved by applying spectrally efficient modulation techniques and the technology of orthogonal frequency compression of signals using millimeter-range radio waves. The advantages and disadvantages of each of the three algorithms are discussed. A significant advantage of data transmission in the millimeter range is indicated.
Deng, Yingbin; Fan, Fenglei; Chen, Renrong
2012-01-01
Impervious surface area (ISA) is considered an indicator of environmental change and is regarded as an important input parameter for hydrological cycle simulation, water management, and area pollution assessment. The Pearl River Delta (PRD), the third most important economic district of China, is chosen in this paper to extract ISA information from Landsat images of 1998, 2003, and 2008 using a linear spectral unmixing method, and to monitor impervious surface change by analyzing the multi-temporal Landsat-derived fractional impervious surface. The results of this study were as follows: (1) the ISA in the PRD increased by 79.09% from 1998 to 2003 and by 26.88% from 2003 to 2008; (2) the spatial distribution of ISA was described according to the 1998 and 2003 percentages. Most of the medium- and high-percentage ISA was located in the northwestern and southeastern parts of the delta; medium-percentage ISA was mainly located in the city interiors, while high-percentage ISA was mainly located in the suburbs around the cities; (3) the expansion direction and trend of high-percentage ISA were discussed in order to understand urban change in the delta. High-percentage ISA moved from the inner cities to the edges of the urban areas during 1998-2003, and then spread, both gradually and in leaps, to suburban areas far from the urban cores during 2003-2008. Examining the spatial expansion directions of high-percentage ISA showed that it moved outward from the centre line of the Pearl River across the whole delta, while on both shores of the Pearl River Estuary it moved toward the river; (4) combining the ISA change with social conditions, the driving forces were analyzed in detail. ISA percentage change was evidently closely related to the economic development of the region over the past ten years. Contemporaneous major sporting events (the 16th Asian Games in Guangzhou and the 26th Summer Universiade in Shenzhen) and government policies also promoted the growth of ISA, while topographical features such as the National Nature Reserve of China restricted and affected its expansion. Above all, this paper attempted to extract ISA over a major region of the PRD; the temporal and spatial analyses of PRD ISA demonstrated the drastic changes occurring in the developed areas of China. These results are important and valuable for land use management, ecological protection, and policy making. PMID:22438741
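The fractional impervious surface in studies of this kind is typically obtained by fully constrained linear unmixing of each Landsat pixel against a small set of endmembers (e.g. impervious surface, vegetation, soil, water). The sketch below shows one standard way to impose the non-negativity and sum-to-one constraints via an augmented non-negative least-squares problem; it is a generic illustration, not the authors' exact workflow, and the endmember spectra are assumed to be supplied by the user.

```python
import numpy as np
from scipy.optimize import nnls

def fcls_unmix(pixel, endmembers, delta=1e3):
    """Fully constrained (non-negative, sum-to-one) linear unmixing of one pixel.

    pixel      : (B,) reflectance vector of a Landsat pixel.
    endmembers : (B, M) matrix whose columns are endmember spectra,
                 e.g. impervious surface, vegetation, soil, water (assumed).
    delta      : weight of the appended sum-to-one row; larger values
                 enforce the constraint more strictly.
    Returns an (M,) abundance vector; the impervious fraction is the entry
    for the impervious-surface endmember.
    """
    _, M = endmembers.shape
    # Append a heavily weighted row of ones so that nnls also (softly)
    # enforces sum(abundances) == 1 alongside non-negativity.
    A = np.vstack([endmembers, delta * np.ones((1, M))])
    b = np.concatenate([pixel, [delta]])
    abundances, _ = nnls(A, b)
    return abundances
```

Applying this per pixel to the 1998, 2003, and 2008 scenes and differencing the resulting impervious-fraction maps gives the kind of multi-temporal ISA change analysis described above.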