Sample records for sensing algorithm development

  1. The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Zhou, Liqing

    2015-12-01

    With the growth of satellite remote sensing technology and the volume of remote sensing image data, traditional segmentation techniques cannot meet the processing and storage requirements of massive remote sensing imagery. This article applies cloud computing and parallel computing to the remote sensing image segmentation process, building an inexpensive and efficient computer cluster that parallelizes the MeanShift segmentation algorithm under the MapReduce model. The approach preserves segmentation quality while improving segmentation speed and better meeting real-time requirements, demonstrating the practical significance and value of MapReduce-based parallel MeanShift segmentation of remote sensing images.
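    A small illustrative sketch of the map/reduce decomposition this abstract describes, assuming a tiny synthetic grayscale image and Python's multiprocessing pool as a stand-in for a Hadoop MapReduce cluster; the tile split, bandwidth, and simplified 1-D mean shift are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from multiprocessing import Pool

def mean_shift_labels(tile, bandwidth=0.1, iters=20):
    """Map step: cluster one tile's pixel intensities with 1-D mean shift."""
    x = tile.astype(float).ravel()
    modes = x.copy()
    for _ in range(iters):
        # shift every point toward the mean of the points within the bandwidth
        dist = np.abs(modes[:, None] - x[None, :])
        w = (dist < bandwidth).astype(float)
        modes = (w @ x) / w.sum(axis=1)
    # quantise the converged modes into integer segment labels
    return np.round(modes / bandwidth).astype(int).reshape(tile.shape)

def segment(image, n_tiles=4):
    """Reduce step: run the map step per tile in parallel and stitch results."""
    tiles = np.array_split(image, n_tiles, axis=0)
    with Pool() as pool:
        parts = pool.map(mean_shift_labels, tiles)
    return np.vstack(parts)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = np.vstack([rng.normal(0.2, 0.02, (8, 16)),
                     rng.normal(0.8, 0.02, (8, 16))])
    print(segment(img))
```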

  2. Development of MODIS data-based algorithm for retrieving sea surface temperature in coastal waters.

    PubMed

    Wang, Jiao; Deng, Zhiqiang

    2017-06-01

    A new algorithm was developed for retrieving sea surface temperature (SST) in coastal waters using satellite remote sensing data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua platform. The new SST algorithm was trained using the Artificial Neural Network (ANN) method and tested using 8 years of remote sensing data from the MODIS Aqua sensor and in situ sensing data from US coastal waters in Louisiana, Texas, Florida, California, and New Jersey. The ANN algorithm can be utilized to map SST in both deep offshore and particularly shallow nearshore waters at a high spatial resolution of 1 km, greatly expanding the coverage of remote sensing-based SST data from offshore waters to nearshore waters. Applications of the ANN algorithm require only the remotely sensed reflectance values from the two MODIS Aqua thermal bands 31 and 32 as input data. Application results indicated that the ANN algorithm was able to explain 82-90% of the variation in observed SST in US coastal waters. While the algorithm is generally applicable to the retrieval of SST, it works best for nearshore waters where important coastal resources are located and existing algorithms are either not applicable or do not work well, making the new ANN-based SST algorithm unique and particularly useful for coastal resource management.
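    A brief hedged sketch of the kind of ANN regression the abstract describes: a small neural network mapping two MODIS thermal-band values to SST. The synthetic band values, network size, and training setup are assumptions for illustration, not the paper's actual architecture or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 500
b31 = rng.uniform(280.0, 300.0, n)          # stand-in for MODIS band 31
b32 = b31 - rng.uniform(0.5, 2.0, n)        # stand-in for MODIS band 32
sst = b31 + 1.5 * (b31 - b32) - 273.15      # hypothetical "true" SST (deg C)

X = np.column_stack([b31, b32])
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X, sst)
print("R^2 on training data:", round(model.score(X, sst), 3))
```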

  3. Distributed Sensing and Shape Control of Piezoelectric Bimorph Mirrors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redmond, James M.; Barney, Patrick S.; Henson, Tammy D.

    1999-07-28

    As part of a collaborative effort between Sandia National Laboratories and the University of Kentucky to develop a deployable mirror for remote sensing applications, research in shape sensing and control algorithms that leverage the distributed nature of electron gun excitation for piezoelectric bimorph mirrors is summarized. A coarse shape sensing technique is developed that uses reflected light rays from the sample surface to provide discrete slope measurements. Estimates of surface profiles are obtained with a cubic spline curve fitting algorithm. Experiments on a PZT bimorph illustrate appropriate deformation trends as a function of excitation voltage. A parallel effort to effect desired shape changes through electron gun excitation is also summarized. A one-dimensional model-based algorithm is developed to correct profile errors in bimorph beams. A more useful two-dimensional algorithm is also developed that relies on measured voltage-curvature sensitivities to provide corrective excitation profiles for the top and bottom surfaces of bimorph plates. The two algorithms are illustrated using finite element models of PZT bimorph structures subjected to arbitrary disturbances. Corrective excitation profiles that yield desired parabolic forms are computed, and are shown to provide the necessary corrective action.
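    A short sketch of the coarse shape-sensing step described above: fit a cubic spline to discrete slope measurements and integrate it to estimate the surface profile. The parabolic test profile and measurement locations are synthetic assumptions, not the authors' experimental data.

```python
import numpy as np
from scipy.interpolate import CubicSpline

x = np.linspace(0.0, 1.0, 11)            # measurement locations (m)
true_profile = 0.05 * x**2               # assumed parabolic deformation (m)
slopes = 0.10 * x                        # discrete slope measurements d(profile)/dx

slope_spline = CubicSpline(x, slopes)               # spline through the slopes
profile_est = slope_spline.antiderivative()(x)      # integrate slope -> profile

print("max profile error (m):", np.abs(profile_est - true_profile).max())
```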

  4. Experimental Validation of Advanced Dispersed Fringe Sensing (ADFS) Algorithm Using Advanced Wavefront Sensing and Correction Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Wang, Xu; Shi, Fang; Sigrist, Norbert; Seo, Byoung-Joon; Tang, Hong; Bikkannavar, Siddarayappa; Basinger, Scott; Lay, Oliver

    2012-01-01

    Large-aperture telescopes commonly feature segmented mirrors, and a coarse phasing step is needed to bring these individual segments into the fine phasing capture range. Dispersed Fringe Sensing (DFS) is a powerful coarse phasing technique, and a variation of it is currently being used for JWST. An Advanced Dispersed Fringe Sensing (ADFS) algorithm was recently developed to improve the performance and robustness of previous DFS algorithms with better accuracy and a unique solution. The first part of the paper introduces the basic ideas and essential features of the ADFS algorithm and presents some algorithm sensitivity study results. The second part of the paper describes the full details of the algorithm validation process on the Advanced Wavefront Sensing and Correction Testbed (AWCT): first, the optimization of the DFS hardware of AWCT to ensure data accuracy and reliability is illustrated. Then, a few carefully designed algorithm validation experiments are implemented, and the corresponding data analysis results are shown. Finally, the fiducial calibration using the Range-Gate-Metrology technique is carried out, and a <10 nm or <1% algorithm accuracy is demonstrated.

  5. Mechanisms of Undersensing by a Noise Detection Algorithm That Utilizes Far-Field Electrograms With Near-Field Bandpass Filtering.

    PubMed

    Koneru, Jayanthi N; Swerdlow, Charles D; Ploux, Sylvain; Sharma, Parikshit S; Kaszala, Karoly; Tan, Alex Y; Huizar, Jose F; Vijayaraman, Pugazhendi; Kenigsberg, David; Ellenbogen, Kenneth A

    2017-02-01

    Implantable cardioverter defibrillators (ICDs) must establish a balance between delivering appropriate shocks for ventricular tachyarrhythmias and withholding inappropriate shocks for lead-related oversensing ("noise"). To improve the specificity of ICD therapy, manufacturers have developed proprietary algorithms that detect lead noise. The SecureSense™ RV Lead Noise discrimination (St. Jude Medical, St. Paul, MN, USA) algorithm is designed to differentiate oversensing due to lead failure from ventricular tachyarrhythmias and withhold therapies in the presence of sustained lead-related oversensing. We report 5 patients in whom appropriate ICD therapy was withheld due to the operation of the SecureSense algorithm and explain the mechanism for inhibition of therapy in each case. Limitations of algorithms designed to increase ICD therapy specificity, especially for the SecureSense algorithm, are analyzed. The SecureSense algorithm can withhold appropriate therapies for ventricular arrhythmias due to design and programming limitations. Electrophysiologists should have a thorough understanding of the SecureSense algorithm before routinely programming it and understand the implications for ventricular arrhythmia misclassification. © 2016 Wiley Periodicals, Inc.

  6. Soil water balance calculation using a two source energy balance model and wireless sensor arrays aboard a center pivot

    USDA-ARS?s Scientific Manuscript database

    Recent developments in wireless sensor technology and remote sensing algorithms, coupled with increased use of center pivot irrigation systems, have removed several long-standing barriers to adoption of remote sensing for real-time irrigation management. One remote sensing-based algorithm is a two s...

  7. Remote Sensing Applications to Water Quality Management in Florida

    EPA Science Inventory

    Increasingly, optical datasets from estuarine and coastal systems are becoming available for remote sensing algorithm development, validation, and application. With validated algorithms, the data streams from satellite sensors can provide unprecedented spatial and temporal data ...

  8. [Algorithms of multiband remote sensing for coastal red tide waters].

    PubMed

    Mao, Xianmou; Huang, Weigen

    2003-07-01

    The spectral characteristics of coastal waters in the East China Sea were studied using in situ measurements, and multiband remote sensing algorithms for bloom waters were discussed and developed. Examples of red tide detection using the algorithms in the East China Sea are presented. The results showed that the algorithms could provide information about the location and the area coverage of red tide events.

  9. Observability-Based Guidance and Sensor Placement

    NASA Astrophysics Data System (ADS)

    Hinson, Brian T.

    Control system performance is highly dependent on the quality of sensor information available. In a growing number of applications, however, the control task must be accomplished with limited sensing capabilities. This thesis addresses these types of problems from a control-theoretic point-of-view, leveraging system nonlinearities to improve sensing performance. Using measures of observability as an information quality metric, guidance trajectories and sensor distributions are designed to improve the quality of sensor information. An observability-based sensor placement algorithm is developed to compute optimal sensor configurations for a general nonlinear system. The algorithm utilizes a simulation of the nonlinear system as the source of input data, and convex optimization provides a scalable solution method. The sensor placement algorithm is applied to a study of gyroscopic sensing in insect wings. The sensor placement algorithm reveals information-rich areas on flexible insect wings, and a comparison to biological data suggests that insect wings are capable of acting as gyroscopic sensors. An observability-based guidance framework is developed for robotic navigation with limited inertial sensing. Guidance trajectories and algorithms are developed for range-only and bearing-only navigation that improve navigation accuracy. Simulations and experiments with an underwater vehicle demonstrate that the observability measure allows tuning of the navigation uncertainty.

  10. Agent-Based Chemical Plume Tracing Using Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Zarzhitsky, Dimitri; Spears, Diana; Thayer, David; Spears, William

    2004-01-01

    This paper presents a rigorous evaluation of a novel, distributed chemical plume tracing algorithm. The algorithm is a combination of the best aspects of the two most popular predecessors for this task. Furthermore, it is based on solid, formal principles from the field of fluid mechanics. The algorithm is applied by a network of mobile sensing agents (e.g., robots or micro-air vehicles) that sense the ambient fluid velocity and chemical concentration, and calculate derivatives. The algorithm drives the robotic network to the source of the toxic plume, where measures can be taken to disable the source emitter. This work is part of a much larger effort in research and development of a physics-based approach to developing networks of mobile sensing agents for monitoring, tracking, reporting and responding to hazardous conditions.

  11. Embedded 32-bit Differential Pulse Voltammetry (DPV) Technique for 3-electrode Cell Sensing

    NASA Astrophysics Data System (ADS)

    N, Aqmar N. Z.; Abdullah, W. F. H.; Zain, Z. M.; Rani, S.

    2018-03-01

    This paper addresses the development of an embedded differential pulse voltammetry (DPV) algorithm on an ARM Cortex processor, together with a newly developed potentiostat circuit design, for in-situ 3-electrode cell sensing. The main aim of the project is to design a low-cost potentiostat for laboratory researchers, which requires developing an embedded analytical-technique algorithm to be used with the designed potentiostat. DPV is one of the most familiar pulse techniques used with 3-electrode cell sensing in chemical studies. An experiment was conducted on a 10 mM ferricyanide solution using the designed potentiostat and the developed DPV algorithm. As a result, the device can generate a DPV excitation signal from 0.4 V to 1.2 V and produces a peaked voltammogram with a relatively small error compared to a commercial potentiostat, with only a 6.25% difference in peak potential reading. The design of the potentiostat device and its DPV algorithm is thereby verified.
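    A toy example of how a DPV excitation staircase can be generated and the differential current formed, using the 0.4-1.2 V sweep quoted above; the pulse parameters and the sigmoidal cell response are illustrative assumptions, not the embedded ARM implementation.

```python
import numpy as np

E_start, E_end = 0.4, 1.2     # potential sweep reported in the abstract (V)
step, pulse = 0.01, 0.05      # assumed staircase step and pulse amplitude (V)

base = np.arange(E_start, E_end + step, step)      # staircase potentials
pulsed = base + pulse                              # potential during each pulse

def cell_current(E, E_peak=0.8, width=0.05):
    """Hypothetical sigmoidal cell response used only for illustration."""
    return 1.0 / (1.0 + np.exp(-(E - E_peak) / width))

# DPV current = current at the end of the pulse minus current just before it
i_diff = cell_current(pulsed) - cell_current(base)
print("peak potential (V):", base[np.argmax(i_diff)])
```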

  12. Development and demonstration of a freezing drizzle algorithm for roadway environmental sensing Systems.

    DOT National Transportation Integrated Search

    2012-10-01

    The primary goal of this project is to demonstrate the accuracy and utility of a freezing drizzle algorithm that can be implemented on roadway environmental sensing systems (ESSs). : The types of problems related to the occurrence of freezing precipi...

  13. Fiber Optic Wing Shape Sensing on NASA's Ikhana UAV

    NASA Technical Reports Server (NTRS)

    Richards, Lance; Parker, Allen R.; Ko, William L.; Piazza, Anthony

    2008-01-01

    Fiber Optic Wing Shape Sensing on Ikhana involves five major areas: 1) Algorithm development: local-strain-to-displacement algorithms have been developed for complex wing shapes for real-time implementation (NASA TP-2007-214612, patent application submitted); 2) FBG system development: Dryden advancements to fiber optic sensing technology have increased data sampling rates to levels suitable for monitoring structures in flight (patent application submitted); 3) Instrumentation: 2880 FBG strain sensors have been successfully installed on the Ikhana wings; 4) Ground testing: fiber optic wing shape sensing methods for high-aspect-ratio UAVs have been validated through extensive ground testing in Dryden's Flight Loads Laboratory; 5) Flight testing: real-time fiber Bragg strain measurements were successfully acquired and validated in flight (4/28/2008), and real-time fiber optic wing shape sensing was successfully demonstrated in flight.

  14. Prospective Elementary Teachers' Development of Fraction Number Sense

    ERIC Educational Resources Information Center

    Utley, Juliana; Reeder, Stacy

    2012-01-01

    Can prospective elementary teachers "unlearn" harmful algorithms used with fractions as they are invited to develop fraction number sense? This study examined the development of prospective elementary teachers' fraction number sense during an intermediate (grades 5-8) mathematics methods course. During this course, participants were involved in a…

  15. Investigation of the application of remote sensing technology to environmental monitoring

    NASA Technical Reports Server (NTRS)

    Rader, M. L. (Principal Investigator)

    1980-01-01

    Activities and results are reported of a project to investigate the application of remote sensing technology developed for the LACIE, AgRISTARS, Forestry, and other NASA remote sensing projects to the environmental monitoring of strip mining, industrial pollution, and acid rain. Following a remote sensing workshop for EPA personnel, the EOD clustering algorithm CLASSY was selected for evaluation by EPA as a possible candidate technology. LANDSAT data acquired for a North Dakota test site were clustered in order to compare CLASSY with other algorithms.

  16. Shearlet Features for Registration of Remotely Sensed Multitemporal Images

    NASA Technical Reports Server (NTRS)

    Murphy, James M.; Le Moigne, Jacqueline

    2015-01-01

    We investigate the role of anisotropic feature extraction methods for automatic image registration of remotely sensed multitemporal images. Building on the classical use of wavelets in image registration, we develop an algorithm based on shearlets, a mathematical generalization of wavelets that offers increased directional sensitivity. Initial experimental results on LANDSAT images are presented, which indicate superior performance of the shearlet algorithm when compared to classical wavelet algorithms.

  17. Remote Sensing Applications to Water Quality Management in Florida

    NASA Astrophysics Data System (ADS)

    Lehrter, J. C.; Schaeffer, B. A.; Hagy, J.; Spiering, B.; Barnes, B.; Hu, C.; Le, C.; McEachron, L.; Underwood, L. W.; Ellis, C.; Fisher, B.

    2013-12-01

    Optical datasets from estuarine and coastal systems are increasingly available for remote sensing algorithm development, validation, and application. With validated algorithms, the data streams from satellite sensors can provide unprecedented spatial and temporal data for local and regional coastal water quality management. Our presentation will highlight two recent applications of optical data and remote sensing to water quality decision-making in coastal regions of the state of Florida: (1) informing the development of estuarine and coastal nutrient criteria for the state of Florida and (2) informing the rezoning of the Florida Keys National Marine Sanctuary. These efforts involved building up the underlying science to demonstrate the applicability of satellite data as well as an outreach component to educate decision-makers about the use, utility, and uncertainties of remote sensing data products. Scientific developments included testing existing algorithms and generating new algorithms for water clarity and chlorophyll-a in case II (CDOM or turbidity dominated) estuarine and coastal waters and demonstrating the accuracy of remote sensing data products in comparison to traditional field-based measurements. Including members from decision-making organizations on the research team and interacting with decision-makers early and often in the process were key factors for the success of the outreach efforts and the eventual adoption of satellite data into the data records and analyses used in decision-making. (Figure captions: Florida coastal water bodies (black boxes) for which remote sensing imagery was applied to derive numeric nutrient criteria, with in situ observations (black dots) used to validate the imagery; Florida ocean color applied to the development of numeric nutrient criteria.)

  18. Development of an Algorithm for Satellite Remote Sensing of Sea and Lake Ice

    NASA Astrophysics Data System (ADS)

    Dorofy, Peter T.

    Satellite remote sensing of snow and ice has a long history. The traditional method for many snow and ice detection algorithms has been the use of the Normalized Difference Snow Index (NDSI). This manuscript is composed of two parts. Chapter 1, Development of a Mid-Infrared Sea and Lake Ice Index (MISI) using the GOES Imager, discusses the desirability, development, and implementation of an alternative index for an ice detection algorithm, application of the algorithm to the detection of lake ice, and qualitative validation against other ice mapping products, such as the Ice Mapping System (IMS). Chapter 2, Application of Dynamic Threshold in a Lake Ice Detection Algorithm, continues with a discussion of the development of a method that accounts for the variable viewing and illumination geometry of observations throughout the day. The method is an alternative to Bidirectional Reflectance Distribution Function (BRDF) models. Evaluation of the performance of the algorithm is introduced by aggregating classified pixels within geometrical boundaries designated by IMS and obtaining sensitivity and specificity statistical measures.
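    A short sketch of the index-plus-threshold style of ice detection discussed above, using the conventional NDSI as the example index; the MISI band combination and the dynamic-threshold scheme themselves are not reproduced, and the band values are synthetic.

```python
import numpy as np

visible = np.array([0.55, 0.40, 0.10, 0.08])   # e.g. visible-band reflectance
swir    = np.array([0.10, 0.08, 0.09, 0.07])   # e.g. shortwave-infrared reflectance

ndsi = (visible - swir) / (visible + swir)

# A static threshold (0.4 is a commonly quoted value for snow/ice); a dynamic
# scheme would instead vary this threshold with viewing and solar geometry.
ice_mask = ndsi > 0.4
print(ndsi.round(2), ice_mask)
```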

  19. A View from Above Without Leaving the Ground

    NASA Technical Reports Server (NTRS)

    2004-01-01

    In order to deliver accurate geospatial data and imagery to the remote sensing community, NASA is constantly developing new image-processing algorithms while refining existing ones for technical improvement. For 8 years, the NASA Regional Applications Center at Florida International University has served as a test bed for implementing and validating many of these algorithms, helping the Space Program to fulfill its strategic and educational goals in the area of remote sensing. The algorithms in return have helped the NASA Regional Applications Center develop comprehensive semantic database systems for data management, as well as new tools for disseminating geospatial information via the Internet.

  20. A Locomotion Control Algorithm for Robotic Linkage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dohner, Jeffrey L.

    This dissertation describes the development of a control algorithm that transitions a robotic linkage system between stabilized states producing responsive locomotion. The developed algorithm is demonstrated using a simple robotic construction consisting of a few links with actuation and sensing at each joint. Numerical and experimental validation is presented.

  1. Development and validation of a dual sensing scheme to improve accuracy of bradycardia and pause detection in an insertable cardiac monitor.

    PubMed

    Passman, Rod S; Rogers, John D; Sarkar, Shantanu; Reiland, Jerry; Reisfeld, Erin; Koehler, Jodi; Mittal, Suneet

    2017-07-01

    Undersensing of premature ventricular beats and low-amplitude R waves are primary causes for inappropriate bradycardia and pause detections in insertable cardiac monitors (ICMs). The purpose of this study was to develop and validate an enhanced algorithm to reduce inappropriately detected bradycardia and pause episodes. Independent data sets to develop and validate the enhanced algorithm were derived from a database of ICM-detected bradycardia and pause episodes in de-identified patients monitored for unexplained syncope. The original algorithm uses an auto-adjusting sensitivity threshold for R-wave sensing to detect tachycardia and avoid T-wave oversensing. In the enhanced algorithm, a second sensing threshold is used with a long blanking and fixed lower sensitivity threshold, looking for evidence of undersensed signals. Data reported includes percent change in appropriate and inappropriate bradycardia and pause detections as well as changes in episode detection sensitivity and positive predictive value with the enhanced algorithm. The validation data set, from 663 consecutive patients, consisted of 4904 (161 patients) bradycardia and 2582 (133 patients) pause episodes, of which 2976 (61%) and 996 (39%) were appropriately detected bradycardia and pause episodes. The enhanced algorithm reduced inappropriate bradycardia and pause episodes by 95% and 47%, respectively, with 1.7% and 0.6% reduction in appropriate episodes, respectively. The average episode positive predictive value improved by 62% (P < .001) for bradycardia detection and by 26% (P < .001) for pause detection, with an average relative sensitivity of 95% (P < .001) and 99% (P = .5), respectively. The enhanced dual sense algorithm for bradycardia and pause detection in ICMs substantially reduced inappropriate episode detection with a minimal reduction in true episode detection. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
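    A hedged sketch of the dual-sensing idea described above: a primary auto-adjusting threshold plus a second fixed, less sensitive threshold with a long blanking period that looks for evidence of undersensed beats. The synthetic rectified signal and all thresholds are assumptions; this is not the device's proprietary algorithm.

```python
import numpy as np

def dual_sense(signal, fs, fixed_thresh=0.2, blank_s=0.4):
    primary_thresh = 0.6 * signal.max()        # crude auto-adjusting threshold
    blank = int(blank_s * fs)
    primary, secondary, last2 = [], [], -blank
    for i, v in enumerate(signal):
        if v > primary_thresh:
            primary.append(i)                  # normal R-wave detections
        if v > fixed_thresh and i - last2 >= blank:
            secondary.append(i)                # evidence of possibly undersensed beats
            last2 = i
    return primary, secondary

fs = 100
t = np.arange(0, 5, 1 / fs)
ecg = np.zeros_like(t)
ecg[::100] = 1.0          # large R waves every second
ecg[50::100] = 0.3        # small premature beats missed by the primary threshold
p, s = dual_sense(ecg, fs)
print("primary detections:", len(p), "secondary detections:", len(s))
```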

  2. The atmospheric correction algorithm for HY-1B/COCTS

    NASA Astrophysics Data System (ADS)

    He, Xianqiang; Bai, Yan; Pan, Delu; Zhu, Qiankun

    2008-10-01

    China launched its second ocean color satellite, HY-1B, on 11 April 2007, carrying two remote sensors. The Chinese Ocean Color and Temperature Scanner (COCTS) is the main sensor on HY-1B; it has not only eight visible and near-infrared wavelength bands similar to SeaWiFS, but also two additional thermal infrared bands to measure sea surface temperature. COCTS therefore has broad application potential, such as fishery resource protection and development, coastal monitoring and management, and marine pollution monitoring. Atmospheric correction is the key to quantitative ocean color remote sensing. In this paper, the operational atmospheric correction algorithm for HY-1B/COCTS is developed. Firstly, based on the vector radiative transfer numerical model of the coupled ocean-atmosphere system (PCOART), exact Rayleigh scattering, aerosol scattering, and atmospheric diffuse transmission look-up tables (LUTs) for HY-1B/COCTS are generated. Secondly, using the generated LUTs, the operational atmospheric correction algorithm for HY-1B/COCTS is developed. The algorithm is validated using simulated spectral data generated by PCOART, and the results show that the error of the water-leaving reflectance retrieved by the algorithm is less than 0.0005, which meets the requirement of exact atmospheric correction for ocean color remote sensing. Finally, the algorithm is applied to HY-1B/COCTS remote sensing data; the retrieved water-leaving radiances are consistent with Aqua/MODIS results, and the corresponding ocean color remote sensing products, including chlorophyll concentration and total suspended particulate matter concentration, are generated.
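    A small sketch of the look-up-table step in an atmospheric correction of this kind: pre-compute Rayleigh reflectance on a coarse geometry grid, interpolate it for the observation geometry, and subtract it from the top-of-atmosphere reflectance. The LUT values here are fabricated, not PCOART output.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

sun_zen = np.linspace(0, 60, 7)        # solar zenith grid (deg)
view_zen = np.linspace(0, 60, 7)       # view zenith grid (deg)
# fabricated Rayleigh reflectance table, increasing with both angles
rho_r = 0.05 + 0.0005 * (sun_zen[:, None] + view_zen[None, :])

rayleigh_lut = RegularGridInterpolator((sun_zen, view_zen), rho_r)

rho_toa = 0.12                                   # observed TOA reflectance in one band
rho_rayleigh = rayleigh_lut([[35.0, 20.0]])[0]   # interpolate for this geometry
rho_corrected = rho_toa - rho_rayleigh           # remaining aerosol + water signal
print("Rayleigh:", round(rho_rayleigh, 4), "corrected:", round(rho_corrected, 4))
```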

  3. [The progress in retrieving land surface temperature based on thermal infrared and microwave remote sensing technologies].

    PubMed

    Zhang, Jia-Hua; Li, Xin; Yao, Feng-Mei; Li, Xian-Hua

    2009-08-01

    Land surface temperature (LST) is an important parameter in studies of the exchange of matter and energy between the land surface and the atmosphere and of land surface physical processes at regional and global scales. Many applications of satellite remotely sensed data, such as monitoring of drought, high temperature, forest fire, earthquake, hydrology, and vegetation, require accurate and quantitative LST, and models of global circulation and regional climate also need LST as an input parameter. Therefore, retrieval of LST using remote sensing technology has become one of the key tasks in quantitative remote sensing studies. Within the spectrum, the thermal infrared (TIR, 3-15 microm) and microwave (1 mm-1 m) bands are important for LST retrieval. In the present paper, firstly, several methods for estimating LST on the basis of thermal infrared (TIR) remote sensing are reviewed, i.e., LST measured with a ground-based infrared thermometer; LST retrieval with the mono-window algorithm (MWA), single-channel algorithm (SCA), split-window techniques (SWT), and multi-channel algorithm (MCA); single-channel multi-angle and multi-channel multi-angle algorithms; and retrieval of land surface component temperature from thermal infrared satellite observations. Secondly, the status of research on land surface emissivity (epsilon) is presented. Thirdly, in order to retrieve LST under all weather conditions, microwave remotely sensed data have recently been developed as an alternative to thermal infrared data, and the LST retrieval method from passive microwave remotely sensed data is also introduced. Finally, the main merits and shortcomings of the different kinds of LST retrieval methods are discussed.
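    A brief sketch of the split-window idea listed above: estimate LST from two adjacent thermal-infrared brightness temperatures. The generic quadratic form and the coefficients are placeholders; operational algorithms derive them from emissivity and atmospheric water vapour.

```python
def split_window_lst(t11, t12, a=1.0, b=2.0, c=0.5):
    """Generic split-window form: LST = T11 + a*(T11 - T12) + b*(T11 - T12)**2 + c."""
    d = t11 - t12
    return t11 + a * d + b * d**2 + c

# brightness temperatures (kelvin) in the 11 um and 12 um channels
print(split_window_lst(295.0, 293.5), "K")
```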

  4. A Plane Target Detection Algorithm in Remote Sensing Images based on Deep Learning Network Technology

    NASA Astrophysics Data System (ADS)

    Shuxin, Li; Zhilong, Zhang; Biao, Li

    2018-01-01

    Planes are an important target category in remote sensing, and automatic detection of plane targets is of great value. As remote imaging technology develops, the resolution of remote sensing images has become very high, providing more detailed information for automatic target detection. Deep learning network technology is the most advanced technology in image target detection and recognition, and it has delivered great performance improvements in target detection and recognition in everyday scenes. We applied this technology to remote sensing target detection and proposed an end-to-end deep network algorithm that can learn from remote sensing images to detect targets in new images automatically and robustly. Our experiments show that the algorithm captures the feature information of plane targets and outperforms older detection methods.

  5. Comparison of remote sensing algorithms for retrieval of suspended particulate matter concentration from reflectance in coastal waters

    NASA Astrophysics Data System (ADS)

    Freeman, Lauren A.; Ackleson, Steven G.; Rhea, William Joseph

    2017-10-01

    Suspended particulate matter (SPM) is a key environmental indicator for rivers, estuaries, and coastal waters, which can be calculated from remote sensing reflectance obtained by an airborne or satellite imager. Here, algorithms from prior studies are applied to a dataset of in-situ at surface hyperspectral remote sensing reflectance, collected in three geographic regions representing different water types. These data show the optically inherent exponential nature of the relationship between reflectance and sediment concentration. However, linear models are also shown to provide a reasonable estimate of sediment concentration when utilized with care in similar conditions to those under which the algorithms were developed, particularly at lower SPM values (0 to 20 mg/L). Fifteen published SPM algorithms are tested, returning strong correlations of R2>0.7, and in most cases, R2>0.8. Very low SPM values show weaker correlation with algorithm calculated SPM that is not wavelength dependent. None of the tested algorithms performs well for high SPM values (>30 mg/L), with most algorithms underestimating SPM. A shift toward a smaller number of simple exponential or linear models relating satellite remote sensing reflectance to suspended sediment concentration with regional consideration will greatly aid larger spatiotemporal studies of suspended sediment trends.
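    A toy example of fitting the two model families compared above, a linear and an exponential SPM-versus-reflectance relation, to a small synthetic dataset; the published algorithms' bands and coefficients are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
spm_true = rng.uniform(1.0, 30.0, 50)                            # mg/L
rrs = 0.002 * np.log(spm_true + 1.0) + rng.normal(0, 5e-5, 50)   # synthetic reflectance

# Linear model: SPM = a*Rrs + b
a_lin, b_lin = np.polyfit(rrs, spm_true, 1)

# Exponential model: SPM = exp(a*Rrs + b), fitted in log space
a_exp, b_exp = np.polyfit(rrs, np.log(spm_true), 1)

spm_lin = a_lin * rrs + b_lin
spm_exp = np.exp(a_exp * rrs + b_exp)
for name, est in [("linear", spm_lin), ("exponential", spm_exp)]:
    ss_res = np.sum((spm_true - est) ** 2)
    ss_tot = np.sum((spm_true - spm_true.mean()) ** 2)
    print(name, "R^2 =", round(1 - ss_res / ss_tot, 3))
```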

  6. Hyperspectral remote sensing of coral reefs: Deriving bathymetry, aquatic optical properties and a benthic spectral unmixing classification using AVIRIS data in the Hawaiian Islands

    NASA Astrophysics Data System (ADS)

    Goodman, James Ansell

    My research focuses on the development and application of hyperspectral remote sensing as a valuable component in the assessment and management of coral ecosystems. Remote sensing provides an important quantitative ability to investigate the spatial dynamics of coral health and evaluate the impacts of local, regional and global change on this important natural resource. Furthermore, advances in detector capabilities and analysis methods, particularly with respect to hyperspectral remote sensing, are also increasing the accuracy and level of effectiveness of the resulting data products. Using imagery of Kaneohe Bay and French Frigate Shoals in the Hawaiian Islands, acquired in 2000 by NASA's Airborne Visible InfraRed Imaging Spectrometer (AVIRIS), I developed, applied and evaluated algorithms for analyzing coral reefs using hyperspectral remote sensing data. Research included developing methods for acquiring in situ underwater reflectance, collecting spectral measurements of the dominant bottom components in Kaneohe Bay, applying atmospheric correction and sunglint removal algorithms, employing a semianalytical optimization model to derive bathymetry and aquatic optical properties, and developing a linear unmixing approach for deriving bottom composition. Additionally, algorithm development focused on using fundamental scientific principles to facilitate the portability of methods to diverse geographic locations and across variable environmental conditions. Assessments of this methodology compared favorably with available field measurements and habitat information, and the overall analysis demonstrated the capacity to derive information on water properties, bathymetry and habitat composition. Thus, results illustrated a successful approach for extracting environmental information and habitat composition from a coral reef environment using hyperspectral remote sensing.
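    A small sketch of the linear spectral unmixing step described above: express each pixel spectrum as a non-negative mixture of benthic end-member spectra. The coral, algae, and sand spectra here are synthetic placeholders, not the Kaneohe Bay field spectra.

```python
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(400, 700, 30)
coral = 0.05 + 0.10 * np.exp(-((wavelengths - 575) / 40.0) ** 2)
algae = 0.03 + 0.12 * np.exp(-((wavelengths - 550) / 30.0) ** 2)
sand  = 0.10 + 0.0008 * (wavelengths - 400)

E = np.column_stack([coral, algae, sand])       # end-member matrix
pixel = 0.5 * coral + 0.2 * algae + 0.3 * sand  # synthetic mixed pixel

fractions, residual = nnls(E, pixel)            # non-negative least squares unmixing
print("estimated fractions (coral, algae, sand):", fractions.round(2))
```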

  7. System development of the Screwworm Eradication Data System (SEDS) algorithm

    NASA Technical Reports Server (NTRS)

    Arp, G.; Forsberg, F.; Giddings, L.; Phinney, D.

    1976-01-01

    The use of remotely sensed data is reported in the eradication of the screwworm and in the study of the role of the weather in the activity and development of the screwworm fly. As a result, the Screwworm Eradication Data System (SEDS) algorithm was developed.

  8. Geologist's Field Assistant: Developing Image and Spectral Analyses Algorithms for Remote Science Exploration

    NASA Technical Reports Server (NTRS)

    Gulick, V. C.; Morris, R. L.; Bishop, J.; Gazis, P.; Alena, R.; Sierhuis, M.

    2002-01-01

    We are developing science analyses algorithms to interface with a Geologist's Field Assistant device to allow robotic or human remote explorers to better sense their surroundings during limited surface excursions. Our algorithms will interpret spectral and imaging data obtained by various sensors. Additional information is contained in the original extended abstract.

  9. Research on fast Fourier transforms algorithm of huge remote sensing image technology with GPU and partitioning technology.

    PubMed

    Yang, Xue; Li, Xue-You; Li, Jia-Guo; Ma, Jun; Zhang, Li; Yang, Jan; Du, Quan-Ye

    2014-02-01

    The fast Fourier transform (FFT) is a basic approach to remote sensing image processing. As remote sensing imagery grows in spectral, spatial, and temporal resolution, using FFT technology to efficiently process huge remote sensing images has become a critical step and a research hotspot in current image processing technology. The FFT algorithm, one of the basic algorithms of image processing, can be used for stripe noise removal, image compression, image registration, etc. in remote sensing image processing. The CUFFT function library is a GPU-based FFT algorithm library, while FFTW is an FFT algorithm library developed for CPUs on the PC platform and is currently the fastest CPU-based FFT function library. However, both methods share a common problem: once the available memory is smaller than the image, an out-of-memory error or memory overflow occurs when performing the image FFT. To address this problem, a GPU and partitioning technology based Huge Remote Fast Fourier Transform (HRFFT) algorithm is proposed in this paper. By improving the FFT algorithm in the CUFFT function library, the problem of out-of-memory errors and memory overflow is solved. Moreover, the method is shown to be sound in experiments with CCD imagery from the HJ-1A satellite. When applied to practical image processing, it improves the processing results and speeds up the processing, saving computation time and achieving sound results.

  10. Compensating for pneumatic distortion in pressure sensing devices

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Leondes, Cornelius T.

    1990-01-01

    A technique of compensating for pneumatic distortion in pressure sensing devices was developed and verified. This compensation allows conventional pressure sensing technology to obtain improved unsteady pressure measurements. Pressure distortion caused by frictional attenuation and pneumatic resonance within the sensing system makes obtaining unsteady pressure measurements by conventional sensors difficult. Most distortion occurs within the pneumatic tubing which transmits pressure impulses from the aircraft's surface to the measurement transducer. To avoid pneumatic distortion, experiment designers mount the pressure sensor at the surface of the aircraft, (called in-situ mounting). In-situ transducers cannot always fit in the available space and sometimes pneumatic tubing must be run from the aircraft's surface to the pressure transducer. A technique to measure unsteady pressure data using conventional pressure sensing technology was developed. A pneumatic distortion model is reduced to a low-order, state-variable model retaining most of the dynamic characteristics of the full model. The reduced-order model is coupled with results from minimum variance estimation theory to develop an algorithm to compensate for the effects of pneumatic distortion. Both postflight and real-time algorithms are developed and evaluated using simulated and flight data.
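    A hedged sketch of the idea behind the compensation algorithm: model the tubing as a low-order lag (first-order here) and invert it to recover the surface pressure from the lagged transducer reading. This is a simplification of the report's reduced-order state-variable model with minimum variance estimation; the time constant and test signal are assumed.

```python
import numpy as np

dt, tau = 0.001, 0.02          # sample interval and assumed tube time constant (s)
t = np.arange(0, 0.5, dt)
p_surface = np.sin(2 * np.pi * 10 * t)        # "true" unsteady surface pressure

# forward model: first-order lag  dp_m/dt = (p_s - p_m)/tau
p_meas = np.zeros_like(t)
for k in range(1, len(t)):
    p_meas[k] = p_meas[k - 1] + dt * (p_surface[k - 1] - p_meas[k - 1]) / tau

# compensation: invert the lag,  p_s ~= p_m + tau * dp_m/dt
p_est = p_meas + tau * np.gradient(p_meas, dt)
print("rms error before:", np.sqrt(np.mean((p_meas - p_surface) ** 2)).round(3))
print("rms error after :", np.sqrt(np.mean((p_est - p_surface) ** 2)).round(3))
```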

  11. Data compressive paradigm for multispectral sensing using tunable DWELL mid-infrared detectors.

    PubMed

    Jang, Woo-Yong; Hayat, Majeed M; Godoy, Sebastián E; Bender, Steven C; Zarkesh-Ha, Payman; Krishna, Sanjay

    2011-09-26

    While quantum dots-in-a-well (DWELL) infrared photodetectors have the feature that their spectral responses can be shifted continuously by varying the applied bias, the width of the spectral response at any applied bias is not sufficiently narrow for use in multispectral sensing without the aid of spectral filters. To achieve higher spectral resolutions without using physical spectral filters, algorithms have been developed for post-processing the DWELL's bias-dependent photocurrents resulting from probing an object of interest repeatedly over a wide range of applied biases. At the heart of these algorithms is the ability to approximate an arbitrary spectral filter, which we desire the DWELL-algorithm combination to mimic, by forming a weighted superposition of the DWELL's non-orthogonal spectral responses over a range of applied biases. However, these algorithms assume availability of abundant DWELL data over a large number of applied biases (>30), leading to large overall acquisition times in proportion with the number of biases. This paper reports a new multispectral sensing algorithm to substantially compress the number of necessary bias values subject to a prescribed performance level across multiple sensing applications. The algorithm identifies a minimal set of biases to be used in sensing only the relevant spectral information for remote-sensing applications of interest. Experimental results on target spectrometry and classification demonstrate a reduction in the number of required biases by a factor of 7 (e.g., from 30 to 4). The tradeoff between performance and bias compression is thoroughly investigated. © 2011 Optical Society of America
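    A small sketch of the filter-synthesis idea described above: approximate a desired narrow spectral filter as a weighted superposition of broad, bias-dependent responses, then keep only the few biases with the largest weights. All response curves are synthetic Gaussians, not measured DWELL data, and the bias-selection rule is a simplification of the paper's algorithm.

```python
import numpy as np
from scipy.optimize import nnls

wl = np.linspace(3.0, 10.0, 200)                       # wavelength grid (um)
centers = np.linspace(4.0, 9.0, 30)                    # one broad response per bias
R = np.column_stack([np.exp(-((wl - c) / 1.0) ** 2) for c in centers])

target = np.exp(-((wl - 6.0) / 0.2) ** 2)              # desired narrow filter

weights, _ = nnls(R, target)                           # superposition weights
best = np.argsort(weights)[::-1][:4]                   # compress to 4 biases
approx = R[:, best] @ weights[best]
print("kept biases:", np.sort(best),
      "residual:", np.linalg.norm(target - approx).round(3))
```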

  12. Polarimetric Remote Sensing of Atmospheric Particulate Pollutants

    NASA Astrophysics Data System (ADS)

    Li, Z.; Zhang, Y.; Hong, J.

    2018-04-01

    Atmospheric particulate pollutants not only reduce atmospheric visibility, change the energy balance of the troposphere, but also affect human and vegetation health. For monitoring the particulate pollutants, we establish and develop a series of inversion algorithms based on polarimetric remote sensing technology which has unique advantages in dealing with atmospheric particulates. A solution is pointed out to estimate the near surface PM2.5 mass concentrations from full remote sensing measurements including polarimetric, active and infrared remote sensing technologies. It is found that the mean relative error of PM2.5 retrieved by full remote sensing measurements is 35.5 % in the case of October 5th 2013, improved to a certain degree compared to previous studies. A systematic comparison with the ground-based observations further indicates the effectiveness of the inversion algorithm and reliability of results. A new generation of polarized sensors (DPC and PCF), whose observation can support these algorithms, will be onboard GF series satellites and launched by China in the near future.

  13. Equivalent Sensor Radiance Generation and Remote Sensing from Model Parameters. Part 1; Equivalent Sensor Radiance Formulation

    NASA Technical Reports Server (NTRS)

    Wind, Galina; DaSilva, Arlindo M.; Norris, Peter M.; Platnick, Steven E.

    2013-01-01

    In this paper we describe a general procedure for calculating equivalent sensor radiances from variables output from a global atmospheric forecast model. In order to take proper account of the discrepancies between model resolution and sensor footprint, the algorithm takes explicit account of the model subgrid variability, in particular its description of the probability density function of total water (vapor and cloud condensate). The equivalent sensor radiances are then substituted into an operational remote sensing algorithm processing chain to produce a variety of remote sensing products that would normally be produced from actual sensor output. This output can then be used for a wide variety of purposes such as model parameter verification, remote sensing algorithm validation, testing of new retrieval methods and future sensor studies. We show a specific implementation using the GEOS-5 model, the MODIS instrument and the MODIS Adaptive Processing System (MODAPS) Data Collection 5.1 operational remote sensing cloud algorithm processing chain (including the cloud mask, cloud top properties and cloud optical and microphysical properties products). We focus on clouds and cloud/aerosol interactions, because they are very important to model development and improvement.

  14. Multi-sensor Cloud Retrieval Simulator and Remote Sensing from Model Parameters. Pt. 1; Synthetic Sensor Radiance Formulation

    NASA Technical Reports Server (NTRS)

    Wind, G.; DaSilva, A. M.; Norris, P. M.; Platnick, S.

    2013-01-01

    In this paper we describe a general procedure for calculating synthetic sensor radiances from variable output from a global atmospheric forecast model. In order to take proper account of the discrepancies between model resolution and sensor footprint, the algorithm takes explicit account of the model subgrid variability, in particular its description of the probability density function of total water (vapor and cloud condensate). The simulated sensor radiances are then substituted into an operational remote sensing algorithm processing chain to produce a variety of remote sensing products that would normally be produced from actual sensor output. This output can then be used for a wide variety of purposes such as model parameter verification, remote sensing algorithm validation, testing of new retrieval methods and future sensor studies. We show a specific implementation using the GEOS-5 model, the MODIS instrument and the MODIS Adaptive Processing System (MODAPS) Data Collection 5.1 operational remote sensing cloud algorithm processing chain (including the cloud mask, cloud top properties and cloud optical and microphysical properties products). We focus on clouds because they are very important to model development and improvement.

  15. Advanced Fiber Optic-Based Sensing Technology for Unmanned Aircraft Systems

    NASA Technical Reports Server (NTRS)

    Richards, Lance; Parker, Allen R.; Piazza, Anthony; Ko, William L.; Chan, Patrick; Bakalyar, John

    2011-01-01

    This presentation provides an overview of fiber optic sensing technology development activities performed at NASA Dryden in support of Unmanned Aircraft Systems. Examples of current and previous work are presented in the following categories: algorithm development, system development, instrumentation installation, ground R&D, and flight testing. Examples of current research and development activities are provided.

  16. Object-oriented recognition of high-resolution remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Yongyan; Li, Haitao; Chen, Hong; Xu, Yuannan

    2016-01-01

    With the development of remote sensing imaging technology and the improvement of the resolution of multi-source satellite imagery in the visible, multi-spectral, and hyperspectral domains, high-resolution remote sensing images have been widely used in various fields, for example the military, surveying and mapping, geophysical prospecting, and the environment. In remote sensing images, segmentation of ground targets, feature extraction, and automatic recognition are hot and difficult topics in modern information technology research. This paper presents an object-oriented remote sensing image scene classification method. The method consists of typical-object (vehicle) classification generation, nonparametric density estimation, mean shift segmentation, a multi-scale corner detection algorithm, and template-based local shape matching. A remote sensing vehicle image classification software system is designed and implemented to meet these requirements.

  17. Comparative Study on a Solving Model and Algorithm for a Flush Air Data Sensing System

    PubMed Central

    Liu, Yanbin; Xiao, Dibo; Lu, Yuping

    2014-01-01

    With the development of high-performance aircraft, precise air data are necessary to complete challenging tasks such as flight maneuvering with large angles of attack and high speed. As a result, the flush air data sensing system (FADS) was developed to satisfy stricter control demands. In this paper, comparative studies on the solving model and algorithm for FADS are conducted. First, the basic principles of FADS are given to elucidate the nonlinear relations between the inputs and the outputs. Then, several different solving models and algorithms for FADS are provided to compute the air data, including the angle of attack, sideslip angle, dynamic pressure and static pressure. Afterwards, the evaluation criteria for the resulting models and algorithms are discussed to satisfy real design demands. Furthermore, a simulation using these algorithms is performed to identify the properties of the distinct models and algorithms, such as measuring precision and real-time features. The advantages of these models and algorithms under different flight conditions are also analyzed; furthermore, some suggestions on their engineering applications are proposed to help future research. PMID:24859025
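    A hedged sketch of one FADS solving approach: assume the widely used potential-flow pressure model for each port and recover angle of attack, sideslip angle, dynamic pressure, and static pressure with nonlinear least squares. The port layout, calibration parameter, and all numbers are assumptions, not the paper's models or testbed.

```python
import numpy as np
from scipy.optimize import least_squares

# port cone angles (lam) and clock angles (phi), radians; assumed 5-port layout
lam = np.radians([0, 20, 20, 20, 20])
phi = np.radians([0, 0, 90, 180, 270])
eps = 0.1                                   # assumed calibration parameter

def port_pressures(alpha, beta, qc, ps):
    # commonly quoted FADS incidence-angle relation (assumed form)
    cth = (np.cos(alpha) * np.cos(beta) * np.cos(lam)
           + np.sin(beta) * np.sin(phi) * np.sin(lam)
           + np.sin(alpha) * np.cos(beta) * np.cos(phi) * np.sin(lam))
    return qc * (cth**2 + eps * (1 - cth**2)) + ps

truth = (np.radians(5.0), np.radians(2.0), 5000.0, 101325.0)
p_meas = port_pressures(*truth)

def residual(x):
    return port_pressures(*x) - p_meas

sol = least_squares(residual, x0=[0.0, 0.0, 4000.0, 100000.0],
                    x_scale=[0.1, 0.1, 1000.0, 1000.0])
print("alpha, beta (deg):", np.degrees(sol.x[:2]).round(2),
      "qc, ps:", sol.x[2:].round(1))
```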

  18. Comparative study on a solving model and algorithm for a flush air data sensing system.

    PubMed

    Liu, Yanbin; Xiao, Dibo; Lu, Yuping

    2014-05-23

    With the development of high-performance aircraft, precise air data are necessary to complete challenging tasks such as flight maneuvering with large angles of attack and high speed. As a result, the flush air data sensing system (FADS) was developed to satisfy stricter control demands. In this paper, comparative studies on the solving model and algorithm for FADS are conducted. First, the basic principles of FADS are given to elucidate the nonlinear relations between the inputs and the outputs. Then, several different solving models and algorithms for FADS are provided to compute the air data, including the angle of attack, sideslip angle, dynamic pressure and static pressure. Afterwards, the evaluation criteria for the resulting models and algorithms are discussed to satisfy real design demands. Furthermore, a simulation using these algorithms is performed to identify the properties of the distinct models and algorithms, such as measuring precision and real-time features. The advantages of these models and algorithms under different flight conditions are also analyzed; furthermore, some suggestions on their engineering applications are proposed to help future research.

  19. A Multi-Band Analytical Algorithm for Deriving Absorption and Backscattering Coefficients from Remote-Sensing Reflectance of Optically Deep Waters

    NASA Technical Reports Server (NTRS)

    Lee, Zhong-Ping; Carder, Kendall L.

    2001-01-01

    A multi-band analytical (MBA) algorithm is developed to retrieve absorption and backscattering coefficients for optically deep waters, which can be applied to data from past and current satellite sensors, as well as data from hyperspectral sensors. This MBA algorithm applies a remote-sensing reflectance model derived from the Radiative Transfer Equation, and values of absorption and backscattering coefficients are analytically calculated from values of remote-sensing reflectance. There are only limited empirical relationships involved in the algorithm, which implies that this MBA algorithm could be applied to a wide dynamic range of waters. Applying the algorithm to a simulated non-"Case 1" data set, which has no relation to the development of the algorithm, the percentage error for the total absorption coefficient at 440 nm a (sub 440) is approximately 12% for a range of 0.012 - 2.1 per meter (approximately 6% for a (sub 440) less than approximately 0.3 per meter), while a traditional band-ratio approach returns a percentage error of approximately 30%. Applying it to a field data set ranging from 0.025 to 2.0 per meter, the result for a (sub 440) is very close to that using a full spectrum optimization technique (9.6% difference). Compared to the optimization approach, the MBA algorithm cuts the computation time dramatically with only a small sacrifice in accuracy, making it suitable for processing large data sets such as satellite images. Significant improvements over empirical algorithms have also been achieved in retrieving the optical properties of optically deep waters.
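    A brief sketch of the analytical core used by this family of algorithms: invert the quadratic reflectance model rrs = g0*u + g1*u^2 with u = bb/(a + bb), then solve for absorption given a backscattering estimate. The constants and single-band inputs are assumed typical values, not the paper's full multi-band procedure.

```python
import numpy as np

g0, g1 = 0.0949, 0.0794        # commonly quoted reflectance-model constants

def absorption_from_rrs(rrs, bb):
    # invert the quadratic rrs = g0*u + g1*u**2 for u, then a from u = bb/(a+bb)
    u = (-g0 + np.sqrt(g0**2 + 4.0 * g1 * rrs)) / (2.0 * g1)
    return bb * (1.0 - u) / u

rrs_440 = 0.004                # subsurface remote-sensing reflectance at 440 nm
bb_440 = 0.006                 # assumed total backscattering coefficient (1/m)
print("a(440) ~", round(absorption_from_rrs(rrs_440, bb_440), 4), "1/m")
```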

  20. The use of a MODIS band-ratio algorithm versus a new hybrid approach for estimating colored dissolved organic matter (CDOM)

    EPA Science Inventory

    Satellite remote sensing offers synoptic and frequent monitoring of optical water quality parameters, such as chlorophyll-a, turbidity, and colored dissolved organic matter (CDOM). While traditional satellite algorithms were developed for the open ocean, these algorithms often do...

  1. Development of mathematical techniques for the assimilation of remote sensing data into atmospheric models

    NASA Technical Reports Server (NTRS)

    Seinfeld, J. H. (Principal Investigator)

    1982-01-01

    The problem of the assimilation of remote sensing data into mathematical models of atmospheric pollutant species was investigated. The data assimilation problem is posed in terms of the matching of spatially integrated species burden measurements to the predicted three-dimensional concentration fields from atmospheric diffusion models. General conditions were derived for the reconstructability of atmospheric concentration distributions from data typical of remote sensing applications, and a computational algorithm (filter) for the processing of remote sensing data was developed.

  2. Development of mathematical techniques for the assimilation of remote sensing data into atmospheric models

    NASA Technical Reports Server (NTRS)

    Seinfeld, J. H. (Principal Investigator)

    1982-01-01

    The problem of the assimilation of remote sensing data into mathematical models of atmospheric pollutant species was investigated. The problem is posed in terms of the matching of spatially integrated species burden measurements to the predicted three dimensional concentration fields from atmospheric diffusion models. General conditions are derived for the "reconstructability' of atmospheric concentration distributions from data typical of remote sensing applications, and a computational algorithm (filter) for the processing of remote sensing data is developed.

  3. Research on optimal path planning algorithm of task-oriented optical remote sensing satellites

    NASA Astrophysics Data System (ADS)

    Liu, Yunhe; Xu, Shengli; Liu, Fengjing; Yuan, Jingpeng

    2015-08-01

    The GEO task-oriented optical remote sensing satellite is very suitable for long-term continuous monitoring and quick access to imaging. With the development of high-resolution optical payload technology and satellite attitude control technology, GEO optical remote sensing satellites will become an important development trend for aerospace remote sensing satellites in the near future. In this paper, we focus on the staring imaging characteristics of the GEO optical remote sensing satellite's plane array and its real-time earth observation mode, aim to satisfy user needs with minimum maneuver cost, and put forward an optimal path planning algorithm centered on the transformation from geographic coordinate space to the field-of-view plane, thereby reducing the burden on the control system. In this algorithm, a bounded irregular closed area on the ground is transformed, through coordinate transformation relations, into the reference plane of the satellite payload's field of view; the branch and bound method is then used to search for feasible solutions, cutting off infeasible solutions in the solution space with a pruning strategy and trimming suboptimal feasible solutions according to the optimization index until the globally optimal feasible solution is found. Simulation and visualization software testing results verified the feasibility and effectiveness of the strategy.

  4. Atmospheric Correction Algorithm for Hyperspectral Remote Sensing of Ocean Color from Space

    DTIC Science & Technology

    2000-02-20

    Existing atmospheric correction algorithms for multichannel remote sensing of ocean color from space were designed for retrieving water-leaving...atmospheric correction algorithm for hyperspectral remote sensing of ocean color with the near-future Coastal Ocean Imaging Spectrometer. The algorithm uses

  5. WinASEAN for remote sensing data analysis

    NASA Astrophysics Data System (ADS)

    Duong, Nguyen Dinh; Takeuchi, Shoji

    The image analysis system ASEAN (Advanced System for Environmental ANalysis with Remote Sensing Data) was designed and programmed by a software development group, ImaSOFr, Department of Remote Sensing Technology and GIS, Institute for Geography, National Centre for Natural Science and Technology of Vietnam, under technical cooperation with the Remote Sensing Technology Centre of Japan and financial support from the National Space Development Agency of Japan. ASEAN has been in continuous development since 1989, with different versions ranging from the simplest one for MS-DOS with standard VGA 320×200×256 colours, through versions supporting SpeedStar 1.0 and SpeedStar PRO 2.0 true colour graphics cards, up to the latest version, named WinASEAN, which is designed for the Windows 3.1 operating system. The most remarkable feature of WinASEAN is the use of algorithms that speed up the image analysis process, even on PC platforms. Today WinASEAN is continuously improved in cooperation with NASDA (National Space Development Agency of Japan) and RESTEC (Remote Sensing Technology Center of Japan) and released as public domain software for training, research and education through the Regional Remote Sensing Seminar on Tropical Eco-system Management, which is organised by NASDA and ESCAR. In this paper, the authors describe the functionality of WinASEAN and some of the relevant analysis algorithms, and discuss its possibilities for computer-assisted teaching and training in remote sensing.

  6. Geometry correction Algorithm for UAV Remote Sensing Image Based on Improved Neural Network

    NASA Astrophysics Data System (ADS)

    Liu, Ruian; Liu, Nan; Zeng, Beibei; Chen, Tingting; Yin, Ninghao

    2018-03-01

    Aiming at the disadvantages of current geometry correction algorithms for UAV remote sensing images, a new algorithm is proposed. An adaptive genetic algorithm (AGA) and an RBF neural network are introduced into this algorithm. Combined with the geometry correction principle for UAV remote sensing images, the AGA-RBF algorithm and its solving steps are presented in order to realize geometry correction for UAV remote sensing. The correction accuracy and operational efficiency are improved by optimizing the structure and connection weights of the RBF neural network separately with the AGA and LMS algorithms. Finally, experiments show that the AGA-RBF algorithm has the advantages of high correction accuracy, high running speed, and strong generalization ability.

  7. Advances in multi-sensor data fusion: algorithms and applications.

    PubMed

    Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying

    2009-01-01

    With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. The paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in main applications fields in remote sensing, including object identification, classification, change detection and maneuvering targets tracking, are described. Both advantages and limitations of those applications are then discussed. Recommendations are addressed, including: (1) Improvements of fusion algorithms; (2) Development of "algorithm fusion" methods; (3) Establishment of an automatic quality assessment scheme.

  8. Research on remote sensing image pixel attribute data acquisition method in AutoCAD

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoyang; Sun, Guangtong; Liu, Jun; Liu, Hui

    2013-07-01

    Remote sensing images are widely used in AutoCAD, but AutoCAD lacks remote sensing image processing functions. In this paper, ObjectARX is used as the secondary development tool, combined with the Image Engine SDK, to realize remote sensing image pixel attribute data acquisition in AutoCAD, which provides critical technical support for remote sensing image processing algorithms in the AutoCAD environment.

  9. A modified approach combining FNEA and watershed algorithms for segmenting remotely-sensed optical images

    NASA Astrophysics Data System (ADS)

    Liu, Likun

    2018-01-01

    In the field of remote sensing image processing, image segmentation is a preliminary step for later analysis, whether semi-automatic human interpretation or fully automatic machine recognition and learning. Since 2000, object-oriented remote sensing image processing methods have prevailed, the core of which is the Fractal Net Evolution Approach (FNEA) multi-scale segmentation algorithm. This paper focuses on the study and improvement of that algorithm: existing segmentation algorithms are analyzed and the watershed algorithm is selected as the optimum initialization. The algorithm is then modified by adjusting an area parameter and further combining it with a heterogeneity parameter. Several experiments are carried out to show that the modified FNEA algorithm produces better segmentation results than a traditional pixel-based method (an FCM algorithm based on neighborhood information) and the plain combination of FNEA and watershed.
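
    The FNEA-style merge decision is usually driven by the increase in heterogeneity that a candidate merge would cause. The sketch below is a minimal, hedged version of the area-weighted spectral part of that criterion (in the spirit of the multiresolution segmentation literature); the segment arrays, band weights and scale threshold are illustrative assumptions, and the paper's modified area and heterogeneity parameters are not reproduced.

```python
import numpy as np

def color_heterogeneity_increase(seg_a, seg_b, band_weights=None):
    """Area-weighted increase in spectral heterogeneity caused by merging two
    segments, in the spirit of the FNEA/multiresolution merge criterion.
    Each segment is an (n_pixels, n_bands) array of pixel values."""
    n_a, n_b = len(seg_a), len(seg_b)
    merged = np.vstack([seg_a, seg_b])
    if band_weights is None:
        band_weights = np.ones(seg_a.shape[1])
    h = 0.0
    for b, w in enumerate(band_weights):
        s_a, s_b = seg_a[:, b].std(), seg_b[:, b].std()
        s_m = merged[:, b].std()
        h += w * ((n_a + n_b) * s_m - (n_a * s_a + n_b * s_b))
    return h

# merge the pair only if the heterogeneity increase stays below a scale parameter
a = np.random.rand(40, 4)          # 40 pixels, 4 bands
b = np.random.rand(25, 4) + 0.3    # slightly brighter neighbouring segment
scale = 5.0                        # hypothetical scale parameter
print(color_heterogeneity_increase(a, b) < scale)
```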

  10. Study on additional carrier sensing for IEEE 802.15.4 wireless sensor networks.

    PubMed

    Lee, Bih-Hwang; Lai, Ruei-Lung; Wu, Huai-Kuei; Wong, Chi-Ming

    2010-01-01

    Wireless sensor networks based on the IEEE 802.15.4 standard are able to achieve low-power transmissions in low-rate, short-distance wireless personal area networks (WPANs). Slotted carrier sense multiple access with collision avoidance (CSMA/CA) is used as the contention mechanism. Sensor nodes perform a backoff process as soon as the clear channel assessment (CCA) detects a busy channel; in doing so they may neglect the implicit information of the failed CCA detection and cause redundant sensing. This blind backoff process in slotted CSMA/CA lowers channel utilization. This paper proposes an additional carrier sensing (ACS) algorithm based on IEEE 802.15.4 to enhance the carrier sensing mechanism of the original slotted CSMA/CA. An analytical Markov chain model is developed to evaluate the performance of the ACS algorithm. Both analytical and simulation results show that the proposed algorithm performs better than IEEE 802.15.4, significantly improving throughput, average medium access control (MAC) delay and the power consumption of CCA detection.

  11. Bundle block adjustment of large-scale remote sensing data with Block-based Sparse Matrix Compression combined with Preconditioned Conjugate Gradient

    NASA Astrophysics Data System (ADS)

    Zheng, Maoteng; Zhang, Yongjun; Zhou, Shunping; Zhu, Junfeng; Xiong, Xiaodong

    2016-07-01

    In recent years, new platforms and sensors in the photogrammetry, remote sensing and computer vision areas have become available, such as unmanned aircraft vehicles (UAVs), oblique camera systems, common digital cameras and even mobile phone cameras. Images collected by all of these sensors can be used as remote sensing data sources, and they yield large-scale datasets consisting of a great number of images. Bundle block adjustment of such large-scale data with the conventional algorithm is very time and memory consuming due to the very large normal matrix arising from the data volume. In this paper, an efficient Block-based Sparse Matrix Compression (BSMC) method combined with the Preconditioned Conjugate Gradient (PCG) algorithm is used to develop a stable and efficient bundle block adjustment system for large-scale remote sensing data. The main contribution of this work is the BSMC-based PCG algorithm, which is more efficient in time and memory than the traditional algorithm without compromising accuracy. In total, 8 real datasets are used to test the proposed method. Preliminary results show that the BSMC method can efficiently decrease the time and memory requirements of large-scale data.
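
    The preconditioned conjugate gradient step can be illustrated independently of the BSMC compression itself. The sketch below is a generic PCG solver with a Jacobi (diagonal) preconditioner applied to a synthetic sparse, symmetric positive definite matrix standing in for the normal matrix; the matrix, its size and the preconditioner choice are illustrative assumptions, not the paper's implementation. A diagonal preconditioner is the simplest choice that keeps memory linear in the number of unknowns, which is the constraint the paper is addressing.

```python
import numpy as np
from scipy.sparse import diags, random as sprandom

def pcg(A, b, M_inv, tol=1e-8, max_iter=500):
    """Preconditioned conjugate gradient for A x = b with A symmetric positive
    definite; M_inv applies the inverse of the preconditioner to a vector."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# toy sparse SPD "normal matrix": diagonally dominant random sparse matrix
n = 2000
R = sprandom(n, n, density=1e-3, format="csr")
A = R @ R.T + diags(np.full(n, 10.0))
b = np.random.rand(n)
d = A.diagonal()
x = pcg(A, b, M_inv=lambda r: r / d)   # Jacobi preconditioner
print(np.linalg.norm(A @ x - b))       # residual of the solved system
```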

  12. Sensor fusion approaches for EMI and GPR-based subsurface threat identification

    NASA Astrophysics Data System (ADS)

    Torrione, Peter; Morton, Kenneth, Jr.; Besaw, Lance E.

    2011-06-01

    Despite advances in both electromagnetic induction (EMI) and ground penetrating radar (GPR) sensing and related signal processing, neither sensor alone provides a perfect tool for detecting the myriad of possible buried objects that threaten the lives of Soldiers and civilians. However, while neither GPR nor EMI sensing alone can provide optimal detection across all target types, the two approaches are highly complementary. As a result, many landmine systems seek to make use of both sensing modalities simultaneously and fuse the results from both sensors to improve detection performance for targets with widely varying metal content and GPR responses. Despite this, little work has focused on large-scale comparisons of different approaches to sensor fusion and machine learning for combining data from these highly orthogonal phenomenologies. In this work we explore a wide array of pattern recognition techniques for algorithm development and sensor fusion. Results with the ARA Nemesis landmine detection system suggest that nonlinear and non-parametric classification algorithms provide significant performance benefits for single-sensor algorithm development, and that fusion of multiple algorithms can be performed satisfactorily using basic parametric approaches, such as logistic discriminant classification, for the targets under consideration in our data sets.
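
    A minimal, hedged example of the kind of parametric fusion stage mentioned above is a logistic model over per-sensor confidences. The synthetic EMI and GPR scores below are illustrative assumptions; the single-sensor detection algorithms and the Nemesis data are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
labels = rng.integers(0, 2, n)            # 0 = clutter, 1 = buried target (synthetic)
# hypothetical single-sensor confidences with different sensitivities per target
emi = labels * rng.normal(1.0, 0.8, n) + rng.normal(0, 0.8, n)
gpr = labels * rng.normal(0.7, 0.9, n) + rng.normal(0, 0.9, n)
X = np.column_stack([emi, gpr])

fusion = LogisticRegression().fit(X, labels)   # parametric (logistic) fusion stage
fused_conf = fusion.predict_proba(X)[:, 1]     # fused detection confidence per alarm
print(fusion.coef_, fused_conf[:5])
```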

  13. 3D-Web-GIS RFID location sensing system for construction objects.

    PubMed

    Ko, Chien-Ho

    2013-01-01

    Construction site managers could benefit from being able to visualize on-site construction objects. Radio frequency identification (RFID) technology has been shown to improve the efficiency of construction object management. The objective of this study is to develop a 3D-Web-GIS RFID location sensing system for construction objects. An RFID 3D location sensing algorithm combining Simulated Annealing (SA) and a gradient descent method is proposed to determine target object location. In the algorithm, SA is used to stabilize the search process and the gradient descent method is used to reduce errors. The locations of the analyzed objects are visualized using the 3D-Web-GIS system. A real construction site is used to validate the applicability of the proposed method, with results indicating that the proposed approach can provide faster, more accurate, and more stable 3D positioning results than other location sensing algorithms. The proposed system allows construction managers to better understand worksite status, thus enhancing managerial efficiency.
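
    The two-stage idea of a global stochastic search followed by local gradient refinement can be sketched for range-based 3D positioning as below; the reader positions, range noise, cooling schedule and step sizes are illustrative assumptions rather than the paper's formulation.

```python
import numpy as np

readers = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 5.]])  # assumed known reader positions
true_pos = np.array([4.0, 6.0, 1.5])
ranges = np.linalg.norm(readers - true_pos, axis=1) + np.random.normal(0, 0.05, 4)

def cost(p):
    """Sum of squared range residuals at candidate position p."""
    return np.sum((np.linalg.norm(readers - p, axis=1) - ranges) ** 2)

def grad(p):
    """Gradient of the cost with respect to p."""
    d = np.linalg.norm(readers - p, axis=1)
    return 2 * np.sum(((d - ranges) / d)[:, None] * (p - readers), axis=0)

# stage 1: simulated-annealing-style random search to stabilise the starting point
p, T = np.zeros(3), 5.0
for _ in range(2000):
    cand = p + np.random.normal(0, 0.5, 3)
    dE = cost(cand) - cost(p)
    if dE < 0 or np.random.rand() < np.exp(-dE / T):
        p = cand
    T *= 0.999

# stage 2: gradient descent to reduce the remaining error
for _ in range(500):
    p -= 0.05 * grad(p)

print(p, true_pos)
```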

  14. 3D-Web-GIS RFID Location Sensing System for Construction Objects

    PubMed Central

    2013-01-01

    Construction site managers could benefit from being able to visualize on-site construction objects. Radio frequency identification (RFID) technology has been shown to improve the efficiency of construction object management. The objective of this study is to develop a 3D-Web-GIS RFID location sensing system for construction objects. An RFID 3D location sensing algorithm combining Simulated Annealing (SA) and a gradient descent method is proposed to determine target object location. In the algorithm, SA is used to stabilize the search process and the gradient descent method is used to reduce errors. The locations of the analyzed objects are visualized using the 3D-Web-GIS system. A real construction site is used to validate the applicability of the proposed method, with results indicating that the proposed approach can provide faster, more accurate, and more stable 3D positioning results than other location sensing algorithms. The proposed system allows construction managers to better understand worksite status, thus enhancing managerial efficiency. PMID:23864821

  15. Novel Kalman Filter Algorithm for Statistical Monitoring of Extensive Landscapes with Synoptic Sensor Data

    PubMed Central

    Czaplewski, Raymond L.

    2015-01-01

    Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of study variables and auxiliary sensor variables. A National Forest Inventory (NFI) illustrates application within an official statistics program. Practical recommendations regarding remote sensing and statistical issues are offered. This algorithm has the potential to increase the value of synoptic sensor data for statistical monitoring of large geographic areas. PMID:26393588

  16. How similar are forest disturbance maps derived from different Landsat time series algorithms?

    Treesearch

    Warren B. Cohen; Sean P. Healey; Zhiqiang Yang; Stephen V. Stehman; C. Kenneth Brewer; Evan B. Brooks; Noel Gorelick; Chengquan Huang; M. Joseph Hughes; Robert E. Kennedy; Thomas R. Loveland; Gretchen G. Moisen; Todd A. Schroeder; James E. Vogelmann; Curtis E. Woodcock; Limin Yang; Zhe Zhu

    2017-01-01

    Disturbance is a critical ecological process in forested systems, and disturbance maps are important for understanding forest dynamics. Landsat data are a key remote sensing dataset for monitoring forest disturbance and there recently has been major growth in the development of disturbance mapping algorithms. Many of these algorithms take advantage of the high temporal...

  17. Spectrum sensing algorithm based on autocorrelation energy in cognitive radio networks

    NASA Astrophysics Data System (ADS)

    Ren, Shengwei; Zhang, Li; Zhang, Shibing

    2016-10-01

    Cognitive radio networks have wide applications in the smart home, personal communications and other wireless communications. Spectrum sensing is the main challenge in cognitive radio. This paper proposes a new spectrum sensing algorithm based on the autocorrelation energy of the received signal. By taking the autocorrelation energy of the received signal as the test statistic for spectrum sensing, the effect of channel noise on detection performance is reduced. Simulation results show that the algorithm is effective and performs well at low signal-to-noise ratio. Compared with the maximum generalized eigenvalue detection (MGED) algorithm, the function of covariance matrix based detection (FMD) algorithm and the autocorrelation-based detection (AD) algorithm, the proposed algorithm has a 2-11 dB advantage.
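
    A minimal, hedged reading of an autocorrelation-energy statistic is sketched below: the energy of a few non-zero autocorrelation lags of the received samples is used as the test statistic, so uncorrelated noise contributes little. The lag count, the synthetic primary-user tone and the SNR are illustrative assumptions; the paper's exact statistic and threshold are not reproduced.

```python
import numpy as np

def autocorr_energy(x, max_lag=8):
    """Test statistic: energy of the first few autocorrelation lags
    (lag 0 excluded, so pure white noise contributes little)."""
    x = x - x.mean()
    acf = np.array([np.vdot(x[:-k], x[k:]) / len(x) for k in range(1, max_lag + 1)])
    return np.sum(np.abs(acf) ** 2)

rng = np.random.default_rng(1)
n = 4096
noise = rng.normal(0, 1, n)
# hypothetical primary-user signal: a narrowband tone buried in noise (low SNR)
t = np.arange(n)
signal = 0.3 * np.cos(2 * np.pi * 0.05 * t) + rng.normal(0, 1, n)

print("noise only  :", autocorr_energy(noise))
print("signal+noise:", autocorr_energy(signal))   # noticeably larger statistic
```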

  18. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    PubMed Central

    Bradbury, Kyle; Saboo, Raghav; L. Johnson, Timothy; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; M. Collins, Leslie; G. Newell, Richard

    2016-01-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment. PMID:27922592

  19. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    NASA Astrophysics Data System (ADS)

    Bradbury, Kyle; Saboo, Raghav; L. Johnson, Timothy; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; M. Collins, Leslie; G. Newell, Richard

    2016-12-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.

  20. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification.

    PubMed

    Bradbury, Kyle; Saboo, Raghav; L Johnson, Timothy; Malof, Jordan M; Devarajan, Arjun; Zhang, Wuming; M Collins, Leslie; G Newell, Richard

    2016-12-06

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.

  1. Vision-based algorithms for near-host object detection and multilane sensing

    NASA Astrophysics Data System (ADS)

    Kenue, Surender K.

    1995-01-01

    Vision-based sensing can be used for lane sensing, adaptive cruise control, collision warning, and driver performance monitoring functions of intelligent vehicles. Current computer vision algorithms are not robust for handling multiple vehicles in highway scenarios. Several new algorithms are proposed for multi-lane sensing, near-host object detection, vehicle cut-in situations, and specifying regions of interest for object tracking. These algorithms were tested successfully on more than 6000 images taken from real-highway scenes under different daytime lighting conditions.

  2. Phase retrieval based wavefront sensing experimental implementation and wavefront sensing accuracy calibration

    NASA Astrophysics Data System (ADS)

    Mao, Heng; Wang, Xiao; Zhao, Dazun

    2009-05-01

    As a wavefront sensing (WFS) tool, the Baseline algorithm, an iterative-transform phase retrieval algorithm, estimates the phase distribution at the pupil from known PSFs at defocus planes. By using multiple phase diversities and appropriate phase unwrapping methods, the algorithm can achieve a reliable unique solution and high-dynamic-range phase measurement. In this paper, a Baseline-algorithm-based wavefront sensing experiment with a modified phase unwrapping step has been implemented, and corresponding graphical user interface (GUI) software is presented. The adaptability and repeatability of the Baseline algorithm are validated in experiments. Moreover, with reference to ZYGO interferometric results, the WFS accuracy of the algorithm is calibrated.
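
    For orientation, the sketch below is a single-plane Gerchberg-Saxton-style iterative-transform loop, offered only as a simplified stand-in for the Baseline algorithm (which additionally exploits multiple defocus diversities and phase unwrapping). The pupil shape, synthetic aberration and iteration count are illustrative assumptions.

```python
import numpy as np

def iterative_phase_retrieval(pupil_amp, psf_intensity, n_iter=200):
    """Minimal Gerchberg-Saxton-style loop: alternate between enforcing the
    known pupil amplitude and the measured focal-plane intensity."""
    focal_amp = np.sqrt(psf_intensity)
    field = pupil_amp * np.exp(1j * np.zeros_like(pupil_amp))
    for _ in range(n_iter):
        focal = np.fft.fft2(field)
        focal = focal_amp * np.exp(1j * np.angle(focal))   # impose measured modulus
        field = np.fft.ifft2(focal)
        field = pupil_amp * np.exp(1j * np.angle(field))   # impose pupil support
    return np.angle(field)

# synthetic test: circular pupil with a known smooth aberration
n = 128
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
pupil = (x**2 + y**2 <= 1).astype(float)
true_phase = 2.0 * (x**2 - y**2) * pupil
psf = np.abs(np.fft.fft2(pupil * np.exp(1j * true_phase)))**2
est = iterative_phase_retrieval(pupil, psf)
# residual phase error inside the pupil (single-plane GS may stagnate;
# the Baseline algorithm avoids this with defocus diversity)
print(np.std((est - true_phase)[pupil > 0]))
```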

  3. Elaborate analysis and design of filter-bank-based sensing for wideband cognitive radios

    NASA Astrophysics Data System (ADS)

    Maliatsos, Konstantinos; Adamis, Athanasios; Kanatas, Athanasios G.

    2014-12-01

    The successful operation of a cognitive radio system strongly depends on its ability to sense the radio environment. With the use of spectrum sensing algorithms, the cognitive radio is required to detect co-existing licensed primary transmissions and to protect them from interference. This paper focuses on filter-bank-based sensing and provides a solid theoretical background for the design of these detectors. Optimum detectors based on the Neyman-Pearson theorem are developed for uniform discrete Fourier transform (DFT) and modified DFT filter banks with root-Nyquist filters. The proposed sensing framework does not require frequency alignment between the filter bank of the sensor and the primary signal. Each wideband primary channel is spanned and monitored by several sensor subchannels that decompose it into narrowband signals. Filter-bank-based sensing is shown to be robust and efficient under coloured noise. Moreover, the performance of the weighted energy detector as a sensing technique is evaluated. Finally, based on the Locally Most Powerful and Generalized Likelihood Ratio tests, practical sensing algorithms that do not require a priori knowledge are proposed and tested.

  4. Development of a pneumatic high-angle-of-attack Flush Airdata Sensing (HI-FADS) system

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Moes, Timothy R.; Leondes, Cornelius T.

    1992-01-01

    The HI-FADS system design is an evolution of the FADS systems (e.g., Larson et al., 1980, 1987), which emphasizes the entire airdata system development. This paper describes the HI-FADS measurement system, with particular consideration given to the basic measurement hardware and the development of the HI-FADS aerodynamic model and the basic nonlinear regression algorithm. Algorithm initialization techniques are developed, and potential algorithm divergence problems are discussed. Data derived from HI-FADS flight tests are used to demonstrate the system accuracies and to illustrate the developed concepts and methods.

  5. MERIS Retrieval of Water Quality Components in the Turbid Albemarle-Pamlico Sound Estuary, USA

    EPA Science Inventory

    Two remote-sensing optical algorithms for the retrieval of the water quality components (WQCs) in the Albemarle-Pamlico Estuarine System (APES) have been developed and validated for chlorophyll a (Chl) concentration. Both algorithms are semiempirical because they incorporate some...

  6. A Two-Stage Reconstruction Processor for Human Detection in Compressive Sensing CMOS Radar.

    PubMed

    Tsao, Kuei-Chi; Lee, Ling; Chu, Ta-Shun; Huang, Yuan-Hao

    2018-04-05

    Complementary metal-oxide-semiconductor (CMOS) radar has recently attracted much research attention because small and low-power CMOS devices are very suitable for deploying sensing nodes in a low-power wireless sensing system. This study focuses on the signal processing of a wireless CMOS impulse radar system that can detect humans and objects in a home-care internet-of-things sensing system. The challenges of low-power CMOS radar systems are the weakness of human signals and the high computational complexity of the target detection algorithm. A compressive sensing-based detection algorithm can relax the computational costs by avoiding matched filters and reducing the analog-to-digital converter bandwidth requirement. Orthogonal matching pursuit (OMP) is one of the popular signal reconstruction algorithms for compressive sensing radar; however, its complexity is still very high because the high resolution of human respiration leads to high-dimensional signal reconstruction. This paper therefore proposes a two-stage reconstruction algorithm for compressive sensing radar. The proposed algorithm not only has 75% lower complexity than the OMP algorithm but also achieves better positioning performance, especially in noisy environments. The algorithm was also designed and implemented on a Virtex-7 FPGA chip (Xilinx, San Jose, CA, USA). The proposed reconstruction processor supports a 256 × 13 real-time radar image display with a throughput of 28.2 frames per second.
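
    For context on the complexity that the two-stage processor is designed to reduce, the sketch below is a plain OMP reconstruction on a synthetic sparse scene; the sensing matrix, sparsity level and problem dimensions are illustrative assumptions, and the proposed two-stage algorithm itself is not reproduced.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select k atoms of A that best
    explain y, re-solving a least-squares fit over the support at each step."""
    residual, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = np.argmax(np.abs(A.T @ residual))          # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(2)
m, n, k = 64, 256, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)               # random sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true
x_hat = omp(A, y, k)
print(np.linalg.norm(x_hat - x_true))                  # reconstruction error
```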

  7. Deriving hourly evapotranspiration (ET) rates with SEBS: A lysimetric evaluation

    USDA-ARS?s Scientific Manuscript database

    Numerous energy balance (EB) algorithms have been developed to use remote sensing data for mapping evapotranspiration (ET) on a regional basis. Adopting any single or combination of these models for an operational ET remote sensing program requires a thorough evaluation. The Surface Energy Balance S...

  8. Co-design of software and hardware to implement remote sensing algorithms

    NASA Astrophysics Data System (ADS)

    Theiler, James P.; Frigo, Janette R.; Gokhale, Maya; Szymanski, John J.

    2002-01-01

    Both for offline searches through large data archives and for onboard computation at the sensor head, there is a growing need for ever-more rapid processing of remote sensing data. For many algorithms of use in remote sensing, the bulk of the processing takes place in an ``inner loop'' with a large number of simple operations. For these algorithms, dramatic speedups can often be obtained with specialized hardware. The difficulty and expense of digital design continues to limit applicability of this approach, but the development of new design tools is making this approach more feasible, and some notable successes have been reported. On the other hand, it is often the case that processing can also be accelerated by adopting a more sophisticated algorithm design. Unfortunately, a more sophisticated algorithm is much harder to implement in hardware, so these approaches are often at odds with each other. With careful planning, however, it is sometimes possible to combine software and hardware design in such a way that each complements the other, and the final implementation achieves speedup that would not have been possible with a hardware-only or a software-only solution. We will in particular discuss the co-design of software and hardware to achieve substantial speedup of algorithms for multispectral image segmentation and for endmember identification.

  9. Design of Restoration Method Based on Compressed Sensing and TwIST Algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Fei; Piao, Yan

    2018-04-01

    In order to effectively improve the subjective and objective quality of degraded images at low sampling rates, while saving storage space and reducing computational complexity, this paper proposes a joint restoration algorithm combining compressed sensing with two-step iterative shrinkage/thresholding (TwIST). The algorithm applies the TwIST algorithm, originally used in image restoration, within the compressed sensing framework. A small amount of sparse high-frequency information is first obtained in the frequency domain, and the TwIST algorithm based on compressed sensing theory is then used to accurately reconstruct the high-frequency image. The experimental results show that the proposed algorithm achieves better subjective visual effects and objective quality when restoring degraded images.
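
    As a hedged point of reference, the sketch below implements plain iterative shrinkage-thresholding (ISTA) for the l1-regularized least-squares problem; TwIST replaces this single-step update with a two-step recursion over the two previous iterates, which is not reproduced here. The sensing matrix, regularization weight and noise level are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Soft-thresholding (shrinkage) operator used by IST-type algorithms."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam=0.05, n_iter=500):
    """Plain iterative shrinkage-thresholding for min 0.5||Ax-y||^2 + lam*||x||_1.
    TwIST accelerates this by mixing the two previous iterates in each update."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

rng = np.random.default_rng(3)
A = rng.normal(size=(80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[rng.choice(200, 8, replace=False)] = 1.0
y = A @ x_true + 0.01 * rng.normal(size=80)
print(np.linalg.norm(ista(A, y) - x_true))   # sparse recovery error
```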

  10. Development of a generalized multi-pixel and multi-parameter satellite remote sensing algorithm for aerosol properties

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Nakajima, T.; Takenaka, H.; Higurashi, A.

    2013-12-01

    We develop a new satellite remote sensing algorithm to retrieve the properties of aerosol particles in the atmosphere. In recent years, high-resolution, multi-wavelength and multiple-angle observation data have been obtained by ground-based spectral radiometers and imaging sensors on board satellites. With this development, optimized multi-parameter remote sensing methods based on Bayesian theory have become popular (Turchin and Nozik, 1969; Rodgers, 2000; Dubovik et al., 2000). Additionally, direct use of radiative transfer calculation has been employed for non-linear remote sensing problems in place of look-up-table methods, supported by the progress of computing technology (Dubovik et al., 2011; Yoshida et al., 2011). We are developing a flexible multi-pixel and multi-parameter remote sensing algorithm for aerosol optical properties. In this algorithm, the inversion method is a combination of the MAP method (maximum a posteriori method, Rodgers, 2000) and the Phillips-Twomey method (Phillips, 1962; Twomey, 1963) as a smoothing constraint on the state vector. Furthermore, we include a radiative transfer code, Rstar (Nakajima and Tanaka, 1986, 1988), solved numerically at each iteration of the solution search. The Rstar code has been directly used in the AERONET operational processing system (Dubovik and King, 2000). The retrieved parameters in our algorithm are aerosol optical properties, such as the aerosol optical thickness (AOT) of fine-mode, sea salt and dust particles, the soot volume fraction in fine-mode particles, and the ground surface albedo at each observed wavelength. We simultaneously retrieve all the parameters that characterize the pixels in each of the horizontal sub-domains constituting the target area, and then successively apply the retrieval method to all the sub-domains in the target area. We conducted numerical tests of the retrieval of aerosol properties and ground surface albedo for GOSAT/CAI imager data to test the algorithm over land. In this test, we simulated satellite-observed radiances for a sub-domain consisting of 5 by 5 pixels with the Rstar code, assuming wavelengths of 380, 674, 870 and 1600 nm, the US standard atmosphere, and several aerosol and ground surface conditions. The results showed that the AOTs of fine-mode and dust particles, the soot fraction and the ground surface albedo at 674 nm are retrieved within absolute differences of 0.04, 0.01, 0.06 and 0.006 from the true values, respectively, for the case of a dark surface, and within 0.06, 0.03, 0.04 and 0.10, respectively, for the case of a bright surface. We will conduct more tests to study the information content of the parameters needed for aerosol and land surface remote sensing with different boundary conditions among sub-domains.
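
    The Phillips-Twomey smoothing constraint mentioned above can be illustrated in isolation with a generic second-difference-regularized least-squares inversion, as in the hedged sketch below; the toy forward kernel, state vector and regularization weight are illustrative assumptions, and the full MAP retrieval with the Rstar radiative transfer code is not reproduced.

```python
import numpy as np

def phillips_twomey(K, y, gamma=1e-2):
    """Least-squares inversion of y = K x with a second-difference smoothness
    constraint on the state vector x (Phillips-Twomey regularisation)."""
    n = K.shape[1]
    D = np.diff(np.eye(n), n=2, axis=0)            # second-difference operator
    return np.linalg.solve(K.T @ K + gamma * D.T @ D, K.T @ y)

# toy forward model: a smoothing kernel observing a smooth state vector
n = 50
x_true = np.exp(-0.5 * ((np.arange(n) - 25) / 6.0) ** 2)
K = np.array([[np.exp(-abs(i - j) / 3.0) for j in range(n)] for i in range(n)])
y = K @ x_true + 0.01 * np.random.randn(n)
x_hat = phillips_twomey(K, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # relative error
```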

  11. Experimental scheme and restoration algorithm of block compression sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Linxia; Zhou, Qun; Ke, Jun

    2018-01-01

    Compressed sensing (CS) exploits the sparseness of a target to obtain its image with much less data than required by the Nyquist sampling theorem. In this paper, we study the hardware implementation of a block compressed sensing system and its reconstruction algorithms. Different block sizes are used, and two algorithms, orthogonal matching pursuit (OMP) and total variation minimization (TV), are used to obtain good reconstructions. The influence of block size on reconstruction is also discussed.

  12. Predicting fruit fly's sensing rate with insect flight simulations.

    PubMed

    Chang, Song; Wang, Z Jane

    2014-08-05

    Without sensory feedback, flies cannot fly. Exactly how various feedback controls work in insects is a complex puzzle to solve. What do insects measure to stabilize their flight? How often and how fast must insects adjust their wings to remain stable? To gain insights into algorithms used by insects to control their dynamic instability, we develop a simulation tool to study free flight. To stabilize flight, we construct a control algorithm that modulates wing motion based on discrete measurements of the body-pitch orientation. Our simulations give theoretical bounds on both the sensing rate and the delay time between sensing and actuation. Interpreting our findings together with experimental results on fruit flies' reaction time and sensory motor reflexes, we conjecture that fruit flies sense their kinematic states every wing beat to stabilize their flight. We further propose a candidate for such a control involving the fly's haltere and first basalar motor neuron. Although we focus on fruit flies as a case study, the framework for our simulation and discrete control algorithms is applicable to studies of both natural and man-made fliers.

  13. A Software Architecture for Adaptive Modular Sensing Systems

    PubMed Central

    Lyle, Andrew C.; Naish, Michael D.

    2010-01-01

    By combining a number of simple transducer modules, an arbitrarily complex sensing system may be produced to accommodate a wide range of applications. This work outlines a novel software architecture and knowledge representation scheme that has been developed to support this type of flexible and reconfigurable modular sensing system. Template algorithms are used to embed intelligence within each module. As modules are added or removed, the composite sensor is able to automatically determine its overall geometry and assume an appropriate collective identity. A virtual machine-based middleware layer runs on top of a real-time operating system with a pre-emptive kernel, enabling platform-independent template algorithms to be written once and run on any module, irrespective of its underlying hardware architecture. Applications that may benefit from easily reconfigurable modular sensing systems include flexible inspection, mobile robotics, surveillance, and space exploration. PMID:22163614

  14. A software architecture for adaptive modular sensing systems.

    PubMed

    Lyle, Andrew C; Naish, Michael D

    2010-01-01

    By combining a number of simple transducer modules, an arbitrarily complex sensing system may be produced to accommodate a wide range of applications. This work outlines a novel software architecture and knowledge representation scheme that has been developed to support this type of flexible and reconfigurable modular sensing system. Template algorithms are used to embed intelligence within each module. As modules are added or removed, the composite sensor is able to automatically determine its overall geometry and assume an appropriate collective identity. A virtual machine-based middleware layer runs on top of a real-time operating system with a pre-emptive kernel, enabling platform-independent template algorithms to be written once and run on any module, irrespective of its underlying hardware architecture. Applications that may benefit from easily reconfigurable modular sensing systems include flexible inspection, mobile robotics, surveillance, and space exploration.

  15. Developing Remote Sensing Products for Monitoring and Modeling Great Lakes Coastal Wetland Vulnerability to Climate Change and Land Use

    NASA Astrophysics Data System (ADS)

    Bourgeau-Chavez, L. L.; Miller, M. E.; Battaglia, M.; Banda, E.; Endres, S.; Currie, W. S.; Elgersma, K. J.; French, N. H. F.; Goldberg, D. E.; Hyndman, D. W.

    2014-12-01

    Spread of invasive plant species in the coastal wetlands of the Great Lakes is degrading wetland habitat, decreasing biodiversity, and decreasing ecosystem services. An understanding of the mechanisms of invasion is crucial to gaining control of this growing threat. To better understand the effects of land use and climatic drivers on the vulnerability of coastal zones to invasion, as well as to develop an understanding of the mechanisms of invasion, research is being conducted that integrates field studies, process-based ecosystem and hydrological models, and remote sensing. Spatial data from remote sensing are needed to parameterize the hydrological model and to test the outputs of the linked models. We will present several new remote sensing products that provide important physiological, biochemical, and landscape information to parameterize and verify the models. These include a novel hybrid radar-optical technique to delineate stands of invasives as well as natural wetland cover types, the use of radar to map seasonally inundated areas that are not hydrologically connected, and new algorithms to estimate leaf area index (LAI) using Landsat. A coastal map delineating wetland types, including monocultures of the invaders (Typha spp. and Phragmites australis), was created using satellite radar (ALOS PALSAR, 20 m resolution) and optical data (Landsat 5, 30 m resolution) fusion from multiple dates in a Random Forests classifier. These maps provide verification of the integrated model, showing areas at high risk of invasion. For parameterizing the hydrological model, maps of seasonal wetness are being developed by differencing spring (wet) imagery with summer (dry) imagery to detect seasonally wet areas. Finally, development of high-resolution LAI remote sensing algorithms for uplands and wetlands is underway; LAI algorithms for wetlands have not previously been developed because of the difficulty posed by a water background. These products are being used to improve the hydrological model through higher-resolution products and the parameterization of variables that have previously been largely unknown.

  16. Surface Energy Balance System for Estimating Daily Evapotranspiration Rates in the Texas High Plains

    USDA-ARS?s Scientific Manuscript database

    Numerous energy balance (EB) algorithms have been developed to use remote sensing data for mapping evapotranspiration (ET) on a regional basis. Adopting any single or a combination of these models for an operational ET remote sensing program requires thorough evaluation. The Surface Energy Balance S...

  17. Determination of phytoplankton chlorophyll concentrations in the Chesapeake Bay with aircraft remote sensing

    NASA Technical Reports Server (NTRS)

    Harding, Lawrence W., Jr.; Itsweire, Eric C.; Esaias, Wayne E.

    1992-01-01

    Remote sensing measurements of the distribution of phytoplankton chlorophyll concentrations in Chesapeake Bay during 1989 are described. It is shown that remote sensing from light aircraft can complement and extend measurements made from traditional platforms and provide data of improved temporal and spatial resolution, leading to a better understanding of phytoplankton dynamics in the estuary. The developments of the winter-spring diatom bloom in the polyhaline to mesohaline regions of the estuary and of the late-spring and summer dinoflagellate blooms in oligohaline and mesohaline regions are traced. The study presents the local chlorophyll algorithm developed using the NASA Ocean Data Acquisition System data and in situ chlorophyll data, interpolated maps of chlorophyll concentration generated by applying the algorithm to aircraft radiance data, ancillary in situ data on nutrients, turbidity, streamflow, and light availability, and an interpretation of phytoplankton dynamics in terms of the chlorophyll distribution in Chesapeake Bay during 1989.

  18. BOREAS RSS-7 Regional LAI and FPAR Images From 10-Day AVHRR-LAC Composites

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Chen, Jing; Cihlar, Josef

    2000-01-01

    The BOReal Ecosystem-Atmosphere Study Remote Sensing Science (BOREAS RSS-7) team collected various data sets to develop and validate an algorithm to allow the retrieval of the spatial distribution of Leaf Area Index (LAI) from remotely sensed images. Advanced Very High Resolution Radiometer (AVHRR) level-4c 10-day composite Normalized Difference Vegetation Index (NDVI) images produced at CCRS were used to produce images of LAI and the Fraction of Photosynthetically Active Radiation (FPAR) absorbed by plant canopies for the three summer IFCs in 1994 across the BOREAS region. The algorithms were developed based on ground measurements and Landsat Thematic Mapper (TM) images. The data are stored in binary image format files.

  19. Assessing the effectiveness of Landsat 8 chlorophyll a retrieval algorithms for regional freshwater monitoring.

    PubMed

    Boucher, Jonah; Weathers, Kathleen C; Norouzi, Hamid; Steele, Bethel

    2018-06-01

    Predicting algal blooms has become a priority for scientists, municipalities, businesses, and citizens. Remote sensing offers solutions to the spatial and temporal challenges facing existing lake research and monitoring programs that rely primarily on high-investment, in situ measurements. Techniques to remotely measure chlorophyll a (chl a) as a proxy for algal biomass have been limited to specific large water bodies in particular seasons and narrow chl a ranges. Thus, a first step toward prediction of algal blooms is generating regionally robust algorithms using in situ and remote sensing data. This study explores the relationship between in-lake measured chl a data from Maine and New Hampshire, USA lakes and remotely sensed chl a retrieval algorithm outputs. Landsat 8 images were obtained and then processed after the required atmospheric and radiometric corrections. Six previously developed algorithms were tested at a regional scale on 11 scenes from 2013 to 2015 covering 192 lakes. The best performing algorithm across data from both states had a correlation coefficient (R²) of 0.16 with P ≤ 0.05 when Landsat 8 images were acquired within 5 d of sampling, improving to an R² of 0.25 when data from Maine only were used. The strength of the correlation varied with the specificity of the time window relative to the in situ sampling date, explaining up to 27% of the variation in the data across several scenes. Two previously published algorithms using Landsat 8's Bands 1-4 were best correlated with chl a, and for particular late-summer scenes they accounted for up to 69% of the variation in in situ measurements. A sensitivity analysis revealed that a longer time difference between in situ measurements and the satellite image increased uncertainty in the models, and an effect of the time of year on several indices was demonstrated. A regional model based on the best performing remote sensing algorithm was developed and validated using independent in situ measurements and satellite images. These results suggest that, despite challenges including seasonal effects and low chl a thresholds, remote sensing could be an effective and accessible regional-scale tool for chl a monitoring programs in lakes. © 2018 The Authors. Ecological Applications published by Wiley Periodicals, Inc. on behalf of Ecological Society of America.

  20. A Two-Stage Reconstruction Processor for Human Detection in Compressive Sensing CMOS Radar

    PubMed Central

    Tsao, Kuei-Chi; Lee, Ling; Chu, Ta-Shun

    2018-01-01

    Complementary metal-oxide-semiconductor (CMOS) radar has recently attracted much research attention because small and low-power CMOS devices are very suitable for deploying sensing nodes in a low-power wireless sensing system. This study focuses on the signal processing of a wireless CMOS impulse radar system that can detect humans and objects in a home-care internet-of-things sensing system. The challenges of low-power CMOS radar systems are the weakness of human signals and the high computational complexity of the target detection algorithm. A compressive sensing-based detection algorithm can relax the computational costs by avoiding matched filters and reducing the analog-to-digital converter bandwidth requirement. Orthogonal matching pursuit (OMP) is one of the popular signal reconstruction algorithms for compressive sensing radar; however, its complexity is still very high because the high resolution of human respiration leads to high-dimensional signal reconstruction. This paper therefore proposes a two-stage reconstruction algorithm for compressive sensing radar. The proposed algorithm not only has 75% lower complexity than the OMP algorithm but also achieves better positioning performance, especially in noisy environments. The algorithm was also designed and implemented on a Virtex-7 FPGA chip (Xilinx, San Jose, CA, USA). The proposed reconstruction processor supports a 256×13 real-time radar image display with a throughput of 28.2 frames per second. PMID:29621170

  1. Cognitive Nonlinear Radar

    DTIC Science & Technology

    2013-01-01

    intelligently selecting waveform parameters using adaptive algorithms. The adaptive algorithms optimize the waveform parameters based on (1) the EM... the environment. Subject terms: cognitive radar, adaptive sensing, spectrum sensing, multi-objective optimization, genetic algorithms, machine... [Figures: detection and classification block diagram; genetic algorithm block diagram.]

  2. Estimating dissolved organic carbon concentration in turbid coastal waters using optical remote sensing observations

    NASA Astrophysics Data System (ADS)

    Cherukuru, Nagur; Ford, Phillip W.; Matear, Richard J.; Oubelkheir, Kadija; Clementson, Lesley A.; Suber, Ken; Steven, Andrew D. L.

    2016-10-01

    Dissolved Organic Carbon (DOC) is an important component in the global carbon cycle. It also plays an important role in influencing the coastal ocean biogeochemical (BGC) cycles and light environment. Studies focussing on DOC dynamics in coastal waters are data constrained due to the high costs associated with in situ water sampling campaigns. Satellite optical remote sensing has the potential to provide continuous, cost-effective DOC estimates. In this study we used a bio-optics dataset collected in turbid coastal waters of Moreton Bay (MB), Australia, during 2011 to develop a remote sensing algorithm to estimate DOC. This dataset includes data from flood and non-flood conditions. In MB, DOC concentration varied over a wide range (20-520 μM C) and had a good correlation (R2 = 0.78) with absorption due to coloured dissolved organic matter (CDOM) and remote sensing reflectance. Using this data set we developed an empirical algorithm to derive DOC concentrations from the ratio of Rrs(412)/Rrs(488) and tested it with independent datasets. In this study, we demonstrate the ability to estimate DOC using remotely sensed optical observations in turbid coastal waters.
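
    A hedged sketch of how such an empirical band-ratio algorithm can be built is given below: log-transformed DOC is regressed against the Rrs(412)/Rrs(488) ratio. The synthetic matchup data and fitted coefficients are illustrative assumptions and do not reproduce the Moreton Bay algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)
# hypothetical matchup data: band ratio and coincident in situ DOC (uM C)
ratio = rng.uniform(0.3, 1.5, 60)                   # Rrs(412)/Rrs(488)
doc = 40 + 250 * np.exp(-1.8 * ratio) * (1 + 0.1 * rng.normal(size=60))

# empirical algorithm: fit log(DOC) as a linear function of the band ratio
a, b = np.polyfit(ratio, np.log(doc), 1)

def doc_from_rrs(rrs412, rrs488):
    """Apply the fitted empirical relation to a reflectance pair."""
    return np.exp(a * (rrs412 / rrs488) + b)

print(doc_from_rrs(0.008, 0.010))                   # DOC estimate for one pixel
```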

  3. [An operational remote sensing algorithm of land surface evapotranspiration based on NOAA PAL dataset].

    PubMed

    Hou, Ying-Yu; He, Yan-Bo; Wang, Jian-Lin; Tian, Guo-Liang

    2009-10-01

    Based on the time-series 10-day composite NOAA Pathfinder AVHRR Land (PAL) dataset (8 km x 8 km), and by using the land surface energy balance equation and the "VI-Ts" (vegetation index-land surface temperature) method, a new algorithm for land surface evapotranspiration (ET) was constructed. This new algorithm does not need support from meteorological observation data; all of its parameters and variables are directly inverted or derived from remote sensing data. A widely accepted remote sensing ET model, the SEBS model, was chosen to validate the new algorithm. The validation test showed that the ET and its seasonal variation trend estimated by the SEBS model and the new algorithm accord well, suggesting that the ET estimated by the new algorithm is reliable and able to reflect the actual land surface ET. The new remote sensing ET algorithm is practical and operational, and offers a new approach to studying the spatiotemporal variation of ET at continental and global scales based on long time series of satellite remote sensing images.

  4. Digital Oblique Remote Ionospheric Sensing (DORIS) Program Development

    DTIC Science & Technology

    1992-04-01

    waveforms. A new autoscaling technique for oblique ionograms, compatible with the ARTIST software (Reinisch and Huang, 1983; Gamache et al., 1985), which is... The development and performance of a complete oblique ionogram autoscaling and inversion algorithm is presented. The inversion algorithm uses a three... OTH radar. Subject terms: oblique propagation; oblique ionogram autoscaling; electron density profile inversion; simulated...

  5. On the Satisfaction of Modulus and Ambiguity Function Constraints in Radar Waveform Optimization for Detection

    DTIC Science & Technology

    2010-06-01

    sense that the two waveforms are as close as possible in a Euclidean sense. Li et al. [33] later devised an algorithm that provides the optimal waveform...respectively), and the SWORD algorithm in [33]. These algorithms were designed for the problem of detecting a known signal in the presence of wide-sense... sensing, astronomy, crystallography, signal processing, and image processing. (See references in the works cited below for examples.) In the general

  6. Rapid Change Detection Algorithm for Disaster Management

    NASA Astrophysics Data System (ADS)

    Michel, U.; Thunig, H.; Ehlers, M.; Reinartz, P.

    2012-07-01

    This paper focuses on change detection applications in areas where catastrophic events have taken place, resulting in rapid destruction, especially of man-made objects. Standard methods for automated change detection prove not to be sufficient; therefore a new method was developed and tested. The presented method allows fast detection and visualization of change in areas of crisis or catastrophe. New remote sensing methods are often developed without user-oriented aspects in mind, so organizations and authorities are unable to use them for lack of remote sensing know-how; a semi-automated procedure was therefore developed. Within a transferable framework, the developed algorithm can be applied to a set of remote sensing data across different investigation areas. Several case studies form the basis of the retrieved results. Through a coarse division into statistical parts and segmentation into meaningful objects, the framework is able to deal with different types of change. By means of an elaborated Temporal Change Index (TCI), only panchromatic datasets are used to extract areas which are destroyed, areas which were not affected, and areas where rebuilding has already started.
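
    One plausible, hedged reading of a panchromatic change index is a per-pixel normalized difference between co-registered pre- and post-event acquisitions, as sketched below; the synthetic scenes and the decision threshold are illustrative assumptions, and the paper's elaborated TCI is not reproduced.

```python
import numpy as np

def temporal_change_index(pre, post, eps=1e-6):
    """Per-pixel normalised difference between two co-registered panchromatic
    acquisitions; large magnitudes flag candidate change pixels."""
    pre = pre.astype(float)
    post = post.astype(float)
    return (post - pre) / (post + pre + eps)

rng = np.random.default_rng(5)
pre = rng.uniform(80, 200, (100, 100))      # synthetic pre-event panchromatic scene
post = pre.copy()
post[40:60, 40:60] *= 0.3                   # simulated destroyed block (darker rubble)
tci = temporal_change_index(pre, post)
changed = np.abs(tci) > 0.3                 # threshold into changed / unaffected classes
print(changed.sum(), "pixels flagged as changed")
```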

  7. Reliable fusion of control and sensing in intelligent machines. Thesis

    NASA Technical Reports Server (NTRS)

    Mcinroy, John E.

    1991-01-01

    Although robotics research has produced a wealth of sophisticated control and sensing algorithms, very little research has been aimed at reliably combining these control and sensing strategies so that a specific task can be executed. To improve the reliability of robotic systems, analytic techniques are developed for calculating the probability that a particular combination of control and sensing algorithms will satisfy the required specifications. The probability can then be used to assess the reliability of the design. An entropy formulation is first used to quickly eliminate designs not capable of meeting the specifications. Next, a framework for analyzing reliability based on the first order second moment methods of structural engineering is proposed. To ensure performance over an interval of time, lower bounds on the reliability of meeting a set of quadratic specifications with a Gaussian discrete time invariant control system are derived. A case study analyzing visual positioning in robotic system is considered. The reliability of meeting timing and positioning specifications in the presence of camera pixel truncation, forward and inverse kinematic errors, and Gaussian joint measurement noise is determined. This information is used to select a visual sensing strategy, a kinematic algorithm, and a discrete compensator capable of accomplishing the desired task. Simulation results using PUMA 560 kinematic and dynamic characteristics are presented.

  8. A Technique for Remote Sensing of Suspended Sediments and Shallow Coastal Waters Using MODIS Visible and Near-IR Channels

    NASA Technical Reports Server (NTRS)

    Li, Rong-Rong; Kaufman, Yoram J.

    2002-01-01

    We have developed an algorithm to detect suspended sediments and shallow coastal waters using imaging data acquired with the Moderate Resolution Imaging SpectroRadiometer (MODIS). The MODIS instruments on board the NASA Terra and Aqua Spacecrafts are equipped with one set of narrow channels located in a wide 0.4 - 2.5 micron spectral range. These channels were designed primarily for remote sensing of the land surface and atmosphere. We have found that the set of land and cloud channels are also quite useful for remote sensing of the bright coastal waters. We have developed an empirical algorithm, which uses the narrow MODIS channels in this wide spectral range, for identifying areas with suspended sediments in turbid waters and shallow waters with bottom reflections. In our algorithm, we take advantage of the strong water absorption at wavelengths longer than 1 micron that does not allow illumination of sediments in the water or a shallow ocean floor. MODIS data acquired over the east coast of China, west coast of Africa, Arabian Sea, Mississippi Delta, and west coast of Florida are used in this study.

  9. A Technique For Remote Sensing Of Suspended Sediments And Shallow Coastal Waters Using MODIS Visible and Near-IR Channels

    NASA Astrophysics Data System (ADS)

    Li, R.; Kaufman, Y.

    2002-12-01

    We have developed an algorithm to detect suspended sediments and shallow coastal waters using imaging data acquired with the Moderate Resolution Imaging SpectroRadiometer (MODIS). The MODIS instruments on board the NASA Terra and Aqua Spacecrafts are equipped with one set of narrow channels located in a wide 0.4 - 2.5 micron spectral range. These channels were designed primarily for remote sensing of the land surface and atmosphere. We have found that the set of land and cloud channels are also quite useful for remote sensing of the bright coastal waters. We have developed an empirical algorithm, which uses the narrow MODIS channels in this wide spectral range, for identifying areas with suspended sediments in turbid waters and shallow waters with bottom reflections. In our algorithm, we take advantage of the strong water absorption at wavelengths longer than 1 micron that does not allow illumination of sediments in the water or a shallow ocean floor. MODIS data acquired over the east coast of China, west coast of Africa, Arabian Sea, Mississippi Delta, and west coast of Florida are used in this study.

  10. Development of a remote sensing algorithm to retrieve atmospheric aerosol properties using multiwavelength and multipixel information

    NASA Astrophysics Data System (ADS)

    Hashimoto, Makiko; Nakajima, Teruyuki

    2017-06-01

    We developed a satellite remote sensing algorithm to retrieve the aerosol optical properties using satellite-received radiances for multiple wavelengths and pixels. Our algorithm utilizes spatial inhomogeneity of surface reflectance to retrieve aerosol properties, and the main target is urban aerosols. This algorithm can simultaneously retrieve aerosol optical thicknesses (AOT) for fine- and coarse-mode aerosols, soot volume fraction in fine-mode aerosols (SF), and surface reflectance over heterogeneous surfaces such as urban areas that are difficult to obtain by conventional pixel-by-pixel methods. We applied this algorithm to radiances measured by the Greenhouse Gases Observing Satellite/Thermal and Near Infrared Sensor for Carbon Observations-Cloud and Aerosol Image (GOSAT/TANSO-CAI) at four wavelengths and were able to retrieve the aerosol parameters in several urban regions and other surface types. A comparison of the retrieved AOTs with those from the Aerosol Robotic Network (AERONET) indicated retrieval accuracy within ±0.077 on average. It was also found that the column-averaged SF and the aerosol single scattering albedo (SSA) underwent seasonal changes as consistent with the ground surface measurements of SSA and black carbon at Beijing, China.

  11. Digital imaging and remote sensing image generator (DIRSIG) as applied to NVESD sensor performance modeling

    NASA Astrophysics Data System (ADS)

    Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.

    2016-05-01

    The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.

  12. Angular Rate Sensing with GyroWheel Using Genetic Algorithm Optimized Neural Networks.

    PubMed

    Zhao, Yuyu; Zhao, Hui; Huo, Xin; Yao, Yu

    2017-07-22

    GyroWheel is an integrated device that can provide three-axis control torques and two-axis angular rate sensing for small spacecrafts. Large tilt angle of its rotor and de-tuned spin rate lead to a complex and non-linear dynamics as well as difficulties in measuring angular rates. In this paper, the problem of angular rate sensing with the GyroWheel is investigated. Firstly, a simplified rate sensing equation is introduced, and the error characteristics of the method are analyzed. According to the analysis results, a rate sensing principle based on torque balance theory is developed, and a practical way to estimate the angular rates within the whole operating range of GyroWheel is provided by using explicit genetic algorithm optimized neural networks. The angular rates can be determined by the measurable values of the GyroWheel (including tilt angles, spin rate and torque coil currents), the weights and the biases of the neural networks. Finally, the simulation results are presented to illustrate the effectiveness of the proposed angular rate sensing method with GyroWheel.

  13. Initial Results from Radiometer and Polarized Radar-Based Icing Algorithms Compared to In-Situ Data

    NASA Technical Reports Server (NTRS)

    Serke, David; Reehorst, Andrew L.; King, Michael

    2015-01-01

    In early 2015, a field campaign was conducted at the NASA Glenn Research Center in Cleveland, Ohio, USA. The purpose of the campaign was to test several prototype algorithms meant to detect the location and severity of in-flight icing (or icing aloft, as opposed to ground icing) within the terminal airspace. Terminal airspace for this project is currently defined as within 25 kilometers horizontal distance of the terminal, which in this instance is Hopkins International Airport in Cleveland. Two new and improved algorithms that utilize ground-based remote sensing instrumentation have been developed and were operated during the field campaign. The first is the 'NASA Icing Remote Sensing System', or NIRSS. The second algorithm is the 'Radar Icing Algorithm', or RadIA. In addition to these algorithms, which were derived from ground-based remote sensors, in-situ icing measurements of the profiles of super-cooled liquid water (SLW) collected with vibrating wire sondes attached to weather balloons produced a comprehensive database for comparison. Key fields from the SLW-sondes include air temperature, humidity and liquid water content, cataloged by time and 3-D location. This work gives an overview of the NIRSS and RadIA products, and results are compared to in-situ SLW-sonde data from one icing case study. The location and quantity of super-cooled liquid as measured by the in-situ probes provide a measure of the utility of these prototype hazard-sensing algorithms.

  14. A Decentralized Eigenvalue Computation Method for Spectrum Sensing Based on Average Consensus

    NASA Astrophysics Data System (ADS)

    Mohammadi, Jafar; Limmer, Steffen; Stańczak, Sławomir

    2016-07-01

    This paper considers eigenvalue estimation for the decentralized inference problem for spectrum sensing. We propose a decentralized eigenvalue computation algorithm based on the power method, which is referred to as the generalized power method (GPM); it is capable of estimating the eigenvalues of a given covariance matrix under certain conditions. Furthermore, we have developed a decentralized implementation of GPM by splitting the iterative operations into local and global computation tasks. The global tasks require data exchange to be performed among the nodes. For this task, we apply an average consensus algorithm to efficiently perform the global computations. As a special case, we consider a structured graph that is a tree with clusters of nodes at its leaves. For an accelerated distributed implementation, we propose to use computation over multiple access channel (CoMAC) as a building block of the algorithm. Numerical simulations are provided to illustrate the performance of the two algorithms.
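
    To make the split between local and global tasks concrete, the sketch below runs a power iteration on the network-wide average covariance: each node multiplies its own local covariance by the current vector (local task), and the network-wide average of those products is obtained by iterative average consensus with a doubly stochastic mixing matrix (global task). This is a minimal illustration under idealized, synchronous consensus; it is not the paper's exact GPM formulation and it omits the tree-structured CoMAC acceleration.

        import numpy as np

        def consensus_average(values, W, iters=50):
            """Iterative average consensus: values has one row per node, W is a
            doubly stochastic mixing matrix matched to the communication graph.
            Every row converges to the network-wide average."""
            x = values.copy()
            for _ in range(iters):
                x = W @ x
            return x

        def decentralized_power_method(local_covs, W, iters=30, consensus_iters=50):
            """Estimate the dominant eigenpair of R = mean_k R_k without any node
            ever assembling R itself (illustrative, not the paper's GPM)."""
            num_nodes, n, _ = local_covs.shape
            v = np.ones(n) / np.sqrt(n)                       # common starting vector
            for _ in range(iters):
                local_prod = np.stack([Rk @ v for Rk in local_covs])             # local task
                avg_prod = consensus_average(local_prod, W, consensus_iters)[0]  # global task
                v = avg_prod / np.linalg.norm(avg_prod)
            local_prod = np.stack([Rk @ v for Rk in local_covs])
            eigenvalue = v @ consensus_average(local_prod, W, consensus_iters)[0]
            return eigenvalue, v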

  15. An intercomparison study of TSM, SEBS, and SEBAL using high-resolution imagery and lysimetric data

    USDA-ARS?s Scientific Manuscript database

    Over the past three decades, numerous remote sensing based ET mapping algorithms were developed. These algorithms provided a robust, economical, and efficient tool for ET estimations at field and regional scales. The Two Source Model (TSM), Surface Energy Balance System (SEBS), and Surface Energy Ba...

  16. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. Main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters, but for optically complex waters development of more advanced retrieval methods is required.

  17. Robert Spencer | NREL

    Science.gov Websites

    Staff profile listing research interests (remote sensing, natural resource modeling, machine learning) and areas of expertise (geospatial analysis, data visualization, algorithm development, modeling).

  18. Stochastic control approaches for sensor management in search and exploitation

    NASA Astrophysics Data System (ADS)

    Hitchings, Darin Chester

    Recent improvements in the capabilities of autonomous vehicles have motivated their increased use in such applications as defense, homeland security, environmental monitoring, and surveillance. To enhance performance in these applications, new algorithms are required to control teams of robots autonomously and through limited interactions with human operators. In this dissertation we develop new algorithms for control of robots performing information-seeking missions in unknown environments. These missions require robots to control their sensors in order to discover the presence of objects, keep track of the objects, and learn what these objects are, given a fixed sensing budget. Initially, we investigate control of multiple sensors, with a finite set of sensing options and finite-valued measurements, to locate and classify objects given a limited resource budget. The control problem is formulated as a Partially Observed Markov Decision Problem (POMDP), but its exact solution requires excessive computation. Under the assumption that sensor error statistics are independent and time-invariant, we develop a class of algorithms using Lagrangian Relaxation techniques to obtain optimal mixed strategies using performance bounds developed in previous research. We investigate alternative Receding Horizon (RH) controllers to convert the mixed strategies to feasible adaptive-sensing strategies and evaluate the relative performance of these controllers in simulation. The resulting controllers provide superior performance to alternative algorithms proposed in the literature and obtain solutions to large-scale POMDP problems several orders of magnitude faster than optimal Dynamic Programming (DP) approaches with comparable performance quality. We extend our results for finite action, finite measurement sensor control to scenarios with moving objects. We use Hidden Markov Models (HMMs) for the evolution of objects, according to the dynamics of a birth-death process. We develop a new lower bound on the performance of adaptive controllers in these scenarios, develop algorithms for computing solutions to this lower bound, and use these algorithms as part of an RH controller for sensor allocation in the presence of moving objects. We also consider an adaptive search problem where sensing actions are continuous and the underlying measurement space is also continuous. We extend our previous hierarchical decomposition approach based on performance bounds to this problem and develop novel implementations of Stochastic Dynamic Programming (SDP) techniques to solve this problem. Our algorithms are nearly two orders of magnitude faster than previously proposed approaches and yield solutions of comparable quality. For supervisory control, we discuss how human operators can work with and augment robotic teams performing these tasks. Our focus is on how tasks are partitioned among teams of robots and how a human operator can make intelligent decisions for task partitioning. We explore these questions through the design of a game that involves robot automata controlled by our algorithms and a human supervisor that partitions tasks based on different levels of support information. This game can be used with human subject experiments to explore the effect of information on quality of supervisory control.

  19. Control algorithm implementation for a redundant degree of freedom manipulator

    NASA Technical Reports Server (NTRS)

    Cohan, Steve

    1991-01-01

    This project's purpose is to develop and implement control algorithms for a kinematically redundant robotic manipulator. The manipulator is being developed concurrently by Odetics Inc., under internal research and development funding. This SBIR contract supports algorithm conception, development, and simulation, as well as software implementation and integration with the manipulator hardware. The Odetics Dexterous Manipulator is a lightweight, high strength, modular manipulator being developed for space and commercial applications. It has seven fully active degrees of freedom, is electrically powered, and is fully operational in 1 G. The manipulator consists of five self-contained modules. These modules join via simple quick-disconnect couplings and self-mating connectors which allow rapid assembly/disassembly for reconfiguration, transport, or servicing. Each joint incorporates a unique drive train design which provides zero backlash operation, is insensitive to wear, and is single fault tolerant to motor or servo amplifier failure. The sensing system is also designed to be single fault tolerant. Although the initial prototype is not space qualified, the design is well-suited to meeting space qualification requirements. The control algorithm design approach is to develop a hierarchical system with well defined access and interfaces at each level. The high level endpoint/configuration control algorithm transforms manipulator endpoint position/orientation commands to joint angle commands, providing task space motion. At the same time, the kinematic redundancy is resolved by controlling the configuration (pose) of the manipulator, using several different optimizing criteria. The center level of the hierarchy servos the joints to their commanded trajectories using both linear feedback and model-based nonlinear control techniques. The lowest control level uses sensed joint torque to close torque servo loops, with the goal of improving the manipulator dynamic behavior. The control algorithms are subjected to a dynamic simulation before implementation.
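
    As a concrete illustration of the endpoint/configuration layer described above, the sketch below resolves the kinematic redundancy with the standard pseudo-inverse plus null-space projection: the commanded endpoint velocity is tracked while a secondary gradient is projected into the Jacobian null space. The joint-centering criterion used here is an assumption for illustration; the abstract does not state which optimizing criteria the manipulator actually uses.

        import numpy as np

        def redundant_joint_rates(J, x_dot, q, q_mid, k0=0.5):
            """Map a 6-DOF endpoint velocity command to joint rates for a 7-DOF arm.
            J: 6x7 Jacobian, x_dot: desired endpoint twist, q: current joint angles.
            The null-space term nudges joints toward mid-range (one possible
            configuration criterion) without disturbing the endpoint motion."""
            J_pinv = np.linalg.pinv(J)
            grad = -k0 * (q - q_mid)                 # gradient of a joint-centering cost
            N = np.eye(q.size) - J_pinv @ J          # projector onto the Jacobian null space
            return J_pinv @ x_dot + N @ grad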

  20. NASA/GSFC Research Activities for the Global Ocean Carbon Cycle: A Prospectus for the 21st Century

    NASA Technical Reports Server (NTRS)

    Gregg, W. W.; Behrenfield, M. J.; Hoge, F. E.; Esaias, W. E.; Huang, N. E.; Long, S. R.; McClain, C. R.

    2000-01-01

    There are increasing concerns that anthropogenic inputs of carbon dioxide into the Earth system have the potential for climate change. In response to these concerns, the GSFC Laboratory for Hydrospheric Processes has formed the Ocean Carbon Science Team (OCST) to contribute to greater understanding of the global ocean carbon cycle. The overall goals of the OCST are to: 1) detect changes in biological components of the ocean carbon cycle through remote sensing of biooptical properties, 2) refine understanding of ocean carbon uptake and sequestration through application of basic research results, new satellite algorithms, and improved model parameterizations, 3) develop and implement new sensors providing critical missing environmental information related to the oceanic carbon cycle and the flux of CO2 across the air-sea interface. The specific objectives of the OCST are to: 1) establish a 20-year time series of ocean color, 2) develop new remote sensing technologies, 3) validate ocean remote sensing observations, 4) conduct ocean carbon cycle scientific investigations directly related to remote sensing data, emphasizing physiological, empirical and coupled physical/biological models, satellite algorithm development and improvement, and analysis of satellite data sets. These research and mission objectives are intended to improve our understanding of global ocean carbon cycling and contribute to national goals by maximizing the use of remote sensing data.

  1. Steganography algorithm multi pixel value differencing (MPVD) to increase message capacity and data security

    NASA Astrophysics Data System (ADS)

    Rojali, Siahaan, Ida Sri Rejeki; Soewito, Benfano

    2017-08-01

    Steganography is the art and science of hiding the secret messages so the existence of the message cannot be detected by human senses. The data concealment is using the Multi Pixel Value Differencing (MPVD) algorithm, utilizing the difference from each pixel. The development was done by using six interval tables. The objective of this algorithm is to enhance the message capacity and to maintain the data security.
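
    The record describes the idea only at a high level, so the sketch below shows classic single-table pixel value differencing for one pixel pair: the pair's difference selects an interval, the interval width sets how many message bits the pair can carry, and the difference is adjusted to encode those bits. The interval table is an illustrative assumption; the six-table MPVD extension and the handling of values pushed outside 0-255 are not reproduced.

        def pvd_embed_pair(p1, p2, bits):
            """Embed message bits (a string of '0'/'1') in one pixel pair using
            single-table pixel value differencing; returns the modified pair and
            the unconsumed bits. Boundary (0-255) checks are omitted."""
            table = [(0, 7), (8, 15), (16, 31), (32, 63), (64, 127), (128, 255)]
            d = p2 - p1
            lo, hi = next(r for r in table if r[0] <= abs(d) <= r[1])
            capacity = (hi - lo + 1).bit_length() - 1        # bits this pair can hold
            chunk, remaining = bits[:capacity], bits[capacity:]
            value = int(chunk, 2) if chunk else 0
            new_d = (lo + value) if d >= 0 else -(lo + value)
            m = new_d - d
            ceil_half = -(-m // 2)                           # ceil(m / 2)
            return p1 - ceil_half, p2 + m // 2, remaining    # new difference equals new_d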

  2. Spatial information technologies for remote sensing today and tomorrow; Proceedings of the Ninth Pecora Symposium, Sioux Falls, SD, October 2-4, 1984

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Topics discussed at the symposium include hardware, geographic information system (GIS) implementation, processing remotely sensed data, spatial data structures, and NASA programs in remote sensing information systems. Attention is also given GIS applications, advanced techniques, artificial intelligence, graphics, spatial navigation, and classification. Papers are included on the design of computer software for geographic image processing, concepts for a global resource information system, algorithm development for spatial operators, and an application of expert systems technology to remotely sensed image analysis.

  3. Primary analysis of the ocean color remote sensing data of the HY-1B/COCTS

    NASA Astrophysics Data System (ADS)

    He, Xianqiang; Bai, Yan; Pan, Delu; Zhu, Qiankun; Gong, Fang

    2009-01-01

    China successfully launched its second ocean color satellite, HY-1B, on 11 April 2007, as the successor to the HY-1A satellite launched on 15 May 2002. There were two sensors onboard HY-1B, named the Chinese Ocean Color and Temperature Scanner (COCTS) and the Coastal Zone Imager (CZI) respectively, and COCTS was the main sensor. COCTS had not only eight visible and near-infrared wave bands similar to those of SeaWiFS, but also two additional thermal infrared wave bands to measure the sea surface temperature. Therefore, COCTS had broad application potential, such as fishery resource protection and development, coastal monitoring and management, and marine pollution monitoring. In this paper, the main characteristics of COCTS are described first. Then, using the cross-calibration method, the vicarious calibration of COCTS was carried out with synchronous remote sensing data from SeaWiFS, and the results showed that COCTS had good linear responses for the visible light bands, with correlation coefficients above 0.98; however, the performance of the near-infrared bands was not as good as that of the visible bands. Using the vicarious calibration result, the operational atmospheric correction (AC) algorithm of COCTS was developed based on the exact Rayleigh scattering look-up table (LUT), aerosol scattering LUT and atmosphere diffuse transmission LUT generated by the coupled ocean-atmospheric vector radiative transfer numerical model named PCOART. The AC algorithm was validated with simulated radiance data at the top of the atmosphere, and the results showed that the errors of the water-leaving reflectance retrieved by the AC algorithm were less than 0.0005, which met the requirement of exact atmospheric correction for ocean color remote sensing. Finally, the AC algorithm was applied to the HY-1B/COCTS remote sensing data, and the corresponding ocean color remote sensing products were generated.

  4. A Polygon Model for Wireless Sensor Network Deployment with Directional Sensing Areas

    PubMed Central

    Wu, Chun-Hsien; Chung, Yeh-Ching

    2009-01-01

    The modeling of the sensing area of a sensor node is essential for the deployment algorithm of wireless sensor networks (WSNs). In this paper, a polygon model is proposed for the sensor node with directional sensing area. In addition, a WSN deployment algorithm is presented with topology control and scoring mechanisms to maintain network connectivity and improve sensing coverage rate. To evaluate the proposed polygon model and WSN deployment algorithm, a simulation is conducted. The simulation results show that the proposed polygon model outperforms the existing disk model and circular sector model in terms of the maximum sensing coverage rate. PMID:22303159

  5. Distributed Algorithm for Voronoi Partition of Wireless Sensor Networks with a Limited Sensing Range.

    PubMed

    He, Chenlong; Feng, Zuren; Ren, Zhigang

    2018-02-03

    For Wireless Sensor Networks (WSNs), the Voronoi partition of a region is a challenging problem owing to the limited sensing ability of each sensor and the distributed organization of the network. In this paper, an algorithm is proposed for each sensor having a limited sensing range to compute its limited Voronoi cell autonomously, so that the limited Voronoi partition of the entire WSN is generated in a distributed manner. Inspired by Graham's Scan (GS) algorithm used to compute the convex hull of a point set, the limited Voronoi cell of each sensor is obtained by sequentially scanning two consecutive bisectors between the sensor and its neighbors. The proposed algorithm called the Boundary Scan (BS) algorithm has a lower computational complexity than the existing Range-Constrained Voronoi Cell (RCVC) algorithm and reaches the lower bound of the computational complexity of the algorithms used to solve the problem of this kind. Moreover, it also improves the time efficiency of a key step in the Adjust-Sensing-Radius (ASR) algorithm used to compute the exact Voronoi cell. Extensive numerical simulations are performed to demonstrate the correctness and effectiveness of the BS algorithm. The distributed realization of the BS combined with a localization algorithm in WSNs is used to justify the WSN nature of the proposed algorithm.
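
    To make the object being computed concrete, the sketch below builds one sensor's limited Voronoi cell by brute force: the sensing disk is approximated by a polygon and clipped by the perpendicular-bisector half-plane of every neighbor. This illustrates the definition only and does not reproduce the Boundary Scan ordering that gives the algorithm of the record its lower computational complexity.

        import numpy as np

        def clip_halfplane(poly, a, b, c):
            """Keep the part of a polygon satisfying a*x + b*y <= c
            (one Sutherland-Hodgman clipping step)."""
            out = []
            for i in range(len(poly)):
                p, q = poly[i], poly[(i + 1) % len(poly)]
                fp = a * p[0] + b * p[1] - c
                fq = a * q[0] + b * q[1] - c
                if fp <= 0:
                    out.append(p)
                if fp * fq < 0:                              # edge crosses the bisector
                    t = fp / (fp - fq)
                    out.append((p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])))
            return out

        def limited_voronoi_cell(s, neighbors, r, n_sides=64):
            """Limited Voronoi cell of sensor s: its sensing disk of radius r clipped
            by the bisector with every neighbor (brute-force illustration only)."""
            ang = np.linspace(0, 2 * np.pi, n_sides, endpoint=False)
            cell = [(s[0] + r * np.cos(t), s[1] + r * np.sin(t)) for t in ang]
            for q in neighbors:
                # points closer to s than to q satisfy 2(q - s).x <= |q|^2 - |s|^2
                a, b = 2 * (q[0] - s[0]), 2 * (q[1] - s[1])
                c = q[0] ** 2 + q[1] ** 2 - s[0] ** 2 - s[1] ** 2
                cell = clip_halfplane(cell, a, b, c)
                if not cell:
                    break
            return cell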

  7. Remote Sensing of In-Flight Icing Conditions: Operational, Meteorological, and Technological Considerations

    NASA Technical Reports Server (NTRS)

    Ryerson, Charles C.

    2000-01-01

    Remote-sensing systems that map aircraft icing conditions in the flight path from airports or aircraft would allow icing to be avoided and exited. Icing remote-sensing system development requires consideration of the operational environment, the meteorological environment, and the technology available. Operationally, pilots need unambiguous cockpit icing displays for risk management decision-making. Human factors, aircraft integration, integration of remotely sensed icing information into the weather system infrastructures, and avoid-and-exit issues need resolution. Cost, maintenance, power, weight, and space concern manufacturers, operators, and regulators. An icing remote-sensing system detects cloud and precipitation liquid water, drop size, and temperature. An algorithm is needed to convert these conditions into icing potential estimates for cockpit display. Specification development requires that magnitudes of cloud microphysical conditions and their spatial and temporal variability be understood at multiple scales. The core of an icing remote-sensing system is the technology that senses icing microphysical conditions. Radar and microwave radiometers penetrate clouds and can estimate liquid water and drop size. Retrieval development is needed; differential attenuation and neural network assessment of multiple-band radar returns are most promising to date. Airport-based radar or radiometers are the most viable near-term technologies. A radiometer that profiles cloud liquid water, and experimental techniques to use radiometers horizontally, are promising. The most critical operational research needs are to assess cockpit and aircraft system integration, develop avoid-and-exit protocols, assess human factors, and integrate remote-sensing information into weather and air traffic control infrastructures. Improved spatial characterization of cloud and precipitation liquid-water content, drop-size spectra, and temperature are needed, as well as an algorithm to convert sensed conditions into a measure of icing potential. Technology development also requires refinement of inversion techniques. These goals can be accomplished with collaboration among federal agencies including NASA, the FAA, the National Center for Atmospheric Research, NOAA, and the Department of Defense. This report reviews operational, meteorological, and technological considerations in developing the capability to remotely map in-flight icing conditions from the ground and from the air.

  8. Needs Assessment for the Use of NASA Remote Sensing Data in the Development and Implementation of Estuarine and Coastal Water Quality Standards

    NASA Technical Reports Server (NTRS)

    Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake

    2010-01-01

    The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.

  9. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    NASA Astrophysics Data System (ADS)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and the verification of the sensor system are important tasks. To support these tasks, the simulation of the sensor and its output is valuable. This enables the developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates where the sensor looks by using a ray tracing algorithm. This also determines whether the observed part of the scene is shadowed or not. The second part describes the radiometry and results in the spectral at-sensor radiance from the visible spectrum to the thermal infrared according to the simulated sensor type. In the case of earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, such as a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.

  10. A software package for evaluating the performance of a star sensor operation

    NASA Astrophysics Data System (ADS)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-02-01

    We have developed a low-cost off-the-shelf component star sensor (StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe a software package to evaluate the performance of these algorithms operating together as a single star sensor system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and therefore can be applied to evaluate the performance of such algorithms in any star sensor. For hardware implementation on our StarSense, we are currently porting the code into functions written in C, keeping in view easy implementation on any star sensor electronics hardware.
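
    A minimal sketch of the first stage, centroiding, is given below: pixels above a threshold are grouped into connected blobs and each blob's intensity-weighted centroid is returned. The threshold-and-label approach is an assumption for illustration, not the StarSense implementation, and the Geometric Voting and QUEST stages are not reproduced.

        import numpy as np
        from scipy import ndimage

        def star_centroids(img, threshold):
            """Return intensity-weighted (row, col) centroids of bright blobs in a
            star-field image; a stand-in for the centroiding stage only."""
            mask = img > threshold
            labels, n = ndimage.label(mask)                  # group bright pixels into stars
            return ndimage.center_of_mass(img * mask, labels, range(1, n + 1))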

  11. The Role of Combination Techniques in Maximizing the Utility of Precipitation Estimates from Several Multi-Purpose Remote-Sensing Systems

    NASA Technical Reports Server (NTRS)

    Huffman, George J.; Adler, Robert F.; Bolvin, David T.; Curtis, Scott; Einaudi, Franco (Technical Monitor)

    2001-01-01

    Multi-purpose remote-sensing products from various satellites have proved crucial in developing global estimates of precipitation. Examples of these products include low-earth-orbit and geosynchronous-orbit infrared (leo- and geo-IR), Outgoing Longwave Radiation (OLR), Television Infrared Operational Satellite (TIROS) Operational Vertical Sounder (TOVS) data, and passive microwave data such as that from the Special Sensor Microwave/ Imager (SSM/I). Each of these datasets has served as the basis for at least one useful quasi-global precipitation estimation algorithm; however, the quality of estimates varies tremendously among the algorithms for the different climatic regions around the globe.

  12. Characterization of Moving Dust Particles

    NASA Technical Reports Server (NTRS)

    Bos, Brent J.; Antonille, Scott R.; Memarsadeghi, Nargess

    2010-01-01

    A large depth-of-field Particle Image Velocimeter (PIV) has been developed at NASA GSFC to characterize dynamic dust environments on planetary surfaces. This instrument detects and senses lofted dust particles. We have been developing an autonomous image analysis algorithm architecture for the PIV instrument to greatly reduce the amount of data that it has to store and downlink. The algorithm analyzes PIV images and reduces the image information down to only the particle measurement data we are interested in receiving on the ground - typically reducing the amount of data to be handled by more than two orders of magnitude. We give a general description of PIV algorithms and describe only the algorithm for estimating the velocity of the traveling particles.
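
    The record withholds the algorithm's details, so the sketch below is only a toy stand-in for the velocity-estimation step: detected particle positions in two frames taken a known interval apart are matched by nearest neighbor, and each matched displacement divided by the frame interval gives a velocity estimate.

        import numpy as np

        def particle_velocities(pos_a, pos_b, dt, max_disp):
            """Nearest-neighbor matching of particle detections between two frames
            dt seconds apart; matches farther than max_disp are rejected."""
            A, B = np.asarray(pos_a, float), np.asarray(pos_b, float)
            velocities = []
            for p in A:
                dists = np.linalg.norm(B - p, axis=1)
                j = int(np.argmin(dists))
                if dists[j] <= max_disp:                     # reject implausible matches
                    velocities.append((B[j] - p) / dt)
            return np.array(velocities)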

  13. [A review of atmospheric aerosol research by using polarization remote sensing].

    PubMed

    Guo, Hong; Gu, Xing-Fa; Xie, Dong-Hai; Yu, Tao; Meng, Qing-Yan

    2014-07-01

    In the present paper, aerosol research by using polarization remote sensing in the last two decades (1993-2013) is reviewed, including aerosol research based on POLDER/PARASOL, APS (Aerosol Polarimetry Sensor), polarized airborne cameras and ground-based measurements. We emphasize the following three aspects: (1) The retrieval algorithms developed for land and marine aerosol by using POLDER/PARASOL; the validation and application of POLDER/PARASOL AOD, and cross-comparison with AOD of other satellites, such as MODIS AOD. (2) The retrieval algorithms developed for land and marine aerosol by using MICROPOL and RSP/APS. We also introduce recent progress in aerosol research based on the Directional Polarimetric Camera (DPC), which was produced by the Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences (CAS). (3) The aerosol retrieval algorithms by using measurements from ground-based instruments, such as CE318-2 and CE318-DP. The retrieval results from spaceborne sensors, airborne cameras and ground-based measurements include total AOD, fine-mode AOD, coarse-mode AOD, size distribution, particle shape, complex refractive indices, single scattering albedo, scattering phase function, polarization phase function and AOD above cloud. Finally, based on the research, the authors present the problems and prospects of atmospheric aerosol research by using polarization remote sensing, and provide a valuable reference for future studies of atmospheric aerosol.

  14. Multivariate Density Estimation and Remote Sensing

    NASA Technical Reports Server (NTRS)

    Scott, D. W.

    1983-01-01

    Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. This approach is called a thunderstorm data analysis.
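
    As a concrete example of the kind of nonparametric multivariate density estimate referred to above, the sketch below evaluates a product-Gaussian kernel density estimate at a query point, with per-dimension bandwidths from the common n^(-1/(d+4)) rule. It is a generic illustration, not the specific estimator or the thunderstorm analysis of the record.

        import numpy as np

        def gaussian_kde(samples, query):
            """Evaluate a product-Gaussian kernel density estimate at one query point.
            samples: (n, d) array of observations; query: length-d point."""
            X = np.asarray(samples, float)
            n, d = X.shape
            h = X.std(axis=0, ddof=1) * n ** (-1.0 / (d + 4))    # per-dimension bandwidth
            z = (np.asarray(query, float) - X) / h               # standardized offsets
            kernels = np.exp(-0.5 * (z ** 2).sum(axis=1)) / ((2 * np.pi) ** (d / 2) * h.prod())
            return kernels.mean()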

  15. Using Remotely Sensed Information for Near Real-Time Landslide Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia; Adler, Robert; Peters-Lidard, Christa

    2013-01-01

    The increasing availability of remotely sensed precipitation and surface products provides a unique opportunity to explore how landslide susceptibility and hazard assessment may be approached at larger spatial scales with higher resolution remote sensing products. A prototype global landslide hazard assessment framework has been developed to evaluate how landslide susceptibility and satellite-derived precipitation estimates can be used to identify potential landslide conditions in near-real time. Preliminary analysis of this algorithm suggests that forecasting errors are geographically variable due to the resolution and accuracy of the current susceptibility map and the application of satellite-based rainfall estimates. This research is currently working to improve the algorithm through considering higher spatial and temporal resolution landslide susceptibility information and testing different rainfall triggering thresholds, antecedent rainfall scenarios, and various surface products at regional and global scales.

  16. Benchmarking of data fusion algorithms in support of earth observation based Antarctic wildlife monitoring

    NASA Astrophysics Data System (ADS)

    Witharana, Chandi; LaRue, Michelle A.; Lynch, Heather J.

    2016-03-01

    Remote sensing is a rapidly developing tool for mapping the abundance and distribution of Antarctic wildlife. While both panchromatic and multispectral imagery have been used in this context, image fusion techniques have received little attention. We tasked seven widely-used fusion algorithms: Ehlers fusion, hyperspherical color space fusion, high-pass fusion, principal component analysis (PCA) fusion, Gram-Schmidt fusion, University of New Brunswick fusion, and wavelet-PCA fusion to resolution-enhance a series of single-date QuickBird-2 and Worldview-2 image scenes comprising penguin guano, seals, and vegetation. Fused images were assessed for spectral and spatial fidelity using a variety of quantitative quality indicators and visual inspection methods. Our visual evaluation elected the high-pass fusion algorithm and the University of New Brunswick fusion algorithm as best for manual wildlife detection while the quantitative assessment suggested the Gram-Schmidt fusion algorithm and the University of New Brunswick fusion algorithm as best for automated classification. The hyperspherical color space fusion algorithm exhibited mediocre results in terms of spectral and spatial fidelities. The PCA fusion algorithm showed spatial superiority at the expense of spectral inconsistencies. The Ehlers fusion algorithm and the wavelet-PCA algorithm showed the weakest performances. As remote sensing becomes a more routine method of surveying Antarctic wildlife, these benchmarks will provide guidance for image fusion and pave the way for more standardized products for specific types of wildlife surveys.

  17. Remote-sensing image encryption in hybrid domains

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoqiang; Zhu, Guiliang; Ma, Shilong

    2012-04-01

    Remote-sensing technology plays an important role in military and industrial fields. Remote-sensing image is the main means of acquiring information from satellites, which always contain some confidential information. To securely transmit and store remote-sensing images, we propose a new image encryption algorithm in hybrid domains. This algorithm makes full use of the advantages of image encryption in both spatial domain and transform domain. First, the low-pass subband coefficients of image DWT (discrete wavelet transform) decomposition are sorted by a PWLCM system in transform domain. Second, the image after IDWT (inverse discrete wavelet transform) reconstruction is diffused with 2D (two-dimensional) Logistic map and XOR operation in spatial domain. The experiment results and algorithm analyses show that the new algorithm possesses a large key space and can resist brute-force, statistical and differential attacks. Meanwhile, the proposed algorithm has the desirable encryption efficiency to satisfy requirements in practice.
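
    The sketch below illustrates only the spatial-domain diffusion half of such a hybrid scheme: pixel values are XORed with a chaotic keystream, so applying the same function again with the same key recovers the image. A one-dimensional logistic map stands in for the 2D Logistic map, and the DWT/PWLCM coefficient-sorting stage of the proposed algorithm is not reproduced.

        import numpy as np

        def logistic_xor_diffuse(img, x0=0.37, mu=3.99):
            """XOR-diffuse an 8-bit image with a logistic-map keystream (toy example;
            key = (x0, mu)). Running the function twice with the same key decrypts."""
            flat = np.asarray(img, dtype=np.uint8).ravel()
            x, key = x0, np.empty(flat.size, dtype=np.uint8)
            for i in range(flat.size):
                x = mu * x * (1.0 - x)                       # chaotic keystream generator
                key[i] = int(x * 256) % 256
            return (flat ^ key).reshape(np.asarray(img).shape)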

  18. Concept for a hyperspectral remote sensing algorithm for floating marine macro plastics.

    PubMed

    Goddijn-Murphy, Lonneke; Peters, Steef; van Sebille, Erik; James, Neil A; Gibb, Stuart

    2018-01-01

    There is growing global concern over the chemical, biological and ecological impact of plastics in the ocean. Remote sensing has the potential to provide long-term, global monitoring but for marine plastics it is still in its early stages. Some progress has been made in hyperspectral remote sensing of marine macroplastics in the visible (VIS) to short wave infrared (SWIR) spectrum. We present a reflectance model of sunlight interacting with a sea surface littered with macro plastics, based on geometrical optics and the spectral signatures of plastic and seawater. This is a first step towards the development of a remote sensing algorithm for marine plastic using light reflectance measurements in air. Our model takes the colour, transparency, reflectivity and shape of plastic litter into account. This concept model can aid the design of laboratory, field and Earth observation measurements in the VIS-SWIR spectrum and explain the results.
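
    As a deliberately simplified stand-in for the reflectance model described above, the sketch below mixes plastic and seawater reflectance spectra weighted by the fractional plastic coverage of a pixel. The full geometrical-optics model additionally accounts for colour, transparency, reflectivity, shape and viewing geometry, none of which are represented here.

        import numpy as np

        def littered_surface_reflectance(r_water, r_plastic, coverage):
            """Area-weighted reflectance of a sea surface partly covered by floating
            plastic; inputs are spectra sampled on the same wavelength grid and a
            plastic coverage fraction between 0 and 1."""
            c = float(np.clip(coverage, 0.0, 1.0))
            return c * np.asarray(r_plastic, float) + (1.0 - c) * np.asarray(r_water, float)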

  19. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    NASA Astrophysics Data System (ADS)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction currently is important in the application of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have obvious disadvantages, such as ignoring spectral information and the trade-off between extraction rate and extraction accuracy. The purpose of this research is to develop an effective method to detect building information for Chinese GF-1 data. Firstly, the image preprocessing technique is used to normalize the image and image enhancement is used to highlight the useful information in the image. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to get the candidate building objects. Furthermore, in order to refine building objects and further remove false objects, post-processing (e.g., the shape features, the vegetation index and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission error (OE), commission error (CE), overall accuracy (OA) and Kappa are used in the final evaluation. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%. At the same time, the Kappa increases by 16.09%. In experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection.

  20. An Adaptive 6-DOF Tracking Method by Hybrid Sensing for Ultrasonic Endoscopes

    PubMed Central

    Du, Chengyang; Chen, Xiaodong; Wang, Yi; Li, Junwei; Yu, Daoyin

    2014-01-01

    In this paper, a novel hybrid sensing method for tracking an ultrasonic endoscope within the gastrointestinal (GI) track is presented, and the prototype of the tracking system is also developed. We implement 6-DOF localization by sensing integration and information fusion. On the hardware level, a tri-axis gyroscope and accelerometer, and a magnetic angular rate and gravity (MARG) sensor array are attached at the end of endoscopes, and three symmetric cylindrical coils are placed around patients' abdomens. On the algorithm level, an adaptive fast quaternion convergence (AFQC) algorithm is introduced to determine the orientation by fusing inertial/magnetic measurements, in which the effects of magnetic disturbance and acceleration are estimated to gain an adaptive convergence output. A simplified electro-magnetic tracking (SEMT) algorithm for dimensional position is also implemented, which can easily integrate the AFQC's results and magnetic measurements. Subsequently, the average position error is under 0.3 cm by reasonable setting, and the average orientation error is 1° without noise. If magnetic disturbance or acceleration exists, the average orientation error can be controlled to less than 3.5°. PMID:24915179

  1. Novel Kalman filter algorithm for statistical monitoring of extensive landscapes with synoptic sensor data

    Treesearch

    Raymond L. Czaplewski

    2015-01-01

    Wall-to-wall remotely sensed data are increasingly available to monitor landscape dynamics over large geographic areas. However, statistical monitoring programs that use post-stratification cannot fully utilize those sensor data. The Kalman filter (KF) is an alternative statistical estimator. I develop a new KF algorithm that is numerically robust with large numbers of...

  2. Double-Group Particle Swarm Optimization and Its Application in Remote Sensing Image Segmentation

    PubMed Central

    Shen, Liang; Huang, Xiaotao; Fan, Chongyi

    2018-01-01

    Particle Swarm Optimization (PSO) is a well-known meta-heuristic. It has been widely used in both research and engineering fields. However, the original PSO generally suffers from premature convergence, especially in multimodal problems. In this paper, we propose a double-group PSO (DG-PSO) algorithm to improve the performance. DG-PSO uses a double-group based evolution framework. The individuals are divided into two groups: an advantaged group and a disadvantaged group. The advantaged group works according to the original PSO, while two new strategies are developed for the disadvantaged group. The proposed algorithm is firstly evaluated by comparing it with the other five popular PSO variants and two state-of-the-art meta-heuristics on various benchmark functions. The results demonstrate that DG-PSO shows a remarkable performance in terms of accuracy and stability. Then, we apply DG-PSO to multilevel thresholding for remote sensing image segmentation. The results show that the proposed algorithm outperforms five other popular algorithms in meta-heuristic-based multilevel thresholding, which verifies the effectiveness of the proposed algorithm. PMID:29724013

  4. Broadband Phase Retrieval for Image-Based Wavefront Sensing

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.

    2007-01-01

    A focus-diverse phase-retrieval algorithm has been shown to perform adequately for the purpose of image-based wavefront sensing when (1) broadband light (typically spanning the visible spectrum) is used in forming the images by use of an optical system under test and (2) the assumption of monochromaticity is applied to the broadband image data. Heretofore, it had been assumed that in order to obtain adequate performance, it is necessary to use narrowband or monochromatic light. Some background information, including definitions of terms and a brief description of pertinent aspects of image-based phase retrieval, is prerequisite to a meaningful summary of the present development. Phase retrieval is a general term used in optics to denote estimation of optical imperfections or aberrations of an optical system under test. The term image-based wavefront sensing refers to a general class of algorithms that recover optical phase information, and phase-retrieval algorithms constitute a subset of this class. In phase retrieval, one utilizes the measured response of the optical system under test to produce a phase estimate. The optical response of the system is defined as the image of a point-source object, which could be a star or a laboratory point source. The phase-retrieval problem is characterized as image-based in the sense that a charge-coupled-device camera, preferably of scientific imaging quality, is used to collect image data where the optical system would normally form an image. In a variant of phase retrieval, denoted phase-diverse phase retrieval [which can include focus-diverse phase retrieval (in which various defocus planes are used)], an additional known aberration (or an equivalent diversity function) is superimposed as an aid in estimating unknown aberrations by use of an image-based wavefront-sensing algorithm. Image-based phase-retrieval differs from such other wavefront-sensing methods, such as interferometry, shearing interferometry, curvature wavefront sensing, and Shack-Hartmann sensing, all of which entail disadvantages in comparison with image-based methods. The main disadvantages of these non-image based methods are complexity of test equipment and the need for a wavefront reference.

  5. Modeling, simulation, and analysis of optical remote sensing systems

    NASA Technical Reports Server (NTRS)

    Kerekes, John Paul; Landgrebe, David A.

    1989-01-01

    Remote Sensing of the Earth's resources from space-based sensors has evolved in the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990's. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vector and covariance matrix. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms and an estimate made of the resulting classification accuracy among the informational classes. Application of these models is made to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated with several interesting results.

  6. Vertical Jump Height Estimation Algorithm Based on Takeoff and Landing Identification Via Foot-Worn Inertial Sensing.

    PubMed

    Wang, Jianren; Xu, Junkai; Shull, Peter B

    2018-03-01

    Vertical jump height is widely used for assessing motor development, functional ability, and motor capacity. Traditional methods for estimating vertical jump height rely on force plates or optical marker-based motion capture systems limiting assessment to people with access to specialized laboratories. Current wearable designs need to be attached to the skin or strapped to an appendage which can potentially be uncomfortable and inconvenient to use. This paper presents a novel algorithm for estimating vertical jump height based on foot-worn inertial sensors. Twenty healthy subjects performed countermovement jumping trials and maximum jump height was determined via inertial sensors located above the toe and under the heel and was compared with the gold standard maximum jump height estimation via optical marker-based motion capture. Average vertical jump height estimation errors from inertial sensing at the toe and heel were -2.2±2.1 cm and -0.4±3.8 cm, respectively. Vertical jump height estimation with the presented algorithm via inertial sensing showed excellent reliability at the toe (ICC(2,1)=0.98) and heel (ICC(2,1)=0.97). There was no significant bias in the inertial sensing at the toe, but proportional bias (b=1.22) and fixed bias (a=-10.23cm) were detected in inertial sensing at the heel. These results indicate that the presented algorithm could be applied to foot-worn inertial sensors to estimate maximum jump height enabling assessment outside of traditional laboratory settings, and to avoid bias errors, the toe may be a more suitable location for inertial sensor placement than the heel.
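
    One common way to turn detected takeoff and landing instants into a jump height is the ballistic flight-time relation h = g * t_f^2 / 8, sketched below. The record does not state whether this or a velocity-integration estimator is used in the paper, so the function is illustrative only.

        def jump_height_from_flight_time(t_takeoff, t_landing, g=9.81):
            """Vertical jump height (m) from the flight time between the identified
            takeoff and landing instants (s), assuming purely ballistic flight."""
            t_flight = t_landing - t_takeoff
            return g * t_flight ** 2 / 8.0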

  7. Monitoring terrestrial dissolved organic carbon export at land-water interfaces using remote sensing

    NASA Astrophysics Data System (ADS)

    Yu, Q.; Li, J.; Tian, Y. Q.

    2017-12-01

    Carbon flux from land to oceans and lakes is a crucial component of carbon cycling. However, this lateral carbon flow at the land-water interface is often neglected in the terrestrial carbon cycle budget, mainly because observations of the carbon dynamics are very limited. Monitoring CDOM/DOC dynamics using remote sensing and assessing DOC export from land to water remains a challenge. Current CDOM retrieval algorithms in the field of ocean color are not simply applicable to inland aquatic ecosystems since they were developed for coarse resolution ocean-viewing imagery and less complex water types in the open sea. We developed a new semi-analytical algorithm, called SBOP (Shallow water Bio-Optical Properties algorithm), to adapt to shallow inland waters. SBOP was first developed and calibrated based on in situ hyperspectral radiometer data. Then we applied it to Landsat-8 OLI images and evaluated the effectiveness of the multispectral images for inversion of CDOM absorption based on our field sampling in Saginaw Bay in Lake Huron. The algorithm performance (RMSE = 0.17 and R2 = 0.87 in Saginaw Bay; R2 = 0.80 in the northeastern US lakes) is promising, and we conclude that CDOM absorption can be derived from Landsat-8 OLI images in both optically deep and optically shallow waters with high accuracy. Our method addressed challenges in employing appropriate atmospheric correction, determining the bottom reflectance influence in shallow waters, and improving bio-optical property retrieval, as well as adapting to both hyperspectral and multispectral remote sensing imagery. Over 100 Landsat-8 images in Lake Huron, northeastern US lakes, and major Arctic rivers were processed to understand the CDOM spatio-temporal dynamics and its associated driving factors.

  8. Simple algorithms for remote determination of mineral abundances and particle sizes from reflectance spectra

    NASA Technical Reports Server (NTRS)

    Johnson, Paul E.; Smith, Milton O.; Adams, John B.

    1992-01-01

    Algorithms were developed, based on Hapke's (1981) equations, for remote determinations of mineral abundances and particle sizes from reflectance spectra. In this method, spectra are modeled as a function of end-member abundances and illumination/viewing geometry. The method was tested on a laboratory data set. It is emphasized that, although there exist more sophisticated models, the present algorithms are particularly suited for remotely sensed data, where little opportunity exists to independently measure reflectance versus particle size and phase function.
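
    The sketch below shows the linear least-squares unmixing step that such methods reduce to once reflectance spectra have been converted to single-scattering albedo, in which mixtures are approximately linear in end-member abundance. The Hapke-style conversion itself and the particle-size estimation are not reproduced, and the clipping and renormalization of the abundances are assumed simplifications.

        import numpy as np

        def unmix_abundances(spectrum, endmembers):
            """Least-squares end-member abundance estimate for one measured spectrum.
            endmembers: (bands, k) matrix of end-member spectra, assumed already
            converted to single-scattering albedo; spectrum: length-bands vector."""
            A = np.asarray(endmembers, float)
            b = np.asarray(spectrum, float)
            coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
            coeffs = np.clip(coeffs, 0.0, None)              # crude non-negativity fix
            total = coeffs.sum()
            return coeffs / total if total > 0 else coeffs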

  9. A real-time MTFC algorithm of space remote-sensing camera based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhao, Liting; Huang, Gang; Lin, Zhe

    2018-01-01

    A real-time MTFC algorithm for a space remote-sensing camera, implemented on an FPGA, was designed. The algorithm provides real-time image processing to enhance image clarity while the remote-sensing camera operates on-orbit. The image restoration algorithm adopts a modular design. The on-orbit MTF measurement module calculates the edge spread function, the line spread function, the ESF difference, the normalized MTF, and the MTFC parameters. The MTFC filtering module performs the image filtering and effectively suppresses noise. System Generator was used to design the image processing algorithms, simplifying the system design structure and the redesign process. The image gray-level gradient, point sharpness, edge contrast, and mid-to-high frequency content were enhanced, while the SNR of the restored image decreased by less than 1 dB compared to the original image. The image restoration system can be widely used in various fields.

  10. RZA-NLMF algorithm-based adaptive sparse sensing for realizing compressive sensing

    NASA Astrophysics Data System (ADS)

    Gui, Guan; Xu, Li; Adachi, Fumiyuki

    2014-12-01

    Nonlinear sparse sensing (NSS) techniques have been adopted for realizing compressive sensing in many applications such as radar imaging. Unlike the NSS, in this paper, we propose an adaptive sparse sensing (ASS) approach using the reweighted zero-attracting normalized least mean fourth (RZA-NLMF) algorithm, which depends on several given parameters, i.e., the reweighted factor, the regularization parameter, and the initial step size. First, based on the independence assumption, the Cramer-Rao lower bound (CRLB) is derived for performance comparison. In addition, a reweighted factor selection method is proposed for achieving robust estimation performance. Finally, to verify the algorithm, Monte Carlo-based computer simulations are given to show that the ASS achieves much better mean square error (MSE) performance than the NSS.
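
    The sketch below writes one plausible per-sample update that combines a fourth-order (LMF) error term, a norm-based normalization, and a reweighted zero attractor that promotes sparsity. It follows the general RZA/NLMF literature rather than this paper, so the exact normalization and the parameter values (reweighted factor, regularization parameter, step size) are assumptions.

        import numpy as np

        def rza_nlmf_step(w, x, d, mu=0.5, rho=5e-4, eps=10.0, delta=1e-12):
            """One adaptive-filter tap update: w are the current taps, x the input
            regressor, d the desired sample. The zero-attractor term pulls small
            taps toward zero, encouraging a sparse estimate."""
            e = d - w @ x                                            # instantaneous error
            w = w + mu * (e ** 3) * x / (np.linalg.norm(x) ** 4 + delta)
            w = w - rho * np.sign(w) / (1.0 + eps * np.abs(w))       # reweighted zero attractor
            return w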

  11. Distributed optical fiber vibration sensing using phase-generated carrier demodulation algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Zhihua; Zhang, Qi; Zhang, Mingyu; Dai, Haolong; Zhang, Jingjing; Liu, Li; Zhang, Lijun; Jin, Xing; Wang, Gaifang; Qi, Guang

    2018-05-01

    A novel optical fiber-distributed vibration-sensing system is proposed, which is based on self-interference of Rayleigh backscattering with a phase-generated carrier (PGC) demodulation algorithm. Pulsed light is sent into the sensing fiber, and the Rayleigh backscattering light from a certain position along the sensing fiber interferes through an unbalanced Michelson interferometer to generate the interference signal. An improved PGC demodulation algorithm is carried out to recover the phase information of the interference signal, which carries the sensing information. Three vibration events were applied simultaneously to different positions over 2000 m of sensing fiber and demodulated correctly. The spatial resolution is 10 m, the noise level of the proposed Φ-OTDR system is about 10⁻³ rad/√Hz, and the signal-to-noise ratio is about 30.34 dB.
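
    The sketch below is a textbook PGC arctangent demodulator rather than the improved algorithm of the record: the interference signal is mixed with the carrier and its second harmonic, low-pass filtered, and the phase is recovered from the arctangent of the ratio. It assumes a modulation depth near 2.63 rad so the Bessel-function factors cancel, and it recovers the phase only up to a sign and constant offset.

        import numpy as np

        def pgc_arctan_demod(signal, fs, f_carrier, smooth=None):
            """Recover the interferometric phase from a PGC-modulated signal sampled
            at fs Hz with carrier frequency f_carrier Hz (illustrative sketch)."""
            s = np.asarray(signal, float)
            n = np.arange(s.size)
            i_mix = s * np.cos(2 * np.pi * f_carrier * n / fs)       # fundamental mixing
            q_mix = s * np.cos(4 * np.pi * f_carrier * n / fs)       # second-harmonic mixing
            win = smooth or 4 * int(fs / f_carrier)                  # crude moving-average low-pass
            kernel = np.ones(win) / win
            i_lp = np.convolve(i_mix, kernel, mode="same")
            q_lp = np.convolve(q_mix, kernel, mode="same")
            return np.unwrap(np.arctan2(i_lp, q_lp))                 # phase in radians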

  12. Super-resolution algorithm based on sparse representation and wavelet preprocessing for remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Ren, Ruizhi; Gu, Lingjia; Fu, Haoyang; Sun, Chenglin

    2017-04-01

    An effective super-resolution (SR) algorithm is proposed for actual spectral remote sensing images based on sparse representation and wavelet preprocessing. The proposed SR algorithm mainly consists of dictionary training and image reconstruction. Wavelet preprocessing is used to establish four subbands, i.e., low frequency, horizontal, vertical, and diagonal high frequency, for an input image. As compared to the traditional approaches involving the direct training of image patches, the proposed approach focuses on the training of features derived from these four subbands. The proposed algorithm is verified using different spectral remote sensing images, e.g., moderate-resolution imaging spectroradiometer (MODIS) images with different bands, and the latest Chinese Jilin-1 satellite images with high spatial resolution. According to the visual experimental results obtained from the MODIS remote sensing data, the SR images using the proposed SR algorithm are superior to those using a conventional bicubic interpolation algorithm or traditional SR algorithms without preprocessing. Fusion algorithms, e.g., standard intensity-hue-saturation, principal component analysis, wavelet transform, and the proposed SR algorithms are utilized to merge the multispectral and panchromatic images acquired by the Jilin-1 satellite. The effectiveness of the proposed SR algorithm is assessed by parameters such as peak signal-to-noise ratio, structural similarity index, correlation coefficient, root-mean-square error, relative dimensionless global error in synthesis, relative average spectral error, spectral angle mapper, and the quality index Q4, and its performance is better than that of the standard image fusion algorithms.

  13. A study on characterization of stratospheric aerosol and gas parameters with the spacecraft solar occultation experiment

    NASA Technical Reports Server (NTRS)

    Chu, W. P.

    1977-01-01

    Spacecraft remote sensing of stratospheric aerosol and ozone vertical profiles using the solar occultation experiment has been analyzed. A computer algorithm has been developed in which a two-step inversion of the simulated data can be performed. The radiometric data are first inverted into a vertical extinction profile using a linear inversion algorithm. Then the multiwavelength extinction profiles are solved with a nonlinear least squares algorithm to produce aerosol and ozone vertical profiles. Examples of inversion results are shown illustrating the resolution and noise sensitivity of the inversion algorithms.

  14. Classification of bottom composition and bathymetry of shallow waters by passive remote sensing

    NASA Astrophysics Data System (ADS)

    Spitzer, D.; Dirks, R. W. J.

    Remote sensing data are used to develop algorithms that remove the influence of the water column on upwelling optical signals when mapping bottom depth and composition in shallow waters. Calculations relating the reflectance spectra to the parameters of the water column and the diverse bottom types are performed, and measurements of the underwater reflection coefficient of sandy, mud, and vegetation-type sea bottoms are taken. The two-flow radiative transfer model is used. Reflectances within the spectral bands of the Landsat MSS, the Landsat TM, SPOT HRV, and the TIROS-N series AVHRR were computed in order to develop algorithms suitable for mapping bottom depth and type. Bottom depth and features appear to be observable down to 3-20 m, depending on the water composition and bottom type.

  15. Interdisciplinary Investigations in Support of Project DI-MOD

    NASA Technical Reports Server (NTRS)

    Starks, Scott A. (Principal Investigator)

    1996-01-01

    Various concepts from time series analysis are used as the basis for the development of algorithms to assist in the analysis and interpretation of remotely sensed imagery. An approach to trend detection based upon the fractal analysis of power spectrum estimates is presented. Additionally, research was conducted toward the development of a software architecture to support processing tasks associated with databases housing a variety of data. An algorithmic approach that provides for the automation of the state monitoring process is presented.

  16. Nonlinear-Based MEMS Sensors and Active Switches for Gas Detection.

    PubMed

    Bouchaala, Adam; Jaber, Nizar; Yassine, Omar; Shekhah, Osama; Chernikova, Valeriya; Eddaoudi, Mohamed; Younis, Mohammad I

    2016-05-25

    The objective of this paper is to demonstrate the integration of a MOF thin film on electrostatically actuated microstructures to realize a switch triggered by gas and a sensing algorithm based on amplitude tracking. The devices are based on the nonlinear response of micromachined clamped-clamped beams. The microbeams are coated with a metal-organic framework (MOF), namely HKUST-1, to achieve high sensitivity. The softening and hardening nonlinear behaviors of the microbeams are exploited to demonstrate the ideas. For gas sensing, an amplitude-based tracking algorithm is developed to quantify the captured quantity of gas. Then, a MEMS switch triggered by gas using the nonlinear response of the microbeam is demonstrated. Noise analysis is conducted, which shows that the switch has high stability against thermal noise. The proposed switch is promising for delivering binary sensing information, and also can be used directly to activate useful functionalities, such as alarming.

  17. [Investigation on remote measurement of air pollution by a method of infrared passive scanning imaging].

    PubMed

    Jiao, Yang; Xu, Liang; Gao, Min-Guang; Feng, Ming-Chun; Jin, Ling; Tong, Jing-Jing; Li, Sheng

    2012-07-01

    Passive remote sensing by Fourier-transform infrared (FTIR) spectrometry allows detection of air pollution. However, for localizing a leak and fully assessing the situation when a hazardous cloud is released, information about the position and distribution of the cloud is essential. Therefore, an imaging passive remote sensing system comprising an interferometer, data acquisition and processing software, a scanning system, a video system, and a personal computer has been developed. Remote sensing of SF6 was demonstrated. The column densities in all directions in which a target compound has been identified are retrieved by a nonlinear least squares fitting algorithm and a radiative transfer algorithm, and a false-color image is displayed. The results were visualized as a video image overlaid with the false-color concentration distribution image. The system has high selectivity and allows visualization and quantification of pollutant clouds.

  18. Nonlinear-Based MEMS Sensors and Active Switches for Gas Detection

    PubMed Central

    Bouchaala, Adam; Jaber, Nizar; Yassine, Omar; Shekhah, Osama; Chernikova, Valeriya; Eddaoudi, Mohamed; Younis, Mohammad I.

    2016-01-01

    The objective of this paper is to demonstrate the integration of a MOF thin film on electrostatically actuated microstructures to realize a switch triggered by gas and a sensing algorithm based on amplitude tracking. The devices are based on the nonlinear response of micromachined clamped-clamped beams. The microbeams are coated with a metal-organic framework (MOF), namely HKUST-1, to achieve high sensitivity. The softening and hardening nonlinear behaviors of the microbeams are exploited to demonstrate the ideas. For gas sensing, an amplitude-based tracking algorithm is developed to quantify the captured quantity of gas. Then, a MEMS switch triggered by gas using the nonlinear response of the microbeam is demonstrated. Noise analysis is conducted, which shows that the switch has high stability against thermal noise. The proposed switch is promising for delivering binary sensing information, and also can be used directly to activate useful functionalities, such as alarming. PMID:27231914

  19. Mathematical model and coordination algorithms for ensuring complex security of an organization

    NASA Astrophysics Data System (ADS)

    Novoseltsev, V. I.; Orlova, D. E.; Dubrovin, A. S.; Irkhin, V. P.

    2018-03-01

    A mathematical model of coordination for ensuring the complex security of an organization is considered. On the basis of a random-search method, three types of effective coordination algorithms, matched to the level of mismatch concerning security, are developed: a coordination algorithm in which the coordinator's instructions dominate; a coordination algorithm in which the performers' decisions dominate; and a coordination algorithm with parity between the interests of the coordinator and the performers. The convergence of these algorithms was assessed by means of a computational experiment. The described coordination algorithms converge in the sense stated above, and the following regularity is revealed: the structurally simpler the algorithm, the fewer iterations are needed for it to converge.

  20. An Efficient Distributed Compressed Sensing Algorithm for Decentralized Sensor Network.

    PubMed

    Liu, Jing; Huang, Kaiyu; Zhang, Guoxian

    2017-04-20

    We consider the joint sparsity Model 1 (JSM-1) in a decentralized scenario, where a number of sensors are connected through a network and there is no fusion center. A novel algorithm, named distributed compact sensing matrix pursuit (DCSMP), is proposed to exploit the computational and communication capabilities of the sensor nodes. In contrast to conventional distributed compressed sensing algorithms that adopt a random sensing matrix, the proposed algorithm focuses on deterministic sensing matrices built directly on real acquisition systems. The proposed DCSMP algorithm can be divided into two independent parts: the common and innovation support set estimation processes. The goal of the common support set estimation process is to obtain an estimated common support set by fusing the candidate support set information from an individual node and its neighboring nodes. In the following innovation support set estimation process, the measurement vector is projected into a subspace that is perpendicular to the subspace spanned by the columns indexed by the estimated common support set, to remove the impact of the estimated common support set. We can then search the innovation support set using an orthogonal matching pursuit (OMP) algorithm based on the projected measurement vector and projected sensing matrix. In the proposed DCSMP algorithm, the process of estimating the common component/support set is decoupled from that of estimating the innovation component/support set. Thus, an inaccurately estimated common support set has no impact on estimating the innovation support set. It is proven that, provided the estimated common support set contains the true common support set, the proposed algorithm finds the true innovation support set correctly. Moreover, since the innovation support set estimation process is independent of the common support set estimation process, there is no requirement on the cardinality of either set; thus, the proposed DCSMP algorithm is capable of tackling the unknown sparsity problem successfully.
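    A simplified sketch of the innovation-support step under the stated idea: the columns indexed by the estimated common support are projected out, and plain OMP is run on the projected system. Variable names and the stopping rule are illustrative, not taken from the paper.

```python
import numpy as np

def project_out(A, y, common_support):
    """Project y and the columns of A onto the orthogonal complement of the
    subspace spanned by the columns indexed by the estimated common support."""
    Ac = A[:, common_support]
    P = np.eye(A.shape[0]) - Ac @ np.linalg.pinv(Ac)   # orthogonal projector
    return P @ A, P @ y

def omp(A, y, k, tol=1e-8):
    """Plain orthogonal matching pursuit: greedily pick up to k atoms of A explaining y.
    Assumes the columns of A are (approximately) unit norm."""
    residual, support = y.copy(), []
    for _ in range(k):
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx in support:
            break
        support.append(idx)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    return support

# Hypothetical usage: estimate the innovation support after removing the common part.
# A_proj, y_proj = project_out(A, y, est_common_support)
# innovation_support = omp(A_proj, y_proj, k_innov)
```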

  1. Robust Planning for Autonomous Navigation of Mobile Robots in Unstructured, Dynamic Environments: An LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    EISLER, G. RICHARD

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.

  2. Remote Sensing of Water Quality in Multipurpose Reservoirs: Case Study Applications in Indonesia, Mexico, and Uruguay

    NASA Astrophysics Data System (ADS)

    Miralles-Wilhelm, F.; Serrat-Capdevila, A.; Rodriguez, D.

    2017-12-01

    This research is focused on the development of remote sensing methods to assess surface water pollution issues, particularly in multipurpose reservoirs. Three case study applications are presented to comparatively analyze remote sensing techniques for detection of nutrient-related pollution, i.e., Nitrogen, Phosphorus, and Chlorophyll, as this is a major water quality issue identified in the pollution of major water sources around the country. This assessment will contribute to a better understanding of options for nutrient remote sensing capabilities and needs, and assist water agencies in identifying appropriate remote sensing tools and devising an application strategy to provide the information needed to support decision-making regarding the targeting and monitoring of nutrient pollution prevention and mitigation measures. A detailed review of the water quality data available from ground-based measurements was conducted in order to determine their suitability for a case study application of remote sensing. In the first case study, the Valle de Bravo reservoir in Mexico offers a larger database of water quality that may be used to better calibrate and validate the algorithms required to obtain water quality data from remote sensing raw data. In the second case study application, the relatively data-scarce Lake Toba in Indonesia can be useful to illustrate the value added by remote sensing data in locations where water quality data are deficient or nonexistent. The third case study, in the Paso Severino reservoir in Uruguay, offers a combination of data scarcity and persistent development of harmful algal blooms. Landsat-TM data were obtained for the three study sites, and algorithms for three key water quality parameters related to nutrient pollution, Chlorophyll-a, Total Nitrogen, and Total Phosphorus, were calibrated and validated at the study sites. The three case study applications were developed into capacity building/training workshops for water resources students, applied scientists, practitioners, reservoir and water quality managers, and other interested stakeholders.

  3. An experimental study of graph connectivity for unsupervised word sense disambiguation.

    PubMed

    Navigli, Roberto; Lapata, Mirella

    2010-04-01

    Word sense disambiguation (WSD), the task of identifying the intended meanings (senses) of words in context, has been a long-standing research objective for natural language processing. In this paper, we are concerned with graph-based algorithms for large-scale WSD. Under this framework, finding the right sense for a given word amounts to identifying the most "important" node among the set of graph nodes representing its senses. We introduce a graph-based WSD algorithm which has few parameters and does not require sense-annotated data for training. Using this algorithm, we investigate several measures of graph connectivity with the aim of identifying those best suited for WSD. We also examine how the chosen lexicon and its connectivity influences WSD performance. We report results on standard data sets and show that our graph-based approach performs comparably to the state of the art.

  4. MEMS-based sensing and algorithm development for fall detection and gait analysis

    NASA Astrophysics Data System (ADS)

    Gupta, Piyush; Ramirez, Gabriel; Lie, Donald Y. C.; Dallas, Tim; Banister, Ron E.; Dentino, Andrew

    2010-02-01

    Falls by the elderly are highly detrimental to health, frequently resulting in injury, high medical costs, and even death. Using a MEMS-based sensing system, algorithms are being developed for detecting falls and monitoring the gait of elderly and disabled persons. In this study, wireless sensors utilizing Zigbee protocols were incorporated into planar shoe insoles and a waist-mounted device. The insole contains four sensors to measure pressure applied by the foot. A MEMS-based tri-axial accelerometer is embedded in the insert, and a second one is utilized by the waist-mounted device. The primary fall detection algorithm is derived from the waist accelerometer. The differential acceleration is calculated from samples received in 1.5 s time intervals. This differential acceleration provides the quantification via an energy index. From this index one may characterize different gaits and identify fall events. Once a pre-determined index threshold is exceeded, the algorithm classifies an event as a fall or a stumble. The secondary algorithm is derived from frequency analysis techniques. The analysis consists of wavelet transforms conducted on the waist accelerometer data. The insole pressure data are then used to underline discrepancies in the transforms, providing more accurate data for classifying gait and/or detecting falls. The range of the transform amplitude in the fourth iteration of a Daubechies-6 transform was found sufficient to detect and classify fall events.
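    A minimal sketch of the primary detection idea as described, computing a differential-acceleration energy index over fixed windows and thresholding it; the threshold values and function names are placeholders, not the study's calibrated parameters.

```python
import numpy as np

def energy_index(accel, fs, window_s=1.5):
    """Differential-acceleration energy index over fixed windows (illustrative sketch).

    accel    : (N, 3) tri-axial waist accelerometer samples
    fs       : sampling rate (Hz)
    window_s : window length in seconds (1.5 s as described in the abstract)
    """
    step = int(window_s * fs)
    diff = np.diff(accel, axis=0)               # sample-to-sample change
    mags = np.linalg.norm(diff, axis=1)         # magnitude of the differential acceleration
    n = (len(mags) // step) * step
    windows = mags[:n].reshape(-1, step)
    return windows.sum(axis=1)                  # one energy value per window

def classify_events(indices, fall_thresh=40.0, stumble_thresh=20.0):
    """Label each window; threshold values are placeholders, not from the study."""
    return ["fall" if e > fall_thresh else
            "stumble" if e > stumble_thresh else "normal"
            for e in indices]
```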

  5. Distributed Task Offloading in Heterogeneous Vehicular Crowd Sensing

    PubMed Central

    Liu, Yazhi; Wang, Wendong; Ma, Yuekun; Yang, Zhigang; Yu, Fuxing

    2016-01-01

    The ability of road vehicles to efficiently execute different sensing tasks varies because of the heterogeneity in their sensing ability and trajectories. Therefore, the data collection sensing task, which requires tempo-spatial sensing data, becomes a serious problem in vehicular sensing systems, particularly those with limited sensing capabilities. A utility-based sensing task decomposition and offloading algorithm is proposed in this paper. The utility function for a task executed by a certain vehicle is built according to the mobility traces and sensing interfaces of the vehicle, as well as the sensing data type and tempo-spatial coverage requirements of the sensing task. Then, the sensing tasks are decomposed and offloaded to neighboring vehicles according to the utilities of the neighboring vehicles to the decomposed sensing tasks. Real trace-driven simulation shows that the proposed task offloading is able to collect much more comprehensive and uniformly distributed sensing data than other algorithms. PMID:27428967

  6. SAR-EDU - An education initiative for applied Synthetic Aperture Radar remote sensing

    NASA Astrophysics Data System (ADS)

    Eckardt, Robert; Richter, Nicole; Auer, Stefan; Eineder, Michael; Roth, Achim; Hajnsek, Irena; Walter, Diana; Braun, Matthias; Motagh, Mahdi; Pathe, Carsten; Pleskachevsky, Andrey; Thiel, Christian; Schmullius, Christiane

    2013-04-01

    Since the 1970s, radar remote sensing techniques have evolved rapidly and are increasingly employed in all fields of earth sciences. Applications are manifold and still expanding due to the continuous development of new instruments and missions as well as the availability of very high-quality data. The trend worldwide is towards operational employment of the various algorithms and methods that have been developed. However, the utilization of operational services does not yet keep up with the rate of technical developments and the improvements in sensor technology. With the increasing availability and variety of spaceborne Synthetic Aperture Radar (SAR) data and a growing number of analysis algorithms, the need for a vital user community is increasing. Therefore the German Aerospace Center (DLR), together with the Friedrich-Schiller-University Jena (FSU) and the Technical University Munich (TUM), launched the education initiative SAR-EDU. The aim of the project is to facilitate access to expert knowledge in the scientific field of radar remote sensing. Within this effort, a web portal will be created to provide seminar material on SAR basics, methods, and applications to support both lecturers and students. The overall intention of the project SAR-EDU is to provide seminar material for higher education in radar remote sensing, covering the topic holistically from the very basics to the most advanced methods and applications available. The principles of processing and interpreting SAR data will be taught using test data sets and open-source as well as commercial software packages. The material provided by SAR-EDU will be accessible at no charge from a DLR web portal. The educational tool will have a modular structure, consisting of separate modules that each address a particular topic. The aim of implementing SAR-EDU as an application-oriented radar remote sensing educational tool is to advocate the development and wider use of operational services on the basis of pre-existing algorithms and sensors on the one hand, and to aid the extension of radar remote sensing techniques to a broader field of application on the other. SAR-EDU therefore combines the knowledge, expertise, and experience of an excellent German consortium.

  7. The Spectrum Analysis Solution (SAS) System: Theoretical Analysis, Hardware Design and Implementation.

    PubMed

    Narayanan, Ram M; Pooler, Richard K; Martone, Anthony F; Gallagher, Kyle A; Sherbondy, Kelly D

    2018-02-22

    This paper describes a multichannel super-heterodyne signal analyzer, called the Spectrum Analysis Solution (SAS), which performs multi-purpose spectrum sensing to support spectrally adaptive and cognitive radar applications. The SAS operates from ultrahigh frequency (UHF) to the S-band and features a wideband channel with eight narrowband channels. The wideband channel acts as a monitoring channel that can be used to tune the instantaneous band of the narrowband channels to areas of interest in the spectrum. The data collected from the SAS has been utilized to develop spectrum sensing algorithms for the budding field of spectrum sharing (SS) radar. Bandwidth (BW), average total power, percent occupancy (PO), signal-to-interference-plus-noise ratio (SINR), and power spectral entropy (PSE) have been examined as metrics for the characterization of the spectrum. These metrics are utilized to determine a contiguous optimal sub-band (OSB) for a SS radar transmission in a given spectrum for different modalities. Three OSB algorithms are presented and evaluated: the spectrum sensing multi objective (SS-MO), the spectrum sensing with brute force PSE (SS-BFE), and the spectrum sensing multi-objective with brute force PSE (SS-MO-BFE).
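    Two of the named spectrum metrics, percent occupancy and power spectral entropy, admit simple textbook implementations; the sketch below assumes a PSD in dB and a known noise floor, which are illustrative simplifications rather than the SAS processing chain.

```python
import numpy as np

def spectrum_metrics(psd_db, noise_floor_db):
    """Two of the spectrum-characterization metrics named in the abstract,
    computed from a power spectral density in dB (a simplified sketch)."""
    # Percent occupancy: fraction of bins above the (assumed known) noise floor.
    occupancy = 100.0 * np.mean(psd_db > noise_floor_db)

    # Power spectral entropy: Shannon entropy of the PSD normalized to a distribution.
    psd_lin = 10.0 ** (psd_db / 10.0)
    p = psd_lin / psd_lin.sum()
    pse = -np.sum(p * np.log2(p + 1e-12))
    return occupancy, pse
```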

  8. The Spectrum Analysis Solution (SAS) System: Theoretical Analysis, Hardware Design and Implementation

    PubMed Central

    Pooler, Richard K.; Martone, Anthony F.; Gallagher, Kyle A.; Sherbondy, Kelly D.

    2018-01-01

    This paper describes a multichannel super-heterodyne signal analyzer, called the Spectrum Analysis Solution (SAS), which performs multi-purpose spectrum sensing to support spectrally adaptive and cognitive radar applications. The SAS operates from ultrahigh frequency (UHF) to the S-band and features a wideband channel with eight narrowband channels. The wideband channel acts as a monitoring channel that can be used to tune the instantaneous band of the narrowband channels to areas of interest in the spectrum. The data collected from the SAS has been utilized to develop spectrum sensing algorithms for the budding field of spectrum sharing (SS) radar. Bandwidth (BW), average total power, percent occupancy (PO), signal-to-interference-plus-noise ratio (SINR), and power spectral entropy (PSE) have been examined as metrics for the characterization of the spectrum. These metrics are utilized to determine a contiguous optimal sub-band (OSB) for a SS radar transmission in a given spectrum for different modalities. Three OSB algorithms are presented and evaluated: the spectrum sensing multi objective (SS-MO), the spectrum sensing with brute force PSE (SS-BFE), and the spectrum sensing multi-objective with brute force PSE (SS-MO-BFE). PMID:29470448

  9. Signal and image processing algorithm performance in a virtual and elastic computing environment

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and the associated high-performance computing needs, increases and challenges existing computing infrastructures. Purchasing computing power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions for developing and optimizing algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results provide performance comparisons with the existing infrastructure. A discussion on using cloud computing with government data addresses best security practices that exist within cloud services, such as AWS.

  10. Remote Sensing Image Change Detection Based on NSCT-HMT Model and Its Application.

    PubMed

    Chen, Pengyun; Zhang, Yichen; Jia, Zhenhong; Yang, Jie; Kasabov, Nikola

    2017-06-06

    Traditional image change detection based on a non-subsampled contourlet transform always ignores the neighborhood information's relationship to the non-subsampled contourlet coefficients, and the detection results are susceptible to noise interference. To address these disadvantages, we propose a denoising method based on the non-subsampled contourlet transform domain that uses the Hidden Markov Tree model (NSCT-HMT) for change detection of remote sensing images. First, the ENVI software is used to calibrate the original remote sensing images. After that, the mean-ratio operation is adopted to obtain the difference image that will be denoised by the NSCT-HMT model. Then, using the Fuzzy Local Information C-means (FLICM) algorithm, the difference image is divided into the change area and unchanged area. The proposed algorithm is applied to a real remote sensing data set. The application results show that the proposed algorithm can effectively suppress clutter noise, and retain more detailed information from the original images. The proposed algorithm has higher detection accuracy than the Markov Random Field-Fuzzy C-means (MRF-FCM), the non-subsampled contourlet transform-Fuzzy C-means clustering (NSCT-FCM), the pointwise approach and graph theory (PA-GT), and the Principal Component Analysis-Nonlocal Means (PCA-NLM) denoising algorithm. Finally, the five algorithms are used to detect the southern boundary of the Gurbantunggut Desert in Xinjiang Uygur Autonomous Region of China, and the results show that the proposed algorithm has the best effect on real remote sensing image change detection.
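    The mean-ratio operator used to form the difference image is a standard construction; a minimal sketch, assuming a simple local-mean window, is given below (the NSCT-HMT denoising and FLICM clustering stages are not reproduced here).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mean_ratio_difference(img1, img2, win=3, eps=1e-6):
    """Mean-ratio difference image commonly used before denoising and clustering
    in change detection (a sketch of the standard operator, not the full NSCT-HMT chain)."""
    m1 = uniform_filter(np.asarray(img1, dtype=float), size=win) + eps
    m2 = uniform_filter(np.asarray(img2, dtype=float), size=win) + eps
    return 1.0 - np.minimum(m1 / m2, m2 / m1)   # large values indicate likely change
```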

  11. Remote Sensing Image Change Detection Based on NSCT-HMT Model and Its Application

    PubMed Central

    Chen, Pengyun; Zhang, Yichen; Jia, Zhenhong; Yang, Jie; Kasabov, Nikola

    2017-01-01

    Traditional image change detection based on a non-subsampled contourlet transform always ignores the neighborhood information’s relationship to the non-subsampled contourlet coefficients, and the detection results are susceptible to noise interference. To address these disadvantages, we propose a denoising method based on the non-subsampled contourlet transform domain that uses the Hidden Markov Tree model (NSCT-HMT) for change detection of remote sensing images. First, the ENVI software is used to calibrate the original remote sensing images. After that, the mean-ratio operation is adopted to obtain the difference image that will be denoised by the NSCT-HMT model. Then, using the Fuzzy Local Information C-means (FLICM) algorithm, the difference image is divided into the change area and unchanged area. The proposed algorithm is applied to a real remote sensing data set. The application results show that the proposed algorithm can effectively suppress clutter noise, and retain more detailed information from the original images. The proposed algorithm has higher detection accuracy than the Markov Random Field-Fuzzy C-means (MRF-FCM), the non-subsampled contourlet transform-Fuzzy C-means clustering (NSCT-FCM), the pointwise approach and graph theory (PA-GT), and the Principal Component Analysis-Nonlocal Means (PCA-NLM) denoising algorithm. Finally, the five algorithms are used to detect the southern boundary of the Gurbantunggut Desert in Xinjiang Uygur Autonomous Region of China, and the results show that the proposed algorithm has the best effect on real remote sensing image change detection. PMID:28587299

  12. Adaptive Bayes classifiers for remotely sensed data

    NASA Technical Reports Server (NTRS)

    Raulston, H. S.; Pace, M. O.; Gonzalez, R. C.

    1975-01-01

    An algorithm is developed for a learning, adaptive, statistical pattern classifier for remotely sensed data. The estimation procedure consists of two steps: (1) an optimal stochastic approximation of the parameters of interest, and (2) a projection of the parameters in time and space. The results reported are for Gaussian data in which the mean vector of each class may vary with time or position after the classifier is trained.
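    A minimal sketch of the kind of stochastic-approximation update implied by step (1), applied to a class mean; the gain sequence is a textbook choice and not necessarily the paper's optimal one.

```python
import numpy as np

def recursive_mean(samples, mu0, gain=lambda k: 1.0 / k):
    """Stochastic-approximation update of a class mean from streaming samples
    (an illustrative sketch of the adaptive estimation step, not the paper's exact scheme)."""
    mu = np.asarray(mu0, dtype=float)
    for k, x in enumerate(samples, start=1):
        # mu_k = mu_{k-1} + a_k * (x_k - mu_{k-1}), with gain a_k = 1/k by default
        mu = mu + gain(k) * (np.asarray(x, dtype=float) - mu)
    return mu
```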

  13. Small Business Innovation Research (SBIR) Program, FY 1993. Program Solicitation 93.2, Closing Date: 2 August 1993

    DTIC Science & Technology

    1993-01-01

    demonstrate improved Pd and Pfa with advanced algorithms, prepare final drop test demonstration. Potential Commercial Market: LADAR profiling and sensing...Field Refrigeration (CRFR) CATEGORY: Exploratory Development OBJECTIVE: To develop a nonpowered (nonelectric) closed-cycle solid-gas sorption

  14. Development and validation of a MODIS colored dissolved organic matter (CDOM) algorithm in northwest Florida estuaries

    EPA Science Inventory

    Satellite remote sensing provides synoptic and frequent monitoring of water quality parameters that aids in determining the health of aquatic ecosystems and the development of effective management strategies. Northwest Florida estuaries are classified as optically-complex, or wat...

  15. Fast and accurate image recognition algorithms for fresh produce food safety sensing

    NASA Astrophysics Data System (ADS)

    Yang, Chun-Chieh; Kim, Moon S.; Chao, Kuanglin; Kang, Sukwon; Lefcourt, Alan M.

    2011-06-01

    This research developed and evaluated the multispectral algorithms derived from hyperspectral line-scan fluorescence imaging under violet LED excitation for detection of fecal contamination on Golden Delicious apples. The algorithms utilized the fluorescence intensities at four wavebands, 680 nm, 684 nm, 720 nm, and 780 nm, for computation of simple functions for effective detection of contamination spots created on the apple surfaces using four concentrations of aqueous fecal dilutions. The algorithms detected more than 99% of the fecal spots. The effective detection of feces showed that a simple multispectral fluorescence imaging algorithm based on violet LED excitation may be appropriate to detect fecal contamination on fast-speed apple processing lines.

  16. Filtering method of star control points for geometric correction of remote sensing image based on RANSAC algorithm

    NASA Astrophysics Data System (ADS)

    Tan, Xiangli; Yang, Jungang; Deng, Xinpu

    2018-04-01

    In the geometric correction of remote sensing images, a large number of redundant control points can occasionally result in low correction accuracy. In order to solve this problem, a control point filtering algorithm based on RANdom SAmple Consensus (RANSAC) is proposed. The basic idea of the RANSAC algorithm is to use the smallest possible data set to estimate the model parameters and then enlarge this set with consistent data points. In this paper, unlike traditional geometric correction methods using Ground Control Points (GCPs), simulation experiments are carried out to correct remote sensing images using visible stars as control points. In addition, the accuracy of geometric correction without Star Control Point (SCP) optimization is also shown. The experimental results show that the SCP filtering method based on the RANSAC algorithm greatly improves the accuracy of remote sensing image correction.
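    A hedged sketch of RANSAC-based control point filtering with a 2-D affine model; the model choice, residual threshold, and iteration count are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np

def ransac_affine(src, dst, n_iter=500, thresh=1.0, min_pts=3, rng=None):
    """Filter control points with RANSAC by fitting a 2-D affine transform (illustrative).

    src, dst : (N, 2) arrays of image and reference coordinates
    Returns the boolean inlier mask of control points kept for correction.
    """
    rng = np.random.default_rng(rng)
    src_h = np.hstack([src, np.ones((len(src), 1))])   # homogeneous coordinates
    best_mask, best_count = None, -1
    for _ in range(n_iter):
        pick = rng.choice(len(src), size=min_pts, replace=False)
        # Fit a 3x2 affine transform from the minimal sample.
        T, *_ = np.linalg.lstsq(src_h[pick], dst[pick], rcond=None)
        resid = np.linalg.norm(src_h @ T - dst, axis=1)
        mask = resid < thresh                           # consensus set for this model
        if mask.sum() > best_count:
            best_mask, best_count = mask, int(mask.sum())
    return best_mask
```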

  17. Compositing multitemporal remote sensing data sets

    USGS Publications Warehouse

    Qi, J.; Huete, A.R.; Hood, J.; Kerr, Y.

    1993-01-01

    To eliminate cloud- and atmosphere-affected pixels, the compositing of multitemporal remote sensing data sets is done by selecting the maximum value of the normalized difference vegetation index (NDVI) within a compositing period. The NDVI classifier, however, is strongly affected by surface type and anisotropic properties, sensor viewing geometries, and atmospheric conditions. Consequently, the composited multitemporal remote sensing data contain substantial noise from these external effects. To improve the accuracy of compositing products, two key approaches can be taken: one is to refine the compositing classifier (NDVI) and the other is to improve existing compositing algorithms. In this project, an alternative classifier was developed and an alternative pixel selection criterion was proposed for compositing. The new classifier and the alternative compositing algorithm were applied to an advanced very high resolution radiometer data set of different biome types in the United States. The results were compared with the maximum value compositing and the best index slope extraction algorithms. The new approaches greatly reduced the high-frequency noise related to the external factors and retained more reliable data. The results suggest that the geometric-optical canopy properties of specific biomes may be needed in compositing. Limitations of the new approaches include the dependency of pixel selection on the length of the composite period and data discontinuity.
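    For reference, the baseline maximum-value compositing that the project seeks to improve can be sketched as follows; band names and array shapes are assumptions for illustration.

```python
import numpy as np

def max_value_composite(red_stack, nir_stack):
    """Classic maximum-value NDVI compositing over a stack of dates
    (the baseline approach, shown here for reference).

    red_stack, nir_stack : arrays of shape (dates, rows, cols)
    Returns the per-pixel index of the selected date and the composited NDVI.
    """
    ndvi = (nir_stack - red_stack) / (nir_stack + red_stack + 1e-6)
    best_date = np.argmax(ndvi, axis=0)                  # date with maximum NDVI per pixel
    rows, cols = np.indices(best_date.shape)
    return best_date, ndvi[best_date, rows, cols]
```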

  18. Information surfing with the JHU/APL coherent imager

    NASA Astrophysics Data System (ADS)

    Ratto, Christopher R.; Shipley, Kara R.; Beagley, Nathaniel; Wolfe, Kevin C.

    2015-05-01

    The ability to perform remote forensics in situ is an important application of autonomous undersea vehicles (AUVs). Forensics objectives may include remediation of mines and/or unexploded ordnance, as well as monitoring of seafloor infrastructure. At JHU/APL, digital holography is being explored for the potential application to underwater imaging and integration with an AUV. In previous work, a feature-based approach was developed for processing the holographic imagery and performing object recognition. In this work, the results of the image processing method were incorporated into a Bayesian framework for autonomous path planning referred to as information surfing. The framework was derived assuming that the location of the object of interest is known a priori, but the type of object and its pose are unknown. The path-planning algorithm adaptively modifies the trajectory of the sensing platform based on historical performance of object and pose classification. The algorithm is called information surfing because the direction of motion is governed by the local information gradient. Simulation experiments were carried out using holographic imagery collected from submerged objects. The autonomous sensing algorithm was compared to a deterministic sensing CONOPS, and demonstrated improved accuracy and faster convergence in several cases.

  19. Efficient Retrieval of Massive Ocean Remote Sensing Images via a Cloud-Based Mean-Shift Algorithm.

    PubMed

    Yang, Mengzhao; Song, Wei; Mei, Haibin

    2017-07-23

    The rapid development of remote sensing (RS) technology has resulted in the proliferation of high-resolution images. There are challenges involved not only in storing large volumes of RS images but also in rapidly retrieving the images for ocean disaster analysis such as for storm surges and typhoon warnings. In this paper, we present an efficient retrieval method for massive ocean RS images via a Cloud-based mean-shift algorithm. A distributed construction method via the pyramid model is proposed, based on the maximum hierarchical layer algorithm, and used to realize an efficient storage structure for RS images on the Cloud platform. We achieve high-performance processing of massive RS images in the Hadoop system. Based on the pyramid Hadoop distributed file system (HDFS) storage method, an improved mean-shift algorithm for RS image retrieval is presented by fusion with the canopy algorithm via Hadoop MapReduce programming. The results show that the new method can achieve better performance for data storage than HDFS alone and WebGIS-based HDFS. Speedup and scaleup are very close to linear with an increase of RS images, which proves that image retrieval using our method is efficient.

  20. Efficient Retrieval of Massive Ocean Remote Sensing Images via a Cloud-Based Mean-Shift Algorithm

    PubMed Central

    Song, Wei; Mei, Haibin

    2017-01-01

    The rapid development of remote sensing (RS) technology has resulted in the proliferation of high-resolution images. There are challenges involved not only in storing large volumes of RS images but also in rapidly retrieving the images for ocean disaster analysis such as for storm surges and typhoon warnings. In this paper, we present an efficient retrieval method for massive ocean RS images via a Cloud-based mean-shift algorithm. A distributed construction method via the pyramid model is proposed, based on the maximum hierarchical layer algorithm, and used to realize an efficient storage structure for RS images on the Cloud platform. We achieve high-performance processing of massive RS images in the Hadoop system. Based on the pyramid Hadoop distributed file system (HDFS) storage method, an improved mean-shift algorithm for RS image retrieval is presented by fusion with the canopy algorithm via Hadoop MapReduce programming. The results show that the new method can achieve better performance for data storage than HDFS alone and WebGIS-based HDFS. Speedup and scaleup are very close to linear with an increase of RS images, which proves that image retrieval using our method is efficient. PMID:28737699

  1. Development of model reference adaptive control theory for electric power plant control applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mabius, L.E.

    1982-09-15

    The scope of this effort includes the theoretical development of a multi-input, multi-output (MIMO) Model Reference Control (MRC) algorithm (i.e., model-following control law) and Model Reference Adaptive Control (MRAC) algorithm, and the formulation of a nonlinear model of a typical electric power plant. Previous single-input, single-output MRAC algorithm designs have been generalized to MIMO MRAC designs using the MIMO MRC algorithm. This MRC algorithm, which has been developed using Command Generator Tracker methodologies, represents the steady-state behavior (in the adaptive sense) of the MRAC algorithm. The MRC algorithm is a fundamental component in the MRAC design and stability analysis. An enhanced MRC algorithm, which has been developed for systems with more controls than regulated outputs, alleviates the MRC stability constraint of stable plant transmission zeroes. The nonlinear power plant model is based on the Cromby model with the addition of a governor valve management algorithm, turbine dynamics and turbine interactions with extraction flows. An application of the MRC algorithm to a linearization of this model demonstrates its applicability to power plant systems. In particular, the generated power changes at 7% per minute while throttle pressure and temperature, reheat temperature and drum level are held constant with a reasonable level of control. The enhanced algorithm significantly reduces control fluctuations without modifying the output response.

  2. Rail integrity alert system (RIAS) feature discrimination : final report.

    DOT National Transportation Integrated Search

    2016-08-01

    This report describes GE Global Research's work, in partnership with GE Transportation, on developing and deploying algorithms for a locomotive-based inductive sensing system that has a very high probability of detecting broken rails with v...

  3. Fast vision-based catheter 3D reconstruction

    NASA Astrophysics Data System (ADS)

    Moradi Dalvand, Mohsen; Nahavandi, Saeid; Howe, Robert D.

    2016-07-01

    Continuum robots offer better maneuverability and inherent compliance and are well-suited for surgical applications as catheters, where gentle interaction with the environment is desired. However, sensing their shape and tip position is a challenge, as traditional sensors cannot be employed in the way they are in rigid robotic manipulators. In this paper, a high-speed vision-based shape sensing algorithm for real-time 3D reconstruction of continuum robots based on the views of two arbitrarily positioned cameras is presented. The algorithm is based on the closed-form analytical solution of the reconstruction of quadratic curves in 3D space from two arbitrary perspective projections. High-speed image processing algorithms are developed for segmentation and feature extraction from the images. The proposed algorithms are experimentally validated for accuracy by measuring the tip position, length, and bending and orientation angles for known circular and elliptical catheter-shaped tubes. A sensitivity analysis is also carried out to evaluate the robustness of the algorithm. Experimental results demonstrate good accuracy (maximum errors of ±0.6 mm and ±0.5 deg), performance (200 Hz), and robustness (maximum absolute error of 1.74 mm, 3.64 deg for the added noise) of the proposed high-speed algorithms.

  4. Fast parallel MR image reconstruction via B1-based, adaptive restart, iterative soft thresholding algorithms (BARISTA).

    PubMed

    Muckley, Matthew J; Noll, Douglas C; Fessler, Jeffrey A

    2015-02-01

    Sparsity-promoting regularization is useful for combining compressed sensing assumptions with parallel MRI for reducing scan time while preserving image quality. Variable splitting algorithms are the current state-of-the-art algorithms for SENSE-type MR image reconstruction with sparsity-promoting regularization. These methods are very general and have been observed to work with almost any regularizer; however, the tuning of associated convergence parameters is a commonly-cited hindrance in their adoption. Conversely, majorize-minimize algorithms based on a single Lipschitz constant have been observed to be slow in shift-variant applications such as SENSE-type MR image reconstruction since the associated Lipschitz constants are loose bounds for the shift-variant behavior. This paper bridges the gap between the Lipschitz constant and the shift-variant aspects of SENSE-type MR imaging by introducing majorizing matrices in the range of the regularizer matrix. The proposed majorize-minimize methods (called BARISTA) converge faster than state-of-the-art variable splitting algorithms when combined with momentum acceleration and adaptive momentum restarting. Furthermore, the tuning parameters associated with the proposed methods are unitless convergence tolerances that are easier to choose than the constraint penalty parameters required by variable splitting algorithms.
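    For context, the single-Lipschitz-constant baseline that BARISTA accelerates is ordinary iterative soft thresholding; a minimal sketch is shown below, with the forward operator and regularization weight left as assumed inputs (this is not the BARISTA algorithm itself).

```python
import numpy as np

def soft(x, t):
    """Complex-safe soft-thresholding (proximal operator of the l1 norm)."""
    mag = np.abs(x)
    return np.where(mag > t, (1.0 - t / (mag + 1e-12)) * x, 0.0)

def ista(A, AH, y, lam, lipschitz, n_iter=50, x0=None):
    """Plain iterative soft thresholding for min_x 0.5*||A x - y||^2 + lam*||x||_1,
    using a single Lipschitz constant (the slow baseline discussed in the abstract).

    A, AH : callables applying the forward operator and its adjoint
    """
    x = np.zeros_like(AH(y)) if x0 is None else x0
    step = 1.0 / lipschitz
    for _ in range(n_iter):
        grad = AH(A(x) - y)                     # gradient of the data-fit term
        x = soft(x - step * grad, lam * step)   # proximal (shrinkage) step
    return x
```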

  5. Fast Parallel MR Image Reconstruction via B1-based, Adaptive Restart, Iterative Soft Thresholding Algorithms (BARISTA)

    PubMed Central

    Noll, Douglas C.; Fessler, Jeffrey A.

    2014-01-01

    Sparsity-promoting regularization is useful for combining compressed sensing assumptions with parallel MRI for reducing scan time while preserving image quality. Variable splitting algorithms are the current state-of-the-art algorithms for SENSE-type MR image reconstruction with sparsity-promoting regularization. These methods are very general and have been observed to work with almost any regularizer; however, the tuning of associated convergence parameters is a commonly-cited hindrance in their adoption. Conversely, majorize-minimize algorithms based on a single Lipschitz constant have been observed to be slow in shift-variant applications such as SENSE-type MR image reconstruction since the associated Lipschitz constants are loose bounds for the shift-variant behavior. This paper bridges the gap between the Lipschitz constant and the shift-variant aspects of SENSE-type MR imaging by introducing majorizing matrices in the range of the regularizer matrix. The proposed majorize-minimize methods (called BARISTA) converge faster than state-of-the-art variable splitting algorithms when combined with momentum acceleration and adaptive momentum restarting. Furthermore, the tuning parameters associated with the proposed methods are unitless convergence tolerances that are easier to choose than the constraint penalty parameters required by variable splitting algorithms. PMID:25330484

  6. Remote sensing image denoising application by generalized morphological component analysis

    NASA Astrophysics Data System (ADS)

    Yu, Chong; Chen, Xiong

    2014-12-01

    In this paper, we introduce a remote sensing image denoising method based on generalized morphological component analysis (GMCA). This novel algorithm is a further extension of the morphological component analysis (MCA) algorithm to the blind source separation framework. The iterative thresholding strategy adopted by the GMCA algorithm first works on the most significant features in the image, and then progressively incorporates smaller features to finely tune the parameters of the whole model. A mathematical analysis of the computational complexity of the GMCA algorithm is provided. Several comparison experiments with state-of-the-art denoising algorithms are reported. In order to make a quantitative assessment of the algorithms in the experiments, the Peak Signal to Noise Ratio (PSNR) index and the Structural Similarity (SSIM) index are calculated to assess the denoising effect from the gray-level fidelity aspect and the structure-level fidelity aspect, respectively. Quantitative analysis of the experimental results, which is consistent with the visual effect of the denoised images, shows that the GMCA algorithm provides strong remote sensing image denoising performance. Visually, it is even hard to distinguish the original noiseless image from the image recovered by the GMCA algorithm.
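    The PSNR figure of merit used above has a simple closed form; a minimal sketch follows (SSIM is typically taken from an existing implementation such as skimage.metrics.structural_similarity rather than re-derived).

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB, as used to score gray-level fidelity."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(test, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```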

  7. Model Based Optimal Sensor Network Design for Condition Monitoring in an IGCC Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Rajeeva; Kumar, Aditya; Dai, Dan

    2012-12-31

    This report summarizes the achievements and final results of this program. The objective of this program is to develop a general model-based sensor network design methodology and tools to address key issues in the design of an optimal sensor network configuration: the type, location and number of sensors used in a network, for online condition monitoring. In particular, the focus in this work is to develop software tools for optimal sensor placement (OSP) and use these tools to design optimal sensor network configurations for online condition monitoring of gasifier refractory wear and radiant syngas cooler (RSC) fouling. The methodology developed will be applicable to sensing system design for online condition monitoring for a broad range of applications. The overall approach consists of (i) defining condition monitoring requirements in terms of OSP and mapping these requirements in mathematical terms for the OSP algorithm, (ii) analyzing trade-offs of alternate OSP algorithms, down-selecting the most relevant ones and developing them for IGCC applications, (iii) enhancing the gasifier and RSC models as required by the OSP algorithms, and (iv) applying the developed OSP algorithm to design the optimal sensor network required for condition monitoring of IGCC gasifier refractory and RSC fouling. Two key requirements for OSP for condition monitoring are the desired precision for the monitoring variables (e.g., refractory wear) and the reliability of the proposed sensor network in the presence of expected sensor failures. The OSP problem is naturally posed within a Kalman filtering approach as an integer programming problem where the key requirements of precision and reliability are imposed as constraints. The optimization is performed over the overall network cost. Based on an extensive literature survey, two formulations were identified as being relevant to OSP for condition monitoring: one based on an LMI formulation and the other being a standard INLP formulation. Various algorithms to solve these two formulations were developed and validated. For a given OSP problem the computational efficiency largely depends on the “size” of the problem. Initially a simplified 1-D gasifier model assuming axial and azimuthal symmetry was used to test the various OSP algorithms. Finally these algorithms were used to design the optimal sensor network for condition monitoring of IGCC gasifier refractory wear and RSC fouling. The sensor types and locations obtained as the solution to the OSP problem were validated using a model-based sensing approach. The OSP algorithm has been developed in a modular form and has been packaged as a software tool for OSP design, where a designer can explore various OSP design algorithms in a user-friendly way. The OSP software tool is implemented in-house in Matlab/Simulink©. The tool also uses a few optimization routines that are freely available on the World Wide Web. In addition, a modular Extended Kalman Filter (EKF) block has also been developed in Matlab/Simulink©, which can be utilized for model-based sensing of important process variables that are not directly measured, by combining the online sensors with model-based estimation once the hardware sensors and their locations have been finalized. The OSP algorithm details and the results of applying these algorithms to obtain optimal sensor locations for condition monitoring of gasifier refractory wear and RSC fouling profiles are summarized in this final report.

  8. The IEEE GRSS Standardized Remote Sensing Data Website: A Step Towards "Science 2.0" in Remote Sensing

    NASA Astrophysics Data System (ADS)

    Dell'Acqua, Fabio; Iannelli, Gianni Cristian; Kerekes, John; Lisini, Gianni; Moser, Gabriele; Ricardi, Niccolo; Pierce, Leland

    2016-08-01

    The issue of homogeneity in the performance assessment of proposed algorithms for information extraction is widely perceived in the Earth Observation (EO) domain as well. Different authors propose different datasets to test their developed algorithms, and it is frequently difficult for the reader to assess which is better for his/her specific application, given the wide variability in test sets that makes pure comparison of, e.g., accuracy values less meaningful than one would desire. With our work, we make a modest contribution to easing the problem by making it possible to automatically distribute a limited set of possible "standard" open datasets, together with some ground truth information, and to automatically assess processing results provided by the users.

  9. Resource sharing on CSMA/CD networks in the presence of noise. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Dinschel, Duane Edward

    1987-01-01

    Resource sharing on carrier sense multiple access with collision detection (CSMA/CD) networks can be accomplished by using window-control algorithms for bus contention. The window-control algorithms are designed to grant permission to transmit to the station with the minimum contention parameter. Proper operation of the window-control algorithm requires that all stations sense the same state of the network in each contention slot. Noise causes the state of the network to appear as a collision. False collisions can cause the window-control algorithm to terminate without isolating any stations. A two-phase window-control protocol and an approximate recurrence equation with noise as a parameter are developed to improve the performance of the window-control algorithms in the presence of noise. The results are compared through simulation, with the approximate recurrence equation yielding the best overall performance. Noise is an even bigger problem when it is not detected by all stations. In such cases it is possible for the window boundaries of the contending stations to become out of phase. Consequently, it is possible to isolate a station other than the one with the minimum contention parameter. To guarantee proper isolation of the minimum, a broadcast phase must be added after the termination of the algorithm. The protocol required to correct the window-control algorithm when noise is not detected by all stations is discussed.
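    A simplified, noise-free sketch of the window-control contention idea, binary-searching the window until a single station's contention parameter falls inside it; the termination cap and tie handling are illustrative assumptions, and the two-phase noise-tolerant protocol described above is not reproduced here.

```python
def window_contention(params, lo, hi, max_slots=32):
    """Simplified window-control contention (noise-free case): binary-search the
    window until exactly one station's contention parameter falls inside it.

    params : list of contention parameters, one per station
    lo, hi : initial window bounds covering all parameters
    Returns the index of the isolated (minimum-parameter) station, or None.
    """
    low, high = lo, hi
    for _ in range(max_slots):
        mid = (low + high) / 2.0
        inside = [i for i, p in enumerate(params) if low <= p <= mid]
        if len(inside) == 1:          # success: one station transmits alone
            return inside[0]
        if len(inside) == 0:          # idle slot: the minimum lies above mid
            low = mid
        else:                         # collision: shrink the window downward
            high = mid
    return None
```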

  10. RF tomography of metallic objects in free space: preliminary results

    NASA Astrophysics Data System (ADS)

    Li, Jia; Ewing, Robert L.; Berdanier, Charles; Baker, Christopher

    2015-05-01

    RF tomography has great potential in defense and homeland security applications. A distributed sensing research facility is under development at Air Force Research Lab. To develop a RF tomographic imaging system for the facility, preliminary experiments have been performed in an indoor range with 12 radar sensors distributed on a circle of 3m radius. Ultra-wideband pulses are used to illuminate single and multiple metallic targets. The echoes received by distributed sensors were processed and combined for tomography reconstruction. Traditional matched filter algorithm and truncated singular value decomposition (SVD) algorithm are compared in terms of their complexity, accuracy, and suitability for distributed processing. A new algorithm is proposed for shape reconstruction, which jointly estimates the object boundary and scatter points on the waveform's propagation path. The results show that the new algorithm allows accurate reconstruction of object shape, which is not available through the matched filter and truncated SVD algorithms.

  11. An Innovative Thinking-Based Intelligent Information Fusion Algorithm

    PubMed Central

    Hu, Liang; Liu, Gang; Zhou, Jin

    2013-01-01

    This study proposes an intelligent algorithm that can realize information fusion, drawing on research achievements in brain cognitive theory and innovative computation. The algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. The five key parts of the algorithm, including information sense and perception, memory storage, divergent thinking, convergent thinking, and an evaluation system, are simulated and modeled. The algorithm fully develops the innovative thinking capability of knowledge in information fusion and is an attempt to convert the abstract concepts of brain cognitive science into specific and operable research routes and strategies. Furthermore, the influence of each parameter of the algorithm on its performance is analyzed and compared with that of classical intelligent algorithms through tests. Test results suggest that the proposed algorithm can obtain the optimum problem solution with fewer target evaluations, improve optimization effectiveness, and achieve the effective fusion of information. PMID:23956699

  12. An innovative thinking-based intelligent information fusion algorithm.

    PubMed

    Lu, Huimin; Hu, Liang; Liu, Gang; Zhou, Jin

    2013-01-01

    This study proposes an intelligent algorithm that can realize information fusion, drawing on research achievements in brain cognitive theory and innovative computation. The algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. The five key parts of the algorithm, including information sense and perception, memory storage, divergent thinking, convergent thinking, and an evaluation system, are simulated and modeled. The algorithm fully develops the innovative thinking capability of knowledge in information fusion and is an attempt to convert the abstract concepts of brain cognitive science into specific and operable research routes and strategies. Furthermore, the influence of each parameter of the algorithm on its performance is analyzed and compared with that of classical intelligent algorithms through tests. Test results suggest that the proposed algorithm can obtain the optimum problem solution with fewer target evaluations, improve optimization effectiveness, and achieve the effective fusion of information.

  13. Terrestrial Planet Finder Coronagraph and Enabling Technologies

    NASA Technical Reports Server (NTRS)

    Ford, Virginia G.

    2005-01-01

    Starlight suppression research is advancing rapidly toward the required contrast ratio. The current analysis of the TPF Coronagraph system indicates that it is feasible to achieve the required stability by using developing technologies: a) wavefront sensing and control (DMs, control algorithms, and sensing); b) laser metrology. Still needed are: a) property data measured with great precision in the required environments; b) modeling tools that are verified with testbeds.

  14. Objected-oriented remote sensing image classification method based on geographic ontology model

    NASA Astrophysics Data System (ADS)

    Chu, Z.; Liu, Z. J.; Gu, H. Y.

    2016-11-01

    Nowadays, with the development of high resolution remote sensing imagery and the wide application of laser point cloud data, object-oriented remote sensing classification based on the characteristic knowledge of multi-source spatial data has become an important trend in the field of remote sensing image classification, gradually replacing the traditional approach of improving algorithms to optimize classification results. For this purpose, the paper puts forward a remote sensing image classification method that uses the characteristic knowledge of multi-source spatial data to build a geographic ontology semantic network model, and carries out an object-oriented classification experiment for urban feature classification. The experiment uses the Protégé software developed by Stanford University in the United States and the intelligent image analysis software eCognition as the experimental platform, and uses hyperspectral imagery and Lidar data obtained by flight over DaFeng City, JiangSu, as the main data sources. First, the hyperspectral image is used to obtain feature knowledge of the remote sensing image and related spectral indices. Second, the Lidar data are used to generate an nDSM (Normalized Digital Surface Model) to obtain elevation information. Finally, the image feature knowledge, spectral indices, and elevation information are combined to build the geographic ontology semantic network model that implements urban feature classification. The experimental results show that this method achieves significantly higher classification accuracy than traditional classification algorithms, and it performs particularly well for building classification. The method not only takes advantage of multi-source spatial data such as remote sensing imagery and Lidar data, but also realizes the integration of multi-source spatial data knowledge and its application to remote sensing image classification, which provides an effective way forward for object-oriented remote sensing image classification.

  15. Development of a Cost-Effective Airborne Remote Sensing System for Coastal Monitoring

    PubMed Central

    Kim, Duk-jin; Jung, Jungkyo; Kang, Ki-mook; Kim, Seung Hee; Xu, Zhen; Hensley, Scott; Swan, Aaron; Duersch, Michael

    2015-01-01

    Coastal lands and nearshore marine areas are productive and rapidly changing places. However, these areas face many environmental challenges related to climate change and human-induced impacts. Space-borne remote sensing systems may be restricted in monitoring these areas because of their spatial and temporal resolutions. In situ measurements are also constrained from accessing the area and obtaining wide-coverage data. In these respects, airborne remote sensing sensors could be the most appropriate tools for monitoring these coastal areas. In this study, a cost-effective airborne remote sensing system with synthetic aperture radar and thermal infrared sensors was implemented to survey coastal areas. Calibration techniques and geophysical model algorithms were developed for the airborne system to observe the topography of intertidal flats, coastal sea surface current, sea surface temperature, and submarine groundwater discharge. PMID:26437413

  16. Development of a Cost-Effective Airborne Remote Sensing System for Coastal Monitoring.

    PubMed

    Kim, Duk-jin; Jung, Jungkyo; Kang, Ki-mook; Kim, Seung Hee; Xu, Zhen; Hensley, Scott; Swan, Aaron; Duersch, Michael

    2015-09-30

    Coastal lands and nearshore marine areas are productive and rapidly changing places. However, these areas face many environmental challenges related to climate change and human-induced impacts. Space-borne remote sensing systems may be restricted in monitoring these areas because of their spatial and temporal resolutions. In situ measurements are also constrained from accessing the area and obtaining wide-coverage data. In these respects, airborne remote sensing sensors could be the most appropriate tools for monitoring these coastal areas. In this study, a cost-effective airborne remote sensing system with synthetic aperture radar and thermal infrared sensors was implemented to survey coastal areas. Calibration techniques and geophysical model algorithms were developed for the airborne system to observe the topography of intertidal flats, coastal sea surface current, sea surface temperature, and submarine groundwater discharge.

  17. Disaggregation Of Passive Microwave Soil Moisture For Use In Watershed Hydrology Applications

    NASA Astrophysics Data System (ADS)

    Fang, Bin

    In recent years, passive microwave remote sensing has been providing soil moisture products from instruments on board satellite and airborne platforms. Spatial resolution is restricted by the antenna diameter, which is inversely proportional to resolution. As a result, typical products have a spatial resolution of tens of kilometers, which is not compatible with some hydrological research applications. For this reason, this dissertation proposes and implements three disaggregation algorithms that estimate L-band passive microwave soil moisture at the subpixel level by using higher spatial resolution remote sensing products from optical and radar instruments. The first technique uses thermal inertia theory to establish a relationship between daily temperature change and average soil moisture, modulated by vegetation condition; it was developed using NLDAS, AVHRR, SPOT, and MODIS data and applied to disaggregate the 25 km AMSR-E soil moisture to 1 km in Oklahoma. The second algorithm builds on semi-empirical physical models (NP89 and LP92), derived from numerical experiments relating soil evaporation efficiency to soil moisture over the surface skin sensing depth (a few millimeters); it uses soil temperature simulated from MODIS and NLDAS together with the 25 km AMSR-E soil moisture to disaggregate the coarse resolution soil moisture to 1 km in Oklahoma. The third algorithm models the relationship between the change in co-polarized radar backscatter and the change in remotely sensed microwave soil moisture retrievals, assuming that the change in soil moisture is a function of canopy opacity only. This change detection algorithm was implemented using aircraft-based remote sensing data from PALS and UAVSAR collected during SMAPVEX12 in southern Manitoba, Canada. The PALS L-band h-polarization radiometer soil moisture retrievals were disaggregated by combining them with the PALS and UAVSAR L-band hh-polarization radar data at spatial resolutions of 1500 m and 5 m/800 m, respectively. All three algorithms were validated using ground measurements from in situ network stations or handheld hydra probes. The validation results demonstrate the practicability of disaggregating coarse resolution passive microwave soil moisture products.

  18. Image quality enhancement in low-light-level ghost imaging using modified compressive sensing method

    NASA Astrophysics Data System (ADS)

    Shi, Xiaohui; Huang, Xianwei; Nan, Suqin; Li, Hengxing; Bai, Yanfeng; Fu, Xiquan

    2018-04-01

    Detector noise has a significantly negative impact on ghost imaging at low light levels, especially for existing recovery algorithms. Based on the characteristics of additive detector noise, a method named modified compressive sensing ghost imaging is proposed to reduce the background imposed by the randomly distributed detector noise in the signal path. Experimental results show that, with an appropriate choice of threshold value, the modified compressive sensing ghost imaging algorithm can dramatically enhance the contrast-to-noise ratio of the object reconstruction compared with traditional ghost imaging and compressive sensing ghost imaging methods. The relationship between the contrast-to-noise ratio of the reconstructed image and the intensity ratio (namely, the ratio of average signal intensity to average noise intensity) for the three reconstruction algorithms is also discussed. This noise suppression imaging technique will have important applications in remote sensing and security areas.

  19. Advances in Landslide Nowcasting: Evaluation of a Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia Bach; Peters-Lidard, Christa; Adler, Robert; Hong, Yang; Kumar, Sujay; Lerner-Lam, Arthur

    2011-01-01

    The increasing availability of remotely sensed data offers a new opportunity to address landslide hazard assessment at larger spatial scales. A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that may experience landslide activity. This system combines a calculation of static landslide susceptibility with satellite-derived rainfall estimates and uses a threshold approach to generate a set of nowcasts that classify potentially hazardous areas. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale near-real-time landslide hazard assessment efforts, it requires several modifications before it can be fully realized as an operational tool. This study draws upon the prior work's recommendations to develop a new approach for considering landslide susceptibility and hazard at the regional scale. This case study calculates a regional susceptibility map using remotely sensed and in situ information and a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America. The susceptibility map is evaluated with a regional rainfall intensity-duration triggering threshold, and results are compared with the global algorithm framework for the same event. Evaluation of this regional system suggests that this empirically based approach provides one plausible way to address some of the data and resolution issues identified in the global assessment. The presented methodology is straightforward to implement, improves upon the global approach, and allows results to be transferable between regions. The results also highlight several remaining challenges, including the empirical nature of the algorithm framework and adequate information for algorithm validation. Conclusions suggest that integrating additional triggering factors such as soil moisture may help to improve algorithm accuracy. The regional algorithm scenario represents an important step forward in advancing regional and global-scale landslide hazard assessment.
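
    The intensity-duration triggering threshold mentioned above typically takes a power-law form I = a * D^b. The sketch below shows how a nowcast decision could combine such a threshold with a susceptibility flag; the coefficient values are placeholders, not those derived in the study.

    def exceeds_id_threshold(intensity_mm_per_hr, duration_hr, a=10.0, b=-0.6):
        """True when rainfall intensity exceeds a power-law I-D threshold for the given duration."""
        return intensity_mm_per_hr > a * duration_hr ** b

    def landslide_nowcast(susceptible, intensity_mm_per_hr, duration_hr):
        """Issue a nowcast only where the cell is susceptible AND rainfall exceeds the threshold."""
        return susceptible and exceeds_id_threshold(intensity_mm_per_hr, duration_hr)

    print(landslide_nowcast(True, intensity_mm_per_hr=12.0, duration_hr=6.0))   # True for these placeholder values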

  20. Retrieving Atmospheric Profiles Data in the Presence of Clouds from Hyperspectral Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Liu, Xu; Larar, Allen M.; Zhou, Daniel K.; Kizer, Susan H.; Wu, Wan; Barnet, Christopher; Divakarla, Murty; Guo, Guang; Blackwell, Bill; Smith, William L.; hide

    2011-01-01

    Different methods for retrieving atmospheric profiles in the presence of clouds from hyperspectral satellite remote sensing data will be described. We will present results from the JPSS cloud-clearing algorithm and NASA Langley cloud retrieval algorithm.

  1. Quantitative interpretation of Great Lakes remote sensing data

    NASA Technical Reports Server (NTRS)

    Shook, D. F.; Salzman, J.; Svehla, R. A.; Gedney, R. T.

    1980-01-01

    The paper discusses the quantitative interpretation of Great Lakes remote sensing water quality data. Remote sensing using color information must take into account (1) the existence of many different organic and inorganic species throughout the Great Lakes, (2) the occurrence of a mixture of species in most locations, and (3) spatial variations in types and concentration of species. The radiative transfer model provides a potential method for an orderly analysis of remote sensing data and a physical basis for developing quantitative algorithms. Predictions and field measurements of volume reflectances are presented which show the advantage of using a radiative transfer model. Spectral absorptance and backscattering coefficients for two inorganic sediments are reported.

  2. Cloudnet Project

    DOE Data Explorer

    Hogan, Robin

    2008-01-15

    Cloudnet is a research project supported by the European Commission. This project aims to use data obtained quasi-continuously for the development and implementation of cloud remote sensing synergy algorithms. The use of active instruments (lidar and radar) results in detailed vertical profiles of important cloud parameters which cannot be derived from current satellite sensing techniques. A network of three already existing cloud remote sensing stations (CRS-stations) will be operated for a two-year period; activities will be co-ordinated, data formats harmonised, and analysis of the data performed to evaluate the representation of clouds in four major European weather forecast models.

  3. Adaptive Cross-correlation Algorithm and Experiment of Extended Scene Shack-Hartmann Wavefront Sensing

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin; Morgan, Rhonda M.; Green, Joseph J.; Ohara, Catherine M.; Redding, David C.

    2007-01-01

    We have developed a new, adaptive cross-correlation (ACC) algorithm to estimate with high accuracy shifts as large as several pixels between two extended-scene images captured by a Shack-Hartmann wavefront sensor (SH-WFS). It determines the positions of all of the extended-scene image cells relative to a reference cell using an FFT-based iterative image-shifting algorithm. It works with both point-source spot images and extended-scene images. We have also set up a testbed for an extended-scene SH-WFS, and tested the ACC algorithm with measured data from both point-source and extended-scene images. In this paper we describe our algorithm and present our experimental results.
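
    A minimal sketch of the FFT-based shift estimation underlying such cross-correlation algorithms is given below (integer-pixel shifts only; the published ACC algorithm iterates and refines this to subpixel accuracy). The image sizes and data are assumptions for illustration.

    import numpy as np

    def fft_shift_estimate(ref, img):
        """Estimate the (row, col) integer shift between two image cells via FFT cross-correlation."""
        corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Wrap peaks in the upper half of each axis to negative shifts
        return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

    rng = np.random.default_rng(1)
    ref = rng.random((64, 64))
    img = np.roll(ref, shift=(3, -2), axis=(0, 1))   # known shift for testing
    print(fft_shift_estimate(ref, img))              # (-3, 2): displacement of ref relative to img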

  4. Development of an Aircraft Approach and Departure Atmospheric Profile Generation Algorithm

    NASA Technical Reports Server (NTRS)

    Buck, Bill K.; Velotas, Steven G.; Rutishauser, David K. (Technical Monitor)

    2004-01-01

    In support of NASA's Virtual Airspace Modeling and Simulation (VAMS) project, an effort was initiated to develop and test techniques for extracting meteorological data from landing and departing aircraft, and for building altitude-based profiles of key meteorological parameters from these data. The generated atmospheric profiles will be used as inputs to NASA's Aircraft Vortex Spacing System (AVOSS) Prediction Algorithm (APA) for benefits and trade analysis. A Wake Vortex Advisory System (WakeVAS) is being developed to apply weather and wake prediction and sensing technologies with procedures to reduce current wake separation criteria when safe and appropriate, in order to increase airport operational efficiency. The purpose of this report is to document the initial theory and design of the Aircraft Approach and Departure Atmospheric Profile Generation Algorithm.

  5. A comparison of PCA/ICA for data preprocessing in remote sensing imagery classification

    NASA Astrophysics Data System (ADS)

    He, Hui; Yu, Xianchuan

    2005-10-01

    In this paper a performance comparison of several data preprocessing algorithms for remote sensing image classification is presented. The selected algorithms are principal component analysis (PCA) and three independent component analysis (ICA) variants: Fast-ICA (Hyvarinen, 1999), Kernel-ICA (KCCA and KGV; Bach & Jordan, 2002), and EFFICA (Chen & Bickel, 2003). These algorithms were applied to a remote sensing image (1600×1197) obtained from Shunyi, Beijing. For classification, a maximum likelihood classifier (MLC) is applied to the raw and preprocessed data. The results show that classification with the preprocessed data yields more reliable results than classification with the raw data; among the preprocessing algorithms, the ICA algorithms improve on PCA, and EFFICA performs better than the others. The convergence of these ICA algorithms (for more than a million data points) is also studied; the results show that EFFICA converges much faster than the others. Furthermore, because EFFICA is a one-step maximum likelihood estimate (MLE) that reaches asymptotic Fisher efficiency, its computational burden is small and its memory demand is greatly reduced, which resolves the "out of memory" problem encountered with the other algorithms.
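
    A minimal sketch of the preprocessing-then-MLC comparison described above is shown below, using scikit-learn's PCA and FastICA and a Gaussian quadratic discriminant as a stand-in for the maximum likelihood classifier. The synthetic six-band data replace the Shunyi imagery, and EFFICA and Kernel-ICA are not included.

    import numpy as np
    from sklearn.decomposition import PCA, FastICA
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis  # Gaussian MLC stand-in

    rng = np.random.default_rng(0)
    X = rng.random((5000, 6))                                    # stand-in for 6-band pixel spectra
    y = (X[:, 0] + 0.3 * rng.standard_normal(5000) > 0.5).astype(int)

    for name, transform in [("raw", None),
                            ("PCA", PCA(n_components=3)),
                            ("FastICA", FastICA(n_components=3, random_state=0))]:
        Z = X if transform is None else transform.fit_transform(X)
        clf = QuadraticDiscriminantAnalysis().fit(Z, y)
        print(name, round(clf.score(Z, y), 3))                   # training accuracy per preprocessing method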

  6. Validation of a wireless modular monitoring system for structures

    NASA Astrophysics Data System (ADS)

    Lynch, Jerome P.; Law, Kincho H.; Kiremidjian, Anne S.; Carryer, John E.; Kenny, Thomas W.; Partridge, Aaron; Sundararajan, Arvind

    2002-06-01

    A wireless sensing unit for use in a Wireless Modular Monitoring System (WiMMS) has been designed and constructed. Drawing upon advanced technological developments in the areas of wireless communications, low-power microprocessors and micro-electro mechanical system (MEMS) sensing transducers, the wireless sensing unit represents a high-performance yet low-cost solution to monitoring the short-term and long-term performance of structures. A sophisticated reduced instruction set computer (RISC) microcontroller is placed at the core of the unit to accommodate on-board computations, measurement filtering and data interrogation algorithms. The functionality of the wireless sensing unit is validated through various experiments involving multiple sensing transducers interfaced to the sensing unit. In particular, MEMS-based accelerometers are used as the primary sensing transducer in this study's validation experiments. A five degree of freedom scaled test structure mounted upon a shaking table is employed for system validation.

  7. Local multiplicative Schwarz algorithms for convection-diffusion equations

    NASA Technical Reports Server (NTRS)

    Cai, Xiao-Chuan; Sarkis, Marcus

    1995-01-01

    We develop a new class of overlapping Schwarz type algorithms for solving scalar convection-diffusion equations discretized by finite element or finite difference methods. The preconditioners consist of two components, namely, the usual two-level additive Schwarz preconditioner and the sum of some quadratic terms constructed by using products of ordered neighboring subdomain preconditioners. The ordering of the subdomain preconditioners is determined by considering the direction of the flow. We prove that the algorithms are optimal in the sense that the convergence rates are independent of the mesh size, as well as the number of subdomains. We show by numerical examples that the new algorithms are less sensitive to the direction of the flow than the classical multiplicative Schwarz algorithms, and converge faster than the additive Schwarz algorithms. Thus, the new algorithms are more suitable for fluid flow applications than the classical additive or multiplicative Schwarz algorithms.

  8. Flight State Identification of a Self-Sensing Wing via an Improved Feature Selection Method and Machine Learning Approaches.

    PubMed

    Chen, Xi; Kopsaftopoulos, Fotis; Wu, Qi; Ren, He; Chang, Fu-Kuo

    2018-04-29

    In this work, a data-driven approach for identifying the flight state of a self-sensing wing structure with an embedded multi-functional sensing network is proposed. The flight state is characterized by the structural vibration signals recorded from a series of wind tunnel experiments under varying angles of attack and airspeeds. A large feature pool is created by extracting potential features from the signals covering the time domain, the frequency domain as well as the information domain. Special emphasis is given to feature selection in which a novel filter method is developed based on the combination of a modified distance evaluation algorithm and a variance inflation factor. Machine learning algorithms are then employed to establish the mapping relationship from the feature space to the practical state space. Results from two case studies demonstrate the high identification accuracy and the effectiveness of the model complexity reduction via the proposed method, thus providing new perspectives of self-awareness towards the next generation of intelligent air vehicles.
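
    The variance inflation factor used in the feature selection step can be computed as VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing feature j on the remaining features. The sketch below is a generic implementation under that standard definition, not the authors' code; the test data are random placeholders.

    import numpy as np

    def vif(X):
        """Variance inflation factor of each column of the feature matrix X (rows = samples)."""
        out = []
        for j in range(X.shape[1]):
            design = np.column_stack([np.delete(X, j, axis=1), np.ones(len(X))])
            beta, *_ = np.linalg.lstsq(design, X[:, j], rcond=None)
            resid = X[:, j] - design @ beta
            r2 = 1.0 - resid.var() / X[:, j].var()
            out.append(1.0 / max(1.0 - r2, 1e-12))               # guard against perfect collinearity
        return np.array(out)

    X = np.random.default_rng(0).random((200, 4))
    print(np.round(vif(X), 2))                                   # values near 1 indicate little redundancy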

  9. A new optimization method using a compressed sensing inspired solver for real-time LDR-brachytherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Guthier, C.; Aschenbrenner, K. P.; Buergy, D.; Ehmann, M.; Wenz, F.; Hesser, J. W.

    2015-03-01

    This work discusses a novel strategy for inverse planning in low dose rate brachytherapy. It applies the idea of compressed sensing to the problem of inverse treatment planning and a new solver for this formulation is developed. An inverse planning algorithm was developed incorporating brachytherapy dose calculation methods as recommended by AAPM TG-43. For optimization of the functional a new variant of a matching pursuit type solver is presented. The results are compared with current state-of-the-art inverse treatment planning algorithms by means of real prostate cancer patient data. The novel strategy outperforms the best state-of-the-art methods in speed, while achieving comparable quality. It is able to find solutions with comparable values for the objective function and it achieves these results within a few microseconds, being up to 542 times faster than competing state-of-the-art strategies, allowing real-time treatment planning. The sparse solution of inverse brachytherapy planning achieved with methods from compressed sensing is a new paradigm for optimization in medical physics. Through the sparsity of required needles and seeds identified by this method, the cost of intervention may be reduced.

  10. A new optimization method using a compressed sensing inspired solver for real-time LDR-brachytherapy treatment planning.

    PubMed

    Guthier, C; Aschenbrenner, K P; Buergy, D; Ehmann, M; Wenz, F; Hesser, J W

    2015-03-21

    This work discusses a novel strategy for inverse planning in low dose rate brachytherapy. It applies the idea of compressed sensing to the problem of inverse treatment planning and a new solver for this formulation is developed. An inverse planning algorithm was developed incorporating brachytherapy dose calculation methods as recommended by AAPM TG-43. For optimization of the functional a new variant of a matching pursuit type solver is presented. The results are compared with current state-of-the-art inverse treatment planning algorithms by means of real prostate cancer patient data. The novel strategy outperforms the best state-of-the-art methods in speed, while achieving comparable quality. It is able to find solutions with comparable values for the objective function and it achieves these results within a few microseconds, being up to 542 times faster than competing state-of-the-art strategies, allowing real-time treatment planning. The sparse solution of inverse brachytherapy planning achieved with methods from compressed sensing is a new paradigm for optimization in medical physics. Through the sparsity of required needles and seeds identified by this method, the cost of intervention may be reduced.

  11. Estimation's Role in Calculations with Fractions

    ERIC Educational Resources Information Center

    Johanning, Debra I.

    2011-01-01

    Estimation is more than a skill or an isolated topic. It is a thinking tool that needs to be emphasized during instruction so that students will learn to develop algorithmic procedures and meaning for fraction operations. For students to realize when fractions should be added, subtracted, multiplied, or divided, they need to develop a sense of…

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, K; Huang, T; Buttler, D

    We present the C-Cat Wordnet package, an open source library for using and modifying Wordnet. The package includes four key features: an API for modifying Synsets; implementations of standard similarity metrics; implementations of well-known Word Sense Disambiguation algorithms; and an implementation of the Castanet algorithm. The library is easily extendible and usable in many runtime environments. We demonstrate its use on two standard Word Sense Disambiguation tasks and apply the Castanet algorithm to a corpus.

  13. Contextual classification on a CDC Flexible Processor system. [for photomapped remote sensing data

    NASA Technical Reports Server (NTRS)

    Smith, B. W.; Siegel, H. J.; Swain, P. H.

    1981-01-01

    A potential hardware organization for the Flexible Processor Array is presented. An algorithm that implements a contextual classifier for remote sensing data analysis is given, along with uniprocessor classification algorithms. The Flexible Processor algorithm is provided, as are simulated timings for contextual classifiers run on the Flexible Processor Array and another system. The timings are analyzed for context neighborhoods of sizes three and nine.

  14. Algorithms and Sensors for Small Robot Path Following

    NASA Technical Reports Server (NTRS)

    Hogg, Robert W.; Rankin, Arturo L.; Roumeliotis, Stergios I.; McHenry, Michael C.; Helmick, Daniel M.; Bergh, Charles F.; Matthies, Larry

    2002-01-01

    Tracked mobile robots in the 20 kg size class are under development for applications in urban reconnaissance. For efficient deployment, it is desirable for teams of robots to be able to automatically execute path following behaviors, with one or more followers tracking the path taken by a leader. The key challenges to enabling such a capability are (1) to develop sensor packages for such small robots that can accurately determine the path of the leader and (2) to develop path following algorithms for the subsequent robots. To date, we have integrated gyros, accelerometers, compass/inclinometers, odometry, and differential GPS into an effective sensing package. This paper describes the sensor package, sensor processing algorithm, and path tracking algorithm we have developed for the leader/follower problem in small robots and shows the result of performance characterization of the system. We also document pragmatic lessons learned about design, construction, and electromagnetic interference issues particular to the performance of state sensors on small robots.

  15. A new method of Quickbird own image fusion

    NASA Astrophysics Data System (ADS)

    Han, Ying; Jiang, Hong; Zhang, Xiuying

    2009-10-01

    With the rapid development of remote sensing technology, the means of accessing remote sensing data have become increasingly abundant, so the same area can form a large number of multi-temporal image sequences with different resolutions. At present, the main fusion methods are HPF, the IHS transform, PCA, Brovey, the Mallat algorithm, and the wavelet transform. The IHS transform suffers from serious spectral distortion, and the Mallat algorithm omits the low-frequency information of the high spatial resolution image, so its fusion results show obvious blocking effects. Wavelet multi-scale decomposition can achieve very good results for different sizes, directions, details, and edges, but different fusion rules and algorithms produce different effects. This article takes Quickbird own-image fusion as an example, comparing fusion based on the wavelet transform and HVS with fusion based on the wavelet transform and IHS. The results show that the former is better. This paper introduces the correlation coefficient, the relative average spectral error index, and other commonly used indices to evaluate image quality.

  16. A simulation of remote sensor systems and data processing algorithms for spectral feature classification

    NASA Technical Reports Server (NTRS)

    Arduini, R. F.; Aherron, R. M.; Samms, R. W.

    1984-01-01

    A computational model of the deterministic and stochastic processes involved in multispectral remote sensing was designed to evaluate the performance of sensor systems and data processing algorithms for spectral feature classification. Accuracy in distinguishing between categories of surfaces or between specific types is developed as a means to compare sensor systems and data processing algorithms. The model allows studies to be made of the effects of variability of the atmosphere and of surface reflectance, as well as the effects of channel selection and sensor noise. Examples of these effects are shown.

  17. Research Issues in Image Registration for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Eastman, Roger D.; LeMoigne, Jacqueline; Netanyahu, Nathan S.

    2007-01-01

    Image registration is an important element in data processing for remote sensing with many applications and a wide range of solutions. Despite considerable investigation the field has not settled on a definitive solution for most applications and a number of questions remain open. This article looks at selected research issues by surveying the experience of operational satellite teams, application-specific requirements for Earth science, and our experiments in the evaluation of image registration algorithms with emphasis on the comparison of algorithms for subpixel accuracy. We conclude that remote sensing applications put particular demands on image registration algorithms to take into account domain-specific knowledge of geometric transformations and image content.

  18. Optimalisation of remote sensing algorithm in mapping of chlorophyl-a concentration at Pasuruan coastal based on surface reflectance images of Aqua Modis

    NASA Astrophysics Data System (ADS)

    Wibisana, H.; Zainab, S.; Dara K., A.

    2018-01-01

    Chlorophyll-a is one of the parameters used to detect the presence of fish populations, as well as one of the parameters used to characterize water quality. Research on chlorophyll concentrations has been extensive, including chlorophyll-a mapping using remote sensing satellites. Mapping of chlorophyll concentration is used to obtain an optimal picture of the condition of waters that are often used as fishing areas by fishermen. Remote sensing is a technological breakthrough for broadly monitoring the condition of waters, and to obtain a complete picture of aquatic conditions an algorithm is needed that can provide the chlorophyll concentration at points scattered across the capture fisheries research area. Remote sensing algorithms have been widely used by researchers to detect chlorophyll content, where the channels corresponding to the mapping of chlorophyll-a concentration from Landsat 8 images are bands 4, 3, and 2. With multiple channels from Landsat-8 satellite imagery used for chlorophyll detection, an optimum algorithm can be formulated to obtain maximum results for chlorophyll-a concentration in the research area. From the calculation of the remote sensing algorithm, the algorithm suited to conditions on the coast of Pasuruan can be determined: the green channel gives a fairly good correlation of R² = 0.853 with the algorithm Chlorophyll-a (mg/m³) = 0.093 R(0−)Red − 3.7049. From this result it can be concluded that the green channel correlates well with the concentration of chlorophyll scattered along the coast of Pasuruan.
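
    The empirical single-band algorithm reported above is, in essence, a linear regression of chlorophyll-a against a reflectance band. The sketch below shows the generic fitting and application step; the reflectance and in situ values are placeholders, not the Pasuruan data.

    import numpy as np

    band_rrs = np.array([0.010, 0.014, 0.018, 0.025, 0.031])     # placeholder surface reflectance values
    chl_insitu = np.array([1.2, 1.9, 2.4, 3.6, 4.1])             # placeholder in situ chlorophyll-a (mg/m3)

    slope, intercept = np.polyfit(band_rrs, chl_insitu, deg=1)   # fit Chl-a = slope * R + intercept
    chl_map = slope * band_rrs + intercept                       # apply the fitted algorithm to imagery pixels
    r2 = np.corrcoef(chl_insitu, chl_map)[0, 1] ** 2             # goodness of fit of the band algorithm
    print(round(slope, 1), round(intercept, 2), round(r2, 3))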

  19. Satellite Imagery Analysis for Nighttime Temperature Inversion Clouds

    NASA Technical Reports Server (NTRS)

    Kawamoto, K.; Minnis, P.; Arduini, R.; Smith, W., Jr.

    2001-01-01

    Clouds play important roles in the climate system. Their optical and microphysical properties, which largely determine their radiative properties, need to be investigated. Among several measurement means, satellite remote sensing appears to be the most promising. Since most cloud algorithms proposed so far are for daytime use and rely on solar radiation, Minnis et al. (1998) developed one for nighttime use based on the 3.7-, 11-, and 12-micron channels. Their algorithm, however, has the drawback that it is not able to treat temperature inversion cases. We update their algorithm by incorporating a new parameterization by Arduini et al. (1999) that is valid for temperature inversion cases. The updated algorithm has been applied to GOES satellite data and reasonable retrieval results were obtained.

  20. Parallel-Computing Architecture for JWST Wavefront-Sensing Algorithms

    DTIC Science & Technology

    2011-09-01

    Phase retrieval is an image-based wavefront-sensing technique. For broadband illumination problems, hand-tuning the matrix sizes was found to account for a speedup of up to 86x.

  1. Earth resources data analysis program, phase 2

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The efforts and findings of the Earth Resources Data Analysis Program are summarized. Results of a detailed study of the needs of EOD with respect to an applications development system (ADS) for the analysis of remotely sensed data, including an evaluation of four existing systems with respect to these needs, are described. Recommendations as to possible courses for EOD to follow to obtain a viable ADS are presented. Algorithmic development comprising several subtasks is discussed. These subtasks include the following: (1) two algorithms for multivariate density estimation; (2) a data smoothing algorithm; (3) a method for optimally estimating prior probabilities of unclassified data; and (4) further applications of the modified Cholesky decomposition in various calculations. Little effort was expended on task 3; however, two reports were reviewed.

  2. LAI inversion algorithm based on directional reflectance kernels.

    PubMed

    Tang, S; Chen, J M; Zhu, Q; Li, X; Chen, M; Sun, R; Zhou, Y; Deng, F; Xie, D

    2007-11-01

    Leaf area index (LAI) is an important ecological and environmental parameter. A new LAI algorithm is developed using the principles of ground LAI measurements based on canopy gap fraction. First, the relationship between LAI and gap fraction at various zenith angles is derived from the definition of LAI. Then, the directional gap fraction is acquired from a remote sensing bidirectional reflectance distribution function (BRDF) product. This acquisition is obtained by using a kernel driven model and a large-scale directional gap fraction algorithm. The algorithm has been applied to estimate a LAI distribution in China in mid-July 2002. The ground data acquired from two field experiments in Changbai Mountain and Qilian Mountain were used to validate the algorithm. To resolve the scale discrepancy between high resolution ground observations and low resolution remote sensing data, two TM images with a resolution approaching the size of ground plots were used to relate the coarse resolution LAI map to ground measurements. First, an empirical relationship between the measured LAI and a vegetation index was established. Next, a high resolution LAI map was generated using the relationship. The LAI value of a low resolution pixel was calculated from the area-weighted sum of high resolution LAIs composing the low resolution pixel. The results of this comparison showed that the inversion algorithm has an accuracy of 82%. Factors that may influence the accuracy are also discussed in this paper.
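
    The gap-fraction principle that the algorithm builds on can be written in Beer-Lambert form, P(theta) = exp(-G(theta) * LAI / cos(theta)), which inverts to LAI = -cos(theta) * ln(P(theta)) / G(theta). The sketch below assumes a constant leaf projection coefficient G; it illustrates the principle only, not the paper's kernel-driven BRDF implementation.

    import numpy as np

    def lai_from_gap_fraction(gap_fraction, zenith_deg, G=0.5):
        """Invert LAI from directional gap fraction, assuming projection coefficient G (0.5 for spherical leaf angles)."""
        theta = np.radians(zenith_deg)
        return -np.cos(theta) * np.log(gap_fraction) / G

    print(round(lai_from_gap_fraction(0.2, zenith_deg=57.5), 2))   # ~1.73 for this illustrative gap fraction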

  3. Earth resources data analysis program, phase 3

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Tasks were performed in two areas: (1) systems analysis and (2) algorithmic development. The major effort in the systems analysis task was the development of a recommended approach to the monitoring of resource utilization data for the Large Area Crop Inventory Experiment (LACIE). Other efforts included participation in various studies concerning the LACIE Project Plan, the utility of the GE Image 100, and the specifications for a special purpose processor to be used in the LACIE. In the second task, the major effort was the development of improved algorithms for estimating proportions of unclassified remotely sensed data. Also, work was performed on optimal feature extraction and optimal feature extraction for proportion estimation.

  4. Assessment, Validation, and Refinement of the Atmospheric Correction Algorithm for the Ocean Color Sensors. Chapter 19

    NASA Technical Reports Server (NTRS)

    Wang, Menghua

    2003-01-01

    The primary focus of this proposed research is atmospheric correction algorithm evaluation and development and satellite sensor calibration and characterization. It is well known that the atmospheric correction, which removes more than 90% of the sensor-measured signal contributed by the atmosphere in the visible, is the key procedure in ocean color remote sensing (Gordon and Wang, 1994). The accuracy and effectiveness of the atmospheric correction directly affect the remotely retrieved ocean bio-optical products. On the other hand, for ocean color remote sensing, in order to obtain the required accuracy in the derived water-leaving signals from satellite measurements, an on-orbit vicarious calibration of the whole system, i.e., sensor and algorithms, is necessary. In addition, it is important to address issues of (i) cross-calibration of two or more sensors and (ii) in-orbit vicarious calibration of the sensor-atmosphere system. The goal of this research is to develop methods for meaningful comparison and possible merging of data products from multiple ocean color missions. In the past year, much effort has been devoted to (a) understanding and correcting the artifacts that appeared in the SeaWiFS-derived ocean and atmospheric products; (b) developing an efficient method for generating the SeaWiFS aerosol lookup tables; (c) evaluating the effects of calibration error in the near-infrared (NIR) band on the atmospheric correction of ocean color remote sensors; (d) comparing the aerosol correction algorithm using the single-scattering epsilon (the current SeaWiFS algorithm) vs. the multiple-scattering epsilon method; and (e) continuing activities for the International Ocean-Color Coordinating Group (IOCCG) atmospheric correction working group. In this report, I briefly present and discuss these and some other research activities.

  5. Detecting and visualizing weak signatures in hyperspectral data

    NASA Astrophysics Data System (ADS)

    MacPherson, Duncan James

    This thesis evaluates existing techniques for detecting weak spectral signatures from remotely sensed hyperspectral data. Algorithms are presented that successfully detect hard-to-find 'mystery' signatures in unknown cluttered backgrounds. The term 'mystery' is used to describe a scenario where the spectral target and background endmembers are unknown. Sub-Pixel analysis and background suppression are used to find deeply embedded signatures which can be less than 10% of the total signal strength. Existing 'mystery target' detection algorithms are derived and compared. Several techniques are shown to be superior both visually and quantitatively. Detection performance is evaluated using confidence metrics that are developed. A multiple algorithm approach is shown to improve detection confidence significantly. Although the research focuses on remote sensing applications, the algorithms presented can be applied to a wide variety of diverse fields such as medicine, law enforcement, manufacturing, earth science, food production, and astrophysics. The algorithms are shown to be general and can be applied to both the reflective and emissive parts of the electromagnetic spectrum. The application scope is a broad one and the final results open new opportunities for many specific applications including: land mine detection, pollution and hazardous waste detection, crop abundance calculations, volcanic activity monitoring, detecting diseases in food, automobile or airplane target recognition, cancer detection, mining operations, extracting galactic gas emissions, etc.

  6. Estimation of atmospheric columnar organic matter (OM) mass concentration from remote sensing measurements of aerosol spectral refractive indices

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Li, Zhengqiang; Sun, Yele; Lv, Yang; Xie, Yisong

    2018-04-01

    Aerosols have adverse effects on human health and air quality, change Earth's energy balance, and lead to climate change. Aerosol components are important because of their different spectral characteristics. Based on the low hygroscopicity and high scattering of organic matter (OM) in fine-mode atmospheric aerosols, we develop a remote sensing inversion algorithm to obtain aerosol components including black carbon (BC), organic matter (OM), ammonium-nitrate-like (AN) and dust-like (DU) components, and aerosol water content (AW). In the algorithm, the microphysical characteristics of particulates (i.e., volume distribution and complex refractive index) are first separated into fine and coarse modes, and the aerosol components are then retrieved using the bimodal parameters. We apply the algorithm to remote sensing measurements from the sun-sky radiometer at the AERONET site Beijing RADI for the period from October 2014 to January 2015. The results show a reasonable distribution of aerosol components and a good fit for the spectral feature calculations. The mean OM mass concentration in the atmospheric column accounts for 14.93% of the total and 56.34% of the dry, fine-mode aerosol, showing a fairly good correlation (R = 0.56) with in situ observations near the surface layer.

  7. Detecting the red tide based on remote sensing data in optically complex East China Sea

    NASA Astrophysics Data System (ADS)

    Xu, Xiaohui; Pan, Delu; Mao, Zhihua; Tao, Bangyi; Liu, Qiong

    2012-09-01

    Red tides not only destroy marine fishery production, deteriorate the marine environment, and affect the coastal tourist industry, but also cause human poisoning, and even death, through the consumption of toxic seafood contaminated by red tide organisms. Remote sensing technology has the characteristics of large-scale, synchronized, rapid monitoring, so it is one of the most important and most effective means of red tide monitoring. This paper selects the high-frequency red tide areas of the East China Sea as the study area and MODIS/Aqua L2 data as the data source, and analyzes and compares the spectral differences between red tide water bodies and non-red tide water bodies from many historical events. Based on these spectral differences, the paper develops the criterion Rrs555/Rrs488 > 1.5 to extract red tide information. Applying the algorithm to the red tide event that occurred in the East China Sea on May 28, 2009, we found that the method can effectively determine the location of the red tide occurrence; there is a good correspondence between the red tide extraction result and the chlorophyll-a concentration retrieved by remote sensing, showing that this algorithm can effectively determine the location of red tides and extract red tide information.
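
    The band-ratio criterion described above reduces to a per-pixel threshold test. The sketch below assumes MODIS L2 remote sensing reflectance arrays at 555 and 488 nm as inputs; the array values are placeholders.

    import numpy as np

    def red_tide_mask(rrs_555, rrs_488, ratio_threshold=1.5):
        """Boolean mask of suspected red-tide pixels where Rrs(555)/Rrs(488) exceeds the threshold."""
        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = rrs_555 / rrs_488
        return np.where(np.isfinite(ratio), ratio > ratio_threshold, False)

    rrs_555 = np.array([[0.004, 0.009], [0.012, 0.003]])
    rrs_488 = np.array([[0.005, 0.005], [0.006, 0.004]])
    print(red_tide_mask(rrs_555, rrs_488))                        # [[False  True] [ True False]]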

  8. Construction of Green Tide Monitoring System and Research on its Key Techniques

    NASA Astrophysics Data System (ADS)

    Xing, B.; Li, J.; Zhu, H.; Wei, P.; Zhao, Y.

    2018-04-01

    As a kind of marine natural disaster, the green tide has appeared every year along the Qingdao coast since the large-scale bloom in 2008, bringing great losses to this region. Therefore, it is of great value to obtain real-time dynamic information about the green tide distribution. In this study, optical remote sensing and microwave remote sensing methods are employed in green tide monitoring research. A specific remote sensing data processing flow and a green tide information extraction algorithm are designed according to the different characteristics of the optical and microwave data. For extracting the spatial distribution of the green tide, an automatic extraction algorithm for green tide distribution boundaries is designed based on the principle of mathematical morphology dilation/erosion, and key issues in the information extraction, including the division of green tide regions, the derivation of basic distributions, the delimitation of distribution boundaries, and the elimination of islands, have been solved. The automatic generation of green tide distribution boundaries from the remote sensing information extraction results is thus realized. Finally, a green tide monitoring system is built based on IDL/GIS secondary development in an integrated RS and GIS environment, achieving the integration of RS monitoring and information extraction.
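
    One common way to realize the dilation/erosion idea for boundary extraction is the morphological gradient of a binary green-tide mask, after removing small isolated patches (islands). The sketch below is a generic illustration with SciPy, not the system's IDL implementation; the minimum patch size is an assumed parameter.

    import numpy as np
    from scipy import ndimage

    def distribution_boundary(mask, min_patch_pixels=5):
        """Outline the boundary of a binary green-tide mask after discarding patches smaller than min_patch_pixels."""
        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
        keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_patch_pixels))
        # Morphological gradient: dilation minus erosion marks the boundary pixels
        return ndimage.binary_dilation(keep) & ~ndimage.binary_erosion(keep)

    mask = np.zeros((8, 8), dtype=bool)
    mask[2:6, 2:6] = True                                         # one green-tide patch
    mask[0, 7] = True                                             # an isolated "island" pixel to be removed
    print(distribution_boundary(mask).astype(int))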

  9. Adaptive Temporal Matched Filtering for Noise Suppression in Fiber Optic Distributed Acoustic Sensing.

    PubMed

    Ölçer, İbrahim; Öncü, Ahmet

    2017-06-05

    Distributed vibration sensing based on phase-sensitive optical time domain reflectometry ( ϕ -OTDR) is being widely used in several applications. However, one of the main challenges in coherent detection-based ϕ -OTDR systems is the fading noise, which impacts the detection performance. In addition, typical signal averaging and differentiating techniques are not suitable for detecting high frequency events. This paper presents a new approach for reducing the effect of fading noise in fiber optic distributed acoustic vibration sensing systems without any impact on the frequency response of the detection system. The method is based on temporal adaptive processing of ϕ -OTDR signals. The fundamental theory underlying the algorithm, which is based on signal-to-noise ratio (SNR) maximization, is presented, and the efficacy of our algorithm is demonstrated with laboratory experiments and field tests. With the proposed digital processing technique, the results show that more than 10 dB of SNR values can be achieved without any reduction in the system bandwidth and without using additional optical amplifier stages in the hardware. We believe that our proposed adaptive processing approach can be effectively used to develop fiber optic-based distributed acoustic vibration sensing systems.
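
    For context, the sketch below shows plain (non-adaptive) temporal matched filtering of a single ϕ-OTDR-like trace: correlating with a unit-energy template is the SNR-maximizing linear filter for white noise. It is only a simplified stand-in for the adaptive algorithm described in the record, and the template, trace length, and noise level are assumptions.

    import numpy as np

    def matched_filter(trace, template):
        """Correlate a noisy trace with a unit-energy template; the correlation peak marks the event."""
        t = template / np.linalg.norm(template)
        return np.correlate(trace, t, mode="same")

    rng = np.random.default_rng(0)
    template = np.sin(2 * np.pi * 5 * np.linspace(0.0, 1.0, 100))   # assumed vibration signature
    trace = 0.5 * rng.standard_normal(1000)
    trace[400:500] += template                                       # event buried in noise
    out = matched_filter(trace, template)
    print(int(np.argmax(np.abs(out))))                               # near sample 450, the center of the event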

  10. Adaptive Temporal Matched Filtering for Noise Suppression in Fiber Optic Distributed Acoustic Sensing

    PubMed Central

    Ölçer, İbrahim; Öncü, Ahmet

    2017-01-01

    Distributed vibration sensing based on phase-sensitive optical time domain reflectometry (ϕ-OTDR) is being widely used in several applications. However, one of the main challenges in coherent detection-based ϕ-OTDR systems is the fading noise, which impacts the detection performance. In addition, typical signal averaging and differentiating techniques are not suitable for detecting high frequency events. This paper presents a new approach for reducing the effect of fading noise in fiber optic distributed acoustic vibration sensing systems without any impact on the frequency response of the detection system. The method is based on temporal adaptive processing of ϕ-OTDR signals. The fundamental theory underlying the algorithm, which is based on signal-to-noise ratio (SNR) maximization, is presented, and the efficacy of our algorithm is demonstrated with laboratory experiments and field tests. With the proposed digital processing technique, the results show that more than 10 dB of SNR values can be achieved without any reduction in the system bandwidth and without using additional optical amplifier stages in the hardware. We believe that our proposed adaptive processing approach can be effectively used to develop fiber optic-based distributed acoustic vibration sensing systems. PMID:28587240

  11. A compressed sensing based 3D resistivity inversion algorithm for hydrogeological applications

    NASA Astrophysics Data System (ADS)

    Ranjan, Shashi; Kambhammettu, B. V. N. P.; Peddinti, Srinivasa Rao; Adinarayana, J.

    2018-04-01

    Image reconstruction from discrete electrical responses poses a number of computational and mathematical challenges. Smoothness-constrained regularized inversion from limited measurements may fail to detect resistivity anomalies and sharp interfaces separating hydrostratigraphic units. Under favourable conditions, compressed sensing (CS) can be thought of as an alternative that reconstructs image features by finding sparse solutions to highly underdetermined linear systems. This paper deals with the development of a CS-assisted, 3-D resistivity inversion algorithm for use by hydrogeologists and groundwater scientists. A CS-based l1-regularized least-squares algorithm was applied to solve the resistivity inversion problem. Sparseness in the model update vector is introduced through a block-oriented discrete cosine transformation, with recovery of the signal achieved through convex optimization. The equivalent quadratic program was solved using a primal-dual interior point method. The applicability of the proposed algorithm was demonstrated using synthetic and field examples drawn from hydrogeology. The proposed algorithm outperformed the conventional (smoothness-constrained) least-squares method in recovering the model parameters with much fewer data, while preserving the sharp resistivity fronts separated by geologic layers. Resistivity anomalies represented by discrete homogeneous blocks embedded in contrasting geologic layers were better imaged using the proposed algorithm. In comparison to the conventional algorithm, CS resulted in an efficient (an increase in R2 from 0.62 to 0.78; a decrease in RMSE from 125.14 Ω-m to 72.46 Ω-m), reliable, and fast-converging (run time decreased by about 25%) solution.
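
    As a simplified stand-in for the l1-regularized least-squares formulation (which the record solves with a primal-dual interior point method), the sketch below uses iterative soft thresholding (ISTA) on a generic sparse recovery problem y = J m. The Jacobian, sparsity pattern, and regularization weight are assumptions for illustration.

    import numpy as np

    def ista(J, y, lam, n_iter=200):
        """Minimize 0.5*||J m - y||^2 + lam*||m||_1 by iterative soft thresholding."""
        L = np.linalg.norm(J, 2) ** 2                    # Lipschitz constant of the data-term gradient
        m = np.zeros(J.shape[1])
        for _ in range(n_iter):
            z = m - J.T @ (J @ m - y) / L                # gradient step on the data term
            m = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft-threshold (l1 proximal step)
        return m

    rng = np.random.default_rng(0)
    J = rng.standard_normal((80, 200))                   # stand-in sensitivity (Jacobian) matrix
    m_true = np.zeros(200)
    m_true[[20, 75, 160]] = [1.0, -0.5, 2.0]             # sparse model update
    y = J @ m_true + 0.01 * rng.standard_normal(80)
    print(np.round(ista(J, y, lam=0.1)[[20, 75, 160]], 2))   # approximately recovers the nonzero entries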

  12. Comparison of Compressed Sensing Algorithms for Inversion of 3-D Electrical Resistivity Tomography.

    NASA Astrophysics Data System (ADS)

    Peddinti, S. R.; Ranjan, S.; Kbvn, D. P.

    2016-12-01

    Image reconstruction algorithms for electrical resistivity tomography (ERT) are highly non-linear, sparse, and ill-posed. The inverse problem is much more severe when dealing with 3-D datasets that result in large matrices. Conventional gradient-based techniques using L2-norm minimization with some form of regularization can impose a smoothness constraint on the solution. Compressed sensing (CS) is a relatively new technique that takes advantage of the inherent sparsity of the parameter space in one form or another. If favorable conditions are met, CS has proven to be an efficient image reconstruction technique that uses limited observations without losing edge sharpness. This paper deals with the development of an open-source 3-D resistivity inversion tool using the CS framework. The forward model was adopted from RESINVM3D (Pidlisecky et al., 2007) with CS as the inverse code. A discrete cosine transformation (DCT) was used to induce model sparsity in orthogonal form. Two CS-based algorithms, the interior point method and two-step IST, were evaluated on a synthetic layered model with surface electrode observations. The algorithms were tested (in terms of quality and convergence) under varying degrees of parameter heterogeneity, model refinement, and reduced observation data space. In comparison to conventional gradient algorithms, CS proved to effectively reconstruct the subsurface image at less computational cost. This was observed as a general increase in NRMSE from 0.5 in 10 iterations using the gradient algorithm to 0.8 in 5 iterations using the CS algorithms.

  13. Performance comparison between total variation (TV)-based compressed sensing and statistical iterative reconstruction algorithms.

    PubMed

    Tang, Jie; Nett, Brian E; Chen, Guang-Hong

    2009-10-07

    Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising since they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data. The CS algorithm can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) to the CS algorithm. In assessing the image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit including the relative root mean square error and a quality factor which accounts for the noise performance and the spatial resolution were introduced to objectively evaluate reconstruction performance. A comparison is presented between the three algorithms for a constant undersampling factor comparing different algorithms at several dose levels. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions of the measurements from our studies are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images even with CS reconstruction, (2) regardless of the algorithm employed, it is beneficial to distribute the total dose to more views as long as each view remains quantum noise limited and (3) the total variation-based CS method is not appropriate for very low dose levels because while it can mitigate streaking artifacts, the images exhibit patchy behavior, which is potentially harmful for medical diagnosis.

  14. Upwelling-Induced Primary Productivity in Coastal Waters of the Black Sea: Impact on Algorithms for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Goldman, Joel C.; Brink, Kenneth K.; Gawarkiewicz, Glen; Sosik, Heidi M.

    1997-01-01

    This research program was a collaborative effort to investigate the impact of rapid changes in the water column during coastal upwelling on biological and optical properties. These properties are important for constructing region- or event-specific algorithms for remote sensing of pigment concentration and primary productivity, and for comparing these algorithms with those used for the development of large-scale maps from ocean color. We successfully achieved the primary objective of this research project, which was to study in situ the dynamics of rapid spatial and temporal changes in properties of the water column during coastal upwelling off the Crimean coast in the Black Sea. The work was a collaborative effort between a group of biological and physical oceanographers from the Woods Hole Oceanographic Institution and two oceanographic research institutions in the Crimea, Ukraine, located near the study site: the Marine Hydrophysical Institute (MHI) and the Institute of Biology of the Southern Seas (IBSS). The site was an ideal experimental model, from both a technical and an economic standpoint, because of the predictable summer upwelling that occurs in the region and because of the availability of a ship on call and laboratory and remote sensing facilities at the nearby marine institutes. We used a combination of shipboard measurements and remote sensing to investigate the physical evolution of rapid upwelling events and their impact on phytoplankton and water column optical properties. The field work involved a two-day cruise for mooring deployment and a three-day baseline survey cruise, followed by an eleven-day primary cruise during a summer upwelling event (anticipated by monitoring local winds and tracked by remote sensing imagery). An MHI ship was outfitted and used for these purposes.

  15. Bio-inspired Computing for Robots

    NASA Technical Reports Server (NTRS)

    Laufenberg, Larry

    2003-01-01

    Living creatures may provide algorithms to enable active sensing/control systems in robots. Active sensing could enable planetary rovers to feel their way in unknown environments. The surface of Jupiter's moon Europa consists of fractured ice over a liquid sea that may contain microbes similar to those on Earth. To explore such extreme environments, NASA needs robots that autonomously survive, navigate, and gather scientific data. They will be too far away for guidance from Earth. They must sense their environment and control their own movements to avoid obstacles or investigate a science opportunity. To meet this challenge, CICT's Information Technology Strategic Research (ITSR) Project is funding neurobiologists at NASA's Jet Propulsion Laboratory (JPL) and selected universities to search for biologically inspired algorithms that enable robust active sensing and control for exploratory robots. Sources for these algorithms are living creatures, including rats and electric fish.

  16. Research on compressive sensing reconstruction algorithm based on total variation model

    NASA Astrophysics Data System (ADS)

    Gao, Yu-xuan; Sun, Huayan; Zhang, Tinghua; Du, Lin

    2017-12-01

    Compressed sensing, by breaking through the constraint of the Nyquist sampling theorem, provides a strong theoretical basis for carrying out compressive sampling of image signals. In imaging procedures that apply compressed sensing theory, not only is the required storage space reduced, but the demand on detector resolution is also greatly reduced. By exploiting the sparsity of the image signal and solving the mathematical model of the inverse reconstruction problem, super-resolution imaging can be realized. The reconstruction algorithm is the most critical part of compressive sensing and to a large extent determines the accuracy of the reconstructed image. The reconstruction algorithm based on the total variation (TV) model is well suited to the compressive reconstruction of two-dimensional images and preserves edge information better. To verify the performance of the algorithm, the reconstruction results of the TV-based algorithm are simulated and analyzed under different coding modes to verify the stability of the algorithm, and typical reconstruction algorithms are compared under the same coding mode. On the basis of the minimum total variation algorithm, an augmented Lagrangian function term is added and the optimal value is solved by the alternating direction method. Experimental results show that, compared with traditional classical algorithms, the proposed TV-based reconstruction algorithm has significant advantages and can quickly and accurately recover the target image at low measurement rates.
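
    The following is a minimal sketch of TV-regularized compressive-sensing recovery for a 1-D piecewise-constant signal. It uses plain gradient descent on a smoothed TV penalty rather than the augmented-Lagrangian/alternating-direction scheme described above, and the measurement matrix, test signal, and parameter values are synthetic assumptions:

```python
import numpy as np

def tv_cs_recover(A, y, lam=0.1, eps=1e-2, step=0.02, iters=3000):
    """Recover a 1-D signal from compressive measurements y = A @ x by gradient
    descent on ||A x - y||^2 + lam * sum_i sqrt((x[i+1]-x[i])^2 + eps^2),
    i.e. a smoothed total-variation penalty."""
    x = A.T @ y                                  # simple starting estimate
    for _ in range(iters):
        dx = np.diff(x)
        g = dx / np.sqrt(dx ** 2 + eps ** 2)     # derivative of the smoothed |dx|
        tv_grad = np.zeros_like(x)
        tv_grad[:-1] -= g
        tv_grad[1:] += g
        grad = 2.0 * A.T @ (A @ x - y) + lam * tv_grad
        x = x - step * grad
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 200, 80                               # signal length, number of measurements
    x_true = np.zeros(n)
    x_true[40:90] = 1.0                          # piecewise-constant test signal
    x_true[130:160] = -0.5
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    y = A @ x_true
    x_hat = tv_cs_recover(A, y)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```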

  17. VisitSense: Sensing Place Visit Patterns from Ambient Radio on Smartphones for Targeted Mobile Ads in Shopping Malls.

    PubMed

    Kim, Byoungjip; Kang, Seungwoo; Ha, Jin-Young; Song, Junehwa

    2015-07-16

    In this paper, we introduce a novel smartphone framework called VisitSense that automatically detects and predicts a smartphone user's place visits from ambient radio to enable behavioral targeting for mobile ads in large shopping malls. VisitSense enables mobile app developers to adopt visit-pattern-aware mobile advertising for shopping mall visitors in their apps. It also benefits mobile users by allowing them to receive highly relevant mobile ads that are aware of their place visit patterns in shopping malls. To achieve this goal, VisitSense employs accurate visit detection and prediction methods. For accurate visit detection, we develop a change-based detection method to take into consideration the stability change of ambient radio and the mobility change of users. It performs well in large shopping malls where ambient radio is quite noisy and causes existing algorithms to easily fail. In addition, we propose a causality-based visit prediction model to capture the causality in the sequential visit patterns for effective prediction. We have developed a VisitSense prototype system and a visit-pattern-aware mobile advertising application based on it. Furthermore, we deploy the system in the COEX Mall, one of the largest shopping malls in Korea, and conduct diverse experiments to show the effectiveness of VisitSense.

  18. Extended capture range for focus-diverse phase retrieval in segmented aperture systems using geometrical optics.

    PubMed

    Jurling, Alden S; Fienup, James R

    2014-03-01

    Extending previous work by Thurman on wavefront sensing for segmented-aperture systems, we developed an algorithm for estimating segment tips and tilts from multiple point spread functions in different defocused planes. We also developed methods for overcoming two common modes for stagnation in nonlinear optimization-based phase retrieval algorithms for segmented systems. We showed that when used together, these methods largely solve the capture range problem in focus-diverse phase retrieval for segmented systems with large tips and tilts. Monte Carlo simulations produced a rate of success better than 98% for the combined approach.

  19. Probabilistic models, learning algorithms, and response variability: sampling in cognitive development.

    PubMed

    Bonawitz, Elizabeth; Denison, Stephanie; Griffiths, Thomas L; Gopnik, Alison

    2014-10-01

    Although probabilistic models of cognitive development have become increasingly prevalent, one challenge is to account for how children might cope with a potentially vast number of possible hypotheses. We propose that children might address this problem by 'sampling' hypotheses from a probability distribution. We discuss empirical results demonstrating signatures of sampling, which offer an explanation for the variability of children's responses. The sampling hypothesis provides an algorithmic account of how children might address computationally intractable problems and suggests a way to make sense of their 'noisy' behavior. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Advanced Wavefront Sensing and Control Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell

    2010-01-01

    The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, the future technologies of wave front sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point-source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate the active optical devices such as fast steering mirror, deformable mirror, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as the initial results from the testbed hardware integrations and tests.

  1. Scientific Programming Using Java: A Remote Sensing Example

    NASA Technical Reports Server (NTRS)

    Prados, Don; Mohamed, Mohamed A.; Johnson, Michael; Cao, Changyong; Gasser, Jerry

    1999-01-01

    This paper presents results of a project to port remote sensing code from the C programming language to Java. The advantages and disadvantages of using Java versus C as a scientific programming language in remote sensing applications are discussed. Remote sensing applications deal with voluminous data that require effective memory management, such as buffering operations, when processed. Some of these applications also implement complex computational algorithms, such as Fast Fourier Transform analysis, that are very performance intensive. Factors considered include performance, precision, complexity, rapidity of development, ease of code reuse, ease of maintenance, memory management, and platform independence. Performance results are also presented for radiometric calibration code that uses Java for the graphical user interface and C for the domain model.

  2. Mapping ephemeral stream networks in desert environments using very-high-spatial-resolution multispectral remote sensing

    DOE PAGES

    Hamada, Yuki; O'Connor, Ben L.; Orr, Andrew B.; ...

    2016-03-26

    Understanding the spatial patterns of ephemeral streams is crucial for understanding how hydrologic processes influence the abundance and distribution of wildlife habitats in desert regions. Available methods for mapping ephemeral streams at the watershed scale typically underestimate the size of channel networks. Although remote sensing is an effective means of collecting data and obtaining information on large, inaccessible areas, conventional techniques for extracting channel features are not sufficient in regions that have small topographic gradients and subtle target-background spectral contrast. Using very high resolution multispectral imagery, we developed a new algorithm that applies landscape information to map ephemeral channels in desert regions of the Southwestern United States where utility-scale solar energy development is occurring. Knowledge about landscape features and structures was integrated into the algorithm through a series of spectral transformations and spatial statistical operations. The algorithm extracted ephemeral stream channels at a local scale, identifying approximately 900% more ephemeral streams than are included in the U.S. Geological Survey's National Hydrography Dataset. The accuracy of the algorithm in detecting channel areas was as high as 92%, and its accuracy in delineating channel center lines was 91% when compared to a subset of channel networks digitized from the very high resolution imagery. Although the algorithm captured stream channels in desert landscapes across various channel sizes and forms, it often underestimated stream headwaters and channels obscured by bright soils and sparse vegetation. While further improvement is warranted, the algorithm provides an effective means of obtaining detailed information about ephemeral streams, and it could make a significant contribution toward improving the hydrological modelling of desert environments.

  3. Mapping ephemeral stream networks in desert environments using very-high-spatial-resolution multispectral remote sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamada, Yuki; O'Connor, Ben L.; Orr, Andrew B.

    Understanding the spatial patterns of ephemeral streams is crucial for understanding how hydrologic processes influence the abundance and distribution of wildlife habitats in desert regions. Available methods for mapping ephemeral streams at the watershed scale typically underestimate the size of channel networks. Although remote sensing is an effective means of collecting data and obtaining information on large, inaccessible areas, conventional techniques for extracting channel features are not sufficient in regions that have small topographic gradients and subtle target-background spectral contrast. Using very high resolution multispectral imagery, we developed a new algorithm that applies landscape information to map ephemeral channels in desert regions of the Southwestern United States where utility-scale solar energy development is occurring. Knowledge about landscape features and structures was integrated into the algorithm through a series of spectral transformations and spatial statistical operations. The algorithm extracted ephemeral stream channels at a local scale, identifying approximately 900% more ephemeral streams than are included in the U.S. Geological Survey's National Hydrography Dataset. The accuracy of the algorithm in detecting channel areas was as high as 92%, and its accuracy in delineating channel center lines was 91% when compared to a subset of channel networks digitized from the very high resolution imagery. Although the algorithm captured stream channels in desert landscapes across various channel sizes and forms, it often underestimated stream headwaters and channels obscured by bright soils and sparse vegetation. While further improvement is warranted, the algorithm provides an effective means of obtaining detailed information about ephemeral streams, and it could make a significant contribution toward improving the hydrological modelling of desert environments.

  4. Online identification algorithms for integrated dielectric electroactive polymer sensors and self-sensing concepts

    NASA Astrophysics Data System (ADS)

    Hoffstadt, Thorben; Griese, Martin; Maas, Jürgen

    2014-10-01

    Transducers based on dielectric electroactive polymers (DEAP) use electrostatic pressure to convert electric energy into strain energy or vice versa. Besides this, they are also designed for sensor applications in monitoring the actual stretch state on the basis of the deformation-dependent capacitive-resistive behavior of the DEAP. In order to enable an efficient and proper closed-loop control operation of these transducers, e.g. in positioning or energy harvesting applications, on the one hand, sensors based on DEAP material can be integrated into the transducers and evaluated externally, and on the other hand, the transducer itself can be used as a sensor, also in terms of self-sensing. For this purpose the characteristic electrical behavior of the transducer has to be evaluated in order to determine the mechanical state. Also, adequate online identification algorithms with sufficient accuracy and dynamics are required, independent of the sensor concept utilized, in order to determine the electrical DEAP parameters in real time. Therefore, in this contribution, algorithms are developed in the frequency domain for identification of the capacitance as well as the electrode and polymer resistance of a DEAP, which are validated by measurements. These algorithms are designed for self-sensing applications, especially if the power electronics utilized is operated at a constant switching frequency and parasitic harmonic oscillations are induced in addition to the desired DC value. These oscillations can be used for the online identification, so an additional superimposed excitation is no longer necessary. For this purpose a dual active bridge (DAB) is introduced to drive the DEAP transducer. Finally, the capabilities of the real-time identification algorithm in combination with the DAB are presented and discussed in detail.

  5. Development of a generalized algorithm of satellite remote sensing using multi-wavelength and multi-pixel information (MWP method) for aerosol properties by satellite-borne imager

    NASA Astrophysics Data System (ADS)

    Hashimoto, M.; Nakajima, T.; Morimoto, S.; Takenaka, H.

    2014-12-01

    We have developed a new satellite remote sensing algorithm to retrieve aerosol optical characteristics using multi-wavelength and multi-pixel information from satellite imagers (MWP method). In this algorithm, the inversion method is a combination of the maximum a posteriori (MAP) method (Rodgers, 2000) and the Phillips-Twomey method (Phillips, 1962; Twomey, 1963), the latter serving as a smoothing constraint on the state vector. Furthermore, with the progress of computing techniques, the method is combined with direct radiative transfer calculations solved numerically at each iteration step of the non-linear inverse problem, without using a look-up table (LUT), under several constraints. The retrieved parameters in our algorithm are aerosol optical properties, such as the aerosol optical thickness (AOT) of fine- and coarse-mode particles, the volume soot fraction in fine-mode particles, and the ground surface albedo at each observed wavelength. We simultaneously retrieve all the parameters that characterize the pixels in each of the horizontal sub-domains constituting the target area, and then successively apply the retrieval method to all the sub-domains in the target area. We conducted numerical retrieval tests of aerosol properties and ground surface albedo for GOSAT/CAI imager data to test the algorithm over land. The experiments showed that the AOTs of the fine and coarse modes, the soot fraction, and the ground surface albedo are successfully retrieved within the expected accuracy, and we discuss the accuracy of the algorithm for various land surface types. We then applied the algorithm to GOSAT/CAI imager data and compared retrieved and surface-observed AOTs at the CAI pixel closest to an AERONET (Aerosol Robotic Network) or SKYNET site in each region. Comparisons at several urban sites indicated that AOTs retrieved by our method agree with surface-observed AOTs within ±0.066. Our future work is to extend the algorithm to the analysis of ADEOS-II/GLI and GCOM-C/SGLI data.
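
    In generic notation (ours, not the authors'), the combined MAP/Phillips-Twomey inversion described above amounts to minimizing a cost function of the form:

```latex
J(\mathbf{x}) =
\left[\mathbf{y} - F(\mathbf{x})\right]^{T} \mathbf{S}_{\epsilon}^{-1} \left[\mathbf{y} - F(\mathbf{x})\right]
+ \left(\mathbf{x} - \mathbf{x}_{a}\right)^{T} \mathbf{S}_{a}^{-1} \left(\mathbf{x} - \mathbf{x}_{a}\right)
+ \gamma \left\| \mathbf{L}\,\mathbf{x} \right\|^{2}
```

    where y collects the multi-wavelength, multi-pixel radiances, F is the radiative transfer forward model evaluated at each iteration (no LUT), x_a and S_a are the a priori state and its covariance (the MAP term), and L is a difference operator acting across neighboring pixels in a sub-domain, whose weight gamma plays the role of the Phillips-Twomey smoothing constraint.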

  6. Experimental results for correlation-based wavefront sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poyneer, L A; Palmer, D W; LaFortune, K N

    2005-07-01

    Correlation wave-front sensing can improve Adaptive Optics (AO) system performance in two key areas. For point-source-based AO systems, correlation is more accurate, more robust to changing conditions, and provides lower noise than a centroiding algorithm. Experimental results from the Lick AO system and the SSHCL laser AO system confirm this. For remote imaging, correlation enables the use of extended objects for wave-front sensing. Results from short horizontal-path experiments show algorithm properties and requirements.
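
    A minimal sketch of the difference between the two estimators for a single subaperture spot is shown below; it is a generic illustration (integer-pixel correlation peak, synthetic Gaussian spot), not the Lick or SSHCL implementation:

```python
import numpy as np

def centroid_shift(img):
    """Spot position relative to the array centre (pixels), centre-of-mass style."""
    ys, xs = np.indices(img.shape)
    total = img.sum()
    cy, cx = (ys * img).sum() / total, (xs * img).sum() / total
    return cy - (img.shape[0] - 1) / 2, cx - (img.shape[1] - 1) / 2

def correlation_shift(img, ref):
    """Shift of img relative to ref from the peak of their cross-correlation
    (FFT-based, integer-pixel precision for brevity)."""
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

if __name__ == "__main__":
    y, x = np.mgrid[0:32, 0:32]
    spot = lambda cy, cx: np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / 8.0)
    rng = np.random.default_rng(1)
    ref = spot(15.5, 15.5)                                          # reference spot at the centre
    img = spot(18.5, 13.5) + 0.05 * rng.standard_normal((32, 32))   # shifted by (+3, -2)
    print("centroid estimate   :", centroid_shift(img))
    print("correlation estimate:", correlation_shift(img, ref))
```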

  7. Spectral imaging applications: Remote sensing, environmental monitoring, medicine, military operations, factory automation and manufacturing

    NASA Technical Reports Server (NTRS)

    Gat, N.; Subramanian, S.; Barhen, J.; Toomarian, N.

    1996-01-01

    This paper reviews the activities at OKSI related to imaging spectroscopy presenting current and future applications of the technology. The authors discuss the development of several systems including hardware, signal processing, data classification algorithms and benchmarking techniques to determine algorithm performance. Signal processing for each application is tailored by incorporating the phenomenology appropriate to the process, into the algorithms. Pixel signatures are classified using techniques such as principal component analyses, generalized eigenvalue analysis and novel very fast neural network methods. The major hyperspectral imaging systems developed at OKSI include the Intelligent Missile Seeker (IMS) demonstration project for real-time target/decoy discrimination, and the Thermal InfraRed Imaging Spectrometer (TIRIS) for detection and tracking of toxic plumes and gases. In addition, systems for applications in medical photodiagnosis, manufacturing technology, and for crop monitoring are also under development.

  8. Remote sensing estimation of colored dissolved organic matter (CDOM) in optically shallow waters

    NASA Astrophysics Data System (ADS)

    Li, Jiwei; Yu, Qian; Tian, Yong Q.; Becker, Brian L.

    2017-06-01

    It is not well understood how the bottom reflectance of optically shallow waters affects the algorithm performance of colored dissolved organic matter (CDOM) retrieval. This study proposes a new algorithm that considers bottom reflectance in estimating CDOM absorption from optically shallow inland or coastal waters. The field sampling was conducted during four research cruises within the Saginaw River, Kawkawlin River and Saginaw Bay of Lake Huron. A stratified field sampling campaign collected water samples, determined the depth at each sampling location and measured optical properties. The sampled CDOM absorption at 440 nm broadly ranged from 0.12 to 8.46 m-1. Field sample analysis revealed that bottom reflectance does significantly change the apparent optical properties of the water. We developed a CDOM retrieval algorithm (Shallow water Bio-Optical Properties algorithm, SBOP) that effectively reduces uncertainty by considering bottom reflectance in shallow waters. By incorporating the bottom contribution in upwelling radiances, the SBOP algorithm was able to explain 74% of the variance of CDOM values (RMSE = 0.22 and R2 = 0.74). The bottom effect index (BEI) was introduced to efficiently separate optically shallow and optically deep waters. Based on the BEI, an adaptive approach was proposed that references the amount of bottom effect in order to identify the most suitable algorithm (optically shallow water algorithm [SBOP] or optically deep water algorithm [QAA-CDOM]) to improve CDOM estimation (RMSE = 0.22 and R2 = 0.81). Our results potentially help to advance the capability of remote sensing in monitoring carbon pools at the land-water interface.

  9. Using Collision Cones to Assess Biological Deconfliction Methods

    NASA Astrophysics Data System (ADS)

    Brace, Natalie

    For autonomous vehicles to navigate the world as efficiently and effectively as biological species, improvements are needed in terms of control strategies and estimation algorithms. Reactive collision avoidance is one specific area where biological systems outperform engineered algorithms. To better understand the discrepancy between engineered and biological systems, a collision avoidance algorithm was applied to frames of trajectory data from three biological species (Myotis velifer, Hirundo rustica, and Danio aequipinnatus). The algorithm uses information that can be sensed through visual cues (relative position and velocity) to define collision cones which are used to determine if agents are on a collision course and if so, to find a safe velocity that requires minimal deviation from the original velocity for each individual agent. Two- and three-dimensional versions of the algorithm with constant speed and maximum speed velocity requirements were considered. The obstacles provided to the algorithm were determined by the sensing range in terms of either metric or topological distance. The calculated velocities showed good correlation with observed velocities over the range of sensing parameters, indicating that the algorithm is a good basis for comparison and could potentially be improved with further study.
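
    A minimal 2-D sketch of the collision-cone test and a crude constant-speed resolution step is given below; the function names, the heading-sampling resolution, and the geometry are illustrative assumptions, not the study's implementation:

```python
import numpy as np

def on_collision_course(rel_pos, rel_vel, combined_radius):
    """Collision-cone test: rel_pos points from the agent to the obstacle and
    rel_vel is the agent's velocity relative to the obstacle.  Returns True when
    the agents are closing and the projected miss distance is below
    combined_radius, i.e. rel_vel lies inside the collision cone."""
    rel_pos = np.asarray(rel_pos, float)
    rel_vel = np.asarray(rel_vel, float)
    closing = np.dot(rel_vel, rel_pos)          # positive when moving toward each other
    v_norm = np.linalg.norm(rel_vel)
    if closing <= 0 or v_norm == 0:
        return False
    along = closing / v_norm                    # component of rel_pos along the velocity
    miss = np.sqrt(max(np.dot(rel_pos, rel_pos) - along ** 2, 0.0))
    return miss < combined_radius

def safe_velocity(own_vel, rel_pos, other_vel, combined_radius, speed, n_dirs=72):
    """Pick the constant-speed heading closest to the current velocity that lies
    outside the collision cone, a crude stand-in for the minimal-deviation
    velocity selection described above."""
    best, best_cost = own_vel, np.inf
    for theta in np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False):
        cand = speed * np.array([np.cos(theta), np.sin(theta)])
        if on_collision_course(rel_pos, cand - other_vel, combined_radius):
            continue
        cost = np.linalg.norm(cand - own_vel)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best

if __name__ == "__main__":
    own_vel = np.array([1.0, 0.0])
    rel_pos = np.array([5.0, 0.2])              # obstacle ahead, slightly offset
    other_vel = np.array([0.0, 0.0])
    print(on_collision_course(rel_pos, own_vel - other_vel, 0.5))     # True: conflict
    print(safe_velocity(own_vel, rel_pos, other_vel, 0.5, speed=1.0)) # small heading change
```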

  10. Kalman/Map filtering-aided fast normalized cross correlation-based Wi-Fi fingerprinting location sensing.

    PubMed

    Sun, Yongliang; Xu, Yubin; Li, Cheng; Ma, Lin

    2013-11-13

    A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results.
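
    A bare-bones sketch of correlation-based fingerprinting is shown below. It scores every reference point with a plain normalized cross correlation averaged over all on-line RSS samples (the fast FNCC computation and the Kalman/map filtering stage are not reproduced), and the layout and values are synthetic:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation of two equal-length RSS vectors."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def fingerprint_locate(online_samples, fingerprints, locations, k=3):
    """online_samples: (n_samples, n_aps) RSS readings collected on-line.
    fingerprints:     (n_rps, n_aps) RSS fingerprint stored per reference point.
    locations:        (n_rps, 2) reference-point coordinates.
    Each reference point is scored with the NCC averaged over all on-line
    samples (rather than a single RSS mean); the estimate is the score-weighted
    centroid of the k best reference points."""
    scores = np.array([np.mean([ncc(s, fp) for s in online_samples])
                       for fp in fingerprints])
    top = np.argsort(scores)[-k:]
    w = np.clip(scores[top], 1e-6, None)
    return (locations[top] * w[:, None]).sum(axis=0) / w.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    locations = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
    fingerprints = np.array([[-40, -60, -70], [-60, -40, -70],
                             [-70, -60, -40], [-55, -55, -55]], float)
    online = fingerprints[1] + rng.normal(0, 2, size=(10, 3))   # user near RP 1 at (5, 0)
    print(fingerprint_locate(online, fingerprints, locations))
```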

  11. Kalman/Map Filtering-Aided Fast Normalized Cross Correlation-Based Wi-Fi Fingerprinting Location Sensing

    PubMed Central

    Sun, Yongliang; Xu, Yubin; Li, Cheng; Ma, Lin

    2013-01-01

    A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results. PMID:24233027

  12. Bio-optical water quality dynamics observed from MERIS in Pensacola Bay, Florida

    EPA Science Inventory

    Observed bio-optical water quality data collected from 2009 to 2011 in Pensacola Bay, Florida were used to develop empirical remote sensing retrieval algorithms for chlorophyll a (Chla), colored dissolved organic matter (CDOM), and suspended particulate matter (SPM). Time-series ...

  13. Spatiotemporal chlorophyll-a dynamics on the Louisiana continental shelf derived from a dual satellite imagery algorithm

    EPA Science Inventory

    A monthly time series of remotely sensed chlorophyll-a (Chlars) over the Louisiana continental shelf (LCS) was developed and examined for its relationship to river discharge, nitrate concentration, total phosphorus concentration, photosynthetically available radiation (PAR), wind...

  14. Kerr Reservoir LANDSAT experiment analysis for March 1981

    NASA Technical Reports Server (NTRS)

    Lecroy, S. R. (Principal Investigator)

    1982-01-01

    LANDSAT radiance data were used in an experiment conducted on the waters of Kerr Reservoir to determine if reliable algorithms could be developed that relate water quality parameters to remotely sensed data. A mix of different types of algorithms using the LANDSAT bands was generated to provide a thorough understanding of the relationships among the data involved. Except for secchi depth, the study demonstrated that for the ranges measured, the algorithms that satisfactorily represented the data encompass a mix of linear and nonlinear forms using only one LANDSAT band. Ratioing techniques did not improve the results since the initial design of the experiment minimized the errors against which this procedure is effective. Good correlations were found for total suspended solids, iron, turbidity, and secchi depth. Marginal correlations were discovered for nitrate and tannin + lignin. Quantification maps of Kerr Reservoir are presented for many of the water quality parameters using the developed algorithms.

  15. An adaptive inverse kinematics algorithm for robot manipulators

    NASA Technical Reports Server (NTRS)

    Colbaugh, R.; Glass, K.; Seraji, H.

    1990-01-01

    An adaptive algorithm for solving the inverse kinematics problem for robot manipulators is presented. The algorithm is derived using model reference adaptive control (MRAC) theory and is computationally efficient for online applications. The scheme requires no a priori knowledge of the kinematics of the robot if Cartesian end-effector sensing is available, and it requires knowledge of only the forward kinematics if joint position sensing is used. Computer simulation results are given for the redundant seven-DOF robotics research arm, demonstrating that the proposed algorithm yields accurate joint angle trajectories for a given end-effector position/orientation trajectory.

  16. Integration of Libration Point Orbit Dynamics into a Universal 3-D Autonomous Formation Flying Algorithm

    NASA Technical Reports Server (NTRS)

    Folta, David; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    The autonomous formation flying control algorithm developed by the Goddard Space Flight Center (GSFC) for the New Millennium Program (NMP) Earth Observing-1 (EO-1) mission is investigated for applicability to libration point orbit formations. In the EO-1 formation-flying algorithm, control is accomplished via linearization about a reference transfer orbit with a state transition matrix (STM) computed from state inputs. The effect of libration point orbit dynamics on this algorithm architecture is explored via computation of STMs using the flight proven code, a monodromy matrix developed from a N-body model of a libration orbit, and a standard STM developed from the gravitational and coriolis effects as measured at the libration point. A comparison of formation flying Delta-Vs calculated from these methods is made to a standard linear quadratic regulator (LQR) method. The universal 3-D approach is optimal in the sense that it can be accommodated as an open-loop or closed-loop control using only state information.

  17. Monitoring Traffic Information with a Developed Acceleration Sensing Node.

    PubMed

    Ye, Zhoujing; Wang, Linbing; Xu, Wen; Gao, Zhifei; Yan, Guannan

    2017-12-05

    In this paper, an acceleration sensing node for pavement vibration was developed to monitor traffic information, including vehicle speed, vehicle types, and traffic flow, where a hardware design with low energy consumption and node encapsulation could be accomplished. The service performance of the sensing node was evaluated by methods including a waterproof test, a compression test, sensing performance analysis, and a comparison test. The results demonstrate that the sensing node is low in energy consumption, high in strength, IPX8 waterproof, and high in sensitivity and resolution. These characteristics can be applied to practical road environments. Two sensing nodes were spaced apart in the direction of travel. In the experiment, three types of vehicles passed by the monitoring points at several different speeds and values of d (the distance between the sensor and the nearest tire center line). Based on cross-correlation with kernel pre-smoothing, a calculation method was applied to process the raw data. New algorithms for traffic flow, speed, and axle length were proposed. Finally, the effects of vehicle speed, vehicle weight, and d value on acceleration amplitude were statistically evaluated. It was found that the acceleration sensing node can be used for traffic flow, vehicle speed, and other types of monitoring.
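
    A compact sketch of the speed estimate from two such nodes is shown below; a moving average stands in for the kernel pre-smoothing, and the sampling rate, node spacing, and signal shapes are invented for illustration:

```python
import numpy as np

def vehicle_speed(sig_a, sig_b, spacing_m, fs_hz, smooth=11):
    """Estimate vehicle speed from two pavement-vibration records taken at nodes
    separated by spacing_m along the travel direction: lightly smooth both signals
    (moving average) and find the time lag of the downstream node from the peak
    of their cross-correlation."""
    kernel = np.ones(smooth) / smooth
    a = np.convolve(sig_a - sig_a.mean(), kernel, mode="same")
    b = np.convolve(sig_b - sig_b.mean(), kernel, mode="same")
    corr = np.correlate(b, a, mode="full")
    lag_samples = np.argmax(corr) - (len(a) - 1)
    if lag_samples <= 0:
        return None                       # no consistent delay detected
    return spacing_m / (lag_samples / fs_hz)

if __name__ == "__main__":
    fs, spacing, speed_true = 1000.0, 2.0, 10.0       # Hz, m, m/s
    t = np.arange(0, 2, 1 / fs)
    pulse = lambda t0: np.exp(-((t - t0) ** 2) / (2 * 0.01 ** 2))
    rng = np.random.default_rng(3)
    sig_a = pulse(0.50) + 0.05 * rng.standard_normal(t.size)
    sig_b = pulse(0.50 + spacing / speed_true) + 0.05 * rng.standard_normal(t.size)
    print("estimated speed (m/s):", vehicle_speed(sig_a, sig_b, spacing, fs))
```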

  18. Monitoring Traffic Information with a Developed Acceleration Sensing Node

    PubMed Central

    Ye, Zhoujing; Wang, Linbing; Xu, Wen; Gao, Zhifei; Yan, Guannan

    2017-01-01

    In this paper, an acceleration sensing node for pavement vibration was developed to monitor traffic information, including vehicle speed, vehicle types, and traffic flow, where a hardware design with low energy consumption and node encapsulation could be accomplished. The service performance of the sensing node was evaluated by methods including a waterproof test, a compression test, sensing performance analysis, and a comparison test. The results demonstrate that the sensing node is low in energy consumption, high in strength, IPX8 waterproof, and high in sensitivity and resolution. These characteristics can be applied to practical road environments. Two sensing nodes were spaced apart in the direction of travel. In the experiment, three types of vehicles passed by the monitoring points at several different speeds and values of d (the distance between the sensor and the nearest tire center line). Based on cross-correlation with kernel pre-smoothing, a calculation method was applied to process the raw data. New algorithms for traffic flow, speed, and axle length were proposed. Finally, the effects of vehicle speed, vehicle weight, and d value on acceleration amplitude were statistically evaluated. It was found that the acceleration sensing node can be used for traffic flow, vehicle speed, and other types of monitoring. PMID:29206169

  19. A Fast Implementation of the ISOCLUS Algorithm

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; Mount, David M.; Netanyahu, Nathan S.; LeMoigne, Jacqueline

    2003-01-01

    Unsupervised clustering is a fundamental tool in numerous image processing and remote sensing applications. For example, unsupervised clustering is often used to obtain vegetation maps of an area of interest. This approach is useful when reliable training data are either scarce or expensive, and when relatively little a priori information about the data is available. Unsupervised clustering methods play a significant role in the pursuit of unsupervised classification. One of the most popular and widely used clustering schemes for remote sensing applications is the ISOCLUS algorithm, which is based on the ISODATA method. The algorithm is given a set of n data points (or samples) in d-dimensional space, an integer k indicating the initial number of clusters, and a number of additional parameters. The general goal is to compute a set of cluster centers in d-space. Although there is no specific optimization criterion, the algorithm is similar in spirit to the well known k-means clustering method in which the objective is to minimize the average squared distance of each point to its nearest center, called the average distortion. One significant feature of ISOCLUS over k-means is that clusters may be merged or split, and so the final number of clusters may be different from the number k supplied as part of the input. This algorithm is described later in this paper. The ISOCLUS algorithm can run very slowly, particularly on large data sets. Given its wide use in remote sensing, its efficient computation is an important goal. We have developed a fast implementation of the ISOCLUS algorithm. Our improvement is based on a recent acceleration to the k-means algorithm, the filtering algorithm, by Kanungo et al. They showed that, by storing the data in a kd-tree, it was possible to significantly reduce the running time of k-means. We have adapted this method for the ISOCLUS algorithm. For technical reasons, which are explained later, it is necessary to make a minor modification to the ISOCLUS specification. We provide empirical evidence, on both synthetic and Landsat image data sets, that our algorithm's performance is essentially the same as that of ISOCLUS, but with significantly lower running times. We show that our algorithm runs from 3 to 30 times faster than a straightforward implementation of ISOCLUS. Our adaptation of the filtering algorithm involves the efficient computation of a number of cluster statistics that are needed for ISOCLUS, but not for k-means.
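
    A toy ISODATA-style loop is sketched below to make the assignment/split/merge structure concrete; the thresholds and schedule are illustrative, and the kd-tree filtering acceleration that is the subject of the paper is only indicated by a comment:

```python
import numpy as np

def isodata_like(points, k_init=4, iters=8, split_std=1.0, merge_dist=0.5, seed=0):
    """Tiny ISODATA-style clustering sketch: k-means-like assignment and centre
    update, followed by splitting clusters that are elongated along some dimension
    and merging centre pairs that are too close, so the final number of clusters
    can differ from k_init (as in ISOCLUS)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k_init, replace=False)].copy()
    for _ in range(iters):
        # assignment + mean update: the step accelerated by the kd-tree
        # "filtering" algorithm in the paper (a brute-force version is shown)
        labels = np.linalg.norm(points[:, None, :] - centers[None, :, :],
                                axis=2).argmin(axis=1)
        uniq = np.unique(labels)
        centers = np.array([points[labels == j].mean(axis=0) for j in uniq])
        labels = np.searchsorted(uniq, labels)
        # split clusters with a large spread along one dimension
        new_centers = []
        for j, c in enumerate(centers):
            members = points[labels == j]
            spread = members.std(axis=0) if len(members) > 1 else np.zeros_like(c)
            d = int(spread.argmax())
            if spread[d] > split_std:
                offset = np.zeros_like(c)
                offset[d] = spread[d]
                new_centers += [c + offset, c - offset]
            else:
                new_centers.append(c)
        # merge centre pairs closer than merge_dist (greedy)
        kept = []
        for c in new_centers:
            if all(np.linalg.norm(c - other) > merge_dist for other in kept):
                kept.append(c)
        centers = np.array(kept)
    return centers

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    blobs = np.vstack([rng.normal(m, 0.3, size=(100, 2))
                       for m in ([0, 0], [4, 0], [0, 4])])
    print(isodata_like(blobs, k_init=2))        # typically settles near the 3 blob centres
```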

  20. A method of extracting impervious surface based on rule algorithm

    NASA Astrophysics Data System (ADS)

    Peng, Shuangyun; Hong, Liang; Xu, Quanli

    2018-02-01

    The impervious surface has become an important index for evaluating urban environmental quality and measuring the level of urbanization, and remote sensing technology has become the main way to extract it. In this paper, a method for extracting impervious surface based on a rule algorithm is proposed. The main idea of the method is to use a rule-based algorithm to extract the impervious surface from the characteristics of, and differences between, the impervious surface and the other three object types (water, soil and vegetation) in the seven original bands plus NDWI and NDVI. The procedure has three steps: 1) vegetation is extracted according to the principle that vegetation reflectance is higher in the near-infrared band than in the other bands; 2) water is then extracted according to its characteristic of having the highest NDWI and the lowest NDVI; 3) finally, the impervious surface is extracted based on the fact that it has a higher NDWI value than soil and the lowest NDVI value. To test the accuracy of the rule algorithm, this paper applies the linear spectral mixture decomposition algorithm, the CART algorithm, and the NDII index algorithm to extract the impervious surface from six remote sensing images of the Dianchi Lake Basin acquired from 1999 to 2014, and compares their overall classification accuracies with that of the rule algorithm. The extraction method based on the rule algorithm is found to achieve clearly higher accuracy than the other three methods.
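
    The three-step rule order described above can be sketched as follows; the numeric thresholds are illustrative placeholders and would need tuning against real imagery:

```python
import numpy as np

def classify_impervious(bands):
    """bands: dict of 2-D reflectance arrays keyed by band name ('green', 'red',
    'nir', ...).  Follows the three-step rule order described above: vegetation,
    then water, then impervious surface vs. soil, using NDVI, NDWI and the NIR
    band; the thresholds are illustrative placeholders."""
    green, red, nir = bands["green"], bands["red"], bands["nir"]
    ndvi = (nir - red) / (nir + red + 1e-6)
    ndwi = (green - nir) / (green + nir + 1e-6)

    label = np.full(nir.shape, 3, dtype=np.uint8)            # 3 = soil (default)
    vegetation = (nir > np.maximum.reduce([green, red])) & (ndvi > 0.3)
    water = ~vegetation & (ndwi > 0.2) & (ndvi < 0.0)
    impervious = ~vegetation & ~water & (ndwi > -0.1) & (ndvi < 0.2)
    label[vegetation] = 0                                    # 0 = vegetation
    label[water] = 1                                         # 1 = water
    label[impervious] = 2                                    # 2 = impervious surface
    return label

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = {b: rng.uniform(0.0, 0.5, (4, 4)) for b in ("green", "red", "nir")}
    print(classify_impervious(demo))
```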

  1. Coordinating robot motion, sensing, and control in plans. LDRD project final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xavier, P.G.; Brown, R.G.; Watterberg, P.A.

    1997-08-01

    The goal of this project was to develop a framework for robotic planning and execution that provides a continuum of adaptability with respect to model incompleteness, model error, and sensing error. For example, dividing robot motion into gross-motion planning, fine-motion planning, and sensor-augmented control had yielded productive research and solutions to individual problems. Unfortunately, these techniques could only be combined by hand with ad hoc methods and were restricted to systems where all kinematics are completely modeled in planning. The original intent was to develop methods for understanding and autonomously synthesizing plans that coordinate motion, sensing, and control. The project considered this problem from several perspectives. Results included (1) theoretical methods to combine and extend gross-motion and fine-motion planning; (2) preliminary work in flexible-object manipulation and an implementable algorithm for planning shortest paths through obstacles for the free end of an anchored cable; (3) development and implementation of a fast swept-body distance algorithm; and (4) integration of Sandia's C-Space Toolkit geometry engine and SANDROS motion planner and improvements, which yielded a system practical for everyday motion planning, with path-segment planning at interactive speeds. Results (3) and (4) have either led to follow-on work or are being used in current projects, and the authors believe that (2) eventually will be as well.

  2. Supercooled Liquid Water Content Instrument Analysis and Winter 2014 Data with Comparisons to the NASA Icing Remote Sensing System and Pilot Reports

    NASA Technical Reports Server (NTRS)

    King, Michael C.

    2016-01-01

    The National Aeronautics and Space Administration (NASA) has developed a system for remotely detecting the hazardous conditions leading to aircraft icing in flight, the NASA Icing Remote Sensing System (NIRSS). Newly developed, weather balloon-borne instruments have been used to obtain in-situ measurements of supercooled liquid water during March 2014 to validate the algorithms used in the NIRSS. A mathematical model and a processing method were developed to analyze the data obtained from the weather balloon soundings. The data from soundings obtained in March 2014 were analyzed and compared to the output from the NIRSS and pilot reports.

  3. Simulation and Flight Test Capability for Testing Prototype Sense and Avoid System Elements

    NASA Technical Reports Server (NTRS)

    Howell, Charles T.; Stock, Todd M.; Verstynen, Harry A.; Wehner, Paul J.

    2012-01-01

    NASA Langley Research Center (LaRC) and The MITRE Corporation (MITRE) have developed, and successfully demonstrated, an integrated simulation-to-flight capability for evaluating sense and avoid (SAA) system elements. This integrated capability consists of a MITRE developed fast-time computer simulation for evaluating SAA algorithms, and a NASA LaRC surrogate unmanned aircraft system (UAS) equipped to support hardware and software in-the-loop evaluation of SAA system elements (e.g., algorithms, sensors, architecture, communications, autonomous systems), concepts, and procedures. The fast-time computer simulation subjects algorithms to simulated flight encounters/ conditions and generates a fitness report that records strengths, weaknesses, and overall performance. Reviewed algorithms (and their fitness report) are then transferred to NASA LaRC where additional (joint) airworthiness evaluations are performed on the candidate SAA system-element configurations, concepts, and/or procedures of interest; software and hardware components are integrated into the Surrogate UAS research systems; and flight safety and mission planning activities are completed. Onboard the Surrogate UAS, candidate SAA system element configurations, concepts, and/or procedures are subjected to flight evaluations and in-flight performance is monitored. The Surrogate UAS, which can be controlled remotely via generic Ground Station uplink or automatically via onboard systems, operates with a NASA Safety Pilot/Pilot in Command onboard to permit safe operations in mixed airspace with manned aircraft. An end-to-end demonstration of a typical application of the capability was performed in non-exclusionary airspace in October 2011; additional research, development, flight testing, and evaluation efforts using this integrated capability are planned throughout fiscal year 2012 and 2013.

  4. Aerosol Remote Sensing

    NASA Technical Reports Server (NTRS)

    Lenoble, Jacqueline (Editor); Remer, Lorraine (Editor); Tanre, Didier (Editor)

    2012-01-01

    This book gives a much needed explanation of the basic physical principles of radiative transfer and remote sensing, and presents all the instruments and retrieval algorithms in a homogenous manner. For the first time, an easy path from theory to practical algorithms is available in one easily accessible volume, making the connection between theoretical radiative transfer and individual practical solutions to retrieve aerosol information from remote sensing. In addition, the specifics and intercomparison of all current and historical methods are explained and clarified.

  5. Performance of target detection algorithm in compressive sensing miniature ultraspectral imaging compressed sensing system

    NASA Astrophysics Data System (ADS)

    Gedalin, Daniel; Oiknine, Yaniv; August, Isaac; Blumberg, Dan G.; Rotman, Stanley R.; Stern, Adrian

    2017-04-01

    Compressive sensing theory was proposed to deal with the high quantity of measurements demanded by traditional hyperspectral systems. Recently, a compressive spectral imaging technique dubbed compressive sensing miniature ultraspectral imaging (CS-MUSI) was presented. This system uses a voltage controlled liquid crystal device to create multiplexed hyperspectral cubes. We evaluate the utility of the data captured using the CS-MUSI system for the task of target detection. Specifically, we compare the performance of the matched filter target detection algorithm in traditional hyperspectral systems and in CS-MUSI multiplexed hyperspectral cubes. We found that the target detection algorithm performs similarly in both cases, despite the fact that the CS-MUSI data is up to an order of magnitude less than that in conventional hyperspectral cubes. Moreover, the target detection is approximately an order of magnitude faster in CS-MUSI data.
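
    For reference, a standard spectral matched filter on a conventional (non-multiplexed) cube looks like the sketch below; applying it to CS-MUSI data would additionally require handling the multiplexed measurements, which is not shown, and the scene here is synthetic:

```python
import numpy as np

def matched_filter_scores(cube, target):
    """Spectral matched filter: cube is (rows, cols, bands) and target is a
    (bands,) signature.  Background mean and covariance are estimated from the
    whole scene; the score measures how well each background-removed pixel
    spectrum aligns with the target direction."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(bands)    # regularised covariance
    cov_inv = np.linalg.inv(cov)
    d = target - mu
    w = cov_inv @ d / np.sqrt(d @ cov_inv @ d)              # normalised filter vector
    return ((X - mu) @ w).reshape(rows, cols)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    cube = rng.normal(0.0, 1.0, (32, 32, 40))
    target = rng.normal(0.0, 1.0, 40)
    cube[10, 20] += 3.0 * target                            # implant a weak target
    scores = matched_filter_scores(cube, target)
    print(np.unravel_index(scores.argmax(), scores.shape))  # expected: (10, 20)
```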

  6. Capacitive touch sensing: signal and image processing algorithms

    NASA Astrophysics Data System (ADS)

    Baharav, Zachi; Kakarala, Ramakrishna

    2011-03-01

    Capacitive touch sensors have been in use for many years, and recently gained center stage with the ubiquitous use in smart-phones. In this work we will analyze the most common method of projected capacitive sensing, that of absolute capacitive sensing, together with the most common sensing pattern, that of diamond-shaped sensors. After a brief introduction to the problem, and the reasons behind its popularity, we will formulate the problem as a reconstruction from projections. We derive analytic solutions for two simple cases: circular finger on a wire grid, and square finger on a square grid. The solutions give insight into the ambiguities of finding finger location from sensor readings. The main contribution of our paper is the discussion of interpolation algorithms including simple linear interpolation, curve fitting (parabolic and Gaussian), filtering, general look-up-table, and combinations thereof. We conclude with observations on the limits of the present algorithmic methods, and point to possible future research.
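
    The parabolic variant of the interpolation step can be sketched in a few lines; the electrode pitch and the synthetic finger profile below are assumptions for illustration:

```python
import numpy as np

def parabolic_touch_position(readings, pitch_mm=5.0):
    """Estimate finger position along one axis of the sensor from per-electrode
    capacitance deltas by fitting a parabola through the strongest electrode and
    its two neighbours (one of the interpolation schemes discussed in the paper;
    the pitch value is illustrative)."""
    readings = np.asarray(readings, float)
    i = int(readings.argmax())
    if i == 0 or i == len(readings) - 1:
        return i * pitch_mm                      # edge electrode: no neighbours to fit
    ym, y0, yp = readings[i - 1], readings[i], readings[i + 1]
    denom = ym - 2 * y0 + yp
    offset = 0.0 if denom == 0 else 0.5 * (ym - yp) / denom   # parabola vertex
    return (i + offset) * pitch_mm

if __name__ == "__main__":
    # synthetic capacitance profile of a finger centred at 2.3 electrode pitches
    x = np.arange(8)
    profile = np.exp(-((x - 2.3) ** 2) / 2.0)
    print(parabolic_touch_position(profile))     # close to 2.3 * 5 mm
```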

  7. Towards the Development of a Smart Flying Sensor: Illustration in the Field of Precision Agriculture.

    PubMed

    Hernandez, Andres; Murcia, Harold; Copot, Cosmin; De Keyser, Robin

    2015-07-10

    Sensing is an important element to quantify productivity, product quality and to make decisions. Applications, such as mapping, surveillance, exploration and precision agriculture, require a reliable platform for remote sensing. This paper presents the first steps towards the development of a smart flying sensor based on an unmanned aerial vehicle (UAV). The concept of smart remote sensing is illustrated and its performance tested for the task of mapping the volume of grain inside a trailer during forage harvesting. Novelty lies in: (1) the development of a position-estimation method with time delay compensation based on inertial measurement unit (IMU) sensors and image processing; (2) a method to build a 3D map using information obtained from a regular camera; and (3) the design and implementation of a path-following control algorithm using model predictive control (MPC). Experimental results on a lab-scale system validate the effectiveness of the proposed methodology.

  8. Towards the Development of a Smart Flying Sensor: Illustration in the Field of Precision Agriculture

    PubMed Central

    Hernandez, Andres; Murcia, Harold; Copot, Cosmin; De Keyser, Robin

    2015-01-01

    Sensing is an important element to quantify productivity, product quality and to make decisions. Applications, such as mapping, surveillance, exploration and precision agriculture, require a reliable platform for remote sensing. This paper presents the first steps towards the development of a smart flying sensor based on an unmanned aerial vehicle (UAV). The concept of smart remote sensing is illustrated and its performance tested for the task of mapping the volume of grain inside a trailer during forage harvesting. Novelty lies in: (1) the development of a position-estimation method with time delay compensation based on inertial measurement unit (IMU) sensors and image processing; (2) a method to build a 3D map using information obtained from a regular camera; and (3) the design and implementation of a path-following control algorithm using model predictive control (MPC). Experimental results on a lab-scale system validate the effectiveness of the proposed methodology. PMID:26184205

  9. Laser vibrometry exploitation for vehicle identification

    NASA Astrophysics Data System (ADS)

    Nolan, Adam; Lingg, Andrew; Goley, Steve; Sigmund, Kevin; Kangas, Scott

    2014-06-01

    Vibration signatures sensed from distant vehicles using laser vibrometry systems provide valuable information that may be used to help identify key vehicle features such as engine type, engine speed, and number of cylinders. Through the use of physics models of the vibration phenomenology, features are chosen to support classification algorithms. Various individual exploitation algorithms were developed using these models to classify vibration signatures into engine type (piston vs. turbine), engine configuration (Inline 4 vs. Inline 6 vs. V6 vs. V8 vs. V12) and vehicle type. The results of these algorithms will be presented for an 8 class problem. Finally, the benefits of using a factor graph representation to link these independent algorithms together will be presented which constructs a classification hierarchy for the vibration exploitation problem.

  10. ASTEP user's guide and software documentation

    NASA Technical Reports Server (NTRS)

    Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.

    1974-01-01

    The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capability. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.

  11. Channel estimation based on quantized MMP for FDD massive MIMO downlink

    NASA Astrophysics Data System (ADS)

    Guo, Yao-ting; Wang, Bing-he; Qu, Yi; Cai, Hua-jie

    2016-10-01

    In this paper, we consider channel estimation for massive MIMO systems operating in frequency division duplexing mode. By exploiting the sparsity of propagation paths in the massive MIMO channel, we develop a compressed sensing (CS) based channel estimator that can reduce the pilot overhead. Compared with conventional least squares (LS) and linear minimum mean square error (LMMSE) estimation, the proposed algorithm, which is based on quantized multipath matching pursuit (MMP), reduces the pilot overhead and performs better than other CS algorithms. The simulation results demonstrate the advantage of the proposed algorithm over various existing methods, including the LS, LMMSE, CoSaMP and conventional MMP estimators.
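
    As a sketch of the matching-pursuit family used here, the snippet below implements plain orthogonal matching pursuit for a sparse channel (MMP differs by keeping several candidate support sets per iteration, and the quantized variant further discretizes the search); the dimensions and pilot matrix are synthetic:

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: greedy recovery of a sparse vector h from
    y = A h by repeatedly adding the column most correlated with the residual
    and re-fitting by least squares on the selected support."""
    residual = y.copy()
    support = []
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(A.conj().T @ residual)))
        if j not in support:
            support.append(j)
        h_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ h_s
    h = np.zeros(A.shape[1], dtype=complex)
    h[support] = h_s
    return h

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    n_pilots, n_taps, n_paths = 32, 128, 4         # pilot count << channel length
    A = (rng.standard_normal((n_pilots, n_taps)) +
         1j * rng.standard_normal((n_pilots, n_taps))) / np.sqrt(2 * n_pilots)
    h_true = np.zeros(n_taps, complex)
    h_true[rng.choice(n_taps, n_paths, replace=False)] = (
        rng.standard_normal(n_paths) + 1j * rng.standard_normal(n_paths))
    y = A @ h_true
    h_hat = omp(A, y, n_paths)
    print("NMSE:", np.linalg.norm(h_hat - h_true) ** 2 / np.linalg.norm(h_true) ** 2)
```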

  12. Mobile transporter path planning

    NASA Technical Reports Server (NTRS)

    Baffes, Paul; Wang, Lui

    1990-01-01

    The use of a genetic algorithm (GA) for solving the mobile transporter path planning problem is investigated. The mobile transporter is a traveling robotic vehicle proposed for the space station which must be able to reach any point of the structure autonomously. Elements of the genetic algorithm are explored in both a theoretical and experimental sense. Specifically, double crossover, greedy crossover, and tournament selection techniques are examined. Additionally, the use of local optimization techniques working in concert with the GA are also explored. Recent developments in genetic algorithm theory are shown to be particularly effective in a path planning problem domain, though problem areas can be cited which require more research.
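
    A minimal GA with tournament selection, single-point crossover, and Gaussian mutation on a toy 2-D waypoint-planning problem is sketched below; it is not the space-station transporter model, and the double/greedy crossover variants and local optimization discussed above are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(6)
START, GOAL = np.array([0.0, 0.0]), np.array([10.0, 0.0])
OBSTACLE, RADIUS = np.array([5.0, 0.0]), 2.0
N_WAY = 4                                        # free intermediate waypoints

def cost(genome):
    """Path length plus a penalty for sampled path points inside the obstacle."""
    pts = np.vstack([START, genome.reshape(N_WAY, 2), GOAL])
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    samples = np.concatenate([np.linspace(pts[i], pts[i + 1], 20)
                              for i in range(len(pts) - 1)])
    intrusion = np.maximum(0.0, RADIUS - np.linalg.norm(samples - OBSTACLE, axis=1))
    return length + 50.0 * intrusion.sum()

def tournament(pop, costs, size=3):
    """Tournament selection: best of `size` randomly drawn individuals."""
    idx = rng.choice(len(pop), size, replace=False)
    return pop[idx[np.argmin(costs[idx])]]

def evolve(pop_size=60, generations=150, p_mut=0.2):
    pop = rng.uniform(-2.0, 12.0, size=(pop_size, 2 * N_WAY))
    pop[:, 1::2] = rng.uniform(-5.0, 5.0, size=(pop_size, N_WAY))   # y coordinates
    for _ in range(generations):
        costs = np.array([cost(g) for g in pop])
        children = []
        while len(children) < pop_size:
            a, b = tournament(pop, costs), tournament(pop, costs)
            cut = rng.integers(1, 2 * N_WAY)                        # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            if rng.random() < p_mut:
                child = child + rng.normal(0.0, 0.5, size=child.shape)  # Gaussian mutation
            children.append(child)
        pop = np.array(children)
    costs = np.array([cost(g) for g in pop])
    return pop[np.argmin(costs)], costs.min()

if __name__ == "__main__":
    best, best_cost = evolve()
    print("best path cost:", round(float(best_cost), 2))
    print("waypoints:\n", best.reshape(N_WAY, 2).round(2))
```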

  13. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
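
    The Monte Carlo idea can be sketched with a toy linear instrument and a regularized least-squares retrieval; everything below (forward model, retrieval, noise level) is an invented stand-in for an operational algorithm such as the OCO-2 retrieval:

```python
import numpy as np

def retrieval_sampling_distribution(x_true, forward, retrieve, noise_cov, n_mc=1000, seed=7):
    """Monte Carlo approximation of a retrieval's sampling distribution: simulate
    many noisy radiance realisations from a known true state, run the retrieval
    on each one, and summarise the bias and spread of the estimates."""
    rng = np.random.default_rng(seed)
    y_clean = forward(x_true)
    chol = np.linalg.cholesky(noise_cov)
    estimates = np.array([retrieve(y_clean + chol @ rng.standard_normal(len(y_clean)))
                          for _ in range(n_mc)])
    return estimates.mean(axis=0) - x_true, estimates.std(axis=0)

if __name__ == "__main__":
    # toy linear "instrument" y = K x + noise and a regularised least-squares retrieval
    rng = np.random.default_rng(0)
    K = rng.standard_normal((50, 3))
    forward = lambda x: K @ x
    retrieve = lambda y: np.linalg.solve(K.T @ K + 1e-2 * np.eye(3), K.T @ y)
    bias, spread = retrieval_sampling_distribution(np.array([1.0, -0.5, 2.0]),
                                                   forward, retrieve,
                                                   noise_cov=0.05 * np.eye(50))
    print("bias:  ", bias.round(3))
    print("spread:", spread.round(3))
```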

  14. Development of a WRF-RTFDDA-based high-resolution hybrid data-assimilation and forecasting system toward to operation in the Middle East

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Wu, W.; Zhang, Y.; Kucera, P. A.; Liu, Y.; Pan, L.

    2012-12-01

    Weather forecasting in the Middle East is challenging because of its complicated geography, including a massive coastal area and heterogeneous land, and a sparse regional observational network. Strong air-land-sea interactions form multi-scale weather regimes in the area, which require a numerical weather prediction model capable of properly representing multi-scale atmospheric flow with appropriate initial conditions. The WRF-based Real-Time Four Dimensional Data Assimilation (RTFDDA) system is one of the advanced multi-scale weather analysis and forecasting facilities developed at the Research Applications Laboratory (RAL) of NCAR. The forecasting system is applied to the Middle East with careful configuration. To overcome the limitation of the very sparsely available conventional observations in the region, we develop a hybrid data assimilation algorithm combining RTFDDA and WRF-3DVAR, which ingests remote sensing data from satellites and radar. This hybrid data assimilation blends Newtonian nudging FDDA and 3DVAR technology to effectively assimilate both conventional observations and remote sensing measurements and provide improved initial conditions for the forecasting system. For brevity, the forecasting system is called RTF3H (RTFDDA-3DVAR Hybrid). In this presentation, we will discuss the hybrid data assimilation algorithm, its implementation, and applications to high-impact weather events in the area. Sensitivity studies are conducted to understand the strengths and limitations of this hybrid data assimilation algorithm.

  15. Surface imaging microscope

    NASA Astrophysics Data System (ADS)

    Rogala, Eric W.; Bankman, Isaac N.

    2008-04-01

    The three-dimensional shapes of microscopic objects are becoming increasingly important for battlespace CBRNE sensing. Potential applications of microscopic 3D shape observations include characterization of biological weapon particles and manufacturing of micromechanical components. Aerosol signatures of stand-off lidar systems, using elastic backscatter or polarization, are dictated by the aerosol particle shapes and sizes that must be well characterized in the lab. A low-cost, fast instrument for 3D surface shape microscopy will be a valuable point sensor for biological particle sensing applications. Both the cost and imaging durations of traditional techniques such as confocal microscopes, atomic force microscopes, and electron scanning microscopes are too high. We investigated the feasibility of a low-cost, fast interferometric technique for imaging the 3D surface shape of microscopic objects at frame rates limited only by the camera in the system. The system operates at two laser wavelengths producing two fringe images collected simultaneously by a digital camera, and a specialized algorithm we developed reconstructs the surface map of the microscopic object. The current implementation assembled to test the concept and develop the new 3D reconstruction algorithm has 0.25 micron resolution in the x and y directions, and about 0.1 micron accuracy in the z direction, as tested on a microscopic glass test object manufactured with etching techniques. We describe the interferometric instrument, present the reconstruction algorithm, and discuss further development.

  16. Post-hurricane forest damage assessment using satellite remote sensing

    Treesearch

    W. Wang; J.J. Qu; X. Hao; Y. Liu; J.A. Stanturf

    2010-01-01

    This study developed a rapid assessment algorithm for post-hurricane forest damage estimation using moderate resolution imaging spectroradiometer (MODIS) measurements. The performance of five commonly used vegetation indices as post-hurricane forest damage indicators was investigated through statistical analysis. The Normalized Difference Infrared Index (NDII) was...

  17. From Whole Numbers to Invert and Multiply

    ERIC Educational Resources Information Center

    Cavey, Laurie O.; Kinzel, Margaret T.

    2014-01-01

    Teachers report that engaging students in solving contextual problems is an important part of supporting student understanding of algorithms for fraction division. Meaning for whole-number operations is a crucial part of making sense of contextual problems involving rational numbers. The authors present a developed instructional sequence to…

  18. Object-Oriented Classification of Sugarcane Using Time-Series Middle-Resolution Remote Sensing Data Based on AdaBoost

    PubMed Central

    Zhou, Zhen; Huang, Jingfeng; Wang, Jing; Zhang, Kangyu; Kuang, Zhaomin; Zhong, Shiquan; Song, Xiaodong

    2015-01-01

    Most areas planted with sugarcane are located in southern China. However, remote sensing of sugarcane has been limited because useable remote sensing data are limited due to the cloudy climate of this region during the growing season and severe spectral mixing with other crops. In this study, we developed a methodology for automatically mapping sugarcane over large areas using time-series middle-resolution remote sensing data. For this purpose, two major techniques were used, the object-oriented method (OOM) and data mining (DM). In addition, time-series Chinese HJ-1 CCD images were obtained during the sugarcane growing period. Image objects were generated using a multi-resolution segmentation algorithm, and DM was implemented using the AdaBoost algorithm, which generated the prediction model. The prediction model was applied to the HJ-1 CCD time-series image objects, and then a map of the sugarcane planting area was produced. The classification accuracy was evaluated using independent field survey sampling points. The confusion matrix analysis showed that the overall classification accuracy reached 93.6% and that the Kappa coefficient was 0.85. Thus, the results showed that this method is feasible, efficient, and applicable for extrapolating the classification of other crops in large areas where the application of high-resolution remote sensing data is impractical due to financial considerations or because qualified images are limited. PMID:26528811
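
    The classification chain described above (segment, extract per-object features, train a boosted ensemble, predict object labels, assess with a confusion matrix and Kappa) can be sketched generically with off-the-shelf tools. The snippet below is a minimal stand-in using scikit-learn's AdaBoostClassifier on hypothetical per-object time-series features; it is not the authors' implementation, and the feature layout and data are placeholders.

      import numpy as np
      from sklearn.ensemble import AdaBoostClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import confusion_matrix, cohen_kappa_score

      # Hypothetical per-object features, e.g. index values at several HJ-1 CCD dates.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 8))            # 500 image objects x 8 temporal features
      y = rng.integers(0, 2, size=500)         # 1 = sugarcane, 0 = other cover

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

      pred = clf.predict(X_te)
      print(confusion_matrix(y_te, pred))      # accuracy assessment, as in the paper
      print(cohen_kappa_score(y_te, pred))     # Kappa coefficient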

  19. Object-Oriented Classification of Sugarcane Using Time-Series Middle-Resolution Remote Sensing Data Based on AdaBoost.

    PubMed

    Zhou, Zhen; Huang, Jingfeng; Wang, Jing; Zhang, Kangyu; Kuang, Zhaomin; Zhong, Shiquan; Song, Xiaodong

    2015-01-01

    Most areas planted with sugarcane are located in southern China. However, remote sensing of sugarcane has been limited because useable remote sensing data are limited due to the cloudy climate of this region during the growing season and severe spectral mixing with other crops. In this study, we developed a methodology for automatically mapping sugarcane over large areas using time-series middle-resolution remote sensing data. For this purpose, two major techniques were used, the object-oriented method (OOM) and data mining (DM). In addition, time-series Chinese HJ-1 CCD images were obtained during the sugarcane growing period. Image objects were generated using a multi-resolution segmentation algorithm, and DM was implemented using the AdaBoost algorithm, which generated the prediction model. The prediction model was applied to the HJ-1 CCD time-series image objects, and then a map of the sugarcane planting area was produced. The classification accuracy was evaluated using independent field survey sampling points. The confusion matrix analysis showed that the overall classification accuracy reached 93.6% and that the Kappa coefficient was 0.85. Thus, the results showed that this method is feasible, efficient, and applicable for extrapolating the classification of other crops in large areas where the application of high-resolution remote sensing data is impractical due to financial considerations or because qualified images are limited.

  20. ROBUST ONLINE MONITORING FOR CALIBRATION ASSESSMENT OF TRANSMITTERS AND INSTRUMENTATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Tipireddy, Ramakrishna; Lerchen, Megan E.

    Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. Specifically, the next generation of OLM technology is expected to include newly developed advanced algorithms that improve monitoring of sensor/system performance and enable the use of plant data to derive information that currently cannot be measured. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this paper, we give an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program on the development of OLM algorithms that use sensor outputs, in combination with other available information, to 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions:
      • Signal validation – fault detection and selection of acceptance criteria
      • Virtual sensing – signal value prediction and acceptance criteria
      • Response-time assessment – fault detection and acceptance criteria selection
    A Gaussian process (GP)-based uncertainty quantification (UQ) method previously developed for UQ in OLM was adapted for use in sensor-fault detection and virtual sensing. For signal validation, the various components of the OLM residual (which is computed using an auto-associative kernel regression, AAKR, model) were explicitly defined and modeled using a GP. Evaluation was conducted using flow loop data from multiple sources. Results using experimental data from laboratory-scale flow loops indicate that the approach, while capable of detecting sensor drift, may be incapable of discriminating between sensor drift and model inadequacy. This may be due to a simplification applied in the initial modeling, where the sensor degradation is assumed to be stationary. In the case of virtual sensors, the GP model was used in a predictive mode to estimate the correct sensor reading for sensors that may have failed. Results have indicated the viability of using this approach for virtual sensing. However, the GP model has proven to be computationally expensive, so alternative algorithms for virtual sensing are being evaluated. Finally, automated approaches to performing noise analysis for extracting sensor response time were developed. Evaluation of this technique using laboratory-scale data indicates that it compares well with manual techniques previously used for noise analysis. Moreover, the automated and manual approaches for noise analysis also compare well with the current "gold standard", hydraulic ramp testing, for response time monitoring. Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.
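
    As an illustration of the virtual-sensing idea (predicting what a suspect sensor should read from correlated, healthy channels), the sketch below fits a Gaussian process regressor to historical multi-sensor data and predicts the target channel with an uncertainty estimate. It is a generic stand-in using scikit-learn, not the project's GP/AAKR implementation; the variable names and data are hypothetical.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Hypothetical training history: readings of healthy sensors (X) and the
      # target sensor (y) recorded while all channels were known to be in calibration.
      rng = np.random.default_rng(1)
      X_hist = rng.uniform(0.0, 10.0, size=(200, 3))        # e.g. flow, pressure, temp
      y_hist = 2.0 * X_hist[:, 0] - 0.5 * X_hist[:, 1] + rng.normal(0, 0.1, 200)

      kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_hist, y_hist)

      # If the target sensor later looks suspect, infer its "virtual" reading.
      x_now = np.array([[5.0, 2.0, 7.5]])
      y_virtual, y_std = gp.predict(x_now, return_std=True)
      print(y_virtual, y_std)    # predicted reading and its 1-sigma uncertainty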

  1. Expert system constant false alarm rate processor

    NASA Astrophysics Data System (ADS)

    Baldygo, William J., Jr.; Wicks, Michael C.

    1993-10-01

    The requirements for high detection probability and low false alarm probability in modern wide area surveillance radars are rarely met due to spatial variations in clutter characteristics. Many filtering and CFAR detection algorithms have been developed to effectively deal with these variations; however, any single algorithm is likely to exhibit excessive false alarms and intolerably low detection probabilities in a dynamically changing environment. A great deal of research has led to advances in the state of the art in Artificial Intelligence (AI) and numerous areas have been identified for application to radar signal processing. The approach suggested here, discussed in a patent application submitted by the authors, is to intelligently select the filtering and CFAR detection algorithms being executed at any given time, based upon the observed characteristics of the interference environment. This approach requires sensing the environment, employing the most suitable algorithms, and applying an appropriate multiple algorithm fusion scheme or consensus algorithm to produce a global detection decision.
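
    For context, the sketch below shows one of the classical detectors such a system would select among: a cell-averaging CFAR, which estimates the local interference power from reference cells around the cell under test and scales it to hold a constant false-alarm rate. This is a textbook illustration, not the patented expert-system processor; the window sizes and design false-alarm probability are placeholders.

      import numpy as np

      def ca_cfar(power, num_ref=16, num_guard=2, pfa=1e-4):
          """Cell-averaging CFAR over a 1-D power sequence; returns a detection mask."""
          n = len(power)
          half = num_ref // 2
          alpha = num_ref * (pfa ** (-1.0 / num_ref) - 1.0)   # threshold scaling for Pfa
          det = np.zeros(n, dtype=bool)
          for i in range(half + num_guard, n - half - num_guard):
              lead = power[i - num_guard - half : i - num_guard]
              lag = power[i + num_guard + 1 : i + num_guard + 1 + half]
              noise = (lead.sum() + lag.sum()) / num_ref       # local interference level
              det[i] = power[i] > alpha * noise
          return det

      # Toy usage: exponential clutter with one injected target.
      rng = np.random.default_rng(2)
      x = rng.exponential(1.0, 1000)
      x[500] += 30.0
      print(np.flatnonzero(ca_cfar(x)))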

  2. An Approach of Registration between Remote Sensing Image and Electronic Chart Based on Coastal Line

    NASA Astrophysics Data System (ADS)

    Li, Ying; Yu, Shuiming; Li, Chuanlong

    Remote sensing plays an important role in marine oil spill emergencies. To implement timely and effective countermeasures, it is important to provide the exact position of oil spills, so the remote sensing image and the electronic chart must be matched properly. Discrepancies ordinarily exist between an oil spill image and an electronic chart, even after geometric correction is applied to the remote sensing image, and it is difficult to find stable control points at sea for exact rectification. Because oil spills generally occur near the coast, an improved relaxation algorithm was developed for finding control points along the coastline. A conversion function is created by least squares, and the remote sensing image can be registered with the vector map based on this function. SAR images were used as the remote sensing data and shapefile maps as the electronic chart data. The results show that this approach can guarantee the precision of the registration, which is essential for oil spill monitoring.
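
    To make the least-squares step concrete, the sketch below fits an affine transform between matched coastline control points and applies it to map image coordinates to chart coordinates. It is a generic illustration of the conversion-function idea, not the paper's relaxation matching; the affine form and the control points are assumptions, since the excerpt does not state the function's form.

      import numpy as np

      def fit_affine(src, dst):
          """Least-squares affine transform mapping src (N x 2) points onto dst (N x 2)."""
          n = src.shape[0]
          A = np.hstack([src, np.ones((n, 1))])      # rows [x, y, 1]
          M, *_ = np.linalg.lstsq(A, dst, rcond=None)   # solve A @ M = dst for 3x2 M
          return M

      def apply_affine(M, pts):
          return np.hstack([pts, np.ones((pts.shape[0], 1))]) @ M

      # Hypothetical control points along the coastline (image px -> chart coords).
      img_pts = np.array([[120.0, 45.0], [300.0, 80.0], [410.0, 260.0], [150.0, 330.0]])
      chart_pts = np.array([[12.01, 54.95], [12.19, 54.92], [12.30, 54.74], [12.04, 54.67]])
      M = fit_affine(img_pts, chart_pts)
      print(apply_affine(M, img_pts) - chart_pts)    # registration residuals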

  3. A Compressed Sensing-based Image Reconstruction Algorithm for Solar Flare X-Ray Observations

    NASA Astrophysics Data System (ADS)

    Felix, Simon; Bolzern, Roman; Battaglia, Marina

    2017-11-01

    One way of imaging X-ray emission from solar flares is to measure Fourier components of the spatial X-ray source distribution. We present a new compressed sensing-based algorithm named VIS_CS, which reconstructs the spatial distribution from such Fourier components. We demonstrate the application of the algorithm on synthetic and observed solar flare X-ray data from the Reuven Ramaty High Energy Solar Spectroscopic Imager satellite and compare its performance with existing algorithms. VIS_CS produces competitive results with accurate photometry and morphology, without requiring any algorithm- and X-ray-source-specific parameter tuning. Its robustness and performance make this algorithm ideally suited for the generation of quicklook images or large image cubes without user intervention, such as for imaging spectroscopy analysis.

  4. A Compressed Sensing-based Image Reconstruction Algorithm for Solar Flare X-Ray Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felix, Simon; Bolzern, Roman; Battaglia, Marina, E-mail: simon.felix@fhnw.ch, E-mail: roman.bolzern@fhnw.ch, E-mail: marina.battaglia@fhnw.ch

    One way of imaging X-ray emission from solar flares is to measure Fourier components of the spatial X-ray source distribution. We present a new compressed sensing-based algorithm named VIS-CS, which reconstructs the spatial distribution from such Fourier components. We demonstrate the application of the algorithm on synthetic and observed solar flare X-ray data from the Reuven Ramaty High Energy Solar Spectroscopic Imager satellite and compare its performance with existing algorithms. VIS-CS produces competitive results with accurate photometry and morphology, without requiring any algorithm- and X-ray-source-specific parameter tuning. Its robustness and performance make this algorithm ideally suited for the generation of quicklook images or large image cubes without user intervention, such as for imaging spectroscopy analysis.

  5. Local Competition-Based Superpixel Segmentation Algorithm in Remote Sensing

    PubMed Central

    Liu, Jiayin; Tang, Zhenmin; Cui, Ying; Wu, Guoxing

    2017-01-01

    Remote sensing technologies have been widely applied in urban environments’ monitoring, synthesis and modeling. Incorporating spatial information in perceptually coherent regions, superpixel-based approaches can effectively eliminate the “salt and pepper” phenomenon which is common in pixel-wise approaches. Compared with fixed-size windows, superpixels have adaptive sizes and shapes for different spatial structures. Moreover, superpixel-based algorithms can significantly improve computational efficiency owing to the greatly reduced number of image primitives. Hence, the superpixel algorithm, as a preprocessing technique, is increasingly used in remote sensing and many other fields. In this paper, we propose a superpixel segmentation algorithm called Superpixel Segmentation with Local Competition (SSLC), which utilizes a local competition mechanism to construct energy terms and label pixels. The local competition mechanism makes the energy terms local and relative, and thus the proposed algorithm is less sensitive to the diversity of image content and scene layout. Consequently, SSLC could achieve consistent performance in different image regions. In addition, the Probability Density Function (PDF), which is estimated by Kernel Density Estimation (KDE) with the Gaussian kernel, is introduced to describe the color distribution of superpixels as a more sophisticated and accurate measure. To reduce computational complexity, a boundary optimization framework is introduced to handle only boundary pixels instead of the whole image. We conduct experiments to benchmark the proposed algorithm against other state-of-the-art ones on the Berkeley Segmentation Dataset (BSD) and remote sensing images. Results demonstrate that the SSLC algorithm yields the best overall performance, while its computation time remains competitive. PMID:28604641
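
    The color-distribution step can be illustrated with a standard kernel density estimate: the snippet below fits a Gaussian KDE to the pixel colors of one superpixel and evaluates the likelihood of a candidate boundary pixel under it, the kind of measure a local competition between neighboring superpixels would compare. This is a generic sketch with scipy, not the SSLC implementation, and the color samples are synthetic.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(3)

      # Hypothetical RGB samples (columns = observations) from two adjacent superpixels.
      colors_a = rng.normal(loc=[[0.2], [0.5], [0.3]], scale=0.05, size=(3, 400))
      colors_b = rng.normal(loc=[[0.6], [0.4], [0.2]], scale=0.05, size=(3, 400))

      pdf_a = gaussian_kde(colors_a)            # Gaussian-kernel density of superpixel A
      pdf_b = gaussian_kde(colors_b)            # ... and of superpixel B

      # A boundary pixel is (re)labeled by whichever superpixel explains its color better.
      pixel = np.array([[0.25], [0.48], [0.31]])
      label = "A" if pdf_a(pixel) > pdf_b(pixel) else "B"
      print(label)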

  6. Local Competition-Based Superpixel Segmentation Algorithm in Remote Sensing.

    PubMed

    Liu, Jiayin; Tang, Zhenmin; Cui, Ying; Wu, Guoxing

    2017-06-12

    Remote sensing technologies have been widely applied in urban environments' monitoring, synthesis and modeling. Incorporating spatial information in perceptually coherent regions, superpixel-based approaches can effectively eliminate the "salt and pepper" phenomenon which is common in pixel-wise approaches. Compared with fixed-size windows, superpixels have adaptive sizes and shapes for different spatial structures. Moreover, superpixel-based algorithms can significantly improve computational efficiency owing to the greatly reduced number of image primitives. Hence, the superpixel algorithm, as a preprocessing technique, is increasingly used in remote sensing and many other fields. In this paper, we propose a superpixel segmentation algorithm called Superpixel Segmentation with Local Competition (SSLC), which utilizes a local competition mechanism to construct energy terms and label pixels. The local competition mechanism makes the energy terms local and relative, and thus the proposed algorithm is less sensitive to the diversity of image content and scene layout. Consequently, SSLC could achieve consistent performance in different image regions. In addition, the Probability Density Function (PDF), which is estimated by Kernel Density Estimation (KDE) with the Gaussian kernel, is introduced to describe the color distribution of superpixels as a more sophisticated and accurate measure. To reduce computational complexity, a boundary optimization framework is introduced to handle only boundary pixels instead of the whole image. We conduct experiments to benchmark the proposed algorithm against other state-of-the-art ones on the Berkeley Segmentation Dataset (BSD) and remote sensing images. Results demonstrate that the SSLC algorithm yields the best overall performance, while its computation time remains competitive.

  7. VIIRS validation and algorithm development efforts in coastal and inland Waters

    NASA Astrophysics Data System (ADS)

    Stengel, E.; Ondrusek, M.

    2016-02-01

    Accurate satellite ocean color measurements in coastal and inland waters are more challenging than open-ocean measurements. Complex water and atmospheric conditions can limit the utilization of remote sensing data in coastal waters where it is most needed. The Coastal Optical Characterization Experiment (COCE) is an ongoing project at NOAA/NESDIS/STAR Satellite Oceanography and Climatology Division. The primary goals of COCE are satellite ocean color validation and application development. Currently, this effort concentrates on the initialization and validation of the Joint Polar Satellite System (JPSS) VIIRS sensor using a Satlantic HyperPro II radiometer as a validation tool. A report on VIIRS performance in coastal waters will be given by presenting comparisons between in situ ground truth measurements and VIIRS retrievals made in the Chesapeake Bay, and inland waters of the Gulf of Mexico and Puerto Rico. The COCE application development effort focuses on developing new ocean color satellite remote sensing tools for monitoring relevant coastal ocean parameters. A new VIIRS total suspended matter algorithm will be presented for the Chesapeake Bay. These activities improve the utility of ocean color satellite data in monitoring and analyzing coastal and oceanic processes. Progress on these activities will be reported.

  8. Analysis of the moderate resolution imaging spectroradiometer contextual algorithm for small fire detection, Journal of Applied Remote Sensing Vol.3

    Treesearch

    W. Wang; J.J. Qu; X. Hao; Y. Liu

    2009-01-01

    In the southeastern United States, most wildland fires are of low intensity. A substantial number of these fires cannot be detected by the MODIS contextual algorithm. To improve the accuracy of fire detection for this region, the remote-sensed characteristics of these fires have to be...

  9. The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox

    NASA Astrophysics Data System (ADS)

    Harris, A. T., III; Goodman, J.; Justice, B.

    2014-12-01

    As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.

  10. Flight State Identification of a Self-Sensing Wing via an Improved Feature Selection Method and Machine Learning Approaches

    PubMed Central

    Chen, Xi; Wu, Qi; Ren, He; Chang, Fu-Kuo

    2018-01-01

    In this work, a data-driven approach for identifying the flight state of a self-sensing wing structure with an embedded multi-functional sensing network is proposed. The flight state is characterized by the structural vibration signals recorded from a series of wind tunnel experiments under varying angles of attack and airspeeds. A large feature pool is created by extracting potential features from the signals covering the time domain, the frequency domain as well as the information domain. Special emphasis is given to feature selection in which a novel filter method is developed based on the combination of a modified distance evaluation algorithm and a variance inflation factor. Machine learning algorithms are then employed to establish the mapping relationship from the feature space to the practical state space. Results from two case studies demonstrate the high identification accuracy and the effectiveness of the model complexity reduction via the proposed method, thus providing new perspectives of self-awareness towards the next generation of intelligent air vehicles. PMID:29710832
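
    One half of the proposed filter, the variance-inflation-factor screen for redundant features, can be sketched with standard tools: features whose VIF exceeds a cutoff are removed before a classifier is trained on the rest. The distance-evaluation half of the authors' method is not reproduced; the cutoff of 10, the random forest, and the data are arbitrary placeholders.

      import numpy as np
      from statsmodels.stats.outliers_influence import variance_inflation_factor
      from sklearn.ensemble import RandomForestClassifier

      def drop_high_vif(X, cutoff=10.0):
          """Iteratively remove the feature with the largest VIF until all are below cutoff."""
          keep = list(range(X.shape[1]))
          while True:
              vifs = [variance_inflation_factor(X[:, keep], i) for i in range(len(keep))]
              worst = int(np.argmax(vifs))
              if vifs[worst] < cutoff or len(keep) <= 2:
                  return keep
              keep.pop(worst)

      # Hypothetical vibration features (some deliberately collinear) and state labels.
      rng = np.random.default_rng(8)
      X = rng.normal(size=(300, 10))
      X[:, 9] = X[:, 0] + 0.01 * rng.normal(size=300)      # redundant copy of feature 0
      y = (X[:, 1] + X[:, 3] > 0).astype(int)

      selected = drop_high_vif(X)
      clf = RandomForestClassifier(random_state=0).fit(X[:, selected], y)
      print(selected, clf.score(X[:, selected], y))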

  11. An object-oriented simulator for 3D digital breast tomosynthesis imaging system.

    PubMed

    Seyyedi, Saeed; Cengiz, Kubra; Kamasak, Mustafa; Yildirim, Isa

    2013-01-01

    Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast to detect breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct a 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for the 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods, including the algebraic reconstruction technique (ART) and total variation regularized reconstruction techniques (ART+TV), are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating the performance of the methods using mean structural similarity (MSSIM) values.
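
    As a pointer to what the iterative option does, the sketch below implements the basic ART (Kaczmarz) update, which sweeps through the projection equations and corrects the image estimate one ray at a time; a TV-regularized variant would interleave a total-variation minimization step between sweeps. This is a textbook illustration on a dense toy system, not the simulator's C++ implementation.

      import numpy as np

      def art_reconstruct(A, b, n_sweeps=20, relax=0.5):
          """Algebraic Reconstruction Technique (Kaczmarz sweeps) for A @ x = b.

          A     -- system matrix (rays x voxels), each row one projection ray
          b     -- measured projection values
          relax -- relaxation factor in (0, 1]
          """
          x = np.zeros(A.shape[1])
          row_norms = np.einsum("ij,ij->i", A, A)              # ||a_i||^2 per ray
          for _ in range(n_sweeps):
              for i in range(A.shape[0]):
                  if row_norms[i] == 0:
                      continue
                  residual = b[i] - A[i] @ x
                  x += relax * residual / row_norms[i] * A[i]  # project onto ray hyperplane
          return x

      # Tiny consistency check on a random system.
      rng = np.random.default_rng(4)
      A = rng.random((60, 30)); x_true = rng.random(30)
      print(np.linalg.norm(art_reconstruct(A, A @ x_true) - x_true))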

  12. An Object-Oriented Simulator for 3D Digital Breast Tomosynthesis Imaging System

    PubMed Central

    Cengiz, Kubra

    2013-01-01

    Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast to detect breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct a 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for the 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods, including the algebraic reconstruction technique (ART) and total variation regularized reconstruction techniques (ART+TV), are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating the performance of the methods using mean structural similarity (MSSIM) values. PMID:24371468

  13. A DNA-based semantic fusion model for remote sensing data.

    PubMed

    Sun, Heng; Weng, Jian; Yu, Guangchuang; Massawe, Richard H

    2013-01-01

    Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, ontology can represent, process and manage widespread knowledge. Nowadays, many researchers use ontology to collect and organize data's semantic information in order to maximize research productivity. In this paper, we first describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel and bit-wise manner, and each individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved; for example, the cluster-based multi-process program can reduce the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB of source data files. Moreover, the size of the result file recording the DNA sequences is 54.51 GB for the parallel C program, compared with 57.89 GB for the sequential Perl program. This shows that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building a type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the integration of knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation. This process depends solely on ligation reactions and screening operations instead of the ontology.

  14. A DNA-Based Semantic Fusion Model for Remote Sensing Data

    PubMed Central

    Sun, Heng; Weng, Jian; Yu, Guangchuang; Massawe, Richard H.

    2013-01-01

    Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, ontology can represent, process and manage widespread knowledge. Nowadays, many researchers use ontology to collect and organize data's semantic information in order to maximize research productivity. In this paper, we first describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel and bit-wise manner, and each individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved; for example, the cluster-based multi-process program can reduce the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB of source data files. Moreover, the size of the result file recording the DNA sequences is 54.51 GB for the parallel C program, compared with 57.89 GB for the sequential Perl program. This shows that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building a type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the integration of knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation. This process depends solely on ligation reactions and screening operations instead of the ontology. PMID:24116207

  15. Using Distance Sensors to Perform Collision Avoidance Maneuvres on Uav Applications

    NASA Astrophysics Data System (ADS)

    Raimundo, A.; Peres, D.; Santos, N.; Sebastião, P.; Souto, N.

    2017-08-01

    Unmanned Aerial Vehicles (UAVs) and their applications are growing for both civilian and military purposes. The operability of a UAV has proved that some tasks and operations can be done easily and at a good cost-efficiency ratio. Nowadays, a UAV can perform autonomous missions, which is very useful for certain UAV applications, such as meteorology, surveillance systems, agriculture, environment mapping and search and rescue operations. One of the biggest problems a UAV faces is the possibility of collision with other objects in the flight area. To avoid this, an algorithm was developed and implemented to prevent UAV collisions with other objects. The "Sense and Avoid" algorithm was developed as a system for UAVs to avoid objects on a collision course. This algorithm uses a Light Detection and Ranging (LiDAR) sensor to detect objects facing the UAV in mid-flight. The light sensor is connected to on-board hardware, the Pixhawk flight controller, which communicates with a second on-board computer, a Raspberry Pi. Communications between the Ground Control Station and the UAV are made via Wi-Fi or third- or fourth-generation cellular networks (3G/4G). Tests were made to evaluate the "Sense and Avoid" algorithm's overall performance in two different environments: a simulated 3D environment and a real outdoor environment. Both modes worked successfully in the simulated 3D environment, and the "Brake" mode also worked in the real outdoor environment, validating the concept.
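
    A minimal sketch of the braking logic follows: when the forward LiDAR range drops below a safety threshold, the vehicle is commanded to hold position. It assumes a MAVLink-style interface through the hypothetical helpers read_lidar_range() and set_flight_mode(); these names, the threshold and the loop rate are placeholders, not the authors' API.

      import time

      SAFE_DISTANCE_M = 5.0      # hypothetical braking threshold
      LOOP_PERIOD_S = 0.1        # 10 Hz monitoring loop

      def read_lidar_range():
          """Placeholder: return the forward LiDAR range in metres."""
          raise NotImplementedError

      def set_flight_mode(mode):
          """Placeholder: command the flight controller (e.g. 'BRAKE', 'AUTO')."""
          raise NotImplementedError

      def sense_and_brake():
          """Simplified 'Brake' behaviour: hold position when an obstacle is too close."""
          braking = False
          while True:
              distance = read_lidar_range()
              if distance < SAFE_DISTANCE_M and not braking:
                  set_flight_mode("BRAKE")          # stop in front of the obstacle
                  braking = True
              elif distance >= SAFE_DISTANCE_M and braking:
                  set_flight_mode("AUTO")           # resume the autonomous mission
                  braking = False
              time.sleep(LOOP_PERIOD_S)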

  16. Satellite remote sensing of harmful algal blooms: A new multi-algorithm method for detecting the Florida Red Tide (Karenia brevis).

    PubMed

    Carvalho, Gustavo A; Minnett, Peter J; Fleming, Lora E; Banzon, Viva F; Baringer, Warner

    2010-06-01

    In a continuing effort to develop suitable methods for the surveillance of Harmful Algal Blooms (HABs) of Karenia brevis using satellite radiometers, a new multi-algorithm method was developed to explore whether improvements in the remote sensing detection of the Florida Red Tide were possible. A Hybrid Scheme was introduced that sequentially applies the optimized versions of two pre-existing satellite-based algorithms: an Empirical Approach (using water-leaving radiance as a function of chlorophyll concentration) and a Bio-optical Technique (using particulate backscatter along with chlorophyll concentration). The long-term evaluation of the new multi-algorithm method was performed using a multi-year MODIS dataset (2002 to 2006; during the boreal Summer-Fall periods - July to December) along the Central West Florida Shelf between 25.75°N and 28.25°N. Algorithm validation was done with in situ measurements of the abundances of K. brevis; cell counts ≥1.5×10(4) cells l(-1) defined a detectable HAB. Encouraging statistical results were derived when either or both algorithms correctly flagged known samples. The majority of the valid match-ups were correctly identified (~80% of both HABs and non-blooming conditions) and few false negatives or false positives were produced (~20% of each). Additionally, most of the HAB-positive identifications in the satellite data were indeed HAB samples (positive predictive value: ~70%) and those classified as HAB-negative were almost all non-bloom cases (negative predictive value: ~86%). These results demonstrate an excellent detection capability, on average ~10% more accurate than the individual algorithms used separately. Thus, the new Hybrid Scheme could become a powerful tool for environmental monitoring of K. brevis blooms, with valuable consequences including leading to the more rapid and efficient use of ships to make in situ measurements of HABs.
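
    For readers unfamiliar with the reported skill measures, the snippet below computes them from a match-up confusion matrix: positive and negative predictive values alongside the hit rates. The counts used here are hypothetical placeholders; the paper's actual match-up numbers are not reproduced.

      def matchup_metrics(tp, fp, tn, fn):
          """Skill measures used for HAB match-up validation."""
          sensitivity = tp / (tp + fn)     # fraction of in situ blooms correctly flagged
          specificity = tn / (tn + fp)     # fraction of non-blooms correctly passed
          ppv = tp / (tp + fp)             # positive predictive value
          npv = tn / (tn + fn)             # negative predictive value
          return sensitivity, specificity, ppv, npv

      # Hypothetical counts for illustration only.
      print(matchup_metrics(tp=40, fp=17, tn=120, fn=10))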

  17. Satellite remote sensing of harmful algal blooms: A new multi-algorithm method for detecting the Florida Red Tide (Karenia brevis)

    PubMed Central

    Carvalho, Gustavo A.; Minnett, Peter J.; Fleming, Lora E.; Banzon, Viva F.; Baringer, Warner

    2010-01-01

    In a continuing effort to develop suitable methods for the surveillance of Harmful Algal Blooms (HABs) of Karenia brevis using satellite radiometers, a new multi-algorithm method was developed to explore whether improvements in the remote sensing detection of the Florida Red Tide were possible. A Hybrid Scheme was introduced that sequentially applies the optimized versions of two pre-existing satellite-based algorithms: an Empirical Approach (using water-leaving radiance as a function of chlorophyll concentration) and a Bio-optical Technique (using particulate backscatter along with chlorophyll concentration). The long-term evaluation of the new multi-algorithm method was performed using a multi-year MODIS dataset (2002 to 2006; during the boreal Summer-Fall periods – July to December) along the Central West Florida Shelf between 25.75°N and 28.25°N. Algorithm validation was done with in situ measurements of the abundances of K. brevis; cell counts ≥1.5×10⁴ cells l⁻¹ defined a detectable HAB. Encouraging statistical results were derived when either or both algorithms correctly flagged known samples. The majority of the valid match-ups were correctly identified (~80% of both HABs and non-blooming conditions) and few false negatives or false positives were produced (~20% of each). Additionally, most of the HAB-positive identifications in the satellite data were indeed HAB samples (positive predictive value: ~70%) and those classified as HAB-negative were almost all non-bloom cases (negative predictive value: ~86%). These results demonstrate an excellent detection capability, on average ~10% more accurate than the individual algorithms used separately. Thus, the new Hybrid Scheme could become a powerful tool for environmental monitoring of K. brevis blooms, with valuable consequences including leading to the more rapid and efficient use of ships to make in situ measurements of HABs. PMID:21037979

  18. Peak-locking centroid bias in Shack-Hartmann wavefront sensing

    NASA Astrophysics Data System (ADS)

    Anugu, Narsireddy; Garcia, Paulo J. V.; Correia, Carlos M.

    2018-05-01

    Shack-Hartmann wavefront sensing relies on accurate spot centre measurement. Several algorithms were developed with this aim, mostly focused on precision, i.e. minimizing random errors. In the solar and extended scene community, the importance of the accuracy (bias error due to peak-locking, quantization, or sampling) of the centroid determination was identified and solutions proposed. But these solutions only allow partial bias corrections. To date, no systematic study of the bias error was conducted. This article bridges the gap by quantifying the bias error for different correlation peak-finding algorithms and types of sub-aperture images and by proposing a practical solution to minimize its effects. Four classes of sub-aperture images (point source, elongated laser guide star, crowded field, and solar extended scene) together with five types of peak-finding algorithms (1D parabola, the centre of gravity, Gaussian, 2D quadratic polynomial, and pyramid) are considered, in a variety of signal-to-noise conditions. The best performing peak-finding algorithm depends on the sub-aperture image type, but none is satisfactory with respect to both bias and random errors. A practical solution is proposed that relies on the antisymmetric response of the bias to the sub-pixel position of the true centre. The solution decreases the bias by a factor of ~7 to values of ≲ 0.02 pix. The computational cost is typically twice that of current cross-correlation algorithms.
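
    For reference, the simplest of the five peak-finding approaches, the centre of gravity, is sketched below for a single sub-aperture image; it is exactly this kind of estimator whose sub-pixel bias (peak-locking) the paper quantifies and corrects. The background thresholding step is a common practical choice and an assumption here, not part of the paper's specification.

      import numpy as np

      def centre_of_gravity(spot, threshold=0.0):
          """Intensity-weighted centroid (x, y) of a 2-D sub-aperture image."""
          img = np.clip(spot - threshold, 0.0, None)    # optional background threshold
          total = img.sum()
          ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
          return (xs * img).sum() / total, (ys * img).sum() / total

      # Toy Gaussian spot with a known sub-pixel centre at (7.3, 6.8).
      ys, xs = np.mgrid[0:15, 0:15]
      spot = np.exp(-((xs - 7.3) ** 2 + (ys - 6.8) ** 2) / (2 * 1.5 ** 2))
      print(centre_of_gravity(spot))    # small deviations from (7.3, 6.8) are the bias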

  19. GPU-Accelerated Hybrid Algorithm for 3D Localization of Fluorescent Emitters in Dense Clusters

    NASA Astrophysics Data System (ADS)

    Jung, Yoon; Barsic, Anthony; Piestun, Rafael; Fakhri, Nikta

    In stochastic switching-based super-resolution imaging, a random subset of fluorescent emitters is imaged and localized in each frame to construct a single high-resolution image. However, the condition of non-overlapping point spread functions (PSFs) imposes constraints on experimental parameters. Recent developments in post-processing methods, such as dictionary-based sparse support recovery using compressive sensing, have shown up to an order of magnitude higher recall rates than single-emitter fitting methods. However, the computational complexity of this approach scales poorly with the grid size and requires long runtimes. Here, we introduce a fast and accurate compressive sensing algorithm for localizing fluorescent emitters at high density in 3D, namely sparse support recovery using Orthogonal Matching Pursuit (OMP) and the L1-Homotopy algorithm for reconstructing STORM images (SOLAR STORM). SOLAR STORM combines OMP with L1-Homotopy to reduce computational complexity, which is further accelerated by parallel implementation using GPUs. This method can be used in a variety of experimental conditions for both in vitro and live-cell fluorescence imaging.
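
    The sketch below implements plain Orthogonal Matching Pursuit, the greedy sparse-recovery building block named in the abstract: at each iteration it selects the dictionary atom most correlated with the residual and re-fits the signal on the selected support by least squares. It is a generic reference implementation, not the GPU-accelerated SOLAR STORM code, which combines OMP with an L1-Homotopy stage.

      import numpy as np

      def omp(D, y, n_nonzero):
          """Orthogonal Matching Pursuit: solve y ~ D @ x with at most n_nonzero atoms."""
          residual = y.copy()
          support = []
          x = np.zeros(D.shape[1])
          for _ in range(n_nonzero):
              # Pick the atom most correlated with the current residual.
              k = int(np.argmax(np.abs(D.T @ residual)))
              if k not in support:
                  support.append(k)
              # Least-squares re-fit on the selected support.
              coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
              residual = y - D[:, support] @ coeffs
          x[support] = coeffs
          return x

      # Toy dictionary recovery: 3 active atoms out of 200.
      rng = np.random.default_rng(5)
      D = rng.normal(size=(64, 200)); D /= np.linalg.norm(D, axis=0)
      x_true = np.zeros(200); x_true[[10, 50, 125]] = [1.0, -2.0, 0.5]
      print(np.flatnonzero(omp(D, D @ x_true, 3)))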

  20. A research of road centerline extraction algorithm from high resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhang, Yushan; Xu, Tingfa

    2017-09-01

    Satellite remote sensing technology has become one of the most effective methods for land surface monitoring in recent years, due to its advantages such as short revisit period, large coverage and rich information. Road extraction is an important field in the applications of high resolution remote sensing images, and an intelligent, automatic road extraction algorithm with high precision has great significance for transportation, road network updating and urban planning. Fuzzy c-means (FCM) clustering segmentation algorithms have been used in road extraction, but the traditional algorithms do not consider spatial information. An improved fuzzy c-means clustering algorithm combined with spatial information (SFCM) is proposed in this paper, which proves effective for noisy image segmentation. Firstly, the image is segmented using the SFCM. Secondly, the segmentation result is processed by mathematical morphology to remove the adjoining regions. Thirdly, the road centerlines are extracted by morphological thinning and burr trimming. The average integrity of the centerline extraction algorithm is 97.98%, the average accuracy is 95.36% and the average quality is 93.59%. Experimental results show that the proposed method is effective for road centerline extraction.

  1. A Skylab program for the International Hydrological Decade (IHD). [Lake Ontario Basin

    NASA Technical Reports Server (NTRS)

    Polcyn, F. C. (Principal Investigator); Rebel, D. L.

    1974-01-01

    The author has identified the following significant results. The development of the algorithm (using real data) relating red and IR reflectance to surface soil moisture over regions of variable vegetation cover will enable remote sensing to make direct inputs into determination of this important hydrologic parameter.

  2. NASA Satellite Monitoring of Water Clarity in Mobile Bay for Nutrient Criteria Development

    NASA Technical Reports Server (NTRS)

    Blonski, Slawomir; Holekamp, Kara; Spiering, Bruce A.

    2009-01-01

    This project has demonstrated the feasibility of deriving, from daily MODIS measurements, time series of water clarity parameters that cover a specific location or area of interest on 30-50% of days. Time series derived for estuarine and coastal waters display much higher variability than time series of ecological parameters (such as vegetation indices) derived for land areas, and the temporal filtering often applied in terrestrial studies cannot be used effectively in ocean color processing. IOP-based algorithms for retrieval of the diffuse light attenuation coefficient and TSS concentration perform well for the Mobile Bay environment: only a minor adjustment was needed in the TSS algorithm, despite the generally recognized dependence of such algorithms on local conditions. The current IOP-based algorithm for retrieval of chlorophyll a concentration has not performed as well: a more reliable algorithm is needed, possibly based on IOPs at additional wavelengths or on remote sensing reflectance from multiple spectral bands. The CDOM algorithm also needs improvement to better separate the effects of gilvin (gelbstoff) and detritus; identifying or developing such an algorithm requires more data from in situ measurements of CDOM concentration in Gulf of Mexico coastal waters (ongoing collaboration with the EPA Gulf Ecology Division).

  3. Collaborative autonomous sensing with Bayesians in the loop

    NASA Astrophysics Data System (ADS)

    Ahmed, Nisar

    2016-10-01

    There is a strong push to develop intelligent unmanned autonomy that complements human reasoning for applications as diverse as wilderness search and rescue, military surveillance, and robotic space exploration. More than just replacing humans for 'dull, dirty and dangerous' work, autonomous agents are expected to cope with a whole host of uncertainties while working closely together with humans in new situations. The robotics revolution firmly established the primacy of Bayesian algorithms for tackling challenging perception, learning and decision-making problems. Since the next frontier of autonomy demands the ability to gather information across stretches of time and space that are beyond the reach of a single autonomous agent, the next generation of Bayesian algorithms must capitalize on opportunities to draw upon the sensing and perception abilities of humans-in/on-the-loop. This work summarizes our recent research toward harnessing 'human sensors' for information gathering tasks. The basic idea is to allow human end users (i.e. non-experts in robotics, statistics, machine learning, etc.) to directly 'talk to' the information fusion engine and perceptual processes aboard any autonomous agent. Our approach is grounded in rigorous Bayesian modeling and fusion of flexible semantic information derived from user-friendly interfaces, such as natural language chat and locative hand-drawn sketches. This naturally enables 'plug and play' human sensing with existing probabilistic algorithms for planning and perception, and has been successfully demonstrated with human-robot teams in target localization applications.
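
    The fusion idea can be made concrete with a toy grid-based Bayesian update: a human statement such as "the target is near the landmark" is modeled as a soft likelihood over the search area and multiplied into the robot's prior belief. The Gaussian likelihood form, landmark location and grid are all hypothetical; the actual work uses richer semantic models of natural language and sketches.

      import numpy as np

      # Discretize the search area into a grid and start from a uniform prior belief.
      nx, ny = 50, 50
      belief = np.full((nx, ny), 1.0 / (nx * ny))

      # Human observation: "target is near the landmark at cell (30, 12)".
      xs, ys = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
      landmark = (30, 12)
      sigma = 5.0                                  # how loosely "near" is interpreted
      likelihood = np.exp(-((xs - landmark[0]) ** 2 + (ys - landmark[1]) ** 2)
                          / (2 * sigma ** 2))

      # Bayes update: posterior proportional to prior times likelihood, then renormalize.
      belief *= likelihood
      belief /= belief.sum()
      print(np.unravel_index(np.argmax(belief), belief.shape))   # most likely target cell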

  4. VisitSense: Sensing Place Visit Patterns from Ambient Radio on Smartphones for Targeted Mobile Ads in Shopping Malls

    PubMed Central

    Kim, Byoungjip; Kang, Seungwoo; Ha, Jin-Young; Song, Junehwa

    2015-01-01

    In this paper, we introduce a novel smartphone framework called VisitSense that automatically detects and predicts a smartphone user’s place visits from ambient radio to enable behavioral targeting for mobile ads in large shopping malls. VisitSense enables mobile app developers to adopt visit-pattern-aware mobile advertising for shopping mall visitors in their apps. It also benefits mobile users by allowing them to receive highly relevant mobile ads that are aware of their place visit patterns in shopping malls. To achieve this goal, VisitSense employs accurate visit detection and prediction methods. For accurate visit detection, we develop a change-based detection method to take into consideration the stability change of ambient radio and the mobility change of users. It performs well in large shopping malls where ambient radio is quite noisy and causes existing algorithms to easily fail. In addition, we propose a causality-based visit prediction model to capture the causality in the sequential visit patterns for effective prediction. We have developed a VisitSense prototype system, and a visit-pattern-aware mobile advertising application that is based on it. Furthermore, we deploy the system in the COEX Mall, one of the largest shopping malls in Korea, and conduct diverse experiments to show the effectiveness of VisitSense. PMID:26193275

  5. Low-dose CT reconstruction via L1 dictionary learning regularization using iteratively reweighted least-squares.

    PubMed

    Zhang, Cheng; Zhang, Tao; Li, Ming; Peng, Chengtao; Liu, Zhaobang; Zheng, Jian

    2016-06-18

    In order to reduce the radiation dose of CT (computed tomography), compressed sensing theory has been a hot topic since it provides the possibility of high-quality recovery from sparsely sampled data. Recently, an algorithm based on DL (dictionary learning) was developed to deal with the sparse CT reconstruction problem. However, the existing DL algorithm focuses on the minimization problem with an L2-norm regularization term, which causes reconstruction quality to deteriorate as the sampling rate declines further. Therefore, it is essential to improve the DL method to meet the demand for further dose reduction. In this paper, we replace the L2-norm regularization term with the L1-norm. It is expected that the proposed L1-DL method could alleviate the over-smoothing effect of L2 minimization and preserve more image details. The proposed algorithm solves the L1-minimization problem by a weighting strategy, solving a new weighted L2-minimization problem based on IRLS (iteratively reweighted least squares). Through numerical simulation, the proposed algorithm is compared with the existing DL method (adaptive dictionary based statistical iterative reconstruction, ADSIR) and two other typical compressed sensing algorithms. It is revealed that the proposed algorithm is more accurate than the other algorithms, especially when further reducing the sampling rate or increasing the noise. The proposed L1-DL algorithm can utilize more prior information on image sparsity than ADSIR. By replacing the L2-norm regularization term of ADSIR with the L1-norm and solving the L1-minimization problem by the IRLS strategy, L1-DL can reconstruct the image more exactly.
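
    The core IRLS trick can be shown in a few lines: the non-smooth L1 penalty sum_i |x_i| is replaced at each outer iteration by a weighted L2 penalty sum_i w_i x_i^2 with w_i = 1/(|x_i| + eps), and the resulting smooth problem is solved before the weights are refreshed. The sketch below applies this to a simple sparse regression; it is a schematic of the reweighting idea only, not the paper's dictionary-learning CT reconstruction, and all parameters are placeholders.

      import numpy as np

      def irls_l1(A, b, lam=0.1, n_outer=30, eps=1e-6):
          """Approximate argmin_x ||A x - b||^2 + lam * ||x||_1 via iterative reweighting."""
          n = A.shape[1]
          x = np.zeros(n)
          for _ in range(n_outer):
              w = 1.0 / (np.abs(x) + eps)                   # weights refreshed from current x
              # Weighted L2 subproblem: (A^T A + lam * diag(w)) x = A^T b
              x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
          return x

      # Toy sparse recovery problem.
      rng = np.random.default_rng(6)
      A = rng.normal(size=(80, 120))
      x_true = np.zeros(120); x_true[[5, 40, 77]] = [2.0, -1.5, 1.0]
      x_hat = irls_l1(A, A @ x_true, lam=0.05)
      print(np.flatnonzero(np.abs(x_hat) > 0.1))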

  6. An improved flexible telemetry system to autonomously monitor sub-bandage pressure and wound moisture.

    PubMed

    Mehmood, Nasir; Hariz, Alex; Templeton, Sue; Voelcker, Nicolas H

    2014-11-18

    This paper presents the development of an improved mobile-based telemetric dual mode sensing system to monitor pressure and moisture levels in compression bandages and dressings used for chronic wound management. The system is fabricated on a 0.2 mm thick flexible printed circuit material, and is capable of sensing pressure and moisture at two locations simultaneously within a compression bandage and wound dressing. The sensors are calibrated to sense both parameters accurately, and the data are then transmitted wirelessly to a receiver connected to a mobile device. An error-correction algorithm is developed to compensate the degradation in measurement quality due to battery power drop over time. An Android application is also implemented to automatically receive, process, and display the sensed wound parameters. The performance of the sensing system is first validated on a mannequin limb using a compression bandage and wound dressings, and then tested on a healthy volunteer to acquire real-time performance parameters. The results obtained here suggest that this dual mode sensor can perform reliably when placed on a human limb.

  7. An Improved Flexible Telemetry System to Autonomously Monitor Sub-Bandage Pressure and Wound Moisture

    PubMed Central

    Mehmood, Nasir; Hariz, Alex; Templeton, Sue; Voelcker, Nicolas H.

    2014-01-01

    This paper presents the development of an improved mobile-based telemetric dual mode sensing system to monitor pressure and moisture levels in compression bandages and dressings used for chronic wound management. The system is fabricated on a 0.2 mm thick flexible printed circuit material, and is capable of sensing pressure and moisture at two locations simultaneously within a compression bandage and wound dressing. The sensors are calibrated to sense both parameters accurately, and the data are then transmitted wirelessly to a receiver connected to a mobile device. An error-correction algorithm is developed to compensate the degradation in measurement quality due to battery power drop over time. An Android application is also implemented to automatically receive, process, and display the sensed wound parameters. The performance of the sensing system is first validated on a mannequin limb using a compression bandage and wound dressings, and then tested on a healthy volunteer to acquire real-time performance parameters. The results obtained here suggest that this dual mode sensor can perform reliably when placed on a human limb. PMID:25412216

  8. An investigation of vegetation and other Earth resource/feature parameters using LANDSAT and other remote sensing data. 1: LANDSAT. 2: Remote sensing of volcanic emissions. [New England forest and emissions from Mt. St. Helens and Central American volcanoes

    NASA Technical Reports Server (NTRS)

    Birnie, R. W.; Stoiber, R. E. (Principal Investigator)

    1981-01-01

    A fanning technique based on a simplistic physical model provided a classification algorithm for mixture landscapes. Results of applications to a LANDSAT inventory of 1.5 million acres of forest land in Northern Maine are presented. Signatures for potential deer yard habitat in New Hampshire were developed. Volcanic activity was monitored in Nicaragua, El Salvador, and Guatemala, along with the Mt. St. Helens eruption. Emphasis in the monitoring was placed on the remote sensing of SO2 concentrations in the plumes of the volcanoes.

  9. Wavefront Control Testbed (WCT) Experiment Results

    NASA Technical Reports Server (NTRS)

    Burns, Laura A.; Basinger, Scott A.; Campion, Scott D.; Faust, Jessica A.; Feinberg, Lee D.; Hayden, William L.; Lowman, Andrew E.; Ohara, Catherine M.; Petrone, Peter P., III

    2004-01-01

    The Wavefront Control Testbed (WCT) was created to develop and test wavefront sensing and control algorithms and software for the segmented James Webb Space Telescope (JWST). Last year, we changed the system configuration from three sparse aperture segments to a filled aperture with three pie shaped segments. With this upgrade we have performed experiments on fine phasing with line-of-sight and segment-to-segment jitter; dispersed fringe visibility and grism angle; high dynamic range tilt sensing; coarse phasing with large aberrations; and sampled sub-aperture testing. This paper reviews the results of these experiments.

  10. The ORSER System for the Analysis of Remotely Sensed Digital Data

    NASA Technical Reports Server (NTRS)

    Myers, W. L.; Turner, B. J.

    1981-01-01

    The main effort of the Office for Remote Sensing of Earth Resources (ORSER) at The Pennsylvania State University is the processing, analysis, and interpretation of multispectral data, most often supplied by NASA in the form of imagery and digital data. The facilities used for data reduction and image enhancement are described, as well as the development of algorithms for producing a computer map showing various environmental and land use characteristics of data points in the analyzed scenes. The application of an ORSER capability for statewide monitoring of gypsy moth defoliation is discussed.

  11. Contribution of Phycoerythrin-Containing Phytoplankton to Remotely Sensed Signals in the Ocean

    NASA Technical Reports Server (NTRS)

    Vernet, Maria; Iturriaga, Rodolfo

    1997-01-01

    The purpose of this project was to evaluate the importance of phycoerythrin-containing phytoplankton, in particular coccoid cyanobacteria, to remote sensing. We proposed to estimate cyanobacteria abundance and pigmentation and their relationship to water-column optics. We have estimated the contribution of cyanobacteria to scattering and backscattering in both the open ocean (Sargasso Sea) and coastal waters (the western coast of the North Atlantic and the California Current). Sampling and data processing have been performed. The relationship between water-column optics and phycoerythrin concentration is being examined, and algorithm development is being carried out.

  12. Integration of Remote Sensing Data In Operational Flood Forecast In Southwest Germany

    NASA Astrophysics Data System (ADS)

    Bach, H.; Appel, F.; Schulz, W.; Merkel, U.; Ludwig, R.; Mauser, W.

    Methods to accurately assess and forecast flood discharge are mandatory to minimise the impact of hydrological hazards. However, existing rainfall-runoff models rarely accurately consider the spatial characteristics of the watershed, which is essential for a suitable and physics-based description of processes relevant for runoff formation. Spatial information with low temporal variability, like elevation, slopes and land use, can be mapped or extracted from remote sensing data. However, land surface parameters of high temporal variability, like soil moisture and snow properties, are hardly available and used in operational forecasts. Remote sensing methods can improve flood forecasting by providing information on the actual water retention capacities in the watershed and facilitate the regionalisation of hydrological models. To prove and demonstrate this, the project 'InFerno' (Integration of remote sensing data in operational water balance and flood forecast modelling) has been set up, funded by DLR (50EE0053). Within InFerno, remote sensing data (optical and microwave) are thoroughly processed to deliver spatially distributed parameters of snow properties and soil moisture. Especially during the onset of a flood, this information is essential to estimate the initial conditions of the model. At the flood forecast centres of 'Baden-Württemberg' and 'Rheinland-Pfalz' (Southwest Germany), the remote sensing based maps of soil moisture and snow properties will be integrated in the continuously operated water balance and flood forecast model LARSIM. The concept is to transfer the developed methodology from the Neckar to the Mosel basin. The major challenges lie on the one hand in the implementation of algorithms developed for a multisensoral synergy and the creation of robust, operationally applicable remote sensing products. On the other hand, the operational flood forecast must be adapted to make full use of the new data sources. In the operational phase of the project, ESA's ENVISAT satellite, which will be launched in 2002, will serve as the remote sensing data source. Until ENVISAT data are available, algorithm retrieval, software development and product generation are performed using existing sensors with ENVISAT-like specifications. Based on these data sets, test cases and demonstration runs are conducted and will be presented to prove the advantages of the approach.

  13. Model and Data Reduction for Control, Identification and Compressed Sensing

    NASA Astrophysics Data System (ADS)

    Kramer, Boris

    This dissertation focuses on problems in design, optimization and control of complex, large-scale dynamical systems from different viewpoints. The goal is to develop new algorithms and methods that solve real problems more efficiently, together with providing mathematical insight into the success of those methods. There are three main contributions in this dissertation. In Chapter 3, we provide a new method to solve large-scale algebraic Riccati equations, which arise in optimal control, filtering and model reduction. We present a projection-based algorithm utilizing proper orthogonal decomposition, which is demonstrated to produce highly accurate solutions at low rank. The method is parallelizable, easy to implement for practitioners, and is a first step towards a matrix-free approach to solve AREs. Numerical examples for n ≥ 10^6 unknowns are presented. In Chapter 4, we develop a system identification method which is motivated by tangential interpolation. This addresses the challenge of fitting linear time-invariant systems to input-output responses of complex dynamics, where the number of inputs and outputs is relatively large. The method reduces the computational burden imposed by a full singular value decomposition, by carefully choosing directions on which to project the impulse response prior to assembly of the Hankel matrix. The identification and model reduction step follows from the eigensystem realization algorithm. We present three numerical examples: a mass-spring-damper system, a heat transfer problem, and a fluid dynamics system. We obtain error bounds and stability results for this method. Chapter 5 deals with control and observation design for parameter-dependent dynamical systems. We address this by using local parametric reduced-order models, which can be used online. Data available from simulations of the system at various configurations (parameters, boundary conditions) is used to extract a sparse basis to represent the dynamics (via dynamic mode decomposition). Subsequently, a new, compressed sensing based classification algorithm is developed which incorporates the extracted dynamic information into the sensing basis. We show that this augmented classification basis makes the method more robust to noise, and results in superior identification of the correct parameter. Numerical examples consist of a Navier-Stokes, as well as a Boussinesq flow application.
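
    Proper orthogonal decomposition, which underlies both the low-rank Riccati solver and the snapshot-based reduced-order models in this work, amounts to an SVD of a snapshot matrix. A minimal sketch follows; the snapshot data and truncation rank are placeholders, and this is an illustration of the general technique rather than the dissertation's specific algorithms.

      import numpy as np

      def pod_basis(snapshots, rank):
          """POD basis of a snapshot matrix (state dimension x number of snapshots)."""
          U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
          energy = np.cumsum(s ** 2) / np.sum(s ** 2)   # captured "energy" per rank
          return U[:, :rank], energy[rank - 1]

      # Hypothetical snapshots of a simulated field at 200 time instants.
      rng = np.random.default_rng(7)
      modes = rng.normal(size=(1000, 5))                # 5 dominant spatial structures
      coeffs = rng.normal(size=(5, 200))
      X = modes @ coeffs + 0.01 * rng.normal(size=(1000, 200))

      Phi, captured = pod_basis(X, rank=5)
      print(Phi.shape, captured)                        # reduced coordinates: Phi.T @ X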

  14. Automated Means of Identifying Landslide Deposits using LiDAR Data using the Contour Connection Method Algorithm

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Leshchinsky, B. A.; Tanyu, B. F.

    2014-12-01

    Landslides are a global natural hazard, resulting in severe economic, environmental and social impacts every year. Often, landslides occur in areas of repeated slope instability, but despite these trends, significant residential developments and critical infrastructure are built in the shadow of past landslide deposits and marginally stable slopes. These hazards, despite their sometimes enormous scale and regional propensity, are difficult to detect on the ground, often due to vegetative cover. However, new developments in remote sensing technology, specifically Light Detection and Ranging (LiDAR) mapping, are providing a new means of viewing our landscape. Airborne LiDAR, combined with a level of post-processing, enables the creation of spatial data representative of the earth beneath the vegetation, highlighting the scars of unstable slopes of the past. This tool presents a revolutionary technique for mapping landslide deposits and their associated regions of risk; yet their inventorying is often done manually, an approach that can be tedious, time-consuming and subjective. However, the associated LiDAR bare-earth data present the opportunity to use this remote sensing technology and typical landslide geometry to create an automated algorithm that can detect and inventory deposits on a landscape scale. This algorithm, called the Contour Connection Method (CCM), functions by first detecting steep gradients, often associated with the headscarp of a failed hillslope, and then initiating a search that highlights deposits downslope of the failure. Based on input search gradients, CCM can assist in highlighting regions identified as landslides consistently on a landscape scale, and is capable of mapping more than 14,000 hectares rapidly (<30 minutes). CCM has shown preliminary agreement with manual landslide inventorying in Oregon's Coast Range, realizing almost 90% agreement with inventorying performed by a trained geologist. The global threat of landslides necessitates new and effective tools for inventorying regions of risk to protect people, infrastructure and the environment from landslide hazards. Use of the CCM algorithm, combined with judgment and rapidly developing remote sensing technology, may help better define these regions of risk.
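
    As a rough illustration of the first stage described above (this is not the authors' CCM code), the sketch below thresholds the slope computed from a bare-earth DEM to flag candidate headscarp cells; the threshold value, cell size and helper names are assumptions chosen for the example.

```python
# Minimal sketch: seed detection by slope thresholding on a bare-earth DEM,
# the kind of steep-gradient detection CCM starts from before connecting
# contours downslope. Threshold and grid are illustrative assumptions.
import numpy as np

def slope_degrees(dem, cell_size):
    """Approximate slope (degrees) from a DEM array using central differences."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def headscarp_seeds(dem, cell_size=1.0, slope_threshold_deg=35.0):
    """Boolean mask of cells steep enough to be treated as candidate seeds."""
    return slope_degrees(dem, cell_size) >= slope_threshold_deg

# Example with a synthetic planar DEM
dem = np.outer(np.linspace(100.0, 0.0, 200), np.ones(200))
seeds = headscarp_seeds(dem, cell_size=1.0, slope_threshold_deg=20.0)
print("candidate seed cells:", int(seeds.sum()))
```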

  15. Sparse signals recovered by non-convex penalty in quasi-linear systems.

    PubMed

    Cui, Angang; Li, Haiyang; Wen, Meng; Peng, Jigen

    2018-01-01

    The goal of compressed sensing is to reconstruct a sparse signal from a few linear measurements, far fewer than the dimension of the ambient space of the signal. However, many real-life applications in physics and the biomedical sciences carry strongly nonlinear structures, and the linear model is no longer suitable. Compared with compressed sensing under linear circumstances, this nonlinear compressed sensing problem is much more difficult; it is in fact an NP-hard combinatorial problem, because of the discrete and discontinuous nature of the [Formula: see text]-norm and the nonlinearity. To facilitate sparse signal recovery, in this paper we assume that the nonlinear models have a smooth quasi-linear nature, and we study a non-convex fraction function [Formula: see text] in this quasi-linear compressed sensing setting. We propose an iterative fraction thresholding algorithm to solve the regularization problem [Formula: see text] for all [Formula: see text]. By varying the parameter [Formula: see text], our algorithm can obtain promising results, which is one of its advantages compared with some state-of-the-art algorithms. Numerical experiments show that our method performs much better than some state-of-the-art methods.
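
    For readers unfamiliar with thresholding-based sparse recovery, the sketch below shows a plain iterative soft-thresholding (ISTA) loop for the linear case; the paper's method differs in that it uses a non-convex fraction-function threshold and a quasi-linear measurement model, neither of which is reproduced here.

```python
# Minimal ISTA sketch for linear sparse recovery (soft threshold); the paper's
# algorithm replaces this with a fraction-function threshold on a quasi-linear model.
import numpy as np

def ista(A, y, lam=0.1, step=None, n_iter=500):
    """Minimize ||Ax - y||^2/2 + lam*||x||_1 by iterative soft thresholding."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of A^T A
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))       # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

# Toy example: recover a 5-sparse signal from 60 random measurements
rng = np.random.default_rng(0)
n, m, k = 200, 60, 5
x_true = np.zeros(n); x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = ista(A, A @ x_true, lam=0.01)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```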

  16. RMP: Reduced-set matching pursuit approach for efficient compressed sensing signal reconstruction.

    PubMed

    Abdel-Sayed, Michael M; Khattab, Ahmed; Abu-Elyazeed, Mohamed F

    2016-11-01

    Compressed sensing enables the acquisition of sparse signals at a rate that is much lower than the Nyquist rate. Compressed sensing initially adopted [Formula: see text] minimization for signal reconstruction, which is computationally expensive. Several greedy recovery algorithms have recently been proposed for signal reconstruction at a lower computational complexity compared to the optimal [Formula: see text] minimization, while maintaining good reconstruction accuracy. In this paper, the Reduced-set Matching Pursuit (RMP) greedy recovery algorithm is proposed for compressed sensing. Unlike existing approaches, which either select too many or too few values per iteration, RMP aims at selecting the most sufficient number of correlation values per iteration, which improves both the reconstruction time and error. Furthermore, RMP prunes the estimated signal and hence excludes incorrectly selected values. The RMP algorithm achieves a higher reconstruction accuracy at a significantly lower computational complexity compared to existing greedy recovery algorithms. It is even superior to [Formula: see text] minimization in terms of the normalized time-error product, a new metric introduced to measure the trade-off between the reconstruction time and error. RMP's superior performance is illustrated with both noiseless and noisy samples.
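
    The greedy structure that RMP builds on can be illustrated with a standard Orthogonal Matching Pursuit loop; the sketch below is that generic baseline (not the RMP selection or pruning rules), with all names and sizes chosen for the example.

```python
# Minimal Orthogonal Matching Pursuit sketch: pick the column most correlated
# with the residual, then re-fit a least-squares estimate on the selected support.
import numpy as np

def omp(A, y, sparsity):
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(sparsity):
        idx = int(np.argmax(np.abs(A.T @ residual)))   # best-matching column
        if idx not in support:
            support.append(idx)
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coeffs
        residual = y - A @ x
    return x

rng = np.random.default_rng(1)
n, m, k = 256, 64, 6
x_true = np.zeros(n); x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = omp(A, A @ x_true, sparsity=k)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```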

  17. Retrieving the properties of ice-phase precipitation with multi-frequency radar measurements

    NASA Astrophysics Data System (ADS)

    Mace, G. G.; Gergely, M.; Mascio, J.

    2017-12-01

    The objective of most retrieval algorithms applied to remote sensing measurements is to estimate the microphysical properties that a model might predict, such as condensed water content, particle number, or effective size. However, because ice crystals grow and aggregate into complex, nonspherical shapes, the microphysical properties of interest are very much dependent on the physical characteristics of the precipitation, such as how mass and crystal area are distributed as a function of particle size. Such physical properties also have a strong influence on how microwave electromagnetic energy scatters from ice crystals, causing significant ambiguity in retrieval algorithms. In fact, passive and active microwave remote sensing measurements are typically nearly as sensitive to the ice crystal physical properties as they are to the microphysical characteristics that are usually the aim of the retrieval algorithm. There has, however, been active development of multi-frequency algorithms recently that attempt to ameliorate and even exploit this sensitivity. In this paper, we review these approaches and present practical applications of retrieving ice crystal properties such as mass- and area-dimensional relationships from single- and dual-frequency radar measurements of precipitating ice, using data collected aboard ship in the Southern Ocean and from remote sensors in the Rocky Mountains of the Western U.S.

  18. Circuit Design Optimization Using Genetic Algorithm with Parameterized Uniform Crossover

    NASA Astrophysics Data System (ADS)

    Bao, Zhiguo; Watanabe, Takahiro

    Evolvable hardware (EHW) is a new research field concerning the use of Evolutionary Algorithms (EAs) to construct electronic systems. EHW refers in a narrow sense to the use of evolutionary mechanisms as the algorithmic drivers for system design, and in a general sense to the capability of the hardware system to develop and improve itself. The Genetic Algorithm (GA) is one typical EA. We propose optimal circuit design using GA with parameterized uniform crossover (GApuc) and with a fitness function composed of circuit complexity, power, and signal delay. Parameterized uniform crossover is much more likely to distribute its disruptive trials in an unbiased manner over larger portions of the space, and thus has more exploratory power than one- and two-point crossover, so we have more chances of finding better solutions. Its effectiveness is shown by experiments. From the results, we can see that the best elite fitness, the average fitness of the correct circuits and the number of correct circuits of GApuc are better than those of GA with one-point or two-point crossover. The best case of optimal circuits generated by GApuc is 10.18% and 6.08% better in evaluation value than that by GA with one-point crossover and two-point crossover, respectively.

  19. Information mining in remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Li, Jiang

    The volume of remotely sensed imagery continues to grow at an enormous rate due to advances in sensor technology, and our capability for collecting and storing images has greatly outpaced our ability to analyze and retrieve information from them. This motivates the development of image information mining techniques, which is very much an interdisciplinary endeavor drawing upon expertise in image processing, databases, information retrieval, machine learning, and software design. This dissertation proposes and implements an extensive remote sensing image information mining (ReSIM) system prototype for mining useful information implicitly stored in remote sensing imagery. The system consists of three modules: an image processing subsystem, a database subsystem, and a visualization and graphical user interface (GUI) subsystem. Land cover and land use (LCLU) information corresponding to spectral characteristics is identified by supervised classification based on support vector machines (SVM) with automatic model selection, while textural features that characterize spatial information are extracted using Gabor wavelet coefficients. Within LCLU categories, textural features are clustered using an optimized k-means clustering approach to obtain a search-efficient space. The clusters are stored in an object-oriented database (OODB) with associated images indexed in an image database (IDB). A k-nearest neighbor search is performed using a query-by-example (QBE) approach. Furthermore, an automatic parametric contour tracing algorithm and an O(n) time piecewise linear polygonal approximation (PLPA) algorithm are developed for shape information mining of interesting objects within the image. A fuzzy object-oriented database based on the fuzzy object-oriented data (FOOD) model is developed to handle fuzziness and uncertainty. Three specific applications are presented: integrated land cover and texture pattern mining, shape information mining for change detection of lakes, and fuzzy normalized difference vegetation index (NDVI) pattern mining. The study results show the effectiveness of the proposed system prototype and its potential for other applications in remote sensing.

  20. Estimating Coastal Turbidity using MODIS 250 m Band Observations

    NASA Technical Reports Server (NTRS)

    Davies, James E.; Moeller, Christopher C.; Gunshor, Mathew M.; Menzel, W. Paul; Walker, Nan D.

    2004-01-01

    Terra MODIS 250 m observations are being applied to a Suspended Sediment Concentration (SSC) algorithm that is under development for coastal case 2 waters where reflectance is dominated by sediment entrained in major fluvial outflows. An atmospheric correction based on MODIS observations in the 500 m resolution 1.6 and 2.1 micron bands is used to isolate the remote sensing reflectance in the MODIS 250 m resolution 650 and 865 nanometer bands. SSC estimates from remote sensing reflectance are based on accepted inherent optical properties of sediment types known to be prevalent in the U.S. Gulf of Mexico coastal zone. We present our findings for the Atchafalaya Bay region of the Louisiana Coast, in the form of processed imagery over the annual cycle. We also apply our algorithm to selected sites worldwide, with a goal of extending the utility of our approach to the global direct broadcast community.

  1. Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction

    PubMed Central

    Cruz Zurian, Heber; Atefi, Seyed Reza; Seoane Martinez, Fernando; Lukowicz, Paul

    2017-01-01

    In this paper, we developed a fully textile sensing fabric for tactile touch sensing as the robot skin to detect human-robot interactions. The sensor covers a 20-by-20 cm² area with 400 sensitive points and samples at 50 Hz per point. We defined seven gestures which are inspired by the social and emotional interactions of typical person-to-person or person-to-pet scenarios. We conducted two groups of mutually blinded experiments, involving 29 participants in total. The data processing algorithm first reduces the spatial complexity to frame descriptors, and temporal features are calculated through basic statistical representations and wavelet analysis. Various classifiers are evaluated and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best performing feature-classifier combination can recognize the gestures with 93.3% accuracy from a known group of participants, and 89.1% from strangers. PMID:29120389

  2. Textile Pressure Mapping Sensor for Emotional Touch Detection in Human-Robot Interaction.

    PubMed

    Zhou, Bo; Altamirano, Carlos Andres Velez; Zurian, Heber Cruz; Atefi, Seyed Reza; Billing, Erik; Martinez, Fernando Seoane; Lukowicz, Paul

    2017-11-09

    In this paper, we developed a fully textile sensing fabric for tactile touch sensing as the robot skin to detect human-robot interactions. The sensor covers a 20-by-20 cm² area with 400 sensitive points and samples at 50 Hz per point. We defined seven gestures which are inspired by the social and emotional interactions of typical person-to-person or person-to-pet scenarios. We conducted two groups of mutually blinded experiments, involving 29 participants in total. The data processing algorithm first reduces the spatial complexity to frame descriptors, and temporal features are calculated through basic statistical representations and wavelet analysis. Various classifiers are evaluated and the feature calculation algorithms are analyzed in detail to determine the contribution of each stage and segment. The best performing feature-classifier combination can recognize the gestures with 93.3% accuracy from a known group of participants, and 89.1% from strangers.
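
    A minimal sketch of the kind of two-stage feature pipeline described above (spatial frame descriptors followed by temporal statistics) is given below; the descriptor choices are hypothetical and do not reproduce the authors' feature set or wavelet analysis.

```python
# Hypothetical feature pipeline sketch: collapse each 20x20 pressure frame to a
# small descriptor, then summarize the descriptor time series before classification.
import numpy as np

def frame_descriptor(frame):
    """Collapse one pressure frame (2D array) into a few scalar descriptors."""
    active = (frame > 0.1 * frame.max()).sum() if frame.max() > 0 else 0
    return np.array([frame.sum(), frame.max(), frame.mean(), active])

def temporal_features(frames):
    """Stack per-frame descriptors and summarize each over time."""
    desc = np.array([frame_descriptor(f) for f in frames])          # (T, 4)
    return np.concatenate([desc.mean(axis=0), desc.std(axis=0),
                           desc.max(axis=0) - desc.min(axis=0)])    # (12,)

# Example: a 2-second gesture sampled at 50 Hz on a 20x20 grid
rng = np.random.default_rng(0)
gesture = rng.random((100, 20, 20))
print(temporal_features(gesture).shape)
```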

  3. ICON: 3D reconstruction with 'missing-information' restoration in biological electron tomography.

    PubMed

    Deng, Yuchen; Chen, Yu; Zhang, Yan; Wang, Shengliu; Zhang, Fa; Sun, Fei

    2016-07-01

    Electron tomography (ET) plays an important role in revealing biological structures, ranging from the macromolecular to the subcellular scale. Due to limited tilt angles, ET reconstruction always suffers from 'missing wedge' artifacts, which severely weaken further biological interpretation. In this work, we developed an algorithm called Iterative Compressed-sensing Optimized Non-uniform fast Fourier transform reconstruction (ICON), based on the theory of compressed sensing and the assumption of sparsity of biological specimens. ICON can significantly restore the missing information in comparison with other reconstruction algorithms. More importantly, we used the leave-one-out method to verify the validity of the restored information for both simulated and experimental data. The significant improvement in sub-tomogram averaging by ICON indicates its great potential in the future application of high-resolution structural determination of macromolecules in situ. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Comparison and analysis of nonlinear algorithms for compressed sensing in MRI.

    PubMed

    Yu, Yeyang; Hong, Mingjian; Liu, Feng; Wang, Hua; Crozier, Stuart

    2010-01-01

    Compressed sensing (CS) theory has recently been applied in Magnetic Resonance Imaging (MRI) to accelerate the overall imaging process. In the CS implementation, various algorithms have been used to solve the nonlinear equation system for better image quality and reconstruction speed. However, there are no explicit criteria for optimal CS algorithm selection in practical MRI applications. A systematic and comparative study of the commonly used algorithms is therefore essential for the implementation of CS in MRI. In this work, three typical algorithms, namely the Gradient Projection For Sparse Reconstruction (GPSR) algorithm, the Interior-point algorithm (l(1)_ls), and the Stagewise Orthogonal Matching Pursuit (StOMP) algorithm, are compared and investigated in three different imaging scenarios: brain, angiogram and phantom imaging. The algorithms' performances are characterized in terms of image quality and reconstruction speed. The theoretical results show that the performance of the CS algorithms depends on the imaging case; overall, the StOMP algorithm offers the best solution in imaging quality, while the GPSR algorithm is the most efficient of the three methods. In the next step, the algorithm performances and characteristics will be experimentally explored. It is hoped that this research will further support the application of CS in MRI.

  5. Development of ocean color algorithms for estimating chlorophyll-a concentrations and inherent optical properties using gene expression programming (GEP).

    PubMed

    Chang, Chih-Hua

    2015-03-09

    This paper proposes new inversion algorithms for the estimation of Chlorophyll-a concentration (Chla) and the ocean's inherent optical properties (IOPs) from measurements of remote sensing reflectance (Rrs). With in situ data from the NASA bio-optical marine algorithm data set (NOMAD), inversion algorithms were developed by the novel gene expression programming (GEP) approach, which creates, manipulates and selects the most appropriate tree-structured functions based on evolutionary computing. The limitations and validity of the proposed algorithms are evaluated with simulated Rrs spectra with respect to NOMAD, and with a closure test for IOPs obtained at a single reference wavelength. The application of the GEP-derived algorithms is validated against in situ, synthetic and satellite match-up data sets compiled by NASA and the International Ocean Colour Coordinating Group (IOCCG). The new algorithms are able to provide Chla and IOPs retrievals comparable to those derived by other state-of-the-art regression approaches and obtained with the semi- and quasi-analytical algorithms, respectively. In practice, there are no significant differences between the GEP, support vector regression, and multilayer perceptron models in terms of overall performance. The GEP-derived algorithms are successfully applied in processing images taken by the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), generating Chla and IOPs maps which show better details of developing algal blooms and give more information on the distribution of water constituents between different water bodies.

  6. Compressive Sensing of Foot Gait Signals and Its Application for the Estimation of Clinically Relevant Time Series.

    PubMed

    Pant, Jeevan K; Krishnan, Sridhar

    2016-07-01

    A new signal reconstruction algorithm for compressive sensing based on the minimization of a pseudonorm which promotes block-sparse structure on the first-order difference of the signal is proposed. Involved optimization is carried out by using a sequential version of Fletcher-Reeves' conjugate-gradient algorithm, and the line search is based on Banach's fixed-point theorem. The algorithm is suitable for the reconstruction of foot gait signals which admit block-sparse structure on the first-order difference. An additional algorithm for the estimation of stride-interval, swing-interval, and stance-interval time series from the reconstructed foot gait signals is also proposed. This algorithm is based on finding zero crossing indices of the foot gait signal and using the resulting indices for the computation of time series. Extensive simulation results demonstrate that the proposed signal reconstruction algorithm yields improved signal-to-noise ratio and requires significantly reduced computational effort relative to several competing algorithms over a wide range of compression ratio. For a compression ratio in the range from 88% to 94%, the proposed algorithm is found to offer improved accuracy for the estimation of clinically relevant time-series parameters, namely, the mean value, variance, and spectral index of stride-interval, stance-interval, and swing-interval time series, relative to its nearest competitor algorithm. The improvement in performance for compression ratio as high as 94% indicates that the proposed algorithms would be useful for designing compressive sensing-based systems for long-term telemonitoring of human gait signals.
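
    The time-series estimation step described above can be illustrated with a simple zero-crossing routine; the sketch below is a generic version under the assumption of a roughly periodic reconstructed signal, not the authors' implementation.

```python
# Illustrative sketch: locate zero crossings of a reconstructed gait-like signal
# and convert the crossing indices into an interval time series (seconds).
import numpy as np

def zero_crossing_indices(signal):
    """Indices where the signal changes sign (rising or falling)."""
    return np.flatnonzero(np.diff(np.sign(signal)) != 0)

def intervals_from_crossings(signal, fs):
    """Time differences between consecutive zero crossings."""
    return np.diff(zero_crossing_indices(signal)) / fs

# Example: a lightly noisy 1 Hz oscillation sampled at 100 Hz
fs = 100.0
t = np.arange(0, 10, 1.0 / fs)
gait = np.sin(2 * np.pi * t) + 0.01 * np.random.default_rng(0).standard_normal(t.size)
print("mean half-period (s):", intervals_from_crossings(gait, fs).mean())  # ~0.5
```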

  7. Application of Novel Lateral Tire Force Sensors to Vehicle Parameter Estimation of Electric Vehicles.

    PubMed

    Nam, Kanghyun

    2015-11-11

    This article presents methods for estimating lateral vehicle velocity and tire cornering stiffness, which are key parameters in vehicle dynamics control, using lateral tire force measurements. Lateral tire forces acting on each tire are directly measured by load-sensing hub bearings that were invented and further developed by NSK Ltd. For estimating the lateral vehicle velocity, tire force models considering lateral load transfer effects are used, and a recursive least-squares algorithm is adapted to identify the lateral vehicle velocity as an unknown parameter. Using the estimated lateral vehicle velocity, tire cornering stiffness, which is an important tire parameter dominating the vehicle's cornering responses, is estimated. For practical implementation, a cornering stiffness estimation algorithm based on a simple bicycle model is developed and discussed. Finally, the proposed estimation algorithms were evaluated using experimental test data.
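
    A minimal sketch of a recursive least-squares update of the kind referred to above is given below; the regressor and the scalar example are illustrative stand-ins, not the paper's tire force model.

```python
# Generic recursive least-squares (RLS) estimator sketch with a forgetting factor;
# the vehicle-specific regressor built from tire force models is not reproduced.
import numpy as np

class RecursiveLeastSquares:
    def __init__(self, n_params, forgetting=0.99):
        self.theta = np.zeros(n_params)      # parameter estimate
        self.P = np.eye(n_params) * 1e3      # estimate covariance
        self.lam = forgetting

    def update(self, phi, y):
        """One RLS step: phi is the regressor vector, y the measured output."""
        phi = np.asarray(phi, dtype=float)
        K = self.P @ phi / (self.lam + phi @ self.P @ phi)        # gain
        self.theta = self.theta + K * (y - phi @ self.theta)
        self.P = (self.P - np.outer(K, phi @ self.P)) / self.lam
        return self.theta

# Example: identify y = 2.5 * u from noisy streaming samples
rng = np.random.default_rng(0)
rls = RecursiveLeastSquares(n_params=1)
for _ in range(500):
    u = rng.standard_normal()
    rls.update([u], 2.5 * u + 0.05 * rng.standard_normal())
print("estimated gain:", rls.theta[0])   # close to 2.5
```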

  8. NASA sea ice and snow validation plan for the Defense Meteorological Satellite Program special sensor microwave/imager

    NASA Technical Reports Server (NTRS)

    Cavalieri, Donald J. (Editor); Swift, Calvin T. (Editor)

    1987-01-01

    This document addresses the task of developing and executing a plan for validating the algorithm used for initial processing of sea ice data from the Special Sensor Microwave/Imager (SSMI). The document outlines a plan for monitoring the performance of the SSMI, for validating the derived sea ice parameters, and for providing quality data products before distribution to the research community. Because of recent advances in the application of passive microwave remote sensing to snow cover on land, the validation of snow algorithms is also addressed.

  9. Analysis of soil moisture extraction algorithm using data from aircraft experiments

    NASA Technical Reports Server (NTRS)

    Burke, H. H. K.; Ho, J. H.

    1981-01-01

    A soil moisture extraction algorithm is developed using a statistical parameter inversion method. Data sets from two aircraft experiments are utilized for the test. Multifrequency microwave radiometric data, surface temperature, and soil moisture information are contained in the data sets. The surface and near-surface (≤5 cm) soil moisture content can be extracted with an accuracy of approximately 5% to 6% for bare fields and fields with grass cover by using L, C, and X band radiometer data. This technique is used for handling large amounts of remote sensing data from space.

  10. A New Graduation Algorithm for Color Balance of Remote Sensing Image

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Liu, X.; Yue, T.; Wang, Q.; Sha, H.; Huang, S.; Pan, Q.

    2018-05-01

    In order to expand the field of view and obtain more data and information in remote sensing image research, images often need to be mosaicked together. However, the mosaicked image often exhibits large color differences and a visible gap line. Based on a graduation algorithm using trigonometric functions, this paper proposes a new algorithm of Two Quarter-rounds Curves (TQC). The paper uses a Gaussian filter to address image color noise and the gap line. The experiments used Greenland data acquired in 1963 by the ARGON KH-5 satellite from the Declassified Intelligence Photography Project (DISP), as well as Landsat imagery of the North Gulf, China. The experimental results show that the proposed method improves the results in two respects: on the one hand, remote sensing images with large color differences become more balanced; on the other hand, the mosaicked image achieves a smoother transition.
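
    The general idea of smoothing a mosaic seam with a gradual weighting curve followed by a Gaussian filter can be sketched as follows; the sine-squared weight ramp, window sizes and sigma are illustrative assumptions, and this is not the TQC implementation.

```python
# Illustrative seam blending sketch: feather the overlap of two image strips with
# a smooth 0->1 weight ramp, then lightly smooth the join with a Gaussian filter.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def blend_overlap(left, right, overlap):
    """left/right: 2D grayscale strips sharing `overlap` columns."""
    w = np.sin(np.linspace(0.0, np.pi / 2.0, overlap)) ** 2      # smooth ramp
    blended = left[:, -overlap:] * (1.0 - w) + right[:, :overlap] * w
    mosaic = np.hstack([left[:, :-overlap], blended, right[:, overlap:]])
    seam = left.shape[1] - overlap
    mosaic[:, seam - 2: seam + overlap + 2] = gaussian_filter1d(
        mosaic[:, seam - 2: seam + overlap + 2], sigma=1.0, axis=1)
    return mosaic

# Example: two strips with different mean brightness
left = np.full((100, 120), 90.0)
right = np.full((100, 120), 130.0)
print(blend_overlap(left, right, overlap=40).shape)   # (100, 200)
```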

  11. Design and Implementation of a Smart Home System Using Multisensor Data Fusion Technology.

    PubMed

    Hsu, Yu-Liang; Chou, Po-Huan; Chang, Hsing-Cheng; Lin, Shyan-Lung; Yang, Shih-Chin; Su, Heng-Yi; Chang, Chih-Chien; Cheng, Yuan-Sheng; Kuo, Yu-Chen

    2017-07-15

    This paper aims to develop a multisensor data fusion technology-based smart home system by integrating wearable intelligent technology, artificial intelligence, and sensor fusion technology. We have developed the following three systems to create an intelligent smart home environment: (1) a wearable motion sensing device to be placed on residents' wrists and its corresponding 3D gesture recognition algorithm to implement a convenient automated household appliance control system; (2) a wearable motion sensing device mounted on a resident's feet and its indoor positioning algorithm to realize an effective indoor pedestrian navigation system for smart energy management; (3) a multisensor circuit module and an intelligent fire detection and alarm algorithm to realize a home safety and fire detection system. In addition, an intelligent monitoring interface is developed to provide real-time information about the smart home system, such as environmental temperatures, CO concentrations, communicative environmental alarms, household appliance status, human motion signals, and the results of gesture recognition and indoor positioning. Furthermore, an experimental testbed for validating the effectiveness and feasibility of the smart home system was built and verified experimentally. The results showed that the 3D gesture recognition algorithm could achieve recognition rates for automated household appliance control of 92.0%, 94.8%, 95.3%, and 87.7% by the 2-fold cross-validation, 5-fold cross-validation, 10-fold cross-validation, and leave-one-subject-out cross-validation strategies. For indoor positioning and smart energy management, the distance accuracy and positioning accuracy were around 0.22% and 3.36% of the total traveled distance in the indoor environment. For home safety and fire detection, the classification rate achieved 98.81% accuracy for determining the conditions of the indoor living environment.

  12. Design and Implementation of a Smart Home System Using Multisensor Data Fusion Technology

    PubMed Central

    Chou, Po-Huan; Chang, Hsing-Cheng; Lin, Shyan-Lung; Yang, Shih-Chin; Su, Heng-Yi; Chang, Chih-Chien; Cheng, Yuan-Sheng; Kuo, Yu-Chen

    2017-01-01

    This paper aims to develop a multisensor data fusion technology-based smart home system by integrating wearable intelligent technology, artificial intelligence, and sensor fusion technology. We have developed the following three systems to create an intelligent smart home environment: (1) a wearable motion sensing device to be placed on residents’ wrists and its corresponding 3D gesture recognition algorithm to implement a convenient automated household appliance control system; (2) a wearable motion sensing device mounted on a resident’s feet and its indoor positioning algorithm to realize an effective indoor pedestrian navigation system for smart energy management; (3) a multisensor circuit module and an intelligent fire detection and alarm algorithm to realize a home safety and fire detection system. In addition, an intelligent monitoring interface is developed to provide real-time information about the smart home system, such as environmental temperatures, CO concentrations, communicative environmental alarms, household appliance status, human motion signals, and the results of gesture recognition and indoor positioning. Furthermore, an experimental testbed for validating the effectiveness and feasibility of the smart home system was built and verified experimentally. The results showed that the 3D gesture recognition algorithm could achieve recognition rates for automated household appliance control of 92.0%, 94.8%, 95.3%, and 87.7% by the 2-fold cross-validation, 5-fold cross-validation, 10-fold cross-validation, and leave-one-subject-out cross-validation strategies. For indoor positioning and smart energy management, the distance accuracy and positioning accuracy were around 0.22% and 3.36% of the total traveled distance in the indoor environment. For home safety and fire detection, the classification rate achieved 98.81% accuracy for determining the conditions of the indoor living environment. PMID:28714884
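
    The cross-validation protocols mentioned in the two records above (k-fold and leave-one-subject-out) can be sketched with scikit-learn as follows; the feature matrix, labels, subject IDs and classifier are placeholders rather than the authors' gesture data or model.

```python
# Sketch of k-fold and leave-one-subject-out evaluation on placeholder data.
import numpy as np
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 12))                  # placeholder gesture feature vectors
y = rng.integers(0, 4, size=200)           # placeholder gesture labels
subjects = rng.integers(0, 10, size=200)   # placeholder subject IDs

clf = RandomForestClassifier(n_estimators=50, random_state=0)
print("10-fold accuracy:", cross_val_score(clf, X, y, cv=10).mean())
print("leave-one-subject-out accuracy:",
      cross_val_score(clf, X, y, groups=subjects, cv=LeaveOneGroupOut()).mean())
```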

  13. Application of Near-Surface Remote Sensing and computer algorithms in evaluating impacts of agroecosystem management on Zea mays (corn) phenological development in the Platte River - High Plains Aquifer Long Term Agroecosystem Research Network field sites.

    NASA Astrophysics Data System (ADS)

    Okalebo, J. A.; Das Choudhury, S.; Awada, T.; Suyker, A.; LeBauer, D.; Newcomb, M.; Ward, R.

    2017-12-01

    The Long-term Agroecosystem Research (LTAR) network is a USDA-ARS effort that focuses on conducting research addressing current and emerging issues in agriculture related to the sustainability and profitability of agroecosystems in the face of climate change and population growth. There are 18 sites across the USA covering key agricultural production regions. In Nebraska, a partnership between the University of Nebraska - Lincoln and ARD/USDA resulted in the establishment of the Platte River - High Plains Aquifer LTAR site in 2014. The site conducts research to sustain multiple ecosystem services, focusing specifically on Nebraska's main agronomic production agroecosystems, which comprise abundant corn, soybeans, managed grasslands and beef production. As part of the national LTAR network, PR-HPA participates and contributes near-surface remotely sensed imagery of corn, soybean and grassland canopy phenology to the PhenoCam Network through high-resolution digital cameras. This poster highlights the application, advantages and usefulness of near-surface remotely sensed imagery in agroecosystem studies and management. It demonstrates how both Infrared and Red-Green-Blue imagery may be applied to monitor phenological events as well as crop abiotic stresses. Computer-based algorithms and analytic techniques proved very instrumental in revealing crop phenological changes such as green-up and tasseling in corn. This poster also reports the suitability and applicability of corn-derived computer-based algorithms for evaluating the phenological development of sorghum, since both crops have similarities in their phenology, with sorghum panicles being similar to corn tassels. This latter assessment was carried out using a sorghum dataset obtained from the Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform project, Maricopa Agricultural Center, Arizona.

  14. Retrieval of chlorophyll from remote-sensing reflectance in the China Seas.

    PubMed

    He, M X; Liu, Z S; Du, K P; Li, L P; Chen, R; Carder, K L; Lee, Z P

    2000-05-20

    The East China Sea is a typical case 2 water environment, where concentrations of phytoplankton pigments, suspended matter, and chromophoric dissolved organic matter (CDOM) are all higher than those in the open oceans because of the discharge from the Yangtze River and the Yellow River. Using a hyperspectral semianalytical model, we simulated a set of remote-sensing reflectances for a variety of chlorophyll, suspended matter, and CDOM concentrations. From this simulated data set, a new algorithm for the retrieval of chlorophyll concentration from remote-sensing reflectance is proposed. For this method, we took into account the 682-nm spectral channel in addition to the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) channels. When this algorithm was applied to a field data set, the chlorophyll concentrations retrieved through the new algorithm were consistent with field measurements to within a small error of 18%, in contrast with an error of 147% between the SeaWiFS ocean chlorophyll 2 algorithm and the in situ observations.
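
    For orientation, the sketch below shows the generic polynomial band-ratio form on which SeaWiFS-style chlorophyll algorithms are built; the coefficients are placeholders, and the paper's algorithm additionally uses the 682-nm channel, which is not modeled here.

```python
# Generic band-ratio chlorophyll retrieval sketch (OC2-style form); coefficients
# below are illustrative placeholders, not the published algorithm constants.
import numpy as np

def chl_band_ratio(rrs_490, rrs_555, coeffs=(0.3, -2.5, 1.5, -1.0)):
    """log10(Chla) = sum_k c_k * R^k with R = log10(Rrs(490)/Rrs(555))."""
    R = np.log10(np.asarray(rrs_490) / np.asarray(rrs_555))
    log_chl = sum(c * R ** k for k, c in enumerate(coeffs))
    return 10.0 ** log_chl

print(chl_band_ratio(0.004, 0.006))   # mg m^-3, illustrative only
```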

  15. Genetic algorithm for investigating flight MH370 in Indian Ocean using remotely sensed data

    NASA Astrophysics Data System (ADS)

    Marghany, Maged; Mansor, Shattri; Shariff, Abdul Rashid Bin Mohamed

    2016-06-01

    This study utilized a genetic algorithm (GA) for automatic detection and trajectory simulation of flight MH370 debris movements. In doing so, one and a half years of data from the Ocean Surface Topography Mission (OSTM) on the Jason-2 satellite were used to simulate the pattern of flight MH370 debris movements across the southern Indian Ocean. Further, a multi-objective evolutionary algorithm was also used to discriminate the uncertainty in imaging and detection of flight MH370. The study shows that the ocean surface current speed is 0.5 m/s. These current patterns have developed a large anticlockwise gyre over a water depth of 8,000 m. The multi-objective evolutionary algorithm suggests that the objects present in the satellite data are not flight MH370 debris. In addition, the multi-objective evolutionary algorithm indicates the difficulty of acquiring the exact location of flight MH370 due to complicated hydrodynamic movements across the southern Indian Ocean.

  16. Image-algebraic design of multispectral target recognition algorithms

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.; Ritter, Gerhard X.

    1994-06-01

    In this paper, we discuss methods for multispectral ATR (Automated Target Recognition) of small targets that are sensed under suboptimal conditions, such as haze, smoke, and low light levels. In particular, we discuss our ongoing development of algorithms and software that effect intelligent object recognition by selecting ATR filter parameters according to ambient conditions. Our algorithms are expressed in terms of IA (image algebra), a concise, rigorous notation that unifies linear and nonlinear mathematics in the image processing domain. IA has been implemented on a variety of parallel computers, with preprocessors available for the Ada and FORTRAN languages. An image algebra C++ class library has recently been made available. Thus, our algorithms are both feasible implementationally and portable to numerous machines. Analyses emphasize the aspects of image algebra that aid the design of multispectral vision algorithms, such as parameterized templates that facilitate the flexible specification of ATR filters.

  17. Accurate reconstruction of hyperspectral images from compressive sensing measurements

    NASA Astrophysics Data System (ADS)

    Greer, John B.; Flake, J. C.

    2013-05-01

    The emerging field of Compressive Sensing (CS) provides a new way to capture data by shifting the heaviest burden of data collection from the sensor to the computer on the user-end. This new means of sensing requires fewer measurements for a given amount of information than traditional sensors. We investigate the efficacy of CS for capturing HyperSpectral Imagery (HSI) remotely. We also introduce a new family of algorithms for constructing HSI from CS measurements with Split Bregman Iteration [Goldstein and Osher,2009]. These algorithms combine spatial Total Variation (TV) with smoothing in the spectral dimension. We examine models for three different CS sensors: the Coded Aperture Snapshot Spectral Imager-Single Disperser (CASSI-SD) [Wagadarikar et al.,2008] and Dual Disperser (CASSI-DD) [Gehm et al.,2007] cameras, and a hypothetical random sensing model closer to CS theory, but not necessarily implementable with existing technology. We simulate the capture of remotely sensed images by applying the sensor forward models to well-known HSI scenes - an AVIRIS image of Cuprite, Nevada and the HYMAP Urban image. To measure accuracy of the CS models, we compare the scenes constructed with our new algorithm to the original AVIRIS and HYMAP cubes. The results demonstrate the possibility of accurately sensing HSI remotely with significantly fewer measurements than standard hyperspectral cameras.

  18. QR-decomposition based SENSE reconstruction using parallel architecture.

    PubMed

    Ullah, Irfan; Nisar, Habab; Raza, Haseeb; Qasim, Malik; Inam, Omair; Omer, Hammad

    2018-04-01

    Magnetic Resonance Imaging (MRI) is a powerful medical imaging technique that provides essential clinical information about the human body. One major limitation of MRI is its long scan time. Implementation of advanced MRI algorithms on a parallel architecture (to exploit inherent parallelism) has great potential to reduce the scan time. Sensitivity Encoding (SENSE) is a Parallel Magnetic Resonance Imaging (pMRI) algorithm that utilizes receiver coil sensitivities to reconstruct MR images from the acquired under-sampled k-space data. At the heart of SENSE lies the inversion of a rectangular encoding matrix. This work presents a novel implementation of a GPU-based SENSE algorithm, which employs QR decomposition for the inversion of the rectangular encoding matrix. For a fair comparison, the performance of the proposed GPU-based SENSE reconstruction is evaluated against single- and multi-core CPU implementations using OpenMP. Several experiments against various acceleration factors (AFs) are performed using multichannel (8, 12 and 30) phantom and in-vivo human head and cardiac datasets. Experimental results show that the GPU significantly reduces the computation time of SENSE reconstruction as compared to multi-core CPU (approximately 12x speedup) and single-core CPU (approximately 53x speedup) without any degradation in the quality of the reconstructed images. Copyright © 2018 Elsevier Ltd. All rights reserved.
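
    The linear-algebra core of the approach, inverting a rectangular encoding matrix via QR decomposition in the least-squares sense, can be sketched as follows; the random complex matrix stands in for the actual SENSE encoding matrix built from coil sensitivities and the undersampling pattern.

```python
# Sketch: solve the overdetermined system E x = y with the thin QR factorization,
# the same algebraic step the GPU implementation accelerates.
import numpy as np

def qr_least_squares(E, y):
    """Solve min ||E x - y||_2 via E = Q R, then R x = Q^H y."""
    Q, R = np.linalg.qr(E)                  # E: (m, n) with m >= n
    return np.linalg.solve(R, Q.conj().T @ y)

rng = np.random.default_rng(0)
m, n = 64, 16                               # stand-in for samples vs. unknown pixels
E = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x_hat = qr_least_squares(E, E @ x_true)
print("max error:", np.abs(x_hat - x_true).max())
```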

  19. Optimal Periodic Cooperative Spectrum Sensing Based on Weight Fusion in Cognitive Radio Networks

    PubMed Central

    Liu, Xin; Jia, Min; Gu, Xuemai; Tan, Xuezhi

    2013-01-01

    The performance of cooperative spectrum sensing in cognitive radio (CR) networks depends on the sensing mode, the sensing time and the number of cooperative users. In order to improve the sensing performance and reduce the interference to the primary user (PU), a periodic cooperative spectrum sensing model based on weight fusion is proposed in this paper. Moreover, the sensing period, the sensing time and the searching time are optimized, respectively. Firstly, the sensing period is optimized to improve the spectrum utilization and reduce the interference; then a joint optimization algorithm of the local sensing time and the number of cooperative users is proposed to obtain the optimal sensing time for improving the throughput of the cognitive radio user (CRU) during each period; and finally the water-filling principle is applied to optimize the searching time in order to make the CRU find an idle channel within the shortest time. The simulation results show that, compared with previous algorithms, the optimal sensing period can improve the spectrum utilization of the CRU and decrease the interference to the PU significantly, the optimal sensing time can make the CRU achieve the largest throughput, and the optimal searching time can make the CRU find an idle channel in the least time. PMID:23604027

  20. THREAD: A programming environment for interactive planning-level robotics applications

    NASA Technical Reports Server (NTRS)

    Beahan, John J., Jr.

    1989-01-01

    The THREAD programming language, which was developed to meet the needs of researchers in developing robotics applications that perform such tasks as grasp and trajectory design, sensor data analysis, and interfacing with external subsystems in order to perform servo-level control of manipulators and real-time sensing, is discussed. The philosophy behind THREAD, the issues which entered into its design, and the features of the language are discussed from the viewpoint of researchers who want to develop algorithms in a simulation environment, and from that of those who want to implement physical robotics systems. The detailed functions of the many complex robotics algorithms and tools which are part of the language are not explained, but an overall impression of their capability is given.

  1. Description and performance analysis of a generalized optimal algorithm for aerobraking guidance

    NASA Technical Reports Server (NTRS)

    Evans, Steven W.; Dukeman, Greg A.

    1993-01-01

    A practical real-time guidance algorithm has been developed for aerobraking vehicles which nearly minimizes the maximum heating rate, the maximum structural loads, and the post-aeropass delta V requirement for orbit insertion. The algorithm is general and reusable in the sense that a minimum of assumptions are made, thus greatly reducing the number of parameters that must be determined prior to a given mission. A particularly interesting feature is that in-plane guidance performance is tuned by adjusting one mission-dependent parameter, the bank margin; similarly, the out-of-plane guidance performance is tuned by adjusting a plane controller time constant. Other features of the algorithm are simplicity, efficiency and ease of use. The algorithm assumes a trimmed vehicle with bank angle modulation as the method of trajectory control. Performance of this guidance algorithm is examined by its use in an aerobraking testbed program. The performance inquiry extends to a wide range of entry speeds covering a number of potential mission applications. Favorable results have been obtained with a minimum of development effort, and directions for improvement of performance are indicated.

  2. Development of GK-2A cloud optical and microphysical properties retrieval algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Yum, S. S.; Um, J.

    2017-12-01

    Cloud and aerosol radiative forcing is known to be one of the largest uncertainties in climate change prediction. To reduce this uncertainty, remote sensing observations of cloud radiative and microphysical properties have been used since the 1970s, and the corresponding remote sensing techniques and instruments have been developed. As part of this effort, Geo-KOMPSAT-2A (Geostationary Korea Multi-Purpose Satellite-2A, GK-2A) will be launched in 2018. On GK-2A, the Advanced Meteorological Imager (AMI) is the primary instrument, which has 3 visible, 3 near-infrared, and 10 infrared channels. To retrieve optical and microphysical properties of clouds using AMI measurements, a preliminary version of the new cloud retrieval algorithm for GK-2A was developed and several validation tests were conducted. This algorithm retrieves cloud optical thickness (COT), cloud effective radius (CER), liquid water path (LWP), and ice water path (IWP), so we named it the Daytime Cloud Optical thickness, Effective radius and liquid and ice Water path (DCOEW) algorithm. The DCOEW uses cloud reflectance at visible and near-infrared channels as input data. An optimal estimation (OE) approach, which requires appropriate a-priori values and measurement error information, is used to retrieve COT and CER. LWP and IWP are calculated using empirical relationships between COT/CER and cloud water path that were determined previously. To validate the retrieved cloud properties, we compared DCOEW output data with other operational satellite data. For COT and CER validation, we used two different data sets. To compare algorithms that use cloud reflectance at visible and near-IR channels as input data, the MODIS MYD06 cloud product was selected. For validation against cloud products based on microwave measurements, COT (2B-TAU) and CER (2C-ICE) data retrieved from the CloudSat cloud profiling radar (W-band, 94 GHz) were used. For cloud water path validation, AMSR-2 Level-3 cloud liquid water data were used. Detailed results will be shown at the conference.
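
    A minimal sketch of a Gauss-Newton iteration for an optimal-estimation retrieval of a two-element state (here labelled COT and CER) is given below; the forward model, covariances and numbers are toy placeholders, not the DCOEW lookup tables.

```python
# Toy optimal-estimation (Gauss-Newton) retrieval sketch: fit a 2-element state
# to 2 "reflectances" given an a-priori state and error covariances.
import numpy as np

def forward(x):
    """Toy forward model mapping state x = [COT, CER] to two reflectances."""
    cot, cer = x
    return np.array([1.0 - np.exp(-0.1 * cot),
                     0.5 * np.exp(-0.05 * cer) + 0.01 * cot])

def jacobian(x, eps=1e-4):
    """Finite-difference Jacobian of the toy forward model."""
    J = np.zeros((2, 2))
    for j in range(2):
        dx = np.zeros(2); dx[j] = eps
        J[:, j] = (forward(x + dx) - forward(x - dx)) / (2 * eps)
    return J

def oe_retrieval(y, x_a, S_a, S_e, n_iter=10):
    """Gauss-Newton iteration of the optimal-estimation cost function."""
    x = x_a.copy()
    S_a_inv, S_e_inv = np.linalg.inv(S_a), np.linalg.inv(S_e)
    for _ in range(n_iter):
        K = jacobian(x)
        lhs = S_a_inv + K.T @ S_e_inv @ K
        rhs = K.T @ S_e_inv @ (y - forward(x)) - S_a_inv @ (x - x_a)
        x = x + np.linalg.solve(lhs, rhs)
    return x

y = forward(np.array([12.0, 20.0]))              # synthetic observation
x_a = np.array([10.0, 15.0])                     # a-priori state
print(oe_retrieval(y, x_a, S_a=np.diag([25.0, 25.0]), S_e=np.diag([1e-4, 1e-4])))
```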

  3. Ground-Based Correction of Remote-Sensing Spectral Imagery

    NASA Technical Reports Server (NTRS)

    Alder-Golden, Steven M.; Rochford, Peter; Matthew, Michael; Berk, Alexander

    2007-01-01

    Software has been developed for an improved method of correcting for atmospheric optical effects (primarily, effects of aerosols and water vapor) in spectral images of the surface of the Earth acquired by airborne and spaceborne remote-sensing instruments. In this method, the variables needed for the corrections are extracted from the readings of a radiometer located on the ground in the vicinity of the scene of interest. The software includes algorithms that analyze measurement data acquired from a shadow-band radiometer. These algorithms are based on a prior radiation transport software model, called MODTRAN, that has been developed through several versions up to what are now known as MODTRAN4 and MODTRAN5. These components have been integrated with a user-friendly Interactive Data Language (IDL) front end and an advanced version of MODTRAN4. Software tools for handling general data formats, performing a Langley-type calibration, and generating an output file of retrieved atmospheric parameters for use in another atmospheric-correction computer program known as FLAASH have also been incorporated into the present software. Concomitantly with the software described thus far, a version of FLAASH has been developed that utilizes the retrieved atmospheric parameters to process spectral image data.

  4. Research on enhancing the utilization of digital multispectral data and geographic information systems in global habitability studies

    NASA Technical Reports Server (NTRS)

    Martinko, Edward A.; Merchant, James W.

    1988-01-01

    During 1986 to 1987, the Kansas Applied Remote Sensing (KARS) Program continued to build upon long-term research efforts oriented towards the enhancement and development of technologies for using remote sensing in the inventory and evaluation of land use and renewable resources (both natural and agricultural). These research efforts directly addressed the needs and objectives of NASA's Land-Related Global Habitability Program as well as the needs and interests of public agencies and private firms. The KARS Program placed particular emphasis on two major areas: the development of intelligent algorithms to improve automated classification of digital multispectral data, and the integration and merging of digital multispectral data with ancillary data in spatial modes.

  5. Using remote sensing in support of environmental management: A framework for selecting products, algorithms and methods.

    PubMed

    de Klerk, Helen M; Gilbertson, Jason; Lück-Vogel, Melanie; Kemp, Jaco; Munch, Zahn

    2016-11-01

    Traditionally, to map environmental features using remote sensing, practitioners will use training data to develop models on various satellite data sets using a number of classification approaches and use test data to select a single 'best performer' from which the final map is made. We use a combination of an omission/commission plot to evaluate various results and compile a probability map based on consistently strong performing models across a range of standard accuracy measures. We suggest that this easy-to-use approach can be applied in any study using remote sensing to map natural features for management action. We demonstrate this approach using optical remote sensing products of different spatial and spectral resolution to map the endemic and threatened flora of quartz patches in the Knersvlakte, South Africa. Quartz patches can be mapped using either SPOT 5 (used due to its relatively fine spatial resolution) or Landsat8 imagery (used because it is freely accessible and has higher spectral resolution). Of the variety of classification algorithms available, we tested maximum likelihood and support vector machine, and applied these to raw spectral data, the first three PCA summaries of the data, and the standard normalised difference vegetation index. We found that there is no 'one size fits all' solution to the choice of a 'best fit' model (i.e. combination of classification algorithm or data sets), which is in agreement with the literature that classifier performance will vary with data properties. We feel this lends support to our suggestion that rather than the identification of a 'single best' model and a map based on this result alone, a probability map based on the range of consistently top performing models provides a rigorous solution to environmental mapping. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Spectral Detection of Human Skin in VIS-SWIR Hyperspectral Imagery without Radiometric Calibration

    DTIC Science & Technology

    2012-03-01

    range than the high-altitude scenarios for which the remote sensing algorithms were developed. At this close range, there is relatively little...sequence contain a dismount with arms extended out to the side. The dismount, a Caucasian male with dark brown hair, is wearing a black, cotton, short

  7. An integrated toolbox for processing and analysis of remote sensing data of inland and coastal waters - atmospheric correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.

  8. Geospatial approach towards enumerative analysis of suspended sediment concentration for Ganges-Brahmaputra Bay

    NASA Astrophysics Data System (ADS)

    Pandey, Palak; Kunte, Pravin D.

    2016-10-01

    This study presents an easy, modular, user-friendly, and flexible software package for processing Landsat 7 ETM+ and Landsat 8 OLI-TIRS data to estimate suspended particulate matter concentrations in coastal waters. This package includes 1) an algorithm developed using the freely downloadable SCILAB package, 2) ERDAS models for iterative processing of Landsat images, and 3) an ArcMAP tool for plotting and map making. Utilizing the SCILAB package, a module is written for geometric corrections, radiometric corrections and obtaining normalized water-leaving reflectance from Landsat 8 OLI-TIRS and Landsat 7 ETM+ data. Using ERDAS models, a sequence of modules is developed for iterative processing of Landsat images and estimating suspended particulate matter concentrations. The processed images are used for preparing suspended sediment concentration maps. The applicability of this software package is demonstrated by estimating and plotting seasonal suspended sediment concentration maps off the Bengal delta. The software is flexible enough to accommodate other remotely sensed data, such as Ocean Colour Monitor (OCM) data, Indian Remote Sensing (IRS) data, MODIS data, etc., by replacing a few parameters in the algorithm, for estimating suspended sediment concentration in coastal waters.

  9. SAFER vehicle inspection: a multimodal robotic sensing platform

    NASA Astrophysics Data System (ADS)

    Page, David L.; Fougerolle, Yohan; Koschan, Andreas F.; Gribok, Andrei; Abidi, Mongi A.; Gorsich, David J.; Gerhart, Grant R.

    2004-09-01

    The current threats to U.S. security, both military and civilian, have led to an increased interest in the development of technologies to safeguard national facilities such as military bases, federal buildings, nuclear power plants, and national laboratories. As a result, the Imaging, Robotics, and Intelligent Systems (IRIS) Laboratory at The University of Tennessee (UT) has established a research consortium, known as SAFER (Security Automation and Future Electromotive Robotics), to develop, test, and deploy sensing and imaging systems for unmanned ground vehicles (UGV). The targeted missions for these UGV systems include -- but are not limited to -- under-vehicle threat assessment, stand-off check-point inspections, scout surveillance, intruder detection, obstacle-breach situations, and render-safe scenarios. This paper presents a general overview of the SAFER project. Beyond this general overview, we further focus on a specific problem where we collect 3D range scans of under-vehicle carriages. These scans require appropriate segmentation and representation algorithms to facilitate the vehicle inspection process. We discuss the theory for these algorithms and present results from applying them to actual vehicle scans.

  10. SMAP Soil Moisture Disaggregation using Land Surface Temperature and Vegetation Data

    NASA Astrophysics Data System (ADS)

    Fang, B.; Lakshmi, V.

    2016-12-01

    Soil moisture (SM) is a key parameter in agriculture, hydrology and ecology studies. Global SM retrievals have been provided by microwave remote sensing technology since the late 1970s, and many SM retrieval algorithms have been developed, calibrated and applied to satellite sensors such as AMSR-E (Advanced Microwave Scanning Radiometer for the Earth Observing System), AMSR-2 (Advanced Microwave Scanning Radiometer 2) and SMOS (Soil Moisture and Ocean Salinity). In particular, the SMAP (Soil Moisture Active/Passive) satellite, developed by NASA, was launched in January 2015. SMAP provides soil moisture products at 9 km and 36 km spatial resolutions, which are not adequate for research and applications at finer scales. To address this issue, this study applied an SM disaggregation algorithm to disaggregate the SMAP passive microwave soil moisture 36 km product. This algorithm was developed based on the thermal inertia relationship between daily surface temperature variation and daily average soil moisture, which is modulated by vegetation condition, using remote sensing retrievals from AVHRR (Advanced Very High Resolution Radiometer), MODIS (Moderate Resolution Imaging Spectroradiometer) and SPOT (Satellite Pour l'Observation de la Terre), as well as Land Surface Model (LSM) output from NLDAS (North American Land Data Assimilation System). The disaggregation model was built at 1/8° spatial resolution on a monthly basis and was implemented to disaggregate SMAP 36 km SM retrievals to 1 km resolution in Oklahoma. The SM disaggregation results were validated using MESONET (Mesoscale Network) and MICRONET (Microscale Network) ground SM measurements.
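
    The regression-then-apply structure of such a disaggregation can be sketched as follows; the linear model in daily temperature amplitude and vegetation index is an illustrative simplification of the thermal inertia relationship, and all data below are synthetic.

```python
# Illustrative disaggregation sketch: fit a coarse-scale relationship between soil
# moisture, daily temperature amplitude (dT) and NDVI, then apply it at fine scale.
import numpy as np

def fit_thermal_inertia_model(sm_coarse, dT_coarse, ndvi_coarse):
    """Least-squares fit SM ~ a + b*dT + c*NDVI at the coarse grid."""
    A = np.column_stack([np.ones_like(dT_coarse), dT_coarse, ndvi_coarse])
    coeffs, *_ = np.linalg.lstsq(A, sm_coarse, rcond=None)
    return coeffs

def disaggregate(coeffs, dT_fine, ndvi_fine):
    """Apply the coarse-scale model to fine-scale predictors."""
    a, b, c = coeffs
    return a + b * dT_fine + c * ndvi_fine

# Synthetic example: 50 coarse samples, then a 10x10 block of 1 km predictors
rng = np.random.default_rng(0)
dT_c, ndvi_c = rng.uniform(5, 20, 50), rng.uniform(0.1, 0.8, 50)
sm_c = 0.35 - 0.01 * dT_c + 0.1 * ndvi_c + 0.005 * rng.standard_normal(50)
coeffs = fit_thermal_inertia_model(sm_c, dT_c, ndvi_c)
sm_fine = disaggregate(coeffs, rng.uniform(5, 20, (10, 10)), rng.uniform(0.1, 0.8, (10, 10)))
print(sm_fine.shape, float(sm_fine.mean()))
```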

  11. Zombie algorithms: a timesaving remote sensing systems engineering tool

    NASA Astrophysics Data System (ADS)

    Ardanuy, Philip E.; Powell, Dylan C.; Marley, Stephen

    2008-08-01

    In modern horror fiction, zombies are generally undead corpses brought back from the dead by supernatural or scientific means, and are rarely under anyone's direct control. They typically have very limited intelligence, and hunger for the flesh of the living [1]. A typical spectroradiometric or hyperspectral instrument provides calibrated radiances for a number of remote sensing algorithms. The algorithms typically must meet specified latency and availability requirements while yielding products at the required quality. These systems, whether research, operational, or a hybrid, are typically cost constrained. Complexity of the algorithms can be high, and may evolve and mature over time as sensor characterization changes, product validation occurs, and areas of scientific basis improvement are identified and completed. This suggests the need for a systems engineering process for algorithm maintenance that is agile, cost efficient, repeatable, and predictable. Experience with remote sensing science data systems suggests the benefits of "plug-n-play" concepts of operation. The concept, while intuitively simple, can be challenging to implement in practice. The use of zombie algorithms-empty shells that outwardly resemble the form, fit, and function of a "complete" algorithm without the implemented theoretical basis-provides ground systems with advantages equivalent to those obtained by integrating sensor engineering models onto the spacecraft bus. Combining this with a mature, repeatable process for incorporating the theoretical basis, or scientific core, into the "head" of the zombie algorithm, along with associated scripting and registration, provides an easy "on ramp" for the rapid and low-risk integration of scientific applications into operational systems.

  12. The Scientific and Societal Need for Accurate Global Remote Sensing of Marine Suspended Sediments

    NASA Technical Reports Server (NTRS)

    Acker, James G.

    2006-01-01

    Population pressure, commercial development, and climate change are expected to cause continuing alteration of the vital oceanic coastal zone environment. These pressures will influence both the geology and biology of the littoral, nearshore, and continental shelf regions. A pressing need for global observation of coastal change processes is an accurate remotely sensed data product for marine suspended sediments. The concentration, delivery, transport, and deposition of sediments are strongly relevant to coastal primary production, inland and coastal hydrology, coastal erosion, and loss of fragile wetland and island habitats. Sediment transport and deposition are also related to anthropogenic activities including agriculture, fisheries, aquaculture, harbor and port commerce, and military operations. Because accurate estimation of marine suspended sediment concentrations requires advanced ocean optical analysis, a focused collaborative program of algorithm development and assessment is recommended, following the successful experience of data refinement for remotely sensed global ocean chlorophyll concentrations.

  13. 3-Dimensional stereo implementation of photoacoustic imaging based on a new image reconstruction algorithm without using discrete Fourier transform

    NASA Astrophysics Data System (ADS)

    Ham, Woonchul; Song, Chulgyu

    2017-05-01

    In this paper, we propose a new three-dimensional stereo image reconstruction algorithm for a photoacoustic medical imaging system. We also introduce and discuss a new theoretical algorithm based on the physical concept of the Radon transform. The key concept of the proposed algorithm is to evaluate the possibility that an acoustic source exists within a search region by using the geometric distance between each sensor element of the acoustic detector and the corresponding search region, represented as a grid. We derive the mathematical expression for the magnitude of this existence possibility, which can be used to implement the proposed algorithm. We derive the equations of the proposed algorithm for both one-dimensional and two-dimensional sensing arrays. k-Wave simulation data are used to compare the image quality of the proposed algorithm with that of a conventional algorithm that requires the FFT. The k-Wave MATLAB simulation results demonstrate the effectiveness of the proposed reconstruction algorithm.
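
    A simplified sketch of the distance-based idea for a one-dimensional sensing array, assuming a delay-and-sum style accumulation as the "existence possibility" score; the scoring rule, sound speed and array geometry are illustrative assumptions rather than the authors' exact formulation:

        import numpy as np

        def possibility_map(signals, sensor_xy, grid_x, grid_y, fs, c=1500.0):
            """Accumulate, for each grid point, the detector samples whose arrival
            time matches the sensor-to-grid geometric distance."""
            n_sensors, n_samples = signals.shape
            gx, gy = np.meshgrid(grid_x, grid_y, indexing="ij")
            score = np.zeros(gx.shape)
            for s in range(n_sensors):
                d = np.hypot(gx - sensor_xy[s, 0], gy - sensor_xy[s, 1])
                idx = np.clip(np.round(d / c * fs).astype(int), 0, n_samples - 1)
                score += signals[s, idx]
            return score / n_sensors

        # Tiny example: 8-element linear array, 100 kHz sampling, 20 mm x 20 mm grid.
        fs = 100e3
        sensors = np.column_stack([np.linspace(0.0, 0.02, 8), np.zeros(8)])
        sigs = np.random.default_rng(1).standard_normal((8, 512))
        pmap = possibility_map(sigs, sensors, np.linspace(0.0, 0.02, 64),
                               np.linspace(0.005, 0.025, 64), fs)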

  14. Efficient two-dimensional compressive sensing in MIMO radar

    NASA Astrophysics Data System (ADS)

    Shahbazi, Nafiseh; Abbasfar, Aliazam; Jabbarian-Jahromi, Mohammad

    2017-12-01

    Compressive sensing (CS) is a way to lower the sampling rate, and hence reduce the data to be processed, in multiple-input multiple-output (MIMO) radar systems. In this paper, we further reduce the computational complexity of a pulse-Doppler collocated MIMO radar by introducing two-dimensional (2D) compressive sensing. To do so, we first introduce a new 2D formulation for the compressed received signals and then propose a new measurement matrix design for our 2D compressive sensing model, based on minimizing the coherence of the sensing matrix using a gradient descent algorithm. The simulation results show that our proposed 2D measurement matrix design using gradient descent (2D-MMDGD) has much lower computational complexity than one-dimensional (1D) methods while performing better than conventional approaches such as a Gaussian random measurement matrix.
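
    The measurement-matrix design step can be pictured with the following one-dimensional analogue, which lowers the coherence of the sensing matrix A = Phi @ Psi by pushing its Gram matrix towards the identity with plain gradient descent; the objective, step size and dimensions are illustrative assumptions, not the 2D-MMDGD formulation itself:

        import numpy as np

        def design_measurement_matrix(psi, m, iters=500, lr=1e-3, seed=0):
            """Gradient descent on ||A^T A - I||_F^2 with A = Phi @ Psi (1D analogue)."""
            rng = np.random.default_rng(seed)
            n = psi.shape[0]
            phi = rng.standard_normal((m, n)) / np.sqrt(m)
            for _ in range(iters):
                a = phi @ psi
                gram_err = a.T @ a - np.eye(psi.shape[1])
                grad = 4.0 * a @ gram_err @ psi.T     # d/dPhi of the objective
                phi -= lr * grad
            return phi

        def mutual_coherence(a):
            a = a / np.linalg.norm(a, axis=0, keepdims=True)
            g = np.abs(a.T @ a)
            np.fill_diagonal(g, 0.0)
            return g.max()

        psi = np.linalg.qr(np.random.default_rng(1).standard_normal((64, 64)))[0]
        phi = design_measurement_matrix(psi, m=20)
        print(mutual_coherence(phi @ psi))    # coherence of the optimized sensing matrix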

  15. A feed-forward Hopfield neural network algorithm (FHNNA) with a colour satellite image for water quality mapping

    NASA Astrophysics Data System (ADS)

    Asal Kzar, Ahmed; Mat Jafri, M. Z.; Hwee San, Lim; Al-Zuky, Ali A.; Mutter, Kussay N.; Hassan Al-Saleh, Anwar

    2016-06-01

    Many techniques have been applied to water quality problems, but remote sensing techniques have proven their success, especially when artificial neural networks are used as mathematical models with these techniques. The Hopfield neural network (HNN) is a common, fast, simple and efficient type of artificial neural network, but it struggles when it deals with images that have more than two colours, such as remote sensing images. This work attempts to solve this problem by modifying the network so that it can deal with colour remote sensing images for water quality mapping. A Feed-forward Hopfield Neural Network Algorithm (FHNNA) was developed and used with a colour satellite image from the Thailand Earth Observation System (THEOS) for TSS mapping in the Penang Strait, Malaysia, through the classification of TSS concentrations. The new algorithm is based on three modifications: using the HNN as a feed-forward network, considering the weights of bitplanes, and a non-self architecture (zero diagonal of the weight matrix); in addition, it depends on validation data. The resulting map was colour-coded for visual interpretation. The efficiency of the new algorithm is shown by the high correlation coefficient (R=0.979) and the low root mean square error (RMSE=4.301) obtained with the validation data, which were divided into two groups: one used in the algorithm and the other used for validating the results. The comparison was with the minimum distance classifier. Therefore, TSS mapping of polluted water in the Penang Strait, Malaysia, can be performed using FHNNA with remote sensing (THEOS). This is a new and useful application of the HNN, providing a new model with remote sensing techniques for water quality mapping, an important environmental problem.

  16. Fuzzy Classification of High Resolution Remote Sensing Scenes Using Visual Attention Features.

    PubMed

    Li, Linyi; Xu, Tingbao; Chen, Yun

    2017-01-01

    In recent years the spatial resolutions of remote sensing images have been improved greatly. However, a higher spatial resolution image does not always lead to a better result of automatic scene classification. Visual attention is an important characteristic of the human visual system, which can effectively help to classify remote sensing scenes. In this study, a novel visual attention feature extraction algorithm was proposed, which extracted visual attention features through a multiscale process. A fuzzy classification method using visual attention features (FC-VAF) was then developed to perform high resolution remote sensing scene classification. FC-VAF was evaluated by using remote sensing scenes from widely used high resolution remote sensing images, including IKONOS, QuickBird, and ZY-3 images. FC-VAF achieved more accurate classification results than the reference methods according to the quantitative accuracy evaluation indices. We also discussed the role and impacts of different decomposition levels and different wavelets on the classification accuracy. FC-VAF improves the accuracy of high resolution scene classification and therefore advances the research of digital image analysis and the applications of high resolution remote sensing images.

  17. Fuzzy Classification of High Resolution Remote Sensing Scenes Using Visual Attention Features

    PubMed Central

    Xu, Tingbao; Chen, Yun

    2017-01-01

    In recent years the spatial resolutions of remote sensing images have been improved greatly. However, a higher spatial resolution image does not always lead to a better result of automatic scene classification. Visual attention is an important characteristic of the human visual system, which can effectively help to classify remote sensing scenes. In this study, a novel visual attention feature extraction algorithm was proposed, which extracted visual attention features through a multiscale process. A fuzzy classification method using visual attention features (FC-VAF) was then developed to perform high resolution remote sensing scene classification. FC-VAF was evaluated by using remote sensing scenes from widely used high resolution remote sensing images, including IKONOS, QuickBird, and ZY-3 images. FC-VAF achieved more accurate classification results than the reference methods according to the quantitative accuracy evaluation indices. We also discussed the role and impacts of different decomposition levels and different wavelets on the classification accuracy. FC-VAF improves the accuracy of high resolution scene classification and therefore advances the research of digital image analysis and the applications of high resolution remote sensing images. PMID:28761440

  18. Comparison of multihardware parallel implementations for a phase unwrapping algorithm

    NASA Astrophysics Data System (ADS)

    Hernandez-Lopez, Francisco Javier; Rivera, Mariano; Salazar-Garibay, Adan; Legarda-Sáenz, Ricardo

    2018-04-01

    Phase unwrapping is an important problem in the areas of optical metrology, synthetic aperture radar (SAR) image analysis, and magnetic resonance imaging (MRI) analysis. These images are becoming larger in size and, particularly, the availability and need for processing of SAR and MRI data have increased significantly with the acquisition of remote sensing data and the popularization of magnetic resonators in clinical diagnosis. Therefore, it is important to develop faster and more accurate phase unwrapping algorithms. We propose a parallel multigrid algorithm for a phase unwrapping method named accumulation of residual maps, which builds on a serial algorithm that minimizes a cost function by means of a Gauss-Seidel-type scheme. Our algorithm also optimizes the original cost function, but unlike the original work it is a parallel Jacobi-class method with alternated minimizations. This strategy is known as the chessboard (red-black) type: red pixels are mutually independent and can be updated in parallel within one iteration, and black pixels can likewise be updated in parallel in the alternating iteration. We present parallel implementations of our algorithm for different parallel architectures such as multicore CPU, Xeon Phi coprocessor, and Nvidia graphics processing unit. In all cases, our parallel algorithm shows superior performance compared with the original serial version. In addition, we present a detailed performance comparison of the developed parallel versions.
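
    The chessboard update ordering can be illustrated with the following sketch, applied here to a generic Poisson-type relaxation rather than to the accumulation-of-residual-maps cost function itself; the red/black masking scheme is the point, not the particular problem being smoothed:

        import numpy as np

        def red_black_relaxation(u, f, iters=100, omega=1.0):
            """Half-sweeps over the 'red' and 'black' pixels; pixels of one colour
            are mutually independent, so each half-sweep can run fully in parallel."""
            red = np.zeros(u.shape, dtype=bool)
            red[::2, ::2] = True
            red[1::2, 1::2] = True
            for _ in range(iters):
                for mask in (red, ~red):
                    nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                          np.roll(u, 1, 1) + np.roll(u, -1, 1))
                    update = 0.25 * (nb - f)          # Jacobi-style local update
                    u[mask] = (1 - omega) * u[mask] + omega * update[mask]
            return u

        u = red_black_relaxation(np.zeros((64, 64)),
                                 np.random.default_rng(2).standard_normal((64, 64)),
                                 iters=50)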

  19. SWIM: A Semi-Analytical Ocean Color Inversion Algorithm for Optically Shallow Waters

    NASA Technical Reports Server (NTRS)

    McKinna, Lachlan I. W.; Werdell, P. Jeremy; Fearns, Peter R. C. S.; Weeks, Scarla J.; Reichstetter, Martina; Franz, Bryan A.; Shea, Donald M.; Feldman, Gene C.

    2014-01-01

    Ocean color remote sensing provides synoptic-scale, near-daily observations of marine inherent optical properties (IOPs). Whilst contemporary ocean color algorithms are known to perform well in deep oceanic waters, they have difficulty operating in optically clear, shallow marine environments where light reflected from the seafloor contributes to the water-leaving radiance. The effect of benthic reflectance in optically shallow waters is known to adversely affect algorithms developed for optically deep waters [1, 2]. Whilst adapted versions of optically deep ocean color algorithms have been applied to optically shallow regions with reasonable success [3], there is presently no approach that directly corrects for bottom reflectance using existing knowledge of bathymetry and benthic albedo. To address the issue of optically shallow waters, we have developed a semi-analytical ocean color inversion algorithm: the Shallow Water Inversion Model (SWIM). SWIM uses existing bathymetry and a derived benthic albedo map to correct for bottom reflectance using the semi-analytical model of Lee et al. [4]. The algorithm was incorporated into the NASA Ocean Biology Processing Group's L2GEN program and tested in optically shallow waters of the Great Barrier Reef, Australia. In lieu of readily available in situ matchup data, we present a comparison between SWIM and two contemporary ocean color algorithms, the Generalized Inherent Optical Property Algorithm (GIOP) and the Quasi-Analytical Algorithm (QAA).

  20. Unified commutation-pruning technique for efficient computation of composite DFTs

    NASA Astrophysics Data System (ADS)

    Castro-Palazuelos, David E.; Medina-Melendrez, Modesto Gpe.; Torres-Roman, Deni L.; Shkvarko, Yuriy V.

    2015-12-01

    An efficient computation of a composite length discrete Fourier transform (DFT), as well as a fast Fourier transform (FFT) of both time and space data sequences in uncertain (non-sparse or sparse) computational scenarios, requires specific processing algorithms. Traditional algorithms typically employ some pruning methods without any commutations, which prevents them from attaining the potential computational efficiency. In this paper, we propose an alternative unified approach with automatic commutations between three computational modalities aimed at efficient computations of the pruned DFTs adapted for variable composite lengths of the non-sparse input-output data. The first modality is an implementation of the direct computation of a composite length DFT, the second employs the second-order recursive filtering method, and the third performs the new pruned decomposed transform. The pruned decomposed transform algorithm performs decimation in time or space (DIT) in the data acquisition domain and then decimation in frequency (DIF). The unified combination of these three algorithms is referred to as the DFTCOMM technique. Based on the treatment of the combinational-type hypothesis-testing optimization problem of preferable allocations between all feasible commuting-pruning modalities, we have found the globally optimal solution to the pruning problem, one that always requires fewer or, at most, the same number of arithmetic operations as the other feasible modalities. The DFTCOMM method therefore outperforms the existing competing pruning techniques reported in the literature in terms of attainable savings in the number of required arithmetic operations. Finally, we provide a comparison of DFTCOMM with the recently developed sparse fast Fourier transform (SFFT) algorithmic family. We show that, in sensing scenarios with sparse or non-sparse data Fourier spectra, the DFTCOMM technique is robust against such model uncertainties, in the sense of being insensitive to sparsity/non-sparsity restrictions and to the variability of the operating parameters.
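
    The second modality (second-order recursive filtering) can be illustrated with a generic Goertzel-type evaluation of individual DFT bins, which is useful when only a pruned subset of bins of a composite-length transform is needed; this is a textbook sketch, not the DFTCOMM implementation:

        import numpy as np

        def goertzel_bin(x, k):
            """Evaluate DFT bin k of x with a second-order recursive filter."""
            n = len(x)
            w = 2.0 * np.pi * k / n
            coeff = 2.0 * np.cos(w)
            s_prev, s_prev2 = 0.0, 0.0
            for sample in x:
                s = sample + coeff * s_prev - s_prev2
                s_prev2, s_prev = s_prev, s
            return np.exp(1j * w) * s_prev - s_prev2

        x = np.random.default_rng(3).standard_normal(360)    # composite length 360
        wanted = [3, 45, 101]                                 # pruned output bins
        pruned = [goertzel_bin(x, k) for k in wanted]
        assert np.allclose(pruned, np.fft.fft(x)[wanted])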

  1. Advancing the capabilities of reservoir remote sensing by leveraging multi-source satellite data

    NASA Astrophysics Data System (ADS)

    Gao, H.; Zhang, S.; Zhao, G.; Li, Y.

    2017-12-01

    With a total global capacity of more than 6000 km3, reservoirs play a key role in the hydrological cycle and in water resources management. However, essential reservoir data (e.g., elevation, storage, and evaporation loss) are usually not shared at a large scale. While satellite remote sensing offers a unique opportunity for monitoring large reservoirs from space, the commonly used radar altimeters can only detect storage variations for about 15% of global lakes at a repeat period of 10 days or longer. To advance the capabilities of reservoir sensing, we developed a series of algorithms geared towards generating long-term reservoir records with improved spatial coverage and improved temporal resolution. To this end, observations are leveraged from multiple satellite sensors, including radar/laser altimeters, imagers, and passive microwave radiometers. In South Asia, we demonstrate that reservoir storage can be estimated under all-weather conditions at a 4 day time step, with the total capacity of monitored reservoirs increased to 45%. Within the contiguous United States, the first Landsat-based evaporation loss dataset (containing 204 reservoirs) was developed for 1984 to 2011. The evaporation trends of these reservoirs are identified and the causes are analyzed. All of these algorithms and products were validated with gauge observations. Future satellite missions, which will make significant contributions to monitoring global reservoirs, are also discussed.

  2. Performance of the Falling Snow Retrieval Algorithms for the Global Precipitation Measurement (GPM) Mission

    NASA Technical Reports Server (NTRS)

    Skofronick-Jackson, Gail; Munchak, Stephen J.; Ringerud, Sarah

    2016-01-01

    Retrievals of falling snow from space represent an important data set for understanding the Earth's atmospheric, hydrological, and energy cycles, especially during climate change. Estimates of falling snow must be captured to obtain the true global precipitation water cycle, snowfall accumulations are required for hydrological studies, and without knowledge of the frozen particles in clouds one cannot adequately understand the energy and radiation budgets. While satellite-based remote sensing provides global coverage of falling snow events, the science is relatively new and retrievals are still undergoing development, with challenges remaining. This work reports on the development and testing of retrieval algorithms for the Global Precipitation Measurement (GPM) mission Core Satellite, launched in February 2014.

  3. Remote Sensing of Cloud, Aerosol, and Water Vapor Properties from MODIS

    NASA Technical Reports Server (NTRS)

    King, Michael D.

    2001-01-01

    MODIS is an earth-viewing cross-track scanning spectroradiometer launched on the Terra satellite in December 1999. MODIS scans a swath width sufficient to provide nearly complete global coverage every two days from a polar-orbiting, sun-synchronous platform at an altitude of 705 km, and provides images in 36 spectral bands from 0.415 to 14.235 microns with spatial resolutions of 250 m (2 bands), 500 m (5 bands) and 1000 m (29 bands). These bands have been carefully selected to enable advanced studies of land, ocean, and atmospheric processes. In this presentation I will review the comprehensive set of remote sensing algorithms that have been developed for the remote sensing of atmospheric properties using MODIS data, placing primary emphasis on the principal atmospheric applications of: (1) developing a cloud mask for distinguishing clear sky from clouds, (2) retrieving global cloud radiative and microphysical properties, including cloud top pressure and temperature, effective emissivity, cloud optical thickness, thermodynamic phase, and effective radius, (3) monitoring tropospheric aerosol optical thickness over the land and ocean and aerosol size distribution over the ocean, (4) determining atmospheric profiles of moisture and temperature, and (5) estimating column water amount. The physical principles behind the determination of each of these atmospheric products will be described, together with an example of their application using MODIS observations. All products are archived into two categories: pixel-level retrievals (referred to as Level-2 products) and global gridded products at a latitude and longitude resolution of 1 deg (Level-3 products). An overview of the MODIS atmosphere algorithms and products, status, validation activities, and early level-2 and -3 results will be presented. Finally, I will present some highlights from the land and ocean algorithms developed for processing global MODIS observations, including: (1) surface reflectance, (2) vegetation indices, leaf area index, and FPAR, (3) albedo and nadir BRDF-adjusted reflectance, (4) normalized water-leaving radiance, (5) chlorophyll-a concentration, and (6) sea surface temperature.

  4. A Distributed Compressive Sensing Scheme for Event Capture in Wireless Visual Sensor Networks

    NASA Astrophysics Data System (ADS)

    Hou, Meng; Xu, Sen; Wu, Weiling; Lin, Fei

    2018-01-01

    Image signals acquired by a wireless visual sensor network can be used for specific event capture. This event capture is realized by image processing at the sink node. A distributed compressive sensing scheme is used for the transmission of these image signals from the camera nodes to the sink node. A measurement scheme and a joint reconstruction algorithm for these image signals are proposed in this paper. Taking advantage of the spatial correlation between images within a sensing area, the cluster head node, acting as the image decoder, can accurately co-reconstruct these image signals. The subjective visual quality and the reconstruction error rate are used to evaluate the reconstructed image quality. Simulation results show that the joint reconstruction algorithm achieves higher image quality at the same image compressive rate than the independent reconstruction algorithm.

  5. Development of data processing, interpretation and analysis system for the remote sensing of trace atmospheric gas species

    NASA Technical Reports Server (NTRS)

    Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.

    1987-01-01

    The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.

  6. High-performance technology for indexing of high volumes of Earth remote sensing data

    NASA Astrophysics Data System (ADS)

    Strotov, Valery V.; Taganov, Alexander I.; Kolesenkov, Aleksandr N.; Kostrov, Boris V.

    2017-10-01

    The present paper suggests a technology for the search, indexing, cataloging and distribution of aerospace images on the basis of a geo-information approach, cluster analysis and spectral analysis. It considers the information and algorithmic support of the system. The functional scheme of the system and the structure of the geographical database have been developed on the basis of geographical online portal technology. Taking into account the heterogeneity of information obtained from various sources, it is reasonable to apply a geoinformation platform that allows analyzing the spatial location of objects and territories and executing complex processing of information. The geoinformation platform is based on cartographic fundamentals with a uniform coordinate system, a geographical database, and a set of algorithms and program modules for executing various tasks. A technology is also suggested that lets individual users and companies add images, taken with professional and amateur devices and processed by various software tools, to the system. Complex usage of visual and instrumental approaches significantly expands the application area of Earth remote sensing data. Development and implementation of new algorithms based on the complex usage of new methods for processing structured and unstructured data of high volumes will increase the periodicity and rate of data updating. The paper shows that application of the original algorithms for search, indexing and cataloging of aerospace images provides easy access to information spread over hundreds of suppliers and increases the access rate to aerospace images by up to 5 times in comparison with current analogues.

  7. Research on bathymetry estimation by Worldview-2 based with the semi-analytical model

    NASA Astrophysics Data System (ADS)

    Sheng, L.; Bai, J.; Zhou, G.-W.; Zhao, Y.; Li, Y.-C.

    2015-04-01

    The South Sea Islands of China are far from the mainland, reefs make up more than 95% of the South Sea, and most reefs are scattered over sensitive disputed areas. Methods for obtaining reef bathymetry accurately therefore urgently need to be developed. Commonly used methods, including sonar, airborne laser and remote sensing estimation, are limited by the long distances, large areas and sensitive locations involved. Remote sensing data provide an effective way to estimate bathymetry over large areas without physical contact, through the relationship between spectral information and water depth. Considering the water quality of the South Sea of China, our paper develops a bathymetry estimation method that does not require measured water depths. First, a semi-analytical optimization model of the theoretical interpretation models is studied, using a genetic algorithm to optimize the model. Meanwhile, OpenMP parallel computing is introduced to greatly increase the speed of the semi-analytical optimization model. One island of the South Sea of China is selected as the study area, and measured water depths are used to evaluate the accuracy of bathymetry estimated from Worldview-2 multispectral images. The results show that the semi-analytical optimization model based on the genetic algorithm performs well in our study area, and the accuracy of the estimated bathymetry in the 0-20 m shallow water area is acceptable. The semi-analytical optimization model based on the genetic algorithm solves the problem of bathymetry estimation without water depth measurements. In general, our paper provides a new bathymetry estimation method for sensitive reefs far from the mainland.

  8. An Adaptive Cross-Correlation Algorithm for Extended-Scene Shack-Hartmann Wavefront Sensing

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin; Green, Joseph J.; Ohara, Catherine M.; Redding, David C.

    2007-01-01

    This viewgraph presentation reviews the Adaptive Cross-Correlation (ACC) algorithm for extended-scene Shack-Hartmann wavefront (WF) sensing. A Shack-Hartmann sensor places a lenslet array at a plane conjugate to the WF error source. Each sub-aperture lenslet samples the WF in the corresponding patch of the WF. A description of the ACC algorithm is included. The ACC has several benefits, among them: ACC requires only about 4 image-shifting iterations to achieve 0.01 pixel accuracy, and ACC is insensitive to both background light and noise and is much more robust than centroiding.
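
    A rough sketch of the iterative image-shifting idea, assuming FFT-based cross-correlation and a Fourier-domain shift between iterations; it illustrates why a handful of iterations suffice, and is not the flight implementation:

        import numpy as np

        def estimate_shift(ref, img, iters=4):
            """Estimate the (dy, dx) shift of img relative to ref by repeatedly
            correlating, removing the current shift estimate, and refining."""
            ny, nx = ref.shape
            ky = np.fft.fftfreq(ny)[:, None]
            kx = np.fft.fftfreq(nx)[None, :]
            total = np.zeros(2)
            work = img.copy()
            for _ in range(iters):
                xc = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(work)).real
                peak = np.unravel_index(np.argmax(xc), xc.shape)
                dy = (peak[0] + ny // 2) % ny - ny // 2   # wrap to a signed shift
                dx = (peak[1] + nx // 2) % nx - nx // 2
                total += (dy, dx)
                # Shift the working image back so the next pass refines the residual.
                work = np.fft.ifft2(np.fft.fft2(work) *
                                    np.exp(2j * np.pi * (ky * dy + kx * dx))).real
            return total

        scene = np.random.default_rng(4).standard_normal((64, 64))
        print(estimate_shift(scene, np.roll(scene, (3, -5), axis=(0, 1))))  # ~[3, -5]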

  9. Distributed Sensor Fusion for Scalar Field Mapping Using Mobile Sensor Networks.

    PubMed

    La, Hung Manh; Sheng, Weihua

    2013-04-01

    In this paper, autonomous mobile sensor networks are deployed to measure a scalar field and build its map. We develop a novel method for multiple mobile sensor nodes to build this map using noisy sensor measurements. Our method consists of two parts. First, we develop a distributed sensor fusion algorithm by integrating two different distributed consensus filters to achieve cooperative sensing among sensor nodes. This fusion algorithm has two phases. In the first phase, the weighted average consensus filter is developed, which allows each sensor node to find an estimate of the value of the scalar field at each time step. In the second phase, the average consensus filter is used to allow each sensor node to find a confidence of the estimate at each time step. The final estimate of the value of the scalar field is iteratively updated during the movement of the mobile sensors via weighted average. Second, we develop the distributed flocking-control algorithm to drive the mobile sensors to form a network and track the virtual leader moving along the field when only a small subset of the mobile sensors know the information of the leader. Experimental results are provided to demonstrate our proposed algorithms.
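
    A minimal sketch of the weighted-average consensus idea, assuming a fixed undirected communication graph and a simple Laplacian-based iteration; this is a generic consensus filter, not the exact two-phase filter of the paper:

        import numpy as np

        def weighted_consensus(measurements, weights, adjacency, eps=0.2, iters=200):
            """Each node mixes its state with its neighbours'; all nodes converge
            towards the confidence-weighted average of the initial measurements."""
            num = measurements * weights            # weighted readings
            den = weights.astype(float).copy()      # confidences
            lap = np.diag(adjacency.sum(axis=1)) - adjacency   # graph Laplacian
            for _ in range(iters):
                num = num - eps * lap @ num
                den = den - eps * lap @ den
            return num / den

        # 5 nodes on a ring; node 4 has an outlying reading with low confidence.
        A = np.zeros((5, 5))
        for i in range(5):
            A[i, (i + 1) % 5] = A[(i + 1) % 5, i] = 1.0
        z = np.array([1.0, 1.2, 0.9, 1.1, 3.0])
        w = np.array([1.0, 1.0, 1.0, 1.0, 0.1])
        print(weighted_consensus(z, w, A))   # every node near the weighted mean ~1.10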

  10. Farm Management Support on Cloud Computing Platform: A System for Cropland Monitoring Using Multi-Source Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Coburn, C. A.; Qin, Y.; Zhang, J.; Staenz, K.

    2015-12-01

    Food security is one of the most pressing issues facing humankind. Recent estimates predict that over one billion people don't have enough food to meet their basic nutritional needs. The ability of remote sensing tools to monitor and model crop production and predict crop yield is essential for providing governments and farmers with vital information to ensure food security. Google Earth Engine (GEE) is a cloud computing platform, which integrates storage and processing algorithms for massive remotely sensed imagery and vector data sets. By providing the capabilities of storing and analyzing the data sets, it provides an ideal platform for the development of advanced analytic tools for extracting key variables used in regional and national food security systems. With the high performance computing and storing capabilities of GEE, a cloud-computing based system for near real-time crop land monitoring was developed using multi-source remotely sensed data over large areas. The system is able to process and visualize the MODIS time series NDVI profile in conjunction with Landsat 8 image segmentation for crop monitoring. With multi-temporal Landsat 8 imagery, the crop fields are extracted using the image segmentation algorithm developed by Baatz et al.[1]. The MODIS time series NDVI data are modeled by TIMESAT [2], a software package developed for analyzing time series of satellite data. The seasonality of MODIS time series data, for example, the start date of the growing season, length of growing season, and NDVI peak at a field-level are obtained for evaluating the crop-growth conditions. The system fuses MODIS time series NDVI data and Landsat 8 imagery to provide information of near real-time crop-growth conditions through the visualization of MODIS NDVI time series and comparison of multi-year NDVI profiles. Stakeholders, i.e., farmers and government officers, are able to obtain crop-growth information at crop-field level online. This unique utilization of GEE in combination with advanced analytic and extraction techniques provides a vital remote sensing tool for decision makers and scientists with a high-degree of flexibility to adapt to different uses.

  11. Tracking the Evolution of Smartphone Sensing for Monitoring Human Movement.

    PubMed

    del Rosario, Michael B; Redmond, Stephen J; Lovell, Nigel H

    2015-07-31

    Advances in mobile technology have led to the emergence of the "smartphone", a new class of device with more advanced connectivity features that have quickly made it a constant presence in our lives. Smartphones are equipped with comparatively advanced computing capabilities, a global positioning system (GPS) receiver, and sensing capabilities (i.e., an inertial measurement unit (IMU) and, more recently, a magnetometer and barometer) which can be found in wearable ambulatory monitors (WAMs). As a result, algorithms initially developed for WAMs that "count" steps (i.e., pedometers), gauge physical activity levels, indirectly estimate energy expenditure, and monitor human movement can be utilised on the smartphone. These algorithms may enable clinicians to "close the loop" by prescribing timely interventions to improve or maintain wellbeing in populations who are at risk of falling or suffer from a chronic disease whose progression is linked to a reduction in movement and mobility. The ubiquitous nature of smartphone technology makes it the ideal platform from which human movement can be remotely monitored without the expense of purchasing, and inconvenience of using, a dedicated WAM. In this paper, an overview of the sensors that can be found in the smartphone is presented, followed by a summary of the developments in this field, with an emphasis on the evolution of algorithms used to classify human movement. The limitations identified in the literature will be discussed, as well as suggestions about future research directions.

  12. Added Value of Far-Infrared Radiometry for Ice Cloud Remote Sensing

    NASA Astrophysics Data System (ADS)

    Libois, Q.; Blanchet, J. P.; Ivanescu, L.; S Pelletier, L.; Laurence, C.

    2017-12-01

    Several cloud retrieval algorithms based on satellite observations in the infrared have been developed in recent decades. However, most of these observations only cover the midinfrared (MIR, λ < 15 μm) part of the spectrum, and none are available in the far-infrared (FIR, λ ≥ 15 μm). Recent developments in FIR sensor technology, though, now make it possible to consider spaceborne remote sensing in the FIR. Here we show that adding a few FIR channels with realistic radiometric performance to existing spaceborne narrowband radiometers would significantly improve their ability to retrieve ice cloud radiative properties. For clouds encountered in the polar regions and the upper troposphere, where the atmosphere above clouds is sufficiently transparent in the FIR, using FIR channels would reduce by more than 50% the uncertainties on retrieved values of optical thickness, effective particle diameter, and cloud top altitude. This would extend the range of applicability of current infrared retrieval methods to the polar regions and to clouds with large optical thickness, where MIR algorithms perform poorly. The high performance of solar reflection-based algorithms could thus be reached in nighttime conditions. Using FIR observations is a promising avenue for studying ice cloud microphysics and precipitation processes, which is highly relevant for cirrus clouds and convective towers, and for investigating the water cycle in the driest regions of the atmosphere.

  13. Demodulation algorithm for optical fiber F-P sensor.

    PubMed

    Yang, Huadong; Tong, Xinglin; Cui, Zhang; Deng, Chengwei; Guo, Qian; Hu, Pan

    2017-09-10

    The demodulation algorithm is very important for improving the measurement accuracy of a sensing system. In this paper, a variable-step-size hill-climbing search method is used for the first time in an optical fiber Fabry-Perot (F-P) sensing demodulation algorithm. Compared with the traditional discrete gap transformation demodulation algorithm, the computation is greatly reduced by changing the step size of each climb, and the method achieves nano-scale resolution, high measurement accuracy, high demodulation rates, and a large dynamic demodulation range. An optical fiber F-P pressure sensor based on a micro-electro-mechanical system (MEMS) was fabricated to carry out the experiment, and the results show that the resolution of the algorithm can reach the nano-scale level and that the sensor's sensitivity is about 2.5 nm/kPa, similar to the theoretical value; the sensor also shows good reproducibility.
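
    The variable-step-size search can be sketched as follows, assuming a simple two-beam interference model and a correlation score as the quantity being climbed; the model, step schedule and numbers are illustrative assumptions, not the paper's demodulation chain:

        import numpy as np

        def fit_quality(spectrum, wavelengths, cavity):
            """Correlation between the measured spectrum and a trial cavity length."""
            return float(np.dot(spectrum, np.cos(4.0 * np.pi * cavity / wavelengths)))

        def hill_climb_cavity(spectrum, wavelengths, lo, hi, step0=50e-9, tol=0.1e-9):
            """Walk towards larger fit quality, halving the step whenever neither
            neighbour improves, until the step is below nanometre-scale resolution."""
            cavity, step = 0.5 * (lo + hi), step0
            best = fit_quality(spectrum, wavelengths, cavity)
            while step > tol:
                moved = False
                for cand in (cavity + step, cavity - step):
                    q = fit_quality(spectrum, wavelengths, cand)
                    if q > best:
                        cavity, best, moved = cand, q, True
                        break
                if not moved:
                    step *= 0.5
            return cavity

        wl = np.linspace(1520e-9, 1570e-9, 2000)
        rng = np.random.default_rng(5)
        meas = np.cos(4.0 * np.pi * 80.02e-6 / wl) + 0.05 * rng.standard_normal(wl.size)
        print(hill_climb_cavity(meas, wl, 79.95e-6, 80.05e-6))   # close to 80.02e-6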

  14. Wavelet phase extracting demodulation algorithm based on scale factor for optical fiber Fabry-Perot sensing.

    PubMed

    Zhang, Baolin; Tong, Xinglin; Hu, Pan; Guo, Qian; Zheng, Zhiyuan; Zhou, Chaoran

    2016-12-26

    Optical fiber Fabry-Perot (F-P) sensors have been used for on-line monitoring of various physical parameters such as acoustics, temperature and pressure. In this paper, a wavelet phase extracting demodulation algorithm for optical fiber F-P sensing is proposed for the first time. In this demodulation algorithm, the search range of the scale factor is determined by the estimated cavity length, which is obtained by a fast Fourier transform (FFT) algorithm. Phase information of each point on the optical interference spectrum can be directly extracted through the continuous complex wavelet transform without de-noising, and the cavity length of the optical fiber F-P sensor is calculated from the slope of the fitted phase curve. Theoretical analysis and experimental results show that this algorithm greatly reduces the amount of computation and improves demodulation speed and accuracy.

  15. Principles underlying the design of "The Number Race", an adaptive computer game for remediation of dyscalculia.

    PubMed

    Wilson, Anna J; Dehaene, Stanislas; Pinel, Philippe; Revkin, Susannah K; Cohen, Laurent; Cohen, David

    2006-05-30

    Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article describes the evolution of number sense and arithmetic scores before and after training. The software, open-source and freely available online, is designed for learning disabled children aged 5-8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains.

  16. Compressed sensing based missing nodes prediction in temporal communication network

    NASA Astrophysics Data System (ADS)

    Cheng, Guangquan; Ma, Yang; Liu, Zhong; Xie, Fuli

    2018-02-01

    The reconstruction of complex network topology is of great theoretical and practical significance. Most research so far focuses on the prediction of missing links. There are many mature algorithms for link prediction which have achieved good results, but research on the prediction of missing nodes has just begun. In this paper, we propose an algorithm for missing node prediction in complex networks. We detect the position of missing nodes based on their neighbor nodes under the theory of compressed sensing, and extend the algorithm to the case of multiple missing nodes using spectral clustering. Experiments on real public network datasets and simulated datasets show that our algorithm can detect the locations of hidden nodes effectively with high precision.

  17. Real-time dynamic simulation of the Cassini spacecraft using DARTS. Part 2: Parallel/vectorized real-time implementation

    NASA Technical Reports Server (NTRS)

    Fijany, A.; Roberts, J. A.; Jain, A.; Man, G. K.

    1993-01-01

    Part 1 of this paper presented the requirements for the real-time simulation of the Cassini spacecraft along with some discussion of the DARTS algorithm. Here, in Part 2, we discuss the development and implementation of the parallel/vectorized DARTS algorithm and architecture for real-time simulation. Development of fast algorithms and architectures for real-time hardware-in-the-loop simulation of spacecraft dynamics is motivated by the fact that it represents a hard real-time problem, in the sense that the correctness of the simulation depends on both the numerical accuracy and the exact timing of the computation. For a given model fidelity, the computation must be completed within a predefined time period. Further reduction in computation time allows increasing the fidelity of the model (i.e., inclusion of more flexible modes) and of the integration routine.

  18. Spheres: from Ground Development to ISS Operations

    NASA Technical Reports Server (NTRS)

    Katterhagen, A.

    2016-01-01

    SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) is an internal International Space Station (ISS) Facility that supports multiple investigations for the development of multi-spacecraft and robotic control algorithms. The SPHERES National Lab Facility aboard ISS is managed and operated by NASA Ames Research Center (ARC) at Moffett Field California. The SPHERES Facility on ISS consists of three self-contained eight-inch diameter free-floating satellites which perform the various flight algorithms and serve as a platform to support the integration of experimental hardware. SPHERES has served to mature the adaptability of control algorithms of future formation flight missions in microgravity (6 DOF (Degrees of Freedom) / long duration microgravity), demonstrate key close-proximity formation flight and rendezvous and docking maneuvers, understand fault diagnosis and recovery, improve the field of human telerobotic operation and control, and lessons learned on ISS have significant impact on ground robotics, mapping, localization, and sensing in three-dimensions - among several other areas of study.

  19. Application of Novel Lateral Tire Force Sensors to Vehicle Parameter Estimation of Electric Vehicles

    PubMed Central

    Nam, Kanghyun

    2015-01-01

    This article presents methods for estimating lateral vehicle velocity and tire cornering stiffness, which are key parameters in vehicle dynamics control, using lateral tire force measurements. Lateral tire forces acting on each tire are directly measured by load-sensing hub bearings that were invented and further developed by NSK Ltd. For estimating the lateral vehicle velocity, tire force models considering lateral load transfer effects are used, and a recursive least-squares algorithm is adapted to identify the lateral vehicle velocity as an unknown parameter. Using the estimated lateral vehicle velocity, tire cornering stiffness, which is an important tire parameter dominating the vehicle’s cornering responses, is estimated. For the practical implementation, the cornering stiffness estimation algorithm based on a simple bicycle model is developed and discussed. Finally, the proposed estimation algorithms were evaluated using experimental test data. PMID:26569246
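
    The parameter-identification step can be pictured with a generic recursive least-squares update for a model y = phi^T theta + noise; how the regressor phi and measurement y are built from the lateral tire-force model is simplified away here, so this is only an orientation sketch:

        import numpy as np

        def rls_update(theta, P, phi, y, lam=0.995):
            """One forgetting-factor RLS step for the unknown parameter vector theta."""
            phi = phi.reshape(-1, 1)
            k = P @ phi / (lam + phi.T @ P @ phi)          # gain
            theta = theta + (k * (y - phi.T @ theta)).ravel()
            P = (P - k @ phi.T @ P) / lam                  # covariance update
            return theta, P

        # Toy identification of a single unknown parameter (e.g. a velocity term).
        rng = np.random.default_rng(6)
        true_theta = np.array([2.5])
        theta, P = np.zeros(1), 100.0 * np.eye(1)
        for _ in range(300):
            phi = rng.standard_normal(1)
            y = float(phi @ true_theta) + 0.05 * rng.standard_normal()
            theta, P = rls_update(theta, P, phi, y)
        print(theta)    # converges near 2.5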

  20. An Improved Unsupervised Image Segmentation Evaluation Approach Based on Under- and Over-Segmentation Aware

    NASA Astrophysics Data System (ADS)

    Su, Tengfei

    2018-04-01

    In this paper, an unsupervised evaluation scheme for remote sensing image segmentation is developed. Building on a method called under- and over-segmentation aware (UOA), the new approach is improved by overcoming a defect in the estimation of the over-segmentation error. Two cases of this error-prone behaviour are listed, and edge strength is employed to devise a solution to the issue. Two subsets of high resolution remote sensing images were used to test the proposed algorithm, and the experimental results indicate its superior performance, which is attributed to its improved OSE detection model.

  1. Radio Frequency Interference Detection for Passive Remote Sensing Using Eigenvalue Analysis

    NASA Technical Reports Server (NTRS)

    Schoenwald, Adam; Kim, Seung-Jun; Mohammed-Tano, Priscilla

    2017-01-01

    Radio frequency interference (RFI) can corrupt passive remote sensing measurements taken with microwave radiometers. With the increasingly utilized spectrum and the push for larger bandwidth radiometers, the likelihood of RFI contamination has grown significantly. In this work, an eigenvalue-based algorithm is developed to detect the presence of RFI and provide estimates of RFI-free radiation levels. Simulated tests show that the proposed detector outperforms conventional kurtosis-based RFI detectors in the low-to-medium interference-to-noise-power-ratio (INR) regime under continuous wave (CW) and quadrature phase shift keying (QPSK) RFIs.

  2. Radio Frequency Interference Detection for Passive Remote Sensing Using Eigenvalue Analysis

    NASA Technical Reports Server (NTRS)

    Schoenwald, Adam J.; Kim, Seung-Jun; Mohammed, Priscilla N.

    2017-01-01

    Radio frequency interference (RFI) can corrupt passive remote sensing measurements taken with microwave radiometers. With the increasingly utilized spectrum and the push for larger bandwidth radiometers, the likelihood of RFI contamination has grown significantly. In this work, an eigenvalue-based algorithm is developed to detect the presence of RFI and provide estimates of RFI-free radiation levels. Simulated tests show that the proposed detector outperforms conventional kurtosis-based RFI detectors in the low-to-medium interference-to-noise-power-ratio (INR) regime under continuous wave (CW) and quadrature phase shift keying (QPSK) RFIs.
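
    A sketch of the eigenvalue test for a multichannel (or sub-banded) radiometer snapshot: thermal noise yields nearly equal covariance eigenvalues, while CW- or QPSK-like interference concentrates power in the dominant one. The threshold and the RFI-free power estimate below are illustrative choices, not the detector of the paper:

        import numpy as np

        def eigen_rfi_detect(x, threshold=1.5):
            """x: (channels, samples) complex baseband data; returns (flag, test, power)."""
            r = x @ x.conj().T / x.shape[1]               # sample covariance
            ev = np.sort(np.linalg.eigvalsh(r))[::-1]     # descending eigenvalues
            test = ev[0] / ev[1:].mean()                  # dominant-eigenvalue ratio
            clean_power = ev[1:].mean()                   # estimate excluding the top mode
            return test > threshold, test, clean_power

        rng = np.random.default_rng(7)
        n_ch, n_smp = 8, 4096
        noise = (rng.standard_normal((n_ch, n_smp)) +
                 1j * rng.standard_normal((n_ch, n_smp))) / np.sqrt(2)
        cw = 0.5 * np.exp(2j * np.pi * 0.1 * np.arange(n_smp))      # CW interferer
        steering = np.exp(1j * rng.uniform(0, 2 * np.pi, (n_ch, 1)))
        print(eigen_rfi_detect(noise))                   # RFI-free case: flag False
        print(eigen_rfi_detect(noise + steering * cw))   # contaminated case: flag True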

  3. Using linear polarization for LWIR hyperspectral sensing of liquid contaminants

    NASA Astrophysics Data System (ADS)

    Thériault, Jean-Marc; Fortin, Gilles; Lacasse, Paul; Bouffard, François; Lavoie, Hugo

    2013-09-01

    We report and analyze recent results obtained with the MoDDIFS sensor (Multi-option Differential Detection and Imaging Fourier Spectrometer) for the passive polarization sensing of liquid contaminants in the long wave infrared (LWIR). Field measurements of polarized spectral radiance done on ethylene glycol and SF96 probed at distances of 6.5 and 450 meters, respectively, have been used to develop and test a GLRT-type detection algorithm adapted for liquid contaminants. The GLRT detection results serve to establish the potential and advantage of probing the vertical and horizontal linear hyperspectral polarization components for improving liquid contaminants detection.

  4. GPU implementation of prior image constrained compressed sensing (PICCS)

    NASA Astrophysics Data System (ADS)

    Nett, Brian E.; Tang, Jie; Chen, Guang-Hong

    2010-04-01

    The Prior Image Constrained Compressed Sensing (PICCS) algorithm (Med. Phys. 35, pg. 660, 2008) has been applied to several computed tomography applications with both standard CT systems and flat-panel based systems designed for guiding interventional procedures and radiation therapy treatment delivery. The PICCS algorithm typically utilizes a prior image which is reconstructed via the standard Filtered Backprojection (FBP) reconstruction algorithm. The algorithm then iteratively solves for the image volume that matches the measured data, while simultaneously assuring the image is similar to the prior image. The PICCS algorithm has demonstrated utility in several applications including: improved temporal resolution reconstruction, 4D respiratory phase specific reconstructions for radiation therapy, and cardiac reconstruction from data acquired on an interventional C-arm. One disadvantage of the PICCS algorithm, as with other iterative algorithms, is the long computation time typically associated with reconstruction. In order for an algorithm to gain clinical acceptance, reconstruction must be achievable in minutes rather than hours. In this work the PICCS algorithm has been implemented on the GPU in order to significantly reduce its reconstruction time. The Compute Unified Device Architecture (CUDA) was used in this implementation.

  5. Remotely monitoring evaporation rate and soil water status using thermal imaging and "three-temperatures model (3T Model)" under field-scale conditions.

    PubMed

    Qiu, Guo Yu; Zhao, Ming

    2010-03-01

    Remote monitoring of soil evaporation and soil water status is necessary for water resource and environment management. Ground-based remote sensing can be the bridge between satellite remote sensing and ground-based point measurement. The primary objective of this study is to provide an algorithm to estimate evaporation and soil water status by remote sensing and to verify its accuracy. Observations were carried out in a flat field with varied soil water content. High-resolution thermal images were taken with a thermal camera; soil evaporation was measured with a weighing lysimeter; weather data were recorded at a nearby meteorological station. Based on the thermal imaging and the three-temperatures model (3T model), we developed an algorithm to estimate soil evaporation and soil water status. The required parameters of the proposed method were soil surface temperature, air temperature, and solar radiation. By using the proposed method, daily variation in soil evaporation was estimated. Meanwhile, soil water status was remotely monitored by using the soil evaporation transfer coefficient. Results showed that the daily variation trends of measured and estimated evaporation agreed with each other, with a regression line of y = 0.92x and a coefficient of determination R² = 0.69. The simplicity of the proposed method makes the 3T model a potentially valuable tool for remote sensing.

  6. Advanced end-to-end fiber optic sensing systems for demanding environments

    NASA Astrophysics Data System (ADS)

    Black, Richard J.; Moslehi, Behzad

    2010-09-01

    Optical fibers are small in diameter, light in weight, immune to electromagnetic interference, electrically passive, chemically inert, flexible, embeddable into different materials, and enable distributed sensing, and they can be temperature and radiation tolerant. With appropriate processing and/or packaging, they can be very robust and well suited to demanding environments. In this paper, we review a range of complete end-to-end fiber optic sensor systems that IFOS has developed, comprising not only (1) packaged sensors and mechanisms for integration with demanding environments, but also (2) ruggedized sensor interrogators and (3) intelligent decision-aid algorithm software systems. We examine the following examples. Fiber Bragg Grating (FBG) optical sensor systems supporting arrays of environmentally conditioned multiplexed FBG point sensors on single or multiple optical fibers: in conjunction with advanced signal processing, decision-aid algorithms and reasoners, FBG-based structural health monitoring (SHM) systems are expected to play an increasing role in extending the life and reducing the costs of new generations of aerospace systems. Further, FBG-based structural state sensing systems have the potential to considerably enhance the performance of dynamic structures interacting with their environment (including jet aircraft, unmanned aerial vehicles (UAVs), and medical or extravehicular space robots). Raman-based distributed temperature sensing systems: the complete length of optical fiber acts as a very long distributed sensor, which may be placed down an oil well or wrapped around a cryogenic tank.

  7. Land use/cover classification in the Brazilian Amazon using satellite images.

    PubMed

    Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant'anna, Sidnei João Siqueira

    2012-09-01

    Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews experiments related to land use/cover classification in the Brazilian Amazon for a decade. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporation of suitable textural images into multispectral bands and use of segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results. However, they often require more time to achieve parametric optimization. Proper use of hierarchical-based methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.

  8. Land use/cover classification in the Brazilian Amazon using satellite images

    PubMed Central

    Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant’Anna, Sidnei João Siqueira

    2013-01-01

    Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews experiments related to land use/cover classification in the Brazilian Amazon for a decade. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporation of suitable textural images into multispectral bands and use of segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results. However, they often require more time to achieve parametric optimization. Proper use of hierarchical-based methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data. PMID:24353353

  9. ICAP: An Interactive Cluster Analysis Procedure for analyzing remotely sensed data. [to classify the radiance data to produce a thematic map

    NASA Technical Reports Server (NTRS)

    Wharton, S. W.

    1980-01-01

    An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. The algorithm interfaces the rapid numerical processing capacity of a computer with the human ability to integrate qualitative information. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who evaluates and may elect to modify the cluster structure. Clusters can be deleted or lumped pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The ICAP was implemented in APL (A Programming Language), an interactive computer language. The flexibility of the algorithm was evaluated using data from different LANDSAT scenes to simulate two situations: one in which the analyst is assumed to have no prior knowledge about the data and wishes to have the clusters formed more or less automatically, and the other in which the analyst is assumed to have some knowledge about the data structure and wishes to use that information to closely supervise the clustering process. For comparison, an existing clustering method was also applied to the two data sets.

  10. Classification of simple vegetation types using POLSAR image data

    NASA Technical Reports Server (NTRS)

    Freeman, A.

    1993-01-01

    Mapping basic vegetation or land cover types is a fairly common problem in remote sensing. Knowledge of the land cover type is a key input to algorithms which estimate geophysical parameters, such as soil moisture, surface roughness, leaf area index or biomass, from remotely sensed data. In an earlier paper, an algorithm for fitting a simple three-component scattering model to POLSAR data was presented. The algorithm yielded estimates for surface scatter, double-bounce scatter and volume scatter for each pixel in a POLSAR image data set. In this paper, we show how the relative levels of each of the three components can be used as inputs to a simple classifier for vegetation type. Vegetation classes include no vegetation cover (e.g. bare soil or desert), low vegetation cover (e.g. grassland), moderate vegetation cover (e.g. fully developed crops), forest, and urban areas. Implementation of the approach requires estimates for the three components at all three frequencies available from the NASA/JPL AIRSAR, i.e. C-, L- and P-bands. The research described in this paper was carried out by the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
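
    The classification idea can be sketched as a simple rule on the relative surface, double-bounce and volume powers from the three-component decomposition of a single pixel; the thresholds below are placeholders, not the values or the multi-frequency logic used in the paper:

        def classify_cover(ps, pd, pv):
            """Rule-based class label from surface (ps), double-bounce (pd) and
            volume (pv) scattering powers of one POLSAR pixel."""
            total = ps + pd + pv
            fs, fd, fv = ps / total, pd / total, pv / total
            if fv < 0.2 and fs > fd:
                return "no or low vegetation"
            if fv < 0.5:
                return "moderate vegetation (crops)"
            if fd > 0.3:
                return "urban"            # strong double bounce from structures
            return "forest"

        print(classify_cover(ps=0.75, pd=0.10, pv=0.15))   # no or low vegetation
        print(classify_cover(ps=0.10, pd=0.10, pv=0.80))   # forest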

  11. Spatiotemporal Local-Remote Sensor Fusion (ST-LRSF) for Cooperative Vehicle Positioning.

    PubMed

    Jeong, Han-You; Nguyen, Hoa-Hung; Bhawiyuga, Adhitya

    2018-04-04

    Vehicle positioning plays an important role in the design of protocols, algorithms, and applications in the intelligent transport systems. In this paper, we present a new framework of spatiotemporal local-remote sensor fusion (ST-LRSF) that cooperatively improves the accuracy of absolute vehicle positioning based on two state estimates of a vehicle in the vicinity: a local sensing estimate, measured by the on-board exteroceptive sensors, and a remote sensing estimate, received from neighbor vehicles via vehicle-to-everything communications. Given both estimates of vehicle state, the ST-LRSF scheme identifies the set of vehicles in the vicinity, determines the reference vehicle state, proposes a spatiotemporal dissimilarity metric between two reference vehicle states, and presents a greedy algorithm to compute a minimal weighted matching (MWM) between them. Given the outcome of MWM, the theoretical position uncertainty of the proposed refinement algorithm is proven to be inversely proportional to the square root of matching size. To further reduce the positioning uncertainty, we also develop an extended Kalman filter model with the refined position of ST-LRSF as one of the measurement inputs. The numerical results demonstrate that the proposed ST-LRSF framework can achieve high positioning accuracy for many different scenarios of cooperative vehicle positioning.
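
    A minimal sketch of the greedy matching stage, assuming plain Euclidean distance between positions as a stand-in for the spatiotemporal dissimilarity metric; the function name and cut-off parameter are hypothetical.

```python
import numpy as np

def greedy_match(local_pos, remote_pos, max_dissimilarity=5.0):
    """Greedily pair local and remote vehicle position estimates.

    local_pos, remote_pos: (N, 2) and (M, 2) arrays of x/y positions.
    Euclidean distance stands in for the spatiotemporal dissimilarity metric.
    Returns a list of (local_index, remote_index) pairs.
    """
    cost = np.linalg.norm(local_pos[:, None, :] - remote_pos[None, :, :], axis=2)
    pairs, used_l, used_r = [], set(), set()
    for idx in np.argsort(cost, axis=None):          # cheapest pairs first
        i, j = np.unravel_index(idx, cost.shape)
        if i in used_l or j in used_r or cost[i, j] > max_dissimilarity:
            continue
        pairs.append((int(i), int(j)))
        used_l.add(i)
        used_r.add(j)
    return pairs
```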

  12. Accuracy assessment of satellite Ocean colour products in coastal waters.

    NASA Astrophysics Data System (ADS)

    Tilstone, G.; Lotliker, A.; Groom, S.

    2012-04-01

    The use of Ocean Colour Remote Sensing to monitor phytoplankton blooms in coastal waters is hampered by the absorption and scattering from substances in the water that vary independently of phytoplankton. In this paper we compare different ocean colour algorithms available for SeaWiFS, MODIS and MERIS with in situ observations of Remote Sensing Reflectance, Chlorophyll-a (Chla), Total Suspended Material and Coloured Dissolved Organic Material in coastal waters of the Arabian Sea, Bay of Bengal, North Sea and Western English Channel, which have contrasting inherent optical properties. We demonstrate a clustering method on specific Inherent Optical Properties (sIOP) that gives accurate water quality products from MERIS data (HYDROPT) and also test the recently developed ESA CoastColour MERIS products. We found that for coastal waters of the Bay of Bengal, OC5 gave the most accurate Chla; for the Arabian Sea, GSM and OC3M Chla were more accurate; and for the North Sea and Western English Channel, the MERIS HYDROPT products were more accurate than standard algorithms. The reasons for these differences will be discussed. A Chla time series from 2002-2011 will be presented to illustrate differences in algorithms between coastal regions and inter- and intra-annual variability in phytoplankton blooms.

  13. Object-based classification of earthquake damage from high-resolution optical imagery using machine learning

    NASA Astrophysics Data System (ADS)

    Bialas, James; Oommen, Thomas; Rebbapragada, Umaa; Levin, Eugene

    2016-07-01

    Object-based approaches in the segmentation and classification of remotely sensed images yield more promising results compared to pixel-based approaches. However, the development of an object-based approach presents challenges in terms of algorithm selection and parameter tuning. Subjective methods are often used, but yield less than optimal results. Objective methods are warranted, especially for rapid deployment in time-sensitive applications, such as earthquake damage assessment. Herein, we used a systematic approach in evaluating object-based image segmentation and machine learning algorithms for the classification of earthquake damage in remotely sensed imagery. We tested a variety of algorithms and parameters on post-event aerial imagery for the 2011 earthquake in Christchurch, New Zealand. Results were compared against manually selected test cases representing different classes. In doing so, we can evaluate the effectiveness of the segmentation and classification of different classes and compare different levels of multistep image segmentations. Our classifier is compared against recent pixel-based and object-based classification studies for post-event imagery of earthquake damage. Our results show an improvement against both pixel-based and object-based methods for classifying earthquake damage in high-resolution, post-event imagery.
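
    The sketch below illustrates a generic object-based classification pipeline (superpixel segmentation followed by a random forest on per-object spectral features) of the kind evaluated in such studies; the segmentation method, feature set and parameters are assumptions, not the authors' configuration, and a recent scikit-image/scikit-learn installation is assumed.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def object_based_classify(image, train_mask, train_labels, n_segments=2000):
    """Generic object-based classification: segment, pool features, classify.

    image: (H, W, bands) float array; train_mask: boolean (H, W) of labelled
    pixels; train_labels: (H, W) integer class map, valid where train_mask.
    """
    segments = slic(image, n_segments=n_segments, compactness=10,
                    start_label=0, channel_axis=-1)
    n_obj = segments.max() + 1
    # Per-object mean spectral feature vector.
    feats = np.array([image[segments == s].mean(axis=0) for s in range(n_obj)])

    # Label an object by the majority class of its labelled pixels (if any).
    X, y = [], []
    for s in range(n_obj):
        labels = train_labels[(segments == s) & train_mask]
        if labels.size:
            X.append(feats[s])
            y.append(int(np.bincount(labels).argmax()))

    clf = RandomForestClassifier(n_estimators=200).fit(X, y)
    return clf.predict(feats)[segments]              # per-pixel class map
```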

  14. Optimal and robust control of transition

    NASA Technical Reports Server (NTRS)

    Bewley, T. R.; Agarwal, R.

    1996-01-01

    Optimal and robust control theories are used to determine feedback control rules that effectively stabilize a linearly unstable flow in a plane channel. Wall transpiration (unsteady blowing/suction) with zero net mass flux is used as the control. Control algorithms are considered that depend both on full flowfield information and on estimates of that flowfield based on wall skin-friction measurements only. The development of these control algorithms accounts for modeling errors and measurement noise in a rigorous fashion; these disturbances are considered in both a structured (Gaussian) and unstructured ('worst case') sense. The performance of these algorithms is analyzed in terms of the eigenmodes of the resulting controlled systems, and the sensitivity of individual eigenmodes to both control and observation is quantified.

  15. Processing Digital Imagery to Enhance Perceptions of Realism

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2003-01-01

    Multi-scale retinex with color restoration (MSRCR) is a method of processing digital image data based on Edwin Land's retinex (retina + cortex) theory of human color vision. An outgrowth of basic scientific research and its application to NASA's remote-sensing mission, MSRCR is embodied in a general-purpose algorithm that greatly improves the perception of visual realism and the quantity and quality of perceived information in a digitized image. In addition, the MSRCR algorithm includes provisions for automatic corrections to accelerate and facilitate what could otherwise be a tedious image-editing process. The MSRCR algorithm has been, and is expected to continue to be, the basis for development of commercial image-enhancement software designed to extend and refine its capabilities for diverse applications.
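
    The core multi-scale retinex computation can be sketched as the average, over several Gaussian surround scales, of the log ratio between a pixel and its blurred surround, optionally weighted by a color-restoration term; the scales and constants below are typical published values, assumed here for illustration rather than taken from the flight software.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(channel, sigmas=(15, 80, 250)):
    """Core MSR step for one channel: average of log(I) - log(I * G_sigma).

    channel: 2-D array of non-negative intensities. The three scales are
    commonly used values, not necessarily the MSRCR defaults.
    """
    img = channel.astype(float) + 1.0
    msr = np.zeros_like(img)
    for sigma in sigmas:
        surround = gaussian_filter(img, sigma)       # Gaussian surround estimate
        msr += np.log(img) - np.log(surround)
    return msr / len(sigmas)

def simple_color_restoration(rgb, alpha=125.0, beta=46.0):
    """Illustrative color-restoration weight: beta * log(alpha * I_c / sum_c I_c)."""
    rgb = rgb.astype(float) + 1.0
    return beta * (np.log(alpha * rgb) - np.log(rgb.sum(axis=2, keepdims=True)))
```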

  16. The research of road and vehicle information extraction algorithm based on high resolution remote sensing image

    NASA Astrophysics Data System (ADS)

    Zhou, Tingting; Gu, Lingjia; Ren, Ruizhi; Cao, Qiong

    2016-09-01

    With the rapid development of remote sensing technology, the spatial and temporal resolution of satellite imagery has increased greatly. Meanwhile, high-spatial-resolution images are becoming increasingly popular for commercial applications. Remote sensing image technology has broad application prospects in intelligent traffic. Compared with traditional traffic information collection methods, vehicle information extraction using high-resolution remote sensing imagery has the advantages of high resolution and wide coverage. This has great guiding significance for urban planning, transportation management, travel route choice and so on. First, this paper preprocessed the acquired high-resolution multi-spectral and panchromatic remote sensing images. After that, on the one hand, histogram equalization and linear enhancement were applied to the preprocessed results in order to obtain the optimal threshold for image segmentation. On the other hand, considering the distribution characteristics of roads, the normalized difference vegetation index (NDVI) and normalized difference water index (NDWI) were used to suppress water and vegetation information in the preprocessed results. The above two processing results were then combined. Finally, geometric characteristics were used to complete road information extraction. The extracted road vector was used to limit the target vehicle search area. Target vehicle extraction was divided into bright vehicle extraction and dark vehicle extraction, and the extraction results of the two kinds of vehicles were combined to obtain the final results. The experimental results demonstrated that the proposed algorithm has high precision for vehicle information extraction from different high resolution remote sensing images: the average false detection rate was about 5.36%, the average residual rate was about 13.60% and the average accuracy was approximately 91.26%.
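
    A hedged sketch of the vegetation/water suppression and the bright/dark vehicle thresholding steps described above; band arguments and thresholds are illustrative assumptions, not the paper's values.

```python
import numpy as np

def suppress_vegetation_and_water(red, nir, green, pan,
                                  ndvi_thr=0.3, ndwi_thr=0.3):
    """Mask vegetation and water before road/vehicle extraction.

    red, nir, green: multispectral bands; pan: panchromatic band.
    Thresholds are illustrative only.
    """
    ndvi = (nir - red) / (nir + red + 1e-6)
    ndwi = (green - nir) / (green + nir + 1e-6)
    keep = (ndvi < ndvi_thr) & (ndwi < ndwi_thr)     # non-vegetation, non-water
    return np.where(keep, pan, 0.0)

def extract_vehicles(road_area, bright_thr, dark_thr):
    """Separate bright and dark vehicle candidates inside the road mask."""
    bright = road_area > bright_thr
    dark = (road_area > 0) & (road_area < dark_thr)
    return bright | dark
```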

  17. Place Value and Mathematics for Students with Mild Disabilities: Data and Suggested Practices

    ERIC Educational Resources Information Center

    Cawley, John F.; Parmar, Rene S.; Lucas-Fusco, Lynn M.; Kilian, Joy D.; Foley, Teresa E.

    2007-01-01

    Place value is a phenomenon that has ominous implications for developing number sense and meaning and for using alternative algorithms and alternative representations within whole number arithmetic. For the most part, school programs examine place value at a surface level, with a primary focus on having the student identify or state a number value…

  18. The development of machine technology processing for earth resource survey

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A.

    1970-01-01

    The following technologies are considered for automatic processing of earth resources data: (1) registration of multispectral and multitemporal images, (2) digital image display systems, (3) data system parameter effects on satellite remote sensing systems, and (4) data compression techniques based on spectral redundancy. The importance of proper spectral band and compression algorithm selections is pointed out.

  19. Hyperspectral absorption and backscattering coefficients of bulk water retrieved from a combination of remote-sensing reflectance and attenuation coefficient.

    PubMed

    Lin, Junfang; Lee, Zhongping; Ondrusek, Michael; Liu, Xiaohan

    2018-01-22

    Absorption (a) and backscattering (bb) coefficients play a key role in determining the light field; they also serve as the link between remote sensing and concentrations of optically active water constituents. Here we present an updated scheme to derive hyperspectral a and bb with hyperspectral remote-sensing reflectance (Rrs) and diffuse attenuation coefficient (Kd) as the inputs. Results show that the system works very well from clear open oceans to highly turbid inland waters, with an overall difference less than 25% between these retrievals and those from instrument measurements. This updated scheme advocates the measurement and generation of hyperspectral a and bb from hyperspectral Rrs and Kd, as an independent data source for cross-evaluation of in situ measurements of a and bb and for the development and/or evaluation of remote sensing algorithms for such optical properties.

  20. Cybernetic group method of data handling (GMDH) statistical learning for hyperspectral remote sensing inverse problems in coastal ocean optics

    NASA Astrophysics Data System (ADS)

    Filippi, Anthony Matthew

    For complex systems, sufficient a priori knowledge is often lacking about the mathematical or empirical relationship between cause and effect or between inputs and outputs of a given system. Automated machine learning may offer a useful solution in such cases. Coastal marine optical environments represent such a case, as the optical remote sensing inverse problem remains largely unsolved. A self-organizing, cybernetic mathematical modeling approach known as the group method of data handling (GMDH), a type of statistical learning network (SLN), was used to generate explicit spectral inversion models for optically shallow coastal waters. Optically shallow water light fields represent a particularly difficult challenge in oceanographic remote sensing. Several algorithm-input data treatment combinations were utilized in multiple experiments to automatically generate inverse solutions for various inherent optical property (IOP), bottom optical property (BOP), constituent concentration, and bottom depth estimations. The objective was to identify the optimal remote-sensing reflectance Rrs(lambda) inversion algorithm. The GMDH also has the potential of inductive discovery of physical hydro-optical laws. Simulated data were used to develop generalized, quasi-universal relationships. The Hydrolight numerical forward model, based on radiative transfer theory, was used to compute simulated above-water remote-sensing reflectance Rrs(lambda) pseudodata, matching the spectral channels and resolution of the experimental Naval Research Laboratory Ocean PHILLS (Portable Hyperspectral Imager for Low-Light Spectroscopy) sensor. The input-output pairs were used for GMDH and artificial neural network (ANN) model development, the latter of which served as a baseline, or control, algorithm. Both types of models were applied to in situ and aircraft data. Also, in situ spectroradiometer-derived Rrs(lambda) were used as input to an optimization-based inversion procedure. Target variables included bottom depth zb, chlorophyll a concentration [chl-a], spectral bottom irradiance reflectance Rb(lambda), and spectral total absorption a(lambda) and spectral total backscattering bb(lambda) coefficients. When the cybernetic and neural models were applied to in situ HyperTSRB-derived Rrs, the difference in the means of the absolute error of the inversion estimates for zb was significant (alpha = 0.05): GMDH yielded significantly better zb than the ANN, with a mean absolute error (MAE) of 0.55161 m compared with 0.62214 m for the ANN model.

  1. Expansion of Smartwatch Touch Interface from Touchscreen to Around Device Interface Using Infrared Line Image Sensors.

    PubMed

    Lim, Soo-Chul; Shin, Jungsoon; Kim, Seung-Chan; Park, Joonah

    2015-07-09

    Touchscreen interaction has become a fundamental means of controlling mobile phones and smartwatches. However, the small form factor of a smartwatch limits the available interactive surface area. To overcome this limitation, we propose the expansion of the touch region of the screen to the back of the user's hand. We developed a touch module for sensing the touched finger position on the back of the hand using infrared (IR) line image sensors, based on the calibrated IR intensity and the maximum intensity region of an IR array. For a complete touch-sensing solution, a gyroscope installed in the smartwatch is used to read wrist gestures. The gyroscope incorporates a dynamic time warping gesture recognition algorithm for eliminating unintended touch inputs during the free motion of the wrist while wearing the smartwatch. The prototype of the developed sensing module was implemented in a commercial smartwatch, and it was confirmed that the sensed positional information of the finger when it was used to touch the back of the hand could be used to control the smartwatch graphical user interface. Our system not only affords a novel experience for smartwatch users, but also provides a basis for developing other useful interfaces.
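
    A minimal sketch of the dynamic time warping comparison that could be used to reject touches occurring during free wrist motion; the template set, 1-D trace simplification, and rejection threshold are hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D gyroscope traces."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def is_intentional_touch(gesture, free_motion_templates, reject_threshold=10.0):
    """Accept a touch only if the wrist trace is far from known free-motion templates."""
    return min(dtw_distance(gesture, t) for t in free_motion_templates) > reject_threshold
```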

  2. Local Learning Strategies for Wake Identification

    NASA Astrophysics Data System (ADS)

    Colvert, Brendan; Alsalman, Mohamad; Kanso, Eva

    2017-11-01

    Swimming agents, biological and engineered alike, must navigate the underwater environment to survive. Tasks such as autonomous navigation, foraging, mating, and predation require the ability to extract critical cues from the hydrodynamic environment. A substantial body of evidence supports the hypothesis that biological systems leverage local sensing modalities, including flow sensing, to gain knowledge of their global surroundings. The nonlinear nature and high degree of complexity of fluid dynamics makes the development of algorithms for implementing localized sensing in bioinspired engineering systems essentially intractable for many systems of practical interest. In this work, we use techniques from machine learning for training a bioinspired swimmer to learn from its environment. We demonstrate the efficacy of this strategy by learning how to sense global characteristics of the wakes of other swimmers measured only from local sensory information. We conclude by commenting on the advantages and limitations of this data-driven, machine learning approach and its potential impact on broader applications in underwater sensing and navigation.

  3. Integrated sensing and actuation of dielectric elastomer actuator

    NASA Astrophysics Data System (ADS)

    Ye, Zhihang; Chen, Zheng

    2017-04-01

    Dielectric elastomer (DE) is a type of soft actuating material, the shape of which can be changed under electrical voltage stimuli. DE materials have great potential in applications involving energy harvesters, micro-manipulators, and adaptive optics. In this paper, a stripe DE actuator with integrated sensing and actuation is designed, fabricated, and characterized through several experiments. Considering the actuator's capacitor-like structure and its deformation mechanism, detecting the actuator's displacement through its circuit characteristics is a promising approach. A self-sensing scheme that adds a high-frequency probing signal to the actuation signal is developed. A fast Fourier transform (FFT) algorithm is used to extract the magnitude change of the probing signal, and a nonlinear fitting method and an artificial neural network (ANN) approach are utilized to capture the relationship between the probing signal and the actuator's displacement. Experimental results showed that this structure is capable of performing self-sensing and actuation simultaneously. With an enhanced ANN, the self-sensing scheme can achieve 2.5% accuracy.
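
    The probing-signal magnitude extraction can be sketched as picking the FFT bin nearest the probing frequency; the sampling-rate and windowing choices below are assumptions, and mapping the magnitude to displacement would still require the fitted model or ANN described above.

```python
import numpy as np

def probe_magnitude(sensed_signal, fs, probe_freq):
    """Magnitude of a high-frequency probing tone embedded in the sensed response.

    sensed_signal: sampled electrical response of the DE actuator,
    fs: sampling rate in Hz, probe_freq: probing frequency in Hz.
    """
    windowed = sensed_signal * np.hanning(len(sensed_signal))
    spectrum = np.fft.rfft(windowed)
    freqs = np.fft.rfftfreq(len(sensed_signal), d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - probe_freq))      # bin nearest the probe tone
    return 2.0 * np.abs(spectrum[idx]) / len(sensed_signal)
```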

  4. 2005 AG20/20 Annual Review

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; McKellip, Rodney D.

    2005-01-01

    Topics covered include: Implementation and Validation of Sensor-Based Site-Specific Crop Management; Enhanced Management of Agricultural Perennial Systems (EMAPS) Using GIS and Remote Sensing; Validation and Application of Geospatial Information for Early Identification of Stress in Wheat; Adapting and Validating Precision Technologies for Cotton Production in the Mid-Southern United States - 2004 Progress Report; Development of a System to Automatically Geo-Rectify Images; Economics of Precision Agriculture Technologies in Cotton Production-AG 2020 Prescription Farming Automation Algorithms; Field Testing a Sensor-Based Applicator for Nitrogen and Phosphorus Application; Early Detection of Citrus Diseases Using Machine Vision and DGPS; Remote Sensing of Citrus Tree Stress Levels and Factors; Spectral-based Nitrogen Sensing for Citrus; Characterization of Tree Canopies; In-field Sensing of Shallow Water Tables and Hydromorphic Soils with an Electromagnetic Induction Profiler; Maintaining the Competitiveness of Tree Fruit Production Through Precision Agriculture; Modeling and Visualizing Terrain and Remote Sensing Data for Research and Education in Precision Agriculture; Thematic Soil Mapping and Crop-Based Strategies for Site-Specific Management; and Crop-Based Strategies for Site-Specific Management.

  5. Evolving land cover classification algorithms for multispectral and multitemporal imagery

    NASA Astrophysics Data System (ADS)

    Brumby, Steven P.; Theiler, James P.; Bloch, Jeffrey J.; Harvey, Neal R.; Perkins, Simon J.; Szymanski, John J.; Young, Aaron C.

    2002-01-01

    The Cerro Grande/Los Alamos forest fire devastated over 43,000 acres (17,500 ha) of forested land, and destroyed over 200 structures in the town of Los Alamos and the adjoining Los Alamos National Laboratory. The need to measure the continuing impact of the fire on the local environment has led to the application of a number of remote sensing technologies. During and after the fire, remote-sensing data was acquired from a variety of aircraft- and satellite-based sensors, including Landsat 7 Enhanced Thematic Mapper (ETM+). We now report on the application of a machine learning technique to the automated classification of land cover using multi-spectral and multi-temporal imagery. We apply a hybrid genetic programming/supervised classification technique to evolve automatic feature extraction algorithms. We use a software package we have developed at Los Alamos National Laboratory, called GENIE, to carry out this evolution. We use multispectral imagery from the Landsat 7 ETM+ instrument from before, during, and after the wildfire. Using an existing land cover classification based on a 1992 Landsat 5 TM scene for our training data, we evolve algorithms that distinguish a range of land cover categories, and an algorithm to mask out clouds and cloud shadows. We report preliminary results of combining individual classification results using a K-means clustering approach. The details of our evolved classification are compared to the manually produced land-cover classification.

  6. Detection and Alert of muscle fatigue considering a Surface Electromyography Chaotic Model

    NASA Astrophysics Data System (ADS)

    Herrera, V.; Romero, J. F.; Amestegui, M.

    2011-03-01

    This work proposes a detection and alert algorithm for muscle fatigue in paraplegic patients undergoing electro-therapy sessions. The procedure is based on a mathematical chaotic model emulating physiological signals and on the Continuous Wavelet Transform (CWT). The chaotic model is based on a logistic map that generates data matching certain physiological signal class patterns. The CWT was applied to signals generated by the model, and the resulting feature vector was obtained through the Total Wavelet Entropy (TWE). In this sense, the presented work proposes a viable and practical detection and alert algorithm for muscle fatigue.
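
    A small sketch, under stated assumptions, of the two ingredients named above: a logistic-map surrogate signal and a total wavelet entropy computed from a continuous wavelet transform (here via the PyWavelets package); the wavelet, scales and parameters are illustrative.

```python
import numpy as np
import pywt  # PyWavelets, assumed available for the CWT

def logistic_map_signal(n, r=3.9, x0=0.5):
    """Chaotic surrogate for a physiological signal, generated by a logistic map."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def total_wavelet_entropy(signal, scales=np.arange(1, 64)):
    """Shannon entropy of the relative wavelet energy across CWT scales."""
    coeffs, _ = pywt.cwt(signal, scales, 'morl')
    energy = (coeffs ** 2).sum(axis=1)               # energy per scale
    p = energy / energy.sum()                        # relative wavelet energy
    return float(-(p * np.log(p + 1e-12)).sum())
```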

  7. Least mean square fourth based microgrid state estimation algorithm using the internet of things technology.

    PubMed

    Rana, Md Masud

    2017-01-01

    This paper proposes an innovative internet of things (IoT) based communication framework for monitoring microgrid under the condition of packet dropouts in measurements. First of all, the microgrid incorporating the renewable distributed energy resources is represented by a state-space model. The IoT embedded wireless sensor network is adopted to sense the system states. Afterwards, the information is transmitted to the energy management system using the communication network. Finally, the least mean square fourth algorithm is explored for estimating the system states. The effectiveness of the developed approach is verified through numerical simulations.
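
    The least-mean-fourth update itself is compact; the sketch below shows a generic LMF parameter/state estimator driven by streaming measurements, with an illustrative step size, not the paper's microgrid model.

```python
import numpy as np

def lmf_estimate(measurements, regressors, mu=1e-4):
    """Least-mean-fourth adaptive estimate of a parameter/state vector.

    measurements: (T,) observed outputs; regressors: (T, n) measurement vectors.
    Minimizes the mean fourth power of the error via w <- w + mu * e^3 * x.
    The step size mu is illustrative and must be tuned for stability.
    """
    w = np.zeros(regressors.shape[1])
    for x, d in zip(regressors, measurements):
        e = d - x @ w                  # innovation
        w = w + mu * (e ** 3) * x      # fourth-order update
    return w
```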

  8. AHIMSA - Ad hoc histogram information measure sensing algorithm for feature selection in the context of histogram inspired clustering techniques

    NASA Technical Reports Server (NTRS)

    Dasarathy, B. V.

    1976-01-01

    An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.

  9. Target Coverage in Wireless Sensor Networks with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao

    2016-01-01

    Sensing coverage is a fundamental problem in wireless sensor networks (WSNs), which has attracted considerable attention. Conventional research on this topic focuses on the 0/1 coverage model, which is only a coarse approximation to the practical sensing model. In this paper, we study the target coverage problem, where the objective is to find the least number of sensor nodes in randomly-deployed WSNs based on the probabilistic sensing model. We analyze the joint detection probability of target with multiple sensors. Based on the theoretical analysis of the detection probability, we formulate the minimum ϵ-detection coverage problem. We prove that the minimum ϵ-detection coverage problem is NP-hard and present an approximation algorithm called the Probabilistic Sensor Coverage Algorithm (PSCA) with provable approximation ratios. To evaluate our design, we analyze the performance of PSCA theoretically and also perform extensive simulations to demonstrate the effectiveness of our proposed algorithm. PMID:27618902
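
    A hedged sketch of the underlying probabilistic-coverage arithmetic: the joint detection probability of independent sensors and a greedy sensor selection that continues until the miss probability falls below ϵ, in the spirit of (but not identical to) the PSCA algorithm.

```python
import numpy as np

def joint_detection_probability(p_list):
    """Probability that at least one of several independent sensors detects the target."""
    return 1.0 - np.prod([1.0 - p for p in p_list])

def greedy_epsilon_cover(detection_probs, epsilon):
    """Greedily pick sensors until the miss probability drops below epsilon.

    detection_probs: per-sensor detection probabilities for one target.
    Returns the chosen sensor indices and the achieved joint detection probability.
    """
    detection_probs = np.asarray(detection_probs, dtype=float)
    chosen, miss = [], 1.0
    for i in np.argsort(detection_probs)[::-1]:      # best sensors first
        if miss <= epsilon:
            break
        chosen.append(int(i))
        miss *= 1.0 - detection_probs[i]
    return chosen, 1.0 - miss
```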

  10. Novel image compression-encryption hybrid algorithm based on key-controlled measurement matrix in compressive sensing

    NASA Astrophysics Data System (ADS)

    Zhou, Nanrun; Zhang, Aidi; Zheng, Fen; Gong, Lihua

    2014-10-01

    The existing ways to encrypt images based on compressive sensing usually treat the whole measurement matrix as the key, which renders the key too large to distribute and memorize or store. To solve this problem, a new image compression-encryption hybrid algorithm is proposed to realize compression and encryption simultaneously, where the key is easily distributed, stored or memorized. The input image is divided into 4 blocks to compress and encrypt, then the pixels of the two adjacent blocks are exchanged randomly by random matrices. The measurement matrices in compressive sensing are constructed by utilizing the circulant matrices and controlling the original row vectors of the circulant matrices with logistic map. And the random matrices used in random pixel exchanging are bound with the measurement matrices. Simulation results verify the effectiveness, security of the proposed algorithm and the acceptable compression performance.
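
    A minimal sketch of a key-controlled circulant measurement matrix of the kind described above, with the generating row produced by a logistic map so that only the map parameters need to be shared as the key; dimensions and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import circulant

def logistic_sequence(n, r=3.99, x0=0.3):
    """Chaotic sequence used as the key-controlled generating vector."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def measurement_matrix(n_cols, n_rows, key=(3.99, 0.3)):
    """Build an m x n compressive-sensing matrix from a logistic-map-driven circulant.

    Only the key (r, x0) needs to be distributed, not the full matrix.
    """
    vec = 2.0 * logistic_sequence(n_cols, *key) - 1.0   # map to [-1, 1]
    C = circulant(vec)
    return C[:n_rows, :] / np.sqrt(n_rows)              # keep m rows, normalize

# Example: y = measurement_matrix(256, 64) @ block.flatten()  # compress one block
```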

  11. An Online Dictionary Learning-Based Compressive Data Gathering Algorithm in Wireless Sensor Networks

    PubMed Central

    Wang, Donghao; Wan, Jiangwen; Chen, Junying; Zhang, Qiang

    2016-01-01

    To adapt to sense signals of enormous diversities and dynamics, and to decrease the reconstruction errors caused by ambient noise, a novel online dictionary learning method-based compressive data gathering (ODL-CDG) algorithm is proposed. The proposed dictionary is learned from a two-stage iterative procedure, alternately changing between a sparse coding step and a dictionary update step. The self-coherence of the learned dictionary is introduced as a penalty term during the dictionary update procedure. The dictionary is also constrained with sparse structure. It’s theoretically demonstrated that the sensing matrix satisfies the restricted isometry property (RIP) with high probability. In addition, the lower bound of necessary number of measurements for compressive sensing (CS) reconstruction is given. Simulation results show that the proposed ODL-CDG algorithm can enhance the recovery accuracy in the presence of noise, and reduce the energy consumption in comparison with other dictionary based data gathering methods. PMID:27669250

  12. An Online Dictionary Learning-Based Compressive Data Gathering Algorithm in Wireless Sensor Networks.

    PubMed

    Wang, Donghao; Wan, Jiangwen; Chen, Junying; Zhang, Qiang

    2016-09-22

    To adapt to sense signals of enormous diversities and dynamics, and to decrease the reconstruction errors caused by ambient noise, a novel online dictionary learning method-based compressive data gathering (ODL-CDG) algorithm is proposed. The proposed dictionary is learned from a two-stage iterative procedure, alternately changing between a sparse coding step and a dictionary update step. The self-coherence of the learned dictionary is introduced as a penalty term during the dictionary update procedure. The dictionary is also constrained with sparse structure. It's theoretically demonstrated that the sensing matrix satisfies the restricted isometry property (RIP) with high probability. In addition, the lower bound of necessary number of measurements for compressive sensing (CS) reconstruction is given. Simulation results show that the proposed ODL-CDG algorithm can enhance the recovery accuracy in the presence of noise, and reduce the energy consumption in comparison with other dictionary based data gathering methods.

  13. Spectrum Access In Cognitive Radio Using a Two-Stage Reinforcement Learning Approach

    NASA Astrophysics Data System (ADS)

    Raj, Vishnu; Dias, Irene; Tholeti, Thulasi; Kalyani, Sheetal

    2018-02-01

    With the advent of the 5th generation of wireless standards and an increasing demand for higher throughput, methods to improve the spectral efficiency of wireless systems have become very important. In the context of cognitive radio, a substantial increase in throughput is possible if the secondary user can make smart decisions regarding which channel to sense and when or how often to sense. Here, we propose an algorithm to not only select a channel for data transmission but also to predict how long the channel will remain unoccupied, so that the time spent on channel sensing can be minimized. Our algorithm learns in two stages: a reinforcement learning approach for channel selection and a Bayesian approach to determine the optimal duration for which sensing can be skipped. Comparisons with other learning methods are provided through extensive simulations. We show that the number of sensing operations is minimized with a negligible increase in primary interference; this implies that less energy is spent by the secondary user on sensing while higher throughput is achieved by reducing the time spent sensing.
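
    A simplified sketch of the two-stage idea: an epsilon-greedy learner for channel selection plus a Beta-posterior estimate of channel idleness used to decide how many sensing slots can safely be skipped; the structure and parameters are assumptions, not the authors' exact algorithm.

```python
import numpy as np

class ChannelSelector:
    """Simplified two-stage sketch: epsilon-greedy channel choice plus a
    Beta-posterior estimate of how long sensing can be skipped."""

    def __init__(self, n_channels, epsilon=0.1):
        self.q = np.zeros(n_channels)          # estimated throughput per channel
        self.n = np.zeros(n_channels)
        self.idle_a = np.ones(n_channels)      # Beta prior on "slot stays idle"
        self.idle_b = np.ones(n_channels)
        self.epsilon = epsilon

    def select_channel(self):
        if np.random.rand() < self.epsilon:
            return int(np.random.randint(len(self.q)))   # explore
        return int(np.argmax(self.q))                    # exploit

    def update(self, ch, reward, stayed_idle):
        self.n[ch] += 1
        self.q[ch] += (reward - self.q[ch]) / self.n[ch]
        self.idle_a[ch] += stayed_idle
        self.idle_b[ch] += 1 - stayed_idle

    def sensing_skip_slots(self, ch, risk=0.1):
        """Skip sensing while the estimated chance of staying idle remains above 1 - risk."""
        p_idle = self.idle_a[ch] / (self.idle_a[ch] + self.idle_b[ch])
        k = 0
        while p_idle ** (k + 1) >= 1.0 - risk:
            k += 1
        return k
```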

  14. Remote sensing of suspended sediment water research: principles, methods, and progress

    NASA Astrophysics Data System (ADS)

    Shen, Ping; Zhang, Jing

    2011-12-01

    In this paper, we review the principles, data, methods and steps of suspended sediment research using remote sensing, summarize some representative models and methods, and analyze the deficiencies of existing methods. Drawing on recent progress in remote sensing theory and its application to suspended sediment research, we introduce data processing methods such as atmospheric correction and adjacency effect correction, along with intelligent algorithms such as neural networks, genetic algorithms and support vector machines, into suspended sediment inversion. Combined with other geographic information and based on Bayesian theory, we improve the precision of suspended sediment inversion and aim to provide a reference for related researchers.

  15. Using the time shift in single pushbroom datatakes to detect ships and their heading

    NASA Astrophysics Data System (ADS)

    Willburger, Katharina A. M.; Schwenk, Kurt

    2017-10-01

    The detection of ships from remote sensing data has become an essential task for maritime security. The variety of application scenarios includes piracy, illegal fishery, ocean dumping and ships carrying refugees. While techniques using data from SAR sensors for ship detection are widely used, little literature discusses algorithms based on imagery from optical camera systems. A ship detection algorithm for optical pushbroom data has been developed. It takes advantage of the special detector assembly of most such scanners, which allows not only the detection of a ship but also the calculation of its heading from a single acquisition. The proposed algorithm for the detection of moving ships was developed with RapidEye imagery. It consists mainly of three steps: the creation of a land-water mask, object extraction, and a deeper examination of each object. The latter step is built up from several spectral and geometric filters, making heavy use of the inter-channel displacement typical of pushbroom sensors with multiple CCD lines, finally yielding a set of ships and their directions of movement. The working principle of time-shifted pushbroom sensors and the developed algorithm are explained in detail. Furthermore, we present our first results and give an outlook on future improvements.
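
    The heading computation from the inter-channel time shift reduces to simple geometry once the same ship has been located in two bands; a sketch under the assumption of known ground coordinates and time offset:

```python
import numpy as np

def heading_from_channel_shift(pos_band1, pos_band2, dt):
    """Estimate ship speed and heading from the displacement between two CCD lines.

    pos_band1, pos_band2: (x, y) centroids of the same ship detected in two
    spectral bands imaged dt seconds apart (ground coordinates, in metres).
    Returns (speed in m/s, heading in degrees clockwise from north).
    """
    dx = pos_band2[0] - pos_band1[0]   # eastward displacement
    dy = pos_band2[1] - pos_band1[1]   # northward displacement
    speed = np.hypot(dx, dy) / dt
    heading = np.degrees(np.arctan2(dx, dy)) % 360.0
    return speed, heading
```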

  16. An Energy-Efficient Spectrum-Aware Reinforcement Learning-Based Clustering Algorithm for Cognitive Radio Sensor Networks

    PubMed Central

    Mustapha, Ibrahim; Ali, Borhanuddin Mohd; Rasid, Mohd Fadlee A.; Sali, Aduwati; Mohamad, Hafizal

    2015-01-01

    It is well-known that clustering partitions a network into logical groups of nodes in order to achieve energy efficiency and to enhance dynamic channel access in cognitive radio through cooperative sensing. While the topic of energy efficiency has been well investigated in conventional wireless sensor networks, the latter has not been extensively explored. In this paper, we propose a reinforcement learning-based spectrum-aware clustering algorithm that allows a member node to learn the energy and cooperative sensing costs for neighboring clusters to achieve an optimal solution. Each member node selects an optimal cluster that satisfies pairwise constraints, minimizes network energy consumption and enhances channel sensing performance through an exploration technique. We first model the network energy consumption and then determine the optimal number of clusters for the network. The problem of selecting an optimal cluster is formulated as a Markov Decision Process (MDP) in the algorithm, and the obtained simulation results show convergence, learning and adaptability of the algorithm to a dynamic environment towards achieving an optimal solution. Performance comparisons of our algorithm with the Groupwise Spectrum Aware (GWSA)-based algorithm in terms of Sum of Square Error (SSE), complexity, network energy consumption and probability of detection indicate improved performance from the proposed approach. The results further reveal that an energy savings of 9% and a significant Primary User (PU) detection improvement can be achieved with the proposed approach. PMID:26287191

  17. An Energy-Efficient Spectrum-Aware Reinforcement Learning-Based Clustering Algorithm for Cognitive Radio Sensor Networks.

    PubMed

    Mustapha, Ibrahim; Mohd Ali, Borhanuddin; Rasid, Mohd Fadlee A; Sali, Aduwati; Mohamad, Hafizal

    2015-08-13

    It is well-known that clustering partitions a network into logical groups of nodes in order to achieve energy efficiency and to enhance dynamic channel access in cognitive radio through cooperative sensing. While the topic of energy efficiency has been well investigated in conventional wireless sensor networks, the latter has not been extensively explored. In this paper, we propose a reinforcement learning-based spectrum-aware clustering algorithm that allows a member node to learn the energy and cooperative sensing costs for neighboring clusters to achieve an optimal solution. Each member node selects an optimal cluster that satisfies pairwise constraints, minimizes network energy consumption and enhances channel sensing performance through an exploration technique. We first model the network energy consumption and then determine the optimal number of clusters for the network. The problem of selecting an optimal cluster is formulated as a Markov Decision Process (MDP) in the algorithm, and the obtained simulation results show convergence, learning and adaptability of the algorithm to a dynamic environment towards achieving an optimal solution. Performance comparisons of our algorithm with the Groupwise Spectrum Aware (GWSA)-based algorithm in terms of Sum of Square Error (SSE), complexity, network energy consumption and probability of detection indicate improved performance from the proposed approach. The results further reveal that an energy savings of 9% and a significant Primary User (PU) detection improvement can be achieved with the proposed approach.

  18. A service relation model for web-based land cover change detection

    NASA Astrophysics Data System (ADS)

    Xing, Huaqiao; Chen, Jun; Wu, Hao; Zhang, Jun; Li, Songnian; Liu, Boyu

    2017-10-01

    Change detection with remotely sensed imagery is a critical step in land cover monitoring and updating. Although a variety of algorithms or models have been developed, none of them can be universal for all cases. The selection of appropriate algorithms and construction of processing workflows depend largely on the expertise of experts about the "algorithm-data" relations among change detection algorithms and the imagery data used. This paper presents a service relation model for land cover change detection by integrating the experts' knowledge about the "algorithm-data" relations into the web-based geo-processing. The "algorithm-data" relations are mapped into a set of web service relations with the analysis of functional and non-functional service semantics. These service relations are further classified into three different levels, i.e., interface, behavior and execution levels. A service relation model is then established using the Object and Relation Diagram (ORD) approach to represent the multi-granularity services and their relations for change detection. A set of semantic matching rules are built and used for deriving on-demand change detection service chains from the service relation model. A web-based prototype system is developed in .NET development environment, which encapsulates nine change detection and pre-processing algorithms and represents their service relations as an ORD. Three test areas from Shandong and Hebei provinces, China with different imagery conditions are selected for online change detection experiments, and the results indicate that on-demand service chains can be generated according to different users' demands.

  19. Adaptive statistical pattern classifiers for remotely sensed data

    NASA Technical Reports Server (NTRS)

    Gonzalez, R. C.; Pace, M. O.; Raulston, H. S.

    1975-01-01

    A technique for the adaptive estimation of nonstationary statistics necessary for Bayesian classification is developed. The basic approach to the adaptive estimation procedure consists of two steps: (1) an optimal stochastic approximation of the parameters of interest and (2) a projection of the parameters in time or position. A divergence criterion is developed to monitor algorithm performance. Comparative results of adaptive and nonadaptive classifier tests are presented for simulated four dimensional spectral scan data.

  20. Wavefront Sensing for WFIRST with a Linear Optical Model

    NASA Technical Reports Server (NTRS)

    Jurling, Alden S.; Content, David A.

    2012-01-01

    In this paper we develop methods to use a linear optical model to capture the field dependence of wavefront aberrations in a nonlinear optimization-based phase retrieval algorithm for image-based wavefront sensing. The linear optical model is generated from a ray trace model of the system and allows the system state to be described in terms of mechanical alignment parameters rather than wavefront coefficients. This approach allows joint optimization over images taken at different field points and does not require separate convergence of phase retrieval at individual field points. Because the algorithm exploits field diversity, multiple defocused images per field point are not required for robustness. Furthermore, because it is possible to simultaneously fit images of many stars over the field, it is not necessary to use a fixed defocus to achieve adequate signal-to-noise ratio despite having images with high dynamic range. This allows high performance wavefront sensing using in-focus science data. We applied this technique in a simulation model based on the Wide Field Infrared Survey Telescope (WFIRST) Intermediate Design Reference Mission (IDRM) imager using a linear optical model with 25 field points. We demonstrate sub-thousandth-wave wavefront sensing accuracy in the presence of noise and moderate undersampling for both monochromatic and polychromatic images using 25 high-SNR target stars. Using these high-quality wavefront sensing results, we are able to generate upsampled point-spread functions (PSFs) and use them to determine PSF ellipticity to high accuracy in order to reduce the systematic impact of aberrations on the accuracy of galactic ellipticity determination for weak-lensing science.

  1. EMD self-adaptive selecting relevant modes algorithm for FBG spectrum signal

    NASA Astrophysics Data System (ADS)

    Chen, Yong; Wu, Chun-ting; Liu, Huan-lin

    2017-07-01

    Noise may reduce the demodulation accuracy of the fiber Bragg grating (FBG) sensing signal and thereby affect the quality of sensing detection, so the recovery of a signal from observed noisy data is necessary. In this paper, a precise self-adaptive algorithm for selecting relevant modes is proposed to remove noise from the signal. Empirical mode decomposition (EMD) is first used to decompose the signal into a set of modes. Pseudo-mode cancellation is introduced to identify and eliminate false modes, and then the Mutual Information (MI) of partial modes is calculated. MI is used to estimate the critical point between the high- and low-frequency components. Simulation results show that the proposed algorithm estimates the critical point more accurately than traditional algorithms for the FBG spectral signal. Compared with similar algorithms, the signal-to-noise ratio of the signal can be improved by more than 10 dB after processing with the proposed algorithm, and the correlation coefficient can be increased by 0.5, demonstrating a better de-noising effect.
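
    A hedged sketch of the mode-selection idea, assuming the PyEMD package for the decomposition and a histogram estimate of mutual information between adjacent modes; the split heuristic shown is a simplification of the paper's criterion, not the exact rule.

```python
import numpy as np
from PyEMD import EMD   # assumed package providing empirical mode decomposition

def histogram_mutual_info(x, y, bins=64):
    """Histogram estimate of the mutual information between two signals."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])).sum())

def denoise_fbg_spectrum(signal):
    """Decompose, locate a noise/signal critical mode by adjacent-mode MI,
    and rebuild the signal from the lower-frequency (relevant) modes only."""
    imfs = EMD()(signal)                       # modes ordered high to low frequency
    mi = [histogram_mutual_info(imfs[k], imfs[k + 1]) for k in range(len(imfs) - 1)]
    critical = int(np.argmin(mi)) + 1          # heuristic split point
    return imfs[critical:].sum(axis=0)
```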

  2. Diverse Planning for UAV Control and Remote Sensing

    PubMed Central

    Tožička, Jan; Komenda, Antonín

    2016-01-01

    Unmanned aerial vehicles (UAVs) are suited to various remote sensing missions, such as measuring air quality. The conventional method of UAV control is by human operators. Such an approach is limited by the ability of cooperation among the operators controlling larger fleets of UAVs in a shared area. The remedy for this is to increase autonomy of the UAVs in planning their trajectories by considering other UAVs and their plans. To provide such improvement in autonomy, we need better algorithms for generating alternative trajectory variants that the UAV coordination algorithms can utilize. In this article, we define a novel family of multi-UAV sensing problems, solving task allocation of a huge number of tasks (tens of thousands) to a group of configurable UAVs with non-zero sensor weight (comprising the air quality measurement as well), together with two baseline solvers. To solve the problem efficiently, we use an algorithm for diverse trajectory generation and integrate it with a solver for the multi-UAV coordination problem. Finally, we experimentally evaluate the multi-UAV sensing problem solver. The evaluation is done on synthetic and real-world-inspired benchmarks in a multi-UAV simulator. Results show that diverse planning is a valuable method for remote sensing applications containing multiple UAVs. PMID:28009831

  3. Simulating optoelectronic systems for remote sensing with SENSOR

    NASA Astrophysics Data System (ADS)

    Boerner, Anko

    2003-04-01

    The consistent end-to-end simulation of airborne and spaceborne remote sensing systems is an important task and sometimes the only way for the adaptation and optimization of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software ENvironment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. It allows the simulation of a wide range of optoelectronic systems for remote sensing. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. Part three consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimization requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and examples of its use are given. The verification of SENSOR is demonstrated.

  4. Diverse Planning for UAV Control and Remote Sensing.

    PubMed

    Tožička, Jan; Komenda, Antonín

    2016-12-21

    Unmanned aerial vehicles (UAVs) are suited to various remote sensing missions, such as measuring air quality. The conventional method of UAV control is by human operators. Such an approach is limited by the ability of cooperation among the operators controlling larger fleets of UAVs in a shared area. The remedy for this is to increase autonomy of the UAVs in planning their trajectories by considering other UAVs and their plans. To provide such improvement in autonomy, we need better algorithms for generating alternative trajectory variants that the UAV coordination algorithms can utilize. In this article, we define a novel family of multi-UAV sensing problems, solving task allocation of a huge number of tasks (tens of thousands) to a group of configurable UAVs with non-zero sensor weight (comprising the air quality measurement as well), together with two baseline solvers. To solve the problem efficiently, we use an algorithm for diverse trajectory generation and integrate it with a solver for the multi-UAV coordination problem. Finally, we experimentally evaluate the multi-UAV sensing problem solver. The evaluation is done on synthetic and real-world-inspired benchmarks in a multi-UAV simulator. Results show that diverse planning is a valuable method for remote sensing applications containing multiple UAVs.

  5. Design of a Novel Flexible Capacitive Sensing Mattress for Monitoring Sleeping Respiratory

    PubMed Central

    Chang, Wen-Ying; Huang, Chien-Chun; Chen, Chi-Chun; Chang, Chih-Cheng; Yang, Chin-Lung

    2014-01-01

    In this paper, an algorithm to extract respiration signals using a flexible projected capacitive sensing mattress (FPCSM) designed for personal health assessment is proposed. Unlike the interfaces of conventional measurement systems for polysomnography (PSG) and other contemporary alternatives, the proposed FPCSM uses a projected capacitive sensing capability that is not worn or attached to the body. The FPCSM is composed of a multi-electrode sensor array that can not only observe gestures and motion behaviors, but also enables the FPCSM to function as a respiration monitor during sleep using the proposed approach. To improve long-term monitoring when body movement is possible, the FPCSM enables the selection of data from the sensing array, and the FPCSM methodology selects the electrodes with the optimal signals after the application of a channel reduction algorithm that counts the reversals in the capacitive sensing signals as a quality indicator. The simple algorithm is implemented in the time domain. The FPCSM system was used in experimental tests and simultaneously compared with a commercial PSG system for verification. Multiple synchronous measurements were performed at different locations of body contact, and parallel data sets were collected. The experimental comparison yields a correlation coefficient of 0.88 between FPCSM and PSG, demonstrating the feasibility of the system design. PMID:25420152
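
    The reversal-counting quality indicator is straightforward to sketch; the ranking rule below (keep the channels with the fewest reversals, i.e. the smoothest respiration-like traces) is a plausible reading of the method, not the authors' exact selection rule.

```python
import numpy as np

def count_reversals(trace):
    """Number of direction changes (peaks plus troughs) in one capacitance trace."""
    d = np.diff(trace)
    return int(np.sum(np.diff(np.sign(d)) != 0))

def select_respiration_channels(channel_traces, n_keep=4):
    """Channel-reduction step: rank electrodes by reversal count and keep the best.

    channel_traces: (n_channels, n_samples) array of capacitive sensing signals.
    Returns the indices of the n_keep channels with the fewest reversals.
    """
    scores = np.array([count_reversals(tr) for tr in channel_traces])
    return np.argsort(scores)[:n_keep]
```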

  6. I/O efficient algorithms and applications in geographic information systems

    NASA Astrophysics Data System (ADS)

    Danner, Andrew

    Modern remote sensing methods such as laser altimetry (lidar) and Interferometric Synthetic Aperture Radar (IfSAR) produce georeferenced elevation data at unprecedented rates. Many Geographic Information System (GIS) algorithms designed for terrain modelling applications cannot process these massive data sets. The primary problem is that these data sets are too large to fit in the main internal memory of modern computers and must therefore reside on larger, but considerably slower disks. In these applications, the transfer of data between disk and main memory, or I/O, becomes the primary bottleneck. Working in a theoretical model that more accurately represents this two-level memory hierarchy, we can develop algorithms that are I/O-efficient and reduce the amount of disk I/O needed to solve a problem. In this thesis we aim to modernize GIS algorithms and develop a number of I/O-efficient algorithms for processing geographic data derived from massive elevation data sets. For each application, we convert a geographic question to an algorithmic question, develop an I/O-efficient algorithm that is theoretically efficient, implement our approach and verify its performance using real-world data. The applications we consider include constructing a gridded digital elevation model (DEM) from an irregularly spaced point cloud, removing topological noise from a DEM, modeling surface water flow over a terrain, extracting river networks and watershed hierarchies from the terrain, and locating polygons containing query points in a planar subdivision. We initially developed solutions to each of these applications individually. However, we also show how to combine individual solutions to form a scalable geo-processing pipeline that seamlessly solves a sequence of sub-problems with little or no manual intervention. We present experimental results that demonstrate orders of magnitude improvement over previously known algorithms.

  7. An End-to-End simulator for the development of atmospheric corrections and temperature - emissivity separation algorithms in the TIR spectral domain

    NASA Astrophysics Data System (ADS)

    Rock, Gilles; Fischer, Kim; Schlerf, Martin; Gerhards, Max; Udelhoven, Thomas

    2017-04-01

    The development and optimization of image processing algorithms requires the availability of datasets depicting every step from the earth's surface to the sensor's detector. The lack of ground truth data often obliges developers to work with simulated data. The simulation of hyperspectral remote sensing data is a useful tool for a variety of tasks such as the design of systems, the understanding of the image formation process, and the development and validation of data processing algorithms. An end-to-end simulator has been set up consisting of a forward simulator, a backward simulator and a validation module. The forward simulator derives radiance datasets based on laboratory sample spectra, applies atmospheric contributions using radiative transfer equations, and simulates the instrument response using configurable sensor models. This is followed by the backward simulation branch, consisting of an atmospheric correction (AC), a temperature and emissivity separation (TES) or a hybrid AC and TES algorithm. An independent validation module allows the comparison between input and output datasets and the benchmarking of different processing algorithms. In this study, hyperspectral thermal infrared scenes of a variety of surfaces have been simulated to analyze existing AC and TES algorithms. The ARTEMISS algorithm was optimized and benchmarked against the original implementations. The errors in TES were found to be related to incorrect water vapor retrieval. The atmospheric characterization could be optimized, resulting in increased accuracy of temperature and emissivity retrieval. Airborne datasets of different spectral resolutions were simulated from terrestrial HyperCam-LW measurements. The simulated airborne radiance spectra were subjected to atmospheric correction and TES and further used for a plant species classification study analyzing effects related to noise and mixed pixels.

  8. Spatial Observation and Models for Crop Water Use in Australia (Invited)

    NASA Astrophysics Data System (ADS)

    Hafeez, M. M.; Chemin, Y.; Rabbani, U.

    2009-12-01

    Recent drought in Australia and concerns about climate change have highlighted the need to manage agricultural water resources more sustainably, especially in the Murray Darling Basin, which accounts for more than 70% of water for crop production. For the Australian continent, approximately 90% of the precipitation that falls on the land is returned to the atmosphere through the actual evapotranspiration (ET) process. However, despite its national significance, it is almost impossible to measure or observe ET directly at a meaningful scale in space and time through traditional point-based methods. Since the late 1990s, optical-thermal remote sensing satellite data have been extensively used for mapping actual ET from farm to catchment scales in Australia. Numerous ET algorithms have been developed to make use of remote sensing data acquired by optical-thermal sensors mounted on airborne and satellite platforms. This article concentrates on the Murrumbidgee catchment, where ground truth data have been collected on a fortnightly basis since 2007 using two Eddy Covariance Systems (ECS) and two Large Aperture Scintillometers (LAS), set up so as to capture the variability in the landscape when measuring ET-related fluxes. The ground truthing measurements include leaf area index (LAI) from a LICOR 2000, soil heat fluxes from Hukseflux sensors, and crop reflectance data from a CROPSCAN radiometer and a thermal radiometer. A UAV drone equipped with a multispectral scanner and a thermal imager was used to obtain very high spatial resolution NDVI and surface temperature maps over the selected farms. This large array of high-technology instruments has been used to collect specific measurements within the various micro-ecosystems present in our study area. The article starts with an overview of common ET estimation algorithms based on satellite remote sensing data: SEBAL, METRIC, Simplified Surface Energy Balance, Two Source Energy Balance and SEBS. These are used in Australia at both regional and catchment scales for mapping actual ET using imagery from the Landsat 5 TM/7 ETM+ and Terra-MODIS sensors. ET estimates derived from the various remote sensing algorithms match well against the ECS and LAS data sets. These high-tech observation systems are used to collect specific ground truth data to develop new empirical and semi-empirical relationships for creating a Spatial Algorithm for Mapping ET (SAM-ET) dedicated to Australian agro-ecosystems. Such estimates can underpin crop water use, crop water productivity, food security, carbon sequestration and environmental flow requirements to enhance the sustainability of agricultural systems. The next frontier is to integrate these data and models to deliver a decision support system that helps irrigation managers coordinate water supply and demand and match crop water requirements closely in a near-real-time environment.

  9. Reliable clarity automatic-evaluation method for optical remote sensing images

    NASA Astrophysics Data System (ADS)

    Qin, Bangyong; Shang, Ren; Li, Shengyang; Hei, Baoqin; Liu, Zhiwen

    2015-10-01

    Image clarity, which reflects the degree of sharpness at the edges of objects in images, is an important quality evaluation index for optical remote sensing images. Scholars at home and abroad have done much work on the estimation of image clarity. At present, common clarity-estimation methods for digital images mainly include frequency-domain function methods, statistical parametric methods, gradient function methods and edge acutance methods. Frequency-domain function methods are accurate clarity measures, but their calculation is complicated and cannot be carried out automatically. Statistical parametric methods and gradient function methods are both sensitive to image clarity, but their results are easily affected by the complexity of the image content. The edge acutance method is an effective approach for clarity estimation, but it requires the edges to be picked out manually. Due to these limits in accuracy, consistency or automation, the existing methods are not well suited to the quality evaluation of optical remote sensing images. In this article, a new clarity-evaluation method based on the principle of the edge acutance algorithm is proposed. In the new method, an edge detection algorithm and a gradient search algorithm are adopted to automatically search the object edges in images, and the calculation of edge sharpness has been improved. The new method has been tested on several groups of optical remote sensing images. Compared with the existing automatic evaluation methods, the new method performs better in both accuracy and consistency. Thus, the new method is an effective clarity-evaluation method for optical remote sensing images.
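
    A compact sketch of an automated edge-acutance-style clarity score, with edge detection standing in for the manual edge picking of the classical method; the detector and its parameters are assumptions, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage
from skimage.feature import canny

def edge_acutance_clarity(gray_image, sigma=1.0):
    """Clarity score: mean gradient magnitude over automatically detected edge pixels.

    gray_image: 2-D float array. The Canny detector replaces manual edge
    selection; its parameters are the scikit-image defaults.
    """
    gx = ndimage.sobel(gray_image, axis=1)
    gy = ndimage.sobel(gray_image, axis=0)
    grad_mag = np.hypot(gx, gy)
    edges = canny(gray_image, sigma=sigma)        # automatic edge search
    if not edges.any():
        return 0.0
    return float(grad_mag[edges].mean())
```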

  10. An algorithm for estimating aerosol optical depth from HIMAWARI-8 data over Ocean

    NASA Astrophysics Data System (ADS)

    Lee, Kwon Ho

    2016-04-01

    The paper presents an algorithm under development for aerosol detection and retrieval over the ocean for the next-generation geostationary satellite HIMAWARI-8. Enhanced geostationary observations now enable retrieval of dust, smoke and ash aerosols, opening a new era of geostationary aerosol monitoring. The sixteen channels of the Advanced Himawari Imager (AHI) onboard HIMAWARI-8 offer aerosol remote sensing capabilities similar to those currently provided by the Moderate Resolution Imaging Spectroradiometer (MODIS). Aerosols are first detected from the visible and infrared channel radiances, and then retrieved by inversion-optimization of the satellite-observed radiances against radiances calculated with a radiative transfer model. The retrievals are performed operationally every ten minutes for pixel sizes of ~8 km. The algorithm uses a multichannel approach to estimate the aerosol effective radius and aerosol optical depth (AOD) simultaneously. The instantaneous retrieved AOD is evaluated against the MODIS Level 2 operational aerosol products (Collection 006), and the daily retrieved AOD is compared with ground-based measurements from the AERONET database. The results show that the aerosol detection and estimated AOD are in good agreement with the MODIS data and ground measurements, with a correlation coefficient of ~0.90 and a bias of 4%. These results suggest that the proposed method applied to HIMAWARI-8 data can accurately estimate AOD continuously. Acknowledgments: This work was supported by the "Development of Geostationary Meteorological Satellite Ground Segment (NMSC-2014-01)" program funded by the National Meteorological Satellite Centre (NMSC) of the Korea Meteorological Administration (KMA).
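
    The retrieval step described above is essentially an inversion of observed radiances against radiative-transfer calculations. A minimal, hedged illustration of that idea is a lookup-table search that picks the (AOD, effective radius) node minimizing the misfit to the observed multichannel reflectances; the table layout and variable names below are assumptions, not the operational HIMAWARI-8 code.

```python
import numpy as np

def retrieve_aod(observed, lut_reflectance, lut_aod, lut_reff):
    """Pick the (AOD, effective radius) node whose simulated multichannel
    reflectance best matches the observation in a least-squares sense.

    observed        : (n_channels,) observed TOA reflectances
    lut_reflectance : (n_aod, n_reff, n_channels) precomputed RT simulations
    lut_aod, lut_reff : 1-D grids of AOD and effective-radius values
    """
    residual = ((lut_reflectance - observed) ** 2).sum(axis=-1)
    i, j = np.unravel_index(np.argmin(residual), residual.shape)
    return lut_aod[i], lut_reff[j]
```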

  11. A Clustering Algorithm for Ecological Stream Segment Identification from Spatially Extensive Digital Databases

    NASA Astrophysics Data System (ADS)

    Brenden, T. O.; Clark, R. D.; Wiley, M. J.; Seelbach, P. W.; Wang, L.

    2005-05-01

    Remote sensing and geographic information systems have made it possible to attribute variables for streams at increasingly detailed resolutions (e.g., individual river reaches). Nevertheless, management decisions still must be made at large scales because land and stream managers typically lack sufficient resources to manage on an individual reach basis. Managers thus require a method for identifying stream management units that are ecologically similar and that can be expected to respond similarly to management decisions. We have developed a spatially-constrained clustering algorithm that can merge neighboring river reaches with similar ecological characteristics into larger management units. The clustering algorithm is based on the Cluster Affinity Search Technique (CAST), which was developed for clustering gene expression data. Inputs to the clustering algorithm are the neighbor relationships of the reaches that comprise the digital river network, the ecological attributes of the reaches, and an affinity value, which identifies the minimum similarity for merging river reaches. In this presentation, we describe the clustering algorithm in greater detail and contrast its use with other methods (expert opinion, classification approach, regular clustering) for identifying management units using several Michigan watersheds as a backdrop.
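
    For illustration only, the following Python sketch shows a greedy, spatially constrained merging of adjacent river reaches whose attribute similarity exceeds an affinity threshold. It captures the flavor of the inputs described above (reach adjacency, reach attributes, an affinity value) but is a simplification, not the CAST-based algorithm itself.

```python
import numpy as np

def merge_reaches(attributes, neighbors, affinity):
    """Greedy spatially constrained merging: repeatedly join the most similar
    pair of adjacent units whose similarity exceeds the affinity threshold.

    attributes : dict unit_id -> attribute vector (np.ndarray)
    neighbors  : set of frozenset({a, b}) adjacency pairs
    affinity   : minimum similarity required to merge (0..1 here)
    Returns a dict unit_id -> cluster label.
    """
    label = {u: u for u in attributes}

    def similarity(a, b):
        # Simple inverse-distance similarity; the real algorithm uses CAST affinities.
        return 1.0 / (1.0 + np.linalg.norm(attributes[a] - attributes[b]))

    merged = True
    while merged:
        merged = False
        best, best_sim = None, affinity
        for pair in neighbors:
            a, b = tuple(pair)
            if label[a] == label[b]:
                continue
            s = similarity(a, b)
            if s >= best_sim:
                best, best_sim = (a, b), s
        if best:
            a, b = best
            old, new = label[b], label[a]
            for u in label:
                if label[u] == old:
                    label[u] = new
            merged = True
    return label
```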

  12. Evaluation of coastal zone color scanner diffuse attenuation coefficient algorithms for application to coastal waters

    NASA Astrophysics Data System (ADS)

    Mueller, James L.; Trees, Charles C.; Arnone, Robert A.

    1990-09-01

    The Coastal Zone Color Scanner (CZCS) and associated atmospheric and in-water algorithms have allowed synoptic analyses of regional and large-scale variability of bio-optical properties [phytoplankton pigments and the diffuse attenuation coefficient K(490)]. Austin and Petzold (1981) developed a robust in-water K(490) algorithm which relates the diffuse attenuation coefficient at one optical depth [1/K(490)] to the ratio of the water-leaving radiances at 443 and 550 nm. Their regression analysis included diffuse attenuation coefficients K(490) up to 0.40 m^-1, but excluded data from estuarine areas and other Case II waters, where the optical properties are not predominantly determined by phytoplankton. In these areas, errors are induced in the retrieval of remotely sensed K(490) by extremely low water-leaving radiance at 443 nm [Lw(443) as viewed at the sensor may be only 1 or 2 digital counts], and improved accuracy can be realized using algorithms based on wavelengths where Lw(λ) is larger. Using ocean optical profiles acquired by the Visibility Laboratory, algorithms are developed to predict K(490) from ratios of water-leaving radiances at 520 and 670 nm, as well as 443 and 550 nm.
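
    The Austin and Petzold style algorithm referred to above has the band-ratio power-law form K(490) = Kw + A * [Lw(443)/Lw(550)]^B, and the article extends the same form to 520/670 nm ratios for turbid Case II waters. The coefficients in the sketch below are placeholders rather than the published values.

```python
import numpy as np

# Placeholder coefficients -- the published Austin & Petzold (1981) values differ.
KW, A, B = 0.022, 0.1, -1.5

def k490_from_ratio(lw443, lw550):
    """Band-ratio diffuse attenuation estimate: K(490) = Kw + A * (Lw443/Lw550)**B."""
    ratio = np.asarray(lw443, dtype=float) / np.asarray(lw550, dtype=float)
    return KW + A * ratio ** B
```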

  13. Analyzing Fourier Transforms for NASA DFRC's Fiber Optic Strain Sensing System

    NASA Technical Reports Server (NTRS)

    Fiechtner, Kaitlyn Leann

    2010-01-01

    This document provides a basic overview of the fiber optic technology used for sensing stress, strain, and temperature. Also, the document summarizes the research concerning speed and accuracy of the possible mathematical algorithms that can be used for NASA DFRC's Fiber Optic Strain Sensing (FOSS) system.

  14. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    NASA Astrophysics Data System (ADS)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
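
    The non-adaptive part of the processing chain described above, pulse compression, is a matched filter that is normally evaluated in the frequency domain with FFTs; on the GPU the same steps map onto cuBLAS/cuFFT calls. A hedged NumPy sketch of that baseline is shown below (the chirp parameters and noise level are illustrative); the adaptive pulse compression and kernel optimization discussed in the paper are not reproduced.

```python
import numpy as np

def pulse_compress(rx, tx):
    """Frequency-domain matched filtering of a received record `rx`
    against the transmitted waveform `tx` (both 1-D complex arrays)."""
    n = len(rx) + len(tx) - 1
    nfft = 1 << (n - 1).bit_length()              # next power of two
    RX = np.fft.fft(rx, nfft)
    TX = np.fft.fft(tx, nfft)
    return np.fft.ifft(RX * np.conj(TX), nfft)[:n]

# Illustrative LFM (chirp) pulse and a delayed, noisy echo
fs, T, bw = 1e6, 100e-6, 200e3
t = np.arange(0, T, 1 / fs)
tx = np.exp(1j * np.pi * (bw / T) * t ** 2)
rx = np.concatenate([np.zeros(300, dtype=complex), tx]) + \
     0.1 * (np.random.randn(300 + len(t)) + 1j * np.random.randn(300 + len(t)))
peak_delay = np.argmax(np.abs(pulse_compress(rx, tx)))   # expected near sample 300
```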

  15. Compressive Video Recovery Using Block Match Multi-Frame Motion Estimation Based on Single Pixel Cameras

    PubMed Central

    Bi, Sheng; Zeng, Xiao; Tang, Xin; Qin, Shujia; Lai, King Wai Chiu

    2016-01-01

    Compressive sensing (CS) theory has opened up new paths for the development of signal processing applications. Based on this theory, a novel single pixel camera architecture has been introduced to overcome the current limitations and challenges of traditional focal plane arrays. However, video quality based on this method is limited by existing acquisition and recovery methods, and the method also suffers from being time-consuming. In this paper, a multi-frame motion estimation algorithm is proposed in CS video to enhance the video quality. The proposed algorithm uses multiple frames to implement motion estimation. Experimental results show that using multi-frame motion estimation can improve the quality of recovered videos. To further reduce the motion estimation time, a block match algorithm is used to process motion estimation. Experiments demonstrate that using the block match algorithm can reduce motion estimation time by 30%. PMID:26950127

  16. Novel lidar algorithms for atmospheric slant-range visibility, planetary boundary layer height, meteorological phenomena and atmospheric layering measurements

    NASA Astrophysics Data System (ADS)

    Pantazis, Alexandros; Papayannis, Alexandros; Georgoussis, Georgios

    2018-04-01

    In this paper we present novel algorithms and techniques developed within the Laser Remote Sensing Laboratory (LRSL) of the National Technical University of Athens (NTUA), in collaboration with Raymetrics S.A., for incorporation into a 3-Dimensional (3D) lidar. The lidar transmits at 355 nm in the eye-safe region, and the measurements are then transposed to the visual range at 550 nm, in accordance with the World Meteorological Organization (WMO) and International Civil Aviation Organization (ICAO) rules on daytime visibility. The algorithms provide horizontal, slant and vertical visibility for tower air traffic controllers and meteorologists, as well as from the pilot's point of view. Further algorithms detect atmospheric layering in any given direction and elevation angle, along with the Planetary Boundary Layer Height (PBLH).

  17. Remote Sensing of Particulate Organic Carbon Pools in the High-Latitude Oceans

    NASA Technical Reports Server (NTRS)

    Stramski, Dariusz; Stramska, Malgorzata

    2005-01-01

    The general goal of this project was to characterize spatial distributions at basin scales and variability on monthly to interannual timescales of particulate organic carbon (POC) in the high-latitude oceans. The primary objectives were: (1) To collect in situ data in the north polar waters of the Atlantic and in the Southern Ocean, necessary for the derivation of POC ocean color algorithms for these regions. (2) To derive regional POC algorithms and refine existing regional chlorophyll (Chl) algorithms, to develop understanding of processes that control bio-optical relationships underlying ocean color algorithms for POC and Chl, and to explain bio-optical differentiation between the examined polar regions and within the regions. (3) To determine basin-scale spatial patterns and temporal variability on monthly to interannual scales in satellite-derived estimates of POC and Chl pools in the investigated regions for the period of time covered by SeaWiFS and MODIS missions.

  18. ICAP - An Interactive Cluster Analysis Procedure for analyzing remotely sensed data

    NASA Technical Reports Server (NTRS)

    Wharton, S. W.; Turner, B. J.

    1981-01-01

    An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. ICAP differs from conventional clustering algorithms by allowing the analyst to optimize the cluster configuration by inspection, rather than by manipulating process parameters. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who can evaluate and elect to modify the cluster structure. Clusters can be deleted, or lumped together pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The principal advantage of this approach is that it allows prior information (when available) to be used directly in the analysis, since the analyst interacts with ICAP in a straightforward manner, using basic terms with which he is more likely to be familiar. Results from testing ICAP showed that an informed use of ICAP can improve classification, as compared to an existing cluster analysis procedure.

  19. The Tetracorder user guide: version 4.4

    USGS Publications Warehouse

    Livo, Keith Eric; Clark, Roger N.

    2014-01-01

    Imaging spectroscopy mapping software assists in the identification and mapping of materials based on their chemical properties as expressed in spectral measurements of a planet including the solid or liquid surface or atmosphere. Such software can be used to analyze field, aircraft, or spacecraft data; remote sensing datasets; or laboratory spectra. Tetracorder is a set of software algorithms commanded through an expert system to identify materials based on their spectra (Clark and others, 2003). Tetracorder also can be used in traditional remote sensing analyses, because some of the algorithms are a version of a matched filter. Thus, depending on the instructions fed to the Tetracorder system, results can range from simple matched filter output, to spectral feature fitting, to full identification of surface materials (within the limits of the spectral signatures of materials over the spectral range and resolution of the imaging spectroscopy data). A basic understanding of spectroscopy by the user is required for developing an optimum mapping strategy and assessing the results.

  20. Uncertainties of optical parameters and their propagations in an analytical ocean color inversion algorithm.

    PubMed

    Lee, ZhongPing; Arnone, Robert; Hu, Chuanmin; Werdell, P Jeremy; Lubac, Bertrand

    2010-01-20

    Following the theory of error propagation, we developed analytical functions to illustrate and evaluate the uncertainties of inherent optical properties (IOPs) derived by the quasi-analytical algorithm (QAA). In particular, we evaluated the effects of uncertainties of these optical parameters on the inverted IOPs: the absorption coefficient at the reference wavelength, the extrapolation of particle backscattering coefficient, and the spectral ratios of absorption coefficients of phytoplankton and detritus/gelbstoff, respectively. With a systematically simulated data set (46,200 points), we found that the relative uncertainty of QAA-derived total absorption coefficients in the blue-green wavelengths is generally within +/-10% for oceanic waters. The results of this study not only establish theoretical bases to evaluate and understand the effects of the various variables on IOPs derived from remote-sensing reflectance, but also lay the groundwork to analytically estimate uncertainties of these IOPs for each pixel. These are required and important steps for the generation of quality maps of IOP products derived from satellite ocean color remote sensing.

  1. Trace gas detection in hyperspectral imagery using the wavelet packet subspace

    NASA Astrophysics Data System (ADS)

    Salvador, Mark A. Z.

    This dissertation describes research into a new remote sensing method for detecting trace gases in hyperspectral and ultra-spectral data. The method is based on the wavelet packet transform and aims to improve both the computational tractability and the detection of trace gases in airborne and spaceborne spectral imagery. Atmospheric trace gas research supports Earth science disciplines including climatology, volcanology, pollution monitoring, natural disasters, and intelligence and military applications. Hyperspectral and ultra-spectral data significantly add to the volume of existing Earth science data sets; spaceborne spectral sensors in particular greatly increase spectral resolution while collecting data globally every day. Applying the wavelet packet transform to the spectral dimension of hyperspectral and ultra-spectral imagery can improve remote sensing detection algorithms and also facilitates their parallelization for high-performance computing. This research pursues two science goals: (1) developing a new spectral imagery detection algorithm, and (2) facilitating the parallelization of trace gas detection in spectral imagery data.
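
    As a hedged illustration of the core transform involved (not the dissertation's detection algorithm), the sketch below uses PyWavelets to expand a single pixel spectrum into a wavelet packet basis; detection statistics such as matched filters would then be computed on these coefficients. The wavelet choice and decomposition level are arbitrary assumptions.

```python
import numpy as np
import pywt

def wavelet_packet_features(spectrum, wavelet="db4", level=3):
    """Decompose a 1-D pixel spectrum into wavelet packet coefficients and
    return them concatenated (nodes taken in 'natural' order)."""
    wp = pywt.WaveletPacket(data=np.asarray(spectrum, dtype=float),
                            wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    return np.concatenate([node.data for node in nodes])
```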

  2. Advanced Dispersed Fringe Sensing Algorithm for Coarse Phasing Segmented Mirror Telescopes

    NASA Technical Reports Server (NTRS)

    Spechler, Joshua A.; Hoppe, Daniel J.; Sigrist, Norbert; Shi, Fang; Seo, Byoung-Joon; Bikkannavar, Siddarayappa A.

    2013-01-01

    The performance of a telescope with a segmented primary mirror depends strongly on how well the primary mirror segments can be phased. Segment mirror phasing, a critical step of segment mirror alignment, requires the ability to sense and correct the relative piston between segments from up to a few hundred microns down to a fraction of a wavelength in order to bring the mirror system to its full diffraction capability. When sampling the aperture of a telescope, using auto-collimating flats (ACFs) is more economical. One process used to phase primary mirror segments in the axial piston direction is dispersed fringe sensing (DFS), and DFS technology can also be used to co-phase the ACFs. DFS is essentially a signal fitting and processing operation and is an elegant method of coarse phasing segmented mirrors. DFS accuracy depends on careful calibration of the system as well as on other factors such as internal optical alignment, system wavefront errors, and detector quality. Novel improvements to the algorithm have led to substantial enhancements in DFS performance. The Advanced Dispersed Fringe Sensing (ADFS) algorithm is designed to reduce the sensitivity to calibration errors by determining the optimal fringe extraction line: an angular extraction-line dithering procedure is combined with an error function while minimizing the phase term of the fitted signal, which in essence defines the ADFS algorithm.

  3. Sensitivity of Global Sea-Air CO2 Flux to Gas Transfer Algorithms, Climatological Wind Speeds, and Variability of Sea Surface Temperature and Salinity

    NASA Technical Reports Server (NTRS)

    McClain, Charles R.; Signorini, Sergio

    2002-01-01

    Sensitivity analyses of sea-air CO2 flux to gas transfer algorithms, climatological wind speeds, sea surface temperature (SST) and salinity (SSS) were conducted for the global oceans and selected regional domains. Large uncertainties in the global sea-air flux estimates are identified due to different gas transfer algorithms, global climatological wind speeds, and seasonal SST and SSS data. The global sea-air flux ranges from -0.57 to -2.27 Gt/yr, depending on the combination of gas transfer algorithms and global climatological wind speeds used. Different combinations of global SST and SSS fields resulted in changes as large as 35% in the global ocean sea-air flux. An error as small as ±0.2 in SSS translates into a ±43% deviation in the mean global CO2 flux. This result emphasizes the need for highly accurate satellite SSS observations for the development of remote sensing sea-air flux algorithms.
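
    The quantity being perturbed in these sensitivity analyses is the bulk sea-air flux F = k * s * (pCO2,sea - pCO2,air), where k is a wind-speed-dependent gas transfer velocity and s the CO2 solubility. A hedged sketch with a quadratic wind-speed parameterization is given below; the coefficient and unit handling are illustrative, and the Schmidt-number correction is omitted.

```python
def co2_flux(wind_speed, solubility, pco2_sea, pco2_air, a=0.31):
    """Sea-air CO2 flux F = k * s * (pCO2_sea - pCO2_air).

    wind_speed : 10-m wind speed (m/s)
    solubility : CO2 solubility s (mol m^-3 uatm^-1)
    pco2_*     : partial pressures (uatm)
    a          : quadratic gas-transfer coefficient (cm/hr per (m/s)^2),
                 an illustrative value in the spirit of Wanninkhof-type relations
    Returns flux in mol m^-2 hr^-1 (negative values indicate ocean uptake).
    """
    k_cm_per_hr = a * wind_speed ** 2      # Schmidt-number factor neglected here
    k_m_per_hr = k_cm_per_hr / 100.0
    return k_m_per_hr * solubility * (pco2_sea - pco2_air)
```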

  4. Magnetic STAR technology for real-time localization and classification of unexploded ordnance and buried mines

    NASA Astrophysics Data System (ADS)

    Wiegert, R. F.

    2009-05-01

    A man-portable Magnetic Scalar Triangulation and Ranging ("MagSTAR") technology for Detection, Localization and Classification (DLC) of unexploded ordnance (UXO) has been developed by Naval Surface Warfare Center Panama City Division (NSWC PCD) with support from the Strategic Environmental Research and Development Program (SERDP). Proof of principle of the MagSTAR concept and its unique advantages for real-time, high-mobility magnetic sensing applications have been demonstrated by field tests of a prototype man-portable MagSTAR sensor. The prototype comprises: a) an array of fluxgate magnetometers configured as a multi-tensor gradiometer, b) a GPS-synchronized signal processing system, and c) unique STAR algorithms for point-by-point, standoff DLC of magnetic targets. This paper outlines details of: i) MagSTAR theory, ii) design and construction of the prototype sensor, iii) signal processing algorithms recently developed to improve the technology's target-discrimination accuracy, and iv) results of field tests of the portable gradiometer system against magnetic dipole targets. The results demonstrate that the MagSTAR technology is capable of very accurate, high-speed localization of magnetic targets at standoff distances of several meters. These advantages could readily be transitioned to a wide range of defense, security and sensing applications to provide faster and more effective DLC of UXO and buried mines.

  5. Robust Online Monitoring for Calibration Assessment of Transmitters and Instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Coble, Jamie B.; Shumaker, Brent

    Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this article, we give an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program to develop OLM algorithms that use sensor outputs and, in combination with other available information, 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions: signal validation, virtual sensing, and sensor response-time assessment. These algorithms incorporate, at their base, a Gaussian Process-based uncertainty quantification (UQ) method. Various plant models (using kernel regression, GP, or hierarchical models) may be used to predict sensor responses under various plant conditions. These predicted responses can then be applied in fault detection (sensor output and response time) and in computing the correct value (virtual sensing) of a failing physical sensor. The methods being evaluated in this work can compute confidence levels along with the predicted sensor responses and, as a result, may have the potential for compensating for sensor drift in real time (online recalibration). Evaluation was conducted using data from multiple sources (laboratory flow loops and plant data). Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.
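
    A minimal sketch of the signal-validation idea described above, assuming scikit-learn is available: a Gaussian Process trained on correlated plant signals predicts the monitored sensor with a confidence band, and measurements falling outside an n-sigma band are flagged as candidate drift or failure. The kernel, threshold and variable names are illustrative assumptions, not the program's actual algorithms.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_reference_model(x_train, y_train):
    """Fit a GP mapping correlated plant signals x to the monitored sensor y."""
    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    return gp.fit(x_train, y_train)

def flag_drift(gp, x_new, y_measured, n_sigma=3.0):
    """Boolean mask of samples whose measurement falls outside the GP's
    n-sigma prediction band (candidate drift/failure points)."""
    y_pred, y_std = gp.predict(x_new, return_std=True)
    return np.abs(y_measured - y_pred) > n_sigma * y_std
```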

  6. Kerr Reservoir LANDSAT experiment analysis for November 1980

    NASA Technical Reports Server (NTRS)

    Lecroy, S. R.

    1982-01-01

    An experiment was conducted on the waters of Kerr Reservoir to determine whether reliable algorithms could be developed that relate water quality parameters to remotely sensed data. LANDSAT radiance data were used in the analysis since they are readily available and cover the area of interest on a regular basis. Proper design of the experiment minimized many of the unwanted variations due to atmospheric, solar, and hydraulic changes. The algorithms developed were constrained to satisfy rigorous statistical criteria before they could be considered dependable in predicting water quality parameters. A complete mix of different types of algorithms using the LANDSAT bands was generated to provide a thorough understanding of the relationships among the data involved. The study demonstrated that, for the ranges measured, the algorithms that satisfactorily represented the data are mostly linear and require a maximum of only one or two LANDSAT bands. Ratioing techniques did not improve the results, since the initial design of the experiment minimized the errors that this procedure is effective against. Good correlations were established for inorganic suspended solids, iron, turbidity, and Secchi depth.
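
    The algorithms retained in the study are simple one- or two-band linear relations screened with rigorous statistics. A hedged sketch of the one-band case, fitting water_quality ≈ a + b * band_radiance and returning R² so weak relationships can be rejected, is shown below; the variable names are hypothetical.

```python
import numpy as np

def fit_band_regression(band_radiance, water_quality):
    """Least-squares fit water_quality ~ a + b * band_radiance, returning the
    coefficients and R^2 so weak relationships can be rejected."""
    X = np.column_stack([np.ones_like(band_radiance), band_radiance])
    coef, *_ = np.linalg.lstsq(X, water_quality, rcond=None)
    pred = X @ coef
    ss_res = np.sum((water_quality - pred) ** 2)
    ss_tot = np.sum((water_quality - water_quality.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot
```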

  7. Algorithms and architectures for robot vision

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S.

    1990-01-01

    The scope of the current work is to develop practical sensing implementations for robots operating in complex, partially unstructured environments. A focus in this work is to develop object models and estimation techniques which are specific to requirements of robot locomotion, approach and avoidance, and grasp and manipulation. Such problems have to date received limited attention in either computer or human vision - in essence, asking not only how perception is in general modeled, but also what is the functional purpose of its underlying representations. As in the past, researchers are drawing on ideas from both the psychological and machine vision literature. Of particular interest is the development of 3-D shape and motion estimates for complex objects when given only partial and uncertain information and when such information is incrementally accrued over time. Current studies consider the use of surface motion, contour, and texture information, with the longer range goal of developing a fused sensing strategy based on these sources and others.

  8. Development of the segment alignment maintenance system (SAMS) for the Hobby-Eberly Telescope

    NASA Astrophysics Data System (ADS)

    Booth, John A.; Adams, Mark T.; Ames, Gregory H.; Fowler, James R.; Montgomery, Edward E.; Rakoczy, John M.

    2000-07-01

    A sensing and control system for maintaining optical alignment of the ninety-one 1-meter mirror segments forming the Hobby-Eberly Telescope (HET) primary mirror array is now under development. The Segment Alignment Maintenance System (SAMS) is designed to sense relative shear motion between each segment edge pair and to calculate individual segment tip, tilt, and piston position errors. Error information is sent to the HET primary mirror control system, which corrects the physical position of each segment as often as once per minute. Development of SAMS is required to meet the optical image quality specifications for the telescope. Segment misalignment over time is thought to be due to thermal inhomogeneity within the steel mirror support truss. Challenging problems of sensor resolution, dynamic range, mechanical mounting, calibration, stability, robust algorithm development, and system integration must be overcome to achieve a successful operational solution.

  9. Remote sensing estimation of terrestrially derived colored dissolved organic matter input to the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Li, J.; Yu, Q.; Tian, Y. Q.

    2017-12-01

    The DOC flux from land to the Arctic Ocean has remarkable implications for the carbon cycle and for biogeochemical and ecological processes in the Arctic, and this lateral carbon flux needs to be monitored with high spatial and temporal resolution. Current studies in Arctic regions, however, are hampered by low spatial coverage. Remote sensing can provide an alternative bio-optical approach to field sampling for monitoring DOC dynamics through observation of colored dissolved organic matter (CDOM). DOC and CDOM were found to be highly correlated in an analysis of field samples from the Arctic-GRO, providing a solid foundation for remote sensing observation. In this study, six major Arctic rivers (Yukon, Kolyma, Lena, Mackenzie, Ob', Yenisey) were selected to derive CDOM dynamics over four years. Our newly developed SBOP algorithm, the first approach developed for estimating shallow-water bio-optical properties, was applied to a large Landsat-8 OLI image data set (nearly 100 images) to obtain high spatial resolution results. The CDOM absorption derived from the satellite images was verified against field samples with high accuracy (R2 = 0.87). Distinct CDOM dynamics were found in the different rivers, and CDOM absorption was found to be closely related to hydrological activity and terrestrial environmental dynamics. Our study helps to build a reliable system for studying the carbon cycle in Arctic regions.

  10. Scaling-up of CO2 fluxes to assess carbon sequestration in rangelands of Central Asia

    Treesearch

    Bruce K. Wylie; Tagir G. Gilmanov; Douglas A. Johnson; Nicanor Z. Saliendra; Larry L. Tieszen; Ruth Anne F. Doyle; Emilio A. Laca

    2006-01-01

    Flux towers provide temporal quantification of local carbon dynamics at specific sites. The number and distribution of flux towers, however, are generally inadequate to quantify carbon fluxes across a landscape or ecoregion. Thus, scaling up of flux tower measurements through use of algorithms developed from remote sensing and GIS data is needed for spatial...

  11. Water Mapping Technology Rebuilds Lives in Arid Regions

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Using NASA Landsat satellite and other remote sensing topographical data, Radar Technologies International developed an algorithm-based software program that can locate underground water sources. Working with international organizations and governments, the firm, which maintains an office in New Braunfels, Texas, is helping to provide water for refugees and other people in drought-stricken regions such as Kenya, Sudan, and Afghanistan.

  12. Spatial-Spectral Approaches to Edge Detection in Hyperspectral Remote Sensing

    NASA Astrophysics Data System (ADS)

    Cox, Cary M.

    This dissertation advances geoinformation science at the intersection of hyperspectral remote sensing and edge detection methods. A relatively new phenomenology among its remote sensing peers, hyperspectral imagery (HSI) comprises only about 7% of all remote sensing research - there are five times as many radar-focused peer-reviewed journal articles as hyperspectral-focused ones. Similarly, edge detection studies comprise only about 8% of image processing research, most of which is dedicated to image processing techniques most closely associated with end results, such as image classification and feature extraction. Given the centrality of edge detection to mapping, that most important of geographic functions, improving the collective understanding of hyperspectral imagery edge detection methods constitutes a research objective aligned to the heart of geoinformation sciences. Consequently, this dissertation endeavors to narrow the HSI edge detection research gap by advancing three HSI edge detection methods designed to leverage HSI's unique chemical identification capabilities in pursuit of generating accurate, high-quality edge planes. The Di Zenzo-based gradient edge detection algorithm, an innovative version of the Resmini HySPADE edge detection algorithm and a level set-based edge detection algorithm are tested against 15 traditional and non-traditional HSI datasets spanning a range of HSI data configurations, spectral resolutions, spatial resolutions, bandpasses and applications. This study empirically measures algorithm performance against Dr. John Canny's six criteria for a good edge operator: false positives, false negatives, localization, single-point response, robustness to noise and unbroken edges. The end state is a suite of spatial-spectral edge detection algorithms that produce satisfactory edge results against a range of hyperspectral data types applicable to a diverse set of earth remote sensing applications. This work also explores the concept of an edge within hyperspectral space, the relative importance of spatial and spectral resolutions as they pertain to HSI edge detection and how effectively compressed HSI data improves edge detection results. The HSI edge detection experiments yielded valuable insights into the algorithms' strengths, weaknesses and optimal alignment to remote sensing applications. The gradient-based edge operator produced strong edge planes across a range of evaluation measures and applications, particularly with respect to false negatives, unbroken edges, urban mapping, vegetation mapping and oil spill mapping applications. False positives and uncompressed HSI data presented occasional challenges to the algorithm. The HySPADE edge operator produced satisfactory results with respect to localization, single-point response, oil spill mapping and trace chemical detection, and was challenged by false positives, declining spectral resolution and vegetation mapping applications. The level set edge detector produced high-quality edge planes for most tests and demonstrated strong performance with respect to false positives, single-point response, oil spill mapping and mineral mapping. False negatives were a regular challenge for the level set edge detection algorithm.
Finally, HSI data optimized for spectral information compression and noise was shown to improve edge detection performance across all three algorithms, while the gradient-based algorithm and HySPADE demonstrated significant robustness to declining spectral and spatial resolutions.
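
    As a hedged illustration of the first of the three methods, a Di Zenzo-style multiband gradient can be computed from the 2x2 structure tensor accumulated over all bands, with the edge strength taken as the square root of its largest eigenvalue. The sketch below shows only that computation; the thresholding, the HySPADE variant and the level set method from the dissertation are not reproduced.

```python
import numpy as np

def dizenzo_gradient(cube):
    """Di Zenzo-style multiband gradient magnitude for a hyperspectral cube of
    shape (rows, cols, bands): the square root of the largest eigenvalue of the
    2x2 structure tensor accumulated over all bands."""
    gxx = gyy = gxy = 0.0
    for b in range(cube.shape[2]):
        gy, gx = np.gradient(cube[:, :, b].astype(float))   # row/col derivatives
        gxx = gxx + gx * gx
        gyy = gyy + gy * gy
        gxy = gxy + gx * gy
    lam_max = 0.5 * (gxx + gyy + np.sqrt((gxx - gyy) ** 2 + 4.0 * gxy ** 2))
    return np.sqrt(lam_max)
```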

  13. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and relative merits of these different algorithms will be discussed and demonstrated.

  14. Compressive Sensing Based Bio-Inspired Shape Feature Detection CMOS Imager

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A. (Inventor)

    2015-01-01

    A CMOS imager integrated circuit using compressive sensing and bio-inspired detection is presented which integrates novel functions and algorithms within a novel hardware architecture enabling efficient on-chip implementation.

  15. Principles underlying the design of "The Number Race", an adaptive computer game for remediation of dyscalculia

    PubMed Central

    Wilson, Anna J; Dehaene, Stanislas; Pinel, Philippe; Revkin, Susannah K; Cohen, Laurent; Cohen, David

    2006-01-01

    Background Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. Methods "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). Results The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article [1] describes the evolution of number sense and arithmetic scores before and after training. Conclusion The software, open-source and freely available online, is designed for learning disabled children aged 5–8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains. PMID:16734905

  16. An AdaBoost Based Approach to Automatic Classification and Detection of Buildings Footprints, Vegetation Areas and Roads from Satellite Images

    NASA Astrophysics Data System (ADS)

    Gonulalan, Cansu

    In recent years, there has been an increasing demand for applications that monitor land-use-related targets using remote sensing images, and advances in remote sensing satellites have stimulated research in this area. Many applications, ranging from urban growth planning to homeland security, already use algorithms for automated object recognition from remote sensing imagery. However, these still have problems such as low detection accuracy and algorithms that only work for a specific area. In this thesis, we focus on an automatic approach to classify and detect building footprints, road networks and vegetation areas. The automatic interpretation of visual data is a comprehensive task in the computer vision field, and machine learning approaches improve classification capability in an intelligent way. We propose a method with high detection and classification accuracy, using multi-class classification to detect multiple object types. We present an AdaBoost-based approach along with the supervised learning algorithm. The combination of AdaBoost with an "attentional cascade" is adopted from Viola and Jones [1]; this combination decreases computation time and opens the way to real-time applications. For the feature extraction step, our contribution is to combine Haar-like features that include corner, rectangle and Gabor features. Among all features, AdaBoost selects only the critical ones and generates an extremely efficient cascade-structured classifier. Finally, we present and evaluate our experimental results: the overall system is tested and a high detection performance is achieved, with a precision rate of the final multi-class classifier of over 98%.
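
    A minimal sketch of the boosting stage, assuming scikit-learn and precomputed Haar-like/corner/Gabor feature vectors (feature extraction and the Viola-Jones attentional cascade are not reproduced): AdaBoost over decision stumps selects the most discriminative features while building the classifier. The hyperparameter values are illustrative.

```python
from sklearn.ensemble import AdaBoostClassifier

def train_detector(features, labels):
    """Boosted ensemble of decision stumps (scikit-learn's default weak learner)
    over precomputed feature vectors; labels index the target classes
    (e.g., building footprint / road / vegetation / background)."""
    clf = AdaBoostClassifier(n_estimators=200, learning_rate=0.5)
    return clf.fit(features, labels)

# Usage with hypothetical arrays:
#   detector = train_detector(X_train, y_train)
#   predictions = detector.predict(X_test)
```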

  17. An infrared-visible image fusion scheme based on NSCT and compressed sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Qiong; Maldague, Xavier

    2015-05-01

    Image fusion, currently a research hotspot in the field of infrared computer vision, has been developed using a wide variety of methods. Traditional image fusion algorithms tend to suffer from problems such as large data storage requirements and increased computational complexity. Compressed sensing (CS) uses sparse sampling, without requiring prior knowledge, to reconstruct the image effectively, which reduces the cost and complexity of image processing. In this paper, an advanced compressed sensing image fusion algorithm based on the non-subsampled contourlet transform (NSCT) is proposed. NSCT provides better sparsity than the wavelet transform in image representation. Through the NSCT decomposition, the low-frequency and high-frequency coefficients are obtained separately. For the fusion of the low-frequency coefficients of the infrared and visible images, an adaptive regional energy weighting rule is utilized, so only the high-frequency coefficients are compressively measured. Here we use sparse representation and random projection to obtain the required values of the high-frequency coefficients; afterwards, the coefficients of each image block can be fused via the absolute maximum selection rule and/or the regional standard deviation rule. In the reconstruction of the compressive sampling results, a gradient-based iterative algorithm and the total variation (TV) method are employed to recover the high-frequency coefficients. Eventually, the fused image is recovered by the inverse NSCT. Both the visual results and the numerical evaluations indicate that the presented approach achieves higher-quality image fusion, accelerates the calculations, enhances various targets and extracts more useful information.

  18. Neural Network-Based Retrieval of Surface and Root Zone Soil Moisture using Multi-Frequency Remotely-Sensed Observations

    NASA Astrophysics Data System (ADS)

    Hamed Alemohammad, Seyed; Kolassa, Jana; Prigent, Catherine; Aires, Filipe; Gentine, Pierre

    2017-04-01

    Knowledge of root zone soil moisture is essential for studying plants' response to different stress conditions, since photosynthetic activity and transpiration rate are constrained by the water available through the roots. Current global root zone soil moisture estimates are based either on outputs from physical models constrained by observations, or on assimilation of remotely sensed microwave-based surface soil moisture estimates with physical model outputs. However, the quality of these estimates is limited by the accuracy of the model representations of physical processes (such as radiative transfer, infiltration, percolation, and evapotranspiration) as well as by errors in the estimates of the surface parameters. Statistical approaches provide an alternative, efficient platform for developing root zone soil moisture retrieval algorithms from remotely sensed observations. In this study, we present a new neural network based retrieval algorithm to estimate surface and root zone soil moisture from passive microwave observations of the SMAP satellite (L-band) and the AMSR2 instrument (X-band). SMAP early morning observations are ideal for surface soil moisture retrieval. AMSR2 mid-night observations are used here as an indicator of plant hydraulic properties that are related to root zone soil moisture. The combined observations from SMAP and AMSR2, together with other ancillary observations including the Solar-Induced Fluorescence (SIF) estimates from the GOME-2 instrument, provide the necessary information to estimate surface and root zone soil moisture. The algorithm is applied to observations from the first 18 months of the SMAP mission, and the retrievals are validated against in-situ observations and other global datasets.

  19. Density-independent algorithm for sensing moisture content of sawdust based on reflection measurements

    USDA-ARS?s Scientific Manuscript database

    A density-independent algorithm for moisture content determination in sawdust, based on a one-port reflection measurement technique, is proposed for the first time. Performance of this algorithm is demonstrated through measurement of the dielectric properties of sawdust with an open-ended half-mode s...

  20. Abnormal global and local event detection in compressive sensing domain

    NASA Astrophysics Data System (ADS)

    Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem

    2018-05-01

    Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection that can cope with factors such as occlusion and illumination change. In this paper, a new algorithm is proposed that can localize abnormal events within a frame and detect globally abnormal frames. The proposed algorithm employs a sparse measurement matrix designed to represent the optical-flow-based movement feature efficiently. The abnormality detection task is then formulated as a one-class classification problem, learning only from normal training samples. Experiments demonstrate that our algorithm performs well on benchmark anomaly detection datasets compared with state-of-the-art methods.
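
    A hedged sketch of this kind of pipeline, with stand-in components: a sparse random matrix compresses optical-flow feature vectors, and a One-Class SVM (playing the role of the one-class classifier) is trained only on normal clips, so frames whose compressed features fall outside the learned region are flagged as abnormal. The density, kernel and nu values are illustrative, and this is not the authors' exact model.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

def make_projection(n_features, n_components=64, density=0.1):
    """Sparse random measurement matrix used to compress flow features."""
    mask = rng.random((n_components, n_features)) < density
    return mask * rng.standard_normal((n_components, n_features))

def train_normal_model(normal_features, projection):
    """Fit a one-class model on projected features from normal training clips."""
    z = normal_features @ projection.T
    return OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(z)

def is_abnormal(model, features, projection):
    """True where the model labels a frame's compressed features as outliers."""
    return model.predict(features @ projection.T) == -1
```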

  1. Tracking the Evolution of Smartphone Sensing for Monitoring Human Movement

    PubMed Central

    del Rosario, Michael B.; Redmond, Stephen J.; Lovell, Nigel H.

    2015-01-01

    Advances in mobile technology have led to the emergence of the “smartphone”, a new class of device with more advanced connectivity features that have quickly made it a constant presence in our lives. Smartphones are equipped with comparatively advanced computing capabilities, a global positioning system (GPS) receiver, and sensing capabilities (i.e., an inertial measurement unit (IMU) and, more recently, a magnetometer and barometer) that can also be found in wearable ambulatory monitors (WAMs). As a result, algorithms initially developed for WAMs that “count” steps (i.e., pedometers), gauge physical activity levels, indirectly estimate energy expenditure and monitor human movement can be utilised on the smartphone. These algorithms may enable clinicians to “close the loop” by prescribing timely interventions to improve or maintain wellbeing in populations who are at risk of falling or suffer from a chronic disease whose progression is linked to a reduction in movement and mobility. The ubiquitous nature of smartphone technology makes it the ideal platform from which human movement can be remotely monitored without the expense of purchasing, and inconvenience of using, a dedicated WAM. In this paper, an overview of the sensors that can be found in the smartphone is presented, followed by a summary of the developments in this field with an emphasis on the evolution of algorithms used to classify human movement. The limitations identified in the literature will be discussed, as well as suggestions about future research directions. PMID:26263998

  2. Chlorophyll-a specific volume scattering function of phytoplankton.

    PubMed

    Tan, Hiroyuki; Oishi, Tomohiko; Tanaka, Akihiko; Doerffer, Roland; Tan, Yasuhiro

    2017-06-12

    Chlorophyll-a specific volume scattering functions (VSFs) of cultured phytoplankton in the visible spectral range are presented. The chlorophyll-a specific VSFs were determined by a linear least-squares method using VSFs measured at different chlorophyll-a concentrations. We found clear variability between cultures in the spectral and angular shapes of the VSF. Chlorophyll-a specific scattering was also shown to significantly affect the spectral variation of the remote sensing reflectance, depending on the spectral shape of the scattering coefficient b. These results are useful for developing advanced ocean color remote sensing algorithms and for a deeper understanding of light in the sea.

  3. Compressive self-interference Fresnel digital holography with faithful reconstruction

    NASA Astrophysics Data System (ADS)

    Wan, Yuhong; Man, Tianlong; Han, Ying; Zhou, Hongqiang; Wang, Dayong

    2017-05-01

    We developed a compressive self-interference digital holographic approach that retrieves three-dimensional information about spatially incoherent objects from a single-shot captured hologram. Fresnel incoherent correlation holography is combined with a parallel phase-shifting technique to instantaneously obtain spatially multiplexed phase-shifting holograms. The recording scheme is regarded as a compressive forward sensing model, so a compressive-sensing-based reconstruction algorithm is implemented to reconstruct the original object from the undersampled, demultiplexed sub-holograms. The concept was verified by simulations and by experiments that emulated the use of the polarizer array. The proposed technique has great potential for 3D tracking of spatially incoherent samples.

  4. Dynamic Domains in Data Production Planning

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Pang, Wanlin

    2005-01-01

    This paper discusses a planner-based approach to automating data production tasks, such as producing fire forecasts from satellite imagery and weather station data. Since the set of available data products is large, dynamic and mostly unknown, planning techniques developed for closed worlds are unsuitable. We discuss a number of techniques we have developed to cope with data production domains, including a novel constraint propagation algorithm based on planning graphs and a constraint-based approach to interleaved planning, sensing and execution.

  5. Advanced Product Development for Combat Casualty Care at the U.S. Army Institute of Surgical Research

    DTIC Science & Technology

    2010-04-01

    ...for predicting central blood volume changes to focus on the development of software algorithms and systems to provide a capability to track, and ... which creatively fills this Critical Care gap. Technology in this sense means hardware and software systems which incorporate sensors, processors ... devices for use in forward surgical and combat areas.

  6. Preliminary investigation of submerged aquatic vegetation mapping using hyperspectral remote sensing.

    PubMed

    William, David J; Rybicki, Nancy B; Lombana, Alfonso V; O'Brien, Tim M; Gomez, Richard B

    2003-01-01

    The use of airborne hyperspectral remote sensing imagery for automated mapping of submerged aquatic vegetation (SAV) in the tidal Potomac River was investigated for near to real-time resource assessment and monitoring. Airborne hyperspectral imagery and field spectrometer measurements were obtained in October of 2000. A spectral library database containing selected ground-based and airborne sensor spectra was developed for use in image processing. The spectral library is used to automate the processing of hyperspectral imagery for potential real-time material identification and mapping. Field based spectra were compared to the airborne imagery using the database to identify and map two species of SAV (Myriophyllum spicatum and Vallisneria americana). Overall accuracy of the vegetation maps derived from hyperspectral imagery was determined by comparison to a product that combined aerial photography and field based sampling at the end of the SAV growing season. The algorithms and databases developed in this study will be useful with the current and forthcoming space-based hyperspectral remote sensing systems.

  7. Real-time bio-sensors for enhanced C2ISR operator performance

    NASA Astrophysics Data System (ADS)

    Miller, James C.

    2005-05-01

    The objectives of two Air Force Small Business research topics were to develop a real-time, unobtrusive, biological sensing and monitoring technology for evaluating cognitive readiness in command and control environments (i.e., console operators). We sought an individualized status monitoring system for command and control operators and teams. The system was to consist of a collection of bio-sensing technologies and processing and feedback algorithms that could eventually guide the effective incorporation of fatigue-adaptive workload interventions into weapon systems to mitigate episodes of cognitive overload and lapses in operator attention that often result in missed signals and catastrophic failures. Contractors set about determining what electro-physiological and other indicators of compromised operator states are most amenable for unobtrusive monitoring of psychophysiological and warfighter performance data. They proposed multi-sensor platforms of bio-sensing technologies for development. The sensors will be continuously-wearable or off-body and will not require complicated or uncomfortable preparation. A general overview of the proposed approaches and of progress toward the objective is presented.

  8. Multidimensional Modeling of Atmospheric Effects and Surface Heterogeneities on Remote Sensing

    NASA Technical Reports Server (NTRS)

    Gerstl, S. A. W.; Simmer, C.; Zardecki, A. (Principal Investigator)

    1985-01-01

    The overall goal of this project is to establish a modeling capability that allows a quantitative determination of atmospheric effects on remote sensing including the effects of surface heterogeneities. This includes an improved understanding of aerosol and haze effects in connection with structural, angular, and spatial surface heterogeneities. One important objective of the research is the possible identification of intrinsic surface or canopy characteristics that might be invariant to atmospheric perturbations so that they could be used for scene identification. Conversely, an equally important objective is to find a correction algorithm for atmospheric effects in satellite-sensed surface reflectances. The technical approach is centered around a systematic model and code development effort based on existing, highly advanced computer codes that were originally developed for nuclear radiation shielding applications. Computational techniques for the numerical solution of the radiative transfer equation are adapted on the basis of the discrete-ordinates finite-element method which proved highly successful for one and two-dimensional radiative transfer problems with fully resolved angular representation of the radiation field.

  9. Spatialized Application of Remotely Sensed Data Assimilation Methods for Farmland Drought Monitoring Using Two Different Crop Models

    NASA Astrophysics Data System (ADS)

    Silvestro, Paolo Cosmo; Casa, Raffaele; Pignatti, Stefano; Castaldi, Fabio; Yang, Hao; Guijun, Yang

    2016-08-01

    The aim of this work was to develop a tool to evaluate the effect of water stress on yield losses at the farmland and regional scale, by assimilating remotely sensed biophysical variables into crop growth models. Biophysical variables were retrieved from HJ1A, HJ1B and Landsat 8 images, using an algorithm based on artificial neural networks trained on PROSAIL simulations. For the assimilation, two crop models of differing degrees of complexity were used: Aquacrop and SAFY. For Aquacrop, an optimization procedure was developed to reduce the difference between the remotely sensed and simulated canopy cover (CC). For the modified version of SAFY, the assimilation procedure was based on the Ensemble Kalman Filter. These procedures were tested in a spatialized application, using data collected in the rural area of Yangling (Shaanxi Province) between 2013 and 2015. Results were validated using yield data from both ground measurements and statistical surveys.
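
    For the SAFY case, the assimilation step is an Ensemble Kalman Filter analysis that nudges the ensemble of model states toward the remotely sensed observation. A hedged sketch of one stochastic EnKF update with a linear observation operator is given below; the matrix shapes, perturbation scheme and variable names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_err_var, seed=0):
    """One stochastic EnKF analysis step.

    ensemble     : (n_members, n_state) forecast ensemble of model state vectors
    obs          : (n_obs,) remotely sensed observation vector (e.g., LAI)
    obs_operator : (n_obs, n_state) linear operator H mapping state to obs space
    obs_err_var  : scalar observation-error variance
    """
    rng = np.random.default_rng(seed)
    n_members = ensemble.shape[0]
    H = obs_operator
    X = ensemble - ensemble.mean(axis=0)              # state anomalies
    P = X.T @ X / (n_members - 1)                     # sample covariance
    R = obs_err_var * np.eye(len(obs))
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_err_var),
                                 size=(n_members, len(obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```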

  10. Behavior of an Automatic Pacemaker Sensing Algorithm for Single-Pass VDD Atrial Electrograms

    DTIC Science & Technology

    2001-10-25

    ...830-S lead (Medico), during several different body postures, deep respiration, and walking. The algorithm had a pre-determined sensing dynamic range ... ST = 50% x (B + C)/2; ST = 50% x (A + B ...

  11. High resolution remote sensing of densely urbanised regions: a case study of Hong Kong.

    PubMed

    Nichol, Janet E; Wong, Man Sing

    2009-01-01

    Data on the urban environment such as climate or air quality is usually collected at a few point monitoring stations distributed over a city. However, the synoptic viewpoint of satellites where a whole city is visible on a single image permits the collection of spatially comprehensive data at city-wide scale. In spite of rapid developments in remote sensing systems, deficiencies in image resolution and algorithm development still exist for applications such as air quality monitoring and urban heat island analysis. This paper describes state-of-the-art techniques for enhancing and maximising the spatial detail available from satellite images, and demonstrates their applications to the densely urbanised environment of Hong Kong. An Emissivity Modulation technique for spatial enhancement of thermal satellite images permits modelling of urban microclimate in combination with other urban structural parameters at local scale. For air quality monitoring, a Minimum Reflectance Technique (MRT) has been developed for MODIS 500 m images. The techniques described can promote the routine utilization of remotely sensed images for environmental monitoring in cities of the 21st century.

  12. High Resolution Remote Sensing of Densely Urbanised Regions: a Case Study of Hong Kong

    PubMed Central

    Nichol, Janet E.; Wong, Man Sing

    2009-01-01

    Data on the urban environment such as climate or air quality is usually collected at a few point monitoring stations distributed over a city. However, the synoptic viewpoint of satellites where a whole city is visible on a single image permits the collection of spatially comprehensive data at city-wide scale. In spite of rapid developments in remote sensing systems, deficiencies in image resolution and algorithm development still exist for applications such as air quality monitoring and urban heat island analysis. This paper describes state-of-the-art techniques for enhancing and maximising the spatial detail available from satellite images, and demonstrates their applications to the densely urbanised environment of Hong Kong. An Emissivity Modulation technique for spatial enhancement of thermal satellite images permits modelling of urban microclimate in combination with other urban structural parameters at local scale. For air quality monitoring, a Minimum Reflectance Technique (MRT) has been developed for MODIS 500 m images. The techniques described can promote the routine utilization of remotely sensed images for environmental monitoring in cities of the 21st century. PMID:22408549

  13. TRL-6 for JWST wavefront sensing and control

    NASA Astrophysics Data System (ADS)

    Feinberg, Lee D.; Dean, Bruce H.; Aronstein, David L.; Bowers, Charles W.; Hayden, William; Lyon, Richard G.; Shiri, Ron; Smith, J. Scott; Acton, D. Scott; Carey, Larkin; Contos, Adam; Sabatke, Erin; Schwenker, John; Shields, Duncan; Towell, Tim; Shi, Fang; Meza, Luis

    2007-09-01

    NASA's Technology Readiness Level (TRL)-6 is documented for the James Webb Space Telescope (JWST) Wavefront Sensing and Control (WFSC) subsystem. The WFSC subsystem is needed to align the Optical Telescope Element (OTE) after all deployments have occurred, and achieves that requirement through a robust commissioning sequence consisting of unique commissioning algorithms, all of which are part of the WFSC algorithm suite. This paper identifies the technology need and algorithm heritage, describes the finished TRL-6 design platform, and summarizes the TRL-6 test results and compliance. Additionally, the performance requirements needed to satisfy JWST science goals, as well as the criteria that relate to the TRL-6 Testbed Telescope (TBT) performance requirements, are discussed.

  14. TRL-6 for JWST Wavefront Sensing and Control

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Dean, Bruce; Smith, Scott; Aronstein, David; Shiri, Ron; Lyon, Rick; Hayden, Bill; Bowers, Chuck; Acton, D. Scott; Shields, Duncan; hide

    2007-01-01

    NASA's Technology Readiness Level (TRL)-6 is documented for the James Webb Space Telescope (JWST) Wavefront Sensing and Control (WFSC) subsystem. The WFSC subsystem is needed to align the Optical Telescope Element (OTE) after all deployments have occurred, and achieves that requirement through a robust commissioning sequence consisting of unique commissioning algorithms, all of which are part of the WFSC algorithm suite. This paper identifies the technology need and algorithm heritage, describes the finished TRL-6 design platform, and summarizes the TRL-6 test results and compliance. Additionally, the performance requirements needed to satisfy JWST science goals, as well as the criteria that relate to the TRL-6 Testbed Telescope (TBT) performance requirements, are discussed.

  15. Semantic Drift in Espresso-style Bootstrapping: Graph-theoretic Analysis and Evaluation in Word Sense Disambiguation

    NASA Astrophysics Data System (ADS)

    Komachi, Mamoru; Kudo, Taku; Shimbo, Masashi; Matsumoto, Yuji

    Bootstrapping has a tendency, called semantic drift, to select instances unrelated to the seed instances as the iteration proceeds. We demonstrate that the semantic drift of Espresso-style bootstrapping has the same root as the topic drift of Kleinberg's HITS, using a simplified graph-based reformulation of bootstrapping. We confirm that two graph-based algorithms, the von Neumann kernels and the regularized Laplacian, can reduce the effect of semantic drift in the task of word sense disambiguation (WSD) on the Senseval-3 English Lexical Sample Task. The proposed algorithms achieve superior performance to Espresso and previous graph-based WSD methods, even though they have fewer parameters and are easy to calibrate.

  16. Low dose reconstruction algorithm for differential phase contrast imaging.

    PubMed

    Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Marco, Stampanoni

    2011-01-01

    Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method that reconstructs the distribution of the refraction index, rather than the attenuation coefficient, in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT that benefits from compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of differential phase contrast imaging into a linear partial-derivative matrix. In this way the compressed sensing reconstruction problem for DPCI is transformed into a problem already solved for transmission CT imaging. Our algorithm has the potential to reconstruct the refraction index distribution of the sample from highly undersampled projection data, and can thus significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.

  17. Changing Requirements for Archiving Climate Data Records Derived From Remotely Sensed Data

    NASA Astrophysics Data System (ADS)

    Fleig, A. J.; Tilmes, C.

    2007-05-01

    With the arrival of long-term sets of measurements of remotely sensed data it becomes important to improve the standard practices associated with the archival of information needed to allow creation of climate data records (CDRs) from individual sets of measurements. Several aspects of the production of CDRs suggest that there should be changes in standard best practices for archival. A fundamental requirement for understanding long-term trends in climate data is that changes with time shown by the data reflect changes in actual geophysical parameters rather than changes in the measurement system. Even well developed and validated data sets from remotely sensed measurements contain artifacts. If the nature of the measurement and the algorithm is consistent over time, these artifacts may have little impact on trends derived from the data. However, data sets derived with different algorithms created with different assumptions are likely to introduce non-physical changes in trend data. Yet technology for making measurements and analyzing data improves with time, and this must be accounted for. To do this for an ongoing long-term data set based on multiple instruments it is important to understand exactly how the preceding data were produced. But we are reaching the point where the scientists and engineers who developed the initial measurements and algorithms are no longer available to explain and assist in adapting today's systems for use with future measurement systems. In an era where tens to hundreds of man-years are involved in calibrating an instrument and producing and validating a set of geophysical measurements from the calibrated data, we have long passed the time when it was reasonable to say "just give me the basic measurement and a bright graduate student and I can produce anything I need in a year." Examples of problems encountered and alternative solutions will be provided, based on developing and reprocessing data sets from long-term measurements of atmospheric, land surface and ocean parameters covering, in one case, a series of fifteen instruments currently scheduled to continue from 1978 through several decades into the 2030s. Possible changes in approach for developers of instruments and processing algorithms, archival centers, funding organizations and the climate science community will be suggested. There is a cost in both time and money associated with most of these changes, and the hope is that this presentation will prompt further discussion on what should be done.

  18. Improving management performance of P2PSIP for mobile sensing in wireless overlays.

    PubMed

    Sendín-Raña, Pablo; González-Castaño, Francisco Javier; Gómez-Cuba, Felipe; Asorey-Cacheda, Rafael; Pousada-Carballo, José María

    2013-11-08

    Future wireless communications are heading towards an all-Internet Protocol (all-IP) design, and will rely on the Session Initiation Protocol (SIP) to manage services, such as voice over IP (VoIP). The centralized architecture of traditional SIP has numerous disadvantages for mobile ad hoc services that may possibly be overcome by advanced peer-to-peer (P2P) technologies initially developed for the Internet. In the context of mobile sensing, P2PSIP protocols facilitate decentralized and fast communications with sensor-enabled terminals. Nevertheless, in order to make P2PSIP protocols feasible in mobile sensing networks, it is necessary to minimize the overhead transmissions used for signaling, which drain battery life. In this paper, we present a solution to improve the management of wireless overlay networks by defining an adaptive algorithm for the calculation of refresh time. The main advantage of the proposed algorithm is that it takes into account new parameters, such as the delay between nodes, and provides satisfactory performance and reliability levels at a much lower management overhead than previous approaches. The proposed solution can be applied to many structured P2P overlays or P2PSIP protocols. We evaluate it with Kademlia-based distributed hash tables (DHT) and dSIP.

  19. Improving Management Performance of P2PSIP for Mobile Sensing in Wireless Overlays

    PubMed Central

    Sendín-Raña, Pablo; González-Castaño, Francisco Javier; Gómez-Cuba, Felipe; Asorey-Cacheda, Rafael; Pousada-Carballo, José María

    2013-01-01

    Future wireless communications are heading towards an all-Internet Protocol (all-IP) design, and will rely on the Session Initiation Protocol (SIP) to manage services, such as voice over IP (VoIP). The centralized architecture of traditional SIP has numerous disadvantages for mobile ad hoc services that may possibly be overcome by advanced peer-to-peer (P2P) technologies initially developed for the Internet. In the context of mobile sensing, P2PSIP protocols facilitate decentralized and fast communications with sensor-enabled terminals. Nevertheless, in order to make P2PSIP protocols feasible in mobile sensing networks, it is necessary to minimize the overhead transmissions used for signaling, which drain battery life. In this paper, we present a solution to improve the management of wireless overlay networks by defining an adaptive algorithm for the calculation of refresh time. The main advantage of the proposed algorithm is that it takes into account new parameters, such as the delay between nodes, and provides satisfactory performance and reliability levels at a much lower management overhead than previous approaches. The proposed solution can be applied to many structured P2P overlays or P2PSIP protocols. We evaluate it with Kademlia-based distributed hash tables (DHT) and dSIP. PMID:24217358

  20. A random optimization approach for inherent optic properties of nearshore waters

    NASA Astrophysics Data System (ADS)

    Zhou, Aijun; Hao, Yongshuai; Xu, Kuo; Zhou, Heng

    2016-10-01

    Traditional water quality sampling is time-consuming and costly, and cannot meet the needs of social development. Hyperspectral remote sensing offers good temporal resolution, wide spatial coverage and rich spectral information, and has great potential for water quality supervision. Via a semi-analytical method, remote sensing information can be related to water quality. The inherent optical properties are used to quantify the water quality, and an optical model of the water column is established to analyse its features. Using the stochastic optimization algorithm Threshold Accepting, a global optimum of the unknown model parameters can be determined to obtain the distribution of chlorophyll, dissolved organic matter and suspended particles in the water. By improving the search step of the optimization algorithm, the processing time is markedly reduced, creating room to increase the number of parameters. A refined definition of the optimization steps and stopping criteria makes the whole inversion process more targeted, thus improving the accuracy of the inversion. Applied to simulated data provided by the IOCCG and field data provided by NASA, the approach has been continuously improved and refined. The result is a low-cost, effective model for retrieving water quality from hyperspectral remote sensing.
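
    The global search described above can be illustrated with a minimal Threshold Accepting sketch; the forward model, step size and threshold schedule below are placeholder assumptions, not the paper's semi-analytical model.

      import numpy as np

      def threshold_accepting(cost, x0, step, thresholds, iters_per_threshold=200, seed=0):
          """Minimise `cost` by Threshold Accepting: accept any candidate whose
          worsening is below the current threshold; the threshold shrinks over time."""
          rng = np.random.default_rng(seed)
          x = np.array(x0, dtype=float)
          fx = cost(x)
          best_x, best_f = x.copy(), fx
          for tau in thresholds:                       # decreasing threshold schedule
              for _ in range(iters_per_threshold):
                  cand = x + rng.normal(0.0, step, size=x.shape)
                  fc = cost(cand)
                  if fc - fx < tau:                    # accept if not much worse
                      x, fx = cand, fc
                      if fx < best_f:
                          best_x, best_f = x.copy(), fx
          return best_x, best_f

      # Toy inversion: fit two hypothetical IOP parameters to match a "measured" reflectance
      def toy_cost(p):
          chl, tsm = p
          modelled = 0.02 * np.exp(-0.05 * chl) + 0.004 * tsm   # placeholder forward model
          return (modelled - 0.015) ** 2

      x_opt, f_opt = threshold_accepting(toy_cost, x0=[5.0, 1.0], step=0.5,
                                         thresholds=np.linspace(1e-4, 0.0, 10))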

  1. Kullback-Leibler Divergence-Based Differential Evolution Markov Chain Filter for Global Localization of Mobile Robots.

    PubMed

    Martín, Fernando; Moreno, Luis; Garrido, Santiago; Blanco, Dolores

    2015-09-16

    One of the most important skills desired for a mobile robot is the ability to obtain its own location even in challenging environments. The information provided by the sensing system is used here to solve the global localization problem. In our previous work, we designed different algorithms founded on evolutionary strategies in order to solve the aforementioned task. The latest developments are presented in this paper. The engine of the localization module is a combination of the Markov chain Monte Carlo sampling technique and the Differential Evolution method, which results in a particle filter based on the minimization of a fitness function. The robot's pose is estimated from a set of possible locations weighted by a cost value. The measurements of the perceptive sensors are used together with the predicted ones in a known map to define a cost function to optimize. Although most localization methods rely on quadratic fitness functions, the sensed information is processed asymmetrically in this filter. The Kullback-Leibler divergence is the basis of a cost function that makes it possible to deal with different types of occlusions. The algorithm performance has been checked in a real map. The results are excellent in environments with dynamic and unmodeled obstacles, a fact that causes occlusions in the sensing area.
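
    The asymmetric cost idea can be sketched as follows: the measured and predicted scans are normalised and compared with the Kullback-Leibler divergence. The beam values and weighting rule are illustrative assumptions, not the authors' exact formulation.

      import numpy as np

      def kl_divergence_cost(measured, predicted, eps=1e-6):
          """Asymmetric cost between a measured range scan and the scan predicted for a
          candidate pose: normalise both and take D_KL(measured || predicted)."""
          p = np.asarray(measured, dtype=float) + eps
          q = np.asarray(predicted, dtype=float) + eps
          p /= p.sum()
          q /= q.sum()
          return float(np.sum(p * np.log(p / q)))

      # One beam is shortened by an unmodelled obstacle; the asymmetric cost penalises
      # this hypothesis less than a symmetric quadratic error would.
      measured  = np.array([2.1, 2.0, 0.4, 1.9, 2.2])   # scan with an occluded beam
      predicted = np.array([2.0, 2.0, 2.0, 2.0, 2.0])   # scan expected at the candidate pose
      weight = np.exp(-kl_divergence_cost(measured, predicted))   # candidate-pose weighting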

  2. Kullback-Leibler Divergence-Based Differential Evolution Markov Chain Filter for Global Localization of Mobile Robots

    PubMed Central

    Martín, Fernando; Moreno, Luis; Garrido, Santiago; Blanco, Dolores

    2015-01-01

    One of the most important skills desired for a mobile robot is the ability to obtain its own location even in challenging environments. The information provided by the sensing system is used here to solve the global localization problem. In our previous work, we designed different algorithms founded on evolutionary strategies in order to solve the aforementioned task. The latest developments are presented in this paper. The engine of the localization module is a combination of the Markov chain Monte Carlo sampling technique and the Differential Evolution method, which results in a particle filter based on the minimization of a fitness function. The robot’s pose is estimated from a set of possible locations weighted by a cost value. The measurements of the perceptive sensors are used together with the predicted ones in a known map to define a cost function to optimize. Although most localization methods rely on quadratic fitness functions, the sensed information is processed asymmetrically in this filter. The Kullback-Leibler divergence is the basis of a cost function that makes it possible to deal with different types of occlusions. The algorithm performance has been checked in a real map. The results are excellent in environments with dynamic and unmodeled obstacles, a fact that causes occlusions in the sensing area. PMID:26389914

  3. A sea-land segmentation algorithm based on multi-feature fusion for a large-field remote sensing image

    NASA Astrophysics Data System (ADS)

    Li, Jing; Xie, Weixin; Pei, Jihong

    2018-03-01

    Sea-land segmentation is one of the key technologies of sea target detection in remote sensing images. At present, existing algorithms suffer from low accuracy, limited universality and poor automation. This paper puts forward a sea-land segmentation algorithm based on multi-feature fusion for large-field remote sensing images that also removes islands. Firstly, the coastline data are extracted and all land area is labelled using the geographic information in the large-field remote sensing image. Secondly, three features (local entropy, local texture and local gradient mean) are extracted in the sea-land border area and combined into a 3D feature vector. A multi-Gaussian model is then adopted to describe the 3D feature vectors of the sea background near the coastline. Based on this multi-Gaussian sea-background model, the sea and land pixels near the coastline are classified more precisely. Finally, the coarse segmentation result and the fine segmentation result are fused to obtain the accurate sea-land segmentation. Subjective visual comparison and analysis of the experimental results show that the proposed method has high segmentation accuracy, wide applicability and strong anti-disturbance ability.
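
    A minimal sketch of the sea-background modelling step, assuming a single multivariate Gaussian over the 3D feature vectors (the paper's multi-Gaussian model and threshold choice may differ); the feature values below are synthetic.

      import numpy as np
      from scipy.stats import multivariate_normal

      def fit_sea_model(sea_features):
          """Fit a multivariate Gaussian to 3D feature vectors
          (local entropy, local texture, local gradient mean) sampled from sea pixels."""
          mu = sea_features.mean(axis=0)
          cov = np.cov(sea_features, rowvar=False)
          return multivariate_normal(mean=mu, cov=cov)

      def classify_coastal_pixels(features, sea_model, log_lik_threshold):
          """Label a pixel as sea if its feature vector is likely under the sea model."""
          log_lik = sea_model.logpdf(features)
          return np.where(log_lik > log_lik_threshold, "sea", "land")

      # Illustration with synthetic feature vectors (columns: entropy, texture, gradient mean)
      rng = np.random.default_rng(0)
      sea_samples = rng.normal([0.2, 0.1, 0.05], 0.02, size=(500, 3))
      model = fit_sea_model(sea_samples)
      threshold = model.logpdf(model.mean) - 5.0        # assumed acceptance threshold
      labels = classify_coastal_pixels(rng.normal([0.25, 0.12, 0.06], 0.05, size=(10, 3)),
                                       model, threshold)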

  4. An overview of remote sensing of chlorophyll fluorescence

    NASA Astrophysics Data System (ADS)

    Xing, Xiao-Gang; Zhao, Dong-Zhi; Liu, Yu-Guang; Yang, Jian-Hong; Xiu, Peng; Wang, Lin

    2007-03-01

    Besides empirical algorithms based on the blue-green ratio, algorithms based on fluorescence are also important and valid methods for retrieving chlorophyll-a concentration in ocean waters, especially for Case II waters and seas with algal blooms. This study reviews the history of early cognition, investigation and detailed approaches towards chlorophyll fluorescence, and then introduces the biological mechanism of fluorescence remote sensing and the main spectral characteristics, such as the positive correlation between fluorescence and chlorophyll concentration and the red-shift phenomenon. Meanwhile, many influencing factors increase the complexity of fluorescence remote sensing, such as fluorescence quantum yield, the physiological status of various algae, substances with related optical properties in the ocean, atmospheric absorption, etc. Based on these insights, scientists have found two ways to calculate the amount of fluorescence detected by ocean color sensors: fluorescence line height and reflectance ratio. These two approaches are currently the foundation for retrieval of chlorophyll-a concentration in the ocean. As in-situ measurements and synchronous satellite data continue to accumulate, fluorescence remote sensing of chlorophyll-a concentration in Case II waters should be understood more thoroughly, and new algorithms can be expected.
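
    Of the two approaches mentioned, fluorescence line height is simple to illustrate: the radiance at the fluorescence peak is compared with a linear baseline interpolated between two flanking bands. The band wavelengths and radiance values below are assumed for illustration only.

      def fluorescence_line_height(L_left, L_peak, L_right,
                                   wl_left=665.0, wl_peak=681.0, wl_right=709.0):
          """Fluorescence line height: radiance at the fluorescence peak minus a linear
          baseline interpolated between two flanking bands (band centres are illustrative)."""
          baseline = L_right + (L_left - L_right) * (wl_right - wl_peak) / (wl_right - wl_left)
          return L_peak - baseline

      # Example with water-leaving radiances in arbitrary units
      flh = fluorescence_line_height(L_left=1.20, L_peak=1.35, L_right=1.10)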

  5. A stereo remote sensing feature selection method based on artificial bee colony algorithm

    NASA Astrophysics Data System (ADS)

    Yan, Yiming; Liu, Pigang; Zhang, Ye; Su, Nan; Tian, Shu; Gao, Fengjiao; Shen, Yi

    2014-05-01

    To improve the efficiency of stereo information for remote sensing classification, a stereo remote sensing feature selection method based on the artificial bee colony algorithm is proposed in this paper. Remote sensing stereo information can be described by a digital surface model (DSM) and an optical image, which contain information on the three-dimensional structure and the optical characteristics, respectively. Firstly, the three-dimensional structure can be characterized by 3D Zernike descriptors (3DZD). However, different parameters of the 3DZD describe different levels of structural complexity, and they need to be optimally selected for the various objects on the ground. Secondly, the features representing the optical characteristics also need to be optimized. If not properly handled, a stereo feature vector composed of 3DZD and image features would carry a lot of redundant information, which may not improve the classification accuracy and can even cause adverse effects. To reduce information redundancy while maintaining or improving the classification accuracy, an optimization framework for this stereo feature selection problem is created, and the artificial bee colony algorithm is introduced to solve it. Experimental results show that the proposed method can effectively improve both the computational efficiency and the classification accuracy.

  6. Experimental evaluation of ALS point cloud ground extraction over different land cover in the Malopolska Province

    NASA Astrophysics Data System (ADS)

    Korzeniowska, Karolina; Mandlburger, Gottfried; Klimczyk, Agata

    2013-04-01

    The paper presents an evaluation of different terrain point extraction algorithms for Airborne Laser Scanning (ALS) point clouds. The research area covers eight test sites in the Małopolska Province (Poland) with varying point density between 3-15 points/m² and varying surface and land cover characteristics. In this paper the existing implementations of algorithms were considered. Approaches based on mathematical morphology, progressive densification, robust surface interpolation and segmentation were compared. From the group of morphological filters, the Progressive Morphological Filter (PMF) proposed by Zhang K. et al. (2003) in LIS software was evaluated. From the progressive densification filter methods developed by Axelsson P. (2000), Martin Isenburg's implementation in LAStools software (LAStools, 2012) was chosen. The third group of methods are surface-based filters. In this study, we used the hierarchic robust interpolation approach by Kraus K., Pfeifer N. (1998) as implemented in SCOP++ (Trimble, 2012). The fourth group of methods works on segmentation. From this filtering concept the segmentation algorithm available in LIS was tested (Wichmann V., 2012). The main aim when executing the automatic classification for ground extraction was to operate in default mode or with the default parameters selected by the developers of the algorithms. It was assumed that the default settings were equivalent to the parameters with which the best results can be achieved. In cases where it was not possible to apply an algorithm in default mode, a combination of the available parameters most crucial for ground extraction was selected. As a result of these analyses, several output LAS files with different ground classifications were obtained. The results were described on the basis of formally defined qualitative and quantitative analyses. The classification differences were verified on point cloud data. Qualitative verification of ground extraction was made on the basis of a visual inspection of the results (Sithole G., Vosselman G., 2004; Meng X. et al., 2010). The results of these analyses were summarised in a graph using a weighting scheme. The quantitative analyses were evaluated on the basis of Type I, Type II and Total errors (Sithole G., Vosselman G., 2003). The achieved results show that the analysed algorithms yield different classification accuracies depending on the landscape and land cover. The simplest terrain for ground extraction was flat rural area with sparse vegetation. The most difficult were mountainous areas with very dense vegetation where only a few ground points were available. Generally the LAStools algorithm gives good results in every type of terrain, but the ground surface is too smooth. The LIS Progressive Morphological Filter algorithm gives good results in forested flat and low-slope areas. The surface-based algorithm from SCOP++ gives good results in mountainous areas, both forested and built-up, because it better preserves steep slopes, sharp ridges and breaklines, but sometimes it fails to remove off-terrain objects from the ground class. The segmentation-based algorithm in LIS gives quite good results in built-up flat areas, but in forested areas it does not work well. Bibliography: Axelsson, P., 2000. DEM generation from laser scanner data using adaptive TIN models. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XXXIII (Pt. B4/1), 110-117. Kraus, K., Pfeifer, N., 1998. 
Determination of terrain models in wooded areas with airborne laser scanner data. ISPRS Journal of Photogrammetry & Remote Sensing 53 (4), 193-203. LAStools website http://www.cs.unc.edu/~isenburg/lastools/ (verified in September 2012). Meng, X., Currit, N., Zhao, K., 2010. Ground Filtering Algorithms for Airborne LiDAR Data: A Review of Critical Issues. Remote Sensing 2, 833-860. Sithole, G., Vosselman, G., 2003. Report: ISPRS Comparison of Filters. Commission III, Working Group 3. Department of Geodesy, Faculty of Civil Engineering and Geosciences, Delft University of Technology, The Netherlands. Sithole, G., Vosselman, G., 2004. Experimental comparison of filter algorithms for bare-Earth extraction from airborne laser scanning point clouds. ISPRS Journal of Photogrammetry & Remote Sensing 59, 85-101. Trimble, 2012 http://www.trimble.com/geospatial/aerial-software.aspx (verified in November 2012). Wichmann, V., 2012. LIS Command Reference, LASERDATA GmbH, 1-231. Zhang, K., Chen, S.-C., Whitman, D., Shyu, M.-L., Yan, J., Zhang, C., 2003. A progressive morphological filter for removing non-ground measurements from airborne LIDAR data. IEEE Transactions on Geoscience and Remote Sensing, 41(4), 872-882

  7. An algorithm for hyperspectral remote sensing of aerosols: 1. Development of theoretical framework

    NASA Astrophysics Data System (ADS)

    Hou, Weizhen; Wang, Jun; Xu, Xiaoguang; Reid, Jeffrey S.; Han, Dong

    2016-07-01

    This paper describes the first part of a series of investigations to develop algorithms for simultaneous retrieval of aerosol parameters and surface reflectance from a newly developed hyperspectral instrument, the GEOstationary Trace gas and Aerosol Sensor Optimization (GEO-TASO), by taking full advantage of available hyperspectral measurement information in the visible bands. We describe the theoretical framework of an inversion algorithm for the hyperspectral remote sensing of aerosol optical properties, in which the major principal components (PCs) of surface reflectance are assumed known, and the spectrally dependent aerosol refractive indices are assumed to follow a power-law approximation with four unknown parameters (two for the real and two for the imaginary part of the refractive index). New capabilities for computing the Jacobians of the four Stokes parameters of reflected solar radiation at the top of the atmosphere with respect to these unknown aerosol parameters and to the weighting coefficients for each PC of surface reflectance are added into the UNified Linearized Vector Radiative Transfer Model (UNL-VRTM), which in turn facilitates the optimization in the inversion process. Theoretical derivations of the formulas for these new capabilities are provided, and the analytical solutions of the Jacobians are validated against finite-difference calculations with relative errors of less than 0.2%. Finally, a self-consistency check of the inversion algorithm is conducted for the idealized green-vegetation and rangeland surfaces that were spectrally characterized by the U.S. Geological Survey digital spectral library. It shows that the first six PCs can yield the reconstruction of spectral surface reflectance with errors less than 1%. Assuming that aerosol properties can be accurately characterized, the inversion yields a retrieval of hyperspectral surface reflectance with an uncertainty of 2% (and root-mean-square error of less than 0.003), which suggests self-consistency in the inversion framework. The next step of using this framework to study the aerosol information content in GEO-TASO measurements is also discussed.
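
    A minimal sketch of the power-law spectral parameterisation of the complex refractive index mentioned above; the reference wavelength, parameter values and exact functional form are illustrative assumptions rather than the paper's retrieval settings.

      import numpy as np

      def powerlaw_refractive_index(wavelengths, mr0, alpha_r, mi0, alpha_i, wl_ref=550.0):
          """Power-law spectral dependence of the complex aerosol refractive index with four
          free parameters: (mr0, alpha_r) for the real part, (mi0, alpha_i) for the imaginary part."""
          wl = np.asarray(wavelengths, dtype=float) / wl_ref
          return mr0 * wl ** (-alpha_r) + 1j * mi0 * wl ** (-alpha_i)

      # Example over three visible wavelengths (nm); parameter values are arbitrary
      m = powerlaw_refractive_index([440.0, 550.0, 670.0],
                                    mr0=1.45, alpha_r=0.02, mi0=0.005, alpha_i=0.5)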

  8. A new strategy for snow-cover mapping using remote sensing data and ensemble based systems techniques

    NASA Astrophysics Data System (ADS)

    Roberge, S.; Chokmani, K.; De Sève, D.

    2012-04-01

    The snow cover plays an important role in the hydrological cycle of Quebec (Eastern Canada). Consequently, evaluating its spatial extent interests the authorities responsible for the management of water resources, especially hydropower companies. The main objective of this study is the development of a snow-cover mapping strategy using remote sensing data and ensemble-based systems techniques. Planned to be tested in a near real-time operational mode, this snow-cover mapping strategy has the advantage of providing the probability of a pixel being snow covered together with its uncertainty. Ensemble systems are made of two key components. First, a method is needed to build an ensemble of classifiers that is as diverse as possible. Second, an approach is required to combine the outputs of the individual classifiers that make up the ensemble in such a way that correct decisions are amplified and incorrect ones are cancelled out. In this study, we demonstrate the potential of ensemble systems for snow-cover mapping using remote sensing data. The chosen classifier is a sequential-thresholds algorithm using NOAA-AVHRR data adapted to conditions over Eastern Canada. Its special feature is the use of a combination of six sequential thresholds varying according to the day in the winter season. Two versions of the snow-cover mapping algorithm have been developed: one specific to autumn (from October 1st to December 31st) and the other to spring (from March 16th to May 31st). In order to build the ensemble-based system, different versions of the algorithm are created by randomly varying its parameters. One hundred versions are included in the ensemble. The probability of a pixel being snow, no-snow or cloud covered corresponds to the fraction of classifiers that assigned the pixel to that class. The overall performance of ensemble-based mapping is compared to the overall performance of the chosen classifier, and also with ground observations at meteorological stations.
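
    The voting step lends itself to a short sketch: each ensemble member produces a label map and the per-class probability at a pixel is the fraction of members voting for that class. The label coding and member count below are assumptions for the example.

      import numpy as np

      def ensemble_class_probabilities(label_maps):
          """Per-pixel class probabilities from an ensemble of classifiers:
          the probability of a class is the fraction of members that voted for it.
          label_maps: (n_members, n_pixels) array with 0 = no snow, 1 = snow, 2 = cloud."""
          votes = np.asarray(label_maps)
          n_members = votes.shape[0]
          p_nosnow = (votes == 0).sum(axis=0) / n_members
          p_snow   = (votes == 1).sum(axis=0) / n_members
          p_cloud  = (votes == 2).sum(axis=0) / n_members
          return p_nosnow, p_snow, p_cloud

      # 100 classifier versions (rows) voting on 5 pixels (columns)
      rng = np.random.default_rng(0)
      maps = rng.integers(0, 3, size=(100, 5))
      p_nosnow, p_snow, p_cloud = ensemble_class_probabilities(maps)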

  9. Compressed sensing for energy-efficient wireless telemonitoring of noninvasive fetal ECG via block sparse Bayesian learning.

    PubMed

    Zhang, Zhilin; Jung, Tzyy-Ping; Makeig, Scott; Rao, Bhaskar D

    2013-02-01

    Fetal ECG (FECG) telemonitoring is an important branch of telemedicine. The design of a telemonitoring system via a wireless body area network with low energy consumption for ambulatory use is highly desirable. As an emerging technique, compressed sensing (CS) shows great promise in compressing/reconstructing data with low energy consumption. However, due to some specific characteristics of raw FECG recordings such as non-sparsity and strong noise contamination, current CS algorithms generally fail in this application. This paper proposes to use the block sparse Bayesian learning framework to compress/reconstruct non-sparse raw FECG recordings. Experimental results show that the framework can reconstruct the raw recordings with high quality. In particular, the reconstruction does not destroy the interdependence relations among the multichannel recordings. This ensures that the independent component analysis decomposition of the reconstructed recordings has high fidelity. Furthermore, the framework allows the use of a sparse binary sensing matrix with far fewer nonzero entries to compress recordings; in particular, each column of the matrix can contain only two nonzero entries. This shows that the framework, compared to other algorithms such as current CS algorithms and wavelet algorithms, can greatly reduce CPU code execution in the data compression stage.
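
    A minimal sketch of the compression step with such a sparse binary sensing matrix; the matrix dimensions and the two-nonzeros-per-column layout follow the description above, while the signal itself is a synthetic stand-in for a raw FECG segment.

      import numpy as np

      def sparse_binary_sensing_matrix(m, n, nonzeros_per_column=2, seed=0):
          """Sparse binary sensing matrix: each column holds only a few ones (two here),
          so compressing a segment costs just a handful of additions per measurement."""
          rng = np.random.default_rng(seed)
          phi = np.zeros((m, n))
          for j in range(n):
              rows = rng.choice(m, size=nonzeros_per_column, replace=False)
              phi[rows, j] = 1.0
          return phi

      # Compress a 512-sample segment into 256 measurements
      phi = sparse_binary_sensing_matrix(256, 512)
      x = np.random.default_rng(1).standard_normal(512)   # stand-in for a raw FECG segment
      y = phi @ x                                          # energy-cheap compression step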

  10. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-01-01

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage concerns sensor placement that guarantees both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation of the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with this analysis, we formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm. PMID:28587084
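
    The collaborative detection idea can be sketched as follows: assuming independent detections, the joint probability that at least one sensor detects the target is 1 - ∏(1 - p_i). The distance-decay sensing model and its parameters below are assumptions for illustration.

      import numpy as np

      def collaborative_detection_probability(individual_probs):
          """Probability that at least one of several sensors detects the target,
          assuming independent detections: 1 - prod(1 - p_i)."""
          p = np.asarray(individual_probs, dtype=float)
          return 1.0 - np.prod(1.0 - p)

      def elementary_sensing_probability(distance, r_ref=10.0, decay=0.3):
          """Illustrative probabilistic sensing model: detection probability decays
          exponentially with distance beyond a reference range (parameters assumed)."""
          return float(np.exp(-decay * max(distance - r_ref, 0.0)))

      # A target observed by three sensors at different distances
      probs = [elementary_sensing_probability(d) for d in (12.0, 15.0, 18.0)]
      p_joint = collaborative_detection_probability(probs)   # compare against epsilon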

  11. Infrared super-resolution imaging based on compressed sensing

    NASA Astrophysics Data System (ADS)

    Sui, Xiubao; Chen, Qian; Gu, Guohua; Shen, Xuewei

    2014-03-01

    The theoretical basis of traditional infrared super-resolution imaging methods is the Nyquist sampling theorem. Reconstruction assumes that the relative positions of the infrared objects in the low-resolution image sequence remain fixed, and image restoration amounts to the inverse operation of an ill-posed problem without fixed rules. The super-resolution reconstruction capability, the range of applications and the stability of such reconstruction algorithms are therefore limited. To this end, we propose a super-resolution reconstruction method based on compressed sensing. In the method, we select a Toeplitz matrix as the measurement matrix and realize it by a phase-mask method. We investigate the complementary matching pursuit algorithm and select it as the recovery algorithm. In order to adapt to moving targets and decrease imaging time, we use an area infrared focal plane array to acquire multiple measurements at one time. Theoretically, the method breaks through the Nyquist sampling theorem and can greatly improve the spatial resolution of the infrared image. The final image contrast and experimental data indicate that our method is effective in improving the resolution of infrared images and is superior to some traditional super-resolution imaging methods. The compressed sensing super-resolution method is expected to have wide application prospects.

  12. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors.

    PubMed

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-05-25

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage concerns sensor placement that guarantees both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation of the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with this analysis, we formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm.

  13. Infrared remote sensing of the vertical and horizontal distribution of clouds

    NASA Technical Reports Server (NTRS)

    Chahine, M. T.; Haskins, R. D.

    1982-01-01

    An algorithm has been developed to derive the horizontal and vertical distribution of clouds from the same set of infrared radiance data used to retrieve atmospheric temperature profiles. The method leads to the determination of the vertical atmospheric temperature structure and the cloud distribution simultaneously, providing information on heat sources and sinks, storage rates and transport phenomena in the atmosphere. Experimental verification of this algorithm was obtained using the 15-micron data measured by the NOAA-VTPR temperature sounder. After correcting for water vapor emission, the results show that the cloud cover derived from 15-micron data is less than that obtained from visible data.

  14. Least mean square fourth based microgrid state estimation algorithm using the internet of things technology

    PubMed Central

    2017-01-01

    This paper proposes an innovative internet of things (IoT) based communication framework for monitoring microgrid under the condition of packet dropouts in measurements. First of all, the microgrid incorporating the renewable distributed energy resources is represented by a state-space model. The IoT embedded wireless sensor network is adopted to sense the system states. Afterwards, the information is transmitted to the energy management system using the communication network. Finally, the least mean square fourth algorithm is explored for estimating the system states. The effectiveness of the developed approach is verified through numerical simulations. PMID:28459848
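
    A minimal sketch of a least-mean-fourth adaptive update, in which the weight correction uses the cubed estimation error; the regressor layout, step size and synthetic data are assumptions, not the paper's microgrid model.

      import numpy as np

      def lmf_state_estimate(measurements, regressors, mu=1e-3):
          """Least-mean-fourth adaptive estimate: the weight update uses the cubed error,
          w <- w + mu * e**3 * x, the fourth-order counterpart of the LMS rule."""
          w = np.zeros(regressors.shape[1])
          for x, d in zip(regressors, measurements):
              e = d - x @ w                 # innovation from the latest received sample
              w = w + mu * (e ** 3) * x     # fourth-order cost yields the cubic error term
          return w

      # Toy example: recover a 2-element state vector from noisy sensor readings
      rng = np.random.default_rng(0)
      X = rng.standard_normal((2000, 2))
      true_w = np.array([0.8, -0.3])
      d = X @ true_w + 0.01 * rng.standard_normal(2000)
      w_hat = lmf_state_estimate(d, X)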

  15. Developing a semi-analytical algorithm to estimate particulate organic carbon (POC) levels in inland eutrophic turbid water based on MERIS images: A case study of Lake Taihu

    NASA Astrophysics Data System (ADS)

    Lyu, Heng; Wang, Yannan; Jin, Qi; Shi, Lei; Li, Yunmei; Wang, Qiao

    2017-10-01

    Particulate organic carbon (POC) plays an important role in the carbon cycle in water due to its biological pump process. In the open ocean, algorithms can accurately estimate the surface POC concentration. However, no suitable POC-estimation algorithm based on MERIS bands is available for inland turbid eutrophic water. A total of 228 field samples were collected from Lake Taihu in different seasons between 2013 and 2015. At each site, the optical parameters and water quality were analyzed. Using in situ data, it was found that POC-estimation algorithms developed for the open ocean and coastal waters using remote sensing reflectance were not suitable for inland turbid eutrophic water. The organic suspended matter (OSM) concentration was found to be the best indicator of the POC concentration, and POC has an exponential relationship with the OSM concentration. Through an analysis of the POC concentration and optical parameters, it was found that the absorption peak of total suspended matter (TSM) at 665 nm was the optimum parameter to estimate POC. As a result, MERIS band 7, MERIS band 10 and MERIS band 12 were used to derive the absorption coefficient of TSM at 665 nm, and then, a semi-analytical algorithm was used to estimate the POC concentration for inland turbid eutrophic water. An accuracy assessment showed that the developed semi-analytical algorithm could be successfully applied with a MAPE of 31.82% and RMSE of 2.68 mg/L. The developed algorithm was successfully applied to a MERIS image, and two full-resolution MERIS images, acquired on August 13, 2010, and December 7, 2010, were used to map the POC spatial distribution in Lake Taihu in summer and winter.

  16. Actual daily evapotranspiration estimated from MERIS and AATSR data over the Chinese Loess Plateau

    NASA Astrophysics Data System (ADS)

    Liu, R.; Wen, J.; Wang, X.; Wang, L.; Tian, H.; Zhang, T. T.; Shi, X. K.; Zhang, J. H.; Lu, Sh. N.

    2009-02-01

    The Loess Plateau is located in the north of China and has a significant impact on the climate and ecosystem evolution over the East Asian continent. Based on land surface energy balance theory, the potential of using Medium Resolution Imaging Spectrometer (an onboard sensor of the Environmental Satellite) remote sensing data from 7, 11 and 27 June 2005 is explored. The split-window algorithm is used to retrieve surface temperature from the Advanced Along-Track Scanning Radiometer, another onboard sensor of the Environmental Satellite. Then the near-surface net radiation, sensible heat flux and soil heat flux are estimated using the developed algorithm. We introduce a simple algorithm to predict the heat flux partitioning between the soil and vegetation. Combining the sunshine hours, air temperature, sunshine duration and wind speed measured by weather stations, a model for estimating daily ET is proposed, and the instantaneous ET is converted to a daily value. Comparing the latent heat flux retrieved from remote sensing data with ground observations from the eddy covariance flux system during the Loess Plateau land surface process field experiment, the maximum and minimum errors of this approach are 10.96% and 4.80%, respectively; the cause of the bias is also explored and discussed.

  17. Study on Landslide Disaster Extraction Method Based on Spaceborne SAR Remote Sensing Images - Take Alos Palsar for AN Example

    NASA Astrophysics Data System (ADS)

    Xue, D.; Yu, X.; Jia, S.; Chen, F.; Li, X.

    2018-04-01

    In this paper, sequence ALOS PALSAR data and airborne SAR data in L-band from June 5, 2008 to September 8, 2015 are used. Based on research into SAR data preprocessing and core algorithms such as geocoding, registration, filtering, phase unwrapping and baseline estimation, the improved Goldstein filtering algorithm and the branch-cut path-tracking algorithm are used to unwrap the phase. The DEM and surface deformation information of the experimental area were extracted. Combining SAR-specific geometry and differential interferometry, and building on composite analysis of multi-source images, a landslide detection method that incorporates SAR image coherence is developed, which compensates for the limited acquisition capability of SAR or optical remote sensing alone. Especially in bad weather and abnormal climate areas, the speed of disaster emergency response and the accuracy of extraction are improved. It is found that the deformation in this area is greatly affected by faults; there is a tendency towards uplift in the southeast plain and the western mountainous area, while the southwest part of the mountain area tends to subside. This result provides a basis for decision-making in local disaster prevention and control.

  18. Spatiotemporal Local-Remote Sensor Fusion (ST-LRSF) for Cooperative Vehicle Positioning

    PubMed Central

    Bhawiyuga, Adhitya

    2018-01-01

    Vehicle positioning plays an important role in the design of protocols, algorithms, and applications in the intelligent transport systems. In this paper, we present a new framework of spatiotemporal local-remote sensor fusion (ST-LRSF) that cooperatively improves the accuracy of absolute vehicle positioning based on two state estimates of a vehicle in the vicinity: a local sensing estimate, measured by the on-board exteroceptive sensors, and a remote sensing estimate, received from neighbor vehicles via vehicle-to-everything communications. Given both estimates of vehicle state, the ST-LRSF scheme identifies the set of vehicles in the vicinity, determines the reference vehicle state, proposes a spatiotemporal dissimilarity metric between two reference vehicle states, and presents a greedy algorithm to compute a minimal weighted matching (MWM) between them. Given the outcome of MWM, the theoretical position uncertainty of the proposed refinement algorithm is proven to be inversely proportional to the square root of matching size. To further reduce the positioning uncertainty, we also develop an extended Kalman filter model with the refined position of ST-LRSF as one of the measurement inputs. The numerical results demonstrate that the proposed ST-LRSF framework can achieve high positioning accuracy for many different scenarios of cooperative vehicle positioning. PMID:29617341

  19. Reconstruction of in-plane strain maps using hybrid dense sensor network composed of sensing skin

    NASA Astrophysics Data System (ADS)

    Downey, Austin; Laflamme, Simon; Ubertini, Filippo

    2016-12-01

    The authors have recently developed a soft-elastomeric capacitive (SEC)-based thin film sensor for monitoring strain on mesosurfaces. Arranged in a network configuration, the sensing system is analogous to a biological skin, where local strain can be monitored over a global area. Under plane stress conditions, the sensor output contains the additive measurement of the two principal strain components over the monitored surface. In applications where the evaluation of strain maps is useful, in structural health monitoring for instance, such signal must be decomposed into linear strain components along orthogonal directions. Previous work has led to an algorithm that enabled such decomposition by leveraging a dense sensor network configuration with the addition of assumed boundary conditions. Here, we significantly improve the algorithm’s accuracy by leveraging mature off-the-shelf solutions to create a hybrid dense sensor network (HDSN) to improve on the boundary condition assumptions. The system’s boundary conditions are enforced using unidirectional RSGs and assumed virtual sensors. Results from an extensive experimental investigation demonstrate the good performance of the proposed algorithm and its robustness with respect to sensors’ layout. Overall, the proposed algorithm is seen to effectively leverage the advantages of a hybrid dense network for application of the thin film sensor to reconstruct surface strain fields over large surfaces.

  20. Expansion of Smartwatch Touch Interface from Touchscreen to Around Device Interface Using Infrared Line Image Sensors

    PubMed Central

    Lim, Soo-Chul; Shin, Jungsoon; Kim, Seung-Chan; Park, Joonah

    2015-01-01

    Touchscreen interaction has become a fundamental means of controlling mobile phones and smartwatches. However, the small form factor of a smartwatch limits the available interactive surface area. To overcome this limitation, we propose the expansion of the touch region of the screen to the back of the user's hand. We developed a touch module for sensing the touched finger position on the back of the hand using infrared (IR) line image sensors, based on the calibrated IR intensity and the maximum intensity region of an IR array. For a complete touch-sensing solution, a gyroscope installed in the smartwatch is used to read wrist gestures. The gyroscope incorporates a dynamic time warping gesture recognition algorithm for eliminating unintended touch inputs during the free motion of the wrist while wearing the smartwatch. The prototype of the developed sensing module was implemented in a commercial smartwatch, and it was confirmed that the sensed positional information of the finger touching the back of the hand could be used to control the smartwatch graphical user interface. Our system not only affords a novel experience for smartwatch users, but also provides a basis for developing other useful interfaces. PMID:26184202
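
    The gesture-rejection step above relies on dynamic time warping; a minimal sketch is given below, with the gesture templates, signal lengths and acceptance threshold assumed purely for illustration.

      import numpy as np

      def dtw_distance(a, b):
          """Dynamic time warping distance between two 1-D gesture signals
          (e.g. gyroscope angular-rate traces); smaller means more similar."""
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      # Treat a wrist movement as unintended if it is far from every gesture template
      templates = {"flick": np.sin(np.linspace(0, np.pi, 30)),
                   "rotate": np.linspace(0.0, 1.0, 30)}
      rng = np.random.default_rng(0)
      observed = np.sin(np.linspace(0, np.pi, 40)) + 0.05 * rng.standard_normal(40)
      best = min(dtw_distance(observed, t) for t in templates.values())
      is_intended_gesture = best < 5.0     # the threshold is an assumed tuning parameter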

  1. Modification of the random forest algorithm to avoid statistical dependence problems when classifying remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Cánovas-García, Fulgencio; Alonso-Sarría, Francisco; Gomariz-Castillo, Francisco; Oñate-Valdivieso, Fernando

    2017-06-01

    Random forest is a classification technique widely used in remote sensing. One of its advantages is that it produces an estimate of classification accuracy based on the so-called out-of-bag cross-validation method. It is usually assumed that this estimate is unbiased and may be used instead of validation based on an external data set or cross-validation external to the algorithm. In this paper we show that this is not necessarily the case when classifying remote sensing imagery using training areas with several pixels or objects. According to our results, out-of-bag cross-validation clearly overestimates accuracy, both overall and per class. The reason is that, within a training patch, pixels or objects are not statistically independent of each other; however, they are split by bootstrapping into in-bag and out-of-bag sets as if they were. We believe that putting the whole patch, rather than its pixels or objects, into one set or the other would produce a less biased out-of-bag cross-validation. To deal with the problem, we propose a modification of the random forest algorithm that splits training patches instead of the pixels (or objects) that compose them. This modified algorithm does not overestimate accuracy and has no lower predictive capability than the original. When its results are validated with an external data set, the accuracy is not different from that obtained with the original algorithm. We analysed three remote sensing images with different classification approaches (pixel- and object-based); in the three cases reported, the modification we propose produces a less biased accuracy estimation.
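
    The proposed patch-level split can be sketched as a bootstrap drawn over patch identifiers rather than individual pixels; this is a simplified illustration of the idea, not the authors' implementation, and the patch layout below is assumed.

      import numpy as np

      def patchwise_bootstrap(patch_ids, seed=0):
          """Bootstrap at the level of whole training patches: every pixel/object of a
          sampled patch goes in-bag together, so in-bag and out-of-bag sets never share
          pixels from the same (spatially correlated) patch."""
          rng = np.random.default_rng(seed)
          patch_ids = np.asarray(patch_ids)
          unique_patches = np.unique(patch_ids)
          sampled = rng.choice(unique_patches, size=unique_patches.size, replace=True)
          in_bag_mask = np.isin(patch_ids, sampled)
          return np.where(in_bag_mask)[0], np.where(~in_bag_mask)[0]

      # Six pixels belonging to three training patches
      in_bag_idx, oob_idx = patchwise_bootstrap([0, 0, 1, 1, 2, 2])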

  2. Optimisation of sensing time and transmission time in cognitive radio-based smart grid networks

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Fu, Yuli; Yang, Junjie

    2016-07-01

    Cognitive radio (CR)-based smart grid (SG) networks have been widely recognised as emerging communication paradigms in power grids. However, sufficient spectrum resources and reliability are two major challenges for real-time applications in CR-based SG networks. In this article, we study the traffic data collection problem. Based on a two-stage power pricing model, the power price is associated with the effectively received traffic data in a meter data management system (MDMS). In order to minimise the system power price, a wideband hybrid access strategy is proposed and analysed to share the spectrum between the SG nodes and CR networks. The sensing time and transmission time are jointly optimised, while both the interference to primary users and the spectrum opportunity loss of secondary users are considered. Two algorithms are proposed to solve the joint optimisation problem. Simulation results show that the proposed joint optimisation algorithms outperform fixed-parameter (sensing time and transmission time) algorithms, and the power cost is reduced efficiently.

  3. TH-E-17A-06: Anatomical-Adaptive Compressed Sensing (AACS) Reconstruction for Thoracic 4-Dimensional Cone-Beam CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shieh, C; Kipritidis, J; OBrien, R

    2014-06-15

    Purpose: The Feldkamp-Davis-Kress (FDK) algorithm currently used for clinical thoracic 4-dimensional (4D) cone-beam CT (CBCT) reconstruction suffers from noise and streaking artifacts due to projection under-sampling. Compressed sensing theory enables reconstruction of under-sampled datasets via total-variation (TV) minimization, but TV-minimization algorithms such as adaptive-steepest-descent-projection-onto-convex-sets (ASD-POCS) often converge slowly and are prone to over-smoothing anatomical details. These disadvantages can be overcome by incorporating general anatomical knowledge via anatomy segmentation. Based on this concept, we have developed an anatomical-adaptive compressed sensing (AACS) algorithm for thoracic 4D-CBCT reconstruction. Methods: AACS is based on the ASD-POCS framework, where each iteration consists of a TV-minimization step and a data fidelity constraint step. Prior to every AACS iteration, four major thoracic anatomical structures - soft tissue, lungs, bony anatomy, and pulmonary details - were segmented from the updated solution image. Based on the segmentation, an anatomical-adaptive weighting was applied to the TV-minimization step, so that TV-minimization was enhanced at noisy/streaky regions and suppressed at anatomical structures of interest. The image quality and convergence speed of AACS were compared to conventional ASD-POCS using an XCAT digital phantom and a patient scan. Results: For the XCAT phantom, the AACS image represented the ground truth better than the ASD-POCS image, giving a higher structural similarity index (0.93 vs. 0.84) and lower absolute difference (1.1×10^4 vs. 1.4×10^4). For the patient case, while both algorithms resulted in much less noise and streaking than FDK, the AACS image showed considerably better contrast and sharpness of the vessels, tumor, and fiducial marker than the ASD-POCS image. In addition, AACS converged over 50% faster than ASD-POCS in both cases. Conclusions: The proposed AACS algorithm was shown to reconstruct thoracic 4D-CBCT images more accurately and with faster convergence compared to ASD-POCS. The superior image quality and rapid convergence make AACS promising for future clinical use.

  4. A framework for evaluating mixture analysis algorithms

    NASA Astrophysics Data System (ADS)

    Dasaratha, Sridhar; Vignesh, T. S.; Shanmukh, Sarat; Yarra, Malathi; Botonjic-Sehic, Edita; Grassi, James; Boudries, Hacene; Freeman, Ivan; Lee, Young K.; Sutherland, Scott

    2010-04-01

    In recent years, several sensing devices capable of identifying unknown chemical and biological substances have been commercialized. The success of these devices in analyzing real-world samples depends on the ability of the on-board identification algorithm to de-convolve spectra of substances that are mixtures. To develop effective de-convolution algorithms, it is critical to characterize the relationship between the spectral features of a substance and its probability of detection within a mixture, as these features may be similar to or overlap with those of other substances in the mixture and in the library. While it has been recognized that these aspects pose challenges to mixture analysis, a systematic effort to quantify spectral characteristics and their impact is generally lacking. In this paper, we propose metrics that can be used to quantify these spectral features. Some of these metrics, such as a modification of the variance inflation factor, are derived from classical statistical measures used in regression diagnostics. We demonstrate that these metrics can be correlated to the accuracy of the substance's identification in a mixture. We also develop a framework for characterizing mixture analysis algorithms using these metrics. Experimental results are then provided to show the application of this framework to the evaluation of various algorithms, including one that has been developed for a commercial device. The illustration is based on synthetic mixtures created from pure-component Raman spectra measured on a portable device.
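
    A minimal sketch of a classical variance inflation factor computed for one library spectrum against the rest; the paper uses a modified variant, and the regression set-up and synthetic spectra below are assumptions for illustration.

      import numpy as np

      def variance_inflation_factor(spectra, index):
          """Classical VIF of one library spectrum with respect to the others:
          VIF = 1 / (1 - R^2), with R^2 from regressing that spectrum on the rest.
          Large values flag spectra nearly collinear with other library entries."""
          X = np.asarray(spectra, dtype=float)            # (n_components, n_channels)
          y = X[index]
          others = np.delete(X, index, axis=0).T          # channels x remaining components
          A = np.column_stack([others, np.ones(len(y))])  # include an intercept
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          residual = y - A @ coef
          r2 = 1.0 - residual.var() / y.var()
          return 1.0 / max(1.0 - r2, 1e-12)

      # Three synthetic Raman-like spectra; the third is nearly a blend of the first two
      grid = np.linspace(0.0, 10.0, 200)
      s1 = np.exp(-(grid - 3.0) ** 2)
      s2 = np.exp(-(grid - 6.0) ** 2)
      s3 = 0.5 * s1 + 0.5 * s2 + 0.01 * np.random.default_rng(0).standard_normal(grid.size)
      vif_s3 = variance_inflation_factor([s1, s2, s3], index=2)   # expected to be large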

  5. Observability and Estimation of Distributed Space Systems via Local Information-Exchange Networks

    NASA Technical Reports Server (NTRS)

    Fathpour, Nanaz; Hadaegh, Fred Y.; Mesbahi, Mehran; Rahmani, Amirreza

    2011-01-01

    Spacecraft formation flying involves the coordination of states among multiple spacecraft through relative sensing, inter-spacecraft communication, and control. Most existing formation-flying estimation algorithms can only be supported via highly centralized, all-to-all, static relative sensing. New algorithms are proposed that are scalable, modular, and robust to variations in the topology and link characteristics of the formation exchange network. These distributed algorithms rely on a local information-exchange network, relaxing the assumptions of existing algorithms. Distributed space systems rely on a signal transmission network among multiple spacecraft for their operation. Control and coordination among multiple spacecraft in a formation are facilitated via a network of relative sensing and inter-spacecraft communications. Guidance, navigation, and control rely on the sensing network. This network becomes more complex as more spacecraft are added, or as mission requirements become more complex. The observability of the formation state from a set of local observations at a particular node in the formation was examined. Formation observability can be parameterized in terms of the matrices appearing in the formation dynamics and observation matrices. An agreement protocol was used as a mechanism for observing formation states from local measurements. An agreement protocol is essentially an unforced dynamic system whose trajectory is governed by the interconnection geometry and the initial condition of each node, with the goal of reaching a common value of interest. The observability of the interconnected system depends on the geometry of the network, as well as the position of the observer relative to the topology. For the first time, critical GN&C (guidance, navigation, and control estimation) subsystems are synthesized by bringing the contribution of the spacecraft information-exchange network to the forefront of algorithmic analysis and design. The result is a formation estimation algorithm that is modular and robust to variations in the topology and link properties of the underlying formation network.
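
    The agreement protocol described above can be illustrated with a discrete-time consensus iteration driven by a graph Laplacian; the three-node path graph, step size and initial values below are assumptions chosen only to show convergence to a common value.

      import numpy as np

      def agreement_protocol(x0, laplacian, step=0.1, n_steps=200):
          """Discrete-time agreement (consensus) protocol: every node repeatedly moves
          toward its neighbours, x <- x - step * L x; on a connected graph all node
          values converge to a common value."""
          x = np.array(x0, dtype=float)
          L = np.asarray(laplacian, dtype=float)
          for _ in range(n_steps):
              x = x - step * (L @ x)
          return x

      # Path graph over three spacecraft: 1 -- 2 -- 3
      L = np.array([[ 1.0, -1.0,  0.0],
                    [-1.0,  2.0, -1.0],
                    [ 0.0, -1.0,  1.0]])
      consensus = agreement_protocol([0.0, 5.0, 10.0], L)   # all entries approach 5.0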

  6. Cooperative remote sensing and actuation using networked unmanned vehicles

    NASA Astrophysics Data System (ADS)

    Chao, Haiyang

    This dissertation focuses on how to design and employ networked unmanned vehicles for remote sensing and distributed control purposes in the current information-rich world. The target scenarios are environmental or agricultural applications such as river/reservoir surveillance, wind profile measurement, and monitoring/control of chemical leaks. AggieAir, a small and low-cost unmanned aircraft system, is designed based on the remote sensing requirements of environmental monitoring missions. The state estimation problem and the advanced lateral flight controller design problem are then addressed for the small unmanned aerial vehicle (UAV) platform. UAV-based remote sensing is examined next, supported by further flight test results. Given the measurements from unmanned vehicles, actuation algorithms are needed for missions such as diffusion control. A consensus-based centroidal Voronoi tessellation (CVT) algorithm is proposed for better control of the diffusion process. Finally, the dissertation conclusions and suggestions for new research are presented.
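
    A rough sketch of the (non-networked) centroidal Voronoi tessellation step that such coverage algorithms build on is given below in Python/NumPy; the density field, domain discretization, and iteration count are illustrative assumptions, and the dissertation's consensus-based variant additionally exchanges centroid estimates over the vehicle network.

      import numpy as np

      def lloyd_cvt(agents, grid, density, iters=30):
          """agents: (m,2) positions; grid: (N,2) sample points; density: (N,) weights."""
          for _ in range(iters):
              # assign each grid cell to its nearest agent (Voronoi partition)
              d2 = ((grid[:, None, :] - agents[None, :, :]) ** 2).sum(-1)
              owner = d2.argmin(axis=1)
              for k in range(len(agents)):
                  mask = owner == k
                  w = density[mask]
                  if w.sum() > 0:               # move the agent to its weighted centroid
                      agents[k] = (grid[mask] * w[:, None]).sum(0) / w.sum()
          return agents

      if __name__ == "__main__":
          xs, ys = np.meshgrid(np.linspace(0, 1, 60), np.linspace(0, 1, 60))
          grid = np.column_stack([xs.ravel(), ys.ravel()])
          # toy "pollutant concentration" peak that the vehicles should concentrate around
          density = np.exp(-20 * ((grid[:, 0] - 0.7) ** 2 + (grid[:, 1] - 0.4) ** 2))
          rng = np.random.default_rng(1)
          print(lloyd_cvt(rng.random((5, 2)), grid, density).round(3))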

  7. Separating vegetation and soil temperature using airborne multiangular remote sensing image data

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Yan, Chunyan; Xiao, Qing; Yan, Guangjian; Fang, Li

    2012-07-01

    Land surface temperature (LST) is a key parameter in land process research. Many research efforts have been devoted to increasing the accuracy of LST retrieval from remote sensing. However, because natural land surfaces are non-isothermal, component temperatures are also required in applications such as evapotranspiration (ET) modeling. This paper proposes a new algorithm to separately retrieve vegetation temperature and soil background temperature from multiangular thermal infrared (TIR) remote sensing data. The algorithm is based on the localized correlation between the visible/near-infrared (VNIR) bands and the TIR band. The method was tested on airborne image data acquired during the Watershed Allied Telemetry Experimental Research (WATER) campaign. Preliminary validation indicates that the remote sensing-retrieved results reflect the spatial and temporal trends of the component temperatures. The accuracy is within three degrees, while the difference between vegetation and soil temperatures can be as large as twenty degrees.
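
    The sketch below shows the classic two-angle, two-component decomposition that such retrievals build on (it is not the paper's localized VNIR/TIR correlation method); the T^n linear-mixing approximation, view-angle vegetation fractions, and brightness temperatures are illustrative assumptions.

      import numpy as np

      def separate_components(T_bright, f_veg, n=4.0):
          """Solve T_view_i**n = f_i * T_veg**n + (1 - f_i) * T_soil**n for the
          component temperatures, given two view angles (e.g. nadir and off-nadir)."""
          f_veg = np.asarray(f_veg, dtype=float)
          T_bright = np.asarray(T_bright, dtype=float)
          A = np.column_stack([f_veg, 1.0 - f_veg])
          x = np.linalg.solve(A, T_bright ** n)
          return x[0] ** (1.0 / n), x[1] ** (1.0 / n)   # (T_veg, T_soil) in kelvin

      if __name__ == "__main__":
          # the off-nadir view sees a larger vegetation fraction than the nadir view;
          # these synthetic inputs recover roughly 298 K vegetation and 308 K soil
          T_veg, T_soil = separate_components(T_bright=[303.6, 301.1],
                                              f_veg=[0.45, 0.70])
          print(round(T_veg, 1), round(T_soil, 1))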

  8. Three-Dimensional Inverse Transport Solver Based on Compressive Sensing Technique

    NASA Astrophysics Data System (ADS)

    Cheng, Yuxiong; Wu, Hongchun; Cao, Liangzhi; Zheng, Youqi

    2013-09-01

    Based on direct exposure measurements from flash radiographic images, a compressive sensing-based method for the three-dimensional inverse transport problem is presented. The linear absorption coefficients and interface locations of objects are reconstructed directly and simultaneously. It is always very expensive to obtain enough measurements. With limited measurements, the compressive sensing sparse-reconstruction technique orthogonal matching pursuit (OMP) is applied to obtain the sparse coefficients by solving an optimization problem. A three-dimensional inverse transport solver is developed based on this compressive sensing technique. The solver has three features: (1) AutoCAD is employed as a geometry preprocessor because of its powerful graphics capabilities. (2) The forward projection matrix, rather than a Gauss matrix, is constructed by the visualization tool generator. (3) Fourier and Daubechies wavelet transforms are adopted to convert an underdetermined system into a well-posed system within the algorithm. Simulations are performed, and numerical results for the pseudo-sine absorption, two-cube, and two-cylinder problems obtained with the compressive sensing-based solver agree well with the reference values.
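
    A bare-bones orthogonal matching pursuit of the kind referred to above is sketched in Python/NumPy below; the sensing matrix and sparsity level are synthetic stand-ins, and the construction of the forward projection matrix from AutoCAD geometry is not reproduced.

      import numpy as np

      def omp(A, y, n_nonzero, tol=1e-8):
          """Greedy sparse recovery for y = A x: pick the column most correlated
          with the residual, re-fit on the selected support by least squares, repeat."""
          m, n = A.shape
          support, resid = [], y.copy()
          x = np.zeros(n)
          for _ in range(n_nonzero):
              j = int(np.argmax(np.abs(A.T @ resid)))
              if j not in support:
                  support.append(j)
              coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
              resid = y - A[:, support] @ coef
              if np.linalg.norm(resid) < tol:
                  break
          x[support] = coef
          return x

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          A = rng.standard_normal((60, 200))
          x_true = np.zeros(200); x_true[[3, 50, 120]] = [1.5, -2.0, 0.7]
          x_hat = omp(A, A @ x_true, n_nonzero=3)
          print(np.flatnonzero(x_hat), x_hat[np.flatnonzero(x_hat)].round(2))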

  9. The design and performance characteristics of a cellular logic 3-D image classification processor

    NASA Astrophysics Data System (ADS)

    Ankeney, L. A.

    1981-04-01

    The introduction of high-resolution scanning laser radar systems capable of collecting range and reflectivity images is predicted to significantly influence the development of processors that perform autonomous target classification tasks. Actively sensed range images are shown to be superior to passively collected infrared images in both image stability and information content. An illustrated tutorial introduces cellular logic (neighborhood) transformations and the two- and three-dimensional erosion and dilation operations used for noise filtering and geometric shape measurement. A unique 'cookbook' approach to selecting a sequence of neighborhood transformations suitable for object measurement is developed and related to false alarm rate and algorithm effectiveness measures. The cookbook design approach is used to develop an algorithm that classifies objects based on their 3-D geometrical features. A Monte Carlo performance analysis demonstrates the utility of the design approach by characterizing the ability of the algorithm to classify randomly positioned three-dimensional objects in the presence of additive noise, scale variations, and other forms of image distortion.
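
    The kind of 3-D neighborhood (cellular logic) operations described above can be sketched with SciPy's binary morphology, as below; the voxel volume, noise level, and structuring element are illustrative choices rather than the paper's processor design.

      import numpy as np
      from scipy import ndimage

      # a noisy 3-D occupancy volume containing a solid block
      volume = np.zeros((20, 20, 20), dtype=bool)
      volume[5:15, 5:15, 5:15] = True
      rng = np.random.default_rng(0)
      volume |= rng.random(volume.shape) < 0.02            # sprinkle salt noise

      structure = ndimage.generate_binary_structure(3, 1)  # 6-connected neighborhood

      # "opening" = erosion followed by dilation: removes isolated noise voxels
      cleaned = ndimage.binary_dilation(ndimage.binary_erosion(volume, structure),
                                        structure)

      # a simple geometric feature usable for classification: voxel count after opening
      print(volume.sum(), cleaned.sum())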

  10. Detect and Avoid (DAA) Automation Maneuver Study

    DTIC Science & Technology

    2017-02-01

    The study described herein was an operator-in-the-loop assessment supporting the development of a Sense and Avoid (SAA) display that enables effective teaming of an Unmanned Aerial Systems (UAS) operator with an advanced SAA maneuver algorithm to safely avoid proximal air traffic. This study examined performance differences between candidate SAA display configurations and automation thresholds while UAS operators

  11. A Novel approach for monitoring cyanobacterial blooms using an ensemble based system from MODIS imagery downscaled to 250 metres spatial resolution

    NASA Astrophysics Data System (ADS)

    El Alem, A.; Chokmani, K.; Laurion, I.; El-Adlouni, S. E.

    2014-12-01

    Owing to the sensitivity of inland freshwaters to harmful algal bloom (HAB) development and the limited coverage of standard monitoring programs, remote sensing data have become increasingly used for monitoring HAB extent. Usually, HAB monitoring using remote sensing data is based on empirical and semi-empirical models. Developing such models requires a large number of continuous in situ measurements to reach acceptable accuracy. However, ministries and water management organizations often rely on two thresholds, established by the World Health Organization, to determine water quality. Consequently, the available data are ordinal («semi-qualitative») and mostly unexploited. Using such databases together with remote sensing data and statistical classification algorithms can produce hazard management maps linked to the presence of cyanobacteria. Unlike standard classification algorithms, which are generally unstable, classifiers based on ensemble systems are more general and stable. In the present study, an ensemble-based classifier was developed and compared to a standard classification method, CART (Classification and Regression Tree), for HAB monitoring in freshwaters using MODIS images downscaled to 250 m spatial resolution and ordinal in situ data. Calibration and validation data on cyanobacteria densities were collected by the Ministère du Développement durable, de l'Environnement et de la Lutte contre les changements climatiques on 22 water bodies between 2000 and 2010. These data comprise three density classes: waters poorly (< 20,000 cells mL-1), moderately (20,000 - 100,000 cells mL-1), and highly (> 100,000 cells mL-1) loaded with cyanobacteria. Results highlighted that inland waters exhibit spectral responses distinct enough to be classified into the three classes above for water quality monitoring. On the other hand, although the accuracy of the proposed approach (Kappa index = 0.86) is slightly lower than that of the CART algorithm (Kappa index = 0.87), its robustness is higher, with a standard deviation of 0.05 versus 0.06, specifically when applied to MODIS images. An accurate, robust, and fast approach is thus proposed for daily near-real-time monitoring of HAB in southern Quebec freshwaters.
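
    A toy illustration of the stability argument, contrasting a single CART tree with a bagged ensemble of trees on synthetic reflectance data with three ordinal bloom classes, is sketched below (Python with scikit-learn); the features, class boundaries, and cross-validation setup are invented and do not reproduce the study's classifier or data.

      import numpy as np
      from sklearn.ensemble import BaggingClassifier
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.random((600, 4))                       # 4 synthetic band reflectances
      score = 2.0 * X[:, 0] - X[:, 2] + 0.2 * rng.standard_normal(600)
      y = np.digitize(score, [0.4, 1.0])             # 3 ordinal density classes

      cart = DecisionTreeClassifier(random_state=0)
      ensemble = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                                   random_state=0)

      for name, model in [("single CART tree", cart), ("bagged ensemble", ensemble)]:
          scores = cross_val_score(model, X, y, cv=10)
          print(f"{name}: mean accuracy {scores.mean():.2f}, std {scores.std():.2f}")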

  12. Compressed sensing with gradient total variation for low-dose CBCT reconstruction

    NASA Astrophysics Data System (ADS)

    Seo, Chang-Woo; Cha, Bo Kyung; Jeon, Seongchae; Huh, Young; Park, Justin C.; Lee, Byeonghun; Baek, Junghee; Kim, Eunyoung

    2015-06-01

    This paper describes the improvement in convergence speed obtained with a gradient total variation (GTV) term in compressed sensing (CS) for low-dose cone-beam computed tomography (CBCT) reconstruction. We derive a fast algorithm for constrained total variation (TV)-based reconstruction from a minimum number of noisy projections. To achieve this, we combine the GTV with a TV-norm regularization term to promote sparsity in the X-ray attenuation characteristics of the human body. The GTV is derived from the TV and is computationally more efficient and faster to converge to a desired solution. The numerical algorithm is simple and converges relatively quickly. We apply a gradient projection algorithm that iteratively seeks a solution in the direction of the projected gradient while enforcing non-negativity of the solution. In comparison with the Feldkamp-Davis-Kress (FDK) and conventional TV algorithms, the proposed GTV algorithm converged in 18 or fewer iterations, whereas the original TV algorithm needed at least 34 iterations, when reconstructing the chest phantom images from 50% fewer projections than used by the FDK algorithm. Future investigation includes improving imaging quality, particularly regarding X-ray cone-beam scatter and motion artifacts in CBCT reconstruction.
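
    A generic gradient-projection iteration for a TV-regularized, non-negativity-constrained least-squares problem, of the kind described above, is sketched below in Python/NumPy; it is not the paper's GTV scheme, and the toy projection operator, step size, and regularization weight are illustrative assumptions.

      import numpy as np

      def tv_grad(img, eps=1e-3):
          """Approximate gradient of a smoothed isotropic total-variation term."""
          dx = np.diff(img, axis=1, append=img[:, -1:])
          dy = np.diff(img, axis=0, append=img[-1:, :])
          mag = np.sqrt(dx ** 2 + dy ** 2 + eps)
          px, py = dx / mag, dy / mag
          div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
          return -div

      def reconstruct(A, b, shape, lam=0.05, step=0.2, iters=300):
          x = np.zeros(shape)
          for _ in range(iters):
              grad = (A.T @ (A @ x.ravel() - b)).reshape(shape) + lam * tv_grad(x)
              x = np.maximum(x - step * grad, 0.0)      # project onto x >= 0
          return x

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          phantom = np.zeros((32, 32)); phantom[8:24, 8:24] = 1.0
          A = rng.standard_normal((600, 32 * 32)) / 32  # toy underdetermined projector
          b = A @ phantom.ravel() + 0.01 * rng.standard_normal(600)
          rec = reconstruct(A, b, phantom.shape)
          err = np.linalg.norm(rec - phantom) / np.linalg.norm(phantom)
          print(f"relative reconstruction error: {err:.3f}")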

  13. Remote sensing of Earth terrain

    NASA Technical Reports Server (NTRS)

    Kong, Jin Au; Shin, Robert T.; Nghiem, Son V.; Yueh, Herng-Aung; Han, Hsiu C.; Lim, Harold H.; Arnold, David V.

    1990-01-01

    Remote sensing of earth terrain is examined. The layered random medium model is used to investigate the fully polarimetric scattering of electromagnetic waves from vegetation. The model is used to interpret measured data for vegetation fields such as rice, wheat, or soybean over water or soil. Accurate calibration of polarimetric radar systems is essential for polarimetric remote sensing of earth terrain, and a polarimetric calibration algorithm using three arbitrary in-scene reflectors is developed. In the interpretation of active and passive microwave remote sensing data from earth terrain, the random medium model is shown to be quite successful. A multivariate K-distribution is proposed to model the statistics of fully polarimetric radar returns from earth terrain. In terrain cover classification using synthetic aperture radar (SAR) images, application of the K-distribution model provides better performance than conventional Gaussian classifiers. The layered random medium model is also used to study the polarimetric response of sea ice. Supervised and unsupervised classification procedures are developed and applied to polarimetric SAR images in order to identify their various earth terrain components when more than two classes are present. These procedures were applied to San Francisco Bay and Traverse City SAR images.

  14. A stochastic atmospheric model for remote sensing applications

    NASA Technical Reports Server (NTRS)

    Turner, R. E.

    1983-01-01

    Many factors reduce the accuracy of classification of objects in satellite remote sensing of Earth's surface. One important factor is the variability in the scattering and absorptive properties of atmospheric components such as particulates and the variable gases. For multispectral remote sensing of the Earth's surface in the visible and infrared parts of the spectrum, atmospheric particulates are a major source of variability in the received signal. It is difficult to design a sensor that will determine the unknown atmospheric components by remote sensing methods, at least to the accuracy needed for multispectral classification. The problem of spatial and temporal variations in the atmospheric quantities that can affect the measured radiances is examined. A method based on the stochastic nature of the atmospheric components was developed, and, using actual data, the statistical parameters needed for inclusion in a radiometric model were generated. Methods are then described for an improved correction of radiances. These algorithms result in a more accurate and consistent classification procedure.
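
    As a toy illustration of why a stochastic treatment matters, the sketch below propagates a randomly varying aerosol optical depth through a highly simplified single-band radiance model and shows the spread it induces in retrieved surface reflectance (Python/NumPy); the model form and every coefficient are invented assumptions, not the paper's radiometric model.

      import numpy as np

      rng = np.random.default_rng(0)
      E, mu = 1500.0, 0.9                    # solar irradiance and view-angle cosine (toy)
      tau = np.clip(rng.normal(0.25, 0.08, 10_000), 0.05, None)  # stochastic optical depth
      t = np.exp(-tau / mu)                  # crude transmittance
      L_path = 40.0 * tau                    # path radiance assumed to grow with aerosol

      rho_true = 0.20
      L_sensor = L_path + t * rho_true * E / np.pi   # simulated at-sensor radiance

      # retrieval that only knows the mean atmosphere
      t_mean, L_path_mean = np.exp(-tau.mean() / mu), 40.0 * tau.mean()
      rho_hat = (L_sensor - L_path_mean) * np.pi / (t_mean * E)

      print(f"retrieved reflectance: {rho_hat.mean():.3f} +/- {rho_hat.std():.3f}")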

  15. Reconstructing high-dimensional two-photon entangled states via compressive sensing

    PubMed Central

    Tonolini, Francesco; Chan, Susan; Agnew, Megan; Lindsay, Alan; Leach, Jonathan

    2014-01-01

    Accurately establishing the state of large-scale quantum systems is an important tool in quantum information science; however, the large number of unknown parameters hinders the rapid characterisation of such states, and reconstruction procedures can become prohibitively time-consuming. Compressive sensing, a procedure for solving inverse problems by incorporating prior knowledge about the form of the solution, provides an attractive alternative to the problem of high-dimensional quantum state characterisation. Using a modified version of compressive sensing that incorporates the principles of singular value thresholding, we reconstruct the density matrix of a high-dimensional two-photon entangled system. The dimension of each photon is equal to d = 17, corresponding to a system of 83521 unknown real parameters. Accurate reconstruction is achieved with approximately 2500 measurements, only 3% of the total number of unknown parameters in the state. The algorithm we develop is fast, computationally inexpensive, and applicable to a wide range of quantum states, thus demonstrating compressive sensing as an effective technique for measuring the state of large-scale quantum systems. PMID:25306850
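
    The singular value thresholding operator at the core of such reconstructions can be sketched on a toy low-rank matrix-completion problem, as below (Python/NumPy); the matrix sizes, sampling rate, threshold, and step size are illustrative, and no physical constraints of a density matrix (Hermiticity, positivity, unit trace) are enforced.

      import numpy as np

      def svt(M, tau):
          """Soft-threshold the singular values of M by tau."""
          U, s, Vt = np.linalg.svd(M, full_matrices=False)
          return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

      def complete(obs, mask, tau=5.0, step=1.2, iters=300):
          """Cai-Candes-Shen style SVT iteration using only the entries where mask is True."""
          Y = np.zeros_like(obs)
          for _ in range(iters):
              X = svt(Y, tau)
              Y += step * mask * (obs - X)
          return svt(Y, tau)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          truth = rng.standard_normal((40, 5)) @ rng.standard_normal((5, 40))  # rank 5
          mask = rng.random(truth.shape) < 0.4              # observe 40% of the entries
          rec = complete(truth * mask, mask)
          err = np.linalg.norm(rec - truth) / np.linalg.norm(truth)
          print(f"relative completion error: {err:.3f}")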

  16. Simultaneous data communication and position sensing with an impact ionization engineered avalanche photodiode array for free space optical communication

    NASA Astrophysics Data System (ADS)

    Ferraro, Mike S.; Mahon, Rita; Rabinovich, William S.; Murphy, James L.; Dexter, James L.; Clark, William R.; Waters, William D.; Vaccaro, Kenneth; Krejca, Brian D.

    2017-02-01

    Photodetectors in free space optical (FSO) communication systems perform two functions: reception of data communication signals and position sensing for pointing, tracking, and stabilization. Traditionally, the optical receive path in an FSO system is split into separate paths for data detection and position sensing. The need for separate paths is a consequence of conflicting performance criteria between position sensitive detectors (PSD) and data detectors. Combining the functionality of both detector types requires that the combinational sensor have not only the bandwidth to support high data rate communication but also the active area and spatial discrimination to accommodate position sensing. In this paper we present a large-area, concentric five-element impact ionization engineered avalanche photodiode array rated for bandwidths beyond 1 GHz with a measured carrier ionization ratio of less than 0.1 at moderate APD gains. The integration of this array as a combinational sensor in an FSO system is discussed, along with the development of a pointing and stabilization algorithm.
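
    A quad-cell-style position estimate of the kind such a combinational sensor enables is sketched below (Python/NumPy), assuming four outer elements arranged like quadrants around the central element; the geometry and example photocurrents are illustrative, not the paper's detector layout or tracking algorithm.

      import numpy as np

      def beam_offset(i_quadrants):
          """i_quadrants: photocurrents (top-left, top-right, bottom-left, bottom-right).
          Returns normalized (x, y) error signals for the pointing/stabilization loop."""
          tl, tr, bl, br = i_quadrants
          total = tl + tr + bl + br
          x = ((tr + br) - (tl + bl)) / total
          y = ((tl + tr) - (bl + br)) / total
          return x, y

      # example: beam displaced toward the right-hand elements
      print(beam_offset(np.array([0.20, 0.30, 0.22, 0.28])))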

  17. Self-sensing of dielectric elastomer actuator enhanced by artificial neural network

    NASA Astrophysics Data System (ADS)

    Ye, Zhihang; Chen, Zheng

    2017-09-01

    Dielectric elastomer (DE) is a type of soft actuating material whose shape can be changed under electrical voltage stimuli. DE materials have promising uses in future soft actuators and sensors, such as soft robotics, energy harvesters, and wearable sensors. In this paper, a stripe DE actuator with integrated sensing capability is designed, fabricated, and characterized. Since the stripe actuator can be approximated as a compliant capacitor, it is possible to detect the actuator's displacement by analyzing the actuator's impedance change. An integrated sensing scheme that adds a high-frequency probing signal to the actuation signal is developed. Electrical impedance changes in the probing signal are extracted by a fast Fourier transform algorithm, and nonlinear data fitting methods involving an artificial neural network are implemented to detect the actuator's displacement. A series of experiments shows that, by improving the data processing and analysis methods, the integrated sensing method can achieve an error level below 1%.
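
    The probing-signal extraction step can be sketched as below (Python/NumPy): a small high-frequency tone is superimposed on a slow actuation waveform and its amplitude is recovered from the FFT bin at the probing frequency; the signal parameters are invented, and the subsequent capacitance-to-displacement fit (the paper's neural network stage) is omitted.

      import numpy as np

      fs, T = 50_000, 0.2                           # sample rate [Hz], record length [s]
      t = np.arange(int(fs * T)) / fs
      actuation = 2.0 * np.sin(2 * np.pi * 5 * t)   # slow actuation waveform
      probe_f = 2_000.0
      probe = 0.05 * np.sin(2 * np.pi * probe_f * t)
      measured = actuation + probe                  # toy measured electrical response

      spectrum = np.fft.rfft(measured)
      freqs = np.fft.rfftfreq(len(t), 1 / fs)
      k = np.argmin(np.abs(freqs - probe_f))
      probe_amp = 2 * np.abs(spectrum[k]) / len(t)  # amplitude at the probing frequency

      print(f"recovered probe amplitude: {probe_amp:.4f} (injected 0.05)")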

  18. Development of a Flush Airdata Sensing System on a Sharp-Nosed Vehicle for Flight at Mach 3 to 8

    NASA Technical Reports Server (NTRS)

    Davis, Mark C.; Pahle, Joseph W.; White, John Terry; Marshall, Laurie A.; Mashburn, Michael J.; Franks, Rick

    2000-01-01

    NASA Dryden Flight Research Center has developed a flush airdata sensing (FADS) system on a sharp-nosed, wedge-shaped vehicle. This paper details the design and calibration of a real-time angle-of-attack estimation scheme developed to meet the onboard airdata measurement requirements for a research vehicle equipped with a supersonic-combustion ramjet engine. The FADS system has been designed to perform in flight at Mach 3 to 8 and at angles of attack from -6 deg to 12 deg. The description of the FADS architecture includes port layout, pneumatic design, and hardware integration. Predictive models of static and dynamic performance are compared with wind-tunnel results across the Mach and angle-of-attack range. Results indicate that static angle-of-attack accuracy and pneumatic lag can be adequately characterized and incorporated into a real-time algorithm.
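
    The general shape of a real-time angle-of-attack estimate built from a pair of flush ports and a wind-tunnel calibration table is sketched below (Python/NumPy); the calibration values, port normalization, and example pressures are invented placeholders and are not the vehicle's actual FADS calibration or algorithm.

      import numpy as np

      # wind-tunnel-derived calibration: normalized differential pressure -> alpha [deg]
      cal_dp = np.array([-0.60, -0.30, 0.00, 0.30, 0.60, 0.90])
      cal_alpha = np.array([-6.0, -3.0, 0.0, 3.0, 6.0, 12.0])

      def estimate_alpha(p_lower, p_upper, q_bar):
          """Interpolate angle of attack from the lower/upper port pressure split."""
          dp_norm = (p_lower - p_upper) / q_bar
          return float(np.interp(dp_norm, cal_dp, cal_alpha))

      # example: a 9 kPa pressure split at 30 kPa dynamic pressure maps to about 3 deg
      print(estimate_alpha(p_lower=68_000.0, p_upper=59_000.0, q_bar=30_000.0))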

  20. Water Column Correction for Coral Reef Studies by Remote Sensing

    PubMed Central

    Zoffoli, Maria Laura; Frouin, Robert; Kampel, Milton

    2014-01-01

    Human activity and natural climate trends constitute a major threat to coral reefs worldwide. Models predict a significant reduction in reef spatial extent together with a decline in biodiversity in the relatively near future. In this context, monitoring programs to detect changes in reef ecosystems are essential. In recent years, coral reef mapping using remote sensing data has benefited from instruments with better resolution and from computational advances in storage and processing capabilities. However, the water column introduces additional complexity when extracting information about submerged substrates by remote sensing, and its effect must be corrected. In this article, the basic concepts of bottom substrate remote sensing and water column interference are presented. A compendium of methodologies developed to reduce water column effects in remote sensing studies of coral ecosystems is provided, including their salient features, advantages, and drawbacks. Finally, algorithms to retrieve the bottom reflectance are applied to simulated data and actual remote sensing imagery, and their performance is compared. The available methods are not able to completely eliminate the water column effect, but they can minimize its influence. Choosing the best method depends on the marine environment, available input data, and the desired outcome or scientific application. PMID:25215941
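
    One of the classical corrections reviewed in such compendia, Lyzenga's depth-invariant bottom index computed from a pair of log-transformed visible bands, is sketched below (Python/NumPy); the attenuation-coefficient ratio is estimated from synthetic pixels over a uniform sandy bottom, and all reflectances, depths, and coefficients are invented.

      import numpy as np

      def depth_invariant_index(band_i, band_j, sand_i, sand_j):
          """band_*: water-leaving reflectances; sand_*: samples over a uniform bottom."""
          xi, xj = np.log(sand_i), np.log(sand_j)
          # ratio of diffuse attenuation coefficients k_i / k_j from the sand pixels
          a = (xi.var() - xj.var()) / (2.0 * np.cov(xi, xj)[0, 1])
          k_ratio = a + np.sqrt(a * a + 1.0)
          return np.log(band_i) - k_ratio * np.log(band_j)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          depth = rng.uniform(1, 10, 500)
          # synthetic sand pixels: same bottom, exponentially attenuated with depth
          sand_i = 0.30 * np.exp(-2 * 0.08 * depth) + 0.001 * rng.random(500)
          sand_j = 0.25 * np.exp(-2 * 0.12 * depth) + 0.001 * rng.random(500)
          index = depth_invariant_index(sand_i, sand_j, sand_i, sand_j)
          # a small spread indicates the index is (nearly) independent of depth
          print(f"index spread over varying depth: {index.std():.3f}")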
